Search Results

Search found 6826 results on 274 pages for 'dedicated hosting'.


  • Multi sim GSM modem or alternative

    - by Ando
    I'm trying to administer the SMS traffic of my business centrally through a web portal. In Europe (except the UK) we don't have a numbers/SMS traffic provider like Twilio or Clickatell, nor any built-in way to administer the SMS traffic for a number via HTTP, so I will have to buy the long numbers and administer the SMS traffic myself. For this I was looking into a hardware solution for hosting all my SIM cards - I have about 400 SIM cards (= numbers). I saw that GSM modems might fit, but they don't seem to scale up very well. Could you recommend a GSM modem? If this is not the best way to approach this, what would my alternatives be? Thanks in advance

    Read the article

  • How secure is Windows IPSec VPN?

    - by sergeb
    I know the answer is, it depends on how you configure it... But bear with me - our IPSec site-to-site VPN is configured by one of the most trusted hosting companies. One of our clients expressed concerns that "Windows Server 2008 Server IPSEC is not ICSA certified and lacks some of the common features for maintaining VPN stability" (they refer to the lack of an "auto keep live" feature). They also say that "Windows platforms are not recommended as VPN endpoints due to security concerns and this is one reason that the ICSA testing labs will not certify it as a valid IPSEC solution" (I couldn't find proof of this one). Are there any whitepapers or references that can prove the security of the Windows IPSec implementation? Thanks!

    Read the article

  • Not all of your nameservers are in different subnets. Single point of failure

    - by user2118559
    I'm using VPS hosting and Dynadot (domain registrar) DNS. I checked the domain name with http://www.intodns.com and got two warnings: "Different subnets - WARNING: Not all of your nameservers are in different subnets" and "Different autonomous systems - WARNING: Single point of failure". As I understand it, to avoid the warning I need a second IPv4 address, and the two addresses must point to different servers? If both IP addresses point to the same server, it does not help? I mean, each server has its own IP address, so if one server is down, visitors can still reach the website (files) on the other server - is that the reason for needing more than one IP? I tried pointing the website at 2 IP addresses and after some time got a "Connection Timeout" warning from UptimeRobot.
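
    A quick way to see which subnets the nameservers actually sit in is to resolve them directly; a minimal sketch using dig, with the hypothetical domain example.com standing in for the real one:

        # list the delegated nameservers for the zone
        dig +short NS example.com

        # resolve each nameserver and compare the resulting subnets
        dig +short A ns1.example.com
        dig +short A ns2.example.com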

    Read the article

  • Running a bash script from an HTML link or button

    - by Andrew
    I have a webserver that's hosting lots of images. I want the client to be able to press a button or a link, which will run a bash script, which will create a video based on all these pictures. The script I'm trying to run is this:

        #!/bin/bash
        # cd to the directory
        cd /var/www/gallery
        # use ffmpeg to make video
        ffmpeg -pattern_type glob -i 'img-*jpg' -r 1 video.mp4
        # Take the first file in the directory and name it video.mp4.jpg (for thumbnail)
        cp `ls | sort -n | head -1` video.mp4.jpg

    The script is located on the server. So when the client clicks the link or button, the script will run, and the video is created. I've tried both solutions listed here but I can't seem to get it to work. I have PHP installed on my server.

    Read the article

  • Question about Domain Forwarding [beginner]

    - by Jack W-H
    Hello folks, just a quick beginner's question here. I have a webapp located at domainxyz.com, and it generates short URLs for long posts automatically - so rather than visit domainxyz.com/reallylongpostnamehere I can just type domainxyz.com/a5c and be taken there automatically. However, I've bought a shorter domain name - short.com - and I want to be able to visit short.com/a5c and be redirected (or forwarded) to domainxyz.com/a5c. Or short.com/7f0 to domainxyz.com/7f0. This way, although it seems a tad illogical, it saves me setting up another hosting account on short.com to deal with the URL shortening. Is this possible? I realise you can forward domains, but can you forward domains AND forward the URL segments? Thanks! Jack
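
    If short.com ends up pointed at any Apache instance you control (rather than at a registrar's forwarding service), a single mod_alias redirect preserves whatever path follows the domain; a minimal sketch, using the hypothetical names from the question:

        # Redirect keeps the request path, so short.com/a5c becomes domainxyz.com/a5c
        Redirect permanent / http://domainxyz.com/

    Some registrars' built-in URL forwarding also keeps the path, but that varies by provider, so it is worth testing before relying on it.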

    Read the article

  • Port Forwarding: Why do my local sites on 80 work but not those on 8080?

    - by Chadworthington
    I set up my router to forward port 80 to the PC hosting my web site. As a result, I am able to access this URL (don't bother clicking on it, it's just an example): http://my.url.com/ When I click on this link, it works: http://localhost:8080/tfs/web/ I also forwarded port 8080 to the same web server box, but when I try to access this URL I get the error "Page Cannot be displayed": http://my.url.com:8080/tfs/web/ I forwarded port 8080 the same way I forwarded port 80. I also turned off Windows Firewall, in case it was blocking port 8080. Any theories why port 80 works but 8080 does not?
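
    One thing worth ruling out is whether the site on 8080 is listening on all interfaces rather than only on localhost (which would explain why http://localhost:8080/ works from the server itself but the forwarded port does not); a minimal check from a command prompt on the hosting PC, assuming Windows:

        rem show every listener on port 8080 and the owning process id
        netstat -ano | findstr :8080

    A listener shown as 0.0.0.0:8080 accepts outside connections, while 127.0.0.1:8080 only answers locally.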

    Read the article

  • Why are my files in /var/lock and where did they just go?!

    - by Nicky Hajal
    I am hosting a website on Debian 5.0 & Apache2. Today one of my websites was down; Apache said it couldn't find the directory. I located the files: the whole site that used to be in /var/www/site was now in /var/lock/site. All the files were present. I was confused, but figured I'd just move it back: mv /var/lock/site /var/www All looked fine... except that only the directories moved and the files appear to be lost! I am working on restoring from backups, but I would really love to know what happened and where my files went (the backups are a few days old). Thanks for your help!

    Read the article

  • Facebook links to my site resolve as 403 forbidden

    - by filip
    Hi, I'm experiencing a super weird problem. Whenever I post links to my website on Facebook, they come up as Forbidden. The site itself works great and I have not seen this when linking on other sites. Could this be a server misconfiguration? Any thoughts on where to look? Here's some info: I have a dedicated server running WHM 11.25.0, and 2 sites hosted on it using cPanel 11.25.0. The error message is: "Forbidden - You don't have permission to access / on this server. Additionally, a 404 Not Found error was encountered while trying to use an ErrorDocument to handle the request. Apache/2.2.14 (Unix) mod_ssl/2.2.14 OpenSSL/0.9.8i DAV/2 mod_auth_passthrough/2.1 mod_bwlimited/1.4 FrontPage/5.0.2.2635 Server at ---- Port 80"
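
    Facebook fetches link previews with its own crawler (user agent facebookexternalhit), so the Apache logs should show exactly which requests are being rejected and why; a sketch, assuming the default WHM/cPanel log locations (adjust the paths to your setup):

        # requests from Facebook's crawler and the status codes they received
        grep -i facebookexternalhit /usr/local/apache/logs/access_log | tail -20

        # recent permission errors in the main error log
        grep -i "client denied" /usr/local/apache/logs/error_log | tail -20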

    Read the article

  • Load Balancer recommendations

    - by delerious010
    I provide a hosting service for about 250 clients to date, and this is increasing on a monthly basis. For each client, I have 2 "services" configured for L4 balancing / persistence: one on port 80, and another for port 443 which redirects to another internal port, as well as 4 servers per service. This equates to a total of 500 "services" and 2000 "servers". I'm currently running with a couple of CoyotePoint load balancers, and have had a look at some Barracudas, but so far I'm really not impressed by those. Could anyone recommend some good load balancers which would be able to support this sort of load, and which offer a good API or shell access to automate management?

    Read the article

  • How can I remove HTTP headers with .htaccess in Apache?

    - by Daniel Magliola
    I have a website that is sending out "Cache-Control" and "Pragma" HTTP headers for PHP requests. I'm not doing that in the code, so I'm assuming it's some kind of Apache configuration, as suggested by this question (you don't really need to go there for this question's context). I don't have anything in my .htaccess files, so it's got to be in Apache's configuration itself, but I can't access that - this is shared hosting and I only have FTP access to my website's directory. Is there any way that I can add directives to my .htaccess files that will remove the headers added by the global configuration, or otherwise override the directive so that they're not added in the first place? Thank you very much, Daniel
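
    If the host has mod_headers enabled (common on shared hosting, but an assumption here), per-directory .htaccess directives can strip headers that were added elsewhere in the configuration; a minimal sketch:

        <IfModule mod_headers.c>
            # remove the caching headers added by the global configuration
            Header unset Pragma
            Header unset Cache-Control
        </IfModule>

    Note that if the headers are actually being injected by PHP itself (session.cache_limiter is a frequent source), they may need to be suppressed on the PHP side instead.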

    Read the article

  • Information about recent code injection from http://superiot.ru

    - by klennepette
    Hello, I manage the hosting for a few dozen websites. For about a week I've been finding this code in 12 different websites, in the index.php files:

        <script type="text/javascript" src="http://superiot.ru/**.js"></script> // The name of the actual javascript file differs
        <!-- some hash here-->

    Some of the websites are on different servers, some aren't. I'm just wondering if anyone else has been seeing this too. Edit with some more information: all servers are CentOS 5.3, PHP versions are either 5.2.9 or 5.2.4, and Apache versions are either 2.2.3 or 1.3.39.
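
    To get a quick inventory of which files on a server carry the injected tag, a recursive grep over the document roots is usually enough; a minimal sketch, assuming the sites live under /var/www (adjust the path to your layout):

        # list every file referencing the injected domain, with owner and modification time
        grep -rl "superiot.ru" /var/www | xargs -r ls -l

    The modification times are often a useful clue as to when (and through which account) the injection happened.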

    Read the article

  • rsync for copying files

    - by vinayrks
    I am migrating my old server, which I used for hosting websites, to a new server. First I tried SFTP, but due to the huge number of files and connection timeouts it simply didn't work. Then I tried rsync. rsync works well, but the only problem I am facing is that it updates existing files very nicely and quickly yet does not copy new files. Please help me, because I still need to transfer lots of files. I am using this command: rsync -anv -e ssh oldserver:/path/ /path
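
    One detail worth noting: in rsync -anv, the n is the dry-run flag, so rsync only reports what it would transfer without actually copying anything. A minimal sketch of the same copy without it (and with compression), using the paths from the question:

        # -a preserves permissions and times, -v is verbose, -z compresses over the wire
        rsync -avz -e ssh oldserver:/path/ /path/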

    Read the article

  • Is it possible to use the Exchange 2010 web client with an Exchange 2007 mail server?

    - by michielvoo
    We are evaluating our options to upgrade our Windows SBS 2003 server. We are considering Windows SBS 2008, which comes with Exchange 2007 and an extra Windows Server 2008 Standard license. If we also bought Exchange 2010, could we install it on the Windows Server 2008 Standard machine and use its web client in combination with the Exchange 2007 server (which would be hosting the mailboxes)? Is that a supported server role for Exchange 2010? I remember reading about so-called front-end server configurations, but I have no experience with that. Thanks!

    Read the article

  • Why would it be a bad idea to have database connection open between client requests?

    - by AspOnMyNet
    1) The book I'm reading argues that connections shouldn't be kept open between client requests, since they are a finite resource. I realize that the max pool size can quickly be reached, and thus any further attempts to open a connection will be queued until a connection becomes available; for that reason it is imperative that we release connections as soon as possible. But assuming all requests open a connection to the same DB, I'm not sure how keeping a connection open between two client requests would be any less efficient than having each request first acquire a connection from the connection pool and later return that object to the pool. 2) The book also recommends that when database code is encapsulated in a dedicated data access class, a method M opening a database connection should also close that connection. a) I assume one reason why M should also close it is that if the method opening the connection doesn't also close it, but instead the connection object is used inside several methods, it's more likely that a programmer will forget to close it. b) Are there any other reasons why a method opening the connection should also close it? Thanks

    Read the article

  • capistrano still asks for the 1st password even though I've set up an ssh key???

    - by Greg
    Hi. Background: I've set up an SSH key to avoid having to use passwords with Capistrano, per http://www.picky-ricky.com/2009/01/ssh-keys-with-capistrano.html. A basic ssh to my server works fine without asking for passwords. I'm using "dreamhost.com" for hosting. Issue: when I run 'cap deploy' I still get asked for the 1st password (even though the previous 2nd and 3rd password requests are now automated). It is the Capistrano command that starts with "git clone -q ssh:....." for which the password is being requested. Question: is there something I've missed? How can I get "cap deploy" totally passwordless? Some excerpts from config/deploy.rb are:

        set :use_sudo, false
        ssh_options[:keys] = [File.join(ENV["HOME"], ".ssh", "id_rsa")]
        default_run_options[:pty] = true

    Thanks. PS. The permissions on the server are:

        drwx------ 2 mylogin pg840652 4096 2010-02-22 15:56 .ssh
        -rw------- 1 mylogin pg840652  404 2010-02-22 15:45 authorized_keys
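
    Because that git clone runs on the deploy server and then opens its own SSH connection back to the repository host, it needs a key it can use from there; agent forwarding is the usual way to provide one without copying private keys around. A sketch of how to test it, assuming a hypothetical deploy host deploy.example.com:

        # load the local key into the agent, then check it is visible from the server
        ssh-add ~/.ssh/id_rsa
        ssh -A mylogin@deploy.example.com 'ssh-add -l'

    If the forwarded key shows up there, Capistrano can be told to forward the agent too (ssh_options[:forward_agent] = true in Capistrano 2), though whether that is the missing piece here is an assumption.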

    Read the article

  • SMTP server problem

    - by ram
    Hi. Our requirement is to send weekly newsletters to our website customers, for which we wanted to have a locally hosted SMTP server in our office. We are not using the SMTP server provided by the website hosting provider, as we wanted to reduce network traffic and avoid IP blocking due to bulk mail. We send the newsletters on a weekly basis from our local SMTP server. But for some reason some emails go to spam, some do not reach the customers at all, and sometimes there are bounce messages telling us to follow bulk email guidelines (mainly from Gmail). Can you please suggest how to solve this? I would also like to know what type of technology LinkedIn or banks generally use to send notification emails to all their customers. When they send bulk email, it always reaches the inbox without any problem. I want to implement the same kind of solution for my website. Thank you very much in advance.
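
    Deliverability for mail sent from an office IP usually hinges on receiving servers being able to verify the sender, so publishing an SPF record (and ideally DKIM signatures) for the sending domain is the usual first step; a minimal SPF sketch as a DNS TXT record, with a placeholder domain and IP:

        ; 203.0.113.10 stands in for the office SMTP server's public IP address
        example.com.  IN  TXT  "v=spf1 ip4:203.0.113.10 ~all"

    Reverse DNS on the sending IP that matches the server's HELO name, and keeping complaint rates low, matter just as much for Gmail in particular.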

    Read the article

  • Mod_rewrite not working on ISPConfig 3 Server

    - by Akahadaka
    Problem: I recently migrated a Drupal site from a shared hosting server to my own VM. Everything appears to work correctly, except clean URLs.

    My VM setup: Ubuntu 10.04, LAMP, ISPConfig 3.

    What I've tried, from reading up on a number of Drupal forums, in this order:

    1. Check that mod_rewrite is installed and enabled
    2. Changed PHP from FastCGI to Mod_PHP (prefer to use FastCGI or suPHP though, to avoid having tmp/files folders with 777 permissions)
    3. Changed the Redirect type to L in ISPConfig (Sites - domain.com - Redirect)
    4. Changed /etc/apache2/sites-enabled/000-default:

        <Directory /var/www/>
            Options Indexes FollowSymLinks MultiViews
            AllowOverride All
            ...
        </Directory>

    Not sure about points 3 and 4; I do want all domains to be able to use mod_rewrite out of the box.

    Question: have I done something wrong, or am I missing a step? Ultimately I would like to use FastCGI and have clean URLs working on all ISPConfig 3 domains without having to make any changes to individual domain settings. Any ideas appreciated, I'll try them all.
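
    On Ubuntu's stock Apache packaging it only takes a moment to confirm whether mod_rewrite is actually loaded, and to enable it if not; a minimal check, assuming the standard apache2 layout:

        # list loaded modules and look for rewrite_module
        apache2ctl -M | grep rewrite

        # enable the module and restart Apache if it was missing
        sudo a2enmod rewrite
        sudo /etc/init.d/apache2 restart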

    Read the article

  • Setting up DNS in WHM/cPanel

    - by Jon Furmanski
    I don't understand what I'm doing wrong, but I'm sure this is a simple fix. I set up WHM/cPanel for the first time on my VPS and understand how DNS works for the most part (or so I thought). Under the main domain name I created 2 nameservers (ns1.maindomain.com & ns2.maindomain.com). I have 2 IP addresses for my server, so each one points to a unique IP: ns1.maindomain.com => 198.x.x.204 and ns2.maindomain.com => 198.x.x.205. I also set up reverse DNS with my hosting provider. When I put my two nameservers in under another domain (a secondary domain), GoDaddy states that the nameservers are invalid. Any ideas on why this is, or any configuration in cPanel that needs to be made?
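
    Registrars commonly reject nameservers that have not been registered as host (glue) records with maindomain.com's own registrar, so it is worth checking whether the glue actually exists at the TLD servers; a sketch using dig, with ns1.maindomain.com standing in for the real hostname:

        # follow the delegation from the root; the last step shows whether a glue A record is published
        dig +trace ns1.maindomain.com A

    If the trace dies at the .com servers with no address for the nameserver, the host records still need to be created at the domain's registrar (often labelled "registered nameservers" or "host records").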

    Read the article

  • How hard is it for a Software Developer to Maintain a Server?

    - by Samy
    I'm a software developer and don't have much experience as a sysadmin. I developed a web app and am considering buying a server and hosting the web app on it. Is this a huge undertaking for a web developer? What's the level of difficulty of maintaining a server and keeping up with the latest security patches and all that kind of fun stuff? I'm a single user, and not planning to sell the service to others. Can someone also recommend an OS for my case, and maybe some good learning resources that are concise and not too overwhelming?

    Read the article

  • Working WCF WebServices with NLB server

    - by gguth
    I'm starting the architecture of a new project using WCF, but I'm not the right person to make network decisions, so I'm doing some research but cannot find the answers to these questions: We'll host the WCF service in an ordinary Windows Service app on 2 servers, and we'll have another server doing the load-balancing job using WNLB. Can the fact that we are hosting the WCF service in a Windows Service app disturb the NLB job? Before my research I thought load balancing was tough to configure, but with NLB it seems to be very simple - is it really that simple? Note: the binding will be basicHttpBinding.

    Read the article

  • SSL issue with emails

    - by JackWillDavis
    OK, so I have somebody hosting a site on my CentOS 5.8 Plesk 11 control panel. He has an EV SSL certificate which validates the site fine; however, he has failed a PCI check because it says his mail services (SMTP, IMAP, POP) present the wrong name. This is because his SSL certificate is not a wildcard certificate and the mail services are serving the default Plesk SSL certificate. Is there a way to stop Plesk automatically securing email with the default SSL certificate? I'm fairly new to things like this, so I hope I've written everything I need; let me know if any more details are needed. Jack
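
    To see exactly which certificate each mail service is presenting (and therefore which name the PCI scanner is objecting to), openssl can talk to the ports directly; a sketch, with mail.example.com as a placeholder for the real hostname:

        # SMTP with STARTTLS, then POP3S and IMAPS; inspect the certificate subject in the output
        openssl s_client -connect mail.example.com:25 -starttls smtp
        openssl s_client -connect mail.example.com:995
        openssl s_client -connect mail.example.com:993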

    Read the article

  • Basic clarification about Limited FTP/sFTP users

    - by mattewre
    I would like some clarification about the correct way to create limited users that can access my VPS, which is used as a web server with Nginx. I usually do NOT install FTP and allow access via SFTP only - is that OK for every setup? This is what I usually do to create a limited user called "admin" that should be able to access, via SFTP, the folder with the website data:

        mkdir -p /var/www/mysite.com/
        adduser admin
        adduser admin www-data
        chown -R root:root /var/www
        chmod -R 755 /var/www
        chmod -R 755 /var/www/mysite.com
        chown -R admin:www-data /var/www/mysite.com/

    This doesn't seem to be the correct way; I always have permission problems when I upload files (for example with Wordpress in general). I would like to create a user that works exactly like the one providers give their clients when they buy a hosting service (that is FTP, though I would prefer SFTP access). It is for personal use, but I think a limited user is a lot safer to use than "root" via SFTP.
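
    One common way to get an SFTP-only account locked to a directory (much closer to what hosting providers hand out) is OpenSSH's internal-sftp with a chroot; a minimal sshd_config sketch, assuming a hypothetical sftponly group. Note that the chroot directory itself must be owned by root and not writable by the user, with a user-owned subdirectory underneath for uploads:

        # in /etc/ssh/sshd_config
        Match Group sftponly
            ChrootDirectory /var/www/mysite.com
            ForceCommand internal-sftp
            AllowTcpForwarding no
            X11Forwarding no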

    Read the article

  • Can VMWare Workstation 7.x and Sun VirtualBox 3.1.x co-exist on the same Windows 7 64bit Host together?

    - by Heston T. Holtmann
    Will installing Sun VirtualBox break or interfere with my VMware installation? I don't need to run VMs from both virtual-machine software packages at the same time, but I do need to run some older virtual machines from Sun VirtualBox on the same 64-bit Windows 7 host until I can migrate those VMs to VMware. Before switching from a Linux host to a Windows host, I made sure to export the VirtualBox VM to an OVF "appliance" with the intention of importing it into VMware Workstation 7, but VMware gives me an error stating it can't import it.

    Background info - my old workstation host: 32-bit Ubuntu 9.04 running Sun VirtualBox 3.x, hosting a Windows XP VM guest for Windows software development (VS2008, etc.).

    Needs: I need to get my original Sun VirtualBox Windows XP guest running on my new Windows 7 workstation, either imported into VMware or running on the Windows version of Sun VirtualBox (I have the VM guest backed up and copied to the new computer's data drive).

    New workstation host: 64-bit Windows 7 running VMware Workstation 7 to host 32-bit Ubuntu 9.10 for Linux project work.

    Read the article

  • How to determine cpu, ram needed for rails app?

    - by Ben
    What is the most accurate way to determine the amount of CPU speed and RAM needed to run my Rails app? I believe there are stress-testing tools like Tsung, but how do I determine, for example, that I need X more RAM or X more CPU? I would like to find some way to roughly gauge the performance needs of my application so I can anticipate future needs. I think this data will also be useful for deciding whether to upgrade one machine, or get another dedicated machine and put all the databases on that one. Essentially, I am concerned about scaling issues and how to anticipate them. Thanks in advance for the help!
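
    A rough but practical approach is to generate load against the app while watching memory and CPU on the same box, then extrapolate from the point where response times degrade; a sketch using ApacheBench and standard Linux tools, assuming the app answers on localhost port 3000:

        # hit the app with 1000 requests, 50 concurrent, and note the requests/second figure
        ab -n 1000 -c 50 http://localhost:3000/

        # in another terminal while the test runs, watch free memory and per-process CPU
        free -m
        top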

    Read the article

  • "Half" ssh authorization to a server with git repository

    - by hsz
    Hello! I have purchased web hosting with SSH access. I have created a git repository on it, and if I put my public key in the ~/.ssh/authorized_keys file I have access to that repo - I can push/pull data, etc. This solution allows access for every user whose public key is in the authorized_keys file. But there is one thing that I want to avoid: every such user can also log in to the server and has access to the whole SSH account. Is it possible to create a blacklist of users' keys that will not get shell access over SSH? I see it this way:

    - user connects for git: OK, allow for everyone
    - user logs in to the SSH account: the ~/.profile file is hooked and calls a custom script that checks the user's public key
    - if the public key is in ~/.ssh/blacklist_keys: exit/logout
    - otherwise: call bash

    Is it possible in any way?
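
    OpenSSH can also pin a forced command to an individual key in authorized_keys, which is the usual way to give a key git access without a general shell - a different mechanism from the ~/.profile hook described above, so treat it as an alternative sketch with a placeholder key:

        # one line in ~/.ssh/authorized_keys; this key can only run git commands via git-shell
        command="git-shell -c \"$SSH_ORIGINAL_COMMAND\"",no-port-forwarding,no-X11-forwarding,no-agent-forwarding,no-pty ssh-rsa AAAA...restricted-key...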

    Read the article
