Search Results

Search found 45175 results on 1807 pages for 'web deployment'.


  • AWS RDS connection count

    - by wmarbut
    I am using AWS RDS with MySQL for a project and have a "large" instance. The documentation is clear on what this means as far as compute resources and RAM go, but I can't find anything that documents how many open database connections I can have. The app is PHP and it uses PDO with persistent connections, which means the number of open connections could reach the maximum number of PHP child processes running at any given point. How do I ensure that my RDS instance has a max connections setting high enough to be comfortable with this?
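
    A minimal sketch of how the current limit and live connection count could be checked from a client, assuming PyMySQL is available; the endpoint and credentials below are placeholders, not values from the question:

        # Sketch: inspect RDS MySQL connection limits (PyMySQL assumed, placeholder credentials).
        import pymysql

        conn = pymysql.connect(
            host="mydb.xxxxxxxx.us-east-1.rds.amazonaws.com",  # hypothetical RDS endpoint
            user="admin",
            password="secret",
        )
        with conn.cursor() as cur:
            cur.execute("SHOW VARIABLES LIKE 'max_connections'")
            print(cur.fetchone())                 # the configured ceiling
            cur.execute("SHOW STATUS LIKE 'Threads_connected'")
            print(cur.fetchone())                 # connections currently open
        conn.close()

    If the ceiling turns out to be too low for the expected number of persistent PHP connections, it is normally raised through a custom DB parameter group attached to the instance rather than on the server directly.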

    Read the article

  • Capacity Planning IIS6

    - by user45457
    Hello, I am planning to host 5,000 sites, each with a separate application pool. Based on my estimate, around 1,600-2,000 worker processes will be running on the server. So my question is: is it possible to host 5,000 sites on a single server with IIS 6? Server configuration: 16 cores @ 2.27 GHz, 8 GB RAM. Please tell me if you require any other information. Kartik
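
    A rough back-of-the-envelope check suggests memory, not CPU, is the constraint here. The per-process figure below is an assumption for illustration, not a measurement:

        # Back-of-the-envelope memory estimate for ~2000 IIS worker processes.
        # 30 MB per w3wp.exe is an assumed figure for a small app; real footprints are often larger.
        worker_processes = 2000
        mb_per_process = 30          # assumed working set per worker process
        ram_gb = 8

        needed_gb = worker_processes * mb_per_process / 1024.0
        print("Estimated RAM needed: %.1f GB vs %d GB installed" % (needed_gb, ram_gb))
        # Estimated RAM needed: 58.6 GB vs 8 GB installed

    Even under this optimistic assumption, 2,000 always-resident worker processes would need several times the installed RAM, which is why shared application pools or idle timeouts usually enter the picture at this scale.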

    Read the article

  • Nginx ssl redirection of images

    - by krishna raj
    Hi. I am trying to set up nginx as a reverse proxy for a Tomcat server over an SSL connection. I want the client's browser to load my Tomcat application when the nginx reverse proxy's IP is entered in the browser. My Tomcat application's address is 192.168.25.25 and the nginx proxy's address is 192.168.25.50. In my nginx.conf file I have added these lines:

        location / {
            proxy_pass https://192.168.25.25:443/myapp/;
            proxy_redirect https://192.168.25.25/myapp/ https://192.168.25.25/;
        }

    Some of the images in my application are stored at 192.168.25.25/images/. These directories can't be accessed now because proxy_pass is set to 192.168.25.25:443/myapp. Is there a way to access the images directory as well without changing proxy_pass? Thanks in advance.
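
    One hedged possibility, using only the addresses given in the question and leaving the existing location untouched, is a second location block that proxies the images path without the /myapp/ prefix:

        # Sketch only: proxy /images/ straight through, alongside the existing /myapp/ proxy.
        location /images/ {
            proxy_pass https://192.168.25.25:443/images/;
        }

    Whether this is sufficient depends on how the application builds its image URLs, so treat it as a starting point rather than a verified fix.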

    Read the article

  • Outlook Web Access and LDAPS

    - by john
    Hello, I am having trouble setting up LDAPS for the password change facility in OWA. We've installed the certificate according to MS article http://support.microsoft.com/kb/321051, but even a simple test from the domain controller with ldp does not work. Thanks for any tips.
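
    As a sanity check outside of OWA and ldp, a small script can confirm whether the domain controller answers on port 636 with a certificate the client accepts. This is a sketch using the ldap3 library; the host name and account are placeholders, not values from the question:

        # Sketch: test an LDAPS (port 636) bind against a domain controller.
        # Requires the ldap3 package; host and credentials are placeholders.
        from ldap3 import Server, Connection, ALL

        server = Server("dc01.example.local", port=636, use_ssl=True, get_info=ALL)
        conn = Connection(server, user="EXAMPLE\\testuser", password="secret", auto_bind=True)
        print("Bound over LDAPS:", conn.bound)
        print(server.info)   # rootDSE info, only reachable if the TLS handshake and bind succeeded
        conn.unbind()

    If this fails at the TLS stage, the certificate chain or the subject name on the DC's certificate is the usual suspect rather than OWA itself.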

    Read the article

  • AWS Large Instance: /mnt does not show all the space that should be available

    - by Emile Baizel
    I just created a Large (m1.large) 64-bit instance, which comes with 850 GB of instance storage (see the Large Instance entry at http://aws.amazon.com/ec2/instance-types/). A 'df -h' from the root folder gives me the output below. I believe /mnt is where the instance storage lives, but it is only showing me 414G. I have set up two servers and both show the same numbers.

        root@ip-11-11-11-11:/# df -h
        Filesystem            Size  Used Avail Use% Mounted on
        /dev/sda1             7.9G  1.1G  6.5G  14% /
        none                  3.7G  112K  3.7G   1% /dev
        none                  3.7G     0  3.7G   0% /dev/shm
        none                  3.7G   48K  3.7G   1% /var/run
        none                  3.7G     0  3.7G   0% /var/lock
        /dev/sdb              414G  199M  393G   1% /mnt
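
    For context, the m1.large's 850 GB is split across two ephemeral volumes rather than one disk, and by default only the first is mounted at /mnt, which is consistent with the ~414G shown. A sketch of how the second disk could be brought in; the device names are assumptions (they vary by AMI), and the volume must be included in the block device mapping at launch:

        # Sketch: format and mount the second instance-store disk (device names assumed).
        # The disk must have been mapped at launch, e.g. ephemeral1 -> /dev/sdc.
        sudo mkfs.ext3 /dev/sdc
        sudo mkdir -p /mnt2
        sudo mount /dev/sdc /mnt2
        df -h /mnt /mnt2        # both ephemeral disks should now be visible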

    Read the article

  • Snow Leopard Server, how to turn on compression for web

    - by gbrandt
    I am trying to make the webserver in Snow Leopard compress all output by default. The only thing I have found is to add SetOutputFilter DEFLATE in the .htaccess file for a directory. I really don't want to add an .htaccess file to every directory served. How can I globally get Apache2 on Snow Leopard to compress output?
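
    A hedged sketch of a server-wide alternative to per-directory .htaccess files: the stock Snow Leopard Apache normally reads /etc/apache2/httpd.conf, so the filter can be declared once there. Paths and module names are as commonly shipped and should be verified against the local install:

        # Sketch for /etc/apache2/httpd.conf (path assumed for the built-in Snow Leopard Apache).
        # Load mod_deflate if it is not already loaded:
        LoadModule deflate_module libexec/apache2/mod_deflate.so
        # Compress common text responses for every vhost and directory:
        AddOutputFilterByType DEFLATE text/html text/plain text/css text/javascript application/javascript
        # Then reload Apache:  sudo apachectl graceful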

    Read the article

  • Why doesn't the system information message shown when accessing an Ubuntu server match free -m?

    - by Andres
    Each time I SSH into my AWS Ubuntu servers I see a system information message showing load, memory usage and packages available to install, like this:

        Welcome to Ubuntu 12.04.3 LTS (GNU/Linux 3.2.0-51-virtual x86_64)

         * Documentation:  https://help.ubuntu.com/

          System information as of Sun Nov 10 18:06:43 EST 2013

          System load:  0.08               Processes:           127
          Usage of /:   4.9% of 98.43GB    Users logged in:     1
          Memory usage: 69%                IP address for eth0: 10.236.136.233
          Swap usage:   100%

          Graph this data and manage this system at https://landscape.canonical.com/

        13 packages can be updated.
        0 updates are security updates.

        Get cloud support with Ubuntu Advantage Cloud Guest
          http://www.ubuntu.com/business/services/cloud

        Use Juju to deploy your cloud instances and workloads.
          https://juju.ubuntu.com/#cloud-precise

        *** /dev/xvda1 will be checked for errors at next reboot ***

        *** System restart required ***

    My question is about the memory percentage shown. In this case it shows 69% memory usage, but since the swap usage was 100% I checked it myself. When I run free -m I get this:

                     total       used       free     shared    buffers     cached
        Mem:          1652       1635         17          0          4         29
        -/+ buffers/cache:       1601         51
        Swap:          895        895          0

    And that is of course closer to 100% than to 69%.
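
    The two figures are computed at different moments and can count different things: the login banner is generated by landscape-sysinfo when the session starts, while free -m reflects the instant it runs, and "used" can be reported with or without buffers/cache. A small sketch that derives both views from /proc/meminfo (no assumptions beyond a standard Linux /proc layout):

        # Sketch: compute memory usage with and without buffers/cache from /proc/meminfo.
        meminfo = {}
        with open("/proc/meminfo") as f:
            for line in f:
                key, value = line.split(":", 1)
                meminfo[key] = int(value.split()[0])   # values are in kB

        total = meminfo["MemTotal"]
        free = meminfo["MemFree"]
        cache = meminfo.get("Cached", 0) + meminfo.get("Buffers", 0)

        used_raw = total - free            # what the top line of `free` reports
        used_apps = used_raw - cache       # the "-/+ buffers/cache" view
        print("used incl. cache: %.0f%%" % (100.0 * used_raw / total))
        print("used by apps:     %.0f%%" % (100.0 * used_apps / total))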

    Read the article

  • ASP.NET Security Exception when Switching IIS7 to Use a UNC Path for Content

    - by Jeremy H.
    I have a Windows Server 2008 R2 box running IIS7.5 with Medium Trust configured for ASP.NET. When I have the website running from local content (e.g.: c:\inetpub\wwwroot) everything works fine. When I change IIS to use a UNC path for the content (e.g.: \\computer\wwwroot) I get the following error: Security Exception Description: The application attempted to perform an operation not allowed by the security policy. To grant this application the required permission please contact your system administrator or change the application's trust level in the configuration file. Exception Details: System.Security.SecurityException: Request for the permission of type 'System.Data.SqlClient.SqlClientPermission, System.Data, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089' failed. I'm trying to figure out why ASP.NET/IIS would allow for the SQL call when using local content but not when using a UNC path. Any ideas what I need to do to use a UNC path from IIS7 properly?
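
    For what it's worth, the exception text itself points at the trust level. A sketch of that route is below; whether raising trust is appropriate, or even permitted by the machine-level configuration in this environment, is a separate question, so this is only one of the options to weigh:

        <!-- Sketch only: raising the application's trust level in web.config,
             as the exception message suggests. -->
        <system.web>
          <trust level="Full" originUrl="" />
        </system.web>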

    Read the article

  • Magento Community - Hosting :: Need advice on which shared hosting will run Magento fast

    - by user43353
    Hi, I need advice on which shared hosting will run Magento Community fast, or some other inexpensive solution. This website will not have a lot of users; I only need it to run fast for 100-20 users at the same time. The problem with Magento is its database design, which makes the system very slow; other things are not the best either. I had hostmonster.com and justhost.com for a previous website, but they weren't fast enough even for a single user not located in the USA (my customer areas: Asia, Africa). Each action that involves the database takes a lot of time. Thanks

    Read the article

  • Why does jmeter not work?

    - by Foolish
    I use JMeter's proxy server to record requests and then run a performance test against everything it recorded. These requests contain a POST form. When I run the test cases, the POST form doesn't work -- it does not create a record in the website's database. Before this I used WebLOAD and everything was OK. What is the problem, and what can I do about it?

    Read the article

  • ftp user and www-data

    - by andrii
    I have a website that should be completely accessible to Apache (the www-data user), and I have an FTP user (with the same username on the server: user1) who should be able to update the website. I need the user to be able to add files to the website directory, but www-data should have full access to the added files. user1 is a member of the www-data group. Right now, when user1 uploads a file to the server the file gets permissions 644, but I need 664. How can I arrange this? Many thanks
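
    A sketch of the usual arrangement, assuming vsftpd (the FTP daemon is not named in the question, and the directive differs for other daemons): relax the FTP user's umask so group write survives, and make new files inherit the www-data group:

        # Sketch: assumes vsftpd; /var/www/site is a placeholder path.
        # In /etc/vsftpd.conf:
        local_umask=002                      # uploads become 664 / 775 instead of 644 / 755

        # Make new files keep www-data as their group:
        chgrp -R www-data /var/www/site
        chmod g+s /var/www/site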

    Read the article

  • amazon dynamoDB or MySQL for storing large arrays inside each row

    - by Logan Besecker
    I am trying to decide which database I should use for an application I'm making. I was leaning toward DynamoDB because of its scalability, but then I read in the documentation that there is a limit of 64 KB on the item size, although it looks like MySQL has a similar restriction, documented here. This application will be storing a lot of data in two arrays, which could contain upwards of 10,000-100,000 strings each. I estimate that these strings will each be somewhere around 20 characters long, so each element of the array will be around 40 bytes and each array could be around 4 MB. Given this predicament, what database on Amazon AWS would you use, or how would you get around the limit on size per row? Thanks in advance, Logan Besecker
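
    The size math in the question already rules out keeping the whole array in a single item or row. A sketch of the arithmetic plus one common workaround, splitting the array across several items that share a key; this is plain Python with illustrative names, not a call into any AWS SDK:

        # Rough size estimate and a simple chunking scheme (illustrative only).
        strings = ["x" * 20] * 100000                  # ~100k strings of ~20 chars
        approx_mb = sum(len(s) for s in strings) / 1024.0 / 1024.0
        print("%.1f MB of raw characters" % approx_mb)

        ITEM_LIMIT = 64 * 1024                         # DynamoDB's 64 KB item limit at the time

        def chunk(seq, max_bytes=ITEM_LIMIT // 2):     # leave headroom for attribute names
            batch, size = [], 0
            for s in seq:
                if size + len(s) > max_bytes and batch:
                    yield batch
                    batch, size = [], 0
                batch.append(s)
                size += len(s)
            if batch:
                yield batch

        items = [{"id": "array1", "part": i, "values": part}
                 for i, part in enumerate(chunk(strings))]
        print(len(items), "items instead of one oversized row")

    The other frequently suggested pattern is to store the arrays as objects in S3 and keep only their keys in the database.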

    Read the article

  • Turn Off Google Chrome Annoying Link Hover

    - by Volomike
    I like Google Chrome a lot, but there are two problems I have with it: the address bar search history feature, which I want to turn off, and the feature where hovering over a hyperlink shows a light blue tooltip in the bottom left-hand corner telling me where that link goes. I really don't care where a given link goes and wish I could turn that feature off, even if I have to install a Google Chrome extension. For these two reasons, I will not install Google Chrome as my primary browser and will remain with Firefox. So, is there a fix to turn off the annoying link hover tooltip?

    Read the article

  • Why do AWS spot-instance prices spike above the "on demand" pricing?

    - by Laykes
    Amazon pricing on spot instances shows some inconsistencies, best explained through the historical price charts. If you look at the prices for many spot instances, you will notice regular patterns of spikes. As you can see, the price for this compute-medium instance regularly spikes above the on-demand price. A c1.medium instance (on demand) would only cost $0.186 per hour, but for a period of a few weeks, in zone B, the price would regularly spike to $1.20 -- some 6 times the on-demand price. It's also not isolated: if you look at zone B again for small instances, there is a similar, frequent spike, going to 4x the on-demand pricing. Does anyone know why this happens? Here are a few suggestions:

    1. Someone entered $1.20 instead of $0.12 (I would discount this, since it happened 20 times over the space of 3 weeks).
    2. Amazon regularly and artificially inflates its prices by bidding on its own instances to get the most bang for their buck (I would discount this, since it would be ridiculous and bad business).
    3. Some company launched 1000 servers at once and wants to make sure that they all launch (I would discount this, since they would presumably bid at a price below the minimum on-demand price -- why would you pay above on demand for a single server?).
    4. It's a bug in their reporting?
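
    For anyone who wants to examine these spikes themselves, the price history can be pulled programmatically. A sketch with boto3; the region, instance type and time window are just examples, and configured AWS credentials are assumed:

        # Sketch: pull recent spot price history for c1.medium (boto3 and credentials assumed configured).
        import datetime
        import boto3

        ec2 = boto3.client("ec2", region_name="us-east-1")
        resp = ec2.describe_spot_price_history(
            InstanceTypes=["c1.medium"],
            ProductDescriptions=["Linux/UNIX"],
            StartTime=datetime.datetime.utcnow() - datetime.timedelta(days=7),
            EndTime=datetime.datetime.utcnow(),
        )
        for p in resp["SpotPriceHistory"][:10]:
            print(p["AvailabilityZone"], p["Timestamp"], p["SpotPrice"])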

    Read the article

  • Installing an EC2 (EBS based) instance on another AWS account

    - by imaginative
    We're moving from one AWS account to another. I have a couple of EC2 instances I would like to move over; these instances are EBS-based. Googling around, I'm seeing all sorts of convoluted answers for how to launch an EBS-based instance under a new account. Surely there has to be a simple way of going about this. So far I have backed up the EBS volume as a snapshot and altered the permissions of the snapshot to allow the new AWS account number to access it. I am not sure where to go from here. Do I have to create an image from the EBS snapshot? Upload it to S3? What's the next step(s)?
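
    A sketch of the snapshot-sharing route with boto3, matching the permission change already described in the question: grant the target account createVolumePermission, then build a volume (or register an AMI) from the shared snapshot inside that account. The snapshot ID, account ID, region and zone below are placeholders:

        # Sketch: share an EBS snapshot with another AWS account, then use it there.
        # IDs are placeholders; credentials for each account are assumed configured.
        import boto3

        # Run with credentials for the OLD account:
        old = boto3.client("ec2", region_name="us-east-1")
        old.modify_snapshot_attribute(
            SnapshotId="snap-0123456789abcdef0",
            Attribute="createVolumePermission",
            OperationType="add",
            UserIds=["111122223333"],          # the new account's ID
        )

        # Run with credentials for the NEW account:
        new = boto3.client("ec2", region_name="us-east-1")
        vol = new.create_volume(SnapshotId="snap-0123456789abcdef0", AvailabilityZone="us-east-1a")
        print("New volume:", vol["VolumeId"])

    To end up with a bootable instance rather than a bare volume, the new account would register an AMI whose root device points at the shared snapshot and launch from that.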

    Read the article

  • How to set up a VirtualBox appliance as a portable web-dev solution?

    - by tenshimsm
    I want to set up a VirtualBox virtual appliance so that it is portable, meaning a network configuration that does not need to be changed when I use my laptop on a different network. I want the virtual machine to have internet access so it stays updated, and I want the host to always have direct access to it at, for example, 10.0.2.100, even when I am on a 192.168.0.1 network. So the first virtual network adapter would have a static IP (10.0.2.100) and the second would get its address from DHCP. I don't know if two virtual adapters are needed to accomplish that, or just one.
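
    A sketch of the two-adapter setup that is usually used for this: NAT for outbound internet plus a host-only adapter that gives the guest a stable address the host can always reach, regardless of the outside network. The VM name and interface name are placeholders:

        # Sketch: VM and interface names are placeholders.
        VBoxManage hostonlyif create                      # creates e.g. vboxnet0 on the host
        VBoxManage modifyvm "webdev" --nic1 nat
        VBoxManage modifyvm "webdev" --nic2 hostonly --hostonlyadapter2 vboxnet0
        # Give the guest's second interface a static address in vboxnet0's subnet
        # (192.168.56.x by default); the guest then stays reachable at that address
        # no matter which network the laptop is on.

    Note this swaps the 10.0.2.x NAT address in the question for a host-only address, since plain NAT does not allow direct host-to-guest access without port forwarding.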

    Read the article

  • How to Install Python Modules on Web Server?

    - by sanghan
    I'm running a Python CGI script in the cgi-bin directory which uses the sqlite3 module. When I run it, it says that it does not recognize the name. So how do I install this module, or other modules, on a server hosted by Network Solutions? The Python documentation has this: python setup.py install --home=<dir> but I have no idea where or how I would run that line. Any help would be appreciated.
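
    A sketch of the typical shared-hosting pattern: install the package into a directory under the account's home with --home (run over SSH, or whatever shell access the host provides), then point the CGI script at that directory before importing. The paths are examples, not Network Solutions specifics:

        # In the CGI script, before importing the locally installed module.
        # (Assumes the package was installed with:  python setup.py install --home=$HOME/python
        #  which places pure-Python packages under $HOME/python/lib/python.)
        import os
        import sys

        sys.path.insert(0, os.path.expanduser("~/python/lib/python"))

        import sqlite3   # or e.g. pysqlite2, depending on what was actually installed

    If the host's Python was built without sqlite support at all, a locally installed sqlite binding (rather than the standard-library sqlite3) is the usual workaround.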

    Read the article

  • How can I install Satchmo?

    - by Jonathan Hayward
    I am trying to install Satchmo 0.9 on an Ubuntu 9.10 32-bit guest, following the instructions at http://bitbucket.org/chris1610/satchmo/downloads/Satchmo.pdf. I run into difficulties at 2.1.2:

        pip install -r http://bitbucket.org/chris1610/satchmo/raw/tip/scripts/requirements.txt
        pip install -e hg+http://bitbucket.org/chris1610/satchmo/@v0.9#egg=satchmo

    The first command fails because of a compile error while it tries to build PIL. So I ran an "aptitude install python-imaging", made a local copy of the first line's requirements.txt, and removed the line that unsuccessfully tries to build PIL. The first command then completes without reported error, as does the second. The next step tells me to change directory to the /path/to/new/store and run:

        python clonesatchmo.py

    A little bit of trouble here; I am told that clonesatchmo.py will be in /bin by now, and it isn't there, but I put some Satchmo stuff under /usr/local, created a symlink in /bin, and ran:

        python /bin/clonesatchmo.py

    This gives:

        jonathan@ubuntu:~/store$ python /bin/clonesatchmo.py
        Creating the Satchmo Application
        Traceback (most recent call last):
          File "/bin/clonesatchmo.py", line 108, in <module>
            create_satchmo_site(opts.site_name)
          File "/bin/clonesatchmo.py", line 47, in create_satchmo_site
            import satchmo_skeleton
        ImportError: No module named satchmo_skeleton

    A find after apparently checking out the repository reveals that there is no file with a name like satchmo*skeleton* on my system. I thought that bash might be prone to treat part of the second pip invocation's URL as the beginning of a comment, so I tried both:

        pip install -e hg+http://bitbucket.org/chris1610/satchmo/@v0.9\#egg=satchmo
        pip install -e hg+http://bitbucket.org/chris1610/satchmo/@v0.9#egg=satchmo

    Neither way of doing it takes care of the import error mentioned above. How can I get a Satchmo installation under Ubuntu, or at least enough of a Satchmo installation that I am able to start with a skeleton of a store and then flesh it out the way I want? Thanks, Jonathan

    Read the article

  • Symantec site slow only when using a Mac?

    - by cozmokramer8
    I didn't know this was possible, but the site certainly seems slow only when using a Mac. I participate in the Symantec Connect forums regularly, and when I visit them with my Windows PC or laptop they load perfectly, but on my Mac laptop the site rarely loads, or if it does, each page takes a couple of minutes. My question is: why would a website run slower on the Mac than on the PC? No other sites show this issue, just the Symantec Connect forums. Thanks, Grant

    Read the article

  • Ubuntu web server 11.10 ftp/server issue

    - by Nate
    I was wondering if I could get some help with FTP -- at least I'm pretty sure it has to do with FTP, although it could be something else; I'm not 100% sure. For fair warning, I'm no Ubuntu dominator, I'm pretty new.

    I've built a web server to test PHP and whatnot for a site I'm building, and everything works: the PHP, the SQL, etc. By the way, I built it in VMware, so it's virtual and reachable over the network, and I can access it from anywhere (I'm in a college right now).

    The one problem I have is this. I go into the terminal and do ifconfig to find my IP, then go to a browser on a different machine and type that IP in. I get the "Index of /" page, where I can browse the website I'm making, click through folders, and open things. But say I'm working on my desktop and drag and drop something onto the server with FTP, then go to the IP in the browser again and try to open it. I get either:

        Server error
        The website encountered an error while retrieving http://my_server_ip/phpinfo.php.
        It may be down for maintenance or configured incorrectly.
        Here are some suggestions: Reload this webpage later.

    or:

        Forbidden
        You don't have permission to access /html.html on this server.

    But if I create the file on the server itself and try it -- bam, magic, it works. I'm sure I set the permissions to let everyone open and view the files, but maybe I didn't? I'm not sure, and this is where I was hoping I could get some help.

    By the way, I followed a tutorial on changing the www folder (Apache) from /var/www to home/"user"/www. I can't recall how I did it, but it's there and my FTP goes to the home/"user"/www folder.

    Anyway, any and all help is appreciated. Like I said, I'm really new to this, but I do enjoy attempting to make these servers and learning how they work, so it's not like making this web server is a project for a class; it's just helping me test stuff for another class and possibly other websites later on down the road. Anyway, anyone who decides to help, thanks so much, I'd really appreciate it. Nate.

    P.S. I'm using Ubuntu 11.10 desktop edition with a LAMP server
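
    On the permissions point, a sketch of the usual arrangement for a web root that lives under a home directory. The path is a placeholder standing in for the question's home/"user"/www layout, and the assumption is that the "Forbidden" errors come from files or directories the www-data user cannot read or traverse:

        # Sketch only: WEBROOT is a placeholder for the question's home/"user"/www directory.
        WEBROOT=/home/youruser/www
        sudo chmod o+x /home/youruser          # let Apache traverse the home directory
        sudo chgrp -R www-data "$WEBROOT"
        sudo chmod -R g+rX "$WEBROOT"          # group read, plus execute on directories only

    If uploads keep arriving with restrictive permissions, the FTP daemon's umask setting is the next place to look.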

    Read the article
