Search Results

Search found 9935 results on 398 pages for 'pages'.

Page 222/398

  • How accurate is apache benchmark?

    - by matthewsteiner
    Alright, so I'm in development right now and I'd like to understand exactly how good the benchmarks are. I've just been using ApacheBench (ab). Do the numbers include the server actually sending the files? Also, is "requests per second" literally how many users can visit the page within one second? If it's at 30 requests per second, can literally 30 people be refreshing pages every second and the server will be fine? It seems like a lot to me. I know a lot of people get way better stats out of their servers, but I haven't done much optimization yet. Also, will increasing your RAM increase your rps linearly? I have 512 MB, so if I upgrade to 1 GB, would that mean I'd get about 60 rps? How does concurrency affect rps?
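
    For reference, a typical ApacheBench run looks something like the sketch below; the URL and request counts are placeholders, not values from the question.

        # 1000 requests total, 30 concurrent, against a placeholder URL
        ab -n 1000 -c 30 http://localhost/index.php
        # The summary reports "Requests per second", "Time per request" and
        # "Transfer rate"; the transfer figures do include the bytes served.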

  • Opening offline version of Microsoft Books Online in browser

    - by ercan
    I often use the MSDN website for language reference. In order to make navigation faster, I downloaded the offline version of SQL Server 2005 Books Online from here: http://www.microsoft.com/downloads/details.aspx?familyid=be6a2c5d-00df-4220-b133-29c1e0b6585f&displaylang=en The reason it is 137MB is that it comes with its own GUI, which, not surprisingly, is rather poor! Apparently, though, the pages are written in HTML. The URIs look like: ms-help://MS.SQLCC.v9/sqlcc9/html/674933a8-e423-4d44-a39b-2a997e2c2333.htm . I can open the URI in IE, but with errors. Do you know whether I can open them with Firefox, and how? Or is there a simple HTML version of "MS Books Online", for example in a ZIP file?

  • Snow Leopard Permissions in Shared Folders reset on saving file

    - by jan
    I have several users who access their accounts on OS X through their Windows machines over Samba. As soon as they update/save a file, the permissions are reset to -rwxr-----, which means no other users can read the files. This affects the apache user, for example, so changes to files under a user's Sites directory mean Apache can no longer serve the pages. I've looked into /etc/smb.conf, /var/db/smb.conf, and /var/db/samba/smb.shares, but I can't figure out how to force it to use the parent folder's permissions. Thanks in advance.
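
    For orientation, the per-share smb.conf parameters that usually govern this look roughly like the sketch below; the share name and mode values are examples, not a verified fix for this setup.

        [homes]
            # force group/world read bits on newly written files (example modes)
            create mask = 0644
            force create mode = 0644
            directory mask = 0755
            # alternatively, take permissions from the parent directory
            inherit permissions = yes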

  • Best practices for re-IP'ing / migrating servers and applications

    - by warren
    Some of this question would be highly application-specific, but what approaches do you take when looking to migrate applications from one server/platform to another, and servers from one network segment to another? For applications that can't be re-IP'd (many exist in this category), the general answer is to nuke and pave (or extend a clusterable application, then remove the segment that needs to be "moved"). For "normal" applications (httpd, mail, directory services, etc.), what are the checks you perform before, during, and after a move to ensure the health of the migrated app/server? An example with Apache: back up the httpd conf directory; change the httpd conf files to use the new IP address of the server; change (or add) the IP of the server; restart Apache; verify the web server still serves pages; reboot the server; verify the environment comes back up healthy.
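
    A rough shell sketch of that Apache checklist follows; the paths assume a Red Hat-style layout and the old/new addresses are placeholders.

        # back up the httpd configuration directory
        cp -a /etc/httpd/conf /root/httpd-conf-backup-$(date +%F)
        # update any hard-coded references to the old address
        grep -rl '203.0.113.10' /etc/httpd/ | xargs sed -i 's/203.0.113.10/198.51.100.10/g'
        # re-check syntax, restart, and confirm pages are still served
        apachectl configtest && apachectl restart
        curl -sI http://198.51.100.10/ | head -n 1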

  • Latency in TCP/IP-over-Ethernet networks

    - by aix
    What resources (books, web pages, etc.) would you recommend that: explain the causes of latency in TCP/IP-over-Ethernet networks; mention tools for looking out for things that cause latency (e.g. certain entries in netstat -s); suggest ways to tweak the Linux TCP stack to reduce TCP latency (Nagle, socket buffers, etc.)? The closest I am aware of is this document, but it's rather brief. Alternatively, you're welcome to answer the above questions directly. Edit: To be clear, the question isn't just about "abnormal" latency, but about latency in general. Additionally, it is specifically about TCP/IP-over-Ethernet and not about other protocols (even if they have better latency characteristics).
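
    As a small illustration of the kind of counters and knobs the question has in mind (the grep patterns and sysctl names are only examples, not tuning advice):

        # retransmission, queue-overflow and buffer-pruning counters often hint at latency sources
        netstat -s | egrep -i 'retrans|listen|prune|collapse'
        # socket buffer sizing (min/default/max) for the Linux TCP stack
        sysctl net.ipv4.tcp_rmem net.ipv4.tcp_wmem
        # Nagle itself is disabled per socket via the TCP_NODELAY option, not via a sysctl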

  • Apache, modifying response codes from 404 to 301

    - by user72539
    Hi, I'm running a Magento installation on an Apache server. There are many pages that are both indexed in Google and linked to from external sites. I can't use 301 redirects in a .htaccess file, as I can't be sure I will catch all the links. At the moment all requests are rewritten through Magento, and if a request isn't found Magento returns a 404 File Not Found. Is there a way of using one of the Apache modules to filter the response from Magento and, if a 404 Not Found is being sent back, replace the response with a standard 301 redirect to the home page? E.g. request -> Apache -> rewritten to Magento's index.php -> page processed. Response: if the request exists, return the results (200); if the request doesn't exist, return 404 -> Apache filter changes the response -> return a 301 redirect to /. I appreciate any help. Thanks, Jon. Edit: as far as I am aware, mod_rewrite is only used to rewrite requests and doesn't allow modification of responses.

  • Why do browsers have so many possible exploits?

    - by Beau Martínez
    When browsing I am occasionally given warnings about pages that host malware "that could damage my computer". I am seriously perplexed as to why, in 2010, browsers still have possible exploits and can be cracked. My question is "Why?". I'm assuming it's because of the quick development that occurred during the browser wars and was insufficiently tested, but I'm unsure. Surely WebKit would have patched all the issues in KHTML, or Gecko sorted out the flaws in Netscape's engine, and the IE coders sorted through their codebase to eliminate possible flaws? (Somewhat related: http://superuser.com/questions/117770/which-browser-is-the-most-secure-research-and-practically-based.)

  • Configuring Apache for multiple clients

    - by Chris_K
    Last week I had a question here about suexec / suphp but I tried to accomplish too much. I'm going to narrow the scope a bit and try again. I'd like to configure a LAMP server to host multiple clients. I'd like it to seem (from the client's viewpoint) just like any other shared hosting environment. Web sites in their home directory, no need to muck around with file ownerships to get pages served, etc. It would seem that a configuration that involves suexec and suphp is the way to go(?) I'm specifically looking for a current/modern guide on how to accomplish this (I'll be using CentOS if it matters) and I'm afraid I need more than a link to Apache docs. Are there any good How-To's out there? The few I've found have been pretty out of date, but it is quite possible my search was weak.
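
    For orientation, a per-client virtual host under suexec/suPHP tends to look something like the sketch below; the directive names come from mod_suexec and mod_suphp, while the user, group and paths are invented for the example.

        <VirtualHost *:80>
            ServerName client1.example.com
            DocumentRoot /home/client1/public_html
            # CGI/FastCGI scripts run as the client's own user and group (mod_suexec)
            SuexecUserGroup client1 client1
            # PHP scripts likewise run as the client, via mod_suphp
            suPHP_Engine on
            suPHP_UserGroup client1 client1
        </VirtualHost>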

  • Is there a browser extension which can copy a webpage snippet/clip/snapshot to the clipboard

    - by Yuriy Kulikov
    I have to save links to web pages into a Google Drive document. What I want is to copy a simple webpage snippet to the clipboard, similar to the one the G+ extension button creates: one picture, and the page title as a link. I have been looking for many days now and I still couldn't find anything that does the same thing. In the end, the thing I came up with is to hit the G+ button and copy the contents of the popup. I am wondering if anybody knows how this can be done right? Thanks in advance, BR, Yuriy

  • What Defines an AD Object as "Inactive"

    - by Malnizzle
    I am going to be using some DSQUERY/DSMOVE scripts to clean up my AD domain. One option is to move inactive objects to an OU that has restrictive GPOs applied to it. Something like: DSQUERY computer -inactive 10 | DSMOVE -newparent <distinguished name of target OU> My question is: what value defines an object, both user and computer, as "inactive" for a period of time? Is it the last time a computer was logged on to for computer accounts, and for users is it the last time the user account logged on to a computer? But what if, for example, I had a web server that wasn't rebooted or logged into for a couple of months but remained powered on and functioning as normal? Would it be defined as "inactive", whereas technically it's still serving web pages and so on? Thanks for the help!
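
    For comparison, the user-account variant is sketched below; the target OU is a placeholder, and note that -inactive counts weeks and relies on the coarse, replicated lastLogonTimestamp attribute.

        rem move user accounts with no logon recorded for ~10 weeks (placeholder OU)
        dsquery user -inactive 10 | dsmove -newparent "OU=Stale,DC=example,DC=com"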

  • Apache is running but there is no page displayed

    - by Michael Ozeryansky
    I am on Mac OS X and I am using the built-in PHP and Apache 2. I have been setting up MySQL, and finally, when I got MySQL working, my local site won't display. Do note that I did have the web server running and delivering PHP-enabled pages, just with no database connection. But my question is not about MySQL. I have changed various settings in the 'httpd.conf' file, and I have the line '127.0.0.1 localhost' in my hosts file. I also have other aliases pointing to 127.0.0.1. I have checked everything I could about Apache and I have made sure that every message in the error_log is OK. I currently have my LogLevel set to debug, so I get all the messages. At this point (HOURS of self-fixing) I think I need help. What can I provide for someone to figure this out with me? Thanks.
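
    A few quick checks that usually narrow this kind of problem down, sketched for the stock OS X Apache; the log path and URL are assumptions and may differ on a customized setup.

        # confirm the configuration parses and see how Apache resolved hosts/ports
        apachectl configtest
        httpd -S
        # request the site locally and watch the error log while doing so
        curl -v http://127.0.0.1/
        tail -f /var/log/apache2/error_log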

  • Logging all Firefox HTTP Request Headers?

    - by Hayek
    I'm using Ruby + Watir to request pages through Firefox. I would like to record the headers and content of every HTTP request made through the browser. Would it be possible to configure a proxy solution to store this information, either in a file or piped into an application? I'm running Ubuntu x64. Edit: I would like to store the data in logs because I would like to view it later. Preferably, I am looking for a solution that runs quietly in the background and stores the headers/content in files.
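
    One low-level option, assuming the traffic is plain HTTP rather than HTTPS, is to capture it on the wire and inspect it later; a sketch, with the interface and port as placeholders.

        # write full packets to a capture file for later inspection (e.g. in Wireshark)
        sudo tcpdump -i any -s 0 -w firefox-http.pcap tcp port 80
        # or dump headers and bodies as ASCII straight into a log file
        sudo tcpdump -i any -s 0 -A tcp port 80 > http-headers.log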

  • Apache Error Log - "Web Path" instead of Filesystem Path

    - by Craconia
    Hello everyone, I'm running Apache on Linux and I'm using OpenSSH to provide SFTP access to some customers so they can upload their pages and also look at their respective site logs (access & error). I'm using the new feature in OpenSSH to chroot their SFTP access, and so far so good. My problem is that in the error_log, every "File not found..." entry is given using the OS filesystem path as opposed to the "web" path. I'd rather have the web path in the error log so as not to reveal the OS path. Since I'm already chrooting the users, I don't want to reveal WHERE on the OS their files are actually located... Is it possible to change this behaviour via any directive? I tried looking for it but couldn't find anything :( Thanks, Craconia

  • Windows XP problems displaying internet browser backgrounds correctly

    - by Samurai Waffle
    My friend has a Windows XP computer that doesn't show the colored background on pages; it's always white. On top of that, some pictures won't show up; there will just be an empty white frame. Also, when you left-click on a folder, instead of opening it up, it opens a new window that turns out to be the search results window. I've never heard of these problems before, and I can't find any information on the internet about them. I assume it's a virus deeply embedded in the system, but no virus scanner has found it. Thanks for the help!

  • Custom 403 Error page not showing

    - by Rahul Sekhar
    I want to restrict access to certain folders (includes, xml and logs, for example) and so I've given them 700 permissions, and all files within them 600 permissions. Firstly, is this the right approach to restrict access? I have a .htaccess file in my root that handles rewriting and error documents. There are two pages in the root - 403.php and 404.php - for 403 and 404 errors. And I have these rules added to my .htaccess file:
        ErrorDocument 404 /404.php
        ErrorDocument 403 /403.php
    Now, the 404 page works just fine. The 403 page does not show when I try to access the 'includes' folder - I get the standard Apache 403 error page instead, saying 'Additionally, a 404 Not Found error was encountered while trying to use an ErrorDocument to handle the request.' However, when I try going to the .htaccess file (in the web root) in my browser, I get my custom 403 error page. Why is this happening?
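
    As an aside on the first question: access to such folders is more commonly blocked at the web-server level than with filesystem permissions, for example with a small .htaccess inside each protected folder (Apache 2.2 syntax; a sketch rather than a drop-in fix for the 403 issue above).

        # .htaccess placed inside includes/, xml/ and logs/:
        # refuse all web requests while PHP includes from disk keep working
        Order allow,deny
        Deny from all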

  • .htaccess www redirect in a parent folder that also applies to its child folders

    - by ServerChecker
    We were having a problem with the Norton seal not showing up on our affiliate marketing landing pages (landers). It turns out the Norton seal was super picky about the "www." prefix. I had folder paths like /lp/cmpx, where x was a number 1-100 indicating the advertising campaign number. So, to remedy this, I stuck this in my .htaccess file right after the RewriteEngine On line:
        RewriteCond %{HTTP_HOST} ^example\.com
        RewriteRule ^(.*)$ http://www.example.com/lp/cmp1/$1 [R=302,QSA,NC,L]
    The trouble is, I had to do that under every campaign folder, changing cmp1 to whatever the folder name was. Therefore, my question is: is there a way I can do this with an .htaccess file under the parent folder (/lp in this case) and have it work for each of the children? EDIT: Note that I stuck an .htaccess file in /lp just now to test:
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^example\.com
        RewriteRule ^(.*)$ http://www.example.com/lp/$1 [R=302,QSA,NC,L]
    This had no effect on the /lp/cmpx folders underneath, to my dismay.
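
    One pattern that can avoid the per-campaign copies is to let the parent rule rebuild the full path from %{REQUEST_URI} instead of hard-coding cmp1. A sketch follows; it assumes the child folders' own .htaccess files do not start a competing rewrite (a child RewriteEngine On would otherwise override the parent rules unless they are explicitly inherited).

        RewriteEngine On
        # only fire on the bare domain, and send the visitor to the same path on www
        RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
        RewriteRule ^ http://www.example.com%{REQUEST_URI} [R=301,L]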

  • How to configure custom error page in Plesk 9.3 for non existing folder?

    - by Junior Mayhé
    I'm trying to configure Plesk to show website visitors a custom error HTML page. The hosted site is an ASP.NET site, and it shows its custom errors via the error403.aspx and error404.aspx files. Now, to comply with Plesk, I've created error_docs with the required files like forbidden.html, etc. When a user tries to navigate to http://mysite.com/a_missing_page.aspx, the visitor is redirected to error404.aspx correctly. But when a user tries to navigate to a non-existent directory, http://mysite.com/a_missing_folder/, the site takes me to the regular IIS 404 page. Plesk has custom error documents activated in the web hosting settings. The ASP.NET error pages defined in web.config are showing fine, but it seems Plesk won't show its custom HTML error documents. The bottom line here is about setting up a custom error page for a directory. Is it possible to do this using Plesk, or do I have to change it manually in IIS?
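
    For what it's worth, the IIS7-level equivalent, which also catches requests that never reach ASP.NET (such as a missing folder), is normally configured through the httpErrors section of web.config. The sketch below reuses the page name from the question; treat it as an illustration, not a Plesk-blessed change.

        <system.webServer>
          <httpErrors errorMode="Custom" existingResponse="Replace">
            <remove statusCode="404" />
            <error statusCode="404" path="/error404.aspx" responseMode="ExecuteURL" />
          </httpErrors>
        </system.webServer>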

  • Explanation of command to uppercase the first letter of a filename

    - by hazielquake
    I'm trying to learn to rename files with the command line, and after browsing around a lot of pages I finally found a command that uppercases the first letter of a filename, but I want to understand the meaning of each part. The command is: for i in *; do new=`echo "$i" | sed -e 's/^./\U&/'`; mv "$i" "$new"; done I understand the 'for', kinda... but not the 'echo' or the '`', and especially not the sed command. If someone has a little patience to explain the meaning of each piece, that'd be awesome! Thanks!
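
    For reference, here is the same command pulled apart with comments; it assumes GNU sed, since \U is a GNU extension.

        for i in *; do                            # loop over every name in the current directory
            # the backticks run the pipeline and capture its output;
            # echo feeds the current name into sed
            new=`echo "$i" | sed -e 's/^./\U&/'`
            # in the sed substitution: ^. matches the first character,
            # & stands for whatever was matched, and \U uppercases it
            mv "$i" "$new"                        # rename the file to the modified name
        done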

  • Can 'screen' grab an existing process and tie itself to it?

    - by warren
    Scenario: I started a process that's going to take "a while" to complete, outside of screen. I need to leave the terminal, or the network hiccups, and the process is lost. It would be nice if I could start a process outside of screen, realize the error, run screen <magic-goes-here>, and have it grab the active process to itself. From the man pages and --help info, I don't see a way this can be done. Is this possible directly with screen? If not, is it possible to change the owning shell of a process, so that the bash (or other shell of your choosing) instance inside screen can have a command run which will change the parent shell of the initial process to itself from the originator?
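
    screen itself has no such command, but for completeness: the third-party reptyr tool can usually re-parent an already running process into a terminal you control, so one workaround looks roughly like the sketch below. The PID is a placeholder, and reptyr uses ptrace, so it may need root or a relaxed kernel.yama.ptrace_scope.

        # start a fresh screen session, then steal the running process by PID
        screen
        reptyr 12345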

  • Internet connection slower than network connection speed

    - by Mike Pateras
    I've got a computer connected to a wireless router on a different floor. When I look at the network connection, I'm told the signal strength is low and that I've got a connection of about 26 Mbps (often higher). However, the internet connection on that machine is very slow: speed tests show it at about 1-2 Mbps, and it really shows when loading pages and video. I have fiber-optic internet access, and the machine that's connected to the router/modem via cable gets 20 Mbps on speed tests and is extremely fast in everyday use. My question is: is the advertised 26 Mbps+ connection speed perhaps inaccurate, and is my wireless bandwidth the likely bottleneck here? Or is the signal strength what's key here? And what might I do about this? Power-cycling the router helped a bit; a speed test went as high as 6 Mbps after doing that.

  • Apache: how to set custom 401 error page and save original behaviour

    - by petRUShka
    I have Kerberos-based authentication with Apache/2.2.3 (Linux/SUSE). When a user tries to open some URL, the browser asks him for a domain login and password, as in HTTP Basic auth. If the user cancels the request 3 times, Apache returns the 401 Authorization Required error page. My current virtual host config is:
        <Directory /home/user/www/current/public/>
            Options -MultiViews +FollowSymLinks
            AllowOverride None
            Order allow,deny
            Allow from all
            AuthType Kerberos
            AuthName "Domain login"
            KrbAuthRealms DOMAIN.COM
            KrbMethodK5Passwd On
            Krb5KeyTab /etc/httpd/httpd.keytab
            require valid-user
        </Directory>
    I want to set a nice custom 401 error page with some instructions for users, so I added this line to the virtual host config:
        ErrorDocument 401 /pages/401
    It works: when the user can't authorize, Apache redirects him to my nice page. But Apache no longer asks the user for a login\password as it did before. I want this functionality and the nice error page simultaneously! Is it possible to make it work properly?

  • Asp.net 4.0 Handler Mappings Missing in IIS7

    - by Marc
    I have two Windows 2008 R2 servers running an ASP.NET 4.0 app. The server that is having problems actually loads ASP.NET pages just fine, but any Ajax calls don't work. I noticed there are no .NET 4.0-specific handler mappings in IIS for this server like the other server has. It's literally missing all .NET 4.0 mappings (.axd, .soap, .cshtm, .ashx and even .aspx). I've tried running "aspnet_regiis -ir" but that didn't help. Should I reinstall the .NET 4.0 Framework? Manually add all these missing mappings? Is there something else going on? What I don't want to do is add a ton of handlers to a web.config; they aren't needed on the server that works, so they shouldn't be needed on the broken one.
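
    One detail that may matter here, offered as a suggestion rather than a confirmed diagnosis: the -ir switch registers .NET 4.0 without updating IIS script maps, so re-running registration with -i (from the 64-bit framework directory on an x64 server) is the usual first step.

        cd %windir%\Microsoft.NET\Framework64\v4.0.30319
        aspnet_regiis -i
        iisreset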

  • Problems with Adobe Flash

    - by Georg Scholz
    I'm running Windows 7, 64-bit, on a Core i5 HP notebook. For approximately the last 3 months I've had major problems with Adobe Flash. Flash has been uninstalled and installed again multiple times, with no change. Here is what happens in different browsers (all browsers are on the latest version as of this post): Firefox: doesn't work at all. This bothers me most, since FF is my favorite browser and I'm using a lot of plugins. I have tried deactivating all plugins, but no change. IE and Chrome: they work, but most pages with Flash are stuck during page load for ~30 seconds; after that, everything is fine. Opera: no problems, everything working as it should. Strange, eh? Any help is highly appreciated!

  • Rendering extension-less files with PHP creates 404 errors when accessing the directory index

    - by ojcar
    I'm trying to render all files in a directory as PHP files. These files don't have any extension. I do this by adding the following .htaccess file:
        SetHandler application/x-httpd-php5
        DirectoryIndex index index.php index.html
    The problem is that I'm getting 404 errors when accessing the index file at a URL. For instance, http://foo.com/mydir/ will result in a 404 error (in the logs) and a "No input file specified" message in the browser. If I remove the SetHandler line, things work correctly for the index file, but my other pages do not render as PHP. Environment: PHP 5.2.11, Apache 2.2.14, Linux. Any ideas of what I'm doing wrong?
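
    One direction worth sketching: scope the handler to extension-less file names only, so directory requests are not themselves forced through PHP. The regex is an assumption about the naming scheme, not a tested fix.

        # apply the PHP handler only to names containing no dot at all
        <FilesMatch "^[^.]+$">
            SetHandler application/x-httpd-php5
        </FilesMatch>
        DirectoryIndex index index.php index.html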

  • Export a single layer as an image in Photoshop

    - by wrburgess
    A lot of designers send me layered PSDs of their designs, and I need to break out the pieces of the designs to place on web pages. I can do a decent number of things in Photoshop, but I'm hardly efficient with it. My old way of just copying the image that's in a layer and pasting it into a new image seems to take forever as I screw around with cropping and such. I've got Photoshop CS5, so I don't need external software to do anything; I just need to figure out how to take a single layer, which may hold something small like an icon, and export it as a PNG or JPG. I am aware of the script called "Export Layers to Files", but it took about an hour and exported ALL of my layers to a huge number of files. I wasn't looking for a solution that broad. Is there an easy way to do this?
