Search Results

Search found 2190 results on 88 pages for 'htaccess'.

Page 1/88 | 1 2 3 4 5 6 7 8 9 10 11 12  | Next Page >

  • Seeking .htaccess help: Converting multiple subdomains (both HTTP and HTTPS) to www.domain.com using .htaccess

    - by Joshua Dorkin
    I've been trying to get an answer to this question on other forums (the folks at SuperUser thought this was the place I needed to post) and via my connections, but I haven't gotten very far. Hopefully you guys can help me find an answer. I've got a dozen old subdomains that have been indexed by Google. These have been indexed as both HTTP AND HTTPS. I've managed to redirect all the subdomains properly, provided they are not HTTPS, but can't get any of the HTTPS subdomains to properly redirect. Here's the code I'm using: RewriteCond %{HTTP_HOST} ^subdomain1.mysite.com$ [NC] RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L] RewriteCond %{HTTP_HOST} ^subdomain2.mysite.com$ [NC] RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L] RewriteCond %{HTTP_HOST} ^subdomain3.mysite.com$ [NC] RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L] This works great until someone goes to https://subdomain2.mysite.com, which is not redirected back to http://www.mysite.com. How can I get this to work? Additionally, I'm guessing there is an easier way to make it happen than setting up a dozen pairs of RewriteCond/RewriteRule? Is there any way to do this in just a few lines, including one where I list all the subdomains? I'd actually also like to redirect everything on https://www.mysite.com to http://www.mysite.com except for 3 folders. These are mysite.com/secure, mysite.com/store, mysite.com/user. Is there any good way to add this to the .htaccess file?
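
    One way to collapse the dozen RewriteCond/RewriteRule pairs into a single pair that also catches HTTPS is sketched below; the subdomain names are placeholders for the real list, and HTTPS requests will only ever reach this .htaccess if the SSL virtual host serves the same document root:

      RewriteEngine On
      # Any listed subdomain, on either scheme, goes to the canonical host
      RewriteCond %{HTTP_HOST} ^(subdomain1|subdomain2|subdomain3)\.mysite\.com$ [NC]
      RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]
      # Send HTTPS on www back to HTTP, except for the three secure folders
      RewriteCond %{HTTPS} =on
      RewriteCond %{HTTP_HOST} ^www\.mysite\.com$ [NC]
      RewriteCond %{REQUEST_URI} !^/(secure|store|user)(/|$)
      RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]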

    Read the article

  • Redirect Using htaccess

    - by manyxcxi
    I am trying to redirect /folder to / using .htaccess but all I am getting is the Apache HTTP Server Test Page. My root directory looks like this: / .htaccess -/folder -/folder2 -/folder3 My .htaccess looks like this: RewriteEngine On RewriteCond %{REQUEST_URI} !^/folder/ RewriteRule (.*) /folder/$1 What am I doing wrong? I checked my httpd.conf (I'm running CentOS) and the mod_rewrite library is being loaded. As a side note, my server is not a www server, it's simply a virtual machine so its hostname is centosvm.
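
    Read literally, the posted rule does the opposite of the stated goal: it internally rewrites every request into /folder/, and the Apache test page usually means the rewritten target has no index file (on CentOS the stock welcome.conf serves that page). If the intent really is to send /folder URLs to the site root, a minimal sketch, assuming mod_rewrite and that an external redirect is acceptable:

      RewriteEngine On
      # /folder and anything beneath it -> the same path at the root
      RewriteRule ^folder(?:/(.*))?$ /$1 [R=301,L]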

    Read the article

  • performance wise htaccess

    - by purpler
    Here's my .htaccess template; I wonder if anything could be added to increase website performance. # Defaults AddDefaultCharset UTF-8 DefaultLanguage en-US ServerSignature Off FileETag None Header unset ETag Options -MultiViews #Options All -Indexes # Force the latest IE version or ChromeFrame <IfModule mod_setenvif.c> <IfModule mod_headers.c> BrowserMatch MSIE ie Header set X-UA-Compatible "IE=Edge,chrome=1" env=ie </IfModule> </IfModule> # Proxy X-UA Setup <IfModule mod_headers.c> Header append Vary User-Agent </IfModule> #Rewrites Options +FollowSymlinks RewriteEngine On RewriteBase / # Redirect to non-WWW RewriteCond %{HTTPS} !=on RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC] RewriteRule ^(.*)$ http://%1/$1 [R=301,L] # Redirect to WWW RewriteCond %{HTTP_HOST} ^domain.com RewriteRule (.*) http://www.domain.com/$1 [R=301,L] # Redirect index to root RewriteRule ^(.*)index\.(php|html)$ /$1 [R=301,L] # Caching ExpiresActive On ExpiresDefault A0 Header set Cache-Control "public" # 1 Year Long Cache <FilesMatch "\.(flv|fla|ico|pdf|avi|mov|ppt|doc|mp3|wmv|wav|png|jpg|jpeg|gif|swf|js|css|ttf|eot|woff|svg|svgz)$"> ExpiresDefault A31622400 </FilesMatch> # Proxy Caching <FilesMatch "\.(css|js|png)$"> ExpiresDefault A31622400 Header set Cache-Control "private" </FilesMatch> # Protect against DOS attacks by limiting file upload size LimitRequestBody 10240000 # Proper SVG serving AddType image/svg+xml svg svgz AddEncoding gzip svgz # GZip Compression <IfModule mod_deflate.c> <FilesMatch "\.(php|html|css|js|xml|txt|ttf|otf|eot|svg)$" > SetOutputFilter DEFLATE </FilesMatch> </IfModule> # Error page ErrorDocument 404 /404.html # Deny access to sensitive files <FilesMatch "\.(htaccess|ini|log|psd)$"> Order Allow,Deny Deny from all </FilesMatch>
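
    Two notes on the template, offered as suggestions rather than fixes: the "Redirect to non-WWW" and "Redirect to WWW" blocks pull in opposite directions for the same site, so only one canonical form should survive; and compression keyed on the response's MIME type is a common alternative to the extension-based FilesMatch, since it follows the content type rather than the file name. A sketch of the latter, assuming mod_deflate is loaded:

      <IfModule mod_deflate.c>
          # Compress by content type rather than by file extension
          AddOutputFilterByType DEFLATE text/html text/plain text/css text/xml
          AddOutputFilterByType DEFLATE application/javascript application/x-javascript
      </IfModule>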

    Read the article

  • .htaccess password not working for all files

    - by hapalibashi
    My .htaccess on Rackspace looks like this: AuthType Basic AuthName "Restricted Area" AuthUserFile /path-to-htdocs/.htpasswd Require valid-user Now I would expect this to password protect the whole directory; however, it only protects files with a .php extension! What is wrong with it? Is it something in the default httpd.conf that I cannot override? The path to .htpasswd is correct, as it accepts the user/password in the case of .php files.

    Read the article

  • .htaccess Permission denied. Unable to check htaccess file

    - by Josh
    I have a strange problem when adding a sub-domain to our virtual server. I have done similar sub-domains before and they have worked fine. When I try to access the sub-domain I get a 403 Forbidden error. I checked the error logs and have the following error: pcfg_openfile: unable to check htaccess file, ensure it is readable I've searched Google and could only find solutions regarding file and folder permissions, which I have checked, and that hasn't solved the problem. I also saw problems with FrontPage Extensions, but that's not installed on the server. Edit Forgot to say that there isn't a .htaccess file in the directory of the sub-domain. Edit #2 Still not been able to find a solution on this. The only things I have been able to find out are: It doesn't seem to be a problem with any .htaccess files (I've tried creating blank ones, with correct user privileges). It doesn't seem to be a problem with any folder permissions as they are all set correctly. There isn't a problem with the way the sub-domain has been set up, as I've tried pointing the DocumentRoot to another folder and it worked fine. I've also done sub-domains fine before with no problem. Edit #3 Found out more information. I don't think it can be a file permission problem now, because if I access it by going to the server IP and then the directory where the site is hosted it all works fine (minus the stylesheets & images, which is just down to how they are linked)

    Read the article

  • Disable .htaccess or disable some rules from .htaccess on specific URL

    - by petRUShka
    I have Kerberos-based authentication and I want to disable it only on the root URL: mysite.com/. I want it to work fine on any other page like mysite.com/page1. I have the following in my .htaccess: AuthType Kerberos AuthName "Domain login" KrbAuthRealms DOMAIN.COM KrbMethodK5Passwd on Krb5KeyTab /etc/httpd/httpd.keytab require valid-user I want to turn it off only for the root URL. As a workaround it may be possible to turn it off from the virtual host config instead of .htaccess. Unfortunately I don't know how to do it. Part of my vhost.conf: <Directory /home/user/www/current/public/> Options -MultiViews +FollowSymLinks AllowOverride All Order allow,deny Allow from all </Directory> It would be great if you could advise me!
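
    A sketch of the virtual-host workaround the poster is after, assuming Apache 2.2 (matching the Order/Allow syntax already in vhost.conf): a LocationMatch for exactly "/" can satisfy access by host rules instead of by valid-user, leaving every deeper path under the Kerberos directives in .htaccess. Whether mod_auth_kerb honours Satisfy exactly like mod_auth_basic does is worth verifying.

      # In the virtual host, after the existing <Directory> block
      <LocationMatch "^/$">
          # Grant access to the bare root URL without a Kerberos login
          Satisfy Any
          Order allow,deny
          Allow from all
      </LocationMatch>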

    Read the article

  • Disable .htaccess from apache allowoverride none, still reads .htaccess files

    - by John Magnolia
    I have moved all of our .htaccess config into <Directory> blocks and set AllowOverride None in the default and default-ssl. Although after restarting apache it is still reading the .htaccess files. How can I completely turn off reading these files? Update of all files with "AllowOverride" /etc/apache2/mods-available/userdir.conf <IfModule mod_userdir.c> UserDir public_html UserDir disabled root <Directory /home/*/public_html> AllowOverride FileInfo AuthConfig Limit Indexes Options MultiViews Indexes SymLinksIfOwnerMatch IncludesNoExec <Limit GET POST OPTIONS> Order allow,deny Allow from all </Limit> <LimitExcept GET POST OPTIONS> Order deny,allow Deny from all </LimitExcept> </Directory> </IfModule> /etc/apache2/mods-available/alias.conf <IfModule alias_module> # # Aliases: Add here as many aliases as you need (with no limit). The format is # Alias fakename realname # # Note that if you include a trailing / on fakename then the server will # require it to be present in the URL. So "/icons" isn't aliased in this # example, only "/icons/". If the fakename is slash-terminated, then the # realname must also be slash terminated, and if the fakename omits the # trailing slash, the realname must also omit it. # # We include the /icons/ alias for FancyIndexed directory listings. If # you do not use FancyIndexing, you may comment this out. # Alias /icons/ "/usr/share/apache2/icons/" <Directory "/usr/share/apache2/icons"> Options Indexes MultiViews AllowOverride None Order allow,deny Allow from all </Directory> </IfModule> /etc/apache2/httpd.conf # # Directives to allow use of AWStats as a CGI # Alias /awstatsclasses "/usr/share/doc/awstats/examples/wwwroot/classes/" Alias /awstatscss "/usr/share/doc/awstats/examples/wwwroot/css/" Alias /awstatsicons "/usr/share/doc/awstats/examples/wwwroot/icon/" ScriptAlias /awstats/ "/usr/share/doc/awstats/examples/wwwroot/cgi-bin/" # # This is to permit URL access to scripts/files in AWStats directory. # <Directory "/usr/share/doc/awstats/examples/wwwroot"> Options None AllowOverride None Order allow,deny Allow from all </Directory> Alias /awstats-icon/ /usr/share/awstats/icon/ <Directory /usr/share/awstats/icon> Options None AllowOverride None Order allow,deny Allow from all </Directory> /etc/apache2/sites-available/default-ssl <IfModule mod_ssl.c> <VirtualHost _default_:443> ServerAdmin webmaster@localhost DocumentRoot /var/www <Directory /> Options FollowSymLinks AllowOverride None </Directory> <Directory /var/www/> Options Indexes FollowSymLinks MultiViews AllowOverride None </Directory> ScriptAlias /cgi-bin/ /usr/lib/cgi-bin/ <Directory "/usr/lib/cgi-bin"> AllowOverride None Options +ExecCGI -MultiViews +SymLinksIfOwnerMatch Order allow,deny Allow from all </Directory> ErrorLog ${APACHE_LOG_DIR}/error.log # Possible values include: debug, info, notice, warn, error, crit, # alert, emerg. LogLevel warn CustomLog ${APACHE_LOG_DIR}/ssl_access.log combined # SSL Engine Switch: # Enable/Disable SSL for this virtual host. SSLEngine on # A self-signed (snakeoil) certificate can be created by installing # the ssl-cert package. See # /usr/share/doc/apache2.2-common/README.Debian.gz for more info. # If both key and certificate are stored in the same file, only the # SSLCertificateFile directive is needed. 
SSLCertificateFile /etc/ssl/certs/ssl-cert-snakeoil.pem SSLCertificateKeyFile /etc/ssl/private/ssl-cert-snakeoil.key # Server Certificate Chain: # Point SSLCertificateChainFile at a file containing the # concatenation of PEM encoded CA certificates which form the # certificate chain for the server certificate. Alternatively # the referenced file can be the same as SSLCertificateFile # when the CA certificates are directly appended to the server # certificate for convinience. #SSLCertificateChainFile /etc/apache2/ssl.crt/server-ca.crt # Certificate Authority (CA): # Set the CA certificate verification path where to find CA # certificates for client authentication or alternatively one # huge file containing all of them (file must be PEM encoded) # Note: Inside SSLCACertificatePath you need hash symlinks # to point to the certificate files. Use the provided # Makefile to update the hash symlinks after changes. #SSLCACertificatePath /etc/ssl/certs/ #SSLCACertificateFile /etc/apache2/ssl.crt/ca-bundle.crt # Certificate Revocation Lists (CRL): # Set the CA revocation path where to find CA CRLs for client # authentication or alternatively one huge file containing all # of them (file must be PEM encoded) # Note: Inside SSLCARevocationPath you need hash symlinks # to point to the certificate files. Use the provided # Makefile to update the hash symlinks after changes. #SSLCARevocationPath /etc/apache2/ssl.crl/ #SSLCARevocationFile /etc/apache2/ssl.crl/ca-bundle.crl # Client Authentication (Type): # Client certificate verification type and depth. Types are # none, optional, require and optional_no_ca. Depth is a # number which specifies how deeply to verify the certificate # issuer chain before deciding the certificate is not valid. #SSLVerifyClient require #SSLVerifyDepth 10 # Access Control: # With SSLRequire you can do per-directory access control based # on arbitrary complex boolean expressions containing server # variable checks and other lookup directives. The syntax is a # mixture between C and Perl. See the mod_ssl documentation # for more details. #<Location /> #SSLRequire ( %{SSL_CIPHER} !~ m/^(EXP|NULL)/ \ # and %{SSL_CLIENT_S_DN_O} eq "Snake Oil, Ltd." \ # and %{SSL_CLIENT_S_DN_OU} in {"Staff", "CA", "Dev"} \ # and %{TIME_WDAY} >= 1 and %{TIME_WDAY} <= 5 \ # and %{TIME_HOUR} >= 8 and %{TIME_HOUR} <= 20 ) \ # or %{REMOTE_ADDR} =~ m/^192\.76\.162\.[0-9]+$/ #</Location> # SSL Engine Options: # Set various options for the SSL engine. # o FakeBasicAuth: # Translate the client X.509 into a Basic Authorisation. This means that # the standard Auth/DBMAuth methods can be used for access control. The # user name is the `one line' version of the client's X.509 certificate. # Note that no password is obtained from the user. Every entry in the user # file needs this password: `xxj31ZMTZzkVA'. # o ExportCertData: # This exports two additional environment variables: SSL_CLIENT_CERT and # SSL_SERVER_CERT. These contain the PEM-encoded certificates of the # server (always existing) and the client (only existing when client # authentication is used). This can be used to import the certificates # into CGI scripts. # o StdEnvVars: # This exports the standard SSL/TLS related `SSL_*' environment variables. # Per default this exportation is switched off for performance reasons, # because the extraction step is an expensive operation and is usually # useless for serving static content. So one usually enables the # exportation for CGI and SSI requests only. 
# o StrictRequire: # This denies access when "SSLRequireSSL" or "SSLRequire" applied even # under a "Satisfy any" situation, i.e. when it applies access is denied # and no other module can change it. # o OptRenegotiate: # This enables optimized SSL connection renegotiation handling when SSL # directives are used in per-directory context. #SSLOptions +FakeBasicAuth +ExportCertData +StrictRequire <FilesMatch "\.(cgi|shtml|phtml|php)$"> SSLOptions +StdEnvVars </FilesMatch> <Directory /usr/lib/cgi-bin> SSLOptions +StdEnvVars </Directory> # SSL Protocol Adjustments: # The safe and default but still SSL/TLS standard compliant shutdown # approach is that mod_ssl sends the close notify alert but doesn't wait for # the close notify alert from client. When you need a different shutdown # approach you can use one of the following variables: # o ssl-unclean-shutdown: # This forces an unclean shutdown when the connection is closed, i.e. no # SSL close notify alert is send or allowed to received. This violates # the SSL/TLS standard but is needed for some brain-dead browsers. Use # this when you receive I/O errors because of the standard approach where # mod_ssl sends the close notify alert. # o ssl-accurate-shutdown: # This forces an accurate shutdown when the connection is closed, i.e. a # SSL close notify alert is send and mod_ssl waits for the close notify # alert of the client. This is 100% SSL/TLS standard compliant, but in # practice often causes hanging connections with brain-dead browsers. Use # this only for browsers where you know that their SSL implementation # works correctly. # Notice: Most problems of broken clients are also related to the HTTP # keep-alive facility, so you usually additionally want to disable # keep-alive for those clients, too. Use variable "nokeepalive" for this. # Similarly, one has to force some clients to use HTTP/1.0 to workaround # their broken HTTP/1.1 implementation. Use variables "downgrade-1.0" and # "force-response-1.0" for this. BrowserMatch "MSIE [2-6]" \ nokeepalive ssl-unclean-shutdown \ downgrade-1.0 force-response-1.0 # MSIE 7 and newer should be able to use keepalive BrowserMatch "MSIE [17-9]" ssl-unclean-shutdown </VirtualHost> </IfModule> /etc/apache2/sites-available/default <VirtualHost *:80> ServerAdmin webmaster@localhost DocumentRoot /var/www <Directory /> Options FollowSymLinks AllowOverride None </Directory> <Directory /var/www/> Options -Indexes FollowSymLinks MultiViews AllowOverride None Order allow,deny allow from all </Directory> ScriptAlias /cgi-bin/ /usr/lib/cgi-bin/ <Directory "/usr/lib/cgi-bin"> AllowOverride None Options +ExecCGI -MultiViews +SymLinksIfOwnerMatch Order allow,deny Allow from all </Directory> Alias /delboy /usr/share/phpmyadmin <Directory /usr/share/phpmyadmin> # Restrict phpmyadmin access Order Deny,Allow Allow from all </Directory> ErrorLog ${APACHE_LOG_DIR}/error.log # Possible values include: debug, info, notice, warn, error, crit, # alert, emerg. LogLevel warn CustomLog ${APACHE_LOG_DIR}/access.log combined Alias /doc/ "/usr/share/doc/" <Directory "/usr/share/doc/"> Options Indexes MultiViews FollowSymLinks AllowOverride None Order deny,allow Deny from all Allow from 127.0.0.0/255.0.0.0 ::1/128 </Directory> </VirtualHost> /etc/apache2/conf.d/security # # Disable access to the entire file system except for the directories that # are explicitly allowed later. # # This currently breaks the configurations that come with some web application # Debian packages. 
# #<Directory /> # AllowOverride None # Order Deny,Allow # Deny from all #</Directory> # Changing the following options will not really affect the security of the # server, but might make attacks slightly more difficult in some cases. # # ServerTokens # This directive configures what you return as the Server HTTP response # Header. The default is 'Full' which sends information about the OS-Type # and compiled in modules. # Set to one of: Full | OS | Minimal | Minor | Major | Prod # where Full conveys the most information, and Prod the least. # #ServerTokens Minimal ServerTokens OS #ServerTokens Full # # Optionally add a line containing the server version and virtual host # name to server-generated pages (internal error documents, FTP directory # listings, mod_status and mod_info output etc., but not CGI generated # documents or custom error documents). # Set to "EMail" to also include a mailto: link to the ServerAdmin. # Set to one of: On | Off | EMail # #ServerSignature Off ServerSignature On # # Allow TRACE method # # Set to "extended" to also reflect the request body (only for testing and # diagnostic purposes). # # Set to one of: On | Off | extended # TraceEnable Off #TraceEnable On /etc/apache2/apache2.conf # # Based upon the NCSA server configuration files originally by Rob McCool. # # This is the main Apache server configuration file. It contains the # configuration directives that give the server its instructions. # See http://httpd.apache.org/docs/2.2/ for detailed information about # the directives. # # Do NOT simply read the instructions in here without understanding # what they do. They're here only as hints or reminders. If you are unsure # consult the online docs. You have been warned. # # The configuration directives are grouped into three basic sections: # 1. Directives that control the operation of the Apache server process as a # whole (the 'global environment'). # 2. Directives that define the parameters of the 'main' or 'default' server, # which responds to requests that aren't handled by a virtual host. # These directives also provide default values for the settings # of all virtual hosts. # 3. Settings for virtual hosts, which allow Web requests to be sent to # different IP addresses or hostnames and have them handled by the # same Apache server process. # # Configuration and logfile names: If the filenames you specify for many # of the server's control files begin with "/" (or "drive:/" for Win32), the # server will use that explicit path. If the filenames do *not* begin # with "/", the value of ServerRoot is prepended -- so "foo.log" # with ServerRoot set to "/etc/apache2" will be interpreted by the # server as "/etc/apache2/foo.log". # ### Section 1: Global Environment # # The directives in this section affect the overall operation of Apache, # such as the number of concurrent requests it can handle or where it # can find its configuration files. # # # ServerRoot: The top of the directory tree under which the server's # configuration, error, and log files are kept. # # NOTE! If you intend to place this on an NFS (or otherwise network) # mounted filesystem then please read the LockFile documentation (available # at <URL:http://httpd.apache.org/docs/2.2/mod/mpm_common.html#lockfile>); # you will save yourself a lot of trouble. # # Do NOT add a slash at the end of the directory path. # #ServerRoot "/etc/apache2" # # The accept serialization lock file MUST BE STORED ON A LOCAL DISK. 
# LockFile ${APACHE_LOCK_DIR}/accept.lock # # PidFile: The file in which the server should record its process # identification number when it starts. # This needs to be set in /etc/apache2/envvars # PidFile ${APACHE_PID_FILE} # # Timeout: The number of seconds before receives and sends time out. # Timeout 300 # # KeepAlive: Whether or not to allow persistent connections (more than # one request per connection). Set to "Off" to deactivate. # KeepAlive On # # MaxKeepAliveRequests: The maximum number of requests to allow # during a persistent connection. Set to 0 to allow an unlimited amount. # We recommend you leave this number high, for maximum performance. # MaxKeepAliveRequests 100 # # KeepAliveTimeout: Number of seconds to wait for the next request from the # same client on the same connection. # KeepAliveTimeout 4 ## ## Server-Pool Size Regulation (MPM specific) ## # prefork MPM # StartServers: number of server processes to start # MinSpareServers: minimum number of server processes which are kept spare # MaxSpareServers: maximum number of server processes which are kept spare # MaxClients: maximum number of server processes allowed to start # MaxRequestsPerChild: maximum number of requests a server process serves <IfModule mpm_prefork_module> StartServers 5 MinSpareServers 5 MaxSpareServers 10 MaxClients 150 MaxRequestsPerChild 500 </IfModule> # worker MPM # StartServers: initial number of server processes to start # MaxClients: maximum number of simultaneous client connections # MinSpareThreads: minimum number of worker threads which are kept spare # MaxSpareThreads: maximum number of worker threads which are kept spare # ThreadLimit: ThreadsPerChild can be changed to this maximum value during a # graceful restart. ThreadLimit can only be changed by stopping # and starting Apache. # ThreadsPerChild: constant number of worker threads in each server process # MaxRequestsPerChild: maximum number of requests a server process serves <IfModule mpm_worker_module> StartServers 2 MinSpareThreads 25 MaxSpareThreads 75 ThreadLimit 64 ThreadsPerChild 25 MaxClients 150 MaxRequestsPerChild 0 </IfModule> # event MPM # StartServers: initial number of server processes to start # MaxClients: maximum number of simultaneous client connections # MinSpareThreads: minimum number of worker threads which are kept spare # MaxSpareThreads: maximum number of worker threads which are kept spare # ThreadsPerChild: constant number of worker threads in each server process # MaxRequestsPerChild: maximum number of requests a server process serves <IfModule mpm_event_module> StartServers 2 MaxClients 150 MinSpareThreads 25 MaxSpareThreads 75 ThreadLimit 64 ThreadsPerChild 25 MaxRequestsPerChild 0 </IfModule> # These need to be set in /etc/apache2/envvars User ${APACHE_RUN_USER} Group ${APACHE_RUN_GROUP} # # AccessFileName: The name of the file to look for in each directory # for additional configuration directives. See also the AllowOverride # directive. # AccessFileName .htaccess # # The following lines prevent .htaccess and .htpasswd files from being # viewed by Web clients. # <Files ~ "^\.ht"> Order allow,deny Deny from all Satisfy all </Files> # # DefaultType is the default MIME type the server will use for a document # if it cannot otherwise determine one, such as from filename extensions. # If your server contains mostly text or HTML documents, "text/plain" is # a good value. 
If most of your content is binary, such as applications # or images, you may want to use "application/octet-stream" instead to # keep browsers from trying to display binary files as though they are # text. # DefaultType text/plain # # HostnameLookups: Log the names of clients or just their IP addresses # e.g., www.apache.org (on) or 204.62.129.132 (off). # The default is off because it'd be overall better for the net if people # had to knowingly turn this feature on, since enabling it means that # each client request will result in AT LEAST one lookup request to the # nameserver. # HostnameLookups Off # ErrorLog: The location of the error log file. # If you do not specify an ErrorLog directive within a <VirtualHost> # container, error messages relating to that virtual host will be # logged here. If you *do* define an error logfile for a <VirtualHost> # container, that host's errors will be logged there and not here. # ErrorLog ${APACHE_LOG_DIR}/error.log # # LogLevel: Control the number of messages logged to the error_log. # Possible values include: debug, info, notice, warn, error, crit, # alert, emerg. # LogLevel warn # Include module configuration: Include mods-enabled/*.load Include mods-enabled/*.conf # Include all the user configurations: Include httpd.conf # Include ports listing Include ports.conf # # The following directives define some format nicknames for use with # a CustomLog directive (see below). # If you are behind a reverse proxy, you might want to change %h into %{X-Forwarded-For}i # LogFormat "%v:%p %h %l %u %t \"%r\" %>s %O \"%{Referer}i\" \"%{User-Agent}i\"" vhost_combined LogFormat "%h %l %u %t \"%r\" %>s %O \"%{Referer}i\" \"%{User-Agent}i\"" combined LogFormat "%h %l %u %t \"%r\" %>s %O" common LogFormat "%{Referer}i -> %U" referer LogFormat "%{User-agent}i" agent # Include of directories ignores editors' and dpkg's backup files, # see README.Debian for details. # Include generic snippets of statements Include conf.d/ # Include the virtual host configurations: Include sites-enabled/
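
    For reference, the usual way to make the override global rather than per-vhost is a server-wide Directory block; in the configuration above that block exists only commented out in conf.d/security, and AccessFileName is still .htaccess in apache2.conf. A sketch, with the caveat that on Debian/Ubuntu the files actually served come from sites-enabled, so an edited copy in sites-available only counts if the enabled symlink points at it:

      # Server-wide default: ignore .htaccess everywhere unless re-enabled per directory
      <Directory />
          Options FollowSymLinks
          AllowOverride None
      </Directory>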

    Read the article

  • Can .htaccess slow down a site?

    - by Cody Sharp
    I'm working with a client on an e-commerce website. I implemented clean URLs using .htaccess. I also used .htaccess to solve canonical issues such as redirecting www to non-www and removing index.php from the URL. The website recently began to slow down dramatically, sometimes not even loading. The site is hosted on GoDaddy, and when the client called GoDaddy they told him it was the .htaccess file slowing down the website. I find this highly unlikely because of my past experiences, but I'm not 100% sure. My thinking is that the client's website is most likely on a shared server with a busy neighborhood, thus slowing down the site. It's not always slow, but rather sporadic throughout the day, loading fast at some points and slow at other points in time. Can the .htaccess file slow down a website to a crawl? If so, are there better ways to solve these problems with different rewrite rules and such? Here is what the actual .htaccess file looks like: Options +FollowSymlinks RewriteEngine On RewriteBase / RewriteCond %{HTTP_HOST} ^www.example.net [NC] RewriteRule ^(.*)$ http://example.net/$1 [L,R=301] RewriteRule ^products/([0-9a-zA-Z\_\-]*)\.htm([l]?)$ index.php p=product&product_code=$1 [L] RewriteRule ^catalog/([0-9a-zA-Z\_\-]*)\.htm([l]?)$ index.php p=catalog&catalog_code=$1 [L] RewriteRule ^pages/([0-9a-zA-Z\_\-]*)\.htm([l]?)$ index.php?p=page&page_id=$1 [L] RewriteRule ^index\.htm([l]?)$ index.php?p=home [L] RewriteRule ^site_map\.htm([l]?)$ index.php?p=site_map [L] RewriteCond %{QUERY_STRING} ^p=home$ RewriteRule (.*) ? [R=permanent] I'm a .htaccess and regex novice, so any pointed out mistakes would also help. Thank you.
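
    For what it is worth, .htaccess files do add overhead (Apache checks every directory level of every request for one), but a file this small is rarely what takes a site from fast to crawling; sporadic slowness on shared hosting usually points elsewhere. Where vhost access exists, the same rules can be moved into the server configuration so the per-request lookups stop entirely. A sketch, with the path and domain as placeholders:

      <Directory "/var/www/example">
          # Stops Apache searching for .htaccess on every request
          AllowOverride None
          Options +FollowSymlinks
          RewriteEngine On
          RewriteCond %{HTTP_HOST} ^www\.example\.net [NC]
          RewriteRule ^(.*)$ http://example.net/$1 [L,R=301]
      </Directory>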

    Read the article

  • .htaccess - lose the file .html extension

    - by Darren Sweeney
    I'm having a bad .htaccess day! I want a user to be able to type the URL mysite.com/about instead of mysite.com/about.html. In the .htaccess file I have: RewriteEngine On RewriteCond %{SCRIPT_FILENAME} !-f RewriteCond %{SCRIPT_FILENAME} !-d RewriteRule ^/(.*)$ /$1.html [NC,L] But this simply does not work. I will add though that if I try this further inside the site, e.g. mysite.com/pages/contact, it works perfectly whether I have the above code in the .htaccess or not. What am I doing wrong?
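
    The likely culprit is the leading slash in the pattern: in per-directory (.htaccess) context the directory prefix is stripped before matching, so ^/(.*)$ never matches. A sketch without the slash, keeping file/directory guards and only rewriting when the .html counterpart actually exists (REQUEST_FILENAME is swapped in for SCRIPT_FILENAME, an assumption about the intent):

      RewriteEngine On
      RewriteCond %{REQUEST_FILENAME} !-f
      RewriteCond %{REQUEST_FILENAME} !-d
      RewriteCond %{REQUEST_FILENAME}.html -f
      # /about -> /about.html, internally
      RewriteRule ^(.*)$ $1.html [NC,L]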

    Read the article

  • Extend depth of .htaccess to all subfolders and their children

    - by JoXll
    I need to be able to use .htaccess in all subfolders, to full depth. E.g. I have .htaccess in the public_html folder: \public_html\.htaccess How do I make it work for the small folder as well? \public_html\home\images\red\thumbs\small\ It only enforces up to the home directory, not deeper. ErrorDocument 403 http://google.com Order Deny,Allow Deny from all Allow from 11.22.33.44 Options +FollowSymlinks RewriteEngine On RewriteCond %{HTTPS} off RewriteCond %{HTTP_HOST} ^www\.(.*)$ [NC] RewriteRule ^(.*)$ http://%1/$1 [R=301,L]
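
    Directives in public_html/.htaccess normally apply to every subdirectory on their own; the usual reason they appear to stop is a deeper .htaccess overriding them, and mod_rewrite in particular replaces rather than merges a parent's rules. A hedged sketch for a child directory that has its own rewrite rules (the path below is just the example from the question):

      # In \public_html\home\images\red\thumbs\small\.htaccess, if such a file exists
      RewriteEngine On
      # Append the parent directory's rewrite rules after this file's own
      RewriteOptions Inherit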

    Read the article

  • htaccess execution order and priority

    - by ChrisRamakers
    Can anyone explain to me in what order Apache executes .htaccess files residing in different levels of the same path, and how the rewrite rules therein are prioritized? For example, why doesn't the rewrite rule in the first .htaccess below work, and why is the one in /blog prioritized? .htaccess in / RewriteEngine on RewriteBase / RewriteRule ^blog offline.html [L] .htaccess in /blog RewriteEngine On RewriteBase /blog/ RewriteCond %{REQUEST_FILENAME} !-f RewriteCond %{REQUEST_FILENAME} !-d RewriteRule . /blog/index.php [L] PS: I'm not simply looking for an answer but for a way to understand the Apache/mod_rewrite internals ... the why is more important to me than the how :) Thanks!
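
    The short version of the internals: .htaccess files are merged from the top of the path downward, with deeper files overriding shallower ones, and mod_rewrite's per-directory configuration is not additive, so once /blog/.htaccess declares its own rules the ones in / are simply never consulted for /blog requests. A sketch of opting back in, assuming the parent's rules are actually wanted there (with Inherit, the parent's rules run after the child's own):

      # In /blog/.htaccess
      RewriteEngine On
      RewriteOptions Inherit
      RewriteBase /blog/
      RewriteCond %{REQUEST_FILENAME} !-f
      RewriteCond %{REQUEST_FILENAME} !-d
      RewriteRule . /blog/index.php [L]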

    Read the article

  • Need to sanity-check my .htaccess, especially Limit GET POST line for Google repellent

    - by jose
    I need a sanity check on this .htaccess (from a WordPress site) I inherited from a 5 month+ old site. What's the symptom? Google + Bing crawl, but don't index any of the pages. Let me be clear: I'm not mad about "not ranking high." I think something is (accidentally) rejecting search engine indexing. I am not an expert on .htaccess, but one part especially looked funny, the Limit GET POST line. Is it not weird to have both Allow and Deny all, with no parameters? Also, I've ruled out robots.txt, but if I were you I'd want to see it, so here it is: User-agent: * Crawl-delay: 30 And here's the more suspect .htaccess: # temp redirect wordpress content feeds to feedburner <IfModule mod_rewrite.c> RewriteEngine on RewriteCond %{HTTP_USER_AGENT} !FeedBurner [NC] RewriteCond %{HTTP_USER_AGENT} !FeedValidator [NC] RewriteRule ^feed/?([_0-9a-z-]+)?/?$ http://feeds.feedburner.com/anonymousblog [R=302,NC,L] </IfModule> # temp redirect wordpress comment feeds to feedburner <IfModule mod_rewrite.c> RewriteEngine on RewriteCond %{HTTP_USER_AGENT} !FeedBurner [NC] RewriteCond %{HTTP_USER_AGENT} !FeedValidator [NC] RewriteRule ^comments/feed/?([_0-9a-z-]+)?/?$ http://feeds.feedburner.com/anonymous_comments [R=302,NC,L] </IfModule> <IfModule mod_rewrite.c> RewriteEngine On RewriteBase / RewriteCond %{REQUEST_FILENAME} !-f RewriteCond %{REQUEST_FILENAME} !-d RewriteRule . /index.php [L] </IfModule> IndexIgnore .htaccess */.??* *~ *# */HEADER* */README* */_vti* <Limit GET POST> order deny,allow deny from all allow from all </Limit> <Limit PUT DELETE> order deny,allow deny from all </Limit> php_value memory_limit 32M Adding header by request: <meta http-equiv="Content-Type" content="text/html; charset=UTF-8" /> <meta name="robots" content="noindex,nofollow" /> <meta name="description" content="buncha junk i've deleted." /> <meta name="keywords" content="keywords i've deleted" /> <meta name="viewport" content="width=device-width" />
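
    On the suspicious block specifically: with order deny,allow the Allow directives are evaluated last, so deny from all followed by allow from all ends up allowing everyone, and crawlers are not being blocked there. The robots meta tag quoted above (noindex,nofollow) is a far more likely reason nothing is being indexed. If the block is only meant to permit normal reads and posts while blocking writes, a tidier hedged equivalent:

      <Limit GET POST>
          Order allow,deny
          Allow from all
      </Limit>
      <Limit PUT DELETE>
          Order deny,allow
          Deny from all
      </Limit>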

    Read the article

  • Disallow all user agents except one using .htaccess?

    - by Kian Mayne
    I've been struggling to get this .htaccess working. The aim is to disallow all user agents besides my app. The app sends a GET request with a user agent of, let's say, 'AcmeUpdater'. Whenever I try to navigate to any file in the folder, I get a 500 - Internal Server Error. Here are the rules I'm using: <IfModule mod_rewrite.c> Options +FollowSymLinks RewriteEngine on RewriteBase / RewriteCond %{HTTP_USER_AGENT} !^KMUpdaterClient* RewriteRule .* - [F,L] </IfModule> I have updated the .htaccess file as suggested in the answer by Nick, and restarted Apache. After trying a couple of different things, it seems that just the presence of a .htaccess is causing the 500 error. I'm getting nothing in the error logs. The .htaccess file at the document root looks like the following: <IfModule mod_rewrite.c> Options +FollowSymLinks ErrorDocument 404 /index.php?error=404 RewriteEngine On RewriteBase / RewriteRule ^index\.php$ - [L] RewriteCond %{REQUEST_FILENAME} !-f RewriteCond %{REQUEST_FILENAME} !-d </IfModule> So I realised that the error logs were in chronological order rather than the reverse chronological order I expected (Oops!). The error I'm getting is: </IfModule> without matching <IfModule> section. I removed the </IfModule> and still I get that error. Ideas?
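
    The "</IfModule> without matching <IfModule> section" message comes from whichever .htaccess in the merged chain has the unbalanced pair, not necessarily the one being edited, so it is worth checking every .htaccess from the document root down to the protected folder. Once that is cleared, the per-folder restriction itself can be as small as the sketch below, assuming the real client really does send a User-Agent beginning with KMUpdaterClient (the trailing * in the posted condition is harmless but does nothing useful):

      RewriteEngine On
      # Forbid any request whose User-Agent does not start with KMUpdaterClient
      RewriteCond %{HTTP_USER_AGENT} !^KMUpdaterClient [NC]
      RewriteRule .* - [F,L]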

    Read the article

  • Showing content from pages at different URL's (masking), possibly with .htaccess

    - by zigojacko
    If I have URLs like:- domain.com/category/widgets/filter/blue domain.com/category/widgets/filter/red And it is pretty difficult to reconstruct them to something like:- domain.com/category/blue-widgets domain.com/category/red-widgets Is there any way at all that I can use URL rewrites or anything else with .htaccess or on the server to display the URL as domain.com/category/blue-widgets on the domain.com/category/widgets/filter/blue page? I've looked into masking URLs but got nowhere, and this has been something bugging me for almost 6 months now. Is there any way to achieve what I want to do? FYI: This is a Magento website, and I am wanting to implement the above for potentially hundreds of URLs. Edit To respond to @kkugelmann's answer:- I couldn't get your proposed RewriteRule to make a difference at all in the .htaccess file so I started testing a few things in this .htaccess tester:- The proposed RewriteRule didn't work in this tester:- However, the following did:- But adding any of these RewriteRules into the website's .htaccess file did not rewrite the URL at all... Edit2 By the way, if I add [R=301,L] to the end of the rewrite rule, it does actually then rewrite the URL, but of course it 301 redirects it as well, which is unwanted behaviour. Edit3 I found another question with the same issue... And an accepted answer that solved the problem, which seemed to be something to do with using mod_proxy and the [P] tag on the rule (if I try this, the page 404's).
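
    If the aim is for the pretty URL to serve the filter page's content rather than redirect to it, the rewrite has to stay internal, and the pretty form has to be what the site links to in the first place. A hedged sketch of the internal mapping (the colour list and paths are placeholders); on Magento, requests of this shape may be swallowed by the standard rewrite to index.php before a rule like this ever sees them, which would explain rules appearing to do nothing without [R]:

      RewriteEngine On
      # Serve /category/widgets/filter/blue when /category/blue-widgets is requested
      RewriteRule ^category/(blue|red)-widgets/?$ /category/widgets/filter/$1 [L]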

    Read the article

  • URL hex characters in .htaccess

    - by Steve
    There is an old page with a space in the filename, and this is no longer found on the website. So I need to redirect this page to another page using a 301 redirect in .htaccess. If I place the filename directly into .htaccess (Bouquets%20%26%20Loose.html), the redirect does not work. If I escape the % sign like this (Bouquets\%20\%26\%20Loose.html), the redirect still does not work. How do I get this redirect to work in .htaccess? Thanks.
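
    mod_rewrite and mod_alias match against the already-decoded path, so the pattern needs the literal space and ampersand rather than %20 and %26. A sketch, assuming the .htaccess sits at the site root and /new-page.html stands in for the real destination:

      RewriteEngine On
      # "Bouquets & Loose.html" with the space and & written literally
      # (the backslashes keep the spaces inside a single pattern argument)
      RewriteRule ^Bouquets\ &\ Loose\.html$ /new-page.html [R=301,L]
      # mod_alias alternative: quote the path so the spaces survive
      # Redirect 301 "/Bouquets & Loose.html" http://www.example.com/new-page.html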

    Read the article

  • getting 500 intenal error when setting 301 redirect using .htaccess

    - by sam
    I'm trying to use a 301 redirect to direct users and bots to my new site, but when I put the .htaccess live I keep getting a 500 internal error shown. The site is actually a subdomain which I want to redirect to another subdomain on another site (I'm not sure if that's relevant but I thought I should include it). The site is hosted on an Apache server. The 301 .htaccess code I'm using is: Options +FollowSymLinks RewriteEngine on RewriteRule (.*) http://www.blog.mysite.co.uk/$1 [R=301,L] Any idea what might be wrong with this?
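
    Two things worth checking, sketched below: Options +FollowSymLinks causes exactly this kind of 500 when the host's AllowOverride does not permit Options in .htaccess (dropping that line is the quick test), and if the same file also ends up on the destination subdomain the rule will redirect in a loop. A hedged version with a host guard, hostnames copied from the question:

      RewriteEngine on
      # Only redirect when not already on the destination host
      RewriteCond %{HTTP_HOST} !^www\.blog\.mysite\.co\.uk$ [NC]
      RewriteRule (.*) http://www.blog.mysite.co.uk/$1 [R=301,L]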

    Read the article

  • I need a little help with .htaccess rewrite

    - by Pinokyo
    I need a little help with my .htaccess file. I have song, singer and album links I want to rewrite. I already rewrote the links and they are like this: the links for the songs are like this: /song/song_name for singers: /singer_name for albums: /album_name From my .htaccess file: RewriteEngine on RewriteRule ^singer/([^/\.]+)/?$ /core/controller.php?singer=$1 [L] RewriteRule ^song/([^/\.]+)/?$ /core/controller.php?song=$1 [L] RewriteRule ^album/([^/\.]+)/?$ /core/controller.php?album=$1 [L] I need the links for the songs, singers and albums to be like this: for songs /singer_name/song_name for singers /singer_name for albums /singer_name/album_name Can anyone help me with this, please?
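
    The catch is that /singer_name/song_name and /singer_name/album_name have exactly the same shape, so .htaccess alone cannot tell a song from an album; either a prefix stays in the URL or the controller has to look the second segment up. A hedged sketch that passes both segments through and lets PHP decide (the title parameter is an assumption, not something from the existing file):

      RewriteEngine on
      # /singer_name -> singer page (skip real files and folders)
      RewriteCond %{REQUEST_FILENAME} !-f
      RewriteCond %{REQUEST_FILENAME} !-d
      RewriteRule ^([^/.]+)/?$ /core/controller.php?singer=$1 [L]
      # /singer_name/anything -> controller decides whether it is a song or an album
      RewriteCond %{REQUEST_FILENAME} !-f
      RewriteCond %{REQUEST_FILENAME} !-d
      RewriteRule ^([^/.]+)/([^/.]+)/?$ /core/controller.php?singer=$1&title=$2 [L]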

    Read the article

  • To Fix HTTP 400-499 error codes with 301 redirects in .htaccess file

    - by user2131844
    Google previously indexed my website's pages (sitemap.xml) with the format below: www.domain.com/2013/04/18/hottest-gadgets-of-2013-to-include-in-your-list www.domain.com/2013/02/09/ringdroid I have resubmitted the sitemap but there are still 404 errors in the Google/Bing engines. Could you please help me write a 301 redirect rule in the .htaccess file so that when someone clicks the URL: www.domain.com/2013/02/09/ringdroid they are redirected to: www.domain.com/ringdroid How can we write a rule in the .htaccess file to remove the date part 2013/02/09/?
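
    A hedged sketch of a generic rule that strips a leading /YYYY/MM/DD/ from any request and 301s to the bare slug, assuming the new URLs really are just the old slug with the date removed:

      RewriteEngine On
      # /2013/02/09/ringdroid -> /ringdroid
      RewriteRule ^\d{4}/\d{2}/\d{2}/(.+)$ /$1 [R=301,L]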

    Read the article

  • Redirecting requests for .html pages in subdirectories to the same page in root with .htaccess

    - by Asherion
    I am porting a site from an old version of a CMS to a newer version which has different page addressing techniques. I'm unfortunately not very good with .htaccess at all. URL/blog/subblog/article.html is now simply URL/article.html. Unfortunately, this will destroy any linking programs they have going, and break all the old links. I need a way to use .htaccess to say: if request = /(any subdirectory)/(string).html then redirect to /(string).html If that makes any sense.
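
    A hedged sketch of exactly that rule: anything ending in .html that still has at least one directory segment in front of it gets 301-redirected to the same filename at the root, whatever the depth of the old blog/subblog prefix:

      RewriteEngine On
      # /blog/subblog/article.html -> /article.html
      RewriteRule ^.+/([^/]+\.html)$ /$1 [R=301,L]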

    Read the article

  • Htaccess 301 redirect dynamic URL

    - by Jarede
    I don't know a whole lot about .htaccess rules, so forgive me and help me ask the correct question. Currently I have a .htaccess rule like: RewriteRule ^surveys/(\S+)/directory/(\d+)/(\d+)/entry/(\d+)/?$ directories/index.cfm?sFuseAction=XXX.YYYY.ZZZZ&nDirectoryID=$2&nEntryID=$4&nCategoryID=$3&sDirectory=$1 [NC,L] which I want to do a 301 redirect to: RewriteRule ^(\S+)/directory/(\d+)/(\d+)/entry/(\d+)/?$ directories/index.cfm?sFuseAction=XXX.YYYY.ZZZZ&nDirectoryID=$2&nEntryID=$4&nCategoryID=$3&sDirectory=$1 [NC,L] I'm unsure of the correct syntax to go about making these redirect correctly.
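
    If the intent is that old /surveys/... links 301 to the same path without the prefix, and the second (prefix-less) rule then maps the new path onto index.cfm, a hedged sketch to place above the existing rules:

      # Old prefixed URLs -> same URL without /surveys, as a permanent redirect
      RewriteRule ^surveys/(\S+/directory/\d+/\d+/entry/\d+/?)$ /$1 [R=301,NC,L]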

    Read the article

  • Clean URLs issue using .htaccess in PHP project

    - by x4ph4r
    I am working on a PHP Laravel project. I am currently facing issues with the .htaccess file. I have the following .htaccess: <IfModule mod_rewrite.c> Options -MultiViews RewriteEngine On RewriteBase / RewriteCond %{REQUEST_FILENAME} !-f RewriteRule ^ index.php [L] </IfModule> When I reload my page it gives me the following error: 404 Not Found The requested URL /contacts was not found on this server. Then I opened the /etc/apache2/users/username.conf file, which had the following lines: <Directory "/Users/username/Sites/"> Options Indexes MultiViews AllowOverride None Order allow,deny Allow from all </Directory> In the above code I changed AllowOverride None to AllowOverride All. Then I reloaded the page and got the following error: 403 Forbidden You don't have permission to access /contacts on this server. When I add FollowSymLinks to the .htaccess Options, like this: Options -MultiViews FollowSymLinks, then sometimes I get this 500 Internal Server Error and sometimes this *Error 324 (net::ERR_EMPTY_RESPONSE): The server closed the connection without sending any data*. Each time I reload my page I get one of these errors with the FollowSymLinks option. I also uncommented the following lines in /etc/apache2/httpd.conf: LoadModule rewrite_module libexec/apache2/mod_rewrite.so LoadModule php5_module libexec/apache2/libphp5.so and still I am getting the same permission denied error. Please help me; I have been trying to solve this problem for the past 3 days but it is still unresolved.
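
    For comparison, the stock Laravel public/.htaccess is very close to what is posted (it checks for directories before files and drops MultiViews), so the interesting part is the Apache side: a 403 after switching to AllowOverride All typically means FollowSymLinks (or SymLinksIfOwnerMatch) is not in effect for the directory, and setting it in the <Directory> block avoids fighting over what .htaccess is allowed to change. A sketch of username.conf under those assumptions, followed by an Apache restart:

      <Directory "/Users/username/Sites/">
          # FollowSymLinks is required for mod_rewrite to run in this directory
          Options Indexes MultiViews FollowSymLinks
          AllowOverride All
          Order allow,deny
          Allow from all
      </Directory>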

    Read the article

  • In Joomla In htaccess REQUEST_URI is always returning index.php instead of SEF URL

    - by Saumil
    I have installed JoomSEF version 3.9.9 with Joomla 1.5.25. Now I want to force HTTPS for one section of my site (e.g. URIs starting with /events/) while keeping all other URLs on HTTP. I am setting rules in the .htaccess file but not getting the output I expect. I am checking REQUEST_URI of the SEF URLs but always getting index.php as the URI. Here is my .htaccess code: ########## Begin - Custom redirects # # If you need to redirect some pages, or set a canonical non-www to # www redirect (or vice versa), place that code here. Ensure those # redirects use the correct RewriteRule syntax and the [R=301,L] flags. # ########## End - Custom redirects # Uncomment following line if your webserver's URL # is not directly related to physical file paths. # Update Your Joomla! Directory (just / for root) # RewriteBase / ########## Begin - Joomla! core SEF Section # RewriteRule .* - [E=HTTP_AUTHORIZATION:%{HTTP:Authorization}] # # If the requested path and file is not /index.php and the request # has not already been internally rewritten to the index.php script RewriteCond %{REQUEST_URI} !^/index\.php # and the request is for root, or for an extensionless URL, or the # requested URL ends with one of the listed extensions RewriteCond %{REQUEST_URI} (/[^.]*|\.(php|html?|feed|pdf|raw))$ [NC] # and the requested path and file doesn't directly match a physical file RewriteCond %{REQUEST_FILENAME} !-f # and the requested path and file doesn't directly match a physical folder RewriteCond %{REQUEST_FILENAME} !-d # internally rewrite the request to the index.php script RewriteRule .* index.php [L] # ########## End - Joomla! core SEF Section # Here is my code e.g site url is http://mydomain.com/events RewriteCond %{REQUEST_URI} ^/(events)$ RewriteCond %{HTTPS} !ON RewriteRule (.*) https://%{REQUEST_HOST}%{REQUEST_URI}/$1 [L,R=301] I don't understand why REQUEST_URI refers to index.php even though the URL in the address bar is http://mydomain.com/events. I am using JoomSEF (a Joomla extension for SEF URLs). If I remove the other rules from the .htaccess file then Joomla stops working. I can't find a way to handle this as I am not an expert. Please let me know if someone has been through the same situation and has a solution, or can suggest a workaround. Thanks
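
    Two observations before a sketch: %{REQUEST_HOST} is not a mod_rewrite variable (%{HTTP_HOST} is), and by the time a rule placed after the Joomla SEF section runs, the request has already been internally rewritten to index.php, which is why REQUEST_URI keeps showing index.php. Placing the HTTPS rule in the "Custom redirects" section at the top, before the SEF block, lets it see the original /events path. A hedged version, with /events taken from the question:

      # In the "Custom redirects" section, before the Joomla SEF rules
      RewriteCond %{HTTPS} !=on
      RewriteRule ^events(/.*)?$ https://%{HTTP_HOST}/events$1 [NC,R=301,L]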

    Read the article
