Search Results

Search found 14878 results on 596 pages for 'mod security'.

Page 72/596

  • Properly force SSL with .htaccess, no double authentication

    - by cwd
    I'm trying to force SSL with .htaccess on a shared host, which means I only have access to .htaccess and not the vhosts config. I know you can put a rule in the VirtualHost config file to force SSL, which is picked up (and acted upon) first and prevents double authentication, but I can't get to that. Here's the progress I've made:

    Config 1. This works pretty well, but it does force double authentication if you visit http://site.com - once for http and then once for https. Once you are logged in, it automatically redirects http://site.com/page1.html to the https counterpart just fine:

        RewriteEngine On
        RewriteCond %{HTTPS} !=on
        RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
        RewriteEngine on
        RewriteCond %{HTTP_HOST} !(^www\.site\.com*)$
        RewriteRule (.*) https://www.site.com$1 [R=301,L]
        AuthName "Locked"
        AuthUserFile "/home/.htpasswd"
        AuthType Basic
        require valid-user

    Config 2. If I add this to the top of the file, it works a lot better in that it switches to SSL before prompting for the password:

        SSLOptions +StrictRequire
        SSLRequireSSL
        SSLRequire %{HTTP_HOST} eq "site.com"
        ErrorDocument 403 https://site.com

    It's clever how it uses the SSLRequireSSL option and the ErrorDocument 403 to redirect to the secure version of the site. My only complaint is that if you try to access http://site.com/page1.html it redirects to https://site.com/ - so it forces SSL without a double login, but it does not properly forward non-SSL resources to their SSL counterparts. Regarding the first config, Insyte mentioned: "using mod_rewrite to perform a simple redirect is a bit of overkill. Use the Redirect directive instead. It's possible this may even fix your problem, as I believe mod_rewrite rules are some of the last directives to be processed, just before the file is actually grabbed from the filesystem." I have had no luck finding a force-SSL option for the Redirect directive, so I have been unable to test this theory.
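    (A reference sketch only, not something that helps on this shared host - it needs the vhost files the question says are out of reach, and the ServerName, paths and certificate directives are placeholders. It illustrates why the suggested Redirect directive cannot do the job from one shared .htaccess: Redirect has no way to test whether the connection is already HTTPS, so it has to live in the port-80 vhost, with the Basic auth confined to the port-443 vhost.)

        <VirtualHost *:80>
            ServerName www.site.com
            # Everything on plain HTTP is sent, path included, to HTTPS
            Redirect permanent / https://www.site.com/
        </VirtualHost>

        <VirtualHost *:443>
            ServerName www.site.com
            SSLEngine on
            # SSLCertificateFile / SSLCertificateKeyFile omitted
            <Directory "/home/site/public_html">
                AuthName "Locked"
                AuthUserFile "/home/.htpasswd"
                AuthType Basic
                Require valid-user
            </Directory>
        </VirtualHost>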

    Read the article

  • URL rewrite from www.domain.com/subdirectory to http://domain.com/subdirectory

    - by chrizzbee
    I need a solution for the following problem: I use a CMS and want the backend to be available only at http://domain.com/backend and not at http://www.domain.com/backend. How do I have to change my .htaccess file to achieve this? I already have a rewrite rule from HTTP (non-www) to www. Here's what I currently have in my .htaccess file:

        ##
        # Uncomment the following lines to add "www." to the domain:
        #
        RewriteCond %{HTTP_HOST} ^shaba-baden\.ch$ [NC]
        RewriteRule (.*) http://www.shaba-baden.ch/$1 [R=301,L]
        #
        # Uncomment the following lines to remove "www." from the domain:
        #
        # RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
        # RewriteRule (.*) http://example.com/$1 [R=301,L]
        #
        # Make sure to replace "example.com" with your domain name.
        ##

    So, the first bit is the redirect from non-www to www, and it works on the domain part of the URL. As explained, I need a rewrite rule that sends the backend login at http://www.shaba-baden.ch/contao to http://shaba-baden.ch/contao
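    (A sketch only, not the accepted answer - it assumes the Contao backend lives entirely under /contao. The idea is to strip "www." for backend URLs before the general rule adds it, and to exclude /contao from that general rule:)

        RewriteEngine On

        # Backend: force the bare domain
        RewriteCond %{HTTP_HOST} ^www\.shaba-baden\.ch$ [NC]
        RewriteRule ^(contao(/.*)?)$ http://shaba-baden.ch/$1 [R=301,L]

        # Everything else: force www, but leave /contao alone
        RewriteCond %{HTTP_HOST} ^shaba-baden\.ch$ [NC]
        RewriteCond %{REQUEST_URI} !^/contao [NC]
        RewriteRule (.*) http://www.shaba-baden.ch/$1 [R=301,L]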

    Read the article

  • Redirecting a CSS file based on .htaccess rules.

    - by Anthony Hiscox
    I'm trying to hack the CSS files on osTicket by replacing them with my own custom ones when a specific URL is accessed. The URL that is accessed in this example is http://osticket.cts/helpdesk/scp/css/main.css and I would like it to use the CSS file at http://osticket.cts/test.css. Why won't this .htaccess file (in the web root, not /helpdesk/scp/) work? Is there an easy way to debug these rules - some way to find out what Apache did when the URL was accessed and where it's failing? error.log doesn't show anything useful.

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^osticket\.cts$ [NC]
        RewriteRule ^(.*)main\.css$ /test.css [NC, L]
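    (Two hedged notes, neither confirmed in the thread: first, the space inside the flag list - "[NC, L]" - is invalid flag syntax and can make Apache reject the rule outright, so it is worth ruling out; second, mod_rewrite's own trace logging (RewriteLog/RewriteLogLevel on Apache 2.2, "LogLevel rewrite:trace3" on 2.4) is the usual debugging tool, but it can only be enabled in the server config, not from .htaccess. The same rule with the flags written the usual way:)

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^osticket\.cts$ [NC]
        RewriteRule ^(.*)main\.css$ /test.css [NC,L]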

    Read the article

  • How do I redirect www and non-www but not the IP

    - by Chad T Parson
    I am trying to redirect www.domain.com or domain.com to www.domain.com/temp.html. I am using the following code:

        RewriteCond %{HTTP_HOST} ^.*$
        RewriteRule ^/?$ "http\:\/\/www\.domain\.com\/temp\.html" [R=301,L]

    That works, but I do not want to redirect the IP. So if someone types in the static IP of the domain, I do not want them to be redirected to www.domain.com/temp.html. Does anyone have the code to take care of this?
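    (A minimal sketch, assuming the goal is simply to match the two hostnames instead of matching anything: anchor the condition to the domain so a request made by IP never satisfies it.)

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?domain\.com$ [NC]
        RewriteRule ^/?$ http://www.domain.com/temp.html [R=301,L]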

    Read the article

  • .htaccess - lose the file .html extension

    - by Darren Sweeney
    I'm having a bad .htaccess day! I want a user to be able to type the URL mysite.com/about instead of mysite.com/about.html. In the .htaccess file I have:

        RewriteEngine On
        RewriteCond %{SCRIPT_FILENAME} !-f
        RewriteCond %{SCRIPT_FILENAME} !-d
        RewriteRule ^/(.*)$ /$1.html [NC,L]

    But this simply does not work. I will add, though, that if I try this further inside the site, e.g. mysite.com/pages/contact, it works perfectly whether I have the above code in the .htaccess or not. What am I doing wrong?
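    (A hedged guess at the cause, not a confirmed answer: in per-directory/.htaccess context mod_rewrite strips the leading slash before matching, so a pattern anchored at ^/ never matches at the root. A sketch without the slash, which also checks that the .html target actually exists:)

        RewriteEngine On
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{REQUEST_FILENAME}.html -f
        RewriteRule ^(.*)$ $1.html [NC,L]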

    Read the article

  • Mod_rewrite and urls that don't end with .php

    - by Kevin Laity
    I'm trying to use mod_rewrite to hide the .php extensions of my pages. However, it refuses to do any rewriting unless the input URL ends with .php, which makes that impossible. I can confirm that rewriting works fine as long as the URL has .php at the end: "RewriteRule a\.php b\.php" works, while "RewriteRule a\.html b\.html" does not. How can I turn off this behavior and allow it to rewrite all URLs? I'm on a shared host, so whatever I do has to be done from a .htaccess file.

    Update: There seems to be some confusion about what I'm asking here. The question is not about how to write the rule; the question is about server configuration. The rule I'm using is fine - I can test that locally. But the server I'm working with is somehow configured so that mod_rewrite doesn't attempt to rewrite anything that doesn't end with .php.
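    (Not from the thread - just a hedged diagnostic sketch. On some shared setups only .php requests ever reach Apache and its .htaccess files, because a static-file frontend answers everything else; no rule syntax can fix that. A visible test rule makes it easy to tell: if the redirect below never fires for /rewrite-test.html, the request probably isn't reaching mod_rewrite at all.)

        RewriteEngine On
        # Request /rewrite-test.html: if you are not redirected, non-.php URLs
        # are likely being served before Apache consults this .htaccess.
        RewriteRule ^rewrite-test\.html$ http://example.com/ [R=302,L]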

    Read the article

  • How to implement a safe password history

    - by Lorenzo
    Passwords shouldn't be stored in plain text for obvious security reasons: you have to store hashes, and you should also generate the hash carefully to avoid rainbow table attacks. However, you usually have the requirement to store the last n passwords and to enforce minimal complexity and a minimal change between the different passwords (to prevent the user from using a sequence like Password_1, Password_2, ..., Password_n). This would be trivial with plain text passwords, but how can you do that by storing only hashes? In other words: how is it possible to implement a safe password history mechanism?
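    (A minimal sketch, not from the thread, using PHP's built-in password API. Because every stored hash carries its own salt, you cannot compare hashes directly; the only check available is to re-verify the candidate password against each of the last n hashes at the moment the user changes their password - which is also the only moment the plaintext is available.)

        <?php
        // Returns true if the candidate matches any of the stored history hashes.
        function violatesHistory(string $candidate, array $previousHashes): bool
        {
            foreach ($previousHashes as $hash) {
                if (password_verify($candidate, $hash)) {
                    return true;
                }
            }
            return false;
        }

        // Hypothetical usage: keep the last n hashes per user, newest first.
        $history = [
            password_hash('Password_2', PASSWORD_DEFAULT),
            password_hash('Password_1', PASSWORD_DEFAULT),
        ];
        var_dump(violatesHistory('Password_1', $history)); // bool(true)
        var_dump(violatesHistory('Password_3', $history)); // bool(false)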

    Read the article

  • Why not expose a primary key

    - by Angelo Neuschitzer
    In my education I have been told that it is a flawed idea to expose actual primary keys (not only DB keys, but all primary accessors) to the user. I always thought it to be a security problem (because an attacker could attempt to read stuff not their own). Now I have to check if the user is allowed to access anyway, so is there a different reason behind it? Also, as my users have to access the data anyway I will need to have a public key for the outside world somewhere in between. Now that public key has the same problems as the primary key, doesn't it?

    Read the article

  • Hello, can you just send me all your data please?

    - by fatherjack
    LiveJournal Tags: Security,SQL Server Our house phone rang on Saturday night and Mrs Fatherjack answered. I was in the other room but I heard her trying to explain to the caller that they were in some way mistaken. Eventually, as she got more irate with the caller, I went out and started to catch up with the events so far. The caller was trying to convince my wife that our computer was infected with a virus. She was confident that it wasn't. Her patience expired after almost 10 minutes...(read more)

    Read the article

  • redirecting in node.js behind mod_rewrite proxy

    - by chmanie
    I have a node.js application running behind an Apache mod_rewrite proxy configured in a .htaccess file like this:

        RewriteCond %{HTTP_HOST} =mydomain.com [OR]
        RewriteCond %{HTTP_HOST} =www.mydomain.com
        RewriteRule (.*) http://localhost:3000/$1 [QSA,P]

    When I now do a redirect (e.g. express' res.redirect()) inside my node.js application (which runs on port 3000), the user is always redirected to http://localhost:3000/ (which is in fact exactly what is defined above but not the desired behaviour). Is there any way around this?
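    (A hedged note, not from the thread: with the [P] flag the Node backend sees "Host: localhost:3000", so any absolute redirect URL built from that header - and any Location header pointing at the backend - leaks the internal address. The usual knobs are ProxyPreserveHost and ProxyPassReverse, but both live in the server/vhost config rather than .htaccess; a reference sketch assuming you can reach the vhost:)

        <VirtualHost *:80>
            ServerName mydomain.com
            ServerAlias www.mydomain.com

            # Pass the original Host header through to the Node app...
            ProxyPreserveHost On
            ProxyPass        / http://localhost:3000/
            # ...and rewrite Location headers coming back from it.
            ProxyPassReverse / http://localhost:3000/
        </VirtualHost>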

    Read the article

  • Wildcard redirect in .htaccess giving the error "this webpage has a redirect loop"

    - by kath
    In my website I changed the directory name "vehicles-cars" to "vehicles-cars-for-sale". When I tried to redirect using a wildcard redirect from my old directory name to the new directory name in my web hosting cPanel account, I get an error every time I open pages from that directory: "this webpage has a redirect loop". The website is PHP. The problem is that I have lots of pages with the old directory indexed in Google and they are getting duplicate content. I really need some advice on what to do with this problem. Here is the .htaccess code for the redirect, thanks.

        RewriteEngine on
        RewriteCond %{HTTP_HOST} ^adsbuz\.com$ [OR]
        RewriteCond %{HTTP_HOST} ^www\.adsbuz\.com$
        RewriteRule ^vehicles\-cars\/?(.*)$ "http\:\/\/adsbuz\.com\/vehicles\-cars\-for\-sale\/$1" [R=301,L]
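    (A hedged guess at the cause, not a confirmed answer: the new directory name starts with the old one, so "vehicles-cars-for-sale/..." still matches "^vehicles\-cars\/?(.*)$" and gets redirected again, forever. A sketch that only matches the old directory exactly, or followed by a slash:)

        RewriteEngine on
        RewriteCond %{HTTP_HOST} ^(www\.)?adsbuz\.com$ [NC]
        RewriteRule ^vehicles-cars(/.*)?$ http://adsbuz.com/vehicles-cars-for-sale$1 [R=301,L]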

    Read the article

  • Using env variables with RewriteRule and ErrorDocument

    - by misterte
    Hi, I'm having problems with the following while configuring my Apache server to rewrite some URLs:

        SetEnv PATH_TO_DIR /directory

        RewriteRule ^%{PATH_TO_DIR}/([a-zA-Z0-9_\-]+)/([a-zA-Z0-9_\-\.]+)/?$ /index.php?dir=$1&file=$2

        ErrorDocument 404 %{PATH_TO_DIR}/index.php?dir=null&file=error

    This conf used to work perfectly fine until I used SetEnv PATH... etc. I need to use this because there are lots of rules, not just those. Can anyone point out my mistake? Apache returns %{PATH_TO_DIR}/index.php?dir=null&file=error when I try anything (www.site.com/foo/bar for instance). Apache returns the ErrorDocument if I just try to fetch the index. I know it's not a problem with the rewrite rules because they work when I remove the PATH_TO_DIR variable and just hard-code it. Thanks! A.
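    (A hedged explanation, not from the thread: mod_rewrite never expands variables inside the RewriteRule pattern - that argument is a plain regex - and environment variables are only reachable as %{ENV:NAME} in RewriteCond test strings and substitutions; SetEnv also runs too late in the request cycle for per-directory rewrites to see it. ErrorDocument does no interpolation at all, which is why the literal %{PATH_TO_DIR} text comes back as the error message. One way to avoid repeating the prefix in every rule is to put the rules in an .htaccess inside that directory, where the prefix is stripped automatically:)

        # /directory/.htaccess  (hypothetical placement)
        RewriteEngine On
        RewriteRule ^([a-zA-Z0-9_-]+)/([a-zA-Z0-9_.-]+)/?$ /index.php?dir=$1&file=$2 [L]

        ErrorDocument 404 /directory/index.php?dir=null&file=error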

    Read the article

  • htaccess execution order and priority

    - by ChrisRamakers
    Can anyone explain to me in what order Apache executes .htaccess files residing in different levels of the same path, and how the rewrite rules therein are prioritized? For example, why doesn't the rewrite rule in the first .htaccess below work, and why is the one in /blog prioritized?

    .htaccess in /:

        RewriteEngine on
        RewriteBase /
        RewriteRule ^blog offline.html [L]

    .htaccess in /blog:

        RewriteEngine On
        RewriteBase /blog/
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule . /blog/index.php [L]

    PS: I'm not simply looking for an answer but for a way to understand the Apache/mod_rewrite internals... why is more important to me than how to fix this :) Thanks!
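    (A hedged explanation, not an authoritative answer: .htaccess files along the path are merged, but per-directory mod_rewrite configuration is not - the deepest directory that switches RewriteEngine on effectively replaces the parent's rule set, so for /blog requests only /blog/.htaccess runs. RewriteOptions is the knob that changes this: Inherit appends the parent's rules after the child's own, and InheritBefore (Apache 2.4) puts them first; either way inherited patterns are matched in the child's per-directory context, so they may need adjusting. A sketch for /blog/.htaccess:)

        RewriteEngine On
        RewriteOptions Inherit
        RewriteBase /blog/
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule . /blog/index.php [L]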

    Read the article

  • How does one block unsupported web browsers?

    - by Sn3akyP3t3
    Web browsers that have reached end of life no longer receive security updates, which not only leaves them vulnerable for the end user, but I imagine isn't safe for the servers they visit either. Is it practical to block unsupported browsers, or to enforce a policy and notify the end user that their browser is unsafe and unsupported? If so, how would one achieve that? I don't know of any official or crowd-sourced listing with that information to parse and keep up to date. I'm aware that the practice can be custom-built with User-Agent parsing and feature detection for HTML5-enabled browsers.
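    (A rough sketch, not a recommendation from the thread: a crude server-side gate using SetEnvIf on the User-Agent header. The pattern and the /upgrade.html page are made-up examples, and User-Agent strings are trivially spoofable, so treat this as a nudge for users rather than a security control.)

        SetEnvIf User-Agent "MSIE [1-6]\." outdated_browser
        <IfModule mod_rewrite.c>
            RewriteEngine On
            RewriteCond %{ENV:outdated_browser} =1
            RewriteCond %{REQUEST_URI} !^/upgrade\.html$
            RewriteRule .* /upgrade.html [R=302,L]
        </IfModule>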

    Read the article

  • mod_rewrite works within a directory but not at the root

    - by Anvesh Saxena
    I am having a problem with my RewriteRule for the tags portion. What I have been able to debug is that the rule is being triggered, because the page "tags.php" is rendered, but without the URL parameters. This .htaccess file with the rules is in the root of my sub-domain and has the following content for the tags portion:

        # Rewrite rule for tags
        RewriteRule ^tags/(\w+)/(\d+)/?$ tags.php?tag_name=$1&tag_id=$2
        RewriteRule ^tags/(\w+)/?$ tags.php?tag_name=$1
        RewriteRule ^tags/?$ tags.php?tag_name=

    Another thing that I haven't been able to debug is that a similar .htaccess file exists for a directory within my sub-domain and works as expected, with the necessary URL parameters available. The .htaccess file within that directory reads as follows:

        # Rewrite rule for tags
        RewriteRule ^tags/(\w+)/(\d+)/?$ restAPI.php?type=tags&tag_name=$1&tag_id=$2
        RewriteRule ^tags/(\w+)/?$ restAPI.php?type=tags&tag_name=$1
        RewriteRule ^tags/?$ restAPI.php?type=tags&tag_name=

    Could anyone point out the problem that I might be having in my rewrite rules? I am also sometimes facing an Internal Server Error, which I am second-guessing is due to the linked problem. Note: I have Apache version 2.2.23 on my shared hosting.
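    (A hedged sketch, not a confirmed fix: without [L], a request that matches the first rule keeps falling through the remaining ^tags/... patterns and any other rules in the root .htaccess, which can strip the query string or even loop into an Internal Server Error. [L] stops processing after a match and [QSA] preserves any query string that was already present:)

        RewriteEngine On
        # Rewrite rules for tags
        RewriteRule ^tags/(\w+)/(\d+)/?$ tags.php?tag_name=$1&tag_id=$2 [L,QSA]
        RewriteRule ^tags/(\w+)/?$ tags.php?tag_name=$1 [L,QSA]
        RewriteRule ^tags/?$ tags.php?tag_name= [L]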

    Read the article

  • Warning about SSL certificate, am I under attack?

    - by Bunny Rabbit
    Lately I've been getting a lot of warnings about SSL certificates on my PC. Empathy keeps telling me that Facebook's certificate is self-signed and can't be trusted, and there are also occasional security warnings in Google Chrome. I remember the last one saying that the page is secured but some of the resources the page uses do not come from a secure connection, something like that. Is my PC hacked or under attack? How can I check that, and if so, how can I safeguard myself? PS: One thing that comes to mind is that I might be under an ARP poisoning/spoofing attack.

    Read the article

  • Does purposely linking to an invalid URL and then using 301 affect SEO?

    - by Mike
    On a section of my site, I am currently using .htaccess rewrites to put the ID as part of the URL instead of in the query, like so:

        RewriteRule ^([a-z_]+)?/?tours/([0-9]+)/(.*) /tours/tour_text.php?lang=$1&id=$2&urlstr=$3 [L]

    For example, if someone goes to /en/tours/12/some-text-here it will rewrite it to /tours/tour_text.php?lang=en&id=12&urlstr=some-text-here. However, I don't want the users to be able to put just any text, so if they type in the wrong some-text-here part it will 301 redirect them to the right page. This works perfectly, but I can see a potential problem arising when localizing the website, so I just wanted to make sure it's not actually a problem. As it is now, if someone goes to /en/tours/12/some-text-here, the anchor to the Spanish version of that page will be /es/tours/12/some-text-here (i.e. only changing the "en" to "es"), and the script will then 301 them to the correct Spanish text (something like /es/tours/12/algun-texto-aqui). The reverse is also the same: the anchor on the Spanish version to the English version would be /en/tours/12/algun-texto-aqui, and they will be forwarded with a 301 back to /en/tours/12/some-text-here. Basically, the anchor changes the language and the 301 changes the string at the end. So I have two questions:

        - Does purposely and permanently having invalid URLs on your site that get 301'ed to the correct ones have any effect on SEO? I could make it show the correct URL to begin with, but this is a significant amount of work due to how I am handling the translations, so I would prefer just to 301 them.
        - Will the invalid URLs that are contained in the links be added to the search engine indexes even if they get 301'ed to another page?

    Read the article

  • Ask the Readers: How Do You Browse Securely Away From Home?

    - by Jason Fitzpatrick
    When you’re browsing away from home, be it on your smartphone, tablet, or laptop, how do you keep your browsing sessions secure? This week we’re interested in hearing all about your mobile security tips and tricks. When you’re out and about you often, out of necessity or convenience, need to connect to open Wi-Fi hotspots and otherwise put your data out there in ways that you don’t when you’re at home. This week we want to hear about your tips, tricks, and applications for keeping your data secure and private when you’re away from your home network. Sound off in the comments with your tips and then check back on Friday for the What You Said roundup.

    Read the article

  • Suggest-a-Session for Oracle Develop 2010: Last chance to get your paper submitted.

    - by olaf.heimburger
    While working with Oracle technologies on customer projects, we all come across solutions and ideas that are worth sharing with a greater audience. If you missed the Call for Papers for Oracle OpenWorld and Oracle Develop, you still have a chance to get in. The Oracle Mix community provides a tool called Suggest-a-Session for submitting and voting on the sessions you would like to attend. My suggestions: if you pass by, do not forget to vote for my sessions. These are: Real-World Single Sign-On and ADF Security, and The Personal Newsletter Generator: Implement Cool Applications with ADF Faces. Thank you for your support.

    Read the article

  • apache domain redirect to subfolder

    - by Dennis
    I have a hosting account with GoDaddy. It's a Linux system running Apache. The way they do their setup, your primary domain is the root folder. When you add a subdomain it's in a subfolder of the root, which sucks. I want to set up a subfolder structure to organize my domains. I called GoDaddy support and they said to use redirects, but did not know how to do that. How it's set up now:

        primary domain: www.domain.com   /
        sub.domain.com                   /sub

    I want to create a directory structure and then redirect to each, but only show www.domain.com in the URL:

        www.domain.com   /domain/www
        sub.domain.com   /domain/sub

    I tried using:

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www.)?domain.com$
        RewriteRule ^(/)?$ domain/www [L]

    but it just changes the URL to www.domain.com/domain/www. Can this be done in .htaccess?
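    (A hedged sketch, not a GoDaddy-specific recipe: instead of redirecting, which exposes the folder in the address bar, rewrite internally so every request for the primary domain is served from /domain/www while the URL stays www.domain.com. The folder names follow the structure described above.)

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?domain\.com$ [NC]
        RewriteCond %{REQUEST_URI} !^/domain/www/
        RewriteRule ^(.*)$ /domain/www/$1 [L]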

    Read the article

  • Wordpress .htaccess preventing subfolder access

    - by John K.
    This is sort of a goofy setup, but it's not in my power to reconfigure it at this time. I'm running in a shared hosting environment. The domain is example.com. This is an add-on domain on the host side, with example.com being redirected to the www/example.com sub-directory. That directory houses a standard WordPress site which acts as the main site when you visit example.com. The .htaccess file within that directory is:

        # BEGIN WordPress
        <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteBase /
        RewriteRule ^index\.php$ - [L]
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule . /index.php [L]
        </IfModule>
        # END WordPress

        <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteRule ^wp-admin/profile\.php$ /ssm/welcome [R]
        </IfModule>

    I have a subdirectory at the root level, alongside the /example.com subdirectory, that houses a CakePHP application. That subdirectory is /tracker. My problem is that when I attempt to browse to example.com/tracker, I get a 404 from WordPress because permalinks are on. What I think I need is a rewrite rule in the WordPress .htaccess file that short-circuits the existing rewrite rules and permits example.com/tracker to work independently of the WordPress install - or a rewrite rule at the root level that short-circuits the redirect to the /example.com directory in the first place. Not sure how well I explained that, so here's a summary. The www/ directory structure:

        example.com/
        tracker/

    Add-on domain www.example.com redirecting to the /example.com directory with WordPress, and a tracker/ directory running CakePHP which I would like to access via www.example.com/tracker. If you need further info or clarification let me know!
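    (A hedged sketch, not a confirmed answer: the usual WordPress-side fix is an early pass-through rule so /tracker requests are left alone. This only helps if /tracker actually resolves under the add-on domain's document root; since it currently sits beside example.com/ rather than inside it, it may need to be moved or symlinked there first.)

        # BEGIN WordPress
        <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteBase /
        # Leave the CakePHP app alone before WordPress grabs the request
        RewriteRule ^tracker(/.*)?$ - [L]
        RewriteRule ^index\.php$ - [L]
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule . /index.php [L]
        </IfModule>
        # END WordPress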

    Read the article

  • Removing 301 redirect from site root

    - by Jon Clements
    I'm having a look at a friend's website (a fairly old PHP-based one) which they've been advised needs restructuring. The key points are:

        - URLs should be lower case and more "friendly".
        - The root of the domain should not be redirected.

    The first point I'm happy with (the URLs needed tidying up anyway) and I have a draft plan of action; however, the second is baffling me as to not only the best way to do it, but also whether it should be done at all. Currently http://www.example.com/ is redirected to http://www.example.com/some-link-with-keywords/ using the following index.php in the root of the Apache2 instance:

        <?php
        $nextpage = "some-link-with-keywords/";
        header( "HTTP/1.1 301 Moved Permanently" );
        header( "Status: 301 Moved Permanently" );
        header( "Location: $nextpage" );
        exit(0); // This is optional but suggested, to avoid any accidental output
        ?>

    As far as I'm aware, this has been the case for around three years - and I'm sorely tempted to advise not to worry about it. It would appear that taking off the 301 could:

        - Potentially affect page ranking (as the 'homepage' would disappear - although it couldn't disappear because of the next point...)
        - Introduce maintenance issues, as existing users would still have the redirected page in their cache
        - Following the above, introduce duplicate content
        - Confuse Google/other search engines as to what the homepage actually is now

    I may be over-analysing this, but I have a feeling it's not as simple as removing the 301 from the root and 301'ing the previous target to the root... Any suggestions (including "it's not worth it") are sincerely appreciated.
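    (A small sketch of the "reverse it" option mentioned at the end, offered only as an illustration and not as a recommendation: serve real content at /, then permanently point the old landing URL back at the root so cached links and indexed pages keep resolving.)

        RewriteEngine On
        RewriteRule ^some-link-with-keywords/?$ / [R=301,L]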

    Read the article

  • What are the downsides of leaving automation tags in production code?

    - by joshin4colours
    I've been setting up debug tags for automated testing of a GWT-based web application. This involves turning on custom debug id tags/attributes for elements in the source of the app. It's a non-trivial task, particularly for larger, more complex web applications. Recently there's been some discussion of whether enabling such debug ids across the board is a good idea. Currently the debug ids are only turned on on development and testing servers, not in production. Points have been raised that enabling debug ids causes performance to take a hit, and that debug ids in production may lead to security issues. What are the benefits of doing this? Are there any significant risks in turning on debug tags in production code?

    Read the article

  • How to Remove Extensions From, and Force the Trailing Slash at the End of URLs?

    - by Kronbernkzion
    Example of current file structure:

        example.com/foo.php
        example.com/bar.html
        example.com/directory/
        example.com/directory/foo.php
        example.com/directory/bar.html
        example.com/cgi-bin/directory/foo.cgi

    I would like to remove the HTML, PHP and CGI extensions from, and then force a trailing slash at the end of, URLs. So it could look like this:

        example.com/foo/
        example.com/bar/
        example.com/directory/
        example.com/directory/foo/
        example.com/directory/bar/
        example.com/cgi-bin/directory/foo/

    I am very frustrated because I've searched for 17 hours straight for a solution and visited more than a few hundred pages on various blogs and forums. I'm not joking. So I think I've done my research. Here is the code that sits in my .htaccess file right now:

        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{REQUEST_FILENAME}\.html -f
        RewriteRule ^(([^/]+/)*[^./]+)/$ $1.html

        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{REQUEST_URI} !(\.[a-zA-Z0-9]|/)$
        RewriteRule (.*)$ /$1/ [R=301,L]

    As you can see, this code only removes .html (and I'm not very happy with it because I think it could be done a lot simpler). I can remove the extension from PHP files when I rename them to .html through .htaccess, but that's not what I want. I want to remove it directly. This is the first thing I don't know how to do.

    The second thing is actually very annoying. My .htaccess file with the code above adds .html/ to every string entered after example.com/directory/foo/. So if I enter example.com/directory/foo/bar (obviously /bar doesn't exist, since foo is a file), instead of just displaying a message that the page is not found, it converts it to example.com/directory/foo/bar.html/, then searches for a file for a few seconds and then displays the not-found message. This, of course, is bad behavior.

    So, once again, I need the code in .htaccess to do the following things:

        - Remove the .html extension
        - Remove the .php extension
        - Remove the .cgi extension
        - Force the trailing slash at the end of URLs
        - Requests should behave correctly (no adding trailing slashes or extensions to strings if the file or directory doesn't exist on the server)
        - Code should be as simple as possible

    I would very much appreciate any help. And to the first person that gives me the solution, I'll send two $50 iTunes Store gift cards for the US store. If this offends anyone, I am truly sorry and I apologize. Thanks in advance.
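    (A hedged sketch only, not a tested drop-in: it assumes every clean URL maps back to exactly one real .php, .html or .cgi file under the document root, that AllowOverride permits these directives, and that /cgi-bin is an ordinary directory below the document root rather than a ScriptAlias outside it.)

        RewriteEngine On

        # 1) Redirect direct requests for *.php/*.html/*.cgi to the clean, slashed URL
        RewriteCond %{THE_REQUEST} \s/+(.+?)\.(php|html|cgi)[\s?] [NC]
        RewriteRule ^ /%1/ [R=301,L]

        # 2) Add the trailing slash to extensionless URLs that are not real directories
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{REQUEST_URI} !/$
        RewriteCond %{REQUEST_URI} !\.[a-zA-Z0-9]+$
        RewriteRule ^(.*)$ /$1/ [R=301,L]

        # 3) Internally map the clean URL back to the real file, if it exists
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{DOCUMENT_ROOT}/$1.php -f
        RewriteRule ^(.+?)/?$ $1.php [L]

        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{DOCUMENT_ROOT}/$1.html -f
        RewriteRule ^(.+?)/?$ $1.html [L]

        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{DOCUMENT_ROOT}/$1.cgi -f
        RewriteRule ^(.+?)/?$ $1.cgi [L]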

    Read the article
