Search Results

Search found 8013 results on 321 pages for 'clean urls'.


  • RewriteRule working local but not on remote server

    - by m0tv
    I have a .htaccess file with one simple RewriteRule: RewriteEngine on RewriteRule ^([A-Za-z0-9-]+)$ ?site=$1 I want to have a URL like http://www.example.com/imprint and forward it to http://www.example.com/?site=imprint I checked this rule with a RewriteRule tester, which gave me the results I want to achieve. On my local development system it works well too. But on the remote server the URLs just give me a 404 error. Other, simpler rewrite rules work with no problems, so everything must be set up correctly (I think...). The problem is that I don't have access to any error logs or the server configs, so the only thing I can do is guess... Can anyone tell me if there's something wrong with this rule? Or anything else I can do or test to solve this? Or has someone an idea what could be wrong on the server?
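    A common culprit when a per-directory rule works locally but 404s remotely is that the remote vhost resolves the rewrite against a different base path, or that the catch-all pattern swallows requests for real files. A minimal sketch of a more defensive version (the RewriteBase value is an assumption about the remote document root):

        RewriteEngine on
        RewriteBase /
        # skip real files and directories so assets keep working
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^([A-Za-z0-9-]+)$ ?site=$1 [L]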

    Read the article

  • Blocking path scanning

    - by clinisbut
    I'm seeing a number of very suspicious requests in my access log: /i /im /imaa /imag /image /images /images/d /images/di /images/dis They are prefixes of a known resource (in the above example /images/disrupt.jpg), all coming from the same IP. The request rate varies from 1/sec to 10/sec and seems somewhat random. They are obviously probing for something, apparently with a script. How do I block this kind of behaviour? I thought of blocking the IP, at least for a given time, keeping in mind that: Request intervals seem legitimate (at least I think so). I don't want to end up blocking a search engine bot, which may hit 404 URLs too (and that's a different problem, I know). Do they always use the same IP?
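    No answer is given in the excerpt, but one common approach is to temporarily ban IPs that accumulate many 404s, for example with fail2ban watching the Apache access log. A hedged sketch; the filter name, log path and thresholds are assumptions, and too low a maxretry could catch legitimate crawlers:

        # /etc/fail2ban/filter.d/apache-404-scan.conf (hypothetical filter name)
        [Definition]
        failregex = ^<HOST> -.*"GET .*HTTP.*" 404

        # /etc/fail2ban/jail.local
        [apache-404-scan]
        enabled  = true
        port     = http,https
        filter   = apache-404-scan
        logpath  = /var/log/apache2/access.log
        maxretry = 20
        findtime = 60
        bantime  = 3600

    With these (arbitrary) numbers, an IP producing 20 or more 404s within a minute is banned for an hour, which fits the 1-10/sec probe rate without tripping on a bot that hits the occasional dead link.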

    Read the article

  • How to get rid of crawling errors due to the URL Encoded Slashes (%2F) problem in Apache

    - by user14198
    The Google web crawler has indexed a whole set of URLs with encoded slashes (%2F) for our site. I assume it has picked the pages up from our XML sitemap file. The problem is that the live pages will actually fail because of the URL-encoded-slashes problem in Apache. Some solutions are mentioned here. We are implementing a 301 redirect scheme for all the error pages, which should make the Googlebot drop the pages from the crawl errors (no more crashing pages). Does implementing the 301s require the pages to be "live"? In that case we may be forced to implement solution 1 from the article. The problem is that solution 1 would pose a security vulnerability.
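    For context (this is not in the excerpt): Apache rejects %2F in request paths by default, and the usual fix is the AllowEncodedSlashes directive. Since Apache 2.2.18 there is a NoDecode variant that accepts encoded slashes without decoding them, which mitigates the path-injection concern attached to a plain On. A sketch, assuming access to the vhost configuration:

        # vhost / server config only; AllowEncodedSlashes cannot be set in .htaccess
        AllowEncodedSlashes NoDecode

    With the pages resolving again, the application can then issue the 301s from the %2F URLs to their canonical equivalents, so the bot sees the redirects without the broken pages ever having to work as real content.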

    Read the article

  • SEO: Google Publisher Network

    - by Andy
    I'm just about to start a new business which creates niche affiliate sites. I'm curious how Google will treat me when all the URLs are hosted with the same Analytics tags, Webmaster Tools tags and server IP ranges. To benefit the most from Google's SERPs, should I have each domain in a separate Analytics account and Webmaster Tools account, or is it OK to have all of my domains in one account? My issue is duplicate content and the fact that I am building a publisher network, and I'm not sure how much Google likes them. I'm notoriously bad at searching and as such haven't found what I'm looking for yet. Any help would be very much appreciated.

    Read the article

  • How to track in Google Analytics registrations that come from Google AdWords ads?

    - by automatix
    I created a campaign in Google AdWords, created some ads in it, and gave them URLs like mydomain.tld/registration/?utm_campaign=mycampaing&ad=x mydomain.tld/registration/?utm_campaign=mycampaing&ad=y mydomain.tld/registration/?utm_campaign=mycampaing&ad=z All ads lead to the registration page. A registration is a visit to the page mydomain.tld/registration-complited/?user={ID}, so I can track registrations in Google Analytics: I just go to Behavior -> Site Content -> All Pages and filter the pages to registration-complited. But how can I see how many and which users have registered after they came from an ad of a campaign, e.g. utm_campaign? And how can I also track this for a single ad of the campaign, e.g. x?
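    One likely issue, not stated in the excerpt: Google Analytics only attributes campaign traffic via the standard UTM parameters (at minimum utm_source, plus utm_medium and utm_campaign in practice), and a custom parameter like ad= is not imported at all; utm_content is the standard slot for distinguishing individual ads. Hedged examples of how the ad URLs could be tagged instead, reusing the asker's placeholder domain and values:

        mydomain.tld/registration/?utm_source=google&utm_medium=cpc&utm_campaign=mycampaing&utm_content=x
        mydomain.tld/registration/?utm_source=google&utm_medium=cpc&utm_campaign=mycampaing&utm_content=y
        mydomain.tld/registration/?utm_source=google&utm_medium=cpc&utm_campaign=mycampaing&utm_content=z

    utm_content then appears in Analytics as the Ad Content dimension, which can be segmented down to a single ad; defining a goal on the registration-complited page would additionally let the Conversions reports break registrations down by campaign and ad. Linking the AdWords and Analytics accounts and relying on auto-tagging (gclid) is the alternative that avoids manual tags entirely.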

    Read the article

  • Working with different URL structures

    - by Dane411
    As I'm quite a newbie to this field I have some doubts, and there are answers I couldn't find on Google, i.e.: If I'm not wrong, index.html makes it possible to avoid adding the filename to the URL, so www.example.com/ is equal to www.example.com/index.html. And that works for the following subdirectories too, right? e.g. www.example.com/music/ Is there any other way to achieve this without using an index.html file? (I've read something about converting dynamic URLs to static ones: ./?var1=value1&varN=valueN -> ./value1/valueN) How can I convert www.example.com/music/ to music.example.com/, and why should that be used? Thanks in advance!
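    Two hedged Apache sketches for the two questions, assuming Apache since the surrounding threads are .htaccess-based. First, the directory index filename is configurable, so index.html is a convention rather than a requirement. Second, a subdomain is usually a DNS entry plus a vhost or host-header rewrite, not something a file alone can do; this sketch assumes DNS for music.example.com already points at the same server:

        # serve main.html (or any name you choose) when a directory is requested
        DirectoryIndex main.html

        # map music.example.com onto the /music/ directory of the same docroot
        RewriteEngine on
        RewriteCond %{HTTP_HOST} ^music\.example\.com$ [NC]
        RewriteCond %{REQUEST_URI} !^/music/
        RewriteRule ^(.*)$ /music/$1 [L]

    A subdomain like music.example.com is mostly a branding and organisation choice; note that search engines tend to treat it as a separate host from www.example.com.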

    Read the article

  • best/simplest way to inform search engine of sitemap location

    - by Don
    AFAIK, there are two ways to make search engines aware of a sitemap's location: (1) include an absolute link to it in robots.txt, or (2) submit it to them directly. The relevant URLs are: http://www.google.com/webmasters/tools/ping?sitemap=SITEMAP_URL http://www.bing.com/webmaster/ping.aspx?sitemap=SITEMAP_URL where SITEMAP_URL is the absolute URL of the sitemap. Currently, I do both. Regarding (2), I have a job that runs automatically every day and submits the sitemap to Bing and Google. I don't think there's any reason to do both (1) and (2), but I'm paranoid, so I do. I imagine you can avoid both (1) and (2) if you just make your sitemap accessible at a conventional URL (like robots.txt). What's the simplest and most reliable way to ensure that search engines can find your sitemap?
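    For reference, (1) is a single line in robots.txt, and (2) can be done from the daily job with plain GET requests to the ping endpoints quoted above. A sketch with a placeholder sitemap URL:

        # robots.txt
        Sitemap: http://www.example.com/sitemap.xml

        # daily job; the sitemap URL is URL-encoded
        curl "http://www.google.com/webmasters/tools/ping?sitemap=http%3A%2F%2Fwww.example.com%2Fsitemap.xml"
        curl "http://www.bing.com/webmaster/ping.aspx?sitemap=http%3A%2F%2Fwww.example.com%2Fsitemap.xml"

    Unlike robots.txt, there is no auto-discovered conventional location for sitemaps; /sitemap.xml is a common choice but crawlers are not guaranteed to probe it, which is why the robots.txt line is the closest thing to a set-and-forget option.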

    Read the article

  • How do I interpret direct traffic that lands on random pages?

    - by mfg
    Looking at yesterday's data, according to Google Analytics I got six direct visitors to my site (their source/medium is direct/(none)). Only one landed on the actual domain root; the other five ended up at miscellaneous pages like foo.com/xyz.html. I did not send out links to people by email, and I'm not sure how likely it is that people would have copied and pasted the URLs. How do these visitors end up there? Is there a way to better capture where they might be coming from?

    Read the article

  • How to generate "language-safe" UUIDs?

    - by HappyDeveloper
    I always wanted to use randomly generated strings for my resources' IDs, so I could have shorter URLs like this: /user/4jz0k1 But I never did, because I was worried about the random string generation creating actual words, e.g. /user/f*cker. This brings two problems: it might be confusing or even offensive for users, and it could mess with the SEO too. Then I thought all I had to do was set up a fixed pattern, like adding a number every two letters. I was very happy with my 'generate_safe_uuid' method, but then I realized it was only better for SEO and worse for users, because it increased the ratio of actual words being generated, e.g. /user/g4yd1ck5 Now I'm thinking I could create a 'replace_numbers_with_letters' method and check that it hasn't formed any words, against a dictionary or something. Any other ideas? P.S. As I write this, I also realize that checking for words in more than one language (e.g. English and French, Spanish, etc.) would be a mess, and I'm starting to love numbers-only IDs again.
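    A common trick, not mentioned in the excerpt, is to sidestep the dictionary check entirely by generating IDs from an alphabet with no vowels: without vowels, real words in most Latin-script languages simply cannot form. A minimal PHP sketch; the alphabet and length are arbitrary choices, and random_int() needs PHP 7+ (mt_rand() would do on older installs for non-security-critical IDs):

        <?php
        // Generate a short ID from a vowel-free alphabet so no real words can appear.
        // The alphabet also omits easily confused characters (l/1, o/0).
        function generate_safe_id($length = 6) {
            $alphabet = 'bcdfghjkmnpqrstvwxz23456789';
            $id = '';
            for ($i = 0; $i < $length; $i++) {
                $id .= $alphabet[random_int(0, strlen($alphabet) - 1)];
            }
            return $id;
        }

        echo generate_safe_id(); // e.g. "7gdk2x"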

    Read the article

  • IIS 6 nested virtual directory redirection

    - by threedaysatsea
    We're running IIS 6 on a WinServer2k3 box and we're having some trouble with the following problem: E-mails were sent out to users asking them to go to the following URL: alias.contoso.com/directory2/view.aspx?queryparam1=no&queryparam2=blue However, the URLs are actually supposed to be: server.contoso.com/directory2/view.aspx?queryparam1=no&queryparam2=blue It's too late to recall all of the e-mails, and we'd like to redirect traffic to make this as seamless as possible for our users. The real problem is that the server (server.contoso.com) hosts the alias (alias.contoso.com) as a redirect, like this, and the existing redirect needs to stay functional: Default Web Site (server.contoso.com) --Directory1 --Directory2 --Directory3 Redirection to Directory3 (alias.contoso.com) --Essentially, alias.contoso.com takes the user to server.contoso.com/Directory3 Is there any way to host a separate redirect inside of the existing redirect? We need to keep alias.contoso.com taking the user to server.contoso.com/Directory3, but also make alias.contoso.com/directory2/view.aspx?queryparam1=no&queryparam2=blue point to server.contoso.com/directory2/view.aspx?queryparam1=no&queryparam2=blue Any tips? Is this even possible?

    Read the article

  • A lot of 302 redirects

    - by user3651934
    I have a website whose one-month stats show: Unique Visitors 6274, Total Visitors 7260, Pages visited 9520, Hits 88891. What concerns me is the HTTP status code count: 302 Moved temporarily (redirect) 36302. How come 40% of hits are being redirected? If that is not normal, what could be the possible reasons?
    ------------------------ adding more information ------------------------
    OK, here is the code I'm using in my .htaccess file for clean URLs. Is this causing as many as 36302 redirect hits?

        RewriteCond %{REQUEST_FILENAME}.php -f
        RewriteRule ^([^\.]+)$ $1.php [L]
        RewriteCond %{REQUEST_FILENAME} -d
        RewriteRule ^(.+[^/])$ $1/ [R]
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.*)$ index.php?page=$1 [L,QSA]
        RewriteRule ^(.*)/$ index.php?page=$1 [L,QSA]
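    Worth noting, though the excerpt contains no accepted answer: in mod_rewrite the [R] flag without an explicit status code issues a 302, so the trailing-slash rule above sends a 302 for every directory request that arrives without a trailing slash, and each of those redirects then costs a second hit for the redirected request. If those redirects are meant to be permanent, the usual fix is a sketch like this:

        RewriteCond %{REQUEST_FILENAME} -d
        RewriteRule ^(.+[^/])$ $1/ [R=301,L]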

    Read the article

  • How will this affect my SEO ranking?

    - by dunc
    I run a fishkeeping website based on a WordPress (PHP) CMS. I've recently put a fairly complex "filter" into place which searches my content for mentions of fish species profiles and turns them into active links. For example, asdasd this is a test about abdomen to see if the caudal fin will work asdadasdas try again with abdomen and A. panduro and Apistogramma panduro ...becomes asdasd this is a test about abdomen to see if the caudal fin will work asdadasdas try again with abdomen and <a href="/?p=1703" class="link_species">A. panduro</a> and <a href="/?p=1703" class="link_species">Apistogramma panduro</a> On the rest of my website the species are linked with pretty URLs such as /species/apistogramma-panduro/, but due to the way this filter works, the only information I have access to is the id of the post. As such, I'm using /?p=1703 or whatever the ID is. What I'd like to know is: how much will this affect my SEO rating/ranking? Will it be detrimental if I don't rewrite the function? Thanks in advance,
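    Since the filter runs inside WordPress and already has the post ID, the rewrite may be easier than feared: WordPress can resolve an ID to its pretty permalink at runtime. A hedged sketch of how the filter's link generation could be swapped; the helper function itself is hypothetical, but get_permalink(), esc_url() and esc_html() are standard WordPress functions:

        <?php
        // Hypothetical helper inside the species-linking filter:
        // turn a post ID into the pretty /species/apistogramma-panduro/ URL.
        function species_link($post_id, $label) {
            $url = get_permalink($post_id); // resolves the ?p=1703 style ID
            return '<a href="' . esc_url($url) . '" class="link_species">'
                 . esc_html($label) . '</a>';
        }

    As for ranking, /?p=1703 URLs should still resolve (WordPress normally 301s them to the canonical permalink when pretty permalinks are enabled), so link equity is largely preserved, but the extra redirect on every link is reason enough to prefer the permalink lookup.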

    Read the article

  • SharePoint Content and Site Editing Tips

    - by Bil Simser
    A few content management and site editing tips for power users, on this bacon-flavoured unicorn morning. The theme here is: keep it clean!

    Write "friendly" email addresses
    Remember it's human beings reading your content. Seeing something like "If you have questions please send an email to [email protected]" breaks up the readability. Instead, write the content in plain English, then go back, highlight the name and insert a link (note: you might have to prefix the link with mailto:[email protected]). It makes for a friendlier-looking page and hides the ugliness that is sometimes in email addresses.

    Use friendly column and list names
    This is a big pet peeve of mine. When you first create a column or list with spaces in its name, the internal name is fixed at creation. The display name might be "My Amazing List of Animals with Large Testicles" but the internal (and link) name becomes "My_x0020_Amazing_x0020_List_x0020_of_x0020_Animals_x0020_with_x0020_Large_x0020_Testicles". What's worse is if you create a publishing page named "This Website is Fueled By a Dolphin's Spleen". Not only is it incorrect grammar, but the apostrophe wreaks havoc on both the internal name for the list (with lots of crazy hex codes) and the hyperlink (where everything is URL-encoded). Instead, create the list with a distinct and compact name, then go back and change it to whatever you want. The end result is a better-formed name that you can both script and access in code more easily.

    Keep your views clean
    When you add a column to a list or create a new list, the default is to add it to the default view. Do everyone a favour and don't check this box! The default view of a list should be something similar to the Title field and nothing else. Keep it clean. If you want a different default view, go back and create one with all the fields, filtering and sorting you want, and set it as the default. It's a good idea to keep the original AllItems.aspx (note the lack of a space in the filename!) simple and unfiltered. It's also a good idea to keep your column count down in views. Don't let every column be added by default, and don't add every column just because you can. Create separate views for distinct responsibilities, and try to keep the number of columns down to a single screen to prevent horizontal scrolling.

    Simple navigation
    The Quick Launch is a great tool for navigating around your site, but don't use the default of adding all lists to it. Uncheck that box and keep navigation simple. Create custom groupings that make sense: if "Documents and Lists" doesn't fit your site but "Reports and Notices" makes more sense, then do that. Also hide internal lists from the Quick Launch. For example, if most users don't need to see all the lookup tables you might have on a site, don't show them. You can use audience filtering on the Quick Launch to hide admin items from non-admin users, so consider that as an option.

    Enjoy!

    Read the article

  • Extracting meta tag attributes using wget [migrated]

    - by Amit
    I have a file with some URLs, one per line. I need to extract the "keywords" present in the meta tags, i.e. if a page has a meta tag named "keywords", I want to get its "content" value. Example: if the web page has <meta name="keywords" content="wikipedia,encyclopedia">, then for that URL I want "wikipedia,encyclopedia" to be extracted. One approach is to download the web page using wget and then parse it with some standard HTML parser. I was wondering if there is any better way to do this without downloading the entire web page.
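    Since meta tags live in <head>, the page only needs to be read up to </head>, not downloaded whole. PHP's built-in get_meta_tags() works directly against a URL and stops parsing at </head>, which makes for a short sketch (the file-of-URLs name is a placeholder):

        <?php
        // Read URLs (one per line) and print each page's meta keywords.
        $urls = file('urls.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
        foreach ($urls as $url) {
            $tags = @get_meta_tags($url);   // stops reading at </head>
            $keywords = isset($tags['keywords']) ? $tags['keywords'] : '(none)';
            echo $url . "\t" . $keywords . "\n";
        }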

    Read the article

  • Moved sitemaps to a different subdomain and losing search referrals around the same time. Red herring or correlation?

    - by er1234
    We started to lose search referral traffic around the same time that I moved some of our sitemaps to a subdomain. Could this have hurt us? I followed Google's steps for creating a sitemap under a different subdomain. The new sitemaps.foo.com subdomain is being crawled and indexed well. Both www.foo.com and sitemaps.foo.com have been verified in Google Webmaster Tools, where they appear as distinct sites. Is this correct? I can't find a way in Webmaster Tools to say "Hey, sitemaps.foo.com is really owned by www.foo.com, so show them together and make sure to attribute sitemaps.foo URLs to www.foo". Our www.foo.com/robots.txt: Sitemap: http://www.foo.com/sitemap.xml Sitemap: http://sitemaps.foo.com/subdir/sitemap.xml.gz

    Read the article

  • Length of Page Title, URL, Meta Description and total number of links on a page

    - by MJWadmin
    We've been evaluating a number of different SEO tools recently. Several of these tell us that some of our page titles, URLs and meta descriptions are too long. We've also been told that some of our pages have too many links on them. I guess our first question is: is any of that feedback true? Can URLs etc. actually be too long, and if so, how much does this affect ranking? Secondly, can you have too many links on a page, and if so, how many is too many? Thanks in advance...

    Read the article

  • SEO penalty for landing page redirects

    - by therealsix
    Using eBay as an example, let's say I have a large number of items whose URLs look like this: cgi.ebay.com/ebaymotors/1981-VW-Vanagon-manual-seats-seven-/250953153841 I want to give my client the ability to put links to these items on their website EASILY, without knowing or checking my URL. So I created a redirect service that maps their identifier to my URL: ebay.com/fake_redirect_service/shared_identifier9918 would redirect to the link above. This works great: my clients can easily set up these links with information they already have, and the user sees the page as usual. So on to the problem... I'm concerned that this redirect service will have a negative impact on my SEO ranking. Having a landing page redirect you immediately to a different URL seems like something a typical spam site would do. Will this hurt me? Any better solutions?

    Read the article

  • BleachBit: How to Completely Clear URL History in Firefox?

    - by tSquirrel
    14.04 / Firefox 29.0. I've been using BleachBit to clear usage/file history, and for the most part it works great. However, it doesn't seem to clear the visited hostnames out of the URL bar at all. These addresses are not bookmarked, and the full URL isn't preserved, just the hostname. Steps: 1. Visit http://www.bluesnews.com/some_random_URL_string 2. Exit Firefox 3. Run BleachBit, with ALL Firefox options selected 4. Restart Firefox 5. Check history: completely empty, other than bookmarked sites (www.bluesnews is NOT bookmarked) 6. Type "blue", which Firefox automatically completes to "http://www.bluesnews.com/" Alternate step #3: use Firefox's built-in "Clear History" and select ALL entries with a time frame of "Everything". Same result as above. My inquiry in the BleachBit forums hasn't been answered. I found Dan's proposed solution, but changing autocomplete in about:config only turns the feature off; it doesn't actually stop storing URLs.
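    Not from the excerpt, but possibly relevant: Firefox of that vintage keeps a separate moz_hosts table inside places.sqlite to drive the URL-bar domain autofill, and a cleaner that only empties the history tables may leave it behind. A heavily hedged sketch; this is an assumption about the cause, and places.sqlite also holds bookmarks, so back up the profile first:

        # with Firefox fully closed, in the profile directory (path varies per profile)
        sqlite3 places.sqlite "DELETE FROM moz_hosts;"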

    Read the article

  • Google Site Search (commercial) not indexing files in sitemap

    - by melat0nin
    I have a client for whom we have purchased Google Site Search. It works well for HTML pages served by the CMS, but files aren't being reliably indexed. I wrote a script to generate an XML feed (sitemap) of all the files in the CMS, which I've plugged into Google Webmaster Tools for the site. It says that 923 URLs have been submitted for that sitemap, but only 26 have been indexed. The client relies heavily on searching within files, which is why we decided to use Google search, so this is a bit of a problem. Many of the files aren't linked to from any page on the site, as they are old and therefore don't merit having a page of their own, but they still need to be accessible through search for archiving purposes. The file archive XML can be found at www.sniffer.org.uk/file-archive and the standard XML sitemap (of pages) at www.sniffer.org.uk/sitemap.xml. Any thoughts would be much appreciated!

    Read the article

  • Programming Windows Identity Foundation - ISBN 978-0-7356-2718-5

    - by TATWORTH
    This book introduces a new technology that promises a considerable improvement on the ASP.NET membership system. If you have ever had to write an extranet system, you will be aware of the problems in setting up membership for your site. Windows Identity Foundation promises to be an excellent replacement, so the book Programming Windows Identity Foundation (ISBN 978-0-7356-2718-5, at http://oreilly.com/catalog/9780735627185) is breaking new ground. I recommend this book to all ASP.NET development teams. You should reckon on 3 to 5 man-days to study it and try out the sample programs, and to see if it can replace your bespoke solution. Remember this is version 1 of WIF, so give yourself adequate time to read the book and familiarise yourself with the new software. Some URLs for more information: WIF home page at http://msdn.microsoft.com/en-us/security/aa570351.aspx The Identity Training Kit at http://www.microsoft.com/downloads/en/details.aspx?displaylang=en&FamilyID=c3e315fa-94e2-4028-99cb-904369f177c0 The author's blog at http://www.cloudidentity.net/

    Read the article

  • Best way to redirect in IIS

    - by stephmoreland
    We have a website that has two URLs: one for the US side and another for the Canadian side, which is further broken into Canadian English and Canadian French. For the purposes of my question, I will write them as: www.us_url.com (US) www.canada_url.ca/ca_en/ (Canadian English) www.canada_url.ca/ca_fr/ (Canadian French) To make sure people are on the correct site, what do I do if they go to the US URL with Canadian English content (e.g. www.us_url.com/ca_en/canada.asp)? I want to make sure the URL is the Canadian one (e.g. www.canada_url.ca/ca_en/canada.asp) so it shows up properly in Google Analytics. We're using IIS 7 and classic ASP.
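    No answer appears in the excerpt, but on IIS 7 this is typically done with the URL Rewrite module (a separate install) in web.config, redirecting any /ca_en/ or /ca_fr/ request that arrives on the US host. A hedged sketch using the asker's placeholder hostnames:

        <!-- web.config on www.us_url.com; assumes the IIS URL Rewrite module -->
        <system.webServer>
          <rewrite>
            <rules>
              <rule name="Canadian content to Canadian host" stopProcessing="true">
                <match url="^(ca_en|ca_fr)/.*" />
                <conditions>
                  <add input="{HTTP_HOST}" pattern="^www\.us_url\.com$" />
                </conditions>
                <action type="Redirect" url="http://www.canada_url.ca/{R:0}"
                        redirectType="Permanent" />
              </rule>
            </rules>
          </rewrite>
        </system.webServer>

    The Permanent (301) redirect keeps Google Analytics and search engines attributing the pages to the Canadian host; the query string is appended to the target by default.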

    Read the article

  • Alternatives to OAuth?

    - by sdolgy
    The web industry has shifted (or is shifting) towards using OAuth when extending API services to external consumers and developers. There is some elegance in simplicity, and the 3-step OAuth process isn't too bad; I just find it the best of a bad bunch of options. Are there alternatives out there that could be better and more secure? The security reference is derived from the following URLs: http://www.infoq.com/news/2010/09/oauth2-bad-for-web http://hueniverse.com/2010/09/oauth-2-0-without-signatures-is-bad-for-the-web/

    Read the article

  • Google Analytics - how to track clicks on a screen?

    - by milesmeow
    Can I track the click of every link, button, dropdown select, etc. on a screen and have it recorded in Google Analytics? I want to create a page and collect data on which widgets users use most. What about AJAX interactions? If you're using jQuery or MooTools, can you get the functions to register a fake URL with GA based on user interaction? I used to do this with Flash: every time you click a button, it can initiate a fake URL request. I would make URLs such as ".../customize/eyes/" or ".../customize/nose", etc. Just wondering if I can do that with JavaScript on the page. I've also posted at StackOverflow.
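    For what it's worth, the classic ga.js API has direct equivalents of the Flash trick: _trackPageview with a made-up path, or _trackEvent for structured click data. A hedged JavaScript sketch; it assumes the standard ga.js async snippet is already on the page, the element IDs are hypothetical, and the newer analytics.js (Universal Analytics) uses ga('send', ...) instead:

        // report a virtual pageview, like the Flash fake-URL approach
        document.getElementById('eyes-widget').addEventListener('click', function () {
          _gaq.push(['_trackPageview', '/customize/eyes/']);
        });

        // or a structured event: category, action, label
        document.getElementById('nose-widget').addEventListener('click', function () {
          _gaq.push(['_trackEvent', 'customize', 'click', 'nose']);
        });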

    Read the article

  • AdWords traffic not (properly) reflected in Analytics

    - by CJM
    I have an AdWords account which was set to use auto-tagging of URLs. When looking at the Analytics account for that site, I couldn't find any reference to AdWords traffic, either in the Advertising section or the Traffic Sources section. So I manually constructed the URL tags and updated the campaign ad. Once the ad was approved and the clicks started coming through again, I could see the results in the Traffic Sources section of Analytics: in the Sources > Campaigns section my campaign was listed, and under Sources > All Traffic it was registering the same level of traffic from google/adwords. However, the Advertising > AdWords section is still drawing a blank. Any ideas? Are there explicit steps needed to enable full tracking of AdWords campaigns? If it is relevant, the AdWords campaign was set up with one account and the Analytics tracking with another, but both accounts have full access to both AdWords and Analytics.

    Read the article

  • HTAccess redirect directories to index.html

    - by BFTrick
    Hi there, I am working on a site where I do not have access to the server, and someone else keeps changing the settings. That person just changed the settings to prevent users from going to example.com/foo/ and seeing the index page ("This Virtual Directory does not allow contents to be listed."). If you type in example.com/foo/index.html you can still see the file. So I want to use .htaccess to redirect all URLs that end in a directory to directory/index.html How do I write that? I started with some code that changes .php files to .html files and tried to work from that, but I couldn't quite get it to work: RewriteRule ^(.*)\.php$ /$1.html [R=301,L] Any suggestions?
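    A hedged sketch of the usual pattern, assuming an Apache host that honours .htaccess: either name index.html as the directory index (if the host allows overriding it), or rewrite trailing-slash requests to the file explicitly:

        # Option 1: let Apache serve index.html for directory requests
        DirectoryIndex index.html

        # Option 2: rewrite directory URLs to the index file explicitly
        RewriteEngine on
        RewriteRule ^(.*)/$ /$1/index.html [L]

    Option 2 is an internal rewrite rather than a visible redirect, so the clean /foo/ URL stays in the address bar; since the rewritten URL no longer ends in a slash, the rule does not loop.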

    Read the article
