Search Results

Search found 9717 results on 389 pages for 'pro'.


  • Multiple 301 redirects and massive loss of ranking

    - by DoesNotCompute
    I just remade a website from scratch for a client, and the client asked me to preserve their rankings by adding 301 redirects from the original URLs to the new ones. For instance, http://plumber-directory.my-website.com/john-smith-city-1.php became http://directory.my-website.com/plumber/city/john-smith.html. So I put the website online for a few days, until the 301s had partially kicked in on the Google results. Then the client called me back to tell me that his boss wanted to switch back to the old URLs. So I put a new 301 redirect in place: http://directory.my-website.com/plumber/city/john-smith.html now reverts to http://plumber-directory.my-website.com/john-smith-city-1.php. Because Google had only a few days to assimilate the new URLs, it now has both kinds of URLs in its result pages, and the site's ranking keeps falling every day. I suspect Google is mistaking those redirects for duplicate content. Is there something I can do to avoid a total loss of rankings?
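    For reference, this kind of redirect is usually a one-liner per URL in Apache. A minimal sketch as an .htaccess rule, assuming mod_rewrite is available (it reuses the example URLs from the question; a real site would need one rule, or a RewriteMap, per URL pattern):

        RewriteEngine On
        # send the old flat URL permanently (301) to its new location
        RewriteRule ^john-smith-city-1\.php$ http://directory.my-website.com/plumber/city/john-smith.html [R=301,L]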


  • Determine web page draw time via a program

    - by Kevin Burke
    Google Chrome has a nice tool for determining the time at which the page begins drawing, in the Network tab of Developer Tools. Similarly, sites like webpagetest.org can tell you the draw time and give you the whole waterfall of page loads for a given web page. I was wondering if I could automate finding the time to first page draw for all of the pages on my site, so I can share this data within my company. Obviously the draw time depends on the latency and throughput of your connection, but I'm more concerned with the relative numbers for pages on our site. Can I get this data from Selenium or another tool? Thanks, Kevin
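    WebDriver can read the browser's own Navigation Timing numbers, which makes this scriptable across a whole list of pages. A minimal sketch using the Node selenium-webdriver bindings (the package choice and the URL list are assumptions, and domContentLoadedEventStart is used as a rough stand-in for "first draw"):

        // loads each page in Chrome and prints an approximate time-to-first-render
        const { Builder } = require('selenium-webdriver');

        const pages = ['http://example.com/', 'http://example.com/about']; // hypothetical URLs

        (async () => {
          const driver = await new Builder().forBrowser('chrome').build();
          try {
            for (const url of pages) {
              await driver.get(url);
              // ask the browser for its own timing data (milliseconds)
              const ms = await driver.executeScript(
                'var t = window.performance.timing;' +
                'return t.domContentLoadedEventStart - t.navigationStart;');
              console.log(url + ': ~' + ms + ' ms');
            }
          } finally {
            await driver.quit();
          }
        })();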


  • Should my URLs be lowercase?

    - by Rowan Freeman
    According to this blog ("Understanding SEO Friendly URL Syntax Practices") I should change http://example.com/Hello-Dolly to http://example.com/hello-dolly. The reasons given are that URLs are, in general, case-sensitive, and that lowercasing will simplify any case-sensitive SEO and analytics reports. According to a GIF I found in Wikipedia's article on URL normalization, I should likewise convert my URLs from any uppercase to all lowercase. However, I use ASP.NET MVC4, and by default my URLs are structured in CamelCase: http://www.domain.com/Controller/Action/Parameter, e.g. http://www.greatsite.com/Categories/List/Bicycles. I've skimmed through RFC 1738 but didn't see a definitive answer. Should I go out of my way to force the framework to change everything to lowercase? Why would Microsoft design the framework like this if everybody is telling me to use lowercase?
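    If you do decide to enforce lowercase, you don't have to fight the framework's link generation alone; with the IIS URL Rewrite module installed, a single redirect rule can canonicalize incoming requests. A sketch, assuming that module is available (on .NET 4.5+ you can separately set routes.LowercaseUrls = true so MVC emits lowercase links in the first place):

        <system.webServer>
          <rewrite>
            <rules>
              <!-- 301 any path containing an uppercase letter to its lowercase form -->
              <rule name="LowercaseUrls" stopProcessing="true">
                <match url="[A-Z]" ignoreCase="false" />
                <action type="Redirect" url="{ToLower:{URL}}" redirectType="Permanent" />
              </rule>
            </rules>
          </rewrite>
        </system.webServer>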


  • Different behaviour with windows authentication on IIS7 websites

    - by amaters
    I need to run a website with just Windows authentication. Given the following situation: the location of the default website is c:\inetpub\wwwroot; the location of my code is c:\Sites\WebApp; my hosts file is edited so any .local domain I use points to 127.0.0.1. I have created a new application called 'AppX' underneath the default website and pointed it to c:\Sites\WebApp; it uses the DefaultAppPool. When I switch off anonymous authentication and switch on Windows authentication, all works well when I go to localhost/AppX/. What I really want is a new website (no need to question why). So I created Website2 and created the application in exactly the same way. Everything is the same: destination, app pool and authentication. Yet when I browse to this website at web2.local/AppX/ I get a 401.2 - Unauthorized error. What am I missing here?
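    For comparison, this is what the working setup amounts to when expressed in the application's web.config; a sketch, assuming the authentication sections are unlocked for delegation in applicationHost.config (locked sections are themselves a common reason the same settings behave differently on a new site):

        <system.webServer>
          <security>
            <authentication>
              <!-- Windows auth only: no anonymous access -->
              <anonymousAuthentication enabled="false" />
              <windowsAuthentication enabled="true" />
            </authentication>
          </security>
        </system.webServer>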


  • Does hiding images on 404 error affect SEO?

    - by Question Overflow
    I have a dynamic website that allows registered users to upload and display images on their profile pages. As each user may upload fewer than the maximum limit of 20 images, there can be some "empty" image slots on the page. I am using JavaScript to hide these empty images. Loading a profile page therefore generates a series of 404 errors, depending on the number of empty images. Would these 404 errors affect the SEO of the page and of the website?
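    For context, this is roughly what the hiding looks like per image; a sketch in which the path and class name are assumptions. Attaching the handler inline guarantees it is registered before the browser requests the file:

        <img src="/uploads/user42/photo_17.jpg" class="profile-img"
             onerror="this.style.display='none'">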


  • Pins on Pinterest link to 404 - yet link is valid

    - by Damian Rees
    This is a really strange one. I've set up a bunch of pins on Pinterest linking through to our services, and they all work fine. Then I did the same for our blog articles (we use WordPress), yet every time I click the link (and I've tried this on different computers) it goes to a 404 page on our site. However, the link is valid, and if you right-click the pin and open it in a new window it opens fine. I have contacted Pinterest, who are next to useless. I have also tried different browsers, different computers and different Pinterest accounts. I can't see any weirdness in my .htaccess files causing this, so I'm a little stumped. Any suggestions?


  • 'Development dashboard' web application

    - by espais
    Hi all, I am not sure if something like this exists ready out of the box. I currently have some web space that I use for various projects, and I would like to set up an area for some friends and me to develop web applications together. My ideal setup would be a subdomain, say, webdev.domain.com. We could all go to this domain, log in, and then be able to set up new applications, pick which language will be used, set up database tables, allow HTML-based file uploading, and create sub-folders, to basically have a test bed for the applications. In retrospect, it seems like I'm describing a limited version of cPanel. I could come up with something in Drupal, I'm sure, but I don't want to spend much time on configuration. Like I said, I want to install it and have minimal configuration. Does something like this exist (preferably open-source)?


  • How should I update my name server after I installed a new dedicated server?

    - by Jim Thio
    Say I got a dedicated server. The IP is 123.123.123.123, and the domain name domainname.com will be the "main" domain name for that server. Should I (a) set the name servers of domainname.com to ns1.domainname.com and ns2.domainname.com, and register child name servers ns1.domainname.com and ns2.domainname.com pointing at that exact IP? Or should I (b) point the domain at my registrar's name servers and set an A record for the name server host pointing at my IP? Which one is right? Obviously I want ns1.domainname.com and ns2.domainname.com to point at my IP so I can then point hundreds of domains at it, but how exactly should I do that? For what it's worth, I simply use cPanel; the box runs CentOS with cPanel.
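    Whichever way it is entered, the goal is the records below; a sketch in zone-file notation using the name and IP from the question (at the registrar, the two A records are exactly what registering "child name servers", also called glue records, creates):

        domainname.com.      IN  NS  ns1.domainname.com.
        domainname.com.      IN  NS  ns2.domainname.com.
        ns1.domainname.com.  IN  A   123.123.123.123
        ns2.domainname.com.  IN  A   123.123.123.123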


  • Thousands of 404 errors in Google Webmaster Tools

    - by atticae
    Because of a former error in our ASP.NET application, created by my predecessor and undiscovered for a long time, thousands of wrong URLs were created dynamically. Normal users did not notice it, but Google followed these links and crawled itself through the incorrect URLs, creating more and more wrong links. To make it clearer: the URL example.com/folder should have created the link example.com/folder/subfolder, but created example.com/subfolder instead. Because of bad URL rewriting, this was accepted and, by default, showed the index page for any unknown URL, creating more and more links like example.com/subfolder/subfolder/.... The problem is resolved by now, but I still have thousands of 404 errors listed in Google Webmaster Tools, some discovered 1 or 2 years ago, and more keep coming up. Unfortunately the links do not follow a common pattern that I could deny for crawling in robots.txt. Is there anything I can do to stop Google from trying those very old links and remove the already-listed 404s from Webmaster Tools?
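    One lever that does not depend on robots.txt patterns is the HTTP status itself: answering those URLs with 410 Gone rather than 404 signals that the pages are intentionally dead, and Google tends to drop 410s faster. A sketch as an IIS URL Rewrite rule (it assumes at least a partial prefix can be matched, which may not hold here; the prefix shown is hypothetical, and the rule sits in the usual <system.webServer>/<rewrite>/<rules> section):

        <rule name="RetireOldUrls" stopProcessing="true">
          <match url="^subfolder/" />
          <action type="CustomResponse" statusCode="410"
                  statusReason="Gone" statusDescription="This URL was removed" />
        </rule>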


  • Apache returns 304, I want it to ignore anything from client and send the page

    - by Ayman
    I am using Apache HTTPD 2.2 on Windows. mod_expires is commented out, gzip is on, and most other settings are unchanged from the defaults. I made some changes to my .js files; my client gets a 304 response for one of the .js files and never gets the rest. How can I force Apache to, in effect, flush everything and send all the new files to the client?

    The main HTML file includes these scripts in its head section:

        <script src="js/jquery-1.7.1.min.js" type="text/javascript"></script>
        <script src="js/jquery-ui-1.8.17.custom.min.js" type="text/javascript"></script>
        <script src="js/trex.utils.js" type="text/javascript" charset="utf-8"></script>
        <script src="js/trex.core.js" type="text/javascript" charset="utf-8"></script>
        <script src="js/trex.codes.js" type="text/javascript" charset="utf-8"></script>
        <script src="js/trex.emv.js" type="text/javascript" charset="utf-8"></script>
        <script src="js/trex.b24xtokens.js" type="text/javascript" charset="utf-8"></script>
        <script src="js/trex.iso.js" type="text/javascript" charset="utf-8"></script>
        <script src="js/trex.span2.js" type="text/javascript" charset="utf-8"></script>
        <script src="js/trex.amex.js" type="text/javascript" charset="utf-8"></script>
        <script src="js/trex.abi.js" type="text/javascript" charset="utf-8"></script>
        <script src="js/trex.barclays.js" type="text/javascript" charset="utf-8"></script>
        <script src="js/trex.bnet.js" type="text/javascript" charset="utf-8"></script>
        <script src="js/trex.visa.js" type="text/javascript" charset="utf-8"></script>
        <script src="js/trex.atm.js" type="text/javascript" charset="utf-8"></script>
        <script src="js/trex.apacs.js" type="text/javascript" charset="utf-8"></script>
        <script src="js/trex.pstm.js" type="text/javascript" charset="utf-8"></script>
        <script src="js/trex.stm.js" type="text/javascript" charset="utf-8"></script>
        <script src="js/trex.thales.js" type="text/javascript" charset="utf-8"></script>
        <script src="js/trex.fps-saf.js" type="text/javascript" charset="utf-8"></script>
        <script src="js/trex.fps-iso.js" type="text/javascript" charset="utf-8"></script>
        <script src="js/trex.app.js" type="text/javascript" charset="utf-8"></script>

    The Apache access log has the following:

        [07/Jul/2013:16:50:40 +0300] "GET /trex/index.html HTTP/1.1" 200 2033 "-"
        [07/Jul/2013:16:50:40 +0300] "GET /trex/js/trex.fps-iso.js HTTP/1.1" 304
        [08/Jul/2013:07:54:35 +0300] "GET /trex/index.html HTTP/1.1" 304 - "-"
        [08/Jul/2013:07:54:35 +0300] "GET /trex/js/trex.iso.js HTTP/1.1" 200 12417
        [08/Jul/2013:07:54:35 +0300] "GET /trex/js/trex.amex.js HTTP/1.1" 200 6683
        [08/Jul/2013:07:54:35 +0300] "GET /trex/js/trex.fps-saf.js HTTP/1.1" 200 2925
        [08/Jul/2013:07:54:35 +0300] "GET /trex/js/trex.fps-iso.js HTTP/1.1" 304

    Chrome's request details are as follows. This file is OK (latest):

        Request URL: http://localhost/trex/js/trex.iso.js
        Request Method: GET
        Status Code: 200 OK (from cache)

    This file is OK (latest):

        Request URL: http://localhost/trex/js/trex.amex.js
        Request Method: GET
        Status Code: 200 OK (from cache)

    This one is also OK:

        Request URL: http://localhost/trex/js/trex.fps-iso.js
        Request Method: GET
        Status Code: 200 OK (from cache)

    The rest of the scripts all show 200 OK (from cache).
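    One way to take server-side caching out of the picture while testing is to make the scripts always revalidate. A minimal sketch, assuming mod_headers is loaded (note that a stale 304 can equally come from the browser's own cache, so a hard refresh is worth trying too):

        <FilesMatch "\.js$">
            Header set Cache-Control "no-cache, must-revalidate"
        </FilesMatch>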


  • Why does max-width behave counter intuitively on columns in a table? [migrated]

    - by Nate
    Basically, I have a stretchy table: I want my label column to be fixed width and my data column to be dynamically sized. My inclination was to set max-width via CSS on the label column; however, this has the opposite effect. I've created a jsfiddle that replicates this (resize the window to see the left column sized dynamically and the right column staying fixed). On my own site I see the same behavior, and it happens in both IE and Chrome. If I switch it around and set max-width on the data column, everything behaves as I want, but that feels backwards to me. Am I doing something wrong here?
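    It is not just intuition failing here: CSS 2.1 explicitly leaves the effect of max-width on table cells and columns undefined, so browsers fall back on their own table-sizing rules. The usual workaround is to state the widths directly; a sketch, with hypothetical selectors and sizes:

        /* fixed label column, data column takes whatever is left */
        table    { width: 100%; }
        td.label { width: 10em; }
        td.data  { width: auto; }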


  • Spam prevention through IP tracking

    - by whamsicore
    I am building a website with user-generated comments. To implement user moderation and spam protection, users have the ability to mark comments as spam. When one comment is marked as spam, I want all comments from the same IP address to be deleted. I am not familiar with spam prevention in general, other than CAPTCHA. Questions: is this a feasible/good system for spam prevention? Are there better ways, or improvements I can make? Thanks.
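    Mechanically, the deletion itself is simple; a sketch in MySQL syntax with hypothetical table and column names (the self-join sidesteps MySQL's restriction on deleting from a table that the same statement selects from):

        -- remove every comment posted from the same address as the flagged one
        DELETE c
          FROM comments AS c
          JOIN comments AS flagged
            ON c.ip_address = flagged.ip_address
         WHERE flagged.id = 123;  -- id of the comment marked as spam

    Whether to delete outright is the riskier question: many legitimate users can share one address behind NAT or a proxy, so queueing the matches for review is often safer than automatic deletion.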


  • How do I block a user-agent from Apache

    - by rubo77
    How do I implement a User-Agent block by regular expression in the config files of my Apache web server? For example: I would like to block from Apache on my Debian server all bots whose User-Agent matches the regular expression /\b\w+[Bb]ot\b/ or /Spider/. Those bots should not be able to see any page on my server, and they should appear in neither the access logs nor the error logs. http://global-security.blogspot.de/2009/06/how-to-block-robots-before-they-hit.html suggests using mod_security for that, but isn't there a simple directive for httpd.conf?
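    There is a reasonably simple directive-level approach using mod_setenvif, which is usually enabled by default. A sketch for httpd.conf in Apache 2.2 syntax, matching the patterns from the question (Apache's SetEnvIf regexes are PCRE, so \b and \w work); this blocks the requests and keeps them out of the access log, though the error log has no equivalent per-request filter:

        SetEnvIf User-Agent "\b\w+[Bb]ot\b" bad_bot
        SetEnvIf User-Agent "Spider"        bad_bot
        <Directory "/var/www">
            Order Allow,Deny
            Allow from all
            Deny from env=bad_bot
        </Directory>
        # log only requests that are not from the blocked agents
        CustomLog /var/log/apache2/access.log combined env=!bad_bot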


  • Remove URLs to unindex blog content from Google, then copy blog content to new blog [closed]

    - by sam
    Possible Duplicate: migrating PR / rankings from one site to another

    I've been writing a blog for the past year or so, with about 300 published articles; the blog has been running under the subdomain blog.mysite.com. We are now looking to change the URL of our site, so the blog is going to have to be ported over to a subdomain on the new site. We would really like to keep the backlog/archive of all the articles we have written, but we don't want to be penalized for duplicate content. Could we just remove/unindex the URLs from Google in Webmaster Tools, then export the blog and import it into the new one? Would Google still see this as duplicate content, or, because I've removed the URLs, would they no longer hold a copy of it? Thanks.


  • Problem with JavaScript [migrated]

    - by Andres Orozco
    Well, I have a problem, and I made this test code to show it. HTML code: http://i.stack.imgur.com/7qlZx.png JavaScript code: http://i.stack.imgur.com/DYvuq.png What I want to do in this example is change the styles of the element with the given id, but for some reason it doesn't work. The bad news is that I have trouble spotting problems in my own code, so I don't know what's happening, and I really need your help. Thanks, Andres.
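    For reference, a minimal sketch of the standard pattern being described (the element id is an assumption, since the original code is only available as screenshots):

        // change an element's style once the element exists in the DOM
        document.getElementById('box').style.backgroundColor = 'red';

    A common cause of "it doesn't work" with this pattern is running the script before the element has been parsed; moving the script below the element, or into an onload handler, rules that out.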


  • Do premium domain names help us with other languages too?

    - by Fabio Milheiro
    It's commonly known that premium domains with one or two relevant keywords may help us improve our rankings in SERPs. But would it be possible that an English premium domain, for example gold.com (no, it's not mine), also helps drive more non-English traffic to the site's non-English pages? Trying to make my question clear: let's suppose I have an English premium domain with a page like this:

        gold dot com/post/123/gold-is-yellow

    and decide to have Spanish, Portuguese or French versions of the site with pages like:

        gold dot com/es/post/123/el-oro-es-amarillo
        gold dot com/pt/post/123/o-ouro-e-amarelo
        gold dot com/fr/post/123/fsdfsdfsdf

    Given that my English domain is a premium one and highly relevant for English terms, will it also help me achieve good rankings for non-English search terms like oro (Spanish) or ouro (Portuguese)?


  • Blogging platform for shared hosting?

    - by Ankit
    First of all, this is not for flame wars, so please don't bash any particular blogging platform :P I have a shared hosting account and want to set up a blog on it. The thing is, as it is shared hosting, I do not have as much bandwidth as on more expensive hosting. I have tried using WordPress with caching plugins configured, and it still doesn't seem to speed things up much, yet if I create a simple PHP website the speed is fine. Can someone suggest a lightweight platform?


  • Is traditional JavaScript image pre-loading taboo?

    - by Evan Plaice
    I remember the good old days (not really), back when I was still sucking the teat of Dreamweaver to build websites, and the lure of playing copy-pasta with fancy built-in scripts (e.g., image swaps) was like black magic. I'm pretty far removed from that nowadays, but I was adapting a small site from its original FrontPage (::cringe::) format to a standard HTML/CSS implementation and couldn't help wondering: should I re-implement the JavaScript image pre-loading in the current version? Or is there a better way? I don't want to block the page from loading by making the user request all the assets within the page up front, as the traditional JavaScript pre-loader method does. I value giving the user something to look at ASAP, and there's some potential harm to my Google mojo in doing otherwise. Is there a cleaner way to prevent unnecessary page reflows during loading, such as setting static width/height dimensions through a CSS style attribute on the image element?
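    For the last idea, a sketch of what reserving the image's box up front looks like (dimensions and filename are hypothetical); with the size fixed, the browser lays the page out once and fills the image in whenever it arrives, so nothing reflows:

        <img src="photo.jpg" alt="header photo" style="width: 320px; height: 240px;">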


  • IIS MIME type for XML content

    - by Rodolfo
    Recently, a third-party plugin I'm using to display online magazines stopped working on mobile devices. According to their help page, this happens for people serving with IIS, and their solution is to set the MIME type for .xml to "application/xml"; IIS sets it to "text/xml" by default. Changing it does work, but would that have unintended side effects, or is "application/xml" actually the correct type and IIS just has it set wrong?
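    For reference, the change can be scoped to just your site rather than the server-wide defaults; a sketch of the relevant web.config fragment (the <remove> line avoids the duplicate-entry error IIS raises if .xml is already mapped at a higher level):

        <system.webServer>
          <staticContent>
            <remove fileExtension=".xml" />
            <mimeMap fileExtension=".xml" mimeType="application/xml" />
          </staticContent>
        </system.webServer>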


  • setting up mod_proxy - plesk, apache, .htaccess?

    - by sam
    I want to set up mod_proxy so that my blog runs under a subdirectory of my site rather than a subdomain, so I get the SEO backlink benefit. What I'm looking to do is take my Tumblr blog, which is running at blog.mysite.com (in turn mapped from myblog.tumblr.com), and have it run at mysite.com/blog. How can I set up mod_proxy to do this? Is it something I can set up from inside my .htaccess file? My site is hosted on an Apache server, using Plesk as a control panel. I contacted my web host and they told me mod_rewrite could achieve it; they gave me the following, but said they won't provide further support for mod_rewrite as it's something they don't support:

        <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^example.co.uk/blog$
        RewriteCond %{REQUEST_URI} !/standard
        RewriteRule ^(.*)$ http://example.tumblr.com$1 [R]
        </IfModule>

    Ideally I'd like to use the mod_proxy method, as it's the one recommended from an SEO point of view in this article: http://www.seomoz.org/blog/what-is-a-reverse-proxy-and-how-can-it-help-my-seo
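    For illustration, the heart of the mod_proxy approach is only a few directives. A minimal sketch, assuming mod_proxy and mod_proxy_http are loaded and that you can edit the virtual host (ProxyPass is not permitted in .htaccess files, which is often the sticking point on shared Plesk hosting):

        ProxyRequests Off
        ProxyPass        /blog/ http://myblog.tumblr.com/
        ProxyPassReverse /blog/ http://myblog.tumblr.com/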


  • Issue with Godaddy DNS manager

    - by Fischer
    I'm using domains.live.com to set up email on a domain registered with GoDaddy. The domains.live.com configuration page says to add this value, but GoDaddy's DNS manager isn't accepting the string: v=spf1 include:hotmail.com ~all. It gives an error; something is wrong, either with the string or with the DNS manager, and I would like to know how to fix it. Notes: the "more information" link is dead, GoDaddy no longer gives support by email, and there is no Microsoft support.
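    For reference, the record being added is an SPF policy stored in a plain TXT record; in zone-file form it is just the sketch below. GoDaddy's manager takes the host and the value in separate fields (host @ for the bare domain), so one thing worth checking is that only the bare string, without quotes or a host name, goes into the value field:

        @   IN   TXT   "v=spf1 include:hotmail.com ~all"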


  • Rackspace Cloud Servers in Europe?

    - by mit
    We have set up a cloud virtual server at Rackspace in the US, but we use it from Europe, and I'm not quite happy with the response time. Of course I knew there would be some latency, but I am not sure whether it comes from the overseas round trip (ping is 120 ms) or also from the minimal resources. It is the smallest machine: 256 MB RAM, 10 GB disk, running MediaWiki on Ubuntu 10.04 64-bit, and the instance lives in the Rackspace ORD1 datacenter. As soon as they have opened their new facility in the UK we plan on moving the instance there, and we are already planning more machines; the pricing is quite attractive. I don't really want to do measuring and benchmarking, so I am just asking for opinions. It would be nice to hear what you can tell from your experience, especially from someone who uses such small instances in the US. And what can we really expect if we upgrade to more resources?


  • dynamic urls and links on one web page

    - by John
    I am trying to figure out how to create dynamic links and URLs on a static web page. What I want to do is the following: I have a single web page, for example MYWEBPAGEdotCOM/INDEX.HTML, that will always look the same except for one affiliate link on the page:

        LINK TO AFFILIATE: affiliatedotCOM/my-affiliate_code_here_DYNAMIC_REFERER

    The only thing that would change is the DYNAMIC_REFERER, depending on which dynamic URL was used to reach the page:

        MYWEBPAGEdotCOM/INDEX.PHP?id=test1
        MYWEBPAGEdotCOM/INDEX.PHP?id=test2
        MYWEBPAGEdotCOM/INDEX.PHP?id=test3
        MYWEBPAGEdotCOM/INDEX.PHP?id=test4

    which would change only the dynamic link on the page to:

        affiliatedotCOM/my-affiliate_code_here_test1
        affiliatedotCOM/my-affiliate_code_here_test2
        affiliatedotCOM/my-affiliate_code_here_test3
        affiliatedotCOM/my-affiliate_code_here_test4

    Can someone tell me how I could go about doing this? I just don't want to have to make hundreds of pages.
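    One way to do this without generating hundreds of pages is client-side; a minimal sketch, assuming the affiliate anchor is given an id and the page is reached with a ?id=... query string (all names and URLs here are hypothetical):

        // read ?id=... from the current URL and build the affiliate link from it
        var id = new URLSearchParams(window.location.search).get('id') || 'default';
        document.getElementById('affiliate-link').href =
            'http://affiliate.example.com/my-affiliate_code_here_' + id;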


  • How do you find all the links to disavow for a Google reconsideration request? [duplicate]

    - by QF_Developer
    This question already has an answer here: "How to identify spammy domains giving backlinks to my site (to submit in disavow links in WMT)" (2 answers).

    A few months ago I received the following notification in Google Webmaster Tools for a website I look after: "Unnatural links to your site—impacts links. Google has detected a pattern of unnatural artificial, deceptive, or manipulative links pointing to pages on this site. Some links may be outside of the webmaster's control, so for this incident we are taking targeted action on the unnatural links instead of on the site's ranking as a whole. Learn more." The question here is: should we actively attempt to disavow these links, given that the action is seemingly targeted at just a bunch of keywords? I've downloaded the inbound-links sample from Google Webmaster Tools, and so far I've been through the disavow and reconsideration-request process six times, each round taking 2-3 weeks only to be supplied with just two more links that Google doesn't approve of. At this rate it will take the rest of my natural life to clean up all these spammy links! Disavowing seems futile, as Google hasn't taken broad action against the website as a whole and (from what I can gather) has already nullified the value of the offending links. Under the quoted statement above, however, is a reconsideration-request button that seems to imply I should be actively doing something here. UPDATE, 14th October: I have since created a small .NET application into which you can feed the CSV links sample from Google Webmaster Tools. The tool crawls all the links and looks for specific linking patterns, per some configurable match strings. I realised that many of the links Google is taking issue with were created by a rogue SEO firm we hired several years ago; all of their links are appended with one of five different descriptions. My application uses some regexes to isolate any link sources with these matching appendages and automatically builds the disavow .txt file. In the end it had to come down to an algorithm, as manually disavowing links on this scale would take weeks! I will post the app here once I've cleaned it up.
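    For anyone facing the same cleanup, the file the disavow tool accepts is plain text with one entry per line; a sketch with hypothetical entries (the domain: form disavows every link from a host, which scales much better than listing individual URLs):

        # links created by the rogue SEO firm, identified by their descriptions
        domain:spammy-directory.example
        domain:article-network.example
        http://one-off-page.example/widget-links.html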


  • Link a domain to a Squarespace account

    - by TheMightyLlama
    I have set up a Squarespace account and now need to link a domain I purchased elsewhere. What settings do I need to change at the domain host? This is what I've got so far; however, the changes don't seem to have propagated, and my website still shows a default page. Squarespace provides the settings to use for GoDaddy, but I'm on A2 Hosting and confused about how to map those instructions across.

