Search Results

Search found 9717 results on 389 pages for 'pro'.


  • How to move a website and domain name without experiencing downtime for emails or site?

    - by user4842
    Okay, I have a pretty complex problem, so I'll get right to it. I'm a designer who built a new website for my client. Their old site is hosted at GoDaddy, as is their email. The problem is that the guy who built the original site put the original domain name and hosting under HIS personal GoDaddy account, which turned out to be a bad move for several reasons.

    Here's how it's all tied together. The original domain name, www.domainoriginal.com, was actually purchased at Network Solutions. The original web designer pointed the nameservers from Network Solutions to his GoDaddy account, where the email and hosting are set up. The new domain name, www.domainnew.com, was purchased under a new and separate GoDaddy account belonging to the company, and the new website was built on a third-party platform (Big Commerce). So www.domainnew.com is already pointed to the new website using A records at the new GoDaddy account. All is fine there.

    However, they still need www.domainoriginal.com to point to the NEW website as well (the old site can simply be deleted; it is NOT important). AND they want to keep their old email addresses intact and working, but under the NEW GoDaddy account. Obviously, I have no DNS control at Network Solutions, and I have no idea what kind of control I have at GoDaddy under the old account, because the web designer will not let me see inside his account. He and GoDaddy both tell me nothing can be done other than to repoint the nameservers back to Network Solutions, then repoint the A record to my new website, www.domainnew.com, and point the MX records to GoDaddy. I'm told the downtime would be 24-48 hours if I do this.

    Ideally, we'd like to do a domain name transfer and get www.domainoriginal.com into the new GoDaddy account created by the company, but I'm told this could take up to 7 days. Does this mean the site and email will be down for 7 days? Would any emails sent during this time be lost forever? If I do this, how long could I expect the site and email to be down, and will the emails be permanently lost? I've gotten different answers from everybody at GoDaddy, so I kind of don't trust them anymore... Any help would be greatly appreciated. Thanks, Tyson
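
    For reference, the end state being described for domainoriginal.com, once DNS control is regained, comes down to two kinds of records in its zone. A rough sketch using placeholder values (the documentation IP 203.0.113.10 and a made-up mail hostname, not real GoDaddy values):

        ; Hypothetical zone snippet for domainoriginal.com
        ; Web traffic goes to the server hosting the new site
        @     IN  A     203.0.113.10
        www   IN  A     203.0.113.10
        ; Mail keeps flowing to the existing GoDaddy mail service
        @     IN  MX 10 mail.placeholder-godaddy-host.com.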

    Read the article

  • Is this a link scheme? If so, what should I do? What problems can I face?

    - by guisasso
    I was asked to remodel a website, and decided to check its rank on Alexa. Surprisingly, there are many, many different websites linking to it, none of them relevant. One particular thing about them is that none of these URLs work, and they all display the exact same error when accessed, which to me is a very good indication that this is some sort of linking scheme. (Besides the somewhat obvious names, it even says "scheme" in one of the URLs!?) If so, how should I proceed with this website? What can I do if this is in fact a scheme, how can it hurt the website, what kinds of problems can I face, and what can I do about it?

        addurlnow . info
        dirlist15.addurlnow . info/Business___Economy/Services/page-12.html
        linkdirectory101 . info
        dirlist16.linkdirectory101 . info/Business___Economy/Services/page-15.html
        seonetblog . info
        dirlist52.seonetblog . info/Business___Economy/Affiliate_Schemes
        addurls . us
        dirlist21.addurls . us/Business___Economy/Services/page-10.html
        webdirectoriessite . info
        dirlist20.webdirectoriessite . info/Business___Economy/Services/page-6.html
        addurlstore . info
        dirlist10.addurlstore . info/business___economy/services/page-14.html
        ukwebdirectorys . info
        dirlist21.ukwebdirectorys . info/Business___Economy/Services/page-13.html

    Read the article

  • E-commerce for custom orders/customer image upload

    - by ansarob
    We have a client that needs an e-commerce site set up pretty quickly. As I have no experience with e-commerce, I am looking for some guidance. Basically, the two big features we need are: Ability for customer to add info about order (example: the name the customer wants to be put on the customizable product they ordered) Ability for customer to upload photo of product to be customized I hope this makes sense. Right now I am really looking into Shopify, but I can't tell if it does everything we need. I know you can add order notes when checking out, but not sure about image upload (maybe it can be added as an app through the API?).
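
    Whatever platform ends up being chosen, the two requirements boil down to a product form that collects a free-text personalization field and an uploaded image. A generic HTML sketch (field names and the form action are made up, and this is not Shopify-specific markup):

        <!-- Hypothetical product-customization form -->
        <form action="/cart/add" method="post" enctype="multipart/form-data">
          <label for="custom-name">Name to put on the product:</label>
          <input type="text" id="custom-name" name="customization_text" maxlength="40">

          <label for="custom-photo">Photo to be customized:</label>
          <input type="file" id="custom-photo" name="customization_photo" accept="image/*">

          <button type="submit">Add to cart</button>
        </form>

    Note the multipart/form-data encoding, which is what makes the file upload possible; the rest is a question of whether the chosen platform (or an app on top of it) will accept and store those extra fields with the order.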

    Read the article

  • Creating a Template Like System in cPanel

    - by clifgray
    I am creating a medium-sized website using cPanel and its File Manager, and the majority of my pages are going to be the same apart from a different title and content section. I wanted to see if there is a system for making one general template file and having all the other pages inherit from it, so each page only has to supply a title and a content section, and the rest of the links, headers, and whatnot can be changed throughout the site by editing one file. Is there anything like this? I have used Jinja2 in Python and a few other systems for other server-side scripting languages, but I am not sure how to implement something similar with cPanel.
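
    For what it's worth, since a cPanel shared account will almost always run PHP, the "one template file" idea can be sketched with plain PHP includes (the file names below are hypothetical; this is not something cPanel itself provides):

        <?php
        // template.php -- hypothetical shared layout: edit navigation/footer here once
        function render_page($title, $content_file) {
        ?>
        <!DOCTYPE html>
        <html>
        <head><title><?php echo htmlspecialchars($title); ?></title></head>
        <body>
          <ul class="nav"><li><a href="/">Home</a></li><li><a href="/about.php">About</a></li></ul>
          <h1><?php echo htmlspecialchars($title); ?></h1>
          <?php include $content_file; // page-specific content section ?>
          <footer>&copy; Example Site</footer>
        </body>
        </html>
        <?php
        }

    Each individual page then only supplies its title and content:

        <?php
        // about.php -- hypothetical page using the shared template
        require 'template.php';
        render_page('About Us', 'content/about-content.php');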

    Read the article

  • Is it possible to transfer a domain without a "gap" in Whois privacy protection?

    - by Guest
    I currently own several domains on which I am using a Whois privacy protection service to hide my personal details. In the near future, I would like to transfer some of these domains to a different registrar. It has been many years since I last performed domain transfers, so I am no longer knowledgeable about what it involves. However, I have read from several registrars that they ask their customers to disable Whois protection before effecting a domain transfer. Since there are several websites out there that publish archived versions of Whois information (and ask handsome money for the information to be hidden, of course), I would prefer to avoid having such a "gap" in my privacy protection. I figured that these websites would fetch Whois information mainly when a query is effected through their own website. However, I have found out that at least one of these sites had a copy of the Whois information for a new domain up on their site within hours after I registered it, so they must have some other source (of course I used a Google search to find that out, not their own site). What that tells me is that the time it takes for the domain transfers to go through would be more than enough for these rogue websites to cache my information. If my new registrar offers privacy protection for domains right from the point of registration as well, is there no way to transfer the domain between the two without reverting to my default Whois information in between?

    Read the article

  • When Googlebot sees a link, will it click it or navigate to it?

    - by FakeRainBrigand
    My site uses pushState and JSON data to display content. So, for example, this might appear on my page:

        <a href="/some/page">some page</a>

    The JavaScript then prevents the default action (following the link), and instead renders a view (using a different API, such as /getjson?some_page):

        $('[href]').click(function(){ history.pushState(...); handleURL(...); });

    Assume my server will respond to requests at /some/page with a pre-rendered version. My questions are:

    - Will Googlebot receive the prerendered version, or allow JavaScript to instead invoke pushState, etc.?
    - If it doesn't make the direct request, will it wait for AJAX content to be loaded?
    - Does Googlebot implement pushState, so it will show the proper URL in search results?
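
    To make the described pattern concrete, a fleshed-out version of that click handler typically looks like the sketch below (the endpoint and function names are placeholders, not the poster's actual code):

        // Hypothetical sketch of the pushState navigation pattern described above
        $('a[href]').on('click', function (e) {
          e.preventDefault();                        // stop the normal page load
          var path = $(this).attr('href');
          history.pushState({path: path}, '', path); // update the address bar
          $.getJSON('/getjson?page=' + encodeURIComponent(path), function (data) {
            renderView(data);                        // placeholder client-side renderer
          });
        });

        // Handle back/forward buttons as well
        window.addEventListener('popstate', function (e) {
          if (e.state && e.state.path) {
            $.getJSON('/getjson?page=' + encodeURIComponent(e.state.path), renderView);
          }
        });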

    Read the article

  • How to check that I have recovered from Penguin 2.0?

    - by Simon Walker
    I have a 3-year-old website which was hit by Penguin 2.0 in May, and its traffic dropped almost 30%. I have been working hard on the website for the last 2.5 months, and its traffic recovered in the last week of August. In fact, I am receiving more traffic than ever. When I look at the stats, I find my website's search engine visibility has improved: it is now appearing for more search queries, and my website's impressions have also increased. What worries me is that my website is nowhere in the top 5 pages for the keywords with high competition and the highest search volume. They are few in number but important. Should I consider my current situation a recovery, or is it just a partial recovery? If it is only partial, then how come traffic is higher than it was before Penguin 2.0?

    Read the article

  • Removing 301 redirect from site root

    - by Jon Clements
    I'm having a look at a friend's website (a fairly old PHP-based one) which they've been advised needs restructuring. The key points are:

    - URLs should be lower case and more "friendly".
    - The root of the domain should not be redirected.

    The first point I'm happy with (the URLs needed tidying up anyway) and I have a draft plan of action; however, the second is baffling me as to not only the best way to do it, but also whether it should be done at all. Currently http://www.example.com/ is redirected to http://www.example.com/some-link-with-keywords/ using the following index.php in the root of the Apache2 instance:

        <?php
        $nextpage = "some-link-with-keywords/";
        header( "HTTP/1.1 301 Moved Permanently" );
        header( "Status: 301 Moved Permanently" );
        header("Location: $nextpage");
        exit(0); // This is Optional but suggested, to avoid any accidental output
        ?>

    As far as I'm aware, this has been the case for around three years -- and I'm sorely tempted to advise not to worry about it. It would appear taking off the 301 could:

    - Potentially affect page ranking (as the 'homepage' would disappear - although it couldn't disappear because of the next point...)
    - Introduce maintenance issues, as existing users would still have the redirected page in their cache
    - Following the above, introduce duplicate content
    - Confuse Google/other SEs as to what the homepage actually is now

    I may be over-analysing this, but I have a feeling it's not as simple as removing the 301 from the root and 301'ing the previous target to the root... Any suggestions (including "it's not worth it") are sincerely appreciated.
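
    If the reversal were done, the mechanical part is small: the root index.php stops redirecting and serves the homepage directly, while the old keyword URL gets its own permanent redirect back to the root. A sketch under that assumption, mirroring the PHP style above (the strategic question of whether to do it at all is separate):

        <?php
        // some-link-with-keywords/index.php -- hypothetical reverse redirect
        header( "HTTP/1.1 301 Moved Permanently" );
        header( "Location: http://www.example.com/" );
        exit(0);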

    Read the article

  • Forward .html/.htm to .php with .config

    - by PhilipK
    I'm moving a site from my Linux-hosted server to a client's Windows-hosted server. The .htaccess file no longer works, and I'm told that Windows servers use a web.config file instead. How can I forward all users accessing .html and .htm files to the equivalent .php file?

    Server info:

    - OS/Hosting Type: Windows / Shared Hosting
    - .Net Runtime Version: ASP.Net 2.0/3.0/3.5
    - PHP Version: PHP 5.2
    - IIS Version: IIS 7.0
    - Data Center: US Regional

    EDIT: Hosting is provided by GoDaddy. I was told by a friend that the following should work, but it has no effect on the site:

        <configuration>
          <system.webServer>
            <handlers>
              <add name="PHP-FastCGI" verb="*" path="*.html" modules="FastCgiModule"
                   scriptProcessor="c:\php\php-cgi.exe" resourceType="Either" />
            </handlers>
          </system.webServer>
        </configuration>
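
    As an aside, the handler mapping above only tells IIS to run *.html files through PHP; an actual forward from .html/.htm to the equivalent .php URL is normally done with the IIS URL Rewrite module, assuming the host has it installed (an assumption, not a given on shared plans). A sketch of such a rule:

        <configuration>
          <system.webServer>
            <rewrite>
              <rules>
                <!-- Hypothetical rule: permanently redirect foo.html / foo.htm to foo.php -->
                <rule name="HtmlToPhp" stopProcessing="true">
                  <match url="^(.*)\.html?$" />
                  <action type="Redirect" url="{R:1}.php" redirectType="Permanent" />
                </rule>
              </rules>
            </rewrite>
          </system.webServer>
        </configuration>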

    Read the article

  • Domain changes required for SSL integration

    - by user131003
    Currently my site supports regular payment options (the user is taken to the payment gateway/PG website). Now I'm trying to implement "seamless" PG integration, and I need SSL for this. I'm on a dedicated server with 5 static IPs from Hostgator (HG). Options:

    1. I take SSL for www.my_domain.com. According to HG, I need to change the IP of the main site, as the current IP is not really dedicated (it is being shared by cPanel etc.), so they need to bind another dedicated IP to the main domain for SSL to work. This would require a DNS change for the main website and hence cause a few hours' downtime (which is OK).

    2. I've noticed that most e-commerce websites use subdomains like secure.my_domain.com for SSL/HTTPS. This sounds like a better approach, but I've got a few doubts in this case:

       a) Would I need to re-register with existing PGs (PayPal, Google Checkout, Authorize.net) if I switch to a subdomain? Re-registering is not an option for me.

       b) Would a DNS change be required for www.my_domain.com in this case? This confusion arose because of the following reply from HG: "If the sub domain secure.my_domain.com is added to an existing cPanel it will use the IP for that cPanel so as long as it is a Dedicated IP that will be fine. If secure.my_domain.com gets setup as its own cPanel it will need to be assigned to a Dedicated IP which would have a DNS change involved."

    Please suggest?

    Read the article

  • Browser privacy improvement implications for websites

    - by phq
    On https://panopticlick.eff.org/ the EFF lets you test the number of uniquely identifying bits that the browser gives a website. Among these are HTTP header fields such as User-Agent, Accept, Accept-Language and later perhaps ETag and If-Modified-Since. There is also a lot of information that JavaScript can get from the browser, such as the time zone, screen resolution, and the complete list of fonts and plugins available.

    My first impression is: is all this information really usable/used on a majority of websites? For example, how many sites really send different content types depending on the HTTP Accept header, or on what fonts are available (I thought CSS had taken care of this)? Let's say one of these headers or pieces of JS functionality went away one day. Which ones would:

    - never be noticed to be gone?
    - impact user experience?
    - impact server performance?
    - immediately be reimplemented because the Internet cannot work without them?

    Extra credit for differentiating between what can be done, what should be done and what is done in most situations.
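
    To illustrate the JavaScript side of what's listed above, the browser exposes most of those values as one-liners (a minimal sketch; real fingerprinting scripts go much further):

        // Minimal sketch of identifying bits readable from script
        var bits = {
          userAgent: navigator.userAgent,                   // duplicated in the HTTP headers
          language:  navigator.language,                    // roughly mirrors Accept-Language
          screen:    screen.width + 'x' + screen.height + 'x' + screen.colorDepth,
          timezone:  new Date().getTimezoneOffset(),        // offset from UTC in minutes
          plugins:   Array.prototype.map.call(navigator.plugins, function (p) { return p.name; })
        };
        console.log(bits);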

    Read the article

  • Google's Opinion on Javascript Page Refresh

    - by user35306
    I was wondering if anyone knows Google's view on this. My company has a homepage that features a lot of 3rd parties on it and it needs to inform customers which ones are currently online, which aren't, and which are currently busy. Because this constantly changes, we have the homepage refresh to show the most relevant and up-to-date content to our users. I'm not using a meta refresh element in the http-equiv parameter to do this. Instead I have this js element to refresh the page: window.setTimeout("refreshPage()", 120000); I just want to know whether people think Google might consider this a violation of the content guidelines or not. Or if it's not an outright violation, then at least if Google frowns on this or not. It doesn't redirect the user to a different page or anything, just refreshes the page so that they can see the most relevant content.

    Read the article

  • How to Figure AdSense PPC with AdWords CPC of $0.05

    - by Melanie
    Often when I am using the AdWords Keyword Tool I find keywords with a CPC of $0.05... I mean VERY often. Is this like a base level when it comes to PPC, or is this a possible error? For example, with a few keywords I have targeted with a $0.05 CPC, I often find a PPC of $0.25 or more. Obviously, this is because my content is triggering other keywords, although it is centered around a $0.05 keyword. I have found several keywords that have over 200,000 searches using the [exact] search parameter but have a CPC of $0.05. I plan on writing content to cover these keywords, but I am trying to figure out their approximate value. $0.05 leads me to believe there are no advertisers, so IF someone were to advertise and use such a keyword, it would cost them ~$0.05. But since there is obviously no demand for these keywords and thus they aren't getting bids, other ads MUST be shown. How can I predict the value of ads with a CPC of $0.05? Strange question, I know, but I'm just trying to understand this a bit more.

    Read the article

  • For Google Rich Snippets: Is it 'harmful' to add the same `hreview-aggregate` microformat markup in several places?

    - by Oliver
    We are right now incorporating microformat markup for reviews into a client's web application and were wondering whether it can be harmful to provide the same information on more than one page, e.g. on a dynamic search page and on the concrete product page. Does anybody have any experience with this?

    UPDATE: Actually, I was also wondering: if Google shows a link to the page the review comes from, how would they decide which of the sources of the review to link to? Or don't they?
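
    For readers unfamiliar with the markup being discussed, a minimal hreview-aggregate block looks roughly like the sketch below (product name and numbers invented); the question is whether repeating such a block on both the search page and the product page causes problems:

        <!-- Hypothetical hreview-aggregate markup for a product -->
        <div class="hreview-aggregate">
          <span class="item"><span class="fn">Example Widget</span></span>
          <span class="rating">
            <span class="average">4.2</span> out of <span class="best">5</span>
          </span>
          based on <span class="count">37</span> reviews
        </div>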

    Read the article

  • Tracking AdWord ads with different text in Google Analytics

    - by at01
    I'm trying to see how the text in my Google AdWords ads affects my metrics in Analytics. I have auto-linking enabled, so I figured I would be able to automatically see this in Analytics. Unfortunately, if I try to add a second dimension of Traffic Sources-Ad Content, the metrics are only split by the ad's Headline. Most of my tests are changing only the ads' descriptions... So I guess I need to add a tracking parameter like ?campaign=special_text to my URLs? Or is there a way to see the ads split by ad descriptions? Should I add the full suite of utm_campaign/utm_medium/etc parameters? What's the proper way to track these ads which are mostly similar except the ad descriptions?
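
    For reference, the usual way to split reports by ad variant by hand is manual tagging with utm_content (the other utm values below are just illustrative placeholders); with auto-tagging alone, the Ad Content dimension is keyed off the headline, which matches what's described above:

        http://www.example.com/landing-page?utm_source=google&utm_medium=cpc&utm_campaign=example_campaign&utm_content=description_variant_a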

    Read the article

  • How to fix bad SEO after being hacked

    - by mkprogramming
    About a year ago my WordPress website was hacked, and some company decided to go nuts and actually do some "SEO" on the various links it created. Some of the pages would show up on Google as "payday cash advance" instead of "portfolio". The issue has been resolved, but now, as I've been doing GOOD SEO, I've noticed (when checking backlinks) that there are TONS of links still on the internet (mostly on broken sites now) pointing to my website with titles like "get a loan today" and so on. Is there a way to remove these links? Can I tell Google to ignore them? Help!
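
    On the "tell Google to ignore them" part, the mechanism usually brought up is a disavow file uploaded through Webmaster Tools; it is a plain text list along these lines (the domains below are placeholders, not the actual linking sites):

        # Hypothetical disavow.txt -- spammy links left over from the hack
        # Disavow an entire linking domain
        domain:spammy-loans-example.com
        # Or disavow a single URL
        http://another-spam-example.net/payday-cash-advance.html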

    Read the article

  • Should I add a "nofollow" attribute to download links, or disallow the URLs in robots.txt?

    - by Laurent
    I have a download link very similar to Opera's one - it's just a script that sends the file. It doesn't have an extension, and there's no obvious way to tell that it's actually a download link. So since I don't want robots to crawl this link, do I need to add it to robots.txt, or maybe add a "nofollow" attribute to it? I see that on Opera's website they didn't do either of these, so perhaps it's not necessary?
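
    For concreteness, the two options being weighed look like this (the /download path is a placeholder standing in for the actual download script):

        # robots.txt -- ask compliant crawlers not to fetch the download URL at all
        User-agent: *
        Disallow: /download

        <!-- or, in the linking page, mark the individual link -->
        <a href="/download" rel="nofollow">Download</a>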

    Read the article

  • Foolproof way to ensure Google News pulls the correct image for its thumbnails?

    - by Anthony
    Google News results have an accompanying thumbnail next to articles that show up in the results. If Google's crawler can't find a thumbnail to pull from our site, it uses its next best guess from another site, therefore linking the image to another site but still using our headline. Example: headline from Reuters, image from Livemint. Our pages absolutely have images, and they are not massive in file size or dimensions, yet we are not having them pulled/crawled correctly. We have read up on the suggestions from Google, and from others around the web, and nothing is panning out. Has anyone had any experience where they can ensure Google News will pull a thumbnail of our choosing?

    Read the article

  • SEO for replacing blog content, but keeping the same page URL

    - by cphill
    This might not have any major impact on the SEO, but basically I have a random blog at this URL: http://example.com/blog (not a real URL), which I am removing and replacing with a company blog. I want to keep using the http://example.com/blog URL, but I'm not sure how this would affect my SEO, since the random blog content that I am removing lives under the example.com/blog URL prefix. Would I just add a 301 redirect for those old blog articles and leave the base /blog URL without any redirects?

    Read the article

  • 250k 404 & 410 errors in Webmaster Tools. Bad backlinks?

    - by Natália
    Our Webmaster Tools account is showing 250,000 errors related to weird links from other sites. These URLs come mostly from non-existent sites, or are being generated directly by our website. Here are some examples of these URLs:

        oursite.com/&q=videos+caseros+sexo+pornos+gratis&sa=X&ei=R638T8eTO8WphAfF2vG8Bg&ved=0CCAQFjAC%2F%2Fpage%2F2%2Fpage%2F3%2Fpage%2F4%2Fpage%2F3%2Fpage%2F4%2Fpage%2F3%2Fpage%2F4%2Fpage%2F5%2Fpage%2F4/page/3

    Our site is a popular Spanish adult site, yet we don't have the keywords that are mentioned in this URL. Apparently this link comes from our site. Some more examples:

        oursite.com/&q=losmejoresvideosporno&sa=X&ei=U__8T-BnqK7RBdjmhYsH&ved=0CBUQFjAA%2F%2Fpage%2F2%2Fpage%2F3%2Fpage%2F2%2Fpage%2F3%2Fpage%2F2%2Fpage%2F3%2Fpage%2F4%2Fpage%2F3%2Fpage%2F2%2Fpage%2F3/page/4

    Once again: not our queries, not our URLs.

        oursite/tag/tetonas

    We think it might be another site that has a policy of extremely bad SEO based on other sites' branding and keyword usage:

        thirdsite/buscador/tetonas-oursite

    The question is: if other sites are generating these URLs, how can we prevent this? Why is the tag being generated if no link was added to the other site? What should we do with these errors? 301? 410 Gone? I have read all the similar Q&As here, but none of them seems to solve our problem. It is not likely to be a bad ad (I inspected them all). Maybe some old content which Google decided to recrawl suddenly? Maybe a third party's bad SEO policy? Maybe all of them? Any help will be highly appreciated.
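
    If the decision were to answer these junk URLs with 410 Gone, one common Apache approach is a rewrite rule. A sketch only; the pattern below is a guess based on the example URLs above, not a tested rule for this site:

        # Hypothetical .htaccess snippet: serve 410 Gone for the scraped search-style URLs
        RewriteEngine On
        # Match paths that start with the bogus "&q=" pattern seen in the examples
        RewriteRule ^&q= - [G,L]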

    Read the article

  • Wordpress Multisite and Google Analytics in subfolders with mapped domains

    - by David
    I have a WordPress multisite using sub-folders. The site's subfolders are mapped to domains, which are set as primary. I'm using the 'Google Analytics Multisite Async' code to track things. From what I can see it's tracking the sites fine (getting page hits for each site in Google Analytics), barring the original site in the multisite, which in the Content Overview lists the other domains as content, along with the amount of traffic each is getting, in addition to the original domain's own traffic. I don't want to track any traffic for my original site other than what actually goes to it, i.e. I don't want it tracking my other sites in the multisite. E.g. domain1.com is my original site and I have lots of other sites in the multisite, say domain2.com and domain3.com. In the Content Overview in Analytics it lists, say, domain2.com as content. Can I tell it to filter these out somehow, either in Analytics or within WordPress? Hopefully I've explained that clearly!

    Read the article

  • Migrating from .co.uk to .uk [on hold]

    - by DD.
    I currently run the site https://www.example.co.uk and I'm considering migrating to https://www.example.uk to take advantage of the new shorter domain. When migrating the domain in Google Webmaster Tools...will all the authority from the old domain pass if I use 301 redirects? Does anyone know how all the reviews I have collected will get transferred to the new domain when using the AdWords seller ratings? Any other important issues to be wary of when migrating to the new domain?
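
    For the redirect side of such a move, the usual pattern is a host-wide 301 that preserves paths, along the lines of the sketch below (Apache .htaccess assumed here; the exact mechanism depends on how example.co.uk is hosted):

        # Hypothetical .htaccess on the old example.co.uk host:
        # permanently redirect every path to the same path on example.uk
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?example\.co\.uk$ [NC]
        RewriteRule ^(.*)$ https://www.example.uk/$1 [R=301,L]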

    Read the article

  • Keep search engine from indexing specific content on your site

    - by Jimmy Chopps
    I've got a pretty weird scenario that I was wondering if someone could help me out with. I recently created a blog site and noticed that search engines have been including the content of my footer in the description. This presents a problem because my footer is basically a brief legal statement saying that the views are my own and don't represent the company I work for (and yada yada yada). So, basically, I need a way to prevent search engines from indexing that content in my footer, or even my footer altogether. I've been looking back through some of my SEO books and searching through forums, but this doesn't seem possible. Is it possible to keep search engines from indexing only certain content on a page? If it isn't possible, what alternatives are there to ensure this legal mumbo jumbo doesn't show up in the results?

    Read the article

  • Tools for managing eCommerce backend

    - by rboarman
    I am working with an eCommerce company that has outgrown their hacked-together backend for managing inventory, pricing and feeds to various shopping engines (Yahoo, 3d cart, Amazon, etc.). They currently manage about 12,000 SKUs and are doing $40M in revenue. Their internal people are working on a new Magento solution, but that is six months away, and they need to replace/improve their current solution in order to hold them over. Their current solution was developed by two people who have since left the company. What tools/architecture do other eCommerce sites use to manage their inventory, pricing, product descriptions and feed generation for the shopping engines?

    The current solution looks like this:

    1) Inventory, pricing and product descriptions are maintained in a database and in NetSuite by employees
    2) New products are added to the database via import
    3) Twice a week data is extracted into a giant Excel spreadsheet
    4) The Excel file adjusts pricing based on some simple algorithms
    5) The Excel file exports about six different CSV feeds which are manually uploaded to Amazon, 3d cart, Yahoo, Google and Merchant Advantage
       a. Each feed is a variant of the product data with different field names and formatting
       b. Pricing levels differ between feeds
       c. Some products are not sent to all feeds
    6) Orders are manually parsed and the inventory is adjusted as needed once product is sold

    The new solution should:

    1) Import data from ODBC, CSV and NetSuite (CSV via FTP)
    2) Apply pricing changes via simple algorithms (< $80 add $10, $200 add $25)
    3) Ensure margins are being met
    4) Format and generate a bunch of CSV and XML feeds
    5) Perhaps upload feeds to shopping engines automatically

    What I need to do is replace the Excel file with something that is maintainable and automated. Something in the .NET stack is preferable but not mandatory. I’ve been looking at BizTalk, but it may take too long to develop and deploy. Any suggestions?

    Read the article

  • How to properly remove URLs from Google's index?

    - by ElHaix
    On some of our sites, we now have several thousand pages that dilute our website's keyword density. The website is an MVC site with SEO routing. If I submit a new sitemap with, say, only the 2000 or so pages that we want indexed, even though navigating to the diluting pages still works, will Google re-index the site with only those 2000 pages, dropping the superfluous ones? For example, I want to keep roughly 2000 of the following: www.mysite.com/some-search-term-1/some-good-keywords www.mysite.com/some-search-term-2/some-more-good-keywords And remove several thousand of the following that have already been indexed: www.mysite.com/some-search-term-xx/some-poor-keywords www.mysite.com/some-search-term-xx/some-poor-more-keywords These pages are not actually "removed", as navigating to these URLs still renders a page. Even though there are potentially hundreds of thousands of pages, I only want, say, 2000 to be re-indexed and retained, with the others removed (without having to do this manually). Thanks.
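
    For concreteness, the "new sitemap" being described would just be a standard XML sitemap listing only the pages to keep, e.g. (URLs copied from the examples above):

        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <url><loc>http://www.mysite.com/some-search-term-1/some-good-keywords</loc></url>
          <url><loc>http://www.mysite.com/some-search-term-2/some-more-good-keywords</loc></url>
          <!-- ...the remaining ~2000 pages to keep... -->
        </urlset>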

    Read the article
