Search Results

Search found 11896 results on 476 pages for 'smart pro'.


  • Will redirecting a subdomain lose Google SEO links?

    - by user29160
    Because of a change in brand, I want to redirect our subdomain.domain.com to newdomain.com. Since the content is exactly the same, I was thinking of using a wildcard 301 redirect to newdomain.com. I noticed it is not possible to declare this kind of move in Google Webmaster Tools as you can with a root domain. Is there a way I can do this redirect without losing all the backlinks referenced by Google? All the best, Mark
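
    A minimal .htaccess sketch of the wildcard redirect being described, assuming Apache with mod_rewrite (the domain names are the questioner's placeholders):

        RewriteEngine On
        # send every path on the old subdomain to the same path on the new domain
        RewriteCond %{HTTP_HOST} ^subdomain\.domain\.com$ [NC]
        RewriteRule ^(.*)$ http://newdomain.com/$1 [R=301,L]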


  • Can you disavow a whole domain apart from the index page?

    - by Silver89
    Many years ago I may have bought a few sitewide links for some of my sites; these have now come back to haunt me and I need to sort them out. I've tried to contact the owners, but they're too lazy to bother changing the sites, so I figure it's time to disavow the links. But is there a way to disavow all of the sitewide links on a domain apart from those on the index page? And would leaving the index page be a benefit, or would it still be seen as spammy? Something like:

        # Contacted owner of shadyseo.com on 7/1/2012 to
        # ask for link removal but got no response
        domain:shadyseo.com
        !shadyseo.com/index.php


  • Facebook contest policy no-no?

    - by Fred
    I would like to post a link on a Facebook page that exits Facebook entirely and goes to a client's website, where people land on a page (the client's) on which they can enter their e-mail address, which is stored in a temporary database file, with rules and disclosures etc., for a draw once the number of entries reaches 100, for instance. Once the number of entries reaches 100, a random winner is picked and notified via e-mail. The functionality is as follows:

      - A link is placed on a Facebook page leading to an external page
      - The page is a form to merely enter an e-mail address for a contest
      - The e-mail address is placed in a temporary file
      - An automatic e-mail is sent to the address for confirmation, using a SHA-256 hash
      - The person receives the e-mail, which says something to the effect of "Please confirm your e-mail address etc. If you did not authorize this, simply ignore this message and no further action will be taken"
      - If the person clicks on the confirmation link, the e-mail address is then stored in the database and the person is again notified, with "Thank you for signing up etc."
      - Once others do the same and the database reaches a certain number of entries, the form is no longer accessible and a random e-mail address is automatically picked
      - Once picked, an e-mail is automatically sent to the winner with the instructions, also notifying me
      - Once that person clicks yet another confirmation link, the database is then automatically deleted

    I have built this myself and have no intention of breaking any rules, nor of jeopardizing the work/time/energy I have put into this project. Is this allowed?


  • AdWords: is it possible to automatically copy changes in one campaign to other campaigns?

    - by Richard
    Is it possible to have identical campaigns except for one thing, so that whenever changes are made to one campaign, the same changes are automatically made to the others? I want to have identical campaigns running for different time zones, but with each campaign only running between particular hours of the day in its time zone. If I have a campaign for each time zone, then when I make changes to one campaign, such as adding ad groups, adding keywords, or changing bid amounts, I want those changes to occur in the other campaigns also. Is this possible? Thanks.


  • How long is the penalty for duplicate ecommerce content after it has been resurrected?

    - by will
    I am fixing all of the duplicate content on my ecommerce site, with all-original descriptions etc. How long does it take Google to start ranking it again? I used to have a good ranking that converted quite a few sales; in the last week I have had next to nothing. Also, would the disclaimer I created under each product be considered duplicate content, given that it appears on most of my product pages and is the same?


  • How did Google Analytics kill my site?

    - by user1813359
    Yesterday I created a Google Analytics profile for one of my sites and included the JS block in the layout template. What happened next was very strange. Within about 2 minutes, the site had become unreachable. I had been checking the AWStats page for the site when I thought to set up GA. After that had been done, I clicked on the link for 404 stats, which opens in a new tab. It churned for a long while and then showed a nearly blank page, similar to when Firefox chokes on a badly-formatted XML page, except there was no error message. But I was logged into the server and could see that the page has an HTML 4.01 Transitional DTD. Strange! I tried viewing source but it just churned endlessly. I then tried "inspect element" and was able to see an error message having to do with some internal Firefox lib. Unfortunately, I neglected to copy that. :-( All further attempts to load anything on the site would time out. Firebug's Net panel showed no request being made. Chrome would time out. So, I deleted the GA profile, removed the JS block, and cleared the server cache. No joy. I then removed all Google cookies and disabled JS. Still nothing. No luck in any other browser. And now my client couldn't access the site. Terrific. I was able to use wget while logged into another server. The retrieved page was fine, and did not contain the GA JS block. However, the two servers are on the same network. (Perhaps a clue.) The server itself was fine. Ping and traceroute looked great. I could SSH in. I tailed the access log and tried a browser request. Nothing. But I forgot to quit, and a minute or so later I saw a request from someone else being logged. Later, I could see that requests had been served all day to some people. Now, 24 hours later, the site works once again, but is still unreachable by the client (who is in another city). So, does anyone have some insight into what's going on? Does this have something to do with Google's CDN? I don't know very much about how GA works, but what I'm seeing reminds me of DNS propagation issues. And why the initial XML error? And why the heck was the site just plain unreachable? What did Google do to my site?! Sorry for the length, but I wanted to cover everything.


  • HTML5 article tag application for the iPad

    - by dspencer
    I've used article tags on websites. My understanding and practice is to use the article tag for publication content. I always use HTML/HTML5 tags for their intended purposes, not at will. Recently, I've seen an HTML template that uses the article tag for non-publication page content, such as the content of an About Us page or any other generic page. I asked why it was used this way, and the (vague) explanation was that it had to do with the way the iPad reads the tag. Is this true?


  • Is there a colloquial term for the web dev equivalent of "Script Kiddies"? [on hold]

    - by Darkcat Studios
    I recently came across a couple of, let's say, "young people" who were convinced that they were "web developers", despite the fact that they were only able to configure a WordPress template, and not even able to edit images properly. This got me thinking: "back in the day", kids who thought they were hackers but just ran other people's scripts they had found were known as script kiddies. Do we have a term for these yet?


  • Restricting A Directory Through .htaccess

    - by Whitechapel
    I'm trying to put all of my FTP accounts into a folder at /public_html/ftp and password-protect it so search bots can't crawl their private files. I'm also trying to redirect all site traffic from the non-www to www. I keep getting 500 errors when accessing the site, and I need www.vivalanation.com/ftp to redirect to www.vivalanation.com/ftp/, because /ftp without the trailing slash just errors out. Here is my .htaccess in the /public_html/ftp folder:

        RewriteEngine on
        RewriteBase /
        RewriteCond %{HTTP_HOST} !^www\. [NC]
        RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]

        AuthName "FTP Access"
        AuthType Basic
        AuthUserFile /home1/vivalst/.htpasswds/public_html/ftp/passwd
        Require valid-user

    I created a passwd file in /.htpasswds/public_html/ftp. And here is my basic .htaccess in the root of /public_html/:

        RewriteEngine on
        RewriteBase /
        RewriteCond %{HTTP_HOST} !^www\. [NC]
        RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]
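
    For the trailing-slash problem specifically, a commonly used sketch (assuming mod_rewrite; untested against this exact setup) redirects any request for a real directory to the same URL with a slash appended. Note that Apache's mod_dir normally does this on its own via DirectorySlash On, so if it is not happening, something else in the chain is likely interfering:

        RewriteEngine on
        # if the request maps to an existing directory but lacks a trailing slash, add one
        RewriteCond %{REQUEST_FILENAME} -d
        RewriteRule ^(.*[^/])$ /$1/ [R=301,L]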


  • How should I structure a site with content dependent on visitor type (not user)?

    - by Pedr
    I have a website that displays different content depending on two selections made by a visitor: whether they are a teacher or student, and their learning level (from 4 options). Everything is public and they don't need to authenticate to access the content. Depending on their selection, different content is displayed across the whole site, other than a contact page and an about page. The tone of the language changes depending on whether the visitor is a student or teacher, and the materials available on each page also change depending on the learning level; in all cases, however, the structure of the site is identical. Currently I'm using a cookie to store the visitor's selections and render content appropriately, so I have a single set of URLs which display different content depending on the cookie, with one of the permutations as the default. I appreciate this is far from ideal, but what is the better option? Would I be better off using a distinguishing segment for each selection, for example:

        http://example.com/teacher/lv3/resources/activities
        http://example.com/teacher/lv4/resources/activities
        http://example.com/student/lv4/resources/activities

    etc.? What is the most sensible way to handle this situation?
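
    If the segmented URLs are adopted, a single rewrite rule can map them all onto one handler. A sketch, assuming Apache with mod_rewrite; the audience, level, and path parameter names are hypothetical:

        RewriteEngine On
        # /teacher/lv3/resources/activities -> index.php?audience=teacher&level=3&path=resources/activities
        RewriteRule ^(teacher|student)/lv([1-4])/(.*)$ index.php?audience=$1&level=$2&path=$3 [L,QSA]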


  • Does Google penalize pseudo-duplicate pages for different locations?

    - by mikewowb
    My company's site's home page was not specifically optimized for any location. Now I am planning to optimize it for Boston, and to create ten or so other landing pages for the other locations we serve. If we made these new pages by copying the original Boston one and changing the location name (s/Boston/Montreal/), would Google consider them duplicate pages and penalize us? What is the best practice for this?


  • How do I fix the paths of my website?

    - by EASI
    I have Joomla 2.5.7 on my client's server, recently updated from 1.6 to 1.7. I did not build the site, but I am responsible for it now; I prefer to build a site from zero. Users are clicking on the menu options, and when Joomla sends them to a URL like http://iap.pa.gov.br/acervo they get the message: 404 The requested URL /acervo was not found on this server. Would that be because they moved the Joomla folder from the root to root/iap (the name of the site)? If so, what configuration needs to change to adapt to the new folder?
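
    If the Joomla folder really was moved from the web root into /iap, the usual suspects are the RewriteBase in Joomla's .htaccess and the $live_site value in configuration.php. A sketch of the .htaccess change (an assumption about this particular server, not a confirmed diagnosis):

        RewriteEngine On
        # Joomla now lives one level below the web root
        RewriteBase /iap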


  • Are there disadvantages to a literal + instead of an encoded + (%2B) in a URL?

    - by M_rk
    A client of mine has a product ending with a plus sign (e.g. Google+) and would like the web page for this product to have a URL that is human-readable (i.e. a URL that doesn't contain %2B). Since our projects use the following .htaccess RewriteRule:

        RewriteRule ^(.*)$ index.php?$1

    it is possible to use a URL-encoded space in a URL like that. However, while the URL would read like /google+, the actual meaning of the URL would be /google[space]. (The markup won't let me place a real space there.) Now my concern is that this would have disadvantages for SEO. Is this concern valid, and/or are there other pitfalls to this approach?
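
    For reference, mod_rewrite in Apache 2.2.7 and later has a B flag that percent-encodes backreferences, which keeps a literal + in the path from being read as a space once it lands in the query string. A sketch of that variant (an illustration, not a recommendation for this client's setup):

        RewriteEngine On
        # with B, a request for /google+ becomes index.php?google%2B
        RewriteRule ^(.*)$ index.php?$1 [B]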


  • Managing client passwords [closed]

    - by C.Johns
    I am just starting up a small website development business, and one of the issues I am having is remembering passwords and account information for clients' hosting, cPanel, FTP accounts, etc. I was wondering: what is the most suitable system / industry standard for controlling such information? Pretty marginal on the close there... I read the FAQ and I felt like this could be a common issue for webmasters; it's definitely not a coding question, so Stack Overflow is out of the question, and it's not a broad question; it's focused on one particular aspect of being a webmaster.


  • Serve up syntactic XHTML5 using the text/html MIME type?

    - by cboettig
    I have a site currently written with HTML5 tags. I'd like to be able to parse the site as XML, with support for namespaces etc., to facilitate programmatic extraction of data. Currently I have:

        <!DOCTYPE html>
        <meta charset="utf-8">

    which I gather is equivalent in HTML5 to explicitly setting the content type as:

        <meta http-equiv="Content-Type" content="text/html; charset=utf-8" />

    for my current setup. In order to serve XML, it sounds like the right thing to do is:

        <?xml version="1.0" encoding="UTF-8"?>
        <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
            "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
        <html xmlns="http://www.w3.org/1999/xhtml">

    Should I also change my content type to:

        <meta http-equiv="content-type" content="application/xhtml+xml; charset=iso-8859-1" />

    or is that not necessary? What is the advantage of having the content type be "application/xhtml+xml"? What is the disadvantage? (It sounds like it may break Internet Explorer's rendering of the site, but maybe that information is out of date now?) Many thanks!
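
    A meta http-equiv tag cannot change the Content-Type the site is actually served with; that comes from the HTTP header the server sends. A sketch of setting it server-side, assuming Apache with mod_mime and a hypothetical .xhtml extension for the XML-parseable pages:

        # serve files with an .xhtml extension as XML rather than as HTML
        AddType application/xhtml+xml .xhtml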


  • With Google DFP (Small Business) is it possible to disable AdSense in an Ad Slot on a per-request basis?

    - by Daniel Pehrson
    Setup: I run a network of websites that target different hobby niches and have a section dedicated to community classifieds. I serve advertising on these sites through Google DFP for Small Business, with AdSense enabled on the slots.

    Problem: One of the next sites in my network will target the firearms/shooting industry, and as such the classifieds section will not comply with the prohibited-content guidelines of AdSense regarding the sale (or coordination of sale) of weapons. I work very hard to comply with the guidelines of my partners even when I don't understand/agree with them, and after talking with many people I have decided that the best option is to disable AdSense serving on that section of that website, while leaving it on for the rest of the network.

    Solution: Right now my only idea is to duplicate all my site's ad slots, tack "_sensitive" onto the end of each one (e.g. header and header_sensitive), and conditionally register ad slots based on whether or not I am in the sensitive section of the sensitive site. My hope, however, is that there may be a way to accomplish this without duplicating all my ad slots, possibly with some sort of option to the GA_googleFillSlot() call that lets me say "load ads from this slot but do not serve AdSense no matter what."


  • MySQL: can only log in as root, even after creating new users with their own database

    - by ionFish
    Problem: I just set up a Debian Wheezy installation for testing, and installed the LAMP packages and PMA. I can log in as root with my pre-defined password and create/edit/delete both databases and users. The problem comes when I create a new user 'something', set a password for it, and grant it all privileges on a database 'something' (same as the username). Upon connecting, it denies access to the user.

    Details: Host: localhost, using MySQL 5.5.24-8.

    Creating the user:

        CREATE USER 'something'@'%' IDENTIFIED BY '***';
        GRANT USAGE ON *.* TO 'something'@'%' IDENTIFIED BY '***'
            WITH MAX_QUERIES_PER_HOUR 0 MAX_CONNECTIONS_PER_HOUR 0
                 MAX_UPDATES_PER_HOUR 0 MAX_USER_CONNECTIONS 0;
        CREATE DATABASE IF NOT EXISTS `something`;
        GRANT ALL PRIVILEGES ON `something`.* TO 'something'@'%';

    Checking privileges:

        GRANT USAGE ON *.* TO 'something'@'%' IDENTIFIED BY PASSWORD '*92F9DAF5F5129554509489FDB6A433510223C799';

    Result:

        Access denied for user 'something'@'localhost' (using password: YES)

    More info: I use this same exact procedure on the Squeeze distribution, and it works perfectly. Is there a chance it's because of Wheezy, or something else? I need to continue using Wheezy because of the updated packages (for this test server; the others work fine), so 'just use Squeeze' is not an option. Note: I HAVE tried FLUSH PRIVILEGES; to no avail.


  • Googlebot visit but no cache update - why?

    - by Mick
    I have made a new plain-vanilla HTML website and have been making regular modifications to it on an almost daily basis. The site is hosted by Hostmonster, and as part of their service they offer "awstats" to let you know assorted details of visitors to the site. One thing is puzzling me. According to awstats, a "robot/spider" calling itself "Googlebot" visited my site as recently as today (28th June 2011), but when I find my site on Google (e.g. by searching for "full reserve banking") the cache is dated only the 5th of June. I always thought that a visit from the Google robot was synonymous with a cache update. Am I wrong? Or have I accidentally put something in the site telling Google that nothing has been updated? EDIT: It seems a moderator has removed the name of my website, so there is now no chance that anyone could check whether I had made some error on my site :-( ... but anyway, in answer to paulmorriss' question, here is what awstats was telling me:


  • Comparisons of JavaScript 'data grids'?

    - by Joe
    I've found plenty of questions between here and Stack Exchange of people asking for the 'best' data grid / data table, or one that has a particular feature, and plenty of lists out there (of various ages) cataloguing the various data grid implementations... but is anyone aware of a matrix of which features the various solutions implement? (E.g., allow shift-click to select multiple rows; support checkboxes for selection; can update a regular table in place; allow editing of cells; support WebSQL or IndexedDB for local caching; which browsers they support; infinite scroll; etc.) There's a generic 'JavaScript framework' comparison on Wikipedia, which would be the sort of thing I'm looking for, but it doesn't go into detail on data grids. (Which makes sense, as so many are extensions, not core features of those frameworks, and in the case of jQuery, there are lots of 'em.)


  • SEO Blog Indexing: Dot WordPress Versus a Registered Domain?

    - by rumspringa00
    I've used WordPress for a few of my clients' sites, mostly small businesses and ecommerce sites. I have found through Google Analytics, as well as the All in One Webmaster plugin, that when it comes to social media, using WordPress is a surefire way of getting your site indexed by Google and occasionally Bing and Yahoo. Since I am a heavy WP user, I'd like to contribute by registering a dot-WordPress domain for my portfolio. When using a WP installation concurrently with a WP domain, e.g. myportfolio.wordpress.com, will the site be more or less likely to be indexed than a generic myportfolio.com domain? I've seen mixed opinions: some people seem to favor a WP domain for URL output, while others say it's a moot point and that Google will not favor a WP domain over a dot-com domain as long as your meta tags are updated and your content is keyword-optimized. I tend to disagree, and believe a WP domain would be more likely to be indexed and would output more URLs than an individual domain like myportfolio.com. Am I wrong? Thanks in advance!


  • Are there advantages to using hard-coded URLs for localization?

    - by nbolton
    On the Synergy website, localization is detected (and can be overridden) but the same URL is used for all languages. Some websites, however, like Wikipedia, have language-specific subdomains. What are the advantages of having either subdomains or subdirectories (i.e. a specific URL) for each language localization? Also, should the site automatically redirect the user to the specific subdomain/subdirectory based on the language the browser requests? I suspect that there are advantages, which I'm guessing are: when the website appears in search results for non-English languages, the translated page description will be shown (assuming there is a translation provided by the website); and when a user shares a page (e.g. through Twitter), it will show in a specific language. Perhaps this is a disadvantage though? Am I correct? If so, are there more advantages?
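
    Automatic redirection by browser language is usually keyed off the Accept-Language request header. A minimal sketch, assuming Apache with mod_rewrite and a hypothetical /fr/ French section:

        RewriteEngine On
        # send French-preferring browsers that hit the site root to the French section
        RewriteCond %{HTTP:Accept-Language} ^fr [NC]
        RewriteRule ^$ /fr/ [R=302,L]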


  • Which one of the Google Analytics stats should be my top priority to improve SEO: visits, pageviews, bounce rate, new visitors, or visit duration?

    - by HOY
    My site is getting a lot of traffic from an image (a company logo image) because the image ranks 1st in Google search results for that company's title. (I have no idea how that happened.) This image is a must for my website, but it is not relevant to the site content, so irrelevant people search for the image and find out about my site. I get interesting statistics:

        Pros: Total Visits & Avg. New Visits
        Cons: Avg. Page/Visit, Avg. Visit Duration, Bounce Rate

    I am confused about whether this image is helpful to my website; I don't know what the balance between those 5 statistics should be. My website is 2 months old, and we are working on SEO at the moment. Edit: here are the details about the image:

        Search keyword: arcelik logo
        Search site: google.com.tr
        Search URL


  • Quickest way to research a set of pages' backlinks

    - by JeremyB
    I have a list of 300+ pages (chosen based on which pages rank for a keyword I'm interested in) and I want to compile a list of all the (known) inbound links to those pages. What's the fastest way to do this? It seems like the only tools out there (Yahoo Site Explorer, SEOmoz, Majestic) require you to either a) manually export each set of links by hand, or b) get data at the domain level (e.g. Majestic's Clique Hunter). Does anyone know of an efficient way to do this? I ask because I'm about to write a bunch of code, and I don't want to waste my time if there's another tool that will work. I know SEOmoz and Majestic have APIs, but I'm wondering if there's a more user-friendly option.


  • Can I use asterisks in URLs?

    - by KajMagnus
    Are there any reasons I shouldn't use an asterisk (*) in a URL? Background: with asterisks, I could provide these nice and user-friendly (or what do you think??) URLs:

      - example.com/some/folder/search-phrase*: search for pages with names starting with "search-phrase", located in /some/folder/.
      - example.com/some/**/*search-phrase*: search for any page with "search-phrase" anywhere in its name.
      - example.com/some/folder/*: list all pages in /some/folder/ (rather than showing the /some/folder/index page).


  • Merging two sites into one: how to redirect from the domain that's going away?

    - by bikeboy389
    I haven't been able to find any existing questions that cover my exact issue, so here goes: my client wants her two sites (domain1.com and domain2.com) rolled into a single, new site under domain1.com. Once the site is ready on domain1.com, DNS for domain2.com would be pointed at the same server as domain1.com. I know how to write an htaccess rewrite rule that would make all domain2.com traffic map to a specific single page or directory within domain1.com. But that's not what the client wants. What she wants is for a bunch of specific pages on domain2.com to map to specific new pages on domain1.com. For example:

        domain2.com/index.php?pageid=58 GOES TO domain1.com/2011/04/somearticle
        domain2.com/index.php?pageid=92 GOES TO domain1.com/2011/03/differentname
        etc.

    I could put a bunch of 301 redirects in the htaccess on domain1.com, which would work fine. The problem is, the client doesn't want/need specific redirects for ALL the domain2.com pages, and if I just do 301 redirects, anybody who comes looking for a domain2.com page that I haven't built a specific redirect for will get a 404 error. So I need to use 301 redirects for some traffic, and a rewrite rule for any traffic that's not covered by the 301 redirects. How do I do a sort of blending of a rewrite rule and 301 redirects, all in the htaccess file for domain1.com? Is this possible? Is it as simple as putting the 301 redirects in the htaccess file first, then the rewrite rule? I'm guessing not.
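
    The blend being asked about is workable in domain1.com's .htaccess with mod_rewrite alone. A RewriteRule pattern cannot match a query string, so each specific mapping needs a RewriteCond on %{QUERY_STRING}, with a catch-all last; the trailing ? on the targets strips the old query string. A sketch using the example URLs from the question:

        RewriteEngine On

        # specific legacy pages from domain2.com first
        RewriteCond %{HTTP_HOST} ^(www\.)?domain2\.com$ [NC]
        RewriteCond %{QUERY_STRING} ^pageid=58$
        RewriteRule ^index\.php$ http://domain1.com/2011/04/somearticle? [R=301,L]

        RewriteCond %{HTTP_HOST} ^(www\.)?domain2\.com$ [NC]
        RewriteCond %{QUERY_STRING} ^pageid=92$
        RewriteRule ^index\.php$ http://domain1.com/2011/03/differentname? [R=301,L]

        # anything else arriving via domain2.com falls back to the new home page
        RewriteCond %{HTTP_HOST} ^(www\.)?domain2\.com$ [NC]
        RewriteRule ^ http://domain1.com/ [R=301,L]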

