Search Results

Search found 27426 results on 1098 pages for 'build tools'.

  • Displaying thumbnails in Google search results for flash games?

    - by serg
    Some sites are somehow displaying video preview thumbnails for pages that contain flash games in Google search results. For example, search for: site:www.thorgaming.com game. I see this only on small sites, which makes me believe Google is not very happy about it. How do they do it, and is it OK with Google? I assume they are submitting thumbnails through video sitemaps, but I can't find any information about using them with flash games. I also ran a few such sites through the rich snippet testing tool and it didn't detect any microdata tags on the pages.
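
    If it really is done with a video sitemap, the entry presumably marks the game up as an embeddable player. Below is a minimal sketch of what such an entry might look like; the example.com URLs and file names are invented placeholders, and note that Google documents this format for video content, not games, which may be exactly why only small sites risk it.

        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
                xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
          <url>
            <loc>http://www.example.com/games/some-game.html</loc>
            <video:video>
              <video:thumbnail_loc>http://www.example.com/thumbs/some-game.jpg</video:thumbnail_loc>
              <video:title>Some Game</video:title>
              <video:description>A playable flash game.</video:description>
              <video:player_loc allow_embed="yes">http://www.example.com/player.swf?game=some-game</video:player_loc>
            </video:video>
          </url>
        </urlset>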

  • schema.org 'reviewRating' property not recognized by Google snippet testing tool

    - by saravanak
    I'm trying to add more structural information to my web pages by using the microdata format suggested on www.schema.org. The procedure seems straightforward, but I'm having issues validating my results in the Google Rich Snippets Testing Tool. Check out this review page, where I'm using the 'reviewRating' property to specify rating values for that particular review. I followed the same format as defined at schema.org/Rating, but this markup fails validation in Google's rich snippet testing tool with the following error info:

        Item Type: http://schema.org/rating
        reviewrating = 5
        ratingvalue = 5
        Warning: Property "reviewrating" was not found.
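
    A likely culprit is letter case: microdata property names are case-sensitive, and the tool is reporting all-lowercase names ("reviewrating", "ratingvalue"). Here is a sketch of the markup with the casing schema.org defines; the surrounding Review scope and the bestRating value are assumptions:

        <div itemscope itemtype="http://schema.org/Review">
          <!-- ... reviewer, date, review body ... -->
          <div itemprop="reviewRating" itemscope itemtype="http://schema.org/Rating">
            <span itemprop="ratingValue">5</span> out of
            <span itemprop="bestRating">5</span>
          </div>
        </div>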

  • Search ranking for important keywords has gone down drastically [duplicate]

    - by Vaivhav
    This question already has an answer here: How to diagnose a search engine ranking drop? (5 answers) Firstly, we are a small entrepreneurial team of three, and I am more like an amateur webmaster of the company's website, as we cannot really afford a technical guy/department right now. A few weeks ago, our website traffic and rankings for most keywords dropped overnight. I have done a lot of reading since then and learned about Penguin 2.1, which people said was the reason for the drop. Something like this had never happened before. Now, I have gone through the entire Google Webmaster help section. It says there that if a manual penalty is taken against us, we would see a message on the Manual Actions page. So far, we haven't received any notice from Google for web spam. Some SEO guys I contacted said they found spam links in our backlink profile. I do believe I mistakenly purchased a cheap link/SEO scheme when I was still very new to SEO. That was more than a year ago, but since then we have been legitimate. Moreover, how do I find out which links are spam and which are not? Our content is all original, fresh, and the best you will find in our niche. We also have a blog, but on a different domain (wordpress.com), from which we send out anchored links to our business website. Is this a good thing to do? Now, how should we proceed to recover our traffic/rankings? I tried searching in Webmaster Tools for a way to reach Google and ask why the traffic has decreased suddenly, but I couldn't find a contact form or anything similar. Can someone please go through our website and help make the reason for the drop clearer, along with a solution? I will really appreciate it, as I can't figure this out and it's taking a lot of time. Vaivhav

  • Analyzing the errorlog

    - by TiborKaraszi
    How often do you do this? Look over each message (type) in the errorlog file and determine whether it is something you want to act on. Sure, some (but not all) of you have a monitoring solution in place, but are you 100% confident that it really will notify you of all the messages you might find interesting? That there isn't even one little message hiding in there that you would find valuable to know about? Or how about messages that you typically don't care about, but knowing that you have a high...(read more)
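
    For an ad-hoc sweep of the log, SQL Server ships the well-known (though undocumented and version-dependent) xp_readerrorlog procedure; a minimal sketch:

        EXEC master.dbo.xp_readerrorlog
            0,        -- file: 0 = current errorlog, 1 = previous, ...
            1,        -- log: 1 = SQL Server, 2 = SQL Server Agent
            N'error'; -- return only rows containing this string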

  • Google is not treating two Australian schools as separate sites when both are subdomains of qld.edu.au

    - by LuckySpoon
    My question relates to two websites, each of which is a "Calvary Christian College", but in two totally different locations and entirely unrelated to each other (except by name, and thus domain). All schools in the state are issued a <school-name>.qld.edu.au subdomain, in this case calvary.qld.edu.au and calvarycc.qld.edu.au. Now, what's interesting is that these domains are crossing each other in sitelinks for searches such as calvary christian college townsville. The green data here is for one school (the Townsville school, as per the search term), and the red data is for the other school. I put a demotion in for this six months ago (we control calvary.qld.edu.au); however, we're seeing no change on the results page. I have been able to get the owners of calvarycc.qld.edu.au to submit demotions for our domain, which should go in sometime in the next few days. What can we do to tell Google that these websites are not interchangeable, despite both appearing as "subdomains" of qld.edu.au? We can possibly open channels of communication with the administrators of qld.edu.au, but we will need to tell them what to change, and at this point I'm out of ideas.

  • Desktop Software to monitor online status of web site and web-based application

    - by pansp
    I'm basically looking for a desktop-based application that can monitor my company's website and web application's online availability. I know there are a few online services like Uptime Robot that do the same job, but I have been asked to find desktop-based software that can run in the system tray and notify us of any downtime. A free tool would be great. Any help would be appreciated. Thanks!
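
    Failing a ready-made tool, the core of such a monitor is only a small poller. Here is a minimal sketch in Python; the URLs are placeholders, and the print call is where a tray/desktop notification would be hooked in:

        import time
        import urllib.request

        SITES = ["http://www.example.com", "http://app.example.com"]  # placeholders
        CHECK_INTERVAL = 60  # seconds between sweeps

        def is_up(url):
            """Treat any 2xx/3xx response within 10 seconds as 'up'."""
            try:
                with urllib.request.urlopen(url, timeout=10) as resp:
                    return 200 <= resp.getcode() < 400
            except Exception:
                return False

        while True:
            for url in SITES:
                if not is_up(url):
                    print("DOWN:", url, time.ctime())  # replace with a notification
            time.sleep(CHECK_INTERVAL)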

  • Buying a custom domain for Blogger

    - by John Demetriou
    I am about to move my Blogger site to a custom domain. I follow all the steps as instructed, but whenever I find the perfect custom domain (one that is available), I get redirected to Google Apps for Business. Is it necessary to get Google Apps for Business before buying a custom domain? If I only start a free trial of Google Apps for Business, will my custom domain still be valid when the trial period expires?

  • My Xmap-generated sitemap is not being submitted

    - by user2014989
    I'm using the Joomla Xmap component for creating a sitemap. Here is the URL of my Xmap-generated sitemap: http://www.acethehimalaya.com/index.php?option=com_xmap&sitemap=1&view=xml I tried to submit my sitemap to Google, but the URL doesn't get submitted: Google reports that the sitemap is empty. Can Xmap-generated sitemaps not be submitted, or am I doing something wrong?
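
    One way to rule out the obvious is to fetch the sitemap exactly as Google would and count the <url> entries. A quick diagnostic sketch in Python:

        import urllib.request
        import xml.etree.ElementTree as ET

        SITEMAP = ("http://www.acethehimalaya.com/index.php"
                   "?option=com_xmap&sitemap=1&view=xml")
        NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

        # If this prints 0 (or the parse fails), Google's "empty sitemap"
        # complaint is accurate and the problem is on the Xmap side.
        root = ET.fromstring(urllib.request.urlopen(SITEMAP, timeout=30).read())
        print(len(root.findall("sm:url/sm:loc", NS)), "URLs found")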

  • How much time does Google Webmaster Tools need to generate content keywords if URL masking is enabled? [closed]

    - by user1439968
    Possible Duplicate: What is domain "masking" or "cloaking"? Why should it be avoided for a new web site? My real domain is domain.in, but URL masking has been enabled and the masked URL is domain2.in. I added the URL bputdoubts.21backlogs.in to Google Webmaster Tools a week ago, but content keywords haven't been generated. In this case, when can I expect the content keywords to be generated? And is it a problem for getting visitors from Google search if URL masking is enabled?

  • Automatically keep your local git repos clean

    - by kerry
    Most developers using git are probably aware of the command 'git gc' that has to be run from time to time when you notice your git commands are running a little slow. This command cleans up your git repo and makes sure everything is nice and tidy. If you have not run this command lately, you will notice a huge performance increase in your git commands after running it. It's a bit annoying to have to run this command when you notice that your git performance is suffering, and the command also takes a while if you have not run it recently. With this in mind, I decided to create a method to automatically run this command from time to time. I decided to overload cd, similarly to how rvm does. All you have to do is paste the method into your .profile file and it will run the command every time you enter a directory with a git repo. You'll notice a little pause when entering the directory; it's not insufferable, but if you prefer, you can add an & to the end of the command to have it run in the background. I chose the pause over the pid output of the background command. Here it is in all its glory. View the code on Gist.
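
    The author's actual code lives in the linked Gist; the sketch below is an assumed reconstruction of the idea for a ~/.profile, not a copy of it:

        # Overload cd: after changing directory, quietly tidy any git repo.
        cd() {
            builtin cd "$@" || return
            if [ -d .git ]; then
                # --auto only does real work when git's own thresholds say so.
                git gc --auto
            fi
        }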

  • How can I determine the trending pages on my site?

    - by Dogweather
    I'm looking to see what the "hot" pages are on one of my sites. I want to see, for various timeframes, what the top 50 pages are. I'm going to create a data feed with this info, which will be input to another app. I have Apache logs and complete control of the machine to install what I want. I'm mostly wondering whether there's something out there already that I can use, or, if I have to implement it myself, what good algorithms or strategies might be. Thanks.
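
    If nothing off the shelf fits, the core counting is small enough to implement directly. A sketch in Python that tallies the top 50 paths from an Apache access log in Common Log Format; the log path is a placeholder, and timeframes can be handled by pointing it at rotated logs:

        import re
        from collections import Counter

        LOG = "/var/log/apache2/access.log"  # adjust to your vhost
        # The request line is the quoted part: "GET /some/path HTTP/1.1"
        request_re = re.compile(r'"[A-Z]+ (\S+) HTTP/[\d.]+"')

        counts = Counter()
        with open(LOG) as f:
            for line in f:
                m = request_re.search(line)
                if m:
                    counts[m.group(1)] += 1

        for path, hits in counts.most_common(50):
            print(hits, path)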

  • I've changed my URL schema. How do I tell Google to index the new schema and forget the old one?

    - by growse
    I had a site where the URLs were constructed like this:

        /index.php/Topic
        /index.php/AnotherTopic

    These were indexed in Google, and search results were returned that pointed to them. However, I've recently replatformed that site and reconfigured it so the above URLs became:

        /index.php?title=Topic
        /index.php?title=AnotherTopic

    The original URLs are returning 404s. The site links to the correct URL schema internally, but Google is retaining the original schema in its search results. I've updated and resubmitted the sitemap, which only contains the new schema. Also, Google Webmaster Tools is going slightly bananas at the fact that there's now a spike in 404 errors in its crawl results. What would be the best approach to get Google to 'forget' about the old schema and index the new schema instead? Should I try blocking /index.php/ in robots.txt? Should I be returning 301 codes instead of 404 for the original URLs?
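
    Returning 301s is the standard answer: a permanent redirect tells Google the old URL has moved, carries its standing across, and drains the 404 report over time. A mod_rewrite sketch, assuming Apache and that the rule can live in the site's root .htaccess (test before deploying):

        RewriteEngine On
        # Old path-style URL -> new query-string URL, permanently.
        RewriteCond %{REQUEST_URI} ^/index\.php/(.+)$
        RewriteRule . /index.php?title=%1 [R=301,L]

    Blocking /index.php/ in robots.txt would be counterproductive here: Google could no longer crawl the old URLs, so it would never see the redirects and the stale results would linger.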

  • Revamped Joomla site to Google search engine

    - by user3127632
    I am about to upload a revamped Joomla site (an update from 1.5 to 2.5, plus changes). I currently have a test-bed subdomain that I am working on. In a few days I will do the swap and replace the old site with the new one. I am worried about search engines, and specifically Google. The site currently has a very good rank (it appears 2nd in the search); what actions do I have to take in order to preserve the rank after the update (other than submitting the new sitemap, I guess)? It's not a difficult task, but because I don't have the option of getting it wrong, I am asking for more "expert" advice.

  • Google shows "search instead for" when searching for our website

    - by Athanatos
    Our website is new, and its name is similar to another website's (only one letter different), though it is a completely different type of site and company. Searching for xxxxxA works OK in Google and we find relatively good results. However, searching for xxxxxA.com finds results for the other website and gives us the following options: "Showing results for xxxxxE.com. Search instead for xxxxxA.com" (a hyperlink which, when clicked, correctly searches for our site). Questions: Do we need to contact Google to correct this, and if yes, how? If not, will it be corrected automatically as the site becomes more popular, and what is the process? How can we make the process quicker?

  • Reflector Pro Cometh

    Reflector 6 is here. Nick Harrison is a long-time Reflector enthusiast, and has been responsible for writing an add-in. As he'd helped test the new version, Nick asked to review it for Simple-Talk. The team were anxious to know what he thought. They needn't have worried.

  • Creating WPF Prototypes with SketchFlow

    Prototyping with Sketchflow transforms what was once a frustrating and time-consuming chore. With SketchFlow, WPF prototypes can be created and changed with amazing ease. SketchFlow is WPF's secret weapon. Well, it was secret until Michael Sorens produced this article.

  • Programmer logbook application?

    - by jsoldi
    I've just released my application to the public, and I'm working on an updated version, but I really think I should keep track of ALL the code changes. If some functionality suddenly starts failing, a history of all the changes I made would make it a lot easier to figure out where I messed up, assuming the problem wasn't already there. The ideal would be to have a super fast computer with a huge hard drive and an application that automatically saves a backup of the whole project every time I change a line in the code, with some file-comparison tool that would show me every difference between any two backed-up projects, but that's not really possible for now. So, do you know any application that makes it easy for a programmer to keep track of the changes made to the source code?
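
    What is being described here is exactly what a version control system automates. A minimal sketch of the workflow with git (the file name is a placeholder; any modern VCS offers the same):

        git init                              # once, in the project root
        git add -A
        git commit -m "Describe what changed" # after every meaningful change
        git diff HEAD~1 HEAD                  # every difference between two commits
        git log -p -- Some/File.cs            # full change history of one file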

  • How to delete all your old website data from the internet?

    - by Akky Awesøme
    I had my website at rohbits.com, but for various reasons I had to delete it and recreate it at this URL: www.rohbits.com/blog. My problem is that the old links are still visible in Google search results, and when people click on those links they land on a 404 error page belonging to the hosting company. I want to either delete all the previous data from the search engines or have a 404 error page of my own, so that I can tell my visitors where the actual website is. I have already redirected all the traffic that comes to rohbits.com to www.rohbits.com/blog, but when visitors click on the expired links, they get the hosting company's error page. One sample expired link is this one: http://rohbits.com/wordpress-tricks.
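
    Both wishes come down to a few lines of Apache configuration, assuming the host honours .htaccess files; the /blog/... targets below are assumptions to adapt:

        # Serve your own page for dead links instead of the host's 404.
        ErrorDocument 404 /blog/not-found.html

        # Better still: 301 each old permalink to its new home, e.g.
        RewriteEngine On
        RewriteRule ^wordpress-tricks/?$ /blog/wordpress-tricks/ [R=301,L]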

  • How can I ease the work of getting pixel coordinates from a spritesheet?

    - by ThePlan
    Spritesheets are usually easier to use and very memory-efficient, but the problem I always have is getting the actual position of a sprite within the sheet. Usually I have to throw in some approximate values and adjust them several times until I get it right. My question: is there a tool that can basically show you the coordinates of the mouse relative to the image you have opened? Or is there a simpler method of getting the exact rectangle that a sprite is contained in?
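
    If no dedicated tool is at hand, a few lines of pygame do exactly this: open the sheet and print the pixel coordinate of every click. A sketch, assuming pygame is installed and "sheet.png" stands in for your spritesheet:

        import pygame

        pygame.init()
        sheet = pygame.image.load("sheet.png")
        screen = pygame.display.set_mode(sheet.get_size())
        screen.blit(sheet, (0, 0))
        pygame.display.flip()

        running = True
        while running:
            for event in pygame.event.get():
                if event.type == pygame.QUIT:
                    running = False
                elif event.type == pygame.MOUSEBUTTONDOWN:
                    print("clicked at", event.pos)  # (x, y) within the image
        pygame.quit()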

  • Feature Usage Reporting in Early Access Programs

    After doing Web development, you can get very used to the luxury of having basic information about your users' machines and browsers. With their permission, you can also get the same information from an application, and can even get more targeted anonymous information that will tell you how the features are used. Kevin explains how this can be used with early access builds to improve the reliability and usability of applications.

  • How to get rid of crawling errors due to the URL Encoded Slashes (%2F) problem in Apache

    - by user14198
    The Google web crawler has indexed a whole set of URLs with encoded slashes (%2F) for our site. I assume it picked up the pages from our XML sitemap file. The problem is that the live pages will actually result in a failure because of the URL Encoded Slashes Problem in Apache. Some solutions are mentioned here. We are implementing a 301 redirect scheme for all the error pages. This should make the Google bot remove the pages from the crawling errors (no more failing pages). Does implementing the 301s require the pages to be "live"? In that case we may be forced to implement solution 1 from the article. The problem is that solution 1 will pose a security vulnerability.
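
    For what it's worth, a 301 does not require the page behind it to be live: the redirect is answered before any handler runs. The usual prerequisite is letting requests containing %2F reach the rewrite engine at all. A sketch for the server/vhost configuration; behaviour varies across Apache versions, so treat this as a starting point and test carefully:

        # Stop Apache rejecting %2F outright, without decoding it.
        AllowEncodedSlashes NoDecode

        RewriteEngine On
        # Example: fold the encoded slash back into a real one and 301.
        RewriteRule ^(.*)%2F(.*)$ /$1/$2 [R=301,L,NE]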

  • Should SQL Server tools target wide screen formats instead of portrait formats?

    - by Greg Low
    There was a short discussion on the SQL Down Under mailing list this morning about screen resolutions for working with the SQL Server tools. In particular, the issue was about how unusable the tools are on the 1366x768 resolution notebooks that now seem to be the most common. While finding a notebook with an appropriate resolution is obviously the answer at this time, I started thinking that the product itself needs to address this. SQL Server tools currently target a portrait 4:3 shape for minimum...(read more)

  • Creating an online community - use templates or self-develop?

    - by ican ican
    PHPMotion, Joomla, or develop my own? I'm thinking of developing a common-interest online community. It will have UGC, stats, and similar functionality, and perhaps an online store. Though cost is an issue at this time, I want to be professional and effective. Should I use an existing free platform like PHPMotion or Joomla, or should I develop my own? What are the advantages and disadvantages of each option? As a rough estimate, how much will it cost me to develop and manage my own, and how long will it take? In general, what should I be careful about on this journey?

  • How can I track hits to areas of my web application?

    - by Tyson
    We have a growing web application, and we currently use Google Analytics and Chartbeat to track usage and engagement (although we're open to alternatives). Unfortunately, both are geared towards content-based sites where everything is about the URL. Our URLs contain object IDs, making them less useful independently, and causing us to grow beyond Google Analytics' 50,000 unique URLs per day. How can we track hits to areas of our web application, essentially ignoring parts of the URLs?
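
    The usual fix is to collapse the IDs before the hit is reported, so every page in an area maps onto one logical URL. A sketch of the normalization in Python; how the result is fed to the tracker (e.g. as a virtual pageview path) depends on the analytics API in use:

        import re

        ID_RE = re.compile(r"/\d+(?=/|$)")

        def normalize(path):
            """Collapse numeric object IDs: /profile/151 -> /profile/:id."""
            return ID_RE.sub("/:id", path)

        assert normalize("/profile/151") == "/profile/:id"
        assert normalize("/orders/42/items/7") == "/orders/:id/items/:id"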

  • Re-indexing a website with clean URLs

    - by artsi
    So I have a website with URLs like this:

        http://www.domain.com/profile.php?id=151

    I've now cleaned them up with mod_rewrite into this:

        http://www.domain.com/profile/firstname-lastname/151

    I've fetched and re-indexed my website after the change. What is the best way to make the old dirty ones disappear from search results and keep the clean ones? Is blocking profile.php with robots.txt enough?
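
    Blocking profile.php in robots.txt is not enough, and is arguably counterproductive: Google would stop crawling the old URLs and so never learn that they moved. A 301 from each dirty URL to its clean twin is the reliable route. A sketch for the top of profile.php, where slug_for() is a hypothetical stand-in for however the firstname-lastname slug is looked up:

        <?php
        // Visitors arriving on the old /profile.php?id=N form get a
        // permanent redirect; requests internally rewritten from the
        // clean URL (REQUEST_URI starts with /profile/) pass through.
        if (strpos($_SERVER['REQUEST_URI'], '/profile.php') === 0 && isset($_GET['id'])) {
            $id = (int) $_GET['id'];
            header('Location: /profile/' . slug_for($id) . '/' . $id, true, 301);
            exit;
        }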
