Search Results

Search found 9,717 results on 389 pages for 'pro metedor' (page 99 of 389).


  • Why are pages intermittently disappearing from SERPs?

    - by Beebee
    I have a very basic informational site with about 50+ pages; specifically, there is a page about each site. I have been tracking the pages daily for months, and they had been steadily rising in the SERPs. Suddenly, about 3-4 weeks ago, about 20 pages went completely missing from the SERPs, though they were still indexed (I could Google a specific piece of text from the page, in quotes, and it would appear). Since then, pages have been continually added to and removed from the SERPs, cutting my traffic by about 50%. All I have been adding is information about the site's topic. I haven't stolen content or done anything else that seems like it would cause this.

    Read the article

  • What's the ideal setup for a news minisite for an app [closed]

    - by Leonardo Amigoni
    I am a mobile app developer, and I would like my application to check for news about new updates of the app when the user opens it. I am unfamiliar with how to check, against a server, whether the news items are actually new or have already been read. If they have not been read, I could of course display them in the app. Can anyone point me in the right direction on how to achieve this? Something similar to an RSS feed, but on mobile. Thanks
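
    A hedged sketch of one way to do this: the server publishes a small JSON feed whose items carry incrementing ids, and the app remembers the last id it displayed, showing only newer items. Written in Python purely to illustrate the protocol; the URL and field names are made up.

        import json
        from urllib.request import urlopen

        FEED_URL = "http://example.com/app-news.json"  # hypothetical feed
        last_seen = 3   # in the app, read this from local storage

        with urlopen(FEED_URL) as resp:
            feed = json.load(resp)

        unread = [item for item in feed["items"] if item["id"] > last_seen]
        for item in unread:
            print(item["title"])                 # display these in the app
        if unread:
            last_seen = max(item["id"] for item in unread)  # persist back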

    Read the article

  • Proper way to create and work with a subdomain?

    - by Genadinik
    My site got affected by Panda, and I am trying to see if moving content to a subdomain would help. The site is comehike.com, and I created a subdomain, currently empty, at hiking.comehike.com. I have a directory /outdoors that holds some high-quality, hand-written articles, and I want to put those on the new subdomain to see what happens. My questions are:
    1. Should I just copy the files for those pages into the new subdomain's folder and change all the links in my pages from the original domain to the new subdomain?
    2. Or should I do a 301 redirect to the new subdomain (see the sketch below)?
    3. Since test.site.com and www.site.com are different domains, will the new pages have to start from scratch in terms of PageRank and their rankings in the SERPs?
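
    On question 2: a 301 is the standard way to transfer rankings, since copied pages would start from scratch and also create duplicate content. A minimal sketch, assuming an Apache host and that the articles keep the same file names on the subdomain:

        RewriteEngine On
        # send each /outdoors article to its copy on the subdomain
        RewriteRule ^outdoors/(.*)$ http://hiking.comehike.com/$1 [R=301,L]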

    Read the article

  • Seeking .htaccess help: Converting multiple subdomains (both HTTP and HTTPS) to www.domain.com using .htaccess

    - by Joshua Dorkin
    I've been trying to get an answer to this question on other forums (the folks at SuperUser thought this was the place I needed to post) and via my connections, but I haven't gotten very far. Hopefully you guys can help me find an answer. I've got a dozen old subdomains that have been indexed by Google, as both HTTP AND HTTPS. I've managed to redirect all the subdomains properly, provided they are not HTTPS, but I can't get any of the HTTPS subdomains to properly redirect. Here's the code I'm using:

        RewriteCond %{HTTP_HOST} ^subdomain1.mysite.com$ [NC]
        RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]
        RewriteCond %{HTTP_HOST} ^subdomain2.mysite.com$ [NC]
        RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]
        RewriteCond %{HTTP_HOST} ^subdomain3.mysite.com$ [NC]
        RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]

    This works great until someone goes to https://subdomain2.mysite.com, which is not redirected back to http://www.mysite.com. How can I get this to work? Additionally, I'm guessing there is an easier way to make it happen than setting up a dozen pairs of RewriteCond/RewriteRule. Is there any way to do this in just a few lines, including one where I list all the subdomains? I'd actually also like to redirect everything on https://www.mysite.com to http://www.mysite.com except for three folders: mysite.com/secure, mysite.com/store, and mysite.com/user. Is there any good way to add this to the .htaccess file?
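
    A consolidated sketch (untested): the first rule pair catches every non-www host under mysite.com, HTTP or HTTPS; the second pushes HTTPS back to HTTP outside the three protected folders. One caveat: for the HTTPS redirects to fire at all, the server must present a certificate valid for those subdomains, otherwise browsers show a warning before mod_rewrite ever runs.

        RewriteEngine On
        # any host other than www.mysite.com is 301'd to it
        RewriteCond %{HTTP_HOST} !^www\.mysite\.com$ [NC]
        RewriteCond %{HTTP_HOST} \.mysite\.com$ [NC]
        RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]
        # push HTTPS back to HTTP except under /secure, /store, /user
        RewriteCond %{HTTPS} on
        RewriteCond %{REQUEST_URI} !^/(secure|store|user)(/|$)
        RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]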

    Read the article

  • Which token from a long User-Agent should I use in robots.txt?

    - by Gaia
    The definition of User-Agent states that several tokens can be included, as deemed necessary by the client. I want to block certain bots via robots.txt, and I am confused as to which part of the User-Agent string to use, especially for the more obscure bots. For example:

        Mozilla/5.0 (compatible; uMBot-LN/1.0; mailto: [email protected])
        JS-Kit URL Resolver, http://js-kit.com/
        Mozilla/5.0 (compatible; SEOkicks-Robot +http://www.seokicks.de/robot.html

    Do I use the second token? Can tokens contain spaces, or did the SEOkicks folks forget a semicolon after SEOkicks-Robot? I don't actually intend to make my question specific to a couple of bots - I want to know the guideline: which part of the UA do I place in robots.txt for these exotic bots with UAs as long as a haiku?

        User-agent: uMBot-LN/1.0
        Disallow: /

    PS: Thank you, but I do not need to hear that undesirable bots are better blocked with mod_security. I already have commercial mod_sec rules in place.
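
    As a rule of thumb, crawlers that honor robots.txt match on their own product token (the name before the slash), case-insensitively, so version numbers and contact details can be dropped. A hedged sketch for the two bots above, assuming they obey robots.txt at all:

        User-agent: uMBot-LN
        Disallow: /

        User-agent: SEOkicks-Robot
        Disallow: /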

    Read the article

  • Yahoo search: different results shown in two identical searches

    - by Marco Demaio
    Hello, simple question: searching on http://www.yahoo.it for villa matrimonio bologna, I noticed Yahoo shows different results. You need to retry a few times to reproduce this, maybe exiting the browser and opening it again, or searching once, clearing browser cookies, and then searching again (it's even easier to test if you use two different browsers at the same time to search for the same phrase). To make this easy to reproduce, here are the queries shown in the address bar after the search, so you can just click on these to see the results:

        http://it.search.yahoo.com/search;_ylt=AirvLYKvBMPP_6MpAmONN14brK5_?vc=&p=villa+matrimonio+bologna&toggle=1&cop=mss&ei=UTF-8&fr=yfp-t-709
        http://it.search.yahoo.com/search;_ylt=AirvLYKvBMPP_6MpAmONN14brK5_?vc=&p=villa+matrimonio+bologna&toggle=1&cop=mss&ei=UTF-8&fr=sfp

    Note that the last parameter, fr, is different, but Yahoo set it (not me); I don't even know what it means. You can see in the search box that the searched phrase is IDENTICAL in both cases. So why is Yahoo giving different results for the same search phrase? I used the same browser and performed the test within a few minutes by simply trying more than once. You may also notice that the number of results returned (written on the left side of the page) is different: the first search returns 274K results, the second 5.38M. You might think this is just an error on Yahoo's part, but for almost a year now, while looking once in a while at how some websites rank on Yahoo and also Google, I have noticed that two searches for the same phrase can show different results even on the same day, a few minutes or hours apart. I couldn't reproduce this behaviour on Google, so I can't say for sure, but since it seems to have happened more than once, I was wondering if any of you have noticed it too. Do you know if this is normal behaviour for search engines? Because if it's normal (and it's just me noticing it only now), I wonder how you can tell how well a site is ranking on a search engine; you could even see one of your customers' websites ranking differently from what the customer sees on his own PC.

    Read the article

  • How to get the IP addresses of Google's search/crawler bot to add to our firewall whitelist

    - by Jayapal Chandran
    Hi, Google Webmaster Tools says "network unreachable". When I contacted my hosting provider, they said they have installed a firewall which can block frequently recurring incoming IP addresses, and they don't know Google's IP addresses to unblock. So they asked me to find the Google search/crawler bot's IP addresses so they can add them to their whitelist. How do I find the IP addresses of the Google search or crawler bot? My site has stopped appearing in Google search, and my hits have dropped sharply. What should I do? Any kind of reply would be helpful.
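
    Google deliberately does not publish a fixed list of crawler IPs; its documented advice is to verify Googlebot with a reverse-DNS lookup followed by a forward confirmation. A minimal sketch in Python; the sample address is from a range Googlebot is known to use, but treat any hard-coded IP as illustrative:

        import socket

        def is_googlebot(ip):
            """Google's documented check: reverse DNS, then forward-confirm."""
            try:
                host = socket.gethostbyaddr(ip)[0]      # reverse lookup
            except socket.herror:
                return False
            if not host.endswith((".googlebot.com", ".google.com")):
                return False
            try:
                return ip in socket.gethostbyname_ex(host)[2]  # forward-confirm
            except socket.gaierror:
                return False

        print(is_googlebot("66.249.66.1"))  # an address Googlebot has used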

    Read the article

  • What is the proper way to setup Google Apps email accounts for a subdomain?

    - by binaryorganic
    Let's say I've registered domain.com at enom and set it up to use Google Apps for email by pointing DNS to enom's servers and editing the MX records there. That works flawlessly. Now let's say I want to have email at a subdomain of that same site. I already have a working subdomain at the host, but I want to catch email traffic at enom before it gets that far. I've set up Google Apps as a new account for the subdomain, successfully verified domain ownership, and now they want me to update the MX records. What's the right format? For domain.com, I just put @ for the hostname and then provided the Address and Pref values that Google gave me. I tried putting subdomain.domain.com as the new hostname values for the subdomain, but that doesn't seem to work. What am I doing wrong?
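
    At most registrars, enom included, the host-name field is relative to the zone, so the usual fix is to enter just the bare label (subdomain), not the fully qualified subdomain.domain.com. In zone-file terms, and using the MX hosts Google Apps has historically handed out (copy the exact values from your own setup screen rather than from here), the records would look something like:

        subdomain  IN  MX  1   ASPMX.L.GOOGLE.COM.
        subdomain  IN  MX  5   ALT1.ASPMX.L.GOOGLE.COM.
        subdomain  IN  MX  5   ALT2.ASPMX.L.GOOGLE.COM.
        subdomain  IN  MX  10  ASPMX2.GOOGLEMAIL.COM.
        subdomain  IN  MX  10  ASPMX3.GOOGLEMAIL.COM.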

    Read the article

  • Free forum engine with good anti-attack mechanisms

    - by macias
    I am looking for a forum engine (for discussions) with good attack countermeasures built in. Windows (preferably) or Linux. Free (as in beer). I am thinking about registration flooding and account-blocking attacks. For registration, such an engine should have at least:
    - a captcha
    - blocking of multiple registrations from the same IP
    - separate login (for logging in) and user name (for displaying the author of posts)
    For logging in:
    - no lock-out on multiple tries -- instead, after X tries, a token is sent via mail, the third piece needed for the next login; without it, logging in will be impossible (similar to an activation process)
    The engine should be designed with two ideas in mind: protecting the engine against attacks, and zero penalty for decent users. Thank you in advance for your help and recommendations.

    Read the article

  • Do AdWords conversions only track AdWords visitors?

    - by atticae
    I have set up an AdWords conversion and put the conversion-tracking JS code on a test page. However, I don't see any conversions tracked when I visit it. Does AdWords conversion tracking only register a conversion if the visitor came to the site by clicking on an AdWords campaign? Google's help page advises me to test the code by clicking through a campaign. However, I would like to use the tracking to track all conversions, not just those from AdWords. I considered using Analytics as well, but it seems you can only define goals by URL there, not via JS, which would mean restructuring part of my site (because currently a conversion does not necessarily happen on a distinct URL). Is AdWords tracking not a viable solution for tracking all visitor conversions on my site?
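
    On the Analytics side: the classic workaround for conversions that have no URL of their own is a "virtual pageview", fired from JavaScript at the moment of conversion and then configured as a URL goal in Analytics, so no page restructuring is needed. A hedged sketch using the asynchronous (_gaq) syntax; the virtual path is made up:

        <script type="text/javascript">
          // report a pageview for a URL that doesn't really exist, then
          // set '/virtual/conversion' as the goal URL in Analytics
          _gaq.push(['_trackPageview', '/virtual/conversion']);
        </script>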

    Read the article

  • How to verify real people?

    - by Gerben Jacobs
    For a music community, I want bands to be able to verify themselves. What is the best way to do this? For example, I could let the record label mail me, but some bands are independent. I could also ask them to put a code on their website or Facebook page and then check it manually. I'm not necessarily looking for a watertight solution, so no scanning of real-life documents, and I'm okay with doing the checks manually. In other words, how can you verify real people by their virtual presence?

    Read the article

  • General questions about link spam

    - by hen3ry
    Hello, a CMS-based site I manage is suffering from a small but ominously growing number of almost certainly bot-emplaced, invisible spam links placed in registered-user-only shoutboxes and user forums. "Link spam", yes? Until recently, I've kept my eyes on narrow tech issues, and I'm having trouble understanding what's going on. I understand that we need to tighten up our registration procedures, but more generally:
    - Do I understand correctly that our primary interest in combatting link spam on our site is that major search engines reduce or zero the search visibility of sites that contain it? Although we're non-commercial, we don't want to be at the bottom of the rankings, or eliminated altogether.
    - Are the linked-to sites the direct beneficiaries of the spam links, or is there some kind of indirection?
    - What is the likely relationship between the link spammers and the owners of the (directly or indirectly) linked-to sites? Are the owners of the linked-to sites paying the link spammers for higher visibility? Are the owners aware that this method is being used?
    - It is my impression that major search engines are capable these days of detecting that a given site is being promoted by link spam, and that such a site may consequently be reduced in search rank or dropped altogether. Do these sanctions occur frequently?
    - Is there any potential value in sending notifications to the owners of the linked-to sites that their visibility is at risk?
    TIA, hen3ry

    Read the article

  • Facebook share and Open Graph

    - by hannit cohen
    I'm managing a video blog. The blog contains a main YouTube video on the homepage and several thumbnail images of other videos elsewhere. I have a share button for the main page and the single-post pages, and have added Open Graph tags to specify which image should be used. For some reason Facebook ignores my Open Graph image and uses other images it finds in the page. The header looks like this (for the homepage; the og:title and og:description values are Hebrew text, mangled into question marks in this transcription):

        <!-- Facebook Opengraph -->
        <meta property="fb:app_id" content="155967927783206" />
        <meta property="og:url" content="http://mttv.co.il/2010/12/%d7%9e%d7%a4%d7%92%d7%a9-%d7%a1%d7%99%d7%99%d7%a2%d7%95%d7%aa-%d7%92%d7%a0%d7%a0%d7%95%d7%aa-%d7%9c%d7%93%d7%95%d7%a8%d7%aa%d7%99%d7%94%d7%9d-1959-2010-%d7%91%d7%9e%d7%a2%d7%9c%d7%95%d7%aa/"/>
        <meta property="og:title" content="???? ?????? ????? ???????? 1959-2010 ??????" />
        <meta property="og:description" content="???? ????? ?? ?????? ????? ?????? ????????? ???? ?????? 1959 ??? 2010 ?????? ????? ??????? ???' ??? ??? ????? ???? ??????? ???????? ?????? ?&quot;? ???? ??? ?&quot;? ????? ?????? ????? ???? 08.12.10 " />
        <meta property="og:type" content="article" />
        <meta property="og:image" content="http://www.mttv.co.il/wp-content/uploads/2010/12/Gvi91UEjCAw_mid-135x77.jpg" />

    The website address is http://mttv.co.il. Any help will be appreciated.
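
    Two hedged suggestions, since the live page can't be inspected here: Facebook caches the first scrape of a URL, and its URL debugger at developers.facebook.com/tools/debug both shows exactly which tags the scraper sees and forces a fresh scrape; a stale cache is a common reason a changed og:image is ignored. As a fallback, Facebook's scraper has also honored the older image_src hint:

        <!-- pre-Open-Graph image hint, honored as a fallback by some scrapers -->
        <link rel="image_src" href="http://www.mttv.co.il/wp-content/uploads/2010/12/Gvi91UEjCAw_mid-135x77.jpg" />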

    Read the article

  • Domains with similar names and issues

    - by abel
    I recently purchased one of those domain names like del.ico.us. While registering, I found that delicious.com was already in use. Argument: delicious.com belongs to the same category as my to-be website; it serves premium delicious dishes. Counter-argument: my to-be domain, though belonging to the same category, specializes in serving free but delicious dishes, or in giving out (affiliate) links to other sites serving premium delicious dishes. Additional counter-arguments: 1. delicious.com is not in English. 2. The del.icio.us in my domain name, though having the same spelling, is not going to be used in the same fashion. For example (this may not make sense, because the names have been changed), the d in delicious on my website actually stands for the Greek letter Delta (Δ/δ), and since internationalized domains are still not easily typable, I am going for the English equivalent. The prefix holds importance for the theme of the service which my website intends to offer. My question: Can I use the domain name del.icio.us for my website? How are these kinds of matters dealt with? (The domain names used are fictitious. And I have already registered the domain but have not started using it. I chanced upon this domain name because it was short, easy to remember, and suited the theme of my website.)

    Read the article

  • Error when setting up Piwik analytics

    - by bertran
    I've uploaded the latest version of Piwik onto my web server, which is hosted by GoDaddy.com on a Linux hosting plan. I'm setting it up (accessing it from my browser as instructed) and I have the Piwik installation page open at step 3 (database set-up) of 9. I don't know what to input in the "database server" field; the default is 127.0.0.1. When I leave that input as is and click Next, it gives the error:

        Error when trying to connect to database server: SQLSTATE[HY000] [2013] Lost connection to MySQL server at 'reading initial communication packet', system error: 111

    Changing that input to "localhost" gives me another error:

        Error when trying to connect to database server: SQLSTATE[HY000] [2002] Can't connect to local MySQL server through socket '/var/lib/mysql/mysql.sock' (2)
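
    Both errors point the same way: on GoDaddy's shared Linux hosting, MySQL does not run on the web server itself, so neither 127.0.0.1 nor localhost will work. Enter the "Host Name" shown on the MySQL database details page in GoDaddy's hosting control center instead. For reference, a sketch of the [database] block Piwik writes to config/config.ini.php once setup succeeds; the host below is a made-up example:

        [database]
        host     = "example-db.secureserver.net" ; copy the real host from GoDaddy
        username = "piwik_user"
        password = "secret"
        dbname   = "piwik"
        adapter  = "PDO_MYSQL"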

    Read the article

  • Is Movable Type among the most secure PHP blogs? How secure are the various PHP blog applications?

    - by user6025
    Basically I'm trying to find a blog engine for a website, and security is the highest priority in our case. We don't need any features that I would imagine are unusual. Wordpress was our first idea, but its reputation precedes it, and though it may have cleaned up its act lately, I'm not seeing much solid evidence. I get the impression that Movable Type (at least the Perl version) has a much better reputation for security than Wordpress (historically at least). I'm not sure I want to take a chance with Wordpress at this point, but is there some objective source I can go to to back up (or counter) the notion that MT is at least among the best? Secunia doesn't recommend using their stats for comparisons, and securityfocus.com doesn't have stats at all that I can see. Searching http://web.nvd.nist.gov makes MT look way better than WP (at least in 2007), but that site was referenced by MT's own page boasting about their security, so I don't know how relevant it is or how seriously people take it. Any suggestions on sites where I could make a somewhat objective comparison?

    Read the article

  • 2 Google Analytics profiles for 2 sections of the same site

    - by sam
    I've got a website which for the most part is a portfolio. Another section of the site, mysite.com/micro-site, ranks extremely well for its chosen term/topic and brings in lots of traffic, but actually has little to do with the core business. It was really made as a piece of content, in the same way sites like http://chrome.com/campaigns/rollit are. For the main site I use one Google Analytics profile and set of tags; for the micro site I have a completely different profile and set of tags. The main reason I've done this is that the traffic stats and insights for the micro site are essentially just noise; it's nice to have the traffic, but it doesn't help when reading analytics reports, so if they were combined my reports would be a mess. Are there any disadvantages or negatives to doing this?

    Read the article

  • Methods of Geotargeting and optimising for location

    - by Switchfire
    This is somewhat an SEO question and somewhat a general web-developer question. Our company website is pretty awful, and I'm currently redesigning the new one. The problem is it has a directory called regions, which contains pages for around 200 different locations around the country, all stuffed to the brim with keywords and useless things. Some of these pages work, and the traffic is good enough to keep a few of them. Apart from creating a page for every location again, is there another way of targeting all these locations without having to create a new page for each, i.e. a more dynamic way to do it? Any ideas or suggestions?
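
    One common pattern, sketched here on the assumption of an Apache host: keep a single location template and rewrite the region URLs onto it, so every location keeps its own crawlable URL without 200 hand-maintained pages. region.php and its parameter name are hypothetical:

        RewriteEngine On
        # /regions/manchester -> region.php?location=manchester
        RewriteRule ^regions/([a-z-]+)/?$ region.php?location=$1 [L,QSA]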

    Read the article

  • Site migration and SEO impact

    - by John Smith
    I'd greatly appreciate a response to the following question about site migration and SEO impact. Here's some background on how my domain name and site are currently configured.
    My domain name provider has the following settings:
    - host name @ is an A record and points to IP address x.x.x.x
    - host name www is an A record and points to IP address x.x.x.x
    - sub-domain host name new.example.com is an A record and points to IP address x.x.x.x
    My hosting provider has the following settings:
    - host record @ is an A record and points to IP address x.x.x.x, folder home/public_html/old
    - host record www is a CNAME record and points to example.com
    - sub-domain host record new.example.com points to home/public_html/new
    I want to:
    - point the domain (example.com AND www.example.com) to the content hosted under folder home/public_html/new, which is currently the content directory for new.example.com
    - retire the content hosted under folder home/public_html/old
    - retire the sub-domain host record new.example.com
    I believe the easiest method of doing this is removing the sub-domain host record new.example.com, and changing the following line in the .htaccess file in home/public_html from

        # Change 'subdirectory' to be the directory you will use for your main domain.
        RewriteCond %{REQUEST_URI} !^/old/

    to

        # Change 'subdirectory' to be the directory you will use for your main domain.
        RewriteCond %{REQUEST_URI} !^/new/

    But I don't understand how this will impact my SERP position; ideally, I'd like it to remain the same. Research on this topic turned up the following Google page, which was no help, and this related StackExchange question, which suggests that it should not affect my SERP position (at least, not permanently). But I wanted to make certain with a more specific example, and hopefully contribute to the community at the same time. I'd appreciate any feedback on this. Is there a better or recommended method to migrate sites this way? Is there an SEO impact?
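
    One hedged addition to the plan above: before retiring the new.example.com host record, leave it resolving for a while and 301 its URLs to the main hostname, so anything Google has indexed (or anyone has linked) under the subdomain transfers instead of breaking. A minimal sketch for the same .htaccess:

        RewriteEngine On
        # transfer indexed subdomain URLs to the main hostname
        RewriteCond %{HTTP_HOST} ^new\.example\.com$ [NC]
        RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]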

    Read the article

  • Are Meta tags useful for SEO?

    - by Lynda
    Reading the search results for this or a similarly phrased question can lead to a lot of conflicting answers. Are there any meta tags that matter in SEO? From what I have read, I know that meta keywords are no longer used (or matter so little that they are not worth using), so don't worry about them. Meta description tags are not used for page ranking, but they can affect click-through rates, so they should be used and kept under 160 characters. I know the following meta tags exist:
    - author: the author's name and possibly email address
    - robots: to allow or disallow indexing by robots
    - copyright: the copyright date of the page
    How much do these meta tags matter, and are there other meta tags that should be included, including newer ones that may not be used by everyone yet, or that are used by only one of the big players like Google or Bing? Note: even if a meta tag doesn't matter for SEO but helps with click-through rates, the way the description tag does, feel free to include it in your answer.
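
    For concreteness, a minimal head section covering the tags discussed above; all values are placeholders. Of these, only the title carries real ranking weight, and the description mainly shapes the snippet searchers see:

        <head>
          <title>Unique, descriptive page title</title>
          <meta name="description" content="Under 160 characters; often shown as the snippet in results." />
          <meta name="robots" content="index, follow" />
          <meta name="author" content="Jane Doe" />
          <meta name="copyright" content="2012" />
        </head>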

    Read the article

  • How to configure name servers using Webmin on an unmanaged CentOS VPS

    - by John
    I want to configure my site's name servers and all the related stuff. I'm not able to find any good documentation for doing it straightforwardly, without understanding all the nitty-gritty. I wish I could afford a managed VPS; I feel like the odd one out looking for this documentation. I've followed the docs at these places: http://www.webtop.com.au/blog/how-to-setup-dns-using-webmin-2009052848 , http://www.beer.org.uk/bsacdns and https://www.virtacoresupport.com/index.php?_m=knowledgebase&_a=viewarticle&kbarticleid=134

    Read the article

  • SEO indexing with dynamic titles, keywords and descriptions

    - by Andrea Turri
    I'm working on a worldwide website (all on one single domain), so I'm wondering about creating dynamic titles, descriptions, keywords, and headings for each location. What I'm doing is getting information from the user's IP and showing, for example, a dynamic title:

        var userCity = codeToGetCityFromIP;
        <title>Welcome to userCity</title>
        // and the same for description, keywords and headings...

    Obviously the real code is different. I'd like to know whether creating multiple SEO-indexable variants based on cities is a good solution. I'm also using geolocation, and I do the same with the values it returns. Am I doing this right, or are there more effective ways to target different countries and cities without creating a separate website for each city in the world? Thanks.

    Read the article

  • Sending a small number of targeted emails: is it spamming?

    - by Alex Mor
    I have a directory website, and I want to send focused emails (a small number, fewer than 50 a month) to some of the businesses in my directory whose pages get many visitors. The intention is to let them know that many people are viewing their page, and to encourage them to update it and post information on it. How can I send this small number of emails without being flagged as spam? Also, should I send them from an email address on the website's domain, or would it be better to send from a personal address? That way, if an email does occasionally get tagged as spam, it won't hurt the website's reputation. Is this true?

    Read the article

  • How to protect SHTML pages from crawlers/spiders/scrapers?

    - by Adam Lynch
    I have a LOT of SHTML pages I want to protect from crawlers, spiders and scrapers. I understand the limitations of SSIs. An implementation of the following can be suggested, in conjunction with any technology or technologies you wish: the idea is that if you request too many pages too fast, you're added to a blacklist for 24 hours and shown a captcha instead of content on every page you request. If you enter the captcha correctly, you're removed from the blacklist. There is a whitelist so GoogleBot, etc. will never get blocked. Which is the best/easiest way to implement this idea? Server = IIS. Cleaning the old tuples out of a DB every 24 hours is easily done, so no need to explain that.
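
    A language-agnostic sketch of the core bookkeeping, written in Python for brevity (on IIS it would live in a handler or module sitting in front of the SHTML pages); all thresholds are illustrative, and the whitelist entry is a placeholder for crawler IPs you have verified yourself:

        import time

        WINDOW = 60            # seconds over which requests are counted
        LIMIT = 30             # pages per window before blacklisting
        BAN = 24 * 3600        # blacklist duration: 24 hours

        hits = {}              # ip -> timestamps of recent requests
        banned = {}            # ip -> time the ban expires
        WHITELIST = {"66.249.66.1"}  # verified crawler IPs (placeholder)

        def check(ip):
            """Return 'ok' to serve content, or 'captcha' to challenge."""
            now = time.time()
            if ip in WHITELIST:
                return "ok"
            if banned.get(ip, 0) > now:
                return "captcha"                    # still blacklisted
            recent = [t for t in hits.get(ip, []) if now - t < WINDOW]
            recent.append(now)
            hits[ip] = recent
            if len(recent) > LIMIT:
                banned[ip] = now + BAN              # blacklist for 24 h
                return "captcha"
            return "ok"

        # a correctly solved captcha would then call: banned.pop(ip, None)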

    Read the article
