Search Results

Search found 5380 results on 216 pages for 'webmasters'.


  • Strange spam posts not making sense

    - by Paaland
    I'm running a web site with a forum where one small part is open for posting from unregistered users. The site uses a captcha, but some spam posts still get through every day. Here is the thing: all of the messages follow the same pattern, but they also all come from different IPs. That makes me think this is some sort of automated, scripted "attack" from a botnet of some kind. The strange thing is that all the messages start with six random characters and contain a couple of links. The words have no meaning and the domains in the links do not even exist. Why would anyone spend time and resources spreading these things? Below you can see two of these messages:

    A5Zfs6 exrzvrbspntz, [url=http://nktqoqllnuab.com/]nktqoqllnuab[/url], [link=http://wtrenldadvsy.com/]wtrenldadvsy[/link], [http://rnlrqfgdvdot.com/]
    O2oLpL nqeffxhryfdk, [url=http://jutyurbpfxow.com/]jutyurbpfxow[/url], [link=http://jpcdtmdalpow.com/]jpcdtmdalpow[/link], [http://qopqwqxwjdjx.com/]

    Since all the messages come from different IPs, I can't see blocking those helping much. For now I'm considering just dropping all messages that follow this pattern, since it's quite easy to match with a regexp. Has anyone else seen these kinds of messages, or does anyone know the point of posting them?
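
    A minimal sketch (Python, with illustrative names) of the kind of pattern match described above: six alphanumeric characters, a nonsense word, then BBCode-style [url=...]/[link=...] tags:

        import re

        # Matches the observed shape: six alphanumeric characters, whitespace,
        # a nonsense word, a comma, then a BBCode-style [url=...]...[/url] pair.
        SPAM_RE = re.compile(
            r"^[A-Za-z0-9]{6}\s+\w+,\s*\[url=https?://[^\]]+\].*?\[/url\]",
            re.IGNORECASE | re.DOTALL,
        )

        def looks_like_spam(message):
            return bool(SPAM_RE.search(message))

        sample = ("A5Zfs6 exrzvrbspntz, [url=http://nktqoqllnuab.com/]nktqoqllnuab[/url], "
                  "[link=http://wtrenldadvsy.com/]wtrenldadvsy[/link], [http://rnlrqfgdvdot.com/]")
        print(looks_like_spam(sample))  # True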

    Read the article

  • Correctly indexing multiple domains with same content in Google and others

    - by AJweb
    I have a client with a dozen country-specific domains, like mydomain.co.uk, mydomain.fr, mydomain.de, etc. Most of these domains serve the same dynamic content (a shop) in a different language, but some, like .co.uk and .com, have the same language and content, except for some content customized to each country/domain on the front page, the contact page and a few others. I am aware that we should use the canonical link tag to mark that duplicated content, but we want the .co.uk site to be present in the UK (indexed in google.co.uk) and the .com site to be present in the US and other countries, or at least that is the goal. Is there anything we can do to "help" Google determine the geographical meaning of each domain? If we mark the .com and .co.uk sites with a canonical tag, do you know how Google will decide which one to show for a given search?
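
    One option worth considering instead of (or alongside) a canonical tag is rel="alternate" hreflang annotations, which tell Google which country/language each variant targets while letting both same-language sites stay indexed. A hedged sketch, using the domains from the question (the exact language-region codes are assumptions):

        <!-- on every equivalent page of each domain; shown here for the home page -->
        <link rel="alternate" hreflang="en-GB" href="http://mydomain.co.uk/" />
        <link rel="alternate" hreflang="en-US" href="http://mydomain.com/" />
        <link rel="alternate" hreflang="fr-FR" href="http://mydomain.fr/" />
        <link rel="alternate" hreflang="de-DE" href="http://mydomain.de/" />

    Geographic targeting for the generic .com can also be set in Google Webmaster Tools; the ccTLDs are already tied to their countries.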

    Read the article

  • Tag link suggestion plugin for WordPress?

    - by Emerson
    Hi, every time I write a post I make sure I add links to words that I have tags for. For example: "The economy of Brazil has improved in the last few years". This ensures that when people re-post my content, a lot of back-links to my tags will be created. This is quite a lot of work to do manually for every post. It would be cool if there was a plugin that would suggest tags to be applied when they match existing words in the text of the post. Is there such a thing?
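
    There may well be a plugin that does this; purely as an illustration of the substitution logic such a plugin would need, here is a minimal sketch in Python (the tag list and URLs are hypothetical):

        import re

        # Hypothetical mapping of tag names to their tag-archive URLs.
        tags = {
            "Brazil": "http://example.com/tag/brazil/",
            "economy": "http://example.com/tag/economy/",
        }

        def link_tags(post_html):
            """Wrap the first occurrence of each tag name in a link to its tag archive."""
            # A real plugin would also need to avoid replacing text inside existing HTML tags.
            for name, url in tags.items():
                pattern = re.compile(r"\b(%s)\b" % re.escape(name), re.IGNORECASE)
                post_html = pattern.sub(r'<a href="%s">\1</a>' % url, post_html, count=1)
            return post_html

        print(link_tags("The economy of Brazil has improved in the last few years"))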

    Read the article

  • One site being on a subdirectory of another. Does Google count this against you?

    - by Mick
    I have created two similar websites (relating to monetary systems). So far, one appears to be loved by Google and the other hated. I'm struggling to work out why. This is a mystery to me because both sites were created by me with the same design philosophy, both in pure HTML. Both are packed to the rafters with references to, and information about, their respective subjects. One issue I'm worried may be the cause has to do with the location of the sites. I got a web hosting package from hostmonster.com for the successful one, but the less-liked one is just an "add-on" which sits on a subdirectory of the successful one. I wonder if Google somehow detects this and treats it as a less significant website? EDIT: Just to clarify, even though one site is an add-on that sits on a subdirectory of the other, the URL is arranged to look like it is a root. I.e. the unpopular site can be accessed directly with a simple www.myunpopularsite.com name, without specifying any subdirectory. EDIT: Just in case it's important... say the popular site is called pop.com and the unpopular one unpop.com. In the webspace I've purchased, there is a directory called public_html. This is where I put the index.htm and all the other files of my popular site. When I purchased the add-on unpop.com, I made a subdirectory of public_html called unpop. It is within this "public_html/unpop/" that I place the index.htm and all the other files of my unpopular site. Typing www.unpop.com into the address bar of a browser links directly to the contents of "public_html/unpop/" and the user is not aware that this site is sitting on a subdirectory of another site. BUT if you type "www.pop.com/unpop" into the address bar of a browser you DO see the unpopular site.

    Read the article

  • Google Places SEO?

    - by sam
    I'm familiar with SEO and getting higher Google listings, but for a lot of services Google has recently been making its search results (where applicable) much more location-oriented. For instance, searching for "accountant in london" or "accountancy firm london" will return the first half of page 1 as Google Places listings, and then under about 6 of these you will get your normal search results, so someone who used to rank #1 on page 1 will now effectively rank #7. What I was wondering is that I can't see any reason why the companies that rank high in the Places results get there; often they are not high up in the ordinary search results. Is there a way to optimise, on-site or off-site, to rise up the Google Places listings in your city?

    Read the article

  • How can I create an SPF record on my 1and1.com hosted domain?

    - by tnorthcutt
    Emails from my domain (hosted at 1and1, and using Google Apps Premier edition) have sporadically been going to recipients' spam folders lately. I did some research, tested, and found out that I do not have an SPF record for my domain. According to this Google Support page, I need to create one. Following the steps on that page is easy until I get to #3: "Create a TXT record containing this text: v=spf1 include:_spf.google.com ~all". I see no way to create a "TXT record" in the 1and1 admin panel.
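
    For reference, the finished record is a single DNS TXT entry that looks roughly like this in zone-file terms (the host name and TTL are illustrative; the SPF value is the one Google specifies):

        mydomain.com.   3600   IN   TXT   "v=spf1 include:_spf.google.com ~all"

    Once it has been created and has propagated, running "dig +short TXT mydomain.com" (or "nslookup -type=TXT mydomain.com") should echo the value back.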

    Read the article

  • Loading main JavaScript on every page? Or breaking it up for relevant pages?

    - by Kyle
    I have a 700kb (decompressed) JS file which is loaded on every page. Before, I had 12 JavaScript files on each page, but to reduce HTTP requests I combined them all into one file. This file is ~130kb gzipped and is served with gzip, but on the client it still has to be unpacked and parsed on every page. Is this a performance issue? I've profiled the JavaScript with the Firebug profiler but did not see any issues. The problem/illusion I am facing is that there are jQuery libraries bundled in that file that are sometimes not used on the current page. For example, jQuery DataTables accounts for 200kb of that file and is only used on 2 of my website pages. Another is jqPlot, which is another 200kb. I now have 400kb of excess code that isn't executed on 80% of the pages. Should I leave everything in one file? Should I take out the jQuery libraries and load only the relevant JS on the current page?
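
    A hedged sketch of the split being considered: keep one core bundle on every page and load the heavy libraries only on the pages that use them (file names are placeholders):

        <!-- every page: jQuery plus site-wide code -->
        <script src="/js/core.min.js"></script>

        <!-- only on the two pages that actually use them -->
        <script src="/js/jquery.dataTables.min.js"></script>
        <script src="/js/jquery.jqplot.min.js"></script>

    The trade-off is a couple of extra HTTP requests on those pages versus roughly 400kb less JavaScript to parse everywhere else.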

    Read the article

  • Strange SEO Problems

    - by Davey
    I have a Twitter account linked to my WordPress site so that each tweet becomes a new post. I was wondering why my SEO was suffering, and when I looked at the source I saw stuff like this: Los Conchita&#8217;s on Prince has V&#8230; That is what the source lists as the title of the page. Has anyone else had this problem and know why it is occurring? Thanks! The site is reviewathens.com
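
    For what it's worth, &#8217; is the numeric HTML entity for a right single quote and &#8230; is an ellipsis, so the title appears to be getting HTML-encoded once more than it should be on its way from Twitter into WordPress. Decoded, it reads normally; for example (Python, purely as an illustration):

        import html

        print(html.unescape("Los Conchita&#8217;s on Prince has V&#8230;"))
        # Los Conchita’s on Prince has V…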

    Read the article

  • Google: What does a return to PR 'Unranked' mean?

    - by UpTheCreek
    One of my sites is very new (about 3 months old). When first launched, its pages had (unsurprisingly) a Google PR of 'Unranked' [from Google Toolbar stats, via the Firefox SearchStatus plugin]. After a few weeks these changed to PR0. Just recently I noticed that they are showing PR 'Unranked' once more in the Google Toolbar. As far as I know I'm following the Google guidelines. Results for the site still seem to be showing for its keywords. What could this mean?

    Read the article

  • Need a host which supports OSQA

    - by Josip Gòdly Zirdum
    Hi, I'm looking to install OSQA and see how it goes. I have a great niche which I think may work really well, but until I get a large enough audience I'd like to use shared hosting, then move up to dedicated or VPS hosting... Almost all the hosts I've looked at don't support something OSQA needs. I need relatively cheap shared hosting with cPanel. Any recommendations? It needs to support: Django, Python Markdown, html5lib, Python OpenID, and South.
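
    As a rough compatibility check when evaluating hosts, the dependencies listed above correspond to roughly the following PyPI packages (the package names are assumed to be the standard ones), so the host has to allow installing them and running a Django/WSGI application rather than just PHP:

        pip install Django South Markdown html5lib python-openid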

    Read the article

  • SEO + international sites? country.domain.com or domain.country?

    - by Pure.Krome
    Hi folks, is it better for SEO to have separate country-specific domains (which cost more money) or subdomains which define the country? E.g. stackoverflow.com, stackoverflow.com.au, stackoverflow.co.uk versus stackoverflow.com, au.stackoverflow.com, uk.stackoverflow.com. Assumption: in the search engines' webmaster tools, each subdomain is associated with a country, e.g. au.stackoverflow.com is associated with Australia. Cheers! Update: I understand that both methods do work, especially when I use the assumption listed above. The question is: which method is better? Is there only a small SEO difference between them, or is the first method far better than the second at getting SEO results? Update #2: A number of folks have suggested that the following is a good/better approach: stackoverflow.com/, stackoverflow.com/au, stackoverflow.com/uk. By adding a country-specific ISO code as the first folder of the URL, that folder can be recognised as the country. But a number of SEO mates have suggested that this is a waste of folder-level space. Er... how can I explain. OK, it's been suggested by some SEO experts that if the number of levels or folders in the URL exceeds 5, then the page drops dramatically in importance. Basically, you don't want to make it deep. As such, adding the country as the first level can be considered a waste, especially when it can be handled by the domain OR subdomain - hence the question :) So, any more thoughts on this? (Maybe SO is the wrong place to ask this question?)

    Read the article

  • foobar.com working, but www.foobar.com not working?

    - by dpmattingly
    I am setting up a web site for a client. She is using GoDaddy for domain registration, and a hosting company I have never used before. After setting up the nameservers on GoDaddy's side, the address foobar.com (for example) is correctly directing to the new site. However, the address www.foobar.com is redirecting to a 404 page on the hosting company's side. I've been dealing with customer service on the hosting side, and they have told me various things, including to wait for DNS propagation (which has obviously happened, since the 404 page is on their side), and to make sure that the nameservers on GoDaddy's side were entered in lower case instead of upper case (which I know doesn't matter, since nameservers are case-insensitive). I think I'm getting the runaround from the hosting company, but the client had signed up with them before I came to the project, so if possible I'd like to resolve this issue with them before we start treating it as a loss. Does anybody know what could cause foobar.com to work correctly while www.foobar.com ends up on a 404 page? And how would I best be able to suggest a fix through the hosting company's technical support channels?
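
    A quick way to narrow it down (shown for a Unix shell; substitute the real domain) is to compare where the two names actually point:

        dig +short foobar.com
        dig +short www.foobar.com

    If both return the hosting company's IP, DNS is fine and the account's web server simply isn't configured to answer for the www host name (something the host has to fix in the account's site or alias settings); if www returns something different, the www DNS record itself needs correcting.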

    Read the article

  • Several subdomains pointing to one folder (Fasthosts problem)

    - by David
    I have an ASP.NET website (e.g. www.website.com). The idea is that you will go to the URL 'yourname.website.com' and my site will read the name from the subdomain, process it, and change the content accordingly. So, I purchased my Fasthosts hosting package and created a subdomain which points to a folder within my site, e.g. 'www.website.com/folder'. However, when I now go to the URL 'yourname.website.com' it is immediately redirected to 'website.com/folder'. This means I cannot read the subdomain from the URL because it has been lost. I have tried contacting these guys but they don't understand and keep going on about some sort of redirect script (although I don't see how any further redirects can solve my problem). Is there something I am missing here? Any suggestions or help would be much appreciated!

    Read the article

  • Is there a way to hide text from descriptions in Google

    - by Linda H
    The first line of text on all of our client's product pages is "Download hi-res images", which of course isn't what we'd want in the description when people search for their products. Is there any way to hide this text/link so that Google and the others just ignore it and go on into the text description below? I suppose we could use a meta-description, but the client isn't very good at computers and it's such a small site it seems silly.
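
    For what it's worth, the meta description mentioned above is a single line per page, so it may be less work than it sounds even on a small site (the content here is obviously a placeholder):

        <meta name="description" content="Short human-readable summary of this product page.">

    Search engines don't always use it verbatim, but it gives them something better to show in the snippet than the first line of page text.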

    Read the article

  • Alternative for Subdomains [duplicate]

    - by Raj
    This question already has an answer here: "Should I choose sub-directories over sub-domains in this case?" I have a company with a website like www.example.com. We have one industry with product 1 and product 2, and another industry with product 3 and product 4. All these products are different from each other. My question is: should we have subdomains like www.industry1.example.com, or paths like www.example.com/industry1? With industry1.example.com it might be seen as a different domain; with example.com/industry1 the number of folders might increase. Please suggest the best solution for this. Thanks, Raj

    Read the article

  • Tracking first visit date in Google Analytics

    - by dusan
    I want to track a site's visitor retention using Google Analytics, to see if unregistered users are returning to it within a time frame of 2+ months from now. This blog post seems to be on the right track, but I want to track unregistered users, so I don't have a "join date" or similar variable at my disposal. This other blog post suggests using all 5 GA custom variables, using the first variable slot on the first week, variable 2 on week 2, etc. That method would only let me track 5 weeks of visitors, and I want to track more than that, so I was thinking of using two custom variables in GA: the visitor's first visit date and the visitor's last visit date. How can I save the first visit date? If I save another value in the same slot, the old value will be overwritten, and I don't know how reliable it is to save that variable conditionally (reading the __utmv cookie to check whether the "first visit date" is set, and saving the current date only if it isn't).
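
    A hedged sketch of one way to do it with the classic ga.js API: remember the first-visit date on the client (here via localStorage rather than by parsing the __utmv cookie) and write it into a visitor-scoped custom variable on every pageview. The slot number and variable name are arbitrary choices, and the snippet assumes the standard asynchronous _gaq tracking code is already on the page:

        var firstVisit = localStorage.getItem('firstVisitDate');
        if (!firstVisit) {
          firstVisit = new Date().toISOString().slice(0, 10);  // e.g. "2012-06-01"
          localStorage.setItem('firstVisitDate', firstVisit);
        }
        // Visitor-level custom variable in slot 1 (the final 1 = visitor scope);
        // it must be pushed before _trackPageview so it is sent with the pageview.
        _gaq.push(['_setCustomVar', 1, 'first_visit', firstVisit, 1]);
        _gaq.push(['_trackPageview']);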

    Read the article

  • Having a good domain name and using domain aliases (I use notlong.com)?

    - by Michal P.
    I use only free servers, and after creating my website, http://pundaquit.republika.pl, I decided to make it accessible through a simpler domain name. I decided to use the domain alias service http://notlong.com/ and got the simple domain name http://pundaquit.notlong.com. The second advantage of using an alias here was being independent of my file host, which I will have to change at some point. I haven't found a better alias service than notlong, because notlong.com is easy to remember. After that I encountered many problems: most forums and social services treat the notlong address as spam, and Bing so far hasn't accepted the http://pundaquit.notlong.com domain, among others. Is there another way to have a good free domain name? And what about the situation where your hosting provider tells you your account will expire? Only a lasting layer of domain aliases makes you independent of the actual file hosts.

    Read the article

  • How to find out if my hosting's speed is good enough?

    - by Mert Nuhoglu
    There are lots of different online performance tests: Google PageSpeed Insights, iWebTool Speed Test, AlertFox Page Load Time, and WebPageTest. There is also plenty of desktop/client software, such as ping, YSlow, Firebug's Net console, Fiddler, and HttpWatch. I just want to decide whether my hosting provider's performance is good enough or whether I need to switch to another provider. So, which tool should I use to compare my hosting provider with other hosting providers?
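
    As a crude baseline before reaching for any of those tools, raw response time can be sampled directly with curl (the URL is a placeholder; run it a few times, from a network other than the server's own):

        curl -s -o /dev/null -w "dns=%{time_namelookup} connect=%{time_connect} ttfb=%{time_starttransfer} total=%{time_total}\n" http://www.example.com/

    Comparing the time-to-first-byte against the same measurement for a site on a candidate host gives a rough, host-level comparison that is independent of page weight.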

    Read the article

  • Group "Discussion" software?

    - by Kayle
    My client wants a "lite" forum... not unlike these Stack Exchange sites, but even a little lighter. There's a screenshot of the discussion group she likes most in the original post, and you can go to the group itself to see it for yourself if you like. I don't think traditional forum apps will display, functionally, in this manner. Is there any software I can use to get a similar result? A web service would be acceptable as well.

    Read the article

  • Is my robots.txt working as it should?

    - by TigerBlood
    I want crawlers to have access to the homepage at http://www.example.com/ but not to anything below it. My robots.txt is as follows:

        User-agent: *
        Allow: /$
        Disallow: /

    My site is in Google's search results, but I am not coming up in Bing, Yahoo, etc. I have had the same robots.txt since last year, and I initially requested inclusion about a year ago, and have resubmitted the URL to those search engines several times since. Is my robots.txt blocking those other crawlers? And if so, why not Google as well? Thanks in advance!

    Read the article

  • Facebook Comments Lost

    - by Rish
    I am using Facebook comments on a couple of my blogs at the moment, and I just found that, somehow, magically, all the previous comments made on posts are gone and are no longer being displayed. I'm using WordPress for all of these blogs and the Facebook Comments for WordPress plugin to manage all the Facebook comments, but somehow they all disappeared all of a sudden. Another problem I've been facing lately is that I can't seem to moderate the Facebook comments. When I go to http://developers.facebook.com/tools/comments, where there should be a list of all the comments made on my sites (against the applications that I've created just for the sake of comments), there is nothing there. This has been the case from the start, even before the comments vanished from my site today. So technically, there are two issues to solve here.

    Read the article

  • Panda 4: Reducing #indexed pages. How much is enough?

    - by Noam
    I've been hit by Panda 4 (a 40% decrease). I didn't see any change during Panda 1-3. From what I've read, and comparing it to my site, the change is probably due to the fact that I have over 30M pages indexed by Google, and they've started seeing that as some sort of bad signal. Although I feel all of the pages have unique value that Google should crawl, it seems I should make some tough calls and reduce the indexed pages according to some prioritization I will conduct. The question is what my target should be, or what factors should help me figure out a relevant target. How many pages should I try to reduce to? 25M? 15M? 1M? 2,000? Is it enough to add noindex to low-priority pages, or should I also remove all internal linking to them?
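
    For reference, the per-page "noindex" mentioned above is the robots meta tag; the common pattern below keeps the links on the page crawlable while dropping the page itself from the index (whether to also prune internal links is a separate judgement call):

        <meta name="robots" content="noindex, follow">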

    Read the article

  • How to prevent a 404 error when installing a subdomain using a WordPress multi-site installation

    - by Chris
    I have installed a multi-site installation of WordPress onto my domain. I then added the necessary code to the wp-config.php file and .htaccess as instructed by WordPress. I also installed a plugin called Quick Page/Post Redirect Plugin, which allowed me to place a 301 redirect onto the main domain, as I only want to use the subdomain and not the main domain. Then I also added the following line of code to the wp-config.php file to redirect the main domain: define( 'NOBLOGREDIRECT', 'URL Redirect Address' ); The site works fine with a redirect on the main domain, and my subdomain runs fine when you type in subdomain.domain.com or http://subdomain.domain.com. However, when I enter www.subdomain.domain.com or http://www.subdomain.domain.com the following error message is returned: "Not Found. The requested URL / was not found on this server. Apache/2.4.9 (Unix) Server at www.subdomain.domain.com Port 80". Any help with this would be much appreciated.
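
    Assuming the www.subdomain.domain.com host name actually reaches the same virtual host as the site (it may need a ServerAlias or an extra alias/wildcard entry in the hosting panel first, since the 404 above is served by Apache itself), one hedged fix is a 301 in the site's .htaccess, placed above the WordPress rewrite block, that strips the extra "www." before WordPress sees the request (the host names are placeholders, as in the question):

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^www\.subdomain\.domain\.com$ [NC]
        RewriteRule ^(.*)$ http://subdomain.domain.com/$1 [R=301,L]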

    Read the article

  • How can one keep an ecommerce site active?

    - by Mantorok
    So, you build an e-commerce site and all your products are on there, but then very little changes, which obviously causes your site to become less active and ultimately not rank as highly in search engines. Is there anything that can be done to keep it active? I'm aware that inbound links are important, and I guess those come over time. Are there any other recommended means of keeping the site active?

    Read the article

  • How can I handle more traffic? My VPS fails!

    - by Vic
    I have a website, a photo gallery with about 400 photos, running on Gallery 3 with MySQL. It is hosted on a VPS from myhosting.com (CPU 1792 MHz, 2048 MB RAM). Everything seems to be OK, but there is one big problem: once traffic reaches ~20 people online, the website starts loading really, really slowly. In fact, the website can take 30-60 seconds to load. What should I do? Buy more RAM/CPU on the same VPS? Move to a dedicated server, or does myhosting.com just suck? What do you recommend?

    Read the article
