Search Results

Search found 96383 results on 3856 pages for 'code pro'.

Page 616/3856 | < Previous Page | 612 613 614 615 616 617 618 619 620 621 622 623  | Next Page >

  • Are these hacking attempts or something less sinister?

    - by Darkcat Studios
    I just had a look through our web server error logs, and Terminal Services is reporting: "Remote session from client name a exceeded the maximum allowed failed logon attempts. The session was forcibly terminated." This appears hundreds of times, every 10.5 seconds or so, over periods of about 5-10 minutes, once at 2pm yesterday and once again at about 1am this morning. We currently have RDP open to the outside, as I am just completing the setup, and now and then I or others need to jump on from an outside office or location (VPN isn't an option). As these are so regular, am I right in assuming that they may be the result of some sort of dictionary attack? Or could something like an internal admin's hung session cause such a mass of events? (Windows Server 2008 R2)

    Read the article

  • Site inaccessible by some people, fine for others [on hold]

    - by Paul Howell
    A couple of days ago my website www.howellphoto.com (hosted by one.com, a WordPress site) started loading really slowly, and I have been unable to access any pages linked from the home page. Several of my friends have found the same issue, yet many others are able to access the site without a problem. Live support at one.com has not been all that much help, requesting the IP addresses of a few people who cannot access the site and saying it could be a firewall issue. WordPress support (my site was created with prophotoblogs) has been better and has updated all plugins, etc., but can see no issue from their end. My main issue is that even if there were a local fix I could do on my computer, this would not help with any potential customers visiting my site for information! This is driving me crazy!!! Any help will be legendary! Cheers, Paul

    Read the article

  • Is the use of hashbang really a good idea? [on hold]

    - by user32642
    I've been working on a WordPress site lately that was designed with hashbangs (shebangs) in the dynamically generated URLs. After doing some research, I noticed that Google had some preferences about their use and how it crawled such a site. However, after I ran several sitemap generators and Screaming Frog SEO Spider, I realized that the only page being crawled was the index page. So now I am questioning the use of hashbangs. What do you think? Should I attempt to remove them? Or will it even matter? And does anyone know of an easy way to remove them? The site is www.modernvintage1005.com

    Read the article

  • Another website is mirroring and ranks above my site in search results

    - by Marlboro Goodluck
    There is a site of ill repute known as thedirty which has completely mirrored my site and now has links appearing at the #1 spot on Google using my content. I checked my log files and noticed that this site has been crawling mine for some time, and it also has 10,000 links from its site to mine. I have blocked user access referred from this site and already reported it to Google as web spam. I also disavowed the domain. How are they getting top links in Google (even overtaking mine) with such nefarious tactics? What are the steps to completely eliminating an issue like this?

    Read the article

  • How hard is it to be the anonymous owner of a website?

    - by silla
    I'd like to create a website with a very radical political message. It won't be unethical (encouraging violence, etc.), but I feel the points I plan to make in it will definitely earn me a lot of enemies. How hard would it be to keep anyone from finding out who I am? I know domain registrars always have a $10/year option for keeping your registration information private, but is there any other protection I should think about having? Thanks!

    Read the article

  • AdWords: is it possible to automatically copy changes in one campaign to other campaigns?

    - by Richard
    Is it possible to have campaigns that are identical except for one thing, so that whenever changes are made to one campaign, the same changes are automatically made to the others? I want to have identical campaigns running for different time zones, but each campaign should only run between particular hours of the day in its time zone. If I have a campaign for each time zone, then when I make changes to one campaign, such as adding ad groups, adding keywords, or changing bid amounts, I want those changes to apply to the other campaigns as well. Is this possible? Thanks.

    Read the article

  • What are the disadvantages of installing an SSL certificate for the naked domain?

    - by user1744649
    I might buy an SSL certificate for my site. I know that it will help me in many ways, but will there be disadvantages as well? E.g., if I load an image from another server (over plain HTTP), will that alert the customer that something is wrong? Will I be able to use all existing software like phpBB, AWStats, etc. without a problem? Will there be any issue if I redirect a page from my domain.com to my subdomain.domain.com using a meta refresh or .htaccess? Will there be any issue if I redirect a page from my subdomain.domain.com to my domain.com using a meta refresh or .htaccess? Any other issues I might run into? Thanks.
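
    For the redirect questions, a minimal .htaccess sketch of the server-side (301) alternative to a meta refresh, assuming Apache with mod_rewrite enabled and using the hostnames and page name purely as placeholders:

        # Hypothetical example: permanently redirect one page on the naked
        # domain to the same page on the subdomain over HTTPS.
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
        RewriteRule ^secure-page\.html$ https://subdomain.domain.com/secure-page.html [R=301,L]

    Either way, whether the browser warns about the plain-HTTP image is decided by the page that finally loads over HTTPS, not by how the redirect was issued.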

    Read the article

  • Best Way for Developers to Upload Files to Production Server

    - by ultrajohn
    We are a small team of developers working here and there. We have a team leader who is solely responsible for uploading updated source files from the development server to the production server. So if an updated file needs to be uploaded to the prod server, the developer concerned notifies the team lead about it, and the team lead then updates the files on the prod server. No developer has access to the prod server except the team lead. That's our current setup. Now, what we want to do is give developers a way to upload their updated files to the server without the team lead intervening in the process. What do you think is the best way to go about this?

    Read the article

  • How long is the penalty for duplicate e-commerce content after it has been resurrected?

    - by will
    I am fixing all of the duplicate content on my e-commerce site with all-original descriptions, etc. How long does it take Google to start ranking it again? I used to have a good ranking that converted quite a few sales; in the last week I have had next to nothing. Also, would the disclaimer I created under each product be considered duplicate content, since it appears on most of my product pages and is the same everywhere?

    Read the article

  • Improving FAQ SEO with multiple pages?

    - by asdfasdf
    I have a client who has over 200 question/answer style content blocks. Neither the questions nor the answers are very long, and most of the questions are almost identical, with only a word or two differentiating them from the rest. Would SEO be helped or hurt if I were to put each Q&A on its own page, with the question as the page title, and so on? Or would that be considered "farming"? If not, what would be the best way (SEO-wise) to present all these Q&As? Thanks for any advice.

    Read the article

  • Issue with sitemap in GWT

    - by Anusha
    I have an e-commerce website, www.beyondtime.in, and I have been constantly monitoring Googlebot's crawling of the site and my Webmaster Tools account. Lately, I have found two issues that I have not been able to understand and hence want your help. 1) Googlebot has only been crawling www.beyondtime.in/telecom.php, even though that URL is not even valid, so kindly help me understand what needs to be done to let Google crawl the other pages of the website as well. 2) The second question is about the Google Webmaster account, where I've submitted my sitemap with 227 URLs, but only 156 of them have been indexed. Also, none of the images on my website have been indexed by Google. So kindly help me with this as well. Thanks

    Read the article

  • 100% APC Fragmentation - Cacherouter & Pressflow install

    - by granttoth
    My APC cache shows 100% fragmentation, and I'm not quite sure I understand what is going on here. For testing I jacked the available memory up to 512 MB. After a day the total free memory shows 73%, but I still have 100% fragmentation. Would you gurus please look at my settings and offer your advice? Also, I have read suggestions to disable apc.stat where possible, but when I do, the site crashes. I am using the Pressflow build of Drupal 6 with the cacherouter module installed. Edit: (added screenshot) http://i.imgur.com/DqZEX.png
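
    For reference, a hedged php.ini sketch of the APC directives typically examined in this situation; the values are illustrative assumptions, not recommendations for this particular server:

        ; Hypothetical APC settings (php.ini)
        apc.shm_size = 512M   ; total shared memory, matching the test size above
        apc.ttl      = 3600   ; let stale cache entries expire and be evicted
        apc.user_ttl = 3600   ; same idea for user-cache entries
        apc.stat     = 1      ; left on, since disabling it crashes this site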

    Read the article

  • Convert video files for home IIS server [closed]

    - by Jey
    I am finally learning to set up an IIS server (personal use only) and I thought it would be cool to have some videos on it for me to watch when I am away from home. Since I'm usually on 3G (iPhone) or work wifi, I'd like to convert them to an optimal format that will stream fast. The video files are mostly avi and mp4 (from 30 minutes to 2 hours in length). What would be an easy and fast way to go about doing this? Thanks.
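
    For reference, a hedged sketch of one common way to pre-convert such files, assuming ffmpeg is available; the file names, resolution, and bitrate are illustrative only:

        # Hypothetical example: re-encode to an H.264/AAC MP4 suited to mobile
        # streaming. -movflags +faststart moves the index to the front of the
        # file so playback can start before the whole file has downloaded.
        # (Older ffmpeg builds may need -strict experimental for the aac encoder.)
        ffmpeg -i input.avi -c:v libx264 -preset fast -crf 28 -vf scale=-2:480 \
               -c:a aac -b:a 96k -movflags +faststart output.mp4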

    Read the article

  • How did Google Analytics kill my site?

    - by user1813359
    Yesterday I created a Google Analytics profile for one of my sites and included the JS block in the layout template. What happened next was very strange: within about 2 minutes, the site had become unreachable. I had been checking the AWStats page for the site when I thought to set up GA. After that had been done, I clicked on the link for 404 stats, which opens in a new tab. It churned for a long while and then showed a nearly blank page, similar to what Firefox shows when it chokes on a badly formatted XML page, except there was no error message. But I was logged into the server and could see that the page has an HTML 4.01 Transitional DTD. Strange! I tried viewing the source, but it just churned endlessly. I then tried "inspect element" and was able to see an error message having to do with some internal Firefox library; unfortunately, I neglected to copy it. :-( All further attempts to load anything on the site would time out. Firebug's Net panel showed no request being made. Chrome would time out. So, I deleted the GA profile, removed the JS block, and cleared the server cache. No joy. I then removed all Google cookies and disabled JS. Still nothing. No luck in any other browser. And now my client couldn't access the site. Terrific. I was able to use wget while logged into another server. The retrieved page was fine, and did not contain the GA JS block. However, the two servers are on the same network (perhaps a clue). The server itself was fine: ping and traceroute looked great, and I could SSH in. I tailed the access log and tried a browser request. Nothing. But I forgot to quit, and a minute or so later I saw a request from someone else being logged. Later, I could see that requests had been served all day to some people. Now, 24 hours later, the site works once again, but it is still unreachable by the client (who is in another city). So, does anyone have some insight into what's going on? Does this have something to do with Google's CDN? I don't know very much about how GA works, but what I'm seeing reminds me of DNS propagation issues. And why the initial XML error? And why the heck was the site just plain unreachable? What did Google do to my site?! Sorry for the length, but I wanted to cover everything.

    Read the article

  • Does Google penalize pseudo-duplicate pages for different locations?

    - by mikewowb
    My company's site's home page was not specifically optimized for any location. Now I am planning to optimize it for Boston and create ten or so other landing pages for the other locations we serve. If we made these new pages by copying the original Boston one and changing the location's name (s/Boston/Montreal/), would Google consider them duplicate pages and penalize us? What is the best practice for this?

    Read the article

  • Does the user agent in any regular browser contain 'bot' or 'crawl'?

    - by Echo
    Does the user agent in any regular browser contain 'bot' or 'crawl'? I check the user agent on my site to see whether a request is coming from a bot or not. If it is, I can do some small optimizations, since bots don't log in. (I don't change the content at all.) After adding checks for 30-40+ bots, I'm getting tired of adding them. So I was wondering about just checking whether the user agent contains 'bot' or 'crawl'. I know that won't catch all bots, but it would catch a lot of them. However, if that could cause any false positives, it would totally mess up the ability to add to cart, place an order, and log in.
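
    For reference, a minimal sketch of that kind of substring check, written in Python purely as an assumption; the same test translates directly to whatever language the site actually runs on:

        import re

        # Hypothetical helper: flag a request as a bot if its user-agent string
        # contains 'bot' or 'crawl' (case-insensitive). Mainstream desktop and
        # mobile browsers do not normally include either token, but oddly named
        # devices or toolbars could still trip it, and crawlers that identify
        # themselves differently will be missed.
        def looks_like_bot(user_agent: str) -> bool:
            return bool(re.search(r"bot|crawl", user_agent or "", re.IGNORECASE))

        print(looks_like_bot("Mozilla/5.0 (compatible; Googlebot/2.1)"))           # True
        print(looks_like_bot("Mozilla/5.0 (Windows NT 6.1; WOW64) Firefox/12.0"))  # False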

    Read the article

  • .htaccess mobile redirect issues

    - by val
    I'm trying to set up a mobile redirect for a site with two subfolders, and I cannot get both to work at the same time. This is the structure of the site: www.mysite.com/EN/ and www.mysite.com/ES/. It is a bilingual site, so each subfolder contains the files for one language version. I was using a 301 redirect, with the index in /EN/ set up as the main index, and everything was getting redirected to it. I was using DirectoryIndex index.html and Redirect /index.html http://www.mysite.com/EN/index.html, plus several RewriteCond rules to redirect mysite.com and old URLs to the new URLs. It worked fine before I decided to add a mobile version, m.mysite.com. I used the solution provided in http://stackoverflow.com/questions/3680463/mobile-redirect-using-htaccess, and it redirects the mobile version properly, but now the desktop redirect no longer works. Also, my mobile version must be bilingual as well.
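
    For reference, a hedged .htaccess sketch of the kind of rule involved; the hostnames and language folders come from the question, while the user-agent pattern and the assumption that m.mysite.com serves a separate mobile document root are illustrative:

        # Hypothetical example: send recognizably mobile user agents to the
        # mobile host while keeping the /EN/ or /ES/ language prefix intact.
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^www\.mysite\.com$ [NC]
        RewriteCond %{HTTP_USER_AGENT} (android|iphone|ipod|mobile) [NC]
        RewriteRule ^(EN|ES)/(.*)$ http://m.mysite.com/$1/$2 [R=302,L]

    Keeping the mobile rules separate from the existing language redirects makes it easier to see which set is interfering with the other.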

    Read the article

  • Will URL encoding the image names affect Google?

    - by TheGateKeeper
    Just wondering if it makes any difference to Google whether or not I URL-encode the image names when linking to them. For example, if I have an image named "test-1234-!.jpg", does it make a difference if I refer to it as "test-1234-%21.jpg"? The reason I am asking is that I am making a major change to the way my website works, and while none of the new image names will be URL-encoded, all of the past ones are. I want to know whether it is worth renaming all of them or whether I should just leave them as they are.
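
    For reference, a tiny sketch (Python, purely as an illustration) showing that the two spellings decode to the same file name:

        from urllib.parse import quote, unquote

        # '!' percent-encodes to '%21', and '%21' decodes back to '!'
        print(quote("test-1234-!.jpg"))       # test-1234-%21.jpg
        print(unquote("test-1234-%21.jpg"))   # test-1234-!.jpg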

    Read the article

  • Should I include everything in the sitemap or only new content?

    - by Mee
    For a website with dynamic content (new content is constantly being added), should I only include the newest content in the sitemap, or should I include everything (with a sitemap index)? What are the best practices for sitemaps, especially for large sites? Also, is there any way to make Google (and other search engines) only crawl the pages in the sitemap? Thanks. Update: also, any idea how Stack Overflow handles this? I'd like to know, but unfortunately (and understandably) they have blocked access to their sitemap.
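
    For reference, a minimal sitemap index sketch in the sitemaps.org format, with placeholder URLs; each child sitemap listed here can itself contain up to 50,000 URLs:

        <?xml version="1.0" encoding="UTF-8"?>
        <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <sitemap>
            <loc>http://www.example.com/sitemap-content-1.xml</loc>
            <lastmod>2012-06-01</lastmod>
          </sitemap>
          <sitemap>
            <loc>http://www.example.com/sitemap-content-2.xml</loc>
            <lastmod>2012-06-15</lastmod>
          </sitemap>
        </sitemapindex>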

    Read the article

  • using apple-mobile-web-app-capable and cache.manifest issue [migrated]

    - by LocoMike
    So I have this simple HTML file:

        <!DOCTYPE HTML>
        <html manifest="cache.manifest">
        <head>
          <meta name="apple-mobile-web-app-capable" content="yes">
          <meta name="apple-mobile-web-app-status-bar-style" content="black">
          <title>Test</title>
          <meta http-equiv="content-type" content="text/html">
          <meta name="HandheldFriendly" content="true">
          <meta name="viewport" content="width=320; initial-scale=1.0; maximum-scale=1.0; user-scalable=0;">
          <style type="text/css"></style>
        </head>
        <body marginwidth="0" marginheight="0" topmargin="0" leftmargin="0">
          <h1>hello</h1>
        </body>
        </html>

    My cache.manifest is simply:

        CACHE MANIFEST

    I run this website on my local server (localhost). I load it in iPhone Safari and it works fine. I then stop the server and load it again, and it still works, because the offline cache is doing its job. However, if I save the website as a home-screen icon on the iPhone and then try to open it with the server stopped, it won't load. Yet if I open it with the server running at least once, I can then open it later without a problem. It looks like even though the page was cached in Safari, it is not cached in the saved app. Does anybody know how to get around this? Thank you!
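
    For reference, a hedged sketch of a manifest that lists the app's resources explicitly rather than relying only on the master entry; the file names are placeholders, and the manifest must be served with the text/cache-manifest MIME type:

        CACHE MANIFEST
        # v1 -- change this comment to force clients to re-download everything
        index.html
        style.css
        app.js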

    Read the article

  • Should I create topics in a forum I'm about to launch so that new users won't feel it is "empty"?

    - by janoChen
    I'm about to launch a discussion forum about Taiwan, and I'm trying to figure out how to deal with the first visitors. I've thought about the following so far: invite a few friends to start some discussions and post some replies, or create discussions myself and reply to them myself (with another account). I don't want the first visitors to feel like the site is empty. Maybe I'm missing something. Any suggestions?

    Read the article

  • How to reverse engineer the SEO on a website?

    - by Startup Crazy
    I have read this question. My question is a bit different from it: I want to know how I can reverse engineer another website that ranks best for certain keywords. For example, some website www.bla.com ranks high for many keywords, and I want to learn from it how my website can gain the same authority and the same ranking (or a better one, if I find something they are missing). Can anyone lay out a procedure for reverse engineering a website's SEO?

    Read the article

  • Makefile - How to save the .o one directory up?

    - by nunos
    Imagine the following folder structure:

        project
            src
                code.c
                makefile
            bin

    How can I compile code.c to code.o and put it directly inside bin? I know I could compile it to code.o under src and then do "mv code.o ../bin", but that would yield an error if there were compile errors, right? Even if it works that way, is there a better way to do it? Thanks.
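
    For reference, a minimal makefile sketch, assuming it lives in src/ and that gcc is the compiler; recipe lines must be indented with a tab:

        # Hypothetical makefile in src/: the object file is written straight
        # into ../bin, so nothing has to be moved (or fail to move) afterwards.
        CC     = gcc
        CFLAGS = -Wall
        BIN    = ../bin

        $(BIN)/code.o: code.c | $(BIN)
        	$(CC) $(CFLAGS) -c $< -o $@

        $(BIN):
        	mkdir -p $(BIN)

        .PHONY: clean
        clean:
        	rm -f $(BIN)/code.o

    If the compile fails, the object file is simply not produced, so there is no separate mv step left to error out.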

    Read the article

  • Why is my site cached 2 times per day?

    - by clarawood
    I have read the FAQs and checked for similar issues: yes. My site's URL (web address) is: www.adultxdating.com. Description (including timeline of any changes made): I have lost my top search listings over the last 4 months. I am still working on this but not getting proper guidance. The site is cached 2 times in 24 hours. Sometimes the site comes back into the top 10 listings for hundreds of keywords; sometimes it drops out beyond 1000. Can anybody help me understand why this is happening? I have more than 200K incoming links and update the site regularly. Please help. Thanks, Clara Wood

    Read the article

  • Why does my sub-domain redirect return a blank page?

    - by Tom Brito
    I have the domain http://dropbox.tombrito.com/ (on GoDaddy) forwarding with masking to www.dropbox.com/sh/k6ypvx4y4kf0gu6/rdjxQ1b1OL. It was working fine some time ago, but now the result is a blank page (although Dropbox's favicon appears correctly in the browser's tab title). The DNS manager shows me a single entry with the name "dropbox": an A record pointing to 64.202.189.170. Any idea what's wrong? Related: Why is my domain redirect on Google Apps returning 404?

    Read the article
