Search Results

Search found 30887 results on 1236 pages for 'site module'.

Page 50/1236 | < Previous Page | 46 47 48 49 50 51 52 53 54 55 56 57  | Next Page >

  • How to Decrease Bounce Rate on Your Site

    Bounce rate measures the percentage of visitors who land on a particular webpage and then leave the site without viewing any other pages. While most experts agree that a bounce rate of around 20% is acceptable, bounce rates in excess of 50% can be bad news.

    Read the article

  • Google is not indexing my entire site despite having a sitemap

    - by Anusha
    I have an e-commerce website, www.beyondtime.in. I have been constantly monitoring Googlebot crawling on my website and my Google Webmaster Tools account, and lately I have found two issues that I have not been able to understand.
    1.) Googlebot has only been crawling www.beyondtime.in/telecom.php, even though that URL is not valid. What needs to be done to get Google to crawl the other pages of the website as well?
    2.) The second question is about my Google Webmaster Tools account, where I've submitted a sitemap with 227 URLs. Of those, only 156 have been indexed, and none of the images on my website have been indexed by Google.

    Read the article

  • Multiple sites redirected to one main site

    - by mattgcon
    I have a client who insists on having multiple website domains all redirecting to one main website domain. It is getting out of hand, and his server has become convoluted and riddled with garbage because of it, not to mention confusing at times. Each of the domains he is setting up has no content; it simply redirects the user to the main website domain. Is this practice of having multiple domains pointing to one main website common? And does anyone know where I can get information to give to this client to let him know this is a bad practice, if it is a bad practice?
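
    If the extra domains have to stay, one tidier approach is to collapse them into a single catch-all Apache VirtualHost that issues the permanent redirect, instead of maintaining a separate configuration per domain. This is only a minimal sketch; the spare domain names and the main-site URL below are placeholders:

      <VirtualHost *:80>
          ServerName spare-domain-1.example
          ServerAlias spare-domain-2.example spare-domain-3.example
          # Send every request on the spare domains to the main site with a 301
          Redirect permanent / http://www.main-site.example/
      </VirtualHost>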

    Read the article

  • Default Wordpress site on IIS

    - by Mike
    We have multiple WordPress installations on our IIS7 (Windows Server 2008) server as follows:
    http://www.example.com/site_one
    http://www.example.com/site_two
    http://www.example.com/site_three
    These all work properly. However, we would like to configure it so that when users visit the root domain (http://www.example.com/) or any page underneath it, i.e.:
    http://www.example.com/
    http://www.example.com/page1
    http://www.example.com/page2
    they would actually see the corresponding pages for site_two:
    http://www.example.com/site_two/
    http://www.example.com/site_two/page1
    http://www.example.com/site_two/page2
    How could we achieve this?
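
    One possible approach, assuming the IIS URL Rewrite module is installed (it is a separate download for IIS7), is a rule in the root web.config that internally rewrites anything not already addressed to one of the sub-sites onto site_two. Treat this as a sketch rather than a tested configuration, since a WordPress install in a sub-folder usually needs its own permalink rules as well:

      <configuration>
        <system.webServer>
          <rewrite>
            <rules>
              <rule name="Root to site_two" stopProcessing="true">
                <match url=".*" />
                <conditions>
                  <!-- leave direct requests to the three sub-sites alone -->
                  <add input="{REQUEST_URI}" pattern="^/site_(one|two|three)" negate="true" />
                </conditions>
                <!-- internal rewrite, so the visitor still sees the root URL -->
                <action type="Rewrite" url="site_two/{R:0}" />
              </rule>
            </rules>
          </rewrite>
        </system.webServer>
      </configuration>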

    Read the article

  • Display maintenance site to requesters based on their IP address

    - by user64294
    Hi all. I would like to set up a special configuration on our Apache web server: I would like to serve sites to users according to their IP addresses. We plan to upgrade our websites, and during the upgrade we'll put up a maintenance site, so all users who connect to our websites will get that site instead. There are 200 websites affected by the upgrade, so I don't want to change the Apache settings for each one. In order to test the upgrade, I need to set Apache so that only my IP address can access the requested site. If my IP address is a.b.c.d and I ask for test.com, I want to see it, but all other users, having a different IP address, should get the maintenance site even if they ask for test.com. Our web server is hosted outside the office (ovh.com, France). The testers are the developers at our office and me. We could take some sites and enable them for testing by implementing IP restrictions in each website: the idea is that on these websites, if the visitor's IP address is different from our office IP address, we redirect the visitor to our maintenance website; otherwise we display the website. Is there a way to do this? Thank you.
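
    A common pattern for this is a small mod_rewrite block placed where it applies to all of the affected virtual hosts, sending every visitor whose address is not on the allow list to the maintenance page. This is a minimal sketch; the a.b.c.d address and the /maintenance.html path are placeholders:

      RewriteEngine On
      # Let the office/test address through untouched
      RewriteCond %{REMOTE_ADDR} !^a\.b\.c\.d$
      # Avoid a redirect loop on the maintenance page itself
      RewriteCond %{REQUEST_URI} !^/maintenance\.html$
      # Everyone else gets a temporary redirect to the maintenance page
      RewriteRule ^ /maintenance.html [R=302,L]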

    Read the article

  • Google Analytics showing more unique visitors than there are pages on an intranet site

    - by DDEX
    I take care of a company intranet and measure the traffic with GA. I am absolutely sure that there are no more than 5,000 URLs in our company, and it is impossible to reach the intranet from outside the company network. Yet when I check the number of Unique Visitors (UV) for the last year, GA says there were 36,500 of them. How is that possible? I thought UV was supposed to count each visitor only once in the given time period. Could anybody explain how this actually works? Can it be that the tracking cookies expire after some time and visitors are counted more than once?

    Read the article

  • Games for the Brain [Brain Teasers Game Site]

    - by Asian Angel
    Are you looking for a great collection of fun games to play at work or home? Then the ‘Games for the Brain’ website is definitely worth bookmarking in your favorite browser! This terrific collection of brain teasers is the perfect way to relax and have fun, and each one only takes a few minutes to complete (if you are pressed for time). Games for the Brain Game Homepage [via StumbleUpon]

    Read the article

  • Cisco Switching Module and HSRP interface Tracking

    - by Kyle Brandt
    When using a 4-port switching module where each port is configured with switchport access vlan ##, which interface should I track for HSRP: the VLAN interface or the FastEthernet interface?
      interface FastEthernet0/0/0
       switchport access vlan 10
      interface Vlan10
       ip address 12.12.12.1 255.255.255.0
      int FastEthernet0/1
       ip address 192.168.1.2 255.255.255.0
       standby ip 192.168.128.1
       standby track ?? ! FastEthernet 0/0/0 or Vlan 10?
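
    Not an answer to which one is right, but for reference, either choice can be expressed with IOS object tracking. This is only a sketch; the standby group number, track object number, and decrement value are arbitrary, and the exact syntax can vary by IOS version:

      ! Track the SVI: it goes down only when no port in VLAN 10 is up on this switch
      track 1 interface Vlan10 line-protocol
      ! ...or track just the one physical port instead:
      ! track 1 interface FastEthernet0/0/0 line-protocol
      interface FastEthernet0/1
       standby 1 ip 192.168.128.1
       standby 1 track 1 decrement 20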

    Read the article

  • Removing 301 redirect from site root

    - by Jon Clements
    I'm having a look at a friend's website (a fairly old PHP-based one) which they've been advised needs restructuring. The key points being:
    URLs should be lower case and more "friendly".
    The root of the domain should not be redirected.
    The first point I'm happy with (the URLs needed tidying up anyway) and I have a draft plan of action; however, the second is baffling me as to not only the best way to do it, but also whether it should be done at all. Currently http://www.example.com/ is redirected to http://www.example.com/some-link-with-keywords/ using the following index.php in the root of the Apache2 instance:
      <?php
      $nextpage = "some-link-with-keywords/";
      header( "HTTP/1.1 301 Moved Permanently" );
      header( "Status: 301 Moved Permanently" );
      header( "Location: $nextpage" );
      exit(0); // This is optional but suggested, to avoid any accidental output
      ?>
    As far as I'm aware, this has been the case for around three years, and I'm sorely tempted to advise not to worry about it. It would appear taking off the 301 could:
    Potentially affect page ranking (as the 'homepage' would disappear - although it couldn't disappear because of the next point...)
    Introduce maintenance issues as existing users would still have the redirected page in their cache
    Following the above, introduce duplicate content
    Confuse Google/other SEs as to what the homepage actually is now
    I may be over-analysing this, but I have a feeling it's not as simple as removing the 301 from the root and 301'ing the previous target to the root... Any suggestions (including "it's not worth it") are sincerely appreciated.
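
    If the decision is made to undo it, the usual shape of the fix matches what the question already suggests: remove the redirecting index.php so the root serves the real homepage, and 301 the old keyword URL back to the root so cached links and indexed pages still resolve. A sketch in .htaccess, assuming mod_alias is available (the keyword path is the one from the question):

      # Send the old keyword URL (still in caches and search indexes) back to the root
      Redirect permanent /some-link-with-keywords/ http://www.example.com/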

    Read the article

  • WordPress Plugins to Help Make Your Site Responsive

    - by Ravish
    Ultimate Coming Soon Page: The Ultimate Coming Soon Page plugin allows you to quickly and easily set up a coming soon page for your website. It includes features such as complete customization of the background color and image, custom CSS and HTML, email collection, and an option to stretch the background image to fit the browser. WP Orbit Slider [...]

    Read the article

  • CDN for site with target market in Australia

    - by Jae Choi
    I was told that http://www.edgecast.com/ is a very good CDN provider for the Australian market. I have a cloud server based in Sydney, Australia, but was wondering whether it's even worth getting a CDN, since my target market is only Australia-based as well. Would I see any performance gain if I used the above CDN service, or would this be more for sites that target international visitors? I also have Apache installed on our server but would like to switch to Nginx. Would I see a bigger performance gain from that change than from a CDN, or should I go for both, since they are each beneficial?

    Read the article

  • Increase Site Search Ranking With Search Engines

    Increasing your website's search ranking should start with an effective SEO strategy, with keyword analysis playing a pivotal role. However, it can be argued that the less your website needs to rely on search engines for traffic, the more the search engines want to rely on your website.

    Read the article

  • .htaccess rewrite , parked domain on another site to read the proper domain name

    - by Stefano
    I have a parked domain, mysite.co.uk, pointing at a folder inside another domain's webspace, and I need the browser URL to keep reading mysite.co.uk; at the moment it shows the other domain name while browsing. When parking the domain, the ISP automatically added the following, which works but displays the other domain name:
      RewriteCond %{HTTP_HOST} ^mysite.co.uk$ [OR]
      RewriteCond %{HTTP_HOST} ^www.mysite.co.uk$
      RewriteRule ^/?$ "http\://www.otherdomain.eu/myfolder" [R=301,L]
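
    One way this is sometimes handled (a sketch only, and whether it is possible depends on the hosting setup, since the parked domain currently lives inside the other domain's webspace) is to rewrite internally into the folder rather than issue an external 301, so the visitor's address bar keeps showing mysite.co.uk:

      RewriteEngine On
      RewriteCond %{HTTP_HOST} ^(www\.)?mysite\.co\.uk$ [NC]
      RewriteCond %{REQUEST_URI} !^/myfolder/
      # No R flag: internal rewrite, the browser URL is left untouched
      RewriteRule ^(.*)$ /myfolder/$1 [L]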

    Read the article

  • Rsync over ssh: "ERROR: module is read only" suddenly appeared

    - by user978548
    For some time I've used rsync over ssh to back up my shared host's contents to my personal Synology NAS (a 212j, for that matter), and it worked quite well. For information, I use a password-less ssh connection. Three days ago I updated my NAS software, and since then (or at least I believe it's since then) the backup won't work anymore. I get the following error on the host:
      rsync: writefd_unbuffered failed to write 4 bytes to socket [sender]: Broken pipe (32)
      ERROR: module is read only
    ..which I do not understand. Besides that, nothing changed that I know of in either the source or the destination that could be related to rsync or ssh. I did check a few things and all seems to be alright:
    I can still connect through ssh from the host to my NAS with the right user, so ssh things like keys haven't changed.
    I also have the correct file permissions on the NAS (I checked, and also tried to create files and directories with the user used by rsync through ssh).
    I read here and there that the error means I have to ensure that my rsyncd.conf has the right read only = no in it, but as far as I know I never used rsyncd, never configured anything for it, and until now it worked like a charm.
    I use the following command to do the backup:
      rsync -ab --recursive \
        --files-from="$FILES_FROM" \
        --backup-dir=backup_$SUFFIX \
        --delete \
        --filter='protect backup_*' \
        $WDIRECTORY/ \
        remote_backup:$REMOTE_BACKUP/
    So I'm stuck and really can't figure out what happened.
    Edit: As suggested in comments, I also tried passing commands to ssh (not from inside an ssh session), which worked as expected, and also tried a single rsync command, which didn't work, failing just like the complete backup command.
      (sharedHost):hostuser:~ > touch test.txt
      (sharedHost):hostuser:~ > rsync test.txt remote_backup:backups/test.txt
      ERROR: module is read only
      rsync error: syntax or usage error (code 1) at main.c(1034) [Receiver=3.0.8]
      rsync: connection unexpectedly closed (9 bytes received so far) [sender]
      rsync error: error in rsync protocol data stream (code 12) at io.c(601) [sender=3.0.7]
    and
      (sharedHost):hostuser:~ > ssh remote_backup 'touch /abs_path_to_backups/backups/test2.txt && echo "ProoF" > /abs_path_to_backups/backups/test2.txt'
      (sharedHost):hostuser:~ > ssh remote_backup 'cat /abs_path_to_backups/backups/test2.txt'
      ProoF
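
    For what it's worth, that particular message normally comes from an rsync daemon rather than from a plain rsync-over-ssh transfer, so it may be worth checking whether the DSM update turned on the NAS's rsync/network-backup service and it is now answering with a read-only module. If so, the relevant stanza of the NAS's rsyncd.conf would need to look roughly like this (a sketch; the module name, path, and user are placeholders):

      # sketch of /etc/rsyncd.conf on the NAS -- names and paths are placeholders
      [backups]
          path = /volume1/backups
          uid = hostuser
          read only = no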

    Read the article

  • multilingual mobile site and google seo [closed]

    - by kollo
    Possible Duplicate: How should I structure my urls for both SEO and localization? What's the preferred, SEO-compliant URL structure for a mobile website that is multilingual? I have:
    web:
    en: http://mysite.com
    fr: http://fr.mysite.com
    es: http://es.mysite.com
    mobi: http://m.mysite.com
    Should I use http://m.fr.mysite.com for my mobile French version? Nothing is specified for mobile on the Google blog post about multilingual markup: http://googlewebmastercentral.blogspot.co.uk/2011/12/new-markup-for-multilingual-content.html
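
    For the language side at least, the Google post linked above describes rel="alternate" hreflang annotations; a minimal sketch of what the desktop English homepage could carry, using the URLs from the question (the mobile question itself is separate from this markup):

      <!-- on http://mysite.com/ (the desktop English homepage) -->
      <link rel="alternate" hreflang="en" href="http://mysite.com/" />
      <link rel="alternate" hreflang="fr" href="http://fr.mysite.com/" />
      <link rel="alternate" hreflang="es" href="http://es.mysite.com/" />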

    Read the article

  • IIS7 - Basic Authentication Module missing?

    - by FlySwat
    I'd like to use basic HTTP authentication to keep people out of our dev site instance, since it is unfortunately exposed to the wild internet. However, in IIS7, the only authentication modes listed are Forms, Anonymous and Impersonation. Where did the "Basic Authentication" module go, and how can I get it back?
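
    One likely explanation is that Basic Authentication is an optional IIS7 feature that is not installed by default, so it simply never shows up in the Authentication pane until it is added. On Windows 7 / Server 2008 R2 something like the following should add it from an elevated prompt (a sketch; verify the feature name with "dism /online /get-features" first, and on Vista/2008 use pkgmgr instead):

      rem Enable the Basic Authentication feature for IIS, then restart IIS
      dism /online /enable-feature /featurename:IIS-BasicAuthentication
      iisreset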

    Read the article

  • Unable to access newly created web site in IIS 7.5

    - by Animesh
    Configuration: 32-bit Windows 7 development machine with IIS 7.5. I created a new website in IIS to host only MVC sites, called MVCHOST. The physical path to this website is set to C:\inetpub\mvcroot. I created a new v4.0 pool called mvcpool for this purpose. I have given Modify rights to the IIS_WPG, IIS_IUSRS, and ASPNET accounts. I created this website with a host header "mvchost" and port 80, in the hope of browsing MVC sites in the following way:
    mvchost/mvcapp1
    mvchost/mvcapp2
    instead of:
    localhost/mvcapp1
    localhost/mvcapp2
    The only binding I set is the default one: http:*:80:mvchost. I have also copied the files iisstart.htm, web.config, welcome.png and the folder aspnet_client from wwwroot over to mvcroot. Now when I try to browse this site from IIS Manager, I get the following error: "This webpage is not available". If I leave out the host header and give it some port, say 99, I can access the website at localhost:99. What am I missing here? Why am I unable to access the website at http://mvchost/?
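
    One thing worth checking (a guess based on the symptoms: the port-only binding works but the host-header one does not) is whether the name mvchost actually resolves on the development machine. Since it is not a real DNS name, it would need an entry in the local hosts file, along these lines:

      # C:\Windows\System32\drivers\etc\hosts  (edit as Administrator)
      127.0.0.1    mvchost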

    Read the article

  • Firefox will not remember local site cookie

    - by Campo
    This is a weird one. We have a production server (Server 2008) and two staging servers (Server 2008 and Server 2003), and I have sites on all of these. They all use cookies. On the production server, when browsing to our site www.supernovainteractive.com, there is a cookie that detects when you visited the site, so it will not replay the logo animation (top left-hand side) when you click through to another page. This works in all browsers on the production server. I'm not sure what's going on, but for some reason cookies are not working for one site on the 2008 staging server only. This happens when browsing with Firefox (3.6.3); they work fine in all other browsers (IE, Chrome, Safari, Opera). In addition, the 2003 staging server works fine. You can test on the Supernova Interactive site by noticing the logo in the top left corner. It uses a cookie to detect if you've already seen the animation. Once you've seen it once, it doesn't animate again until tomorrow. Currently, it's animating every time. I have opened an outside-facing port so others can see the issue: http://exchange.supernova.com:10009 Any ideas on this one? Firewalls are off on the server. Notice you do not get a cookie from exchange.supernova.com.

    Read the article

  • Will search engines reindex a page that has been set to redirect on to a newer site page?

    - by Luke Duddridge
    We were asked by a client to change a website so that any pages/URLs we were hosting on an older site would now redirect to a newer site hosted somewhere else, on a different domain name to boot. We did this by changing each page in the IIS site management to redirect to a URL on their new domain instead of rendering a page locally. According to the redirect tool here: http://www.webconfs.com/redirect-check.php, what we have done is search-engine friendly. The problem now is... the client has been on a course learning all about meta tags, and so thinks they have a better understanding of the "matrix" (remember, there is no spoon). As Google still shows the older site in searches, this isn't helping matters. I have tried to explain that we have to wait for Google to reindex. I'm not blowing smoke, am I? I'm now starting to wonder... will the older site always appear in a search, even though the pages don't exist? Is there a better way I should be redirecting their site to ensure Google will stop keeping an index of pages that no longer exist and would instead replace them with the content on the newer site? A suggestion on the site mentioned above is to use the code:
      Response.Status = "301 Moved Permanently"
      Response.AddHeader "Location", "http://www.new-url.com/"
    Does using the redirect option in the IIS management tool not do the same?
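
    For reference, whether the IIS redirect option is equivalent mostly depends on whether it is set to return a permanent (301) rather than the default temporary (302) response. On IIS 7 the same thing can be expressed in the old site's web.config roughly like this (a sketch; the destination URL is a placeholder):

      <configuration>
        <system.webServer>
          <!-- 301 every request on the old site to the new domain -->
          <httpRedirect enabled="true"
                        destination="http://www.new-url.com/"
                        httpResponseStatus="Permanent" />
        </system.webServer>
      </configuration>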

    Read the article

  • On-Site SEO 101 - Simple Tips to Help Improve Your Website Visibility

    Search Engine Optimization (SEO) is often perceived as impossible to understand by novice website owners and creators. While it is true that any website would benefit greatly from experienced, professional SEO services, many people simply cannot afford to keep an SEO on payroll on a consistent, month-to-month basis. For people in this category, all hope is not lost.

    Read the article
