Search Results


  • % new visitor vs. % returning visitor

    - by Torben Gundtofte-Bruun
    I'm not sure how to interpret the results in Google Analytics. I understand that some metrics should be high and some should be low, but this one I don't get: % new visitors vs. % returning visitors. It's good that users are returning, but surely it's also good to get new, fresh visitors. How do I evaluate this %-vs-% ratio? The higher the better: visits, unique visitors, pageviews, pages per visit, avg. visit duration. The lower the better: bounce rate, drop-offs.

    Read the article

  • Submitting a sitemap to take care of inherited Google crawler errors

    - by leeand00
    I have an awful lot of Google Crawler errors (1000 or so) after I inherited a site that the previous owner migrated without moving much of their content. Would generating a map of the current site and submitting it to Google help fix this? Is there any quicker, automated way to eliminate errors other than clicking each and every site error? Note: I have already tried automating this on my own.
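
    A sitemap won't remove the crawl errors by itself, but it does tell Google which URLs on the current site actually exist, so the inherited 404s tend to drop out of the report over time. A minimal sitemap is just an XML file listing the live URLs; a sketch (the domain and paths are placeholders):

        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <url>
            <loc>http://www.example.com/</loc>
            <lastmod>2012-01-01</lastmod>
          </url>
          <url>
            <loc>http://www.example.com/about.html</loc>
          </url>
        </urlset>

    Submitting the file through Webmaster Tools, rather than just leaving it at the site root, gets it picked up promptly.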

    Read the article

  • Improve efficiency of web building setup and processes - Wordpress on Mac

    - by Rob
    Can anyone see any ways in which I can improve my speed and efficiency with the following setup? Or are there any obvious holes in my building process? This is for building Wordpress websites on a Mac:

    1) I have a standard Wordpress setup that I work from, which includes the various plugins I tend to use across all builds - thus cutting out the step of having to download them every time.
    2) My standard WP files are copied into a Dropbox folder - thus creating backups of the files.
    3) I then open up MAMP and set up a local version.
    4) I open up Coda and set up the FTP details so files can be uploaded to the live domain using the publish button.

    If anyone has any advice on how to improve this process then please let me know!

    Read the article

  • Rewrite for robots.txt and favicon.ico

    - by BHare
    I have set up some rules in which subdomains (my users) will default to where I have located the robots.txt, favicon.ico, and crossdomain.xml. Therefore if a user creates a site, say testing.mywebsite.com, and they don't make their own favicon.ico at testing.mywebsite.com/favicon.ico, it will use the favicon.ico I have in /misc/favicon.ico. This works perfectly, but it doesn't work for the main website. If you attempt to go to mywebsite.com/favicon.ico it will check whether "/" exists, which it does, and then never rewrites to /misc/favicon.ico. How can I get both cases to rewrite to /misc/favicon.ico?

        # If crossdomain.xml (openpalace file), favicon.ico or robots.txt doesn't exist
        # on the user's side, rewrite to the site's copy just to have something to serve.
        RewriteCond %{REQUEST_URI} crossdomain.xml$
        RewriteCond ^(.+)crossdomain.xml !-f
        RewriteRule ^(.*)$ /misc/crossdomain.xml [L]
        RewriteCond %{REQUEST_URI} favicon.ico$
        RewriteCond ^(.+)favicon.ico !-f
        RewriteRule ^(.*)$ /misc/favicon.ico [L]
        RewriteCond %{REQUEST_URI} robots.txt$
        RewriteCond ^(.+)robots.txt !-f
        RewriteRule ^(.*)$ /misc/robots.txt [L]
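
    For what it's worth, the second RewriteCond in each block tests a literal pattern string rather than the requested file, so the file-existence check never does what it looks like it does. A sketch of the usual way to write this, testing %{REQUEST_FILENAME} so the check is real for the main site and subdomains alike (an assumption about the intended behaviour):

        RewriteCond %{REQUEST_URI} favicon\.ico$
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteRule ^(.*)$ /misc/favicon.ico [L]
        RewriteCond %{REQUEST_URI} robots\.txt$
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteRule ^(.*)$ /misc/robots.txt [L]
        RewriteCond %{REQUEST_URI} crossdomain\.xml$
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteRule ^(.*)$ /misc/crossdomain.xml [L]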

    Read the article

  • Evil Spam Emails caused hosting account suspension!

    - by Sei
    We have a couple of domains hosted at rackservers.com.au. Recently our account got suspended without any notice. I then filed a ticket and soon got the answer: 'Someone is forging email accounts from your domain and they have been sending out spam emails, so we do not want you here anymore - take your backup and go.' I am quite shocked by such an attitude and even more confused about what action to take in this situation. Should I take my backup and go? Should I ask them for more details? How can I prevent this from happening again in the future?

    Read the article

  • Apache - .htaccess RewriteRule from domainA to domainB

    - by milo5b
    Problem: I have a website (mywebsite.com) that was, and partly still is, indexed in Google. Somebody pointed their own domain name (theirsite.com) at my server and DNS, so it resolves to my IP. Now, probably being an older domain, it outranks me in Google, and the pages at my domain are starting to get de-indexed (probably as duplicate content or something). So, for example, my homepage got de-indexed, and their homepage (theirsite.com/) is indexed with my content/code/etc. The same goes for other pages (theirsite.com/other/page.html is showing mysite.com/other/page.html). Quick fix: I have added a few lines to my PHP code, checking $_SERVER['HTTP_HOST'] and redirecting to my domain if the host is different. It does the job, but to me it looks like a dirty solution. Question: I could not find a way to have Apache do this job. I would prefer an apache/.htaccess solution to this problem (redirecting all traffic from domainA.com/(.*) to domainB.com/$1) - is it possible in any way? Thanks
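
    In .htaccess this is typically done with a host check and a 301 redirect; a sketch, assuming mywebsite.com is the canonical domain:

        RewriteEngine On
        RewriteCond %{HTTP_HOST} !^(www\.)?mywebsite\.com$ [NC]
        RewriteRule ^(.*)$ http://mywebsite.com/$1 [R=301,L]

    The 301 also tells Google the duplicate URLs have moved permanently, which should help the de-indexed pages recover.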

    Read the article

  • Transfer .com domain to GoDaddy - websites running on same domain - 3 weeks left until expiration, 2 days left web hosting

    - by Eric Nguyen
    Our company purchased the domain abc.com from a local registrar. The domain will expire in about 3 weeks. We have our main websites running on this abc.com domain and they cannot be down for long. The web hosting service will end in 2 days, but the websites are already hosted, up and running, on Amazon EC2. We would like to transfer the domain to GoDaddy now or as soon as possible (since we have many other domains there, and we believe GoDaddy will be better long-term considering the prices and the features it offers). There are several questions around the decision to transfer the domain to GoDaddy:

    1) What cost and time are required to move out of our local registrar? This is currently unknown, as I'm still trying to retrieve the agreement we have with them.
    2) How do the 3 weeks left until the domain expires matter here? Should we wait until the domain expires and then purchase it through GoDaddy? How long would such a process take, as I suppose our websites would be down during that time? Any other drawbacks?
    3) What can I do to ensure our websites keep functioning regardless of the domain transfer process? It seems the actual registrar here is enom.com and the local registrar just partners with it, so I suppose I should park the abc.com domain with enom.com and change the DNS settings so that our websites continue to be hosted on EC2 as normal.

    How long does it normally take for a domain to be transferred to GoDaddy completely? Is it even possible to keep our websites up and running during the whole transfer process? Apologies for throwing so many questions at once - it's rather last minute, and I suddenly realised there are too many unknown risks.

    Read the article

  • Possible to set equal height for divs in pairs but only if browser width 960 or wider?

    - by CreateSean
    I'm working on a responsive site where I've found that I've got pairs of divs whose heights I would like to be equal, but only if the browser width is greater than or equal to 960px. Any smaller than that and the divs stack, so different heights don't matter.

        DIV 1 | DIV 2
        DIV 3 | DIV 4
        DIV 5 | DIV 6
        DIV 7 | DIV 8

    Based on the above setup, Div 1 and Div 2 need to be equal height, as do Div 3 and Div 4, but the pairs do not need to be equal to each other; i.e. the pairs can have different heights, but the two divs within each pair must be equal. Is this possible, and if so what is the best approach to take? My javascript/jQuery is rather elementary. I'm sure I could do equal heights alone, but with the pair sets I'm not sure, and once I add the requirement that this only happens at 960px or wider, I'm lost.
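
    A minimal jQuery sketch of one way to do this, assuming each pair is wrapped in a container with a class like .pair and the two divs inside share a class like .col (the class names are placeholders):

        function equalizePairs() {
            $('.pair').each(function () {
                // reset any height set on a previous pass before measuring
                var $cols = $(this).find('.col').css('height', '');
                if ($(window).width() >= 960) {
                    // tallest div in this pair only
                    var max = Math.max.apply(null, $cols.map(function () {
                        return $(this).height();
                    }).get());
                    $cols.height(max);
                }
            });
        }
        $(window).on('load resize', equalizePairs);

    Binding to resize means the pairs re-equalize (or un-equalize) whenever the browser crosses the 960px breakpoint in either direction.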

    Read the article

  • Traffic fall after a server problem

    - by Sébastien
    I have a website whose traffic I analyse with Google Analytics. Day after day the traffic (mainly from Google SE) increased, until I had a problem with my server. The server was offline for one day, and since then I no longer get as many users as before. Now it's as if the site is no longer referenced in Google's index (but when I type "site:mysite.com", I still get all the results). Do you know if this is normal behaviour, and whether the traffic will come back to what it was (the server had its problems two days ago)?

    Read the article

  • Error using SoapClient() in PHP [migrated]

    - by Dhaval
    I'm trying to access a WSDL (Web Service Definition Language) file using PHP's SoapClient(). I found that the WSDL file is authenticated. I tried passing credentials in an array as a second parameter and activated SSL on my server, but I'm still getting an error. Here is the code I'm using:

        $client = new SoapClient("https://webservices.chargepointportal.net:8081/coulomb_api_1.1.wsdl",
                                 array("trace" => "1", "Username" => "username", "Password" => "password"));

    Here is the error I'm getting:

        Warning: SoapClient::SoapClient(https://webservices.chargepointportal.net:8081/coulomb_api_1.1.wsdl) [soapclient.soapclient]: failed to open stream: Connection timed out in PATH_TO_FILE on line 80
        Warning: SoapClient::SoapClient() [soapclient.soapclient]: I/O warning : failed to load external entity "https://webservices.chargepointportal.net:8081/coulomb_api_1.1.wsdl" in PATH_TO_FILE on line 80
        Fatal error: Uncaught SoapFault exception: [WSDL] SOAP-ERROR: Parsing WSDL: Couldn't load from 'https://webservices.chargepointportal.net:8081/coulomb_api_1.1.wsdl' : failed to load external entity "https://webservices.chargepointportal.net:8081/coulomb_api_1.1.wsdl" in PATH_TO_FILE:80
        Stack trace: #0 /home2/wingstec/public_html/widget/API/index.php(80): SoapClient->SoapClient('https://webserv...', Array) #1 {main} thrown in PATH_TO_FILE on line 80

    The error seems to say the file doesn't exist at the given path, but when we open that URL directly in a browser we get the file. Can anyone help me figure out what exactly the problem is?
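
    Two things worth noting: the error is a network timeout (the web server likely can't reach port 8081 even though your browser can), and SoapClient's HTTP-auth option keys are the lowercase login/password, not Username/Password. A sketch using the documented options (the credentials are placeholders):

        <?php
        try {
            $client = new SoapClient(
                "https://webservices.chargepointportal.net:8081/coulomb_api_1.1.wsdl",
                array(
                    'trace'              => 1,          // keep request/response for debugging
                    'login'              => 'username', // HTTP basic auth user
                    'password'           => 'password', // HTTP basic auth password
                    'connection_timeout' => 15,         // fail fast instead of hanging
                )
            );
        } catch (SoapFault $e) {
            echo "SOAP error: " . $e->getMessage();
        }

    If the fixed options still time out, the host is probably blocking outbound connections to port 8081, which only the hosting provider can change.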

    Read the article

  • Project planning and customer tracking system

    - by Daniel Hollands
    First off, sorry if this is the wrong 'stack' site, but it seemed like a good place to start. I'm happy to report that my services as a web developer are starting to be in quite a lot of demand, and I have a few existing and potentially new customers all lining up - but I'm finding it very hard to keep track of everything. What I'm hoping for is some (preferably web-based) system which I can use to keep track of who my customers are, the various projects I've got going on for them, and (if possible) the individual sub-tasks that make up each project. What would be even better is if the relevant customer was able to log into the site and see the progress of their projects. I hope you know what I'm talking about, and that you'll be able to suggest either web-based services that offer something along these lines, or an open source solution? Thank you

    Read the article

  • Sharp decline in website statistics

    - by Erfan Safarpoor
    Ten months ago the statistics for my website were very high. Very high... But after ten days of server failure, traffic was about 20 times lower, and it stayed that way for a long time even though, as far as I know, I made no mistake. As for the source of links, I have hired a writer, and the final results are being watched. But here is the strange thing: approximately every two months the traffic climbs roughly 20-fold again, and then drops back down after about ten days! my website url : www.sooran.com (food.sooran.com)

    Read the article

  • Apache: Virtual Host and .htaccess for URL rewriting not working

    - by parth
    I have configured a virtual host on my local machine and everything is working fine. Now I want to use SEO-friendly URLs. To achieve this I have used a .htaccess file. My virtual host configuration is:

        <VirtualHost *:80>
            DocumentRoot "C:/xampp/htdocs/ypp"
            ServerName ypp.com
            ServerAlias www.ypp.com
            ##ErrorLog "logs/dummy-host2.localhost-error.log"
            ##CustomLog "logs/dummy-host2.localhost-access.log" combined
        </VirtualHost>

    and my .htaccess file has:

        AllowOverride All
        RewriteEngine On
        RewriteBase /ypp/
        RewriteRule ^/browse$ /browse.php
        RewriteRule ^/browse/([a-z]+)$ /browse.php?cat=$1
        RewriteRule ^/browse/([a-z]+)/([a-z]+)$ /browse.php?cat=$1&subcat=$2

    The above .htaccess setting is not working. I then modified my virtual host settings and it works. The new virtual host setting is:

        <VirtualHost *:80>
            RewriteEngine On
            RewriteRule ^/browse$ /browse.php
            RewriteRule ^/browse/([a-z]+)$ /browse.php?cat=$1
            RewriteRule ^/browse/([a-z]+)/([a-z]+)$ /browse.php?cat=$1&subcat=$2
            ServerAdmin [email protected]
            DocumentRoot "C:/xampp/htdocs/ypp"
            ServerName ypp.com
            ServerAlias www.ypp.com
            ##ErrorLog "logs/dummy-host2.localhost-error.log"
            ##CustomLog "logs/dummy-host2.localhost-access.log" combined
            <Directory "C:/xampp/htdocs/ypp">
                AllowOverride All
            </Directory>
        </VirtualHost>

    Please guide me on where I am going wrong in the .htaccess file. I do not want to keep the settings in the virtual host, because for every change I have to restart Apache.
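
    Two likely culprits, for what it's worth: AllowOverride is not valid inside a .htaccess file (it belongs in a <Directory> block in the vhost, as in the working config), and in per-directory .htaccess context the URL path is matched without its leading slash. A sketch of the .htaccess with those two changes, assuming it sits in C:/xampp/htdocs/ypp:

        RewriteEngine On
        RewriteBase /
        RewriteRule ^browse$ browse.php [L]
        RewriteRule ^browse/([a-z]+)$ browse.php?cat=$1 [L]
        RewriteRule ^browse/([a-z]+)/([a-z]+)$ browse.php?cat=$1&subcat=$2 [L]

    The <Directory "C:/xampp/htdocs/ypp"> AllowOverride All block stays in the vhost; that part is read once at restart, but the .htaccess itself is re-read on every request, so rule changes won't need a restart.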

    Read the article

  • Email sent via Google via relayhost being marked as spam

    - by Mark H
    Company email is hosted by Google Apps. The company PBX in-house is Elastix. All voicemails received on the Elastix extensions are supposed to be emailed by the CentOS server (Postfix) to the employee's email address. Using relayhost in Postfix, I am sending those emails through Google Apps (smtp.gmail.com), but some of these voicemail emails end up in spam. Sent through Google, and sent to an address hosted by Google - yet there's spam. Email sent from the Google Apps interface draws no complaints of going to spam - just email from the Elastix server. I've just asked our DNS guys to add SPF records, but is that all that's needed? Some help please!
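
    An SPF record is a good start; since the mail is relayed through Google's servers, the usual record is a TXT entry on the domain that includes Google's SPF - a sketch (the ~all softfail is a common choice):

        example.com.  IN TXT  "v=spf1 include:_spf.google.com ~all"

    If any mail still leaves directly from the Elastix box rather than through the relay, its public IP would need an ip4: mechanism as well. Beyond SPF, a matching reverse-DNS name for the server and a consistent From: address on the voicemail messages also tend to matter to spam filters.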

    Read the article

  • Duplicating someone's content legitimately & writing HTML to support that

    - by Codecraft
    I want to add content from other blogs to my own (with the authors' permission) to help build additional relevant content and support articles I've found useful that others have written. I'm looking into how to do this responsibly - i.e., by giving the original content author a boost and not competing against them for search traffic which should go to their site. In order to keep my duplicate content out of search, and to hint to the search engines where the original content is to be found, I've implemented:

        <head>
          <meta name='robots' content='noindex, follow'>
          <link rel='canonical' href='http://www.originalblog.com/original-post.html' />
        </head>

    Additionally, to boost the original article and to let readers know where it came from, I'll be adding something like this:

        <div>
          Article originally written by
          <a href='http://www.authorswebsite.com'>Authors Name</a>
          and reproduced with permission.<br/>
          <a href='http://www.originalblog.com/original-post.html' target='new'>
            Read the original article here.
          </a>
        </div>

    All that remains is a way to 'officially' credit the original author in the HTML for the search spiders to see. Can anyone tell me a way to do this, possibly using rel="author" (as far as I can see that's only good for my own original content)? Or perhaps it doesn't matter, given that the reproduced pages will be kept out of search engines? Also, have I overlooked anything in the approach?

    Read the article

  • DNS and Wildcard CNAME

    - by Thomas Chapman
    Whenever I attempt to make a record for *.schneiderdonnelly.com.au and CNAME it, I get two errors:

        You can't mix CNAME/MX records together using the same hostname.
        Domain roots cannot be CNAMEs; however, you can web-forward this record to www.schneiderdonnelly.com.au instead for the same effect.

    I've read it's possible, so why can't I make it work? I donated $5 to become a premium member, and I've been trying to make it work for yonks. http://i.stack.imgur.com/D9Ui5.jpg This is how I want it to appear - the last record. I am prepared to swap DNS providers as long as they're free.
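
    For context, the first error reflects a hard DNS rule, not a provider quirk: a CNAME cannot coexist with any other record type (MX, TXT, etc.) at the same name, and the zone apex cannot be a CNAME at all. A wildcard CNAME on its own is legal; a sketch in BIND zone-file syntax (the target host and IP are placeholders):

        ; wildcard for subdomains - fine, as long as nothing else exists at the wildcard name
        *.schneiderdonnelly.com.au.   IN CNAME   target.example.com.

        ; the apex needs an A record instead of a CNAME
        schneiderdonnelly.com.au.     IN A       203.0.113.10

    So if the interface refuses the wildcard, the likely cause is an existing MX (or other) record at that same wildcard name.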

    Read the article

  • Handling SEO for Infinite pages that cause external slow API calls

    - by Noam
    I have an 'infinite' number of pages on my site which rely on an external API. Generating each page takes time (about a minute). Links on the site point to such pages, and when a user clicks one it is generated and he waits. Since I cannot pre-create them all, I am trying to figure out the best SEO approach to handle these pages. Options:

    1. Create really simple pages for the web spiders, and only real users will fetch the data and generate the full page. I'm a little 'afraid' Google will see this as low-quality content, which might also feel duplicated.
    2. Put them under a directory on my site (e.g. /non-generated/) and disallow it in robots.txt. The problem here is I don't want users to have to deal with a different URL when they want to share the page or make sense of it. I thought about redirecting real users from this URL back to the regular hierarchy, and that way 'fooling' Google into not reaching them. Again, not sure it will like me for that.
    3. Let it crawl these pages. The main problem is I can't control the rate of the API calls, and my site also seems slower than it should from a spider's perspective (if it only crawled the generated pages, it would think the site is much faster).

    Which approach would you suggest?
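
    For reference, the robots.txt rule in option 2 would simply be:

        User-agent: *
        Disallow: /non-generated/

    though note that Disallow only stops crawling, not indexing of URLs Google discovers through links; a noindex meta tag on the generated pages is the stronger guarantee.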

    Read the article

  • Godaddy multiple domain problem

    - by user6182
    I have the GoDaddy Deluxe plan and here is my problem: I have two domains, for example e1.com and e2.com, both hosted on the same hosting plan. First I created a folder for each domain in the root folder and uploaded the two web sites, but when I try to run my sites, the URL for e1 always shows http://e1.com/e1/ and for e2 it shows http://e2.com/e2. Can I avoid showing the e1 and e2 folders and only show http://e1.com and http://e2.com? Thank you.
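
    On shared hosts like this, the usual workaround is a rewrite in the root .htaccess that maps each host to its folder internally, without the folder appearing in the URL; a sketch, assuming both domains point at the same root:

        RewriteEngine On
        # serve e1.com from the /e1 folder without exposing it
        RewriteCond %{HTTP_HOST} ^(www\.)?e1\.com$ [NC]
        RewriteCond %{REQUEST_URI} !^/e1/
        RewriteRule ^(.*)$ /e1/$1 [L]
        # same for e2.com
        RewriteCond %{HTTP_HOST} ^(www\.)?e2\.com$ [NC]
        RewriteCond %{REQUEST_URI} !^/e2/
        RewriteRule ^(.*)$ /e2/$1 [L]

    That said, if the hosting control panel lets you set each domain's document root directly to its folder, that is the cleaner fix.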

    Read the article

  • Install Moodle to subdomain with Softaculous via cPanel

    - by Sean
    I installed Moodle to a directory with Softaculous. Since Softaculous doesn't allow installing to a subdomain, after installing I created a subdomain and pointed its destination to the previously created Moodle directory. Now when I go to subdomain.example.com it says:

        Incorrect access detected, this server may be accessed only through "http://example.com/moodle" address, sorry. Please notify server administrator.

    I must be doing something wrong; when installing, it was very similar to these instructions. Any suggestions would be much appreciated.
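
    That message comes from Moodle comparing the incoming URL against $CFG->wwwroot in its config.php, which Softaculous set to the directory URL. A sketch of the change, assuming the subdomain should become the canonical address:

        <?php
        // in config.php - point wwwroot at the subdomain instead of /moodle
        $CFG->wwwroot = 'http://subdomain.example.com';

    Internal links are built from this value, so after changing it you may also need to update stored URLs (Moodle ships an admin search-and-replace tool for this) and purge caches.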

    Read the article

  • XAMPP - Unable to serve files larger than ~30MB [on hold]

    - by Sparx401
    I'm developing a site locally with XAMPP on Windows 7, and as far as media is concerned, I'm unable to play media files larger than 30MB or so. Both video and audio files (MP4 and MP3 respectively) generate this error in Chrome (and show similar errors in other browsers such as IE9 and Opera):

        No data received. Unable to load the webpage because the server sent no data.
        Error 324 (net::ERR_EMPTY_RESPONSE): The server closed the connection without sending any data.

    It seems that the exact number of MB varies somewhat between browsers, though. One video in question is 34MB and actually plays in Opera and IE9, but gives the aforementioned error in Chrome. I've checked that the file paths are typed correctly and ensured that the directive to serve MP4s is in .htaccess:

        AddType video/mp4 mp4

    I also have these directives set in the same .htaccess file:

        php_value upload_max_filesize "80M"
        php_value post_max_size "80M"
        php_value max_input_time 60
        php_value max_execution_time 60

    and memory_limit is set in php.ini as "128M", so I'm left wondering: what is causing my files not to play, and what directives, if any, do I have to change on the server side? Perhaps something to do with limitations of the GET method (the method I'm seeing on Chrome's network tab among other header request/response info)?

    Read the article

  • Display only visits referred from a Google AdWords campaign

    - by Adjam
    I want to display only the visits to my site which were sent by my Google AdWords campaign, preferably on the Visitors overview page. I've tried filtering with 'Advanced Segments', but when I select "Paid Search Traffic" visits go down to zero. Yet I know that most of my visitors at the moment are being sent from Google AdWords. In this question, the answer (which was not chosen) suggested adding an HTTP GET parameter or a URL shortener, but surely there is a way to do it in Analytics?

    Read the article

  • Forum software alternative to phpBB3

    - by Fernando
    I've been using phpBB3 for quite some time now. It seems to me this forum software hasn't evolved at all in all these years. Installing mods is a hassle, updating to a newer version is a real pain in the arse, and moderating is not intuitive at all. Besides, I find there's just no way to stop spam on it. Lots of web software has done a great job of controlling spam, but phpBB3 still doesn't - at least not without complex and tedious work. Since my last attempt to update to the latest version broke it, I'm finally fed up, and have decided I'm not wasting another minute maintaining such a beast. I'm looking for a free software alternative (free as in free beer and free as in free speech, so SMF is not an alternative at the moment). The most important requirement is that there must be a script to migrate all of the current phpBB users and posts into the new system. Out of all the alternatives out there, do any of them support this? Which one do you recommend?

    Read the article

  • Is it true that quickly closing a webpage opened from a search engine result can hurt the site's ranking?

    - by Austin ''Danger'' Powers
    My web designer recently told me that I need to be careful not to Google for my business' website, click on its search result link, then quickly close the page (or click back) too many times. He says "Google knows" that I didn't stay on the page, and could penalize my site for having a high click-through rate if it happens too much (it was something along those lines- I forget the exact wording). Apparently, it could look like the behavior of a visitor who was not interested in what they found (hence the alleged detrimental effect on the site's search ranking). This sounds hard to believe to me because I would not have thought any information is transmitted which tells Google (or anyone, for that matter) whether or not a website is still open in a browser (in my case Firefox v25.0). Could there possibly be any truth to this? If not, why might he have come to this conclusion? Is there some click-tracking or similar technology employed by search engines which does something similar? Looking forward to hearing everyone's thoughts.

    Read the article

  • fsockopen() error: Network is unreachable, port 43, in PHP [closed]

    - by hamid
    I've written some PHP code that looks up whois information for a domain, but it fails! This is some of my code:

        function checkdomain($server, $domain) {
            global $response;
            $connection = fsockopen($server, 43);
            fputs($connection, "domain " . $domain . "\r\n");
            while (!feof($connection)) {
                $response .= fgets($connection, 4096);
            }
            fclose($connection);
        }

        checkdomain("whois.crsnic.net", "www.example.com");

    The code works on my localhost (Apache, PHP, MySQL, OS - Win XP), but when I uploaded it to my host (Linux) it failed, and I always see the error below:

        Warning: fsockopen() [function.fsockopen]: unable to connect to whois.crsnic.net:43 (Network is unreachable) in /home/hamid0011/public_html/whois/whois.php on line 37

    What should I do? Is this my host's problem, the whois server (but it works on localhost), or my code? TNX

    Read the article

  • Nginx or Apache for a VPS?

    - by James
    I consider myself an inexperienced user/administrator when it comes to running my VPS. I can get by with a few CLI commands, I can set up Webmin and I can set up Yum repos, but beyond the very basic stuff, I'm out of my depth. So far, I'm running Apache. I don't know it particularly well, but I can get by with editing httpd.conf if I'm told what to edit. I've heard good things about Nginx - that it's not as resource-hungry as Apache. I'd like to give it a go, but I can't find any information about its suitability for administrators like me, with little experience of sysadmin or web server config. Webmin now has support for Nginx, so getting it installed and running probably won't be too much of a problem. What I'm wondering is, from a site administrator's perspective, is running Nginx as transparent as running Apache? I.e., at the moment I can just throw up Wordpress and Drupal sites without having much to worry about or having to make any config changes to Apache. Would Nginx be as transparent?

    Read the article
