
  • Why is Google Analytics displaying the wrong landing pages?

    - by Salman
    I see all of my pages listed as landing pages in Google Analytics, which cannot be true: I did not post those pages anywhere, and I don't see any traffic hitting them directly. Also, I am using virtual page views on a few buttons, and I see those virtual pages as landing pages too. For example, /click/request-a-quote shows 35,000 views, and 35,000 is too big a number to ignore. Even if I ignore the virtual page views, I see a lot of pages reported as landing pages that I am 100% sure visitors (at least not so many of them) are NOT hitting directly. Any advice on how to debug this? PS: I'm using the following code:

        var _gaq = _gaq || [];
        _gaq.push(['_setAccount', '<']);
        _gaq.push(['_setDomainName', 'none']);
        _gaq.push(['_setLocalGifPath', '/images/_utm.gif']);
        _gaq.push(['_setAllowLinker', true]);
        _gaq.push(['_trackPageview', 'account/phase1']);
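
    One way to keep button clicks out of the landing-page report is to track them as events rather than virtual pageviews, since an event hit does not register as a page a session can "land" on. A minimal sketch using the same ga.js queue as above (the category/action/label names are placeholders):

        // track the button click as an event instead of a virtual pageview,
        // so it can never show up as a landing page
        _gaq.push(['_trackEvent', 'Buttons', 'click', 'request-a-quote']);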

    Read the article

  • Is it possible to block traffic originating from a specific country?

    - by mickburkejnr
    Hi guys, my personal website is getting a lot of spam comments at the moment, and most of them originate from Russia (I've used Google Analytics to identify the traffic, and a lot of the links point to Russian sites). As it's a pain to keep deleting these comments, I would like to block people from there from commenting on, or even visiting, the website. Is this possible? Also, the website is using WordPress. Many thanks!
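
    If the server has Apache's mod_geoip module available, one option is to deny requests by country code at the .htaccess level. This is only a sketch, assuming mod_geoip is installed and its database is configured in the server config:

        <IfModule mod_geoip.c>
            GeoIPEnable On
        </IfModule>
        # deny requests whose GeoIP country code resolves to RU
        SetEnvIf GEOIP_COUNTRY_CODE RU BlockCountry
        Order Allow,Deny
        Allow from all
        Deny from env=BlockCountry

    At the WordPress level, an anti-spam plugin such as Akismet may solve the comment problem without blocking an entire country's legitimate visitors.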

    Read the article

  • .htaccess file to implement multiple redirects

    - by RobMorta
    I have a dynamic site and am creating clean URLs from an .htaccess file:

        RewriteCond %{REQUEST_URI} !(\.png|\.jpg|\.gif|\.jpeg|\.bmp)$
        RewriteRule ^([a-zA-Z0-9_\-\+\ ]+)$ flight.php?flights=$1&slug=$1

    This code worked fine for me, but when I created a new type of page and tried to get clean URLs with the same kind of rule:

        RewriteCond %{REQUEST_URI} !(\.png|\.jpg|\.gif|\.jpeg|\.bmp)$
        RewriteRule ^([a-zA-Z0-9_\-\+\ ]+)$ manual-page.php?url=$1&slug=$1

    it's not working, and if I comment out the previous two lines then it works fine. Only one rule works at a time. For the first, I have a URL domain.com/flight.php?flight-san-fransisco-london-flights and I want it rewritten to domain.com/san-fransisco-london-flights; for the second, I have domain.com/manual-page.php?url=my-new-page and I want it rewritten to domain.com/my-new-page. Is there any way to get both working together?
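
    Both rules use an identical pattern, so whichever comes first matches every clean URL and the second rule never fires. A sketch of one way around this, assuming flight URLs can be distinguished by a suffix such as -flights (adjust the marker to whatever actually separates the two page types):

        RewriteEngine On
        # flight pages: slugs ending in -flights go to flight.php
        RewriteCond %{REQUEST_URI} !\.(png|jpe?g|gif|bmp)$
        RewriteRule ^([a-zA-Z0-9_\-\+\ ]+-flights)$ flight.php?flights=$1&slug=$1 [L]
        # everything else goes to manual-page.php
        RewriteCond %{REQUEST_URI} !\.(png|jpe?g|gif|bmp)$
        RewriteRule ^([a-zA-Z0-9_\-\+\ ]+)$ manual-page.php?url=$1&slug=$1 [L]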

    Read the article

  • Google Analytics set up with wrong domain

    - by Tom
    I have recently embedded Google Analytics into a site using the default embed code:

        <script>
        (function (i, s, o, g, r, a, m) {
            i['GoogleAnalyticsObject'] = r;
            i[r] = i[r] || function () {
                (i[r].q = i[r].q || []).push(arguments)
            }, i[r].l = 1 * new Date();
            a = s.createElement(o), m = s.getElementsByTagName(o)[0];
            a.async = 1;
            a.src = g;
            m.parentNode.insertBefore(a, m)
        })(window, document, 'script', '//www.google-analytics.com/analytics.js', 'ga');
        ga('create', 'UA-XXXXXXXX-1', 'MYDOMAIN.COM');
        ga('send', 'pageview');
        </script>

    However, I had MYDOMAIN.COM set to an incorrect domain. The views for the site seem very low; however, I can see myself as a visitor in the real-time view. What effect would setting the domain incorrectly have had? How does Google use this parameter?
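
    In analytics.js, the third argument to ga('create', ...) is the cookie domain. If it doesn't match the domain the page is actually served from, Analytics may fail to set or persist its visitor cookie, which can skew visit and visitor counts. A hedged fix is to pass 'auto' and let the library pick the widest cookie domain it can actually write on:

        // let analytics.js choose the correct cookie domain automatically
        ga('create', 'UA-XXXXXXXX-1', 'auto');
        ga('send', 'pageview');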

    Read the article

  • Why does googling for keycaptcha give results on reCAPTCHA? [closed]

    - by vgv8
    EDIT: I'd like to change this title to: How to STOP Google's manipulation of the Google search results presented to the general public?

    I google frequently, and more and more often, when searching for a particular software product, I am given results about Google's own products instead. For example, if I google the keyword keycaptcha for the "Past 24 hours" (after clicking "Show search tools" and then "Past 24 hours" in the left sidebar of the browser), the search results show only results about reCAPTCHA. Image uploaded later:

    Though if I confine keycaptcha in quotes, the results are "correct" (well, kind of, since they are still distorted in comparison with other search engines). I checked this over a few months, from different domains at different ISPs, on different operating systems, and from a dozen browsers. The results are the same. Why is this, and how can it possibly be corrected?

    My related posts: "How Gmail spam filter works?", IP addresses blacklisting.

    Update: It is impossible for me to use google.com directly, as I am always redirected from google.com to google.ru by the "auto-detect location by IP address" convenience. Google's help says that my location auto-detection cannot be switched off because it is a very helpful feature. There is a workaround: use google.com/ncr to reach google.com and prevent the redirection (does anybody know what ncr stands for?). But all the results are exactly the same. OK, I can search for the quoted "keycaptcha", and I am already accustomed to these Google quirks, but the question arises: why the heck burn time promoting someone's product if GOOGLE shows its own interests/brands (reCAPTCHA) in place of other product brands, and what can be done about it? The general user will not understand that he was cheated and will just pick up the first (wrong) results.

    Update 2: Note that this googling behaviour is independent of whether I am logged in to (or logged out of) a Google account, which account, the browser (I tried Opera, Chrome, Firefox, IE of different versions, Safari), the OS, or even the domain. There are many such cases, but I targeted one concrete, restricted example specifically to prevent wandering between unrelated details and peculiarities.

    @Michael, first, that is not true, and this text contains 2 links to real and significant results. I also wrote that this is just one concrete example of many, based on many months of experience. These distortions happen upon clicking "Past 24 hours", "Past week", "Past month", or "Past year", with many other keywords and search configurations, etc. Second, the absence of results is itself a result, and there is no point in sneakily substituting it with another, unsolicited one. That is the definition of spam and scam. Third, the question is not about workarounds like how to write search queries or which other search engines to use. The question is how to straighten out Google's results in order to stop disorienting the general public.

    Update: I could not understand: does nobody reproduce the behaviour I described (i.e. when I click the "Past 24 hours" link in a Google search for keycaptcha, the results presented are only about reCAPTCHA)?

    Update: And for the "Past week":

    Read the article

  • Is there a better way to have a two column website with header and footer, equal height columns and stretchy column widths? [closed]

    - by Seamus
    I wrote a website a while ago that is a little messy in how it does things. I used this CSS template and this equal-height-columns trick. I have not one but two container divs, and I can't remember what they're doing. So I'm thinking of restructuring the thing from scratch, and possibly making use of the more "semantic" HTML5 tags like <nav> and so on at the same time. The question is: is there a better way to achieve a site structure with these properties?

    - 2 equal-height main columns (with widths as percentages of the available real estate, not explicitly stated)
    - both a header and a footer element that stretch the whole width of the two main columns
    - allows the use of semantic HTML5 tags instead of meaningless divs
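
    A minimal sketch of one common approach from that era, using CSS display: table-cell for the equal-height columns (the .columns class name is made up; the percentage widths stretch with the viewport):

        <!DOCTYPE html>
        <html>
        <head>
        <style>
            .columns { display: table; width: 100%; table-layout: fixed; }
            .columns > nav     { display: table-cell; width: 30%; } /* table-cells in the same row */
            .columns > section { display: table-cell; width: 70%; } /* are always equal height     */
            header, footer { width: 100%; }
        </style>
        </head>
        <body>
            <header>full-width header</header>
            <div class="columns">
                <nav>left column</nav>
                <section>main column</section>
            </div>
            <footer>full-width footer</footer>
        </body>
        </html>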

    Read the article

  • Changing domain name - what are the practical steps involved

    - by Homunculus Reticulli
    I launched a website a couple of years ago, bright-eyed and bushy-tailed, with dreams of conquering the world. Unfortunately it wasn't to be. Now that I am a bit older and wiser, and have spent some money on branding and creating more quality content, I am rebranding and relaunching the site with a new domain name. Although the traffic on the old site is laughable (i.e. non-existent), there are a few pages of good information on there, and I don't want to lose any "juice" those pages may have gained because web crawlers have been seeing them for a few years now. OK, the upshot of all that is this: I want to change my domain name from xyz.com to abc.com. I am maintaining the same friendly URLs I had before; only the domain-name part of each URL will change, so that any traffic coming to an old page is forwarded/redirected to the new page seamlessly. How do I go about achieving this, i.e. what are the steps I need to carry out to minimize any "disruption" to whatever credibility the existing site has with Googlebot etc.? I am running Apache 2.x on a headless Linux (Ubuntu) server.
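
    Since the path structure is unchanged, the core step is a site-wide 301 redirect on the old domain. A sketch for Apache 2.x using the question's placeholder domains, which could live in the old site's .htaccess (mod_rewrite assumed enabled):

        RewriteEngine On
        # permanently redirect every request on xyz.com to the same path on abc.com
        RewriteCond %{HTTP_HOST} ^(www\.)?xyz\.com$ [NC]
        RewriteRule ^(.*)$ http://abc.com/$1 [R=301,L]

    Keeping the old domain registered and serving these redirects long-term, and using the Change of Address feature in Google Webmaster Tools, are the usual complements to this.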

    Read the article

  • Google Analytics HTTP vs HTTPS

    - by Pelangi
    I want to use Google Analytics on a website that uses both HTTP and HTTPS, which works as explained below:

    - Secure pages accessed through https://mydomain.com/secure/* are always on HTTPS. Any access to these pages through HTTP is redirected to HTTPS.
    - Any other pages are accessible through both HTTP and HTTPS.

    I have a Google Analytics profile whose URL uses HTTPS. Will I cover all traffic? Do I need to create another profile using HTTP, and how should I apply the other profile?
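
    A single profile records pageviews from both schemes as long as the tracking snippet loads correctly on both; the profile's default URL mainly affects how report links open. The classic ga.js loader already picks the matching scheme at runtime, a detail worth double-checking on the secure pages:

        var ga = document.createElement('script');
        ga.type = 'text/javascript';
        ga.async = true;
        // load the library over https on secure pages and http elsewhere
        ga.src = ('https:' == document.location.protocol
                  ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
        var s = document.getElementsByTagName('script')[0];
        s.parentNode.insertBefore(ga, s);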

    Read the article

  • Displaying the same page, no matter what URI

    - by jgauffin
    We have moved a web application and would like to display a message on the old IIS server. Let's say that the application was at http://oldserver/appname/. How do I make sure that our moved.html is displayed to the user no matter which URI the user browsed to (in that virtual folder)?

    - http://oldserver/appname/some/path.aspx should display http://oldserver/appname/moved.html
    - http://oldserver/appname should display http://oldserver/appname/moved.html
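
    A sketch of one way to do this, assuming IIS 7 or later with the URL Rewrite module installed: a web.config in the appname folder that rewrites every request except moved.html itself to that page:

        <?xml version="1.0" encoding="UTF-8"?>
        <configuration>
          <system.webServer>
            <rewrite>
              <rules>
                <rule name="EverythingToMoved" stopProcessing="true">
                  <!-- negative lookahead keeps moved.html from rewriting to itself -->
                  <match url="^(?!moved\.html$).*" />
                  <action type="Rewrite" url="/appname/moved.html" />
                </rule>
              </rules>
            </rewrite>
          </system.webServer>
        </configuration>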

    Read the article

  • Why were there suddenly so many 400 requests in my access log?

    - by LotusH
    Below is a small part of my access_log:

        118.186.8.50 - - [19/Dec/2011:22:42:57 +0800] "-" 400 0 "-" "-"
        118.186.8.50 - - [19/Dec/2011:22:42:57 +0800] "-" 400 0 "-" "-"
        118.186.8.50 - - [19/Dec/2011:22:42:57 +0800] "-" 400 0 "-" "-"
        118.186.8.50 - - [19/Dec/2011:22:42:57 +0800] "-" 400 0 "-" "-"
        118.186.8.50 - - [19/Dec/2011:22:42:57 +0800] "-" 400 0 "-" "-"
        220.173.136.39 - - [19/Dec/2011:22:43:22 +0800] "-" 400 0 "-" "-"
        220.173.136.39 - - [19/Dec/2011:22:43:22 +0800] "-" 400 0 "-" "-"
        220.173.136.39 - - [19/Dec/2011:22:43:22 +0800] "-" 400 0 "-" "-"
        220.173.136.39 - - [19/Dec/2011:22:43:22 +0800] "-" 400 0 "-" "-"
        220.173.136.39 - - [19/Dec/2011:22:43:22 +0800] "-" 400 0 "-" "-"
        220.173.136.39 - - [19/Dec/2011:22:43:22 +0800] "-" 400 0 "-" "-"

    And the volume was huge: something like one hundred thousand of these 400 requests per second. I'm pretty sure there were no errors on my site in that period of time (no error reports, and I didn't change the source code).

    Read the article

  • How to prevent Google Analytics from adding a second slash between the domain and the page-specific URL when viewing a page?

    - by Jeromy Anglim
    I have a blog, http://foo.tumblr.com. I sometimes go to Site Content > All Pages in Google Analytics, navigate to a page in the listing, and then click the icon that takes me to that page on my blog. However, instead of opening http://foo.tumblr.com/post/1234/blah.html, Google Analytics opens http://foo.tumblr.com//post/1234/blah.html (i.e., it adds a second slash between the domain and the page-specific component of the URL). How can I stop Google Analytics from doing this?

    Read the article

  • How to filter traffic coming to a particular page from another page?

    - by BishKopt
    I've got page A linking to page B. There are also other pages linking to B. How can I see traffic that is coming to page B ONLY from page A? I can somehow do it via the Behavior Flow report:

    - Behavior > Behavior Flow
    - [right-click on any node] Explore traffic through here
    - [click the edit icon] Define a page group
    - [right-click] Group details
    - [dropdown] Incoming traffic

    But how do I do it in the normal reports? Is there any way to filter out only the traffic coming from a particular page?

    Read the article

  • Tab navigation and duplicate content

    - by Guisasso
    I have a website in which I use tabs to navigate between pages. For example, page A displays A as the active tab, with B and C as background tabs. If the visitor gets to the website via page B, I would also like to display page D, but not A and C. Question: I know I can just create, for example, an index2 version of B, so that when the visitor gets to B from A, I display A, B, C, and index1 when the visitor gets to B from D. Is that a bad practice? I know duplicate content isn't good, but how else can I, or should I, approach this problem? The tab navigation I designed uses <li> elements and an id attribute to display the active tab, defined in the <body> tag.
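
    If the two versions end up serving the same page body with only the tab state differing, a common hedge against the duplicate-content concern is to declare one of them canonical (the URL below is a placeholder):

        <!-- on the index2 version, point search engines at the preferred version of page B -->
        <link rel="canonical" href="http://www.example.com/b.html">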

    Read the article

  • Usefulness of the Backlinks shown in Webmaster Tools

    - by Ewan Heming
    Is the list of links for a site shown in Google Webmaster Tools a complete list, or just a sample? I've noticed that the links in there appear to be all the ones I didn't think would have any real value, either because they were nofollow or from irrelevant sites. The few I did think would be of some use have never shown up, and there are also some links that are sometimes there and sometimes not (such as my LinkedIn profile). Does this mean that the missing links don't, or no longer, carry any value? It almost appears that the list is there for Google to either inform you about problems (there was a useful list there when someone tried to spam my site) or misinform you about which link-building strategies work (to keep people guessing about what works and what doesn't).

    Read the article

  • How to make Google recognize the languages of a multilingual website?

    - by Julien Fouilhé
    A few weeks ago, I implemented translation functionality for my company's website. The website is now available in French and English, and I looked around the internet for the best way to do this without losing any ranking and while keeping our pages on Google. Here is what I did:

    - I set a response header: Content-Language: en and Content-Language: fr.
    - My URLs are formatted as http://www.website.com/en/... and http://www.website.com/fr/...
    - My html tag is set with a lang attribute: <html lang="en"> and <html lang="fr">.
    - There is a <link rel="alternate" hreflang="en" href="EnglishPageUrl"> on French pages and a <link rel="alternate" hreflang="en" href="frenchPageUrl"> on English pages.

    But Google keeps returning some English pages when I search on the French engine, even though the website was at first only available in English. Is that normal? Do I still have to wait? It has been almost one month now; I thought it would be okay by now. Thank you.
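
    For reference, the usual hreflang annotations are bidirectional, and each page declares its counterpart in the other language with that language's code (note the hreflang="fr" on the English page, which differs from the markup quoted above; the URLs here are placeholders). A sketch:

        <!-- on http://www.website.com/en/page -->
        <link rel="alternate" hreflang="fr" href="http://www.website.com/fr/page">

        <!-- on http://www.website.com/fr/page -->
        <link rel="alternate" hreflang="en" href="http://www.website.com/en/page">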

    Read the article

  • Internet Explorer menu z-order problem [migrated]

    - by robgt
    I have what appears to be a z-order problem with Internet Explorer 9. It might exist in other IE versions also, but I haven't tested; I have to assume so. This page: http://www.modelhelicopters.co.uk/partsfinder/trex500esp/frames. If you hover over the "All pages for this model" menu item on the parts-finder menu bar (below the currency selector), it should drop down a list of all the parts-finder pages for the selected model helicopter. If you view the same page in Firefox or Chrome etc., you will see how it should appear. In IE9, the menu gets cut off at the top of the main exploded-view image, suggesting the z-order is wrong. I have tried amending this with a jQuery snippet, but it didn't fix IE9. I know the style was applied, as shown by Firebug in Firefox:

        $j('div.std img[src*="/partsfinder/img"]').attr("style", "position:relative;z-index:-100;");

    I really do not know why this is not working.
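
    In IE, a positioned element creates its own stacking context, so the z-index on the image or the menu can be trumped by the z-index of each element's nearest positioned ancestor. A hedged sketch of the usual fix (the selector is hypothetical; it should target the menu's positioned ancestor):

        /* raise the whole positioned container of the drop-down menu
           above the container that holds the exploded-view image */
        #nav-container { position: relative; z-index: 1000; }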

    Read the article

  • Custom domain pointing to a Tumblr blog

    - by Julius
    My domain mydomain.com is registered with GoDaddy. I wish to host my Tumblr blog on this domain with NearlyFreeSpeech.NET hosting. My active nameservers at GoDaddy already point to my authoritative ones at NFS.net, which is working. However, I'm baffled about the correct configuration for pointing to my Tumblr. My preferences, in order:

    (A) http://mydomain.com hosts the blog, and http://www.mydomain.com redirects to http://mydomain.com.
    (B) http://www.mydomain.com hosts the blog, and http://mydomain.com redirects to http://www.mydomain.com.
    (C) A subdomain like http://tumblr.mydomain.com hosts the blog, and I guess http://mydomain.com and http://www.mydomain.com both redirect to it.

    I've tried having two aliases, mydomain.com and www.mydomain.com, pointing to my permanent NFS IP at mydomain.nfshost.com, and then tried to add:

    (1) an A record pointing mydomain.com to the IP 66.6.44.4 as per Tumblr's instructions; it tells me I already have the bare domain as an alias, so I can't do that;
    (2) the A record on the www.mydomain.com alias. I can do this whether or not www.mydomain.com is set as an alias, but when I tried it with mydomain.com set as the canonical name, visiting either mydomain.com or www.mydomain.com made the two continually redirect to each other until an error was thrown.

    So I was wondering if there is a ninja who could save me some hair-pulling and tell me the correct way to configure A, or else B, or else C.
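
    For preference (A), the records Tumblr's instructions call for would look roughly like this in zone-file form (a sketch; the A-record IP is the one from the question, and the bare-domain alias at NFS would have to be removed first so the A record can exist):

        ; bare domain points straight at Tumblr
        mydomain.com.       A       66.6.44.4
        ; www is a CNAME to Tumblr's custom-domain endpoint
        www.mydomain.com.   CNAME   domains.tumblr.com.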

    Read the article

  • Is it safe to block these URLs with robots.txt?

    - by Edgar Quintero
    I have a website where all URLs have been optimized and 301-redirected from nasty URLs to clean ones. However, the unclean URLs are still linked everywhere throughout the site: in menus, content, products, etc. Google currently has all the clean URLs indexed, along with a few unclean URLs too. So the site still links to the old URLs everywhere (ideally this wouldn't be the case, but this is how it is at the moment). I would like to block the unclean URLs with robots.txt. The question: if I block these unclean URLs with robots.txt while the entire website still links to them (though they all redirect to the clean version), will this affect the indexing status at all?
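
    One relevant wrinkle: a URL disallowed in robots.txt cannot be crawled, so Googlebot never gets to see its 301 and cannot consolidate it onto the clean URL. A sketch of the syntax anyway, with a hypothetical path prefix standing in for wherever the unclean URLs live:

        User-agent: *
        # hypothetical prefix for the legacy, pre-rewrite URLs
        Disallow: /old/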

    Read the article

  • Is Google DFP a replacement for the Ad Rotate plugin?

    - by EPQRS
    I'm currently using the Ad Rotate WordPress plugin on my WordPress site. I recently learned of Google DFP. I'm currently adding 1-5 ads per day, which will increase soon, and I am wondering if Google DFP is an alternative to the Ad Rotate plugin. I mainly want to show our own and clients' ads, not AdSense. I'm just looking for an ad manager and was wondering if Google DFP is the right alternative. Where can I find a tutorial on how to use Google DFP (i.e., add ads)? (I already have an AdSense account.)

    Read the article

  • Over-reporting in Google Analytics - social media stats

    - by colmcq
    I have a client whose traffic from social media reads thus: 80% from Facebook, 1% from Twitter. This suggests they are not exploiting Twitter at all, and that was in my presentation, but my boss took it out, claiming Twitter stats are under-reported in Google Analytics. I can't substantiate this claim and wonder where she got the idea from. Can anyone shed light on this? Are my stats wrong, and should I disregard these figures? 80 to 1 seems like one hell of an under-report! Thanks, c

    Read the article

  • Switching to HTTPS - redirect question

    - by seengee
    Following the recent Google announcements about improved ranking for sites running on HTTPS, we have a number of clients asking about this. Is it safe to just 301-redirect all pages to their SSL equivalent, for example in a common PHP include file?

        // note: $_SERVER['HTTPS'] may be unset entirely on plain-HTTP requests
        if (empty($_SERVER['HTTPS']) || $_SERVER['HTTPS'] != "on") {
            $redirect = "https://" . $_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI'];
            header("Location: $redirect", true, 301);
            exit();
        }

    Obviously I'm aware this is also possible within a .htaccess file, but that cannot be modified in our case. All internal links would be switched to https:// links, but we also need to sort out incoming links from Google and elsewhere. Is this a sound approach? Are there any other gotchas to be aware of?

    Read the article

  • Go up one directory in mod_rewrite

    - by Rudolph Gottesheim
    I've got a standard Zend Framework 1 project that looks a bit like this:

        Project
        |- public
           |- .htaccess
           |- index.php

    The .htaccess looks like this:

        RewriteEngine On
        RewriteBase /
        RewriteRule ^image/.*$ img.php?file=$1 [NC,L]
        RewriteCond %{REQUEST_FILENAME} -s [OR]
        RewriteCond %{REQUEST_FILENAME} -l [OR]
        RewriteCond %{REQUEST_FILENAME} -d
        RewriteRule ^.*$ - [NC,L]
        RewriteRule ^.*$ index.php [NC,L]

    Now I want to start transitioning the site to Zend Framework 2, which I put in a separate directory in the root, so the whole thing looks like this:

        Project
        |- public
           |- .htaccess
           |- index.php
        |- zf2
           |- public
              |- .htaccess
              |- index.php

    What would I have to change in my original (ZF1) .htaccess to route all requests to (for example) /zf2/whatever to ZF2's index.php? I've tried

        RewriteRule ^zf2(/.*)$ ../zf2/public/index.php [NC,L]

    in the line after RewriteBase /, but that just gives me a 400 Bad Request.
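
    The 400 is expected: a per-directory rewrite target cannot climb above the .htaccess directory with ../, and Project/zf2 sits outside the public document root entirely. One hedged way around this, assuming access to the Apache virtual-host configuration, is to map the ZF2 public directory in with Alias (filesystem paths below are placeholders):

        # in the virtual host, not in .htaccess
        Alias /zf2 /var/www/Project/zf2/public
        <Directory /var/www/Project/zf2/public>
            AllowOverride All
            Order allow,deny
            Allow from all
        </Directory>

    Aliased URLs are served from the aliased directory, so ZF2's own public/.htaccess takes over for /zf2 requests and the ZF1 rules never see them.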

    Read the article

  • SEO strategy for h1, h2, h3 tags for a list of items

    - by Theo G
    On a page on my website I have a list of ALL the products on my site. This list is growing rapidly, and I am wondering how to manage it from an SEO point of view. I am shortly adding a title to this section and giving it an h1 tag. Currently the name of each product in this list is not an h1/h2/h3/h4, it's just styled text; however, I was looking to make these h2/h3/h4. Questions:

    - Is the use of h2/h3/h4 on these list items bad form, since headings should be used for content rather than being all links?
    - I am thinking of limiting this main list to only 8 items and using h2 tags for each name. Do you think this will have a negative or positive effect overall?
    - I may create a piece of script which counts the first 8 items on the list. These 8 will get the h2, and any after that will get h3 (all styled the same).
    - If I do add h tags, should I put them just on the name of the product, or outside the <a> tag, thereby wrapping all the info? (See the sketch below.)

    Has anyone been in a similar situation, and if so, did they see any significant difference?
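
    For the last point, the conventional markup puts the heading element outside the anchor, so the heading carries the product name and the link sits inside it; a small sketch with made-up product names and URLs:

        <h2><a href="/products/widget-pro">Widget Pro</a></h2>
        <h3><a href="/products/widget-lite">Widget Lite</a></h3>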

    Read the article

  • Keeping a Rackspace vserver alive

    - by mit
    It appears to me that Rackspace somehow freezes cloud VMs after some idle time. This means the first request to a PHP page takes much longer to respond than subsequent requests. In some cases this is fine; in others it is not acceptable. I am currently querying the machine with wget from a different host to keep it "alive", but I wonder what frequency would be necessary. Does anyone know the time period after which they send a VM to "sleep"? I guess it would be some minutes.

    EDIT: There is absolutely no caching involved on the PHP site. It just recently moved from another vhost, and there was never such latency on the first request.
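
    For the keep-alive polling itself, a cron entry on the other host is simpler than ad-hoc wget runs; a sketch, assuming a five-minute interval is short enough (the URL is a placeholder):

        # poll one lightweight PHP page every 5 minutes and discard the output
        */5 * * * * wget -q -O /dev/null http://example.com/keepalive.php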

    Read the article

  • Using Photoshop actions to decide if an image needs to be rotated

    - by voxobscuro
    I have Photoshop CS3, and I need to run a batch on a lot of pictures before I upload them. The pictures need to fit in a 600x800 box, yet be as big as possible within that box. Some of them are much wider than they are tall, and others are taller than they are wide. I am trying to put together a Photoshop action that will rotate, resize, and fill pictures as needed to make them as big as possible while staying within the 600x800 box. The only thing I haven't sorted out is how to tell Photoshop to rotate the image 90 degrees if that will allow the picture to be bigger within the constraints. Any ideas?
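
    Actions alone can't branch, but Photoshop scripts can, and a script can be run from File > Scripts or called from a Batch. A sketch in ExtendScript (Photoshop's JavaScript dialect), assuming the box is 600 wide by 800 tall so that landscape images gain from a 90-degree turn:

        // rotate only if the image is wider than it is tall
        var doc = app.activeDocument;
        if (doc.width.as('px') > doc.height.as('px')) {
            doc.rotateCanvas(90);
        }
        // the resize-to-fit step can stay in the existing action,
        // e.g. via File > Automate > Fit Image at 600 x 800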

    Read the article
