Search Results


  • Correct configuration of multiple Analytics trackers per page, spanning domains and subdomains

    - by Eliot Shepard
    My company publishes sites on a somewhat convoluted domain structure, and we're having trouble getting accurate numbers in Analytics when we have multiple trackers on the page. We publish under two brands (A, B). Each brand has a "national" site at A.com and B.com, as well as per-city "local" sites at e.g. ny.A.com, la.A.com, sf.A.com, etc. Right now we're trying to track in these dimensions: the full network (A.com, ny.A.com, B.com, la.B.com, etc.), all sites in a brand (A.com, ny.A.com, la.A.com, etc.), and the individual site (ny.A.com). Here are the commands we're using on an individual site:
        _gaq.push(
            ['t0._setAccount', 'UA-XXXXXX-1'], // full network
            ['t0._setDomainName', 'none'],
            ['t0._setAllowLinker', true],
            ['t0._trackPageview'],
            ['t1._trackPageLoadTime'],
            ['t1._setAccount', 'UA-XXXXXX-2'], // brand
            ['t1._setDomainName', 'none'],
            ['t1._setAllowLinker', true],
            ['t1._trackPageview'],
            ['t1._trackPageLoadTime'],
            ['t2._setAccount', 'UA-XXXXXX-3'], // individual
            ['t2._setDomainName', 'none'],
            ['t2._setAllowLinker', true],
            ['t2._trackPageview'],
            ['t2._trackPageLoadTime']
        );
    We send the same commands to each account because we've had strange results when trackers were configured differently in the past. However, right now we're seeing inflated numbers for uniques on all three trackers. What is the correct way to configure this setup? Thanks for your time.

    Read the article

  • Launch elasticsearch dockerfile using my own elasticsearch.yml

    - by Kevin
    I am launching elasticsearch via a dockerfile found here: https://index.docker.io/u/ehazlett/elasticsearch/ It works great. I need to define my own hosts as my environment does not support multicast of any kind. I understand that my options are: 1) supply hosts when elasticsearch is run as a command line parameter 2) modify my elasticsearch.yml file to set the hosts. I know how to build the yml, what I need to know is how to launch elasticsearch via docker using my own yml instead of the one in the container. Is that possible? Thanks.
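
    A minimal sketch of one way to do this with a bind mount, assuming the image reads its configuration from /opt/elasticsearch/config inside the container (that path is an assumption for this particular image and may need adjusting, and /path/on/host is a placeholder):

        # overlay the container's elasticsearch.yml with one maintained on the host
        docker run -d -p 9200:9200 -p 9300:9300 \
          -v /path/on/host/elasticsearch.yml:/opt/elasticsearch/config/elasticsearch.yml \
          ehazlett/elasticsearch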

    Read the article

  • Outlook VBA script - find and replace text with image

    - by user2530616
    I have an e-commerce store. When I get a sale, I receive an order confirmation email which contains the name of the product sold. When the email comes through, I would like to run a script that replaces the product name, e.g. "red widget", with a picture of that product. Is that possible? I have found similar code that replaces text (a set of numbers in this case) with a link, but I need it to replace the text with a picture instead.
        Option Explicit
        Sub InsertHyperLink(MyMail As MailItem)
            Dim body As String, re As Object, match As Variant
            body = MyMail.body
            Set re = CreateObject("vbscript.regexp")
            re.Pattern = "#[0-9][0-9][0-9][0-9][0-9][0-9]"
            For Each match In re.Execute(body)
                body = Replace(body, match.Value, "http://example.com/bug.html?id=" & Right(match.Value, 6), 1, -1, vbTextCompare)
            Next
            MyMail.body = body
            MyMail.Save
        End Sub
    Example mail:
        Order Confirmation
        Thanks for shopping with us today!
        ------------------------------------------------------
        Order Number: 2209
        Date Ordered: Friday 28 June, 2013
        Products
        ------------------------------------------------------
        1 x red widget = $5.00
        ------------------------------------------------------
        Total: $0.00
        Delivery Address xxx
    Search text: "red widget". Replace with picture: redwidget.jpg.

    Read the article

  • 1st birthday - new content

    - by Lajos Sárecz
    I started writing this blog about a year ago, at the beginning of June. Over the past year the blog had 3,395 unique visitors, who spent an average of 59 seconds here per visit. It also has to be admitted that the bounce rate is quite high (83.5%), i.e. fewer than a fifth of those who wander in find the content useful. One important reason for this may be that searches for IT terms bring the blog up in search engines for foreign visitors as well, so they only realise after opening it that they don't understand Hungarian :-). The map below shows nicely that people find their way here from many parts of the world (some of them, by the way, use a web-based translation tool to get around the language barrier), but it also shows that the target audience is being reached, i.e. most visits come from Hungary (3,505 visits out of 5,082 in total). Interestingly, most visitors arrived via the query "vb tippjáték" (110 visitors), followed by "oracle junior képzés" (42) and "kyte" (40) (they weren't necessarily thinking of Tom Kyte :-)). The first real IT search, "oracle workflow" (37), is only in 4th place, followed by "oracle enterprise manager" (33). The blog also owes its daily visitor record (76) to people searching for the vb tippjáték... For the 1st birthday I tried to introduce something new. In the new menu bar below the header I am collecting useful pages that are related to the blog's topic but may not be so easy to find on the web. I plan to extend this further, and perhaps create a separate page behind the blog for this purpose. I hope this makes life easier for the blog's readers.

    Read the article

  • Set postfix to send email but not to receive them

    - by CodeShining
    I'm using Google Apps to handle personal email addresses for my domain name, and I set up the DNS as Google suggests. All works fine. Since I need an SMTP server to send emails from my e-commerce site, I installed Postfix on the server. It works fine when I send emails to any other address, but it doesn't send to addresses on the same domain name. So let's say my domain is example.com and I set up Postfix with example.com: if I try to reset a password using [email protected], Postfix doesn't send the mail and instead reports in mail.log:
        Sep 20 01:09:52 ip-10-54-26-162 postfix/pickup[6809]: B09A3415D8: uid=33 from=<www-data>
        Sep 20 01:09:52 ip-10-54-26-162 postfix/cleanup[6854]: B09A3415D8: message-id=<20120920010952.B09A3415D8@ip-10-54-26-162.eu-west-1.compute.internal>
        Sep 20 01:09:52 ip-10-54-26-162 postfix/qmgr[30978]: B09A3415D8: from=<[email protected]>, size=4234, nrcpt=1 (queue active)
        Sep 20 01:09:52 ip-10-54-26-162 postfix/local[6856]: B09A3415D8: to=<[email protected]>, relay=local, delay=0.01, delays=0.01/0/0/0, dsn=5.1.1, status=bounced (unknown user: "myaccount")
    Of course it cannot find a local user "myaccount", since that account is on Google Apps... How can I tell Postfix to send the email and not search for a local user?
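
    A minimal sketch of the usual approach for this kind of send-only setup, assuming the goal is that Postfix treats example.com as a remote domain and routes mail for it via the domain's MX records (the hostname and domain below are placeholders; back up main.cf before editing):

        # /etc/postfix/main.cf (sketch)
        # do not list example.com in mydestination, or Postfix will attempt local delivery
        mydestination = localhost
        myorigin = example.com

        # apply the change
        # postfix reload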

    Read the article

  • Chrome mistakenly downloads localhost/wordpress pages, but Firefox doesn't

    - by Mr E
    I have WordPress installed at http://localhost/wordpress. When I try to view a page in Chrome, it downloads the PHP file rather than showing the page. When I view it in Firefox, everything works fine. When I create localhost/test.php containing simply <?php echo "hello world" ?>, it displays in both browsers. Any ideas on what might be wrong? $ dpkg -l | grep apache outputs: apache2, apache2-mpm-prefork, apache2-utils, apache2.2-bin, apache2.2-common, libapache2-mod-php5
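
    One quick diagnostic sketch (not a fix): compare the Content-Type Apache is actually sending for the two URLs, since one possible explanation for this symptom is a response Chrome cached from before PHP was enabled, while the server is now behaving correctly:

        curl -I http://localhost/test.php
        curl -I http://localhost/wordpress/

        # both should report "Content-Type: text/html"; if the wordpress URL differs,
        # the problem is server-side rather than a stale Chrome cache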

    Read the article

  • Low "time on site" and high bounce-rate in Japan

    - by Noam
    I'm seeing a substantially low "time on site" and a high bounce rate from visitors coming from Japan. Compared with other countries, even ones whose languages I don't speak, the stats are dramatically worse. So I assume there's something specific to that nation that I should understand in order to make their experience better. The content they are seeing is in Japanese, and I've also translated the headlines, which to my surprise didn't improve the stats. The site doesn't have a mobile version, so I assume that might be part of the problem. I wanted to hear from your experience what other reasons there might be that are specific to Japan. UPDATE: The content itself was in Japanese all along; that's the reason it attracts users from Japan. The headlines were in English, so those are the only thing I changed.

    Read the article

  • Webmaster Tools word count

    - by Henrik Erlandsson
    Is there a way to somehow verify that the googlebot finds the headings and the content, for example by word count? I'm asking because I tried a program called Screaming Frog, which failed to even fetch the first h1 on a validated page - for about 1/3 of all the pages(!) - which made me unsure of it. Even though the site looks hunky dory in Webmaster Tools, I'd like to know what a googlebot-like content crawler finds on my page and in what order. Any tips on such tools are appreciated. This is not about keyword count.
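
    A rough sketch of one way to approximate this from the command line, assuming a Linux box with curl and lynx installed (example.com/page is a placeholder; this only shows what a plain text-only fetch returns in document order, not what Googlebot actually indexes):

        # fetch the page with a Googlebot-like User-Agent string
        curl -s -A "Googlebot/2.1 (+http://www.google.com/bot.html)" http://example.com/page -o page.html

        # dump the rendered text in document order and count the words
        lynx -dump -nolist page.html | wc -w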

    Read the article

  • How can I download the source code for linux-image-3.2.0-24-generic?

    - by keepitsimpleengineer
    The directions at the Ubuntu Wiki (apt-get source linux-image-$(uname -r)) and the askubuntu question "Where can I find the source code for the Ubuntu Kernel?" don't work:
        sudouser@64bitws:~# uname -r
        3.2.0-24-generic
    and
        sudouser@64bitws:~# apt-get source linux-image-3.2.0-24-generic
        Reading package lists... Done
        Building dependency tree
        Reading state information... Done
        Picking 'linux' as source package instead of 'linux-image-3.2.0-24-generic'
        E: Unable to find a source package for linux
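
    A sketch of the usual prerequisite, assuming the failure is simply that no deb-src lines are enabled in /etc/apt/sources.list (check the file first; the sed edit below is a blunt instrument and assumes the lines are commented out with "# deb-src"):

        # enable the commented-out source repositories, then refresh the package lists
        sudo sed -i 's/^# deb-src/deb-src/' /etc/apt/sources.list
        sudo apt-get update

        # now fetch the kernel source package (no sudo needed)
        apt-get source linux-image-$(uname -r)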

    Read the article

  • Does Webmaster Tools list traffic from ads as inbound links?

    - by Mohamad
    In Webmaster Tools, under the inbound links section, do ads get counted as inbound links? I am doing a review of inbound links on a website and found that most of them come from meaningless blogs and spam websites. Before I accuse anyone of not doing their job properly, I would like to know something: is it possible that those inbound links were generated when an ad for the website appeared on the spam website? An SEO firm was paid handsomely to generate inbound links, and I am afraid all they did was submit material to spam blogs and websites.

    Read the article

  • DNS issue on Fedora 12? wget wordpress.org fails where wget www.google.com works

    - by Tom Auger
    I'm administering a Fedora 12 box, but am quite new to networking specifics. Recently one of our WordPress apps hosted on our server has stopped being able to perform its auto-update or auto-download of plugins. Investigating further, I have tried the following:
        $ wget wordpress.org
        --2010-12-17 11:26:50--  http://wordpress.org/
        Resolving wordpress.org... failed: Temporary failure in name resolution.
        wget: unable to resolve host address "wordpress.org"
    Whereas:
        $ wget www.google.com
        --2010-12-17 11:27:26--  http://www.google.com/
        Resolving www.google.com... 74.125.226.82, 74.125.226.84, 74.125.226.80, ...
        Connecting to www.google.com|74.125.226.82|:80... connected.
        HTTP request sent, awaiting response... 302 Found
        Location: http://www.google.ca/ [following]
        --2010-12-17 11:27:26--  http://www.google.ca/
        Resolving www.google.ca... 173.194.32.104
        Connecting to www.google.ca|173.194.32.104|:80... connected.
        HTTP request sent, awaiting response... 200 OK
        Length: unspecified [text/html]
        Saving to: "index.html.4"
        [ <=> ] 9,079  --.-K/s  in 0.02s
        2010-12-17 11:27:26 (462 KB/s) - "index.html.4"
    Interestingly:
        $ ping wordpress.org
        PING wordpress.org (72.233.56.138) 56(84) bytes of data.
        64 bytes from wordpress.org (72.233.56.138): icmp_seq=1 ttl=50 time=81.5 ms
        64 bytes from wordpress.org (72.233.56.138): icmp_seq=2 ttl=50 time=67.3 ms
        ^C
        --- wordpress.org ping statistics ---
        2 packets transmitted, 2 received, 0% packet loss, time 1783ms
        rtt min/avg/max/mdev = 67.361/74.448/81.536/7.092 ms
    and
        $ nslookup wordpress.org
        Server:   192.168.2.1
        Address:  192.168.2.1#53
        Non-authoritative answer:
        Name: wordpress.org
        Address: 72.233.56.138
        Name: wordpress.org
        Address: 72.233.56.139
    nscd has been stopped and flushed. iptables appear to be clean. At this point I have exhausted my limited abilities to diagnose the issue. Can anyone suggest a resolution path?
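
    A diagnostic sketch for narrowing down whether this is a resolver-configuration problem rather than a DNS-server problem (8.8.8.8 is just a public resolver used for comparison; it is not assumed to be part of this setup):

        # compare the local resolver with a known-good external one
        dig wordpress.org @192.168.2.1 +short
        dig wordpress.org @8.8.8.8 +short

        # check which nameservers and lookup sources the box is actually configured to use
        cat /etc/resolv.conf
        grep ^hosts /etc/nsswitch.conf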

    Read the article

  • grep command is not searching the complete pattern

    - by Sumit Vedi
    I am facing a problem while using the grep command in a shell script. I have one file (PCF_STARHUB_20130625_1) which contains the records below:
        SH_5.55916.00.00.100029_20130601_0001_NUC.csv.gz|438|3556691115
        SH_5.55916.00.00.100029_20130601_0001_Summary.csv.gz|275|3919504621
        SH_5.55916.00.00.100029_20130601_0001_UI.csv.gz|226|593316831
        SH_5.55916.00.00.100029_20130601_0001_US.csv.gz|349|1700116234
        SH_5.55916.00.00.100038_20130601_0001_NUC.csv.gz|368|3553014997
        SH_5.55916.00.00.100038_20130601_0001_Summary.csv.gz|276|2625719449
        SH_5.55916.00.00.100038_20130601_0001_UI.csv.gz|226|3825232121
        SH_5.55916.00.00.100038_20130601_0001_US.csv.gz|199|2099616349
        SH_5.75470.00.00.100015_20130601_0001_NUC.csv.gz|425|1627227450
    I have a pattern stored in one variable (INPUT_FILE_T), and I want to search for it in the file (PCF_STARHUB_20130625_1). For that I have used the commands below:
        INPUT_FILE_T="SH?*???????????????US.*"
        grep ${INPUT_FILE_T} PCF_STARHUB_20130625_1
    The output of the above command comes out as:
        PCF_STARHUB_20130625_1:SH_5.55916.00.00.100029_20130601_0001_US.csv.gz|349|1700116234
    There are two problems with this output. First, only one entry is shown (it should contain two entries), and second, the output contains "PCF_STARHUB_20130625_1:", which should not appear. The output should look like this:
        SH_5.55916.00.00.100029_20130601_0001_US.csv.gz|349|1700116234
        SH_5.55916.00.00.100038_20130601_0001_US.csv.gz|199|2099616349
    If there is a technique other than grep that works, please let me know. Please help me with this issue.
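
    A sketch of one likely explanation and workaround: because the variable is unquoted, the shell expands it as a filename glob before grep ever sees it, which can turn the single pattern into one or more literal file names (that would explain both the "filename:" prefix and the missing match). Quoting the expansion and writing the match as a regular expression avoids this; the exact regex below is an assumption about what should match:

        # quote the pattern so the shell does not glob-expand it,
        # and use a regular expression rather than a shell wildcard
        INPUT_FILE_T='SH_.*_US\.csv\.gz'
        grep "${INPUT_FILE_T}" PCF_STARHUB_20130625_1

        # -h suppresses the "filename:" prefix if more than one file is ever searched
        grep -h "${INPUT_FILE_T}" PCF_STARHUB_20130625_1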

    Read the article

  • Can I upgrade Lucid to Precise with the 12.04.1 desktop image?

    - by rahmad ars
    I am running Ubuntu Lucid and would like to upgrade to Precise using the normal ubuntu-12.04.1-desktop-...iso image. So far I have tried upgrading with the CD but was not given the choice. Searching the internet, I have found that upgrading using the ubuntu-12.04.1-alternate-...iso image works, but I don't have great internet access. Is upgrading from Lucid possible with the ubuntu-12.04.1-desktop image?
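
    For reference, a sketch of the offline route that is usually described for the alternate image (the file name, mount point, and script location are assumptions and should be checked against the downloaded ISO; the desktop image only offers a fresh install, not an in-place upgrade):

        # assuming the alternate ISO has been downloaded to the home directory
        sudo mkdir -p /media/alternate
        sudo mount -o loop ~/ubuntu-12.04.1-alternate-amd64.iso /media/alternate
        sudo sh /media/alternate/cdromupgrade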

    Read the article

  • Is there a version of Chrome or Chromium whose bookmarks are visible to the HUD?

    - by Isaac
    I noticed that Firefox has bookmarks that can be executed in the HUD. Love it! Chrome and Chromium, which run JavaScript apps much faster for some odd reason, allow history to be called through the HUD, but not bookmarks. This wouldn't be a problem if all history were callable, but it seems that Chrome limits the amount of history visible to the HUD. Is there a version of Chrome or Chromium that will expose our bookmarks?

    Read the article

  • Goal completions 10x higher in dashboard

    - by cjk
    I have the following table in my Dashboard:
        Page path level 1    Visits    Goal Completions
        -----------------    ------    ----------------
        /sub1/                  994               1,295
        /                       102                   3
        /sub2/                   10                   1
    I know my conversion rate is 10-20%, and that in this period I actually had only 183 goal completions under /sub1/. My goal is set as a regular expression for a particular page (/success?.*), and I have a funnel set up which tracks the page before the goal (/action). The actual URLs hit would be /sub1/action then /sub1/success?1234, and /sub2/action then /sub2/success?1234. Why is the table in my dashboard giving me wildly wrong numbers? Have I done something wrong?

    Read the article

  • Excel 2010: check whether a substring value in a cell matches a string from a range of cells

    - by gotqn
    I have been stuck on this one for hours. I have a range of cells with string values:
        A1 text1
        A2 text2
        A3 text3
    And another column with other string values like:
        B1 text1sampletext
        B2 text2sampletext
        B3 text3sampletext
        B4 text1sampletext
        B5 text1sampletext
    I have to check whether the text in column A is a substring of the text in column B. If it is, I want to put the matching text from column A into column C, like this:
        B1 text1sampletext - C1 text1
        B2 text2sampletext - C2 text2
        B3 text3sampletext - C3 text3
        B4 text1sampletext - C4 text1
        B5 text1sampletext - C5 text1
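
    A sketch of one formula-based approach, assuming the lookup values sit in A1:A3 and the search strings in column B (adjust the ranges to the real data). SEARCH returns a position when an A-value occurs in B1 and an error otherwise, and LOOKUP with a very large lookup value picks out the A-value whose SEARCH did not error:

        =IFERROR(LOOKUP(2^15, SEARCH($A$1:$A$3, B1), $A$1:$A$3), "")

    Entered in C1 as a normal formula (LOOKUP handles the array without Ctrl+Shift+Enter) and filled down, it returns an empty string when no A-value is found in the B cell.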

    Read the article

  • Server 2003 Remote Desktop loses its virtual printer image of the local printer

    - by Charles Hart
    Server 2003 Remote Desktop provides service to stores served by several ISPs. The server loses its virtual printer image of the local printer (as seen from the remote store site) and a copy of the original local printer appears on the local computer with a different driver without notice. Specifically: A remote desktop session is opened on a local computer that has a Brother HL2140 USB printer connected and the associated software installed with a correct driver shown under the “advanced” button. The server has the same Brother software and driver. An application that is running on the server attempts to print on the local printer connected to the local computer running Vista Pro or XP Pro. Either it works correctly (Good) or it does not print (Bad) or it prints on another Local Printer connected to another local computer logged into the server (Bad and Odd). When it doesn’t print (or prints somewhere else) we ask the customer to look for the (virtual) printer using the Remote desktop view of the server and the printer is gone. Then we ask the customer to look at the printers folder in the local computer. There are several possibilities: The printer is there, but the driver is mysteriously changed in the drop down to MDX something; we have the customer select the other (proper) Brother driver, and all is well again, as now after the change, the virtual printer in the server (which now matches the local printer) appears again, and so printing can resume. A “copy” of the printer mysteriously appears in the local printer’s folder and after we delete it the virtual printer in the server appears again and so printing can resume. Note that in both case 1 and 2, the server sometimes sends the print job elsewhere, to some other local computer. Meanwhile in the log file, endless errors are reported and the server eventually crashes, sometimes twice a day. I’m puzzled what changes the local printer driver and I’m puzzled what loads the copy 2 or copy 3 of the printer in the local printer folder. This entire description randomly occurs on any of 40+ local computers in eight different locations in different ISPs, all sharing one Domain.

    Read the article

  • Is Cloud Print to blame for my inability to print?

    - by VarLogRant
    I have just moved to the Chrome beta (first 9.something and currently 10.0.648.127) on my Windows 7 64-bit machine, and my good web browsers (Firefox Beta and Chrome) can no longer print, and neither can Komodo Edit. The problem seems to have started when I tried to enable Google Cloud Print. Frustratingly, I can still print from IE 9 RC, and also from other machines in the office, which means the problem is entirely on this Windows machine. When I try to print from Chrome, first time I get the printer dialog, and once I click print, it waits, then pops up "Something went wrong", and each time after, it pops up with "No Printer Found". Under Firefox, I get "An unknown error occurred while printing", as does Komodo. I have printed from these applications before, and have even printed from Google Cloud Print. Once. But then it stopped. I really don't know where to go next in debugging this thing, and the Google results I see tend to be from 2007. I'm not ready for the paperless office yet, so please help!

    Read the article

  • Adwords: Is there a drawback to setting a really high max CPC to learn what works faster?

    - by Rob Sobers
    I'm toying with setting my max CPC really high on all my keywords to ensure my ad gets shown in the top spot on page one in order to draw more clicks. I think this will be a good way to quickly figure out whether the ads I'm writing have a decent CTR and, more importantly, whether the landing pages I'm building are converting. Since I can set a max daily budget for my campaign, I won't risk breaking the bank. I can't think of any drawbacks, personally. Am I missing any?

    Read the article

  • Imminent launch of the Chrome Web Store: Google sends an informational email and is holding a Chrome event tomorrow

    Imminent launch of the Chrome Web Store: Google sends an informational email to Chrome extension developers. Google is about to send a first round of informational emails to developers of Chrome extensions and themes, a sign that the opening of the Chrome Web Store is imminent. Gregor Hochmuth, product manager for the Google Chrome Web Store, has himself announced that developers will be informed before the official launch of the store. The purpose of these messages is to describe the changes made to the store so that developers can check the impact on their code and make adjustments before publication...

    Read the article

  • Is this a link scheme? If so, what should I do, and what problems can I face?

    - by guisasso
    I was asked to remodel a website, and decided to check its rank on Alexa. Surprisingly, there are many, many different websites linking to it, none of them relevant. One particular thing about them is that none of these URLs work, and they all display the exact same error when accessed, which to me is a very good indication that this is some sort of linking scheme. (Besides the somewhat obvious names, it even says "scheme" in one of the URLs!?) If so, how should I proceed with this website? What can I do if this is in fact a scheme, how can it hurt the website, what types of problems can I face, and what can I do about it?
        addurlnow . info
        dirlist15.addurlnow . info/Business___Economy/Services/page-12.html
        linkdirectory101 . info
        dirlist16.linkdirectory101 . info/Business___Economy/Services/page-15.html
        seonetblog . info
        dirlist52.seonetblog . info/Business___Economy/Affiliate_Schemes
        addurls . us
        dirlist21.addurls . us/Business___Economy/Services/page-10.html
        webdirectoriessite . info
        dirlist20.webdirectoriessite . info/Business___Economy/Services/page-6.html
        addurlstore . info
        dirlist10.addurlstore . info/business___economy/services/page-14.html
        ukwebdirectorys . info
        dirlist21.ukwebdirectorys . info/Business___Economy/Services/page-13.html

    Read the article

  • bootmgr image is corrupt

    - by bacord
    I just replaced my HD with an SSD. I did a brand new install on the SSD and received the following error after booting:
        BOOTMGR image is corrupt. The system cannot boot.
    A little background on my configuration: the system used to dual boot XP and Windows 7. After replacing the original startup HD with my SSD, I changed a setting in the BIOS to AHCI (I have tested changing it back, but this did not help). When I look at the stats on the drive in the BIOS, it claims that the SSD is in a RAID configuration, despite the settings not being configured that way. Related system information: Intel 320 Series 80 GB SATA II SSD, JetWay JPA78VM3-H-LF AM2+/AM2 AMD 780V HDMI Micro ATX AMD motherboard, AMD Athlon 64 X2, 7 GB RAM. I have performed 2 fresh installs to no avail. I also followed this guide and performed options 1 and 2, and I have done bootsec/fixmbr and /fixboot. So.. any suggestions?

    Read the article

  • Is this DFP error message the reason my ad won't show?

    - by Eric
    I'm setting up DFP to display ads and I have an ad tag (JavaScript) from adtechus.com. The tag looks like this:
        <script src="http://adserver.adtechus.com/?adrawdata/1.0/1111/11111/0/0/ADTECH;loc=100;noperf=1;"></script>
    When I paste that tag into DFP, I get an error message saying it doesn't recognize the tag format: ...and, more importantly, my ad isn't showing on the page. DFP seems to be taking my adtechus ad tag regardless and working with it, despite the error message. But could that be the reason my ad isn't showing? And how can I fix it?

    Read the article

  • How to test robots.txt in googlebot to find out what is being indexed

    - by Amar Jarubula
    This question is a continuation of the answer to "How to check if googlebot will index a given url?" As suggested there, I went to Webmaster Tools and tested the contents of my robots.txt file. However, that only tells me whether the file's content is valid or not. For my scenario I need to test whether the URLs I disallow are actually kept out of the index. For example, I have something like this in my robots.txt:
        disallow:/pattern*
    My understanding is that URLs whose paths contain the word pattern should not be crawled, but how do I test that this rule is enforced while the website is indexed?
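
    For reference, a sketch of the usual way such a rule is written and checked ("/pattern" is a placeholder path; the robots.txt testing tool in Webmaster Tools lets you paste a specific URL and see whether a given rule blocks it for Googlebot):

        User-agent: *
        # blocks any URL whose path starts with /pattern; the trailing * is redundant
        Disallow: /pattern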

    Read the article
