Search Results

Search found 9721 results on 389 pages for 'quicktest pro'.


  • Connecting Google Analytics with Custom Search Engine AdSense

    - by Yochai Timmer
    I have a Custom Search Engine that I've created with AdSense. I've put that search engine as a site search in my Google Sites page, and I've connected both the Custom Search Engine and the Google Site to my Analytics account via their settings pages. Now I'm trying to get Analytics to show me the AdSense for Search statistics. I've managed to connect the Google Sites page to Analytics, and I can see the search statistics there as well, but I can't get it to show the actual AdSense for Search statistics from the Custom Search Engine. How can I configure everything so that the AdSense for Search statistics of my Custom Search Engine show up in my Analytics account?

    Read the article

  • Google Analytics checkout page tracking problem

    - by Amir E. Habib
    I am running a multilingual website, each language on a different domain name. I am trying to lead all purchase requests to the checkout process, which has its own domain too. To keep Google Analytics tracking working I've updated the tracking code accordingly and set the source domain to 'multiple top-level domains'. Everything is going fine so far, except in the E-commerce Overview: "Source / Medium" always shows as (direct), or the name of the source domain. Since I am redirecting using PHP header(Location: .. etc.), the Google _link method doesn't seem to be working properly. I want to focus on two questions: Should I create a new profile for the checkout domain in Google Analytics? (I am now using the profile ID of the source domain even after I move to the checkout domain; is that OK?) When I try to pass the cookies of the source domain to the checkout domain, I notice that the Google cookies are copied to the new domain (the cookie path is .checkout-domain/) and they have the same values as the original cookies, but for some reason another set of cookies, with different values (same path), is created once I access a page with Google Analytics code in the checkout pages. Feels like I'm doing something wrong here, so my question is: what am I doing wrong? Does anyone have an idea how to pass the cookies to the checkout domain?
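    One hedged idea, assuming the legacy ga.js tracker: the _link method works by appending the __utm* cookie values to the destination URL as query parameters, which _setAllowLinker(true) picks up on the other side, and a server-side header() redirect skips that step entirely. Forwarding those cookies yourself before redirecting may approximate it; this is a sketch, not a tested recipe, and the checkout URL is illustrative:

        <?php
        // Sketch: approximate ga.js _link() in a server-side redirect by
        // forwarding the Google Analytics cookies as linker parameters.
        // Assumes legacy ga.js with _setAllowLinker(true) on the checkout
        // domain; the cookie names below are the standard __utm* set.
        $checkoutUrl = 'https://checkout-domain.example/cart.php'; // illustrative

        $params = array();
        foreach (array('__utma', '__utmb', '__utmc', '__utmz', '__utmv') as $name) {
            if (isset($_COOKIE[$name])) {
                $params[$name] = $_COOKIE[$name];
            }
        }

        $separator = (strpos($checkoutUrl, '?') === false) ? '?' : '&';
        header('Location: ' . $checkoutUrl . $separator . http_build_query($params));
        exit;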

    Read the article

  • Duplicating someone's content legitimately & writing HTML to support that

    - by Codecraft
    I want to add content from other blogs to my own (with the authors' permission) to help build additional relevant content and support articles I've found useful that others have written. I'm looking into how to do this responsibly, i.e. by giving the original content author a boost and not competing against them for search traffic which should go to their site. In order to keep my duplicate content out of search, and to hint to the search engines where the original content is to be found, I've implemented: <head> <meta name='robots' content='noindex, follow'> <link rel='canonical' href='http://www.originalblog.com/original-post.html' /> </head> Additionally, to boost the original article and to let readers know where it came from, I'll be adding something like this: <div> Article originally written by <a href='http://www.authorswebsite.com'>Authors Name</a> and reproduced with permission.<br/> <a href='http://www.originalblog.com/original-post.html' target='new'> Read the original article here. </a> </div> All that remains is a way to 'officially' credit the original author in the HTML for the search spiders to see. Can anyone tell me a way to do this, possibly using rel="author" (as far as I can see that's only good for my own original content)? Or perhaps it doesn't matter, given that the reproduced pages will be kept out of search engines? Also, have I overlooked anything in the approach?

    Read the article

  • XAMPP - Unable to serve files larger than ~30MB [on hold]

    - by Sparx401
    I'm developing a site locally with XAMPP on Windows 7, and as far as media is concerned, I'm unable to play media files that are larger than 30MB or so. Both video and audio files (MP4 and MP3 respectively) generate this error in Chrome (and show similar errors in other browsers such as IE9 and Opera): No data received. Unable to load the webpage because the server sent no data. Error 324 (net::ERR_EMPTY_RESPONSE): The server closed the connection without sending any data. It seems that the exact cutoff in MB varies somewhat between browsers, though. One video in question is 34MB and actually plays in Opera and IE9, but gives the aforementioned error in Chrome. I've checked that the file paths are typed correctly and ensured that the directive to serve MP4s is in .htaccess: AddType video/mp4 mp4 Also, I have these directives set in the same .htaccess file: php_value upload_max_filesize "80M" php_value post_max_size "80M" php_value max_input_time 60 php_value max_execution_time 60 And memory_limit is set in php.ini as "128M", so I'm left wondering: what is causing my files not to play, and what directives, if any, do I have to change on the server side? Perhaps something to do with limitations of the GET method (the method I'm seeing on Chrome's network tab among other header request/response info)?

    Read the article

  • Website Access...DNS, ISP, issue?

    - by sublet
    This isn't so much a code issue as it might be an issue with my ISP. For some reason, when I visit a site very often, like one I manage or write stories on, it will just stop pulling data down after a while. It's very random when it happens, but it probably happens once a week. It affects everyone who is accessing the site from this connection, and I can access other sites no problem. Also, if I go outside the office back home, which is right down the street, and access the site, it is fine. I'm using Comcast in both locations. It's almost as if I have a limit on requests to each site and have hit my limit, so it blocks the site for a while. Anybody have any clue what this might be?

    Read the article

  • News Portal CMS

    - by George Grigorita
    I am looking for a specific news portal CMS. I know all the major "general" CMSes (like WordPress, Drupal or Joomla) and even the less known ones (like TYPO3, ExpressionEngine, Textpattern or concrete5). I'm already working with a Drupal distribution called OpenPublish and another WordPress installation to determine which would be better, but these are more of a Plan B. I would like to work directly with a CMS that was built exactly for the kinds of tasks specific to a news/media portal. It doesn't matter if the CMS is commercial (however, I don't want to pay a monthly fee) or free, but I need to be able to use it on my own server/hosting, and I need to be able to access its source code (not to modify it, but to integrate it with future plugins/modules). If you know any CMS that qualifies for this job, please let me know. In the last few days I was all over Google, but I couldn't find anything worth mentioning.

    Read the article

  • Hosted CMS - Based On Drupal [closed]

    - by Eddy Freeman
    I just want a little clarification concerning hosted CMSes like shopify.com and solidshops.com (I learned Shopify runs on Ruby on Rails), so let me be specific about hosted CMSes based on Drupal: www.buzzr.com, www.drupalgardens.com and www.pagebuild.net etc. What I want to know is: 1) Do they use the multi-site feature in Drupal to automatically create all those thousands of sites they host when a user signs up? 2) Do they create those thousands of sites as sub-sites (if you like, let me say subdomains)? 3) Do they use a different way other than multi-site in Drupal?
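    For context: stock Drupal multi-site maps each hostname to its own directory of settings in sites/sites.php, so a host could, in principle, append one entry (plus a settings.php and a database) per signup. Whether Buzzr or Drupal Gardens actually work this way I can't confirm; this is just a minimal sketch of the mechanism, with made-up names:

        <?php
        // sites/sites.php -- Drupal's multi-site lookup table. Each
        // hostname maps to a directory under sites/ holding that site's
        // settings.php and files. Names here are illustrative only.
        $sites['customer-one.example.com'] = 'customer_one';
        $sites['customer-two.example.com'] = 'customer_two';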

    Read the article

  • How to find the Fastest DNS servers to host our domain?

    - by Denis Volovik
    The question was born because lately we've seen a pretty odd error message in Google Webmaster Tools (well, odd at least for us, seeing it for the first time): "DNS lookup timeout". I was pretty sure that with eNom's 5 DNS servers (dns1... to dns5.name-services.com) we were pretty well set. But it appears that from Europe/Hungary, for example, dns1.name-services.com takes 170 ms to respond to a ping, while GoDaddy's ns75.domaincontrol.com takes only 40 ms, and at the same time dns2 to dns5.name-services.com each result in a timeout error (on ping). This issue came to our attention right in the final stages of optimizing our website (almost to death), basically just in time. I would love to move our domains to a fast (fastest?) and reliable DNS server, but how do I find one? Also, I did the ping tests from various geographic locations (we have servers in many countries) and GoDaddy seemed to be faster than eNom in almost every case. I'd be very thankful for any hints on this! Edited: Well, maybe this one does not have an answer after all...

    Read the article

  • Facebook Comments and page SEO

    - by Gaurav Gupta
    Facebook's recently launched commenting system for blogs loads comments in an iframe, instead of loading them inline. Since blog comments can often contribute significantly to a page's SEO, is it a good idea to use Facebook's system on my blog? Or does Google recognize iframe content as a part of the page and treat it as such? (It's noteworthy that Disqus.com does not use iframes and loads all comments inline.)

    Read the article

  • Is having a 'home' navigation item on the home page negative to your site's SEO?

    - by Brady
    My work colleague has recently had conversations with some SEO consultants, and after those conversations she has come to the conclusion that having a link to the home page on the home page will have a negative effect on the website's SEO. Because of this, we are now building websites that don't show a home link until you are on any page other than the home page. If the above argument is true, then surely if we are on the about page of a website we shouldn't show a navigation item for the page we are on, and that would be the case for any other page of the website... So my question is: does having a home navigation item on the home page have a negative effect on the website's SEO? And if not: why has my colleague come to the above conclusion? Could she be misunderstanding something more important about home links on the home page regarding SEO?

    Read the article

  • Submitting a sitemap to take care of inherited Google crawler errors

    - by leeand00
    I have an awful lot of Google crawler errors (1,000 or so) after I inherited a site that the previous owner migrated without moving much of their content. Would generating a sitemap of the current site and submitting it to Google help fix this? Is there any quicker, automated way to eliminate errors other than clicking each and every site error? Note: I have already tried automating this on my own.
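    For what it's worth, a sitemap only tells Google which URLs currently exist; crawl errors for pages that moved generally need 301 redirects (or time) to clear. If you do want to submit one and the CMS can't produce it, generating the file yourself is simple. A minimal sketch, assuming you already have a list of the live URLs:

        <?php
        // Minimal sitemap.xml generator. $urls would come from your CMS
        // or a crawl of the current site; the list here is illustrative.
        $urls = array(
            'http://example.com/',
            'http://example.com/about',
        );

        $xml = new XMLWriter();
        $xml->openURI('sitemap.xml');
        $xml->startDocument('1.0', 'UTF-8');
        $xml->startElement('urlset');
        $xml->writeAttribute('xmlns', 'http://www.sitemaps.org/schemas/sitemap/0.9');
        foreach ($urls as $url) {
            $xml->startElement('url');
            $xml->writeElement('loc', $url);
            $xml->endElement(); // url
        }
        $xml->endElement(); // urlset
        $xml->endDocument();
        $xml->flush();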

    Read the article

  • "Progressive" JPEG: Why do many web sites avoid rendering JPEGs that way? Pros, cons?

    - by Chris W. Rea
    When JPEG images are used by a web page, they are typically rendered top-down ... but they can also be rendered using a mode called progressive JPEG, where the image starts out full-size, but blurry, and then gets sharper with successive passes, until it's fully loaded. Progressive loading requires the image have been saved that way. Why don't more web sites use progressive JPEG? What are the drawbacks? Is it simply a lack of tool support, or are these files somehow inferior to traditional top-down rendered JPEG images?
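    As a side note, converting an existing baseline JPEG to progressive is a one-liner in most toolchains. In PHP's GD, for instance, imageinterlace() flags the image for progressive output before it is written; a small sketch, with illustrative file names:

        <?php
        // Re-save a baseline JPEG as progressive using GD.
        $image = imagecreatefromjpeg('photo.jpg');      // illustrative path
        imageinterlace($image, true);                   // mark as progressive
        imagejpeg($image, 'photo-progressive.jpg', 85); // quality 85
        imagedestroy($image);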

    Read the article

  • Forum software alternative to phpBB3

    - by Fernando
    I've been using phpBB3 for quite some time now. It seems to me this forum software hasn't evolved at all in all these years. Installing mods is a hassle, updating it to a newer version a real pain in the arse, and moderating is not intuitive at all. Besides, I find there's just no way to stop spam on it. Lots of web software has done a great job of controlling spam, but phpBB3 still doesn't, at least not without too much complex and tedious work. Since my last attempt to update to the latest version broke it, I'm finally fed up with it, and have decided I'm not wasting a minute more in maintaining such a beast. I'm looking for a free software alternative (free as in free beer and free as in free speech), so SMF is not an alternative at the moment. The most important feature I'm looking for: there must be a script to migrate all of the current phpBB users and posts into the new system. Out of all the alternatives out there, do any of them support these features? Which one do you recommend?

    Read the article

  • Tracking click conversions with Google Analytics

    - by Joel
    Is there any way I can use Google Analytics to track click conversions on a link? For example, if I have a link to www.a.com, is it possible for Google to track the number of times that particular link was shown on my page and then track how many times it was actually clicked? The problem is that I do not show the link to www.a.com every time the page loads. I am using a random function (server side) to generate a different link every time. I would like Google Analytics to provide me with the click conversion for each of the links I choose to show the user. Thanks, Joel
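    One common pattern with the legacy ga.js API is event tracking: have the same server-side code that picks the random link emit an onclick handler, then compare the click events against how often each link was served. A sketch under those assumptions (the link list and category name are made up, and very fast navigation can occasionally race the event):

        <?php
        // Emit a randomly chosen link that reports a click event to
        // Google Analytics via the legacy _trackEvent API. The event
        // label records which link was shown, so clicks can later be
        // compared against how often each link appeared.
        $links = array('http://www.a.com', 'http://www.b.com'); // illustrative
        $href  = $links[array_rand($links)];
        $safe  = htmlspecialchars($href, ENT_QUOTES);

        printf(
            '<a href="%s" onclick="_gaq.push([\'_trackEvent\', \'RandomLink\', \'click\', \'%s\']);">Visit</a>',
            $safe,
            $safe
        );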

    Read the article

  • Host And Expose Application to local small network

    - by tartak
    I developed a little application (a web application) using Java EE + MySQL. I keep some data in it and, from time to time, pull some reports from that data. My problem is that I have to access this application from 4-5 computers in the office. They are connected through a switch; it's a typical small office network, nothing fancy. I need some advice on how to do this. I mean, for a small application with no external communication, is it mandatory to use an Apache machine? I'd use a simple Tomcat container on the "server machine" (which is my computer, a Windows machine) and, basically, I would like to permit access to my colleagues as well. I don't have any knowledge about concurrency (I know MySQL permits concurrent access), so I would like some configuration tips also.

    Read the article

  • Best practices when loading images for improving page loading speed

    - by Naoise Golden
    I am working on optimizing a page's loading speed. Here are some analytics: notice how the images, although only accounting for 65% of the total size (1.1MB), are by far the slowest-loading assets, at 96% of the load time. I'd like to know the recommended practices for optimizing loading speed, taking only images into account. Some of the techniques we are already applying: image compression; images hosted on a cookieless domain and CDN; spriting everything that can be sprited; HTTP headers: keep-alive, and Expires set to one year. Disclaimer: I have gone through the available documentation; I think by focusing on image loading optimization I am not creating a duplicate or a subjective question.

    Read the article

  • How to Install Moodle to subdomain with softaculous via cpanel

    - by Sean
    Hi there, I installed Moodle to a directory with Softaculous, as it doesn't allow installing to a subdomain. After the install I created a subdomain and pointed the destination (of the subdomain) to the previously created Moodle directory. Now when I go to subdomain.example.com it says Incorrect access detected, this server may be accessed only through "http://example.com/moodle" address, sorry. Please notify server administrator. Any suggestions much appreciated! I must be doing something wrong; when installing, it was very similar to these instructions.
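    If it helps: that exact message is what Moodle prints when the request URL doesn't match $CFG->wwwroot in its config.php, which Softaculous will have set to the directory URL. Assuming the subdomain's document root points at the Moodle directory, updating that one setting should clear it; the domain names below are illustrative:

        <?php
        // config.php (excerpt) -- Moodle rejects requests whose URL does
        // not match $CFG->wwwroot, producing the "Incorrect access
        // detected" message. Change it from the install-directory URL
        // to the subdomain:
        // $CFG->wwwroot = 'http://example.com/moodle';  // old value
        $CFG->wwwroot = 'http://subdomain.example.com';  // new value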

    Read the article

  • good/bad idea to use email address in php session variable? [closed]

    - by Stephan Hovnanian
    I'm developing some additional functionality for a client's website that uses the email address as a key lookup variable between various databases (email marketing system, internal prospect database, and a third shared DB that helps bridge the gap between the two). I'm concerned that storing a visitor's email address as a $_SESSION variable could lead to security issues (not so much for our site, but for the visitor). Anybody have suggestions or experience on whether this is okay to do, or if there's another alternative out there?
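    For what it's worth, $_SESSION data lives on the server and only the session ID travels in the visitor's cookie, so the address itself is not exposed in transit; the remaining risks are session hijacking/fixation and whoever can read the server's session storage. A cautious sketch, which regenerates the session ID and stores a keyed hash rather than the raw address (the secret and email below are illustrative):

        <?php
        session_start();
        session_regenerate_id(true); // guard against session fixation

        // Storing only a keyed hash keeps the raw address out of session
        // storage while still allowing lookups, provided the other
        // systems key on the same hash. Keep the secret out of version
        // control; both values here are illustrative.
        $secret = 'replace-with-your-own-secret';
        $email  = 'visitor@example.com';

        $_SESSION['email_key'] = hash_hmac('sha256', strtolower(trim($email)), $secret);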

    Read the article

  • Transferring Email to Google Apps - Timing

    - by picus
    I did a site for a client a few months back. Hosting and email were set up through a DreamHost VPS. Hosting has not been an issue, but email has become increasingly dodgy. Long story short, they want to transfer to Google Apps for Business. They already have the mailboxes set up; they are on Macs, so they will be transferring using the Gmail email importer for Mac. My question is this: should they transfer their domain over first, or their emails? I'm a developer, so I have no problem changing their DNS settings, but I am not an IT manager type by any stretch, so I am a bit in the dark about process. My proposed process was: Delete any junk/deleted mail from the current environment; Back up email locally; Copy emails to Google Apps via the importer; Switch the domain and update Mac Mail settings. It seems that doing the domain first would be best, but I don't know if that is possible. I have been trying to find a generic checklist, but I haven't been able to.

    Read the article

  • Error using SoapClient() in PHP [migrated]

    - by Dhaval
    I'm trying to access a WSDL (Web Service Definition Language) file using PHP's SoapClient(). I found that the WSDL file requires authentication. I tried passing credentials in an array as another parameter, and activated SSL on my server, but I'm still getting an error. Here is the code I'm using: $client = new SoapClient("https://webservices.chargepointportal.net:8081/coulomb_api_1.1.wsdl", array("trace" => "1", "Username" => "username", "Password" => "password")); Here is the error I'm getting: Warning: SoapClient::SoapClient(https://webservices.chargepointportal.net:8081/coulomb_api_1.1.wsdl) [soapclient.soapclient]: failed to open stream: Connection timed out in PATH_TO_FILE on line 80 Warning: SoapClient::SoapClient() [soapclient.soapclient]: I/O warning : failed to load external entity "https://webservices.chargepointportal.net:8081/coulomb_api_1.1.wsdl" in PATH_TO_FILE on line 80 Fatal error: Uncaught SoapFault exception: [WSDL] SOAP-ERROR: Parsing WSDL: Couldn't load from 'https://webservices.chargepointportal.net:8081/coulomb_api_1.1.wsdl' : failed to load external entity "https://webservices.chargepointportal.net:8081/coulomb_api_1.1.wsdl" in PATH_TO_FILE:80 Stack trace: #0 /home2/wingstec/public_html/widget/API/index.php(80): SoapClient->SoapClient('https://webserv...', Array) #1 {main} thrown in PATH_TO_FILE on line 80 The error seems to say the file doesn't exist at the given path, but when we open that path directly in a browser we do get the file. Can anyone help me figure out what exactly the problem is?
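    For reference, SoapClient has no "Username"/"Password" options; HTTP authentication (including for fetching a protected WSDL) uses the 'login' and 'password' keys, and option keys take =>. A corrected sketch is below. Note, though, that "Connection timed out" usually means the web host cannot open an outbound connection to port 8081 at all (shared hosts often block non-standard ports), which no client option will fix:

        <?php
        // Corrected construction: keys use =>, HTTP auth goes in
        // 'login'/'password'. connection_timeout only bounds the wait;
        // it cannot unblock a filtered port.
        $client = new SoapClient(
            'https://webservices.chargepointportal.net:8081/coulomb_api_1.1.wsdl',
            array(
                'trace'              => 1,
                'login'              => 'username',
                'password'           => 'password',
                'connection_timeout' => 30,
            )
        );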

    Read the article

  • Removing Duplicate Data From SQL Query Output For Display On A Web Page [migrated]

    - by doubleJ
    I had asked a similar question on Stack Overflow but didn't really get anywhere. This page shows the output that I'm currently getting from my MSSQL server. I have a table of venue information (name, address, etc...) that our events happen at. Separately, I have a table of the actual events that are scheduled (an event may happen multiple times in one day and/or over multiple days). I join those tables with this query: <?php try { $dbh = new PDO("sqlsrv:Server=localhost;Database=Sermons", "", ""); $dbh->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION); $sql = "SELECT TOP (100) PERCENT dbo.TblSermon.Day, dbo.TblSermon.Date, dbo.TblSermon.Time, dbo.TblSermon.Speaker, dbo.TblSermon.Series, dbo.TblSermon.Sarasota, dbo.TblSermon.NonFlc, dbo.TblJoinSermonLocation.MeetingName, dbo.TblLocation.Location, dbo.TblLocation.Pastors, dbo.TblLocation.Address, dbo.TblLocation.City, dbo.TblLocation.State, dbo.TblLocation.Zip, dbo.TblLocation.Country, dbo.TblLocation.Phone, dbo.TblLocation.Email, dbo.TblLocation.WebAddress FROM dbo.TblLocation RIGHT OUTER JOIN dbo.TblJoinSermonLocation ON dbo.TblLocation.ID = dbo.TblJoinSermonLocation.Location RIGHT OUTER JOIN dbo.TblSermon ON dbo.TblJoinSermonLocation.Sermon = dbo.TblSermon.ID WHERE (dbo.TblSermon.Date >= { fn NOW() }) ORDER BY dbo.TblSermon.Date, dbo.TblSermon.Time"; $stmt = $dbh->prepare($sql); $stmt->execute(); $stmt->setFetchMode(PDO::FETCH_ASSOC); foreach ($stmt as $row) { echo "<pre>"; print_r($row); echo "</pre>"; } unset($row); $dbh = null; } catch(PDOException $e) { echo $e->getMessage(); } ?> So, as it loops through the query results, it creates an array for each record and ends up like this: Array ( [Day] => Tuesday [Date] => 2012-10-30 00:00:00.000 [Time] => 07:00 PM [Speaker] => Keith Moore [Location] => The Ark Church [Pastors] => Alan & Joy Clayton [Address] => 450 Humble Tank Rd. [City] => Conroe [State] => TX [Zip] => 77305.0 [Phone] => (936) 756-1988 [Email] => [email protected] [WebAddress] => http://www.thearkchurch.org ) Array ( [Day] => Wednesday [Date] => 2012-10-31 00:00:00.000 [Time] => 07:00 PM [Speaker] => Keith Moore [Location] => The Ark Church [Pastors] => Alan & Joy Clayton [Address] => 450 Humble Tank Rd. [City] => Conroe [State] => TX [Zip] => 77305.0 [Phone] => (936) 756-1988 [Email] => [email protected] [WebAddress] => http://www.thearkchurch.org ) Array ( [Day] => Tuesday [Date] => 2012-11-06 00:00:00.000 [Time] => 07:00 PM [Speaker] => Keith Moore [Location] => Fellowship Of Faith Christian Center [Pastors] => Michael & Joan Kalstrup [Address] => 18999 Hwy. 59 [City] => Oakland [State] => IA [Zip] => 51560.0 [Phone] => (712) 482-3455 [Email] => [email protected] [WebAddress] => http://www.fellowshipoffaith.cc ) Array ( [Day] => Wednesday [Date] => 2012-11-14 00:00:00.000 [Time] => 07:00 PM [Speaker] => Keith Moore [Location] => Faith Family Church [Pastors] => Michael & Barbara Cameneti [Address] => 8200 Freedom Ave NW [City] => Canton [State] => OH [Zip] => 44720.0 [Phone] => (330) 492-0925 [Email] => [WebAddress] => http://www.myfaithfamily.com ) As you can see, The Ark Church and its associated contact information is duplicated, so when I work with those arrays and output them to the page, I see a bunch of duplicate content. I'd like to remove the duplicate information so that I get results similar to this: The Ark Church Alan & Joy Clayton 450 Humble Tank Rd. Conroe, TX 77305 (936) 756-1988 [email protected] http://www.thearkchurch.org Meetings: Tuesday, 2012-10-30 07:00 PM Wednesday, 2012-10-31 07:00 PM Fellowship Of Faith Christian Center Michael & Joan Kalstrup 18999 Hwy. 59 Oakland, IA 51560 (712) 482-3455 [email protected] http://www.fellowshipoffaith.cc Meetings: Tuesday, 2012-11-06 07:00 PM Faith Family Church Michael & Barbara Cameneti 8200 Freedom Ave NW Canton, OH 44720 (330) 492-0925 http://www.myfaithfamily.com Meetings: Wednesday, 2012-11-14 07:00 PM It doesn't necessarily have to end up like that (I'm not looking for code specific to these results, but a concept of how to not show the duplicated information). I'm assuming that an additional foreach or while will do it, but I haven't figured out any logic that says <?php if ($location == $previouslocation) echo ""; ?>.
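    The usual pattern is to regroup the flat rows before printing: key an array by venue, append each row's day/date/time to a nested list, then loop over the grouped array so the venue details are echoed once. A sketch along those lines, using the column names from the query above in place of the debug foreach:

        <?php
        // Group the flat result set by location so each venue prints
        // once, with its meeting times collected underneath it.
        $venues = array();
        foreach ($stmt as $row) {
            $key = $row['Location'];
            if (!isset($venues[$key])) {
                $venues[$key] = array('info' => $row, 'meetings' => array());
            }
            $venues[$key]['meetings'][] =
                $row['Day'] . ', ' . $row['Date'] . ' ' . $row['Time'];
        }

        foreach ($venues as $venue) {
            $info = $venue['info'];
            echo '<h3>' . htmlspecialchars($info['Location']) . '</h3>';
            echo '<p>'  . htmlspecialchars($info['Pastors']) . '<br/>'
                        . htmlspecialchars($info['Address']) . '<br/>'
                        . htmlspecialchars($info['City'] . ', ' . $info['State'] . ' ' . $info['Zip'])
                        . '</p>';
            echo '<p>Meetings: ' . htmlspecialchars(implode('; ', $venue['meetings'])) . '</p>';
        }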

    Read the article

  • How to solve "Login Only" rejection?

    - by Renan
    Recently, a site of mine was rejected due to "Login Only": "Login Only: During our review of your website, we found that the majority of pages on your site are behind a login, or there is restricted access. Please note that we will not approve applications for login-protected pages, as we are not able to review their content for acceptance into the program." Although the site does require login to submit content, it doesn't require any to view any page. How do I tell Googlebot, or whatever AdSense uses to crawl pages, that all the content is publicly available but registration is needed to post?

    Read the article

  • How do you exclude yourself from Google Analytics on your website using cookies?

    - by Keoki Zee
    I'm trying to set up an exclusion filter with a browser cookie, so that my own visits to my site don't show up in my Google Analytics. I tried 3 different methods and none of them have worked so far. I would like help understanding what I am doing wrong and how I can fix this. Method 1: First, I tried following Google's instructions, http://www.google.com/support/analytics/bin/answer.py?hl=en&answer=55481, for excluding traffic by cookie content: create a new page on your domain, containing the following code: <body onLoad="javascript:pageTracker._setVar('test_value');"> Method 2: Next, when that didn't work, I googled around and found this Google thread, http://www.google.com/support/forum/p/Google%20Analytics/thread?tid=4741f1499823fcd5&hl=en, where the most popular answer says to use a slightly different code: SHS Analytics wrote: <body onLoad="javascript:_gaq.push(['_setVar','test_value']);"> Thank you! This has now set a __utmv cookie containing "test_value", whereas the original pageTracker._setVar('test_value') (which Google is still recommending) did not manage to do that for me (in Mac Safari 5 and Firefox 3.6.8). So I tried this code, but it didn't work for me. Method 3: Finally, I searched Stack Overflow and came across this thread, http://stackoverflow.com/questions/3495270/exclude-my-traffic-from-google-analytics-using-cookie-with-subdomain, which suggests that the following code might work: <script type="text/javascript"> var _gaq = _gaq || []; _gaq.push(['_setVar', 'exclude_me']); _gaq.push(['_setAccount', 'UA-xxxxxxxx-x']); _gaq.push(['_trackPageview']); // etc... </script> This script appeared in the head element in the example, instead of in the onload event of the body like in the previous 2 examples. So I tried this too, but still had no luck with trying to exclude myself from Google Analytics. To reiterate my question: I tried all 3 methods above with no success. Am I doing something wrong? How can I exclude myself from my Google Analytics using an exclusion cookie for my browser?

    Read the article

  • Apache redirecting: reason unknown

    - by Sinan
    I have a simple PHP script. The script is not important; it just prints out $_SERVER. When I request a URL like www.server.com/?ref=bar everything is fine. However, if the request contains something like www.server.com/?ref=http://www.test.com (?ref=http%3A%2F%2Fwww.test.com), the server redirects to 403.shtml. No redirect for http://x, but it redirects for http://x.y. As far as I can understand, somehow the server doesn't like "http://x.y". It always redirects to 403.shtml when there is a valid query string in the form of a valid URL. My .htaccess file is the same both on my server and my local test server, and the local test server behaves as expected (no redirects), so I don't think it is related to .htaccess. I'm on shared hosting at HostGator. Can anyone help? Edit: Here's the .htaccess file: Options +FollowSymLinks RewriteEngine On RewriteBase / RewriteCond %{REQUEST_FILENAME} !-f RewriteCond %{REQUEST_FILENAME} !-d RewriteRule ^(.*)$ index.php?$1 [L,QSA] When there is an http://x.y it redirects to 403 even if there is a physical file. However, if I remove the .htaccess, the redirect to 403 also disappears. But I need the above .htaccess file. Is there a way to get around this?
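    A 403 triggered specifically by a full URL inside the query string is characteristic of mod_security, which HostGator enables on shared hosting and which a local test server typically lacks; support can often whitelist the offending rule. Failing that, one hedged workaround is to stop sending a bare URL: encode the ref value when building the link and decode it on arrival, as in this sketch:

        <?php
        // Work around a filter that blocks "?ref=http://..." by passing
        // the URL base64-encoded with a URL-safe alphabet.
        function encode_ref($url) {
            return rtrim(strtr(base64_encode($url), '+/', '-_'), '=');
        }

        function decode_ref($value) {
            return base64_decode(strtr($value, '-_', '+/'));
        }

        // Building the link:
        $link = 'http://www.server.com/?ref=' . encode_ref('http://www.test.com');

        // Reading it back on the landing page:
        $ref = isset($_GET['ref']) ? decode_ref($_GET['ref']) : null;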

    Read the article

  • Custom per domain CSS in Internet Explorer

    - by Damiqib
    We have an old web app which would be much more usable if it could be visually tweaked a bit. Being in a corporate environment, IE (always the latest version) is all I can use. Also, the app in question being third-party, there's no way to change its own CSS files. Is there a way to inject custom per-domain CSS in Internet Explorer? Let's say I want to change the background-color of the domain http://oldapp.localintranet/; is there any way to make this happen? A place to put a custom.css file? With an add-on/extension?

    Read the article
