Search Results

Search found 17124 results on 685 pages for 'final cut pro'.

Page 120/685 | < Previous Page | 116 117 118 119 120 121 122 123 124 125 126 127  | Next Page >

  • Can I remove visible referer from link?

    - by Andreas
    I use referrer/campaign info to track which of my campaigns works best. So instead of <a href="someweb.com">someweb</a> I have a link like <a href="http://someweb.com?utm_source=john&utm_medium=email&utm_content=NAME&utm_campaign=campaing">someweb</a>. Now when a user clicks "someweb", the whole URL string is shown in the address bar. Is it possible to mask/hide this somehow? Maybe via .htaccess? Thanks in advance.
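
    One option, sketched below purely as an assumption about the setup, is to put a short redirect on your own domain so the visible link text stays clean; note that the UTM parameters still have to reach someweb.com for the campaign tracking to register, so they will reappear in the destination address bar after the redirect.

        # .htaccess sketch on the sending domain (the /go/someweb path is hypothetical).
        # The email links to /go/someweb; this rule expands it to the tagged URL.
        Redirect 302 /go/someweb http://someweb.com?utm_source=john&utm_medium=email&utm_content=NAME&utm_campaign=campaing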

    Read the article

  • Robots.txt never downloaded but some blocked URLs in GWT

    - by Zistoloen
    There is something I don't understand in Google Webmaster Tools (GWT) for my WordPress site. In the "Blocked URLs" menu, it says my robots.txt has never been downloaded, yet some URLs are reported as blocked. That seems weird and not logical. Am I missing something? User-agent: * Disallow: /*? Disallow: /wp-login.php Disallow: /wp-admin Disallow: /wp-includes Disallow: /wp-content Allow: /wp-content/uploads Disallow: */trackback Disallow: /*/feed Disallow: /*/comments Disallow: /cgi-bin Disallow: /*.php$ Disallow: /*.inc$ Disallow: /*.gz$ Disallow: /*.cgi$ Disallow: /author/* I'm afraid my robots.txt doesn't block several URLs I want to block.

    Read the article

  • Can I enable GZIP on Godaddy?

    - by Dave
    One of my sites lives on GoDaddy's bottom-of-the-line cheap hosting. I have the correct code in my .htaccess file, but it's not compressing because mod_deflate is not loaded. How do I enable that? The best I've found is an article that suggests using PHP to gzip everything (which is going to be more work than just changing hosting companies): "Add the following code to the very top of your Web pages, above the DOCTYPE: if (substr_count($_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip')) ob_start("ob_gzhandler"); else ob_start();"
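
    For reference, a minimal sketch of that output-buffering approach (assuming mod_deflate really is unavailable and that this runs before any output is sent):

        <?php
        // Sketch: gzip the page via PHP's output buffer when the client
        // advertises gzip support; fall back to plain buffering otherwise.
        // Must run before any output, i.e. above the DOCTYPE.
        if (!empty($_SERVER['HTTP_ACCEPT_ENCODING'])
            && substr_count($_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip')) {
            ob_start('ob_gzhandler');
        } else {
            ob_start();
        }
        ?>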

    Read the article

  • How to enable customers to use their own domain for sites hosted by me

    - by Scott
    I am thinking of running a self-service site builder, but I was wondering how I would allow customers to use domains they already own. Is that even possible? Let's say my site is www.bestsitebuildingwebsite.com and each customer has URLs like www.bestsitebuildingwebsite.com/frances, www.bestsitebuildingwebsite.com/eden and www.bestsitebuildingwebsite.com/john, and a customer has a domain called widgets.com. Is it actually possible for widgets.com to point to my site somehow and still have hashes in the URL work (my site makes use of hashes for AJAX queries)? And would their site still have good SEO with Google? Thanks, Scott
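
    For context, site builders usually handle this with a DNS record on the customer's domain pointing at the builder's server plus name-based virtual hosting, so the application can pick the right site from the Host header; a minimal Apache sketch in which every name and address is hypothetical:

        # Customer points DNS at the builder (an A record, or a CNAME on www):
        #   widgets.com.      A      203.0.113.10
        #   www.widgets.com.  CNAME  www.bestsitebuildingwebsite.com.
        # The builder then serves it from a name-based virtual host:
        <VirtualHost *:80>
            ServerName widgets.com
            ServerAlias www.widgets.com
            DocumentRoot /var/www/sitebuilder
            # The application inspects the Host header to load the
            # customer's pages instead of relying on the /frances path.
        </VirtualHost>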

    Read the article

  • Managing multiple Adwords accounts from one Google account

    - by CJM
    I have a Gmail account linked to numerous Analytics accounts and a couple of AdWords accounts - that is, I can track stats from a dozen or so sites and have administration rights for a couple of AdWords campaigns. However, a client has already set up their AdWords account and has invited me to help administer their campaign. When I try to accept, I get the following error: The Google Account xxx already has access to an AdWords account (Customer ID: ). As many have discovered, for some reason Google won't let an account that already owns an AdWords campaign join another account. Is there any workaround for this? Temporarily I'm using a separate Gmail account, but what is the longer-term solution? Going forward, some clients will be happy for me to 'host' their campaigns (while providing them with access), but I'm equally sure that many will want to retain greater control. Surely there must be a better way than creating an additional Gmail account for each client? How do web/SEO agencies handle this?

    Read the article

  • Is PhotoBucket a viable solution to host a website's photo galleries?

    - by Evan Plaice
    I'm currently working with a lot of photographers and will probably be picking up development on a professional photography site soon. With that in mind, I can't stop thinking about a way to implement a user-friendly photo gallery hosting solution where the site owner can upload images themselves without any webmaster intervention - kind of like a CMS for image hosting. The idea is: the user logs in to PhotoBucket, uploads their gallery, visits an admin section of the site, and enters the new gallery name in the listing. And... voila, the gallery automagically gets displayed on the website in a clean lightbox-style presentation format (i.e., no iframe nonsense). I took a brief look at the API and it looks promising. Is this a viable solution? Bonus points if you have implemented something like this with PhotoBucket and/or another 3rd-party image hosting site. Note: purchasing a premium account is expected if necessary; the limitations on free accounts at most image hosting sites are just too restrictive to be useful.

    Read the article

  • Is Live Chat a valuable feature?

    - by Jim
    Does having a live chat feature on a website increase conversions enough to cover the cost of administering it? Most of the data I have found comes from companies selling live chat solutions and is unreliable in my opinion. Do users find it a valuable feature or an annoyance? Personally, I'm annoyed by sites that use live chat, especially when a little box pops up without my explicit request, but if users find it valuable then I should consider implementing it. Any concrete data is appreciated.

    Read the article

  • How is this site so fast?

    - by user8628
    How is the website http://dftba.com/ so fast? When I click a link it loads immediately. What makes it work like this, and how do I make my site work the same way? Some of the objects on the site are hosted by a domain called ecogeek-cdn.net - who is this company, and why do they host the site's images? I have been looking into this site for some time because I want my site to be like it. What I know so far: the site uses Apache; it uses Python (the developer told me this when asked); it uses jQuery and jQuery UI; it is custom built, not WordPress; it is owned/hosted by LiquidWeb; it gets a million users a month; it launched in January; it uses cPanel; it does have SSH and FTP, but only allowed from their own addresses (I tried to connect and was denied). Please excuse me; my English is not as good as yours.
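
    One common ingredient in that kind of snappiness, besides serving assets from a CDN as noted above, is aggressive browser caching of static files; a minimal Apache sketch (purely an assumption, since the site's real configuration is unknown, and it requires mod_expires to be enabled):

        # Sketch: long-lived cache headers for static assets.
        <IfModule mod_expires.c>
            ExpiresActive On
            ExpiresByType image/png              "access plus 1 month"
            ExpiresByType image/jpeg             "access plus 1 month"
            ExpiresByType text/css               "access plus 1 week"
            ExpiresByType application/javascript "access plus 1 week"
        </IfModule>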

    Read the article

  • Failing with Adsense / How to get $ PC

    - by cam77
    So, I am literally just starting out with Google AdSense. I have implemented AdSense ads from one account, using two different channels, one per website they're on (2 WordPress websites, activated with the 'Google AdSense Plugin' for WordPress). They are placed at the bottom of every post within the 'Blog' sections of both sites. My AdSense dashboard states that I've received a few clicks, but it still shows my account earnings and balance as $0.00. When, and how, will I start seeing money credited to my account?

    Read the article

  • How To Make FileZilla Open All The Required Files With One Click

    - by Omar Tariq
    Is there any way of configuring FileZilla so that I can open, with just one click, all the files on a server that I regularly edit? For example, if the files are: /home/abc/def/one.txt /home/abc/def/yet/another/directory/two.txt /home/abc/def/ghi/yet/another/directory/three.txt then it is very time-consuming to navigate through each directory and open the required files. These are only 3 files, but what if we have around 10 to 20 files? Yes, copying the paths of the directories is one option, but something built in - a button like "open all the required files of this connection" that opens them all in the editor (as set in FileZilla preferences) - would be great!

    Read the article

  • .htaccess RedirectMatch 301 issue

    - by Steve
    Hi. I've moved my WordPress installation from one domain to another, and I want to use an .htaccess file on the original to redirect visitors to the corresponding page on the new website. The old site is http://www.steve.doig.com.au/wordpress/. The new site is http://www.superlogical.net. I tried using the following .htaccess file in the /wordpress directory: RedirectMatch 301 http://www.steve.doig.com.au/wordpress(.*) http://www.superlogical.net/$1 However, all this does is redirect visitors to the URL http://www.superlogical.net/wordpress/. I guess this is working after a fashion, but I don't have WordPress installed in a /wordpress folder on the new domain. How do I remove /wordpress from the URL redirected to? Thanks.
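
    A minimal sketch of the usual fix, under the assumption that everything below /wordpress/ should map to the new domain's root: RedirectMatch matches against the URL path rather than the full URL, so the pattern can capture what follows /wordpress/ and drop the prefix.

        # In the old site's .htaccess: send /wordpress/anything to the
        # new domain without the /wordpress prefix.
        RedirectMatch 301 ^/wordpress/(.*)$ http://www.superlogical.net/$1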

    Read the article

  • Tomcat behind Apache

    - by dannynjust
    I am trying to use mod_proxy_ajp to forward all requests from tomcat.example.com to example.com:8080. Here is what the Tomcat server.xml looks like: <Connector port="8009" protocol="AJP/1.3" redirectPort="8443" /> and here is the apache.conf config: <VirtualHost *:80> ServerName tomcat.example.com ServerAdmin [email protected] ErrorLog logs/tomcat.example.com-error_log CustomLog logs/tomcat.example.com-access_log common <Proxy *> AddDefaultCharset Off Order deny,allow Allow from all </Proxy> ProxyPass / ajp://example:8009/ ProxyPassReverse / ajp://example:8009/ </VirtualHost> But it is not working - any ideas?
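
    For comparison, a minimal sketch of the same virtual host with the AJP backend given as a host Apache can actually resolve (localhost here, on the assumption that Tomcat runs on the same machine and mod_proxy plus mod_proxy_ajp are loaded):

        <VirtualHost *:80>
            ServerName tomcat.example.com
            # Forward everything to Tomcat's AJP connector on port 8009.
            ProxyPass        / ajp://localhost:8009/
            ProxyPassReverse / ajp://localhost:8009/
        </VirtualHost>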

    Read the article

  • Nginx or Apache for a VPS?

    - by James
    I consider myself to be an inexperienced user/administrator when it comes to running my VPS. I can get by with a few CLI commands, I can set up Webmin and I can set up Yum repos, but beyond the very basic stuff, I'm out of my depth. So far, I'm running Apache. I don't know it particularly well, but I can get by with editing httpd.conf if I'm told what to edit. I've heard good things about Nginx and that it's not as resource-hungry as Apache. I'd like to give it a go, but I can't find any information about its suitability for administrators like me, with little experience of sysadmin or web server config. Webmin now has support for Nginx, so getting it installed and running probably won't be too much of a problem. What I'm wondering is, from a site administrator's perspective, is running Nginx as transparent as running Apache? I.e., at the moment I can just throw up WordPress and Drupal sites without having much to worry about or having to make any config changes to Apache. Would Nginx be as transparent?
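
    For context on where Nginx is less transparent: it has no .htaccess support and no embedded PHP module, so a WordPress or Drupal site typically needs an explicit server block that rewrites pretty permalinks and passes PHP to PHP-FPM. A minimal sketch, with the domain, paths and PHP-FPM socket all being assumptions:

        server {
            listen 80;
            server_name example.com;
            root /var/www/example;
            index index.php;

            # Pretty permalinks (what WordPress's .htaccess rules do under Apache).
            location / {
                try_files $uri $uri/ /index.php?$args;
            }

            # Pass PHP to PHP-FPM; Nginx does not execute PHP itself.
            location ~ \.php$ {
                include fastcgi_params;
                fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
                fastcgi_pass unix:/var/run/php-fpm.sock;
            }
        }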

    Read the article

  • Serve most of a domain with Apache, but use mod_proxy to serve some URLs from Lighttpd

    - by Alex Pineda
    So we wish to host some pages on a new server with apache2, and embed some of our old content and functionality from another server running lighttpd in an iframe. I'm looking at this configuration from the Apache docs (http://httpd.apache.org/docs/2.2/vhosts/examples.html#page-header) under "Using Virtual_host and mod_proxy together": <VirtualHost *:*> ProxyPreserveHost On ProxyPass / http://192.168.111.2/ ProxyPassReverse / http://192.168.111.2/ ServerName hostname.example.com </VirtualHost> The only issue is that I want to proxy only on a subdomain, or even better, keep the top domain and proxy only if the URL contains a particular path, e.g. "/myprocess.php". So in essence the DNS will point to the apache2 server as the "master router".
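
    A minimal sketch of the path-based variant (taking the /myprocess.php path from the question as the example; everything else stays on Apache):

        <VirtualHost *:80>
            ServerName hostname.example.com
            ProxyPreserveHost On
            # Only this path is proxied to the old lighttpd server;
            # all other URLs are served locally by Apache.
            ProxyPass        /myprocess.php http://192.168.111.2/myprocess.php
            ProxyPassReverse /myprocess.php http://192.168.111.2/myprocess.php
        </VirtualHost>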

    Read the article

  • Retroactively applying a Piwik goal to visitors

    - by Andrew Aylett
    I started receiving a large (for me) amount of traffic on one of my pages yesterday. Today, I thought that it would be useful to track goals from that page -- there's a link to my blog from it. I added the 'visited external link' goal to Piwik, and new visits are being recorded. However, it seems to me that there must be enough data in the database to retroactively apply the goal to past visitors -- is there a way to achieve that?

    Read the article

  • Adding my face to my web-site in Google's search results

    - by Roman Matveev
    I'm trying to add a rich snippet to the template of my future web-site. The data format is review, and I used microdata markup to add all the necessary information to the web page. The Structured Data Testing Tool shows the rating, author information and review date. However, there is no face image, and the sections related to authorship are empty. I did everything recommended to link my Google+ profile to the web-site. Did I do something wrong? Or will I never see my face in the testing tool even though it would appear in the real SERPs?
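
    For reference, the authorship linking Google documented at the time was a rel="author" link from the page to the Google+ profile, with the profile linking back to the site from its "Contributor to" section; a minimal sketch with a hypothetical profile ID:

        <!-- Hypothetical Google+ profile ID; the profile's "Contributor to"
             section must link back to this site for the photo to appear. -->
        <a href="https://plus.google.com/112233445566778899?rel=author">Roman Matveev</a>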

    Read the article

  • Can search engine robots read file with permission 640?

    - by dkjain
    I am on a shared Linux web hosting server. I want search engine robots/spiders to be able to read robots.txt, but not anyone typing www.mysite.com/robots.txt into a browser. According to a Google Groups post, setting the file permission to 640 makes it possible to deny the world access to the robots.txt file while still letting search engine robots read it. Is that true? If not, how is it possible to deny the general public access to robots.txt but still allow search engine robots to read it?

    Read the article

  • Is there a good [and modern] reason to not have static HTML pages with AJAX content, rather than generate pages?

    - by user1725
    Assumptions: we don't care about IE6 or NoScript users. Let's pretend we have the following design concept: all your pages are HTML/CSS that create the aesthetics, layout, colours, and general design-related things. Let's pretend this basic code below is that: <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd"> <html> <head> <link href="/example.css" rel="stylesheet" type="text/css"/> <script src="example.js" type="text/javascript"></script> </head> <body> <div class="left"> </div> <div class="mid"> </div> <div class="right"> </div> </body> </html> Which, in theory, with the right CSS, should produce three vertical columns on the web page. Now, here's the root of the question: what are the serious advantages and/or disadvantages of loading the content of these columns (let's assume they are all indeed dynamic content, not static) via AJAX requests, versus having the content pre-rendered by a scripting language? So for instance, in the AJAX case we would have, assuming jQuery is used on load: //Multiple http requests $("body > div.left").load("./script.php?content=news"); $("body > div.right").load("./script.php?content=blogs"); $("body > div.mid").load("./script.php?content=links"); OR--- //Single http request $.ajax({ url: './script.php?content=news|blogs|links', type: 'GET', dataType: 'json', success: function (data) { $("body > div.left").html(data.news); $("body > div.right").html(data.blogs); $("body > div.mid").html(data.links); } }); Versus doing this: <body> <div class="left"> <?php echo function_returning_news(); ?> </div> <div class="mid"> <?php echo function_returning_blogs(); ?> </div> <div class="right"> <?php echo function_returning_links(); ?> </div> </body> I'm personally thinking right now that static HTML pages are the better method. My reasoning: I've separated my data, logic, and presentation (i.e., "MVC") code, and I can make changes to one without touching the others; browser caches mean the server load is mostly for the content, not the presentation wrapped around it; and I could turn my "script.php" into a more robust API for the website. But I'm not certain these are legitimately good reasons, and I'm not confidently aware of other issues that could arise, so I would like to know the pros and cons, so to speak.

    Read the article

  • Hide email address with JavaScript

    - by Martin Aleksander
    I read somewhere that hiding an email address behind JavaScript code could reduce spam bots harvesting it. <script language="javascript" type="text/javascript"> var a = "Red"; var t = "no"; var doc = document; var b = "ITpro"; var ad = a; ad += "@"; ad += b; ad += "."; ad += t; var mt = "ma"; mt += "il"; mt += "to"; var text = ""; if (text == null || text.length == 0) text = ad; doc.write("<"+"a hr"+"ef=\""+mt+":"+ad+"\">"+text+"</"+"a>"); </script> This will not display the actual email address in the source code of the page, but it will display and work like a normal link for human users. Is there any point in doing this? Will it reduce spam bots, or is it just nonsense that might slow down the page because of the JavaScript?

    Read the article

  • Tracking unique views for a site showing my advertisements [on hold]

    - by user580950
    I am in trouble. I placed an advertisement on a website in 2012. The website said they got 950,000 unique visits each month. Early in 2012 I advertised with them, but the advertisement didn't work out. When I checked after 2-3 months, I saw that the site's unique visitors were around 8,000 at that time, and I immediately closed the account. I don't remember which site I used to check the unique visitors. The advertising company has now filed a dispute against me. So is there any tool that can show me the 2012 stats for any website? I tried Google Trends but it doesn't show visitor statistics.

    Read the article

  • How long should my HTML page title really be?

    - by RandomBen
    How long should the text within my <title></title> tags really be? I know Google cuts it off at some point, but when? When I use IIS7's SEO Toolkit 1.0 I get an error stating my title should be under 65 characters. I have a book by Bruce Clay that says to use 62-70 characters and roughly 9 +/- 3 words. I have also used SenSEO's Firefox add-on, and it says to use a maximum of 65 characters or roughly 15 words. What is the maximum really? I have two sources saying 65 and one saying up to 70, but Bruce Clay is generally held in high regard.

    Read the article

  • Google Analytics - Unable to get GA Tracking

    - by Pure.Krome
    We've been using GA for a few years with no problems. About 2-3 weeks ago we tried to clean up some of our tracking, and on one of our profiles it's not working anymore (since Oct 10). First, some context, then some GA debugging output. 1. Context. We have the following setup: different root domains AND different sub-domains on one of the root domains. www.website.com www.website.com.au www.anotherWebsite.com foo.website.com baa.website.com So what we're doing is the following: each root domain and each sub-domain gets its own tracking code. This way we can allow separate people (from outside our company) to access only their own data. E.g. a manager for foo.website.com can only see data related to that domain, and cannot see data on the other domains. We also have a last account which is the SUM of all the domains; this is for us, so we can see total numbers. So to do this, we have two trackers that fire on the page. The individual accounts all work fine - they seem to be tracking data OK. The 'global' account is not working, and gives us the "Tracking Not Installed" error. This has been going on since Oct 10, so the wait 24/48/72 hours thing is waaaaay over. 2. GA Debug output. Installing the GA Debug Chrome extension gives the following output. I've tried to hide anything that could be considered secret. UA-XXXXX34-1 == Global account (which isn't working any more). UA-XXXXX34-11 == Specific account for www.website.com _gaq.push processing "_setAccount" for args: "[UA-XXXXX34-1]": ga_debug.js:18 _gaq.push processing "_setDomainName" for args: "[website.com]": ga_debug.js:18 _gaq.push processing "_setAllowLinker" for args: "[true]": ga_debug.js:18 _gaq.push processing "_trackPageview" for args: "[]": ga_debug.js:18 Track Pageview ga_debug.js:18 Tracking beacon sent! utmwv=--snipped-- Account ID : UA-XXXX234-1 Page Title : Some page title Host Name : www.website.com Page : / Referring URL : - Hit ID : 1923583969 Visitor ID : 785310647 Session Count : 51 Session Time - First : Thu Aug 23 2012 15:20:17 GMT 1000 (AUS Eastern Standard Time) Session Time - Last : Mon Oct 29 2012 11:41:46 GMT 1100 (AUS Eastern Summer Time) Session Time - Current : Mon Oct 29 2012 12:19:23 GMT 1100 (AUS Eastern Summer Time) Campaign Time : Thu Aug 23 2012 15:20:17 GMT 1000 (AUS Eastern Standard Time) Campaign Session : 1 Campaign Count : 1 Campaign Source : (direct) Campaign Medium : (none); Campaign Name : (direct) Language : en-gb Encoding : UTF-8 Flash Version : 11.4 r31 Java Enabled : true Screen Resolution : 1050x1680 Browser Size : 1033x861 Color Depth : 32-bit Ga.js Version : 5.3.7d Cachebuster : 1846514973 ga_debug.js:18 _gaq.push processing "_setAccount" for args: "[UA-XXXX234-11]": ga_debug.js:18 _gaq.push processing "_setDomainName" for args: "[website.com]": ga_debug.js:18 _gaq.push processing "_setAllowLinker" for args: "[true]": ga_debug.js:18 _gaq.push processing "_trackPageview" for args: "[]": ga_debug.js:18 Track Pageview ga_debug.js:18 Tracking beacon sent!
utmwv=--snipped-- Account ID : UA-XXXX234-11 Page Title : SomePageTitle Host Name : www.website.com Page : / Referring URL : - Hit ID : 1923583969 Visitor ID : 785310647 Session Count : 51 Session Time - First : Thu Aug 23 2012 15:20:17 GMT 1000 (AUS Eastern Standard Time) Session Time - Last : Mon Oct 29 2012 11:41:46 GMT 1100 (AUS Eastern Summer Time) Session Time - Current : Mon Oct 29 2012 12:19:23 GMT 1100 (AUS Eastern Summer Time) Campaign Time : Thu Aug 23 2012 15:20:17 GMT 1000 (AUS Eastern Standard Time) Campaign Session : 1 Campaign Count : 1 Campaign Source : (direct) Campaign Medium : (none); Campaign Name : (direct) Language : en-gb Encoding : UTF-8 Flash Version : 11.4 r31 Java Enabled : true Screen Resolution : 1050x1680 Browser Size : 1033x861 Color Depth : 32-bit Ga.js Version : 5.3.7d Cachebuster : 1580443754 and this is the js code he have. BTW, it is inside a <head></head> <script type="text/javascript"> var _gaq = _gaq || []; _gaq.push( ['_setAccount', 'UA-XXXX234-1'], ['_setDomainName', 'website.com'], ['_setAllowLinker', true], ['_trackPageview'] ,['b._setAccount','UA-XXXX234-11'], ['b._setDomainName','website.com'], ['b._setAllowLinker',true], ['b._trackPageview'] ); (function () { var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true; ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js'; var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s); })(); </script> Finally, I've triple checked that the UA is the correct text. and yes, the global account is -1 and the specific domain is -11. Anyone have any suggestions to help?

    Read the article

  • Set up development site on another server/host

    - by Ofeargall
    I'm developing a site for a client. They've got a site now that's hosted at hosting.com. I'm going to move them to my VM hosting solution at Edgeweb, but I want to run some tests and have the client approve the site before changing the name servers to the new hosting location. How do I make this happen? I'm running Red Hat/Apache on Linux for the Edgeweb hosting. I don't have control of the domain name (i.e. the client controls that right now). Edgeweb has set up a DNS zone for the domain name so that when the time comes to switch, we're ready to go. I'm a web developer and I understand the technologies that make a user experience 'work', but I'm unfamiliar with server jargon and all that, so please be patient. Thanks in advance.
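
    A common way to preview the site before the nameserver change (a suggestion rather than anything from the question) is to point only the testing machines at the new server with a hosts-file entry, leaving public DNS untouched; a sketch with a hypothetical IP and domain:

        # /etc/hosts on the machines doing the testing
        # (on Windows: C:\Windows\System32\drivers\etc\hosts). Hypothetical values.
        203.0.113.10    clientdomain.com www.clientdomain.com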

    Read the article

  • How to interpret number of URL errors in Google webmaster tools

    - by user359650
    Google recently made some changes to Webmaster Tools, which are explained here: http://googlewebmastercentral.blogspot.com/2012/03/crawl-errors-next-generation.html One thing I could not find out is how to interpret the number of errors over time. At the end of February we migrated our website and didn't implement redirect rules for some pages (quite a few, actually). Here is what we're getting from the Crawl errors report (screenshot omitted). What I don't know is whether the number of errors is cumulative over time (i.e. if Google bots crawl your website on 2 different days and find 1 separate issue on each day, whether they will report 1 error for each day, or 1 for the 1st and 2 for the 2nd). Based on the Crawl stats (screenshot omitted), we can see that the number of requests made by Google bots doesn't increase. Therefore I believe the number of errors reported is cumulative, and that an error detected on one day is taken into account and reported on subsequent days until the underlying problem is fixed and the page is crawled again (or until you manually mark the error as fixed), because if Google doesn't make more requests to a website, there is no way it can check new pages and old pages at the same time. Q: Am I interpreting the number of errors correctly?

    Read the article

  • What is the SEO impact of moving my domain to another IP address and what is the right way of doing this?

    - by ElHaix
    I am planning to move several websites to a new hosting provider - keeping the same URLs, which will then resolve to different IP addresses. For example, some sites are Canadian-content-only sites, hosted on .CA domains sitting on Canadian IP addresses. I want to move these to Amazon servers, which have US IP addresses. The domain names will remain the same. (1) What is the SEO impact of this? (2) Will the sites lose ranking if they are moved to a new IP address (Canadian or not), and if so, what is the cleanest way of accomplishing the move (some kind of 301s)?

    Read the article
