Search Results

Search found 9717 results on 389 pages for 'gkt pro'.


  • Video for an ads-driven web-site

    - by AntonAL
    I have a website which I will fill with a bunch of useful videos. I've implemented an ads rotation engine for articles and will do the same for videos. The next milestone is to decide how video will be integrated. There are two ways: host the videos myself (pros: complete freedom; cons: needs tens of gigabytes of storage, plus support for multiple formats to be cross-browser and cross-device), or use YouTube (pros: very simple to use, nothing to maintain). What are the pros and cons of each way? Some questions for YouTube: Will I be able to control playback of a YouTube-embedded video to show post-rolls (see the sketch below)? What is the ranking impact on my website when most of its pages refer to YouTube? Will, say, an iPad play video embedded via YouTube's iframe? Does relying entirely on YouTube make long-term sense for a website that should bring in money?
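
    A rough sketch of how a post-roll could be hooked in with YouTube's IFrame Player API is below; the element id, video id and showPostRollAd() are placeholders, not anything from the original post:

        // Load the IFrame Player API asynchronously
        var tag = document.createElement('script');
        tag.src = "https://www.youtube.com/iframe_api";
        var firstScript = document.getElementsByTagName('script')[0];
        firstScript.parentNode.insertBefore(tag, firstScript);

        var player;
        function onYouTubeIframeAPIReady() {
            // 'player' is the id of an empty div on the page; VIDEO_ID is a placeholder
            player = new YT.Player('player', {
                videoId: 'VIDEO_ID',
                events: { 'onStateChange': onPlayerStateChange }
            });
        }

        function onPlayerStateChange(event) {
            if (event.data === YT.PlayerState.ENDED) {
                showPostRollAd();   // hypothetical hook into the ads rotation engine
            }
        }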

    Read the article

  • Set Up Google Analytics to Track Domain Alias

    - by Brian Boatright
    I found this article from Google: http://www.google.com/support/analytics/bin/answer.py?hl=en&answer=55523 However, I'm not sure what happens to the data. Will I be able to determine which domain forwarded to the primary domain using their technique? Or will it simply transfer all the relevant keyword and other factors to the primary domain, but not which domain was originally landed on before the 302 redirect? What I need to do is track which domain aliases are being used.
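
    One common workaround, sketched here with placeholder domain names, is to have each alias redirect to a campaign-tagged URL, so the alias shows up in Analytics as a traffic source instead of being lost behind the redirect:

        http://alias-one.com  ->  http://www.primarydomain.com/?utm_source=alias-one.com&utm_medium=redirect&utm_campaign=domain-alias
        http://alias-two.com  ->  http://www.primarydomain.com/?utm_source=alias-two.com&utm_medium=redirect&utm_campaign=domain-alias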

    Read the article

  • Legal Web Fonts & Licensing

    - by Phill
    So I'm a little confused about the legality of using fonts for the web. Often I get a PSD file from designers that uses a special font, and the font is supplied. However, attempting to convert the font using http://www.fontsquirrel.com/fontface/generator ends with a message saying it's blocked by Adobe. Not only that, even when a font can be converted it often looks like crap when viewed in a browser. I assume the generator is primarily for people converting their own fonts, but if you purchase the use of a font, can you only use it on the web if the terms allow you to? The fonts used in PSDs are often Adobe fonts, and I can't find anything that suggests I can convert those and use them on the web. So I'm wondering if anyone knows the legal rights around using Photoshop-supplied fonts on the web? In addition, I'm wondering what resources are available (free/paid) that provide fonts that can be used on the web. Free: http://www.fontsquirrel.com http://www.google.com/webfonts Paid: http://www.fontshop.com These are the only ones I've found so far that aren't cartoon-type fonts like what's primarily on www.dafont.com
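
    For reference, embedding a font from Google Web Fonts (which are licensed for web use) is just a stylesheet link plus a font-family declaration; 'Open Sans' here is only an example, not a recommendation:

        <link href="http://fonts.googleapis.com/css?family=Open+Sans" rel="stylesheet" type="text/css">

        body { font-family: 'Open Sans', Arial, sans-serif; }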

    Read the article

  • Website design reviews and advice [closed]

    - by dotman14
    I have developed a website for a non-profit organisation, and since then I constantly get bad reviews about how my CSS is written. Most of them don't really say what the problem is or how I can redo it or make amends to it. What do you advise I do in this case to make it look better? Please feel free to migrate the question to the appropriate SO site if it does not belong here. Thank you.

    Read the article

  • Protecting CSS selectors on a large website

    - by Tim
    I have content that appears within a corporate website inside an iframe. Several departments contribute their own CSS files to manage the overall UI and design. My problem is that they may use selectors for elements like td (for instance), without notice. Of course that will affect my own content in the frame unless I add a class to every td. I'm just using td as an example: the generic style for any element could change without notice. Is there any method/convention/practice I can use to protect my own styling?
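
    A minimal sketch of the usual defensive approach, assuming the framed markup can be wrapped in a dedicated container (#my-widget is a made-up id): reset the properties the shared stylesheets are likely to touch, then apply your own styles, all scoped to that container.

        <div id="my-widget">
            ...framed content...
        </div>

        /* neutralise whatever the corporate stylesheets set on bare elements */
        #my-widget td, #my-widget p, #my-widget li { margin: 0; padding: 0; background: transparent; }
        /* then apply your own styling under the same prefix */
        #my-widget td { padding: 4px 8px; }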

    Read the article

  • Do image backlinks count as backlinks?

    - by sam
    If I have lots of images appearing on tumblr blogs, the sort of tumblogs with very little text, just reams and reams of images for people to browse through (example - http://whereisthecool.com/), and my image is embedded in their site like this: <a href="http://mysite.com" target="_blank"> <img src="http://cutecatblog.com/cat.jpg" alt="cute cat"/> </a> then the image is a link back to my site. Although there is no anchor text to speak of, does Google take into account the alt text of the image? Would this still count in Google's eyes as a backlink?

    Read the article

  • SSL setup with GoDaddy subdomains and EC2 servers

    - by Kevin
    We have two EC2 instances that are used to host various scripts. Our main page 'companyname.com' is hosted with GoDaddy but is unrelated to those EC2 instances. I need to setup SSL connections for the two EC2 microinstances, one running Linux AMI and the other running Windows Server. I purchased two single-domain Comodo certificates and am at the part to generate CSR's on the instances. I'm not sure what to put as "Server Name" on EC2. I would like each server to be accessible through a subdomain which I have forwarded on GoDaddy to the elastic IPs on EC2. For server name, do I use the elastic ip, the EC2 public dns, or the subdomain that I want? And which of these do I then place in my VirtualHosts file on Apache? The Windows instance is running IIS7 but the Apache box is priority.
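
    For what it's worth, the Common Name in the CSR is normally the host name visitors will use, i.e. the subdomain, and the same name then goes into the VirtualHost; a rough sketch with placeholder host names and file paths:

        <VirtualHost *:443>
            # ServerName is the subdomain on the certificate, not the elastic IP or the EC2 public DNS name
            ServerName scripts.companyname.com
            DocumentRoot /var/www/scripts
            SSLEngine on
            SSLCertificateFile      /etc/ssl/certs/scripts.companyname.com.crt
            SSLCertificateKeyFile   /etc/ssl/private/scripts.companyname.com.key
            SSLCertificateChainFile /etc/ssl/certs/comodo-intermediate.crt
        </VirtualHost>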

    Read the article

  • Analytics Tracking and SEO

    - by Mahesh
    I'm using Piwik on some of my websites and recently switched from Google Analytics. I find most of the features are the same in both packages. But I've always had this question in mind: what am I supposed to track other than these? Bounce rate; referral sites; keywords; geolocation; periodic data (month, year, week) for the above factors. Are there any other SEO factors to consider while tracking with any analytics software?

    Read the article

  • Fetch as Googlebot works but Submit to Index does not for AJAX urls

    - by Jennifer
    First I fetch as Googlebot, then I am prompted to Submit to Index. This I want to do, but the tool just re-prompts me. This does not happen when I am submitting a standard URL; for those URLs I get a confirmation that they were submitted to the index. It only occurs when I am submitting an AJAX URL. I know the URLs are searchable, as I have performed many tests and can see the results using /?_escaped_fragment_= Here is an example URL: http://www.townbeam.com/#!events Can someone shed some light on this? Thank you
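
    For context, under Google's AJAX crawling scheme the #! URL and the _escaped_fragment_ URL are two spellings of the same page, mapped like this:

        pretty URL:    http://www.townbeam.com/#!events
        crawler URL:   http://www.townbeam.com/?_escaped_fragment_=events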

    Read the article

  • Meta description not displaying in custom site search results page

    - by Stephen Connolly
    We have Google Custom Site Search implemented on our company website. Looking at the results page, I noticed that the meta description is not being displayed; the results just seem to take the link titles from our drop-down menu and use them as a description. When I search for the same page via google.com, the meta description is pulled in correctly. Any thoughts on why this might be happening? I can't see anything in the Custom Site Search settings.

    Read the article

  • DNS slows down on development environment

    - by Sequenzia
    I have a local development environment set up on my Mac. I am running an Ubuntu web server inside a VirtualBox VM. I set up a hosts file entry on my Mac that points my dev site to the IP of the Ubuntu virtual server. Everything works well except that a lot (though not all) of the time it takes more than 5 seconds to load a page. I used Firebug to track down where the problem is: when it's slow, the DNS part of my request takes over 5 seconds. Like I said, it's not all the time. Sometimes the name resolves and the page loads within milliseconds; the same page will be super fast on one click and then take over 5 seconds the next time. It's really slowing me down and I am not sure what is causing it.
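
    For reference, the Mac-side setup usually amounts to one hosts entry pointing the dev name at the VM's IP, plus a cache flush after editing; the name and address below are placeholders:

        # /etc/hosts on the Mac
        192.168.56.101    dev.mysite.local

        # flush the local DNS cache after changing the file (older OS X versions)
        dscacheutil -flushcache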

    Read the article

  • SSL Certificate Works in Monit - But Not in Keystore

    - by Bart Silverstrim
    I have a situation where there's a keystore file with the various root/intermediate certificates stored in it in a way that seems to work for most browsers. The problem is that when mobile browsers hit the site, there's a break in the chain and they complain. I used an SSL checker at http://www.sslshopper.com/ssl-checker.html and it states that "The certificate is not trusted in all web browsers. You may need to install an Intermediate/chain certificate to link it to a trusted root certificate." So the desktop browsers must already have the intermediate certs and can make the chain connections, I'm assuming, while the mobile browsers can't. The thing is that I had used Portecle to export certificates from the keystore and cobble them together to create a .PEM certificate to run the Monit utility. When I check that application with the SSL checker, it works fine! The person who originally created the keystore said he couldn't follow the SSL provider's directions for creating it, because he created the CSR using openssl, so the cert and private key had to be converted to DER format and imported with importkey; the directions he found online had importkey write to only one fixed keystore file, and it would erase anything already in that file if it existed. So is there a way to take the certificate I created for Monit and create a working keystore for the Tomcat website? What would be causing the chain to be broken in the current keystore but work for Monit? I have the SSL cert provider's intermediate and cross certificates, and the website's certificate, but what else would I need to create a working chain of certs for a keystore?
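
    A sketch of one way to rebuild the keystore from the pieces already in hand (file names are placeholders): bundle the site certificate, its private key and the intermediate/cross certificates into a PKCS12 file with openssl, then convert that into a Java keystore with keytool so the full chain is presented to clients:

        # combine key, certificate and chain into one PKCS12 bundle
        openssl pkcs12 -export -in site.crt -inkey site.key \
            -certfile intermediates.pem -name tomcat -out site.p12

        # import the bundle, chain included, into a new keystore for Tomcat
        keytool -importkeystore -srckeystore site.p12 -srcstoretype PKCS12 \
            -srcalias tomcat -destkeystore keystore.jks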

    Read the article

  • Google Analytics async=true seems wrong in the Google documentation?

    - by leeand00
    In the Google Analytics async example, they state that in order to include more than one tracker you need to set up your pages for asynchronous tracking, and they do so using the following code:

        <script type="text/javascript">
            var _gaq = _gaq || [];
            _gaq.push(
                ['_setAccount', 'UA-XXXXX-1'],
                ['_trackPageview'],
                ['b._setAccount', 'UA-XXXXX-2'],
                ['b._trackPageview']
            );
            (function() {
                var ga = document.createElement('script');
                ga.type = 'text/javascript';
                ga.async = true;
                ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
                var s = document.getElementsByTagName('script')[0];
                s.parentNode.insertBefore(ga, s);
            })();
        </script>

    The second tracker is not receiving any results. After looking at my tracking codes to ensure they are correct, I noticed that the ga.async = true statement is written differently in most examples: the attribute is usually set to async rather than assigned a value of true. Could this be stopping my Analytics data from posting to the second tracker, or might it be something else? Also, what calls should I look for in the Net tab in Firebug to ensure that GA is being called when the page loads?

    Read the article

  • Small IIS web farm: create an Active Directory domain or not?

    - by brian b
    We have a smallish web farm of < 5 Windows 2008 servers. Some do data, most do IIS hosting. Is it a good/bad idea to set up a domain controller and put all in the same "production" domain? We want to avoid a world where we have to sync multiple admin passwords between the boxes (or share admin credentials among the team). Presumably, the DC would be just another VM, so hardware cost doesn't enter into the discussion.

    Read the article

  • Domains with similar names and legal issues

    - by abel
    I recently purchased one of those domain names like del.icio.us. While registering, I found that delicious.com was already being used. Argument: delicious.com belongs to the same category as my to-be website; it serves premium delicious dishes. Counter argument: my to-be domain, though belonging to the same category, specializes in serving free but delicious dishes, or in giving out (affiliate) links to other sites serving premium delicious dishes. Additional counter arguments: 1. delicious.com is not in English. 2. The del.icio.us in my domain name, though having the same spelling, is not going to be used in the same fashion. For example (this may not make sense, because the names have been changed), the d in delicious on my website actually stands for the Greek letter Delta (Δ/d), and since internationalized domains are still not easily typable, I am going for the English equivalent. The prefix holds importance for the theme of the service my website intends to offer. My question: can I use the domain name del.icio.us for my website? How are these kinds of matters dealt with? (The domain names used are fictitious. I have already registered the domain but have not started using it. I chanced upon this domain name because it was short, easy to remember and suited the theme of my website.)

    Read the article

  • Facebook link to facebook.com/company-page doesn't work

    - by Teo
    For the last 2 days I've been trying to find the reason why my previous setup, a link to my website's Facebook page, doesn't work anymore. I assume I mistakenly changed something in the Facebook developer area, but I can't remember what it was. The button previously linked to facebook.com/company-page; now the same button links me just to facebook.com. I think I see a redirect happen in the tab, but I'm not sure since it changes to facebook.com too fast. The original link in the footer is correct: <a href="http://facebook.com/company-page " target="_blank" class="facebook_ico"></a>. Any ideas?

    Read the article

  • SEO Keyword Research Help

    - by James
    Hi everyone, I'm new at SEO and keyword research. I am using Market Samurai as my research tool, and I was wondering if I could ask for your help to identify the best keyword to target for my niche. I do plan on incorporating all of them into my site, but I wanted to start with one. If you could give me your input on these keywords, I would appreciate it. This is all new to me :) I'm too new to post pictures, but here are my keywords (Searches, SEO Traffic and SEO Value are per day):

        Keyword | Searches | SEO Traffic | PBR | SEO Value | Average PR / Backlinks of Current Top 10
        1:      |   730    |     307     | 20% |  2311.33  | 1.9 / 7k-60k
        2:      |   325    |     137     | 24% |   822.94  | 2.3 / 7k-60k
        3:      |   398    |     167     | 82% |   589.79  | 1.6 / 7k-60k

    I'm wondering if the PBR (phrase-to-broad ratio) value of #1 is too low. It seems like the best value because the SEOV is crazy high - that is like $70k a month. #3 has the highest PBR, but also the lowest SEOV. #2 doesn't seem worth it because of the PR competition; it might be a little too hard to get onto the top page of Google. I'm wondering which keywords to target, and whether I should be looking at any other metric to see if this is a profitable niche to jump into. Thanks.

    Read the article

  • What do I do if a user uploads child pornography?

    - by Tom Marthenal
    If my website allows uploading images (which are not moderated), what action do I take if a user uploads child pornography? I already make it easy to report images, and have never had this problem before, but I am wondering what the appropriate response is. My initial thought is to:

        1. Immediately delete (not just make inaccessible) the image.
        2. File a report with the National Center for Missing and Exploited Children with all information I have on the user (IP, URL, user-agent, etc.), identifying myself as the website operator and providing contact information.
        3. Check any other images uploaded by that IP or user and prevent them from uploading in the future (preventing this entirely is impossible, but I can at least block their account).

    This seems like a good way to be responsible in reporting, but does it satisfy all of my legal and moral responsibilities? Would it be better not to delete the image and just make it inaccessible, so that it can be sent to the National Center for Missing & Exploited Children, the police, the FBI, etc.?

    Read the article

  • Why does Google reject my title tag change?

    - by Michal P.
    I made a simple webpage, http://pundaquitboat.michaelspages.com/, giving it the title tag "Boat – Pundaquit", and I submitted it to the Google bot via Google Webmaster Tools. Then I decided to change the title of the same page to "Anawangin trip" and submitted my webpage again in the same way. The result was that the new title coexisted with the old title of the same webpage in the SERPs for maybe 2 days. After that the new title was rejected, and if I enter site:pundaquitboat.michaelspages.com/ I can see that Google still has the old copy of my webpage with the old title in its database. This problem doesn't occur in Bing, where I can enjoy a high position for the "Anawangin trip" phrase. (In Bing I hadn't submitted the old version of the title.)

    Read the article

  • What is the best approach to copy public dynamic pages?

    - by Renan
    Situation: the government is supposed to publish official information online such as acts and laws. Problem: they're using 90s expertise to do it. You can tell that by the constant use of deprecated html tags such as <table and the lack of any compression at all, which makes some documents go way over 700,000 bytes even though they're pure text. Side problem: some companies are actually editing and selling this content that should be public and free. What I need to know is the best approach to offer said official content in my own site for free. I've thought of setting up a mirror to copy the official pages from time to time, since some of them are updated frequently, which would automatically be compressed as all my pages are via htaccess.
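
    A minimal sketch of that setup, with placeholder names: a scheduled wget run refreshes the local mirror, and mod_deflate in .htaccess provides the compression the source site lacks.

        # cron job: refresh the local mirror of the official pages (placeholder URL and path)
        wget --mirror --convert-links --adjust-extension -P /var/www/mirror http://official.example.gov/acts/

        # .htaccess: compress the mirrored text documents
        <IfModule mod_deflate.c>
            AddOutputFilterByType DEFLATE text/html text/plain text/css
        </IfModule>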

    Read the article

  • Looking for Hosting Companies that Meet the Following Criteria [closed]

    - by Bryan Hadaway
    Possible Duplicate: How to find web hosting that meets my requirements? Please Note: This is not a subjective question and I am not looking for opinions. This is very much an objective question with legitimate use and purpose to identify hosts that offer the following: Multi Domain SSL Certificate Linux Server PHP5+ cPanel Unlimited Storage, Bandwidth, MySql DBs and Addon Domains SSL is mentioned first because this is most important. This is not a single domain or wildcard SSL cert. It's relatively new and unique. It's for the purpose of securing multiple domains on one account without having to have an entirely separate hosting account and SSL cert for every domain. I'm currently using BlueHost/HostMonster which meets all my criteria except for this special kind of SSL cert. Currently, HostGator is the only host that offers everything I've listed that I've been able to find. Again, I'm not requesting recommendations, advice or opinions of the best or most reputable service based on your experiences. I am asking for an objective list of known hosts that offer the aforementioned listed items only. Thereafter, I (and others who this will benefit) can make our comparisons and selection privately.

    Read the article

  • Google Webmaster Tools crawl error

    - by Shiro
    I am looking into the Google Webmaster Tools Crawl Errors section. How should I handle URLs that are invalid because of someone else's system or application? For example, the real URL is http://www.example/images/products/s_=enlarge_16gb.jpg but, for reasons I don't understand, Yahoo Groups breaks the link into http://www.example/images/products/s_= enlarge_16gb.jpg and only turns the top part into a hyperlink, which is http://www.example/images/products/s_= Because of that URL, Google shows a crawl error; I get a few errors from this kind of result or from other people's typos. How do I prevent this? I'm sure I don't have the right to go and change other people's posts. What is the solution for this? Thanks!

    Read the article

  • Highly SEO optimised forum posts

    - by Tom Gullen
    Given the following forum post:

        Basics of how internals of Construct work
        I've used GameMaker in the past. And I know some C++ and have used a few 3d engines with it. I have also looked at Unity, though I didn't get too much into it. So I know my way around programming etc... My question is, how does construct work internally? I know it allows python scripting, which itself is "technically" interpreted, though python is pretty fast as far as being interpreted goes. But what about the rest? Is the executable that gets cre...

    The forum software will take the first 150 chars of the first post as the page meta description, and the title will be the thread title. All OK. So in Google it will appear as:

        Basics of how internals of Construct work
        I've used GameMaker in the past. And I know some C++ and have used a few 3d engines with it. I have also looked at Unity, though I didn't get too much...
        http://www.domain.com/forum/basics-of-how-internals-of-construct-work.html

    Now the problem (not so much with this thread, but with other ones) is that the first 150 chars don't always create the best meta description. Is it worth my time to cherry-pick threads and manually set their description/title tags so they read like:

        Internal workings of Construct 2
        Events aren't converted to any other language. The runtime is a standalone compiled EXE application, which is optimised and actually very fast. Your events...
        http://www.domain.com/forum/basics-of-how-internals-of-construct-work.html

    The H1 on the page is still the original title, but we have overridden the title and description to look more friendly in search results. Is this advantageous, setting aside the obvious time cost?
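
    For reference, the override itself is only two lines in the thread's <head>, using the hand-written text from the example above, while the H1 keeps the original thread title:

        <title>Internal workings of Construct 2</title>
        <meta name="description" content="Events aren't converted to any other language. The runtime is a standalone compiled EXE application, which is optimised and actually very fast.">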

    Read the article

  • How do bingbot (is that the right spider name?) and googlebot interpret 301 redirects?

    - by jbcurtin
    I've been looking for documentation on how the Microsoft and Google bots interpret 301 redirects. It seems that googlebot stores documents in a URL-based index system, but I haven't been able to figure out how Bing works. Should I assume that they are still working towards copying everyone else, and that they use an algorithm close to Google's? Is it best to just forward a page to a new location via JavaScript? I think this might be a blackhat trick, but how would I tell the bots that it's not? Is a 301 redirect my best option, so I just have to bite the bullet because said pages are no longer in existence? What other options do I have that I might not be aware of?
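
    For comparison, a server-side 301 needs no JavaScript at all; a minimal .htaccess sketch with placeholder paths:

        # permanent redirect from the retired page to its replacement
        Redirect 301 /old-page.html http://www.example.com/new-page.html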

    Read the article

  • Google Custom Search gives different result numbers for the same query

    - by santiagozky
    We are using Google Custom Search and we have found that the totalResults value often alternates between two figures, even for the same query. The two values can be slightly different, or differ by more than double. The parameters I am using look like this:

        https://www.googleapis.com/customsearch/v1?
          q=something
          cx=XXXXXXXXXX
          lr=lang_en
          siteSearch=www.mydomain.com
          start=1
          fields=context%2Citems%28fileFormat%2CformattedUrl%2Clink%2Cpagemap%2Csnippet%2Ctitle%29%2Cqueries%2CsearchInformation%28searchTime%2CtotalResults%29%2Cspelling%2FcorrectedQuery
          key=YYYYYYYYYYYYYYY
          filter=0

    This is a problem because I calculate the number of result pages from it. How can I get the same results for the same query?

    Read the article
