Search Results

Search found 9721 results on 389 pages for 'quicktest pro'.


  • Leveraging a hosted web font service from a local development server?

    - by Tom Auger
    There are a number of popular web font services on the market today that "host" the fonts and serve them to your web page via JavaScript or CSS pointing to remote locations, for example http://webfonts.fonts.com or http://typekit.com. However, there seems to be an issue when you're developing on a local testing server: the remote font services don't validate the font request, and return 403 Access Denied errors and the like. What workarounds are there for using a remote service, such as a hosted font service, on a local development server?
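
    A quick way to confirm that the 403s come from referer/domain validation (and not something else) is to request the kit CSS while claiming different origins. A minimal diagnostic sketch, assuming Node 18+ for the global fetch; the kit URL is a placeholder, not a real service endpoint:

      // Compare the service's response when the request claims to come
      // from localhost vs. a domain registered with the font service.
      const kitCss = 'https://webfonts.example-service.com/your-kit-id.css'; // placeholder

      (async () => {
        for (const referer of ['http://localhost/', 'http://www.your-registered-domain.com/']) {
          const res = await fetch(kitCss, { headers: { Referer: referer } });
          console.log(referer, '->', res.status); // 403 for localhost suggests referer checks
        }
      })();

    If it is referer-based, many services let you add localhost or a staging domain to the kit's allowed domains; failing that, mapping the registered production hostname to 127.0.0.1 in your hosts file is a common workaround.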

    Read the article

  • Domain from A and hosting from B

    - by Zero
    I bought a domain from one company and hosting from another. On the hosting company's website I found their DNS addresses and applied them at the domain company's site (changed the nameservers). I did that yesterday, so it should be working by now, but instead I get: "Unable to resolve the server's DNS address." In the DirectAdmin control panel (DNS control) I have these settings from my hosting company: http://pastebin.com/MGbQ02hr Note: IP and domain hidden! Any ideas what's wrong?
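
    A first check is whether the new nameservers have actually been published and propagated. A minimal sketch using Node's stdlib resolver; example.com stands in for the hidden domain:

      const dns = require('dns').promises;

      (async () => {
        // What the world currently sees as the domain's delegation:
        console.log('NS:', await dns.resolveNs('example.com'));
        // ENOTFOUND or ENODATA here usually means the delegation, or the
        // zone on the hosting side, isn't live yet:
        console.log('A:', await dns.resolve4('example.com').catch(e => e.code));
      })();

    Nameserver changes at a registrar can take 24-48 hours to propagate, so an error the day after the change does not necessarily mean the zone file is wrong.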

    Read the article

  • How do I create a subdomain for a site hosted by someone who does not allow it?

    - by user99572_is_fine
    I want to create a subdomain for a site hosted by Jimdo (a DIY website builder). Jimdo, however, does not allow subdomains. I am trying to find a workaround where the subdomain is hosted elsewhere but everything else remains as it is; e.g., I use their email service and I want to keep it. The domain is not registered with Jimdo but with a registrar that allows me to edit my zones, and it currently points to the Jimdo nameservers. I also have independent hosting with its own NS information; this is where I want to host my subdomain. My thinking was that I could use ZoneEdit as a "fork" that allows me to keep using my Jimdo page like before and, at the same time, directs a subdomain to another host. Provided this is possible: how do I configure ZoneEdit CNAME or NS records to forward visitors to my website and my email to my Jimdo mail account, while pointing a subdomain to another host?

    Read the article

  • Is it possible to use a VB master page to cover an entirely separate directory written in C#?

    - by Jason Weber
    I have a company website written in VB.NET. There are 5 master pages. I recently began utilizing a forum application, also ASP.NET 4.0, but this one is written in C#. My forum directory is domain.com/knowledgebase/. Is there any possible way to take one of my VB.NET master pages and somehow integrate it into the /knowledgebase/ directory? Here's what's currently in place. This is what's at the top of every page in my site:

      <%@ Page Title="USS Vision Inc." Language="VB" MasterPageFile="~/homepage.master" AutoEventWireup="false" CodeFile="default.aspx.vb" Inherits="_default" culture="auto" meta:resourcekey="PageResource1" uiculture="auto" Debug="true" %>

    This is what's in my /knowledgebase/ directory:

      <%@ Page Language="C#" AutoEventWireup="true" ValidateRequest="false" Inherits="YAF.ForumPageBase" culture="auto" uiculture="auto" %>
      <%@ Register TagPrefix="YAF" Assembly="YAF" Namespace="YAF" %>
      <script runat="server">

    Is it somehow possible to use, for instance, homepage.master in the /knowledgebase/ directory? If so, how would I accomplish this? Thanks for any guidance anybody can offer!

    Read the article

  • How can I clone or mirror a site without SEO penalties for duplicate content?

    - by Amanda
    I am a web developer and I want to create clones of the sites I've developed for clients, so that I have an "original copy" on a subdomain of my own website and can showcase my work to new clients. What is the best way to avoid getting my clients' original websites penalised for duplicate content? I am planning to have a robots.txt file that disallows all robots, as well as using <link href="http://www.client-canonical-site.com/" rel="canonical" /> in the <head> of the pages. Is that sufficient? Should I use rel=nofollow on all the links as well?

    Read the article

  • PrestaShop install SQL error

    - by Steve
    I am trying to install PrestaShop 1.4.0.17, and reach Step 3. I enter database information, which tests okay, and I choose the second option: "Full mode: includes 100+ additional modules and demo products (FREE too!)". I choose Next, and receive the error:

      Error while inserting data in the database:
      'CREATE TABLE `shop_county_zip_code` (
        `id_county` INT NOT NULL ,
        `from_zip_code` INT NOT NULL ,
        `to_zip_code` INT NOT NULL ,
        PRIMARY KEY ( `id_county` , `from_zip_code` , `to_zip_code` )
      ) ENGINE='
      You have an error in your SQL syntax; check the manual that corresponds
      to your MySQL server version for the right syntax to use near '' at
      line 6 (Error: : 1064)

    This happens if I use either MyISAM or InnoDB. Why is this happening? This also happens if I drop all database tables and try again in simple mode. Is there a manual installation method?

    Read the article

  • Google analytics/adwords account and leaking of private data

    - by Satellite
    I am frequently asked to log into clients' Google Analytics and AdWords accounts. If I forget to log out before visiting other Google properties (Google Search, YouTube, etc.), this leaves tracks of my views/searches, exposing my activities to the client. Summary:
    1. Client gives me access to their Google Analytics / AdWords account
    2. I log into the client's Analytics account and do some stuff
    3. Then in another tab I perform some related Google searches to solve some related issues
    4. Issues solved, I then close the Analytics tab
    5. I then visit google.com, perform some unrelated searches
    6. I then visit YouTube, view some unrelated videos
    All web and YouTube searches are recorded in the client's Google account, thus leaking potentially sensitive data. Even assuming that I remember to log out correctly at step 4 (as I do 95% of the time), anything I do at step 3 is exposed to the client. I would be surprised if this is not a very common issue. I'm looking for a technical solution to ensure that this can never happen. Any ideas?

    Read the article

  • How can I determine the trending pages on my site?

    - by Dogweather
    I'm looking to learn what the "hot" pages are on one of my sites. I want to see, for various timeframes, what the top 50 pages are. I'm going to create a data feed with this info, which will be input to another app. I have Apache logs, and complete control of the machine to install what I want. I'm mostly wondering if there's something out there already that I can use, or, if I have to implement it myself, what good algorithms or strategies might be. Thanks.
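
    If rolling your own, the core of it is just a counter over the access log, restricted to a time window. A minimal sketch in Node (stdlib only); the log path and the common/combined log format are assumptions, so adjust the regex to your LogFormat:

      const fs = require('fs');
      const readline = require('readline');

      const HOURS = 24;                                 // size of the trending window
      const cutoff = Date.now() - HOURS * 3600 * 1000;
      const counts = new Map();
      const MONTHS = { Jan: 0, Feb: 1, Mar: 2, Apr: 3, May: 4, Jun: 5,
                       Jul: 6, Aug: 7, Sep: 8, Oct: 9, Nov: 10, Dec: 11 };

      // "10/Oct/2012:13:55:36 -0700" -> timestamp (offset ignored: fine for a rough window)
      function parseApacheDate(s) {
        const m = s.match(/(\d+)\/(\w+)\/(\d+):(\d+):(\d+):(\d+)/);
        return m ? Date.UTC(+m[3], MONTHS[m[2]], +m[1], +m[4], +m[5], +m[6]) : NaN;
      }

      const rl = readline.createInterface({
        input: fs.createReadStream('/var/log/apache2/access.log'), // adjust path
      });

      rl.on('line', (line) => {
        // e.g. 127.0.0.1 - - [10/Oct/2012:13:55:36 -0700] "GET /page HTTP/1.1" 200 ...
        const m = line.match(/\[([^\]]+)\] "GET ([^ ?]+)[^"]*"/); // query strings stripped
        if (!m) return;
        const t = parseApacheDate(m[1]);
        if (isNaN(t) || t < cutoff) return;
        counts.set(m[2], (counts.get(m[2]) || 0) + 1);
      });

      rl.on('close', () => {
        [...counts.entries()]
          .sort((a, b) => b[1] - a[1])
          .slice(0, 50)
          .forEach(([path, n]) => console.log(n, path)); // the top-50 feed
      });

    Raw hit counts favour perennially popular pages; a simple "trending" tweak is to rank by the ratio of this window's count to the previous window's.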

    Read the article

  • Correct microdata and/or microformats for real estate listings?

    - by Ernests Karlsons
    Given I am running a real estate rentals listing website, what would be the correct microdata or microformats for the listing pages? There is the usual data: address, photos, price, start date, possible end date, person who is renting it out, list of amenities, description etc. Are there also microformats/microdata that can be used in the listing summary page (e.g., page that displays all listings in a particular city)?

    Read the article

  • What to do with a site that has multiple languages in Google Analytics...

    - by stephmoreland
    We have a site that has four "streams" for language, and each language has different content based on that language and location (US English, Spanish, Canadian English and Canadian French). I'm wondering whether I have to set up accounts for each stream so that we can see the stats from each stream only, or can I use one account and somehow tell GA to separate the different streams based on language? For example, the US English site starts at (/en/) while the Canadian English site starts at (/ca_en/), etc.

    Read the article

  • Usefulness of the Backlinks shown in Webmaster Tools

    - by Ewan Heming
    Is the list of links for a site shown in Google Webmaster Tools a complete list or just a sample? I've noticed that the links in there appear to be all the ones I didn't think would have any real value, either because they were nofollow or from irrelevant sites. The few I did think would be of some use have never shown up, and there are also some links that are sometimes there and sometimes not (such as my LinkedIn profile). Does this mean that the missing links don't, or no longer, carry any value? It almost appears that the list is there for Google to either inform you about problems (there was a useful list there when someone tried to spam my site) or misinform you about which link-building strategies work (to keep people guessing about what works or not).

    Read the article

  • Weird Results A/B Test in Google Website Optimizer

    - by Yisroel
    I set up a test in Google Website Optimizer that has 3 variations: original (A), B, and C. In order to further validate the results of the test, I made variation C exactly the same as the original. And that's where the results get weird. 6 days into the test, the best performing variation is C. It outperforms the original by 18.4%! How is that possible? Do I now discount the results of this test entirely?
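
    Before discounting anything, it's worth checking whether an 18.4% lift is even distinguishable from noise at the sample size involved. A minimal two-proportion z-test sketch (the numbers are illustrative, not from the question):

      // z-score for the difference between two conversion rates.
      function zScore(convA, nA, convC, nC) {
        const pA = convA / nA;
        const pC = convC / nC;
        const pPool = (convA + convC) / (nA + nC);
        const se = Math.sqrt(pPool * (1 - pPool) * (1 / nA + 1 / nC));
        return (pC - pA) / se;
      }

      // 1000 visitors each, 50 vs 59 conversions: an "18% lift".
      console.log(zScore(50, 1000, 59, 1000)); // ~0.89, well below 1.96 (95% confidence)

    At that size, an identical variation "outperforming" the original by 18.4% is entirely consistent with random variation, which is exactly what an A/A arm like variation C is meant to expose.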

    Read the article

  • Adsense alternative for a "Sex Education" website?

    - by WhatIsOpenID
    I am creating a nice and niche "sex education" website. No porn, nothing offensive, and no scams. I would love to place Google AdSense, but they do not allow ads on adult sites. I would like to know what advertising or link-exchange programs I should place on my site. My sole objective is to cover the server costs and the salary of one or two people. (In this way, it is different from "Best alternative to Adsense for a small website?".)

    Read the article

  • building a website

    - by Ant
    A couple of my friends run a business and they asked me to build them a public website. It will only be used for information about the company, with some pictures. No transactions will be involved. Right now I work for a company where I build internal websites and do a lot of backend programming in C#. I understand HTML, CSS, jQuery, etc., so I feel like I am completely capable of building a website for them. However, I do not know all the basics of getting one up and running. For example: where should we host the files, what type of security issues do I need to be aware of, what's the best software to use for developing websites (I use Visual Studio at work), where can I find some design techniques, etc. Any help is appreciated.

    Read the article

  • Why does Google not ignore the word "languages", although I have set it to be ignored in advanced search settings?

    - by jitendra1234
    Why does Google not ignore the word "languages", although I have set it to be ignored in advanced search settings? Here is the term I am using in Google search: -languages site:http://en.wikipedia.org/wiki/ And here is the first result, where the word "languages" is still present (you can do a quick Ctrl+F to find it): http://en.wikipedia.org/wiki/Walter_Bedell_Smith I am just curious to know why Google has not ignored the word "languages".

    Read the article

  • How to deal with overly aggressive "Link Take Down Demands"?

    - by Eoin
    I've been receiving a large number of emails recently requesting that I clean up link spam on my forum. Initially the emails were very polite and professional, and I was happy to remove the links. Recently the emails have gotten very abrasive; here is a particularly rude example:

      From: [email protected]
      To: [email protected]

      Hi,

      This is the second time we are reaching out to you regarding your link
      to our site hxxp://www.company-two.com from
      hxxp://www.my-forum.com/some-topic-id. We really do need to remove this
      link. We have to report to Google any link we were unable to remove,
      and I wouldn't want to have to include your site in the list. Could you
      please remove our link from this page and any other page on your site?

      Thank You,
      Name Changed

    Behind the superficial pleasantries I feel there is some very real maliciousness. Note the email address: "DMCA Violations". I don't see how the DMCA is involved here, except as a phrase that tends to strike fear into many people. Also, the address doesn't match the company being linked to at all; how am I to trust they are truly operating on behalf of company-two when they don't even use one of its email addresses? My email is hidden by PrivacyPost; while that is a service with legitimate uses, I feel it's highly unprofessional for communications between two companies. Then the claim "This is the second time...": every email I've received has started like this, but a check of my spam filters has never revealed a first mail. Initially I gave them the benefit of the doubt; by now, though, it's clear this is a cheap ploy to start me off on the defensive. And finally, worst of all: the threats of reporting me to Google if I don't do everything they ask. I sent a polite reply asking for more information. I have no idea if the email address was even valid, but I never received any response. Much later I got this follow-up mail:

      From: [email protected]
      To: [email protected]

      Hi,

      This is the final time we are reaching out to you regarding your link
      to our site hxxp://www.company-two.com from
      hxxp://www.my-forum.com/some-topic-id. We will soon be reporting to
      Google any link we were unable to remove, and currently your site will
      have to be on the list. Could you please remove our link from this page
      and any other page on your site? I appreciate your urgent attention to
      this matter.

      Thank You,
      Name Changed

    This time the from address was more personal, though still not obviously connected to the spammed company. Let's be honest: I don't for one second believe that the companies were the victim of a third-party spammer, as they claim. The links in question were generated well over a year ago, and I firmly believe the companies were directly responsible for the spam links, a type of spam that has plagued my forum. Now they have the audacity to demand I spend my time cleaning up their mess, using threats to ensure they get their way. Have recent changes in Google's algorithms meant all the cash they spent spamming the web has now turned into a liability? If so, I can see why these companies are suddenly running scared. Frankly, cleaning up my forum is a good thing, but the threats they are using sicken me. So my question here is specifically about the threats: Are they valid, and would such reports to Google destroy my page rankings? Is there a way I can report this abusive behaviour to Google?

    Read the article

  • Problem re-factoring multiple timer countdown

    - by jowan
    I built my multiple-timer countdown from a simple script (entire code). The problem happens when I want to add another countdown: I have to declare another seconds variable (current_total_second):

      elapsed_seconds = tampilkan("#time1");

    and another timer variable set with setInterval:

      timer = setInterval(function() {
        if (elapsed_seconds != 0) {
          elapsed_seconds = elapsed_seconds - 1;
          $('#time1').text(get_elapsed_time_string(elapsed_seconds));
        } else {
          $('#time1').parent().slideUp('slow', function() {
            $(this).find('.post').text("Post has been deleted");
          });
          $('#time1').parent().slideDown('slow');
          clearInterval(timer);
        }
      }, 1000);

    I already know about refactoring and have tried different ways, but I'm stuck on refactoring this code. I want to make it flexible: when I add more countdowns, the script should handle them automatically, without my having to add a bunch of duplicated code, so that it becomes clearer and more efficient. Thanks in advance.
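
    One way to get there is to move the per-timer state into a function's scope, so each countdown owns its own counter and interval. A sketch along those lines; tampilkan and get_elapsed_time_string are the helpers from the original script:

      function startCountdown(selector) {
        var $el = $(selector);
        var seconds = tampilkan(selector);       // initial seconds, as before
        var timer = setInterval(function () {
          if (seconds > 0) {
            seconds -= 1;
            $el.text(get_elapsed_time_string(seconds));
          } else {
            clearInterval(timer);
            $el.parent()
               .slideUp('slow', function () {
                 $(this).find('.post').text("Post has been deleted");
               })
               .slideDown('slow');
          }
        }, 1000);
      }

      // Adding a timer is now one call, not one copied block:
      startCountdown('#time1');
      startCountdown('#time2');

    No globals are shared between timers, so the countdowns can't interfere with each other.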

    Read the article

  • What do you need to know to get a job as a web developer [closed]

    - by Alex Foster
    What do you need to know to at the very least get your foot in the door? We're assuming someone who doesn't have a college degree (yet) but will eventually get one. My guess is HTML, CSS, JavaScript, and PHP, plus Photoshop and Dreamweaver, and SQL. And being familiar with using a web host to get sites live, like knowing how to use cPanel. It's probably a very inaccurate and narrow guess, but that's what I think right now. I don't know exactly.

    Read the article

  • File access forbidden in htpasswd

    - by Nerd-Herd
    I have been using the htpasswd generated in this question, and it seemed to have been working well until recently. Since yesterday, I have not been able to access the newest file created in the folder ChatLogs (named 10_07_2012.txt). The server returns a 403 Forbidden error saying:

      Forbidden
      You don't have permission to access /ChatLogs/2012/07/10_07_2012.txt on this server.

    I am still able to access older files (up to 09 July, 2012). At first I thought it might be because of file permissions, but they are the same as on the other 9 files in the folder. What could be the problem? Please help.
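
    Permissions can look the same in a directory listing while ownership differs, which is a common cause of exactly this pattern (the process writing the newest log runs as a different user than the owner of the older files). A small diagnostic sketch in Node; the paths follow the ones in the question:

      const fs = require('fs');

      for (const p of ['ChatLogs/2012/07/09_07_2012.txt',   // works
                       'ChatLogs/2012/07/10_07_2012.txt',   // 403s
                       'ChatLogs/2012/07']) {               // the directory itself
        const s = fs.statSync(p);
        console.log(p, 'mode', (s.mode & 0o777).toString(8), 'uid', s.uid, 'gid', s.gid);
      }

    If the uid/gid of the failing file differs from the working ones, fixing the logger's umask, or chowning the file, should clear the 403.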

    Read the article

  • Determining cause of random latency/loading issues

    - by Sherwin Flight
    I'm not sure exactly which details to post in regards to my issue, because I'm not sure what is relevant. Prior to the end of September my websites all loaded quickly, in almost all cases; loading time wasn't usually more than a few seconds. However, since the end of September I have noticed a big increase in page loading times. In some cases pages were taking 30 seconds or more to load. I do have a remote monitoring service monitoring some of the sites as well, and the image below shows the response times over the past month. The response times shown at the beginning of this graph were the usual response times prior to this issue occurring. You can see that there has been a significant increase in response times from the beginning to the end of this graph. The thing is, the problem is not happening 100% of the time. If I click through the site, or even just keep refreshing the page, about 25% of the time the pages load quickly; the remaining 75% of the time they load slowly. Sometimes the pages take so long to load that they time out and don't load at all. I have contacted my hosting provider, and they said things at their end were fine. I don't believe the problem is my home internet provider, because all other websites load without a problem. The server is located in Texas, USA. This also raises another interesting point. My remote monitor checks my site from two locations: California, USA, and London, England. As you can see in the chart below, the response time is actually shorter when checked from London, which doesn't seem to make sense, since the server is physically closer to the California monitoring location. I would have expected the London monitoring location to have higher response times, since it is physically farther away. I should also point out that in some traceroute tests I've done, it seems like the first connection to the server takes the longest; after that, the rest of the page loads quickly. Below is a little chart showing the times for the first connection to the server. So, what could be causing this problem, and what steps can I take to resolve it or at least narrow it down? Sending the request to the server was very quick, and receiving the reply back seems pretty quick, but the WAIT time is really long. So it connects, sends the request, but then waits close to 30 seconds before it starts receiving data back. I am also aware that there are things I can do to speed up page loading times, like reducing the number of CSS/JS files used on a page, compressing images, etc. That is not really the source of the problem, though, because nothing has really changed on the site since before the problem started, and other sites on the same server are loading slowly as well. Any help or advice is much appreciated.
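
    The connect-fast/wait-long pattern can be reproduced from a script, which helps separate network latency from server processing time. A rough probe using only Node's stdlib; www.example.com stands in for the affected site:

      const http = require('http');

      const start = Date.now();
      const req = http.get('http://www.example.com/', (res) => {
        res.once('data', () => {
          // The "wait" (time to first byte) lives between these two timestamps.
          console.log('first byte:', Date.now() - start, 'ms');
          res.destroy();
        });
      });
      req.on('socket', (socket) => {
        socket.once('connect', () => console.log('connected:', Date.now() - start, 'ms'));
      });

    Run in a loop, this makes the 25%-fast / 75%-slow split visible in numbers. A long gap between "connected" and "first byte" points at the server (an exhausted worker or connection pool, a slow database, or a noisy neighbour on shared hosting) rather than at the network route, which would also explain why the London vs. California distances don't line up with the response times.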

    Read the article

  • Is a subdomain per service a good idea for SEO?

    - by Kennie R.
    I am creating a site with quite a few services, such as a free account service, a subdomain for the site's blog, and then subdomains for an article base and other related services. Would having them all on subdomains be a good idea? Are there any caveats you are aware of in existing search engines for this? I believe mapping foo.example.com to example.com/foo, to provide an alternative just in case, is a good idea for sitemaps; I like to keep things clean.

    Read the article

  • Is this safe? <a href=http://javascript:...>

    - by KajMagnus
    I wonder if href and src attributes on <a> and <img> tags are always safe w.r.t. XSS attacks if they start with http:// or https://. For example, is it possible to append javascript: ... to the href or src attribute in some manner, to execute code? This is disregarding whether or not the destination page is e.g. a phishing site, or whether the <img src=...> triggers a terribly troublesome HTTP GET request. Background: I'm processing text with markdown, and then I sanitize the resulting HTML (using Google Caja's JsHtmlSanitizer). Some sample code in Google Caja assumes all hrefs and srcs that start with http:// or https:// are safe; I wonder if it's safe to use that sample code. Kind regards, Kaj-Magnus
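
    A scheme check on the raw prefix is brittle (browsers strip tabs and newlines inside a scheme, for instance), so a more defensive pattern is to parse the URL and whitelist the resulting protocol. A minimal sketch, not Caja's own API, using the WHATWG URL parser available in browsers and Node:

      function isSafeHref(href) {
        try {
          // The base resolves relative URLs; absolute ones keep their own scheme.
          const url = new URL(href, 'https://example.com/');
          return url.protocol === 'http:' || url.protocol === 'https:';
        } catch (e) {
          return false; // unparsable: reject
        }
      }

      console.log(isSafeHref('https://example.org/a'));  // true
      console.log(isSafeHref('javascript:alert(1)'));    // false
      console.log(isSafeHref('java\tscript:alert(1)'));  // false (parser normalizes to javascript:)

    Note this answers only the narrow scheme question; it is not a substitute for a full sanitizer like JsHtmlSanitizer.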

    Read the article

  • Looking for free, specific Ip2Location Database

    - by Andresch Serj
    I am searching for a free db (like an updated XML or CSV file) that relates IP addresses to specific locations. I want more information than just the country. I want some sort of region or city reference, even if that ends up to be a number that makes no sense to me. Doesn't have to be super correct or always up to date either. It is just to distinguish between user groups and not to monitor or spy on them.
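
    Free region-level databases are typically distributed as CSVs of integer IP ranges, and a lookup is just a binary search over those ranges. A sketch of that shape; the file name and column order are assumptions, so adjust to whatever database you download:

      const fs = require('fs');

      // Expected rows: start_ip_int,end_ip_int,country,region_or_city
      const ranges = fs.readFileSync('ip2location.csv', 'utf8')
        .trim().split('\n')
        .map((line) => {
          const [start, end, country, region] = line.split(',');
          return { start: Number(start), end: Number(end), country, region };
        }); // assumes rows are sorted by start

      function ipToInt(ip) {
        return ip.split('.').reduce((n, octet) => n * 256 + Number(octet), 0);
      }

      function lookup(ip) {
        const target = ipToInt(ip);
        let lo = 0, hi = ranges.length - 1;
        while (lo <= hi) {
          const mid = (lo + hi) >> 1;
          if (target < ranges[mid].start) hi = mid - 1;
          else if (target > ranges[mid].end) lo = mid + 1;
          else return ranges[mid];
        }
        return null; // not in any known range
      }

      console.log(lookup('8.8.8.8'));

    Since the goal is only to distinguish user groups, an occasionally stale database of this kind is perfectly adequate.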

    Read the article

  • Do extra words in url affect SEO?

    - by smp7d
    Often, for technical reasons, we end up with some extra words in a URL that we would not want to optimize for, as they have no bearing on the content. Examples would be:
      sportssite.com/content/sports-article
      movieportal.com/node/movie-review
      electronicsforum.com/blog/top-10-cameras
      webmasters.stackexchange.com/questions/34046/do-extra-words-in-url-affect-seo
    Do these have any effect on ranking in any of the major search engines? Would it behoove us to strip the extra words?

    Read the article

  • Getting link to abstract indexed in Google Scholar

    - by JordanReiter
    We have a large digital library with thousands of papers indexed in Google Scholar. We allow Google Scholar to index our PDFs, but they're blocked unless you have a subscription. So Google has full-text indexing/searching of our PDFs (great!) but then the links point just to those PDFs (boo!) instead of the more helpful abstract pages. Does anyone know what could cause an issue like this? I am, to the best of my knowledge, following all of the guidelines laid out in their Inclusion Guidelines. Here's some example meta data:

      <meta name="citation_title" content="Sample Title"/>
      <meta name="citation_author" content="LastName, FirstName"/>
      <meta name="citation_publication_date" content="2012/06/26"/>
      <meta name="citation_volume" content="1"/>
      <meta name="citation_issue" content="1"/>
      <meta name="citation_firstpage" content="10"/>
      <meta name="citation_lastpage" content="20"/>
      <meta name="citation_conference_title" content="Name of the Conference"/>
      <meta name="citation_isbn" content="1-234567-89-X"/>
      <meta name="citation_pdf_url" content="http://www.example.org/p/1234/proceeding_1234.pdf"/>
      <meta name="citation_fulltext_html_url" content="http://www.example.org/f/1234/"/>
      <meta name="citation_abstract_html_url" content="http://www.example.org/p/1234/"/>
      <link rel="canonical" href="http://www.example.org/p/1234/" />

    example.org/p/1234 is the abstract page for the article; example.org/f/1234 is the full-text link accessible to subscribers only (and to Google Scholar). example.org/p/1234/proceeding_1234.pdf is the full-text PDF link.

    Read the article
