Search Results

Search found 14053 results on 563 pages for 'upk pro (knowledge pathwa'.


  • What's the difference between my nameserver and CNAME settings?

    - by Josh Mcquiston
    I have purchased a domain name (mxsoup.net) through GoDaddy, and it is just parked. In order to set up my custom URL for my SourceAudio site, they give me the following instructions: "In order to host your site at your own URL, we need you to set up some DNS records to point your URL to us. Specifically, we need two CNAME references, one for 'www.mxsoup.net' and one for 'secure.mxsoup.net', both of which should point to 'web2.sourceaudio.com'." But the rep on the phone at GoDaddy said that my site is hosted at HostMonster.com, and therefore I need to talk to them to accomplish this (which is possibly true, but my business owner says he hasn't purchased hosting for this particular domain, though he does have some other sites in his HostMonster hosting account). My GoDaddy account shows that my nameservers are pointing at NS1.HOSTMONSTER.COM and NS2.HOSTMONSTER.COM, and I can edit those. But is this the same as setting up the CNAMEs as described above? Any help would be greatly appreciated!
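    The two settings operate at different levels. The nameserver (NS) entries at GoDaddy delegate the whole domain to HostMonster, so the requested CNAME records have to be created in the DNS zone editor at whichever host the NS entries point to (here, HostMonster), not by editing the nameservers themselves. As a minimal sketch in zone-file notation, the two records would look something like this (hypothetical formatting; a hosting control panel may present them as form fields instead):

      www.mxsoup.net.     IN  CNAME  web2.sourceaudio.com.
      secure.mxsoup.net.  IN  CNAME  web2.sourceaudio.com.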

    Read the article

  • Use Google Analytics to record newsletter clicks to an external website

    - by rlsaj
    Note: by external website I mean a website whose code we do not have access to, for example www.facebook.com. I want to record how many social-share clicks we get from our customer newsletters. For example, when a customer receives a newsletter, they can click "Share this on Facebook", which shares the hosted version of the newsletter. If I wanted to record these newsletter clicks on our own website, I understand we'd use the Google URL Builder (https://support.google.com/analytics/answer/1033867?hl=en) to create a UTM URL; but since we're linking to an external site, how do we record this?
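    Clicks that land on an external site such as facebook.com cannot be recorded by your own Analytics property, but the shared link here points back at the hosted newsletter, which you do control; tagging that hosted URL with UTM parameters records every visit that arrives through the share. A hypothetical example built with the URL Builder (the domain and all parameter values are placeholders):

      http://newsletter.example.com/june-issue.html?utm_source=newsletter&utm_medium=email&utm_campaign=june-issue&utm_content=facebook-share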

    Read the article

  • Tumblr custom domain not redirecting properly

    - by Manic
    I decided to host my blog at Tumblr, using their custom domain setup (http://blog.smokingfishgames.com/ instead of http://smokingfishgames.tumblr.com). However, it's been 72 hours and I'm still getting spotty redirection. It works some of the time: I go and see the page and blog, and it's all fine. However, it occasionally just stops working and redirects back to my web host, which is a directory with nothing but a single file called BUGGER.html (which I stuck in to make sure that it was my web host and not some empty Tumblr directory). Clearing the Chrome DNS cache makes the problem go away, for a while. After a few minutes, or an hour, or however long, I'll start seeing BUGGER.html again. I clear the cache, and poof, the blog shows up. The thing that's curious to me is that when I clear the cache and get BUGGER.html again (which happens occasionally), I can look at my Chrome DNS cache and see:

      assets.tumblr.com            UNSPECIFIED
      blog.smokingfishgames.com    UNSPECIFIED
      www.tumblr.com               UNSPECIFIED

    (IP addresses and expiration times omitted for brevity's sake; if they're important I'm sure I can replicate the issue.) This implies, to me anyway, that my browser is reaching Tumblr but getting bounced back to my web host. Any reason why this would be happening, or is this a normal symptom of DNS propagation? If it is a problem, should I be bothering Tumblr or my host with it, or is this something I can fix myself?
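    One way to narrow this down is to check what the name actually resolves to, both through the local resolver and through a public one; if the two answers differ, an old record is still cached somewhere and this is ordinary DNS propagation rather than a Tumblr problem. A sketch using dig:

      dig blog.smokingfishgames.com +short
      dig @8.8.8.8 blog.smokingfishgames.com +short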

    Read the article

  • I'm getting a 403 Forbidden error on my website

    - by user1230090
    I was accessing the directories through Cyberduck and also trying to upload files, but then the site started showing this forbidden error. At first I was still getting the homepage; now I don't even get that. Can anyone please tell me how I can get my website back? The error log shows:

      [Fri Mar 02 14:36:21 2012] [error] File does not exist: /var/www/vhosts/example.com/httpdocs/bin
      [Fri Mar 02 14:37:24 2012] [error] File does not exist: /var/www/vhosts/example.com/httpdocs/httpsdocs
      [Fri Mar 02 14:39:01 2012] [error] (13)Permission denied: file permissions deny server access: /var/www/vhosts/example.com/httpdocs/index.html
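    The last log line points at file permissions rather than a missing file: the web server user can no longer read index.html, which commonly happens when an upload client rewrites permissions. A possible fix, assuming a standard Apache setup and the paths from the log (verify file ownership as well before applying):

      chmod 755 /var/www/vhosts/example.com/httpdocs              # directory must be traversable
      chmod 644 /var/www/vhosts/example.com/httpdocs/index.html   # file must be world-readable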

    Read the article

  • CDN virtual subdomain causes duplicate content

    - by user3474818
    I have created a subdomain and a CNAME record which points to the domain root. The subdomain www.static.example.com is actually a copy of the entire website www.example.com, and it is supposed to act as a CDN and serve static content in order to improve speed. However, all of my content can be accessed via the subdomain as well, so Google has indexed it all and now I am dealing with duplicate content. How can I deny crawlers access to the subdomain, bearing in mind that I do not have a separate subfolder for the subdomain, so I can't create a separate robots.txt file?
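    Even with a single document root, the server can hand crawlers a different robots file per hostname by rewriting on the Host header. A hypothetical .htaccess sketch, assuming Apache with mod_rewrite and a second file named robots_static.txt uploaded alongside the regular robots.txt:

      RewriteEngine On
      RewriteCond %{HTTP_HOST} ^www\.static\.example\.com$ [NC]
      RewriteRule ^robots\.txt$ robots_static.txt [L]

    robots_static.txt would then carry a blanket "User-agent: *" / "Disallow: /", so the CDN hostname drops out of the index while www.example.com keeps its normal rules.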

    Read the article

  • Photo management utilities

    - by Frantumn
    I'm about to develop a web site for a new client. It's not going to be very intense, but one requirement is that, if possible, the client wants to be able to manage the photo gallery themselves. Since they are not technically savvy at all, I was wondering what utilities exist that provide a GUI for users to log in and manage photos. Can anyone make a recommendation? I haven't purchased the web hosting yet, so if your answer requires a specific type of host server, don't worry; I am open to options.

    Read the article

  • Redirecting requests for .html pages in subdirectories to the same page in root with .htaccess

    - by Asherion
    I am porting a site from an old version of a CMS to a newer version which has a different page-addressing scheme. Unfortunately, I'm not very good with .htaccess at all. What used to be URL/blog/subblog/article.html is now simply URL/article.html. This will break any link programs they have going, as well as all the old links. I need a way to use .htaccess to say: if the request is /(any subdirectory)/(string).html, then redirect to /(string).html, if that makes any sense.
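    A minimal sketch, assuming Apache with mod_rewrite enabled and rules placed in the .htaccess file at the document root; any request for a .html filename under one or more subdirectories is permanently redirected to the same filename at the root:

      RewriteEngine On
      RewriteRule ^.+/([^/]+\.html)$ /$1 [R=301,L]

    The 301 tells search engines the move is permanent, so the old deep links pass their standing on to the new root-level URLs, and root-level requests don't match the pattern, so there is no redirect loop.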

    Read the article

  • SEO: should forms in iframes be excluded from Google indexing?

    - by Marco Demaio
    I usually place forms in iframes (i.e. order forms, request-assistance forms, contact forms, etc.). Just the forms; I never place other content or pages in iframes. From an SEO point of view, would you exclude the forms from being indexed/crawled by Google or not? My forms hardly ever contain keywords/keyphrases, and I deliberately leave empty title/meta description tags in the pages shown in the iframes, because those titles are never displayed in the browser title bar. So I'm wondering what the point is of letting Google index them. Moreover, I think these form pages might drain PageRank from the other pages that are more valuable for SEO. If your answer is "yes, I would exclude them from indexing", would you simply use robots.txt to exclude them? Thanks!
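    If the form pages are to be excluded, robots.txt is the simplest mechanism. A hypothetical sketch, assuming the iframe-only pages are grouped under a /forms/ directory (the path is a placeholder):

      User-agent: *
      Disallow: /forms/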

    Read the article

  • SQL Server Transaction Log Fragmentation: a Primer

    Generally, you will have no need to worry about the number of virtual log files in your transaction log. However, if you use the default settings for 'auto-grow', you can end up with enough 'fragmentation' in your transaction log to affect performance noticeably. How can this be avoided? How can you tell it's a problem? What do you do about it? Greg explains.
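    A quick way to gauge the problem on a given database is to count the virtual log files directly. A minimal sketch ('MyDatabase' is a placeholder); DBCC LOGINFO returns one row per VLF, so a result of many hundreds or thousands of rows is the usual sign of a heavily fragmented log:

      USE MyDatabase;
      DBCC LOGINFO;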

    Read the article

  • Preventing Duplicates on Google

    - by abel
    I am currently using a rewrite rule to enable access to .php pages without the .php extension, but to keep old links from breaking, the pages can still be accessed via links containing the .php extension too. For example, domain.com/page.php can now be accessed at domain.com/page. All the links within the site now use the domain.com/page form. However, older incoming links will still point to the .php pages, meaning Google will index both URLs and mark them as duplicates. I have two plans to remedy the situation: 1. Use a PHP 301 redirect: when a page is accessed with the .php extension, redirect it individually with a 301 issued from PHP. 2. Use a canonical tag: place a canonical tag on each page, pointing to the extensionless version. My question: are both methods equally effective in keeping Google from indexing my .php pages? Which method should be preferred, by convention or otherwise?
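    A minimal sketch of the first option, assuming the snippet runs at the top of every .php page; requests arriving through the extensionless rewrite carry no ".php" in the request URI, so they fall through without looping:

      <?php
      // Redirect /page.php (with or without a query string) to /page, permanently.
      if (preg_match('#^(.*)\.php((\?.*)?)$#', $_SERVER['REQUEST_URI'], $m)) {
          header('Location: ' . $m[1] . $m[2], true, 301);
          exit;
      }

    The canonical alternative is a single <link rel="canonical" ...> element in each page's head; a 301 consolidates visitors and crawlers alike, which is why it is usually preferred when the .php URLs have no reason to stay reachable.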

    Read the article

  • Website falsely blocked because of spam. Does anyone know how we should proceed?

    - by Thomas Crepain
    I'm responsible for ICT at FOS Open Scouting, a Belgian scouting organisation. Our website was hacked a few years back and blocked by Facebook as a result. After we regained control of the site, Facebook continued to block our domain, and this is causing us a number of problems. We have tried many times in the past year to contact Facebook using their 'I am blocked from adding content' form (https://www.facebook.com/help/contact.php?show_form=block_appeal), to no avail. The blocked URLs are http://www.fos.be and http://www.fosopenscouting.be. Does anyone know how we should/could proceed?

    Read the article

  • Suggestions for live chat software for customer support on websites?

    - by Munish Goyal
    Recommendations needed: we want to get in touch with customers via live chat. Requirements: a chat window customisable to match the website theme (colours etc.); preferably a window within the webpage, not only a pop-up; ease of use for the customer; minimally intrusive; triggers/alerts to the backend side, e.g. if a user is unable to fill out the signup form, we should be able to offer help through a chat window that shows to the user automatically. What is the cost? UPDATE: After research we narrowed it down to Comm100 and LivePerson, and we will go with Comm100. LivePerson is the best commercial solution, but Comm100 is a good free one, so we will use it as a starting point. However, it sometimes throws exception alerts in the dashboard. Can it be configured so that chat invitations trigger on certain conditions? Currently only a time-based trigger is available. Are there any other major differences between the two?

    Read the article

  • IE8 HTTPS Download Issue

    - by Jon Egerton
    I have a problem with a system I develop, related to IE8 downloads over SSL (i.e. on sites using https://...), as described in this MS KB article: http://support.microsoft.com/kb/323308. We use the HttpCacheability.NoCache option, as the data being downloaded is sensitive and is downloaded from a secured site; I don't want that data cached on any of the proxies the response passes through on its way back to the client. The article describes a fix on the client side: a registry change to a BypassSSLNoCacheCheck setting. I don't want to loosen system security just for IE8, as the system works fine on anything more up to date, and getting all the clients to apply the hotfix is difficult at best and impossible at worst. We need to support IE8 in the system, at least for now. So: 1. Does the detailed hotfix have any implications for security at the browser end in IE8? Does it mean the file will be cached (in a place other than where the user saves the file)? 2. Is there some way I can make these files downloadable with a change at the server end that doesn't break the security side of things?
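    One commonly cited server-side workaround is to avoid the no-cache directive for these responses and send Cache-Control: private instead, which still forbids shared proxy caches while satisfying IE8's SSL download check. A hedged C#/ASP.NET sketch (the filename and content type are placeholders, and whether 'private' is an acceptable relaxation for this data is a judgement call):

      Response.ClearHeaders();
      Response.Cache.SetCacheability(HttpCacheability.Private); // no proxy caching; the browser itself may cache
      Response.ContentType = "application/pdf";
      Response.AddHeader("Content-Disposition", "attachment; filename=report.pdf");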

    Read the article

  • Books on web server technology [closed]

    - by tushar
    I need to understand web server technologies: how packets are received, how the server responds, how to understand httpd.conf files, and what terms like proxy or reverse proxy actually mean. I could not find any resources, so please suggest an ebook or website. By server I don't mean a specific one (Apache or nginx...); in short, I'm after a book on understanding the basics of a web server. I already asked this on Stack Overflow, and "Webmasters in a Nutshell" was the answer I got; they said it's better if I ask it here, so please help me out.

    Read the article

  • Mitigating lost emails when switching providers

    - by sam
    We're about to change to Gmail from the webmail provided by our hosting provider. I understand changing the MX records and all, but my main worry is whether any emails would fall through the gaps between the two systems during the changeover. I'm not familiar with the ins and outs of how MX records work; is it like a DNS record change, i.e. does it need to propagate? If that's the case, would there be a period where email has left my current provider but not yet switched to the new Gmail account, allowing emails to go undelivered or, worse, be lost?
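    MX entries are ordinary DNS records, so they propagate exactly like any other change: each resolver keeps the old answer until its cached TTL expires. Mail sent during the overlap simply goes to whichever server the sender's resolver still sees, and sending servers retry on temporary failure, so messages are rarely lost outright. The safe procedure is to lower the TTL a day or two ahead of the switch and keep the old mailbox active until the old TTL has fully expired everywhere. A hypothetical zone fragment with Google's published MX hosts and a short 300-second TTL:

      example.com.  300  IN  MX  1  ASPMX.L.GOOGLE.COM.
      example.com.  300  IN  MX  5  ALT1.ASPMX.L.GOOGLE.COM.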

    Read the article

  • What is the best way to have the same website on multiple domains?

    - by Daniel Magliola
    I would like to have the same website, selling a specific product, on multiple domains, to take advantage of keywords matching the domain name for several different searches. However, I understand that having the same content on multiple sites will unleash the wrath of Google. If I have a redirect from all domains minus one to that last one, do I still get any bonus for the "magic exact domain match jackpot"? The same question applies to canonical URLs. What's the best way to approach this? Thanks!
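    For reference, the redirect variant is a one-rule job on each secondary domain. A hypothetical .htaccess sketch, assuming Apache with mod_rewrite and that www.example.com stands in for whichever domain is chosen as canonical:

      RewriteEngine On
      RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
      RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]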

    Read the article

  • SEO disasters when moving domain for a high-traffic website?

    - by chrism2671
    We're looking at moving our website from http://www.wikijob.co.uk to http://www.wikijob.com/uk as we spread our wings internationally. Our .co.uk website has a PR6 and receives around half a million visitors a month, 40% international. The wikijob.com domain, while registered for a while, has not been used or promoted. I am concerned that moving domain could really haemorrhage our traffic and result in a loss of goodwill from Google, even if we use a 301; but equally, if we could transfer that PageRank to the .com domain, it would give us a massive head start around the world. Should we do it, or should we start over with .com and leave .co.uk as is?

    Read the article

  • Incorrect meta information in Google

    - by Ashfame
    Google shows incorrect meta information (title and description) in its search results for an add-on domain; the information shown is that of the primary domain of the hosting account. I mention this because add-on domains live in a subdirectory of the primary domain. Any ideas what the reason could be? Check this Google search, which shows the information from http://katherinegaudette.com/

    Read the article

  • Google Analytics Export API - nextPagePath data

    - by Btibert3
    I am probably missing something obvious, but I do not understand the following. When I query

      start.date  = DATE_START,
      end.date    = DATE_END,
      dimensions  = c("ga:pagePath", "ga:previousPagePath"),
      metrics     = c("ga:pageviews"),
      filters     = mypageofinterest,
      table.id    = "ga:mytable",
      max.results = RESULTS

    my data return as expected: all of the previous pages, including (entrance). However, when I modify the dimensions to use nextPagePath

      dimensions  = c("ga:pagePath", "ga:nextPagePath"),

    only one row of data is returned, and its pagePath and nextPagePath are identical. I replicated this result using the Query Explorer. What am I missing or doing wrong? I was expecting to see a large number of "next" pages, including (exit). Thanks in advance.

    Read the article

  • Where can a list of desktop web browsers be found?

    - by Sn3akyP3t3
    I have another question posted regarding the practicality of whitelisting. In this question I'm simply looking for a frequently updated list of the most-used desktop web browsers, to use as part of my whitelist. I'm not trying to target any specific OS, so please show one, show all. The list of desktop browsers isn't exploding, but it does grow: I've only recently been made aware of browsers that ship multiple rendering engines, and I'm not always on top of the text-based browsers out there either. I'm aware of the mobile browser platform, and there is an actively maintained list, used with regular expressions for identification purposes, that I will use, along with whatever I can find for the desktop platforms.

    Read the article

  • Pages are indexed but disappear after a few days

    - by Sergio
    My pages get indexed after one day, but some days later they disappear from the search results. Any idea why this happens? I've been looking for the usual problems, like hidden links, but can't find anything wrong. Here is an example: it was on the first page until yesterday; today it's gone. This is happening with all my pages lately, so I think it must be something common to all of them, but I can't figure out what.

    Read the article

  • Tracking Search Filter Parameters Using Google Analytics

    - by Petra Barus
    I'm just wondering if there is a way to do this using Google Analytics. Let's say I have a search filter like the one used on Trulia.com: a text search for the location, with other drop-downs for filtering by bedrooms, land size, property type (apartment, house), etc. Is there a way to track the filters and obtain a report answering questions like these: What are the most popular property types (house, apartment) for searches in the New York area? What is the most common maximum price for users looking for apartments in San Francisco? Or is Google Analytics not suitable for this kind of thing?
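    One approach is to record each filter choice as an event at the moment the search is run; the event reports can then be segmented by category and action to answer questions like the ones above. A hypothetical sketch using the analytics.js event API (all element IDs and event names are made up for illustration):

      document.querySelector('#search-form').addEventListener('submit', function () {
        var propertyType = document.querySelector('#property-type').value;
        var maxPrice = document.querySelector('#max-price').value;
        ga('send', 'event', 'search-filter', 'property-type', propertyType);
        ga('send', 'event', 'search-filter', 'max-price', maxPrice);
      });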

    Read the article

  • Why is Google still not indexing my #! website?

    - by Zubair
    I have been working on a website which uses #! (2minutecv.com), but even after six weeks of the site being up and running and conforming to the Google hashbang guidelines stated here, you can still see that Google hasn't indexed the site yet. For example, if you use Google to search for "2MinuteCV.com benefits", it does not find this page, which is referenced from the homepage. Can anyone tell me why Google isn't indexing this website? Update: Thanks for all the help with this answer. Just to make sure I understand what is wrong: according to the answers, Google never actually indexes the pages after the JavaScript has run. I need to create a "shadow site" which Google indexes (which Google calls HTML snapshots). If I am right in thinking this, then I can pick a winner for the bounty.
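    Under Google's AJAX crawling scheme, each #! URL is fetched by the crawler in an "_escaped_fragment_" form, and the server must answer that form with a pre-rendered HTML snapshot of the page; without the snapshot, the crawler only ever sees the empty JavaScript shell. A hypothetical example of the mapping (the /benefits path is a placeholder):

      pretty URL:    http://2minutecv.com/#!/benefits
      crawler URL:   http://2minutecv.com/?_escaped_fragment_=/benefits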

    Read the article

  • Which is better for search engines, repeated phrases or different phrases with the same meaning?

    - by George Botros
    When designing an ads website I have two options: 1. Let the advertiser choose from predefined lists when creating a new ad, for example a product list (T-shirt, shorts, suit, ...) and a colour list (black, red, ...). 2. Let the advertiser write their own descriptive content for the product, for example "Amazing suit at a good price". I like the first scenario, but which is better for search engine optimization (SEO): repeated phrases, or different phrases with the same meaning? Note: assume each page will contain one or more ads.

    Read the article

  • Alternative to Google AdSense with good international coverage

    - by Yoga
    I have a technical (programming-related) blog which gets around 500 visits per day; 70% are international visitors and 30% are from the US/Canada. Google AdSense disabled my account due to invalid clicks, so I can't use them (no need to explain here; they never respected or listened to my appeals as a publisher). I have tried AdBrite and have recently been using Chitika, but they give me almost nothing; e.g. Chitika: 13,865 page views, 4 clicks, $0.01. The performance is so poor I don't even want to mention it. I am already running a full top banner and a 350x200 box in the article body. I am researching whether any alternative would provide more revenue for my international or technical visitors. Thanks.

    Read the article
