Search Results


  • Moving from one DNS provider to another

    - by Senthil Kumaran
    I registered a domain with a particular provider, X, and have been unhappy with their service, so when the time for renewal came I did not renew and let the registration expire. I am hoping that once it has expired with this provider, I will be able to register the same domain name through an alternative provider that I have tested and am satisfied with. What kind of precautions should I take? The domain name is not a critical one (it belongs to an NGO), but we would prefer to own it again without any change in the name.

    The expiry notice says: "Domains can be renewed between 90 days before and 14 days after the expiry date. If domains are not renewed they will be removed from the account and set for deletion." Should I wait until it is deleted at their end so that I can register it through another provider?


  • Not finding a good free webhost [duplicate]

    - by JoJo
    This question already has an answer here: "How to find web hosting that meets my requirements?" (5 answers)

    I am searching for a free web host where I can put my website on my own domain, but I can't find a good one. My website contains a few ASP.NET pages and a phpBB forum, and I already have my own domain, so I don't want to be stuck on a free subdomain. Is there a free web host that:

    - can run ASP.NET,
    - can run phpBB, and
    - allows you to use your own, already registered domain?


  • Forward .html/.htm to .php with .config

    - by PhilipK
    I'm moving a site from my Linux-hosted server to a client's Windows-hosted server. The .htaccess file no longer works, and I'm told that Windows (IIS) servers use a web.config file instead. How can I forward all users accessing .html and .htm files to the equivalent .php file?

    Server info:

    - OS/hosting type: Windows / shared hosting
    - .NET runtime version: ASP.NET 2.0/3.0/3.5
    - PHP version: PHP 5.2
    - IIS version: IIS 7.0
    - Data center: US regional
    - Edit: hosting is provided by GoDaddy

    A friend told me the following should work, but it has no effect on the site:

      <configuration>
        <system.webServer>
          <handlers>
            <add name="PHP-FastCGI" verb="*" path="*.html" modules="FastCgiModule"
                 scriptProcessor="c:\php\php-cgi.exe" resourceType="Either" />
          </handlers>
        </system.webServer>
      </configuration>
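    Note that the handler mapping above only tells IIS to run .html files through the PHP interpreter; it does not forward anything to the .php file. Redirecting is normally done with the IIS URL Rewrite module instead. A minimal sketch, assuming the shared host has the URL Rewrite module installed (the rule name is arbitrary):

      <configuration>
        <system.webServer>
          <rewrite>
            <rules>
              <!-- 301-redirect /page.html or /page.htm to /page.php -->
              <rule name="HtmlToPhp" stopProcessing="true">
                <match url="^(.*)\.html?$" />
                <action type="Redirect" url="{R:1}.php" redirectType="Permanent" />
              </rule>
            </rules>
          </rewrite>
        </system.webServer>
      </configuration>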


  • Making an advertising server to serve ads from different ad networks

    - by John
    In India there are many ad networks (other than AdSense) that pay per acquisition or per lead, so JavaScript ad code is not required (fraudulent clicks don't matter as long as a click converts). Each ad network has many companies, and each company has many banner sizes for its ads. Any ad may suddenly be stopped because the company's target has been met, which is a common nuisance: if we don't remove those URLs, that company gets conversions for free. I have a dozen sites, and removing the ads every now and then is difficult. Also, CPA-based ads may not convert at all, which means I need to remove non-performing ads regularly.

    I've gone through "How can I show multiple ad networks on my site?". I've also looked at DFP, but without AdSense they wouldn't let me open an account.

    I want to build an ad server into which I feed new ads (banner image + click link), organized into categories (shoes, phones, books, etc.). If an ad is paused, I simply remove or pause it there while the other ads in the category keep running, and changing ad code within the sites is no longer required. For example, say I have an ad category "clothing" where I can add ads from different companies; when one of my sites requests an ad from that category, the server randomly selects an ad in the category and returns it for display. Removing or adding ads within the category does not affect the sites requesting them. Any idea how to implement this?
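    A minimal sketch of such an endpoint in PHP (the database schema, table name, and credentials are illustrative assumptions): each site embeds a banner fetched from something like ad.php?category=clothing, and pausing a campaign is a single UPDATE on the server, with no change to the sites.

      <?php
      // ad.php - return a random active ad for the requested category.
      // Assumed schema: ads(id, category, image_url, click_url, active)
      $pdo = new PDO('mysql:host=localhost;dbname=adserver', 'user', 'pass');
      $stmt = $pdo->prepare(
          'SELECT image_url, click_url FROM ads
           WHERE category = :cat AND active = 1
           ORDER BY RAND() LIMIT 1'
      );
      $cat = isset($_GET['category']) ? $_GET['category'] : '';
      $stmt->execute(array(':cat' => $cat));
      if ($ad = $stmt->fetch(PDO::FETCH_ASSOC)) {
          // Pausing an ad is just: UPDATE ads SET active = 0 WHERE id = ...
          printf('<a href="%s"><img src="%s" alt=""></a>',
                 htmlspecialchars($ad['click_url']),
                 htmlspecialchars($ad['image_url']));
      }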


  • How to properly remove URLs from Google's index?

    - by ElHaix
    On some of our sites we now have several thousand pages that dilute our website's keyword density. The website is an MVC site with SEO routing. If I submit a new sitemap with, say, only the 2,000 or so pages that we want indexed, will Google re-index the site with only those 2,000 pages and drop the superfluous ones, even though navigating to the diluting pages still works?

    For example, I want to keep roughly 2,000 URLs like:

    - www.mysite.com/some-search-term-1/some-good-keywords
    - www.mysite.com/some-search-term-2/some-more-good-keywords

    and remove several thousand like the following that have already been indexed:

    - www.mysite.com/some-search-term-xx/some-poor-keywords
    - www.mysite.com/some-search-term-xx/some-poor-more-keywords

    These pages are not actually "removed": navigating to their URLs still renders a page. Even though there are potentially hundreds of thousands of pages, I only want about 2,000 to be re-indexed and retained, and the others removed (without having to do it manually). Thanks.
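    For reference, dropping a page from the sitemap does not by itself deindex it; the usual signal for pages that still render is noindex, served as a meta tag (a sketch; how you attach it to the unwanted routes depends on the MVC framework):

      <!-- in the <head> of each page that should leave the index -->
      <meta name="robots" content="noindex, follow">

    or, equivalently, as an HTTP response header:

      X-Robots-Tag: noindex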


  • Obtaining credit card data

    - by Adam
    I've read the many posts on this site that say we are not allowed to store credit card numbers unless we are PCI-compliant. But I'm wondering: is it possible to send a credit card number through a form to an email address, or would that also infringe the standards? The reason I ask is that a local business owner wants to collect card numbers through a form on his website so he can manually enter the card info on his end. I'm assuming the only way to properly collect a credit card number is to set up a merchant account? What's the best way to get a card number without calling the actual customer? I'm thinking email is a bad idea as well.


  • Moving large website to new CMS - URL changes

    - by herrherr
    Hi, I was wondering if you have any tips on the following situation. I'm going to move a large website to a new content management system. Some details on the site:

    - online news magazine with roughly 3,000 articles
    - domain age: 10 years
    - online in its current form since May 2010
    - indexed pages: ~10,000
    - share of search-engine traffic: under 10%

    Unfortunately a custom-tailored CMS was used for the site. Its performance, reliability and SEO capabilities have been really bad, so we are moving to a new and proven open-source CMS. All the articles will be kept as they are, but the URL structure as well as the structure of the HTML templates will change. What I want to do is create 301 redirects for all articles from the old schema to the new one, i.e.:

    - Old: www.example.com/en/html/news/detail/title-of-the-article/
    - New: www.example.com/category/title-of-article.html

    Is this a proven way to do something like this? If not, can you recommend a way that has worked for you? Thanks :)
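    Per-article 301s are the standard approach. Since the new URL adds a category that cannot be derived mechanically from the old slug, one common implementation is an Apache rewrite map built from an old-slug to new-path export of the article table (a sketch; RewriteMap must live in the server or virtual-host config, not in .htaccess, and the file path is an assumption):

      # virtual-host config
      RewriteEngine On
      RewriteMap articlemap txt:/etc/apache2/article-redirects.txt
      RewriteCond ${articlemap:$1|NONE} !^NONE$
      RewriteRule ^/en/html/news/detail/([^/]+)/?$ /${articlemap:$1} [R=301,L]

    with a plain-text map file containing one entry per article:

      # old-slug                new-path
      title-of-the-article      category/title-of-article.html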


  • Lightning-fast forum based around metadata/tags? [closed]

    - by Dan W
    Possible duplicate: What Forum Software should I use?

    I wonder if anything like this exists. I'd like to add a forum to my site, but instead of the usual forum/subforum/sub-subforum structure, I'd like to use a metadata/tag approach where everything lives in a single directory and a search field at the top instantly (<0.5 s) filters the threads to a particular keyword or keywords. Also, as the admin, I would be able to add highly visible buttons at the top for the main categories I choose for the forum (users could still add tags to their own threads beyond the default main tags I supply). Done properly, this approach is more powerful, efficient, maintenance-free, scalable and friendly than a standard forum, so I was hoping someone had the same idea and built something out of it. It couldn't be that hard. I'd want the speed to be up to (or near) the standard of http://forum.dlang.org/. Other forums (e.g. phpBB) are orders of magnitude worse than that in terms of latency (posting or browsing), and I think that is wrong, even in principle ;)


  • How to check that I have recovered from Penguin 2.0?

    - by Simon Walker
    I have a 3-year-old website which was hit by Penguin 2.0 in May. The website's traffic dropped almost 30%. I have been working hard on the site for the last 2.5 months, and its traffic recovered in the last week of August. In fact, I am receiving more traffic than ever. When I look at the stats, I find the site's search-engine visibility has improved: it is now appearing for more search queries, and its impressions have also increased. What worries me is that the site is nowhere in the top 5 pages for the keywords with high competition and the highest search volume. They are few in number but important. Should I consider my current situation a full recovery, or only a partial one? And if it is only partial, how come traffic is higher than it was before Penguin 2.0?


  • The layout page "~/Views/Shared/_Layout.cshtml" could not be found

    - by Rei Brazilva
    I got this error and I can't figure out what is going on. I am positive that _Layout.cshtml resides in the Shared folder, and for the sake of trying things out I moved it to the Home folder; it then told me that Views/Home/_Layout.cshtml couldn't be found there either. So now I'm thinking the problem is in the call to this file for some reason. I'm not going to pretend I know ASP.NET MVC 4, so please, when you answer, explain it as you would to someone who is not familiar with the system at all. Believe it or not, this error came from tutorial #1, ha ha. Here's the code to show that I did code it right:

      @{
          ViewBag.Title = "Home Page";
          Layout = "~/Views/Shared/_Layout.cshtml";
      }

    (A screenshot of the file's location was attached here.) P.S. I did my research: Google has nothing, and there is another question about this here, but it was asked in 2008 about MVC 3, which is completely different. I am running ASP.NET MVC 4 on Azure. Thanks.
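    For context, in a default MVC 4 project the layout is usually assigned once in Views/_ViewStart.cshtml rather than in each view, so that file is worth checking as well; a minimal sketch of its expected contents:

      @* Views/_ViewStart.cshtml - runs before every view is rendered *@
      @{
          Layout = "~/Views/Shared/_Layout.cshtml";
      }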


  • CMS without templates

    - by Mark
    I am looking for a CMS where I can lay out the page from scratch using HTML/PHP/CSS and simply enter code such as:

      FOR EACH (listOfArticles)
          SORT mostRecent
          CATEGORY news
          LIMIT 5
          <div class="articleTitle">{title}</div>
          <div class="articleBody">{body}</div>
      END

    to get a list of the five most recent articles of a certain category in the relevant place. Does such a thing exist anymore? Unless my mind is playing tricks on me, the CMSs of five or ten years ago had this approach; I am thinking of Movable Type and the now-defunct CityDesk. It seems to me that CMSs these days have a "templates first" approach, i.e. you must always choose a template before doing anything, which I find really painful. Learning how to design these structured templates also seems overly painful. So can anyone help me in my quest? Thank you, Mark
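    For comparison, a sketch of how the same loop tends to look in a modern tag-based template language such as Twig (the articles() lookup is illustrative, not a built-in function; each CMS exposes its own equivalent):

      {# five most recent articles in the "news" category #}
      {% for article in articles('news', 5) %}
          <div class="articleTitle">{{ article.title }}</div>
          <div class="articleBody">{{ article.body }}</div>
      {% endfor %}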


  • Do CDNs work with POST operations?

    - by iddqd
    I'm using a CDN (Level 3) for the first time and I'm a bit confused. I'm accessing dynamic URLs such as http://cdn.mysite.com?getItem=1234 that return text data. Do CDNs work with HTTP POST operations? When I issue an HTTP POST, my "real" origin server receives the request every time, so I'm wondering if the CDN has a problem with POST. With HTTP GET it seems to work: I call the URL once (from my application) and see my server receiving the request; if I call it a second time, the CDN delivers it directly and my server doesn't get anything. However, if I open the same link manually from a second browser tab, my server is asked to deliver it again. Shouldn't it be cached by now? Many thanks.
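    One way to investigate from the outside is to compare response headers across repeated requests (header names vary by CDN; X-Cache, Age and Via are common conventions, not guaranteed for Level 3):

      # headers only; repeat the request and compare cache-related fields
      curl -sI 'http://cdn.mysite.com/?getItem=1234' | grep -iE 'cache|age|via'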


  • Single page not appearing in Google Search

    - by Dan
    Description: I have a static franchise website with various subpages, each dedicated to an individual franchisee. The only thing slightly similar across the pages is the title; the titles follow this structure:

      <title>Welcome to THE_COMPANY - PRODUCT_DESCRIPTION Services, THE_LOCATION</title>

    THE_COMPANY and PRODUCT_DESCRIPTION are the same across all franchisees, while THE_LOCATION changes depending on where they are located in the UK. Each franchisee page has the following <meta /> tags:

      <meta name="DC.creator" content="user"/>
      <meta name="DC.format" content="text/html"/>
      <meta name="DC.language" content="en"/>
      <meta name="DC.date.modified" content="2014-01-23T11:22:31+00:00"/>
      <meta name="DC.date.created" content="2014-01-23T11:22:09+00:00"/>
      <meta name="DC.type" content="Page"/>
      <meta name="DC.distribution" content="Global"/>
      <meta name="robots" content="ALL"/>
      <meta name="distribution" content="Global"/>

    The main content on each franchisee page is completely different.

    The problem: one particular franchisee page, located in Area A, will not appear in Google search results at all, while every single other franchisee is number 1 if you search for "THE_COMPANY, THE_LOCATION". If I do the same search on Bing, Yahoo or DuckDuckGo, the Area A franchisee is the first result on all of them. Has Google for some reason blacklisted one page on the site?

    What I have tried:

    - Ensuring the page is referenced in my sitemap.xml file
    - "Fetch as Googlebot" on www.the_company.co.uk/areaa; when that came back OK, I submitted it to the index
    - Resubmitting the sitemap.xml file in Webmaster Tools
    - Linking to the Area A page from another page's content, then waiting about 3 weeks before checking again to give Google time to re-index
    - Making a change to the page content and waiting another 2-3 weeks
    - Removing the page completely and recreating it with an alternative URL

    The closest thing I have found to this issue is this Stack Overflow question, but this particular franchisee has existed for almost a year; it used to appear in Google searches but no longer does. I'm guessing the Panda update wasn't too happy with something on the page, but it hasn't affected anything else on the site, and I am at a loss for things to try. I would greatly appreciate any information or thoughts as to what could have caused this. Thanks.

    Update: in line with Daniel Fukuda's answer below, I have followed some of his steps, but everything seems to check out alright.

    HTTP headers:

      HTTP/1.1 200 OK
      Date: Tue, 25 Feb 2014 16:31:29 GMT
      Server: Zope/(2.12.16, python 2.6.6, linux2) ZServer/1.1
      Content-Length: 40078
      Expires: Sat, 01 Jan 2000 00:00:00 GMT
      Content-Type: text/html;charset=utf-8
      Content-Language: en
      Vary: Accept-Encoding
      Connection: close

    Robots <meta /> tag (I have since updated it to read content="INDEX" instead):

      <meta name="robots" content="ALL"/>

    robots.txt:

      User-agent: *
      Disallow:

      User-Agent: Googlebot
      Disallow: /*sendto_form$
      Disallow: /*folder_factories$

    Using site:THE_COMPANY.co.uk: searching for "AREA A site:THE_COMPANY.co.uk" does not return the page, but searching just for site:THE_COMPANY.co.uk will not necessarily return every indexed page, or so I understand...

    Update: it appears Google likes to drop pages from the index every now and then. Despite my steps above, I left the site alone and the page reappeared in the SERPs by itself.


  • Showing content from pages at different URLs (masking), possibly with .htaccess

    - by zigojacko
    If I have URLs like:

    - domain.com/category/widgets/filter/blue
    - domain.com/category/widgets/filter/red

    and it is pretty difficult to reconstruct them to something like:

    - domain.com/category/blue-widgets
    - domain.com/category/red-widgets

    is there any way at all that I can use URL rewrites, or anything else with .htaccess or on the server, to display the URL as domain.com/category/blue-widgets on the domain.com/category/widgets/filter/blue page? I've looked into masking URLs but got nowhere, and this has been bugging me for almost 6 months now. Is there any way to achieve what I want to do? FYI: this is a Magento website, and I want to implement the above for potentially hundreds of URLs.

    Edit (in response to @kkugelmann's answer): I couldn't get the proposed RewriteRule to make any difference in the .htaccess file, so I started testing a few things in an online .htaccess tester. The proposed RewriteRule didn't work in the tester either, but a variation of it did (screenshots of the tester results were attached here). However, adding any of these RewriteRules to the website's .htaccess file did not rewrite the URL at all...

    Edit 2: by the way, if I add [R=301,L] to the end of the rewrite rule, it does then rewrite the URL, but of course it also 301-redirects it, which is unwanted behaviour.

    Edit 3: I found another question with the same issue, and an accepted answer that solved it, which involved using mod_proxy and the [P] flag on the rule (if I try this, the page 404s).
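    For reference, the usual pattern here is the reverse direction: link to the pretty URL everywhere, then internally rewrite it back to the real Magento path without a redirect flag, so the address bar never changes. A minimal sketch for the examples above (the colour-to-filter convention is assumed from the question; hundreds of URLs would need a generalized pattern or a rewrite map, and the [P] flag mentioned in Edit 3 additionally requires mod_proxy to be enabled):

      # .htaccess: serve /category/widgets/filter/blue when
      # /category/blue-widgets is requested, without redirecting
      RewriteEngine On
      RewriteRule ^category/([a-z]+)-widgets/?$ category/widgets/filter/$1 [L]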


  • How do I fix the Google Webmaster Tools warning "URL not followed"?

    - by user3611500
    A few days after I submitted my sitemap to Google, I received this warning:

      When we tested a sample of URLs from your Sitemap, we found that some URLs redirect to other locations. We recommend that your Sitemap contain URLs that point to the final destination (the redirect target) instead of redirecting to another URL.

    The example URL Google gave me is http://iketqua.net/?_escaped_fragment_=CIDTKT/mien-trung/xo-so-kon-tum. I have checked everything I could think of, but I still can't figure out what the warning is about. My sitemap: http://iketqua.net/sitemap.xml
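    One quick way to see the redirect Google is complaining about is to follow the sample URL and inspect the Location headers; whatever it finally lands on is what belongs in sitemap.xml (a sketch using curl):

      # -I: headers only, -L: follow redirects; each Location header is
      # one hop, and the last response is the final destination
      curl -IL 'http://iketqua.net/?_escaped_fragment_=CIDTKT/mien-trung/xo-so-kon-tum'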


  • How can I pass referrer header from my https domain to http domains?

    - by nutcracker
    My website is 100% HTTPS, and I have links to other HTTP domains. The referrer header is not set when linking from an HTTPS page to an HTTP page. From http://en.wikipedia.org/wiki/HTTP_referrer:

      If a website is accessed from a HTTP Secure (HTTPS) connection and a link points to anywhere except another secure location, then the referer field is not sent.

    I would prefer that other domains can see the referrer, so they know the traffic comes from my domain. Is there a way to force this header, or is there another solution?

    Update: I've done some basic testing using a redirect:

    - http page  -> link to http  -> 301 redirect -> http page = referrer intact
    - https page -> link to https -> 301 redirect -> http page = referrer blank
    - https page -> link to http  -> 301 redirect -> http page = referrer blank
    - https page -> link to http  -> 302 redirect -> http page = referrer blank

    The referrer is lost when linking from an HTTPS page to an HTTP redirect page on my own domain, so there is no referrer after the redirect.
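    One mechanism worth testing is a referrer policy, which asks the browser to keep sending the header on HTTPS-to-HTTP navigation (a sketch; browser support varies, and older browsers ignore it entirely):

      <!-- ask the browser to send the full referrer even to insecure targets -->
      <meta name="referrer" content="unsafe-url">

    A less leaky variant is content="origin", which sends only your domain rather than the full URL.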


  • Can you add DoubleClick macros to existing ads?

    - by picus
    Setup: a few weeks back I made some very simple HTML5 "ads" to run on a few of our partner sites. They weren't paid ads, as we also manage these sites, but there are a few of them, so I made a modular solution that is hosted on one of our web servers and included on each page via JavaScript, which outputs an iframe. Each search (the ad has a search box) or click appends a URL parameter that we track using custom variables in Google Analytics. In essence, the ad is an HTML page served in an iframe via JavaScript.

    Problem: we have an opportunity to run these ads on a third-party site. I sent them a brief how-to for inserting them, and they came back saying:

      The creative code doesn't contain the %u macro. We can't substitute the default click-through URL without it.

    I am somewhat familiar with DoubleClick from a web developer's point of view: I have inserted DoubleClick DART tags before and have implemented the ad tool for publishers. I have not, however, actually created an ad for the DoubleClick network before. I assume the publisher needs these macros to track clicks and hence charge us, but they have not responded to my questions about this. Are macros something I can just add to, or replace, the existing links with, or do I need to completely set up the ad with DoubleClick? The latter would be a big issue in the short term, given we do not have an advertiser's account set up with them. Thanks in advance.


  • How to block FeedReader from fetching my content to their site?

    - by Wei Kai
    My site's content is being duplicated by FeedReader and indexed at Google (a screenshot of the duplicated results was linked here). When I click the FeedReader link, it uses some sort of iframe to draw content from my site live. This amounts to content duplication, and I believe it harms my site. What can I do to prevent FeedReader from fetching my content to their site? I know robots.txt can perform such a function, but I don't know how to write it. Any help would be much appreciated. I also raised this issue with FeedReader two days ago but have yet to get any reply from them.
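    A robots.txt sketch for this (it only helps if FeedReader's fetcher actually honours robots.txt and identifies itself; the user-agent token here is an assumption to verify against your access logs):

      # block FeedReader's fetcher; confirm the exact UA string in your logs
      User-agent: FeedReader
      Disallow: /

    If the fetcher ignores robots.txt, the fallback is blocking its user agent or IP range at the server level instead.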


  • Is it necessary to have your blog title and description in H1 and H2?

    - by Saif Bechan
    I have read an article stating that it is not necessary to have your blog title and description on your website at all: just put the titles of the posts in h1, on both the index and the post page; on the post page, start your different sections with h2; widget headers get h3; and the title and description usually live in the logo image anyway. I have looked at the source of my favorite blog, http://net.tutsplus.com, and I see they do the same. Is this recommended?
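    A sketch of the heading structure the article describes (element placement and attribute values are illustrative):

      <!-- blog title/description carried by the logo image, not a heading -->
      <header><img src="logo.png" alt="Blog title and description"></header>
      <article>
          <h1>Post title</h1>
          <h2>First section of the post</h2>
      </article>
      <aside>
          <h3>Widget header</h3>
      </aside>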


  • Is this DFP error message the reason my ad won't show?

    - by Eric
    I'm setting up DFP to display ads, and I have an ad tag (JavaScript) from adtechus.com. The tag looks like this:

      <script src="http://adserver.adtechus.com/?adrawdata/1.0/1111/11111/0/0/ADTECH;loc=100;noperf=1;"></script>

    When I paste that tag into DFP, I get an error message saying it doesn't recognize the tag format (screenshot of the error attached here), and, more importantly, my ad isn't showing on the page. DFP seems to be accepting my AdTech ad tag regardless and working with it, despite the error message. But could that be the reason my ad isn't showing? And how can I fix it?


  • Image hotlinking providers?

    - by Josh
    I use a lot of images in my WordPress blog, and due to hosting restrictions I need to host the images somewhere else and hotlink them in my posts. So I am looking for a reliable image host that provides a free hotlinking service. Google's Picasa would be best, but I think they do not allow hotlinking. P.S. I'm not looking for hosts like TinyPic or ImageShack; I'm looking for websites that provide powerful features to organize images (e.g. albums etc.).


  • How to get a new site indexed by Alexa

    - by JohnMerlino
    When I search for my site on Alexa, it says "Alexa Traffic Rank: No Data", so I googled the issue and came across this page: http://www.loudable.com/my-website-data-not-showing-in-alexa-get-your-website-crawled-by-alexasolution.html. It says that to get a site indexed, you click "Crawl my site" on the webmasters page. However, there is no longer a link that says "Crawl my site". So, as of now, does anyone know how to get a site indexed by Alexa so that my traffic rank will display?


  • AdWords Keyword Planner CPC completely different from real CPC?

    - by steve
    I'm new to AdWords and trying to figure out the best keywords to use. I went to the AdWords Keyword Planner and typed in an example keyword. It gives me an average CPC of $0.94. But when I go to set up a real campaign and type in the same keyword, I get an error saying "below first page bid estimate", which is $8.75. What gives? Is there a better way to get more accurate feedback on how much this will cost?


  • Domain changes required for SSL integration

    - by user131003
    Currently my site supports regular payment options (the user is taken to the payment gateway's website). Now I'm trying to implement a "seamless" payment-gateway integration, and I need SSL for this. I have a dedicated server with 5 static IPs from HostGator (HG). My options:

    1. Take SSL for www.my_domain.com. According to HG, I need to change the IP of the main site, as the current IP is not really dedicated (it is shared by cPanel etc.), so they would bind another dedicated IP to the main domain for SSL to work. This would require a DNS change for the main website and hence a few hours of downtime (which is OK).

    2. I've noticed that most e-commerce websites use subdomains like secure.my_domain.com for SSL/HTTPS. This sounds like a better approach, but I have a few doubts in this case:

       a) Would I need to re-register with the existing payment gateways (PayPal, Google Checkout, Authorize.net) if I switch to the subdomain? Re-registering is not an option for me.

       b) Would a DNS change be required for www.my_domain.com in this case? This confusion arose because of the following reply from HG: "If the sub domain secure.my_domain.com is added to an existing cPanel it will use the IP for that cPanel so as long as it is a Dedicated IP that will be fine. If secure.my_domain.com gets setup as its own cPanel it will need to be assigned to a Dedicated IP which would have a DNS change involved."

    Please suggest?


  • Apache2 Unintentionally Allowing Proxy Requests

    - by Kevin
    I'm not sure if this is the right place to ask, but this is fairly urgent. I have completely removed all traces of mod_proxy and the other mod_proxy modules, yet the Apache server continues to allow proxy requests. I have restarted numerous times, and have now shut the server down until I can find an answer. I've noticed lots of requests from IPs in and around China for external sites, such as free movie downloads. I'd like to prevent this from happening, and I'll be grateful for any help.
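    For reference, a sketch of the directives normally used to shut off forward proxying when mod_proxy has to stay loaded (Apache 2.2 syntax; the config path is an assumption based on a Debian/Ubuntu layout):

      # first check that nothing re-enables proxying anywhere:
      #   grep -ri "ProxyRequests" /etc/apache2/
      ProxyRequests Off
      <Proxy *>
          Order deny,allow
          Deny from all
      </Proxy>

    Note that access-log lines like "GET http://example.com/ ..." can still appear from clients probing for open proxies even when Apache is not actually proxying; what matters is the response your server returns to those requests.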

