Search Results

Search found 9721 results on 389 pages for 'quicktest pro'.

  • Under a XAMPP install, I modify the Apache httpd-vhosts.conf file and then my Apache server will not start

    - by eLLIOT
    I have read all the articles but still must be doing something wrong. I modified the httpd-vhosts.conf file to access the project I am working on, and I have tried many different configurations, but none work: my Apache service will not start (I am on Windows 7). Any ideas would be helpful. Here is the code I added to the conf file:

        DocumentRoot "C:/xampp/htdocs"
        ServerName localhost

        DocumentRoot C:/xampp/htdocs/socengv1"
        ServerName socengv1.local

    I have also tried this with the NameVirtualHost *:80 directive, with no difference to note.
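
    For reference, the <VirtualHost> wrappers appear to have been stripped from the snippet above, and the second DocumentRoot carries an unbalanced quote; either problem is enough to stop Apache from starting (running httpd -t from the command line reports such syntax errors). A minimal sketch of a working httpd-vhosts.conf for this setup, assuming the Apache 2.2 that XAMPP shipped at the time:

        NameVirtualHost *:80

        <VirtualHost *:80>
            DocumentRoot "C:/xampp/htdocs"
            ServerName localhost
        </VirtualHost>

        <VirtualHost *:80>
            DocumentRoot "C:/xampp/htdocs/socengv1"
            ServerName socengv1.local
            <Directory "C:/xampp/htdocs/socengv1">
                Order allow,deny
                Allow from all
            </Directory>
        </VirtualHost>

    The name socengv1.local also has to resolve locally, so it needs a 127.0.0.1 entry in C:\Windows\System32\drivers\etc\hosts.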

  • What Web Technology to use?

    - by Sven
    Hey guys, I would like to start a project and I am wondering what kind of programming language/web framework to use. There is not that much logic involved. It's a community page with a lot of users (not that many at the beginning, but I would like to be ready to welcome a lot) who should be able to communicate through private messages and a forum, and there will be a lot of content (news, articles) to consume. I also want to provide several authorization settings so that some content is visible only to specific people. In effect it's a content management system, but I want to design the functionality myself. And I want to use some external APIs. The only website I can think of with almost similar functionality is pokerstrategy.com. I looked up their job offers and it seems like they use Java and PHP. Maybe you guys can give me your thoughts. What would you use to meet these requirements, and how would you approach it? Thank you

  • Good URL practice for categories and subcategories?

    - by Seting
    I have developed an application and I need to make its URLs SEO-friendly. I have the following URL structure: http://localhost:3000/posts/product/testing-with-slug-url-2 and http://localhost:3000/posts/product/testing-with-slug-url-2-4-23. Is this good practice? If not, how can I rewrite it? OK, I'll explain my application. It is a shopping application. If a customer searches for mobiles, he is redirected to a URL like http://mydomain.com/cat/mobile-3. The 3 in the URL is my database id, used for further searching. After the user reaches the mobile page he may want to filter by brand, e.g. Nokia, so the URL looks like http://mydomain.com/subcat/nokia-3-2. The integers at the end refer to the category id (3) and the brand id (2). My doubt is whether the integers at the end of the URL will affect SEO ranking.
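
    For context, a sketch of why such trailing ids are common: the lookup keys off the id alone, so the slug text can change freely without breaking the route (PHP is used here purely for illustration; the pattern and ids mirror the question):

        <?php
        // Split a slug such as "nokia-3-2" into its text part and trailing ids.
        $slug = 'nokia-3-2';
        if (preg_match('/^(.*)-(\d+)-(\d+)$/', $slug, $m)) {
            $name       = $m[1];        // "nokia" (display only, not used for lookup)
            $categoryId = (int) $m[2];  // 3 -> category lookup in the database
            $brandId    = (int) $m[3];  // 2 -> brand filter
        }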

  • Cannot access AdSense funds after switching to third-party partnership program

    - by Clay
    I had a Google AdSense partnership with $80 in it, but then switched to a different partnership and now can't get my money. When I first started YouTube, I joined the AdSense partnership program. After gaining $80 in my AdSense account, I got an offer to join a third-party partnership program called Zoomin.tv. I accepted, and it is paying me monthly now. The problem is that my AdSense account still has the $80 in it, and is not gaining more cash. The Zoomin.tv money is going directly to my PayPal. The payment threshold in AdSense is $100, and you can't make it lower. Therefore, my money is stuck in AdSense and I'd love a solution that allows me to access it.

  • Strange characters appearing on websites - ASCII? Unicode?

    - by Mick
    I have created many very simple pure-HTML websites over the years. Most of them appear to work fine most of the time, but there is one recurring problem which I have never quite sorted out, involving strange characters. The scenario goes like this: I create the site. I look at it in my browser and everything appears fine. I may look at it a great many times over the coming weeks or months as I make additions here and there, perhaps on a variety of browsers on a variety of PCs. Then one day I look at the page and see a random sprinkling of white question marks against dark diamond shapes. These might appear where I had expected to see hyphens or quotes or apostrophes. My immediate thought is that my browser got into some strange state because I was looking at some foreign website with strange characters, but I'm never quite sure. I'm left with that nagging feeling that perhaps half the planet is seeing my website with funny question marks all over it.

    So my question is: what's going on? What should I do to ensure that as many people as possible around the world can view my text as I originally intended? Should I be using HTML entities like &pound; for all non-alphanumeric characters? Should I worry at all?

    Edit: Right now I have the problem occurring on this page: http://www.fullreservebanking.com/papers.htm. Part of it looks like this: [screenshot of the garbled characters]. I am using Firefox 5 and the character encoding currently appears to be "Unicode (UTF-8)". I do not remember manually setting the character encoding to anything since installation. I do occasionally look at Japanese websites for work-related reasons, though when I do so I do not manually change any Firefox settings.

    Edit: Now fixed. Web page altered accordingly.
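
    For reference, the usual fix is to declare the encoding explicitly so browsers stop guessing; a minimal sketch, showing both the HTML5 form and the older equivalent for illustration:

        <!-- HTML5 -->
        <meta charset="utf-8">

        <!-- HTML 4.01 equivalent -->
        <meta http-equiv="Content-Type" content="text/html; charset=utf-8">

    The file itself must actually be saved in the declared encoding: the white question mark on a dark diamond is U+FFFD, the replacement character a browser shows when bytes (often Windows-1252 curly quotes or dashes) are invalid in the encoding it used to decode the page.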

  • SEO for country- and language-specific content

    - by kecebongsoft
    Currently I am creating a website which has a common topic for an article, but the content will be different for each country, and each country's content will be provided in several languages. This mechanism applies to most parts of the website. For example, I have an article about tax. This article has to be different for each country, for example China, and the tax content for China should be written in both Chinese AND English (for non-Chinese speakers). What is the best URL pattern to handle this? What I've been thinking of is using subfolders (/country-code/language-code/), such as:

        www.example.com/cn/cn/tax
        www.example.com/cn/en/tax

    Or using country-code top-level domains, such as:

        www.example.cn/cn/tax
        www.example.cn/en/tax

    Or subdomains, such as:

        cn.example.com/cn/tax
        cn.example.com/en/tax

    I think I would not prefer the last option, since I might need subdomains for other purposes, which leaves only the subfolder and country-TLD options. I've read articles saying that a country TLD is good for localized (language-specific) content, but in my case the country domain would also have English content that is specific to that particular country (partly so that people from other countries can easily find it through Google). What is the best pattern to pick, and why?
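
    Whichever pattern is chosen, the country/language variants can be tied together with hreflang annotations in each page's head, so search engines serve the right version; a minimal sketch for the subfolder option (note that zh is the language code for Chinese, while cn is the country code):

        <link rel="alternate" hreflang="zh-CN" href="http://www.example.com/cn/cn/tax" />
        <link rel="alternate" hreflang="en"    href="http://www.example.com/cn/en/tax" />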

  • Advice On Price Comparison Affiliate Programs

    - by pixelcook
    I want a price comparison feature on my site similar to Consumer Reports' "Price & Shop" section. They use PriceGrabber.com, but as far as I can tell they have a special deal with CR, so I can't get a similar service for my site. I've gathered that I need to use an affiliate network, but the whole thing seems so shady, I don't really know what sites are legit, and I don't know what sites offer the price comparison feature. Datafeedfile.com comes up a lot during my searches, but the ugly site makes me wary. Does anyone have any experience with this? What affiliate networks do you recommend? Or should I be looking at something else altogether?

  • "Progressive" JPEG: Why do many web sites avoid rendering JPEGs that way? Pros, cons?

    - by Chris W. Rea
    When JPEG images are used by a web page, they are typically rendered top-down ... but they can also be rendered using a mode called progressive JPEG, where the image starts out full-size, but blurry, and then gets sharper with successive passes, until it's fully loaded. Progressive loading requires the image have been saved that way. Why don't more web sites use progressive JPEG? What are the drawbacks? Is it simply a lack of tool support, or are these files somehow inferior to traditional top-down rendered JPEG images?
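
    For what it's worth, producing a progressive JPEG is a one-flag choice in most tools; a minimal sketch using PHP's GD extension (assuming GD is available, with an arbitrary quality setting):

        <?php
        // Load an existing baseline JPEG.
        $img = imagecreatefromjpeg('photo.jpg');

        // Enable interlacing; for JPEG output this yields a progressive file.
        imageinterlace($img, 1);

        // Re-save. Quality 85 is an arbitrary choice for this sketch.
        imagejpeg($img, 'photo-progressive.jpg', 85);
        imagedestroy($img);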

  • What's the best practice for linking to/from guest posts on other blogs?

    - by sam
    When writing a guest blog for a site, I include a link back to my site - an inbound link. If I were to write a post on my blog publicising my guest post, that would mean there were reciprocal links, thus cancelling each other out, correct? If I made the link (on my blog) back to the guest post nofollow, would that cancel the effect, meaning I still get link juice from the guest post? Further down the line if the site I posted on as a guest wanted to write a post for my site, what is the best way for me to prevent re-introducing the reciprocal link problem?

  • Advertising servers / advert delivery solutions for C#/ASP.NET

    - by Karl Cassar
    We have a website in which we want to show adverts. However, these are custom adverts uploaded by the webmaster, not Google adverts or adverts that a network chooses; ideally there would be both options. We were considering developing our own advert-management system, but looking at the big picture it might be better to consider other alternatives. The website is currently developed in C# / ASP.NET (Web Forms). Are there any recommendations for open-source delivery networks and/or externally hosted advert delivery networks? Personally I've used Google's DFP; however, sometimes it is not so easy to get a Google AdSense account approved, especially while developing a new website that has not yet launched. Not sure if this is the best place to ask this kind of question!

  • .htaccess URL rewriting: friendly URL with 2 parameters, the second parameter optional

    - by letsworktogether
    I'm kind of stuck at this part and was hoping I'd get some assistance. I'm building a highscores page in PHP; that's going great, it works. However, I dislike the idea of index.php?skill=name and therefore wanted a bit of SEO in this. I have successfully replaced the URL with a more friendly version: highscores/skill/name. And this is where the problem starts: I have added pagination to the highscores, and the page is read from the HTTP GET page variable ($_GET['page']). I dislike the idea of highscores/skill/name&page=2 and was hoping you guys could assist me in making the URL look like the following:

        Page 1, accessing the file without declaring a page number: DOMAIN.TLD/highscores/skill/name
        Page 2, where the page variable is now needed: DOMAIN.TLD/highscores/skill/name/2

    As you can tell, the "2" identifies page 2 and loads the correct data for it. However, I'm having much trouble configuring my .htaccess file this way. This is my latest attempt:

        RewriteRule ^highscores\/skill\/(.*?)(\/(.*?)*)$ highscores/skills.php?skill=$1&page=$2 [L] # Skills page

    Unfortunately it does not work: it makes the page look horrible (the CSS doesn't load) and it doesn't go to the page specified in the URL.
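
    A sketch of one way to make the page number optional is two rules, with the more specific form matched first (QSA and the exact rule bodies are one reasonable choice, not the only one):

        RewriteEngine On

        # /highscores/skill/name/2 -> explicit page number
        RewriteRule ^highscores/skill/([^/]+)/([0-9]+)/?$ highscores/skills.php?skill=$1&page=$2 [L,QSA]

        # /highscores/skill/name -> page 1 implied
        RewriteRule ^highscores/skill/([^/]+)/?$ highscores/skills.php?skill=$1 [L,QSA]

    The broken CSS is usually a separate issue: the browser resolves relative stylesheet paths against the pretty URL, not the real script path, so either reference assets root-relatively (href="/css/style.css") or add <base href="/"> in the page head.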

  • DNS nameserver, A record and CNAME records

    - by David
    Hi - I am inexperienced in the configuration of DNS and have an issue with a domain hosting setup. I have two domains, 'www.mydomain1.com' and 'www.mydomain2.com', with mydomain2 pointed at the same place as mydomain1. The domains were passed to me recently by the person who previously controlled them. I have an account with Fasthosts in the UK. When I accepted the domains I could not access the DNS settings and enquired with Fasthosts as to why. They replied saying: 'The delegate hosting option for both domains was enabled and this is the reason why you were unable to find the option to edit the advanced DNS records. I have now disabled the delegate hosting option so you can now edit the advanced DNS records for both domains in your account.' When I log into the Fasthosts control panel now I can access the DNS controls, but both domains have no A record or CNAME record set up. I am concerned that Fasthosts have blatted the previous nameserver entries and set me up on theirs without adding any records. 'www.mydomain1.com' currently still works, but 'www.mydomain2.com' no longer finds the site. I am worried I will lose mydomain1 too as the DNS changes filter through the system. My web hosting is at 'xxx.xxx.xxx.xxx/mydomain1.com/' and this is where I want both domains to point. Any advice would be much appreciated. One thing which is confusing me is that, because I am on a shared server, I have to use 'xxx.xxx.xxx.xxx/mydomain1.com/' to get to my site rather than just 'xxx.xxx.xxx.xxx'. The form on Fasthosts for the A record only allows an IP to be entered - does it add the mydomain1.com/ onto the end itself? Thanks for any help given - I'm quite worried about this. David
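
    For reference, DNS can only map a name to an address, never to a path: the trailing '/mydomain1.com/' part is added by the shared server's virtual-host configuration once a request arrives, so the A record form is right to accept just the IP. A minimal sketch of the zone entries being described (the placeholder IP mirrors the question):

        mydomain1.com.       IN  A      xxx.xxx.xxx.xxx
        www.mydomain1.com.   IN  CNAME  mydomain1.com.
        mydomain2.com.       IN  A      xxx.xxx.xxx.xxx
        www.mydomain2.com.   IN  CNAME  mydomain2.com.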

  • How to get search engines to properly index an AJAX-driven search page

    - by Redtopia
    I have an AJAX-driven search page that allows users to search through a large collection of records. Each search result points to index.php?id=xyz (where xyz is the id of the record). The initial view does not have any records listed, and there is no interface that allows you to browse all records; you can only conduct a search. How do I build the page so that spiders can crawl each record? Or is there another way (outside of this specific search page) to point spiders to a list of all records? FYI, the collection is rather large, so dumping links to every record in a single request is not a workable solution; outputting the records must be done in multiple requests. Each record can be viewed via a single page (e.g. "record.php?id=xyz"). I would like all the records indexed, without anything being indexed from the sitemap page that shows where the records exist, for example:

        <a href="/result.php?id=record1">Record 1</a>
        <a href="/result.php?id=record2">Record 2</a>
        <a href="/result.php?id=record3">Record 3</a>
        <a href="/seo.php?page=2">next</a>

    Assuming this is the correct approach, I have these questions: How would the search engines find the crawl page? Is it possible to prevent the search engines from indexing the words "Record 1", etc. and "next"? Can I output only the links? Or maybe something like:
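
    One conventional answer to the first question, how the search engines would find the records, is an XML sitemap submitted via robots.txt or Webmaster Tools, which sidesteps the crawl page entirely; a minimal sketch using the record.php pattern from the question:

        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <url><loc>http://www.example.com/record.php?id=record1</loc></url>
          <url><loc>http://www.example.com/record.php?id=record2</loc></url>
          <!-- one <url> per record; a single sitemap file may hold up to
               50,000 URLs, and a sitemap index can chain multiple files -->
        </urlset>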

  • SEO - Index images (lazyload)

    - by Guilherme Nascimento
    Note: my question is not about JavaScript. I'm developing a plugin for jQuery/MooTools/Prototype that works with the DOM. This plugin will improve page performance (better user experience) and will be distributed to other developers so that they can use it in their projects. How lazyload works: images are only loaded when you scroll down the page (it will look like this: http://www.appelsiini.net/projects/lazyload/enabled_timeout.html). But it does not need HTML5; I refer to this attribute: data-src="image.jpg". Two good examples of websites using lazyload are youtube.com (suggested videos) and facebook.com (photo galleries). I believe the best alternative would be to use:

        <a href="image.jpg">Content for ALT=""</a>

    and convert it using JavaScript into this:

        <img alt="Content for ALT=\"\"" src="image.jpg">

    Then you ask me: why do you want to do that anyway? I'll tell you: because HTML5 is not supported by every browser (especially mobile), and the data-src="image.jpg" attribute does not work at all with indexers. I need the HTML to be fully accessible to search engines; otherwise the plugin will not be something good for other developers. I thought about doing this to help indexing:

        <noscript><img src="teste.jpg"></noscript>

    But noscript has a negative effect on indexing (I refer to the contents of noscript). I want a plugin that will not obstruct image indexing in search engines. This plugin will be used by other developers (and me too). This is my question: how can I make images accessible to search engines in HTML while minimizing requests?

  • Can AJAX in a CMS slow down your server?

    - by Saif Bechan
    I am currently developing some plugins for WordPress, and I was wondering which route to take. Let's take an example: you want to display the last 3 tweets on your page.

    Option 1: You do things the normal way inside WordPress. Someone visits the website; while generating the page, you fetch the tweets in PHP via the Twitter API and just display them where you want. The small problem with this is that you have to wait for the response from Twitter. This takes a few ms - no real problem, and this question is mostly out of curiosity.

    Option 2: Here you don't do anything in WordPress on the initial load, but you do have the API code inside. You generate the page, and as soon as the page is done on the client side, you make a small AJAX call back to the server via a WordPress plugin to fetch your latest tweets, i.e. asynchronously. The problem with this, IMO, is that you put much more stress on your server: for starters you have two HTTP requests instead of one, and secondly the WordPress core has to load twice instead of once.

    Other options: I know there are a lot of other options: 1) getting the tweets directly via JavaScript, with no stress on the server at all; 2) caching the tweets so they are fetched from the DB instead of hitting the API every time; 3) getting the tweets from an AJAX call that is not a WordPress plugin; 4) many more.

    My question: if you only compare 1 and 2, which would be the better choice?
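
    As an aside on the caching variant, a minimal sketch using the WordPress transients API (get_transient/set_transient are the real WordPress functions; the fetch helper is hypothetical):

        <?php
        // Return the latest tweets, hitting the Twitter API at most
        // once every five minutes regardless of page traffic.
        function get_cached_tweets() {
            $tweets = get_transient('latest_tweets');
            if (false === $tweets) {
                $tweets = my_fetch_tweets_from_api(); // hypothetical API helper
                set_transient('latest_tweets', $tweets, 300); // cache for 300 s
            }
            return $tweets;
        }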

  • DNS for domain shows old website for www version

    - by user3745746
    I bought 2 domains from GoDaddy, but with both I am seeing the same problem: the www version of the domain goes to the old site, which is still being hosted. I have checked the IntoDNS website, and the www record shows:

        Your www.example.com A record is:
        www.example.com -> example.typepad.com -> cname-cloudflare.typepad.com ->

    What can I do to stop this from happening? Will it eventually be removed automatically and fix itself? Obviously it hasn't automatically fixed itself during the long drawn-out expiry process... it's been quite a while for one of the domains and the www still hasn't propagated. I'm not having any problems with the plain example.com part of the site.

  • Getting a double slash when redirecting for a canonical hostname on Firefox only

    - by Brian Neal
    I have a Django-powered website, and I'm trying to solve the "canonical hostname" problem: I want www.example.com to redirect to example.com. I have tried both techniques found in the Apache documentation here (scroll down to Canonical hostnames). I'm currently trying the mod_rewrite method, and I have this in a virtual host container:

        RewriteEngine on
        RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
        RewriteRule ^/?(.*)$ http://example.com/$1 [L,R=301,NE]

    This works for me, except for one case. In Firefox only, if I type www.example.com in the browser, it redirects and I see example.com// in the URL bar (note the 2 trailing slashes). However, something like www.example.com/news/ gets redirected correctly to example.com/news/. I only see this on the root URL in Firefox. It seems to work fine on Windows under Chrome, IE9, and Opera (maybe those browsers eat the double slash?). My Mac-using friend says it is fine in Safari, but he also sees the problem in Firefox. As far as Django settings go, I am using the default value of APPEND_SLASH=True. I don't know if Django has anything to do with it, but I've tried mod_rewrite rules like the above on static HTML sites before and it always seems to work.
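
    A common variant that sidesteps slash-counting entirely is to rebuild the target from REQUEST_URI, which always carries exactly one leading slash; a sketch of that form (and since 301s are cached aggressively, retesting after clearing the browser cache avoids chasing the result of an earlier rule):

        RewriteEngine on
        RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
        RewriteRule ^ http://example.com%{REQUEST_URI} [L,R=301,NE]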

  • Chmod 777 to a folder and all contents on Apache web server

    - by Ryan Murphy
    I have just got new hosting for my website. I have a directory /www in which I put all my website files, and a folder in that directory called 'store'. Within 'store' are several files and folders; I want to give the folder 'store' and all the files and folders within it full permissions. How do I do this? I am guessing via .htaccess. I have tried inserting

        chmod 777 -R /store

    into the .htaccess file, but it didn't work - it threw a big on-screen error at me. I want to make all the files and folders within /store writable.
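
    For reference, chmod is a shell command, not an Apache directive, so it can never go in .htaccess - that is what triggers the server error. With SSH access the one-liner would be chmod -R 777 ~/www/store; without shell access, a rough PHP sketch of the same recursion (run once from the hosting account, path assumed relative to the script):

        <?php
        // Recursively apply 0777 to the store folder and everything in it.
        $iter = new RecursiveIteratorIterator(
            new RecursiveDirectoryIterator('store', FilesystemIterator::SKIP_DOTS),
            RecursiveIteratorIterator::SELF_FIRST
        );
        chmod('store', 0777);
        foreach ($iter as $path => $info) {
            chmod($path, 0777);
        }

    Worth noting that 777 is rarely what a shared host actually wants; many run PHP as the account user, where 755 on directories and 644 on files is enough (and safer).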

  • Send and receive SMS from my website

    - by Weng Fai Wong
    The way I plan to use it: have people send an SMS to a number to vote. On the backend (assuming the data comes back to my web server), I will display the voting results on my website. After, say, 10 minutes, I would like to press a button on my website so that ONE of the people who sent an SMS earlier receives a reply saying they are a winner. I'm an ASP.NET developer, so I just need an API to code against. One such company I saw that does this (but is limited to the US) is http://www.twilio.com/sms/. Do you know any international providers similar to Twilio SMS? I'm based in Sydney, Australia. I've looked through the discussion here but could not find any international provider that does what Twilio SMS does: How to add SMS text messaging functionality to my website? Thank you.

  • Robots.txt practices with .htaccess redirections (inheritance)

    - by Jayhal
    I have a question regarding how to write robots.txt files for many domains and subdomains with redirects in place. We have a hosting account with primary and add-on domains. All of our domains and subdomains, including the primary domain, are redirected via .htaccess 301s to their own subdirectories in the primary domain's root directory. I'm confused about how to write the robots.txt for certain directories.

    First, I want to confirm that I understand correctly: for domains and subdomains, crawlers will look in the directory that acts as that URL's root for the crawling rules (robots.txt). Also, a directory is not affected by a robots.txt in its parent directory if the directory has its own domain/subdomain and that URL is the one being accessed by crawlers. (I'm fairly sure, but I wanted to confirm I don't have a fundamentally flawed understanding of robots.txt.)

    In the original root directory on the account (where the primary domain pointed before the .htaccess rules were put in place), what should the robots.txt contain? When crawlers come to crawl our primary domain, will they look in the original root directory for the robots.txt, or will they use the file in the subdirectory where the primary domain's site files are now located? If the former, what should the root's robots.txt include, if anything at all? Would I be right to include a simple 'Disallow: /' for all agents, and then put more specific robots.txt files in each subdirectory with more specific instructions? Would that affect the crawling of the directory where the primary domain is now redirected?

    Any help is greatly appreciated. Thanks!
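
    On the first point, robots.txt is only ever fetched from the root of the host being crawled (http://host/robots.txt), so each domain and subdomain serves whichever file its document root reaches; per-directory robots.txt files deeper in a tree are ignored. A sketch of the blanket-deny file being considered for the bare root, assuming nothing should be crawled there directly:

        User-agent: *
        Disallow: /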

  • What constitutes a "substantial, good-faith effort to remove the links"?

    - by Luke McCallum
    We engaged the services of a 3rd-party SEO consultant to assist us in managing our meta data and to write regular blogs on our site http://cyberdesignworks.com.au. Without our authorisation, the SEO also ran a link-building campaign which has seen us Penguin-slapped, and we no longer appear in Google for a number of our core keywords. Since notification by Google that we have "unnatural links" back in March, we have undertaken a significant campaign to rid ourselves of these dodgy backlinks by a number of methods. I have just received feedback on my 4th or 5th resubmission, which still advises that we need to make a "substantial, good-faith effort to remove the links" before Google will reconsider us for inclusion. After the effort I have gone through to get links removed, I am now at a loss as to what else I can do to demonstrate it. Below is a summary of the actions we have taken to date.

    According to http://removem.com we had about 5584 back-linking domains. Of those, we successfully contacted and had links removed from 344 domains. We ignored links from 625 domains, as they were either legitimate press releases, natural backlinks, or client websites containing an attribution link in the footer that points back to us. Due to our efforts, or the sites simply becoming defunct, removem.com reports that links from 3262 domains have been removed. We have contacted but are yet to receive feedback from 1666 domains, so we can assume those backlinks remain.

    We have configured an automatic 301 redirect for each of the links from these 1666 domains to point to http://redirects.sanscode.com/, which we are calling our Bad Link Catcher (a stroke of genius, I thought), e.g.:

        http://www.mysimplewebdesign.com/create-a-perfect-webpage-with-four-important-tips-from-sydney-web-development-service-companies.php

    As we are a web design agency, we have a large number of client websites which contain an attribution link in their footer pointing back to us. We have gone through the vast majority of these and updated the links to replace the anchor text with an image and a rel="nofollow" link, i.e.:

        <a rel="nofollow" target="_blank" href="http://www.cyberdesignworks.com.au/"><img src="https://sessions.sanscode.com/site/assets/media/badges/Badge_CDW_SANSCODE.png"></a>

    (see http://www.milkatwork.com.au/). An export from http://removem.com detailing the number of times we contacted each link, and whether it is still found or not, was also supplied with each resubmission. The total backlinks reported in Google Webmaster Tools has dropped from over 100K to 87K, and I expect it to drop significantly lower once Google re-crawls each back-linking page.

    Based on all of the above, I am not sure what else I can do to demonstrate a "substantial, good-faith effort to remove the links". I would sincerely appreciate any feedback or suggestions you may have, as I am out of ideas.

  • Shopping Cart URL Structure

    - by Drew
    With regard to URL structure for guests versus authenticated users: am I able to track traffic associated with both paths, while at the same time tracking total conversions through the shopping cart? I have set up the following URL structure:

        Authenticated users: /cart -> /checkout -> /checkout-confirmation-ty
        Guests: /cart -> /checkout-guest -> /checkout-confirmation-guest-ty

    Can I track the authenticated and guest users separately? Is this possible with Google Analytics?

  • Replace %26 with %2526 in .htaccess

    - by Patrick
    I would like .htaccess to rewrite example.com/something_%26_else into example.com/something_%2526_else. I'm importing a bunch of pages from MediaWiki that have ampersands in the title; these are encoded as %26. Drupal, for various reasons, has decided to double-encode the URL so it becomes %2526. I simply can't create the alias within Drupal, so I have to use .htaccess. This is my rule so far:

        RewriteRule ^w/([^%26]+)\%26(.*)$ w/$1\%2526$2 [R=301]

    I asked this question three months ago on Stack Exchange and was not able to get it working. I tried hiring a contractor for this but was unable to find one, so this is my last-ditch effort before I completely give up. I really appreciate the help. All the best, Patrick

  • How to protect Google Ads from the Yontoo Layers runtime?

    - by Dharmavir
    For some time I have observed that Google ads on any website, including my blog (http://blogs.digitss.com), get replaced with something similar to the uploaded image below. I am sure it's happening to many people, and it could reduce Google AdSense income. After some research I found that it is caused by the "Yontoo Layers runtime" from http://www.yontoo.com/ (tagline: "Platform that allows you to control the websites you visit everyday"), but actually they are taking over. I am not sure which software bundles it onto users' computers, but it seems very bad for the freedom of the Internet and for the advertising/marketing industry. I don't remember ever saying "yes" to installing Yontoo on my computer, yet this piece of software successfully installed itself on my laptop, desktop and office workstation. I am going to disable it now, but the question is: how do I make my websites aware of the Yontoo runtime and stop it from replacing Google ads? So far it is not able to replace all AdSense ads - it successfully replaces only the first AdSense instance - but I am sure in future it will hit more. There could be 2 approaches: 1) fool the Yontoo runtime by putting some misleading divs in the HTML document to save the actual ads; 2) completely disable Yontoo with some client-side script (JavaScript) that can fail/crash the Yontoo runtime and so defeat its purpose of replacing ads. You can visit my blog (http://blogs.digitss.com) and look at the top-right corner: if you find the Google ad replaced with something similar to the image attached to this question, it means your computer/browser is infected too. Looking forward to replies from webmasters, in case someone has already written some code/plugin to make websites (and Google ads) safe from Yontoo or similar runtimes. FYI: it was able to push this runtime into all browsers installed on the machine, so it is a dangerous threat. And yes, I am just using Google ads - not sure whether the Yontoo runtime plays the same trick on other ad networks.
