Search Results

Search found 96383 results on 3856 pages for 'code pro'.


  • Developing A Shopping Cart

    - by Eddy Freeman
    I posted a question earlier about creating a shopping cart from scratch (it has since been closed), but I think I must reframe it, because I left something important out of the question. I know shopping carts like Magento Community Edition (a very big piece of software), OpenCart, Prestashop, etc. are open source and are probably too big for one programmer to develop. What about hosted shopping carts like Shopify, BigCommerce, 3dCart, etc.? Are these carts too big for one programmer to develop within 0-2 years? Are there big differences between the open-source carts and the hosted carts? Thanks for your answers.

  • Why Is Another Domain Resolving To My IP Address?

    - by Andrew
    I'm not really sure if this is something I should worry about. I'm currently renting a dedicated server which hosts a website I've created. The domain of the website was registered with GoDaddy. After submitting a sitemap to Google several months ago, I noticed that another domain name resolves to my IP address. This means that every page on my website is also accessible from another domain. As far as I can tell, the other domain name is meaningless to me, so I'm not sure whether this is something to worry about. Is this a residual DNS record from another site that is probably no longer in use? Is it important from the standpoint of either security or SEO? My website is a .com which will later serve e-commerce purposes. The other domain has the top-level domain .st; it's the first one of those I've encountered. Many thanks in advance!
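    If the stray traffic is unwanted, one common remedy is to make the first-listed virtual host a catch-all that refuses requests for unknown Host headers, so the real site is only served under its own name. A minimal sketch (domain names and paths are placeholders):

        # Catch-all: Apache serves the first-listed vhost for unknown Host headers
        <VirtualHost *:80>
            ServerName default.invalid
            DocumentRoot /var/www/empty
            <Directory /var/www/empty>
                Order allow,deny
                Deny from all
            </Directory>
        </VirtualHost>

        # The real site, served only for its own names
        <VirtualHost *:80>
            ServerName www.example.com
            ServerAlias example.com
            DocumentRoot /var/www/example
        </VirtualHost>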

  • Bad Bot blocking Revisited

    - by Tom
    I've read a lot about bad-bot blocking: PHP scripts, .htaccess techniques, etc. Is the following a valid method? Since .htaccess can rewrite and send a bad bot a 403 Forbidden, or forward it to something like Spam Poison, is it possible to Disallow a folder in robots.txt and then, through a .htaccess file in that specific folder, redirect to Spam Poison? Since Apache reads each .htaccess independently and follows its specific instructions, a bad bot that ignores robots.txt would simply be redirected, as would anyone else trying to access /badbot/ (or whatever I choose to call my trap folder). Thanks, Tom
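    For reference, a minimal sketch of the trap described above (the spam-sink target URL is a placeholder, not a real endpoint):

        # robots.txt - well-behaved bots will never enter the trap
        User-agent: *
        Disallow: /badbot/

        # /badbot/.htaccess - anything that does reach the folder gets bounced
        RedirectMatch 302 ^/badbot/ http://spam-sink.example.com/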

  • SSL and green address bar

    - by tinab
    I am new to SSL, so can someone explain why my address bar turns green on certain sites beginning with https://, while on others it doesn't, even though I know the site has SSL? Maybe these two things are not even related. If I go to GoDaddy and order a new domain, I notice their address bar is green the entire time I'm using the https:// protocol; but when I go to Victoria's Secret to place an order, the address bar doesn't turn green even though the URL says https://.

  • Tor and Google Analytics - how to track?

    - by Jeremy French
    I make a lot of use of Google Analytics. Google has reasonable tracking for the location of users, so I can tell where users come from. I know it is not 100% accurate, but it gives an idea. In the wake of PRISM, it is possible that more people will make use of networks such as Tor for anonymous browsing. I have no problem with this; people can wear tin-foil hats while browsing my site for all I care. But it will lead to more erroneous stats. Is there any way to flag traffic as coming from Tor, so I can filter it out of location reports and get an idea of the percentage of traffic which uses it? Has anyone actually tried this?
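    Not from the original question, but one conceivable approach, assuming you keep a locally cached copy of the Tor Project's published exit-node list (one IP address per line; the file path here is made up):

        <?php
        // Flag the request if the client address is a known Tor exit node.
        $exitNodes = file('/var/cache/tor-exit-ips.txt',
                          FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
        $isTor = in_array($_SERVER['REMOTE_ADDR'], $exitNodes, true);
        if ($isTor) {
            // e.g. emit a ga.js custom variable so the traffic can be
            // segmented or excluded in reports:
            // _gaq.push(['_setCustomVar', 1, 'network', 'tor', 3]);
        }

    The Tor share of traffic can then be read off the custom-variable report, and location reports filtered against it.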

  • At what visitor share do you stop supporting a given browser?

    - by adam
    I'm the lead dev for a large website which has a higher-than-average percentage of IE6 users: about 4.4% of our audience. Our new version is going to make use of progressive enhancement, including transitions and effects as well as rounded corners, gradients, web fonts and other CSS techniques. Obviously there are cross-browser ways to achieve most of these things, which require varying amounts of work to implement. What I'm currently looking into, and what I'd like your experiences of, is how to decide at what point to draw the line between providing an enhanced experience and just supporting the functionality. FYI, I believe that this question meets the six guidelines for great subjective questions as defined in the FAQ. I'm after answers detailing why and how: not too short, with constructive comments, experiences, facts and references. Thanks! Adam

  • How often does DreamHost change IP addresses?

    - by pjreddie
    So I just migrated our site to DreamHost because they are free for non-profits. However, right after I switched the nameservers over to them, they changed the IP address of the site: first they propagated our IP address as x.x.x.180, then they switched it to x.x.x.178 and had to propagate that out too. The point being, it meant a lot of downtime, since a lot of big DNS servers (like Google's) thought the address was still x.x.x.180 for up to 5 hours after the switch. This is compounded by the fact that most of our visitors live here in Unalaska, and our local DNS servers take a LONG time to update (a day or more), since we get all our internet over satellite. So every time DreamHost changes our IP address, it can mean a day of downtime for our community. My question is: how often do these changes take place? I asked DreamHost support and they gave me a vague response: "I wish I could say, however those changes happen at random times. They're not that frequent, maybe even months between updates, but there's no way to know for sure." First, I find it hard to believe that they don't know their own system well enough to give me at least some estimate or average. Second, is it worth looking at other providers so that I can get a static IP address? We were hosting the site here originally and hadn't run into this problem, since we have a static IP here. We don't get a ton of traffic, usually around 500 hits a day or so, sometimes more when our stories are featured on statewide or national news broadcasts. So hours of downtime every time DreamHost "randomly" decides to move our server location can be bad for our readership.

  • Why is a # sign added to the end of URLs?

    - by Niro
    Note: I'm asking this from the perspective of the site's developers (I'm trying to help someone there), not as a user, so please don't forward this to superuser.com; it's a server-admin question. Have a look at http://www.wanimo.com/fr/chiens/coussin-matelas-tapis-pour-chien-sc28/tapis-plat-urban-chic-sf7263/ and you'll see that the page gets redirected to the same page with # appended at the end. Worse, when you click Back you get a garbage URL. I'm trying to debug what is causing the redirect. Any advice on how to find it?
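    One hedged debugging idea, not from the original post: since the fragment is appended on the client side, a console snippet like the one below, pasted in before reproducing the problem, logs a stack trace pointing at whichever script touches the fragment. A stray <a href="#"> with a JavaScript click handler is a common culprit.

        // log a stack trace whenever the URL fragment changes
        window.addEventListener('hashchange', function () {
            console.trace('hash changed to: ' + location.hash);
        });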

  • Affiliate software to attract incoming customers

    - by Steve
    I am close to starting a new website for a small business which imports products from the USA to Australia. The wholesaler says he will allow my client to be the sole distributor for Australia and New Zealand. I'm not sure yet which CMS or shopping-cart software to use, but it will need to include an affiliate system to allow advertisers to push customers our way. Do you have any suggestions for robust, flexible affiliate software? Thanks.

  • Strange robots.txt - how and why did it get there?

    - by Mick
    I recently created a very simple, pure-HTML website which I have hosted with Hostmonster. Hostmonster had very good reviews on some comparison websites, and in general so far they appeared to be perfectly good in every way... at least I thought so until just now. I have been making lots of edits to my site on an almost daily basis. My site now appears on the first page (7th on the list) of a Google search for my most important key phrase. But I did notice a problem with the snippet Google chose. I asked a question on this site about snippets and got some great answers. I then made some modifications to my metadata, and within 48 hours the Google snippet for my search was perfect. The odd thing, though, was that Google's "cached" version appeared to be very old, from three weeks previous. This seemed very odd: how could the Google robots have read my new metadata without updating the cache? This puzzled me greatly. Just now it occurred to me that maybe I had some goofy setting in my robots.txt file. I didn't actually remember even making one, but I thought I'd have a look just in case. Much to my horror, I saw that there was a robots.txt, and it contained the disturbing text below:

        sitemap: http://cdn.attracta.com/sitemap/728687.xml.gz

    Intuitively this looks like some kind of junk or spam trick, and I had indeed been getting some spam from Attracta. So my questions are:

    1. Should I simply delete this robots.txt?
    2. Was the file there all along, placed there because of some commercial tie-in between Attracta and Hostmonster?
    3. Does the Attracta robots file explain the lack of re-caching?
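    For what it's worth, if the file is replaced rather than simply deleted, a minimal neutral robots.txt looks like the sketch below (the sitemap URL is a placeholder to be pointed at your own domain):

        User-agent: *
        Disallow:

        Sitemap: http://www.example.com/sitemap.xml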

  • Custom Facebook Connect image - is it a Facebook policy violation?

    - by Viruthagiri
    I was going to replace Facebook's default login button with my own custom image, the way Mashable does. But I found an article which states this is against Facebook's policies. Is it really a violation? If it is, how come Mashable uses a custom image? Can someone answer me? Update: this is the exact image I would like to use. Facebook says the following on this page: "While you may scale the size to suit your needs, you may not modify the 'f' logo in any other way (such as by changing the design or color). If you are unable to use the correct colour due to technical limitations, you may revert to black and white." So does my "Sign in with Facebook" image violate Facebook policy in any way?

  • Difference between two kinds of Bing URL Referers

    - by joshuahedlund
    Most of the referral URLs that I get from Bing have the following syntax: http://www.bing.com/search?q=keywords+keywords&[some other variables]. However, I just noticed that maybe 10-20% of them come in like this: http://www.bing.com/url?source=search&[some other variables]&url=http%3A%2F%2Fwww.example.com/user-landing-page-on-my-site&yrktarget=_top&q=keywords+keywords&[some other variables]. The first syntax gives me the keywords the user typed in, but the second actually gives me both the keywords the user typed in and their landing page on my site. I was originally unaware of this second kind altogether, because I have a customized referral report that filters out URLs containing my domain. But now that I've noticed them, I want to know why they occur, to see if I can get more to occur this way, since the second syntax contains more valuable information. If I go to one of the first URLs, it gives me a typical Bing query page. The second URLs seem to just redirect me to the Bing home page. I'm not sure if it has to do with the kind of search being performed (I also get a few http://www.bing.com/shopping/search?q= referrers) or some other metric. Does anyone know what causes some referral URLs from Bing to have the /search?q syntax and others the /url?source syntax? P.S. I have verified that I am getting both kinds of URLs from non-advertising clicks. P.P.S. I am not talking about data in Google Analytics or similar software, but about the raw $_SERVER['HTTP_REFERER'] value coming from the client's original request.
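    As a side note, not part of the original question: the extra information in the second form can be pulled apart with standard URL parsing. A rough PHP sketch:

        <?php
        // Extract search terms and landing page from a /url?source=search referrer.
        $ref = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';
        $parts = parse_url($ref);
        if (isset($parts['query'])) {
            parse_str($parts['query'], $params); // parse_str also urldecodes values
            $keywords = isset($params['q'])   ? $params['q']   : null;
            $landing  = isset($params['url']) ? $params['url'] : null;
        }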

  • 403 error on index file

    - by John L.
    When I try to access index.py in my server root through http://domain/, I get a 403 Forbidden error, but I can access it through http://domain/index.py. My server logs say: "Options ExecCGI is off in this directory: /var/www/index.py". However, my httpd.conf entry for that directory is the same as the ones for other directories, and getting to index.py directly works fine. Permissions are set to 755 for index.py. I also tried making a PHP file named index.php, and it works from both domain/ and domain/index.php. Here is my httpd.conf entry:

        <Directory /var/www>
            Options Indexes Includes FollowSymLinks MultiViews
            AllowOverride All
            Order allow,deny
            Allow from all
            AddHandler cgi-script .cgi
            AddHandler cgi-script .pl
            AddHandler cgi-script .py
            Options +ExecCGI
            DirectoryIndex index.html index.php index.py
        </Directory>

    Thanks
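    A hedged observation, not from the original thread: in Apache config, a bare Options line replaces the entire option list, while the +/- forms merge into it, so a later section (another <Directory> block, or one inside a virtual host) that sets Options without +/- would silently drop ExecCGI for the directory-index request. Consolidating to a single explicit line makes the intent unambiguous:

        <Directory /var/www>
            Options Indexes Includes FollowSymLinks MultiViews ExecCGI
            AllowOverride All
            Order allow,deny
            Allow from all
            AddHandler cgi-script .cgi .pl .py
            DirectoryIndex index.html index.php index.py
        </Directory>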

  • #id - URLs with an id first display the full page, then move to the #id

    - by guisasso
    I've noticed this in the new version of Chrome, and in IE9 and IE10. Some URLs in a photo gallery have an #id fragment, as they are supposed to display a full view of a picture: basically, a div in a lower position on the page has the #id that I call via a.com/1.html#id. This was never an issue until lately, when I noticed a bit of lag. The issue: the website loads normally, then the view moves to the #id as expected, but sometimes with noticeable lag, perhaps because of the high resolution of the picture. Is there any way to avoid this, or to make the page move to the correct #id even before it has fully loaded?
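    One possible workaround (a sketch only): jump to the anchor as soon as the DOM is parsed, instead of waiting for every image to finish loading:

        // run at DOMContentLoaded rather than at window.onload
        document.addEventListener('DOMContentLoaded', function () {
            var target = document.getElementById(location.hash.slice(1));
            if (target) {
                target.scrollIntoView();
            }
        });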

  • WordPress subcategory navigation with permalinks

    - by Towhid
    I use pretty permalinks on my WP website, but navigation in sub-subcategories is not possible. For example, these URLs are fine:

        http://technopolis.ir/category/articles/security-articles/
        http://technopolis.ir/category/articles/security-articles/page/2/

    But this sub-subcategory generates a 404 on its second page:

        http://technopolis.ir/category/articles/security-articles/backtrack/ (first page is fine)
        http://technopolis.ir/category/articles/security-articles/backtrack/page/2/ (404 error)

  • Google Analytics Funnel Step Regular Expression Not Working

    - by scoarescoare
    The first step in a funnel is going to have a dynamic ending fragment. Examples:

        http://mysite.com/invite/tickle-party
        http://mysite.com/invite/pajama-party
        http://mysite.com/invite/puppy-party

    To allow for such dynamism, I provided this URL for step one: \invite(.*) My goals work, but the funnel visualization report shows 0 for everything. I know this problem is due to the regex in the funnel step, because I copied the entire goal except that I replaced \invite(.*) with /invite/puppy-party, and with that hardcoded value the funnel worked as expected. Why is my funnel report not working with my original funnel-step URL pattern?
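    For reference (not part of the original post): funnel steps are matched as regular expressions against the request path, which begins with a forward slash, so the step pattern would presumably need the slash form, something like

        /invite/.*

    rather than the backslash form.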

  • Redirect Permanent and https

    - by Clem
    I just set up HTTPS on my server, and I have an issue with Redirect permanent. If I have a link such as http://domain.com/index.html, it redirects me to https://www.domain.comindex.html. The / is missing, and I can't figure out how to fix it. It works with http://www.domain.com/index.html. Here is my httpd.conf:

        <VirtualHost *:80>
            ServerName domain.com
            Redirect permanent / https://www.domain.com/
        </VirtualHost>

        <VirtualHost *:80>
            ServerName www.domain.com
            Redirect permanent / https://www.domain.com/
        </VirtualHost>

        <VirtualHost *:443>
            DocumentRoot /var/www/domain/
            ServerName www.domain.com
            SSLEngine on
            SSLCertificateFile ssl.crt
            SSLCertificateKeyFile ssl.key
        </VirtualHost>
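    Not from the original post, but an equivalent mod_rewrite sketch (assuming mod_rewrite is loaded) that carries the request path across explicitly, which sidesteps any trailing-slash ambiguity in the redirect target:

        <VirtualHost *:80>
            ServerName domain.com
            ServerAlias www.domain.com
            RewriteEngine On
            # $1 is everything after the leading slash of the request path
            RewriteRule ^/(.*)$ https://www.domain.com/$1 [R=301,L]
        </VirtualHost>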

  • Using multiple A-records for my domain - do web browsers ever try more than one?

    - by Jonas
    If I add multiple A records for my domain, they are returned in round-robin order by DNS servers, e.g.:

        example.com A 1.1.1.1
        example.com A 1.1.1.2
        example.com A 1.1.1.3

    But how do web browsers react if the first host (1.1.1.1) is down or unreachable? Do they try the second host (1.1.1.2), or do they return an error message to the user? Are there any differences between the most popular browsers? If I implement my own application, I can make it use the second address when the first is down, so it is possible, and it would be very helpful for building a fault-tolerant website.
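    A rough sketch of that application-side failover idea in PHP (example.com is a placeholder):

        <?php
        // Try each A record in turn until one accepts a TCP connection.
        $ips = gethostbynamel('example.com'); // all A records, in returned order
        if ($ips) {
            foreach ($ips as $ip) {
                $conn = @fsockopen($ip, 80, $errno, $errstr, 2); // 2 s timeout
                if ($conn) {
                    // connected: use this host and stop trying the rest
                    fclose($conn);
                    break;
                }
            }
        }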

  • Google indexed my main site's content under subdomains

    - by Christie Angelwitch
    Google is indexing my top-level domain's content as though it belongs on my subdomains, and I want to stop this. My site has wildcard subdomains enabled, and we also have two subdomains with unique content: the first serves as a blog, the second has only one page. Both have backlinks. Google has indexed content from the main site under the subdomains as well. Let's say we have a page at example.com/page.html. The same page has also been indexed as subdomain.example.com/page.html, and it sometimes ranks better than the one located on the main site. The thing is, we never placed this content on the subdomain. I've thought about adding canonical tags on the subdomains to help with the duplicate-content issue. How can I stop Google from indexing those pages? I don't even know how Google found them, since we never placed them on the subdomains.
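    For reference, the canonical-tag idea would look something like the line below on each subdomain copy (the URL is illustrative), pointing the duplicate back at the main-domain page:

        <link rel="canonical" href="http://example.com/page.html" />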

  • .com vs .me for personal and blogging sites: which is better for SEO?

    - by Sameer Manas
    I have a domain under my name with a .com extension. I am planning to use it for my portfolio and also as a regular blog. Now, considering SEO and ranking, what is the best way to implement this?

        myname.com - Portfolio || myname.com/blog - Blog page
        (or)
        myname.com - Blog || myname.me - Portfolio

    I have absolutely no idea how TLDs impact SEO and ranking, so I seek the experts' advice on this. Thanks in advance.

  • Correct configuration of multiple Analytics trackers per page, spanning domains and subdomains

    - by Eliot Shepard
    My company publishes sites on a somewhat convoluted domain structure, and we're having trouble getting accurate numbers in Analytics when we have multiple trackers on the page. We publish under two brands (A, B). Each brand has a "national" site at A.com, B.com, as well as per-city "local" sites at e.g. ny.A.com, la.A.com, sf.A.com, etc. Right now we're trying to track along these dimensions:

        Full network (A.com, ny.A.com, B.com, la.B.com, etc.)
        All sites in a brand (A.com, ny.A.com, la.A.com, etc.)
        Individual site (ny.A.com)

    Here are the commands we're using on an individual site:

        _gaq.push(
            ['t0._setAccount', 'UA-XXXXXX-1'], // full network
            ['t0._setDomainName', 'none'],
            ['t0._setAllowLinker', true],
            ['t0._trackPageview'],
            ['t1._trackPageLoadTime'],
            ['t1._setAccount', 'UA-XXXXXX-2'], // brand
            ['t1._setDomainName', 'none'],
            ['t1._setAllowLinker', true],
            ['t1._trackPageview'],
            ['t1._trackPageLoadTime'],
            ['t2._setAccount', 'UA-XXXXXX-3'], // individual
            ['t2._setDomainName', 'none'],
            ['t2._setAllowLinker', true],
            ['t2._trackPageview'],
            ['t2._trackPageLoadTime']
        );

    We send the same commands to each account because we've had strange results when trackers were configured differently in the past. However, right now we're seeing inflated unique-visitor numbers on all three trackers. What is the correct way to configure this setup? Thanks for your time.
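    One hedged note, not from the original post: for a cross-subdomain rollup, ga.js normally wants the tracker to share its cookie across the subdomains rather than run with _setDomainName set to 'none'. A sketch for the brand-level tracker:

        // share the brand tracker's cookie across *.A.com
        _gaq.push(
            ['t1._setAccount', 'UA-XXXXXX-2'],
            ['t1._setDomainName', '.A.com'],  // covers ny.A.com, la.A.com, ...
            ['t1._trackPageview']
        );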

  • Selling or using a domain name that is another company's trademark

    - by Prakash Moturu
    I have a domain name, but the problem is that it is the exact same word as a big company's name, and I am not sure whether they have trademarked it or not. Is it legal to use the domain for a non-profit purpose, and in a field other than the one the company operates in? Also, can I sell it to anyone? Is there any possibility of the company taking action over selling it, or over using it for some non-profit and unrelated purpose? I have absolutely no idea about trademarks and patents. Thanks for your time in advance.

  • Which shopping cart / ecommerce platform to choose?

    - by fabien7474
    I need to build an e-commerce website within a tight budget and schedule. Of course, I have never done this before, so I have googled my options and concluded that the following are no longer valid candidates:

        Magento: steep learning curve
        osCommerce: old, bad design, buggy and not user-friendly
        Zen Cart, CRE Loaded, CubeCart: based on osCommerce
        VirtueMart, Ubercart, eCart: based on a CMS (Joomla, Drupal, WordPress) that is not necessary for my use case

    So I finally narrowed down my choices to these solutions:

        PrestaShop: easy to use, great templating engine (Smarty), but many modules are not free yet are indispensable
        OpenCart: security issues and not great support from the main developer (see here and here)

    So, as you can see, I am a little bit confused, and if you can help me choose an easy-to-use, lightweight and cheap (not necessarily free) e-commerce solution, I would really appreciate it. By the way, I am a Java/Grails programmer, but I am also familiar with PHP and .NET (not with Python or Ruby/Rails). EDIT: It seems that this question is more appropriate for the Webmasters Stack Exchange site, so please move it where it belongs (I cannot do that) instead of downvoting it. BTW, I have found a quite similar and quite popular question on SO: http://stackoverflow.com/questions/3315638/php-ecommerce-system-which-one-is-easiest-to-modify

  • Phishing alert but file never existed

    - by IMB
    I got an alert from Google Webmaster Tools saying the following file was present on my host: example.com/~jhostgop/identity.php. I checked my files, and it never existed at all. I've experienced this problem on two different hosts and domains, but the file never existed in my file system. It appears somebody out there is linking to a random domain with /~jhostgop/identity.php appended to it, and Google may have indexed those links, so now I get these false phishing alerts. Has anyone experienced this? Is it possible to prevent it?
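    If the phantom path keeps attracting crawlers, a small defensive .htaccess rule (a sketch using mod_alias) can answer it explicitly instead of letting it resolve:

        # tell crawlers the path is permanently gone
        RedirectMatch 410 ^/~jhostgop/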

  • How to handle non-existent subdirectories?

    - by Question Overflow
    I have a dynamic website with friendly URLs. For example, instead of /user.php?id=123 I have /user/123, and instead of /index.php?category=fishes I have /fishes. But how do I handle non-existent subdirectories such as /about/123? Currently they give a 200 Success instead of a 404 Not Found. Is there a way to deal with non-existent subdirectories in the Apache config while still allowing friendly URLs, or do I have to handle this individually in each PHP script?
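    One common pattern, sketched here under the assumption of Apache with mod_rewrite: route everything that is not a real file or directory through a single front controller, and let that one script decide what is a valid URL and send the 404 for everything else.

        # .htaccess
        RewriteEngine On
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^ index.php [L]

        <?php
        // index.php - route known URL shapes, 404 everything else
        $path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);
        if (preg_match('#^/user/(\d+)$#', $path, $m)) {
            // render the user page for id $m[1]
        } elseif ($path === '/fishes') {
            // render the category page
        } else {
            header('HTTP/1.0 404 Not Found');
            // render a 404 page
        }

    This keeps the check in one place instead of in every script.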
