Search Results

Search found 9724 results on 389 pages for 'pro zeck'.


  • Weird .ASP pages from my non-ASP site generating 404s

    - by Amanda
    In Google Webmaster Tools, I have a huge list of "Not Found" crawl errors (404) with URLs that look like this: http://www.exclusivevillas.co.za/villa_view.asp?vSeq=82&activitySeq=3&page=3, seemingly originating from URLs very similar to that (e.g. http://www.exclusivevillas.co.za/villa_view.asp?vSeq=82&activitySeq=3&page=4). The thing is, the site is WordPress, and has been for almost a year now; it was plain HTML before that. I don't know where these ASP requests are coming from. Furthermore, the dates on which these supposed ASP pages requested the other ASP pages, resulting in the 404s, are very recent. What's going on?
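
    A minimal sketch of one way to quiet the crawl errors, assuming the old .asp URLs have no WordPress equivalent and can simply be retired (the rule below is an illustration, not a confirmed fix for this site): answer any request for a .asp path with "410 Gone" in .htaccess, so crawlers drop the stale URLs instead of re-reporting them as 404s.

        # Hedged sketch: retire legacy .asp URLs with 410 Gone.
        # Place above the standard WordPress rules; swap [G] for a 301 redirect
        # if some .asp URLs have real WordPress counterparts.
        <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteRule \.asp$ - [G,NC,L]
        </IfModule>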

    Read the article

  • 1and1: Unable to host an external domain

    - by Django Reinhardt
    I'm sorry if this isn't the right place for this question, but I'm presently having difficulties with my hosting provider (1and1). Two weeks ago, two of my clients bought hosting from them on my recommendation, but as it turned out, 1and1 are having severe technical difficulties. Right now none of their hosting packages are able to accept ANY external domains. So either you pay the costs of a registrar transfer for your domain, or you use the ugly 1and1 domain name. Not any good for a hosting company of 1and1's reputation! They have been promising me for two weeks that they're going to fix the problem, but as you have probably guessed by now, that hasn't been the case. I would like to know a) whether anyone else is in the same boat as me, and b) whether there are other comparably reputable hosting providers that I should consider moving to instead. Very disappointing! :( Note: This is for 1and1 in the UK. I imagine it isn't affecting users in other countries(?) Clarification: 1and1 are unable to accept ANY external domains. That means that even if you update the DNS details on your domain, their system cannot be updated to add your external domain to your account.

    Read the article

  • Interpretation of empty User-agent

    - by Amit Agrawal
    How should I interpret an empty User-agent? I have some custom analytics code, and that code has to analyze only human traffic. I have a working list of User-agents denoting human traffic and bot traffic, but the empty User-agent is proving to be problematic, and I am getting a lot of traffic with an empty user agent - around 10%. Additionally, I crafted the human-versus-bot user agent list by analyzing my current logs, so I might be missing a lot of entries in there. Is there a well maintained list of user agents denoting bot traffic, or, the inverse, a list of user agents denoting human traffic?
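
    A minimal sketch of one common interpretation, assuming the analytics code is PHP (the function name and the bot substrings are illustrative, not taken from the site in question): an empty or missing User-agent almost never comes from a mainstream browser, so it is usually safer to count it in a separate "unknown" bucket than to force it into either the human or the bot list.

        <?php
        // Hedged sketch: classify a request by its User-agent, treating blank as "unknown".
        function classify_visitor($userAgent)
        {
            $userAgent = trim((string) $userAgent);
            if ($userAgent === '') {
                return 'unknown';   // count separately; don't let it inflate the human numbers
            }
            foreach (array('bot', 'crawler', 'spider', 'curl', 'wget') as $needle) {
                if (stripos($userAgent, $needle) !== false) {
                    return 'bot';
                }
            }
            return 'human';
        }

        $ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
        echo classify_visitor($ua);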

    Read the article

  • Apache htaccess results in files being downloaded instead of displayed

    - by chrissik
    So I had this "beautiful" website that did exactly what I wanted it to do. Then I shut down my PC, rebooted and... the pages just download now instead of being displayed. I re-installed XAMPP and launched Apache again, and I was able to identify the .htaccess file as the cause of the problem:

        Options +FollowSymlinks
        RewriteEngine on
        RewriteCond %{QUERY_STRING} !^desktop
        RewriteCond %{HTTP_USER_AGENT} "android|blackberry|googlebot-mobile|iemobile|iphone|ipod|#opera mobile|palmos|webos" [NC]
        RewriteRule ^/?$ /mobile/index [L,R=302]
        RewriteRule ^/?$ /de/index [R]
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{REQUEST_FILENAME}\.html -f
        RewriteRule ^(.*)$ $1.html

    Here is the problem, I guess:

        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{REQUEST_FILENAME}\.html -f
        RewriteRule ^(.*)$ $1.html

    This should make it possible to use /de/index instead of /de/index.html - but somehow it causes the page to download if I open localhost/de/index (with localhost/de/index.html it works fine...). I'm using HTML pages with SSI elements on an Apache web server. The only other file that differs from the out-of-the-box ones is httpd.conf, where I enabled SSI:

        AddType text/html .shtml
        AddHandler server-parsed .shtml
        AddHandler server-parsed .html
        AddHandler server-parsed .htm
        Options Indexes FollowSymLinks Includes
        AddOutputFilter INCLUDES .shtml
        Options +Includes

    I hope there is somebody among you who can help me with this annoying problem, as I'm quite desperate... For some reason, even without the problematic lines, Chrome keeps downloading the files (even if I delete the .htaccess file), while IE and Opera display the pages. Edit: Now Opera also wants to download the files (whether index.html or index is requested).
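
    A minimal sketch of one thing worth trying, under the assumption (not confirmed from the question) that the download prompt means the rewritten response goes out without a text/html content type: force the MIME type on the extensionless rewrite with the T flag and make the rule final with L.

        # Hedged sketch: explicitly type the extensionless rewrites as HTML.
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{REQUEST_FILENAME}\.html -f
        RewriteRule ^(.*)$ $1.html [L,T=text/html]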

    Read the article

  • Fetching templates via API. Who provides this service?

    - by Guandalino
    I'm mainly a server-side developer. I'm not a designer, even if I understand web layouts, grids, CSS, typography, valid markup, etc., and I'm able to do some graphic work too (almost). It just takes a lot of time and the result is not always beautiful. I know there are tons of website template sites out there, and I'd like to use their designs as a starting point for my customers' works, giving them the possibility to choose the design they like most. I'd just prefer to show the template catalog to customers from within my site, fetching template info (screenshots, descriptions, etc.) from a remote server using an API. TemplateMonster.com provides, or provided, such an API, but the service responds with "Unauthorized usage". Are there other sites offering this kind of retrieval service?

    Read the article

  • What bots are really worth letting onto a site?

    - by blunders
    Having written a number of bots, and seen the massive number of random bots that happen to crawl a site, I am wondering: if the point of letting bots onto a site is the potential for them to send real traffic back, is there any reason to allow bots that are not known to send real traffic back? And how do you spot these "good" bots - by how they identify themselves, the IPs they come from, their behaviour, etc.?
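
    A minimal sketch of the IP-based part of that check, assuming PHP (the trusted-suffix list is illustrative): the major search engines document that their crawlers can be verified with a reverse DNS lookup on the visiting IP followed by a forward lookup that must resolve back to the same IP, a test that a scraper merely claiming to be "Googlebot" will fail.

        <?php
        // Hedged sketch: forward-confirmed reverse DNS for a self-declared search-engine bot.
        function is_verified_crawler($ip, $allowedSuffixes = array('.googlebot.com', '.google.com', '.search.msn.com'))
        {
            $host = gethostbyaddr($ip);                  // reverse lookup
            if ($host === false || $host === $ip) {
                return false;                            // no usable PTR record
            }
            $suffixOk = false;
            foreach ($allowedSuffixes as $suffix) {
                if (substr($host, -strlen($suffix)) === $suffix) {
                    $suffixOk = true;
                    break;
                }
            }
            if (!$suffixOk) {
                return false;
            }
            return gethostbyname($host) === $ip;         // forward lookup must point back
        }

        $ip = isset($_SERVER['REMOTE_ADDR']) ? $_SERVER['REMOTE_ADDR'] : '127.0.0.1';
        var_dump(is_verified_crawler($ip));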

    Read the article

  • Displaying topics/categories related ads

    - by Frank
    Is there any advertising company that allows webmasters to decide on general ad topics (such as "Entertainment", "Autos and vehicles", "Arts", etc.) to be displayed to a user visiting a website? I know you can choose specific ad campaigns to show to users, or let the advertising company decide for you which ads to show, but I am looking for an option to ask the advertising company to show ads based on categories/topics. Thanks. Frank

    Read the article

  • Will it hurt my website's SEO friendliness if I host a French-targeted website at, let's say, godaddy.com?

    - by Suraj
    Hi guys, I have read that the server location is important for a website to be SEO friendly. I am planning to build a website from scratch which is targeted mainly at a French audience (in France), but I am planning to host the website at godaddy.com. My concern is: will it hurt the website's SEO friendliness? Or do you recommend that I host the website in France itself? I have also read that I need to have a static IP address. If that's true, can anyone explain to me for what reason? Can anyone suggest some good web hosting companies, preferably in France? Thanks in advance!

    Read the article

  • Blocking Just the Parent Domain via robots.txt

    - by Bryan Hadaway
    Let's say you have a parent domain, parent.com, and child subdomains under that parent domain:

        child1.com
        child2.com
        child3.com

    Is there a way to use just the following within parent.com:

        User-agent: *
        Disallow: /

    considering each child has its own robots.txt stating:

        User-agent: *
        Allow: /

    Or is the parent robots.txt still going to have to make an exception for every single subdomain:

        User-agent: *
        Disallow: /
        Allow: /child1/
        Allow: /child2/
        Allow: /child3/

    Obviously this is important and tricky territory SEO-wise, so I'm looking to learn the definitive and safe best-practice method here to sharpen my skills. Thanks, Bryan
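
    A minimal sketch of how the protocol itself behaves, under the assumption that the children really are separate hosts (written here as child1.parent.com purely for illustration): crawlers request robots.txt from each hostname independently, so the file served by the parent host only governs the parent and needs no Allow exceptions for the children.

        # Served at http://parent.com/robots.txt - applies to parent.com only
        User-agent: *
        Disallow: /

        # Served at http://child1.parent.com/robots.txt - applies to that host only
        User-agent: *
        Allow: /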

    Read the article

  • DNS question and Google PageRank from domains

    - by Beck
    I'm not so good at DNS at all :) Just the basics. A while ago I noticed that my blog has different PageRanks: PR 3 for the domain www.example.com and PR 1 for the domain example.com. In the DNS records I have this setup:

        A - IP - www.example.com
        A - same IP - example.com

    Should I replace the record "A - same IP - example.com" with a CNAME instead of an A record? Like this:

        CNAME - same IP - example.com - alias of www.example.com

    Will this combine the PageRank value of both domains? Or can I just create a 302 redirection inside the .htaccess file, verify example.com (without www) inside Google Webmaster Tools as my domain, and in the www.example.com options set it as the main domain? Thanks ;)
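
    A minimal sketch of the .htaccess route mentioned above, with one hedge: for consolidating the two hostnames, a permanent 301 redirect is what is generally recommended rather than a temporary 302 (the rules assume Apache with mod_rewrite and www.example.com as the preferred host).

        # Hedged sketch: send example.com traffic permanently to www.example.com.
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
        RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]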

    Read the article

  • Magento Checkout options

    - by graham barnes
    Hi, I want to add some options to my Magento store. Let's say I print on clothing: a customer buys some t-shirts, shirts and jackets from me, and it totals £60 + VAT. In the checkout area where the customer signs up, and not before, I need to add a text box and an upload option - can I do this? Ideally I then want to add some pricing options for when the user has chosen to add branding to a product or to multiple products, e.g. if the branding is on the top right of the shirt it costs £5.00, if on the back it costs £7.00, etc. - all, if possible, to be done via the admin CP. I also want an option so that when they upload their logo for the first time they are charged a one-off charge, like a setup fee, but if the customer has already sent in their logo then no charge applies. Thanks, Graham

    Read the article

  • Website hosted at home pingable from outside, but not browseable from outside

    - by Richard DesLonde
    I have a simple setup:

        - The server at home has the local IP 192.168.1.3
        - IIS is running on the server and the website is up
        - Windows Firewall on the server has an exception rule for port 80 TCP
        - The router has the static IP XX.XXX.XX.XXX
        - The router is forwarding TCP port 80 to 192.168.1.3
        - My domain registrar is my DNS host and is pointing to the static IP XX.XXX.XX.XXX of the router

    Here's what I can and can't do: I can browse the website from within my home network, either by IP or by domain name. I can ping the domain and the IP from outside the network (from a computer at work). I can't browse the website either by domain name or by IP. Weird. Why can't I browse my website? Incidentally, I wasn't sure this question was appropriate for SO, but after finding a few others similar to it on SO, and no comments on those questions saying anything about them being inappropriate, I decided I would post this question. Let me know if this is not appropriate for SO, or is more appropriate for another of the SE websites. Thanks!

    Read the article

  • Will my current page layout get me penalized for duplicate content?

    - by Perry Roper
    I am using WordPress, and in my post sidebar I have related posts which may be of interest to the user; however, I also show an excerpt of each article, which is normally the first paragraph of the post it is linking to. For example, take http://musicdune.com/reviews/album-review-ellie-goulding-lights - if you do a Google search for the first excerpt in the related posts section of that page, you get 4-5 results from my domain: http://www.google.co.uk/search?sourceid=chrome&ie=UTF-8&q=Strip+back+the+synths,+fast+beats+and+the+other+pop+elements,+and+you%E2%80%99re+left+with+something+elegant+and+soulful Is it recommended that I remove the excerpt from the related posts?

    Read the article

  • Link form correct, or punishable by search engines?

    - by w0rldart
    I have the following dilemma with the links of a WordPress blog that I work with: I don't know whether the way it creates the links to the images is okay or not so good. For example:

        Article URL: http://test.com/prima-de-riesgo/
        Image URL belonging to the article: http://test.com/prima-de-riesgo/europa/

    So what I'm worried about is the repeating "prima-de-riesgo" part. Should I, or shouldn't I? UPDATE: Wow, I can't believe you took test.com for the real domain, hehe!

        Article URL: http://queaprendemoshoy.com/prima-de-riesgo-y-otras-graficas-interesantes-del-ano-2011-deuda-publica-pib-vs-empleo-y-precio-del-oro/
        Image URL belonging to the article: http://queaprendemoshoy.com/prima-de-riesgo-y-otras-graficas-interesantes-del-ano-2011-deuda-publica-pib-vs-empleo-y-precio-del-oro/deuda-publica-eurozona/

    So, as I mentioned, I'm worried that prima-de-riesgo-y-otras-graficas-interesantes-del-ano-2011-deuda-publica-pib-vs-empleo-y-precio-del-oro, the common factor in the article URL and the image URL, could be considered duplicate content or anything else that search engines might penalize.

    Read the article

  • CRM + Invoicing/Billing + Ticketing for a small web design company

    - by Mike
    Hi everyone, I am currently using ActiveCollab but it lacks typical CRM features. I can't even keep notes about a customer saved in one place. What I am looking for is a simple but efficient CRM application that allows me to store all the (potential) customers along with their phone calls noted down, contracts and agreements. On the billing end, I should be able to keep track of invoices and payments, along with a bit of sales reporting. A support-ticket feature would be a great extra, but it is not really necessary. I looked at vTiger and SugarCRM at first, though they look too complex on the sales/campaigns end and completely lack the billing side. Do you have some good apps/services to suggest? :) Any programming language or OS would do. Both paid and free. Thanks, Mike

    Read the article

  • How important is responsive web design?

    - by Daniel
    I've heard many different opinions recently regarding the pros and cons of responsive web design, and was wondering whether it is necessary for small businesses that target small geographical areas to implement it. Some sub-questions I have relating to this include:

        - Is it better to use responsive web design as opposed to having separate code for different dimensions/devices?
        - Can it affect SEO (positively or negatively)?
        - What are the main problems I could run into when optimizing a business's website using this design method?

    Read the article

  • Chrome causing 404's ending with "/cache/[hex-string]/"?

    - by Jan Fabry
    For the last few weeks we have seen many 404s on our sites caused by Chrome adding /cache/[hex-string]/ to the current page URL. The hex strings we have seen are:

        e9c5ecc3f9d7fa1291240700c8da0728
        1d292296547f895c613a210468b705b7
        408cfdf76534ee8f14657ac884946ef2
        9b0771373b319ba4f132b9447c7060a4
        b8cd4270356f296f1c5627aa88d43349

    If you search for these strings you get matches from different sites, but they are most likely auto-generated (/search/cache/e9c5ecc3f9d7fa1291240700c8da0728/ for example). Is this a known issue with Chrome (or an extension)?

    Read the article

  • Format numbers with CSS

    - by Luc M
    Is it possible to format numbers using CSS? When I have 7000000.00, I would like it displayed as 7 000 000.00. I know I could write a backend (PHP, Perl...) function or a JavaScript function that could return the formatted number, but... the numbers that I want to format are in table cells. I would like to have something like

        <td class="myformat">7000000.00</td>

    or

        <td><span class="myformat">7000000.00</span></td>
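
    CSS alone has no number-grouping facility, so as a hedged fallback sketch (assuming the cells are produced by the PHP backend mentioned above), the grouping can be done server-side with number_format before the value ever reaches the markup:

        <?php
        // Hedged sketch: space as thousands separator, dot as decimal separator.
        $value = 7000000.00;
        echo '<td class="myformat">' . number_format($value, 2, '.', ' ') . '</td>';
        // Output: <td class="myformat">7 000 000.00</td>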

    Read the article

  • Is it fine to put different category of stuff in a single domain name?

    - by Fahad Uddin
    I own a website which is about startups and finance. I am planning to start working on WordPress programming, selling WordPress themes. I thought of buying a separate domain name for the WordPress website, but it takes quite a lot of time to set up a website and then do its SEO. Is it fine (in terms of SEO and professionalism) to put the WordPress category inside my old domain, like this:

        Domain: www.startupsandfinance.com
        WordPress domain: www.wp.startupsandfinance.com

    Read the article

  • OpenJs Grid not reading MySQL database [migrated]

    - by Benedikt Wutzi
    I am trying to access a MySQL table using OpenJs Grid. I have already double-checked that the database "partsdb" and the table "parts" exist and can be accessed from the command line. Currently I'm using the most basic example.

    members.php:

        <!DOCTYPE html>
        <html lang="en">
        <head>
            <meta charset="utf-8">
            <title>Members Page</title>
            <script>
                $(function() {
                    $(".parts").grid();
                });
            </script>
        </head>
        <body>
            <div id="container">
                <h1>Members Page</h1>
                <table action="ajax.php">
                    <tr>
                        <th col="id">Id</th>
                        <th col="name">Name</th>
                    </tr>
                </table>
                <a href='<?php echo base_url()."main/logout" ?>'>Logout</a>
            </div>

    and ajax.php:

        <?php
        mysql_connect("localhost","****","*****");
        mysql_select_db("partsdb");
        require_once("grid.php");
        $grid = new Grid("parts");
        ?>

    When I run ajax.php directly I get an error telling me "Unknown column 'parts.' in 'field list'", and the table shows only the headers. What am I doing wrong?

    Read the article

  • Facebook - Filter Page posts by #hashtag [on hold]

    - by beppe9000
    I'm trying to gather all the posts (official posts and people's posts) on my page which contain a specified #tag, to later show them on a website. But I've no clue how to accomplish this. Is there any API capable of this, or anything else that could help? I basically need to get all those post IDs, looping/spidering through them for my hashtag. I'm planning to do this server-side, so PHP is my choice.
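
    A minimal sketch of one server-side approach, assuming a valid page access token (the page ID, token and hashtag below are placeholders, not real values): pull the page's feed from the Graph API as JSON and do the hashtag matching in PHP, keeping the IDs of the posts that contain it.

        <?php
        // Hedged sketch - YOUR_PAGE_ID and YOUR_ACCESS_TOKEN are placeholders.
        // Use /feed instead of /posts if posts made by visitors should be included too.
        $pageId      = 'YOUR_PAGE_ID';
        $accessToken = 'YOUR_ACCESS_TOKEN';
        $hashtag     = '#mytag';

        $url  = "https://graph.facebook.com/{$pageId}/posts?access_token=" . urlencode($accessToken);
        $json = file_get_contents($url);          // requires allow_url_fopen; cURL works as well
        $data = json_decode($json, true);

        $matchingIds = array();
        if (!empty($data['data'])) {
            foreach ($data['data'] as $post) {
                $text = isset($post['message']) ? $post['message'] : '';
                if (stripos($text, $hashtag) !== false) {
                    $matchingIds[] = $post['id'];  // post IDs to render on the website later
                }
            }
        }
        print_r($matchingIds);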

    Read the article

  • What about SEO in a one-page website with Ajax-loaded content?

    - by Azimbadwewe
    As my title says, I'd like to build a website with just one text input for searching restaurants, and I would like to load the results into a list on the same page via Ajax. After the list is loaded, if you click on a row for restaurant details, it loads all the restaurant details via Ajax as well. What about SEO with a website structured like this? Is there a way to index every single restaurant? I'm pretty new to SEO and every comment will for sure be important to me in order to understand and learn more about it. Cheers

    Read the article

  • Best free software for hosting user guides

    - by Hippyjim
    Hi all. After having to clean up spam from a MediaWiki install for the umpteenth time, despite a reCAPTCHA plugin "preventing" automated signups, I'm wondering if MediaWiki is the right choice of CMS for hosting user manuals and guides. I've always loved the way wikis let the guides be edited and commented on collaboratively, but I'm getting tired of dealing with automated vandals. I've disabled edits and signups for now, but as I'm having to go through the pain of cleaning up thousands of junk pages, I'm beginning to think I should cut my losses and look for a better alternative. Does anyone know of a suitable FOSS application (preferably PHP/MySQL based) that would be simple for a non-coder (our manual writer) to edit, but that has all the interconnectivity and searchability of a wiki? Or should I just bite the bullet again and lock the wiki down even further?

    Read the article

  • Do or can robots cause considerable performance issues?

    - by Anicho
    So the question in the title is exactly what I am trying to find out. My case is: at work we are in a discussion with team members who seem to think bots will cause us performance problems when running against our services website. Our setup: let's say I have the site www.mysite.co.uk; this is a shop window to our online services, which sit on www.mysiteonline.co.uk. When people search in Google for mysite they see mysiteonline.co.uk as well as mysite.co.uk. Cases against stopping bots crawling:

        - We don't store GBs of data publicly available on the web
        - Most friendly bots, if they were going to cause issues, would have done so already
        - In our instance the bots can't crawl the site because it requires a username & password
        - Stopping bots with robots.txt causes an issue with SEO (ref. 1)
        - If it was a malicious bot, it would ignore robots.txt or meta tags anyway

    Ref 1: If we were to block robots from crawling mysiteonline.co.uk, this would affect SEO rankings and make it inconvenient for users who actively search for mysite to find mysiteonline, which we can prove is the case for a good portion of our users.
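
    A minimal sketch of a possible middle ground, assuming the real concern is crawl load rather than content exposure: robots.txt can ask crawlers to slow down instead of blocking them outright, though support varies (Bing and Yandex honour Crawl-delay, while Googlebot ignores it and is throttled via Webmaster Tools settings instead).

        # Hedged sketch: throttle rather than block. The values and the bot name are illustrative.
        User-agent: *
        Crawl-delay: 10

        User-agent: BadBot
        Disallow: /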

    Read the article

  • Use virtual pageviews for all goal tracking

    - by Jeff Wu
    I'm new to Google Analytics and I'm wondering if it would be cleaner to use virtual pageviews for all the goal tracking on my website, instead of using a mix of regular pageviews and virtual pageviews. I know in most cases this is just semantics, but there are multiple pages where the same goal can be achieved, and I think it would be cleaner just to fire the same virtual pageview instead of having two different goal pages. Will this model also give developers more flexibility when they do development? I know we are moving to a CMS and URLs can get hairy, so I think this might be a good way to make the analytics portion of the site "future proof". Any thoughts are appreciated! Thanks.

    Read the article
