Search Results

  • robots.txt, how effective is it and how long does it take?

    - by Stefan
    We recently updated the site to a single-page site that uses jQuery to slide between "pages", so we now have only index.php. When you search for the company on engines such as Google, you get the site and a listing of its sub pages, which now lead to outdated pages. Our hosting plan doesn't allow us to edit the .htaccess, and the old pages are .html docs, so I cannot use PHP redirects either. If I put in place a robots.txt telling the engines not to crawl beyond index.php, how effective will it be at preventing/removing the crawled sub pages? And, as a rough guess, how long would it take the search engines to update?
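
    A sketch of the rule in question, assuming the outdated pages are top-level .html documents (hedged: the * and $ wildcards are honored by Google and Bing, but not guaranteed for every crawler):

      User-agent: *
      # block the outdated .html sub pages; index.php stays crawlable
      Disallow: /*.html$

    Note that robots.txt stops crawling, not indexing: URLs that are already indexed tend to linger, so the URL removal tool in Webmaster Tools is usually the faster route, and re-crawl timing varies from days to weeks.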

  • Apache2 and FTP

    - by Jo Colina
    I just set up an Apache web server on my Raspberry Pi, along with MySQL and PHP5, and to upload files I set up vsftpd. The thing is that the FTP connection sent me to my pi user's home directory instead of /var/www, so I changed the pi user's home directory to /var/www and then changed it back to its previous home. FTP now sends me to /var/www, but whenever I upload files, the permissions for "other" are null (Apache sends a 403 Forbidden every time unless I manually chmod the files uploaded via FTP inside /var/www). Does anyone know how to fix this? Thanks!
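
    A likely culprit (an assumption worth checking) is vsftpd's default local_umask of 077, which strips group/other read permission from every upload, so the Apache user can't read the files. In /etc/vsftpd.conf:

      # give new uploads 644 (files) / 755 (dirs) instead of 600/700
      local_umask=022

    Then restart vsftpd (e.g. sudo service vsftpd restart). Files already uploaded still need a one-off chmod.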

  • How can I get cross-browser consistent behavior for TR heights within a table with a set height? [migrated]

    - by Dan
    I have an arbitrary number of tables with an arbitrary number of rows in each, and all tables are the same height. My initial approach was to just set the overall height of the table and hope the rows were smart enough to distribute themselves appropriately. That's not the case. I have 4 different behaviors going on in 4 browsers, but I need them all to render in at least a similar way.
    Safari & Chrome (WebKit): all rows are equal height, creating scroll bars as needed and fitting within the table height.
    Firefox: all rows are the height necessary to fit their content, with the remaining rows overflowing out of the table. Additionally, if the content of the rows does not take up all of the height, only the part of the table with content in it takes the background (though it seems, through use of Firebug, that the actual table [and TR] extend to the bottom of the proper table height).
    IE: all rows are the height necessary to fit their content, with the remaining rows overflowing out of the table.
    Obviously this only covers one version of each browser, and additional variation would likely appear with more being tested. Ideally, I'd like the browser to render TRs with less content smaller than those with more content, while still scrolling within the variable-height TRs when the overall height of the table is not enough. I could potentially see a solution with JS, but can it be done with CSS? Or, if not, can the behavior that WebKit displays be made to work across the browsers? Thanks! PS: Example can be found here.
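
    One common workaround, sketched here without cross-browser verification: a TR can't scroll by itself, so fix the table's layout and let an inner wrapper in each cell take the scrollbar. The .cell-inner class and the pixel values are hypothetical.

      table { table-layout: fixed; width: 100%; height: 400px; }
      td { vertical-align: top; }
      /* the wrapper div inside each td, not the tr, gets the scrollbar */
      td .cell-inner { max-height: 120px; overflow-y: auto; }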

  • Should I use my domain registrar's nameserver or find an alternative?

    - by Fazal
    I've recently been moving my clients' sites, as well as my own and my friends', to a cloud server instance I've set up. I don't have a nameserver running on the instance because I'm not sure how to deploy and manage that side of things yet. I'm using the registrars' default nameservers where possible and just changing the A records to point at the server. Some clients are complaining that their sites run slower than before (since I changed the nameservers back to the defaults). Some of the domain registrars are a nightmare to deal with, and I can't convince some of my clients to leave them. Is there a paid DNS service I can use instead?

  • Is there a way I can filter traffic by page-type based upon URL structure in Google-Analytics or Google Webmaster Tools?

    - by Felix
    I have a local business directory site. I'm trying to segment my incoming traffic by page type so that I can find out what percentage of traffic goes exclusively to zip code pages and what percentage goes to city/state-level pages. I basically want to filter by URL structure to find out what percentage of total traffic the zip code pages account for. Could Google Tag Manager help with this? Here are the two URL paths: http://www.example.com/ny/new-york/10011/ http://www.example.com/ny/new-york
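
    If the URL patterns really are as regular as the two examples, a pair of regular expressions applied to the Page dimension (e.g. in a Google Analytics advanced segment or a custom report filter set to "matches regex") would split the traffic. These patterns are a sketch to be adjusted to the actual path rules:

      ^/[a-z]{2}/[^/]+/[0-9]{5}/$    zip code pages
      ^/[a-z]{2}/[^/]+/?$            city/state pages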

  • Cookie manager PHP

    - by HaCos
    I run a Joomla commerce store, and although I use Google Analytics to track visitors, I need to install a cookie manager so I can track the cookies that were set for a customer when he places an order. To be more specific, I am planning to join an affiliate network, and I need some way to track not only a customer's last visit but also whether he has a cookie and which affiliate network it came from.
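
    A minimal PHP sketch of the usual approach (the aff_id parameter and aff_source cookie names are hypothetical): record the referring affiliate when the visitor lands, then read it back when the order is placed.

      <?php
      // on landing: tie this visitor to an affiliate network for 30 days
      if (isset($_GET['aff_id'])) {
          setcookie('aff_source', $_GET['aff_id'], time() + 30 * 24 * 3600, '/');
      }

      // at checkout: attribute the order to the stored affiliate, if any
      $affiliate = isset($_COOKIE['aff_source']) ? $_COOKIE['aff_source'] : 'none';
      // store $affiliate alongside the order record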

  • make a custom apache handler (for multiple php versions)

    - by user10580
    I've successfully installed PHP 5.2 and it runs as CGI. Ultimately I want to put something like AddHandler application/x-httpd-php52 in the .htaccess file of the directory I want it to run in. However, this only really works in the virtual hosts, because I can't wrap my head around how to define a custom handler:
      <FilesMatch "\.php$">
          SetHandler application/x-httpd-php5
      </FilesMatch>
      ScriptAlias /php52-cgi /usr/lib/cgi-bin/php52-cgi
      Action application/x-httpd-php5 /php52-cgi
      AddHandler application/x-httpd-php5 .php
    How can I do something like AddHandler application/x-httpd-php52 in the .htaccess?
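
    What appears to be missing (a sketch, reusing the php52-cgi wrapper from the question) is declaring the handler name globally, so that .htaccess files can refer to it. The ScriptAlias and Action lines go in the server or vhost config; only the AddHandler line stays per-directory:

      # server or vhost config (not .htaccess):
      ScriptAlias /php52-cgi /usr/lib/cgi-bin/php52-cgi
      Action application/x-httpd-php52 /php52-cgi

      # .htaccess in the directory that should run PHP 5.2
      # (requires AllowOverride FileInfo):
      AddHandler application/x-httpd-php52 .php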

  • redirect subdomain(weblog) to new domain that can't access .htccss 301

    - by fafa
    I have a problem I can't find the solution to on the web. I have a blog with PR 1 at the subdomain "aaaa.domain.com", where "domain.com" is a blog host. Now I have bought a domain, "newdomain.com", and I want to tell Google Webmaster Tools to redirect the old subdomain to the new domain and move the traffic over, but I can't access .htaccess to set up a 301 redirect. All I can do is put HTML code into the pages. How can I do this? When I use "Change of Address" in Google Webmaster Tools it says: "Restricted to root level domains only".
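
    Without server access, the usual HTML-only fallback is a sketch like the following in each old page's <head>. It is not a true 301, but search engines generally treat an instant meta refresh combined with a cross-domain canonical as a strong redirect signal:

      <link rel="canonical" href="http://newdomain.com/" />
      <meta http-equiv="refresh" content="0; url=http://newdomain.com/" />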

  • Webmaster tools showing 404 for non existent folder pages

    - by Jody
    Google Webmaster Tools is reporting some/many 404 URLs that don't exist on my site. The links are things such as domain.com/xyz/. That URL doesn't exist, but domain.com/xyz/index.html does. The "linked from" pages all show proper links to /xyz/index.html. The page without index.html DOES 404, but why is Google even trying these URLs if they are not linked to? My real question: is there a way to have Google stop attempting to load these pages, and ultimately remove them from the crawl errors report? Thanks.
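
    As for the 404 itself: Google routinely probes bare directory URLs it infers from paths. If the server is Apache and per-directory overrides are allowed (an assumption), one line in /xyz/.htaccess makes the folder URL serve the index file instead of 404ing, after which the crawl errors should age out:

      DirectoryIndex index.html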

  • Should the English website use href="x-default" when it doesn't auto-redirect to the user's language or country?

    - by Noam
    For each URL on my site, I'm auto-redirecting according to the Accept-Language header. The site architecture is: English version: http://mydomain.com/page; Spanish version: http://es.mydomain.com/page; etc. The English version is displayed unless the header specifies a language other than en that we support, in which case a redirect occurs. Google says this: "For language/country selectors or auto-redirecting homepages, you should add an annotation for the hreflang value 'x-default' as well." My pages aren't language selectors, nor are they the homepage, but I am auto-redirecting. My question is: should my English version be hreflang="x-default", hreflang="en", or both?
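
    For reference, the annotations under discussion would look like this on each English URL (and x-default may legitimately point at the same URL as en):

      <link rel="alternate" hreflang="en" href="http://mydomain.com/page" />
      <link rel="alternate" hreflang="es" href="http://es.mydomain.com/page" />
      <link rel="alternate" hreflang="x-default" href="http://mydomain.com/page" />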

  • SSL is not enabled in Heroku

    - by Drom
    Since Heroku has integrated the piggyback_ssl addon by default, trying heroku addons:add piggyback_ssl or heroku addons:add ssl:piggyback results in: "! This add-on is now a standard Heroku platform feature. ! Your apps can already use piggyback SSL." But accessing mywebsite.herokuapp.com results in: "SSL is not enabled. mywebsite.herokuapp.com does not have SSL enabled." My production.rb has config.force_ssl = true. What am I missing? Update: I have read the "SSL on Heroku" page, but it's obsolete; it suggests running heroku addons:add piggyback_ssl, which is no longer necessary. SOLVED! I didn't change anything, and now it's working. It seems it was a Heroku problem.

  • Will adding q&a help my site's rankings, and if so, what are the implications of a sub-domain for q&a rather than a path on the site? [closed]

    - by ElHaix
    Possible Duplicate: Subdomain versus subdirectory. One of our web properties is doing quite well without any additional links being created on the site, and our link inventory is tightly managed: no user-generated links. To introduce a community aspect to the site, we want to implement a Q&A forum. Once it's in place, new links will populate our link inventory with keywords that are not necessarily targeted to the site. With the Q&A on a subdomain, would that avoid affecting the main site's rankings? What's the best approach for this?

  • Recommendations for a network of student-related content

    - by Javier Marín
    I am running a network of websites with notes, homework, essays, etc., where users share their own content. I'm having real trouble with the latest Google updates (Penguin, Panda, etc.) because the content is mainly low-quality and on the same topics. For that reason, I want to create more websites and have more chances to appear in the SERPs. My question is: does Google analyze related websites in order to exclude them from the results? I've thought about distributing the websites around the world, on different hosts, but I'm afraid Google would link them through their Analytics, Webmaster Tools, or AdSense accounts. Is that possible? What other recommendations do you have?

  • Images not indexed by google since moving to cdn

    - by dfunkydog
    Last week I moved all the images on coffeeandvanilla.com to a CDN (maxcdn.coffeeandvanilla.com). The problem I'm having is that although the sitemap, generated by the Yoast WordPress SEO plugin, points images to the correct location, Google only indexes images from the category and page sitemaps and 0 images from the posts sitemap (see screenshot https://dl.dropbox.com/u/4635252/sitemap.png ). This website had been doing quite well in Google image search before the change; visits from image search have dropped from ~200/day to 11 yesterday. Here is an example entry from the generated posts.xml sitemap: http://pastebin.com/vcMRf9VW Can anyone suggest where the problem lies? Why have I lost all my Google image juice? Should I just wait some more, and how long before really worrying?
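
    For comparison, a well-formed entry in an image-enabled sitemap looks roughly like this (hypothetical post URL; the image namespace is http://www.google.com/schemas/sitemap-image/1.1, and an image:loc pointing at the CDN host is itself legitimate):

      <url>
        <loc>http://coffeeandvanilla.com/some-post/</loc>
        <image:image>
          <image:loc>http://maxcdn.coffeeandvanilla.com/wp-content/uploads/photo.jpg</image:loc>
        </image:image>
      </url>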

  • Can I improve my AdWords quality scores with better landing pages?

    - by Eric
    I noticed that I have some keywords in my AdWords that are totally applicable to my site but the quality score of the keyword is 4 or 5. I'd like to get it up higher by creating custom versions of my site's home page (landing page) targeted specifically for people searching on those keywords. So for example, if we pretend my site sells pet food, my current home page has the phrase "dog food." I have a specific AdWords campaign for people searching on cat food (with cat food-specific ads). I'm thinking about changing the URL on those ads to something like http://mysite.com/cat.html, so a different home page comes up with the phrase "cat food." My thinking is that will help Google see that this new landing page is appropriate for the keywords and will raise my quality score for the "cat food" keywords. (Note that none of what I'm doing is shady or misleading; nobody would disagree that all of the keywords and ads I've created are perfect and appropriate for what my site offers.) Question: is what I describe the correct way to raise poor quality scores on keywords, and will it help?

  • joomla sometimes messes up urls, probably cache involved

    - by Bakaburg
    For a while now I've been having this problem, and I really can't get the hang of it... Every once in a while my Joomla site messes up link URLs, so that, for example, something like this: http://www.sism.org/index.php?option=com_comprofiler&task=userslist&listid=4&Itemid=123 becomes this: http://www.sism.org/index.php/component/k2/administrator/components/com_dump/assets/css/images/stories/inrilievo/sism/htm/index.php?option=com_comprofiler&task=userslist&listid=4&Itemid=123 The new page has the right content, but none of the CSS or other linked resources load. Usually I solve the problem by deleting all the cache and turning caching off and on again. Of course this is pretty annoying, especially for my association. Does anyone have any clue about this? Judging by the URLs, the components involved seem to be K2 and JDump. Thanks

  • Why are my backlinks not showing on google on this asp.net website with all I've done?

    - by Jason Weber
    I recently implemented many SEO techniques for a company on their ASP.NET website; in 6 months, we jumped from a PR1 to a PR3. But I'm having issues with Google backlinking. Here are some of the things I've done: Not only did I set up their Google+ page 6 months ago, I update it pretty much daily with links, pictures, etc., and I blog about it on my own personal Google+ page and post links there as well. They have their own Twitter, Facebook, and YouTube accounts, and all are updated almost daily. I listed the site in as many quality, relevant directories as possible 6 months ago and avoided link farms. The site is solid SEO-wise: key-phrase-rich URLs, schema.org markup and rich snippets, no duplicate content (www/non-www 301s, trailing slashes, etc. are all taken care of), and probably a ton of other things. Here's what's confounding: when I do link:www.example.com in Bing/Yahoo, it shows many backlinks. When I do link:www.example.com in Google, it shows 0 links. And when I use a site ranker like Web Site Rank Tool, it also shows 0 backlinks from Google. Any suggestions would be appreciated!

  • Changing domain name - what are the practical steps involved

    - by Homunculus Reticulli
    I launched a website a couple of years ago, bright-eyed and bushy-tailed, with dreams of conquering the world. Unfortunately it wasn't to be. Now that I am a bit older and wiser, and have spent some money on branding and creating more quality content, I am rebranding and relaunching the site with a new domain name. Although the traffic on the old site is laughable (i.e. non-existent), there are a few pages of good information on there, and I don't want to lose any "juice" those pages may have gained, since web crawlers have been seeing them for a few years now. OK, the upshot of all that is this: I want to change my domain name from xyz.com to abc.com. I am keeping the same friendly URLs I had before; only the domain name part of each URL will change, so that any traffic coming to an old page is redirected to the new page seamlessly. How do I go about achieving this, i.e. what are the steps I need to carry out, and how do I minimize any disruption to the credibility the existing site has with Googlebot? I am running Apache 2.x on a headless Linux (Ubuntu) server.
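
    On Apache with mod_rewrite, a sketch for the old domain's vhost or .htaccess (using xyz.com and abc.com as in the question) that carries every old path to the same path on the new domain:

      RewriteEngine on
      RewriteCond %{HTTP_HOST} ^(www\.)?xyz\.com$ [NC]
      RewriteRule ^(.*)$ http://abc.com/$1 [R=301,L]

    Because the friendly URLs are unchanged, $1 maps each old page onto its new home; the Change of Address tool in Google Webmaster Tools can then be pointed at the move as well.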

  • Managing accounts on a private website for a real-life community

    - by Smudge
    I'm looking at setting up a walled-in website for a real-life community of people, and I was wondering if anyone has experience with managing member accounts for this kind of thing. Some conditions that must be met: This community has a set list of real-life members, each of whom would be eligible for one account on the website. We don't expect or require that they all sign up; it is purely opt-in, but we anticipate that many of them would be interested in the services we are setting up. Some of the community members' emails are known, but some of the members have fallen off the grid over the years, so ideally there would be a way for them to get back in touch with us through the public-facing side of the site (and we'd want to manually verify the identity of anyone who does so). Their names are known, and for similar projects in the past we have assigned usernames derived from their real-life names. This time, however, we are open to other approaches, such as letting them specify their own username or getting rid of usernames entirely. The specific web technology we will use (e.g. Drupal, Joomla, etc.) is not really our concern right now; I am more interested in how this can be approached in the abstract. Our database already includes the full member roster, so we can email many of them generated links to a page where they can create an account (and internally we can require that these accounts be paired with a known member). Should we have them specify their own usernames, or are we fine letting them use their registered email address to log in? Are there any paradigms for walled-in community portals that help address security issues if, for example, one of their email accounts is compromised? We don't anticipate attempted break-ins being much of a threat, because nothing about this community is high-profile, but we do want to address security concerns. In addition, we want to make the sign-up process as painless for the members as possible, especially given that we can't just open sign-ups to anyone. I'm interested to hear your thoughts and suggestions! Thanks!

  • Is it safe to have no TOS or PP?

    - by JamerTheProgrammer
    I have coded my own forum software from the ground up. I have tried my best to make my code as secure as possible, encrypting everything I can. I want to use this forum for a Minecraft server. I have one concern, however: I would like to set the forum up now, but having no TOS or Privacy Policy has put me off. Will having neither cause me any legal trouble in the unlikely event of a data leak? Thanks

  • Google adsense - providing access (via an additional account?) to a third party

    - by Homunculus Reticulli
    I am working with a partner who will be handling the marketing side of things for one of my websites. He has informed me that he will require access to my AdSense account. I need to create an additional account for him so that he can access and manage Google AdWords/ad units etc. using his own login credentials. However, despite searching Google for a while now, I can't seem to locate any information about creating additional user accounts. Does anyone know how I can do this?

  • Are there any free hit counters that don't track users?

    - by David Englund
    Are there any free services that increment a simple hit counter without tracking the users of the site? I would like to know how many visitors there are to my site, excluding bots. I don't need detailed information like unique visitors or where the user is from (in fact, that's exactly what I don't want). I have been researching free hit counters, and it seems that most (all?) of them display advertisements and their terms of service indicate that they can use the data they collect from the client site however they want. Google Analytics also does this and tracks users across sites. The site is static HTML, so an external link or iframe of some sort is easiest for me to implement. I could switch to a Ruby or Node.js back-end, in which case lots of other options open up (like Ruby impressionist and more low-level implementations), but my hosting service is pretty limited. If the answer to my question is simply "no," what are my other options?
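
    If the hosting ever allows even plain PHP, a self-hosted counter sidesteps third-party tracking entirely. A minimal sketch (hypothetical file names, deliberately crude bot filter), embedded from the static pages via an iframe:

      <?php
      // counter.php: count page loads in a flat file; no cookies, no per-user data
      $file = __DIR__ . '/hits.txt';
      $count = file_exists($file) ? (int) file_get_contents($file) : 0;
      // skip user agents that identify themselves as crawlers
      $ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
      if (!preg_match('/bot|crawl|spider/i', $ua)) {
          file_put_contents($file, ++$count, LOCK_EX);
      }
      header('Content-Type: text/plain');
      echo $count;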

  • Does Fetch as Googlebot still support their ajax-crawling proposal?

    - by Gunchars
    I spent half a day implementing the server-side HTML generation for modal pages based on their proposal (link), but it seems like the Fetch as Googlebot functionality in Webmaster Tools completely ignores the URL fragment. I've verified that the _escaped_fragment_ functionality is working on my server (example), but when I submit a URL like /#!/recipes, the Googlebot just fetches /. There aren't any recent confirmations that it's working, and honestly, it wouldn't surprise me if they had silently dropped the functionality without even editing the docs.
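
    For reference, the mapping the proposal defines, i.e. what a fragment-aware fetch would be expected to request (example.com is a stand-in):

      pretty URL the user sees:    http://example.com/#!/recipes
      URL the crawler should GET:  http://example.com/?_escaped_fragment_=/recipes

    The fragment never reaches the server in the first form (browsers don't send it), which is exactly why the scheme rewrites it into a query parameter, and why a fetch tool that ignores the fragment requests just /.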

  • URL-rewriting on Plesk using ISAPI_rewrite3 Lite

    - by Anusha
    I am using a Plesk-managed Windows web server (Windows 2008 Server with IIS 6) for my e-commerce website. I want to rewrite the URLs for all dynamic pages, so I installed ISAPI_Rewrite 3 Lite on the web server and uploaded an .htaccess file with basic rules as follows:
      RewriteEngine on
      RewriteRule ^contact\.html$ contactus.php? [NC,R]
    I have never worked with ISAPI or URL rewriting before. My question is how to proceed after installation: should I upload an .htaccess or an httpd.conf file? The software also has an ISAPI_Rewrite Manager that lets you edit httpd.conf; should I write the rules there? I have tried all of these steps, but unfortunately none of them worked. Any quick solution would be appreciated.
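
    One detail worth verifying against Helicon's documentation: as far as I know, the Lite edition only reads the single global httpd.conf and ignores per-directory .htaccess files, which would explain the rules having no effect. A sketch of the same rule placed in httpd.conf via the ISAPI_Rewrite Manager:

      RewriteEngine on
      RewriteBase /
      RewriteRule ^contact\.html$ contactus.php [NC,R]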

  • setting up freedns with an existing domain

    - by romeovs
    I've been running a web server off a PC at a static IP successfully for the past 5 months. Recently, however, I moved into another apartment, and my new ISP only provides a dynamic IP (my IP changes from time to time). I'm not an internet genius, but I thought I could fix this by using a dynamic DNS provider, so I got on the web and found FreeDNS (afraid.org). I'm a bit confused about how to set everything up, though. I've managed to install the IP updater daemon on my web server. Then, in my registrar's control panel, I set the NS records to point at ns1 through ns4.afraid.org (removing the old NS records). I'm not certain what I should do with the A records, though (for now they still point to the old static IP address). I have A records for www, blog, irc, etc., but I cannot point them at my new IP address because it isn't static. Could someone explain this as clearly as possible (perhaps elaborating on what happens at each step of the DNS process)? I never really knew what the A records were for anyway. (Note that I haven't really found any documentation on the FreeDNS website, or on Google.)
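
    A sketch of how the pieces fit together once the NS records delegate the zone to afraid.org (hypothetical names, documentation IP range):

      ; zone for yourdomain.com, now served by ns1-ns4.afraid.org
      www    A    203.0.113.42    ; created once in the FreeDNS panel...
      blog   A    203.0.113.42    ; ...the updater daemon then rewrites
      irc    A    203.0.113.42    ; these whenever your dynamic IP changes

    An A record simply maps a hostname to an IP address. Since the registrar now delegates the domain to afraid.org, resolvers ask the afraid.org nameservers for those records; the stale A records left in the registrar's panel are ignored once the delegation takes effect.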
