Search Results

Search found 12701 results on 509 pages for 'fulltext index'.


  • How to purge old links in Google from an old domain.

    - by jbcurtin
    Hey all, recently I uploaded a new site to an existing domain, and I'd like to figure out how I can forward all links from that domain to a new domain. I'm looking for a WordPress solution if possible, but otherwise I see myself writing a small header script saying header('Location:http://xxx.yyy.zzz') and pasting it into every directory's index file. Is there a cleaner way to do this without having to manage the whole file structure (see the sketch below)? No, I do not have access to the Apache runtime; unfortunately it is a shared-host server. Thanks in advance.

    Read the article
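
    A single .htaccess file at the old domain's document root is the usual cleaner route, and it replaces the per-directory header() scripts. A minimal sketch, assuming the shared host honors .htaccess with mod_rewrite enabled (most do, even without access to the main Apache config) and with the target domain as a placeholder:

      # Write a catch-all rewrite into the old site's web root:
      printf '%s\n' \
        'RewriteEngine On' \
        'RewriteRule ^(.*)$ http://newdomain.example/$1 [R=301,L]' \
        > .htaccess

    The 301 status also tells search engines to replace the old URLs with the new ones in their index, which is what purges the old links over time.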

  • Will a rel=canonical link pointing to a 301 redirect pass less pagerank than one without a 301?

    - by tobek
    On this official Google page about canonical links it says: Can rel="canonical" be a redirect? Yes, you can specify a URL that redirects as a canonical URL. Google will then process the redirect as usual and try to index it. There is no mention that this might dilute the impact of the canonical link. However, Google has made clear elsewhere that 301 redirects do dilute PageRank - roughly as much as a link dilutes PageRank. Is that relevant here? I'm assuming the answer is "no" but I wanted to confirm. Relevant but not duplicate: Does Rel=Canonical Pass PR from Links or Just Fix Dup Content.

    Read the article

  • Why do I get a 403 error when accessing my apache server?

    - by nishan
    I'm running Ubuntu 12.04 LTS on a system with 2 GB RAM and a 500 GB HDD. My hard drive is partitioned as follows:

    Partition 1 = 40 GB Windows (NTFS, label = win32)
    Partition 2 = 320 GB Windows (FAT, label = common)
    Partition 3 = 40 GB Ubuntu (EXT4)

    I installed apache2. Then, to change its default www directory, I ran gksu gedit /etc/apache2/sites-enabled/000-default and, in the editor, changed the location to /media/common/www. After that I ran these commands in a terminal:

    chmod 777 /media/common/www
    chmod 777 /media/common/www/*.*

    Then I ran firefox 127.0.0.1/index.php and it said:

    Forbidden
    You don't have permission to access / on this server.
    Apache/2.2.22 (Ubuntu) Server at 127.0.0.1 Port 80

    Before my changes it was working fine. How can I run my websites? (See the sketch below.)

    Read the article
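
    Two usual culprits for a 403 after moving the docroot, sketched with hedges. First, Apache needs search (+x) permission on every directory above the docroot, and a FAT volume ignores chmod entirely, so permissions have to come from the mount options. Second, Apache 2.2 needs a <Directory> block granting access to the new location:

      # FAT ignores chmod; grant world access via the mount options instead:
      sudo mount -o remount,umask=022 /media/common

      # Grant access to the new docroot (Apache 2.2 syntax), then restart:
      printf '%s\n' \
        '<Directory /media/common/www>' \
        '    Options Indexes FollowSymLinks' \
        '    AllowOverride None' \
        '    Order allow,deny' \
        '    Allow from all' \
        '</Directory>' | sudo tee -a /etc/apache2/sites-enabled/000-default
      sudo service apache2 restart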

  • Dynamic website SEO development

    - by Pankaj Upadhyay
    I made a website which stayed online for 6 months. During that period the search results for the site were not at all good; even typing the domain name yielded just two or three category results. Now I have taken the site down for a total redevelopment and redesign. The aim of this question is to learn the SEO basics to apply while redesigning the site. My site will be in ASP.NET MVC 3 and will have main categories, subcategories, and sub-subcategories if any. Then there will be products in those categories. All the data will come from an MSSQL DB. Please tell me just the basics required for a dynamic website during development. I want to ensure that Google and other engines index all the pages of my site, including products.

    Read the article

  • Redirect packages directed to port 5000 to another port

    - by tdc
    I'm trying to use eboard to connect to the FICS servers (http://www.freechess.org), but it fails because port 5000 is blocked (company firewall). However, I can connect to the server through the telnet port (23):

    telnet freechess.org 23 (succeeds)
    telnet freechess.org 5000 (fails)

    Unfortunately the port number is hardcoded (see here: http://ubuntuforums.org/archive/index.php/t-1613075.html). I'd rather not have to hack the source code as the author of that thread ended up doing. Can I just forward the port on my local machine using iptables? I tried:

    sudo iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 5000 -j REDIRECT --to-port 23

    and

    sudo iptables -t nat -I OUTPUT --src 0/0 -p tcp --dport 5000 -j REDIRECT --to-ports 23

    but these didn't work (see the sketch below). Note that:

    $ sudo iptables -t nat -L
    Chain PREROUTING (policy ACCEPT)
    target     prot opt source       destination
    REDIRECT   tcp  --  anywhere     anywhere     tcp dpt:5000 redir ports 23

    Chain INPUT (policy ACCEPT)
    target     prot opt source       destination

    Chain OUTPUT (policy ACCEPT)
    target     prot opt source       destination
    REDIRECT   tcp  --  anywhere     anywhere     tcp dpt:5000 redir ports 23

    Chain POSTROUTING (policy ACCEPT)
    target     prot opt source       destination

    Read the article
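
    A hedged guess at why both rules fail: REDIRECT rewrites the destination address to the local machine, so the rules above point eboard at localhost:23 (where nothing, or the wrong service, is listening) rather than at freechess.org:23. Also, locally generated packets traverse the nat OUTPUT chain, never PREROUTING. A DNAT rule in OUTPUT keeps the destination host and rewrites only the port. Untested sketch, assuming a single A record for freechess.org:

      # Resolve the FICS address once, then rewrite 5000 -> 23 for that host:
      FICS_IP=$(getent hosts freechess.org | awk '{print $1; exit}')
      sudo iptables -t nat -A OUTPUT -p tcp -d "$FICS_IP" --dport 5000 \
           -j DNAT --to-destination "$FICS_IP":23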

  • Google Analytics - bad experiences? (esp. adult content)

    - by Litso
    Hello all, I work for a rather large adult website, and we're currently not using Google Analytics. There is an internal debate going on about whether we should start using Analytics, but there is hesitation from certain parties. The main argument is the fear that Google will get too much insight into our website and might even block us from the index as a result, based on our adult content. Has anyone here ever had such an experience, or heard of bad experiences with Google Analytics in this regard? I personally think it would only improve our website if we were able to use Analytics, but the dev team was asked to look into possible negative effects. Any help would be appreciated.

    Read the article

  • Do WordPress websites get indexed quicker by search engines than a regular website?

    - by guisasso
    I registered a couple of domains with the names of categories of products we sell. I then installed WordPress on one of those domains, played around with it for a bit, and left it alone for about a month. There was a link from my regular website to that secondary website, and that website was also registered in Google's Webmaster Tools, but that's that. I then searched on Google last week for that product category, and to my surprise, the secondary website showed up on the 2nd or 3rd page of results. Now my question is: do search engines index WordPress websites quicker? I had given up on using WordPress for that website, since it's so simple, but if I used it, would it give me better results? Thanks in advance for the help, if the question is not deleted.

    Read the article

  • "Failed to fetch" while updating

    - by Farouk BA
    I'm trying to update from Ubuntu 12.10 lately, but I keep getting the "Failed to fetch" error:

    W: Failed to fetch http://security.ubuntu.com/ubuntu/dists/quantal-security/Release  Unable to find expected entry 'independent/binary-amd64/Packages' in Release file (Wrong sources.list entry or malformed file)
    W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/quantal/Release  Unable to find expected entry 'independent/source/Sources' in Release file (Wrong sources.list entry or malformed file)
    W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/quantal-updates/Release  Unable to find expected entry 'independent/binary-amd64/Packages' in Release file (Wrong sources.list entry or malformed file)
    W: Failed to fetch http://archive.ubuntu.com/ubuntu/dists/quantal-backports/Release  Unable to find expected entry 'independent/binary-amd64/Packages' in Release file (Wrong sources.list entry or malformed file)
    E: Some index files failed to download. They have been ignored, or old ones used instead.

    I changed the server and deleted the source lists from /var/lib/apt/lists/ like some answers say, but still no luck. This is really annoying. (See the sketch below.)

    Read the article
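
    The errors all point at a component named 'independent', which is not a real Ubuntu component (those are main, restricted, universe, multiverse), so apt looks for dists/quantal/independent/... on the mirror and fails. A sketch of the fix; the substitution target below is a stand-in and should be whatever component the line was actually meant to name:

      # Find the offending entry:
      grep -rn independent /etc/apt/sources.list /etc/apt/sources.list.d/
      # Replace it with a real component name, then refresh the index:
      sudo sed -i 's/\bindependent\b/universe/g' /etc/apt/sources.list
      sudo apt-get update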

  • 11.10 - Update Manager Not working

    - by Mattlinux1
    W: Failed to fetch cdrom://Ubuntu 11.10 Oneiric Ocelot - Release i386 (20111012)/dists/oneiric/main/binary-i386/Packages  Please use apt-cdrom to make this CD-ROM recognized by APT. apt-get update cannot be used to add new CD-ROMs
    W: Failed to fetch cdrom://Ubuntu 11.10 Oneiric Ocelot - Release i386 (20111012)/dists/oneiric/restricted/binary-i386/Packages  Please use apt-cdrom to make this CD-ROM recognized by APT. apt-get update cannot be used to add new CD-ROMs
    E: Some index files failed to download. They have been ignored, or old ones used instead.

    This happens when I hit the Check button; updates were working before. (See the sketch below.)

    Read the article
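
    APT is still configured to pull packages from the installation CD, which is no longer in the drive. A minimal sketch: comment the cdrom: entries out of sources.list so APT only uses the network mirrors, then refresh:

      sudo sed -i '/^deb cdrom:/s/^/# /' /etc/apt/sources.list
      sudo apt-get update

    The same can be done graphically from Software Sources by unticking the CD-ROM entry on the "Other Software" tab.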

  • 12.04 sound keeps auto-muting when idle

    - by fali
    I just installed 12.04 on an HP 8510w. Everything works fine except for one weird behavior I have noticed: whenever no audio is playing, the mute indicator on the laptop is on. As soon as I start playing a YouTube video, the mute indicator turns off and I get sound. Here is my PulseAudio output, which says that the sink is suspended because it is idle:

    Welcome to PulseAudio! Use "help" for usage information.
    list-sinks
    1 sink(s) available.
      index: 0
        name: <alsa_output.pci-0000_00_1b.0.analog-stereo>
        driver: <module-alsa-card.c>
        flags: HARDWARE HW_MUTE_CTRL HW_VOLUME_CTRL DECIBEL_VOLUME LATENCY DYNAMIC_LATENCY
        state: SUSPENDED
        suspend cause: IDLE

    I tried running alsamixer, but I don't see the auto-mute option. (See the sketch below.)

    Read the article
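
    The sink being suspended on idle points at PulseAudio's module-suspend-on-idle. A sketch, assuming that module is what trips the hardware mute on this laptop:

      # Unload it for the running session (pacmd accepts the module name):
      pacmd unload-module module-suspend-on-idle
      # To make it permanent, comment it out in the PulseAudio config:
      sudo sed -i 's/^load-module module-suspend-on-idle/#&/' /etc/pulse/default.pa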

  • Is it OK to have 2 sitemaps on 1 website?

    - by user615041
    Do I have to link a sitemap from my index page for bots to read it, or can I just have it anywhere on my server? I have a phpBB/WordPress integration and I need two sitemap mods, one for each (or I need to have them somehow integrated together into one XML sitemap). Is this possible? What's my best option? I would have the phpBB one at something like http://www.example.com/phpbb/sitemap.html and the WordPress one at something like http://www.example.com/wordpress/sitemap.html, and then I would submit both, but not put the links in my footer to confuse anyone; the sitemaps would strictly be for search engines. Is this a good idea? What are your thoughts? (See the sketch below.)

    Read the article
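
    Sitemaps don't need to be linked from any page; crawlers find them through a Sitemap: line in robots.txt or through Webmaster Tools submission, and the sitemaps.org protocol has a sitemap index format made for exactly this case. A sketch, assuming each mod can emit XML rather than the .html URLs above:

      # One index file referencing both sitemaps:
      printf '%s\n' \
        '<?xml version="1.0" encoding="UTF-8"?>' \
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' \
        '  <sitemap><loc>http://www.example.com/phpbb/sitemap.xml</loc></sitemap>' \
        '  <sitemap><loc>http://www.example.com/wordpress/sitemap.xml</loc></sitemap>' \
        '</sitemapindex>' > sitemap.xml
      # Advertise it to crawlers:
      printf 'Sitemap: http://www.example.com/sitemap.xml\n' >> robots.txt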

  • How to recommend that Google indexes some keywords?

    - by Werewolf
    I've read many articles about SEO and tried to implement that knowledge on a site, but I haven't gotten good results in 6 months. E.g., I've used Google Webmaster Tools, sitemaps, title tags, keywords in paragraphs, etc. My Alexa rank is growing, but Google picked up some keywords that aren't my goal :-(. Is there a good way to focus search engines on a keyword? How can I encourage Google to index the keywords I want? (They are present in my pages.)

    Read the article

  • Flash site loads slowly

    - by bogdanvursu
    I have a simple HTML page that embeds an SWF, which downloads other XML, SWF, and image files; the total count of requests reaches about 90. I am aware that it should take a while until the content is available, and I am OK with that. All the needed files are hosted by two different providers in the US: flashxml.net/monochrome-demo.html and u1.flashcomponents.net/samples/8751/index.html. From two different countries in Europe, the content shows up a lot later (almost twice as late) from flashxml than from flashcomponents. I've done mtr tests: the ping difference is about 40 ms and the flashxml server load is below 1. Do you have any other suggestions as to what I should look at? (See the sketch below.)

    Read the article
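
    One more thing worth measuring is where the time per request goes, not just the ping. A sketch using curl's built-in timing variables against one of the hosts above:

      curl -s -o /dev/null \
           -w 'dns:%{time_namelookup}s connect:%{time_connect}s total:%{time_total}s\n' \
           http://flashxml.net/monochrome-demo.html

    Run it against a few of the 90 asset URLs from each provider; a large gap between connect and total points at slow transfer rather than network latency.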

  • Not getting updates

    - by gknarayana
    When I check for updates, the message is:

    W: Failed to fetch cdrom://Ubuntu 12.04 LTS _Precise Pangolin_ - Release i386 (20120423)/dists/precise/main/binary-i386/Packages  Please use apt-cdrom to make this CD-ROM recognized by APT. apt-get update cannot be used to add new CD-ROMs
    W: Failed to fetch cdrom://Ubuntu 12.04 LTS _Precise Pangolin_ - Release i386 (20120423)/dists/precise/restricted/binary-i386/Packages  Please use apt-cdrom to make this CD-ROM recognized by APT. apt-get update cannot be used to add new CD-ROMs
    W: Failed to fetch cdrom://Ubuntu 11.10 _Oneiric Ocelot_ - Release i386 (20111012)/dists/oneiric/main/binary-i386/Packages  Please use apt-cdrom to make this CD-ROM recognized by APT. apt-get update cannot be used to add new CD-ROMs
    W: Failed to fetch cdrom://Ubuntu 11.10 _Oneiric Ocelot_ - Release i386 (20111012)/dists/oneiric/restricted/binary-i386/Packages  Please use apt-cdrom to make this CD-ROM recognized by APT. apt-get update cannot be used to add new CD-ROMs
    E: Some index files failed to download. They have been ignored, or old ones used instead.

    Please suggest what I should do.

    Read the article

  • Duplicate pages

    - by Mert
    I made a small coding mistake and Google indexed my site wrongly. This is the correct form: https://www.foo.com/urunler/171/TENGA-CUP-DOUBLE-HOLE but Google indexed my site like this: https://www.foo.com/urunler/171/cart.aspx. First I fixed the problem and made a sitemap with only the correct links in it. Now I checked Webmaster Tools and I see this: Total indexed 513, Not selected 544, Blocked by robots 0. So I think the duplicate indexing is what makes my pages show up as "not selected". I want to know how to fix the "https://www.foo.com/urunler/171/cart.aspx" links. Should I fix this in code, or ask Google to reindex my site? If I should redirect the wrong/duplicate links to the correct ones, what should the way be? Thanks for your time in advance.

    Read the article

  • How to direct a Network Solutions domain name to an html website hosted on Google Drive? [on hold]

    - by Air Conditioner
    To begin with, I wanted to take advantage of HTML, CSS, and so on to build a website that looks and works just as I'd like it to. I took a look around at how I could make that work, and I soon saw a Lifehacker article showing that it's possible to host website files on Google Drive. I then made sure that the folder containing the files was shared publicly, and I now have a working Google-Drive-hosted domain for the website. However, I did want the custom domain, and so I registered one with Network Solutions. So now I'm curious how I should direct my Network Solutions domain to the index.html I'm hosting on Google Drive. Would anyone have an idea?

    Read the article

  • Wordpress Multisite Network installation and dev questions

    - by Daitya
    Please go easy on me, I'm a klutzy dinosaur. I currently have a large, unwieldy website hand-coded in HTML/CSS with PHP includes, and it currently has a single WP installation in a subdirectory. The plan is to reorganize: I want to use WP as the CMS and incorporate 3 WP blogs for 3 subdomains. Ideally, I would like to create a WP multisite network to allow for further expansion and to save admin trouble. I just want to confirm: if I install WP in the root directory and create 3 blogs (in subdomains), does this mean my website's home page is the mother blog's index.php? Essentially, I will have created 4 blogs - the mother at root and 3 children in subdomains? How do I set this up on my Mac (OS X 10.5.8) running MAMP for development, and then migrate to the server without breaking anything? (See the sketch below.)

    Read the article
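
    Yes: in a multisite network installed at the root, the main (mother) site is the network's first blog and serves the home page. A sketch of the wp-config.php constants a subdomain network ends up with; these are the standard ones WordPress' Tools > Network Setup screen tells you to add, with example.com as a placeholder:

      # Lines WordPress asks for in wp-config.php, above the "stop editing" line:
      printf '%s\n' \
        "define('WP_ALLOW_MULTISITE', true);" \
        "define('MULTISITE', true);" \
        "define('SUBDOMAIN_INSTALL', true);" \
        "define('DOMAIN_CURRENT_SITE', 'example.com');" \
        "define('PATH_CURRENT_SITE', '/');" \
        "define('SITE_ID_CURRENT_SITE', 1);" \
        "define('BLOG_ID_CURRENT_SITE', 1);"

    One caveat for the MAMP side: a subdomain install needs wildcard DNS, which local MAMP setups usually lack, so a common workaround is developing locally as a subdirectory network (SUBDOMAIN_INSTALL false) and switching on the server. And since WordPress stores absolute URLs in the database, migrating means search-and-replacing the domain in the exported SQL.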

  • Join Us at Oracle OpenWorld Latin America (Dec 4-6)

    - by Zeynep Koch
    Hello to all Latin Americans, Oracle OpenWorld Latin America is starting tomorrow. Oracle Linux will be showcased in several sessions and in the exhibition area. Here are some of the links and details for our sessions:

    Session schedule: http://www.oracle.com/openworld/lad-en/session-schedule/index.html

    Oracle Linux sessions:
    New Features in Oracle Linux: A Technical Deep Dive, Dec 4, 13:30-14:30, Mezzanine Room 7
    Oracle Linux Strategy and Roadmap, Dec 4, 17:15-18:15, Mezzanine Room 5

    Oracle OpenWorld Latin America exhibition hall hours:
    Tuesday, December 4: 12:00–19:30 (18:15–19:30 dedicated hours)
    Wednesday, December 5: 11:00–19:30 (18:30–19:30 dedicated hours)
    Thursday, December 6: 11:00–19:00 (17:45–19:00 dedicated hours)

    We will also hand out the following at our booth, so don't forget to visit us:
    - Oracle Linux and Oracle VM DVD Kit
    - Server Virtualization for Dummies

    See you there :)

    Read the article

  • T-SQL Tuesday: What kind of Bookmark are you using?

    - by Kalen Delaney
    I’m glad there is no minimum length requirement for T-SQL Tuesday blog posts, because this one will be short. I was in the classroom for almost 11 hours today, and I need to be back tomorrow morning at 7:30. Way back in SQL 2000 (or was it earlier?), when a query plan indicated that SQL Server was going to use a nonclustered index to get row pointers and then look up those rows in the underlying table, the plan just had a very linear look to it. The operator that indicated going from the nonclustered...(read more)

    Read the article

  • What is the proper way to create a cross-fade effect? [closed]

    - by Starx
    When creating an image slider, a cross fade is one of the more popular effects, and various sliders use differing techniques to achieve it. Two techniques I've found so far are:

    1. Use an overlay and underlay <div> and fade each one's visibility in and out.
    2. Create a <div> matching the exact size of the slider during initialization, play with its z-index property, and then fade between them.

    Is there a better way to create this effect?

    Read the article

  • What measures can be taken to make sure Google is aware of the existence of a newly created page?

    - by knorv
    Consider a website with a large number of pages, where new pages are published regularly. When publishing a new page, the website operator wants the newly created page indexed by Google as soon as possible, minimizing the time between publication and indexing. Consider the site http://www.example.com/ with hundreds of thousands of pages. The page http://www.example.com/something/important-page.html is created at, say, 12:00. I want important-page.html indexed as soon as possible after 12:00, ideally within seconds or minutes. What options are available to try to get Google to index a specific newly created page as soon as possible? (See the sketch below.)

    Read the article
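
    The standard lever is a sitemap that carries the new URL, plus a ping so Google re-fetches the sitemap immediately instead of on its own schedule. A sketch, assuming the sitemap generator already appends new pages as they are published:

      # Tell Google the sitemap changed (Google's documented sitemap ping endpoint):
      curl "http://www.google.com/ping?sitemap=http://www.example.com/sitemap.xml"

    Beyond that, linking the new page from a frequently crawled page (such as the home page) is what usually gets it fetched within minutes.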

  • Hosting a magnet link site which could possibly infringe copyrighted material?

    - by Griff
    I have spent the last 3 months building a crawler, an indexer, and a lot of other things for what started out as a home project for indexing magnet links on the internet. As my project grew, I have thought about releasing my collected data (which at the minute is on a public domain, but with no access) to the public. Whatever the crawler sucks in goes in, and whatever the indexer decides to index gets indexed, as it is a fully automated process. My question is as follows: considering that most of the data collected points to illegal copyrighted material (as most magnet links do), where would it be best to host such a site? I notice all of the already-public torrent sites are hosted in India. Is this because their laws are less strict on copyright infringement? Have any of you hosted such a site, and if so, what problems have you run into? And as always, any advice on being a webmaster for this type of website?

    Read the article

  • Git doesn't sync files until committed, even if checked out in a different branch

    - by DertWaiter
    Okay, I have Git 1.7.11.1 on Windows and a local test repository with 2 branches. One is master, with index.php and help.php. I then create another branch called slave :) From Git Bash I run rm help.php and it disappears from the folder, but I don't stage anything. I then check out the master branch, which is supposed to restore help.php because it is not modified in the master branch, isn't it? But it does not. When I go back to the slave branch, commit, and then check out master, help.php appears. Is that the way it is supposed to work? Why? (See the sketch below.)

    Read the article
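
    That is the intended behavior: switching branches carries uncommitted working-tree changes along with it, and an unstaged rm is just such a change. Git will not silently overwrite it, so checking out master leaves help.php missing until the deletion is committed on slave (after which master's checkout really does restore the file). A sketch reproducing and undoing it, assuming a repo where both branches contain help.php:

      git checkout slave
      rm help.php               # working-tree deletion, not staged
      git checkout master       # the unstaged deletion travels with you
      git checkout -- help.php  # restore the file from master's index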

  • "Unable to connect" to getdeb.net, how do I fix it?

    - by Nirmik
    I want to know what this error means and how to fix it. The following is the output in the terminal:

    W: Failed to fetch http://archive.getdeb.net/ubuntu/dists/precise-getdeb/Release.gpg  Unable to connect to archive.getdeb.net:http:
    W: Failed to fetch http://archive.getdeb.net/ubuntu/dists/precise-getdeb/apps/binary-i386/Packages  Unable to connect to archive.getdeb.net:http:
    W: Failed to fetch http://archive.getdeb.net/ubuntu/dists/precise-getdeb/apps/i18n/Translation-en_IN  Unable to connect to archive.getdeb.net:http:
    W: Failed to fetch http://archive.getdeb.net/ubuntu/dists/precise-getdeb/apps/i18n/Translation-en  Unable to connect to archive.getdeb.net:http:
    E: Some index files failed to download. They have been ignored, or old ones used instead.

    (See the sketch below.)

    Read the article
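
    "Unable to connect" means the getdeb.net mirror itself is unreachable, whether a server outage or a network block, not a problem with the local configuration. A sketch of how to check, and how to quiet apt-get update in the meantime by disabling the source (file paths are hypothetical):

      # Is the mirror reachable at all?
      ping -c 3 archive.getdeb.net
      # If not, disable its source entries until it comes back:
      sudo sed -i '/getdeb/s/^deb/# deb/' /etc/apt/sources.list /etc/apt/sources.list.d/*.list
      sudo apt-get update

    Re-enable the lines (remove the leading "# ") once the mirror responds again.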
