Search Results

Search found 53597 results on 2144 pages for 'http requests'.

Page 458 of 2144

  • Will this sitemap get me de-indexed from Google?

    - by heavy rocker dude
    My site's URL is http://quantlabs.net/private/sitemap.xml. My new sitemap just got spidered by Google, and I'm wondering whether it puts me in danger of being dropped from Google's index. Does it look like spam even though it is not meant to be? I am trying to figure out the threshold at which Google deems a sitemap spammy. This sitemap contains automated postings that differ only by the stock symbol provided, and it lists quite a few postings generated in a short amount of time.
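
    For reference, a sitemap of this kind is just a list of <url> entries, one per posting; a rough sketch of its general shape is below (the post URLs and dates are made-up placeholders, not the real entries):

      <?xml version="1.0" encoding="UTF-8"?>
      <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
        <url>
          <loc>http://quantlabs.net/private/post-AAPL</loc> <!-- placeholder URL -->
          <lastmod>2012-01-15</lastmod>
        </url>
        <url>
          <loc>http://quantlabs.net/private/post-GOOG</loc> <!-- placeholder URL -->
          <lastmod>2012-01-15</lastmod>
        </url>
      </urlset>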

    Read the article

  • SEO disasters moving domain for a high traffic website?

    - by chrism2671
    We're looking at moving our website from http://www.wikijob.co.uk to http://www.wikijob.com/uk as we spread our wings internationally. Our .co.uk website has a PR6 and receives around half a million visitors a month, 40% of them international. The wikijob.com domain, while registered for a while, has not been used or promoted. I am concerned that moving domain could really haemorrhage our traffic and result in a loss of goodwill from Google, even if we use a 301; but equally, if we could transfer that PageRank to the .com domain, it would give us a massive head start around the world. Should we do it, or should we start over with .com and leave .co.uk as is?
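
    If the move does go ahead, the usual mechanism is a blanket 301 from every .co.uk URL to its new counterpart. A minimal sketch, assuming Apache with mod_rewrite on the .co.uk server (not a drop-in rule, just the general shape):

      RewriteEngine On
      RewriteCond %{HTTP_HOST} ^www\.wikijob\.co\.uk$ [NC]
      RewriteRule ^(.*)$ http://www.wikijob.com/uk/$1 [R=301,L]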

    Read the article

  • Is adding the license type in the header enough to say "my code is licensed"? (open source)

    - by silverfox
    I am not sure whether this is the correct Stack Exchange site to ask this on. (Note: if a moderator can move it to the appropriate site, please do.) I have read about licenses on various sites, and I simply put the license type in the header of the file (in my case a JavaScript file, open source):

      /*
       * "codeName" "version"
       * http://officialsite.com/
       *
       * Copyright 2012 "codeName"
       * Released under the "LICENSE NAME" license
       * http://officialsite.com/LICENSE NAME
       */
      ... javascript code ...

    In the same folder I leave a copy of the license, so the folder listing looks like this:

      codeName.js
      LICENSE

    The LICENSE file contains the text of the license my code uses. What nobody says is whether that is enough to say my code is licensed (in the open-source case), or whether something more is required. Thanks.

    Read the article

  • Is IE9 a modern browser?

    - by TATWORTH
    At http://people.mozilla.com/~prouget/ie9/ there is a very provocative article entitled "Is IE9 a modern browser?". There is a rebuttal by Tim Sneath at http://blogs.msdn.com/b/tims/archive/2011/02/15/a-modern-browser.aspx that is well worth a look. Certainly IE9 is already superior to its predecessors. My comment on the matter is that those who consider IE9 to be non-standards compliant should submit tests to the W3C to demonstrate the non-compliance. Upon acceptance by the W3C, all the competing browsers can then be re-tested. I prefer objective tests to subjective opinion. I have used IE9, and on some sites, such as Hotmail, it is noticeably faster. I have so far been unable to apply the promised IE9 lockout of spyware cookies. With Firefox, I just install NoScript and never enable spyware sites.

    Read the article

  • RewriteRule working local but not on remote server

    - by m0tv
    I have a .htaccess file with one simple RewriteRule:

      RewriteEngine on
      RewriteRule ^([A-Za-z0-9-]+)$ ?site=$1

    I want a URL like http://www.example.com/imprint to be forwarded to http://www.example.com/?site=imprint. I checked this rule with a RewriteRule tester, which gave me the result I want, and it works well on my local development system too. But on the remote server those URLs just give me a 404 error. Other, simpler rewrite rules work with no problems, so everything must be set up correctly (I think...). The problem is that I don't have access to any error logs or the server configs, so the only thing I can do is guess. Can anyone tell me if there's something wrong with this rule, anything else I can do or test to solve this, or what could be wrong on the server?

    Read the article

  • SEO tool is telling me title, description and keywords don't exist, but they do. Where is the problem?

    - by DaveDev
    I'm using the following tool to analyse how 'optimal' a site that I'm working on is for search engines: http://tools.seobook.com/general/spider-test/ I enter the URL for the site - http://ftmsuat.moneymate.com - into the search bar, and it returns a breakdown of the contents of the page. I'm a little confused by what I see, though. According to the results, the page doesn't have a title, description or keywords. But if you check the source of the page, those elements are definitely there. So I'm wondering: which is wrong, seobook.com or my page?
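
    For reference, the elements the spider tool is looking for are the standard head tags; a page it should pick up would contain something along these lines (the values here are illustrative, not taken from the site in question):

      <head>
        <title>Page title goes here</title>
        <meta name="description" content="Short summary of the page" />
        <meta name="keywords" content="keyword1, keyword2" />
      </head>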

    Read the article

  • What to do with my "unmounted drive"?

    - by Taylor Guistwite
    I recently followed the tutorial at http://www.ubuntu.com/download/ubuntu/download for installing Ubuntu Server onto my 1 TB Seagate external drive. I was planning to use it to install Ubuntu on my MacBook, and the instructions say to perform this step:

      Run diskutil unmountDisk /dev/diskN (replace N with the disk number from the last command; in the previous example, N would be 2)

    Now my hard drive prompts "The disk you inserted was not readable by this computer". Would I just run diskutil mountDisk /dev/diskN in order to be able to access all my files again? Here is a screenshot of the instructions I followed: http://i17.photobucket.com/albums/b97/hello_screamo/Screenshot2011-11-11at113914AM.png
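
    For reference, the mount counterpart of that command would look like the following (disk number 2 is only an example; check the actual number first):

      diskutil list                    # identify the external drive's disk number
      diskutil mountDisk /dev/disk2    # attempt to remount every mountable volume on that disk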

    Read the article

  • Patterns for a tree of persistent data with multiple storage options?

    - by Robin Winslow
    I have a real-world problem which I'll try to abstract into an illustrative example. Imagine I have data objects in a tree, where parent objects can access children, and children can access parents:

      // Interfaces
      interface IParent<TChild> { List<TChild> Children { get; set; } }
      interface IChild<TParent> { TParent Parent { get; set; } }

      // Classes
      class Top : IParent<Middle> {}
      class Middle : IParent<Bottom>, IChild<Top> {}
      class Bottom : IChild<Middle> {}

      // Usage
      var top = new Top();
      var middles = top.Children;              // List<Middle>
      foreach (var middle in middles)
      {
          var bottoms = middle.Children;       // List<Bottom>
          foreach (var bottom in bottoms)
          {
              var parent = bottom.Parent;      // Access the parent
              var grandparent = parent.Parent; // Access the grandparent
          }
      }

    All three data objects have properties that are persisted in two data stores (e.g. a database and a web service), and they need to reflect and synchronise with the stores. Some objects only request from the web service, some only write to it.

    Data Mapper

    My favourite pattern for data access is Data Mapper, because it completely separates the data objects themselves from the communication with the data store:

      class TopMapper {
          public Top FetchById(int id) {
              var top = new Top(DataStore.TopDataById(id));
              top.Children = MiddleMapper.FetchForTop(top);
              return top;
          }
      }

      class MiddleMapper {
          public Middle FetchById(int id) {
              var middle = new Middle(DataStore.MiddleDataById(id));
              middle.Parent = TopMapper.FetchForMiddle(middle);
              middle.Children = BottomMapper.FetchForMiddle(middle);
              return middle;
          }
      }

    This way I can have one mapper per data store, build the object from the mapper I want, and then save it back using the mapper I want. There is a circular reference here, but I guess that's not a problem because most languages can just store memory references to the objects, so there won't actually be infinite data. The problem with this is that every time I want to construct a new Top, Middle or Bottom, it needs to build the entire object tree within that object's Parent or Children property, with all the data store requests and memory usage that that entails. And in real life my tree is much bigger than the one represented here, so that's a problem.

    Requests in the object

    In this approach the objects request their Parents and Children themselves:

      class Middle {
          private List<Bottom> _children = null; // cache

          public List<Bottom> Children {
              get {
                  _children = _children ?? BottomMapper.FetchForMiddle(this);
                  return _children;
              }
              set {
                  BottomMapper.UpdateForMiddle(this, value);
                  _children = value;
              }
          }
      }

    I think this is an example of the repository pattern. Is that correct? This solution seems neat: the data only gets requested from the data store when you need it, and thereafter it's stored in the object if you want to request it again, avoiding a further request. However, I have two different data sources. There's a database, but there's also a web service, and I need to be able to create an object from the web service, save it back to the database, request it again from the database and update the web service. This also makes me uneasy because the data objects themselves are no longer ignorant of the data source. We've introduced a new dependency, not to mention a circular dependency, making it harder to test. And the objects now mask their communication with the database.

    Other solutions

    Are there any other solutions which could take care of the multiple-stores problem but also mean that I don't need to build / request all the data every time?

    Read the article

  • mod_rewrite for clean URL doesn't work

    - by deathlock
    Basically what I want to do is convert this:

      http://localhost/jariungu/user_caleg.php?idCaleg2014=3

    into this:

      http://localhost/jariungu/caleg/3

    I have managed to make /jariungu/caleg/3 direct to the original URL (as in, if I open that URL, it takes me to the appropriate page). The problem is that, once the page is opened, the URL in the address bar reverts to the original, ugly one. This is what I tried; could someone help?

      <IfModule mod_rewrite.c>
          Options +FollowSymlinks
          RewriteEngine On
          RewriteBase /jariungu/
          RewriteRule ^caleg\/([0-9]+)\/([a-zA-Z]+\s*[0-9]*)/?$ caleg.php?idCaleg2014=$1&namaCaleg=$2 [NC,L]
          RewriteRule ^caleg\/([0-9]+)/?$ caleg.php?idCaleg2014=$1 [NC,L]
      </IfModule>

    Read the article

  • Problem connecting to Webmin

    - by railguage48
    I have installed Webmin with a view to trying to understand everything that is running. Yesterday I had it set up with a login and password, but today when I try to reach the server at https://ubuntu:10000/ in order to log in, I get "unable to connect" and the page does not load; it seems there is no connection. I tried http://localhost:10000 and https://localhost:10000 and they both return the same "unable to connect" response. I am not sure what it means to turn https on. ... the https and http is not showing. Am I going about this the wrong way?
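
    A couple of quick checks that might narrow this down (a sketch, assuming a standard Webmin install on Ubuntu): whether the service is running and listening on port 10000, and whether SSL is enabled in its config, which determines whether https:// or http:// is required:

      sudo /etc/init.d/webmin status          # is the Webmin service running?
      sudo netstat -tlnp | grep 10000         # is anything listening on port 10000?
      grep ^ssl= /etc/webmin/miniserv.conf    # ssl=1 means use https://, ssl=0 means http://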

    Read the article

  • Recommend an open source CMS for single page web site

    - by RedMan
    Hi, I want to create a single-page web site like http://kiskolabs.com/ or http://www.carat.se to display my portfolio. I want to add new products after launching the site without having to edit the entire site. I've looked at OpenCart (too much for a single-page site), Magento (more for e-commerce) and WordPress (couldn't find open source / free templates I could start from). Can you suggest a CMS which will support the creation of a single-page site and allow insertion of new products without having to edit the entire page? I would prefer a CMS which also has open source / free templates that I can tweak for my use. I can do PHP, MySQL and XML. If it is an easier option I can do PSD to site (but I don't know much about this at all).

    Read the article

  • Community Events in Köln (October) and Copenhagen (November) #ssas #tabular #powerpivot

    - by Marco Russo (SQLBI)
    A short update about community events in Europe where I will speak. On October 11 I will present "DAX in Action" in Köln; all details are on the PASS local chapter page here: http://www.sqlpass.de/Regionen/Deutschland/K%C3%B6lnBonnD%C3%BCsseldorf.aspx I will also be speaking at a community event in Copenhagen on November 21, 2012. The session will be "Excel 2013 PowerPivot in Action" and details about time and location are available here: http://msbip.dk/events/30/msbip-mode-nr-9/ I will be in Köln and Copenhagen to teach the SSAS Tabular Workshop. The workshop in Köln is the first in Germany and I look forward to meeting new BI developers there. Copenhagen is the second edition, after one we delivered this spring. It is a convenient location also for people coming from Malmoe and Göteborg in Sweden. The last event in Copenhagen conflicted with a large event in Sweden; maybe this time I'll meet more people coming from the other side of the Øresund Bridge! Many other dates and locations are available on the SSAS Tabular Workshop website.

    Read the article

  • Should the English website use hreflang="x-default" when it doesn't auto-redirect to the user's language or country?

    - by Noam
    For each URL on my site, I auto-redirect according to the Accept-Language header. The site architecture is:

      English version: http://mydomain.com/page
      Spanish version: http://es.mydomain.com/page
      etc.

    The English version is displayed unless the header contains a specific language other than en that I support, in which case a redirect occurs. Google says this: "For language/country selectors or auto-redirecting homepages, you should add an annotation for the hreflang value "x-default" as well." My pages aren't language selectors, nor are they the homepage, but I am auto-redirecting. My question is: should my English version be hreflang="x-default" and/or hreflang="en"?
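
    A sketch of what the annotations could look like if the English URL is marked as both the en version and the fallback (domains as in the example above):

      <link rel="alternate" hreflang="en" href="http://mydomain.com/page" />
      <link rel="alternate" hreflang="es" href="http://es.mydomain.com/page" />
      <link rel="alternate" hreflang="x-default" href="http://mydomain.com/page" />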

    Read the article

  • error: you need to load kernel first

    - by Angelos318
    I made a clean install of Ubuntu 11.10 on my Sony Vaio laptop. When the installation was ready, it prompted me to remove the USB stick I was installing the distro from and press Enter to reboot. After this reboot the first thing I got was the following error:

      error: couldn't read file
      error: you need to load the kernel first
      Press any key to continue..

    After that it throws me back to the GRUB select screen:

      Ubuntu, with Linux 3.0.0-14-generic-pae
      recovery mode
      previous Linux versions (none, since I made a clean install)
      memory test

    If I choose the first option it shows only a black screen and never loads anything, and if I reboot the same thing happens. Could I repair this using boot-repair? Is there any other way? Note: I know nothing about Linux, so I am a total noob on this one. Update: boot-repair did not help. Grub.cfg here: http://pastebin.com/GKLuDuhM Boot Info Script: http://pastebin.com/indARkKJ

    Read the article

  • Removing surrounding noises from voice recording

    - by Peak Reconstruction Wavelength
    I have a wave file whose frequency spectrum looks like this: http://i.stack.imgur.com/2rRaS.png It contains audio which I want to keep while removing the rest. The problem is that the surrounding noise changes; just those distinct voice patterns remain. I marked the voice patterns for clarity: http://i.stack.imgur.com/eLkBl.png What could an algorithm look like, or a workflow in Adobe Audition, that removes everything but the voice patterns? I think the main characteristic is the line-shaped form over time. Loudness alone is not enough, as the noise is loud as well.
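
    One common family of approaches is spectral gating: estimate a per-frequency noise floor from a noise-only stretch of the recording, then attenuate every time-frequency bin that does not rise clearly above that floor. A rough Python sketch, not a full solution; the file name, the 0.5 s noise window and the thresholds are all assumptions:

      import numpy as np
      from scipy.io import wavfile
      from scipy.signal import stft, istft

      fs, x = wavfile.read("recording.wav")        # file name is a placeholder
      x = x.astype(np.float64)
      if x.ndim > 1:                               # mix down to mono if needed
          x = x.mean(axis=1)

      nperseg = 1024
      f, t, Z = stft(x, fs=fs, nperseg=nperseg)

      # Assume the first 0.5 s contains only the surrounding noise.
      noise_frames = t < 0.5
      noise_floor = np.abs(Z[:, noise_frames]).mean(axis=1, keepdims=True)

      # Keep bins that exceed the noise floor by a margin; attenuate the rest.
      margin = 2.0
      mask = np.abs(Z) > margin * noise_floor
      Z_clean = Z * np.where(mask, 1.0, 0.1)       # soft gate: attenuate instead of hard zero

      _, y = istft(Z_clean, fs=fs, nperseg=nperseg)
      wavfile.write("cleaned.wav", fs, y.astype(np.int16))

    In Adobe Audition the closest built-in equivalent is capturing a noise print from a noise-only selection and then running noise reduction over the whole file; the code above is only meant to show the underlying idea.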

    Read the article

  • Using curl to download a file in parts from different interfaces and run the commands simultaneously from a script

    - by jsjain007
    I wanted to download a file using curl simultaneously in different parts, using IP aliasing (virtual Ethernet ports). What I did was paste the commands into a text file and run it, but the obvious problem is that, being a script, the commands are executed one by one. Is there a way to run these commands simultaneously? Here are the commands:

      curl --interface eth0:0 --range 0,38010880 http://wdl.cache.ijinshan.com/wps/download/Linux/unstable/kingsoft-office_9.1.0.4244~a12p3_i386.deb -o kinsoft-office.part1
      curl --interface eth0:1 --range 38010880,- http://wdl.cache.ijinshan.com/wps/download/Linux/unstable/kingsoft-office_9.1.0.4244~a12p3_i386.deb -o kinsoft-office.part2
      cat kinsoft-office.part* > kinsoft-office

    Can anyone help me run the two curl commands above simultaneously from the script, so as to increase the download speed?
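
    A minimal sketch of one way to do this in a bash script: start each curl in the background with & and use wait so the concatenation only runs after both parts have finished (the byte ranges below use curl's usual start-end form, which may differ slightly from the ranges above):

      #!/bin/bash
      URL="http://wdl.cache.ijinshan.com/wps/download/Linux/unstable/kingsoft-office_9.1.0.4244~a12p3_i386.deb"

      curl --interface eth0:0 --range 0-38010879 "$URL" -o kinsoft-office.part1 &
      curl --interface eth0:1 --range 38010880-  "$URL" -o kinsoft-office.part2 &
      wait    # block until both background downloads have exited

      cat kinsoft-office.part1 kinsoft-office.part2 > kinsoft-office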

    Read the article

  • What percent of visitors should click on the next page before you enable prefetching?

    - by Kevin Burke
    Mozilla Firefox and Google Chrome support prefetching via an HTML tag:

      <!-- in Chrome -->
      <link rel="prerender" href="http://example.org/index.html">

    I suppose it is always worthwhile to include this tag if 100% of users on a page click on the "Next Page" button or similar, and never worthwhile to include it if only 2% or 3% of users visit the following page. At what percent of clicks should you turn on prefetching of the next page? 65%? Also, does the calculus change if the current page is HTTP and the next page is HTTPS?
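
    For reference, the two variants usually cited are slightly different tags (a sketch; the URL is a placeholder):

      <!-- Firefox: hint to fetch the resource ahead of time -->
      <link rel="prefetch" href="http://example.org/index.html">

      <!-- Chrome: hint to fully prerender the page in the background -->
      <link rel="prerender" href="http://example.org/index.html">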

    Read the article

  • Sponsored Giveaway: Free Copies of WinX DVD Ripper Platinum for All How-To Geek Readers

    - by The Geek
    Have you ever wanted to watch a movie on your iPad, iPhone, Android tablet, or even your computer… without having to pay to download it from iTunes? You can easily convert DVDs to digital formats using WinX DVD Ripper Platinum, and we’re giving away free copies to all How-To Geek readers. To get your free copy, just click through the following link to download and get the license code, as long as you download it by November 27th. For Windows users: http://winxdvd.com/giveaway/ For Mac users: http://www.macxdvd.com/giveaway/giveaway.htm

    Read the article

  • Can't ping external websites

    - by Frantumn
    I can't ping google.com from my virtual Ubuntu 12.04 server. I have set up a proxy URL in my /etc/apt/apt.conf file and it says:

      Aquire::http::proxy http://urlname.com:9999;

    Now, I don't know a lot about how the proxy works, but I do know that when we use it on Windows virtual machines it's a .pac script that we place in Internet Explorer's LAN settings, and it automatically detects the script and gives internet access. I tried including 9999/proxy.pac in the apt.conf URL and it didn't seem to work any better. Would Ubuntu know how to handle a proxy.pac, assuming it was created for Windows? Should my URL include the .pac or just end after the port number? I've tried both without success, but I would like to know. A quick test pinging a fellow co-worker's PC was successful, so I can see network computers, but not Google or other internet sources.
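
    For comparison, the conventional form of an apt proxy entry is shown below (proxy.example.com is a placeholder); apt expects a direct proxy URL and, as far as I know, does not understand .pac auto-config scripts, and ping does not use apt's proxy settings at all:

      Acquire::http::Proxy "http://proxy.example.com:9999/";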

    Read the article

  • Stop Google Analytics from appending hostname?

    - by Nick Q.
    I've come across an Analytics profile that is appending the rest of a URL to the end of a page's path. For example when looking at the page that exists at http://example.com/page I would expect to see /page but instead it shows me /page/http://example.com/. The profile has no filters applied to it, and until July was reporting as expected (/page), in July the site in question switched hosts (and absolutely nothing else, so I'm not sure that's the problem). The analytics code on the site is the standard Google Async code with a domain set. All other profiles for the site show /page as expected. Any ideas as to how I can get the profile to function as expected?

    Read the article

  • O'Reilly 50% off offer on CSS3 books until 05:00 PT on Oct 28

    - by TATWORTH
    Originally posted on: http://geekswithblogs.net/TATWORTH/archive/2013/10/21/oreilly-50-off-offer-on-css3-books-to-0500-pt.aspx At http://shop.oreilly.com/category/deals/css3.do?code=WKCSS&imm_mid=0b155e&cmp=em-prog-books-videos-lp-owo_css3_direct_wkcss, O'Reilly are offering 50% off a number of e-books on mastering CSS3 until 05:00 PT on Oct 28: "CSS3—the technology behind most of the eye-catching visuals on the Web today—is loaded with capabilities that once would have required JavaScript or third-party plugins, such as animation, pseudo-classes, and media queries. Use CSS3 to transform markup into stunning, richly detailed web pages that look great in any browser. For one week only, SAVE 50% on CSS3 ebooks from shop.oreilly.com and take your sites from ordinary to incredible."

    Read the article

  • XHTML fix solution republished

    - by TATWORTH
    As a post-VS2010 SP1 installation activity, I am recompiling all my open source projects. The first is XHTMLFIX at http://xhtmlfix.codeplex.com/ This LGPL project provides simple fixes for ASP.NET 2.0/4.0 to achieve XHTML compliance as measured by the W3C tests at http://validator.w3.org/ The XHTML project shows as untrue the commonly held belief that MVP or MVC is necessary for producing XHTML-compliant web pages. Incidentally, the other supposed advantage of MVP and MVC over web forms, easier testing, is also very dubious, as web forms can be tested by systems such as Selenium or WatiN. I have used NUnitAsp (alas, sadly discontinued) with web forms and found it to be more effective than unit testing MVP. Now, if you prefer the MVP and/or MVC approach over web forms, then fine, that is your preference. If you can find an example where properly written ASP.NET 4.0 web forms do not produce XHTML-compliant markup, I would be glad of your example and will look at ways of modifying the markup to be XHTML compliant.

    Read the article

  • What did I do wrong when installing this theme?

    - by Qmal
    I have a problem installing a theme for my Ubuntu; note that I am very new to it and probably messed something up, but it seems to me that I did everything as stated in the INSTALL file. Theme: http://opendesktop.org/content/show.php/?content=140562 I downloaded the *.zip package, unzipped it into the home/.themes folder and changed Controls and Borders in my theme preferences. Still my result is pretty poor; you can look at the image, it looks nothing like the screenshot from the author. I also installed fonts by running sudo apt-get install ttf-mscorefonts-installer and tried to install the GTK2 engines with sudo apt-get install gtk2-engines-murrine gtk2-engines-pixbuf, but it showed that nothing was modified since I already have them. Please tell me what I did wrong so I can fix it :) (It seems that I can't upload images yet, no rep. I can give you a link though.) http://postimage.org/image/1zhibxx5w/full/
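
    For comparison, a manual install along the lines described above usually amounts to something like this (the archive name is a placeholder; the folder that ends up under ~/.themes should contain a gtk-2.0/ directory for Controls and a metacity-1/ directory for Borders):

      mkdir -p ~/.themes
      unzip 140562-theme.zip -d ~/.themes    # archive name is a placeholder
      ls ~/.themes/ThemeName                 # expect gtk-2.0/ and metacity-1/ inside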

    Read the article

  • Gallery of Exoplanets

    - by TATWORTH
    Space.com have put together a gallery of exoplanets (planets outside our solar system) at http://www.space.com/13986-gallery-smallest-alien-planets-exoplanets.html Some exoplanets have been discovered by monitoring the red/blue shift of their star, whereas others have been found by the transit method. The Kepler space telescope is continuously monitoring a small patch of the sky for transits. Here are some of the more interesting (and eye-catching) pictures. Remember, so far the best image of an extrasolar planet has been a diffraction blur, but even just as artists' impressions based upon the best current evidence, they are very impressive. Have a look at the gallery at http://www.space.com/13986-gallery-smallest-alien-planets-exoplanets.html; it is very impressive.

    Read the article

  • Tales from the Coal Face - Reporting errors

    - by TATWORTH
    One of the questions that comes up frequently is "Is it worthwhile to report errors?" Last weekend, after installing the latest StyleCop, I loaded up my copy of Power Collections. I found that StyleCop was now correctly picking up a lot of missing "this." statements, but there were also a number of false positives. Anticipating the need to submit sample code, I cleaned the solution and zipped it up. I reported this at http://stylecop.codeplex.com/discussions/357319. The StyleCop administrator promoted the report to a work item (see http://stylecop.codeplex.com/workitem/7285) and I uploaded the previously prepared zip file. The StyleCop team was able to locate the problem and it is "Fixed in upcoming 4.7.27". The conclusion: report errors, and prepare sample code illustrating the error.

    Read the article
