Search Results

Search found 5530 results on 222 pages for 'nested urls'.


  • Extracting meta tags attribute using wget [migrated]

    - by Amit
    I have a file with one URL per line. For each URL I need to extract the "keywords" meta tag, i.e. if there is a meta tag named "keywords" then I want to get its "content" value. Example: if the web page contains <meta name="keywords" content="wikipedia,encyclopedia">, then for that URL I want "wikipedia,encyclopedia" to be extracted. One approach is to download the web page using "wget" and then parse it using some standard HTML parser. I was wondering, is there any better way to do this without downloading the entire web page?
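    One way to avoid fetching full pages is a hedged sketch like the following (not a wget one-liner): stream only the first chunk of each response, which normally covers the <head> where meta tags live. The regex is deliberately simplistic; a real HTML parser would be more robust.

        # Python sketch: read only the first 64 KB of each page and look for
        # the keywords meta tag there, instead of downloading the whole page.
        import re
        import urllib.request

        META_RE = re.compile(
            r'<meta[^>]+name=["\']keywords["\'][^>]+content=["\']([^"\']*)["\']',
            re.IGNORECASE)

        def keywords_for(url, max_bytes=65536):
            with urllib.request.urlopen(url, timeout=10) as resp:
                head = resp.read(max_bytes).decode("utf-8", errors="replace")
            match = META_RE.search(head)
            return match.group(1) if match else None

        print(keywords_for("https://en.wikipedia.org/wiki/Main_Page"))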

    Read the article

  • Any frameworks or libraries that allow me to run a large number of concurrent jobs on a schedule?

    - by Yoga
    Are there any high-level programming frameworks that allow me to run a large number of concurrent jobs on a schedule? E.g. I have 100K URLs and need to check their uptime every 5 minutes. I could certainly write a program to handle this, but then I need to handle concurrency, queuing, error handling, system throttling, job distribution, etc. Is there a framework where I only focus on the particular job (i.e. the ping task) and the system takes care of the scaling and error handling for me? I am open to any language.
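    For scale, here is a minimal stdlib sketch of the do-it-yourself version (an illustration, not a recommendation: it covers concurrency and per-job errors but none of the queuing, distribution, or persistence a real system needs):

        # Python sketch: check a URL list every 5 minutes with a thread pool.
        import time
        from concurrent.futures import ThreadPoolExecutor
        from urllib.request import urlopen

        URLS = ["http://example.com", "http://example.org"]  # stand-in for the 100K list

        def check(url):
            try:
                with urlopen(url, timeout=5) as resp:
                    return url, resp.status      # HTTP status code on success
            except Exception as exc:
                return url, exc                  # error captured per job

        with ThreadPoolExecutor(max_workers=100) as pool:
            while True:
                started = time.time()
                for url, result in pool.map(check, URLS):
                    print(url, result)
                time.sleep(max(0, 300 - (time.time() - started)))  # 5-minute cadence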

    Read the article

  • Moved sitemaps to a different subdomain and losing search referrals around the same time. Red herring or correlation?

    - by er1234
    We started losing search referral traffic around the same time I moved some of our sitemaps to a subdomain. Could this have hurt us? I followed Google's steps for creating a sitemap under a different subdomain. The new sitemaps.foo.com subdomain is being crawled and indexed well. Both www.foo.com and sitemaps.foo.com have been verified in Google Webmaster Tools, where they appear as distinct sites. Is this correct? I can't find a way in Webmaster Tools to say "Hey, sitemaps.foo.com is really owned by www.foo.com, so show them together and make sure to attribute sitemaps.foo URLs to www.foo". Our www.foo.com/robots.txt contains:

        Sitemap: http://www.foo.com/sitemap.xml
        Sitemap: http://sitemaps.foo.com/subdir/sitemap.xml.gz

    Read the article

  • SEO: Google Publisher Network

    - by Andy
    I'm just about to start a new business which creates niche affiliate sites. I'm curious about how Google will treat me when all the URLs are hosted with the same analytics tags, Webmaster Tools tags and server IP ranges. To benefit the most from Google's SERPs, should I keep each domain in separate analytics and Webmaster Tools accounts, or is it OK to have all of my domains in one account? My concern is duplicate content, and the fact that I am building a publisher network; I'm not sure how much Google likes those. I'm notoriously bad at searching and as such haven't found what I'm looking for yet. Any help would be very much appreciated.

    Read the article

  • StyleCop 4.7.33.0 has been released

    - by TATWORTH
    StyleCop 4.7.33.0 was released today, 29 June, at http://stylecop.codeplex.com/releases/view/79972. This version is compatible with the Visual Studio 2012 RC (11.0.50522). Install order should be:

        VS2008
        VS2010
        VS2012 RC
        R# 6.1.1 msi (for VS2010)
        R# 7.0 (tested with daily build 7.0.70.189)
        StyleCop

    This version is now compatible with R# 5.1 (5.1.3000.12), R# 6.0 (6.0.2202.688), R# 6.1 (6.1.37.86), R# 6.1.1 (6.1.1000.82) and R# 7.0 (7.0.70.189). Fixes for this release are:

        Updated docs for SA1103.
        Fix to not throw SA1101 when the target is a nested interface; added new tests.
        Fixes to install the ReSharper plugins back in the main directories for all users.
        Styling fixes.
        7291: create indexer documentation better.
        Port fixes for 7289 and 7223 to the 7.0.0 plugin.
        Fix for 7289: create interface documentation better.
        Fix for 7223: better text for inserted property text.
        Ensure WebSites and other folders containing aspx.cs files get analysed.
        Add re-analyse Project option to context menus (I asked for this one!)

    Read the article

  • How to generate "language-safe" UUIDs?

    - by HappyDeveloper
    I always wanted to use randomly generated strings for my resources' IDs, so I could have shorter URLs like this: /user/4jz0k1. But I never did, because I was worried about the random string generation creating actual words, e.g. /user/f*cker. This brings two problems: it might be confusing or even offensive for users, and it could mess with the SEO too. Then I thought all I had to do was set up a fixed pattern, like adding a number every 2 letters. I was very happy with my 'generate_safe_uuid' method, but then I realized it was only better for SEO, and worse for users, because it increased the ratio of actual words being generated, e.g. /user/g4yd1ck5. Now I'm thinking I could create a method 'replace_numbers_with_letters' and check that it hasn't formed any words, against a dictionary or something. Any other ideas? P.S. As I write this, I also realized that checking for words in more than one language (e.g. English and French, Spanish, etc.) would be a mess, and I'm starting to love numbers-only IDs again.
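    A common trick (offered here as a sketch, not the asker's method) sidesteps the dictionary check entirely: draw the IDs from an alphabet with no vowels, so no word in a vowel-using language can ever be formed.

        # Python sketch: random IDs from a vowel-free alphabet; 0, 1 and the
        # letter l are also dropped to avoid lookalike characters.
        import secrets

        SAFE_ALPHABET = "bcdfghjkmnpqrstvwxz23456789"

        def safe_id(length=6):
            return "".join(secrets.choice(SAFE_ALPHABET) for _ in range(length))

        print(safe_id())  # e.g. /user/k3ndq9

    With 27 symbols and 6 characters that still gives 27^6 (about 387 million) possible IDs.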

    Read the article

  • Content Optimization only?

    - by danie7L T
    There are tons of discussions around tips and tricks to improve search engine ranking and SEO. But what if the focus of the webmaster/client is 100% on the quality of the content, with precise keywords in meta tags, clean design, regular article updates, clean URLs, and highly filtered external links leading to pages on websites dealing with the same or related subjects? Isn't it the job of a good search engine like Google to catch this website and show it on its front page? Or do search engines count on us to help them find us, so that webmasters will always have to stay up to date on SEO tools and rule changes, on top of website design, browser customization, progressive enhancement, etc.?

    Read the article

  • Google Site Search (commercial) not indexing files in sitemap

    - by melat0nin
    I have a client for whom we have purchased Google Site Search. It works well for HTML pages served by the CMS, but files aren't being reliably indexed. I wrote a script to generate an XML feed (sitemap) of all the files in the CMS, which I've plugged into Google Webmaster Tools for the site. It says that for that sitemap 923 URLs have been submitted, but only 26 have been indexed. The client relies heavily on searching within files, which is why we decided to use Google search, so this is a bit of a problem. Many of the files aren't linked to from any page on the site, as they are old and therefore don't merit having a page of their own. But they still need to be accessible through search for archiving purposes. The file archive XML can be found at www.sniffer.org.uk/file-archive and the standard XML sitemap (of pages) can be found at www.sniffer.org.uk/sitemap.xml. Any thoughts would be much appreciated!
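    For reference, a file-archive sitemap of this kind takes very little code to generate. A hedged sketch follows; the directory, base URL and output path are placeholders, not the client's actual setup:

        # Python sketch: emit a sitemap.org-format <urlset> with one
        # <url><loc> entry per file in a directory.
        import xml.etree.ElementTree as ET
        from pathlib import Path

        BASE_URL = "http://www.example.org/files/"  # placeholder base URL

        def build_sitemap(file_dir, out_path):
            urlset = ET.Element(
                "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
            for f in sorted(Path(file_dir).iterdir()):
                if f.is_file():
                    url = ET.SubElement(urlset, "url")
                    ET.SubElement(url, "loc").text = BASE_URL + f.name
            ET.ElementTree(urlset).write(out_path, encoding="utf-8",
                                         xml_declaration=True)

        build_sitemap("/var/www/files", "file-archive.xml")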

    Read the article

  • PowerShell progress dialogs

    - by Norgean
    Creating nested progress dialogs in PowerShell is easy. Let the code speak for itself:

        for ($i = 1; $i -le 2; $i++)
        {
            Write-Progress -ID 1 -Activity "Outer loop" -Status "Tick $i" -percentComplete ($i / 2 * 100)
            for ($j = 1; $j -le 3; $j++)
            {
                Write-Progress -ID 2 -Activity "Mid loop" -Status "Tick $j" -percentComplete ($j / 3 * 100)
                for ($k = 1; $k -le 3; $k++)
                {
                    Write-Progress -ID 3 -Activity "Inner loop" -Status "Tick $k" -percentComplete ($k / 3 * 100)
                    Sleep(1)
                }
            }
        }

    I.e. some text that explains what we're doing (Activity and Status), and ID numbers. Easy.

    Read the article

  • VMware Player and Ubuntu 12.04 - Full Screen

    - by DotNetStudent
    I have installed VMware Player 4.0.2 under Ubuntu 12.04 (Final) and, apart from having to patch the modules, everything went smoothly. However, there's an irritating behavior when toggling full screen mode: toggling full screen (using Virtual Machine, Toggle Full Screen or Ctrl + Alt + Return), minimizing the player and maximizing it again changes the resolution of the guest to some strange one, and the player gets "nested" in GNOME 3's taskbar like every other native Ubuntu window. To switch to full screen again I have to press Ctrl + Alt + Return twice. Can anyone please tell me if this is the normal, expected behavior? Is there any way of "correcting" it? The host operating system is Ubuntu 12.04 (Final) and the guest is Windows 7 (both 64-bit).

    Read the article

  • SEO penalty for landing page redirects

    - by therealsix
    Using eBay as an example, let's say I have a large number of items whose URLs look like this: cgi.ebay.com/ebaymotors/1981-VW-Vanagon-manual-seats-seven-/250953153841. I want to give my client the ability to put links to these items on their website EASILY, without knowing or checking my URL. So I created a redirect service that maps their identifier to my URL: ebay.com/fake_redirect_service/shared_identifier9918 would redirect to the link above. This works great: my clients can easily set up these links with information they already have, and the user sees the page as usual. So on to the problem... I'm concerned that this redirect service will have a negative impact on my SEO ranking. Having a landing page redirect you immediately to a different URL seems like something a typical spam site would do. Will this hurt me? Any better solutions?
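    For what it's worth, search engines generally treat a server-side 301 (permanent) redirect very differently from a spammy meta-refresh or JavaScript landing page. A minimal sketch of such a service follows; the mapping and target URL are made-up placeholders:

        # Python sketch: look up the shared identifier and answer with a 301.
        from http.server import BaseHTTPRequestHandler, HTTPServer

        MAPPING = {"shared_identifier9918": "http://www.example.com/item/123"}

        class RedirectHandler(BaseHTTPRequestHandler):
            def do_GET(self):
                target = MAPPING.get(self.path.rsplit("/", 1)[-1])
                if target:
                    self.send_response(301)              # permanent redirect
                    self.send_header("Location", target)
                else:
                    self.send_response(404)
                self.end_headers()

        HTTPServer(("", 8000), RedirectHandler).serve_forever()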

    Read the article

  • How will this affect my SEO ranking?

    - by dunc
    I run a fishkeeping website based on the WordPress (PHP) CMS. I've recently put a fairly complex "filter" into place which searches my content for mentions of fish species profiles and turns them into active links. For example:

        asdasd this is a test about abdomen to see if the caudal fin will work asdadasdas try again with abdomen and A. panduro and Apistogramma panduro

    ...becomes

        asdasd this is a test about abdomen to see if the caudal fin will work asdadasdas try again with abdomen and <a href="/?p=1703" class="link_species">A. panduro</a> and <a href="/?p=1703" class="link_species">Apistogramma panduro</a>

    On the rest of my website, the species are linked with pretty URLs such as /species/apistogramma-panduro/, but due to the way this filter works, the only information I have access to is the ID of the post. As such, I'm using /?p=1703 or whatever the ID is. What I'd like to know is: how much will this affect my SEO rating/ranking? Will it be detrimental if I don't rewrite the function? Thanks in advance
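    For the record, WordPress can turn a post ID into its pretty permalink at filter time; get_permalink() is the core API for this. A hedged sketch (it assumes the filter runs after rewrite rules are loaded):

        // PHP sketch: build the species link from the post ID using the
        // pretty permalink instead of the raw /?p=1703 form.
        $link = '<a href="' . get_permalink(1703) . '" class="link_species">'
              . 'A. panduro</a>';

    That would keep the filter's output consistent with the /species/... URLs used elsewhere on the site.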

    Read the article

  • Free Pluralsight Videos for this week

    - by TATWORTH
    Pluralsight have issued two free videos:

        http://blog.pluralsight.com/2012/09/05/video-end-the-global-pollution-crisis-in-javascript/
        http://blog.pluralsight.com/2012/08/31/video-fake-it-until-you-make-it-with-fakeiteasy/

    Their exact words were:

        Free Videos this Week

        End the Global Pollution Crisis... In Javascript
        Too many globally scoped variables and functions can make Javascript code difficult to work with, particularly on large projects. See how to simulate the concept of namespaces using objects.

        Fake it Until You Make It with FakeItEasy
        FakeItEasy is a .NET framework for easily creating mock objects in tests. See how easy it is to FakeItEasy with complex nested hierarchies.

    Read the article

  • BleachBit: How to Completely Clear URL History in Firefox?

    - by tSquirrel
    14.04 / Firefox 29.0. I've been using BleachBit to clear usage/file history, and for the most part it works great. However, it doesn't seem to clear the website hostnames out of the URL bar at all. These addresses are not bookmarked. Also, it's not the full URL that is preserved, just the hostname.

        1. Visit http://www.bluesnews.com/some_random_URL_string
        2. Exit Firefox
        3. Run BleachBit, with ALL Firefox options selected
        4. Restart Firefox
        5. Check history: completely empty, other than bookmarked sites (www.bluesnews is NOT bookmarked)
        6. Type "blue", which Firefox automatically completes as "http://www.bluesnews.com/"

    Alternate step 3: use Firefox's built-in "Clear History" and select ALL entries with a time frame of "Everything". Same result as above. My inquiry in the BleachBit forums hasn't been responded to. I found Dan's proposed solution, but changing autocomplete in about:config only turns off the function; it doesn't actually stop storing URLs.
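    If a tool leaves those hostnames behind, they are almost certainly still rows in the Firefox profile's places.sqlite database, which is where the location-bar autocomplete reads from. A hedged sketch of clearing it by hand (close Firefox first, back up the file; the profile path below is a placeholder):

        # Python sketch: delete history rows from places.sqlite while keeping
        # any page that a bookmark points at.
        import sqlite3

        db = sqlite3.connect(
            "/home/user/.mozilla/firefox/xxxxxxxx.default/places.sqlite")
        db.execute("DELETE FROM moz_historyvisits")
        db.execute("DELETE FROM moz_places WHERE id NOT IN "
                   "(SELECT fk FROM moz_bookmarks WHERE fk IS NOT NULL)")
        db.commit()
        db.close()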

    Read the article

  • Hidden Windows 7 Wallpaper

    - by BizTalk Visionary
    To find the hidden wallpaper: Type globalization in a search of your C: drive. The only result should be a folder located in the main Windows directory, and you should only be able to see ELS and Sorting folders nested here. Now search for MCT in the top-right search bar. This will display five new unindexed folders, each corresponding to a different global region. Browse these folders for some extra themes and wallpapers specific to Australia, USA, South Africa, and Canada. From here you can select a new wallpaper.

    Read the article

  • Length of Page Title, URL, Meta Description and total number of links on a page

    - by MJWadmin
    We've been examining a number of different SEO tools recently. Several of these tell us that some of our page titles, URLs and meta descriptions are too long. We've also been told that some of our pages have too many links on them. I guess our first question is: is any of that feedback true? Can URLs etc. actually be too long, and if so, how much does this affect ranking? Secondly, can you have too many links on a page, and if so, how many is too many? Thanks in advance...

    Read the article

  • Best way to redirect in IIS

    - by stephmoreland
    We have a website that has two URLs (one for the US side and another for the Canadian side, which is then broken into Canadian English and Canadian French). For the purposes of my question, I will write them as:

        www.us_url.com (US)
        www.canada_url.ca/ca_en/ (Canadian English)
        www.canada_url.ca/ca_fr/ (Canadian French)

    To make sure people are on the correct site, what do I do if they request Canadian English content under the US URL (e.g. www.us_url.com/ca_en/canada.asp)? I want to make sure the URL is the Canadian one (e.g. www.canada_url.ca/ca_en/canada.asp) so it shows up properly in Google Analytics. We're using IIS 7 and classic ASP.
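    One hedged possibility on IIS 7 is a rule that 301-redirects any /ca_en/ or /ca_fr/ request arriving on the US host over to the Canadian host. This assumes the optional URL Rewrite module is installed (it is not part of stock IIS 7), and the host names below are the question's placeholders:

        <!-- web.config sketch: redirect Canadian paths off the US host -->
        <system.webServer>
          <rewrite>
            <rules>
              <rule name="Canadian content to Canadian host" stopProcessing="true">
                <match url="^ca_(en|fr)/.*" />
                <conditions>
                  <add input="{HTTP_HOST}" pattern="^www\.us_url\.com$" />
                </conditions>
                <action type="Redirect" url="http://www.canada_url.ca/{R:0}"
                        redirectType="Permanent" />
              </rule>
            </rules>
          </rewrite>
        </system.webServer>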

    Read the article

  • Programming Windows Identity Foundation - ISBN 978-0-7356-2718-5

    - by TATWORTH
    This book introduces a new technology that promises a considerable improvement on the ASP.NET membership system. If you have ever had to write an extranet system, you will be aware of the problems in setting up membership for your site. Windows Identity Foundation promises to be an excellent replacement, so the book Programming Windows Identity Foundation (ISBN 978-0-7356-2718-5, at http://oreilly.com/catalog/9780735627185) is breaking new ground. I recommend this book to all ASP.NET development teams. You should reckon on 3 to 5 man-days to study it, try out the sample programs, and see if it can replace your bespoke solution. Remember this is version 1 of WIF, so give yourself adequate time to read this book and familiarise yourself with the new software. Some URLs for more information:

        WIF home page: http://msdn.microsoft.com/en-us/security/aa570351.aspx
        The Identity Training Kit: http://www.microsoft.com/downloads/en/details.aspx?displaylang=en&FamilyID=c3e315fa-94e2-4028-99cb-904369f177c0
        The author's blog: http://www.cloudidentity.net/

    Read the article

  • Alternatives to OAuth?

    - by sdolgy
    The web industry is shifting, or has shifted, towards using OAuth when extending API services to external consumers and developers. There is some elegance in simplicity, and the 3-step OAuth process isn't too bad; I just find it the best of a bad bunch of options. Are there alternatives out there that could be better, and more secure? The security reference is derived from the following URLs:

        http://www.infoq.com/news/2010/09/oauth2-bad-for-web
        http://hueniverse.com/2010/09/oauth-2-0-without-signatures-is-bad-for-the-web/

    Read the article

  • What's Your Method of not forgetting the end brackets, parentheses

    - by JMC Creative
    Disclaimer: for simplicity's sake, "brackets" will refer to brackets, braces, quotes, and parentheses in the course of this question. Carry on. When writing code, I usually type the beginning and end elements first, and then go back and type the inner stuff. This gets to be a lot of backspacing, especially when doing something with many nested elements like:

        jQuery(function($){$('#element[input="file"]').hover(function(){$(this).fadeOut();});});

    Is there a more efficient way of remembering how many brackets you've got open? Or a second example, with quotes:

        <?php echo '<input value="'.$_POST['name'].'" />'; ?>
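    One mechanical fallback, offered purely as an illustration of the idea (it is what editors' bracket-matching highlighters do under the hood): a stack-based scan that reports the first bracket that doesn't match.

        # Python sketch: find the position of the first mismatched bracket.
        # A real checker would also skip brackets inside string literals.
        PAIRS = {")": "(", "]": "[", "}": "{"}

        def first_mismatch(text):
            stack = []
            for pos, ch in enumerate(text):
                if ch in "([{":
                    stack.append((ch, pos))
                elif ch in PAIRS:
                    if not stack or stack[-1][0] != PAIRS[ch]:
                        return pos            # closer with no matching opener
                    stack.pop()
            return stack[-1][1] if stack else None  # unclosed opener, or None

        print(first_mismatch("jQuery(function(){$('#el').hover();}"))  # -> 6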

    Read the article

  • How do I configure multiple domain names on my IIS server? [closed]

    - by Dillie-O
    We have a few websites running on one instance of IIS that need to be mapped to their respective domain names. For example:

        Site A has the domain name coolness.com
        Site B has the domain name 6to8Weeks.com
        Site C has the domain name PhatTech.com

    When I look at the "Web Site Identification" section of the IIS configuration window, I notice that I can specify an IP address and port, but if I click the Advanced button, I can also configure the site based on host header values. How do I configure each site in IIS? Ideally I would like them all to listen on port 80, so I don't have weird URLs, but I'm not sure if I do this using host headers, IP addresses, both, or something else.
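    Host headers on a shared port 80 are the usual answer. As a hedged sketch, the same binding the Advanced dialog creates can also be added from the command line with appcmd (the site names here are the question's examples):

        rem Bind each site to port 80 with its own host header.
        %windir%\system32\inetsrv\appcmd set site /site.name:"Site A" ^
            /+bindings.[protocol='http',bindingInformation='*:80:coolness.com']
        %windir%\system32\inetsrv\appcmd set site /site.name:"Site B" ^
            /+bindings.[protocol='http',bindingInformation='*:80:6to8weeks.com']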

    Read the article

  • How can I compute the Big-O notation for a given piece of code?

    - by TheNew Rob Mullins
    So I just took a data structures midterm today, and I was asked to determine the run time, in Big-O notation, of the following nested loop:

        for (int i = 0; i < n-1; i++) {
            for (int j = 0; j < i; j += 2) {
                // 1 statement
            }
        }

    I'm having trouble understanding the formula behind determining the run time. I thought that since the inner loop has 1 statement, and using the series equation n * (n - 1) / 2, I figured it to be 1n * (n-1) / 2, thus equaling (n^2 - n) / 2. And so I generalized the runtime to be O(n^2 / 2). I'm not sure this is right though, haha. Was I supposed to divide my answer again by 2, since j is incremented in steps of 2? Or is my answer completely off?
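    For a sanity check: the inner loop advances j by 2, so it runs about i/2 times, and summing i/2 for i up to n gives roughly n^2/4 statements. Both n^2/2 and n^2/4 are simply O(n^2), since Big-O drops constant factors. A quick empirical count (an illustration, not the midterm's expected working):

        # Python sketch: count the inner-loop statements and compare to n^2/4.
        def count_statements(n):
            count = 0
            for i in range(n - 1):
                j = 0
                while j < i:       # inner loop runs ceil(i/2) times
                    count += 1
                    j += 2
            return count

        for n in (10, 100, 1000):
            print(n, count_statements(n), n * n // 4)
        # the counts approach n^2/4 as n grows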

    Read the article

  • HTAccess redirect directories to index.html

    - by BFTrick
    Hi there, I am working on a site where I do not have permission to the server, and someone else keeps changing the settings. That person just changed a setting preventing users from going to example.com/foo/ and seeing the index page ("This Virtual Directory does not allow contents to be listed"). If you type in example.com/foo/index.html you can still see the file. So I want to use .htaccess to redirect all URLs that end in a directory to directory/index.html. How do I write that? I started with some code that changes .php files to .html files and tried to work from that, but I couldn't quite get it to work:

        RewriteRule ^(.*)\.php$ /$1.html [R=301,L]

    Any suggestions?
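    A hedged sketch, assuming the server honors .htaccess overrides for these directives: the one-line fix is to declare the directory index, and an explicit rewrite can do the same thing if DirectoryIndex is not allowed:

        # Simplest: serve index.html for any directory request.
        DirectoryIndex index.html

        # Or rewrite directory URLs to the index file explicitly.
        RewriteEngine On
        RewriteCond %{REQUEST_FILENAME} -d
        RewriteRule ^(.+?)/?$ /$1/index.html [L]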

    Read the article

  • Google Analytics - how to track clicks on a screen?

    - by milesmeow
    Can I track the click of every link, button, dropdown select, etc. on a screen and have it recorded in Google Analytics? I want to create a page and collect data on which widget the users use most. What about AJAX stuff? If you're using jQuery or MooTools, can you get the functions to register a fake URL with GA based on user interaction? I used to do this with Flash: every time you clicked a button, it could initiate a fake URL request. I would make URLs such as ".../customize/eyes/" or ".../customize/nose", etc. Just wondering if I can do that with JavaScript on the page. I've also posted at StackOverflow.
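    The classic ga.js API supports exactly this pattern. A hedged sketch (it assumes the standard async _gaq snippet is already on the page; the selectors and category names are made up):

        // JavaScript sketch: report a virtual pageview, or a tracked event,
        // whenever a widget is used.
        $('#eyes-dropdown').change(function () {
            _gaq.push(['_trackPageview', '/customize/eyes/' + $(this).val()]);
        });

        $('#nose-button').click(function () {
            _gaq.push(['_trackEvent', 'Customize', 'click', 'nose']);
        });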

    Read the article

  • How to keep google rank and index for a page that changed its url? [closed]

    - by ProSoft
    Possible Duplicate: How to tell Google that I have changed my website URLs? Recently, I changed the URL of one of my web pages. Of course, I did it by URL rewriting. Now I want to keep the rank of this page in Google and Bing. For example:

        Main address of the page: http://mywebsite.com/page1.php
        Old virtual address by URL rewriting: http://mywebsite.com/page
        New address: http://mywebsite.com/newTitlePage

    Now, when I open this page from a Google search, I get a 404 (not found) error. What should I do?
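    The standard remedy is a permanent (301) redirect from the old virtual address to the new one, so search engines can carry the old URL's history over to the new URL. A hedged sketch (the rule syntax assumes Apache mod_rewrite, and the paths are the question's examples):

        # .htaccess sketch: 301 the old rewritten URL to the new one.
        RewriteEngine On
        RewriteRule ^page$ /newTitlePage [R=301,L]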

    Read the article
