Search Results

Search found 5137 results on 206 pages for 'i like traffic lights'.

Page 119/206 | < Previous Page | 115 116 117 118 119 120 121 122 123 124 125 126  | Next Page >

  • Migrate from Thunderbird to Mutt

    - by deshmukh
    I am contemplating moving from Thunderbird to Mutt (provided it is feasible) in order to switch to a faster, simpler application. My current Thunderbird set-up consists of multiple IMAP accounts (Gmail and Google Apps). Only selected folders (read: labels) in each IMAP account are stored locally. For all other folders, I glance through the headers and open a message only if I find it interesting. I also use folder bookmarks to navigate to folders quickly, and I move messages across folders with keyboard shortcuts. Is it possible to replicate this set-up in Mutt? Can someone share, or point me to, a sample muttrc file that does the same thing? It would be great if the muttrc file is adequately commented. On a side note, will it also be possible to import my messages from Thunderbird locally? That would save me considerable network traffic (about 2GB of data stored locally).

    Read the article

  • Trust

    - by mprove
    I'm seeing traffic on this blog without an obvious reason. Hmm. What about this, then, some brief musings about trust: every piece of software, every website, every social platform, every community-building effort is a matter of trust building. You make a social promise to continue the effort and to care for the commitment of the users or community members. It is easy to offer more to your community. On the other hand, it is quite difficult or impossible to take something away, or to close down or end the product or community, without disappointing someone. cheers, Matthias

    Read the article

  • Ubuntu Freezes w/ Proxy

    - by jrc03c
    Ubuntu 11.04 freezes completely after a little while when using a proxy. It's fine when it's got a direct Internet connection, but it completely grinds to a halt after about 10 or 15 minutes of traffic through a proxy. Any ideas? UPDATE: Here's some more information. I have a second-generation MacBook, which has both OSX 10.6.8 (Snow Leopard) and Ubuntu 11.04 (Natty Narwhal) installed on it. When running through a proxy, Snow Leopard works fine, but Ubuntu freezes frequently. Any suggestions at all? Or, rather, what other kinds of information do you need?

    Read the article

  • keep getting added to hosts.deny + iptables

    - by Sc0rian
    I am confused about why this has started to happen. On my local network, if I click 10-20 Apache/HTTP links, my server decides to add me to the hosts.deny file and block me in iptables. It's not just Apache; it seems to happen with any kind of traffic that arrives in a quick burst. For example, I use Subsonic, and if I change tracks 10-20 times it does the same thing. I would assume some sort of firewall tool is sitting on the server doing this, but I do not have fail2ban installed and there is no DenyHosts data in /var/lib. I cannot work out why I keep getting added to hosts.deny/iptables. Thanks

    Read the article

  • GA tracking utm query params after hashbang

    - by hybrid9
    We currently use a hashbang for the portion of our site that generates dynamic content which can also be deep linked. Our analytics team wants to use utm params to track the referral traffic from social networks. We are using Universal Analytics (analytics.js) as well as GTM. Will GA pick up the query parameters after the hashbang, or do they always have to go before it? For example: (1) example.com/#!/some/content?utm_source=foo&utm_campaign=bar or (2) example.com?utm_source=foo&utm_campaign=bar/#!/some/content. In #1, I'm concerned that the utm params won't be recorded, and in #2 the page will break or the URL could be incorrectly written. How does GA pull in those parameters: location.search? regex? Can I get away with using either form?
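
    A quick way to see why placement matters: URL parsers treat everything after the first # as the fragment, and by default analytics.js reads campaign parameters from the query string (location.search), not the fragment. A minimal Python sketch illustrating the split, using normalized versions of the hypothetical URLs above:

    ```python
    from urllib.parse import urlsplit, parse_qs

    # Form 1: utm params placed after the hashbang.
    u1 = urlsplit("http://example.com/#!/some/content?utm_source=foo&utm_campaign=bar")
    print(u1.query)     # '' -- empty: everything after '#' is the fragment
    print(u1.fragment)  # '!/some/content?utm_source=foo&utm_campaign=bar'

    # Form 2: utm params in the real query string, before the fragment.
    u2 = urlsplit("http://example.com/?utm_source=foo&utm_campaign=bar#!/some/content")
    print(parse_qs(u2.query))  # {'utm_source': ['foo'], 'utm_campaign': ['bar']}
    ```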

    Read the article

  • Best stats tool for cross-domain tracking

    - by kidbrax
    We build a webapp that allows users to run the app under their own subdomain, so we run the app at search.domainX.com, search.domainY.com, and so on. They each have their own Google Analytics to track individual stats, but we also want to know the overall traffic for all clients of our app. So we want to know things like "among all our clients we had x number of views." What is the best tool to track that sort of thing?

    Read the article

  • Windows Not Sleeping All Night

    - by John Paul Cook
    Having a computer wake up when you don't want it to wastes electricity and drains the battery on mobile devices. My desktop had been waking up at night, so I assumed it was some network traffic on my home network. I unchecked "Allow this device to wake the computer" on my network adapters. Figure 1. Network adapter Power Management tab. That didn't solve the problem. I included the screen capture in Figure 1 because it could be part of the solution for someone else. To identify the root cause instead... (read more)

    Read the article

  • What are the tactics used to discover what kind of affiliate products will do well on your website?

    - by freethinker
    I'm starting to post some affiliate ads on my website. As it happens, I am not even close to making a sale. I'm not sure if the products I have chosen appeal to the audience I have, and I'm not sure if the volume of traffic is enough to support affiliate programs. I get about 8,000-9,000 visitors every day, but since that is growing constantly, it's not much of a worry. I am, however, struggling to figure out what kind of products to market (it's a techie site). Is there a service or tool which can analyze the website and suggest what products will do well and what won't?

    Read the article

  • Question about SEO and Domains

    - by jasondavis
    This is my first post on here, as I am mainly on Stackoverflow and Serverfault. I have been programming for at least 10 years now and have made hundreds of websites, but I have only recently started getting into design and the SEO side of sites; it's sad that I have been overlooking these for so many years. I have picked up reasonable knowledge over the years, but I have never really studied SEO until now. My question: I would like to build a site that targets many different keywords in the search engines. For example, let's say I built a site about outdoor activities called outdoorreview.com and I planned on having many sections: hunting, fishing, hiking, camping, cycling, climbing, etc. For the best results, how could I get the most search engine traffic to all these areas? Also, how should I structure the way to get to them: outdoorreview.com/hiking/ or hiking.outdoorreview.com?

    Read the article

  • Unevenly illuminated images

    - by coul
    How can I get rid of uneven illumination in images that contain text (usually printed, but possibly handwritten)? The images can have bright spots where light reflected while the picture was being taken. I've seen the Halcon program's segment_characters function, which does this job perfectly, but it is not open source. I want to convert an image into one with constant background illumination and darker text regions, so that binarization will be easy and free of noise. The text is assumed to be darker than its background. Any ideas?
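
    One common approach (a sketch only, not Halcon's algorithm): estimate the slowly varying background illumination, divide it out, then binarize. Assuming Python with OpenCV and a placeholder input file page.png:

    ```python
    import cv2

    # Load the scanned page as grayscale (file name is a placeholder).
    img = cv2.imread("page.png", cv2.IMREAD_GRAYSCALE)

    # Estimate the slowly varying background: a morphological closing with a
    # kernel much larger than the stroke width removes the (dark) text and
    # keeps only the illumination field.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (51, 51))
    background = cv2.morphologyEx(img, cv2.MORPH_CLOSE, kernel)

    # Divide out the background so the page becomes roughly uniform,
    # then binarize with Otsu's threshold.
    flattened = cv2.divide(img, background, scale=255)
    _, binary = cv2.threshold(flattened, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

    cv2.imwrite("flattened.png", flattened)
    cv2.imwrite("binary.png", binary)
    ```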

    Read the article

  • Duplicate domain names - .net and .com: create separate pages, or redirect?

    - by guisasso
    From an SEO point of view: this website has a good amount of traffic for a local business, but it also ships some merchandise. While the .net domain (registered first) is associated with the local business (Google Places, Maps, etc.), the .com domain only redirects to the .net domain. Would it be good, bad, or neutral to create a different page for the .com domain, for example something pretty simple that links to the 10 different categories of products that this company sells? I know links are good, so there's that, but what else would be good, or bad? Thanks in advance!

    Read the article

  • WordPress: Automatically transfer media files to Amazon S3

    - by Ron Ranieri
    I've been using a VPS to host 7 WordPress websites; most of them require a lot of storage but very little RAM and traffic. So I'm thinking of moving the static files (the uploads folder) to Amazon S3, and I'm looking for the most viable solution. I want every website to have its own bucket, with newly uploaded media files automatically transferred to Amazon S3 without using a plugin. I'm OK with a cron job; for example, the files would first be uploaded to my server, then transferred to S3 and deleted from my server every 24 hours. Or is there any way for me to change the default upload directory to my S3 bucket without sacrificing any WordPress functionality (resizing, titles, etc.)? What do you think is the most efficient way to do this? Currently I'm looking at this plus a cron job, but I would like to know if a better option exists.
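
    A minimal sketch of the cron-job approach, assuming Python with boto3 and AWS credentials already configured; the bucket name and uploads path are placeholders, it deletes each file immediately after a successful upload rather than on a 24-hour cycle, and a real setup would still need to rewrite the media URLs that WordPress serves:

    ```python
    import os
    import boto3

    # Placeholders: one bucket per site, pointing at that site's uploads folder.
    UPLOADS_DIR = "/var/www/site1/wp-content/uploads"
    BUCKET = "site1-media"

    s3 = boto3.client("s3")

    for root, _dirs, files in os.walk(UPLOADS_DIR):
        for name in files:
            local_path = os.path.join(root, name)
            # Keep the year/month structure WordPress uses as the S3 key.
            key = os.path.relpath(local_path, UPLOADS_DIR)
            s3.upload_file(local_path, BUCKET, key)
            # Remove the local copy once it is safely in S3.
            os.remove(local_path)
    ```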

    Read the article

  • WebLogic Server 12c Launch Event - 1 December 2011

    - by Chuck Speaks
    Introducing Oracle WebLogic Server 12c, the #1 Application Server Across Conventional and Cloud Environments. Please join Hasan Rizvi on December 1, as he unveils the next generation of the industry's #1 application server and cornerstone of Oracle's cloud application foundation, Oracle WebLogic Server 12c. Hear, with your fellow IT managers, architects, and developers, how the new release of Oracle WebLogic Server is: designed to help you seamlessly move into the public or private cloud with an open, standards-based platform; built to drive higher value for your current infrastructure and significantly reduce development time and cost; optimized to run your solutions for Java Platform, Enterprise Edition (Java EE), Oracle Fusion Middleware, and Oracle Fusion Applications; and enhanced with transformational platforms and technologies such as Java EE 6, Oracle's Active GridLink for RAC, Oracle Traffic Director, and Oracle Virtual Assembly Builder. Don't miss this online launch event. Register now.

    Read the article

  • What are some potential issues in blocking all incoming requests from the Amazon cloud?

    - by ElHaix
    Recently I, along with the rest of the world, have seen a significant increase in what appears to be scraping from Amazon AWS-related sources. So, simply put, I blocked all incoming requests from the Amazon cloud for our hosted application. I know that some good services/bots are now hosted on the cloud, and I'm wondering if certain IP addresses should be allowed, as they may gather data that would in the end benefit our site's SEO rankings. -- UPDATE -- I added a feature to block requests from the following hosts: Amazon, Softlayer, ServerDeals, GigAvenue. Since then, I have seen my network traffic decrease (monitored by network-out bytes); normal operation averages around 10,000,000 bytes. You can see in the traffic graph where last week I was not blocking, then started blocking. I've since removed the blocks and will see what the outcome is.
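
    For the AWS part specifically, Amazon publishes its current IP ranges as JSON, so one way to block (or whitelist) selectively is to check request IPs against that list rather than guessing. A hedged Python sketch; the URL is Amazon's published ranges file, and the sample address is a documentation IP, not real traffic:

    ```python
    import ipaddress
    import json
    import urllib.request

    # Amazon publishes its current address ranges at this URL.
    RANGES_URL = "https://ip-ranges.amazonaws.com/ip-ranges.json"

    with urllib.request.urlopen(RANGES_URL) as resp:
        prefixes = json.load(resp)["prefixes"]

    aws_networks = [ipaddress.ip_network(p["ip_prefix"]) for p in prefixes]

    def is_aws(ip: str) -> bool:
        """Return True if the address falls inside any published AWS IPv4 prefix."""
        addr = ipaddress.ip_address(ip)
        return any(addr in net for net in aws_networks)

    print(is_aws("203.0.113.10"))  # documentation address, expected False
    ```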

    Read the article

  • Silverlight 4 + RIA Services - Ready for Business: Search Engine Optimization (SEO)

    To continue our series, let's look at SEO and Silverlight. The vast majority of web traffic is driven by search. Search engines are the first stop for many users on the public internet, and increasingly so in corporate environments as well. Search is also the key technology that drives most ad revenue. So needless to say, SEO is important. But how does SEO work in a Silverlight application where most of the interesting content is dynamically generated? I will...

    Read the article

  • What effect does using itemprop="significantLinks" on anchors have on SEO?

    - by hdavis84
    As I've described in a previous post about span tags within head tags, I'm practicing the application of microdata via http://schema.org. Anyone who's browsed the documentation there knows that the guidance on how each property should be used could be much clearer. My question in this post is about the "significantLinks" property and how it affects SEO for on-page, in-content anchor text. Does anyone have more information on whether it's good to use for link optimization? I understand that schema.org means it is to be used on "non-navigational links" and that those links should be relevant to the current page's meaning. But will using this property hurt SEO or improve it for each page? Thanks in advance; by answering this with accurate information you are helping not just me, but many people who are trying to make their customers more successful by helping them rank for keywords relevant to their business, bringing them more search engine traffic.

    Read the article

  • VirtualBox Port Forward

    - by john.graves(at)oracle.com
    A great new feature in VirtualBox 4.0 is the ability to use NAT networking and forward ports without needing to resort to ssh -L/-R tricks. This is great for booting multiple VM domains simultaneously: it is possible to have several instances which map back to the host machine, with different ports on localhost:* automatically forwarding to the correct VM. This avoids the hassle of setting up DNS entries or static IP addresses. In this example, I'm mapping the host ports 3xxxx to the VM's well-known server ports. Note: it is important to set up the Frontend HTTP host/port to avoid incorrect URL rewriting. You may also need to set up an HTTP channel to deal with local traffic, which uses the network address 10.0.2.15. Happy VMing.

    Read the article

  • IIS 6 nested virtual directory redirection

    - by threedaysatsea
    We're running IIS 6 on a WinServer2k3 box and we're having some trouble with the following problem: e-mails were sent out to users asking them to go to the following URL: alias.contoso.com/directory2/view.aspx?queryparam1=no&queryparam2=blue However, the URLs are actually supposed to be: server.contoso.com/directory2/view.aspx?queryparam1=no&queryparam2=blue It's too late to recall all of the e-mails, and we'd like to redirect traffic to make this as seamless as possible for our users. The real problem here is that the server (server.contoso.com) hosts the alias (alias.contoso.com) as a redirect, and we need to keep that existing redirect functional:
    Default Web Site (server.contoso.com)
    --Directory1
    --Directory2
    --Directory3
    Redirection to Directory3 (alias.contoso.com)
    --Essentially, alias.contoso.com takes the user to server.contoso.com/Directory3
    Is there any way to host a separate redirect inside of the existing redirect? We need to keep alias.contoso.com taking the user to server.contoso.com/Directory3, but also make alias.contoso.com/directory2/view.aspx?queryparam1=no&queryparam2=blue point to server.contoso.com/directory2/view.aspx?queryparam1=no&queryparam2=blue Any tips? Is this even possible?

    Read the article

  • Tracking users behaviour - with or without Google Analytics

    - by Ilian Iliev
    If I understand the following correctly (from the GA TOS): "PRIVACY. You will not (and will not allow any third party to) use the Service to track or collect personally identifiable information of Internet users, nor will You (or will You allow any third party to) associate any data gathered from Your website(s) (or such third parties' website(s)) with any personally identifying information from any source as part of Your use (or such third parties' use) of the Service. You will have and abide by an appropriate privacy policy and will comply with all applicable laws relating to the collection of information from visitors to Your websites. You must post a privacy policy and that policy must provide notice of your use of a cookie that collects anonymous traffic data." Then you are not allowed to use custom variables that will identify the visitor (for example website username, e-mail, id, etc.). So the question is: how can I track specific user behaviour, for example the actions that every single logged-in user performs?

    Read the article

  • Moved sitemaps to a different subdomain and losing search referrals around the same time. Red herring or correlation?

    - by er1234
    We started to lose search referral traffic around the same time that I moved some of our sitemaps to a subdomain. Could this have hurt us? I followed Google's steps for creating a sitemap under a different subdomain. The new sitemaps.foo.com subdomain is being crawled and indexed well. Both www.foo.com and sitemaps.foo.com have been verified in Google Webmaster Tools, and they appear as distinct sites. Is this correct? I can't find a way in Webmaster Tools to say "Hey, sitemaps.foo.com is really owned by www.foo.com, so show them together and make sure to attribute sitemaps.foo URLs to www.foo". Our www.foo.com/robots.txt contains:
    Sitemap: http://www.foo.com/sitemap.xml
    Sitemap: http://sitemaps.foo.com/subdir/sitemap.xml.gz

    Read the article

  • JavaScript: disabling a div or a link? (for a 5-star rating system)

    - by Cyprus106
    Basically, I've created a 5-star rating system. Pretty typical: it shows how many stars other people have given the item, and when a user hovers over the stars, it lights up however many stars the cursor is over. It's all run by AJAX; if they click 5 stars, it automatically adds their 5-star rating to the group. The problem is that after they rate the item I want to turn the system off, but I can't seem to do that. I've tried everything I can think of, including element.disable on the a hrefs and on the div, but it still lets them vote away, over and over again, at least in Firefox. Can anyone help me out with a method to simply "freeze" the stars on what the user voted? If I need to add code, that's cool! I figured it probably wasn't necessary in this situation.

    Read the article

  • How can an application (like Firefox) be forced to use a certain network interface?

    - by Lekensteyn
    I have two interfaces on a notebook: eth0 and wlan0. Possible use cases: (1) eth0 grants me Internet access, and wlan0 is currently connected to a router which has no Internet connectivity; for development purposes, I need to connect over wlan0 by default but use eth0 for surfing. (2) eth0 and wlan0 are both connected to the Internet; for a torrent application, eth0 should be used for speed, but for portability of the notebook, SSH should connect over wlan0. (3) eth0 is a wired connection and wlan0 is wireless; sensitive data should be transferred over eth0, but other traffic can go over wlan0 as well. Is there a way to force applications (like nc.traditional or firefox) to use a certain network interface? A wrapper along the lines of "example-wrapper eth0 program" would be fine too, if such a program exists. It would be nice if it could be configured within Firefox (at runtime). I'd like to avoid iptables solutions if possible.
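
    For programs you write yourself, the mechanism such a wrapper would rely on is the Linux-only SO_BINDTODEVICE socket option (it normally needs root or CAP_NET_RAW, and it does not help with an unmodified Firefox). A minimal Python sketch, with the interface name and destination as placeholder examples:

    ```python
    import socket

    # SO_BINDTODEVICE is Linux-specific; fall back to its numeric value (25)
    # if this Python build does not expose the constant.
    SO_BINDTODEVICE = getattr(socket, "SO_BINDTODEVICE", 25)

    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    # Force all traffic on this socket out through wlan0 (needs root/CAP_NET_RAW).
    sock.setsockopt(socket.SOL_SOCKET, SO_BINDTODEVICE, b"wlan0")
    sock.connect(("example.com", 80))
    sock.sendall(b"HEAD / HTTP/1.0\r\nHost: example.com\r\n\r\n")
    print(sock.recv(200).decode(errors="replace"))
    sock.close()
    ```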

    Read the article

  • Redesigning an old site, structure changes, etc.

    - by RhymeGuy
    I have an old site built in 2006; it has around 200 pages and 500 pictures. Every single page is of course indexed, as are the images. It ranks very well for its targeted keywords and I receive a good amount of SEO traffic (I guess that's due to the various campaigns, branding, PPC, etc.). Problem: the site has an outdated design, the pages and images have poorly chosen names, there are no heading or alt tags, and it was built with tables and inline CSS. Goal: completely redesign the site, use divs, change file names, add proper metadata, alt tags, etc. Question: how can this affect the current SEO positions? I will redirect (301) every single page to its new counterpart and build a sitemap, but what should I do with the images? Do I need to redirect them also? Any other suggestions?

    Read the article

  • which way should I look at visits by region in Google Analytics?

    - by Drai
    I need to generate a report for only the Americas in Google Analytics. When I create an advanced segment that includes Continent Exactly Matching Americas, I get one number. If I create a segment that includes Sub-Continent Region Includes America, I get a slightly different number. And if I look at all visits but go to Demographics > Location and segment by sub-continent region, I get yet a third number! (Note: this is because it also includes the Caribbean.) The three differ by only around 1% of traffic. What is the most accurate way to do this, or should I just pick one way and be consistent?

    Read the article

  • How to delete all your old website data from the internet?

    - by Akky Awesøme
    I had my website at rohbits.com, but for various reasons I had to delete it and recreate it at this URL: www.rohbits.com/blog. My problem is that the old links are still visible in Google search, and when people click on those links they land on a 404 error page belonging to the hosting company. I want to either delete all the previous data from the search engines or serve a 404 error page of my own, so that I can tell my visitors where the actual website is. I have already redirected all the traffic that comes to rohbits.com to www.rohbits.com/blog, but when they click on the expired links they still get that error page. One sample expired link is this one: http://rohbits.com/wordpress-tricks.

    Read the article
