Search Results

Search found 7555 results on 303 pages for 'sites'.

Page 138/303 | < Previous Page | 134 135 136 137 138 139 140 141 142 143 144 145  | Next Page >

  • What Exactly Does the Wattage Rating on a Power Supply Unit Mean?

    - by Jason Fitzpatrick
    Your PSU is rated 80 Plus Bronze and for 650 watts, but what exactly does that mean? Read on to see how wattage and power-efficiency ratings translate to real-world use. Today's Question & Answer session comes to us courtesy of SuperUser, a subdivision of Stack Exchange, a community-driven grouping of Q&A web sites.
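
    As a rough worked example (assuming the 80 Plus Bronze floor of 82% efficiency, which the teaser does not state): the 650-watt rating describes DC output capacity, not wall draw, so a unit delivering its full 650 W to the components would pull about 650 / 0.82 ≈ 793 W from the outlet, with the difference lost as heat.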

    Read the article

  • Is adding the license type in the file header enough to say "my code is licensed"? (open source)

    - by silverfox
    I don't know if this is the correct Stack Exchange site to ask this. (Note: if a moderator can move it to a more appropriate SE site, please do.) I have read about licenses on various sites. I just put the license type in the header of the file (in my case a JavaScript file, open source):

        /*
         * "codeName" "version"
         * http://officialsite.com/
         *
         * Copyright 2012 "codeName"
         * Released under the "LICENSE NAME" license
         * http://officialsite.com/LICENSE NAME
         */
        javascript code ...

    In the same folder I leave a copy of the license, so the listing of the folder looks like this:

        codeName.js
        LICENSE

    The LICENSE file contains the license my code uses. What nobody says is whether this is enough to say my code is licensed (in the open-source case), or whether something more is required. Thanks.

    Read the article

  • My application's bounce rate jumped from 30% to 80% overnight

    - by davidrac
    My application is provided as a service that is embedded in other sites. I have Google Analytics installed on the login popup dialog, which is a page of my application opened from the host site (OAuth). About a week ago, I noticed a sharp decrease in the number of new user registrations and a jump in the bounce rate (from ~30% to ~80%). This happened without any change to the application. I looked into technical parameters like page load time and error rates, but could not see any change there. Any ideas what could cause this behavior?

    Read the article

  • script shortcut to open two files in gedit as sudo

    - by Sam
    I want to double-click a file on my desktop and have two files open in gedit as sudo. Whenever I'm making a new website, I need to open /etc/hosts and /etc/apache2/sites-enabled/000-default.conf as sudo. At the moment this means opening the terminal, running sudo gedit, then opening each file manually. I want to streamline this part of my workflow. On Windows I wrote a little script which worked nicely. How can I do the same in Ubuntu? So far in my searches I've come across ways of adding a shortcut to the file browser and similar things, but not exactly what I want. I have tried creating a desktop launcher, but can't see how to make it run as sudo.
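
    A minimal sketch of such a launcher script, assuming gksudo (from the gksu package) is installed; the file names are the ones from the question:

        #!/bin/bash
        # Open both files in a single gedit window with root privileges.
        # gksudo shows a graphical password prompt, so this works when
        # launched from a double-clicked .desktop launcher.
        gksudo "gedit /etc/hosts /etc/apache2/sites-enabled/000-default.conf"

    Saved as, say, webdev-edit.sh and marked executable (chmod +x webdev-edit.sh), the script can be referenced from the Exec= line of a .desktop launcher to make it double-clickable.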

    Read the article

  • The first non-Latin domain names are working, with URLs in Arabic characters

    Update of 07.05.2010 by Katleen. The first non-Latin domain names are working, with URLs in Arabic characters. A few hours ago, the first three non-Latin domain names were placed in the root zone of the DNS. They are therefore now in service, and working perfectly. Here is an example of what you may see in the URL field of your browser if you visit one of these sites: [IMG]http://blog.icann.org/wp-content/uploads/2010/05/idn-example-450px.png[/IMG] These three new domains are السعودية. ("Al-Saudiah"), امارات. ("Emarat") and ...

    Read the article

  • How can I determine the trending pages on my site?

    - by Dogweather
    I'm looking to find out what the "hot" pages are on one of my sites. I want to see, for various timeframes, what the top 50 pages are. I'm going to create a data feed with this info, which will be input to another app. I have Apache logs, and complete control of the machine to install what I want. I'm mostly wondering whether there's something out there already that I can use, or, if I have to implement it myself, what good algorithms or strategies might be. Thanks.
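
    As a baseline before reaching for a dedicated tool, a raw top-50 count can be pulled straight from the logs; a sketch assuming the Apache common/combined log format (where field 7 of each line is the request path) and an example log location:

        #!/bin/bash
        # Count requests per URL path and list the 50 most requested.
        # Field 7 is the path inside the quoted request line
        # ("GET /path HTTP/1.1") of the common/combined log format.
        awk '{ print $7 }' /var/log/apache2/access.log \
          | sort | uniq -c | sort -rn | head -50

    Restricting this to a timeframe is a matter of filtering on the timestamp field first (e.g. piping through grep "12/Jun/2012"). For genuine trending, as opposed to raw popularity, one common strategy is to run the same count over two adjacent time windows and rank pages by the increase between them.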

    Read the article

  • I am the Webmaster now. Where do I start? [closed]

    - by John C
    I just changed jobs and will soon be in charge of a custom-built ASP.NET CMS and website for a fairly large corporation with global offices. I have IT and developer FTE resources available to me, but I am trying to build a list of branding, project, and functionality points to review. What guides or lists can or should I use to evaluate this website before I begin adding features, creating new projects, or even redesigning and redeveloping the site? (Background: I have been a webmaster/designer/developer for small WordPress/Drupal sites for 10 years, and an unofficial webmaster (director/content manager) for a large site for 3 years, with no direct control over SharePoint administration, IIS, or hosting, but everything else: analytics, email, advertising, social, SEO, etc.) Thank you!

    Read the article

  • Question about SEO and Domains

    - by jasondavis
    This is my first post on here, as I am mainly on Stack Overflow and Server Fault. I have been programming for at least 10 years now and have made hundreds of websites, but I have only recently started getting into design and the SEO side of sites; it's sad that I have been overlooking these for so many years. I have picked up some SEO knowledge over the years, but I have never really looked into it until now. My question: I would like to build a site that targets many different keywords in the search engines. For example, let's say I built a site about outdoor activities called outdoorreview.com, and I planned on having many sections: hunting, fishing, hiking, camping, cycling, climbing, etc. For the best search engine results, how could I get the most search engine traffic to all these areas? Also, how should I structure the URLs to get to them: outdoorreview.com/hiking/ or hiking.outdoorreview.com?

    Read the article

  • How best to take a user's signature online? (UK law orientated) [closed]

    - by Ben Griffiths
    Not sure if this is the best place to ask, but I can't seem to find any other SE site that would fit better (unless there's a law one?). I'm building an application that will replace an existing paper-based form, and this form would normally be signed by the person filling it in. Looking around, it's hard to find a good definitive resource explaining what I can and cannot accept as a signature. It looks like some UK government online forms accept just your name typed into a box, but I've also heard you should back this up with an email: the user types their name into a box and provides an email address, you send out an email, and they click a link within it to complete the verification. Involving email seems very long-winded and leaves the system open to spam filters blocking emails, forgotten emails that just sit in inboxes, etc. So, does anyone have any knowledge in this department? Personally, I'd love to just get them to type their name into a box and be done with it!

    Read the article

  • jQuery 1.8 b1 is available; the framework's code is now split into separate modules

    jQuery 1.8 b1 is available. jQuery is used by 50% of the web's major sites, but the browsers and devices it runs on have changed a great deal in the past 6 years. Likewise, the methods and tools for building a website are changing rapidly, and jQuery has to adapt continuously to its environment. As decided at the release of version 1.7, the development team now constantly asks itself: "Is this addition indispensable? What can be removed?" New deprecations have been added to the list announced at that time. These questions are crucial in the area of mobile devices; jQuery must provide developers with ...

    Read the article

  • O'Reilly 50% off offer on CSS3 books until 05:00 PT on Oct 28

    - by TATWORTH
    Originally posted on: http://geekswithblogs.net/TATWORTH/archive/2013/10/21/oreilly-50-off-offer-on-css3-books-to-0500-pt.aspx. At http://shop.oreilly.com/category/deals/css3.do?code=WKCSS&imm_mid=0b155e&cmp=em-prog-books-videos-lp-owo_css3_direct_wkcss, O'Reilly are offering 50% off a number of e-books on mastering CSS3 until 05:00 PT on Oct 28: "CSS3—the technology behind most of the eye-catching visuals on the Web today—is loaded with capabilities that once would have required JavaScript or third-party plugins, such as animation, pseudo-classes, and media queries. Use CSS3 to transform markup into stunning, richly detailed web pages that look great in any browser. For one week only, SAVE 50% on CSS3 ebooks from shop.oreilly.com and take your sites from ordinary to incredible."

    Read the article

  • Disable outbound links without letting others know

    - by tadoman
    Is there a way I can tell Google not to follow external links (pointing to other sites) without letting others know? I know you can disable outbound links by putting rel=nofollow on them, or something in robots.txt, but that's something others can see as well. I'm just wondering if there's a way to tell Google not to follow those links without letting others know, like a setting in Webmaster Tools or something similar. (There's definitely one way: I could set an exception in my server's conf file that checks for the "googlebot" user agent and serves a different version of robots.txt, so that when a different user checked that link it would return a different robots.txt than the one served to Googlebot. However, I'm not too sure Google would be too happy about this.) Thank you
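
    For what it's worth, the mechanics of the conf-file idea look roughly like this with Apache mod_rewrite (robots-googlebot.txt is a hypothetical file name). Note that serving Googlebot different content from other visitors is cloaking, which Google's guidelines prohibit, so this only illustrates why the asker's reservation is justified:

        # Serve an alternative robots.txt to Googlebot only.
        RewriteEngine On
        RewriteCond %{HTTP_USER_AGENT} Googlebot [NC]
        RewriteRule ^/?robots\.txt$ /robots-googlebot.txt [L]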

    Read the article

  • Keyword Generator Tool Gets You Ahead of the Competition

    A keyword generator tool provides ideas that website owners and search engine optimizers use for site and engine optimization. Key-phrase generators rely on search-query popularity, from introductory keywords to more complex keyword search management, to drive more traffic to a website. It maximizes prospective, potentially high-traffic keywords and integrates them with your site's campaign techniques. A keyword generator tool allows you to manage and add "exact match" and "phrase match" keywords to your lists; it also allows you to create misspellings, combine and reverse keywords, and automatically calculate the ad-group focus score of your keyword lists.

    Read the article

  • Protecting design ideas from being copied by other websites?

    - by mickburkejnr
    Hi everyone, I'm planning a project at the moment, while building a completely different project at the same time. Both of these projects are quite innovative in the way they either work or are presented. One of the projects hasn't been done before; the other has competition, but I feel the competitors' websites are light years behind what I'm doing. Is there a way for me to prevent the way my sites work or are presented from being stolen? I've thought of patenting parts of them, but that requires £10,000, and I don't have that amount of money. Also, would putting a copyright notice on the site, or an "All Rights Reserved" tag, give me any muscle when going after websites that I feel have stolen my ideas (if they have)? Cheers!

    Read the article

  • How should I handle search engines auto-correcting the spelling of a site's name?

    - by Nathan G.
    A client's site and company is called "Tranin Communications" (Tranin is her last name). It ranks well in searches for her name but rather poorly in searches for the name of her site/company. I realized that this is largely because* search engines (Google especially) assume the query was misspelled and automatically include results for both "train communications" and "communications training". Both of those queries yield many high-ranking sites that completely drown out hers. Sometimes Google even shows results for "communications training" instead of "tranin communications", hiding her site altogether. Is there a way to report an incorrect auto-correction to Google, or something I can do to discourage this behavior (e.g. a meta tag)? My searches have come up cold; any suggestions would be appreciated. *I've come to this conclusion because her site ranks very highly when the same queries are put in quotes.

    Read the article

  • How should I study a competitor's off page SEO?

    - by Chris Adragna
    What are the things I need to do, and with what tools, to learn what a competitor has working for them off-page (please suggest both free and paid tools)? First of all, I suppose I want to see all of the sites linking in and what anchor text is used. Is there something that would report on the anchor text linking in, such as counting the keyword phrases used as anchor text? It would also be helpful to see where the PageRank is coming from, such as a listing of inbound links ordered by the PR of the linking page. Lastly, if I'm missing something here in the way of off-page attributes, please say so.

    Read the article

  • Covering Yourself For Copyrighted Materials [on hold]

    - by user3177012
    I was thinking about developing a small community website where people of a certain profession can register and post their own blogs (which include an optional photo). I then got to thinking about how people might use this, and the fact that if they are given the option to add a photo, they might well use one they simply found on Google, another social network, or an existing online blog or magazine article. So how do I cover myself from getting a fine slapped on me, and make it purely the fault of the individual uploader? I plan on having an option where the user can credit a photo by typing in the original photographer's name and a web link (optional), and making them tick a checkbox stating that the post is their own content and that they have permission to use any images. But is that enough to cover myself? How do other sites do it?

    Read the article

  • About CDN architecture and routing

    - by Tony Lee
    Our web system uses a third-party CDN service. Suppose a user sets their local DNS to Google DNS or OpenDNS to visit our sites: the CDN service then selects the proxy node closest to the DNS resolver, but the user's actual location may be somewhere else entirely, so the CDN may choose the node furthest away from the user, and static resources load more slowly. At present, my idea is: if the user's local DNS is set to Google DNS, we first get the user's actual IP address, traceroute to test for the best route, set a cookie in the user's browser, and then send a 302 response to redirect to the best CDN node. Can a traceroute-like tool on the user's browser side provide this best-route decision? We have found that once a user sets their local DNS to a foreign network segment (for example, 8.8.8.8), the CDN routing chooses a foreign service node.

    Read the article

  • What is the impact of a CMS on page load time versus a static site?

    - by PleaseStand
    I am creating a 20-page site that will go on shared hosting. Each page will be about 20 KB (including HTML, CSS, and images common to all pages). To avoid manually adding navigation elements to each page, I am considering using a CMS. However, I am concerned that on a busy server, using a CMS would make the site load more slowly. In a shared hosting environment where PHP is run as a CGI binary, how much does a CMS (WordPress, Drupal, etc.) generally affect page load time, compared to both "plain HTML" static sites and those using PHP as merely a templating language?
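
    One way to get a concrete number for a particular shared host is to benchmark a static page against a CMS-rendered one; a minimal sketch with ApacheBench (the URLs are placeholders):

        # Compare mean response times: 100 requests, 10 concurrent.
        ab -n 100 -c 10 http://example.com/static-page.html
        ab -n 100 -c 10 http://example.com/cms-page/

    As a general observation, page caching (e.g. WP Super Cache for WordPress, or Drupal's built-in page cache) serves pre-rendered HTML on most hits and closes much of the gap between a CMS and plain static files.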

    Read the article

  • Is Azure Compatible with JPEG XR?

    - by Shawn Eary
    I just put an F#/MVC app into a Windows Azure solution as a web role. Before migration, my JPEG XR (*.WDP) files were displayed on the client in IE9 without issue, via both my local and hosted sites. Now, after migration into Windows Azure, my JPEG XR files are displayed neither by my local Windows Azure compute emulator nor when deployed to http://*.cloudapp.net. Is there some sort of conflict between Windows Azure and (JPEG XR) *.wdp files? If so, what is the accepted best practice for overcoming it?
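
    One plausible culprit (an assumption, not confirmed in the question): IIS only serves static files whose extensions are registered in its MIME map, and .wdp is not in the default map, so the web role's IIS returns 404 for those files. A sketch of a web.config fragment that registers the JPEG XR MIME type:

        <!-- web.config: let IIS (including the Azure web role's IIS)
             serve JPEG XR images as static content. -->
        <configuration>
          <system.webServer>
            <staticContent>
              <mimeMap fileExtension=".wdp" mimeType="image/vnd.ms-photo" />
            </staticContent>
          </system.webServer>
        </configuration>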

    Read the article

  • Will duplicate international (i18n) content hinder SEO rankings?

    - by Rhys
    Google clearly states that duplicate content within a single domain, or across multiple domains, is not advised. This is understood, but I am not sure of any exceptions for sites with region-specific content that is often replicated across locales. For example, a site's /en-us/about page could be identical to /en-uk/about, whereas /en-ja/about is most likely unique. Are GYM smart enough to understand that the initial URL depth is a locale specifier? Is there any robots.txt, header, or similar trickery that I should include to outline the site's international structure?
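
    One concrete mechanism worth knowing here: Google supports rel="alternate" hreflang annotations, which declare the locale variants of a page explicitly and tell the engine the duplication is deliberate. A sketch following the question's URL structure (example.com is a placeholder; hreflang values must be a language code plus optional region, so the "en-ja" path would be annotated as plain "ja"):

        <!-- In the <head> of every locale variant of the page;
             each variant lists all variants, including itself. -->
        <link rel="alternate" hreflang="en-us" href="http://example.com/en-us/about" />
        <link rel="alternate" hreflang="en-gb" href="http://example.com/en-uk/about" />
        <link rel="alternate" hreflang="ja" href="http://example.com/en-ja/about" />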

    Read the article

  • Why do I get a 403 error when accessing my apache server?

    - by nishan
    I'm running Ubuntu 12.04 LTS on a system with 2 GB RAM and a 500 GB HDD. My hard drive has 4 partitions:

        Partition 1 = 40 GB Windows (NTFS, label = win32)
        Partition 2 = 320 GB Windows (FAT, label = common)
        Partition 3 = 40 GB Ubuntu (ext4)

    I installed apache2. Then, to change its default www directory, I ran gksu gedit /etc/apache2/sites-enabled/000-default and, in the editor, changed the location to /media/common/www. After that I ran these commands in a terminal:

        chmod 777 /media/common/www
        chmod 777 /media/common/www/*.*

    After that I ran firefox 127.0.0.1/index.php. It said:

        Forbidden
        You don't have permission to access / on this server.
        Apache/2.2.22 (Ubuntu) Server at 127.0.0.1 Port 80

    Before my changes it was working fine. How can I run my websites?
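
    Two things commonly cause this after moving DocumentRoot (offered as likely suspects, not a confirmed diagnosis). First, Apache 2.2 needs a <Directory> block granting access to the new location; a sketch to sit alongside the DocumentRoot change in /etc/apache2/sites-enabled/000-default:

        <Directory /media/common/www>
            Options Indexes FollowSymLinks
            AllowOverride None
            Order allow,deny
            Allow from all
        </Directory>

    followed by sudo service apache2 reload. Second, chmod has no effect on a FAT partition, since FAT stores no Unix permissions; the effective permissions come from the mount options, so the partition may need to be mounted with options such as umask=022 (e.g. in /etc/fstab) for the www-data user to read it.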

    Read the article

  • How to organize my site's file system properly?

    - by Wolfpack'08
    Doing some reading on Stack Overflow, I've found a lot of information suggesting that proper organization of a file system is crucial to a well-written web app. One of the key pieces of evidence is high-frequency references to "separation of concerns" in questions related to keeping programs organized. Now, I've found some information on organizing file systems (the Filesystem Hierarchy Standard) from 2004. It raises only two concerns: first, the standard is a bit dated, so I believe it may be possible to do better given the changes in technology over the past 8 years; second, and most important, my application is very small compared to an entire Linux distro. I think the file system should be organized very differently because of that. Here's what I'm looking at currently:

        /scripts, /databases
        /www -> /dev, /production -> login, router, admin pages, /sites -> content types, static pages
        /modules, /includes, /css
        /media -> /module-specific-media

    Read the article

  • Convert Currencies Dynamically using PHP, Google and cURL [closed]

    - by LizO
    I want to allow users to dynamically change the currency of the product prices in my web store, right there on the page. For example, 300 USD would change to 221.61 EUR when the user selects Euros from a dropdown. I found a few sites with PHP code in a calculator/input format (the user inputs a value and the converted amount is output):

        http://www.chazzuka.com/blog/?p=104
        http://www.pixel2life.com/publish/tutorials/1166/currency_conversion_in_php/page-3/

    I was hoping someone could help me figure out how to modify the PHP script. Thanks in advance.
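
    Not the PHP modification itself, but a sketch of where a live rate can come from: the European Central Bank publishes a free daily reference-rate XML feed, which cURL can fetch (shown as a shell pipeline for brevity; the same fetch-and-parse translates directly to PHP's curl functions):

        # Fetch ECB daily rates, extract the USD-per-EUR rate,
        # and convert 300 USD to EUR.
        rate=$(curl -s http://www.ecb.europa.eu/stats/eurofxref/eurofxref-daily.xml \
          | sed -n "s/.*currency='USD' rate='\([0-9.]*\)'.*/\1/p")
        awk -v r="$rate" 'BEGIN { printf "300 USD = %.2f EUR\n", 300 / r }'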

    Read the article

  • Moved sitemaps to a different subdomain and losing search referrals around the same time. Red herring or correlation?

    - by er1234
    We started to lose search referral traffic around the same time that I moved some of our sitemaps to a subdomain. Could this have hurt us? I followed Google's steps for creating a sitemap under a different subdomain. The new sitemaps.foo.com subdomain is being crawled and indexed well. Both www.foo.com and sitemaps.foo.com have been verified in Google Webmaster Tools, where they appear as distinct sites. Is this correct? I can't find a way in Webmaster Tools to say "Hey, sitemaps.foo.com is really owned by www.foo.com, so show them together and make sure to attribute sitemaps.foo URLs to www.foo." Our www.foo.com/robots.txt contains:

        Sitemap: http://www.foo.com/sitemap.xml
        Sitemap: http://sitemaps.foo.com/subdir/sitemap.xml.gz

    Read the article
