Search Results

Search found 29554 results on 1183 pages for 'human computer interface'.

Page 463/1183 | < Previous Page | 459 460 461 462 463 464 465 466 467 468 469 470  | Next Page >

  • Protect Your Data with Windows Vista

    Nowadays nothing is more important than backing up the data on your computer. But there are still many people who do not understand the importance of protecting data. Therefore when they proceed fo... [Author: Susan Brown - Computers and Internet - May 08, 2010]

    Read the article

  • ClearTrace for SQL Server 2012

    - by Bill Graziano
    I’ve updated the beta for ClearTrace so that it supports SQL Server 2012. This requires SQL Server 2012 to be installed on the computer where ClearTrace is running. It will read traces from SQL Server 2008 R2, SQL Server 2008, and SQL Server 2005. It includes some minor improvements in performance and in handling large SQL statements, and it should also give better errors. If you do find any of those errors, please report them in the support forum.

    Read the article

  • How to hide the admin login form?

    - by John Doe
    On my website there are no accounts except those of moderators and administrators. That's why I don't want to show the login form to everyone, only to these people. I thought of using an obscure URL for the login form, like www.example.com/1a79a4d60de6718e8e5b326e338ae533, that only admins and mods would know. But that's quite an impractical solution; besides, if someone wants to log in from another computer and forgets the URL, they can't. Is there a more effective way?
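
    One possible alternative, sketched here purely for illustration (the question does not say what framework the site runs on, so the ASP.NET MVC filter, the attribute name, and the IP whitelist below are all hypothetical): keep the login URL ordinary, but hide the page from anyone outside a known set of addresses and rely on real authentication rather than an obscure URL.

        // Hypothetical ASP.NET MVC filter; names and addresses are illustrative only.
        using System.Linq;
        using System.Web.Mvc;

        public class RestrictToKnownIpsAttribute : ActionFilterAttribute
        {
            // In practice this whitelist would come from configuration, not a constant.
            private static readonly string[] AllowedIps = { "203.0.113.10", "203.0.113.11" };

            public override void OnActionExecuting(ActionExecutingContext filterContext)
            {
                var ip = filterContext.HttpContext.Request.UserHostAddress;
                if (!AllowedIps.Contains(ip))
                {
                    // Pretend the login page does not exist for everyone else.
                    filterContext.Result = new HttpNotFoundResult();
                }
            }
        }

        // Usage (hypothetical controller action):
        // [RestrictToKnownIps]
        // public ActionResult Login() { return View(); }

    With something like this applied to the login action, a forgotten secret URL is no longer an issue, and casual visitors simply see a 404.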

    Read the article

  • Traveling With Laptop

    Laptops are great, especially if your business requires frequent traveling or if you simply cannot leave your home without your computer for one reason or another... [Author: Jeremy Mezzi - Computers and Internet - May 29, 2010]

    Read the article

  • Is it possible to use Unity 3D and GNOME 3 without any drivers?

    - by user49523
    Currently I don't have any driver available for the laptop I use, and because of this I can only use Unity 2D and GNOME 2. I have Ubuntu 11.10 and Ubuntu 12.04 installed, with no drivers for either. I would like to use Unity 3D and GNOME 3 instead, or at least have that option, even if the computer becomes a little slower. So, is there a way of enabling Unity 3D and GNOME 3 without any driver installed?

    Read the article

  • YouTube, no video or sound

    - by Cautious1
    I tried the answers from previous posts without luck. I'm using Ubuntu 10.04.4, and YouTube shows a black screen with no video and no sound. I uninstalled Adobe Flash Player, shut the machine down, and reinstalled it, but it didn’t help. I have run Mint 13 on the same computer and it plays YouTube without a problem. I'm not familiar with Linux terminology, and using commands in the terminal might make everything terminal if I try!

    Read the article

  • The Internet – Then and Now (1996 versus 2011) [Infographic]

    - by Asian Angel
    Use the link below to view the entire infographic. Keep in mind that it may take a few moments for it to load due to its large size. True Hollywood Story: Bipeds and the World Wide Web [infographic] [via TinyHacker]

    Read the article

  • Live CD and USB install failure: blank screen when trying to install on an HP Pavilion dv6

    - by Ajian
    I recently bought a new computer and have been trying to install Linux on it (11.10 x64). It is an HP Pavilion dv6-6117dx: 2.4GHz/1.5GHz VISION A8 Technology from AMD, with an AMD Quad-Core A8-3500M Accelerated Processor and AMD Radeon HD 6620G Discrete-Class Graphics. I am pretty sure I picked an unsupported graphics card or something. I have tried booting from USB as well, but the screen becomes blank after rebooting.

    Read the article

  • Why Are We Still Using CPUs Instead of GPUs?

    - by Jason Fitzpatrick
    Increasingly, GPUs are being used for non-graphical tasks like risk computations, fluid dynamics calculations, and seismic analysis. What’s to stop us from adopting GPU-driven devices? Today’s Question & Answer session comes to us courtesy of SuperUser—a subdivision of Stack Exchange, a community-driven grouping of Q&A web sites.

    Read the article

  • BIG DATA eBook - Now Available

    - by Javier Puerta
    The Big Data interactive e-book “Meeting the Challenge of Big Data: Part One” has just been released. It’s your “one-stop shop” for info about Big Data and the Oracle offering around it. The new e-book (available on your computer or iPad) is packed with multi-media resources to educate Oracle staff, customers, prospects and partners on the value of Big Data. It features videos, tutorials, podcasts, reports, white papers, datasheets, blogs, web links, a 3-D demo, and more. Go and get it here!

    Read the article

  • Your system is running in low-graphics mode with an ATI Radeon 3200 Graphics card

    - by say
    I installed 12.04 LTS (upgraded from 11.10), but when I start my computer it shows "Your system is running in low-graphics mode. Your screen, graphics card, and input device settings could not be detected correctly. You will need to configure these yourself." It then shows a dialog asking what I want to do, but that doesn't work correctly. So I can only access the terminal, and I don't know how to set this stuff up or how to start the GUI, because I'm a terminal newbie :-) Thanks for any help :-)

    Read the article

  • From HttpRuntime.Cache to Windows Azure Caching (Preview)

    - by Jeff
    I don’t know about you, but the announcement of Windows Azure Caching (Preview) (yes, the parentheses are apparently part of the interim name) made me a lot more excited about using Azure. Why? Because one of the great performance tricks of any Web app is to cache frequently used data in memory, so it doesn’t have to hit the database, a service, or whatever.

    When you run your Web app on one box, HttpRuntime.Cache is a sweet and stupid-simple solution. Somewhere in the data fetching pieces of your app, you can see if an object is available in cache, and return that instead of hitting the data store. I did this quite a bit in POP Forums, and it dramatically cuts down on the database chatter. The problem is that it falls apart if you run the app on many servers, in a Web farm, where one server may initiate a change to that data, and the others will have no knowledge of the change, making it stale. Of course, if you have the infrastructure to do so, you can use something like memcached or AppFabric to do a distributed cache, and achieve the caching flavor you desire.

    You could do the same thing in Azure before, but it would cost more because you’d need to pay for another role or VM or something to host the cache. Now, you can use a portion of the memory from each instance of a Web role to act as that cache, with no additional cost. That’s huge. So if you’re using a percentage of memory that comes out to 100 MB, and you have three instances running, that’s 300 MB available for caching.

    For the uninitiated, a Web role in Azure is essentially a VM that runs a Web app (worker roles are the same idea, only without the IIS part). You can spin up many instances of the role, and traffic is load balanced to the various instances. It’s like adding or removing servers to a Web farm all willy-nilly and at your discretion, and it’s what the cloud is all about. I’d say it’s my favorite thing about Windows Azure.

    The slightly annoying thing about developing for a Web role in Azure is that the local emulator that’s launched by Visual Studio is a little on the slow side. If you’re used to using the built-in Web server, you’re used to building and then alt-tabbing to your browser and refreshing a page. If you’re just changing an MVC view, you’re not even doing the building part. Spinning up the simulated Azure environment is too slow for this, but ideally you want to code your app to use this fantastic distributed cache mechanism.

    So first off, here’s the link to the page showing how to code using the caching feature. If you’re used to using HttpRuntime.Cache, this should be pretty familiar to you. Let’s say that you want to use the Azure cache preview when you’re running in Azure, but HttpRuntime.Cache if you’re running local, or in a regular IIS server environment. Through the magic of dependency injection, we can get there pretty quickly. First, design an interface to handle the cache insertion, fetching and removal.
    Mine looks like this:

        public interface ICacheProvider
        {
            void Add(string key, object item, int duration);
            T Get<T>(string key) where T : class;
            void Remove(string key);
        }

    Now we’ll create two implementations of this interface… one for Azure cache, one for HttpRuntime:

        public class AzureCacheProvider : ICacheProvider
        {
            public AzureCacheProvider()
            {
                _cache = new DataCache("default"); // in Microsoft.ApplicationServer.Caching, see how-to
            }

            private readonly DataCache _cache;

            public void Add(string key, object item, int duration)
            {
                _cache.Add(key, item, new TimeSpan(0, 0, 0, 0, duration));
            }

            public T Get<T>(string key) where T : class
            {
                return _cache.Get(key) as T;
            }

            public void Remove(string key)
            {
                _cache.Remove(key);
            }
        }

        public class LocalCacheProvider : ICacheProvider
        {
            public LocalCacheProvider()
            {
                _cache = HttpRuntime.Cache;
            }

            private readonly System.Web.Caching.Cache _cache;

            public void Add(string key, object item, int duration)
            {
                _cache.Insert(key, item, null, DateTime.UtcNow.AddMilliseconds(duration),
                    System.Web.Caching.Cache.NoSlidingExpiration);
            }

            public T Get<T>(string key) where T : class
            {
                return _cache[key] as T;
            }

            public void Remove(string key)
            {
                _cache.Remove(key);
            }
        }

    Feel free to expand these to use whatever cache features you want. I’m not going to go over dependency injection here, but I assume that if you’re using ASP.NET MVC, you’re using it. Somewhere in your app, you set up the DI container that resolves interfaces to concrete implementations (Ninject calls it a “kernel” instead of a container). For this example, I’ll show you how StructureMap does it. It uses a convention-based scheme, where if you need to get an instance of IFoo, it looks for a class named Foo. You can also do this mapping explicitly. The initialization of the container looks something like this:

        ObjectFactory.Initialize(x =>
        {
            x.Scan(scan =>
            {
                scan.AssembliesFromApplicationBaseDirectory();
                scan.WithDefaultConventions();
            });
            if (Microsoft.WindowsAzure.ServiceRuntime.RoleEnvironment.IsAvailable)
                x.For<ICacheProvider>().Use<AzureCacheProvider>();
            else
                x.For<ICacheProvider>().Use<LocalCacheProvider>();
        });

    If you use Ninject or Windsor or something else, that’s OK. Conceptually they’re all about the same. The important part is the conditional statement that checks to see if the app is running in Azure. If it is, it maps ICacheProvider to AzureCacheProvider; otherwise it maps to LocalCacheProvider. Now when a request comes into your MVC app, and the chain of dependency resolution occurs, you can see to it that the right caching code is called. A typical design may have a call stack that goes: Controller –> BusinessLogicClass –> Repository. Let’s say your repository class looks like this:

        public class MyRepo : IMyRepo
        {
            public MyRepo(ICacheProvider cacheProvider)
            {
                _context = new MyDataContext();
                _cache = cacheProvider;
            }

            private readonly MyDataContext _context;
            private readonly ICacheProvider _cache;

            public SomeType Get(int someTypeID)
            {
                var key = "somename-" + someTypeID;
                var cachedObject = _cache.Get<SomeType>(key);
                if (cachedObject != null)
                {
                    _context.SomeTypes.Attach(cachedObject);
                    return cachedObject;
                }
                var someType = _context.SomeTypes.SingleOrDefault(p => p.SomeTypeID == someTypeID);
                _cache.Add(key, someType, 60000);
                return someType;
            }

            ... // more stuff to update, delete or whatever, being sure to remove
                // from cache when you do so
        }

    When the DI container gets an instance of the repo, it passes an instance of ICacheProvider to the constructor, which in this case will be whatever implementation was specified when the container was initialized. The Get method first tries to hit the cache, and of course doesn’t care what the underlying implementation is, Azure, HttpRuntime, or otherwise. If it finds the object, it returns it right then. If not, it hits the database (this example is using Entity Framework), and inserts the object into the cache before returning it. The important thing not pictured here is that other methods in the repo class will construct the key for the cached object, in this case “somename-” plus the ID of the object, and then remove it from cache, in any method that alters or deletes the object. That way, no matter what instance of the role is processing the request, it won’t find the object if it has been made stale, that is, updated or outright deleted, forcing it to attempt to hit the database.

    So is this good technique? Well, sort of. It depends on how you use it, and what your testing looks like around it. Because of differences in behavior and execution of the two caching providers, you could see some strange errors. For example, I immediately got an error indicating there was no parameterless constructor for an MVC controller, because the DI resolver failed to create instances for the dependencies it had. In reality, the NuGet-packaged DI resolver for StructureMap was eating an exception thrown by the Azure components that said my configuration, outlined in that how-to article, was wrong. That error wouldn’t occur when using the HttpRuntime. That’s something a lot of people debate: using different components like that, and how you configure them. I kinda hate XML config files, and like the idea of the code-based approach above, but you should be darn sure that your unit and integration testing can account for the differences.
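
    The StructureMap setup above is the only container the post shows; as a rough sketch of the “Ninject or Windsor would work the same way” point, the equivalent conditional binding in Ninject might look something like this (the module name and usage lines are illustrative, not from the original article):

        // Hedged sketch only: assumes the same ICacheProvider implementations shown above.
        using Ninject;
        using Ninject.Modules;
        using Microsoft.WindowsAzure.ServiceRuntime;

        public class CacheModule : NinjectModule
        {
            public override void Load()
            {
                // Same idea as the StructureMap conditional: choose the provider based on
                // whether the code is running inside an Azure role instance.
                if (RoleEnvironment.IsAvailable)
                    Bind<ICacheProvider>().To<AzureCacheProvider>();
                else
                    Bind<ICacheProvider>().To<LocalCacheProvider>();
            }
        }

        // Usage (illustrative):
        // var kernel = new StandardKernel(new CacheModule());
        // var cache = kernel.Get<ICacheProvider>();

    Whichever container you pick, the point is the same: the decision about which cache implementation to use is made once, at startup, and the rest of the app only ever sees ICacheProvider.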

    Read the article

  • Is CPU Performance Affected by Age?

    - by Jason Fitzpatrick
    Your computer feels a little slower than it did this time last year; is that change something you can chalk up to an aging processor? Today’s Question & Answer session comes to us courtesy of SuperUser—a subdivision of Stack Exchange, a community-driven grouping of Q&A web sites.

    Read the article

  • How to access a shared folder in VirtualBox

    - by alsadi90
    I followed the steps for sharing folders between Windows 7 and Ubuntu in VirtualBox, but the folder appears with an X sign and gives me the following message when I open it: "the folder content could not be displayed". When I choose "Shared Folders" from the "Devices" menu, the following is written below: "on the System page, you have assigned more than 50% of your computer's memory (2.93) to the virtual machine ...

    Read the article

  • Can't access some websites using Ubuntu 13.10

    - by Adame Doe
    Something's wrong with Ubuntu. Since I upgraded to 13.10, I can't access some websites for no apparent reason. I've tried everything imaginable to solve this problem: made sure that the MTUs are the same, disabled IPv6 in both the network manager and the browsers I use, deactivated my network keys, DMZed my computer, used other DNS servers like Google and OpenDNS, and checked that no firewall was running on my computer... And it's the same result. I even tried to reinstall Ubuntu a couple of times, but no luck. The most annoying thing about it is I can't access wordpress.org! So there's no way it could be an ISP restriction of some kind. When I use a VPN, I can access pretty much anything. I'm really frustrated because I have to use wordpress.org very often. Any clue?

    ifconfig

        adame@adame-ws:~$ ifconfig
        eth0      Link encap:Ethernet  HWaddr 00:26:18:3d:b0:7c
                  inet addr:10.42.0.1  Bcast:10.42.0.255  Mask:255.255.255.0
                  inet6 addr: fe80::226:18ff:fe3d:b07c/64 Scope:Link
                  UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
                  RX packets:8024 errors:0 dropped:0 overruns:0 frame:0
                  TX packets:7966 errors:0 dropped:0 overruns:0 carrier:0
                  collisions:0 txqueuelen:1000
                  RX bytes:684480 (684.4 KB)  TX bytes:616608 (616.6 KB)

        lo        Link encap:Local Loopback
                  inet addr:127.0.0.1  Mask:255.0.0.0
                  inet6 addr: ::1/128 Scope:Host
                  UP LOOPBACK RUNNING  MTU:65536  Metric:1
                  RX packets:8222 errors:0 dropped:0 overruns:0 frame:0
                  TX packets:8222 errors:0 dropped:0 overruns:0 carrier:0
                  collisions:0 txqueuelen:0
                  RX bytes:568269 (568.2 KB)  TX bytes:568269 (568.2 KB)

        wlan0     Link encap:Ethernet  HWaddr 00:19:70:40:85:eb
                  inet addr:192.168.2.3  Bcast:192.168.2.255  Mask:255.255.255.0
                  inet6 addr: fe80::219:70ff:fe40:85eb/64 Scope:Link
                  UP BROADCAST RUNNING MULTICAST  MTU:1464  Metric:1
                  RX packets:123705 errors:0 dropped:0 overruns:0 frame:0
                  TX packets:98141 errors:0 dropped:0 overruns:0 carrier:0
                  collisions:0 txqueuelen:1000
                  RX bytes:94963545 (94.9 MB)  TX bytes:10387470 (10.3 MB)

    /etc/hosts

        127.0.0.1    localhost
        127.0.1.1    adame-ws
        ::1          ip6-localhost ip6-loopback
        fe00::0      ip6-localnet
        ff00::0      ip6-mcastprefix
        ff02::1      ip6-allnodes
        ff02::2      ip6-allrouters

    tracepath wordpress.org

        1:  adame-ws.local    0.092ms pmtu 1500
        1:  192.168.2.1       1.300ms asymm  2
        1:  192.168.2.1       1.060ms asymm  2
        2:  no reply
        3:  no reply
        4:  no reply
        5:  no reply
        6:  no reply
        7:  no reply
        8:  no reply
        ... keeps going like that

    ping wordpress.org

        adame@adame-ws:~$ ping wordpress.org
        PING wordpress.org (66.155.40.250) 56(84) bytes of data.
        --- wordpress.org ping statistics ---
        10 packets transmitted, 0 received, 100% packet loss, time 9071ms

    Read the article

  • How can I deal with the cargo-cult programming attitude?

    - by Aivar
    I have some computer science students in a compulsory introductory programming course who see a programming language as a set of magic spells that must be cast in order to achieve some effect (instead of seeing it as a flexible medium for expressing their idea of a solution). They tend to copy-paste code from previous, similar-looking assignments without considering the essence of the problem. Can anyone recommend some exercises or analogies to make these students more confident that they can, and should, understand the structure and meaning of each piece of code they write?

    Read the article

  • Public/Private Key Generation

    - by JacKeown
    I'm just learning about public key cryptography, and I want to make a public key certificate for my web server so that I can use HTTPS. My server is hosted on some random free web host where it's practically impossible to do anything... and so my question is this: is there any harm in generating my private key, public key, and public key certificate on my own computer using openssl and then transferring them to the server? Thanks in advance. Also, if there's anything else I'm missing, any help would be appreciated.

    Read the article

  • Program Installer Not Detecting Internet

    - by KeithS
    Hello, I am trying to install a program through the Ubuntu Software Center. Every time I click Install, I get a message stating "failed to download package files, check your Internet connection". I have tried installing different software and get the same message. I do have an Internet connection (hence being able to write this), and I have restarted the computer and reset the Internet connection (twice), but I still get the same message. Any ideas?

    Read the article

  • What is the cloud?

    - by llaszews
    Everyone has their own definition of cloud computing. This is a real conversation overheard at a small cafe in NH between two general contractors:

    Contractor One: I can't get the document I need because it is on my home PC.
    Contractor Two: You need cloud computing!
    Contractor One: What the hell is that?
    Contractor Two: You log into one computer and all your information from all your other computers is available.

    The NH "live free or die" definition of cloud computing!

    Read the article

  • Installed without the usual menus. Now I can't log in

    - by Martha
    I tried to install from CD. The computer wouldn't boot from the CD, so I clicked on the 'boot helper' and rebooted the machine. The first thing I see is a choice of whether to open Windows or Ubuntu (without it ever asking me whether I wanted to install Ubuntu alongside Windows or replace it), and when I click on Ubuntu, after a very long time, I finally get a login screen. But I don't have a login, because I never set one up. Help!

    Read the article

  • Is there a way to get my NETGEAR N900 to work with Ubuntu?

    - by user208088
    I'm using a NETGEAR N900 USB wireless adapter to pick up our home network connection while running Windows 7. I have Ubuntu 12.04.3 running on a second hard drive in my computer, but it is not compatible with my adapter. I'm not very familiar with how to get around in Ubuntu yet, and this is the only thing keeping me from using it. I've seen this asked before, but the instructions were very confusing to me! Help is appreciated.

    Read the article

  • How to get HDMI sound to work on a basic 12.04 install?

    - by Gubuntu
    I have just installed Ubuntu 12.04 (Precise) and am using it on my TV, but the HDMI sound doesn't work. All I have installed is the preinstalled codecs and other software, a KDE game called KsirK, SuperTuxKart, and GIMP 2.8. I am using a custom-built computer and an LG TV. My processor is a Pentium(R) Dual-Core E5200 @ 2.50GHz, the OS is 32-bit, and I have 1.7GB of RAM (after having to remove one stick due to failure).

    Read the article
