Search Results

Search found 2379 results on 96 pages for 'nvidia prime'.

Page 13 of 96

  • How do I install latest NVIDIA drivers from the .run file?

    - by Shahe Tajiryan
    This is what I am trying to do. I downloaded the latest driver for my video card from http://www.nvidia.com. The installation requires X11 to be shut down, so I log out of my account, press CTRL+ALT+F1, log in with my username and password, and run the command sh NVIDIA-Linux-x86_64-285.05.09.run in every possible way. I have even tried chmodding the package to 777 permissions, but I still get the error sh: can't open NVIDIA-Linux-x86_64-285.05.09.run. Any help would be greatly appreciated.

    Read the article

  • NVIDIA GeForce GT750M drivers for Ubuntu 13.10 on iMac

    - by Eugene B
    I have just got a new iMac 14.3 with a 21.5" display. It has an Intel i7 processor and an NVIDIA GeForce GT750M graphics card (GPU device ID 0x00fe9). I have installed Ubuntu 13.10 on it (dual boot with rEFIt). Everything works fine apart from graphics: it is clear that graphics are not accelerated, and there are no drivers suggested in the "Additional drivers" section. I have tried to (1) install the NVIDIA drivers from the repository, (2) download the NVIDIA drivers and install them manually, (3) follow the instructions in three different how-to guides, (4) install Bumblebee, (5) play around with the xorg.conf file, and (6) blacklist nouveau. As you may guess, nothing worked. The weirdest thing is that when I boot from a LiveCD, it picks up the screen resolution and all the rest correctly, yet when I install the system to my hard drive it clearly does not have the proper drivers. Does anyone have any suggestions on what to do?

    Read the article

  • Windows 7 - Can't get my TV working as primary display with nVidia 7900GS

    - by Daniel Schaffer
    I just installed Windows 7 Ultimate 64 RTM (from MSDN) on my HTPC, which is connected to a 42" Magnavox LCD TV via component cables to my nVidia 7900GS. Everything was fine through the installation until I went to install the official driver from nVidia. Towards the end of the installation, the TV blinked off and wouldn't come back on. I went and got an LCD monitor and plugged it into a DVI port and the monitor came right up, but was automatically selected as the primary display. Now, if I set the TV to be the primary display, the TV just blanks until I hit escape to cancel the "settings have changed, do you want to keep them" dialog. Any suggestions? Update: I'm able to set the TV as the primary display using the Windows 7 "screen resolution" configuration panel. However, if I try to remove the LCD monitor either by unplugging it or using the configuration, the TV blanks out again. Update 2: This setup was working correctly in Vista Home Premium 32-bit. Update 3: I've uninstalled the nVidia driver and am using the driver that Windows Update installed. As much as this offends my geek sensibilities (must use the "right" driver!!), well, It Works™.

    Read the article

  • Help with Java Program for Prime numbers

    - by Ben
    Hello everyone, I was wondering if you can help me with this program. I have been struggling with it for hours and have just trashed my code because the TA doesn't like how I executed it. I am completely hopeless, and if anyone can help me out step by step, I would greatly appreciate it.

    In this project you will write a Java program that reads a positive integer n from standard input, then prints out the first n prime numbers. We say that an integer m is divisible by a non-zero integer d if there exists an integer k such that m = kd, i.e. if d divides evenly into m. Equivalently, m is divisible by d if the remainder of m upon (integer) division by d is zero. We would also express this by saying that d is a divisor of m. A positive integer p is called prime if its only positive divisors are 1 and p. The one exception to this rule is the number 1 itself, which is considered to be non-prime. A positive integer that is not prime is called composite. Euclid showed that there are infinitely many prime numbers. The prime and composite sequences begin as follows:

    Primes: 2, 3, 5, 7, 11, 13, 17, 19, 23, 29, ...
    Composites: 1, 4, 6, 8, 9, 10, 12, 14, 15, 16, 18, 20, 21, 22, 24, 25, 26, 27, 28, ...

    There are many ways to test a number for primality, but perhaps the simplest is to simply do trial divisions. Begin by dividing m by 2, and if it divides evenly, then m is not prime. Otherwise, divide by 3, then 4, then 5, etc. If at any point m is found to be divisible by a number d in the range 2 ≤ d ≤ m-1, then halt, and conclude that m is composite. Otherwise, conclude that m is prime. A moment's thought shows that one need not do any trial divisions by numbers d which are themselves composite. For instance, if a trial division by 2 fails (i.e. has non-zero remainder, so m is odd), then a trial division by 4, 6, or 8, or any even number, must also fail. Thus to test a number m for primality, one need only do trial divisions by prime numbers less than m. Furthermore, it is not necessary to go all the way up to m-1. One need only do trial divisions of m by primes p in the range 2 ≤ p ≤ √m. To see this, suppose m > 1 is composite. Then there exist positive integers a and b such that 1 < a < m, 1 < b < m, and m = ab. But if both a > √m and b > √m, then ab > m, contradicting that m = ab. Hence one of a or b must be less than or equal to √m.

    To implement this process in Java you will write a function called isPrime() with the following signature:

        static boolean isPrime(int m, int[] P)

    This function will return true or false according to whether m is prime or composite. The array argument P will contain a sufficient number of primes to do the testing. Specifically, at the time isPrime() is called, array P must contain (at least) all primes p in the range 2 ≤ p ≤ √m. For instance, to test m = 53 for primality, one must do successive trial divisions by 2, 3, 5, and 7. We go no further since 11 > √53. Thus a precondition for the function call isPrime(53, P) is that P[0] = 2, P[1] = 3, P[2] = 5, and P[3] = 7. The return value in this case would be true since all these divisions fail. Similarly, to test m = 143, one must do trial divisions by 2, 3, 5, 7, and 11 (since 13 > √143). The precondition for the function call isPrime(143, P) is therefore P[0] = 2, P[1] = 3, P[2] = 5, P[3] = 7, and P[4] = 11. The return value in this case would be false since 11 divides 143. Function isPrime() should contain a loop that steps through array P, doing trial divisions. This loop should terminate when either a trial division succeeds, in which case false is returned, or the next prime in P is greater than √m, in which case true is returned.

    Function main() in this project will read the command line argument n, allocate an int array of length n, fill the array with primes, then print the contents of the array to stdout according to the format described below. In the context of function main(), we will refer to this array as Primes[]. Thus array Primes[] plays a dual role in this project. On the one hand, it is used to collect, store, and print the output data. On the other hand, it is passed to function isPrime() to test new integers for primality. Whenever isPrime() returns true, the newly discovered prime will be placed at the appropriate position in array Primes[]. This process works since, as explained above, the primes needed to test an integer m range only up to √m, and all of these primes (and more) will already be stored in array Primes[] when m is tested. Of course it will be necessary to initialize Primes[0] = 2 manually, then proceed to test 3, 4, ... for primality using function isPrime().

    The following is an outline of the steps to be performed in function main().

    • Check that the user supplied exactly one command line argument which can be interpreted as a positive integer n. If the command line argument is not a single positive integer, your program will print a usage message as specified in the examples below, then exit.
    • Allocate array Primes[] of length n and initialize Primes[0] = 2.
    • Enter a loop which will discover subsequent primes and store them as Primes[1], Primes[2], Primes[3], ..., Primes[n-1]. This loop should contain an inner loop which walks through successive integers and tests them for primality by calling function isPrime() with appropriate arguments.
    • Print the contents of array Primes[] to stdout, 10 to a line separated by single spaces. In other words, Primes[0] through Primes[9] will go on line 1, Primes[10] through Primes[19] will go on line 2, and so on. Note that if n is not a multiple of 10, then the last line of output will contain fewer than 10 primes.

    Your program, which will be called Prime.java, will produce output identical to that of the sample runs below. (As usual, % signifies the unix prompt.)

        % java Prime
        Usage: java Prime [PositiveInteger]
        % java Prime xyz
        Usage: java Prime [PositiveInteger]
        % java Prime 10 20
        Usage: java Prime [PositiveInteger]
        % java Prime 75
        2 3 5 7 11 13 17 19 23 29
        31 37 41 43 47 53 59 61 67 71
        73 79 83 89 97 101 103 107 109 113
        127 131 137 139 149 151 157 163 167 173
        179 181 191 193 197 199 211 223 227 229
        233 239 241 251 257 263 269 271 277 281
        283 293 307 311 313 317 331 337 347 349
        353 359 367 373 379
        %

    As you can see, inappropriate command line argument(s) generate a usage message which is similar to that of many unix commands. (Try doing the more command with no arguments to see such a message.) Your program will include a function called Usage() having signature static void Usage() that prints this message to stderr, then exits. Thus your program will contain three functions in all: main(), isPrime(), and Usage(). Each should be preceded by a comment block giving its name, a short description of its operation, and any necessary preconditions (such as those for isPrime()). See examples on the webpage.
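    For what it's worth, here is a minimal sketch of the structure the assignment describes -- not an official solution, and the loop details and variable names beyond those the spec fixes (for example found and candidate) are my own:

        public class Prime {

            // Prints the usage message to stderr and exits, as the spec requires.
            static void Usage() {
                System.err.println("Usage: java Prime [PositiveInteger]");
                System.exit(1);
            }

            // Precondition: P holds (at least) every prime p with 2 <= p <= sqrt(m),
            // starting at P[0]; unused slots at the end of P are zero.
            static boolean isPrime(int m, int[] P) {
                // Stop as soon as the next stored prime exceeds sqrt(m) (or we hit an
                // unfilled slot); the cast to long avoids overflow in the square.
                for (int i = 0; i < P.length && P[i] != 0 && (long) P[i] * P[i] <= m; i++) {
                    if (m % P[i] == 0) return false;   // trial division succeeded: composite
                }
                return true;                           // no prime divisor up to sqrt(m): prime
            }

            public static void main(String[] args) {
                int n = 0;
                if (args.length != 1) Usage();
                try {
                    n = Integer.parseInt(args[0]);
                } catch (NumberFormatException e) {
                    Usage();
                }
                if (n <= 0) Usage();

                int[] Primes = new int[n];
                Primes[0] = 2;
                int found = 1;                         // how many primes are stored so far
                int candidate = 3;                     // next integer to test
                while (found < n) {
                    if (isPrime(candidate, Primes)) Primes[found++] = candidate;
                    candidate++;
                }
                for (int i = 0; i < n; i++) {          // 10 primes per line, space separated
                    System.out.print(Primes[i]);
                    System.out.print((i % 10 == 9 || i == n - 1) ? "\n" : " ");
                }
            }
        }

    The key invariant is the one the assignment spells out: by the time a candidate m is tested, every prime up to √m is already stored at the front of Primes[], so the trial-division loop can stop as soon as the square of the current stored prime exceeds m.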

    Read the article

  • 3 Monitors, Ubuntu 12.04, Gnome 3, 2 nvidia cards, WITH xrandr or xinerama?

    - by Josh
    Ok. I have been banging my head against the wall for over a week now trying to get 3 monitors to work. I have one Nvidia 8600 GT 512MB PCIe x16 and one Nvidia GT 240 1GB PCIe x16. They are not running in SLI (obviously). I have tried everything from tutorials and a few templates all the way up to nvidia-settings, etc. From what I hear, Xinerama doesn't like GNOME 3 because of the compositing, although I have read a lot about using XRandR instead and getting the compositing working; but alas, I cannot. It always either crashes X, so I have to restore my backup xorg.conf, or it defaults to the gnome-classic desktop, and on top of that, when it does default, it keeps adding more and more panels. Basically, I want to be able to use all 3 monitors (yes, just like in Windows) and drag and drop between different windows. I have xorg-edit, but I am still not too sure how to set this up. Is there any way to: (A) get compositing working with 3 monitors, 2 Nvidia cards, Xinerama, and GNOME 3; or (B) use TwinView with 3 monitors (I have heard it can be done by manually editing xorg.conf); or (C) set up XRandR to draw all 3 monitors with compositing; or (D) use a separate X screen for each monitor and still be able to use GNOME with compositing, as well as drag between all 3; or (E) anything that works. I just want this to work. Any help you can provide would be greatly appreciated. BTW, I am running an Ubuntu mini install with GNOME. Everything works great but this. I can run it fine with 2 monitors and compositing, but not with 3. Also, what is the best GUI tool for editing xorg.conf? I'm not finding anything that is up to date and also understandable by humans. I'm actually an engineer by trade and have been working with computers a long time, but this xorg.conf stuff is really confusing me. Thanks!

    Read the article

  • How do I disable an Nvidia 9600GT on a MacBook Pro 5,1?

    - by Gjan
    I finally set up a dual boot with Ubuntu and Lion on my old MacBook Pro 5,1. As reported in many cases, the discrete graphics card is turned on all the time, consuming a lot of power and thus heating up the laptop. Since the discrete graphics card does not support the Nvidia Optimus technology, the corresponding package nvidia-prime does not help in this case. Therefore my question is: how do I manually disable the discrete graphics card, the Nvidia 9600GT? Preferably a 'switch-on-the-run' solution, but a 'set-on-boot' one would be totally fine!

    Read the article

  • How to get GUI back after freeze interrupted an Nvidia driver update?

    - by Reinere
    I just went through a driver update. The OS froze, so I had to hard-reboot the PC. Now I just get the login prompt in a terminal. I tried to run startx and got this error: "Error: API mismatch: the NVIDIA kernel module has version 304.43, but this NVIDIA driver component has version 295.49." I have to type sudo su and then modprobe ndiswrapper to get my Wi-Fi to work, so step-by-step instructions from that point on would be greatly helpful.

    Read the article

  • Windows XP: Nvidia GeForce 6100 Brightness

    - by ExProG
    Hello, I have an Nvidia GeForce 6100 and am using XP. I also have a SmartBoard. My brightness settings are completely changed and are not good. I tried changing the settings in the Nvidia GeForce 6100 nForce 430 color correction settings, but I cannot get anything close to ideal; it's just way too bright. What do I have to do? I have already tried my monitor's brightness settings and the SmartBoard brightness settings; neither makes a difference. I think there are problems with the pixels. How can I fix this? Is there something wrong with my video card?

    Read the article

  • Nvidia System Tools compatible with GTX 560 Ti?

    - by Paula Ferreira
    I want to download and use Nvidia System Tools with ESA support, but on the "Supported Products" tab my GTX 560 Ti isn't listed. The product was last released in April last year, yet it shows support for both the GTX 570 and 580 as well as the GTX 4x family, all sister cards of the GTX 560. Has anyone successfully run this Nvidia product with the GTX 560 Ti? Why wasn't this card included in the list of supported products?

    Read the article

  • nVidia Driver - Laptop is forced as one of my monitors

    - by vaccano
    I am trying to get my nVidia driver to correctly configure my multi-monitor setup. I have my laptop in a docking station and two monitors hooked up to it. I had an old driver that this worked correctly with; however, that driver was causing a lot of "Deferred Procedure Calls", so I upgraded to a newer driver. But now I am forced to use my laptop's display as one of my monitors. In the layout shown in the nVidia Control Panel, both external monitors are recognized, but the only options available are to use one of them together with the laptop display. Any ideas? I am running Windows XP (latest updates) and I have an nVidia Quadro 1500M. I have tried several different driver versions and all the new ones have this issue.

    Read the article

  • Nvidia 9 series or Intel HD 2000? [closed]

    - by EApubs
    I just tested an Nvidia 9300 GS card against the Intel HD 2000 graphics integrated with a Core i3. Here are the Windows Experience Index scores I got:
    Nvidia 9300 GS: Base Score 3.9, Processor 7.1, Memory 7.5, Graphics 3.9, Gaming Graphics 5.1, Hard Disk 5.9
    Intel HD 2000: Base Score 5.2, Processor 7.1, Memory 5.9, Graphics 5.2, Gaming Graphics 5.8, Hard Disk 5.9
    My questions are: When using Intel HD graphics, it reduces the score of my RAM! How is that possible? It checks the speed of the RAM, not the size (I think). Intel graphics takes some of the RAM space, but how can that affect the speed? And of the two, which is the better choice?

    Read the article

  • Configuring NVIDIA Quadro with Dell Precision M4600

    - by vsecades
    After a frustrating couple of weeks with the Dell Precision laptop I recently bought, I managed to fix an issue where Ubuntu (yes, NOT Windows, get serious) would not recognize the video card and would cause all sorts of problems all over the place. I nearly threw the thing away one Saturday morning, until I found a post about NVIDIA Optimus technology (http://www.pcmag.com/article2/0,2817,2358963,00.asp). Now I am a huge advocate of disruptive new technology, as long as we keep the broader audience in mind. Anyhow, disabling Optimus in the BIOS (which, as the BIOS settings state, only works on Windows 7 or later) effectively allows the NVIDIA-based Ubuntu driver to kick in at full force. No need for the trash can anymore, thankfully. Since I saw multiple posts all over the place about this: check your BIOS, disable Optimus, and try the video again to see if this corrects your issues. Best of luck!

    Read the article

  • NVIDIA Driver Crashing on Custom-Built Windows 7 PC

    - by srunni
    I've got a custom-built PC with these specs: Fractal Design Define R3 case, ASUS P8Z77-V motherboard, Intel Core i7-2700K with a Thermalright HR-02 Macho cooler, NVIDIA GeForce GTX 560 with an Arctic Cooling Accelero Twin Turbo II cooler, Crucial M4 128 GB SSD, 1 TB Hitachi HDD, two kits of G.SKILL Ares Series 2x8 GB RAM, SeaSonic 520W PSU, Windows 7 Ultimate SP1 (x64), and NVIDIA driver version 301.42. Upon building the PC, I overclocked the CPU (but never the GPU), and there were no problems for 1-2 months. Then I started getting crashes with a driver error when doing anything that's computationally or graphically intensive. I un-overclocked the CPU, but that hasn't fixed anything. I'd appreciate any guidance on resolving this problem. I did get some of the thermal paste on the graphics card when installing the aftermarket cooler, but there were no issues for a month or two. Update: I did a clean install and the issue persisted, so it looks like a hardware issue. I will try removing, cleaning, and reseating all the parts.

    Read the article

  • Nvidia driver on Windows 7 causing black screen

    - by inKit
    I have just installed Windows 7 on a desktop machine and for the first time ever have had a really tough time doing so; it's normally a nice, smooth install. This time I found that the monitor would simply go black after completing the installation. I tried reinstalling about 3 times and this did not help. After much searching I discovered that it was the Nvidia drivers that were playing up with Windows 7, so I booted into safe mode, disabled the device, then rebooted to complete the installation. Windows 7 now works fine as long as the Nvidia 9600 GT video card is disabled. The moment I enable it, the system requires a reboot and the screen goes black before even getting to the login screen. I have tried downloading the latest driver and installing it manually, and I have also tried uninstalling the device and allowing Windows 7 to install it itself. Nothing seems to work. Any clues?

    Read the article

  • Quickly determine if a number is prime in Python for numbers < 1 billion

    - by Frór
    Hi, my current algorithm to check the primality of numbers in Python is way too slow for numbers between 10 million and 1 billion. I want to improve it, knowing that I will never get numbers bigger than 1 billion. The context is that I can't get an implementation that is quick enough for solving problem 60 of Project Euler (http://projecteuler.net/index.php?section=problems&id=60): I'm getting the answer to the problem in 75 seconds where I need it in 60 seconds. I have very little memory at my disposal, so I can't store all the prime numbers below 1 billion. I'm currently using standard trial division tuned with 6k±1. Is there anything better than this? Do I already need the Rabin-Miller method for numbers that are this large?

        primes_under_100 = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37, 41, 43, 47,
                            53, 59, 61, 67, 71, 73, 79, 83, 89, 97]

        def isprime(n):
            if n <= 100:
                return n in primes_under_100
            if n % 2 == 0 or n % 3 == 0:
                return False
            # The bound must include sqrt(n) itself, hence the +1 (otherwise squares
            # of primes, e.g. 121, slip through as "prime").
            for f in range(5, int(n ** 0.5) + 1, 6):
                if n % f == 0 or n % (f + 2) == 0:
                    return False
            return True

    How can I improve this algorithm?
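    Since the question already mentions Rabin-Miller, one possible direction is a deterministic Miller-Rabin test. The sketch below is illustrative rather than authoritative: it relies on the published result that the witness bases 2, 3, 5 and 7 correctly classify every n below 3,215,031,751, which comfortably covers the stated 1-billion ceiling, and the function name is my own.

        def is_prime_mr(n):
            # Deterministic Miller-Rabin for n < 3,215,031,751 using bases 2, 3, 5, 7.
            if n < 2:
                return False
            bases = (2, 3, 5, 7)
            if n in bases:
                return True
            if any(n % p == 0 for p in bases):
                return False
            # Write n - 1 as d * 2**s with d odd.
            d, s = n - 1, 0
            while d % 2 == 0:
                d //= 2
                s += 1
            for a in bases:
                x = pow(a, d, n)              # a^d mod n
                if x == 1 or x == n - 1:
                    continue
                for _ in range(s - 1):
                    x = x * x % n
                    if x == n - 1:
                        break
                else:
                    return False              # a witnesses that n is composite
            return True

    Each call costs only a handful of modular exponentiations, versus the roughly 10,000 modulo operations the 6k±1 trial-division loop performs for n near one billion.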

    Read the article

  • Ubuntu for Android on the ASUS Transformer Prime

    - by sola
    I would like to use Ubuntu on my Transformer Prime in parallel with Android (not as a dual-booting solution; I want to be able to switch between them instantaneously). I am aware of the traditional chroot/VNC solution, but I heard that it performs very poorly, so I would like to use Ubuntu for Android (UFA), which was announced recently by Canonical. That looks like a polished, highly integrated solution for Android devices. The Prime would be the ideal device for Ubuntu for Android since it has a powerful processor (Tegra 3) capable of running a lot of processes in parallel on its 4 cores. Does anyone know if Canonical or anybody else is working on supporting UFA on the ASUS Transformer Prime? As far as I understand, the X11 driver is available for Tegra 3, so the biggest hurdle may be easily overcome.

    Read the article

  • NVIDIA Tesla K20C in Dell PowerEdge R720xd --- power cables

    - by CptSupermrkt
    I am trying to put an NVIDIA Tesla K20C into a Dell PowerEdge R720xd, and I'm having a bit of trouble understanding the power requirements of the card. Two pages of the same manual seem contradictory to me: one page says only a single power connector is required, while the next page says both are required. The entire manual for the card can be found here: http://www.nvidia.com/content/PDF/kepler/Tesla-K20-Active-BD-06499-001-v02.pdf. The card's power connectors need to go onto the PCI-E riser of the R720xd. Neither the R720xd nor the GPU came with the necessary cables, and given what appears to be a contradiction in the GPU manual, I'm not even sure at this point what we actually need. I have searched high and low online for things like 2x6-pin PCI-E to 8-pin male-to-male cables and so on, and for the life of me cannot find what we need. In case anyone needs it, the owner's manual of the R720xd can be found here: ftp://ftp.dell.com/Manuals/all-products/esuprt_ser_stor_net/esuprt_poweredge/poweredge-r720xd_Owner%27s%20Manual_en-us.pdf. The relevant page is page 68, which clearly indicates that the 8-pin female port on the riser card is for a GPU. The bottom-line question: exactly what power cables do we need to buy, and where can we find them?

    Read the article

  • Slow, choppy video playback with nVidia 8600GT

    - by user5351
    I have an nVidia 8600GT card (made by EVGA) in a machine with Windows Vista (AMD Athlon X2 processor) and four gigs of RAM. It runs pretty well, but I have had slow/choppy/stuttering video playback whenever watching Flash videos on YouTube or other sites. The problem is there with Flash videos in both Firefox and IE, but is maybe worse in Firefox. I also tried Linux with nVidia's binary drivers and it was about the same. I downloaded EVGA Precision, which allows me to control things like the fan and clock speed. The card's temperature (in both Vista and Linux) is usually around 66°C when idle (not playing a game or watching anything). It goes up a little when watching a video (maybe 68-72°C). Any ideas on how to fix this? UPDATE: The issues occur with both full-screen and embedded Flash videos. I have Flash 10.0.32.18 (I always make sure I use the most recent version for security), and the CPU is an AMD Athlon 64 X2 Dual Core Processor 4000+ at 2.11 GHz. The current GPU driver installed is the most recent GeForce one from last July.

    Read the article

  • How to Force Graphics Options in PC Games with NVIDIA, AMD, or Intel Graphics

    - by Chris Hoffman
    PC games usually have built-in graphics options you can change. But you're not limited to the options built into games — the graphics control panels bundled with graphics drivers allow you to tweak options from outside PC games. For example, these tools allow you to force-enable antialiasing to make old games look better, even if they don't normally support it. You can also reduce graphics quality to get more performance on slow hardware.

    If You Don't See These Options
    If you don't have the NVIDIA Control Panel, AMD Catalyst Control Center, or Intel Graphics and Media Control Panel installed, you may need to install the appropriate graphics driver package for your hardware from the hardware manufacturer's website. The drivers provided via Windows Update don't include additional software like the NVIDIA Control Panel or AMD Catalyst Control Center. Drivers provided via Windows Update are also more out of date. If you're playing PC games, you'll want to have the latest graphics drivers installed on your system.

    NVIDIA Control Panel
    The NVIDIA Control Panel allows you to change these options if your computer has NVIDIA graphics hardware. To launch it, right-click your desktop background and select NVIDIA Control Panel. You can also find this tool by performing a Start menu (or Start screen) search for NVIDIA Control Panel or by right-clicking the NVIDIA icon in your system tray and selecting Open NVIDIA Control Panel. To quickly set a system-wide preference, you could use the Adjust image settings with preview option. For example, if you have old hardware that struggles to play the games you want to play, you may want to select "Use my preference emphasizing" and move the slider all the way to "Performance." This trades graphics quality for an increased frame rate. By default, the "Use the advanced 3D image settings" option is selected. You can select Manage 3D settings and change advanced settings for all programs on your computer or just for specific games. NVIDIA keeps a database of the optimal settings for various games, but you're free to tweak individual settings here. Just mouse over an option for an explanation of what it does. If you have a laptop with NVIDIA Optimus technology — that is, both NVIDIA and Intel graphics — this is the same place you can choose which applications will use the NVIDIA hardware and which will use the Intel hardware.

    AMD Catalyst Control Center
    AMD's Catalyst Control Center allows you to change these options on AMD graphics hardware. To open it, right-click your desktop background and select Catalyst Control Center. You can also right-click the Catalyst icon in your system tray and select Catalyst Control Center, or perform a Start menu (or Start screen) search for Catalyst Control Center. Click the Gaming category at the left side of the Catalyst Control Center window and select 3D Application Settings to access the graphics settings you can change. The System Settings tab allows you to configure these options globally, for all games. Mouse over any option to see an explanation of what it does. You can also set per-application 3D settings and tweak your settings on a per-game basis. Click the Add option and browse to a game's .exe file to change its options.

    Intel Graphics and Media Control Panel
    Intel integrated graphics is nowhere near as powerful as dedicated graphics hardware from NVIDIA and AMD, but it's improving and comes included with most computers. Intel doesn't provide anywhere near as many options in its graphics control panel, but you can still tweak some common settings. To open the Intel graphics control panel, locate the Intel graphics icon in your system tray, right-click it, and select Graphics Properties. You can also right-click the desktop and select Graphics Properties. Select either Basic Mode or Advanced Mode. When the Intel Graphics and Media Control Panel appears, select the 3D option. You'll be able to set your Performance or Quality setting by moving the slider around, or click the Custom Settings check box and customize your Anisotropic Filtering and Vertical Sync preferences. Different Intel graphics hardware may have different options here. We also wouldn't be surprised to see more advanced options appear in the future if Intel is serious about competing in the PC graphics market, as they say they are.

    These options are primarily useful to PC gamers, so don't worry about them — or bother downloading updated graphics drivers — if you're not a PC gamer and don't use any intensive 3D applications on your computer. Image Credit: Dave Dugdale on Flickr

    Read the article

  • Wine causes TwinView to break

    - by deanvz
    I have endlessly been playing around with the Nvidia X Server settings and changing my xorg.conf file to try and make it work for me, and on most days it's fine. In each instance I get it working for a while, and then this morning the most bizarre thing happened. The moment I open any type of Wine program (which never used to be a problem), my TwinView setup disappears and I am left with mirrored displays. I try to change the settings in the Nvidia driver, but it's not interested and the screens remain mirrored. I have a workaround: restart my PC. Below are the contents of my current xorg.conf file.

        # nvidia-settings: X configuration file generated by nvidia-settings
        # nvidia-settings: version 295.33 (buildd@zirconium) Fri Mar 30 13:43:34 UTC 2012

        Section "ServerLayout"
            Identifier     "Layout0"
            Screen      0  "Screen0" 0 0
            InputDevice    "Keyboard0" "CoreKeyboard"
            InputDevice    "Mouse0" "CorePointer"
            Option         "Xinerama" "0"
        EndSection

        Section "Files"
        EndSection

        Section "InputDevice"
            # generated from default
            Identifier     "Mouse0"
            Driver         "mouse"
            Option         "Protocol" "auto"
            Option         "Device" "/dev/psaux"
            Option         "Emulate3Buttons" "no"
            Option         "ZAxisMapping" "4 5"
        EndSection

        Section "InputDevice"
            # generated from default
            Identifier     "Keyboard0"
            Driver         "kbd"
        EndSection

        Section "Monitor"
            # HorizSync source: edid, VertRefresh source: edid
            Identifier     "Monitor0"
            VendorName     "Unknown"
            ModelName      "LG Electronics W1934"
            HorizSync       30.0 - 83.0
            VertRefresh     56.0 - 75.0
            Option         "DPMS"
        EndSection

        Section "Device"
            Identifier     "Device0"
            Driver         "nvidia"
            VendorName     "NVIDIA Corporation"
            BoardName      "GeForce 9400 GT"
        EndSection

        Section "Screen"
            Identifier     "Screen0"
            Device         "Device0"
            Monitor        "Monitor0"
            DefaultDepth    24
            Option         "TwinView" "1"
            Option         "metamodes" "CRT-0: nvidia-auto-select +0+0, CRT-1: nvidia-auto-select +1440+0"
            SubSection     "Display"
                Depth       24
            EndSubSection
        EndSection

    Read the article

  • How do I configure nVidia drivers on a Portable Ubuntu setup?

    - by Nicholas Flynt
    I've been pulling my hair out over this one for a couple of days now, and Google is no help. I've created a wonderful (until this issue) portable copy of Ubuntu Linux that will boot on mostly anything, by using a USB enclosure for my laptop's 80GB SATA drive. So far so good: it boots and runs on everything, and on non-nVidia card setups it was even detecting the drivers, or letting me install the required drivers for hardware acceleration and Compiz. Because, you know, the wobble windows are the most awesome thing ever. Anyway, my desktop machine had an nVidia card, so I'm thinking, sure, I'll just install the nVidia drivers like before and everything will work happily. Not so: now the desktop and any other nVidia cards work great, but it seems to have completely disabled any other graphics cards. When the kernel module detects that an nVidia card isn't present, it shoots up this nasty little dialog box giving me the option to boot into "low graphics" mode, which doesn't even allow me to use the correct screen resolution, much less see the installed graphics card and try to configure a driver for it. Is there any way to configure Ubuntu (with the dreaded nVidia kernel module) so that it can use nVidia's drivers when an nVidia card is present, and default to the normal (not low-graphics) setup in other cases, so that it has a fair chance of using what's actually present? I'm not afraid to muck with config files; I just don't know the underlying system well enough to feel comfortable diving in without a push in the right direction. Thanks guys!

    Read the article

  • Linux: Setting primary display (nvidia) from command line

    - by Joernsn
    Is this possible? Normally I use disper (http://willem.engen.nl/projects/disper/) to enable my external monitor, but I don't think I can force the second monitor to be the primary one. I've played around with nv-control-dpy, included in the nvidia-control source (see http://ubuntuforums.org/showthread.php?t=922956 for how to get it), but I haven't figured out how to do it yet.

    Read the article
