Search Results

Search found 557 results on 23 pages for 'optimus prime'.

Page 1/23 | 1 2 3 4 5 6 7 8 9 10 11 12  | Next Page >

  • GLX on NVIDIA OPTIMUS, Ubuntu 12.04

    - by Gabriel Theron
    I have a laptop with an NVIDIA Optimus graphics setup. I have tried running Minecraft on that laptop, and the game crashes after login with the following error: org.lwjgl.LWJGLException: Could not init GLX. I tried to update my drivers, but no driver update was available. I searched for people asking the same question and found none, hence the following question: is it possible to enable GLX on NVIDIA Optimus? If so, how? Thanks in advance.
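    For reference, the route commonly suggested for 12.04 at the time was Bumblebee, which provides GLX from the proprietary NVIDIA driver for individual programs run through optirun. A minimal sketch, assuming the Bumblebee project's 12.04-era PPA and package names (the path to minecraft.jar is hypothetical):

        sudo add-apt-repository ppa:bumblebee/stable
        sudo apt-get update
        sudo apt-get install bumblebee bumblebee-nvidia linux-headers-generic mesa-utils
        optirun glxinfo | grep "OpenGL renderer"   # should now report the NVIDIA GPU
        optirun java -jar ~/minecraft.jar          # launch the game with GLX from the NVIDIA driver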

    Read the article

  • Multiple Monitors using nvidia-prime or bumblebee on Ubuntu 13.10

    - by user205626
    I've been unable to get multiple monitors to work with Ubuntu 13.10 using nvidia-prime or bumblebee. Could someone point me in the right direction? With nvidia-prime, I've tried the xorg.conf here http://us.download.nvidia.com/XFree86/Linux-x86/319.12/README/randr14.html, but I boot into "low graphics" mode and have to revert to get a desktop back. Any suggestions would be appreciated. Thanks. Edit: I've given up on nvidia-prime; I missed the fact that it never turns off the discrete card... So, I'm back to trying to get VIRTUAL displays working with Bumblebee.
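    For reference, the output step in the randr14 README linked above boils down to two xrandr calls run at session startup once the nvidia xorg.conf is in place; the provider names vary by machine, so treat "modesetting" and "NVIDIA-0" below as assumptions to be checked first:

        xrandr --listproviders                                 # confirm the provider names on this machine
        xrandr --setprovideroutputsource modesetting NVIDIA-0  # display the NVIDIA-rendered desktop on the Intel-wired outputs
        xrandr --auto                                          # bring up all connected monitors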

    Read the article

  • How well do laptops with Nvidia Optimus work?

    - by DSJones
    I am considering buying a Dell XPS 15 laptop. The laptop has an Nvidia 420M card, which should work with Linux, but I keep reading about the Nvidia Optimus technology that isn't supported on Linux. I am not really interested in switching from Nvidia to Intel to save power, but I need to know that the Nvidia card will in fact work if I install Ubuntu. If anyone has experience using an Nvidia card with Optimus technology, or even better the exact laptop in question (Dell XPS 15 with 1GB NVIDIA® GeForce® GT 420M), that would be great. A major problem holding people back from adopting Linux is this sort of hardware issue. I am a long-term Ubuntu user and supporter who can't afford to make a mistake with a purchase like this. I don't want to spend £500+ to find I have no graphics acceleration, because Windows 7 is not an option.

    Read the article

  • NVidia with Optimus conflicting in Ubuntu 12.04

    - by Humannoise
    I have recently installed Ubuntu 12.04 on an Intel Ivy Bridge machine with integrated graphics and an NVidia GPU with Optimus technology, but I can't get it to work properly. I have already tried the Bumblebee project's solution, but I get the following message when I try to run anything on the nvidia card (e.g. with optirun firefox): [ERROR]The Bumblebee daemon has not been started yet or the socket path /var/run/bumblebee.socket was incorrect. [ERROR]Could not connect to bumblebee daemon - is it running? Since the nvidia card is not working properly, some software like Scilab, which uses the X11 system for graphics handling and plotting, won't work either. My BIOS has no option concerning the graphics card, and the daemon log returned: Jul 5 16:10:51 humannoise-W251ESQ-W270ESQ bumblebeed[980]: Module 'nvidia' is not found. Jul 5 16:10:51 humannoise-W251ESQ-W270ESQ kernel: [ 17.943272] init: bumblebeed main process (980) terminated with status 1 Jul 5 16:10:51 humannoise-W251ESQ-W270ESQ kernel: [ 17.943288] init: bumblebeed main process ended, respawning Jul 5 16:10:51 humannoise-W251ESQ-W270ESQ bumblebeed[1026]: Module 'nvidia' is not found. The command lspci -nn | grep '\[030[02]\]:' returned: 00:02.0 VGA compatible controller [0300]: Intel Corporation Ivy Bridge Graphics Controller [8086:0166] (rev 09) 01:00.0 VGA compatible controller [0300]: NVIDIA Corporation Device [10de:0de9] (rev a1) For the command dpkg -l | grep '^ii' | grep nvidia I got: ii bumblebee-nvidia 3.0-2~preciseppa1 nVidia Optimus support using the proprietary NVIDIA driver ii nvidia-current 302.17-0ubuntu1~precise~xup1 NVIDIA binary Xorg driver, kernel module and VDPAU library ii nvidia-current-updates 295.49-0ubuntu0.1 NVIDIA binary Xorg driver, kernel module and VDPAU library ii nvidia-settings 302.17-0ubuntu1~precise~xup3 Tool of configuring the NVIDIA graphics driver ii nvidia-settings-updates 295.33-0ubuntu1 Tool of configuring the NVIDIA graphics driver After a full reinstallation, including the removal of any previous nvidia driver, lsmod | grep -E 'nvidia|nouveau' returned: nvidia 10888310 46 dmesg | grep -C3 -E 'nouveau|NVRM' returned things like: [ 1875.607283] nvidia 0000:01:00.0: PCI INT A -> GSI 16 (level, low) -> IRQ 16 [ 1875.607289] nvidia 0000:01:00.0: setting latency timer to 64 [ 1875.607293] vgaarb: device changed decodes: PCI:0000:01:00.0,olddecodes=io+mem,decodes=none:owns=none [ 1875.607363] NVRM: loading NVIDIA UNIX x86_64 Kernel Module 302.17 Tue Jun 12 16:03:22 PDT 2012 [ 1884.830035] nvidia 0000:01:00.0: PCI INT A disabled [ 1884.832058] bbswitch: disabling discrete graphics [ 1884.832960] bbswitch: Result of Optimus _DSM call: 09000019 Some programs, like Scilab, are now working fine under optirun (e.g. optirun scilab). Thank you.
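    The "Module 'nvidia' is not found" messages in the daemon log, and their disappearance after the reinstall, suggest the kernel module simply had not been built for the running kernel at first. A hedged re-check along the lines of what the poster ended up doing (12.04 package names assumed):

        sudo apt-get install --reinstall bumblebee-nvidia nvidia-current linux-headers-$(uname -r)
        sudo modprobe nvidia              # should succeed once the module is built (the module name can differ by packaging)
        sudo service bumblebeed restart   # bumblebeed is the upstart job name on 12.04
        optirun glxgears                  # quick offloading test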

    Read the article

  • Thinkpad T530 with Optimus and Docking Station

    - by Vic Boudolf
    I have a Lenovo Thinkpad T530 with Optimus video, which is not supported on 12.04.1. I don't normally need the discrete (nVidia) graphics, so I turn it off in the BIOS settings to achieve longer battery life (and so that the screen dimmer will work), but when the laptop is placed in the docking station, the integrated (Intel) graphics don't power the HDMI ports. (The VGA port does work, but I want to focus on the HDMI.) This means I have to change the BIOS settings constantly. Is there any way to have the system detect the docking station and power up/enable the discrete graphics accordingly? I don't need to do it on the fly, just at startup. One post suggests that bumblebee can turn the discrete graphics on and off for specific applications, but I just want to turn it on or off. Another suggests that vga_switcheroo will not work with nVidia Optimus.
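    A rough sketch of the startup-only idea, not a tested recipe: with the bbswitch module loaded (it ships with Bumblebee), the discrete GPU can be powered on or off by writing to /proc/acpi/bbswitch, and dock state is exposed under /sys/devices/platform/dock.* on many ThinkPads. Both paths, and the dock node number, are assumptions to verify on this machine:

        #!/bin/sh
        # run once at startup (e.g. from /etc/rc.local); dock node may be dock.1 or dock.2
        if grep -q 1 /sys/devices/platform/dock.0/docked 2>/dev/null; then
            echo ON  | sudo tee /proc/acpi/bbswitch    # docked: power the discrete GPU
        else
            echo OFF | sudo tee /proc/acpi/bbswitch    # undocked: keep it off
        fi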

    Read the article

  • How to solve Bumblebee/Nvidia Optimus issues with kernel 3.4 (works perfectly under 3.2)

    - by theJimy
    I installed Ubuntu and could set it up to use my Intel HD 3000/GeForce GT 540M hybrid graphics perfectly with the method described here: How well do laptops with Nvidia Optimus work? Everything works fine under kernel 3.2. Now I wanted to upgrade to kernel 3.4, as it brings many improvements, especially in saving battery life (i.e. Intel RC6)... at least from what I have heard. While I had no issues installing the 3.4 kernel under Ubuntu 12.04 and everything else so far runs fine, Bumblebee causes issues under kernel 3.4. When I try to run commands like optirun or lsmod (or similar kernel tools), they just lock up and never return. The Bumblebee developers seem to refuse to help with mainline kernels (as seen here: https://github.com/Bumblebee-Project/bbswitch/issues/17 ). Does anyone know how to solve this issue? Could I perhaps solve it by compiling the kernel and/or Bumblebee against the kernel sources myself, keeping an Ubuntu-like kernel? Any other ideas that might help me solve this myself, so I can benefit from both the 3.4 features and Optimus, would be much appreciated.
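    Since bbswitch is a DKMS module, the usual first step with a new kernel is to make sure it has actually been rebuilt against that kernel's headers. A minimal sketch, assuming bbswitch came from a DKMS package in the Bumblebee PPA (for a mainline 3.4 build the matching headers package has to be installed by hand rather than from the archive):

        sudo apt-get install linux-headers-$(uname -r)   # headers for the running kernel, if the archive has them
        dkms status                                      # see which bbswitch/bumblebee modules are registered
        sudo dkms autoinstall                            # rebuild registered modules for the running kernel
        sudo modprobe bbswitch && sudo service bumblebeed restart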

    Read the article

  • CUDA & MSI GT60 with Optimus enabled GTX670M?

    - by user1076693
    I have an MSI GT60 laptop with an Optimus-enabled GTX 670M GPU, and I have been trying to get CUDA going in an Ubuntu 12.04 environment. I realize that Optimus is not supported on Linux, but I have read the following post suggesting that CUDA works for hybrid GPUs: How can I get nVidia CUDA or OpenCL working on a laptop with nVidia discrete card/Intel Integrated Graphics? I installed the NVIDIA driver via sudo add-apt-repository ppa:ubuntu-x-swat/x-updates sudo apt-get update sudo apt-get install nvidia-current The resulting driver version is 302.17, and supposedly the GTX 670M has been supported since 295.59. I also downloaded CUDA 4.2 from the NVIDIA site and compiled it against the nvidia-current libraries. Unfortunately, when I run deviceQuery in the CUDA SDK, I get the following output: cudaGetDeviceCount returned 38 -> no CUDA-capable device is detected Checking /proc/driver/nvidia/gpus/0/information gives the following: Model: GeForce GTX 670M IRQ: 16 GPU UUID: GPU-????????-????-????-????-???????????? Video BIOS: ??.??.??.??.?? Bus Type: PCI-E DMA Size: 32 bits DMA Mask: 0xffffffffff Bus Location: 0000:01.00.0 Here is the output of "lspci | grep VGA": 00:02.0 VGA compatible controller: Intel Corporation Ivy Bridge Graphics Controller (rev 09) 01:00.0 VGA compatible controller: NVIDIA Corporation Device 1213 (rev ff) So... what am I doing wrong? Thanks!
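    One detail worth noting: "(rev ff)" in the lspci line usually means the discrete card is currently powered off, in which case the CUDA runtime cannot see it. CUDA does not need an X server on the NVIDIA GPU, so a common workaround is to let Bumblebee power the card on and load the module for the duration of the run; a hedged sketch, assuming Bumblebee is installed:

        optirun ./deviceQuery        # run the SDK sample with the card powered on and the nvidia module loaded
        lsmod | grep nvidia          # confirm the kernel module is actually loaded
        cat /proc/acpi/bbswitch      # bbswitch reports whether the discrete GPU is ON or OFF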

    Read the article

  • Ubuntu 14.04 Nvidia Optimus Bumblebee error

    - by Cristian
    I know that in Ubuntu 14.04 there is nvidia-prime for Nvidia Optimus, but I don't like it and I haven't been able to get it to work either. After upgrading from Ubuntu 12.04 everything crashed, so I made a clean install of Ubuntu 14.04 and Bumblebee, but now I have new trouble. After running optirun glxgears I get the following error: [ 4703.996785] [ERROR]Cannot access secondary GPU, secondary X is not active. [ 4703.996910] [ERROR]Aborting because fallback start is disabled. Please help.
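    The usual 14.04-era fix for "Cannot access secondary GPU" was to point Bumblebee's configuration at the versioned driver package it was actually installed with; a hedged sketch, with file paths that assume the nvidia-331 packages (adjust to whatever is installed):

        # in /etc/bumblebee/bumblebee.conf, under the [bumblebeed] and [driver-nvidia] sections:
        #   Driver=nvidia
        #   KernelDriver=nvidia-331
        #   LibraryPath=/usr/lib/nvidia-331:/usr/lib32/nvidia-331
        #   XorgModulePath=/usr/lib/nvidia-331/xorg,/usr/lib/xorg/modules
        sudo service bumblebeed restart
        optirun glxgears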

    Read the article

  • Nvidia optimus and Steam (on 12.04)

    - by Seiryuu
    I've obtained a copy of the .deb for the Steam beta, but it was pretty disappointing to see that it simply doesn't run. Hardware: Dell XPS L502 with Nvidia Optimus; I have bumblebee installed. Trying to run Steam with the Intel HD 3000 completely fails to start it. The message received is Installing breakpad exception handler for appid(steam)/version(1352224866_client), followed by a crash with no other information provided. Trying to optirun steam runs the client, but as soon as it gets to the home screen, it says that the Nvidia drivers I am using are out of date (and Steam requires newer drivers to run). It's probably worth noting that it prints the same Installing breakpad... message when run with optirun, but it doesn't crash the client immediately. Any way to fix this? Also, is there a way of manually updating the drivers in bumblebee without breaking anything? Alternatively, is there a reliable way of completely disabling the Intel GPU (in order to use the Nvidia GPU exclusively)? Note: I am using Xmonad with gnome-fallback, if that makes a difference. However, when I tried everything mentioned with Unity (2d), everything was the same, so I guess it has nothing to do with the window manager in use.
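    On the driver-update question: Bumblebee simply offloads to whatever NVIDIA driver package is installed, so updating that package updates what optirun uses. A hedged sketch for 12.04 using the x-updates PPA (the same PPA other posts on this page mention; treat it as an assumption, and expect bumblebee-nvidia to be rebuilt along the way):

        sudo add-apt-repository ppa:ubuntu-x-swat/x-updates
        sudo apt-get update && sudo apt-get upgrade   # pulls in a newer nvidia-current
        sudo service bumblebeed restart
        optirun steam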

    Read the article

  • Unity is broken after upgrading to 12.10 (Optimus laptop)

    - by SyS
    I upgraded to GNU/Linux Ubuntu 12.10 but have been unable to use Unity properly afterwards. Indeed, I encountered the exact same problem as a lot of people: the Unity side and top bars are not displaying, although in my case Unity seems completely broken, as I can't even right-click. It's worth noticing that I have an Optimus laptop with an Nvidia graphics card (GeForce GT 540M). Bumblebee and its 'optirun' command are working just fine, as usual, after the upgrade. I tried several things, such as resetting Compiz and Unity (with the command 'setsid unity') -- which works, but I have to do it every time I boot and it resets all my settings -- updating/reinstalling/reconfiguring my Nvidia drivers as well as bumblebee, trying the Nouveau drivers instead of nvidia-current, and checking whether linux-headers-generic was installed (it was). However, I couldn't reset the xorg.conf files, as they're just not there: there is neither an xorg.conf file nor a backup of it in /etc/X11. I think this is where the problem comes from, although I'm far from an expert. Maybe retrieving an xorg.conf file will fix this mess, but I have no idea how to do that. I'm just tired and don't know what to do. So, here I am, begging for your help.
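    Two small notes that may save some time: having no /etc/X11/xorg.conf is normal on 12.10 (X autodetects the Intel GPU on an Optimus laptop), and the Compiz/Unity reset can be made to stick instead of being redone each boot. A hedged sketch of the persistent reset ('setsid unity' being the corrected spelling of the command quoted above):

        dconf reset -f /org/compiz/    # wipe the Compiz settings the upgrade may have corrupted
        setsid unity                   # restart the shell in the current session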

    Read the article

  • How to recognize an optimus laptop?

    - by kellogs
    kellogs@kellogs-K52Jc ~ $ lspci 00:00.0 Host bridge: Intel Corporation Core Processor DRAM Controller (rev 18) 00:02.0 VGA compatible controller: Intel Corporation Core Processor Integrated Graphics Controller (rev 18) 00:16.0 Communication controller: Intel Corporation 5 Series/3400 Series Chipset HECI Controller (rev 06) 00:1a.0 USB controller: Intel Corporation 5 Series/3400 Series Chipset USB2 Enhanced Host Controller (rev 06) 00:1b.0 Audio device: Intel Corporation 5 Series/3400 Series Chipset High Definition Audio (rev 06) 00:1c.0 PCI bridge: Intel Corporation 5 Series/3400 Series Chipset PCI Express Root Port 1 (rev 06) 00:1c.1 PCI bridge: Intel Corporation 5 Series/3400 Series Chipset PCI Express Root Port 2 (rev 06) 00:1c.5 PCI bridge: Intel Corporation 5 Series/3400 Series Chipset PCI Express Root Port 6 (rev 06) 00:1d.0 USB controller: Intel Corporation 5 Series/3400 Series Chipset USB2 Enhanced Host Controller (rev 06) 00:1e.0 PCI bridge: Intel Corporation 82801 Mobile PCI Bridge (rev a6) 00:1f.0 ISA bridge: Intel Corporation Mobile 5 Series Chipset LPC Interface Controller (rev 06) 00:1f.2 SATA controller: Intel Corporation 5 Series/3400 Series Chipset 4 port SATA AHCI Controller (rev 06) 00:1f.6 Signal processing controller: Intel Corporation 5 Series/3400 Series Chipset Thermal Subsystem (rev 06) 02:00.0 Network controller: Atheros Communications Inc. AR9285 Wireless Network Adapter (PCI-Express) (rev 01) 03:00.0 System peripheral: JMicron Technology Corp. SD/MMC Host Controller (rev 80) 03:00.2 SD Host controller: JMicron Technology Corp. Standard SD Host Controller (rev 80) 03:00.3 System peripheral: JMicron Technology Corp. MS Host Controller (rev 80) 03:00.4 System peripheral: JMicron Technology Corp. xD Host Controller (rev 80) 03:00.5 Ethernet controller: JMicron Technology Corp. JMC250 PCI Express Gigabit Ethernet Controller (rev 03) ff:00.0 Host bridge: Intel Corporation Core Processor QuickPath Architecture Generic Non-core Registers (rev 05) ff:00.1 Host bridge: Intel Corporation Core Processor QuickPath Architecture System Address Decoder (rev 05) ff:02.0 Host bridge: Intel Corporation Core Processor QPI Link 0 (rev 05) ff:02.1 Host bridge: Intel Corporation Core Processor QPI Physical 0 (rev 05) ff:02.2 Host bridge: Intel Corporation Core Processor Reserved (rev 05) ff:02.3 Host bridge: Intel Corporation Core Processor Reserved (rev 05) kellogs@kellogs-K52Jc ~ $ inxi -SGx System: Host: kellogs-K52Jc Kernel: 3.5.0-17-generic x86_64 (64 bit, gcc: 4.7.2) Desktop: KDE 4.9.5 (Qt 4.8.3) Distro: Linux Mint 14 Nadia Graphics: Card: Intel Core Processor Integrated Graphics Controller bus-ID: 00:02.0 X.Org: 1.13.0 drivers: intel (unloaded: fbdev,vesa) Resolution: [email protected] GLX Renderer: Mesa DRI Intel Ironlake Mobile GLX Version: 2.1 Mesa 9.0.3 Direct Rendering: Yes kellogs@kellogs-K52Jc ~ $ lshw [...] *-display description: VGA compatible controller product: Core Processor Integrated Graphics Controller vendor: Intel Corporation physical id: 2 bus info: pci@0000:00:02.0 version: 18 width: 64 bits clock: 33MHz capabilities: vga_controller bus_master cap_list rom configuration: driver=i915 latency=0 resources: irq:44 memory:d0000000-d03fffff memory:c0000000-cfffffff ioport:e080(size=8) The manufacturer advertises the K52Jc model, which I bought, as Optimus-enabled. However, there is no trace of it in the output above. Of course, Bumblebee would not start on this machine. Should I conclude that this is a defective / non-Optimus machine?
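    One thing worth ruling out before calling the machine defective: on many Optimus laptops the NVIDIA half of the pair is reported by lspci as a "3D controller" (PCI class 0302) rather than a second "VGA compatible controller", so checks that grep only for VGA can miss it. A broader check (a simple sketch, nothing machine-specific assumed):

        lspci -nn | grep -Ei 'vga|3d|display'   # lists every display-class device, whatever its PCI class

    If only the Intel controller shows up even then, this particular K52Jc unit most likely ships without the discrete GPU at all.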

    Read the article

  • How to configure multiple monitors with Optimus?

    - by irrational
    I have an Acer Aspire 8951G running 12.04 Pangolin with bumblebee working beautifully. My problem is that when I connect my projector to either the VGA or the HDMI port, I can't see any way to properly set up the resolutions or colours. The default basic display driver sees the projector correctly, but messes up colours and resolutions on HDMI, and resolutions on VGA. (It's a 1280 x 720 projector.) Am I missing some sort of Xorg configuration? nvidia-xconfig does not seem to exist, and running optirun nvidia-settings -c :8 opens the settings, but of course only for the one display. I just want a way to set a default config for my projector via VGA or, preferably, HDMI. Any help would be wonderful.
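    On a Bumblebee setup the external connectors are usually wired to the Intel GPU, so they are configured with plain xrandr rather than nvidia-settings. A hedged sketch; the output names (LVDS1, HDMI1, VGA1) are assumptions to be checked against the bare xrandr listing first:

        xrandr                                                    # list connectors and the modes the projector offers
        xrandr --output HDMI1 --mode 1280x720 --right-of LVDS1    # or --output VGA1 for the VGA lead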

    Read the article

  • Can't use nvidia card/driver on optimus notebook

    - by Mr. Pixel
    I installed (once again) the latest official nvidia driver for my GT540m on Ubuntu 11.10. Everything seems OK with my xorg.conf file (I've manually added BusID "PCI:1:0:0", since lspci shows 01:00.0 for my GPU). The problem is that when I use the xorg.conf file generated by Xorg -configure, Xorg automatically loads the Intel GPU. So I removed everything that was not related to my nvidia card, basically leaving my xorg.conf with one screen and one device (with the nvidia driver and the above-mentioned BusID), and Xorg fails to start. The log says something like "Devices on GT540m [newline] none", and a few lines later, something like "NVIDIA(0) found a screen, but have no device for it". When I don't set the BusID, it doesn't seem to detect my card either. Thank you for any suggestion. PS: If possible, I'd like to avoid bumblebee or any similar "hybrid graphics" solution; last time I tried, I ended up reinstalling Ubuntu. Edit: Allow me to clarify the problem. I have a notebook with a GT540m graphics card and an integrated intel GPU. I want to use the graphics card with full hardware acceleration and its official driver, as I do under Windows.

    Read the article

  • Can't get Optimus to work with Ironhide on an Asus N53SN

    - by Musaab
    I installed Ubuntu 11.10 (same issue in 11.04, by the way) and then installed Ironhide. I went through the configuration, chose the one with the most confirmations for my system, and tested it: > Error: Module nvidia does not exist in /proc/modules > P50 Disabling nVidia Card Succeded (the spelling error is theirs) And it changes nothing. I tried other configurations and got worse results. This has really become a major headache. Any solutions?
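    Ironhide was a fork of Bumblebee that was later abandoned, so the usual advice from this era was to remove it and use Bumblebee itself. A hedged migration sketch; the package and PPA names are assumptions, not taken from the question:

        sudo apt-get purge ironhide                 # remove the fork first (package name assumed)
        sudo add-apt-repository ppa:bumblebee/stable
        sudo apt-get update && sudo apt-get install bumblebee bumblebee-nvidia
        optirun glxinfo | grep "OpenGL renderer"    # should report the GeForce once offloading works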

    Read the article

  • Ubuntu 14.04 + nvidia-331-updates makes a blank desktop screen

    - by Achint
    I upgraded my installation from 13.10 to 14.04. The problem is that whenever I install the Nvidia drivers from the GUI, upon reboot or logging in again it only shows my desktop wallpaper and nothing else. The mouse does move around, but nothing works; I am unable to open a terminal or do anything else. If I go into the tty console and purge the drivers, then things seem to work again. I have an Optimus setup, with an onboard Intel GPU and a discrete Nvidia GTX 770M card. It's a 64-bit architecture. I really need to work with CUDA, and was hopeful after hearing that nvidia-prime was released, but this is a real downer. Any help on this?
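    For what it's worth, the nvidia-prime route on 14.04 is driven from the command line with prime-select; a hedged sketch starting from a clean slate (package names assumed current for that release):

        sudo apt-get purge 'nvidia*'                  # clear out the broken driver install first
        sudo apt-get install nvidia-331 nvidia-prime
        sudo prime-select nvidia                      # have the discrete GPU drive the desktop (and CUDA)
        prime-select query                            # confirm, then log out and back in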

    Read the article

  • Dock with dual external DVI monitors with Intel + Nvidia Optimus?

    - by Ryan
    I have a Dell Latitude E6420 laptop plugged into a docking station, and the dock has 2 monitors (connected with DVI). Also note that I've installed Ubuntu alongside (dual-boot) Windows 7. I can't get the dual monitors to work on both Ubuntu (either 11.10 or 12.04) and Windows 7. When I run lspci | grep VGA, I get: 00:02.0 VGA compatible controller: Intel Corporation 2nd Generation Core Processor Family Integrated Graphics Controller (rev 09) 01:00.0 VGA compatible controller: nVidia Corporation GF108 [Quadro NVS 4200M] (rev a1) If I then reboot and uncheck the Optimus setting in the BIOS during reboot, I'm able to get the dual monitors to work in Ubuntu 12.04 (but I need to configure them on every boot in Nvidia Settings). When I run lspci | grep VGA, I get: 01:00.0 VGA compatible controller: NVIDIA Corporation GF119 [Quadro NVS 4200M] (rev a1) But then if I reboot into Windows (leaving Optimus unchecked), Windows can't detect the external monitors, and the resolution is unacceptably low. I've seen in many forum posts that this particular graphics card setup causes lots of headaches, but I haven't been able to resolve my problem yet. Related questions: How can I use my external display on my laptop with intel and nvidia video cards? How to use external displays with Intel driver on a NVidia/Intel hybrid system nVidia Optimus, Unity 3D and Dual Monitors "Just use VGA instead of DVI" isn't an option because my dock has only 1 VGA port (and 2 DVI). Switching the BIOS setting on every reboot and then reconfiguring the display settings every time is tedious, time-consuming, and impractical. Do you know how to make this work smoothly? Thanks for your help! P.S. see also: http://superuser.com/questions/434358/dell-latitude-e6420-dual-boot-ubuntu-windows-7-optimus-graphics-problems
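    One half-measure for the Ubuntu side while Optimus stays disabled in the BIOS: the monitor layout set up in nvidia-settings can be written to /etc/X11/xorg.conf once so it survives reboots instead of being redone every boot. A hedged sketch:

        sudo nvidia-xconfig      # generate a baseline /etc/X11/xorg.conf for the discrete card
        sudo nvidia-settings     # arrange the two DVI monitors, then use "Save to X Configuration File"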

    Read the article

  • Nvidia Optimus shows Nvidia card is always active in Windows 8

    - by Ty Underwood
    I have loaded the new Asus Nvidia driver for my Asus UX32VD, on which I have installed Windows 8. It has an Nvidia 620M and Intel HD 4000 integrated graphics. The problem is that the Optimus activity icon in my system tray shows that the Nvidia card is always active, even though it also shows that the card is not doing anything. I can force the system to prefer the Intel graphics, and after rebooting, the Nvidia card will shut off, but then all programs will be overridden and use the Intel graphics. Is anyone else having problems with Optimus in Windows 8? My battery life is really suffering because I'm always idling my Nvidia card while also presumably running the Intel card. Any suggestions would be appreciated.

    Read the article

  • Dell Latitude E6420 dual-boot Ubuntu + Windows 7 Optimus graphics problems

    - by Ryan
    I have a Dell Latitude E6420 laptop with Ubuntu 12.04 alongside Windows 7 (dual-boot), docked in a docking station with 2 DVI outputs. It took me a week of tinkering to get the dual external monitors to work in Ubuntu, and I had to disable the "Optimus" feature in the BIOS. But now neither external monitor is detected in Windows, and the resolution is also very low. Do you know how I can successfully dual-boot Windows 7 and Ubuntu on this machine using my 2 external DVI monitors? I have an open question here too, trying to resolve this same issue: http://askubuntu.com/questions/146933/dock-with-dual-external-dvi-monitors-with-intel-nvidia-optimus

    Read the article

  • Notebook with NVIDIA Optimus not switching video card in games

    - by user140739
    I have a Samsung RC720 notebook with Intel integrated graphics and an NVIDIA GeForce GT 520M. As you can see, it has two video adapters, and Optimus is supposed to switch between them. But when I choose the dedicated GPU in the NVIDIA Control Panel and try to run, for example, GTA IV, it uses the integrated graphics and I get very poor performance. I have already installed the latest NVIDIA and notebook drivers, chosen high performance in the NVIDIA Control Panel, tried launching with the "Run with graphics processor..." context menu option, and so on. Thanks for any help.

    Read the article

  • RandR 1.4 Optimus Dual Monitor

    - by mathepic
    So, I have a dual monitor setup. The HDMI output is on the Nvidia card; the main display is on the Intel (I think). I want to use XMonad with the dual setup, and I want to be able to run with or without the second monitor. Is this even doable? I'm using RandR 1.4 and can get both monitors to display something at the same time (by messing with xrandr), but XMonad can never detect more than one rectangle from Xinerama. Does anyone have a working multi-monitor Xinerama or TwinView configuration that works with Optimus/RandR 1.4?
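    A hedged sketch of the RandR 1.4 wiring for this layout (Intel renders, the NVIDIA-wired HDMI acts as an extra output); the provider names below are assumptions to be read off --listproviders first:

        xrandr --listproviders                           # note the exact provider names (e.g. "Intel", "nouveau" or "NVIDIA-0")
        xrandr --setprovideroutputsource nouveau Intel   # make the NVIDIA provider's outputs sinks for the Intel source
        xrandr --auto
        # XMonad samples the Xinerama/RandR layout at startup, so restart it (mod-q) after the second output comes up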

    Read the article

  • Prime Numbers Code Help

    - by andrew
    Hello everybody, I am supposed to "write a Java program that reads a positive integer n from standard input, then prints out the first n prime numbers." It's divided into 3 parts. 1st: This function will return true or false according to whether m is prime or composite. The array argument P will contain a sufficient number of primes to do the testing. Specifically, at the time isPrime() is called, array P must contain (at least) all primes p in the range 2 ≤ p ≤ √m. For instance, to test m = 53 for primality, one must do successive trial divisions by 2, 3, 5, and 7. We go no further since 11 > √53. Thus a precondition for the function call isPrime(53, P) is that P[0] = 2, P[1] = 3, P[2] = 5, and P[3] = 7. The return value in this case would be true since all these divisions fail. Similarly, to test m = 143, one must do trial divisions by 2, 3, 5, 7, and 11 (since 13 > √143). The precondition for the function call isPrime(143, P) is therefore P[0] = 2, P[1] = 3, P[2] = 5, P[3] = 7, and P[4] = 11. The return value in this case would be false since 11 divides 143. Function isPrime() should contain a loop that steps through array P, doing trial divisions. This loop should terminate either when a trial division succeeds, in which case false is returned, or when the next prime in P is greater than √m, in which case true is returned. Then there is the "main function": • Check that the user supplied exactly one command line argument which can be interpreted as a positive integer n. If the command line argument is not a single positive integer, your program will print a usage message as specified in the examples below, then exit. • Allocate array Primes[] of length n and initialize Primes[0] = 2. • Enter a loop which will discover subsequent primes and store them as Primes[1], Primes[2], Primes[3], ..., Primes[n-1]. This loop should contain an inner loop which walks through successive integers and tests them for primality by calling function isPrime() with appropriate arguments. • Print the contents of array Primes[] to stdout, 10 to a line separated by single spaces. In other words Primes[0] through Primes[9] will go on line 1, Primes[10] through Primes[19] will go on line 2, and so on. Note that if n is not a multiple of 10, then the last line of output will contain fewer than 10 primes. The last function is called "usage", and I am not sure how to write it: your program will include a function called Usage() having signature static void Usage() that prints this message to stderr, then exits. Thus your program will contain three functions in all: main(), isPrime(), and Usage(). Each should be preceded by a comment block giving its name, a short description of its operation, and any necessary preconditions (such as those for isPrime()). And here is my code, but I am having a bit of a problem; could you guys help me fix it? If I enter the number "5" it gives me the prime numbers "6, 7, 8, 9", which doesn't make much sense.
        import java.util.*;
        import java.io.*;
        import java.lang.*;

        public class PrimeNumber {

           static boolean isPrime(int m, int[] P) {
              int squarert = Math.round((float) Math.sqrt(m));
              int i = 2;
              boolean ans = false;
              while ((i <= squarert) & (ans == false)) {
                 int c = P[i];
                 if (m % c == 0)
                    ans = true;
                 else
                    ans = false;
                 i++;
              }
              /*
              if (ans == true)
                 ans = false;
              else
                 ans = true;
              */
              return ans;
           }

           // ****main
           public static void main(String[] args) {
              Scanner in = new Scanner(System.in);
              int input = in.nextInt();
              int i, j;
              int squarert;
              boolean ans = false;
              int userNum;
              int remander = 0;
              System.out.println("input: " + input);
              int[] prime = new int[input];
              prime[0] = 2;
              for (i = 1; i < input; i++) {
                 j = prime[i - 1] + 1;
                 ans = false;
                 while (!ans) {
                    ans = isPrime(j, prime);
                    j++;
                 }
                 prime[i] = j;
              }
              // print primes
              System.out.println("The first " + input + " prime number(s) are: ");
              for (int r = 0; r < input; r++)
                 System.out.print(prime[r] + " ");
           } // end of main
        }

    Thanks for the help.

    Read the article

  • Checking if an int is prime more efficiently

    - by SipSop
    I was recently part of a small Java programming competition at my school. My partner and I have just finished our first pure OOP class, and most of the questions were out of our league, so we settled on this one (and I am paraphrasing somewhat): "given an input integer n, return the next int that is prime and whose reverse is also prime; for example, if n = 18 your program should print 31", because 31 and 13 are both prime. Your .class file would then have test cases of all the possible numbers from 1-2,000,000,000 passed to it, and it had to return the correct answer within 10 seconds to be considered valid. We found a solution, but with larger test cases it would take longer than 10 seconds. I am fairly certain there is a way to shrink the looping range from n..2,000,000,000, as the likelihood of needing to loop that far when n is a low number is small; either way, we break the loop when a number that is prime under both conditions is found. At first we were looping from 2..n no matter how large it was; then I remembered the rule about only looping to the square root of n. Any suggestions on how to make my program more efficient? I have had no classes dealing with complexity analysis of algorithms. Here is our attempt:

        public class P3 {
           public static void main(String[] args) {
              long loop = 2000000000;
              long n = Integer.parseInt(args[0]);
              for (long i = n; i < loop; i++) {
                 String s = i + "";
                 String r = "";
                 for (int j = s.length() - 1; j >= 0; j--)
                    r = r + s.charAt(j);
                 if (prime(i) && prime(Long.parseLong(r))) {
                    System.out.println(i);
                    break;
                 }
              }
              System.out.println("#");
           }

           public static boolean prime(long p) {
              for (int i = 2; i < (int) Math.sqrt(p); i++) {
                 if (p % i == 0)
                    return false;
              }
              return true;
           }
        }

    P.S. Sorry if I did the formatting for the code wrong; this is my first time posting here. Also, the output had to have a '#' after each line; that's what the line after the loop is about. Thanks for any help you guys offer!

    Read the article

  • How To Make NVIDIA’s Optimus Work on Linux

    - by Chris Hoffman
    Many new laptops come with NVIDIA's Optimus technology – the laptop includes both a discrete NVIDIA GPU for gaming power and an onboard Intel GPU for power savings, and the notebook switches between the two when necessary. However, this isn't yet well supported on Linux. Linus Torvalds had some choice words for NVIDIA regarding Optimus not working on Linux, and NVIDIA is now working on official support. However, if you have a laptop with Optimus, you don't have to wait for NVIDIA — you can use the Bumblebee project's solution to enable Optimus on Linux today. Image Credit: Jemimus on Flickr
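    The Bumblebee route generally amounts to something like the following (the PPA and package names are the project's 12.04-era ones and are given here as assumptions, not quoted from the article):

        sudo add-apt-repository ppa:bumblebee/stable
        sudo apt-get update
        sudo apt-get install bumblebee bumblebee-nvidia
        optirun glxgears    # run one program on the NVIDIA GPU; everything else stays on the Intel GPU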

    Read the article

  • C++ question on prime numbers.

    - by user278330
    Hello. I am trying to make a program that determines whether a number is prime or composite. I have gotten this far. Could you give me any ideas so that it will work? All primes come out right; however, because composites produce remainders r that are both != 0 and == 0 (depending on the divisor) and only the last remainder is checked, they always end up classified as prime. How can I fix this?

        int main() {
           int pNumber, limit, x, r;
           limit = 0;
           x = 2;
           cout << "Please enter any positive integer: ";
           cin >> pNumber;
           if (pNumber < 0) {
              cout << "Invalid. Negative Number. " << endl;
              return 0;
           }
           else if (pNumber == 0) {
              cout << "Invalid. Zero has an infinite number of divisors, and therefore neither composite nor prime." << endl;
              return 0;
           }
           else if (pNumber == 1) {
              cout << "Valid. However, one is neither prime nor composite" << endl;
              return 0;
           }
           else {
              while (limit < pNumber) {
                 r = pNumber % x;
                 x++;
                 limit++;
              }
              if (r == 0)
                 cout << "Your number is composite" << endl;
              else
                 cout << "Your number is prime" << endl;
           }
           return 0;
        }

    Read the article
