Search Results

Search found 955 results on 39 pages for 'gpu'.

Page 15 of 39

  • How do I diagnose a HP Notebook which has hardware issues?

    - by Rob
    My HP DV6t-2300 recently crashed while using FLV Player. It wouldn't start up after this so I had to do a hardware reset (remove power sources, hold down power button ~15 seconds, put back power sources). After this it would turn on, but start up would freeze in Ubuntu, Windows 7, Ubuntu Recovery Mode, and various Linux Live CDs. The only successful way to boot was in Windows 7 Safe Mode. The HP Customer Service was very polite, but they are trying to blame it on a corrupt operating system which is clearly not the case (since I have tried 4 operating systems and none work). I am thinking it might be the GPU since 1) I was watching movies when it crashed and 2) Windows Safe Mode might not use the dedicated GPU. I already ran Memory and HDD tests and there were no detected errors. Any ideas of what's wrong, or suggestions for tests that I should run in safe mode? Should I try reinstalling Windows 7 to convince HP that it's not the OS?
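    One low-effort test that may help separate a GPU fault from a disk or OS problem (a hedged sketch only; the live CDs already tried would need the extra boot parameter, and the package names in the comments are the usual Ubuntu ones): boot a Linux live CD with the nomodeset kernel parameter, which skips GPU mode setting much like Windows Safe Mode skips the dedicated driver, and if the live session then comes up, inspect the hardware from there:

        # add "nomodeset" to the kernel boot options at the live CD menu, then from the live session:
        lspci -nnk | grep -A3 -i vga      # which GPU is present and which driver binds to it
        dmesg | grep -iE "error|fail"     # kernel errors logged during boot
        sudo smartctl -a /dev/sda         # disk health (package: smartmontools)
        sensors                           # temperatures (package: lm-sensors)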

    Read the article

  • Nvidia Drivers on Debian / Lenny (Stable) -> Installation successful -> Monitor goes black

    - by David
    I have successfully installed the proprietary drivers for my NVIDIA (GeForce 7300 GT) graphics card on Debian/Lenny. I know it's not the recommended way to install the driver (see this link: http://wiki.debian.org/NvidiaGraphicsDrivers#non-freedrivers ), but this and nvidia-kernel module compilation seemed to be the only two ways possible for me. The problem is that the monitor goes black and its power light starts blinking as soon as I launch the X server. Here is a short look at the logs (output truncated from /var/log/Xorg.0.log):

        (II) Setting vga for screen 0.
        (**) NVIDIA(0): Depth 24, (--) framebuffer bpp 32
        (==) NVIDIA(0): RGB weight 888
        (==) NVIDIA(0): Default visual is TrueColor
        (==) NVIDIA(0): Using gamma correction (1.0, 1.0, 1.0)
        (**) Jul 28 17:10:11 NVIDIA(0): Enabling RENDER acceleration
        (II) Jul 28 17:10:11 NVIDIA(0): Support for GLX with the Damage and Composite X extensions is
        (II) Jul 28 17:10:11 NVIDIA(0):     enabled.
        (II) Jul 28 17:10:11 NVIDIA(0): NVIDIA GPU GeForce 7300 GT (G73) at PCI:1:0:0 (GPU-0)
        (--) Jul 28 17:10:11 NVIDIA(0): Memory: 262144 kBytes
        (--) Jul 28 17:10:11 NVIDIA(0): VideoBIOS: 05.73.22.25.00
        (II) Jul 28 17:10:11 NVIDIA(0): Detected PCI Express Link width: 16X
        (--) Jul 28 17:10:11 NVIDIA(0): Interlaced video modes are supported on this GPU
        (--) Jul 28 17:10:11 NVIDIA(0): Connected display device(s) on GeForce 7300 GT at PCI:1:0:0:
        (--) Jul 28 17:10:11 NVIDIA(0):     Samsung SyncMaster (CRT-0)
        (--) Jul 28 17:10:11 NVIDIA(0):     Samsung SyncMaster (DFP-0)
        (--) Jul 28 17:10:11 NVIDIA(0): Samsung SyncMaster (CRT-0): 400.0 MHz maximum pixel clock
        (--) Jul 28 17:10:11 NVIDIA(0): Samsung SyncMaster (DFP-0): 165.0 MHz maximum pixel clock
        (--) Jul 28 17:10:11 NVIDIA(0): Samsung SyncMaster (DFP-0): Internal Single Link TMDS
        (II) Jul 28 17:10:11 NVIDIA(0): Assigned Display Device: CRT-0
        (==) Jul 28 17:10:11 NVIDIA(0):
        (==) Jul 28 17:10:11 NVIDIA(0): No modes were requested; the default mode "nvidia-auto-select"
        (==) Jul 28 17:10:11 NVIDIA(0):     will be used as the requested mode.
        (==) Jul 28 17:10:11 NVIDIA(0):
        (II) Jul 28 17:10:11 NVIDIA(0): Validated modes:
        (II) Jul 28 17:10:11 NVIDIA(0):     "nvidia-auto-select"
        (II) Jul 28 17:10:11 NVIDIA(0): Virtual screen size determined to be 1280 x 1024
        (--) Jul 28 17:10:11 NVIDIA(0): DPI set to (85, 86); computed from "UseEdidDpi" X config
        (--) Jul 28 17:10:11 NVIDIA(0):     option
        (==) Jul 28 17:10:11 NVIDIA(0): Enabling 32-bit ARGB GLX visuals.
        (--) Depth 24 pixmap format is 32 bpp

    Here is the complete /etc/X11/xorg.conf file as generated by nvidia-xconfig:

        # nvidia-xconfig: X configuration file generated by nvidia-xconfig
        # nvidia-xconfig: version 256.35 (buildmeister@builder101) Wed Jun 16 19:25:59 PDT 2010

        Section "ServerLayout"
            Identifier "Layout0"
            Screen 0 "Screen0"
            InputDevice "Keyboard0" "CoreKeyboard"
            InputDevice "Mouse0" "CorePointer"
        EndSection

        Section "Files"
        EndSection

        Section "Module"
            Load "dbe"
            Load "extmod"
            Load "type1"
            Load "freetype"
            Load "glx"
        EndSection

        Section "InputDevice"
            # generated from default
            Identifier "Mouse0"
            Driver "mouse"
            Option "Protocol" "auto"
            Option "Device" "/dev/psaux"
            Option "Emulate3Buttons" "no"
            Option "ZAxisMapping" "4 5"
        EndSection

        Section "InputDevice"
            # generated from default
            Identifier "Keyboard0"
            Driver "kbd"
        EndSection

        Section "Monitor"
            Identifier "Monitor0"
            VendorName "Unknown"
            ModelName "Unknown"
            Hor
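    One thing the log above suggests is worth checking: it lists two connected display devices, Samsung SyncMaster CRT-0 and DFP-0, and X auto-assigns CRT-0. If the monitor is physically attached to the DVI (digital) connector, a black screen with a blinking power LED would be consistent with the image going to the wrong output, and the driver's UseDisplayDevice option can force the digital one. A minimal sketch, assuming the nvidia-xconfig default identifier "Device0" (the actual Device section is cut off above):

        Section "Device"
            Identifier  "Device0"
            Driver      "nvidia"
            # Force output to the digital connector instead of the auto-assigned CRT-0.
            Option      "UseDisplayDevice" "DFP-0"
        EndSection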

    Read the article

  • Triple (3) Monitors under Linux

    - by widgisoft
    I have a 3-monitor setup (each 1680x1050) via an Nvidia NVS440 (2 GPUs, 2 outputs per GPU, totalling 4 outputs); this works fine under Windows XP/7 but caused considerable headaches under Linux (Ubuntu 9.04). I had previously used an XFX 9600GT plus the onboard 9300GS to produce the same result, but that card was noisy and power hungry, and I was hoping there was some magical switch in the NVS440 that got rid of this annoying problem - turns out the NVS440 is just 2 cards on one physical PCB :-p (I searched the net high and low for people using this card under Linux but found nothing; if anything the card uses less power and is fanless, so I was to benefit from it either way.) Anyway, using either setup there were 5 solutions available:
    1. Three separate X instances, all unjoined
    2. Three separate X instances, adjoined by Xinerama
    3. Two separate X instances - one using TwinView, both adjoined by Xinerama
    4. Two separate X instances - one using TwinView, but no Xinerama
    5. A single TwinView setup, leaving the 3rd screen unplugged :-p
    The fourth option, using 2 separate X instances and TwinView (but no Xinerama), was the best balance in terms of performance and usability, but caused some really annoying issues:
    - You couldn't control (without altering the shortcuts) which screen an application opened onto - and once it was opened you couldn't move it to another screen without opening a terminal and forcing it to move.
    - Nvidia's overriding or falsifying of Xinerama breaks, and the 2 screens joined by TwinView behave like a single huge screen, causing popups to open in the middle of both screens and maximising of windows to stretch to the width of the first 2 screens.
    - Firefox can only run one instance as the same user, so having multiple Firefox windows requires at least 2 users.
    The second option "feels" like the right option, but OpenGL is basically disabled, and playing any sort of game or even running anything graphical causes a huge performance drop and instability - even trying to run a basic emulator for GBA or Gens just causes the system to fall over. It works just enough to stare at your desktop and do nothing, but as soon as you start doing some work - opening windows, dragging things around, running multiple copies of Firefox - it just really feels slow. The last option, only going dual screen, works perfectly and everything performs as required, full GPU acceleration - two logical screen spaces - perfect; just make it work across GPUs like Windows does! :-p Anyway, I know RandR was supposed to pick up the slack when it introduced GPU objects of sorts to allow multiple GPUs to be stitched together to create one huge desktop at a much deeper layer than Xinerama. I was wondering if this has now been fixed (I noticed X server 1.7 is out) and whether anyone has got it running successfully?
    Again, my requirements are:
    - One huge desktop to drag any window across
    - Maximising of windows to each screen (as XP does)
    - Running fullscreen apps on the primary screen, with the mouse either blocked from moving onto the others or stretched across all 3
    Finally, as a side note: I am aware of the Matrox triple (and dual) head splitter, but even the price they go for on eBay is more than I can afford atm. My argument: I shouldn't have to buy extra hardware to get something to work on Linux when it's something that has existed in the Windows world for a long time (can you tell I don't get on with X? :-p). If I had the cash I'd have bought the latest version of this box already (the new version finally supports large resolutions, as the displays I have are 1680x1050 each).
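    For what it's worth, the TwinView-plus-Xinerama arrangement described in option 3 looks roughly like this in xorg.conf (a minimal sketch only; the BusIDs are placeholders to be read off lspci, and the Monitor sections and most Screen details are omitted):

        Section "Device"
            Identifier "GPU0"
            Driver     "nvidia"
            BusID      "PCI:3:0:0"        # first GPU on the NVS440 (placeholder)
        EndSection

        Section "Device"
            Identifier "GPU1"
            Driver     "nvidia"
            BusID      "PCI:4:0:0"        # second GPU on the NVS440 (placeholder)
        EndSection

        Section "Screen"
            Identifier "Screen0"
            Device     "GPU0"
            Option     "TwinView"  "1"    # two 1680x1050 monitors on the first GPU
            Option     "MetaModes" "1680x1050,1680x1050"
        EndSection

        Section "Screen"
            Identifier "Screen1"
            Device     "GPU1"             # third monitor on the second GPU
        EndSection

        Section "ServerLayout"
            Identifier "ThreeHeads"
            Screen 0 "Screen0"
            Screen 1 "Screen1" RightOf "Screen0"
            Option "Xinerama" "1"         # stitch the two X screens into one desktop
        EndSection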

    Read the article

  • recommendations for a lightweight linux distribution for a test server

    - by Jack
    I'm planning on setting up a test server to experiment with some application servers (Tomcat/JBoss/...) and with some portals. The machine I've set aside for this is lightweight CPU/GPU-wise (Atom D510, 4 gigabytes of RAM, 500 GiB HDD, onboard GPU), but it should suffice for most things; I'm mainly interested in the stability of JBoss/Tomcat for my purposes. However, I'm having a bit of trouble picking an appropriate distribution size/performance/setup-time/security-wise, since it seems I can't sneeze without another distribution popping up. I've been thinking about going for Fedora, since I've read that that distribution has been optimized for Atom, but I'm not really familiar with it. My experience with Linux has mostly been limited to Ubuntu and some tinkering with Puppy Linux. I'm not afraid to get my hands dirty using the command line. I'm not planning on starting a discussion per se; I'm mostly after the pros/cons that people have encountered with some distributions.

    Read the article

  • Fan is spinning too fast just in Windows - software?

    - by B. Roland
    I've recently replaced my fans (CPU, GPU, and I bought a CHA/chassis fan). The GPU fan remained the same, but I've occasionally seen it spin about twice as fast as usual. The problem is that the CPU fan in Windows (especially 7) spins too much: it keeps the CPU under 40°C, but runs at 3300-3600 RPM, which I think is too high. If I switch to Ubuntu, it keeps the CPU at ~40-45°C with 2500-2800 RPM, which is a big difference both in numbers and in noise. I'm looking for a manual fan control solution, or some way to reduce Windows' fan speed multipliers. I bought the new fans for the lower noise (and they do deliver it, just not at 3.6k RPM). Thank you!

    Read the article

  • Are these symptoms of a dying video card?

    - by K Cloud
    What I am getting from my PC: scrambled text in the boot menu. The scrambled text disappears and everything goes back to normal after several restarts, or once the computer has been running for some time. The display goes blank, reporting that the Windows kernel mode driver stopped responding. Sometimes Windows just hangs and I need to restart the PC. There used to be some scrambled colors in Windows, so I cleaned up my video card and its port and reseated the card; things are somewhat better now. The PC runs normally for hours in safe mode, since the default display driver is used in that case. My PC specs: Core 2 Duo processor, 2 GB RAM, NVIDIA EN210 SILENT GPU, Windows 8. Driver details: NVIDIA drivers v331.65, said to be fully compatible with Windows 8. Currently I am installing Windows 7 and will be testing the older driver version v320, yet I'm really confused whether my GPU is dying or still good to go.

    Read the article

  • netbook intel GMA 3150 external monitor 1920x1080 flicker problem

    - by seyenne
    Dear all, I recently purchased an Acer netbook (Aspire One D260). It runs flawlessly. Yesterday I bought a Samsung 23" TFT with a native resolution of 1920x1080. According to information found on the internet and from my local computer dealer, the Intel chipset can handle the monitor's native resolution. However, this is only partly the case. I use the VGA cable to connect; the monitor instantly switches to the native resolution, and now the problem: occasionally, especially in the first 2 hours after booting up, I get flickering all over the screen, and sometimes the entire screen shakes and spins around like crazy. I figured out that lowering the resolution avoids the flicker, but this only helps for some time. I can rule out the monitor as the problem, since I found no issues with another notebook. Right now I have no problems with the netbook; for about 30 minutes I haven't experienced any issues... but I don't know for how long, as it occurs without warning :-) I'm worried that if I bring the netbook back to the dealer and explain my problem, everything will work just fine when they test it on an external screen in the shop, and I won't get help because I can't prove the problem. (I'm currently in Thailand, and over here customer service is nothing like back home in Germany.) What can I do? Is this a driver-related issue? (I installed the latest GPU driver.) Is it because of the VGA cable? (But why does it work sometimes without any problems, and with no issues on the other notebook?) I monitored the GPU/CPU temperature; nothing really changes over time. Can it simply be a faulty GPU, and is a replacement justifiable? I'm really stressed now because for the time I've been writing, the flickering hasn't occurred... but sooner or later it will happen again. I forgot to mention that the problem also happens when the netbook runs on battery, unplugged, so the only hardware plugged in is the TFT screen. ...and here it comes again, the flickering has just begun. NEED HELP! Thank you all for reading through this and giving any suggestions if possible. Cheers
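    If the netbook happens to be running a Linux desktop (a hedged aside; the post doesn't say which OS is in use), pinning the external output to an explicit mode and refresh rate with xrandr takes auto-detection out of the equation and is a quick test of whether the flicker is mode-related. The output name VGA1 is typical for this Intel chipset but should be checked against the first command's listing:

        xrandr                                           # list outputs and the modes the driver detected
        xrandr --output VGA1 --mode 1920x1080 --rate 60  # pin the external monitor to a fixed mode/rate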

    Read the article

  • Upgraded to a GTX 580 - no video signal on boot

    - by MGOwen
    I have: a new Corsair 550W PSU (should be more than enough) and an AsRock H61M u3s3 mobo (with the latest BIOS update). I just replaced my faithful HD 7790 GPU with a refurbished EVGA GTX 580 1.5GB. When I boot, it sounds like the PC boots to Windows, and the 580's fan is spinning, but I can't get any video signal from the card. The old card and integrated graphics still work fine (when I swap back and use those instead). I am using the DVI ports on the card with a DVI-VGA adapter until I can get my hands on a mini HDMI adapter tomorrow. Has anyone ever heard of a GPU needing something special to make the VGA signal work? Hold down a key on startup or something? I can't even find a manual for this card.

    Read the article

  • DirectCompute

    In my previous blog post I introduced the concept of GPGPU, ending with: On Windows, there is already a cross-GPU-vendor way of programming GPUs and that is the DirectX API. Specifically, on Windows Vista and Windows 7, the DirectX 11 API offers a dedicated subset of the API for GPGPU programming: DirectCompute. You use this API on the CPU side to set up and execute the kernels on the GPU. The kernels are written in a language called HLSL (High Level Shader Language). You can use DirectCompute with HLSL to write a "compute shader", which is the term DirectX uses for what I've been referring to in this post as "kernel". In this post I want to share some links to get you started with DirectCompute and HLSL.
    1. Watch the recording of the PDC 09 session: DirectX11 DirectCompute.
    2. If session recordings are your thing, there are two more on DirectCompute from nvidia's GTC09 conference: 1015 (pdf, mp4) and 1411 (mp4 plus the presenter's written version of the session).
    3. Over at gamedev there is an old Compute Shader tutorial. At the same site, there is a 3-part blog post on Compute Shader: Introduction, Resources and Addressing.
    4. From PDC, you can also download the DirectCompute Hands On Lab.
    5. When you are ready to get your hands even dirtier, download the latest Windows DirectX SDK (at the time of writing the latest is dated Feb 2010).
    6. Within the SDK you'll find a Compute Shader Overview and samples such as: Basic, Sort, OIT, NBodyGravity, HDR Tone Mapping.
    7. Talking of DX11/DirectCompute samples, there are also a couple of good ones at this URL.
    8. The documentation of the various APIs is available online. Here are just some good (but far from complete) taster entry points into it: numthreads, SV_DispatchThreadID, SV_GroupThreadID, SV_GroupID, SV_GroupIndex, D3D11CreateDevice, D3DX11CompileFromFile, CreateComputeShader, Dispatch, D3D11_BIND_FLAG, GSSetShader.
    Comments about this post welcome at the original blog.
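    To give a feel for how those host-side entry points fit together, here is a minimal C++ sketch (not taken from the SDK samples; error handling and buffer/UAV setup are omitted, and kernel.hlsl and its CSMain entry point are made-up names) that creates a Direct3D 11 device, compiles a compute shader and dispatches it:

        #include <d3d11.h>
        #include <d3dx11.h>   // D3DX11CompileFromFile comes from the DirectX SDK (Feb 2010 era)

        // Minimal host-side setup for a DirectCompute kernel.
        void RunKernel()
        {
            ID3D11Device*        device  = nullptr;
            ID3D11DeviceContext* context = nullptr;

            // Create a hardware device; feature level 11.0 gives compute shader profile cs_5_0.
            D3D_FEATURE_LEVEL requested = D3D_FEATURE_LEVEL_11_0;
            D3D_FEATURE_LEVEL obtained;
            D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                              &requested, 1, D3D11_SDK_VERSION,
                              &device, &obtained, &context);

            // Compile the HLSL compute shader; the group size is declared on the HLSL
            // side with [numthreads(x, y, z)] on CSMain, and each thread identifies
            // itself there via SV_DispatchThreadID.
            ID3D10Blob* bytecode = nullptr;
            ID3D10Blob* errors   = nullptr;
            D3DX11CompileFromFile(TEXT("kernel.hlsl"), nullptr, nullptr,
                                  "CSMain", "cs_5_0", 0, 0, nullptr,
                                  &bytecode, &errors, nullptr);

            ID3D11ComputeShader* shader = nullptr;
            device->CreateComputeShader(bytecode->GetBufferPointer(),
                                        bytecode->GetBufferSize(), nullptr, &shader);

            // Bind the shader (input/output resources would be bound here via
            // CSSetShaderResources / CSSetUnorderedAccessViews) and launch
            // 16 x 16 x 1 thread groups.
            context->CSSetShader(shader, nullptr, 0);
            context->Dispatch(16, 16, 1);
        }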

    Read the article

  • Manually writing a dx11 tessellation shader

    - by Tudor
    I am looking for resources on the steps for manually implementing tessellation (I'm using Unity cg). Today it seems to be all the rage to hide most of the GPU code far away and use rather rigid simplifications such as Unity's SURFace shaders, which seem useless unless you're doing superficial stuff. A little background: I have procedurally generated meshes (using marching cubes) which have quality normals but no UVs and no tangents. I have successfully written a custom vertex and fragment shader to do triplanar texture and bumpmap projection as well as some custom stuff (custom lighting, procedurally warping the texture for variation, etc.). I am using the GPU Gems book as a reference. Now I need to implement tessellation, but it seems I must calculate the tangents at runtime by swizzling normals (ctrl+F this in Gems: <normal.z, normal.y, -normal.x>) before the tessellator gets them. And I also need to keep my custom vert+frag setup (with my custom parameters/textures being passed between them), so apparently I cannot use surface shaders. Can anyone provide some guidance?

    Read the article

  • Lenovo Ideapad S205 and Ubuntu 12.04 LTS

    - by Oranges
    I just wanted to report one thing, because I know there are a lot of Ideapad S205 users out there who had issues running Ubuntu on their Lenovo Ideapad S205 in the past. Lenovo Ideapad S205 with Ubuntu 12.04 LTS: everything works out of the box. I just installed it and everything, including the microphone, the webcam, the WLAN, the AMD E-350 CPU/GPU and the sound, worked right away. In other words: everything works. Even the AMD E-350 GPU works very well, both with the open source drivers and with the proprietary drivers; the proprietary drivers were automatically offered to me by Ubuntu itself. It works great. A long story written short: I love it. It's extremely stable and very fast. I am using the German version of the Ideapad S205, combined with the 32-bit version of Ubuntu. There is only one single thing to keep in mind when installing Ubuntu on the S205: plug your USB CD/DVD drive into the left USB port. I don't know why, but it only works with the USB port on the left side of the netbook. This is only for installing Ubuntu; after it has been installed and updated successfully, all of the USB ports work. Oranges - feedback on the Lenovo S205 with Ubuntu 12.04 LTS.

    Read the article

  • Nvidia optimus and Steam (on 12.04)

    - by Seiryuu
    I've obtained a copy of the .deb for the Steam beta, but it was pretty disappointing to see that it simply doesn't run. Hardware: Dell XPS L502, with Nvidia Optimus. I have Bumblebee installed. Trying to run Steam with the Intel HD 3000 completely fails to start it: I get the message "Installing breakpad exception handler for appid(steam)/version(1352224866_client)" followed by a crash with no other information provided. Trying to optirun steam runs the client, but as soon as it gets to the home screen, it says that the Nvidia drivers I am using are out of date (and Steam requires newer drivers to run). It's probably worth noting that it throws the same "Installing breakpad..." message when run with optirun, but it doesn't crash the client immediately. Any way to fix this? Also, is there a way of manually updating the drivers under Bumblebee without breaking anything? Alternatively, is there a reliable way of completely disabling the Intel GPU (in order to use the Nvidia GPU exclusively)? Note: I am using Xmonad with gnome-fallback, if that makes a difference. However, when I tried everything mentioned with Unity (2D), everything was the same, so I guess it has nothing to do with the window manager in use.
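    On the driver-update part of the question, the usual Bumblebee-era route (a hedged sketch, not a guaranteed fix; the PPA below is one commonly used source of newer nvidia-current builds) is to upgrade the driver package Bumblebee already uses, restart its daemon, and then check what optirun actually loads:

        sudo add-apt-repository ppa:ubuntu-x-swat/x-updates   # newer nvidia-current packages
        sudo apt-get update && sudo apt-get upgrade
        sudo service bumblebeed restart

        # verify the discrete GPU and driver version optirun ends up using
        optirun glxinfo | grep -E "OpenGL (vendor|renderer|version)"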

    Read the article

  • Heading to GTC 2010

    - by Daniel Moth
    Next week the GPU Technology Conference (GTC) 2010 takes place in San Jose, CA and I am lucky enough to be attending the entire week. It has been an extremely long time (in fact, I can't remember the last time) since I was registered as an attendee at a conference (full pass/access) without being a speaker *and* without having any booth duty! Having said that, we (our team at Microsoft) will be running GPU debugging UX studies throughout the entire week (similar to what I had previously advertised). If you are attending GTC 2010 and you are interested, look for the related flyer in your conference bag. The conference is an excellent opportunity to connect in person with various individuals that I have only met virtually. From an educational perspective there is a very long and interesting session list, with multiple concurrent slots, making it very hard to choose between them, but I have managed to create my (packed) schedule. I am most looking forward to sessions on programming languages and tools, both from Microsoft and MS partners. For full conference details, visit the GTC 2010 official page. Comments about this post welcome at the original blog.

    Read the article

  • Nvidia 740M still not working after Bumblebee installation

    - by Jon
    First of all, I have checked a lot of similar topics, but I still can't get my laptop to use the Nvidia 740M. So, first things first: I have an Asus X550V laptop (i5-3230, 4 GB RAM, Nvidia 740M + Intel HD4000). I installed Ubuntu 13.10 alongside Win8 (preinstalled) and both systems run without problems. However, I have a problem with the second graphics card (Nvidia 740M), as Ubuntu doesn't recognize it. I installed Bumblebee following this tutorial, https://wiki.ubuntu.com/Bumblebee#Installation, but I still get a "Cannot access secondary GPU" error when trying to run "optirun Steam" in a terminal. Then I tried to deal with this:

        [ERROR]Cannot access secondary GPU - error: [XORG] (EE) No devices detected.

    The suggested fix is to edit /etc/bumblebee/xorg.conf.nvidia (or /etc/bumblebee/xorg.conf.nouveau if using the nouveau driver) and specify the correct BusID by following the instructions therein. But with "lspci | grep VGA" I get only info about the Intel 4000 and no Nvidia. When I type only lspci, I do get the line for the Nvidia 740M, but after I edit the config file I still get the secondary GPU error. Also, in /etc/bumblebee/xorg.conf.nvidia there wasn't a BusID or anything similar, so I just added the whole line in the Device section. As I said, I tried a lot of things to get it working, avoiding this forum (as I didn't want to bother people while possible solutions remained), but alas, I had to bother you. If there is a need for some additional info, just say so, no problem at all. Thank you very much in advance. :)
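    For reference, a sketch of the kind of edit being described (hedged; the exact lspci text varies by machine): on Optimus laptops the Nvidia chip usually shows up as a "3D controller" rather than a "VGA compatible controller", which is why "lspci | grep VGA" only finds the Intel GPU. Take the bus address from plain lspci and write it into the Device section of /etc/bumblebee/xorg.conf.nvidia in X.org's PCI:bus:device:function form:

        $ lspci | grep -i nvidia
        01:00.0 3D controller: NVIDIA Corporation GK107M [GeForce GT 740M]   # example output

        # /etc/bumblebee/xorg.conf.nvidia (excerpt)
        Section "Device"
            Identifier  "DiscreteNvidia"
            Driver      "nvidia"
            BusID       "PCI:01:00:0"    # from the 01:00.0 shown by lspci
        EndSection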

    Read the article

  • Javascript Canvas Drawing Efficiency

    - by jujumbura
    I have just recently started some experiments with game development in JavaScript/HTML5, and so far it has been going pretty well. I have a simple test scene running with some basic input handling and a hundred-ish drawImage() calls with a few transforms. This all runs great on Chrome, but unfortunately it already chugs on Firefox. I am using a very large canvas (1920 x 1080), but it doesn't seem like I should be hitting my limit already. So on that note, I was hoping to ask a few questions:
    1) What exactly is done on the CPU vs. the GPU in terms of canvas and drawImage()? I'm afraid the answer is probably "it depends on the browser", but can anybody give me some rules of thumb? I naively imagined that each drawImage call results in a textured quad on the GPU, with the canvas effectively being a render target, but I'm wondering if I'm pretty far off base there...
    2) I have seen posts here and there with people saying not to use the translate(), rotate(), scale() functions when drawing on the canvas. Am I adding a lot of overhead just by adding a translate() call, as opposed to passing the x,y into drawImage()? Some people suggest using "translate3d", etc., which are CSS properties, but I'm not sure how to use them within a scene. Can they be used for animated sprites within a single canvas?
    3) I have also seen a lot of posts with people mentioning that pre-building canvases and then re-using them is a lot faster than issuing all the individual draw calls again. I am guessing that my background should definitely be pre-built into a canvas, but how far should I take this? Should I maintain an individual canvas for each sprite, to cache all static image data when not animating?
    Thank you much for your advice!

    Read the article

  • CUDA & MSI GT60 with Optimus enabled GTX670M?

    - by user1076693
    I have an MSI GT60 laptop with an Optimus-enabled GTX 670M GPU, and I have been trying to get CUDA going in an Ubuntu 12.04 environment. I realize that Optimus is not supported on Linux, but I have read the following post suggesting that CUDA works for hybrid GPUs: How can I get nVidia CUDA or OpenCL working on a laptop with nVidia discrete card/Intel Integrated Graphics? I installed the NVIDIA driver via:

        sudo add-apt-repository ppa:ubuntu-x-swat/x-updates
        sudo apt-get update
        sudo apt-get install nvidia-current

    The resulting driver version is 302.17, and supposedly the GTX 670M has been supported since 295.59. I also downloaded CUDA 4.2 from the NVIDIA site and compiled it against the nvidia-current libraries. Unfortunately, when I run deviceQuery from the CUDA SDK, I get the following output:

        cudaGetDeviceCount returned 38
        -> no CUDA-capable device is detected

    Checking /proc/driver/nvidia/gpus/0/information gives the following:

        Model:        GeForce GTX 670M
        IRQ:          16
        GPU UUID:     GPU-????????-????-????-????-????????????
        Video BIOS:   ??.??.??.??.??
        Bus Type:     PCI-E
        DMA Size:     32 bits
        DMA Mask:     0xffffffffff
        Bus Location: 0000:01.00.0

    Here is the output of "lspci | grep VGA":

        00:02.0 VGA compatible controller: Intel Corporation Ivy Bridge Graphics Controller (rev 09)
        01:00.0 VGA compatible controller: NVIDIA Corporation Device 1213 (rev ff)

    So... what am I doing wrong? Thanks!
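    A hedged suggestion rather than a confirmed diagnosis: on Optimus machines the discrete GPU is usually powered down until something switches it on (the "rev ff" in the lspci output above is often a sign of exactly that), and the usual workaround at the time of writing is to install Bumblebee and launch CUDA programs through optirun, which powers the card up first. The SDK path below is the CUDA 4.2 default and may differ on your install:

        sudo add-apt-repository ppa:bumblebee/stable
        sudo apt-get update && sudo apt-get install bumblebee bumblebee-nvidia

        # run the CUDA sample with the discrete GPU powered on
        optirun ~/NVIDIA_GPU_Computing_SDK/C/bin/linux/release/deviceQuery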

    Read the article

  • Pandaboard crash on startup or freeze after minutes

    - by Meach
    I just received my Pandaboard ES (rev B) and I am having trouble after installing ubuntu-omap4-addons. After copying the image ubuntu-12.04-preinstalled-desktop-armhf+omap4.img onto my SD card and booting the Pandaboard with it, I ran the following commands:

        sudo add-apt-repository ppa:tiomap-dev/release
        sudo apt-get update
        sudo apt-get dist-upgrade
        sudo apt-get install ubuntu-omap4-extras

    At the end of the installation of ubuntu-omap4-extras, Ubuntu tells me that a problem occurred when the console displays "ldconfig deferred processing now taking place". Clicking on "report the problem" tells me that the problem concerns pvr-omap4-dkms. I read somewhere that this can happen and that it is better to reinstall pvr-omap4-dkms, which I did by running:

        sudo apt-get install --reinstall pvr-omap4-dkms

    Then I reboot. The board sometimes has difficulty starting Ubuntu: it freezes during the loading page, and the only thing I can do is unplug the board to start it again. Other times Ubuntu loads successfully but then freezes at some random later point, in the range of 20-40 minutes. I searched the internet for a similar bug and found this: https://bugs.launchpad.net/ubuntu/+source/linux-ti-omap4/+bug/971091. So I typed this in:

        update-rc.d ondemand disable
        apt-get -y install cpufrequtils
        echo 'ENABLE="true" GOVERNOR="performance" MAX_SPEED="0" MIN_SPEED="0"' > /etc/default/cpufrequtils
        cpufreq-set -r -g performance
        reboot

    But it doesn't seem to fix the bug. Another detail: on startup, before the Ubuntu loading screen (when the two penguins are displayed :)), it shows this:

        [0.297271] CPU1: Unknown IPI message 0x1
        [0.308990] omap_hwmod: mcpdm: _wait_target_ready error: -16
        [0.354705] omap_mux_get_by_name: Could not find signal uart1_cts.uart1_cts
        [0.354766] omap_hwmod_mux_init: Could not allocate device mux entry
        [2.107086] thermal_get_slope:Getting slope is not supported for domain gpu
        [2.107116] thermal_get_offset:Getting offset is not supported for domain gpu
        [2.107299] stm_fw: vendor driver stm_ti1.0 registered
        [8.725555] OMAPRPC: Registration of OMAPRPC rpmsg service returned 0! debug=0

    Any idea what could be wrong? I am not that good with Ubuntu, so any help will be appreciated. Cheers! Meach

    Read the article

  • Java SE 8 (with JavaFX) Developer Preview Release for ARM

    - by Roger Brinkley
    In an effort to get ARM developers testing Java SE 8 before the scheduled release later this year, a Java SE 8 Developer Preview Release for ARM has been made available. This release has been tested on the Raspberry Pi but should work on other ARM platforms. In addition to the new Java SE features, this release provides specific support for hard float on the Raspberry Pi. The support for hard float has been anticipated by a number of developers. Additionally, this release includes support for an optimized JavaFX. Specific configurations of JDK 8 on ARM are defined below:
    - JavaFX is supported on ARM architecture v6/7 (hard float)
    - Supported platforms without JavaFX: ARM architecture v6/7 (hard float), ARM architecture v7 (VFP, little endian), ARM architecture v5 (soft float, little endian), Linux x86
    The download page includes setup instructions for a Raspberry Pi device as well as demos and samples. Developers are also encouraged to try their own applications and to share their stories via the JavaFX or Project Feedback Forums. If you've got a Raspberry Pi or another ARM device, it's time to get started with the Java SE 8 Developer Preview release.

    Read the article

  • OpenJDK In The News: AMD and Oracle to Collaborate in the OpenJDK Community [..]

    During the JavaOne™ 2012 Strategy Keynote, AMD (NYSE: AMD) announced its participation in OpenJDK™ Project “Sumatra” in collaboration with Oracle and other members of the OpenJDK community to help bring heterogeneous computing capabilities to Java™ for server and cloud environments. The OpenJDK Project “Sumatra” will explore how the Java Virtual Machine (JVM), as well as the Java language and APIs, might be enhanced to allow applications to take advantage of graphics processing unit (GPU) acceleration, either in discrete graphics cards or in high-performance graphics processor cores such as those found in AMD accelerated processing units (APUs).
    “Affirming our plans to contribute to the OpenJDK Project represents the next step towards bringing heterogeneous computing to millions of Java developers and can potentially lead to future developments of new hardware models, as well as server and cloud programming paradigms,” said Manju Hegde, corporate vice president, Heterogeneous Applications and Developer Solutions at AMD. “AMD has an established track record of collaboration with open-software development communities from OpenCL™ to the Heterogeneous System Architecture (HSA) Foundation, and with this initiative we will help further the development of graphics acceleration within the Java community.”
    “We expect our work with AMD and other OpenJDK participants in Project “Sumatra” will eventually help provide Java developers with the ability to quickly leverage GPU acceleration for better performance,” said Georges Saab, vice president, Software Development, Java Platform Group at Oracle. "We hope individuals and other organizations interested in this exciting development will follow AMD's lead by joining us in Project “Sumatra."
    Quotes taken from the first press release from AMD mentioning OpenJDK, titled "AMD and Oracle to Collaborate in the OpenJDK Community to Explore Heterogeneous Computing for Java".

    Read the article

  • Can T520+Bumblebee run an external monitor via DisplayPort?

    - by Fen
    Using 64-bit Ubuntu 11.10 and integrated (Intel) graphics, I can run the 1600x900 laptop display plus a 1600x1200 external monitor connected to the VGA adapter. But my external monitor is 1920x1200, so I have black stripes on each side. I believe the resolution is limited like this because the maximum resolution available from the Intel GPU is 2560x1600 = 4,096,000 pixels and I'm asking for a 3520x1200 = 4,224,000 display (with a 1600x300 strip lost above the laptop screen). At 3200x1200 = 3,840,000 pixels, the Intel GPU seems happy. Under Windows, the same limit exists when using the VGA adapter, but if I turn Optimus on then I can connect the external monitor to the DisplayPort and get its full resolution and an extended desktop. I've seen that Bumblebee can run apps on the DisplayPort using the 'optirun' command. My question is: can Bumblebee run the DisplayPort in concert with the Intel card running the laptop screen, creating a large virtual desktop (as on Windows)? If so, are there any pointers on how to do this? I tried once, failed, and dropped back to integrated graphics (and black stripes), as I could find no reports of this configuration working.

    Read the article

  • a flexible data structure for geometries

    - by AkiRoss
    What data structure would you use to represent meshes that are to be altered (e.g. adding or removing faces, vertices and edges), and that have to be "studied" in different ways (e.g. finding all the triangles intersecting a certain ray, or finding all the triangles "visible" from a given point in space)? I need to consider multiple aspects of the mesh: its geometry, its topology and spatial information. The meshes are rather big, say 500k triangles, so I am going to use the GPU when computations are heavy. I tried using arrays of vertices and arrays of indices, but I do not love adding and removing vertices from them. Also, using arrays totally ignores spatial and topological information, which I may need when studying the mesh. So I thought about using custom doubly-linked list data structures, but I believe doing so will require me to copy the data to array buffers before going to the GPU. I also thought about using a BST, but I'm not sure it fits. Any help is appreciated. If I have been too fuzzy and you require other information, feel free to ask.
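    One common shape for this (a minimal C++ sketch of the idea, not a drop-in answer): a half-edge structure keeps the topology explicit for editing and adjacency queries, while vertex positions stay in a flat array that can be copied to a GPU buffer as-is; ray and visibility queries would then go through a separate spatial index (BVH/octree) built over the faces.

        #include <cstdint>
        #include <vector>

        // Indices into the arrays below; kInvalid marks "no neighbour" (mesh boundary).
        using Index = std::int32_t;
        constexpr Index kInvalid = -1;

        struct Vertex {
            float position[3];   // flat and contiguous: easy to upload as a GPU vertex buffer
            Index halfEdge;      // one outgoing half-edge
        };

        struct HalfEdge {
            Index target;        // vertex this half-edge points to
            Index twin;          // opposite half-edge (kInvalid on a boundary)
            Index next;          // next half-edge around the same face
            Index face;          // face this half-edge borders
        };

        struct Face {
            Index halfEdge;      // one of the three half-edges of this triangle
        };

        struct Mesh {
            std::vector<Vertex>   vertices;
            std::vector<HalfEdge> halfEdges;
            std::vector<Face>     faces;

            // Collect the three corner vertices of triangle f; one-ring neighbourhoods,
            // adjacent faces and boundary tests are similar short walks over next/twin.
            void faceVertices(Index f, Index out[3]) const {
                Index he = faces[f].halfEdge;
                for (int i = 0; i < 3; ++i) {
                    out[i] = halfEdges[he].target;
                    he = halfEdges[he].next;
                }
            }
        };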

    Read the article

  • HDA NVidia (GT520) - Sound Issue

    - by Oliver Lucas
    I have a GT520 graphics card and I am trying to get the sound working with my XBMC setup, and I'm having trouble. Things I have done so far - aplay -l:

        List of PLAYBACK Hardware Devices
        card 0: NVidia [HDA NVidia], device 3: HDMI 0 [HDMI 0]
          Subdevices: 1/1
          Subdevice #0: subdevice #0

    Then lspci:

        01:00.1 Audio device: nVidia Corporation HDMI Audio stub (rev a1)

    And alsamixer, which is set to unmuted. Everything looks well, so I ran:

        aplay -D hw:0,3 /home/ollie/Music/alex.mp3
        Playing raw data '/home/ollie/Music/alex.mp3' : Unsigned 8 bit, Rate 8000 Hz, Mono
        aplay: set_params:1059: Sample format non available
        Available formats:
        - S16_LE
        - S32_LE

    with no luck. Then speaker-test:

        Playback device is default
        Stream parameters are 48000Hz, S16_LE, 1 channels
        Using 16 octaves of pink noise
        Playback open error: -2,No such file or directory

    I also tried working through ftp://download.nvidia.com/XFree86/gpu-hdmi-audio-document/gpu-hdmi-audio.html#upgrading_alsa_driver and http://wiki.xbmc.org/index.php?title=HOW-TO:Setup_audio_over_HDMI_on_nVidia_GeForce/nForce_controller, plus 20 other websites with selective "fixes" etc., but no luck. I am a complete beginner with Ubuntu, so this is a really steep learning curve for me; not sure I'm learning much though, as it's all just headaches atm! Thanks for any help. Ollie
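    Two small things may be worth ruling out here (hedged suggestions, not a confirmed fix): aplay does not decode MP3 at all, which is why it reports "Playing raw data ... Unsigned 8 bit, Rate 8000 Hz, Mono", and the raw hw:0,3 device only accepts the card's native formats, which is where "Sample format non available" comes from. Testing the HDMI path with a plain WAV file through the plughw plugin (which converts rate and format) separates the output problem from the decoding one:

        # send a test tone straight to the HDMI device, with automatic format conversion
        speaker-test -D plughw:0,3 -c 2 -t sine

        # or play a PCM WAV file that ships with ALSA on Ubuntu (aplay cannot decode MP3)
        aplay -D plughw:0,3 /usr/share/sounds/alsa/Front_Center.wav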

    Read the article

  • How to get Nvidia 555M working on Ubuntu 12.10 (drivers, optimus, cuda)?

    - by bluudz
    I've been trying for a few days to get my GPU working properly on an Alienware M14x with an NVidia 555M, but I have had no luck at all. After a fresh Ubuntu install I followed the guide here, NVIDIA Optimus and Ubuntu 12.10, and installed Bumblebee without problems. I tested glxspheres / optirun glxspheres, both working fine. Then I continued to install CUDA as described here: How can I get nVidia CUDA or OpenCL working on a laptop with nVidia discrete card/Intel Integrated Graphics? But I'm getting:

        Driver:  Not Selected
        Toolkit: Installation Failed. Using unsupported Compiler.
        Samples: Installation Failed. Missing required libraries.

    I did not select the driver as I thought Bumblebee had installed a driver already. How should I proceed? Also, at what point does the NVidia driver actually get installed, and how can I test that it's working? Bumblebee seems to install the driver, CUDA wants to do the same, and it's all a bit confusing really. Sorry if it's a lame question, but I would really like to at least get the graphics card and a second screen working. Thank you for any help.

    Read the article

  • How should VertexBuffers be used with Multiple Monitors in DirectX 9

    - by Joshua C
    I am currently using DirectX 9 on a machine with two GPUs and three monitors. I am trying to draw a triangle on each monitor using vertex buffers; a DirectX hello world with multiple monitors, if you will. I am familiar with some DirectX coding, but new to multiple-monitor DirectX coding. I may be going about this the wrong way, so please do correct me if I'm doing something wrong. I have created a Direct3D Device for each enumerated adapter sharing the same Form handle. This allows me to successfully use all three monitors in full-screen mode.

        For Each Adapter In Direct3D.Adapters
            Dim PresentParameters As New PresentParameters
            'Setup PresentParameters
            PresentParameters.Windowed = False
            PresentParameters.DeviceWindowHandle = MainForm.Handle
            Dim Device As New Device(Direct3D, Adapter.Adapter, DeviceType.Hardware, PresentParameters.DeviceWindowHandle, CreateFlags.HardwareVertexProcessing, PresentParameters)
            Device.SetRenderState(RenderState.Lighting, False)
            Devices.Add(Device)
        Next

    I can also draw text to each device successfully using a different Font for each Device. When I render a triangle using a different VertexBuffer for each Device, only two monitors display the triangle: one of the two monitors on the same GPU, and the monitor on its own GPU, display properly.

        VertexBuffer = New VertexBuffer(Device, 4 * Marshal.SizeOf(GetType(ColoredVertex)), Usage.WriteOnly, VertexFormat.None, Pool.Managed)
        Dim Verts = VertexBuffer.Lock(0, 0, LockFlags.None)
        Verts.WriteRange({
            New ColoredVertex(-.5, -.5, 1, ForeColor),
            New ColoredVertex(0, .5, 1, ForeColor),
            New ColoredVertex(.5, -.5, 1, ForeColor)
        })
        VertexBuffer.Unlock()
        VertexDeclaration = New VertexDeclaration(Device, {
            New VertexElement(0, 0, DeclarationType.Float3, DeclarationMethod.Default, DeclarationUsage.Position, 0),
            New VertexElement(0, 12, DeclarationType.Color, DeclarationMethod.Default, DeclarationUsage.Color, 0),
            VertexElement.VertexDeclarationEnd
        })

    Render code:

        Device.SetStreamSource(0, VertexBuffer, 0, Marshal.SizeOf(GetType(ColoredVertex)))
        Device.VertexDeclaration = VertexDeclaration
        Device.DrawPrimitives(PrimitiveType.TriangleList, 0, 1)

    I have to assume the fact that they share the same physical card comes into play. Should I use multiple buffers on the same card, and if so, how? Or how should I access the VertexBuffer across Devices? Another thought I had: the non-working monitor acts as if there are no lights. Is turning off lighting on each device on the same card causing issues somehow?

    Read the article

  • Silverlight 5 – What's New? (Including Screenshots & Code Snippets)

    - by mbcrump
    Silverlight 5 is coming next year (2011) and this blog post will tell you what you need to know before the beta ships.
    First, let me address people saying that it is dead after PDC 2010. I believe that it's best to see what the market is doing, not the vendor. Below is a list of companies that are developing Silverlight 4 applications, shown during the Silverlight Firestarter. Some of the companies have shipped and some haven't. It's just great to see the actual company names that are working on Silverlight instead of "people are developing for Silverlight".
    The next thing that I wanted to point out was that HTML5, WPF and Silverlight can co-exist. In case you missed Scott Guthrie's keynote, they actually had a slide with all three stacked together. This shows Microsoft will be heavily investing in each technology. Even I, a Silverlight developer, am reading Pro HTML5.
    Microsoft said that according to the Silverlight Feature Voting site, 21k votes were entered, and Microsoft has implemented about 70% of these votes in Silverlight 5. That is an amazing number, and I am crossing my fingers that Microsoft bundles Silverlight with Windows 8.
    Let's get started... what's new in Silverlight 5? I am going to show you some great applications and actual code shown during the Firestarter event.
    Media
    - Hardware video decode: instead of using the CPU to decode, we will offload it to the GPU. This will allow netbooks, etc. to play videos.
    - Trickplay: variable speed playback with pitch correction (if you speed up someone talking they won't sound like a chipmunk).
    - Power management: less battery used when playing video. Screensavers will no longer kick in while watching a video; if you pause a video, the screensaver will kick in.
    - Remote control support: this will allow users to control playback functions like Pause, Rewind and Fast-forward.
    - IIS Media Services 4 has shipped and now supports Azure.
    Data Binding
    - Layout transitions: with just a few lines of XAML you can create a really rich experience without using storyboards or animations.
    - RelativeSource FindAncestor: ancestor RelativeSource bindings make it much easier for a DataTemplate to bind to a property on a container control.
    - Custom markup extensions: markup extensions allow code to be run at XAML parse time for both properties and event handlers. This is great for MVVM support.
    - Changing styles at runtime by binding in style setters: changing styles at runtime used to be a real pain in Silverlight 4; now it's much easier. Binding in style setters allows bindings to reference other properties.
    - XAML debugging: below you can see that we set a breakpoint in XAML. This shows us exactly what is going on with our binding.
    WCF & RIA Services
    - WS-Trust support. Taken from Wikipedia: WS-Trust is a WS-* specification and OASIS standard that provides extensions to WS-Security, specifically dealing with the issuing, renewing, and validating of security tokens, as well as with ways to establish, assess the presence of, and broker trust relationships between participants in a secure message exchange.
    - You can reduce network latency by using a background thread for networking.
    - Supports Azure now.
    Text and Printing
    - Improved text clarity that enables better text rendering.
    - Multi-column text flow, character tracking and leading support, and full OpenType font support.
    - Includes a new Postscript Vector Printing API that provides control over what you print.
    - Pivot functionality baked into the Silverlight 5 SDK.
    Graphics
    - Immediate mode graphics support that will enable you to use the GPU and 3D graphics support. Take a look at what was shown in the demos below.
    - A 3D view of the Earth – not really a real-world application though.
    - A doctor's portal. This demo really stood out for me, as it shows what we can do with the 3D / GPU support.
    Out of Browser
    - OOB applications can now create and manage child windows, as shown in the screenshot below.
    - Trusted OOB applications can use P/Invoke to call Win32 APIs and unmanaged libraries.
    - Enterprise Group Policy support allows enterprises to lock down or open up the sandbox capabilities of Silverlight 5 applications.
    - In this demo, he tore the "notes" off of the application and it appeared in a new window. See the black arrow below.
    - In this demo, he connected a USB device, which fired off a local Win32 application that provided the data off the USB stick to Silverlight.
    - Another demo showed a Silverlight 5 application exporting data right into Excel while running inside the browser.
    Testing
    - They demoed Coded UI, which is available now in the Visual Studio Feature Pack 2. This will allow you to create automated tests without writing any code manually.
    Performance
    - Microsoft has worked to improve the Silverlight startup time.
    - Silverlight 5 provides 64-bit browser support.
    - Silverlight 5 also provides IE9 hardware acceleration.
    I am looking forward to Silverlight 5 and I hope you are too. Thanks for reading and I hope you visit again soon.

    Read the article
