Search Results

Search found 13534 results on 542 pages for 'gpu programming'.


  • Loud fans despite cool system under Linux (but not Windows)

    - by Sman789
    My new desktop computer runs almost silently under Windows, but the fans seem to run on a constantly high setting under Linux. Psensor shows that the GPU (with NVidia drivers) is thirty-something degrees and the CPU is about the same, so it's not just down to Linux somehow being more processor-intensive. I've read that the BIOS controls the fans under Linux, which makes sense given the high fan speeds when in BIOS as well. It's under Windows, when the ASUS AI Suite 3 software seems to take control, that the system runs more quietly and only speeds the fans up when required. So is there a Linux app which offers a similar dynamic control of the fans, or a setting hidden somewhere in the ASUS BIOS which allows the same but regardless of the OS? EDIT - I've tried using lm-sensors and fancontrol, but pwmconfig tells me "There are no pwm-capable sensor modules installed". This is after the sensors-detect command does find an 'Intel digital thermal sensor', and despite the sensors working fine in apps like psensor. Help getting this to work would likely solve the problem.

    Read the article

  • Firefox 4: beta 12 released, with improved Flash support and hardware acceleration

    Firefox 4: beta 12 released. Improved Flash support and hardware acceleration. Update from 28/02/11. The twelfth, and apparently final, beta of Firefox 4 was released this weekend. It fixes 7,000 bugs and improves (Flash) video playback. The integration of hardware acceleration (assigning specific computation tasks to the GPU rather than the CPU) has also been reworked, all of which makes the browser more stable. Unfortunately, it does not yet include the "miracle" patches that cut its startup time in half (see also...

    Read the article

  • The Moonlight 4 beta gets closer to Silverlight 4: the open-source implementation adds hardware acceleration and H.264 support

    The Moonlight 4 beta gets closer to Silverlight 4. The open-source implementation now offers hardware acceleration and H.264 support. Moonlight 4 has just been released in beta. The open-source implementation of Silverlight now provides hardware acceleration (GPU handling of video and 3D) as well as support for the H.264 codec. With this development version, Moonlight incorporates several new Silverlight 4 features, notably support for the Silverlight 3 and 4 APIs. It also makes it possible to build and run applications "out of the browser". However, this beta does not yet offer all of the fea...

    Read the article

  • files power_profile and power_method missing on ubuntu 12.04 after clean install

    - by Nikola
    OK, here is the problem. I am using GNOME Shell on Ubuntu 12.04 with kernel 3.2.0-32-generic-pae and the proprietary drivers for my ATI card (installed via "Additional Drivers"). The laptop is an HP ProBook 4310s, and I want to control power_profile and power_method because my GPU temperature is high. Before I reinstalled Ubuntu 12.04, I used a .sh script on startup to write to those files and everything worked like a charm, but now they are missing and I can't create them. This is what I get when I try to create the directories: mkdir: cannot create directory `/sys/class/drm': No such file or directory. How can I get them back? If you need more information, just ask and I will provide it.

    Read the article

  • downgrade ppa packages to versions available at a previous point in time

    - by Will
    The backstory is that the normal Intel GPU drivers don't support the various OpenGL extensions that my hobby coding and some games want, so I have to install xorg-edgers and then it's happy. However, last Wednesday or so there was an update to xorg-edgers - lots of packages - and it broke badly; the drivers lock up and take the whole computer with them, requiring a hard reset. So how can you downgrade - that is, select package versions in a PPA that represent a point in the past, ignoring versions newer than that?

    Read the article

  • How can I generate signed distance fields (2D) in real time, fast?

    - by heishe
    In a previous question, it was suggested that signed distance fields can be precomputed, loaded at runtime and then used from there. For reasons I will explain at the end of this question (for people interested), I need to create the distance fields in real time. There are some papers out there on different methods which are supposed to be viable in real-time environments, such as Chamfer distance transforms and Voronoi-diagram-approximation based transforms (as suggested in this presentation by the Pixeljunk Shooter dev), but I (and, presumably, a lot of other people) have a very hard time actually putting them to use, since they're usually long, heavy on math and not very algorithmic in their explanation. What algorithm would you suggest for creating the distance fields in real time (preferably on the GPU), especially considering the resulting quality of the distance fields? Since I'm looking for an actual explanation/tutorial as opposed to a link to just another paper or slide deck, this question will receive a bounty once it's eligible for one :-). Here's why I need to do it in real time: There's something else:

    Read the article
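
    Not from the article; just a minimal CPU-side sketch of one technique the question names, a two-pass 3-4 Chamfer distance transform combined into a signed field. The grid size, scaling and function names are illustrative assumptions; a real-time GPU version would more likely use jump flooding or separable passes, but this shows the core algorithm.

        #include <algorithm>
        #include <cstdio>
        #include <vector>

        // Two-pass 3-4 Chamfer distance transform: for every cell, the approximate
        // distance (scaled by 3) to the nearest "feature" cell.
        static std::vector<int> chamfer(const std::vector<int>& feature, int w, int h)
        {
            const int INF = 1 << 28;
            std::vector<int> d(w * h);
            for (int i = 0; i < w * h; ++i) d[i] = feature[i] ? 0 : INF;
            auto at = [&](int x, int y) -> int& { return d[y * w + x]; };

            // Forward pass: propagate distances from the top-left neighbours.
            for (int y = 0; y < h; ++y)
                for (int x = 0; x < w; ++x) {
                    if (x > 0)              at(x, y) = std::min(at(x, y), at(x - 1, y) + 3);
                    if (y > 0)              at(x, y) = std::min(at(x, y), at(x, y - 1) + 3);
                    if (x > 0 && y > 0)     at(x, y) = std::min(at(x, y), at(x - 1, y - 1) + 4);
                    if (x < w - 1 && y > 0) at(x, y) = std::min(at(x, y), at(x + 1, y - 1) + 4);
                }
            // Backward pass: propagate distances from the bottom-right neighbours.
            for (int y = h - 1; y >= 0; --y)
                for (int x = w - 1; x >= 0; --x) {
                    if (x < w - 1)              at(x, y) = std::min(at(x, y), at(x + 1, y) + 3);
                    if (y < h - 1)              at(x, y) = std::min(at(x, y), at(x, y + 1) + 3);
                    if (x < w - 1 && y < h - 1) at(x, y) = std::min(at(x, y), at(x + 1, y + 1) + 4);
                    if (x > 0 && y < h - 1)     at(x, y) = std::min(at(x, y), at(x - 1, y + 1) + 4);
                }
            return d;
        }

        // Signed distance: positive outside the shape, negative inside.
        std::vector<float> signedDistanceField(const std::vector<int>& inside, int w, int h)
        {
            std::vector<int> outside(w * h);
            for (int i = 0; i < w * h; ++i) outside[i] = inside[i] ? 0 : 1;
            std::vector<int> distToInside  = chamfer(inside,  w, h);
            std::vector<int> distToOutside = chamfer(outside, w, h);
            std::vector<float> sdf(w * h);
            for (int i = 0; i < w * h; ++i)
                sdf[i] = (distToInside[i] - distToOutside[i]) / 3.0f;  // 3 = one unit step
            return sdf;
        }

        int main()
        {
            const int w = 8, h = 8;
            std::vector<int> inside(w * h, 0);
            for (int y = 2; y < 6; ++y)                 // a solid 4x4 square
                for (int x = 2; x < 6; ++x) inside[y * w + x] = 1;
            std::vector<float> sdf = signedDistanceField(inside, w, h);
            for (int y = 0; y < h; ++y) {
                for (int x = 0; x < w; ++x) std::printf("%5.1f ", sdf[y * w + x]);
                std::printf("\n");
            }
            return 0;
        }

    Each pass is a simple neighbourhood sweep, which is why this family of transforms is considered a reasonable starting point for a GPU port.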

  • OpenGL Vertex Attributes - Normalisation

    - by Daniel
    Alas, I have searched and have found no definitive answer. When would you normalize the vertex data in OpenGL using the following command: glVertexAttribPointer(index, size, type, normalize, stride, pointer); i.e., when would normalize == GL_TRUE? In what situations, and why, would you choose to let the GPU do the conversion instead of preprocessing it? All examples I have ever seen have this set to GL_FALSE, and I cannot personally see a use for it. But Khronos aren't stupid, so it must be there for something useful (and probably common).

    Read the article
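
    Not from the article; a minimal sketch (assuming an OpenGL 3.x context with a VAO and VBO already bound, and GLEW for loading) of the most common case for normalize == GL_TRUE: storing per-vertex colors as unsigned bytes and letting the GPU expand them to floats in [0, 1] when the attribute is fetched. The attribute locations and struct layout are illustrative.

        #include <GL/glew.h>
        #include <cstddef>
        #include <cstdint>

        // Compact vertex: position as floats, color as 4 unsigned bytes (0..255).
        struct Vertex {
            float   position[3];
            uint8_t color[4];   // RGBA, one byte per channel
        };

        void setupVertexAttributes()
        {
            // Attribute 0: position. Already floating point, so no normalization.
            glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, sizeof(Vertex),
                                  reinterpret_cast<void*>(offsetof(Vertex, position)));
            glEnableVertexAttribArray(0);

            // Attribute 1: color. The bytes 0..255 are integers, but the shader sees
            // the attribute as a vec4. With normalize == GL_TRUE the GPU converts each
            // byte to a float in [0, 1] as it fetches the vertex, so the buffer stays
            // four times smaller than storing four floats per color.
            glVertexAttribPointer(1, 4, GL_UNSIGNED_BYTE, GL_TRUE, sizeof(Vertex),
                                  reinterpret_cast<void*>(offsetof(Vertex, color)));
            glEnableVertexAttribArray(1);
        }

    The same flag is useful for packed normals (for example GL_SHORT data), where normalized signed values are mapped to [-1, 1].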

  • internal error message pops up each time the system is rebooted

    - by Biju
    I installed Ubuntu 12.04 using Wubi, but each time I boot the system an internal error message pops up, as shown below: Executable path: /usr/share/apport/apport-gpu-error-intel.py; Package: xserver-xorg-video-intel 2:2.17.0-1ubuntu4; Problem type: crash; ApportVersion: 2.0.1-0ubuntu7; and so on. I had earlier upgraded to Ubuntu 12.04 from Ubuntu 11.10 and encountered the same issue, so I uninstalled the OS and reinstalled using Wubi. I posted the same query on ubuntu.com/support (Question Number: 195525) but couldn't find a solution. I am using a Dell Inspiron with an Intel Pentium. I need your help in resolving this issue. Thank you, Biju

    Read the article

  • How to solve dual monitor issue, which happens only during X start?

    - by tamashumi
    When X is loading and two monitors are connected, instead of a login screen I see this: ... after clicking OK, a selection appears. Then I go to a console login, disconnect the secondary monitor cable by hand, and restart lightdm with the command sudo service lightdm restart ... and voilà! The system loads fine. If I disconnect the cable before boot, X loads fine too. It's not a nice 'feature' that I have to disconnect the cable on every boot or X restart. I tried deleting monitors.xml but it didn't help. The situation relates to my notebook with an integrated Intel GPU, and the same happens with two different pairs of monitors: at the office and at home. How can I fix this? Ubuntu 12.04 x64 desktop with the default Unity GUI.

    Read the article

  • What is the ideal laptop for creative coding applications?

    - by Jason
    Hi, I am a creative coder using C++ (Cinder and openFrameworks). I am looking to upgrade from my MacBook, which slowed down to about 3 fps this morning. My project involves particle systems and fluids reacting to audio-analysis data and computer-vision data in real time. SD or HD? No biggie. I have asked many people what computer I need. Ideally, I want a MacBook Pro, but is that enough power? I've been told that I need a desktop for what I am doing, though I'd rather stay portable. I've been told that I should go PC/Linux to get the most power, but I'd rather stay on a Mac. I've been told that RAM is more of a bottleneck than processor speed. I've been told that the graphics card is more important than the CPU, and that code optimizations such as using trees over lists, proper threading, and sending tasks to the GPU make a bigger difference than the hardware! What's true? What do I need? Any suggestions are greatly appreciated.

    Read the article

  • 12.10 live dvd no video input

    - by mark kirby
    Hi, I have been trying to install Ubuntu 12.10, but as soon as it gets past my BIOS and to the screen with the blinking cursor in the top left, I get a "no video input" message on my TV (like when you turn the TV on with nothing connected). I have used live DVDs of the betas, alphas and daily builds, all with exactly the same results. Has anyone else had this? Is there a fix? Does this mean I can never upgrade my Ubuntu again? (12.04 works; I've been using it since beta.) My PC, while old, should run this fine: CPU = 2x Intel P4 HT @ 3 GHz, GPU = Nvidia GeForce 310 via HDMI, RAM = 2 GB DDR2, HDD = 2 x 7200 rpm SATA. Please help; I use Ubuntu exclusively on my PC and would like to keep doing so.

    Read the article

  • How do I set the correct monitor resolution with Nvidia drivers for a monitor that does not send EDID?

    - by Torben Gundtofte-Bruun
    I keep having trouble getting the correct monitor resolution - every time I reinstall, I happen to use a newer Ubuntu release and the old tricks I used to know no longer work. Instead of leaving a long trail of questions for every new release, I am looking for a more universal and timeless solution. What's the correct way to set the correct monitor resolution with an Nvidia GPU for a screen that does not send EDID values? Note: This is a "dummy" question -- with the help from the chat, I already found the answer, and I am now going to add my own answer to document a solution that is hopefully universal.

    Read the article

  • What are the factors that determine the default frequency of a shader call?

    - by user827992
    After playing for a few days with various vertex and fragment shaders, it seems clear to me that these programs are executed by the GPU on each and every rendering cycle. The problem is that I can't really quantify this frequency, and I can't tell whether it is based on some default value or not, because I don't have a big collection of hardware right now to do extensive tests. For all I know the answer could be really trivial, like "it's the same as the refresh rate of your monitor", but I would like a good answer to be clear on this. For instance, it looks really odd to me that all the FPS-limiting techniques I have seen so far call the GLUT function glutGet(GLUT_ELAPSED_TIME) to retrieve the elapsed time in milliseconds since rendering started, and then rely on the CPU to do the math. Why can't I set an FPS value in OpenGL, if OpenGL clearly has a counter and a timer/clock? P.S. I'm referring to OpenGL 3.0+.

    Read the article
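
    Not from the article; a rough C++/GLUT sketch of the situation the question describes, under the assumption that the shaders themselves have no frame rate: they run once per vertex or fragment of every draw call submitted, and the frame frequency is whatever the CPU-side loop (plus vsync, if enabled) makes it. That is why the FPS cap below lives on the CPU and uses glutGet(GLUT_ELAPSED_TIME); the target period is an illustrative value.

        #include <GL/glut.h>

        static const int kFramePeriodMs = 16;   // ~60 FPS target (illustrative)
        static int lastFrameMs = 0;

        void display()
        {
            glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
            // ... draw calls go here; the vertex/fragment shaders run once per
            // vertex/fragment of each draw call, however often we choose to draw ...
            glutSwapBuffers();
        }

        void idle()
        {
            // The GPU has no built-in notion of frames per second: it just executes
            // the commands it is given. The frame rate is set by how often the CPU
            // side decides to render, so the cap is implemented here.
            int now = glutGet(GLUT_ELAPSED_TIME);   // milliseconds since glutInit
            if (now - lastFrameMs >= kFramePeriodMs) {
                lastFrameMs = now;
                glutPostRedisplay();                // request one more frame
            }
        }

        int main(int argc, char** argv)
        {
            glutInit(&argc, argv);
            glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
            glutCreateWindow("fps-capped loop (sketch)");
            glutDisplayFunc(display);
            glutIdleFunc(idle);
            glutMainLoop();
            return 0;
        }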

  • Avoiding lag when rendering a Texture2D for the first time

    - by Emir Lima
    I have found a similar question here, but it is about playing sounds. I am using 2048 x 2048 textures for sprite sheets, and every time I call spriteBatch.Draw using a sheet for the first time in the game's execution it causes considerable lag. The lag doesn't appear the next times. Has anyone faced this problem before? What can I do to overcome it? Update: I inserted code at the end of the content-loading routine that draws EVERY Texture2D loaded into the ContentManager before proceeding to the game screen. This works well: no lag occurs when different textures are rendered over time, EXCEPT if IsFullScreen is changed. Apparently, changing this property discards the textures loaded on the GPU. Is that correct?

    Read the article

  • How can I generate signed distance fields in real time, fast?

    - by heishe
    In a previous question, it was suggested that signed distance fields can be precomputed, loaded at runtime and then used from there. For reasons I will explain at the end of this question (for people interested), I need to create the distance fields in real time. There are some papers out there on different methods which are supposed to be viable in real-time environments, such as Chamfer distance transforms and Voronoi-diagram-approximation based transforms (as suggested in this presentation by the Pixeljunk Shooter dev), but I (and, presumably, a lot of other people) have a very hard time actually putting them to use, since they're usually long, heavy on math and not very algorithmic in their explanation. What algorithm would you suggest for creating the distance fields in real time (preferably on the GPU), especially considering the resulting quality of the distance fields? Since I'm looking for an actual explanation/tutorial as opposed to a link to just another paper or slide deck, this question will receive a bounty once it's eligible for one :-). Here's why I need to do it in real time:

    Read the article

  • What is a good method for coloring textures based on a palette in XNA?

    - by Bob
    I've been trying to work on a game with the look of an 8-bit game using XNA, specifically using the NES as a guide. The NES has a very specific palette and each sprite can use up to 4 colors from that palette. How could I emulate this? The current way I accomplish this is I have a texture with defined values which act as indexes to an array of colors I pass to the GPU. I imagine there must be a better way than this, but maybe this is the best way? I don't want to simply make sure I draw every sprite with the right colors because I want to be able to dynamically alter the palette. I'd also prefer not to alter the texture directly using the CPU.

    Read the article
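
    Not from the article; a hedged illustration of the index-plus-palette data layout the question describes, written as plain C++ for clarity (in XNA the same lookup would normally happen in a pixel shader, with the palette passed as shader constants or a small lookup texture so it can be swapped at runtime without touching the index texture on the CPU). The 4-entry palette, sprite data and names here are made up for the example.

        #include <array>
        #include <cstddef>
        #include <cstdint>
        #include <vector>

        // A packed RGBA color.
        struct Color { uint8_t r, g, b, a; };

        // Each sprite stores palette *indices* (0..3) instead of final colors, so the
        // same index data can be recolored at any time just by swapping the palette.
        std::vector<Color> resolvePalette(const std::vector<uint8_t>& indices,
                                          const std::array<Color, 4>& palette)
        {
            std::vector<Color> out(indices.size());
            for (std::size_t i = 0; i < indices.size(); ++i)
                out[i] = palette[indices[i] & 3];   // clamp to the 4 allowed entries
            return out;
        }

        int main()
        {
            // A tiny 2x2 "sprite" of indices and an NES-style 4-color sub-palette.
            std::vector<uint8_t> sprite = { 0, 1, 2, 3 };
            std::array<Color, 4> palette = {{ { 0,   0,   0,   0   },    // transparent
                                              { 252, 252, 252, 255 },    // white
                                              { 200, 76,  12,  255 },    // orange
                                              { 0,   0,   168, 255 } }}; // blue

            std::vector<Color> pixels = resolvePalette(sprite, palette);
            (void)pixels;   // on the GPU this lookup would happen per-pixel instead
            return 0;
        }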

  • Lexmark X7170 shows documents as printed when they haven't

    - by Mehmet
    I made the move from Windows 7 to Ubuntu (not dual-booting) because I have decided to quit gaming to spare more time for my studies; I just needed an OS for browsing the web, word processing and so on. After I installed Ubuntu I installed the AMD GPU drivers, then clicked on the little printer icon, selected "add printer", and it found the drivers for the Lexmark 7000 series, which I installed. Now my problem is that when I print something from Writer, it processes the job and thinks it has completed it, when in fact nothing has been printed. I tried printing a test page but it was stuck on "processing" for 5 minutes. I have restarted my computer and turned the printer on and off. I'm running 64-bit, if that changes anything.

    Read the article

  • How to start embedded development for a handheld game console?

    - by Quakeboy
    I work as an iPhone app developer now, so I know a bit of C, C++ and Objective-C, and I have also fiddled with Java and many others. All of it has been high-level application/game development. My final goal is to make a handheld game console, more like a home-made NES/SNES handheld console or even an Atari. I have found out about the Raspberry Pi and Arduino, but I need more information about how to approach this. 1) How do I learn to pick the best board/CPU/controller/GPU/LCD screen/LCD controller, etc.? 2) Will learning to make an NES emulator first help me understand this field? If so, are there any tutorials?

    Read the article

  • Flash 10.1 is here: hardware acceleration and 32 security holes patched

    Update from 11/06/10. Flash 10.1: hardware acceleration and 32 security holes patched. Flash 10.1 is here. This new version of Flash brings hardware acceleration and fixes for 32 security vulnerabilities. The first of these should silence, at least in part, the criticism of the performance of Adobe's technology. Hardware acceleration plays back (H.264) video using the resources of the graphics card (GPU) rather than the CPU. The result: faster, smoother playback and a processor less burdened by the player. All of that is on paper, though. ...

    Read the article

  • DirectCompute information

    - by N0xus
    I've been trying to make use of the GPU as part of a project of mine. I've looked into both CUDA and OpenCL, but the lack of information showing you how to introduce these into a project is shocking; even their dedicated forum groups are dead. So now I'm looking into DirectCompute. From what I can tell, it's simply a new type of shader file that makes use of HLSL. My question is this: does my program (aside from being DirectX 10/11) need its structure changed? I mean, is it simply a case of creating the CS file, setting it up in the project like I would any other shader, and watching the magic happen? Any information on this would be appreciated.

    Read the article

  • High temperature on my laptop with Radeon Mobility HD4670

    - by Lorthirk
    Like almost everyone here these days, I guess, I downloaded Quantal Quetzal to give it a try. However, I noticed that my laptop runs fairly hot, with the cooling fans almost always on, even when sitting at the desktop doing nothing. I downloaded XSensor to read the temperature sensors and saw that while the CPU stays at about 65°C, which is fairly normal I guess, the GPU sits at 75°C. In comparison, my existing Windows 7 installation, which dual-boots with Quantal, stays at 59°C (CPU) and 65°C (GPU). So I went reading and learned that AMD dropped support for my video card from the fglrx package, and that fglrx-legacy won't support Xorg 1.13, so I'm basically stuck with the open-source drivers. So I was wondering whether there is anything I can try, and whether it's possible that the open-source drivers are the cause of the high temperature?

    Read the article

  • Chrome 18: 3D for everyone, and improved Canvas2D acceleration

    Chrome 18: 3D for everyone, and improved Canvas2D support. Chrome 18 has just gone stable. On the menu: improved Canvas2D support that takes advantage of hardware acceleration (and therefore of the GPU). It should allow web applications, such as games, to run faster. For Google, with this support, 100%-web versions of applications could even be as fast as their traditional counterparts. Hardware acceleration for Canvas2D had until now been reserved for Chrome's beta channel, so the feature may still have a few small glitches.

    Read the article

  • Bumblebee optirun appears to depend on Intel

    - by user206398
    I have a Lenovo T420 with Intel and Nvidia graphics. On upgrading to Ubuntu Saucy, I had to purge and reinstall bumblebee-nvidia to get beyond optirun failing to find a GPU driver. Now, "optirun glxgears" and "optirun sol" succeed, but optirun fails on two virtual-life viewers that it supported in the past, Cool VL (CoolVLViewer-1.26.8.34-Linux-x86) and Imprudence (Imprudence 1.4.0 beta2). In both cases the error output is huge, but it starts with libGL error: failed to load driver: i965 and libGL error: failed to load driver: swrast. From the little I can discover, i965 is an Intel graphics driver, which should not be invoked at all; I haven't found any information about swrast. I suspect that some of the X configuration associated with Bumblebee has an Intel dependence that is invoked on certain library calls but not others, but I haven't found any definite information along these lines. The Cool VL Viewer runs without optirun, but complains about the insufficiency of the Intel graphics.

    Read the article

  • Black frame around screen after HDMI connection failure

    - by Wolter Hellmund
    I was trying to watch a movie from my computer on the TV, so I connected both with an HDMI cable. I was unable to get a successful setup (the colors were all weird on the TV and the screen size was incorrect). I tried many resolutions using the nvidia-settings application, and somehow my screen got framed by a black border which I have since been unable to remove, even after restarting the computer and disconnecting the HDMI cable. I am using Ubuntu 11.10 amd64, my GPU is an Nvidia GeForce 8600M GT, and I am using the proprietary driver version 280. The problem is due to some setting in my account only: I logged in to the guest session and the resolution is correct there. Also, my desktop "thinks" the resolution is right (i.e. 1280x800), but it must be rendering at a different scale, because part of the pixel area is occupied by the black frame.

    Read the article

  • How to configure screens in console or create screen configuration profiles?

    - by uncle Lem
    I have two monitors and an integrated GPU (Intel® HD Graphics 4600). It works fine for work or movies, but if I launch games in fullscreen mode, I get artifacts, glitches and so on. Temporarily disabling the second monitor solves this problem, but then I have to re-enable it and set its properties manually (by default, the additional screen attaches its top-left corner to the main monitor's top-right corner, but I need the bottom-left and bottom-right corners aligned). So I need some kind of automation here. The best option would be a tool to create and swap between some kind of configuration profiles; alternatively, console commands that I can put into script files would be fine too. (Ubuntu 13.04, if it matters.)

    Read the article
