Search Results

Search found 13534 results on 542 pages for 'gpu programming'.

Page 2 of 542

  • Ubuntu 12.04 and KDE, graphical temperature monitor for GPU

    - by Frank
    I have Ubuntu 12.04 and KDE. I have already installed "lm-sensors", "hardware sensors indicator" and "Psensors", but none of them works well for me. I also know a couple of commands to find the GPU temperature in a terminal, but that is not what I need; I need a graphical application that works with KDE. Do you know of anything that would help? My card is an MSI R6870 Hawk (based on the original ATI HD 6870) and I am using the fglrx drivers. Thanks.

  • Declarative programming vs. Imperative programming

    - by EpsilonVector
    I feel very comfortable with imperative programming. I never have trouble expressing algorithmically what I want the computer to do once I have figured out what it is that I want it to do. But when it comes to languages like SQL or relational algebra, I often get stuck because my head is too used to imperative programming. For example, suppose you have the relations band(bandName, bandCountry), venue(venueName, venueCountry), plays(bandName, venueName), and I want to write a query that says: all venueNames such that for every bandCountry there is a band from that country that plays in the venue of that name. In my mind I immediately go: "for each venueName, iterate over all the bandCountries, and for each bandCountry get the list of bands that come from it. If none of them play in venueName, go to the next venueName. Else, at the end of the bandCountries iteration, add venueName to the set of good venueNames." ...but you can't talk like that in SQL, and I actually have to think about how to formulate this, with the intuitive imperative solution constantly nagging in the back of my head. Has anybody else had this problem? How did you overcome it? Did you manage a paradigm shift? Did you make a map from imperative concepts to SQL concepts to translate imperative solutions into declarative ones? Read a good book? PS: I'm not looking for a solution to the above query; I did solve it.
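
    To make the contrast concrete, here is a minimal sketch, written in Haskell rather than SQL so it does not give away the SQL answer, of how the "for every bandCountry there exists a band" condition reads when stated declaratively instead of as nested loops. The list-of-tuples representation and the empty placeholder data are assumptions made purely for illustration.

        -- Relations modelled as plain lists of tuples (an assumption for this sketch).
        bands :: [(String, String)]    -- (bandName, bandCountry)
        bands = []
        venues :: [(String, String)]   -- (venueName, venueCountry)
        venues = []
        plays :: [(String, String)]    -- (bandName, venueName)
        plays = []

        -- "All venueNames such that for every bandCountry there is a band
        -- from that country that plays in that venue", stated as one
        -- predicate rather than as an explicit iteration.
        goodVenues :: [String]
        goodVenues =
          [ v | (v, _) <- venues
              , all (\c -> any (\(b, c') -> c' == c && (b, v) `elem` plays) bands)
                    [ c | (_, c) <- bands ] ]

        main :: IO ()
        main = print goodVenues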

  • On a dual-GPU laptop, is using the discrete GPU ever more power efficient?

    - by Mahmoud Al-Qudsi
    Given a laptop with a dual integrated/discrete GPU configuration, is it ever more power efficient to use the discrete GPU instead of the integrated one? Obviously, when writing an email or working on a spreadsheet, the integrated GPU will always use less power. But let's say you're doing something graphics-medium, but not graphics-intensive/heavy: is there a point where it actually makes sense to fire up the discrete GPU, not for performance but for power-saving reasons? Off the top of my head, I can think of a scenario where the discrete GPU supports hardware decoding of a particular video codec; I'd imagine there is a "price point" where using the GPU saves more energy than decoding it fully in software would. But I think most GPUs, integrated or discrete, pretty much decode just plain-Jane H.264. Maybe there is something more complicated, though; perhaps if you're doing something like desktop/windowing animations or a Flash animation on a website (not an embedded Flash video), the discrete GPU will use enough less power to make up for switching to it? I guess this question can be summed up as: can you say beyond doubt that, if you don't care about performance on a laptop with two GPUs, you should always use the integrated GPU for maximum battery life?

  • Which programming language should I learn? [on hold]

    - by Ashkan
    I'm Ashkan and I'm from Iran. I started programming when I was 13 and I have learned a lot since then, but now I'm totally lost. Since I live in Iran, there are no counselors or professionals out there to help me, so I decided to ask here. I started with Visual Basic, and after a year I started to learn HTML, CSS, JavaScript and jQuery. For the past 6 months I've been learning PHP, and I have a basic understanding of OOP. I want to move to America to continue my studies, and I was wondering which programming language would help me the most to get there. Should I learn C++ or Java, or should I study computer science and math? Also, since we are not in a good place financially, I want a programming language that will help me in college and let me make some money. Thanks in advance, and sorry for my poor English skills.

  • Android programming vs. iPhone programming?

    - by geena
    Hi, I am doing my final project and thinking of a mobile app to develop, but I am new to the mobile OS world and don't know which is good for me to go with. I mean, in the long term, which will be more beneficial to me, Android or iPhone programming, both in general and for my final project? Thanks for all of your suggestions. Well, I am, if not so bright, then pretty good at Java and C++. Although Objective-C is a little different from standard C/C++, I think I can cope with it. Owning a Mac or running Snow Leopard in VMware is not going to make much difference in iOS development... or is it? Actually, as this is the final project for my BS degree, I am wondering whether it is worth taking on (an iPhone or Android app) or whether it is better to stick with web/desktop development, and what this means that I have to be a

  • Better Programming By Programming Better?

    - by ahmed
    I am not convinced by the idea that developers are either born with it or they are not. Where's the empirical evidence to support these types of claims? Can a programmer move from, say, the 50th to the 90th percentile? Most developers are not in the 99th or even the 90th percentile (by definition), and thus still have room for improvement in programming ability, along with the other important skills. The belief in innate talent is "lacking in hard evidence to substantiate it" as well. So how do I reconcile these seemingly contradictory statements? I think the lesson for software developers who wish to stay on top of their game and become experts is to keep exercising the mind via effortful studying. I read a lot of technical books, but many of them aren't making me better as a developer.

  • How to deal with cargo-cult programming attitude?

    - by Aivar
    I have some students (in an introductory programming course) who see a programming language as a set of magic spells which must be cast in order to achieve some effect (instead of seeing it as a flexible medium for expressing their idea of a solution). They tend to copy and paste code from previous, similar-sounding assignments without considering the essence of the problem. Can anyone recommend some exercises or analogies to make those students more confident that they can and should understand the structure and meaning of each piece of code they write?

  • Stress testing an OpenCL/ATI GPU on 12.04

    - by lurscher
    What do people normally use to stress test their GPU on Ubuntu 12.04? I tried installing the Phoronix Test Suite Ubuntu .deb, but it tries to install freeglut3-dev and at the same time complains about it:

        $ phoronix-test-suite benchmark pts/opencl
        The following dependencies are needed and will be installed: freeglut3-dev
        This process may take several minutes.
        Reading package lists...
        Building dependency tree...
        Reading state information...
        freeglut3-dev is already the newest version.
        0 upgraded, 0 newly installed, 0 to remove and 90 not upgraded.
        There are dependencies still missing from the system:
        - OpenGL Utility Kit / GLUT
        1: Ignore missing dependencies and proceed with installation.
        2: Skip installing the tests with missing dependencies.
        3: Re-attempt to install the missing dependencies.
        4: Quit the current Phoronix Test Suite process.
        Missing dependencies action: 4

    So even though it installs freeglut3-dev, it still complains about missing GLUT. You can't win against GLUT, it seems. So, what do people use for this? I mean, really, because I have tried googling for an hour and it is not paying off. Thanks!

  • How to troubleshoot GPU freezes?

    - by dlsmith2
    So in advance I'll just say I am a total Linux newbie, so be kind. I just downloaded Ubuntu 11.10 and this is my first experience with Linux. I have enjoyed it so far, except for when my computer freezes, which has been quite often. I've done a little research and it seems my problem is with the GPU. When it does freeze, I can move the cursor but cannot click on anything. I also cannot run xkill via Alt+F2. My only previous experience has been with Windows, where I would normally solve an issue like this with Ctrl+Alt+Delete and just shut down the offending program. I do not know how to do this in Ubuntu, and I'm not even sure it would work. Please help me if you can: how do I deal with a freeze without having to resort to a hard shutdown? I cannot seem to run the computer for over an hour without experiencing this issue. I tried accessing my GRUB menu on startup, but I can't even seem to do that. Also, the only real program I have been running whenever this happens is Firefox. Thank you, I appreciate any help. The output of lspci | grep VGA is: 00:12.0 VGA compatible controller: nVidia Corporation C67 [GeForce 7150M / nForce 630M] (rev a2)

  • What is the best book to learn optimized programming in Java? [closed]

    - by Abhishek Simon
    Possible Duplicate: Is there a canonical book for learning Java as an experienced developer? Let me elaborate a little: I used to be a C/C++ programmer, where I used data structure concepts like trees, queues, stacks, etc., and tried to optimize as much as possible: a minimum number of loops and variables, trying to make everything efficient. It's been a couple of years since I started writing Java code, but it is simply not that efficient in terms of performance, memory use, etc. To the point: I want to enter programming challenges using Java, so I need to improve my approach to the things I program. Please suggest some books that can help me learn to program better and give me a chance at solving programming challenges.

  • Why is GPU used for mining bitcoins?

    - by starcorn
    Something that I have not really grasped is the idea of bitcoins, especially since everybody can mine them using a powerful GPU. I wonder why a GPU is used for this purpose. Is the work done by the GPU used by some huge organization, or is it just a wasted resource that goes into simulated mining? I mean, for example, SETI uses your GPU for the purpose of finding aliens, but from what I can see of bitcoin mining, it seems to serve no actual purpose other than wasting resources.

  • Are books on programming hard to understand?

    - by DarkEnergy
    I've been reading books that are extremely daunting. Accelerated C++ is by far one of those books, and one that I haven't finished; I plan to, but that's another story. When reading a programming book, do you find yourself rereading a lot of the paragraphs? Sometimes it takes me about an hour to read 20 pages of a book. Sometimes they become so daunting that it takes me all day to finish a single chapter. I think having these as e-books makes them even harder to read sometimes, since I'm so used to looking down to read a book or just looking at tangible paper. I just want to know whether reading these books gets extremely hard for you too, and whether you find yourself rereading the simplest paragraphs two or three times just to get the meaning, because the previous paragraph left your brain hurting. http://www.it-career-coach.net/2007/03/04/are-computer-programming-books-hard-to-study/ is an article I read on something similar to this. Edit: sometimes I find myself reading a whole page, then I look up and say "what did I just read?". I can finish a chapter in 30 minutes to an hour and feel this way too.

  • Should I pick up a functional programming language?

    - by Statement
    I have recently become more concerned about the way I write my code. After reading a few books on design patterns (and overzealous implementation of them, I'm sure), I have shifted my thinking greatly toward encapsulating that which changes. I tend to notice that I write fewer interfaces and more method-oriented code, where I love to spruce life into old classes with predicates, actions and other delegate tasks. I tend to think that it's often the actions that change, so I encapsulate those. I even often, although not always, break interfaces down to a single method, and then I prefer to use a delegate for the task instead of forcing client code to create a new class. So I guess it then hit me: should I be doing functional programming instead? Edit: I may have a misconception about functional programming. Currently my language of choice is C#, and I come from a C++ background. I work as a game developer, but I am currently unemployed. I have a great passion for architecture. My virtues are clean, flexible, reusable and maintainable code. I don't know if I have been poisoned by these ways or if it is for the better. Am I having a refactoring fever, or should I move on? I understand this might be a question about "use the right tool for the job", but I'd like to hear your thoughts. Should I pick up a functional language? One of my fear factors is leaving the comfort of Visual Studio.

  • Comparison of Extreme Programming (XP) to Traditional Programming Methodologies

    The comparison of extreme programming (XP) to traditional programming methodologies finds parallels in the historic biblical battle between David and Goliath. Goliath of Gath was a Philistine warrior renowned for his size, strength and battle-tested skills. Much like Goliath, traditional methodologies are known to be cumbersome due to large amounts of documentation, and time consuming due to the time needed to gather all the information. However, traditional methodologies have been widely accepted by the software development community for years because of their attention to detail regarding project development and maintenance. David was a young Israelite who was small, fearless, and untrained in any type of formal combat. In a similar fashion, extreme programming favors code over documentation, so that time is spent on developing the project and not on cumbersome documentation of it. Typically, project managers and developers are fearless when they start this type of project because they usually start with little to no documentation, and they expect to be given changes to implement at the start of every new project iteration. Because of the lack of need or desire for documentation in extreme programming projects, they can appear to have no formal process involved. This is a misnomer: because of the consistent development iterations and interaction with clients and users, the project quickly takes form, and each iteration allows it to be refined as the customer's needs and desires change. Ravikant Agarwal and David Umphress documented a new approach to extreme programming, called personal extreme programming (PXP), at the ACM Southeast Regional Conference in 2008. PXP is the application of extreme programming core concepts in a single-developer environment, and it focuses on how the main concepts and practices of extreme programming, which are typically centered on a group setting, can be altered to benefit a single developer. Suzanne Smith and Sara Stoecklin are both advocates of extreme programming, according to the Journal of Computing Sciences in Colleges, and in fact they feel that it should receive more attention in introductory programming classes to allow students to better understand the software development process. Reasons why extreme programming is a good thing:
    - Developers get to do more of what they love: developing. Traditional software development methodologies tend to add additional demands on a project by requiring all requirements and project specifications to be fully defined prior to the start of the implementation phase.
    - A standard 40-hour work week. Limiting the work week to 40 hours prevents developers from getting burned out on projects.

  • Programming knowledge vs. programming logic

    - by Shirish11
    Is there any difference between the two topics? I have seen companies asking for good programming knowledge and some asking for good programming logic. I believe that programming knowledge relates to knowledge of the language in question, while programming logic is the problem-solving logic you apply using programming (in general). Please correct me if I am wrong. Also, which is more important? Edit: does the selection of components for an application, designing interfaces, and validating user inputs fall under programming knowledge or programming logic? Does programming logic simply mean problem solving, or is there anything else it should comprise?

  • How do functional programming languages work?

    - by eSKay
    I was just reading this excellent post, and got a better understanding of what exactly object-oriented programming is, how Java implements it in one extreme manner, and how functional programming languages are a contrast. What I was thinking is this: if functional programming languages cannot save any state, how do they do simple things like reading input from a user (I mean, how do they "store" it), or storing any data for that matter? For example, how would this simple C program translate into a functional programming language, for example Haskell?

        #include <stdio.h>

        int main() {
            int no;
            scanf("%d", &no);
            return 0;
        }
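
    A minimal sketch of how that C snippet might look in Haskell, assuming only the standard Prelude: the value read from the user is not stored in a mutable variable; it is simply a name bound to the result of the readLn IO action, and any "state" is threaded through the IO type.

        -- Rough Haskell counterpart of the C program above (a sketch, not the only way to write it).
        main :: IO ()
        main = do
          no <- readLn :: IO Int   -- roughly the scanf("%d", &no) step
          print no                 -- the C version discards the value; printed here only to show it is in scope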

  • What is the specification for GPU ROMs?

    - by Alexandru
    So, graphics cards have a ROM that you can export with GPU-Z (an example of an application that will perform this task). Is it at all possible to find out what the specification is for a GPU ROM? I have an issue with one of my cards and would like to add a GOP partition to it in order to enable Secure Boot and remove the annoying watermark in Windows 8.1 about Secure Boot not being configured correctly.

  • Macbook Pro with Windows 7 - GPU always on

    - by Joonas Pulakka
    Übergizmo is reporting an issue with the new MacBook Pros' GeForce 330M GPU being always "on" under Windows 7, thus almost halving the battery life compared with OS X (which is somehow able to suspend that GPU and use the low-end integrated GPU to do the light work). Any solutions, or rumors of coming solutions?

  • Using PhysX on a Radeon GPU?

    - by davr
    Is it possible to get PhysX to run on a Radeon GPU? I've seen various posts on forums claiming someone has found a way, but I was unable to locate any actual guides or software downloads for doing it. Note: I don't mean running a GeForce and a Radeon GPU in the same machine; I mean having only a Radeon GPU and having it run PhysX.

  • Understanding GPU clock rates

    - by trizicus
    I know how to overclock my CPU (mess with the multiplier and bus speed)... However, I've noticed that it seems a bit more complicated with GPUs. How and where do I start? I've noticed that I can adjust the GPU clock speed in my BIOS. The card I'm overclocking: http://www.nvidia.com/object/product_geforce_gt_240_us.html I found that memory bandwidth is (memory speed * bus width) / 8, so obviously a good way to increase the memory bandwidth is to raise the memory speed. Now, the GPU core speed is 550 MHz. How do I work with that figure as well? Do I multiply it by the bus width (128)? What is the ideal GPU speed relative to memory bandwidth?
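
    As a small worked example of that formula, here is a sketch; the 3400 MHz effective memory speed plugged in below is only an illustrative number, not a verified spec for this particular card.

        -- Memory bandwidth in MB/s = effective memory speed (MHz) * bus width (bits) / 8.
        memBandwidthMBs :: Double -> Double -> Double
        memBandwidthMBs memSpeedMHz busWidthBits = memSpeedMHz * busWidthBits / 8

        main :: IO ()
        main = print (memBandwidthMBs 3400 128)   -- 3400 * 128 / 8 = 54400 MB/s, i.e. roughly 54 GB/s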

  • GPU not powering on

    - by Lerp
    So I got home from work yesterday and went to turn my computer on as usual, only to find that the screens remained black. So I rebooted; I got as far as GRUB before my screens went black again. I rebooted again, and they didn't turn on. I rebooted again and got as far as the Windows login screen. This time I unplugged it, opened it up and cleaned it, but with no luck; the GPU was still being temperamental. I repeated the process of turning it off and on several times until one time it worked as normal. I happily played games for the rest of the night (5-6 hours?) thinking everything was jolly good now. Well, I got home from work today and it is doing the SAME thing. Sometimes everything displays normally for a few seconds to minutes and then the screens go black; sometimes the screens don't come on at all. Summary and additional points:
    - Screens sometimes turn on before shortly turning off, and sometimes they don't; I cannot seem to find any pattern for when they do or do not turn off.
    - The build has been working fine for about 8 months now, so I know it's not a hardware incompatibility.
    - If I plug a monitor into the onboard graphics, I can use the PC normally (just in low-graphics mode).
    - I have two monitors, and it's a case of both turning on or neither, so I think I can rule out the monitors being dead.
    - I have tried replacing the GPU.
    - I have tried replacing the RAM.
    - I have tried flashing the CMOS.
    - I have tried cleaning the inside.
    - The GPU is a Radeon HD 7870.
    My questions:
    - Is my GPU dead? It's not very old, and I would rather have a way of being certain it's the GPU before I fork out money I can't really afford. I do not have a second PC here to test it in.
    - If my GPU is dead, why does it sometimes work and sometimes not?
    Update: Okay, it was working again... at least I thought it was. I left it running for 10-20 minutes with the screens black, turned it off and straight back on, and it worked for all of 10 minutes. I was then updating this post in joy, thinking I could play some games for the rest of the night, when BAM, it went black again. So yeah, I don't know :C

  • Deactivate GPU and use IGP

    - by squelos
    I am having some trouble with my Sony Vaio SVE1511W1e laptop. It has an ATI Radeon GPU, and the i5 (2450M) has an IGP. I don't often use the discrete GPU, and the IGP would be enough for most of what I do. Therefore, in order to improve battery life, I wish to deactivate the GPU and use only the IGP. The problem is that my BIOS doesn't allow me to do so, but I believe it is possible to deactivate the GPU programmatically. I'm running Debian Wheezy on the 3.2.0.4 AMD64 kernel. The first problem I'm running into is that when I run lspci, my IGP doesn't show up. Could this be because I'm lacking a kernel module? (I chose a targeted installation.) What are the solutions for deactivating a GPU and using an IGP on a Linux system such as Debian?

  • How to write good programming logic?

    - by user106616
    Recently I got a job as a Java developer, and now I have been assigned a project too. I want to know: what is good logic? When I check in my code, my team lead says that it is good code. But when it comes to my project manager, he says that it is bad code, and he changes it; after his changes, if I look at his code, it is really very good and even simple. Can you please tell me how to develop good programs and good logic? What is the best way to structure a problem in terms of code?

  • Recommendation for Improving Programming Skills

    - by Moaz ELdeen
    I'm 25, and I have known C++ syntax for 9 years, but it seems that I have copied so much code that I haven't learned that much, and I haven't solved many algorithms on my own. Currently I'm working as a junior computer vision programmer, and I have difficulty with algorithms like blob tracking or object tracking, and with writing algorithms like KNN, quadtrees, etc. I don't know what to do or what to improve. I tried to write an asteroids game, which I have finished, and you can watch it here: https://www.youtube.com/watch?v=jw0L4aCB4TU What more should I do to enhance my skills?
