Search Results

Search found 13534 results on 542 pages for 'gpu programming'.

Page 233 of 542

  • What are the pros and cons of Unity3D as a choice for making games?

    - by jokoon
    We are doing our school project with Unity3D, since the school used Shiva the previous year (which seemed horrible to me), and I wanted to know your point of view on this tool.
    Pros: multi-platform (I even heard Google is going to implement it in Chrome); everything you need is there; the scripting languages make it a good choice for people who are not programming gurus (see the short script sketch below).
    Cons: multiplayer? It is proprietary, so you are totally dependent on Unity and its limits and can't extend it; it is less "making a game from scratch", and C++ would have been a cool thing.
    I really think this kind of tool is interesting, but is it worth using at school for a project that involves more than three programmers? What do we really learn, in terms of programming, from using this kind of tool (I'm OK with Python and JS, but I hate C#)? We could have used Ogre instead, even though we were only starting DirectX in January...
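
    To give a concrete idea of how approachable the scripting is, here is a minimal Unity C# sketch; the class name Spinner and its speed field are illustrative, but MonoBehaviour, Transform.Rotate and Time.deltaTime are standard Unity API:

        using UnityEngine;

        // Attach this to any GameObject and it spins at a steady rate.
        public class Spinner : MonoBehaviour
        {
            public float degreesPerSecond = 90f; // tweakable in the Inspector

            void Update()
            {
                // Time.deltaTime makes the rotation frame-rate independent.
                transform.Rotate(0f, degreesPerSecond * Time.deltaTime, 0f);
            }
        }

    Compare that with hand-rolling a render loop, asset pipeline and input handling in C++; that trade-off is exactly what the pros and cons above weigh.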

    Read the article

  • What is the worst programmer habit?

    - by 0x4a6f4672
    Many people get into programming because programming is fun. At least in the beginning. After some time doing it professionally, programming is no longer fun, often just hard work. Sometimes we develop bad habits along the way to make it fun again. Some bad habits of programmers are well known, for example the "I'll fix that in a second" habit, the "reinvent the wheel" practice, or the "all code except mine is crap" attitude (which often leads to the "I will rewrite the entire program from scratch" syndrome). There are things a programmer should never do. What is the worst programmer habit?

    Read the article

  • Career Shifters: How to compete with IT/ComSci graduates

    - by CareerShifter
    I am wondering what the chances are of a career shifter (mid-20s) with maybe 3-6 months of programming experience versus younger, fresh IT/CompSci graduates. You see, even though I really love programming (Java/J2EE), nobody gives me feedback when I apply online, maybe because they prefer IT/CompSci graduates to a career shifter like me. So can you advise me on how to improve my chances of being hired? How can I get real-job programming experience if nobody is hiring me? I can build my own projects (a working e-commerce site and so on), but it is still different from a real job. And my code works, but it still needs a lot of improvement, and no one can tell me how to improve it because no one sees it (because I'm doing it alone?). Do you know any open-source projects (Java/J2EE) or online home-based jobs that accept Java/J2EE trainees? Thank you very much.

    Read the article

  • Is it only possible to display 64k vertices with 16-bit indices?

    - by Aufziehvogel
    I did the first 3D tutorial over at riemers.net and discovered that my graphics card only supports Shader Model 2.0 (the Reach profile in XNA), which means I can only use Int16 to store the indices (triangle to vertex). This means I can only address 2^16 = 65536 vertices. I also read on the internet that you should prefer 16-bit over 32-bit indices because not all hardware (like mine) supports 32-bit. Yet I am wondering: do all game scenes really get along with so few vertices? I thought that faces of people alone already used a lot of polygons (which are made up of vertices?). It's not relevant for me yet, but I am interested: Do game scenes use only 65536 vertices? Do you use some trade-off to display more (e.g. 64k in the GPU buffer, the rest in RAM)? Is there some method to get more into the GPU buffer? I already read in some other posts that there seems to be a limit of 64k per mesh too, so maybe you can split things into several meshes (see the batching sketch below)?
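
    On the trade-off question, the usual answer is batching: split the model into pieces of at most 2^16 vertices and give each piece its own vertex and index buffer. A minimal XNA 4.0 sketch, assuming a trivial 1:1 index list (a real mesh would remap its triangle indices into each batch's local range):

        using System;
        using System.Collections.Generic;
        using Microsoft.Xna.Framework.Graphics;

        // Splits a large vertex array into batches that each fit the 16-bit
        // index limit of the Reach profile (65536 addressable vertices).
        static class MeshBatcher
        {
            const int MaxVerts = 65536; // 2^16, the most a 16-bit index can address

            public static void UploadBatches(GraphicsDevice device,
                                             VertexPositionColor[] all,
                                             List<VertexBuffer> vbs,
                                             List<IndexBuffer> ibs)
            {
                for (int start = 0; start < all.Length; start += MaxVerts)
                {
                    int count = Math.Min(MaxVerts, all.Length - start);
                    var verts = new VertexPositionColor[count];
                    Array.Copy(all, start, verts, 0, count);

                    var vb = new VertexBuffer(device, typeof(VertexPositionColor),
                                              count, BufferUsage.WriteOnly);
                    vb.SetData(verts);

                    // 1:1 indices for the sketch; each batch restarts at 0.
                    var indices = new ushort[count];
                    for (int i = 0; i < count; i++) indices[i] = (ushort)i;

                    var ib = new IndexBuffer(device, IndexElementSize.SixteenBits,
                                             indices.Length, BufferUsage.WriteOnly);
                    ib.SetData(indices);

                    vbs.Add(vb);
                    ibs.Add(ib);
                }
            }
        }

    Each batch is then drawn with its own DrawIndexedPrimitives call, so the scene as a whole can use far more than 65536 vertices.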

    Read the article

  • Did Samsung cheat on the Galaxy S4 benchmarks? Exynos 5 performance called into question after tests

    Samsung: the Galaxy S4 shows high performance in benchmarks, but lower performance in other applications. It all started with a post by a disgruntled user on the Beyond3D site after he ran benchmarks on his Galaxy S4's GPU. During the tests, the user noticed that his smartphone's GPU was running at 532 MHz. However, in all other applications, including games (at the highest resolutions), it reportedly ran at only 480 MHz. "Oh Samsung, shame on you!" he wrote. Brian Klug and Anand Lal Shimpi, of the AnandTech website, then repeated the user's experiment with their own smartphones to confirm...

    Read the article

  • I'm a beginner Java programmer, but I want to be useful

    - by user105418
    Programming has always interested me, but after learning some of the basics of Java (I'm talking high-school level), I don't really know what to do from there. I want to be able to apply what I learned in some way, whether in a volunteer project or something else, but I probably don't know enough programming. Is it possible for a novice Java programmer to be useful in any way whatsoever? I want to do this because I feel I could learn more about programming by helping people with their projects, but I'm not sure I'm even able to. Does anyone have suggestions on how I can contribute to other people's projects, or how to apply what I know in some way?

    Read the article

  • Climbing the hacker ladder

    - by cobie
    This is not a question where I am asking for opinions; rather, I am asking about first-hand experience. I have been programming in Python for quite a while and I feel solid enough in Python programming. I can come up with algorithms for problems and implement them, but I somehow feel stuck at remaining an apprentice. What are some first-hand experiences of climbing up the ladder and becoming better at programming, such as learning about browsers, security, compilers, etc.? Personal experiences would be valued in responses.

    Read the article

  • Becoming an expert vs boredom [closed]

    - by QAH
    I am a college student, and I love to program, period. I code all kinds of things in different kinds of languages. Although I enjoy programming, I have an extremely hard time sticking with one project for long. I attribute this shortcoming to my high level of curiosity: exploring different technologies, languages, libraries, etc. What would be best? Should I settle down and spend time becoming an expert in one or two programming fields, or should I be more of a jack of all trades, trying out all kinds of new technologies, languages, programming methods, etc.? I'm guessing that somewhere in the middle would be best. I'm always amazed at how many developers are able to create one or two projects and develop them for years. What techniques do you employ to stay focused on a project?

    Read the article

  • Job offer in a dead technology

    - by bold
    I have a job offer in a dead technology (a specific programming language) that I don't want to work with, nor do I believe it will offer many jobs in the future. It requires travelling abroad twice a year, which is not a plus in my eyes. On the other hand, the money on the table is high. What would you do? Edit: as it's not clear, I took a job in a programming language that is different from the academic programming language I had worked with. Now I see it as a mistake to head in that direction.

    Read the article

  • Should I continue to learn and program using ANSI C or non-standard C? [closed]

    - by Erik
    I am a CS student, enthusiastic about programming. C is the first programming language I learned. (I had never been exposed to programming, data structures, or algorithms before.) I failed the exam because I studied on my own and didn't know how to use non-standard Windows libraries like conio.h (I had to draw circles, etc., using getx and gety). I told them, but they just don't care. What do I do? For a beginner this is very confusing. Do I continue to study ANSI C on my own, or should I study non-standard C? Should I do both in parallel? I did use the search bar but found nothing useful that could help me.

    Read the article

  • Wubi install of Ubuntu 12.10 on HP dv6 6154tx causes overheating, high fan speed, and low battery life

    - by kansi
    I have an HP dv6 6154tx with an integrated Intel GPU and a discrete ATI GPU (Radeon Mobility HD 6490M, 1 GB DDR5). The problem: when I install Ubuntu 12.10 using Wubi, my laptop starts heating up, the fans run fast (too much noise), and battery life drops very low. Every attempt to install the Catalyst drivers (from the official ATI site, and also the proprietary drivers) fails: after reboot, when I log in, everything is gone, as in no Unity, no desktop, and sometimes a black screen. Even Bumblebee and Jupiter didn't help. So please, can somebody post a real solution to my problems, i.e. improve battery life and stop the fan noise and heating? (I want to install Ubuntu only using Wubi.)

    Read the article

  • Why do we need private variables?

    - by rak
    Why do we need private variables in classes, in the context of programming? Every book on programming I've read says "this is a private variable, this is how you define it", but stops there. The wording of these explanations always seemed to me as if our profession had a crisis of trust. The explanations always sounded as if other programmers were out to mess up our code. Yet there are many programming languages that do not have private variables. What do private variables help prevent? How do you decide whether a particular property should be private or not? If by default every field should be private, why are there public data members in a class at all? Under what circumstances should a variable be made public? (A short sketch follows below.)
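
    To make "what do they prevent" concrete, here is a minimal C# sketch (the class and its invariant are illustrative): the private field is what lets a class guarantee a rule about its own state.

        // The private field lets BankAccount enforce an invariant (the balance
        // never goes negative), because no outside code can assign to it
        // directly; every change must go through Deposit or Withdraw.
        public class BankAccount
        {
            private decimal balance; // only this class can modify it

            public decimal Balance
            {
                get { return balance; } // reading is safe to expose
            }

            public void Deposit(decimal amount)
            {
                if (amount <= 0)
                    throw new System.ArgumentOutOfRangeException("amount");
                balance += amount;
            }

            public bool Withdraw(decimal amount)
            {
                if (amount <= 0 || amount > balance)
                    return false; // invariant preserved: balance stays >= 0
                balance -= amount;
                return true;
            }
        }

    If balance were public, account.balance = -100 would compile and the guarantee would be gone; that, rather than distrust of colleagues, is what private buys you.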

    Read the article

  • Useful skills from a computer science degree

    - by Tom Squires
    I did my degree in physics and later moved into programming. I have two and a half years of experience under my belt and like to think I write good code. I am, however, concerned that not doing a CompSci degree has left holes in my knowledge. I would like to fill them now, since I know I want to be programming for the rest of my career. What skills and techniques did you learn in your CompSci degree that one wouldn't pick up from on-the-job programming?

    Read the article

  • "Sorry, Ubuntu 12.04 has experienced an internal error."

    - by malapradej
    I have recently upgraded to Precise and have had some errors. They seem to be quite random, with differences between the error reports. I have duly sent the reports, hoping the system would find the problem and sort itself out. After the second error I am following the software wizard's advice and seeking help. The first time it happened I jotted down the following:
    ExecutablePath /usr/lib/tracker/tracker-extract
    Launchpad bug 950765
    AMD64
    The second time, the following:
    ExecutablePath /usr/share/apport/apport-gpu-error-intel.py
    "Possible GPU hang........"
    sandybridge-m-gt2
    Launchpad bug 981261
    If anyone can help, it is much appreciated. I did not really want to upgrade at this stage, but was forced to due to the latest version of python-numpy in Precise. You win some, you lose some... Jacques. I am using a Pavilion dv6 notebook and 64-bit Ubuntu 12.04 LTS.

    Read the article

  • 27 technical videos from Qt DevDays 2005, 2006 and 2008 have now been made public by Qt eLearning

    The Qt eLearning team has for some time been looking to recover the technical videos from past Qt DevDays conferences in order to share them with everyone. That is now done, with the online publication of 27 technical presentations, amounting to 22h30min of video. The topics covered are still valid today, even if the framework has evolved over the years.
    2005: All About Qt Widgets; Effective Graphics Programming; Practical Model/View Programming; Threaded Programming with Qt - Good Practise; Writing Custom Styles with QStyle; Writing plugin applications with Qt.
    2006: Advanced Item Views...

    Read the article

  • optirun fails in Ubuntu 12.10

    - by chiloxsan
    I am trying to use Nvidia Optimus on my laptop via Bumblebee. I have an Intel Core i5 with an Nvidia GeForce GT 630M. I have followed the instructions on the wiki page at https://wiki.ubuntu.com/Bumblebee, but when trying to run optirun with Firefox (or any other program, like glxspheres), I get the following error:
    [ 1921.452820] [ERROR]Cannot access secondary GPU - error: Could not load GPU driver
    [ 1921.452905] [ERROR]Aborting because fallback start is disabled.
    I have tried googling the issue, but I couldn't find any solution that didn't cause more problems. Thank you for your time. Here is my bumblebee.conf: http://paste.ubuntu.com/1333324/

    Read the article

  • Windows Phone App with 4SQ

    - by Nuttanon Pornpipak
    I want to create my own coffee-shop app as a semester project. It's a Windows Phone app. The app should, for example, show who is checked in there now, show the menu, and show photos, using the Foursquare (4SQ) endpoint APIs. My problem is that I don't know how to start: which book should I read about C#, and which keywords should I google, e.g. GET/POST methods, JSON? I have used the 4SQ endpoint APIs once, with JavaScript (jQuery's $.ajax({...})) to fetch data. So I googled and found the JSON.NET library, but I don't know how to use it because I have never programmed in C#. I'm just beginning programming and can only program in C. Thank you, and sorry for my bad grammar. (A minimal sketch follows below.)
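
    As a starting point, here is a minimal sketch of fetching JSON and reading it with JSON.NET. The Foursquare URL, the VENUE_ID and YOUR_TOKEN placeholders, and the field names are assumptions to check against the official endpoint documentation; also, on Windows Phone you would use WebClient's asynchronous DownloadStringAsync rather than the synchronous call used here for brevity:

        using System;
        using System.Net;
        using Newtonsoft.Json.Linq; // JSON.NET

        // Downloads a JSON document and picks fields out of it with JObject,
        // without having to define any C# classes up front.
        class VenueFetcher
        {
            static void Main()
            {
                // Illustrative URL; the real path, parameters and auth are in
                // the Foursquare API documentation.
                string url = "https://api.foursquare.com/v2/venues/VENUE_ID"
                           + "?oauth_token=YOUR_TOKEN&v=20120101";

                var client = new WebClient();
                string json = client.DownloadString(url);

                JObject response = JObject.Parse(json);
                // Field names below are assumed; inspect a real response first.
                string name = (string)response["response"]["venue"]["name"];
                Console.WriteLine("Venue: " + name);
            }
        }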

    Read the article

  • For asp.net mvc is this a three tiered solution?

    - by bbb
    I am an ASP.NET MVC programmer, and when I want to start a project I do this: I make a class library named Model for my models; I make a class library named Infrastructure.Repository for database access; I make a class library named Application for the business-logic layer; and finally I make an MVC project for the UI. But now some things are confusing me. Am I using 3-tier programming? If yes, then what is n-tier programming, and which one is better? If no, then what is 3-tier programming? In some places I see the tiers named DAL and BIZ. Which naming is correct by convention? (A layering sketch follows below.)
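
    For reference, the structure described above maps onto layers like this minimal sketch (the entity, interface and service are illustrative, reusing the poster's own library names):

        // Model class library: plain entities, no dependencies.
        public class Product
        {
            public int Id { get; set; }
            public string Name { get; set; }
            public decimal Price { get; set; }
        }

        // Infrastructure.Repository class library: data access behind an
        // interface, so upper layers never see the database directly.
        public interface IProductRepository
        {
            Product FindById(int id);
        }

        // Application class library: business logic that depends only on the
        // repository interface, not on any concrete database code.
        public class PricingService
        {
            private readonly IProductRepository repository;

            public PricingService(IProductRepository repository)
            {
                this.repository = repository;
            }

            public decimal PriceWithTax(int productId, decimal taxRate)
            {
                Product product = repository.FindById(productId);
                return product.Price * (1 + taxRate);
            }
        }

        // The MVC project (UI) constructs a concrete repository and hands it
        // to PricingService, typically through a dependency-injection container.

    As for the vocabulary: "tier" usually implies physical separation (separate processes or machines), while class libraries inside one web process are layers; DAL and BIZ are simply conventional names for the data-access and business-logic layers.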

    Read the article

  • Which C# Book to take?

    - by Fischkopf
    I was searching for a book to learn C#, but now I'm kind of stuck. I found many people asking the same question, and many people gave answers, but there are so many books about C# that it is really hard to decide which one to take. I have now narrowed my choice down to two books but just can't decide between them. Namely: Programming C# 4.0 and C# 4.0 in a Nutshell. The first thing I want to know: are these good choices? I'm not completely new to programming; I just hadn't found the right language until now, and I think C# is the one I was searching for. I know all the basic stuff from Delphi/Java/Python, so I'm not a complete beginner at programming. Has anyone out there read both books and can clearly explain the difference between them? I haven't found many reviews, so I just don't know which one to choose. Or is there any book that suits me better?

    Read the article

  • Computer randomly freezes when playing games

    - by TutorialPoint
    My computer just randomly freezes when playing certain games. It has happened to me in Battlefield: Bad Company 2, Call of Duty 4, and Blacklight: Retribution. It has not yet happened with other games like Tribes: Ascend, which leads me to believe it is a software-side issue, perhaps related to DirectX or PhysX. Temperatures also seem stable: I used RivaTuner combined with MSI Afterburner, and at the time of the freeze in BF:BC2 it showed 62 C, 67% GPU usage and 78.8 FPS. During the session the maximum I saw was 65 C and 97% GPU usage. In Blacklight: Retribution I've heard other people complain about the problem too. This is why it is such a mystery to me: is this actually a driver problem, or rather a game problem? I was able to play these games for a long time until I reinstalled Windows 7 (because the installation had grown too full and slow). Before, I had the 32-bit Ultimate version; now 64-bit. Specs:
    OS: Windows 7 64-bit Ultimate
    CPU: Intel i5-750 @ default 2.66 GHz
    GPU: ASUS EAH5770 1GB
    PSU: CoolerMaster Real Power M520 (520W)
    MB: Gigabyte P55M-UD2
    Catalyst Control Center version (in "About"): 2012.0214.2218.39913

    Read the article

  • nVidia performance with newer X and newer driver abysmal with Compiz

    - by Nakedible
    I recently upgraded Debian to Xorg 2.9.4 and installed nvidia-glx from experimental, version 260.19.21. This was somewhat of an uphill battle, as the dependencies of the experimental nvidia-glx package are still somewhat broken; I got it to work without forcing the installation of any packages and without modifying them. However, since the upgrade, Compiz performance has been abysmal. I am using the desktop wall plugin, and switching viewports is really slow: each switch takes a few seconds. In addition, every effect Compiz performs, such as the zoom animations of icons when launching applications, takes seconds. The viewport-switching speed varies with the number of windows on that virtual screen: empty screens switch at almost normal speed, a single browser window works almost decently, but just four rxvt terminals slow the switches down to a crawl. My Compiz configuration should be pretty basic. Xorg is likewise configured without anything special; the only "custom" configuration is forcing the driver name to "nvidia". I've fiddled with nvidia-settings and CompizConfig, trying different VSync settings, but none of those helped. My graphics card is an NVIDIA GPU NVS 3100M (GT218) at PCI:1:0:0 (GPU-0); this is a laptop GPU from the GeForce GTX 200 series. Graphics card performance should naturally not be the problem.

    Read the article

  • Ubuntu 11.10 ATI Drivers vesa park

    - by Matthias
    This is probably not an issue; from all I can tell, my hardware and drivers are properly installed. However, when I go to System Settings - System Info - Graphics, I get Driver: VESA:PARK, Experience: Standard. My graphics card is an ATI Mobility Radeon HD 5470 512MB. I am pretty sure it's not a same-die GPU, since there is a fan exhaust at the side of my laptop which I presume is the exhaust for the GPU... I have no clue whatsoever what this means. I installed the ATI drivers first using the 'additional drivers' method. However, I also decided to look up a manual installation via the terminal, since I've had problems before with Ubuntu and ATI cards. I used wget and something along the lines of sh and dpkg -i; I can't recall exactly, I took the commands from another Stack Overflow answer. Anyway, everything seems to be installed properly, since the card shows up with these commands: sudo lshw -C video and fglrxinfo. The first command detects hardware rather than the driver per se, although the driver is probably needed to detect the hardware anyway, which would indicate it is properly installed. I am still not sure about that VESA:PARK thing, though; I'd like to know what it means. Also, if someone happens to know a good way of testing whether the GPU is connected/being used (some sort of benchmark, maybe), I'd like to hear it. P.S. I can find my way around in Ubuntu, but I would probably still be considered a rookie by more experienced users.

    Read the article

  • NVIDIA Tesla K20C in Dell PowerEdge R720xd --- power cables

    - by CptSupermrkt
    I am trying to put an NVIDIA Tesla K20C into a Dell PowerEdge R720xd, and I'm having a bit of trouble understanding the power requirements of the card. First, two pages of the same manual seem contradictory to me: one page says only a single connector is required, while the next page says both are required. The entire manual for the card can be found here: http://www.nvidia.com/content/PDF/kepler/Tesla-K20-Active-BD-06499-001-v02.pdf. (The original post included photos of the two manual pages, of the power connections on the card, and of where those connectors need to go on the PCI-E riser of the R720xd.) Neither the R720xd nor the GPU came with the necessary cables, and given what appears to be a contradiction in the GPU manual (above), I'm not even sure at this point what we actually need. I have searched high and low online for things like 2x 6-pin PCI-E to 8-pin male-to-male cables and so on, and for the life of me cannot find what we need. In case anyone needs it, the owner's manual of the R720xd can be found here: ftp://ftp.dell.com/Manuals/all-products/esuprt_ser_stor_net/esuprt_poweredge/poweredge-r720xd_Owner%27s%20Manual_en-us.pdf. The relevant page is page 68, which clearly indicates that the 8-pin female port on the riser card is for a GPU. The bottom-line question: exactly which power cables do we need to buy, and where can we find them?

    Read the article
