Search Results

Search found 13534 results on 542 pages for 'gpu programming'.

Page 218 of 542

  • What Counts for a DBA: Skill

    - by drsql
    “Practice makes perfect,” right? Well, not exactly. The reality is that this saying is an untrustworthy aphorism. I discovered this in my “younger” days when I was a passionate tennis player, practicing and playing 20+ hours a week. No matter how passionate I was, without some serious coaching (and perhaps a change in dietary habits), my skill level was never going to rise to the point where I could make any money at the sport other than by selling tennis balls at a sporting goods store. My game may have improved with all that practice, but I had too many bad practices to overcome. Practice by itself merely reinforces what we know and what we can figure out naturally. The truth is actually closer to the expression used by Vince Lombardi: “Perfect practice makes perfect.”

    So how do you become skilled as a DBA if practice alone isn’t sufficient? Hit the Internet and start searching for SQL training and you can find 100 different sites. There are also hundreds of blogs, magazines, books, and conferences, both onsite and virtual. But then how do you know who is good? Unfortunately, the experience level of the writer can often be the worst guide. Some of the best DBAs are frighteningly young, and some got their start back when databases were stored on stacks of paper with little holes in them.

    As a programmer, is it really so hard to understand normalization? Set-based theory? Query optimization? Indexing and performance tuning? The biggest barrier is often previous knowledge, particularly programming skills cultivated before you get started with SQL. In the world of technology, it is pretty rare that a fresh programmer will gravitate to database programming. Database programming is very unsexy work, because without a UI all you have are a bunch of text strings that you could never impress anyone with. Newbies spend most of their time building UIs or apps with procedural code in C# or VB, scoring obvious, interesting wins. Making matters worse, SQL programming requires mastery of a toolset much different from that of most any mainstream programming skill. Instead of controlling everything yourself, you leave most of the really difficult work to the internals of the engine (written by other non-relational programmers… we just can’t get away from them).

    So is there a golden road to achieving a high skill level? Sadly, with tennis, I am pretty sure I’ll never discover it. With programming, however, it seems to boil down to practice in applying the appropriate techniques for whatever type of programming you are doing. Can a C# programmer build a great database? As long as they don’t treat SQL like C#, absolutely. The same goes for a DBA writing C# code. None of this stuff is rocket science, as long as you learn to understand that different types of programming require different skill sets, and you as a programmer must recognize the difference between one of the procedural languages and SQL and treat them differently. Skill comes from practicing doing things the right way and making “right” a habit.

    Read the article

  • Get started with C++ AMP

    - by Daniel Moth
    With the imminent release of Visual Studio 2012, even if you do not classify yourself as a C++ developer, C++ AMP is something you should learn so you can understand how to speed up your loops by offloading the computation performed in the loop to the GPU (assuming you have a large number of iterations/data). We have many C# customers who are using C++ AMP through pinvoke, and of course many more directly from C++. So regardless of your programming language, I hope you'll find these short videos helpful for getting started with C++ AMP:

      - C++ AMP core API introduction... from scratch
      - Tiling Introduction - C++ AMP
      - Matrix Multiplication with C++ AMP
      - GPU debugging in Visual Studio 2012

    In particular, the work we have done for parallel and GPU debugging in Visual Studio 2012 is market leading, so check it out! Comments about this post by Daniel Moth are welcome at the original blog.
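
    For a flavour of what that loop offloading looks like, here is a minimal sketch (assuming Visual Studio 2012's <amp.h> and a DirectX 11 capable GPU; the element-wise operation is just an illustrative placeholder, not taken from the videos):

        #include <amp.h>      // C++ AMP: array_view, parallel_for_each, index
        #include <iostream>
        #include <vector>
        using namespace concurrency;

        int main() {
            std::vector<float> data(1024, 1.0f);

            // Wrap the vector so the runtime can copy it to and from the accelerator (GPU).
            array_view<float, 1> av(static_cast<int>(data.size()), data);

            // The lambda runs once per index, potentially on the GPU (restrict(amp)).
            parallel_for_each(av.extent, [=](index<1> idx) restrict(amp) {
                av[idx] = av[idx] * 2.0f + 1.0f;
            });

            av.synchronize();                    // copy results back into 'data'
            std::cout << data[0] << std::endl;   // prints 3
            return 0;
        }

    The restrict(amp) qualifier limits the lambda to the subset of C++ that can run on the accelerator; everything else is ordinary C++.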

    Read the article

  • DirectCompute Lectures

    - by Daniel Moth
    Previously I shared resources to get you started with DirectCompute, for taking advantage of GPGPUs in a way that doesn't tie you to a hardware vendor (e.g. NVIDIA, AMD). I just stumbled upon, and had to share, a lecture series on Channel 9 on DirectCompute! Here are direct links to the episodes that are up there now:

      - DirectCompute Expert Roundtable Discussion
      - DirectCompute Lecture Series 101 - Introduction to DirectCompute
      - DirectCompute Lecture Series 110 - Memory Patterns
      - DirectCompute Lecture Series 120 - Basics of DirectCompute Application Development
      - DirectCompute Lecture Series 210 - GPU Optimizations and Performance
      - DirectCompute Lecture Series 230 - GPU Accelerated Physics
      - DirectCompute Lecture Series 250 - Integration with the Graphics Pipeline

    Having watched these I recommend them all, but if you only want to watch a few, I suggest #2, #3, #4 and #5. Also, you should download the "WMV (High)" version so you can see the code clearly and be able to use Ctrl+Shift+G for fast playback… TIP: To subscribe to Channel 9 GPU content, use this RSS feed. Comments about this post welcome at the original blog.
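
    For a sense of what the lectures build up to, here is a minimal, hedged sketch of a DirectCompute dispatch in C++ (assuming the Windows SDK's d3d11/d3dcompiler headers and linking against d3d11.lib and d3dcompiler.lib; error handling is reduced to asserts and result read-back via a staging buffer is omitted for brevity):

        #include <cassert>
        #include <cstring>
        #include <d3d11.h>
        #include <d3dcompiler.h>

        // HLSL compute shader: doubles each element of a structured buffer of floats.
        static const char* kShader =
            "RWStructuredBuffer<float> data : register(u0);\n"
            "[numthreads(64, 1, 1)]\n"
            "void main(uint3 id : SV_DispatchThreadID) { data[id.x] *= 2.0f; }\n";

        int main() {
            ID3D11Device* dev = nullptr;
            ID3D11DeviceContext* ctx = nullptr;
            HRESULT hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                           nullptr, 0, D3D11_SDK_VERSION, &dev, nullptr, &ctx);
            assert(SUCCEEDED(hr));

            // Compile the shader source and create the compute shader object.
            ID3DBlob* code = nullptr;
            hr = D3DCompile(kShader, strlen(kShader), nullptr, nullptr, nullptr,
                            "main", "cs_5_0", 0, 0, &code, nullptr);
            assert(SUCCEEDED(hr));
            ID3D11ComputeShader* cs = nullptr;
            dev->CreateComputeShader(code->GetBufferPointer(), code->GetBufferSize(), nullptr, &cs);

            // A 256-element structured buffer, initialised to 1.0f, exposed to the shader as a UAV.
            float init[256];
            for (int i = 0; i < 256; ++i) init[i] = 1.0f;
            D3D11_BUFFER_DESC bd = {};
            bd.ByteWidth = sizeof(init);
            bd.Usage = D3D11_USAGE_DEFAULT;
            bd.BindFlags = D3D11_BIND_UNORDERED_ACCESS;
            bd.MiscFlags = D3D11_RESOURCE_MISC_BUFFER_STRUCTURED;
            bd.StructureByteStride = sizeof(float);
            D3D11_SUBRESOURCE_DATA sd = { init, 0, 0 };
            ID3D11Buffer* buf = nullptr;
            dev->CreateBuffer(&bd, &sd, &buf);
            ID3D11UnorderedAccessView* uav = nullptr;
            dev->CreateUnorderedAccessView(buf, nullptr, &uav);

            // Bind and dispatch: 256 elements / 64 threads per group = 4 thread groups.
            ctx->CSSetShader(cs, nullptr, 0);
            ctx->CSSetUnorderedAccessViews(0, 1, &uav, nullptr);
            ctx->Dispatch(4, 1, 1);

            // (Reading results back would use a D3D11_USAGE_STAGING copy and ID3D11DeviceContext::Map.)
            uav->Release(); buf->Release(); cs->Release(); code->Release(); ctx->Release(); dev->Release();
            return 0;
        }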

    Read the article

  • How can a computer render a CLI/console along with a GUI?

    - by Nathaniel Bennett
    I'm confused when looking into graphics, specifically with operating systems. How can a computer render a CLI/console along with a GUI? GUIs are completely different from text, and yet we have GUI windows that display text interfaces, i.e. we have CLIs in modern graphical operating systems - that's what I'm mainly trying to get a grip on. How do graphics get rendered to the display? Is there some sort of memory region that the GPU accesses which holds all the pixel data, and are there systems within the OS that gather the pixel positions of windows and widgets, along with their Z index, and rasterize them into that memory, which the GPU then loads onto the screen? And what about CLIs integrated with graphics? How does the OS tell the GPU that a certain part of the screen wants to display text while the rest wants to display pixel data?
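
    The "memory region that holds all the pixel data" in the question is essentially a framebuffer. As a rough illustration only (assuming a Linux virtual console that exposes /dev/fb0, a 32-bits-per-pixel mode, and no display server owning the screen), a program can map that memory and write pixels into it directly:

        #include <fcntl.h>
        #include <linux/fb.h>
        #include <stdint.h>
        #include <stdio.h>
        #include <sys/ioctl.h>
        #include <sys/mman.h>
        #include <unistd.h>

        int main() {
            int fd = open("/dev/fb0", O_RDWR);
            if (fd < 0) { perror("open /dev/fb0"); return 1; }

            fb_var_screeninfo v;
            fb_fix_screeninfo f;
            ioctl(fd, FBIOGET_VSCREENINFO, &v);   // resolution and bits per pixel
            ioctl(fd, FBIOGET_FSCREENINFO, &f);   // line length (stride) in bytes

            size_t size = (size_t)f.line_length * v.yres;
            uint8_t* fb = (uint8_t*)mmap(nullptr, size, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
            if (fb == MAP_FAILED) { perror("mmap"); close(fd); return 1; }

            // Fill a 100x100 square at the top-left with a solid colour (assumes 32 bpp, xRGB layout).
            for (unsigned y = 0; y < 100 && y < v.yres; ++y)
                for (unsigned x = 0; x < 100 && x < v.xres; ++x) {
                    uint32_t* px = (uint32_t*)(fb + y * f.line_length + x * (v.bits_per_pixel / 8));
                    *px = 0x00FF8800;
                }

            munmap(fb, size);
            close(fd);
            return 0;
        }

    A text console and a GUI compositor are, at this level, just different pieces of software deciding which pixel values end up in that scan-out memory.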

    Read the article

  • How did we get from CLI to graphics?

    - by Nathaniel Bennett
    I'm confused when looking into graphics, specifically with operating systems. I mean, how can a computer render a CLI/console along with a GUI? GUIs are completely different from text, and how can we have GUI windows that display text interfaces, i.e. how can we have a CLI in a modern graphical operating system - that's what I'm mainly trying to get a grip on. How do graphics get rendered to the display? Is there some sort of memory region that the GPU accesses which holds all the pixel data, and are there systems within the OS that gather the pixel positions of windows and widgets, along with their Z index, and rasterize them into that memory, which the GPU then loads onto the screen? And what about CLIs integrated with graphics? How does the OS tell the GPU that a certain part of the screen wants to display text while the rest wants to display pixel data? It's all very confusing. Shed some light on it, will you?

    Read the article

  • Can I automatically make my Nvidia card's fan quieter?

    - by Salim Fadhley
    I have a machine with an Nvidia graphics card. Unfortunately the GPU fan is very loud. It's very annoying at times. We never use this machine for intense 3d work - that GPU is probably not working very hard at all. I'm pretty sure I can run it at a much lower fan-speed without causing any problems. The nvclock utility can be used to manually adjust the fan-speed of my Nvidia graphics card. I'd like to call this utility automatically when the machine boots up. Is there some kind of system service which I can use to automatically apply this kind of system-wide configuration? Even better, is there a system monitoring service which can poll the GPU temperature and adjust the various system fan-speeds accordingly? Thanks!

    Read the article

  • Do I lose/gain performance for discarding pixels even if I don't use depth testing?

    - by Gajoo
    When I first searched for the discard instruction, I found experts saying that using discard will result in a performance drain. They said discarding pixels breaks the GPU's ability to use the z-buffer properly, because the GPU has to run the fragment shader for both objects first to check whether the one nearer to the camera is discarded or not. For the 2D game I'm currently working on, I've disabled both depth testing and depth writes. I'm drawing all objects sorted by their depth and that's all - no need for the GPU to do anything fancy. Now I'm wondering: is it still bad if I discard pixels in my fragment shader?

    Read the article

  • I've got two technical degrees but little in the way of experience. How do I get into programming? [closed]

    - by Neonfirelights
    I'm looking for a job; I want to break into programming. I'm looking for the right sort of role and the right place to look for it, and I would really appreciate input from someone with industry experience. I've got an excellent academic record: a BSc in Physics (2:1) and an MSc in Computer Graphics, Vision and Imaging (expecting a Merit) from two world-ranking universities. I have advanced technical knowledge of C/C++ and Matlab and experience working with C# and VB.NET. Unfortunately I don't have much in the way of commercial experience; unlike a lot of people I know, my undergraduate degree didn't come with a sandwich placement. Where can I go to break into the software industry?

    Read the article

  • How to install the NVIDIA Optimus driver on Ubuntu 12.10?

    - by Adam
    I have followed this guide ( http://ubuntuportal.com/2012/01/bumblebee-3-0-tumblewed-nvidia-optimus-gpu-switching-for-linux-has-been-released-how-to-install-bumblebee-3-0-on-ubuntu.html ) to install the NVIDIA driver on my Dell Inspiron N5110 notebook (Intel HD Graphics 3000 + NVIDIA GeForce GT 525M), but I always get an error when I try to start any program with the optirun command. The terminal says:

        adam@Adam-LT:~$ optirun firefox
        [ 1482.559417] [ERROR]Cannot access secondary GPU - error: Could not load GPU driver
        [ 1482.559517] [ERROR]Aborting because fallback start is disabled.

    My laptop fan is always running, which suggests that the NVIDIA card is consuming power in the background. (The terminal also sometimes says that some daemon/server is not running.) Can you give me a solution for this?

    Read the article

  • Any help please - not recognizing my hard drive

    - by Imperial0007
    If anyone can help, it would be much appreciated. I recently built my own PC, which I would like to use for gaming etc. (with Ubuntu, of course, as my OS). I installed Ubuntu via a flash drive, and everything is connected. I purchased a graphics card/GPU: XFX Double D R9 270, 925MHz Boost, 2GB DDR5, DP, HDMI, 2x DVI. Now my problem is that when I put in the CD to install the GPU drivers, it does not recognize the HDD. Why is the hard drive not being recognized? HDD info: ADATA USA Premier Pro SP600 32GB SATA. I am able to enter the BIOS menu (if that helps). Any help would be much appreciated, and thanks in advance.

    Read the article

  • What are the best programming and development-related blogs?

    - by Christopher Cashell
    There are lots of great resources available on the Internet for learning more about programming and improving your skills. Blogs are one of the best, IMO. There's a wealth of knowledge and experience, much of it covering topics not often found in traditional books, and the community aspect helps bring in multiple viewpoints and ideas. We're probably all familiar with Coding Horror and Joel on Software (so no need to mention them), but what are the other great ones out there? Which blogs do you find yourself following most closely? Where do you see the best new ideas, the most interesting or informative posts, or just the posts that make you sit back and think? One blog per answer, please, and then we'll vote up the best so we can all learn from them.

    Read the article

  • Technical/Programming/Non-SEO Pros and Cons of WWW or no-WWW?

    - by Ingenutrix
    What are the technical/programming/non-SEO pros and cons of www vs no-www, for domains as well as subdomains? From Jeff Atwood's Twitter at http://twitter.com/codinghorror/status/1637428313 : "sort of regretting the no-www choice because it causes full cookie submission to ALL subdomains. :(" What does this mean? Is there a blog post or article detailing this? What other specific issues, and the reasons behind them, should be considered for www vs no-www? Update: on searching for more info on this topic, I found the following helpful (in addition to Laurence Gonsalves' answer):

      - Dropping the WWW Prefix
      - Impact on search results: Jivlain's and Isaac Lin's comments
      - Use Cookie-free Domains for Components
      - on StackOverflow: Should I default my website to www.foo or not?
      - on StackOverflow: When should one use a ‘www’ subdomain?

    Read the article

  • I didn't get 100% on a programming job interview. Should I worry?

    - by user347598
    I recently had a phone job interview with a one-hour programming practical. It had two questions on it, and I know I answered one completely correctly and got most of the second correct. Should I worry about getting the job based just on that? The actual phone interview went very well; they told me that I answered their questions well and that the questions I aimed at them were very good, including some they had not heard before but felt they should have. So the big question is: should I worry, or is less than 100% completion OK?

    Read the article

  • Converting Asynchronous Programming Model (Begin/End methods) into event-based asynchronous model?

    - by David
    Let's say I have code that uses the Asynchronous Programming Model, i.e. it provides the following methods as a group, which can be used synchronously or asynchronously:

        public MethodResult Operation(<method params>);
        public IAsyncResult BeginOperation(<method params>, AsyncCallback callback, object state);
        public MethodResult EndOperation(IAsyncResult ar);

    What I want to do is wrap this code with an additional layer that will transform it into the event-driven asynchronous model, like so:

        public void OperationAsync(<method params>);
        public event OperationCompletedEventHandler OperationCompleted;
        public delegate void OperationCompletedEventHandler(object sender, OperationCompletedEventArgs e);

    Does anyone have any guidance (or links to such guidance) on how to accomplish this?

    Read the article

  • How do I start game programming in Windows Phone XNA?

    - by Ankit Rathod
    Hello, I am very much interested in game programming with XNA. However, during my college days I did not take physics or maths. Does that mean I can't create games in XNA? I just know the basics of trigonometry. Can you all point me to a few links where I can learn XNA, as well as the basic maths that is bound to be required in most games? Are all game programmers excellent at maths and physics? Thanks in advance :)

    Read the article

  • What programming technique or practice of yours was ahead of its time?

    - by Binoj Antony
    I once built a very good web application in classic ASP back in 2001 and used the XmlHttpRequest object extensively in it. (I was lucky that the clients were only using IE, and only IE supported this object at that time.) Later, when people started talking about AJAX in 2005, it felt good to have used something ahead of its time. Well, maybe this does not qualify as something done ahead of its time. Which programming technology/technique/practice have you used that was ahead of its time? One story per answer, please. The title for this question is taken from an opposite question here.

    Read the article

  • What is the worst programming mistake you have made?

    - by George Edison
    Most of us are not perfect. (Well, except Jon Skeet.) Have you made a terrible mistake that you would like to share? The idea is that we could all learn from our mistakes, and by collecting them together here we can avoid some common ones and discover some not-so-common ones we may have overlooked. Oh, and this question is CW, of course. Edit: This question is different from http://stackoverflow.com/questions/1928002/what-is-the-worst-programming-mistake-you-have-ever-seen because here we are sharing our own mistakes. Edit again: And this one, http://stackoverflow.com/questions/130965/what-is-the-worst-code-youve-ever-written , is different too - it asks for code. My question does not have that restriction!

    Read the article

  • How can I make a career in Formal Methods programming in the USA?

    - by A5al Andy
    I've found that my (USA) professors recoil with near-disgust when I ask them how to pursue a career in Formal Methods programming. They say, "Oh, that stuff! That stuff is anal. You don't need that European POS to get a job." I'm sure I'll get a job without it, but Formal Methods interests me so much that I bet I'd like to make a career of it. I'd like to learn about Formal Methods at an American university and then work in that field here. I've found that even professors at more important universities than mine don't seem to welcome Formal Methods. Almost all FM research project webpages are semi-abandoned and moldering. Europe is where the action seems to be for this. Can anyone suggest a plan of attack, and along the way explain the antipathy to Formal Methods in the US? I'm a sophomore at a public university in the South.

    Read the article

  • What is the most you've charged for a single programming job?

    - by David Murdoch
    This question/wiki is aimed more at my fellow freelancers than at companies or groups... but any and all feedback is definitely welcome. When quoting jobs for anything over $10,000 I always feel uneasy and unsure about the estimate I'm providing (though I'm not sure why; I know what I'm worth [ I think :-) ] and I charge appropriately). I'm sure there are more (noob) freelancers here on S.O. that feel the same way. In danger of being voted closed because of its subjective (but factual) nature, the question(s):

      - What is the largest amount you have charged for a single programming job (not including maintenance, support, or residual income)?
      - What are some of the details of the specific job? (research, Q&A, challenges, etc.)
      - What languages did you use to get the job done?
      - Assuming you bill your work at an hourly rate, what was the rate?
      - How long did the job actually take you to complete? (from start to deployment - how many weeks, months, years?)

    Read the article

  • What's the best general programming book to review basic development concepts?

    - by Charles S.
    I'm looking for a programming book that reviews basic concepts like implementing linked lists, stacks, queues, hash tables, tree traversals, search algorithms, etc. Basically, I'm looking for a review of everything I learned in college but have forgotten. I prefer something written in the last few years that includes at least a decent amount of code in object-oriented languages. This is to study for job interview questions, but I already have the "solving interview questions" books; I'm looking for something with a little more depth and explanation. Any good recommendations?

    Read the article

  • What is your longest-held programming assumption that turned out to be incorrect?

    - by Demi
    I am doing some research into common errors and poor assumptions made by junior (and perhaps senior) software engineers. What was your longest-held poor assumption that was eventually corrected? For example: at one point I failed to understand that the size of an integer is not standard - it depends on the language and target. A bit embarrassing to admit, but there it is. Be frank: what firmly held belief did you have, and roughly how long did you maintain it? It can be about an algorithm, a language, a programming concept, testing - anything under the computer science domain.
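
    The integer-size example is easy to see for yourself; here is a small sketch (the sizes in the comments are typical observations, not guarantees - the C++ standard only fixes minimum ranges):

        #include <cstdio>

        int main() {
            // Implementation-defined: typically int=4 everywhere common today, but
            // long is 8 bytes on 64-bit Linux/macOS (LP64) and 4 bytes on 64-bit Windows (LLP64).
            std::printf("int:       %zu bytes\n", sizeof(int));
            std::printf("long:      %zu bytes\n", sizeof(long));
            std::printf("long long: %zu bytes\n", sizeof(long long));
            std::printf("void*:     %zu bytes\n", sizeof(void*));
            return 0;
        }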

    Read the article
