Search Results

Search found 18084 results on 724 pages for 'graphics programming'.


  • OpenGL-ES: clearing the alpha of the FrameBufferObject

    - by MrDatabase
    This question is a follow-up to "Texture artifacts on iPad". How does one "clear the alpha of the render texture frameBufferObject"? I've searched around here, on Stack Overflow and in various search engines, but no luck. I've tried a few things, for example calling glClear(GL_COLOR_BUFFER_BIT) at the beginning of my render loop, but it doesn't seem to make a difference. Any help is appreciated since I'm still new to OpenGL. Cheers! P.S. I read on Stack Overflow and in Apple's documentation that glClear should always be called at the beginning of the render loop. Agree? Disagree? Here's where I read this: http://stackoverflow.com/questions/2538662/how-does-glclear-improve-performance
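
    A common way to do this, if "clear the alpha" means wiping only the alpha channel while leaving the colour channels untouched, is to mask off RGB writes before issuing the clear. A minimal C++ sketch of that idea using plain GL ES calls (the render-texture FBO is assumed to be bound already; everything else in the render loop stays as it is):

        #include <OpenGLES/ES2/gl.h>   // iOS OpenGL ES 2.0 header; other platforms use <GLES2/gl2.h>

        // Clear only the alpha channel of the currently bound framebuffer object.
        void clearAlphaOnly()
        {
            // Disable writes to red, green and blue; keep alpha writable.
            glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_TRUE);

            // The alpha of the clear colour becomes the buffer's new alpha value.
            glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
            glClear(GL_COLOR_BUFFER_BIT);

            // Restore the mask so normal rendering writes all channels again.
            glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
        }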


  • Is learning Python good?

    - by user15220
    Recently I have seen some videos from MIT on computer programming topics. I found them really worth watching, especially the concepts of algorithms and other fundamentals. The programs were written and explained in Python. I had never looked into this language before, as I learned and have been doing things with C/C++ programming. But the cleanliness and better readability of the syntax attracted me. Of course, as a long-time C++ programmer, that's the most readable language for me. I have also heard that the Python library contains solid algorithm and data-structure implementations. Can you share your experience with this language?


  • fglrx installation without success

    - by Lucio
    I followed the steps of this guide and it doesn't work. I entered the following command and got an output with a dependency error: sudo dpkg -i fglrx*.deb So I tried with gdebi instead, and that works. Now fglrx, fglrx-amdcccle and fglrx-dev are installed. The next step is "Generate a new /etc/X11/xorg.conf file", but I can't do this for the following reason: when I enter sudo aticonfig --initial -f the terminal shows me this output: sudo: aticonfig: command not found Have I installed the packages correctly or not? What do I have to do to fix the problem? NOTE: I did not uninstall anything (drivers, configuration, etc.) before beginning the installation.


  • Return to the old C days.

    - by RPK
    Long ago I used to program in C, and then working with VB changed my career path. After VB came .NET, which proved to be a honeypot of Microsoft's for old VB programmers and frustrated programmers of other hard-to-learn languages. The label on this honeypot was: "Getting things done." I now want to contribute to Linux and other GNU projects. I feel that whatever programming language you learn today, if programming is your bread and butter, you must remain in touch with C. Many things have changed now, from the old Turbo C for DOS to the present...? Please advise me on how to get back on the C track again. Reading the whole thing again, chapter by chapter, is not possible now, but I can learn by writing small utility-type things, though surely GUI-based. And yes, I hope learning is going to be easier now with so many live forums and active community spots like Stack Overflow.


  • draw fog of war using shaders

    - by lezebulon
    I am making an RTS game, and I'd like some advice on how best to render fog of war, given what I'm already doing. You can imagine this game as a classic RTS like Age of Empires 2, where the fog of war is basically handled by a 2D array telling whether a given "tile" is explored or not. The specific things to consider here are: 1) I'm only doing a few draw calls to draw the whole screen, using shaders, and I'm not drawing "tile by tile" in a 2D loop. 2) The whole map is much bigger than the screen, and the screen can move every frame or so. In that case, how could I draw the fog of war? I have no issue maintaining the 2D array giving the fog of war for each tile on the CPU side, but what would be the best way to actually display it dynamically? Thanks!
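
    One approach that fits both constraints above (few draw calls, map larger than the screen) is to mirror the CPU-side exploration array in a small one-channel texture covering the whole map, and let the fragment shader darken unexplored areas. A rough sketch of that idea follows; every name in it (fogTex, fogMap, tileCoord and so on) is an illustrative assumption, not a description of an existing codebase:

        #include <GL/glew.h>   // or whichever GL loader the project already uses

        // Hypothetical sketch: one texel per map tile, updated from the CPU-side
        // fog array whenever tiles become explored.
        GLuint fogTex = 0;

        void createFogTexture(int mapWidth, int mapHeight)
        {
            glGenTextures(1, &fogTex);
            glBindTexture(GL_TEXTURE_2D, fogTex);
            // GL_LINEAR filtering gives soft edges between explored and unexplored tiles.
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
            // GL_R8 needs GL 3.0 / GLES 3.0; older targets could use an RGBA texture instead.
            glTexImage2D(GL_TEXTURE_2D, 0, GL_R8, mapWidth, mapHeight, 0,
                         GL_RED, GL_UNSIGNED_BYTE, nullptr);
        }

        void uploadFog(const unsigned char* fogArray, int mapWidth, int mapHeight)
        {
            // fogArray holds 0 for unexplored tiles and 255 for explored ones.
            glBindTexture(GL_TEXTURE_2D, fogTex);
            glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, mapWidth, mapHeight,
                            GL_RED, GL_UNSIGNED_BYTE, fogArray);
        }

        // Fragment-shader side: darken the terrain colour by the fog value at the
        // fragment's position in map space (tileCoord = world position / tile size).
        const char* fogShaderSnippet = R"(
            uniform sampler2D fogMap;
            uniform vec2 mapSizeInTiles;
            // ... inside main(), after the terrain colour has been computed:
            float explored = texture(fogMap, tileCoord / mapSizeInTiles).r;
            color.rgb *= mix(0.2, 1.0, explored);   // unexplored areas rendered dark
        )";

    Because only the fog texture changes, and usually only a few texels at a time, the per-frame cost stays small even though the map is much bigger than the screen, and the existing few-draw-call setup does not change.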


  • How to Install Nvidia Drivers

    - by Richard Rodriguez
    I just ordered the Nvidia GTX 560 card, which should arrive tomorrow. I have a dilemma, though. Should I keep using the driver which is available under "Additional Drivers" in Ubuntu (10.10), or should I install the driver from the nvidia site? NOTE: the installation methods explained here apply to all Nvidia, ATI and Intel video cards. The latest driver available at the nvidia site:
    LINUX X64 (AMD64/EM64T) DISPLAY DRIVER
    Version: 280.13 Certified
    Release Date: 2011.08.01
    Operating System: Linux 64-bit
    Language: English (U.S.)
    File Size: 52.4 MB
    I should point out that I don't need the card to unleash its full potential in Ubuntu (I have Windows on another HDD for gaming), I just need it to work properly, meaning that power saving should work (I don't want the card to overheat for no reason), and I would also like the fans to run at proper speeds, etc. So which driver is the best for me?


  • Tool for creating sprite sheets, and tips

    - by Spooks
    I am looking for a tool that I can use to create sprite sheets easily. Right now I am using Illustrator, but I can never get the center of the character in the exact position, so it looks like it is moving around (even though it's always in one place) while looping through the sprite sheet. Are there any better tools I could be using? Also, what kind of tips would you give for working with a sprite sheet? Should I create each part of the character in individual layers (left arm, right arm, body, etc.) or everything at once? Any other tips would also be helpful! Thank you.


  • Launcher icon size and window behavior broken

    - by philipp
    I have installed the nvidia driver for my graphics card, just following some tutorials, and it worked fine. After this I could set the icon size of the launcher, windows had a nice little shadow, the resolution was better and windows showed a nice effect when popping up or when going to full-screen... But today this was just gone after a reboot. What could this be? The Nvidia xserver-settings are still available. I installed and reinstalled wine1.5 via the apt-get commands, so this might have broken something. What can I do to fix this again? Greetings, philipp EDIT: I went on searching and all I found was that this problem might be connected to the mode of Unity, since there is a 2D and a 3D mode, but it could also be something else, because setting the mode brings no change. EDIT 2: the version of Ubuntu is 12.04 and it is a 64-bit environment; the graphics card is a GeForce GT 330M. EDIT 3: Using maps.google in WebGL mode does not work anymore either; it was working yesterday. EDIT 4: the screenshot. By the way, I think that Blender is not working anymore either... EDIT 5: I think that the problem is closely connected to this output


  • How do I learn Python from zero to web development? [closed]

    - by Terence Ponce
    I am looking into learning Python for web development. Assuming I already have some basic web development experience with Java (JSP/Servlets), am already familiar with web design (HTML, CSS, JS) and basic programming concepts, and am completely new to Python, how do I go about learning Python in a structured manner that will eventually lead me to web development with Python and Django? I'm not in a hurry to make web applications in Python, so I really want to learn it thoroughly so as not to leave any gaps in my knowledge of the technologies involved in web development with Python. Are there any books, resources or techniques to help me in my endeavor? In what order should I do/read them? UPDATE: When I say learning in a structured manner, I mean starting out from the basics, then learning the advanced stuff without leaving out the important details/features that Python has to offer. I want to know how to apply the things that I already know in programming to Python.


  • ATI Radeon HD 5750 and lagging in games and youtube videos

    - by Morten Fjord Christensen
    I have an X-ONE W-601 desktop PC:
    3.1 GHz AMD quad-core Athlon II 645 X4
    8 GB DDR3 RAM
    1000 GB hard disk, 7200 RPM
    ATI Radeon HD 5750 with 1 GB DDR5 RAM
    I'm running Ubuntu 11.10 64-bit and have installed the proprietary driver, but games and videos still lag a little bit. I've been googling around and have seen that it has something to do with the older drivers from AMD and KMS, but no guide got me all the way to making my graphics card work smoothly. I don't know if this helps, but "fglrxinfo" in the terminal shows:
    display: :0 screen: 0
    OpenGL vendor string: ATI Technologies Inc.
    OpenGL renderer string: ATI Radeon HD 5700 Series
    OpenGL version string: 4.1.11005 Compatibility Profile Context
    And the driver check command shows:
    [ 51.184] (II) ATI Proprietary Linux Driver Version Identifier:8.88.7
    Any help appreciated :D


  • GLSL: is it possible to offset vertices based on height map colour?

    - by Rob
    I am attempting to generate some terrain based upon a heightmap. I have generated a 32 x 32 grid and a corresponding height map. In my vertex shader I am trying to offset the position on the Y axis based upon the colour of the heightmap, white vertices being higher than black ones.

        // Vertex Shader Code
        #version 330

        uniform mat4 modelMatrix;
        uniform mat4 viewMatrix;
        uniform mat4 projectionMatrix;
        uniform sampler2D heightmap;

        layout (location=0) in vec4 vertexPos;
        layout (location=1) in vec4 vertexColour;
        layout (location=3) in vec2 vertexTextureCoord;
        layout (location=4) in float offset;

        out vec4 fragCol;
        out vec4 fragPos;
        out vec2 fragTex;

        void main()
        {
            // Retrieve the current pixel's colour
            vec4 hmColour = texture(heightmap, vertexTextureCoord);

            // Offset the y position by the value of the current texel's colour value?
            vec4 offset = vec4(vertexPos.x, vertexPos.y + hmColour.r, vertexPos.z, 1.0);

            // Final position
            gl_Position = projectionMatrix * viewMatrix * modelMatrix * offset;

            // Data sent to the fragment shader
            fragCol = vertexColour;
            fragPos = vertexPos;
            fragTex = vertexTextureCoord;
        }

    However, the code I have produced only creates a grid with none of the y vertices higher than any others.
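
    Vertex texture fetch itself is fine in GLSL 3.30, so a completely flat grid usually means the shader is reading zero from the sampler rather than the heightmap. Two things worth checking are that the heightmap texture is complete (mipmapped, or given a non-mipmap min filter) and that the sampler uniform points at the unit the texture is bound to. A hedged C++ sketch of that setup, with all names being assumptions about the calling code:

        #include <GL/glew.h>   // or whichever GL loader the project already uses

        // Hypothetical helper: make the heightmap sampleable from the vertex shader
        // and point the "heightmap" uniform at texture unit 0.
        void bindHeightmap(GLuint program, GLuint heightmapTex)
        {
            glUseProgram(program);

            glActiveTexture(GL_TEXTURE0);
            glBindTexture(GL_TEXTURE_2D, heightmapTex);

            // Without mipmaps, the default GL_NEAREST_MIPMAP_LINEAR min filter leaves
            // the texture incomplete; texture() then samples black, so hmColour.r is 0
            // everywhere and the grid stays flat.
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

            // The sampler uniform stores a texture unit index, not a texture handle.
            glUniform1i(glGetUniformLocation(program, "heightmap"), 0);
        }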


  • Is it normal to not have desktop effects after installing on an Asus N53S?

    - by Fabzter
    I've just installed 11.10 and so far so great, except I just discovered (after looking at another installation) that I'm missing some effects. First of all, I am using Unity 3D (I chose "Ubuntu" at the login menu) and already have the correct nvidia drivers. Yet my launcher is the same as in Unity 2D, the same goes for the app switcher, and when dragging windows to screen corners they don't get grabbed. This "error" has been present since the beginning, so I was not able to detect it. Please help, I want to have an Ubuntu experience as nice as everyone else's. EDIT 1: I'm using an Asus N53S. My graphics card is an Nvidia GeForce GT 540M. It says CUDA, if that matters.


  • Screen problems

    - by Erick
    I struggled a lot to install Ubuntu because of a problem with the video drivers: after installing the driver and turning the machine on, only a black screen appears. What worked for me was to run sudo setpci -s 00:02.0 F4.B=0 from the terminal, which turns the screen on, but after rebooting the change is not kept and I have to run that command again. My question is: how can I make the change permanent, so that I don't need to execute that command every time I shut the machine down? (The question was originally asked in Spanish.)


  • Is there an alternative to javascript for the web that can do multi-threading and synchronous execution?

    - by rambodash
    I would like to program web applications the way I do with desktop programming languages, where the code executes synchronously and the browser doesn't freeze when doing loops. Yes, I know there are workarounds using callbacks and setTimeout, but they are all workarounds after all, and they don't give the same flexibility as programming in the orthodox way. I've been looking at Dart as a possibility, but I can't seem to find where it says it can do either of these. The same goes for Haxe, Emscripten and the hundreds of other converters that try to circumvent JavaScript: in the end it all gets converted to JavaScript, so you ultimately have to be conscious about asynchronous execution and multi-threading anyway.


  • Cool examples of procedural pixel shader effects?

    - by Robert Fraser
    What are some good examples of procedural/screen-space pixel shader effects? No code necessary; I'm just looking for inspiration. In particular, I'm looking for effects that are not dependent on geometry or the rest of the scene (they would look okay rendered alone on a quad) and are not image processing (they don't require a "base image", though they can incorporate textures). Multi-pass or single-pass is fine. Screenshots or videos would be ideal, but ideas work too. Here are a few examples of what I'm looking for (all from the RenderMonkey samples). PS: I'm aware of this question; I'm not asking for a source of actual shader implementations but for some inspirational ideas, and the ones in the NVIDIA Shader Library mostly require a scene or are image-processing effects. EDIT: this is an open-ended question and I wish there were a good way to split the bounty. I'll award the rep to the best answer on the last day.


  • Better drivers for SiS 650/740 integrated video?

    - by Bart van Heukelom
    I installed Xubuntu 10.10 on an old box today and the graphical performance is horrid. According to lspci, the video card is this:
    01:00.0 VGA compatible controller: Silicon Integrated Systems [SiS] 65x/M650/740 PCI/AGP VGA Display Adapter (prog-if 00 [VGA controller])
        Subsystem: ASUSTeK Computer Inc. Device 8081
        Flags: 66MHz, medium devsel, IRQ 11
        BIST result: 00
        Memory at f0000000 (32-bit, prefetchable) [size=128M]
        Memory at e7800000 (32-bit, non-prefetchable) [size=128K]
        I/O ports at d800 [size=128]
        Expansion ROM at <unassigned> [disabled]
        Capabilities: <access denied>
        Kernel modules: sisfb
    Is there a way to make it faster? Alternative drivers? The additional drivers tool shows nothing. I'm specifically interested in improving Java's Java2D rendering speed, because I'll be running a "stat screen" written in that language on it.


  • Why is my Wacom Intuos tablet not detected?

    - by mjwittering
    I need a little help trying to install a Wacom Intuos tablet, model number CTL-480/S. My installation of Ubuntu 13.04, 64-bit, doesn't seem to be able to detect the device. I've tried a few different USB ports on my machine and get the same result. I believe there is an issue because when I open the System Settings app from the launcher and browse to the Wacom Tablet section under Hardware, it reports 'No tablet detected'. When I use lsusb I can see the device is detected:
    Bus 003 Device 004: ID 056a:030e Wacom Co., Ltd
    I've also pulled the following from the syslog:
    Oct 16 16:51:05 earth kernel: [ 7062.388031] usb 3-5: new full-speed USB device number 4 using ohci_hcd
    Oct 16 16:51:05 earth kernel: [ 7062.611038] usb 3-5: New USB device found, idVendor=056a, idProduct=030e
    Oct 16 16:51:05 earth kernel: [ 7062.611042] usb 3-5: New USB device strings: Mfr=1, Product=2, SerialNumber=0
    Oct 16 16:51:05 earth kernel: [ 7062.611045] usb 3-5: Product: Intuos PS
    Oct 16 16:51:05 earth kernel: [ 7062.611047] usb 3-5: Manufacturer: Wacom Co.,Ltd.
    Oct 16 16:51:05 earth mtp-probe: checking bus 3, device 4: "/sys/devices/pci0000:00/0000:00:02.0/usb3/3-5"
    Oct 16 16:51:05 earth mtp-probe: bus: 3, device: 4 was not an MTP device
    I'd really appreciate any suggestions to help debug and install this device.


  • Making an interactive 2D map

    - by Chad
    So recently I have been working on a Legend of Zelda: A Link to the Past clone, and I am wondering how I could handle certain map interactions (like cutting grass, lifting rocks, etc.). The way I am currently doing the tilemap is with two PNGs. The first is the "tilemap", where each pixel represents a 16x16 tile and the (red, green) values are the (x, y) coords of that tile in the second PNG (the "tileset"). I am then using the blue channel to store collision data. Each tile is split into four 8x8 sub-tiles, each represented by a 2-bit value (0 = empty, 1 = jump-down point, 2 = unused right now, 3 = blocking). Four of these 2-bit values make up the full blue channel (1 byte). So collisions work great, and I am moving on to putting interactive units on the level, but I am not sure what a good way to do that is. I have experimented with spawning an entity for each grass and rock, but there are just WAY too many; FPS just dies even if I confine it to the current "zone" the user is in (for those who remember LTTP, it had zones you moved between). It does make a difference that this is a browser-based JavaScript game. tl;dr: What is a good way to have an interactive map without using full-blown entities for each interactive item?
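
    For reference, the 2-bit packing described above can be read and written with a couple of shifts and masks. A small sketch, in C++ for concreteness (the sub-tile ordering is an assumption and just has to match whatever wrote the PNG); the same arithmetic ports directly to JavaScript:

        #include <cstdint>

        // The blue channel of one tilemap pixel packs four 2-bit collision values,
        // one per 8x8 sub-tile, exactly as described above.
        enum SubTileType : std::uint8_t {
            Empty    = 0,
            JumpDown = 1,
            Unused   = 2,
            Blocking = 3
        };

        // subTile is 0..3 (assumed order: 0 = top-left, 1 = top-right,
        // 2 = bottom-left, 3 = bottom-right).
        SubTileType collisionAt(std::uint8_t blueChannel, int subTile)
        {
            return static_cast<SubTileType>((blueChannel >> (subTile * 2)) & 0x3);
        }

        // Packing side, for completeness: write a 2-bit value into one of the four slots.
        std::uint8_t setCollision(std::uint8_t blueChannel, int subTile, SubTileType type)
        {
            const std::uint8_t mask = 0x3 << (subTile * 2);
            return (blueChannel & ~mask) | (static_cast<std::uint8_t>(type) << (subTile * 2));
        }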


  • Good book or tutorial for learning how to apply integration methods

    - by Cumatru
    I'm looking to animate a graph layout using edges as springs and nodes as weights (a node with more links will have a bigger weight). I'm not able to wrap my head around applying the mathematical and physics relations in my application. As far as I've read, Runge-Kutta 4 (preferably) or Verlet would be a good choice, but I have problems understanding how they really work and which physics equations I should apply. If I can't understand them, I can't use them. I'm looking for a book or a tutorial which describes these things.
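
    While looking for a reference, it may help to see how little code position Verlet actually needs for the spring/weight model described above. Everything in this sketch (the names, the damping constant, treating link count as mass) is an illustrative assumption:

        #include <cmath>
        #include <vector>

        struct Node {
            double x, y;         // current position
            double prevX, prevY; // position at the previous step (Verlet keeps no velocity)
            double mass;         // e.g. proportional to the node's link count
        };

        struct Edge { int a, b; double restLength; };

        // One position-Verlet step: spring forces from Hooke's law,
        // then x_{n+1} = 2*x_n - x_{n-1} + a*dt^2, with a little damping.
        void step(std::vector<Node>& nodes, const std::vector<Edge>& edges,
                  double stiffness, double dt)
        {
            std::vector<double> fx(nodes.size(), 0.0), fy(nodes.size(), 0.0);

            for (const Edge& e : edges) {
                double dx = nodes[e.b].x - nodes[e.a].x;
                double dy = nodes[e.b].y - nodes[e.a].y;
                double len = std::sqrt(dx * dx + dy * dy);
                if (len < 1e-9) continue;                      // avoid dividing by zero
                double f = stiffness * (len - e.restLength);   // Hooke's law: F = k * (len - rest)
                fx[e.a] += f * dx / len;  fy[e.a] += f * dy / len;
                fx[e.b] -= f * dx / len;  fy[e.b] -= f * dy / len;
            }

            const double damping = 0.98;  // keeps the layout from oscillating forever
            for (std::size_t i = 0; i < nodes.size(); ++i) {
                double ax = fx[i] / nodes[i].mass;
                double ay = fy[i] / nodes[i].mass;
                double newX = nodes[i].x + damping * (nodes[i].x - nodes[i].prevX) + ax * dt * dt;
                double newY = nodes[i].y + damping * (nodes[i].y - nodes[i].prevY) + ay * dt * dt;
                nodes[i].prevX = nodes[i].x;  nodes[i].prevY = nodes[i].y;
                nodes[i].x = newX;            nodes[i].y = newY;
            }
        }

    RK4 would evaluate the forces four times per step for higher accuracy, but for a layout animation that only needs to settle into a stable shape, Verlet's single force evaluation per step is usually enough.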


  • ATI (fglrx) Dual monitor / laptop hot-plugging

    - by Brendan Piater
    I feel like I've gone back 5 years on my desktop today. I'll try not to dump too much frustration here... I've been running 12.04 since alpha with the ATI open source drivers and the GNOME 3 desktop, and I've been generally very happy with them, with only small issues along the way. Of course the open driver does not support 3D acceleration 100%, so games like my newly purchased Amnesia from the Humble Bundle would not play. OK, no worries, the ATI driver is in the repos, so let me have a go, I thought. With all the testing that's been done on multi-monitor support, what could go wrong...?
    How I use my computer: it's a laptop with an HD 3670 card in it. I spend about 50% of the time working directly on the laptop (at home) and about 50% of the time working with an additional display connected (at work), in a multi-desktop environment.
    What's happening now:
    - I installed the drivers and things seemed to be working, save some other small bugs (not critical).
    - This morning I take my machine, plug the additional monitor into it, and nothing happens... ok fine.
    - I open "Displays" and try to configure the dual display; it won't work.
    - I open the ATI config "thing" (because it is a thing, a crap thing) and set up the monitors there. Reboot, it says (oh ffs, really... ok).
    - I reboot, log in, and wow, I get a GNOME 2 desktop (GNOME 3 fallback, I presume) and no multi-monitor... great. (screenshot: http://ubuntuone.com/5tFe3QNFsTSIGvUSVLsyL7 )
    - After getting into a situation where I had to Ctrl + Alt + Del to get out of a frozen display, I eventually manage to set up a single-display desktop on the "main" monitor. Ok, time to go home... unplug the monitor... nothing happens... oh boy, here we go...
    - I try Displays again, nothing, it just hangs the display... great. Crash all the apps and reboot...
    So it's been a trying day... What I really hope is that someone else has figured out how to avoid this PAIN. Please help with a solution that:
    - allows me to run fglrx (so I can run the games I want)
    - allows me to hot-plug a monitor to my laptop and remove it again
    - allows me to change the display setup to include the hot-plugged monitor (preferably automatically, like it did with the open drivers)
    Next best, if that's not possible:
    - switch between laptop-only display and monitor-only display easily (i.e. not having to reboot/logout/suspend etc.)
    I really appreciate the time of anyone who has a solution. Thanks in advance. Regards, Brendan
    PS: I guess I should file a bug about this too, so some direction as to the best place to file it would be appreciated too.


  • How to implement a score database in Android

    - by Michael Seun Araromi
    I am making a 2D game for Android using OpenGL ES. It is a space shooting game where the player shoots enemy ships. I want to keep track of the score for the number of enemy ships destroyed, and a record of a local high score; the score should be incremented whenever an enemy is destroyed. I also want a way of displaying both the score and the high score on the game screen. I am not familiar with databases at all, and I would appreciate a clear answer or a link to a good tutorial for my case. Thanks


  • What's a good data structure solution for a scene manager in XNA?

    - by tunnuz
    Hello, I'm playing with XNA for a game project of mine. I had previous exposure to OpenGL and worked a bit with Ogre, so I'm trying to get the same concepts working in XNA. Specifically, I'm trying to add a scene manager to XNA to handle hierarchical transforms, frustum (maybe even occlusion) culling and transparent-object sorting. My plan was to build a tree scene manager to handle hierarchical transforms and lighting, and then use an octree for frustum culling and object sorting. The problem is how to do geometry sorting to support transparency correctly. I know that sorting is very expensive if done on a per-polygon basis, so expensive that it is not even done by Ogre. But still, images from Ogre look right. Any ideas on how to do it, and which data structures to use and what their capabilities are? I know people are using:
    - Octrees
    - Kd-trees (someone on the GameDev forum said that these are far better than octrees)
    - BSP trees (which should handle per-polygon ordering but are very expensive)
    - BVHs (but just for frustum and occlusion culling)
    Thank you, Tunnuz
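
    One common middle ground, and as far as I can tell the reason Ogre's output looks right without per-polygon sorting, is to draw opaque geometry first and then sort only the transparent objects per object, back to front by distance to the camera, after culling. A rough sketch of that pass (written in C++ here, but the idea carries over directly to C#/XNA; Renderable and draw() are placeholders for whatever the engine provides):

        #include <algorithm>
        #include <vector>

        // Placeholder for whatever the engine's drawable type is.
        struct Renderable {
            float worldX, worldY, worldZ;
            bool  transparent;
            // ... mesh, material, etc.
        };

        struct Camera { float x, y, z; };

        void draw(Renderable* r);   // provided elsewhere by the engine (placeholder)

        // Opaque geometry first (any order, the depth buffer handles it),
        // then transparent geometry back to front relative to the camera.
        void drawScene(const std::vector<Renderable*>& visible, const Camera& cam)
        {
            std::vector<Renderable*> transparent;
            for (Renderable* r : visible) {
                if (r->transparent) transparent.push_back(r);
                else                draw(r);
            }

            auto distSq = [&](const Renderable* r) {
                float dx = r->worldX - cam.x, dy = r->worldY - cam.y, dz = r->worldZ - cam.z;
                return dx * dx + dy * dy + dz * dz;
            };
            std::sort(transparent.begin(), transparent.end(),
                      [&](const Renderable* a, const Renderable* b) {
                          return distSq(a) > distSq(b);   // farthest drawn first
                      });

            // Typically rendered with depth test on but depth writes off.
            for (Renderable* r : transparent)
                draw(r);
        }

    Per-object sorting can still produce artifacts for large, intersecting transparent meshes, which is where BSP-style per-polygon approaches come in, but for most scenes it is the standard trade-off and fits naturally on top of an octree's culling output.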


  • Installing my graphics card on Ubuntu 12.04

    - by lamouchi amine
    I have an HP Pavilion G6 series 1225 i5 laptop with a Radeon HD 6470M switchable VGA. I installed Ubuntu 12.04 LTS but the VGA drivers don't work properly. I want to install the drivers on Ubuntu, but when I do, an error message like this appears: sorry, installation of this driver failed. Please have a look at the log file for details: /var/log/jockey.log I found a solution in a link: http://ubuntuforums.org/showthread.php?t=1930450 It works for a few steps, and then during the installation of the AMD driver package a 'fatal error' message pops up and it redirects me to Ask Ubuntu to find a solution. Please help me, I need to make it work.


  • Which paradigm to use for writing chess engine?

    - by poke
    If you were going to write a chess engine, what programming paradigm would you use (OOP, procedural, etc.) and why would you choose it? By chess engine, I mean the portion of a program that evaluates the current board and decides the computer's next move. I'm asking because I thought it might be fun to write a chess engine. Then it occurred to me that I could use it as a project for learning functional programming. Then it occurred to me that some problems aren't well suited to the functional paradigm. Then it occurred to me that this might be good discussion fodder.


  • What languages are most commonly used in medical research?

    - by Chris Taylor
    For someone about to go into a career in medical research, what language would be the most useful to learn? From my limited experience (I have been a researcher in mathematics and in finance) I have been able to recommend looking at R (for statistics), Matlab (for general numeric processing) and Python (for general-purpose programming with statistics/numerics as an add-on), but I don't know which of those (if any) are in common use, or if there are other, more specialized languages that are used. To be clear, I'm not talking about a professional programmer working in a medical setting. I am talking about a medical or genetics researcher who uses programming to analyse data, or generally to help get their work done.

