Search Results

Search found 35689 results on 1428 pages for 'development mode'.


  • OpenGL font rendering

    - by DEElekgolo
    I am trying to make an OpenGL text rendering class using FreeType. I was originally following this code, but it doesn't seem to work out for me; I get nothing regardless of what parameters I pass to Draw().

        class Font {
        public:
            Font() {
                if (FT_Init_FreeType(&ftLibrary)) {
                    printf("Could not initialize FreeType library\n");
                    return;
                }
                glGenBuffers(1, &iVerts);
            }

            bool Load(std::string sFont, unsigned int Size = 12.0f) {
                if (FT_New_Face(ftLibrary, sFont.c_str(), 0, &ftFace)) {
                    printf("Could not open font: %s\n", sFont.c_str());
                    return true;
                }
                iSize = Size;
                FT_Set_Pixel_Sizes(ftFace, 0, (int)iSize);
                FT_GlyphSlot gGlyph = ftFace->glyph;

                // Generating the texture atlas.
                // Rather than some amazing rectangular packing method, I'm just going
                // to have one long strip of letters with the height being that of the font size.
                int width = 0;
                int height = 0;
                for (int i = 32; i < 128; i++) {
                    if (FT_Load_Char(ftFace, i, FT_LOAD_RENDER)) {
                        printf("Error rendering letter %c for font %s.\n", i, sFont.c_str());
                    }
                    width += gGlyph->bitmap.width;
                    height += std::max(height, gGlyph->bitmap.rows);
                }

                // Generate the openGL texture
                glActiveTexture(GL_TEXTURE0);
                // if a texture exists then delete it
                iTexture ? glDeleteBuffers(1, &iTexture) : 0;
                glGenTextures(1, &iTexture);
                glBindTexture(GL_TEXTURE_2D, iTexture);
                glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
                glTexImage2D(GL_TEXTURE_2D, 0, GL_ALPHA, width, height, 0, GL_ALPHA, GL_UNSIGNED_BYTE, 0);
                glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
                glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
                glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
                glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

                // load the glyphs and set the glyph data
                int x = 0;
                for (int i = 32; i < 128; i++) {
                    if (FT_Load_Char(ftFace, i, FT_LOAD_RENDER)) {
                        // if it can't load the character
                        continue;
                    }
                    // load the glyph map into the texture
                    glTexSubImage2D(GL_TEXTURE_2D, 0, x, 0,
                        gGlyph->bitmap.width,
                        gGlyph->bitmap.rows,
                        GL_ALPHA,
                        GL_UNSIGNED_BYTE,
                        gGlyph->bitmap.buffer);
                    // move the "pen" down the strip
                    x += gGlyph->bitmap.width;
                    chars[i].ax = (float)(gGlyph->advance.x >> 6);
                    chars[i].ay = (float)(gGlyph->advance.y >> 6);
                    chars[i].bw = (float)gGlyph->bitmap.width;
                    chars[i].bh = (float)gGlyph->bitmap.rows;
                    chars[i].bl = (float)gGlyph->bitmap_left;
                    chars[i].bt = (float)gGlyph->bitmap_top;
                    chars[i].tx = (float)x / width;
                }
                printf("Loaded font: %s\n", sFont.c_str());
                return true;
            }

            void Draw(std::string sString, Vector2f vPos = Vector2f(0,0), Vector2f vScale = Vector2f(1,1)) {
                struct pPoint {
                    pPoint() { x = y = s = t = 0; }
                    pPoint(float a, float b, float c, float d) { x = a; y = b; s = c; t = d; }
                    float x, y;
                    float s, t;
                };
                pPoint* cCoordinates = new pPoint[6 * sString.length()];
                int n = 0;
                for (const char* p = sString.c_str(); *p; p++) {
                    float x2 = vPos.x() + chars[*p].bl * vScale.x();
                    float y2 = -vPos.y() - chars[*p].bt * vScale.y();
                    float w = chars[*p].bw * vScale.x();
                    float h = chars[*p].bh * vScale.y();
                    float x = vPos.x() + chars[*p].ax * vScale.x();
                    float y = vPos.y() + chars[*p].ay * vScale.y();
                    // skip characters with no pixels
                    // still advances though
                    if (!w || !h) {
                        continue;
                    }
                    // triangle one
                    cCoordinates[n++] = pPoint(x2,     -y2,     chars[*p].tx,                    0);
                    cCoordinates[n++] = pPoint(x2 + w, -y2,     chars[*p].tx + chars[*p].bw / w, 0);
                    cCoordinates[n++] = pPoint(x2,     -y2 - h, chars[*p].tx,                    chars[*p].bh / h);
                    // triangle two
                    cCoordinates[n++] = pPoint(x2 + w, -y2,     chars[*p].tx + chars[*p].bw / w, 0);
                    cCoordinates[n++] = pPoint(x2,     -y2 - h, chars[*p].tx,                    chars[*p].bh / h);
                    cCoordinates[n++] = pPoint(x2 + w, -y2 - h, chars[*p].tx + chars[*p].bw / w, chars[*p].bh / h);
                }
                glBindBuffer(GL_ARRAY_BUFFER, iVerts);
                glBindBuffer(GL_TEXTURE_2D, iTexture);
                // Vertices
                glEnableClientState(GL_VERTEX_ARRAY);
                glVertexPointer(2, GL_FLOAT, sizeof(pPoint), &cCoordinates[0].x);
                // TexCoord 0
                glClientActiveTexture(GL_TEXTURE0);
                glEnableClientState(GL_TEXTURE_COORD_ARRAY);
                glTexCoordPointer(2, GL_FLOAT, sizeof(pPoint), &cCoordinates[0].s);
                glCullFace(GL_NONE);
                glBufferData(GL_ARRAY_BUFFER, 6 * sString.length(), cCoordinates, GL_DYNAMIC_DRAW);
                glDrawArrays(GL_TRIANGLES, 0, n);
                glCullFace(GL_BACK);
                glBindBuffer(GL_ARRAY_BUFFER, 0);
                glBindBuffer(GL_TEXTURE_2D, 0);
                glDisableClientState(GL_VERTEX_ARRAY);
                glDisableClientState(GL_TEXTURE_COORD_ARRAY);
            }

            ~Font() {
                glDeleteBuffers(1, &iVerts);
                glDeleteBuffers(1, &iTexture);
            }

        private:
            unsigned int iSize;
            // openGL texture atlas
            unsigned int iTexture;
            // openGL geometry buffer
            unsigned int iVerts;
            FT_Library ftLibrary;
            FT_Face ftFace;
            struct Character {
                float ax, ay; // Advance
                float bw, bh; // bitmap size
                float bl, bt; // bitmap left and top
                float tx;     // x offset in the atlas
            } chars[128];
        };


  • How can I test if an oriented rectangle contains another oriented rectangle?

    - by gronzzz
    I have the following situation: to detect whether the red rectangle is inside the orange area, I use this function:

        - (BOOL)isTile:(CGPoint)tile insideCustomAreaMin:(CGPoint)min max:(CGPoint)max {
            if ((tile.x < min.x) || (tile.x > max.x) ||
                (tile.y < min.y) || (tile.y > max.y)) {
                NSLog(@" Object is out of custom area! ");
                return NO;
            }
            return YES;
        }

    But what if I need to detect whether the red tile is inside the blue rectangle? I wrote this function, which uses the world position:

        - (BOOL)isTileInsidePlayableArea:(CGPoint)tile {
            // get world positions from tiles
            CGPoint rt = [[CoordinateFunctions shared] worldFromTile:ccp(24, 0)];
            CGPoint lb = [[CoordinateFunctions shared] worldFromTile:ccp(24, 48)];
            CGPoint worldTile = [[CoordinateFunctions shared] worldFromTile:tile];
            return [self isTile:worldTile insideCustomAreaMin:ccp(lb.x, lb.y) max:ccp(rt.x, rt.y)];
        }

    How could I do this without converting to the global position of the tiles?
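
    The usual test for an oriented rectangle is to transform the point into the rectangle's local frame first, where the check becomes the same axis-aligned comparison as above. A minimal C# sketch of that idea (illustrative only; the math ports directly to the Objective-C code in the question, and any 2D vector type will do in place of XNA's Vector2):

        using System;
        using Microsoft.Xna.Framework; // for Vector2

        static bool ContainsPoint(Vector2 center, Vector2 halfExtents, float angle, Vector2 p)
        {
            // rotate the offset from the center by -angle into local space
            Vector2 d = p - center;
            float c = (float)Math.Cos(angle);
            float s = (float)Math.Sin(angle);
            float localX =  d.X * c + d.Y * s;
            float localY = -d.X * s + d.Y * c;

            // in local space the rectangle is axis-aligned
            return Math.Abs(localX) <= halfExtents.X && Math.Abs(localY) <= halfExtents.Y;
        }

    To test whether one oriented rectangle contains another, run this check on all four corners of the inner rectangle.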


  • How can I make video games if I don't like programming?

    - by hoper
    I am studying C++ in school (my major is computer programming). Honestly, my grades are not so good, and the assignments are really hard. Sometimes I feel sad that I will spend 8-10 hours per day coding (which is stressful) in my future job. But I still want to make video games. Maybe this is the only reason why I am taking all of these stressful courses. I always write down plots, stories, characters, fictional game worlds... Once, I thought I should study artistic technology such as game design and not computer technology such as C++, C#, etc. However, most popular game designers (or directors) such as Kojima, Miyamoto, etc. used to be good programmers. Companies actually assign programmers to director roles because they understand how to make a game. I've tried to find other colleges or universities that teach game design programs. However, one article that ranks the top 10 game design schools in North America seems untrustworthy, because the survey company scores schools only from interviews with students. Once, I tried to attend the Art Institute of Vancouver, which is ranked 7th according to that article. However, a programmer who used to be an instructor there told me the truth: the employment rate of its graduates is low. How can I have a future making games if I don't like programming?


  • How to update a game off a database

    - by James Clifton
    I am currently writing a sports strategy management game (cricket) in PHP, with a MySQL database, and I have come across one stumbling block: how do I update games where neither player is online? Cricket is a game played between two players, and when they (or one of them) are online, everything is fine; but what if neither player is online? This happens with championship games, which need to run at certain times for game reasons. At the moment I have a private web page that reloads every 5 seconds, and each time it loads, all games are updated; but then I have the problem that when my private web page stops (for example, my computer crashes or my web browser plays up), the game stops updating! Any suggestions?
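
    The usual way out of browser-driven polling is to let the server schedule the update itself. A hedged sketch, assuming a host that allows cron jobs and a hypothetical update_games.php that performs the same per-game update the private page currently triggers:

        # run the update every 5 minutes via the PHP CLI
        # (update_games.php is a placeholder name for the existing update script)
        */5 * * * * /usr/bin/php /path/to/update_games.php

    Inside that script, advancing each game by elapsed time rather than by "ticks" keeps results correct even if one scheduled run is missed.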


  • Problems with texture orientation in space

    - by frankie
    I am currently drawing a texture in 3D space and have some problems with its orientation. I'd like my textures to always be oriented front-face to the user. In my desired result, the text size stays unchanged as the world rotates, and the text stays facing the user. Right now I can draw text in 3D space, but it is not oriented toward the viewer; it rotates with the world. I got these results with the following shaders.

    Vertex shader:

        uniform vec3 Position;

        void main()
        {
            gl_Position = vec4(Position, 1.0);
        }

    Geometry shader:

        layout(points) in;
        layout(triangle_strip, max_vertices = 4) out;

        out vec2 fsTextureCoordinates;

        uniform mat4 projectionMatrix;
        uniform mat4 modelViewMatrix;
        uniform sampler2D og_texture0;
        uniform float og_highResolutionSnapScale;
        uniform vec2 u_originScale;

        void main()
        {
            vec2 halfSize = vec2(textureSize(og_texture0, 0)) * 0.5 * og_highResolutionSnapScale;
            vec4 center = gl_in[0].gl_Position;
            center.xy += (u_originScale * halfSize);

            vec4 v0 = vec4(center.xy - halfSize, center.z, 1.0);
            vec4 v1 = vec4(center.xy + vec2(halfSize.x, -halfSize.y), center.z, 1.0);
            vec4 v2 = vec4(center.xy + vec2(-halfSize.x, halfSize.y), center.z, 1.0);
            vec4 v3 = vec4(center.xy + halfSize, center.z, 1.0);

            gl_Position = projectionMatrix * modelViewMatrix * v0;
            fsTextureCoordinates = vec2(0.0, 0.0);
            EmitVertex();

            gl_Position = projectionMatrix * modelViewMatrix * v1;
            fsTextureCoordinates = vec2(1.0, 0.0);
            EmitVertex();

            gl_Position = projectionMatrix * modelViewMatrix * v2;
            fsTextureCoordinates = vec2(0.0, 1.0);
            EmitVertex();

            gl_Position = projectionMatrix * modelViewMatrix * v3;
            fsTextureCoordinates = vec2(1.0, 1.0);
            EmitVertex();
        }

    Fragment shader:

        in vec2 fsTextureCoordinates;
        out vec4 fragmentColor;

        uniform sampler2D og_texture0;
        uniform vec3 u_color;

        void main()
        {
            vec4 color = texture(og_texture0, fsTextureCoordinates);
            if (color.a == 0.0)
            {
                discard;
            }
            fragmentColor = vec4(color.rgb * u_color.rgb, color.a);
        }

    Any ideas how to get the result I want?

    EDIT 1: I made an edit in my geometry shader and got part of the label drawn in a corner of the screen, but it does not rotate:

        ..........
        vec4 centerProjected = projectionMatrix * modelViewMatrix * center;
        centerProjected /= centerProjected.w;

        vec4 v0 = vec4(centerProjected.xy - halfSize, 0.0, 1.0);
        vec4 v1 = vec4(centerProjected.xy + vec2(halfSize.x, -halfSize.y), 0.0, 1.0);
        vec4 v2 = vec4(centerProjected.xy + vec2(-halfSize.x, halfSize.y), 0.0, 1.0);
        vec4 v3 = vec4(centerProjected.xy + halfSize, 0.0, 1.0);

        gl_Position = og_viewportOrthographicMatrix * v0;
        ..........


  • Game physics / 2D Collision detection AS3

    - by Jery
    I know there are some methods you can use, like hitTestPoint and so on, but I want to see where my MovieClip collides with another MovieClip. Are there any other methods I can use? Also, by any chance, does somebody know a good introduction to game physics? I'm asking because I coded a small engine and pretty much the whole thing is spaghetti code, which is why I would like to know how to set something like this up properly.


  • How should we approach publishing our game from India?

    - by Praveen Sharath
    My friends and I are developing a game in Java for the Windows operating system. We have nearly completed it, and as we feel the game can make money, we wish to sell it. But in India we don't know any game publishing companies. We would like to know the following, if possible: 1) Will game publishers like BigFishGames.com publish our game even though we are not in the USA? We are just college students. 2) Are we required to start a company before we can sell it? 3) Is it possible to sign contracts if the publisher wishes to publish our game while the publisher and developer are in different countries? Thank you.


  • How do I properly use String literals for loading content?

    - by Dave Voyles
    I've been using verbatim string literals for some time now, and never quite thought about using a regular string literal until I started to come across them in Microsoft-provided XNA samples. With that said, I'm trying to implement a new AudioManager class from the Net Rumble sample. I have two (2) issues here.

    Question 1: In the code for my GameplayScreen screen I have a folder location written as the following, and it works fine:

        menuButton = content.Load<SoundEffect>(@"sfx/menuButton");
        menuClose = content.Load<SoundEffect>(@"sfx/menuClose");

    If you notice, you'll see that I'm using a verbatim string, with a forward slash "/". In the AudioManager class, they use a regular string literal, but with two backslashes "\\". I understand that one is used as an escape, but why are they BACK instead of FORWARD? (See below.)

        soundList[soundName] = game.Content.Load<SoundEffect>("audio\\wav\\" + soundName);

    Question 2: I seem to be doing everything correctly in my AudioManager class, but I'm not sure what this line means:

        audioFileList = audioFolder.GetFiles("*.xnb");

    I suppose that the *.xnb means look for everything BUT files that end in .xnb? I can't figure out what I'm doing wrong with my file locations, as the sound effects are not playing. My code is not much different from what I've linked to above.

        private AudioManager(Game game, DirectoryInfo audioDirectory)
            : base(game)
        {
            try
            {
                audioFolder = audioDirectory;
                audioFileList = audioFolder.GetFiles("*.mp3");
                soundList = new Dictionary<string, SoundEffect>();

                for (int i = 0; i < audioFileList.Length; i++)
                {
                    string soundName = Path.GetFileNameWithoutExtension(audioFileList[i].Name);
                    soundList[soundName] = game.Content.Load<SoundEffect>(@"sfx\" + soundName);
                    soundList[soundName].Name = soundName;
                }

                // Plays this track while the GameplayScreen is active
                soundtrack = game.Content.Load<Song>("boomer");
            }
            catch (NoAudioHardwareException)
            {
                // silently fall back to silence
            }
        }
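
    For what it's worth, here is a small illustrative sketch (not from the sample) of the two points in question. Search patterns in GetFiles are inclusive, and the question's own working example shows that Content.Load accepts forward slashes:

        // 1. "*.xnb" matches ONLY files whose names end in .xnb --
        //    it is an inclusive wildcard, not an exclusion.
        DirectoryInfo audioFolder = new DirectoryInfo("Content/audio");
        FileInfo[] compiled = audioFolder.GetFiles("*.xnb"); // compiled content files only

        // 2. These literals name the same content path; only the escape style differs.
        string a = "sfx\\menuButton";  // regular literal: backslash must be doubled
        string b = @"sfx\menuButton";  // verbatim literal: no escaping needed
        string c = @"sfx/menuButton";  // forward slashes also work, as in GameplayScreen

    One thing to watch (an observation, not a certainty about this project): GetFiles("*.mp3") in the constructor enumerates source .mp3 files, while the content pipeline deploys compiled .xnb files, so the pattern chosen decides whether anything is found at run time.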


  • Beat detection and FFT

    - by Quincy
    So I am working on a platformer game which includes music with beat detection. I am currently using a simple rule: if the energy stored in the history buffer is smaller than the current energy, there is a beat. The problem with this is that for songs with a fairly steady amplitude, like rock songs, this isn't going to work. So I looked further and found algorithms that split the sound into multiple bands using an FFT. I then found this: http://en.literateprograms.org/Cooley-Tukey_FFT_algorithm_(C) The only problem I'm having is that I am quite new to audio, and I have no idea how to use it to split the signal up into multiple signals. So my question is: how do you use an FFT to split a signal into multiple bands? Also, for those interested, this is my algorithm in C#:

        // C = threshold, N = size of history buffer / 1024
        public void PlaceBeatMarkers(float C, int N)
        {
            List<float> instantEnergyList = new List<float>();
            short[] samples = soundData.Samples;
            float timePerSample = 1 / (float)soundData.SampleRate;
            int sampleIndex = 0;
            int nextSamples = 1024;

            // Calculate instant energy for every 1024 samples.
            while (sampleIndex + nextSamples < samples.Length)
            {
                float instantEnergy = 0;
                for (int i = 0; i < nextSamples; i++)
                {
                    instantEnergy += Math.Abs((float)samples[sampleIndex + i]);
                }
                instantEnergy /= nextSamples;
                instantEnergyList.Add(instantEnergy);

                if (sampleIndex + nextSamples >= samples.Length)
                    nextSamples = samples.Length - sampleIndex - 1;

                sampleIndex += nextSamples;
            }

            int index = N;
            int numInBuffer = index;
            float historyBuffer = 0;

            // Fill the history buffer with N instant energies
            for (int i = 0; i < index; i++)
            {
                historyBuffer += instantEnergyList[i];
            }

            // If the buffer's average energy, scaled by C, is below the next
            // sample's instant energy, add a beat marker.
            while (index + 1 < instantEnergyList.Count)
            {
                if (instantEnergyList[index + 1] > (historyBuffer / numInBuffer) * C)
                    beatMarkers.Add((index + 1) * 1024 * timePerSample);

                historyBuffer -= instantEnergyList[index - numInBuffer];
                historyBuffer += instantEnergyList[index + 1];
                index++;
            }
        }
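
    In the usual multi-band scheme, each 1024-sample window is run through the FFT, the squared magnitudes of the spectrum are grouped into bands, and the same history-buffer comparison is then applied per band instead of to the whole signal. A hedged sketch, assuming some FFT routine (for instance one adapted from the linked Cooley-Tukey page) has already produced real/imaginary arrays for one window:

        // Group the usable FFT bins into equal-width bands and return the
        // average energy of each band. For a 1024-point FFT of real input,
        // only bins 0..511 carry unique information, and bin k corresponds
        // to frequency k * sampleRate / 1024.
        static float[] BandEnergies(float[] re, float[] im, int bands)
        {
            int binsPerBand = 512 / bands;
            float[] energy = new float[bands];
            for (int b = 0; b < bands; b++)
            {
                for (int k = b * binsPerBand; k < (b + 1) * binsPerBand; k++)
                {
                    energy[b] += re[k] * re[k] + im[k] * im[k]; // squared magnitude
                }
                energy[b] /= binsPerBand;
            }
            return energy;
        }

    Equal-width bands are the simplest choice; logarithmically sized bands match hearing better, since most musical energy sits in the low bins.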


  • How does a segment based rendering engine work?

    - by Calmarius
    As far as I know, Descent was one of the first games that featured a fully 3D environment, and it used a segment-based rendering engine. Its levels are built from cubic segments (these cubes may be deformed, as long as they remain convex and their sides remain roughly flat). The cubes are connected by their sides. The connected sides are traversable (doors or grids may be placed on them), while the unconnected sides are untraversable walls. The game is played inside this complex. Descent was software-rendered and had to be very fast to be playable on the 10-100 MHz processors of that age. Some later levels of the game are huge and contain thousands of segments, yet they are still rendered reasonably fast, so I think the engine somehow minimized the number of cubes rendered. How do you choose which cubes to render for a given location? As far as I know they used a kind of portal rendering, but I couldn't find the technique used in this particular kind of engine. I think the fact that the levels are built from convex quadrilateral hexahedrons can be exploited.
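
    One plausible portal-style scheme for such a segment world (a hedged C# sketch with hypothetical Segment/Side/Rect types, not Descent's actual code): start in the cube containing the camera and recurse outward only through connected sides whose on-screen portal rectangle is still non-empty.

        static void RenderSegment(Segment seg, Rect screenBounds, HashSet<Segment> visited)
        {
            if (!visited.Add(seg)) return;    // render each cube at most once per frame
            DrawSolidSides(seg);              // the unconnected sides are the walls

            foreach (Side side in seg.ConnectedSides)
            {
                // project the portal polygon and clip it against the screen
                // region we entered through; if nothing remains, nothing
                // behind this portal can be visible from here
                Rect portal = ProjectToScreen(side.PortalPolygon);
                Rect clipped = Rect.Intersect(screenBounds, portal);
                if (!clipped.IsEmpty)
                    RenderSegment(side.Neighbor, clipped, visited);
            }
        }

    Because every segment is convex, the walls of a single segment never overlap each other when seen from inside it, which is part of what made this approach cheap in a software renderer.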


  • Why aren't tangent space normal maps completely blue?

    - by seahorse
    Why aren't normal maps just blue? I would think that normal maps should be predominantly blue in color because the Z component of the normal is represented by blue. Normals point out of the surface in the Z direction so we should see blue as the predominant colour since the Z component is dominant. By definition tangent space is perpendicular to the surface. At any point we should have the normal always pointing in the Z (blue direction) with no X (red direction) or Y (green direction). Thus the normal map (since it is a "normal map") should have the colour of the normals which is just blue (R = x = 0, G = y = 0, B = z = 1) with no shades in between. But normal maps are not so, and they have gradients of shades in them. Why is this so?


  • multipass shadow mapping renderer in XNA

    - by Nick
    I want to implement a multipass renderer in XNA (additive blending combines the contributions from each light). I have the renderer working without any shadows, but when I try to add shadow mapping support, I run into an issue with switching render targets to draw the shadow maps. When I switch render targets, I lose the contents of the backbuffer, which ruins the whole additive blending idea. For example:

        Draw()
        {
            DrawAmbientLighting()
            foreach (DirectionalLight)
            {
                DrawDirectionalShadowMap()
                // <-- I lose all previous lighting contributions when I
                //     switch to the shadow map render target here
                DrawDirectionalLighting()
            }
        }

    Is there any way around my issue? (I could render all the shadow maps first, but then I have to make and hold onto a render target for each light that casts a shadow -- is this the only way?)
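
    One way around it, sketched below under the assumption of XNA 4.0: accumulate the lighting in a dedicated render target created with RenderTargetUsage.PreserveContents, so switching to the shadow-map target and back does not discard what has already been drawn. This keeps the interleaved loop and needs only one reusable shadow-map target:

        RenderTarget2D lightAccum = new RenderTarget2D(
            GraphicsDevice, width, height, false,
            SurfaceFormat.Color, DepthFormat.Depth24,
            0, RenderTargetUsage.PreserveContents); // contents survive target switches

        GraphicsDevice.SetRenderTarget(lightAccum);
        GraphicsDevice.Clear(Color.Black);
        DrawAmbientLighting();

        foreach (var light in directionalLights)
        {
            GraphicsDevice.SetRenderTarget(shadowMap);  // one reusable shadow target
            DrawDirectionalShadowMap(light);

            GraphicsDevice.SetRenderTarget(lightAccum); // previous light passes intact
            DrawDirectionalLighting(light, shadowMap);  // additive blend
        }

        GraphicsDevice.SetRenderTarget(null);
        // composite lightAccum to the backbuffer as a final full-screen pass

    PreserveContents costs some performance on platforms that have to emulate it, which is the usual argument for the render-all-shadow-maps-first alternative mentioned in the question.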


  • How can I draw an arrow at the edge of the screen pointing to an object that is off screen?

    - by Adam Henderson
    I wish to do what is described in this topic: http://www.allegro.cc/forums/print-thread/283220 I have attempted a variety of the methods mentioned there. First I tried to use the method described by Carrus85:

    "Just take the ratio of the two triangle hypotenuses (it doesn't matter which triangle you use for the other; I suggest point 1 and point 2 as the distance you calculate). This will give you the aspect ratio percentage of the triangle in the corner from the larger triangle. Then you simply multiply deltax by that value to get the x-coordinate offset, and deltay by that value to get the y-coordinate offset."

    But I could not find a way to calculate how far the object is away from the edge of the screen. I then tried using ray casting (which I have never done before), suggested by 23yrold3yrold:

    "Fire a ray from the center of the screen to the offscreen object. Calculate where on the rectangle the ray intersects. There's your coordinates."

    I first calculated the hypotenuse of the triangle formed by the difference in x and y positions of the two points. I used this to create a unit vector along that line. I looped along that vector until either the x coordinate or the y coordinate was off the screen. The two current x and y values then form the x and y of the arrow. Here is the code for my ray casting method (written in C++ and Allegro 5):

        void renderArrows(Object* i)
        {
            float x1 = i->getX() + (i->getWidth() / 2);
            float y1 = i->getY() + (i->getHeight() / 2);
            float x2 = screenCentreX;
            float y2 = ScreenCentreY;

            float dx = x2 - x1;
            float dy = y2 - y1;
            float hypotSquared = (dx * dx) + (dy * dy);
            float hypot = sqrt(hypotSquared);

            float unitX = dx / hypot;
            float unitY = dy / hypot;

            float rayX = x2 - view->getViewportX();
            float rayY = y2 - view->getViewportY();
            float arrowX = 0;
            float arrowY = 0;

            bool posFound = false;
            while (posFound == false)
            {
                rayX += unitX;
                rayY += unitY;
                if (rayX <= 0 || rayX >= screenWidth ||
                    rayY <= 0 || rayY >= screenHeight)
                {
                    arrowX = rayX;
                    arrowY = rayY;
                    posFound = true;
                }
            }

            al_draw_bitmap(sprite, arrowX - spriteWidth, arrowY - spriteHeight, 0);
        }

    This was relatively successful. Arrows are displayed in the bottom-right section of the screen when objects are located above and to the left of the screen, as if the locations where the arrows are drawn have been rotated 180 degrees around the center of the screen. I assume this is because when I calculate the hypotenuse of the triangle, it is always positive regardless of whether the difference in x or the difference in y is negative. Thinking about it, ray casting does not seem like a good way of solving the problem (due to the fact that it involves using sqrt() and a large loop). Any help finding a suitable solution would be greatly appreciated. Thanks, Adam
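
    A marching-free alternative (a hedged C# sketch of the same ray idea, solved in closed form): intersect the center-to-object direction with the screen rectangle directly. Scaling the direction by the smaller of the two edge ratios lands the point exactly on the border, and because the signs of dx and dy are preserved, it also avoids the 180-degree flip described above:

        using System;

        static void ArrowPosition(
            float centerX, float centerY, // screen center
            float dx, float dy,           // direction FROM center TOWARD the object
            float halfW, float halfH,     // half the screen size (minus any margin)
            out float arrowX, out float arrowY)
        {
            // t scales (dx, dy) so the point lands on the first edge hit;
            // a zero component yields +infinity, which Min discards safely
            float t = Math.Min(halfW / Math.Abs(dx), halfH / Math.Abs(dy));
            arrowX = centerX + dx * t;
            arrowY = centerY + dy * t;
        }

    Note the direction should be object minus center; the code in the question computes center minus object (dx = x2 - x1 with x2 the screen center), which is the likely source of the rotated arrow positions.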


  • When should a bullet texture be loaded in XNA?

    - by Bill
    I'm making a SpaceWar!-esque game using XNA. I want to limit my ships to 5 active bullets at any time. I have a Bullet DrawableGameComponent and a Ship DrawableGameComponent. My Ship has an array of 5 Bullet. What is the best way to manage the Bullet textures? Specifically, when should I be calling LoadTexture? Right now, my solution is to populate the Bullet array in the Ship's constructor, with LoadTexture being called in the Bullet constructor. The Bullet objects will be disabled/not visible except when they are active. Does the texture really need to be loaded once for each individual instance of the bullet object? This seems like a very processor-intensive operation. Note: This is a small-scale project, so I'm OK with not implementing a huge texture-management framework since there won't be more than half a dozen or so in the entire game. I'd still like to hear about scalable solutions for future applications, though.
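
    A common arrangement (a hedged sketch with hypothetical names, not the only way): load the texture once, outside the Bullet constructor, and hand each bullet a reference. XNA's ContentManager also caches internally, so repeated Load calls for the same asset return the same Texture2D instance rather than re-reading it from disk, but loading once makes the sharing explicit:

        class Ship
        {
            private Texture2D bulletTexture;
            private readonly Bullet[] bullets = new Bullet[5];

            public void LoadContent(ContentManager content)
            {
                // "bullet" is a placeholder asset name
                bulletTexture = content.Load<Texture2D>("bullet"); // one load total
                for (int i = 0; i < bullets.Length; i++)
                    bullets[i] = new Bullet(bulletTexture);        // shared reference
            }
        }

    The same pattern scales later: a Dictionary<string, Texture2D> filled once at load time is already most of a small texture manager.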


  • How much info can I store in a cookie?

    - by Artemix
    Hi guys, I'm developing a Flash game and I'd like to know how much info I can store in a browser cookie. The game is simple, but it needs to store several variables (around 200) in order to save all the details of your current progress. The game is only one swf file; no server, no nothing. I need to know how I should use cookies to achieve this, and whether they make it possible at all, of course.


  • For normal mapping, why can we not simply add the tangent normal to the surface normal?

    - by sebf
    I am looking at implementing bump mapping (which in all implementations I have seen is really normal mapping), and so far everything I have read says that to do this, we create a matrix to convert from world space to tangent space, in order to transform the light and eye direction vectors into tangent space, so that the vectors from the normal map may be used directly in place of those passed through from the vertex shader. What I do not understand, though, is why we cannot just use the normalised sum of the sampled normal vector and the surface normal (assuming we already transform and pass through the surface normal for the existing lighting functions). Take the diagram below; the normal is simply the deviation from the 'reference normal' for any given coordinate system, correct? And transforming the surface normal of a mapped surface from world space to tangent space makes it equivalent to the tangent-space 'reference normal', no? If so, why do we transform all lighting vectors into tangent space, instead of simply transforming the sampled tangent-space normal once in the pixel shader?


  • Push back rectangle where collision happens

    - by Tifa
    I have tile collision in a game I am creating, but the problem is that once a collision happens on, for example, the right side, my sprite can't move up or down :( That's because I set the speed to 0, which I think is wrong. Here is my code:

        int startX, startY, endX, endY;
        float pushx = 0, pushy = 0;

        // move player
        if (Gdx.input.isKeyPressed(Input.Keys.LEFT)) {
            dx = -1;
            currentWalk = leftWalk;
        }
        if (Gdx.input.isKeyPressed(Input.Keys.RIGHT)) {
            dx = 1;
            currentWalk = rightWalk;
        }
        if (Gdx.input.isKeyPressed(Input.Keys.DOWN)) {
            dy = -1;
            currentWalk = downWalk;
        }
        if (Gdx.input.isKeyPressed(Input.Keys.UP)) {
            dy = 1;
            currentWalk = upWalk;
        }

        sr.setProjectionMatrix(camera.combined);
        sr.begin(ShapeRenderer.ShapeType.Line);

        Rectangle koalaRect = rectPool.obtain();
        koalaRect.set(player.getX(), player.getY(), pw, ph / 2);

        float oldX = player.getX(), oldY = player.getY(); // THIS LINE WAS ADDED
        player.setXY(player.getX() + dx * Gdx.graphics.getDeltaTime() * 4f,
                     player.getY() + dy * Gdx.graphics.getDeltaTime() * 4f); // THIS LINE WAS MOVED HERE FROM DOWN BELOW

        if (dx > 0) {
            startX = endX = (int)(player.getX() + pw);
        } else {
            startX = endX = (int)(player.getX());
        }
        startY = (int)(player.getY());
        endY = (int)(player.getY() + ph);
        getTiles(startX, startY, endX, endY, tiles);
        for (Rectangle tile : tiles) {
            sr.rect(tile.x, tile.y, tile.getWidth(), tile.getHeight());
            if (koalaRect.overlaps(tile)) {
                // dx = 0;
                player.setX(oldX); // THIS LINE CHANGED
                Gdx.app.log("x", "hit " + player.getX() + " " + oldX);
                break;
            }
        }

        if (dy > 0) {
            startY = endY = (int)(player.getY() + ph);
        } else {
            startY = endY = (int)(player.getY());
        }
        startX = (int)(player.getX());
        endX = (int)(player.getX() + pw);
        getTiles(startX, startY, endX, endY, tiles);
        for (Rectangle tile : tiles) {
            if (koalaRect.overlaps(tile)) {
                // dy = 0;
                player.setY(oldY); // THIS LINE CHANGED
                // Gdx.app.log("y", "hit" + player.getY() + " " + oldY);
                break;
            }
        }

        sr.rect(koalaRect.x, koalaRect.y, koalaRect.getWidth(), koalaRect.getHeight() / 2);
        sr.setColor(Color.GREEN);
        sr.end();

    I want to push the sprite back when a collision happens, but I have no idea how :D Please help.
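
    A hedged sketch of the usual fix (C#-style pseudocode with hypothetical helpers, position taken as the bottom-left corner in a y-up world; the same structure maps onto the libGDX code above): resolve each axis separately, and instead of snapping all the way back to the old coordinate, clamp flush against the tile that was hit. The clamp is the "push back", and because the other axis is resolved independently, sliding along a wall keeps working:

        // move and resolve X first
        position.X += velocity.X * dt;
        foreach (var tile in TilesNear(position))
        {
            if (Overlaps(PlayerBounds(position), tile))
            {
                position.X = velocity.X > 0
                    ? tile.Left - playerWidth   // moving right: flush to tile's left edge
                    : tile.Right;               // moving left: flush to tile's right edge
            }
        }

        // then move and resolve Y
        position.Y += velocity.Y * dt;
        foreach (var tile in TilesNear(position))
        {
            if (Overlaps(PlayerBounds(position), tile))
            {
                position.Y = velocity.Y > 0
                    ? tile.Bottom - playerHeight // moving up: flush under the tile
                    : tile.Top;                  // moving down: flush on top of it
            }
        }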


  • Rotate a particle system

    - by Blueski
    Languages/libraries in use: C++, OpenGL, GLUT. Okay, here's the deal. I've got a particle system which shoots out alpha-blended textures to produce a flame. The system only keeps track of very basic things such as time alive, life, xyz and spread. The direction in which the flames are currently moving is purely based on other things going on in my code (I assume). My goal, however, is to attach the flame to the camera (DONE) and have the flame point in the direction my camera is facing (NOT WORKING). I've tried glRotate for x, y and z, but I can't get it to work properly. I'm currently using gluLookAt to move the camera, and I get the flame to follow the camera's XYZ by calling glTranslatef(camX, camY - offset, camZ); Any suggestions on how I can rotate the direction of the flame with the camera would be greatly appreciated. Here's an image of what I've got: http://i.imgur.com/YhV4w.png Notes: the crosshair depicts where the camera is facing; if I turn the camera, the flame doesn't follow the crosshair. Also asked here: http://stackoverflow.com/questions/9560396/rotate-a-particle-system but was referred here.
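
    Since gluLookAt already knows the view direction, one hedged approach (written as a small C#-style sketch for clarity; names are hypothetical) is to derive the camera's forward vector from the same eye/target values and use it as the particle emission direction, rather than rotating the whole system with glRotate:

        static void EmissionDirection(
            float eyeX, float eyeY, float eyeZ,          // gluLookAt eye position
            float targetX, float targetY, float targetZ, // gluLookAt target
            out float dirX, out float dirY, out float dirZ)
        {
            // forward = normalize(target - eye); emit particles along this
            dirX = targetX - eyeX;
            dirY = targetY - eyeY;
            dirZ = targetZ - eyeZ;
            float len = (float)Math.Sqrt(dirX * dirX + dirY * dirY + dirZ * dirZ);
            dirX /= len; dirY /= len; dirZ /= len;
        }

    Each particle's initial velocity then becomes this forward vector (plus spread), so the flame re-aims automatically whenever the camera turns.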


  • How does flocking algorithm work?

    - by Chan
    I have read and understand the basics of the flocking algorithm. Basically, we need three behaviors:

    1. Cohesion
    2. Separation
    3. Alignment

    From my understanding, it's like a state machine: every time we do an update (then draw), we check all the constraints on the three behaviors, and each behavior returns a Vector3, which is the "correct" orientation that an object should transform to. So my initial idea was:

        /// <summary>
        /// Objects stick together
        /// </summary>
        private Vector3 Cohesion()
        {
            Vector3 result = new Vector3(0.0f, 0.0f, 0.0f);
            return result;
        }

        /// <summary>
        /// Objects align
        /// </summary>
        private Vector3 Align()
        {
            Vector3 result = new Vector3(0.0f, 0.0f, 0.0f);
            return result;
        }

        /// <summary>
        /// Objects separate from each other
        /// </summary>
        private Vector3 Separate()
        {
            Vector3 result = new Vector3(0.0f, 0.0f, 0.0f);
            return result;
        }

    Then I searched online for pseudocode, but much of it involves velocity and acceleration plus other stuff, and that part confused me. In my game, all objects move at constant speed, and they have one leader. So can anyone share an idea of how to start implementing this flocking algorithm? Also, did I understand it correctly? (I'm using XNA 4.0)
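
    For a constant-speed flock, the three stubs can be filled in roughly like this (a hedged sketch of the classic boids rules; 'neighbors', 'Position', 'Forward' and 'separationRadius' are assumed fields, and each method returns a steering direction rather than a force):

        private Vector3 Cohesion()
        {
            if (neighbors.Count == 0) return Vector3.Zero;
            Vector3 center = Vector3.Zero;
            foreach (Boid b in neighbors) center += b.Position;
            center /= neighbors.Count;
            return Vector3.Normalize(center - Position);  // steer toward local center
        }

        private Vector3 Separate()
        {
            Vector3 push = Vector3.Zero;
            foreach (Boid b in neighbors)
            {
                Vector3 away = Position - b.Position;
                float d = away.Length();
                if (d > 0 && d < separationRadius)
                    push += away / (d * d);               // stronger push when closer
            }
            return push;
        }

        private Vector3 Align()
        {
            if (neighbors.Count == 0) return Vector3.Zero;
            Vector3 heading = Vector3.Zero;
            foreach (Boid b in neighbors) heading += b.Forward;
            return Vector3.Normalize(heading);            // match average heading
        }

    Per update, blend the three results (weights to taste), fold in a pull toward the leader, renormalize, and move at the fixed speed:

        Vector3 steer = Forward + w1 * Cohesion() + w2 * Separate() + w3 * Align();
        if (steer.LengthSquared() > 0) Forward = Vector3.Normalize(steer);
        Position += Forward * speed * dt;

    Because the speed never changes, no velocity/acceleration bookkeeping is needed; only the heading is steered.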


  • How does an Engine like Source process entities?

    - by Júlio Souza
    [background information] In the Source engine (and its antecessors, GoldSrc and the Quake engines), the game objects are divided into two types: world and entities. The world is the map geometry, and the entities are players, particles, sounds, scores, etc. (for the Source engine). Every entity has a think function, which does all the logic for that entity. So, if everything that needs to be processed derives from a base class with the think function, the game engine could store everything in a list and, on every frame, loop through it and call that function. At first glance this idea is reasonable, but it can consume too many resources if the game has a lot of entities. [end of background information] So, how does an engine like Source take care of (process, update, draw, etc.) the game objects?
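
    One common refinement (a hedged C# sketch, not Source's actual code) is to give every entity a "next think" time, so the loop only runs logic for entities that asked to think, instead of all of them every frame; idle entities cost nothing but a comparison:

        abstract class Entity
        {
            public double NextThinkTime;            // 0 = don't think
            public abstract void Think(double now); // sets NextThinkTime again if needed
        }

        class EntityList
        {
            private readonly List<Entity> entities = new List<Entity>();

            public void RunThinks(double now)
            {
                foreach (Entity e in entities)
                {
                    if (e.NextThinkTime > 0 && e.NextThinkTime <= now)
                    {
                        e.NextThinkTime = 0;  // Think() reschedules itself
                        e.Think(now);
                    }
                }
            }
        }

    An entity that needs per-frame logic just reschedules itself for "now + one tick", while a door sitting still schedules nothing at all.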


  • How to create per-vertex normals when reusing vertex data?

    - by Chris Smith
    I am displaying a cube using a vertex buffer object (gl.ELEMENT_ARRAY_BUFFER). This allows me to specify vertex indices rather than having duplicate vertices. In the case of displaying a simple cube, this means I only need eight vertices total, as opposed to three vertices per triangle, times two triangles per face, times six faces. Sound correct so far? My question is, how do I now deal with vertex attribute data such as color, texture coordinates, and normals when reusing vertices via the vertex buffer object? If I am reusing the same vertex data in my indexed vertex buffer, how can I differentiate when vertex X is used as part of the cube's front face versus the cube's left face? In both cases I would like the surface normal and texture coordinates to be different. I understand I could average the surface normal; however, I would like to render a cube, and that still doesn't work for texture coordinates. Is there a way to save memory using a vertex buffer object while being able to provide different vertex attribute data based on context? (Per-triangle would be ideal.) Or should I just duplicate each vertex for each context in which it gets rendered (so there is a one-to-one mapping between vertex, normal, color, etc.)? Note: I'm using OpenGL ES.
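
    The usual answer is the duplication described at the end of the question: a "vertex" is the full tuple of attributes, so corners shared by faces with different normals or UVs must be repeated. For a cube that means 4 vertices per face x 6 faces = 24 vertices (with 36 indices), instead of 8. A small illustrative layout (C#-style; field names are assumptions):

        struct Vertex
        {
            public float X, Y, Z;    // position (repeated across adjacent faces)
            public float Nx, Ny, Nz; // per-face normal
            public float U, V;       // per-face texture coordinates
        }

        // the corner at (1,1,1) appears three times, once per adjacent face:
        // front copy: position (1,1,1), normal (0,0,1)
        // right copy: position (1,1,1), normal (1,0,0)
        // top copy:   position (1,1,1), normal (0,1,0)

    Indexing still pays off within a face (the two triangles share two corners), and on larger smooth meshes, where neighboring faces genuinely share normals, it saves far more.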


  • How to integrate the .gdf with a specific exe for Games Explorer

    - by Kraemer
    Hello, I want to create an installer for a game, and after that have an icon put into Games Explorer on Windows Vista and Windows 7. I have created the GDF (game definition file), then built the script for the project and obtained the .h, .gdf and .rc files. But I can't compile the .rc file into an executable using Visual Studio 2010, to be used afterwards to create the installer. An error pops up after I set the executable path: "Could not load file or assembly 'Microsoft.VisualStudio.HpcDebugger.Impl, Version=10.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a' or one of its dependencies. The system cannot find the file specified." Any ideas what I'm doing wrong? I should mention that I've never worked with the GDF Editor or Visual Studio before. Any answer would be highly appreciated. Thanks!


  • Best practice settings Effect parameters in XNA

    - by hichaeretaqua
    I want to ask if there is a best practice for setting effect parameters in XNA, or in other words, what exactly happens when I call pass.Apply()? I can imagine multiple scenarios:

    1. Each time Apply() is called, all effect parameters are transferred to the GPU, and therefore how often I set a parameter has no real influence.
    2. Each time Apply() is called, only the parameters that were re-set are transferred, so Set operations that don't actually set a new value should be avoided.
    3. Each time Apply() is called, only the parameters whose values actually changed are transferred, so caching Set operations is useless.
    4. This whole question is pointless because none of the mentioned ways has any noteworthy impact on game performance.

    So the final question: is it useful to implement some caching of Set operations, like this?

        private Matrix _world;
        public Matrix World
        {
            get { return _world; }
            set
            {
                if (value == _world) return; // skip redundant sets
                _effect.Parameters["xWorld"].SetValue(value);
                _world = value;
            }
        }

    Thanking you in anticipation.


  • Corona SDK: Quality of support and resources?

    - by Nick Wiggill
    I've not used Corona SDK before and am looking into it for a friend. (He is also considering Unity.) I wonder what the support is like for Corona? While Unity has a great many customers and so the Unity team can often not address issues directly, the community at large is very helpful and there are many excellent resources: tutorials, forum posts, code resources. What is Corona like in this regard, and by comparison?


  • 2D Planet Gravity

    - by baked
    I'm trying to make a simple game where a spaceship is launched and then its path is affected by the gravity of planets, similar to this game: http://sciencenetlinks.com/interactives/gravity.html I wish to know how to replicate the effect the planets have on the spaceship in that game, so the spaceship can 'loop' around a planet to change direction. Using vectors, I have managed to achieve only some bogus results, where the spaceship loops in a huge ellipse around the planet or is only slightly affected by a planet's gravity. Thanks in advance. P.S. I have plenty of coding experience, just none to do with game dev.
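
    The core of it is Newtonian point gravity plus a stable integrator. A hedged C# sketch (Ship/Planet are hypothetical types; G and the masses are gameplay tuning constants, not physical values):

        static void Step(Ship ship, List<Planet> planets, float dt)
        {
            const float G = 1000f; // tuned for feel, not physics
            Vector2 accel = Vector2.Zero;

            foreach (Planet p in planets)
            {
                Vector2 toPlanet = p.Position - ship.Position;
                float r2 = toPlanet.LengthSquared();
                if (r2 < 1f) r2 = 1f; // clamp to avoid blow-up near the center

                // a = G * m / r^2, directed toward the planet
                accel += Vector2.Normalize(toPlanet) * (G * p.Mass / r2);
            }

            // semi-implicit Euler: update velocity first, THEN position.
            // Updating position with the old velocity (explicit Euler) gains
            // energy each step and produces exactly the runaway-ellipse look.
            ship.Velocity += accel * dt;
            ship.Position += ship.Velocity * dt;
        }

    If the pull feels too weak or too strong, tune G and the planet masses rather than the exponent; keeping the 1/r^2 falloff is what makes slingshot loops look right.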

