Search Results

Search found 43935 results on 1758 pages for 'development process'.

Page 605 of 1758

  • Game maps with "counties" that are split along lines that aren't necessarily straight

    - by pm_2
    I want to create a game environment that supports a 2D map. It's a really basic map, but it must be split along lines that are not necessarily straight, so imagine a country divided into counties. I then want to be able to detect drag/drop events within these counties. What I'm really looking for is a pointer on where to start (how this has been done before, and whether any existing libraries handle it), as I'm sure what I'm trying to do is not new, although I can't find a beginner's guide for it anywhere.
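    One common starting point, independent of any particular library, is to store each county as a polygon (an ordered list of boundary points) and hit-test the pointer against it with a point-in-polygon check. A minimal C# sketch under that assumption; the Point2 and County types are made up here for illustration:

        using System.Collections.Generic;

        public struct Point2
        {
            public float X, Y;
            public Point2(float x, float y) { X = x; Y = y; }
        }

        public class County
        {
            public List<Point2> Boundary = new List<Point2>();  // polygon vertices, in order

            // Ray-casting test: count how many edges a horizontal ray from the
            // point crosses; an odd count means the point is inside the polygon.
            public bool Contains(Point2 p)
            {
                bool inside = false;
                for (int i = 0, j = Boundary.Count - 1; i < Boundary.Count; j = i++)
                {
                    Point2 a = Boundary[i], b = Boundary[j];
                    bool crosses = (a.Y > p.Y) != (b.Y > p.Y) &&
                                   p.X < (b.X - a.X) * (p.Y - a.Y) / (b.Y - a.Y) + a.X;
                    if (crosses) inside = !inside;
                }
                return inside;
            }
        }

    A drag/drop handler can then loop over the counties and ask each one whether it contains the current pointer position.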

    Read the article

  • How to store bitmaps in memory?

    - by Geotarget
    I'm working with general-purpose image rendering and high-performance image processing, so I need to know how to store bitmaps in memory (24bpp/32bpp, compressed/raw, etc.). I'm not working with 3D graphics or DirectX/OpenGL rendering, so I don't need graphics-card-compatible bitmap formats. My questions:

    1. What is the "usual" or "normal" way to store bitmaps in memory (in C++ engines/projects)?
    2. How should bitmaps be stored for high-performance algorithms, so that read/write times are as fast as possible (fixed array? with/without padding? 24bpp or 32bpp)?
    3. How should bitmaps be stored in applications that handle a lot of bitmap data, to minimize memory usage (JPEG, or a faster [de]compression algorithm)?

    Some possible methods:

    - Use a fixed, packed 24bpp or 32bpp int[] array and access pixels through pointer arithmetic; all pixels are allocated in one contiguous memory chunk (which could be 1-10 MB).
    - Use a form of "sparse" storage where each line of the bitmap is allocated separately, using more memory overall but requiring smaller contiguous memory segments.
    - Store bitmaps in their compressed form (PNG, JPG, GIF, etc.) and unpack only when needed, reducing the amount of memory used; delete the unpacked data if it's not used for 10 seconds.
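    For the first method, the usual layout is a single packed buffer indexed as y * width + x. A minimal sketch of that layout (shown in C# rather than C++ purely for illustration; the class and member names are made up):

        // Packed 32bpp bitmap stored in one contiguous, row-major buffer.
        public sealed class PackedBitmap
        {
            public readonly int Width, Height;
            public readonly int[] Pixels;      // one 32-bit slot per pixel, no row padding

            public PackedBitmap(int width, int height)
            {
                Width = width;
                Height = height;
                Pixels = new int[width * height];
            }

            public int GetPixel(int x, int y) => Pixels[y * Width + x];
            public void SetPixel(int x, int y, int argb) => Pixels[y * Width + x] = argb;
        }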

    Read the article

  • Heightmap and Textures

    - by Robert
    I'm trying to find the "best way" to apply a texture to a heightmap with OpenGL 3.x. It's really hard to find anything on Google because the tutorials are old and they all use different methods; I'm really lost and I don't know what to use at all. Here is my (basic) code that generates the heightmap:

        float[] vertexes = null;
        float[] textureCoords = null;
        for (int x = 0; x < this.m_size.width; x++)
        {
            for (int y = 0; y < this.m_size.height; y++)
            {
                vertexes ~= [x, 1.0f, y];
                textureCoords ~= [cast(float)x / 50, cast(float)y / 50];
            }
        }

    As you can see, I don't know how to apply the texture at all (I was using / 50 for my tests). Result of that code: (screenshot not shown here). I would like to have something very basic like this: (screenshot not shown here; you can find more pics in his blog). Edit: my texture size is 1024x1024.

    Read the article

  • Infinite terrain shadows

    - by user35399
    I'm creating an infinite terrain engine which generates the terrain with either fractals or noise. How can I make dynamic shadows for the sun on this terrain if I don't know in advance what will be rendered in front of the sun? My terrain: the sun is the only light and it is directional; the terrain is generated on a plane which is positioned in front of the camera, frustum culled, and fits the size of the viewing frustum. It is height-mapped with a generated noise texture and uses tessellation shaders. Video: http://www.youtube.com/watch?v=tk6yFwYusOs (dynamic shadows with the infinite terrain).

    Read the article

  • this.BoundingBox.Intersects(Wall[0].BoundingBox) not working properly

    - by Pieter
    I seem to be having this problem a lot. I'm still learning XNA / C# and I'm trying to make a classic paddle-and-ball game. The problem I run into (and after debugging have no answer for) is that every time I run my game and press either of the movement keys, the paddle won't move. Debugging shows that it never gets to the movement part, but I can't understand why not. Here's my code:

        // This is the if statement checking for Left movement.
        if (keyboardState.IsKeyDown(Keys.Left) || keyboardState.IsKeyDown(Keys.A))
        {
            if (!CheckCollision(walls[0]))
            {
                Location.X -= Velocity;
            }
        }

        // This is the CheckCollision(Wall wall) method.
        public bool CheckCollision(Wall wall)
        {
            if (this.BoundingBox.Intersects(wall.BoundingBox))
            {
                return true;
            }
            return false;
        }

    As far as I can tell there should be absolutely no problem with this. I initialize the bounding box in the constructor whenever a new instance of Walls and Paddle is created:

        this.BoundingBox = new Rectangle(0, 0, Sprite.Width, Sprite.Height);

    Any idea why this isn't working? I have previously succeeded with the whole Location.X < Wall.Location.X + Wall.Texture.Width approach, but to me that seems like too much code when a simple boolean check could do it.
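    One thing worth checking (a guess, since the rest of the code isn't shown): the Rectangle is created at (0, 0) in the constructor and is never repositioned from Location, so the paddle's box and the wall's box may overlap permanently and CheckCollision will always return true. A hedged XNA 4.0-style sketch of keeping the box in sync, called once per update before any collision checks:

        using Microsoft.Xna.Framework;

        // Inside the Paddle class; rebuilds the bounding box from the
        // sprite's current Location so collision tests use fresh data.
        public void UpdateBoundingBox()
        {
            BoundingBox = new Rectangle(
                (int)Location.X,
                (int)Location.Y,
                Sprite.Width,
                Sprite.Height);
        }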

    Read the article

  • How to shift a vector based on the rotation of another vector?

    - by bpierre
    I'm learning 2D programming, so excuse my approximations, and please don't hesitate to correct me. I am just trying to fire a bullet from a player. I'm using HTML canvas (top-left origin). Here is a representation of my problem: the black vector represents the position of the player (the grey square); the green vector represents its direction; the red disc represents the target; the red vector represents the direction of a bullet, which will move toward the target (the red dotted line); and the blue cross represents the point from which I really want to fire the bullet (the blue dotted line represents its movement). This is how I draw the player (this is the player object; position, direction and dimensions are 2D vectors):

        ctx.save();
        ctx.translate(this.position.x, this.position.y);
        ctx.rotate(this.direction.getAngle());
        ctx.drawImage(this.image,
                      Math.round(-this.dimensions.x/2), Math.round(-this.dimensions.y/2),
                      this.dimensions.x, this.dimensions.y);
        ctx.restore();

    This is how I instantiate a new bullet:

        var bulletPosition = playerPosition.clone(); // Copy of the player position
        var bulletDirection = Vector2D.substract(targetPosition, playerPosition).normalize(); // Difference between the player and the target, normalized
        new Bullet(bulletPosition, bulletDirection);

    This is how I move the bullet (this is the bullet object):

        var speed = 5;
        this.position.add(Vector2D.multiply(this.direction, speed));

    And this is how I draw the bullet (this is the bullet object):

        ctx.save();
        ctx.translate(this.position.x, this.position.y);
        ctx.rotate(this.direction.getAngle());
        ctx.fillRect(0, 0, 3, 3);
        ctx.restore();

    How can I change the direction and position vectors of the bullet to ensure it stays on the blue dotted line? I think I should represent the shift with a vector, but I can't see how to use it.
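    One way to approach it, sketched here in C# purely for illustration (the two rotation lines translate directly to the Vector2D helpers above): treat the blue cross as a fixed offset in the player's local space, rotate that offset by the player's current angle, and add it to the player position to get the bullet's spawn point; the bullet direction can then be recomputed from that point toward the target. All names below are made up:

        using System;

        public static class MuzzleMath
        {
            // Rotates a local-space muzzle offset (measured with the player
            // facing angle 0) by the player's current angle and returns the
            // world-space point the bullet should spawn from.
            public static (float X, float Y) SpawnPoint(
                float playerX, float playerY, float playerAngle,
                float offsetX, float offsetY)
            {
                float cos = (float)Math.Cos(playerAngle);
                float sin = (float)Math.Sin(playerAngle);
                float worldX = offsetX * cos - offsetY * sin;
                float worldY = offsetX * sin + offsetY * cos;
                return (playerX + worldX, playerY + worldY);
            }
        }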

    Read the article

  • iPhone 3d Model format: .h file, .obj, or some other?

    - by T Reddy
    I'm beginning to write an iPhone game using OpenGL ES and I've come across a problem deciding what format my 3D models should be in. I've read (the link escapes me at the moment) that some developers prefer the models compiled into Objective-C .h files. Still others prefer .obj, as it is more portable (i.e. for deployment on non-iPhone platforms). Various 3D game engines seem to support many(?) formats, but I'm not going to use any of these engines, as I would like to actually learn OpenGL ES. Am I putting myself at a disadvantage here by not using a packaged engine? Thanks!

    Read the article

  • HLSL 5 interpolation issues

    - by metredigm
    I'm having issues with the depth components of my shadow-mapping shaders. The shadow map rendering shader is fine and works very well. The world rendering shader is more problematic. The only value which seems to definitely be off is the pixel's position from the light's perspective, which I pass in parallel to the position:

        struct Pixel
        {
            float4 position  : SV_Position;
            float4 light_pos : TEXCOORD2;
            float3 normal    : NORMAL;
            float2 texcoord  : TEXCOORD;
        };

    The reason I used the semantic TEXCOORD2 on the light-space pixel position is that I believe the problem lies with Direct3D's interpolation of values between shaders, and I started trying random semantics and also forcing linear and noperspective interpolation. In the world rendering shader, I observed in the pixel shader that the Z value of light_pos was always extremely close to, but less than, the W value. This resulted in a depth result of 0.999 or similar for every pixel. Here is the vertex shader code:

        struct Vertex
        {
            float3 position : POSITION;
            float3 normal   : NORMAL;
            float2 texcoord : TEXCOORD;
        };

        struct Pixel
        {
            float4 position  : SV_Position;
            float4 light_pos : TEXCOORD2;
            float3 normal    : NORMAL;
            float2 texcoord  : TEXCOORD;
        };

        cbuffer Camera : register(b0)
        {
            matrix world;
            matrix view;
            matrix projection;
        };

        cbuffer Light : register(b1)
        {
            matrix light_world;
            matrix light_view;
            matrix light_projection;
        };

        Pixel RenderVertexShader(Vertex input)
        {
            Pixel output;
            output.position = mul(float4(input.position, 1.0f), world);
            output.position = mul(output.position, view);
            output.position = mul(output.position, projection);
            output.light_pos = mul(float4(input.position, 1.0f), world);
            output.light_pos = mul(output.light_pos, light_view);
            output.light_pos = mul(output.light_pos, light_projection);
            output.texcoord = input.texcoord;
            output.normal = input.normal;
            return output;
        }

    I suspect interpolation to be the culprit, as I used the camera matrices in place of the light matrices in the vertex shader and had the same problem. The problem is evident because both of the same vectors were passed to a pixel from the VS, but only one of them showed a change in the PS. I have already thoroughly checked the validity of the matrices, the cbuffers, and the multiplications. I'm very stumped and have been trying to solve this for quite some time. Misc info: the light projection matrix and the camera projection matrix are the same, generated from D3DXMatrixPerspectiveFovLH() with an FOV of 60.0f * 3.141f / 180.0f, a near clipping plane of 0.1f, and a far clipping plane of 1000.0f. Any ideas on what is happening? (This is a repost of my question on Stack Overflow.)

    Read the article

  • Comparison between a value with static type Array and a possibly unrelated type Class

    - by Kaoru
    I got this error: "Comparison between a value with static type Array and a possibly unrelated type Class." It appeared after I split everything into many classes (before that, all of the functions were in one class). How do I solve this? I am using AS3 and the as3isolib library. Here is the code after I modified the function:

        if (Constant.dude.y < Constant._numY)
        {
            if (Constant.dude.sprites != marioBackClass)
            {
                Constant.dude.sprites = [marioBackClass];
                Constant.dudeDir = "Up";
            }
        }

    Here is the code before I split it into many classes:

        if (dude.y < _numY)
        {
            if (dude.sprites.toString() != marioBackClass.toString())
            {
                dude.sprites = [marioBackClass];
                dudeDir = "Up";
            }
        }

    Read the article

  • Interaction using Kinect in XNA

    - by Sweta Dwivedi
    So I have written a program to play a sound file whenever my RightHand joint touches the 3D model. It goes like this. Even though the code somehow works, it's not very accurate; for example, it will play the sound when my hand is slightly under my 3D object, not exactly on it. How do I make it more accurate? Here is the code (handX and handY are the values coming from the skeleton data, RightHand.Joint.X etc.); also, this calculation doesn't work with animated sprites, which I need it to do:

        foreach (_3DModel s in Solar)
        {
            float x = (float)Math.Floor(((handX * 0.5f) + 0.5f) * (resolution.X));
            float y = (float)Math.Floor(((handY * -0.5f) + 0.5f) * (resolution.Y));
            float z = (float)Math.Floor((handZ) / 4 * 20000);
            if (Math.Sqrt(Math.Pow(x - s.modelPosition.X, 2) + Math.Pow(y - s.modelPosition.Y, 2)) < 15)
            {
                //Exit();
                PlaySound("hyperspace_activate");
                Console.WriteLine("1" + "handx:" + x + "," + " " + "modelPos.X:" + s.modelPosition.X + "," + " " + "handY:" + y + "modelPos.Y:" + s.modelPosition.Y);
            }
            else
            {
                Console.WriteLine("2" + "handx:" + x + "," + " " + "modelPos.X:" + s.modelPosition.X + "," + " " + "handY:" + y + "modelPos.Y:" + s.modelPosition.Y);
            }
        }
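    A hedged guess at the inaccuracy, since the projection setup isn't shown here: the check above compares a screen-space hand position against the model's position while ignoring Z, so the fixed threshold of 15 only loosely matches the model's on-screen footprint. One common approach is to project the model's world position into screen space with Viewport.Project and compare the two 2D points; a sketch assuming XNA 4.0 and an existing view/projection matrix pair (all parameter names illustrative):

        using Microsoft.Xna.Framework;
        using Microsoft.Xna.Framework.Graphics;

        // Returns true when the hand's screen position is within hitRadius
        // pixels of the model's projected screen position.
        static bool HandTouchesModel(Viewport viewport, Matrix view, Matrix projection,
                                     Vector3 modelPosition, float handScreenX,
                                     float handScreenY, float hitRadius)
        {
            // Project the model's world position into screen space.
            Vector3 screen = viewport.Project(modelPosition, projection, view, Matrix.Identity);

            float dx = handScreenX - screen.X;
            float dy = handScreenY - screen.Y;
            return dx * dx + dy * dy < hitRadius * hitRadius;
        }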

    Read the article

  • Updates for IOS AppStore Multiplayer Game

    - by TobiHeidi
    I am developing a multiplayer game for the web, Android, and iOS. For the web and Android I can instantly push out new versions of my game because they support executing remotely loaded code, but with iOS I need to wait for Apple approval, which takes about 10 days. I want to push updates more than weekly. What if my server code changes so that the client MUST update? Do I run an old version of the server code just for iOS? How do other multiplayer devs handle this?

    Read the article

  • Why does my VertexDeclaration apparently not contain Position0?

    - by Phil
    I'm trying to get my code from issuing each individual draw call down to using at least a VertexBuffer, and preferably an IndexBuffer, but now that I'm attempting to test my code, I'm getting the error: "The current vertex declaration does not include all the elements required by the current vertex shader. Position0 is missing." This makes absolutely no sense to me, as my VertexDeclaration is:

        public readonly static VertexDeclaration VertexDeclaration = new VertexDeclaration(
            new VertexElement(0, VertexElementFormat.Vector3, VertexElementUsage.Position, 0),
            new VertexElement(sizeof(float) * 3, VertexElementFormat.Color, VertexElementUsage.Color, 0),
            new VertexElement(sizeof(float) * 3 + 4, VertexElementFormat.Vector3, VertexElementUsage.Normal, 0)
        );

    which clearly contains the position information. I am attempting to draw with the following lines:

        VertexBuffer vb = new VertexBuffer(GraphicsDevice, VertexPositionColorNormal.VertexDeclaration,
                                           c.VertexList.Count, BufferUsage.WriteOnly);
        IndexBuffer ib = new IndexBuffer(GraphicsDevice, typeof(int), c.IndexList.Count, BufferUsage.WriteOnly);
        vb.SetData<VertexPositionColorNormal>(c.VertexList.ToArray());
        ib.SetData<int>(c.IndexList.ToArray());
        GraphicsDevice.DrawIndexedPrimitives(PrimitiveType.TriangleList, 0, 0, vb.VertexCount, 0, c.IndexList.Count / 3);

    where c is a Chunk class containing an 8x8x8 array of boxes. The full code is available at https://github.com/mrbaggins/Box/tree/ProperMeshing/box/box. The relevant locations are Chunk.cs (which contains the VertexDeclaration) and Game1.cs (Draw() is in lines 230-250). Not much else is relevant to this problem. Note that the large commented sections are from an old version of the drawing code.
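    For reference, and hedged because the snippet above may simply omit it: in XNA 4.0 the vertex and index buffers have to be bound to the device and an effect pass applied before DrawIndexedPrimitives, and the declaration the device validates against comes from whichever vertex buffer is bound at draw time. A minimal sketch of that sequence, assuming an effect variable (for example a BasicEffect) that isn't shown in the question:

        // Bind the buffers to the device before issuing the indexed draw.
        GraphicsDevice.SetVertexBuffer(vb);
        GraphicsDevice.Indices = ib;

        foreach (EffectPass pass in effect.CurrentTechnique.Passes)
        {
            pass.Apply();  // uploads this pass's shader and state
            GraphicsDevice.DrawIndexedPrimitives(
                PrimitiveType.TriangleList,
                0,                       // baseVertex
                0,                       // minVertexIndex
                vb.VertexCount,          // numVertices
                0,                       // startIndex
                c.IndexList.Count / 3);  // primitiveCount
        }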

    Read the article

  • Problem creating levels using inherited classes/polymorphism

    - by Adam
    I'm trying to write my level classes by having a base class that each level class inherits from. The base class uses pure virtual functions. The base class is only going to be used as a vector that will have the inherited level classes pushed onto it. This is what my code looks like at the moment; I've tried various things and get the same result (segmentation fault).

        //level.h
        class Level
        {
        protected:
            Mix_Music *music;
            SDL_Surface *background;
            SDL_Surface *background2;
            vector<Enemy> enemy;
            bool loaded;
            int time;
        public:
            Level();
            virtual ~Level();
            int bgX, bgY;
            int bg2X, bg2Y;
            int width, height;
            virtual void load();
            virtual void unload();
            virtual void update();
            virtual void draw();
        };

        //level.cpp
        Level::Level()
        {
            bgX = 0;
            bgY = 0;
            bg2X = 0;
            bg2Y = 0;
            width = 2048;
            height = 480;
            loaded = false;
            time = 0;
        }

        Level::~Level()
        {
        }

        //virtual functions are empty...

    I'm not sure exactly what I'm supposed to include in the inherited class structure, but this is what I have at the moment:

        //level1.h
        class Level1 : public Level
        {
        public:
            Level1();
            ~Level1();
            void load();
            void unload();
            void update();
            void draw();
        };

        //level1.cpp
        Level1::Level1()
        {
        }

        Level1::~Level1()
        {
            enemy.clear();
            Mix_FreeMusic(music);
            SDL_FreeSurface(background);
            SDL_FreeSurface(background2);
            music = NULL;
            background = NULL;
            background2 = NULL;
            Mix_CloseAudio();
        }

        void Level1::load()
        {
            music = Mix_LoadMUS("music/song1.xm");
            background = loadImage("image/background.png");
            background2 = loadImage("image/background2.png");
            Mix_OpenAudio(48000, MIX_DEFAULT_FORMAT, 2, 4096);
            Mix_PlayMusic(music, -1);
        }

        void Level1::unload()
        {
        }

        //functions have level-specific code in them...

    Right now, for testing purposes, I just have the main loop declare a Level1 level1; and use its functions, but when I run the game I get a segmentation fault. This is the first time I've tried writing inherited classes, so I know I'm doing something wrong, but I can't seem to figure out what exactly.

    Read the article

  • Issues implementing arcball viewer

    - by Pris
    My scene has a simple cube and a camera built with the lookAt function (I'm using OpenGL). The scene renders fine, and I'm sure I have my model/view/projection matrices set up correctly. Now I'm trying to implement arcball rotation for my camera, but I'm having some trouble. I've got as far as calculating the angle/axis rotation for a virtual sphere in normalized screen coordinates: when I move my mouse left to right I get an angle around the Y axis, and moving the mouse up/down gives me an angle about X. I'm not sure where to go from here: what do I need to do with the axis so I can apply the angle to simulate camera rotation about its viewpoint? If I directly apply the axis/angle rotation to the camera/view transform, I get what you'd expect: the view is rotated about the world axes that the mouse movement over the virtual sphere corresponds to. So if I move the mouse up/down, the view rotates about the world's X axis (what I get reminds me of a first-person view), but this isn't what I want. I think the axis I get needs to be transformed so it passes through the camera viewpoint and is oriented correctly in reference to the camera, but I don't know if that's right or how to do it.

    Read the article

  • Random Position between ranges.

    - by blakey87
    Does anyone have a good algorithm for generating a random Y position at which to spawn a block, taking into account a minimum and maximum height, so that the player can always jump onto the block? Blocks will be spawned continually, so the player must always be able to jump onto the next block, bearing in mind that the minimum position is the ground and the maximum is the player's jump height (bounded by the ceiling).
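    A hedged sketch of the usual approach: clamp the random range for each new block so it is never more than one jump above the previous block and always between the ground and the ceiling. This assumes Y increases upward (flip the comparison for a screen-space Y that grows downward); all names are illustrative:

        using System;

        // Picks the Y position of the next block so it is always reachable
        // from the previous one. jumpHeight is the maximum rise the player
        // can clear from a standing position on a block.
        static float NextBlockY(Random rng, float lastY, float groundY,
                                float ceilingY, float jumpHeight)
        {
            float min = groundY;                                 // dropping down is always allowed
            float max = Math.Min(lastY + jumpHeight, ceilingY);  // never more than one jump up
            return min + (float)rng.NextDouble() * (max - min);
        }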

    Read the article

  • 2D XNA Game Engine with a Good Wiki [closed]

    - by gcx
    I'm a newbie game developer. I'm planning to develop an Xbox game (with a Kinect to double the fun). I've researched some 2D game engines that I could use in my project, and I've found the IceCream engine, which looks delicious with its Milkshake editor. But I can't seem to find "working" game source examples for that engine, and its own website's tutorial is not very sufficient. If you are familiar with this engine, do you know any community that has helpful resources for it? If not, which engines (with a great wiki) do you recommend for an XNA-based Xbox and Kinect game?

    Read the article

  • Box2D platformer movement. Should i mess with velocity?

    - by Romeo
    I have a platformer game in which I implemented the movement using a wheel attached to the hero. For jumping I use this: player.body.applyLinearImpulse(new Vec2(0, 30000000), player.body.getPosition()); The problem is that the X velocity doesn't remain the same during the jump, so it doesn't look natural. Is there any way to modify only the X velocity of the body, so that before jumping I store it in a variable and after jumping I apply it back to the body? I hope you understand what I am trying to say.

    Read the article

  • How should I determine direction from a phone's orientation & accelerometer?

    - by Manoj Kumar
    I have an Android application which moves a ball based on the orientation of the phone. I've been using the following code to extract the sensor data, but how do I use it to determine what direction the ball should actually travel in?

        public void onSensorChanged(int sensor, float[] values) {
            // TODO Auto-generated method stub
            synchronized (this) {
                Log.d("HIIIII :- ", "onSensorChanged: " + sensor + ", x: " + values[0] + ", y: " + values[1] + ", z: " + values[2]);
                if (sensor == SensorManager.SENSOR_ORIENTATION) {
                    System.out.println("Orientation X: " + values[0]);
                    System.out.println("Orientation Y: " + values[1]);
                    System.out.println("Orientation Z: " + values[2]);
                }
                if (sensor == SensorManager.SENSOR_ACCELEROMETER) {
                    System.out.println("Accel X: " + values[0]);
                    System.out.println("Accel Y: " + values[1]);
                    System.out.println("Accel Z: " + values[2]);
                }
            }
        }

    Read the article

  • Get collision details from Rectangle.Intersects()

    - by Daniel Ribeiro
    I have a Breakout game in which, at some point, I detect the collision between the ball and the paddle with something like this:

        // Ball class
        rectangle.Intersects(paddle.Rectangle);

    Is there any way I can get the exact coordinates of the collision, or any details about it, with the current XNA API? I thought of doing some basic calculations, such as comparing the exact coordinates of each object at the moment of the collision. It would look something like this:

        // Ball class
        if ((rectangle.X - paddle.Rectangle.X) < (paddle.Rectangle.Width / 2))
            // Collision happened on the left side
        else
            // Collision happened on the right side

    But I'm not sure this is the correct way to do it. Do you have any tips, maybe an engine I might have to use to achieve that, or even good coding practices for this approach?
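    One hedged option with the stock XNA API: the static Rectangle.Intersect returns the overlapping region of the two rectangles, and its centre and shape can stand in for the contact point and the side that was hit. A sketch assuming XNA 4.0's Rectangle type:

        using Microsoft.Xna.Framework;

        // Returns true when the ball and paddle overlap, and reports the
        // approximate contact point (the centre of the overlap region).
        static bool TryGetContact(Rectangle ball, Rectangle paddle, out Point contact)
        {
            Rectangle overlap = Rectangle.Intersect(ball, paddle);
            contact = overlap.Center;

            // A degenerate (zero-sized) overlap means there was no collision.
            return overlap.Width > 0 && overlap.Height > 0;
        }

    Comparing overlap.Width against overlap.Height is also a cheap way to guess whether the hit landed on the top/bottom or on the side of the paddle.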

    Read the article

  • How do I do random isometric paths?

    - by user406470
    I'm working on an isometric city generator, and I am looking for a little push in the right direction. I'm looking to randomly generate roads on an isometric plane. I have never done pathfinding before, and when I googled it I didn't find any articles relating to what I am trying to do. Basically, my program generates a random isometric city, and I am hoping to add roads to it. Any help is appreciated!
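    One simple technique, offered as a sketch rather than the standard answer: generate roads as a random walk ("drunkard's walk") across the tile grid, marking visited tiles as road. The isometric projection only matters when drawing, so generation can happen in plain column/row grid space. All names below are illustrative:

        using System;

        // Marks a rough road by walking randomly across a width x height grid.
        static bool[,] GenerateRoad(int width, int height, int steps, Random rng)
        {
            bool[,] road = new bool[width, height];
            int x = rng.Next(width), y = rng.Next(height);

            int[] dx = { 1, -1, 0, 0 };  // the four grid neighbours
            int[] dy = { 0, 0, 1, -1 };

            for (int i = 0; i < steps; i++)
            {
                road[x, y] = true;
                int m = rng.Next(4);
                x = Math.Max(0, Math.Min(width - 1, x + dx[m]));  // clamp to the map
                y = Math.Max(0, Math.Min(height - 1, y + dy[m]));
            }
            return road;
        }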

    Read the article

  • XNA Drag Gestures - fractional delta values

    - by Den
    I have an issue with objects moving roughly twice as far as expected when dragging them. I am comparing my application to the standard TouchGestureSample from MSDN. For some reason, in my application the gesture samples have fractional positions and deltas. Both apps use the same Microsoft.Xna.Framework.Input.Touch.dll, v4.0.30319, and I am running both using the standard Windows Phone emulator. I am setting my breakpoint immediately after this line of code in a simple Update method:

        GestureSample gesture = TouchPanel.ReadGesture();

    Typical values in my app: Delta = {X:-13.56522 Y:4.166667}, Position = {X:184.6956 Y:417.7083}. Typical values in the sample app: Delta = {X:7 Y:16}, Position = {X:497 Y:244}. Has anyone seen this issue? Does anyone have any suggestions? Thank you.

    Read the article

  • Multiple objects listening for the same key press

    - by xiaohouzi79
    I want to learn the best way to implement this: I have a hero and an enemy on the screen. Say the hero presses "k" to pull out a knife; I want the enemy to react in a certain way. Now, if in my game loop I have a listener for the key press event and I identify that "k" was pressed, the quick and easy way would be:

        // If K pressed
        // hero.getOutKnife()
        // enemy.getAngry()

    But what is commonly done in more complex games, where, say, I have 10 types of character on screen and they all need to react in a unique way when the letter "k" is pressed? I can think of a bunch of hacky ways to do this, but I would love to know how it should be done properly. I am using C++, but I'm not looking for a code implementation, just some ideas on how it should be done the right way.

    Read the article

  • How to pass one float as four unsigned chars to a shader with glVertexAttribPointer?

    - by Kog
    For each vertex I use two floats for the position and four unsigned bytes for the color. I want to store all of them in one table, so I tried casting those four unsigned bytes to one float, but I am unable to do that correctly... All in all, my tests came down to this:

        GLfloat vertices[] = { 1.0f, 0.5f, 0, 1.0f, 0, 0 };
        glEnableVertexAttribArray(0);
        glVertexAttribPointer(0, 2, GL_FLOAT, GL_FALSE, 2 * sizeof(float), vertices);

        // VER1 - draws red triangle
        // unsigned char colors[] = { 0xff, 0, 0, 0xff, 0xff, 0, 0, 0xff, 0xff, 0, 0, 0xff };
        // glEnableVertexAttribArray(1);
        // glVertexAttribPointer(1, 4, GL_UNSIGNED_BYTE, GL_TRUE, 4 * sizeof(GLubyte), colors);

        // VER2 - draws greenish triangle (not "pure" green)
        // float f = 255 << 24 | 255; //Hex:0xff0000ff
        // float colors2[] = { f, f, f };
        // glEnableVertexAttribArray(1);
        // glVertexAttribPointer(1, 4, GL_UNSIGNED_BYTE, GL_TRUE, 4 * sizeof(GLubyte), colors2);

        // VER3 - draws red triangle
        int i = 255 << 24 | 255; //Hex:0xff0000ff
        int colors3[] = { i, i, i };
        glEnableVertexAttribArray(1);
        glVertexAttribPointer(1, 4, GL_UNSIGNED_BYTE, GL_TRUE, 4 * sizeof(GLubyte), colors3);

        glDrawArrays(GL_TRIANGLES, 0, 3);

    The code above is used to draw one simple red triangle. My question is: why do versions 1 and 3 work correctly, while version 2 draws a greenish triangle? The hex values are the ones I read by inspecting the variables during debugging. They are equal for versions 2 and 3, so what causes the difference?

    Read the article

  • Rain drops on screen

    - by user1075940
    I am trying to make a simple rain drop effect on screen, something like this: http://fc00.deviantart.net/fs20/f/2007/302/5/6/Rain_drops_by_rockraikar.png My idea is to create small drop-shaped normal textures, randomly put a few on screen, apply texture perturbation, and mix with the current frame's pixels. Here are my questions:

    - Does this idea even make sense? How do professionals do this effect? Anything from text to code will be appreciated.
    - How do I pass the pixels of the already rendered frame to a shader?

    Read the article

  • Is there a library that handles hexagon tiled 2D maps?

    - by Pete Mancini
    It would represent a map that is semi-square and of arbitrary size. It would have a simple system for representing map coordinates, such as 0101 (first column, first hex). I'd want the map to be able to tell me the distance between two points, and what other hexes lie between those two points, as a list or array. I don't care as much about the language, but C# or Python would be ideal. Does one exist?
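    Even without a library, hex distance is compact to compute once the cells are expressed in axial coordinates; a C# sketch under that assumption (converting an offset layout such as "0101" to axial depends on which columns are shifted, so that step is left out):

        using System;

        // Distance in hexes between two cells given in axial coordinates (q, r).
        static int HexDistance(int q1, int r1, int q2, int r2)
        {
            int dq = q1 - q2;
            int dr = r1 - r2;
            // Equivalent to cube-coordinate distance: (|dx| + |dy| + |dz|) / 2.
            return (Math.Abs(dq) + Math.Abs(dr) + Math.Abs(dq + dr)) / 2;
        }

    The "hexes between two points" part is usually done by interpolating between the two cells in cube coordinates and rounding each sample to the nearest hex.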

    Read the article
