Search Results

Search found 30359 results on 1215 pages for 'extension development'.

  • Windows Phone XAML and XNA Apps with Game Components

    - by row1
    I am using the Windows Phone template "Windows Phone XAML and XNA Apps" and targeting Windows Phone 7/8. Most examples show your game inheriting from Microsoft.Xna.Framework.Game and then adding Microsoft.Xna.Framework.GameComponent items to the Components collection. But since my game page inherits from PhoneApplicationPage, there is no Components collection or Game property. How can I use GameComponent from within a PhoneApplicationPage?
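
    For what it's worth, XNA's Components collection is only a list that the framework ticks for you, so when the host type doesn't provide one you can own such a list yourself and tick it from whatever loop the page gives you (the XAML+XNA template's GameTimer, in this case). A sketch of the pattern in C++ (purely for shape; the real types here are C#):

        #include <memory>
        #include <vector>

        // A component is anything that wants a slice of the update/draw loop.
        class Component {
        public:
            virtual ~Component() = default;
            virtual void update(float dt) = 0;
            virtual void draw() = 0;
        };

        // The page (or any host) owns the list and ticks it every frame,
        // which is essentially all Game.Components does behind the scenes.
        class ComponentHost {
        public:
            void add(std::unique_ptr<Component> c) { components.push_back(std::move(c)); }
            void updateAll(float dt) { for (auto& c : components) c->update(dt); }
            void drawAll()           { for (auto& c : components) c->draw(); }
        private:
            std::vector<std::unique_ptr<Component>> components;
        };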

  • UDK - How to make sure a PhysicalMaterial mask actually works?

    - by tomacmuni
    Hello, I have been reading the UDK documentation about physical materials and masks. I have my 1-bit BMP mask and the two physical material assets I want to fire off in the black and white channels. I have applied my material to both a rigid body and to a skeletal mesh, and neither apparently uses the mask. If I assign a regular physical material (one that doesn't use a mask) then it works fine, but this defeats the point, because it gives only one hit reaction. The documentation states that it is possible to extend a class on which we want to use a physical material, based on the KActor class's usage. How do I do that? Here is the quote: "The following properties [i.e., ImpactEffect, a particle system to spawn at the point of impact, and ImpactSound, a sound to play when an impact occurs] allow you to attach sounds and effects to physical collisions. These only work on classes which support them, which at the moment is only KActor. By looking at the implementation in KActor though, you can add this functionality to other classes (or you can subclass KActor)." Essentially, how do I make sure a PhysicalMaterial mask actually works? What code could be added to a skeletal mesh class, perhaps, to get it going? Any help appreciated.

  • Start Game Programming [on hold]

    - by vishalpamnani
    I am 23 and working as a software developer. Though my work is entirely based on Java and advanced Java, I know very little about game development, yet all my interest lies there. I want to make my career in the gaming industry as a game programmer, but I am not able to figure out the first step to take. I have zero experience developing games and have never tried even the tiniest of games. Please suggest where to start. Which programming language should I start with? What should I practice? What references should I use? What type of games should I begin with? BTW, my preferred language would be C++. ~Thanks

  • Change the scale-policy of OpenGL ES in Android?

    - by wanting252
    I am currently developing a game for Android in OpenGL ES 1.0, using the libgdx library, and I target a 720x480 screen size. For example, I design only one art pack, for 720x480. What will happen on Android phones with a screen size smaller or bigger than that, 480x320 for instance? Could you please tell me how to change the scale policy of OpenGL ES on Android, or in libgdx specifically? Is there anything like Photoshop's "Resample Image" options (Nearest Neighbor, Bilinear, Bicubic, etc.) for libgdx? Edit: I found some tutorials about texture filters in OpenGL and tested Linear and Nearest. Linear is good for scaling but slows the game down, and Nearest is the opposite. What should I do to get a balance between the two?
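
    For what it's worth, the usual middle ground between those two filters is mipmapping: minification reads from pre-shrunk copies of the texture, so it looks close to Linear while costing close to Nearest. In raw OpenGL the setting looks like the sketch below; libgdx exposes the same choice through Texture.setFilter (TextureFilter.MipMapLinearLinear as the min filter is the analogue). Treat the constants as one reasonable choice, not the only one:

        #include <GL/gl.h>

        // Pick filters for a texture whose mip chain is already generated
        // (e.g. via glGenerateMipmap, or GL_GENERATE_MIPMAP on ES 1.x).
        // Trilinear minification is the usual compromise between Linear
        // (smooth but slower) and Nearest (fast but blocky).
        void setBalancedFilter(GLuint tex) {
            glBindTexture(GL_TEXTURE_2D, tex);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        }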

  • Best 2D game engine for iOS

    - by Adelino
    Which is the best 2D game engine for iOS? I really need a game engine that allows me to modify the game code, because I need to control the multi-touch events. I have a framework that detects the gestures the player makes, and I need to test this gesture recognizer in a game, so I have to have the freedom to change the game code. I don't want anything like GameSalad, where you can't control anything. Thanks in advance.

  • Matrix to transform unit cube to space defined by 8 arbitrary points

    - by aadster
    I already asked a question similar to this, but I think this is a clearer statement of what I'm trying to achieve, or of whether it's possible at all. I'm trying to find a transformation (a matrix, ideally) which would transform the 8 points of a 3D unit cube to 8 arbitrary points in space. The 8 target points have no known structure. My gut feeling is that a matrix is unable to provide this xform, since the transformed cube's faces can become concave, but are there any other methods of transformation? Thanks!
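
    The gut feeling is right as far as a single matrix goes: a 4x4 homogeneous matrix has only 15 meaningful degrees of freedom, while 8 arbitrary target points impose 24 constraints, so in general no linear or projective map can hit them all. What does work is trilinear interpolation, which blends the 8 targets by the cube coordinates. A sketch, with a corner ordering that is my own convention:

        #include <array>

        struct Vec3 { float x, y, z; };

        static Vec3 lerp(const Vec3& a, const Vec3& b, float t) {
            return { a.x + (b.x - a.x) * t,
                     a.y + (b.y - a.y) * t,
                     a.z + (b.z - a.z) * t };
        }

        // corners[i] is the image of unit-cube corner (i&1, (i>>1)&1, (i>>2)&1).
        Vec3 mapUnitCube(const std::array<Vec3, 8>& corners, float u, float v, float w) {
            Vec3 x00 = lerp(corners[0], corners[1], u);
            Vec3 x10 = lerp(corners[2], corners[3], u);
            Vec3 x01 = lerp(corners[4], corners[5], u);
            Vec3 x11 = lerp(corners[6], corners[7], u);
            Vec3 y0  = lerp(x00, x10, v);
            Vec3 y1  = lerp(x01, x11, v);
            return lerp(y0, y1, w);
        }

    At the corners this reproduces the targets exactly, and it degenerates to the plain matrix case whenever the targets happen to form a parallelepiped.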

  • IDirect3DDevice9::GetRenderTargetData() returns no data

    - by P. Avery
    I've got a simple function to get the render target data of an RT (with the default pool). This particular RT has a resolution of 1x1 (it's the 10th and final mip of a texture). Here is my code to get the data for IDirect3DSurface9 *pTargetSurface:

        IDirect3DSurface9 *pSOS = NULL;
        pd3dDevice->CreateOffscreenPlainSurface( 1, 1, D3DFMT_A8R8G8B8, D3DPOOL_SYSTEMMEM, &pSOS, NULL );

        // get residual energy
        if( FAILED( hr = pd3dDevice->GetRenderTargetData( pTargetSurface, pSOS ) ) )
        {
            DebugStringDX( ClassName, "Failed to IDirect3DDevice9::GetRenderTargetData() at DownsampleArea()", __LINE__, hr );
            goto Exit;
        }

        // lock surface
        if( FAILED( hr = pSOS->LockRect( &rct, NULL, D3DLOCK_READONLY ) ) )
        {
            DebugStringDX( ClassName, "Failed to IDirect3DSurface9::LockRect() at DownsampleArea()", __LINE__, hr );
            goto Exit;
        }

        // get residual energy from downsampled texture
        pByte = ( BYTE* )rct.pBits;
        D3DXVECTOR4 vEnergy;
        vEnergy.z = ( float )pByte[ 0 ] / 255.0f;
        vEnergy.y = ( float )pByte[ 1 ] / 255.0f;
        vEnergy.x = ( float )pByte[ 2 ] / 255.0f;
        vEnergy.w = ( float )pByte[ 3 ] / 255.0f;

        V( pSOS->UnlockRect() );

    All formatting and settings are correct, and DirectX in debug mode shows no errors. The problem is that the four bytes above are 0. I know this to be incorrect from debugging with PIX, which shows that the RGB channels are 0.078 and alpha is 1. These values are not less than what can be represented by a single byte (1/255). Any ideas? Am I copying the render target data correctly?

  • Unit turning in navmesh-based pathfinding

    - by Haddayn
    I'm working on an RTS game and I'm using navmeshes for unit pathfinding. I know how to find a general path within a navmesh, but how do you determine whether a unit has enough space to turn? I have units of different shapes (mostly rectangles with different dimensions) and with different turn radii. Additionally, some units can turn in place, and some can move in reverse. So, how do I find a path which a unit can follow, considering that it cannot rotate freely?
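
    One building block for this is a feasibility test along a constant-radius arc: sweep the unit's footprint around the turning circle in small heading steps and reject the turn if any intermediate pose leaves the navmesh. The sketch below uses my own conventions (heading is the angle of the facing direction; the footprint-vs-navmesh test is a caller-supplied callback) and is only a test, not a full kinematic planner:

        #include <algorithm>
        #include <cmath>
        #include <functional>

        struct Pose { float x, y, heading; };

        // True if every sampled pose along a constant-radius turn of
        // 'deltaHeading' radians (sign = turn direction) passes 'fits'.
        bool canTurn(Pose start, float turnRadius, float deltaHeading, float step,
                     const std::function<bool(const Pose&)>& fits) {
            float side = (deltaHeading >= 0.0f) ? 1.0f : -1.0f;
            // Center of the turning circle, 90 degrees off the heading.
            float cx = start.x - side * turnRadius * std::sin(start.heading);
            float cy = start.y + side * turnRadius * std::cos(start.heading);
            int n = static_cast<int>(std::ceil(std::fabs(deltaHeading) / step));
            for (int i = 1; i <= n; ++i) {
                float turned = std::min(std::fabs(deltaHeading), i * step);
                Pose p;
                p.heading = start.heading + side * turned;
                p.x = cx + side * turnRadius * std::sin(p.heading);
                p.y = cy - side * turnRadius * std::cos(p.heading);
                if (!fits(p)) return false;
            }
            return true;
        }

    Units that can turn in place are the degenerate case of a tiny radius, and reversing units would run the same test along a backward arc.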

  • How to display consistent background image

    - by Tofu_Craving_Redish_BlueDragon
    Drawing a large background is relatively slow in PyGame. To avoid drawing the background every frame, you could draw it once and then do nothing. However, if something is drawn on top of the surface and keeps moving, you will need to redraw the background in order to "erase" the pixels left behind by the moving object; otherwise you will have "traces" of it. I have a moving object in my PyGame project, but I do not want to clear the color buffer by redrawing the whole background image, since that is slow. My solution: I will clear only the required portions of the buffer (where the traces of the moving object are left) by redrawing just those portions of the background. Is there any other, better way to keep the background consistent?
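
    Redrawing only the damaged portions is in fact the standard answer; the technique is called dirty rectangles, and PyGame supports it directly: blit the matching region of the background over each stale rect (screen.blit(background, r, r)) and pass the list of touched rects to pygame.display.update. Here is a language-neutral sketch of the bookkeeping, in C++ with hypothetical blit helpers only for consistency with the other snippets on this page:

        #include <vector>

        struct Rect { int x, y, w, h; };

        // Hypothetical stand-ins for the real blit/present calls.
        void blitBackgroundRegion(const Rect& r) { (void)r; }
        Rect drawSprite(int spriteId) { (void)spriteId; return Rect{0, 0, 32, 32}; }
        void presentRegions(const std::vector<Rect>& rects) { (void)rects; }

        // One frame: erase only where sprites were last frame, redraw the
        // sprites, and present the union of old and new rectangles.
        void frame(std::vector<Rect>& damage, int spriteCount) {
            std::vector<Rect> touched = damage;   // old spots need repainting
            for (const Rect& r : damage)
                blitBackgroundRegion(r);          // restore background there
            damage.clear();
            for (int i = 0; i < spriteCount; ++i) {
                Rect r = drawSprite(i);           // draw and record new position
                damage.push_back(r);
                touched.push_back(r);
            }
            presentRegions(touched);              // flip only the touched regions
        }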

  • Fastest approach to 3D animation

    - by HappyFerret
    I'm currently tasked with designing a small HTML5 game. Having done everything by myself so far (3D models, codebase, game design, etc.), I'm now at a point where I'm running out of time: I have less than a day to animate and bind everything together. However, that's exactly my problem. I was under the naive impression that everything would be easier if I went with pre-rendered 3D models, but I didn't consider the most difficult part: animation. After having spent over an hour trying to figure out messiahStudio, I figured it's time to ask for outside help. Is there any easier approach to 3D animation than rigging? What I'm basically looking for is some sort of tool that allows me to simply grab and move/deform selected polygons. It doesn't have to be as life-like and accurate as rigging, just efficient enough. Were the circumstances any different, I might just learn how to rig, but that's sorely out of scope right now. PS: The models were created in Sculptris but are fairly low-poly.

  • OGRE 3D: How to create very basic gameworld [on hold]

    - by skiwi
    I'm considering trying to create an FPS (first-person shooter) using the Ogre 3D engine. I have done the basic tutorials (except CEGUI) and have read through the intermediate tutorial. I understand some of the more advanced concepts, but I'm stuck on some very simple ones. First of all, I want to use some tiles (square ones, with relatively little height) as the floor, and I guess I need to set up a loop to place those tiles. But how would I go about creating the tiles exactly, like making each one its own mesh? And then I would need to find some texture. Secondly, I guess I can derive the camera and movement functions from the basic tutorial, but I'll be needing a "soldier" (anything will do for now). What is the best way to create a moderately decent-looking soldier, or to obtain one from an open library? And thirdly, how can I ensure that the soldier is actually walking on the ground instead of in mid air? Will raycasting into the ground and adjusting the position based on the hit suffice?
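
    On the floor: unless the tiles need to behave independently, the usual OGRE answer is one textured plane mesh rather than a mesh per tile. A sketch adapted from the OGRE 1.x basic tutorials; mSceneMgr and the "Examples/Rockwall" material come from the tutorial framework and sample media, so substitute your own:

        Ogre::Plane plane(Ogre::Vector3::UNIT_Y, 0);
        Ogre::MeshManager::getSingleton().createPlane(
            "ground", Ogre::ResourceGroupManager::DEFAULT_RESOURCE_GROUP_NAME,
            plane,
            1500, 1500,                 // world-space size
            20, 20,                     // x/y segments
            true, 1,                    // generate normals, one texcoord set
            5, 5,                       // tile the texture 5x5 across the plane
            Ogre::Vector3::UNIT_Z);     // up direction for texture coords
        Ogre::Entity* ground = mSceneMgr->createEntity("GroundEntity", "ground");
        mSceneMgr->getRootSceneNode()->createChildSceneNode()->attachObject(ground);
        ground->setMaterialName("Examples/Rockwall");
        ground->setCastShadows(false);

    And on the last question: yes, casting a ray straight down from the character each frame (Ogre::RaySceneQuery) and snapping the node's height to the hit point is exactly how the intermediate tutorial keeps things clamped to the terrain.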

  • What is the purpose of the canonical view volume?

    - by breadjesus
    I'm currently learning OpenGL and haven't been able to find an answer to this question. After the projection matrix is applied to the view space, the view space is "normalized" so that all the points lie within the range [-1, 1]. This is generally referred to as the "canonical view volume" or "normalized device coordinates". While I've found plenty of resources telling me about how this happens, I haven't seen anything about why it happens. What is the purpose of this step?
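
    For context, the usual justification is uniformity: once every visible point has been squeezed into the same axis-aligned cube, clipping and the mapping to window pixels no longer depend on the camera's field of view, aspect ratio, or near/far planes, so the hardware can implement them as fixed operations. A sketch of the two steps that follow projection, in OpenGL conventions:

        % clipping is tested in clip space, against the same box for every camera:
        -w_c \le x_c \le w_c, \qquad -w_c \le y_c \le w_c, \qquad -w_c \le z_c \le w_c
        % the perspective divide then lands every visible point in the canonical cube:
        x_n = x_c / w_c, \quad y_n = y_c / w_c, \quad z_n = z_c / w_c \in [-1, 1]
        % and the viewport transform to a W x H window is a single fixed affine map:
        x_s = \tfrac{W}{2}(x_n + 1), \qquad y_s = \tfrac{H}{2}(y_n + 1)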

  • Cube rotation DX10

    - by German
    Well, I'm reading Frank Luna's DirectX 10 book and, while trying to understand the first demo, I found something that's not very clear, at least for me. In the updateScene method, when I press A, S, W, or D, the angles mTheta and mPhi change, but after that there are three lines of code that I don't understand exactly:

        // Convert Spherical to Cartesian coordinates: mPhi measured from +y
        // and mTheta measured counterclockwise from -z.
        float x =  5.0f*sinf(mPhi)*sinf(mTheta);
        float z = -5.0f*sinf(mPhi)*cosf(mTheta);
        float y =  5.0f*cosf(mPhi);

    I mean, the comment explains what they do: it converts the spherical coordinates to Cartesian coordinates. But mathematically, why? Why is the x value calculated as the product of the sines of both angles? Why is z the product of a sine and a cosine? And why does y use just the cosine? After that, those values (x, y and z) are used to build the view matrix. The book doesn't explain mathematically why those values are calculated like that (and I didn't find anything to help me understand it in the first part of the book, "Mathematical prerequisites"), so it would be good if someone could explain to me what exactly happens in those lines of code, or just give me a link that helps me understand the math. Thanks in advance!
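
    The three lines encode two right triangles; written out (with r = 5, the camera's fixed distance from the origin):

        % phi is measured down from the +y axis, so one right triangle gives
        y = r\cos\varphi, \qquad \rho = r\sin\varphi
        % where rho is the length of the point's projection into the xz-plane;
        % theta then splits that projection between the two horizontal axes:
        x = \rho\sin\theta = r\sin\varphi\sin\theta, \qquad
        z = -\rho\cos\theta = -r\sin\varphi\cos\theta
        % at theta = 0 this gives (x, z) = (0, -rho), i.e. a point on the -z axis,
        % matching "mTheta measured counterclockwise from -z".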

  • iOS: game with facebook challenges

    - by nazz_areno
    I created a game for iPad and I want to challenge my Facebook friends. I followed the iOS tutorial in the Facebook dev docs, with the "Smash" game, but it doesn't explain how to challenge a friend directly to a game. I will explain with an example: I want to start a new match and challenge a friend on Facebook. First I send him a request to install the app, and when I detect that the app is installed I send him a request to play against me. Then, when I finish the match, I send him my result, and my friend does the same. But if my friend and I don't finish the match, it is not possible to send another challenge. This scenario is not covered by the Facebook SDK. Is it necessary to use another tool to handle this situation?

  • How would I use JBox2d in Java?

    - by BluFire
    So I did some research and found Box2D. I then proceeded to download it along with the testbed. Now that I have it, I don't know how to use it properly. I'm looking for a clear, simple answer on how to use the engine. What I did was put it into a lib folder and reference the JBox2D jar file; after that I got stuck. How can I use this to program games for Android? I'm very confused, since Box2D was originally intended for C++.
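
    Since JBox2D is a close port of the C++ original, the core loop is the same in both: create a world with gravity, add bodies with fixtures, and step the simulation every frame, copying positions back to your sprites. A minimal sketch in the original C++ API; in JBox2D the same types exist with Java naming (World, BodyDef, PolygonShape, world.createBody, world.step):

        #include <Box2D/Box2D.h>

        int main() {
            // World with downward gravity (older versions also take a
            // 'doSleep' flag as a second constructor argument).
            b2World world(b2Vec2(0.0f, -10.0f));

            // One dynamic 1x1 m box, one meter above the origin.
            b2BodyDef bodyDef;
            bodyDef.type = b2_dynamicBody;
            bodyDef.position.Set(0.0f, 1.0f);
            b2Body* body = world.CreateBody(&bodyDef);

            b2PolygonShape box;
            box.SetAsBox(0.5f, 0.5f);            // half-extents, in meters
            b2FixtureDef fixture;
            fixture.shape = &box;
            fixture.density = 1.0f;
            fixture.friction = 0.3f;
            body->CreateFixture(&fixture);

            // Fixed-timestep simulation; feed GetPosition() into your
            // sprites' transforms each tick.
            for (int i = 0; i < 60; ++i) {
                world.Step(1.0f / 60.0f, 8, 3);  // dt, velocity/position iterations
                b2Vec2 p = body->GetPosition();
                (void)p;
            }
            return 0;
        }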

  • Entity communication: Message queue vs Publish/Subscribe vs Signal/Slots

    - by deft_code
    How do game engine entities communicate? Two use cases: how would entity_A send a take-damage message to entity_B, and how would entity_A query entity_B's HP? Here's what I've encountered so far:

    Message queue: entity_A creates a take-damage message and posts it to entity_B's message queue. entity_A creates a query-hp message and posts it to entity_B; entity_B in return creates a response-hp message and posts it to entity_A.

    Publish/Subscribe: entity_B subscribes to take-damage messages (possibly with some preemptive filtering so only relevant messages are delivered); entity_A produces a take-damage message that references entity_B. entity_A subscribes to update-hp messages (possibly filtered); every frame entity_B broadcasts update-hp messages.

    Signal/Slots: ??? entity_A connects an update-hp slot to entity_B's update-hp signal.

    Something better? Do I have a correct understanding of how these communication schemes would tie into a game engine's entity system? How do entities in commercial game engines communicate?
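
    On the Signal/Slots row, the missing piece is that a signal is just a list of callbacks owned by the sender: entity_B fires its hp signal, and whatever connected runs synchronously. A minimal sketch, with no disconnection or thread-safety, only the shape of the scheme:

        #include <functional>
        #include <vector>

        template <typename... Args>
        class Signal {
        public:
            void connect(std::function<void(Args...)> slot) {
                slots.push_back(std::move(slot));
            }
            void emit(Args... args) const {
                for (const auto& s : slots) s(args...);
            }
        private:
            std::vector<std::function<void(Args...)>> slots;
        };

        struct Entity {
            int hp = 100;
            Signal<int> hpChanged;               // fired with the new hp value
            void takeDamage(int amount) {
                hp -= amount;
                hpChanged.emit(hp);
            }
        };

        int main() {
            Entity a, b;
            (void)a;
            // entity_A observes entity_B's hp instead of polling for it.
            b.hpChanged.connect([&](int hp) { /* a reacts to b's new hp */ });
            b.takeDamage(25);                    // delivers 75 to the slot
        }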

  • How can I achieve this lighting with OpenGL?

    - by Smallbro
    I'm currently trying to implement a type of "smooth" lighting. How can I achieve lighting which looks like this: http://dl.dropbox.com/u/1668516/concept/warp3.png using OpenGL? I've attempted to use blending modes and have come very close to making it work, but it came out like this: https://pbs.twimg.com/media/A1071viCEAAlFmJ.png and I also wasn't able to change the alpha of the black background, which I want to be able to do. Could I get a few pointers in the right direction?
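
    One common recipe for that look is a lightmap pass: render the lights into an offscreen texture over a grey "ambient" base (that base is your adjustable darkness level, which replaces the fixed black), then multiply the lightmap over the finished scene. A fixed-function GL sketch with hypothetical draw helpers, assuming an FBO already targets lightmapTex:

        #include <GL/gl.h>

        // Hypothetical helpers: a soft radial-gradient quad per light, and
        // a screen-sized textured quad for the final multiply pass.
        void drawLightQuads() {}
        void drawFullscreenQuad() {}

        void composeLighting(GLuint lightmapTex, float ambient) {
            // 1) With the lightmap FBO bound: clear to the ambient level.
            //    Raising 'ambient' brightens the unlit areas.
            glClearColor(ambient, ambient, ambient, 1.0f);
            glClear(GL_COLOR_BUFFER_BIT);

            // 2) Accumulate lights additively so overlapping lights reinforce.
            glEnable(GL_BLEND);
            glBlendFunc(GL_SRC_ALPHA, GL_ONE);
            drawLightQuads();

            // 3) After unbinding the FBO and drawing the scene normally,
            //    multiply the lightmap over it: destination *= source.
            glBindTexture(GL_TEXTURE_2D, lightmapTex);
            glBlendFunc(GL_DST_COLOR, GL_ZERO);
            drawFullscreenQuad();
        }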

  • How do I use Content.Load() with raw XML files?

    - by xnanewb
    I'm using the Content.Load() mechanism to load core game definitions from XML files. It works fine, though some definitions should be editable/moddable by the players, and since the content pipeline compiles everything into .xnb files, that doesn't work for now. I've seen that the built-in XNA Song content processor creates two files: an .xnb file which contains metadata for the song, and a .wma file which contains the actual data. I've tried to rebuild that mechanism (so that the second file is the actual XML file), but for some reason I can't use the namespace which contains the IntermediateSerializer class to load the XML (apparently the namespace is only available in a content project?). How can I deploy raw, editable XML files and load them with Content.Load()?

  • How to move a rectangle properly?

    - by bodycountPP
    I recently started to learn OpenGL and have just finished the first chapter of the "OpenGL SuperBible". There were two examples: the first had the complete code and showed how to draw a simple triangle; the second is supposed to show how to move a rectangle using SpecialKeys, but the only code provided for it was the SpecialKeys method. I still tried to implement it, but I had two problems.

    In the previous example I declared and instantiated vVerts in the SetupRC() method. Now, as it is also used in the SpecialKeys() method, I moved the declaration and instantiation to the top of the code. Is this proper C++ practice?

    I copied the part where the vertex positions are recalculated from the book, but I had to pick the vertices for the rectangle on my own. So now, the first time I press a key, the rectangle's upper left vertex is moved to (-0.5, -0.5). This happens because of

        GLfloat blockX = vVerts[0]; // Upper left X
        GLfloat blockY = vVerts[7]; // Upper left Y

    and I also think that this is the reason why my rectangle is shifted in the beginning. After the first key press, everything works just fine. Here is my complete code; I hope you can help me on those two points.

        #include <GLTools.h>            // OpenGL toolkit (headers assumed from the SuperBible template)
        #include <GLShaderManager.h>
        #include <GL/glut.h>

        GLBatch squareBatch;
        GLShaderManager shaderManager;

        // Load up a triangle
        GLfloat vVerts[] = { -0.5f,  0.5f, 0.0f,
                              0.5f,  0.5f, 0.0f,
                              0.5f, -0.5f, 0.0f,
                             -0.5f, -0.5f, 0.0f };

        // Window has changed size, or has just been created. We need to use
        // the window dimensions to set the viewport and the projection matrix.
        void ChangeSize(int w, int h)
        {
            glViewport(0, 0, w, h);
        }

        // Called to draw the scene.
        void RenderScene(void)
        {
            // Clear the window with the current clearing color
            glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT | GL_STENCIL_BUFFER_BIT);

            GLfloat vRed[] = { 1.0f, 0.0f, 0.0f, 1.0f };
            shaderManager.UseStockShader(GLT_SHADER_IDENTITY, vRed);
            squareBatch.Draw();

            // Perform the buffer swap to display the back buffer
            glutSwapBuffers();
        }

        // This function does any needed initialization on the rendering context.
        // This is the first opportunity to do any OpenGL related tasks.
        void SetupRC()
        {
            // Blue background
            glClearColor(0.0f, 0.0f, 1.0f, 1.0f);
            shaderManager.InitializeStockShaders();
            squareBatch.Begin(GL_QUADS, 4);
            squareBatch.CopyVertexData3f(vVerts);
            squareBatch.End();
        }

        // Respond to arrow keys by moving the camera frame of reference
        void SpecialKeys(int key, int x, int y)
        {
            GLfloat stepSize = 0.025f;
            GLfloat blockSize = 0.5f;
            GLfloat blockX = vVerts[0]; // Upper left X
            GLfloat blockY = vVerts[7]; // Upper left Y

            if(key == GLUT_KEY_UP)    { blockY += stepSize; }
            if(key == GLUT_KEY_DOWN)  { blockY -= stepSize; }
            if(key == GLUT_KEY_LEFT)  { blockX -= stepSize; }
            if(key == GLUT_KEY_RIGHT) { blockX += stepSize; }

            // Recalculate vertex positions
            vVerts[0]  = blockX;
            vVerts[1]  = blockY - blockSize * 2;
            vVerts[3]  = blockX + blockSize * 2;
            vVerts[4]  = blockY - blockSize * 2;
            vVerts[6]  = blockX + blockSize * 2;
            vVerts[7]  = blockY;
            vVerts[9]  = blockX;
            vVerts[10] = blockY;

            squareBatch.CopyVertexData3f(vVerts);
            glutPostRedisplay();
        }

        // Main entry point for GLUT based programs
        int main(int argc, char** argv)
        {
            // Sets the working directory. Not really needed
            gltSetWorkingDirectory(argv[0]);

            // Passes along the command-line parameters and initializes the GLUT library.
            glutInit(&argc, argv);

            // Tells the GLUT library what type of display mode to use when creating the
            // window: double buffered, RGBA color mode, with depth and stencil buffers.
            glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA | GLUT_DEPTH | GLUT_STENCIL);

            // Window size
            glutInitWindowSize(800, 600);
            glutCreateWindow("MoveRect");
            glutReshapeFunc(ChangeSize);
            glutDisplayFunc(RenderScene);
            glutSpecialFunc(SpecialKeys);

            // Initialize the GLEW library and check that the driver
            // initialization went fine before we try to do any rendering.
            GLenum err = glewInit();
            if(GLEW_OK != err)
            {
                fprintf(stderr, "Glew Error: %s\n", glewGetErrorString(err));
                return 1;
            }

            SetupRC();
            glutMainLoop();
            return 0;
        }

  • How to manage enemy movement and shoot in a shmup?

    - by whatever
    I'm wondering what the best (or at least a good) way of managing enemies in a shoot-em-up is. Basically, I'd write a class that manages displaying and updating the positions of all the enemies. But how do you create good movement patterns for the enemies? A list of where-to-go points? Gravitating around some fixed points (with weighting, distance evaluation, etc.)? The same question applies to the shot patterns. Can you please put me on the right track?
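
    The where-to-go-points idea is a solid track: classic shmups script each enemy as a list of waypoints (or spline control points) authored as data, with an independent timed script driving the shot pattern in the same spirit. A minimal constant-speed follower, as one sketch of it:

        #include <cmath>
        #include <cstddef>
        #include <vector>

        struct Vec2 { float x, y; };

        // Follows a scripted list of waypoints at constant speed; designers
        // author 'points' as data, one list per enemy type.
        struct PathFollower {
            std::vector<Vec2> points;
            std::size_t target = 0;
            float speed = 120.0f;              // pixels per second
            Vec2 pos{0.0f, 0.0f};

            bool update(float dt) {            // false once the path is done
                if (target >= points.size()) return false;
                Vec2 to{points[target].x - pos.x, points[target].y - pos.y};
                float dist = std::sqrt(to.x * to.x + to.y * to.y);
                float step = speed * dt;
                if (dist <= step) { pos = points[target]; ++target; }
                else { pos.x += to.x / dist * step; pos.y += to.y / dist * step; }
                return true;
            }
        };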

  • Playing a sound on collision?

    - by Eric McLoughlin
    I'm making a pool game. I've separated the gameplay logic and the physics into two systems: entities and physics. Each entity holds a reference to a body which the physics system uses, and the body itself holds a reference back to its owner. When an entity collides with another entity, the Collided(Entity other) method is called on both. What I'm trying to do now is to play a sound when the two colliding entities are both of a certain subclass. I'm not sure how to do that: I could do it in the Collided method, but then the sound would be played twice at the same time, since the method is called on both entities. How do you suggest I do this?
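
    Two common ways out: have the physics system raise a single event per contact pair (pair-level effects such as sounds then live there, not in the entities), or keep the per-entity callback and break the tie inside it so only one of the two reacts. A sketch of the tie-break version, with hypothetical names:

        #include <cstdio>

        struct Entity {
            int id;                              // any stable, unique ordering key
            bool isBall;

            void collided(Entity& other) {
                // ...per-entity reaction still runs on both sides...

                // Pair-level effects fire once: only the lower id triggers them.
                if (isBall && other.isBall && id < other.id)
                    std::puts("clack");          // stand-in for the real audio call
            }
        };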

  • How to synchronize two animations without delays

    - by GeKi
    I have a character idle animation running inside a game in a loop, over and over again. At a certain time I trigger another animation to be played for the same character. If I play the second animation right away, as it is triggered, the character's motion won't be continuous and smooth. So instead I first wait for the idle animation to finish and then play the second animation: now the motion is smooth and continuous, BUT I have introduced a delay between the player's action and the character's animation. I was thinking of breaking the idle animation into small pieces, and also having the same number of second-action animations, each matching the last frame of one of the idle pieces. That won't remove the delay completely, only minimize it a bit. So is there a magic formula for getting rid of this delay? Thanks.
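
    The usual formula is a crossfade: start the action animation immediately, but blend its pose in over a short window (say 0.2 s) while the idle pose blends out, so there is neither a pop nor a wait. A sketch of the weight bookkeeping only; the actual pose interpolation depends on your animation system:

        // Tracks the blend weight of the incoming animation in [0, 1];
        // apply it as pose = lerp(idlePose, actionPose, w) when skinning.
        struct CrossFade {
            float duration = 0.2f;   // seconds; tune to taste
            float elapsed  = 0.0f;

            float update(float dt) {
                elapsed += dt;
                float w = elapsed / duration;
                return w > 1.0f ? 1.0f : w;
            }
            bool done() const { return elapsed >= duration; }
        };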

  • Exporting spritesheet for Cocos2d

    - by Terko
    I would like to know how people usually save animations in order to load them easily in Cocos2d, with as little hard-coding as possible. For example, the solution I thought of is to have one plist file containing information about each frame, and a second plist containing information about each animation (the name of the animation, which frames to play, and probably the delay). If this is the correct approach, how can I generate such plist files for a spritesheet automatically?

  • C# XNA render an entire frame to a texture2d

    - by redcodefinal
    I asked a question here: C# XNA Make rendered screen a texture2d. But I ended up not getting the exact result I was looking for, since I didn't ask the question right. In a game I am writing, I render an extremely large city out of objects; this can cause lag when moving the camera to view things that are off screen. I need a way to render the ENTIRE city, even the parts that are off screen, and make it into a Texture2D. The answer I chose for the last question didn't work quite right, because it only captures what is on screen, not what is off.
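
    The off-screen part is the key: in XNA this means drawing into a RenderTarget2D sized to cover the whole city, with the camera framing all of it, rather than grabbing the backbuffer; a RenderTarget2D can then be used as a Texture2D directly. For the shape of the technique, here is the equivalent render-to-texture setup in raw OpenGL, with 4096 as an arbitrary size (note that maximum-texture-size limits may force a truly huge city to be tiled across several targets):

        #include <GL/glew.h>

        // Create an off-screen color target and redirect rendering into it.
        GLuint renderCityToTexture(int size) {
            GLuint tex, fbo;
            glGenTextures(1, &tex);
            glBindTexture(GL_TEXTURE_2D, tex);
            glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, size, size, 0,
                         GL_RGBA, GL_UNSIGNED_BYTE, nullptr);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

            glGenFramebuffers(1, &fbo);
            glBindFramebuffer(GL_FRAMEBUFFER, fbo);
            glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                                   GL_TEXTURE_2D, tex, 0);
            glViewport(0, 0, size, size);

            // ...draw the whole city here, camera zoomed out to frame it...

            glBindFramebuffer(GL_FRAMEBUFFER, 0);    // back to the backbuffer
            return tex;                              // the city, as a texture
        }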

  • How can I make this arcade-highscore game more fun/interesting?

    - by j-a
    I'm having difficulties getting the fun factor into this iPhone game, and I am looking for some ideas or advice. I was asked to generalize the question a bit. What are some techniques for arcade high-score games that can be applied to this game in order to: make each second of the game fun and challenging, from the first second to the end, regardless of skill level; and make the player want to try again and again to beat the high score? Briefly about the game: you aim using your finger, pull the bow string, and release by lifting your finger. That part feels quite nice, how the bow interacts with the finger. The game idea: hearts fall down and you get 1 point for each heart you shoot. You start with a few arrows, and every now and then a bag of arrows comes down which, if you hit it, gives you more arrows. Once you're out of arrows the game is over. So it is all about beating your previous high score or your friends' high scores. Unfortunately I don't find it that fun. I'm thankful for any ideas/suggestions/thoughts on how to make it more fun/interesting.
