Search Results

Search found 8342 results on 334 pages for 'audio player'.

Page 211/334

  • As a game developer, which data structure should we use to develop the game? [duplicate]

    - by Rizwanabbasi
    This question already has an answer here: When should vector/list be used? 5 answers We are developing a game about a bank robbery. The game plots a bank robbery that lots of people witness. Our game loads lists of suspected offenders, and the players (witnesses) have to identify the offenders of the robbery. The game should load a list of offenders so the player can identify one as quickly as possible. An admin can add/remove offenders in the lists, and two or more lists of offenders can also be merged into one (to show to the player). As game developers, which data structure should we use? Justify your selection with solid arguments. Remember, the most critical requirement is that the list should load super fast.
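
    One possible answer, sketched below as a hedged example rather than a definitive design: a hash-based set gives O(1) membership checks for a witness's guess, and merging two lists is a single addAll call. The class and field names here are illustrative, not taken from the question.

        import java.util.HashSet;
        import java.util.Set;

        // Minimal sketch: offenders keyed by a unique id; lookups and merges stay fast.
        class OffenderList {
            private final Set<String> offenderIds = new HashSet<>();

            void add(String id)         { offenderIds.add(id); }      // admin adds an offender
            void remove(String id)      { offenderIds.remove(id); }   // admin removes an offender
            boolean contains(String id) { return offenderIds.contains(id); } // witness's guess, O(1)

            // Merge another list into this one (shown to the player as a single list).
            void mergeWith(OffenderList other) { offenderIds.addAll(other.offenderIds); }
        }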

    Read the article

  • 3d transformation of game world keeping gameplay 2d - COCOS2D 2.0

    - by samfisher
    Using: COCOS2D + iOS. I want to rotate the game world, maybe loading another .tmx file for the other dimension when the user wants to switch dimensions. The effect I am looking for is something like this: CLICK HERE. What I have thought of so far: rotating the CCCamera will be mandatory. Question: How will I keep the other part of the level in place while the camera is rotating? I could load a CCSprite and rotate it to match the third dimension. Question: When the camera and world are rotated, will the player controls still work properly? I think not. Would a better option be COCOS3D, where I could implement a 3D world? Question: Not sure how well 2D dynamics would work there, as I want to use Box2D as the physics engine. Could anyone provide suggestions? Regards, Sam
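
    On the controls question, one common fix (shown here as a rough sketch, not COCOS2D API) is to rotate the raw input vector by the inverse of the camera's rotation, so that "left" on the screen stays "left" in the rotated world:

        class InputMapper {
            // Hedged sketch: remap a screen-space input direction into world space
            // after the camera has been rotated by cameraAngleRadians around the Z axis.
            static float[] screenToWorldInput(float inputX, float inputY, double cameraAngleRadians) {
                double inv = -cameraAngleRadians;   // undo the camera rotation
                float worldX = (float) (inputX * Math.cos(inv) - inputY * Math.sin(inv));
                float worldY = (float) (inputX * Math.sin(inv) + inputY * Math.cos(inv));
                return new float[] { worldX, worldY };
            }
        }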

    Read the article

  • SQLBeat Podcast – Episode 8 – Interviewing Patrick LeBlanc On Interviewing

    - by SQLBeat
    In this episode of the SQLBeat Podcast (@SQLBeat on Twitter) I had a chance to speak with Patrick LeBlanc, currently with Microsoft and former SQL Server MVP. We spend a good amount of time talking about his current gig and his apparent fascination with almost never being dressed. Fortunately he had on some jeans and a shirt for this interview, though he did look a bit dozy because he had not slept the night before. With his help we came up with what is going to be a recurring section on the podcast, and that is failed or embarrassing job interviews. Patrick has quite a humiliating one for the launch of “Tell Me About Your Worst Job Interview”. Ok, I can come up with a better title for that section surely. Anyway, have a listen and if you can think of something better, let me know in the comments section. Audio: http://www.simple-talk.com/blogbits/rodneylandrum/Patrick_LeBlanc.ogg

    Read the article

  • The Dark Knight meets The Avengers [Video]

    - by Asian Angel
    Batman and the Avengers team up to defeat a common enemy, but their ‘after battle’ plans are extremely different! Can Batman learn to be a ‘team player’ who relaxes and has fun, or will he brood alone in his cave forever? The Dark Knight Meets The Avengers [CollegeHumor]

    Read the article

  • "Marching cubes" voxel terrain - triplanar texturing with depth?

    - by Dan the Man
    I am currently working on a voxel terrain that uses the marching cubes algorithm to polygonize the scalar field of voxels, and a triplanar shader for texturing. Say I have a grass texture mapped to the Y axis and a dirt texture for both the X and Z axes. Now, when my player digs downwards, the exposed surface still appears as grass. How would I make it appear as dirt? I have been thinking about this for a while, and the only way I can think of to get this effect is to mark vertices that have been dug with a certain vertex color; when a vertex carries that color, the shader would apply the dirt texture to it. Is there a better method?
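
    The vertex-color idea does work; conceptually the top-projected texture just becomes a blend driven by the painted channel. A minimal illustration of that math, written here as plain Java for clarity (in practice it would live in the fragment shader, and the names are made up):

        class DigBlend {
            // dug = painted vertex-color channel: 0 = untouched (grass), 1 = freshly dug (dirt).
            static float[] blendTopColor(float[] grassRgb, float[] dirtRgb, float dug) {
                float[] out = new float[3];
                for (int i = 0; i < 3; i++) {
                    out[i] = grassRgb[i] * (1f - dug) + dirtRgb[i] * dug;  // per-channel lerp
                }
                return out;
            }
        }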

    Read the article

  • How can I generate a 2d navigation mesh in a dynamic environment at runtime?

    - by Stephen
    So I've grasped how to use A* for path-finding, and I am able to use it on a grid. However, my game world is huge and I have many enemies moving toward the player, which is a moving target, so a grid system is too slow for path-finding. I need to simplify my node graph by using a navigation mesh. I grasp the concept of "how" a mesh works (finding a path through nodes on the vertices and/or the centers of the edges of polygons). My game uses dynamic obstacles that are procedurally generated at run-time. I can't quite wrap my head around how to take a plane that has multiple obstacles in it and programmatically divide the walkable area up into polygons for the navigation mesh, like the following image. Where do I start? How do I know when a segment of walkable area is already defined, or worse, when I realize I need to subdivide a previously defined walkable area as the algorithm "walks" through the map? I'm using JavaScript in Node.js, if it matters.
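
    One common starting point, offered here as a hedged sketch rather than a full navmesh generator: rasterize the walkable area onto a coarse grid and greedily merge walkable cells into rectangles. Each rectangle becomes one polygon, rectangles that share an edge become neighbours in the A* graph, and when an obstacle appears or moves only the rectangles it touches need to be rebuilt. The sketch is in Java for brevity; the idea ports directly to JavaScript.

        import java.util.ArrayList;
        import java.util.List;

        class WalkableRects {
            /** Axis-aligned walkable rectangle in cell coordinates. */
            record Rect(int x, int y, int w, int h) {}

            /** Greedily merge walkable cells (true = walkable) into rectangles. */
            static List<Rect> build(boolean[][] walkable) {
                int rows = walkable.length, cols = walkable[0].length;
                boolean[][] used = new boolean[rows][cols];
                List<Rect> rects = new ArrayList<>();
                for (int y = 0; y < rows; y++) {
                    for (int x = 0; x < cols; x++) {
                        if (!walkable[y][x] || used[y][x]) continue;
                        int w = 1;                           // grow right as far as possible
                        while (x + w < cols && walkable[y][x + w] && !used[y][x + w]) w++;
                        int h = 1;                           // grow down while the full row still fits
                        grow:
                        while (y + h < rows) {
                            for (int i = 0; i < w; i++)
                                if (!walkable[y + h][x + i] || used[y + h][x + i]) break grow;
                            h++;
                        }
                        for (int dy = 0; dy < h; dy++)       // mark the merged cells as consumed
                            for (int dx = 0; dx < w; dx++)
                                used[y + dy][x + dx] = true;
                        rects.add(new Rect(x, y, w, h));
                    }
                }
                return rects;
            }
        }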

    Read the article

  • 2d libgdx: runtime level generation

    - by lxknvlk
    I have encountered a problem in my first game: I thought of an Array<Ground> groundArray that does groundArray.add when a new piece of ground appears on screen, and removes the oldest ground once it can no longer be seen (the player only moves to the right, like in Flappy Bird). The perfect structure for such a mechanic would be a queue, but libgdx doesn't have one. Using libgdx's Array is not intuitive either - I have to reverse the order of elements. It has a pop() method that removes the last element, but no equivalent for the first element. What are my options here? Extend the Array class and add something? Write my own queue-like class?
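
    One low-effort option, sketched below: a libgdx game is still plain Java, so java.util.ArrayDeque already gives the FIFO behaviour described (add at the tail, remove from the head) without extending libgdx's Array. The Ground type is the poster's own class and is only stubbed here.

        import java.util.ArrayDeque;

        class Ground { /* the poster's own ground-tile class, stubbed for the sketch */ }

        class GroundScroller {
            private final ArrayDeque<Ground> ground = new ArrayDeque<>();

            void spawn(Ground g) { ground.addLast(g); }      // new ground appears on the right
            void cullOldest()    { ground.pollFirst(); }     // oldest ground scrolls off the left
            Ground oldest()      { return ground.peekFirst(); }
        }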

    Read the article

  • 12.04 Ubuntu studio PREEMPT_RT kernel options

    - by Nate Iverson
    My audio processing needs require a preempt_rt kernel. I roughly followed the guide: https://wiki.ubuntu.com/KernelTeam/GitKernelBuild with a little help from: https://rt.wiki.kernel.org/index.php/RT_PREEMPT_HOWTO Currently I am using the 3.4 branch (which is the most recent at the time of this post): http://www.kernel.org/pub/linux/kernel/projects/rt/ I think I have a reasonable kernel config ( for my machine at least ). Multiple trials confirm I need the option: CONFIG_PREEMPT_RT_FULL=y I have the following questions: Is anyone maintaining a recent CONFIG_PREEMPT_RT_FULL kernel in a ppa? Is there any interest in providing a CONFIG_PREEMPT_RT_FULL in the official ubuntu-studio distribution? Does anyone have recent config pointers for a CONFIG_PREEMPT_RT_FULL kernel?

    Read the article

  • Rendering skybox in first person shooter

    - by brainydexter
    I am trying to get a skybox rendered correctly in my first-person shooter game. I have the skybox cube rendering using GL_TEXTURE_CUBE_MAP, and I author the cube with extents of -1 and 1 along X, Y and Z. However, I can't wrap my head around the camera transformations that I need to apply to get it right. My render loop looks something like this: mp_Camera->ApplyTransform() (takes the current player transformation, inverts it and pushes that onto the modelview stack), Draw GameObjects, Draw Skybox. DrawSkybox does the following: glEnable(GL_TEXTURE_CUBE_MAP); glDepthMask(GL_FALSE); draw the cube with its extents; glDisable(GL_TEXTURE_CUBE_MAP); glDepthMask(GL_TRUE); Do I need to translate the skybox by the translation of the camera? (By the way, that didn't help either.) EDIT: I forgot to mention: it looks like a small cube with unit extents, and I can strafe in/out of the cube. Screenshot:
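
    For reference, the usual fix is to draw the skybox with a view matrix whose translation has been removed (rotation only), so the unit cube is always centred on the camera and its extents never matter; drawing it first with depth writes disabled then lets the world render over it. A hedged illustration of the matrix part (plain Java, column-major layout as OpenGL uses):

        class SkyboxMath {
            /** Return a copy of a column-major 4x4 view matrix with its translation removed,
             *  so the unit skybox cube stays centred on the camera. */
            static float[] rotationOnlyView(float[] view) {
                float[] m = view.clone();
                m[12] = 0f; m[13] = 0f; m[14] = 0f;   // translation column in OpenGL's layout
                return m;
            }
        }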

    Read the article

  • Record Screen Activity with CamStudio

    - by Asian Angel
    Sometimes a visual demonstration works much better than a list of instructions. If you need to make a demo video for family and/or friends then you might want to have a look at CamStudio.

    Using CamStudio

    To get properly set up you will need to install two different files (the main program followed by the codec). Once that is done you are ready to get started. When you start the program you will see a surprisingly small window. Notice the highlighted Record to text…it serves as a visual indicator for the video type selected for recording.

    Before you start creating a video it would be a good idea to look through some of the settings. The first one to look at is the region or area that you want to record. Next you will want to look through the video options since these will affect the quality and final size of your video files. The default setting for quality is 70…adjust that to the level that best suits your needs. Note: For our example we maxed out the various video settings for best quality.

    On our system Microsoft Video 1 was listed as the default compressor but as you can see there were other options available. You can configure the settings for the compressor you want to use if desired. Keep in mind that each compressor will have unique settings of its own, so if you change it, be certain to go back and check. We decided to use the CamStudio Lossless Codec for our example (it gave the best results while trying the software).

    Going back to the main window you can toggle back and forth between .avi and .swf output using the last button. Once you are satisfied with the settings click on the red record button to start. If you need to pause while recording or stop recording click on the system tray icon and select the appropriate command.

    When you are finished recording, you will be presented with the save file window. Browse for the desired save location and name your new file. Once you have saved the file the movie player window will automatically open so that you can view your new video. Our sample video shown here is at 50% of original size so may look slightly “gritty”. The detail was much better at 100%.

    If you decide to record and save as .swf the process will be identical to recording in .avi format until the movie player window opens. At that time the conversion process from .avi to .swf will begin. When complete you will have a new flash video and html file that goes with it. Depending on which browser you have set as default, you may run into a small problem when the preview for your new .swf file tries to open. There is a small bug in the generated html file. You can use this work-around or… just open the .swf file directly in your favorite browser.

    Conclusion

    CamStudio may not produce the highest quality videos, but it’s free and does a very nice job nonetheless. If you are working on a tight budget or only need to make an occasional video then CamStudio is a very sensible choice.

    Links

    Download CamStudio Stable Version & CamStudio Codec *Download links are approximately half-way down the page. Download CamStudio Stable Version & CamStudio Codec at SourceForge *Beta version also available here.

    Read the article

  • Saving game data to server [on hold]

    - by Eugene Lim
    What's the best method to save the player's data to the server?
    Method to store the game saves - which of the following should I use? Using a database (e.g. MySQL) to store the game data as blobs? Or using the server hard disk to store the saved game data as binary files?
    Method to send saved game data to the server - what should I use? socket.IO / WebSocket, or a web-based scripting language that receives the game data as binary (for example, a PHP script that handles binary data and saves it to a file)?
    Meta-data - I read that some games store saved-game meta-data in database structures. What kind of meta-data is useful to store?

    Read the article

  • Make one monitor act like two, split in half

    - by Nathan J. Brauer
    Context: Ubuntu 11.10, Unity. Let's say I have a screen at resolution 1000x500. What I'd like to do is split the screen down the middle so [Unity or X or ?] acts as if there are two displays (each of 500x500). Examples: Unity will display a different toolbar (the top one) on each side of the display. If I maximize a window on the left side of the screen, it will fill the left side only. If I maximize on the right, it will fill the right. If I hit "fullscreen" in YouTube (Flash) or Chrome or Movie Player, it will only fill the side of the display that it's on. If it really is impossible to do this with Unity, will it work with Gnome3, and how? A million thanks!

    Read the article

  • Pulseaudio no sound card detected. Dummy output only

    - by Zach Smith
    I'm using 12.10 Quantal with Openbox and a .xinitrc script at login instead of a display manager. It's a relatively fresh install, and I noticed when I opened pavucontrol that the only output was a dummy one. I checked around and it appears that my sound card is physically installed but PulseAudio isn't detecting it. I'm really unsure what I should do, but any help getting my audio back would be appreciated. Edit: further info if it's at all useful:
    dante@dante-ubuntu:~$ uname -a && aplay -l && cat /proc/asound/version && head -n 1 /proc/asound/card*/codec#*
    Linux dante-ubuntu 3.5.0-17-generic #28-Ubuntu SMP Tue Oct 9 19:31:23 UTC 2012 x86_64 x86_64 x86_64 GNU/Linux
    aplay: device_list:252: no soundcards found...
    Advanced Linux Sound Architecture Driver Version 1.0.25.
    ==> /proc/asound/card0/codec#0 <==
    Codec: ATI R6xx HDMI
    ==> /proc/asound/card1/codec#0 <==
    Codec: IDT 92HD81B1X5

    Read the article

  • Why won't cron run my sh script?

    - by Dmitry Narkevich
    I used gnome-schedule to create a script that sets my headset as the fallback audio device, because it keeps getting unset when the headset is disconnected or the PC goes into sleep mode. Anyway, the crontab is this:
    SHELL=/bin/sh
    PATH=/bin:/sbin:/usr/bin:/usr/sbin:/home/dmitry/bin
    * * * * * headsetfix
    and /home/dmitry/bin/headsetfix is:
    #!/bin/sh
    pacmd set-default-sink alsa_output.usb-Logitech_Inc_Logitech_USB_Headset_H540_00000000-00-H540.analog-stereo
    pacmd set-default-source alsa_input.usb-Logitech_Inc_Logitech_USB_Headset_H540_00000000-00-H540.analog-stereo
    It runs fine from the terminal. I've made sure it's chmodded to be executable, and "which headsetfix", run from cron, outputs "/home/dmitry/bin/headsetfix", so I'm not sure what the problem is.

    Read the article

  • How can I make QSSTV recognise my sound device

    - by hellocatfood
    After reading this article from Instructables I wanted to try it out in Ubuntu. Luckily QSSTV is available, but I'm having problems getting it to work properly. When I press the green light (to receive data) I get this error: After going to Options > Configure Interfaces I discovered that it's attempting to use /dev/dsp, which doesn't appear to exist. In my /dev folder there are no files that reference sound devices, and I've tried all of the devices in /dev/snd but unfortunately none of them work. Is there a specific audio device that I should be linking to?

    Read the article

  • Beginners' Guide to Development

    - by Bombillazo
    Hello. So I have some experience programming in Java, and at the moment I am learning how to use Python. I have read up on the process of game design and such. I also have the media side covered, with experience in graphics and audio. My question is geared more towards the actual tools to use for making and developing games. I am willing to commit to a long-term development cycle, as I will be doing this as a hobby. I've heard of Flash, GameMaker, etc. I don't intend to create my own game engine, so I was looking for a platform that is extensible and easy to program with an OOP mindset. As a plus, it would be great if said game could be played directly from a website. TIA!

    Read the article

  • How to convert unconvertable & unviewable .ts files?

    - by Evelin Versh
    How do I convert .ts files that cannot be converted with the usual WinFF, Avidemux, etc. programs? The .ts files in question were recorded from TV with an STV digital cable digibox and are so far viewable to me ONLY with that same digibox. None of the video-playing programs I tried will open the files at all (e.g. the classic VLC and Windows Media Player). All but one of the video converters I tried are also unable to even open or load the file into the program, so no conversion is possible. According to WinFF it cannot find codec parameters during the conversion, which evidently leads to nothing happening. HELP, please!

    Read the article

  • How can a NodeJS server be used from Game Maker HTML5?

    - by Tokyo Dan
    I want to create a client-server game that runs on Game Maker HTML5 and NodeJS. The NodeJS server will be an AI server - a bot that acts like a human opponent and plays against the human player in a front-end game client coded in GM HTML5. How can a NodeJS server be used from GM HTML5? Are there any examples of such a system? I already have an iOS game that can talk to a remote AI server (coded in Lua) using TCP sockets. Can this be done with Game Maker HTML5 and NodeJS?

    Read the article

  • 11415 compile errors FTW?!

    - by Koning Baard
    Hello. This is something I've really never seen but, I downloaded the source code of the sine wave example at http://www.audiosynth.com/sinewavedemo.html . It is in an old Project Builder Project format, and I want to compile it with Xcode (GCC). However, Xcode gives me 11415 compile errors. The first few are (all in the precompilation of AppKit.h): /Developer/SDKs/MacOSX10.6.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSObject.h:31:0 /Developer/SDKs/MacOSX10.6.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSObject.h:31: error: expected identifier or '(' before '-' token /Developer/SDKs/MacOSX10.6.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSObject.h:33:0 /Developer/SDKs/MacOSX10.6.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSObject.h:33: error: expected identifier or '(' before '-' token /Developer/SDKs/MacOSX10.6.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSObject.h:35:0 /Developer/SDKs/MacOSX10.6.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSObject.h:35: error: expected identifier or '(' before '-' token /Developer/SDKs/MacOSX10.6.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSObject.h:36:0 /Developer/SDKs/MacOSX10.6.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSObject.h:36: error: expected identifier or '(' before '-' token /Developer/SDKs/MacOSX10.6.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSObject.h:37:0 /Developer/SDKs/MacOSX10.6.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSObject.h:37: error: expected identifier or '(' before '-' token /Developer/SDKs/MacOSX10.6.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSObject.h:38:0 /Developer/SDKs/MacOSX10.6.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSObject.h:38: error: expected identifier or '(' before '-' token /Developer/SDKs/MacOSX10.6.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSObject.h:40:0 /Developer/SDKs/MacOSX10.6.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSObject.h:40: error: expected identifier or '(' before '-' token /Developer/SDKs/MacOSX10.6.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSObject.h:42:0 /Developer/SDKs/MacOSX10.6.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSObject.h:42: error: expected identifier or '(' before '@' token /Developer/SDKs/MacOSX10.6.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSObject.h:48:0 /Developer/SDKs/MacOSX10.6.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSObject.h:48: error: expected identifier or '(' before '@' token /Developer/SDKs/MacOSX10.6.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSObject.h:54:0 /Developer/SDKs/MacOSX10.6.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSObject.h:54: error: expected identifier or '(' before '@' token /Developer/SDKs/MacOSX10.6.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSObject.h:59:0 /Developer/SDKs/MacOSX10.6.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSObject.h:59: error: expected identifier or '(' before '-' token /Developer/SDKs/MacOSX10.6.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSObject.h:61:0 /Developer/SDKs/MacOSX10.6.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSObject.h:61: error: expected identifier or '(' before '@' token /Developer/SDKs/MacOSX10.6.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSObject.h:69:0 
/Developer/SDKs/MacOSX10.6.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSObject.h:69: error: expected identifier or '(' before '+' token /Developer/SDKs/MacOSX10.6.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSObject.h:71:0 /Developer/SDKs/MacOSX10.6.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSObject.h:71: error: expected identifier or '(' before '+' token /Developer/SDKs/MacOSX10.6.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSValue.h:39:0 /Developer/SDKs/MacOSX10.6.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSValue.h:39: error: expected identifier or '(' before '-' token /Developer/SDKs/MacOSX10.6.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSValue.h:40:0 /Developer/SDKs/MacOSX10.6.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSValue.h:40: error: expected identifier or '(' before '-' token /Developer/SDKs/MacOSX10.6.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSValue.h:41:0 /Developer/SDKs/MacOSX10.6.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSValue.h:41: error: expected identifier or '(' before '-' token /Developer/SDKs/MacOSX10.6.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSValue.h:42:0 /Developer/SDKs/MacOSX10.6.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSValue.h:42: error: expected identifier or '(' before '-' token /Developer/SDKs/MacOSX10.6.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSValue.h:43:0 /Developer/SDKs/MacOSX10.6.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSValue.h:43: error: expected identifier or '(' before '-' token /Developer/SDKs/MacOSX10.6.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSValue.h:44:0 /Developer/SDKs/MacOSX10.6.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSValue.h:44: error: expected identifier or '(' before '-' token /Developer/SDKs/MacOSX10.6.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSValue.h:45:0 /Developer/SDKs/MacOSX10.6.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSValue.h:45: error: expected identifier or '(' before '-' token /Developer/SDKs/MacOSX10.6.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSValue.h:46:0 /Developer/SDKs/MacOSX10.6.sdk/System/Library/Frameworks/Foundation.framework/Headers/NSValue.h:46: error: expected identifier or '(' before '-' token Some of the code is: HAL.c /* * HAL.c * Sinewave * * Created by james on Fri Apr 27 2001. * Copyright (c) 2001 __CompanyName__. All rights reserved. 
* */ #include "HAL.h" #include "math.h" appGlobals gAppGlobals; OSStatus appIOProc (AudioDeviceID inDevice, const AudioTimeStamp* inNow, const AudioBufferList* inInputData, const AudioTimeStamp* inInputTime, AudioBufferList* outOutputData, const AudioTimeStamp* inOutputTime, void* device); #define FailIf(cond, handler) \ if (cond) { \ goto handler; \ } #define FailWithAction(cond, action, handler) \ if (cond) { \ { action; } \ goto handler; \ } // ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ // ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ // HAL Sample Code ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ // ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ // ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ //#define noErr 0 //#define false 0 OSStatus SetupHAL (appGlobalsPtr globals) { OSStatus err = noErr; UInt32 count, bufferSize; AudioDeviceID device = kAudioDeviceUnknown; AudioStreamBasicDescription format; // get the default output device for the HAL count = sizeof(globals->device); // it is required to pass the size of the data to be returned err = AudioHardwareGetProperty(kAudioHardwarePropertyDefaultOutputDevice, &count, (void *) &device); fprintf(stderr, "kAudioHardwarePropertyDefaultOutputDevice %d\n", err); if (err != noErr) goto Bail; // get the buffersize that the default device uses for IO count = sizeof(globals->deviceBufferSize); // it is required to pass the size of the data to be returned err = AudioDeviceGetProperty(device, 0, false, kAudioDevicePropertyBufferSize, &count, &bufferSize); fprintf(stderr, "kAudioDevicePropertyBufferSize %d %d\n", err, bufferSize); if (err != noErr) goto Bail; // get a description of the data format used by the default device count = sizeof(globals->deviceFormat); // it is required to pass the size of the data to be returned err = AudioDeviceGetProperty(device, 0, false, kAudioDevicePropertyStreamFormat, &count, &format); fprintf(stderr, "kAudioDevicePropertyStreamFormat %d\n", err); fprintf(stderr, "sampleRate %g\n", format.mSampleRate); fprintf(stderr, "mFormatFlags %08X\n", format.mFormatFlags); fprintf(stderr, "mBytesPerPacket %d\n", format.mBytesPerPacket); fprintf(stderr, "mFramesPerPacket %d\n", format.mFramesPerPacket); fprintf(stderr, "mChannelsPerFrame %d\n", format.mChannelsPerFrame); fprintf(stderr, "mBytesPerFrame %d\n", format.mBytesPerFrame); fprintf(stderr, "mBitsPerChannel %d\n", format.mBitsPerChannel); if (err != kAudioHardwareNoError) goto Bail; FailWithAction(format.mFormatID != kAudioFormatLinearPCM, err = paramErr, Bail); // bail if the format is not linear pcm // everything is ok so fill in these globals globals->device = device; globals->deviceBufferSize = bufferSize; globals->deviceFormat = format; Bail: return (err); } /* struct AudioStreamBasicDescription { Float64 mSampleRate; // the native sample rate of the audio stream UInt32 mFormatID; // the specific encoding type of audio stream UInt32 mFormatFlags; // flags specific to each format UInt32 mBytesPerPacket; // the number of bytes in a packet UInt32 mFramesPerPacket; // the number of frames in each packet UInt32 mBytesPerFrame; // the number of bytes in a frame UInt32 mChannelsPerFrame; // the number of channels in each frame 
UInt32 mBitsPerChannel; // the number of bits in each channel }; typedef struct AudioStreamBasicDescription AudioStreamBasicDescription; */ // ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ // This is a simple playThru ioProc. It simply places the data in the input buffer back into the output buffer. // Watch out for feedback from Speakers to Microphone OSStatus appIOProc (AudioDeviceID inDevice, const AudioTimeStamp* inNow, const AudioBufferList* inInputData, const AudioTimeStamp* inInputTime, AudioBufferList* outOutputData, const AudioTimeStamp* inOutputTime, void* appGlobals) { appGlobalsPtr globals = appGlobals; int i; double phase = gAppGlobals.phase; double amp = gAppGlobals.amp; double pan = gAppGlobals.pan; double freq = gAppGlobals.freq * 2. * 3.14159265359 / globals->deviceFormat.mSampleRate; int numSamples = globals->deviceBufferSize / globals->deviceFormat.mBytesPerFrame; // assume floats for now.... float *out = outOutputData->mBuffers[0].mData; for (i=0; i<numSamples; ++i) { float wave = sin(phase) * amp; phase = phase + freq; *out++ = wave * (1.0-pan); *out++ = wave * pan; } gAppGlobals.phase = phase; return (kAudioHardwareNoError); } // ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ OSStatus StartPlayingThruHAL(appGlobalsPtr globals) { OSStatus err = kAudioHardwareNoError; if (globals->soundPlaying) return 0; globals->phase = 0.0; err = AudioDeviceAddIOProc(globals->device, appIOProc, (void *) globals); // setup our device with an IO proc if (err != kAudioHardwareNoError) goto Bail; err = AudioDeviceStart(globals->device, appIOProc); // start playing sound through the device if (err != kAudioHardwareNoError) goto Bail; globals->soundPlaying = true; // set the playing status global to true Bail: return (err); } // ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ OSStatus StopPlayingThruHAL(appGlobalsPtr globals) { OSStatus err = kAudioHardwareNoError; if (!globals->soundPlaying) return 0; err = AudioDeviceStop(globals->device, appIOProc); // stop playing sound through the device if (err != kAudioHardwareNoError) goto Bail; err = AudioDeviceRemoveIOProc(globals->device, appIOProc); // remove the IO proc from the device if (err != kAudioHardwareNoError) goto Bail; globals->soundPlaying = false; // set the playing status global to false Bail: return (err); } Sinewave.m // // a very simple Cocoa CoreAudio app // by James McCartney [email protected] www.audiosynth.com // // Sinewave - this class implements a sine oscillator with dezippered control of frequency, pan and amplitude // #import "Sinewave.h" // define a C struct from the Obj-C object so audio callback can access data typedef struct { @defs(Sinewave); } sinewavedef; // this is the audio processing callback. OSStatus appIOProc (AudioDeviceID inDevice, const AudioTimeStamp* inNow, const AudioBufferList* inInputData, const AudioTimeStamp* inInputTime, AudioBufferList* outOutputData, const AudioTimeStamp* inOutputTime, void* defptr) { sinewavedef* def = defptr; // get access to Sinewave's data int i; // load instance vars into registers double phase = def->phase; double amp = def->amp; double pan = def->pan; double freq = def->freq; double ampz = def->ampz; double panz = def->panz; double freqz = def->freqz; int numSamples = def->deviceBufferSize / def->deviceFormat.mBytesPerFrame; // assume floats for now.... 
float *out = outOutputData->mBuffers[0].mData; for (i=0; i<numSamples; ++i) { float wave = sin(phase) * ampz; // generate sine wave phase = phase + freqz; // increment phase // write output *out++ = wave * (1.0-panz); // left channel *out++ = wave * panz; // right channel // de-zipper controls panz = 0.001 * pan + 0.999 * panz; ampz = 0.001 * amp + 0.999 * ampz; freqz = 0.001 * freq + 0.999 * freqz; } // save registers back to object def->phase = phase; def->freqz = freqz; def->ampz = ampz; def->panz = panz; return kAudioHardwareNoError; } @implementation Sinewave - (void) setup { OSStatus err = kAudioHardwareNoError; UInt32 count; device = kAudioDeviceUnknown; initialized = NO; // get the default output device for the HAL count = sizeof(device); // it is required to pass the size of the data to be returned err = AudioHardwareGetProperty(kAudioHardwarePropertyDefaultOutputDevice, &count, (void *) &device); if (err != kAudioHardwareNoError) { fprintf(stderr, "get kAudioHardwarePropertyDefaultOutputDevice error %ld\n", err); return; } // get the buffersize that the default device uses for IO count = sizeof(deviceBufferSize); // it is required to pass the size of the data to be returned err = AudioDeviceGetProperty(device, 0, false, kAudioDevicePropertyBufferSize, &count, &deviceBufferSize); if (err != kAudioHardwareNoError) { fprintf(stderr, "get kAudioDevicePropertyBufferSize error %ld\n", err); return; } fprintf(stderr, "deviceBufferSize = %ld\n", deviceBufferSize); // get a description of the data format used by the default device count = sizeof(deviceFormat); // it is required to pass the size of the data to be returned err = AudioDeviceGetProperty(device, 0, false, kAudioDevicePropertyStreamFormat, &count, &deviceFormat); if (err != kAudioHardwareNoError) { fprintf(stderr, "get kAudioDevicePropertyStreamFormat error %ld\n", err); return; } if (deviceFormat.mFormatID != kAudioFormatLinearPCM) { fprintf(stderr, "mFormatID != kAudioFormatLinearPCM\n"); return; } if (!(deviceFormat.mFormatFlags & kLinearPCMFormatFlagIsFloat)) { fprintf(stderr, "Sorry, currently only works with float format....\n"); return; } initialized = YES; fprintf(stderr, "mSampleRate = %g\n", deviceFormat.mSampleRate); fprintf(stderr, "mFormatFlags = %08lX\n", deviceFormat.mFormatFlags); fprintf(stderr, "mBytesPerPacket = %ld\n", deviceFormat.mBytesPerPacket); fprintf(stderr, "mFramesPerPacket = %ld\n", deviceFormat.mFramesPerPacket); fprintf(stderr, "mChannelsPerFrame = %ld\n", deviceFormat.mChannelsPerFrame); fprintf(stderr, "mBytesPerFrame = %ld\n", deviceFormat.mBytesPerFrame); fprintf(stderr, "mBitsPerChannel = %ld\n", deviceFormat.mBitsPerChannel); } - (void)setAmpVal:(double)val { amp = val; } - (void)setFreqVal:(double)val { freq = val * 2. * 3.14159265359 / deviceFormat.mSampleRate; } - (void)setPanVal:(double)val { pan = val; } - (BOOL)start { OSStatus err = kAudioHardwareNoError; sinewavedef *def; if (!initialized) return false; if (soundPlaying) return false; // initialize phase and de-zipper filters. 
phase = 0.0; freqz = freq; ampz = amp; panz = pan; def = (sinewavedef *)self; err = AudioDeviceAddIOProc(device, appIOProc, (void *) def); // setup our device with an IO proc if (err != kAudioHardwareNoError) return false; err = AudioDeviceStart(device, appIOProc); // start playing sound through the device if (err != kAudioHardwareNoError) return false; soundPlaying = true; // set the playing status global to true return true; } - (BOOL)stop { OSStatus err = kAudioHardwareNoError; if (!initialized) return false; if (!soundPlaying) return false; err = AudioDeviceStop(device, appIOProc); // stop playing sound through the device if (err != kAudioHardwareNoError) return false; err = AudioDeviceRemoveIOProc(device, appIOProc); // remove the IO proc from the device if (err != kAudioHardwareNoError) return false; soundPlaying = false; // set the playing status global to false return true; } @end Can anyone help me compiling this example? I'd really appriciate it. Thanks

    Read the article

  • Black & White Video with Youtube

    - by RobertPitt
    Hey'a guys, I have an issue with the YouTube player within Ubuntu: it seems that when playing videos they're in black & white, no matter whether they're normal or HD. I kind of figured out how to prevent this from happening by deleting the cookie PREF that holds a KV pair like so: PREF f1=50000000&fv=10.2.154 .youtube.com / Sat, 12 Mar 2011 11:38:29 GMT The issue is the Flash version key; when I delete this the color comes back, but obviously when I navigate to another page the cookie is set again. Does anyone know why this issue occurs and a possible fix? Using the latest Google Chrome on the latest version of Ubuntu (installed 3 days ago). Thanks

    Read the article

  • Adding a Wine (.exe) file to open/play files in Thunar's 'Custom Actions'

    - by cipricus
    What is the line parameter in Thunar's "custom actions" that would play a music directory in a Wine (.exe) program? I want to put Foobar2000 (and possibly other such applications) in Thunar's Custom Actions (actions that can be added/edited to be used in the context menu to play a music folder's content). If I put no command parameter at the end of the path, the player opens but nothing plays and there are no files in the playlist. If I add any of the listed parameters I get the error unknown commandline parameter: /path.to.file (Giving an answer to this question depends on knowing the CLI command that would have to be added into Thunar - so please look at this question too)

    Read the article

  • Python RPG advice? [closed]

    - by nikita.utiu
    I have started coding a text RPG engine in Python. I have the basic concepts laid down, like game state saving, input, output, etc. I was wondering how certain scripted game mechanics (e.g. debuffs that increase damage received from a certain player or multiply damage by the number of hits received, overriding a mob's default paths for certain events, etc.) are usually implemented. Some code bases or other source code would be useful (not necessarily Python). Thanks in advance.
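
    One pattern that comes up often for this is to model each scripted modifier as a small effect object that hooks into the damage (or pathing) pipeline, so content scripts only add and remove effects. A rough, hedged sketch follows (in Java rather than Python, since the poster says the language doesn't matter; all names are made up):

        import java.util.ArrayList;
        import java.util.List;

        interface Effect {
            int modifyIncomingDamage(int damage, Combatant target);  // hook into the damage pipeline
        }

        /** Debuff: all incoming damage is multiplied by a factor. */
        class Vulnerability implements Effect {
            private final double factor;
            Vulnerability(double factor) { this.factor = factor; }
            public int modifyIncomingDamage(int damage, Combatant target) {
                return (int) Math.round(damage * factor);
            }
        }

        /** Debuff: damage scales with the number of hits already received. */
        class StackingWound implements Effect {
            public int modifyIncomingDamage(int damage, Combatant target) {
                return damage * Math.max(1, target.hitsTaken);
            }
        }

        class Combatant {
            int hp = 100;
            int hitsTaken = 0;
            final List<Effect> effects = new ArrayList<>();

            void receiveDamage(int base) {
                int dmg = base;
                for (Effect e : effects) dmg = e.modifyIncomingDamage(dmg, this); // apply active debuffs
                hitsTaken++;
                hp -= dmg;
            }
        }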

    Read the article

  • Direction of the bullet - how to have something other than left, right, top, bottom

    - by Florian Margaine
    I'm making a simple shooter game using canvas and JavaScript. The current code can be seen here. To know which way I want the bullet to be shot, I simply have a direction property that can have 4 values (left, right, bottom, top), and I can then calculate the next position of the bullet easily. However, I'd like to move the bullet toward the mouse position, but I don't really see how to do this. How do I calculate the next position? I'm guessing there is some formula to calculate the line between two positions (the player's and the mouse's), but I don't have much of an idea yet. So there is no obstacle, but I don't see how to calculate this and get the next position of the bullet at each frame.
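
    The usual formula: take the vector from the player to the mouse, normalize it to length 1, and advance the bullet along it by its speed every frame. A small sketch of that math (Java here for consistency with the other examples; the same lines translate directly to JavaScript):

        class Bullet {
            double x, y;        // current position
            double dirX, dirY;  // unit direction, fixed at fire time
            double speed;       // pixels per frame

            Bullet(double px, double py, double mouseX, double mouseY, double speed) {
                double dx = mouseX - px, dy = mouseY - py;
                double len = Math.sqrt(dx * dx + dy * dy);
                // normalize so the direction has length 1
                // (guard against len == 0 if the mouse sits exactly on the player)
                this.dirX = dx / len;
                this.dirY = dy / len;
                this.x = px;
                this.y = py;
                this.speed = speed;
            }

            void update() {                           // called once per frame
                x += dirX * speed;
                y += dirY * speed;
            }
        }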

    Read the article

  • Blue people on flash videos (Youtube, Vimeo, etc...)

    - by Luis Alvarado
    After a recent update I am seeing EVERYONE in any video blue. Only the people are blue. First I was thinking it was an April Fools joke about the song Blue from Eiffel 65 but it has been 4 days already. I tested on another 2 PCs, same problem. It started about a week ago I think. I am using Flash 11.2.202.228 with Firefox 11. The problem is not there if I use Google Chrome. In Chrome the Flash Player is 11.2.31.118

    Read the article

  • Drawing graphics in Java game

    - by wolf
    I am quite new to game development, so here is a question (maybe a stupid one): In my sidescroller I have a bunch of different graphics objects that I need to draw (player, background tiles, creatures, projectiles, etc.). Most tutorials I've read so far show that each object has its own draw method, which is then called from some other method. What if I had one method that does all the drawing? Let's say I keep all my objects in an array or queue (or multiple arrays) and then go through each of them, get an image and draw it. So basically, would it be better (and why) to have each object have its own draw method, or one method that does all the drawing? Or does it matter at all? I feel like the second option is more comfortable, because then all the stuff to do with drawing would be in one place...
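
    The two options are not mutually exclusive: a common middle ground is a single render loop that owns the draw order, while each object only knows how to draw itself. A hedged sketch of that shape using java.awt (names are illustrative):

        import java.awt.Graphics2D;
        import java.util.ArrayList;
        import java.util.List;

        interface Drawable {
            void draw(Graphics2D g);   // each object knows how to draw itself...
        }

        class Renderer {
            private final List<Drawable> scene = new ArrayList<>();

            void add(Drawable d) { scene.add(d); }

            // ...but one central method owns draw order, culling, camera offsets, etc.
            void render(Graphics2D g) {
                for (Drawable d : scene) {
                    d.draw(g);
                }
            }
        }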

    Read the article
