Search Results

Search found 34 results on 2 pages for 'avaudiorecorder'.

Page 1/2 | 1 2  | Next Page >

  • Problem using AVAudioRecorder.

    - by tek3
    Hi all, I am facing a strange problem with AVAudioRecorder. In my application I need to record audio and play it. I am creating my recorder as: if(recorder) { if(recorder.recording) [recorder stop]; [recorder release]; recorder = nil; } NSString * filePath = [NSHomeDirectory() stringByAppendingPathComponent: [NSString stringWithFormat:@"Documents/%@.caf",songTitle]]; NSDictionary *recordSettings = [[NSDictionary alloc] initWithObjectsAndKeys: [NSNumber numberWithFloat: 44100.0],AVSampleRateKey, [NSNumber numberWithInt: kAudioFormatAppleIMA4],AVFormatIDKey, [NSNumber numberWithInt: 1], AVNumberOfChannelsKey, [NSNumber numberWithInt: AVAudioQualityMax],AVEncoderAudioQualityKey,nil]; recorder = [[AVAudioRecorder alloc] initWithURL: [NSURL fileURLWithPath:filePath] settings: recordSettings error: nil]; recorder.delegate = self; if ([recorder prepareToRecord] == YES){ [recorder record]; I am releasing and creating the recorder every time I press the record button. But the problem is that AVAudioRecorder takes some time before it starts recording, so if I press the record button multiple times in quick succession, my application freezes for a while. The same code works fine without any problem when headphones are connected to the device: there is no delay in recording, and the app doesn't freeze even if I press the record button multiple times. Any help in this regard will be highly appreciated. Thanks in advance.
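
    A hedged sketch of one common mitigation (not from the original post): configure the audio session once up front and keep a single prepared recorder, so a tap on the record button never has to pay the route-switch cost or rebuild the recorder. The viewDidLoad/recordTapped names are illustrative assumptions; recorder stands in for the poster's ivar.

        #import <AVFoundation/AVFoundation.h>

        - (void)viewDidLoad {
            [super viewDidLoad];
            NSError *error = nil;
            // Switching the hardware route (built-in mic vs. headset) is the slow part;
            // doing it once here keeps that cost off the record-button tap.
            AVAudioSession *session = [AVAudioSession sharedInstance];
            [session setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
            [session setActive:YES error:&error];
        }

        - (IBAction)recordTapped:(id)sender {
            if (recorder.recording) return;   // ignore re-entrant taps instead of tearing the recorder down
            if ([recorder prepareToRecord]) {
                [recorder record];
            }
        }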

    Read the article

  • AVAudioRecorder Memory Leak

    - by Eric Ranschau
    I'm hoping someone out there can back me up on this... I've been working on an application that allows the end user to record a small audio file for later playback and am in the process of testing for memory leaks. I continue to very consistently run into a memory leak when the AVAudioRecorder's "stop" method attempts to close the audio file to which it's been recording. This really seems to be a leak in the framework itself, but if I'm being a bonehead you can tell me. To illustrate, I've worked up a stripped down test app that does nothing but start/stop a recording w/ the press of a button. For the sake of simplicity, everything happens in the app delegate as follows: @synthesize audioRecorder, button; @synthesize window; - (BOOL) application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions { // create complete path to database NSString *tempPath = NSTemporaryDirectory(); NSString *audioFilePath = [tempPath stringByAppendingString:@"/customStatement.caf"]; // define audio file url NSURL *audioFileURL = [[NSURL alloc] initFileURLWithPath:audioFilePath]; // define audio recorder settings NSDictionary *settings = [[NSDictionary alloc] initWithObjectsAndKeys: [NSNumber numberWithInt:kAudioFormatAppleIMA4], AVFormatIDKey, [NSNumber numberWithInt:1], AVNumberOfChannelsKey, [NSNumber numberWithInt:AVAudioQualityLow], AVSampleRateConverterAudioQualityKey, [NSNumber numberWithFloat:44100], AVSampleRateKey, [NSNumber numberWithInt:8], AVLinearPCMBitDepthKey, nil ]; // define audio recorder audioRecorder = [[AVAudioRecorder alloc] initWithURL:audioFileURL settings:settings error:nil]; [audioRecorder setDelegate:self]; [audioRecorder setMeteringEnabled:YES]; [audioRecorder prepareToRecord]; // define record button button = [UIButton buttonWithType:UIButtonTypeRoundedRect]; [button addTarget:self action:@selector(handleTouch_recordButton) forControlEvents:UIControlEventTouchUpInside]; [button setFrame:CGRectMake(110.0, 217.5, 100.0, 45.0)]; [button setTitle:@"Record" forState:UIControlStateNormal]; [button setTitle:@"Stop" forState:UIControlStateSelected]; // configure the main view controller UIViewController *viewController = [[UIViewController alloc] init]; [viewController.view addSubview:button]; // add controllers to window [window addSubview:viewController.view]; [window makeKeyAndVisible]; // release [audioFileURL release]; [settings release]; [viewController release]; return YES; } - (void) handleTouch_recordButton { if ( ![button isSelected] ) { [button setSelected:YES]; [audioRecorder record]; } else { [button setSelected:NO]; [audioRecorder stop]; } } - (void) dealloc { [audioRecorder release]; [button release]; [window release]; [super dealloc]; } The stack trace from Instruments shows pretty clearly that the "closeFile" method in the AVFoundation code is leaking...something. You can see a screenshot of the Instruments session here: Developer Forums: AVAudioRecorder Memory Leak Any thoughts would be greatly appreciated!
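
    A hedged workaround sketch (an assumption, not a confirmed fix): since the leak shows up in the file-close path triggered by stop, one option is to avoid closing the file on every button press by pausing instead, and only calling stop (and releasing the recorder) once, when the recording is truly finished. finishRecording is a hypothetical method name.

        #import <AVFoundation/AVFoundation.h>

        - (void)handleTouch_recordButton {
            if (![button isSelected]) {
                [button setSelected:YES];
                [audioRecorder record];        // resumes after a pause, appending to the same file
            } else {
                [button setSelected:NO];
                [audioRecorder pause];         // keeps the file open; no closeFile per press
            }
        }

        - (void)finishRecording {
            [audioRecorder stop];              // close the file exactly once
            [audioRecorder release];
            audioRecorder = nil;
        }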

    Read the article

  • AVAudioRecorder prepareToRecord works, but record fails

    - by iPadDeveloper2011
    I just built and tested a basic AVAudioRecorder/AVAudioPlayer sound recorder and player. Eventually I got this working on the device, as well as the simulator. My player/recorder code is in a single UIView subclass. Unfortunately, when I copy the class into my main project, it no longer works (on the device--the simulator is fine). prepareToRecord is working fine, but record isn't. Here is some code: audioRecorder = [[ AVAudioRecorder alloc] initWithURL:url settings:recordSettings error:&error]; if ([audioRecorder prepareToRecord]){ audioRecorder.meteringEnabled = YES; if(![audioRecorder record])NSLog(@"recording failed!"); }else { int errorCode = CFSwapInt32HostToBig ([error code]); NSLog(@"preparedToRecord=NO Error: %@ [%4.4s])" , [error localizedDescription], (char*)&errorCode); ... I get "recording failed". Anyone have any ideas why this is happening?
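
    One thing worth ruling out (a sketch based on an assumption, since the main project isn't shown): on the device, record can return NO when the app's audio session category doesn't allow input; the stand-alone test app may simply have been using the default session, while the main project configures one that is playback-only. audioRecorder stands in for the recorder created in the question.

        #import <AVFoundation/AVFoundation.h>

        NSError *sessionError = nil;
        AVAudioSession *session = [AVAudioSession sharedInstance];
        // A category that permits input; without it, record can fail on hardware even
        // though prepareToRecord (which mainly creates the output file) succeeds.
        [session setCategory:AVAudioSessionCategoryPlayAndRecord error:&sessionError];
        [session setActive:YES error:&sessionError];

        if ([audioRecorder prepareToRecord] && [audioRecorder record]) {
            audioRecorder.meteringEnabled = YES;
        } else {
            NSLog(@"recording failed: %@", sessionError);
        }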

    Read the article

  • iPhone - OpenAL stops playing if I record with AVAudioRecorder

    - by Oscar Peli
    Hi there, this is an iPhone-related question: I use OpenAL to play some sound (I have to manage gain, pitch, etc.). I want to record what I'm playing, and I use AVAudioRecorder, but when I call prepareToRecord, OpenAL stops playing audio. What's the problem? Here is the record IBAction I use: - (IBAction) record: (id) sender { NSError *error; NSMutableDictionary *settings = [NSMutableDictionary dictionary]; [settings setValue: [NSNumber numberWithInt:kAudioFormatLinearPCM] forKey:AVFormatIDKey]; [settings setValue: [NSNumber numberWithFloat:8000.0] forKey:AVSampleRateKey]; [settings setValue: [NSNumber numberWithInt: 1] forKey:AVNumberOfChannelsKey]; [settings setValue: [NSNumber numberWithInt:16] forKey:AVLinearPCMBitDepthKey]; [settings setValue: [NSNumber numberWithBool:NO] forKey:AVLinearPCMIsBigEndianKey]; [settings setValue: [NSNumber numberWithBool:NO] forKey:AVLinearPCMIsFloatKey]; NSURL *url = [NSURL fileURLWithPath:FILEPATH]; self.recorder = [[AVAudioRecorder alloc] initWithURL:url settings:settings error:&error]; self.recorder.delegate = self; self.recorder.meteringEnabled = YES; [self.recorder prepareToRecord]; [self.recorder record]; } Thanks
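
    A hedged sketch of the usual suspect (an assumption, since the session setup isn't shown): creating a recorder implicitly reconfigures the audio session for input, which can silence an OpenAL context that was set up for output only. Putting the session into a category that allows simultaneous input and output before prepareToRecord often keeps the OpenAL sound alive.

        #import <AVFoundation/AVFoundation.h>

        // Before [self.recorder prepareToRecord]:
        NSError *sessionError = nil;
        AVAudioSession *session = [AVAudioSession sharedInstance];
        [session setCategory:AVAudioSessionCategoryPlayAndRecord error:&sessionError];
        [session setActive:YES error:&sessionError];
        // Now OpenAL output and AVAudioRecorder input share one play-and-record route.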

    Read the article

  • AVAudioPlayer crash after playing from an AVAudioRecorder

    - by munchine
    I've got a button the user taps to start recording and taps again to stop. When it stops, I want the recorded voice echoed back so the user can hear what was recorded. This works fine the first time. If I hit the button a third time, it starts a new recording, and when I hit stop it crashes with EXC_BAD_ACCESS. - (IBAction) readToMeTapped { if(recording) { recording = NO; [readToMeButton setTitle:@"Stop Recording" forState: UIControlStateNormal ]; NSMutableDictionary *recordSetting = [[NSDictionary alloc] initWithObjectsAndKeys: [NSNumber numberWithFloat: 44100.0], AVSampleRateKey, [NSNumber numberWithInt: kAudioFormatAppleLossless], AVFormatIDKey, [NSNumber numberWithInt: 1], AVNumberOfChannelsKey, [NSNumber numberWithInt: AVAudioQualityMax], AVEncoderAudioQualityKey, nil]; // Create a new dated file NSDate *now = [NSDate dateWithTimeIntervalSinceNow:0]; NSString *caldate = [now description]; recordedTmpFile = [NSURL fileURLWithPath:[[NSString stringWithFormat:@"%@/%@.caf", DOCUMENTS_FOLDER, caldate] retain]]; error = nil; recorder = [[ AVAudioRecorder alloc] initWithURL:recordedTmpFile settings:recordSetting error:&error]; [recordSetting release]; if(!recorder){ NSLog(@"recorder: %@ %d %@", [error domain], [error code], [[error userInfo] description]); UIAlertView *alert = [[UIAlertView alloc] initWithTitle: @"Warning" message: [error localizedDescription] delegate: nil cancelButtonTitle:@"OK" otherButtonTitles:nil]; [alert show]; [alert release]; return; } NSLog(@"Using File called: %@",recordedTmpFile); //Setup the recorder to use this file and record to it. [recorder setDelegate:self]; [recorder prepareToRecord]; [recorder recordForDuration:(NSTimeInterval) 5]; //recording for a limited time } else { // it crashes the second time it gets here! recording = YES; NSLog(@"Recording YES Using File called: %@",recordedTmpFile); [readToMeButton setTitle:@"Start Recording" forState:UIControlStateNormal ]; [recorder stop]; //Stop the recorder. //playback recording AVAudioPlayer * newPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:recordedTmpFile error:&error]; [recordedTmpFile release]; self.aPlayer = newPlayer; [newPlayer release]; [aPlayer setDelegate:self]; [aPlayer prepareToPlay]; [aPlayer play]; } } - (void)audioRecorderDidFinishRecording:(AVAudioRecorder *)sender successfully:(BOOL)flag { NSLog (@"audioRecorderDidFinishRecording:successfully:"); [recorder release]; recorder = nil; } Checking the debugger, it flags the error here: @synthesize aPlayer, recorder; This is the part I don't understand. I thought it might have something to do with releasing memory, but I've been careful. Have I missed something?
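
    One hedged observation on the code as posted (this may or may not be the crash, but it violates the retain/release rules): recordedTmpFile comes back autoreleased from fileURLWithPath: (the retain in the posted code applies to the inner NSString, not to the URL), yet the playback branch sends it [recordedTmpFile release]. The second time through, the ivar points at a deallocated NSURL. A sketch of balanced ownership, reusing the poster's names:

        // Take explicit ownership of the URL itself instead of the format string.
        NSString *path = [NSString stringWithFormat:@"%@/%@.caf", DOCUMENTS_FOLDER, caldate];
        [recordedTmpFile release];                                   // drop the previous recording's URL, if any
        recordedTmpFile = [[NSURL alloc] initFileURLWithPath:path];  // retained ivar

        // ...later, in the playback branch, do NOT release recordedTmpFile again:
        AVAudioPlayer *newPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:recordedTmpFile
                                                                          error:&error];
        self.aPlayer = newPlayer;
        [newPlayer release];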

    Read the article

  • AVAudioPlayer won't play audio file after AVAudioRecorder

    - by Kevin
    I create a .caf audio file using AVAudioRecorder, and if I try to play it back using AVAudioPlayer I get no sound on the iPhone (played in the simulator it works fine). If I close my application and reopen it, the file plays fine. Also, I am not able to adjust the phone volume after recording unless I close and reopen my application. Any ideas?
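
    A hedged sketch of one common explanation (an assumption, since no session code is shown): after recording, the audio session is still in a record-oriented category, which routes playback away from the speaker and ties the hardware volume buttons to the wrong route; relaunching the app resets the session, which is why it then plays. Switching the category back before playback usually restores both the sound and the volume control.

        #import <AVFoundation/AVFoundation.h>

        // After stopping the recorder, before creating/playing the AVAudioPlayer:
        NSError *sessionError = nil;
        [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback
                                               error:&sessionError];
        [[AVAudioSession sharedInstance] setActive:YES error:&sessionError];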

    Read the article

  • AVAudioRecorder - Continue recording to file after user stops recording by leaving the application a

    - by Tegeril
    Can this be done? And if not, how far down towards Core Audio do I need to go (what method of recording should I be using instead)? I've noticed the behavior of AVAudioRecorder is to overwrite a file if it finds one at the path provided when you request that it record again, so I know that's not going to work. I'm also curious about file format restriction with this idea. Can you effectively resume an AAC or IMA4 encoding (the length of the files I want to record make WAV and probably even Apple Lossless prohibitive)? Thanks.
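
    AVAudioRecorder itself can't append to an existing file, so a common workaround (sketched here under the assumption that stitching the pieces together afterwards is acceptable) is to give every resumed take its own segment file; that sidesteps the question of resuming an AAC or IMA4 bitstream mid-file. nextSegmentURL and recordSettings are illustrative names.

        // Hypothetical helper: one new segment file per resume, never overwriting.
        - (NSURL *)nextSegmentURL {
            static NSUInteger segmentIndex = 0;
            NSString *documents = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                        NSUserDomainMask, YES) objectAtIndex:0];
            NSString *name = [NSString stringWithFormat:@"take-%lu.caf", (unsigned long)segmentIndex++];
            return [NSURL fileURLWithPath:[documents stringByAppendingPathComponent:name]];
        }

        // Each time recording resumes, build a fresh recorder on a fresh segment;
        // the earlier segments stay untouched on disk for later concatenation.
        recorder = [[AVAudioRecorder alloc] initWithURL:[self nextSegmentURL]
                                               settings:recordSettings error:NULL];
        [recorder record];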

    Read the article

  • Getting AveragePower and PeakPower for a Channel in AVAudioRecorder

    - by Biranchi
    Hi all, I am annoyed with this piece of code. I am trying to get the averagePowerForChannel and peakPowerForChannel while recording audio, but every time I get 0.0. Below is my code for recording audio: NSMutableDictionary *recordSetting =[[NSDictionary alloc] initWithObjectsAndKeys:[NSNumber numberWithFloat: 22050.0], AVSampleRateKey, [NSNumber numberWithInt: kAudioFormatLinearPCM], AVFormatIDKey, [NSNumber numberWithInt: 1], AVNumberOfChannelsKey, [NSNumber numberWithInt: AVAudioQualityMax], AVEncoderAudioQualityKey, [NSNumber numberWithInt:32],AVLinearPCMBitDepthKey, [NSNumber numberWithBool:NO],AVLinearPCMIsBigEndianKey, [NSNumber numberWithBool:NO],AVLinearPCMIsFloatKey, nil]; recorder1 = [[AVAudioRecorder alloc] initWithURL:[NSURL fileURLWithPath:audioFilePath] settings:recordSetting error:&err]; recorder1.meteringEnabled = YES; recorder1.delegate=self; [recorder1 prepareToRecord]; [recorder1 record]; levelTimer = [NSTimer scheduledTimerWithTimeInterval: 0.3f target: self selector: @selector(levelTimerCallback:) userInfo: nil repeats: YES]; - (void)levelTimerCallback:(NSTimer *)timer { [recorder1 updateMeters]; NSLog(@"Peak Power : %f , %f", [recorder1 peakPowerForChannel:0], [recorder1 peakPowerForChannel:1]); NSLog(@"Average Power : %f , %f", [recorder1 averagePowerForChannel:0], [recorder1 averagePowerForChannel:1]); } What is the error in the code?
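
    A hedged sketch (assuming the audio session is otherwise set up for recording): for a one-channel recording only channel 0 carries data, and the raw readings are decibels full scale, so it can help to log just channel 0 and convert to a linear level; flat readings are also what you see when the session category doesn't allow input at all.

        - (void)levelTimerCallback:(NSTimer *)timer {
            [recorder1 updateMeters];
            // Mono recording: only channel 0 is meaningful; channel 1 is undefined here.
            float averageDb = [recorder1 averagePowerForChannel:0];   // -160 ... 0 dBFS
            float peakDb    = [recorder1 peakPowerForChannel:0];
            float level     = powf(10.0f, 0.05f * averageDb);         // linear 0 ... 1
            NSLog(@"avg %.1f dB  peak %.1f dB  level %.3f", averageDb, peakDb, level);
        }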

    Read the article

  • OpenAL doesn't work when using AVAudioRecorder and AVAudioPlayer

    - by Pawe
    Hi. I have been troubled by an audio problem for several days. I don't think OpenAL gets along with the AVAudio classes. I have my own OpenAL class (it wraps the MyOpenAL class). My app starts recording using AVAudioRecorder. I stop recording, and then I click the "OpenAL Play" button, which plays a sound using OpenAL. I can't hear it. But I can hear my recording when I click the "AVAudioPlayer Play" button, which uses AVAudioPlayer. I tested the oalTouch, avTouch and SpeakHere sample code; they behave the same way. Does OpenAL have a problem being used together with the AVAudio classes? I have been googling for a long time, but I couldn't find a solution or the same problem reported elsewhere. Thanks for reading.
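
    A hedged sketch of a workaround people report for this pattern (an assumption, not a verified fix): when the recorder reconfigures the audio session, the OpenAL context can be left suspended, so suspending it yourself around the recording and re-asserting it afterwards puts it back in a known state. openALContext stands in for however the wrapper class stores its ALCcontext.

        #include <OpenAL/al.h>
        #include <OpenAL/alc.h>

        alcSuspendContext(openALContext);        // before starting the AVAudioRecorder
        // ... record with AVAudioRecorder, then stop ...
        alcMakeContextCurrent(NULL);             // after recording: re-assert the context
        alcMakeContextCurrent(openALContext);
        alcProcessContext(openALContext);        // resume OpenAL processing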

    Read the article

  • AVAudioRecorder Won't Record On Device

    - by Dyldo42
    This is my method: -(void) playOrRecord:(UIButton *)sender { if (playBool == YES) { NSError *error = nil; NSString *filePath = [[NSBundle mainBundle] pathForResource:[NSString stringWithFormat: @"%d", [sender tag]] ofType:@"caf"]; NSURL *fileUrl = [NSURL fileURLWithPath:filePath]; AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:fileUrl error:&error]; [player setNumberOfLoops:0]; [player play]; } else if (playBool == NO) { if ([recorder isRecording]) { [recorder stop]; [nowRecording setImage:[UIImage imageNamed:@"NormalNormal.png"] forState:UIControlStateNormal]; [nowRecording setImage:[UIImage imageNamed:@"NormalSelected.png"] forState:UIControlStateSelected]; } if (nowRecording == sender) { nowRecording = nil; return; } nowRecording = sender; NSError *error = nil; NSString *filePath = [[NSBundle mainBundle] pathForResource:[NSString stringWithFormat: @"%d", [sender tag]] ofType:@"caf"]; NSURL *fileUrl = [NSURL fileURLWithPath:filePath]; [sender setImage:[UIImage imageNamed:@"RecordingNormal.png"] forState:UIControlStateNormal]; [sender setImage:[UIImage imageNamed:@"RecordingSelected.png"] forState:UIControlStateSelected]; recorder = [[AVAudioRecorder alloc] initWithURL:fileUrl settings:recordSettings error:&error]; [recorder record]; } } Most of it is self explanatory; playBool is a BOOL that is YES when it is in play mode. Everything works in the simulator however, when I run it on a device, [recorder record] returns NO. Does anyone have a clue as to why this is happening?
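
    A hedged guess at why this is device-only, suggested by the posted code: the recorder's URL points into [[NSBundle mainBundle] pathForResource:...], and the app bundle is writable in the simulator but read-only on a real device, so recording into it fails there. A sketch that records into the Documents directory instead (the playback branch can keep reading bundled files); recorder and recordSettings are the poster's names.

        // Writable destination for recordings on the device.
        NSString *documents = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                    NSUserDomainMask, YES) objectAtIndex:0];
        NSString *fileName = [NSString stringWithFormat:@"%d.caf", [sender tag]];
        NSURL *fileUrl = [NSURL fileURLWithPath:[documents stringByAppendingPathComponent:fileName]];

        NSError *error = nil;
        recorder = [[AVAudioRecorder alloc] initWithURL:fileUrl settings:recordSettings error:&error];
        if (![recorder record]) {
            NSLog(@"record failed: %@", error);   // error is only set by the initializer, but log it anyway
        }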

    Read the article

  • Cocoa: AVAudioRecorder Fails to Record

    - by kumaryr
    AVAudioSession *audioSession = [AVAudioSession sharedInstance]; NSError *err = nil; [audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&err]; if(err){ NSLog(@"audioSession: %@ %d %@", [err domain], [err code], [[err userInfo] description]); return; } [audioSession setActive:YES error:&err]; err = nil; if(err){ NSLog(@"audioSession: %@ %d %@", [err domain], [err code], [[err userInfo] description]); return; } NSMutableDictionary *recordSetting = [[NSMutableDictionary alloc] init]; [recordSetting setValue:[NSNumber numberWithInt: kAudioFormatAppleIMA4] forKey:AVFormatIDKey]; [recordSetting setValue:[NSNumber numberWithFloat:40000.0] forKey:AVSampleRateKey]; [recordSetting setValue:[NSNumber numberWithInt: 2] forKey:AVNumberOfChannelsKey]; [recordSetting setValue:[NSNumber numberWithInt:16] forKey:AVLinearPCMBitDepthKey]; [recordSetting setValue:[NSNumber numberWithBool:NO] forKey:AVLinearPCMIsBigEndianKey]; [recordSetting setValue:[NSNumber numberWithBool:NO] forKey:AVLinearPCMIsFloatKey]; // Create a new dated file NSDate *now = [NSDate dateWithTimeIntervalSinceNow:0]; NSString *caldate = [now description]; NSString *recorderFilePath = [[NSString stringWithFormat:@"%@/%@.caf", DOCUMENTS_FOLDER, caldate] retain]; NSLog(recorderFilePath); url = [NSURL fileURLWithPath:recorderFilePath]; err = nil; recorder = [[ AVAudioRecorder alloc] initWithURL:url settings:recordSetting error:&err]; if(!recorder){ NSLog(@"recorder: %@ %d %@", [err domain], [err code], [[err userInfo] description]); UIAlertView *alert = [[UIAlertView alloc] initWithTitle: @"Warning" message: [err localizedDescription] delegate: nil cancelButtonTitle:@"OK" otherButtonTitles:nil]; [alert show]; [alert release]; return; } //prepare to record [recorder setDelegate:self]; [recorder prepareToRecord]; recorder.meteringEnabled = YES; BOOL audioHWAvailable = audioSession.inputIsAvailable; if (! audioHWAvailable) { UIAlertView *cantRecordAlert = [[UIAlertView alloc] initWithTitle: @"Warning" message: @"Audio input hardware not available" delegate: nil cancelButtonTitle:@"OK" otherButtonTitles:nil]; [cantRecordAlert show]; [cantRecordAlert release]; return; } // [NSTimer scheduledTimerWithTimeInterval:1 target:self selector:@selector( updateTimerDisplay) userInfo:nil repeats:YES]; // [recorder recordForDuration:(NSTimeInterval)10 ]; // [NSTimer scheduledTimerWithTimeInterval:1 target:self selector:@selector( updateTimerDisplay) userInfo:nil repeats:YES];
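
    Two hedged observations on the code as posted (assumptions about intent): the error check after setActive: resets err to nil immediately before testing it, so a failed activation is silently ignored, and the requested 40 kHz, 2-channel IMA4 format may not be something the device's input hardware accepts. A sketch of the checks in a working order, with a more conservative format:

        #import <AVFoundation/AVFoundation.h>

        NSError *err = nil;
        AVAudioSession *audioSession = [AVAudioSession sharedInstance];

        [audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&err];
        if (err) { NSLog(@"setCategory failed: %@", err); return; }

        err = nil;                                   // clear BEFORE the call, not before the check
        [audioSession setActive:YES error:&err];
        if (err) { NSLog(@"setActive failed: %@", err); return; }

        // A mono, 44.1 kHz IMA4 request that device microphones handle.
        NSDictionary *recordSetting = [NSDictionary dictionaryWithObjectsAndKeys:
            [NSNumber numberWithInt:kAudioFormatAppleIMA4], AVFormatIDKey,
            [NSNumber numberWithFloat:44100.0],             AVSampleRateKey,
            [NSNumber numberWithInt:1],                     AVNumberOfChannelsKey,
            nil];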

    Read the article

  • Conflict between AVAudioRecorder and AVAudioPlayer

    - by John
    Hi, here is my problem. The code (FooController): NSString *path = [[NSBundle mainBundle] pathForResource:@"mySound" ofType:@"m4v"]; soundEffect = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:path] error:NULL]; [soundEffect play]; // MicBlow micBlow = [[MicBlowController alloc]init]; And MicBlowController contains: NSURL *url = [NSURL fileURLWithPath:@"/dev/null"]; NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys: [NSNumber numberWithFloat: 44100.0], AVSampleRateKey, [NSNumber numberWithInt: kAudioFormatAppleLossless], AVFormatIDKey, [NSNumber numberWithInt: 1], AVNumberOfChannelsKey, [NSNumber numberWithInt: AVAudioQualityMax], AVEncoderAudioQualityKey, nil]; and [recorder updateMeters]; const double ALPHA = 0.05; double peakPowerForChannel = pow(10,(0.05*[recorder peakPowerForChannel:0])); lowPassResults = ALPHA * peakPowerForChannel + (1.0 - ALPHA) * lowPassResults; NSLog(@"Average input: %f Peak input %f Low pass results: %f",[recorder averagePowerForChannel:0],[recorder peakPowerForChannel:0],lowPassResults); If I play the background sound and try to get the peak from the mic I get this log: Average input: 0.000000 Peak input -120.000000 Low pass results: 0.000001 But if I comment out all the parts about AVAudioPlayer, it works. I think there is a problem with the channels. Thanks

    Read the article

  • Record AVAudioPlayer output using AVAudioRecorder

    - by Kieran
    In my app the user plays a sound by pressing a button. There are several buttons which can be played simultaneously. The sounds are played using AVAudioPlayer instances. I want to record the output of these instances using AVAudioRecorder. I have set it all up and a file is created and records but when I play it back it does not play any sound. It is just a silent file the length of the recording. Does anyone know if there is a setting I am missing with AVAudioPlayer or AVAudioRecorder? Thanks

    Read the article

  • Operation could not be completed. AVAudioRecorder iPhone SDK

    - by Jonathan
    I am trying to record using the iPhone's microphone. This is my code: NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES); NSString *documentsDirectory = [paths objectAtIndex:0]; // the path to write file NSString *appFile = [documentsDirectory stringByAppendingPathComponent:@"testing.mp3"]; NSURL *url = [NSURL fileURLWithPath:appFile isDirectory:NO]; NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys: [NSNumber numberWithFloat: 44100.0], AVSampleRateKey, [NSNumber numberWithInt: kAudioFormatMPEGLayer3], AVFormatIDKey, [NSNumber numberWithInt: 1], AVNumberOfChannelsKey, [NSNumber numberWithInt: AVAudioQualityLow], AVEncoderAudioQualityKey, nil]; NSError *error; recorder = [[AVAudioRecorder alloc] initWithURL:url settings:settings error:&error]; if ([recorder prepareToRecord] == YES){ [recorder record]; }else { int errorCode = CFSwapInt32HostToBig ([error code]); NSLog(@"Error: %@ [%4.4s])" , [error localizedDescription], (char*)&errorCode); } NSLog(@"BOOL = %d", (int)recorder.recording); This is the error I get: Operation could not be completed. (OSStatus error 1718449215.) And I cannot work out why this doesn't work, as I got a lot of the code from a website. Jonathan
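
    For what it's worth, OSStatus 1718449215 spells out the four-character code 'fmt?', i.e. the requested data format isn't supported: AVAudioRecorder on the iPhone cannot encode MP3 (kAudioFormatMPEGLayer3). A hedged sketch that records AAC instead (any recorder-supported format would do; older devices without an AAC encoder would need IMA4 or linear PCM):

        NSString *appFile = [documentsDirectory stringByAppendingPathComponent:@"testing.m4a"];
        NSURL *url = [NSURL fileURLWithPath:appFile isDirectory:NO];

        // kAudioFormatMPEG4AAC is an encoder the device actually provides,
        // unlike kAudioFormatMPEGLayer3, which it can only decode.
        NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
            [NSNumber numberWithFloat:44100.0],              AVSampleRateKey,
            [NSNumber numberWithInt:kAudioFormatMPEG4AAC],   AVFormatIDKey,
            [NSNumber numberWithInt:1],                      AVNumberOfChannelsKey,
            [NSNumber numberWithInt:AVAudioQualityLow],      AVEncoderAudioQualityKey,
            nil];

        NSError *error = nil;
        recorder = [[AVAudioRecorder alloc] initWithURL:url settings:settings error:&error];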

    Read the article

  • iPhone metering problems

    - by Eric Christensen
    When I'm recording an AVAudioRecorder object, I repeat a chunk of code via an NSTimer that includes: [soundrecording updateMeters]; NSLog(@"channel 0 average:%f, peak:%f",[soundrecording averagePowerForChannel:0],[soundrecording peakPowerForChannel:0]); NSLog(@"channel 1 average:%f, peak:%f",[soundrecording averagePowerForChannel:1],[soundrecording peakPowerForChannel:1]); When I'm recording a mono file, the peak power for channel 0 is just what you'd expect, a float from -160 to 0. But average power for channel 0 is always zero. (And, of course, the values for channel 1 are both zero.) When I'm recording a stereo file, both the average and peak values for both channels are as expected. Any thoughts on why, when recording a mono file, the average value for channel 0 isn't returning correctly, even though the peak is? Thanks!

    Read the article

  • Cannot play a recorded sound on device.

    - by B_
    I'm using the exact code from the iPhone Application Programming Guide Multimedia Support to use AVAudioRecorder to record a file to the disk and then AVAudioPlayer to load and play that file. This is working fine in the simulator but is not working on the device. The file gets loaded (we can see the NSTimeInterval) but does not play (play returns false). After it didn't work with the sample code from the website, we tried changing to a bunch of different codecs with no success. And of course, the sound is on. Thanks a bunch.

    Read the article

  • Can AVAudioSession do full duplex?

    - by Eric Christensen
    It would seem like it should be able to, but the following breakout test code can't do both: //play a file: NSArray *pathsArray = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES); NSString *documentsDirectory = [pathsArray objectAtIndex:0]; NSString* playFilePath=[documentsDirectory stringByAppendingPathComponent:@"testplayfile.wav"]; AVAudioPlayer *tempplayer = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:playFilePath] error:nil]; [tempplayer prepareToPlay]; [tempplayer play]; //and record a file: NSString* recFilePath=[documentsDirectory stringByAppendingPathComponent:@"testrecordfile.wav"]; AVAudioRecorder *soundrecording = [[AVAudioRecorder alloc] initWithURL:[NSURL fileURLWithPath:recFilePath] settings:nil error:nil]; [soundrecording prepareToRecord]; [soundrecording record]; This is the minimum I can think of to individually play one file and record another. And this works just fine in the simulator. I can play back a file and record at the same time. But it doesn't work on the iPhone itself. If I comment out either function, the other performs fine. The playback plays fine either alone or with both, if it's first. If I comment out the playback, the record records fine. (There's additional code to stop the recording not shown here.) So each works fine, but not together. I know Audio Queue has a setting to allow both, but I don't see an analogue for AVAudioSession. Any idea if it's possible, and if so, what I need to add? Thanks!
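
    A hedged answer sketch: the session can run full duplex, but only when its category says so; the default category is not play-and-record, and the simulator doesn't enforce the difference. Setting the category before creating the player and recorder is usually enough:

        #import <AVFoundation/AVFoundation.h>

        NSError *sessionError = nil;
        AVAudioSession *session = [AVAudioSession sharedInstance];
        // Play-and-record is the AVAudioSession analogue of the Audio Queue
        // "play and record" setting: input and output stay live at the same time.
        [session setCategory:AVAudioSessionCategoryPlayAndRecord error:&sessionError];
        [session setActive:YES error:&sessionError];

        // ...then create and start the AVAudioPlayer and AVAudioRecorder as in the question.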

    Read the article

  • "SpeakHere" iPhone Sample App

    - by Biranchi
    Hi all, why has Apple given such complex code as the reference sample code in the documentation? I have wasted too much time trying to find out how "averagePowerForChannel" works, but still with no luck. Where can I find good, simple sample code?

    Read the article

  • Audio Recording and Playback

    - by Siva
    Hi, I am new to iPhone development. In my app, I want to record a voice and play the recorded voice back. Right now I am trying to do this via the SpeakHere sample code, but I find it too hard to understand with the AudioToolbox framework. Some people say the AudioToolbox framework is too difficult to work with. Is there any other sample that does not use the AudioToolbox framework, or which way is best to do this? Please help me!
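
    A hedged pointer (a minimal sketch, not a tutorial): on iPhone OS 3.0 and later, AVAudioRecorder and AVAudioPlayer in AVFoundation cover simple record-then-play without touching AudioToolbox. The file name is just an example, and the retain/release bookkeeping is omitted.

        #import <AVFoundation/AVFoundation.h>

        // Record to a file in Documents...
        NSString *documents = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                    NSUserDomainMask, YES) objectAtIndex:0];
        NSURL *fileURL = [NSURL fileURLWithPath:[documents stringByAppendingPathComponent:@"voice.caf"]];
        AVAudioRecorder *voiceRecorder = [[AVAudioRecorder alloc] initWithURL:fileURL settings:nil error:NULL];
        [voiceRecorder record];
        // ... later ...
        [voiceRecorder stop];

        // ...and play it back.
        AVAudioPlayer *voicePlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:fileURL error:NULL];
        [voicePlayer prepareToPlay];
        [voicePlayer play];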

    Read the article

  • Are these AVAudioSettings right?

    - by Dyldo42
    self.recordSettings = [NSMutableDictionary dictionary]; [recordSettings setValue:[NSNumber numberWithInt:kAudioFormatAppleIMA4] forKey:AVFormatIDKey]; [recordSettings setValue:[NSNumber numberWithFloat:44100.0] forKey:AVSampleRateKey]; [recordSettings setValue:[NSNumber numberWithInt:1] forKey:AVNumberOfChannelsKey]; recordSettings is declared in my view's header file and initialised in its viewDidLoad method. For some reason, everything is working in the simulator, but on a device, the [recorder record] method is returning NO. The only theory I have is that something in the recordSettings isn't compatible with an actual device. Any ideas?
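
    The settings themselves look legal; a hedged sketch of how to find out what the device objects to: capture the NSError when building the recorder, make sure the session category allows input, and record somewhere writable such as Documents, since only the simulator lets you write into the app bundle. recorder and recordSettings are the question's names; the file name is an example.

        NSError *error = nil;
        [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
        [[AVAudioSession sharedInstance] setActive:YES error:&error];

        NSString *documents = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                    NSUserDomainMask, YES) objectAtIndex:0];
        NSURL *destination = [NSURL fileURLWithPath:[documents stringByAppendingPathComponent:@"take.caf"]];

        recorder = [[AVAudioRecorder alloc] initWithURL:destination settings:recordSettings error:&error];
        if (recorder == nil || ![recorder record]) {
            NSLog(@"could not record: %@", error);   // the error explains rejected settings or paths
        }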

    Read the article

  • How to check the type of inputAvailable in iPad?

    - by iphoneDev
    Hi, I am implementing sound recording functionality in my iPad app. I want to prompt the user to attach a headset with a microphone for better performance. For this I need to check whether the user has connected a headset with a microphone or not. AVAudioSession has an inputIsAvailable property, but it returns YES for the iPad's built-in mic as well. So please suggest how to detect whether a headset with a mic is connected to the device or not.
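
    A hedged sketch using the C audio-session API of that SDK (the exact route strings are not documented, so the "Headset" comparison is an assumption to verify on hardware): inputIsAvailable only says that some input exists, whereas the current audio route names which one.

        #import <AudioToolbox/AudioToolbox.h>
        #import <AVFoundation/AVFoundation.h>

        // Make sure the session exists before querying it.
        [[AVAudioSession sharedInstance] setActive:YES error:NULL];

        CFStringRef route = NULL;
        UInt32 routeSize = sizeof(route);
        OSStatus status = AudioSessionGetProperty(kAudioSessionProperty_AudioRoute,
                                                  &routeSize, &route);
        if (status == noErr && route != NULL) {
            NSString *routeName = (NSString *)route;
            // Wired headsets with a mic typically report a route containing "Headset";
            // treat this as a heuristic, not a guarantee.
            BOOL headsetMicAttached = [routeName rangeOfString:@"Headset"].location != NSNotFound;
            NSLog(@"audio route: %@  headset mic: %d", routeName, headsetMicAttached);
        }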

    Read the article

  • Sound does not work on device

    - by diana
    In my app, to record I write: NSArray *filePaths = NSSearchPathForDirectoriesInDomains (NSDocumentDirectory, NSUserDomainMask, YES); NSString *recordingDirectory = [filePaths objectAtIndex: 0]; NSString *resourcePath = [recordingDirectory stringByAppendingString:@"/sound.caf"]; self.soundFileURL = [NSURL fileURLWithPath:resourcePath]; AVAudioSession *audioSession = [AVAudioSession sharedInstance]; audioSession.delegate = self; [audioSession setActive: YES error: nil]; [[AVAudioSession sharedInstance] setCategory: AVAudioSessionCategoryRecord error: nil]; NSDictionary *recordSettings = [[NSDictionary alloc] initWithObjectsAndKeys: [NSNumber numberWithFloat: 44100.0], AVSampleRateKey, [NSNumber numberWithInt: kAudioFormatAppleLossless], AVFormatIDKey, [NSNumber numberWithInt: 1], AVNumberOfChannelsKey, [NSNumber numberWithInt: AVAudioQualityMax], AVEncoderAudioQualityKey, nil]; AVAudioRecorder *newRecorder = [[AVAudioRecorder alloc] initWithURL: soundFileURL settings: recordSettings error: nil]; [recordSettings release]; self.soundRecorder = newRecorder; [newRecorder release]; soundRecorder.delegate = self; [soundRecorder prepareToRecord]; [soundRecorder record]; recording = YES; To stop recording I write: [soundRecorder stop]; recording = NO; self.soundRecorder = nil; To play I write: AVAudioPlayer *newPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL: self.soundFileURL error: nil]; [fileURL release]; self.player = newPlayer; [newPlayer release]; [player prepareToPlay]; [player setDelegate: self]; [button setTitle: @"Pause" forState: UIControlStateHighlighted]; [button setTitle: @"Pause" forState: UIControlStateNormal]; [player play]; In the iPhone Simulator everything is OK: I record, then stop, then play, and it all works fine. But on my iPhone device there is no sound. Any help will be greatly appreciated.
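
    A hedged sketch of the likely culprit, consistent with the code shown: the session is put into the record-only category and never taken out of it, and that category mutes output on the device (the simulator doesn't enforce categories). Switching to a playback-capable category before creating the player usually brings the sound back.

        // After stopping the recorder, before creating and playing the AVAudioPlayer:
        NSError *error = nil;
        [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&error];
        [[AVAudioSession sharedInstance] setActive:YES error:&error];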

    Read the article

  • iPhone SDK: Change playback speed using core audio AVAudioPlayer

    - by Harkonian
    I'd like to be able to play back audio I've recorded using AVAudioRecorder @ 1.5x or 2.0x speed. I don't see anything in AVAudioPlayer that will support that. I'd appreciate some suggestions, with code if possible, on how to accomplish this with the iPhone 3.x SDK. I'm not overly concerned with lowering the pitch to compensate for increased playback speed, but being able to do so would be optimal.
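
    With the iPhone 3.x SDK this question targets, AVAudioPlayer has no speed control, so faster-than-real-time playback means dropping down to Core Audio (for example, a varispeed or time-pitch processing step) or a third-party time-stretch library. Purely for reference, later SDKs (iOS 5 and up) added rate control to AVAudioPlayer itself; a sketch of that newer API, with recordingURL as a placeholder:

        AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:recordingURL error:NULL];
        player.enableRate = YES;     // must be enabled before prepareToPlay (iOS 5+ API)
        [player prepareToPlay];
        player.rate = 1.5f;          // 1.5x playback speed
        [player play];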

    Read the article
