Search Results

Search found 4 results on 1 page for 'avcapturedevice'.

  • Setting white balance and exposure mode for the iPhone camera + enum default

    - by Spectravideo328
    I am using the back camera of an iPhone 4 and going through the standard and lengthy process of creating an AVCaptureSession and adding an AVCaptureDevice to it. Before attaching the AVCaptureDeviceInput of that camera to the session, I am testing my understanding of white balance and exposure, so I am trying this:

        [self.theCaptureDevice lockForConfiguration:nil];
        [self.theCaptureDevice setWhiteBalanceMode:AVCaptureWhiteBalanceModeLocked];
        [self.theCaptureDevice setExposureMode:AVCaptureExposureModeContinuousAutoExposure];
        [self.theCaptureDevice unlockForConfiguration];

    1. Given that the various options for white balance mode are in an enum, I would have thought that the default is always zero, since the enum typedef variable was never assigned a value. If I breakpoint and po the values in the debugger, I find that the default white balance mode is actually 2. Unfortunately, the AVCaptureDevice header file does not say what the defaults are for the different camera settings.

    2. This might sound silly, but can I assume that once I stop the app, all settings for white balance and exposure mode go back to their defaults? So that if I start another app right after, the camera device is not somehow stuck on those "hardware settings".
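
    A minimal sketch of the same locking pattern with the error surfaced and the modes checked for support first; self.theCaptureDevice is taken from the question, the rest is standard AVCaptureDevice API:

        NSError *error = nil;
        if ([self.theCaptureDevice lockForConfiguration:&error]) {
            // Only request modes the hardware supports; the default mode is
            // device-dependent, so don't assume the enum's zero value.
            if ([self.theCaptureDevice isWhiteBalanceModeSupported:AVCaptureWhiteBalanceModeLocked])
                self.theCaptureDevice.whiteBalanceMode = AVCaptureWhiteBalanceModeLocked;
            if ([self.theCaptureDevice isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure])
                self.theCaptureDevice.exposureMode = AVCaptureExposureModeContinuousAutoExposure;
            [self.theCaptureDevice unlockForConfiguration];
        } else {
            NSLog(@"lockForConfiguration failed: %@", error);
        }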

    Read the article

  • Why does AVCaptureSession output the wrong orientation?

    - by Peter
    Hey guys, I followed Apple's instructions for capturing a video session using AVCaptureSession: http://developer.apple.com/iphone/library/qa/qa2010/qa1702.html. One problem I'm facing is that even though the orientation of the camera / iPhone device is vertical (and the AVCaptureVideoPreviewLayer shows a vertical camera stream), the output image seems to be in landscape mode. I checked the width and height of imageBuffer inside imageFromSampleBuffer: of the sample code, and I got 640px and 480px respectively. Does anyone know why this is the case? Thanks!
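
    For reference, a small sketch (the delegate method signature is the standard AVCaptureVideoDataOutputSampleBufferDelegate callback) that logs the buffer's native dimensions; the sensor delivers frames in its native landscape orientation no matter how the device is held, which matches the 640x480 reported above:

        - (void)captureOutput:(AVCaptureOutput *)captureOutput
        didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
               fromConnection:(AVCaptureConnection *)connection {
            CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
            // Frames arrive in the sensor's landscape orientation,
            // e.g. 640 wide by 480 high, regardless of device orientation.
            size_t width  = CVPixelBufferGetWidth(imageBuffer);
            size_t height = CVPixelBufferGetHeight(imageBuffer);
            NSLog(@"frame is %zu x %zu", width, height);
        }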

    Read the article

  • How to directly rotate a CVImageBuffer image in iOS 4 without converting to UIImage?

    - by Ian Charnas
    I am using OpenCV 2.2 on the iPhone to detect faces. I'm using iOS 4's AVCaptureSession to get access to the camera stream, as seen in the code that follows. My challenge is that the video frames come in as CVBufferRef (pointers to CVImageBuffer) objects, and they come in oriented in landscape, 480px wide by 300px high. This is fine if you are holding the phone sideways, but when the phone is held upright I want to rotate these frames 90 degrees clockwise so that OpenCV can find the faces correctly. I could convert the CVBufferRef to a CGImage, then to a UIImage, and then rotate, as this person is doing: Rotate CGImage taken from video frame. However, that wastes a lot of CPU. I'm looking for a faster way to rotate the incoming images, ideally using the GPU to do the processing if possible. Any ideas? Ian

    Code sample:

        -(void) startCameraCapture {
            // Start up the face detector
            faceDetector = [[FaceDetector alloc] initWithCascade:@"haarcascade_frontalface_alt2" withFileExtension:@"xml"];

            // Create the AVCapture session
            session = [[AVCaptureSession alloc] init];

            // Create a preview layer to show the output from the camera
            AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
            previewLayer.frame = previewView.frame;
            previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
            [previewView.layer addSublayer:previewLayer];

            // Get the default camera device
            AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

            // Create an AVCaptureInput with the camera device
            NSError *error = nil;
            AVCaptureInput *cameraInput = [[AVCaptureDeviceInput alloc] initWithDevice:camera error:&error];
            if (cameraInput == nil) {
                NSLog(@"Error creating camera capture: %@", error);
            }

            // Set the output
            AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
            videoOutput.alwaysDiscardsLateVideoFrames = YES;

            // Create a queue besides the main thread queue to run the capture on
            dispatch_queue_t captureQueue = dispatch_queue_create("captureQueue", NULL);

            // Set up our delegate
            [videoOutput setSampleBufferDelegate:self queue:captureQueue];

            // Release the queue; the output retains it while it is in use
            dispatch_release(captureQueue);

            // Configure the pixel format
            videoOutput.videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA],
                (id)kCVPixelBufferPixelFormatTypeKey,
                nil];

            // ... and the size of the frames we want.
            // Try AVCaptureSessionPresetLow if this is too slow.
            [session setSessionPreset:AVCaptureSessionPresetMedium];

            // If you wish to cap the frame rate to a known value, such as
            // 10 fps, set minFrameDuration.
            videoOutput.minFrameDuration = CMTimeMake(1, 10);

            // Add the input and output
            [session addInput:cameraInput];
            [session addOutput:videoOutput];

            // Start the session
            [session startRunning];
        }

        - (void)captureOutput:(AVCaptureOutput *)captureOutput
        didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
               fromConnection:(AVCaptureConnection *)connection {
            // Only run if we're not already processing an image
            if (!faceDetector.imageNeedsProcessing) {
                // Get the CVImage from the sample buffer
                CVImageBufferRef cvImage = CMSampleBufferGetImageBuffer(sampleBuffer);

                // Send the CVImage to the FaceDetector for later processing
                [faceDetector setImageFromCVPixelBufferRef:cvImage];

                // Trigger the image processing on the main thread
                [self performSelectorOnMainThread:@selector(processImage)
                                       withObject:nil
                                    waitUntilDone:NO];
            }
        }
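
    One CPU-side alternative to the UIImage round trip is a sketch like the following, using Accelerate's vImage to rotate the raw BGRA pixels directly. Note the assumptions: vImage's geometry functions require iOS 5 or later (beyond the question's iOS 4 target), and the caller allocates and owns the destination buffer:

        #import <Accelerate/Accelerate.h>

        // Rotate a 32BGRA pixel buffer 90 degrees clockwise into a
        // caller-allocated buffer of width * height * 4 bytes. The
        // "_ARGB8888" variant just moves whole 4-byte pixels, so it
        // works for BGRA data too.
        static void RotateFrame90CW(CVImageBufferRef cvImage, void *rotated) {
            CVPixelBufferLockBaseAddress(cvImage, 0);
            vImage_Buffer src = {
                .data     = CVPixelBufferGetBaseAddress(cvImage),
                .height   = CVPixelBufferGetHeight(cvImage),
                .width    = CVPixelBufferGetWidth(cvImage),
                .rowBytes = CVPixelBufferGetBytesPerRow(cvImage)
            };
            // Width and height swap under a quarter turn.
            vImage_Buffer dst = {
                .data     = rotated,
                .height   = src.width,
                .width    = src.height,
                .rowBytes = src.height * 4
            };
            Pixel_8888 background = {0, 0, 0, 0};
            vImageRotate90_ARGB8888(&src, &dst, kRotate90DegreesClockwise,
                                    background, kvImageNoFlags);
            CVPixelBufferUnlockBaseAddress(cvImage, 0);
        }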

    Read the article

  • Implementing ZBar QR Code Reader in UIView

    - by Bobster4300
    I really need help here. I'm pretty new to iOS/Objective-C, so sorry if the problem resolution is obvious or if my code is terrible. Be easy on me!! :-)

    I'm struggling to integrate the ZBarSDK for reading QR codes into an iPad app I'm building. If I use ZBarReaderController (for which there are plenty of tutorials and guides), it works fine. However, I want to make the camera come up in a UIView as opposed to the fullscreen camera. I have gotten as far as making the camera view (readerView) come up in the UIView as expected, but I get an error when it scans a code. The error does not come up until a code is scanned, which makes me believe this is delegate-related. Here are the important parts of my code (ZBarSDK.h is imported in the .pch file):

    SignInViewController.h

        #import <UIKit/UIKit.h>
        #import <AVFoundation/AVFoundation.h>

        @class AVCaptureSession, AVCaptureDevice;

        @interface SignInViewController : UIViewController <ZBarReaderDelegate> {
            ZBarReaderView *readerView;
            UITextView *resultText;
        }

        @property (nonatomic, retain) UIImagePickerController *imgPicker;
        @property (strong, nonatomic) IBOutlet UITextView *resultText;
        @property (strong, nonatomic) IBOutlet ZBarReaderView *readerView;

        -(IBAction)StartScan:(id)sender;

        @end

    SignInViewController.m

        #import "SignInViewController.h"

        @interface SignInViewController ()
        @end

        @implementation SignInViewController
        @synthesize resultText, readerView;

        -(IBAction)StartScan:(id)sender {
            readerView = [ZBarReaderView new];
            readerView.readerDelegate = self;
            readerView.tracksSymbols = NO;
            readerView.frame = CGRectMake(30, 70, 230, 230);
            readerView.torchMode = 0;
            readerView.device = [self frontFacingCameraIfAvailable];

            ZBarImageScanner *scanner = readerView.scanner;
            [scanner setSymbology:ZBAR_I25 config:ZBAR_CFG_ENABLE to:0];

            [self relocateReaderPopover:[self interfaceOrientation]];
            [readerView start];
            [self.view addSubview:readerView];
            resultText.hidden = NO;
        }

        - (void)readerControllerDidFailToRead:(ZBarReaderController *)reader withRetry:(BOOL)retry {
            NSLog(@"the image picker is failing to read");
        }

        - (void)imagePickerController:(UIImagePickerController *)reader didFinishPickingMediaWithInfo:(NSDictionary *)info {
            NSLog(@"the image picker is calling successfully %@", info);
            id<NSFastEnumeration> results = [info objectForKey:ZBarReaderControllerResults];
            ZBarSymbol *symbol = nil;
            NSString *hiddenData;
            for (symbol in results)
                hiddenData = [NSString stringWithString:symbol.data];
            NSLog(@"the symbol is the following %@", symbol.data);
            resultText.text = symbol.data;
            NSLog(@"BARCODE= %@", symbol.data);
            NSLog(@"SYMBOL : %@", hiddenData);
            resultText.text = hiddenData;
        }

        @end

    The error I get when a code is scanned:

        2012-12-16 14:28:32.797 QRTestApp[7970:907] -[SignInViewController readerView:didReadSymbols:fromImage:]: unrecognized selector sent to instance 0x1e88b1c0
        2012-12-16 14:28:32.799 QRTestApp[7970:907] *** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '-[SignInViewController readerView:didReadSymbols:fromImage:]: unrecognized selector sent to instance 0x1e88b1c0'

    I'm not too worried about what happens with the results just yet; I just want to get past this error. It took me ages just to get the camera to come up in the UIView, due to a severe lack of tutorials and documentation on ZBarReaderView (for beginners, anyway). Thanks all.
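
    The unrecognized selector in the crash log is the method ZBarReaderView calls on its readerDelegate; the UIImagePickerController-style callbacks above never fire for a ZBarReaderView. A minimal sketch of that delegate method, assuming the stock ZBarSDK headers:

        // Called by ZBarReaderView (via ZBarReaderViewDelegate) when a
        // symbol is decoded; this is the selector named in the crash log.
        - (void)readerView:(ZBarReaderView *)view
            didReadSymbols:(ZBarSymbolSet *)symbols
                 fromImage:(UIImage *)image {
            for (ZBarSymbol *symbol in symbols) {
                resultText.text = symbol.data;
                break; // use the first decoded symbol
            }
            [view stop];
        }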

    Read the article
