Search Results

Search found 63 results on 3 pages for 'nstimeinterval'.

Page 3 of 3

  • iPhone SDK: can't get UIScrollView to work with a UIWebView

    - by Maxwell Segal
    I am building an app that in part of it has two views - one portrait and one landscape. And I want to offer scrolling and pinch & zooming on the landscape orientation. I've built the two views and the simple UIImages in the landscape view all work OK but it does not seem to insert my UIWebView into the UIScrollView. I've done this before with UIImage and all was OK but I must be missing something here - can anybody help with some advice. I'm sure it's something very simple - perhaps to do with the adding of the subview??. Thanks in advance of your help. I've shown the relevant parts of my code below: @interface MonthViewController : UIViewController <UIScrollViewDelegate> { MonthViewController *monthController; IBOutlet UIView *portraitView; IBOutlet UIView *landscapeView; IBOutlet UIWebView *monthKWHChartPortrait; UIWebView *monthKWHChartLandscape; IBOutlet UIScrollView *scrollChartLandscape; IBOutlet UIImageView *eLogoPortrait; IBOutlet UIImageView *eLogoLandscape; IBOutlet UIImageView *clientLogoPortrait; IBOutlet UIImageView *clientLogoLandscape; IBOutlet UIActivityIndicatorView *loadIndicatorPortrait; IBOutlet UIActivityIndicatorView *loadIndicatorLandscape; NSTimer *timer; IBOutlet UILabel *test; } -(UIView *)viewForZoomingInScrollView:(UIScrollView *)scrollChartLandscape { return monthKWHChartLandscape; } - (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation { return (interfaceOrientation == UIInterfaceOrientationPortrait || interfaceOrientation == UIInterfaceOrientationLandscapeLeft || interfaceOrientation == UIInterfaceOrientationLandscapeRight); } -(void)willRotateToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation duration:(NSTimeInterval)duration { [super willRotateToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation duration:duration]; if (toInterfaceOrientation ==UIInterfaceOrientationLandscapeRight) { NSLog(@"right"); test.text=@"right"; NSString *urlAddressLandscape = @"http://PRIVATEURLHERE.aspx?WIDTH=440"; NSURL *urlLandscape = [NSURL URLWithString:urlAddressLandscape]; NSURLRequest *requestObjLandscape = [NSURLRequest requestWithURL:urlLandscape]; [self.monthKWHChartLandscape loadRequest:requestObjLandscape]; scrollChartLandscape.contentSize = CGSizeMake(monthKWHChartLandscape.frame.size.width, monthKWHChartLandscape.frame.size.height); scrollChartLandscape.maximumZoomScale = 2.5; scrollChartLandscape.minimumZoomScale = 0.6; scrollChartLandscape.showsHorizontalScrollIndicator = YES; scrollChartLandscape.showsVerticalScrollIndicator = YES; scrollChartLandscape.clipsToBounds = YES; scrollChartLandscape.delegate = self; [self.scrollChartLandscape addSubview:monthKWHChartLandscape]; self.view=landscapeView; //self.view.transform=CGAffineTransformMakeRotation(deg2rad*(90)); //self.view.bounds=CGRectMake(0.0, 0.0, 480.0, 320.0); } else
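
    A note on the excerpt above: monthKWHChartLandscape is declared without an IBOutlet, so unless it is created somewhere not shown it will be nil when loadRequest: and addSubview: run, and messages to nil are silently ignored. A minimal sketch of the usual setup, reusing the question's ivar names (the frame size is an assumption):

        // Create the web view before adding it to the scroll view; an unconnected,
        // never-allocated ivar is nil and every message sent to it does nothing.
        - (void)setUpLandscapeChart
        {
            if (monthKWHChartLandscape == nil) {
                monthKWHChartLandscape = [[UIWebView alloc] initWithFrame:CGRectMake(0, 0, 440, 320)];
            }
            [scrollChartLandscape addSubview:monthKWHChartLandscape];

            scrollChartLandscape.contentSize = monthKWHChartLandscape.frame.size;
            scrollChartLandscape.minimumZoomScale = 0.6;
            scrollChartLandscape.maximumZoomScale = 2.5;
            scrollChartLandscape.delegate = self;
        }

        // The zoom delegate must return a subview of that same scroll view.
        - (UIView *)viewForZoomingInScrollView:(UIScrollView *)scrollView
        {
            return monthKWHChartLandscape;
        }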

    Read the article

  • Cocoa: AVAudioRecorder Fails to Record

    - by kumaryr
    AVAudioSession *audioSession = [AVAudioSession sharedInstance]; NSError *err = nil; [audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&err]; if(err){ NSLog(@"audioSession: %@ %d %@", [err domain], [err code], [[err userInfo] description]); return; } [audioSession setActive:YES error:&err]; err = nil; if(err){ NSLog(@"audioSession: %@ %d %@", [err domain], [err code], [[err userInfo] description]); return; } NSMutableDictionary *recordSetting = [[NSMutableDictionary alloc] init]; [recordSetting setValue:[NSNumber numberWithInt: kAudioFormatAppleIMA4] forKey:AVFormatIDKey]; [recordSetting setValue:[NSNumber numberWithFloat:40000.0] forKey:AVSampleRateKey]; [recordSetting setValue:[NSNumber numberWithInt: 2] forKey:AVNumberOfChannelsKey]; [recordSetting setValue:[NSNumber numberWithInt:16] forKey:AVLinearPCMBitDepthKey]; [recordSetting setValue:[NSNumber numberWithBool:NO] forKey:AVLinearPCMIsBigEndianKey]; [recordSetting setValue:[NSNumber numberWithBool:NO] forKey:AVLinearPCMIsFloatKey]; // Create a new dated file NSDate *now = [NSDate dateWithTimeIntervalSinceNow:0]; NSString *caldate = [now description]; NSString *recorderFilePath = [[NSString stringWithFormat:@"%@/%@.caf", DOCUMENTS_FOLDER, caldate] retain]; NSLog(recorderFilePath); url = [NSURL fileURLWithPath:recorderFilePath]; err = nil; recorder = [[ AVAudioRecorder alloc] initWithURL:url settings:recordSetting error:&err]; if(!recorder){ NSLog(@"recorder: %@ %d %@", [err domain], [err code], [[err userInfo] description]); UIAlertView *alert = [[UIAlertView alloc] initWithTitle: @"Warning" message: [err localizedDescription] delegate: nil cancelButtonTitle:@"OK" otherButtonTitles:nil]; [alert show]; [alert release]; return; } //prepare to record [recorder setDelegate:self]; [recorder prepareToRecord]; recorder.meteringEnabled = YES; BOOL audioHWAvailable = audioSession.inputIsAvailable; if (! audioHWAvailable) { UIAlertView *cantRecordAlert = [[UIAlertView alloc] initWithTitle: @"Warning" message: @"Audio input hardware not available" delegate: nil cancelButtonTitle:@"OK" otherButtonTitles:nil]; [cantRecordAlert show]; [cantRecordAlert release]; return; } // [NSTimer scheduledTimerWithTimeInterval:1 target:self selector:@selector( updateTimerDisplay) userInfo:nil repeats:YES]; // [recorder recordForDuration:(NSTimeInterval)10 ]; // [NSTimer scheduledTimerWithTimeInterval:1 target:self selector:@selector( updateTimerDisplay) userInfo:nil repeats:YES];
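
    Two things worth checking in the excerpt above, offered as a hedge rather than a diagnosis: err is reset to nil after setActive:error: but before it is tested, so that error path can never fire (the BOOL return values are the reliable signal), and the record call itself is commented out, so nothing is ever captured. A minimal sketch, reusing the question's url and recorder ivars and a settings combination IMA4 is known to accept:

        NSError *error = nil;
        AVAudioSession *session = [AVAudioSession sharedInstance];
        if (![session setCategory:AVAudioSessionCategoryPlayAndRecord error:&error] ||
            ![session setActive:YES error:&error]) {
            NSLog(@"audio session error: %@", error);   // test the error before clearing it
            return;
        }

        // IMA4 is a compressed format, so the linear-PCM keys (bit depth, endianness,
        // float) are unnecessary; 44.1 kHz mono is a widely supported combination.
        NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
            [NSNumber numberWithInt:kAudioFormatAppleIMA4], AVFormatIDKey,
            [NSNumber numberWithFloat:44100.0],             AVSampleRateKey,
            [NSNumber numberWithInt:1],                     AVNumberOfChannelsKey,
            nil];

        recorder = [[AVAudioRecorder alloc] initWithURL:url settings:settings error:&error];
        if (recorder && [recorder prepareToRecord]) {
            [recorder record];   // the excerpt never actually starts recording
        } else {
            NSLog(@"recorder error: %@", error);
        }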

    Read the article

  • AVAudioPlayer crash after playing from an AVAudioRecorder

    - by munchine
    I've got a button the user tap to start recording and tap again to stop. When it stop I want the recorded voice 'echo' back so the user can hear what was recorded. This works fine the first time. If I hit the button for the third time, it starts a new recording and when I hit stop it crashes with EXC_BAD_ACCESS. - (IBAction) readToMeTapped { if(recording) { recording = NO; [readToMeButton setTitle:@"Stop Recording" forState: UIControlStateNormal ]; NSMutableDictionary *recordSetting = [[NSDictionary alloc] initWithObjectsAndKeys: [NSNumber numberWithFloat: 44100.0], AVSampleRateKey, [NSNumber numberWithInt: kAudioFormatAppleLossless], AVFormatIDKey, [NSNumber numberWithInt: 1], AVNumberOfChannelsKey, [NSNumber numberWithInt: AVAudioQualityMax], AVEncoderAudioQualityKey, nil]; // Create a new dated file NSDate *now = [NSDate dateWithTimeIntervalSinceNow:0]; NSString *caldate = [now description]; recordedTmpFile = [NSURL fileURLWithPath:[[NSString stringWithFormat:@"%@/%@.caf", DOCUMENTS_FOLDER, caldate] retain]]; error = nil; recorder = [[ AVAudioRecorder alloc] initWithURL:recordedTmpFile settings:recordSetting error:&error]; [recordSetting release]; if(!recorder){ NSLog(@"recorder: %@ %d %@", [error domain], [error code], [[error userInfo] description]); UIAlertView *alert = [[UIAlertView alloc] initWithTitle: @"Warning" message: [error localizedDescription] delegate: nil cancelButtonTitle:@"OK" otherButtonTitles:nil]; [alert show]; [alert release]; return; } NSLog(@"Using File called: %@",recordedTmpFile); //Setup the recorder to use this file and record to it. [recorder setDelegate:self]; [recorder prepareToRecord]; [recorder recordForDuration:(NSTimeInterval) 5]; //recording for a limited time } else { // it crashes the second time it gets here! recording = YES; NSLog(@"Recording YES Using File called: %@",recordedTmpFile); [readToMeButton setTitle:@"Start Recording" forState:UIControlStateNormal ]; [recorder stop]; //Stop the recorder. //playback recording AVAudioPlayer * newPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:recordedTmpFile error:&error]; [recordedTmpFile release]; self.aPlayer = newPlayer; [newPlayer release]; [aPlayer setDelegate:self]; [aPlayer prepareToPlay]; [aPlayer play]; } } - (void)audioRecorderDidFinishRecording:(AVAudioRecorder *)sender successfully:(BOOL)flag { NSLog (@"audioRecorderDidFinishRecording:successfully:"); [recorder release]; recorder = nil; } Checking the debugger, it flags the error here @synthesize aPlayer, recorder; This is the part I don't understand. I thought it may have something to do with releasing memory but I've been careful. Have I missed something?
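
    One plausible cause of the EXC_BAD_ACCESS, based only on the excerpt: fileURLWithPath: returns an autoreleased NSURL (the retain above is applied to the NSString inside it, not to the URL), so the later [recordedTmpFile release] leaves a dangling pointer that blows up the next time the URL is touched. A sketch of the ownership fix, reusing the question's ivar names:

        // Own the URL explicitly instead of releasing an object the framework owns.
        NSString *caldate = [[NSDate date] description];
        NSString *path = [NSString stringWithFormat:@"%@/%@.caf", DOCUMENTS_FOLDER, caldate];
        [recordedTmpFile release];                                // previous recording, if any
        recordedTmpFile = [[NSURL fileURLWithPath:path] retain];  // balances a later release

        // ... configure and run the recorder as in the question, then on playback:
        NSError *playError = nil;
        AVAudioPlayer *newPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:recordedTmpFile
                                                                          error:&playError];
        self.aPlayer = newPlayer;   // retained by the property
        [newPlayer release];
        [aPlayer setDelegate:self];
        [aPlayer prepareToPlay];
        [aPlayer play];
        // no [recordedTmpFile release] here; keep the URL until dealloc or until a new
        // recording replaces it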

    Read the article

  • Bulk inserts into a SQLite db on the iPhone...

    - by akaii
    I'm inserting a batch of 100 records, each containing a dictonary containing arbitrarily long HTML strings, and by god, it's slow. On the iphone, the runloop is blocking for several seconds during this transaction. Is my only recourse to use another thread? I'm already using several for acquiring data from HTTP servers, and the sqlite documentation explicitly discourages threading with the database, even though it's supposed to be thread-safe... Is there something I'm doing extremely wrong that if fixed, would drastically reduce the time it takes to complete the whole operation? NSString* statement; statement = @"BEGIN EXCLUSIVE TRANSACTION"; sqlite3_stmt *beginStatement; if (sqlite3_prepare_v2(database, [statement UTF8String], -1, &beginStatement, NULL) != SQLITE_OK) { printf("db error: %s\n", sqlite3_errmsg(database)); return; } if (sqlite3_step(beginStatement) != SQLITE_DONE) { sqlite3_finalize(beginStatement); printf("db error: %s\n", sqlite3_errmsg(database)); return; } NSTimeInterval timestampB = [[NSDate date] timeIntervalSince1970]; statement = @"INSERT OR REPLACE INTO item (hash, tag, owner, timestamp, dictionary) VALUES (?, ?, ?, ?, ?)"; sqlite3_stmt *compiledStatement; if(sqlite3_prepare_v2(database, [statement UTF8String], -1, &compiledStatement, NULL) == SQLITE_OK) { for(int i = 0; i < [items count]; i++){ NSMutableDictionary* item = [items objectAtIndex:i]; NSString* tag = [item objectForKey:@"id"]; NSInteger hash = [[NSString stringWithFormat:@"%@%@", tag, ownerID] hash]; NSInteger timestamp = [[item objectForKey:@"updated"] intValue]; NSData *dictionary = [NSKeyedArchiver archivedDataWithRootObject:item]; sqlite3_bind_int( compiledStatement, 1, hash); sqlite3_bind_text( compiledStatement, 2, [tag UTF8String], -1, SQLITE_TRANSIENT); sqlite3_bind_text( compiledStatement, 3, [ownerID UTF8String], -1, SQLITE_TRANSIENT); sqlite3_bind_int( compiledStatement, 4, timestamp); sqlite3_bind_blob( compiledStatement, 5, [dictionary bytes], [dictionary length], SQLITE_TRANSIENT); while(YES){ NSInteger result = sqlite3_step(compiledStatement); if(result == SQLITE_DONE){ break; } else if(result != SQLITE_BUSY){ printf("db error: %s\n", sqlite3_errmsg(database)); break; } } sqlite3_reset(compiledStatement); } timestampB = [[NSDate date] timeIntervalSince1970] - timestampB; NSLog(@"Insert Time Taken: %f",timestampB); // COMMIT statement = @"COMMIT TRANSACTION"; sqlite3_stmt *commitStatement; if (sqlite3_prepare_v2(database, [statement UTF8String], -1, &commitStatement, NULL) != SQLITE_OK) { printf("db error: %s\n", sqlite3_errmsg(database)); } if (sqlite3_step(commitStatement) != SQLITE_DONE) { printf("db error: %s\n", sqlite3_errmsg(database)); } sqlite3_finalize(beginStatement); sqlite3_finalize(compiledStatement); sqlite3_finalize(commitStatement);
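
    On the threading question: moving the batch off the main thread is a reasonable recourse, and the usual SQLite caveat is simply that a given connection should be used from one thread at a time. A sketch under those assumptions (databasePath and insertItems:intoDatabase:, which would wrap the transaction code above, are hypothetical helpers):

        // Run the batch off the main run loop; the worker opens its own connection so a
        // single sqlite3 handle is never shared between threads.
        - (void)insertItemsInBackground:(NSArray *)items
        {
            [self performSelectorInBackground:@selector(insertItemsWorker:) withObject:items];
        }

        - (void)insertItemsWorker:(NSArray *)items
        {
            NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

            sqlite3 *db = NULL;
            if (sqlite3_open([self.databasePath UTF8String], &db) == SQLITE_OK) {
                [self insertItems:items intoDatabase:db];  // BEGIN, prepared INSERT loop, COMMIT
            }
            sqlite3_close(db);

            [pool drain];
        }

    It can also be worth timing the NSKeyedArchiver step on its own; serializing dictionaries full of long HTML strings is often a larger share of those seconds than SQLite itself.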

    Read the article

  • Not getting the recorded voice back: could you suggest what the bug is in the code below?

    - by kumaryr
    AVAudioSession *audioSession = [AVAudioSession sharedInstance]; NSError *err = nil; [audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&err]; if(err){ NSLog(@"audioSession: %@ %d %@", [err domain], [err code], [[err userInfo] description]); return; } [audioSession setActive:YES error:&err]; err = nil; if(err){ NSLog(@"audioSession: %@ %d %@", [err domain], [err code], [[err userInfo] description]); return; } NSMutableDictionary *recordSetting = [[NSMutableDictionary alloc] init]; [recordSetting setValue :[NSNumber numberWithInt: kAudioFormatAppleIMA4] forKey:AVFormatIDKey]; [recordSetting setValue:[NSNumber numberWithFloat:40000.0] forKey:AVSampleRateKey]; [recordSetting setValue:[NSNumber numberWithInt: 2] forKey:AVNumberOfChannelsKey]; [recordSetting setValue :[NSNumber numberWithInt:16] forKey:AVLinearPCMBitDepthKey]; [recordSetting setValue :[NSNumber numberWithBool:NO] forKey:AVLinearPCMIsBigEndianKey]; [recordSetting setValue :[NSNumber numberWithBool:NO] forKey:AVLinearPCMIsFloatKey]; // Create a new dated file NSDate *now = [NSDate dateWithTimeIntervalSinceNow:0]; NSString *caldate = [now description]; NSString *recorderFilePath = [[NSString stringWithFormat:@"%@/%@.caf", DOCUMENTS_FOLDER, caldate] retain]; NSLog(recorderFilePath); url = [NSURL fileURLWithPath:recorderFilePath]; err = nil; recorder = [[ AVAudioRecorder alloc] initWithURL:url settings:recordSetting error:&err]; if(!recorder){ NSLog(@"recorder: %@ %d %@", [err domain], [err code], [[err userInfo] description]); UIAlertView *alert = [[UIAlertView alloc] initWithTitle: @"Warning" message: [err localizedDescription] delegate: nil cancelButtonTitle:@"OK" otherButtonTitles:nil]; [alert show]; [alert release]; return; } //prepare to record [recorder setDelegate:self]; [recorder prepareToRecord]; recorder.meteringEnabled = YES; BOOL audioHWAvailable = audioSession.inputIsAvailable; if (! audioHWAvailable) { UIAlertView *cantRecordAlert = [[UIAlertView alloc] initWithTitle: @"Warning" message: @"Audio input hardware not available" delegate: nil cancelButtonTitle:@"OK" otherButtonTitles:nil]; [cantRecordAlert show]; [cantRecordAlert release]; return; } //[NSTimer scheduledTimerWithTimeInterval:1 target:self selector:@selector( updateTimerDisplay) userInfo:nil repeats:YES]; [recorder recordForDuration:(NSTimeInterval)10 ]; // [NSTimer scheduledTimerWithTimeInterval:1 target:self selector:@selector( updateTimerDisplay) userInfo:nil repeats:YES];
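
    Before hunting for a playback bug it may help to confirm the recorder actually produced data. recordForDuration: returns a BOOL, and the file size after the delegate callback is a quick sanity check (recorderFilePath is the path built above):

        if (![recorder recordForDuration:10.0]) {
            NSLog(@"recording did not start");
        }

        // later, e.g. in audioRecorderDidFinishRecording:successfully:
        NSDictionary *attrs = [[NSFileManager defaultManager] attributesOfItemAtPath:recorderFilePath
                                                                               error:NULL];
        NSLog(@"recorded file size: %llu bytes", [attrs fileSize]);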

    Read the article

  • iPhone: Tracking/Identifying individual touches

    - by FlorianZ
    I have a quick question regarding tracking touches on the iPhone and I seem to not be able to come to a conclusion on this, so any suggestions / ideas are greatly appreciated: I want to be able to track and identify touches on the iphone, ie. basically every touch has a starting position and a current/moved position. Touches are stored in a std::vector and they shall be removed from the container, once they ended. Their position shall be updated once they move, but I still want to keep track of where they initially started (gesture recognition). I am getting the touches from [event allTouches], thing is, the NSSet is unsorted and I seem not to be able to identify the touches that are already stored in the std::vector and refer to the touches in the NSSet (so I know which ones ended and shall be removed, or have been moved, etc.) Here is my code, which works perfectly with only one finger on the touch screen, of course, but with more than one, I do get unpredictable results... - (void) touchesBegan:(NSSet*)touches withEvent:(UIEvent*)event { [self handleTouches:[event allTouches]]; } - (void) touchesEnded:(NSSet*)touches withEvent:(UIEvent*)event { [self handleTouches:[event allTouches]]; } - (void) touchesMoved:(NSSet*)touches withEvent:(UIEvent*)event { [self handleTouches:[event allTouches]]; } - (void) touchesCancelled:(NSSet*)touches withEvent:(UIEvent*)event { [self handleTouches:[event allTouches]]; } - (void) handleTouches:(NSSet*)allTouches { for(int i = 0; i < (int)[allTouches count]; ++i) { UITouch* touch = [[allTouches allObjects] objectAtIndex:i]; NSTimeInterval timestamp = [touch timestamp]; CGPoint currentLocation = [touch locationInView:self]; CGPoint previousLocation = [touch previousLocationInView:self]; if([touch phase] == UITouchPhaseBegan) { Finger finger; finger.start.x = currentLocation.x; finger.start.y = currentLocation.y; finger.end = finger.start; finger.hasMoved = false; finger.hasEnded = false; touchScreen->AddFinger(finger); } else if([touch phase] == UITouchPhaseEnded || [touch phase] == UITouchPhaseCancelled) { Finger& finger = touchScreen->GetFingerHandle(i); finger.hasEnded = true; } else if([touch phase] == UITouchPhaseMoved) { Finger& finger = touchScreen->GetFingerHandle(i); finger.end.x = currentLocation.x; finger.end.y = currentLocation.y; finger.hasMoved = true; } } touchScreen->RemoveEnded(); } Thanks!
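
    One common approach, sketched below: a UITouch remains the same object for the entire life of that finger, so its pointer can serve as the identity, and the touches set passed to each callback (rather than [event allTouches]) already tells you which touches began, moved or ended there. Finger::owner and GetFingerByOwner() are hypothetical additions to the poster's touchScreen class, not existing API:

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
        {
            for (UITouch *touch in touches) {
                CGPoint p = [touch locationInView:self];
                Finger finger;
                finger.owner = (const void *)touch;   // the pointer is the identity
                finger.start.x = p.x;  finger.start.y = p.y;
                finger.end = finger.start;
                finger.hasMoved = false;
                finger.hasEnded = false;
                touchScreen->AddFinger(finger);
            }
        }

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
        {
            for (UITouch *touch in touches) {
                Finger& finger = touchScreen->GetFingerByOwner((const void *)touch);
                CGPoint p = [touch locationInView:self];
                finger.end.x = p.x;  finger.end.y = p.y;
                finger.hasMoved = true;
            }
        }

        - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
        {
            for (UITouch *touch in touches) {
                touchScreen->GetFingerByOwner((const void *)touch).hasEnded = true;
            }
            touchScreen->RemoveEnded();
        }

        - (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
        {
            [self touchesEnded:touches withEvent:event];
        }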

    Read the article

  • Problem with Landscape and Portrait view in a TabBar Application

    - by JoshD
    I have an TabBar app that I would like to be in landscape and portrait. The issue is when I go to tab 2 make a selection from my table then select tab 1 and rotate the device, then select tab 2 again the content does not know that the device rotated and will not display my custom orientated content correctly. I am trying to write a priovate method that tells the view what orientation it is currently in. IN viewDidLoad I am assuming it is in portrait but in shouldAutoRotate I have it looking in the private method for the correct alignment of the content. Please Help!! Here is my code: #import "DetailViewController.h" #import "ScheduleTableViewController.h" #import "BrightcoveDemoAppDelegate.h" #import "Constants.h" @implementation DetailViewController @synthesize CurrentLevel, CurrentTitle, tableDataSource,logoName,showDescription,showDescriptionInfo,showTime, showTimeInfo, tableBG; - (void)layoutSubviews { showLogo.frame = CGRectMake(40, 20, 187, 101); showDescription.frame = CGRectMake(85, 140, 330, 65); showTime.frame = CGRectMake(130, 10, 149, 119); tableBG.frame = CGRectMake(0, 0, 480, 320); } /* // The designated initializer. Override if you create the controller programmatically and want to perform customization that is not appropriate for viewDidLoad. - (id)initWithNibName:(NSString *)nibNameOrNil bundle:(NSBundle *)nibBundleOrNil { if (self = [super initWithNibName:nibNameOrNil bundle:nibBundleOrNil]) { // Custom initialization } return self; } */ /* // Implement loadView to create a view hierarchy programmatically, without using a nib. - (void)loadView { } */ // Implement viewDidLoad to do additional setup after loading the view, typically from a nib. - (void)viewDidLoad { [super viewDidLoad]; self.navigationItem.title = CurrentTitle; [showDescription setEditable:NO]; //show the description showDescription.text = showDescriptionInfo; showTime.text = showTimeInfo; NSString *Path = [[NSBundle mainBundle] bundlePath]; NSString *ImagePath = [Path stringByAppendingPathComponent:logoName]; UIImage *tempImg = [[UIImage alloc] initWithContentsOfFile:ImagePath]; [showLogo setImage:tempImg]; [tempImg release]; [self masterView]; } - (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation { return YES; } - (void)willAnimateRotationToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation duration:(NSTimeInterval)duration { isLandscape = UIInterfaceOrientationIsLandscape(toInterfaceOrientation); if(isLandscape = YES){ [self layoutSubviews]; } } - (void)didReceiveMemoryWarning { // Releases the view if it doesn't have a superview. [super didReceiveMemoryWarning]; // Release any cached data, images, etc that aren't in use. } - (void)viewDidUnload { // Release any retained subviews of the main view. // e.g. self.myOutlet = nil; } - (void)dealloc { [logoName release]; [showLogo release]; [showDescription release]; [showDescriptionInfo release]; [super dealloc]; } @end
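
    Two small observations on the excerpt above, offered tentatively: "isLandscape = YES" assigns rather than compares, so the landscape branch always runs, and a tab that is off screen never receives the rotation callbacks, so the orientation is worth re-checking when the view reappears. A sketch:

        - (void)willAnimateRotationToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation
                                                 duration:(NSTimeInterval)duration
        {
            isLandscape = UIInterfaceOrientationIsLandscape(toInterfaceOrientation);
            if (isLandscape == YES) {          // compare, do not assign
                [self layoutSubviews];
            }
        }

        - (void)viewWillAppear:(BOOL)animated
        {
            [super viewWillAppear:animated];
            // The view may have missed a rotation while another tab was selected.
            isLandscape = UIInterfaceOrientationIsLandscape(
                              [[UIApplication sharedApplication] statusBarOrientation]);
            if (isLandscape) {
                [self layoutSubviews];
            }
        }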

    Read the article

  • iPhone GPS logging inaccurate

    - by Martijn
    I'm logging GPS points during a walk. Below is the function in which the coordinates are saved every 5 seconds. I did several tests but I cannot get the accuracy I want. (When testing, the sky is clear, and tests in Google Maps show me that the GPS signal is good.) Here is the code: -(void)viewDidAppear:(BOOL)animated{ if (self.locationManager == nil){ self.locationManager = [[[CLLocationManager alloc] init] autorelease]; locationManager.delegate = self; // only notify under 100 m accuracy locationManager.distanceFilter = 100.0f; locationManager.desiredAccuracy= kCLLocationAccuracyBest; [locationManager startUpdatingLocation]; } } // start logging [NSTimer scheduledTimerWithTimeInterval:5 target:self selector:@selector(getData) userInfo:nil repeats:YES]; -(void)getData{ int distance; // re-use location. if ([ [NSString stringWithFormat:@"%1.2f",previousLat] isEqualToString:@"0.00"]){ // if previous location is not available, do nothing distance = 0; }else{ CLLocation *loc1 = [[CLLocation alloc] initWithLatitude:previousLat longitude:previousLong]; CLLocation *loc2 = [[CLLocation alloc] initWithLatitude:latGlobal longitude:longGlobal]; distance = [loc1 getDistanceFrom: loc2]; } // overwrite latGlobal with new variable previousLat = latGlobal; previousLong = longGlobal; // store location and save data to database // this part goes ok } - (void)locationManager:(CLLocationManager *)manager didUpdateToLocation:(CLLocation *)newLocation fromLocation:(CLLocation *)oldLocation { // track the time to get a new gps result (for gps indicator orb) lastPointTimestamp = [newLocation.timestamp copy]; // test that the horizontal accuracy does not indicate an invalid measurement if (newLocation.horizontalAccuracy < 0) return; // test the age of the location measurement to determine if the measurement is cached // don't rely on cached measurements NSTimeInterval locationAge = -[newLocation.timestamp timeIntervalSinceNow]; if (locationAge > 5.0) return; latGlobal = fabs(newLocation.coordinate.latitude); longGlobal= fabs(newLocation.coordinate.longitude); } I have taken a screenshot of the plot results (the walk takes 30 minutes) and an example of what I'm trying to accomplish: http://www.flickr.com/photos/21258341@N07/4623969014/ I really hope someone can put me in the right direction.
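
    A few hedged observations on the excerpt above: a distanceFilter of 100 m means intermediate fixes are never delivered even though desiredAccuracy is best, fabs() discards the sign of the coordinates (which matters outside the north-east quadrant), and fixes with a poor horizontalAccuracy are worth skipping. A sketch (the 20 m threshold is an arbitrary example):

        - (void)locationManager:(CLLocationManager *)manager
            didUpdateToLocation:(CLLocation *)newLocation
                   fromLocation:(CLLocation *)oldLocation
        {
            NSTimeInterval age = -[newLocation.timestamp timeIntervalSinceNow];
            if (age > 5.0) return;                              // ignore cached fixes
            if (newLocation.horizontalAccuracy < 0) return;     // invalid measurement
            if (newLocation.horizontalAccuracy > 20.0) return;  // reject fixes worse than ~20 m

            latGlobal  = newLocation.coordinate.latitude;       // keep the sign
            longGlobal = newLocation.coordinate.longitude;
        }

        // when configuring the manager:
        // locationManager.distanceFilter  = kCLDistanceFilterNone;  // deliver every update
        // locationManager.desiredAccuracy = kCLLocationAccuracyBest;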

    Read the article

  • How to get a collision or bounce effect when the ball hits the track?

    - by Chandan Shetty SP
    I am using below formula to move the ball circular, where accelX and accelY are the values from accelerometer, it is working fine. But the problem in this code is mRadius (I fixed its value to 50), i need to change mRadius according to accelerometer values and also i need bouncing effect when it touches the track. Currently i am developing code by assuming only one ball is on the board. float degrees = -atan2(accelX, accelY) * 180 / 3.14159; int x = cCentrePoint.x + mRadius * cos(degreesToRadians(degrees)); int y = cCentrePoint.y + mRadius * sin(degreesToRadians(degrees)); Here is the snap of the game i want to develop: Updated: I am sending the updated code... mRadius = 5; mRange = NSMakeRange(0,60); -(void) updateBall: (UIAccelerationValue) accelX withY:(UIAccelerationValue)accelY { float degrees = -atan2(accelX, accelY) * 180 / 3.14159; int x = cCentrePoint.x + mRadius * cos(degreesToRadians(degrees)); int y = cCentrePoint.y + mRadius * sin(degreesToRadians(degrees)); //self.targetRect is rect of ball Object self.targetRect = CGRectMake(newX, newY, 8, 9); self.currentRect = self.targetRect; //http://books.google.co.in/books?id=WV9glgdrrrUC&pg=PA455#v=onepage&q=&f=false static NSDate *lastDrawTime; if(lastDrawTime!=nil) { NSTimeInterval secondsSinceLastDraw = -([lastDrawTime timeIntervalSinceNow]); ballXVelocity = ballXVelocity + (accelX * secondsSinceLastDraw) * [self isTouchedTrack:mRadius andRange:mRange]; ballYVelocity = ballYVelocity + -(accelY * secondsSinceLastDraw) * [self isTouchedTrack:mRadius andRange:mRange]; distXTravelled = distXTravelled + secondsSinceLastDraw * ballXVelocity * 50; distYTravelled = distYTravelled + secondsSinceLastDraw * ballYVelocity * 50; CGRect temp = self.targetRect; temp.origin.x += distXTravelled; temp.origin.y += distYTravelled; int radius = (temp.origin.x - cCentrePoint.x) / cos(degreesToRadians(degrees)); if( !NSLocationInRange(abs(radius),mRange)) { //Colided with the tracks...Need a better logic here ballXVelocity = -ballXVelocity; } else { // Need a better logic here self.targetRect = temp; } //NSLog(@"angle = %f",degrees); } [lastDrawTime release]; lastDrawTime = [ [NSDate alloc] init]; } In the above code i have initialized mRadius and mRange(indicate track) to some constant for testing, i am not getting the moving of the ball as i expected( bouncing effect when Collided with track ) with respect to accelerometer. Help me to recognize where i went wrong or send some code snippets or links which does the similar job. I am searching for better logic than my code, if you found share with me.
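
    For the bounce itself, one standard technique is to reflect the velocity about the surface normal when the ball reaches the track and damp it with a restitution factor. A sketch, where ballCenter (the ball's current position) and trackRadius (the radius of the circular track) are hypothetical names and ballXVelocity/ballYVelocity are the question's variables:

        CGPoint toBall = CGPointMake(ballCenter.x - cCentrePoint.x, ballCenter.y - cCentrePoint.y);
        float dist = sqrtf(toBall.x * toBall.x + toBall.y * toBall.y);

        if (dist >= trackRadius) {
            // unit normal pointing from the centre through the contact point
            CGPoint n = CGPointMake(toBall.x / dist, toBall.y / dist);

            // v' = v - 2 (v . n) n, scaled by a restitution factor < 1 for a softer bounce
            float vDotN = ballXVelocity * n.x + ballYVelocity * n.y;
            float restitution = 0.8f;
            ballXVelocity = (ballXVelocity - 2.0f * vDotN * n.x) * restitution;
            ballYVelocity = (ballYVelocity - 2.0f * vDotN * n.y) * restitution;

            // push the ball back onto the track so it does not tunnel through on the next frame
            ballCenter = CGPointMake(cCentrePoint.x + n.x * trackRadius,
                                     cCentrePoint.y + n.y * trackRadius);
        }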

    Read the article

  • [iPhone Obj-C] How to move a label with a wave/tilt of the iPhone?

    - by Didi
    The code below is from the book Beginning iPhone 3 Development, chapter 15, using the accelerometer to control a ball. I want to use a label instead of the ball, and only move in the x direction. However, I donno how to revise the code to achieve this goal. Can anyone please help me? THX so much!!!! (id)initWithCoder:(NSCoder *)coder { if (self = [super initWithCoder:coder]) { self.image = [UIImage imageNamed:@"ball.png"]; self.currentPoint = CGPointMake((self.bounds.size.width / 2.0f) + (image.size.width / 2.0f), (self.bounds.size.height / 2.0f) + (image.size.height / 2.0f)); ballXVelocity = 0.0f; ballYVelocity = 0.0f; } return self; } (id)initWithFrame:(CGRect)frame { if (self = [super initWithFrame:frame]) { // Initialization code } return self; } (void)drawRect:(CGRect)rect { // Drawing code [image drawAtPoint:currentPoint]; } (void)dealloc { [image release]; [acceleration release]; [super dealloc]; } //#pragma mark - - (CGPoint)currentPoint { return currentPoint; } - (void)setCurrentPoint:(CGPoint)newPoint { previousPoint = currentPoint; currentPoint = newPoint; if (currentPoint.x < 0) { currentPoint.x = 0; //ballXVelocity = 0; ballXVelocity = - (ballXVelocity / 2.0); } if (currentPoint.y < 0){ currentPoint.y = 0; //ballYVelocity = 0; ballYVelocity = - (ballYVelocity / 2.0); } if (currentPoint.x > self.bounds.size.width - image.size.width) { currentPoint.x = self.bounds.size.width - image.size.width; //ballXVelocity = 0; ballXVelocity = - (ballXVelocity / 2.0); } if (currentPoint.y > self.bounds.size.height - image.size.height) { currentPoint.y = self.bounds.size.height - image.size.height; //ballYVelocity = 0; ballYVelocity = - (ballYVelocity / 2.0); } CGRect currentImageRect = CGRectMake(currentPoint.x, currentPoint.y, currentPoint.x + image.size.width, currentPoint.y + image.size.height); CGRect previousImageRect = CGRectMake(previousPoint.x, previousPoint.y, previousPoint.x + image.size.width, currentPoint.y + image.size.width); [self setNeedsDisplayInRect:CGRectUnion(currentImageRect, previousImageRect)]; } (void)draw { static NSDate *lastDrawTime; if (lastDrawTime != nil) { NSTimeInterval secondsSinceLastDraw = -([lastDrawTime timeIntervalSinceNow]); ballYVelocity = ballYVelocity + -(acceleration.y * secondsSinceLastDraw); ballXVelocity = ballXVelocity + acceleration.x * secondsSinceLastDraw; CGFloat xAcceleration = secondsSinceLastDraw * ballXVelocity * 500; CGFloat yAcceleration = secondsSinceLastDraw * ballYVelocity * 500; self.currentPoint = CGPointMake(self.currentPoint.x + xAcceleration, self.currentPoint.y +yAcceleration); } // Update last time with current time [lastDrawTime release]; lastDrawTime = [[NSDate alloc] init]; }
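
    A sketch of the x-only variant, assuming a UILabel outlet named movingLabel (hypothetical) and the acceleration and ballXVelocity ivars from the excerpt: keep the y coordinate fixed and integrate only the x component of the acceleration.

        - (void)draw
        {
            static NSDate *lastDrawTime = nil;
            if (lastDrawTime != nil) {
                NSTimeInterval secondsSinceLastDraw = -[lastDrawTime timeIntervalSinceNow];
                ballXVelocity += acceleration.x * secondsSinceLastDraw;

                CGFloat dx = secondsSinceLastDraw * ballXVelocity * 500;
                CGFloat newX = movingLabel.center.x + dx;

                // clamp to the view and bounce softly off the edges
                CGFloat halfWidth = movingLabel.bounds.size.width / 2.0f;
                if (newX < halfWidth || newX > self.bounds.size.width - halfWidth) {
                    ballXVelocity = -(ballXVelocity / 2.0f);
                    newX = MAX(halfWidth, MIN(newX, self.bounds.size.width - halfWidth));
                }
                movingLabel.center = CGPointMake(newX, movingLabel.center.y);
            }
            [lastDrawTime release];
            lastDrawTime = [[NSDate alloc] init];
        }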

    Read the article

  • Problem with the date formatters in the iPhone SDK

    - by monish
    Hi guys, I have a problem with the date formatters. I have an option to search events based on a date; for this I'm comparing the date with the date stored in the database and fetching the events for that date. I wrote the code as follows: -(NSMutableArray*)getSearchAllLists:(EventsList*)aEvent { [searchList removeAllObjects]; EventsList *searchEvent = nil; const char* sql; NSString *conditionStr = @" where "; if([aEvent.eventName length] > 0) { NSString *str = @"'%"; str = [str stringByAppendingString:aEvent.eventName]; str = [str stringByAppendingString:@"%'"]; conditionStr = [conditionStr stringByAppendingString:[NSString stringWithFormat:@" eventName like %s",[str UTF8String]]]; } if([aEvent.eventWineName length]>0) { NSString *str = @"'%"; str = [str stringByAppendingString:aEvent.eventWineName]; str = [str stringByAppendingString:@"%'"]; if([aEvent.eventName length]>0) { conditionStr = [conditionStr stringByAppendingString:[NSString stringWithFormat:@" and (wineName like %s)",[str UTF8String]]]; } else { conditionStr = [conditionStr stringByAppendingString:[NSString stringWithFormat:@"wineName like %s",[str UTF8String]]]; } } if([aEvent.eventVariety length]>0) { NSString *str = @"'%"; str = [str stringByAppendingString:aEvent.eventVariety]; str = [str stringByAppendingString:@"%'"]; if([aEvent.eventName length]>0 || [aEvent.eventWineName length]>0) { conditionStr = [conditionStr stringByAppendingString:[NSString stringWithFormat:@" and (variety like %s)",[str UTF8String]]]; } else { conditionStr = [conditionStr stringByAppendingString:[NSString stringWithFormat:@"variety like %s" ,[str UTF8String]]]; } } if([aEvent.eventWinery length]>0) { NSString *str = @"'%"; str = [str stringByAppendingString:aEvent.eventWinery]; str = [str stringByAppendingString:@"%'"]; if([aEvent.eventName length]>0 || [aEvent.eventWineName length]>0 || [aEvent.eventVariety length]>0) { conditionStr = [conditionStr stringByAppendingString:[NSString stringWithFormat:@" and (winery like %s)",[str UTF8String]]]; } else { conditionStr = [conditionStr stringByAppendingString:[NSString stringWithFormat:@"winery like %s" ,[str UTF8String]]]; } } if(aEvent.eventRatings >0) { if([aEvent.eventName length]>0 || [aEvent.eventWineName length]>0 || [aEvent.eventWinery length]>0 || [aEvent.eventVariety length]>0) { conditionStr = [conditionStr stringByAppendingString:[NSString stringWithFormat:@" and ratings = %d",aEvent.eventRatings]]; } else { conditionStr = [conditionStr stringByAppendingString:[NSString stringWithFormat:@" ratings = %d",aEvent.eventRatings]]; } } if(aEvent.eventDate >0) { NSDateFormatter *dateFormatter = [[NSDateFormatter alloc]init]; [dateFormatter setDateFormat:@"dd-MMM-yy 23:59:59"]; NSString *dateStr = [dateFormatter stringFromDate:aEvent.eventDate]; NSDate *date = [dateFormatter dateFromString:dateStr]; [dateFormatter release]; NSTimeInterval interval = [date timeIntervalSinceReferenceDate]; printf("\n Interval in advance search:%f",interval); if([aEvent.eventName length]>0 || [aEvent.eventWineName length]>0 || [aEvent.eventWinery length]>0 || aEvent.eventRatings>0 || [aEvent.eventVariety length]>0) { conditionStr = [conditionStr stringByAppendingString:[NSString stringWithFormat:@" and eventDate = %f",interval]]; } else { conditionStr = [conditionStr stringByAppendingString:[NSString stringWithFormat:@" eventDate = %f",interval]]; } } NSString* queryString= @" select * from event"; queryString = [queryString stringByAppendingString:conditionStr]; sqlite3_stmt* statement; sql = (char*)[queryString UTF8String]; if (sqlite3_prepare_v2(database, sql, -1, &statement, NULL) != SQLITE_OK) { NSAssert1(0, @"Error: failed to prepare statement with message '%s'.", sqlite3_errmsg(database)); } sqlite3_bind_text(statement, 1, [aEvent.eventName UTF8String], -1, SQLITE_TRANSIENT); sqlite3_bind_text(statement, 2, [aEvent.eventWineName UTF8String], -1, SQLITE_TRANSIENT); sqlite3_bind_text(statement, 3, [aEvent.eventVariety UTF8String], -1, SQLITE_TRANSIENT); sqlite3_bind_text(statement, 4, [aEvent.eventWinery UTF8String], -1, SQLITE_TRANSIENT); sqlite3_bind_int(statement, 5,aEvent.eventRatings); sqlite3_bind_double(statement,6, [[aEvent eventDate] timeIntervalSinceReferenceDate]); while (sqlite3_step(statement) == SQLITE_ROW) { primaryKey = sqlite3_column_int(statement, 0); searchEvent = [[EventsList alloc] initWithPrimaryKey:primaryKey database:database]; [searchList addObject:searchEvent]; [searchEvent release]; } sqlite3_finalize(statement); return searchList; } Here I compared the interval value in the database with the date we are searching on, and I'm getting different values, so no results are found. Help me get rid of this; anyone's help will be much appreciated. Thank you, Monish Calapatapu.
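
    On the comparison itself: eventDate is stored as a floating-point interval and then matched with = in SQL, and two intervals computed at different moments will almost never be bit-for-bit equal, so the query finds nothing. A sketch of matching the whole day instead, built from the question's aEvent and conditionStr (adding 86400 seconds is a simplification that ignores DST transitions):

        NSCalendar *calendar = [NSCalendar currentCalendar];
        NSDateComponents *comps = [calendar components:(NSYearCalendarUnit | NSMonthCalendarUnit | NSDayCalendarUnit)
                                              fromDate:aEvent.eventDate];
        NSDate *startOfDay = [calendar dateFromComponents:comps];
        NSTimeInterval dayStart = [startOfDay timeIntervalSinceReferenceDate];
        NSTimeInterval dayEnd   = dayStart + 86400.0;   // simplification: ignores DST changes

        // match any event that falls on that calendar day
        conditionStr = [conditionStr stringByAppendingString:
            [NSString stringWithFormat:@" eventDate >= %f and eventDate < %f", dayStart, dayEnd]];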

    Read the article

  • Exporting bind and keyframe bone poses from Blender for use in OpenGL

    - by SaldaVonSchwartz
    I'm having a hard time trying to understand how exactly Blender's concept of bone transforms maps to the usual math of skinning (which I'm implementing in an OpenGL-based engine of sorts). Or I'm missing out something in the math.. It's gonna be long, but here's as much background as I can think of. First, a few notes and assumptions: I'm using column-major order and multiply from right to left. So for instance, vertex v transformed by matrix A and then further transformed by matrix B would be: v' = BAv. This also means whenever I export a matrix from blender through python, I export it (in text format) in 4 lines, each representing a column. This is so I can then I can read them back into my engine like this: if (fscanf(fileHandle, "%f %f %f %f", &skeleton.joints[currentJointIndex].inverseBindTransform.m[0], &skeleton.joints[currentJointIndex].inverseBindTransform.m[1], &skeleton.joints[currentJointIndex].inverseBindTransform.m[2], &skeleton.joints[currentJointIndex].inverseBindTransform.m[3])) { if (fscanf(fileHandle, "%f %f %f %f", &skeleton.joints[currentJointIndex].inverseBindTransform.m[4], &skeleton.joints[currentJointIndex].inverseBindTransform.m[5], &skeleton.joints[currentJointIndex].inverseBindTransform.m[6], &skeleton.joints[currentJointIndex].inverseBindTransform.m[7])) { if (fscanf(fileHandle, "%f %f %f %f", &skeleton.joints[currentJointIndex].inverseBindTransform.m[8], &skeleton.joints[currentJointIndex].inverseBindTransform.m[9], &skeleton.joints[currentJointIndex].inverseBindTransform.m[10], &skeleton.joints[currentJointIndex].inverseBindTransform.m[11])) { if (fscanf(fileHandle, "%f %f %f %f", &skeleton.joints[currentJointIndex].inverseBindTransform.m[12], &skeleton.joints[currentJointIndex].inverseBindTransform.m[13], &skeleton.joints[currentJointIndex].inverseBindTransform.m[14], &skeleton.joints[currentJointIndex].inverseBindTransform.m[15])) { I'm simplifying the code I show because otherwise it would make things unnecessarily harder (in the context of my question) to explain / follow. Please refrain from making remarks related to optimizations. This is not final code. Having said that, if I understand correctly, the basic idea of skinning/animation is: I have a a mesh made up of vertices I have the mesh model-world transform W I have my joints, which are really just transforms from each joint's space to its parent's space. I'll call these transforms Bj meaning matrix which takes from joint j's bind pose to joint j-1's bind pose. For each of these, I actually import their inverse to the engine, Bj^-1. I have keyframes each containing a set of current poses Cj for each joint J. These are initially imported to my engine in TQS format but after (S)LERPING them I compose them into Cj matrices which are equivalent to the Bjs (not the Bj^-1 ones) only that for the current spacial configurations of each joint at that frame. Given the above, the "skeletal animation algorithm is" On each frame: check how much time has elpased and compute the resulting current time in the animation, from 0 meaning frame 0 to 1, meaning the end of the animation. (Oh and I'm looping forever so the time is mod(total duration)) for each joint: 1 -calculate its world inverse bind pose, that is Bj_w^-1 = Bj^-1 Bj-1^-1 ... B0^-1 2 -use the current animation time to LERP the componets of the TQS and come up with an interpolated current pose matrix Cj which should transform from the joints current configuration space to world space. 
Similar to what I did to get the world version of the inverse bind poses, I come up with the joint's world current pose, Cj_w = C0 C1 ... Cj 3 -now that I have world versions of Bj and Cj, I store this joint's world- skinning matrix K_wj = Cj_w Bj_w^-1. The above is roughly implemented like so: - (void)update:(NSTimeInterval)elapsedTime { static double time = 0; time = fmod((time + elapsedTime),1.); uint16_t LERPKeyframeNumber = 60 * time; uint16_t lkeyframeNumber = 0; uint16_t lkeyframeIndex = 0; uint16_t rkeyframeNumber = 0; uint16_t rkeyframeIndex = 0; for (int i = 0; i < aClip.keyframesCount; i++) { uint16_t keyframeNumber = aClip.keyframes[i].number; if (keyframeNumber <= LERPKeyframeNumber) { lkeyframeIndex = i; lkeyframeNumber = keyframeNumber; } else { rkeyframeIndex = i; rkeyframeNumber = keyframeNumber; break; } } double lTime = lkeyframeNumber / 60.; double rTime = rkeyframeNumber / 60.; double blendFactor = (time - lTime) / (rTime - lTime); GLKMatrix4 bindPosePalette[aSkeleton.jointsCount]; GLKMatrix4 currentPosePalette[aSkeleton.jointsCount]; for (int i = 0; i < aSkeleton.jointsCount; i++) { F3DETQSType& lPose = aClip.keyframes[lkeyframeIndex].skeletonPose.jointPoses[i]; F3DETQSType& rPose = aClip.keyframes[rkeyframeIndex].skeletonPose.jointPoses[i]; GLKVector3 LERPTranslation = GLKVector3Lerp(lPose.t, rPose.t, blendFactor); GLKQuaternion SLERPRotation = GLKQuaternionSlerp(lPose.q, rPose.q, blendFactor); GLKVector3 LERPScaling = GLKVector3Lerp(lPose.s, rPose.s, blendFactor); GLKMatrix4 currentTransform = GLKMatrix4MakeWithQuaternion(SLERPRotation); currentTransform = GLKMatrix4Multiply(currentTransform, GLKMatrix4MakeTranslation(LERPTranslation.x, LERPTranslation.y, LERPTranslation.z)); currentTransform = GLKMatrix4Multiply(currentTransform, GLKMatrix4MakeScale(LERPScaling.x, LERPScaling.y, LERPScaling.z)); if (aSkeleton.joints[i].parentIndex == -1) { bindPosePalette[i] = aSkeleton.joints[i].inverseBindTransform; currentPosePalette[i] = currentTransform; } else { bindPosePalette[i] = GLKMatrix4Multiply(aSkeleton.joints[i].inverseBindTransform, bindPosePalette[aSkeleton.joints[i].parentIndex]); currentPosePalette[i] = GLKMatrix4Multiply(currentPosePalette[aSkeleton.joints[i].parentIndex], currentTransform); } aSkeleton.skinningPalette[i] = GLKMatrix4Multiply(currentPosePalette[i], bindPosePalette[i]); } } At this point, I should have my skinning palette. So on each frame in my vertex shader, I do: uniform mat4 modelMatrix; uniform mat4 projectionMatrix; uniform mat3 normalMatrix; uniform mat4 skinningPalette[6]; attribute vec4 position; attribute vec3 normal; attribute vec2 tCoordinates; attribute vec4 jointsWeights; attribute vec4 jointsIndices; varying highp vec2 tCoordinatesVarying; varying highp float lIntensity; void main() { vec3 eyeNormal = normalize(normalMatrix * normal); vec3 lightPosition = vec3(0., 0., 2.); lIntensity = max(0.0, dot(eyeNormal, normalize(lightPosition))); tCoordinatesVarying = tCoordinates; vec4 skinnedVertexPosition = vec4(0.); for (int i = 0; i < 4; i++) { skinnedVertexPosition += jointsWeights[i] * skinningPalette[int(jointsIndices[i])] * position; } gl_Position = projectionMatrix * modelMatrix * skinnedVertexPosition; } The result: The mesh parts that are supposed to animate do animate and follow the expected motion, however, the rotations are messed up in terms of orientations. That is, the mesh is not translated somewhere else or scaled in any way, but the orientations of rotations seem to be off. 
So a few observations: In the above shader notice I actually did not multiply the vertices by the mesh modelMatrix (the one which would take them to model or world or global space, whichever you prefer, since there is no parent to the mesh itself other than "the world") until after skinning. This is contrary to what I implied in the theory: if my skinning matrix takes vertices from model to joint and back to model space, I'd think the vertices should already be premultiplied by the mesh transform. But if I do so, I just get a black screen. As far as exporting the joints from Blender, my python script exports for each armature bone in bind pose, it's matrix in this way: def DFSJointTraversal(file, skeleton, jointList): for joint in jointList: poseJoint = skeleton.pose.bones[joint.name] jointTransform = poseJoint.matrix.inverted() file.write('Joint ' + joint.name + ' Transform {\n') for col in jointTransform.col: file.write('{:9f} {:9f} {:9f} {:9f}\n'.format(col[0], col[1], col[2], col[3])) DFSJointTraversal(file, skeleton, joint.children) file.write('}\n') And for current / keyframe poses (assuming I'm in the right keyframe): def exportAnimations(filepath): # Only one skeleton per scene objList = [object for object in bpy.context.scene.objects if object.type == 'ARMATURE'] if len(objList) == 0: return elif len(objList) > 1: return #raise exception? dialog box? skeleton = objList[0] jointNames = [bone.name for bone in skeleton.data.bones] for action in bpy.data.actions: # One animation clip per action in Blender, named as the action animationClipFilePath = filepath[0 : filepath.rindex('/') + 1] + action.name + ".aClip" file = open(animationClipFilePath, 'w') file.write('target skeleton: ' + skeleton.name + '\n') file.write('joints count: {:d}'.format(len(jointNames)) + '\n') skeleton.animation_data.action = action keyframeNum = max([len(fcurve.keyframe_points) for fcurve in action.fcurves]) keyframes = [] for fcurve in action.fcurves: for keyframe in fcurve.keyframe_points: keyframes.append(keyframe.co[0]) keyframes = set(keyframes) keyframes = [kf for kf in keyframes] keyframes.sort() file.write('keyframes count: {:d}'.format(len(keyframes)) + '\n') for kfIndex in keyframes: bpy.context.scene.frame_set(kfIndex) file.write('keyframe: {:d}\n'.format(int(kfIndex))) for i in range(0, len(skeleton.data.bones)): file.write('joint: {:d}\n'.format(i)) joint = skeleton.pose.bones[i] jointCurrentPoseTransform = joint.matrix translationV = jointCurrentPoseTransform.to_translation() rotationQ = jointCurrentPoseTransform.to_3x3().to_quaternion() scaleV = jointCurrentPoseTransform.to_scale() file.write('T {:9f} {:9f} {:9f}\n'.format(translationV[0], translationV[1], translationV[2])) file.write('Q {:9f} {:9f} {:9f} {:9f}\n'.format(rotationQ[1], rotationQ[2], rotationQ[3], rotationQ[0])) file.write('S {:9f} {:9f} {:9f}\n'.format(scaleV[0], scaleV[1], scaleV[2])) file.write('\n') file.close() Which I believe follow the theory explained at the beginning of my question. But then I checked out Blender's directX .x exporter for reference.. 
and what threw me off was that in the .x script they are exporting bind poses like so (transcribed using the same variable names I used so you can compare): if joint.parent: jointTransform = poseJoint.parent.matrix.inverted() else: jointTransform = Matrix() jointTransform *= poseJoint.matrix and exporting current keyframe poses like this: if joint.parent: jointCurrentPoseTransform = joint.parent.matrix.inverted() else: jointCurrentPoseTransform = Matrix() jointCurrentPoseTransform *= joint.matrix why are they using the parent's transform instead of the joint in question's? isn't the join transform assumed to exist in the context of a parent transform since after all it transforms from this joint's space to its parent's? Why are they concatenating in the same order for both bind poses and keyframe poses? If these two are then supposed to be concatenated with each other to cancel out the change of basis? Anyway, any ideas are appreciated.
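
    One concrete thing worth double-checking in the update: method above, offered as a possibility rather than a diagnosis: the interpolated pose is composed as R * T * S (reading the GLKMatrix4Multiply calls in order), which in a column-major v' = M v convention rotates the translation as well. The conventional local pose is T * R * S:

        GLKMatrix4 T = GLKMatrix4MakeTranslation(LERPTranslation.x, LERPTranslation.y, LERPTranslation.z);
        GLKMatrix4 R = GLKMatrix4MakeWithQuaternion(SLERPRotation);
        GLKMatrix4 S = GLKMatrix4MakeScale(LERPScaling.x, LERPScaling.y, LERPScaling.z);

        // scale first, then rotate, then translate (applied right to left to the vertex)
        GLKMatrix4 currentTransform = GLKMatrix4Multiply(T, GLKMatrix4Multiply(R, S));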

    Read the article
