Search Results

Search found 558 results on 23 pages for 'touches'.

Page 3 of 23

  • UITabbar without controller

    - by Etienne
    Hello. I have a simple app where the only view controller has an outlet to a UITabBar. It also implements UITabBarDelegate and is set as the delegate for the UITabBar: @interface TheMainViewController : UIViewController <UITabBarDelegate> { IBOutlet UITabBar *theTabBar; } I implemented the following method, which gets called whenever any of my 4 UITabBarItems gets tapped. I tried just doing something really simple: - (void)tabBar:(UITabBar *)tabBar didSelectItem:(UITabBarItem *)item { tabBar.selectedItem = [tabBar.items objectAtIndex:0]; return; } In theory, it should always stay selected on my first tab, and it works perfectly when I just tap any UITabBarItem (nothing happens, the first one always stays selected). But when I touch a UITabBarItem and hold it (not taking my finger off) the selection changes anyway! Debugging, everything gets called properly. It's like changing the selectedItem property doesn't have any effect if the user still has the item "down" (with his finger on it). What would be a good workaround? I tried overloading UITabBar and messing with touchesBegan and touchesEnded, but they don't even get called. Same with UITabBarItem. Oh, and please don't suggest using a UITabBarController, as it is not flexible enough for my application. So frustrating... thanks!
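
    One workaround worth trying (an untested sketch, not a confirmed fix): instead of re-setting selectedItem synchronously inside the delegate callback, defer it to the next run-loop pass so it runs after UITabBar has finished processing the touch that is still "down". The resetTabSelection: helper name is made up for illustration.

        - (void)tabBar:(UITabBar *)tabBar didSelectItem:(UITabBarItem *)item {
            // Defer the reset until the tab bar is done with the current touch.
            [self performSelector:@selector(resetTabSelection:)
                       withObject:tabBar
                       afterDelay:0.0];
        }

        - (void)resetTabSelection:(UITabBar *)tabBar {
            tabBar.selectedItem = [tabBar.items objectAtIndex:0];
        }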

    Read the article

  • How To Detect "Touch Down" in superview of UIScrollView?

    - by wgpubs
    I have a UIView that contains a UIScrollView and I want to be able to capture the "Touch Down" event in the UIView any time the user taps on the UIScrollView. I've tried including all the touchesBegan/Ended/Cancelled handlers in my UIViewController but none of them get fired when tapping inside the UIScrollView contained in the main UIView. What is the best way to accomplish this?
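
    One common approach (a sketch; TouchDownScrollView is a placeholder class name): subclass UIScrollView and hand the touch to the next responder before letting the scroll view process it, so the containing view or its controller sees the "touch down" as well.

        @interface TouchDownScrollView : UIScrollView
        @end

        @implementation TouchDownScrollView

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            // Let the superview / view controller see the touch first,
            // then continue with normal scroll-view handling.
            [self.nextResponder touchesBegan:touches withEvent:event];
            [super touchesBegan:touches withEvent:event];
        }

        @end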

    Read the article

  • Is there a new way to make Android tabs slide?

    - by Brian515
    Hi all, I'm new to Android development, and I was wondering if anyone knew either how to make Tabs slide, or how to get a similar effect without tabs. I have quite a few tabs in my application, and it does not look good on devices with smaller screens. Or maybe tabs are not what I am looking for. If you don't know what I'm talking about, I'd like to reproduce something similar to Photoshop.com Mobile's effects screen. I know this is possible. Thanks in advance!

    Read the article

  • Touch and Drag from one view to another

    - by jollyCocoa
    Hi all! I've searched for some clues on this problem without much success. Hope someone can kick me in the right direction. I am prototyping a couple of apps where I need to design my own GUI. The GUI is made up of two separate UIViews, one of which contains a small thumb of an image. I want to be able to drag this thumb from the first view to the other. Simple as that! But I haven't figured out how this is done. Here is the exact flow I am looking for: touch the thumb, animate a small enlargement of the thumb, drag the thumb to the other UIView, drop the thumb, animate a shrink of the thumb. Not particularly strange, but the thumb remains tied to the first view all the time. I've tried to move the thumb via the first view's superview and then back to the second view, but with no luck.
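
    A sketch of one way to do the hand-over (the names firstView, secondView and thumbView are placeholders): re-parent the thumb onto a common ancestor while dragging, converting its position with convertPoint:toView: so it does not visually jump, then attach it to the destination view on release.

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            // Lift the thumb onto the shared superview, keeping its screen position.
            UIView *commonParent = firstView.superview;
            CGPoint center = [thumbView.superview convertPoint:thumbView.center
                                                        toView:commonParent];
            [commonParent addSubview:thumbView];
            thumbView.center = center;
        }

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [touches anyObject];
            thumbView.center = [touch locationInView:thumbView.superview];
        }

        - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [touches anyObject];
            // If dropped over the second view, hand the thumb over to it.
            if (CGRectContainsPoint(secondView.bounds, [touch locationInView:secondView])) {
                CGPoint center = [thumbView.superview convertPoint:thumbView.center
                                                            toView:secondView];
                [secondView addSubview:thumbView];
                thumbView.center = center;
            }
        }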

    Read the article

  • Why do touches only get detected ABOVE my CCBitmapFontAtlas and not ON it? (cocos2d)

    - by RexOnRoids
    I am detecting touches for CCBitmapFontAtlas (just text labels) as shown in the code below. But it seems that touches are only detected slightly ABOVE the CCBitmapFontAtlases? Did something get screwed when converting between coordinate systems? (*Note objects label1, label2, etc are CCBitmapFontAtlas) - (void)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event { for( UITouch *touch in touches ) { CGPoint location = [touch locationInView:[touch view]]; location = [[CCDirector sharedDirector] convertToGL:location]; self.myGraphManager.isSliding = NO; CGRect rectLabel1 = CGRectMake(label1.position.x, label1.position.y, label1.contentSize.width, label1.contentSize.height); CGRect rectLabel2 = CGRectMake(label2.position.x, label2.position.y, label2.contentSize.width, label2.contentSize.height); CGRect rectLabel3 = CGRectMake(label3.position.x, label3.position.y, label3.contentSize.width, label3.contentSize.height); CGRect rectLabel4 = CGRectMake(label4.position.x, label4.position.y, label4.contentSize.width, label4.contentSize.height); CGRect rectLabel5 = CGRectMake(label5.position.x, label5.position.y, label5.contentSize.width, label5.contentSize.height); CGRect rectLabel6 = CGRectMake(label6.position.x, label6.position.y, label6.contentSize.width, label6.contentSize.height); if(CGRectContainsPoint(rectLabel1, location)){ NSLog(@"Label 1 Touched"); }else if(CGRectContainsPoint(rectLabel2, location)){ NSLog(@"Label 2 Touched"); }else if(CGRectContainsPoint(rectLabel3, location)){ NSLog(@"Label 3 Touched"); }else if(CGRectContainsPoint(rectLabel4, location)){ NSLog(@"Label 4 Touched"); }else if(CGRectContainsPoint(rectLabel5, location)){ NSLog(@"Label 5 Touched"); }else if(CGRectContainsPoint(rectLabel6, location)){ NSLog(@"Label 6 Touched"); } } }
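
    A likely cause worth checking: a CCBitmapFontAtlas has a centered anchor point by default, so position is the label's centre rather than its lower-left corner, and a rect built with position as its origin ends up shifted up and to the right by half the label's size. A sketch of the fix (assuming the parent layer itself is not moved or scaled, so parent coordinates match the converted GL point):

        // Use the label's actual bounds instead of treating its position as a corner.
        CGRect rectLabel1 = [label1 boundingBox];
        // ...or equivalently, offset the origin by half the content size:
        // CGRect rectLabel1 = CGRectMake(label1.position.x - label1.contentSize.width / 2,
        //                                label1.position.y - label1.contentSize.height / 2,
        //                                label1.contentSize.width,
        //                                label1.contentSize.height);
        if (CGRectContainsPoint(rectLabel1, location)) {
            NSLog(@"Label 1 Touched");
        }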

    Read the article

  • How can I differentiate two different touches on a layer ?

    - by srikanth rongali
    I am writing an app in cocos2d. I hava a sprite and a text in my scene. I have written two separate classes for sprite and text. And I added both of them to another class. In sprite class I have written - (void)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event And in text class I have written -(void) registerWithTouchDispatcher { [[CCTouchDispatcher sharedDispatcher]addTargetedDelegate:self priority:0 swallowsTouches:YES]; } -(BOOL) ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event { return YES; } -(void) ccTouchEnded:(UITouch *)touch withEvent:(UIEvent *)event { NSLog(@"Recognized tOuches in Instructions");// CGSize windowSize = [[CCDirector sharedDirector] winSize]; CCNode *node = [self getChildByTag:kTagNode]; [node setPosition: ccp(text1.contentSize.width/2,text1.contentSize.height/2 - windowSize.height)]; } -(void) ccTouchMoved:(UITouch *)touch withEvent:(UIEvent *)event { CGPoint touchLocation = [touch locationInView: [touch view]]; CGPoint prevLocation = [touch previousLocationInView: [touch view]]; touchLocation = [[CCDirector sharedDirector] convertToGL: touchLocation]; prevLocation = [[CCDirector sharedDirector] convertToGL: prevLocation]; CGPoint diff = ccpSub(touchLocation,prevLocation); CCNode *node = [self getChildByTag:kTagNode]; CGPoint currentPos = [node position]; [node setPosition: ccpAdd(currentPos, diff)]; } But, only touches in the text are recognized and touch of sprite is not recognized ? How can I differentiate the two touches.
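
    One pattern that can separate the two (a sketch): register each class as a targeted touch delegate, as the text class already does, and claim the touch in ccTouchBegan: only when it actually lands on that class's own node, so the dispatcher passes it on to the other delegate otherwise. Here touchableNode stands for whichever sprite or label the class owns.

        - (void)registerWithTouchDispatcher {
            // Use different priorities if one class should get first refusal.
            [[CCTouchDispatcher sharedDispatcher] addTargetedDelegate:self
                                                             priority:0
                                                      swallowsTouches:YES];
        }

        - (BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event {
            CGPoint p = [[CCDirector sharedDirector]
                            convertToGL:[touch locationInView:[touch view]]];
            // Accept (and swallow) the touch only when it hits this class's node.
            return CGRectContainsPoint([touchableNode boundingBox], p);
        }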

    Read the article

  • Custom Gesture in cocos2d

    - by Lewis
    I've found a little tutorial that would be useful for my game: http://blog.mellenthin.de/archives/2012/02/13/an-one-finger-rotation-gesture-recognizer/ But I can't work out how to convert that gesture to work with cocos2d, I have found examples of pre made gestures in cocos2d, but no custom ones, is it possible? EDIT STILL HAVING PROBLEMS WITH THIS: I've added the code from Sentinel below (from SO), the Gesture and RotateGesture have both been added to my solution and are compiling. Although In the rotation class now I only see selectors, how do I set those up? As the custom gesture found in that project above looks like: header file for custom gesture: #import <Foundation/Foundation.h> #import <UIKit/UIGestureRecognizerSubclass.h> @protocol OneFingerRotationGestureRecognizerDelegate <NSObject> @optional - (void) rotation: (CGFloat) angle; - (void) finalAngle: (CGFloat) angle; @end @interface OneFingerRotationGestureRecognizer : UIGestureRecognizer { CGPoint midPoint; CGFloat innerRadius; CGFloat outerRadius; CGFloat cumulatedAngle; id <OneFingerRotationGestureRecognizerDelegate> target; } - (id) initWithMidPoint: (CGPoint) midPoint innerRadius: (CGFloat) innerRadius outerRadius: (CGFloat) outerRadius target: (id) target; - (void)reset; - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event; - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event; - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event; - (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event; @end .m for custom gesture file: #include <math.h> #import "OneFingerRotationGestureRecognizer.h" @implementation OneFingerRotationGestureRecognizer // private helper functions CGFloat distanceBetweenPoints(CGPoint point1, CGPoint point2); CGFloat angleBetweenLinesInDegrees(CGPoint beginLineA, CGPoint endLineA, CGPoint beginLineB, CGPoint endLineB); - (id) initWithMidPoint: (CGPoint) _midPoint innerRadius: (CGFloat) _innerRadius outerRadius: (CGFloat) _outerRadius target: (id <OneFingerRotationGestureRecognizerDelegate>) _target { if ((self = [super initWithTarget: _target action: nil])) { midPoint = _midPoint; innerRadius = _innerRadius; outerRadius = _outerRadius; target = _target; } return self; } /** Calculates the distance between point1 and point 2. 
*/ CGFloat distanceBetweenPoints(CGPoint point1, CGPoint point2) { CGFloat dx = point1.x - point2.x; CGFloat dy = point1.y - point2.y; return sqrt(dx*dx + dy*dy); } CGFloat angleBetweenLinesInDegrees(CGPoint beginLineA, CGPoint endLineA, CGPoint beginLineB, CGPoint endLineB) { CGFloat a = endLineA.x - beginLineA.x; CGFloat b = endLineA.y - beginLineA.y; CGFloat c = endLineB.x - beginLineB.x; CGFloat d = endLineB.y - beginLineB.y; CGFloat atanA = atan2(a, b); CGFloat atanB = atan2(c, d); // convert radiants to degrees return (atanA - atanB) * 180 / M_PI; } #pragma mark - UIGestureRecognizer implementation - (void)reset { [super reset]; cumulatedAngle = 0; } - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event { [super touchesBegan:touches withEvent:event]; if ([touches count] != 1) { self.state = UIGestureRecognizerStateFailed; return; } } - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event { [super touchesMoved:touches withEvent:event]; if (self.state == UIGestureRecognizerStateFailed) return; CGPoint nowPoint = [[touches anyObject] locationInView: self.view]; CGPoint prevPoint = [[touches anyObject] previousLocationInView: self.view]; // make sure the new point is within the area CGFloat distance = distanceBetweenPoints(midPoint, nowPoint); if ( innerRadius <= distance && distance <= outerRadius) { // calculate rotation angle between two points CGFloat angle = angleBetweenLinesInDegrees(midPoint, prevPoint, midPoint, nowPoint); // fix value, if the 12 o'clock position is between prevPoint and nowPoint if (angle > 180) { angle -= 360; } else if (angle < -180) { angle += 360; } // sum up single steps cumulatedAngle += angle; // call delegate if ([target respondsToSelector: @selector(rotation:)]) { [target rotation:angle]; } } else { // finger moved outside the area self.state = UIGestureRecognizerStateFailed; } } - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event { [super touchesEnded:touches withEvent:event]; if (self.state == UIGestureRecognizerStatePossible) { self.state = UIGestureRecognizerStateRecognized; if ([target respondsToSelector: @selector(finalAngle:)]) { [target finalAngle:cumulatedAngle]; } } else { self.state = UIGestureRecognizerStateFailed; } cumulatedAngle = 0; } - (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event { [super touchesCancelled:touches withEvent:event]; self.state = UIGestureRecognizerStateFailed; cumulatedAngle = 0; } @end Then its initialised like this: // calculate center and radius of the control CGPoint midPoint = CGPointMake(image.frame.origin.x + image.frame.size.width / 2, image.frame.origin.y + image.frame.size.height / 2); CGFloat outRadius = image.frame.size.width / 2; // outRadius / 3 is arbitrary, just choose something >> 0 to avoid strange // effects when touching the control near of it's center gestureRecognizer = [[OneFingerRotationGestureRecognizer alloc] initWithMidPoint: midPoint innerRadius: outRadius / 3 outerRadius: outRadius target: self]; [self.view addGestureRecognizer: gestureRecognizer]; The selector below is also in the same file where the initialisation of the gestureRecogonizer: - (void) rotation: (CGFloat) angle { // calculate rotation angle imageAngle += angle; if (imageAngle > 360) imageAngle -= 360; else if (imageAngle < -360) imageAngle += 360; // rotate image and update text field image.transform = CGAffineTransformMakeRotation(imageAngle * M_PI / 180); [self updateTextDisplay]; } I can't seem to get this working in the RotateGesture class can anyone help me please 
I've been stuck on this for days now. SECOND EDIT: Here is the users code from SO that was suggested to me: Here is projec on GitHub: SFGestureRecognizers It uses builded in iOS UIGestureRecognizer, and don't needs to be integrated into cocos2d sources. Using it, You can make any gestures, just like you could, if you whould work with UIGestureRecognizer. For example: I made a base class Gesture, and subclassed it for any new gesture: //Gesture.h @interface Gesture : NSObject <UIGestureRecognizerDelegate> { UIGestureRecognizer *gestureRecognizer; id delegate; SEL preSolveSelector; SEL possibleSelector; SEL beganSelector; SEL changedSelector; SEL endedSelector; SEL cancelledSelector; SEL failedSelector; BOOL preSolveAvailable; CCNode *owner; } - (id)init; - (void)addGestureRecognizerToNode:(CCNode*)node; - (void)removeGestureRecognizerFromNode:(CCNode*)node; -(void)recognizer:(UIGestureRecognizer*)recognizer; @end //Gesture.m #import "Gesture.h" @implementation Gesture - (id)init { if (!(self = [super init])) return self; preSolveAvailable = YES; return self; } - (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer { return YES; } - (BOOL)gestureRecognizer:(UIGestureRecognizer *)recognizer shouldReceiveTouch:(UITouch *)touch { //! For swipe gesture recognizer we want it to be executed only if it occurs on the main layer, not any of the subnodes ( main layer is higher in hierarchy than children so it will be receiving touch by default ) if ([recognizer class] == [UISwipeGestureRecognizer class]) { CGPoint pt = [touch locationInView:touch.view]; pt = [[CCDirector sharedDirector] convertToGL:pt]; for (CCNode *child in owner.children) { if ([child isNodeInTreeTouched:pt]) { return NO; } } } return YES; } - (void)addGestureRecognizerToNode:(CCNode*)node { [node addGestureRecognizer:gestureRecognizer]; owner = node; } - (void)removeGestureRecognizerFromNode:(CCNode*)node { [node removeGestureRecognizer:gestureRecognizer]; } #pragma mark - Private methods -(void)recognizer:(UIGestureRecognizer*)recognizer { CCNode *node = recognizer.node; if (preSolveSelector && preSolveAvailable) { preSolveAvailable = NO; [delegate performSelector:preSolveSelector withObject:recognizer withObject:node]; } UIGestureRecognizerState state = [recognizer state]; if (state == UIGestureRecognizerStatePossible && possibleSelector) { [delegate performSelector:possibleSelector withObject:recognizer withObject:node]; } else if (state == UIGestureRecognizerStateBegan && beganSelector) [delegate performSelector:beganSelector withObject:recognizer withObject:node]; else if (state == UIGestureRecognizerStateChanged && changedSelector) [delegate performSelector:changedSelector withObject:recognizer withObject:node]; else if (state == UIGestureRecognizerStateEnded && endedSelector) { preSolveAvailable = YES; [delegate performSelector:endedSelector withObject:recognizer withObject:node]; } else if (state == UIGestureRecognizerStateCancelled && cancelledSelector) { preSolveAvailable = YES; [delegate performSelector:cancelledSelector withObject:recognizer withObject:node]; } else if (state == UIGestureRecognizerStateFailed && failedSelector) { preSolveAvailable = YES; [delegate performSelector:failedSelector withObject:recognizer withObject:node]; } } @end Subclass example: //RotateGesture.h #import "Gesture.h" @interface RotateGesture : Gesture - (id)initWithTarget:(id)target preSolveSelector:(SEL)preSolve 
possibleSelector:(SEL)possible beganSelector:(SEL)began changedSelector:(SEL)changed endedSelector:(SEL)ended cancelledSelector:(SEL)cancelled failedSelector:(SEL)failed; @end //RotateGesture.m #import "RotateGesture.h" @implementation RotateGesture - (id)initWithTarget:(id)target preSolveSelector:(SEL)preSolve possibleSelector:(SEL)possible beganSelector:(SEL)began changedSelector:(SEL)changed endedSelector:(SEL)ended cancelledSelector:(SEL)cancelled failedSelector:(SEL)failed { if (!(self = [super init])) return self; preSolveSelector = preSolve; delegate = target; possibleSelector = possible; beganSelector = began; changedSelector = changed; endedSelector = ended; cancelledSelector = cancelled; failedSelector = failed; gestureRecognizer = [[UIRotationGestureRecognizer alloc] initWithTarget:self action:@selector(recognizer:)]; gestureRecognizer.delegate = self; return self; } @end Use example: - (void)addRotateGesture { RotateGesture *rotateRecognizer = [[RotateGesture alloc] initWithTarget:self preSolveSelector:@selector(rotateGesturePreSolveWithRecognizer:node:) possibleSelector:nil beganSelector:@selector(rotateGestureStateBeganWithRecognizer:node:) changedSelector:@selector(rotateGestureStateChangedWithRecognizer:node:) endedSelector:@selector(rotateGestureStateEndedWithRecognizer:node:) cancelledSelector:@selector(rotateGestureStateCancelledWithRecognizer:node:) failedSelector:@selector(rotateGestureStateFailedWithRecognizer:node:)]; [rotateRecognizer addGestureRecognizerToNode:movableAreaSprite]; } I dont understand how to implement the custom gesture code at the start of this post into the rotateGesture class which is a subclass of the gesture class written by the SO user. Any ideas please? When I get 6 more rep I'll add a bounty to this.
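
    For wiring the OneFingerRotationGestureRecognizer from the first listing straight into cocos2d, one option (a sketch; assumes cocos2d 1.x, where the director exposes openGLView, while 2.x uses view, and note that the recognizer works in flipped UIKit coordinates rather than GL coordinates) is to skip the wrapper classes and add it to the OpenGL view directly:

        - (void)onEnter {
            [super onEnter];
            // Mid point must be in UIKit (view) coordinates, y flipped versus GL.
            CGSize win = [[CCDirector sharedDirector] winSize];
            CGPoint midPoint = CGPointMake(sprite.position.x,
                                           win.height - sprite.position.y);
            CGFloat outRadius = sprite.contentSize.width / 2;
            rotationRecognizer = [[OneFingerRotationGestureRecognizer alloc]
                                     initWithMidPoint:midPoint
                                          innerRadius:outRadius / 3
                                          outerRadius:outRadius
                                               target:self];
            [[[CCDirector sharedDirector] openGLView] addGestureRecognizer:rotationRecognizer];
        }

        - (void)rotation:(CGFloat)angle {
            sprite.rotation += angle;   // CCNode rotation is in degrees
        }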

    Read the article

  • How does overlayViewTouched notification work in the MoviePlayer sample code

    - by Jonathan
    Hi, I have a question regarding the MoviePlayer sample code provided by apple. I don't understand how the overlayViewTouch notification works. The NSlog message I added to it does not get sent when I touch the view (not button). // post the "overlayViewTouch" notification and will send // the overlayViewTouches: message - (void)overlayViewTouches:(NSNotification *)notification { NSLog(@"overlay view touched"); // Handle touches to the overlay view (MyOverlayView) here... } I can, however, get the NSlog notification if I place it in -(void)touchesBegan in "MyOverlayView.m". Which makes me think it is recognizing touches but not sending a notification. // Handle any touches to the overlay view - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event { UITouch* touch = [touches anyObject]; if (touch.phase == UITouchPhaseBegan) { NSLog(@"overlay touched(from touchesBegan") // IMPORTANT: // Touches to the overlay view are being handled using // two different techniques as described here: // // 1. Touches to the overlay view (not in the button) // // On touches to the view we will post a notification // "overlayViewTouch". MyMovieViewController is registered // as an observer for this notification, and the // overlayViewTouches: method in MyMovieViewController // will be called. // // 2. Touches to the button // // Touches to the button in this same view will // trigger the MyMovieViewController overlayViewButtonPress: // action method instead. NSNotificationCenter *nc = [NSNotificationCenter defaultCenter]; [nc postNotificationName:OverlayViewTouchNotification object:nil]; } } Can anyone shed light on what I am missing or doing wrong? Thank you.
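
    One thing worth checking (a sketch): the overlayViewTouches: method is only called if MyMovieViewController actually registered itself as an observer for that exact notification name; if the registration is missing or the name constant differs, the view still logs touchesBegan but nobody receives the notification. Registration looks roughly like this, typically in viewDidLoad:

        [[NSNotificationCenter defaultCenter] addObserver:self
                                                 selector:@selector(overlayViewTouches:)
                                                     name:OverlayViewTouchNotification
                                                   object:nil];
        // ...and balance it with removeObserver: in dealloc / viewDidUnload.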

    Read the article

  • How to get objects to react to touches in Cocos2D?

    - by Wayfarer
    Alright, so I'm starting to learn more about Coco2D, but I'm kinda frusterated. A lot of the tutorials I have found are for outdated versions of the code, so when I look through and see how they do certain things, I can't translate it into my own program, because a lot has changed. With that being said, I am working in the latest version of Coco2d, version 0.99. What I want to do is create a sprite on the screen (Done) and then when I touch that sprite, I can have "something" happen. For now, let's just make an alert go off. Now, I got this code working with the help of a friend. Here is the header file: // When you import this file, you import all the cocos2d classes #import "cocos2d.h" // HelloWorld Layer @interface HelloWorld : CCLayer { CGRect spRect; } // returns a Scene that contains the HelloWorld as the only child +(id) scene; @end And here is the implementation file: // // cocos2d Hello World example // http://www.cocos2d-iphone.org // // Import the interfaces #import "HelloWorldScene.h" #import "CustomCCNode.h" // HelloWorld implementation @implementation HelloWorld +(id) scene { // 'scene' is an autorelease object. CCScene *scene = [CCScene node]; // 'layer' is an autorelease object. HelloWorld *layer = [HelloWorld node]; // add layer as a child to scene [scene addChild: layer]; // return the scene return scene; } // on "init" you need to initialize your instance -(id) init { // always call "super" init // Apple recommends to re-assign "self" with the "super" return value if( (self=[super init] )) { // create and initialize a Label CCLabel* label = [CCLabel labelWithString:@"Hello World" fontName:@"Times New Roman" fontSize:64]; // ask director the the window size CGSize size = [[CCDirector sharedDirector] winSize]; // position the label on the center of the screen label.position = ccp( size.width /2 , size.height/2 ); // add the label as a child to this Layer [self addChild: label]; CCSprite *sp = [CCSprite spriteWithFile:@"test2.png"]; sp.position = ccp(300,200); [self addChild:sp]; float w = [sp contentSize].width; float h = [sp contentSize].height; CGPoint aPoint = CGPointMake([sp position].x - (w/2), [sp position].y - (h/2)); spRect = CGRectMake(aPoint.x, aPoint.y, w, h); CCSprite *sprite2 = [CCSprite spriteWithFile:@"test3.png"]; sprite2.position = ccp(100,100); [self addChild:sprite2]; //[self registerWithTouchDispatcher]; self.isTouchEnabled = YES; } return self; } // on "dealloc" you need to release all your retained objects - (void) dealloc { // in case you have something to dealloc, do it in this method // in this particular example nothing needs to be released. // cocos2d will automatically release all the children (Label) // don't forget to call "super dealloc" [super dealloc]; } - (void)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event { UITouch *touch = [touches anyObject]; //CGPoint location = [[CCDirector sharedDirector] convertCoordinate:[touch locationInView:touch.view]]; CGPoint location = [touch locationInView:[touch view]]; location = [[CCDirector sharedDirector] convertToGL:location]; if (CGRectContainsPoint(spRect, location)) { UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Win" message:@"testing" delegate:nil cancelButtonTitle:@"okay" otherButtonTitles:nil]; [alert show]; [alert release]; NSLog(@"TOUCHES"); } NSLog(@"Touch got"); } However, this only works for 1 object, the sprite which I create the CGRect for. I can't do it for 2 sprites, which I was testing. 
So my question is this: How can I have all sprites on the screen react to the same event when touched? For my program, the same event needs to be run for all objects of the same type, so that should make it a tad easier. I tried making a subclass of CCNode and over write the method, but that just didn't work at all... so I'm doing something wrong. Help would be appreciated!
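
    One way to extend this to any number of sprites (a sketch): keep the touchable sprites in an array instead of a single hand-built spRect, then loop over them in the touch handler and run the shared reaction for whichever one contains the point. touchableSprites is an assumed NSMutableArray ivar filled in init with sp, sprite2, and so on.

        - (void)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [touches anyObject];
            CGPoint location = [[CCDirector sharedDirector]
                                   convertToGL:[touch locationInView:[touch view]]];

            for (CCSprite *sprite in touchableSprites) {
                if (CGRectContainsPoint([sprite boundingBox], location)) {
                    // Same reaction for every sprite of this type.
                    UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Win"
                                                                    message:@"testing"
                                                                   delegate:nil
                                                          cancelButtonTitle:@"okay"
                                                          otherButtonTitles:nil];
                    [alert show];
                    [alert release];
                    break;
                }
            }
        }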

    Read the article

  • Detecting UITableView scrolling

    - by Xeph
    Hi I've subclassed UITableView (as KRTableView) and implemented the four touch-based methods (touchesBegan, touchesEnded, touchesMoved, and touchesCancelled) so that I can detect when a touch-based event is being handled on a UITableView. Essentially what I need to detect is when the UITableView is scrolling up or down. However, subclassing UITableView and creating the above methods only detects when scrolling or finger movement is occuring within a UITableViewCell, not on the entire UITableView. As soon as my finger is moved onto the next cell, the touch events don't do anything. This is how I'm subclassing UITableView: #import "KRTableView.h" @implementation KRTableView - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event { [super touchesBegan:touches withEvent:event]; NSLog(@"touches began..."); } - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event { [super touchesMoved:touches withEvent:event]; NSLog(@"touchesMoved occured"); } - (void)touchesCancelled:(NSSet*)touches withEvent:(UIEvent *)event { [super touchesCancelled:touches withEvent:event]; NSLog(@"touchesCancelled occured"); } - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event { [super touchesEnded:touches withEvent:event]; NSLog(@"A tap was detected on KRTableView"); } @end How can I detect when the UITableView is scrolling up or down?
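
    Since UITableView is a UIScrollView subclass, the usual route (a sketch using the table view's delegate rather than a subclass) is to implement the UIScrollViewDelegate callbacks, which fire continuously while the table scrolls:

        // UITableViewDelegate already conforms to UIScrollViewDelegate.
        - (void)scrollViewDidScroll:(UIScrollView *)scrollView {
            CGFloat y = scrollView.contentOffset.y;
            if (y > lastOffsetY) {
                NSLog(@"scrolling towards the bottom of the table");
            } else if (y < lastOffsetY) {
                NSLog(@"scrolling towards the top of the table");
            }
            lastOffsetY = y;   // assumed CGFloat ivar
        }

        - (void)scrollViewWillBeginDragging:(UIScrollView *)scrollView {
            NSLog(@"user started dragging the table");
        }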

    Read the article

  • Detect ANY touch in a view (iPhone SDK)

    - by David
    Hello, I'm currently using ... - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event { - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event { - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event { to detect swipes. I've got everything working. The only problem is if the user touches on top of something (eg a UIButton or something) the - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event { is not called. Is there something like touchesBegan but will work if I touch ANYWHERE on the view? Thanks in advance, David
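
    One approach that sees every touch no matter which subview (a UIButton included) ends up handling it (a sketch; SpyWindow is a placeholder name): subclass UIWindow, override sendEvent:, and make the app use that window class. Every touch event passes through the window before normal dispatch.

        @interface SpyWindow : UIWindow
        @end

        @implementation SpyWindow

        - (void)sendEvent:(UIEvent *)event {
            if (event.type == UIEventTypeTouches) {
                for (UITouch *touch in [event allTouches]) {
                    if (touch.phase == UITouchPhaseBegan) {
                        NSLog(@"touch began at %@",
                              NSStringFromCGPoint([touch locationInView:self]));
                    }
                }
            }
            [super sendEvent:event];   // keep normal event delivery intact
        }

        @end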

    Read the article

  • How can we detect a touch of a sprite?

    - by srikanth rongali
    I have two sprites in my app. Both should have touches enabled, and their touches should be independent of one another. If I touch the screen (not on the sprites) that should count as a different touch again. My problem is that all three (sprite1, sprite2, and the remaining screen) should have independent touches, but my program treats all the touches as the same. How can I make them work the way I need? Thank you.

    Read the article

  • How to know when the user touches the OK button of the last StoreKit alert "Thank you. Your purchase

    - by Walchy
    I have integrated "In App Purchase" in a game to let the user unlock more levels. Everything works fine, but I have a little problem with the last alert "Thank You. Your purchase was successful. [OK]". My program gets informed that the transaction was successfully completed before this last alert pops up and so my game starts running again - then the alert comes up, annoying the user. I would like to wait with my game running until the user touches the "OK" button, but since it is an alert from StoreKit I have no idea when this happens or how I could catch it. I don't want to create another dialog (this time my own, therefor under my control) below the alert, just asking for touching "OK" again - would be a bad user experience. Anybody have any ideas?

    Read the article

  • Intercepting/Hijacking iPhone Touch Events for MKMapView

    - by Shawn
    Is there a bug in the 3.0 SDK that disables real-time zooming and intercepting the zoom-in gesture for the MKMapView? I have some real simple code so I can detect tap events, but there are two problems: zoom-in gesture is always interpreted as a zoom-out none of the zoom gestures update the Map's view in realtime. In hitTest, if I return the "map" view, the MKMapView functionality works great, but I don't get the opportunity to intercept the events. Any ideas? MyMapView.h: @interface MyMapView : MKMapView { UIView *map; } MyMapView.m: - (id)initWithFrame:(CGRect)frame { if (![super initWithFrame:frame]) return nil; self.multipleTouchEnabled = true; return self; } - (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event { NSLog(@"Hit Test"); map = [super hitTest:point withEvent:event]; return self; } - (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event { NSLog(@"%s", __FUNCTION__); [map touchesCancelled:touches withEvent:event]; } - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent*)event { NSLog(@"%s", __FUNCTION__); [map touchesBegan:touches withEvent:event]; } - (void)touchesMoved:(NSSet*)touches withEvent:(UIEvent*)event { NSLog(@"%s, %x", __FUNCTION__, mViewTouched); [map touchesMoved:touches withEvent:event]; } - (void)touchesEnded:(NSSet*)touches withEvent:(UIEvent*)event { NSLog(@"%s, %x", __FUNCTION__, mViewTouched); [map touchesEnded:touches withEvent:event]; }
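
    On SDKs that have UIGestureRecognizer (3.2 and later), a less invasive route than hijacking hitTest: (a sketch; mapTapped: is a placeholder selector) is to attach a recognizer to the map view and allow it to run alongside the map's own gestures, so real-time zooming keeps working:

        // e.g. in viewDidLoad:
        UITapGestureRecognizer *tap =
            [[UITapGestureRecognizer alloc] initWithTarget:self
                                                    action:@selector(mapTapped:)];
        tap.delegate = self;   // self conforms to UIGestureRecognizerDelegate
        [mapView addGestureRecognizer:tap];
        [tap release];

        // Let the map's built-in pan/pinch recognizers keep working too.
        - (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
            shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)other {
            return YES;
        }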

    Read the article

  • Static when metal USB plug touches my case, and other electrical problems.

    - by Archagon
    I have an aluminum PC case. Whenever the metal USB plug from my external drive contacts the front, I get crackling from my speakers, which are connected to an external USB soundcard. (This crackling happens even if they're not connected to the actual output jack.) A few possibly related problems: my audio occasionally starts popping once every few minutes, and my USB devices sometimes play the "connected" sound in Windows even though they're already connected, as if they're briefly disconnecting. My guess is that this has to do with the grounding, but I'm not sure exactly what to do. My case has a round grounding wire, but I don't know where to attach it, and fiddling with it didn't seem to have any effect. Suggestions?

    Read the article

  • What could cause a button on a UIActionSheet to "miss" on touches?

    - by Alex Gosselin
    I have a UIActionSheet as follows: UIActionSheet *sheet = [[UIActionSheet alloc] initWithTitle:[NSString stringWithFormat: @"Cancel New %@? Changes will be lost.", [creator propertyName]] delegate:self cancelButtonTitle:@"Stay Here" destructiveButtonTitle:@"Discard and Close" otherButtonTitles:@"Save and Close", nil]; [sheet showInView:self.view]; [sheet release]; It creates the action sheet and the buttons display: the destructive button is on top, the cancel button is on the bottom, and the other button (Save and Close) shows up in the middle. The top two buttons (destructive and other) work fine, but the bottom button has a gap, so it is farther down than the other buttons. For some reason, though, in order to press that button I need to touch where it would be if there were no gap; touching the actual button doesn't work. Sorry if this isn't super clear. Has anyone encountered something like this? I don't like to whip out the "I found a bug" card too fast; maybe I'm doing something wrong here.
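
    This symptom (buttons drawn lower than the region that responds) is often reported when the sheet is shown inside a view that sits under a tab bar or toolbar; a sketch of the usual fix is to present it from the enclosing bar or the window rather than from self.view:

        // If the controller lives inside a UITabBarController:
        [sheet showFromTabBar:self.tabBarController.tabBar];

        // ...or, as a general fallback, present it in the window:
        [sheet showInView:self.view.window];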

    Read the article

  • zoomfactor value in CGAffineTransformMakeScale in iPhone

    - by suse
    Hello, 1) I'm doing pinch zoom on the UIImageView. How should I decide upon the zoom factor value? When the zoom factor goes beyond 0 [i.e. becomes negative] the image gets tilted, which I don't want to happen. How do I avoid this situation? 2) Why is a flickering kind of rotation happening, and why not a smooth rotation? Will this be taken care of by the CGAffineTransformMakeScale(zoomfactor,zoomfactor) method? This is what I'm doing in my code: zoomFactor = 0;// Initially zoomfactor is set to zero - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event{ NSLog(@" Inside touchesBegan .................."); NSArray *twoTouches = [touches allObjects]; UITouch *first = [twoTouches objectAtIndex:0]; OPERATION = [self identifyOperation:touches :first]; NSLog(@"OPERATION : %d",OPERATION); if(OPERATION == OPERATION_PINCH){ //double touch pinch UITouch *second = [twoTouches objectAtIndex:1]; f_G_initialDistance = distanceBetweenPoints([first locationInView:self.view],[second locationInView:self.view]); } NSLog(@" leaving touchesBegan .................."); } - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event { NSLog(@" Inside touchesMoved ................."); NSArray *twoTouchPoints = [touches allObjects]; if(OPERATION == OPERATION_PINCH){ CGFloat currentDistance = distanceBetweenPoints([[twoTouchPoints objectAtIndex:0] locationInView:self.view],[[twoTouchPoints objectAtIndex:1] locationInView:self.view]); int pinchOperation = [self identifyPinchOperation:f_G_initialDistance :currentDistance]; G_zoomFactor = [self calculateZoomFactor:pinchOperation :G_zoomFactor]; [uiImageView_G_obj setTransform:CGAffineTransformMakeScale(G_zoomFactor, G_zoomFactor)]; [self.view bringSubviewToFront:resetButton]; [self.view bringSubviewToFront:uiSlider_G_obj]; f_G_initialDistance = currentDistance; } NSLog(@" leaving touchesMoved .................."); } - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event { NSLog(@" Inside touchesEnded .................."); NSArray *twoTouches = [touches allObjects]; UITouch *first = [twoTouches objectAtIndex:0]; if(OPERATION == OPERATION_PINCH){ //do nothing } NSLog(@" Leaving touchesEnded .................."); } Thank You.
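
    For (1), a simple guard (a sketch): treat the zoom factor as a multiplicative scale that starts at 1.0 (not 0) and clamp it to a positive range, so CGAffineTransformMakeScale never receives zero or a negative value (which mirrors the image):

        // G_zoomFactor should be initialised to 1.0f, not 0.
        CGFloat scaleStep = currentDistance / f_G_initialDistance;
        G_zoomFactor *= scaleStep;

        // Keep the scale strictly positive and within sensible bounds.
        G_zoomFactor = MAX(0.5f, MIN(G_zoomFactor, 4.0f));

        [uiImageView_G_obj setTransform:CGAffineTransformMakeScale(G_zoomFactor, G_zoomFactor)];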

    Read the article

  • Draggable cards (touch enumeration) issue

    - by glitch
    I'm trying to let a player tap, drag and release a card from a fanned stack on the screen to a 4x4 field on the board. My cards are instantiated from a custom class that inherits from the UIImageView class. I started with the Touches sample app, and I modified the event handlers for touches to iterate over my player's card hand instead of the 3 squares the sample app allows you to move on screen. Everything works, until that is, I move the card I'm dragging near another card. I'm really drawing a blank here for the logic to get the cards to behave properly. Here's my code: - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event { NSUInteger numTaps = [[touches anyObject] tapCount]; if(numTaps = 1) { for (UITouch *touch in touches) { [self dispatchFirstTouchAtPoint:[touch locationInView: self.boardCardView] forEvent:nil]; } } } -(void) dispatchFirstTouchAtPoint:(CGPoint)touchPoint forEvent:(UIEvent *)event { for (int i = 0; i<5; i++) { UIImageView *touchedCard = boardBuffer[i]; if (CGRectContainsPoint([touchedCard frame], touchPoint)) { [self animateFirstTouchAtPoint:touchPoint forView:touchedCard]; } } } - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event { NSUInteger touchCount = 0; for (UITouch *touch in touches){ [self dispatchTouchEvent:[touch view] toPosition:[touch locationInView:self.boardCardView]]; touchCount++; } } My questions are: How do I get the touch logic to disallow other cards from being picked up by a dragging finger? Is there anyway I can only enumerate the objects that are directly below a player's finger and explicitly disable other objects from responding? Thanks!
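
    One way to stop neighbouring cards from being picked up mid-drag (a sketch; activeCard is an assumed ivar of the card class): lock onto the card the gesture started on in touchesBegan and only ever move that one until the touch ends.

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            CGPoint point = [[touches anyObject] locationInView:self.boardCardView];
            for (int i = 0; i < 5; i++) {
                if (CGRectContainsPoint([boardBuffer[i] frame], point)) {
                    activeCard = boardBuffer[i];   // remember the picked-up card
                    break;
                }
            }
        }

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            if (!activeCard) return;   // ignore drags that did not start on a card
            activeCard.center = [[touches anyObject] locationInView:self.boardCardView];
        }

        - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            // snap-to-grid / swap logic for activeCard goes here
            activeCard = nil;
        }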

    Read the article

  • How can I take zooming into account when a user touches a UIScrollView?

    - by Bill
    I have a UIImageView inside of a UIScrollView. The parent scroll view allows zooming and panning. When the user taps a point in the scroll view, I want to find the location in the raw image inside the UIImageView - i.e. I want the point after including any zooming and panning the user has done in the scroll view. Right now, I have a UIScrollView subclass called ForwardingScrollView that handles touch events and attempts to convert them into locations in the coordinate system of the child image view. I tried adding contentOffset to these points, tried multiplying them by zoomScale, and even tried doing both. I also tried calling [touch locationInView: self] and [touch locationInView: parent], but none of these methods correctly return the point that I clicked in the underlying image. What's the best way to do this? Thanks in advance.
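
    The conversion usually comes for free (a sketch; self.imageView is an assumed reference to the zoomed child): because the image view is the view being zoomed and panned, asking the touch for its location in the image view's coordinate system already folds in both zoomScale and contentOffset.

        - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [touches anyObject];

            // Point in raw image coordinates, zoom and pan already accounted for.
            CGPoint imagePoint = [touch locationInView:self.imageView];

            // Manual equivalent (assuming the image view sits at the content origin):
            // CGPoint p = [touch locationInView:self];   // includes contentOffset
            // CGPoint imagePoint = CGPointMake(p.x / self.zoomScale,
            //                                  p.y / self.zoomScale);

            NSLog(@"tapped image at %@", NSStringFromCGPoint(imagePoint));
        }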

    Read the article

  • iPhone smooth move and pinch of UIImageView

    - by Jacob
    I have an image view that I'm wanting to be able to move around, and pinch to stretch it. It's all working, but it's kinda jumpy when I start to do any pinch movements. The position will jump back and forth between the two fingers. - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event { startLocation = [[touches anyObject] locationInView:mouth_handle]; if([touches count] == 2) { NSArray *twoTouches = [touches allObjects]; UITouch *first = [twoTouches objectAtIndex:0]; UITouch *second = [twoTouches objectAtIndex:1]; initialDistance = distanceBetweenPoints([first locationInView:mouth_handle],[second locationInView:mouth_handle]); } } - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event { CGPoint pt = [[touches anyObject] locationInView:mouth_handle]; CGRect frame = [mouth_handle frame]; frame.origin.x += pt.x - startLocation.x; frame.origin.y += pt.y - startLocation.y; frame.origin.x = (frame.origin.x < 58) ? 58 : frame.origin.x; frame.origin.x = (frame.origin.x > (260 - mouth_handle.frame.size.width)) ? (260 - mouth_handle.frame.size.width) : frame.origin.x; frame.origin.y = (frame.origin.y < 300) ? 300 : frame.origin.y; frame.origin.y = (frame.origin.y > 377) ? 377 : frame.origin.y; if(frame.origin.x - prevDistanceX > 2 && frame.origin.x - prevDistanceX < -2) frame.origin.x = prevDistanceX; if(frame.origin.y - prevDistanceY > 2 && frame.origin.y - prevDistanceY < -2) frame.origin.y = prevDistanceY; prevDistanceX = frame.origin.x; prevDistanceY = frame.origin.y; CGFloat handleWidth = mouth_handle.frame.size.width; if([touches count] == 2) { NSArray *twoTouches = [touches allObjects]; UITouch *first = [twoTouches objectAtIndex:0]; UITouch *second = [twoTouches objectAtIndex:1]; CGFloat currentDistance = distanceBetweenPoints([first locationInView:mouth_handle],[second locationInView:mouth_handle]); handleWidth = mouth_handle.frame.size.width + (currentDistance - initialDistance); handleWidth = (handleWidth < 60) ? 60 : handleWidth; handleWidth = (handleWidth > 150) ? 150 : handleWidth; if(initialDistance == 0) { initialDistance = currentDistance; } initialDistance = currentDistance; } mouth_handle.frame = CGRectMake(frame.origin.x, frame.origin.y, handleWidth, 15); } Any thoughts on how to make this smoother?
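
    The jumping usually comes from anyObject alternating between the two fingers during a pinch; one fix (a sketch) is to drive the position from the midpoint of both touches whenever two fingers are down, and only use the single-finger path otherwise:

        if ([touches count] == 2) {
            NSArray *two = [touches allObjects];
            // Track points in a stationary view, not in the view being moved.
            CGPoint a = [[two objectAtIndex:0] locationInView:mouth_handle.superview];
            CGPoint b = [[two objectAtIndex:1] locationInView:mouth_handle.superview];

            // The midpoint is stable regardless of which finger anyObject returns.
            CGPoint mid = CGPointMake((a.x + b.x) / 2.0f, (a.y + b.y) / 2.0f);
            mouth_handle.center = mid;   // then apply the existing clamping
        } else {
            CGPoint pt = [[touches anyObject] locationInView:mouth_handle.superview];
            // existing single-finger drag code
        }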

    Read the article

  • Create a screen like the iPhone home screen with Scrollview and Buttons

    - by Anthony Chan
    Hi, I'm working on a project and need to create a screen similar to the iPhone home screen: A scrollview with multiple pages A bunch of icons When not in edit mode, swipe through different pages (even I started the touch on an icon) When not in edit mode, tap an icon to do something When in edit mode, drag the icon to swap places, and even swap to different pages When in edit mode, tap an icon to remove it Previously I read from several forums that I have to subclass UIScrollview in order to have touch input for the UIViews on top of it. So I subclassed it overriding the methods to handle touches: - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event { //If not dragging, send event to next responder if (!self.dragging) [self.nextResponder touchesBegan:touches withEvent:event]; else [super touchesBegan:touches withEvent:event]; } In general I've override the touchesBegan:, touchesMoved: and touchesEnded: methods similarly. Then in the view controller, I added to following code: - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event { UITouch *touch = [touches anyObject]; UIView *hitView = (UIView *)touch.view; if ([hitView isKindOfClass:[UIView class]]) { [hitView doSomething]; NSLog(@"touchesBegan"); } } - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event { // Some codes to move the icons NSLog(@"touchesMoved"); } - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event { NSLog(@"touchesEnded"); } When I run the app, I have the touchesBegan method detected correctly. However, when I tried to drag the icon, the icon just moved a tiny bit and then the page started to scroll. In console, it logged with 2 or 3 "touchesMoved" message only. However, I learned from another project that it should logged tonnes of "touchesMoved" message as long as I'm still dragging on the screen. (I'm suspecting I have the delaysContentTouches set to YES, so it delays a little bit when I tried to drag the icons. After that minor delay, it sends to signal back to the scrollview to scroll through the page. Please correct me if I'm wrong.) So if any help on the code to perform the above tasks would be greatly appreciated. I've stuck in this place for nearly a week with no hope. Thanks a lot.
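
    Two UIScrollView properties control most of this behaviour; a sketch of a setup that lets a drag started on an icon still scroll the page, while taps reach the icon, and that keeps icons draggable in edit mode (editMode is an assumed BOOL):

        scrollView.delaysContentTouches = NO;     // icons receive touchesBegan at once
        scrollView.canCancelContentTouches = YES; // a real swipe still becomes a scroll

        // In the UIScrollView subclass, decide per touch whether scrolling may
        // cancel the touch that a subview (an icon) is already tracking.
        - (BOOL)touchesShouldCancelInContentView:(UIView *)view {
            return !self.editMode;   // in edit mode, keep the touch so icons can drag
        }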

    Read the article

  • Android 2.2 browser: pageX/pageY don't work in the ontouchend event

    - by juanca
    I have a web app that works perfectly in Android 2.1, but when I upgraded to 2.2 the pageX property stopped working in the ontouchend event. This is my code: menu1.ontouchend = function(e){ e.preventDefault(); if (e.touches && e.touches.length > 0) { // iPhone x2 = e.touches[0].pageX; y2 = e.touches[0].pageY; } else { // all others x2 = e.pageX; y2 = e.pageY; } } Anybody know what changed in the JavaScript API for touch events from 2.1 to 2.2?

    Read the article
