Search Results

Search found 128 results on 6 pages for 'uitouch'.


  • Using UITouch inside a UIScrollView

    - by Chris
    Hi all, just toying with the SDK, and I was wondering whether a UITouch event can work inside a UIScrollView. I have set up a UIScrollView which holds a large UIView; inside that UIView is a UIImageView. I've managed to get UITouch to drag the UIImageView outside of the UIScrollView, but inside it the event isn't registering. What I'm trying to accomplish is dragging the UIImageView around the large UIView, with the UIScrollView scrolling along if the user drags the image beyond the position it occupied when the drag began, if that makes sense? Many thanks
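
    One thing worth sketching (an assumption, not from the question): a UIScrollView normally cancels touches in its content in favour of scrolling, and touchesShouldCancelInContentView: is the documented hook for changing that. A minimal sketch, where draggedImageView is a hypothetical reference to the image view being moved:

        @interface DraggableScrollView : UIScrollView {
            UIImageView *draggedImageView; // hypothetical: set by the owner
        }
        @end

        @implementation DraggableScrollView
        // Returning NO keeps the scroll view from cancelling touches on
        // the image view, so its touch handlers keep receiving events.
        - (BOOL)touchesShouldCancelInContentView:(UIView *)view {
            if (view == draggedImageView) return NO;
            return [super touchesShouldCancelInContentView:view];
        }
        @end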

    Read the article

  • iPhone App rejected because of Three20 private API undocumented, private UITouch instance variables:

    - by Sijo
    I got a notification mail after submitting to the App Store: "During our review of your application we found it is using private APIs, which is in violation of the iPhone Developer Program License Agreement section 3.3.1; '3.3.1 Applications may only use Documented APIs in the manner prescribed by Apple and must not use or call any private APIs.' While your application has not been rejected, it would be appropriate to resolve this issue in your next update. The non-public APIs that are included in your application are the following undocumented, private UITouch instance variables: firstResponder, UITouch._locationInWindow, UITouch._phase, UITouch._previousLocationInWindow, UITouch._tapCount, UITouch._timestamp, UITouch._touchFlags, UITouch._view, UITouch._window. Please resolve this issue in your next update to your application." My application contains Three20, and these variables are used in "UIViewAdditions.m". Is there any way to resolve this issue? Please help me. Thanks in advance

    Read the article

  • UITouch Event Propagation To Background UIViews

    - by drewww
    I'm having trouble getting any UIView that's not the foreground UIView to receive UITouch events. I'm building an all-Core Graphics app, so I'm not using any built-in UIViews or IB or anything - everything is programmatically constructed and drawn into. Here's my view hierarchy:

        Root View
            Type A Container
                Type A View
                Type A View
                Type A View
            Type B Container
                Type B View
                Type B View
                Type B View

    The containers are just vanilla UIView objects that I create programmatically and add instances of Type A and B to when they're created. I did this originally to make hit testing easier - Type A objects can be drag-and-dropped onto Type B objects. Type A objects receive touch events fine, but Type B objects (which are contained by Type B Container, which is behind Type A Container) don't receive touch events. Both containers occupy the entire screen; they're basically just convenience containers. If I pull Type B Container to the front (e.g. [self.view bringSubviewToFront:...]) it receives events properly, but then Type A Container doesn't get events. How do I propagate events from the view that's on top? Both views occupy the entire screen, so it makes sense that the top-most view is catching the events, but how should I get it to pass those events on to Type B Container? I could inject some code in the container that passes the touch events back to the main ViewController, which could pass them on to Type B Container, but that feels really messy to me. Is there a nicer way to not have Type A Container stop propagation? What's the best practice here?
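
    One common pattern (a sketch, not from the question): make the front container transparent to touches that don't land on one of its own subviews, by overriding hitTest:withEvent: in the Type A Container:

        - (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
            UIView *hit = [super hitTest:point withEvent:event];
            // A hit on the container itself means no Type A subview was
            // touched; returning nil lets the view behind receive the touch.
            return (hit == self) ? nil : hit;
        }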

    Read the article

  • UITouch Events and Table Views

    - by Andy
    I'm working on a navigation-based iPhone-only app that serves two main purposes: one, to present data in a hierarchical view, allowing users to drill down and eventually edit said data; and two, to allow users to perform a default action when a table view cell is tapped. I now need to offer a small set of options tied to the same data; however, both the didSelectRowAtIndexPath: and accessoryButtonTappedForRowAtIndexPath: methods are obviously taken. So my options seem to be to implement a double-tap method, wherein the small list of additional options would be presented after (you guessed it) a double-tap on said table row; or, preferably, a tap-and-hold method. From what I can tell, tap-and-hold seems like the way to go in SDK 4.0 - which does me no good right this red-hot minute. I decided to go with the double-tap option, but I'm having a little trouble. First and foremost, the touchesBegan:withEvent: method does not seem to be getting called at all; a breakpoint placed within the method is never hit while the application runs, and the table view responds exactly as it did before I inserted the method (which is to say, it performs the default action):

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *aTouch = [touches anyObject];
            if (aTouch.tapCount == 2) {
                [NSObject cancelPreviousPerformRequestsWithTarget:self];
            }
        }

    Second, I don't really need to handle a single-tap - the didSelectRowAtIndexPath: method can handle the single-tap just fine. The double-tap is the funky one I want to handle. I suspect the answer is going to contain the phrase, "You can't have the table view handle the single-tap and the touchesBegan: method handle the double-tap. The touch handling methods have to handle all of them." I would really appreciate some guidance from some of you who've dealt with this issue. Thanks in advance.
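
    The touches go to the table view itself, not to the controller, which is why the breakpoint never fires. One sketch (an assumption about the setup, with the subclass installed in place of the stock UITableView): intercept the double-tap in a UITableView subclass and let everything else fall through to super so normal selection keeps working.

        @interface DoubleTapTableView : UITableView
        @end

        @implementation DoubleTapTableView
        - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [touches anyObject];
            if (touch.tapCount == 2) {
                NSIndexPath *path = [self indexPathForRowAtPoint:[touch locationInView:self]];
                NSLog(@"double-tap on row %@", path); // present the extra options here
            }
            [super touchesEnded:touches withEvent:event]; // keep single-tap selection working
        }
        @end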

    Read the article

  • Cancel UITouch Events When View Covered By Modal UIViewController

    - by kkrizka
    Hi there, I am writing an application where the user has to move some stuff on the screen using his fingers and drop them. To do this, I am using the touchesBegan/touchesEnded... methods of each view that has to be moved. The problem is that sometimes the views are covered by a view displayed using the [UIViewController presentModalViewController] method. As soon as that happens, the UIView that I was moving stops receiving touch events, since it was covered up. But there is no event telling me that it stopped receiving them, so I cannot reset the state of the moved view. The following is an example that demonstrates this. The methods are part of a UIView shown in the main window. It listens to touch events, and when I drag a finger for some distance, it presents a modal view that covers everything. In the Run Log, it prints which touch events are received.

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            NSLog(@"touchesBegan");
            touchStart = [[touches anyObject] locationInView:self];
        }

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            CGPoint touchAt = [[touches anyObject] locationInView:self];
            float xx = (touchAt.x - touchStart.x) * (touchAt.x - touchStart.x);
            float yy = (touchAt.y - touchStart.y) * (touchAt.y - touchStart.y);
            float rr = xx + yy;
            NSLog(@"touchesMoved %f", rr);
            if (rr > 100) {
                NSLog(@"Show modal");
                [viewController presentModalViewController:[UIViewController new] animated:NO];
            }
        }

        - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            NSLog(@"touchesEnded");
        }

        - (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
            NSLog(@"touchesCancelled");
        }

    But when I test the application and trigger the modal dialog, the following is the output in the Run Log:

        [Session started at 2010-03-27 16:17:14 -0700.]
        2010-03-27 16:17:18.831 modelTouchCancel[2594:207] touchesBegan
        2010-03-27 16:17:19.485 modelTouchCancel[2594:207] touchesMoved 2.000000
        2010-03-27 16:17:19.504 modelTouchCancel[2594:207] touchesMoved 4.000000
        2010-03-27 16:17:19.523 modelTouchCancel[2594:207] touchesMoved 16.000000
        2010-03-27 16:17:19.538 modelTouchCancel[2594:207] touchesMoved 26.000000
        2010-03-27 16:17:19.596 modelTouchCancel[2594:207] touchesMoved 68.000000
        2010-03-27 16:17:19.624 modelTouchCancel[2594:207] touchesMoved 85.000000
        2010-03-27 16:17:19.640 modelTouchCancel[2594:207] touchesMoved 125.000000
        2010-03-27 16:17:19.641 modelTouchCancel[2594:207] Show modal

    Any suggestions on how to reset the state of a UIView when its touch events are interrupted by a modal view?
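
    Since no touchesCancelled arrives once the modal covers the view, one sketch (an assumption about the surrounding code) is to reset the drag state at the moment the modal is presented, because that code path already knows the gesture is being cut short:

        if (rr > 100) {
            NSLog(@"Show modal");
            // No touchesCancelled will be delivered after this point, so
            // clear the drag state now; resetDragState is a hypothetical
            // helper that clears touchStart and any in-flight move.
            [self resetDragState];
            [viewController presentModalViewController:[UIViewController new] animated:NO];
        }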

    Read the article

  • Getting x/y coordinate of a UITouch...

    - by Tarek
    Hi, I have been trying to get the x/y coordinates from a touch on any iDevice. When getting the touch locations, everything looks OK if the touch is in the middle of the screen. But if I drag my finger to the bottom of the screen, I can only get a y coordinate of 1015; it should be getting to 1023. Same thing for dragging my finger to the top of the screen: I get -6 when it should be 0. I have explicitly set the window and views to an origin of 0,0 and the width and height of the device's screen. Still nothing. I am really lost on what might be going on. Is something shifted? Am I not reading the x/y coordinates properly? Does something need to be transformed or converted? Any help would be much appreciated. T
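
    A quick diagnostic (a sketch, assuming the handler lives in a view controller): log the same touch in window coordinates and in view coordinates. If the two disagree by a constant, the view itself is offset (the 20-point status bar is a common cause) rather than the touch being misread.

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [touches anyObject];
            // Passing nil gives window coordinates, unaffected by the
            // receiving view's own origin or transform.
            CGPoint inWindow = [touch locationInView:nil];
            CGPoint inView   = [touch locationInView:self.view];
            NSLog(@"window: %@  view: %@",
                  NSStringFromCGPoint(inWindow), NSStringFromCGPoint(inView));
        }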

    Read the article

  • Differentiating Between UITouch Objects On The iPhone

    - by Jasarien
    Hey guys, I'm trying to differentiate between two (or more) UITouch objects on the iPhone. Specifically, I'd like to know the order in which the touches occurred. For instance, in my -touchesBegan:withEvent: method I get an NSSet of UITouch objects. Now I can find out how many touches there are, but which object represents which finger? I notice the timestamp property on UITouch - is this what I'm looking for? I see how that would be useful for obtaining the last or first touch - provided the touches don't mutate... Therein lies my problem. I can use the timestamp to single out the latest touch, but then the touch that occurred first moves, and IT becomes the latest touch... At the end of this exercise, I'd like to be able to implement the "pinch" gesture to zoom in or out, etc. Any help would be greatly appreciated, thanks.
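
    A UITouch object is persistent for the life of its touch sequence, so object identity (not timestamp) is the reliable way to tell fingers apart. A sketch, assuming an NSMutableArray ivar named orderedTouches; UITouch objects must not be retained, hence the non-retained wrappers:

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            for (UITouch *t in touches) {
                // Appending preserves arrival order: index 0 is always the
                // earliest finger that is still down.
                [orderedTouches addObject:[NSValue valueWithNonretainedObject:t]];
            }
        }

        - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            for (UITouch *t in touches) {
                // NSValue equality compares the wrapped pointer, so this
                // removes exactly the wrapper added in touchesBegan.
                [orderedTouches removeObject:[NSValue valueWithNonretainedObject:t]];
            }
        }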

    Read the article

  • Odd values/movement with UITouch and CGPoint.

    - by Joshua
    I'm getting odd numbers from UITouch and CGPoint, and one is different; I also think this may be causing a flickering effect in my app when I try to move something by following a touch. This is the code I'm using:

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            NSLog(@"touchDown");
            UITouch *touch = [touches anyObject];
            firstTouch = [touch locationInView:self.view];
            if (CGRectContainsPoint(but.frame, firstTouch)) {
                butContains = YES;
                NSLog(@"butContains = %d", butContains);
            }
        }

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [touches anyObject];
            currentTouch = [touch locationInView:self.view];
            NSInteger x = currentTouch.x;
            NSInteger y = currentTouch.y;
            CGFloat CGX = (CGFloat)x;
            CGFloat CGY = (CGFloat)y;
            if (butContains == YES) {
                NSLog(@"touch in subView/contentView");
                sub.frame = CGRectMake(CGX, CGY, 130.0, 21.0);
            }
            NSLog(@"touch moved");
        }

        - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [touches anyObject];
            currentTouch = [touch locationInView:self.view];
            NSLog(@"User tapped at %@", NSStringFromCGPoint(currentTouch));
            // Note: %a prints hexadecimal floating-point, and passing
            // NSInteger arguments to %a is undefined - hence the odd output.
            NSLog(@"Point %a, %a", currentTouch.x, currentTouch.y);
            NSInteger x = currentTouch.x;
            NSInteger y = currentTouch.y;
            NSLog(@"Point %a, %a", y, x);
            CGFloat CGX = (CGFloat)x;
            CGFloat CGY = (CGFloat)y;
            NSLog(@"Point %g, %g", CGX, CGY);
            if (butContains == YES) {
                NSLog(@"touch in subView/contentView");
                sub.frame = CGRectMake(CGX, CGY, 130.0, 21.0);
            }
            butContains = NO;
            NSLog(@"touch ended");
        }

        - (IBAction)add:(id)sender {
            InSightViewController *contentView = [[InSightViewController alloc] initWithNibName:@"SubView" bundle:[NSBundle mainBundle]];
            [contentView loadView];
            [self.view insertSubview:contentView.view atIndex:0];
        }

    This is what I get from the touchesEnded method in the Debugger:

        2010-04-20 20:06:13.045 InSight[25042:207] User tapped at {50, 78}
        2010-04-20 20:06:13.047 InSight[25042:207] Point 0x1.9p+5, 0x1.38p+6
        2010-04-20 20:06:13.048 InSight[25042:207] Point 0x1.900000027p-1037, 0x1.38p+6
        2010-04-20 20:06:13.048 InSight[25042:207] Point 50, 78

    And this is what's happening in the Simulator: fwdr.org/file:y8bd. As this is a complicated problem, here is the source code of my Xcode project as well: http://cl.ly/Qjj

    Read the article

  • Multiple touch problem using UITouch/UIView in iphone

    - by John Qualis
    Hi, I am trying to implement a two-finger "pinch" and "expand" (or enlarge) on a UIView using an iPhone 3G and SDK 3.1.2. I haven't programmed with UITouch/UIEvent before, so I appreciate any guidance on the problem I am facing. When I touch the screen with two fingers, I see that sometimes "touchesBegan" gives me only one touch. I need the UITouch count to equal 2 so I can measure the distance between the two fingers, so I have to lift my fingers and repeat the process until both fingers are detected. Is there a way around this? Can I improve it? The actual resize works correctly once both fingers are detected, but this issue at the start needs to be resolved. I know SDK 3.2 detects gestures such as pinch, but I was wondering how it can be done in 3.1.2. The code is given below. The "OnFingerDown" function gets called randomly even when I put two fingers down, whereas I want "OnTwoFingersDown" to be called each time. Moreover, I get only one touch event each time I put my fingers down; I mean, if "OnFingerDown" were called twice I could somehow get it to work. Thanks in advance. Appreciate any help. John

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch, *touch1;
            CGPoint location, location1;
            if ([[event allTouches] count] == 1) {
                touch = [[[event allTouches] allObjects] objectAtIndex:0];
                location = [touch locationInView:self];
                OnFingerDown(location);
            } else if ([[event allTouches] count] == 2) {
                touch = [[[event allTouches] allObjects] objectAtIndex:0];
                location = [touch locationInView:self];
                touch1 = [[[event allTouches] allObjects] objectAtIndex:1];
                location1 = [touch1 locationInView:self];
                OnTwoFingersDown(location, location1);
            }
        }
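
    One thing worth ruling out (an assumption, not stated in the question): a UIView delivers only single-touch events unless multi-touch is switched on, which would explain touchesBegan arriving with a single touch.

        // In the view's setup code (e.g. initWithFrame: or awakeFromNib):
        self.multipleTouchEnabled = YES; // defaults to NO; required for two-finger events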

    Read the article

  • Unable to forward UITouch events to my view controller

    - by hyn
    I have a UISplitViewController set up with a custom view added as a subview of the split view controller's view (UILayoutContainerView). I am trying to forward touch events from my custom view controller to the master and detail views, but the following (which was suggested in another thread here) seems to have no effect:

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [touches anyObject];
            // Do something
            [self.nextResponder touchesBegan:touches withEvent:event];
        }

    As a result my custom view controller locks the events, and all the UI underneath never has a chance to do anything. How can I get my master and detail view controllers to receive events?
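
    One reason forwarding to nextResponder goes nowhere: the responder chain climbs upward (view to its controller to the superview), never sideways into sibling controllers. A sketch of forwarding to the targets directly, where masterView and detailView are hypothetical references the overlay would need to be given:

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            CGPoint p = [[touches anyObject] locationInView:masterView];
            // Route the event to whichever pane the touch actually fell in.
            if ([masterView pointInside:p withEvent:event]) {
                [masterView touchesBegan:touches withEvent:event];
            } else {
                [detailView touchesBegan:touches withEvent:event];
            }
        }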

    Read the article

  • UIScrollView zoomToRect not zooming to given rect (created from UITouch CGPoint)

    - by pmhart
    My application has a UIScrollView with one subview. The subview is an extended UIView which draws a PDF page into itself using layers in drawLayer. Zooming with the built-in pinching works great, and setZoomScale also works as expected. I have been struggling with the zoomToRect function. I found an example online which builds a CGRect zoomRect from a given CGPoint. In touchesEnded, if there was a double tap and the view is all the way zoomed out, I want to zoom in to the PDFUIView I created as though the user were pinching out, with the center of the pinch where they double-tapped. So assume that I pass the UITouch to my function, which calls zoomToRect on a double tap. I started with the following function from Apple's site: http://developer.apple.com/iphone/library/documentation/WindowsViews/Conceptual/UIScrollView_pg/ZoomZoom/ZoomZoom.html The following is a modified version for my extended UIScrollView class:

        - (void)zoomToCenter:(float)scale withCenter:(CGPoint)center {
            CGRect zoomRect;
            zoomRect.size.height = self.frame.size.height / scale;
            zoomRect.size.width = self.frame.size.width / scale;
            zoomRect.origin.x = center.x - (zoomRect.size.width / 2.0);
            zoomRect.origin.y = center.y - (zoomRect.size.height / 2.0);
            //return zoomRect;
            [self zoomToRect:zoomRect animated:YES];
        }

    When I do this, the UIScrollView seems to zoom using the bottom-right edge of the zoomRect above, not the center. If I make a UIView like this:

        UIView *v = [[UIView alloc] initWithFrame:zoomRect];
        [v setBackgroundColor:[UIColor redColor]];
        [self addSubview:v];

    the red box shows up with the touch point dead in the center. (Please note: I am writing this from my PC; I recall messing around with the divided-by-two part on my Mac, so just assume that this draws a rect with the touch point in the center.) If the UIView drew off-center but zoomed to the right spot, it would be all good. However, what happens is that zoomToRect seems to put the bottom right of the zoomRect at the top left of the zoomed-in result. Also, I noticed that depending on where I click on the UIScrollView, it anchors to different spots. It almost seems like there is a cross down the middle and it's reflecting the points somehow, as though anywhere left of the middle is a negative reflection and anywhere right of the middle is a positive one. This seems too complicated; shouldn't it just zoom to the rect that was drawn by the UIView? I did a lot of research to figure out how to create a PDF that scales in high quality, so I am assuming that using the CALayer may be throwing off the coordinate system. But to the UIScrollView it should just be a view with 768x985 dimensions. This is sort of advanced; please assume the code for creating the zoomRect is all good. There is something deeper going on with the CALayer in the UIView inside the UIScrollView...
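
    One detail that often explains exactly this symptom: zoomToRect: expects the rect in the coordinate space of the view returned by viewForZoomingInScrollView:, not in the scroll view's own space. A sketch of converting the tapped point first, assuming pdfView is the zooming subview:

        - (void)zoomToPoint:(CGPoint)center scale:(float)scale {
            // Convert from scroll-view coordinates into the zooming
            // subview's coordinates before building the target rect.
            CGPoint p = [self convertPoint:center toView:pdfView];
            CGRect zoomRect;
            zoomRect.size.width  = self.frame.size.width  / scale;
            zoomRect.size.height = self.frame.size.height / scale;
            zoomRect.origin.x = p.x - zoomRect.size.width  / 2.0;
            zoomRect.origin.y = p.y - zoomRect.size.height / 2.0;
            [self zoomToRect:zoomRect animated:YES];
        }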

    Read the article

  • touchesBegan and other touch events not getting detected in UINavigationController

    - by SaltyNuts
    In short, I want to detect a touch on the navigation controller's title bar, but I'm having trouble catching any touches at all! Everything is done without IB, if that makes a difference. My app delegate's .m file contains:

        MyViewController *viewController = [[MyViewController alloc] init];
        navigationController = [[UINavigationController alloc] initWithRootViewController:viewController];
        [window addSubview:navigationController.view];

    There are a few other subviews added to this window in a way that overlays the navigationController, leaving only the navigation bar visible. MyViewController is a subclass of UIViewController and its .m file contains:

        - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            for (UITouch *touch in touches) {
                NSLog(@"ended\n");
            }
        }

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            for (UITouch *touch in touches) {
                NSLog(@"began\n");
            }
        }

    I also tried putting these methods directly into the app delegate's .m file, but the console remains blank. What am I doing wrong?
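
    A view controller only sees touches that land on its own view; the navigation bar consumes touches on itself before they reach the controller. One documented workaround (a sketch): install a custom titleView that handles its own touches.

        @interface TappableTitleView : UIView
        @end

        @implementation TappableTitleView
        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            NSLog(@"title bar touched");
        }
        @end

        // In MyViewController's viewDidLoad (sketch):
        TappableTitleView *title = [[TappableTitleView alloc] initWithFrame:CGRectMake(0, 0, 200, 44)];
        self.navigationItem.titleView = title;
        [title release];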

    Read the article

  • touchesEnded not being called??? or randomly being called

    - by Rob
    If I lift my finger up off the first touch, then it will recognize the next touch just fine. It's only when I hold my first touch down continuously and then try to touch a different area with a different finger at the same time; it will then incorrectly register that second touch as being from the first touch again.

    Update: It has something to do with touchesEnded not being called until the very LAST touch has ended (it doesn't care if you already had 5 other touches end before you finally let go of the last one... it calls them all to end once the very last touch ends).

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [touches anyObject];
            NSString *filename = [listOfStuff objectAtIndex:[touch view].tag];
            // do something with the filename now
        }

        - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [touches anyObject];
            NSString *buttonPressed = [listOfStuff objectAtIndex:[touch view].tag];
            // do something with this info now
        }
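
    Two things worth checking (assumptions, since the setup isn't shown): multipleTouchEnabled defaults to NO, in which case a view tracks only one touch and later fingers get folded into it; and anyObject picks an arbitrary member of the set, so several touches ending in one event can be misread. A sketch:

        // During setup: without this, the view only ever tracks one touch.
        someView.multipleTouchEnabled = YES;

        - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            // Handle every touch that ended in this event, not just anyObject.
            for (UITouch *touch in touches) {
                NSString *buttonPressed = [listOfStuff objectAtIndex:[touch view].tag];
                // handle each ended touch individually
            }
        }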

    Read the article

  • touchesBegan / Ended incorrectly identifying second, third, etc. touch

    - by Rob
    I have an issue where touchesBegan and touchesEnded are incorrectly identifying my second, third, etc. touch if I continue to hold down my first touch. If I lift my finger up off the first touch, then it will recognize the next touch just fine. It's only when I hold my first touch down continuously and then try to touch a different area with a different finger at the same time; it will then incorrectly register that second touch as being from the first touch again. Any insights into how I can fix this?

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [touches anyObject];
            NSString *filename = [listOfStuff objectAtIndex:[touch view].tag];
            // do something with the filename now
        }

        - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [touches anyObject];
            NSString *buttonPressed = [listOfStuff objectAtIndex:[touch view].tag];
            // do something with this info now
        }

    Read the article

  • How do you tell what object is being touched in touchesBegan?

    - by Flafla2
    I know that this is a very commonly asked question, but all of the answers on every website don't work! If you still don't know what I mean, then maybe this code will help you understand:

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [[event allTouches] anyObject];
            CGPoint location = [touch locationInView:self.view];
            if (touch.view == nextbutton)
                [self performSelector:@selector(next)];
            if (touch.view == prevbutton)
                [self performSelector:@selector(previous)];
            if (touch.view == moreoptionsbutton)
                [self performSelector:@selector(moresettings)];
        }

    It doesn't do anything when you touch nextbutton, prevbutton, or moreoptionsbutton, which are UIImageViews, by the way. I have also tried using isEqual: instead of ==, but that hasn't worked out either. Any suggestions?
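
    One detail worth checking (an assumption, since the setup isn't shown): UIImageView ships with userInteractionEnabled set to NO, so it is skipped by hit testing and touch.view ends up being its superview instead. A sketch of the one-time fix, e.g. in viewDidLoad:

        // Image views ignore touches by default; opt them in so they
        // can be the hit-tested view that touch.view reports.
        nextbutton.userInteractionEnabled = YES;
        prevbutton.userInteractionEnabled = YES;
        moreoptionsbutton.userInteractionEnabled = YES;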

    Read the article

  • iPad. UIBarButtonItem has an undocumented view of type UIToolbarTextButton. Huh?

    - by dugla
    I have an iPad app with a view controller that is the UIGestureRecognizerDelegate for a number of UIGestureRecognizers. I have implemented the following method of UIGestureRecognizerDelegate:

        - (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch {
            // Double tapping anywhere on the screen hides/shows the toolbar
            if ([gestureRecognizer isKindOfClass:[UITapGestureRecognizer class]] == YES) {
                if (touch.tapCount == 2) {
                    self.toolbar.hidden = self.toolbar.isHidden ? NO : YES;
                }
            }
            // All gestures are ignored unless they happen on the fullscreen EAGLView
            if ([touch.view isKindOfClass:[EAGLView class]] == NO) {
                return NO;
            }
            return YES;
        }

    My setup is a fullscreen EAGLView with a UIToolbar atop the EAGLView, and a UIBarButtonItem on the toolbar. The idea is that double-tapping anywhere toggles the appearance of the toolbar; all other gestures must occur on the EAGLView. My problem is that taps directly on the UIBarButtonItem report touch.view to be UIToolbarTextButton, a UIView subclass which is undocumented and can't be introspected. Huh? Can someone suggest a workaround, preferably one that uses introspective goodness of some form? Thanks, Doug
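
    A workaround that avoids naming the private class at all (a sketch): test ancestry instead of class, since any tap on the toolbar or its items lands on a view descending from the toolbar.

        - (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldReceiveTouch:(UITouch *)touch {
            // Ignore anything on or inside the toolbar, whatever its class.
            if ([touch.view isDescendantOfView:self.toolbar]) {
                return NO;
            }
            return [touch.view isKindOfClass:[EAGLView class]];
        }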

    Read the article

  • Handling touches in UITableViewController

    - by subw
    I want to implement the handling of an additional swipe gesture in my UITableViewController. However, it seems that in the case of table views the usual touch handling methods like -touchesBegan:withEvent: of the controller are not called. How can I handle touches on a UITableView?
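
    The table view itself receives the touches, so one option (a sketch, with the subclass installed in place of the stock table) is a UITableView subclass that observes them and forwards to super so scrolling and selection keep working:

        @interface SwipeAwareTableView : UITableView
        @end

        @implementation SwipeAwareTableView
        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            // Inspect the touches here to detect the custom swipe...
            [super touchesMoved:touches withEvent:event]; // ...then let the table behave normally
        }
        @end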

    Read the article

  • iPhone - how to track touches and allow button taps at the same time?

    - by Jonathan Cohen
    I'm wondering how to track touches anywhere on the iPhone screen and still have UIButtons respond to taps. I subclassed a UIView, made it fullscreen and the highest view in the hierarchy, and overrode its pointInside:withEvent: method. If I return YES, I'm able to track touches anywhere on the screen, but the buttons don't respond (likely because the view handles and terminates the touch). If I return NO, the touch passes through the view and the buttons respond, but I'm not able to track touches. Do I need to subclass UIButton, or is this possible through the responder chain? What am I doing wrong?

        - (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
            return NO;
        }

        // only works if pointInside:withEvent: returns YES.
        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            NSLog(@"began");
            [self.nextResponder touchesBegan:touches withEvent:event];
        }

        // only works if pointInside:withEvent: returns YES.
        - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            NSLog(@"end");
            [self.nextResponder touchesEnded:touches withEvent:event];
        }
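
    One documented way to observe every touch without stealing any of them (a sketch): subclass UIWindow and override sendEvent:, which sees all events before normal hit-test delivery, then call super so buttons still get their taps.

        @interface ObservingWindow : UIWindow
        @end

        @implementation ObservingWindow
        - (void)sendEvent:(UIEvent *)event {
            if (event.type == UIEventTypeTouches) {
                for (UITouch *touch in [event allTouches]) {
                    NSLog(@"phase %d at %@", (int)touch.phase,
                          NSStringFromCGPoint([touch locationInView:self]));
                }
            }
            [super sendEvent:event]; // deliver normally so buttons still respond
        }
        @end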

    Read the article

  • Detect if certain UIView was touched amongst other UIViews

    - by Rudiger
    Hi guys, sorry if this has been answered elsewhere, but I can't seem to get it to work. I have three UIViews layered on top of one large UIView. I want to know if the user touches the top one and not care about the other ones. I will have a couple of buttons in the second UIView and a UITableView in the third UIView. The problem is that I turn userInteractionEnabled on for the first view and that works, but all the other views respond in the same way even if I turn it off for them. If I disable userInteractionEnabled on self.view, none of them respond. I also can't detect which view was touched in the touchesBegan delegate method. My code:

        UIView *aView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 320, 150)];
        aView.userInteractionEnabled = YES;
        [self.view addSubview:aView];

        UIView *bView = [[UIView alloc] initWithFrame:CGRectMake(0, 150, 320, 50)];
        bView.userInteractionEnabled = NO;
        [self.view addSubview:bView];

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            // This gets called for a touch anywhere
        }

    Thanks for any help.
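
    The touch itself carries the hit-tested view, so comparing pointers in touchesBegan tells the views apart - provided the views are kept in ivars rather than locals (an assumption here) and have userInteractionEnabled set appropriately. A sketch:

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [touches anyObject];
            if (touch.view == aView) {
                NSLog(@"top view touched");
            } else if (touch.view == bView) {
                NSLog(@"second view touched");
            }
        }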

    Read the article

  • Can I detect if higher subview has been touched?

    - by Kevin Beimers
    I've got a big UIView that responds to touches, and it's covered with lots of little UIViews that respond differently to touches. Is it possible to touch anywhere on screen and slide around, and have each view know if it's being touched? For example, I put my finger down on the upper left and slide toward the lower right. The touchesBegan/Moved events are collected by the baseView. As I pass over itemView1, itemView2, and itemView3, control passes to them. If I lift my finger while over itemView2, it performs itemView2's touchesEnded method. If I lift my finger over none of the items, it performs baseView's touchesEnded. At the moment, if I touch down on baseView, touchesEnded always goes to baseView and the higher itemViews are ignored. Any ideas?
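
    Once a touch begins on baseView, UIKit keeps delivering the whole sequence to it; the item views are never consulted again. One sketch: let baseView own the touch (keeping the items' userInteractionEnabled off) and check on every move which item is under the finger. itemViews is a hypothetical array of the little views:

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            CGPoint p = [[touches anyObject] locationInView:self];
            for (UIView *item in itemViews) {
                if (CGRectContainsPoint(item.frame, p)) {
                    // finger is currently over this item; remember it so
                    // touchesEnded can dispatch to it instead of baseView
                }
            }
        }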

    Read the article

  • iPhone "touchesBegan" and "touchesMoved" message ... do not move to centre of touch

    - by Nippysaurus
    I am modifying the "MoveMe" example from the Apple web site. When I get the touchesMoved message, I move the centre of the target view to the centre of the touch. Is there a way that, when the touch starts (the touchesBegan message), I can remember the offset from the target view's centre and maintain that offset while dragging? Let me know if this is hard to understand and I will try to explain it a bit better.
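
    A common pattern (a sketch, assuming a CGPoint ivar named touchOffset in the draggable view): record the gap between the finger and the view's centre in touchesBegan, then re-apply it in touchesMoved instead of snapping the centre to the finger.

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            CGPoint p = [[touches anyObject] locationInView:self.superview];
            touchOffset = CGPointMake(self.center.x - p.x, self.center.y - p.y);
        }

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            CGPoint p = [[touches anyObject] locationInView:self.superview];
            // Preserve the original offset so the view doesn't jump to
            // centre itself under the finger.
            self.center = CGPointMake(p.x + touchOffset.x, p.y + touchOffset.y);
        }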

    Read the article

  • Not receiving touchesEnded/Moved/Cancelled after adding subView

    - by Sam
    The title more or less says it all. In response to a touchesBegan event, my UIViewController recolours itself and adds some subviews. It never receives the touchesEnded. I guess the added subviews are somehow intercepting the event. I tried calling resignFirstResponder on the subviews, to no avail. The code works fine when I don't add the child views, and the touch events are called as normal. Any ideas? Thanks
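
    resignFirstResponder has no effect on touch routing; hit testing does. If the added subviews are purely decorative, one sketch is to opt them out of hit testing when they are created (someFrame is a hypothetical placeholder):

        UIView *decoration = [[UIView alloc] initWithFrame:someFrame];
        decoration.userInteractionEnabled = NO; // touches pass through to the view underneath
        [self.view addSubview:decoration];
        [decoration release];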

    Read the article

  • Is it okay to programmatically create UITouch and UIEvent objects to emulate touch events?

    - by mystify
    I want to simulate some touches on my UI without using private API. So one simple way to do it is to call the -touchesBegan:withEvent:, -touchesMoved:withEvent:, -touchesEnded:withEvent: and -touchesCancelled:withEvent: methods on my custom controls directly. For that, I would have to create UITouch and UIEvent dummy objects with appropriate data inside. Is this fine with Apple? Or would they reject my app?

    Read the article

  • Isometric layer moving inside map

    - by gronzzz
    I've created an isometric map and am now trying to limit the layer's movement. The main idea is that I have left-bottom, right-bottom, left-top and right-top points that the camera cannot move beyond, so the player will never see the map's out-of-bounds area. But I cannot work out an algorithm for doing that. This is my layer scale/move code:

        - (void)touchBegan:(UITouch *)touch withEvent:(UIEvent *)event {
            _isTouchBegin = YES;
        }

        - (void)touchMoved:(UITouch *)touch withEvent:(UIEvent *)event {
            NSArray *allTouches = [[event allTouches] allObjects];
            UITouch *touchOne = [allTouches objectAtIndex:0];
            CGPoint touchLocationOne = [touchOne locationInView:[touchOne view]];
            CGPoint previousLocationOne = [touchOne previousLocationInView:[touchOne view]];

            // Scaling
            if ([allTouches count] == 2) {
                _isDragging = NO;
                UITouch *touchTwo = [allTouches objectAtIndex:1];
                CGPoint touchLocationTwo = [touchTwo locationInView:[touchTwo view]];
                CGPoint previousLocationTwo = [touchTwo previousLocationInView:[touchTwo view]];

                CGFloat currentDistance = sqrt(pow(touchLocationOne.x - touchLocationTwo.x, 2.0f) +
                                               pow(touchLocationOne.y - touchLocationTwo.y, 2.0f));
                CGFloat previousDistance = sqrt(pow(previousLocationOne.x - previousLocationTwo.x, 2.0f) +
                                                pow(previousLocationOne.y - previousLocationTwo.y, 2.0f));
                CGFloat distanceDelta = currentDistance - previousDistance;
                CGPoint pinchCenter = ccpMidpoint(touchLocationOne, touchLocationTwo);
                pinchCenter = [self convertToNodeSpace:pinchCenter];
                CGFloat predictionScale = self.scale + (distanceDelta * PINCH_ZOOM_MULTIPLIER);
                if ([self predictionScaleInBounds:predictionScale]) {
                    [self scale:predictionScale scaleCenter:pinchCenter];
                }
            } else {
                // Dragging
                _isDragging = YES;
                CGPoint previous = [[CCDirector sharedDirector] convertToGL:previousLocationOne];
                CGPoint current = [[CCDirector sharedDirector] convertToGL:touchLocationOne];
                CGPoint delta = ccpSub(current, previous);
                self.position = ccpAdd(self.position, delta);
            }
        }

        - (void)touchEnded:(UITouch *)touch withEvent:(UIEvent *)event {
            _isDragging = NO;
            _isTouchBegin = NO;
            // Check if I need to bounce
            _touchLoc = [touch locationInNode:self];
        }

        #pragma mark - Update

        - (void)update:(CCTime)delta {
            CGPoint position = self.position;
            float scale = self.scale;
            static float friction = 0.92f; //0.96f;

            if (_isDragging && !_isScaleBounce) {
                _velocity = ccp((position.x - _lastPos.x) / 2, (position.y - _lastPos.y) / 2);
                _lastPos = position;
            } else {
                _velocity = ccp(_velocity.x * friction, _velocity.y * friction);
                position = ccpAdd(position, _velocity);
                self.position = position;
            }

            if (_isScaleBounce && !_isTouchBegin) {
                float min = fabsf(self.scale - MIN_SCALE);
                float max = fabsf(self.scale - MAX_SCALE);
                int dif = max > min ? 1 : -1;
                if ((scale > MAX_SCALE - SCALE_BOUNCE_AREA) ||
                    (scale < MIN_SCALE + SCALE_BOUNCE_AREA)) {
                    CGFloat newScale = scale + dif * (delta * friction);
                    [self scale:newScale scaleCenter:_touchLoc];
                } else {
                    _isScaleBounce = NO;
                }
            }
        }
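
    One simple shape for the missing algorithm (a sketch, an assumption rather than a full solution): after each drag or zoom, clamp the layer's position between precomputed limits derived from the four corner points. minPos and maxPos are hypothetical and must be recomputed whenever the scale changes, since the visible extent of the map depends on the zoom.

        - (void)clampPosition {
            CGPoint p = self.position;
            // Keep the layer inside the allowed window so the player
            // never sees past the map edges.
            p.x = MAX(minPos.x, MIN(maxPos.x, p.x));
            p.y = MAX(minPos.y, MIN(maxPos.y, p.y));
            self.position = p;
        }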

    Read the article

  • iPhone smooth move and pinch of UIImageView

    - by Jacob
    I have an image view that I want to be able to move around and pinch to stretch. It's all working, but it's kind of jumpy when I start to do any pinch movements: the position jumps back and forth between the two fingers.

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            startLocation = [[touches anyObject] locationInView:mouth_handle];
            if ([touches count] == 2) {
                NSArray *twoTouches = [touches allObjects];
                UITouch *first = [twoTouches objectAtIndex:0];
                UITouch *second = [twoTouches objectAtIndex:1];
                initialDistance = distanceBetweenPoints([first locationInView:mouth_handle],
                                                        [second locationInView:mouth_handle]);
            }
        }

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            CGPoint pt = [[touches anyObject] locationInView:mouth_handle];
            CGRect frame = [mouth_handle frame];
            frame.origin.x += pt.x - startLocation.x;
            frame.origin.y += pt.y - startLocation.y;
            frame.origin.x = (frame.origin.x < 58) ? 58 : frame.origin.x;
            frame.origin.x = (frame.origin.x > (260 - mouth_handle.frame.size.width)) ? (260 - mouth_handle.frame.size.width) : frame.origin.x;
            frame.origin.y = (frame.origin.y < 300) ? 300 : frame.origin.y;
            frame.origin.y = (frame.origin.y > 377) ? 377 : frame.origin.y;
            // Note: these two conditions can never be true as written,
            // since a value cannot be both > 2 and < -2.
            if (frame.origin.x - prevDistanceX > 2 && frame.origin.x - prevDistanceX < -2)
                frame.origin.x = prevDistanceX;
            if (frame.origin.y - prevDistanceY > 2 && frame.origin.y - prevDistanceY < -2)
                frame.origin.y = prevDistanceY;
            prevDistanceX = frame.origin.x;
            prevDistanceY = frame.origin.y;

            CGFloat handleWidth = mouth_handle.frame.size.width;
            if ([touches count] == 2) {
                NSArray *twoTouches = [touches allObjects];
                UITouch *first = [twoTouches objectAtIndex:0];
                UITouch *second = [twoTouches objectAtIndex:1];
                CGFloat currentDistance = distanceBetweenPoints([first locationInView:mouth_handle],
                                                                [second locationInView:mouth_handle]);
                handleWidth = mouth_handle.frame.size.width + (currentDistance - initialDistance);
                handleWidth = (handleWidth < 60) ? 60 : handleWidth;
                handleWidth = (handleWidth > 150) ? 150 : handleWidth;
                if (initialDistance == 0) {
                    initialDistance = currentDistance;
                }
                initialDistance = currentDistance;
            }
            mouth_handle.frame = CGRectMake(frame.origin.x, frame.origin.y, handleWidth, 15);
        }

    Any thoughts on how to make this smoother?
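
    Much of the jumpiness likely comes from [touches anyObject] returning either finger nondeterministically during a pinch. One common remedy (a sketch, not a drop-in fix for the code above): drive the position from the midpoint of the two touches, which is the same whichever order the fingers arrive in.

        if ([touches count] == 2) {
            NSArray *two = [touches allObjects];
            CGPoint a = [[two objectAtIndex:0] locationInView:self.view];
            CGPoint b = [[two objectAtIndex:1] locationInView:self.view];
            // The midpoint is symmetric in the two fingers, so it does not
            // jump when anyObject would have flipped between them.
            CGPoint mid = CGPointMake((a.x + b.x) / 2.0, (a.y + b.y) / 2.0);
            // ... position the view relative to 'mid' ...
        }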

    Read the article
