Search Results

Search found 558 results on 23 pages for 'touches'.

Page 1/23 | 1 2 3 4 5 6 7 8 9 10 11 12  | Next Page >

  • Querying current number of touches on screen without using events on iPhone

    - by nikhil
    I have an application that starts playing a sound when the user touches the UIView and changes to different tones as the user slides a finger across the screen. The sound stops when the finger is lifted. I am using the touchesBegan, touchesMoved and touchesEnded events for this. My problem is that touchesEnded (and/or touchesCancelled) is sometimes not fired, so the sound keeps playing even after the finger is lifted from the screen. As a workaround I would like to implement a timer that checks the number of touches currently on the screen and, if it is zero, stops the audio player if it is playing. I have been searching for some code that could get me the number of touches, something like UITouch *touches = [self getAllTouchesonScreen]; (One possible way to keep that count yourself is sketched below.)

    Read the article
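
    There is no public UIKit call that returns all touches currently on the screen, so a common workaround is to keep the count yourself. A minimal sketch, assuming a custom UIView subclass (ToneView) and an AVAudioPlayer ivar named player; both names are assumptions, not from the question:

        #import <UIKit/UIKit.h>
        #import <AVFoundation/AVFoundation.h>

        @interface ToneView : UIView {
            NSUInteger activeTouches;   // number of fingers currently down
            AVAudioPlayer *player;      // assumed to be created elsewhere
        }
        @end

        @implementation ToneView

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            activeTouches += [touches count];
        }

        - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            activeTouches -= MIN(activeTouches, [touches count]);
        }

        - (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
            activeTouches -= MIN(activeTouches, [touches count]);
        }

        // Safety net, called by an NSTimer scheduled elsewhere (e.g. every 0.25 s).
        - (void)checkTouches:(NSTimer *)timer {
            if (activeTouches == 0 && [player isPlaying]) {
                [player stop];
            }
        }

        @end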

  • Navigation view system with webview problem with touches!

    - by Gonçalo Falcão
    Hello, I have searched everywhere and I can't figure this out. I have a tab bar controller with 5 navigation controllers. In one of them I have a view with a table view inside, and when I tap an item I push a new view whose hierarchy is view > webview > view. I created that second (transparent) view because I need to handle a single tap to hide my toolbar and navigation bar, and the web view was eating all the touches. So I put that view on top and implemented this in the view controller:

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [touches anyObject];
            if (touch.tapCount == 2) {
                [NSObject cancelPreviousPerformRequestsWithTarget:self];
            }
            [[wv.subviews objectAtIndex:0] touchesBegan:touches withEvent:event];
            [super touchesBegan:touches withEvent:event];
        }

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            [[wv.subviews objectAtIndex:0] touchesMoved:touches withEvent:event];
            [super touchesMoved:touches withEvent:event];
        }

        - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [touches anyObject];
            if (touch.tapCount == 1) {
                [self performSelector:@selector(hideBars) withObject:nil afterDelay:0.3];
            }
            [[wv.subviews objectAtIndex:0] touchesEnded:touches withEvent:event];
            [super touchesEnded:touches withEvent:event];
        }

        - (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
            [[wv.subviews objectAtIndex:0] touchesCancelled:touches withEvent:event];
            [super touchesCancelled:touches withEvent:event];
        }

    wv is my UIWebView IBOutlet. Now I can get the touches in my controller and send them to my web view, so I thought everything was working: I am able to scroll. But when the page has links I am not able to tap them, and the web view is detecting the links (I have tested that). Is there any other way to implement this so the links still receive touches, or should I change this workaround for hiding the toolbars so I keep the full functionality of the web view? Thanks for the help in advance. (One possible alternative is sketched below.)

    Read the article
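
    One commonly suggested alternative, sketched here as an assumption (iOS 3.2 or later, the wv outlet and hideBars method from the question), is to drop the transparent overlay entirely and attach a tap gesture recognizer to the web view. Because the recognizer only observes the touches, link taps keep working:

        // The controller is assumed to declare <UIGestureRecognizerDelegate>.
        - (void)viewDidLoad {
            [super viewDidLoad];
            UITapGestureRecognizer *tap =
                [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(toggleBars:)];
            tap.numberOfTapsRequired = 1;
            tap.cancelsTouchesInView = NO;   // do not swallow the touch, so links still work
            tap.delegate = self;             // allow simultaneous recognition below
            [wv addGestureRecognizer:tap];
            [tap release];
        }

        // Let our tap run alongside the web view's own recognizers.
        - (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
            shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)other {
            return YES;
        }

        - (void)toggleBars:(UITapGestureRecognizer *)recognizer {
            [self performSelector:@selector(hideBars) withObject:nil afterDelay:0.3];
        }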

  • Handling touches during animation on iPhone

    - by SalvoMaltese
    I have a view with multiple controls inside (a picker, a switch, a slider...). I use an animation to move this view; it appears from the bottom and moves up until it disappears off the top. I can't get the inside controls to respond to touches while the view is moving. How can I catch those touches? (One possible approach is sketched below.)

    Read the article
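
    By default, a view under a Core Animation is already at its destination as far as hit-testing is concerned, and UIView animations disable interaction unless told otherwise. A sketch of the two pieces usually needed (movingView, the 8-second duration and the superview override are placeholders, not details from the question):

        // 1. Ask UIKit to keep delivering touches while the animation runs (iOS 4+ block API).
        - (void)slidePanelOut {
            [UIView animateWithDuration:8.0
                                  delay:0.0
                                options:UIViewAnimationOptionAllowUserInteraction
                             animations:^{
                                 movingView.center = CGPointMake(movingView.center.x,
                                                                 -movingView.bounds.size.height / 2);
                             }
                             completion:nil];
        }

        // 2. In movingView's superview, hit-test against where the view is drawn *right now*
        //    (its presentation layer), because the model frame already holds the end value.
        - (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
            CALayer *presentation = (CALayer *)[movingView.layer presentationLayer];
            if (presentation != nil && CGRectContainsPoint([presentation frame], point)) {
                // Returning the panel itself is a simplification; to reach its sub-controls,
                // convert the point and hit-test the panel's subviews the same way.
                return movingView;
            }
            return [super hitTest:point withEvent:event];
        }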

  • Unittest test case only touches the file name

    - by Chen OT
    I was told that unit tests are fast, and that tests which touch the DB, go across the network, or touch the file system are not unit tests. In one of my test cases, the input is the set of file names (about 300-400 of them) under a specific folder. Although this input comes from the file system, the execution time of the test is very fast. Should I move this test, which is fast but touches the file system, to a higher level of testing?

    Read the article

  • Get touches from UIScrollView

    - by Peter Lapisu
    Basically I want to subclass UIScrollView and handle the touches myself; however, the touch methods don't get called. (I searched the web for a solution and found people suggesting to override the hit test, which I did, but with no result.)

    .h

        @interface XScroller : UIScrollView
        @end

    .m

        - (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
            UIView *result = nil;
            for (UIView *child in self.subviews) {
                if ([child pointInside:point withEvent:event]) {
                    if ((result = [child hitTest:point withEvent:event]) != nil) {
                        break;
                    }
                }
            }
            return result;
        }

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            NSLog(@"BEGAN");
            [super touchesBegan:touches withEvent:event];
        }

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            NSLog(@"MOVED");
            [super touchesMoved:touches withEvent:event];
        }

        - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            NSLog(@"ENDED");
            [super touchesEnded:touches withEvent:event];
        }

        - (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
            NSLog(@"CANCELED");
            [super touchesCancelled:touches withEvent:event];
        }

    None of the touches* methods gets called, but scrolling works fine. (Two things worth checking are sketched below.)

    Read the article
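
    Two things are worth checking here; both are guesses from the snippet alone. First, this hitTest: returns nil whenever the point is not inside a subview, so the scroll view itself can never become the hit-test view and its touches* overrides never run; falling back to the default resolution avoids that. Second, pointInside:withEvent: expects the point in the child's own coordinate space, so it needs a conversion. A sketch:

        - (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
            for (UIView *child in self.subviews) {
                CGPoint childPoint = [self convertPoint:point toView:child];
                if ([child pointInside:childPoint withEvent:event]) {
                    UIView *result = [child hitTest:childPoint withEvent:event];
                    if (result != nil) {
                        return result;
                    }
                }
            }
            // Fall back to the default behaviour instead of returning nil, otherwise
            // touches that miss every subview are silently dropped.
            return [super hitTest:point withEvent:event];
        }

    Note that on iOS 3.2 and later UIScrollView also drives scrolling through its own gesture handling, so even with a correct hit test the raw touch methods can be bypassed while a scroll is in progress; UIScrollView's touchesShouldBegin:withEvent:inContentView: is the supported hook for that case.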

  • iPhone + OpenGL + Touches: FPS drop

    - by Anton
    Hey there. Recently I ran into a very strange issue: touching the screen of the iPhone and moving a finger around can eat up to 50% of my FPS. Yes, I checked my code for possible bottlenecks; that is not the issue. The last resort I tried before writing this post was commenting out all the touch-processing code and looking at the FPS then. Results: with no touches, 58-60 FPS; touching and moving the finger, 35-40 FPS instantly. The rendering is done in a separate thread, so no main run-loop events should collide with it. However, it is crucial for me (and the game I am developing) to resolve this issue, because such an FPS drop is really noticeable. Thank you for your help in advance. UPDATE: it seems that setting the rendering thread's priority to a higher value helps a bit (a sketch of that follows below).

    Read the article
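
    Following up on the author's own UPDATE, a minimal sketch of bumping the render thread's priority (the renderLoop and drawFrame names, the rendering flag and the 0.75 value are placeholders; the default priority is 0.5):

        - (void)renderLoop {
            // Entry point of the dedicated rendering thread.
            [NSThread setThreadPriority:0.75];   // compete a little harder with touch delivery
            NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
            while (rendering) {                  // 'rendering' is an assumed BOOL ivar
                [self drawFrame];                // assumed per-frame render call
            }
            [pool release];
        }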

  • A view "transparent" to touches in some areas

    - by Mike
    I have a transparent view that covers the whole screen. I am using this view to group objects, because I need to move them together around a specific anchor point. Let's call this view transparentView. At some point transparentView contains two subviews: two vertical bars full of icons, one on the left and one on the right of the screen. I need these bars and their icons to respond to touches, so I have to set transparentView's userInteractionEnabled to YES. The area between the two vertical bars is totally transparent. transparentView sits on top of other views, and I need those other views to respond to taps, but the transparent area of transparentView is intercepting the taps and not letting them through to the views below. This transparent view is a UIImageView-based class. I have tried to forward the taps in that class, using

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            [self.nextResponder touchesBegan:touches withEvent:event];
        }

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            [self.nextResponder touchesMoved:touches withEvent:event];
        }

        - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            [self.nextResponder touchesEnded:touches withEvent:event];
        }

        - (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
            [self touchesEnded:touches withEvent:event];
        }

    but this is not working. How can I do that? Thanks. (One possible approach is sketched below.)

    Read the article
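
    Forwarding from the responder methods rarely achieves pass-through, because by the time touchesBegan: runs, hit-testing has already decided this view owns the touch. The usual fix is to opt out of hit-testing for the transparent region. A sketch in the UIImageView subclass, assuming the two bars are reachable as ivars (leftBar and rightBar are placeholder names):

        - (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
            // Claim the touch only when it lands on one of the icon bars; everything
            // else is declared a miss, so hit-testing falls through to the views below.
            return CGRectContainsPoint(leftBar.frame, point) ||
                   CGRectContainsPoint(rightBar.frame, point);
        }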

  • MySQL and MariaDB: alert over a "tragically comical" security flaw; 50% of servers reportedly affected

    MySQL and MariaDB: alert over a "tragically comical" security flaw; 50% of servers reportedly affected. MySQL and its fork MariaDB suffer from a serious vulnerability to a brute-force attack that is remarkably easy to exploit. Within a few seconds, an attacker can bypass authentication on the database servers, provided they have a valid user name ("root" is generally always present and active, with maximum privileges). All the attacker has to do is...

    Read the article

  • iPhone: Tracking/Identifying individual touches

    - by FlorianZ
    I have a quick question regarding tracking touches on the iPhone, and I can't seem to come to a conclusion on this, so any suggestions/ideas are greatly appreciated. I want to be able to track and identify touches on the iPhone: basically every touch has a starting position and a current/moved position. Touches are stored in a std::vector, and they shall be removed from the container once they end. Their position shall be updated once they move, but I still want to keep track of where they initially started (gesture recognition). I am getting the touches from [event allTouches]; the thing is, the NSSet is unsorted and I can't seem to match the touches already stored in the std::vector with the touches in the NSSet (so that I know which ones have ended and should be removed, which have moved, etc.). Here is my code, which works perfectly with only one finger on the touch screen, of course, but with more than one I get unpredictable results...

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            [self handleTouches:[event allTouches]];
        }

        - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            [self handleTouches:[event allTouches]];
        }

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            [self handleTouches:[event allTouches]];
        }

        - (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
            [self handleTouches:[event allTouches]];
        }

        - (void)handleTouches:(NSSet *)allTouches {
            for (int i = 0; i < (int)[allTouches count]; ++i) {
                UITouch *touch = [[allTouches allObjects] objectAtIndex:i];
                NSTimeInterval timestamp = [touch timestamp];
                CGPoint currentLocation = [touch locationInView:self];
                CGPoint previousLocation = [touch previousLocationInView:self];

                if ([touch phase] == UITouchPhaseBegan) {
                    Finger finger;
                    finger.start.x = currentLocation.x;
                    finger.start.y = currentLocation.y;
                    finger.end = finger.start;
                    finger.hasMoved = false;
                    finger.hasEnded = false;
                    touchScreen->AddFinger(finger);
                } else if ([touch phase] == UITouchPhaseEnded || [touch phase] == UITouchPhaseCancelled) {
                    Finger& finger = touchScreen->GetFingerHandle(i);
                    finger.hasEnded = true;
                } else if ([touch phase] == UITouchPhaseMoved) {
                    Finger& finger = touchScreen->GetFingerHandle(i);
                    finger.end.x = currentLocation.x;
                    finger.end.y = currentLocation.y;
                    finger.hasMoved = true;
                }
            }
            touchScreen->RemoveEnded();
        }

    Thanks! (One possible approach is sketched below.)

    Read the article
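
    The piece that makes this tractable is that UIKit reuses the same UITouch object for the entire lifetime of a finger, so the pointer itself is a stable identifier. A sketch of one possible restructuring (Objective-C++ to match the question's use of std:: containers; the Finger struct is the one from the question, the map is new):

        #include <map>

        static std::map<UITouch *, Finger> fingers;   // key: the touch's identity

        - (void)handleTouches:(NSSet *)touches {
            // Call this with the per-event 'touches' set from each responder method;
            // that set contains exactly the touches whose phase just changed.
            for (UITouch *touch in touches) {
                CGPoint current = [touch locationInView:self];
                switch ([touch phase]) {
                    case UITouchPhaseBegan: {
                        Finger finger;
                        finger.start.x = current.x;  finger.start.y = current.y;
                        finger.end = finger.start;
                        finger.hasMoved = false;
                        finger.hasEnded = false;
                        fingers[touch] = finger;
                        break;
                    }
                    case UITouchPhaseMoved: {
                        Finger &finger = fingers[touch];
                        finger.end.x = current.x;  finger.end.y = current.y;
                        finger.hasMoved = true;
                        break;
                    }
                    case UITouchPhaseEnded:
                    case UITouchPhaseCancelled:
                        fingers.erase(touch);   // or mark hasEnded first, then sweep
                        break;
                    default:
                        break;
                }
            }
        }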

  • Objective - Gestures while finger touches screen

    - by marcg11
    I'm creating a space game with cocos2d and Objective-C. In the bottom left I have 2 arrows to move the sprite left or right. I also implemented a swipe gesture to change weapons; however, it only fires when I'm not touching the screen. I would like the player to be able to change weapons while moving the sprite, without having to lift the finger from the arrows and stop moving. Is there any way to detect gestures while a finger is already pressed on a button on the screen? (A possible setup is sketched below.)

    Read the article
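
    A sketch of one possible setup (plain UIKit gesture recognizers, iOS 3.2+; the openGLView accessor is from cocos2d 1.x and may differ in other versions). The two ingredients are multi-touch on the GL view and simultaneous recognition, so a swipe from a second finger can be recognized while the first finger keeps holding an arrow:

        // The class is assumed to declare <UIGestureRecognizerDelegate>.
        - (void)setupWeaponSwipe {
            UIView *glView = [[CCDirector sharedDirector] openGLView];
            glView.multipleTouchEnabled = YES;

            UISwipeGestureRecognizer *swipe =
                [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(changeWeapon:)];
            swipe.direction = UISwipeGestureRecognizerDirectionUp;
            swipe.cancelsTouchesInView = NO;   // keep feeding the held-down arrow its touches
            swipe.delegate = self;
            [glView addGestureRecognizer:swipe];
            [swipe release];
        }

        // Don't let the swipe block (or be blocked by) the other touches.
        - (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
            shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)other {
            return YES;
        }

        - (void)changeWeapon:(UISwipeGestureRecognizer *)recognizer {
            // switch to the next weapon here
        }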

  • iPhone sdk Cocoa Touch - Pass touches down from parent UIView to child UIScrollview

    - by Joe
    I have a UIView in my xib in IB, and inside it a UIScrollView that is a small 80x80 square, dynamically loaded with 8 or so 80x80 thumbnail images. The UIScrollView does not clip its subviews, so the images extend out either side and you can swipe left and right to scroll a chosen image into the centre; paging is on so they snap to each image. I have researched and found this is the best, and possibly only, way to do this. The UIScrollView sits in a 'container' UIView for one reason: the container is there to receive the touches/swipes and pass them down to its child UIScrollView, because otherwise all touches would have to start inside the small 80x80 UIScrollView area, and I want them to work anywhere along the row of images. I have seen some sample code somewhere for doing this but just cannot implement it. Treat me as a noob, starting from beginning to end: how should the UIView and UIScrollView be set up in IB to allow the touches to be passed, and what code should I put where? The UIView is set up as scroll_container and the child UIScrollView is char_scroll. At the moment I have everything working except the touches being passed from the parent to the child; touches currently have to start inside the UIScrollView (the tiny 80x80 box in the centre), whereas I want to be able to swipe left or right anywhere in the long 480x80 horizontal parent UIView and still have it scroll the child UIScrollView. Hope you can help and understand what I mean! (The usual trick is sketched below.)

    Read the article
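
    A widely used trick for paging scroll views that are smaller than their touch area is to make the container redirect hit-testing to the scroll view. A sketch, assuming the container's class is changed in IB to a small UIView subclass (the class name TouchForwardingView is a placeholder) with its outlet wired to char_scroll:

        #import <UIKit/UIKit.h>

        @interface TouchForwardingView : UIView {
            UIScrollView *scrollView;
        }
        @property (nonatomic, retain) IBOutlet UIScrollView *scrollView;   // connect to char_scroll
        @end

        @implementation TouchForwardingView
        @synthesize scrollView;

        - (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
            // Any touch that lands in the 480x80 container behaves as if it
            // started inside the 80x80 paging scroll view.
            if ([self pointInside:point withEvent:event]) {
                return scrollView;
            }
            return [super hitTest:point withEvent:event];
        }

        - (void)dealloc {
            [scrollView release];
            [super dealloc];
        }
        @end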

  • View doesn't respond to touches after shake

    - by Mike Rychev
    In my app I've implemented a shake event that shows a UIImageView. When the UIImageView is shown, I hide the nav bar with:

        [self.navigationController setNavigationBarHidden:YES animated:NO];

    After that I want to bring it back when the user touches the screen:

        - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            [[self navigationController] setNavigationBarHidden:NO animated:YES];
        }

    But it doesn't work; it is as if my view doesn't respond to touches. Thanks in advance!

    Read the article

  • iPhone - Track three touches

    - by Striker
    Suppose you have three points of contact on the iPhone screen and one of those touches moves. The touchesMoved method will be invoked, and [[event touchesForView:self] count] will be equal to 3 because there are three touches for the event. But how can you distinguish between the touches? For example, how do you find out whether it was the first, second, or third touch that moved? Thanks. (One way is sketched below.)

    Read the article
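
    Two facts make this workable: the set handed to touchesMoved: contains only the touches that actually moved, and UIKit reuses the same UITouch object for a finger's whole lifetime, so the pointer can serve as an identity. A sketch, assuming three UITouch* ivars in the view (touch1/touch2/touch3 are placeholder names):

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            for (UITouch *touch in touches) {
                if (touch1 == nil)      { touch1 = touch; }
                else if (touch2 == nil) { touch2 = touch; }
                else if (touch3 == nil) { touch3 = touch; }
            }
        }

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            for (UITouch *touch in touches) {          // only the moved touches are in here
                if (touch == touch1)      { NSLog(@"first finger moved"); }
                else if (touch == touch2) { NSLog(@"second finger moved"); }
                else if (touch == touch3) { NSLog(@"third finger moved"); }
            }
        }

        - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            // touchesCancelled: should do the same clean-up.
            for (UITouch *touch in touches) {          // forget fingers as they lift
                if (touch == touch1)      { touch1 = nil; }
                else if (touch == touch2) { touch2 = nil; }
                else if (touch == touch3) { touch3 = nil; }
            }
        }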

  • UIScrollView eating touches from its parent

    - by Jon Hull
    I have nested scroll views (or rather, a subclass of UIScrollView inside an actual scroll view). I set the size of the inner view to its contentSize and set scrollEnabled = NO, because I only want the outer view to scroll. But the inner view occasionally eats touches and keeps the outer view from scrolling when it should. Is there something else I need to set to keep it from stealing the scrolling touches, while still allowing user interaction (e.g. editing a text view)?

    Read the article

  • Is the test, which touches the filenames under directory, a kind of unittest? [on hold]

    - by Chen OT
    I was told that unit tests are fast, and that tests which touch the DB, go across the network, or touch the file system are not unit tests. In one of my test cases, the input is the set of file names (about 300-400 of them) under a specific folder. Although this input comes from the file system, the execution time of the test is very fast. Should I move this test, which is fast but touches the file system, to a higher level of testing?

    Read the article

  • iPhone multitouch - Some touches dispatch touchesBegan: but not touchesMoved:

    - by zkarcher
    I'm developing a multitouch application. One touch is expected to move, and I need to track its position. For all other touches, I need to track their beginnings and endings, but their movement is less critical. Sometimes, when 3 or more touches are active, my UIView does not receive touchesMoved: events for the moving touch. The problem is intermittent, but can always be reproduced after a few attempts: touch the screen with 2 fingers, then touch the screen with another finger and move that finger around. The moving finger always dispatches touchesBegan: and touchesEnded:, but sometimes does not dispatch any touchesMoved: events. Whenever that happens, I can force a touchesMoved: by moving one of the other touches; this seems to make every touch recheck its position, and I then receive a touchesMoved: event. However, this is clumsy. The bug is reproducible on both the iPhone 2G and 3GS models. My question is: how do I ensure that my moving touch dispatches touchesMoved: events? Does anyone have any experience with this issue? I've spent a few fruitless days searching the web for answers. I found a post describing how to sync touch events with the VBL: http://www.71squared.com/2009/04/maingameloop-changes/ . However, this has not solved the problem. I really don't know how to proceed. Any help is appreciated!

    Read the article

  • Intercept UITableView scroll touches

    - by Jonesy
    Is it possible to control when the UITableView scrolls in my own code? I am trying to get behaviour where a vertical swipe scrolls and a horizontal swipe gets passed through to my code (of which there are many examples), BUT I want a DIAGONAL swipe to do nothing, i.e. the UITableView should not even begin scrolling. I tried catching it in

        - (void)scrollViewWillBeginDragging:(UIScrollView *)scrollView

    but scrollView.contentOffset.x is always 0, so I cannot detect a horizontal movement. I also tried subclassing UITableView and implementing

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
        - (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event

    etc., but the UITableView (and I guess its parent UIScrollView) starts to scroll before the touches are notified. To reiterate, I want the UITableView scrolling to be locked if a diagonal swipe is made, but to scroll vertically normally. (This behaviour can be seen in Tweetie (Twitter) for the iPhone.) Thanks for any help!

    Read the article

  • Process touches behind the UINavigationBar

    - by Reed Olsen
    In my application, I'm displaying a fullscreen image in a 320 x 480 frame. After I display the image, I fade the navigation bar out to allow the user to see the whole picture. When the user taps in the area where the navigation bar was, I would like to bring the navigation bar back. This is very similar to what happens in the iPhone Photos app. Unfortunately, after I've hidden the UINavigationBar, I can't process touches on the part of the screen where the navigation bar once was. I believe this is because the origin of the parent view is right below the navigation bar. How can I process touches in this area to bring the nav bar back? (One possible approach is sketched below.)

    Read the article
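
    A sketch of one approach commonly suggested for Photos-style viewers of that era (whether it fits this exact view hierarchy is an assumption, and it is often combined with a translucent navigation bar; the 44-point threshold is a placeholder for the bar's height). wantsFullScreenLayout asks UIKit to lay the controller's view out over the full screen, including the strip the navigation bar normally occupies, so once the bar is hidden, taps in that strip land in the view and reach the controller:

        - (void)viewDidLoad {
            [super viewDidLoad];
            self.wantsFullScreenLayout = YES;   // view underlaps the navigation bar area
        }

        - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            CGPoint where = [[touches anyObject] locationInView:self.view];
            if (where.y <= 44.0) {   // roughly where the navigation bar used to be
                [self.navigationController setNavigationBarHidden:NO animated:YES];
            }
            [super touchesEnded:touches withEvent:event];
        }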

  • Problems with CGPoint in touches event

    - by Jason
    I'm having some problems storing variables from my touch events. The warning I get when I build is that coord and icoord are unused, yet I used them in the viewDidLoad implementation. Is there a reason why this does not work? Any suggestions?

        -(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [[event allTouches] anyObject];
            CGPoint icoord = [touch locationInView:touch.view];
        }

        -(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [[event allTouches] anyObject];
            CGPoint coord = [touch locationInView:touch.view];
        }

        - (void)viewDidLoad {
            if (coord.x > icoord.x) {
                player.center = CGPointMake(player.center.x + 5, player.center.y);
            }
        }

    Thanks. (A possible restructuring is sketched below.)

    Read the article
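
    A sketch of one way to restructure this (using the player view from the question; the ivar placement is an assumption). The two CGPoints are local variables, so they disappear as soon as each touches* method returns, and viewDidLoad runs only once, when the view loads, long before any touch happens. Promoting the points to instance variables and doing the comparison inside touchesMoved: keeps the values alive and runs the logic at the right time:

        // In the @interface's ivar block:
        //     CGPoint icoord;   // where the touch started
        //     CGPoint coord;    // where the touch is now

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            icoord = [[touches anyObject] locationInView:self.view];
        }

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            coord = [[touches anyObject] locationInView:self.view];
            if (coord.x > icoord.x) {
                player.center = CGPointMake(player.center.x + 5, player.center.y);
            } else if (coord.x < icoord.x) {
                player.center = CGPointMake(player.center.x - 5, player.center.y);
            }
        }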

  • Pass through touches to portion of UIWebView

    - by cannyboy
    I have a UIWebView. Using something like this: http://blog.evandavey.com/2009/02/how-to-make-uiwebview-transparent.html I have made the UIWebView transparent. I now need to be able to pass touches through the web view to the view below, but only on a rectangular portion of the web view. So, imagine the web view is a square, and it has a square area in the centre which must pass touches through to the view below. Is there a way to do this? (One possible approach is sketched below.)

    Read the article
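
    One possible approach, sketched under the assumption that the web view can be wrapped in a container (the class name PunchThroughView and the holeRect property are placeholders): if the container declares the central rectangle a miss in pointInside:withEvent:, hit-testing skips the container and everything inside it, so touches there continue on to the views behind.

        #import <UIKit/UIKit.h>

        @interface PunchThroughView : UIView {
            CGRect holeRect;   // the central square that should ignore touches
        }
        @property (nonatomic, assign) CGRect holeRect;
        @end

        @implementation PunchThroughView
        @synthesize holeRect;

        - (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event {
            if (CGRectContainsPoint(holeRect, point)) {
                return NO;   // fall through to whatever is behind the web view
            }
            return [super pointInside:point withEvent:event];
        }
        @end

    Usage would be to make the transparent web view a subview of a PunchThroughView and set holeRect to the central square (in the container's coordinates).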

  • Help moving multiple images at once with touches

    - by daveMac
    Here is my problem: I am trying to move a puzzle piece around the screen and then connect it to another piece when they are in close proximity. I have achieved this, though the way I did it is perhaps a little odd. My problem is that once two pieces have connected, I can't figure out how to move them as one image instead of two separate entities. I would really appreciate any help or suggestions. Here is a sample of what I have been doing (one way to group connected pieces is sketched below):

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [touches anyObject];
            [self dispatchTouchEvent:[touch view] toPosition:[touch locationInView:self.view]];
        }

        - (void)dispatchTouchEvent:(UIView *)theView toPosition:(CGPoint)position {
            if (CGRectContainsPoint([picture frame], position)) {
                picture.center = position;
            }
            if (CGRectContainsPoint([picture2 frame], position)) {
                picture2.center = position;
            }
        }

    Read the article
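
    A sketch of one possible approach (the connectPiece:toPiece: method and the group view are new names, not from the question): once two pieces snap together, reparent them into a single container view and move the container from then on, so they travel as one unit.

        - (void)connectPiece:(UIImageView *)pieceA toPiece:(UIImageView *)pieceB {
            // Assumes both pieces are currently direct subviews of self.view with no transforms.
            CGRect unionFrame = CGRectUnion(pieceA.frame, pieceB.frame);
            UIView *group = [[UIView alloc] initWithFrame:unionFrame];

            // Keep each piece where it is on screen while changing its parent.
            pieceA.frame = CGRectOffset(pieceA.frame, -unionFrame.origin.x, -unionFrame.origin.y);
            pieceB.frame = CGRectOffset(pieceB.frame, -unionFrame.origin.x, -unionFrame.origin.y);
            [group addSubview:pieceA];
            [group addSubview:pieceB];

            [self.view addSubview:group];
            [group release];
        }

        // Dragging then tests and moves the group instead of the individual pictures:
        //     if (CGRectContainsPoint([group frame], position)) { group.center = position; }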

  • How to disable multiple touches on a ScrollView and UIImage

    - by Rob
    I have a scroll view that I load images into, which the user can touch to play a sound. However, the program gets confused when I press one image with one finger and then another image with a different finger: it thinks the same button was pressed again and plays the sound again (so two copies of the same sound play at once, even though a different sound button may have been pressed). I tried setting exclusiveTouch on each image view, but that didn't seem to work in this case for some reason. What am I missing, or is there a better way to do this? Here is some code. For creating the buttons:

        - (void)createButtons {
            CGRect myFrame = [self.outletScrollView bounds];
            CGFloat gapX, gapY, x, y;
            int columns = 3;
            int myIndex = 0;
            int viewWidth = myFrame.size.width;
            int buttonsCount = [g_AppsList count];
            float actualRows = (float) buttonsCount / columns;
            int rows = buttonsCount / columns;
            int buttonWidth = 100;
            int buttonHeight = 100;

            if (actualRows > rows) rows++;

            //set scrollview content size to hold all the glitter icons library
            gapX = (viewWidth - columns * buttonWidth) / (columns + 1);
            gapY = gapX;
            y = gapY;
            int contentHeight = (rows * (buttonHeight + gapY)) + gapY;
            [outletScrollView setContentSize: CGSizeMake(viewWidth, contentHeight)];

            UIImage* myImage;
            NSString* buttonName;

            //center all buttons to view
            int i = 1, j = 1;
            for (i; i <= rows; i++) {
                //calculate gap between buttons
                gapX = (viewWidth - (buttonWidth * columns)) / (columns + 1);
                if (i == rows) {
                    //this is the last row, recalculate gap and pitch
                    gapX = (viewWidth - (buttonWidth * buttonsCount)) / (buttonsCount + 1);
                    columns = buttonsCount;
                }//end else
                x = gapX;
                j = 1;
                for (j; j <= columns; j++) {
                    //get shape name
                    buttonName = [g_AppsList objectAtIndex: myIndex];
                    buttonName = [NSString stringWithFormat: @"%@.png", buttonName];
                    myImage = [UIImage imageNamed: buttonName];

                    TapDetectingImageView* imageView = [[TapDetectingImageView alloc] initWithImage: myImage];
                    [imageView setFrame: CGRectMake(x, y, buttonWidth, buttonHeight)];
                    [imageView setTag: myIndex];
                    [imageView setContentMode:UIViewContentModeScaleToFill];
                    [imageView setUserInteractionEnabled: YES];
                    [imageView setMultipleTouchEnabled: NO];
                    [imageView setExclusiveTouch: YES];
                    [imageView setDelegate: self];

                    //add button to current view
                    [outletScrollView addSubview: imageView];
                    [imageView release];

                    x = x + buttonWidth + gapX;
                    //increase button index
                    myIndex++;
                }//end for j
                //increase y
                y = y + buttonHeight + gapY;
                //decrease buttons count
                buttonsCount = buttonsCount - columns;
            }//end for i
        }

    And for playing the sounds:

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            //stop playing
            theAudio.stop;

            // cancel any pending handleSingleTap messages
            [NSObject cancelPreviousPerformRequestsWithTarget:self selector:@selector(handleSingleTap) object:nil];

            UITouch* touch = [[event allTouches] anyObject];
            NSString* filename = [g_AppsList objectAtIndex: [touch view].tag];
            NSString *path = [[NSBundle mainBundle] pathForResource: filename ofType:@"m4a"];
            theAudio = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:path] error:NULL];
            theAudio.delegate = self;
            [theAudio prepareToPlay];
            [theAudio setNumberOfLoops:-1];
            [theAudio setVolume: g_Volume];
            [theAudio play];
        }

        - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
            BOOL allTouchesEnded = ([touches count] == [[event touchesForView:self] count]);
            if (allTouchesEnded) {
                //stop playing
                theAudio.stop;
            }//end if
            //stop playing
            theAudio.stop;
        }

    Read the article

  • Problems with CGPoint/touches event

    - by Jason
    I'm having some problems storing variables from my touch events. The warning I get when I build is that coord and icoord are unused, yet I used them in the viewDidLoad implementation. Is there a reason why this does not work? Any suggestions?

        #import "iGameViewController.h"

        @implementation iGameViewController
        @synthesize player;

        -(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [[event allTouches] anyObject];
            CGPoint icoord = [touch locationInView:touch.view];
        }

        -(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [[event allTouches] anyObject];
            CGPoint coord = [touch locationInView:touch.view];
        }

        - (void)viewDidLoad {
            if (coord.x > icoord.x) {
                player.center = CGPointMake(player.center.x + 5, player.center.y);
            }
            if (coord.x < icoord.x) {
                player.center = CGPointMake(player.center.x - 5, player.center.y);
            }
            if (coord.y > icoord.y) {
                player.center = CGPointMake(player.center.x, player.center.y - 5);
            }
            if (coord.y < icoord.y) {
                player.center = CGPointMake(player.center.x, player.center.y + 5);
            }
        }

    Thanks.

    Read the article

  • Handling touches in UITableViewController

    - by subw
    I want to implement handling of an additional swipe gesture in my UITableViewController. However, it seems that in the case of table views the usual touch-handling methods of the controller, like -touchesBegan:withEvent:, are not called. How can I handle touches on a UITableView? (One common approach is sketched below.)

    Read the article
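
    On iOS 3.2 and later the usual answer is to skip the responder overrides, which the table view's own touch handling tends to swallow, and attach a gesture recognizer to the table view instead (the didSwipe: handler name is a placeholder):

        - (void)viewDidLoad {
            [super viewDidLoad];
            UISwipeGestureRecognizer *swipe =
                [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(didSwipe:)];
            swipe.direction = UISwipeGestureRecognizerDirectionLeft;
            [self.tableView addGestureRecognizer:swipe];
            [swipe release];
        }

        - (void)didSwipe:(UISwipeGestureRecognizer *)recognizer {
            CGPoint where = [recognizer locationInView:self.tableView];
            NSIndexPath *indexPath = [self.tableView indexPathForRowAtPoint:where];
            NSLog(@"swipe on row %@", indexPath);   // react to the swiped row here
        }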

  • Detecting Touches in an OpenGL rendered scene

    - by Icky
    Hey. I was wondering whether there is a way to detect a touch in an OpenGL rendered scene. What I have is a set of images which are rendered in my main view. Now if the user touches one of these images (or objects) I would like to know which one was touched, similar to the CGRectContainsPoint(frame, [touch locationInView:self.view]) approach. Is there an easy way to find out? If there is none, knowing that would also help. (A simple screen-space approach is sketched below.)

    Read the article
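
    For flat, screen-aligned images the simplest scheme (a sketch; spriteRects and spriteCount are assumed bookkeeping kept in sync with whatever the renderer draws) is to remember each image's screen-space rectangle and hit-test the touch against those rectangles, without asking OpenGL anything:

        - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
            CGPoint p = [[touches anyObject] locationInView:self.view];
            for (NSUInteger i = 0; i < spriteCount; i++) {
                if (CGRectContainsPoint(spriteRects[i], p)) {
                    NSLog(@"touched image %lu", (unsigned long)i);
                    break;
                }
            }
        }

        // For a genuinely 3D scene the same idea becomes "picking": unproject the touch
        // into a ray (gluUnProject-style) and intersect it with each object's bounding volume.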
