Search Results

Search found 4564 results on 183 pages for 'sencha touch 2'.

  • iPhone: Activate UISlider and set its value to the location of the current touch programmatically

    - by carloe
    Is it possible to set a UISlider as first responder and set its current value to the location of the current touch programmatically? The way my app is set up, I have a UIView container that takes up the whole screen. Inside the container I have another UIView offscreen at the bottom edge (I'll call this bottomBar). Inside the bottomBar there is a UISlider element. Right now, when the user swipes along the bottom edge of the screen, the bottomBar and the slider it contains slide up. What I am trying to achieve is to activate the UISlider and set the position of the slider (its value) to the position of the user's touch. Is this possible? Could someone please point me in the right direction?
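
    A minimal sketch of one approach, assuming the container view receives the touches, holds the slider in a self.slider property, and the slider's track spans roughly its full width (UISlider has no public API for seeding it with an in-flight touch, so this forwards the touch manually):

        - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
            UITouch *touch = [touches anyObject];
            CGPoint point = [touch locationInView:self.slider];
            // Map the touch's x-coordinate onto the slider's value range.
            CGFloat fraction = point.x / self.slider.bounds.size.width;
            fraction = MAX(0.0f, MIN(1.0f, fraction)); // clamp to the track
            CGFloat value = self.slider.minimumValue +
                fraction * (self.slider.maximumValue - self.slider.minimumValue);
            [self.slider setValue:value animated:YES];
            // Fire the slider's actions as if the user dragged it directly.
            [self.slider sendActionsForControlEvents:UIControlEventValueChanged];
        }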

  • How to get a continuous Touch Event?

    - by daliz
    My class extends View, and I need to get continuous touch events on it. If I use: public boolean onTouchEvent(MotionEvent me) { if (me.getAction() == MotionEvent.ACTION_DOWN) { myAction(); } return true; } ... the touch event is captured once. What if I need to get continuous touches without moving the finger? Please tell me I don't need to use threads or timers; my app is already heavy enough. Thanks.

  • Limit to area that receives touch events

    - by Typeoneerror
    Is there a bounding box on an application that receives touch events? I created a few sample round rect buttons and placed them in different places in my view. The ones in the center of the view receive touch events (and show the highlighted blue color), but if I place a button near the edges of the view, only parts of it are clickable in the simulator. Is this because of Apple's style guidelines? I placed a button exactly where a UITabNavigationItem would appear, and only the bottom half of it is clickable.

  • Prevent status bar from receiving touch events

    - by Typeoneerror
    Edit: after further testing, it appears that the parts of my buttons that are not clickable are where the status bar used to be. I'm hiding the status bar with: // -- Override point for customization after app launch [[UIApplication sharedApplication] setStatusBarHidden:YES]; But that area is still receiving touches. Any idea how to disable this? Original question: is there a bounding box on an application that receives touch events? I created a few sample round rect buttons and placed them in different places in my view. The ones in the center of the view receive touch events (and show the highlighted blue color), but if I place a button near the edges of the view, only parts of them are clickable in the simulator. Is this because of Apple's style guidelines? I placed a button exactly where a UITabNavigationItem would appear, and only the bottom half of it is clickable.
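
    A sketch of one thing worth checking (an assumption, not a confirmed diagnosis): when the status bar is hidden after the window and views are created, the main view can still be laid out for the smaller, status-bar-inset area, leaving a dead 20-point strip that never reaches your buttons. Resizing the window and root view to the full screen bounds right after hiding the bar avoids that; the names window and viewController here stand in for the app delegate's own ivars:

        [[UIApplication sharedApplication] setStatusBarHidden:YES];
        // Cover the whole screen, including the strip the status bar occupied.
        window.frame = [[UIScreen mainScreen] bounds];
        viewController.view.frame = [[UIScreen mainScreen] bounds];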

  • WPF Manipulation and Touch events not firing, but mouse events do?

    - by Smetad Anarkist
    I have a Samsung LD220Z multi-touch monitor, and when I'm experimenting with some WPF touch features, the touch events aren't firing. The mouse-down event does fire, however. Does this mean that I have to take into account that not all touch screens behave the same, and how do I get the touch inputs on my screen? I've tried the Windows 7 Touch Pack that Microsoft released, and the multi-touch features seem to work there. Any suggestion on how to proceed with this?

  • Touch Screen Product Catalog for Retail Store

    - by Patrick
    I am a UI/UX designer, and I would like to create a kiosk-type app that would be a product catalog (a helper/suggester) for customers in a retail store, using a touch-screen monitor (and computer). Something as simple as this: http://www.youtube.com/watch?v=aoH0u6YTTK4 This is what I would like it to do. 1st screen (main menu): pick a main category (for example: Dog, Cat, Small Animal). 2nd screen: pick a sub-category of the main category (for example: Puppy, Adult, Senior for Dog). 3rd screen: pick a sub-category of the previous sub-category (for example: Food, Healthy, Toys). Then it will display a list of all matching products with a picture, a small description, and a price. That's it. The point of the kiosk is to help customers find products that match their pet criteria (Dog > Puppy > Healthy Dog Food). I am wondering what the best solution is: RIA (Flex/AIR or Silverlight) or Flash/ActionScript. I am not sure which technology best delivers a smooth touch-screen user experience and fast development.

  • Oracle 'In Touch' PartnerCast - July 1, 2014

    - by Cinzia Mascanzoni
    27 May 2014 — 'In Touch' webcast for Oracle EMEA partners. Oracle 'In Touch' PartnerCast (July 1, 2014): be prepared for a year of growth. Register now!

    Dear partner, we would like to invite you to join David Callaghan, Senior Vice President Oracle EMEA Alliances and Channels, and his studio guests for the next broadcast of the Oracle 'In Touch' PartnerCast on Tuesday 1st July 2014, from 10:30am UK / 11:30am CET. In this cast, David's studio guests and his regional reporters will be looking at your priorities as EMEA partners and how best to grow with Oracle. The broadcast will also cover: highlights of FY14; strategic themes for FY15; HCM, CRM and ERP; and Oracle on Oracle. Exclusive for 'In Touch': David Callaghan questions Rich Geraffo, Senior Vice President, Global Alliances & Channels, on how the FY15 partner global kick-off relates to EMEA. David also gives you the chance to hear from some of the newly appointed worldwide A&C leadership team as he discusses with Bruce Chumley, VP Oracle Channel Distribution Sales, and Troy Richardson, VP Oracle Strategic Alliances, their core focus, their strategy for growth, and what they intend to bring to the table in their new roles.

    With lots of studio guests joining David, why not get in touch on Twitter using the hashtag #OracleInTouch or by emailing [email protected] to get your questions featured in the cast? To find out more and to watch previous episodes on demand, please visit our webpage. Best regards, Oracle EMEA Alliances & Channels.

    Broadcast details: July 01, 2014, 10:30am UK / 11:30am CET; duration 45 mins. Host: David Callaghan, Senior VP Oracle EMEA Alliances & Channels. Studio guests: Alistair Hopkins, VP Sales & Strategy, Technology Solutions, Oracle EMEA Alliances & Channels; more to be announced shortly. Featured contributors: Rich Geraffo, Senior Vice President, Oracle Worldwide Alliances & Channels; Bruce Chumley, Vice President Channel Distribution Sales, Oracle WW Alliances & Channels; Steve Biondi, VP Channel Distribution Sales, Oracle WW Alliances & Channels. Regional reporters: Silvia Kaske, VP Oracle A&C WCE North; Will O'Brien, VP Oracle A&C UK/IE; Eric Fontaine, VP Oracle A&C WCE South; Janusz Naklicki, VP Oracle A&C ECEMEA.

  • Me on Windows 7 Touch (and I mention Silverlight Hack)

    At the MVP Summit 2010 I was asked to talk a little bit about Windows 7 touch, so I talk a bit about this touch-tag kiosk technology and our experience at Wirestone working with Windows 7 Touch, WPF and Silverlight. Anyway, it's pretty cool: http://www.vimeo.com/10357419

  • Security policy error iPhone/iPod touch issue

    - by Joey
    I'm getting an "Error from Debugger: Error launching remote program: security policy error" when I try to run my app on my iPod touch. The provisioning profiles look in order, and the app builds to my iPhone 3GS just fine. The app used to build just fine to my iPod touch, so I'm puzzled about what could have changed, and I'm wondering if anyone has any thoughts on what might be causing this issue. The build logs are below.

    Mon Mar 15 14:25:54 unknown com.apple.debugserver-43[449] : Connecting to com.apple.debugserver service...
    Mon Mar 15 14:25:55 unknown SpringBoard[24] : Unable to launch com.yourcompany.Unearthed because it has an invalid code signature, inadequate entitlements or its profile has not been explicitly trusted by the user.
    Mon Mar 15 14:25:55 unknown com.apple.debugserver-43[449] : error: unable to launch the application with CFBundleIdentifier 'com.yourcompany.Unearthed' sbs_error = 9
    Mon Mar 15 14:25:55 unknown com.apple.debugserver-43[449] : 1 [01c1/0903]: RNBRunLoopLaunchInferior DNBProcessLaunch() returned error: ''
    Mon Mar 15 14:25:55 unknown com.apple.debugserver-43[449] : error: failed to launch process (null): security policy error
    Mon Mar 15 14:26:03 unknown MobileSafari[72] : void SendDelegateMessage(NSInvocation*): delegate (webView:decidePolicyForNavigationAction:request:frame:decisionListener:) failed to return after waiting 10 seconds. main run loop mode: UITrackingRunLoopMode

  • iPad started in landscape receives touches only within 768x768

    - by user1307179
    It works perfectly fine when starting in portrait, and it also works when you rotate from portrait to landscape and back. It does not work when starting in landscape, but it then works once you rotate from landscape to portrait and back. When starting in landscape, the screen does not respond to any touch whose screen coordinate x is greater than 768. In code, I use the status bar orientation to determine the original orientation and rotate each view manually. The views display correctly but do not receive touches properly. My root view controller is then called, as the iPad starts rotating, with willRotateToInterfaceOrientation:duration:, which rotates every subview. Root controller:

        - (void)loadView {
            self.view = [[UIView alloc] init];
            // initialize child views
            [self willRotateToInterfaceOrientation:0 duration:0];
        }

        - (void)willRotateToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation duration:(NSTimeInterval)duration {
            if ([model isLandscape]) {
                self.view.frame = CGRectMake(0, 0, 1024, 768-80);
            } else {
                self.view.frame = CGRectMake(0, 0, 768, 1024-80);
            }
            // rotate child views
        }

    My [model isLandscape] code works, so I don't need to explain how it works, but here is the code anyway:

        - (bool)isLandscape {
            if (orientation == UIInterfaceOrientationLandscapeLeft || orientation == UIInterfaceOrientationLandscapeRight)
                return true;
            else
                return false;
        }

        - (id)init {
            [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(orientationChanged:) name:UIDeviceOrientationDidChangeNotification object:nil];
        }

        - (void)orientationChanged:(NSNotification *)notification {
            UIInterfaceOrientation curOrientation = [[UIDevice currentDevice] orientation];
            if (curOrientation == UIDeviceOrientationPortrait || curOrientation == UIDeviceOrientationPortraitUpsideDown ||
                curOrientation == UIDeviceOrientationLandscapeLeft || curOrientation == UIDeviceOrientationLandscapeRight) {
                orientation = curOrientation;
                ((AppDelegate *)([UIApplication sharedApplication].delegate)).savedOrientationForRestart = orientation;
                NSLog(@"changed");
            }
        }

        - (void)validateOrientation {
            // first time when initializing orientation
            UIInterfaceOrientation curOrientation = [[UIDevice currentDevice] orientation];
            if (curOrientation != UIDeviceOrientationPortrait && curOrientation != UIDeviceOrientationPortraitUpsideDown &&
                curOrientation != UIDeviceOrientationLandscapeLeft && curOrientation != UIDeviceOrientationLandscapeRight) {
                orientation = [[UIApplication sharedApplication] statusBarOrientation];
            }
        }
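
    A sketch of one possible cause (an assumption; the post does not confirm it): when frames are managed by hand like this, a superview — often the window itself — can keep its portrait-sized bounds, and UIKit clips hit-testing to those bounds, which would produce exactly a dead zone beyond x = 768. Letting UIKit manage the rotation usually avoids this; rootViewController below stands for whichever controller owns this view hierarchy:

        // Install the controller as the window's root instead of
        // [window addSubview:rootViewController.view], and let the view
        // track its superview's size through rotations.
        window.rootViewController = rootViewController;
        rootViewController.view.autoresizingMask =
            UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight;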

  • Touch gestures in IE not working without explorer.exe being run once

    - by Michael
    Edit: rephrasing my question. Upon further troubleshooting, I can conclude that touch gestures (dragging, pinch to zoom, touch-and-hold right click) in Internet Explorer start to work when either: the system has been running for ~2 minutes (this coincides with the delayed start of services), or explorer.exe is run, then killed. I assume explorer.exe starts some services? The services with delayed start are: Security Center; Software Protection; Windows Defender, Search and Update; Windows Font Cache Service; Microsoft .NET Framework NGEN v4.0.30319_X64 and X86. I see no connection between these services and touch gestures, but just in case, I tried starting them manually, without luck. What else happens delayed after system boot that also happens when explorer is started?

    Old question: Internet Explorer 9 and Windows 7 Professional, running on an HP TouchSmart (touch-screen PC). It is going to be a kiosk PC (running a custom GUI for displaying websites).

    Scenario 1: when running Internet Explorer as a normal program in Windows 7, touch functions work perfectly: I can scroll a website by dragging it with my finger, pinch-zoom, and touch-and-hold to right-click. I then change the default shell in Windows to Internet Explorer (i.e. IE starts instead of explorer.exe). Internet Explorer starts up when logging in, as expected; however, touch functions are reduced to basic clicking (no dragging, no pinch zooming, no touch-and-hold right click). Then I manually start explorer.exe, and the touch functions work again! And here is the weird part: when I kill explorer.exe, the touch functions keep working, even if I close IE and start a new instance.

    Scenario 2: exactly the same, but instead of changing the default shell to Internet Explorer, I change it to my own program, which uses an embedded Internet Explorer ("WebBrowser"). The same thing happens.

    What I've tried: when explorer.exe launches, it launches all the autorun programs. None of them looked relevant, but just in case, I manually started all of them so that the session was identical to a normal login (just without explorer.exe). It still does not work (until I launch explorer.exe). Specifically, TabTip.exe, TabTip32.exe and wisptis.exe are all running. All services are also started.

    To sum it up: running explorer.exe once changes something in the touch capabilities of Internet Explorer. It doesn't matter whether explorer.exe is still running, as long as it has been run once. Does anyone know what causes this behavior, or how I can circumvent it neatly?

  • Reading .ppt (MS PowerPoint) file in Cocoa Touch

    - by Biranchi
    Hi all, any idea how to read a .ppt file in Cocoa Touch? I tried to load the contents of the file in a UIWebView, but it didn't work. Here is the code:

        [aWebView loadData:[NSData dataWithContentsOfFile:filePath] MIMEType:@"application/vnd.ms-powerpoint" textEncodingName:@"utf-8" baseURL:[NSURL fileURLWithPath:filePath]];
        [powerWeb loadData:[NSData dataWithContentsOfFile:filePath] MIMEType:@"application/vnd.ms-powerpoint" textEncodingName:@"utf-8" baseURL:[NSURL fileURLWithPath:filePath]];

    All suggestions are highly appreciated. Thanks
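
    A sketch of an alternative worth trying (an assumption, not a guaranteed fix): hand UIWebView the file URL directly with loadRequest: and let it sniff the document type itself, rather than supplying raw data with an explicit MIME type:

        NSURL *fileURL = [NSURL fileURLWithPath:filePath];
        NSURLRequest *request = [NSURLRequest requestWithURL:fileURL];
        // UIWebView renders many Office formats when given a file URL.
        [aWebView loadRequest:request];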

  • HTML APIs for touch devices?

    - by askvictor
    What HTML APIs are available for touch-screen devices (e.g. tablet PCs)? I notice that GMail's iPad interface (and other mobile interfaces) doesn't scroll down in a normal web browser pretending to be an iPad via a user-agent hack. How can one access this API on a PC? I have a school full of tablet PCs that aren't wonderful in tablet mode due to lack of application support, but there seems to be an increasing number of web-based apps that could fill this gap.

  • How to let a CCSprite handle touch events?

    - by Tattat
    I created my own CCSprite subclass (OwnCCSprite); it implements the CCStandardTouchDelegate protocol and the ccTouchesBegan event, but it doesn't seem to work. When I click the CCSprite, ccTouchesBegan is called on the CCLayer, but the CCSprite's ccTouchesBegan is never called. How can I detect that the CCSprite is being touched, in the CCLayer or in OwnCCSprite itself? Or do I need to calculate the touch position and compare it to the OwnCCSprite's position? Thanks.
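
    A minimal sketch, assuming cocos2d 1.x: a CCSprite receives no touches on its own, so the usual pattern is to register the sprite (conforming to CCTargetedTouchDelegate) with the touch dispatcher and test containment yourself:

        - (void)onEnter {
            [super onEnter];
            // Ask the dispatcher to route touches to this sprite first.
            [[CCTouchDispatcher sharedDispatcher] addTargetedDelegate:self
                                                             priority:0
                                                      swallowsTouches:YES];
        }

        - (void)onExit {
            [[CCTouchDispatcher sharedDispatcher] removeDelegate:self];
            [super onExit];
        }

        - (BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event {
            // Convert the touch to the parent's space and test our bounding box.
            CGPoint p = [self.parent convertTouchToNodeSpace:touch];
            if (CGRectContainsPoint(self.boundingBox, p)) {
                // The sprite was touched; claim (swallow) this touch.
                return YES;
            }
            return NO; // let the layer or other delegates handle it
        }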

  • Open source multi-touch APIs?

    - by daft
    I'm looking for a good open-source multi-touch API to use in a project we might get. So far I've found PyMT, but I haven't really seen any comments on the maturity of that product, so any input in that regard would be much appreciated. I'd also like some other suggestions for APIs that might be of interest, since googling has only gotten me so far and, as with PyMT, it is quite difficult to find opinions on the frameworks out there. Many thanks.

  • Windows version of the Unix touch command

    - by Paul Hargreaves
    I'm looking for a Windows port of the Unix touch command. I don't want to install an entire MKS toolkit just for the one tool. Is there a native port available somewhere, or a command in Windows that does the same thing and supports features like touching all files in a directory by wildcard? Specifically, I'm after changing mtime, ctime and atime for a project that reports ages of files based on... mtime, ctime and atime.

  • iPod touch (OS 3.0) Bluetooth connection to a non-Apple device

    - by Avi
    I need to know if I can programmatically connect my iPod touch (OS 3.0) to a non-Apple Bluetooth device using the Apple iPhone SDK. I know that I can connect to other iPhones using the GameKit API, but can I connect to other, non-Apple Bluetooth devices, for example a measuring device that sends out real-time data over Bluetooth?
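
    A sketch of the route that existed on OS 3.0 (with a big assumption about your hardware: this only works if the accessory is MFi-certified and declares a protocol string; CoreBluetooth did not exist yet). The protocol string com.example.measure below is a hypothetical placeholder:

        #import <ExternalAccessory/ExternalAccessory.h>

        EAAccessoryManager *manager = [EAAccessoryManager sharedAccessoryManager];
        for (EAAccessory *accessory in manager.connectedAccessories) {
            if ([accessory.protocolStrings containsObject:@"com.example.measure"]) {
                EASession *session = [[EASession alloc] initWithAccessory:accessory
                                                              forProtocol:@"com.example.measure"];
                // Real-time data arrives on session.inputStream (NSStream API).
            }
        }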

  • UIButton does not respond to touch events after changing its position using setFrame

    - by Pranathi
    I have a view controller class (child) which extends a view controller class (parent). In the parent class's loadView method I create a subview (named myButtonView) containing two horizontally laid-out buttons, and add it to the main view. In the subclass I need to shift these two buttons up by 50 pixels, so I shift the buttonView by calling setFrame. This makes the buttons shift and render properly, but they no longer respond to touch events. The buttons work properly in views of the parent class type; in the child class type view, they also work properly if I comment out the setFrame call. How can I shift the buttons and still have them respond to touch events? Any help is appreciated. Following are snippets of the code.

    In the parent class:

        - (void)loadView {
            // Some code...
            CGRect buttonFrameRect = CGRectMake(0, yOffset+1, screenRect.size.width, KButtonViewHeight);
            myButtonView = [[UIView alloc] initWithFrame:buttonFrameRect];
            myButtonView.backgroundColor = [UIColor clearColor];
            [self.view addSubview:myButtonView];
            // Some code...
            CGRect nxtButtonRect = CGRectMake(screenRect.size.width - 110, 5, 100, 40);
            myNxtButton = [UIButton buttonWithType:UIButtonTypeCustom];
            [myNxtButton setTitle:@"Submit" forState:UIControlStateNormal];
            myNxtButton.frame = nxtButtonRect;
            myNxtButton.backgroundColor = [UIColor clearColor];
            [myNxtButton addTarget:self action:@selector(nextButtonPressed:) forControlEvents:UIControlEventTouchUpInside];
            [myButtonView addSubview:myNxtButton];
            CGRect backButtonRect = CGRectMake(10, 5, 100, 40);
            myBackButton = [UIButton buttonWithType:UIButtonTypeCustom];
            [myBackButton setTitle:@"Back" forState:UIControlStateNormal];
            myBackButton.frame = backButtonRect;
            myBackButton.backgroundColor = [UIColor clearColor];
            [myBackButton addTarget:self action:@selector(backButtonPressed:) forControlEvents:UIControlEventTouchUpInside];
            [myButtonView addSubview:myBackButton];
            // Some code...
        }

    In the child class:

        - (void)loadView {
            [super loadView];
            // Some code...
            CGRect buttonViewRect = myButtonView.frame;
            buttonViewRect.origin.y = yOffset; // This is basically the original yOffset + 50
            [myButtonView setFrame:buttonViewRect];
            yOffset += KButtonViewHeight;
            // Add some other view below myButtonView...
        }
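
    A sketch of one likely cause (an assumption; the snippets don't show the superview's final frame): UIKit only delivers touches to a subview while the touch lands inside its superview's bounds, so if the shifted myButtonView pokes outside self.view, or the buttons poke outside myButtonView, hit-testing silently clips them. Resizing the superview to contain its children is the clean fix; alternatively, the clipped container (a custom UIView subclass) can override hitTest: to forward touches that fall outside its own bounds:

        - (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
            // Check each subview even when the point is outside our own bounds.
            for (UIView *subview in self.subviews) {
                CGPoint converted = [subview convertPoint:point fromView:self];
                if ([subview pointInside:converted withEvent:event]) {
                    return [subview hitTest:converted withEvent:event];
                }
            }
            return [super hitTest:point withEvent:event];
        }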

  • MKMapView setRegion animation prevents touch events on annotation views

    - by Vlad Gurovich
    Hi there! We have an MKMapView with a bunch of image annotations, where each annotation responds to touches by overriding these methods of an AnnotationView subclass: -(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event; -(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event. Our map region is updated using [MKMapView setRegion:animated:] whenever a new location is received that is far enough from the old location to make a difference. What I noticed is that if we set the animated flag to YES, touches on our annotations are rarely detected (probably because the main thread is busy animating between two map regions). When we set the animated flag to NO, everything is fine, but the map transition may (or may not) become jerky. The question I have is whether this is expected behavior of the animated flag of [MKMapView setRegion:animated:], or whether there is a workaround for this issue. Thanks in advance!
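
    A sketch of one workaround worth trying (an assumption, not a documented guarantee): drive the region change through a UIView animation block that explicitly allows user interaction, so the annotation views can keep receiving touches while the map animates. The 0.4-second duration is an arbitrary choice:

        [UIView animateWithDuration:0.4
                              delay:0.0
                            options:UIViewAnimationOptionAllowUserInteraction
                         animations:^{
                             [mapView setRegion:newRegion animated:NO];
                         }
                         completion:nil];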
