Search Results

Search found 212 results on 9 pages for 'gestures'.

Page 2/9

  • Getting a GestureOverlayView

    - by Codejoy
    I have been using some nice tutorials on drawing graphics on Android, and I wanted to add in the cool gesture demo found here: http://developer.android.com/resources/articles/gestures.html It takes these lines of code:

        GestureOverlayView gestures = (GestureOverlayView) findViewById(R.id.gestures);
        gestures.addOnGesturePerformedListener(this);

    This is fine and dandy, yet in my demo I'm building on code from "Playing with Graphics in Android". The demos make sense, everything makes sense, but I found out that by using setContentView(new Panel(this)); as required by the Playing with Graphics tutorials, findViewById no longer works and returns null. At first I was about to post a more basic question asking why this happens, but a quick test of playing with setContentView made me realize the cause of findViewById returning null; I just do not know how to remedy the issue. What's the key I am missing here? I realize that the new Panel is breaking some reference, but I am not sure how to make the connection here. R.id.gestures is defined right in the main.xml (just like the tutorial):

        <android.gesture.GestureOverlayView
            android:id="@+id/gestures"
            android:layout_width="fill_parent"
            android:layout_height="0dip"
            android:layout_weight="1.0" />

    So I did confirm that setContentView(new Panel(this)) is causing the issue, and I know I have to add the android.gesture.GestureOverlayView to the Panel class somehow; I am just not sure how to go about it. After fighting with this, I generally know what I need to do, just not how to do it. I think I need either the equivalent of creating a Panel in main.xml, OR to build in code what main.xml declares for the gestures. I am close, because I did this:

        GestureOverlayView gestures = new GestureOverlayView(this);

    which gets me a non-null gestures object. Unfortunately, since I am not telling it to fill its parent anywhere, I don't think it is really showing up, so I am trying hard to figure out layout params. Am I even on the right track?
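
    A minimal sketch of one way forward, assuming the Panel class from the "Playing with Graphics" tutorial: since GestureOverlayView is itself a ViewGroup, it can be built entirely in code, given layout params, and handed the Panel as a child before being passed to setContentView:

        // Sketch only: build in code what main.xml declares for the overlay.
        GestureOverlayView gestures = new GestureOverlayView(this);
        gestures.setLayoutParams(new ViewGroup.LayoutParams(
                ViewGroup.LayoutParams.FILL_PARENT,
                ViewGroup.LayoutParams.FILL_PARENT));
        gestures.addView(new Panel(this));            // the drawing surface nests inside the overlay
        gestures.addOnGesturePerformedListener(this); // callbacks arrive as in the gestures article
        setContentView(gestures);                     // the overlay, not the Panel, is the content view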

    Read the article

  • Did 12.04 just add multi-touch gesture support mid-release?

    - by adempewolff
    I was reviewing the updates I was about to download today and I noticed that a lot of them had to do with gesture support, and that many of these were new installs rather than upgrades. Has 12.04 just added multi-touch gesture support mid-release? If so, what capabilities does this add? Which applications already support these capabilities, and can I expect others to add support in the near future? Here are the packages that were installed:

        Install: libframe6:amd64 (2.2.4-0ubuntu0.12.04.1), libgeis1:amd64 (2.2.9.2-0ubuntu1), libgrail5:amd64 (3.0.6-0ubuntu0.12.04.01, automatic)

    And here are those that were upgraded (also including many with touch support):

        Upgrade: libgrip0:amd64 (0.3.4-0ubuntu2~ubuntu12.04.1, 0.3.5-0ubuntu1~12.04.1), eog:amd64 (3.4.2-0ubuntu1, 3.4.2-0ubuntu1.1), ginn:amd64 (0.2.4-0ubuntu1, 0.2.4.1-0ubuntu1)

    The descriptions for the new installs are:

    - libgeis1 (Gesture engine interface support): a common API for clients of a systemwide gesture recognition and propagation engine.
    - libframe6 (Touch Frame Library): handles the buildup and synchronization of a set of simultaneous touches. The library is input agnostic, with bindings for mtdev, frame and XI2.1.
    - libgrail5 (Gesture Recognition And Instantiation Library): consists of an interface and tools for handling gesture recognition and gesture instantiation. Applications can use the grail callbacks to receive gesture primitives and raw input events from the underlying kernel device.

    And the descriptions for the upgraded packages are:

    - libgrip0 (provides multitouch gestures to GTK+ apps): hooks gesture recognition into GTK+ applications.
    - ginn (Gesture Injector: No-GEIS, No-Toolkits): a daemon with jinn-like wish-granting capabilities: it gives applications the ability to support a subset of multi-touch gestures without having to integrate GEIS or multi-touch GTK/Qt libs.

    Adding a ton of new libraries and upgrading the existing components makes me wonder if 12.04 is meant to start natively supporting gestures other than two-finger scroll in the near future. I expected these capabilities to be introduced soon, but I thought they would only be rolled out in a new release, not as upgrades to an existing release. Does anyone have any info about this?

    Read the article

  • Enable all touchpad functions

    - by user118136
    When I was using Windows 8, my touchpad had multiple gestures:

    - 2 fingers moving top-to-bottom = reverse vertical scrolling (if I scrolled toward the top, the page scrolled toward the bottom)
    - 2 fingers moving left-to-right = reverse horizontal scrolling
    - 2-finger pinch = zoom in and zoom out, like on smartphones
    - 2-finger rotation = rotate the image in an image viewer (+90 or -90 degrees)
    - place a finger on the left edge and drag it to the right = switch applications; in Ubuntu I want this to change the active program to the left, like Ctrl+Shift+Tab
    - place a finger on the right edge and drag it to the left = open the right-hand menu and select an option by moving the finger top-to-bottom; in Ubuntu I want this to change the active program to the right, like Alt+Tab

    I succeeded in enabling 2-finger vertical scrolling in System Settings, but it does not work in the reverse sense that I want. Is there a method to enable the rest of the gestures and reverse the vertical scrolling? Edit: It's a Synaptics touchpad.

    Read the article

  • How do I programmatically add an Android Gestures view to a custom view?

    - by user351201
    I have a custom view that works fine, and I'm trying to get gestures into it. The most common technique I see is to add XML such as the snippet from the Android docs. My view is within a RelativeLayout, and when I attempt to reference this GestureOverlayView, I get an exception. I've also tried to connect it within my existing custom view class, like this:

        mGestures = new GestureOverlayView(context, attrs);
        mGestures.addOnGesturePerformedListener(this);

    But the callback is never invoked. Can someone see my errors or suggest a better way that will allow me to get gesture callbacks?
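
    For comparison, a hedged sketch of the usual pattern (MyCustomView is a stand-in name for the poster's custom view): the overlay only delivers callbacks once it is attached to the window and laid out over the view it should watch, so one option is to make the overlay the parent rather than construct it free-standing:

        // Sketch: wrap the custom view in the overlay from the Activity.
        GestureOverlayView overlay = new GestureOverlayView(this);
        overlay.addView(new MyCustomView(this)); // MyCustomView is a placeholder name
        overlay.addOnGesturePerformedListener(this);
        setContentView(overlay);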

    Read the article

  • Touch gestures in IE not working without explorer.exe being run once

    - by Michael
    Edit: Rephrasing my question. Upon further troubleshooting, I can conclude that touch gestures (dragging, pinch to zoom, touch-and-hold right click) in Internet Explorer start to work when:

    - The system has been running for ~2 minutes. This coincides with the delayed start of services.
    - Explorer.exe is run, then killed. I assume explorer.exe starts some services?

    The services with delayed start are as follows:

    - Security Center
    - Software Protection
    - Windows Defender, Search and Update
    - Windows Font Cache Service
    - Microsoft .NET Framework NGEN v4.0.30319_X64 and X86

    I see no connection between these services and touch gestures, but just in case, I manually tried starting these services, without luck. What else happens delayed after system boot, which also happens when explorer is started?

    Old question: Details: Internet Explorer 9 and Windows 7 Professional, running on an HP TouchSmart (touch screen PC). It is going to be a kiosk PC (running a custom GUI for displaying websites).

    Scenario 1: When running Internet Explorer as a normal program in Windows 7, touch functions work perfectly: I can scroll the website by dragging it with my finger, I can pinch zoom, and I can touch-and-hold right click. I now change the default shell in Windows to Internet Explorer (i.e. IE starts instead of explorer.exe). Internet Explorer of course starts up when logging in. However, touch functions are reduced to basic clicking (no dragging, no pinch zooming, no touch-and-hold right click). Then I manually start explorer.exe, and the touch functions work again! And here is the weird part: when I kill explorer.exe, the touch functions keep working - even if I close IE and start a new instance.

    Scenario 2: The exact same, but instead of changing the default shell to Internet Explorer, I change it to my own program, which uses an embedded Internet Explorer ("WebBrowser"). The same thing happens.

    What I've tried: When explorer.exe launches, it launches all the autorun programs. There are no relevant programs being run by explorer, but just in case, I manually started all the autorun programs, so that the session is identical to a normal login (just without explorer.exe). It still does not work (until I launch explorer.exe). Specifically, TabTip.exe, TabTip32.exe and wisptis.exe are all running. All services are also started.

    To sum it up: running explorer.exe once changes something in the touch capabilities of Internet Explorer. It doesn't matter whether explorer.exe is still running - as long as it has been run once. Does anyone know what causes this behavior, or how I can circumvent it neatly?

    Read the article

  • Macbook Air trackpad not clicking nor moving cursor, but multi-finger gestures are working

    - by GJ.
    This has happened to me several times recently, just after I disconnect my MBA from a USB hub (used for an external keyboard and mouse). The internal trackpad just won't move the cursor, nor does it generate clicks. However, strangely enough, multi-finger gestures still work, e.g. scrolling with two fingers. Only after a system restart does the trackpad return to normal (until the next time...).

    Read the article

  • How may I monitor all gestures of the entire system?

    - by Jiulong Zhao
    I can make a view accept touch events and receive detailed gesture events like this:

        [self setAcceptsTouchEvents:YES];
        // NSSet *touches = [event touchesMatchingPhase:NSTouchPhaseBegan inView:self];

    I can also receive global events by creating a monitor this way:

        NSUInteger eventMasks = NSEventTypeBeginGesture | NSEventTypeEndGesture | NSEventMaskGesture;
        [NSEvent addGlobalMonitorForEventsMatchingMask:eventMasks
                                               handler:^(NSEvent *theevent) { /* do something */ }];

    But I can NOT get the detailed touches from the monitor via -touchesMatchingPhase:. How may I monitor all gestures of the entire system?

    Read the article

  • Asus K55A Windows 8.1 touchpad smart gestures not working

    - by user291792
    It took me a bit to realize that since upgrading to 8.1, my touchpad smart gestures (two-finger scroll, top-down swipe, left-in swipe, etc.) don't work. I have read up a little online but can't seem to find an answer or fix that 1) sounds trustworthy and 2) is put into language and steps that I understand. I already have a hard time navigating Windows 8 in general, so any help is awesome, but please dumb it down for me if you can. Thanks -K-

    Read the article

  • Is there a mousegestures add-on for Google Chrome?

    - by kgrad
    With the release of Chrome 3.0, I am again considering switching to Chrome as my default browser. The only thing stopping me, and it has been stopping me since Chrome 1 came out, is the lack of the mouse gestures add-on that I have in Firefox. Mouse gestures have become so routine for me that I simply can't use another browser that doesn't have them. There are ways to kind of emulate mouse gestures using 3rd-party programs like gMote, but they are not the same and not quite as good. I know that the Chrome developer channel has extensions, but I haven't been able to find a mouse gestures one, and I'm fairly confident that many people want one. So, does a mouse gestures add-on exist for Chrome? Bonus points if there is a Firebug/Xmarks add-on as well! Thanks.

    Read the article

  • How do you add gestures to a UITableViewController?

    - by mea36
    I want to implement right-to-left and left-to-right swipe gestures on a view that inherits from UITableViewController. I have the code for the gestures implemented in another view (a UIViewController) and there it works, but here it doesn't seem like touchesBegan is even getting called. Does anyone know how to do this? Thanks

    Read the article

  • Emulating touch screen on Windows 8 with a Touch-Pad

    - by Akshat Mittal
    I am currently running Windows 8 Pro RTM (MSDN) and wonder if there is any way to use the touch-pad as some kind of touch screen for it, or maybe just for gestures. I have a Synaptics touch-pad; searching the internet, I found some articles mentioning some kind of relation between Synaptics touch-pads and Windows 8, but I was not able to find info about how to use the gestures or anything similar. Simply put, the question is: how can I enable Synaptics gestures for Windows 8, or use the touch-pad as a touch screen (I know it would be a really tiny touch screen, but I want to try), with third-party tools or hacks?

    Read the article

  • Implementing `fling` logic without pan gesture recognizers

    - by KDiTraglia
    So I am trying to port a simple game that I originally wrote for iPhone over to cocos2d-x. I've hit a minor bump, however, in implementing the simple 'fling' logic I had in the iPhone version, which is difficult to port to C++. In iOS I could get the velocity of a pan gesture very easily:

        CGPoint velocity = [recognizer velocityInView:recognizer.view];

    However, now I basically only know where the touch began, where the touch ended, and all the touches logged in between. For now I log all the points onto a stack, then pull the last point and the 6th-to-last point (that seemed to work the best), find the difference between those points, multiply by a constant, and use that as the velocity. It works relatively well, but I'm wondering if anyone has a better algorithm which, given a bunch of touch points, figures out a release speed that feels natural. (Note: speed in my game is just a constant x and y; there's no drag or spin or anything tricky like that.) Bonus points if anyone has figured out how to get pan gestures into the newest version (3.0 alpha) of cocos2d-x without losing the ability to build cross-platform.
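
    For what it's worth, here is a minimal sketch of the windowed-history idea in Java (illustrative names; the logic ports directly to C++): keep only the last ~100 ms of samples and divide the positional delta by the elapsed time, so the result is a real velocity in points per second rather than a hand-tuned constant:

        import java.util.ArrayDeque;
        import java.util.Deque;

        // Sketch: estimate fling velocity from a short history of touch samples.
        class FlingTracker {
            private static final long WINDOW_MS = 100; // only recent movement should count

            private static final class Sample {
                final float x, y;
                final long t; // milliseconds
                Sample(float x, float y, long t) { this.x = x; this.y = y; this.t = t; }
            }

            private final Deque<Sample> samples = new ArrayDeque<Sample>();

            void addSample(float x, float y, long timeMillis) {
                samples.addLast(new Sample(x, y, timeMillis));
                // Drop samples older than the window, so a slow start is ignored.
                while (!samples.isEmpty() && timeMillis - samples.peekFirst().t > WINDOW_MS) {
                    samples.removeFirst();
                }
            }

            // Returns {vx, vy} in points per second, or {0, 0} with too little data.
            float[] velocity() {
                if (samples.size() < 2) return new float[] { 0f, 0f };
                Sample first = samples.peekFirst();
                Sample last = samples.peekLast();
                float dt = (last.t - first.t) / 1000f;
                if (dt <= 0f) return new float[] { 0f, 0f };
                return new float[] { (last.x - first.x) / dt, (last.y - first.y) / dt };
            }
        }

    Using a time window instead of a fixed "6th-to-last point" makes the resulting feel independent of the touch sampling rate.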

    Read the article

  • Facing audio glitches in Linux Mint Chrome browser

    - by aravind.udayashankara
    I have been using Linux Mint as my OS for a year, and I consistently download and install updates. I use Chrome as my default browser. Whenever I open YouTube and watch a video, I hear glitches in the sound (say, repeated lyrics of a song) while it is playing; in Firefox it works fine. What is the problem - am I missing a plugin? AFAIK Chrome doesn't need a Flash Player plugin; it has a built-in Flash Player. Is that the problem? Also, I was not facing this previously; I recently started using the Cinnamon UI, and after this all these kinds of problems started. My hardware is a 64-bit Intel Core i3, and I have installed 64-bit Linux Mint. Please let me know what the problem is and how to fix it. Thanks in advance for responding to this post.

    Read the article

  • Swipe gestures on Android ListView items

    - by Bartek
    I have a ListView populated by a ResourceCursorAdapter, and I use the loaders mechanism to query a ContentProvider for list items. I detect swipe gestures on the list items to perform some actions on them. New items get added by a background service, so the list can change dynamically. Everything works fine, except when I start swiping and a database change occurs (as a result of the background service adding a new row). In that case the gesture is not detected properly. I noticed that ACTION_CANCEL is dispatched to the list item view, and also that bindView is executed for all visible items. Inside the bindView method I only set some text - I don't change any listeners there. How can I make gestures work even when new items are being added by the background service? Perhaps there's a way to prevent the motion from being cancelled, or I could pause database updates so they don't interrupt the gesture.
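
    A hedged sketch of the second idea (illustrative names throughout, not a library API): defer the cursor swap while a finger is down, so bindView - and the ACTION_CANCEL it triggers - cannot fire mid-gesture. The update is queued from onLoadFinished and replayed once the touch ends:

        import android.content.Context;
        import android.util.AttributeSet;
        import android.view.MotionEvent;
        import android.widget.ListView;

        // Sketch: a ListView that postpones adapter updates until the current gesture ends.
        public class GestureFriendlyListView extends ListView {
            private boolean touchInProgress;
            private Runnable pendingUpdate; // e.g. a cursor swap queued by the loader callback

            public GestureFriendlyListView(Context context, AttributeSet attrs) {
                super(context, attrs);
            }

            /** Call from onLoadFinished() instead of swapping the cursor directly. */
            public void postUpdate(Runnable update) {
                if (touchInProgress) {
                    pendingUpdate = update; // defer until the finger lifts
                } else {
                    update.run();
                }
            }

            @Override
            public boolean dispatchTouchEvent(MotionEvent ev) {
                switch (ev.getActionMasked()) {
                    case MotionEvent.ACTION_DOWN:
                        touchInProgress = true;
                        break;
                    case MotionEvent.ACTION_UP:
                    case MotionEvent.ACTION_CANCEL:
                        touchInProgress = false;
                        if (pendingUpdate != null) {
                            Runnable update = pendingUpdate;
                            pendingUpdate = null;
                            post(update); // apply the deferred swap after the gesture completes
                        }
                        break;
                }
                return super.dispatchTouchEvent(ev);
            }
        }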

    Read the article

  • Apple Magic Trackpad Gestures carried out as personalized Commands in Windows

    - by Adele
    I want to have the Apple Magic Trackpad work on Windows, but NOT as a regular mouse! I need the normal trackpad gestures to work with a C# application (i.e. when carrying out a 2-finger swipe to the left, it will start playing a song...). I guess I'll have to write my own driver? Is there a way to use Apple's Magic Trackpad driver for Windows and re-write that one? OR is there any way (API, self-written driver) I could just hook the gestures to my commands? Or any Raw Input examples? Does anybody know how to do that, or where to start? Thank you so much, I'm really lost.

    Read the article

  • iPhone Gestures: Adding 2 at once

    - by BahaiResearch.com
    Objective-C answers are fine too. Currently I am using this code to add 2 swipe gestures (left / right) to my WebView, and it works fine. Can I combine this into less code, though, to indicate that both gestures go to the same action?

        // LEFT
        UISwipeGestureRecognizer sgr = new UISwipeGestureRecognizer ();
        sgr.AddTarget (this, MainViewController.MySelector);
        sgr.Direction = UISwipeGestureRecognizerDirection.Left;
        sgr.Delegate = new SwipeRecognizerDelegate ();
        this.View.AddGestureRecognizer (sgr);

        // RIGHT
        UISwipeGestureRecognizer sgrRight = new UISwipeGestureRecognizer ();
        sgrRight.AddTarget (this, MainViewController.MySelector);
        sgrRight.Direction = UISwipeGestureRecognizerDirection.Right;
        sgrRight.Delegate = new SwipeRecognizerDelegate ();
        this.View.AddGestureRecognizer (sgrRight);

    Read the article

  • Ambiguity between multitouch gestures Tap and FreeDrag in Windows Phone 8 Emulator (Monogame)

    - by Moses Aprico
    I am making a 2D tile-based tactics game. I want the map to be slid around (because it's bigger than the screen) with FreeDrag; that part is perfectly done, the map can be moved around, that's not the problem. Then I want to display the character's actions every time the character is tapped. That's where the problem appears: every time I want to FreeDrag the map, the Tap gesture always fires before the FreeDrag one. Is there any way to tell map sliding apart from character tapping? Below is my code:

        while (TouchPanel.IsGestureAvailable)
        {
            GestureSample gesture = TouchPanel.ReadGesture();
            switch (gesture.GestureType)
            {
                case GestureType.FreeDrag:
                {
                    // a
                }
                break;
                case GestureType.Tap:
                {
                    // b
                }
                break;
            }
        }

    Every time I first want to free drag (at the first touch), it always goes to "b" first (see the commented lines above) and then to "a", rather than immediately going to "a". I've tried Flick, but the movement produced by Flick is too fast, so FreeDrag fits the best. Is there any way or workaround to perform FreeDrag (or similar) without firing the Tap gesture? Thanks in advance.

    Read the article

  • Implementing tracing gestures on iPhone

    - by bmoeskau
    I'd like to create an iPhone app that supports tracing of arbitrary shapes with your finger (with accuracy detection). I have seen references to an Apple sample app called "GestureMatch" that supposedly implemented exactly that, but it was removed from the SDK at some point and I cannot find the source anywhere via Google. Does anyone know of a current official sample that demonstrates tracing like this? Or any solid suggestions on other resources to look at? I've done some iPhone programming, but not really anything with the graphics APIs or custom handling of touch gestures, so I'm not sure where to start.

    Read the article

  • Windows 8 trackpad edge swipe zones

    - by askvictor
    I'm running Windows 8 on a Lenovo X220 laptop, and have just inadvertently discovered the edge-swipe feature that brings up the charms or switches between desktop and RT apps. The only problem is that the landing zones are a little too wide for my liking - I'd like to keep this feature, but narrow the zone where it can start. I'd rather not disable it completely as per: Modify or disable Windows 8 swipe gestures on touchpad / laptop. The Synaptics driver (latest available) doesn't seem to provide for changing this (though it does for other zones). Any ideas?

    Read the article

  • Windows Mobile 6.5 gestures and DirectDraw

    - by ArjanW
    I'm trying to build a UI using DirectDraw in C#. For this I'm using a DirectDrawWrapper as suggested here. My initial tests setting up the screen work perfectly, but now I'd like to incorporate gesture recognition into the UI. So I instantiate a GestureRecognizer and tie it to the _form, which also gets passed to the DirectDrawGraphics constructor:

        _form = new Form();
        _form.Show();
        _graphics = new DirectDrawGraphics(_form, CooperativeFlags.Fullscreen, BackbufferMode.Any);
        gestureRecognizer = new GestureRecognizer();
        gestureRecognizer.TargetControl = _form;

    Pasting the whole DirectDrawWrapper code might be a bit too much, so let me try to formulate a question. I guess DirectDraw talks directly to video memory, as it should, but then my form won't receive any messages, and thus any event handlers I've tied to the GestureRecognizer won't be fired. How can I still receive messages from the touchscreen?

    Read the article

  • Trouble with detecting gestures over ListView

    - by Andrew
    I have an Activity that contains a ViewFlipper. The ViewFlipper includes 2 layouts, both of which are essentially just ListViews. So the idea here is that I have two lists, and to navigate from one to the other I use a horizontal swipe. I have that working. However, whatever list item your finger is on when the swipe begins executing, that item also gets long-clicked. Here is the relevant code I have:

        public class MyActivity extends Activity implements OnItemClickListener, OnClickListener {
            private static final int SWIPE_MIN_DISTANCE = 120;
            private static final int SWIPE_MAX_OFF_PATH = 250;
            private static final int SWIPE_THRESHOLD_VELOCITY = 200;
            private GestureDetector mGestureDetector;
            View.OnTouchListener mGestureListener;

            class MyGestureDetector extends SimpleOnGestureListener {
                @Override
                public boolean onFling(MotionEvent e1, MotionEvent e2, float velocityX, float velocityY) {
                    try {
                        if (Math.abs(e1.getY() - e2.getY()) > SWIPE_MAX_OFF_PATH)
                            return false;
                        // right to left swipe
                        if (e1.getX() - e2.getX() > SWIPE_MIN_DISTANCE
                                && Math.abs(velocityX) > SWIPE_THRESHOLD_VELOCITY) {
                            if (mCurrentScreen != SCREEN_SECONDLIST) {
                                mCurrentScreen = SCREEN_SECONDLIST;
                                mFlipper.setInAnimation(inFromRightAnimation());
                                mFlipper.setOutAnimation(outToLeftAnimation());
                                mFlipper.showNext();
                                updateNavigationBar();
                            }
                        } else if (e2.getX() - e1.getX() > SWIPE_MIN_DISTANCE
                                && Math.abs(velocityX) > SWIPE_THRESHOLD_VELOCITY) {
                            if (mCurrentScreen != SCREEN_FIRSTLIST) {
                                mCurrentScreen = SCREEN_FIRSTLIST;
                                mFlipper.setInAnimation(inFromLeftAnimation());
                                mFlipper.setOutAnimation(outToRightAnimation());
                                mFlipper.showPrevious();
                                updateNavigationBar();
                            }
                        }
                    } catch (Exception e) {
                        // nothing
                    }
                    return true;
                }
            }

            @Override
            public boolean onTouchEvent(MotionEvent event) {
                if (mGestureDetector.onTouchEvent(event))
                    return true;
                else
                    return false;
            }

            ViewFlipper mFlipper;
            private int mCurrentScreen = SCREEN_FIRSTLIST;
            private ListView mList1;
            private ListView mList2;

            /** Called when the activity is first created. */
            @Override
            public void onCreate(Bundle savedInstanceState) {
                super.onCreate(savedInstanceState);
                requestWindowFeature(Window.FEATURE_NO_TITLE);
                setContentView(R.layout.layout_flipper);
                mFlipper = (ViewFlipper) findViewById(R.id.flipper);
                mGestureDetector = new GestureDetector(new MyGestureDetector());
                mGestureListener = new View.OnTouchListener() {
                    public boolean onTouch(View v, MotionEvent event) {
                        if (mGestureDetector.onTouchEvent(event)) {
                            return true;
                        }
                        return false;
                    }
                };

                // set up List1 screen
                mList1 = (ListView) findViewById(R.id.list1);
                mList1.setOnItemClickListener(this);
                mList1.setOnTouchListener(mGestureListener);

                // set up List2 screen
                mList2 = (ListView) findViewById(R.id.list2);
                mList2.setOnItemClickListener(this);
                mList2.setOnTouchListener(mGestureListener);
            }
            …
        }

    If I change the "return true;" statement in the GestureDetector to "return false;", I do not get long-clicks. Unfortunately, I then get regular clicks instead. Does anyone know how I can get around this?
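
    One workaround that may help, assuming the long-click comes from the ListView's pressed-state machinery (a sketch, not a definitive fix): when the gesture detector consumes the events, feed the list a synthesized ACTION_CANCEL so its pending click/long-click on the pressed item is abandoned:

        // Sketch: cancel the list's pending item click once a fling is recognized.
        mGestureListener = new View.OnTouchListener() {
            public boolean onTouch(View v, MotionEvent event) {
                if (mGestureDetector.onTouchEvent(event)) {
                    MotionEvent cancel = MotionEvent.obtain(event);
                    cancel.setAction(MotionEvent.ACTION_CANCEL);
                    v.onTouchEvent(cancel); // clears the pressed item so no click/long-click fires
                    cancel.recycle();
                    return true;
                }
                return false; // let the ListView handle plain taps normally
            }
        };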

    Read the article
