Search Results

Search found 19745 results on 790 pages for 'touch event'.

Page 44/790 | < Previous Page | 40 41 42 43 44 45 46 47 48 49 50 51  | Next Page >

  • Beginning Game Development on iPhone/iPad [closed]

    - by Ilya Knaup
    I want to begin learning iPhone game development. The problem is that most of the resources I've found target older iPad and iPhone models. As you know, both now have Retina displays and much faster graphics processors, which older resources don't take advantage of. So I'm asking for help on how to kick off development: are there any recent tools, libraries, standards, etc. you can recommend? Ideally the game should run on both iPad and iPhone, Retina and non-Retina. It's going to be a 2D, cartoon-style game with intensive touch input (so detecting touches quickly is a must). Any advice that could help us get started is very much appreciated.

    Read the article

  • Access local email stored on the workstation from a laptop on the LAN

    - by crafter
    I have the following email setup: I use Evolution as my primary email client on my workstation. Evolution downloads mail from my mail servers over POP and then deletes it from the server. When I am mobile, I access my email on the server using webmail. My laptop is my primary computer these days and the workstation is hardly used, so when I am mobile I am restricted to new email that has not yet been downloaded onto the workstation. I am now looking for a way to access the email stored on my workstation from my laptop, almost as if the workstation were a second-level email server. I tried running Evolution over a remote X display, but attachments then open on the workstation (not ideal, as most of my documents are on the laptop). I am open to changing mail clients or installing a service on the workstation. What would be the best way to address this requirement?

    Read the article

  • Apple Magic Trackpad multitouch configuration

    - by Sureshkannan Duraisamy
    Today I installed the Ubuntu 10.10 release on my Desktop PC. I was running Ubuntu 10.04 LTS with an Apple Magic Trackpad and everything was working fine. After today's fresh installation of Ubuntu 10.10, I don't see my Apple Magic Trackpad's multitouch working. Two-finger scrolling and three-finger third mouse button clicking are completely broken. Has anyone else experienced a similar issue? Has anyone had success with Ubuntu 10.10 and an Apple Magic TrackPad? Please help me to fix this issue. Your help is highly appreciated...

    Read the article

  • Configuring Touchpad Multi-Tap on Ubuntu 11.10

    - by nunos
    I am having a hard time configuring my notebook's touchpad. I can do everything I could in Windows except three-finger tap, which doesn't work at all, and two-finger tap, which gives me the equivalent of a right-mouse click when I wanted a middle-mouse click. I read on a forum to use this as a guide. The problem is that I can't even find the configuration file /etc/X11/xorg.conf.d/10-synaptics.conf. I tried running pacman -S xf86-input-synaptics, but I don't have the pacman program installed, and when I try to install it with sudo apt-get install I get a pacman game instead! I know the guide is for Arch Linux, so maybe that's why it doesn't work for me. I am running Ubuntu 11.10 on an Asus N82JV. Any help on this is appreciated. Here's the output of xinput list:
    nuno@mozart:~$ xinput list
    ⎡ Virtual core pointer                          id=2   [master pointer  (3)]
    ⎜   ↳ Virtual core XTEST pointer                id=4   [slave  pointer  (2)]
    ⎜   ↳ Microsoft Microsoft® Nano Transceiver v2.0  id=12  [slave  pointer  (2)]
    ⎜   ↳ Microsoft Microsoft® Nano Transceiver v2.0  id=13  [slave  pointer  (2)]
    ⎜   ↳ ETPS/2 Elantech Touchpad                  id=16  [slave  pointer  (2)]
    ⎣ Virtual core keyboard                         id=3   [master keyboard (2)]
        ↳ Virtual core XTEST keyboard               id=5   [slave  keyboard (3)]
        ↳ Power Button                              id=6   [slave  keyboard (3)]
        ↳ Video Bus                                 id=7   [slave  keyboard (3)]
        ↳ Video Bus                                 id=8   [slave  keyboard (3)]
        ↳ Sleep Button                              id=9   [slave  keyboard (3)]
        ↳ USB2.0 2.0M UVC WebCam                    id=10  [slave  keyboard (3)]
        ↳ Microsoft Microsoft® Nano Transceiver v2.0  id=11  [slave  keyboard (3)]
        ↳ Asus Laptop extra buttons                 id=14  [slave  keyboard (3)]
        ↳ AT Translated Set 2 keyboard              id=15  [slave  keyboard (3)]

    Read the article

  • Rainy Day Wallpaper Collection for Your iPhone

    - by Akemi Iwaya
    Rainy days are great for staying indoors to read your favorite new book, taking a nap, or even going outside for a quiet walk. Let the rain fall on your iPhone’s screen with the first in our series of Rainy Day Wallpaper collections. Rainy Day Series 1 Note: Click on the pictures to view and download the full-size versions at their individual homepages. The images shown here are in thumbnail format.                     

    Read the article

  • ETPS/2 Elantech Touchpad fails to detect a single finger

    - by Philipp
    I have a new laptop with an "ETPS/2 Elantech Touchpad". A single finger is only detected if it is held so that a large area touches the pad; if the pad is touched only with the fingertip, it is not detected, neither for clicking nor for moving the pointer. Strangely, two-finger gestures are detected even when the pad is touched only with the fingertips. On ubuntuusers.de I found that Elantech touchpads require reloading the psmouse module:
    sudo modprobe -r psmouse
    sudo modprobe psmouse proto=imps
    If I do this, a single finger finally gets detected as it should, but all two- or more-finger gestures stop working. Also, grep -B 5 mouse /proc/bus/input/devices shows that the touchpad is now identified as a mouse, while before the change it was correctly identified as ETPS/2 Elantech Touchpad.

    Read the article

  • How to practice typing programmer keys such as tilde, pipe and the programmer quote?

    - by user7893
    It is nice that there are services such as TypeRacer where you can practice casual writing, but I want to practice programmer keys, which cover more of the numbers and symbols not used by the regular typist. There was a typing tutor with which I practiced some programmer keys, and I noticed that my speed dropped dramatically from 70-80 wpm to about 15-30 wpm; it also trains different muscles. So how can I practice just the programming keys, using programming texts or random pieces of code?

    Read the article

  • Basic game architecture best practices in Cocos2D on iOS

    - by MrDatabase
    Consider the following simple game: 20 squares floating around an iPhone's screen, and tapping a square causes that square to disappear. What's the "best practices" way to set this up in Cocos2D? Here's my plan so far: one Objective-C GameState singleton class (maintains the list of active squares); one CCScene (since there are no menus etc.); one CCLayer (child node of the scene); many CCSprite nodes (one per square, all child nodes of the layer); each sprite listens for a tap on itself, and on receiving a tap removes itself from the GameState. Since I'm relatively new to Cocos2D I'd like some feedback on this design. For example, I'm unsure about the GameState singleton; perhaps it's unnecessary.
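
    Cocos2D itself is Objective-C, so the following is not Cocos2D code. Purely as an illustration of the data-ownership pattern proposed above (a singleton tracking live squares, each square unregistering itself when tapped), a minimal Java sketch, with all names invented, might look like this:

    import java.util.ArrayList;
    import java.util.List;

    // Hypothetical, framework-agnostic sketch of the proposed design:
    // a singleton tracks live squares; each square removes itself on tap.
    final class GameState {
        private static final GameState INSTANCE = new GameState();
        private final List<Square> activeSquares = new ArrayList<>();

        private GameState() {}
        static GameState get() { return INSTANCE; }

        void register(Square s)   { activeSquares.add(s); }
        void unregister(Square s) { activeSquares.remove(s); }
        int activeCount()         { return activeSquares.size(); }
    }

    class Square {
        Square() { GameState.get().register(this); }

        // Called by whatever input layer detected a tap on this square.
        void onTapped() {
            GameState.get().unregister(this);
            // ...also remove the corresponding sprite node from its parent layer here
        }
    }

    Whether the singleton earns its keep mostly depends on whether anything other than the layer needs to ask "how many squares are left"; if not, the layer's own children list can serve as the source of truth.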

    Read the article

  • Problems creating a click package for a native application

    - by Swordfish90
    Hello everyone, I've tried many times to package a native click application, mainly following this tutorial: http://notyetthere.org/?p=316 The procedure works and I'm able to execute the binaries both on phones and on the desktop. The problems start when I install the click package: the application launches, but the game does not work (the "gamefield" is not loaded). The only error message given is that the versions of local storage are not compatible (which should be impossible, because the version never changed). I suspect that the problem is related to AppArmor, but I have no proof of that. The source code of the application is available here: https://launchpad.net/ubuntu-netwalk Thank you in advance to everyone...

    Read the article

  • GestureListener's fling method doesn't get called

    - by nosferat
    I'm using the SimpleDirectionGestureDetector from the libgdx-users wiki as my InputProcessor. I set it in the create() method: Gdx.input.setInputProcessor(new SimpleDirectionGestureDetector(charController)); Here charController is my class which implements the DirectionListener interface defined in the SimpleDirectionGestureDetector class, and it is responsible for moving the player character. However, the character doesn't change direction when I perform a fling in any direction. I've checked, and the fling() method in SimpleDirectionGestureDetector never gets called, and I have no idea why, since everything seems fine. What am I doing wrong?
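
    For comparison, here is a minimal sketch (not the wiki class itself, just the stock libGDX GestureDetector with a GestureAdapter) that can be used to check whether fling events reach the input processor at all; the log tag and class name are made up:

    import com.badlogic.gdx.Gdx;
    import com.badlogic.gdx.input.GestureDetector;

    // Minimal check that flings reach the InputProcessor: install it in create()
    // and watch the log while swiping. If nothing is printed, the processor is
    // being replaced elsewhere (e.g. by a later setInputProcessor call or a Stage).
    public class FlingLogger extends GestureDetector.GestureAdapter {

        public static GestureDetector newDetector() {
            return new GestureDetector(new FlingLogger());
        }

        @Override
        public boolean fling(float velocityX, float velocityY, int button) {
            // Touch coordinates grow downward, so positive velocityY is a downward fling.
            if (Math.abs(velocityX) > Math.abs(velocityY)) {
                Gdx.app.log("fling", velocityX > 0 ? "right" : "left");
            } else {
                Gdx.app.log("fling", velocityY > 0 ? "down" : "up");
            }
            return true; // event handled
        }
    }

    Installed with Gdx.input.setInputProcessor(FlingLogger.newDetector()), this separates the two failure modes: if flings show up here but not in the wiki detector, the problem is in the wiki class or how it is constructed; if they never show up, something else is overwriting the input processor.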

    Read the article

  • How to use uTouch on multitouch-enabled touchpads?

    - by Freddi
    I currently have a Synaptics touchpad with only a few classic multitouch features (two-finger scroll, right click). After installing the uTouch testing suite, I saw that it doesn't accept my touchpad as an input device. I want to buy a newer notebook and would like to benefit from the uTouch features (window management, swipe, pinch, rotate). Does uTouch work only on touchscreens, or also on touchpads? What requirements should I take into account when choosing a new notebook?

    Read the article

  • Mobile broadband does not connect without unplugging and replugging

    - by Muhammad Zohaib
    I have recently installed Ubuntu 13.10 and I am still very new to this operating system. My problem is that when I start my computer, it detects all the Wi-Fi networks around, but not my mobile broadband USB modem (Huawei): no mobile broadband section appears automatically. I have to unplug and then replug the USB modem to get the mobile broadband section and connect. I don't want to unplug and replug the device every time, as that will wear out the connection, and I'd like to leave it plugged into the laptop even when it is shut down. I want Ubuntu to detect my USB broadband modem automatically. Please someone guide me. Thanks in advance.

    Read the article

  • How to tell what part of a 3D cube was touched

    - by user2539517
    I am writing a rather simple Android game using OpenGL to draw a 3D cube that spins about the X, Y and Z axes, and I need to know where the user has touched on the cube's texture. The texture, used on every face, is a simple square bitmap (100x100) with a smaller square in the center, and I need to know whether the user touched the inner square, as well as which face of the cube was touched. Does anyone know how this can be accomplished? If not, can anyone give some pseudo code on how to map the touch ray to a point on the texture, or at least point me in the right direction? The code I am using is from section 2.9 ("Example 6a: Photo-Cube") of http://www3.ntu.edu.sg/home/ehchua/programming/android/Android_3D.html which is a port of NeHe Lesson 6 to Android.
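
    Not the asker's code, but a rough sketch of the usual approach: turn the touch into a ray in the cube's local space (e.g. with android.opengl.GLU.gluUnProject plus the inverse model matrix), intersect the ray with each face's plane, convert the hit point to texture coordinates, and test those against the inner square. A minimal, self-contained Java sketch of the intersection-and-UV step for a unit cube centered at the origin follows; the face numbering, the u/v orientation per face, and the "inner square covers the middle half of the bitmap" bound are assumptions and may need adjusting to match the actual texture mapping:

    // Ray/face picking for an axis-aligned unit cube centered at the origin,
    // assuming rayOrigin and rayDir are already in the cube's local (model) space.
    public final class CubePicker {

        public static final class Hit {
            public final int face;            // 0..5: +X, -X, +Y, -Y, +Z, -Z (assumed ordering)
            public final float u, v;          // texture coordinates in [0, 1]
            public final boolean innerSquare; // true if the hit lies in the assumed inner square
            Hit(int face, float u, float v, boolean inner) {
                this.face = face; this.u = u; this.v = v; this.innerSquare = inner;
            }
        }

        // Returns the nearest face hit by the ray, or null if the cube is missed.
        public static Hit pick(float[] rayOrigin, float[] rayDir) {
            Hit best = null;
            float bestT = Float.MAX_VALUE;
            for (int face = 0; face < 6; face++) {
                int axis = face / 2;                         // 0 = X, 1 = Y, 2 = Z
                float planeCoord = (face % 2 == 0) ? 0.5f : -0.5f;
                float denom = rayDir[axis];
                if (Math.abs(denom) < 1e-6f) continue;       // ray parallel to this face
                float t = (planeCoord - rayOrigin[axis]) / denom;
                if (t <= 0 || t >= bestT) continue;          // behind the origin, or a nearer hit exists
                float[] p = { rayOrigin[0] + t * rayDir[0],
                              rayOrigin[1] + t * rayDir[1],
                              rayOrigin[2] + t * rayDir[2] };
                int uAxis = (axis + 1) % 3, vAxis = (axis + 2) % 3; // axes spanning this face
                if (Math.abs(p[uAxis]) > 0.5f || Math.abs(p[vAxis]) > 0.5f) continue; // outside face
                float u = p[uAxis] + 0.5f;                   // map [-0.5, 0.5] -> [0, 1]
                float v = p[vAxis] + 0.5f;
                // Assumption: the inner square spans the middle 50% of the 100x100 bitmap.
                boolean inner = u > 0.25f && u < 0.75f && v > 0.25f && v < 0.75f;
                best = new Hit(face, u, v, inner);
                bestT = t;
            }
            return best;
        }
    }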

    Read the article

  • Unable to Restore Nexus 7 2013

    - by belkinsa
    I have a Nexus 7 2013 with the FLO-04.01 bootloader. I tried ./flash-all.sh and it fails; this happens every time. Output below:
    svetlana@svetlana-TECRA-M5:~/Downloads/razor-krt16s$ sudo ./flash-all.sh
    [sudo] password for svetlana:
    < waiting for device >
    sending 'bootloader' (3911 KB)...
    OKAY [  0.163s]
    writing 'bootloader'...
    OKAY [  1.446s]
    finished. total time: 1.609s
    rebooting into bootloader...
    OKAY [  0.006s]
    finished. total time: 0.006s
    archive does not contain 'boot.sig'
    archive does not contain 'recovery.sig'
    failed to allocate 721539744 bytes
    error: update package missing system.img
    svetlana@svetlana-TECRA-M5:~/Downloads/razor-krt16s$

    Read the article

  • Restoring two-finger middle click

    - by Thomas A.
    It used to be that tapping two fingers on the touchpad sent a middle mouse click. Now it does a right click, and three fingers are now the middle click. I really can't understand the change and think it is a bug, or badly copied from Apple, or something; the reasoning escapes me entirely. I use the middle click to open links in new browser tabs all day, and I rarely use right click (and I have a right mouse button below the touchpad, doh). Tapping three fingers on my tiny EeePC touchpad is next to impossible, so I want the old behavior. I found: synclient TapButtons2=2 and synclient TapButtons3=3 but that did not work on 10.10. Does anyone know how to restore sane behavior?

    Read the article

  • Multitouch screen not detected on Asus Taichi 21DH71

    - by geekfreak
    I just bought the "Asus Taichi 21 DH71" ultrabook. It has a 3rd-generation Intel i7 processor, 4 GB of RAM and a 256 GB SSD. Its main feature is that it is a hybrid machine, namely it has dual screens: when the lid is closed it can be used as a tablet, and when the lid is open it can be used as a notebook. The machine can also be used with both screens on at the same time. I used Ubuntu many years ago and loved it, but I have not tried any Linux since. My questions are: does the new version of Ubuntu support a multitouch interface, will it work specifically on this machine, and will Ubuntu support gestures on the multitouch touchpad? Update 2013-02-22: I did try the latest 64-bit Ubuntu (12.10) from a live USB and noticed that it couldn't detect the tablet screen; everything else worked seamlessly. Do you think the tablet screen would be detected if I do a full installation onto the notebook? Please help.

    Read the article

  • Logging Output of Azure Startup Tasks to the Event Log

    - by Your DisplayName here!
    This can come in handy when troubleshooting:

    using System;
    using System.Diagnostics;
    using System.Text;

    namespace Thinktecture.Azure
    {
        class Program
        {
            static EventLog _eventLog = new EventLog("Application", ".", "StartupTaskShell");
            static StringBuilder _out = new StringBuilder(64);
            static StringBuilder _err = new StringBuilder(64);

            static int Main(string[] args)
            {
                if (args.Length != 1)
                {
                    Console.WriteLine("Invalid arguments: " + String.Join(", ", args));
                    _eventLog.WriteEntry("Invalid arguments: " + String.Join(", ", args));
                    return -1;
                }

                var task = args[0];

                ProcessStartInfo info = new ProcessStartInfo()
                {
                    FileName = task,
                    WorkingDirectory = Environment.CurrentDirectory,
                    UseShellExecute = false,
                    ErrorDialog = false,
                    CreateNoWindow = true,
                    RedirectStandardOutput = true,
                    RedirectStandardError = true
                };

                var process = new Process();
                process.StartInfo = info;

                process.OutputDataReceived += (s, e) =>
                {
                    if (e.Data != null)
                    {
                        _out.AppendLine(e.Data);
                    }
                };
                process.ErrorDataReceived += (s, e) =>
                {
                    if (e.Data != null)
                    {
                        _err.AppendLine(e.Data);
                    }
                };

                process.Start();
                process.BeginOutputReadLine();
                process.BeginErrorReadLine();
                process.WaitForExit();

                var outString = _out.ToString();
                var errString = _err.ToString();

                if (!string.IsNullOrWhiteSpace(outString))
                {
                    outString = String.Format("Standard Out for {0}\n\n{1}", task, outString);
                    _eventLog.WriteEntry(outString, EventLogEntryType.Information);
                }

                if (!string.IsNullOrWhiteSpace(errString))
                {
                    errString = String.Format("Standard Err for {0}\n\n{1}", task, errString);
                    _eventLog.WriteEntry(errString, EventLogEntryType.Error);
                }

                return 0;
            }
        }
    }

    You then wrap your startup tasks with the StartupTaskShell and you'll be able to see stdout and stderr in the Application event log.

    Read the article

  • Examples of Android Joystick Controls? [on hold]

    - by KRB
    I can't seem to find any well-executed code examples for Android joystick controls. Whatever it may be (algorithms, pseudo code, actual code examples, strategies, or anything else to assist with the design and implementation of Android joystick controls), I can't find anything decent on the net. What are some well-executed examples? More specifically: pseudo code, current examples, idea/design, functionality description, and controller hints related directly to the Android architecture. What kinds of classes will I have to build for this? Will there be only one? How would this fit into the game architecture? These are all things I am thinking about. Cheers! UPDATE: I've found this on the subject, Joystick Example1, though I am still looking for other examples/resources. I answered my own question with a link to the code of the above video; it's a fantastic start to Android joystick controls.
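
    Not taken from the linked example, but a minimal sketch of the core touch math behind most Android virtual joysticks: track the finger relative to the stick's center, clamp the offset to the stick radius, and expose a normalized direction vector. The class and field names are made up for illustration:

    import android.view.MotionEvent;

    // Core math of a virtual joystick: turns touch positions into a normalized
    // deflection vector (dirX, dirY in [-1, 1]) relative to the stick's center.
    public class VirtualJoystick {
        private final float centerX, centerY, radius;
        private float dirX, dirY;   // normalized stick deflection
        private boolean pressed;

        public VirtualJoystick(float centerX, float centerY, float radius) {
            this.centerX = centerX;
            this.centerY = centerY;
            this.radius = radius;
        }

        // Feed this from the owning View's onTouchEvent(MotionEvent).
        public boolean onTouchEvent(MotionEvent event) {
            switch (event.getActionMasked()) {
                case MotionEvent.ACTION_DOWN:
                case MotionEvent.ACTION_MOVE:
                    float dx = event.getX() - centerX;
                    float dy = event.getY() - centerY;
                    float len = (float) Math.sqrt(dx * dx + dy * dy);
                    if (len > radius) {          // clamp the knob to the stick's rim
                        dx = dx / len * radius;
                        dy = dy / len * radius;
                    }
                    dirX = dx / radius;
                    dirY = dy / radius;
                    pressed = true;
                    return true;
                case MotionEvent.ACTION_UP:
                case MotionEvent.ACTION_CANCEL:
                    dirX = 0f;
                    dirY = 0f;
                    pressed = false;
                    return true;
            }
            return false;
        }

        public float getDirX() { return dirX; }
        public float getDirY() { return dirY; }
        public boolean isPressed() { return pressed; }
    }

    The owning View would draw the knob at (centerX + dirX * radius, centerY + dirY * radius), and the game loop would read getDirX()/getDirY() each frame to move the character.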

    Read the article

  • Getting a 404 when using the Nexus 7 installer PPA, how do I fix this? [duplicate]

    - by Vitaliy
    This question already has an answer here: How can I fix a 404 Error when using a PPA? On Ubuntu 13.10, sudo add-apt-repository ppa:ubuntu-nexus7/ubuntu-nexus7-installer completes OK, but sudo apt-get update reports:
    W: Failed to fetch http://ppa.launchpad.net/ubuntu-nexus7/ubuntu-nexus7-installer/ubuntu/dists/saucy/main/binary-amd64/Packages  404 Not Found
    W: Failed to fetch http://ppa.launchpad.net/ubuntu-nexus7/ubuntu-nexus7-installer/ubuntu/dists/saucy/main/binary-i386/Packages  404 Not Found
    Thanks for the answer.

    Read the article

  • Middle-click does nothing but makes window controls appear

    - by hleinone
    I just did a fresh install of Precise Pangolin on my laptop and noticed that middle-click (actually a three-finger tap on the touchpad) in Firefox doesn't work as it used to. When doing it on a link, the link doesn't open in a new tab; in fact, it doesn't open at all. Only the (useless) window size and position controls appear, as seen in the terminal screenshot. How do I get my tab-opening middle clicks back?

    Read the article

  • How do I make 3-finger multitouch work on a Samsung 530?

    - by RiaD
    I have a Samsung 530U4C. How can I configure 3-finger gestures to work? Choosing the next photo with a 3-finger "scroll" worked in Windows 7. If I use synclient TapButton3=2 I can use it as a middle mouse button, but there's a problem: the setting gets reset after a reboot and, from what I've read on the Internet, at other times as well. Moreover, it would be great to see all the gestures my laptop supports and configure them all as I need. I found some info about touchegg, but I didn't manage to understand what exactly it is, or even to run it. Two-finger gestures work fine (configured using the System Settings menu).

    Read the article

  • How to get past "sending 'system' (92311 KB)..." when installing ubuntu on Nexus 7

    - by brew182
    I'm in the process of installing the Ubuntu image onto my Nexus 7, following these directions. I am on step 4: I typed phablet-flash -b, it downloaded some files, erased 'system', and is now on sending 'system' (92311 KB)... However, it has been at this step for about 3 hours now. I assume it shouldn't take this long, so how can I get out of this and restart the flash without bricking my device?

    Read the article

  • Permanently disable Bluetooth

    - by NotABluetoothUser
    I simply don't use Bluetooth. Since it can be a security risk and also drains the battery, I would like to keep it deactivated. I quickly found the option to turn it off in the settings menu, but the problem is that it doesn't stay off: every time I wake my Nexus 4 from standby, Bluetooth reappears in the top bar as if I had never deactivated it. How can I deactivate it so it stays deactivated, or better yet, how can I remove it from my phone entirely? I tried sudo apt-get remove bluez bluetooth, but I am not allowed to modify this package.

    Read the article

  • Raising a VB6 event using interop

    - by Steve
    Hi, I have a legacy VB6 component that I've imported into Visual Studio using tlbimp.exe to generate my interop assembly. The VB6 component defines an event that allows me to pass messages within VB6:
    Public Event Message(ByVal iMsg As Variant, oCancel As Variant)
    I would really like to be able to raise this event from my C# program, but it is getting imported as an event, not as a delegate or anything else useful, so I can only listen, never fire. Does anyone know how to fire an event defined within VB6? The C# event looks like:
    [TypeLibType(16)] [ComVisible(false)] public interface __MyObj_Event { event __MyObj_MessageEventHandler Message; }
    Unfortunately, I cannot change the VB6 code. Thanks.

    Read the article

  • Correct way of using/testing event service in Eclipse E4 RCP

    - by Thorsten Beck
    Allow me to pose two coupled questions that might boil down to one about good application design ;-) (1) What is the best practice for using event-based communication in an e4 RCP application? (2) How can I write simple unit tests (using JUnit) for classes that send/receive events using dependency injection and IEventBroker? Let’s be more concrete: say I am developing an Eclipse e4 RCP application consisting of several plugins that need to communicate. For communication I want to use the event service provided by org.eclipse.e4.core.services.events.IEventBroker so my plugins stay loosely coupled. I use dependency injection to inject the event broker into a class that dispatches events:
    @Inject static IEventBroker broker;
    private void sendEvent() { broker.post(MyEventConstants.SOME_EVENT, payload); }
    On the receiver side, I have a method like:
    @Inject @Optional private void receiveEvent(@UIEventTopic(MyEventConstants.SOME_EVENT) Object payload)
    Now the questions. (1) In order for IEventBroker to be successfully injected, my class needs access to the current IEclipseContext. Most of my classes using the event service are not referenced by the e4 application model, so I have to manually inject the context on instantiation using e.g. ContextInjectionFactory.inject(myEventSendingObject, context); This approach works, but I find myself passing a lot of context around to wherever I use the event service. Is this really the correct approach to event-based communication across an E4 application? (2) How can I easily write JUnit tests for a class that uses the event service (either as a sender or a receiver)? Obviously, none of the above annotations work in isolation, since there is no context available. I understand everyone’s convinced that dependency injection simplifies testability, but does this also apply to injecting services like the IEventBroker? This article describes creating your own IEclipseContext to bring DI into tests. I am not sure whether that would resolve my second issue, and I also hesitate to run all my tests as JUnit plug-in tests, as it seems impractical to fire up the PDE for each unit test. Maybe I just misunderstand the approach. This article speaks about “simply mocking IEventBroker”. Yes, that would be great! Unfortunately, I couldn’t find any information on how that can be achieved. All this makes me wonder whether I am still on a "good path" or whether this is already a case of bad design. And if so, how would you go about redesigning? Move all event-related actions to dedicated event sender/receiver classes or a dedicated plugin?
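
    One common route for the second question (a sketch, not an endorsed E4 pattern) is to let the broker arrive through a non-static injected field or a constructor, so a plain JUnit test can hand in a Mockito mock of IEventBroker without any IEclipseContext; the question's field is static, which is what makes that awkward. The class names below are invented, the topic string is a placeholder, IEventBroker.post and @Inject are the real E4/javax.inject API, and Mockito plus javax.inject must be on the test classpath:

    import static org.mockito.Mockito.mock;
    import static org.mockito.Mockito.verify;

    import javax.inject.Inject;

    import org.eclipse.e4.core.services.events.IEventBroker;
    import org.junit.Test;

    // Topic constants as in the question; the string value here is a placeholder.
    final class MyEventConstants {
        static final String SOME_EVENT = "my/some/event";
    }

    // Production class: the broker is an instance field, injected by E4 at runtime
    // but replaceable by a mock in a plain JUnit test.
    class EventSender {
        @Inject
        IEventBroker broker;

        void sendEvent(Object payload) {
            broker.post(MyEventConstants.SOME_EVENT, payload);
        }
    }

    // Plain JUnit test: no IEclipseContext, no plug-in test runner, no PDE.
    public class EventSenderTest {

        @Test
        public void postsPayloadOnSomeEventTopic() {
            IEventBroker broker = mock(IEventBroker.class);
            EventSender sender = new EventSender();
            sender.broker = broker;   // field "injection" done by hand in the test

            Object payload = "hello";
            sender.sendEvent(payload);

            verify(broker).post(MyEventConstants.SOME_EVENT, payload);
        }
    }

    Receivers can be tested the same way by calling the @UIEventTopic-annotated method directly with a test payload; the annotations themselves only matter when the class lives inside a real context.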

    Read the article
