Rice Flour Cookies
Published on 2014-08-20
I recently went out and purchased a touch-screen monitor with the intention of learning how to program touch-enabled web applications. I had reviewed the MDN documentation about touch events, as well as the W3C specification.
To get started, I wrote a very short test page with two event handlers: one for the mousedown event and one for the touchstart event. I loaded the page in IE, touched the document, and found that only the mousedown event fired.
I saw the same behavior in Firefox, only to find out later that Firefox can be set to enable touch events via about:config. When touch events are enabled, touchstart fires, but mousedown does not.
Chrome was even stranger: when I touched the document, it fired both events, touchstart and then mousedown, in that order.
My Android phone appears to be the only place where touching the document fires the touchstart event alone.
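A minimal sketch of the kind of test page described above (the `record` helper, the `fired` array, and the environment guard are my own illustration, not the original code):

```javascript
// Record which low-level events fire on a single touch, in order.
// `fired` holds the event names so the firing order can be inspected.
const fired = [];

function record(name) {
  return function (e) {
    fired.push(name);
    console.log(name + ' fired');
  };
}

// Only wire up listeners when a DOM is actually present, so the
// recording logic above can also be exercised outside a browser.
if (typeof document !== 'undefined') {
  document.addEventListener('mousedown', record('mousedown'));
  document.addEventListener('touchstart', record('touchstart'));
}
```

Touching the page once in Chrome, for example, should leave `fired` as `['touchstart', 'mousedown']`, matching the order described above.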
I did a Google search and ended up on two interesting pages. First, I found the page on CanIUse for touch events: http://caniuse.com/#feat=touch
Can I Use clearly indicates that IE does not support touch events as of this writing, and Firefox only supports touch events if they are manually enabled.
Furthermore, each of the four browsers I mentioned treats a touch in a completely different way. It boils down to this:
- IE: simulated mouse click
- Firefox with touch disabled: simulated mouse click
- Firefox with touch enabled: touch event
- Chrome: touch event and simulated mouse click
- Android: touch event
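One common way to cope with Chrome's double-firing is to handle touchstart and call preventDefault() on it, which the Touch Events specification says suppresses the compatibility mouse events; a separate mousedown handler then only runs in browsers where touch never fires. A sketch, with the actual action factored into a plain function so it can be tested outside a browser (the `activate` function and counter are my own illustration):

```javascript
// All input paths funnel into one plain, testable function.
let activations = 0;
function activate(x, y) {
  activations += 1;
  return { x: x, y: y };
}

function onTouchStart(e) {
  // Cancelling touchstart suppresses the compatibility mouse events
  // per the spec, so `activate` runs exactly once per touch.
  e.preventDefault();
  const t = e.changedTouches[0];
  activate(t.clientX, t.clientY);
}

function onMouseDown(e) {
  // Only reached in browsers that never fired touchstart.
  activate(e.clientX, e.clientY);
}

if (typeof document !== 'undefined') {
  // { passive: false } is needed in current browsers for
  // preventDefault() to work in a document-level touchstart listener.
  document.addEventListener('touchstart', onTouchStart, { passive: false });
  document.addEventListener('mousedown', onMouseDown);
}
```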
What is more frustrating is that the same search also turned up a Microsoft page called RethinkIE. RethinkIE brags about touch support in IE; as a matter of fact, one of its slogans is "Touch the Web". It links to a number of touch-based applications. I followed some of these links, and as best I can tell, it's just what CanIUse described: no proper touch support, just simulated mouse clicks.
The MDN (https://developer.mozilla.org/en-US/docs/Web/API/Touch) and W3C (http://www.w3.org/TR/touch-events/) documentation describe a far richer interface: one that doesn't just simulate mouse clicks, but keeps track of multiple touches at once, the contact area, rotation, and force of each touch, and a unique identifier for each touch so that they can be tracked individually.
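That richer model is easiest to see with a tracker keyed on the spec's Touch.identifier property. The tracker below is my own illustrative structure; identifier, clientX, clientY, and force are the real properties the specification defines on each Touch:

```javascript
// Track each active finger by its spec-defined unique identifier.
const active = new Map();

function touchesStarted(touches) {
  for (const t of touches) {
    active.set(t.identifier, { x: t.clientX, y: t.clientY, force: t.force });
  }
}

function touchesMoved(touches) {
  for (const t of touches) {
    const rec = active.get(t.identifier);
    if (rec) {
      rec.x = t.clientX;
      rec.y = t.clientY;
      rec.force = t.force;
    }
  }
}

function touchesEnded(touches) {
  for (const t of touches) active.delete(t.identifier);
}

if (typeof document !== 'undefined') {
  // changedTouches lists only the touches that changed in this event.
  document.addEventListener('touchstart', e => touchesStarted(e.changedTouches));
  document.addEventListener('touchmove', e => touchesMoved(e.changedTouches));
  document.addEventListener('touchend', e => touchesEnded(e.changedTouches));
  document.addEventListener('touchcancel', e => touchesEnded(e.changedTouches));
}
```

Two fingers down at once become two entries in `active`, and each later touchmove or touchend is matched back to the right finger by its identifier.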
I don't see how simulated mouse clicks can ever match the functionality described above, which, once again, is part of the W3C specification, although it is listed as "non-normative", meaning that a browser can claim to be standards-compliant without implementing it. (Why bother making it part of the standard, then?)
What motivated my research is that I've written an HTML5 application that doesn't work on Android because Android doesn't fire mouse events.
I'm now afraid to try to implement touch for my application because the browsers all behave so differently. I imagine that at some time in the future, the browsers might start handling touch similarly, but how can I tell how they might be handled in the future short of writing code to handle the behavior of each individual browser?
Is it possible to write code today that will work with touch-enabled browsers for years to come? If so, how?
© Programmers or respective owner