Touch Gestures

Traditionally, the web has relied on a mouse and a keyboard as the only input devices, while mobile devices are mostly controlled by touch. The mobile web started with a somewhat clumsy solution: translating touch events into mouse events such as mousedown.

The newer HTML5 approach is to embrace touch as a first-class input means, allowing web applications to intercept and identify complex multi-touch gestures, free-hand drawing, and so on. Unfortunately, the support is twofold: either via touch events such as touchstart, first introduced by Apple and later standardized as a de-facto solution once other vendors went the same route, or via the newer, more general Pointer Events specification, initiated by Microsoft.

API glimpse

Touch Events API

element.addEventListener('touchstart', listener)
An event triggered when the finger has been placed on a DOM element.
element.addEventListener('touchmove', listener)
An event triggered when the finger has been dragged along a DOM element.
element.addEventListener('touchend', listener)
An event triggered when the finger has been removed from a DOM element.
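For example, these three touch events can be combined into a simple free-hand drawing handler. The following is a minimal sketch, assuming a <canvas id="drawing"> element exists on the page (the element id is an assumption for illustration); each active finger is tracked by the identifier exposed in changedTouches, so multi-touch drawing works too.

// Minimal free-hand drawing sketch; assumes a <canvas id="drawing"> element on the page
const canvas = document.getElementById('drawing');
const ctx = canvas.getContext('2d');
const lastPositions = new Map(); // last known position of each active finger, keyed by touch identifier

function positionOf(touch) {
  // translate viewport coordinates into canvas-local coordinates
  const rect = canvas.getBoundingClientRect();
  return { x: touch.clientX - rect.left, y: touch.clientY - rect.top };
}

canvas.addEventListener('touchstart', (event) => {
  event.preventDefault(); // stop the browser from also synthesizing mouse events
  for (const touch of event.changedTouches) {
    lastPositions.set(touch.identifier, positionOf(touch));
  }
}, { passive: false }); // non-passive listener so preventDefault() is honoured

canvas.addEventListener('touchmove', (event) => {
  event.preventDefault(); // stop the page from scrolling while drawing
  for (const touch of event.changedTouches) {
    const last = lastPositions.get(touch.identifier);
    if (!last) continue;
    const current = positionOf(touch);
    // draw a line segment from the previous position of this finger to the current one
    ctx.beginPath();
    ctx.moveTo(last.x, last.y);
    ctx.lineTo(current.x, current.y);
    ctx.stroke();
    lastPositions.set(touch.identifier, current);
  }
}, { passive: false });

canvas.addEventListener('touchend', (event) => {
  for (const touch of event.changedTouches) {
    lastPositions.delete(touch.identifier); // finger lifted, forget its position
  }
});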

Pointer Events API

element.addEventListener('pointerdown', listener)
An event triggered when a pointer (finger, stylus or mouse button) has been pressed on a DOM element.
element.addEventListener('pointermove', listener)
An event triggered when a pointer has been dragged along a DOM element.
element.addEventListener('pointerup', listener)
An event triggered when a pointer (finger, stylus or mouse button) has been released from a DOM element.
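An equivalent sketch with Pointer Events handles mouse, stylus and touch input through the same listeners. It assumes an element with id="area" exists on the page (the id is an assumption for illustration); setting touch-action: none on it is needed so the browser does not consume touch input for scrolling or zooming before the pointer events fire.

// Minimal pointer-tracking sketch; assumes an element with id="area" on the page
const area = document.getElementById('area');
area.style.touchAction = 'none'; // otherwise the browser may handle touches as scroll/zoom gestures

area.addEventListener('pointerdown', (event) => {
  // pointerType distinguishes 'touch', 'pen' and 'mouse'; pointerId identifies each finger/device
  console.log(`pointer ${event.pointerId} (${event.pointerType}) down at ${event.clientX}, ${event.clientY}`);
  // keep receiving pointermove/pointerup for this pointer even if it leaves the element
  area.setPointerCapture(event.pointerId);
});

area.addEventListener('pointermove', (event) => {
  // buttons is non-zero while a finger is in contact or a mouse button is held down
  if (event.buttons > 0) {
    console.log(`pointer ${event.pointerId} moved to ${event.clientX}, ${event.clientY}`);
  }
});

area.addEventListener('pointerup', (event) => {
  console.log(`pointer ${event.pointerId} up`);
  area.releasePointerCapture(event.pointerId);
});

Note that a single code path serves all input types here; the buttons check filters out hover-only mouse movement, which has no counterpart in the Touch Events model.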

Resources