Touch Events

In my application I have event listeners registered for just about every event type (left click, left double click, left down, left up, mouse move, and right click).

There is talk of supporting a Windows tablet in the future. While playing with a tablet that was loaned to me for a couple of hours (a couple of months ago), I remember that all touch interactions (tap, tap and hold) were mapped to a left click.

Are there any plans to extend ScreenSpaceEventType to have dedicated touch events rather than reusing the click events? If the user has both a touch display and a mouse, the event handling logic is going to get really messy.

Thanks in advance

Jerry

Hello Jerry,

The ScreenSpaceEventHandler currently supports touch events, and we’ve had pretty good results with most tablets. Here are the built-in controls for moving the globe:

one-finger drag is pan

two-finger drag is tilt

and two-finger pinch is zoom

We designed the ScreenSpaceEventHandler so that it doesn't expose touch-specific events, which keeps the behavior mostly consistent between touch and mouse applications. (i.e., when ScreenSpaceEventType.LEFT_DOWN is passed to ScreenSpaceEventHandler.setInputAction, the ScreenSpaceEventHandler adds both a mousedown and a touchstart event listener.)
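
For example, a single input action like the sketch below fires for both a mouse press and a finger tap (this assumes an existing Cesium.Viewer named viewer; the logging is just illustrative):

        // One handler covers both mousedown and touchstart.
        var handler = new Cesium.ScreenSpaceEventHandler(viewer.scene.canvas);
        handler.setInputAction(function(movement) {
            console.log('LEFT_DOWN at ' + movement.position.x + ', ' + movement.position.y);
        }, Cesium.ScreenSpaceEventType.LEFT_DOWN);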

Hope this helps!

Hannah

Hannah,

I agree that the gestures are present; I was looking for event equivalents to:
touchstart
touchmove
touchend

which are part of the proposed specification: https://w3c.github.io/touch-events/#touch-interface. The spec is dated earlier this month.

Thanks
Jerry

If you need these events, I recommend you just listen for them directly rather than go through ScreenSpaceEventHandler. Input is a mess on the web, and the fact that touch events are different from mouse events is actually a bad thing in most people's opinions. That's why Pointer Events exist: to try and unify everything. Since Cesium supports both legacy events and Pointer Events, it doesn't make sense for ScreenSpaceEventHandler to expose touchXXX functions (even though we listen for them). To Cesium, touchstart, pointerdown, and mousedown are the same thing. In fact, if you only listen for touchXXX, your code will not work on a lot of Microsoft browsers and browser-based apps, which only support Pointer Events (for example, Surface tablets).
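
If you go that route, a minimal sketch like the one below (the handler name is a placeholder, not Cesium API) prefers Pointer Events when the browser supports them and falls back to touch/mouse listeners otherwise:

        var canvas = viewer.scene.canvas;
        function onDown(e) {
            // your application-specific handling here
        }
        if (window.PointerEvent) {
            // One listener covers mouse, touch, and pen input.
            canvas.addEventListener('pointerdown', onDown);
        } else {
            canvas.addEventListener('touchstart', onDown);
            canvas.addEventListener('mousedown', onDown);
        }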

Since Pointer Events aren't available in all browsers yet, I encourage anyone creating a Cesium-based application to look into using PEP, which is the de facto Pointer Events polyfill. It will ensure your code works across all varieties of mobile and desktop devices, as well as devices that have both mouse and touch inputs. The only reason Cesium itself doesn't include PEP is that it modifies global state (by adding the window.PointerEvent object), and modifying global state is against our rules for Cesium development (because we don't want to make those choices for our users).
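
As a rough sketch (the CDN URL and version here are assumptions; check the PEP project for the current release), you could load the polyfill only when native Pointer Events are missing:

        if (!window.PointerEvent) {
            var script = document.createElement('script');
            script.src = 'https://code.jquery.com/pep/0.4.3/pep.js'; // assumed URL/version
            document.head.appendChild(script);
        }

Note that PEP only synthesizes pointer events on elements that declare a touch-action attribute (for example, touch-action="none" on the Cesium canvas).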

There’s some discussion about this in this issue: https://github.com/AnalyticalGraphicsInc/cesium/issues/2631. At one point I wanted to get rid of ScreenSpaceEventHandler altogether, but I think improving it is a better option now.

Hope that helps,

Matt


Matt,

Thanks for the information. I agree that Pointer Events seem like a more extensible approach.

Jerry

Matt,

Just an update: I did get the tablet back and did some more playing with it. The tablet is a Microsoft Surface, and the end user only cares about using the Chrome browser. I noticed that when I click with a mouse I am not receiving a pointer event.

I agree that, for the time being, the whole touch/mouse/pointer issue is a minefield that I think you should avoid getting dragged into until the OS, browser, and hardware vendors converge on a single standard specification that is supported by the major browsers.

Having said that, I did look into PEP and Hammer, but for my limited set of supported environments they were overkill. To get the behavior I needed, I made the following changes, and I thought they might be useful to the community.

In ScreenSpaceEventHandler I changed lines 84 to 88 (1.13 release) from:

        if (FeatureDetection.supportsPointerEvents()) {
            registerListener(screenSpaceEventHandler, 'pointerdown', element, handlePointerDown);
            registerListener(screenSpaceEventHandler, 'pointerup', element, handlePointerUp);
            registerListener(screenSpaceEventHandler, 'pointermove', element, handlePointerMove);
        } else {

to

        if (FeatureDetection.supportsPointerEvents()) {
            registerListener(screenSpaceEventHandler, 'pointerdown', element, handlePointerDown);
            registerListener(screenSpaceEventHandler, 'pointerup', element, handlePointerUp);
            registerListener(screenSpaceEventHandler, 'pointermove', element, handlePointerMove);

            // Also register the legacy mouse listeners so a physical mouse
            // keeps working when the browser doesn't deliver pointer events for it.
            registerListener(screenSpaceEventHandler, 'mousedown', element, handleMouseDown);
            registerListener(screenSpaceEventHandler, 'mouseup', alternateElement, handleMouseUp);
            registerListener(screenSpaceEventHandler, 'mousemove', alternateElement, handleMouseMove);
        } else {

Then, inside the handleMouseDown, handleMouseUp, and handleMouseMove functions in ScreenSpaceEventHandler, I commented out these lines:

        if (screenSpaceEventHandler._seenAnyTouchEvents) {
            return;
        }

Jerry