Highlight IOPoint on touch

I would like the user to get visual feedback when they touch an IOPoint in the RealityEditor, so that while drawing a connection they know they are hitting the mark. I thought I could just add an event listener for touch events in the datapointInterfaces index.js, but it seems the event is never propagated down that far. I suspect that is because touching an IOPoint invokes the line-drawing routines, which stop further propagation of the touch event. Does anyone have an idea for a workaround and/or can confirm my suspicion?

@Carsten I like that idea!
The reason there is no touch event is that Apple's touch-event model cannot be used for drag-and-drop functionality: once you touch an object, the touch event stays associated with that object for as long as your finger is on the screen, and no events are fired for other elements, so drag and drop cannot be modeled this way. Luckily I found the handjs project from Microsoft, which implements an alternative model called pointer events that provides the necessary functionality. However, for multitouch interaction with objects, the Apple model is better, which is why you will find both models used side by side in the Reality Editor.

In index.js L634 and L635 you can find “pointerdown” and “pointerup” being attached to the iframe elements. In onload.js L165 you can find “pointerup” being attached to the background canvas.
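For illustration, attaching a handjs pointer event to the background canvas looks roughly like this (a sketch only; the "canvas" id is an assumption, not copied from onload.js):

var bgCanvas = document.getElementById("canvas");        // hypothetical id of the background canvas
bgCanvas.style["touch-action"] = "none";                 // opt the element in to handjs pointer handling
bgCanvas.addEventListener("pointerup", documentPointerUp, false);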

In eventHandlers.js you can find the functions touchDown, documentPointerUp -> falseTouchUp, and trueTouchUp, which are called by these event handlers.

touchDown is called when an IOPoint is touched.
falseTouchUp is called when the touch ends outside of any element (and therefore on the background canvas).
trueTouchUp is called when the touch ends on top of any IOPoint.

trueTouchUp then tests whether the connection would create an endless loop in the system, or whether the touch-up occurs on the same element as the touchDown. If everything is clear, a new link is sent to the server.
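A hedged sketch of that flow; touchDownPoint, loopCheck and postLinkToServer are placeholder names, not the identifiers actually used in eventHandlers.js:

var touchDownPoint = null;                                // the IOPoint where the drag started

function touchDown(evt) {
    touchDownPoint = evt.currentTarget.id;                // remember the start point of the connection
}

function falseTouchUp(evt) {
    touchDownPoint = null;                                // the drag ended on the background canvas
}

function trueTouchUp(evt) {
    var touchUpPoint = evt.currentTarget.id;
    if (touchUpPoint === touchDownPoint) return;          // same element as touchDown: no link
    if (loopCheck(touchDownPoint, touchUpPoint)) return;  // would create an endless loop: no link
    postLinkToServer(touchDownPoint, touchUpPoint);       // otherwise the new link is sent to the server
    touchDownPoint = null;
}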

If the developer editing (UI Move) mode is selected, normal multitouch events are used (index.js L639 and L640).

These events call MultiTouchMove and MultiTouchEnd in eventHandlers.js L257 and L333.
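Approximately (a sketch based on the line references above; which native touch events they are bound to is an assumption, not copied from index.js):

theObject.addEventListener("touchmove", MultiTouchMove, false);   // developer editing (UI Move) mode only
theObject.addEventListener("touchend", MultiTouchEnd, false);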

To answer your question:
If you want visual feedback when touching the IOPoints, you can place the trigger in the touchDown and trueTouchUp functions (see the sketch below).
If you want visual feedback in the developer editing (UI Move) mode, you can place it in the MultiTouchMove function.
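A minimal sketch of such a trigger for the first case, assuming a hypothetical CSS class ioPointHighlighted:

function setIOPointHighlight(elementId, on) {
    var el = document.getElementById(elementId);
    if (!el) return;
    if (on) {
        el.classList.add("ioPointHighlighted");      // hypothetical CSS class for the highlight
    } else {
        el.classList.remove("ioPointHighlighted");
    }
}

// inside touchDown:   setIOPointHighlight(evt.currentTarget.id, true);
// inside trueTouchUp: setIOPointHighlight(evt.currentTarget.id, false);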


Thank you, that clears things up. I will try to implement something :slight_smile:

I think I found out why no touch events are passed on to the datapoint interface UI. There is a div on top of the iframe that contains the datapoint UI; it has a higher z-index, so it sits in front of the iframe and you can never actually touch the iframe itself. This div is only visible when the iframe contains a datapoint UI, which is why touch events do work when the iframe contains the hardwareInterfaces web UI. See index.js:

var tempAddContent =
    "<iframe id='iframe" + thisKey + "' onload='on_load(\"" +
        generalObject + "\",\"" + thisKey + "\")' frameBorder='0' " +
        "style='width:" + thisObject.frameSizeX + "px; height:" + thisObject.frameSizeY + "px;" +
        "top:" + ((globalStates.width - thisObject.frameSizeX) / 2) + "px; left:" +
        ((globalStates.height - thisObject.frameSizeY) / 2) + "px; visibility: hidden;' " +
        "src='" + thisUrl + "' class='main'>" +
    "</iframe>";

tempAddContent +=
    "<div id='" + thisKey + "' frameBorder='0' " +
        "style='width:" + thisObject.frameSizeX + "px; height:" + thisObject.frameSizeY + "px;" +
        "top:" + ((globalStates.width - thisObject.frameSizeX) / 2) + "px; left:" +
        ((globalStates.height - thisObject.frameSizeY) / 2) + "px; visibility: hidden;' " +
        "class='mainEditing'></div>";

@valentin is there a reason why you didn’t just add the event listeners directly to the iframe whenever it contains a datapointInterface, instead of adding them to the overlay div?

var theObject = document.getElementById(thisKey);
theObject.style["touch-action"] = "none";
theObject["handjs_forcePreventDefault"] = true;
theObject.addEventListener("pointerdown", touchDown, false);
theObject.addEventListener("pointerup", trueTouchUp, false);

Yes, this entire behavior is on purpose.
I experimented around a lot and ended up with this design.
If the div is not on top of the iframe, the touch/pointer event is captured within the iframe but is not received by the Editor.

Drag and drop does not work otherwise.

Ok, that makes it all very difficult, because you can’t change anything in the iframe from within the RealityEditor code: the iframe content is loaded from another domain, and cross-origin access is blocked by the browser for security reasons. So I need to figure out a way to pass the event through to the iframe and handle it on both layers. I’ve got no idea so far :frowning: By setting the CSS property “pointer-events” to “none” you can pass events through to lower layers, but then the events won’t be registered on the upper layer itself, and apparently the setting is only supported in Safari 9 and above…
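For reference, that pass-through idea would look like this (with the caveats just mentioned):

var overlay = document.getElementById(thisKey);   // the 'mainEditing' div covering the iframe
overlay.style.pointerEvents = "none";             // events now fall through to the iframe, but the div itself stops receiving them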

You can use window.postMessage.
This is already used to tell the editor the exact size of the page within the iframe.
And I am using it in the new version to allow subscribing to the 3D transformation data.
You can find it in index.js L715.
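A sketch of the receiving side inside the datapoint interface page; the message format here is an assumption, not the schema used at index.js L715:

window.addEventListener("message", function (msg) {
    var data;
    try {
        data = JSON.parse(msg.data);          // this sketch assumes the editor sends JSON strings
    } catch (e) {
        return;                               // ignore anything that is not JSON
    }
    if (typeof data.pointerOver === "boolean") {
        // toggle the IOPoint highlight in the datapoint UI here
    }
}, false);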

Aaah, yes, of course, thank you. I got it working with the handjs pointerenter and pointerleave events: I’m simply sending a message to the datapoint interface that indicates whether the pointer entered or left the element (rough sketch below). Thanks again for the postMessage tip, @valentin.
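Roughly what the editor side of that looks like (a sketch of the approach, not the actual code; the message format matches the hypothetical one above):

var overlay = document.getElementById(thisKey);             // the overlay div for this IOPoint
var frame = document.getElementById("iframe" + thisKey);    // the matching datapoint interface iframe

overlay.addEventListener("pointerenter", function () {
    frame.contentWindow.postMessage(JSON.stringify({pointerOver: true}), "*");
}, false);

overlay.addEventListener("pointerleave", function () {
    frame.contentWindow.postMessage(JSON.stringify({pointerOver: false}), "*");
}, false);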