@Carsten I like that idea!
The reason why there is no touch event is that the Apple touch model cannot be used for drag and drop functionality: once you touch an object, the touch event stays associated with that object for as long as your finger stays on the screen, and no other element receives events, so dragging from one element to another cannot be modeled. Luckily I found the handjs project from Microsoft. It implements an alternative touch model called pointer events that provides the necessary functionality. However, when you interact with objects and use multitouch gestures, the Apple model is better. Therefore you will find a mixed use of both models in the Reality Editor.
In index.js L634 and L635 you can find "pointerdown" and "pointerup" being attached to the iframe elements. In onload.js L165 you can find "pointerup" being attached to the background canvas.
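For orientation, the wiring looks roughly like this (a sketch only; iframeElement and backgroundCanvas are placeholder names, and the handler mapping follows the description in the next paragraph):

```js
// Sketch: pointer events as provided natively or by the handjs polyfill.
// iframeElement and backgroundCanvas are placeholder names, not the
// actual variables used in index.js or onload.js.
iframeElement.addEventListener("pointerdown", touchDown, false);
iframeElement.addEventListener("pointerup", trueTouchUp, false);

// A "pointerup" arriving on the background canvas means the touch
// ended outside of every element.
backgroundCanvas.addEventListener("pointerup", documentPointerUp, false);
```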
In eventHandlers.js you can find the functions touchDown, falseTouchUp (reached via documentPointerUp) and trueTouchUp, which are called by these event handlers:
- touchDown is called when an IOPoint is touched.
- falseTouchUp is called when the touch ends outside of any element (and therefore on the background canvas).
- trueTouchUp is called when the touch ends on top of an IOPoint.
trueTouchUp then tests whether the new connection would create an endless loop in the system and whether the touch ended on the same element where touchDown started. If everything is clear, a new link is sent to the server.
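In simplified form, the checks look something like this (touchDownTarget, wouldCreateLoop and sendLinkToServer are assumed names for illustration, not the project's actual identifiers):

```js
// Sketch of the trueTouchUp checks described above.
// touchDownTarget is assumed to be remembered by touchDown;
// wouldCreateLoop and sendLinkToServer are hypothetical helpers.
function trueTouchUp(event) {
    var target = event.currentTarget;

    // Ignore a "link" that starts and ends on the same IOPoint.
    if (target === touchDownTarget) return;

    // Reject a link that would close an endless loop in the system.
    if (wouldCreateLoop(touchDownTarget, target)) return;

    // Everything is clear: send the new link to the server.
    sendLinkToServer(touchDownTarget, target);
}
```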
If the developer editing (UI Move) mode is selected, normal multitouch events are used instead (index.js L639 and L640). These events call MultiTouchMove and MultiTouchEnd in eventHandlers.js (L257 and L333).
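A rough sketch of that path, assuming touchmove/touchend are the events attached in index.js (the exact event names there may differ):

```js
// Sketch: Apple-style multitouch events for the UI Move mode.
// iframeElement is a placeholder for the element being moved.
iframeElement.addEventListener("touchmove", MultiTouchMove, false);
iframeElement.addEventListener("touchend", MultiTouchEnd, false);

function MultiTouchMove(event) {
    event.preventDefault();
    // In the Apple model, event.touches lists every active finger,
    // which is what makes multi-finger gestures easy to handle.
    var touch = event.touches[0];
    // ...reposition the element using touch.pageX / touch.pageY...
}

function MultiTouchEnd(event) {
    // ...finalize the position and clean up gesture state...
}
```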
To answer your question:
If you want visual feedback when touching the IOPoints, you can place the trigger in the touchDown and trueTouchUp functions.
If you want visual feedback for the developer editing (UI Move) mode, you can place it in the MultiTouchMove function. See the sketch below for one way to do this.
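For example, such a trigger could look roughly like this (a sketch only; the "ioPointActive" CSS class is invented for illustration and does not exist in the project):

```js
// Sketch: toggling a hypothetical "ioPointActive" CSS class as
// visual feedback while a link is being drawn between IOPoints.
function touchDown(event) {
    // Highlight the IOPoint where the drag starts.
    event.currentTarget.classList.add("ioPointActive");
    // ...existing touchDown logic...
}

function trueTouchUp(event) {
    // Remove the highlight once the link is completed.
    event.currentTarget.classList.remove("ioPointActive");
    // ...existing trueTouchUp logic...
}
```

The same pattern would apply inside MultiTouchMove for the editing mode, for example by updating the moved element's style on every move event.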