In my scene, I have a Camera2D and a node that reacts to touch events. The node calls
node.get_viewport_transform().xform_inv(event.position) to translate the incoming event's position from screen space to world space. So far this works perfectly: things happen in the game world exactly where I touch the screen.
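For context, the handler currently looks roughly like this (a minimal sketch of my setup; the function body and the spawn_thing_at name are placeholders, Godot 3.x API):

```gdscript
# Sketch of the current touch handling (names are illustrative).
func _unhandled_input(event):
    if event is InputEventScreenTouch and event.pressed:
        # Convert the event's screen-space position to world space
        # by inverting the viewport's transform.
        var world_pos = get_viewport_transform().xform_inv(event.position)
        spawn_thing_at(world_pos)  # placeholder for the game's reaction
```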
I then set the Camera2D's zoom property to
Vector2(3.0, 3.0). The view zooms as I intended, but the change doesn't affect the viewport's transform: the event's translated position comes out as if there were no zoom at all. Camera2D doesn't expose a method for projecting or unprojecting points.
I can't figure out a way to find an event's global position when using a camera with zoom.
I also need to handle events from multiple touch points, so the helpers for finding global or relative mouse positions, and polling for pointer state instead of processing events, aren't an option.
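To make the multi-touch constraint concrete: each touch event carries its own index, so per-finger positions have to come from the events themselves rather than from a single-pointer helper like get_global_mouse_position() (again just a sketch, Godot 3.x names):

```gdscript
func _unhandled_input(event):
    if event is InputEventScreenTouch or event is InputEventScreenDrag:
        # Each finger has its own index; a single-pointer helper
        # can't distinguish them, so I must use event.position.
        var finger = event.index
        # This is the conversion that ignores the camera's zoom:
        var world_pos = get_viewport_transform().xform_inv(event.position)
```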