I don't have an example to hand, but if you have used analog inputs (from a controller) for movement before, it should be relatively easy to transform the touch data into the same format. You can get the continuous location of a touch event on the screen in the `_input(event)` function; the event should be of type `InputEvent.SCREEN_DRAG`, and the position of the event will be stored in `event.pos`.
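A minimal sketch of reading that position (Godot 2.x API, matching the identifiers above; `last_touch_pos` is a name I made up):

```gdscript
var last_touch_pos = Vector2()

func _ready():
    # Godot 2.x: input processing must be enabled explicitly
    set_process_input(true)

func _input(event):
    if event.type == InputEvent.SCREEN_DRAG:
        # event.pos holds the current finger position in screen coordinates
        last_touch_pos = event.pos
```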
You now have to convert this position to a direction vector with a maximum length of 1 (it can be shorter if you want players to be able to walk slower than max speed). How you do this conversion depends on how you get the input event.
The straightforward way is to take the global position, subtract the origin of your omni-directional touchpad, and then scale the resulting vector to the correct relative size. Another way (I'm not sure it would work, but I think it would be very elegant) is to grab the input position in the local coordinates of a correctly scaled touchpad node, so that it automatically has a max length of 1.
In both cases, be careful to check what happens at different resolutions!
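A sketch of the first (global-position) approach; `touchpad_origin` and `touchpad_radius` are hypothetical example values describing where your touchpad sits on screen and how far the finger must travel for full deflection:

```gdscript
var touchpad_origin = Vector2(100, 400)  # center of the touchpad (example value)
var touchpad_radius = 80.0               # distance for full deflection (example value)

func touch_to_direction(touch_pos):
    # Vector from the touchpad center to the finger, scaled so that
    # touching the rim gives length 1; clamp so it never exceeds 1.
    var dir = (touch_pos - touchpad_origin) / touchpad_radius
    if dir.length() > 1.0:
        dir = dir.normalized()
    return dir
```

Because this works in absolute screen coordinates, both values need to be adjusted (or computed from the viewport size) when the resolution changes.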
`InputEvent.SCREEN_DRAG` only updates on movement, so make sure to keep the last known position and only reset it when you get an `InputEvent.SCREEN_TOUCH` where `is_pressed() == false`. You also have to monitor `InputEvent.SCREEN_TOUCH` for the initial touch, but in that case only for `is_pressed() == true`.
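The touch/drag bookkeeping described above could look like this (a sketch; `touch_active` and `last_touch_pos` are names I chose):

```gdscript
var touch_active = false
var last_touch_pos = Vector2()

func _input(event):
    if event.type == InputEvent.SCREEN_TOUCH:
        if event.is_pressed():
            # initial touch: start tracking from here
            touch_active = true
            last_touch_pos = event.pos
        else:
            # finger lifted: reset the stored position
            touch_active = false
            last_touch_pos = Vector2()
    elif event.type == InputEvent.SCREEN_DRAG:
        # SCREEN_DRAG only fires while the finger moves,
        # so remember the last known position between events
        last_touch_pos = event.pos
```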
Once your direction vector is ready, simply scale it by some speed variable and the delta of the process function, then apply it to your character: either with `move()` if you use a `KinematicBody2D`, or by applying a force.
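For the kinematic case, the final step might look like this (a sketch; `SPEED` and `direction` are hypothetical, with `direction` being the length-≤1 vector built from the touch input):

```gdscript
const SPEED = 200.0  # pixels per second (example value)

func _ready():
    set_process(true)

func _process(delta):
    # direction has length <= 1, so movement speed scales with deflection
    var motion = direction * SPEED * delta
    move(motion)  # KinematicBody2D; for a RigidBody2D, apply a force instead
```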