Touch Input Best Practices for Unity Mobile Games

Everything you need to know about touch input in Unity — from the New Input System to gesture detection, virtual joysticks, haptics, and multi-touch handling.

Touch input is the foundation of every mobile game, yet it's one of the most commonly botched systems. A game can have brilliant mechanics and gorgeous art, but if the touch controls feel sluggish, imprecise, or unresponsive, players will uninstall within seconds. This guide covers everything you need to build touch input that feels native and polished in Unity.

TL;DR: Use legacy Input for simple mobile games and the New Input System for complex cross-platform projects. Key techniques include swipe detection with distance and time thresholds, pinch-to-zoom via two-finger delta tracking, virtual joysticks (fixed, dynamic, or floating), and haptic feedback for polish. Always handle TouchPhase.Canceled, test on real devices, and keep touch targets at least 44x44 points. This guide provides copy-paste C# implementations for all of these.

We'll walk through the New Input System versus legacy Input, gesture detection algorithms, virtual joystick design, haptic feedback, multi-touch handling, and the common mistakes that plague mobile games. Every pattern comes with production-ready C# code. For a complete touch controls package, check out our Mobile Controls Kit.

New Input System vs Legacy Input

Unity offers two input systems: the legacy Input class (built-in, always available) and the New Input System package (installed separately via Package Manager). For mobile games in 2026, here's the honest comparison:

  • Legacy Input: Simpler API, less setup, works immediately. Input.GetTouch(0) gives you everything you need for basic touch. Great for hyper-casual games and prototypes.
  • New Input System: Action-based, event-driven, supports input remapping. Better for complex games with multiple input modes (touch + gamepad). More boilerplate but more maintainable at scale.
  • Recommendation: Use legacy Input for simple mobile-only games (1-2 touch actions). Use the New Input System for games that need gamepad support, complex gesture combinations, or cross-platform input handling.

Both systems can coexist — Unity's Player Settings lets you enable both simultaneously. For the code examples in this article, we'll use legacy Input for clarity, but the concepts apply to both systems.
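
For reference, here is roughly what the same per-touch loop looks like with the New Input System's EnhancedTouch API. Treat this as a minimal sketch: it requires the Input System package to be installed, and `EnhancedTouchSupport.Enable()` must be called before any touches are read.

C#
using UnityEngine;
using UnityEngine.InputSystem.EnhancedTouch;
using Touch = UnityEngine.InputSystem.EnhancedTouch.Touch;
using TouchPhase = UnityEngine.InputSystem.TouchPhase;

public class EnhancedTouchExample : MonoBehaviour
{
    // EnhancedTouch is opt-in; enable it before reading Touch.activeTouches
    void OnEnable() => EnhancedTouchSupport.Enable();
    void OnDisable() => EnhancedTouchSupport.Disable();

    void Update()
    {
        foreach (Touch touch in Touch.activeTouches)
        {
            if (touch.phase == TouchPhase.Began)
                Debug.Log($"Touch {touch.touchId} began at {touch.screenPosition}");
        }
    }
}

Note that the New Input System has its own TouchPhase enum and uses touchId and screenPosition in place of the legacy fingerId and position.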

Basic Touch Detection: Phases and Touch Count

Every touch event in Unity has a phase that tells you where the finger is in its lifecycle. Understanding touch phases is critical for building responsive controls:

C#
using UnityEngine;

public class TouchDetector : MonoBehaviour
{
    void Update()
    {
        if (Input.touchCount == 0) return;

        for (int i = 0; i < Input.touchCount; i++)
        {
            Touch touch = Input.GetTouch(i);

            switch (touch.phase)
            {
                case TouchPhase.Began:
                    // Finger just touched the screen
                    OnTouchStart(touch);
                    break;

                case TouchPhase.Moved:
                    // Finger is moving on the screen
                    OnTouchMove(touch);
                    break;

                case TouchPhase.Stationary:
                    // Finger is touching but not moving
                    OnTouchHold(touch);
                    break;

                case TouchPhase.Ended:
                    // Finger lifted off the screen
                    OnTouchEnd(touch);
                    break;

                case TouchPhase.Canceled:
                    // Touch was interrupted (phone call, etc.)
                    OnTouchCanceled(touch);
                    break;
            }
        }
    }

    private void OnTouchStart(Touch touch)
    {
        Debug.Log($"Touch {touch.fingerId} started at {touch.position}");
    }

    private void OnTouchMove(Touch touch)
    {
        Debug.Log($"Touch {touch.fingerId} moved by {touch.deltaPosition}");
    }

    private void OnTouchHold(Touch touch)
    {
        Debug.Log($"Touch {touch.fingerId} holding at {touch.position}");
    }

    private void OnTouchEnd(Touch touch)
    {
        Debug.Log($"Touch {touch.fingerId} ended at {touch.position}");
    }

    private void OnTouchCanceled(Touch touch)
    {
        Debug.Log($"Touch {touch.fingerId} was canceled");
    }
}

Key points: touch.fingerId is stable across the lifetime of a touch — use it to track specific fingers. touch.deltaPosition gives you per-frame movement in pixels. touch.position is in screen space (pixels from bottom-left).

Gesture Detection: Swipe Recognition

Swipe detection is one of the most common input needs in mobile games — lane switching, card swiping, menu navigation. The algorithm is simple: track where the finger started, where it ended, whether the movement was fast enough, and which direction it moved most. Our Swipe Input Controller provides a production-ready version, but here's the core algorithm:

C#
using UnityEngine;
using UnityEngine.Events;

public class SwipeDetector : MonoBehaviour
{
    [SerializeField] private float minSwipeDistance = 50f;
    [SerializeField] private float maxSwipeTime = 0.5f;

    public UnityEvent<Vector2> onSwipe; // Normalized direction

    private Vector2 startPosition;
    private float startTime;

    void Update()
    {
        if (Input.touchCount == 0) return;

        Touch touch = Input.GetTouch(0);

        switch (touch.phase)
        {
            case TouchPhase.Began:
                startPosition = touch.position;
                startTime = Time.time;
                break;

            case TouchPhase.Ended:
                float elapsed = Time.time - startTime;
                if (elapsed > maxSwipeTime) return;

                Vector2 delta = touch.position - startPosition;
                if (delta.magnitude < minSwipeDistance) return;

                // Determine primary direction
                Vector2 direction;
                if (Mathf.Abs(delta.x) > Mathf.Abs(delta.y))
                    direction = delta.x > 0 ? Vector2.right : Vector2.left;
                else
                    direction = delta.y > 0 ? Vector2.up : Vector2.down;

                onSwipe?.Invoke(direction);
                break;
        }
    }
}

The minSwipeDistance prevents accidental swipes from slight finger movement. The maxSwipeTime distinguishes swipes (fast) from drags (slow). Tune these values on actual devices — what feels right on a 6-inch phone is different from a 10-inch tablet.

Pinch-to-Zoom Implementation

Pinch-to-zoom requires tracking two fingers and measuring the distance between them over time. When the distance increases, the player is zooming in; when it decreases, zooming out. Our Pinch to Zoom Camera script provides a camera-integrated version. Here's the core logic:

C#
using UnityEngine;

public class PinchZoom : MonoBehaviour
{
    [SerializeField] private float zoomSpeed = 0.01f;
    [SerializeField] private float minZoom = 2f;
    [SerializeField] private float maxZoom = 10f;

    private Camera cam;

    void Awake()
    {
        cam = GetComponent<Camera>();
    }

    void Update()
    {
        if (Input.touchCount != 2) return;

        Touch touch0 = Input.GetTouch(0);
        Touch touch1 = Input.GetTouch(1);

        // Calculate previous positions
        Vector2 prevPos0 = touch0.position - touch0.deltaPosition;
        Vector2 prevPos1 = touch1.position - touch1.deltaPosition;

        // Calculate distance between fingers
        float prevDistance = (prevPos0 - prevPos1).magnitude;
        float currentDistance = (touch0.position - touch1.position).magnitude;

        // Calculate zoom delta
        float deltaMagnitude = currentDistance - prevDistance;

        if (cam.orthographic)
        {
            cam.orthographicSize -= deltaMagnitude * zoomSpeed;
            cam.orthographicSize = Mathf.Clamp(cam.orthographicSize, minZoom, maxZoom);
        }
        else
        {
            cam.fieldOfView -= deltaMagnitude * zoomSpeed;
            cam.fieldOfView = Mathf.Clamp(cam.fieldOfView, minZoom * 10f, maxZoom * 10f);
        }
    }
}

Virtual Joystick Design

Virtual joysticks are essential for games that need analog directional input — twin-stick shooters, top-down RPGs, platformers. There are three main design patterns, each with distinct trade-offs:

  • Fixed joystick: Always visible at a set position. The player knows exactly where to place their thumb. Best for games where joystick usage is constant (twin-stick shooters). Downside: covers part of the screen permanently.
  • Dynamic joystick: Appears wherever the player first touches the screen, then stays fixed until they lift their finger. Best for games with intermittent movement (strategy, puzzle). Downside: slight visual delay on first touch.
  • Floating joystick: Follows the player's finger, re-centering if the finger moves beyond the joystick radius. Feels the most natural but can drift across the screen. Best for casual games with simple movement.

Our Touch Joystick script supports all three modes with a simple enum toggle. For most games, we recommend starting with the dynamic joystick — it balances visibility with screen real estate.
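
As a rough sketch of the dynamic pattern, here is the core logic with no UI visuals attached. The class name, field names, and radius default are our own choices; a production version would also render the joystick graphics and apply a dead zone.

C#
using UnityEngine;

public class DynamicJoystick : MonoBehaviour
{
    [SerializeField] private float joystickRadius = 100f; // pixels

    private Vector2 origin;   // where the finger first landed
    private bool active;

    // Normalized output in [-1, 1] per axis; Vector2.zero when idle
    public Vector2 Value { get; private set; }

    void Update()
    {
        if (Input.touchCount == 0)
        {
            active = false;
            Value = Vector2.zero;
            return;
        }

        Touch touch = Input.GetTouch(0);

        switch (touch.phase)
        {
            case TouchPhase.Began:
                origin = touch.position; // joystick appears under the finger
                active = true;
                break;

            case TouchPhase.Moved:
            case TouchPhase.Stationary:
                if (!active) break;
                Vector2 offset = touch.position - origin;
                // Clamp to the joystick radius, then scale into [-1, 1]
                Value = Vector2.ClampMagnitude(offset, joystickRadius) / joystickRadius;
                break;

            case TouchPhase.Ended:
            case TouchPhase.Canceled:
                active = false;
                Value = Vector2.zero;
                break;
        }
    }
}

Switching this to the fixed pattern means hard-coding origin instead of capturing it on TouchPhase.Began; the floating pattern additionally moves origin toward the finger when the offset exceeds joystickRadius.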

Haptic Feedback

Haptic feedback (vibration) adds a tactile dimension to touch interactions that makes games feel dramatically more polished. A subtle vibration on button press, a heavier pulse on taking damage, and a sharp buzz on a game-over screen all contribute to the visceral feel of a game.

C#
using UnityEngine;
#if UNITY_IOS
using System.Runtime.InteropServices;
#endif

public static class HapticFeedback
{
    // Basic vibration — works on both Android and iOS
    public static void Vibrate()
    {
#if UNITY_ANDROID || UNITY_IOS
        Handheld.Vibrate();
#endif
    }

#if UNITY_IOS
    // iOS Taptic Engine — more nuanced haptics.
    // Note: _playHaptic is not a Unity API. It must be implemented in a
    // native iOS plugin (e.g. an Objective-C file in Assets/Plugins/iOS
    // that calls UIImpactFeedbackGenerator); it is not included here.
    [DllImport("__Internal")]
    private static extern void _playHaptic(int type);

    /// <summary>
    /// iOS only: 0 = light, 1 = medium, 2 = heavy
    /// </summary>
    public static void PlayiOSHaptic(int intensity)
    {
        if (Application.platform == RuntimePlatform.IPhonePlayer)
            _playHaptic(intensity);
    }
#endif

#if UNITY_ANDROID
    // Android: custom vibration duration via AndroidJavaObject.
    // Vibrator.vibrate(long) is deprecated since API 26 but still works;
    // newer code can pass a VibrationEffect instead.
    public static void VibrateAndroid(long milliseconds)
    {
        if (Application.platform != RuntimePlatform.Android) return;

        using (var unityPlayer = new AndroidJavaClass("com.unity3d.player.UnityPlayer"))
        using (var activity = unityPlayer.GetStatic<AndroidJavaObject>("currentActivity"))
        using (var vibrator = activity.Call<AndroidJavaObject>("getSystemService", "vibrator"))
        {
            vibrator.Call("vibrate", milliseconds);
        }
    }
#endif

    // Convenience methods for game events
    public static void LightTap()
    {
#if UNITY_IOS
        PlayiOSHaptic(0);
#elif UNITY_ANDROID
        VibrateAndroid(10);
#endif
    }

    public static void MediumImpact()
    {
#if UNITY_IOS
        PlayiOSHaptic(1);
#elif UNITY_ANDROID
        VibrateAndroid(30);
#endif
    }

    public static void HeavyImpact()
    {
#if UNITY_IOS
        PlayiOSHaptic(2);
#elif UNITY_ANDROID
        VibrateAndroid(50);
#endif
    }
}

Always provide a setting to disable haptics. Some players find vibrations annoying, and accessibility best practices require an opt-out. Store the preference in PlayerPrefs and check it before every haptic call.
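
One way to wire the opt-out is a small PlayerPrefs-backed settings class; the `haptics_enabled` key name and the guard pattern below are our own convention, not a Unity API.

C#
using UnityEngine;

public static class HapticSettings
{
    private const string PrefKey = "haptics_enabled";

    public static bool Enabled
    {
        get => PlayerPrefs.GetInt(PrefKey, 1) == 1; // default: on
        set
        {
            PlayerPrefs.SetInt(PrefKey, value ? 1 : 0);
            PlayerPrefs.Save();
        }
    }
}

// Then guard every haptic call:
// if (HapticSettings.Enabled) HapticFeedback.LightTap();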

Dead Zones and Thresholds

Dead zones are the most underrated element of touch input feel. A dead zone is a small area around the initial touch point where movement is ignored. Without dead zones, virtual joysticks feel twitchy — the slightest finger tremor moves the character. With properly tuned dead zones, the input feels stable and intentional.

The ideal dead zone size depends on the device. On a phone, 10-15 pixels works well. On a tablet, 15-25 pixels. You can also express dead zones as a percentage of screen width (about 1-2%) for resolution independence. The same principle applies to swipe thresholds — the minimum distance a finger must travel before a swipe is registered. Set it too low and you get false positives; set it too high and swipes feel unresponsive.
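
A radial dead zone with remapping might look like the sketch below. The remap keeps the output continuous, ramping from zero at the dead-zone edge to full magnitude at the joystick radius, instead of jumping the moment the threshold is crossed.

C#
using UnityEngine;

public static class DeadZone
{
    /// <summary>
    /// Applies a radial dead zone to a raw touch offset (in pixels).
    /// Returns zero inside the dead zone, otherwise the input direction
    /// scaled smoothly from 0 at deadZone to 1 at maxRadius.
    /// </summary>
    public static Vector2 Apply(Vector2 rawOffset, float deadZone, float maxRadius)
    {
        float magnitude = rawOffset.magnitude;
        if (magnitude < deadZone) return Vector2.zero;

        // Remap [deadZone, maxRadius] -> [0, 1] so there is no jump at the edge
        float t = Mathf.InverseLerp(deadZone, maxRadius, magnitude);
        return rawOffset.normalized * t;
    }
}

For resolution independence, derive deadZone from the screen rather than hard-coding pixels, e.g. Screen.width * 0.015f for the 1-2% guideline above.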

Multi-Touch Handling: Managing Finger IDs

Multi-touch games (dual joystick, pinch-zoom + pan, two-player same-device) require careful finger tracking. The key concept is fingerId — Unity assigns a stable integer ID to each finger from the moment it touches the screen until it lifts. Here's a robust multi-touch manager:

C#
using UnityEngine;
using System.Collections.Generic;

public class MultiTouchManager : MonoBehaviour
{
    private Dictionary<int, TouchData> activeTouches = new Dictionary<int, TouchData>();

    public struct TouchData
    {
        public Vector2 startPosition;
        public Vector2 currentPosition;
        public float startTime;
        public int assignedControl; // 0 = unassigned, 1 = left joystick, 2 = right joystick
    }

    [SerializeField] private float screenMidpoint = 0.5f; // Fraction of screen width

    void Update()
    {
        for (int i = 0; i < Input.touchCount; i++)
        {
            Touch touch = Input.GetTouch(i);
            int id = touch.fingerId;

            switch (touch.phase)
            {
                case TouchPhase.Began:
                    TouchData data = new TouchData
                    {
                        startPosition = touch.position,
                        currentPosition = touch.position,
                        startTime = Time.time,
                        assignedControl = AssignControl(touch.position)
                    };
                    activeTouches[id] = data;
                    break;

                case TouchPhase.Moved:
                case TouchPhase.Stationary:
                    if (activeTouches.TryGetValue(id, out var existing))
                    {
                        existing.currentPosition = touch.position;
                        activeTouches[id] = existing;
                        ProcessTouch(id, existing);
                    }
                    break;

                case TouchPhase.Ended:
                case TouchPhase.Canceled:
                    if (activeTouches.ContainsKey(id))
                    {
                        ReleaseControl(activeTouches[id].assignedControl);
                        activeTouches.Remove(id);
                    }
                    break;
            }
        }
    }

    private int AssignControl(Vector2 position)
    {
        // Left half of screen = movement, right half = aim/action
        float normalizedX = position.x / Screen.width;
        return normalizedX < screenMidpoint ? 1 : 2;
    }

    private void ProcessTouch(int fingerId, TouchData data)
    {
        Vector2 delta = data.currentPosition - data.startPosition;

        if (data.assignedControl == 1)
        {
            // Feed delta to movement joystick
        }
        else if (data.assignedControl == 2)
        {
            // Feed delta to aim/action joystick
        }
    }

    private void ReleaseControl(int controlId)
    {
        // Reset joystick position when finger lifts
    }
}

Common Mistakes to Avoid

After reviewing hundreds of mobile Unity games, these are the touch input mistakes that show up again and again:

  • Not testing on actual devices: The Unity editor simulates touch with mouse clicks, but mouse and finger input feel completely different. A button that's easy to click with a mouse cursor might be impossible to hit with a thumb. Always test on a physical phone.
  • Ignoring safe areas: Modern phones have notches, rounded corners, and home indicator bars that eat into screen space. Touch targets near screen edges may be unreachable. Use our Mobile Safe Area script to handle this automatically.
  • Touch targets too small: Apple's HIG recommends a minimum touch target of 44x44 points. At a typical 3x display scale, that's roughly 132x132 pixels. Buttons, joystick handles, and interactive elements must meet this minimum.
  • Using Input.mousePosition for mobile: Input.mousePosition only tracks one finger and behaves differently from touch on some devices. Use Input.GetTouch() for mobile builds.
  • No visual feedback on touch: When the player taps a button, it should visually respond immediately — scale down, change color, or pulse. Delayed feedback (waiting for the touch to end) makes the game feel sluggish.
  • Forgetting to handle TouchPhase.Canceled: Touches get canceled when the app loses focus (incoming call, notification pull-down). If you don't handle cancellation, joysticks can get stuck in the "pressed" state.
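
For the visual-feedback point above, a minimal press response on a uGUI element can be a simple scale-down on pointer down. This is a sketch; the class name and scale value are our own defaults, and it assumes the object lives under a Canvas with an EventSystem in the scene.

C#
using UnityEngine;
using UnityEngine.EventSystems;

public class PressFeedback : MonoBehaviour, IPointerDownHandler, IPointerUpHandler
{
    [SerializeField] private float pressedScale = 0.9f;

    private Vector3 originalScale;

    void Awake() => originalScale = transform.localScale;

    public void OnPointerDown(PointerEventData eventData)
    {
        // Respond the instant the touch begins, not when it ends
        transform.localScale = originalScale * pressedScale;
    }

    public void OnPointerUp(PointerEventData eventData)
    {
        transform.localScale = originalScale;
    }
}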

Building a Complete Touch System

The scripts and patterns in this article are building blocks. For a complete touch input solution, combine them based on your game's needs. Here's what we recommend as a starting stack:

  • A virtual joystick (fixed, dynamic, or floating) for analog movement
  • Swipe detection for discrete gestures like lane switching and menu navigation
  • Pinch-to-zoom for games with a zoomable camera
  • A haptic feedback wrapper with a PlayerPrefs opt-out
  • A multi-touch manager for games that use more than one control at a time
  • Safe area handling for notched and rounded-corner screens

All of these are available individually in our scripts library, or bundled together in the Mobile Controls Kit game system. For hyper-casual games specifically, the Build a Hyper-Casual Game collection pairs these input scripts with scoring, pooling, and game loop management.

Touch input is the most intimate interface between your player and your game. Every millisecond of latency, every pixel of dead zone, every frame of visual feedback matters. Test on real devices, test with real fingers, and iterate until it feels invisible — because the best touch input is the kind players never notice.