Apple’s new Eye Tracking accessibility feature in iOS 18 and iPadOS 18 lets you control your iPhone or iPad with your eyes. No touch is needed to select icons or navigate apps: just look. Originally designed for users with limited motor control, it can also be handy in hands‑free situations. This article explains what Eye Tracking is, how to set it up, how to use it effectively, and how to troubleshoot common issues.
What is Eye Tracking?
- Eye Tracking uses the device’s front‑facing camera and on‑device machine learning to follow where you’re looking.
- A pointer highlights the item your gaze falls on; “dwell” (holding your gaze on an item for a short time) then selects or activates it.
- It integrates with AssistiveTouch, letting you perform gestures and actions (such as swipes or button presses) by gaze alone.
- All eye‑tracking data is processed on the device; nothing is uploaded to external servers.
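The dwell mechanic described above boils down to a timer: if your gaze stays on the same item long enough, the system treats it as a tap. The sketch below is a conceptual illustration only (plain Python, not Apple's implementation); the `DwellSelector` class and its timings are assumptions made for the example.

```python
class DwellSelector:
    """Conceptual sketch of dwell-to-select: if the gaze holds on the
    same target long enough, treat it as a tap. Not Apple's code."""

    def __init__(self, dwell_seconds=1.0):
        self.dwell_seconds = dwell_seconds  # how long a gaze must hold
        self.current_target = None          # item the eyes are on now
        self.gaze_start = None              # when the gaze landed there

    def update(self, target, now):
        """Feed in the item under the gaze each frame; return the item
        to 'tap' once the dwell threshold is reached, else None."""
        if target != self.current_target:
            # Gaze moved to a different item: restart the dwell timer.
            self.current_target = target
            self.gaze_start = now
            return None
        if target is not None and now - self.gaze_start >= self.dwell_seconds:
            self.gaze_start = now  # re-arm so we don't fire every frame
            return target
        return None
```

With `DwellSelector(dwell_seconds=0.8)`, feeding "Mail" at t = 0.0 and again at t = 0.9 returns "Mail" on the second call, which is why lengthening the dwell time (covered in setup below) reduces accidental selections.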
Which Devices & Requirements
- Requires iPhone or iPad running iOS 18 or iPadOS 18 (or later).
- Your device needs a capable front‑facing camera and enough processing power; newer models track more reliably.
- Good lighting, a steady device position, and a clean camera lens all improve reliability.
Step‑by‑Step Setup
- Ensure you have updated to iOS 18 / iPadOS 18.
- Go to Settings → Accessibility → Eye Tracking.
- Toggle Eye Tracking on.
- Follow on‑screen calibration: you’ll be asked to look at dots or targets as they appear in different parts of the screen. This helps the system map your eye movements.
- Once calibration is done, enable or adjust these options:
  - Dwell Control: how long you must hold your gaze on an item to select it.
  - Snap to Item: jumps the pointer to the nearest selectable item.
  - Auto‑Hide: hides the pointer when it’s not in use.
  - Smoothing: reduces jitter for steadier pointer movement.
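Smoothing of this kind is typically implemented as something like an exponential moving average over raw gaze samples, trading a little pointer lag for stability. The sketch below is a generic illustration of that idea, not Apple's actual filter; the `alpha` parameter and `smooth_gaze` helper are assumptions for the example.

```python
def smooth_gaze(points, alpha=0.3):
    """Generic exponential-moving-average smoothing for a stream of
    (x, y) gaze samples. Lower alpha means a smoother but laggier
    pointer. Illustrative only -- not Apple's actual algorithm."""
    sx, sy = points[0]
    smoothed = [(sx, sy)]
    for x, y in points[1:]:
        # Blend each new raw sample with the running estimate.
        sx = alpha * x + (1 - alpha) * sx
        sy = alpha * y + (1 - alpha) * sy
        smoothed.append((sx, sy))
    return smoothed
```

For example, with `alpha=0.5` a raw jump from (0, 0) to (10, 0) yields a smoothed step to (5.0, 0.0), which is exactly the "reduce jitter at the cost of responsiveness" trade‑off the Smoothing slider controls.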
How to Use Eye Tracking
- Move your eyes to navigate: as you look at app icons, buttons, or other items, they’ll highlight.
- Hold your gaze (“dwell”) to activate or tap.
- Use the AssistiveTouch menu to access gestures and hardware‑equivalent actions (e.g. adjusting volume, going Home, locking the screen).
- You can scroll by looking at certain edges or using AssistiveTouch scroll options.
- Adjust sensitivity or timings to suit your needs or comfort.
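The highlight behavior above, combined with Snap to Item, amounts to finding the selectable element nearest the gaze point. Here is a generic sketch of that lookup (the `snap_to_item` helper and its inputs are hypothetical, not an Apple API):

```python
import math

def snap_to_item(gaze, items):
    """Return the selectable item whose center is closest to the gaze
    point -- a generic illustration of 'Snap to Item', not Apple's
    code. `gaze` is an (x, y) point; `items` maps item names to their
    (x, y) screen centers."""
    return min(items, key=lambda name: math.dist(gaze, items[name]))
```

Snapping a slightly-off gaze at (10, 5) to an icon centered at (0, 0) rather than one at (100, 0) is why enabling Snap to Item makes imprecise tracking feel far more forgiving.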
Tips to Improve Accuracy & Comfort
- Keep your device steady — place it on a stand or table rather than holding.
- Position it about 30–50 cm from your face.
- Calibration in good lighting helps; avoid strong backlight or very dim light.
- Don’t let hair, thick glasses frames, or other objects obstruct the camera’s view of your eyes.
- Blink naturally, but avoid excessively rapid blinking during calibration.
Limitations & Common Problems
| Problem | Why It Happens | What To Do |
|---|---|---|
| Pointer doesn’t track well | Poor lighting or face not centered | Adjust lighting; position face properly |
| Selections misfire (you select wrong item) | Dwell timeout too fast | Increase dwell time; use Snap to Item |
| Fatigue / eye strain | Keeping gaze fixed is tiring | Take breaks; adjust timings; alternate with touch use |
| Device too sensitive to small eye/head movements | Smoothing set too low | Increase Smoothing; reduce pointer sensitivity |
How to Turn it Off or Reset
- To disable: Settings → Accessibility → Eye Tracking → toggle off.
- If tracking feels off, you can recalibrate by going through the calibration step again.
- If needed, reset the Eye Tracking settings to defaults.
Who Benefits Most, & Extra Uses
- Primarily for people with motor or physical disabilities preventing use of touch.
- Also helpful when hands are occupied (e.g. cooking, carrying objects).
- It can also be useful in educational and assistive settings, such as accessibility labs and adaptive‑technology training.
Conclusion
Apple’s Eye Tracking is a strong step forward in accessibility, bridging the gap for users who cannot use touch, or who prefer an alternate mode of interaction. With a few tweaks and patience during calibration, it can feel natural and useful. As Apple refines it, expect easier use and more reliable tracking on newer devices.
