Haptix Documentation
Haptix is a daemon + MCP server that lets AI coding agents see, touch, and control real iOS apps on physical devices over USB. Install it on your Mac, point your agent at localhost:4278/mcp, and plug in an iPhone or iPad; your agent can now control any app on the device. Adding the optional iOS SDK also gets you console output and annotations.
Quick links
- Setup Guide — Install and configure Haptix in under five minutes
- MCP Setup — Connect your AI agent (Cursor, Claude Code, VS Code, Windsurf, and more)
- MCP Tools Reference — The current MCP tools your agent can call
- CLI Reference — Control Haptix from your terminal (every MCP tool has a CLI mirror)
- System Requirements — Supported macOS, iOS, and device versions
- Compatibility — What works, what's partial, and what's broken
- Agent Guide — Strategies for AI agents navigating iOS devices
- Troubleshooting — Common issues and how to fix them
How it works
One required piece, one optional.
- Haptix CLI + daemon (required). Install with `curl -fsSL https://get.haptix.dev/install.sh | bash`. The `haptix` binary is the daemon, the MCP server, and a CLI mirror of every MCP tool. It runs as a launchd service, serves the MCP endpoint at `localhost:4278/mcp`, discovers iOS devices over USB, and deploys the on-device controller automatically on the first session. The daemon owns the license key (`haptix trial` or `haptix license <KEY>`).
- HaptixKit iOS SDK (optional). A Swift Package you embed in your own iOS app for deeper integration: your app's console output (`print`, `NSLog`, `os_log`), annotation overlays drawn on the device, the activity halo, and haptic feedback. Steps 1–3 in the Setup Guide work on any iOS app without it. The daemon already handles basic device control (touch, screenshots, accessibility tree, gestures); no in-app changes needed.
For a fresh install, use the runtime installer. If you are building your own iOS app and want deeper integration, add the SDK after the runtime is working.
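The fresh-install path can be sketched as a single shell function. This is a sketch only: `curl … | bash`, `haptix trial`, `haptix license <KEY>`, and `haptix devices` are the commands documented on this page; the guard that skips the installer when `haptix` already exists is our addition.

```shell
# One-shot setup sketch for a fresh Mac, using only commands from this page.
haptix_setup() {
  # Install the CLI + daemon (registers the launchd service) if missing.
  if ! command -v haptix >/dev/null 2>&1; then
    curl -fsSL https://get.haptix.dev/install.sh | bash
  fi

  haptix trial        # start the free trial (or: haptix license <KEY>)
  haptix devices      # confirm the daemon sees your USB-connected device
}
```

Re-running the function is a no-op for the installer step, since it is skipped whenever `haptix` is already on your PATH.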
The connection flow
- You connect an iPhone or iPad to your Mac with a USB cable
- `haptix devices` confirms the daemon sees it
- Your AI agent calls `start_session`, which auto-discovers the device and deploys the on-device controller
- The agent now has full control: fresh screenshots, gestures, draw paths, text input, accessibility tree
- If you've embedded HaptixKit in your own dev app, your `print()`/`NSLog`/`os_log` output streams to the agent in real time
MCP endpoint
`http://localhost:4278/mcp`
Any MCP-capable agent connects here via Streamable HTTP: Cursor, Claude Code, VS Code, Windsurf, Claude Desktop, Codex, OpenCode, or any custom MCP client.
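For example, registering the endpoint with Claude Code looks roughly like this. The syntax is from Claude Code's `claude mcp add` command; each client has its own registration flow, so check your agent's own docs.

```shell
# The endpoint every client points at:
HAPTIX_MCP_URL="http://localhost:4278/mcp"

# Register Haptix as a Streamable HTTP server in Claude Code (one example client).
if command -v claude >/dev/null 2>&1; then
  claude mcp add --transport http haptix "$HAPTIX_MCP_URL"
fi
```

Other clients (Cursor, VS Code, Windsurf, Claude Desktop) take the same URL in their MCP settings UI or JSON config.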
Everything available via MCP is also available through the haptix CLI — useful for scripting tests, debugging from your terminal, or driving CI without spinning up an agent.
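Because every MCP tool has a CLI mirror, a CI job can gate on device availability before running anything heavier. A sketch: only `haptix devices` is documented here, so the grep over its output is an assumption about its format.

```shell
# Fail fast in CI when the daemon can't see a device.
require_ios_device() {
  # Assumes `haptix devices` lists attached devices one per line.
  haptix devices 2>/dev/null | grep -qiE 'iphone|ipad'
}

# Example gate at the top of a CI script:
# require_ios_device || { echo "no iOS device attached" >&2; exit 1; }
```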
Get started
Install Haptix and start a free trial — no credit card required, no signup. Full access to all features.
- Getting Started — haptix.dev/getting-started
- Purchase — haptix.dev/plans
- Support — support@haptix.dev