Getting Started

Five steps to control any iOS app from your AI agent, plus one optional step for deeper integration with an app you develop yourself.

Step 1: Install Haptix on your Mac

The haptix binary is the daemon, the MCP server, and a CLI mirror of every MCP tool. Install with the one-shot script.

Before you install

  • Install full Xcode and launch it once. Command Line Tools alone are not enough.
  • On your iPhone or iPad, enable Developer Mode and UI Automation.
  • During setup, grant Camera permission so Haptix can capture the device screen during sessions.
Terminal
curl -fsSL https://get.haptix.dev/install.sh | bash

Requires macOS 15.0+ (Sequoia) and Apple Silicon. haptix setup walks through macOS Camera permission (required for screen capture during sessions) and the launchd daemon.
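The two stated requirements can be checked from a script before installing. A minimal sketch; the `meets_requirements` helper is my own, not part of Haptix, and `sw_vers`/`uname` are the standard macOS tools:

```shell
# Sketch: check the stated requirements (macOS 15.0+ on Apple Silicon).
meets_requirements() {  # args: macos_version arch
  major=${1%%.*}
  [ "${major:-0}" -ge 15 ] && [ "$2" = "arm64" ]
}

if meets_requirements "$(sw_vers -productVersion 2>/dev/null)" "$(uname -m)"; then
  echo "supported"
else
  echo "unsupported: need macOS 15+ on Apple Silicon"
fi
```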

Note: If haptix setup reports cameraPermissionDenied, open System Settings → Privacy & Security → Camera, enable Haptix (or your terminal app), then run haptix stop && haptix start. Camera permission is a hard requirement — there is no degraded path.
Note: If Xcode is installed but Haptix still cannot see xcodebuild or devicectl, select full Xcode and accept its required components.
Terminal
sudo xcode-select -s /Applications/Xcode.app/Contents/Developer
sudo xcodebuild -license accept
sudo xcodebuild -runFirstLaunch
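To tell full Xcode apart from Command-Line-Tools-only before running the commands above, you can classify the selected developer directory. A sketch; the `classify_dev_dir` helper is my own, and the paths are the macOS defaults:

```shell
# Sketch: is full Xcode or only the Command Line Tools selected?
classify_dev_dir() {
  case "$1" in
    */Xcode.app/*) echo "full-xcode" ;;        # e.g. /Applications/Xcode.app/Contents/Developer
    */CommandLineTools*) echo "clt-only" ;;    # CLT alone is not enough for Haptix
    *) echo "none" ;;
  esac
}

classify_dev_dir "$(xcode-select -p 2>/dev/null || true)"
```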
Step 2: Get a license key

Start a free trial — no signup, no credit card. One trial per machine.

Terminal
haptix trial

Got a purchased key from /plans? Activate it with haptix license HPTX-XXXX-XXXX-XXXX. The key lives in the daemon — your iOS app never sees it.

Step 3: Prepare your iPhone or iPad

Before Haptix can control a physical iPhone or iPad (iOS 18+), enable two settings on the device. You cannot just plug in any phone and start driving it; iOS requires Developer Mode first.

On the device

  1. Open Settings → Privacy & Security → Developer Mode, then turn it on.
  2. Confirm the restart. The device reboots once before control is available.
  3. After the reboot, open Settings → Developer → Enable UI Automation and turn it on.
Note: Physical devices only. Haptix works with connected iPhone and iPad hardware, not the iOS Simulator.
Step 4: Connect via USB

Plug your device into your Mac with a USB cable (Lightning or USB-C). Haptix discovers USB devices automatically — no IP addresses, no Bonjour, no firewall prompts. Make sure the cable supports data; some charge-only cables won't work.
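If the device doesn't show up, it's worth confirming that macOS itself sees it over USB before suspecting Haptix. A sketch using Apple's `devicectl` (ships with Xcode 15+); the fallback message is for machines without it:

```shell
# Sketch: list physical devices macOS can see over USB.
out=$(xcrun devicectl list devices 2>/dev/null || echo "devicectl unavailable: install full Xcode")
echo "$out"
```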

Step 5: Configure your AI agent

Add the Haptix MCP server to your AI agent's configuration. Pick your agent below and copy the config:

.cursor/mcp.json
{
  "mcpServers": {
    "haptix": {
      "url": "http://localhost:4278/mcp"
    }
  }
}
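Before pointing your agent at the server, you can verify the daemon is actually listening. A sketch; 4278 is the port from the config above, and `000` is curl's status code when it cannot connect:

```shell
# Sketch: is the Haptix daemon answering on its MCP endpoint?
status=$(curl -s -o /dev/null -w '%{http_code}' --max-time 2 http://localhost:4278/mcp 2>/dev/null || true)

if [ -z "$status" ] || [ "$status" = "000" ]; then
  echo "daemon not reachable: run haptix start"
else
  echo "daemon answered with HTTP $status"
fi
```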
You're ready. Ask your agent to start a session. It can now take screenshots, tap, swipe, type, pinch, rotate, and read the accessibility tree in any app on your device.
Optional: Add HaptixKit for deeper integration

Steps 1–5 work on any iOS app. To give your agent extra context that the daemon alone can't see — your app's console output (print(), NSLog, os_log), annotation overlays, the activity halo, haptic feedback — embed HaptixKit in your own app.

  1. Go to File → Add Package Dependencies
  2. Paste the package URL below
  3. Click Add Package

Package URL

https://github.com/haptix-dev/HaptixKit

Then add one line to your app's init(), wrapped in #if DEBUG:

MyApp.swift
#if DEBUG
import HaptixKit
#endif

@main
struct MyApp: App {
    init() {
        #if DEBUG
        Haptix.start()
        #endif
    }

    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}

No license argument — the daemon already has it from haptix trial or haptix license. Wrap in #if DEBUG to keep HaptixKit out of production builds: App Review may reject apps that include development tools, and shipping it would expose a remote control surface to your users.
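If you want to double-check that the DEBUG guards worked, you can scan a Release binary's symbols for the framework. A sketch; the `has_haptix_symbols` helper and the app path are my own examples, and `nm -um` is the macOS symbol lister:

```shell
# Sketch: confirm HaptixKit stayed out of a Release build.
has_haptix_symbols() {  # arg: nm output
  printf '%s\n' "$1" | grep -qi haptix
}

syms=$(nm -um "MyApp.app/MyApp" 2>/dev/null || true)  # example path to a Release binary
if has_haptix_symbols "$syms"; then
  echo "HaptixKit is linked: check your #if DEBUG guards"
else
  echo "no Haptix symbols found"
fi
```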