Vibe coding just went native.
Your AI agent writes the Swift. Haptix lets it see, touch, and control the running app. Stop building blind — be a Visual Builder.
No credit card required
You can vibe code an iOS app.
But your agent is building blind.
AI agents can browse the web, write your code, and read your files. But they can't see or touch your running iOS app. Haptix changes that.
Go native. Skip the compromises.
Cross-platform frameworks got you this far. Swift takes you the rest of the way — and now AI can come along for the ride.
Faster. Smoother. Native.
Swift apps run directly on Apple silicon. No JavaScript bridge, no virtual DOM. Just raw performance and buttery-smooth 120fps animations.
Every Apple API. Day one.
ARKit, Core ML, HealthKit, StoreKit, Live Activities, Dynamic Island — Swift gives you the full platform. No waiting for wrapper libraries.
AI loves Swift.
Modern Swift is concise, type-safe, and well-documented. AI agents generate better Swift than most cross-platform code — and now they can test it too.
How it works
Three steps. One install command. One MCP config. Developer Mode on the device.
Install
One installer. The CLI is also the daemon and the MCP server. After install, haptix setup walks you through Camera permission and the launchd daemon. haptix trial gets you a free trial key — no signup, no credit card. The license lives in the daemon, not your code.
Drive it — from an agent or your shell
Plug Haptix into an AI agent via MCP (Cursor, Claude Code, VS Code, any MCP client) or call the daemon directly from your terminal. Same endpoints, same tools — every MCP tool has a matching haptix CLI subcommand. Pick what fits the job.
{
  "mcpServers": {
    "haptix": {
      "url": "http://localhost:4278/mcp"
    }
  }
}
Control any iOS app over USB
On the iPhone or iPad, enable Developer Mode and UI Automation first, then plug it in over USB. Your agent can take fresh screenshots, tap, swipe, drag, draw paths, type, pinch, rotate, and read the accessibility tree on any app you put on the device. SwiftUI, UIKit, React Native, Flutter, hybrid — doesn't matter how the app was built. The app doesn't even need to know Haptix exists.
Want deeper integration into your own app — console output, annotations, haptics? Add HaptixKit below.
Everything your agent needs
Full device control through MCP or the CLI. No compromise.
Full iOS Control
Tap, swipe, drag, draw continuous paths, pinch, rotate, long press — every gesture your users make, driven by AI on real iPhones and iPads over USB.
Any iOS app
SwiftUI, UIKit, React Native, Flutter, Capacitor, hand-coded, vibe-coded — if it's a valid iOS app running on a connected device, Haptix drives it. The app doesn't need to know Haptix exists.
Live Screenshots
Screenshots are fresh-or-fail — your agent gets the current frame or an explicit error, never a stale image. Annotated captures overlay every interactive element with labels, identifiers, and coordinates.
Accessibility Tree
Semantic UI understanding — labels, values, traits, frames. Your agent knows what's on screen, not just pixels.
USB-Only Discovery
Plug in your device and Haptix finds it automatically. No Bonjour, no LAN broadcasting, no IP addresses, no firewall prompts. Just enable Developer Mode and UI Automation.
Connection Security
USB-only transport, Curve25519 app authentication, license validation. Physical cable equals consent. Nothing leaves your machine.
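The Curve25519 authentication mentioned above can be sketched with Apple's CryptoKit. This is an illustration of the primitive only — the key names and salt below are hypothetical, and Haptix's actual handshake, salts, and key storage are not documented here:

```swift
import CryptoKit
import Foundation

// Illustrative sketch of Curve25519 key agreement.
// "appKey" / "daemonKey" and the salt are hypothetical names,
// not Haptix's real protocol.
func sharedKeysMatch() -> Bool {
    let appKey = Curve25519.KeyAgreement.PrivateKey()
    let daemonKey = Curve25519.KeyAgreement.PrivateKey()

    // Each side combines its own private key with the other's public key.
    guard
        let appSecret = try? appKey.sharedSecretFromKeyAgreement(with: daemonKey.publicKey),
        let daemonSecret = try? daemonKey.sharedSecretFromKeyAgreement(with: appKey.publicKey)
    else { return false }

    // Both sides derive a symmetric key from their shared secret;
    // the two keys must be identical for the session to proceed.
    let salt = Data("illustrative-salt".utf8)
    let a = appSecret.hkdfDerivedSymmetricKey(
        using: SHA256.self, salt: salt, sharedInfo: Data(), outputByteCount: 32)
    let b = daemonSecret.hkdfDerivedSymmetricKey(
        using: SHA256.self, salt: salt, sharedInfo: Data(), outputByteCount: 32)
    return a == b
}
```

Because the private keys never cross the cable, an eavesdropper on either side of the USB link cannot reconstruct the session key.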
Multi-Device
Connect multiple devices without ambiguity. If Haptix can't be certain which attached device a capture came from, it fails clearly instead of showing another device's pixels.
Intelligent Wait
Handles device latency transparently. Automatic wait-and-verify ensures gestures land before reporting success.
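The wait-and-verify pattern can be sketched in a few lines of Swift. This is a generic illustration of the idea — perform an action, then poll a check until it passes or a deadline expires — not Haptix's internal implementation, and the function name is hypothetical:

```swift
import Foundation

// Generic wait-and-verify: run an action, then poll until the
// verification closure passes or the timeout elapses.
// Names and defaults here are illustrative, not Haptix's own.
func performAndVerify(
    action: () -> Void,
    verified: () -> Bool,
    timeout: TimeInterval = 2.0,
    pollInterval: TimeInterval = 0.05
) -> Bool {
    action()
    let deadline = Date().addingTimeInterval(timeout)
    while Date() < deadline {
        if verified() { return true }          // gesture landed
        Thread.sleep(forTimeInterval: pollInterval)
    }
    return verified()                          // final check at the deadline
}
```

The payoff for an agent: "success" means the UI actually changed, not merely that the tap event was sent.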
Any MCP Agent
Works with Cursor, Claude Code, VS Code, Claude Desktop, Windsurf, and any client that speaks MCP.
Get Haptix
One installer on your Mac. One optional Swift Package in your app. You'll be running in minutes.
Haptix CLI + daemon
macOS 15.0+ · Apple Silicon · Camera permission required
The MCP server. Discovers your USB device automatically, runs as a launchd daemon, mirrors every MCP tool as a CLI command. Includes a free trial — no credit card required.
Before you install
- Install full Xcode and launch it once. Command Line Tools alone are not enough.
- On your iPhone or iPad, enable Developer Mode and UI Automation.
- During setup, grant Camera permission so Haptix can capture the device screen over USB.
Recommended
$ curl -fsSL https://get.staging.haptix.dev/install.sh | bash

After install: haptix setup walks you through Camera permission and license activation. Run haptix trial for a free trial key, or paste a purchased key with haptix license <KEY>.
HaptixKit
iOS 18.0+ · Swift Package Manager
Embed in your own iOS app for deeper integration: console output (print(), os_log), annotation overlays, the activity halo, and haptic feedback. Not required for basic device control — that's already in the daemon.
Package URL
https://github.com/haptix-dev/HaptixKit

- In Xcode, go to File → Add Package Dependencies
- Paste the URL above and click Add Package
Looking for a specific version? Browse all previous releases
Simple, transparent pricing
Try Haptix free before you buy. No credit card required.
Buy once, own it forever. Each license includes 1 seat (1 developer) with 2 machine activations — e.g. laptop + desktop.
Standard License
One-time purchase · 1 year of updates
Everything in Haptix today. All features, all updates, for the current major version.
Your license is perpetual — the app keeps working after the update year ends. Renew anytime to get another year.
Lifetime License
One-time purchase · Updates forever
Everything in Haptix, forever. Every major version — current and future — for one price.
Team Licenses
Central license management for your whole team. One account, all your seats.
Each seat = 1 developer = 2 machines. Manage activations, reassign seats, and track usage from a single dashboard.
Team Standard License
One-time purchase · 1 year of updates
Minimum 10 seats · 20 machines
Your license is perpetual — the app keeps working after the update year ends. Renew anytime to get another year.
Team Lifetime License
One-time purchase · Updates forever
Minimum 10 seats · 20 machines
7-day free trial. Prices in USD. Secure checkout by Stripe. Billing policy
Want deeper integration? Add HaptixKit.
HaptixKit is a Swift Package you embed in your own iOS app to give your agent extra context that the daemon alone can't see.
Console output
Your print(), NSLog, os_log, framework warnings, runtime errors — read by the agent in real time.
Annotation overlays
The agent can draw bounding boxes, grids, crosshairs, and custom shapes directly on the device for spatial reasoning.
Activity halo
A subtle border glows on the device while the agent is working — the developer always knows when the agent is in control.
Haptic feedback
Each agent action plays a haptic on the device — light for taps, medium for swipes, selection-changed for typing.
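The console-output capture above works with ordinary Swift logging — nothing special is required in your code. A minimal sketch, using Apple's os.Logger; the subsystem, category, and function below are hypothetical examples, not part of HaptixKit's API:

```swift
import os

// Ordinary app logging — the kind of output HaptixKit mirrors to the agent.
// Subsystem/category strings here are placeholders.
let logger = Logger(subsystem: "com.example.myapp", category: "checkout")

func applyCoupon(_ code: String) -> Bool {
    guard code == "SAVE10" else {
        // os_log-family output, readable by the agent in real time
        logger.error("Invalid coupon code: \(code, privacy: .public)")
        return false
    }
    print("Coupon applied: \(code)")   // plain print() is captured too
    return true
}
```

When the agent taps a button and a runtime error is logged, it sees that error immediately instead of guessing from pixels.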
Add the Swift Package, then add one line to your app's init()
#if DEBUG
import HaptixKit
#endif

@main
struct MyApp: App {
    init() {
        #if DEBUG
        Haptix.start()
        #endif
    }

    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}
No license argument needed — the daemon already has it from haptix trial or haptix license. Wrap in #if DEBUG to keep HaptixKit out of production builds.