Built for Visual Builders

Vibe coding just went native.

Your AI agent writes the Swift. Haptix lets it see, touch, and control the running app. Stop building blind — be a Visual Builder.

No credit card required

Works with
Cursor
Claude
VS Code
Windsurf
Any MCP Client

You can vibe code an iOS app.
But your agent is building blind.

AI agents can browse the web, write your code, and read your files. But they can't see or touch your running iOS app. Haptix changes that.

Browse the Web
Write Code
Run Shell Commands
Read Files
Real iOS Devices ← Haptix

Go native. Skip the compromises.

Cross-platform frameworks got you this far. Swift takes you the rest of the way — and now AI can come along for the ride.

Faster. Smoother. Native.

Swift apps run directly on Apple silicon. No JavaScript bridge, no virtual DOM. Just raw performance and buttery-smooth 120fps animations.

Every Apple API. Day one.

ARKit, Core ML, HealthKit, StoreKit, Live Activities, Dynamic Island — Swift gives you the full platform. No waiting for wrapper libraries.

AI loves Swift.

Modern Swift is concise, type-safe, and well-documented. AI agents generate better Swift than most cross-platform code — and now they can test it too.

macOS support coming soon — vibe code native Mac apps too

How it works

Three steps. One install command. One MCP config. Developer Mode on the device.

01

Install

One installer. The CLI is also the daemon and the MCP server. After install, haptix setup walks you through granting Camera permission and registering the launchd daemon. haptix trial gets you a free trial key — no signup, no credit card. The license lives in the daemon, not your code.

Terminal
$ curl -fsSL https://get.staging.haptix.dev/install.sh | bash
$ haptix setup
$ haptix trial
Already have a license? haptix license HPTX-…
02

Drive it — from an agent or your shell

Plug Haptix into an AI agent via MCP (Cursor, Claude Code, VS Code, any MCP client) or call the daemon directly from your terminal. Same endpoints, same tools — every MCP tool has a matching haptix CLI subcommand. Pick what fits the job.

.cursor/mcp.json
{
  "mcpServers": {
    "haptix": {
      "url": "http://localhost:4278/mcp"
    }
  }
}
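Claude Code can register the same endpoint from the shell instead of a config file. A minimal sketch — flag syntax has varied across Claude Code releases, so verify with `claude mcp add --help`:

```shell
# Register the local Haptix daemon as an HTTP MCP server in Claude Code.
# --transport http tells Claude Code to speak MCP over HTTP to the given URL.
claude mcp add --transport http haptix http://localhost:4278/mcp
```

Either way, the agent ends up talking to the same daemon on port 4278.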
03

Control any iOS app over USB

On the iPhone or iPad, enable Developer Mode and UI Automation first, then plug it in over USB. Your agent can take fresh screenshots, tap, swipe, drag, draw paths, type, pinch, rotate, and read the accessibility tree on any app you put on the device. SwiftUI, UIKit, React Native, Flutter, hybrid — doesn't matter how the app was built. The app doesn't even need to know Haptix exists.

Terminal
$ haptix status
Haptix MCP Server
State: running
Port: 4278
Devices
iPhone 16 Pro — USB, connected

Want deeper integration into your own app — console output, annotations, haptics? Add HaptixKit ↓

Everything your agent needs

Full device control through MCP or the CLI. No compromise.

Full iOS Control

Tap, swipe, drag, draw continuous paths, pinch, rotate, long press — every gesture your users make, driven by AI on real iPhones and iPads over USB.

Any iOS app

SwiftUI, UIKit, React Native, Flutter, Capacitor, hand-coded, vibe-coded — if it's a valid iOS app running on a connected device, Haptix drives it. The app doesn't need to know Haptix exists.

Live Screenshots

Screenshots are fresh-or-fail — your agent gets a current frame or an explicit error, never a stale image. Annotated captures overlay every interactive element with labels, identifiers, and coordinates.

Accessibility Tree

Semantic UI understanding — labels, values, traits, frames. Your agent knows what's on screen, not just pixels.

USB-Only Discovery

Plug in your device and Haptix finds it automatically. No Bonjour, no LAN broadcasting, no IP addresses, no firewall prompts. Just enable Developer Mode and UI Automation.

Connection Security

USB-only transport, Curve25519 app authentication, license validation. Physical cable equals consent. Nothing leaves your machine.

Multi-Device

Connect multiple devices at once. If Haptix can't be certain which attached phone a capture came from, it fails clearly instead of showing another device's pixels.

Intelligent Wait

Handles device latency transparently. Automatic wait-and-verify ensures gestures land before reporting success.

Any MCP Agent

Works with Cursor, Claude Code, VS Code, Claude Desktop, Windsurf, and any client that speaks MCP.

Get Haptix

One installer on your Mac. One optional Swift Package in your app. You'll be running in minutes.

Haptix CLI + daemon

macOS 15.0+ · Apple Silicon · Camera permission required

The MCP server. Discovers your USB device automatically, runs as a launchd daemon, mirrors every MCP tool as a CLI command. Includes a free trial — no credit card required.

Before you install

  • Install full Xcode and launch it once. Command Line Tools alone are not enough.
  • On your iPhone or iPad, enable Developer Mode and UI Automation.
  • During setup, grant Camera permission so Haptix can capture the device screen over USB.

Recommended

$ curl -fsSL https://get.staging.haptix.dev/install.sh | bash

After install: haptix setup walks through Camera permission and license activation. Run haptix trial for a free trial key, or paste a purchased key with haptix license <KEY>.

Optional

HaptixKit

iOS 18.0+ · Swift Package Manager

Embed in your own iOS app for deeper integration: console output (print(), os_log), annotation overlays, the activity halo, and haptic feedback. Not required for basic device control — that's already in the daemon.

Package URL

https://github.com/haptix-dev/HaptixKit
  1. In Xcode, go to File → Add Package Dependencies
  2. Paste the URL above and click Add Package
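If you manage dependencies in a Package.swift manifest rather than the Xcode UI, the equivalent declaration looks roughly like this — the version requirement and target name are illustrative assumptions, not part of the official instructions:

```swift
// swift-tools-version:5.9
import PackageDescription

let package = Package(
    name: "MyApp",
    dependencies: [
        // "from: 1.0.0" is a placeholder — pin whichever HaptixKit release you need.
        .package(url: "https://github.com/haptix-dev/HaptixKit", from: "1.0.0")
    ],
    targets: [
        .target(
            name: "MyApp",
            dependencies: ["HaptixKit"]
        )
    ]
)
```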

Looking for a specific version? Browse all previous releases

Simple, transparent pricing

Try Haptix free before you buy. No credit card required.

Buy once, own it forever. Each license includes 1 seat (1 developer) with 2 machine activations — e.g. laptop + desktop.

Standard License

$69

One-time purchase · 1 year of updates

Everything in Haptix today. All features, all updates, for the current major version.

Your license is perpetual — the app keeps working after the update year ends. Renew anytime to get another year.

Best Value

Lifetime License

$299

One-time purchase · Updates forever

Everything in Haptix, forever. Every major version — current and future — for one price.

Team Licenses

Central license management for your whole team. One account, all your seats.

Each seat = 1 developer = 2 machines. Manage activations, reassign seats, and track usage from a single dashboard.

Team Standard License

$49 / seat

One-time purchase · 1 year of updates

Minimum 10 seats · 20 machines

Seats: 10
Machines: 20 (10 developers)
Total: $490

Your license is perpetual — the app keeps working after the update year ends. Renew anytime to get another year.

Team Lifetime License

$249 / seat

One-time purchase · Updates forever

Minimum 10 seats · 20 machines

Seats: 10
Machines: 20 (10 developers)
Total: $2,490

7-day free trial. Prices in USD. Secure checkout by Stripe. Billing policy

Optional

Want deeper integration? Add HaptixKit.

HaptixKit is a Swift Package you embed in your own iOS app to give your agent extra context that the daemon alone can't see.

Console output

Your print(), NSLog, os_log, framework warnings, runtime errors — read by the agent in real time.

Annotation overlays

The agent can draw bounding boxes, grids, crosshairs, and custom shapes directly on the device for spatial reasoning.

Activity halo

A subtle border glows on the device while the agent is working — the developer always knows when the agent is in control.

Haptic feedback

Each agent action plays a haptic on the device — light for taps, medium for swipes, selection-changed for typing.

1

Add the Swift Package, then add one line to your app's init()

MyApp.swift
#if DEBUG
import HaptixKit
#endif

@main
struct MyApp: App {
    init() {
        #if DEBUG
        Haptix.start()
        #endif
    }

    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}

No license argument needed — the daemon already has it from haptix trial or haptix license. Wrap in #if DEBUG to keep HaptixKit out of production builds.