Your desktop for AI-powered coding
Run Claude Code, Codex, and Gemini side-by-side with shared MCP knowledge, isolated git worktrees, remote SSH, and an integrated local LLM runtime that downloads GGUF models, assists with tasks, and chats in the sidebar.
Features
Everything you need in one workspace
Claude Code, Codex, Gemini, and your local GGUF models collaborate through shared knowledge, isolated worktrees, and a unified task scheduler.
Multi-Agent Workspace
Run Claude Code, Codex, and Gemini side-by-side in the same task. Each gets its own terminal tab with persistent session restore.
Shared MCP Knowledge
All agents share persistent memories, plans, style guides, and a subtask scheduler — scoped per project and task.
Built-In Local LLM
Use the bundled llama.cpp runner to download GGUF models, test them in-app, and keep a library of installed local models.
Choose the Assistant Model
Set a default local assistant model separately from the test model so task assist, repo-context summaries, and local chat use the right GGUF.
Git Worktree Isolation
Each task runs in its own worktree and branch. A reserve pool makes task creation instant.
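Under the hood, this style of isolation maps onto Git's own worktree mechanism. A minimal sketch of the idea (the task and path names here are illustrative, not Pylon's actual naming scheme):

```shell
# Give a task its own sibling checkout on its own branch, so agents in
# different tasks never touch the same working files.
git -C myrepo worktree add ../myrepo-task-42 -b task-42

# Agents for task 42 work in ../myrepo-task-42; the main checkout and
# other task worktrees are unaffected.
git -C myrepo worktree list

# When the task is done, the worktree can be removed; the branch stays.
git -C myrepo worktree remove ../myrepo-task-42
```

Because each worktree is a full checkout of its own branch, agents can edit, build, and commit in parallel without stepping on each other.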
Remote SSH
Connect to remote servers and run all agents over SSH with automatic reconnect and terminal state restore.
Subtask Scheduler
Break plans into subtasks. All agents read and update the same shared checklist from the sidebar.
Controlled Local AI Actions
Local models can propose scoped subtasks, memories, and task notes through a restricted bridge instead of raw, unrestricted MCP write access.
Sidebar Local Chat
Open a dedicated Local LLM tab in the right sidebar and chat against the active task context when local AI is enabled.
Shared Timeline & Handoffs
Track task activity, subtask progress, and recent commits in one stream, then copy a structured handoff artifact in one click.
File Editor & Git UI
Syntax highlighting, diff editor, commit graph, and GitHub issue linking — all built in.
Multi-Project Tasks
Manage multiple projects with multiple tasks each. Quick-switch with configurable keyboard shortcuts.
Session Persistence
Terminal state, agent tabs, and conversations survive restarts. Snapshots ensure nothing is lost.
Faster, Safer Runtime
Non-blocking project creation, hardened crash handling, and lazy-loaded terminal UI keep startup responsive and behavior predictable.
Cross Platform
Available for macOS, Linux, and Windows with native performance on every OS.
Workflow
From project to production
A project has tasks. Each task has its own worktree, branch, agent sessions, and optional local AI context.
Open a project
Add a local repo or connect to a remote server over SSH.
Create a task
Each task gets its own git worktree, branch, and agent tabs — fully isolated.
Run your agents
Open Claude Code, Codex, and Gemini side-by-side. All share the same MCP knowledge layer.
Enable local AI
Download a GGUF in Settings, pick a default assistant model, and turn on the local LLM workflow.
Plan and execute
Create a plan, break it into subtasks, and let the local model assist with subtasks, memories, and task notes.
Chat and ship
Use the Local LLM sidebar chat for scoped help, then stage changes, review diffs, visualize the commit graph, and push.
Download
Get Pylon for your platform
Free and open source. MIT licensed. Includes managed local-model downloads, default assistant model selection, and the new sidebar local chat workflow.
macOS
Not available yet
Coming Soon
Linux
Not available yet
Coming Soon
Windows
Not available yet
Coming Soon