OpenWarp

Bring any AI model into your terminal

OpenWarp is a community fork of Warp that opens up the AI layer. Keep the full Warp terminal experience — blocks, workflows, keymaps — while plugging in any OpenAI-compatible provider, customizing system prompts with minijinja templates, and keeping every credential local.

Simplified Chinese · Docs · Upstream Warp

⚠️ Early development. No official release yet. Not affiliated with Warp, Inc.


✨ Why OpenWarp

The official Warp client routes AI through Warp's cloud agent service. OpenWarp opens that layer entirely:

| | Upstream Warp | OpenWarp |
|---|---|---|
| AI provider | Warp gateway | Any OpenAI-compatible endpoint |
| Credentials | Cloud account | Local config file, never leaves device |
| System prompt | Server-assembled, opaque | minijinja templates, fully editable |
| UI language | English | English + Simplified Chinese, extensible |
| Cloud Agent / Computer Use | On by default | Off by default, fully local |
| Blocks / Workflows / Keymaps | ✓ | Fully preserved |
| License | AGPL-3.0 / MIT dual | Same as upstream |

🚀 Three steps to take AI fully into your own hands

01 · Plug in any provider
Paste a Base URL and API key in settings — any OpenAI Chat Completions–compatible endpoint works out of the box. Credentials are stored locally only.

02 · Author dynamic prompts
A minijinja-powered template engine renders the system prompt in real time based on the current working directory, language, and role.

03 · Use it in the terminal immediately
Switch models, conversations, and command suggestions with one click — the experience is identical to Warp, but every layer is yours.
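As an illustration of step 02, a system-prompt template in minijinja syntax might look like the sketch below. The variable names (`role`, `cwd`, `shell`) are hypothetical placeholders for illustration, not necessarily the context keys OpenWarp actually exposes — check the docs for the real template context.

```jinja
You are {{ role }}, an AI assistant running inside a terminal.
Current working directory: {{ cwd }}
{% if shell == "nu" %}
Prefer NuShell syntax in suggested commands.
{% else %}
Prefer POSIX shell syntax in suggested commands.
{% endif %}
```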

🧩 Verified providers

| Provider | Base URL | Notes |
|---|---|---|
| OpenAI | https://api.openai.com/v1 | Native protocol |
| Anthropic | via genai | Native Claude 4.x family |
| DeepSeek | https://api.deepseek.com/v1 | Thinking + tool calling |
| Gemini | via genai | Native Google AI Studio |
| Ollama | http://localhost:11434/v1 | Local inference, no key |
| OpenRouter | https://openrouter.ai/api/v1 | Aggregator gateway |
| Qwen / Groq / Together / LM Studio / any OpenAI-compatible proxy | | Configure and go |
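Every endpoint in the table speaks the same wire format. As a sketch, here is the minimal Chat Completions request body such a provider accepts at `POST {base_url}/chat/completions`; the model name and prompt are placeholders, not anything OpenWarp ships:

```python
import json

# Minimal OpenAI-style Chat Completions payload. The "llama3" model name
# and the prompt text are illustrative placeholders only.
payload = {
    "model": "llama3",
    "messages": [
        {"role": "system", "content": "You are a helpful terminal assistant."},
        {"role": "user", "content": "Explain `git cherry-pick` in one line."},
    ],
    "stream": True,  # ask the server for an SSE stream rather than one blob
}
body = json.dumps(payload)
```

The `stream: true` flag is what turns the response into the server-sent-event stream that incremental block rendering consumes.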

🔧 Core features

  • BYOP custom providers — six native protocols (OpenAI / OpenAIResp / Anthropic / Gemini / Ollama / DeepSeek) explicitly bound on top of genai 0.6
  • SSE streaming — incremental block rendering identical to Warp's first-party path
  • 18 local tools — shell / read / edit / search / mcp / drive docs / skills / ask, all executed locally
  • System prompt templates — eight model-family prompts ported from opencode (default / anthropic / gpt / beast / gemini / kimi / codex / trinity)
  • models.dev integration — searchable Providers subpage with thousands of preloaded model entries
  • Privacy first — Cloud Agent / Computer Use / Referral disabled by default; no telemetry
  • Warp experience preserved — continuously merged with upstream; Blocks, Workflows, AI commands, Keymaps and themes all kept
  • Localized UI — Simplified Chinese + English, community-extensible
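To illustrate the SSE streaming bullet: an OpenAI-compatible endpoint emits `data:` lines, each carrying a JSON chunk with a content delta, terminated by a `data: [DONE]` sentinel. A minimal parser, sketched in Python (the sample stream below is fabricated to match the standard chunk shape, not captured from OpenWarp):

```python
import json

def parse_sse(raw: str):
    """Yield content deltas from an OpenAI-style SSE response body."""
    for line in raw.splitlines():
        if not line.startswith("data: "):
            continue  # skip blank separator lines and comments
        data = line[len("data: "):]
        if data == "[DONE]":
            break  # end-of-stream sentinel
        chunk = json.loads(data)
        delta = chunk["choices"][0]["delta"].get("content")
        if delta:
            yield delta

# Fabricated two-chunk stream shaped like a real Chat Completions response.
sample = (
    'data: {"choices": [{"delta": {"content": "Hel"}}]}\n\n'
    'data: {"choices": [{"delta": {"content": "lo"}}]}\n\n'
    'data: [DONE]\n\n'
)
print("".join(parse_sse(sample)))  # → Hello
```

A real client renders each delta as it arrives, which is what makes blocks fill in incrementally instead of appearing all at once.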

📦 Build from source

```shell
git clone https://github.com/zerx-lab/openwarp
cd openwarp
./script/bootstrap   # platform-specific deps
./script/run         # build & run
./script/presubmit   # fmt / clippy / tests
```

If you prefer raw cargo, always target the OSS binary explicitly:

```shell
cargo build --release --bin warp-oss
cargo run   --release --bin warp-oss
```

⚠️ Do not build or run the other binaries (cargo build --release or cargo run --release with --bin {warp,stable,dev,preview}, or with no --bin filter at all). Those entry points (local.rs / stable.rs / dev.rs / preview.rs) load their channel config through Warp's private warp-channel-config binary, which lives in a closed-source repo. Compilation succeeds, but the resulting executables panic at startup and ask you to run ./script/install_channel_config, a script that clones an SSH repo only Warp employees can access. OpenWarp users need only the warp-oss binary.

See WARP.md for the full engineering guide (style, testing, platform notes).

📜 License

Same as upstream Warp:

  • warpui_core / warpui crates — MIT
  • Everything else — AGPL-3.0

🌿 Branches & upstream sync

zerx-lab/warp keeps two long-lived branches:

| Branch | Tracks | Purpose |
|---|---|---|
| main (default) | zerx-lab/warp:main | OpenWarp's main development line. All PRs target this. |
| warp-upstream | warpdotdev/warp:master | Pristine mirror of upstream Warp, used to pull in new commits. No fork-local changes. |

For contributors

Open PRs against main. Never against warp-upstream.

For maintainers (write access)

⚠️ Do not click the "Sync fork" button on main in the GitHub web UI. It would merge the entire upstream history straight into OpenWarp's main line and trigger large-scale conflicts. Pull upstream changes through the mirror branch instead:

```shell
# one-time setup
git remote add upstream https://github.com/warpdotdev/warp.git

# refresh the mirror
git checkout warp-upstream
git pull                          # fast-forwards from upstream/master
git push origin warp-upstream

# bring selected commits into main
git checkout main
git cherry-pick <sha>             # or merge warp-upstream when a full sync makes sense
```

🤝 Contributing

Community contributions welcome. See CONTRIBUTING.md for the full flow.

Before filing, please search existing issues. Security vulnerabilities should be reported privately per CONTRIBUTING.md#reporting-security-issues.

🙏 Acknowledgements

OpenWarp stands on the shoulders of the Warp team and many open-source projects:

Warp · genai · opencode · models.dev · Tokio · NuShell · Alacritty · Hyper · minijinja
