Episode 219: Probe: Inference Modes

We demo three inference modes in the day-old Probe TUI:

1. Codex - with better streaming than the official CLI! - using the same limits from your ChatGPT account
2. Qwen 3.5 2B via Psionic - served over Tailscale from our Linux desktop 2,000 miles away
3. Apple FM - using this MacBook's built-in Apple model

You can imagine coding agent workloads that use a mix of these modes (and more) to optimize for speed, quality, and cost -- all under your control.
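The mix-of-modes idea above could look something like the following sketch. All names here (`InferenceMode`, `route`, and the routing criteria) are hypothetical illustrations, not Probe's actual API:

```rust
// Hypothetical sketch of routing work across the three inference modes.
// The enum variants mirror the modes demoed in the episode; the routing
// policy (offline? quality-sensitive?) is an invented example.

#[derive(Debug, PartialEq)]
enum InferenceMode {
    Codex,      // remote, highest quality, drawn from ChatGPT account limits
    QwenRemote, // Qwen 3.5 2B via Psionic, self-hosted over Tailscale
    AppleFm,    // on-device Apple Foundation Model, works offline
}

/// Pick a mode by trading off speed, quality, and cost.
fn route(needs_high_quality: bool, offline: bool) -> InferenceMode {
    if offline {
        InferenceMode::AppleFm // only fully local option
    } else if needs_high_quality {
        InferenceMode::Codex // best quality, but counts against account limits
    } else {
        InferenceMode::QwenRemote // cheap self-hosted default
    }
}

fn main() {
    assert_eq!(route(false, true), InferenceMode::AppleFm);
    assert_eq!(route(true, false), InferenceMode::Codex);
    assert_eq!(route(false, false), InferenceMode::QwenRemote);
    println!("routing ok");
}
```

A real agent would presumably route per-task (e.g. cheap local model for summarization, Codex for hard edits), but the shape is the same: one enum of backends, one policy function choosing among them.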

OpenAgents
@OpenAgentsInc
03-31
Episode 218: Probe
We're building a coding agent!
All Rust, model-agnostic but local-first, using Psionic, borrowing code & concepts from Codex and OpenCode and... other places.
Contributions welcome: https://github.com/OpenAgentsInc/probe…