Creator here. FAF started as a way to stop re-explaining projects to AI tools.
The problem: CLAUDE.md works in Claude Code. SOUL.md works in Moltbot. .cursorrules works in Cursor. None of them talk to each other.
FAF is one source file (project.faf) that renders to all of them. YAML format, IANA-registered.
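For illustration, a minimal project.faf could look something like this (the field names here are hypothetical, sketched from the posts' description, not the official FAF schema):

```yaml
# project.faf - single source of AI context
# (illustrative sketch; field names are hypothetical, not the official schema)
project:
  name: my-api
  description: REST API for order processing
stack:
  language: TypeScript
  runtime: Node 20
conventions:
  - Prefer async/await over callbacks
  - All endpoints return a JSON envelope
render:
  targets: [CLAUDE.md, .cursorrules]
```

The idea is that a file in this shape is the single thing you maintain, and the CLI renders the tool-specific files (CLAUDE.md, .cursorrules, etc.) from it.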
v4.0 adds:
- Moltbot integration (faf-moltbot on npm)
- Gemini CLI integration
- Full bi-directional sync (edit CLAUDE.md and the changes flow back to .faf)
- 21k+ npm downloads
Install: npm install -g faf-cli
Moltbot users: npm install -g faf-moltbot
Happy to answer questions.
TL;DR: Built this to stop re-explaining my projects to AI assistants. Now IANA-registered and published on Zenodo/CERN.
.faf is YAML-based - noodles for AI. Human-readable strands of context that any model can parse instantly.
Problem: Every AI session starts cold.
Solution: One file any AI can read.
From the paper:
- 10,000+ projects tested
- 99.7% faster context reconstruction
- Validated with Claude, Gemini, GPT
Journey: Solo dev → IANA MIME type → 20k npm downloads → CERN/Zenodo paper
CLI: npm install -g faf-cli && faf init
Happy to answer questions.
TL;DR: We studied how Boris Cherny (creator of Claude Code) structures his projects - subagents, slash commands, MCP servers, Bun runtime - and built a 12-test integration suite that validates all of it. Now .faf detects and preserves the complete Claude Code ecosystem. Every publish passes Boris-Flow or it doesn't ship.
.FAF (Foundational AI-context Format) is a 40-line file that gives any AI instant project context.
This post documents a real build session with Grok where:
- I uploaded one .faf file
- Grok scored it 95/100 and locked in full context
- Grok built a complete GitHub PR code reviewer
- Zero re-explaining across the entire session
- I found 9 code issues and Grok fixed them; I found 1 more, and it fixed that too
Full Grok conversation: https://x.com/i/grok/share/bWWQ7qHbCjHc2Wx3G9WZUq5ay
Format is open, MIT licensed, works with Claude, Grok, any AI.
I built FAF (Foundational AI-context Format) to solve context loss in AI coding sessions. Now IANA-registered, 18k+ npm downloads, Anthropic-approved MCP server.
This week I shipped two new implementations:
1. bun-sticky-faf - Pure Bun, zero deps, 328 tests
2. bun-sticky-zig - 77KB binary, zero runtime, built in Bun's language
The Zig version is interesting - Bun itself is built on Zig, so this CLI speaks Bun's native language. Sub-millisecond cold start.
They said it couldn't be done. They said one person couldn't hold both the IANA registration and MCP stewardship while also shipping SDKs.
They forgot about the snake.
Today, .FAF (Foundational AI-context Format) announced the release of its Python SDK, completing the first leg of a multi-language deployment that nobody asked for but everybody needed.
"simplify with AI" hasn't quite caught on lol