CLI Commands
The CLI Commands reference provides comprehensive documentation for all available commands in the docs-mcp tool. These commands enable automated documentation generation, interactive AI assistance, and repository analysis through a unified command-line interface powered by Claude AI models and the Bunli framework.
Command Overview
The docs-mcp CLI includes five core commands designed for documentation generation and AI-assisted code exploration:
- ask: Quick question answering from the command line
- build-repo-docs: Comprehensive automated documentation generation for code repositories
- chat: Interactive conversational interface with LLM assistance
- create-description: AI-powered description generation for documentation files
- create-llms-txt: Project guidance file generation for AI coding assistants
All commands require the ANTHROPIC_API_KEY environment variable to be set for API access.
ask
Ask a question to an LLM and receive a streamed response directly in your terminal.
Usage:
```shell
docs-mcp ask -q "Your question here"
```

Options:
- -q, --question (required, default: "What is the meaning of life?"): The question to ask
This command streams responses from Anthropic’s Claude Haiku model, displaying the query and piping the response directly to stdout with colored terminal output.
```typescript
handler: async ({ flags, colors }) => {
  const { textStream } = streamText({
    model: anthropic("claude-haiku-4-5-20251001"),
    prompt: flags.question,
  });
  for await (const textPart of textStream) {
    process.stdout.write(textPart);
  }
};
```

build-repo-docs
Walk a code repository and build comprehensive documentation using AI agents through a five-stage pipeline.
Usage:
```shell
docs-mcp build-repo-docs -r ./my-repo -o ./docs
```

Options:
- -r, --repo (required): Path to the repository root
- -o, --output (default: "./docs"): Output directory for generated docs
This command executes a sophisticated multi-step pipeline:
- Building file tree: Scans the repository structure
- Extracting structure: Analyzes code organization and relationships
- Summarising: Creates summaries of code components
- Planning topics: Determines documentation organization and topics
- Writing docs: Generates final documentation files
Results are persisted in a local SQLite database (.docs-mcp.db) and token usage is tracked throughout the process.
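The pipeline calls a runStep helper whose implementation is not shown here. A minimal sketch of what such a helper might look like, assuming it only prints the stage label, awaits the step, and rethrows failures with added context (this is an illustration, not the tool's actual code):

```typescript
// Hypothetical sketch of a runStep helper matching the pipeline calls:
// label the stage, await it, and surface failures with the stage name.
async function runStep<T>(label: string, step: () => Promise<T>): Promise<T> {
  process.stdout.write(`${label}...`);
  try {
    const result = await step();
    process.stdout.write(" done\n");
    return result;
  } catch (err) {
    process.stdout.write(" failed\n");
    throw new Error(`${label} failed: ${(err as Error).message}`);
  }
}
```

Each stage receives the shared SQLite handle and usage tracker, so a failed stage can be retried without re-running earlier ones.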
```typescript
await runStep("Building file tree", () => buildFileTree(flags.repo, db));
await runStep("Extracting structure", () => extractStructure(db, usage));
await runStep("Summarising", () => summarise(db, usage));
await runStep("Planning topics", () => planTopics(db, usage));
await runStep("Writing docs", () => writeDocs(db, flags.repo, flags.output, usage));
```

chat
Launch an interactive terminal user interface (TUI) for conversational chat with an LLM.
Usage:
```shell
docs-mcp chat
```

Options: None
This command provides an immersive interactive experience for exploring code documentation and asking questions in a conversational format using the Chat TUI component.
```tsx
export default defineCommand({
  name: 'chat',
  description: 'Interactive TUI chat with an LLM',
  tui: { renderer: { bufferMode: 'alternate' } },
  render: () => <Chat />,
})
```

create-description
Generate a description summary from documentation sources through a four-stage AI pipeline. The output is written to the description frontmatter field.
Usage:
```shell
docs-mcp create-description -f ./docs/guide.mdx
```

Options:
- -f, --file (required): Path to the .mdx or .md file to process
This command orchestrates a four-step pipeline for transforming documentation:
- Analyze: Extracts key information from the article
- Compose: Creates a structured description with purpose, context, approach, and findings
- Polish: Refines the description for style compliance
- Insert: Writes the description into the file’s frontmatter
The pipeline supports .mdx and .md file formats with format-specific syntax handling.
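The final Insert stage's frontmatter handling is not shown in the source. As a rough sketch, assuming YAML frontmatter delimited by `---` lines, the update might work along these lines (a hypothetical helper, not the tool's actual implementation):

```typescript
// Hypothetical sketch of the "Insert" stage: write a description into a
// Markdown/MDX file's YAML frontmatter, creating the block if missing.
function insertDescription(source: string, description: string): string {
  const fm = /^---\n([\s\S]*?)\n---/;
  const match = source.match(fm);
  if (!match) {
    // No frontmatter yet: create one holding only the description.
    return `---\ndescription: ${description}\n---\n${source}`;
  }
  // Replace an existing description line, or append the field.
  const body = match[1];
  const updated = /^description:/m.test(body)
    ? body.replace(/^description:.*$/m, `description: ${description}`)
    : `${body}\ndescription: ${description}`;
  return source.replace(fm, `---\n${updated}\n---`);
}
```

Overwriting an existing description in place keeps the rest of the frontmatter (title, sidebar settings, and so on) untouched.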
create-llms-txt
Generate a project guidance file (CLAUDE.md / llms.txt) from documentation descriptions to optimize AI coding assistants.
Usage:
```shell
docs-mcp create-llms-txt -d ./src/content/docs -o ./CLAUDE.md
```

Options:
- -o, --output (optional, default: "./CLAUDE.md"): Output file path
- -d, --content-dir (optional, default: "./src/content/docs"): Content directory to scan
This command implements a three-stage pipeline:
- Collect: Extracts descriptions from Markdown files in the content directory
- Synthesize: Transforms collected descriptions into structured guidance
- Write: Persists the final guidance file to disk
The generated file serves as a comprehensive project overview for AI coding assistants like Claude.
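The Collect stage amounts to pulling each file's description field out of its frontmatter. A minimal sketch of that extraction, assuming YAML frontmatter delimited by `---` lines (hypothetical helper, not the tool's actual code; file I/O is elided by passing contents in directly):

```typescript
// Hypothetical sketch of the "Collect" stage: map each Markdown file path
// to the value of its frontmatter `description` field, skipping files
// that have no frontmatter or no description.
function collectDescriptions(files: Record<string, string>): Map<string, string> {
  const out = new Map<string, string>();
  for (const [path, source] of Object.entries(files)) {
    const match = source.match(/^---\n[\s\S]*?^description:[ \t]*(.+)$[\s\S]*?\n---/m);
    if (match) out.set(path, match[1].trim());
  }
  return out;
}
```

The Synthesize stage then has a compact path-to-summary map to turn into structured guidance, rather than whole documents.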
Global Options
All commands inherit the following global options from the Bunli framework:
- Environment Variables: All commands require ANTHROPIC_API_KEY to be set in your environment
- Colored Output: Commands automatically use colored terminal output when available
- Error Handling: Failed steps are reported with error context and suggestions
Setting the API Key:
```shell
export ANTHROPIC_API_KEY=your-anthropic-api-key
```

Shell Completion: The CLI provides shell completion support through the Bunli plugin system for improved terminal experience across bash, zsh, and fish shells.