llmwiki

v0.5.1 · verified Fri May 01

LLM Wiki Compiler is a CLI tool in active development (v0.5.1) that compiles raw sources (URLs, files) into an interlinked markdown wiki with claim-level provenance, semantic search, and MCP server integration for AI agents. It supports Anthropic, OpenAI, and Ollama providers, with configurable timeouts. Key differentiators: incremental compilation, Obsidian-compatible wikilinks, and provider-agnostic design.

error ERR_PACKAGE_PATH_NOT_EXPORTED
cause v0.5.0 performs a deep import into youtube-transcript, which does not export that subpath.
fix
npm install -g llm-wiki-compiler@0.5.1
error Error: No LLM provider configured
cause Missing LLMWIKI_PROVIDER or provider-specific API key.
fix
Set LLMWIKI_PROVIDER and the corresponding API key environment variable.
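A minimal sketch for the Anthropic case (the key value is a placeholder; substitute your real key):

```shell
# Tell llmwiki which provider to use, then supply that provider's key.
export LLMWIKI_PROVIDER=anthropic
export ANTHROPIC_API_KEY=sk-...
```

For OpenAI or Ollama, set LLMWIKI_PROVIDER accordingly and export that provider's key instead.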
breaking v0.5.0 crashes on startup with 'ERR_PACKAGE_PATH_NOT_EXPORTED' due to deep import into youtube-transcript.
fix Upgrade to v0.5.1: npm install -g llm-wiki-compiler@0.5.1
gotcha Anthropic provider requires either ANTHROPIC_API_KEY or ANTHROPIC_AUTH_TOKEN; order of precedence: shell env, .env, then Claude Code settings.
fix Set ANTHROPIC_API_KEY in shell or .env file.
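If you prefer the .env fallback, one sketch (key value is a placeholder; per the precedence above, a key exported in the shell still wins over this file):

```shell
# Write a project-local .env holding the Anthropic key.
cat > .env <<'EOF'
ANTHROPIC_API_KEY=sk-...
EOF
```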
gotcha OpenAI provider requires OPENAI_BASE_URL to include '/v1' (e.g., http://host:port/v1).
fix Ensure OPENAI_BASE_URL ends with /v1.
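A quick guard for this gotcha, assuming a hypothetical local host and port:

```shell
# The trailing /v1 path segment is the part llmwiki requires.
export OPENAI_BASE_URL=http://localhost:8000/v1

# Fail fast if the suffix is missing.
case "$OPENAI_BASE_URL" in
  */v1) echo "OPENAI_BASE_URL ok" ;;
  *)    echo "OPENAI_BASE_URL must end with /v1" >&2; exit 1 ;;
esac
```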
npm install llm-wiki-compiler
yarn add llm-wiki-compiler
pnpm add llm-wiki-compiler

Install globally, set the Anthropic API key, ingest a URL, compile the sources into a wiki, then ask a question.

npm install -g llm-wiki-compiler@0.5.1
export ANTHROPIC_API_KEY=sk-...
llmwiki ingest https://example.com/article
llmwiki compile
llmwiki query "What is the main point?"