How it works

Four stages. One repo.

Your MDX is the corpus. The build step is the indexer. A small API route is the retriever. A streamed call to your model is the synthesizer.

Stage 01

Chunk

Each .mdx file under content/docs/ becomes a page. Wrap retrievable units in <Chunk id="…"> elements. Frontmatter carries the metadata. Versioned with git.
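A page might look like this. The frontmatter fields, chunk ids, and body text here are illustrative, not a required schema:

```mdx
---
title: Getting started
description: Install and run the docs site.
---

# Getting started

<Chunk id="quickstart">
Run the dev server with `npm run dev` and open http://localhost:3000.
</Chunk>

<Chunk id="content-layout">
Docs live under `content/docs/`. Every `.mdx` file becomes a route.
</Chunk>
```

Each `<Chunk>` body becomes one retrievable unit; the frontmatter travels with it as metadata.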

Stage 02

Index

npm run ingest walks the tree, embeds chunks with Voyage AI using your key, and writes them to a single sqlite-vec file at data/docs.db.
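The embedding half needs your Voyage key and network access, but the chunk extraction is plain string work. A minimal sketch of that half, assuming non-nested `<Chunk id="…">` markers; the `extractChunks` name and the record shape are illustrative, not the actual ingest script:

```typescript
type DocChunk = {
  id: string;    // the id attribute on <Chunk>
  file: string;  // source .mdx path, kept for citations
  text: string;  // the body that gets embedded
};

// Pull every <Chunk id="..."> ... </Chunk> unit out of one MDX source
// string. Assumes chunks are flat (not nested inside each other).
function extractChunks(file: string, mdx: string): DocChunk[] {
  const re = /<Chunk\s+id="([^"]+)"\s*>([\s\S]*?)<\/Chunk>/g;
  const out: DocChunk[] = [];
  for (const m of mdx.matchAll(re)) {
    out.push({ id: m[1], file, text: m[2].trim() });
  }
  return out;
}
```

The real step then embeds each `text` and writes `(id, file, embedding)` rows into data/docs.db.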

Stage 03

Retrieve

/api/docs/search embeds the query, runs a vec0 ANN scan with a lexical filter, and returns top-k chunks. Spotlight (⌘K) calls the same route.
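sqlite-vec runs the nearest-neighbour scan inside SQL, but the ranking it computes is just similarity between the query embedding and each chunk embedding. A plain-TypeScript equivalent, for intuition only; the data shapes and cosine metric here are assumptions, not the route's actual code:

```typescript
type Scored = { id: string; score: number };

// Cosine similarity between two equal-length vectors.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// What the vec0 scan effectively does: score every chunk embedding
// against the query embedding and keep the k best.
function topK(
  query: number[],
  chunks: { id: string; embedding: number[] }[],
  k: number
): Scored[] {
  return chunks
    .map(c => ({ id: c.id, score: cosine(query, c.embedding) }))
    .sort((x, y) => y.score - x.score)
    .slice(0, k);
}
```

The lexical filter narrows the candidate set before this ranking; the route returns the surviving top-k chunks as JSON.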

Stage 04

Answer

The chat panel packs chunks into a system prompt, appends history, and streams the model's answer through your provider key. Citations link to the source chunk.
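The prompt-packing step is simple enough to show whole. A sketch under assumptions: the `[chunk:id]` citation tag and the instruction wording are placeholders for whatever your citation renderer expects, not the panel's real prompt:

```typescript
type RetrievedChunk = { id: string; file: string; text: string };

// Pack retrieved chunks into a system prompt the model can cite from.
// Each chunk is labelled so the answer can link back to its source.
function buildSystemPrompt(chunks: RetrievedChunk[]): string {
  const context = chunks
    .map(c => `[chunk:${c.id}] (${c.file})\n${c.text}`)
    .join("\n\n");
  return [
    "Answer using only the documentation below.",
    "Cite passages by their [chunk:id] tag.",
    "",
    context,
  ].join("\n");
}
```

Chat history is appended after this system prompt, and the model's streamed tokens are relayed straight to the panel.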

Hosted RAG stack

  • Managed vector database
  • Embedding pipeline (queue, worker, retry)
  • Backend & query broker
  • Retrieval observability platform
  • Monthly bill
  • On-call rotation

The doks way

  • sqlite-vec in a single file
  • npm run ingest at build time
  • One Next.js route handler
  • The Network tab + your own chunks
  • Whatever your static host already costs
  • Nothing to operate

MIT · OPEN SOURCE

Read the source. Fork it. Ship it.

doks is a public pattern, released under MIT. There is no company behind it, no email list to join, and nothing to install beyond a Next.js project. Take it and make your docs answer questions.