Local-first LLM wiki for research
I kept losing context across research sessions: notes in one place, sources in another, LLM conversations gone the moment the tab closed. I wanted something that ties it all together locally — no cloud sync, no vendor lock-in, just my files.
OAMC ingests raw sources into an Obsidian vault and compiles them into a structured wiki using LLMs. Drop a paper, article, or notes into the raw folder, and the pipeline processes it into linked, searchable wiki pages. Query your knowledge base from the CLI, the local dashboard, or the macOS menubar.
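The ingest step can be pictured as a small file-to-file compiler. The sketch below is illustrative only: the function names (`summarize`, `compile_source`, `ingest`) and folder layout are assumptions, not OAMC's actual API, and the LLM call is stubbed out.

```python
from pathlib import Path

def summarize(text: str) -> str:
    """Stand-in for the LLM call that distills a raw source.
    A real pipeline would call a local or remote model here."""
    first_line = text.strip().splitlines()[0]
    return f"Summary of: {first_line}"

def compile_source(raw_path: Path, wiki_dir: Path) -> Path:
    """Turn one raw source file into a linked wiki page."""
    text = raw_path.read_text(encoding="utf-8")
    title = raw_path.stem.replace("_", " ").title()
    page = wiki_dir / f"{title}.md"
    body = (
        f"# {title}\n\n"
        f"{summarize(text)}\n\n"
        # Obsidian-style wikilink back to the original source file.
        f"Source: [[raw/{raw_path.name}]]\n"
    )
    page.write_text(body, encoding="utf-8")
    return page

def ingest(raw_dir: Path, wiki_dir: Path) -> list[Path]:
    """Process every file dropped into the raw folder into the wiki."""
    wiki_dir.mkdir(parents=True, exist_ok=True)
    return [
        compile_source(p, wiki_dir)
        for p in sorted(raw_dir.glob("*"))
        if p.is_file()
    ]
```

Because the output is plain markdown with wikilinks, Obsidian (or any editor) can open the compiled pages directly.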
Built with Python and FastAPI. Runs entirely on your machine. Your research stays yours.