Chat with your documents

Zotero integration with LM Studio

Below are three reliable ways to wire Zotero to LM Studio. Pick A (quick, no-code), B (“live” access via MCP), or C (scripted, for full control).

A) Quick, no-code: export notes/PDFs from Zotero and use LM Studio’s “Chat with Documents”

  1. In Zotero, export the papers’ notes/annotations to Markdown or TXT (plugins below) and/or export PDFs.
    • Plugins commonly used for Markdown/TXT export: mdnotes and Better Notes.
  2. In LM Studio, open a chat and attach the exported files (.pdf/.docx/.txt) so the model can retrieve from them (RAG). For long files, LM Studio switches to retrieval automatically instead of loading the whole file into context.

Pros: no coding; works fully offline.

Cons: you re-export when your Zotero notes change; no library search from inside LM Studio.

B) “Live” integration via MCP (best balance)

LM Studio ≥ v0.3.17 can host Model Context Protocol (MCP) tools. Several community MCP servers expose your Zotero library (search, read metadata, fetch annotations) to local LLMs.

Steps (example with zotero-mcp):

  1. Enable Zotero’s local API (Zotero 7): Preferences → Advanced → API → Enable local API. This is Zotero’s built-in local HTTP API (read-only) introduced for v7.
  2. Install an MCP Zotero server, e.g., zotero-mcp (alternatives are listed under Key sources).
  3. In LM Studio, add the MCP server under Use MCP Servers (Settings → MCP) so your model can call tools like “search library”, “get item metadata”, and “get notes/annotations”; a sample mcp.json entry is sketched after this list.
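
LM Studio registers MCP servers in an mcp.json file using the common “mcpServers” layout. A minimal sketch for zotero-mcp might look like the following; the command, arguments, and the ZOTERO_LOCAL variable are assumptions based on typical installs, so confirm them against the server’s README.

```json
{
  "mcpServers": {
    "zotero": {
      "command": "uvx",
      "args": ["zotero-mcp"],
      "env": { "ZOTERO_LOCAL": "true" }
    }
  }
}
```

Zotero must be running with its local API enabled (step 1) before these tools return results.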

Pros: query your Zotero library from the chat; fetch annotations on demand; stays local.

Cons: community projects (not official Zotero); you must keep Zotero running with the local API enabled.

C) Scripted pipeline (LM Studio OpenAI-compatible API + Zotero API)

If you prefer your own control loop (e.g., “search Zotero → stuff results into the prompt → ask the model”), combine Pyzotero with LM Studio’s OpenAI-compatible local server.

Outline (a minimal working sketch follows the steps):

  1. Start LM Studio’s local server (default http://localhost:1234/v1).
  2. Use Pyzotero to pull CSL-JSON/metadata/fulltext (local or web API).
  3. Insert retrieved snippets (with item keys) into the model prompt or a RAG chunker.
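
Here is a minimal sketch of that loop, assuming a recent Pyzotero (its local=True option targets Zotero 7’s local API; otherwise pass your library ID and web API key), the openai Python package, and LM Studio’s server on the default port. The query, model identifier, and library ID are placeholders.

```python
# Sketch: "search Zotero -> stuff results into the prompt -> ask the model".
# Assumes: pip install pyzotero openai, LM Studio's server running on
# http://localhost:1234/v1, and either Zotero 7's local API (local=True,
# recent Pyzotero) or a Zotero web API key.
from pyzotero import zotero
from openai import OpenAI

# Library ID "0" targets the local user library via the local API; for the
# web API, pass your numeric library ID and api_key instead of local=True.
zot = zotero.Zotero("0", "user", local=True)

# 1) Search Zotero for candidate items (the query is a placeholder).
items = zot.items(q="retrieval augmented generation", limit=5)

# 2) Build a context block that keeps item keys so answers stay citable.
context = "\n\n".join(
    f"[{it['key']}] {it['data'].get('title', '')}\n{it['data'].get('abstractNote', '')}"
    for it in items
)

# 3) Ask whatever model is loaded in LM Studio via its OpenAI-compatible API.
client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")
reply = client.chat.completions.create(
    model="local-model",  # replace with the identifier of your loaded model
    messages=[
        {"role": "system",
         "content": "Answer only from the provided sources and cite their item keys."},
        {"role": "user",
         "content": f"Sources:\n{context}\n\nQuestion: Summarize the main findings."},
    ],
)
print(reply.choices[0].message.content)
```

Once this naive title/abstract context works, swap it for a proper chunker and embedding search if you need full-text retrieval.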

References: LM Studio OpenAI-compat docs; Pyzotero quickstart; Zotero Web API v3.

What I recommend (practical setups)

  • If you want “ask my library” inside LM Studio: do B (MCP). It’s the smoothest interactive experience: search collections, pull notes/annotations, then ask the model to synthesize or draft.
  • If you just need the model to read a few papers with your highlights: do A (export + attach) with mdnotes/Better Notes and the PDFs.
  • If you need a programmable workflow (e.g., batch literature reviews, auto-inserting citekeys): do C and optionally add Better BibTeX for stable citekeys and a local citation-picker endpoint (CAYW); see the sketch just below.
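
For the CAYW picker just mentioned, here is a small sketch (assuming Zotero is running with the Better BibTeX plugin installed) that pops up the citation picker and returns Pandoc-style citekeys:

```python
# Sketch: pull citekeys from Better BibTeX's CAYW picker.
# Assumes Zotero is running with the Better BibTeX plugin installed.
import requests

resp = requests.get(
    "http://127.0.0.1:23119/better-bibtex/cayw",
    params={"format": "pandoc"},  # other formats: latex, mmd, cite, ...
    timeout=120,                  # the request blocks until you confirm the picker
)
print(resp.text)  # e.g. "[@smith2020; @doe2021]"
```

The returned string can be pasted straight into Pandoc or Quarto drafts.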

Notes & gotchas

  • Local API must be on in Zotero 7 (port 23119). If a plugin blocks it, re-enable it or free the port; the “connector/ping” and config entries are documented in forum/dev threads, and a quick reachability check is sketched after this list.
  • CSL-JSON from the Web API doesn’t include tags; if you need tags in exports, use other formats or plugins.
  • For annotation access, the Zotero MCP docs explain using local text extraction when needed; Better BibTeX can help with citekeys/annotation flows.
  • LM Studio’s “Chat with Documents” currently lists .pdf/.docx/.txt; Markdown works fine if saved as .txt (or just attach the PDFs).
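
If the port 23119 gotcha above bites, a quick reachability check helps. The /connector/ping and /api/users/0/… paths below follow Zotero 7’s local server as described in the dev threads, but treat them as assumptions and verify against your installation.

```python
# Sketch: verify Zotero's local endpoints on port 23119 are reachable.
# Paths are assumptions based on Zotero 7's local server; confirm locally.
import requests

# Connector ping: responds whenever Zotero is running.
print(requests.get("http://127.0.0.1:23119/connector/ping", timeout=5).status_code)

# Local read-only API (enable it in Zotero's Advanced settings first);
# it mirrors the web API under /api/, with user ID 0 for the local library.
r = requests.get(
    "http://127.0.0.1:23119/api/users/0/items",
    params={"limit": 1},
    timeout=5,
)
print(r.status_code, r.json() if r.ok else r.text)
```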

Key sources

  • LM Studio: Chat with Documents; OpenAI-compatible API; MCP host support.
  • Zotero: Web API v3; Local API in Zotero 7; Pyzotero docs.
  • Zotero → Markdown/notes: mdnotes; Better Notes.
  • MCP Zotero servers: 54yyyu/zotero-mcp (guide & tooling); alternatives.
  • Better BibTeX (citekeys, CAYW/HTTP picker).

Start your setup today and chat with your documents. Reach out if you need assistance.
