MCP — connect every AI you use
to your knowledge base.
Knovya is an MCP client- and server-ready knowledge base. Twenty-five tools, organised in five tiers, served to Claude, Cursor, ChatGPT, Goose and the rest of the protocol's growing client list — through one open standard.
Pick an AI. Pick a prompt. Watch your knowledge land.
Four real clients. Four real prompts. Knovya speaks the protocol — your notes show up where you already work, with one config and one consent screen.
Twenty-five tools, five tiers.
Every MCP server picks a few primitives. Knovya picks the whole knowledge base — read, write, link, search, transform, share, message, and reason — each tool answering a different question your AI might ask.
Every tool above is exposed through the open Model Context Protocol — JSON-RPC 2.0 over Streamable HTTP, with stdio for local servers. OAuth 2.1 with PKCE, scoped API keys, end-to-end encryption for sensitive notes.
No client lock-in, no vendor extension, no proprietary handshake.
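Under the hood, every tool call is a plain JSON-RPC 2.0 message. A minimal sketch of what a client sends over the wire — the `tools/call` method is part of the MCP spec, while the tool name `notes_search` and its argument shape here are illustrative, not Knovya's published schema:

```python
import json

def jsonrpc_request(req_id, method, params):
    """Build a JSON-RPC 2.0 request envelope, as MCP uses for every call."""
    return {"jsonrpc": "2.0", "id": req_id, "method": method, "params": params}

# Hypothetical call to a Knovya search tool; name and arguments are
# illustrative only -- the real schema lives in the server's tool listing.
req = jsonrpc_request(1, "tools/call", {
    "name": "notes_search",
    "arguments": {"query": "Q3 launch constraints", "limit": 5},
})

print(json.dumps(req, indent=2))
```

Because the envelope is this simple, any client that speaks JSON-RPC can discover the tool list at runtime and call it — no vendor SDK required.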
modelcontextprotocol.io →
One server. Every AI you reach for.
Seven clients live today, more shipping support every quarter. The protocol fans out — your knowledge stays in one place, every agent reads from the same source of truth.
Every AI conversation starts amnesic —
and you keep paying the toll.
Claude has memory. ChatGPT has memory. Cursor has memory. None of them share.
You explain who you are. You explain what you are working on. You paste in the constraint you mentioned three days ago. Then you do the same thing in the next tab — because the next tab is a different vendor and a different memory store.
- The cost
- Industry surveys put it at six to nine hours weekly — entire workdays lost to copy-pasting context between AI tools.
- The fix
- One protocol. One source of truth. Every AI reads from the same place — with permission, with audit, with scope.
From REST to your second brain.
MCP did not appear from nothing. Five protocol moments — each solving a version of the same problem — taught it the shape it has today.
- 2000 · Roy Fielding — REST · Uniform interface as a constraint. Resources as first-class citizens. The architecture that made the web composable across every server in the world. UC Irvine · doctoral thesis
- 2016 · Microsoft — Language Server Protocol · N×M reduced to N+M. One protocol so any editor could talk to any language server. The pattern MCP would later port — JSON-RPC, capabilities, lifecycle. Visual Studio Code · open spec
- 2024 · Anthropic — MCP launch · Same pattern, ported to AI. Tools, resources, prompts. JSON-RPC over stdio and HTTP. The connector that ended the bespoke-integration era. Anthropic · November
- 2025 · Linux Foundation / AAIF · Anthropic donates MCP to the Agentic AI Foundation. OpenAI, Google, Microsoft, AWS, Cloudflare and Bloomberg back the standard. The protocol becomes vendor-neutral. December · Linux Foundation
- 2026 · Knovya MCP · Twenty-five tools, five tiers, epistemic primitives — experience envelopes, persona, agentic memory. The protocol gets a knowledge base. ★ Knovya · production
Nobody else gives your AI a knowledge base.
Memory APIs exist. Reference servers exist. Knowledge-graph servers exist. None of them ship with a real notes editor, an outcome-aware experience layer, and twenty-five purpose-built tools — wired through one protocol, with OAuth and per-scope permissions.
- Anthropic memory server — reference example · primitives only
- Mem0 — developer memory API · no editor
- Letta (formerly MemGPT) — conversational memory layer
- Cognee — memory engine · dev-focused
- Notion MCP — notes-shaped · no agent layer
- mcp.so directory — 3,000+ single-purpose servers
- ★ Knovya — 25 tools · 5 tiers · experience layer · OAuth · E2EE
MCP shows up wherever your AI already lives.
One server. Four places it surfaces — settings consent, terminal install, in-IDE response, sessions monitor.
Generate a scoped API key, choose what your AI can read and write, watch which clients are connected — all from one panel.
One command in your terminal. Streamable HTTP transport, OAuth 2.1 prompt opens in your browser, consent screen takes the rest.
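The browser prompt works because OAuth 2.1 makes PKCE (RFC 7636) mandatory: the client generates a secret verifier, sends only its hash up front, and proves possession later — so an intercepted authorization code is useless on its own. A minimal sketch of the verifier/challenge pair any MCP client generates before opening that browser tab:

```python
import base64
import hashlib
import secrets

def pkce_pair():
    """Generate a PKCE code_verifier and its S256 code_challenge (RFC 7636)."""
    # 32 random bytes -> 43-char URL-safe string, within the spec's 43-128 range.
    verifier = base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b"=").decode("ascii")
    # The challenge is the SHA-256 of the verifier, base64url-encoded without padding.
    digest = hashlib.sha256(verifier.encode("ascii")).digest()
    challenge = base64.urlsafe_b64encode(digest).rstrip(b"=").decode("ascii")
    return verifier, challenge

verifier, challenge = pkce_pair()
print(f"code_challenge={challenge}&code_challenge_method=S256")
```

The client keeps `verifier` private, sends `challenge` in the authorization request, and reveals `verifier` only when exchanging the code for a token.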
Cursor and Continue surface tool calls inline as the agent works. You see what was searched, what was read, what got written — no black box.
Knovya keeps a list of every active MCP session — which client, which scope, last call. Revoke from one row.
MCP composes with the rest of the AI Layer.
AI Memory
MCP delivers memory across every AI client. Same notes, every conversation.
Conv → Note
Any AI chat round-trips through MCP and lands as a structured Knovya note.
Meeting Notes
Transcripts in over MCP, structured minutes out. Decisions, actions, attendees.
Agentic Memory
Recall, box, temporal — the memory primitives MCP exposes for agents.
A few honest answers.
What is the Model Context Protocol?
MCP is an open standard, originally introduced by Anthropic in November 2024 and donated to the Linux Foundation in December 2025, that lets AI applications discover and call external tools and data sources through a uniform JSON-RPC interface. It is the protocol that lets Claude, Cursor, ChatGPT, Goose, Continue, Windsurf and a growing list of clients read from and write to systems like Knovya without bespoke integration.
How do I connect Knovya MCP to Claude Desktop?
Open Settings → Connections → MCP in Knovya, generate a scoped API key, then add Knovya as a Streamable HTTP server in your claude_desktop_config.json with a single command. Cursor, Goose, Continue, Windsurf and ChatGPT use the same OAuth flow — once consent is granted, Claude can search, read, write and reason over your notes.
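As a rough sketch, the entry you add looks something like the config below. The `mcpServers` key is Claude Desktop's real top-level key; the server name, the `mcp-remote` bridge command, and the URL are illustrative assumptions, not Knovya's published values — copy the exact snippet from your Knovya settings panel instead:

```python
import json

# Hypothetical claude_desktop_config.json entry. Only "mcpServers" is the
# documented key; everything inside it here is an illustrative placeholder.
config = {
    "mcpServers": {
        "knovya": {
            "command": "npx",
            "args": ["-y", "mcp-remote", "https://mcp.knovya.example/mcp"],
        }
    }
}

print(json.dumps(config, indent=2))
```

Restart Claude Desktop after saving, and the OAuth consent screen takes the rest.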
What is an MCP client?
An MCP client is the AI application that calls an MCP server — Claude Desktop, Cursor, ChatGPT, Goose, Continue, Windsurf, GitHub Copilot and Zed are all MCP clients in 2026. Knovya is on the other side of that connection: an MCP server that exposes 25 tools, so any of those clients can act on your knowledge base.
Is the Knovya MCP server free?
Yes. Every Knovya plan includes the MCP server. Free gives you 50 MCP calls per month — enough to wire up Claude or Cursor and try the full memory layer. Pro raises the cap to 5,000 calls per month with scoped API keys and OAuth 2.1. Team is unlimited and adds shared workspace memory.
How is Knovya MCP different from a memory server?
Memory servers expose primitives — set, get, recall. Knovya exposes a knowledge base. The 25 tools include search, read, write, edit, links, folders, history, version diffs, an experience envelope that groups past decisions by outcome, agent-to-agent messaging and AI-native transforms. Memory is one tier of five.
Can I scope what my AI can read or write?
Yes. API keys are issued per scope — notes:read, notes:write, folders:read, folders:write, attachments, share, versions, code. OAuth 2.1 with PKCE adds dynamic registration for browser-based agents. Encrypted notes are excluded from MCP responses by design.
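The server-side effect of a scoped key can be sketched in a few lines — the scope names are the ones listed above, while the check logic itself is an illustrative assumption, not Knovya's implementation:

```python
# Scopes granted to a hypothetical read-only key.
SCOPES_ON_KEY = {"notes:read", "folders:read"}

def allowed(required: str, granted: set) -> bool:
    """A tool call succeeds only if the tool's scope was granted to the key."""
    return required in granted

assert allowed("notes:read", SCOPES_ON_KEY)        # search/read tools work
assert not allowed("notes:write", SCOPES_ON_KEY)   # write tools are refused
```

A key minted with only read scopes can never be used to mutate your notes, no matter what the client asks for.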
Which AI clients work with Knovya MCP?
Claude Desktop, Claude Code, Cursor, ChatGPT, Gemini, GitHub Copilot, Goose (Block), Continue, Windsurf and Zed all support MCP natively in 2026. Any client implementing the open MCP specification can connect to Knovya.
Connect your AI in one config.
Free includes 50 MCP calls a month — enough to wire up Claude, Cursor or ChatGPT and feel the difference. Pro lifts you to 5,000.