AI Operations · Internal

Internal at Neomanex — commercialisation deferred

AI that remembers.

NeoRouter stores the experiences, paths, and context your AI sessions generate — then serves them back by semantic query in milliseconds. The next session starts with everything the last one learned.


At a glance

Institutional Memory for AI Operations

Status
Internal — commercialisation gated on Gnosari $5K MRR
Built for
Platform engineers and AI-ops leads running many-session agent workloads
memory · context · mcp · platform · internal

Why it matters

Positioning pillars

  • Context cards, not logs

    Retrievable by meaning.

    Stored experiences are indexed as context cards: paths, notes, decisions, and gotchas tied to semantic queries. Unlike logs, they are retrievable by meaning — not timestamp. Unlike vector DBs, they carry operational context (which file, which decision, which fix), not just embeddings.

  • Instant retrieval, not exploration

    Cold-start tax, eliminated.

    A new AI session queries in natural language. NeoRouter returns relevant files, paths, and notes in milliseconds — replacing the exploratory "read 40 files to understand the codebase" tax that every cold session otherwise pays. The AI starts productive, not lost.

  • Compounding knowledge

    The system gets smarter as it is used.

    Every exploration that creates a route makes the next session faster. Knowledge does not die when a session ends — it accumulates. The organisation learns, the agents benefit, and the cold-start problem shrinks with every query. Institutional knowledge compounds instead of evaporating.
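What makes a context card more than a log line can be sketched as a data structure. This is a hypothetical illustration only; NeoRouter's actual schema is internal, and every field name below is an assumption:

```python
from dataclasses import dataclass, field

@dataclass
class ContextCard:
    """A retrievable unit of operational memory (hypothetical schema).

    Unlike a log line, a card is indexed by meaning and carries the
    operational context a future session needs to act immediately:
    which file, which decision, which fix.
    """
    query_hints: list[str]   # semantic phrases the card should match
    paths: list[str]         # which files the knowledge concerns
    decision: str            # what was decided, and why
    gotchas: list[str] = field(default_factory=list)  # pitfalls for next time

# A card a session might capture after a debugging run (illustrative data):
card = ContextCard(
    query_hints=["stale cache fix", "cache invalidation bug"],
    paths=["services/cache/invalidator.py"],
    decision="Invalidate on write, not on TTL; TTL races with replication lag.",
    gotchas=["The test suite mocks the clock; run integration tests too."],
)
```

A timestamped log of the same debugging run would record *that* files were touched; the card records *what was learned*, keyed to the questions a future session will ask.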

The mechanics

How it works

  1. Query by natural language

     A new session asks "where does auth live?" or "how did we fix the stale cache issue?" NeoRouter returns the paths, notes, and decisions captured by previous sessions — no re-reading the codebase.

  2. Create routes on-demand

     When a session discovers useful context, it stores a route. No build step, no nightly job, no staleness window — the index grows organically as work happens.

  3. Serve via MCP

     NeoRouter exposes an MCP server as its primary interface. Any MCP-compatible client — Claude Desktop, Claude Code, Gnosari agents, ConvOps workflows — can query it directly. One memory layer, every tool.
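The query/create loop above can be sketched as a toy client. Everything here is an assumption for illustration: `NeoRouterClient`, its method names, and the keyword-overlap matching (a stand-in for real semantic retrieval) are invented, not NeoRouter's actual API:

```python
class NeoRouterClient:
    """In-memory stand-in for a NeoRouter client (hypothetical API)."""

    def __init__(self):
        self._routes = []  # list of (hint phrases, payload) pairs

    def create_route(self, hints, payload):
        # Step 2: a session stores what it learned; no build step,
        # the index simply grows as work happens.
        self._routes.append((hints, payload))

    def query(self, question):
        # Step 1: naive keyword overlap stands in for semantic matching.
        words = set(question.lower().split())
        return [
            payload
            for hints, payload in self._routes
            if words & {w for h in hints for w in h.lower().split()}
        ]

router = NeoRouterClient()
router.create_route(
    hints=["auth", "login flow"],
    payload={"paths": ["services/auth/"], "note": "JWTs issued in gateway."},
)
hits = router.query("where does auth live?")
```

In the real system the client side of this loop is Step 3: the session talks to NeoRouter's MCP server through whatever MCP-compatible tool it already runs in, rather than instantiating a class.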

Neomanex internal AI operations

NeoRouter is the institutional memory every AI session at Neomanex queries at startup. Every file read, every decision, every fix gets captured as a route — and every future session benefits. Internal since late 2025; load-bearing for the entire AI-operations stack.

See how we're using it inside Neomanex.