AI Operations · Internal

Internal at Neomanex — commercialisation deferred

Run AI agents remotely. Manage them from anywhere.

Conveyor is a cloud-native execution platform for AI agents. Launch sessions via REST API, stream results over WebSocket, and manage running agents through a terminal-style UI — with full session persistence, multi-provider orchestration, and Kubernetes-native workload management.


At a glance

Kubernetes-native autonomous AI agent execution

Status
Internal — commercialisation gated on Gnosari $5K MRR
Built for
Platform engineers and AI-ops leads running autonomous agent workloads
agent-runtime · remote-execution · kubernetes · autonomous · internal

Why it matters

Positioning pillars

  • Sessions, not sandboxes

    Persistent, resumable workloads.

    Conveyor runs agent sessions as persistent, resumable workloads with full conversation history stored in PostgreSQL. Sessions survive restarts, reconnects, and interruptions. Resume interrupted work with a single API call. Unlike disposable sandbox environments that vanish when the process ends, Conveyor sessions maintain state, context, and audit trail across the entire execution lifecycle.

  • Streaming, not polling

    Real-time output over WebSocket.

    Agent output is delivered in real time over WebSocket — character-by-character delta streaming from the claude_sdk provider, structured message streaming from claude_code. Watch work happen live from a phone, laptop, or tablet through the terminal-style UI. No refresh cycles. No polling intervals. No stale dashboards. The output arrives as it is produced.

  • Managed workloads, not raw compute

    An execution layer with an API and a UI.

    Conveyor handles provider orchestration, session persistence, and workload lifecycle management on Kubernetes. Choose between claude_code (subprocess, process isolation) and claude_sdk (51% cheaper, delta streaming) per session. Conveyor manages the queue, the worker, the persistence layer, and the real-time delivery pipeline. Not another box to SSH into — a managed execution layer with an API and a UI.
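The per-session provider choice described above can be sketched as a simple dispatch table. The provider names come from the text; the attribute values and the registry itself are illustrative assumptions, not Conveyor's internal API:

```python
# Illustrative per-session provider dispatch. Provider names match the
# text above; the attributes and this registry are assumptions.
PROVIDERS = {
    "claude_code": {"isolation": "subprocess", "streaming": "message"},
    "claude_sdk": {"isolation": "in-process", "streaming": "delta"},
}

def select_provider(name: str) -> dict:
    """Resolve a session's provider config, failing fast on unknown names."""
    try:
        return PROVIDERS[name]
    except KeyError:
        raise ValueError(f"unknown provider: {name}") from None

# Each session picks its provider independently:
assert select_provider("claude_sdk")["streaming"] == "delta"
assert select_provider("claude_code")["isolation"] == "subprocess"
```

Keeping the choice per session (rather than per deployment) is what lets cost-sensitive jobs run on claude_sdk while isolation-sensitive ones stay on claude_code.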

The mechanics

How it works

  1. API-first execution

    Launch agent sessions with a single POST request. Conveyor provides a REST API for creating, managing, and monitoring sessions — no SSH, no manual setup, no terminal tethering. Every operation is API-addressable.

  2. Real-time streaming

    Agent output streams over WebSocket as it is produced. Watch sessions execute live from any device through the terminal-style management UI. No polling, no refresh cycles — real-time observability from anywhere.

  3. Session persistence and resume

    Full conversation history is stored in PostgreSQL. Sessions survive restarts and disconnects. Resume interrupted work with a single API call — the agent picks up exactly where it left off, with full context intact.

  4. Multi-provider orchestration

    Choose between claude_code (subprocess-based, CLI parity) and claude_sdk (SDK-based, 51% cost reduction) per session. The provider abstraction layer handles queue management, worker lifecycle, and delivery pipeline for each.
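Step 1 above amounts to a single POST. The base URL, endpoint path, and payload field names below are placeholders for illustration, not the documented API:

```python
# Hypothetical sketch of launching a Conveyor session with one POST.
# The host, path, and payload shape are assumptions.
import json
import urllib.request

BASE_URL = "https://conveyor.internal.example"  # placeholder host

def build_session_request(prompt: str, provider: str = "claude_sdk"):
    """Build the POST request that would create a new agent session."""
    payload = {"prompt": prompt, "provider": provider}
    return urllib.request.Request(
        f"{BASE_URL}/sessions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_session_request("Summarise open PRs", provider="claude_code")
print(req.get_method(), req.full_url)  # POST https://conveyor.internal.example/sessions
```

Because every operation is API-addressable, the same request can come from a CI job, a cron trigger, or a laptop that disconnects a second later.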
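The streaming in step 2 can be illustrated with a small frame accumulator. The message shapes here ({"type": "delta", ...} and {"type": "message", ...}) are assumed for the sketch, not Conveyor's documented wire format:

```python
# Sketch of consuming a WebSocket stream of agent output.
# The frame schema is an assumption for illustration.
import json

def accumulate_frames(frames):
    """Fold a stream of WebSocket frames into the full transcript."""
    buffer = []
    for raw in frames:
        msg = json.loads(raw)
        if msg.get("type") == "delta":
            buffer.append(msg["text"])      # character-level delta (claude_sdk)
        elif msg.get("type") == "message":
            buffer.append(msg["content"])   # structured message (claude_code)
    return "".join(buffer)

frames = [
    '{"type": "delta", "text": "Hel"}',
    '{"type": "delta", "text": "lo"}',
]
print(accumulate_frames(frames))  # Hello
```

The consumer renders each frame as it arrives, which is why the UI never needs a polling interval: the transcript is always as fresh as the last frame.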
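The resume semantics of step 3 can be modelled with a toy in-memory store standing in for the PostgreSQL-backed history (the real storage layer is not shown, and this class is not part of Conveyor's API):

```python
# Toy stand-in for the persisted conversation history. Illustrates the
# resume contract, not the actual PostgreSQL schema.
class SessionStore:
    def __init__(self):
        self._history = {}  # session_id -> ordered list of turns

    def append(self, session_id: str, turn: dict) -> None:
        self._history.setdefault(session_id, []).append(turn)

    def resume(self, session_id: str) -> list:
        """Return the full context a resumed agent starts from."""
        return list(self._history.get(session_id, []))

store = SessionStore()
store.append("sess-1", {"role": "user", "content": "audit deps"})
store.append("sess-1", {"role": "assistant", "content": "scanning..."})
# After a restart or disconnect, one call recovers the whole conversation:
print(len(store.resume("sess-1")))  # 2
```

Because the history lives outside the worker process, a crashed or rescheduled pod changes nothing about what the agent knows when it resumes.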

Neomanex internal AI operations

Conveyor powers autonomous agent execution across the Neomanex portfolio — scheduled tasks, unattended jobs, and remote agent sessions for ConvOps and Gnosari. Internal since early 2026; load-bearing for every autonomous AI operation in the stack.

The contrast

How we compare

  • Them

    Run agents locally, tied to your terminal

    Us

    Remote execution as managed Kubernetes workloads — sessions persist when you close your laptop.

  • Them

    Agent sandboxes (E2B, Daytona) — ephemeral, disposable environments

    Us

    Persistent, stateful session management with multi-turn resume, streaming, and lifecycle audit trail.

  • Them

    Build your own agent execution infrastructure

    Us

    REST API, WebSocket streaming, PostgreSQL persistence, terminal UI, and Kubernetes workload management — out of the box.

  • Them

    Raw compute — SSH into a box and run a process

    Us

    Managed execution layer with provider orchestration, session resume, real-time streaming, and a management UI.
