
AI Systems & NLP

Luna (Flagship) • Ainor • NLP Toolkit • Vision & Edge

Luna - assistant core, embedded hive, on-chain access

Mega Project

Luna - Assistant • Hive • Chain

A next-gen home and edge assistant that orchestrates devices, powers a robotics hive, and offers a build gated behind opt-in on-chain access - alongside a redacted open-source core.

  • Assistant Core: streaming ASR ↔ TTS with barge-in, intent routing and tool calls, privacy modes, guardrails.
  • Embedded Hive: controls AceBot, X-EYE, VIGI; shared state bus, policies, watchdogs.
  • Chain Access: token or NFT-gated utilities inside a supervised environment.
  • OSS Core: redacted, production-minded edition for local builds and extensions.
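
The barge-in behavior above can be pictured as a small event-driven loop. This is a toy sketch, not Luna's actual API: `BargeInController` and its callback names are invented for illustration.

```python
import threading
import time

class BargeInController:
    """Toy model of barge-in: TTS playback stops as soon as the ASR
    front end reports that the user has started speaking. Names are
    illustrative, not Luna's real API."""

    def __init__(self):
        self._user_speaking = threading.Event()

    def on_asr_voice_activity(self):
        # Called by the (hypothetical) streaming ASR when speech starts.
        self._user_speaking.set()

    def speak(self, chunks, chunk_seconds=0.01):
        """Play TTS chunk by chunk; stop early on barge-in.
        Returns the chunks that were actually played."""
        played = []
        for chunk in chunks:
            if self._user_speaking.is_set():
                break  # barge-in: yield the floor immediately
            played.append(chunk)
            time.sleep(chunk_seconds)  # stand-in for audio output
        self._user_speaking.clear()
        return played

ctl = BargeInController()
ctl.on_asr_voice_activity()  # user interrupts before playback begins
print(ctl.speak(["Hello,", "how", "can", "I", "help?"]))  # → []
```

In a real duplex pipeline the event would be set by a voice-activity detector running on the microphone stream while TTS plays.
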
Ainor - Self-Evolving Living House AI System GUI

Rank #2 System

Ainor - Self-Evolving Living House AI

Ainor is a privacy-first, on-device AI with advanced visual cognition and compositional language generation. It learns through interaction and never relies on templates. Professional Svelte GUI on Raspberry Pi.

  • Visual Cognition: PiCamera-based face, object, and gesture tracking that grounds contextual responses.
  • Compositional AI: semantic roles and grammar parsing for unique, context-aware replies.
  • Self-Evolution: Dream State retraining consolidates memory and refines the core model.
  • Frontend: Svelte UI backed by Flask, with voice orb, live conversation panel, and emotion grid.
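
The "Dream State" pass is not specified here, so the following is only a hedged sketch of the general idea - decay long-term weights, fold in recent episodes, prune - with invented names and toy data:

```python
from collections import Counter

def dream_state_consolidate(episodic, long_term, decay=0.9, keep_top=100):
    """Decay existing long-term weights, fold in recent episodic turns,
    then prune to the strongest associations. `episodic` is a list of
    token lists; `long_term` maps token -> weight. Illustrative only."""
    merged = {tok: w * decay for tok, w in long_term.items()}
    for turn in episodic:
        for tok in turn:
            merged[tok] = merged.get(tok, 0.0) + 1.0
    return dict(Counter(merged).most_common(keep_top))

long_term = {"lights": 4.0, "kettle": 0.2}
episodic = [["dim", "the", "lights"], ["lights", "off"]]
print(dream_state_consolidate(episodic, long_term, keep_top=3))
```

Ainor's real cycle presumably retrains a model rather than a weight table; the shape of the pass - consolidate, then prune - is the point here.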

Systems

Luna on-device assistant UI

Luna - Voice Assistant (Flagship)

Assistant core with tool and intent routing, plus an embedded hive mode for robotics. A redacted OSS core ships for local builds.

  • ASR ↔ TTS
  • Intent Router
  • Guardrails
  • Tracing

Ainor professional GUI dashboard

Ainor - Living House AI (Rank #2)

On-device smart home AI with visual cognition, a Dream State learning cycle, and compositional, contextual language generation. Built with Python and Svelte.

  • Visual Cognition
  • Self-Evolving
  • Multi-Layer Memory
  • PiCamera

Chailse L-system explorer (PyQt)

Chailse - Interactive L-Systems

Define, iterate, and visualize L-systems in 2D or 3D with live controls and export (PNG or SVG, OBJ or STL). Procedural design and pedagogy.

  • PyQt
  • NumPy
  • SVG or PNG
  • OBJ or STL
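
The rewrite-and-render loop behind an L-system explorer fits in a few lines. This sketch uses a plain Python turtle interpreter (no PyQt) on the classic quadratic Koch rule; Chailse's actual renderer is of course richer.

```python
import math

def expand(axiom, rules, iterations):
    """Apply the L-system rewrite rules `iterations` times."""
    s = axiom
    for _ in range(iterations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

def turtle_points(commands, angle_deg=90.0, step=1.0):
    """Interpret F = forward, + = turn left, - = turn right; return the 2D path."""
    x = y = 0.0
    heading = 0.0
    points = [(x, y)]
    for ch in commands:
        if ch == "F":
            x += step * math.cos(math.radians(heading))
            y += step * math.sin(math.radians(heading))
            points.append((round(x, 9), round(y, 9)))
        elif ch == "+":
            heading += angle_deg
        elif ch == "-":
            heading -= angle_deg
    return points

# Quadratic Koch rule, one iteration: F -> F+F-F-F+F
path = turtle_points(expand("F", {"F": "F+F-F-F+F"}, 1))
print(len(path), path[-1])  # 6 points, ending at (3.0, 0.0)
```

Exporting the path as SVG polylines or extruding it to OBJ/STL is then a serialization step over `points`.
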
VIGI AI Cam - vision & edge tie-in

VIGI AI Cam - Vision & Edge

PIR and dual ultrasonic sensors paired with an ESP32-CAM for threat scoring and servo targeting. Lives in Robotics; surfaced here as the assistant’s vision & edge module.

  • ESP32-CAM
  • PIR
  • HC-SR04×2
  • Servo
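
One plausible way to fuse these sensors into a threat score and a servo deflection - purely illustrative thresholds and weights, not VIGI's real firmware:

```python
def threat_score(pir_motion, dist_left_cm, dist_right_cm,
                 near_cm=50.0, far_cm=300.0):
    """Fuse PIR motion with the two HC-SR04 readings into a 0-1 score:
    closer objects score higher, motion adds a fixed boost.
    All weights and thresholds are made-up examples."""
    nearest = min(dist_left_cm, dist_right_cm)
    proximity = max(0.0, min(1.0, (far_cm - nearest) / (far_cm - near_cm)))
    return min(1.0, 0.6 * proximity + (0.4 if pir_motion else 0.0))

def servo_angle(dist_left_cm, dist_right_cm, max_deflection_deg=45.0):
    """Deflect toward the closer side: positive angles mean the object
    is nearer the left sensor, negative the right."""
    total = dist_left_cm + dist_right_cm
    if total == 0:
        return 0.0
    return max_deflection_deg * (dist_right_cm - dist_left_cm) / total

print(threat_score(True, 48.0, 400.0))  # motion + close object -> 1.0
print(servo_angle(50.0, 150.0))         # object on the left -> 22.5
```
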
NLP toolkit - tokenization and labeling views

NLP Toolkit - Text Ops

Preprocessing, labeling helpers, prompt libraries, evaluation suites, and a compact RAG over local docs. Deterministic runs and metrics-first evaluation.

  • RAG
  • Eval
  • Determinism
  • CLI
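
A compact, deterministic retrieval core needs no ML model at all. This sketch (illustrative, not the toolkit's actual code) uses a hashing bag-of-words embedding plus cosine similarity, so repeated runs rank identically:

```python
import hashlib
import math

def embed(text, dims=64):
    """Deterministic bag-of-words hashing embedding - no model weights,
    so every run produces the same vectors."""
    vec = [0.0] * dims
    for tok in text.lower().split():
        h = int(hashlib.sha256(tok.encode()).hexdigest(), 16)
        vec[h % dims] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def retrieve(query, docs, k=2):
    """Rank docs by cosine similarity to the query, with a stable tie-break."""
    q = embed(query)
    scored = [(sum(a * b for a, b in zip(q, embed(d))), d) for d in docs]
    scored.sort(key=lambda pair: (-pair[0], pair[1]))
    return [d for _, d in scored[:k]]

docs = [
    "how to reset the wifi router",
    "dream state retraining schedule",
    "servo calibration for the camera",
]
print(retrieve("wifi router reset", docs, k=1))
```

A production RAG would swap `embed` for a real embedding model; the deterministic interface stays the same.
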
Data pipeline DAG and checks

Data Pipeline - Clean → Validate → Serve

Ingest → schema validation → feature prep → caching → API or IPC. JSON logs with rotation, correlation IDs, and health checks.

  • Validation
  • Caching
  • API
  • Health
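
The clean → validate → serve stages can be sketched as a fail-closed schema gate in front of a cache; the schema and field names here are hypothetical:

```python
SCHEMA = {"device_id": str, "temp_c": float, "ts": int}  # hypothetical schema

def validate(record, schema=SCHEMA):
    """Schema gate: report missing keys and wrong types."""
    errors = []
    for key, typ in schema.items():
        if key not in record:
            errors.append(f"missing: {key}")
        elif not isinstance(record[key], typ):
            errors.append(f"bad type for {key}: {type(record[key]).__name__}")
    return errors

CACHE = {}

def serve(record):
    """Fail closed: refuse to cache or serve anything that fails validation."""
    errors = validate(record)
    if errors:
        raise ValueError("; ".join(errors))
    CACHE[record["device_id"]] = record
    return {"ok": True, "cached": len(CACHE)}

print(serve({"device_id": "pi-01", "temp_c": 21.5, "ts": 1700000000}))
```
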
Observability dashboard and traces

Model Ops - Observability & Guardrails

Event or span tracing across ASR, NLU, policy, tools, and TTS. Quotas, cooldowns, fail-closed execution, and exportable session summaries.

  • Tracing
  • Policies
  • Quotas
  • QA
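
Quotas, cooldowns, and span tracing compose naturally as a decorator. This is a minimal sketch with invented names (`Quota`, `traced`), not the project's real instrumentation:

```python
import time
from functools import wraps

class Quota:
    """Per-tool call budget inside a cooldown window; fail closed when spent."""
    def __init__(self, limit, cooldown_s=60.0):
        self.limit, self.cooldown_s = limit, cooldown_s
        self.calls, self.window_start = 0, time.monotonic()

    def check(self):
        now = time.monotonic()
        if now - self.window_start >= self.cooldown_s:
            self.calls, self.window_start = 0, now  # new window
        if self.calls >= self.limit:
            raise PermissionError("quota exhausted: failing closed")
        self.calls += 1

SPANS = []  # in a real system these would be exported, not kept in memory

def traced(stage, quota=None):
    """Enforce the quota, run the stage, and record a latency span."""
    def deco(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            if quota is not None:
                quota.check()  # raises before the tool ever runs
            t0 = time.monotonic()
            try:
                return fn(*args, **kwargs)
            finally:
                SPANS.append({"stage": stage,
                              "ms": (time.monotonic() - t0) * 1e3})
        return wrapper
    return deco

@traced("tools.lights", quota=Quota(limit=2))
def set_lights(level):
    return f"lights at {level}%"

print(set_lights(40), set_lights(0))  # two calls fit the quota
# a third call within the 60 s window raises PermissionError
```

Because `check` runs before the traced body, an exhausted quota blocks the tool entirely - the fail-closed behavior the card describes.
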

Architecture Overview

Control Mesh

Luna is #1 and Ainor is #2, and each can control the other. All other systems can be controlled by them but cannot control either one. Communication uses MQTT or REST with a shared policy bus.
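
The mesh rule can be encoded as a tiny policy check - a sketch assuming subordinate systems may not command either top-ranked system, and may reach each other freely:

```python
RANK = {"luna": 1, "ainor": 2}  # the two top-ranked systems

def can_control(src, dst):
    """Top-ranked systems may control anything, including each other;
    subordinates may not command a top-ranked system. Assumed reading
    of the mesh policy, not the shipped policy bus."""
    if src in RANK:
        return True
    return dst not in RANK

print(can_control("luna", "ainor"),  # True
      can_control("ainor", "vigi"),  # True
      can_control("vigi", "luna"))   # False
```
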

I/O & Runtime

Streaming mic capture and TTS playback with barge-in. Crash-safe queues and offline-first fallbacks.

NLU & Routing

Intent mapping and slot extraction feed a tool or action layer. Confirmations for sensitive tasks.
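
Intent mapping with slot extraction and confirmation gating might look like this regex-based sketch; the intents and patterns are hypothetical:

```python
import re

# Hypothetical intents and patterns - not Luna's actual routing table.
ROUTES = {
    "lights.set": {
        "pattern": re.compile(r"set lights to (?P<level>\d+)"),
        "sensitive": False,
    },
    "door.unlock": {
        "pattern": re.compile(r"unlock the (?P<door>\w+) door"),
        "sensitive": True,
    },
}

def route(utterance, confirmed=False):
    """Map an utterance to (intent, slots); sensitive intents are held
    behind an explicit confirmation instead of being dispatched."""
    for intent, spec in ROUTES.items():
        match = spec["pattern"].search(utterance.lower())
        if not match:
            continue
        if spec["sensitive"] and not confirmed:
            return ("confirm.request", {"intent": intent, **match.groupdict()})
        return (intent, match.groupdict())
    return ("fallback.clarify", {})

print(route("set lights to 30"))
print(route("unlock the front door"))                  # held for confirmation
print(route("unlock the front door", confirmed=True))  # dispatched
```
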

Data & Knowledge

Local embeddings with chunked docs for RAG. Deterministic prompt sets for measurable evaluations.
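
Deterministic chunking is what makes local indexing reproducible: fixed windows with fixed overlap yield identical chunks on every pass. A sketch with illustrative parameters:

```python
def chunk(text, max_words=40, overlap=8):
    """Fixed word windows with fixed overlap: the same input always
    yields the same chunks, so the embedding index is reproducible."""
    words = text.split()
    if not words:
        return []
    step = max_words - overlap
    chunks = []
    for start in range(0, len(words), step):
        chunks.append(" ".join(words[start:start + max_words]))
        if start + max_words >= len(words):
            break
    return chunks

doc = " ".join(f"w{i}" for i in range(100))
parts = chunk(doc)
print(len(parts), parts[1].split()[0])  # 3 chunks, second starts at w32
```
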

Observability

JSON logs, rotation, correlation IDs, per-stage latencies, and session rollups for QA and regression tracking.
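
The log line and session-rollup shape described above can be sketched as follows; function and field names are hypothetical:

```python
import json
import time
import uuid

def new_session():
    return {"correlation_id": uuid.uuid4().hex, "stages": []}

def log_stage(session, stage, t_start):
    """Emit one JSON log line per stage, sharing the session's
    correlation ID and recording the stage latency in ms."""
    entry = {
        "correlation_id": session["correlation_id"],
        "stage": stage,
        "latency_ms": round((time.monotonic() - t_start) * 1e3, 3),
    }
    session["stages"].append(entry)
    print(json.dumps(entry))

def rollup(session):
    """Session summary for QA: total latency and the slowest stage."""
    total = sum(s["latency_ms"] for s in session["stages"])
    slowest = max(session["stages"], key=lambda s: s["latency_ms"])
    return {"correlation_id": session["correlation_id"],
            "total_ms": round(total, 3),
            "slowest_stage": slowest["stage"]}

session = new_session()
for stage in ("asr", "nlu", "tts"):
    t0 = time.monotonic()
    time.sleep(0.01)  # stand-in for real stage work
    log_stage(session, stage, t0)
print(rollup(session))
```

Rotation would be handled by the logging backend (e.g. a rotating file handler) rather than in the emit path.
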

Want an assistant that ships?

On-device voice, robust NLP, and documented pipelines with observability and guardrails.