Implementation Blueprint: From Architecture to a Running MVP on Raspberry Pi 5
A Service-Oriented Layout That Avoids “Script Spaghetti”

Introduction
The fastest way to kill an Edge AI project is to grow it as a pile of scripts. It starts as “just one Python file,” and ends as an unmaintainable system where changing one module breaks three others. This article provides a concrete implementation blueprint: directory structure, service layout, process separation, a minimal working MVP, and a clean path to run everything as Linux services on Raspberry Pi 5.

Goals of the Blueprint
This layout optimizes for:
– clarity of responsibilities (one service = one job)
– stable interfaces between components
– deterministic startup and restart behavior
– debuggability via structured logs
– incremental expansion without rewrites

Process Separation: What Runs Where
We separate the system into processes so that failures and resource spikes do not cascade:
1) vision_service: camera capture + face detection + embeddings + candidate identity
2) identity_service: enrollment DB + matching + confidence gating + identity state
3) scenario_service: event bus consumer + deterministic scenario selection + action requests
4) dialogue_service: STT/intent + response generation (local or API) + TTS requests
5) knowledge_service: RSS fetch + extraction + ranking + summarization + structured results
6) automation_service: email/alerts/calls/webhooks with strict whitelisting
7) api_gateway (optional MVP+): local HTTP API for admin + health checks
The MVP uses only vision_service + identity_service + scenario_service, plus an optional TTS stub.
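
To make "one process per service" concrete, each services/*/main.py follows the same minimal entrypoint shape. The sketch below is illustrative (the loop body is a placeholder until the event bus exists); the point is that every service launches as its own OS process via python -m, so a crash in one stays contained:

# services/vision_service/main.py (illustrative skeleton)
import logging
import time

def run() -> None:
    logging.basicConfig(level=logging.INFO)
    log = logging.getLogger("vision_service")
    log.info("vision_service starting")
    while True:
        # Placeholder: capture frame -> detect -> embed -> publish FaceSeen.
        time.sleep(0.1)

if __name__ == "__main__":
    run()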

Directory Structure (Concrete)
Use a single repo with clear boundaries:
repo/
  README.md
  pyproject.toml
  .env.example
  configs/
    app.yaml
    topics.yaml
    scenarios.yaml
    identities.yaml
  data/
    embeddings/
    identities.db
    cache/
    logs/
  services/
    vision_service/
      __init__.py
      main.py
      camera.py
      detect.py
      embed.py
      config.py
    identity_service/
      __init__.py
      main.py
      store.py
      match.py
      thresholds.py
      config.py
    scenario_service/
      __init__.py
      main.py
      rules.py
      actions.py
      cooldowns.py
      config.py
    dialogue_service/
      __init__.py
      main.py
      stt.py
      intent.py
      llm.py
      tts.py
      config.py
    knowledge_service/
      __init__.py
      main.py
      rss.py
      extract.py
      rank.py
      summarize.py
      config.py
    automation_service/
      __init__.py
      main.py
      email.py
      notify.py
      calls.py
      webhooks.py
      policy.py
      config.py
  shared/
    __init__.py
    events.py
    bus.py
    logging.py
    schemas.py
    security.py
  scripts/
    enroll_identity.py
    test_camera.py
    inject_event.py
  deploy/
    systemd/
      vision.service
      identity.service
      scenario.service
      dialogue.service
      knowledge.service
      automation.service
    nginx/
      local.conf

Rule #1: shared/ contains only “boring” cross-cutting utilities (events, logging, schemas). If shared grows into business logic, you are rebuilding a monolith.

Interfaces: Event Bus First
To avoid tight coupling, services communicate through an event bus abstraction, which can start simple:
– MVP option A: local file-backed queue (simple, reliable)
– MVP option B: Redis pub/sub (cleaner, still lightweight)
– MVP option C: MQTT (good if you later add microcontrollers)
In all cases, messages are structured events:
Event = {type, timestamp, source, payload, trace_id}
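
A minimal, stdlib-only sketch of that envelope plus MVP option A (the file-backed queue) could look like the following. The class and method names (Event, FileBus, publish, consume) are illustrative choices for shared/events.py and shared/bus.py, not a fixed API:

# shared/events.py (sketch): the structured event envelope
import json
import time
import uuid
from dataclasses import asdict, dataclass, field
from pathlib import Path
from typing import Any

@dataclass
class Event:
    type: str                  # e.g. "FaceSeen", "IdentityResolved"
    source: str                # emitting service, e.g. "vision_service"
    payload: dict[str, Any]
    timestamp: float = field(default_factory=time.time)
    trace_id: str = field(default_factory=lambda: uuid.uuid4().hex)

    def to_json(self) -> str:
        return json.dumps(asdict(self))

    @classmethod
    def from_json(cls, line: str) -> "Event":
        return cls(**json.loads(line))

# shared/bus.py (sketch): MVP option A, one append-only JSONL file per topic
class FileBus:
    def __init__(self, root: Path = Path("data/cache/bus")):
        self.root = root
        self.root.mkdir(parents=True, exist_ok=True)

    def publish(self, topic: str, event: Event) -> None:
        # One JSON line per event; append-only keeps the MVP simple.
        with open(self.root / f"{topic}.jsonl", "a") as f:
            f.write(event.to_json() + "\n")

    def consume(self, topic: str, offset: int = 0):
        # Naive polling reader: yields events after `offset` lines.
        # A real consumer would persist its offset between runs.
        path = self.root / f"{topic}.jsonl"
        if not path.exists():
            return
        with open(path) as f:
            for i, line in enumerate(f):
                if i >= offset:
                    yield Event.from_json(line)

Swapping FileBus for Redis pub/sub or MQTT later changes only this module, because services depend on publish/consume, not on the transport.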

Minimal Working MVP (Day-1 Target)
MVP behavior:
1) vision_service detects a face and produces embedding
2) identity_service matches embedding to known identities
3) scenario_service selects a greeting scenario
4) scenario_service triggers a “speak” action (initially a stub that prints)
That’s enough to validate the full architecture loop end-to-end.

MVP Event Flow
vision_service emits:
– FaceSeen {embedding_id, quality, bbox, cam_id}
identity_service emits:
– IdentityResolved {identity: owner|guest|unknown, name?, confidence}
scenario_service emits:
– ActionRequested {action: speak, text, voice_profile}
(automation/dialogue/knowledge can be added later without redesign.)
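
As a hedged sketch of scenario_service's side of these contracts (the greeting texts and helper names are illustrative, and the Event and FileBus helpers from the bus sketch above are assumed):

# scenario_service/rules.py (sketch): IdentityResolved -> ActionRequested
GREETINGS = {
    "owner": "Welcome home, {name}.",
    "guest": "Hello, guest.",
    "unknown": "Hello. I don't recognize you yet.",
}

def handle_identity(event: Event, bus: FileBus) -> None:
    p = event.payload  # e.g. {"identity": "owner", "name": "Alex", "confidence": 0.93}
    text = GREETINGS.get(p["identity"], GREETINGS["unknown"]).format(name=p.get("name", ""))
    bus.publish("actions", Event(
        type="ActionRequested",
        source="scenario_service",
        payload={"action": "speak", "text": text, "voice_profile": "default"},
        trace_id=event.trace_id,  # keep one trace_id across the whole flow
    ))

def speak_stub(event: Event) -> None:
    # Day-1 "speak" action: just print; replaced by a real TTS call later.
    print(f"[speak] {event.payload['text']}")

Propagating event.trace_id unchanged is what makes cross-service log correlation work later.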

Configuration Strategy (No Hardcoding)
All behavior must live in configs/:
– thresholds (recognition confidence)
– identities (enrolled people)
– scenarios (rules + priorities + cooldowns)
– topics (for knowledge engine later)
The code reads configs once at startup; reloading is done by restarting the service. Avoid hot-reload complexity early.
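
A sketch of the read-at-startup pattern, assuming PyYAML as a dependency; the key names and example values are illustrative, not a prescribed schema:

# configs/app.yaml (illustrative content):
#   identity:
#     match_threshold: 0.62   # similarity cutoff for a confident match
#     unknown_grace_s: 3.0    # seconds before declaring "unknown"

# services/identity_service/config.py (sketch)
from dataclasses import dataclass
from pathlib import Path

import yaml  # PyYAML, assumed as a dependency

@dataclass(frozen=True)
class IdentityConfig:
    match_threshold: float
    unknown_grace_s: float

def load(path: Path = Path("configs/app.yaml")) -> IdentityConfig:
    raw = yaml.safe_load(path.read_text())["identity"]
    return IdentityConfig(
        match_threshold=float(raw["match_threshold"]),
        unknown_grace_s=float(raw["unknown_grace_s"]),
    )

The frozen dataclass doubles as the "no hidden globals" config object: services receive it explicitly instead of reading module-level state.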

Logging and Observability
Every service logs structured JSON lines:
– timestamp, service, level, event_type, trace_id, message
Store logs in data/logs/ and rotate them. A single trace_id per flow makes cross-service debugging easy.
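
A stdlib-only sketch of that format for shared/logging.py; passing event_type and trace_id through extra= is standard logging behavior, while the helper names are illustrative:

# shared/logging.py (sketch): one JSON object per line, with rotation
import json
import logging
from logging.handlers import RotatingFileHandler
from pathlib import Path

class JsonFormatter(logging.Formatter):
    def format(self, record: logging.LogRecord) -> str:
        return json.dumps({
            "timestamp": self.formatTime(record),
            "service": record.name,
            "level": record.levelname,
            "event_type": getattr(record, "event_type", None),
            "trace_id": getattr(record, "trace_id", None),
            "message": record.getMessage(),
        })

def setup(service: str) -> logging.Logger:
    Path("data/logs").mkdir(parents=True, exist_ok=True)
    handler = RotatingFileHandler(
        f"data/logs/{service}.jsonl", maxBytes=10_000_000, backupCount=5)
    handler.setFormatter(JsonFormatter())
    logger = logging.getLogger(service)
    logger.addHandler(handler)
    logger.setLevel(logging.INFO)
    return logger

# Usage:
#   log = setup("scenario_service")
#   log.info("greeting selected", extra={"event_type": "ActionRequested",
#                                        "trace_id": event.trace_id})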

Running as Services (systemd)
systemd is the simplest reliable supervisor on Raspberry Pi OS.
Each service gets:
– its own user (optional but ideal)
– its own working directory
– restart on failure
– environment file for secrets
This avoids the operational fragility of running everything in a terminal forever.

Example systemd unit (Pattern)
deploy/systemd/vision.service should define:
– ExecStart: python -m services.vision_service.main
– WorkingDirectory: repo/
– Restart: on-failure
– EnvironmentFile: /etc/yourassistant/env
Repeat the pattern for identity and scenario; start with those three services only.
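
Read as a concrete file, the pattern could look like the sketch below; the service user, install path (/opt/assistant), and python3 location are assumptions to adapt, while EnvironmentFile matches the path above:

# deploy/systemd/vision.service (sketch)
[Unit]
Description=Vision service (camera capture, detection, embeddings)
After=network.target

[Service]
User=assistant
WorkingDirectory=/opt/assistant
EnvironmentFile=/etc/yourassistant/env
ExecStart=/usr/bin/python3 -m services.vision_service.main
Restart=on-failure
RestartSec=2

[Install]
WantedBy=multi-user.target

Enable it with sudo systemctl enable --now vision.service; the identity and scenario units follow the same template.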

Avoiding Script Spaghetti (Hard Rules)
1) No cross-imports between services (only shared/)
2) No hidden globals (use config objects)
3) No “just call that function” across boundaries—use events
4) One responsibility per service
5) Add features by adding modules, not by expanding main.py
If a file exceeds a few hundred lines, split it.

Incremental Expansion Plan
After MVP:
Step 1: replace “speak stub” with a real TTS service call
Step 2: add dialogue_service for voice commands
Step 3: add knowledge_service for RSS digests
Step 4: add automation_service with strict whitelists
At every step, the event contracts remain stable.

What Comes Next
Next article: “MVP Build Guide: Installing Dependencies, Creating the Event Bus, and Running Vision→Identity→Scenario on Raspberry Pi 5.” This will include concrete commands, minimal code skeletons, and the first runnable demo.
