CLAUDE.md

This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.

What This Is

PalmOilAI — an Angular 20 single-page application for palm oil fruit bunch (FFB) ripeness detection. Supports three inference engines: tflite and onnx run 100% client-side in the browser via WASM; socket streams to the NestJS backend for server-side inference. The NestJS backend is required for the socket engine, chat proxy, and remote history.

Detection classes: Ripe, Unripe, Underripe, Overripe, Abnormal, Empty_Bunch.

Commands

npm install
ng serve                        # Dev server at http://localhost:4200
ng serve --host 0.0.0.0         # Expose to LAN (device testing)
ng build                        # Production build → dist/
ng test                         # Karma + Jasmine unit tests

Architecture

Routing

/analyzer → AnalyzerComponent (main scanner UI, default route)
/history → HistoryComponent (local + remote vault of past scans)
/chatbot → ChatbotComponent (chat interface backed by n8n via WebSocket)

Components (src/app/components/)

AnalyzerComponent — Three-engine scan hub:

  • Engine selector: tflite (browser WASM), onnx (browser WASM), socket (NestJS remote)
  • Browser engines: file upload → local WASM inference → bounding-box canvas draw
  • Socket engine: webcam snap or gallery image → Base64 → vision:analyze → canvas draw
  • Batch ingestion: queues multiple files, sends one-by-one via socket, collects results into FullSessionReport
  • Evidence canvas: separate from snap canvas; renders bounding boxes for audit drill-down
  • Computes BatchResult[] and FullSessionReport (timing, success/fail counts, raw tensor sample)

HistoryComponent — Dual-tab vault:

  • Local tab: browser localStorage via LocalHistoryService, up to 20 records
  • Remote tab: NestJS SQLite records via RemoteInferenceService, grouped by batch_id into BatchSessionGroup with aggregate summaries
  • Per-record delete and clear-all (with confirmation)

ChatbotComponent — n8n RAG chat:

  • Messages sent via ChatSocketService.send() (promise-based one-shot listener)
  • NestJS proxies to n8n server-to-server (no browser CORS)
  • Unwraps n8n array response to first element
  • Shows per-message response latency; error fallback with NestJS→n8n diagnostic hint

PerformanceHudComponent — Floating, draggable, collapsible overlay (CDK drag-drop):

  • Live CPU/RAM for NestJS, n8n, and Ollama processes via SurveillanceService
  • Mounts in app root and never unmounts (zombie socket pattern)

HeaderComponent — App header with dark/light theme toggle via ThemeService.

BottomNavComponent — Mobile fixed tab bar (hidden on desktop). Three tabs: Scanner | Intelligence | Vault.

Services (src/app/services/ and src/app/core/services/)

LocalInferenceService — Core browser AI engine. Dispatches to ONNX or TFLite backend based on model file extension:

  • ONNX path: onnxruntime-web with WASM execution provider. Input: [1, 3, 640, 640] (CHW).
  • TFLite path: Uses the globally-injected window.tflite object. Input is transposed CHW→HWC to [1, 640, 640, 3] before prediction.
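The CHW→HWC transpose mentioned above can be sketched as a small pure function — a minimal illustration, not the actual LocalInferenceService code; shapes follow the text ([1, 3, 640, 640] → [1, 640, 640, 3]):

```typescript
// Sketch: transpose a planar CHW Float32Array (ONNX layout) into
// interleaved HWC (TFLite layout). Function name is illustrative.
function chwToHwc(chw: Float32Array, height: number, width: number, channels: number): Float32Array {
  const hwc = new Float32Array(chw.length);
  for (let c = 0; c < channels; c++) {
    for (let y = 0; y < height; y++) {
      for (let x = 0; x < width; x++) {
        // CHW index: c*H*W + y*W + x   →   HWC index: (y*W + x)*C + c
        hwc[(y * width + x) * channels + c] = chw[c * height * width + y * width + x];
      }
    }
  }
  return hwc;
}
```

For the 640×640 input this would be called as `chwToHwc(input, 640, 640, 3)` before `window.tflite` prediction.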

ImageProcessorService — Resizes any image to 640×640 via offscreen Canvas, then converts RGBA→CHW Float32Array normalized to [0.0, 1.0].
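The RGBA→CHW conversion can be sketched as follows — a minimal version assuming the 640×640 resize has already happened; the real service reads pixels from an offscreen canvas via getImageData:

```typescript
// Sketch: convert interleaved RGBA bytes into a planar CHW
// Float32Array normalized to [0, 1]. Alpha is discarded.
function rgbaToChw(rgba: Uint8ClampedArray, width: number, height: number): Float32Array {
  const plane = width * height;
  const chw = new Float32Array(3 * plane);
  for (let i = 0; i < plane; i++) {
    chw[i] = rgba[i * 4] / 255;                 // R plane
    chw[plane + i] = rgba[i * 4 + 1] / 255;     // G plane
    chw[2 * plane + i] = rgba[i * 4 + 2] / 255; // B plane
  }
  return chw;
}
```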

LocalHistoryService — Persists up to 20 scan records (FIFO) to localStorage key palm_oil_vault. Each record includes detection summary, latency, engine type, Base64 thumbnail, and bounding boxes.
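The 20-record cap can be sketched as a pure newest-first push — names (`VaultRecord`, `pushRecord`, `MAX_RECORDS`) are illustrative, not taken from the actual service, and the localStorage write is omitted:

```typescript
// Sketch: keep at most 20 records, newest first; the oldest record
// is dropped once the cap is reached. The real service would then
// JSON.stringify the array into the palm_oil_vault localStorage key.
interface VaultRecord { id: string; timestamp: number; }
const MAX_RECORDS = 20;

function pushRecord(existing: VaultRecord[], record: VaultRecord): VaultRecord[] {
  return [record, ...existing].slice(0, MAX_RECORDS);
}
```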

InferenceService (src/app/core/services/) — Hub dispatcher using Angular Signals:

  • Signals: mode (local | api), localEngine (onnx | tflite), detections, summary
  • analyze(base64, w, h) dispatches to local (LocalInferenceService) or remote (RemoteInferenceService)
  • Local path: base64 → blob → file → LocalInferenceService → parse detections
  • Remote path: blob → RemoteInferenceService.analyze() → map detections
  • Computes class-count summary
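The class-count summary in the last bullet amounts to a tally over detection labels — a minimal sketch with an assumed `Detection` shape, not the hub's actual types:

```typescript
// Sketch: count detections per ripeness class.
interface Detection { label: string; confidence: number; }

function summarize(detections: Detection[]): Record<string, number> {
  const summary: Record<string, number> = {};
  for (const d of detections) {
    summary[d.label] = (summary[d.label] ?? 0) + 1;
  }
  return summary;
}
```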

RemoteInferenceService (src/app/core/services/) — HTTP client for NestJS:

  • analyze(blob) → POST /analyze → AnalysisResponse
  • getHistory(), deleteRecord(archiveId), clearAll()
  • Hits http://localhost:3000

VisionSocketService — Socket.io client for NestJS /vision namespace:

  • snapAndSend(videoEl, batchId?) — captures webcam frame and emits as raw Base64
  • sendBase64(dataUrl, batchId?) — emits gallery image as raw Base64
  • Hard rule: raw, uncompressed Base64 strings only — no binary frames, no WebRTC
  • Signals: connected, analyzing, lastResult, lastError
  • Zombie socket pattern: socket is never explicitly closed

ChatSocketService — Socket.io client sharing NestJS /vision namespace for chat:

  • send(message): Promise<ChatResult> — one-shot listener; emits chat:send, resolves on chat:result or rejects on chat:error
  • Unwraps n8n response variants (output | answer | response | text)
  • Zombie socket pattern
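The variant unwrapping can be sketched like this — the payload type is assumed, and the fallback behaviour is an illustration rather than the service's exact logic:

```typescript
// Sketch: take the first element of an n8n array response, then
// resolve whichever known field (output | answer | response | text)
// is present.
function unwrapN8nReply(payload: unknown): string {
  const first = Array.isArray(payload) ? payload[0] : payload;
  if (first && typeof first === 'object') {
    const obj = first as Record<string, unknown>;
    for (const key of ['output', 'answer', 'response', 'text']) {
      if (typeof obj[key] === 'string') return obj[key] as string;
    }
  }
  return typeof first === 'string' ? first : '';
}
```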

SurveillanceService — Socket.io monitoring client for NestJS /monitor namespace:

  • Emits monitor:subscribe on connect; receives monitor:data (500ms ticks) and monitor:status
  • Signals: metrics, connected, nestStatus (computed), n8nStatus (checking | ready | not-ready)
  • formatBytes(bytes) helper for memory display
  • Zombie socket pattern
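A formatBytes helper along the lines described above could look like this — the exact rounding and unit list in SurveillanceService may differ:

```typescript
// Sketch: human-readable memory display for the HUD.
function formatBytes(bytes: number, decimals = 1): string {
  if (bytes === 0) return '0 B';
  const units = ['B', 'KB', 'MB', 'GB', 'TB'];
  const i = Math.min(Math.floor(Math.log(bytes) / Math.log(1024)), units.length - 1);
  return `${(bytes / 1024 ** i).toFixed(decimals)} ${units[i]}`;
}
```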

ThemeService — Dark/light theme toggle:

  • Adds/removes theme-dark / theme-light class on document.body via Renderer2
  • Persists preference to localStorage (palm-ai-theme)
  • Exposes currentTheme$ observable and isDark() boolean

Key Interfaces (src/app/core/interfaces/)

BatchResult — Single scan audit record:

  • image_id, timestamp, status (ok | error)
  • detections[]: bunch_id, ripeness_class, confidence_pct, is_health_alert, bounding_box, norm_box
  • performance: inference_ms, processing_ms, round_trip_ms
  • technical_evidence: engine, archive_id, total_count, threshold, industrial_summary, raw_tensor_sample
  • localBlobUrl?, error?

FullSessionReport — Batch run summary:

  • session_id (UUID), generated_at
  • meta: total_images, successful, failed, total_time_ms, avg_inference_ms, avg_round_trip_ms
  • results: BatchResult[]
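Assembling the meta block from collected results can be sketched as below — field names follow the interface listing above, but the averaging details (ok-only averages, zero on empty) are assumptions:

```typescript
// Sketch: aggregate BatchResult entries into FullSessionReport.meta.
// BatchResultLite is a trimmed stand-in for the full interface.
interface BatchResultLite {
  status: 'ok' | 'error';
  performance: { inference_ms: number; round_trip_ms: number };
}

function buildMeta(results: BatchResultLite[], totalTimeMs: number) {
  const ok = results.filter(r => r.status === 'ok');
  const avg = (xs: number[]) => (xs.length ? xs.reduce((a, b) => a + b, 0) / xs.length : 0);
  return {
    total_images: results.length,
    successful: ok.length,
    failed: results.length - ok.length,
    total_time_ms: totalTimeMs,
    avg_inference_ms: avg(ok.map(r => r.performance.inference_ms)),
    avg_round_trip_ms: avg(ok.map(r => r.performance.round_trip_ms)),
  };
}
```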

TFLite Bundler Constraint

@tensorflow/tfjs-tflite is a legacy CommonJS/UMD hybrid incompatible with the Vite/esbuild ESM bundler. Both tf.min.js and tf-tflite.min.js are loaded as global scripts in angular.json, not as ES modules. This populates window.tflite and window.tf before Angular bootstraps. Do not import them via import statements.
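For orientation, a global-scripts entry in angular.json typically looks like the fragment below — the bundle paths here are illustrative assumptions, not copied from this repo's config:

```json
{
  "architect": {
    "build": {
      "options": {
        "scripts": [
          "src/assets/tflite/tf.min.js",
          "src/assets/tflite/tf-tflite.min.js"
        ]
      }
    }
  }
}
```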

Required Manual Assets (not installed by npm)

These binary files must be placed manually after npm install:

Path                                           Source
src/assets/models/onnx/best.onnx               YOLOv8 model file
src/assets/models/tflite/best_float32.tflite   Full-precision TFLite model
src/assets/models/tflite/best_float16.tflite   Half-precision TFLite model
src/assets/wasm/                               Copy from node_modules/onnxruntime-web/dist/
src/assets/tflite-wasm/                        Copy 7 files from node_modules/@tensorflow/tfjs-tflite/dist/

Environment

  • environment.nestWsUrl = http://localhost:3000
  • Angular 20, standalone components, Signals-based reactivity

Inference Pipeline (socket engine)

  1. User uploads image or webcam snap → VisionSocketService.sendBase64() → NestJS
  2. NestJS ONNX inference + SQLite write → vision:result with BatchResult payload
  3. AnalyzerComponent appends to BatchResult[], draws bounding boxes on evidence canvas
  4. On session end → FullSessionReport assembled from collected results

Inference Pipeline (browser engines)

  1. User uploads image → ImageProcessorService.processImage() → CHW Float32Array
  2. LocalInferenceService.loadModel(modelPath) → creates ONNX session or loads TFLite model
  3. LocalInferenceService.runInference(input) → raw output tensor
  4. LocalInferenceService.parseDetections(rawData, threshold) → filtered detections with class labels and bounding boxes
  5. AnalyzerComponent draws bounding boxes on canvas
  6. LocalHistoryService.save() → persists to localStorage
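Step 4 above can be sketched for a YOLOv8-style output tensor — YOLOv8 exports typically emit [1, 4 + numClasses, numAnchors] (cx, cy, w, h, then per-class scores), but the actual parseDetections layout and post-processing (e.g. NMS) may differ:

```typescript
// Sketch: decode a channel-major YOLOv8 output into per-anchor
// detections above a confidence threshold. NMS is intentionally
// omitted here for brevity.
const CLASSES = ['Ripe', 'Unripe', 'Underripe', 'Overripe', 'Abnormal', 'Empty_Bunch'];

interface BoxDet { x: number; y: number; w: number; h: number; label: string; score: number; }

function parseDetections(raw: Float32Array, numAnchors: number, threshold: number): BoxDet[] {
  const out: BoxDet[] = [];
  for (let a = 0; a < numAnchors; a++) {
    // Channel-major layout: value(channel c, anchor a) = raw[c * numAnchors + a]
    let best = 0;
    let bestClass = -1;
    for (let c = 0; c < CLASSES.length; c++) {
      const score = raw[(4 + c) * numAnchors + a];
      if (score > best) { best = score; bestClass = c; }
    }
    if (best >= threshold && bestClass >= 0) {
      out.push({
        x: raw[a],                  // cx
        y: raw[numAnchors + a],     // cy
        w: raw[2 * numAnchors + a],
        h: raw[3 * numAnchors + a],
        label: CLASSES[bestClass],
        score: best,
      });
    }
  }
  return out;
}
```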