
# CLAUDE.md

This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.

## What This Is

**PalmOilAI** — an Angular 20 single-page application for palm oil fresh fruit bunch (FFB) ripeness detection. All AI inference runs 100% client-side in the browser using ONNX Runtime Web and TensorFlow.js TFLite. There is no required backend: the NestJS backend is optional and is used only for server-side inference and as a chat proxy.

Detection classes: `Ripe`, `Unripe`, `Underripe`, `Overripe`, `Abnormal`, `Empty_Bunch`.

## Commands

```bash
npm install
ng serve                        # Dev server at http://localhost:4200
ng serve --host 0.0.0.0         # Expose to LAN (device testing)
ng build                        # Production build → dist/
ng test                         # Karma + Jasmine unit tests
```

## Architecture

### Routing

- `/analyzer` → `AnalyzerComponent` (main scanner UI, default route)
- `/history` → `HistoryComponent` (local vault of past scans)
- `/chatbot` → `ChatbotComponent` (chat interface backed by n8n via WebSocket)

### Key Services (`src/app/services/`)

**LocalInferenceService** — Core AI engine. Dispatches to the ONNX or TFLite backend based on the model file extension:

- **ONNX path:** onnxruntime-web with the WASM execution provider. Input: `[1, 3, 640, 640]` (CHW).
- **TFLite path:** uses the globally injected `window.tflite` object. Input is transposed CHW→HWC to `[1, 640, 640, 3]` before prediction.
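The CHW→HWC transpose for the TFLite path can be sketched as a standalone function. This is an illustrative sketch, not the service's actual code; the function name `chwToHwc` is hypothetical:

```typescript
// Transpose a planar CHW tensor (all R values, then all G, then all B)
// into interleaved HWC layout, as the TFLite path requires.
function chwToHwc(chw: Float32Array, height: number, width: number, channels = 3): Float32Array {
  const hwc = new Float32Array(chw.length);
  for (let c = 0; c < channels; c++) {
    for (let h = 0; h < height; h++) {
      for (let w = 0; w < width; w++) {
        // CHW index: c*H*W + h*W + w  →  HWC index: h*W*C + w*C + c
        hwc[h * width * channels + w * channels + c] =
          chw[c * height * width + h * width + w];
      }
    }
  }
  return hwc;
}
```

For the model here, `height` and `width` would both be 640.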

**ImageProcessorService** — Resizes any image to 640×640 via an offscreen Canvas, then converts RGBA→CHW `Float32Array` normalized to [0.0, 1.0].

**LocalHistoryService** — Persists up to 20 scan records (FIFO) to the localStorage key `palm_oil_vault`. Each record includes a detection summary, latency, engine type, Base64 thumbnail, and bounding boxes.
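The FIFO cap can be sketched as a pure helper. The record fields, the newest-first ordering, and the name `appendRecord` are assumptions for illustration, not the service's actual API:

```typescript
// Illustrative subset of a scan record's fields.
interface ScanRecord {
  timestamp: number;
  engine: 'onnx' | 'tflite';
  latencyMs: number;
  thumbnail: string; // Base64
}

const MAX_RECORDS = 20;

// Prepend the newest scan and drop the oldest entries beyond the cap.
// The real service would serialize the resulting array to the
// localStorage key `palm_oil_vault` with JSON.stringify.
function appendRecord(existing: ScanRecord[], record: ScanRecord): ScanRecord[] {
  return [record, ...existing].slice(0, MAX_RECORDS);
}
```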

**VisionSocketService / ChatSocketService** — WebSocket clients connecting to the NestJS backend (the `/vision` namespace and an unspecified chat namespace, respectively).

**SurveillanceService** (frontend) — Connects to the NestJS `/monitor` namespace for live CPU/memory metrics of the NestJS, n8n, and Ollama processes.

## TFLite Bundler Constraint

`@tensorflow/tfjs-tflite` is a legacy CommonJS/UMD hybrid incompatible with the Vite/esbuild ESM bundler. Both `tf.min.js` and `tf-tflite.min.js` are loaded as global scripts in `angular.json`, not as ES modules. This populates `window.tflite` and `window.tf` before Angular bootstraps. Do not attempt to load them via `import` statements.
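A sketch of what the relevant `angular.json` fragment might look like — the project name and script paths here are assumptions; check the repo's actual `angular.json` for the real values:

```json
{
  "projects": {
    "palm-oil-ai": {
      "architect": {
        "build": {
          "options": {
            "scripts": [
              "src/assets/js/tf.min.js",
              "src/assets/js/tf-tflite.min.js"
            ]
          }
        }
      }
    }
  }
}
```

Scripts listed under `scripts` are concatenated and loaded globally, which is why `window.tf` and `window.tflite` exist before any Angular code runs.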

## Required Manual Assets (not installed by npm)

These binary files must be placed manually after npm install:

| Path | Source |
| --- | --- |
| `src/assets/models/onnx/best.onnx` | YOLOv8 model file |
| `src/assets/models/tflite/best_float32.tflite` | Full-precision TFLite model |
| `src/assets/models/tflite/best_float16.tflite` | Half-precision TFLite model |
| `src/assets/wasm/` | Copy from `node_modules/onnxruntime-web/dist/` |
| `src/assets/tflite-wasm/` | Copy 7 files from `node_modules/@tensorflow/tfjs-tflite/dist/` |
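A minimal setup sketch for the WASM assets, assuming a POSIX shell run from the repository root. The model files must come from your own YOLOv8 export and are not covered here, and copying all of `dist/` is a blunt superset of the 7 specific tfjs-tflite files:

```shell
# Run from the repository root after `npm install`.
mkdir -p src/assets/wasm src/assets/tflite-wasm

# ONNX Runtime Web WASM binaries
cp node_modules/onnxruntime-web/dist/*.wasm src/assets/wasm/

# tfjs-tflite runtime files (the app needs 7 of these; copying everything is a safe superset)
cp node_modules/@tensorflow/tfjs-tflite/dist/* src/assets/tflite-wasm/
```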

## Inference Pipeline (orchestrated by AnalyzerComponent)

1. User uploads an image → `ImageProcessorService.processImage()` → CHW `Float32Array`
2. `LocalInferenceService.loadModel(modelPath)` → creates an ONNX session or loads the TFLite model
3. `LocalInferenceService.runInference(input)` → raw output tensor
4. `LocalInferenceService.parseDetections(rawData, threshold)` → filtered detections with class labels and bounding boxes
5. `AnalyzerComponent` draws bounding boxes on a Canvas
6. `LocalHistoryService.save()` → persists to localStorage
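The detection-parsing step (step 4) can be sketched for a YOLOv8-style raw output laid out as `[1, 4 + numClasses, numAnchors]` — four box values (`cx, cy, w, h`) followed by per-class scores. This layout and the function shape are assumptions about the exported model, not the service's actual implementation:

```typescript
const CLASSES = ['Ripe', 'Unripe', 'Underripe', 'Overripe', 'Abnormal', 'Empty_Bunch'];

interface Detection {
  label: string;
  score: number;
  box: [number, number, number, number]; // x1, y1, x2, y2
}

function parseDetections(raw: Float32Array, numAnchors: number, threshold: number): Detection[] {
  const out: Detection[] = [];
  for (let a = 0; a < numAnchors; a++) {
    // Find the best-scoring class for this anchor.
    let best = 0;
    let bestScore = 0;
    for (let c = 0; c < CLASSES.length; c++) {
      const s = raw[(4 + c) * numAnchors + a];
      if (s > bestScore) { bestScore = s; best = c; }
    }
    if (bestScore < threshold) continue;
    // Convert the center-based box (cx, cy, w, h) to corner form (x1, y1, x2, y2).
    const cx = raw[a];
    const cy = raw[numAnchors + a];
    const w = raw[2 * numAnchors + a];
    const h = raw[3 * numAnchors + a];
    out.push({
      label: CLASSES[best],
      score: bestScore,
      box: [cx - w / 2, cy - h / 2, cx + w / 2, cy + h / 2],
    });
  }
  return out; // A real pipeline would also apply non-max suppression here.
}
```

At a 640×640 input, YOLOv8 typically produces 8400 anchors, so `numAnchors` would be 8400 here.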