This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
PalmOilAI — an Angular 20 single-page application for palm oil fresh fruit bunch (FFB) ripeness detection. All AI inference runs 100% client-side in the browser using ONNX Runtime Web and TensorFlow.js TFLite. No backend is required: the optional NestJS server is used only for server-side inference and as a chat proxy.
Detection classes: Ripe, Unripe, Underripe, Overripe, Abnormal, Empty_Bunch.
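The six labels above can be captured as a string-literal union. A minimal sketch (the type and constant names are illustrative, not taken from the codebase):

```typescript
// The six detection classes as a string-literal union (names illustrative).
type RipenessClass =
  | 'Ripe' | 'Unripe' | 'Underripe'
  | 'Overripe' | 'Abnormal' | 'Empty_Bunch';

// Ordered label list, e.g. for mapping class indices to names.
const CLASS_LABELS: RipenessClass[] = [
  'Ripe', 'Unripe', 'Underripe', 'Overripe', 'Abnormal', 'Empty_Bunch',
];
```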
```bash
npm install
ng serve                 # Dev server at http://localhost:4200
ng serve --host 0.0.0.0  # Expose to LAN (device testing)
ng build                 # Production build → dist/
ng test                  # Karma + Jasmine unit tests
```
- `/analyzer` → `AnalyzerComponent` (main scanner UI, default route)
- `/history` → `HistoryComponent` (local vault of past scans)
- `/chatbot` → `ChatbotComponent` (chat interface backed by n8n via WebSocket)
Services live in `src/app/services/`:

- **LocalInferenceService** — Core AI engine. Dispatches to the ONNX or TFLite backend based on the model file extension:
  - ONNX: `onnxruntime-web` with the WASM execution provider. Input: `[1, 3, 640, 640]` (CHW).
  - TFLite: the global `window.tflite` object. Input is transposed CHW→HWC to `[1, 640, 640, 3]` before prediction.
- **ImageProcessorService** — Resizes any image to 640×640 via an offscreen Canvas, then converts RGBA→CHW `Float32Array` normalized to `[0.0, 1.0]`.
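The two layout conversions above (Canvas RGBA → normalized CHW, and CHW → HWC for TFLite) can be sketched as plain loops. Function names here are illustrative, not the services' real API:

```typescript
// RGBA (Canvas ImageData) → planar CHW Float32Array normalized to [0, 1].
function rgbaToChw(rgba: Uint8ClampedArray, w: number, h: number): Float32Array {
  const plane = w * h;
  const chw = new Float32Array(3 * plane);
  for (let i = 0; i < plane; i++) {
    chw[i]             = rgba[i * 4]     / 255; // R plane
    chw[plane + i]     = rgba[i * 4 + 1] / 255; // G plane
    chw[2 * plane + i] = rgba[i * 4 + 2] / 255; // B plane (alpha dropped)
  }
  return chw;
}

// CHW → interleaved HWC, as required before handing the tensor to TFLite.
function chwToHwc(chw: Float32Array, w: number, h: number): Float32Array {
  const plane = w * h;
  const hwc = new Float32Array(3 * plane);
  for (let i = 0; i < plane; i++) {
    for (let c = 0; c < 3; c++) {
      hwc[i * 3 + c] = chw[c * plane + i];
    }
  }
  return hwc;
}
```

For the real pipeline, `w` and `h` are both 640, matching the model input shapes quoted above.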
- **LocalHistoryService** — Persists up to 20 scan records (FIFO) to the localStorage key `palm_oil_vault`. Each record includes the detection summary, latency, engine type, Base64 thumbnail, and bounding boxes.
- **VisionSocketService / ChatSocketService** — WebSocket clients connecting to the NestJS backend (`/vision` and an unspecified chat namespace, respectively).
- **SurveillanceService** (frontend) — Connects to the NestJS `/monitor` namespace for live CPU/memory metrics of the NestJS, n8n, and Ollama processes.
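The 20-record FIFO cap can be sketched as a pure function. The record shape and function name below are assumptions, not the service's real API:

```typescript
// Assumed (simplified) shape of one scan record.
interface ScanRecord {
  timestamp: number;
  engine: 'onnx' | 'tflite';
  latencyMs: number;
  thumbnail: string; // Base64
}

const MAX_RECORDS = 20;

// Prepend the newest record; drop the oldest once the cap is exceeded.
function pushRecord(vault: ScanRecord[], record: ScanRecord): ScanRecord[] {
  return [record, ...vault].slice(0, MAX_RECORDS);
}
```

In the browser, the resulting array would then be serialized with `JSON.stringify` under the `palm_oil_vault` localStorage key.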
@tensorflow/tfjs-tflite is a legacy CommonJS/UMD hybrid incompatible with the Vite/esbuild ESM bundler. Both tf.min.js and tf-tflite.min.js are loaded as global scripts in angular.json, not as ES modules. This populates window.tflite and window.tf before Angular bootstraps. Do not attempt to import them via import statements.
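A hedged sketch of how such global scripts are registered in `angular.json` (the project name and asset paths here are assumptions; only the two script filenames come from this document):

```json
{
  "projects": {
    "palm-oil-ai": {
      "architect": {
        "build": {
          "options": {
            "scripts": [
              "src/assets/js/tf.min.js",
              "src/assets/js/tf-tflite.min.js"
            ]
          }
        }
      }
    }
  }
}
```

Scripts listed this way are concatenated and loaded as classic globals before the application bundle, which is what makes `window.tf` and `window.tflite` available at bootstrap.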
These binary files must be placed manually after `npm install`:

| Path | Source |
|---|---|
| `src/assets/models/onnx/best.onnx` | YOLOv8 model file |
| `src/assets/models/tflite/best_float32.tflite` | Full-precision TFLite model |
| `src/assets/models/tflite/best_float16.tflite` | Half-precision TFLite model |
| `src/assets/wasm/` | Copy from `node_modules/onnxruntime-web/dist/` |
| `src/assets/tflite-wasm/` | Copy 7 files from `node_modules/@tensorflow/tfjs-tflite/dist/` |
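The two copy steps in the table might look like the following (glob patterns and file names are assumptions; check the `dist/` contents of your installed package versions):

```shell
# Create the target asset directories if they do not exist yet.
mkdir -p src/assets/wasm src/assets/tflite-wasm
# ONNX Runtime Web WASM binaries (exact file names vary by version).
cp node_modules/onnxruntime-web/dist/*.wasm src/assets/wasm/
# TFLite web API files (the 7 files mentioned above; pattern is an assumption).
cp node_modules/@tensorflow/tfjs-tflite/dist/tflite_web_api_* src/assets/tflite-wasm/
```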
1. `ImageProcessorService.processImage()` → CHW `Float32Array`
2. `LocalInferenceService.loadModel(modelPath)` → creates an ONNX session or loads the TFLite model
3. `LocalInferenceService.runInference(input)` → raw output tensor
4. `LocalInferenceService.parseDetections(rawData, threshold)` → filtered detections with class labels and bounding boxes
5. `AnalyzerComponent` draws bounding boxes on the Canvas
6. `LocalHistoryService.save()` → persists to localStorage
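The `parseDetections` stage can be sketched as threshold filtering over a YOLOv8-style output tensor. This assumes the common `[1, 4 + numClasses, numAnchors]` layout (cx, cy, w, h, then per-class scores) and illustrative names; the real implementation may differ and would typically apply non-maximum suppression afterwards:

```typescript
const CLASSES = ['Ripe', 'Unripe', 'Underripe', 'Overripe', 'Abnormal', 'Empty_Bunch'];

interface Detection {
  label: string;
  score: number;
  box: [number, number, number, number]; // [x, y, w, h], top-left origin
}

// Scan every anchor column, keep the best class score if it clears threshold.
function parseDetections(raw: Float32Array, numAnchors: number, threshold: number): Detection[] {
  const stride = numAnchors; // one row per channel in the flattened tensor
  const out: Detection[] = [];
  for (let a = 0; a < numAnchors; a++) {
    let best = 0, bestClass = -1;
    for (let c = 0; c < CLASSES.length; c++) {
      const s = raw[(4 + c) * stride + a];
      if (s > best) { best = s; bestClass = c; }
    }
    if (bestClass >= 0 && best >= threshold) {
      const cx = raw[a], cy = raw[stride + a];
      const w = raw[2 * stride + a], h = raw[3 * stride + a];
      out.push({ label: CLASSES[bestClass], score: best, box: [cx - w / 2, cy - h / 2, w, h] });
    }
  }
  return out;
}
```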