This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
PalmOilAI is an Angular 20 single-page application for palm oil fresh fruit bunch (FFB) ripeness detection. It supports three inference engines: `tflite` and `onnx` run entirely client-side in the browser via WASM, while `socket` streams frames to the NestJS backend for server-side inference. The NestJS backend is required for the socket engine, the chat proxy, and remote history.
Detection classes: Ripe, Unripe, Underripe, Overripe, Abnormal, Empty_Bunch.
```bash
npm install
ng serve                  # Dev server at http://localhost:4200
ng serve --host 0.0.0.0   # Expose to LAN (device testing)
ng build                  # Production build → dist/
ng test                   # Karma + Jasmine unit tests
```
- `/analyzer` → `AnalyzerComponent` (main scanner UI, default route)
- `/history` → `HistoryComponent` (local + remote vault of past scans)
- `/chatbot` → `ChatbotComponent` (chat interface backed by n8n via WebSocket)
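The route table above can be sketched as plain objects. This is an illustrative shape only — stub classes stand in for the real components, and the actual `app.routes.ts` would import Angular's `Routes` type and the real components; the default-route redirect is an assumption based on `/analyzer` being the default route.

```typescript
// Stub components standing in for the real ones (illustrative only).
class AnalyzerComponent {}
class HistoryComponent {}
class ChatbotComponent {}

// Plain-object route table mirroring the shape of Angular's `Routes`.
type RouteDef = {
  path: string;
  component?: unknown;
  redirectTo?: string;
  pathMatch?: 'full';
};

const routes: RouteDef[] = [
  { path: '', redirectTo: 'analyzer', pathMatch: 'full' }, // assumed redirect
  { path: 'analyzer', component: AnalyzerComponent },
  { path: 'history', component: HistoryComponent },
  { path: 'chatbot', component: ChatbotComponent },
];
```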
### Components (`src/app/components/`)

**AnalyzerComponent** — Three-engine scan hub:
- Engines: `tflite` (browser WASM), `onnx` (browser WASM), `socket` (NestJS remote)
- Socket flow: `vision:analyze` → canvas draw
- Builds `BatchResult[]` and a `FullSessionReport` (timing, success/fail counts, raw tensor sample)

**HistoryComponent** — Dual-tab vault:
- Local tab: localStorage via `LocalHistoryService`, up to 20 records
- Remote tab: `RemoteInferenceService`, grouped by `batch_id` into `BatchSessionGroup` with aggregate summaries

**ChatbotComponent** — n8n RAG chat:
- Uses `ChatSocketService.send()` (promise-based one-shot listener)

**PerformanceHudComponent** — Floating, draggable, collapsible overlay (CDK drag-drop):
- Fed by `SurveillanceService`

**HeaderComponent** — App header with dark/light theme toggle via `ThemeService`.

**BottomNavComponent** — Mobile fixed tab bar (hidden on desktop). Three tabs: Scanner | Intelligence | Vault.
### Services (`src/app/services/` and `src/app/core/services/`)

**LocalInferenceService** — Core browser AI engine. Dispatches to the ONNX or TFLite backend based on model file extension:
- ONNX: `onnxruntime-web` with the WASM execution provider. Input: `[1, 3, 640, 640]` (CHW).
- TFLite: global `window.tflite` object. Input is transposed CHW→HWC to `[1, 640, 640, 3]` before prediction.

**ImageProcessorService** — Resizes any image to 640×640 via offscreen Canvas, then converts RGBA→CHW `Float32Array` normalized to `[0.0, 1.0]`.
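The RGBA→CHW normalization step can be sketched as a pure function. This is a minimal illustration, assuming the 640×640 canvas resize has already happened; the function name is hypothetical, not the service's real API.

```typescript
// Convert interleaved RGBA pixel data (as read from a canvas) into a
// planar CHW Float32Array normalized to [0, 1]. Alpha is dropped.
function rgbaToChw(
  rgba: Uint8ClampedArray,
  width: number,
  height: number,
): Float32Array {
  const plane = width * height;
  const out = new Float32Array(3 * plane); // [C, H, W] with C = R, G, B
  for (let i = 0; i < plane; i++) {
    out[i] = rgba[i * 4] / 255;                 // R plane
    out[plane + i] = rgba[i * 4 + 1] / 255;     // G plane
    out[2 * plane + i] = rgba[i * 4 + 2] / 255; // B plane
    // rgba[i * 4 + 3] (alpha) is discarded
  }
  return out;
}
```

For the real 640×640 input this produces the `[1, 3, 640, 640]` tensor data the ONNX path expects (the leading batch dimension is implicit in the flat array).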
**LocalHistoryService** — Persists up to 20 scan records (FIFO) to the localStorage key `palm_oil_vault`. Each record includes detection summary, latency, engine type, Base64 thumbnail, and bounding boxes.
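The FIFO cap can be sketched as follows. A `Map` stands in for `window.localStorage` so the logic runs outside the browser; the record fields and function name are illustrative, not the service's actual interface.

```typescript
// Illustrative record shape — the real one carries detections, a
// thumbnail, latency, etc.
interface ScanRecord {
  timestamp: number;
  engine: string;
  summary: string;
}

const VAULT_KEY = 'palm_oil_vault';
const MAX_RECORDS = 20;

// `storage` is a stand-in for window.localStorage (same get/set shape).
function saveRecord(storage: Map<string, string>, record: ScanRecord): void {
  const raw = storage.get(VAULT_KEY);
  const records: ScanRecord[] = raw ? JSON.parse(raw) : [];
  records.unshift(record); // newest first
  if (records.length > MAX_RECORDS) {
    records.length = MAX_RECORDS; // drop the oldest beyond the cap
  }
  storage.set(VAULT_KEY, JSON.stringify(records));
}
```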
**InferenceService** (`src/app/core/services/`) — Hub dispatcher using Angular Signals:
- Signals: `mode` (`local` | `api`), `localEngine` (`onnx` | `tflite`), `detections`, `summary`
- `analyze(base64, w, h)` dispatches to local (`LocalInferenceService`) or remote (`RemoteInferenceService`)
- Local path: `LocalInferenceService` → parse detections; remote path: `RemoteInferenceService.analyze()` → map detections

**RemoteInferenceService** (`src/app/core/services/`) — HTTP client for NestJS:
- `analyze(blob)` → `POST /analyze` → `AnalysisResponse`
- `getHistory()`, `deleteRecord(archiveId)`, `clearAll()`
- Base URL: `http://localhost:3000`

**VisionSocketService** — Socket.io client for the NestJS `/vision` namespace:
- `snapAndSend(videoEl, batchId?)` — captures a webcam frame and emits it as raw Base64
- `sendBase64(dataUrl, batchId?)` — emits a gallery image as raw Base64
- Signals: `connected`, `analyzing`, `lastResult`, `lastError`

**ChatSocketService** — Socket.io client sharing the NestJS `/vision` namespace for chat:
- `send(message): Promise<ChatResult>` — one-shot listener; emits `chat:send`, resolves on `chat:result` or rejects on `chat:error`
- Reads the reply from whichever field is present (`output` | `answer` | `response` | `text`)

**SurveillanceService** — Socket.io monitoring client for the NestJS `/monitor` namespace:
- Emits `monitor:subscribe` on connect; receives `monitor:data` (500ms ticks) and `monitor:status`
- Signals: `metrics`, `connected`, `nestStatus` (computed), `n8nStatus` (`checking` | `ready` | `not-ready`)
- `formatBytes(bytes)` helper for memory display

**ThemeService** — Dark/light theme toggle:
- Applies a `theme-dark` / `theme-light` class on `document.body` via `Renderer2`
- Persists the choice to localStorage (`palm-ai-theme`)
- Exposes a `currentTheme$` observable and an `isDark()` boolean

### Interfaces (`src/app/core/interfaces/`)

**BatchResult** — Single scan audit record:
- `image_id`, `timestamp`, `status` (`ok` | `error`)
- `detections[]`: `bunch_id`, `ripeness_class`, `confidence_pct`, `is_health_alert`, `bounding_box`, `norm_box`
- `performance`: `inference_ms`, `processing_ms`, `round_trip_ms`
- `technical_evidence`: `engine`, `archive_id`, `total_count`, `threshold`, `industrial_summary`, `raw_tensor_sample`
- `localBlobUrl?`, `error?`

**FullSessionReport** — Batch run summary:
- `session_id` (UUID), `generated_at`
- `meta`: `total_images`, `successful`, `failed`, `total_time_ms`, `avg_inference_ms`, `avg_round_trip_ms`
- `results`: `BatchResult[]`

`@tensorflow/tfjs-tflite` is a legacy CommonJS/UMD hybrid incompatible with the Vite/esbuild ESM bundler. Both `tf.min.js` and `tf-tflite.min.js` are loaded as global scripts in `angular.json`, not as ES modules. This populates `window.tflite` and `window.tf` before Angular bootstraps. Do not import them via `import` statements.
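Because the global `window.tflite` model consumes HWC input while the preprocessed tensor is CHW, a transpose is needed before prediction (`[1, 3, 640, 640]` → `[1, 640, 640, 3]`). A minimal sketch of that re-ordering — the function name is illustrative, not the service's real API:

```typescript
// Transpose a planar CHW Float32Array (R plane, G plane, B plane)
// into interleaved HWC order (R, G, B per pixel).
function chwToHwc(chw: Float32Array, height: number, width: number): Float32Array {
  const plane = height * width;
  const out = new Float32Array(chw.length);
  for (let i = 0; i < plane; i++) {
    out[i * 3] = chw[i];                 // R
    out[i * 3 + 1] = chw[plane + i];     // G
    out[i * 3 + 2] = chw[2 * plane + i]; // B
  }
  return out;
}
```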
These binary files must be placed manually after `npm install`:

| Path | Source |
|---|---|
| `src/assets/models/onnx/best.onnx` | YOLOv8 model file |
| `src/assets/models/tflite/best_float32.tflite` | Full-precision TFLite model |
| `src/assets/models/tflite/best_float16.tflite` | Half-precision TFLite model |
| `src/assets/wasm/` | Copy from `node_modules/onnxruntime-web/dist/` |
| `src/assets/tflite-wasm/` | Copy 7 files from `node_modules/@tensorflow/tfjs-tflite/dist/` |
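One possible way to stage the WASM assets from the table above — a sketch only: the exact file names and counts vary by package version, so verify the copied sets against what each `dist/` directory actually contains.

```shell
# Stage runtime WASM assets after `npm install` (globs are illustrative;
# check node_modules/.../dist/ for the exact files your versions ship).
mkdir -p src/assets/wasm src/assets/tflite-wasm
cp node_modules/onnxruntime-web/dist/*.wasm src/assets/wasm/
cp node_modules/@tensorflow/tfjs-tflite/dist/*.js \
   node_modules/@tensorflow/tfjs-tflite/dist/*.wasm \
   src/assets/tflite-wasm/
```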
**Socket engine flow** (`environment.nestWsUrl = http://localhost:3000`):
1. `VisionSocketService.sendBase64()` → NestJS
2. `vision:result` arrives with a `BatchResult` payload
3. `AnalyzerComponent` appends to `BatchResult[]` and draws bounding boxes on the evidence canvas
4. `FullSessionReport` assembled from the collected results

**Local engine flow**:
1. `ImageProcessorService.processImage()` → CHW `Float32Array`
2. `LocalInferenceService.loadModel(modelPath)` → creates ONNX session or loads TFLite model
3. `LocalInferenceService.runInference(input)` → raw output tensor
4. `LocalInferenceService.parseDetections(rawData, threshold)` → filtered detections with class labels and bounding boxes
5. `AnalyzerComponent` draws bounding boxes on canvas
6. `LocalHistoryService.save()` → persists to localStorage
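The detection-parsing step of the local flow can be sketched for a YOLOv8-style raw output. This assumes the common YOLOv8 layout of `[1, 4 + numClasses, numAnchors]` (cx, cy, w, h, then per-class scores) — the repository's actual tensor layout and function signature may differ, and real post-processing would also apply non-maximum suppression, omitted here.

```typescript
// The six detection classes from the project overview.
const CLASSES = ['Ripe', 'Unripe', 'Underripe', 'Overripe', 'Abnormal', 'Empty_Bunch'];

interface Detection {
  label: string;
  confidence: number;
  box: [number, number, number, number]; // [x, y, w, h], top-left origin
}

// Filter raw YOLOv8-style output by confidence threshold, taking the
// argmax class per anchor. Channel c for anchor a lives at
// raw[c * numAnchors + a].
function parseDetections(
  raw: Float32Array,
  numAnchors: number,
  threshold: number,
): Detection[] {
  const out: Detection[] = [];
  for (let a = 0; a < numAnchors; a++) {
    let best = 0;
    let bestScore = 0;
    for (let c = 0; c < CLASSES.length; c++) {
      const s = raw[(4 + c) * numAnchors + a];
      if (s > bestScore) { bestScore = s; best = c; }
    }
    if (bestScore < threshold) continue;
    const cx = raw[a];
    const cy = raw[numAnchors + a];
    const w = raw[2 * numAnchors + a];
    const h = raw[3 * numAnchors + a];
    // Convert center-based box to top-left origin.
    out.push({ label: CLASSES[best], confidence: bestScore, box: [cx - w / 2, cy - h / 2, w, h] });
  }
  return out;
}
```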