PalmOilAI Frontend

Version 2.6 Industrial — A fully client-side Angular web application for palm oil fruit bunch ripeness detection, running AI inference 100% in the browser with no backend dependency.


Overview

The PalmOilAI frontend is a standalone Angular application that loads YOLOv8-based detection models directly in the browser using ONNX Runtime Web (for the industrial engine) and TensorFlow.js TFLite (for the standard PoC engine). All inference, image preprocessing, and result history are handled client-side with no network round-trips.

Detection Classes

The models detect six classes, covering ripeness stages and bunch condition:

| Class | Description | Color |
| --- | --- | --- |
| Ripe | Ready for harvest | Green |
| Unripe | Not ready | Olive |
| Underripe | Almost ready | Amber |
| Overripe | Past optimal harvest | Brown |
| Abnormal | Health alert — disease or damage | Red |
| Empty_Bunch | No fruit present | Grey |

Technology Stack

| Category | Technology | Version |
| --- | --- | --- |
| Framework | Angular | ^20.0.0 |
| Bundler | Vite/esbuild (via @angular/build) | ^20.0.5 |
| ONNX Inference | onnxruntime-web | ^1.24.3 |
| TFLite Inference | @tensorflow/tfjs-tflite | ^0.0.1-alpha.10 |
| TF Core | @tensorflow/tfjs | ^4.22.0 |
| Styling | SCSS | |
| Language | TypeScript | ~5.8.2 |

Project Structure

src/
├── app/
│   ├── components/
│   │   ├── analyzer/       # Main scanner/inference UI
│   │   ├── header/         # Navigation & theme toggle
│   │   ├── history/        # Inference history (Vault)
│   │   └── settings/       # Confidence threshold config (backend)
│   ├── services/
│   │   ├── local-inference.service.ts   # Core AI engine (ONNX + TFLite)
│   │   ├── image-processor.service.ts   # Image resize + CHW preprocessing
│   │   ├── local-history.service.ts     # LocalStorage-based result vault
│   │   ├── theme.service.ts             # Dark/light mode persistence
│   │   ├── api.service.ts               # Backend API client (optional)
│   │   └── tflite.d.ts                  # TypeScript declarations for TFLite globals
│   ├── app.routes.ts       # Client-side routing
│   └── app.ts              # Root application component
└── assets/
    ├── models/
    │   ├── onnx/best.onnx              # YOLOv8 Industrial model (~9.4 MB)
    │   └── tflite/
    │       ├── best_float32.tflite    # Standard PoC model, full precision (~9.4 MB)
    │       └── best_float16.tflite    # Reduced precision variant (~4.8 MB)
    ├── wasm/                           # ONNX Runtime WASM binaries
    └── tflite-wasm/                    # TFLite WASM runtime glue files

Application Routes

| Path | Component | Description |
| --- | --- | --- |
| / | (redirect to /analyzer) | Redirect to Scanner |
| /analyzer | AnalyzerComponent | Main image upload + inference UI |
| /history | HistoryComponent | Saved inference "Vault" records |
| /settings | SettingsComponent | Backend confidence threshold (requires backend) |

Key Services

LocalInferenceService

The core AI engine. Selects between ONNX and TFLite backends based on the model file extension.

  • ONNX path (best.onnx): Uses onnxruntime-web with WASM execution provider. Input tensor shape: [1, 3, 640, 640] (CHW).
  • TFLite path (best_float32.tflite): Accesses the globally-loaded tflite object (injected via angular.json scripts). Input is transposed from CHW to HWC ([1, 640, 640, 3]) before prediction.
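
The CHW → HWC transpose in the TFLite path can be sketched as a pure function (a simplified sketch; the actual service may implement it differently):

```typescript
// Sketch: convert a CHW-ordered Float32Array to HWC order.
// For the models above, channels = 3 and height = width = 640.
function transposeCHWtoHWC(chw: Float32Array, channels: number, height: number, width: number): Float32Array {
  const hwc = new Float32Array(chw.length);
  for (let c = 0; c < channels; c++) {
    for (let h = 0; h < height; h++) {
      for (let w = 0; w < width; w++) {
        // chw index: c*H*W + h*W + w  ->  hwc index: (h*W + w)*C + c
        hwc[(h * width + w) * channels + c] = chw[c * height * width + h * width + w];
      }
    }
  }
  return hwc;
}
```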

Bundler Note: Because @tensorflow/tfjs-tflite is a legacy CommonJS/UMD hybrid that is incompatible with the modern Vite/esbuild ESM bundler, both TF and TFLite are loaded as global scripts in angular.json. This is intentional: it ensures window.tf and window.tflite are populated before Angular bootstraps, avoiding ESM module-resolution failures at build time.
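
Assuming default package layouts, the corresponding angular.json entries would look roughly like this (exact paths may differ in the repository):

```json
"scripts": [
  "node_modules/@tensorflow/tfjs/dist/tf.min.js",
  "node_modules/@tensorflow/tfjs-tflite/dist/tf-tflite.min.js"
]
```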

ImageProcessorService

Resizes any input image to 640×640 using an offscreen Canvas and converts the pixel data from RGBA to a normalized Float32Array in CHW format ([1, 3, 640, 640]).
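
A simplified sketch of the RGBA → CHW step (the canvas resize is omitted, and the real service's normalization may differ):

```typescript
// Sketch: flatten RGBA pixel data into a normalized, channel-planar (CHW) Float32Array.
// Output layout: [all R values, all G values, all B values]; alpha is discarded.
function rgbaToCHW(rgba: Uint8ClampedArray, width: number, height: number): Float32Array {
  const plane = width * height;
  const chw = new Float32Array(3 * plane);
  for (let i = 0; i < plane; i++) {
    chw[i] = rgba[i * 4] / 255;                 // R plane
    chw[plane + i] = rgba[i * 4 + 1] / 255;     // G plane
    chw[2 * plane + i] = rgba[i * 4 + 2] / 255; // B plane
  }
  return chw;
}
```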

LocalHistoryService

Persists inference results to localStorage under the key palm_oil_vault. Stores up to 20 records (FIFO), each containing the detection summary, inference latency, engine type, thumbnail image data, and bounding box coordinates.
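
The FIFO behaviour can be sketched as follows (record fields and exact eviction order are assumptions; a minimal key-value interface stands in for localStorage):

```typescript
// Sketch: prepend the newest record and cap the vault at 20 entries.
interface VaultRecord { engine: string; latencyMs: number; summary: string; }
interface KVStore { getItem(key: string): string | null; setItem(key: string, value: string): void; }

const VAULT_KEY = 'palm_oil_vault';
const MAX_RECORDS = 20;

function saveRecord(store: KVStore, record: VaultRecord): VaultRecord[] {
  const existing: VaultRecord[] = JSON.parse(store.getItem(VAULT_KEY) ?? '[]');
  const updated = [record, ...existing].slice(0, MAX_RECORDS); // newest first, oldest evicted
  store.setItem(VAULT_KEY, JSON.stringify(updated));
  return updated;
}
```

In the browser, localStorage satisfies the KVStore interface directly.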

ThemeService

Manages dark/light mode by toggling theme-dark / theme-light CSS classes on <body>. Persists the user's preference to localStorage under palm-ai-theme.
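
A minimal sketch of that toggle logic, with the body element and storage passed in so the function stays testable (in the app itself these would be document.body and localStorage):

```typescript
// Sketch: apply a theme by swapping body classes and persisting the choice.
type Theme = 'dark' | 'light';
const THEME_KEY = 'palm-ai-theme';

interface BodyLike { classList: { add(...cls: string[]): void; remove(...cls: string[]): void } }
interface StoreLike { setItem(key: string, value: string): void }

function applyTheme(theme: Theme, body: BodyLike, store: StoreLike): string {
  body.classList.remove('theme-dark', 'theme-light'); // clear any previous theme class
  const cls = theme === 'dark' ? 'theme-dark' : 'theme-light';
  body.classList.add(cls);
  store.setItem(THEME_KEY, theme); // persist the user's preference
  return cls;
}
```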


Required Asset Files

The following binary files must be manually placed — they are not installed by npm install:

ONNX Models (src/assets/models/onnx/)

| File | Description |
| --- | --- |
| best.onnx | YOLOv8 industrial detection model |

TFLite Models (src/assets/models/tflite/)

| File | Description |
| --- | --- |
| best_float32.tflite | Full-precision TFLite model |
| best_float16.tflite | Half-precision TFLite model (smaller, faster) |

ONNX WASM Runtime (src/assets/wasm/)

Copy from node_modules/onnxruntime-web/dist/ after npm install.

TFLite WASM Runtime (src/assets/tflite-wasm/)

Copy from node_modules/@tensorflow/tfjs-tflite/dist/ after npm install. Required files:

| File | Purpose |
| --- | --- |
| tflite_web_api_cc.js | TFLite glue (non-SIMD) |
| tflite_web_api_cc.wasm | TFLite engine (non-SIMD) |
| tflite_web_api_cc_simd.js | TFLite glue (SIMD-accelerated) |
| tflite_web_api_cc_simd.wasm | TFLite engine (SIMD-accelerated) |
| tflite_web_api_cc_simd_threaded.js | TFLite glue (SIMD + multi-threaded) |
| tflite_web_api_cc_simd_threaded.wasm | TFLite engine (SIMD + multi-threaded) |
| tflite_web_api_client.js | TFLite high-level client API |
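
Assuming default npm package layouts, the copy steps can be scripted roughly as follows (verify the exact dist contents for your installed versions):

```shell
# Sketch: stage the WASM runtimes into src/assets after npm install
mkdir -p src/assets/wasm src/assets/tflite-wasm
cp node_modules/onnxruntime-web/dist/*.wasm src/assets/wasm/
cp node_modules/@tensorflow/tfjs-tflite/dist/tflite_web_api_* src/assets/tflite-wasm/
```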

Getting Started

Prerequisites

  • Node.js ≥ 18
  • Angular CLI (npm install -g @angular/cli)

Installation

npm install

Development Server

# Standard local serve
ng serve

# Serve on all network interfaces (for device testing)
ng serve --host 0.0.0.0 --port 4200

Open your browser and navigate to http://localhost:4200/. The app redirects to /analyzer by default.

Build for Production

ng build

Build artifacts are placed in the dist/ directory.

Important: The production build will warn about initial bundle size due to the TF.js global scripts (~4MB). This is expected — both tf.min.js and tf-tflite.min.js are loaded as external scripts, not tree-shakeable modules.


Inference Pipeline

The AnalyzerComponent orchestrates the following pipeline when Run Inference is clicked:

User uploads image
       │
       ▼
ImageProcessorService.processImage()
  └─ Resize to 640×640 via Canvas
  └─ Convert RGBA → CHW Float32Array ([1, 3, 640, 640])
       │
       ▼
LocalInferenceService.loadModel(modelPath)
  ├─ ONNX: Set WASM path → Create InferenceSession
  └─ TFLite: setWasmPath → loadTFLiteModel
       │
       ▼
LocalInferenceService.runInference(input)
  ├─ ONNX: Create tensor → session.run() → extract output
  └─ TFLite: Transpose CHW→HWC → model.predict() → extract data
       │
       ▼
LocalInferenceService.parseDetections(rawData, threshold)
  └─ Filter by confidence → map to class labels + bounding boxes
       │
       ▼
AnalyzerComponent draws bounding boxes on Canvas
LocalHistoryService saves result to localStorage
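
The confidence-filtering step at the end of this pipeline can be sketched like this (the row layout and class-index order are assumptions based on typical YOLOv8 exports; non-max suppression is omitted):

```typescript
// Sketch: keep detections whose best class score clears the threshold.
// Each row is assumed to be [cx, cy, w, h, score_0 .. score_5].
const CLASS_LABELS = ['Ripe', 'Unripe', 'Underripe', 'Overripe', 'Abnormal', 'Empty_Bunch'];

interface Detection { label: string; confidence: number; box: [number, number, number, number]; }

function parseDetections(rows: number[][], threshold: number): Detection[] {
  const detections: Detection[] = [];
  for (const row of rows) {
    const scores = row.slice(4);
    const best = scores.indexOf(Math.max(...scores)); // highest-scoring class
    if (scores[best] >= threshold) {
      detections.push({
        label: CLASS_LABELS[best],
        confidence: scores[best],
        box: [row[0], row[1], row[2], row[3]], // centre-x, centre-y, width, height
      });
    }
  }
  return detections;
}
```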

Running Tests

ng test

Uses Karma + Jasmine. No end-to-end test framework is configured by default.


Additional Resources