# PalmOilAI Frontend

**Version 2.6 Industrial** — A fully client-side Angular web application for palm oil fruit bunch ripeness detection, running AI inference 100% in the browser with no backend dependency.

---

## Overview

The PalmOilAI frontend is a standalone Angular application that loads YOLOv8-based detection models directly in the browser using **ONNX Runtime Web** (for the industrial engine) and **TensorFlow.js TFLite** (for the standard PoC engine). All inference, image preprocessing, and result history are handled client-side with no network round-trips.

### Detection Classes

The models detect six ripeness categories:

| Class | Description | Color |
|---|---|---|
| `Ripe` | Ready for harvest | Green |
| `Unripe` | Not ready | Olive |
| `Underripe` | Almost ready | Amber |
| `Overripe` | Past optimal harvest | Brown |
| `Abnormal` | Health alert — disease or damage | Red |
| `Empty_Bunch` | No fruit present | Grey |
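In the UI, these classes drive the bounding-box colors. A minimal sketch of how the mapping might be represented in code (the constant name and exact hex values are illustrative assumptions, not the project's actual source):

```typescript
// Class-to-color lookup used when drawing detection boxes.
// Hex values are illustrative stand-ins for the colors named above.
const CLASS_COLORS: Record<string, string> = {
  Ripe: '#2e7d32',        // green: ready for harvest
  Unripe: '#808000',      // olive: not ready
  Underripe: '#ffb300',   // amber: almost ready
  Overripe: '#795548',    // brown: past optimal harvest
  Abnormal: '#d32f2f',    // red: disease or damage
  Empty_Bunch: '#9e9e9e', // grey: no fruit present
};

// Fall back to a neutral color for unknown labels.
function colorFor(label: string): string {
  return CLASS_COLORS[label] ?? '#9e9e9e';
}
```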

---

## Technology Stack

| Category | Technology | Version |
|---|---|---|
| Framework | Angular | ^20.0.0 |
| Bundler | Vite/esbuild (via `@angular/build`) | ^20.0.5 |
| ONNX Inference | `onnxruntime-web` | ^1.24.3 |
| TFLite Inference | `@tensorflow/tfjs-tflite` | ^0.0.1-alpha.10 |
| TF Core | `@tensorflow/tfjs` | ^4.22.0 |
| Styling | SCSS | — |
| Language | TypeScript | ~5.8.2 |

---

## Project Structure

```
src/
├── app/
│   ├── components/
│   │   ├── analyzer/                      # Main scanner/inference UI
│   │   ├── header/                        # Navigation & theme toggle
│   │   ├── history/                       # Inference history (Vault)
│   │   └── settings/                      # Confidence threshold config (backend)
│   ├── services/
│   │   ├── local-inference.service.ts     # Core AI engine (ONNX + TFLite)
│   │   ├── image-processor.service.ts     # Image resize + CHW preprocessing
│   │   ├── local-history.service.ts       # LocalStorage-based result vault
│   │   ├── theme.service.ts               # Dark/light mode persistence
│   │   ├── api.service.ts                 # Backend API client (optional)
│   │   └── tflite.d.ts                    # TypeScript declarations for TFLite globals
│   ├── app.routes.ts                      # Client-side routing
│   └── app.ts                             # Root application component
└── assets/
    ├── models/
    │   ├── onnx/best.onnx                 # YOLOv8 Industrial model (~9.4 MB)
    │   └── tflite/
    │       ├── best_float32.tflite        # Standard PoC model, full precision (~9.4 MB)
    │       └── best_float16.tflite        # Reduced precision variant (~4.8 MB)
    ├── wasm/                              # ONNX Runtime WASM binaries
    └── tflite-wasm/                       # TFLite WASM runtime glue files
```

---

## Application Routes

| Path | Component | Description |
|---|---|---|
| `/` | → `/analyzer` | Redirect to Scanner |
| `/analyzer` | `AnalyzerComponent` | Main image upload + inference UI |
| `/history` | `HistoryComponent` | Saved inference "Vault" records |
| `/settings` | `SettingsComponent` | Backend confidence threshold (requires backend) |
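The table above maps onto `app.routes.ts`. A hypothetical sketch of its shape (the real file imports `Routes` from `@angular/router` and references the component classes directly; a minimal structural type and string names stand in for them here):

```typescript
// Structural stand-in for Angular's Route type, for illustration only.
type Route = {
  path: string;
  redirectTo?: string;
  pathMatch?: 'full';
  component?: string; // the real file references component classes, not strings
};

const routes: Route[] = [
  { path: '', redirectTo: 'analyzer', pathMatch: 'full' }, // `/` redirects to Scanner
  { path: 'analyzer', component: 'AnalyzerComponent' },    // image upload + inference UI
  { path: 'history', component: 'HistoryComponent' },      // saved "Vault" records
  { path: 'settings', component: 'SettingsComponent' },    // backend threshold config
];

// Tiny resolver mirroring the redirect behaviour, for illustration.
function resolve(path: string): string {
  const r = routes.find((x) => x.path === path);
  return r?.redirectTo ?? r?.component ?? 'NotFound';
}
```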

---

## Key Services

### `LocalInferenceService`

The core AI engine. Selects between the **ONNX** and **TFLite** backends based on the model file extension.

- **ONNX path** (`best.onnx`): Uses `onnxruntime-web` with the WASM execution provider. Input tensor shape: `[1, 3, 640, 640]` (CHW).
- **TFLite path** (`best_float32.tflite`): Accesses the globally loaded `tflite` object (injected via `angular.json` scripts). Input is transposed from CHW to HWC (`[1, 640, 640, 3]`) before prediction.
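The CHW→HWC transpose on the TFLite path can be sketched as follows. This is a minimal standalone version; the function name and signature are illustrative, not the service's actual API:

```typescript
// Reorder a planar CHW tensor into interleaved HWC layout.
// CHW index: channel * (h*w) + row * w + col
// HWC index: (row * w + col) * channels + channel
function chwToHwc(chw: Float32Array, h: number, w: number, c: number): Float32Array {
  const hwc = new Float32Array(chw.length);
  for (let ch = 0; ch < c; ch++) {
    for (let y = 0; y < h; y++) {
      for (let x = 0; x < w; x++) {
        hwc[(y * w + x) * c + ch] = chw[ch * h * w + y * w + x];
      }
    }
  }
  return hwc;
}
```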
|
|
|
|
|
+> **Bundler Note:** Because `@tensorflow/tfjs-tflite` is a legacy CommonJS/UMD hybrid that is incompatible with the modern Vite/esbuild ESM bundler, **both TF and TFLite are loaded as global scripts** in `angular.json`. This is intentional — it ensures they populate `window.tflite` and `window.tf` before Angular bootstraps, bypassing all module resolution issues.
|
|
|
|
|
+

### `ImageProcessorService`

Resizes any input image to `640×640` using an offscreen Canvas and converts the pixel data from RGBA to a normalized `Float32Array` in CHW format (`[1, 3, 640, 640]`).
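The RGBA→CHW step works on the `Uint8ClampedArray` that `getImageData()` returns from the canvas. A sketch of the conversion, assuming simple `/255` normalization (the function name and normalization scheme are assumptions, not the service's actual code):

```typescript
// Convert interleaved RGBA bytes into three planar float channels (R, G, B),
// dropping alpha and scaling each byte into [0, 1].
function rgbaToChw(rgba: Uint8ClampedArray, w: number, h: number): Float32Array {
  const plane = w * h;
  const out = new Float32Array(3 * plane);
  for (let i = 0; i < plane; i++) {
    out[i] = rgba[i * 4] / 255;                 // R plane
    out[plane + i] = rgba[i * 4 + 1] / 255;     // G plane
    out[2 * plane + i] = rgba[i * 4 + 2] / 255; // B plane (alpha discarded)
  }
  return out;
}
```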

### `LocalHistoryService`

Persists inference results to `localStorage` under the key `palm_oil_vault`. Stores up to **20 records** (FIFO), each containing the detection summary, inference latency, engine type, thumbnail image data, and bounding box coordinates.
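The 20-record FIFO write path can be sketched as below. The record shape is assumed from the fields listed above, and a `Map` stands in for `window.localStorage` so the logic is visible outside the browser:

```typescript
// Assumed record shape; the real service's fields may differ.
interface VaultRecord {
  summary: string;
  latencyMs: number;
  engine: 'onnx' | 'tflite';
  thumbnail: string;  // data URL
  boxes: number[][];  // bounding box coordinates
}

const VAULT_KEY = 'palm_oil_vault';
const MAX_RECORDS = 20;

// Prepend the new record and evict the oldest once the cap is exceeded.
function saveRecord(store: Map<string, string>, rec: VaultRecord): void {
  const vault: VaultRecord[] = JSON.parse(store.get(VAULT_KEY) ?? '[]');
  vault.unshift(rec);                            // newest first
  if (vault.length > MAX_RECORDS) vault.pop();   // FIFO eviction
  store.set(VAULT_KEY, JSON.stringify(vault));
}
```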

### `ThemeService`

Manages dark/light mode by toggling the `theme-dark` / `theme-light` CSS classes on `<body>`. Persists the user's preference to `localStorage` under `palm-ai-theme`.

---

## Required Asset Files

The following binary files must be manually placed — they are not installed by `npm install`:

### ONNX Models (`src/assets/models/onnx/`)

| File | Description |
|---|---|
| `best.onnx` | YOLOv8 industrial detection model |

### TFLite Models (`src/assets/models/tflite/`)

| File | Description |
|---|---|
| `best_float32.tflite` | Full-precision TFLite model |
| `best_float16.tflite` | Half-precision TFLite model (smaller, faster) |

### ONNX WASM Runtime (`src/assets/wasm/`)

Copy from `node_modules/onnxruntime-web/dist/` after `npm install`.

### TFLite WASM Runtime (`src/assets/tflite-wasm/`)

Copy from `node_modules/@tensorflow/tfjs-tflite/dist/` after `npm install`. Required files:

| File | Purpose |
|---|---|
| `tflite_web_api_cc.js` | TFLite glue (non-SIMD) |
| `tflite_web_api_cc.wasm` | TFLite engine (non-SIMD) |
| `tflite_web_api_cc_simd.js` | TFLite glue (SIMD-accelerated) |
| `tflite_web_api_cc_simd.wasm` | TFLite engine (SIMD-accelerated) |
| `tflite_web_api_cc_simd_threaded.js` | TFLite glue (SIMD + multi-threaded) |
| `tflite_web_api_cc_simd_threaded.wasm` | TFLite engine (SIMD + multi-threaded) |
| `tflite_web_api_client.js` | TFLite high-level client API |
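The two copy steps above might look like the following when run from the project root. The directory checks guard against a missing `node_modules`; exact file lists can vary between package versions, so verify against the tables above:

```shell
# Stage the ONNX and TFLite WASM runtimes into src/assets/ (run after `npm install`).
SRC_ORT=node_modules/onnxruntime-web/dist
SRC_TFL=node_modules/@tensorflow/tfjs-tflite/dist

mkdir -p src/assets/wasm src/assets/tflite-wasm

# ONNX Runtime WASM binaries
if [ -d "$SRC_ORT" ]; then
  cp "$SRC_ORT"/*.wasm src/assets/wasm/
fi

# TFLite glue + engine files (SIMD / threaded variants included)
if [ -d "$SRC_TFL" ]; then
  cp "$SRC_TFL"/tflite_web_api_* src/assets/tflite-wasm/
fi

echo "assets staged"
```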

---

## Getting Started

### Prerequisites

- Node.js ≥ 18
- Angular CLI (`npm install -g @angular/cli`)

### Installation

```bash
npm install
```

### Development Server

```bash
# Standard local serve
ng serve

# Serve on all network interfaces (for device testing)
ng serve --host 0.0.0.0 --port 4200
```

Open your browser and navigate to `http://localhost:4200/`. The app redirects to `/analyzer` by default.

### Build for Production

```bash
ng build
```

Build artifacts are placed in the `dist/` directory.

> **Important:** The production build will warn about initial bundle size due to the TF.js global scripts (~4 MB). This is expected — both `tf.min.js` and `tf-tflite.min.js` are loaded as external scripts, not tree-shakeable modules.

---

## Inference Pipeline

The `AnalyzerComponent` orchestrates the following pipeline when **Run Inference** is clicked:

```
User uploads image
        │
        ▼
ImageProcessorService.processImage()
  ├─ Resize to 640×640 via Canvas
  └─ Convert RGBA → CHW Float32Array ([1, 3, 640, 640])
        │
        ▼
LocalInferenceService.loadModel(modelPath)
  ├─ ONNX: Set WASM path → Create InferenceSession
  └─ TFLite: setWasmPath → loadTFLiteModel
        │
        ▼
LocalInferenceService.runInference(input)
  ├─ ONNX: Create tensor → session.run() → extract output
  └─ TFLite: Transpose CHW→HWC → model.predict() → extract data
        │
        ▼
LocalInferenceService.parseDetections(rawData, threshold)
  └─ Filter by confidence → map to class labels + bounding boxes
        │
        ▼
AnalyzerComponent draws bounding boxes on Canvas
LocalHistoryService saves result to localStorage
```
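The `parseDetections` step can be sketched as a confidence filter over YOLOv8-style raw output. YOLOv8 detection exports typically emit `[1, 4 + numClasses, numAnchors]` (cx, cy, w, h, then per-class scores); the exact output layout and class index order of this project's models are assumptions here:

```typescript
// Class index order is assumed to match the Detection Classes table.
const CLASSES = ['Ripe', 'Unripe', 'Underripe', 'Overripe', 'Abnormal', 'Empty_Bunch'];

interface Detection {
  label: string;
  confidence: number;
  box: [number, number, number, number]; // cx, cy, w, h in model coordinates
}

// raw is the flattened [4 + numClasses, numAnchors] output for one image.
function parseDetections(raw: Float32Array, numAnchors: number, threshold: number): Detection[] {
  const out: Detection[] = [];
  for (let a = 0; a < numAnchors; a++) {
    // Find the best-scoring class for this anchor.
    let best = 0;
    let bestClass = -1;
    for (let c = 0; c < CLASSES.length; c++) {
      const score = raw[(4 + c) * numAnchors + a];
      if (score > best) { best = score; bestClass = c; }
    }
    if (best >= threshold && bestClass >= 0) {
      out.push({
        label: CLASSES[bestClass],
        confidence: best,
        box: [raw[a], raw[numAnchors + a], raw[2 * numAnchors + a], raw[3 * numAnchors + a]],
      });
    }
  }
  return out;
}
```

A production version would also apply non-maximum suppression to drop overlapping boxes; this sketch only covers the confidence filtering named in the diagram.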

---

## Running Tests

```bash
ng test
```

Uses Karma + Jasmine. No end-to-end test framework is configured by default.
+
|
|
|
|
|
+---
|
|
|
|
|
|
|
|
## Additional Resources
|
|
## Additional Resources
|
|
|
|
|
|
|
|
-For more information on using the Angular CLI, including detailed command references, visit the [Angular CLI Overview and Command Reference](https://angular.dev/tools/cli) page.
|
|
|
|
|
|
|
+- [Angular CLI Reference](https://angular.dev/tools/cli)
|
|
|
|
|
+- [ONNX Runtime Web](https://onnxruntime.ai/docs/get-started/with-javascript/web.html)
|
|
|
|
|
+- [TensorFlow.js TFLite](https://www.tensorflow.org/js/guide/tflite)
|