Version 2.6 Industrial — A fully client-side Angular web application for palm oil fruit bunch ripeness detection, running AI inference 100% in the browser with no backend dependency.
The PalmOilAI frontend is a standalone Angular application that loads YOLOv8-based detection models directly in the browser using ONNX Runtime Web (for the industrial engine) and TensorFlow.js TFLite (for the standard PoC engine). All inference, image preprocessing, and result history are handled client-side with no network round-trips.
The models detect six ripeness categories:

| Class | Description | Color |
|---|---|---|
| `Ripe` | Ready for harvest | Green |
| `Unripe` | Not ready | Olive |
| `Underripe` | Almost ready | Amber |
| `Overripe` | Past optimal harvest | Brown |
| `Abnormal` | Health alert (disease or damage) | Red |
| `Empty_Bunch` | No fruit present | Grey |
| Category | Technology | Version |
|---|---|---|
| Framework | Angular | ^20.0.0 |
| Bundler | Vite/esbuild (via `@angular/build`) | ^20.0.5 |
| ONNX Inference | `onnxruntime-web` | ^1.24.3 |
| TFLite Inference | `@tensorflow/tfjs-tflite` | ^0.0.1-alpha.10 |
| TF Core | `@tensorflow/tfjs` | ^4.22.0 |
| Styling | SCSS | — |
| Language | TypeScript | ~5.8.2 |
```
src/
├── app/
│ ├── components/
│ │ ├── analyzer/ # Main scanner/inference UI
│ │ ├── header/ # Navigation & theme toggle
│ │ ├── history/ # Inference history (Vault)
│ │ └── settings/ # Confidence threshold config (backend)
│ ├── services/
│ │ ├── local-inference.service.ts # Core AI engine (ONNX + TFLite)
│ │ ├── image-processor.service.ts # Image resize + CHW preprocessing
│ │ ├── local-history.service.ts # LocalStorage-based result vault
│ │ ├── theme.service.ts # Dark/light mode persistence
│ │ ├── api.service.ts # Backend API client (optional)
│ │ └── tflite.d.ts # TypeScript declarations for TFLite globals
│ ├── app.routes.ts # Client-side routing
│ └── app.ts # Root application component
└── assets/
├── models/
│ ├── onnx/best.onnx # YOLOv8 Industrial model (~9.4 MB)
│ └── tflite/
│ ├── best_float32.tflite # Standard PoC model, full precision (~9.4 MB)
│ └── best_float16.tflite # Reduced precision variant (~4.8 MB)
├── wasm/ # ONNX Runtime WASM binaries
└── tflite-wasm/ # TFLite WASM runtime glue files
```
| Path | Component | Description |
|---|---|---|
| `/` | → `/analyzer` | Redirect to Scanner |
| `/analyzer` | `AnalyzerComponent` | Main image upload + inference UI |
| `/history` | `HistoryComponent` | Saved inference "Vault" records |
| `/settings` | `SettingsComponent` | Backend confidence threshold (requires backend) |
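The route table above corresponds to a routing config along these lines. This is a sketch only: it uses a minimal local `Route` type and placeholder component classes so it is self-contained, whereas the real `app.routes.ts` would import `Routes` from `@angular/router` and the actual components.

```typescript
// Minimal stand-in for Angular's Route type; the real file uses
// `Routes` from '@angular/router'.
type Route = {
  path: string;
  redirectTo?: string;
  pathMatch?: 'full' | 'prefix';
  component?: unknown;
};

// Placeholder classes for illustration only.
class AnalyzerComponent {}
class HistoryComponent {}
class SettingsComponent {}

export const routes: Route[] = [
  { path: '', redirectTo: 'analyzer', pathMatch: 'full' }, // "/" -> "/analyzer"
  { path: 'analyzer', component: AnalyzerComponent },      // main scanner UI
  { path: 'history', component: HistoryComponent },        // saved "Vault" records
  { path: 'settings', component: SettingsComponent },      // backend threshold config
];
```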
### LocalInferenceService

The core AI engine. Selects between the ONNX and TFLite backends based on the model file extension.

- **ONNX** (`best.onnx`): uses `onnxruntime-web` with the WASM execution provider. Input tensor shape: `[1, 3, 640, 640]` (CHW).
- **TFLite** (`best_float32.tflite`): accesses the globally loaded `tflite` object (injected via `angular.json` scripts). Input is transposed from CHW to HWC (`[1, 640, 640, 3]`) before prediction.

> **Bundler Note:** Because `@tensorflow/tfjs-tflite` is a legacy CommonJS/UMD hybrid that is incompatible with the modern Vite/esbuild ESM bundler, both TF and TFLite are loaded as global scripts in `angular.json`. This is intentional: it ensures they populate `window.tflite` and `window.tf` before Angular bootstraps, bypassing all module resolution issues.
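The extension-based backend selection can be sketched as a pure function (the function name is illustrative, not the service's actual API):

```typescript
type Engine = 'onnx' | 'tflite';

// Pick the inference backend from the model file extension,
// mirroring how the service routes between the two engines.
function selectEngine(modelPath: string): Engine {
  const ext = modelPath.split('.').pop()?.toLowerCase();
  switch (ext) {
    case 'onnx':
      return 'onnx';   // onnxruntime-web, WASM execution provider, CHW input
    case 'tflite':
      return 'tflite'; // global `tflite` object, HWC input
    default:
      throw new Error(`Unsupported model format: ${modelPath}`);
  }
}
```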
### ImageProcessorService

Resizes any input image to 640×640 using an offscreen Canvas and converts the pixel data from RGBA to a normalized `Float32Array` in CHW format (`[1, 3, 640, 640]`).
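The RGBA-to-CHW conversion step can be sketched as pure array math (hypothetical function name; the real service performs the Canvas resize first and then applies this kind of transform to the resulting pixel buffer):

```typescript
// Convert RGBA pixel data (as produced by canvas getImageData) into a
// normalized Float32Array in CHW layout, i.e. [1, 3, height, width].
function rgbaToChw(
  rgba: Uint8ClampedArray,
  width: number,
  height: number,
): Float32Array {
  const plane = width * height;
  const out = new Float32Array(3 * plane);
  for (let i = 0; i < plane; i++) {
    out[i] = rgba[i * 4] / 255;                 // R plane
    out[plane + i] = rgba[i * 4 + 1] / 255;     // G plane
    out[2 * plane + i] = rgba[i * 4 + 2] / 255; // B plane (alpha dropped)
  }
  return out;
}
```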
### LocalHistoryService

Persists inference results to `localStorage` under the key `palm_oil_vault`. Stores up to 20 records (FIFO), each containing the detection summary, inference latency, engine type, thumbnail image data, and bounding box coordinates.
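The FIFO persistence logic can be sketched against a minimal storage interface (a stand-in for the Web Storage API so the sketch runs anywhere; the real service writes to `localStorage` directly, and the record shape shown is an assumption):

```typescript
const VAULT_KEY = 'palm_oil_vault';
const MAX_RECORDS = 20;

// Minimal subset of the Web Storage API.
interface KeyValueStore {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
}

// Append a record to the vault, evicting the oldest entries
// once the 20-record cap is exceeded (FIFO).
function saveRecord(store: KeyValueStore, record: object): void {
  const records: object[] = JSON.parse(store.getItem(VAULT_KEY) ?? '[]');
  records.push(record);
  while (records.length > MAX_RECORDS) records.shift(); // drop oldest
  store.setItem(VAULT_KEY, JSON.stringify(records));
}
```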
### ThemeService

Manages dark/light mode by toggling the `theme-dark` / `theme-light` CSS classes on `<body>`. Persists the user's preference to `localStorage` under `palm-ai-theme`.
The following binary files must be placed manually; they are not installed by `npm install`:
### ONNX Model (`src/assets/models/onnx/`)

| File | Description |
|---|---|
| `best.onnx` | YOLOv8 industrial detection model |
### TFLite Models (`src/assets/models/tflite/`)

| File | Description |
|---|---|
| `best_float32.tflite` | Full-precision TFLite model |
| `best_float16.tflite` | Half-precision TFLite model (smaller, faster) |
### ONNX WASM Runtime (`src/assets/wasm/`)

Copy from `node_modules/onnxruntime-web/dist/` after `npm install`.
### TFLite WASM Runtime (`src/assets/tflite-wasm/`)

Copy from `node_modules/@tensorflow/tfjs-tflite/dist/` after `npm install`. Required files:

| File | Purpose |
|---|---|
| `tflite_web_api_cc.js` | TFLite glue (non-SIMD) |
| `tflite_web_api_cc.wasm` | TFLite engine (non-SIMD) |
| `tflite_web_api_cc_simd.js` | TFLite glue (SIMD-accelerated) |
| `tflite_web_api_cc_simd.wasm` | TFLite engine (SIMD-accelerated) |
| `tflite_web_api_cc_simd_threaded.js` | TFLite glue (SIMD + multi-threaded) |
| `tflite_web_api_cc_simd_threaded.wasm` | TFLite engine (SIMD + multi-threaded) |
| `tflite_web_api_client.js` | TFLite high-level client API |
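A copy step along these lines can stage both WASM runtimes after `npm install` (a sketch using the paths from the tables above; the `if` guards simply skip packages that are not installed yet):

```shell
# Stage the ONNX Runtime and TFLite WASM binaries into src/assets/.
mkdir -p src/assets/wasm src/assets/tflite-wasm

# ONNX Runtime WASM binaries.
if [ -d node_modules/onnxruntime-web/dist ]; then
  cp node_modules/onnxruntime-web/dist/*.wasm src/assets/wasm/
fi

# TFLite WASM runtime and glue files.
if [ -d node_modules/@tensorflow/tfjs-tflite/dist ]; then
  cp node_modules/@tensorflow/tfjs-tflite/dist/tflite_web_api_* src/assets/tflite-wasm/
fi
```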
Requires the Angular CLI (`npm install -g @angular/cli`). Install dependencies:

```
npm install
```
```
# Standard local serve
ng serve

# Serve on all network interfaces (for device testing)
ng serve --host 0.0.0.0 --port 4200
```
Open your browser and navigate to `http://localhost:4200/`. The app redirects to `/analyzer` by default.
```
ng build
```

Build artifacts are placed in the `dist/` directory.
> **Important:** The production build will warn about the initial bundle size due to the TF.js global scripts (~4 MB). This is expected: both `tf.min.js` and `tf-tflite.min.js` are loaded as external scripts, not tree-shakeable modules.
The `AnalyzerComponent` orchestrates the following pipeline when **Run Inference** is clicked:
```
User uploads image
│
▼
ImageProcessorService.processImage()
└─ Resize to 640×640 via Canvas
└─ Convert RGBA → CHW Float32Array ([1, 3, 640, 640])
│
▼
LocalInferenceService.loadModel(modelPath)
├─ ONNX: Set WASM path → Create InferenceSession
└─ TFLite: setWasmPath → loadTFLiteModel
│
▼
LocalInferenceService.runInference(input)
├─ ONNX: Create tensor → session.run() → extract output
└─ TFLite: Transpose CHW→HWC → model.predict() → extract data
│
▼
LocalInferenceService.parseDetections(rawData, threshold)
└─ Filter by confidence → map to class labels + bounding boxes
│
▼
AnalyzerComponent draws bounding boxes on Canvas
LocalHistoryService saves result to localStorage
```
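The detection-parsing step can be sketched as follows, assuming the common YOLOv8 output layout `[1, 4 + numClasses, numAnchors]` with `(cx, cy, w, h)` center-format boxes; the function signature and `Detection` shape are illustrative, not the service's actual API, and a real parser would typically follow this with non-maximum suppression:

```typescript
const CLASS_NAMES = ['Ripe', 'Unripe', 'Underripe', 'Overripe', 'Abnormal', 'Empty_Bunch'];

interface Detection {
  label: string;
  confidence: number;
  box: { cx: number; cy: number; w: number; h: number }; // model-pixel coords
}

// Filter raw YOLOv8 output by confidence and map class indices to labels.
// `data` is laid out channel-major: 4 box rows, then one row per class,
// each row holding `numAnchors` values.
function parseDetections(
  data: Float32Array,
  numAnchors: number,
  threshold: number,
): Detection[] {
  const detections: Detection[] = [];
  for (let a = 0; a < numAnchors; a++) {
    // Find the best-scoring class for this anchor.
    let best = 0;
    let bestScore = 0;
    for (let c = 0; c < CLASS_NAMES.length; c++) {
      const score = data[(4 + c) * numAnchors + a];
      if (score > bestScore) { bestScore = score; best = c; }
    }
    if (bestScore < threshold) continue; // below confidence threshold
    detections.push({
      label: CLASS_NAMES[best],
      confidence: bestScore,
      box: {
        cx: data[0 * numAnchors + a],
        cy: data[1 * numAnchors + a],
        w: data[2 * numAnchors + a],
        h: data[3 * numAnchors + a],
      },
    });
  }
  return detections;
}
```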
```
ng test
```

Uses Karma + Jasmine. No end-to-end test framework is configured by default.