Dr-Swopt 1 day ago
parent
commit
e009acb78a

+ 94 - 10
CLAUDE.md

@@ -4,7 +4,7 @@ This file provides guidance to Claude Code (claude.ai/code) when working with co
 
 ## What This Is
 
-PalmOilAI — an Angular 20 single-page application for palm oil fruit bunch (FFB) ripeness detection. All AI inference runs **100% client-side** in the browser using ONNX Runtime Web and TensorFlow.js TFLite. There is no required backend dependency; the NestJS backend is optional and used only for server-side inference and chat proxy.
+PalmOilAI — an Angular 20 single-page application for palm oil fruit bunch (FFB) ripeness detection. Supports three inference engines: `tflite` and `onnx` run 100% client-side in the browser via WASM; `socket` streams to the NestJS backend for server-side inference. The NestJS backend is required for the socket engine, chat proxy, and remote history.
 
 Detection classes: `Ripe`, `Unripe`, `Underripe`, `Overripe`, `Abnormal`, `Empty_Bunch`.
 
@@ -22,12 +22,41 @@ ng test                         # Karma + Jasmine unit tests
 
 ### Routing
 `/analyzer` → `AnalyzerComponent` (main scanner UI, default route)  
-`/history` → `HistoryComponent` (local vault of past scans)  
+`/history` → `HistoryComponent` (local + remote vault of past scans)  
 `/chatbot` → `ChatbotComponent` (chat interface backed by n8n via WebSocket)
 
-### Key Services (`src/app/services/`)
+### Components (`src/app/components/`)
 
-**`LocalInferenceService`** — Core AI engine. Dispatches to ONNX or TFLite backend based on model file extension:
+**`AnalyzerComponent`** — Three-engine scan hub:
+- Engine selector: `tflite` (browser WASM), `onnx` (browser WASM), `socket` (NestJS remote)
+- Browser engines: file upload → local WASM inference → bounding-box canvas draw
+- Socket engine: webcam snap or gallery image → Base64 → `vision:analyze` → canvas draw
+- Batch ingestion: queues multiple files, sends one-by-one via socket, collects results into `FullSessionReport`
+- Evidence canvas: separate from snap canvas; renders bounding boxes for audit drill-down
+- Computes `BatchResult[]` and `FullSessionReport` (timing, success/fail counts, raw tensor sample)
+
+**`HistoryComponent`** — Dual-tab vault:
+- **Local tab**: browser `localStorage` via `LocalHistoryService`, up to 20 records
+- **Remote tab**: NestJS SQLite records via `RemoteInferenceService`, grouped by `batch_id` into `BatchSessionGroup` with aggregate summaries
+- Per-record delete and clear-all (with confirmation)
+
+**`ChatbotComponent`** — n8n RAG chat:
+- Messages sent via `ChatSocketService.send()` (promise-based one-shot listener)
+- NestJS proxies to n8n server-to-server (no browser CORS)
+- Unwraps n8n array response to first element
+- Shows per-message response latency; error fallback with NestJS→n8n diagnostic hint
+
+**`PerformanceHudComponent`** — Floating, draggable, collapsible overlay (CDK drag-drop):
+- Live CPU/RAM for NestJS, n8n, and Ollama processes via `SurveillanceService`
+- Mounts in app root and never unmounts (zombie socket pattern)
+
+**`HeaderComponent`** — App header with dark/light theme toggle via `ThemeService`.
+
+**`BottomNavComponent`** — Mobile fixed tab bar (hidden on desktop). Three tabs: Scanner | Intelligence | Vault.
+
+### Services (`src/app/services/` and `src/app/core/services/`)
+
+**`LocalInferenceService`** — Core browser AI engine. Dispatches to ONNX or TFLite backend based on model file extension:
 - ONNX path: `onnxruntime-web` with WASM execution provider. Input: `[1, 3, 640, 640]` (CHW).
 - TFLite path: Uses the globally-injected `window.tflite` object. Input is transposed CHW→HWC to `[1, 640, 640, 3]` before prediction.
 
@@ -35,12 +64,57 @@ ng test                         # Karma + Jasmine unit tests
 
 **`LocalHistoryService`** — Persists up to 20 scan records (FIFO) to `localStorage` key `palm_oil_vault`. Each record includes detection summary, latency, engine type, Base64 thumbnail, and bounding boxes.
 
-**`VisionSocketService`** / **`ChatSocketService`** — WebSocket clients connecting to the NestJS backend (`/vision` and unspecified chat namespace respectively).
-
-**`SurveillanceService`** (frontend) — Connects to the NestJS `/monitor` namespace for live CPU/memory metrics of NestJS, n8n, and Ollama processes.
+**`InferenceService`** (`src/app/core/services/`) — Hub dispatcher using Angular Signals:
+- Signals: `mode` (local | api), `localEngine` (onnx | tflite), `detections`, `summary`
+- `analyze(base64, w, h)` dispatches to local (`LocalInferenceService`) or remote (`RemoteInferenceService`)
+- Local path: base64 → blob → file → `LocalInferenceService` → parse detections
+- Remote path: blob → `RemoteInferenceService.analyze()` → map detections
+- Computes class-count summary
+
+**`RemoteInferenceService`** (`src/app/core/services/`) — HTTP client for NestJS:
+- `analyze(blob)` → POST `/analyze` → `AnalysisResponse`
+- `getHistory()`, `deleteRecord(archiveId)`, `clearAll()`
+- Hits `http://localhost:3000`
+
+**`VisionSocketService`** — Socket.io client for NestJS `/vision` namespace:
+- `snapAndSend(videoEl, batchId?)` — captures webcam frame and emits as raw Base64
+- `sendBase64(dataUrl, batchId?)` — emits gallery image as raw Base64
+- Hard rule: raw, uncompressed Base64 strings only — no binary frames, no WebRTC
+- Signals: `connected`, `analyzing`, `lastResult`, `lastError`
+- Zombie socket pattern: socket is never explicitly closed
+
+**`ChatSocketService`** — Socket.io client sharing NestJS `/vision` namespace for chat:
+- `send(message): Promise<ChatResult>` — one-shot listener; emits `chat:send`, resolves on `chat:result` or rejects on `chat:error`
+- Unwraps n8n response variants (`output | answer | response | text`)
+- Zombie socket pattern
+
+**`SurveillanceService`** — Socket.io monitoring client for NestJS `/monitor` namespace:
+- Emits `monitor:subscribe` on connect; receives `monitor:data` (500ms ticks) and `monitor:status`
+- Signals: `metrics`, `connected`, `nestStatus` (computed), `n8nStatus` (checking | ready | not-ready)
+- `formatBytes(bytes)` helper for memory display
+- Zombie socket pattern
+
+**`ThemeService`** — Dark/light theme toggle:
+- Adds/removes `theme-dark` / `theme-light` class on `document.body` via `Renderer2`
+- Persists preference to `localStorage` (`palm-ai-theme`)
+- Exposes `currentTheme$` observable and `isDark()` boolean
+
+### Key Interfaces (`src/app/core/interfaces/`)
+
+**`BatchResult`** — Single scan audit record:
+- `image_id`, `timestamp`, `status` (ok | error)
+- `detections[]`: `bunch_id`, `ripeness_class`, `confidence_pct`, `is_health_alert`, `bounding_box`, `norm_box`
+- `performance`: `inference_ms`, `processing_ms`, `round_trip_ms`
+- `technical_evidence`: engine, archive_id, total_count, threshold, industrial_summary, raw_tensor_sample
+- `localBlobUrl?`, `error?`
+
+**`FullSessionReport`** — Batch run summary:
+- `session_id` (UUID), `generated_at`
+- `meta`: total_images, successful, failed, total_time_ms, avg_inference_ms, avg_round_trip_ms
+- `results: BatchResult[]`
 
 ### TFLite Bundler Constraint
-`@tensorflow/tfjs-tflite` is a legacy CommonJS/UMD hybrid incompatible with the Vite/esbuild ESM bundler. Both `tf.min.js` and `tf-tflite.min.js` are loaded as **global scripts** in `angular.json`, not as ES modules. This populates `window.tflite` and `window.tf` before Angular bootstraps. Do not attempt to import them via `import` statements.
+`@tensorflow/tfjs-tflite` is a legacy CommonJS/UMD hybrid incompatible with the Vite/esbuild ESM bundler. Both `tf.min.js` and `tf-tflite.min.js` are loaded as **global scripts** in `angular.json`, not as ES modules. This populates `window.tflite` and `window.tf` before Angular bootstraps. Do not import them via `import` statements.
 
 ### Required Manual Assets (not installed by npm)
 These binary files must be placed manually after `npm install`:
@@ -53,10 +127,20 @@ These binary files must be placed manually after `npm install`:
 | `src/assets/wasm/` | Copy from `node_modules/onnxruntime-web/dist/` |
 | `src/assets/tflite-wasm/` | Copy 7 files from `node_modules/@tensorflow/tfjs-tflite/dist/` |
 
-### Inference Pipeline (AnalyzerComponent orchestrates)
+### Environment
+- `environment.nestWsUrl` = `http://localhost:3000`
+- Angular 20, standalone components, Signals-based reactivity
+
+### Inference Pipeline (socket engine)
+1. User uploads image or webcam snap → `VisionSocketService.sendBase64()` → NestJS
+2. NestJS ONNX inference + SQLite write → `vision:result` with `BatchResult` payload
+3. `AnalyzerComponent` appends to `BatchResult[]`, draws bounding boxes on evidence canvas
+4. On session end → `FullSessionReport` assembled from collected results
+
+### Inference Pipeline (browser engines)
 1. User uploads image → `ImageProcessorService.processImage()` → CHW Float32Array
 2. `LocalInferenceService.loadModel(modelPath)` → creates ONNX session or loads TFLite model
 3. `LocalInferenceService.runInference(input)` → raw output tensor
 4. `LocalInferenceService.parseDetections(rawData, threshold)` → filtered detections with class labels and bounding boxes
-5. `AnalyzerComponent` draws bounding boxes on Canvas
+5. `AnalyzerComponent` draws bounding boxes on canvas
 6. `LocalHistoryService.save()` → persists to localStorage
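The TFLite path described above feeds `[1, 640, 640, 3]` HWC input, transposed from the `[1, 3, 640, 640]` CHW tensor the pipeline produces. A minimal sketch of that transpose (an illustration under those layout assumptions, not the repo's actual implementation):

```typescript
// Transpose a flat CHW Float32Array into HWC layout.
// For the pipeline above: h = w = 640, c = 3.
function chwToHwc(chw: Float32Array, h: number, w: number, c: number): Float32Array {
  const hwc = new Float32Array(chw.length);
  for (let ch = 0; ch < c; ch++) {
    for (let y = 0; y < h; y++) {
      for (let x = 0; x < w; x++) {
        // CHW index: ch * h * w + y * w + x
        // HWC index: (y * w + x) * c + ch
        hwc[(y * w + x) * c + ch] = chw[ch * h * w + y * w + x];
      }
    }
  }
  return hwc;
}
```

For a 640×640×3 frame this is a single O(n) pass, cheap next to the inference call that follows it.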

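`LocalHistoryService` is described above as a 20-record FIFO under the `palm_oil_vault` localStorage key. A hedged sketch of that cap logic — the record shape and `saveRecord` helper are assumptions for illustration, not the repo's actual interface:

```typescript
const VAULT_KEY = 'palm_oil_vault';   // key from the doc
const MAX_RECORDS = 20;               // FIFO cap from the doc

// Simplified record; the real one also carries detections, latency,
// engine type, Base64 thumbnail, and bounding boxes.
interface VaultRecord { timestamp: string; engine: string; summary: string; }

// Minimal storage surface (window.localStorage satisfies it)
interface KVStore {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
}

// Prepend the newest record and trim to the cap, so the oldest falls off first
function saveRecord(record: VaultRecord, storage: KVStore): void {
  const existing: VaultRecord[] = JSON.parse(storage.getItem(VAULT_KEY) ?? '[]');
  storage.setItem(VAULT_KEY, JSON.stringify([record, ...existing].slice(0, MAX_RECORDS)));
}
```

In the browser you would pass `window.localStorage` as the `KVStore`.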
+ 4 - 29
src/app/components/chatbot/chatbot.component.html

@@ -19,25 +19,7 @@
           <span>n8n → Angular Response</span>
         </div>
 
-        <!-- Agent readiness indicator (driven by NestJS webhook probe) -->
-        <div class="n8n-status-row">
-          <span class="n8n-status-dot"
-                [class.dot-green]="surveillance.n8nStatus() === 'ready'"
-                [class.dot-yellow]="surveillance.n8nStatus() === 'checking'"
-                [class.dot-red]="surveillance.n8nStatus() === 'not-ready'">
-          </span>
-          <span class="n8n-status-label"
-                [class.state-online]="surveillance.n8nStatus() === 'ready'"
-                [class.state-warn]="surveillance.n8nStatus() === 'checking'"
-                [class.state-offline]="surveillance.n8nStatus() === 'not-ready'">
-            Agent:
-            {{ surveillance.n8nStatus() === 'ready' ? 'Ready'
-             : surveillance.n8nStatus() === 'not-ready' ? 'Not Ready'
-             : 'Checking...' }}
-          </span>
-        </div>
-
-        <button class="btn btn-outline sidebar-clear" (click)="clearChat()">
+        <button class="btn btn-outline sidebar-clear" (click)="onClearChat()">
           Clear Chat
         </button>
       </aside>
@@ -71,7 +53,7 @@
                 <div class="thinking-dots">
                   <span></span><span></span><span></span>
                 </div>
-                <p class="thinking-label">Routing → Ollama synthesizing...</p>
+                <p class="thinking-label">Agent is typing...</p>
               </div>
             </div>
           }
@@ -84,13 +66,6 @@
           </div>
         }
 
-        <!-- Agent not ready banner — NestJS cannot reach n8n webhook -->
-        @if (surveillance.nestStatus() === 'ONLINE' && surveillance.n8nStatus() === 'not-ready') {
-          <div class="n8n-offline-banner">
-            🟡 LLM agent not ready — NestJS cannot reach the n8n webhook. Check that n8n is running and the workflow is active.
-          </div>
-        }
-
         <!-- Input row -->
         <div class="chat-input-row">
           <textarea
@@ -99,12 +74,12 @@
             (keydown)="onEnter($event)"
             placeholder="Ask about palm oil ripeness, site data, operational reports..."
             rows="2"
-            [disabled]="loading() || chatSocket.sending() || surveillance.nestStatus() === 'OFFLINE' || surveillance.n8nStatus() !== 'ready'">
+            [disabled]="loading() || chatSocket.sending() || surveillance.nestStatus() === 'OFFLINE'">
           </textarea>
           <button
             class="btn btn-primary send-btn"
             (click)="sendMessage()"
-            [disabled]="!inputText.trim() || loading() || chatSocket.sending() || surveillance.nestStatus() === 'OFFLINE' || surveillance.n8nStatus() !== 'ready'">
+            [disabled]="!inputText.trim() || loading() || chatSocket.sending() || surveillance.nestStatus() === 'OFFLINE'">
             {{ chatSocket.sending() ? '...' : 'Send' }}
           </button>
         </div>

+ 2 - 1
src/app/components/chatbot/chatbot.component.ts

@@ -102,7 +102,7 @@ export class ChatbotComponent implements AfterViewChecked {
     }
   }
 
-  clearChat(): void {
+  onClearChat(): void {
     this.messages.set([
       {
         role: 'bot',
@@ -110,6 +110,7 @@ export class ChatbotComponent implements AfterViewChecked {
         timestamp: new Date(),
       },
     ]);
+    this.chatSocket.clearBackendSession();
   }
 
   private pushMessage(msg: ChatMessage): void {

+ 1 - 26
src/app/components/performance-hud/performance-hud.component.html

@@ -21,14 +21,6 @@
             </span>
             <span class="hud-led-label">Nest</span>
           </span>
-          <span class="hud-led-group" title="n8n Agent: {{ surveillance.n8nStatus() }}">
-            <span class="hud-led-dot"
-                  [class.led-green]="surveillance.n8nStatus() === 'ready'"
-                  [class.led-yellow]="surveillance.n8nStatus() === 'checking'"
-                  [class.led-red]="surveillance.n8nStatus() === 'not-ready'">
-            </span>
-            <span class="hud-led-label">n8n</span>
-          </span>
         </div>
       </div>
 
@@ -39,8 +31,7 @@
         </div>
       } @else if (surveillance.metrics().length === 0) {
         <div class="hud-waiting">
-          <div class="hud-spinner"></div>
-          <span>Connecting to backend...</span>
+          <span>No processes detected</span>
         </div>
       }
 
@@ -81,22 +72,6 @@
             {{ surveillance.connected() ? 'Connected' : 'Offline — Socket features disabled' }}
           </span>
         </div>
-        <div class="hud-svc-row">
-          <span class="hud-svc-label">n8n</span>
-          <span class="hud-svc-dot"
-                [class.dot-green]="surveillance.n8nStatus() === 'ready'"
-                [class.dot-yellow]="surveillance.n8nStatus() === 'checking'"
-                [class.dot-red]="surveillance.n8nStatus() === 'not-ready'">
-          </span>
-          <span class="hud-svc-state"
-                [class.state-online]="surveillance.n8nStatus() === 'ready'"
-                [class.state-warn]="surveillance.n8nStatus() === 'checking'"
-                [class.state-offline]="surveillance.n8nStatus() === 'not-ready'">
-            {{ surveillance.n8nStatus() === 'ready' ? 'Agent Ready'
-             : surveillance.n8nStatus() === 'not-ready' ? 'Agent Not Ready'
-             : 'Checking...' }}
-          </span>
-        </div>
       </div>
 
       <div class="hud-footer">PID polling @ 500ms · Lego 09</div>

+ 4 - 0
src/app/services/chat-socket.service.ts

@@ -80,6 +80,10 @@ export class ChatSocketService implements OnDestroy {
     });
   }
 
+  clearBackendSession(): void {
+    this.socket.emit('chat:clear');
+  }
+
   ngOnDestroy(): void {
     this.socket.disconnect();
   }
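The promise-based one-shot listener that `ChatSocketService.send()` is described as using can be sketched as below. The `SocketLike` surface, the timeout, and the `ChatResult` shape are assumptions for illustration (the real service also unwraps `output | answer | response | text` variants); the event names `chat:send` / `chat:result` / `chat:error` come from the doc:

```typescript
// Minimal event-emitter surface we rely on (socket.io-client's Socket satisfies it)
interface SocketLike {
  emit(event: string, payload?: unknown): void;
  once(event: string, cb: (arg: any) => void): void;
  off(event: string, cb: (arg: any) => void): void;
}

interface ChatResult { output: string; }

function sendOnce(socket: SocketLike, message: string, timeoutMs = 30_000): Promise<ChatResult> {
  return new Promise((resolve, reject) => {
    const timer = setTimeout(() => { cleanup(); reject(new Error('chat timeout')); }, timeoutMs);
    const cleanup = () => {
      clearTimeout(timer);
      socket.off('chat:result', onResult);
      socket.off('chat:error', onError);
    };
    const onResult = (res: ChatResult) => { cleanup(); resolve(res); };
    const onError = (err: unknown) => { cleanup(); reject(err); };
    // One-shot: listeners are registered per call and removed once settled
    socket.once('chat:result', onResult);
    socket.once('chat:error', onError);
    socket.emit('chat:send', { message });
  });
}
```

Registering fresh listeners per call and removing both on settle is what keeps a long-lived ("zombie") socket from leaking handlers across messages.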

+ 1 - 22
src/app/services/surveillance.service.ts

@@ -5,9 +5,7 @@
  * Emits monitor:subscribe on connect and keeps the tunnel alive indefinitely.
  * Exposes signals for the PerformanceHUD and chatbot status indicators.
  *
- * n8nStatus is driven by monitor:status events emitted by NestJS every time
- * its server-side webhook probe result changes (probed every 10 s). This is
- * truth-based: NestJS POSTs to the actual webhook, not just a port check.
+ * Exposes nestStatus (ONLINE/OFFLINE) derived from socket connection state.
  */
 
 import { Injectable, signal, computed, OnDestroy } from '@angular/core';
@@ -22,10 +20,6 @@ export interface MonitorPayload {
   timestamp: string;
 }
 
-export interface MonitorStatus {
-  n8nWebhookReady: boolean;
-  timestamp: string;
-}
 
 @Injectable({ providedIn: 'root' })
 export class SurveillanceService implements OnDestroy {
@@ -39,12 +33,6 @@ export class SurveillanceService implements OnDestroy {
     this.connected() ? 'ONLINE' : 'OFFLINE'
   );
 
-  /**
-   * n8n agent readiness — driven by NestJS webhook probe via monitor:status.
-   * 'checking' until the first probe result arrives after connect.
-   */
-  readonly n8nStatus = signal<'checking' | 'ready' | 'not-ready'>('checking');
-
   private socket: Socket;
 
   constructor() {
@@ -57,25 +45,16 @@ export class SurveillanceService implements OnDestroy {
 
     this.socket.on('connect', () => {
       this.connected.set(true);
-      this.n8nStatus.set('checking');
-      // Lego 11: emit monitor:subscribe to start the data stream
       this.socket.emit('monitor:subscribe');
     });
 
     this.socket.on('disconnect', () => {
       this.connected.set(false);
-      this.n8nStatus.set('not-ready');
     });
 
-    // Every 500ms tick from NestJS SurveillanceService
     this.socket.on('monitor:data', (payload: MonitorPayload[]) => {
       this.metrics.set(payload);
     });
-
-    // Webhook probe result — emitted immediately on connect and on every change
-    this.socket.on('monitor:status', (status: MonitorStatus) => {
-      this.n8nStatus.set(status.n8nWebhookReady ? 'ready' : 'not-ready');
-    });
   }
 
   ngOnDestroy() {
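`formatBytes(bytes)` is mentioned above for the HUD's memory display but its body is not part of this diff; a plausible sketch, where the rounding and unit choices are guesses rather than the repo's actual code:

```typescript
// Format a raw byte count into a human-readable binary-unit string,
// e.g. 1536 -> "1.5 KB". Caps at GB for typical process RSS values.
function formatBytes(bytes: number): string {
  if (bytes <= 0) return '0 B';
  const units = ['B', 'KB', 'MB', 'GB'];
  const i = Math.min(Math.floor(Math.log2(bytes) / 10), units.length - 1);
  return `${(bytes / 2 ** (10 * i)).toFixed(1)} ${units[i]}`;
}
```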