Dr-Swopt, 4 days ago
parent revision 0832da840f
2 changed files with 62 additions and 62 deletions
  1. README.md  +33 −51
  2. main.py    +29 −11

README.md  +33 −51

@@ -1,15 +1,15 @@
-## README.md
 
-# Palm Oil Ripeness Agent (n8n + YOLOv8)
+# 🌴 Palm Oil Ripeness Agent (n8n + YOLOv8)
 
-This project uses a custom-trained **YOLOv8** model to detect the ripeness of Palm Oil Fresh Fruit Bunches (FFB). It features a local Python FastAPI server that integrates into an **agentic n8n workflow**, storing results and embeddings in **MongoDB Atlas**.
+This project uses a custom-trained **YOLOv8** model to detect the ripeness of Palm Oil Fresh Fruit Bunches (FFB). It features a local Python FastAPI server and a Streamlit Dashboard, both integrated into an **agentic n8n workflow**.
 
 ## 🚀 Project Overview
 
-1. **Model:** YOLOv8 Nano (custom-trained on Roboflow dataset).
-2. **Server:** FastAPI (Python) hosting the model for inference.
-3. **Database:** MongoDB Atlas (Vector Search for historical similarity).
-4. **Orchestration:** n8n (Agentic workflow for decision making).
+1. **Vision Engine:** YOLOv8 Nano (Custom-trained on MPOB-standard datasets).
+2. **Inference Server:** FastAPI (Python) for n8n integration.
+3. **Demo Dashboard:** Streamlit UI for drag-and-drop batch testing.
+4. **Intelligence:** MongoDB Atlas Vector Search for similarity-based reasoning.
 
 ---
 
@@ -18,89 +18,71 @@ This project uses a custom-trained **YOLOv8** model to detect the ripeness of Pa
 * Python 3.10+
 * n8n (Desktop or Self-hosted)
 * MongoDB Atlas Account (with Vector Search index enabled)
-* *Optional:* NVIDIA GPU with CUDA for faster training.
 
 ---
 
 ## 📦 Setup Instructions
 
-### 1. Clone & Environment
+### 1. Environment Setup
 
 ```bash
 git clone <your-repo-url>
 cd palm-oil-ai
 python -m venv venv
 # Windows: venv\Scripts\activate | Mac: source venv/bin/activate
-pip install -r requirements.txt
+pip install ultralytics fastapi uvicorn streamlit python-multipart pillow
 
 ```
 
-### 2. Dataset Preparation
+### 2. Dataset & Training
 
-1. Download the dataset from [Roboflow Universe](https://www.google.com/search?q=https://universe.roboflow.com/assignment-vvtq7/oil-palm-ripeness/).
-2. Unzip into the `/datasets` folder.
-3. Ensure your `data.yaml` matches the local paths:
-```yaml
-train: ../datasets/train/images
-val: ../datasets/valid/images
+1. Download the dataset from [Roboflow](https://universe.roboflow.com/assignment-vvtq7/oil-palm-ripeness/dataset/5/download/yolov8)
+   *(Alternatively, source your own dataset; keep the file/folder structure consistent with the YOLOv8 format, especially the `data.yaml` file.)*
+2. Extract into `/datasets`.
+3. **Train the model:**
+```bash
+python train_script.py
 
 ```
 
 
+4. Copy the resulting `best.pt` from `runs/detect/train/weights/` to the project root.
 
-### 3. Training the Model
+### 3. Running the Demo (Streamlit)
 
-To train locally without hanging your PC, use the throttled script:
 To launch the interactive dashboard for a live demo:
 
 ```bash
-python train_script.py
+streamlit run demo_app.py
 
 ```
 
-* **Outputs:** The best model will be saved at `runs/detect/train/weights/best.pt`.
-* **Move it:** Copy `best.pt` to the root directory for the server to use.
+* **Local URL:** `http://localhost:8501`
+
+### 4. Running the API for n8n
 
-### 4. Running the Inference Server
+To connect your AI to n8n workflows:
 
 ```bash
 python main.py
 
 ```
 
-The server will start at `http://localhost:8000`.
-
-* **Endpoint:** `POST /detect`
-* **Payload:** Multipart Form-data (Key: `file`, Value: `image.jpg`)
-
----
-
-## 🤖 n8n Integration
-
-The n8n workflow follows this logic:
-
-1. **Trigger:** Receives image (Telegram/Webhook).
-2. **HTTP Request:** Sends image to `localhost:8000/detect`.
-3. **MongoDB Node:** Performs Vector Search using the returned embedding.
-4. **Agent Logic:** Final ripeness determination based on model confidence + DB similarity.
+* **Endpoint:** `POST http://localhost:8000/detect`
+* **Payload:** Form-data with key `file`.
 
 ---
 
 ## 📂 Repository Structure
 
 ```text
-├── datasets/           # Labeled images from Roboflow
-├── runs/               # YOLO training logs and weights
-├── main.py             # FastAPI Inference Server
-├── train_script.py     # Local training configuration
-├── best.pt             # The "Brain" (Trained Model)
-├── requirements.txt    # dependencies
-└── README.md           # You are here
+├── datasets/           # Labeled images (Train/Valid/Test)
+├── runs/               # YOLO training logs and output weights
+├── main.py             # FastAPI Inference Server (for n8n)
+├── demo_app.py         # Streamlit Dashboard (for demos)
+├── train_script.py     # Throttled training configuration
+├── best.pt             # THE BRAIN: The trained model weights
+└── README.md           # Documentation
 
 ```
 
----
-
-## 📝 Future Improvements
-
-* Implement **CLIP** embeddings for higher-accuracy vector similarity.
-* Add a **Streamlit** dashboard for manual batch verification.
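The `/detect` endpoint described above returns JSON in the shape produced by `main.py` in this commit. A minimal client-side parsing sketch; the per-detection keys `class` and `confidence` are illustrative assumptions, since the full box payload is truncated in the diff:

```python
import json

# Example /detect response (top-level shape taken from main.py in this
# commit; the per-detection keys are illustrative assumptions).
raw = """{
  "status": "success",
  "current_threshold": 0.25,
  "data": [
    {"class": "ripe", "confidence": 0.91},
    {"class": "unripe", "confidence": 0.47}
  ]
}"""

resp = json.loads(raw)
# Pick the highest-confidence detection, as an n8n agent step might.
top = max(resp["data"], key=lambda d: d["confidence"])
print(resp["status"])             # success
print(resp["current_threshold"])  # 0.25
print(top["class"])               # ripe
```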

main.py  +29 −11

@@ -1,27 +1,45 @@
-from fastapi import FastAPI, File, UploadFile
+from fastapi import FastAPI, File, UploadFile, Body
 from ultralytics import YOLO
 import io
-import torch
 from PIL import Image
 
 app = FastAPI()
 
-# Load your custom trained model
+# 1. Load your custom trained model
 model = YOLO('best.pt') 
 
+# 2. Global state for the confidence threshold
+# Defaulting to 0.25 (YOLO's internal default)
+current_conf = 0.25
+
+@app.get("/get_confidence")
+async def get_confidence():
+    """Returns the current confidence threshold used by the model."""
+    return {
+        "status": "success",
+        "current_confidence": current_conf,
+        "model_version": "best.pt"
+    }
+
+@app.post("/set_confidence")
+async def set_confidence(threshold: float = Body(..., embed=True)):
+    """Updates the confidence threshold globally."""
+    global current_conf
+    if 0.0 <= threshold <= 1.0:
+        current_conf = threshold
+        return {"status": "success", "new_confidence": current_conf}
+    else:
+        return {"status": "error", "message": "Threshold must be between 0.0 and 1.0"}
+
 @app.post("/detect")
 async def detect_ripeness(file: UploadFile = File(...)):
     image_bytes = await file.read()
     img = Image.open(io.BytesIO(image_bytes))
 
-    # 1. Run YOLO detection
-    results = model(img)
+    # 3. Apply the dynamic threshold to the inference
+    results = model(img, conf=current_conf)
 
-    # 2. Extract Detections and the 'Embedding'
-    # We use the feature map from the model as a vector
     detections = []
-    # Using the last hidden layer or a flattened feature map as a 'pseudo-vector'
-    # For a true vector, we'd usually use a CLIP model, but for now, we'll return detection data
     for r in results:
         for box in r.boxes:
             detections.append({
@@ -32,8 +50,8 @@ async def detect_ripeness(file: UploadFile = File(...)):
 
     return {
         "status": "success", 
-        "data": detections,
-        "message": "Model processed palm oil FFB successfully"
+        "current_threshold": current_conf,
+        "data": detections
     }
 
 if __name__ == "__main__":