A service that detects whether a given image of a palm fruit bunch is ripe. An R&D initiative for Swopt.


🌴 Palm Oil Ripeness Agent (n8n + YOLOv8)

This project uses a custom-trained YOLOv8 model to detect the ripeness of Palm Oil Fresh Fruit Bunches (FFB). It features a local Python FastAPI server and a Streamlit Dashboard, both integrated into an agentic n8n workflow.

🚀 Project Overview

  1. Vision Engine: YOLOv8 Nano (Custom-trained on MPOB-standard datasets).
  2. Inference Server: FastAPI (Python) for n8n integration.
  3. Demo Dashboard: Streamlit UI for drag-and-drop batch testing.
  4. Intelligence: MongoDB Atlas Vector Search for similarity-based reasoning.
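The "similarity-based reasoning" in the last item boils down to comparing embedding vectors; cosine similarity is one of the distance metrics MongoDB Atlas Vector Search supports. As a minimal, dependency-free illustration of that metric (the project's actual embedding pipeline is not shown here):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors, in [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0  # guard against zero vectors
```

In production the ranking runs server-side inside Atlas; this sketch only shows the math being applied.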

🛠 Prerequisites

  • Python 3.10+
  • n8n (Desktop or Self-hosted)
  • MongoDB Atlas Account (with Vector Search index enabled)

📦 Setup Instructions

1. Environment Setup

```bash
git clone <your-repo-url>
cd palm-oil-ai
python -m venv venv
# Windows: venv\Scripts\activate | macOS/Linux: source venv/bin/activate
pip install ultralytics fastapi uvicorn streamlit python-multipart pillow
```

2. Dataset & Training

  1. Download the dataset from Roboflow, or use your own dataset source. Make sure the file/folder structure is consistent, especially the `.yaml` file.
  2. Extract it into /datasets.
  3. Train the model:

     python train_script.py

  4. Copy the resulting best.pt from runs/detect/train/weights/ to the project root.
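train_script.py itself is not reproduced in this README, but with the ultralytics package a throttled training run typically looks like the sketch below. The dataset path and hyperparameters are assumptions for illustration, not the project's actual configuration:

```python
# Hypothetical sketch of a throttled YOLOv8 training run.
TRAIN_CONFIG = {
    "data": "datasets/data.yaml",  # assumed location of the dataset's .yaml config
    "epochs": 50,                  # assumed epoch count
    "imgsz": 640,                  # standard YOLOv8 input size
    "batch": 8,                    # small batch size to throttle resource usage
}

def train(config: dict = TRAIN_CONFIG):
    # Imported lazily so this sketch can be read/loaded without ultralytics installed.
    from ultralytics import YOLO
    model = YOLO("yolov8n.pt")     # start from the pretrained nano checkpoint
    return model.train(**config)   # best.pt ends up in runs/detect/train/weights/

if __name__ == "__main__":
    train()
```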

3. Running the API Server (DDD Structure)

The new architecture decouples the vision logic from the API entry point.

```bash
# Run the FastAPI server from the src directory
python -m src.api.main
```

By default, the server runs on http://localhost:8000.
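n8n talks to this server over plain HTTP. The exact route lives in src/api/main.py and is not shown here; assuming a hypothetical /predict endpoint that accepts an image upload, the request body is ordinary multipart/form-data, which can be built with the standard library alone:

```python
import uuid

def build_multipart(field_name: str, filename: str, payload: bytes,
                    content_type: str = "image/jpeg"):
    """Build a multipart/form-data body for POSTing an image to the server.

    The /predict route name is an assumption; check src/api/main.py for the real one.
    """
    boundary = uuid.uuid4().hex
    body = (
        f"--{boundary}\r\n"
        f'Content-Disposition: form-data; name="{field_name}"; filename="{filename}"\r\n'
        f"Content-Type: {content_type}\r\n\r\n"
    ).encode() + payload + f"\r\n--{boundary}--\r\n".encode()
    headers = {"Content-Type": f"multipart/form-data; boundary={boundary}"}
    return body, headers
```

In practice n8n's HTTP Request node does this encoding for you; the sketch just shows what crosses the wire to http://localhost:8000.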

4. Running the Streamlit Dashboard

The Streamlit app provides a drag-and-drop interface for manual batch testing.

```bash
# Run the Streamlit app
streamlit run demo_app.py
```

---
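The real dashboard lives in demo_app.py; as a hedged sketch of what a drag-and-drop batch tester looks like in Streamlit (widget names and layout below are assumptions, not the actual app):

```python
# Hypothetical Streamlit batch-testing sketch; demo_app.py's real contents may differ.
import io

def run_dashboard():
    # Imports kept inside the function so this file can be imported without
    # streamlit/ultralytics installed.
    import streamlit as st
    from PIL import Image
    from ultralytics import YOLO

    st.title("Palm Oil Ripeness Demo")
    model = YOLO("best.pt")  # the custom-trained weights in the project root

    uploads = st.file_uploader(
        "Drop FFB images here", type=["jpg", "jpeg", "png"],
        accept_multiple_files=True,
    )
    for upload in uploads or []:
        image = Image.open(io.BytesIO(upload.read()))
        results = model.predict(image)                  # run inference on one image
        st.image(results[0].plot(), channels="BGR",     # plot() returns a BGR array
                 caption=upload.name)

if __name__ == "__main__":
    run_dashboard()
```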

## 📂 Repository Structure

```text
├── datasets/           # Labeled images (Train/Valid/Test)
├── runs/               # YOLO training logs and output weights
├── src/                # DDD modules (FastAPI entry point at src/api/main.py)
├── main.py             # FastAPI Inference Server (for n8n)
├── demo_app.py         # Streamlit Dashboard (for demos)
├── train_script.py     # Throttled training configuration
├── best.pt             # THE BRAIN: The trained model weights
└── README.md           # Documentation
```