Fruit Detection System Using Deep Learning


In this project, we have implemented a fruit detection system for the automatic harvesting of fruits. The system detects mature strawberries on the plant so that a robot can pluck them and collect them in a tray. Image processing was carried out in Matlab, and a deep neural network was used for training. The proposed algorithm performs well and compares favorably with other algorithms in terms of accuracy.

Block Diagram of Fruit detection system using Deep Learning

The proposed method consists of four modules:

1) Base network image classifier: a set of convolutional layers for basic image classification.

2) Multi-scale feature maps for detection: additional convolutional feature layers of progressively decreasing size augment the truncated base network, allowing detections to be predicted at multiple scales.

3) Convolutional predictors for detection: a set of convolutional feature layers that generate a fixed set of predictions using a set of convolutional filters.

4) Default boxes and aspect ratios: layers that generate default bounding boxes, associate them with feature-map cells, and compute per-class scores for each box.
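The default-box step above can be sketched in a few lines. The scale formula follows the SSD paper; the feature-map sizes and aspect ratios below are illustrative assumptions, not values taken from this project.

```python
import numpy as np

def default_boxes(feature_map_sizes, aspect_ratios, s_min=0.2, s_max=0.9):
    """Generate SSD-style default boxes (cx, cy, w, h), normalized to [0, 1].

    feature_map_sizes and aspect_ratios here are illustrative assumptions.
    """
    m = len(feature_map_sizes)
    boxes = []
    for k, fk in enumerate(feature_map_sizes):
        # Scale of the default boxes for this feature map (linear schedule).
        s_k = s_min + (s_max - s_min) * k / max(m - 1, 1)
        for i in range(fk):
            for j in range(fk):
                # Center of each feature-map cell, normalized to [0, 1].
                cx, cy = (j + 0.5) / fk, (i + 0.5) / fk
                for ar in aspect_ratios:
                    boxes.append([cx, cy, s_k * np.sqrt(ar), s_k / np.sqrt(ar)])
    return np.array(boxes)

# Three progressively smaller feature maps, two aspect ratios each:
# (8*8 + 4*4 + 2*2) cells x 2 boxes = 168 default boxes.
boxes = default_boxes([8, 4, 2], aspect_ratios=[1.0, 2.0])
print(boxes.shape)  # (168, 4)
```

At detection time, each of these default boxes would receive per-class scores and offset adjustments from the convolutional predictors described in module 3.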

Tool used: Matlab 2018b

Toolbox required: Deep Learning Toolbox


Our system strives to be fast, precise, and affordable to encourage easy adoption and high efficiency. We implement the state-of-the-art SSD (Single Shot MultiBox Detector) neural network framework as a fruit detector, use a sparse, three-layer convolutional classifier as the base network, and alter the network in various ways to increase speed and precision.
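The idea of a small base network with progressively shrinking feature maps can be illustrated with a toy numpy sketch. The kernel sizes, strides, and random weights below are illustrative assumptions, not the project's actual network.

```python
import numpy as np

def conv2d(x, kernel, stride=2):
    """Valid 2-D cross-correlation with stride and ReLU; a toy conv layer."""
    kh, kw = kernel.shape
    oh = (x.shape[0] - kh) // stride + 1
    ow = (x.shape[1] - kw) // stride + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            patch = x[i * stride:i * stride + kh, j * stride:j * stride + kw]
            out[i, j] = max(0.0, np.sum(patch * kernel))  # ReLU activation
    return out

rng = np.random.default_rng(0)
x = rng.standard_normal((64, 64))  # toy grayscale input image
for layer in range(3):             # three convolutional layers
    x = conv2d(x, rng.standard_normal((3, 3)), stride=2)
    print(f"layer {layer + 1}: feature map {x.shape}")
# layer 1: (31, 31), layer 2: (15, 15), layer 3: (7, 7)
```

Each successive layer halves the spatial resolution, which is what lets the multi-scale detection heads in an SSD-style model attach to feature maps of different sizes.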

Reference Paper

[1] A Strawberry Detection System Using Convolutional Neural Networks

Published in: 2018 IEEE International Conference on Big Data (Big Data)
