Project number:
22026
Academic year:
2021-2022
Sponsor requirements:
Abstract
Proximal sensing and observation of the land surface and its cover can benefit ecosystem and crop management. These technologies are utilized across many science and engineering disciplines. In the agriculture sector, these platforms can help estimate biomass accumulation and yield, support early stress detection, weed and disease identification and control, and general monitoring.
The challenge is to collect accurate, meaningful observations and then add value to those observations in real to near-real time. UASs (unmanned aircraft systems) equipped with a variety of cameras and sensors are one of the leading approaches. Nevertheless, a major difficulty is handling the resulting large data streams in a practical and useful manner.
Current, still partial solutions rely on data post-processing, which is effective but limited: it requires complex data processing pipelines and external expertise, for example cloud-based processing that can be expensive and not cost-effective.
The objective of this senior project proposal is to develop a hardware platform (UAS + sensors) equipped with an onboard data processing, analysis, and display system capable of identifying and mapping, in real to near-real time, land cover health and status, the presence of unwanted vegetation, water stress levels, the presence of pests and diseases, etc., in an operational and effective monitoring platform.
The primary criterion for this prototype is real-time operation, with instantaneous image analysis and display of results, preferably on a UAS-based platform. The sensing, data analysis, and results display system should be versatile enough to also be deployed near the ground.
A successful prototype should be capable of processing at least a single nadir-view image from a reasonable height while hovering, with instantaneous results on:
• Land cover physical characteristics (dimension, distribution, count, biomass, etc.)
• Land cover health
• Identification of stress (water, diseases, pests)
• Detection of unwanted vegetation
• Operation as a monitoring tool, overlaying the results on a real-time feed
There are many challenges to consider:
• Precise and autonomous UAS platform
• Onboard processing of images, with transfer and display of results on the ground station
• Alternatively, transfer of data/images to a ground-based analysis and display system
• AI/ML-based data/image processing pipeline
• Ground system to coordinate and manage the platform and results
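As an illustration of one possible building block for such an AI/ML image-processing pipeline, the sketch below computes the Excess Green index (ExG = 2g − r − b on chromaticity-normalized RGB) and thresholds it to segment vegetation in a nadir color image — a common pre-step before stress classification. This is a minimal sketch under assumed conventions (function names, the 0.1 threshold, and the synthetic test frame are illustrative, not part of the project):

```python
import numpy as np

def excess_green(rgb):
    """Excess Green index (ExG = 2g - r - b) on a chromaticity-normalized
    RGB image; vegetation pixels tend to score high."""
    norm = rgb.astype(float)
    total = norm.sum(axis=2, keepdims=True)
    total[total == 0] = 1.0                  # avoid division by zero on black pixels
    r, g, b = np.moveaxis(norm / total, 2, 0)
    return 2 * g - r - b

def vegetation_mask(rgb, threshold=0.1):
    """Binary mask of likely-vegetation pixels from a nadir RGB frame."""
    return excess_green(rgb) > threshold

# Tiny synthetic frame: left half green (vegetation), right half brown (soil).
frame = np.zeros((4, 4, 3), dtype=np.uint8)
frame[:, :2] = (40, 180, 40)     # green pixels
frame[:, 2:] = (120, 90, 60)     # soil-colored pixels
mask = vegetation_mask(frame)
print(mask[:, :2].all(), mask[:, 2:].any())  # → True False
```

In practice the mask would feed a trained classifier rather than a fixed threshold, but the index keeps per-frame cost low enough for onboard, real-time use.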
Goal
Booklet description:
Vegetation stress is a key indicator of crop health and is often determined through manual field analysis. This project presents real-time drone-based analysis as a cost- and time-saving alternative. The Crop Level of Stress Analysis with Visual Export (CLOSAVE) software system detects and categorizes the level of stress in vegetation leaves.
The design is split into two elements. The primary element is the CLOSAVE software, which receives video from a drone as it flies over a crop and uses machine learning algorithms to indicate areas of stress detected on the plants. A commercial off-the-shelf color camera captures the video, and the software is pre-trained to recognize vegetation stress indicators. The second element is a drone, custom-built entirely by the team, using parts tailored to the software's design requirements.
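The annotation step described above — highlighting detected stress areas on the live feed — could be sketched as a simple alpha blend of a classifier mask onto each frame. The function name `overlay_stress` and the blending parameters are illustrative assumptions, not the team's actual implementation:

```python
import numpy as np

def overlay_stress(frame, stress_mask, color=(255, 0, 0), alpha=0.4):
    """Blend a highlight color into frame pixels flagged as stressed,
    producing an annotated view for the operator's real-time feed.
    `stress_mask` is a boolean HxW array from the stress classifier."""
    out = frame.astype(float)
    tint = np.array(color, dtype=float)
    out[stress_mask] = (1 - alpha) * out[stress_mask] + alpha * tint
    return out.astype(np.uint8)

# Demo: 2x2 gray frame with one "stressed" pixel highlighted in red.
frame = np.full((2, 2, 3), 100, dtype=np.uint8)
mask = np.array([[True, False], [False, False]])
annotated = overlay_stress(frame, mask)
print(annotated[0, 0], annotated[0, 1])  # blended vs. untouched pixel
```

Applying the same blend per frame of the drone's video stream yields the overlaid monitoring view the booklet describes.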
Booklet description finalized
Project video:
Sponsor Information
Organization: