PRE2019 4 Group2

From Control Systems Technology Group

Leighton van Gellecom, Hilde van Esch, Timon Heuwekemeijer, Karla Gloudemans, Tom van Leeuwen


Problem statement


Current farming methods such as monocropping are outdated and negatively affect soil quality, greenhouse gas emissions, the spread of invasive species, and the incidence of crop diseases and pests (Plourde et al., 2013). Moreover, this approach brings second-order problems, such as the reliance on herbicides for weed removal. Some seek the answer to such problems in the rise of precision farming, which promises to reduce waste and thereby cut both private and environmental costs (Finger et al., 2019). Others look further into the future and consider agroforestry. In the book Agroforestry Implications (Koh, 2010) the following definition is used: “agroforestry is loosely defined as production systems or practices that integrate trees with agricultural crops or livestock”. The author argues that agroforestry offers a compromise between expanding production and maintaining the potential for forest protection, preserving biodiversity, and alleviating poverty.

Agroforestry is, however, labor intensive, so the need arises for automation to take over some tasks. One such task is weed identification and removal. Definitions of weeds vary: Ferreira et al. (2017) define weeds as “undesirable plants that grow in agricultural crops, competing for elements such as sunlight and water, causing losses to crop yields”, while Tang et al. (2016) define them by their features: a fast growth rate, greater growth increment, and competition for resources such as water, fertilizer and space. The main conclusion that can be drawn from these definitions is that weeds harm agricultural crops and therefore need to be removed.

Such a weeding robot, or even a general-purpose machine, would need many different modules that complete their tasks independently while also communicating with each other. This research restricts itself to weed detection in an agroforestry setting, where plants grow between different (fruit) trees. The aim is to identify weeds by means of computer vision.

Users


Farmers who adopt a sustainable farming method differ significantly from conventional farmers in personal characteristics. Sustainable farmers tend to have a higher level of education, be younger, have more off-farm income, and adopt more new farming practices (Comer et al., 1999). This suggests that sustainable farmers are more likely not to have started out as farmers. More off-farm income also indicates limited time to devote to the farm. Their willingness to adopt new farming practices could benefit our software, as it makes the software more likely to be accepted and tried out.

There is a growing trend toward sustainable farming, supported by the EU, which has set goals for sustainable farming and promotes corresponding guidelines (Ministerie van Landbouw, Natuur en Voedselkwaliteit, 2019).

Agroforestry makes the removal of weeds more difficult because of the mixed crops. Weeding is a physically heavy and tedious job. For these reasons there is a growing need for weeding systems among farmers who have made the transition to agroforestry, as experienced by Marius Moonen, co-owner of CSSF.

Spraying pesticides preventively reduces food quality and poses the problem of environmental pollution (Tang et al., 2016). This means that the users of the weed detection software would not only be sustainable farmers and the engineers of weeding robots, but also, indirectly, the consumers of farming products, since it influences their food and environment. Since the approaches and views of sustainable farmers may differ, one of the requirements of the system is that it is flexible in what it regards as weeds and what as useful plants (Perrins, Williamson, & Fitter, 1992). Furthermore, given the set-up of agroforestry, it should be able to deal with different kinds of plants in a small area.


State of the art



Articles Hilde:

a. Subject: combat of unwanted plants using detection by deep learning

Combating unwanted potato plants is an intensive and boring task for farmers, which they would gladly leave to robots. Until now this was impossible, since robots could not distinguish between potato and sugar beet plants. Using deep learning, this now succeeds with a 96% success rate. A robot was developed that drives over the land and takes pictures, which are sent to a KPN cloud over 5G. The pictures are then analysed by the deep learning algorithm, and the result is sent back to the robot. The algorithm was trained on a dataset of about 5500 labelled pictures of potato and sugar beet plants. Next, the robot combats the plants detected as unwanted potato plants using a spraying unit, which is instructed by the system. This development is already a big step forward, but the error rate is still too large for the system to be put into practice.

Booij, J., Nieuwenhuizen, A., van Boheemen, K., de Visser, C., Veldhuisen, B., Vroegop, A., ... Ruigrok, T. (2020). 5G Fieldlab Rural Drenthe: duurzame en autonome onkruidbestrijding [5G Fieldlab Rural Drenthe: sustainable and autonomous weed control]. (Rapport / Stichting Wageningen Research, Wageningen Plant Research, Business unit Agrosysteemkunde; No. WPR). Wageningen: Stichting Wageningen Research, Wageningen Plant Research, Business unit Agrosysteemkunde. https://doi.org/10.18174/517141


b. Subject: detection of plant disease using deep learning

Potato blackleg is a bacterial disease of potato plants that causes the plant to decay and may spread to neighbouring plants if the diseased plant is not removed. So far, only systems that detect the disease after harvesting had been devised. In this research, a system was created that achieved 95% precision in distinguishing healthy from diseased potato plants. The system consists of a deep learning algorithm: a neural network trained on a dataset of 532 labelled images. There is a downside, however: the system was devised, and trained, to detect plants that are separate and do not overlap, which in most scenarios is not the case. Further development is needed before the system can be used in all scenarios. In addition, it proved difficult to obtain enough labelled images of the plants.

Afonso, M. V., Blok, P. M., Polder, G., van der Wolf, J. M., & Kamp, J. A. L. M. (2019). Blackleg Detection in Potato Plants using Convolutional Neural Networks. Paper presented at 6th IFAC Conference on Sensing, Control and Automation Technologies for Agriculture, AgriControl 2019, Sydney, Australia.

c. Subject: Broad-leaved dock weed plant recognition based on feature extraction and images

Most weed recognition and detection systems designed up to now are built for a single purpose or context. Plants are generally considered weeds when they either compete with the crops or are harmful to livestock. Weeds are traditionally fought mostly with pesticides, but this diminishes the quality of the crops. The broad-leaved dock is one of the most common grassland weeds, and this research aims to create a general recognition system for this weed. The designed system relies on images and feature extraction, instead of the classical choice of neural networks, and achieved an 89% accuracy.

Kounalakis, T., Triantafyllidis, G. A., & Nalpantidis, L. (2018). Image-based recognition framework for robotic weed control systems. Multimedia Tools and Applications, 77(8), 9567-9594. https://doi.org/10.1007/s11042-017-5337-y

d. Subject: Plant classification of 22 species using feature extraction on the leaves

This paper describes a method to classify plants based on 15 features of their leaves, yielding an 85% accuracy for the classification of 22 species with a training dataset of 660 images. The algorithm is based on feature extraction, with the help of the Canny edge detector and an SVM classifier.

Salman, A., Semwal, A., Bhatt, U., & Thakkar, V. M. (2017). Leaf classification and identification using Canny Edge Detector and SVM classifier. 2017 International Conference on Inventive Systems and Control (ICISC), Coimbatore, pp. 1-4.
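The paper's 15 leaf features are not reproduced here; as a rough illustration of feature extraction from a segmented leaf, the sketch below computes three simple shape descriptors from a binary leaf mask (the feature choices are our own stand-ins, not the paper's):

```python
def leaf_features(mask):
    """Compute simple shape descriptors from a binary leaf mask.

    `mask` is a list of rows of 0/1 values (1 = leaf pixel). The three
    features here (area, bounding-box aspect ratio, fill ratio) are
    illustrative stand-ins for the 15 morphological leaf features used
    in the paper, which are not listed in this summary.
    """
    rows = [r for r, row in enumerate(mask) if any(row)]
    cols = [c for row in mask for c, v in enumerate(row) if v]
    area = sum(sum(row) for row in mask)          # number of leaf pixels
    height = max(rows) - min(rows) + 1            # bounding-box height
    width = max(cols) - min(cols) + 1             # bounding-box width
    return {
        "area": area,
        "aspect_ratio": width / height,
        "fill_ratio": area / (width * height),    # how much of the box is leaf
    }
```

A feature vector like this would then be fed to the SVM classifier for training and prediction.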

e. Subject: weed classification

A weed is a plant that is unwanted at the place where it grows. This is a rather broad definition, though, and this research therefore surveyed which plants are regarded as weeds by 56 scientists. Views differed greatly among them. It is therefore not possible to classify plants cleanly into weeds and non-weeds: the classification depends on a person's views and on the plant's context.

Perrins, J., Williamson, M., & Fitter, A. (1992). A survey of differing views of weed classification: implications for regulation of introductions. Biological Conservation, 60(1), 47-56.


(Karla: Working on the following articles:)

a. Hemming, J., Blok, P., & Ruizendaal, J. (2018). Precisietechnologie Tuinbouw: PPS Autonoom onkruid verwijderen: Eindrapportage. (Rapport WPR; No. 750). Bleiswijk: Wageningen Plant Research, Business unit Glastuinbouw. https://doi.org/10.18174/442083

b. Hemming, J., Barth, R., & Nieuwenhuizen, A. T. (2013). Automatisch onkruid bestrijden PPL-094 : doorontwikkelen algoritmes voor herkenning onkruid in uien, peen en spinazie. Wageningen: Plant Research International, Business Unit Agrosysteemkunde.

c. Bawden, O., Kulk, J., Russell, R., McCool, C., English, A., Dayoub, F., . . . Perez, T. (2017). Robot for weed species plant-specific management. Journal of Field Robotics, 34(6), 1179-1199. doi:10.1002/rob.21727

d. Duong, L.T., Nguyen, P.T., Sipio, C., Ruscio, D. (2020). Automated fruit recognition using EfficientNet and MixNet. Computers and Electronics in Agriculture, 171. https://doi.org/10.1016/j.compag.2020.105326

e. Carvalho, L., & Von Wangenheim, A. (2019). 3d object recognition and classification: A systematic literature review. Pattern Analysis and Applications, 22(4), 1243-1292. doi:10.1007/s10044-019-00804-4


Articles Leighton is currently working on:


a. Piron, A., van der Heijden, F., & Destain, M. F. (2011). Weed detection in 3D images. Precision Agriculture, 12, 607-622. https://doi.org/10.1007/s11119-010-9205-2

The researchers distinguish two types of problems: detecting weeds between rows (or, more generally, structurally placed crops), and detecting weeds among crops at random positions. Computer vision has led to successful discrimination between weeds and rows of crops: knowing where, and in which patterns, crops are expected to grow, and assuming everything outside that region is a weed, has proven successful. This study shows that plant height is a discriminating factor between crop and weed at early growth stages, since these plants grow at different speeds; an approach with three-dimensional images is used to measure it. The classification is by far not robust enough, but the study shows that plant height is a key feature. The researchers also note that camera position and ground irregularities negatively influence classification accuracy.
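The height-based discrimination described above can be sketched as a simple threshold rule. This is a deliberately naive illustration, not the paper's actual classifier; the `crop_min_height` parameter and the position-to-height mapping are our assumptions:

```python
def classify_by_height(height_map, crop_min_height):
    """Label each detected plant as crop or weed by measured height.

    `height_map` maps plant positions (x, y) to a height in cm, as might
    be extracted from a 3D image. Plants at least `crop_min_height` tall
    are labelled crops, shorter ones weeds. A naive sketch of the idea
    that crops outgrow weeds at early growth stages; a real system must
    also account for camera pose and ground irregularities, which the
    paper reports degrade accuracy.
    """
    return {pos: ("crop" if h >= crop_min_height else "weed")
            for pos, h in height_map.items()}
```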

b. Dos Santos Ferreira, A., Matte Freitas, D., Gonçalves da Silva, G., Pistori, H., & Theophilo Folhes, M. (2017). Weed detection in soybean crops using convnets. Computers and Electronics in Agriculture, 143, 314-324. doi:10.1016/j.compag.2017.10.027

The authors of this paper define weeds as “undesirable plants that grow in agricultural crops, competing for elements such as sunlight and water, causing losses to crop yields”. Data was created by taking pictures with a drone at a height of 4 meters above ground level. The approach uses convolutional neural networks and achieved high accuracy in discriminating different types of weeds. Compared to traditional neural networks and support vector machines, deep learning has the key advantage that feature extraction is learned automatically from raw data, so it requires little manual effort. Convolutional neural networks have proven successful in image recognition. For image segmentation, the simple linear iterative clustering (SLIC) algorithm is used, which is based on the k-means centroid-based clustering algorithm. The goal was to separate the image into segments containing multiple leaves of soy or weeds. Importantly, the pictures have a high resolution of 4000 by 3000 pixels, and segmentation was significantly influenced by lighting conditions. The convolutional neural network consists of 8 layers: 5 convolutional layers and 3 fully connected layers. The last layer uses softmax to produce the probability distribution, and ReLU is used for the outputs of the convolutional and fully connected layers. The classification of the segments was highly robust and gave superior results to other approaches such as random forests and support vector machines. If a confidence threshold of 0.98 is set, then 96.3% of the images are classified correctly and none are identified incorrectly.
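The two activation functions named above are standard building blocks; as a minimal sketch of the network's final stages, ReLU and the softmax that produces the class probability distribution (which the paper thresholds at 0.98) can be implemented as:

```python
import math

def relu(xs):
    """ReLU activation, applied to the outputs of the convolutional
    and fully connected layers: negative values are clipped to zero."""
    return [max(0.0, x) for x in xs]

def softmax(xs):
    """Softmax over the final layer's raw scores, producing the class
    probability distribution from which the most confident class is
    taken (and thresholded, e.g. at 0.98, to reject uncertain cases)."""
    m = max(xs)                            # subtract max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]
```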


c. Alchanatis, V., Ridel, L., Hetzroni, A., & Yaroslavsky, L. (2005). Weed detection in multi-spectral images of cotton fields. Computers and Electronics in Agriculture, 47(3), 243-260. doi:10.1016/j.compag.2004.11.019

Different approaches exist: machine vision methods and spectroscopic methods (utilizing spectral reflectance or absorbance patterns). Spectroscopic methods can measure features such as water and moisture content. Field studies have shown that weeds and agricultural crops can be distinguished by their relative spectral reflectance characteristics. The researchers propose an image processing algorithm based on image texture to discriminate weeds from cotton. They used hyperspectral images to perform basic segmentation between crop and soil. The authors used a robust statistics algorithm yielding an average false alarm rate of 15% (worse than newer approaches; perhaps link this to the weed detection in soybeans article).

d. Yu, J., Schumann, A., Cao, Z., Sharpe, S., & Boyd, N. (2019). Weed detection in perennial ryegrass with deep learning convolutional neural network. Frontiers in Plant Science, 10, 1422-1422. doi:10.3389/fpls.2019.01422

The researchers note that deep convolutional neural networks (DCNNs) take much time to train (hours) but little time to classify (under a second). The authors compared different existing DCNNs for weed detection in perennial ryegrass, including discrimination between different weeds. Given the recency of the paper and its comparison across approaches, it gives a good estimate of the current state of the art. The best results exceed 0.98. It also shows weed detection in perennial ryegrass, so not in perfectly aligned crops. However, only the distinction between ryegrass and weeds is made; for robotics applications in agroforestry, different plants need to be discriminated from different weeds (a gap for us to research).

e. Tang, J., Chen, X., Miao, R., & Wang, D. (2016). Weed detection using image processing under different illumination for site-specific areas spraying. Computers and Electronics in Agriculture, 122, 103-111. doi:10.1016/j.compag.2015.12.016

Weeds share particular features: a fast growth rate, greater growth increment, and competition for resources such as water, fertilizer and space. These features are harmful to crop growth. Many line detection algorithms use Hough transformations or the perspective method. Hough transformations are robust, but the perspective method cannot accurately calculate the position of the crop lines at the sides of an image. This research proposes combining the vertical projection method and a linear scanning method to reduce the shortcomings of other approaches. Roughly, the pictures are first transformed into binary black-and-white images to control for different illumination conditions, and then a line is drawn between the bottom and top of the image such that the number of white pixels on it is maximized. In contrast to other methods, this method runs in real time and its accuracy is relatively high.
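The vertical projection step described above can be sketched as follows, assuming the image has already been binarized (plant pixels 1, background 0). The paper's full method also scans slanted candidate lines between the bottom and top of the image, which this sketch omits:

```python
def crop_row_column(binary_image):
    """Vertical projection for crop-row detection: count the white
    (plant) pixels in each column and return the index of the column
    with the highest count, i.e. the most likely crop-row position.

    `binary_image` is a list of equal-length rows of 0/1 values.
    A simplified sketch: the published method also scans slanted
    lines, not only vertical columns.
    """
    counts = [sum(col) for col in zip(*binary_image)]  # per-column white-pixel counts
    return max(range(len(counts)), key=counts.__getitem__)
```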


Tom's articles:

Two articles investigate the use of specific signalling compounds to mark desired plants so that weeds can be removed. This creates a way of marking plants with a "machine-readable signal", which could thus be used for automatic classification of plants. According to one of the studies, an accuracy of at least 98% was achieved in detecting weeds and crops. --This is good state of the art (very recent), but it has not actually been automated yet; it only shows the possibility of the method.--

a. doi: https://dx.doi.org/10.1016/j.biosystemseng.2020.02.011 b. doi: https://dx.doi.org/10.1016/j.biosystemseng.2020.02.002


This article describes how the farm environment and design were modified to best suit a robot harvester. It took into account what kind of harvesting is possible for a robot and what is possible for different crops, and then tried to determine how the robot could best do its job.

c. doi: https://dx.doi.org/10.1016/j.biosystemseng.2020.01.021

The next article presents a vision and control system that was able to remove most weeds from an area without explicitly specified visual features of crops and weeds. It achieved a crop detection accuracy of 97.8% and was able to remove 83% of the weeds around plants. This seems to have been a very controlled setting, however, and the system still targets mainly simple farms.

d. doi: https://doi.org/10.1016/j.biosystemseng.2020.03.022

This last article tries to improve vision-based weed control by taking a slower approach to visual processing and decision-making. It uses multiple cameras, but these are overhead cameras, which are not suited to all types of crops. It does use 3D vision, so the camera position might be modifiable. --It has been tested on sugar beets, so nothing too special yet. Also, this machine is gigantic.--

e. doi: https://doi.org/10.1002/rob.21938

Timon:

Gašparović, M., Zrinjski, M., Barković, Đ., & Radočaj, D. (2020). An automatic method for weed mapping in oat fields based on UAV imagery. Computers and Electronics in Agriculture, 173, 105385. https://doi.org/10.1016/j.compag.2020.105385

This paper discusses the use of unmanned aerial vehicles (UAVs) to acquire spatial data that can be used to locate weeds. Four classification algorithms are tested, all based on the random forest machine learning algorithm, which was suggested to be the best algorithm for automated classification, since it requires few parameters, in: Belgiu, M., & Drăguţ, L. (2016). Random forest in remote sensing: a review of applications and future directions. ISPRS Journal of Photogrammetry and Remote Sensing, 114, 24-31. https://doi.org/10.1016/j.isprsjprs.2016.01.011. Random forests were originally proposed in: Breiman, L. (2001). Random Forests. Machine Learning, 45, 5-32. https://doi.org/10.1023/A:1010933404324
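The core of a random forest's prediction step, majority voting over an ensemble of decision trees, can be sketched as follows. The stand-in "trees" here are plain callables for illustration; a real forest trains each tree on a bootstrap sample with random feature subsets, as described by Breiman (2001):

```python
from collections import Counter

def forest_predict(trees, sample):
    """Random-forest prediction as a majority vote.

    `trees` is a list of callables, each mapping a feature vector to a
    class label; the label chosen by the most trees wins. This sketches
    only the voting step: training (bootstrap sampling and random
    feature selection per tree) is omitted.
    """
    votes = Counter(tree(sample) for tree in trees)
    return votes.most_common(1)[0][0]
```

For example, an ensemble in which two of three trees label a pixel "weed" would return "weed" for that pixel.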

Espejo-Garcia, B., Mylonas, N., Athanasakos, L., Fountas, S., & Vasilakoglou, I. (2020). Towards weeds identification assistance through transfer learning. Computers and Electronics in Agriculture, 171, 105306. https://doi.org/10.1016/j.compag.2020.105306. This article deals with weed classification through transfer learning: pre-trained convolutional neural networks are combined with more "traditional" machine learning classifiers in order to avoid overfitting and provide consistent, robust performance. This yields some impressively accurate classification algorithms.

Li, Y., Wang, H., Dang, L. M., Sadeghi-Niaraki, A., & Moon, H. (2020). Crop pest recognition in natural scenes using convolutional neural networks. Computers and Electronics in Agriculture, 169, 105174. https://doi.org/10.1016/j.compag.2019.105174. This article deals with using convolutional neural networks to recognize crop pests. The authors found that GoogLeNet performed best.

Riehle, D., Reiser, D., & Griepentrog, H. W. (2020). Robust index-based semantic plant/background segmentation for RGB images. Computers and Electronics in Agriculture, 169, 105201. https://doi.org/10.1016/j.compag.2019.105201. This article presents a novel index-based algorithm for plant/background segmentation in RGB images, a key component of digital image analysis of plants. The algorithm has been shown to work despite over- or underexposure of the camera and with varying colours of crops and background, and to be more accurate and robust than other index-based approaches.

References


Comer, S., Ekanem, E., Muhammad, S., Singh, S. P., & Tegegne, F. (1999). Sustainable and conventional farmers: A comparison of socio-economic characteristics, attitude, and beliefs. Journal of Sustainable Agriculture, 15(1), 29-45.

Finger, R., Swinton, S. M., El Benni, N., & Walter, A. (2019). Precision farming at the nexus of agricultural production and the environment. Annual Review of Resource Economics, 11(1), 313-335.

Koh, L. P. (2010). Agroforestry implications. Biotropica, 42(6), 760.

Ministerie van Landbouw, Natuur en Voedselkwaliteit. (2019). Landbouwbeleid. Consulted from: https://www.rijksoverheid.nl/onderwerpen/landbouw-en-tuinbouw/landbouwbeleid

Perrins, J., Williamson, M., & Fitter, A. (1992). A survey of differing views of weed classification: implications for regulation of introductions. Biological Conservation, 60(1), 47-56.

Plourde, J. D., Pijanowski, B. C., & Pekin, B. K. (2013). Evidence for increased monoculture cropping in the central United States. Agriculture, Ecosystems and Environment, 165, 50-59.


Tang, J. L., Chen, X. Q., Miao, R. H., & Wang, D. (2016). Weed detection using image processing under different illumination for site-specific areas spraying. Computers and Electronics in Agriculture, 122, 103-111.


Who has done what



Week 1:

Name (ID) Hours Work done
Hilde van Esch (1306219) 11 Intro lecture (1 hour) + meetings (3 hours) + literature research (5 hours) + User part written (2 hours)
Leighton van Gellecom (1223623) 13 Intro lecture + group formation (1 hour) + Meetings (3 hours) + Brainstorming ideas (1 hour) + Literature research (6.5 hours) + Problem statement (1.5 hours)