Review Article

Precision Agriculture Science and Technology. 31 December 2025. 397-420
https://doi.org/10.22765/pastj.20250027


Introduction

The global population is projected to reach 9.7 billion by 2050, placing unprecedented pressure on agricultural systems to increase food production while preserving natural resources (Kabir et al., 2023; Islam et al., 2024). Simultaneously, rapid urbanization and infrastructure expansion continue to diminish cultivable land, threatening the sustainability and resilience of global food systems (Islam et al., 2023; Kabir et al., 2024). Meeting future food demands therefore requires the integration of innovative, data-driven, and resource-efficient technologies into conventional farming practices.

Vegetables play a vital role in human nutrition by supplying essential micronutrients and bioactive compounds. Among them, root crops such as potato, sweet potato, carrot, and beet are fundamental to global food security due to their adaptability, storability, and contribution to energy and dietary fiber intake (Qin et al., 2022; Chen et al., 2021). Their resilience under varying environmental conditions makes them indispensable for sustainable agricultural systems, particularly in regions experiencing land or labor constraints (Kabir et al., 2023).

In the Republic of Korea, root vegetables are both agriculturally and culturally significant (Chung et al., 2016). Potatoes and sweet potatoes serve as economically important staples integrated into traditional cuisine and local food industries (Kim et al., 2023; Park et al., 2023). Sweet potato is valued for high fiber, vitamin, and antioxidant content (Won et al., 2024; Paul et al., 2021), whereas potato provides carbohydrates, potassium, and vitamin C, contributing to energy metabolism, cardiovascular, and immune functions (Rather and Alotaibi, 2025). Carrots supply provitamin A that supports vision and immunity, while beets are associated with improved circulation and detoxification. Given the rising consumption and economic value of these crops, accurate yield estimation is essential to ensure market stability, efficient supply chain management, and national food security (Karim et al., 2025).

Yield monitoring represents a core component of precision agriculture, enabling quantitative and spatial analysis of crop productivity (Kabir et al., 2024; Reza et al., 2025). By integrating sensors, positioning systems, and data analytics, yield monitoring systems generate real-time or post-harvest data that quantify spatial yield variability across fields (Constantinescu and Sala, 2021). These data are commonly visualized as yield maps, which illustrate within-field productivity patterns and identify variability drivers such as soil fertility, moisture distribution, pest pressure, and nutrient imbalance (Kim et al., 2023; Kabir et al., 2024). Such spatial insights facilitate variable-rate management of fertilizers, pesticides, and irrigation (Dutta et al., 2020), enabling farmers to optimize input use, enhance profitability, and reduce environmental impacts. The adoption of yield monitoring thus aligns with the broader goals of sustainable intensification and data-driven decision-making in modern agriculture (Kabir et al., 2024; Kiraga et al., 2025; Olakiumide, 2021).
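The yield maps described here are, at their core, spatial aggregations of georeferenced mass measurements. A minimal sketch of binning harvest records into a per-cell yield grid is shown below; the cell size, units, and function name are illustrative, not taken from any cited system.

```python
import numpy as np

def grid_yield_map(xs, ys, masses, cell=10.0):
    """Aggregate georeferenced yield points (x, y in metres, mass in kg)
    into a coarse grid of per-cell mass totals -- a minimal yield-map sketch."""
    xs, ys, masses = map(np.asarray, (xs, ys, masses))
    ix = np.floor((xs - xs.min()) / cell).astype(int)   # column index per point
    iy = np.floor((ys - ys.min()) / cell).astype(int)   # row index per point
    grid = np.zeros((ix.max() + 1, iy.max() + 1))
    np.add.at(grid, (ix, iy), masses)                   # accumulate mass per cell
    return grid
```

In practice, each cell total would be normalized by harvested area to give yield (e.g., t/ha) before visualization as a map.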

Despite advances in precision agriculture, yield measurement of root vegetables continues to rely on manual and post-harvest weighing systems, which are labor-intensive, time-consuming, and spatially limited (Kaur et al., 2023). These conventional methods fail to provide real-time or georeferenced data during harvesting, hindering the identification of spatial yield variability and preventing immediate feedback to improve field operations (Longchamps et al., 2022). As a result, delayed and inaccurate measurements can lead to inefficient post-harvest handling, suboptimal storage, and increased crop losses during transportation. The lack of integration with mechanized harvesting systems limits scalability and reduces suitability for large-scale or cooperative-based production (Kiraga et al., 2025; Kabir et al., 2025). These challenges highlight the urgent need for automated, non-contact, and real-time yield monitoring technologies capable of accurately estimating yield while simultaneously collecting spatial data during harvesting operations.

The agricultural sector in Korea is undergoing rapid mechanization and digital transformation, driven by the need to counteract labor shortages and fragmented landholdings. Within this context, non-contact yield monitoring systems integrated into root crop harvesters or transport conveyors can provide instantaneous, field-based yield estimation, improving both operational efficiency and supply chain coordination (Kabir et al., 2024; Das et al., 2025). Real-time yield data also enable localized field management and adaptive decision-making, contributing to sustainability, profitability, and national food resilience. Therefore, the objective of this review was to analyze and summarize the development and application of non-contact yield monitoring technologies for root vegetable crops.

Overview of major root vegetables in Korea and harvesting mechanism

Root vegetables play a major role in Korean agriculture, nutrition, and food culture. Among them, potato, radish, and sweet potato are the most economically significant and widely cultivated. Potato has become one of the major root crops in Korea because of its high yield potential, adaptability to diverse environments, and rich nutritional value (Kou et al., 2023). As a staple food and an important source of farm income, potato contributes substantially to national food security (Olakiumide, 2021). Radish, particularly the white Korean radish, is deeply embedded in Korean cuisine and is essential for preparing kimchi, soups, and stews, reinforcing its cultural and agricultural relevance (Kim et al., 2019). Sweet potato is similarly important due to its versatility and health benefits; globally it ranks as the eleventh most important food crop and is valued for its high carbohydrate, fiber, vitamin, and antioxidant content (Paul et al., 2021). Together, these crops form a major component of Korea's agricultural identity and dietary system.

Despite their value, root vegetables face notable harvesting and postharvest challenges. Sweet potatoes are particularly prone to mechanical injury during harvest and are vulnerable to physiological disorders and fungal diseases such as Fusarium rot, dry rot, charcoal rot, and soft rot, which contribute to postharvest losses (Kou et al., 2023; Paul et al., 2021). Efforts to mitigate quality decline include treatments such as ethylene and 1-methylcyclopropene (1-MCP) to suppress sprouting and decay, and biocontrol agents like Trichoderma harzianum, which demonstrate strong antifungal activity (Paul et al., 2021).

Root vegetable harvesting in Korea typically involves soil digging, lifting, and conveying mechanisms. A tractor-mounted radish collector developed for upland farming uses a conveyor belt to transport harvested radishes (Chowdhury et al., 2020). Performance tests of the system on uneven terrain revealed that vibration levels increase with conveyor speed, affecting both harvester stability and yield monitoring accuracy (Reza et al., 2025). While vibrations under loaded conditions generally met ISO thresholds, the first conveyor unit exceeded recommended limits, indicating the need for improved vibration control.

Conveyor systems are integral in root vegetable harvesters, with chain conveyors and belt conveyors being the most common. Belt conveyors offer smoother and more continuous material flow, which can enhance yield monitoring consistency, but they require careful maintenance to prevent misalignment and idler wear (Won et al., 2024; Reza et al., 2025). Chain conveyors, while robust, may introduce more vibration and irregular flow, influencing measurement accuracy. Therefore, the selection and optimization of conveyor systems significantly influence harvesting efficiency, product quality, and the precision of real-time yield monitoring. Table 1 summarizes major root vegetables in Korea, their importance, harvesting mechanisms, and challenges.

Table 1.

Summary of major root vegetable crops, their economic significance, harvesting mechanisms, and key technical challenges in Korea.

| Root vegetable | Economic importance | Harvesting mechanism | Key harvesting challenges | Reference |
|---|---|---|---|---|
| Radish (Raphanus sativus L.) | Widely grown; core ingredient in kimchi, soups, and side dishes | Manual harvest, tractor-mounted diggers, and lifters | Root size variation, soil adhesion, limited mechanization | Kim et al., 2019 |
| Sweet potato (Ipomoea batatas L.) | Most widely grown root crop; key staple and health food | Plow-type harvesters, vibration diggers, double conveyors, and combine-type conveyors | High tuber damage rate, low harvesting efficiency | Won et al., 2024 |
| Carrot (Daucus carota L.) | Consumed fresh and processed; grown in highlands as a winter crop | Manual pulling, small-scale lifters, tractor-drawn belt harvesters | Brittle taproot; requires uniform soil texture and moisture | Park et al., 2023 |
| Onion and garlic | Staple crops for Korean cuisine; widely cultivated and exported | Vibratory digger harvesters, conveyor-equipped machines | Bulb bruising from vibration, soil contamination, post-harvest cleaning difficulty | Rasool et al., 2020 |
| Other root vegetables (grafted/transplanted) | Grafted vegetables for large-scale cultivation; disease resistance and high yield | Mechanized transplanters and harvesters | High initial costs; standardization issues across crop and equipment types | An et al., 2021 |

Principles and technologies of non-contact yield monitoring

Principles of non-contact measurement

Traditional yield measurement methods for root vegetables, such as manual sampling and mechanical weighing, are typically labor-intensive, time-consuming, and susceptible to human error (Kabir et al., 2023; Kiraga et al., 2025; Reza et al., 2025). Moreover, these conventional techniques often cause physical damage to crops, particularly during the harvesting of delicate root vegetables such as potatoes, sweet potatoes, carrots, and radishes. In contrast, non-contact sensing technologies provide substantial advantages by enabling continuous yield assessment without physical contact with the crop (Fu et al., 2021; Kiraga et al., 2025; Kabir et al., 2024).

Non-contact yield monitoring systems estimate harvested crop mass or volume without direct physical contact between the sensor and the produce. These systems integrate advanced sensing, imaging, and automation technologies to evaluate crop volume and perform accurate counting. A range of remote sensing platforms including unmanned aerial vehicles (UAVs), aerial imagery, light detection and ranging (LiDAR), multispectral and hyperspectral sensors, RGB and thermal cameras, and radar-based systems are used to achieve precise and scalable crop assessment (Chung et al., 2016; Zheng et al., 2022; Bidese-Puhl et al., 2023). These systems continuously monitor key crop parameters while improving operational efficiency and safety. The implementation of non-contact measurement techniques in yield monitoring processes is a critical advancement across diverse fields, offering real-time data acquisition and minimal process interference (Bidese-Puhl et al., 2023; Longchamps et al., 2022). In precision agriculture, a suite of technologies, including optical, ultrasonic, and infrared sensors integrated with the global positioning system (GPS), allows for comprehensive crop health assessment, biomass estimation, and the creation of detailed geo-referenced yield maps without damaging the crops (Zheng et al., 2022; Kabir et al., 2024). These systems provide data-driven insights, enabling farmers to optimize resource allocation, reduce waste, and improve overall productivity and sustainability. Fig. 1 illustrates the overall framework of non-contact yield monitoring for root vegetable crops, integrating the fundamental measurement principles, core sensing technologies, field deployment platforms, AI-based data processing, and practical applications.

https://cdn.apub.kr/journalsite/sites/kspa/2025-007-04/N0570070407/images/kspa_2025_074_397_F1.jpg
Fig. 1.

Comprehensive framework of principles, sensing technologies, platforms, data processing, and applications for non-contact yield monitoring in root vegetable crops.

Sensors constitute the fundamental components of non-contact measurement systems, enabling the detection, counting, and quantification of crop attributes such as volume and weight, as well as the capture of spectral reflectance and color variations (Chung et al., 2016; Fu et al., 2021; Zeng et al., 2021). These measurements facilitate the estimation of vegetation indices, plant numbers, and plant height, which are closely associated with crop health and growth status. Optical sensors comprising RGB, multispectral, and hyperspectral cameras play a pivotal role in non-contact yield estimation by acquiring detailed spectral and structural information before and during harvesting. RGB cameras provide high-resolution imagery suitable for canopy coverage analysis and plant counting, whereas multispectral and hyperspectral sensors capture reflectance data across multiple wavelengths to evaluate plant health, biomass, and growth stages (Fu et al., 2021; Zeng et al., 2021). These optical systems support yield estimation through the generation of two- and three-dimensional crop models using photogrammetry and Structure-from-Motion (SfM) techniques to reconstruct plant morphology. By analyzing canopy size, height, and density, they provide real-time, non-invasive yield predictions that enhance the efficiency of precision harvesting operations. LiDAR sensors employ laser pulses to generate detailed three-dimensional representations of crop structures, enabling the estimation of plant volume and density as well as the creation of spatial yield maps (Villordon et al., 2020). Similarly, ultrasonic and proximity sensors, often mounted on harvesting machinery, measure plant height, spacing, and canopy density based on reflected sound waves, thereby providing complementary data for non-destructive yield estimation (Tan et al., 2025).
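As an example of the vegetation indices mentioned above, NDVI is computed per pixel from near-infrared and red reflectance as (NIR − Red) / (NIR + Red). A minimal sketch follows; the small epsilon is an illustrative guard against division by zero, not part of the standard definition.

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index from per-pixel reflectance.

    Accepts scalars or arrays (e.g., whole multispectral image bands);
    values near +1 indicate dense, healthy vegetation."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)
```

Applied band-wise to a multispectral image, the result is an NDVI raster that can be thresholded to delineate canopy cover or correlated with yield.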

Automation technologies integrate advanced sensors and imaging systems into robotic platforms, UAVs, and autonomous harvesting machines to enable large-scale, real-time yield measurement (Emmi et al., 2014; Duan et al., 2019). UAV-based remote sensing facilitates rapid scanning and assessment of extensive agricultural fields, while autonomous tractors and harvesters equipped with non-contact sensors enable continuous, on-the-go yield monitoring during harvesting operations (Olson and Anderson, 2021). The integration of machine learning (ML) and artificial intelligence (AI) algorithms further enhances the interpretation of sensor and imaging data by improving crop classification accuracy, object detection, and anomaly identification (Islam et al., 2024; Wu et al., 2025). Deep learning frameworks such as convolutional neural networks (CNNs) and you only look once (YOLO) are increasingly adopted for real-time segmentation and classification of crops, providing more precise and automated yield estimations (Islam et al., 2024; Wu et al., 2025). The combination of sensor-based monitoring, advanced imaging, and automation thus establishes a non-destructive, efficient, and scalable approach to yield assessment, supporting precision agriculture and data-driven decision-making in modern crop production systems. Table 2 summarizes the non-contact technologies used to measure different parameters for yield monitoring.

Table 2.

Various technologies and sensors are employed in non-contact yield monitoring systems.

| Technology | Principle/measurement | Application in yield monitoring |
|---|---|---|
| Optical sensors | Measure reflected or transmitted light to assess plant health, growth, and stress levels | Provide insights into yield potential and plant vigor |
| Cameras/imaging | Capture high-resolution images (visual, multispectral, hyperspectral, etc.) for processing with computer vision techniques | Used for counting plants/fruits, identifying disease, and assessing overall crop status |
| LiDAR | Uses laser pulses to measure distance and create detailed 3D maps | Used for precise topographic mapping, vegetation structure analysis, and crop height estimation |
| Infrared sensors | Identify temperature anomalies, which can indicate water stress or potential yield variations | Helps in optimizing irrigation and fertilization practices |
| Ultrasonic sensors | Use sound waves to measure distances and monitor plant canopy structure and growth patterns | Aids in estimating biomass and yield potential |
| GPS/GNSS | Provides precise location data for all measurements | Essential for generating accurate, location-specific yield maps and supporting variable-rate applications |
| IoT and cloud computing | Facilitate data collection, transmission, storage, and analysis | Enable seamless integration and management of large datasets |

Such integration of non-contact sensing and intelligent analysis enhances measurement accuracy and operational efficiency, even under variable field and environmental conditions. Furthermore, the adoption of non-contact yield monitoring systems promotes data-driven precision-agriculture practices, supporting informed decision-making and improved productivity, which is particularly relevant in the context of Korean root-crop production systems.
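The detection-and-tracking pipelines discussed in this section (e.g., YOLO detectors combined with trackers such as DeepSORT) ultimately count unique objects as they cross the camera's field of view. The toy centroid-matching counter below is shown only to illustrate that idea; real trackers add motion models and appearance features, and all names and thresholds here are illustrative.

```python
def count_objects(frames, max_dist=30.0):
    """Count unique objects across frames of detections.

    Each frame is a list of (cx, cy) bounding-box centroids; a detection
    within max_dist pixels of a centroid seen in the previous frame is
    treated as the same object, otherwise it is counted as new."""
    total, prev = 0, []
    for dets in frames:
        matched_prev = set()
        for cx, cy in dets:
            # find the nearest not-yet-matched centroid from the previous frame
            best, best_d = None, max_dist
            for j, (px, py) in enumerate(prev):
                d = ((cx - px) ** 2 + (cy - py) ** 2) ** 0.5
                if j not in matched_prev and d <= best_d:
                    best, best_d = j, d
            if best is None:
                total += 1              # unmatched detection -> new object
            else:
                matched_prev.add(best)  # continuation of an existing track
        prev = dets
    return total
```

On a conveyor, such a running count (multiplied by a per-object mass estimate) yields a cumulative throughput figure in real time.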

Optical sensor techniques for non-contact yield measurement

Optical sensing technologies have emerged as a critical component in non-contact yield estimation for root and tuber crops. Among them, RGB imaging systems operating within the visible light spectrum (400–750 nm) represent the most widely deployed and cost-effective solutions for monitoring crop morphology, physical volume, and yield characteristics in both research and industrial environments (Artés-Hernández et al., 2022). A spectral overview of ultraviolet, visible, and infrared wavelength domains relevant to agricultural imaging is shown in Fig. 2.

https://cdn.apub.kr/journalsite/sites/kspa/2025-007-04/N0570070407/images/kspa_2025_074_397_F2.jpg
Fig. 2.

Spectral wavelength ranges of ultraviolet, visible, and infrared light for agricultural imaging applications (modified from Artés-Hernández et al., 2022).

RGB cameras may be mounted on tractors, autonomous ground vehicles, harvesters, and unmanned aerial vehicles (UAVs) to obtain high-resolution image data for automated processing through machine-vision algorithms (Islam et al., 2024). Conveyor-integrated RGB imaging systems have demonstrated promising field-level performance by enabling continuous root detection and yield documentation during harvesting. For example, a shallot-harvesting prototype combining an RGB camera, GNSS, and a video logger achieved accurate bulb detection and size-based classification, obtaining the highest precision for larger size classes (Boatswain Jacques et al., 2021). Sweet potato grading using RGB imaging demonstrated >90% accuracy in laboratory settings, although performance degraded under variable field conditions such as soil contamination, uneven illumination, and vibration effects (Gogineni et al., 2002). Stereo-vision enhancements have improved depth estimation for size measurement within operational conveyor distances, supporting more robust 3D root characterization (Zheng et al., 2022).
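The stereo-vision depth estimation mentioned above rests on the pinhole stereo relation Z = fB/d: depth is the focal length (in pixels) times the camera baseline, divided by the pixel disparity between the two views. A minimal sketch, with illustrative parameter values:

```python
def disparity_to_depth(disparity_px, focal_px, baseline_m):
    """Depth from the pinhole stereo relation Z = f * B / d.

    disparity_px: horizontal pixel offset of a point between the two views
    focal_px:     focal length expressed in pixels
    baseline_m:   distance between the two camera centres in metres"""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```

Because depth error grows quadratically with distance, conveyor-mounted stereo rigs keep the working distance short, which is why they perform well at typical conveyor heights.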

Recent advancements in data processing are driving improved accuracy and real-time implementation. Wang et al. (2025) introduced 3DPotatoTwin, a hybrid dataset combining RGB-D point clouds and Structure-from-Motion (SfM) reconstructions to model potato morphology with sub-millimeter registration accuracy, although marker-based registration remains complex and may introduce alignment errors when fusing multi-sensor data. Blok et al. (2025) proposed fast 3D shape completion of potato tubers directly on a harvester using RGB-D cameras, improving volume estimation with results computed in just 10 milliseconds per tuber, as shown in Fig. 3(a). Though effective for real-time, high-throughput yield estimation under field conditions, the approach retains residual estimation errors due to partial camera views and faces challenges in scaling to more complex environments or highly occluded shapes. A non-destructive approach using image-processing algorithms to determine the physical dimensions and weight of sweet potatoes was proposed by Huynh et al. (2022). By analyzing features from 2D and 3D images, the method achieved more than 98% accuracy in mass and volume estimation, with mass prediction errors as low as 4–8%. This reliable and efficient technique enables automated assessment of sweet potato yield and quality, but it may require further refinement for varied lighting, occlusions, and broader crop types. Dolata et al. (2021) proposed a method integrating an RGB camera with simulation-based neural model training to automatically segment root crops on conveyors and estimate their sizes in real time. In static and dynamic conveyor experiments, shown in Fig. 3(b), the system determined crop dimensions with less than 10% error and remained robust regardless of tuber orientation or clustering. The study demonstrates the potential for precise, automated grading and yield estimation without manual measurement, though further work is needed to reduce errors and adapt the technology to broader crop varieties and challenging post-harvest scenarios.

A potato yield monitoring module integrating a digital RGB camera, real-time kinematic (RTK) GPS, and image-filtering algorithms was developed and evaluated by Lee and Shin (2020), as shown in Fig. 3(c). The system processed each acquired image within 344 ms, providing yield estimates spatially referenced by GPS coordinates. Performance assessment indicated a yield estimation error of 268.60 g, corresponding to a root mean square deviation of 15.33%, demonstrating the feasibility of real-time, image-based yield monitoring during potato harvesting. Jang et al. (2023) developed and evaluated a real-time potato yield monitoring system utilizing a camera-based YOLOv5 object detection algorithm integrated with DeepSORT object tracking, as shown in Fig. 3(d). By estimating potato mass from bounding-box geometry and applying a density-based conversion, the system achieved strong prediction performance: an individual tuber mass estimation accuracy of R² = 0.9034 with an average error of 13.09%, and a 95.2% detection rate under continuous feeding conditions, resulting in a 9% error in cumulative mass estimation. Although effective in laboratory environments, system performance was affected by computational limitations, frame loss, and variations in potato position within the camera field of view. Long et al. (2018) proposed a binocular stereo vision method using an RGB-D camera system to measure the volume of individual potato tubers. The approach captured 3D point clouds of the potatoes from different angles, computed a digital surface model, and calculated the volume through geometric integration. Although this technique enabled non-destructive, automated volume estimation suitable for real-time use on harvesters, its accuracy was limited, with up to 9% relative error for regular-shaped potatoes and as high as 30% for irregular shapes. Wu et al. (2025) introduced a novel approach to simultaneously estimate multiple geometric properties of moving carrots on a conveyor, including width, length, volume, and mass, as shown in Fig. 3(e). An RGB-D integrated learning framework based on a YOLO network simultaneously performs instance segmentation and geometric property regression with high accuracy, achieving MAPE < 2.5% (improved to < 2% using multi-view conveyor data) and strong detection performance with an F1-score of 98.78% and 89.25% IoU. Real-time implementation on an NVIDIA Jetson Orin Nano reached 80 FPS, confirming its suitability for on-harvester deployment. Su et al. (2018) presented a machine vision system using depth cameras to non-destructively grade potatoes based on size, shape, and surface defects. By capturing depth images of 110 potatoes, including those with deformations such as bumps and bends, the study computed length, width, thickness, and volume as key indicators for quality and defect detection. Using a volume-based linear regression model, 90% of samples were correctly classified by weight category, and the system achieved 88% accuracy in identifying surface defects such as bumps and hollows by combining 2D and 3D surface data.
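The density-based mass conversion reported by Jang et al. (2023) can be illustrated, in spirit, by approximating each detected tuber as an ellipsoid inferred from its bounding box and multiplying the volume by an assumed bulk density. The depth ratio and density values below are illustrative placeholders, not the published calibration.

```python
import math

def tuber_mass_from_bbox(w_mm, h_mm, density_g_cm3=1.08, depth_ratio=0.9):
    """Approximate tuber mass from bounding-box width/height (mm).

    Models the tuber as an ellipsoid whose unseen depth (camera axis) is
    depth_ratio * height, then converts volume to mass with an assumed
    bulk density in g/cm^3. Both constants are illustrative placeholders."""
    a, b = w_mm / 2.0, h_mm / 2.0      # visible semi-axes
    c = depth_ratio * b                # assumed hidden semi-axis
    vol_mm3 = (4.0 / 3.0) * math.pi * a * b * c
    return vol_mm3 / 1000.0 * density_g_cm3   # mm^3 -> cm^3 -> g
```

Summing such per-tuber estimates over tracked detections gives the cumulative mass figure against which the 9% error above was reported.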

Wang and Li (2014) evaluated an RGB-D imaging approach for non-destructive size and volume estimation of sweet onions during postharvest handling. Images were captured from six orientations, and geometric traits were extracted from both color and depth data. Depth-based measurement demonstrated superior accuracy over RGB imagery, achieving a diameter RMSE of 2 mm and a volume prediction accuracy of 96.3% (RMSE = 18.5 cm³). Additionally, the system showed strong potential for estimating onion density using depth-derived volumetric information. The results indicate that RGB-D sensing can improve automation, precision, and throughput in onion phenotyping and sorting applications. Kiraga et al. (2025) developed a computer-vision-based yield monitoring system to estimate radish volume in real time under simulated uneven-field and dynamic harvesting conditions. An RGB image acquisition system was mounted on a conveyor placed on a vibration table and adjustable slope platform, as shown in Fig. 3(f), enabling controlled testing at three slope angles and three vibration levels, including combined scenarios. Two modeling approaches, ellipsoidal geometry and multiple linear regression (MLR), were evaluated by comparing estimated volumes against water displacement measurements. The MLR model consistently outperformed the ellipsoidal approach, achieving the highest accuracy with R² values of 0.94, 0.96, and 0.92 for independent slope tests. ANOVA results showed no significant differences between MLR-based estimates and ground-truth values across all conditions. Integrating environmental variables with plant-based descriptors has also proven beneficial for improving yield prediction in root crops. Kabir et al. (2025) used a genetic algorithm-optimized extreme learning machine (GA-ELM) to predict radish mass under simulated harvesting conditions on a laboratory test bench. Statistical comparisons with water displacement values showed no significant differences, though the algorithm slightly underestimated mass. Strong predictive performance was demonstrated with high coefficients of determination (R² = 0.94–0.98) across various slope and vibration conditions, confirming model reliability.
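The two radish volume models compared by Kiraga et al. (2025) can be sketched as follows: a closed-form ellipsoid from length, width, and thickness, and a least-squares MLR fitted to ground-truth volumes (e.g., from water displacement). The fitting code and test data below are illustrative, not the published coefficients.

```python
import numpy as np

def ellipsoid_volume(length, width, thickness):
    """Geometric baseline: V = (4/3) * pi * (L/2) * (W/2) * (T/2)."""
    return (4.0 / 3.0) * np.pi * (length / 2) * (width / 2) * (thickness / 2)

def fit_mlr(dims, volumes):
    """Least-squares MLR: volume ~ b0 + b1*L + b2*W + b3*T.

    dims is an (n, 3) array-like of measured L, W, T per root; volumes are
    the corresponding ground-truth values. Returns [b0, b1, b2, b3]."""
    X = np.column_stack([np.ones(len(dims)), np.asarray(dims, dtype=float)])
    beta, *_ = np.linalg.lstsq(X, np.asarray(volumes, dtype=float), rcond=None)
    return beta
```

The MLR form can absorb systematic shape deviations (tapering, curvature) that the pure ellipsoid cannot, which is consistent with the reported accuracy gap between the two approaches.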

A digital phenotyping platform was developed by Brainard et al. (2021) to classify carrot root shapes that are traditionally judged visually. The system captures root geometry such as length, width, and curvature to differentiate market classes. Validation in diverse germplasm and diallel populations confirmed high classification accuracy and strong genetic signal detection. PCA of curvature profiles also quantified key characteristics including shoulder broadness and tip fill, capturing 87% and 84% of variance, respectively. Narrow-sense heritability varied substantially among traits, with shape-related parameters such as aspect ratio showing high heritability (h² = 0.84). Studies such as Akhand et al. (2016) and Abou Ali et al. (2020) reported prediction errors below 10% and R² values of 0.44–0.57 when combining vegetation indices with environmental factors. Given the strong dependence of root crop yield on soil moisture, temperature, and atmospheric conditions, environmental indicators often exceed plant traits in feature importance and frequency of use within predictive models.
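The PCA-based quantification of curvature profiles above reduces to the explained-variance ratio of the leading principal components, which can be computed from the singular values of the centred data matrix. A minimal SVD-based sketch, with an illustrative function name:

```python
import numpy as np

def pca_explained_variance(profiles, k=2):
    """Fraction of total variance captured by the first k principal
    components of a sample-by-feature matrix (rows = curvature profiles)."""
    X = np.asarray(profiles, dtype=float)
    Xc = X - X.mean(axis=0)                      # centre each feature
    s = np.linalg.svd(Xc, compute_uv=False)      # singular values
    var = s ** 2                                 # proportional to PC variances
    return var[:k].sum() / var.sum()
```

Figures such as the 87% and 84% reported above correspond to this ratio evaluated for the components associated with each trait.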

https://cdn.apub.kr/journalsite/sites/kspa/2025-007-04/N0570070407/images/kspa_2025_074_397_F3.jpg
Fig. 3.

Image acquisition systems deployed during harvesting operation using optical sensors: (a) RGB-D imaging system installed on a potato harvester (Blok et al., 2025), (b) RGB image acquisition system used during potato harvest (Dolata et al., 2021), (c) field test of machine vision system on a prototype potato harvester (Lee and Shin, 2020), (d) laboratory test of machine vision system on a prototype potato harvester (Jang et al., 2023), (e) estimation of multiple geometric properties of moving carrots on conveyor (Wu et al., 2025), and (f) radish yield estimation using laboratory test bench (Kiraga et al., 2025).

LiDAR and proximity sensor techniques for non-contact yield measurement

LiDAR and 3D scanning systems have gained significant attention for non-contact yield monitoring of root crops due to their ability to directly capture three-dimensional geometric features. LiDAR provides dense point clouds representing the spatial coordinates (x, y, z) of harvested produce, enabling high-accuracy quantification of volume, surface area, curvature, and shape irregularity. As a laser-based active sensing system, LiDAR is largely independent of ambient lighting, offering clear advantages over visible-spectrum imaging for yield estimation in dynamic harvesting environments where soil debris, shadows, and variable illumination conditions commonly occur.

LiDAR applications in root crop research have demonstrated strong performance in both offline grading and online yield measurement. For example, Villordon et al. (2020) utilized a low-cost 3D scanner to model sweet potato storage roots and reported strong correlations between scanner-derived and manually measured geometric parameters, confirming the feasibility of non-destructive 3D modeling for shape-based classification and processing-quality assessment. Cai et al. (2020) implemented a laser triangulation system combining a monocular camera and structured-light projection to reconstruct potato geometry with exceptional accuracy, achieving volumetric and mass prediction errors of −0.08% and 0.48%, respectively. Furthermore, Xu et al. (2024) developed an online LiDAR-based measurement system integrated with a roller conveyor, as shown in Fig. 4(a), and applied deep neural network modeling to estimate sweet potato volume, reaching a prediction accuracy of 97.9% and outperforming polynomial and linear regression approaches. These findings support the suitability of LiDAR for high-precision, real-time measurement of harvested roots during grading and yield monitoring. Xie et al. (2023) proposed a 3D reconstruction-based method for accurate carrot morphological measurement, addressing the limitations of 2D imaging, which lacks depth information, as shown in Fig. 4(b). An RGB-D acquisition system combining a ToF Kinect sensor and a turntable with circular markers captured 16 RGB and 16 depth images per sample to cover the full carrot surface. Point cloud registration accuracy was high, with most alignment errors within 1 mm and all within 2.4 mm. Using Poisson reconstruction, key morphological traits (volume, length, and maximum diameter) were extracted from 3D models generated for 136 carrots. Validation showed MAPE < 3% for all variables, confirming strong measurement accuracy. The method offers a low-cost and reliable solution for 3D phenotyping and grading of produce with limited surface features.

LiDAR-based yield monitoring systems encounter challenges such as noise from soil debris, occlusion of overlapping roots, and motion distortion on vibrating conveyors, all of which reduce volumetric accuracy for irregularly shaped crops. Their performance is also highly sensitive to calibration, mounting geometry, and environmental conditions. Additionally, real-time point-cloud processing requires high computational power, while LiDAR hardware cost and durability issues hinder commercial adoption. Sensor fusion and ruggedized system design are therefore essential for reliable field deployment.

Proximity and ultrasonic sensors represent cost-effective and adaptable solutions for non-contact yield estimation in agricultural systems, particularly where simpler structural measurements are sufficient. Proximity sensors based on infrared reflectance or capacitive detection provide continuous monitoring of canopy density and crop presence, whereas ultrasonic sensors quantify the distance between the transducer and plant surfaces by measuring the return time of acoustic waves, enabling indirect estimation of crop height, volume, and biomass distribution (Karim et al., 2025; Tan et al., 2025). These sensors can be easily integrated into ground vehicle platforms or harvesting equipment to support spatial yield mapping at the field scale.

Beyaz and Gerdan (2020) explored the use of ultrasonic sensing for potato size classification, using a LabVIEW-based software platform to acquire and process length measurements, as shown in Fig. 4(c). Static and dynamic ultrasonic measurements were compared against caliper-based ground truth to assess classification accuracy. Strong linear relationships were observed, with regression coefficients of 95.5% for static ultrasonic vs. caliper length, 86.9% for dynamic ultrasonic vs. caliper length, and 87.9% for static vs. dynamic ultrasonic measurements. Ultrasonic potato classification is limited by temperature sensitivity, which affects measurement accuracy, and by difficulties in reliably measuring irregular or moving potatoes. Ultrasonic sensing can also be complemented by LiDAR-based methodologies. These studies highlight the value of non-contact ranging systems for improving machine-crop interaction and establishing proxy relationships between measured structural traits and final yield.
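The underlying time-of-flight calculation, including the temperature dependence noted above, can be sketched as follows; the 331.3 + 0.606·T m/s formula is a standard first-order approximation for the speed of sound in air, not a value taken from the cited study.

```python
def speed_of_sound(temp_c: float) -> float:
    """First-order approximation for air: ~331.3 m/s at 0 degC, +0.606 m/s per degC."""
    return 331.3 + 0.606 * temp_c

def echo_distance(echo_time_s: float, temp_c: float = 20.0) -> float:
    """Sensor-to-surface range from a round-trip echo time (halved: out and back)."""
    return speed_of_sound(temp_c) * echo_time_s / 2.0

# A 2 ms echo at 20 degC corresponds to ~0.343 m; using the wrong air
# temperature shifts the estimate by several percent, which is one source
# of the temperature sensitivity discussed above.
print(echo_distance(0.002, 20.0))
```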

Fig. 4(d) shows an integrated non-contact yield monitoring system mounted on a potato harvester. The system consists of an RGB camera installed above a rollover conveyor to capture continuous images of harvested potatoes for object detection, counting, and geometric analysis. Below the camera, an array of ultrasonic sensors measures the height and bulk density of the potato flow as the conveyor rotates and exposes all sides of the produce. A control box mounted above the sensor assembly houses the data acquisition electronics and computing hardware required for real-time processing and synchronization of sensor outputs. This multimodal sensing configuration enables simultaneous volume and mass estimation of harvested root vegetables during on-the-go field operations. Proximity and ultrasonic sensors nevertheless have limited accuracy in root crop yield monitoring owing to sensitivity to ambient noise, crop geometry, soil adhesion, and vibration-induced errors on harvesters. Their low spatial resolution also restricts reliable detection of overlapping or irregularly shaped roots.
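A simplified sketch of how such an ultrasonic height profile could be converted into volumetric and mass flow rates is given below; the sensor spacing, belt speed, and 650 kg/m3 potato bulk density are illustrative assumptions, not values from the system described.

```python
import numpy as np

def flow_rates(heights_m, sensor_spacing_m, belt_speed_mps, bulk_density_kgm3):
    """Volumetric (m^3/s) and mass (kg/s) flow from an ultrasonic height profile.

    Each sensor reading is treated as the height of a rectangular strip of
    width sensor_spacing_m; the cross-sectional area times belt speed gives
    volumetric flow, scaled by bulk density for mass flow.
    """
    area_m2 = float(np.sum(heights_m)) * sensor_spacing_m
    q = area_m2 * belt_speed_mps
    return q, q * bulk_density_kgm3

heights = np.array([0.05, 0.08, 0.07, 0.04])  # four sensors across the belt (m)
q, mass_rate = flow_rates(heights, sensor_spacing_m=0.15,
                          belt_speed_mps=0.5, bulk_density_kgm3=650.0)
print(q, mass_rate)
```

Integrating these instantaneous rates over time, tagged with GNSS position, is what turns a conveyor sensor array into a spatial yield map.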

https://cdn.apub.kr/journalsite/sites/kspa/2025-007-04/N0570070407/images/kspa_2025_074_397_F4.jpg
Fig. 4.

Image acquisition systems deployed during harvesting operations using LiDAR and proximity sensors: (a) LiDAR-based measurement system integrated with a roller conveyor (Xu et al., 2024), (b) point cloud registration using a Kinect ToF sensor in a laboratory setup (Xie et al., 2023), (c) ultrasonic sensor setup for potato classification and size measurement (Beyaz and Gerdan, 2020), and (d) array of ultrasonic sensors used on a rotary conveyor to collect potato volume data.

Other sensor techniques for non-contact yield measurement

Thermal, radar, and hyperspectral sensing technologies provide complementary diagnostic capabilities for root crop yield monitoring where conventional optical methods encounter limitations. Thermal infrared imaging detects temperature anomalies related to bruising, tissue damage, and moisture loss, supporting real-time quality assessment (Sun et al., 2022). Radar systems, including microwave and millimeter-wave sensors, offer reliable bulk detection under dusty, low-visibility, or nighttime conditions because their penetration is independent of illumination (Chung et al., 2016; Bidese-Puhl et al., 2023). Hyperspectral imaging enables detailed spectral analysis for identifying compositional traits such as dry matter and soluble solids, supporting grading and defect detection (Ahmed et al., 2024). These technologies contribute valuable non-contact information on quality, structure, and crop distribution, strengthening decision support in precision harvesting systems. Ahmed et al. (2024) introduced an explainable AI-enabled hyperspectral imaging (HSI) method for evaluating key postharvest quality attributes of sweet potatoes, including dry matter content, soluble solids content, and firmness, as shown in Fig. 5. Using a portable VNIR-HSI system (400–1000 nm), spectral features were extracted and optimized through wavelength selection techniques. SHAP-based explainability was incorporated to interpret model performance and identify influential spectral regions. The developed regression models demonstrated high predictive accuracy, particularly for dry matter (Rp2 = 0.92, RPD = 5.58) and firmness (Rp2 = 0.85, RPD = 2.63). Prediction maps were generated to visualize spatial quality variations across tubers. Hyperspectral sensing has also successfully predicted tuber yield in potatoes and can monitor crop health, nutrient levels, and stress tolerance in root crops, supporting real-time precision agriculture and improving yield prediction accuracy.

https://cdn.apub.kr/journalsite/sites/kspa/2025-007-04/N0570070407/images/kspa_2025_074_397_F5.jpg
Fig. 5.

Hyperspectral image acquisition system used for postharvest quality assessment of sweet potatoes (Ahmed et al., 2024).
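For reference, the RPD metric reported in such hyperspectral studies is the standard deviation of the reference values divided by the prediction RMSE; a small self-contained sketch follows (conventions such as the ddof used for the standard deviation vary between studies, and the numbers here are synthetic).

```python
import numpy as np

def rpd(y_ref, y_pred):
    """Residual predictive deviation: SD of reference values / prediction RMSE."""
    y_ref = np.asarray(y_ref, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    rmse = np.sqrt(np.mean((y_ref - y_pred) ** 2))
    return np.std(y_ref, ddof=1) / rmse

# Toy dry-matter values (%): predictions within +/-0.5 of reference
y_ref = np.array([20.0, 22.0, 24.0, 26.0, 28.0])
y_pred = y_ref + np.array([0.5, -0.5, 0.5, -0.5, 0.0])
print(round(rpd(y_ref, y_pred), 2))
```

Higher RPD means the model resolves differences between samples well relative to its error; values above roughly 2 to 3 are typically regarded as useful for quantitative prediction.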

Thermal imaging performance is affected by environmental heat variation, soil moisture, and emissivity differences, which can obscure subtle bruising or damage. Radar systems provide only coarse spatial resolution and struggle with overlapping roots, while GPR accuracy decreases due to soil heterogeneity and signal attenuation. Both technologies require intensive noise filtering and rugged hardware to withstand harsh harvesting conditions. Therefore, thermal and radar sensors are best used in combination with optical or geometric methods to achieve more reliable yield monitoring. Table 3 shows the general comparison between thermal, radar, and hyperspectral sensing technologies for root crop yield estimation scenarios.

Table 3.

Summary of thermal, radar, and hyperspectral technologies for root crop yield monitoring.

Technology Key Parameters Measured Yield Prediction Capability Typical application
Thermal Sensing Canopy temperature, water stress, nutrient stress Indirect yield estimation through stress indicators Useful for irrigation management and stress detection
Radar Sensing Crop growth stages, biomass, leaf area index Forecast yield via vegetation growth monitoring Weather-independent, good for temporal growth tracking
Hyperspectral Sensing Nutrient status, biochemical composition, biomass Direct tuber yield prediction and crop health Detailed biochemical insights, high accuracy, complex data processing

Sensor fusion and integrated systems for non-contact yield monitoring

Sensor fusion has emerged as a key strategy for overcoming the limitations of individual sensing techniques in root vegetable yield monitoring. By combining optical imaging, depth sensing (LiDAR/RGB-D), and range-based proxies (ultrasonic/radar), integrated systems can simultaneously measure geometric structure, physical mass estimates, and quality attributes under challenging field conditions (Karim et al., 2025; Tan et al., 2025). Fusion approaches typically involve multi-sensor synchronization on conveyor-mounted platforms or autonomous harvesting machines, where data from different sources are aligned spatially and temporally using GNSS or encoder inputs. Machine learning and deep neural networks (e.g., multimodal CNNs or feature-level fusion architectures) are increasingly used to exploit complementary features, such as surface texture from RGB images, 3D volume from LiDAR, and bulk density indicators from ultrasonic or radar signals (Lv et al., 2024). These systems enable significant improvements in real-time segmentation, root-to-soil separation, and weight prediction accuracy, delivering robust high-resolution yield maps and automated grading outputs tailored for cooperatives and commercial packing operations. Sensor fusion therefore represents a crucial technological pathway toward full-field deployment of real-time yield monitoring in potato, sweet potato, radish, and carrot production.

Regression and other models (Random Forest, SVR, MLP) further enhance prediction accuracy when combining RGB, LiDAR, and ultrasonic features. Hybrid Sensor Fusion represents a new trend, where multi-modal data (RGB + LiDAR + multispectral) are integrated with ML algorithms to overcome limitations of individual sensors (Lv et al., 2024). This improves robustness in conditions of occlusion, soil contamination, or complex canopy structures (Rather and Alotaibi, 2025).
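As an illustrative sketch of feature-level fusion, the example below concatenates LiDAR, RGB, and ultrasonic features and trains a random forest regressor of the kind mentioned above. All data are synthetic, and the feature definitions and 650 kg/m3 bulk density are assumptions for illustration only.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 300
lidar_volume = rng.uniform(0.1, 0.5, n)                      # per-segment volume (LiDAR)
rgb_area = lidar_volume ** (2 / 3) + rng.normal(0, 0.01, n)  # projected-area proxy (RGB)
us_height = lidar_volume / 0.3 + rng.normal(0, 0.01, n)      # bulk-height proxy (ultrasonic)
X = np.column_stack([lidar_volume, rgb_area, us_height])     # feature-level fusion
y = 650.0 * lidar_volume + rng.normal(0, 2.0, n)             # segment mass (kg)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(round(r2_score(y_te, model.predict(X_te)), 3))
```

In a real pipeline the three feature vectors would come from time-synchronized sensors on the same conveyor segment, which is exactly where the calibration and alignment challenges discussed below arise.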

The integration of multiple sensing systems introduces challenges in hardware coordination, sensor mounting geometry, and data synchronization, particularly under high-speed harvesting motion (Lv et al., 2024). Fusion pipelines require accurate calibration to ensure spatial alignment of multimodal datasets; even minor vibration-induced misregistration can degrade predictive accuracy. Increased data volume demands high-performance onboard processing, edge computing, and efficient communication protocols to maintain real-time performance. Cost and system complexity escalate as additional sensors are added, limiting adoption among small-scale farms common in root crop production regions such as Korea. Moreover, fusion algorithms may suffer from reduced generalization when trained on limited cultivars or environmental conditions, necessitating extensive field validation. Overcoming these limitations will require advances in lightweight neural architectures, standardized calibration frameworks, and ruggedized plug-and-play sensor modules for commercial harvester integration. Table 4 summarizes the current applications of non-contact sensing technologies for yield monitoring in major root vegetable crops, highlighting the sensing approaches, key measurement parameters, performance outcomes, and field-readiness levels reported in the literature.

Table 4.

Summary of non-contact sensing technologies and their applications in yield monitoring of root vegetable crops.

Sensor Application Crop Method / Algorithm Accuracy Reference
RGB imaging Yield monitoring and
grading
Sweet potato Regression models,
neural networks
> 90% Gogineni et al. (2002)
Volume and weight
estimation
Sweet potato Image segmentation,
regression model
R2 = 0.98 (volume)
R2 = 0.96 (weight)
Huynh et al. (2022)
Geometric property
estimation
Carrot YOLO-Net < 2.5% MAPE
98.78% F1-score
Wu et al. (2025)
Yield monitoring Potato Digital imaging and mapping 84.67% Lee et al. (2020)
Root shape Carrot Principal component analysis 87% Brainard et al. (2021)
RGB-D stereo
vision
Volume measurement Potato Stereo geometry,
depth image analysis
91% Long et al. (2018)
Shape, phenotyping,
yield analysis
Potato 3D modeling 0.59 ± 0.11 mm Wang et al. (2025)
Grading and mass
prediction
Potato 2D–3D surface data analysis 90% Su et al. (2018)
Size estimation Onion Image processing 96.3% Wang and Li (2014)
3D scanning
and LiDAR
Volume estimation,
shape classification
Sweet potato 3D, PCA analysis 96.2% Villordon et al. (2020)
3D volume estimation Potato Line laser triangulation –0.08% (volume error)
0.48%
(mass error)
Cai et al. (2020)
Volume measurement Sweet potato 3D data and Deep Neural
Network
97.9% Xu et al. (2024)
Volume estimation Potato deep learning, image processing RMSE 22.6 Blok et al. (2025)
RGB and
hyperspectral
Yield estimation Potato Vegetation indices r2 > 0.90 Li et al. (2020)
Hybrid Yield estimation Potato Image processing,
machine learning
- Rather and Alotaibi (2025);
Li et al. (2020)

Data processing and analysis

Data processing and analysis are essential for transforming raw sensory inputs into reliable indicators of yield, morphology, and quality in root crop monitoring. As illustrated in Fig. 6, the workflow generally follows four primary stages: sensor data acquisition, pre-processing, feature extraction, and predictive modeling. The specific techniques applied within each stage depend on the sensing modality used, such as RGB imaging, LiDAR-based 3D scanning, or ultrasonic/proximity ranging, with each requiring distinct processing pipelines and analytical frameworks to meet accuracy and real-time operational demands.

https://cdn.apub.kr/journalsite/sites/kspa/2025-007-04/N0570070407/images/kspa_2025_074_397_F6.jpg
Fig. 6.

Overview of data processing and analysis stages for different sensors in non-contact yield measurement technologies.

Effective non-contact yield monitoring relies heavily on robust data processing pipelines capable of transforming raw imagery and sensor outputs into accurate crop trait estimates. Raw RGB, LiDAR, depth, and hyperspectral data frequently contain noise, illumination variation, geometric distortion, and spatial misalignment, requiring preprocessing steps such as denoising, image enhancement, depth calibration, and point-cloud filtering, as shown in Fig. 7(a) and (b), to ensure data fidelity (Lee and Shin, 2020; Boatswain Jacques et al., 2021). Background removal and vegetation indices (e.g., NDVI) are commonly applied in RGB and multispectral workflows to isolate root crop features, while LiDAR and 3D point-cloud datasets demand geometric registration and surface smoothing to enable precise reconstruction (Cai et al., 2020).
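A minimal example of the NDVI-based background removal mentioned above, applied to reflectance arrays; the values are synthetic and the 0.2 threshold is an illustrative choice rather than a recommended setting.

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized difference vegetation index, in [-1, 1]."""
    return (nir - red) / (nir + red + eps)

# Tiny synthetic reflectance patch: left column vegetation, right column soil
nir = np.array([[0.60, 0.10],
                [0.50, 0.20]])
red = np.array([[0.20, 0.30],
                [0.10, 0.20]])
mask = ndvi(nir, red) > 0.2  # crude vegetation/background split
print(mask)
```

The resulting boolean mask is what downstream segmentation and counting stages operate on.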

After preprocessing, feature extraction algorithms are employed to derive morphological and spectral attributes relevant to yield quantification. In RGB image analysis, deep-learning-based detection and segmentation such as YOLO frameworks, convolutional neural networks (CNNs), and contour modeling support tuber counting and size estimation Wu et al. (2025). For 3D and LiDAR point clouds, clustering, triangulation, and multi-view stereo modeling facilitate volumetric computation and shape analysis (Karim et al., 2025). Ultrasonic and proximity sensors extract distance-based canopy height and bulk density profiles following signal calibration (Karim et al, 2025; Tan et al., 2025).
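Tuber counting and per-object area extraction from a binary segmentation mask can be sketched with connected-component labeling, a generic building block rather than any specific cited pipeline.

```python
import numpy as np
from scipy import ndimage

# Binary mask of two separated "tubers" on a conveyor frame (1 = crop pixel)
mask = np.array([
    [1, 1, 0, 0, 0],
    [1, 1, 0, 1, 1],
    [0, 0, 0, 1, 1],
])
labels, count = ndimage.label(mask)  # 4-connected component labeling
areas = ndimage.sum(mask, labels, index=range(1, count + 1))  # pixel area per object
print(count, areas.tolist())
```

Pixel areas convert to physical size once the camera's ground sampling distance is known, which is how such masks feed size-grading decisions.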

Extracted features are then used in predictive modeling to estimate yield-related parameters such as mass, volume, quality category, and spatial variability. Regression-based methods, including multiple linear regression (MLR), partial least squares regression (PLSR), principal component analysis (PCA), support vector regression (SVR), and random forest (RF), remain prevalent in continuous trait prediction, as shown in Fig. 7(c) (Rather and Alotaibi, 2025; Brainard et al., 2021). More advanced neural architectures, such as multi-task learning, depth-enhanced CNNs, and multimodal ensemble models, improve robustness by exploiting complementary geometric and spectral cues across sensing approaches, as shown in Fig. 7(d) and (e). For LiDAR and ultrasonic systems, volume–mass calibration models provide direct real-time quantification suitable for harvester control and automated throughput monitoring (Karim et al., 2025).

Recent developments illustrate a trend toward data-rich, multimodal processing. High-resolution RGB imaging with segmentation has proven effective for onion and shallot grading (Boatswain Jacques et al., 2021), yet performance decreases under variable lighting. UAV-based hyperspectral–RGB fusion improves spatial potato yield prediction (Li et al., 2020) but requires stable flights and dense vegetation. RGB-D fusion enables accurate 3D shape modeling for phenotyping and mass prediction (Wang et al., 2025; Huynh et al., 2022; Su et al., 2018), although computational load and calibration remain challenges. LiDAR-based approaches deliver efficient volume estimation on conveyors through neural regression models but are sensitive to cost constraints and motion distortion (Karim et al., 2025; Tan et al., 2025; Xu et al., 2024). Additionally, deep learning for 3D segmentation has demonstrated superior potato and carrot geometry reconstruction (Blok et al., 2025; Wu et al., 2025), albeit requiring large annotated datasets for generalization.
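The volume–mass calibration models mentioned above often reduce to a simple least-squares fit against scale-weighed samples; the sketch below uses synthetic calibration data, and the 1.08 g/cm3 slope is an assumed tuber density, not a cited value.

```python
import numpy as np

# Hypothetical calibration set: LiDAR volumes (cm^3) vs. scale-weighed masses (g)
volume = np.array([150.0, 200.0, 250.0, 300.0, 350.0])
mass = 1.08 * volume + 5.0  # synthetic data generated from an assumed density

A = np.column_stack([volume, np.ones_like(volume)])  # design matrix for mass = a*V + b
(a, b), *_ = np.linalg.lstsq(A, mass, rcond=None)
print(a, b)  # recovers the slope and intercept used to generate the data

predicted_mass = a * 275.0 + b  # apply the fit to a new LiDAR volume reading
```

Once fitted offline, such a model is cheap enough to evaluate per conveyor segment, which is why it suits real-time harvester control.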

https://cdn.apub.kr/journalsite/sites/kspa/2025-007-04/N0570070407/images/kspa_2025_074_397_F7.jpg
Fig. 7.

Examples of image processing and feature extraction techniques used in non-contact yield monitoring of root vegetable crops: (a) binary mask generation (Lee and Shin, 2020), (b) Gaussian filtering, HSV color-space thresholding, and morphological opening and closing of contours for volume and count estimation (Boatswain Jacques et al., 2021), (c) contour and curvature detection (Brainard et al., 2021), (d) instance segmentation (Dolata et al., 2021), and (e) contour detection and counting using machine learning.

Hybrid sensor-fusion systems integrating RGB, LiDAR, ultrasonic, and hyperspectral data (Rather and Alotaibi, 2025) consistently outperform single-sensor approaches by improving accuracy, soil/tuber separation, and environmental resilience. However, they introduce higher data rates, synchronization complexity, and real-time processing demands, which must be addressed for field deployment. Overall, modern data processing workflows, from noise reduction to advanced fusion modeling, play a pivotal role in enabling real-time, non-destructive yield estimation and trait mapping in root vegetable crops. A comprehensive comparison of methods, performance indicators, and processing requirements is presented in Table 5.

Table 5.

Data processing and analysis techniques for non-contact yield measurement in root vegetable crops.

Sensor Pre-processing Feature extraction Reference
RGB camera Image enhancement, noise removal,
segmentation
Object detection, size classification Boatswain Jacques et al. (2021)
Noise removal, vegetation indices, crop
height calibration
Hyperspectral and RGB vegetation indices Li et al. (2020)
Background segmentation, image slicing Contour detection, shape measurement Huynh et al. (2022)
Multi-view image acquisition,
normalization
YOLO-based geometric property extraction Wu et al. (2025)
Image acquisition, background separation
(potato vs. soil)
Object detection and segmentation Lee et al. (2020)
RGB-D stereo
camera
Image alignment, noise reduction, point
cloud registration
Multi-sensor fusion, 3D model
reconstruction
Wang et al. (2025)
Depth map filtering, surface reconstruction Defect segmentation, 2D–3D shape analysis Su et al. (2018)
Depth image calibration, noise filtering 3D shape extraction, edge mapping Long et al. (2018)
3D scanning
and LiDAR
Point cloud generation, model smoothing Surface and volume measurement Villordon et al. (2020)
Point interpolation, surface smoothing Laser reflection and triangulation mapping Cai et al. (2020)
Conveyor-based scanning, multi-view data
fusion
LiDAR point cloud feature extraction Xu et al. (2024)
Depth map completion, shape
reconstruction
CNN-based feature extraction, 3D segmentation Blok et al. (2025)
Hybrid Multi-modal preprocessing (RGB + LiDAR
alignment, multispectral fusion)
Feature extraction from multiple sources
(spectral indices + 3D structure)
Rather and Alotaibi (2025);
Li et al. (2020)

Discussion and Future Directions

Non-contact yield monitoring has demonstrated substantial potential to enhance precision agriculture in root vegetable production systems, where conventional sampling-based methods are labor-intensive, destructive, and frequently inaccurate. The findings of this review indicate that multiple sensing technologies such as RGB imaging, LiDAR, ultrasonic ranging, thermal sensing, and radar systems each contribute valuable structural, volumetric, or qualitative information for yield estimation. Among them, RGB imaging is currently the most widely adopted and cost-effective technique for tuber counting, canopy assessment, and size-based grading; however, its performance is highly dependent on external illumination, soil contamination, and occlusion. LiDAR-based systems provide highly resolved 3D information that supports volume estimation and morphology reconstruction, but their operational costs, calibration sensitivity, and vulnerability to vibration-induced noise constrain broad deployment in high-speed harvesting environments. Ultrasonic and proximity sensors offer low-cost measurement of canopy height and crop presence but exhibit lower resolution and are prone to acoustic interference, making them better suited for complementary integration rather than stand-alone yield quantification methods.

Deep learning and machine-vision advancements are accelerating the transition from experimental demonstrations to real-time implementation, enabling automated segmentation, yield prediction, and quality evaluation. The integration of multimodal sensors with AI-driven fusion models has emerged as a promising strategy to offset the limitations of individual sensing techniques, improving robustness under dynamic harvesting conditions. However, root vegetable crops present unique sensing challenges: irregular tuber morphology, overlapping and rolling behavior on conveyors, soil adhesion, and varying planting geometry contribute to detection uncertainty and require robust data processing and calibration frameworks. Additionally, many existing studies have been validated primarily under controlled or small-scale settings; fewer have undergone evaluation in commercial environments where throughput, vibration, and dust exposure are critical factors influencing performance stability.

In the Korean agricultural context, the relevance of these systems is particularly notable due to high domestic consumption of potatoes, sweet potatoes, and radishes, paired with continued declines in agricultural labor availability. Korea’s existing mechanization infrastructure provides a strategic opportunity to embed non-contact yield sensing directly into harvesters, allowing real-time transparency of harvest volume and quality and enabling more efficient supply chain coordination. Such integration can support cooperative-based distribution strategies, reduce post-harvest losses, and stabilize market supply, contributing to improved food security and competitiveness. Fig. 8 provides a conceptual framework summarizing the current landscape and future direction of non-contact yield monitoring for root vegetable crops.

https://cdn.apub.kr/journalsite/sites/kspa/2025-007-04/N0570070407/images/kspa_2025_074_397_F8.jpg
Fig. 8.

Framework of challenges, sensing technology comparison, contextual relevance, and future development directions for non-contact yield monitoring in root vegetable crops.

Future research should prioritize the development of low-cost, ruggedized, and edge-computable sensor systems capable of operating reliably in harsh field conditions with minimal calibration requirements. Expanding the application of multimodal sensor fusion—combining RGB, LiDAR, ultrasonic, and radar data—has the potential to achieve higher accuracy and generalizability across diverse crop varieties and environments. Moreover, real-time analytics should be linked with decision support, such as automated harvester adjustments, dynamic logistics planning, and predictive processing workflows. Accelerated standardization, open datasets, and collaborative development among researchers, manufacturers, and policymakers will be essential to ensure accessibility to farms of all scales and accelerate commercialization.

Overall, while no single sensing modality yet provides a complete solution for yield monitoring in root vegetable crops, the rapid convergence of sensing technologies and AI-based data interpretation is moving the field toward real-time, scalable, and highly reliable systems. Addressing the technological and implementation challenges identified in this review will be critical to realizing their full role in driving sustainability, labor efficiency, and competitive advantage in future agricultural production systems.

Conclusion

Non-contact yield monitoring offers a practical pathway to modernize root vegetable production and address labor shortages, especially in Korea. This review highlighted the potential of RGB/RGB-D imaging, LiDAR-based 3D scanning, ultrasonic ranging, thermal and radar sensing, and hyperspectral imaging for estimating crop volume, mass, and quality without disrupting harvesting operations.

RGB and RGB-D systems are currently the most practical for field deployment, aided by deep learning for segmentation and geometric extraction, but their performance is affected by lighting, soil adhesion, and occlusion. LiDAR enables accurate volumetric reconstruction yet remains costly and sensitive to vibration and calibration errors. Ultrasonic and proximity sensors provide affordable structural measurement but suffer from limited resolution and environmental interference.

Future progress will rely on sensor fusion, where multimodal data are integrated to offset individual limitations and improve robustness in dynamic field conditions. Deployment challenges—such as real-time processing, synchronization, and rugged hardware design—must be addressed for widespread adoption.

Overall, although no single sensing approach yet satisfies all technical and economic constraints, rapid advances in sensing hardware, AI-driven analytics, and integrated system design are narrowing this gap. Continued innovation toward robust, low-cost, and intelligent non-contact yield monitoring systems will be critical for sustainable intensification, labor efficiency, and resilience in Korean and global root vegetable production.

Conflict of Interest

No potential conflict of interest relevant to this article was reported.

Acknowledgements

This work was supported by Korea Institute of Planning and Evaluation for Technology in Food, Agriculture and Forestry (IPET), through Technology Commercialization Support Program, funded by Ministry of Agriculture, Food and Rural Affairs (MAFRA) (Project No. RS-2024-00401926), Republic of Korea.

References

1

Abou Ali, H., Delparte, D., Griffel, L.M. 2020. From pixel to yield: Forecasting potato productivity in Lebanon and Idaho. In: International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, pp. 1–7. https://doi.org/10.5194/isprs-archives-XLII-3-W11-1-2020

2

Ahmed, T., Wijewardane, N.K., Lu, Y., Jones, D.S., Kudenov, M., Williams, C., Villordon, A., Kamruzzaman, M., 2024. Advancing sweetpotato quality assessment with hyperspectral imaging and explainable artificial intelligence. Computers and Electronics in Agriculture 220: 108855. https://doi.org/10.1016/j.compag.2024.108855

3

Akhand, K., Nizamuddin, M., Roytman, L., Kogan, F. 2016. Using remote sensing satellite data and artificial neural network for prediction of potato yield in Bangladesh. In: Remote Sensing and Modeling of Ecosystems for Sustainability XIII, 997508. https://doi.org/10.1117/12.2237214

4

An, S., Bae, J.H., Kim, H.C., Kwack, Y. 2021. Production of grafted vegetable seedlings in the Republic of Korea: achievements, challenges and perspectives. Horticultural Science and Technology 39(5): 547-559. https://doi.org/10.7235/HORT.20210049

5

Artés-Hernández, F., Castillejo, N., Martínez-Zamora, L. 2022. UV and visible spectrum led lighting as abiotic elicitors of bioactive compounds in sprouts, microgreens, and baby leaves-A comprehensive review including their mode of action. Foods 11(3): 265. https://doi.org/10.3390/foods11030265

6

Beyaz, A., Gerdan, D. 2020. Potato classification by using ultrasonic sensor with LabVIEW. Agricultural Science Digest-A Research Journal 40(4): 376-381. https://doi.org/10.18805/ag.D-173

7

Bidese-Puhl, R., Butts, C. L., Rewis, M., McIntyre, J. S., Morris, J., Branch, B., Bao, Y. 2023. An mmWave radar-based mass flow sensor using machine learning towards a peanut yield monitor. Computers and Electronics in Agriculture 215: 108340. https://doi.org/10.1016/j.compag.2023.108340

8

Blok, P. M., Magistri, F., Stachniss, C., Wang, H., Burridge, J., Guo, W. 2025. High-throughput 3D shape completion of potato tubers on a harvester. Computers and Electronics in Agriculture 228: 109673. https://doi.org/10.1016/j.compag.2024.109673

9

Boatswain Jacques, A. A., Adamchuk, V. I., Park, J., Cloutier, G., Clark, J. J., Miller, C. 2021. Towards a machine vision-based yield monitor for the counting and quality mapping of shallots. Frontiers in Robotics and AI 8: 627067. https://doi.org/10.3389/frobt.2021.627067

10

Brainard, S.H., Bustamante, J.A., Dawson, J.C., Spalding, E.P., Goldman, I.L. 2021. A digital image-based phenotyping platform for analyzing root shape attributes in carrot. Frontiers in plant science 12: 690031. https://doi.org/10.3389/fpls.2021.690031

11

Cai, Z., Jin, C., Xu, J., Yang, T. 2020. Measurement of potato volume with laser triangulation and three-dimensional reconstruction. IEEE Access 8: 176565-176574. https://doi.org/10.1109/ACCESS.2020.3027154

12

Chen, L., Zhu, Y., Hu, Z., Wu, S., Jin, C. 2021. Beetroot as a functional food with huge health benefits: Antioxidant, antitumor, physical function, and chronic metabolomics activity. Food Science & Nutrition 9(11): 6406-6420. https://doi.org/10.1002/fsn3.2577

13

Chowdhury, M., Islam, M. N., Iqbal, M. Z., Islam, S., Lee, D. H., Kim, D. G., Jun, H. J., Chung, S. O. 2020. Analysis of overturning and vibration during field operation of a tractor-mounted 4-row radish collector toward ensuring user safety. Machines 8(4): 1-14. https://doi.org/10.3390/machines8040077

14

Chung, S. O., Choi, M. C., Lee, K. H., Kim, Y. J., Hong, S. J., Li, M. 2016. Sensing technologies for grain crop yield monitoring systems: A review. Journal of Biosystems Engineering 41(4): 408-417. https://doi.org/10.5307/JBE.2016.41.4.408

15

Constantinescu, C., Sala, F. 2021. Use of drone for monitoring and production estimating in agricultural crops; case study in wheat. Scientific Papers Series Management, Economic Engineering in Agriculture and Rural Development 21(4): 151-160.

16

Das, B., Hoque, A., Roy, S., Kumar, K., Laskar, A. A., Mazumder, A. S. 2025. Post-Harvest Technologies and Automation: AI-Driven Innovations in Food Processing and Supply Chains. Int. J. Sci. Res. Sci. Technol 12(1): 183-205. https://doi.org/10.32628/IJSRST25121170

Dolata, P., Wróblewski, P., Mrzygłód, M., Reiner, J. 2021. Instance segmentation of root crops and simulation-based learning to estimate their physical dimensions for on-line machine vision yield monitoring. Computers and Electronics in Agriculture 190: 106451. https://doi.org/10.1016/j.compag.2021.106451
Duan, B., Fang, S., Zhu, R., Wu, X., Wang, S., Gong, Y., Peng, Y. 2019. Remote estimation of rice yield with unmanned aerial vehicle (UAV) data and spectral mixture analysis. Frontiers in Plant Science 10: 204. https://doi.org/10.3389/fpls.2019.00204
Dutta, J., Dutta, J., Gogoi, S. 2020. Smart farming: An opportunity for efficient monitoring and detection of pests and diseases. Journal of Entomology and Zoology Studies 8: 2352-2359. https://doi.org/10.22271/j.ento.2020.v8.i4ab.7392
Emmi, L., Gonzalez-de-Soto, M., Pajares, G., Gonzalez-de-Santos, P. 2014. New trends in robotics for agriculture: Integration and assessment of a real fleet of robots. The Scientific World Journal 2014(1): 404059. https://doi.org/10.1155/2014/404059

Fu, H., Wang, C., Cui, G., She, W., Zhao, L. 2021. Ramie yield estimation based on UAV RGB images. Sensors 21(2): 669. https://doi.org/10.3390/s21020669
Gogineni, S., White, J.G., Thomasson, J.A., Thompson, P.G., Wooten, J.R., Shankle, M. 2002. Image-based sweetpotato yield and grade monitor. In 2002 ASAE Annual Meeting. 021169. American Society of Agricultural and Biological Engineers (ASABE). https://doi.org/10.13031/2013.10586
Huynh, T.T., TonThat, L., Dao, S.V. 2022. A vision-based method to estimate volume and mass of fruit/vegetable: Case study of sweet potato. International Journal of Food Properties 25(1): 717-732. https://doi.org/10.1080/10942912.2022.2057528

Islam, S., Reza, M.N., Ahmed, S., Kabir, M.S.N., Chung, S.O., Kim, H. 2023. Short-range sensing for fruit tree water stress detection and monitoring in orchards: A review. Korean Journal of Agricultural Science 50(4): 883-902. https://doi.org/10.7744/kjoas.500424

Islam, S., Reza, M.N., Ahmed, S., Samsuzzaman, Lee, K.H., Cho, Y.J., Noh, D.H., Chung, S.O. 2024. Nutrient stress symptom detection in cucumber seedlings using segmented regression and a mask region-based convolutional neural network model. Agriculture 14(8): 1390. https://doi.org/10.3390/agriculture14081390
Jang, S.H., Moon, S.P., Kim, Y.J., Lee, S.H. 2023. Development of potato mass estimation system based on deep learning. Applied Sciences 13(4): 2614. https://doi.org/10.3390/app13042614
Kabir, M.S., Gulandaz, M.A., Ali, M., Reza, M.N., Kabir, M.S.N., Chung, S.O., Han, K. 2024. Yield monitoring systems for non-grain crops: A review. Korean Journal of Agricultural Science 51(1): 63-77. https://doi.org/10.7744/kjoas.510106

Kabir, M.S., Reza, M.N., Lee, K.H., Gulandaz, M.A., Chowdhury, M. 2025. Radish mass estimation using simulated harvester dynamics using GA-ELM modeling. Precision Agriculture Science and Technology 7(2): 134-149. https://doi.org/10.22765/PASTJ.20250011

Kabir, M.S.N., Reza, M.N., Chowdhury, M., Ali, M., Samsuzzaman, Ali, M.R., Lee, K.Y., Chung, S.O. 2023. Technological trends and engineering issues on vertical farms: A review. Horticulturae 9(11): 1229. https://doi.org/10.3390/horticulturae9111229

Karim, M.R., Reza, M.N., Ahmed, S., Lee, K.H., Sung, J., Chung, S.O. 2025. Proximal LiDAR sensing for monitoring of vegetative growth in rice at different growing stages. Agriculture 15(15): 1579. https://doi.org/10.3390/agriculture15151579
Kaur, B., Dimri, S., Singh, J., Mishra, S., Chauhan, N., Kukreti, T., Sharma, B., Singh, S.P., Arora, S., Uniyal, D., Agrawal, Y. 2023. Insights into the harvesting tools and equipment's for horticultural crops: From then to now. Journal of Agriculture and Food Research 14: 100814. https://doi.org/10.1016/j.jafr.2023.100814

Kim, D.H., Park, Y.N., Cho, Y.S. 2023. Sweet potato, a crop to respond to climate change: Domestic production trend and its industrial application. Food Engineering Progress 27(3): 173-179. https://doi.org/10.13050/foodengprog.2023.27.3.173

Kim, S.K., Park, S., Kwak, J.H., Choi, S.K., Chae, W.B., Yang, E.Y., Lee, M.J., Jang, Y., Seo, M.H., Lee, S.H., Kang, T.G. 2019. Proper plant density for mechanical transplanting of several leafy vegetables under Korean agricultural condition. Journal of Biosystems Engineering 44(4): 276-280. https://doi.org/10.1007/s42853-019-00038-6

Kiraga, S., Reza, M.N., Lee, K.H., Gulandaz, M.A., Karim, M.R., Habineza, E., Kabir, M.S., Lee, D.H., Chung, S.O. 2025. Vibration and slope harvesting conditions affect real-time vision-based radish volume measurements: Experimental study using a laboratory test bench. Journal of Biosystems Engineering 50(2): 193-209. https://doi.org/10.1007/s42853-025-00259-y

Kou, J., Chen, Y., Zhu, G., Zang, X., Li, M., Li, W., Zhang, H. 2023. Effects of ethylene and 1-methylcyclopropene on the quality of sweet potato roots during storage: A review. Horticulturae 9(6): 667. https://doi.org/10.3390/horticulturae9060667

Lee, Y.J., Shin, B.S. 2020. Development of potato yield monitoring system using machine vision. Journal of Biosystems Engineering 45(4): 282-290. https://doi.org/10.1007/s42853-020-00069-4

Li, B., Xu, X., Zhang, L., Han, J., Bian, C., Li, G., Liu, J., Jin, L. 2020. Above-ground biomass estimation and yield prediction in potato by using UAV-based RGB and hyperspectral imaging. ISPRS Journal of Photogrammetry and Remote Sensing 162: 161-172. https://doi.org/10.1016/j.isprsjprs.2020.02.013

Long, Y., Wang, Y., Zhai, Z., Wu, L., Li, M., Sun, H., Su, Q. 2018. Potato volume measurement based on RGB-D camera. IFAC-PapersOnLine 51(17): 515-520. https://doi.org/10.1016/j.ifacol.2018.08.157

Longchamps, L., Tisseyre, B., Taylor, J., Sagoo, L., Momin, A., Fountas, S., Manfrini, L., Ampatzidis, Y., Schueller, J.K., Khosla, R. 2022. Yield sensing technologies for perennial and annual horticultural crops: A review. Precision Agriculture 23(6): 2407-2448. https://doi.org/10.1007/s11119-022-09906-2

Lv, X., Zhang, X., Gao, H., He, T., Lv, Z., Zhangzhong, L. 2024. When crops meet machine vision: A review and development framework for a low-cost nondestructive online monitoring technology in agricultural production. Agriculture Communications 2(1): 100029. https://doi.org/10.1016/j.agrcom.2024.100029
Olakiumide, O. 2021. Post-harvest loss reduction: Enhancing food security and economic sustainability. Journal Siplieria Sciences 2(2): 7-17.

Olson, D., Anderson, J. 2021. Review on unmanned aerial vehicles, remote sensors, imagery processing, and their applications in agriculture. Agronomy Journal 113(2): 971-992. https://doi.org/10.1002/agj2.20595
Park, Y., Seol, J., Pak, J., Jo, Y., Jun, J., Son, H.I. 2023. A novel end-effector for a fruit and vegetable harvesting robot: Mechanism and field experiment. Precision Agriculture 24(3): 948-970. https://doi.org/10.1007/s11119-022-09981-5
Paul, N.C., Sang, H., Park, S., Han, G.H., Liu, H., Kim, H., Lee, J.G. 2021. Fungi associated with postharvest diseases of sweet potato storage roots and in vitro antagonistic assay of Trichoderma harzianum against diseases. Journal of Fungi 7(11): 927. https://doi.org/10.3390/jof7110927

Qin, Y., Naumovski, N., Ranadheera, C.S., D'Cunha, N.M. 2022. Nutrition-related health outcomes of sweet potato (Ipomoea batatas) consumption: A systematic review. Food Bioscience 50: 102208. https://doi.org/10.1016/j.fbio.2022.102208
Rasool, K., Islam, M.N., Ali, M., Jang, B.E., Khan, N.A., Chowdhury, M., Chung, S.O., Kwon, H.J. 2020. Onion transplanting mechanisms: A review. Precision Agriculture Science and Technology 2: 196.

Rather, K.U.I., Alotaibi, E.S. 2025. Utilising machine learning and hyperspectral data to decode growth patterns, cultivar identification and yield dynamics in potato cultivation. Potato Research 68: 4399-4412. https://doi.org/10.1007/s11540-025-09915-4

Reza, M.N., Karim, M.R., Ali, M.R., Lee, K.H., Bicamumakuba, E., Lee, K.Y., Chung, S.O. 2025. Field evaluation of a transplanter and a collector under development for Korean spring cabbage production in greenhouses. AgriEngineering 7(7): 226. https://doi.org/10.3390/agriengineering7070226

Su, Q., Kondo, N., Li, M., Sun, H., Al Riza, D.F., Habaragamuwa, H. 2018. Potato quality grading based on machine vision and 3D shape analysis. Computers and Electronics in Agriculture 152: 261-268. https://doi.org/10.1016/j.compag.2018.07.012
Sun, C., Zhou, J., Ma, Y., Xu, Y., Pan, B., Zhang, Z. 2022. A review of remote sensing for potato traits characterization in precision agriculture. Frontiers in Plant Science 13: 871859. https://doi.org/10.3389/fpls.2022.871859

Tan, H., Zhao, X., Fu, H., Yang, M., Zhai, C. 2025. A novel fusion positioning navigation system for greenhouse strawberry spraying robot using LiDAR and ultrasonic tags. Agriculture Communications 3(2): 100087. https://doi.org/10.1016/j.agrcom.2025.100087

Villordon, A., Gregorie, J.C., LaBonte, D. 2020. Direct measurement of sweet potato surface area and volume using a low-cost 3D scanner for identification of shape features related to processing product recovery. HortScience 55(5): 722-728. https://doi.org/10.21273/HORTSCI14964-20
Wang, H., Blok, P.M., Burridge, J., Jiang, T., Miyauchi, M., Miyamoto, K., Tanaka, K., Guo, W. 2025. 3DPotatoTwin: A paired potato tuber dataset for 3D multi-sensory fusion. Plant Phenomics 7(4): 100123. https://doi.org/10.1016/j.plaphe.2025.100123

Wang, W., Li, C. 2014. Size estimation of sweet onions using consumer-grade RGB-depth sensor. Journal of Food Engineering 142: 153-162. https://doi.org/10.1016/j.jfoodeng.2014.06.019

Won, J., Kim, D.C., Han, J., Park, O.R., Cho, Y. 2024. Evaluation of a prototype double-conveyor sweet potato harvester for low-damage rate. Journal of Agricultural, Life and Environmental Sciences 36(4): 522-533. https://doi.org/10.22698/JALES.20240040
Wu, Y.L., Liong, S.T., Liong, G.B., Liang, J.H., Gan, Y.S. 2025. Enhanced geometric properties prediction for carrots in motion using a Multi-task YOLO-based Linked Network (YL-Net). Computers and Electronics in Agriculture 237: 110583. https://doi.org/10.1016/j.compag.2025.110583

Xie, W., Wei, S., Yang, D. 2023. Morphological measurement for carrot based on three-dimensional reconstruction with a ToF sensor. Postharvest Biology and Technology 197: 112216. https://doi.org/10.1016/j.postharvbio.2022.112216

Xu, J., Lu, Y., Olaniyi, E., Harvey, L. 2024. Online volume measurement of sweet potatoes by a LiDAR-based machine vision system. Journal of Food Engineering 361: 111725. https://doi.org/10.1016/j.jfoodeng.2023.111725
Zeng, L., Peng, G., Meng, R., Man, J., Li, W., Xu, B., Lv, Z., Sun, R. 2021. Wheat yield prediction based on unmanned aerial vehicles-collected red-green-blue imagery. Remote Sensing 13(15): 2937. https://doi.org/10.3390/rs13152937

Zheng, B., Sun, G., Meng, Z., Nan, R. 2022. Vegetable size measurement based on stereo camera and keypoints detection. Sensors 22(4): 1617. https://doi.org/10.3390/s22041617