Using a custom "camera-to-rice" platform combined with deep-learning methods for feature extraction, matching, segmentation, and denoising, the system generates detailed 3D point clouds across growth stages. Height, canopy area, and volume estimates closely match manual measurements, enabling efficient, accurate monitoring and improved management of rice seedling cultivation in cold regions.
Rice is a staple food for billions of people, and in cold regions such as Northeast China, short growing seasons and low temperatures place strict demands on seedling quality. Traditional phenotyping—visually inspecting plants and measuring traits by hand—is slow, labor-intensive, and prone to human error, making it difficult to manage large nurseries with precision. Existing 3D technologies like LiDAR and depth cameras can capture plant structure but are often expensive, data-heavy, and sensitive to ambient light. Passive multiview stereo approaches are cheaper, yet they struggle with dense canopies, similar textures, and changing illumination, which can cause mismatches and missing structures in reconstructed models. These challenges underscore the need for a low-cost, robust 3D reconstruction method tailored to densely planted, nonrigid seedlings such as rice.
A study (DOI: 10.1016/j.plaphe.2025.100122) published in Plant Phenomics on 30 September 2025 by the team of Rui Gao and Zhongbin Su at Northeast Agricultural University establishes a low-cost, high-precision 3D reconstruction and phenotyping framework that reliably quantifies rice seedling growth traits, enabling efficient, nondestructive monitoring for improved crop management.
In this study, the authors developed a 3D phenotyping pipeline for rice seedlings on an Ubuntu workstation equipped with deep-learning and computer-vision tools. The system integrates Python/C++ with PyTorch, employs COLMAP for sparse reconstruction and OpenMVS for dense reconstruction, and uses PCL for point cloud processing.

The team compared three feature extractors (ORB, SIFT, and SuperPoint) across multiview rice images, assessing feature count, quality, and extraction time, and then evaluated matching methods including FLANN, SuperGlue, and LightGlue under varied lighting, using repeatability, localization error, and match quantity as metrics. Although SuperPoint extracts fewer features and is slower than ORB or SIFT, it yields markedly superior and more stable keypoints in low-texture and overlapping leaf regions. Combined with LightGlue, it greatly improves matching robustness and repeatability and reduces localization error relative to traditional SIFT/ORB + FLANN, while also running more efficiently than SuperPoint + SuperGlue. Among the five reconstruction pipelines compared, SuperPoint + LightGlue emerged as the strongest performer within the SfM framework, delivering the longest camera trajectories and the lowest reprojection errors with only moderate computational overhead, suitable for daily phenotyping.

DeepLabV3+ provided accurate background segmentation (accuracy 0.98, IoU 0.971), and a two-stage denoising step of HSV color filtering followed by statistical outlier removal produced cleaner, more consistent point clouds with clearer boundaries and greater completeness. Trait extraction from the reconstructed point clouds was highly precise, with plant height (R² = 0.989), canopy area (R² = 0.991), and volume (R² = 0.984) closely matching reference measurements, demonstrating reliable quantification across growth stages.
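To make the feature extraction and matching step concrete, the sketch below uses the open-source SuperPoint and LightGlue implementations from the cvg/LightGlue package; the image file names and keypoint budget are illustrative placeholders, not the authors' exact settings or code.

```python
import torch
from lightglue import LightGlue, SuperPoint
from lightglue.utils import load_image, rbd

device = "cuda" if torch.cuda.is_available() else "cpu"

# SuperPoint keypoint detector/descriptor and LightGlue matcher (pretrained weights).
extractor = SuperPoint(max_num_keypoints=2048).eval().to(device)  # keypoint budget is a placeholder
matcher = LightGlue(features="superpoint").eval().to(device)

# Two overlapping views of the same seedling tray (hypothetical file names).
image0 = load_image("view_000.jpg").to(device)
image1 = load_image("view_001.jpg").to(device)

feats0 = extractor.extract(image0)
feats1 = extractor.extract(image1)
matches01 = matcher({"image0": feats0, "image1": feats1})

# Drop the batch dimension and gather the matched keypoint coordinates.
feats0, feats1, matches01 = [rbd(x) for x in (feats0, feats1, matches01)]
matches = matches01["matches"]                # (K, 2) indices into each keypoint set
points0 = feats0["keypoints"][matches[:, 0]]  # matched pixel coordinates in image0
points1 = feats1["keypoints"][matches[:, 1]]  # matched pixel coordinates in image1
```

Correspondences such as points0/points1 are what the SfM stage (COLMAP in this work) consumes to estimate camera poses and a sparse point cloud before OpenMVS densification.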
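The two-stage denoising step can be sketched as follows. This version uses Open3D as a stand-in for the PCL-based tooling reported in the paper, and the green hue band and outlier-removal parameters are assumptions chosen for illustration.

```python
import numpy as np
import open3d as o3d
from matplotlib.colors import rgb_to_hsv

def denoise_seedling_cloud(pcd,
                           hue_range=(0.15, 0.45),  # assumed green band, hue in [0, 1]
                           nb_neighbors=20, std_ratio=2.0):
    """Two-stage cleanup: (1) HSV color gate, (2) statistical outlier removal."""
    colors = np.asarray(pcd.colors)               # RGB values in [0, 1]
    hsv = rgb_to_hsv(colors)
    keep = (hsv[:, 0] >= hue_range[0]) & (hsv[:, 0] <= hue_range[1])
    stage1 = pcd.select_by_index(np.where(keep)[0])
    stage2, _ = stage1.remove_statistical_outlier(nb_neighbors=nb_neighbors,
                                                  std_ratio=std_ratio)
    return stage2

pcd = o3d.io.read_point_cloud("dense_cloud.ply")  # hypothetical dense-reconstruction output
clean = denoise_seedling_cloud(pcd)
o3d.io.write_point_cloud("clean_cloud.ply", clean)
```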
From the reconstructed 3D point clouds, key phenotypic traits are extracted with high accuracy. Average plant height, estimated from the top 5% of points, shows strong agreement with manual measurements (R² = 0.989, RMSE = 4.54 mm). Canopy area computed from top-view projections is similarly precise (R² = 0.991, RMSE = 18.29 cm²). Seedling volume calculated via a voxel-based method aligns well with Poisson reconstruction results (R² = 0.984, rRMSE = 2.18%). These nondestructive metrics enable efficient monitoring of seedling vigor, canopy development, and biomass, supporting optimized management in cold-region rice cultivation.
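The trait computations described above can be approximated with a short script. This is only an illustrative reading of the reported methods (top-5% height, top-view projection area, voxel-occupancy volume), not the authors' code; the grid and voxel sizes are placeholders, and the cloud is assumed to be metrically scaled in millimetres with the z-axis pointing up.

```python
import numpy as np
import open3d as o3d

def seedling_traits(pcd, voxel_size=2.0, grid_res=1.0):
    """Illustrative trait extraction from a cleaned seedling point cloud (units: mm)."""
    pts = np.asarray(pcd.points)

    # Plant height: mean of the top 5% of z-values, measured from the cloud's base.
    z = np.sort(pts[:, 2])
    height_mm = z[int(0.95 * len(z)):].mean() - z.min()

    # Projected canopy area: rasterise the top view onto a grid_res x grid_res grid
    # and count occupied cells.
    cells = np.unique(np.floor(pts[:, :2] / grid_res).astype(np.int64), axis=0)
    canopy_area_mm2 = len(cells) * grid_res ** 2

    # Voxel-based volume: number of occupied voxels times the single-voxel volume.
    voxels = o3d.geometry.VoxelGrid.create_from_point_cloud(pcd, voxel_size=voxel_size)
    volume_mm3 = len(voxels.get_voxels()) * voxel_size ** 3

    return height_mm, canopy_area_mm2, volume_mm3

pcd = o3d.io.read_point_cloud("clean_cloud.ply")  # hypothetical denoised cloud
height, area, volume = seedling_traits(pcd)
print(f"height={height:.1f} mm, canopy area={area / 100:.1f} cm2, volume={volume:.0f} mm3")
```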