Xinjiang long-staple cotton is widely used in the production of high-end textiles due to its excellent quality. However, foreign fibers such as plastic film, cotton boll hulls, human hair, and polypropylene fibers are easily introduced during mechanical harvesting and processing. At present, the cleaning of long-staple cotton in China relies mainly on manual sorting, and workers are prone to visual fatigue during long shifts, which reduces the accuracy and consistency of detection.
Traditional foreign fiber identification technologies mostly rely on the color features of RGB images or on fluorescence reactions. However, they struggle to distinguish foreign fibers that are white, transparent, or similar in color to cotton (such as plastic film and white packaging rope), and colorless, transparent plastic film that produces no fluorescence reaction is even harder to classify. How, then, can these hard-to-detect foreign fibers in long-staple cotton be identified accurately while improving sorting efficiency and the level of automation?
Associate Professor Ling Zhao from the College of Mechanical and Automotive Engineering, Liaocheng University, and his team proposed an intelligent identification method based on hyperspectral imaging and the PCA-AlexNet model, providing a new solution to this problem. The related research has been published in Frontiers of Agricultural Science and Engineering (DOI: 10.15302/J-FASE-2025639).
The study innovatively integrates hyperspectral imaging technology with a deep learning model. Hyperspectral imaging can simultaneously capture spatial and spectral information of objects. Each pixel contains reflectance data from multiple bands, forming a continuous spectral curve that can distinguish foreign fibers with similar colors. The research team first performed dimensionality reduction on the hyperspectral data using principal component analysis (PCA) to select the optimal feature bands for each type of foreign fiber, reducing data redundancy and shortening model training time. Subsequently, they fine-tuned the parameters of the classic AlexNet convolutional neural network, trained the model using the data from the selected feature bands, and finally determined the optimal model as PCA-AlexNet-23.
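The PCA dimensionality-reduction step above can be sketched in a few lines of NumPy. Everything here is an illustrative assumption rather than the paper's actual pipeline: the cube size (64 x 64 pixels), the band count (200), and the choice of retaining 23 components (the "23" in PCA-AlexNet-23 may instead refer to selected feature bands, which the article does not specify).

```python
import numpy as np

# Hypothetical hyperspectral cube: 64x64 pixels, 200 spectral bands.
# (All dimensions are illustrative; the paper's actual values are not given here.)
rng = np.random.default_rng(0)
cube = rng.random((64, 64, 200))

# Flatten the spatial dimensions: each row is one pixel's spectral curve.
pixels = cube.reshape(-1, 200)

# PCA via SVD on mean-centered data.
centered = pixels - pixels.mean(axis=0)
_, s, vt = np.linalg.svd(centered, full_matrices=False)

# Keep the top-k principal components (k=23 is an assumption, echoing the
# model name; the paper may define the "23" differently).
k = 23
scores = centered @ vt[:k].T          # projected pixels, shape (4096, 23)
reduced = scores.reshape(64, 64, k)   # back to a spatial layout for the CNN

# Fraction of total variance captured by the retained components.
explained = (s[:k] ** 2).sum() / (s ** 2).sum()
print(reduced.shape, round(float(explained), 3))
```

The reduced cube keeps the full spatial layout while shrinking the spectral axis, which is what allows a standard 2D CNN such as AlexNet to consume it directly.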
Experimental results show that the PCA-AlexNet-23 model exhibits excellent performance in multi-class foreign fiber identification: the overall accuracy (OA) reaches 97.2%, the average accuracy (AA) is 95.2%, and the Kappa coefficient is 93.1%, all outperforming traditional models such as the support vector machine (SVM), artificial neural network (ANN), and LDA-VGGNet. In practical sorting tests, the foreign fiber removal rate exceeds 85%. The model is particularly effective at identifying foreign fibers that are white, transparent, or similar in color to cotton, addressing the weak recognition capability of traditional methods for such fibers.
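The three metrics reported above (OA, AA, and the Kappa coefficient) are standard summaries of a classifier's confusion matrix. A minimal sketch of how they are computed, using a toy 3-class confusion matrix whose counts are illustrative and not the paper's results:

```python
import numpy as np

def metrics(cm):
    """OA, AA, and Cohen's kappa from a confusion matrix
    (rows: true class, columns: predicted class)."""
    cm = np.asarray(cm, dtype=float)
    n = cm.sum()
    oa = np.trace(cm) / n                       # overall accuracy
    aa = np.mean(np.diag(cm) / cm.sum(axis=1))  # mean per-class accuracy
    # Chance agreement expected from the row/column marginals.
    pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n ** 2
    kappa = (oa - pe) / (1 - pe)
    return oa, aa, kappa

# Toy 3-class example (counts are made up for illustration).
cm = [[48, 1, 1],
      [2, 45, 3],
      [0, 2, 48]]
oa, aa, kappa = metrics(cm)
print(round(oa, 3), round(aa, 3), round(kappa, 3))  # -> 0.94 0.94 0.91
```

Kappa discounts the agreement a classifier would reach by chance given the class frequencies, which is why it is typically a few points below OA, as in the paper's 97.2% OA versus 93.1% Kappa.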
PCA effectively reduces the dimensionality of hyperspectral data by retaining the most critical feature information, avoiding interference from redundant bands. The optimized AlexNet model can automatically extract joint spectral and spatial features, improving classification accuracy. Compared with 3D convolutional neural networks, which have numerous parameters and long training times, this model adopts a 2D convolutional structure, reducing computational cost while maintaining accuracy.
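The computational advantage of the 2D structure can be illustrated with a rough multiply-accumulate (MAC) count for a single convolutional layer. All sizes below (a 64x64 patch, 23 versus 200 bands, 3x3 kernels, a depth-7 spectral kernel, 64 filters) are illustrative assumptions, not figures from the paper:

```python
# Rough MAC count for one conv layer: a 2D convolution over PCA-reduced
# bands versus a 3D convolution over the full hyperspectral cube.
# (All sizes are illustrative assumptions, not the paper's architecture.)

H = W = 64

# 2D conv: 23 PCA bands treated as input channels; output is H x W x 64.
macs_2d = H * W * 64 * (23 * 3 * 3)

# 3D conv: the full 200-band cube as a single-channel volume. The spectral
# axis survives into the output, so every output voxel at every band pays
# for a 3x3x7 kernel.
D = 200
macs_3d = H * W * D * 64 * (1 * 3 * 3 * 7)

print(macs_2d, macs_3d, round(macs_3d / macs_2d, 1))
```

Under these assumptions the 3D layer costs roughly 60x more multiply-accumulates, chiefly because the output retains the full spectral depth, which matches the article's point that 2D convolution over reduced bands trades little accuracy for a large saving in computation.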
Currently, the mechanized harvesting and processing technology for Xinjiang long-staple cotton is still in its early stages. This method provides core technical support for the automated sorting of foreign fibers in long-staple cotton, helping to reduce reliance on manual labor and improve production efficiency. In the future, the research team plans to expand the dataset of foreign fiber types, optimize data preprocessing, and explore multi-source data fusion methods to further improve the performance of hyperspectral multi-target recognition algorithms, promoting the development of the Xinjiang long-staple cotton industry toward efficient, fully automated mechanization.