Introduction: The great diversity of tropical timber species demands new technologies capable of identifying them by their anatomical patterns and characteristics. The application of convolutional neural networks (CNNs) to the recognition of tropical timber species has grown in recent years owing to their promising results. Objective: To evaluate the quality of macroscopic images obtained with three cutting tools, in order to improve the visualization and distinction of anatomical features during CNN model training. Methods: Samples were collected from 2020 to 2021 in logging areas and sawmills in the Central Jungle, Peru. After botanical and anatomical identification, the samples were sized and cut in cross section. A database of macroscopic cross-section images was generated by cutting with three different tools and evaluating their performance in the laboratory, in the field, and at checkpoints. Results: With the three cutting tools we obtained high-quality images of the wood cross section; 3 750 macroscopic images of 25 timber species were captured with a portable microscope. The "Tramontina" knife is durable but loses its edge easily and requires a sharpening tool; the "Pretul" retractable cutter is suitable for cutting soft and hard wood in small laboratory samples; and the "Ubermann" knife is suitable for field, laboratory, and checkpoint use because it has a durable sheath and interchangeable blades in case of dullness. Conclusion: Image quality is decisive in the classification of timber species, because it allows better visualization and distinction of anatomical characteristics during training of the EfficientNet B0 and Custom Vision convolutional neural network models, as evidenced by the precision metrics.
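The conclusion rests on comparing precision metrics across the trained classifiers. As a minimal sketch of the metric involved, the following computes per-class precision (TP / (TP + FP)) from true and predicted labels; the species names and predictions are invented for illustration and are not the study's data.

```python
from collections import Counter

def per_class_precision(y_true, y_pred):
    """Precision for each predicted class: TP / (TP + FP)."""
    pred_counts = Counter(y_pred)  # TP + FP: all predictions per class
    true_positives = Counter(t for t, p in zip(y_true, y_pred) if t == p)
    return {c: true_positives[c] / pred_counts[c] for c in pred_counts}

# Illustrative labels only (not the paper's 25-species dataset).
y_true = ["cedro", "cedro", "tornillo", "tornillo", "tornillo", "ishpingo"]
y_pred = ["cedro", "tornillo", "tornillo", "tornillo", "ishpingo", "ishpingo"]
print(per_class_precision(y_true, y_pred))
# → {'cedro': 1.0, 'tornillo': 0.6666666666666666, 'ishpingo': 0.5}
```

In practice such metrics would be computed over the held-out test split of the 3 750-image database, one precision value per species.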
Translated title of the contribution: Cutting tools to optimize classification parameters of timber species with convolutional neural networks
Journal: Revista de Biologia Tropical
Indexed: 1 Jan 2023
Bibliographical note: Publisher Copyright © 2023, Universidad de Costa Rica. All rights reserved.