Citation: MA Jingyu, SUN Tao, WANG Yanrong, et al. A Fast Identification Method for Fritillaria Varieties Based on the Fusion of Electronic Tongue and Electronic Eye Information[J]. Science and Technology of Food Industry, 2024, 45(18): 9−18. (in Chinese with English abstract). doi: 10.13386/j.issn1002-0306.2024020161.

A Fast Identification Method for Fritillaria Varieties Based on the Fusion of Electronic Tongue and Electronic Eye Information

Abstract: Fritillaria is a widely used traditional Chinese medicine with complex sources and numerous varieties. Different varieties share similar external characteristics and are difficult to distinguish with traditional methods. To achieve rapid and objective identification of Fritillaria varieties, this study proposed an identification method that combines an electronic tongue (ET) and an electronic eye (EE) with a deep learning model. The ET and the EE were used to collect gustatory fingerprint information and visual image information, respectively, from different Fritillaria varieties. For the ET signals, a Transformer encoder improved with a causal attention mechanism was employed to extract time-series features and strengthen the extraction of local features. For the EE images, a ShuffleNetV2 network improved with a coordinate attention mechanism was used to extract morphological features from different regions of the images while suppressing background noise. A weighted feature-fusion classification module was then proposed to adaptively fuse the features extracted by the ET and EE branches and to classify the fused features. The experimental results showed that the information fusion method achieved better classification performance than the ET or the EE alone, with a test accuracy of 98.4%. This study provides a new approach for the rapid identification of Fritillaria varieties and offers a research direction for the variety classification and traceability analysis of other Chinese medicinal materials.
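The abstract describes the fusion architecture only at a high level. The sketch below is a minimal, hypothetical PyTorch illustration of the adaptive weighted fusion of the two modality branches: a generic Transformer encoder with a causal mask stands in for the paper's causal-attention encoder, the electronic-eye branch is reduced to a pre-extracted feature vector (standing in for the coordinate-attention ShuffleNetV2), and all dimensions, layer counts, sensor and class counts, and the softmax-based weighting rule are assumptions rather than the authors' implementation.

```python
# Hypothetical sketch only: layer sizes, sensor counts, class count, and the
# softmax-weighted fusion rule are assumptions, not the paper's implementation.
import torch
import torch.nn as nn


class FritillariaFusionNet(nn.Module):
    """Two-branch fusion classifier for ET time series + EE image features."""

    def __init__(self, et_channels=7, fused_dim=256, ee_feat_dim=1024,
                 num_classes=5):
        super().__init__()
        # Electronic-tongue branch: a standard TransformerEncoder with a causal
        # mask stands in for the causal-attention encoder named in the abstract.
        self.et_embed = nn.Linear(et_channels, 64)
        enc_layer = nn.TransformerEncoderLayer(d_model=64, nhead=4,
                                               batch_first=True)
        self.et_encoder = nn.TransformerEncoder(enc_layer, num_layers=2)
        self.et_proj = nn.Linear(64, fused_dim)

        # Electronic-eye branch: assumes an upstream CNN (e.g. ShuffleNetV2 with
        # coordinate attention) has already produced a feature vector per image.
        self.ee_proj = nn.Linear(ee_feat_dim, fused_dim)

        # Adaptive weighting: learn one scalar per modality and normalize with
        # a softmax before summing the projected features.
        self.modal_logits = nn.Parameter(torch.zeros(2))
        self.classifier = nn.Sequential(nn.ReLU(),
                                        nn.Linear(fused_dim, num_classes))

    def forward(self, et_signal, ee_feat):
        # et_signal: (batch, time_steps, et_channels); ee_feat: (batch, ee_feat_dim)
        t = et_signal.size(1)
        causal_mask = torch.triu(
            torch.full((t, t), float("-inf"), device=et_signal.device),
            diagonal=1)
        et = self.et_encoder(self.et_embed(et_signal), mask=causal_mask)
        et = self.et_proj(et.mean(dim=1))            # pool over time steps
        ee = self.ee_proj(ee_feat)
        w = torch.softmax(self.modal_logits, dim=0)  # adaptive modality weights
        fused = w[0] * et + w[1] * ee
        return self.classifier(fused)


# Shape check with random tensors (4 samples, 120 time steps, 7 ET sensors).
model = FritillariaFusionNet()
print(model(torch.randn(4, 120, 7), torch.randn(4, 1024)).shape)  # [4, 5]
```

Learning two scalar logits and normalizing them with a softmax is just one simple way to realize "adaptive weighted fusion"; the paper's module may instead learn per-sample or per-dimension weights.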

     
