Paper Title

Teacher-Student Domain Adaptation for Biosensor Models

Paper Authors

Phillips, Lawrence G., Grimes, David B., Li, Yihan Jessie

Paper Abstract

We present an approach to domain adaptation, addressing the case where data from the source domain is abundant, labelled data from the target domain is limited or non-existent, and a small amount of paired source-target data is available. The method is designed for developing deep learning models that detect the presence of medical conditions based on data from consumer-grade portable biosensors. It addresses some of the key problems in this area, namely, the difficulty of acquiring large quantities of clinically labelled data from the biosensor, and the noise and ambiguity that can affect the clinical labels. The idea is to pre-train an expressive model on a large dataset of labelled recordings from a sensor modality for which data is abundant, and then to adapt the model's lower layers so that its predictions on the target modality are similar to the original model's on paired examples from the source modality. We show that the pre-trained model's predictions provide a substantially better learning signal than the clinician-provided labels, and that this teacher-student technique significantly outperforms both a naive application of supervised deep learning and a label-supervised version of domain adaptation on a synthetic dataset and in a real-world case study on sleep apnea. By reducing the volume of data required and obviating the need for labels, our approach should reduce the cost associated with developing high-performance deep learning models for biosensors.
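The adaptation step described above can be sketched with a toy numpy example. Everything here is illustrative, not the paper's actual setup: a linear "model" stands in for the deep network, the paired data is synthetic, and a squared-error distillation loss is assumed. The point is the core mechanism: the frozen teacher's soft predictions on source-modality recordings serve as the training signal for the student on the paired target-modality recordings, with no labels involved.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical paired data: x_src stands in for the data-rich source
# modality, x_tgt for the consumer biosensor's target modality.
n, d = 200, 8
x_src = rng.normal(size=(n, d))
x_tgt = x_src + 0.1 * rng.normal(size=(n, d))  # paired, noisier view

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Frozen teacher: a linear scorer assumed pre-trained on the source modality.
w_teacher = rng.normal(size=d)
p_teacher = sigmoid(x_src @ w_teacher)  # soft predictions, used as targets

# Student starts from the teacher's weights; we adapt this (lower) layer so
# its predictions on the target modality track the teacher's predictions on
# the paired source examples -- no clinical labels are used.
w_student = w_teacher.copy()

def distill_loss(w):
    return float(np.mean((sigmoid(x_tgt @ w) - p_teacher) ** 2))

loss_before = distill_loss(w_student)
lr = 0.5
for _ in range(500):
    p_s = sigmoid(x_tgt @ w_student)
    # Gradient of the mean squared error between student and teacher probs.
    grad = x_tgt.T @ ((p_s - p_teacher) * p_s * (1.0 - p_s)) * (2.0 / n)
    w_student -= lr * grad
loss_after = distill_loss(w_student)

# Fraction of examples where the student's decision on the target modality
# matches the teacher's decision on the paired source recording.
agreement = float(
    np.mean((sigmoid(x_tgt @ w_student) > 0.5) == (p_teacher > 0.5))
)
```

After adaptation, the distillation loss should drop and the student's decisions on the target modality should largely agree with the teacher's on the paired source data, which is the behavior the abstract's teacher-student objective is designed to produce.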
