
TLU-Net: A Deep Learning Approach for Automatic Steel Surface Defect Detection

Uploaded 2021-01-24 05:16:42 · PDF, 1.93 MB · 17 views


Visual steel surface defect detection is an essential step in steel sheet manufacturing. Several machine learning-based automated visual inspection (AVI) methods have been studied in recent years. However, most steel manufacturers still rely on manual visual inspection because of the training time and inaccuracies involved in AVI methods. Automatic steel defect detection could enable less expensive and faster quality control and feedback, but preparing annotated training data for segmentation and classification can be costly. In this work, we propose the Transfer Learning-based U-Net (TLU-Net) framework for steel surface defect detection. We use a U-Net architecture as the base and explore two kinds of encoders: ResNet and DenseNet. We compare the performance of these networks under random initialization and with pre-trained weights learned on the ImageNet data set. The experiments are performed on the Severstal data. The results demonstrate that transfer learning performs 5% (absolute) better than random initialization in defect classification and 26% (relative) better in defect segmentation. We also found that the gain from transfer learning increases as the training data decreases, and that convergence with transfer learning is faster than with random initialization.

