
MINOBRNAUKI OF RUSSIA
Federal State Budgetary Institution of Science
Institute for Problems in Mechanical Engineering of the Russian Academy of Sciences


Incremental deep learning training approach for lesion detection and classification in mammograms

Authors:
Habib Rostami, Hamed Behzadi-Khormouji, Siavash Salemi, Ahmad Keshavarz, Yaser Keshavarz, Yahya Tabesh
Pages:
234–245
Abstract:

Recently, Deep Convolutional Neural Networks (DCNNs) have found their way into various medical image processing applications such as Computer-Aided Diagnosis (CAD) systems. Despite significant progress in CAD systems based on deep models, designing an efficient model, as well as a training strategy that copes with the shortage of medical images, has yet to be addressed. To address these challenges, this paper presents a model comprising a hybrid DCNN, which takes advantage of the feature maps of several different deep models, together with an incremental training algorithm. A weighted Test-Time Augmentation strategy is also presented. In addition, the proposed work extends Mask R-CNN not only to detect masses and calcifications in mammography images but also to classify normal images. Moreover, this work enlists a radiology specialist whose performance is compared with that of the proposed method. Highlighting the region of interest to explain how the model makes its decisions is a further aim of the study, addressing a gap in state-of-the-art research. A wide range of quantitative and qualitative experiments suggests that the proposed method classifies breast X-ray images of the INbreast dataset into normal, mass, and calcification classes with accuracies of 0.96, 0.98, and 0.97, respectively.
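The abstract mentions a weighted Test-Time Augmentation (TTA) strategy. The paper's exact weighting scheme is not given here, so the following is only a generic sketch of the idea: run the classifier on several augmented views of the same image and combine the per-view class probabilities with normalized weights. The `toy_model`, `hflip`, and the weight values are illustrative placeholders, not the authors' components.

```python
import numpy as np

def weighted_tta_predict(model, image, augmentations, weights):
    """Generic weighted TTA sketch (not the authors' exact formulation):
    score each augmented view of the image, then take a weighted average
    of the resulting class-probability vectors."""
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()  # normalize so the weights sum to 1
    # One probability vector per augmented view, stacked row-wise.
    probs = np.stack([model(aug(image)) for aug in augmentations])
    return weights @ probs  # weighted average over views

# Toy usage: a stand-in "model" returning two class probabilities,
# combined over the identity view and a horizontal flip.
toy_model = lambda x: np.array([x.mean(), 1.0 - x.mean()])
identity = lambda x: x
hflip = lambda x: x[:, ::-1]
img = np.array([[0.2, 0.8], [0.4, 0.6]])
p = weighted_tta_predict(toy_model, img, [identity, hflip], [0.6, 0.4])
```

In practice the weights could reflect each augmentation's validation accuracy, so that more reliable views contribute more to the final prediction.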
