Breast cancer is the most frequently diagnosed cancer among women, and its incidence is increasing year by year. Early diagnosis enables patients to be detected and treated in time. The gold standard for breast cancer diagnosis is the pathological examination of breast tissue. With the development of Computer Aided Diagnosis (CAD), automated methods have been proposed to assist the analysis of breast histopathology images. However, classification methods based on handcrafted features rely on accurate cell segmentation and feature extraction, both of which remain challenging because of limited image quality. With the development of deep learning, convolutional neural network (CNN) models have been applied to classify breast cancer histopathological images; such methods automatically learn discriminative and representative information from the raw data, but large labeled datasets are crucial to their performance. As an effective remedy, transfer learning based on convolutional neural networks has been developed for classification on small datasets by sharing parameters learned from other large datasets. Nevertheless, deep transfer learning that relies on a single deep high-level feature cannot fully exploit the hidden features of microscopic images, which restricts the classification accuracy and the stability of the model.

To address these problems, this thesis proposes two methods based on cross-domain deep transfer learning. Transfer from similar domains and multi-type feature fusion are used to fully exploit the features of breast cancer microscopic images in the target domain, so that rich multi-level features can be learned from limited breast histopathology images. These features are jointly used in model training, which alleviates overfitting and yields a stable and effective model. The specific research contents are as follows:

(1) A multi-level deep feature fusion method based on deep transfer learning is proposed for the classification of breast cancer microscopic images. First, a multi-level deep feature fusion classification model is pretrained on the public breast cancer dataset BreaKHis. Then, the parameters of the pretrained model are transferred to the target model for the classification of the target breast cancer microscopic images. Finally, multi-level (low-level, middle-level, high-level) feature fusion is performed on the target images, and the transferred model is fine-tuned to obtain the final classification model. The method achieves classification accuracies of 93.75%, 94.32% and 91.45% on the three classification tasks, which confirms its effectiveness.
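A minimal sketch of the multi-level fusion idea is given below. It assumes a torchvision VGG16 backbone with ImageNet weights as a stand-in for the BreaKHis-pretrained source model; the stage split points, global pooling, and single-layer classifier head are illustrative assumptions, not the thesis implementation.

```python
import torch
import torch.nn as nn
from torchvision import models

class MultiLevelFusionNet(nn.Module):
    """Fuse low-, middle- and high-level CNN features from a transferred backbone."""
    def __init__(self, num_classes: int = 2):
        super().__init__()
        # Assumption: ImageNet weights stand in for the BreaKHis-pretrained source model.
        backbone = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1).features
        self.low = backbone[:10]     # conv blocks 1-2 -> 128 channels (low-level)
        self.mid = backbone[10:17]   # conv block 3    -> 256 channels (middle-level)
        self.high = backbone[17:]    # conv blocks 4-5 -> 512 channels (high-level)
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.classifier = nn.Linear(128 + 256 + 512, num_classes)

    def forward(self, x):
        f_low = self.low(x)
        f_mid = self.mid(f_low)
        f_high = self.high(f_mid)
        # Global-average-pool each level and concatenate into one fused descriptor.
        fused = torch.cat([self.pool(f).flatten(1) for f in (f_low, f_mid, f_high)], dim=1)
        return self.classifier(fused)

# Hypothetical usage: the fused model is then fine-tuned on the target microscopic images.
model = MultiLevelFusionNet(num_classes=2)
logits = model(torch.randn(4, 3, 224, 224))   # -> shape (4, 2)
```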
(2) An interactive cross-task extreme learning machine based on double deep high-level features is proposed for the classification of breast histopathology images. The method consists of two parts: double deep transfer learning (D²TL) and an interactive cross-task extreme learning machine (ICELM). First, transfer learning (TL) and double-step transfer learning (DSTL) of CNNs are used to extract two deep hierarchical feature representations from the same breast cancer target data. After high-level feature extraction, the two feature sets are applied as regularization terms in the interactive cross-task extreme learning machine to improve the stability and classification accuracy of the proposed method. Finally, the optimal classification model of breast histopathological microscopic images is obtained. The method achieves classification accuracies of 98.18%, 96.96% and 96.67% on the three classification tasks, which confirms its effectiveness.
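For orientation, the sketch below shows only the basic regularized extreme learning machine (ELM) building block that such a classifier rests on, applied to one deep feature set; the interactive cross-task coupling of the two feature sets in ICELM is not reproduced here, and the feature-set names in the usage comment are hypothetical.

```python
import numpy as np

def train_elm(features, labels, hidden_units=1000, C=1.0, seed=0):
    """Fit a single-hidden-layer ELM: random input weights, closed-form output weights."""
    rng = np.random.default_rng(seed)
    n, d = features.shape
    W = rng.standard_normal((d, hidden_units))
    b = rng.standard_normal(hidden_units)
    H = np.tanh(features @ W + b)              # random hidden-layer activations
    T = np.eye(labels.max() + 1)[labels]       # one-hot target matrix
    # Ridge-regularized least squares for the output weights beta.
    beta = np.linalg.solve(H.T @ H + np.eye(hidden_units) / C, H.T @ T)
    return W, b, beta

def predict_elm(features, W, b, beta):
    return np.argmax(np.tanh(features @ W + b) @ beta, axis=1)

# Hypothetical usage: feats_tl and feats_dstl would be the two high-level feature
# sets extracted by the TL and DSTL branches for the same target images.
# W, b, beta = train_elm(feats_tl, train_labels)
# predictions = predict_elm(feats_tl_test, W, b, beta)
```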