TY - GEN
T1 - Unsharp masking layer
T2 - 28th International Conference on Artificial Neural Networks, ICANN 2019
AU - Carranza-Rojas, Jose
AU - Calderon-Ramirez, Saul
AU - Mora-Fallas, Adán
AU - Granados-Menani, Michael
AU - Torrents-Barrena, Jordina
N1 - Publisher Copyright:
© Springer Nature Switzerland AG 2019.
PY - 2019
Y1 - 2019
AB - Image enhancement refers to the enrichment of certain image features such as edges, boundaries, or contrast. The main objective is to process the original image so that the overall performance of visualization, classification, and segmentation tasks is considerably improved. Traditional techniques require manual fine-tuning of their parameters to control the enhancement behavior. Recent Convolutional Neural Network (CNN) approaches frequently employ such enhancement techniques as a pre-processing step. In this work, we present the first intrinsic CNN pre-processing layer based on the well-known unsharp masking algorithm. The proposed layer injects prior knowledge about how to enhance the image by adding high-frequency information to the input, thereby emphasizing meaningful image features. The layer optimizes the unsharp masking parameters during model training, without any manual intervention. We evaluate the network performance and impact on two applications: CIFAR100 image classification and the PlantCLEF identification challenge. The results show a significant improvement over popular CNNs, with gains of 9.49% and 2.42% on PlantCLEF and the general-purpose CIFAR100, respectively. The proposed unsharp enhancement layer clearly boosts accuracy at negligible computational cost on simple CNN models, as the injected prior knowledge improves their robustness.
KW - Convolutional Neural Networks
KW - PlantCLEF
KW - Preprocessing
KW - Prior knowledge injection
KW - Unsharp masking
UR - http://www.scopus.com/inward/record.url?scp=85072873365&partnerID=8YFLogxK
U2 - 10.1007/978-3-030-30508-6_1
DO - 10.1007/978-3-030-30508-6_1
M3 - Conference contribution
AN - SCOPUS:85072873365
SN - 9783030305079
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 3
EP - 16
BT - Artificial Neural Networks and Machine Learning – ICANN 2019
A2 - Tetko, Igor V.
A2 - Karpov, Pavel
A2 - Theis, Fabian
A2 - Kůrková, Věra
PB - Springer Verlag
Y2 - 17 September 2019 through 19 September 2019
ER -
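
For illustration only: the abstract above describes a trainable pre-processing layer that applies unsharp masking, i.e. it adds high-frequency detail (the input minus a low-pass filtered copy) back to the input, with the masking parameters optimized during training. The TensorFlow/Keras sketch below shows that general idea under assumed details (a fixed Gaussian blur and a single trainable sharpening amount); the layer name, parameterization, and blur used in the paper may differ.

```python
import tensorflow as tf


class UnsharpMaskingLayer(tf.keras.layers.Layer):
    """Illustrative sketch of an unsharp masking pre-processing layer.

    Output = input + amount * (input - blur(input)), where `amount` is a
    trainable scalar and `blur` is a fixed depthwise Gaussian filter.
    """

    def __init__(self, kernel_size=5, sigma=1.0, **kwargs):
        super().__init__(**kwargs)
        self.kernel_size = kernel_size
        self.sigma = sigma

    def build(self, input_shape):
        channels = int(input_shape[-1])
        # Build a normalized 2D Gaussian kernel, replicated per channel.
        ax = tf.range(-(self.kernel_size // 2), self.kernel_size // 2 + 1,
                      dtype=tf.float32)
        gauss_1d = tf.exp(-(ax ** 2) / (2.0 * self.sigma ** 2))
        gauss_2d = tf.tensordot(gauss_1d, gauss_1d, axes=0)
        gauss_2d = gauss_2d / tf.reduce_sum(gauss_2d)
        kernel = tf.reshape(gauss_2d, (self.kernel_size, self.kernel_size, 1, 1))
        self.blur_kernel = tf.tile(kernel, (1, 1, channels, 1))
        # Trainable sharpening strength, optimized jointly with the model.
        self.amount = self.add_weight(name="amount", shape=(),
                                      initializer="ones", trainable=True)

    def call(self, inputs):
        blurred = tf.nn.depthwise_conv2d(
            inputs, self.blur_kernel, strides=[1, 1, 1, 1], padding="SAME")
        # Classic unsharp masking: amplify the high-frequency residual.
        return inputs + self.amount * (inputs - blurred)


# Hypothetical usage: prepend the layer to a small CIFAR100-sized classifier.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(32, 32, 3)),
    UnsharpMaskingLayer(kernel_size=5, sigma=1.0),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(100, activation="softmax"),
])
```

Because only a scalar sharpening strength (and, in the paper, the other unsharp masking parameters) is learned, the added cost over the base CNN is negligible, which is consistent with the abstract's claim of accuracy gains at little extra computation.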