

Frank Emmert-Streib 1,2 *, Zhen Yang 1, Han Feng 1,3, Shailesh Tripathi 1,3 and Matthias Dehmer 3,4,5

1 Predictive Society and Data Analytics Lab, Faculty of Information Technology and Communication Sciences, Tampere University, Tampere, Finland
2 Institute of Biosciences and Medical Technology, Tampere, Finland
3 School of Management, University of Applied Sciences Upper Austria, Steyr, Austria
4 Department of Biomedical Computer Science and Mechatronics, University for Health Sciences, Medical Informatics and Technology (UMIT), Hall in Tyrol, Austria
5 College of Artificial Intelligence, Nankai University, Tianjin, China

Deep learning models stand for a new learning paradigm in artificial intelligence (AI) and machine learning. Recent breakthrough results in image analysis and speech recognition have generated massive interest in this field, because applications in many other domains providing big data also seem possible. On the downside, the mathematical and computational methodology underlying deep learning models is very challenging, especially for interdisciplinary scientists. For this reason, we present in this paper an introductory review of deep learning approaches, including Deep Feedforward Neural Networks (D-FFNN), Convolutional Neural Networks (CNNs), Deep Belief Networks (DBNs), Autoencoders (AEs), and Long Short-Term Memory (LSTM) networks. These models form the major core architectures of deep learning currently in use and should belong in any data scientist's toolbox. Importantly, these core architectural building blocks can be composed flexibly, in an almost Lego-like manner, to build new application-specific network architectures.

We are living in the big data era, in which all areas of science and industry generate massive amounts of data. Hence, a basic understanding of these network architectures is important in order to be prepared for future developments in AI.
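The Lego-like composition of architectural building blocks mentioned above can be sketched minimally in plain Python. This is an illustrative sketch, not code from the paper: the class names, layer sizes, and initialization scheme are assumptions chosen only to show how reusable blocks chain into an application-specific model.

```python
# Illustrative sketch (hypothetical, not from the paper): treat core
# building blocks as composable modules that can be stacked freely.
import random

class Dense:
    """Fully connected layer with small random weights and zero biases."""
    def __init__(self, n_in, n_out):
        self.w = [[random.gauss(0, 0.1) for _ in range(n_in)]
                  for _ in range(n_out)]
        self.b = [0.0] * n_out

    def __call__(self, x):
        # One dot product per output unit, plus the bias term.
        return [sum(wi * xi for wi, xi in zip(row, x)) + b
                for row, b in zip(self.w, self.b)]

class ReLU:
    """Elementwise rectified linear activation."""
    def __call__(self, x):
        return [max(0.0, v) for v in x]

class Sequential:
    """Compose blocks by feeding each block's output to the next."""
    def __init__(self, *layers):
        self.layers = layers

    def __call__(self, x):
        for layer in self.layers:
            x = layer(x)
        return x

# A small feedforward network assembled from reusable blocks; swapping
# or adding blocks changes the architecture without changing the parts.
model = Sequential(Dense(4, 8), ReLU(), Dense(8, 2))
output = model([0.5, -1.0, 2.0, 0.1])
```

Deep learning frameworks expose the same idea at scale (e.g., sequential containers of layer modules), which is what makes recombining convolutional, recurrent, and dense blocks into new architectures straightforward.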
