Introduction to Deep Learning
- Prof. Leal-Taixé and Prof. Niessner
Lecturers: Prof. Dr. Laura Leal-Taixé, Prof. Dr. Matthias Niessner
Tutors: Patrick Dendorfer, Andreas Rössler
Computer Vision
Computer Vision sits at the intersection of many fields: physics (optics), psychology, biology, mathematics, neuroscience, engineering, and computer science (artificial intelligence, machine learning, algorithms, optimization, NLP, speech, robotics, image processing).
Pre 2012: deep learning was an awesome magic box.
Post 2012: we open the box and become magicians.
Recognition dataset and training: from LeCun et al. (1998) to Krizhevsky et al. (2012).
Big Data: models know where to learn from.
Hardware: models are trainable.
Deep: models are complex.
AlphaGo, machine translation, emoticon suggestion
Self-driving cars
Healthcare, cancer detection
Forecasts to 2022", the deep learning market is expected to be worth USD 1,722.9 million by 2022.
One-Shot Video Object Segmentation, CVPR 2017.
[DDFF network architecture diagram: Conv+BN+ReLU, Pooling, Upsample, Concat, Reshape, and Score layers with skip connections CC1–CC3]
ScanNet: Dai, Chang, Savva, Halber, Funkhouser, Niessner, CVPR 2017.
ScanNet stats: sensors, environments, MTurk labels, 2D frames.
Map ↔ Photo
https://dvl.in.tum.de/lectures/i2dl-ws18.html
tba
be explained, not to be missed!
Introduction to Deep Learning: Machine Learning basics, Introduction to NN, Backpropagation, Optimization, CNN, RNN.
Intro to Deep Learning
DL for Physics (Thuerey)
DL for Vision (Niessner, Leal-Taixé)
DL for Medical Applications (Menze)
DL in Robotics (Bäuml)
Machine Learning (Günnemann)
Task
Challenges: pose, appearance, illumination, occlusions, background clutter.
Representation
Task: image classification. Experience: data.
Unsupervised learning vs. supervised learning.
Unsupervised learning discovers the structure of the data (e.g., PCA).
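As a minimal sketch of one such unsupervised method, PCA can be computed from the SVD of the centered data matrix; the toy data and variable names here are illustrative, not from the lecture:

```python
import numpy as np

# Toy 2-D data: points scattered mostly along one direction (illustrative).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2)) @ np.array([[3.0, 0.0], [0.0, 0.5]])

# PCA via SVD of the centered data matrix.
X_centered = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)

# Rows of Vt are the principal directions; project onto the first one.
X_proj = X_centered @ Vt[0]

# Fraction of variance explained by each component.
explained = S**2 / np.sum(S**2)
print(explained)  # the first component dominates for this data
```

The structure PCA "discovers" here is the dominant direction of variation, without ever seeing a label.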
Supervised learning uses data annotated with class labels, e.g., images labeled DOG or CAT.
Experience: data, split into training data and test data. Underlying assumption: train and test data come from the same distribution.
Reinforcement learning: agents learn by interacting with an environment and receiving rewards.
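The agent–environment interaction can be sketched as a loop; the toy 1-D environment, its `step` function, and the random policy below are illustrative assumptions, not a real RL API:

```python
import random

# Toy environment (illustrative): state is a position on a line;
# reaching +3 gives reward 1, reaching -3 ends the episode with reward 0.
def step(state, action):            # action in {-1, +1}
    next_state = state + action
    reward = 1.0 if next_state == 3 else 0.0
    done = next_state in (3, -3)
    return next_state, reward, done

# Agent-environment loop with a random policy (no learning yet).
state, total_reward = 0, 0.0
for _ in range(100):
    action = random.choice([-1, 1])
    state, reward, done = step(state, action)
    total_reward += reward
    if done:
        break
print(state, total_reward)
```

A learning agent would replace `random.choice` with a policy that is improved using the observed rewards.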
Task: image classification. Experience: data. Performance measure: accuracy.
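Accuracy as a performance measure is simply the fraction of correct predictions; a minimal sketch (the labels are made up for illustration):

```python
import numpy as np

# Ground-truth labels vs. a classifier's predictions (illustrative).
y_true = np.array(["dog", "dog", "cat", "cat", "dog"])
y_pred = np.array(["dog", "cat", "cat", "cat", "dog"])

# Accuracy = fraction of positions where prediction matches the truth.
accuracy = np.mean(y_true == y_pred)
print(accuracy)  # 0.8  (4 of 5 correct)
```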
Compute the distance to every training image: the NN classifier predicts dog.
With a majority vote over the k nearest neighbors: the k-NN classifier predicts cat.
Courtesy of Stanford course cs231n
What is the performance on training data for the NN classifier? Which classifier is more likely to perform best on test data?
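A minimal NumPy sketch of the NN/k-NN idea; the toy points and the `knn_predict` helper are illustrative assumptions, not the course's code:

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=1, metric="L2"):
    """Classify x by majority vote over its k nearest training points."""
    diff = X_train - x
    if metric == "L1":
        dists = np.abs(diff).sum(axis=1)          # Manhattan distance
    else:
        dists = np.sqrt((diff**2).sum(axis=1))    # Euclidean distance
    nearest = np.argsort(dists)[:k]               # indices of k closest points
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]              # majority vote

# Toy data: one "dog" point exactly on the query, two "cat" points nearby.
X_train = np.array([[0.0, 0.0], [1.0, 0.1], [1.1, 0.0]])
y_train = np.array(["dog", "cat", "cat"])
x = np.array([0.0, 0.0])

print(knn_predict(X_train, y_train, x, k=1))  # dog (the identical training point)
print(knn_predict(X_train, y_train, x, k=3))  # cat (majority of 3 neighbors)
```

This also answers the first question above: on the training data the 1-NN classifier is perfect, since every point is its own nearest neighbor at distance zero.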
Hyperparameters: distance metric (L1, L2), k (number of neighbors).
Cross-validation: split the training data into N folds; in each run (Run 1 … Run 5), one fold serves as validation and the remaining folds as training.
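The N-fold scheme can be sketched as follows; `fold_splits` is an illustrative helper, not a lecture API:

```python
import numpy as np

def fold_splits(n_samples, n_folds):
    """Yield (train_indices, val_indices) for each of n_folds runs."""
    indices = np.arange(n_samples)
    folds = np.array_split(indices, n_folds)
    for i in range(n_folds):
        val_idx = folds[i]                              # fold i validates
        train_idx = np.concatenate(
            [f for j, f in enumerate(folds) if j != i]  # the rest train
        )
        yield train_idx, val_idx

# Run 1..5: each fold serves as validation exactly once.
for run, (train_idx, val_idx) in enumerate(fold_splits(10, 5), start=1):
    print(f"Run {run}: train={train_idx.tolist()} val={val_idx.tolist()}")
```

Averaging the validation score over the runs gives a more stable estimate for picking hyperparameters than a single split.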
Split the data into train, validation (20%), and test sets; find your hyperparameters on the validation set.
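A minimal sketch of holding out 20% of the data as a validation set (sizes and names are illustrative):

```python
import numpy as np

# Shuffle the indices before splitting, so the hold-out set is unbiased.
rng = np.random.default_rng(0)
n = 100
perm = rng.permutation(n)

n_val = int(0.2 * n)                 # 20% held out for validation
val_idx, train_idx = perm[:n_val], perm[n_val:]
print(len(train_idx), len(val_idx))  # 80 20
```

The test set is touched only once, at the very end, after hyperparameters have been fixed on the validation set.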
CNN, regularization.
Bonus: do not miss it!