# Uncertainty in Deep Learning

Posted 2017.08.11 13:22

Quantifying the confidence of a model's predictions

It might also be good to call the method we propose "Model Uncertainty."

Or perhaps, more generally, "predictive variance"?

We should also emphasize that we focus on regression.

*Uncertainty in Deep Learning* (2017)

Epistemic uncertainty: knowledge uncertainty / reducible (it decreases as more data is collected).

Aleatoric uncertainty: irreducible (it cannot be reduced by collecting more data, though it can shrink if measurement precision increases).
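The two sources can be related through the law of total variance: treating the weights $\omega$ as random, the predictive variance splits into an aleatoric and an epistemic term (a standard decomposition, sketched here rather than quoted from the thesis):

```latex
\operatorname{Var}(y \mid x)
  = \underbrace{\mathbb{E}_{\omega}\!\left[\operatorname{Var}(y \mid x, \omega)\right]}_{\text{aleatoric}}
  + \underbrace{\operatorname{Var}_{\omega}\!\left(\mathbb{E}[y \mid x, \omega]\right)}_{\text{epistemic}}
```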

Model Uncertainty

Given a neural network, placing a probability distribution over each weight recovers a Gaussian process in the limit of infinitely many weights. For a finite number of weights, model uncertainty can still be obtained by placing distributions over the weights; such a model is often called a Bayesian neural network.

Approximate predictive mean and variance estimators are derived from the sample mean and variance of a deep neural network whose forward pass is made stochastic by stochastic regularization techniques, e.g., Monte Carlo dropout.
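Roughly in the thesis's notation (symbols here are my paraphrase), with $\hat{\omega}_t$ the weights sampled by dropout on the $t$-th forward pass and $\tau$ the model precision, the estimators are:

```latex
\mathbb{E}[\mathbf{y}^*]
  \approx \frac{1}{T} \sum_{t=1}^{T} \mathbf{f}^{\hat{\omega}_t}(\mathbf{x}^*)
\qquad
\operatorname{Var}[\mathbf{y}^*]
  \approx \tau^{-1} \mathbf{I}
  + \frac{1}{T} \sum_{t=1}^{T} \mathbf{f}^{\hat{\omega}_t}(\mathbf{x}^*)^{\top} \mathbf{f}^{\hat{\omega}_t}(\mathbf{x}^*)
  - \mathbb{E}[\mathbf{y}^*]^{\top}\, \mathbb{E}[\mathbf{y}^*]
```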

A full theoretical justification has been given by showing the connection between a Bayesian neural network with dropout and an approximate Gaussian process.

This Monte Carlo estimate is referred to as MC dropout.

Note that the normal neural network model itself is not changed. To estimate the predictive mean and uncertainty, we simply collect the results of stochastic forward passes through the model.
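The "collect stochastic forward passes" recipe can be sketched in a few lines. This is a minimal toy illustration, not code from the thesis: the network and its weights (`W1`, `W2`) are hypothetical, and dropout is simply kept on at test time.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy fixed-weight network: one ReLU hidden layer (hypothetical weights).
W1 = rng.normal(size=(1, 64))
W2 = rng.normal(size=(64, 1))

def stochastic_forward(x, p_drop=0.5):
    """One forward pass with dropout kept ON at test time."""
    h = np.maximum(x @ W1, 0.0)
    mask = rng.random(h.shape) > p_drop   # Bernoulli dropout mask
    h = h * mask / (1.0 - p_drop)         # inverted-dropout scaling
    return h @ W2

def mc_dropout_predict(x, T=200):
    """Collect T stochastic passes; return their sample mean and variance."""
    samples = np.stack([stochastic_forward(x) for _ in range(T)])
    return samples.mean(axis=0), samples.var(axis=0)

x = np.array([[0.5]])
mean, var = mc_dropout_predict(x)
```

Note how the variance comes for free from the same forward passes used for the mean; the cost is the `T`-fold increase in test-time compute discussed below.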

This is both an advantage and a drawback: MC dropout is simple, but it is not well suited to real-time applications.

The paper puts it as follows:

First, even though the training time of our model is identical to that of existing models in the field, the test time is scaled by T, the number of averaged forward passes through the network.

The second problem is that the uncertainty is not calibrated.

Another concern is that the model’s uncertainty is not calibrated. A calibrated model is one in which the predictive probabilities match the empirical frequency of the data.
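One simple way to check the "predictive probabilities match empirical frequency" condition for regression is to measure interval coverage: under a Gaussian predictive distribution, roughly 95% of true targets should land inside the 95% interval. A minimal sketch with made-up predictions (all names and data here are hypothetical, for illustration only):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical predictive means/stds and true targets for illustration.
y_true = rng.normal(0.0, 1.0, size=1000)
pred_mean = np.zeros(1000)
pred_std = np.ones(1000)

z = 1.96  # two-sided 95% interval under a Gaussian predictive distribution
lower = pred_mean - z * pred_std
upper = pred_mean + z * pred_std
coverage = float(np.mean((y_true >= lower) & (y_true <= upper)))
# A calibrated model has coverage close to the nominal 0.95;
# an uncalibrated one over- or under-covers.
```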

Chapter 6 Deep Insights
