# Search results

Number of programmes found: 5558
Label UNT · Vidéocours (6m8s)

## 2.5. Reminders on probability

In this sequence I want to remind you of a few concepts from probability theory; in the next sequence we will finally derive the equations of the Bayes filter. The concepts I want to recall are three: the Markov assumption, the theorem of total probability, and Bayes' theorem.
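The two theorems recalled here can be illustrated with a tiny numerical sketch (the rooms and probabilities below are made-up values, not from the course): total probability gives the normalizer p(z), and Bayes' theorem inverts the sensor model into a posterior over the state.

```python
# Prior over the state x: the robot is in room A or room B (hypothetical values).
p_x = {"A": 0.5, "B": 0.5}
# Sensor model p(z | x): probability of observing a door given the room.
p_z_given_x = {"A": 0.8, "B": 0.3}

# Theorem of total probability: p(z) = sum over x of p(z | x) * p(x)
p_z = sum(p_z_given_x[x] * p_x[x] for x in p_x)

# Bayes' theorem: p(x | z) = p(z | x) * p(x) / p(z)
posterior = {x: p_z_given_x[x] * p_x[x] / p_z for x in p_x}

print(p_z)             # 0.55
print(posterior["A"])  # 0.727... -- the door observation favors room A
```

Observing the door shifts the belief from a 50/50 prior toward room A, exactly the update the Bayes filter will perform at every measurement step.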
Label UNT · Vidéocours (8m2s)

## 2.8. The Extended Kalman Filter (EKF)

We have seen grid localization, and the advantage of this approach is that it can deal with any kind of probability distribution; in particular, we do not need to make a Gaussian assumption. The drawback is that the solution becomes very expensive. There exists another solution, the Kalman filter, which is completely different because it is a fully analytical solution.
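The contrast described here can be sketched in a few lines (a minimal 1D illustration with made-up noise values, not the course's code): under the Gaussian assumption the whole belief is carried by just two numbers, a mean and a variance, so the update is a closed-form formula rather than a sum over grid cells.

```python
def kf_predict(mean, var, u, q):
    """Motion step: shift the mean by the control u, inflate the variance by noise q."""
    return mean + u, var + q

def kf_update(mean, var, z, r):
    """Measurement step: blend prediction and measurement z (noise variance r)."""
    k = var / (var + r)                       # Kalman gain
    return mean + k * (z - mean), (1 - k) * var

# One full cycle on a 1D belief N(0, 1) with hypothetical numbers.
mean, var = 0.0, 1.0
mean, var = kf_predict(mean, var, u=1.0, q=0.5)   # belief becomes N(1.0, 1.5)
mean, var = kf_update(mean, var, z=1.2, r=0.5)    # belief becomes N(1.15, 0.375)
```

Each step costs a handful of arithmetic operations, whereas grid localization would recompute a probability for every cell.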
Label UNT · Vidéocours (9m14s)

## 3.1. Examples for the Action in the EKF

In part 2, we saw the equations of the Bayes filter, the general equations that allow us to update the probability distribution as data from both proprioceptive and exteroceptive sensors are delivered. We saw a possible implementation of these equations based on a numerical solution: grid localization. We also started to see the equations of the Kalman filter, or rather the extended Kalman filter. In part 3, we want to explain these equations better, starting from a very simple example in 1D. Then ...
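The two-step structure recalled above can be sketched on a 1D grid (a hypothetical grid localization toy, with made-up motion and sensor models): the proprioceptive data drive a prediction step, and the exteroceptive data drive a Bayes correction step.

```python
import numpy as np

def predict(belief, motion_kernel):
    """Prediction from proprioceptive data: convolve the belief with p(x' | x, u)."""
    out = np.convolve(belief, motion_kernel, mode="same")
    return out / out.sum()

def update(belief, likelihood):
    """Correction from exteroceptive data: multiply by p(z | x) and renormalize."""
    out = belief * likelihood
    return out / out.sum()

belief = np.full(10, 0.1)                        # uniform prior over 10 cells
motion = np.array([0.1, 0.8, 0.1])               # noisy one-step motion model (assumed)
likelihood = np.zeros(10)
likelihood[4] = 1.0                              # sensor strongly indicates cell 4 (assumed)

belief = update(predict(belief, motion), likelihood)
```

Every grid cell is touched at each step, which is the cost the analytical (extended) Kalman filter avoids.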