Boosting-based face detection and adaptation
Collection: Synthesis Lectures on Computer Vision. Published by: Morgan and Claypool Publishers ([S.l.]). Physical details: 140 p.; 24 cm. ISBN: 160845133X (paperback); 9781608451333 (paperback).

Document type | Current site | Call number | Status | Expected return date | Barcode | Holds
---|---|---|---|---|---|---
Book | Engineering Sciences Library | 006.42 ZHA (Browse shelf) | Available | | 0000000018992 |
Shelf browse — Engineering Sciences Library:

- 006.42 SHA Motion-based recognition
- 006.42 WAN Pattern recognition, machine intelligence and biometrics
- 006.42 WEC Reliable face recognition methods
- 006.42 ZHA Boosting-based face detection and adaptation
- 006.424 CHE Character recognition systems
- 006.454 BEC Speech recognition
- 006.454 BEI Fundamentals of speaker recognition
Face detection, because of its vast array of applications, is one of the most active research areas in computer vision. In this book, we review various approaches to face detection developed in the past decade, with an emphasis on boosting-based learning algorithms. We then present a series of algorithms that are empowered by the statistical view of boosting and the concept of multiple instance learning. We start by describing a boosting learning framework that is capable of handling billions of training examples. It differs from traditional bootstrapping schemes in that no intermediate thresholds need to be set during training, yet the total number of negative examples used for feature selection remains constant and focused on the poorly performing ones. A multiple instance pruning scheme is then adopted to set the intermediate thresholds after boosting learning. This algorithm generates detectors that are both fast and accurate. Table of Contents: A Brief Survey of the Face Detection Literature / Cascade-based Real-Time Face Detection / Multiple Instance Learning for Face Detection / Detector Adaptation / Other Applications / Conclusions and Future Work.
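The boosting learning at the heart of the book can be illustrated with a minimal AdaBoost sketch: each round selects the weighted-error-minimizing weak classifier and re-weights the examples so that the misclassified (hard) ones gain weight, which is the mechanism that keeps training focused on poorly performing negatives. This is an illustrative sketch only, not the book's detector: the actual system uses Haar-like features over image windows and a vastly larger negative pool, and the stump-on-toy-features setup below is an assumption for demonstration.

```python
import numpy as np

def train_adaboost(X, y, n_rounds=10):
    """Minimal AdaBoost with decision stumps (one feature, one threshold).
    X: (n, d) feature matrix; y: labels in {-1, +1}."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)          # uniform example weights to start
    stumps = []                      # each entry: (feature, threshold, polarity, alpha)
    for _ in range(n_rounds):
        best = None
        # exhaustive search for the stump with lowest weighted error
        for f in range(d):
            for thr in np.unique(X[:, f]):
                for pol in (1, -1):
                    pred = np.where(pol * (X[:, f] - thr) >= 0, 1, -1)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, f, thr, pol, pred)
        err, f, thr, pol, pred = best
        err = min(max(err, 1e-10), 1 - 1e-10)        # avoid log(0)
        alpha = 0.5 * np.log((1 - err) / err)        # weak-learner weight
        w *= np.exp(-alpha * y * pred)               # hard examples gain weight
        w /= w.sum()
        stumps.append((f, thr, pol, alpha))
    return stumps

def predict(stumps, X):
    """Classify by the sign of the weighted vote of all stumps."""
    score = np.zeros(len(X))
    for f, thr, pol, alpha in stumps:
        score += alpha * np.where(pol * (X[:, f] - thr) >= 0, 1, -1)
    return np.sign(score)
```

In a cascade detector, thresholds on the partial sums of this weighted vote allow early rejection of easy negatives; the book's multiple instance pruning sets those intermediate thresholds only after the boosting rounds are complete, rather than during training.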