==== Week 9 - Dec 4th - Sam - DBH4011 ====

**Paper**: Local Decorrelation for Improved Pedestrian Detection

**Abstract**: Even with the advent of more sophisticated, data-hungry methods, boosted decision trees remain extraordinarily successful for fast rigid object detection, achieving top accuracy on numerous datasets. While effective, most boosted detectors use decision trees with orthogonal (single feature) splits, and the topology of the resulting decision boundary may not be well matched to the natural topology of the data. Given highly correlated data, decision trees with oblique (multiple feature) splits can be effective. Use of oblique splits, however, comes at considerable computational expense. Inspired by recent work on discriminative decorrelation of HOG features, we instead propose an efficient feature transform that removes correlations in local neighborhoods. The result is an overcomplete but locally decorrelated representation ideally suited for use with orthogonal decision trees. In fact, orthogonal trees with our locally decorrelated features outperform oblique trees trained over the original features at a fraction of the computational cost. The overall improvement in accuracy is dramatic: on the Caltech Pedestrian Dataset, we reduce false positives nearly tenfold over the previous state-of-the-art.

[[http://vision.ucsd.edu/~pdollar/files/papers/NamNIPS14ldcf.pdf|vision.ucsd.edu/~pdollar/files/papers/NamNIPS14ldcf.pdf]]
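
The key step is easy to illustrate. Below is a minimal Python sketch of the idea, not the authors' implementation: it estimates the covariance of local patches in a single feature channel, takes its top eigenvectors as decorrelation filters, and convolves the channel with them to get the overcomplete, locally decorrelated representation the abstract describes. The patch size and filter count here are illustrative assumptions, and the sketch operates on one generic channel rather than the paper's full set of HOG-like channels.

<code python>
# Sketch of LDCF-style local decorrelation on one 2D feature channel.
# `patch` and `k` are illustrative choices, not values from the paper.
import numpy as np
from scipy.ndimage import correlate

def learn_decorrelation_filters(channel, patch=5, k=4):
    """Estimate local patch covariance; return its top-k eigenvector kernels."""
    # Collect every overlapping patch x patch window as a row vector.
    wins = np.lib.stride_tricks.sliding_window_view(channel, (patch, patch))
    X = wins.reshape(-1, patch * patch).astype(float)
    X -= X.mean(axis=0)                       # zero-mean the patches
    cov = X.T @ X / len(X)                    # local patch covariance
    vals, vecs = np.linalg.eigh(cov)          # eigenvalues in ascending order
    # Top-k eigenvectors, reshaped to 2D kernels, act as decorrelation filters.
    return [vecs[:, -(i + 1)].reshape(patch, patch) for i in range(k)]

def locally_decorrelate(channel, filters):
    """Filtering with each kernel yields an overcomplete, locally
    decorrelated set of channels suited to orthogonal-split trees."""
    return np.stack([correlate(channel, f, mode='nearest') for f in filters])
</code>

The resulting channels would then feed a standard boosted tree detector; the point of the transform is that cheap orthogonal (single-feature) splits on these channels behave like expensive oblique splits on the originals.
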
==== Week 10 - Dec 11th - James - DBH4011 ====

**Paper**: Filter Forests for Learning Data-Dependent Convolutional Kernels

**Abstract**: We propose ‘filter forests’ (FF), an efficient new discriminative approach for predicting continuous variables given a signal and its context. FF can be used for general signal restoration tasks that can be tackled via convolutional filtering, where it attempts to learn the optimal filtering kernels to be applied to each data point. The model can learn both the size of the kernel and its values, conditioned on the observation and its spatial or temporal context. We show that FF compares favorably to both Markov random field based and recently proposed regression forest based approaches for labeling problems in terms of efficiency and accuracy. In particular, we demonstrate how FF can be used to learn optimal denoising filters for natural images as well as for other tasks such as depth image refinement and 1D signal magnitude estimation. Numerous experiments and quantitative comparisons show that FFs achieve accuracy at par or superior to recent state-of-the-art techniques, while being several orders of magnitude faster.

http://research.microsoft.com/pubs/217099/CVPR2014ForestFiltering.pdf

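To make the "learned filtering kernel" idea concrete, here is a minimal Python sketch of the per-leaf step at the heart of the method: fitting a kernel by regularized least squares that maps noisy patches to their clean center pixels. This is a sketch under simplifying assumptions, not the paper's algorithm: the tree that routes each patch to a leaf is omitted (in FF every leaf fits its own kernel on the data routed to it), and the plain ridge weight `lam`, the patch size, and the function names are all illustrative, not taken from the paper.

<code python>
# Sketch of learning a data-dependent convolutional kernel for one
# (hypothetical) leaf: ridge regression from noisy patches to clean pixels.
import numpy as np
from scipy.ndimage import correlate

def fit_leaf_filter(noisy, clean, patch=5, lam=1e-2):
    """Fit a patch-size kernel on the training pairs routed to one leaf."""
    r = patch // 2
    wins = np.lib.stride_tricks.sliding_window_view(noisy, (patch, patch))
    X = wins.reshape(-1, patch * patch).astype(float)   # one row per patch
    y = clean[r:-r, r:-r].ravel().astype(float)         # clean center pixels
    # Ridge-regularized least squares: (X^T X + lam*I) w = X^T y
    w = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
    return w.reshape(patch, patch)                       # learned 2D kernel

def restore(noisy, kernel):
    """Apply the learned kernel to a signal (correlation form)."""
    return correlate(noisy, kernel, mode='nearest')
</code>

Because each leaf sees only patches with similar context, the learned kernels adapt to local structure, which is what distinguishes FF from applying one fixed denoising filter everywhere.
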
==== Week 11 - Dec 18th - Greg - DBH4011 ====

**Paper**: