Hi Stefan and Tony,

Thanks for your time and feedback. I have referred to the following links to compare feature detectors and descriptors:

1] Comparative Evaluation of Binary Features <http://www.cs.unc.edu/~jheinly/publications/eccv2012-heinly.pdf>
2] Comparison of Feature Detection Algorithms <http://computer-vision-talks.com/2011/01/comparison-of-the-opencvs-feature-d...>
3] Feature Descriptor Comparison Report <http://computer-vision-talks.com/2011/08/feature-descriptor-comparison-repor...>
4] Battle of Feature Descriptors <http://computer-vision-talks.com/2012/08/a-battle-of-three-descriptors-surf-...>

My conclusions about their comparative performance (time), quality (% of correct descriptor matches), and implementation time are as follows (a rough sketch of the benchmarking methodology is included as a postscript below):

Feature detectors:
  Performance (time)  :: FAST > STAR (FAST best)
  Quality             :: STAR > FAST (STAR best)
  Implementation time :: 3 weeks each

Binary feature descriptors:
  Performance (time)  :: BRIEF > ORB <http://www.vision.cs.chubu.ac.jp/CV-R/pdf/Rublee_iccv2011.pdf> > FREAK > BRISK (BRIEF best)
  Quality (averaging the effects of scaling, rotation and viewpoint change) :: BRISK > FREAK >= ORB > BRIEF (BRISK best)
  BRIEF works best among these four for non-geometric transforms such as changes in brightness and exposure.
  Implementation time :: BRISK - 5 weeks; FREAK and ORB - 2 weeks each (provided BRIEF has been implemented, see note); BRIEF - 2 weeks

Note :: BRISK depends on FAST, and ORB depends on BRIEF.

Based on the above observations and implementation-time estimates, please help me choose which of these should be implemented over the course of the summer. Thanks again for your time, and please reply as soon as you can, because the gsoc-melange site has been known to go down on the last day.

Thanks.

Regards,
Ankit Agrawal,
Communication and Signal Processing,
IIT Bombay.
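
P.S. For reference, below is a rough sketch (an illustration of the methodology, not a final implementation) of how I would measure performance (wall-clock time for detection + description) and quality (fraction of matches consistent with a known transform), using the existing OpenCV reference implementations purely for benchmarking. The image name 'scene.png', the 30-degree rotation, the 3-pixel tolerance, and the reliance on the opencv-contrib modules (cv2.xfeatures2d) are placeholder choices of mine, not fixed parts of the plan.

# Benchmarking sketch: timing and correct-match rate for binary descriptors.
# Assumes opencv-contrib-python is installed; 'scene.png' is any test image.
import time
import cv2
import numpy as np

img = cv2.imread('scene.png', cv2.IMREAD_GRAYSCALE)
assert img is not None, 'test image not found'

# Detector/descriptor pipelines to compare. ORB and BRISK are self-contained;
# BRIEF and FREAK are descriptor-only, so they are paired with FAST keypoints.
fast = cv2.FastFeatureDetector_create()
pairs = {
    'ORB':        (cv2.ORB_create(),   None),
    'BRISK':      (cv2.BRISK_create(), None),
    'FAST+BRIEF': (fast, cv2.xfeatures2d.BriefDescriptorExtractor_create()),
    'FAST+FREAK': (fast, cv2.xfeatures2d.FREAK_create()),
}

# Synthetic "geometric transform": rotate the image by a known angle so that
# correct matches can be verified against the known mapping.
h, w = img.shape
M = cv2.getRotationMatrix2D((w / 2, h / 2), 30, 1.0)
warped = cv2.warpAffine(img, M, (w, h))

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

for name, (detector, extractor) in pairs.items():
    t0 = time.perf_counter()
    kp1 = detector.detect(img, None)
    kp2 = detector.detect(warped, None)
    if extractor is None:
        # Detector object also computes its own descriptors (ORB, BRISK).
        kp1, des1 = detector.compute(img, kp1)
        kp2, des2 = detector.compute(warped, kp2)
    else:
        # Separate descriptor extractor (BRIEF, FREAK) on FAST keypoints.
        kp1, des1 = extractor.compute(img, kp1)
        kp2, des2 = extractor.compute(warped, kp2)
    elapsed = time.perf_counter() - t0

    matches = matcher.match(des1, des2)
    # A match counts as "correct" if the matched keypoint lies within 3 px of
    # where the known rotation maps the original keypoint.
    correct = 0
    for m in matches:
        p = np.array([*kp1[m.queryIdx].pt, 1.0])
        expected = M @ p
        if np.linalg.norm(expected - kp2[m.trainIdx].pt) < 3.0:
            correct += 1
    print(f'{name:12s} {elapsed * 1000:7.1f} ms  '
          f'{correct}/{len(matches)} correct matches')

The same loop would cover the FAST vs. STAR detector comparison by swapping in cv2.xfeatures2d.StarDetector_create() for the FAST detector.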