Tuesday, June 29

ICML 2010 and Yahoo! Learning to Rank Workshop

Last week was ICML 2010 in Haifa. You can read Hal's coverage on NLPers. The conference also had two workshops of note, the Yahoo! Learning to Rank Workshop and the Machine Learning Open Source Software (mloss) workshop. I'm going to focus mainly on the LTR workshop, but be sure to check out the mloss site for more details.

One highlight of YLTR was Chris Burges' MSR team winning Track 1 with LambdaMART. They give an overview of their method in a recent tech report:

From RankNet to LambdaRank to LambdaMART: An Overview
LambdaMART is the boosted tree version of LambdaRank, which is based on RankNet. RankNet, LambdaRank, and LambdaMART have proven to be very successful algorithms for solving real world ranking problems: for example an ensemble of LambdaMART rankers won Track 1 of the recent Yahoo! Learning To Rank Challenge.
The other winning teams were from the Russian search company Yandex. See the company's blog post on the topic (via Google Translate). You can also read the presentations from the top leaders.
The two teams' methods are related to those used by Yandex for its own ranking.


  1. Michael B., 11:38 AM EDT

    Thanks for the updates and the pointers! Glad to see you back in the blogging business.

  2. Anonymous, 10:17 AM EDT

    One more report from the competition and also from Yandex - http://www.cs.cmu.edu/~daria/papers/ranking_AG.pdf - 5th place, using so-called additive groves, basically an extension over MART.

    And what's surprising about it is that it's just a plain old regression algorithm, which optimizes for square loss! That is, a "pointwise" algorithm in modern lingo, not specifically tuned to do any ranking the way MSR's LambdaMART is!

    I wonder what would happen if one replaced MART in LambdaMART with AG? Seems like such an algo would rock harder than anything!