- A User-oriented Model for Expert Finding by Smirnova and Balog (best paper)
- A Methodology for Evaluating Aggregated Search Results by Arguello, Diaz, Callan, and Carterette (best student paper)
The paper Learning Models for Ranking Aggregates by Craig MacDonald and Iadh Ounis was also nominated.
The best poster prize went to A novel reranking approach inspired by quantum measurement by Zhao et al. (via Owen Phelan).
A trend at the conference is the handling of ranking and evaluating "aggregate" results, aka "blended" results or "universal search", where results from multiple verticals are blended into a single presentation. In addition to the above two papers, there is:
- Smoothing Click Counts for Aggregated Vertical Search by Jangwon Seo, et al.

Other trends at the conference appear to be:

- Crowdsourcing evaluation (an entire session)
- Real-time and microblog (Twitter) applications (multiple papers across tracks)
The DDR 2011 workshop on diversity in document retrieval also proved popular. The proceedings are available for download, and there is a fair bit of discussion on Twitter under #ddr2011.
There are two other papers from UMass that I want to highlight: