Thursday, November 12

Machine Learning Talk: Lee Spector on Genetic Programming; Applications to Learning Ranking Functions

Today at the Yahoo!-sponsored machine learning lunch, Lee Spector presented his work on genetic programming. His talk, Expressive Languages For Evolved Programs, highlighted his use of the Push programming language to solve interesting and hard real-world problems.

He pointed to two key principles, drawn from observations of biology, that these systems need in order to learn solutions:
  • Meaningful variation - Variation can't just be random; mutation and selection have to produce meaningful effects in the problem domain.

  • Heritability - Children must be able to inherit desirable features from their parents without being clones.
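To make those two principles concrete, here's a toy GP sketch for symbolic regression. This is not Push; the tuple-based tree representation, the operator set, and the target function f(x) = x² + x are all my own assumptions for illustration:

```python
import random

random.seed(0)

# Binary operators and terminal symbols for expression trees.
OPS = {'+': lambda a, b: a + b, '-': lambda a, b: a - b, '*': lambda a, b: a * b}
TERMS = ['x', 1.0, 2.0]

def random_tree(depth=3):
    # Grow a random expression tree; leaves are terminals.
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMS)
    return (random.choice(list(OPS)), random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x):
    if tree == 'x':
        return x
    if not isinstance(tree, tuple):
        return tree
    op, left, right = tree
    return OPS[op](evaluate(left, x), evaluate(right, x))

def fitness(tree):
    # Sum of squared errors against the target f(x) = x*x + x (lower is better).
    try:
        return sum((evaluate(tree, x) - (x * x + x)) ** 2 for x in range(-5, 6))
    except OverflowError:
        return float('inf')  # runaway programs are simply very unfit

def mutate(tree):
    # Meaningful variation: replace one random subtree with a fresh one,
    # which changes the program's behavior in the domain, not just its bits.
    if not isinstance(tree, tuple) or random.random() < 0.2:
        return random_tree(2)
    op, left, right = tree
    if random.random() < 0.5:
        return (op, mutate(left), right)
    return (op, left, mutate(right))

def crossover(a, b):
    # Heritability: the child keeps most of parent a and grafts in a
    # subtree from parent b, so it inherits features without being a clone.
    if not isinstance(a, tuple) or random.random() < 0.3:
        return b if not isinstance(b, tuple) else random.choice([b[1], b[2]])
    op, left, right = a
    if random.random() < 0.5:
        return (op, crossover(left, b), right)
    return (op, left, crossover(right, b))

def evolve(pop_size=60, generations=40):
    pop = [random_tree() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[:pop_size // 3]  # selection keeps the fittest third
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            children.append(mutate(crossover(a, b)))
        pop = survivors + children
    return min(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

The key design point is that both operators act on subtrees, i.e. whole functional units, rather than arbitrary bit strings, which is what makes the variation "meaningful" and the inherited structure useful.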

During the talk, it struck me that an obvious application would be using GP to learn IR ranking functions. Ronan Cummins recently did some work in this area: his SIGIR 2009 paper, Learning in a Pairwise Term-Term Proximity Framework for Information Retrieval, applied GP to learning proximity functions.
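The reason this fit is natural: a ranking function is just a program over per-term statistics, so it drops straight into a GP tree representation. A minimal sketch; the feature names and the TF-IDF-style seed individual are illustrative assumptions, not anything from the talk or the paper:

```python
import math

def score(tree, feats):
    # Evaluate a ranking "program" (an expression tree) over per-term
    # features, e.g. tf (term frequency), df (document frequency),
    # N (collection size). GP would evolve the tree itself.
    if isinstance(tree, str):
        return feats[tree]
    if isinstance(tree, (int, float)):
        return float(tree)
    if tree[0] == 'log':  # unary protected log
        return math.log(max(score(tree[1], feats), 1e-9))
    op, left, right = tree
    a, b = score(left, feats), score(right, feats)
    return {'+': a + b,
            '*': a * b,
            '/': a / b if b else 0.0}[op]  # protected division

# A TF-IDF-like seed individual an evolutionary run could start from:
seed = ('*', 'tf', ('log', ('/', 'N', 'df')))

feats = {'tf': 3.0, 'df': 10.0, 'N': 1000.0}
print(score(seed, feats))  # 3 * log(1000/10)
```

Fitness here would be a retrieval metric (e.g. MAP over a training collection) rather than regression error, but the evolutionary loop is otherwise the same.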

I think there's still interesting work to be done combining GP with IR. For example, most ranking functions are static, but collections and users evolve over time.