Friday, September 9, 2011

[DMANET] CFP: NIPS 2011 Workshop on Discrete Optimization in Machine Learning (DISCML) -- Uncertainty, Generalization and Feedback

===============================================

Call for Papers


3rd Workshop on
Discrete Optimization in Machine Learning (DISCML):
Uncertainty, Generalization and Feedback

at the
25th Annual Conference on Neural Information Processing Systems
(NIPS 2011)

http://www.discml.cc

Submission Deadline: Friday 11/11/11, 11:11


===============================================
- We apologize for multiple postings -


Solving optimization problems with ultimately discrete solutions is becoming increasingly important in machine learning. At the core of statistical machine learning is the task of inferring conclusions from data, and when the variables underlying the data are discrete, both inferring the model from data and making predictions with the estimated model are discrete optimization problems. This workshop aims to explore discrete structures relevant to machine learning and techniques for solving discrete learning problems. The focus of this year's workshop is the interplay between discrete optimization and machine learning: How can we solve inference problems arising in machine learning using discrete optimization? How can we solve discrete optimization problems that are themselves learned from training data? How can we solve challenging sequential and adaptive discrete optimization problems in which we have the opportunity to incorporate feedback? We will also explore applications of such approaches.
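As a concrete illustration of one of the workshop's core topics, submodular maximization, the sketch below (not part of this call; function and variable names are our own) shows the classic greedy algorithm for monotone submodular maximization under a cardinality constraint, applied to the standard max-coverage objective. By the result of Nemhauser, Wolsey, and Fisher (1978), this greedy rule achieves a (1 - 1/e) approximation guarantee.

```python
def greedy_max_coverage(sets, k):
    """Greedily pick k sets to maximize the size of their union.

    Coverage (the size of the union) is a monotone submodular
    function, so the greedy choice of the set with the largest
    marginal gain gives a (1 - 1/e) approximation.
    """
    chosen, covered = [], set()
    for _ in range(k):
        # Select the unchosen set with the largest marginal gain.
        best = max((s for s in sets if s not in chosen),
                   key=lambda s: len(set(s) - covered),
                   default=None)
        if best is None or len(set(best) - covered) == 0:
            break
        chosen.append(best)
        covered |= set(best)
    return chosen, covered

# Toy instance: two sets suffice to cover all six elements.
sets = [frozenset({1, 2, 3}), frozenset({3, 4}), frozenset({4, 5, 6})]
picked, covered = greedy_max_coverage(sets, 2)
```

On this toy instance the greedy rule first takes {1, 2, 3} and then {4, 5, 6}, covering all six elements.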

We encourage high-quality submissions of short papers relevant to the workshop topics. Accepted papers will be presented as spotlight talks and posters. Of particular interest are new algorithms with theoretical guarantees, as well as applications of discrete optimization to machine learning problems in areas such as the following:

Combinatorial algorithms

•Submodular & supermodular optimization
•Discrete convex analysis
•Pseudo-Boolean optimization
•Randomized / approximation algorithms


Continuous relaxations

•Sparse approximation & compressive sensing
•Regularization techniques
•Structured sparsity models


Learning in discrete domains

•Online learning / bandit optimization
•Generalization in discrete optimization
•Adaptive / stochastic optimization


Applications

•Graphical model inference & structure learning
•Clustering
•Feature selection, active learning & experimental design
•Structured prediction
•Novel discrete optimization problems in ML, Computer Vision, NLP, ...


Submission deadline: November 11, 2011

Length & Format: max. 6 pages, NIPS 2011 format

Time & Location: December 16/17, 2011, Sierra Nevada, Spain

Submission instructions: Email to submit@discml.cc

Organizers:
Andreas Krause (ETH Zurich, Switzerland),
Pradeep Ravikumar (University of Texas, Austin),
Jeff A. Bilmes (University of Washington),
Stefanie Jegelka (Max Planck Institute for Intelligent Systems, Germany)
**********************************************************
*
* Contributions to be spread via DMANET are submitted to
*
* DMANET@zpr.uni-koeln.de
*
* Replies to a message carried on DMANET should NOT be
* addressed to DMANET but to the original sender. The
* original sender, however, is invited to prepare an
* update of the replies received and to communicate it
* via DMANET.
*
* DISCRETE MATHEMATICS AND ALGORITHMS NETWORK (DMANET)
* http://www.zaik.uni-koeln.de/AFS/publications/dmanet/
*
**********************************************************