IEEE CIS competition program

Tips from the standards committee


The CIS sponsors yearly competitions whose results are discussed at major IEEE conferences (WCCI, IJCNN, CEC, and FUZZ-IEEE). Competitions stimulate research in computational intelligence and serve several goals.
Other IEEE CIS technical committees are also involved in organizing competitions: Data Mining, Games, Fuzzy Systems, and Education.

Past competition programs of the IEEE CIS

Do you want to organize a competition?

The call for competition proposals for WCCI 2010 will be issued soon; contact isabelle __at__ clopinet __dot__ com if you are interested.

Seven tips to organize a winning competition:
  1. IMPACT: Choose a problem with high impact (economic, humanitarian, societal, etc.).
  2. DATA: Make sure that suitable data are available in sufficient quantity and that you have the right to redistribute them.
  3. RELEVANCE: The problem should be relevant to the computational intelligence community and should be solvable without extensive domain knowledge.
  4. CHALLENGE: The problem posed should be scientifically or technically challenging. However, it should not be impossible to solve; the organizers should provide baseline results. Think of illustrating the same scientific problem using several datasets from various application domains.
  5. APPEAL: The competition setup should be attractive, preferably with on-line submission and feedback and/or an on-site contest at the conference.
  6. EVALUATION: It should be possible to evaluate the results objectively (provide a metric and a measure of the significance of performance differences; see the sketch below).
  7. RESOURCES: The organizers should make sure to have enough resources (team member availability, computers, support staff, other equipment, sponsors).
The CIS will offer prizes to the winners of competitions that have been selected for designated conferences.
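
For tip 6, here is a minimal sketch (not part of the original guidelines; all names are illustrative) of how an organizer might quantify the significance of a performance difference between two entries, using a paired percentile bootstrap over the test examples:

    import numpy as np

    def bootstrap_ci_of_difference(y_true, pred_a, pred_b,
                                   n_boot=10000, alpha=0.05, seed=0):
        """Percentile-bootstrap confidence interval for the difference in
        error rate between entries A and B on the same test set. If the
        interval excludes 0, the difference is significant at level alpha."""
        rng = np.random.default_rng(seed)
        err_a = (pred_a != y_true).astype(float)   # per-example errors of A
        err_b = (pred_b != y_true).astype(float)   # per-example errors of B
        n = len(y_true)
        deltas = np.empty(n_boot)
        for i in range(n_boot):
            idx = rng.integers(0, n, n)            # resample examples with replacement
            deltas[i] = err_a[idx].mean() - err_b[idx].mean()
        return np.quantile(deltas, [alpha / 2, 1 - alpha / 2])

Because the same resampled examples are used for both entries, this accounts for the correlation between their errors, which makes the test more sensitive than comparing two independently computed confidence intervals.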

Before you embark on the organization of a competition, consider that you will have to do all this and more:

Task 1: Shopping for data. Identify a good problem and a good dataset.
Task 2: Formatting data. Preprocess and format the data to simplify the task of participants and reduce the need for domain knowledge.
Task 3: Assessment. Define a task and evaluation metrics. Define and implement methods of scoring the results and comparing them (a minimal scoring-program sketch follows this list).
Task 4: Baseline. Try to solve your own problem to see whether it is feasible and provide baseline results.
Task 5: Result formats. Define the formats in which the results should be returned by the competitors.
Task 6: Benchmark protocol. Define the rules of the competition and determine the sequence of events.
Task 7: Web portal. Create a web portal allowing on-line submissions and displaying results on a leaderboard.
Task 8: Guidelines to participants. Write the competition rules, document the formats and the scoring methods, write FAQs.
Task 9: Seek private sponsors. Find additional sponsor money to give travel awards so the participants can attend the workshop.
Task 10: Prepare the workshop. Look for tutorial speakers. Select speakers. Create a schedule. Advertise.
Task 11: Competition result analysis. Compile the results. Produce graphs. Derive conclusions.
Task 12: Release the results on-line. Make available on-line the competition result analyses, fact sheets of the competitors' methods, and the workshop slides.
Task 13: Post-competition tests. Reproduce the results of the best competitors. Identify candidate essential ingredients of success. Perform a systematic study of such ingredients.
Task 14: Write technical reports. Write reports on the benchmark design, the results of the competition, and the results of the post-competition tests.
Task 15: Prepare workshop proceedings. Solicit papers, organize the review process, and edit the papers.
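
As an illustration of Tasks 3, 5, and 7, here is a minimal scoring-program sketch (not part of the original page). It assumes a hypothetical submission format, one predicted label (+1 or -1) per line in the order of the hidden test set, and scores it with the balanced error rate; a web portal would run such a script on each upload before updating the leaderboard:

    import numpy as np

    def load_labels(path):
        # One integer label per line, e.g. "+1" or "-1" (hypothetical format).
        return np.loadtxt(path, dtype=int)

    def balanced_error_rate(y_true, y_pred):
        """Average of the error rates on the positive and the negative class,
        so that always guessing the majority class does not score well."""
        pos, neg = y_true == 1, y_true == -1
        return 0.5 * (np.mean(y_pred[pos] != 1) + np.mean(y_pred[neg] != -1))

    if __name__ == "__main__":
        y_true = load_labels("test_labels.txt")   # kept secret by the organizers
        y_pred = load_labels("submission.txt")    # uploaded by a participant
        if y_pred.shape != y_true.shape:
            raise SystemExit("submission has the wrong number of predictions")
        print(f"BER = {balanced_error_rate(y_true, y_pred):.4f}")

Fixing the result format (Task 5) down to the file encoding and line order avoids most scoring disputes, and publishing the scoring script itself is the simplest way to document the metric (Task 8).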


But this will be rewarding: you will get to meet lots of smart people who will solve your problem in ways you would never have anticipated!

Other recently organized competitions

KDD cup 2009: Fast scoring on a large database. Customer Relationship Management (CRM) is a key element of modern marketing strategies. The KDD Cup 2009 offers the opportunity to work on large marketing databases from the French telecom company Orange to predict the propensity of customers to switch providers (churn), buy new products or services (appetency), or buy upgrades or add-ons proposed to them to make the sale more profitable (up-selling).

Past KDD cups: Marketing, protein folding, high energy physics, and more...

Causality workbench: Competition and dataset for causal discovery.


Seizure prediction contest, 2007: As part of the 3rd International Workshop on Epileptic Seizure Prediction, a contest was carried out to quantify the current state of the art in epileptic seizure prediction.


Machine learning for signal processing (MLSP 2007 data analysis competition): Blind source separation.

Agnostic learning vs. Prior knowledge challenge. “When everything fails, ask for additional domain knowledge” is the current motto of machine learning. Assessing the real added value of prior/domain knowledge is therefore a question that is both deep and practical. The participants competed in two tracks: the “prior knowledge track”, for which they had access to the raw data and information about the data, and the “agnostic learning track”, for which they had access to preprocessed data with no knowledge of the identity of the features.

Performance prediction challenge. “How good are you at predicting how good you are?” 145 participants tried to answer that question. Cross-validation came out very strong. Can you do better? Measure yourself against the winners by participating in the model selection game.
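
The cross-validation approach that did well there can be sketched in a few lines (a minimal illustration, not a winning entry; the dataset and model are placeholders): estimate your own test performance from the training data alone, then compare it with the performance actually measured on held-out data.

    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score, train_test_split

    X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

    clf = LogisticRegression(max_iter=1000)
    # Predicted performance: 10-fold cross-validation on the training half only.
    predicted = cross_val_score(clf, X_tr, y_tr, cv=10).mean()
    # Actual performance: fit on the training half, score on the held-out half.
    actual = clf.fit(X_tr, y_tr).score(X_te, y_te)
    print(f"predicted accuracy {predicted:.3f} vs. actual accuracy {actual:.3f}")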

Feature selection challenge. We organized a competition on five datasets, in which hundreds of entries were made. The web site of the challenge is still available for post-challenge submissions. Measure yourself against the winners! See the book we published, with a CD containing the datasets, tutorials, and papers on state-of-the-art methods.

Pascal challenges and Pascal2 challenges: The Pascal network is sponsoring several challenges in machine learning.

IAPR TC5 on benchmarking and software

Data mining competitions:
A list of data mining competitions maintained by KDnuggets, including the well-known KDD cup.

UCSD student data mining contest. A yearly competition, sponsored by FICO, reserved for students.

PAKDD 2009: Credit Risk Assessment on a Private Label Credit Card Application.

UCI machine learning repository: A great collection of datasets for machine learning research.

DELVE: A platform developed at the University of Toronto to benchmark machine learning algorithms.

ICDAR
International Conference on Document Analysis and Recognition, a biennial conference proposing a contest in printed text recognition. Feature extraction/selection is a key component of winning such a contest.

TREC
Text REtrieval Conference, held every year by NIST. The conference is organized around the results of a competition. Past winners have had to address feature extraction/selection effectively.

ICPR
In conjunction with the International Conference on Pattern Recognition, ICPR 2004, a face recognition contest was organized.

CASP
An important competition in protein structure prediction called Critical Assessment of Techniques for Protein Structure Prediction.

Last updated: March 17, 2009 -- Isabelle Guyon --