The ChaLearn Looking at People ICPR Contest 2016, to be held next December in conjunction with ICPR 2016 in Cancún, Mexico, will be devoted to all aspects of computer vision and pattern recognition for the analysis of human personality from images and videos. Co-located with the workshop, there is a challenge on first impressions, that is, recognizing the personality traits of users after seeing a short video.
The top three ranked participants on each track will receive awards and be invited to follow the ICPR workshop submission guide, so that a description of their system can be included in the ICPR workshop proceedings, and to submit an extended paper to a Special Issue on Personality Analysis in IEEE Transactions on Affective Computing, a high-impact-factor journal.
The challenge is part of the Speed Interviews project; you can learn more about this project in the corresponding section. Additionally, the following video provides a comprehensive overview of the project:
As part of the Speed Interviews project, we are organizing a challenge on “first impressions”, in which participants will develop solutions for recognizing the personality traits of users in short video sequences. We are making available a large, newly collected data set, sponsored by Microsoft, of 10,000 15-second videos collected from YouTube and annotated with personality traits by Amazon Mechanical Turk (AMT) workers.
For each video sample, RGB and audio information are provided, as well as continuous ground-truth values for each of the Big Five personality traits.
More detailed information is provided in the Data section of the competition.
Please download the data from the Get Data section before the evaluation.
The evaluation consists of computing, for each of the five traits, the mean accuracy over all tested videos between the predicted continuous values and the continuous ground-truth values.
If you want to test your method locally to see your results before submitting them, you can download the evaluation script here.
The file evaluateTraits.py performs the evaluation between the predictions produced by the participants' methods and the ground truth.
Once participants have trained their models, they should produce, for each video sample, 5 continuous prediction values in the range [0, 1], one per trait. Participants may then want to use the function in the script generatePredFile.py to write their output predictions to a CSV file, which must have the same format as the ground-truth CSV file and is stored at:
>>> ./output/pred/predictions.csv
You can find an example of this file here. The predictions file that participants generate for submission with their computed scores must have the same video names and number of entries, in the same order, as the example predictions file provided. Participants may therefore use the example file as a stand-in for the ground-truth file when generating their predictions file in the proper format for submission, by placing it at:
>>> ./data/gt/validation_gt.csv
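For illustration only, the following sketch writes a predictions file in a plausible layout, with each row containing the video name followed by its five trait scores; the actual column layout (including any header line) must be copied from the example predictions file provided.

import csv
import os

# Hypothetical sketch: the real column layout (header line, trait order) must match
# the example predictions file / ground-truth CSV provided by the organizers.
def write_predictions(video_names, scores, out_path="./output/pred/predictions.csv"):
    # video_names: video file names, in the same order as the ground truth
    # scores: one list of 5 trait values in [0, 1] per video
    os.makedirs(os.path.dirname(out_path), exist_ok=True)
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        for name, traits in zip(video_names, scores):
            writer.writerow([name] + ["%.6f" % t for t in traits])

# Example call with made-up values:
write_predictions(["video_0001.mp4"], [[0.5, 0.5, 0.5, 0.5, 0.5]])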
Then, you can test your results using the script evaluateTraits.py. It reads the CSV files for the predictions and the ground truth and performs the evaluation. The resulting output is a list of 5 elements, which are the mean accuracies over all videos for each of the Big Five personality traits.
For a given trait and video, the accuracy is computed simply as 1 minus the absolute difference between the predicted value and the ground-truth value. The mean accuracy over all Big Five traits is also computed.
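For example, if the predicted value for a trait is 0.70 and the ground-truth value is 0.55, the accuracy for that video and trait is 1 - |0.70 - 0.55| = 0.85.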
Note that the predictions should be in the same order as the ground-truth, so the first column of the CSV files should be identical both for the predictions and the ground truth.
Taking the last point into account, the main function of the script evaluateTraits.py can load the ground-truth file for each data subset as follows:
>>> gt = np.asarray(readScores("./data/gt/training_gt.csv"))
>>> gt = np.asarray(readScores("./data/gt/validation_gt.csv"))
>>> gt = np.asarray(readScores("./data/gt/test_gt.csv"))
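As a rough illustration of what the evaluation computes (not the official evaluateTraits.py code), the sketch below assumes that the readScores helper can be imported from the evaluation script, that it returns one row of five trait values per video, and that predictions and ground truth are in the same row order.

import numpy as np
from evaluateTraits import readScores  # assumed importable from the evaluation script

# Sketch of the metric only; use the official script evaluateTraits.py for the
# actual evaluation.
pred = np.asarray(readScores("./output/pred/predictions.csv"))  # shape (num_videos, 5)
gt = np.asarray(readScores("./data/gt/validation_gt.csv"))      # shape (num_videos, 5)

accuracy = 1.0 - np.abs(pred - gt)            # accuracy per video and per trait
mean_per_trait = accuracy.mean(axis=0)        # mean accuracy for each Big Five trait
print(mean_per_trait, mean_per_trait.mean())  # per-trait accuracies and overall mean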
To submit your results, you just have to provide a ZIP file containing:
Keep in mind that the better your descriptions are, the better you will be rated by other participants and the more downloads you will get!
Enjoy it!
CHALEARN Contest Rules for ChaLearn Looking at People
First Impression Challenge 2016
http://gesture.chalearn.org/icpr2016_contest
Official rules
https://docs.google.com/document/d/1hOhiy1ken9iOsxB7lknCdRUxI8YtwnU7dKgyciY-aMM/edit
Common terms used in these rules:
These are the official rules that govern how the Joint Contest on Multimedia Challenges Beyond Visual Analysis contest promotion will operate. This promotion will be simply referred to as the “contest” or the “challenge” throughout the rest of these rules and may be abbreviated on our website, in our documentation, and other publications as ChaLearn LAP.
In these rules, “organizers”, “we”, “our”, and “us” refer to CHALEARN, and “MS” refers to Microsoft; “participant”, “you”, and “yourself” refer to an eligible contest participant.
This is a skill-based contest and chance plays no part in the determination of the winner(s). The goal of the contest is to recognize apparent personality traits from short video clips; the challenge is called First Impressions.
The registered participants will be notified by email of any change in the schedule.
30th June, 2016: Beginning of the quantitative competition, release of development (with labels) and validation data (without labels).
8th August, 2016: Release of encrypted final evaluation data (without labels) and validation labels. Participants can start training their methods with the whole data set.
12th August, 2016: Release of final evaluation data decryption key. Participants start predicting the results on the final evaluation data.
16th August, 2016: End of the quantitative competition. Deadline for submission of predictions on the final evaluation data. The organizers start the code verification by running it on the final evaluation data.
17th August, 2016: Deadline for submitting the fact sheets.
20th August, 2016: Release of the verification results to the participants for review. Participants are invited to follow the paper submission guide for submitting contest papers.
25th August, 2016: Paper submission deadline to the associated workshop for participants.
2nd September, 2016: Notification of paper acceptance.
5th September, 2016: Camera ready of contest papers.
December 2016: ICPR 2016 Joint Contest on Multimedia Challenges Beyond Visual Analysis, challenge results, award ceremony.
You are eligible to enter this contest if you meet the following requirements:
You are not an employee of CHALEARN or any of the sponsoring or co-organizing entities, including Microsoft, NVIDIA, INAOE, Facebook; and
This contest is void within the geographic area identified above and wherever else prohibited by law.
If you choose to submit an entry, but are not qualified to enter the contest, this entry is voluntary, and any entry you submit is governed by the remainder of these contest rules; CHALEARN reserves the right to evaluate it for scientific purposes. If you are not qualified to submit a contest entry and still choose to submit one, under no circumstances will such entries qualify for sponsored prizes.
To be eligible for judging, an entry must meet the following content/technical requirements:
The organizers will use test data to perform the final evaluation, hence the participants’ final entry will be based on test data.
The contest organizers will make available to the participants a training dataset with truth labels, and a validation set with no truth labels. The validation data will be used by the participants for practice purposes to validate their systems. It will be similar in composition to the test set (validation labels will be provided in the final test stage of the challenge).
The organizers may also perform additional post-challenge analyses using extra data, but the results will not affect the ranking of the challenge performed with the test data.
Other than what is set forth below, we are not claiming any ownership rights to your entry. However, by submitting your entry, you:
If you do not want to grant us these rights to your entry, please do not enter this contest.
The board of CHALEARN will select a panel of judges to judge the entries; all judges will be forbidden to enter the contest and will be experts in causality, statistics, machine learning, computer vision, or a related field, or experts in challenge organization. A list of the judges will be made available upon request. The judges will review all eligible entries received and select three winners for each of the three competition tracks based upon the prediction score on test data. The judges will verify that the winners complied with the rules, including that they documented their method by filling out a fact sheet.
The decisions of these judges are final and binding. The distribution of prizes according to the decisions made by the judges will be made within three (3) months after completion of the last round of the contest. If we do not receive a sufficient number of entries meeting the entry requirements, we may, at our discretion based on the above criteria, not award any or all of the contest prizes below. In the event of a tie between any eligible entries, the tie will be broken by giving preference to the earliest submission, using the time stamp of the submission platform.
First place: 1,500 dollars and travel award (*) + Award certificate
Second place: 1,000 dollars and travel award (*) + Award certificate
Third place: 500 dollars and travel award (*) + Award certificate
(*) Estimated amount: 400 USD; the final amount will be based on need and availability. The travel award may be used for one of the workshops organized in conjunction with the challenge. The award money will be granted in reimbursement of expenses including airfare, ground transportation, hotel, or workshop registration. Reimbursement is conditioned on (i) attending the workshop, (ii) making an oral presentation of the methods used in the challenge, and (iii) presenting original receipts and boarding passes.
Other travel awards may be distributed to deserving participants based upon need and availability.
http://gesture.chalearn.org/2016-looking-at-people-eccv-workshop-challenge/si-tac.
The organizers may also sponsor other events to stimulate participation.
If there is any change to data, schedule, instructions of participation, or these rules, the registered participants will be notified at the email they provided with the registration.
If you are a potential winner, we will notify you by sending a message to the e-mail address listed on your final entry within seven days following the determination of winners. If the notification that we send is returned as undeliverable, or you are otherwise unreachable for any reason, we may award the prize to an alternate winner, unless forbidden by applicable law.
Winners who have entered the contest as a team will be responsible for sharing any prize among their members. The prize will be delivered to the registered team leader. If this person becomes unavailable for any reason, the prize will be delivered to the authorized account holder of the e-mail address used to make the winning entry.
If you are a potential winner, we may require you to sign a declaration of eligibility, use, indemnity and liability/publicity release and applicable tax forms. If you are a potential winner and are a minor in your place of residence, we require that your parent or legal guardian be designated as the winner, and we may require that they sign a declaration of eligibility, use, indemnity and liability/publicity release on your behalf. If you (or your parent/legal guardian, if applicable) do not sign and return these required forms within the time period listed on the winner notification message, we may disqualify you (or the designated parent/legal guardian) and select an alternate winner.
We will post changes in the rules or changes in the data as well as the names of confirmed winners (after contest decisions are made by the judges) online on http://gesture.chalearn.org/. This list will remain posted for one year or will be made available upon request by sending email to mmgesture@chalearn.org.
If an unforeseen or unexpected event (including, but not limited to: someone cheating; a virus, bug, or catastrophic event corrupting data or the submission platform; someone discovering a flaw in the data or modalities of the challenge) that cannot be reasonably anticipated or controlled, (also referred to as force majeure) affects the fairness and / or integrity of this contest, we reserve the right to cancel, change or suspend this contest. This right is reserved whether the event is due to human or technical error. If a solution cannot be found to restore the integrity of the contest, we reserve the right to select winners based on the criteria specified above from among all eligible entries received before we had to cancel, change or suspend the contest subject to obtaining the approval from the Régie des Alcools, des Courses et des Jeux with respect to the province of Quebec.
Computer “hacking” is unlawful. If you attempt to compromise the integrity or the legitimate operation of this contest by hacking or by cheating or committing fraud in any way, we may seek damages from you to the fullest extent permitted by law. Further, we may ban you from participating in any of our future contests, so please play fairly.
ChaLearn is the sponsor of this contest.
955 Creston Road,
Berkeley, CA 94708, USA
Microsoft, University of Barcelona, NVIDIA, Human Pose Recovery and Behavior Analysis group, INAOE, and Facebook are the co-sponsors of this contest. Additional sponsors can be added during the competition period.
Privacy
During the development phase of the contest and when they submit their final entries, contest participants do not need to disclose their real identity, but must provide a valid email address where we can deliver notifications to them regarding the contest. To be eligible for prizes, however, contest participants will need to disclose their real identity to the contest organizers, informing them by email of their name, professional affiliation, and address. To enter the contest, participants will need to become users of the Codalab platform. Any profile information stored on this platform can be viewed and edited by the users. After the contest, participants may cancel their Codalab account and cease to be users of that platform. All personal information will then be destroyed. The Codalab privacy policy will apply to contest information submitted by participants on Codalab. Otherwise, CHALEARN's privacy policy will apply to this contest and to all information that we receive directly from you or that you have submitted as part of your contest entry on Codalab. Please read the privacy policy on the contest entry page before accepting the official rules and submitting your entry. Please note that by accepting the official rules you are also accepting the terms of the CHALEARN privacy policy: http://www.chalearn.org/privacy.html.
Certificate of acceptation of prize for ChaLearn Looking at People First Impression Challenge 2016
Team name:
Contact name:
Address:
Country of residence:
Date of birth:
Email:
Rank in challenge:
Prize received:
By accepting this prize, I certify that I have read and understood the rules of the challenge and that I am a representative of the team authorized to receive the prize and sign this document.
To the best of my knowledge, all the team members followed the rules and did not cheat in participating in the challenge. I certify that the team complied with all the challenge requirements, including that:
- The team publicly released the source code of the software necessary to reproduce the final entry via mmgesture@chalearn.org and http://gesture.chalearn.org/ under a public license, chosen from among popular OSI-approved licenses (http://opensource.org/licenses). The code will remain publicly accessible on-line for a period of not less than three years following the end of the challenge.
- The team filled out the requested survey (fact sheet) and, to the best of my knowledge, all information provided is correct.
- The team is invited (not mandatory) to submit a paper of at most 8 pages to the ICPR Joint Workshop and Contest on Multimedia Challenges Beyond Visual Analysis 2016, summarizing their contribution to the contest.
I recognize that I am solely responsible for all applicable taxes related to accepting the prize. NOTE: IF A PRIZE IS DONATED BY CHALEARN, THE RECIPIENT MUST FILL OUT A W9 OR W8BEN FORM.
I grant CHALEARN, the ChaLearn LAP 2016 competition sponsors, and the contest organizers the right to use, review, assess, test, and otherwise analyze the results and other material I submitted in connection with this contest and any future research or contests sponsored by CHALEARN and co-sponsors of this competition, and the right to feature my entry and all its content in connection with the promotion of this contest in all media (now known or later developed).
CHALEARN and ChaLearn LAP 2016 competition sponsors may use the name of my team, my name, and my place of residence online and in print, or in any other media, in connection with this contest, without payment or compensation.
Name:
Date:
Signature:
The ChaLearn Looking at People First Impressions 2016 Contest