
MSR Image Recognition Challenge (IRC)

2016-05-31 19:16 · 666 views

Microsoft Research is happy to continue hosting this series of Image Recognition (Retrieval) Grand Challenges. Do you have what it takes to build the best image recognition system? Enter the MSR Image Recognition Challenges at ACM Multimedia and/or IEEE ICME and develop your image recognition system on real-world, large-scale data.


Current Challenge: MS-Celeb-1M: Recognizing One Million Celebrities in the Real World


Details: MSR Image Recognition Challenge @ ACM MM 2016       

5/27/2016: (new!) Sample code, GUIDs, and the test tool have been released to each team; details
5/9/2016: Development dataset is released for download, to be used during the dry run.
5/9/2016: Competition/paper registration is open here. Please provide your Team Name (as the paper title), Organization (as the paper abstract), and Team Members and contact information (as the paper authors).
4/29/2016: Entity list is released for download.
4/5/2016: Cropped and aligned faces are ready for download.
4/4/2016: More data are available for download: samples
4/1/2016: MS-Celeb-V1 image thumbnails are ready for download!


Last Challenge: MSR IRC @ IEEE ICME 2016

We just finished the evaluation! More details here
Important Dates:
The dataset for this challenge is described here and can be downloaded here.
Feb 23, 2016: Registration web site is open.
Feb 23, 2016: ICME site is open; please register a placeholder for your final report: select Track = "Grand Challenges" and Subject Area = "MSR Grand Challenge: Image Recognition Challenge".
Feb 24, 2016: Update about data sets, and FAQ.
Feb 26, 2016: Update about sample codes, and FAQ.
March 3, 2016: Update about test tool, team keys, and FAQ.
March 7-10, 2016: Dry-run traffic sent to your system for testing/verification, and FAQ.
March 10, 2016: Update about final evaluation and FAQ.
March 14, 2016: Evaluation starts; please keep your system running stably.
March 16, 2016: Evaluation ends (0:00am PDT).
March 21, 2016: Evaluation results announced (see the rank table below).
April 3, 2016: Grand Challenge paper and data submission.

April 28: Paper acceptance notification
May 13: Paper camera-ready version due

Rank  TeamID  Team Name      Precision@5  Used External Data
1     30      NLPR_CASIA     89.65%       Yes
2     16      ybt_bj         86.90%       No
3     5       NFS2016        85.00%       Yes
4     20      WestMountain   84.75%       Yes
5     3       rucmm          84.55%       Yes
6     17      CASIIE-Asgard  83.40%       Yes
7     31      GoRocketsGo    81.85%       Yes
8     2       CDL-USTC       73.10%       Yes
9     4       lyg            71.35%       No
10    10      FrenchBulldog  71.25%       No
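Precision@5 in the table above is, roughly, the fraction of each system's top-5 predictions that are correct, averaged over test images. The sketch below illustrates that reading of the metric; the official scoring script was not released on this page, so the function and the data names are our assumptions, and the real metric may differ in details (e.g. handling of unrecognized faces):

```python
def precision_at_k(predictions, truth, k=5):
    """Average fraction of the top-k predicted labels that are correct.

    predictions: {query_id: ranked list of predicted labels}
    truth:       {query_id: set of correct labels}
    Illustrative sketch only; not the official evaluation code.
    """
    scores = []
    for qid, ranked in predictions.items():
        topk = ranked[:k]
        hits = sum(1 for label in topk if label in truth[qid])
        scores.append(hits / k)
    return sum(scores) / len(scores)

# Toy example with two test images.
preds = {"img1": ["a", "b", "c", "d", "e"], "img2": ["x", "y", "z", "w", "v"]}
gold = {"img1": {"a", "c"}, "img2": {"q"}}
print(precision_at_k(preds, gold))   # (2/5 + 0/5) / 2 = 0.2
```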
 


Past Challenge: MSR-Bing IRC @ ACM MM 2015

We have finished the challenge in ACM MM 2015. More details here.

Important Dates:
Dataset available for download (Clickture-Lite) and hard-disk delivery (Clickture-Full).
June 18, 2015: Trial set available for download and test.
June 24, 2015: Final evaluation set for Task#1 available for download (encrypted)
June 26, 2015: Evaluation starts (0:00am PDT) 
June 27, 2015: Evaluation ends (0:00am PDT) 
June 28, 2015: Evaluation results announced.
July 7, 2015: Paper submission deadline
July 27, 2015: Notification of acceptance
August 15, 2015: Camera-ready Submission Deadline
October 28,2015: Grand Challenge Workshop

Latest updates: 
May 22, 2015: Pre-registration form available at http://1drv.ms/1K9aAxo.
June 11, 2015: Training data set ready for download: details
June 18, 2015: Trial set for Task#1 is available for download (the same as ACM MM 2014): http://1drv.ms/1pq08Wq
June 18, 2015: Trial code samples for Task#2 are delivered by email. Contact us if you haven't received them.
June 19, 2015: Test tool for Task#2 is delivered by email. Contact us if you haven't received it.
June 24, 2015: Evaluation set for Task#1 available here (encrypted); please download and unzip it.
June 24-25, 2015: For Task#2, dry-run traffic will be sent to your recognition service; please keep your recognition service running!
June 26, 2015: Password to decrypt the Task#1 evaluation data is delivered to all participants by email at 0:00am PST; please let us know if you haven't received it.
June 28, 2015: Evaluation results are sent back to teams.
July 1, 2015: Evaluation result summary:

 
TeamID  Team Name       Task1: Image Retrieval                        Task2: Image Recognition
                        Run1-Master  Run2      Run3      Rank-Task1   Accuracy@1  Accuracy@5  Rank-Task2
1       TINA            -            -         -         -            -           -           -
2       rucmm           0.52006239   0.489675  0.492945  1            42%         71%         2
3       SSDUT           -            -         -         -            -           -           -
4       AmritaLearning  -            -         -         -            -           -           -
5       HIK             -            -         -         -            -           -           -
6       DeepIR          -            -         -         -            -           -           -
7       IVA             0.471570894  0.462925  0.463261  3            57%         85%         1
8       VMA             -            -         -         -            -           -           -
9       WJ-QCZ          0.486851763  -         -         2            -           -           -
-       Random          0.425987601  -         -         -            -           -           -
-       Groundtruth     0.692381702  -         -         -            -           -           -


Past Challenge: MSR-Bing IRC @ ICME 2015 

Important dates :
April 21: Final evaluation set available for download here (encrypted)
April 24: Evaluation starts (password to decrypt the evaluation set delivered at 2:30am PDT on April 24)
April 25: Evaluation ends at 3:00am PDT (very beginning of April 25); result submission due.
April 28: Evaluation results have been sent to the corresponding teams.
May 1, 2015: Paper submission (please follow the guidelines of the main conference)
May 10, 2015: Notification
May 15, 2015: Paper camera-ready due

Updates:
May 1: More details about the evaluation results are shared with participants.
April 28: Evaluation results sent back to the corresponding teams.
April 23: Password to decode the evaluation set was delivered to all participants by email. Please contact us if you haven't received it.
April 21: Evaluation set available at http://1drv.ms/1K3WIBv (encrypted); please download and unzip it.
April 17: Added one more contact: Yuxiao Hu (yuxhu@microsoft.com)
April 17: Trial set is available for download (the same as ACM MM 2014): http://1drv.ms/1pq08Wq
April 17: CMT website is online: https://cmt.research.microsoft.com/IRC2015ICME/default.aspx


Past Challenge: MSR-Bing IRC @ ACM MM 2014

For more details about the challenge, please visit:

1. The grand challenge page at ACM Multimedia 2014

2. IRC @ MM 14 at this site

Latest announcements will be posted here.

Updates:
July 5: Evaluation results announced.

June 26: Due to many requests, the MM14 grand challenge submission deadline was extended by a week, so we have also extended the MSR-Bing challenge result submission deadline by one week. Please check the updated dates below.
June 25: Encrypted evaluation dataset is available for download now: http://1drv.ms/1lfawui. Please follow the steps below to submit your prediction results:
A. Register a "paper" entry at https://cmt.research.microsoft.com/IRC2014. Make sure to finish this step ASAP (at the latest 30 minutes before the challenge starts). The password to decrypt the evaluation set will be sent through CMT.
B. Download the encrypted evaluation dataset. Please note the downloaded file was zipped twice (once with a password and once without).
C. Unzip the downloaded file (without a password) to make sure the file is not corrupted.
D. Unzip the file you get from Step C with the password that will be sent to you through CMT. You will then get two files: one is a (key, image thumbnail) table, and the other is a (key, label) table.
E. Please refer to this page for details on how to generate prediction results.
F. Before the end of the challenge, submit your prediction results (up to 6 zipped files; see instructions below).
G. Submit your grand challenge paper according to the guidelines on the ACM Multimedia 2014 website. Please note the CMT site is only for prediction result submission. Your paper should be submitted to the EasyChair paper system. Make sure that you include your evaluation results in the paper (they will be sent to you before the paper submission deadline).
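The download steps above amount to two extractions of a doubly-zipped archive. The sketch below illustrates that flow in Python; it builds a tiny in-memory stand-in archive so it runs end to end, and the file names and contents are hypothetical (the real inner archive additionally requires the CMT password):

```python
import io
import zipfile

# Stand-in for the doubly-zipped evaluation file (hypothetical names/contents;
# the real inner archive is also password-protected).
inner_buf = io.BytesIO()
with zipfile.ZipFile(inner_buf, "w") as inner:
    inner.writestr("thumbnails.tsv", "key1\t<base64 thumbnail>\n")
    inner.writestr("labels.tsv", "key1\tsome-label\n")

outer_buf = io.BytesIO()
with zipfile.ZipFile(outer_buf, "w") as outer:
    outer.writestr("evaluation_inner.zip", inner_buf.getvalue())

# Step C: unzip the outer file (no password) and verify it is not corrupted.
with zipfile.ZipFile(outer_buf) as outer:
    assert outer.testzip() is None        # None means every member's CRC checks out
    inner_bytes = outer.read("evaluation_inner.zip")

# Step D: unzip the inner file; for the real data, pass the CMT password,
# e.g. inner.extractall(pwd=b"password-from-cmt").
with zipfile.ZipFile(io.BytesIO(inner_bytes)) as inner:
    names = sorted(inner.namelist())

print(names)   # ['labels.tsv', 'thumbnails.tsv'] -- the two (key, ...) tables
```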

June 25: Evaluation set will be available by EOD today, and CMT will come online at the same time. Instructions: you are requested to register an entry at the CMT site to receive the password to decrypt the evaluation set, and to submit your prediction results there. Please note that prediction results based on Clickture-Lite (1M images) are mandatory, while results on Clickture-Full (40M images) are optional. When submitting prediction results, please name the files so we know which are based on the 1M dataset (include "1M" in the file name), which are based on the 40M dataset (include "40M" in the file name), and which are master runs (include "master" in the file name). If you submit results based on both datasets, you may submit three runs for each dataset (including one master run per dataset). Please note the final evaluation will be based on the master runs, though we will also return the scores for the other runs. (New!)
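The file-naming convention above ("1M"/"40M" plus an optional "master") can be sanity-checked before submission. A small sketch; the helper and the example file names are ours, not part of the official instructions beyond the required substrings:

```python
def classify_run(filename: str):
    """Classify a submission file per the naming convention above.

    Returns (dataset, is_master); raises ValueError if the name encodes
    neither dataset. Illustrative helper, not an official tool.
    """
    name = filename.lower()
    if "40m" in name:
        dataset = "Clickture-Full (40M)"
    elif "1m" in name:
        dataset = "Clickture-Lite (1M)"
    else:
        raise ValueError(f"cannot tell which dataset {filename!r} used")
    return dataset, "master" in name

# Hypothetical file names following the convention.
print(classify_run("team7_1M_master_run1.zip"))   # ('Clickture-Lite (1M)', True)
print(classify_run("team7_40M_run2.zip"))         # ('Clickture-Full (40M)', False)
```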
June 25: Evaluation starts and ends dates changed (1 day delay).
June 19: Trial set is available here: http://1drv.ms/1pq08Wq  (New!)

Schedule (updated on June 26):
Feb 15, 2014: Dataset available for download (Clickture-Lite) and hard-disk delivery (Clickture-Full).
June 18: Trial set available for download and test.
June 25: Final evaluation set available for download (encrypted)
July 3 (updated/firm): Evaluation starts (password to decrypt the evaluation set delivered at 11:30pm PDT on July 2)
July 4 (updated/firm): Evaluation ends at 0:00am PDT (very beginning of July 4); result submission due.
July 5: Evaluation results announced.
July 6, 2014: Paper submission (please follow the guidelines of the main conference)


Links to the Challenges at Different Conferences:

MSR-Bing IRC at ACM Multimedia 2014 (past)
MSR-Bing IRC at ICME 2014 (past)
MSR-Bing IRC at ACM Multimedia 2013 (past)
Source: http://research.microsoft.com/en-us/projects/irc/