
 

Submission Files

 

Your predictions on the test dataset can be submitted at any time during the challenge. A submission is a CSV file with the following fields:

 

 

Evaluation Metric

 

The competition asks participants to predict the click-through (finish + like) probability for each item in the test dataset. We use AUC (area under the ROC curve) as the challenge metric. The weights for 'finish' and 'like' in the final score are: final score = 0.7 × finish + 0.3 × like.
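The weighted scoring rule can be sketched as follows, with a dependency-free pairwise (Mann-Whitney) AUC that is equivalent to the area under the ROC curve; the toy labels and scores are made up for illustration.

```python
# Sketch of the scoring rule: final score = 0.7 * AUC(finish) + 0.3 * AUC(like).
# The AUC is computed pairwise: the probability that a random positive
# outscores a random negative, counting ties as half a win.

def auc(labels, scores):
    """Pairwise AUC over binary labels and real-valued scores."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical toy data, not from the competition.
finish_true  = [1, 0, 1, 1, 0]
finish_score = [0.9, 0.4, 0.7, 0.3, 0.3]
like_true    = [0, 0, 1, 0, 1]
like_score   = [0.1, 0.3, 0.8, 0.2, 0.6]

auc_finish = auc(finish_true, finish_score)   # 0.75 on this toy data
auc_like   = auc(like_true, like_score)       # 1.0 on this toy data
final_score = 0.7 * auc_finish + 0.3 * auc_like
```

On this toy data the final score is 0.7 × 0.75 + 0.3 × 1.0 = 0.825.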

 

In our competition, one must output a score for the probability of the click (finish + like) on each item. The AUC is then calculated as follows:

$$\mathrm{AUC} = \int_{\infty}^{-\infty} \mathrm{TPR}(T)\,\mathrm{FPR}'(T)\,dT$$

 

where T is a varying threshold parameter used to calculate the TPR (true positive rate) and FPR (false positive rate), given by

$$\mathrm{TPR}(T) = \frac{\mathrm{TP}(T)}{\mathrm{TP}(T) + \mathrm{FN}(T)}$$

and

$$\mathrm{FPR}(T) = \frac{\mathrm{FP}(T)}{\mathrm{FP}(T) + \mathrm{TN}(T)}$$

The leaderboard is ranked according to the AUC. The higher the AUC, the higher the ranking.
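The threshold-based view above can be sketched as a direct computation: sweep T over the observed scores, compute TPR(T) and FPR(T) from the confusion counts at each threshold, and take the area under the resulting ROC curve by the trapezoidal rule. The toy labels and scores are made up for illustration.

```python
# Build ROC points by sweeping a threshold T over the score values,
# then integrate TPR against FPR with the trapezoidal rule.

def roc_points(labels, scores):
    """(FPR, TPR) pairs from the highest threshold to the lowest."""
    thresholds = sorted(set(scores), reverse=True)
    P = sum(labels)            # number of positives
    N = len(labels) - P        # number of negatives
    points = [(0.0, 0.0)]      # T above all scores: nothing predicted positive
    for t in thresholds:
        tp = sum(1 for y, s in zip(labels, scores) if y == 1 and s >= t)
        fp = sum(1 for y, s in zip(labels, scores) if y == 0 and s >= t)
        points.append((fp / N, tp / P))
    points.append((1.0, 1.0))  # T below all scores: everything predicted positive
    return points

def auc_trapezoid(points):
    """Area under the piecewise-linear ROC curve."""
    area = 0.0
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        area += (x1 - x0) * (y0 + y1) / 2.0
    return area

# Hypothetical toy data, not from the competition.
labels = [1, 0, 1, 1, 0]
scores = [0.9, 0.4, 0.7, 0.3, 0.3]
area = auc_trapezoid(roc_points(labels, scores))   # 0.75 on this toy data
```

The trapezoidal area agrees with the pairwise formulation of AUC (ties contribute half), which is why the two definitions give the same leaderboard ranking.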

 

 

       

Short Video Understanding Challenge

Prize pool: $20,000
Teams: 1025
Start: 2019-02-11
Final submissions: 2019-04-09