
MMSys’22, June 14-17, 2022, Athlone, Ireland Midoglu et al.

Listing 1: Annotation structure for sample card event.

{
    '<timestamp>',
    '{
        "team":
        {
            "id": <team-id>,
            "type": "team",
            "value": "<team-name>"
        },
        "action": "<yellow/red> card",
        "player":
        {
            "id": <player-id>,
            "type": "player",
            "value": "<player-name>"
        }
    }'
}

Listing 2: Annotation structure for sample goal event. (*) Field optional.

{
    '<timestamp>',
    '{
        "team":
        {
            "id": <team-id>,
            "type": "team",
            "value": "<team-name>"
        },
        "action": "goal",
        "scorer":
        {
            "id": <player-id>,
            "type": "player",
            "value": "<player-name>"
        },
        "assist by":
        {
            "id": <player-id>,
            "type": "player",
            "value": "<player-name>"
        },
        "shot type":
        {
            "type": "goal shot type",
            "value": "<shot-type>"
        },
        "after set piece" (*):
        {
            "type": "set piece",
            "value": "penalty"
        }
    }'
}

4 EVALUATION

Participants are free to develop their models in any language or platform they prefer. However, a well-documented open repository containing the source code for the proposed solution is required for each submission. Note that no data should be included within the repository itself. The hidden test dataset will be injected during evaluation, and participants can assume that the dataset will be located at /mmsys22soccer.

4.1 Performance

As the perceived quality of highlight clips, thumbnails, and game summaries is highly subjective, the performance of the submitted solutions will be evaluated by a jury. In particular, a subjective survey will be conducted in a double-blind fashion with a jury consisting
of unaffiliated video experts selected by the challenge organizers.
For each submitted solution for a given task, the jury members will
be asked to provide an overall subjective performance score out of
100.
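As an illustration only (not part of the challenge requirements), an event in the shape of Listings 1 and 2 can be summarized once it has been deserialized into a Python dictionary; the `describe_event` helper and the sample team and player names below are hypothetical:

```python
# Hypothetical helper: summarize an event annotation shaped like Listing 2.
def describe_event(event: dict) -> str:
    """Render a one-line description from an annotation dictionary."""
    parts = [f"{event['action']} by {event['team']['value']}"]
    if "scorer" in event:
        parts.append(f"scorer: {event['scorer']['value']}")
    if "assist by" in event:  # optional fields may be absent, cf. "(*)" in Listing 2
        parts.append(f"assist: {event['assist by']['value']}")
    return ", ".join(parts)

# Sample annotation mirroring Listing 2; IDs and names are made up.
sample = {
    "team": {"id": 1, "type": "team", "value": "Home FC"},
    "action": "goal",
    "scorer": {"id": 10, "type": "player", "value": "A. Player"},
    "assist by": {"id": 7, "type": "player", "value": "B. Player"},
    "shot type": {"type": "goal shot type", "value": "header"},
}

print(describe_event(sample))  # goal by Home FC, scorer: A. Player, assist: B. Player
```

Annotations stored as JSON under the injected dataset directory can be loaded the same way with the standard `json` module before being passed to such a helper.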
4.2 Complexity

Complexity is a factor influencing how well a solution can satisfy practical real-time requirements. The following objective metrics will be used to evaluate the submitted solutions in terms of complexity. Participants are asked to calculate these metrics for their model and include the values in their manuscript:

• Latency: Average runtime per sample (ms). / Frame rate: Average number of frames the submitted solution can analyze per second (fps).
• Number of parameters: Total number of trainable parameters in the submitted solution.
• Model size: Storage size (size on disk) of the submitted solution (MB).

4.3 Final Score

Aggregation of the subjective performance scores with the objective complexity scores per submission will be undertaken by the challenge organizers. For Task 3, the text (3a) and video (3b) subtasks are weighted 25% and 75%, respectively.
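For illustration, the latency and frame-rate metrics, and the Task 3 weighting, can be sketched in a few lines of Python; the stand-in model, timing harness, and example scores below are hypothetical, not part of the challenge tooling:

```python
import time

def model(frame):
    """Stand-in 'model': any per-sample callable; real submissions differ."""
    return sum(frame)  # placeholder for actual inference

def measure(fn, samples, warmup=2):
    """Return (average latency in ms per sample, frame rate in fps)."""
    for s in samples[:warmup]:        # warm-up runs, excluded from timing
        fn(s)
    start = time.perf_counter()
    for s in samples:
        fn(s)
    elapsed = time.perf_counter() - start
    return 1000.0 * elapsed / len(samples), len(samples) / elapsed

def task3_score(text_score, video_score):
    """Weight the Task 3 subtasks: text (3a) 25%, video (3b) 75%."""
    return 0.25 * text_score + 0.75 * video_score

frames = [[0.0] * 1000 for _ in range(100)]   # dummy "frames"
latency_ms, fps = measure(model, frames)
print(f"Latency: {latency_ms:.3f} ms/sample, Frame rate: {fps:.1f} fps")
print(f"Task 3 example score: {task3_score(80, 60)}")  # 0.25*80 + 0.75*60 = 65.0
```

Model size can be read directly from the serialized model's size on disk, e.g. via `os.path.getsize`.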
5 CONCLUSION AND OUTLOOK

The MMSys'22 Grand Challenge on AI-based Video Production for Soccer addresses the task of automating end-to-end soccer video production systems. Such systems are used for generating event highlights and game summaries, which are operations typically requiring tedious manual labor. An AI-based solution to replace the manual operations has the potential both to reduce human interactions and to yield better results, therefore providing a more cost
