
Multi-task learning for joint re-identification, team affiliation, and role classification for sports visual tracking

Mansourian, A. M. ; Sharif University of Technology | 2023

  1. Type of Document: Article
  2. DOI: 10.1145/3606038.3616172
  3. Publisher: Association for Computing Machinery, Inc., 2023
  4. Abstract: Effective tracking and re-identification of players is essential for analyzing soccer videos, but it is a challenging task due to the non-linear motion of players, the similar appearance of players from the same team, and frequent occlusions. The ability to extract meaningful embeddings to represent players is therefore crucial for developing an effective tracking and re-identification system. In this paper, a multi-purpose part-based person representation method, called PRTreID, is proposed that simultaneously performs three tasks: role classification, team affiliation, and re-identification. In contrast to the existing literature, a single network is trained with multi-task supervision to solve all three tasks jointly. The proposed joint method is computationally efficient thanks to the shared backbone, and the multi-task learning yields richer and more discriminative representations, as demonstrated by both quantitative and qualitative results. To demonstrate the effectiveness of PRTreID, it is integrated with a state-of-the-art tracking method, using a part-based post-processing module to handle long-term tracking. The resulting tracker outperforms all existing tracking methods on the challenging SoccerNet tracking dataset. © 2023 ACM (An illustrative shared-backbone sketch follows this record.)
  5. Keywords: Computer vision ; Deep learning ; Deep metric learning ; Multi-object tracking ; Multi-task learning ; Part-based re-identification ; Re-identification ; Representation learning ; Soccer ; SoccerNet ; Sports videos ; Team affiliation
  6. Source: MMSports 2023 - Proceedings of the 6th International Workshop on Multimedia Content Analysis in Sports, Co-located with: MM 2023 ; 2023 , Pages 103-112 ; 979-840070269-3 (ISBN)
  7. URL: https://dl.acm.org/doi/10.1145/3606038.3616172
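
The abstract describes a single network trained with multi-task supervision over a shared backbone. Below is a minimal, hypothetical PyTorch sketch of that general idea: one feature extractor feeding three task heads (a re-identification embedding, team affiliation logits, and role classification logits). The backbone choice (ResNet-50), head dimensions, class counts, and loss composition are illustrative assumptions only and are not taken from the paper, which additionally relies on part-based representations and metric-learning objectives.

```python
# Hypothetical shared-backbone multi-task sketch; sizes and losses are
# placeholder assumptions, not the authors' PRTreID implementation.
import torch
import torch.nn as nn
import torchvision.models as models


class MultiTaskPlayerNet(nn.Module):
    def __init__(self, num_roles: int = 4, num_teams: int = 2, embed_dim: int = 512):
        super().__init__()
        # Shared backbone (assumed ResNet-50); all three tasks reuse its features.
        backbone = models.resnet50(weights=None)
        self.backbone = nn.Sequential(*list(backbone.children())[:-1])  # drop final FC
        feat_dim = 2048

        # Task-specific heads on top of the shared representation.
        self.reid_head = nn.Linear(feat_dim, embed_dim)   # embedding for re-identification
        self.team_head = nn.Linear(feat_dim, num_teams)   # team affiliation logits
        self.role_head = nn.Linear(feat_dim, num_roles)   # role classification logits

    def forward(self, x: torch.Tensor):
        feats = self.backbone(x).flatten(1)               # (B, 2048) shared features
        return self.reid_head(feats), self.team_head(feats), self.role_head(feats)


if __name__ == "__main__":
    model = MultiTaskPlayerNet()
    images = torch.randn(8, 3, 256, 128)                  # batch of player crops
    reid_emb, team_logits, role_logits = model(images)

    # Joint training: cross-entropy for the two classification tasks; a
    # metric-learning (e.g., triplet) term on reid_emb would be added as well.
    team_labels = torch.randint(0, 2, (8,))
    role_labels = torch.randint(0, 4, (8,))
    ce = nn.CrossEntropyLoss()
    loss = ce(team_logits, team_labels) + ce(role_logits, role_labels)
    loss.backward()
```

Because the backbone is shared, a single forward pass serves all three tasks, which is the source of the computational efficiency the abstract claims; the relative weighting of the per-task losses is a tunable design choice left as equal weights in this sketch.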