23–28 Feb 2025
IBS
Asia/Seoul timezone

Transformers for collider analysis

26 Feb 2025, 09:30
1h
IBS

Speaker

Ahmed Hammad (KEK)

Description

Attention-based Transformer networks have become increasingly prominent in collider analysis, delivering performance superior to earlier architectures in tasks such as jet tagging. However, their high computational demands and substantial data requirements pose significant challenges. In this talk, I will explore the role of Transformer networks in LHC analysis, focusing on various attention mechanisms, including self-attention, cross-attention, and differential attention. Additionally, I will discuss different strategies for reducing network complexity while preserving high performance.
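
For readers unfamiliar with the attention mechanisms named in the abstract, the sketch below illustrates plain scaled dot-product self-attention applied to a jet's constituents. It is a minimal illustration, not the speaker's implementation; the array shapes, the number of constituents, and the choice of input features are assumptions made for the example.

# Minimal sketch of scaled dot-product self-attention over jet constituents.
# Shapes and feature choices are illustrative assumptions, not the method
# presented in the talk.
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """x: (n_constituents, d_model) features of one jet's constituents."""
    q = x @ w_q                                   # queries (n, d_k)
    k = x @ w_k                                   # keys    (n, d_k)
    v = x @ w_v                                   # values  (n, d_v)
    scores = q @ k.T / np.sqrt(k.shape[-1])       # pairwise attention logits
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over constituents
    return weights @ v                            # attention-weighted features

# Toy example: 30 constituents with 4 input features each
# (e.g. pT, eta, phi, E), projected to 16-dimensional q/k/v.
rng = np.random.default_rng(0)
x = rng.normal(size=(30, 4))
w_q, w_k, w_v = (rng.normal(size=(4, 16)) for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)            # shape (30, 16)

Cross-attention follows the same pattern except that the queries come from one set of objects and the keys/values from another; the complexity-reduction strategies mentioned in the abstract typically target the quadratic cost of the scores matrix.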

Primary author

Ahmed Hammad (KEK)