The advent of transformer-based models such as BERT has led to the rise of neural ranking models. These have improved the effectiveness of retrieval systems well beyond that of lexical term-matching models such as BM25. While monolingual tasks have benefited from large-scale training collections such as MS MARCO and from advances in model architectures, cross-language retrieval has fallen behind these advancements. This paper introduces ColBERT-X, a generaliz...