Optimizing What Matters: AUC-Driven Learning for Robust Neural Retrieval
Abstract
A new training objective, the Mann-Whitney (MW) loss, improves retriever calibration and ranking quality by directly optimizing the Area under the ROC Curve (AUC), outperforming Contrastive Loss in retrieval-augmented generation tasks.
Dual-encoder retrievers depend on the principle that relevant documents should score higher than irrelevant ones for a given query. Yet the dominant Noise Contrastive Estimation (NCE) objective, which underpins Contrastive Loss, optimizes a softened ranking surrogate that we prove is oblivious to score-separation quality and unrelated to the Area under the ROC Curve (AUC). This mismatch leads to poor calibration and suboptimal performance in downstream tasks such as retrieval-augmented generation (RAG). To address this limitation, we introduce the MW loss, a new training objective that maximizes the Mann-Whitney U statistic, which is mathematically equivalent to the AUC. MW loss encourages each positive-negative pair to be correctly ranked by minimizing binary cross-entropy over score differences. We provide theoretical guarantees that MW loss upper-bounds the AUC ranking error (one minus AUC), better aligning optimization with retrieval goals. We further promote ROC curves and AUC as natural threshold-free diagnostics for evaluating retriever calibration and ranking quality. Empirically, retrievers trained with MW loss consistently outperform contrastive counterparts in AUC and standard retrieval metrics, yielding better-calibrated and more discriminative retrievers for high-stakes applications such as RAG.
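The abstract describes MW loss as binary cross-entropy over positive-negative score differences. Below is a minimal PyTorch sketch of that idea; the function name `mw_loss` and the flat batch layout are our assumptions for illustration, not the authors' released implementation.

```python
import torch
import torch.nn.functional as F

def mw_loss(pos_scores: torch.Tensor, neg_scores: torch.Tensor) -> torch.Tensor:
    """Pairwise ranking loss: BCE over all positive-negative score differences.

    pos_scores: shape (P,), query-document scores of relevant documents.
    neg_scores: shape (N,), query-document scores of irrelevant documents.
    """
    # Score differences d_ij = s_i^+ - s_j^-, one per positive-negative pair.
    diffs = pos_scores.unsqueeze(1) - neg_scores.unsqueeze(0)  # shape (P, N)
    # sigmoid(d_ij) models P(positive outranks negative); BCE against a
    # target of 1 penalizes every mis-ranked pair, so minimizing this loss
    # maximizes a smooth surrogate of the Mann-Whitney U statistic (AUC).
    return F.binary_cross_entropy_with_logits(diffs, torch.ones_like(diffs))
```

The bound mentioned in the abstract follows because -log sigmoid(d) >= log 2 whenever d <= 0, so the mean BCE over pairs is at least log 2 times the fraction of mis-ranked pairs, i.e. log 2 * (1 - AUC).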
Community
We propose a new loss function that optimizes the AUC for neural retrieval. This makes retrieval scores more reliably thresholdable, a crucial property for real-world retrieval systems.
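As a concrete illustration of the threshold-free diagnostic the paper advocates, here is a small NumPy sketch (ours, not from the paper) that computes AUC as the normalized Mann-Whitney U statistic, i.e. the fraction of positive-negative pairs the retriever ranks correctly:

```python
import numpy as np

def auc_mann_whitney(pos_scores: np.ndarray, neg_scores: np.ndarray) -> float:
    """AUC as the normalized Mann-Whitney U statistic: the fraction of
    positive-negative score pairs ranked correctly, counting ties as half.
    Threshold-free, unlike accuracy at a fixed score cutoff."""
    diffs = pos_scores[:, None] - neg_scores[None, :]   # all pairwise gaps
    correct = (diffs > 0).sum() + 0.5 * (diffs == 0).sum()
    return float(correct) / diffs.size

# Example: a retriever whose positives mostly outrank its negatives.
pos = np.array([0.9, 0.7, 0.6])
neg = np.array([0.8, 0.3, 0.2])
print(auc_mann_whitney(pos, neg))  # 7 of 9 pairs correct -> ~0.778
```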
The following similar papers were recommended by the Semantic Scholar API (via Librarian Bot):
- BiXSE: Improving Dense Retrieval via Probabilistic Graded Relevance Distillation (2025)
- RRRA: Resampling and Reranking through a Retriever Adapter (2025)
- CoDiEmb: A Collaborative yet Distinct Framework for Unified Representation Learning in Information Retrieval and Semantic Textual Similarity (2025)
- Zero-shot Multimodal Document Retrieval via Cross-modal Question Generation (2025)
- Does Generative Retrieval Overcome the Limitations of Dense Retrieval? (2025)
- Granite Embedding R2 Models (2025)
- Improving Dense Passage Retrieval with Multiple Positive Passages (2025)