Lamarr at LeQua2024: Regularized Soft-Max Likelihood Maximization
As members of the Lamarr Institute, we participated in the open LeQua2024 competition. The goal of this competition was to predict the prevalences of classes in unlabeled sets of data, given a labeled training set. Our submission builds on the regularized maximization of a likelihood function with constraints that are implemented through a soft-max operator. Ultimately, this method ranked in the top three across all four disciplines of LeQua2024; most notably, we achieved first place in discipline T4, a binary quantification task with covariate shift. In this paper, we detail our approach to the competition.
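The core idea named in the abstract, maximizing a likelihood over class prevalences that are kept on the probability simplex by a soft-max reparameterization, can be illustrated with a short sketch. The Python snippet below is illustrative only: the function name estimate_prevalences, the L2 regularizer on the latent parameters, and the use of scipy.optimize.minimize are assumptions made for demonstration, not the authors' actual implementation from the paper.

# Minimal sketch of soft-max-constrained likelihood maximization for
# class prevalence estimation. All names, the choice of regularizer, and
# the optimizer are illustrative assumptions, not the authors' exact method.
import numpy as np
from scipy.optimize import minimize
from scipy.special import softmax

def estimate_prevalences(posteriors, train_prevalences, reg_strength=1.0):
    """Estimate the class prevalences of an unlabeled sample.

    posteriors:        (n_samples, n_classes) classifier posteriors P(y|x)
                       on the unlabeled sample, from a model trained on the
                       labeled data.
    train_prevalences: (n_classes,) class prevalences of the training set.
    reg_strength:      weight of an (assumed) L2 regularizer on the latent
                       soft-max parameters.
    """
    n_classes = posteriors.shape[1]

    def objective(latent):
        p = softmax(latent)  # soft-max keeps the prevalences on the simplex
        # Re-weight the posteriors from the training prevalences to the
        # candidate prevalences p and evaluate the negative log-likelihood
        # of the unlabeled sample under this candidate.
        weights = posteriors * (p / train_prevalences)
        likelihoods = weights.sum(axis=1)
        nll = -np.log(np.clip(likelihoods, 1e-12, None)).sum()
        return nll + reg_strength * np.sum(latent ** 2)

    result = minimize(objective, x0=np.zeros(n_classes), method="L-BFGS-B")
    return softmax(result.x)

Re-weighting the posteriors by the ratio of candidate prevalences to training prevalences is the standard way likelihood-based quantifiers score a candidate prior; the soft-max over an unconstrained latent vector guarantees a non-negative estimate that sums to one without explicit simplex constraints.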
- Published in: 4th International Workshop on Learning to Quantify (LQ 2024)
- Type: Inproceedings
- Authors: Lotz, Tobias; Bunse, Mirko
- Year: 2024
- Source: https://hal.science/hal-04942724v1/file/LQ2024Proc.pdf#page=100
Citation information
Lotz, Tobias; Bunse, Mirko: Lamarr at LeQua2024: Regularized Soft-Max Likelihood Maximization, 4th International Workshop on Learning to Quantify (LQ 2024), 2024, p. 93, https://hal.science/hal-04942724v1/file/LQ2024Proc.pdf#page=100
@Inproceedings{Lotz.Bunse.2024a,
author={Lotz, Tobias and Bunse, Mirko},
title={Lamarr at LeQua2024: Regularized Soft-Max Likelihood Maximization},
booktitle={4th International Workshop on Learning to Quantify (LQ 2024)},
pages={93},
url={https://hal.science/hal-04942724v1/file/LQ2024Proc.pdf#page=100},
year={2024},
abstract={As members of the Lamarr Institute, we participated in the open LeQua2024 competition. The goal of this competition was to predict the prevalences of classes in unlabeled sets of data, given a labeled training set. Our submission builds on the regularized maximization of a likelihood function with constraints that are implemented through a soft-max operator. Ultimately, this method ranked in the top three across all four disciplines of LeQua2024; most notably, we achieved first place in discipline T4, a binary quantification task with covariate shift. In this paper, we detail our approach to the competition.}}