The Many-to-Many Mapping Between the Concordance Correlation Coefficient and the Mean Square Error
(1902.05180)

Abstract
We derive the mapping between two of the most pervasive utility functions, the mean square error ($MSE$) and the concordance correlation coefficient (CCC, $\rho_c$). Despite its drawbacks, $MSE$ remains one of the most popular performance metrics (and loss functions), joined lately by $\rho_c$ in many sequence prediction challenges. Despite their ever-growing simultaneous usage, e.g., in inter-rater agreement and assay validation, a mapping between the two metrics has been missing to date. While minimisation of an $L_p$ norm of the errors, or of its positive powers (e.g., $MSE$), aims at $\rho_c$ maximisation, we explain the often-witnessed ineffectiveness of this popular loss function with graphical illustrations. The discovered formula uncovers not only the counterintuitive revelation that `$MSE_1 < MSE_2$' does not imply `$\rho_{c_1} > \rho_{c_2}$', but also provides the precise range of the $\rho_c$ metric for a given $MSE$. We discover the conditions for $\rho_c$ optimisation for a given $MSE$, and, as a logical next step, for a given set of errors. We generalise and discover the conditions for any given $L_p$ norm with even $p$. We present newly discovered, albeit apparent, mathematical paradoxes. The study inspires and anticipates a growing use of $\rho_c$-inspired loss functions, e.g., $\left|\frac{MSE}{\sigma_{XY}}\right|$, replacing the traditional $L_p$-norm loss functions in multivariate regressions.
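The abstract's central claim, that a lower $MSE$ does not imply a higher $\rho_c$, can be checked numerically. The sketch below uses the standard Lin (1989) definition of the CCC, $\rho_c = 2\sigma_{XY} / (\sigma_X^2 + \sigma_Y^2 + (\mu_X - \mu_Y)^2)$; the toy arrays are illustrative, not from the paper. A constant predictor attains a smaller $MSE$ than a shifted-but-perfectly-correlated predictor, yet has zero CCC:

```python
import numpy as np

def ccc(x, y):
    """Concordance correlation coefficient (Lin, 1989):
    rho_c = 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))**2)."""
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()            # population variances (ddof=0)
    sxy = ((x - mx) * (y - my)).mean()   # population covariance
    return 2 * sxy / (vx + vy + (mx - my) ** 2)

def mse(x, y):
    return np.mean((x - y) ** 2)

y_true = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
pred_a = np.full_like(y_true, y_true.mean())  # constant predictor
pred_b = y_true + 2.0                         # shifted, perfectly correlated

mse_a, mse_b = mse(pred_a, y_true), mse(pred_b, y_true)  # 2.0 vs 4.0
ccc_a, ccc_b = ccc(pred_a, y_true), ccc(pred_b, y_true)  # 0.0 vs 0.5

# Lower MSE, yet lower CCC: `MSE_1 < MSE_2' does not imply rho_c1 > rho_c2.
assert mse_a < mse_b and ccc_a < ccc_b
```

Note the constant predictor has $\sigma_{XY} = 0$, forcing $\rho_c = 0$ regardless of how small its $MSE$ is, which is one concrete instance of the ineffectiveness the abstract describes.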