Different Coefficients for Studying Dependence
Rainio Oona
SPRINGER
The permanent address of the publication is:
https://urn.fi/URN:NBN:fi-fe2022102463092
Abstract
Through computer simulations, we study several different measures of dependence, including Pearson's and Spearman's correlation coefficients, the maximal correlation, the distance correlation, a function of the mutual information known as the information coefficient of correlation, and the maximal information coefficient (MIC). We compare how well these coefficients fulfill the criteria of generality, power, and equitability, and we consider how the exact type of dependence, the amount of noise, and the number of observations affect their performance. According to our results, the maximal correlation is often the best choice among these measures of dependence because it recognizes both functional and non-functional types of dependence, fulfills a certain definition of equitability relatively well, and retains very high statistical power as the noise grows, provided there are enough observations. While Pearson's correlation does not detect symmetric non-monotonic dependence, it has the highest statistical power for recognizing linear and non-linear but monotonic dependence. The MIC is very sensitive to noise and therefore has the weakest statistical power.
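The claim that Pearson's and Spearman's coefficients detect monotonic dependence but miss symmetric non-monotonic dependence can be illustrated with a small simulation. The sketch below is not taken from the thesis; it is a minimal NumPy example with assumed noise levels and sample size, comparing the two coefficients on a monotonic relationship (y = x³ + noise) and a symmetric non-monotonic one (y = x² + noise).

```python
import numpy as np

rng = np.random.default_rng(0)


def pearson(x, y):
    """Pearson's correlation coefficient between two samples."""
    return np.corrcoef(x, y)[0, 1]


def spearman(x, y):
    """Spearman's rank correlation: Pearson applied to the ranks."""
    rx = np.argsort(np.argsort(x))
    ry = np.argsort(np.argsort(y))
    return pearson(rx, ry)


n = 2000                              # assumed number of observations
x = rng.uniform(-1.0, 1.0, n)
noise = rng.normal(0.0, 0.1, n)       # assumed noise level

# Monotonic dependence: both coefficients come out large in magnitude.
y_mono = x**3 + noise

# Symmetric non-monotonic dependence: both coefficients stay near zero,
# even though y is almost a deterministic function of x.
y_sym = x**2 + noise

print("monotonic:", pearson(x, y_mono), spearman(x, y_mono))
print("symmetric:", pearson(x, y_sym), spearman(x, y_sym))
```

In this setting both coefficients are large for the cubic relationship but close to zero for the quadratic one, which is why measures such as the maximal correlation or the distance correlation are needed for non-functional and non-monotonic dependence.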