
Released

Journal Article

Seeing through noise in power laws

MPS-Authors
Newberry, Mitchell
Department of Human Behavior, Ecology and Culture, Max Planck Institute for Evolutionary Anthropology, Max Planck Society

Fulltext (public)

Lin_Seeing_Interface_2023.pdf
(Publisher version), 890KB

Citation

Lin, Q., & Newberry, M. (2023). Seeing through noise in power laws. Journal of The Royal Society Interface, 20(205): 20230310. doi:10.1098/rsif.2023.0310.


Cite as: https://hdl.handle.net/21.11116/0000-000D-AA79-C
Abstract
Despite widespread claims of power laws across the natural and social sciences, evidence in data is often equivocal. Modern data and statistical methods reject even classic power laws such as Pareto's law of wealth and the Gutenberg-Richter law for earthquake magnitudes. We show that the maximum-likelihood estimators and Kolmogorov-Smirnov (K-S) statistics in widespread use are unexpectedly sensitive to ubiquitous errors in data such as measurement noise, quantization noise, heaping and censorship of small values. This sensitivity causes spurious rejection of power laws and biases parameter estimates even in arbitrarily large samples, which explains inconsistencies between theory and data. We show that logarithmic binning by powers of λ > 1 attenuates these errors in a manner analogous to noise averaging in normal statistics and that λ thereby tunes a trade-off between accuracy and precision in estimation. Binning also removes potentially misleading within-scale information while preserving information about the shape of a distribution over powers of λ, and we show that some amount of binning can improve sensitivity and specificity of K-S tests without any cost, while more extreme binning tunes a trade-off between sensitivity and specificity. We therefore advocate logarithmic binning as a simple essential step in power-law inference.
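The abstract's central claim can be sketched numerically. In the hypothetical example below (not the authors' implementation; the exponent α = 2.5, bin base λ = 10, sample size, and integer-rounding error model are all illustrative assumptions standing in for the quantization and heaping errors the abstract describes), the continuous maximum-likelihood (Hill) estimator is biased by quantization even in a large sample, while a fit to logarithmically binned counts stays closer to the true exponent:

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, xmin, lam, n = 2.5, 1.0, 10.0, 100_000  # lam: logarithmic bin base (assumed values)

# Pareto(alpha) samples via inverse transform, then rounding to integers
# (a stand-in for the quantization/heaping errors discussed in the abstract)
u = 1 - rng.random(n)                      # uniform on (0, 1]
x = xmin * u ** (-1 / (alpha - 1))
x_q = np.round(x)

# Continuous maximum-likelihood (Hill) estimator applied to the quantized data
alpha_mle = 1 + n / np.log(x_q / xmin).sum()

# Log-bin by powers of lam: for an ideal power law, the count in bin k is
# geometric with ratio r = lam**-(alpha - 1), so the geometric MLE recovers alpha
k = np.floor(np.log(x_q / xmin) / np.log(lam))
r_hat = k.mean() / (1 + k.mean())
alpha_binned = 1 - np.log(r_hat) / np.log(lam)
```

The binned fit relies only on which power-of-λ bin each observation falls in, so within-scale distortions such as rounding barely move it, which is one way to see the accuracy/precision trade-off in λ that the abstract describes: larger λ discards more within-scale detail (less precision) but is less sensitive to within-scale errors (more accuracy).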