
A Revision of the Traditional Analysis Method of Allometry to Allow Extension of the Normality-Borne Complexity of Error Structure: Examining the Adequacy of a Normal-Mixture Distribution-Driven Error Term


Huxley's model of simple allometry provides a parsimonious scheme for examining scaling relationships in scientific research, resource management, and species conservation endeavors. Factors including biological error, analysis method, sample size, and overall data quality can undermine the reliability of a fit of Huxley's model. Customary amendments increase the complexity of the power-function systematic term while retaining the usual normality-based error structure. The resulting protocols yield multiple-parameter complex allometry forms that can pose interpretative shortcomings and parameter-estimation difficulties and, even when empirically pertinent, may be prone to overfitting. A heavy-tailed Q-Q normal spread often goes undetected because the adequacy of a normally distributed error term is left unexplored. Previously, we promoted the advantages of keeping Huxley's model-driven systematic part while switching to a logistically distributed error term to improve fit quality. Here, we analyzed eelgrass leaf biomass and area data exhibiting marked size-related heterogeneity, perhaps reflecting a lack of systematization during data gathering. Overdispersion precluded adequacy of the logistically adapted protocol, suggesting that the data be processed through a median-absolute-deviation scheme aimed at removing aberrant replicates. Nevertheless, achieving regularity around Huxley's power-function trend required removing many replicates, calling into question the integrity of a data-cleaning approach. We nonetheless managed to adapt the complexity of the error term so as to reliably identify a Huxley's-model-like systematic part masked by variability in the data. Achieving this relied on an error term conforming to a normal mixture distribution, which successfully accommodated the overdispersion in the data.
Compared with normal-complex allometry and data-cleaning composites, the present arrangement delivered a coherent Q-Q normal-mixture spread and remarkable reproducibility strength of the derived proxies. By keeping the analysis within Huxley's original theory, the present approach enables substantiating nondestructive allometric proxies aimed at eelgrass conservation. The viewpoint endorsed here could also make data cleaning unnecessary.
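The abstract names two techniques without giving their details: a median-absolute-deviation (MAD) screen for removing aberrant replicates, and a fit of Huxley's model y = βx^α whose error term follows a normal mixture distribution. As a rough illustrative sketch only (not the authors' actual protocol: the two-component mixture, the EM loop, and all function names here are assumptions), the two ideas could be prototyped as:

```python
import numpy as np

def mad_filter(x, y, k=3.0):
    """Drop (x, y) replicates whose log-scale residual from an OLS
    power-law fit exceeds k robust (MAD-based) standard deviations."""
    lx, ly = np.log(x), np.log(y)
    slope, intercept = np.polyfit(lx, ly, 1)      # log y = intercept + slope*log x
    r = ly - (intercept + slope * lx)
    med = np.median(r)
    mad = np.median(np.abs(r - med))
    keep = np.abs(r - med) <= k * 1.4826 * mad    # 1.4826: normal-consistency factor
    return x[keep], y[keep]

def fit_huxley(x, y):
    """OLS fit of Huxley's model y = beta * x**alpha on the log scale.
    Returns (beta, alpha, log-scale residuals)."""
    lx, ly = np.log(x), np.log(y)
    alpha, logbeta = np.polyfit(lx, ly, 1)
    r = ly - (logbeta + alpha * lx)
    return np.exp(logbeta), alpha, r

def residual_normal_mixture(r, n_iter=300):
    """EM for a two-component normal mixture fitted to residuals r;
    returns component weights w, means mu, and standard deviations sd."""
    w = np.array([0.5, 0.5])
    mu = np.zeros(2)
    sd = np.array([0.5, 1.5]) * (np.std(r) + 1e-12)  # split components by scale
    for _ in range(n_iter):
        # E-step: responsibility of each component for each residual
        dens = w * np.exp(-0.5 * ((r[:, None] - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: update weights, means, and spreads
        nk = resp.sum(axis=0)
        w = nk / len(r)
        mu = (resp * r[:, None]).sum(axis=0) / nk
        sd = np.sqrt((resp * (r[:, None] - mu) ** 2).sum(axis=0) / nk)
        sd = np.maximum(sd, 1e-6)                 # guard against variance collapse
    return w, mu, sd
```

In this sketch the systematic part stays within Huxley's power function and only the residual distribution is enriched; comparing the residual quantiles against those of the fitted mixture would be the analogue of the Q-Q normal-mixture spread the abstract describes.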

Keywords: normal mixture; error term; analysis; allometry

Journal Title: BioMed Research International
Year Published: 2022



