-
Effective Regulation of Water Quality
Estimated reading time: 3 minutes
Current water quality regulations, based on the statutes they implement, fail to effectively describe the current state of water bodies. There are historical and political reasons for this condition, but no excuse to continue as we have for almost 60 years. Our understanding of aquatic ecosystems and the development of appropriate statistical models for environmental data provide the means to regulate water quality more effectively, to the benefit of natural ecosystems and human health.
-
Environmental Decision-Making in Uncertain Times
Estimated reading time: 4 minutes
Every business complying with environmental laws can be profitable and sustainable while operating in an environmentally responsible way. Environmental aspects may not seem as important as other aspects of your business, but it is within your control to avoid issues that cost time and money better spent elsewhere. You should act now to limit your risk of costly or damaging environmental issues such as permit compliance enforcement. Acting now is especially important because the future is uncertain and the present is constantly changing.
-
After 50 years, it is time to bring environmental policy and regulatory decision making into the 21st century by applying statistical paradigms that produce technically sound and legally defensible results from environmental data. When the Clean Water Act, Endangered Species Act, and National Environmental Policy Act were created, and federal agencies were directed to develop regulations to ensure compliance with them, biologists and ecologists knew less about environmental systems and data analysis than we do today.
-
The null hypothesis significance testing (NHST, or frequentist) analytical paradigm does not produce answers for environmental policy or regulatory decisions because rejecting the null hypothesis (of no difference between data sets) says nothing about why or by how much they differ. The likelihood (information-theoretic) paradigm overcomes many of NHST's problems and can be applied to environmental data when its limitations are understood.
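The effect-size point can be illustrated with a minimal Python sketch using hypothetical summary statistics (not real monitoring data): with enough observations, a practically negligible difference between two data sets produces a vanishingly small p-value, while the standardized effect size remains tiny.

```python
from math import sqrt
from statistics import NormalDist

# Hypothetical summary statistics: two groups of water-quality
# measurements whose means differ by a negligible 0.1 units.
n = 10_000                     # observations per group
mean_a, mean_b = 100.0, 100.1  # group means
sd = 1.0                       # common standard deviation

# Two-sample z statistic computed from the summary statistics
se = sqrt(sd**2 / n + sd**2 / n)
z = (mean_b - mean_a) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))

# Standardized effect size (Cohen's d): tiny despite the tiny p-value
cohens_d = (mean_b - mean_a) / sd

print(f"z = {z:.2f}, p = {p_value:.2e}, d = {cohens_d:.2f}")
```

The null hypothesis is decisively rejected, yet the difference of 0.1 standard deviations is negligible in practical terms, which is exactly why a rejection alone cannot support a regulatory decision.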
-
The frequentist and likelihood frameworks for analyzing environmental data assume that there is a “true” state of the world represented by the values described by a single hypothesis and its probability distribution. The Bayesian framework assumes that observations are the “truth” while the hypotheses explaining the observations have probability distributions. The Bayesian approach solves many conceptual problems of applying the frequentist approach to environmental data because Bayesian results depend on observations (or measurements) rather than on a range of hypothetical outcomes.
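A minimal sketch of the Bayesian idea, using a conjugate beta-binomial model and invented monitoring counts: the hypothesis (here, the rate at which samples exceed a limit) carries the probability distribution, and that distribution is updated directly from the observed counts.

```python
# Hypothetical monitoring record: of 20 samples, 3 exceeded a limit.
exceedances, samples = 3, 20

# Uniform Beta(1, 1) prior over the unknown exceedance probability
a_prior, b_prior = 1.0, 1.0

# The conjugate posterior is Beta(a + exceedances, b + non-exceedances):
# the hypothesis has a distribution, conditioned on the observed data.
a_post = a_prior + exceedances
b_post = b_prior + (samples - exceedances)

# Posterior mean of the exceedance rate: (3 + 1) / (20 + 2)
posterior_mean = a_post / (a_post + b_post)
print(f"posterior mean exceedance rate = {posterior_mean:.3f}")
```

Each new round of samples would update the posterior in the same way, so the estimate always reflects the accumulated observations rather than a hypothetical sampling distribution.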
-
The three previous parts of this series described statistical frameworks for objectively analyzing environmental data and explained where each is appropriate. Correct statistical models applied to environmental concerns are powerful tools for regulators, permit holders, attorneys, and consultants. Their results are more technically sound and legally defensible than those of commonly used methods, and appropriate statistical analyses can demonstrate compliance with statutory goals and objectives.
-
The original form of this article was submitted on March 31, 1995, as the Direct Service Industries' comments on draft regulations proposed by the U.S. Fish and Wildlife Service (FWS) and the National Marine Fisheries Service (NMFS, now NOAA Fisheries). The two agencies wanted to define "distinct population segments" under the Endangered Species Act (ESA) so that they would have a consistent definition for their regulatory decisions. Unfortunately, administrative convenience and political accommodation replaced science in the definition.
-
Predicting concentrations of chemicals in surface waters is a major component of permitting decisions, from NEPA impact assessments and NPDES point source discharge permits to mine closure and Superfund liability bond releases. Decision delays are costly for operators, and regulators are too often sued by those claiming that decisions were based on inadequate data. The usual approaches to forecasting chemical concentrations are to build complex numeric ecosystem models or to predict concentrations of single chemicals rather than the entire set of chemicals of interest.
-
The Resource Conservation and Recovery Act (RCRA), as implemented by EPA and state regulations, requires monitoring of ground water chemistry and statistical analyses of those data. The latest revision of the EPA's statistical guidance document is 887 pages long (plus supplements) and has been augmented by a webinar because the statistical analyses are neither simple nor easily understood by readers who are not statisticians or data analysts. Commercial software is sold to perform these analyses, but like all statistical software it cannot ensure that the user understands how to select appropriate models or how to interpret the results.
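One statistic that appears throughout ground water monitoring programs is an upper prediction limit computed from background well data; a new downgradient measurement above the limit flags a potential release. The sketch below uses invented measurements and a normal-quantile approximation in place of the exact Student's t limit, so it is an illustration of the idea rather than a compliance-grade calculation.

```python
from math import sqrt
from statistics import NormalDist, mean, stdev

# Hypothetical background measurements (mg/L) from an upgradient well;
# these values are illustrative, not real monitoring data.
background = [12.1, 11.8, 12.5, 12.0, 11.6, 12.3, 12.2, 11.9]

n = len(background)
xbar = mean(background)
s = stdev(background)

# 95% upper prediction limit for the next single measurement.
# A normal quantile is used here for simplicity; an exact limit
# would use a Student's t quantile with n - 1 degrees of freedom.
z = NormalDist().inv_cdf(0.95)
upl = xbar + z * s * sqrt(1 + 1 / n)

print(f"background mean = {xbar:.2f} mg/L, 95% UPL = {upl:.2f} mg/L")
```

The sqrt(1 + 1/n) factor widens the limit to account for both the variability of a single future observation and the uncertainty in the background mean, which is why a prediction limit is wider than a confidence limit on the mean.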
-
Introduction to Wetlands
Estimated reading time: 1 minute
Wetlands are difficult for non-specialists, and for many specialists, to understand. The general public's perception of what a wetland is, the definitions used by wetland scientists, and the definitions regulators apply to jurisdictional wetlands all differ. This post introduces wetland definitions. Wetland water quality is regulated under Section 404 of the Clean Water Act, yet the definition of a wetland used by wetland scientists may not be the same as that used by the wetland regulator.