Best Managing BMPs

May 25, 2018
Overcoming the challenges of storm water BMP evaluation & implementation

About the author: Derek Berg is director of Stormwater Regulatory Management - East for Contech Engineered Solutions LLC. Berg can be reached at [email protected].

When it comes to storm water best management practices (BMPs), crafting and implementing sound policies to ensure they are properly evaluated, deployed and maintained remains an obstacle for many storm water programs. Storm water programs often lack the time and expertise to execute this task effectively. Assuming sound performance data is available, putting policies in place to govern how that information and the BMPs supported by it may be deployed locally is essential.

Consider the Options

Ensuring storm water BMPs are performing as intended is easier said than done. The cost and complexity of robust BMP monitoring efforts limit how much data we are able to collect on any given BMP. The problem is particularly pronounced for BMPs in the public domain, since the funding available to monitor them typically comes from limited research budgets at academic institutions or from increasingly scarce grant funding. For proprietary BMPs, also referred to as manufactured treatment devices (MTDs), monitoring costs are borne by the supplier in most cases, which gives regulators leverage to require monitoring from those seeking approval. However, considering that a single long-term field study conducted in accordance with a nationally recognized protocol like the Washington Department of Ecology's Technology Assessment Protocol-Ecology (TAPE) can easily exceed $250,000, manufacturers are rightfully motivated to seek acceptance of the data from a successful study by multiple regulatory jurisdictions. In other words, regardless of BMP type, there are limits to how much monitoring can realistically take place.

Older BMPs such as wet ponds and sand filters were predominantly studied decades ago using varying methods and protocols, some of which no longer pass muster, and then all of that data was compiled into a single composite performance assumption that permeates storm water manuals all over North America. A similar approach is being applied to newer green infrastructure practices, but currently there is a less substantial body of data available, so we still have much more to learn. Then there are the MTDs, which often are required to be monitored in accordance with a patchwork of laboratory and field protocols to satisfy major regulatory programs, making comparability a challenge and barriers to new innovation high.

Comparing two or more BMPs side by side and definitively determining which will work best for an application often is not feasible. However, we have learned a great deal about BMPs over the years, and we have the ability to make smarter decisions moving forward. No BMP, whether green, gray, old or new, should be granted blanket acceptance without robust performance data to support its performance assumptions. As an industry, we would be well served to agree on a minimum quantity, and quality, of data required before a BMP is granted widespread acceptance.

Resource Constraints

Often compounding the inherent challenges in vetting BMPs are the chronic resource constraints faced by state and local storm water programs. Even well-staffed programs with dedicated funding from storm water utilities or other means fall short when it comes to BMP evaluation. The problem certainly is not one-dimensional, but a major contributor is the fact that BMP evaluation often takes a disproportionate amount of program resources relative to its contribution to meeting the obligations required by the applicable permit. Permitting, plan review, site inspections, ordinance development, public engagement and reporting are prioritized when resources are slim. The issue quickly compounds when there is not sufficient in-house expertise in the issues specific to BMP evaluation. The end result often is some combination of extensive delays, poor execution or shelving the effort altogether.

It is not surprising that the handful of storm water programs that do actively evaluate BMPs are increasingly referenced by programs lacking the resources to implement their own. Two commonly referenced programs are the New Jersey Department of Environmental Protection's (NJDEP) MTD certification program, which is jointly implemented by NJDEP and the New Jersey Corporation for Advanced Technology, and the TAPE program.

Given their industry prominence, it is not a major surprise that these two programs also are slated to serve as the foundation of the Stormwater Testing and Evaluation for Products and Practices (STEPP) program that is being shepherded forward by the Water Environment Federation with support from a diverse stakeholder group. Recognizing the barriers to innovative BMP acceptance as well as BMP comparability, STEPP is intended to instill confidence in BMPs that complete the process in accordance with applicable criteria. Granting reciprocity to those BMPs that complete such a process is an invaluable solution for local programs with limited resources, but it does not negate the need to adopt sound policies to govern local implementation.

Policy & Implementation

Examination of BMP evaluation programs past and present reveals several themes that often lead to trouble when not addressed via sound policy and consistent implementation. First and foremost, put the BMP evaluation program in writing. While seemingly intuitive, many programs have implemented informal policies, and problems arise as implementation inevitably deviates from the original intent. Putting a simple written framework in place that identifies required data, submittal expectations, sizing requirements and approval categories, and that establishes a list of approved solutions, will make the process more palatable for all stakeholders. Once a written policy is in place, consistent implementation becomes the critical component. Making exceptions over time has been one of the biggest contributors to stakeholder frustration, and ultimately, to the outright collapse of local evaluation programs.

For those programs that do not already do so, it also is critical to identify a water quality design storm and provide a means to size both volumetric and flow-based BMPs. Programs that have identified a water quality storm depth often stop short of providing a method for determining a water quality flow rate. This step is essential to ensure consistent and proper sizing for flow-through BMPs that treat runoff in real time, as opposed to capturing and slowly treating the water quality volume. It also is essential that BMP approvals are tied to the applicable test results, so the maximum treatment rates supported by the data are not exceeded during the water quality event. Failing to link approved sizing to the submitted data commonly results in more aggressive sizing and underperformance relative to tested configurations. The simplest way to address this issue is to mandate that the design hydraulic loading rate from the tested BMP not be exceeded during the peak water quality flow rate. This assumes the tested BMP actually reached its design operating rate during testing; if only lower operating rates were evaluated, the approved loading rate should be capped accordingly.
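To make the sizing logic concrete, the check described above can be sketched in a few lines of code. This is an illustrative example only: the Rational Method (Q = C·i·A), the example water quality intensity, and the approved loading rate of 2 gpm per square foot are assumptions for demonstration, not values from any particular program or protocol.

```python
# Hypothetical sizing check for a flow-through BMP.
# Assumption: peak water quality flow is estimated with the Rational Method
# (Q = C * i * A), one common approach; programs may prescribe other methods.

def water_quality_flow_cfs(runoff_coeff: float, intensity_in_hr: float,
                           area_acres: float) -> float:
    """Peak water quality flow (cfs) via the Rational Method, Q = C * i * A."""
    return runoff_coeff * intensity_in_hr * area_acres

def min_treatment_area_sqft(peak_flow_cfs: float,
                            approved_loading_gpm_sqft: float) -> float:
    """Minimum treatment area so the tested hydraulic loading rate
    is not exceeded at the peak water quality flow."""
    peak_flow_gpm = peak_flow_cfs * 448.831  # 1 cfs = 448.831 gal per minute
    return peak_flow_gpm / approved_loading_gpm_sqft

# Example: 1-acre commercial site, C = 0.9, 0.2 in./hr water quality
# intensity, and a BMP approved at 2 gpm/sq ft from its tested configuration.
q_wq = water_quality_flow_cfs(0.9, 0.2, 1.0)     # 0.18 cfs
area = min_treatment_area_sqft(q_wq, 2.0)        # roughly 40 sq ft of media
```

Tying the approved loading rate (2 gpm/sq ft here) directly to the tested configuration is the safeguard the article describes: a designer cannot shrink the treatment area below what the performance data supports.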

To encourage innovation, all policies should create a pathway to acceptance that puts all BMPs with sufficient performance data on equal regulatory footing. Establishing prescriptive standards that favor one type of BMP over another ultimately will stifle innovation and future investment in storm water solutions. Prescriptive standards also can lead to sprawl if there are not sufficient BMPs to overcome the numerous site constraints common in urban areas. Establishing strong performance-based standards, and a pathway to acceptance for all BMPs able to meet them, ensures maximum design flexibility, encourages ongoing innovation, and most importantly, protects water quality.

Last but not least, we cannot overlook the fact that we have a BMP maintenance problem. More specifically, we are installing thousands of BMPs each year, all of which need regular maintenance, but we are failing to maintain the vast majority of them. The functionality and water quality benefit of neglected BMPs is drastically diminished, so we have to do better to have any chance at meeting water quality goals. The answers are not simple, particularly when it comes to funding, but we need to put stronger maintenance policies in place. Ensuring we know where our BMPs are and who is responsible for them, planning for inevitable maintenance costs, and having a means to enforce policies when expectations are not being met are all steps in the right direction.
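The first two maintenance steps named above, knowing where BMPs are and who is responsible for them, amount to keeping a simple inventory and flagging overdue inspections. The sketch below is a hypothetical illustration; the record fields and the 12-month inspection interval are assumptions for demonstration, not requirements from any program.

```python
# Hypothetical BMP maintenance inventory that flags overdue inspections.
from dataclasses import dataclass, field
from datetime import date, timedelta

@dataclass
class BMPRecord:
    bmp_id: str                # program-assigned identifier
    location: str              # where the BMP is installed
    responsible_party: str     # who is obligated to maintain it
    last_inspection: date
    # Assumed 12-month inspection cycle; actual intervals vary by BMP type.
    inspection_interval: timedelta = field(default=timedelta(days=365))

    def is_overdue(self, today: date) -> bool:
        return today - self.last_inspection > self.inspection_interval

inventory = [
    BMPRecord("WP-001", "Oak St. outfall", "City DPW", date(2016, 4, 1)),
    BMPRecord("MTD-014", "Elm Plaza", "Property owner", date(2018, 1, 15)),
]

overdue = [b.bmp_id for b in inventory if b.is_overdue(date(2018, 5, 25))]
# overdue -> ["WP-001"]
```

Even a minimal registry like this gives a program the enforcement hook the article calls for: a documented record of who was responsible and when maintenance lapsed.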

There is no question that BMP evaluation is often complicated, but it is an essential task if we are to advance the science and ensure we are meeting water quality goals.
