EPSC Goes Under the Microscope: An In-depth Approach to Program Audits

Sept. 1, 2002
Sarah Lynn Cunningham knew that upper management at Louisville and Jefferson County’s Metropolitan Sewer District (MSD) in Kentucky couldn’t solve the big problems with its Erosion Prevention and Sediment Control (EPSC) program without a good definition of what the problems were. Cunningham also knew state law required MSD to implement a local ordinance to get nonpoint-source pollution from construction sediment under control. But by 1998, after several years of talking about Louisville’s EPSC problems and the proper language for an ordinance, MSD had made little progress on either front.

So Cunningham took action. “We wanted to make sure we were doing what we could to reduce the number-one impact to area streams, and that number-one problem was sediment,” says Cunningham, Coalition for Environmentally Responsible Economies principal engineer and assistant to MSD’s executive director. “We knew a detailed, baseline audit of our community’s EPSC performance, with lots of color photos and numbers, would lay out the problems and show that we were not measuring up.”

Cunningham had conducted in-depth audits before, but never for sediment and erosion control. “To be fair, you can’t just evaluate the performance of inspectors,” says Cunningham. “You must evaluate the performance of the contractors and the design engineers. Once we got into it, we found we also had to audit the plan reviewers.”

The in-depth audit, designed and conducted by Woolpert LLP in conjunction with MSD, evaluated the performance of design engineers, contractors, and inspectors, whether working for MSD or private developers, using site plans, overall site data, and interviews. “The average grade was a D, which was very bad,” reports Cunningham. The report, which identified 33 systemic deficiencies, included a number of bar charts that showed how each Area Team (MSD staff charged with enforcing approved EPSC plans at job sites) was performing.
“But for our Executive Team, it was the color photos that gave meaning to the data,” she notes. “Once the team saw these photos, they said, ‘Whoa! That shouldn’t be happening!’ The in-depth audit really opened eyes. It was one heck of a reality check.”

In-Depth Audits Can Be Revealing

This silt fence has not provided adequate erosion protection at this Louisville construction site.

Opening eyes, scoring results, documenting trends and deficiencies, and establishing a baseline for future improvements are what an in-depth EPSC program audit is all about. “We wanted to evaluate the effectiveness of what we were currently doing, and what we had to target for improvement, to get the most impact from our training and resources at the least cost,” explains Jason Gillespie, programs administrator for the Soil and Water Conservation District in Greenville County, SC, which completed an in-depth EPSC program audit in 2001.

The South Carolina Department of Transportation (SCDOT) recently wrapped up its first in-depth audit, assessing the effectiveness of its sediment reduction program on road construction sites. Ray Vaughan, hydraulic design manager in the Hydraulic Engineering Department at SCDOT, says the department decided to conduct the audit after receiving penalties against a couple of its roadway projects. “Penalties are expensive, as much as $10,000 a day or more, and we need to avoid any potential future violation,” Vaughan says.
“We want to ensure that we are focusing on the appropriate areas, such as water crossings and streams, and that avoiding water-body impacts is our top priority.”

Woolpert’s audits of these three EPSC programs revealed the following typical findings:

- Erosion controls on design plans are sometimes not installed.
- Inspectors might be adding or subtracting controls at will.
- Engineers might be “cutting and pasting” controls from one plan to the next.
- Standard details crucial for proper installation of standard and nonstandard best management practices (BMPs) are sometimes omitted from design plans.
- Erosion control is sometimes treated as an after-the-fact item rather than being designed in.
- Developers, contractors, and inspectors often pay more attention to control measures in urban areas, where visibility and the likelihood of complaints are greater, than in less populated rural areas, where water-body impacts from construction activities might go unnoticed.
- Old technologies and methods for sediment and erosion control are often not as effective as newer ones.
- Contractors and inspectors might not be aware of, or trained in, recently passed state and local ordinances requiring stricter EPSC controls and fines for noncompliance.
- Because inspectors are typically generalists, responsible for numerous types of inspections in addition to erosion prevention and sediment control, they are often overwhelmed with too much paperwork and too many sites to inspect within short timeframes.

Two straw bales are the only erosion protection and sediment control on this residential project in Louisville.

Before now, there was little pressure from regulators to enforce state and local EPSC measures. But the environment is changing. In the eyes of the Environmental Protection Agency (EPA) or any other state or local regulating body, it takes only a single water-body impact to deem an EPSC program on active construction sites an absolute failure. That’s because sediment is the number-one pollutant in our nation’s rivers and streams, contributing to 38% of all reported water-quality problems in impaired rivers and streams. Not far behind are nutrients and bacteria, which attach themselves to sediment particles and are transported throughout the waters. Contaminated sediments can kill or harm the entire food chain, from bottom-dwelling organisms to fish and shellfish to waterfowl to freshwater and marine mammals. Cancers and neurological defects have been found in humans who eat contaminated fish.
The amount and rapid pace of development today mean communities and organizations must pay closer attention to their EPSC programs and continuously monitor performance. With requirements for erosion and sediment control more stringent than ever, EPA and state and local regulatory bodies are issuing steep fines for noncompliance with state and local ordinances. Teams not up to par will be forced to make changes, or pay a high price. For a community or organization that wants to isolate problems with its EPSC program and determine where changes must be made, conducting an in-depth audit is the place to begin.

Audits Establish a Baseline for Improvement
The Louisville auditing team reviews design plans on-site with the contractor and inspector.

In the past, some organizations have reviewed the performance of their EPSC programs by examining weekly inspectors’ reports documenting BMPs that are working, need maintenance, or fail regularly; BMPs that contractors are or are not installing; and BMPs with maintenance problems. However, this approach assumes inspectors are doing a thorough job and doesn’t account for possible bias based on project type, location, or the contractor/inspector relationship. What’s more, it might be difficult to identify and compare trends because different inspectors often use different reporting styles.

Another method for reviewing EPSC program performance involves assigning a task force to assess several representative sites and determine whether problems exist. These paper-and-pencil reviews can highlight deficiencies, but it can be cumbersome to compare and contrast results effectively across the sites reviewed.

While these two approaches might produce such results as “Project A has some water-body impact that needs to be addressed,” neither approach can uncover the root cause of the problems reported, which is crucial for finding long-term solutions that minimize future impacts. In-depth audits, performed by an objective third party, have several benefits:

- They evaluate all three components of an EPSC program to determine its overall effectiveness: the performance of the people involved (e.g., design engineers, plan reviewers, contractors, and inspectors); the design plans (to determine whether they meet requirements in state or local ordinances); and the field/site conditions (to determine how BMPs are being applied, installed, and maintained).
- They pinpoint program deficiencies that might not be evident immediately, or that inspectors might overlook. For example: Are improperly designed plans making the EPSC program ineffective? Are controls appropriate but installed incorrectly or in the wrong places? Are controls installed properly but maintained rarely? Do water-body impacts exist downstream? Are certain contractors chronically noncompliant?
- They are unbiased. Questions are predetermined, scored, and weighted to allow the severity and extent of problems to be uncovered easily. Thus, the performance of individuals, teams, designs, and sites can be compared and contrasted.
- They demonstrate whether a specific control is effective or ineffective based on field observations. For example: Are controls designed to work well in hilly areas being applied ineffectively in flat, coastal areas?
- They provide recommendations for specific improvements (e.g., administrative and regulatory changes). For example, if the audit showed that disturbed areas were consistently underreported, a revised ordinance could require more timely and accurate reporting. If the audit showed chronic problems with certain contractors, additional training might be recommended.

Because an in-depth audit can help design engineers, plan reviewers, contractors, and inspectors do their jobs better, it can ultimately improve water quality in the region.

A General Process for In-Depth Audits

This deep-cut channel would not exist if EPSC BMPs had been installed.

Because every EPSC program is unique, the process for conducting an in-depth audit varies based on the community or organization. Following is a general process to consider when planning an in-depth audit.

Step 1: Determine Who and What to Audit

Organizations must decide whether to compare and contrast plan and field data based on district, region, watershed, or some other criteria. What types of sites (residential, commercial, industrial, or institutional) will be audited? Will design engineers, plan reviewers, contractors, and inspectors be audited individually?

MSD, for example, audited its five Area Teams (design engineers, contractors, and inspectors) by their assigned watersheds. SCDOT evaluated new road, road expansion, and ramp construction projects by comparing original design plans against site implementations. According to Vaughan, the plan reviews were crucial to the audit. “Are contractors deviating from the plans? Are designed controls being ignored? Are the contractors going by our standards, or coming up with their own? Those are the things we wanted to know,” Vaughan says.

Step 2: Select Project Sites

The auditing team should establish and follow specific criteria for site selection. Sites should be representative of the project area and might or might not be selected at random. One approach bases the selection on the percentage of residential, commercial, industrial, and institutional construction activities underway. For example, if 80% of an area’s current construction is residential, then perhaps 80% of the sites audited should be residential. If individuals are being audited, a good rule of thumb is three sites per person (i.e., three sites per design engineer, plan reviewer, contractor, and inspector).

Gillespie, for example, targeted 35 single-family, multifamily, and commercial developments that he believed might be troublesome for some reason. “We often looked at rural, out-of-the-way sites that weren’t as visible as urban sites,” he says. “Because our development was predominantly residential, we looked at more residential sites than commercial.”

Step 3: Identify Pertinent Questions for Objectively Evaluating Plans, Field Conditions, and BMPs
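The proportional site-selection rule described above can be sketched in a few lines of code. This is only an illustration of the arithmetic; the function name, category shares, and remainder-handling rule are assumptions made here, not part of any audit tool mentioned in the article.

```python
# Hypothetical sketch: allocate a fixed audit budget of sites in
# proportion to the mix of construction activity underway.

def allocate_audit_sites(construction_mix, total_sites):
    """Split a site budget proportionally across land-use types,
    giving any rounding remainder to the largest category."""
    quotas = {use: int(total_sites * share)
              for use, share in construction_mix.items()}
    remainder = total_sites - sum(quotas.values())
    largest = max(construction_mix, key=construction_mix.get)
    quotas[largest] += remainder
    return quotas

# Example: an 80%-residential construction mix and a 35-site budget
# (mirroring the numbers in the article; the split itself is invented).
mix = {"residential": 0.80, "commercial": 0.12,
       "industrial": 0.05, "institutional": 0.03}
print(allocate_audit_sites(mix, 35))
```

With these assumed shares, roughly 80% of the 35 audited sites end up residential, matching the rule of thumb in the text.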
An auditor evaluates a well-vegetated drainage channel with a properly installed rock ditch check.

The auditing team should clearly identify what data must be collected for the plan review, field review, and review of BMPs. To ensure objectivity, as many yes/no questions as possible should be designed; a manageable number of questions will make isolating trends easier.

During recent audits, Woolpert used proprietary field data collection and scoring software that can be customized from audit to audit based on the questions designed. During the Greenville County audit, for example, the auditing team measured, counted, and photographed each BMP, which was then evaluated based on use, location, installation, and compliance. The team also surveyed onsite and offsite areas for erosion and sediment impacts and answered related questions.

Step 4: Conduct the Plan Review

The auditing team begins by examining copies of the plans for the sites being audited; team members log answers to questions in the customized data collection software. Greenville County asked these yes/no questions to learn whether its drawings included the following information:

- North arrow?
- Scale and scale bar?
- Location map?
- Existing contour lines?
- Proposed contour lines?
- Contour intervals?

Greenville County’s auditing team also devised these yes/no questions to evaluate EPSC practices on the plans:

- Construction sequence denoted on plans?
- Erosion prevention denoted on plans?
- Specific BMP details on plans?
- Maintenance clause on plans?
- EPSC practices located in proper places for protection?
- EPSC practices sufficiently highlighted and documented for proper contractor implementation?
- Plans provide adequate water-quality protection?

Auditors typically review plans for two or three sites per day. Drawings are put under the magnifying glass, calculations are rerun, and structures and controls are reviewed carefully so questions can be answered objectively.
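As an illustration of how a yes/no checklist like Greenville County’s can be captured and rolled into a comparable percentage per site, here is a minimal sketch. The question wording comes from the article; the data structures, function name, and simple percent-of-yes scoring are invented for illustration and are not Woolpert’s E/SCORE software.

```python
# Hypothetical sketch: record yes/no plan-review answers and compute a
# simple percentage so plans from different sites can be compared.

PLAN_QUESTIONS = [
    "North arrow?",
    "Scale and scale bar?",
    "Location map?",
    "Existing contour lines?",
    "Proposed contour lines?",
    "Contour intervals?",
    "Construction sequence denoted on plans?",
    "Erosion prevention denoted on plans?",
    "Specific BMP details on plans?",
    "Maintenance clause on plans?",
    "EPSC practices located in proper places for protection?",
    "EPSC practices sufficiently highlighted and documented for proper contractor implementation?",
    "Plans provide adequate water-quality protection?",
]

def plan_score(answers):
    """Percent of checklist questions answered 'yes' for one set of plans."""
    yes = sum(1 for q in PLAN_QUESTIONS if answers.get(q))
    return round(100 * yes / len(PLAN_QUESTIONS), 1)

# Example: a plan missing its maintenance clause and BMP details.
answers = {q: True for q in PLAN_QUESTIONS}
answers["Maintenance clause on plans?"] = False
answers["Specific BMP details on plans?"] = False
print(plan_score(answers))
```

Because every site answers the same predetermined questions, scores like this can be compared across sites, teams, or watersheds, which is the point the article makes about unbiased audits.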
Step 5: Score Plan Review Results

The scoring system should be simple and objective. Woolpert developed a proprietary scoring program called E/SCORE (Erosion and Sediment COntrol Response Evaluation), which reads the data collected, generates scores for questions, and determines the overall Plan General Score.

Step 6: Prepare for the Field Review

Auditors obtain an electronic base map, such as a planimetric, parcel, or geographic information system map, for the sites being audited. BMPs and standard details from the plans are input, and all mapping data are downloaded to a rugged pentop computer for field use. The auditing team will use these data to determine whether BMPs on the plans were installed in the field and whether installation was performed as specified in the standard details.

Step 7: Conduct the Field Review

As with the plan review, the field review is a time for data collection, not interpretation or analysis. The team begins by walking the site to get an overview of site features, including water bodies and construction activities underway. Auditors log answers to questions in the customized data collection software. The electronic map on the pentop PC is used to locate onsite BMPs; auditors typically take a digital photograph and answer approximately 10 to 15 questions per BMP, recording the answers on the pentop PC. Auditors, who pop manhole lids to determine whether pipe networks contain sediment, also take digital photographs of conditions on-site and downstream. To ensure accuracy, auditors should record all data in the data collection software before leaving the site. Typically auditors conduct two or three field reviews daily; the more BMPs installed, the more time a field review requires.

SCDOT’s auditing team devised these yes/no questions for the field review:

- Water-body impact?
- Stabilized drainage?
- Channel erosion?
- Roadway impact?
- Adjacent property impact?
- Dust impact?
- Stabilized inactive areas?
- Stabilized inactive slopes?
- Stabilized stockpiles?
- Sediment containment?
- Shoulder grading?
- EPSC practices located in proper places for protection?
- EPSC practices provide adequate water-quality protection?
- No-longer-needed controls removed properly?

Other field review questions include:

- Extent of property impact?
- Type of site erosion?
- Erosion location?
Step 8: Score Field Review Results

Woolpert also uses E/SCORE to score field review results and determine the overall Field General Score. Some questions can be weighted based on relevancy. Questions related to water-body impacts, for example, can be weighted based on importance:

- No impact (2 points)
- Minimal impact (1 point)
- Needs attention (0 points)
- Severe impact (-1 point)
- Catastrophic impact (-2 points)

BMPs are scored after the plan and field reviews are completed. The BMP Score is the average score for a particular BMP, such as silt fences or sediment traps, on a site. The Overall Site BMP Score is the average score for all BMPs audited at a site. The Overall Individual BMP Score is the average score for a particular BMP taken from the BMP Scores for every site audited.

Step 9: Analyze Results

In typical audits, each site receives a Plan General Score and a Field General Score, and each BMP receives a BMP Score. Scores can be aggregated in additional ways based on project needs. Organizations conducting a baseline audit should typically expect an average Field General Score of approximately 50% because a single severe water-body impact will dramatically lower the score. The auditing team should analyze Field General, Plan General, BMP, and Overall Scores so conclusions, trends, and systemic deficiencies can be documented and recommendations made. The team can produce various charts summarizing, comparing, and contrasting the data collected, scores, and observations.

Gillespie was surprised by some of Greenville County’s audit results. “Sometimes a bad set of plans produced a good implementation,” he says. “Sometimes the contractor covered all the bases with controls, which resulted in sediment being maintained on-site and no offsite impacts.” Although Greenville County’s average Plan General Score was 76.5%, its average Field General Score was 56%.
This 20-point difference told Gillespie that contractors might not always understand the importance of properly installing and maintaining a designed BMP. Gillespie also learned that contractors find it easier to install and maintain certain types of BMPs and that engineers might need to consider designing with these controls more frequently. The county’s audit concluded that silt fences are overused, sometimes effectively, sometimes not, and that contractors needed additional training in the use and installation of certain BMPs.

Vaughan says he was surprised to learn that SCDOT had the lowest scores at water crossings. “Highway construction is linear by nature and crosses many drainage patterns in this area. Erosion is going to occur. There seems to be a lack of relaying the importance of those areas to field personnel and contractors.” SCDOT’s audit showed that contractors sometimes install different, often unacceptable controls instead of those on the design plans. Vaughan learned that some controls used routinely were not performing well and should be replaced with better controls or newer products that reduce the possibility of water-body impacts. The audit also revealed that SCDOT’s plans were often too general. “Contractors sometimes can’t tell from the plans what we want to have done,” he notes. “A generalized note such as ‘use silt fence as directed by engineer’ might not be enough to get the job done.”

Cunningham says MSD’s audit showed that plan reviewers needed more training in EPSC design because too often they had approved plans that clearly weren’t acceptable. The audit also revealed that engineers sometimes designed EPSC plans without ever visiting the site. “How can you design a plan that’s going to work if you have never even been to the site?” she asks. “We were seeing cookie-cutter plans for sediment and erosion control.”
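The Step 8 point scale and averaging rules lend themselves to a short sketch. The point values (+2 down to -2) come from the article; the 0-100% normalization, function names, and sample numbers are illustrative assumptions made here, not E/SCORE’s actual formulas.

```python
# Hypothetical sketch of weighted field scoring: water-body-impact answers
# map to points from +2 (no impact) to -2 (catastrophic), and BMP scores
# are averaged per site, as described in Step 8.

IMPACT_POINTS = {
    "no impact": 2,
    "minimal impact": 1,
    "needs attention": 0,
    "severe impact": -1,
    "catastrophic impact": -2,
}

def impact_score(observations):
    """Average the impact points, then scale from [-2, +2] to 0-100%
    (the normalization is an assumption for illustration)."""
    pts = [IMPACT_POINTS[o] for o in observations]
    avg = sum(pts) / len(pts)
    return round(100 * (avg + 2) / 4, 1)

def overall_site_bmp_score(bmp_scores):
    """Overall Site BMP Score: average of all BMP Scores at a site."""
    return round(sum(bmp_scores) / len(bmp_scores), 1)

# A single severe water-body impact drags an otherwise clean score down
# sharply, which is why baseline Field General Scores often hover near 50%.
print(impact_score(["no impact", "no impact", "severe impact"]))
print(overall_site_bmp_score([80.0, 55.0, 62.5]))
```

The steep weighting on water-body impacts is the mechanism behind the article’s observation that one severe impact can dramatically lower a site’s score.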
Geotextile blankets properly applied in the median of a highway construction project by SCDOT.

Next Steps

Steps must be taken to improve an EPSC program based on audit results, which can become the basis for a new or revised EPSC ordinance, a new or revised design manual, and beefed-up training programs. Vaughan says SCDOT is planning to create separate EPSC maps and work more closely with contractors on the interim controls needed throughout the construction process.

Gillespie says Greenville County plans to require engineers to make intermediate inspections. “We want the engineers to certify, several times throughout the construction process, that what is in the ground has been installed properly. For example, infrastructure won’t be constructed until the erosion control is in the ground and ready to go.”

New Ordinances

After MSD’s audit, Larry Pardue Jr., construction enforcement officer, began a coaching program that ran for about 18 months. Pardue and his enforcement team visited sites at random, reviewed erosion control plans, and critiqued the performance of design engineers, plan reviewers, contractors, and inspectors using a 12-point Coaching Sheet. Results were later forwarded to Area Team leaders, who reviewed these sheets to see how teams were progressing. “It was a way to get teams up to speed while our ordinance was being written,” Pardue says.

MSD’s ordinance took effect January 1, 2001, approximately two years after the audit was completed. Among other requirements, the ordinance mandates training and certification for developers and contractors involved in land-disturbing activities at construction sites. A plan-preparation course is highly recommended for engineers. “The classes help make sure everyone is aware of ordinance requirements,” Pardue says. “None of the ordinance requirements is new, but now we have the ability to deal with noncompliance. People realize there are real financial consequences for failing to comply with the approved erosion control plan.”

Within 16 months after the ordinance took effect, MSD had issued eight fines totaling approximately $8,100, says Pardue, who calls MSD’s ordinance requirements tough but fair. “Communication is the key to compliance,” he notes. “If it’s a complex problem, or if the contractor demonstrates there are attempts being made to fix the problem, we will work with the contractor toward compliance.” A contractor who ignores a Notice of Violation, however, can accumulate penalties until the problem is corrected.