Frequency Analysis


MetStat and our strategic partner, MGS Engineering Consultants, are uniquely qualified and experienced in computing extremely rare precipitation frequency estimates for use in hydrologic hazard studies, including Risk Informed Decision Making (RIDM), a method of dam safety evaluation that uses the likelihood of loading, dam fragility, and consequences of failure to estimate risk. We also specialize in Local Intense Precipitation (LIP) studies that provide 1-hour/1-square-mile precipitation-frequency relationships for nuclear power plant sites.

L-moment regional frequency analysis methods, similar to those applied in NOAA Atlas 14, are used in producing site-specific and watershed PF estimates. In fact, several team members were key developers of NOAA Atlas 14. The unparalleled expertise and knowledge of MetStat serve our clients well by providing the insight into processes and procedures needed for probabilistic precipitation analyses.

Regional PF analyses intended for extreme precipitation events must be conducted with a very high level of attention to detail in all aspects of the analysis. The current state of practice in regional PF analysis uses methodologies originally developed by Hosking and Wallis, as described in their classic text “Regional Frequency Analysis: An Approach Based on L-Moments” (1997). These methodologies provide the foundation for the regional analyses utilized in NOAA Atlas 14. MetStat builds upon these accepted regional analysis methods along with the findings of existing PF studies. Recent advances in regional analysis methods and in spatial mapping of precipitation are used to improve the process and decrease uncertainty.

Our expertise provides rich datasets of precipitation frequency estimates, with uncertainty bounds, for any watershed, storm type and duration out to 1 in 1,000,000 years, supporting the safe operation and design of high-impact infrastructure against the most extreme floods.

An example frequency plot of 3-day basin-average precipitation, including the upper and lower confidence limits. Our robust statistical processes allow estimation of 1-in-1-million-year events.
Isopluvial map of 24-hour precipitation with an average recurrence interval of 100 years in the state of Wyoming.

Regional Frequency Analysis Approach

We follow this standard-of-practice approach to regional frequency analysis:

  1. Assembly and quality-checking of annual maxima datasets, including delineating analyses for different storm types
  2. Analyses of the seasonality of storms for use in hydrologic modeling
  3. Delineation and verification of climatic regions that are homogeneous with respect to extreme precipitation
  4. Analysis of the spatial behavior of the station at-site means and regional L-moment ratios L-Cv and L-Skewness for use in spatial mapping of precipitation AEPs
  5. Identification of the regional probability distribution for computing precipitation AEPs
  6. Computation of Equivalent Independent Record Length (EIRL) for use in estimating the effective independent size of the regional dataset. This information is used in characterizing the uncertainty of L-moment ratios and distribution parameters and for computing uncertainty bounds for PF estimates
  7. Production of gridded datasets of precipitation for selected AEPs in the range from the median to 1E-9
  8. Uncertainty analyses and development of templates for the mean (best-estimate) frequency curve and uncertainty bounds at representative locations; the uncertainty templates are scalable and applicable to all locations in the study area
  9. Completion of various precipitation-related gridded datasets and other deliverables, such as temporal distribution templates, needed to support the client's products and applications. This may include a graphical user interface for accessing the results.


It seems every time an extreme rainstorm results in flooding, inquiring minds want to know the rarity of the rainfall in terms of its frequency and/or its likelihood of occurring again. In fact, to many, especially those outside the weather/climate community, the equivalent frequency of the storm is more meaningful than the actual amount of rain. For example, if I said “it rained 4.50 inches in one day in Denver,” you might not be as impressed as if I said “yesterday’s rainfall was a 100-year event in Denver.” In stark contrast, 4.50 inches in one day in Miami, Florida represents an amount that occurs, on average, every single year. Regardless of the frequency of the rainfall, it does not equate to the same frequency of the resulting flood. The degree of flooding is largely the result of the rainfall, but other important factors, such as soil type, antecedent moisture conditions, slope, vegetation, season, snowpack, and regulation of water flow, also influence the degree of flooding. In other words, under the right conditions, such as a burn scar where vegetation has been removed, a 5-year rainfall event can cause a 100-year flood. Understanding the complex terminology, background, and uses of frequency-based products is difficult. Here, we present a brief summary of precipitation-frequency analyses.

Terminology, Semantics & Definitions

A precipitation frequency estimate (PFE) is the depth of precipitation at a specific location for a specific duration that has a certain probability of occurring in any given year. For example, the PFE for a 24-hour precipitation amount in Denver, CO with a 1% chance of occurring in any given year is 4.50 inches. PFEs are usually expressed in terms of an Average Recurrence Interval (ARI) or Annual Exceedance Probability (AEP); AEPs are often translated into a percent chance of occurring in any given year to alleviate confusion. The term “return period” is equivalent to the ARI, but is not widely used since it gives the false sense of an event only occurring every x years, when really the ARI is the average number of years between exceedances of a given precipitation amount; implicit in this definition is that the years between exceedances are essentially random. For instance, in Denver, 4.50 inches of precipitation over a 24-hour period happens on average every 100 years. To emphasize the fact that 4.50 inches in 24 hours could fall several times in a year or over the course of a few years, the AEP is often used to describe the rareness. The AEP is the probability that 4.50 inches (or more) in 24 hours will fall during a given year; in our example, that equates to a probability of 0.01, or 1E-2 in scientific notation. The AEP = 1/ARI. For convenience, AEPs are often multiplied by 100 to convert them into a percent chance of occurring during a given year, so an AEP of 0.01 equates to a 1% chance. In other words, there is a 1% chance that an event will equal or exceed a certain PFE at a specific location and duration during a given year. And lastly, some like to translate AEPs/ARIs into odds, similar to winning the lottery; in this context, the odds of a 100-year event occurring in any given year are 1 in 100. Below is a table for converting between ARIs, AEPs and odds.

ARI (years)       AEP            AEP (scientific    Percent chance      Odds in any
                                 notation)          in any given year   given year
1                 1              1.00E+00           100%                1 in 1
2                 0.5            5.00E-01           50%                 1 in 2
5                 0.2            2.00E-01           20%                 1 in 5
10                0.1            1.00E-01           10%                 1 in 10
25                0.04           4.00E-02           4%                  1 in 25
50                0.02           2.00E-02           2%                  1 in 50
100               0.01           1.00E-02           1%                  1 in 100
200               0.005          5.00E-03           0.5%                1 in 200
500               0.002          2.00E-03           0.2%                1 in 500
1,000             0.001          1.00E-03           0.1%                1 in 1,000
10,000            0.0001         1.00E-04           0.01%               1 in 10,000
100,000           0.00001        1.00E-05           0.001%              1 in 100,000
1,000,000         0.000001       1.00E-06           0.0001%             1 in 1 million
10,000,000        0.0000001      1.00E-07           0.00001%            1 in 10 million
100,000,000       0.00000001     1.00E-08           0.000001%           1 in 100 million
1,000,000,000     0.000000001    1.00E-09           0.0000001%          1 in 1 billion
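The conversions in the table are simple arithmetic; a minimal Python sketch (the helper names are ours, invented for illustration) looks like:

```python
def aep_from_ari(ari_years: float) -> float:
    """Annual exceedance probability from average recurrence interval."""
    return 1.0 / ari_years

def percent_chance(aep: float) -> str:
    """AEP expressed as a percent chance in any given year."""
    return f"{aep * 100:g}%"

def odds(ari_years: float) -> str:
    """AEP expressed as odds, e.g. '1 in 100'."""
    return f"1 in {ari_years:,.0f}"

# Reproduce a few rows of the table above
for ari in (2, 100, 1_000_000):
    aep = aep_from_ari(ari)
    print(ari, aep, f"{aep:.2E}", percent_chance(aep), odds(ari))
```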

Calculating Precipitation Frequency Estimates

MetStat utilizes a standard-of-practice approach called Regional Frequency Analysis (RFA) for calculating PFEs by storm type out to extremely rare AEPs (1E-9). Yes, a 1-in-a-billion-year event! Before questioning this, read on. An RFA includes the following general steps.

  1. Assembly, quality-checking and sorting of precipitation datasets by storm type. This includes identifying the maximum precipitation value per year for each duration and each storm type; a time series of these annual maxima at a station is called the annual maxima series (AMS).
  2. Analyses of the seasonality of storms. Knowing when the different storm types typically occur is important for hydrologic modeling; for instance, it would not make physical sense to model the runoff from a local storm in the middle of the winter in Denver.
  3. Delineation and verification of climatic regions that have similar extreme precipitation storms and regimes. Regions of similar regimes allow grouping of precipitation stations and support the concept of trading space for time. In other words, instead of using a single station’s data, which might span only 50 years, we can group stations and their respective periods of record to represent a much larger sample. Statistically homogeneous sets of stations are identified within these regions to develop statistical and predictor relationships.
  4. Spatial mapping of the different statistical components used to compute the PFEs at every location (a.k.a. point) in the study area, using predictor equations and spatial mapping procedures in Geographic Information Systems (GIS). This allows computation of PFEs at all locations.
  5. Determination of the statistical function, known as the probability distribution, that describes the behavior of the extreme precipitation data for computing AEPs. For those statistically inclined, we use the four-parameter kappa distribution.
  6. Application of the statistical equations and spatial maps of the different statistical components to compute gridded datasets of precipitation for AEPs ranging up to 1E-9.
  7. Determination of the Equivalent Independent Record Length (EIRL), which conveys how many years of independent data are available for computing the PFEs. It is not unusual for the EIRL to be on the order of 5,000 years; imposing the common rule-of-thumb, one can confidently compute PFEs out to ARIs of 10,000 years based on an EIRL of 5,000 years.
  8. Uncertainty analyses, which provide users with a sense of how the average PFEs can vary with increasing rarity.
  9. Determination of the PFEs for a watershed versus a point, which is what we’ve been talking about thus far, is a critical last step before handing off the PFEs to a hydrologist. The process for translating point PFEs to watershed PFEs is highly complex and the subject of a future blog.
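For a flavor of the statistics underlying these steps, the sample L-moment statistics for a station's annual maxima series (mean, L-Cv and L-skewness) can be computed from probability-weighted moments, following Hosking and Wallis (1997). The minimal sketch below is illustrative only; a real study applies these regionally, with homogeneity testing and discordancy screening.

```python
def sample_l_moments(data):
    """Return (l1, L-Cv, L-skewness) of a sample via unbiased
    probability-weighted moments b0, b1, b2."""
    x = sorted(data)          # order statistics, ascending
    n = len(x)
    b0 = sum(x) / n
    b1 = sum(i * x[i] for i in range(n)) / (n * (n - 1))
    b2 = sum(i * (i - 1) * x[i] for i in range(n)) / (n * (n - 1) * (n - 2))
    l1 = b0                   # L-location (mean)
    l2 = 2 * b1 - b0          # L-scale
    l3 = 6 * b2 - 6 * b1 + b0 # third L-moment
    return l1, l2 / l1, l3 / l2

# Toy annual maxima series (inches) -- invented numbers
print(sample_l_moments([2.1, 3.4, 1.8, 5.6, 2.9, 4.2, 3.1]))
```

In a regional analysis, these ratios are averaged (record-length weighted) across the homogeneous region before fitting the regional distribution.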

Buried in the details of the RFA approach are robust statistical relationships that support the computation of PFEs out to 1 in 500,000. It is critically important to understand that these statistical analyses are not an expression of over-confidence in the statistical approach; rather, they enable comparisons with Probable Maximum Precipitation (PMP) and an assessment of the behavior of the frequency curves and uncertainty for dam safety. As economics and risk play a greater role in the design and operation of critical infrastructure, there is great interest in knowing how rare PMP is; our studies have found the AEPs associated with PMP range from 1E-3 to 1E-10, which represents a huge range and equates to significantly different hydrologic risks and economics. The RFA approach can adequately demonstrate the rarity of PMP, thereby providing regulators a more comfortable, objective sense of massive floods or the Probable Maximum Flood (PMF) at dams or nuclear power plants. If the PMP is questionably conservative (e.g., too high), an RFA provides a sound, objective and economically viable basis for selecting an acceptable degree of risk (i.e., an appropriate AEP) and for comparison to PMP for design and operation. The expense of conducting an RFA and subsequent hydrologic analysis is considerably less than the physical modification of a dam or nuclear power plant, which can soar into the billions of dollars.
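Once the four-parameter kappa distribution has been fitted for a location, a PFE at any AEP is simply the quantile at non-exceedance probability 1 - AEP. The sketch below uses SciPy's kappa4 distribution; every parameter value is invented for illustration and does not come from any real study.

```python
from scipy.stats import kappa4

# Invented kappa parameters for a hypothetical site:
# h and k are shape parameters; loc and scale are in inches.
h, k = 0.1, -0.05
loc, scale = 2.5, 0.8

# PFE = quantile at non-exceedance probability 1 - AEP
for aep in (1e-2, 1e-4, 1e-6):
    depth = kappa4.ppf(1.0 - aep, h, k, loc=loc, scale=scale)
    print(f"AEP {aep:.0E}: {depth:.2f} inches")
```

With a negative k the upper tail is unbounded, so depths keep growing (ever more slowly) as the AEP becomes rarer, which is the behavior examined when comparing a frequency curve to PMP.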

As a side note, AEPs of seismic activity (earthquakes) are often estimated in a manner similar to AEPs of precipitation via RFA, but on far less data. The point is, although AEPs of 1E-3 to 1E-10 for precipitation and earthquakes are hard to fathom, they are routinely calculated for evaluating the risk of critical infrastructure; this concept is collectively known as Risk Informed Decision Making.

The Ultimate Goal

Although the ARI/AEP of precipitation is fascinating and requires a much greater effort than most people realize or appreciate, the ultimate goal of most precipitation frequency studies is to support hydrologic studies striving to quantify the degree or risk of flooding at a location. The risk of flooding as a result of a dam over-topping is easily conveyed through a Hydrologic Hazard Curve (HHC), a plot that relates maximum reservoir level to AEP. An example HHC is shown in Figure 1.


Figure 1. Example of Hydrologic Hazard Curve for Maximum Reservoir Level.

What about Climate Change?

The impact of climate change on PFEs has been the subject of numerous research efforts, but a standard of practice has not yet been established to adjust PFEs accordingly (Cheng 2017, Kunkel 2013, Perica 2013, Buishand 2009). In fact, the World Meteorological Organization does not call for explicit inclusion of climate change effects in PMP or rare PFE estimation. Until a standard of practice is established, the effects of climate change can be considered using the uncertainty bounds of the PFEs; providing uncertainty bounds allows users and regulators to select the upper bound of the PFEs for purposes of regulation, design and/or operation to account for possible climate change. Another important consideration, substantiated by findings from some of our studies, is that other hydro-meteorological impacts of climate change may be more consequential than actual changes in the PFEs. For instance, one could imagine that climate change could cause mountain snowpacks to melt earlier in the season, before the typical time of year when extreme springtime storms occur, thereby lessening the flood potential under a warmer climate regime. The take-away is that climate change has several feedbacks that need to be considered when accounting for it in PFEs and subsequent hydrologic analyses.

Communication is Key

Effective communication of extreme precipitation, from a probabilistic perspective, is a challenge we face almost daily at MetStat and is largely the motivation for this blog. The most common communication breakdown is the difference between precipitation ARI and flood ARI; it is not unusual for a newspaper headline to read “100-year Flood Inundates City” when in reality the rainfall was a 100-year ARI, but the flood, depending on antecedent conditions, may have been a 250-year or a 25-year event. Another misconception is that 100-year storms appear to be happening far more frequently than the term “100-year” ARI would imply. The fact is there are hundreds of 100-year storms in the United States each year, but rarely at the same exact location. The other element of communicating extreme events is the size of the storm, or footprint of extreme precipitation. Combining the size and rarity of storms into a storm severity index has been attempted by several researchers, but an industry-wide standard has not been adopted by NOAA’s National Weather Service (Grisa 2014, Waters 2014).

Final Thoughts

Hopefully, this succinct summary provides a foundational understanding of the history, terminology, methods and communication of precipitation frequency estimates. The tools, data and expertise needed to conduct precipitation frequency analyses represent a very narrow niche in the meteorological field. MetStat has been a leader in this niche since 1994 and, with strategic partners, continues to develop state-of-the-practice methods for advancing the science of extreme precipitation.


  • Buishand, Hanel M., et al., 2009. A nonstationary index flood model for precipitation extremes in transient regional climate model simulations. J Geophys Res-Atmos 114, 1–16.
  • Cheng, Linyin, and Amir AghaKouchak, 2017. Nonstationary precipitation intensity-duration-frequency curves for infrastructure design in a changing climate. Scientific Reports 4, 7093.
  • Foufoula-Georgiou, E., 1989. A probabilistic storm transposition approach for estimating exceedance probabilities of extreme precipitation depths. Water Resour Res 25, 799–815.
  • Grisa, Thomas M., 2014. Relabeling extreme rainfall events to improve public understanding. Proceedings of the Water Environment Federation, WEFTEC 2013: Session 10 through Session 19, pp. 1335–1345.
  • Hosking, J. R. M., and J. R. Wallis, 1997. Regional Frequency Analysis: An Approach Based on L-Moments. Cambridge University Press, 244 pp.
  • Kunkel, Kenneth E., et al., 2013. Probable maximum precipitation and climate change. Geophys Res Lett 40, 1402–1408.
  • Perica, Sanja, Deborah Martin, Sandra Pavlovic, Ishani Roy, Michael St. Laurent, Carl Trypaluk, Dale Unruh, Michael Yekta, and Geoffrey Bonnin, 2013. NOAA Atlas 14 Volume 8 Version 2, Precipitation-Frequency Atlas of the United States, Midwestern States. NOAA, National Weather Service, Silver Spring, MD.
  • Waters, Stephen D., 2014. A classification index for precipitation events in Maricopa County, Arizona. Flood Control District of Maricopa County, White Paper, pp. 1–13.

Completed Project Profiles

All-Season 6- and 24-hour Precipitation in Texas - 2014-2015

MetStat produced gridded estimates for 2-year through 1,000-year recurrence intervals to support probable maximum precipitation estimation.  This included collecting and quality controlling precipitation annual maximum series for 6-hour and 24-hour data; creating statistically homogeneous regions across the state of Texas in climates ranging from arid inland to humid along the coast; and conducting precipitation frequency analyses using a regional L-moment approach.


The 100-year rainfall values for the 24-hour duration over Texas and adjacent regions.

Santa Clara, San Mateo, and Alameda County, CA Frequency Analysis - 2015-2016

MetStat built upon existing precipitation frequency studies and incorporated other local data and alternative approaches to provide all-season precipitation frequency estimates to the Counties for durations of 15 minutes through 3 days and recurrence intervals of 1 year through 1,000 years, with accompanying temporal distributions for the 6-, 24- and 72-hour durations. Stakeholders were actively engaged throughout regarding data quality and local climate concerns to provide the best possible results. The approach and resulting precipitation frequency estimates and 90% confidence limits were compared with estimates previously published in other studies, such as NOAA Atlas 14.

The 100-year rainfall values at the 24-hour duration for several counties in the Bay region of central California.

Duke Energy Local Intense Precipitation Analysis

Together with MGS Engineering Consultants, MetStat performed a Local Intense Precipitation (LIP) analysis for a Duke Energy-owned nuclear station. According to the Nuclear Regulatory Commission (NRC), a LIP is a “hypothetical locally heavy rainfall event that is used to design flood protection features and/or procedures. LIP is typically assumed to be equivalent to the local probable maximum precipitation (PMP) derived from National Weather Service (NWS) Hydrometeorology Reports (HMRs), from a site-specific PMP study” or a precipitation frequency analysis. In some locations of the U.S., rainfall in excess of 19 inches in 1 hour (over one square mile) is estimated from the HMRs. In our study, we leveraged storm analyses and results from nearby precipitation frequency studies to ascertain the annual exceedance probability of the LIP. We also provided temporal distributions (shown below) of extreme 1-hour events for hydrologic modeling of the cooling pond. This helped Duke Energy evaluate flood impacts and risk to required structures, systems, and components (SSCs). In the end, our analysis helped eliminate an NRC-issued red violation based on the Significance Determination Process (SDP).

Example high temporal resolution analysis of a local intense storm.

Friant Watershed Basin-Average Precipitation Frequency

A 72-hour basin-average precipitation-frequency relationship for the Friant watershed was needed for stochastic flood modeling of inflows to Friant Dam and dams operated by Southern California Edison in the Big Creek System of the Upper San Joaquin River basin. A regional precipitation-frequency analysis using L-moments was conducted for a study area consisting of the west face of the Sierra Nevada and adjacent lowland areas, and the east face of the Sierra Nevada, extending from near Bakersfield, CA to near Mt. Shasta, CA.

Sixteen large storm events that produced noteworthy 72-hour precipitation totals over the Friant watershed were identified in the historical record. Isopercental analysis methods were used to develop the spatial distribution of 72-hour precipitation over the watershed, and regression methods were used to establish a relationship between 72-hour point precipitation within the watershed and 72-hour 1,660-mi2 basin-average precipitation. Monte Carlo methods were then used to develop the 72-hour basin-average precipitation-frequency relationship from the findings of the regional precipitation-frequency analysis and the point-to-basin-average relationship. Uncertainty analyses were conducted for the parameters used to derive this relationship, allowing development of uncertainty bounds around the best-estimate 72-hour basin-average precipitation-frequency curve.
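The Monte Carlo step can be pictured with a toy sketch: draw 72-hour point precipitation from a stand-in frequency curve, convert each draw to basin-average precipitation with a regression-style ratio plus random scatter, and read frequencies off the ranked results. Every number below (the lognormal stand-in, the 0.72 ratio, the scatter) is invented for illustration and is not the relationship developed for the Friant watershed.

```python
import math
import random

random.seed(1)
N = 100_000

basin = []
for _ in range(N):
    # Stand-in 72-hr point-precipitation frequency curve (lognormal, inches)
    point = math.exp(random.gauss(math.log(8.0), 0.35))
    # Regression-style point-to-basin-average ratio with random scatter
    ratio = max(random.gauss(0.72, 0.05), 0.0)
    basin.append(point * ratio)

basin.sort()
for ari in (10, 100, 1000):
    q = 1.0 - 1.0 / ari  # non-exceedance probability
    depth = basin[int(q * N)]
    print(f"{ari}-yr 72-hr basin-average: {depth:.1f} inches")
```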

The 72-hour basin-average value of Probable Maximum Precipitation (PMP) of 27.2 inches obtained from National Weather Service Hydrometeorological Report 59 for the Friant watershed is estimated to have an annual exceedance probability of 2E-5 based on the 72-hour basin-average precipitation-frequency relationship developed for the watershed.

72-hour watershed-mean precipitation frequency curves for Friant Dam, CA.

Contributions to NOAA Atlas 14 Precipitation Frequency Estimates

For over a decade (1998-2013), Tye Parzybok and Debbie Martin of MetStat® played key roles in the development of NOAA Atlas 14, including the original design, development and maintenance of the Precipitation Frequency Data Server (PFDS). We contributed to the completion of nine volumes of NOAA Atlas 14 for project areas covering a range of climates across the United States and affiliated territories: the semiarid southwest U.S., the Ohio River Basin and surrounding states, Puerto Rico and the U.S. Virgin Islands, Hawaii, selected Pacific Islands, California, Alaska, the Midwestern U.S. and the Southeastern U.S. The Atlas contains precipitation frequency estimates, with 90% confidence intervals, for durations of 5 minutes through 60 days and average recurrence intervals of 1 year to 1,000 years. Through the course of development, data from over 41,000 stations were collected and quality controlled from various sources. Precipitation data were regionalized using various climatological and geographical characteristics and also a “region-of-influence” approach. L-moments were used to analyze annual maxima data and develop point precipitation-frequency estimates and IDF curves. Other statistical analyses included temporal distributions of heavy precipitation, trends in the mean and variance of annual maximum series data, and seasonality of heavy precipitation. Contributions to documentation included internal summaries of technical discussions, on-line quarterly progress reports and the final published documents for nine volumes of NOAA Atlas 14.
