Rebuilding Decisions in Central Oklahoma
Nadajalah Bennett — UT Arlington

A large tornado struck the central Oklahoma communities of Newcastle, Oklahoma City, and Moore on May 20, 2013. A door-to-door survey of homeowners throughout Moore and Oklahoma City was conducted in June 2013 to understand how residents had incorporated mitigation techniques and emergency preparedness options since the tornado. The survey covered four categories: damage done to homes, reasons homeowners implemented mitigation strategies, costs of implementing mitigation measures, and emergency preparedness strategies homeowners use ahead of severe weather. Most homeowners were either considering or had already installed a storm shelter in their home to give themselves a greater sense of safety. Many homeowners were unaware of other mitigation techniques they could add to their homes to protect against wind damage and other severe weather. A common reason homeowners did not implement mitigation strategies was the additional cost: most had no personal budget for out-of-pocket expenses.
Forecast Sensitivity of Lake-Effect Snow to Choice of Boundary Layer Parameterization Scheme
Robert Conrick — Indiana University

This study assesses the forecast sensitivity of lake-effect snow to various boundary layer parameterization schemes using the WRF-ARW model. Six boundary layer schemes are tested on a case study of lake-effect snow over Lake Erie in December 2009. The experiments reveal significant precipitation differences (as much as 20 mm over 6 h) between the schemes. Consideration of the heat and moisture fluxes shows that schemes producing more precipitation have higher fluxes over the lake. Forcing all schemes to use the same over-water heat and moisture fluxes brings the precipitation forecasts into closer agreement. The heat and moisture fluxes are found to be strongly dependent on the similarity-stability functions for heat, momentum, and moisture (F_H, F_M, and F_Q). When the over-water values of F_H, F_M, and F_Q are set to be the same in all schemes, precipitation forecasts are similar in all experiments, indicating that the parameterization used to determine F_H, F_M, and F_Q has profound impacts on forecasts of this type of weather. Comparison of the forecast accumulated precipitation to observations shows that most schemes overpredict the precipitation. The scheme in closest agreement is the Mellor-Yamada-Nakanishi-Niino scheme.
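For context, surface-layer schemes commonly obtain these fluxes from bulk aerodynamic formulas whose exchange coefficients depend on the similarity-stability functions. A generic sketch (not the exact formulation of any particular WRF-ARW scheme):

```latex
H = \rho \, c_p \, C_H \, |U| \, (\theta_s - \theta_a), \qquad
E = \rho \, L_v \, C_Q \, |U| \, (q_s - q_a)
```

Here the exchange coefficients C_H and C_Q are built from F_M together with F_H or F_Q, so differences in the stability functions feed directly into the over-water fluxes.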
Verification of Earth Networks' Dangerous Thunderstorm Alerts and National Weather Service Warnings
Rebecca DiLuzio — Millersville University

Earth Networks Incorporated (ENI) has expressed the potential for their Dangerous Thunderstorm Alerts (DTAs) to increase lead time by an additional nine minutes over current National Weather Service (NWS) tornado warnings while maintaining a similar probability of detection (POD) and false alarm ratio (FAR). These automated, storm-based alerts combine lightning-based storm tracking with total lightning flash rate thresholds to designate regions with an increased potential for severe and hazardous weather. ENI produces alert polygons at three levels: (1) basic thunderstorm, (2) significant thunderstorm, and (3) dangerous thunderstorm. Verification statistics (POD, FAR, and lead time) were calculated for ENI's level 3 DTAs and for NWS severe thunderstorm and tornado warnings over one year of data, March 2013 through February 2014. A more in-depth case study was done for 20 May 2013. The goal of this comparison is to evaluate how well DTAs perform relative to NWS warnings and whether their use within operational meteorology would improve warnings.
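For reference, with a = hits, b = false alarms, and c = misses in the standard 2x2 contingency table, the verification statistics used here are defined as:

```latex
\mathrm{POD} = \frac{a}{a + c}, \qquad \mathrm{FAR} = \frac{b}{a + b}
```

Lead time is the interval between issuance of the alert or warning and the onset of the event, averaged over hits.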
Warming Southeast Australian Climate: The Effects of Sea Surface Temperatures (SSTs)
Kwanshae Flenory — Langston University

In the past few decades, the climate in Australia has been warming at an alarming rate compared to historical variations. Associated with that warming, extended heat events lasting for weeks to months have plagued the country. Climate model projections suggest that such events will occur more frequently and intensify in the future. The extreme temperatures have damaged ecosystems through drought and fire and resulted in the loss of human life. This study examines how the combination of sea surface temperatures (SSTs) and climate drivers predicts summer mean maximum temperature at selected locations in SE Australia. Ninety-one ocean grid boxes of SST surrounding Australia were used for simultaneous and lag-1 relations, along with 42 climate drivers, creating a suite of 224 potential predictors. Variable reduction using 5-fold cross-validated linear regression and bagging resulted in a ~90% reduction in the number of variables passed to the final prediction equations. Multiple linear and nonlinear kernel regression methods were applied to predict the January anomalies of maximum temperature using this reduced set of predictors. For the nonlinear regressions, two kernels were evaluated: polynomial and radial basis function. The polynomial degree and radial basis function kernel width were optimized for sea surface temperatures and climate drivers by maximizing their 10-fold cross-validated correlations with the air temperatures at the various locations in SE Australia. The key findings were (1) climate drivers had as much influence on the prediction accuracy as SSTs and (2) the combination of the reduced sets of SSTs and climate drivers often accounted for 40-60% of the January mean maximum temperature variance. Such a large percentage of predictable variance is expected to lead to more effective monthly temperature predictions.
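A minimal sketch of the kernel optimization step, using scikit-learn's KernelRidge as a stand-in kernel regression; the predictor matrix, station series, and parameter grids below are illustrative placeholders, not the study's data or settings:

```python
# Tune RBF kernel width / polynomial degree by maximizing 10-fold
# cross-validated correlation (a sketch, not the authors' code).
import numpy as np
from scipy.stats import pearsonr
from sklearn.kernel_ridge import KernelRidge
from sklearn.metrics import make_scorer
from sklearn.model_selection import GridSearchCV

def cv_correlation(y_true, y_pred):
    """Pearson correlation between observed and predicted anomalies."""
    return pearsonr(y_true, y_pred)[0]

rng = np.random.default_rng(0)
X = rng.standard_normal((60, 25))   # placeholder reduced SST + driver predictors
y = rng.standard_normal(60)         # placeholder January Tmax anomalies, one station

searches = {
    "rbf":  GridSearchCV(KernelRidge(kernel="rbf"),
                         {"gamma": np.logspace(-3, 1, 9)},   # kernel width grid
                         scoring=make_scorer(cv_correlation), cv=10),
    "poly": GridSearchCV(KernelRidge(kernel="poly"),
                         {"degree": [2, 3, 4]},              # polynomial degree grid
                         scoring=make_scorer(cv_correlation), cv=10),
}
for name, search in searches.items():
    search.fit(X, y)
    print(name, search.best_params_, round(search.best_score_, 3))
```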
Sensitivity of Simulated Supercell Thunderstorms to Horizontal Grid Resolution
Montgomery Flora — Ball State University

The effects of horizontal grid spacing on idealized supercell simulations are investigated. Motivation for the study largely stems from the NOAA Warn-on-Forecast program, which envisions a paradigm shift from "warn-on-detection", where convective hazard warning decisions are primarily based on current observations, to a new paradigm where storm-scale numerical weather models play a greater role in generating forecasts and warnings. Unlike most previous grid spacing sensitivity studies, we focus on impacts to operationally significant features of supercells. Using the WRF-ARW model, idealized supercell simulations are run for 2 hours using three different environments and horizontal grid spacings of 333 m and 1, 2, 3, and 4 km. Given that forecasts under the Warn-on-Forecast paradigm will be initialized after several radar data assimilation cycles, we initialize our coarser simulations with filtered versions of the 333 m "truth" simulation valid at t = 30 min. To isolate differences in storm evolution arising from finer-scale processes being unrepresented in the coarser simulations, the latter are compared to appropriately filtered versions of the truth simulations. Results show that operationally significant errors in supercell evolution arise as the grid spacing is increased. Furthermore, the grid spacing sensitivity is strongly influenced by the environment. The 4 km grid spacing is too coarse to even qualitatively reproduce the supercell evolution, with the storm dying before the end of the simulation in one of the three environments. The improvement as grid spacing decreases from 2 to 1 km is greater than that from 3 to 2 km. Implications of this and other findings for Warn-on-Forecast are discussed.
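The comparison step can be illustrated by low-pass filtering a high-resolution truth field and sampling it onto the coarser grid before differencing; the Gaussian filter and placeholder field below are a generic stand-in, not the authors' exact procedure:

```python
# Coarse-grain a high-resolution "truth" field so it can be compared
# fairly with a coarser simulation (illustrative sketch only).
import numpy as np
from scipy.ndimage import gaussian_filter

def filter_to_grid(field, dx_fine, dx_coarse):
    """Low-pass filter a 2D field at dx_fine (m) and sample it at dx_coarse (m)."""
    ratio = dx_coarse / dx_fine
    # Smooth at a length scale comparable to the coarse grid spacing,
    # removing structures the coarse model cannot represent.
    smoothed = gaussian_filter(field, sigma=ratio / 2.0)
    step = int(round(ratio))
    return smoothed[::step, ::step]

truth = np.random.rand(600, 600)              # placeholder 333 m model field
truth_on_1km = filter_to_grid(truth, 333.0, 1000.0)
print(truth_on_1km.shape)                     # (200, 200)
```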
Verification of the Bragg Scatter Method on the WSR-88D
Joshua Gebauer — California University of Pennsylvania

For the purpose of radar quantitative precipitation estimates, differential reflectivity (ZDR) plays a crucial role and must be accurately calibrated. Currently, some WSR-88Ds in the Next Generation Weather Radar (NEXRAD) fleet may have systematic ZDR biases due to errors in the measurement of the H and V channels. The Radar Operations Center (ROC) monitors these systematic ZDR biases by measuring returns from external targets that should produce, or can be adjusted to, zero decibels (dB). One such target with an intrinsic ZDR of 0 dB is Bragg scatter, a clear-air return caused by turbulent mixing in refractive index gradients. The ROC implemented a method developed by the National Severe Storms Laboratory to detect Bragg scatter on the WSR-88D. This study uses atmospheric sounding data as truth to verify the radar-based Bragg scatter detection method from January to June 2014 (11,521 radar/sounding pairs). Measurements of refractivity gradients and Richardson number from the 00Z sounding (indicators of conditions conducive to Bragg scatter) are compared to radar-based method detections between 00Z and 02Z. Sounding analyses reveal that the potential for Bragg scatter occurs in 95.43% of radar/sounding pairs at vertical layers below 5 km in the continental United States (CONUS). However, due to the method's strict data filters and volume coverage pattern (VCP) requirements, the method detects Bragg scatter only 4.03% of the time (464 radar/sounding pairs). Of the 464 pairs, Bragg scatter detection is verified 84.7% of the time at the same height indicated by the sounding. Climate region characteristics influence the variability of the verification statistics. We expect that improvements to the data filters for Bragg scatter detection, better use of available VCPs, and improved scanning techniques will increase the frequency of Bragg scatter detection.
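For reference, the two quantities central to this verification have standard definitions: differential reflectivity in terms of the H- and V-channel reflectivities, and the gradient Richardson number computed from sounding profiles of virtual potential temperature and wind:

```latex
Z_{DR} = 10 \log_{10}\!\left(\frac{Z_H}{Z_V}\right), \qquad
Ri = \frac{\dfrac{g}{\theta_v}\dfrac{\partial \theta_v}{\partial z}}
          {\left(\dfrac{\partial u}{\partial z}\right)^{2} +
           \left(\dfrac{\partial v}{\partial z}\right)^{2}}
```

Low Ri in the presence of sharp refractivity gradients indicates the turbulent mixing that produces Bragg scatter.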
Determining the Likelihood of Observing the Tornado Debris Signature at Different Geographic Locations throughout the United States
Shawn Handler — Plymouth State University

With the upgrade of the National Weather Service network of weather radars to dual-polarization, it has become possible to use the new radar moments to detect tornado debris. This study investigates the likelihood of observing the tornado debris signature (TDS) at different geographic locations throughout the United States given that an ongoing tornado is present. The likelihood of observing a TDS varies according to radar geometry and the presence of materials that can be lofted by a tornado. To estimate the likelihood of observing a TDS at different geographic locations, we employed datasets of range from the nearest radar, lowest unblocked height of the radar beam, population density, and a normalized difference vegetation index (NDVI). We also modeled the relationship between tornado intensity and the vertical extent of the debris signature. Maps for three distinct seasons in 2012 (spring, summer, fall) were generated identifying areas where TDS detection would or would not be likely for tornadoes of EF0-EF2 and EF3+ intensities. The study indicates that a tornado is likely to be depicted by a TDS on radar if it occurs close to the radar site, in regions of high population density or rich vegetation, and if the tornado itself is strong. The signature is less likely to be seen for weak tornadoes, rural areas with little vegetation, and regions that experience beam blockage. Tornadoes of EF0 or EF1 intensity are unlikely to exhibit a TDS, and in some areas, like the Gulf Coast, the TDS may only be observed for tornadoes of EF3+ intensity. The range of TDS detection was also found to be limited in areas susceptible to tornadoes, including portions of the Central Plains, Midwest, and Mississippi Valley.
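The vegetation dataset is the standard normalized difference vegetation index, computed from near-infrared and red reflectances:

```latex
\mathrm{NDVI} = \frac{\rho_{\mathrm{NIR}} - \rho_{\mathrm{Red}}}{\rho_{\mathrm{NIR}} + \rho_{\mathrm{Red}}}
```

Higher NDVI indicates denser vegetation, and hence more loftable material for a TDS.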
Spatial and Temporal Variability of Albedo From Enhanced Radiation Measurements in Oklahoma
Nathan Kelly — Valparaiso University

In 1999, the Oklahoma Atmospheric Surface-layer Instrumentation System (OASIS) project placed instrumentation focused on observing the surface energy budget at 89 Oklahoma Mesonet stations. At any given time, 10 stations (designated "super sites") were outfitted with additional instrumentation, including a four-component net radiometer capable of observing incoming and outgoing shortwave (solar) and longwave radiation. Data are available from the beginning of 2000 until October 2008. These data were filtered to remove observations not representative of the day's albedo (e.g., sunrise and sunset periods, cloudy days, and erroneous instrument readings), and monthly averages were computed for each of the super sites in order to develop a better understanding of the spatial and temporal variability of albedo in Oklahoma.
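From the four-component radiometer measurements, albedo follows directly as the ratio of outgoing (reflected) to incoming shortwave radiation:

```latex
\alpha = \frac{SW^{\uparrow}}{SW^{\downarrow}}
```

This is the quantity filtered and averaged monthly at each super site.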
Examining Polarimetric Characteristics of Electronic Interference in Weather Radar Data
Thong Phan — East Central University

For many decades, meteorologists have used weather radars to examine what kinds of precipitation are occurring in the atmosphere. With the recent upgrade of the WSR-88D (Weather Surveillance Radar-1988 Doppler) to dual polarization (dual-pol), meteorologists can now examine the atmosphere with dual-polarization products. These products are: Velocity (V), Reflectivity (Z), Differential Propagation Phase (PhiDP), Correlation Coefficient (RhoHV), Differential Reflectivity (Zdr), and Spectrum Width (SPW). Though these products are very useful in determining what type of precipitation is in the atmosphere, how large the precipitation event is, and how severe it can be, the radar also picks up many non-meteorological echoes. Electronic interference is a type of non-meteorological echo that has high reflectivity values and is mistakenly identified as precipitation by automated systems. This study examines the reflectivity, differential reflectivity, and correlation coefficient of electronic interference and of precipitation to find objective criteria that distinguish between them. The findings are meant to help make the current quality control algorithm more effective for operational use.
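As a purely hypothetical illustration of what such objective criteria could look like once established, gate-level thresholds on the three fields might be combined into a mask; the numeric values below are placeholders, not findings of this study:

```python
# Hypothetical gate-level interference mask; thresholds are placeholders
# for illustration only, NOT results of this study.
import numpy as np

def flag_interference(z, zdr, rhohv):
    """Return True where a gate's dual-pol values look non-meteorological.
    z in dBZ, zdr in dB, rhohv dimensionless (all same-shaped arrays)."""
    return (z > 20.0) & (rhohv < 0.80) & (np.abs(zdr) > 2.0)  # placeholder criteria
```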
Motivators and Important Factors Influencing Decisions Made During the Oklahoma Tornadoes of May 2013
Julia Ross — Olivet Nazarene University

There were three deadly tornado events in central Oklahoma in a two-week span in May 2013. A mass exodus of drivers occurred during the third event, clogging multiple interstates upwards of 60 miles away from the main storms. Scientists needed to understand what motivated people to act the way they did so they could better anticipate people's actions and better communicate with the public in the future. To gain a reliable understanding, surveys about what people did during the events were created, distributed, and collected. Factors correlated with driving were income (less than $30,000 or between $70,000 and $100,000), younger age (20-39 years old), and some higher education (a complete or incomplete Bachelor's degree). Past direct experience with tornadoes was correlated with people staying at home, yet 33% of respondents did not feel safe at home. Of the 77 surveys collected, 27 respondents (35%) had never heard of mitigation, that is, the strengthening of their homes. Fear was commonly expressed (by 44% of respondents), with an undercurrent of feeling that self and home were vulnerable. Through these findings, scientists will be better able to anticipate Oklahomans' responses to tornadic events and the reasons behind them.
An Evaluation of Applying Ensemble Data Assimilation to an Antarctic Mesoscale Model
Lori Wachowicz — Michigan State University

Knowledge of Antarctic weather and climate processes relies heavily on models due to the lack of observations over the continent. The Antarctic Mesoscale Prediction System (AMPS) is a numerical model capable of resolving finer-scale weather phenomena. The Antarctic's unique geography, with a large ocean surrounding a circular continent containing complex terrain, makes fine-scale processes potentially very important features in poleward moisture transport and the mass balance of Antarctica's ice sheets. AMPS currently uses the 3DVAR method to produce atmospheric analyses (AMPS-3DVAR), which may not be well suited for data-sparse regions like the Antarctic and Southern Ocean. To optimally account for the flow dependence and data sparseness unique to this region, we test the application of an ensemble adjustment Kalman filter (EAKF) within the framework of the Data Assimilation Research Testbed (DART) and the AMPS model (A-DART). We test the hypothesis that the application of A-DART improves on the AMPS-3DVAR estimate of the atmosphere. We perform a test over a one-month period from 21 September to 21 October 2010 and find results comparable to both AMPS-3DVAR and the GFS. In particular, we find a strong cold model bias near the surface and a warm model bias at upper levels. Investigation of the surface bias reveals strongly biased land-surface observations, while the warm bias at upper levels is likely a circulation bias from the model warming too rapidly aloft over the continent. Increased quality control of surface observations and assimilation of polar-orbiting satellite data are expected to alleviate these issues in future tests.
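For context, ensemble filters such as the EAKF implement, in deterministic square-root form, the standard Kalman update; written generically,

```latex
\mathbf{x}^{a} = \mathbf{x}^{f} + \mathbf{K}\,(\mathbf{y} - \mathbf{H}\mathbf{x}^{f}), \qquad
\mathbf{K} = \mathbf{P}^{f}\mathbf{H}^{\mathsf{T}}\,(\mathbf{H}\mathbf{P}^{f}\mathbf{H}^{\mathsf{T}} + \mathbf{R})^{-1}
```

where the forecast error covariance P^f is estimated from the ensemble, supplying the flow dependence that 3DVAR's static covariance lacks; the EAKF then adjusts each member deterministically so the analysis ensemble matches the updated mean and covariance.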
Wind Farm Layout Optimization Problem by Modified Genetic Algorithm
Grant Williams — Oklahoma State University

Wind energy is a rapidly growing source of energy for the United States, but there are still technical problems to resolve before it can become the major source of energy production. One of the biggest problems with land-based wind farms is minimizing wake-turbine interactions within a constrained space and thus maximizing power. When wind blows through a turbine's blades, a choppy, turbulent wake is created that interferes with the ability of nearby turbines to produce power. Research has already been done on finding ways to model wind farms and place the turbines so as to minimize wake-turbine interactions, but current methods are either computationally intensive or require proprietary software. I present a modified genetic algorithm that is able to produce optimized results in a relatively short amount of computation time. The algorithm is able to make use of the computational power of graphics processing units and multiple processors, and by doing so produces results much more quickly than an algorithm run sequentially on a single processor.
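As a rough illustration of the problem setup (not the author's modified algorithm or parameters), the sketch below evolves binary grid layouts with a simple genetic algorithm, scoring each layout with a Jensen-style wake model and a commonly used cost-per-power objective; it runs sequentially, without the GPU or multiprocessor speedups the study describes.

```python
# Minimal GA for wind farm layout on a 10 x 10 grid of candidate sites.
# Wake model, cost model, and all parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(42)
GRID = 10                       # grid cells per side
CELL = 200.0                    # cell spacing (m)
R0, CT, KW = 40.0, 0.88, 0.075  # rotor radius (m), thrust coeff., wake decay

def total_power(layout):
    """Relative power of a layout with wind blowing along the rows."""
    rows, cols = np.nonzero(layout.reshape(GRID, GRID))
    power = 0.0
    for i in range(rows.size):
        deficit_sq = 0.0
        for j in range(rows.size):
            dx = (rows[i] - rows[j]) * CELL        # downwind separation
            if dx > 0 and cols[i] == cols[j]:      # turbine j is upwind of i
                d = (1 - np.sqrt(1 - CT)) / (1 + KW * dx / R0) ** 2
                deficit_sq += d * d                # sum-of-squares wake overlap
        power += (1.0 - np.sqrt(deficit_sq)) ** 3  # power scales as u^3
    return power

def fitness(layout):
    """Power per unit cost; the cost term discourages adding turbines."""
    n = layout.sum()
    if n == 0:
        return 0.0
    cost = n * (2.0 / 3.0 + np.exp(-0.00174 * n * n) / 3.0)
    return total_power(layout) / cost

def evolve(pop_size=60, gens=150, n_init=20, pmut=0.02):
    def individual():
        v = np.zeros(GRID * GRID, dtype=int)
        v[rng.choice(GRID * GRID, n_init, replace=False)] = 1
        return v
    pop = [individual() for _ in range(pop_size)]
    for _ in range(gens):
        order = np.argsort([fitness(p) for p in pop])[::-1]
        parents = [pop[k] for k in order[: pop_size // 2]]   # elitist selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.choice(len(parents), size=2, replace=False)
            cut = int(rng.integers(1, GRID * GRID - 1))      # one-point crossover
            child = np.concatenate([parents[a][:cut], parents[b][cut:]])
            child[rng.random(child.size) < pmut] ^= 1        # bit-flip mutation
            children.append(child)
        pop = parents + children
    best = max(pop, key=fitness)
    return best.reshape(GRID, GRID), fitness(best)

layout, score = evolve()
print(f"best fitness (relative power per unit cost): {score:.3f}")
```

The fitness evaluations in each generation are independent, which is what makes the GPU and multiprocessor parallelization described in the abstract effective.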
Copyright © 2014 - Board of Regents of the University of Oklahoma