Assessing the Impact of Non-Conventional Observations on High-Resolution Analyses and Forecasts
A key recommendation of a 2009 report by the National Research Council (NRC) was for new mesoscale networks to be integrated with existing ones to form a nationwide "network of networks". This recommendation originated in response to noted deficiencies in the U.S. mesoscale observing network. The report also recommended that research testbeds be established, such as the Center for Collaborative Adaptive Sensing of the Atmosphere (CASA) DFW Urban Demonstration Network, to ascertain the potential benefit of proposed observing systems.
In this work, non-conventional surface observations from Global Science & Technology (GST) Mobile Platform Environmental Data (MoPED), WeatherBug, Citizen Weather Observer Program (CWOP), and Understory Weather in the DFW Testbed are considered. Radar data include Terminal Doppler Weather Radars (TDWRs) and CASA X-band radars. The Advanced Regional Prediction System (ARPS) model is used to perform observing system experiments (OSEs) designed to assess the impact of the aforementioned networks. The three-dimensional variational (3DVAR) analysis system is used, along with the complex cloud analysis, to produce analysis increments every 10 minutes, which are then applied to the model forecast using incremental analysis updating (IAU). Experiments are performed on a supercell thunderstorm that impacted the DFW metroplex on 11 April 2016 with large, damaging hail. The analysis includes qualitative and quantitative comparisons of the forecast reflectivity fields, quantitative comparisons of model-derived hail with radar-observed hail, and surface-level verification of the temperature and dew point fields. The CASA radial velocity data offer a positive benefit to the forecast storm structure, as noted in the simulated reflectivity and in model-derived hail. However, the data appear to be detrimental when considering quantitative comparisons of the simulated reflectivity with observations. The inclusion of dew point temperature measurements from the non-conventional CWOP and WeatherBug networks degraded the forecast dew point field. The analysis concludes with a brief comparison of single-moment versus double-moment microphysics scheme sensitivity. Future work should assess the impacts of the non-conventional observations on a wider array of cases.
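The IAU step described above can be sketched in a few lines. This is a minimal, illustrative example, not the ARPS implementation: a single analysis increment is split into equal tendencies and absorbed gradually over the model steps of the update window. All names and values here are hypothetical.

```python
import numpy as np

def iau_forcing(increment, window_steps):
    """Split one analysis increment into equal tendencies,
    one applied at each model step of the IAU window."""
    return increment / window_steps

# toy example: a 1.2 K temperature increment spread over a
# 10-minute window discretized into 10 model steps
increment = np.array([1.2])
steps = 10
state = np.array([280.0])      # model temperature (K)
forcing = iau_forcing(increment, steps)
for _ in range(steps):
    state = state + forcing    # model dynamics/physics would also act here
# after the window, the full increment has been absorbed by the model
```

Compared with inserting the increment all at once, distributing it this way reduces the shock (spurious gravity waves) the model experiences at the analysis time.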
Forecast Sensitivity to Observations using Data Denial and Ensemble-based Methods over the Dallas-Fort Worth Testbed
The "Nationwide Network of Networks" (NNoN) concept was introduced by the National Research Council to address the growing need for a national mesoscale observing system. Part of this growing need is the continued advancement toward accurate high-resolution numerical weather prediction. The research testbed known as the Dallas-Fort Worth (DFW) Urban Demonstration Network was created to experiment with many kinds of mesoscale observations that could be used in a data assimilation system, in order to identify the observing systems that are most impactful on high-resolution forecasts. Many observation systems have been implemented for the DFW testbed, including Earth Networks (ERNET) WeatherBug surface stations, Citizen Weather Observer Program (CWOP) amateur surface stations, Global Science and Technology (GST) mobile truck observations, CASA X-band radars, SODARs, and radiometers. These "nonconventional" observations are combined with conventional operational data from METARs, mesonets, aircraft, rawinsondes, profilers, and operational radars to form the testbed network. A principal component of the NNoN effort is the quantification of observation impact from several different sources of information. This dissertation covers two main themes related to quantifying the impact that observations have on forecasts.
The first part is the quantification of impact using data denial experiments, a form of observing system experiment (OSE). The GSI-based EnKF data assimilation system was used together with the WRF-ARW model to examine the impacts of observations assimilated for forecasting convection initiation (CI) in the 3 April 2014 hailstorm case.
Data denial experiments were conducted testing the impact of high-frequency (5-min) assimilation of nonconventional data on the timing and location of CI, as well as on the development of storms as they progress through the testbed domain. Results using ensemble probability of reflectivity and neighborhood ensemble probability of hail show that nonconventional observations were necessary to capture local details in the dryline structure that caused localized enhanced convergence and led to CI. Diagnosis of denial-minus-control fields showed the cumulative influence each observing network had on the resulting CI forecast. Most of this impact came from the assimilation of thermodynamic observations. Accurate metadata is found to be crucial to the application of nonconventional observations in high-resolution assimilation and forecast systems.
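The neighborhood ensemble probability diagnostic mentioned above can be illustrated with a small sketch: at each grid point, it is the fraction of (ensemble member, neighborhood point) values exceeding a threshold. This is a generic illustration, not the dissertation's exact configuration; the function name, threshold, and toy data are invented.

```python
import numpy as np

def neighborhood_ensemble_probability(fields, threshold, radius):
    """Neighborhood ensemble probability: for each grid point, the
    fraction of member values >= threshold within a square
    neighborhood of half-width `radius` grid points.
    fields: array of shape (n_members, ny, nx)."""
    n_members, ny, nx = fields.shape
    exceed = (fields >= threshold).astype(float)
    nep = np.zeros((ny, nx))
    for j in range(ny):
        for i in range(nx):
            j0, j1 = max(0, j - radius), min(ny, j + radius + 1)
            i0, i1 = max(0, i - radius), min(nx, i + radius + 1)
            # average over members AND the surrounding neighborhood
            nep[j, i] = exceed[:, j0:j1, i0:i1].mean()
    return nep

# toy 3-member "reflectivity" ensemble on a 5x5 grid
rng = np.random.default_rng(0)
refl = rng.uniform(0.0, 60.0, size=(3, 5, 5))
nep = neighborhood_ensemble_probability(refl, threshold=40.0, radius=1)
```

Smoothing over a neighborhood in this way rewards forecasts that place storms close to, but not exactly at, the observed location, which is why it is a common verification choice at convective scales.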
The second part of this dissertation explored the application of ensemble-based forecast sensitivity to observations (EFSO). First, tests using a global two-layer model were performed to identify areas of improvement in the localization methods needed to make EFSO estimates accurate. Because of the forecast-time component, localization of the EFSO metric is more complicated than during traditional assimilation: as forecast time increases, the error correlation structures evolve with the flow. Experiments made use of the local ensemble transform Kalman filter (LETKF) with a simple two-layer primitive equation model and simulated observations. Application of an adaptive localization method, regression confidence factors (RCF) based on a Monte Carlo "group filter" technique, led to marked improvement, especially for longer forecasts and at midlatitudes, when systematically verified against actual impact in RMSE and skill scores.
Results showed that the shape, location, time dependency, and variable dependency of the RCF localization functions are consistent with the underlying dynamical processes of the model. The impact estimates near the equator were not as effective due to large discrepancies between the RCF function and the localization used at assimilation time. These latter results indicated that there exists an inherent relationship between the localization applied at assimilation time and the proper localization choice for observation impact estimates. Application of RCF for automatically tuned localization is introduced and tested for a single-observation experiment.
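The localization discussion above rests on tapering ensemble covariances toward zero with distance. For reference, a widely used taper (distinct from the adaptive RCF method described here) is the Gaspari-Cohn fifth-order piecewise-rational function; the sketch below assumes a simple one-dimensional distance argument and an invented half-width.

```python
import numpy as np

def gaspari_cohn(dist, c):
    """Gaspari-Cohn (1999) fifth-order piecewise-rational
    localization taper; `c` is the localization half-width.
    Returns 1 at zero distance and 0 beyond 2*c."""
    z = np.abs(np.asarray(dist, dtype=float)) / c
    taper = np.zeros_like(z)
    inner = z <= 1.0
    outer = (z > 1.0) & (z <= 2.0)
    zi = z[inner]
    taper[inner] = (-0.25 * zi**5 + 0.5 * zi**4 + 0.625 * zi**3
                    - (5.0 / 3.0) * zi**2 + 1.0)
    zo = z[outer]
    taper[outer] = ((1.0 / 12.0) * zo**5 - 0.5 * zo**4 + 0.625 * zo**3
                    + (5.0 / 3.0) * zo**2 - 5.0 * zo + 4.0
                    - (2.0 / 3.0) / zo)
    return taper

# weights at 0 km, 500 km, and 2000 km for a 1000 km half-width
weights = gaspari_cohn([0.0, 500.0, 2000.0], c=1000.0)
```

A static taper like this is distance-only; the flow-dependent evolution of error correlations described in the text is exactly what such a fixed function cannot capture at longer forecast times, motivating the adaptive RCF approach.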
Next, the EFSO method was applied to the high-resolution CI case from 3 April 2014 and evaluated for accuracy in terms of several verification metrics, including energy norms, surface variables, and composite reflectivity. Static and advected localization were applied to EFSO and compared for accuracy against the actual forecast error reduction. The RCF method was also applied to the convective-scale EFSO estimation. Results showed that different verification metrics lead to different forecast length scales over which the estimates are useful. The application of EFSO to reflectivity is hindered by the high nonlinearity of convection, though it provided some qualitative insights. The application of RCF localization, while found to reveal the underlying flow dependence of the case study, including the forecast-time component, did not improve upon the advected localization method. This is hypothesized to be due in part to the issues identified in the two-layer model work, though other adaptive methods may yet yield better results. Nevertheless, the application of EFSO is appropriate for convective-scale systems on forecast time scales of 90 minutes or less.
Explorations into Machine Learning Techniques for Precipitation Nowcasting
Recent advances in cloud-based big-data technologies now make data-driven solutions feasible for an increasing number of scientific computing applications. One such data-driven approach is machine learning, in which patterns in large data sets are brought to the surface by finding complex mathematical relationships within the data. Nowcasting, or short-term prediction of rainfall in a given region, is an important problem in meteorology. In this thesis we explore the nowcasting problem through a data-driven approach by formulating it as a machine learning problem.
State-of-the-art nowcasting systems today are based either on numerical models that describe the physical processes leading to precipitation or on weather radar extrapolation techniques that predict future radar precipitation maps by advection from a sequence of past maps. While these techniques can perform well over very short prediction horizons (minutes) or very long horizons (hours to days), they tend not to perform well over medium horizons (1-2 hours), due to the lack of input data at the necessary spatial and temporal scales for the numerical prediction methods, or the inability of radar extrapolation methods to predict storm growth and decay. Given that water must first concentrate in the atmosphere as water vapor before it can fall to the ground as rain, one goal of this thesis is to understand whether water vapor information can improve radar extrapolation techniques by providing the information needed to infer growth and decay. To do so, we use the GPS-Meteorology technique to measure water vapor in the atmosphere and weather radar reflectivity to measure rainfall. By training a machine learning nowcasting algorithm on both variables and comparing its performance against a nowcasting algorithm trained on reflectivity alone, we draw conclusions as to the predictive power of adding water vapor information.
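One way to cast the nowcasting task as supervised learning, as described above, is to pair the past few reflectivity and water-vapor frames at each pixel with the reflectivity observed some lead time later. The sketch below is a hypothetical illustration, not the thesis pipeline: the function name, window sizes, and toy data are all invented, and it produces feature rows that could then feed a random forest or CNN regressor.

```python
import numpy as np

def build_samples(refl, wv, history, lead):
    """Turn co-located time series of reflectivity and water vapor
    maps, each of shape (t, ny, nx), into supervised pairs:
    features = the past `history` frames of both fields at a pixel,
    target   = reflectivity `lead` steps ahead at that pixel."""
    t, ny, nx = refl.shape
    X, y = [], []
    for k in range(history, t - lead):
        past_r = refl[k - history:k]                 # (history, ny, nx)
        past_w = wv[k - history:k]
        stacked = np.concatenate([past_r, past_w])   # (2*history, ny, nx)
        X.append(stacked.reshape(2 * history, -1).T) # one row per pixel
        y.append(refl[k + lead].ravel())
    return np.concatenate(X), np.concatenate(y)

# toy data: 12 frames of 8x8 reflectivity and water vapor maps
rng = np.random.default_rng(1)
refl = rng.uniform(0.0, 60.0, size=(12, 8, 8))
wv = rng.uniform(0.0, 50.0, size=(12, 8, 8))
X, y = build_samples(refl, wv, history=3, lead=2)
```

Dropping the `wv` columns from `X` yields the reflectivity-only baseline, so the two feature sets can be trained and scored identically to isolate the contribution of the water vapor information.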
Another goal of this thesis is to compare different machine learning techniques, viz., the random forest ensemble learning technique, which has shown success on a number of other weather prediction problems, and the current state-of-the-art machine learning technique for images and image sequences, convolutional neural network (CNN). We compare these in terms of problem representation, training complexity, and nowcasting performance.
A final goal is to compare the nowcasting performance of our machine learning techniques against published results for current state-of-the-art model-based nowcasting techniques.
Aeronautical engineering: A continuing bibliography with indexes (supplement 295)
This bibliography lists 581 reports, articles, and other documents introduced into the NASA Scientific and Technical Information System in September 1993. Subject coverage includes: design, construction, and testing of aircraft and aircraft engines; aircraft components, equipment, and systems; ground support systems; and theoretical and applied aspects of aerodynamics and general fluid dynamics.
The Music Sound
A guide to music: compositions, events, forms, genres, groups, history, industry, instruments, language, live music, musicians, songs, musicology, techniques, terminology, theory, and music video.
Music is a human activity involving structured, audible sounds, used for artistic or aesthetic, entertainment, or ceremonial purposes.
The traditional or classical European aspects of music often listed are those elements given primacy in European-influenced classical music: melody, harmony, rhythm, tone color/timbre, and form. A more comprehensive list is given by stating the aspects of sound: pitch, timbre, loudness, and duration.
Common terms used to discuss particular pieces include melody, which is a succession of notes heard as some sort of unit; chord, which is a simultaneity of notes heard as some sort of unit; chord progression, which is a succession of chords (a succession of simultaneities); harmony, which is the relationship between two or more pitches; counterpoint, which is the simultaneity and organization of different melodies; and rhythm, which is the organization of the durational aspects of music.