
    A numerical field experiment approach for determining probabilities of microburst intensity

    Several investigators have determined that certain atmospheric parameters are related to the formation and severity of microbursts. For example, Caracena pointed out the relationship between a dry-adiabatic lapse rate and microbursts in 'The crash of Delta Flight 191 at Dallas-Fort Worth International Airport'. These early investigations led to the idea that numerical modeling of microbursts with varying atmospheric parameters might define 'signatures' that could be used to determine the probability of microburst intensity. The idea was that, by using sensors already available onboard an aircraft (such as static air temperature, pressure altitude, and radar reflectivity), a reliable prediction of microburst existence and intensity could be formed. Such data could be used to create an 'expert meteorologist', using either artificial intelligence or other techniques, that could be employed in either reactive or look-ahead systems to vary sensitivity thresholds and coordinate the inputs from different detecting systems. To this end, Honeywell contracted to have the microburst simulations run. The questions to be addressed were the following: using the sensor set available to the aircraft (e.g. temperature, radar reflectivity, etc.), can we calculate the probability (1) that a microburst could form, and (2) that the resultant winds would be of sufficient magnitude to threaten the aircraft? Over a two-year period, a data set of 1800 microburst simulations was accumulated. The microburst simulation was verified against the results of other independent researchers and against actual microburst events in Orlando and Denver. Some of the results from the simulation have already been incorporated into Honeywell's Windshear Detection and Guidance System with excellent results. Various aspects of this investigation are presented in viewgraph form.
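    The sensor-fusion idea described above can be sketched as a simple logistic combination of onboard measurements. The function name, weights, and bias below are illustrative placeholders, not values from the Honeywell study; a real system would fit them against the simulation data set the abstract describes.

```python
import math

def microburst_probability(lapse_rate_c_per_km, reflectivity_dbz,
                           w_lapse=1.5, w_refl=0.08, bias=-12.0):
    """Hypothetical logistic combination of two onboard-sensor
    quantities (lapse rate and radar reflectivity) into a single
    microburst probability. All coefficients are illustrative."""
    score = w_lapse * lapse_rate_c_per_km + w_refl * reflectivity_dbz + bias
    return 1.0 / (1.0 + math.exp(-score))

# A near dry-adiabatic lapse rate (about 9.8 C/km) combined with high
# reflectivity pushes the estimated probability toward 1.
p_benign = microburst_probability(6.0, 10.0)
p_severe = microburst_probability(9.8, 55.0)
assert p_severe > p_benign
```

    Only the functional form matters here: any monotone mapping from the available sensor set to a probability could play the same role in a reactive or look-ahead system.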

    Prediction of Convective Storms at Convection-Resolving 1 km Resolution over Continental United States with Radar Data Assimilation: An Example Case of 26 May 2008 and Precipitation Forecasts from Spring 2009

    For the first time, convection-resolving forecasts at 1 km grid spacing were produced in real time in spring 2009 by the Center for Analysis and Prediction of Storms (CAPS) at the University of Oklahoma. The forecasts assimilated both radial-velocity and reflectivity data from all operational WSR-88D radars within a domain covering most of the continental United States. In preparation for the real-time forecasts, 1 km forecast tests were carried out using a case from spring 2008, and the forecasts with and without assimilated radar data were compared with the corresponding 4 km forecasts produced in real time. A significant positive impact of radar data assimilation is found to last at least 24 hours. The 1 km grid produced a more accurate forecast of organized convection, especially in its structural and intensity details. It successfully predicted an isolated severe-weather-producing storm nearly 24 hours into the forecast, which all ten members of the 4 km real-time ensemble forecasts failed to predict. This case, together with all available forecasts from the 2009 CAPS real-time forecasts, provides evidence of the value of both a convection-resolving 1 km grid and radar data assimilation for severe weather prediction at lead times of up to 24 hours.
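    The abstract does not state which verification metric was used to compare the 1 km and 4 km precipitation forecasts, but a standard choice for threshold-exceedance verification is the equitable threat score (Gilbert skill score), sketched here on flattened forecast/observation fields:

```python
def equitable_threat_score(forecast, observed, threshold):
    """Equitable threat score for precipitation exceeding `threshold`.
    Counts hits, misses, and false alarms over paired grid points and
    corrects the threat score for hits expected by chance. ETS = 1 is
    a perfect forecast; ETS <= 0 indicates no skill."""
    hits = misses = false_alarms = n = 0
    for f, o in zip(forecast, observed):
        n += 1
        f_exceeds, o_exceeds = f >= threshold, o >= threshold
        if f_exceeds and o_exceeds:
            hits += 1
        elif o_exceeds:
            misses += 1
        elif f_exceeds:
            false_alarms += 1
    hits_random = (hits + misses) * (hits + false_alarms) / n
    denom = hits + misses + false_alarms - hits_random
    return (hits - hits_random) / denom if denom else 0.0
```

    A forecast identical to the observations scores exactly 1.0, while a forecast that exceeds the threshold everywhere scores 0.0 against observations that exceed it only half the time.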

    An OSSE Framework Based on the Ensemble Square Root Kalman Filter for Evaluating the Impact of Data from Radar Networks on Thunderstorm Analysis and Forecasting (Journal of Atmospheric and Oceanic Technology, Volume 23)

    A framework for Observing System Simulation Experiments (OSSEs) based on the ensemble square root Kalman filter (EnSRF) technique for assimilating data from more than one radar network is described.
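    The core EnSRF update for a single scalar observation can be sketched as follows. This is a minimal illustration of the technique (in the serial square-root form), not the framework's actual code; the ensemble shapes and the linear observation operator are assumptions for the sketch.

```python
import numpy as np

def ensrf_update(ensemble, obs, obs_var, H):
    """Serial EnSRF update for one scalar observation.
    `ensemble` has shape (n_state, n_members); `H` is a linear
    observation operator given as a length-n_state row vector.
    The mean is updated with the Kalman gain; perturbations are
    updated with a reduced gain so the analysis spread matches the
    Kalman-filter variance without perturbing the observation."""
    n = ensemble.shape[1]
    x_mean = ensemble.mean(axis=1)
    X = ensemble - x_mean[:, None]          # state perturbations
    Hx = H @ ensemble                        # obs-space ensemble
    Hxp = Hx - Hx.mean()
    PHt = X @ Hxp / (n - 1)                  # cov(state, obs-space)
    HPHt = Hxp @ Hxp / (n - 1)               # obs-space variance
    K = PHt / (HPHt + obs_var)               # Kalman gain
    alpha = 1.0 / (1.0 + np.sqrt(obs_var / (HPHt + obs_var)))
    x_mean_a = x_mean + K * (obs - Hx.mean())
    X_a = X - alpha * np.outer(K, Hxp)       # square-root update
    return x_mean_a[:, None] + X_a
```

    After the update, the analysis mean sits between the background mean and the observation, and the ensemble spread in observation space is reduced, which is the behavior a data-denial OSSE exploits when comparing radar-network configurations.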

    A Variational Method for the Analysis of Three-Dimensional Wind Fields from Two Doppler Radars (Monthly Weather Review, Volume 127, © 1999 American Meteorological Society)

    This paper proposes a new method of dual-Doppler radar analysis based on a variational approach. In it, a cost function, defined as the distance between the analysis and the observations at the data points, is minimized through a limited-memory quasi-Newton conjugate-gradient algorithm, with the mass continuity equation imposed as a weak constraint. The analysis is performed in Cartesian space. Compared with traditional methods, the variational method offers much more flexibility in its use of observational data and various constraints. Using the radar data directly at observation locations avoids an interpolation step, which is often a source of error, especially in the presence of data voids. In addition, using the mass continuity equation as a weak instead of a strong constraint avoids the error accumulation, and the subsequent somewhat arbitrary adjustment, associated with explicit vertical integration of the continuity equation. The current method is tested on both model-simulated and observed datasets of supercell storms. It is shown that the circulation inside and around the storms, including the strong updraft and associated downdraft, is well analyzed in both cases. Furthermore, the authors found that the analysis is not very sensitive to the specification of boundary conditions or to data contamination. The method also has the potential for retrieving, with reasonable accuracy, the wind in regions of single-Doppler radar coverage.
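    The weak-constraint idea can be illustrated on a toy two-dimensional (x, z) analysis: fit radial-velocity observations from two viewing directions while penalizing the continuity residual, and minimize with a limited-memory quasi-Newton method (L-BFGS-B here). The grid size, look angles, and penalty weight are illustrative assumptions, not the paper's configuration.

```python
import numpy as np
from scipy.optimize import minimize

nx, nz, dx = 6, 6, 1.0
lam = 10.0  # weak-constraint weight on the continuity residual

# Synthetic truth: uniform horizontal flow, no vertical motion.
u_true = np.ones((nz, nx))
w_true = np.zeros((nz, nx))

# Two radars with different look angles project (u, w) differently,
# so together they determine both components at each grid point.
angles = (np.deg2rad(30.0), np.deg2rad(150.0))
obs = [np.cos(a) * u_true + np.sin(a) * w_true for a in angles]

def cost(vec):
    """Observation misfit plus penalized continuity violation."""
    u, w = vec.reshape(2, nz, nx)
    j_obs = sum(np.sum((np.cos(a) * u + np.sin(a) * w - y) ** 2)
                for a, y in zip(angles, obs))
    # du/dx + dw/dz imposed as a weak (penalty) constraint.
    div = np.gradient(u, dx, axis=1) + np.gradient(w, dx, axis=0)
    return j_obs + lam * np.sum(div ** 2)

res = minimize(cost, np.zeros(2 * nz * nx), method="L-BFGS-B")
u_a, w_a = res.x.reshape(2, nz, nx)  # recovers u ~ 1, w ~ 0
```

    Because the constraint enters the cost function rather than being integrated vertically, no explicit boundary-condition adjustment is needed, which mirrors the advantage the abstract describes.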