113 research outputs found

    An SMP Soft Classification Algorithm for Remote Sensing

    This work introduces a symmetric multiprocessing (SMP) version of the continuous iterative guided spectral class rejection (CIGSCR) algorithm, a semiautomated classification algorithm for remote sensing (multispectral) images. The algorithm uses soft data clusters to produce a soft classification containing inherently more information than a comparable hard classification, at an increased computational cost. Previous work suggests that similar algorithms achieve good parallel scalability, motivating the parallel algorithm development work here. Experimental results of applying parallel CIGSCR to an image with approximately 10^8 pixels and six bands demonstrate superlinear speedup. A soft two-class classification is generated in just over four minutes using 32 processors.
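
    As a rough illustration of the soft-classification idea (not the CIGSCR algorithm itself), the sketch below assigns each pixel Gaussian-style soft memberships in a set of spectral clusters and sums them into per-class probabilities; the cluster centers, the cluster-to-class map, and the weighting scheme are all assumptions for illustration.

        import numpy as np

        def soft_memberships(pixels, centers, beta=1.0):
            # pixels:  (n_pixels, n_bands) spectral values
            # centers: (n_clusters, n_bands) cluster means (hypothetical input)
            d2 = ((pixels[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
            w = np.exp(-beta * d2)                     # Gaussian-style weights
            return w / w.sum(axis=1, keepdims=True)    # each row sums to one

        def soft_classification(memberships, cluster_class, n_classes):
            # Sum soft cluster memberships into per-class probabilities.
            probs = np.zeros((memberships.shape[0], n_classes))
            for k, c in enumerate(cluster_class):
                probs[:, c] += memberships[:, k]
            return probs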

    Parallel Deterministic and Stochastic Global Minimization of Functions with Very Many Minima

    The optimization of three problems with high dimensionality and many local minima is investigated under five different optimization algorithms: DIRECT, simulated annealing, Spall's SPSA algorithm, the KNITRO package, and QNSTOP, a new algorithm developed at Indiana University.
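
    A minimal sketch of this kind of deterministic-versus-stochastic comparison, using SciPy's DIRECT and dual-annealing routines as stand-ins for the solvers studied in the paper, applied to the Griewank function, a standard test problem with very many local minima:

        import numpy as np
        from scipy.optimize import direct, dual_annealing  # direct requires SciPy >= 1.9

        def griewank(x):
            # Classic multimodal test function with very many local minima.
            i = np.arange(1, x.size + 1)
            return 1.0 + np.sum(x * x) / 4000.0 - np.prod(np.cos(x / np.sqrt(i)))

        bounds = [(-600.0, 600.0)] * 10                  # 10-dimensional search box

        det = direct(griewank, bounds, maxfun=20000)     # deterministic division of the box
        sto = dual_annealing(griewank, bounds, seed=0)   # stochastic annealing search

        print("DIRECT minimum:        ", det.fun)
        print("dual_annealing minimum:", sto.fun)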

    The Influence of Land Use/Land Cover on Climatological Values of the Diurnal Temperature Range

    The diurnal temperature range (DTR) at weather observation stations that make up the U.S. Historical Climatology Network was evaluated with respect to the predominant land use/land cover (LULC) within three radii (100, 1000, and 10 000 m) of the stations. Stations associated with predominantly rural LULC usually displayed the greatest observed DTR, whereas those associated with urban-related land use or land cover displayed the least. The results of this study suggest that significant differences in the climatological DTR were observed and could be attributed to the predominant LULC associated with the observation stations. The results also suggest that changes in the predominant LULC, within as great as a 10 000 m radius of an observation station, could significantly influence the climatological DTR. Future changes in the predominant LULC associated with observation sites should be monitored, similar to the current practice of monitoring changes in instruments or time of observation at those sites.
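
    A hedged sketch of the kind of comparison described above (not the study's actual analysis), assuming a hypothetical station table with a climatological DTR column and one predominant-LULC column per radius:

        import pandas as pd

        def dtr_by_lulc(stations: pd.DataFrame) -> dict:
            # stations: one row per station with columns 'dtr', 'lulc_100m',
            # 'lulc_1000m', and 'lulc_10000m' (all column names are assumptions).
            summaries = {}
            for radius_col in ("lulc_100m", "lulc_1000m", "lulc_10000m"):
                summaries[radius_col] = (
                    stations.groupby(radius_col)["dtr"].agg(["mean", "std", "count"])
                )
            return summaries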

    Temporal variations of extreme precipitation events in the United States: 1895–2000

    A newly available data set of daily precipitation observations was used to study the temporal variability of the frequency of short-duration extreme precipitation events for 1895–2000 in the conterminous United States. Event durations of 1, 5, 10, and 30 days and return periods of 1, 5, and 20 years were analyzed. For all combinations of duration and return period, heavy precipitation frequencies were relatively high during the late 19th and early 20th centuries, decreased to a minimum in the 1920s and 1930s, and then generally increased into the 1990s. For some combinations of duration and return period, the frequencies at the beginning of the 20th century were nearly as high as during the late 20th century, suggesting that natural variability cannot be discounted as an important contributor to the recent high values. Extensive quality control of the data and Monte Carlo testing were performed to provide confidence in the reality of the early-period high frequencies.
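
    A simplified sketch of one duration/return-period combination (not the study's method): form n-day totals from a daily precipitation series, take the empirical threshold implied by the chosen return period, and count exceedances per year. Overlapping windows and event declustering, which a real analysis must handle, are ignored here, and all names are illustrative.

        import pandas as pd

        def extreme_event_counts(precip: pd.Series, duration_days=5, return_period_yr=5):
            # precip: daily totals indexed by a DatetimeIndex (hypothetical input).
            totals = precip.rolling(duration_days).sum().dropna()
            years = (totals.index[-1] - totals.index[0]).days / 365.25
            n_expected = max(int(round(years / return_period_yr)), 1)
            threshold = totals.nlargest(n_expected).min()    # empirical return-period threshold
            exceed = totals[totals >= threshold]
            return exceed.groupby(exceed.index.year).size()  # exceedances per year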

    Trend Identification in Twentieth-Century U.S. Snowfall: The Challenges

    There is an increasing interest in examining long-term trends in measures of snow climatology. An examination of the U.S. daily snowfall records for 1900–2004 revealed numerous apparent inconsistencies. For example, long-term snowfall trends among neighboring lake-effect stations differ greatly, from insignificant to +100% per century. Internal inconsistencies in the snow records, such as a lack of upward trends in maximum seasonal snow depth at stations with large upward trends in snowfall, point to inhomogeneities. Nationwide, the frequency of daily observations with a 10:1 snowfall-to-liquid-equivalent ratio declined from 30% in the 1930s to a current value of around 10%, a change that is clearly due to observational practice. There must therefore be biases in cold-season liquid-equivalent precipitation, or snowfall, or both. An empirical adjustment of snow-event, liquid-equivalent precipitation indicates that the potential biases can be statistically significant. Examples from this study show that there are nonclimatic issues that complicate the identification of, and significantly change, the trends in snow variables. Thus, great care should be taken in interpreting time series of snow-related variables from the Cooperative Observer Program (COOP) network. Furthermore, full documentation of optional practices should be required of network observers so that future users of these data can properly account for such practices.
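
    The 10:1 statistic mentioned above can be illustrated with a short sketch (column names are assumptions, and this is not the study's code): for each decade, compute the fraction of snowfall observations whose snowfall-to-liquid-equivalent ratio is approximately 10:1.

        import numpy as np
        import pandas as pd

        def ten_to_one_fraction(df: pd.DataFrame, tol=0.05) -> pd.Series:
            # df: daily records with a DatetimeIndex and columns 'snowfall' and
            # 'liquid_equiv' in consistent units (hypothetical names).
            obs = df[(df["snowfall"] > 0) & (df["liquid_equiv"] > 0)]
            ratio = obs["snowfall"] / obs["liquid_equiv"]
            is_10_to_1 = np.isclose(ratio, 10.0, rtol=tol)
            decade = (obs.index.year // 10) * 10
            return pd.Series(is_10_to_1, index=obs.index).groupby(decade).mean()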

    Power Saving Experiments for Large Scale Global Optimization

    Green computing, an emerging field of research that seeks to reduce excess power consumption in high performance computing (HPC), is gaining popularity among researchers. Research in this field often relies on simulation or uses only a small cluster, typically 8 or 16 nodes, because of the lack of hardware support. In contrast, System G at Virginia Tech is a 2592-processor supercomputer equipped with power-aware components suitable for large-scale green computing research. DIRECT is a deterministic global optimization algorithm, implemented in the mathematical software package VTDIRECT95. This paper explores the potential energy savings for the parallel implementation of DIRECT, called pVTdirect, when used with a large-scale computational biology application, parameter estimation for a budding yeast cell cycle model, on System G. Two power-aware approaches for pVTdirect are developed and compared against the CPUSPEED power-saving system tool. The results show that knowledge of the parallel workload of the underlying application is beneficial for power management.
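
    As a generic illustration of workload-aware power management (not the two approaches developed in the paper), the sketch below switches the Linux cpufreq governor to "powersave" around a phase where a worker process is expected to be idle or waiting, then restores it; it assumes root access to the standard sysfs cpufreq interface, and the wrapped wait function is hypothetical.

        import contextlib
        import glob

        GOV_FILES = glob.glob("/sys/devices/system/cpu/cpu*/cpufreq/scaling_governor")

        def _set_governor(name):
            for path in GOV_FILES:           # requires root privileges
                with open(path, "w") as f:
                    f.write(name)

        @contextlib.contextmanager
        def low_power_phase():
            # Drop CPU frequency while the process is expected to be idle or
            # waiting, then restore the performance governor afterward.
            _set_governor("powersave")
            try:
                yield
            finally:
                _set_governor("performance")

        # Hypothetical usage around an idle/wait phase of a parallel run:
        # with low_power_phase():
        #     wait_for_remote_results()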

    Quality Control of Pre-1948 Cooperative Observer Network Data

    A recent comprehensive effort to digitize U.S. daily temperature and precipitation data observed prior to 1948 has resulted in a major enhancement in the computer database of the records of the National Weather Service's cooperative observer network. Previous digitization efforts had been selective, concentrating on state or regional areas. Special quality control procedures were applied to these data to enhance their value for climatological analysis. The procedures involved a two-step process. In the first step, each individual temperature and precipitation data value was evaluated against a set of objective screening criteria to flag outliers. These criteria included extreme limits and spatial comparisons with nearby stations. The following data were automatically flagged: 1) all precipitation values exceeding 254 mm (10 in.) and 2) all temperature values whose anomaly from the monthly mean for that station exceeded five standard deviations. Additional values were flagged based on differences with nearby stations; in this case, metrics were used to rank outliers so that the limited resources were concentrated on those values most likely to be invalid. In the second step, each outlier was manually assessed by climatologists and assigned one of the four following flags: valid, plausible, questionable, or invalid. In excess of 22 400 values were manually assessed, of which about 48% were judged to be invalid. Although additional manual assessment of outliers might further improve the quality of the database, the procedures applied in this study appear to have been successful in identifying the most flagrant errors.
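
    A minimal sketch of the automated screening step described above (column names and the grouping scheme are assumptions, not the actual QC code): flag precipitation totals above 254 mm and temperatures more than five standard deviations from the station's monthly mean.

        import pandas as pd

        def flag_outliers(df: pd.DataFrame) -> pd.DataFrame:
            # df: daily records with columns 'station', 'date' (datetime64),
            # 'precip_mm', and 'temp' (hypothetical names).
            flags = pd.DataFrame(index=df.index)
            # Rule 1: flag all precipitation values exceeding 254 mm (10 in.).
            flags["precip_extreme"] = df["precip_mm"] > 254.0
            # Rule 2: flag temperatures whose anomaly from the station's monthly
            # mean exceeds five standard deviations.
            monthly = df.groupby(["station", df["date"].dt.month])["temp"]
            anomaly = (df["temp"] - monthly.transform("mean")).abs()
            flags["temp_outlier"] = anomaly > 5.0 * monthly.transform("std")
            return flags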

    Pharmacotherapy and pregnancy: Highlights from the first International Conference for Individualized Pharmacotherapy in Pregnancy

    Data are sparse on the effects of medication use during pregnancy. Half of the world's population is women. The majority of women become pregnant, and many of those women take some kind of medication during their pregnancy, even if only for a short time. The majority of drugs have not been rigorously studied in pregnant women to determine the most effective dose with the least potential for adverse effects. Instead, women are given "cookie-cutter" therapy, using doses extrapolated from nonpregnant women, men, or pregnant animals, which can lead to problems. Individualization of pharmacotherapy in pregnancy, by contrast, promises to determine the optimal drug and dose for each woman, maximizing the effect of the drug while attempting to minimize the side effects to her and her unborn baby. Because this field of study is underrepresented, we held a conference to bring together researchers and experts to discuss current knowledge, issues, and challenges surrounding individualized pharmacotherapy in pregnancy. Speakers came from the NIH, the Food and Drug Administration (FDA), and various research centers in the United States and Canada. Below are the summaries of the discussions at the conference. Full notes from the panel discussions are available from the authors on request.