
    Improving passive microwave sea ice concentration algorithms for coastal areas: applications to the Baltic Sea

    Sea ice concentration can be retrieved from passive microwave data using, for example, the NASA Team algorithm or the Artist Sea Ice (ASI) algorithm. Brightness temperature measurements from the Special Sensor Microwave Imager (SSM/I) or the Advanced Microwave Scanning Radiometer-EOS (AMSR-E) are commonly used for this purpose. Owing to the coarse resolution of these instruments, considerable systematic ice concentration errors occur in coastal regions: in the vicinity of the coast, the instrument footprints usually contain both land and sea surfaces. Compared to sea surfaces, land surfaces are characterized by higher emissivities and lower polarization differences at the involved microwave channels, which causes a systematic overestimation of coastal ice concentration. In this paper, a method is developed to remove the land impact on the observed radiation. By combining a high-resolution shoreline data set with the antenna gain function, the brightness temperature contribution originating from land surfaces can be identified, and the brightness temperature related to the ocean fraction within the considered footprint can then be extracted. This separation technique is applied to SSM/I measurements in the Baltic Sea, and the resulting ice concentration fields are compared to high-resolution satellite images. The highly complex shoreline of the Baltic Sea region provides an ideal test area for the method; the presented approach can, however, also be applied to Arctic coastal regions. It is shown that the method considerably improves ice concentration retrieval in regions influenced by land surfaces without removing actually existing sea ice.
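    The core of the correction can be sketched as a linear unmixing of the footprint brightness temperature. The function below is an illustrative simplification, not the paper's exact formulation: it assumes the land fraction of the footprint (derived from the antenna gain integrated over a shoreline mask) and an estimate of the land brightness temperature are already known.

```python
def ocean_brightness_temp(tb_obs, f_land, tb_land):
    """Remove the land contribution from an observed footprint
    brightness temperature, assuming a linear mixture:
        tb_obs = f_land * tb_land + (1 - f_land) * tb_ocean
    Names and the linear-mixing assumption are illustrative.
    """
    if not 0.0 <= f_land < 1.0:
        raise ValueError("land fraction must be in [0, 1)")
    return (tb_obs - f_land * tb_land) / (1.0 - f_land)

# Example: 40% land in the footprint; land appears warm
# (high emissivity), so the raw value overestimates the ocean signal.
tb = ocean_brightness_temp(tb_obs=230.0, f_land=0.4, tb_land=270.0)
```

    Because the denominator shrinks as the land fraction grows, the correction amplifies radiometric noise near the coast, which is why validation against high-resolution imagery matters.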

    Folding and unfolding of a triple-branch DNA molecule with four conformational states

    Single-molecule experiments provide new insights into biological processes hitherto not accessible by measurements performed on bulk systems. We report on a study of the kinetics of a triple-branch DNA molecule with four conformational states, using pulling experiments with optical tweezers and theoretical modelling. Three distinct force rips associated with different transitions between the conformational states are observed in the folding and unfolding trajectories. By applying transition rate theory to a free-energy model of the molecule, probability distributions for the first rupture forces of the different transitions are calculated, and good agreement between the theoretical predictions and the experimental findings is achieved. Furthermore, owing to our specific design of the molecule, we found a useful method to identify permanently frayed molecules by estimating the number of opened base pairs from the measured force-jump values.
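    A standard way to obtain a first-rupture-force distribution from transition rate theory is the Bell-Evans picture: a force-dependent rate k(F) = k0·exp(F·x‡/kBT) combined with a constant loading rate r. The sketch below uses this textbook form as a stand-in for the paper's free-energy model; the parameter values (pN, nm, zJ at room temperature) are illustrative, not fitted to the experiment.

```python
import numpy as np

def rupture_force_pdf(F, k0, x_dag, r, kBT=4.11):
    """First-rupture-force density at constant loading rate r,
    using the Bell-Evans rate k(F) = k0 * exp(F * x_dag / kBT).
    Survival probability: S(F) = exp(-(k(F) - k0) * kBT / (r * x_dag)),
    obtained by integrating k(f)/r from 0 to F.
    """
    k = k0 * np.exp(F * x_dag / kBT)
    survival = np.exp(-(k - k0) * kBT / (r * x_dag))
    return (k / r) * survival

F = np.linspace(0.0, 30.0, 3001)                     # force grid, pN
p = rupture_force_pdf(F, k0=1e-3, x_dag=2.0, r=5.0)  # k0 in 1/s, r in pN/s
# Trapezoidal normalization check: the density should integrate to ~1
norm = float(np.sum(0.5 * (p[1:] + p[:-1]) * np.diff(F)))
```

    With these illustrative parameters the density peaks near 16 pN; in the experiment each of the three transitions would carry its own k0 and x‡, giving three such distributions.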

    Simple Lattice Models of Ion Conduction: Counter Ion Model vs. Random Energy Model

    The role of the Coulomb interaction between the mobile particles in ionic conductors is still under debate. To clarify this aspect, we perform Monte Carlo simulations on two simple lattice models (the Counter Ion Model and the Random Energy Model) that contain Coulomb interaction between the positively charged mobile particles moving on a static disordered energy landscape. We find that the nature of the static disorder plays an important role if one wishes to explore the impact of the Coulomb interaction on the microscopic dynamics: this interaction impedes the dynamics in the Random Energy Model but enhances it in the Counter Ion Model in the relevant parameter range. (To be published in Phys. Rev.)
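    The basic simulation ingredient is Metropolis hopping on a disordered energy landscape. The sketch below is a deliberately stripped-down single-particle version without the Coulomb term, just to show how disorder strength relative to temperature controls mobility; the paper's models are many-particle and interacting.

```python
import math
import random

def metropolis_walk(energies, steps, beta=1.0, seed=0):
    """Metropolis hop dynamics of one particle on a 1-D ring of
    quenched random site energies (a minimal Random-Energy-Model-like
    landscape, no Coulomb term).  Returns the number of accepted
    hops, a crude proxy for mobility.
    """
    rng = random.Random(seed)
    n = len(energies)
    pos, accepted = 0, 0
    for _ in range(steps):
        trial = (pos + rng.choice((-1, 1))) % n        # nearest-neighbor hop
        dE = energies[trial] - energies[pos]
        if dE <= 0 or rng.random() < math.exp(-beta * dE):
            pos, accepted = trial, accepted + 1        # Metropolis acceptance
    return accepted

rng = random.Random(42)
sites = [rng.gauss(0.0, 1.0) for _ in range(100)]      # quenched disorder
hops_hot = metropolis_walk(sites, 10_000, beta=0.5)    # weak disorder/temperature
hops_cold = metropolis_walk(sites, 10_000, beta=5.0)   # deep traps dominate
```

    At low temperature (large beta) the particle is trapped in local minima and the hop count collapses, which is the regime where the presence or absence of interactions makes a qualitative difference in the full models.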

    Scaling behavior in economics: II. Modeling of company growth

    In the preceding paper we presented empirical results describing the growth of publicly traded United States manufacturing firms within the years 1974-1993. Our results suggest that the data can be described by a scaling approach. Here, we propose models that may lead to some insight into these phenomena. First, we study a model in which the growth rate of a company is affected by a tendency to retain an "optimal" size. That model leads to an exponential distribution of the logarithm of the growth rate, in agreement with the empirical results. Then, we study a hierarchical tree-like model of a company that enables us to relate the two parameters of the model to the exponent β, which describes the dependence of the standard deviation of the distribution of growth rates on size. We find that β = -ln Π / ln z, where z defines the mean branching ratio of the hierarchical tree and Π is the probability that the lower levels follow the policy of higher levels in the hierarchy. We also study the distribution of growth rates of this hierarchical model and find that it is consistent with the exponential form found empirically. (To appear in J. Phys. I France, April 1997.)
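    The closed-form relation β = -ln Π / ln z is easy to evaluate; the illustrative parameter values below (branching ratio 4, 75% compliance) are chosen by us to show that plausible tree parameters reproduce an exponent near the empirical 0.20 ± 0.03 reported in paper I.

```python
import math

def beta_from_tree(pi, z):
    """Scaling exponent predicted by the hierarchical-tree model:
    beta = -ln(pi) / ln(z), where z is the mean branching ratio and
    pi the probability that a lower level follows the policy of the
    level above it.
    """
    return -math.log(pi) / math.log(z)

# Illustrative: z = 4, pi = 0.75  ->  beta ~ 0.21
beta = beta_from_tree(pi=0.75, z=4)
```

    Note the two limits: full compliance (Π → 1) gives β → 0, i.e. fluctuations independent of size, while independent subunits (Π → 1/z-like behavior) push β toward the central-limit value 1/2.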

    Scaling behavior in economics: I. Empirical results for company growth

    We address the question of the growth of firm size. To this end, we analyze the Compustat data base comprising all publicly traded United States manufacturing firms within the years 1974-1993. We find that the distribution of firm sizes remains stable for the 20 years we study, i.e., the mean value and standard deviation remain approximately constant. We study the distribution of sizes of the "new" companies in each year and find it to be well approximated by a log-normal. We find that (i) the distribution of the logarithm of the growth rates, for a fixed growth period of one year and for companies with approximately the same size S, displays an exponential form, and (ii) the fluctuations in the growth rates, measured by the width σ_1 of this distribution, scale as a power law with S: σ_1 ~ S^{-β}. We find that the exponent β takes the same value, within the error bars, for several measures of the size of a company. In particular, we obtain β = 0.20 ± 0.03 for sales, β = 0.18 ± 0.03 for number of employees, β = 0.18 ± 0.03 for assets, β = 0.18 ± 0.03 for cost of goods sold, and β = 0.20 ± 0.03 for property, plant, & equipment. (To appear in J. Phys. I France, April 1997.)
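    The scaling law σ_1 ~ S^{-β} is typically estimated as a slope in log-log coordinates. The sketch below fits β by least squares on synthetic data standing in for the Compustat measurements; the size range and prefactor are our own illustrative choices.

```python
import numpy as np

def fit_beta(sizes, widths):
    """Estimate beta in sigma_1 ~ S^{-beta} by a least-squares line
    fit in log-log coordinates; beta is minus the fitted slope.
    """
    slope, _ = np.polyfit(np.log(sizes), np.log(widths), 1)
    return -slope

S = np.logspace(4, 10, 7)      # hypothetical firm sizes (e.g. sales in $)
sigma = 2.0 * S ** -0.20       # exact power law with beta = 0.20
```

    On real data the widths at each size would first be measured by binning firms into logarithmic size classes, so the fit carries error bars of the kind quoted above.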

    Grid enabled high throughput virtual screening against four different targets implicated in malaria

    After having deployed a first data challenge on malaria in summer 2005 and a second one on avian flu in spring 2006, we demonstrate here again how efficiently computational grids can be used to produce massive docking data at high throughput. Over more than two and a half months, we achieved at least 140 million dockings, representing an average throughput of almost 80,000 dockings per hour. This was made possible by the availability of thousands of CPUs through different infrastructures worldwide. Through the acquired experience, the WISDOM production environment is evolving to enable an easy and fault-tolerant deployment of biological tools; in this case, the FlexX commercial docking software was used to dock the whole ZINC database against four different targets.
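    The quoted throughput follows directly from the run length: 140 million dockings over roughly two and a half months works out to just under 80,000 dockings per hour. The 30-day month is our simplifying assumption; the abstract itself only says "more than" two and a half months.

```python
# Sanity check on the reported figures (30-day months assumed).
dockings = 140_000_000
hours = 2.5 * 30 * 24          # ~1800 hours of wall-clock time
throughput = dockings / hours  # dockings per hour, ~78,000
```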