
    Boltzmann-Gibbs thermal equilibrium distribution for classical systems and Newton law: A computational discussion

    We implement a general numerical calculation that allows for a direct comparison between nonlinear Hamiltonian dynamics and the Boltzmann-Gibbs canonical distribution in Gibbs Γ-space. Using paradigmatic first-neighbor models, namely the inertial XY ferromagnet and the Fermi-Pasta-Ulam β-model, we show that at intermediate energies the Boltzmann-Gibbs equilibrium distribution is a consequence of Newton's second law (F = ma). At higher energies we discuss the partial agreement between time and ensemble averages.
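    The kind of comparison described above can be sketched in miniature: integrate an FPU-β chain with a symplectic integrator, accumulate the momenta along the trajectory, and check the time-averaged kinetic temperature against equipartition. This is an illustrative toy (the chain size, coupling β, time step, and sampling schedule below are arbitrary assumptions, not the paper's setup):

    ```python
    import random

    def fpu_forces(q, beta):
        # Forces for the FPU-beta chain with periodic boundaries:
        # V = sum_i [ (q_{i+1}-q_i)^2 / 2 + beta * (q_{i+1}-q_i)^4 / 4 ]
        n = len(q)
        f = [0.0] * n
        for i in range(n):
            dr = q[(i + 1) % n] - q[i]      # right bond stretch
            dl = q[i] - q[(i - 1) % n]      # left bond stretch
            f[i] = (dr + beta * dr ** 3) - (dl + beta * dl ** 3)
        return f

    def energy(q, p, beta):
        n = len(q)
        kin = sum(pi * pi for pi in p) / 2.0
        pot = sum((q[(i + 1) % n] - q[i]) ** 2 / 2.0
                  + beta * (q[(i + 1) % n] - q[i]) ** 4 / 4.0
                  for i in range(n))
        return kin + pot

    def leapfrog(q, p, beta, dt, steps, sample_every=10):
        # Velocity-Verlet integration; collects momentum samples along the way.
        samples = []
        f = fpu_forces(q, beta)
        for step in range(steps):
            p = [pi + 0.5 * dt * fi for pi, fi in zip(p, f)]
            q = [qi + dt * pi for qi, pi in zip(q, p)]
            f = fpu_forces(q, beta)
            p = [pi + 0.5 * dt * fi for pi, fi in zip(p, f)]
            if step % sample_every == 0:
                samples.extend(p)
        return q, p, samples

    random.seed(0)
    n, beta, dt = 16, 1.0, 0.01
    q0 = [0.0] * n
    p0 = [random.gauss(0.0, 1.0) for _ in range(n)]
    e0 = energy(q0, p0, beta)
    q1, p1, momenta = leapfrog(q0, p0, beta, dt, steps=20000)
    e1 = energy(q1, p1, beta)

    # Equipartition: the time average <p^2> estimates the temperature T;
    # under Boltzmann-Gibbs the momentum histogram approaches exp(-p^2 / 2T).
    T_est = sum(pi * pi for pi in momenta) / len(momenta)
    print(f"relative energy drift: {abs(e1 - e0) / e0:.2e}")
    print(f"kinetic temperature estimate: {T_est:.3f}")
    ```

    Comparing the histogram of `momenta` against the Gaussian with variance `T_est` is the time-average side of the time-versus-ensemble comparison the abstract refers to.
    
    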

    Standardizing Geospatial Information for New England Conservation Lands: Data Capture Methods and Technology

    In 2002 the New England Environmental Finance Center and Applied Geographics issued the Feasibility Study for a GIS Inventory of New England Conservation Lands, describing the status of conservation lands data throughout EPA Region 1 (New England). This report identified the stakeholders and technologies participating in the maintenance of conservation lands data within this region. In the four years since that initial report, dramatic changes have occurred in the technical means by which geographic data are delivered from their respective repositories. These changes have been most pronounced and obvious in the area of web mapping services. Web mapping services are software utilities by which diverse and frequently unrelated geographic data sets are structured and symbolized for consumption by remote clients over the Internet. More generally, they represent a kind of democratization of access to digitally mapped data, providing tools and content (often free) from remote servers that can be consumed by an end user with only a web browser or a small software download, and with little or no technical expertise. This method of delivery stands in striking contrast to the preceding era of GIS evolution, in which all data and tools were closely held and generally inaccessible by dint of their expense and technical complexity.

    Standardizing Geospatial Information for New England Conservation Lands: Perpetual Data Maintenance and Distributed Data

    This brief paper addresses the problem in microcosm as it occurs in the case of conservation lands data for the northeastern United States. Through an ongoing initiative with Applied Geographics in Boston, the New England Environmental Finance Center (NE/EFC) has worked to identify friction points and opportunities for increased efficiency in the conservation lands data capture and standardization process across the EPA Region 1 (New England) area. Like other thematic layers, conservation lands data are typically best captured as polygons that carry tabular attribution of varying complexity, depending upon which state or organization collects and maintains them. For example, Massachusetts has collected information on more than 30,000 parcels and informed these polygons with a fully relational database that contains dozens of tables and nearly 100 active attribute fields. Maine is at the other extreme: with four times the overall land area, it has barely one-twentieth the number of cataloged conservation properties and a very restricted set of tabular data associated with them. Most of the properties that have fallen through the cracks in Maine belong to the municipal or land-trust categories. These are prime candidates for distributed data capture, being broken into small jurisdictions where a large number of local experts have very clear knowledge of their own area but no easy means of passing this knowledge on to others working in a regional, state, or federal capacity.

    Resident macrophages and their potential in cardiac tissue engineering

    Many facets of tissue-engineered models aim to understand cellular mechanisms in order to recapitulate in vivo behavior, to study and mimic diseases for drug interventions, and to provide better understanding toward improving regenerative medicine. Recent and rapid advances in stem cell biology, materials science, and engineering have made the generation of complex engineered tissues much more attainable. One such tissue, human myocardium, is extremely intricate, with a number of different cell types. Recent studies have revealed cardiac resident macrophages as a critical mediator of normal cardiac function. Macrophages within the heart perform phagocytosis and efferocytosis, facilitate electrical conduction, promote regeneration, and remove cardiac exophers to maintain homeostasis. These findings underpin the rationale for introducing macrophages into engineered heart tissue to more aptly recapitulate in vivo physiology. Despite the lack of studies using cardiac macrophages in vitro, there is enough evidence to believe that they will be useful in making engineered heart tissues more physiologically relevant. In this review, we explore the rationale and feasibility of using macrophages as an additional cell source in engineered cardiac tissues.

    On Link Estimation in Dense RPL Deployments

    The Internet of Things vision foresees billions of devices connecting the physical world to the digital world. Sensing applications such as structural health monitoring, surveillance, or smart buildings employ multi-hop wireless networks with high density to attain sufficient area coverage. Such applications need networking stacks and routing protocols that can scale with network size and density while remaining energy-efficient and lightweight. To this end, the IETF ROLL working group has designed the IPv6 Routing Protocol for Low-Power and Lossy Networks (RPL). This paper discusses the problems of link quality estimation and neighbor management policies when it comes to handling high densities. We implement and evaluate different neighbor management policies and link probing techniques in Contiki's RPL implementation. We report on our experience with a 100-node testbed with an average node degree of 40. We show the sensitivity of high-density routing with respect to cache sizes and routing metric initialization. Finally, we devise guidelines for the design and implementation of density-scalable routing protocols.
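    The two mechanisms the abstract couples, link quality estimation and a bounded neighbor cache, can be sketched generically. This is an illustrative model only, not Contiki's actual RPL code: the EWMA weight `ALPHA`, the cache size, and the evict-the-worst policy are assumptions chosen for the sketch.

    ```python
    from collections import OrderedDict

    ALPHA = 0.9  # EWMA weight given to the previous ETX estimate (assumed value)

    class NeighborTable:
        """Fixed-size neighbor cache with EWMA-smoothed ETX link estimates."""

        def __init__(self, capacity=8):
            self.capacity = capacity
            self.table = OrderedDict()  # neighbor address -> smoothed ETX

        def update(self, addr, tx_attempts, tx_successes):
            # Per-window ETX sample: expected transmissions per delivered packet.
            sample = tx_attempts / max(tx_successes, 1)
            if addr in self.table:
                # Smooth the running estimate so single noisy windows
                # do not trigger parent churn.
                self.table[addr] = ALPHA * self.table[addr] + (1 - ALPHA) * sample
            elif len(self.table) < self.capacity:
                self.table[addr] = sample
            else:
                # Cache full: admit the newcomer only if its first sample
                # beats the worst (highest-ETX) resident, which it evicts.
                worst = max(self.table, key=self.table.get)
                if sample < self.table[worst]:
                    del self.table[worst]
                    self.table[addr] = sample

        def best_parent(self):
            # Lowest smoothed ETX wins parent selection.
            return min(self.table, key=self.table.get) if self.table else None

    nt = NeighborTable(capacity=2)
    nt.update("fe80::1", tx_attempts=10, tx_successes=9)   # ETX ~1.11
    nt.update("fe80::2", tx_attempts=10, tx_successes=5)   # ETX 2.0
    nt.update("fe80::3", tx_attempts=10, tx_successes=10)  # ETX 1.0, evicts fe80::2
    print(nt.best_parent())
    ```

    In a dense deployment the interesting trade-off is exactly the one the paper evaluates: a small `capacity` keeps RAM bounded but makes the admission/eviction policy and the initial metric value decisive for which links ever get estimated.
    
    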

    Applications of Research-Based Learning in an Environmental Education Course for Scientific Writing

    The purpose of this study is to explain the application of research-based learning in environmental education for scientific writing among social studies students. The study uses a quasi-experimental design with two groups: an experimental class and a control class. The t-test showed a significance value of 0.002 (<0.05). This means that research-based learning influences students' ability to write scientifically on the subject of environmental education, with a focus on carrying capacity. The four indicators with the biggest increases were the introduction, literature review, methods, and results and discussion.

    Applications

    The increasing complexity of models used to predict real-world systems leads to the need for algorithms that replace complex models with far simpler ones while preserving the accuracy of the predictions. This three-volume handbook covers methods as well as applications. This third volume focuses on applications in engineering, biomedical engineering, computational physics, and computer science.