42 research outputs found

    Banner News

    Get PDF
    https://openspace.dmacc.edu/banner_news/1451/thumbnail.jp

    Banner News

    Get PDF
    https://openspace.dmacc.edu/banner_news/1447/thumbnail.jp

    Banner News

    Get PDF
    https://openspace.dmacc.edu/banner_news/1443/thumbnail.jp

    Benchmarking implementations of functional languages with ‘Pseudoknot’, a float-intensive benchmark

    Get PDF
    Over 25 implementations of different functional languages are benchmarked using the same program, a floating-point-intensive application taken from molecular biology. The principal aspects studied are compile time and execution time for the various implementations that were benchmarked. An important consideration is how the program can be modified and tuned to obtain maximal performance on each language implementation. With few exceptions, the compilers take a significant amount of time to compile this program, though most compilers were faster than the then-current GNU C compiler (GCC version 2.5.8). Compilers that generate C or Lisp are often slower than those that generate native code directly: the cost of compiling the intermediate form is normally a large fraction of the total compilation time. There is no clear distinction between the runtime performance of eager and lazy implementations when appropriate annotations are used: lazy implementations have clearly come of age when it comes to implementing largely strict applications, such as the Pseudoknot program. The speed of C can be approached by some implementations, but to achieve this performance, special measures such as strictness annotations are required by non-strict implementations. The benchmark results have to be interpreted with care. Firstly, a benchmark based on a single program cannot cover a wide spectrum of ‘typical’ applications. Secondly, the compilers vary in the kind and level of optimisations offered, so the effort required to obtain an optimal version of the program is similarly varied.
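    To make the abstract's point about strictness annotations concrete, here is a minimal, hypothetical Haskell sketch (not taken from the Pseudoknot program; the function name sumSquares and the input are invented for illustration). In a lazy, non-strict implementation, forcing the floating-point accumulator with a bang pattern is the kind of annotation that lets performance approach an eager or C version by preventing a build-up of unevaluated thunks.

    {-# LANGUAGE BangPatterns #-}

    -- Hypothetical example of a strictness annotation in a float-intensive loop.
    -- Without the bang pattern, a lazy implementation would build a chain of
    -- thunks for the running sum and evaluate it only at the end.
    sumSquares :: [Double] -> Double
    sumSquares = go 0
      where
        go !acc []       = acc                  -- accumulator forced at each step
        go !acc (x : xs) = go (acc + x * x) xs

    main :: IO ()
    main = print (sumSquares [1 .. 1e6])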

    Archaeological sites as Distributed Long-term Observing Networks of the Past (DONOP)

    Get PDF
    The authors would also like to acknowledge the support of the National Science Foundation, specifically the Arctic Social Sciences Program, and RANNIS (The Icelandic Center for Research). Archaeological records provide a unique source of direct data on long-term human-environment interactions and samples of ecosystems affected by differing degrees of human impact. Distributed long-term datasets from archaeological sites make a significant contribution to establishing local, regional, and continental-scale environmental baselines and can be used to understand the implications of human decision-making and its impacts on the environment and the resources it provides for human use. Deeper temporal environmental baselines are essential for resource and environmental managers to restore biodiversity and build resilience in depleted ecosystems. Human actions are likely to have impacts that reorganize ecosystem structures by reducing diversity through processes such as niche construction. This makes data from archaeological sites key assets for the management of contemporary and future climate change scenarios because they combine information about human behavior, environmental baselines, and biological systems. Sites of this kind collectively form Distributed Long-term Observing Networks of the Past (DONOP), allowing human behavior and environmental impacts to be assessed over space and time. Behavioral perspectives are gained from direct evidence of human actions in response to environmental opportunities and change. Baseline perspectives are gained from data on species, landforms, and ecology over timescales that long predate our typically recent datasets, which only record systems already disturbed by people. And biological perspectives can provide essential data for modern managers wanting to understand and utilize past diversity (i.e., trophic and/or genetic) as a way of revealing, and potentially correcting, weaknesses in our contemporary wild and domestic animal populations.

    Empirical Legal Studies Before 1940: A Bibliographic Essay

    Get PDF
    The modern empirical legal studies movement has well-known antecedents in the law and society and law and economics traditions of the latter half of the 20th century. Less well known is the body of empirical research on legal phenomena from the period prior to World War II. This paper is an extensive bibliographic essay that surveys the English-language empirical legal research from approximately 1940 and earlier. The essay is arranged around the themes in the research: criminal justice, civil justice (general studies of civil litigation, auto accident litigation and compensation, divorce, small claims, jurisdiction and procedure, civil juries), debt and bankruptcy, banking, appellate courts, legal needs, legal profession (including legal education), and judicial staffing and selection. Accompanying the essay is an extensive bibliography of research articles, books, and reports.

    Formulation Pre-screening of Inhalation Powders Using Computational Atom–Atom Systematic Search Method

    Get PDF
    The synthonic modeling approach provides a molecule-centered understanding of the surface properties of crystals. It has been applied extensively to understand crystallization processes. This study aimed to investigate the functional relevance of synthonic modeling to the formulation of inhalation powders by assessing the cohesivity of three active pharmaceutical ingredients (APIs: fluticasone propionate (FP), budesonide (Bud), and salbutamol base (SB)) and the commonly used excipient, α-lactose monohydrate (LMH). It is found that FP (−11.5 kcal/mol) has a higher cohesive strength than Bud (−9.9 kcal/mol) or SB (−7.8 kcal/mol). The prediction correlated directly to cohesive strength measurements using laser diffraction, where the airflow pressure required for complete dispersion (CPP) was 3.5, 2.0, and 1.0 bar for FP, Bud, and SB, respectively. The highest cohesive strength was predicted for LMH (−15.9 kcal/mol), which did not correlate with the CPP value of 2.0 bar (i.e., ranking lower than FP). High FP–LMH adhesive forces (−11.7 kcal/mol) were predicted. However, aerosolization studies revealed that the FP–LMH blends consisted of agglomerated FP particles with a large median diameter (∼4–5 μm) that were not disrupted by LMH. Modeling of the crystal and surface chemistry of LMH identified high electrostatic and H-bond components of its cohesive energy due to the presence of water and hydroxyl groups in lactose, unlike the APIs. A direct comparison of the predicted and measured cohesive balance of LMH with APIs will require a more in-depth understanding of highly hydrogen-bonded systems with respect to the synthonic engineering modeling tool, as well as the influence of agglomerate structure on surface–surface contact geometry. Overall, this research has demonstrated the possible application and relevance of synthonic engineering tools for rapid pre-screening in drug formulation and design.
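    As a rough, hypothetical illustration of the ranking comparison described in this abstract (the numerical values are those quoted above; the helper names byPrediction and byCPP are invented), the sketch below orders the four materials by predicted cohesive strength and by measured CPP, which makes the LMH outlier visible at a glance.

    import Data.List (sortBy)
    import Data.Ord (comparing)

    -- Values quoted in the abstract: predicted cohesive strength (kcal/mol,
    -- more negative = more cohesive) and airflow pressure required for
    -- complete dispersion (CPP, bar).
    materials :: [(String, Double, Double)]
    materials =
      [ ("FP",  -11.5, 3.5)
      , ("Bud",  -9.9, 2.0)
      , ("SB",   -7.8, 1.0)
      , ("LMH", -15.9, 2.0)
      ]

    -- Rank from most to least cohesive under each metric.
    byPrediction, byCPP :: [String]
    byPrediction = [ n | (n, _, _) <- sortBy (comparing (\(_, e, _) -> e)) materials ]
    byCPP        = [ n | (n, _, _) <- sortBy (comparing (\(_, _, p) -> negate p)) materials ]

    main :: IO ()
    main = do
      putStrLn ("Ranking by predicted cohesion: " ++ show byPrediction)  -- ["LMH","FP","Bud","SB"]
      putStrLn ("Ranking by measured CPP:       " ++ show byCPP)         -- ["FP","Bud","LMH","SB"]
      -- The three APIs agree under both metrics; LMH is the outlier noted in
      -- the abstract: most cohesive by prediction, but not by measured CPP.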