
    Expert Testimony on Proximate Cause

    Expert testimony is common in tort litigation, especially on issues of standard of care and cause-in-fact. Rule 704 of the Federal Rules of Evidence and its state counterparts abolished the prohibition of testimony on ultimate issues, opening the possibility of expert testimony on the often crucial issue of proximate cause. The situation is easy to imagine. After counsel has qualified an expert witness and elicited an opinion that the particular act or omission caused the injury in question, counsel might well be tempted to inquire whether the witness has an opinion as to whether the act or omission was a proximate or legal cause of the accident. Or counsel may merge the two lines of inquiry and ask whether the act or omission proximately resulted in the accident or injury to the plaintiff. The inquiry seems harmless; the term proximate is commonly understood to mean only near or close to. The question is not innocuous, however. Expert testimony on the question of proximate cause implicates several restrictions on expert testimony that survive the broad permission of Rule 704, and touches upon the serious issue of the proper roles of expert and fact-finder in the application of law to facts. The few published cases that have considered the issue of expert testimony on proximate cause are split. The question arises far more often, however, than the relative scarcity of reported decisions indicates. This Article addresses the usefulness and propriety of expert testimony on the issue of proximate cause. After briefly defining the concept of proximate cause, this Article argues that expert testimony on proximate cause is inadmissible under Rule 704, despite the general admissibility of testimony on ultimate issues. In addition, opinion on proximate cause is inadmissible because it fails to clear the separate hurdles of Rules 702 and 403 of the Federal Rules of Evidence. A technical expert on standard of care or actual cause is not qualified to opine on the issue of proximate cause and thus fails the expertise test of Rule 702. Furthermore, even the testimony of a genuine expert on the issue of proximate cause should be excluded because such testimony fails the helpfulness test of Rule 702. Finally, expert testimony on the issue of proximate cause is inadmissible under Rule 403 because its probative value is substantially outweighed by the possibility that such testimony will confuse the issues and mislead the jury.

    Open Data, Open Source and Open Standards in chemistry: The Blue Obelisk five years on

    RIGHTS: This article is licensed under the BioMed Central licence at http://www.biomedcentral.com/about/license, which is similar to the Creative Commons Attribution Licence. In brief, you may copy, distribute, and display the work; make derivative works; or make commercial use of the work, under the following conditions: the original author must be given credit, and for any reuse or distribution it must be made clear to others what the license terms of this work are. Abstract. Background: The Blue Obelisk movement was established in 2005 in response to the lack of Open Data, Open Standards and Open Source (ODOSOS) in chemistry. It aims to make it easier to carry out chemistry research by promoting interoperability between chemistry software, encouraging cooperation between Open Source developers, and developing community resources and Open Standards. Results: This contribution looks back on the work carried out by the Blue Obelisk in the past five years and surveys progress and remaining challenges in the areas of Open Data, Open Standards, and Open Source in chemistry. Conclusions: We show that the Blue Obelisk has been very successful in bringing together researchers and developers with common interests in ODOSOS, leading to the development of many useful resources freely available to the chemistry community. Peer Reviewed

    BALL - biochemical algorithms library 1.3

    Background: The Biochemical Algorithms Library (BALL) is a comprehensive rapid application development framework for structural bioinformatics. It provides an extensive C++ class library of data structures and algorithms for molecular modeling and structural bioinformatics. Using BALL as a programming toolbox not only greatly reduces application development times but also helps ensure stability and correctness, by avoiding the error-prone reimplementation of complex algorithms and replacing it with calls into a library that has been well tested by a large number of developers. In the ten years since its original publication, BALL has seen a substantial increase in functionality and numerous other improvements. Results: Here, we discuss BALL's current functionality and highlight the key additions and improvements: support for additional file formats, molecular edit functionality, new molecular mechanics force fields, novel energy minimization techniques, docking algorithms, and support for cheminformatics. Conclusions: BALL is available for all major operating systems, including Linux, Windows, and Mac OS X. It is available free of charge under the GNU Lesser General Public License (LGPL); parts of the code are distributed under the GNU General Public License (GPL). BALL is available as source code and binary packages from the project web site at http://www.ball-project.org. Recently, it has been accepted into the Debian project; integration into further distributions is currently being pursued.

    COordination of Standards in MetabOlomicS (COSMOS): facilitating integrated metabolomics data access

    Metabolomics has become a crucial phenotyping technique in a range of research fields including medicine, the life sciences, biotechnology and the environmental sciences. This necessitates the transfer of experimental information between research groups, as well as potentially to publishers and funders. After the initial efforts of the metabolomics standards initiative, minimum reporting standards were proposed, which included the concepts for metabolomics databases. Community-built standards and infrastructure for metabolomics are still needed to allow storage, exchange, comparison and re-utilization of metabolomics data. The Framework Programme 7 EU initiative 'COordination of Standards in MetabOlomicS' (COSMOS) is developing a robust data infrastructure and exchange standards for metabolomics data and metadata, to support workflows for a broad range of metabolomics applications within the European metabolomics community and to encourage participation by the wider metabolomics and biomedical communities. Here we announce our concepts and efforts, and ask for re-engagement of the metabolomics community, academics and industry, journal publishers, software and hardware vendors, and those interested in standardisation worldwide (addressing missing metabolomics ontologies, the capture of complex metadata, and an XML-based open-source data exchange format), to join us and work towards updating and implementing metabolomics standards.

    Toward interoperable bioscience data

    © The Author(s), 2012. This article is distributed under the terms of the Creative Commons Attribution License. The definitive version was published in Nature Genetics 44 (2012): 121-126, doi:10.1038/ng.1054. To make full use of research data, the bioscience community needs to adopt technologies and reward mechanisms that support interoperability and promote the growth of an open 'data commoning' culture. Here we describe the prerequisites for data commoning and present an established and growing ecosystem of solutions using the shared 'Investigation-Study-Assay' framework to support that vision. The authors also acknowledge the following funding sources in particular: UK Biotechnology and Biological Sciences Research Council (BBSRC) BB/I000771/1 to S.-A.S. and A.T.; UK BBSRC BB/I025840/1 to S.-A.S.; UK BBSRC BB/I000917/1 to D.F.; EU CarcinoGENOMICS (PL037712) to J.K.; US National Institutes of Health (NIH) 1RC2CA148222-01 to W.H. and the HSCI; US MIRADA LTERS DEB-0717390 and Alfred P. Sloan Foundation (ICoMM) to L.A.-Z.; Swiss Federal Government through the Federal Office of Education and Science (FOES) to L.B. and I.X.; EU Innovative Medicines Initiative (IMI) Open PHACTS 115191 to C.T.E.; US Department of Energy (DOE) DE-AC02-06CH11357 and Alfred P. Sloan Foundation (2011-6-05) to J.G.; UK BBSRC SysMO-DB2 BB/I004637/1 and BBG0102181 to C.G.; UK BBSRC BB/I000933/1 to C.S. and J.L.G.; UK MRC UD99999906 to J.L.G.; US NIH R21 MH087336 (National Institute of Mental Health) and R00 GM079953 (National Institute of General Medical Science) to A.L.; NIH U54 HG006097 to J.C. and C.E.S.; Australian government through the National Collaborative Research Infrastructure Strategy (NCRIS); BIRN U24-RR025736 and BioScholar RO1-GM083871 to G.B.; and the 2009 Super Science initiative to C.A.S.
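    The 'Investigation-Study-Assay' (ISA) framework referred to above organizes experimental metadata as investigations that contain studies, which in turn contain assays. As a rough illustration only (the class and field names below are hypothetical stand-ins chosen for this sketch, not the API of the real ISA software tools), that containment hierarchy can be sketched in Python:

```python
from dataclasses import dataclass, field, asdict
from typing import List

# Hypothetical, minimal stand-ins for the three ISA levels; the real
# ISA-Tab/ISA-JSON specifications define far richer metadata
# (contacts, protocols, ontology annotations, sample characteristics).

@dataclass
class Assay:
    measurement_type: str   # e.g. "metabolite profiling"
    technology_type: str    # e.g. "mass spectrometry"

@dataclass
class Study:
    identifier: str
    title: str
    assays: List[Assay] = field(default_factory=list)

@dataclass
class Investigation:
    identifier: str
    title: str
    studies: List[Study] = field(default_factory=list)

# Build a small example record and flatten it to a nested dict,
# which could then be serialized to JSON for exchange.
inv = Investigation(
    identifier="INV-1",
    title="Example investigation",
    studies=[Study(
        identifier="S-1",
        title="Pilot study",
        assays=[Assay("metabolite profiling", "mass spectrometry")],
    )],
)
record = asdict(inv)
```

    This only captures the nesting idea; real ISA records additionally link studies and assays to shared sample and protocol declarations so that metadata is reusable across assays.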

    Developing a toolkit for the assessment and monitoring of musculoskeletal ageing

    The complexities and heterogeneity of the ageing process have slowed the development of consensus on appropriate biomarkers of healthy ageing. The Medical Research Council-Arthritis Research UK Centre for Integrated research into Musculoskeletal Ageing (CIMA) is a collaboration between researchers and clinicians at the Universities of Liverpool, Sheffield and Newcastle. One of CIMA's objectives is to 'Identify and share optimal techniques and approaches to monitor age-related changes in all musculoskeletal tissues, and to provide an integrated assessment of musculoskeletal function'—in other words, to develop a toolkit for assessing musculoskeletal ageing. This toolkit is envisaged as an instrument that can be used to characterise and quantify musculoskeletal function during 'normal' ageing, lend itself to use in large-scale, internationally important cohorts, and provide a set of biomarker outcome measures for epidemiological and intervention studies designed to enhance healthy musculoskeletal ageing. Such potential biomarkers include: biochemical measurements in biofluids or tissue samples, in vivo measurements of body composition, imaging of structural and physical properties, and functional tests. This review assesses candidate biomarkers of musculoskeletal ageing under these four headings, details their biological bases, strengths and limitations, and makes practical recommendations for their use. In addition, we identify gaps in the evidence base and priorities for further research on biomarkers of musculoskeletal ageing.

    The ALICE experiment at the CERN LHC

    ALICE (A Large Ion Collider Experiment) is a general-purpose, heavy-ion detector at the CERN LHC which focuses on QCD, the strong-interaction sector of the Standard Model. It is designed to address the physics of strongly interacting matter and the quark-gluon plasma at extreme values of energy density and temperature in nucleus-nucleus collisions. Besides running with Pb ions, the physics programme includes collisions with lighter ions, lower energy running and dedicated proton-nucleus runs. ALICE will also take data with proton beams at the top LHC energy to collect reference data for the heavy-ion programme and to address several QCD topics for which ALICE is complementary to the other LHC detectors. The ALICE detector has been built by a collaboration currently including over 1000 physicists and engineers from 105 institutes in 30 countries. Its overall dimensions are 16 × 16 × 26 m³, with a total weight of approximately 10 000 t. The experiment consists of 18 different detector systems, each with its own specific technology choice and design constraints, driven both by the physics requirements and by the experimental conditions expected at the LHC. The most stringent design constraint is to cope with the extreme particle multiplicity anticipated in central Pb-Pb collisions. The different subsystems were optimized to provide high momentum resolution as well as excellent Particle Identification (PID) over a broad range in momentum, up to the highest multiplicities predicted for the LHC. This will allow for comprehensive studies of hadrons, electrons, muons, and photons produced in the collision of heavy nuclei. Most detector systems are scheduled to be installed and ready for data taking by mid-2008, when the LHC is scheduled to start operation, with the exception of parts of the Photon Spectrometer (PHOS), Transition Radiation Detector (TRD) and Electromagnetic Calorimeter (EMCal); these detectors will be completed for the high-luminosity ion run expected in 2010. This paper describes in detail the detector components as installed for the first data taking in the summer of 2008.