
    Documenting provenance in noncomputational workflows: Research process models based on geobiology fieldwork in Yellowstone National Park

    Peer Reviewed
    https://deepblue.lib.umich.edu/bitstream/2027.42/146402/1/asi24039_am.pdf
    https://deepblue.lib.umich.edu/bitstream/2027.42/146402/2/asi24039.pd

    Cell Type–dependent Requirement for PIP Box–regulated Cdt1 Destruction During S Phase

    Previous studies have shown that Cdt1 overexpression in cultured cells can trigger re-replication, but not whether CRL4^Cdt2-triggered destruction of Cdt1 is required for normal mitotic cell cycle progression in vivo. We demonstrate that PIP box–mediated destruction of Cdt1^Dup during S phase is necessary for the cell division cycle in Drosophila.

    Understanding the limitations of radiation-induced cell cycle checkpoints

    The DNA damage response pathways involve processes of double-strand break (DSB) repair and cell cycle checkpoint control to prevent or limit entry into S phase or mitosis in the presence of unrepaired damage. Checkpoints can function to permanently remove damaged cells from the actively proliferating population but can also halt the cell cycle temporarily to provide time for the repair of DSBs. Although efficient in their ability to limit genomic instability, checkpoints are not foolproof but carry inherent limitations. Recent work has demonstrated that the G1/S checkpoint is slowly activated and allows cells to enter S phase in the presence of unrepaired DSBs for about 4–6 h post irradiation. During this time, only a slowing but not abolition of S-phase entry is observed. The G2/M checkpoint, in contrast, is quickly activated but only responds to a level of 10–20 DSBs, such that cells with a low number of DSBs do not initiate the checkpoint or terminate arrest before repair is complete. Here, we discuss the limitations of these checkpoints in the context of the current knowledge of the factors involved. We suggest that the time needed to fully activate G1/S arrest reflects the existence of a restriction point in G1-phase progression. This point has previously been defined as the point when mitogen starvation fails to prevent cells from entering S phase. However, cells that have passed the restriction point can respond to DSBs, albeit with reduced efficiency.
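
    As a rough, hedged illustration of the two limitations described above, the toy sketch below encodes the figures quoted in the abstract (roughly 4–6 h for full G1/S activation, G2/M sensitivity of about 10–20 DSBs) as simple decision rules. The thresholds and the all-or-nothing logic are illustrative simplifications, not a mechanistic model from the paper.

```python
# Toy sketch (not from the underlying studies): caricature of checkpoint
# limitations using the figures quoted in the abstract.

def g1s_blocks_s_phase_entry(hours_post_ir: float, dsbs: int) -> bool:
    """G1/S arrest ramps up slowly: before ~4-6 h, damaged cells may still
    enter S phase (entry is slowed, not abolished)."""
    return dsbs > 0 and hours_post_ir >= 5.0  # ~middle of the 4-6 h window

def g2m_arrests(dsbs: int, threshold: int = 15) -> bool:
    """G2/M arrest engages only above roughly 10-20 DSBs; a handful of
    breaks can slip through or be released before repair completes."""
    return dsbs >= threshold

if __name__ == "__main__":
    for hours, breaks in [(2.0, 8), (8.0, 8), (2.0, 30)]:
        print(f"{hours:>4} h, {breaks:>2} DSBs -> "
              f"G1/S blocks entry: {g1s_blocks_s_phase_entry(hours, breaks)}, "
              f"G2/M arrests: {g2m_arrests(breaks)}")
```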

    Semantics in Support of Biodiversity Knowledge Discovery: An Introduction to the Biological Collections Ontology and Related Ontologies

    The study of biodiversity spans many disciplines and includes data pertaining to species distributions and abundances, genetic sequences, trait measurements, and ecological niches, complemented by information on collection and measurement protocols. A review of the current landscape of metadata standards and ontologies in biodiversity science suggests that existing standards such as the Darwin Core terminology are inadequate for describing biodiversity data in a semantically meaningful and computationally useful way. Existing ontologies, such as the Gene Ontology and others in the Open Biological and Biomedical Ontologies (OBO) Foundry library, provide a semantic structure but lack many of the necessary terms to describe biodiversity data in all its dimensions. In this paper, we describe the motivation for and ongoing development of a new Biological Collections Ontology, the Environment Ontology, and the Population and Community Ontology. These ontologies share the aim of improving data aggregation and integration across the biodiversity domain and can be used to describe physical samples and sampling processes (for example, collection, extraction, and preservation techniques), as well as biodiversity observations that involve no physical sampling. Together they encompass studies of: 1) individual organisms, including voucher specimens from ecological studies and museum specimens, 2) bulk or environmental samples (e.g., gut contents, soil, water) that include DNA, other molecules, and potentially many organisms, especially microbes, and 3) survey-based ecological observations. We discuss how these ontologies can be applied to biodiversity use cases that span genetic, organismal, and ecosystem levels of organization. We argue that if adopted as a standard and rigorously applied and enriched by the biodiversity community, these ontologies would significantly reduce barriers to data discovery, integration, and exchange among biodiversity resources and researchers.
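
    As a loose illustration of what describing a physical sample and its collection process with such ontologies might look like in practice, here is a minimal RDF sketch in Python using rdflib. The specific OBO-style term identifiers below are placeholders invented for this example (not verified BCO, ENVO, or PCO IDs), and the data namespace is hypothetical.

```python
# Minimal sketch: annotating a soil sample and its collection event with
# ontology-style terms. Term IDs and the data namespace are placeholders.
from rdflib import Graph, Literal, Namespace, RDF, RDFS

OBO = Namespace("http://purl.obolibrary.org/obo/")
EX = Namespace("http://example.org/biodiv/")  # hypothetical data namespace

g = Graph()
g.bind("obo", OBO)
g.bind("ex", EX)

# A physical sample (material entity) taken in the field.
sample = EX["sample/0001"]
g.add((sample, RDF.type, OBO["BCO_0000000"]))      # placeholder: "material sample"
g.add((sample, RDFS.label, Literal("soil sample, plot 12")))

# The sampling process that produced it, linked to its output.
collecting = EX["event/collect-0001"]
g.add((collecting, RDF.type, OBO["BCO_0000001"]))  # placeholder: "specimen collection process"
g.add((collecting, OBO["RO_0002234"], sample))     # placeholder relation: "has output"

print(g.serialize(format="turtle"))
```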

    Trust-based design of HGAs

    Thesis (S.M.)--Massachusetts Institute of Technology, Sloan School of Management, Operations Research Center, 2007. Includes bibliographical references (p. 227-229). By combining the strengths of humans and computers, Human Machine Collaborative Decision Making has been shown to generate higher-quality solutions in less time than conventional computerized methods. In many cases, it is difficult to model continually changing problems and incorporate human objectives into the solution. Human-guided algorithms (HGAs) harness the power of sophisticated algorithms and computers to give the human decision maker the flexibility to model the problem correctly and dynamically and to steer the algorithm toward solutions that match his/her objectives for the given problem. HGAs are designed to make the power of Operations Research accessible to problem domain experts and decision makers, and to incorporate their expert knowledge into every solution. In order to appropriately utilize algorithms during a planner's decision making, HGA operators must appropriately trust the HGA and the final solution. Through the use of trust-based design (TBD), it was hypothesized that users of the HGA would gain better insight into the solution process, improve their calibration of trust, and generate superior solutions. The application of TBD requires consideration of the algorithms, solution steering methods, and displays needed to best match the complementary strengths of human and computer and to generate solutions that can be appropriately trusted. The abstraction hierarchy, Ecological Interface Design, and various trust models are used to ensure that HGA operators' evaluation of trust can be correctly calibrated to all necessary HGA trust attributes. A human-subject evaluation was used to test the effectiveness of the TBD approach for HGAs. An HGA, including the appropriate controls and displays, was designed and developed using the described TBD approach. The participants were presented with the task of using the HGA to develop a routing plan for military aircraft to prosecute enemy targets. The results showed that TBD had a significant effect on trust, HGA performance, and, in some cases, the quality of final solutions. Another finding was that HGA operators must be provided with additional trust-related information to improve their understanding of the HGA, the solution process, and the final solution in order to properly calibrate their trust in the system. By Joseph L. Thomer. S.M.
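
    To make the human-guided idea concrete, here is a minimal sketch (not the thesis's actual system) of a local search for a small routing problem in which the operator steers the algorithm between rounds by re-weighting objectives and pinning waypoints. All targets, threat locations, and weights are made up for this illustration.

```python
# Illustrative human-guided local search: the operator re-weights objectives
# (distance vs. threat exposure) and pins waypoints between rounds.
import math
import random

TARGETS = {"A": (0.0, 0.0), "B": (4.0, 1.0), "C": (2.0, 5.0), "D": (6.0, 3.0)}
SAM_SITE = (3.0, 3.0)  # hypothetical threat location

def leg_threat(a: str, b: str) -> float:
    """Higher exposure the closer a leg's midpoint passes to the threat."""
    mid = ((TARGETS[a][0] + TARGETS[b][0]) / 2, (TARGETS[a][1] + TARGETS[b][1]) / 2)
    return math.exp(-math.dist(mid, SAM_SITE))

def cost(route, weights):
    dist = sum(math.dist(TARGETS[a], TARGETS[b]) for a, b in zip(route, route[1:]))
    threat = sum(leg_threat(a, b) for a, b in zip(route, route[1:]))
    return weights["distance"] * dist + weights["threat"] * threat

def guided_search(route, weights, pinned, steps=2000):
    """Swap-based local search that never moves operator-pinned positions."""
    best = list(route)
    for _ in range(steps):
        i, j = random.sample(range(len(best)), 2)
        if i in pinned or j in pinned:
            continue
        cand = list(best)
        cand[i], cand[j] = cand[j], cand[i]
        if cost(cand, weights) < cost(best, weights):
            best = cand
    return best

plan = guided_search(["A", "B", "C", "D"], {"distance": 1.0, "threat": 0.0}, pinned=set())
# The operator reviews the plan, decides threat exposure matters more, pins the
# first waypoint, and lets the algorithm continue from the current solution.
plan = guided_search(plan, {"distance": 1.0, "threat": 5.0}, pinned={0})
print(plan, round(cost(plan, {"distance": 1.0, "threat": 5.0}), 2))
```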

    Working Group Reports on Metadata

    While there are many metadata standards for a wide variety of objects, collections, and cataloguing methods, universally recognized standards for metadata related to 3D data have not yet been codified. Through this presentation, we hope to describe current use of 3D metadata standards, discuss recommended fields, and perform gap analysis with feedback from the CS3DP Forum 2 attendees to further inform our work moving forward. Within the scope of 3D data being considered, there are essentially two very broad categories of data generation: scan-to-mesh, where the data generated originates directly from interaction with a physical, real-world object by means of external technology such as photogrammetry, laser scanning, or CT; and born-digital, where the data generated is created exclusively within the confines of 3D design software such as AutoCAD or other Autodesk tools, based on a variety of sources including personal research, algorithmically derived analysis, or carefully informed projections. In both cases, a large quantity of information is available with regard to the quantity, quality, and origin of the 3D data being created, but there are not yet universal guidelines on what information needs to be stored and how. In order to better understand the range of metadata practices in the cataloging and preservation of 3D data, we conducted a survey in spring and summer 2018. We received responses from nine different institutions and found that they use a wide variety of existing metadata standards and workflows. We are comparing these institutions’ metadata practices by organizing their standards into a spreadsheet, sorting each metadata field into broad categories, and attempting to identify potential crosswalks to establish commonalities in the following categories: Project Level, Item Level, Capture, Rights/Access, and Descriptive vs Technical (capture/processing). The collation of these data will be used to facilitate the assessment of current metadata standards for the preservation and management of 3D data and to begin a conversation with Forum 2 participants about what fields and classes of metadata are most crucial for their work. Based on these existing established standards, we hope to ultimately create a series of best practices for 3D metadata, rather than a hard-and-fast set of fields based on specific ontologies or schemas. Using the information gathered from our survey, we will begin gap analysis with the aid of Forum 2 participants. While we have striven to collect as many responses as possible from a representative cross-section of 3D data creators and managers, we recognize that there are those with expertise who may not have been able to participate in virtual meetings thus far. As such, fueled by the previous discussion, we will facilitate a discussion of metadata needs that may be vital but currently unaccounted for, and evaluate where the pain points are in current 3D metadata practices. This will include a discussion of questions that 3D data producers and consumers may need metadata to answer in order to best enable potential reuse or reference. The results of these conversations will be used to assess and augment proposed best practices for the capture, collection, and collation of metadata for 3D objects, and how this relates to best practices for the preservation of this information, whether as research data, within library collections, or in digital repositories.
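
    As a hedged sketch of the kind of crosswalk described above, the example below maps institution-specific field names onto the working group's broad categories (Project Level, Item Level, Capture, Rights/Access); unmapped fields surface as candidates for gap analysis. The field names and the mapping itself are invented for illustration, not taken from the survey responses.

```python
# Hypothetical metadata crosswalk: institution-specific fields are re-keyed
# into shared categories; anything unmapped is flagged for gap analysis.
CROSSWALK = {
    "institution_a": {
        "project_title": ("Project Level", "title"),
        "object_id": ("Item Level", "identifier"),
        "scanner_model": ("Capture", "capture_device"),
        "license": ("Rights/Access", "license"),
    },
    "institution_b": {
        "collection_name": ("Project Level", "title"),
        "accession_no": ("Item Level", "identifier"),
        "photogrammetry_rig": ("Capture", "capture_device"),
        "usage_terms": ("Rights/Access", "license"),
    },
}

def normalize(record, source):
    """Re-key a raw metadata record into the shared category/field scheme."""
    mapping = CROSSWALK[source]
    out = {}
    for field, value in record.items():
        if field in mapping:
            category, common_field = mapping[field]
            out.setdefault(category, {})[common_field] = value
        else:
            out.setdefault("Unmapped", {})[field] = value  # gap-analysis candidate
    return out

print(normalize({"accession_no": "1987.14.3", "photogrammetry_rig": "rig-02",
                 "mesh_resolution": "0.2 mm"}, "institution_b"))
```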

    Value and Context in Data Use: Domain Analysis Revisited

    “Context” is an elusive concept in Information Science – often invoked, and yet rarely explained. In this paper we take a domain-analytic approach to examine five sub-disciplines within Earth Systems Science to show how the context of data production and use often impacts the value of data. We argue simply that the value of research data increases with their use. Our analysis is informed by two economic perspectives: first, that data production needs to be situated within a broader information economy; and second, that the concept of anti-fragility helps explain how data increase in value through exposure to diverse contexts of use. We discuss the importance of these perspectives for the development of information systems capable of facilitating interdisciplinary scientific work, as well as the design of sustainable cyberinfrastructures. Published or submitted for publication; peer reviewed.