
    The role of Comprehension in Requirements and Implications for Use Case Descriptions

    Within requirements engineering it is generally accepted that, in writing specifications (or indeed any requirements-phase document), one attempts to produce an artefact which will be simple for the user to comprehend. That is, whether the document is intended for customers to validate requirements, or for engineers to understand what the design must deliver, comprehension is an important goal for the author. Indeed, advice on producing ‘readable’ or ‘understandable’ documents is often included in courses on requirements engineering. However, few researchers, particularly within the software engineering domain, have attempted either to define or to understand the nature of comprehension and its implications for guidance on the production of quality requirements. Therefore, this paper examines thoroughly the nature of textual comprehension, drawing heavily on research in discourse processing, and suggests some implications for requirements (and other) software documentation. In essence, we find that the guidance on writing requirements, often prevalent within software engineering, may be based upon assumptions which oversimplify the nature of comprehension. Hence, the paper examines guidelines which have been proposed, in this case for use case descriptions, and the extent to which they agree with discourse process theory, before suggesting refinements to the guidelines which attempt to utilise lessons learned from our richer understanding of the underlying theory. For example, we suggest subtly different sets of writing guidelines for the different tasks of requirements, specification and design.

    Detection of regulator genes and eQTLs in gene networks

    Genetic differences between individuals associated with quantitative phenotypic traits, including disease states, are usually found in non-coding genomic regions. These genetic variants are often also associated with differences in expression levels of nearby genes (they are "expression quantitative trait loci", or eQTLs for short) and presumably play a gene regulatory role, affecting the status of molecular networks of interacting genes, proteins and metabolites. Computational systems biology approaches to reconstruct causal gene networks from large-scale omics data have therefore become essential to understand the structure of networks controlled by eQTLs together with other regulatory genes, and to generate detailed hypotheses about the molecular mechanisms that lead from genotype to phenotype. Here we review the main analytical methods and software tools to identify eQTLs and their associated genes, to reconstruct co-expression networks and modules, to reconstruct causal Bayesian gene and module networks, and to validate predicted networks in silico.
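    The basic single-variant eQTL test reviewed here is, at its core, a linear regression of a gene's expression level on genotype dosage at a nearby variant. The sketch below illustrates this on simulated data; the sample size, effect size, and noise level are arbitrary assumptions, not values from the review.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: genotype coded as minor-allele count (0/1/2)
# for 200 individuals at one variant, plus expression of a nearby gene
# simulated with a true additive effect of 0.5 per allele.
n = 200
genotype = rng.integers(0, 3, size=n).astype(float)
expression = 0.5 * genotype + rng.normal(0.0, 1.0, size=n)

# eQTL association test: ordinary least-squares slope of expression on
# genotype, with its standard error and t statistic.
x = genotype - genotype.mean()
y = expression - expression.mean()
beta = (x @ y) / (x @ x)                        # estimated allelic effect
resid = y - beta * x
se = np.sqrt((resid @ resid) / (n - 2) / (x @ x))
t_stat = beta / se                              # large |t| => candidate eQTL
```

In practice this test is run for every (variant, gene) pair within some genomic window, so multiple-testing correction and covariate adjustment are essential; the sketch shows only the core regression.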

    Sharing brain mapping statistical results with the neuroimaging data model

    Only a tiny fraction of the data and metadata produced by an fMRI study is finally conveyed to the community. This lack of transparency not only hinders the reproducibility of neuroimaging results but also impairs future meta-analyses. In this work we introduce NIDM-Results, a format specification providing a machine-readable description of neuroimaging statistical results along with key image data summarising the experiment. NIDM-Results provides a unified representation of mass univariate analyses including a level of detail consistent with available best practices. This standardized representation allows authors to relay methods and results in a platform-independent regularized format that is not tied to a particular neuroimaging software package. Tools are available to export NIDM-Results graphs and associated files from the widely used SPM and FSL software packages, and the NeuroVault repository can import NIDM-Results archives. The specification is publicly available at: http://nidm.nidash.org/specs/nidm-results.html

    Lack of functional alpha-lactalbumin prevents involution in Cape fur seals and identifies the protein as an apoptotic milk factor in mammary gland involution

    The mammary gland undergoes a sophisticated programme of developmental changes during pregnancy/lactation. However, little is known about processes involving initiation of apoptosis at involution following weaning. We used fur seals as models to study the molecular process of involution, as these animals display a unique mammary gland phenotype. Fur seals have long lactation periods whereby mothers cycle between secreting copious quantities of milk for 2 to 3 days while suckling pups on land and foraging trips to sea alone of up to 23 days, during which time mammary glands remain active without initiating apoptosis/involution.

    Mechanical Work as an Indirect Measure of Subjective Costs Influencing Human Movement

    To descend a flight of stairs, would you rather walk or fall? Falling seems to have some obvious disadvantages, such as the risk of pain or injury. But the preferred strategy of walking also entails a cost for the use of active muscles to perform negative work. The amount and distribution of work a person chooses to perform may, therefore, reflect a subjective valuation of the trade-offs between active muscle effort and other costs, such as pain. Here we use a simple jump landing experiment to quantify the work humans prefer to perform to dissipate the energy of landing. We found that healthy normal subjects (N = 8) preferred a strategy that involved performing 37% more negative work than minimally necessary (P < 0.001) across a range of landing heights. This then required additional positive work to return to a standing rest posture, highlighting the cost of this preference. Subjects were also able to modulate the amount of landing work, and its distribution between active and passive tissues. When instructed to land softly, they performed 76% more work than necessary (P < 0.001), with a higher proportion from active muscles (89% vs. 84%, P < 0.001). Stiff-legged landings, performed by one subject for demonstration, exhibited close to the minimum of work, with more of it performed passively through soft tissue deformations (at least 30% in stiff landings vs. 16% preferred). During jump landings, humans appear not to minimize muscle work, but instead choose to perform a consistent amount of extra work, presumably to avoid other subjective costs. The degree to which work is not minimized may indirectly quantify the relative valuation of costs that are otherwise difficult to measure.

    Hypoxia and hypoxia inducible factor-1α are required for normal endometrial repair during menstruation

    About a quarter of pre-menopausal women will suffer from heavy menstrual bleeding in their lives. Here, Maybin and colleagues show that hypoxia and subsequent activation of HIF-1α during menses are required for normal endometrial repair, and identify pharmacological stabilisation of HIF-1α as a potential therapeutic strategy for this debilitating condition.

    A group randomized trial of a complexity-based organizational intervention to improve risk factors for diabetes complications in primary care settings: study protocol

    Background: Most patients with type 2 diabetes have suboptimal control of their glucose, blood pressure (BP), and lipids – three risk factors for diabetes complications. Although the chronic care model (CCM) provides a roadmap for improving these outcomes, developing theoretically sound implementation strategies that will work across diverse primary care settings has been challenging. One explanation for this difficulty may be that most strategies do not account for the complex adaptive system (CAS) characteristics of the primary care setting. A CAS is composed of individuals who can learn, interconnect, self-organize, and interact with their environment in a way that demonstrates non-linear dynamic behavior. One implementation strategy that may be used to leverage these properties is practice facilitation (PF). PF creates time for learning and reflection by members of the team in each clinic, improves their communication, and promotes an individualized approach to implement a strategy to improve patient outcomes.

    Specific objectives: The specific objectives of this protocol are to: evaluate the effectiveness and sustainability of PF to improve risk factor control in patients with type 2 diabetes across a variety of primary care settings; assess the implementation of the CCM in response to the intervention; examine the relationship between communication within the practice team and the implementation of the CCM; and determine the cost of the intervention, both from the perspective of the organization conducting the PF intervention and from the perspective of the primary care practice.

    Intervention: The study will be a group randomized trial conducted in 40 primary care clinics. Data will be collected on all clinics, with 60 patients in each clinic, using a multi-method assessment process at baseline, 12, and 24 months. The intervention, PF, will consist of a series of practice improvement team meetings led by trained facilitators over 12 months. Primary hypotheses will be tested with 12-month outcome data. Sustainability of the intervention will be tested using 24-month data. Insights gained will be included in a delayed intervention conducted in control practices and evaluated in a pre-post design.

    Primary and secondary outcomes: To test hypotheses, the unit of randomization will be the clinic. The unit of analysis will be the repeated measure of each risk factor for each patient, nested within the clinic. The repeated measure of glycosylated hemoglobin A1c will be the primary outcome, with BP and low-density lipoprotein (LDL) cholesterol as secondary outcomes. To study change in risk factor level, a hierarchical or random-effects model will be used to account for the nesting of repeated measurements of risk factors within patients and patients within clinics.

    This protocol follows the CONSORT guidelines and is registered per ICMJE guidelines. Clinical Trial Registration Number: NCT00482768
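    The three-level nesting in the planned analysis (repeated measures within patients within clinics) can be illustrated with simulated data. The sketch below is illustrative only: the variance components, baseline HbA1c, and the choice of 3 repeated measures per patient are assumptions, not trial parameters; only the 40-clinic / 60-patient structure comes from the protocol.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical nested data: 40 clinics, 60 patients per clinic,
# 3 repeated HbA1c measures per patient (all variances assumed).
clinics, patients, reps = 40, 60, 3
clinic_eff = rng.normal(0.0, 0.4, size=clinics)               # between-clinic sd
patient_eff = rng.normal(0.0, 0.8, size=(clinics, patients))  # between-patient sd
noise = rng.normal(0.0, 0.5, size=(clinics, patients, reps))  # within-patient sd
hba1c = 7.5 + clinic_eff[:, None, None] + patient_eff[:, :, None] + noise

# Method-of-moments check of the nesting: variance across repeated
# measures within a patient reflects only measurement noise, while
# variance of clinic means reflects the clinic-level component.
within_patient_var = hba1c.var(axis=2, ddof=1).mean()
between_clinic_var = hba1c.mean(axis=(1, 2)).var(ddof=1)
```

A hierarchical (random-effects) model, as the protocol specifies, estimates these components jointly by maximum likelihood rather than by moments, but the simulation shows why ignoring the clustering would understate standard errors for clinic-level comparisons.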

    Genetic linkage analysis in the age of whole-genome sequencing

    For many years, linkage analysis was the primary tool used for the genetic mapping of Mendelian and complex traits with familial aggregation. Linkage analysis was largely supplanted by the wide adoption of genome-wide association studies (GWASs). However, with the recent increased use of whole-genome sequencing (WGS), linkage analysis is again emerging as an important and powerful analysis method for the identification of genes involved in disease aetiology, often in conjunction with WGS filtering approaches. Here, we review the principles of linkage analysis and provide practical guidelines for carrying out linkage studies using WGS data.