
    Managing the Complementarity of Knowledge Integration and Process Formalization for Systems Development Performance

    Systems development processes have received significant negative publicity due to failed projects, often at large costs, and performance issues that continue to plague IS managers. This study complements existing systems development research by proposing a knowledge management perspective for managing tacit and explicit knowledge in the systems development process. Specifically, it proposes that collaborative exchange and integration of explicit knowledge across phases of the development process positively influence the performance of systems development. It also suggests that process formalization not only directly impacts development performance but also moderates the performance effects of the knowledge integration factors. Data for the empirical study were collected from 60 organizations that are part of a user group for one of the world's largest software development tool vendors. Empirical results provide strong evidence of the importance of supporting tacit and explicit knowledge processes in systems development as well as process formalization. The findings suggest that: (i) collaborative exchange among IS employees that integrates their tacit knowledge positively impacts development performance, (ii) explicit knowledge integration in development artifacts across different phases of the systems development process positively impacts development performance, (iii) formalization of processes that establishes routines and discipline yields performance gains, and (iv) the performance effects of both collaborative exchange and explicit knowledge integration are moderated by the formalization of the process. These results have implications for how both tacit and explicit knowledge integration can be managed during systems development, and how formalization of processes complements their relationship with development performance.
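The moderation claim in finding (iv) can be made concrete with a small sketch. This is not the paper's fitted model; the functional form is the standard interaction-term regression, and every coefficient value below is made up for illustration only.

```python
# Illustrative moderated linear model (hypothetical coefficients, not the
# paper's estimates): process formalization changes the marginal effect of
# collaborative knowledge integration on development performance.

def predicted_performance(collab, formal,
                          b0=1.0, b_collab=0.4, b_formal=0.3, b_inter=0.2):
    """performance = b0 + b1*collab + b2*formal + b3*(collab * formal)."""
    return b0 + b_collab * collab + b_formal * formal + b_inter * collab * formal

def marginal_effect_of_collab(formal, b_collab=0.4, b_inter=0.2):
    """d(performance)/d(collab) depends on the level of formalization:
    the interaction term b3*formal is what 'moderation' means here."""
    return b_collab + b_inter * formal

# Under higher formalization, the same increase in collaborative exchange
# yields a larger predicted performance gain.
low_effect  = marginal_effect_of_collab(formal=0.0)
high_effect = marginal_effect_of_collab(formal=2.0)
```

When the interaction coefficient is positive, as the findings suggest, the payoff from tacit-knowledge exchange grows with the degree of process formalization.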

    Mutations in a plastid-localized elongation factor G alter early stages of plastid development in Arabidopsis thaliana

    Background: Proper development of plastids in embryo and seedling tissues is critical for plant development. During germination, plastids develop to perform many critical functions that are necessary to establish the seedling for further growth. A growing body of work has demonstrated that components of the plastid transcription and translation machinery must be present and functional to establish the organelle upon germination.
    Results: We have identified Arabidopsis thaliana mutants in a gene that encodes a plastid-targeted elongation factor G (SCO1) that is essential for plastid development during embryogenesis, since two T-DNA insertion mutations in the coding sequence (sco1-2 and sco1-3) result in an embryo-lethal phenotype. In addition, a point mutation allele (sco1-1) and an allele with a T-DNA insertion in the promoter (sco1-4) of SCO1 display conditional seedling-lethal phenotypes. Seedlings of these alleles exhibit cotyledon and hypocotyl albinism due to improper chloroplast development, and normally die shortly after germination. However, when germinated on media supplemented with sucrose, the mutant plants can produce photosynthetically active green leaves from the apical meristem.
    Conclusion: The developmental stage-specific phenotype of the conditional-lethal sco1 alleles reveals differences in chloroplast formation during seedling germination compared to chloroplast differentiation in cells derived from the shoot apical meristem. Our identification of embryo-lethal mutant alleles in the Arabidopsis elongation factor G indicates that SCO1 is essential for plant growth, consistent with its predicted role in chloroplast protein translation.

    A retrospective study of two populations to test a simple rule for spirometry

    Background: Chronic lung disease is common and often under-diagnosed. Methods: To test a simple rule for conducting spirometry, we reviewed spirograms from two populations: occupational medicine evaluations (OME) conducted by Saint Louis and Wake Forest Universities at 3 sites (n = 3260, mean age 64.14 years, 95 % CI 58.94–69.34, 97 % men) and evaluations conducted by the Wake Forest University preoperative clinic (POC) at one site (n = 845, mean age 62.10 years, 95 % CI 50.46–73.74, 57 % men). This retrospective review of database information, collected prospectively by the first author, identified rates, types, sensitivity, specificity, and positive and negative predictive values for lung function abnormalities, and the associated mortality rate, found when conducting spirometry based on the 20/40 rule (≥20 years of smoking in those aged ≥40 years) in the OME population. To determine the reproducibility of the 20/40 rule for conducting spirometry, the rule was then applied to the POC population. Results: A lung function abnormality was found in 74 % of the OME population and 67 % of the POC population. Sensitivity of the rule was 85 % for an obstructive pattern and 77 % for any abnormality on spirometry. Positive and negative predictive values of the rule for a spirometric abnormality were 74 % and 55 %, respectively. Patients with an obstructive pattern were at greater risk of coronary heart disease (odds ratio (OR) 1.39, 95 % CI 1.00–1.93, vs. normal) and death (hazard ratio (HR) 1.53, 95 % CI 1.20–1.84) than subjects with normal spirometry. Restrictive spirometry patterns were also associated with greater risk of coronary disease (OR 1.7, 95 % CI 1.23–2.35) and death (HR 1.40, 95 % CI 1.08–1.72). Conclusions: Smokers (≥20 pack-years) aged ≥40 years are at an increased risk for lung function abnormalities, and those abnormalities are associated with greater presence of coronary heart disease and increased all-cause mortality. Use of the 20/40 rule could provide a simple method to enhance selection of candidates for spirometry evaluation in the primary care setting.
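The 20/40 rule itself is simple enough to express as a one-line decision function. A minimal sketch follows; note the abstract phrases the smoking threshold both as "≥20 years of smoking" (Methods) and "≥20 pack-years" (Conclusions), so the parameter name here follows the Conclusions wording.

```python
# Sketch of the 20/40 screening rule: select a patient for spirometry when
# smoking exposure is >= 20 (pack-years, per the abstract's conclusion) AND
# age is >= 40 years.

def meets_20_40_rule(pack_years: float, age_years: float) -> bool:
    """Return True when the 20/40 rule selects the patient for spirometry."""
    return pack_years >= 20 and age_years >= 40

# A 55-year-old with 25 pack-years qualifies; a 65-year-old with only
# 10 pack-years does not.
```

Both thresholds are inclusive here; the abstract's "≥" notation supports that reading.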

    Quantum criticality of dipolar spin chains

    We show that a chain of Heisenberg spins interacting with long-range dipolar forces in a magnetic field h perpendicular to the chain exhibits a quantum critical point belonging to the two-dimensional Ising universality class. Within linear spin-wave theory the magnon dispersion for small momenta k is [Delta^2 + v_k^2 k^2]^{1/2}, where Delta^2 \propto |h - h_c| and v_k^2 \propto |ln k|. For fields close to h_c linear spin-wave theory breaks down and we investigate the system using density-matrix and functional renormalization group methods. The Ginzburg regime where non-Gaussian fluctuations are important is found to be rather narrow on the ordered side of the transition, and very broad on the disordered side. Comment: 6 pages, 5 figures.
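In LaTeX notation, the small-momentum dispersion quoted in the abstract reads as follows (the symbol E_k for the magnon energy is supplied here; the abstract gives only the right-hand side):

```latex
E_k = \sqrt{\Delta^2 + v_k^2\, k^2},
\qquad \Delta^2 \propto |h - h_c|,
\qquad v_k^2 \propto |\ln k| .
```

The logarithmic divergence of the velocity v_k as k → 0 is the signature of the long-range dipolar interaction, and the vanishing of the gap Δ at h = h_c marks the quantum critical point.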

    Longitude : a privacy-preserving location sharing protocol for mobile applications

    Location sharing services are becoming increasingly popular. Although many location sharing services allow users to set up privacy policies to control who can access their location, the use that the service providers themselves make of location data remains a source of concern. Ideally, location sharing providers and middleware should not be able to access users' location data without their consent. In this paper, we propose a new location sharing protocol called Longitude that eases privacy concerns by making it possible to share a user's location data blindly and allowing the user to control who can access her location, when, and to what degree of precision. The underlying cryptographic algorithms are designed for GPS-enabled mobile phones. We describe and evaluate our implementation for the Nexus One Android mobile phone.
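The abstract does not describe Longitude's cryptographic machinery, so no attempt is made to reproduce it here. The sketch below only illustrates one non-cryptographic ingredient the abstract mentions, controlling the degree of precision at which a location is disclosed, via simple coordinate rounding; the function name and coordinates are hypothetical.

```python
# Hypothetical precision-control helper (not Longitude's actual mechanism):
# rounding latitude/longitude coarsens the disclosed position. One degree of
# latitude is roughly 111 km, so 1 decimal place ~ 11 km and 3 decimal
# places ~ 110 m of disclosed precision.

def coarsen_location(lat: float, lon: float, decimals: int) -> tuple:
    """Return the coordinates rounded to `decimals` decimal places."""
    return (round(lat, decimals), round(lon, decimals))

precise = (52.48624, -1.89040)            # example GPS fix
city    = coarsen_location(*precise, 1)   # ~11 km precision for casual contacts
street  = coarsen_location(*precise, 3)   # ~110 m precision for close contacts
```

In a real deployment the coarsened coordinates would additionally be encrypted before reaching the provider, so that only the authorized contact, not the middleware, can read them.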

    Graph-Based Approach to the Edit Distance Cryptanalysis of Irregularly Clocked Linear Feedback Shift Registers

    This paper proposes a speed-up of a known-plaintext attack on some stream ciphers based on Linear Feedback Shift Registers (LFSRs). The algorithm consists of two basic steps: first, guess the initial seed value of one of the LFSRs, and then use the resulting binary sequence to deduce useful information about the cipher parameters. In particular, the proposed divide-and-conquer attack is based on a combination of graph-based techniques with edit distance concepts. While the original edit distance attack requires an exhaustive search over the set of all possible initial states of the involved LFSR, this work presents a new heuristic optimization that avoids the evaluation of a large number of initial states by identifying the most promising branches of the search graph. The strongest aspects of the proposal are that the results obtained from the attack are fully deterministic, and that many inconsistent initial states of the target LFSRs are recognized and avoided during the search.
    This work was supported by the Spanish Ministry of Science and Innovation and the European FEDER Fund under Project TIN2008-02236/TSI, as well as by CDTI (Spain) and the companies INDRA, Unión Fenosa, Tecnobit, Visual Tool, Brainstorm, SAC and Technosafe under Project Cenit-HESPERIA. Peer reviewed.
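For readers unfamiliar with the attack target, a minimal Fibonacci-style LFSR is sketched below. The 4-bit register and tap positions are illustrative only; they correspond to a standard maximal-length example, not to the registers of any cipher analyzed in the paper.

```python
# Minimal Fibonacci LFSR sketch. Each step outputs the low bit, XORs the
# tapped bits into a feedback bit, and shifts the register right with the
# feedback bit entering at the top. Taps (0, 1) on a 4-bit register give a
# maximal-length sequence of period 2**4 - 1 = 15.

def lfsr_stream(state, taps, nbits, length):
    """Generate `length` output bits from an LFSR.
    state: initial register contents (nonzero int of `nbits` bits)
    taps:  0-indexed bit positions XORed together to form the feedback bit
    """
    out = []
    for _ in range(length):
        out.append(state & 1)                     # emit least significant bit
        fb = 0
        for t in taps:
            fb ^= (state >> t) & 1                # XOR the tapped bits
        state = (state >> 1) | (fb << (nbits - 1))
    return out

bits = lfsr_stream(state=0b1000, taps=(0, 1), nbits=4, length=15)
```

The original edit distance attack enumerates all 2**nbits - 1 nonzero initial `state` values and scores each candidate keystream; the paper's contribution is a graph-based heuristic that prunes most of that enumeration.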