
    When and how to facilitate the introduction of new knowledge processes in organisations

    © 2014, Emerald Group Publishing Limited. Purpose - The purpose of this paper is to present two case studies hosted between 2012 and 2013 by Woolworths Limited, with recommendations addressing the question of why implementing new processes into well-established organisations has proved problematic. Design/methodology/approach - The research framework used is a novel synthesis of actor-network theory (ANT) with Miller's living systems theory (LST). Systems at each LST level are actors in an actor-network. Higher LST-level actors punctualise lower-level actor-networks, enabling the fine-grained study of dynamic associations within the LST structure. Qualitative measures assess the collaboration's progress. Findings - Gaps were found between teams' capabilities to implement new processes and those required to meet expectations. There were three contributors to the gap: first, knowledge flow was inhibited by social network structural holes; second, a reliance on tacit knowledge made identifying training needs difficult; and third, high utilisation of experts reduced their effectiveness. Research limitations/implications - The nature of logistics means that findings need careful validation before application to other business contexts. Larger studies will benefit from computer mediation for parsing and characterising associations, and computational modelling will be required for validating scenarios that cannot be performed or repeated with human actors. Practical implications - Recommendations for early identification of new ideas that require facilitation will help organisations enhance their adaptability and maintain their competitive advantage in a changing marketplace. Originality/value - The synthesis of ANT with LST provides collaboration researchers with an adaptable framework that combines a focus on dynamic associations within the context of complex social interactions.

    Creatine kinase in energy metabolic signaling in muscle

    There has been much debate on the mechanism that regulates mitochondrial ATP synthesis to balance ATP consumption during changing cardiac workloads. A key role of creatine kinase (CK) isoenzymes in this regulation of oxidative phosphorylation and in intracellular energy transport had been proposed, but has since been disputed for many years. It was hypothesized that high-energy phosphoryl groups are obligatorily transferred via CK; this is termed the phosphocreatine shuttle. The other important role ascribed to the CK system is its ability to buffer the ADP concentration in the cytosol near sites of ATP hydrolysis.

Almost all of the experiments to determine the role of CK had been done in the steady state, but recently the dynamic response of oxidative phosphorylation to quick changes in cytosolic ATP hydrolysis has been assessed at various levels of inhibition of CK. Steady state models of CK function in energy transfer existed but were unable to explain the dynamic response with CK inhibited.

The aim of this study was to explain the mode of functioning of the CK system in heart, and in particular the role of different CK isoenzymes in the dynamic response to workload steps. For this purpose we used a mathematical model of cardiac muscle cell energy metabolism containing the kinetics of the key processes of energy production, consumption and transfer pathways. The model underscores that CK indeed plays a dual role in cardiac cells. The buffering role of the CK system is due to the activity of myofibrillar CK (MMCK), while the energy transfer role depends on the activity of mitochondrial CK (MiCK). We propose that this may lead to differences in regulation mechanisms and energy transfer modes in species with relatively low MiCK activity, such as rabbit, in comparison with species with high MiCK activity, such as rat.

The model needed modification to explain the new type of experimental data on the dynamic response of the mitochondria. We submit that building a Virtual Muscle Cell is not possible without continuous experimental tests to improve the model. In close interaction with experiments we are developing a model for muscle energy metabolism and transport mediated by the creatine kinase isoforms, which can now already explain many different types of experiments.
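The ADP-buffering role of CK described in this abstract can be illustrated with a minimal toy simulation. This is our own sketch of the general idea, not the authors' model: the CK reaction (PCr + ADP ⇌ Cr + ATP) is integrated alongside constant ATP hydrolysis and resynthesis, and all concentrations and rate constants are hypothetical round numbers.

```python
def simulate_ck_buffering(steps=10000, dt=1e-4, k_fwd=50.0, k_rev=0.5,
                          hydrolysis=5.0, synthesis=5.0):
    """Euler integration of a toy CK system (all parameters hypothetical):
    PCr + ADP <-> Cr + ATP (CK reaction), plus saturable ATP hydrolysis
    (workload) and ADP rephosphorylation (mitochondria)."""
    atp, adp, pcr, cr = 8.0, 0.05, 20.0, 5.0   # concentrations in mM
    for _ in range(steps):
        ck_flux = k_fwd * pcr * adp - k_rev * cr * atp  # net CK reaction rate
        hyd = hydrolysis * atp / (atp + 1.0)            # ATP -> ADP (workload)
        syn = synthesis * adp / (adp + 0.1)             # ADP -> ATP (oxidative)
        atp += dt * (ck_flux - hyd + syn)
        adp += dt * (-ck_flux + hyd - syn)
        pcr += dt * (-ck_flux)
        cr  += dt * ( ck_flux)
    return atp, adp, pcr, cr
```

Running the toy model with the CK reaction active keeps cytosolic ADP far lower than a run with CK knocked out (`k_fwd=k_rev=0`), because the PCr pool rapidly rephosphorylates ADP near the site of hydrolysis; both adenine nucleotide and creatine pools are conserved by construction.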

    Tensions and paradoxes in electronic patient record research: a systematic literature review using the meta-narrative method

    Background: The extensive and rapidly expanding research literature on electronic patient records (EPRs) presents challenges to systematic reviewers. This literature is heterogeneous and at times conflicting, not least because it covers multiple research traditions with different underlying philosophical assumptions and methodological approaches. Aim: To map, interpret and critique the range of concepts, theories, methods and empirical findings on EPRs, with a particular emphasis on the implementation and use of EPR systems. Method: Using the meta-narrative method of systematic review, and applying search strategies that took us beyond the Medline-indexed literature, we identified over 500 full-text sources. We used ‘conflicting’ findings to address higher-order questions about how the EPR and its implementation were differently conceptualised and studied by different communities of researchers. Main findings: Our final synthesis included 24 previous systematic reviews and 94 additional primary studies, most of the latter from outside the biomedical literature. A number of tensions were evident, particularly in relation to: [1] the EPR (‘container’ or ‘itinerary’); [2] the EPR user (‘information-processor’ or ‘member of socio-technical network’); [3] organizational context (‘the setting within which the EPR is implemented’ or ‘the EPR-in-use’); [4] clinical work (‘decision-making’ or ‘situated practice’); [5] the process of change (‘the logic of determinism’ or ‘the logic of opposition’); [6] implementation success (‘objectively defined’ or ‘socially negotiated’); and [7] complexity and scale (‘the bigger the better’ or ‘small is beautiful’).
Findings suggest that integration of EPRs will always require human work to re-contextualize knowledge for different uses; that whilst secondary work (audit, research, billing) may be made more efficient by the EPR, primary clinical work may be made less efficient; that paper, far from being technologically obsolete, currently offers greater ecological flexibility than most forms of electronic record; and that smaller systems may sometimes be more efficient and effective than larger ones. Conclusions: The tensions and paradoxes revealed in this study extend and challenge previous reviews, and suggest that the evidence base for some EPR programs is more limited than is often assumed. We offer this paper as a preliminary contribution to a much-needed debate on this evidence and its implications, and suggest avenues for new research.

    A semiotic polyocular framework for multidisciplinary research in relation to multifunctional farming and rural development

    The concept of multifunctional farming arises out of a problematization of the role of agriculture in society and, in particular, in relation to rural development. Hitherto, multifunctional farming has primarily been used as a notion describing the relationship between agriculture and society in terms of the range of commodity and non-commodity goods that farms provide for society. But agro-economic achievements, together with societal development, have led to a point where praxis is questioned and discourse potentially reopened. In an indirect way, the notion of multifunctionality reflects that aspects not captured by the distinction between commodity and non-commodity need to be reintroduced. This paper offers a new framework (theoretical and methodical) suggesting a poly-ocular multidisciplinary approach and a constructivist semiotic understanding of multifunctionality, which supports dialogue and interactions between the approaches involved. Each research perspective has its own construction of the object of ‘farming’ and the ‘environment’ of farming, and thereby also its own perception of the functions and problems of farming. It therefore comes as no surprise that problems of communication are experienced between different perspectives, or that confusion over shared notions can cause frustrations and difficulties for multidisciplinary studies of multifunctionality. The present framework introduces a notion of multifunctionality which enables the explicit handling of different perspectives by way of a distinction between the ‘immediate object’, as it appears to the observer, and the ‘dynamical object’, which represents the potentiality of the object in itself. From such a semiotic point of view, the notion of multifunctionality becomes genuinely multidisciplinary. Multifunctionality cannot be reduced to and included in one perspective, but has to be observed as a second-order observation that involves reflexive communication between different perspectives and disciplines.

    Engineering failure analysis and design optimisation with HiP-HOPS

    The scale and complexity of computer-based safety critical systems, like those used in the transport and manufacturing industries, pose significant challenges for failure analysis. Over the last decade, research has focused on automating this task. In one approach, predictive models of system failure are constructed from the topology of the system and local component failure models using a process of composition. An alternative approach employs model-checking of state automata to study the effects of failure and verify system safety properties. In this paper, we discuss these two approaches to failure analysis. We then focus on Hierarchically Performed Hazard Origin & Propagation Studies (HiP-HOPS) - one of the more advanced compositional approaches - and discuss its capabilities for automatic synthesis of fault trees, combinatorial Failure Modes and Effects Analyses, and reliability versus cost optimisation of systems via application of automatic model transformations. We summarise these contributions and demonstrate the application of HiP-HOPS on a simplified fuel oil system for a ship engine. In light of this example, we discuss strengths and limitations of the method in relation to other state-of-the-art techniques. In particular, because HiP-HOPS is deductive in nature, relating system failures back to their causes, it is less prone to combinatorial explosion and can more readily be iterated. For this reason, it enables exhaustive assessment of combinations of failures and design optimisation using computationally expensive meta-heuristics. (C) 2010 Elsevier Ltd. All rights reserved.
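The fault-tree synthesis this abstract describes ultimately yields minimal cut sets: the smallest combinations of component failures that cause the system-level failure. The toy evaluator below is our own illustration of that idea, not the HiP-HOPS tool; the fuel-oil-system component names and failure logic are hypothetical.

```python
from itertools import combinations

# Hypothetical fuel-oil system: engine fuel starvation occurs if the pump
# fails, or if both the main filter and the bypass filter are blocked.
BASIC_EVENTS = ["pump_fail", "main_filter_blocked", "bypass_filter_blocked"]

def top_event(failed):
    """System failure logic: OR(pump_fail, AND(main_filter, bypass_filter))."""
    return ("pump_fail" in failed) or (
        "main_filter_blocked" in failed and "bypass_filter_blocked" in failed)

def minimal_cut_sets(events, top):
    """Enumerate failure combinations by increasing size; keep only those not
    already covered by a smaller cut set (hence 'minimal')."""
    cuts = []
    for size in range(1, len(events) + 1):
        for combo in combinations(events, size):
            s = set(combo)
            if top(s) and not any(c <= s for c in cuts):
                cuts.append(s)
    return cuts
```

For this toy system the analysis yields two minimal cut sets: the single-point failure `{pump_fail}` and the double failure `{main_filter_blocked, bypass_filter_blocked}`. Real compositional tools derive the failure logic automatically from component models and system topology rather than from a hand-written `top_event` function, and use far more scalable algorithms than exhaustive enumeration.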

    Telecommunications Network Planning and Maintenance

    Telecommunications network operators are constantly challenged to provide new services that require ubiquitous broadband access. In attempting to do so, they are faced with many problems, such as ensuring network coverage or providing the guaranteed Quality of Service (QoS). Network planning is a multi-objective optimization problem which involves clustering the area of interest by minimizing a cost function that includes relevant parameters, such as installation cost, distance between user and base station, supported traffic, and quality of received signal. On the other hand, service assurance deals with the disorders that occur in the hardware or software of the managed network. This paper presents a large number of multicriteria techniques that have been developed to deal with different kinds of problems regarding network planning and service assurance. The state of the art presented will help the reader develop a broader understanding of the problems in the domain.
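The cost function described in this abstract can be sketched concretely. The example below is our own minimal illustration, not taken from the paper: it trades base-station installation cost against weighted user-to-station distance, with all coordinates, weights, and costs being hypothetical, and solves the tiny instance by exhaustive search (real planning problems need meta-heuristics or other multicriteria methods).

```python
from itertools import combinations
from math import dist

# Hypothetical users, candidate base-station sites, and cost weights.
USERS = [(0, 0), (1, 1), (8, 8), (9, 7)]
SITES = [(0, 1), (9, 8), (5, 5)]
INSTALL_COST = 10.0     # cost of opening one base station
DISTANCE_WEIGHT = 2.0   # cost per unit user-to-station distance

def plan_cost(open_sites):
    """Installation cost plus weighted Euclidean distance from each user
    to its nearest open station."""
    return (INSTALL_COST * len(open_sites) +
            DISTANCE_WEIGHT * sum(min(dist(u, s) for s in open_sites)
                                  for u in USERS))

def best_plan():
    """Exhaustive search over all non-empty subsets of candidate sites."""
    subsets = (c for r in range(1, len(SITES) + 1)
               for c in combinations(SITES, r))
    return min(subsets, key=plan_cost)
```

On this toy instance the optimum opens the two stations near the two user clusters and skips the central site: paying a second installation cost is cheaper than the long hauls a single central station would impose.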

    Ribosomal trafficking is reduced in Schwann cells following induction of myelination.

    Local synthesis of proteins within the Schwann cell periphery is extremely important for efficient process extension and myelination, when cells undergo dramatic changes in polarity and geometry. Still, it is unclear how ribosomal distributions are developed and maintained within Schwann cell projections to sustain local translation. In this multi-disciplinary study, we expressed a plasmid encoding a fluorescently labeled ribosomal subunit (L4-GFP) in cultured primary rat Schwann cells. This enabled the generation of high-resolution, quantitative data on ribosomal distributions and trafficking dynamics within Schwann cells during early stages of myelination, induced by ascorbic acid treatment. Ribosomes were distributed throughout Schwann cell projections, with ~2-3 bright clusters along each projection. Clusters emerged within 1 day of culture and were maintained throughout early stages of myelination. Three days after induction of myelination, net ribosomal movement remained anterograde (directed away from the Schwann cell body), but ribosomal velocity decreased to about half the levels of the untreated group. Statistical and modeling analysis provided additional insight into key factors underlying ribosomal trafficking. Multiple regression analysis indicated that net transport at early time points was dependent on anterograde velocity, but shifted to dependence on anterograde duration at later time points. A simple, data-driven rate kinetics model suggested that the observed decrease in net ribosomal movement was primarily dictated by an increased conversion of anterograde particles to stationary particles, rather than changes in other directional parameters. 
These results reveal the strength of a combined experimental and theoretical approach in examining protein localization and transport, and provide evidence of an early establishment of ribosomal populations within Schwann cell projections, with a reduction in trafficking following initiation of myelination.
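The rate-kinetics idea behind the abstract's central finding can be illustrated with a simple two-state sketch (our own construction, not the paper's fitted model): particles interconvert between anterograde and stationary states by first-order kinetics, and net movement falls when the anterograde-to-stationary conversion rate rises. All rates and the velocity are hypothetical.

```python
def net_velocity(k_as, k_sa, v_antero=1.0, steps=20000, dt=1e-3):
    """Euler integration of first-order exchange between an anterograde and
    a stationary pool; net velocity = anterograde fraction * anterograde
    speed. k_as: anterograde->stationary rate; k_sa: the reverse rate."""
    antero, stationary = 1.0, 0.0   # start with all particles moving
    for _ in range(steps):
        flow = k_as * antero - k_sa * stationary
        antero -= dt * flow
        stationary += dt * flow
    return v_antero * antero
```

At steady state the anterograde fraction is `k_sa / (k_as + k_sa)`, so tripling the conversion rate into the stationary pool (as the abstract suggests happens after induction of myelination) halves the net velocity from 0.5 to 0.25 in this toy parameterization, without any change in the speed of the particles that are still moving.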

    The Expanded Very Large Array

    In almost 30 years of operation, the Very Large Array (VLA) has proved to be a remarkably flexible and productive radio telescope. However, the basic capabilities of the VLA have changed little since it was designed. A major expansion utilizing modern technology is currently underway to improve the capabilities of the VLA by at least an order of magnitude in both sensitivity and in frequency coverage. The primary elements of the Expanded Very Large Array (EVLA) project include new or upgraded receivers for continuous frequency coverage from 1 to 50 GHz; new local oscillator, intermediate frequency, and wide bandwidth data transmission systems to carry signals with 16 GHz total bandwidth from each antenna; and a new digital correlator with the capability to process this bandwidth with an unprecedented number of frequency channels for an imaging array. Also included are a new monitor and control system and new software that will provide telescope ease of use. Scheduled for completion in 2012, the EVLA will provide the world research community with a flexible, powerful, general-purpose telescope to address current and future astronomical issues. Comment: Added journal reference: published in Proceedings of the IEEE, Special Issue on Advances in Radio Astronomy, August 2009, vol. 97, no. 8, pp. 1448-1462. Six figures, one table.