440 research outputs found

    Simultaneous X-ray Video-Fluoroscopy and Pulsed Ultrasound Velocimetry Analyses of the Pharyngeal Phase of Swallowing of Boluses with Different Rheological Properties

    The Ultrasound Velocity Profiling (UVP) technique allows real-time, non-invasive flow mapping of a fluid along a 1D measuring line. This study explores the possibility of using the UVP technique together with X-ray video-fluoroscopy (XVF) to elucidate the deglutition process, with a focus on bolus rheology. By positioning the UVP probe so that the pulsed ultrasonic beam passes behind the air-filled trachea, the bolus flow in the pharynx can be measured. Healthy subjects in a clinical study swallowed fluids with different rheological properties: Newtonian (constant shear viscosity, non-elastic); Boger (constant shear viscosity, elastic); and shear-thinning (shear-rate-dependent shear viscosity, elastic). The results from both UVP and XVF reveal the highest velocities for the shear-thinning fluid, followed by the Boger and Newtonian fluids, demonstrating that the UVP method is equally sensitive to the velocities of fluids with different rheological properties. The velocity of the contraction wave that clears the pharynx was also measured with UVP and found to be independent of bolus rheology. The results show that UVP not only accurately assesses the fluid velocity in a bolus flow, but can also monitor the structural changes that take place in response to a bolus flow, with the added advantage of being a completely non-invasive technique that does not require the introduction of contrast media.
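    The velocity estimate underlying pulsed-ultrasound techniques such as UVP follows the standard pulsed-Doppler relation v = c·f_d / (2·f_0·cos θ). The sketch below is a textbook illustration of that relation, not code from the study; the probe frequency, angle, and Doppler shift used in the example are purely hypothetical.

```python
import math

def uvp_velocity(f_doppler_hz, f_emit_hz, theta_deg, c=1540.0):
    """Axial flow velocity (m/s) from a pulsed-Doppler shift.

    c is the speed of sound in soft tissue (~1540 m/s); theta is the
    angle between the ultrasound beam and the flow direction.
    """
    return c * f_doppler_hz / (2.0 * f_emit_hz * math.cos(math.radians(theta_deg)))

# Hypothetical example: a 2 MHz probe at 45 degrees measuring a 1 kHz shift
v = uvp_velocity(1e3, 2e6, 45.0)  # roughly 0.54 m/s
```

    In practice a UVP instrument applies this relation gate-by-gate along the measuring line to build the 1D velocity profile described above.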

    Integrative Managerial Capabilities; a New Managerial Mechanism to Achieve a Firm Competitive Response

    Aims: Companies operating in competitive business environments must be able to sustain competitive responses over time. Mounting such responses, however, typically requires the managerial capacity to continually integrate the firm's resources so that they remain matched to changing market needs. Drawing on the knowledge management and dynamic managerial capabilities literatures, this paper contributes to our understanding by developing the concept of Integrative Managerial Capabilities, which refers to “managers’ ability to orchestrate a firm’s resource base through the processes of search, selection, configuration and deployment to achieve and sustain a competitive response”. Integration mechanisms are described that provide a key managerial capability. Within the strategic management literature, few past studies have focused on facets of the firm's integrative capabilities, and there is a notable lack of comprehensive insight into how companies actually orchestrate, or "integrate", resources to achieve sustained success in complex environments. Method/Approach: To answer the research questions, the study employed two data collection strategies: qualitative semi-structured interviews and an open-ended online questionnaire. Findings: The relationship between managerial integrative capabilities and the various levels of a firm's systemic structure, in particular, has never been clearly explained. The research concludes that the integration process is highly relevant to the case study examples, and that middle management has a substantial effect on organisational change as a result of integration with top and lower-level management.
    Originality: This study argues that better and faster integration among the three main integrative mechanisms (search and selection, configuration, and deployment) can itself be a source of sustained competitive response, and that such integration will usually need to traverse the high, middle, and low levels of the management structure. Implications: We recommend that future research examine managerial integration processes, such as particular types of transition programmes.

    Early Results from NASA's Assessment of Satellite Servicing

    Following recommendations by the NRC, NASA's FY 2008 Authorization Act and the FY 2009 and 2010 Appropriations bills directed NASA to assess the use of the human spaceflight architecture to service existing and future observatory-class scientific spacecraft. This interest in satellite servicing, with astronauts and/or robots, reflects the success that NASA achieved with the Shuttle program and HST on behalf of the astronomical community, as well as the successful construction of the ISS. This study, led by NASA GSFC, will last about a year, leading to a final report to NASA and Congress in autumn 2010. We will report on its status, results from our March satellite-servicing workshop, and recent concepts for serviceable scientific missions.

    Buckling instability of crown sealing

    Despite the scholarly fascination with water entry of spheres for well over a century [1], we present a new observation, namely the crown-buckling instability. This instability is characterized by striations appearing near the top of the crown walls just prior to the surface seal, as shown in Fig. 1(a). The crown wall collapses inward due to the pressure differential across the wall created by the moving air in the wake of the sphere and the surface tension within the crown. Since the rate of collapse is faster than the rate at which fluid drains out of the neck region, fluid collects in the striations and the crown buckles. The wall is slightly thicker along these striations than in between them, where the films are more susceptible to the air flow and are drawn inward into the crown interior, developing into bag-like structures (Figs. 1(a) and 1(b)) that ultimately atomize, causing a fine spray inside the crown. Under atmospheric conditions, this typically occurs within 5 ms after impact.

    Pneumothorax detection in chest radiographs: optimizing artificial intelligence system for accuracy and confounding bias reduction using in-image annotations in algorithm training

    OBJECTIVES Diagnostic accuracy of artificial intelligence (AI) pneumothorax (PTX) detection in chest radiographs (CXR) is limited by the noisy annotation quality of public training data and by confounding thoracic tubes (TT). We hypothesize that in-image annotations of the dehiscent visceral pleura during algorithm training boost the algorithm's performance and suppress confounders. METHODS Our single-center evaluation cohort of 3062 supine CXRs includes 760 PTX-positive cases with radiological annotations of PTX size and inserted TTs. Three step-by-step improved algorithms (differing in architecture, in training data from public datasets/clinical sites, and in whether in-image annotations were included in training) were characterized by the area under the receiver operating characteristic curve (AUROC) in detailed subgroup analyses and referenced to the well-established "CheXNet" algorithm. RESULTS Performance of established algorithms trained exclusively on publicly available data without in-image annotations is limited to AUROCs of 0.778 and strongly biased towards TTs, which can completely eliminate the algorithm's discriminative power in individual subgroups. By contrast, our final "algorithm 2", trained on fewer images but additionally with in-image annotations of the dehiscent pleura, achieved an overall AUROC of 0.877 for unilateral PTX detection with a significantly reduced TT-related confounding bias. CONCLUSIONS We demonstrated strong limitations of an established PTX-detecting AI algorithm that can be significantly reduced by designing an AI system capable of learning to both classify and localize PTX. Our results draw attention to the necessity of high-quality in-image localization in training data to reduce the risk of unintentionally biasing the training of pathology-detecting AI algorithms.
    KEY POINTS
    • Established pneumothorax-detecting artificial intelligence algorithms trained on public training data are strongly limited and biased by confounding thoracic tubes.
    • We used high-quality in-image annotated training data to effectively boost algorithm performance and suppress the impact of confounding thoracic tubes.
    • Based on our results, we hypothesize that even hidden confounders might be effectively addressed by in-image annotations of pathology-related image features.
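    The AUROC values reported above can be computed with a simple rank-based (Mann-Whitney) estimator; the sketch below illustrates the metric itself, with purely illustrative labels and scores rather than data from the study.

```python
def auroc(labels, scores):
    """Rank-based AUROC: the probability that a randomly chosen positive
    case is scored higher than a randomly chosen negative case,
    counting ties as half a win."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Illustrative example: 2 positives, 2 negatives, one mis-ranked pair
auroc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8])  # -> 0.75
```

    Subgroup analyses such as those in the study simply recompute this quantity on subsets of cases (e.g. only cases with thoracic tubes), which is how confounder-driven collapses in discriminative power become visible.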

    On Quantum Markov Chains on Cayley tree II: Phase transitions for the associated chain with XY-model on the Cayley tree of order three

    In the present paper we study forward Quantum Markov Chains (QMC) defined on a Cayley tree. Using the tree structure of the graph, we give a construction of quantum Markov chains on a Cayley tree. By means of such constructions we prove the existence of a phase transition for the XY-model on a Cayley tree of order three in the QMC scheme. By a phase transition we mean the existence of two non-quasi-equivalent QMCs for the given family of interaction operators $\{K\}$.
    Comment: 34 pages, 1 figure

    Harnessing entropy to direct the bonding/debonding of polymer systems based on reversible chemistry

    The widely accepted approach for controlling polymer debonding/rebonding properties in responsive materials has been to purposefully engineer the functional end-groups responsible for dynamic monomer bonding. Here, however, we evidence that the debonding…

    Model refactoring by example: A multi‐objective search based software engineering approach

    Declarative rules are frequently used in model refactoring to detect refactoring opportunities and to apply the appropriate ones. However, a large number of rules is required to obtain a complete specification of refactoring opportunities, and companies have usually accumulated examples of refactorings from past maintenance experience. Based on these observations, we treat model refactoring as a multi-objective problem: we suggest refactoring sequences that maximize both structural and textual similarity between a given model (the model to be refactored) and a set of poorly designed models in the base of examples (models that have undergone some refactorings), while minimizing the structural similarity between the given model and a set of well-designed models in the base of examples (models that do not need any refactoring). To this end, we use the Non-dominated Sorting Genetic Algorithm (NSGA-II) to find a set of representative Pareto-optimal solutions that present the best trade-off between the structural and textual similarities of models. The validation results, based on 8 real-world models taken from open-source projects, confirm the effectiveness of our approach, yielding refactoring recommendations with an average correctness of over 80%. In addition, our approach outperforms 5 state-of-the-art refactoring approaches.
    Peer Reviewed
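    The core selection step of NSGA-II is non-dominated sorting: keeping solutions that no other solution beats on every objective at once. The sketch below shows only that first-front extraction on toy objective tuples (e.g. hypothetical structural- and textual-similarity scores, both to be maximized); it is a generic illustration, not the authors' implementation.

```python
def dominates(a, b):
    """a dominates b if a is no worse on every objective and strictly
    better on at least one (all objectives maximized here)."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def pareto_front(points):
    """First non-dominated front: solutions not dominated by any other."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# Toy (structural_similarity, textual_similarity) scores for three
# candidate refactoring sequences; (0.0, 0.0) is dominated by both others.
front = pareto_front([(1.0, 2.0), (2.0, 1.0), (0.0, 0.0)])  # -> [(1.0, 2.0), (2.0, 1.0)]
```

    A full NSGA-II run additionally ranks the remaining fronts and uses crowding distance to keep the front well spread, but the trade-off solutions the paper reports all come from such a first front.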

    Discovering study-specific gene regulatory networks

    This article has been made available through the Brunel Open Access Publishing Fund. Microarrays are commonly used in biology because of their ability to simultaneously measure thousands of genes under different conditions. Because of their structure, typically with many variables but far fewer samples, scalable network analysis techniques are often employed. In particular, consensus approaches have recently been used that combine multiple microarray studies in order to find networks that are more robust. The purpose of this paper, however, is to combine multiple microarray studies to automatically identify subnetworks that are distinctive to specific experimental conditions rather than common to them all. To better understand key regulatory mechanisms and how they change under different conditions, we derive unique networks from multiple independent networks built using the graphical lasso (glasso), which goes beyond standard correlations. This involves calculating cluster prediction accuracies to detect the most predictive genes for a specific set of conditions. We differentiate between accuracies calculated using cross-validation within a selected cluster of studies (the intra prediction accuracy) and those calculated on a set of independent studies belonging to different study clusters (the inter prediction accuracy). Finally, we compare our method's results to related state-of-the-art techniques. We explore how the proposed pipeline performs on both synthetic data and real data (wheat and Fusarium). Our results show that subnetworks specific to subsets of studies can be identified reliably, and that these networks reflect key mechanisms that are fundamental to the experimental conditions in each of those subsets.

    Enabling trade in Gene-Edited produce in Asia and Australasia: The developing regulatory landscape and future perspectives

    Genome- or gene-editing (abbreviated here as ‘GEd’) presents great opportunities for crop improvement. This is especially so for the countries in the Asia-Pacific region, which is home to more than half of the world’s growing population. A brief description of the science of gene-editing is provided, with examples of GEd products. For the benefits of GEd technologies to be realized, international policy and regulatory environments must be clarified; otherwise, non-tariff trade barriers will result. The status of regulations that relate to GEd crop products in Asian countries and Australasia is described, together with relevant definitions and the responsible regulatory bodies. The regulatory landscape is changing rapidly: in some countries the regulations are clear, in others they are developing, and some countries have yet to develop appropriate policies. There is clearly a need for harmonization or alignment of GEd regulations in the region: this will promote the path-to-market and enable the benefits of GEd technologies to reach end-users.