2,232 research outputs found

    Modeling the human as a controller in a multitask environment

    Modeling the human as a controller of slowly responding systems with preview is considered. Along with control tasks, discrete noncontrol tasks occur at irregular intervals. In multitask situations such as these, it has been observed that humans tend to apply piecewise constant controls. It is believed that the magnitude of controls and the durations for which they remain constant depend directly on the system bandwidth, preview distance, complexity of the trajectory to be followed, and nature of the noncontrol tasks. A simple heuristic model of human control behavior in this situation is presented. The results of a simulation study, whose purpose was to determine the sensitivity of the model to its parameters, are discussed.
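
    The sketch below is a minimal illustration of the kind of piecewise-constant control behaviour described above: a slowly responding first-order system tracks a previewed trajectory, and the control value is re-chosen only at discrete instants and held constant in between. The system dynamics, the proportional decision rule, and all parameter values are assumptions for illustration, not the paper's model.

        import numpy as np

        def simulate(T=60.0, dt=0.05, a=0.2, b=1.0, preview=2.0, hold=1.5):
            """Piecewise-constant control of a slowly responding first-order system
            x' = -a*x + b*u tracking a previewed trajectory.  The control is re-chosen
            only every `hold` seconds from the previewed error (illustrative rule)."""
            n = int(round(T / dt))
            t = np.arange(n) * dt
            target = np.sin(0.1 * t)                     # trajectory to be followed
            x = np.zeros(n)
            u_now, next_update = 0.0, 0.0
            for k in range(n - 1):
                if t[k] >= next_update:                  # decision instants are discrete
                    k_prev = min(n - 1, k + int(round(preview / dt)))
                    u_now = (target[k_prev] - x[k]) / (b * preview)   # crude proportional rule
                    next_update = t[k] + hold            # held constant until the next decision
                x[k + 1] = x[k] + dt * (-a * x[k] + b * u_now)
            return np.sqrt(np.mean((x - target) ** 2))

        print("rms tracking error:", round(simulate(), 3))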

    Supervisory sampling and control: Sources of suboptimality in a prediction task

    A process supervisor is defined as a person who decides when to sample the process input and what values of a control variable to specify in order to maximize (minimize) a given value function of input sampling period, control setting, and process state. Experimental data are presented for such a process, in which the value function is a time-averaged sampling cost plus the mean squared difference between input and control variable. The task was unpaced prediction of the output of a second-order filter driven by white noise. Experimental results, when compared to the optimal strategy, reveal several consistently suboptimal behaviors. One is a tendency not to choose a long prediction interval even though the optimal strategy dictates that one should. Some results are also interpreted in terms of those input parameters according to which each subject's behavior would have been nearest optimal. Differences of those parameters from the actual input parameters served to quantify how subjects' prediction behavior differed from optimal.
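
    As a rough illustration of the value function described above (a time-averaged sampling cost plus the mean squared difference between input and control), the sketch below generates the output of a discretised second-order filter driven by white noise and evaluates a zero-order-hold sampling strategy for several sampling periods. The filter parameters, sampling cost, and hold rule are assumptions, not the experiment's actual settings.

        import numpy as np

        rng = np.random.default_rng(0)

        def second_order_filter(n, dt=0.1, wn=0.5, zeta=0.7):
            """Discretised second-order filter driven by white noise (the input process
            to be predicted; parameter values are illustrative)."""
            x = np.zeros((n, 2))
            for k in range(n - 1):
                acc = -2.0 * zeta * wn * x[k, 1] - wn**2 * x[k, 0] + rng.normal()
                x[k + 1, 0] = x[k, 0] + dt * x[k, 1]
                x[k + 1, 1] = x[k, 1] + dt * acc
            return x[:, 0]

        def value(signal, sample_period, cost_per_sample=0.2, dt=0.1):
            """Time-averaged sampling cost plus the mean squared difference between the
            input and a zero-order-hold control set to the last sampled input value."""
            step = max(1, int(round(sample_period / dt)))
            control = np.repeat(signal[::step], step)[:len(signal)]
            return cost_per_sample / sample_period + np.mean((signal - control) ** 2)

        y = second_order_filter(5000)
        for period in (0.5, 1.0, 2.0, 4.0):
            print(f"sampling period {period} s: value = {value(y, period):.3f}")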

    Methods to decrease blood loss during liver resection: a network meta-analysis

    BACKGROUND: Liver resection is a major surgery with significant mortality and morbidity. Specialists have tested various methods in attempts to limit blood loss, transfusion requirements, and morbidity during elective liver resection. These methods include different approaches (anterior versus conventional approach), use of autologous blood donation, cardiopulmonary interventions such as hypoventilation, low central venous pressure, different methods of parenchymal transection, different methods of management of the raw surface of the liver, different methods of vascular occlusion, and different pharmacological interventions. A surgeon typically uses only one of the methods from each of these seven categories. The optimal method to decrease blood loss and transfusion requirements in people undergoing liver resection is unknown. OBJECTIVES: To assess the effects of different interventions for decreasing blood loss and blood transfusion requirements during elective liver resection. SEARCH METHODS: We searched the Cochrane Central Register of Controlled Trials (CENTRAL), MEDLINE, Embase, and Science Citation Index Expanded to September 2015 to identify randomised clinical trials. We also searched trial registers and handsearched the reference lists of identified trials. SELECTION CRITERIA: We included only randomised clinical trials (irrespective of language, blinding, or publication status) comparing different methods of decreasing blood loss and blood transfusion requirements in people undergoing liver resection. DATA COLLECTION AND ANALYSIS: Two review authors independently identified trials and collected data. We assessed the risk of bias using Cochrane domains. We conducted a Bayesian network meta-analysis using the Markov chain Monte Carlo method in WinBUGS 1.4, following the guidelines of the National Institute for Health and Care Excellence Decision Support Unit guidance documents. We calculated the odds ratios (OR) with 95% credible intervals (CrI) for the binary outcomes, mean differences (MD) with 95% CrI for continuous outcomes, and rate ratios with 95% CrI for count outcomes, using a fixed-effect model or random-effects model according to model fit. We assessed the evidence with GRADE. MAIN RESULTS: We identified 67 randomised clinical trials involving a total of 6197 participants. All the trials were at high risk of bias. A total of 5771 participants from 64 trials provided data for one or more outcomes included in this review. There was no evidence of differences in most of the comparisons, and where there was, these differences were in single trials, mostly of small sample size. We summarise only the evidence that was available in more than one trial below. Of the primary outcomes, the only one with evidence of a difference from more than one trial under the pair-wise comparison was in the number of adverse events (complications), which was higher with radiofrequency dissecting sealer than with the clamp-crush method (rate ratio 1.85, 95% CrI 1.07 to 3.26; 250 participants; 3 studies; very low-quality evidence).
    Among the secondary outcomes, the only differences we found from more than one trial under the pair-wise comparison were the following: blood transfusion (proportion) was higher in the low central venous pressure group than in the acute normovolemic haemodilution plus low central venous pressure group (OR 3.19, 95% CrI 1.56 to 6.95; 208 participants; 2 studies; low-quality evidence); blood transfusion quantity (red blood cells) was lower in the fibrin sealant group than in the control (MD -0.53 units, 95% CrI -1.00 to -0.07; 122 participants; 2 studies; very low-quality evidence); blood transfusion quantity (fresh frozen plasma) was higher in the oxidised cellulose group than in the fibrin sealant group (MD 0.53 units, 95% CrI 0.36 to 0.71; 80 participants; 2 studies; very low-quality evidence); blood loss (MD -0.34 L, 95% CrI -0.46 to -0.22; 237 participants; 4 studies; very low-quality evidence), total hospital stay (MD -2.42 days, 95% CrI -3.91 to -0.94; 197 participants; 3 studies; very low-quality evidence), and operating time (MD -15.32 minutes, 95% CrI -29.03 to -1.69; 192 participants; 4 studies; very low-quality evidence) were lower with low central venous pressure than with control. For the other comparisons, the evidence for a difference was either based on single small trials or there was no evidence of differences. None of the trials reported health-related quality of life or time needed to return to work. AUTHORS' CONCLUSIONS: Paucity of data meant that we could not assess transitivity assumptions and inconsistency for most analyses. When direct and indirect comparisons were available, network meta-analysis provided additional effect estimates for comparisons where there were no direct comparisons. However, the paucity of data decreases the confidence in the results of the network meta-analysis. Low-quality evidence suggests that liver resection using a radiofrequency dissecting sealer may be associated with more adverse events than with the clamp-crush method. Low-quality evidence also suggests that the proportion of people requiring a blood transfusion is higher with low central venous pressure than with acute normovolemic haemodilution plus low central venous pressure; very low-quality evidence suggests that blood transfusion quantity (red blood cells) was lower with fibrin sealant than control; blood transfusion quantity (fresh frozen plasma) was higher with oxidised cellulose than with fibrin sealant; and blood loss, total hospital stay, and operating time were lower with low central venous pressure than with control. There is no evidence to suggest that using special equipment for liver resection is of any benefit in decreasing the mortality, morbidity, or blood transfusion requirements (very low-quality evidence). Radiofrequency dissecting sealer should not be used outside the clinical trial setting since there is low-quality evidence for increased harm without any evidence of benefits. In addition, it should be noted that the sample size was small and the credible intervals were wide, and we cannot rule out considerable benefit or harm with a specific method of liver resection.
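
    For readers unfamiliar with credible intervals, the sketch below shows a minimal Bayesian pair-wise comparison of event counts in two trial arms, yielding an odds ratio with a 95% credible interval from conjugate Beta posteriors. It is far simpler than the review's WinBUGS network meta-analysis, and all counts are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(1)

        # Hypothetical 2-arm event counts (events / total); NOT data from the review.
        events_a, n_a = 12, 100    # comparator arm
        events_b, n_b = 22, 108    # intervention arm

        # Beta(1, 1) priors give conjugate Beta posteriors for each arm's event probability.
        p_a = rng.beta(1 + events_a, 1 + n_a - events_a, size=100_000)
        p_b = rng.beta(1 + events_b, 1 + n_b - events_b, size=100_000)

        odds_ratio = (p_b / (1 - p_b)) / (p_a / (1 - p_a))
        lo, hi = np.percentile(odds_ratio, [2.5, 97.5])
        print(f"posterior median OR = {np.median(odds_ratio):.2f}, 95% CrI {lo:.2f} to {hi:.2f}")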

    A viscoelastic – viscoplastic material model for superalloy applications

    An understanding of rate dependency over a wide range of time scales is vitally important in approximating the transient response of critical components operating in extreme environments. Many examples of viscoplastic model formulations can be found in the literature, wherein all rate dependency is assumed to occur after yielding. Such models neglect any viscous effects during elastic deformation. In the present work, a unified viscoelastic – viscoplastic material model is developed for the nickel superalloy RR1000. Particular emphasis is placed on model parameter determination, which is accomplished using standard cyclic plasticity and stress relaxation experimental data.
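
    The sketch below gives a minimal one-dimensional flavour of a combined viscoelastic-viscoplastic update: a Maxwell-type relaxation term acts at all stress levels, while a Perzyna-style overstress term adds rate-dependent flow beyond yield. The constitutive form and parameter values are generic assumptions and are not calibrated to RR1000 or to the model developed in the paper.

        import numpy as np

        def viscoelastic_viscoplastic(strain, dt, E=200e3, eta_ve=5e6,
                                      sigma_y=800.0, eta_vp=1e5, m=2.0):
            """1-D strain-driven stress update: a Maxwell-type viscoelastic relaxation
            term (active at all stress levels) plus a Perzyna-style viscoplastic
            overstress term beyond the yield stress.  Units are MPa and seconds; the
            parameters are generic, not calibrated to RR1000."""
            sigma = np.zeros_like(strain)
            for k in range(1, len(strain)):
                deps = strain[k] - strain[k - 1]
                # viscoelastic predictor: elastic increment plus stress relaxation
                s = sigma[k - 1] + E * deps - dt * (E / eta_ve) * sigma[k - 1]
                over = abs(s) - sigma_y
                if over > 0.0:                    # rate-dependent plastic flow above yield
                    deps_vp = dt * (over / eta_vp) ** m * np.sign(s)
                    s -= E * deps_vp
                sigma[k] = s
            return sigma

        t = np.linspace(0.0, 100.0, 2001)         # slow cyclic strain history
        strain = 0.01 * np.sin(0.05 * t)
        stress = viscoelastic_viscoplastic(strain, t[1] - t[0])
        print("peak stress (MPa):", round(float(stress.max()), 1))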

    Lateral Separation of Macromolecules and Polyelectrolytes in Microlithographic Arrays

    A new approach to separation of a variety of microscopic and mesoscopic objects in dilute solution is presented. The approach takes advantage of unique properties of a specially designed separation device (sieve), which can be readily built using already developed microlithographic techniques. Due to the broken reflection symmetry in its design, the direction of motion of an object in the sieve varies as a function of its self-diffusion constant, causing separation transverse to its direction of motion. This gives the device some significant and unique advantages over existing fractionation methods based on centrifugation and electrophoresis.
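
    A minimal caricature of the separation principle is sketched below: between successive rows of asymmetric obstacles a particle diffuses sideways, and the probability of being deflected by one array period, and with it the lateral drift, depends on its self-diffusion constant. The geometry is reduced to a single offset parameter and all dimensions are illustrative, not the device described in the paper.

        import math

        def mean_lateral_drift(D, v=100e-6, row_spacing=2e-6, gap_offset=0.4e-6, period=1.5e-6):
            """Geometric-ratchet caricature of an asymmetric sieve: during the transit
            time between obstacle rows a particle diffuses sideways; if it crosses the
            asymmetric barrier offset it is deflected by one array period, so the
            deflection probability depends on the diffusion constant D (all numbers
            are illustrative)."""
            tau = row_spacing / v                          # transit time between rows (s)
            sigma = math.sqrt(2.0 * D * tau)               # lateral diffusion per row (m)
            p_deflect = 0.5 * math.erfc(gap_offset / (sigma * math.sqrt(2.0)))
            return p_deflect * period / tau                # mean lateral velocity (m/s)

        # Slowly diffusing (larger) objects are deflected less, so species separate laterally.
        for D in (1e-12, 1e-11, 1e-10):                    # m^2/s
            print(f"D = {D:.0e} m^2/s -> lateral drift = {mean_lateral_drift(D) * 1e6:.2f} um/s")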

    An exploration of function analysis and function allocation in the commercial flight domain

    The applicability of functional analysis methods to support cockpit design is explored. Specifically, alternative techniques are studied for ensuring an effective division of responsibility between the flight crew and automation. A functional decomposition of the commercial flight domain is performed to provide the information necessary to support allocation decisions and to demonstrate a methodology for allocating functions to the flight crew or to automation. The function analysis employed 'bottom up' and 'top down' analyses and demonstrated the comparability of the identified functions, using the 'lift off' segment of the 'take off' phase as a test case. The normal flight mission and selected contingencies were addressed. Two alternative methods for using the functional description in the allocation of functions between man and machine were investigated, and the two methods were compared in order to ascertain their relative strengths and weaknesses. Finally, conclusions were drawn regarding the practical utility of function analysis methods.
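
    As a minimal sketch of how such a functional decomposition with crew/automation allocations might be represented, the snippet below builds a small function tree for a 'lift off' segment; the function names and allocation values are placeholders, not the study's actual decomposition.

        from dataclasses import dataclass, field
        from typing import List

        @dataclass
        class Function:
            """Node in a functional decomposition; allocation is 'crew', 'automation',
            or 'shared' (placeholder values, not the study's actual decomposition)."""
            name: str
            allocation: str = "unassigned"
            children: List["Function"] = field(default_factory=list)

            def walk(self, depth=0):
                yield depth, self
                for child in self.children:
                    yield from child.walk(depth + 1)

        takeoff = Function("take off", children=[
            Function("lift off", children=[
                Function("rotate at Vr", allocation="crew"),
                Function("monitor engine parameters", allocation="automation"),
                Function("maintain runway centreline", allocation="shared"),
            ]),
        ])

        for depth, fn in takeoff.walk():
            print("  " * depth + f"{fn.name}: {fn.allocation}")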

    Screening by symmetry of long-range hydrodynamic interactions of polymers confined in sheets

    Hydrodynamic forces may significantly affect the motion of polymers. In sheet-like cavities, such as the cell's cytoplasm and microfluidic channels, the hydrodynamic forces are long-range. It is therefore expected that hydrodynamic interactions will dominate the motion of polymers in sheets and will be manifested by Zimm-like scaling. Quite the opposite, we note here that although the hydrodynamic forces are long-range, their overall effect on the motion of polymers vanishes due to the symmetry of the two-dimensional flow. As a result, the predicted scaling of experimental observables such as the diffusion coefficient or the rotational diffusion time is Rouse-like, in accord with recent experiments. The effective screening validates the use of the non-interacting blobs picture for polymers confined in a sheet.
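
    The contrast between the two scaling regimes mentioned above can be summarised numerically; the sketch below compares the Rouse-like prediction (D ~ 1/N) with the Zimm-like one (D ~ N**-nu) for a few chain lengths. The prefactor and the good-solvent exponent nu ~ 0.588 are illustrative assumptions.

        import numpy as np

        def diffusion_scaling(N, D0=1.0, nu=0.588):
            """Illustrative centre-of-mass diffusion coefficients for an N-monomer chain:
            Rouse-like (hydrodynamic interactions effectively screened, as argued above
            for sheet confinement): D ~ 1/N; Zimm-like (unscreened 3-D hydrodynamics):
            D ~ 1/R_g ~ N**(-nu), with the good-solvent exponent nu ~ 0.588."""
            return D0 / N, D0 * N ** (-nu)

        for N in (10, 100, 1000):
            d_rouse, d_zimm = diffusion_scaling(N)
            print(f"N = {N:>4}:  D_Rouse ~ {d_rouse:.4f}   D_Zimm ~ {d_zimm:.4f}")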

    Predictive Model for Human-Unmanned Vehicle Systems

    Advances in automation are making it possible for a single operator to control multiple unmanned vehicles. However, the complex nature of these teams presents a difficult and exciting challenge for designers of human–unmanned vehicle systems. To build such systems effectively, models must be developed that describe the behavior of the human–unmanned vehicle team and that predict how alterations in team composition and system design will affect the system’s overall performance. In this paper, we present a method for modeling human–unmanned vehicle systems consisting of a single operator and multiple independent unmanned vehicles. Via a case study, we demonstrate that the resulting models provide an accurate description of observed human–unmanned vehicle systems. Additionally, we demonstrate that the models can be used to predict how changes in the human–unmanned vehicle interface and the unmanned vehicles’ autonomy alter the system’s performance.
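
    One common way to model a single operator serving several independent vehicles is as a single-server queue; the discrete-event sketch below takes that approach, with vehicles generating attention requests at random and the operator handling them one at a time. The queueing formulation and all rates are assumptions for illustration and are not the model presented in the paper.

        import heapq
        import random

        def simulate(n_vehicles=4, request_rate=0.015, mean_service=6.0, horizon=3600.0, seed=0):
            """Single operator serving independent unmanned vehicles, modelled as a
            single-server queue: each vehicle raises attention requests as a Poisson
            process (`request_rate` per second) and the operator services them one at
            a time.  All rates are placeholders, not values from the paper."""
            rng = random.Random(seed)
            events = [(rng.expovariate(request_rate), v) for v in range(n_vehicles)]
            heapq.heapify(events)
            operator_free_at, waits = 0.0, []
            while events:
                t, v = heapq.heappop(events)
                if t > horizon:
                    break
                start = max(t, operator_free_at)          # request waits if the operator is busy
                operator_free_at = start + rng.expovariate(1.0 / mean_service)
                waits.append(start - t)
                heapq.heappush(events, (t + rng.expovariate(request_rate), v))
            return sum(waits) / len(waits)

        for n in (2, 4, 8):
            print(f"{n} vehicles: mean wait for operator attention ~ {simulate(n_vehicles=n):.1f} s")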

    Single and double qubit gates by manipulating degeneracy

    A novel mechanism is proposed for single and double qubit state manipulations in quantum computation with four-fold degenerate energy levels. The principle is based on starting with a four-fold degeneracy, lifting it stepwise adiabatically by a set of control parameters, and performing the quantum gate operations on non-degenerate states. A particular realization of the proposed mechanism is suggested by using inductively coupled rf-SQUID loops in the macroscopic quantum tunnelling regime, where the energy eigenlevels are directly connected with the measurable flux states. The one qubit and two qubit controlled operations are demonstrated explicitly. The appearance of the flux states also allows precise read-in and read-out operations by the measurement of flux.
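
    The sketch below numerically illustrates the basic idea of lifting a four-fold degeneracy stepwise with two control parameters and then letting the non-degenerate levels accumulate distinguishable phases. The Hamiltonian is a generic diagonal toy model, not the rf-SQUID realization proposed in the paper.

        import numpy as np

        def toy_hamiltonian(eps1, eps2):
            """Four-fold degenerate level split stepwise by two control parameters:
            eps1 separates the two pairs, eps2 splits the levels within each pair.
            A generic diagonal toy model, not the paper's rf-SQUID Hamiltonian."""
            return np.diag([0.0, eps2, eps1, eps1 + eps2])

        # Step the controls adiabatically: switch eps1 on first, then eps2.
        for eps1, eps2 in [(0.0, 0.0), (0.5, 0.0), (1.0, 0.0), (1.0, 0.25), (1.0, 0.5)]:
            levels = np.linalg.eigvalsh(toy_hamiltonian(eps1, eps2))
            print(f"eps1={eps1:.2f} eps2={eps2:.2f} -> levels {np.round(levels, 2)}")

        # Once the degeneracy is fully lifted, holding the controls for a time t_hold
        # imprints relative phases exp(-i * E_k * t_hold) (hbar = 1), the basic
        # ingredient of phase-type gate operations on the non-degenerate states.
        t_hold = 2.0
        phases = np.exp(-1j * np.linalg.eigvalsh(toy_hamiltonian(1.0, 0.5)) * t_hold)
        print("accumulated phases:", np.round(phases, 3))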

    Polymer transport in random flow

    The dynamics of polymers in a random smooth flow is investigated in the framework of the Hookean dumbbell model. The analytical expression for the time-dependent probability density function of polymer elongation is derived explicitly for a Gaussian, rapidly changing flow. When polymers are in the coiled state, the pdf reaches a stationary state characterized by power-law tails both for small and large arguments compared to the equilibrium length. The characteristic relaxation time is computed as a function of the Weissenberg number. In the stretched state the pdf is non-stationary and exhibits multiscaling. Numerical simulations for the two-dimensional Navier-Stokes flow confirm the relevance of the theoretical results obtained for the delta-correlated model.
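
    A minimal numerical sketch of the Hookean dumbbell picture is given below: the end-to-end vector is integrated in a delta-correlated two-dimensional velocity gradient and the distribution of elongations is summarised. The noise amplitude, the velocity-gradient statistics, and all parameter values are illustrative assumptions, not the paper's calculation.

        import numpy as np

        rng = np.random.default_rng(2)

        def dumbbell_elongation(n_steps=100_000, dt=1e-3, tau=1.0, grad_amp=0.5, r_eq=1.0):
            """Hookean dumbbell end-to-end vector R in a delta-correlated 2-D flow:
            dR = (kappa . R - R / (2 tau)) dt + sqrt(r_eq**2 / tau) dW,
            with the velocity gradient kappa redrawn at every step and made traceless.
            grad_amp sets the stretching relative to relaxation; values are illustrative."""
            R = np.array([r_eq, 0.0])
            elong = np.empty(n_steps)
            for k in range(n_steps):
                kappa = rng.normal(0.0, grad_amp / np.sqrt(dt), size=(2, 2))
                kappa -= 0.5 * np.trace(kappa) * np.eye(2)         # keep the flow incompressible
                drift = kappa @ R - R / (2.0 * tau)
                noise = np.sqrt(r_eq**2 / tau * dt) * rng.normal(size=2)
                R = R + drift * dt + noise
                elong[k] = np.linalg.norm(R)
            return elong

        r = dumbbell_elongation()
        p50, p99 = np.percentile(r, [50, 99])
        print(f"median elongation {p50:.2f}, 99th percentile {p99:.2f}")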