1,016 research outputs found

    Organ printing as an information technology

    Funding Information: This work was sponsored by the São Paulo Research Foundation (FAPESP), the Brazilian Institute of Biofabrication (INCT-BIOFABRIS), and the National Council for Scientific and Technological Development (CNPq). Publisher Copyright: © 2015 Published by Elsevier Ltd.
    Organ printing is defined as the layer-by-layer, additive, robotic, computer-aided biofabrication of functional 3D organ constructs using self-assembling tissue spheroids according to a digital model. Information technology and computer-aided design software are instrumental in transforming virtual 3D bioimaging information about human tissues and organs into living biological reality during 3D bioprinting. Information technology enables design blueprints for the bioprinting of human organs as well as predictive computer simulation of both printing and post-printing processes. 3D bioprinting is now considered an emerging information technology, and the effective application of existing information technology tools, together with the development of new technological platforms such as human tissue and organ informatics, design automation, virtual human organs, a virtual organ biofabrication line, mathematical modeling, and predictive computer simulation of bioprinted tissue fusion and maturation, is an important technological imperative for advancing organ bioprinting.

    The Effects of Previous Misestimation of Task Duration on Estimating Future Task Duration

    It is a common time management problem that people underestimate the duration of tasks, which has been termed the "planning fallacy." To overcome this, it has been suggested that people should be informed about how long they previously worked on the same task. This study, however, tests whether previous misestimation also affects the duration estimation of a novel task, even if the feedback is only self-generated. To test this, two groups of participants performed two unrelated, laboratory-based tasks in succession. Learning was manipulated by permitting only the experimental group to retrospectively estimate the duration of the first task before predicting the duration of the second task. Results showed that the experimental group underestimated the duration of the second task less than the control group, which indicates a general kind of learning from previous misestimation. The findings imply that people could be trained to carefully observe how much they misestimate task duration in order to stimulate learning. The findings are discussed in relation to the anchoring account of task duration misestimation and the memory-bias account of the planning fallacy. © 2014 Springer Science+Business Media New York

    A frequentist framework of inductive reasoning

    Reacting against the limitation of statistics to decision procedures, R. A. Fisher proposed for inductive reasoning the use of the fiducial distribution, a parameter-space distribution of epistemological probability transferred directly from limiting relative frequencies rather than computed according to the Bayes update rule. The proposal is developed as follows using the confidence measure of a scalar parameter of interest. (With the restriction to one-dimensional parameter space, a confidence measure is essentially a fiducial probability distribution free of complications involving ancillary statistics.) A betting game establishes a sense in which confidence measures are the only reliable inferential probability distributions. The equality between the probabilities encoded in a confidence measure and the coverage rates of the corresponding confidence intervals ensures that the measure's rule for assigning confidence levels to hypotheses is uniquely minimax in the game. Although a confidence measure can be computed without any prior distribution, previous knowledge can be incorporated into confidence-based reasoning. To adjust a p-value or confidence interval for prior information, the confidence measure from the observed data can be combined with one or more independent confidence measures representing previous agent opinion. (The former confidence measure may correspond to a posterior distribution with frequentist matching of coverage probabilities.) The representation of subjective knowledge in terms of confidence measures rather than prior probability distributions preserves approximate frequentist validity. Comment: major revision
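    As a concrete illustration of the coverage-matching property invoked above, consider the textbook case of a normal mean with known standard deviation: the confidence measure given the data is the normal distribution centered at the sample mean with scale sigma/sqrt(n), and intervals formed from its quantiles attain the advertised frequentist coverage. The sketch below is a minimal Monte Carlo check of that property under these simplifying assumptions; it is not the paper's general construction, and the parameter values are arbitrary.

```python
# Minimal sketch: coverage matching of a confidence measure for a normal
# mean with known sigma. The confidence measure given data is
# N(xbar, sigma/sqrt(n)); its central 90% region is the usual 90% CI.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
theta_true, sigma, n = 1.3, 2.0, 25      # arbitrary illustration values
level = 0.90
alpha = (1.0 - level) / 2.0

trials, hits = 20000, 0
for _ in range(trials):
    x = rng.normal(theta_true, sigma, size=n)
    xbar = x.mean()
    # Quantiles of the confidence measure N(xbar, sigma/sqrt(n))
    lo, hi = norm.ppf([alpha, 1.0 - alpha], loc=xbar, scale=sigma / np.sqrt(n))
    hits += (lo <= theta_true <= hi)

print(f"empirical coverage ~ {hits / trials:.3f} (nominal {level})")
```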

    The Lick AGN Monitoring Project 2011: Reverberation Mapping of Markarian 50

    The Lick AGN Monitoring Project 2011 observing campaign was carried out over the course of 11 weeks in Spring 2011. Here we present the first results from this program, a measurement of the broad-line reverberation lag in the Seyfert 1 galaxy Mrk 50. Combined with supplemental observations obtained prior to the start of the main observing campaign, our dataset covers a total duration of 4.5 months. During this time, Mrk 50 was highly variable, exhibiting a maximum variability amplitude of a factor of 4 in the U-band continuum and a factor of 2 in the H-beta line. Using standard cross-correlation techniques, we find that H-beta and H-gamma lag the V-band continuum by tau_cen = 10.64(-0.93,+0.82) and 8.43(-1.28,+1.30) days, respectively, while the lag of He II 4686 is unresolved. The H-beta line exhibits a symmetric velocity-resolved reverberation signature with shorter lags in the high-velocity wings than in the line core, consistent with an origin in a broad-line region dominated by orbital motion rather than infall or outflow. Assuming a virial normalization factor of f=5.25, the virial estimate of the black hole mass is (3.2+-0.5)*10^7 solar masses. These observations demonstrate that Mrk 50 is among the most promising nearby active galaxies for detailed investigations of broad-line region structure and dynamics. Comment: Accepted for publication in ApJ Letters. 6 pages, 4 figures
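    The quoted black-hole mass follows from the standard reverberation-mapping virial relation M_BH = f c tau sigma^2 / G. Since the abstract does not quote the H-beta line width, the sketch below simply back-solves that relation for the implied velocity dispersion from the quoted lag, normalization factor, and mass, as an arithmetic illustration rather than a reproduction of the paper's analysis.

```python
# Minimal sketch of the reverberation-mapping virial relation
# M_BH = f * c * tau * sigma^2 / G, using the lag and f quoted above.
# The line-width sigma is not given in the abstract; it is back-derived
# here from the quoted mass purely as a consistency illustration.
G     = 6.674e-11          # gravitational constant [m^3 kg^-1 s^-2]
c     = 2.998e8            # speed of light [m/s]
M_sun = 1.989e30           # solar mass [kg]

f    = 5.25                # virial normalization factor (from abstract)
tau  = 10.64 * 86400.0     # H-beta lag, days -> seconds (from abstract)
M_bh = 3.2e7 * M_sun       # quoted black-hole mass [kg]

# Invert the virial relation for the implied velocity dispersion
sigma = (G * M_bh / (f * c * tau)) ** 0.5
print(f"implied H-beta velocity dispersion ~ {sigma / 1e3:.0f} km/s")
```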

    Category Theoretic Analysis of Hierarchical Protein Materials and Social Networks

    Materials in biology span all the scales from Angstroms to meters and typically consist of complex hierarchical assemblies of simple building blocks. Here we describe an application of category theory to describe structural and resulting functional properties of biological protein materials by developing so-called ologs. An olog is like a “concept web” or “semantic network” except that it follows a rigorous mathematical formulation based on category theory. This key difference ensures that an olog is unambiguous, highly adaptable to evolution and change, and suitable for sharing concepts with other ologs. We consider simple cases of beta-helical and amyloid-like protein filaments subjected to axial extension and develop an olog representation of their structural and resulting mechanical properties. We also construct a representation of a social network in which people send text messages to their nearest neighbors and act as a team to perform a task. We show that the olog for the protein and the olog for the social network feature identical category-theoretic representations, and we proceed to precisely explicate the analogy or isomorphism between them. The examples presented here demonstrate that the intrinsic nature of a complex system, which in particular includes a precise relationship between structure and function at different hierarchical levels, can be effectively represented by an olog. This, in turn, allows for comparative studies between disparate materials or fields of application and results in novel approaches to derive functionality in the design of de novo hierarchical systems. We discuss opportunities and challenges associated with the description of complex biological materials using ologs as a powerful tool for analysis and design in the context of materiomics, and we present the potential impact of this approach for engineering, life sciences, and medicine.
    Presidential Early Career Award for Scientists and Engineers (N000141010562); United States Army Research Office, Multidisciplinary University Research Initiative (W911NF0910541); United States Office of Naval Research (grant N000141010841); Massachusetts Institute of Technology, Department of Mathematics; Studienstiftung des deutschen Volkes; Clark Barwick; Jacob Lurie
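    To make the notion concrete, an olog can be modeled as a directed graph of typed boxes (objects) and labeled functional arrows (aspects), and two ologs share a category-theoretic representation when a relabeling of boxes carries the arrows of one onto those of the other. The sketch below uses invented box labels, not the paper's actual ologs, and illustrates only the data structure and a functor-style comparison.

```python
# Minimal sketch (invented labels, not the paper's ologs): an olog as a set
# of labeled arrows between typed boxes, plus a toy check that a proposed
# box correspondence carries one olog's arrows onto the other's.
protein_olog = {
    "is built from": ("a protein filament", "a structural building block"),
    "responds to":   ("a protein filament", "an applied axial load"),
}
social_olog = {
    "is built from": ("a text-messaging team", "a person"),
    "responds to":   ("a text-messaging team", "an assigned task"),
}

# Candidate correspondence between the boxes of the two ologs
box_map = {
    "a protein filament":          "a text-messaging team",
    "a structural building block": "a person",
    "an applied axial load":       "an assigned task",
}

def structurally_identical(olog_a, olog_b, obj_map):
    """True if mapping boxes via obj_map sends each labeled arrow of olog_a
    onto the identically labeled arrow of olog_b (a toy functor check)."""
    return all(
        label in olog_b and olog_b[label] == (obj_map[src], obj_map[dst])
        for label, (src, dst) in olog_a.items()
    )

print(structurally_identical(protein_olog, social_olog, box_map))  # True
```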

    The fully differential hadronic production of a Higgs boson via bottom quark fusion at NNLO

    The fully differential computation of the hadronic production cross section of a Higgs boson via bottom quarks is presented at NNLO in QCD. Several differential distributions with their corresponding scale uncertainties are presented for the 8 TeV LHC. This is the first application of the method of non-linear mappings for NNLO differential calculations at hadron colliders. Comment: 27 pages, 13 figures, 1 lego plot

    Modelling the impact of toxic and disturbance stress on white-tailed eagle (Haliaeetus albicilla) populations

    Several studies have related the breeding success and survival of sea eagles to toxic or non-toxic stress separately. In the present investigation, we analysed the single and combined impacts of toxic and disturbance stress on populations of the white-tailed eagle (Haliaeetus albicilla), using an analytical single-species model. Chemical and eco(toxico)logical data reported from laboratory and field studies were used to parameterise and validate the model. The model was applied to assess the impact of ∑PCB, DDE and disturbance stress on the white-tailed eagle population in the Netherlands. Disturbance stress was incorporated through a 1.6% reduction in survival and a 10–50% reduction in reproduction. ∑PCB contamination from 1950 up to 1987 was found to be too high to allow the return of the white-tailed eagle as a breeding species in that period. ∑PCB levels and population trends simulated for 2006–2050 suggest that future population growth is still reduced. Disturbance stress resulted in slower population development. The combination of toxic and disturbance stress produced effects ranging from slower population development to a catastrophic reduction in population size, with the main cause attributed to the 50% reduction in reproduction. Application of the model was restricted by the current lack of quantitative dose–response relationships between non-toxic stress and survival and reproduction. Nevertheless, the model provides a first step towards integrating and quantifying the impacts of multiple stressors on white-tailed eagle populations.
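    As a rough illustration of how such reductions propagate to population growth, the sketch below projects a generic two-stage (juvenile/adult) matrix model and compares asymptotic growth rates with and without the disturbance factors quoted above. The baseline vital rates are hypothetical placeholders, and this toy projection is far simpler than the analytical single-species model used in the study.

```python
# Minimal sketch: a two-stage (juvenile/adult) projection matrix for a
# long-lived raptor, used only to show how multiplicative survival and
# reproduction reductions change the asymptotic growth rate. Baseline
# vital rates are hypothetical placeholders, not the paper's values.
import numpy as np

def growth_rate(s_juv, s_ad, fecundity):
    """Dominant eigenvalue of a 2x2 Leslie-type projection matrix."""
    A = np.array([[0.0,   fecundity],   # reproduction by adults
                  [s_juv, s_ad     ]])  # survival into / within adult stage
    return np.abs(np.linalg.eigvals(A)).max()

s_juv, s_ad, fec = 0.70, 0.92, 0.55     # hypothetical baseline rates

baseline  = growth_rate(s_juv, s_ad, fec)
disturbed = growth_rate(s_juv * (1 - 0.016),   # 1.6% survival reduction
                        s_ad  * (1 - 0.016),
                        fec   * (1 - 0.50))    # worst-case 50% reproduction cut
print(f"baseline lambda ~ {baseline:.3f}, disturbed lambda ~ {disturbed:.3f}")
```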

    Ratio of the Isolated Photon Cross Sections at \sqrt{s} = 630 and 1800 GeV

    The inclusive cross section for the production of isolated photons has been measured in \pbarp collisions at \sqrt{s} = 630 GeV with the \D0 detector at the Fermilab Tevatron Collider. The photons span a transverse energy (E_T) range from 7 to 49 GeV and have pseudorapidity |\eta| < 2.5. This measurement is combined with the previous \D0 result at \sqrt{s} = 1800 GeV to form a ratio of the cross sections. Comparison of next-to-leading order QCD with the measured cross section at 630 GeV and with the ratio of cross sections shows satisfactory agreement over most of the E_T range. Comment: 7 pages. Published in Phys. Rev. Lett. 87, 251805 (2001)

    Computing the Viscosity of Supercooled Liquids: Markov Network Model

    The microscopic origin of the glass transition, where the liquid viscosity changes continuously by more than ten orders of magnitude, is challenging to explain from first principles. Here we describe the detailed derivation and implementation of a Markov network model to calculate the shear viscosity of deeply supercooled liquids based on numerical sampling of an atomistic energy landscape, which sheds some light on this transition. Shear stress relaxation is calculated from a master-equation description in which the system follows a transition-state pathway trajectory of hopping among local energy minima separated by activation barriers, which is in turn sampled by a metadynamics-based algorithm. A quantitative connection is established between the temperature variation of the calculated viscosity and the underlying potential energy and inherent stress landscape, showing that a different landscape topography or “terrain” is needed for low-temperature viscosity (of order 10^7 Pa·s) than for high-temperature viscosity (10^-5 Pa·s). Within this range our results clearly indicate the crossover from an essentially Arrhenius scaling behavior at high temperatures to a clearly super-Arrhenius (fragile) behavior at low temperatures for a Kob-Andersen binary liquid model. Experimentally, the manifestation of this crossover in atomic dynamics continues to raise questions concerning its fundamental origin. In this context, this work explicitly demonstrates that a temperature-dependent “terrain” characterizing different parts of the same potential energy surface is sufficient to explain the signature behavior of vitrification, while at the same time the notion of a temperature-dependent effective activation barrier is quantified.
    Corning Incorporated; Boston University Center for Scientific Computing and Visualization; National Science Foundation (U.S.) (grant DMR-1008104); National Science Foundation (U.S.) (grant DMR-0520020); United States Air Force Office of Scientific Research (FA9550-08-1-0325)
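    The sketch below is a toy version of the master-equation picture described above, not the paper's implementation: a handful of energy minima connected by Arrhenius hopping rates over an assumed common barrier, with the slowest relaxation mode of the rate matrix taken as the shear-stress relaxation time and the viscosity estimated through the Maxwell relation eta ~ G_inf * tau. All energies, barriers, and the modulus are placeholder values chosen only to make the workflow explicit.

```python
# Toy Markov-network estimate of viscosity: Arrhenius rates between a few
# inherent-structure minima, slowest relaxation mode as the stress
# relaxation time, Maxwell relation eta ~ G_inf * tau. All parameters are
# hypothetical placeholders, not the paper's sampled landscape.
import numpy as np

kB  = 8.617e-5                      # Boltzmann constant [eV/K]
nu0 = 1.0e12                        # attempt frequency [1/s] (assumed)

E       = np.array([0.00, 0.04, 0.09, 0.06])   # minima energies [eV] (toy)
barrier = 0.45                                  # common saddle energy [eV] (toy)
G_inf   = 1.0e10                                # instantaneous shear modulus [Pa] (assumed)

def viscosity(T):
    n = len(E)
    K = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i != j:
                # hopping rate j -> i over the barrier measured from minimum j
                K[i, j] = nu0 * np.exp(-(barrier - E[j]) / (kB * T))
    K -= np.diag(K.sum(axis=0))                 # generator: columns sum to zero
    rates = np.sort(np.abs(np.linalg.eigvals(K)))
    tau = 1.0 / rates[1]                        # slowest non-trivial relaxation time [s]
    return G_inf * tau                          # Maxwell estimate of viscosity [Pa s]

for T in (1500.0, 800.0, 500.0, 300.0):
    print(f"T = {T:6.0f} K   eta ~ {viscosity(T):.2e} Pa s")
```

    Because the barriers in this toy are temperature-independent, it produces strictly Arrhenius behavior; the super-Arrhenius (fragile) regime reported above requires the temperature-dependent landscape “terrain” that the paper samples.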

    Intercalibration of the barrel electromagnetic calorimeter of the CMS experiment at start-up

    Calibration of the relative response of the individual channels of the barrel electromagnetic calorimeter of the CMS detector was accomplished, before installation, with cosmic ray muons and test beams. One fourth of the calorimeter was exposed to a beam of high-energy electrons, and the relative calibration of the channels, the intercalibration, was found to be reproducible to a precision of about 0.3%. Additionally, data were collected with cosmic rays for the entire ECAL barrel during the commissioning phase. By comparing the intercalibration constants obtained with the electron beam data with those from the cosmic ray data, it is demonstrated that the latter provide an intercalibration precision of 1.5% over most of the barrel ECAL. The best intercalibration precision is expected to come from the analysis of events collected in situ during LHC operation. Using data collected with both electron and pion beams, several aspects of the intercalibration procedures based on electrons or neutral pions were investigated.
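    One common way to quantify such a comparison is to take the channel-by-channel spread of the ratio of the two sets of constants and subtract the known precision of the reference set in quadrature. The sketch below illustrates that arithmetic with synthetic numbers; it is not the collaboration's actual procedure.

```python
# Minimal sketch with synthetic numbers (not the collaboration's procedure):
# estimate the precision of one set of intercalibration constants by
# comparing it channel-by-channel with a more precise reference set and
# removing the reference precision in quadrature.
import numpy as np

rng = np.random.default_rng(1)
n_channels = 10000

true_c   = rng.normal(1.0, 0.05, n_channels)             # unknown channel response spread
c_beam   = true_c * rng.normal(1.0, 0.003, n_channels)   # test-beam constants (~0.3%)
c_cosmic = true_c * rng.normal(1.0, 0.015, n_channels)   # cosmic-ray constants (~1.5%)

ratio  = c_cosmic / c_beam
spread = ratio.std() / ratio.mean()
sigma_cosmic = np.sqrt(spread**2 - 0.003**2)              # subtract beam precision in quadrature
print(f"estimated cosmic intercalibration precision ~ {sigma_cosmic * 100:.2f}%")
```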