6,775 research outputs found

    High-speed in vitro intensity diffraction tomography

    We demonstrate a label-free, scan-free intensity diffraction tomography technique utilizing annular illumination (aIDT) to rapidly characterize large-volume three-dimensional (3-D) refractive index distributions in vitro. By optimally matching the illumination geometry to the microscope pupil, our technique reduces the data requirement 60-fold to achieve high-speed 10-Hz volume rates. Using eight intensity images, we recover volumes of ∼350 μm × 100 μm × 20 μm, with near diffraction-limited lateral resolution of ∼487 nm and axial resolution of ∼3.4 μm. The attained high volume rate and resolution enable 3-D quantitative phase imaging of complex living biological samples across multiple length scales. We demonstrate aIDT’s capabilities on unicellular diatom microalgae, epithelial buccal cell clusters with native bacteria, and live Caenorhabditis elegans specimens. Within these samples, we recover macroscale cellular structures, subcellular organelles, and dynamic micro-organism tissues with minimal motion artifacts. Quantifying such features has significant utility in oncology, immunology, and cellular pathophysiology, where these morphological features are evaluated for changes in the presence of disease, parasites, and new drug treatments. Finally, we simulate the aIDT system to highlight the accuracy and sensitivity of the proposed technique. aIDT shows promise as a powerful high-speed, label-free computational microscopy approach for applications where natural imaging is required to evaluate environmental effects on a sample in real time. https://arxiv.org/abs/1904.06004 (Accepted manuscript)
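
    To make the reconstruction idea concrete, below is a minimal sketch of the kind of linear, transfer-function-based inversion that intensity diffraction tomography methods of this family use. The function name, the Tikhonov regularization weight, and the precomputed phase transfer functions `H_ph` are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def reconstruct_phase_slices(intensities, H_ph, reg=1e-3):
    """Tikhonov-regularized, slice-wise linear inversion (illustrative sketch).

    intensities : (N, Ny, Nx) background-normalized intensity images,
                  one per annular illumination angle (N = 8 in the paper).
    H_ph        : (N, Nz, Ny, Nx) hypothetical precomputed phase transfer
                  functions per illumination and depth slice, derived from
                  the illumination geometry and microscope pupil.
    reg         : assumed Tikhonov regularization weight.
    Returns (Nz, Ny, Nx) phase slices of the reconstructed volume.
    """
    N, Ny, Nx = intensities.shape
    Nz = H_ph.shape[1]
    # Spectrum of the intensity contrast for each illumination angle.
    contrast = intensities - intensities.mean(axis=(1, 2), keepdims=True)
    G = np.fft.fft2(contrast)
    phase = np.empty((Nz, Ny, Nx))
    for z in range(Nz):
        # Least-squares combination of all illuminations for this slice.
        num = (np.conj(H_ph[:, z]) * G).sum(axis=0)
        den = (np.abs(H_ph[:, z]) ** 2).sum(axis=0) + reg
        phase[z] = np.real(np.fft.ifft2(num / den))
    return phase
```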

    Addressing the dichotomy between individual and societal approaches to personalised medicine in oncology

    Academic, industry, and regulatory leaders and patient advocates in cancer clinical research met in November 2018 at the Innovation and Biomarkers in Cancer Drug Development meeting in Brussels to address the existing dichotomy between increasing calls for personalised oncology approaches based on individual molecular profiles and the need to make resource and regulatory decisions at the societal level in differing health-care delivery systems around the globe. Novel clinical trial designs, the utility and limitations of real-world evidence (RWE), and emerging technologies for profiling patient tumours and tumour-derived DNA in plasma were discussed. While randomised clinical trials remain the gold-standard approach to defining the clinical utility of local and systemic therapeutic interventions, the broader adoption of comprehensive tumour profiling and novel trial designs, coupled with RWE, may allow patient and physician autonomy to be appropriately balanced with broader assessments of safety and overall societal benefit. (C) 2019 Published by Elsevier Ltd.

    Dynamic Slicing for Deep Neural Networks

    Program slicing has been widely applied in a variety of software engineering tasks. However, existing program slicing techniques only deal with traditional programs constructed from instructions and variables, not neural networks composed of neurons and synapses. In this paper, we propose NNSlicer, the first approach for slicing deep neural networks based on data flow analysis. Our method characterizes the reaction of each neuron to an input as the difference between its activation on that input and its average activation over the whole dataset. We then quantify the neuron contributions to the slicing criterion by recursively backtracking from the output neurons, and define the slice as the neurons and synapses with the largest contributions. We demonstrate the usefulness and effectiveness of NNSlicer with three applications: adversarial input detection, model pruning, and selective model protection. In all applications, NNSlicer significantly outperforms other baselines that do not rely on data flow analysis. (Comment: 11 pages, ESEC/FSE '20)
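
    As a rough illustration of the backtracking idea, here is a hypothetical sketch for a small fully connected ReLU network. The contribution measure (weight-propagated deviation from the dataset-average activation) and the top-k selection are simplifying assumptions, not NNSlicer's exact formulation.

```python
import numpy as np

def nn_slice(weights, x, avg_acts, k=5):
    """Sketch of data-flow slicing for a small ReLU MLP (illustrative).

    weights  : list of (out, in) weight matrices defining the network.
    x        : one input vector (the slicing criterion's input).
    avg_acts : per-layer average activations over the whole dataset,
               used as the baseline behavior.
    Returns, per layer (input first), indices of the k neurons whose
    contribution, backtracked from the output, is largest.
    """
    # Forward pass, recording each layer's activation.
    acts = [x]
    for W in weights:
        acts.append(np.maximum(W @ acts[-1], 0.0))
    # Deviation of this input's behavior from the average behavior.
    deltas = [a - b for a, b in zip(acts, avg_acts)]
    # Backtrack contributions from the output layer toward the input.
    contrib = np.abs(deltas[-1])
    slice_idx = [np.argsort(contrib)[::-1][:k]]
    for W, d in zip(reversed(weights), reversed(deltas[:-1])):
        # Propagate downstream contribution through the synapses, scaled
        # by how much each upstream neuron deviates from its average.
        contrib = np.abs(W.T @ contrib) * np.abs(d)
        slice_idx.append(np.argsort(contrib)[::-1][:k])
    return slice_idx[::-1]
```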

    SITC cancer immunotherapy resource document: a compass in the land of biomarker discovery.

    Since the publication of the Society for Immunotherapy of Cancer's (SITC) original cancer immunotherapy biomarkers resource document, there have been remarkable breakthroughs in cancer immunotherapy, in particular the development and approval of immune checkpoint inhibitors, engineered cellular therapies, and tumor vaccines to unleash antitumor immune activity. The most notable feature of these breakthroughs is the achievement of durable clinical responses in some patients, enabling long-term survival. These durable responses have been noted in tumor types that were not previously considered immunotherapy-sensitive, suggesting that all patients with cancer may have the potential to benefit from immunotherapy. However, a persistent challenge in the field is that only a minority of patients respond to immunotherapy, especially to therapies that rely on endogenous immune activation, such as checkpoint inhibitors and vaccination, owing to the complex and heterogeneous immune escape mechanisms that can develop in each patient. Therefore, the development of robust biomarkers for each immunotherapy strategy, enabling rational patient selection and the design of precise combination therapies, is key to the continued success and improvement of immunotherapy. In this document, we summarize and update established biomarkers, guidelines, and regulatory considerations for clinical immune biomarker development, discuss well-known and novel technologies for biomarker discovery and validation, and provide tools and resources that can be used by the biomarker research community to facilitate the continued development of immuno-oncology and aid in the goal of durable responses in all patients.

    Architecting Data Centers for High Efficiency and Low Latency

    Modern data centers, housing remarkably powerful computational capacity, are built at massive scale and consume a huge amount of energy. The energy consumption of data centers has mushroomed from virtually nothing to about three percent of the global electricity supply in the last decade, and will continue to grow. Unfortunately, a significant fraction of this energy is wasted due to the inefficiency of current data center architectures, and one of the key reasons behind this inefficiency is the stringent response latency requirements of the user-facing services hosted in these data centers, such as web search and social networks. To deliver such low response latency, data center operators often have to overprovision resources to handle high peaks in user load and unexpected load spikes, resulting in low efficiency. This dissertation investigates data center architecture designs that reconcile high system efficiency and low response latency. To increase efficiency, we propose techniques that understand both microarchitectural-level resource sharing and system-level resource usage dynamics to enable highly efficient co-location of latency-critical services and low-priority batch workloads. We investigate resource sharing on real-system simultaneous multithreading (SMT) processors to enable SMT co-locations by precisely predicting the performance interference. We then leverage historical resource usage patterns to further optimize the task scheduling algorithm and data placement policy to improve the efficiency of workload co-locations. Moreover, we introduce methodologies to better manage response latency by automatically attributing the source of tail latency to low-level architectural and system configurations, in both offline load-testing and online production environments. We design and develop a response latency evaluation framework with microsecond-level precision for data center applications, with which we construct statistical inference procedures to attribute the source of tail latency. Finally, we present an approach that proactively enacts carefully designed causal-inference micro-experiments to diagnose the root causes of response latency anomalies, and automatically corrects them to reduce response latency. PhD dissertation, Computer Science & Engineering, University of Michigan, Horace H. Rackham School of Graduate Studies. https://deepblue.lib.umich.edu/bitstream/2027.42/144144/1/yunqi_1.pdf
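
    To ground the tail-latency discussion, here is a toy sketch of percentile-based latency reporting and a naive on/off micro-experiment comparison. The factor-toggling attribution shown is an invented simplification for illustration, not the statistical and causal-inference machinery the dissertation develops.

```python
import numpy as np

def tail_latency_report(samples_us, percentiles=(50, 95, 99, 99.9)):
    """Summarize request latencies (microseconds) at the percentiles
    that matter for user-facing services, where the tail dominates."""
    samples = np.asarray(samples_us, dtype=float)
    return {f"p{p}": np.percentile(samples, p) for p in percentiles}

def attribute_tail(latency_by_config):
    """Naive attribution sketch: rank hypothetical configuration factors
    by how much toggling each one shifts p99 latency.

    latency_by_config : dict mapping (factor_name, enabled) -> latency
                        samples collected with that factor on or off.
    """
    impact = {}
    for factor in {name for name, _ in latency_by_config}:
        on = np.percentile(latency_by_config[(factor, True)], 99)
        off = np.percentile(latency_by_config[(factor, False)], 99)
        impact[factor] = on - off  # positive: factor inflates the tail
    return dict(sorted(impact.items(), key=lambda kv: -abs(kv[1])))
```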

    Post-Liquefaction Residual Strength of Saturated and Partially Saturated Soils

    Post-liquefaction response and residual strength play important roles in the stability assessment of liquefied ground. Considering recent advancements in the application of induced partial saturation for liquefaction mitigation, the state of knowledge in estimating residual strength should be extended to liquefied desaturated soils. In this thesis, the residual strength response of a clean sand at different saturation levels was investigated using a Ring Shear Device (RSD). Direct air injection was used to desaturate the soil, which helped mitigate liquefaction under cyclic loading. However, at higher shear strain levels both saturated and partially saturated soils liquefied, after which the residual strength was measured. Results indicate that the residual strength increased as the saturation level decreased, owing to the change in compressibility and the consequent volume reduction. In addition, the strain-rate dependency of the residual strength was confirmed: an increase in shear strain rate increased the residual strength under both saturated and partially saturated conditions.

    Framework For Quantifying And Tailoring Complexity And Risk To Manage Uncertainty In Developing Complex Products And Systems

    In recent years there has been renewed interest in product complexity due to its negative impact on launch performance. Research indicates that underestimating complexity is one of the most common errors repeated by new product development (NPD) teams, and that companies that successfully manage complexity can maintain a competitive advantage. This is particularly true of complex products and systems (CoPS) projects, defined as large-scale, high-value, engineering-intensive products and systems. Investment in CoPS projects continues to grow worldwide, with recent estimates placed at over $500B annually. In this research we present methods to improve the planning and coordination of complexity and risk in CoPS projects to support launch success. The methods are designed to be consistent with the systems engineering practices commonly used in their development. The research proposes novel methods for the assessment, quantification, and management of development complexity and risk. The models are initiated from preliminary customer requirements, so they may be implemented at the earliest point in the development process, where they yield the most significant cost savings and impact. The models presented are validated on a large-scale defense-industry project and an experimental case study. The research demonstrates that development complexity and risk can be effectively quantified in the early development stages and used to align and tailor organizational resources to improve product development performance. The methods also have the benefit of being implementable with little disruption to existing processes, as they align closely with current industry practices.
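
    Purely as a hypothetical illustration of scoring complexity from preliminary requirements, the sketch below averages a weighted score over requirement attributes. The attribute names, weights, and 1-5 scale are invented for this example and are not the framework proposed in the thesis.

```python
# Hypothetical example: attribute names, weights, and scales are invented
# for illustration and are not the thesis's actual model.
ATTRIBUTE_WEIGHTS = {"novelty": 0.4, "interfaces": 0.35, "precision": 0.25}

def complexity_index(requirements):
    """Average weighted complexity score over a list of requirements.

    requirements : list of dicts rating each attribute on a 1-5 scale,
                   e.g. {"novelty": 4, "interfaces": 2, "precision": 5}.
    Missing attributes default to the lowest rating (1).
    """
    if not requirements:
        return 0.0
    total = sum(
        sum(w * req.get(attr, 1) for attr, w in ATTRIBUTE_WEIGHTS.items())
        for req in requirements
    )
    return total / len(requirements)

# Example: one highly novel, tightly toleranced requirement.
print(complexity_index([{"novelty": 5, "interfaces": 3, "precision": 5}]))
```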