58 research outputs found

    Clause-Modifying Particles in Ata Manobo

    Get PDF

    Analysis of AI-Based Single-View 3D Reconstruction Methods for an Industrial Application

    Get PDF
    Machine learning (ML) is a key technology in smart manufacturing, as it provides insights into complex processes without requiring deep domain expertise. This work deals with deep learning algorithms that determine a 3D reconstruction from a single 2D grayscale image. 3D reconstruction has potential for quality control because the height values contain relevant information that is not visible in 2D data. Instead of 3D scans, depth maps estimated from a 2D input image can be used, with the advantage of a simple setup and a short recording time. Determining a 3D reconstruction from a single input image is a difficult task for which many algorithms and methods have been proposed in recent decades. In this work, three deep learning methods, namely stacked autoencoders (SAEs), generative adversarial networks (GANs), and U-Nets, are investigated, evaluated, and compared for 3D reconstruction from a 2D grayscale image of laser-welded components. Different variants of GANs are tested, with the conclusion that Wasserstein GANs (WGANs) are the most robust approach among them. To the best of our knowledge, the present paper is the first to consider the U-Net, which achieves outstanding results in semantic segmentation, in the context of 3D reconstruction tasks. Unlike the U-Net, which uses standard convolutions, the stacked dilated U-Net (SDU-Net) applies stacked dilated convolutions. Of all the 3D reconstruction approaches considered in this work, the SDU-Net shows the best performance, not only in terms of evaluation metrics but also in terms of computation time. Due to the comparably small number of trainable parameters and the suitability of the architecture for strong data augmentation, a robust model can be generated with only a small amount of training data.
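    The dilated convolutions mentioned in the abstract can be illustrated with a minimal numpy sketch (not the paper's implementation): spacing the kernel taps `dilation` samples apart enlarges the receptive field without adding parameters, and stacking layers with growing dilation factors compounds the effect.

    ```python
    import numpy as np

    def dilated_conv1d(x, kernel, dilation):
        """Valid-mode 1D convolution with a dilation factor.

        Equivalent to a standard convolution whose kernel taps are
        spaced `dilation` samples apart, which enlarges the receptive
        field without adding trainable parameters.
        """
        k = len(kernel)
        span = (k - 1) * dilation + 1          # effective kernel extent
        out_len = len(x) - span + 1
        out = np.empty(out_len)
        for i in range(out_len):
            taps = x[i : i + span : dilation]  # every `dilation`-th sample
            out[i] = np.dot(taps, kernel)
        return out

    # Stacking layers with dilations 1, 2, 4 grows the receptive field
    # exponentially while each layer keeps the same 3-tap kernel.
    x = np.arange(32, dtype=float)
    kernel = np.array([1.0, 1.0, 1.0])
    y = x
    for d in (1, 2, 4):
        y = dilated_conv1d(y, kernel, d)
    print(len(y))  # → 18: each layer trims (3 - 1) * d samples
    ```

    The same principle applies in 2D inside the SDU-Net, where dilated kernels replace the standard convolutions of the U-Net encoder/decoder blocks.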

    Israel: A jewish state or a state for all its citizens? A Discourse-Analytic Study of the Arab-Palestinian Minority and Its Relationship to the Israeli State

    Get PDF
    This study examines the political discourse on current social and political problems in the coexistence of Palestinian-Arab and Jewish Israelis. Its aim is to determine how the Israeli state's policy toward the Arab minority within its own borders is portrayed and evaluated in the media. To this end, a discourse analysis is applied to the coverage of the Admissions Committee Law, passed in 2011, in the English-language Israeli daily newspapers Jerusalem Post and Haaretz. The law grants communities of fewer than 400 households in the Galilee and Negev regions the right to form admissions committees and to screen prospective residents for their suitability for life in the community. In the media-mediated discourse, the ethnic and cultural affiliation of Jewish and Arab Israelis plays an important role, with boundaries drawn on the basis of cultural identity. The analysis thus shows that the central conflict shaping the coexistence of the Jewish-Israeli majority and the Arab minority revolves around the question of the character of the State of Israel: should Israel be a jewish state or a state for all its citizens? This fundamental question drives the actions of the state on the one hand and the Arab minority's demands for equal rights on the other.

    The Dark Internet: An Exploration of Culture and User Experience

    Get PDF
    Our research sought to investigate the culture of the Dark Internet through a combination of cultural analysis and experiential learning. We split our research into three major portions: analysis of the culture of the Dark Internet through the way it is viewed by various media outlets and on the Dark Internet itself; how the culture of the Dark Internet reacts in times of crisis; and a comparison of the experience of users of Dark Internet marketplaces versus users of traditional internet marketplaces. Our cultural analysis was accomplished through textual coding; by coding the articles, forums, and pages that we gathered, we were able to find and observe key commonalities in behavior and communication among the sources. We also went through the process of purchasing goods from both Dark Internet marketplaces and traditional internet marketplaces, allowing us to compare the experiences in a variety of ways, including ease of access, ease of purchase, and delivery time. This research aims to provide further insight into the nature of the Dark Internet and open the way for future research into this ever-changing culture.

    Evidence-based Toxicology for the 21st Century: Opportunities and Challenges

    Get PDF
    The Evidence-based Toxicology Collaboration (EBTC) was established recently to translate evidence-based approaches from medicine and health care to toxicology in an organized and sustained effort. The EBTC held a workshop on “Evidence-based Toxicology for the 21st Century: Opportunities and Challenges” in Research Triangle Park, North Carolina, USA on January 24-25, 2012. The presentations largely reflected two EBTC priorities: to apply evidence-based methods to assessing the performance of emerging pathway-based testing methods consistent with the 2007 National Research Council report on “Toxicity Testing in the 21st Century,” and to adopt a governance structure and work processes to move that effort forward. The workshop served to clarify evidence-based approaches and to provide food for thought on substantive and administrative activities for the EBTC. Priority activities include conducting pilot studies to demonstrate the value of evidence-based approaches to toxicology, as well as conducting educational outreach on these approaches.

    Genetic risk and a primary role for cell-mediated immune mechanisms in multiple sclerosis.

    Get PDF
    Multiple sclerosis is a common disease of the central nervous system in which the interplay between inflammatory and neurodegenerative processes typically results in intermittent neurological disturbance followed by progressive accumulation of disability. Epidemiological studies have shown that genetic factors are primarily responsible for the substantially increased frequency of the disease seen in the relatives of affected individuals, and systematic attempts to identify linkage in multiplex families have confirmed that variation within the major histocompatibility complex (MHC) exerts the greatest individual effect on risk. Modestly powered genome-wide association studies (GWAS) have enabled more than 20 additional risk loci to be identified and have shown that multiple variants exerting modest individual effects have a key role in disease susceptibility. Most of the genetic architecture underlying susceptibility to the disease remains to be defined and is anticipated to require the analysis of sample sizes beyond those currently available to individual research groups. In a collaborative GWAS involving 9,772 cases of European descent collected by 23 research groups working in 15 different countries, we have replicated almost all of the previously suggested associations and identified at least a further 29 novel susceptibility loci. Within the MHC we have refined the identity of the HLA-DRB1 risk alleles and confirmed that variation in the HLA-A gene underlies the independent protective effect attributable to the class I region. Immunologically relevant genes are significantly overrepresented among those mapping close to the identified loci and particularly implicate T-helper-cell differentiation in the pathogenesis of multiple sclerosis.

    Does the proteasome inhibitor bortezomib sensitize to DNA-damaging therapy in gastroenteropancreatic neuroendocrine neoplasms? – A preclinical assessment in vitro and in vivo

    Get PDF
    Background: Well-differentiated gastroenteropancreatic neuroendocrine neoplasms are rare tumors with slow proliferation. They are virtually resistant to many DNA-damaging therapeutic approaches, such as chemo- and external beam therapy, which might be overcome by DNA damage inhibition induced by proteasome inhibitors such as bortezomib.

    CATMoS: Collaborative Acute Toxicity Modeling Suite.

    Get PDF
    BACKGROUND: Humans are exposed to tens of thousands of chemical substances that need to be assessed for their potential toxicity. Acute systemic toxicity testing serves as the basis for regulatory hazard classification, labeling, and risk management. However, it is cost- and time-prohibitive to evaluate all new and existing chemicals using traditional rodent acute toxicity tests. In silico models built using existing data facilitate rapid acute toxicity predictions without using animals. OBJECTIVES: The U.S. Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM) Acute Toxicity Workgroup organized an international collaboration to develop in silico models for predicting acute oral toxicity based on five different end points: Lethal Dose 50 (LD50) value, the four U.S. Environmental Protection Agency hazard categories, the five Globally Harmonized System for Classification and Labeling hazard categories, very toxic chemicals (LD50 ≤ 50 mg/kg), and nontoxic chemicals (LD50 > 2,000 mg/kg). METHODS: An acute oral toxicity data inventory for 11,992 chemicals was compiled, split into training and evaluation sets, and made available to 35 participating international research groups that submitted a total of 139 predictive models. Predictions that fell within the applicability domains of the submitted models were evaluated using external validation sets. These were then combined into consensus models to leverage the strengths of individual approaches. RESULTS: The resulting consensus predictions, which leverage the collective strengths of each individual model, form the Collaborative Acute Toxicity Modeling Suite (CATMoS). CATMoS demonstrated high performance in terms of accuracy and robustness when compared with in vivo results. DISCUSSION: CATMoS is being evaluated by regulatory agencies for its utility and applicability as a potential replacement for in vivo rat acute oral toxicity studies.
CATMoS predictions for more than 800,000 chemicals have been made available via the National Toxicology Program's Integrated Chemical Environment tools and data sets (ice.ntp.niehs.nih.gov). The models are also implemented in a free, standalone, open-source tool, OPERA, which allows predictions of new and untested chemicals to be made. https://doi.org/10.1289/EHP8495
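    The consensus step described in the abstract, in which each model contributes only for chemicals inside its applicability domain, can be sketched as follows. This is a hypothetical illustration with made-up model names and values, not CATMoS data or its actual weighting scheme.

    ```python
    # Each model reports a prediction (e.g., a log10(LD50) estimate) only
    # for chemicals inside its applicability domain; the consensus averages
    # the in-domain predictions. Values and names are illustrative.
    preds = {
        "model_a": {"chem1": 2.1, "chem2": 3.0},
        "model_b": {"chem1": 1.9},              # chem2 outside its domain
        "model_c": {"chem1": 2.3, "chem2": 2.8},
    }

    def consensus(preds, chem):
        """Average the predictions of models whose domain covers `chem`."""
        vals = [p[chem] for p in preds.values() if chem in p]
        return sum(vals) / len(vals) if vals else None

    print(round(consensus(preds, "chem1"), 2))  # → 2.1 (three models vote)
    print(round(consensus(preds, "chem2"), 2))  # → 2.9 (two models vote)
    ```

    Restricting each model to its applicability domain is what lets the consensus leverage each model's strengths: a model abstains rather than extrapolating to chemistry it was not trained on.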