
    Do Humans and Deep Convolutional Neural Networks Use Visual Information Similarly for the Categorization of Natural Scenes?

    The investigation of visual categorization has recently been aided by the introduction of deep convolutional neural networks (CNNs), which achieve unprecedented accuracy in image classification after extensive training. Although the architecture of CNNs is inspired by the organization of the visual brain, the similarity between CNN and human visual processing remains unclear. Here, we investigated this issue by engaging humans and CNNs in a two-class visual categorization task. To this end, pictures containing animals or vehicles were modified to contain only low (LSF) or high spatial frequency (HSF) information, or were scrambled in the phase of the spatial frequency spectrum. For all types of degradation, accuracy increased for both humans and CNNs as degradation was reduced; however, the thresholds for accurate categorization differed between humans and CNNs. The most marked differences were observed for HSF information compared to the other two types of degradation, both in overall accuracy and in image-level agreement between humans and CNNs. The difficulty the CNNs showed in categorizing high-pass-filtered natural scenes was reduced by image whitening, a procedure inspired by how biological visual systems process natural images. The results are discussed in relation to adaptation to regularities in the visual environment (scene statistics): if the visual characteristics of the environment are not learned by CNNs, their visual categorization may depend on only a subset of the visual information on which humans rely, for example, on low spatial frequency information.
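The image manipulations described in the abstract (high-pass spatial-frequency filtering and spectral whitening) can be sketched with NumPy's FFT routines. This is an illustrative reconstruction, not the authors' code: the function names `high_pass` and `whiten`, the radial cutoff, and the `eps` regularizer are all our assumptions.

```python
import numpy as np

def high_pass(image, cutoff):
    """Keep only spatial frequencies at or above `cutoff` (cycles/image)."""
    f = np.fft.fftshift(np.fft.fft2(image))          # DC moved to the center
    h, w = image.shape
    yy, xx = np.mgrid[-(h // 2):h - h // 2, -(w // 2):w - w // 2]
    radius = np.sqrt(xx ** 2 + yy ** 2)              # radial frequency per bin
    f[radius < cutoff] = 0.0                         # zero out low frequencies
    return np.real(np.fft.ifft2(np.fft.ifftshift(f)))

def whiten(image, eps=1e-8):
    """Flatten the amplitude spectrum while keeping the phase spectrum."""
    f = np.fft.fft2(image)
    return np.real(np.fft.ifft2(f / (np.abs(f) + eps)))
```

Phase scrambling, the third degradation mentioned, would analogously randomize the phase of `f` while keeping its amplitude.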

    The Pragmatics of Person and Imperatives in Sign Language of the Netherlands

    We present new evidence against a grammatical distinction between second and third person in Sign Language of the Netherlands (NGT). More precisely, we show how pushing this distinction into the domain of pragmatics helps account for an otherwise puzzling fact about the NGT imperative: not only is it used to command the addressee, it can also express ‘non-addressee-oriented commands’.

    Defining a roadmap for harmonizing quality indicators in Laboratory Medicine: A consensus statement on behalf of the IFCC Working Group "Laboratory Errors and Patient Safety" and EFLM Task and Finish Group "Performance specifications for the extra-analytical phases"

    Improving the quality of laboratory testing requires a deep understanding of the many vulnerable steps involved in the total examination process (TEP), along with the identification of a hierarchy of risks and challenges that need to be addressed. From this perspective, the Working Group “Laboratory Errors and Patient Safety” (WG-LEPS) of the International Federation of Clinical Chemistry and Laboratory Medicine (IFCC) is focusing its activity on the implementation of an efficient tool for obtaining meaningful information on the risk of errors arising throughout the TEP, and for establishing reliable information about error frequencies and their distribution. More recently, the European Federation of Clinical Chemistry and Laboratory Medicine (EFLM) has created the Task and Finish Group “Performance specifications for the extra-analytical phases” (TFG-PSEP) to define performance specifications for the extra-analytical phases. Both the IFCC and EFLM groups are working to provide laboratories with a system to evaluate their performance and recognize the critical aspects where improvement actions are needed. A Consensus Conference was organized in Padova, Italy, in 2016 to bring together all the experts and interested parties to achieve a consensus for the effective harmonization of quality indicators (QIs). A general agreement was achieved, and the main outcomes have been the release of a new version of the model of quality indicators (MQI), the approval of a criterion for establishing performance specifications, and the definition of the type of information that should be provided in the report to the clinical laboratories participating in the QIs project.

    The dilemma of polypharmacy in psychosis: is it worth combining partial and full dopamine modulation?

    Antipsychotic polypharmacy in psychotic disorders is widespread despite international guidelines favoring monotherapy. Previous evidence indicates the utility of low-dose partial dopamine agonist (PDA) add-ons to mitigate antipsychotic-induced metabolic adverse effects or hyperprolactinemia. However, clinicians are often concerned about combining PDAs with high-potency, full dopaminergic antagonists (FDAs) due to the risk of psychosis relapse. We therefore conducted a literature review of studies investigating the effects of combined treatment with PDAs (i.e., aripiprazole, cariprazine, and brexpiprazole) and FDAs with a strong D2 receptor binding affinity. Twenty studies examining the combination of aripiprazole with high-potency FDAs were included, while no study was available on combinations with cariprazine or brexpiprazole. Studies reporting clinical improvement suggested that it may require a relatively long time (approximately 11 weeks), while studies that found symptom worsening observed it within a shorter timeframe (approximately 3 weeks). Patients with a longer illness duration who receive aripiprazole as an add-on to ongoing FDA monotherapy may be at greater risk of symptomatic worsening. Especially in these cases, close clinical monitoring is therefore recommended during the first few weeks of combined treatment. These indications may be useful to psychiatrists who consider this treatment strategy. Well-powered randomized clinical trials are needed to derive more solid clinical recommendations. Copyright (C) 2022 The Author(s). Published by Wolters Kluwer Health, Inc.

    Cardiovascular complications and their perioperative risks in patients undergoing general surgery

    Undergraduate thesis (Trabalho de Conclusão de Curso), Universidade Federal de Santa Catarina, Centro de Ciências da Saúde, Departamento de Clínica Médica, Curso de Medicina, Florianópolis, 199

    Effects of atomic diffraction on the Collective Atomic Recoil Laser

    We formulate a wave atom optics theory of the Collective Atomic Recoil Laser (CARL), in which the atomic center-of-mass motion is treated quantum mechanically. By comparing the predictions of this theory with those of the ray atom optics theory, which treats the center-of-mass motion classically, we show that for the case of a far off-resonant pump laser the ray optics model fails to predict the linear response of the CARL when the temperature is of the order of the recoil temperature or less. This is because in this temperature regime one can no longer ignore the effects of matter-wave diffraction on the atomic center-of-mass motion.

    An Explicit Framework for Interaction Nets

    Interaction nets are a graphical formalism inspired by Linear Logic proof-nets, often used for studying higher-order rewriting, e.g. β-reduction. Traditional presentations of interaction nets are based on graph theory and rely on its elementary properties. We give here a more explicit presentation based on notions borrowed from Girard's Geometry of Interaction: interaction nets are presented as partial permutations, and a composition of nets, the gluing, is derived from the execution formula. We then define contexts and reduction as the context closure of rules. We prove strong confluence of reduction within our framework and show how interaction nets can be viewed as the quotient of some generalized proof-nets.
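The idea of representing nets as permutation-like wirings and deriving composition from the execution formula can be caricatured in a few lines. The toy model below is our illustration, not the paper's formal definition: each net is just its wiring, encoded as an involution on port names (a dict with `f[p] == q` iff `f[q] == p`), and `glue` traces paths that bounce alternately between the two nets through shared interface ports until a free port is reached, discarding cyclic (deadlocked) paths.

```python
def glue(f, g, interface):
    """Glue two wirings f, g (involutions on port names) along the
    shared `interface` ports, execution-formula style: follow the two
    wirings alternately through the interface until a free port is hit."""
    result = {}
    for p in (set(f) | set(g)) - interface:
        in_f = p in f
        q = f[p] if in_f else g[p]      # first hop inside p's own net
        use_f = not in_f                # next hop crosses to the other net
        seen = set()
        while q in interface and q not in seen:
            seen.add(q)
            q = f[q] if use_f else g[q]
            use_f = not use_f
        if q not in interface:          # drop cyclic paths (no free end)
            result[p] = q
    return result
```

For example, gluing a wire a–x with a wire x–b along the interface {x} yields the wire a–b, the expected cut-elimination of the intermediate port.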

    A tree in the desert: Acacia raddiana


    Recirculation as a disposal route for the concentrate generated by membrane treatment of landfill leachate

    The different methodologies of leachate treatment are widely debated in the literature, prompting considerable discussion in the scientific and academic community about the most efficient and suitable methods. Membrane treatment processes, especially Reverse Osmosis (RO), stand out as the best solution. RO achieves pollutant removal rates above 99%, with operational cost and complexity competitive with other technologies. Its main disadvantage is the concentrated residue generated in the process, which amounts to about 30% of the volume of leachate entering the system. Recirculating this concentrate into the body of the landfill arises as a low-cost disposal alternative. Its effectiveness is directly related to the recirculation method, along with the geological, climatological, technical, and operational conditions of the landfill. Although already widespread, the treatment or disposal of the concentrate still requires further technological development. Further research is needed on recirculation methods for the concentrate and its medium- and long-term effects on leachate, settlement, and landfills in the after-care period. It is important to make a comparative analysis of landfills with similar characteristics, one with and one without recirculation of the concentrate. Alternatives for treating the concentrate are also of great interest, provided they are economically viable at real scale.
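Since the concentrate amounts to roughly 30% of the RO feed, recirculating it into the landfill returns part of that volume to the plant as fresh leachate. A simple steady-state mass balance makes the consequence explicit; this is an illustrative calculation only, and `return_frac` (the fraction of recirculated concentrate that re-emerges as leachate) is a hypothetical parameter, not a value from the text.

```python
def steady_state_feed(q_fresh, conc_frac=0.30, return_frac=1.0):
    """Steady-state leachate flow to the RO plant (same units as q_fresh)
    when a fraction `return_frac` of the recirculated concentrate
    (conc_frac of the feed) re-emerges as leachate.

    Balance: F = q_fresh + return_frac * conc_frac * F
          => F = q_fresh / (1 - return_frac * conc_frac)
    """
    recirc = conc_frac * return_frac
    if recirc >= 1.0:
        raise ValueError("recirculated fraction must be < 1 for a steady state")
    return q_fresh / (1.0 - recirc)
```

With 100 m³/day of fresh leachate and full return of the concentrate, the plant must process about 143 m³/day at equilibrium, i.e. recirculation inflates the treated volume by over 40%.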