
    On the Complexity of Case-Based Planning

    We analyze the computational complexity of problems related to case-based planning: planning when a plan for a similar instance is known, and planning from a library of plans. We prove that planning from a single case has the same complexity as generative planning (i.e., planning "from scratch"); using an extended definition of cases, the complexity is reduced if the domain stored in the case is similar to the one in which plans are sought. Planning from a library of cases is shown to have the same complexity. In both cases, the complexity of planning remains, in the worst case, PSPACE-complete.

    A study of detecting child pornography on smart phone

    Child pornography is an increasingly visible and rapidly growing cybercrime worldwide. Over the past decade, with rapid growth in smart phone usage, readily available free cloud computing storage, and various mobile communication apps, child pornographers have found a convenient and reliable mobile platform for instantly sharing pictures or videos of children being sexually abused. Within this new paradigm, law enforcement officers are finding that detecting, gathering, and processing evidence for the prosecution of child pornographers is becoming increasingly challenging. Deep learning is a machine learning method that models high-level abstractions in data and extracts hierarchical representations of data by using a deep graph with multiple processing layers. This paper presents a conceptual model of a deep learning approach for detecting child pornography within this new paradigm, using log analysis, file name analysis, and cell site analysis; these analyses examine text logs of events that have occurred on the smart phone at the scene of the crime, obtained through physical and logical acquisition, to assist law enforcement officers in gathering and processing child pornography evidence for prosecution. In addition, the paper gives an illustrative example of logical and physical acquisition on smart phones using forensics tools.

    Compilability of Abduction

    Abduction is one of the most important forms of reasoning; it has been successfully applied to several practical problems such as diagnosis. In this paper we investigate whether the computational complexity of abduction can be reduced by an appropriate use of preprocessing. This is motivated by the fact that part of the data of the problem (namely, the set of all possible assumptions and the theory relating assumptions and manifestations) is often known before the rest of the problem. We show several complexity results about abduction when compilation is allowed.

    Damage and vulnerability analysis of URM churches after the Canterbury earthquake sequence 2010-2011

    The Canterbury earthquake sequence of 2010-2011 has highlighted once again the vulnerability of monumental structures, in particular churches, and the importance of reducing their risk from an economic, cultural and social point of view. Within this context, a detailed analysis is reported of the earthquake-induced damage to a stock of 48 unreinforced masonry churches located in the Canterbury Region, together with a vulnerability analysis of a wider stock of 293 churches located all around New Zealand. New tools were developed for the assessment of New Zealand churches. The computation of a new damage grade is proposed, assessed as a proper combination of the damage level to each macroelement, as a step towards the definition of a New Zealand-specific damage survey form. Several vulnerability indicators were selected, which are related to easily detectable structural details and geometric dimensions. The collection of such data for the larger set of churches (293) constitutes a useful basis for evaluating the potential impact of future seismic events.
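
    The abstract describes computing a new damage grade as a combination of the damage levels observed on each macroelement. The short Python sketch below shows one such combination, a weighted mean over the macroelements present; the macroelement names, weights and rounding shown here are illustrative assumptions, not the survey form's actual definition.

    def damage_grade(damage_levels, weights=None):
        """Combine per-macroelement damage levels (0-5) into one grade (0-5).

        damage_levels: dict mapping macroelement name -> observed damage level,
                       listing only the macroelements present in the church.
        weights:       optional dict of relative importance per macroelement;
                       equal weighting is used when omitted.
        """
        if not damage_levels:
            raise ValueError("at least one macroelement is required")
        if weights is None:
            weights = {m: 1.0 for m in damage_levels}
        total = sum(weights[m] for m in damage_levels)
        # Weighted mean of the macroelement damage levels, kept on the 0-5 scale.
        return sum(weights[m] * level for m, level in damage_levels.items()) / total

    # Hypothetical survey of one church: facade heavily damaged, nave moderately,
    # bell tower lightly.
    levels = {"facade": 4, "nave": 2, "bell_tower": 1}
    print(round(damage_grade(levels), 2))  # 2.33 with equal weights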

    Performance and science reach of the Probe of Extreme Multimessenger Astrophysics for ultrahigh-energy particles

    The Probe Of Extreme Multi-Messenger Astrophysics (POEMMA) is a potential NASA Astrophysics Probe-class mission designed to observe ultra-high energy cosmic rays (UHECRs) and cosmic neutrinos from space. POEMMA will monitor colossal volumes of the Earth's atmosphere to detect extensive air showers (EASs) produced by extremely energetic cosmic messengers: UHECRs above 20 EeV over the full sky and cosmic neutrinos above 20 PeV. We focus most of this study on the impact of POEMMA on UHECR science by simulating the detector response and mission performance for EASs from UHECRs. We show that POEMMA will provide a significant increase in the statistics of observed UHECRs at the highest energies over the entire sky. POEMMA will be the first UHECR fluorescence detector deployed in space to provide high-quality stereoscopic observations of the longitudinal development of air showers. Therefore, it will be able to provide event-by-event estimates of the calorimetric energy and nuclear mass of UHECRs. The particle physics of the interactions limits the interpretation of the shower maximum on an event-by-event basis; in contrast, the calorimetric energy measurement is significantly less sensitive to the different possible final states in the early interactions. We study the prospects of discovering the origin and nature of UHECRs using expectations for measurements of the energy spectrum, the distribution of arrival directions, and the atmospheric column depth at which the EAS longitudinal development reaches its maximum. We also explore supplementary science capabilities of POEMMA through its sensitivity to particle interactions at extreme energies and its ability to detect ultra-high energy neutrinos and photons produced by top-down models, including cosmic strings and super-heavy dark matter particle decay in the halo of the Milky Way.

    Study protocol: developing a decision system for inclusive housing: applying a systematic, mixed-method quasi-experimental design

    Background: Identifying the housing preferences of people with complex disabilities is a much needed, but under-developed, area of practice and scholarship. Despite the recognition that housing is a social determinant of health and quality of life, there is an absence of empirical methodologies that can practically and systematically involve consumers in this complex service delivery and housing design market. A rigorous process for making effective and consistent development decisions is needed to ensure resources are used effectively and the needs of consumers with complex disability are properly met. Methods/Design: This 3-year project aims to identify how the public and private housing market in Australia can better respond to the needs of people with complex disabilities whilst simultaneously achieving key corporate objectives. First, using the Customer Relationship Management framework, qualitative (Nominal Group Technique) and quantitative (Discrete Choice Experiment) methods will be used to quantify the housing preferences of consumers and their carers. A systematic mixed-method, quasi-experimental design will then be used to quantify the development priorities of other key stakeholders (e.g., architects, developers, Government housing services, etc.) in relation to inclusive housing for people with complex disabilities. Stakeholders randomly assigned to Group 1 (experimental group) will participate in a series of focus groups employing the Analytical Hierarchical Process (AHP) methodology. Stakeholders randomly assigned to Group 2 (control group) will participate in focus groups employing existing decision-making processes for inclusive housing development (e.g., Risk, Opportunity, Cost, Benefit considerations). Using comparative stakeholder analysis, this research design will enable the AHP methodology (a proposed tool to guide inclusive housing development decisions) to be tested. Discussion: It is anticipated that the findings of this study will enable stakeholders to incorporate consumer housing preferences into commercial decisions. Housing designers and developers will benefit from the creation of a parsimonious set of consumer-led housing preferences by which to make informed investments in future housing and contribute to future housing policy. The research design has not been applied in the Australian research context or elsewhere, and will provide a much needed blueprint for market investment to develop viable, consumer-directed inclusive housing options for people with complex disability.
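
    For context on the Analytical Hierarchical Process (AHP) step assigned to the experimental group, the Python sketch below shows how AHP derives priority weights from a pairwise comparison matrix using the common normalized-column, row-mean approximation. The criteria names and judgements are purely illustrative and are not taken from the protocol.

    import numpy as np

    def ahp_priorities(pairwise):
        """Approximate AHP priority vector: normalize columns, then average rows."""
        A = np.asarray(pairwise, dtype=float)
        normalized = A / A.sum(axis=0)    # each column now sums to 1
        return normalized.mean(axis=1)    # row means -> priority weights (sum to 1)

    # Three hypothetical housing-development criteria compared on Saaty's 1-9 scale.
    criteria = ["accessibility", "location", "cost"]
    pairwise = [
        [1,   3,   5],    # accessibility vs (accessibility, location, cost)
        [1/3, 1,   2],    # location vs ...
        [1/5, 1/2, 1],    # cost vs ...
    ]
    for name, weight in zip(criteria, ahp_priorities(pairwise)):
        print(f"{name}: {weight:.2f}")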

    Association of Genetic Markers with CSF Oligoclonal Bands in Multiple Sclerosis Patients

    Objective: To explore the association between genetic markers and Oligoclonal Bands (OCB) in the Cerebrospinal Fluid (CSF) of Italian Multiple Sclerosis patients. Methods: We genotyped 1115 Italian patients for HLA-DRB1*15 and HLA-A*02. In a subset of 925 patients we tested association with 52 non-HLA SNPs associated with MS susceptibility and we calculated a weighted Genetic Risk Score. Finally, we performed a Genome Wide Association Study (GWAS) with OCB status on a subset of 562 patients. The best associated SNPs of the Italian GWAS were replicated in silico in Scandinavian and Belgian populations, and meta-analyzed. Results: HLA-DRB1*15 is associated with OCB+: p = 0.03, Odds Ratio (OR) = 1.6, 95% Confidence Limits (CL) = 1.1-2.4. None of the 52 non-HLA MS susceptibility loci was associated with OCB, except one SNP (rs2546890) near the IL12B gene (OR: 1.45; 1.09-1.92). The mean weighted Genetic Risk Score was significantly (p = 0.0008) higher in OCB+ (7.668) than in OCB- (7.412) patients. After meta-analysis on the three datasets (Italian, Scandinavian and Belgian) for the best associated signals from the Italian GWAS, the strongest signal was a SNP (rs9320598) on chromosome 6q (p = 9.4×10⁻⁷) outside the HLA region (65 Mb). Discussion: Genetic factors predispose to the development of OCB.
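
    A weighted Genetic Risk Score of the kind compared between OCB+ and OCB- patients is commonly computed as the sum, over susceptibility SNPs, of the patient's risk-allele count multiplied by a per-SNP weight such as the natural log of the allelic odds ratio. The Python sketch below illustrates that computation; apart from rs2546890 and its odds ratio of 1.45 quoted above, the SNP names, odds ratios and genotypes are illustrative assumptions.

    import math

    def weighted_grs(risk_allele_counts, odds_ratios):
        """Sum of risk-allele counts (0, 1 or 2) weighted by ln(odds ratio)."""
        return sum(count * math.log(odds_ratios[snp])
                   for snp, count in risk_allele_counts.items())

    # rs2546890 (OR 1.45) is quoted in the abstract; the other SNPs and all
    # genotype counts are hypothetical.
    genotypes = {"rs2546890": 2, "rs_example_1": 1, "rs_example_2": 0}
    odds_ratios = {"rs2546890": 1.45, "rs_example_1": 1.20, "rs_example_2": 1.10}
    print(round(weighted_grs(genotypes, odds_ratios), 3))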

    Propositional update operators based on formula/literal dependence

    We present and study a general family of belief update operators in a propositional setting. Its operators are based on formula/literal dependence, which is more fine-grained than the notion of formula/variable dependence proposed earlier in the literature: formula/variable dependence is a particular case of formula/literal dependence. Our update operators are defined according to the "forget-then-conjoin" scheme: updating a belief base by an input formula consists in first forgetting in the base every literal on which the input formula has a negative influence, and then conjoining the resulting base with the input formula. The operators of our family differ in the underlying notion of formula/literal dependence, which may be defined syntactically or semantically, and which may or may not exploit further information such as known persistent literals and pre-set dependencies. We argue that this makes it possible to handle the frame problem and the ramification problem in a more appropriate way. We evaluate the update operators of our family along two important dimensions: the logical dimension, by checking the status of the Katsuno-Mendelzon postulates for update, and the computational dimension, by identifying the complexity of a number of decision problems (including model checking, consistency and inference), both in the general case and in some restricted cases, as well as by studying compactability issues. It follows that several operators of our family are interesting alternatives to previous belief update operators.
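
    The "forget-then-conjoin" scheme can be illustrated with plain variable forgetting, which the abstract notes is a special case of the literal-level dependence the paper actually uses. The Python sketch below (using sympy) forgets in the base every variable occurring in the input formula and then conjoins the input; it is a simplified illustration under that assumption, not the paper's operators.

    from sympy import symbols, And, Or, true, false
    from sympy.logic.boolalg import simplify_logic

    def forget_var(phi, x):
        """Variable forgetting: exists x. phi  ==  phi[x<-True] | phi[x<-False]."""
        return Or(phi.subs(x, true), phi.subs(x, false))

    def update(base, new_info):
        """Forget in `base` every variable of `new_info`, then conjoin `new_info`."""
        result = base
        for x in new_info.free_symbols:
            result = forget_var(result, x)
        return simplify_logic(And(result, new_info))

    # The base says the door is open and the light is on; we then learn that
    # the light has been switched off.
    door, light = symbols("door light")
    print(update(And(door, light), ~light))   # door & ~light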

    Photoemission and photoionization time delays and rates

    Ionization and, in particular, ionization through the interaction with light play an important role in fundamental processes in physics, chemistry, and biology. In recent years, we have seen tremendous advances in our ability to measure the dynamics of photo-induced ionization in various systems in the gas, liquid, or solid phase. In this review, we will define the parameters used for quantifying these dynamics. We give a brief overview of some of the most important ionization processes and how to resolve the associated time delays and rates. With regard to time delays, we ask the question: how long does it take to remove an electron from an atom, molecule, or solid? With regard to rates, we ask the question: how many electrons are emitted in a given unit of time? We present state-of-the-art results on ionization and photoemission time delays and rates. Our review starts with the simplest physical systems: the attosecond dynamics of single-photon and tunnel ionization of atoms in the gas phase. We then extend the discussion to molecular gases and ionization of liquid targets. Finally, we present the measurements of ionization delays in femto- and attosecond photoemission from the solid–vacuum interface.