Use of metaknowledge in the verification of knowledge-based systems
Knowledge-based systems are modeled as deductive systems. The model indicates that the two primary areas of concern in verification are demonstrating consistency and completeness. A system is inconsistent if it asserts something that is not true of the modeled domain. A system is incomplete if it lacks deductive capability. Two forms of consistency are discussed, along with appropriate verification methods, and three forms of incompleteness are identified. The use of metaknowledge, knowledge about knowledge, is explored in connection with each form of incompleteness.
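The two verification properties named above can be made concrete on a toy rule base. The sketch below is our own illustration (not the paper's system): forward chaining over propositional Horn rules, a consistency check (no literal and its negation both derivable), and a simple incompleteness symptom (a desired conclusion that cannot be derived). The rule and fact names are hypothetical.

```python
# Illustrative sketch, not the paper's method: forward chaining over
# propositional Horn rules, then two verification checks.

def forward_chain(facts, rules):
    """Derive the closure of `facts` under `rules`.
    rules: list of (premises, conclusion) pairs; "-x" negates "x"."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in derived and all(p in derived for p in premises):
                derived.add(conclusion)
                changed = True
    return derived

def is_consistent(derived):
    # Inconsistent if some atom and its negation are both derivable.
    return not any(lit.startswith("-") and lit[1:] in derived for lit in derived)

def undeducible_goals(derived, goals):
    # Goals the system cannot reach: one symptom of incompleteness.
    return {g for g in goals if g not in derived}

rules = [(["bird"], "flies"),
         (["penguin"], "bird"),
         (["penguin"], "-flies")]   # conflicts with the first rule

derived = forward_chain({"penguin"}, rules)
print(is_consistent(derived))             # False: "flies" and "-flies" both derived
print(undeducible_goals(derived, {"swims"}))  # {'swims'}: not derivable
```

Metaknowledge enters where such checks need domain facts, e.g. knowing that "flies" and "-flies" are mutually exclusive in the modeled world.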
Evolutionary algorithms for robust methods
A drawback of robust statistical techniques is the increased computational effort often needed compared to non-robust methods. Robust estimators possessing the exact fit property, for example, are NP-hard to compute. This means that, under the widely believed assumption that the computational complexity classes NP and P are not equal, there is no hope of computing exact solutions for large, high-dimensional data sets. To tackle this problem, search heuristics are used to compute NP-hard estimators in high dimensions. Here, an evolutionary algorithm that is applicable to different robust estimators is presented. Further, variants of this evolutionary algorithm for selected estimators, most prominently least trimmed squares and least median of squares, are introduced and shown to outperform existing popular search heuristics in difficult data situations. The results increase the applicability of robust methods and underline the usefulness of evolutionary computation for computational statistics.

Keywords: evolutionary algorithms, robust regression, least trimmed squares (LTS), least median of squares (LMS), least quantile of squares (LQS), least quartile difference (LQD)
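The combination can be sketched in a few lines. This is a minimal toy, not the paper's algorithm: a simple (mu+lambda)-style evolutionary search over candidate lines y = a + b·x, selected by the LTS criterion (the sum of the h smallest squared residuals). The data, population size, and mutation scale are our own hypothetical choices.

```python
# Minimal sketch (our illustration, not the paper's algorithm): evolutionary
# search for a least trimmed squares (LTS) line, which ignores the largest
# residuals and is therefore robust to gross outliers.
import random

def lts_loss(a, b, xs, ys, h):
    """Sum of the h smallest squared residuals of the line y = a + b*x."""
    residuals = sorted((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    return sum(residuals[:h])

def evolve_lts(xs, ys, h, pop_size=30, generations=200, sigma=0.5, seed=0):
    rng = random.Random(seed)
    pop = [(rng.uniform(-5, 5), rng.uniform(-5, 5)) for _ in range(pop_size)]
    for _ in range(generations):
        # Keep the best half, refill with Gaussian-mutated copies.
        pop.sort(key=lambda ab: lts_loss(*ab, xs, ys, h))
        parents = pop[: pop_size // 2]
        children = [(a + rng.gauss(0, sigma), b + rng.gauss(0, sigma))
                    for a, b in parents]
        pop = parents + children
    return min(pop, key=lambda ab: lts_loss(*ab, xs, ys, h))

# Data from y = 1 + 2x, with two gross outliers that would break least squares.
xs = [0, 1, 2, 3, 4, 5, 6, 7]
ys = [1, 3, 5, 7, 9, 11, 50, -40]
a, b = evolve_lts(xs, ys, h=6)
```

With h = 6 of 8 points trimmed-in, the exact LTS optimum is the clean line (a, b) = (1, 2) with zero trimmed loss, and the evolutionary search approaches it despite the two outliers.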
A Construct-Modeling Approach to Develop a Learning Progression of how Students Understand the Structure of Matter
This paper builds on the current literature base about learning progressions in science to address the question, “What is the nature of the learning progression in the content domain of the structure of matter?” We introduce a learning progression in response to that question and illustrate a methodology, the Construct Modeling (Wilson, 2005) approach, for investigating the progression through a developmentally based iterative process. This study puts forth a progression of how students understand the structure of matter by empirically interrelating constructs of different levels of sophistication, using a sample of 1,087 middle grade students from a large, diverse public school district in the western part of the United States. The study also shows that student thinking can be more complex than hypothesized, as in the case of our discovery of a substructure of understanding in a single construct within the larger progression. Data were analyzed using a multidimensional Rasch model. Implications for teaching and learning are discussed: we suggest that the teacher’s choice of instructional approach needs to be fashioned in terms of a model, grounded in evidence, of the paths through which learning might best proceed, working toward the desired targets by a pedagogy which also cultivates students’ development as effective learners. This research sheds light on the need for assessment methods to be used as guides for formative work and as tools to ensure the learning goals have been achieved at the end of the learning period. The development and investigation of a learning progression of how students understand the structure of matter using the Construct Modeling approach makes an important contribution to the research on learning progressions and serves as a guide to the planning and implementation of teaching in this topic. © 2017 Wiley Periodicals, Inc. J Res Sci Teach 54: 1024–1048, 2017
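For readers unfamiliar with the measurement model named above, the following is a hedged sketch of the basic dichotomous Rasch model (the study uses a multidimensional variant; this unidimensional toy and its item difficulties are ours): the probability that a student of ability theta answers an item of difficulty b correctly, plus a simple grid-search maximum-likelihood ability estimate.

```python
# Dichotomous Rasch model: P(correct) depends only on the difference between
# student ability (theta) and item difficulty (b). A hypothetical toy, not
# the paper's multidimensional analysis.
import math

def rasch_p(theta, b):
    """P(correct answer | ability theta, item difficulty b)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def log_likelihood(theta, difficulties, responses):
    return sum(math.log(rasch_p(theta, b)) if x == 1
               else math.log(1.0 - rasch_p(theta, b))
               for b, x in zip(difficulties, responses))

def estimate_ability(difficulties, responses, grid=None):
    """Maximum-likelihood ability by brute-force grid search over [-4, 4]."""
    grid = grid or [i / 100.0 for i in range(-400, 401)]
    return max(grid, key=lambda t: log_likelihood(t, difficulties, responses))

difficulties = [-1.0, 0.0, 1.0, 2.0]   # easy -> hard (hypothetical items)
responses = [1, 1, 1, 0]               # student misses only the hardest item
theta_hat = estimate_ability(difficulties, responses)
```

In a learning-progression analysis, ordered bands of such difficulty and ability estimates on a common scale are what allow constructs of different sophistication to be interrelated.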
Seismicity relocation and fault structure near the Leech River Fault Zone, southern Vancouver Island
Relatively low rates of seismicity and fault loading have made it challenging to correlate microseismicity with mapped surface faults on the forearc of southern Vancouver Island. Here we use precise relocations of microseismicity, integrated with existing geologic data, to present the first identification of subsurface seismogenic structures associated with the Leech River fault zone (LRFZ) on southern Vancouver Island. We used the HypoDD double-difference relocation method to relocate 1253 earthquakes reported in the Canadian National Seismograph Network (CNSN) catalog from 1985 to 2015. Our results reveal an ~8-10 km wide, NNE-dipping zone of seismicity representing a subsurface structure along the eastern 30 km of the terrestrial LRFZ and extending 20 km farther eastward offshore, where the fault bifurcates beneath the Juan de Fuca Strait. Using a clustering analysis, we identify secondary structures within the NNE-dipping fault zone, many of which are sub-vertical and exhibit right-lateral strike-slip focal mechanisms. We suggest that the arrangement of these near-vertical dextral secondary structures within a more general NNE-dipping fault zone, located well beneath (10-15 km) the Leech River fault (LRF) as imaged by LITHOPROBE, may be a consequence of the reactivation of this fault system as a right-lateral structure in crust with pre-existing NNE-dipping foliations. Our results provide the first confirmation of active terrestrial crustal faults on Vancouver Island using a relocation method. We suggest that slowly slipping active crustal faults, especially in regions with pre-existing foliations, may produce microseismicity along fracture arrays rather than along single planar structures.
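The double-difference idea behind HypoDD can be shown in a toy, uniform-velocity example (our illustration, not the HypoDD code; the coordinates and velocity are hypothetical): for a pair of nearby events recorded at the same station, differencing the travel times before forming the residual largely cancels shared path effects, so minimizing these double differences constrains the events' relative locations.

```python
# Toy double-difference residual for one event pair at one station,
# assuming a uniform P-wave speed. Illustrative only.
import math

V = 6.0  # km/s, assumed uniform P-wave speed

def travel_time(event, station):
    """Straight-ray travel time between 3-D points (x, y, z) in km."""
    return math.dist(event, station) / V

def double_difference(ev_i, ev_j, station, t_obs_i, t_obs_j):
    """dr = (t_i - t_j)^obs - (t_i - t_j)^calc for one station."""
    calc = travel_time(ev_i, station) - travel_time(ev_j, station)
    return (t_obs_i - t_obs_j) - calc

# Two events 1 km apart at 10 km depth, one station at the surface.
ev_i, ev_j = (0.0, 0.0, 10.0), (1.0, 0.0, 10.0)
station = (30.0, 0.0, 0.0)
# "Observed" differential time consistent with the true geometry:
t_obs_i, t_obs_j = travel_time(ev_i, station), travel_time(ev_j, station)
print(double_difference(ev_i, ev_j, station, t_obs_i, t_obs_j))  # 0.0 by construction
```

A mislocated catalog position for either event produces a nonzero residual; HypoDD iteratively adjusts relative locations over many pairs and stations to drive these residuals down, which is what sharpens diffuse catalog seismicity into the fault-zone structures described above.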
Sticky Rebates: Rollback Rebates Induce Non-Rational Loyalty in Consumers
Competition policy often relies on the assumption of a rational consumer, although other models may better account for people’s decision behavior. In three experiments, we investigate the influence of loyalty rebates on consumers, both theoretically and experimentally, using Cumulative Prospect Theory (CPT) as an alternative model. CPT predicts that loyalty rebates could harm consumers by impeding rational switching from an incumbent to an outside option (e.g., a market entrant). In a repeated trading task, participants decided whether or not to enter a loyalty rebate scheme and whether to continue buying within that scheme. Meeting the condition triggering the rebate was uncertain. Loyalty rebates considerably reduced the likelihood that participants switched to a higher-payoff outside option later. We conclude that loyalty rebates may inflict substantial harm on consumers and may have an underestimated potential to foreclose consumer markets.
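The mechanism the prediction rests on can be sketched numerically. Below is a hedged illustration using Tversky and Kahneman's (1992) CPT functional forms with their estimated median parameters; the rebate payoffs and framing are our own hypothetical example, not the experiment's design: a loss-averse, probability-weighting consumer can prefer staying in an uncertain rebate scheme over a sure outside option with higher expected value.

```python
# CPT toy: loss aversion plus probability weighting can make "giving up"
# an uncertain rebate feel worse than its expected value warrants.
# Parameters alpha=0.88, lam=2.25, gamma=0.61 are the standard 1992
# median estimates; the payoffs below are hypothetical.

def value(x, alpha=0.88, lam=2.25):
    """CPT value function: concave for gains, steeper (loss-averse) for losses."""
    return x ** alpha if x >= 0 else -lam * ((-x) ** alpha)

def weight(p, gamma=0.61):
    """Inverse-S probability weighting function."""
    return p ** gamma / ((p ** gamma + (1 - p) ** gamma) ** (1 / gamma))

def cpt_binary(p, x_win, x_lose=0.0):
    """CPT evaluation of a two-outcome gamble: x_win with prob p, else x_lose."""
    return weight(p) * value(x_win) + (1 - weight(p)) * value(x_lose)

# Stay: 70% chance of a 10-unit rebate (expected value 7).
# Switch: a sure gain of 8, but framed as also forgoing the rebate (a loss of 10).
stay = cpt_binary(0.70, 10.0)
switch = value(8.0) + value(-10.0)
print(stay > switch)  # True: the consumer sticks despite the better outside option
```

The asymmetry comes almost entirely from the loss term: coding the forgone rebate as a loss multiplies its weight by the loss-aversion coefficient, which is one candidate account of the "sticky" loyalty observed in the task.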
Radiation effects on the electronic structure of bilayer graphene
We report on the effects of laser illumination on the electronic properties of bilayer graphene. By using Floquet theory combined with Green's functions, we unveil the appearance of laser-induced gaps not only at integer multiples of ħΩ/2 (half the photon energy) but also at the Dirac point, with features which are shown to depend strongly on the laser polarization. Trigonal warping is shown to lead to important corrections for radiation in the THz range, reducing the size of the dynamical gaps. Furthermore, our analysis of the topological properties at low energies reveals that when irradiated with linearly polarized light, ideal bilayer graphene behaves as a trivial insulator, whereas circular polarization leads to a non-trivial insulator per valley.

Comment: 5 pages, 3 figures
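The origin of a dynamical gap at half the photon energy can be seen in an exactly solvable stand-in (our toy, not the bilayer calculation above): a two-level system driven by circularly polarized light, H(t) = (delta/2)σz + A(cos(Ωt)σx + sin(Ωt)σy), becomes time-independent in the rotating frame, giving quasienergies ±sqrt(((delta − Ω)/2)² + A²). All symbols and values below are hypothetical illustration parameters.

```python
# Quasienergy splitting of a circularly driven two-level system (hbar = 1).
# The avoided crossing at delta = Omega is the "dynamical gap" Floquet theory
# predicts at half the photon energy; its size 2A grows with laser amplitude.
import math

def quasienergy_gap(delta, omega, amplitude):
    """Splitting between the two Floquet quasienergies in the rotating frame."""
    return 2.0 * math.sqrt(((delta - omega) / 2.0) ** 2 + amplitude ** 2)

Omega, A = 1.0, 0.05
gaps = [quasienergy_gap(d, Omega, A) for d in (0.8, 0.9, 1.0, 1.1, 1.2)]
# The minimum gap, 2A, occurs on resonance (delta = Omega).
```

In the full bilayer problem the structure is far richer (multiple bands, trigonal warping, polarization dependence), but the same avoided-crossing mechanism underlies the gaps at multiples of ħΩ/2.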