343 research outputs found
Recommended from our members
A comparative survey of integrated learning systems
This paper presents the duction framework for unifying the three basic forms of inference - deduction, abduction, and induction - by specifying the possible relationships and influences among them in the context of integrated learning. Special assumptive forms of inference are defined that extend the use of these inference methods, and the properties of these forms are explored. A comparison to a related inference-based learning framework is made. Finally, several existing integrated learning programs are examined from the perspective of the duction framework.
Theory-driven learning: using intra-example relationships to constrain learning
We describe an incremental learning algorithm, called theory-driven learning, that creates rules to predict the effect of actions. Theory-driven learning exploits knowledge of regularities among rules to constrain the learning problem. We demonstrate that this knowledge enables the learning system to rapidly converge on accurate predictive rules and to tolerate more complex training data. An algorithm for incrementally learning these regularities is described, and we provide evidence that the resulting regularities are sufficiently general to facilitate learning in new domains.
Integrating explanation-based and empirical learning methods in OCCAM
This paper discusses an approach to integrating empirical and explanation-based learning techniques. The paper focuses on OCCAM, a program that has the capability to acquire via empirical means the knowledge needed for analytical learning. Two examples of this capability are discussed: the ability to use empirical techniques to acquire a domain theory for explanation-based learning, and the ability to use empirical learning techniques to find common patterns for causal relationships. These patterns encode a theory of causality (i.e., a set of general principles for recognizing causal relationships). Once acquired, a theory of causality can facilitate later learning by focusing on hypotheses which are consistent with the theory.
Physics Reach of DUNE with a Light Sterile Neutrino
We investigate the implications of one light eV-scale sterile neutrino on the physics potential of the proposed long-baseline experiment DUNE. If the future short-baseline experiments confirm the existence of sterile neutrinos, then it can affect the mass hierarchy (MH) and CP-violation (CPV) searches at DUNE. The MH sensitivity still remains above 5σ if the three new mixing angles (θ14, θ24, θ34) are all close to 9°. In contrast, it can decrease to 4σ if the least constrained mixing angle θ34 is close to its upper limit of 30°. We also assess the sensitivity to the CPV induced both by the standard CP-phase δ13, and the new CP-phases δ14 and δ34. In the 3+1 scheme, the discovery potential of CPV induced by δ13 gets deteriorated compared to the 3ν case. In particular, the maximal sensitivity (reached around δ13 ≈ ±90°) decreases from 5σ to 4σ if all the three new mixing angles are close to 9°. It can further diminish to almost 3σ if θ34 is large (≈30°). The sensitivity to the CPV due to δ14 can reach 3σ for an appreciable fraction of its true values. Interestingly, θ34 and its associated phase δ34 can influence both the νe appearance and νμ disappearance channels via matter effects, which in DUNE are pronounced. Hence, DUNE can also probe CPV induced by δ34 provided θ34 is large. We also reconstruct the two phases δ13 and δ14. The typical 1σ uncertainty on δ13 (δ14) is about 20° (30°) if θ34 = 0. The reconstruction of δ14 (but not that of δ13) degrades if θ34 is large.
Comment: 26 pages, 8 figures, 2 tables. Minor revisions. Accepted in JHEP
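The searches described in this abstract rest on neutrino oscillations. As generic background only (the standard two-flavor vacuum approximation, not the full 3+1 machinery the paper uses), the appearance probability can be sketched as:

```python
import math

def appearance_probability(sin2_2theta, dm2_ev2, length_km, energy_gev):
    """Two-flavor vacuum oscillation probability:
    P = sin^2(2*theta) * sin^2(1.27 * dm2 * L / E),
    with dm2 in eV^2, L in km, and E in GeV."""
    phase = 1.27 * dm2_ev2 * length_km / energy_gev
    return sin2_2theta * math.sin(phase) ** 2

# At the first oscillation maximum (phase = pi/2), the probability
# equals the amplitude sin^2(2*theta).
E = 2.5       # GeV, near DUNE's flux peak
dm2 = 2.5e-3  # eV^2, atmospheric mass-squared splitting
L = (math.pi / 2) / 1.27 * E / dm2  # km, distance to the first maximum
print(appearance_probability(1.0, dm2, L, E))
```

The 1.27 factor is the usual unit-conversion constant for these units; matter effects, which the abstract notes are pronounced at DUNE, are deliberately omitted from this sketch.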
A Survey of Machine Learning Systems Integrating Explanation-Based and Similarity-Based Methods
Two disparate machine learning approaches have received considerable attention. These are explanation-based and similarity-based learning. The basic goal of an explanation-based learning system is to more efficiently recognize concepts that it is already capable of recognizing. The learning process involves a knowledge-intensive analysis of an environment-provided example of a concept in order to extract its characteristic features. The basic goal of a similarity-based system, on the other hand, is to acquire descriptions that allow the system to recognize concepts it does not yet know. Although they have been applied with some success to problems in a variety of domains, both methods have clear deficiencies. Explanation-based learning assumes that a system will be provided with an explicit domain theory that is complete, correct, and tractable. This assumption is unrealistic for many complex, real-world domains. Similarity-based learning suffers because of its lack of an explicit theory. Since the two methods are complementary in nature, an obvious solution is to augment systems using one approach with techniques from the other. This survey discusses machine learning systems that integrate explanation-based and similarity-based learning methods such that one is incorporated primarily to handle a deficiency of the other. Although sufficient background material is provided so that the reader need not be familiar with machine learning, general knowledge of AI is assumed.
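To make the similarity-based side of this contrast concrete (this is an illustrative toy, not any system covered by the survey), a nearest-centroid learner acquires a concept description purely from labeled examples, with no domain theory at all:

```python
# Minimal similarity-based learner: represent each concept by the mean
# of its training examples' feature vectors, then classify new inputs
# by distance to those centroids. Purely empirical -- no domain theory.
def train_centroids(examples):
    """examples: list of (feature_tuple, label). Returns {label: centroid}."""
    sums, counts = {}, {}
    for features, label in examples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, v in enumerate(features):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {lab: tuple(v / counts[lab] for v in acc) for lab, acc in sums.items()}

def classify(centroids, features):
    def dist2(centroid):
        return sum((a - b) ** 2 for a, b in zip(centroid, features))
    return min(centroids, key=lambda lab: dist2(centroids[lab]))

data = [((1.0, 1.0), "cup"), ((1.2, 0.9), "cup"),
        ((5.0, 4.8), "bowl"), ((4.7, 5.1), "bowl")]
model = train_centroids(data)
print(classify(model, (1.1, 1.0)))  # a point near the "cup" cluster
```

An explanation-based system would instead start from a domain theory explaining *why* an example is a cup; the deficiency the survey highlights is exactly that such a theory must be supplied complete, correct, and tractable.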
Applying scrum to interior design and construction
For decades, product development has been accomplished through defined processes, such
as waterfall (Royce, 1970). Defined processes are those that have known inputs, repeatable
processes, and expected outputs. The assumption that innovation in product development can be
achieved through repeatable processes has resulted in most projects being completed over budget,
over schedule, not meeting user needs, or some combination thereof (Sherman, 2015).
Accommodating change and learning in a defined process is expensive. Construction and interior
design projects have followed a similar defined framework of assessing requirements, planning,
estimation, execution, and post-occupancy evaluations. This has resulted in projects delivered late,
projects delivered over budget, waste and rework, unreliable teams, and unsatisfied clients (Lean
Construction Institute, 2022). Solving complex problems requires empirical processes to meet user
needs. Empirical processes incorporate change and learning throughout the project lifecycle and
are based on three pillars: transparency, inspection, and adaptation. This research will focus on the
application of an empirical framework, namely Scrum (Schwaber, SCRUM Development Process,
1995), to construction and interior design projects. "Scrum is a lightweight framework that helps people, teams and organizations generate value through adaptive solutions for complex problems." (Schwaber & Sutherland, The 2020 Scrum Guide, 2020, p. 3) This study utilizes a case study and
survey revealing that Scrum can be utilized to deliver more value to clients, increase transparency,
reduce risk, and enhance employee engagement amongst project teams. Findings highlight
changes that can be made in the interior design and construction industry to achieve these results.
Thesis (M.S.)
Studio-based Learning: Pedagogy and Practices
Reviews interdisciplinary SBL scholarship and articulates evidence-based pedagogical principles.
Investigating the electric properties of a siliciclastic reservoir based on rock-physics modeling and laboratory experiments
This thesis is submitted for the degree of Doctor of Philosophy in Petroleum Geophysics at the Section of Petroleum Geology and Geophysics (PEGG), Department of Geosciences, University of Oslo. This study has been financially supported by the Research Council of Norway (NFR) and StatoilHydro within the framework of PETROMAKS (Programme for the Optimal Management of Petroleum Resources) through the project "Honoring the complexity of the petroleum reservoir - a new modeling tool for sea bed logging". One of the main goals of the project has been to extend Controlled-Source EM (CSEM) forward modeling by including a proper electric rock-physics description of a hydrocarbon reservoir. An extensive conductivity model of reservoir rocks based on Differential Effective Medium (DEM) theory has been developed. It was integrated with both 1.5D and 2.5D CSEM forward modeling tools, and the potential of this combined method to describe possible production effects on the CSEM response was demonstrated. A parallel effort has been to modify a triaxial cell so that it can carry out simultaneous resistivity and acoustic measurements at reservoir conditions. A variety of such tests employing core samples have been carried out to calibrate rock-physics models and to gain a basic understanding of the electric and elastic properties of reservoir rocks. The outcomes of this study are briefly presented in an introduction giving the background, main objectives, and contributions made, followed by three scientific papers (two published and one submitted) and four proceedings papers. The first paper focuses on the development of the DEM model, and the second and third papers discuss the modification of the triaxial cell and the corresponding simultaneous resistivity and acoustic measurements on core samples. The first three proceedings papers discuss implementations of various rock-physics models within CSEM forward modeling tools and show the influence of rock properties on the CSEM response.
The last proceedings paper compares the efficiency of different antenna types and orientations for detecting hydrocarbon layers employing CSEM.
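The thesis abstract does not reproduce its DEM equations. As generic rock-physics background only (an assumption of this sketch, not the thesis's model), Archie's empirical law illustrates the resistivity contrast that CSEM surveys exploit:

```python
def archie_resistivity(rw, porosity, sw, a=1.0, m=2.0, n=2.0):
    """Archie's law: Rt = a * Rw / (phi**m * Sw**n).
    rw: brine resistivity (ohm-m); porosity and sw as fractions;
    a, m, n: empirical tortuosity, cementation, and saturation exponents."""
    return a * rw / (porosity ** m * sw ** n)

# A hydrocarbon-bearing sand (low water saturation Sw) is far more
# resistive than the same rock fully brine-saturated -- this is the
# contrast a CSEM survey tries to detect.
print(archie_resistivity(0.1, 0.25, 1.0))  # brine-filled: 1.6 ohm-m
print(archie_resistivity(0.1, 0.25, 0.2))  # 80% hydrocarbon: 40.0 ohm-m
```

DEM theory, which the thesis actually develops, goes beyond Archie by building the effective conductivity from the microstructure of the rock's constituents rather than from purely empirical exponents.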
Efficient Learning and Inference for High-dimensional Lagrangian Systems
Learning the nature of a physical system is a problem that presents many challenges and opportunities owing to the unique structure associated with such systems. Many physical systems of practical interest in engineering are high-dimensional, which prohibits the application of standard learning methods to such problems. The first part of this work therefore proposes to solve learning problems associated with physical systems by identifying their low-dimensional Lagrangian structure. Algorithms are given to learn this structure in the case that it is obscured by a change of coordinates. The associated inference problem corresponds to solving a high-dimensional minimum-cost path problem, which can be solved by exploiting the symmetry of the problem. These techniques are demonstrated via an application to learning from high-dimensional human motion capture data. The second part of this work is concerned with the application of these methods to high-dimensional motion planning. Algorithms are given to learn and exploit the structure of holonomic motion planning problems effectively via spectral analysis and iterative dynamic programming, admitting solutions to problems of unprecedented dimension compared to known methods for optimal motion planning. The quality of solutions found is also demonstrated to be much superior in practice to those obtained via sampling-based planning and smoothing, in both simulated problems and experiments with a robot arm. This work therefore provides strong validation of the idea that learning low-dimensional structure is the key to future advances in this field.
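The inference step in this abstract is framed as a minimum-cost path problem. As a generic illustration of that problem class (Dijkstra's algorithm on a toy graph, not the symmetry-exploiting method the thesis develops):

```python
import heapq

def min_cost_path(graph, start, goal):
    """Dijkstra's algorithm. graph: {node: [(neighbor, cost), ...]}.
    Returns the minimum total cost from start to goal, or None."""
    frontier = [(0.0, start)]     # priority queue of (cost-so-far, node)
    best = {start: 0.0}           # cheapest known cost to each node
    while frontier:
        cost, node = heapq.heappop(frontier)
        if node == goal:
            return cost
        if cost > best.get(node, float("inf")):
            continue              # stale queue entry; a cheaper path exists
        for neighbor, step in graph.get(node, []):
            new_cost = cost + step
            if new_cost < best.get(neighbor, float("inf")):
                best[neighbor] = new_cost
                heapq.heappush(frontier, (new_cost, neighbor))
    return None

graph = {"a": [("b", 1.0), ("c", 4.0)],
         "b": [("c", 2.0), ("d", 5.0)],
         "c": [("d", 1.0)]}
print(min_cost_path(graph, "a", "d"))  # cheapest route is a -> b -> c -> d
```

In the high-dimensional setting the thesis addresses, the state graph is far too large to enumerate this way, which is why exploiting the problem's symmetry matters.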