Recommended from our members
Theory of the perceived motion direction of equal-spatial-frequency plaid stimuli.
At an early stage, 3 different systems independently extract visual motion information from visual inputs. At later stages, these systems combine their outputs. Here, we consider a much studied (>650 publications) class of visual stimuli, plaids, which are combinations of 2 sine waves. Currently, there is no quantitative theory that can account for the perceived motion of plaids. We consider only perceived plaid direction, not speed, and obtain a large set of data exploring the various dimensions in which same-spatial-frequency plaids differ. We find that only 2 of the 3 motion systems are active in plaid processing, and that plaids with temporal frequencies 10 Hz or greater typically stimulate only the first-order motion system, which combines the plaid components by vector summation: Each plaid component is represented by a contrast-strength vector whose length is contrast-squared times a factor representing the relative effectiveness of that component's temporal frequency. The third-order system, which becomes primary at low temporal frequencies, also represents a plaid as 2 vectors that sum according to their contrast strength: a pure plaid in which both components have equal contrast and a residual sine wave. Second-order motion is irrelevant for these plaids. These principles enable a contrast-strength-vector summation theory for the responses of the first-order and third-order motion systems. With zero parameters estimated from the data, the theory captures the essence of the full range of the plaid data and supports the counterintuitive hypothesis that motion direction is processed independently of speed at early stages of visual processing. (PsycInfo Database Record (c) 2020 APA, all rights reserved)
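The vector-summation rule the abstract describes can be sketched in a few lines: each plaid component contributes a vector whose length is its contrast squared times a temporal-frequency weighting factor, and the predicted direction is the angle of the vector sum. This is a minimal illustration; the component values and unit weights below are placeholders, not the paper's fitted temporal-frequency factors.

```python
import math

def perceived_plaid_direction(components):
    """Predict the perceived direction (degrees) of a plaid by
    contrast-strength vector summation. Each component is a tuple
    (direction_deg, contrast, tf_weight), where tf_weight stands in
    for the relative effectiveness of that component's temporal
    frequency (placeholder values, not the paper's fitted factors)."""
    x = y = 0.0
    for direction_deg, contrast, tf_weight in components:
        # vector length = contrast squared times temporal-frequency factor
        strength = contrast ** 2 * tf_weight
        theta = math.radians(direction_deg)
        x += strength * math.cos(theta)
        y += strength * math.sin(theta)
    return math.degrees(math.atan2(y, x))

# Two equal-contrast components moving at +30 and -30 degrees:
# the summed vector points straight ahead (about 0 degrees).
print(perceived_plaid_direction([(30, 0.5, 1.0), (-30, 0.5, 1.0)]))
```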
The future of technology-enhanced active learning – a roadmap
The notion of active learning refers to the active involvement of the learner in the learning process, capturing ideas of learning-by-doing and the fact that active participation and knowledge construction lead to deeper and more sustained learning. Interactivity, in particular learner–content interaction, is a central aspect of technology-enhanced active learning. In this roadmap, the pedagogical background is discussed, the essential dimensions of technology-enhanced active learning systems are outlined, and the factors that are expected to influence these systems now and in the future are identified. A central aim is to address this promising field from a best-practices perspective, clarifying central issues and formulating an agenda for future developments in the form of a roadmap.
Managing evolution and change in web-based teaching and learning environments
The state of the art in information technology and educational technologies is evolving constantly. Courses taught are subject to constant change for organisational and subject-specific reasons. Evolution and change affect educators and developers of computer-based teaching and learning environments alike – both often being unprepared to respond effectively. A large number of educational systems are designed and developed without change and evolution in mind. We present our approach to the design and maintenance of these systems in rapidly evolving environments and illustrate the consequences of evolution and change for these systems and for the educators and developers responsible for their implementation and deployment. We discuss various factors of change, illustrated by a Web-based virtual course, with the objective of raising awareness of the issue of evolution and change in computer-supported teaching and learning environments. This discussion leads towards the establishment of a development and management framework for teaching and learning systems.
Total data quality management: a study of bridging rigor and relevance
Ensuring data quality is of crucial importance to organizations. The Total Data Quality Management (TDQM) theory provides a methodology to ensure data quality. Although well researched, the TDQM methodology is not easy to apply. In the case of Honeywell Emmen, we found that applying the methodology requires considerable contextual redesign, flexibility in use, and the provision of practical tools. We identified team composition, toolsets, the development of obvious actions, the design of phases, steps, and actions, and sessions as vital elements of making an academically rooted methodology applicable. We call such an applicable methodology "well articulated", because it incorporates the existing academic theory and is made operational. This enables the methodology to be systematically beta-tested and made useful for different organizational conditions.
Protocols for Integrity Constraint Checking in Federated Databases
A federated database comprises multiple interconnected database systems that primarily operate independently but cooperate to a certain extent. Global integrity constraints can be very useful in federated databases, but the lack of global queries, global transaction mechanisms, and global concurrency control renders traditional constraint management techniques inapplicable. This paper presents a threefold contribution to integrity constraint checking in federated databases: (1) The problem of constraint checking in a federated database environment is clearly formulated. (2) A family of protocols for constraint checking is presented. (3) The differences across protocols in the family are analyzed with respect to system requirements, properties guaranteed by the protocols, and processing and communication costs. Thus, our work yields a suite of options from which a protocol can be chosen to suit the system capabilities and integrity requirements of a particular federated database environment.
Integrity Constraint Checking in Federated Databases
A federated database comprises multiple interconnected databases that cooperate in an autonomous fashion. Global integrity constraints are very useful in federated databases, but the lack of global queries, global transaction mechanisms, and global concurrency control renders traditional constraint management techniques inapplicable. The paper presents a threefold contribution to integrity constraint checking in federated databases: (1) the problem of constraint checking in a federated database environment is clearly formulated; (2) a family of cooperative protocols for constraint checking is presented; (3) the differences across protocols in the family are analyzed with respect to system requirements, properties guaranteed, and costs involved. Thus, we provide a suite of options with protocols for various environments with specific system capabilities and integrity requirements.
A Review of integrity constraint maintenance and view updating techniques
Two interrelated problems may arise when updating a database. On one hand, when an update is applied to the database, integrity constraints may become violated. In such a case, the integrity constraint maintenance approach tries to obtain additional updates to keep the integrity constraints satisfied. On the other hand, when updates of derived or view facts are requested, a view updating mechanism must be applied to translate the update request into correct updates of the underlying base facts.
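The first of the two problems – deriving additional updates that restore a violated constraint – can be illustrated with a toy in-memory store. The relations, the foreign-key-style constraint, and the repair policy below are invented for illustration, not taken from any surveyed method.

```python
# Toy database: relations as sets of tuples (invented for illustration).
emp = {("ann", "sales")}
dept = {("sales",)}

def insert_emp(name, d):
    """Insert an employee tuple. If the constraint 'every emp.dept
    must appear in dept' would be violated, the maintenance step
    derives an additional update (inserting the missing dept tuple)
    so that the constraint is satisfied again."""
    emp.add((name, d))
    if (d,) not in dept:   # constraint check after the update
        dept.add((d,))     # additional repairing update

insert_emp("bob", "hr")
print(("hr",) in dept)  # prints True: the repair added the missing dept
```

Real maintenance methods must also choose among alternative repairs (e.g., rejecting the update instead of inserting the missing tuple); the sketch hard-codes one policy.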
This survey reviews the research performed on integrity constraint maintenance and view updating. We propose a general framework to classify and compare methods that tackle integrity constraint maintenance and/or view updating. We then analyze some of these methods in more detail to identify their actual contributions and the main limitations they may present.
AGM, a dataflow database machine
In recent years, a number of database machines consisting of large numbers of parallel processing elements have been proposed. Unfortunately, one of the main limitations to parallelism in database processing is the I/O bandwidth of the underlying storage devices. One way to solve this problem is to use multiple parallel disk units. The main problem with this approach, however, is the lack of a computational model capable of utilizing the potential of any significant number of such devices. This paper presents a database model which is based on the principles of data-driven computation. According to this model, the database is represented as a network in which each node is conceptually an independent processing element, capable of communicating with other nodes by exchanging messages along the network arcs. To answer a query, one or more such messages, called tokens, are created and injected into the network. These then propagate asynchronously through the network in search of results satisfying the given query. To investigate the performance of the proposed system, we have implemented the model on a simulated computer architecture. The results of the simulation experiments indicate that the model is capable of exploiting the potential I/O bandwidth of a large number of disk units as well as the computational power of the associated processing elements.
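The token-propagation idea can be sketched with an ordinary queue standing in for the asynchronous message passing. The network topology, the records, and the query below are invented for illustration; AGM's actual model runs each node as an independent processing element.

```python
from collections import deque

# Hypothetical network: each node holds a record and arcs to neighbours
# (names and record contents are invented for illustration).
nodes = {
    "a": {"record": {"city": "Oslo"},   "arcs": ["b", "c"]},
    "b": {"record": {"city": "Bergen"}, "arcs": ["c"]},
    "c": {"record": {"city": "Oslo"},   "arcs": []},
}

def run_query(start, predicate):
    """Inject a token at `start` and let it propagate along the arcs,
    collecting every record that satisfies the query predicate."""
    results, visited = [], set()
    tokens = deque([start])        # injected query tokens
    while tokens:                  # sequential stand-in for async propagation
        node = tokens.popleft()
        if node in visited:        # a node processes each token stream once
            continue
        visited.add(node)
        rec = nodes[node]["record"]
        if predicate(rec):
            results.append(rec)
        tokens.extend(nodes[node]["arcs"])  # forward the token downstream
    return results

# Collects the matching records from nodes a and c.
print(run_query("a", lambda r: r["city"] == "Oslo"))
```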
Assessment of Road Crossings for Improving Migratory Fish Passage in the Winnicut River Watershed
This report summarizes the results of a river continuity assessment focused on road-stream crossings. The Winnicut River is the site of a restoration project that removed a head-of-tide dam, resulting in the only free-flowing major tributary to the Great Bay Estuary. The river system currently supports a small annual run of river herring, and with the removal of the dam and ladder system, migratory fish will now have access to a total of 37 miles of potential upstream habitat.
In anticipation of improved access, The Nature Conservancy conducted a fish passage assessment for all stream crossings above the head-of-tide dam. We used an assessment methodology based on the Massachusetts Riverways Program, with adjustments following a similar crossing study in the Ashuelot River system (NH).
We assessed a total of 42 road crossings in the Winnicut watershed, and classified them as severe, moderate, minor, or passable for fish passage. One crossing was identified as severe, thirty-five were moderate, six were minor, and no crossings were determined to be fully passable for all fish.
To develop a priority list of crossings for improvements, we focused on culverts with moderate or severe barrier rankings and screened out crossings associated with major highway infrastructure. We then used GIS analysis to determine the habitat potential upstream of each crossing, and prioritized crossings with greater than 0.5 miles of upstream habitat. We ordered priority crossings from nearest to furthest from the dam site at the river mouth. Our analysis produced a final list of 11 crossings that, if all were improved, would reestablish 19.5 miles of unfragmented habitat for migratory fish.
We are sharing the results of this study with local and state officials in hopes of securing funds and making structural enhancements to priority road crossings. Going forward, we hope that this information will lead to increases in migratory fish populations in the Winnicut River and throughout the entire Great Bay Estuary.