Evolving, probabilistic spiking neural networks and neurogenetic systems for spatio- and spectro-temporal data modelling and pattern recognition
Spatio- and spectro-temporal data (SSTD) are the most common types of data collected in many domain areas, including engineering, bioinformatics, neuroinformatics, ecology, environment, medicine, economics, etc. However, there is a lack of methods for the efficient analysis of such data and for spatio-temporal pattern recognition (STPR). The brain functions as a spatio-temporal information processing machine and deals extremely well with spatio-temporal data. Its organisation and functions have been the inspiration for the development of new methods for SSTD analysis and STPR. The brain-inspired spiking neural networks (SNN) are considered the third generation of neural networks and are a promising paradigm for the creation of new intelligent ICT for SSTD. This new generation of computational models and systems is potentially capable of modelling complex information processes due to its ability to represent and integrate different information dimensions, such as time, space, frequency, and phase, and to deal with large volumes of data in an adaptive and self-organising manner. The paper reviews methods and systems of SNN for SSTD analysis and STPR, including single neuronal models, evolving spiking neural networks (eSNN) and computational neuro-genetic models (CNGM). Software and hardware implementations and some pilot applications for audio-visual pattern recognition, EEG data analysis, cognitive robotic systems, BCI, neurodegenerative diseases, and others are discussed
Task analysis: the missing link in software development methodologies
Systems development methods, or software methodologies, have evolved considerably over the past few years. This development has tended to fall into two main areas: Software Engineering and Human-Computer Interaction (HCI). The two main techniques proposed in Software Engineering were Structured Analysis, as proposed by Ross and DeMarco, and Semantic Modelling. These two different approaches were later combined to yield Modern Structured Analysis, in which Structured Analysis was augmented with data modelling techniques.
Modern Structured Analysis was subsequently replaced by Object Oriented Analysis and Design (OOAD) which adopted a holistic approach to data and processes, encapsulating them into objects.
In the HCI domain, design methods such as Hierarchical Task Analysis (HTA) and Task Analysis for Knowledge Descriptions (TAKD) have long been used to model the cognitive nature of the tasks performed by users. Recent work by Walsh, Um, Long and Sutcliffe has proposed combining Task Analysis (TA) with Structured Analysis and Design methods in order to improve system usability. Analysis for Task Object Modelling (ATOM), as proposed by Walsh, is an example of such a method, combining TA with object modelling in an integrated life-cycle approach.
This article reviews the major Software Engineering methods together with the principal HCI methods, and argues for the integration of the two areas on the basis of improved system usability. A taxonomy of software development methods proposed by Blum is reviewed, and a proposal is made to augment the framework to include user-centred design methods. The extended framework is then used to classify several of the principal software design methodologies, together with the principal HCI methods. Each of these methodologies is reviewed and conclusions are drawn as to the efficacy of each in the context of the software life cycle. We demonstrate that all of the traditional design methodologies fail to include Task Analysis (TA). An alternative methodology, Analysis for Task Object Modelling (ATOM), as proposed by Walsh, which combines TA with object modelling, is then discussed. We argue that TA is an essential part of Requirements Analysis and HCI design, and that failure to include it may result in serious usability problems. Methods like ATOM, which combine TA with OOAD, are thus the most applicable software methodologies for designing usable systems in the future. Further research, however, is needed to improve and integrate the conceptual modelling techniques in ATOM
Using protocol analysis to explore the creative requirements engineering process
Protocol analysis is an empirical method applied by researchers in cognitive psychology and behavioural analysis. It can be used to collect, document and analyse the thought processes of an individual problem solver. In general, research subjects are asked to think aloud while performing a given task. Their verbal reports are transcribed and represent a sequence of their thoughts and cognitive activities. These verbal reports are analysed to identify relevant segments of cognitive behaviour by the research subjects. The analysis results may be cross-examined or validated through retrospective interviews with the research subjects. This paper offers a critical analysis of this research method, its approaches to data collection and analysis, and its strengths and limitations, and discusses its use in information systems research. The aim is to explore the use of protocol analysis in studying the creative requirements engineering process.
Human Activity Modelling in the Specification of Operational Requirements: Work in Progress
This paper describes our experience of integrating HCI concepts and techniques into a concurrent requirements engineering process called RESCUE. We focus on the use of a model of current human activity to inform specification of a future system. We show how human activity descriptions, written using a specially designed template, can facilitate the authoring of use case descriptions to be used in the elicitation of requirements for complex socio-technical systems. We describe our experience of using descriptions of human activity, written using the template, to support specification of operational requirements for DMAN, a system to support air traffic controllers in managing the departure of aircraft from airports. We end with a discussion of lessons learnt from our experience and present some ideas for future development of work in this area
Consciousness in Cognitive Architectures. A Principled Analysis of RCS, Soar and ACT-R
This report analyses the applicability of the principles of consciousness developed in the ASys project to three of the most relevant cognitive architectures. This is done in relation to their applicability for building integrated control systems, and by studying their support for general mechanisms of real-time consciousness.
To analyse these architectures the ASys Framework is employed. This is a conceptual framework based on an extension of the General Systems Theory (GST) for cognitive autonomous systems.
General qualitative evaluation criteria for cognitive architectures are established based upon: a) requirements for a cognitive architecture, b) the theoretical framework based on the GST and c) core design principles for integrated cognitive conscious control systems
Family of 2-simplex cognitive tools and their application for decision-making and its justifications
The urgency of applying and developing cognitive graphic tools for use in intelligent systems for data analysis, decision-making and its justifications is discussed. The cognitive graphic tool "2-simplex prism" and examples of its usage are presented. The specifics of a program implementation of cognitive graphics tools invariant to problem areas are described. The most significant results are given and discussed. Future investigations concern the use of a new approach to rendering, cross-platform implementation, improvement of cognitive features and expansion of the n-simplex family.
Comment: 14 pages, 6 figures, conference
Scientific requirements for an engineered model of consciousness
The building of a non-natural conscious system requires more than the design of physical or virtual machines with intuitively conceived abilities, philosophically elucidated architecture or hardware homologous to an animal’s brain. Human society might one day treat a type of robot or computing system as an artificial person. Yet that would not answer scientific questions about the machine’s consciousness or otherwise. Indeed, empirical tests for consciousness are impossible because no such entity is denoted within the theoretical structure of the science of mind, i.e. psychology. However, contemporary experimental psychology can identify if a specific mental process is conscious in particular circumstances, by theory-based interpretation of the overt performance of human beings. Thus, if we are to build a conscious machine, the artificial systems must be used as a test-bed for theory developed from the existing science that distinguishes conscious from non-conscious causation in natural systems. Only such a rich and realistic account of hypothetical processes accounting for observed input/output relationships can establish whether or not an engineered system is a model of consciousness. It follows that any research project on machine consciousness needs a programme of psychological experiments on the demonstration systems and that the programme should be designed to deliver a fully detailed scientific theory of the type of artificial mind being developed – a Psychology of that Machine
Applying a Fuzzy-Morphological approach to complexity within management decision-making