4,075 research outputs found
An ambient agent architecture exploiting automated cognitive analysis
In this paper an agent-based ambient agent architecture is presented, based on monitoring a human's interaction with his or her environment and performing cognitive analysis of the causes of observed or predicted behaviour. Within this architecture, a cognitive model of the human is taken as a point of departure. From the cognitive model it is automatically derived how internal cognitive states affect the human's performance. Furthermore, for these cognitive states, representation relations are derived from the cognitive model, expressed as temporal specifications involving events that will be monitored. The representation relations are verified automatically against the monitoring information, resulting in the identification of the cognitive states that affect performance. In this way the ambient agent model is able to provide a more in-depth cognitive analysis of the causes of (un)satisfactory performance and, based on this analysis, to generate interventions in a knowledgeable manner. The application of the proposed architecture is demonstrated by two examples, one from the ambient-assisted living domain and one from the computer-assisted instruction domain. © 2011 The Author(s)
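To make the idea of verifying a temporal representation relation over monitored events concrete, here is a minimal sketch, not the paper's formalism: a hypothetical cognitive state ("fatigued") is inferred when monitored reaction-time events exceed a threshold for a sustained window. The event format, threshold, and window size are all illustrative assumptions.

```python
# Minimal sketch (assumed event format, not the paper's formalism):
# infer a hypothetical cognitive state from a stream of monitored events.
from collections import deque

def infer_fatigue(events, threshold_ms=600, window=3):
    """events: list of (timestamp, reaction_time_ms) pairs.
    Returns True once `window` consecutive reaction times exceed
    `threshold_ms`, i.e. the temporal pattern for 'fatigued' holds."""
    recent = deque(maxlen=window)
    for _, rt in events:
        recent.append(rt > threshold_ms)
        if len(recent) == window and all(recent):
            return True
    return False

events = [(0, 420), (1, 650), (2, 700), (3, 710)]
print(infer_fatigue(events))  # → True
```

In the architecture described above, a positive identification like this would then drive the choice of intervention.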
Internet of robotic things : converging sensing/actuating, hypoconnectivity, artificial intelligence and IoT Platforms
The Internet of Things (IoT) concept is evolving rapidly and influencing new developments in various application domains, such as the Internet of Mobile Things (IoMT), Autonomous Internet of Things (A-IoT), Autonomous System of Things (ASoT), Internet of Autonomous Things (IoAT), Internet of Things Clouds (IoT-C) and the Internet of Robotic Things (IoRT), all of which are advancing by using IoT technology. The IoT influence presents new development and deployment challenges in different areas, such as seamless platform integration, context-based cognitive network integration, new mobile sensor/actuator network paradigms, things identification (addressing and naming in IoT), dynamic things discoverability, and many others. The IoRT presents new convergence challenges that need to be addressed: on one side, the programmability and communication of multiple heterogeneous mobile/autonomous/robotic things for cooperation, along with their coordination, configuration, exchange of information, security, safety and protection. Developments in IoT heterogeneous parallel processing/communication and dynamic systems based on parallelism and concurrency require new ideas for integrating the intelligent "devices", collaborative robots (COBOTs), into IoT applications. Dynamic maintainability, self-healing, self-repair of resources, changing resource state, (re-)configuration, and context-based IoT systems for service implementation and integration with IoT network service composition are of paramount importance when new "cognitive devices" become active participants in IoT applications. This chapter aims to give an overview of the IoRT concept, technologies, architectures and applications, and to provide comprehensive coverage of future challenges, developments and applications.
Neural Networks for Modeling and Control of Particle Accelerators
We describe some of the challenges of particle accelerator control, highlight
recent advances in neural network techniques, discuss some promising avenues
for incorporating neural networks into particle accelerator control systems,
and describe a neural network-based control system that is being developed for
resonance control of an RF electron gun at the Fermilab Accelerator Science and
Technology (FAST) facility, including initial experimental results from a
benchmark controller.
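The basic pattern behind neural-network feedback control mentioned above can be sketched in a few lines. This is purely illustrative and not the FAST controller: a small feedforward network maps a short history of resonance-error samples to a corrective actuator signal; the layer sizes, weights, and units are assumptions.

```python
# Illustrative sketch (not the FAST controller): a small MLP maps recent
# resonance-error samples to a corrective actuator signal.
import numpy as np

rng = np.random.default_rng(0)

class MLPController:
    def __init__(self, n_in=4, n_hidden=8):
        # Randomly initialized weights; a real controller would be trained
        # on measured plant responses.
        self.W1 = rng.normal(0.0, 0.1, (n_hidden, n_in))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.1, (1, n_hidden))
        self.b2 = np.zeros(1)

    def act(self, error_history):
        """error_history: the last n_in error samples (arbitrary units)."""
        h = np.tanh(self.W1 @ error_history + self.b1)
        return (self.W2 @ h + self.b2).item()  # scalar control output

ctrl = MLPController()
u = ctrl.act(np.array([0.5, 0.4, 0.3, 0.2]))  # recent detuning errors
print(u)
```

In practice the network would be trained against a model or measurements of the RF gun's thermal response, which is where the advances surveyed in the paper come in.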
Robotic ubiquitous cognitive ecology for smart homes
Robotic ecologies are networks of heterogeneous robotic devices pervasively embedded in everyday environments, where they cooperate to perform complex tasks. While their potential makes them increasingly popular, one fundamental problem is how to make them both autonomous and adaptive, so as to reduce the amount of preparation, pre-programming and human supervision that they require in real-world applications. The project RUBICON develops learning solutions which yield cheaper, adaptive and efficient coordination of robotic ecologies. The approach we pursue builds upon a unique combination of methods from cognitive robotics, machine learning, planning and agent-based control, and wireless sensor networks. This paper illustrates the innovations advanced by RUBICON on each of these fronts before describing how the resulting techniques have been integrated and applied to a smart home scenario. The resulting system is able to provide useful services and pro-actively assist the users in their activities. RUBICON learns through an incremental and progressive approach driven by the feedback received from its own activities and from the user, while also self-organizing the manner in which it uses available sensors, actuators and other functional components in the process. This paper summarises some of the lessons learned by adopting such an approach and outlines promising directions for future work.
A COGNITIVE ARCHITECTURE FOR AMBIENT INTELLIGENCE
Ambient Intelligence (AmI) systems are characterized by the use of pervasive
equipment for monitoring and modifying the environment according to users' needs and to globally defined constraints. Furthermore, such systems cannot ignore requirements of ubiquity, scalability, and transparency to the user. An enabling technology capable of accomplishing these goals is represented by Wireless Sensor Networks (WSNs), characterized by low cost and unobtrusiveness. However, although provided with in-network processing capabilities, WSNs do not exhibit processing features able to support comprehensive intelligent systems; on the other hand, without these pre-processing activities the wealth of sensory data may easily overwhelm a centralized AmI system, clogging it with superfluous details.
This work proposes a cognitive architecture able to perceive, decide upon, and control the environment of which the system is part, based on a new approach to knowledge extraction from raw data that addresses this issue at different abstraction levels. WSNs are used as the pervasive sensory tool, and their computational capabilities are exploited to remotely perform preliminary data processing. A central intelligent unit subsequently extracts higher-level concepts in order to carry out symbolic reasoning. The aim of the reasoning is to plan a sequence of actions that will lead the environment to a state as close as possible to the users' desires, taking into account both implicit and explicit feedback from the users, while considering global system-driven goals, such as energy saving. The proposed conceptual architecture was exploited to develop a testbed providing the hardware and software tools for the development and management of AmI applications based on WSNs, whose main goal is energy saving for global sustainability. In order to make the AmI system able to communicate with the external world in a reliable way when services are requested from external agents, the architecture was enriched with a distributed reputation management protocol.
A sample application exploiting the testbed features was implemented to address temperature control in a work environment. Knowledge about the user's presence is obtained through a multi-sensor data fusion module based on Bayesian networks, and this information is exploited by a multi-objective fuzzy controller that operates on actuators taking into account users' preferences and energy consumption constraints.
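The pipeline described in the abstract above, probabilistic presence inference feeding a comfort/energy trade-off controller, can be sketched minimally. This is not the thesis implementation: the sensor likelihoods, the naive-Bayes fusion, and the fuzzy-style blending rule are all assumed for illustration.

```python
# Minimal sketch (assumed sensor models, not the thesis implementation):
# naive-Bayes fusion of two presence sensors, feeding a fuzzy-style rule
# that trades user comfort against energy saving.
def presence_posterior(pir, co2_high, prior=0.3):
    """Fuse a PIR motion reading and a CO2-level reading into P(present)."""
    # (P(reading | present), P(reading | absent)) — illustrative values only.
    lik = {
        "pir": {True: (0.9, 0.1), False: (0.1, 0.9)},
        "co2": {True: (0.7, 0.2), False: (0.3, 0.8)},
    }
    p_pres, p_abs = prior, 1 - prior
    for name, reading in (("pir", pir), ("co2", co2_high)):
        lp, la = lik[name][reading]
        p_pres *= lp
        p_abs *= la
    return p_pres / (p_pres + p_abs)

def heater_power(p_present, temp_error):
    """Fuzzy-style blend: heat in proportion to presence belief and to
    how far the room is below the setpoint (temp_error, in degrees)."""
    comfort = max(0.0, min(1.0, temp_error / 3.0))
    return round(p_present * comfort, 2)

p = presence_posterior(pir=True, co2_high=True)
print(heater_power(p, temp_error=2.0))  # → 0.62
```

A full Bayesian-network module would model dependencies between sensors rather than assuming independence, and a real multi-objective fuzzy controller would use membership functions and a rule base instead of a single product.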
Advanced Knowledge Technologies at the Midterm: Tools and Methods for the Semantic Web
The University of Edinburgh and research sponsors are authorised to reproduce and distribute reprints and on-line copies for their purposes notwithstanding any copyright annotation hereon. The views and conclusions contained herein are the author's and should not be interpreted as necessarily representing the official policies or endorsements, either expressed or implied, of other parties.
In a celebrated essay on the new electronic media, Marshall McLuhan wrote in 1962: "Our private senses are not closed systems but are endlessly translated into each other in that experience which we call consciousness. Our extended senses, tools, technologies, through the ages, have been closed systems incapable of interplay or collective awareness. Now, in the electric age, the very instantaneous nature of co-existence among our technological instruments has created a crisis quite new in human history. Our extended faculties and senses now constitute a single field of experience which demands that they become collectively conscious. Our technologies, like our private senses, now demand an interplay and ratio that makes rational co-existence possible. As long as our technologies were as slow as the wheel or the alphabet or money, the fact that they were separate, closed systems was socially and psychically supportable. This is not true now when sight and sound and movement are simultaneous and global in extent." (McLuhan 1962, p.5, emphasis in original)
Over forty years later, the seamless interplay that McLuhan demanded between our technologies is still barely visible. McLuhan's predictions of the spread, and increased importance, of electronic media have of course been borne out, and the worlds of business, science and knowledge storage and transfer have been revolutionised. Yet the integration of electronic systems as open systems remains in its infancy.
Advanced Knowledge Technologies (AKT) aims to address this problem: to create a view of knowledge and its management across its lifecycle, and to research and create the services and technologies that such unification will require. Half way through its six-year span, the results are beginning to come through, and this paper will explore some of the services, technologies and methodologies that have been developed. We hope to give a sense in this paper of the potential for the next three years, to discuss the insights and lessons learnt in the first phase of the project, and to articulate the challenges and issues that remain.
The WWW provided the original context that made the AKT approach to knowledge management (KM) possible. When AKT was initially proposed in 1999, it brought together an interdisciplinary consortium with the technological breadth and complementarity to create the conditions for a unified approach to knowledge across its lifecycle. The combination of this expertise, and the time and space afforded the consortium by the IRC structure, suggested the opportunity for a concerted effort to develop an approach to advanced knowledge technologies, based on the WWW as a basic infrastructure. The technological context of AKT altered for the better in the short period between the development of the proposal and the beginning of the project itself with the development of the semantic web (SW), which foresaw much more intelligent manipulation and querying of knowledge. The opportunities that the SW provided for, e.g., more intelligent retrieval put AKT in the centre of information technology innovation and knowledge management services; the AKT skill set would clearly be central for the exploitation of those opportunities.
The SW, as an extension of the WWW, provides an interesting set of constraints on the knowledge management services AKT tries to provide. As a medium for the semantically-informed coordination of information, it has suggested a number of ways in which the objectives of AKT can be achieved, most obviously through the provision of knowledge management services delivered over the web, as opposed to the creation and provision of technologies to manage knowledge. AKT is working on the assumption that many web services will be developed and provided for users. The KM problem in the near future will be one of deciding which services are needed and of coordinating them. Many of these services will be largely or entirely legacies of the WWW, and so the capabilities of the services will vary. As well as providing useful KM services in their own right, AKT will be aiming to exploit this opportunity by reasoning over services, brokering between them, and providing essential meta-services for SW knowledge service management.
Ontologies will be a crucial tool for the SW. The AKT consortium brings a lot of expertise on ontologies together, and ontologies were always going to be a key part of the strategy. All kinds of knowledge sharing and transfer activities will be mediated by ontologies, and ontology management will be an important enabling task. Different applications will need to cope with inconsistent ontologies, or with the problems that will follow the automatic creation of ontologies (e.g. merging of pre-existing ontologies to create a third). Ontology mapping, and the elimination of conflicts of reference, will be important tasks. All of these issues are discussed along with our proposed technologies. Similarly, specifications of tasks will be used for the deployment of knowledge services over the SW, but in general it cannot be expected that in the medium term there will be standards for task (or service) specifications. The brokering meta-services that are envisaged will have to deal with this heterogeneity.
The emerging picture of the SW is one of great opportunity, but it will not be a well-ordered, certain or consistent environment. It will comprise many repositories of legacy data, outdated and inconsistent stores, and requirements for common understandings across divergent formalisms. There is clearly a role for standards to play to bring much of this context together, and AKT is playing a significant role in these efforts. But standards take time to emerge, they take political power to enforce, and they have been known to stifle innovation (in the short term). AKT is keen to understand the balance between principled inference and statistical processing of web content. Logical inference on the Web is tough: complex queries using traditional AI inference methods bring most distributed computer systems to their knees. Do we set up semantically well-behaved areas of the Web? Is any part of the Web in which semantic hygiene prevails interesting enough to reason in? These and many other questions need to be addressed if we are to provide effective knowledge technologies for our content on the web.
Cognitive assisted living ambient system: a survey
The demographic change towards an aging population is creating a significant impact and introducing drastic challenges to our society. We therefore need to find ways to assist older people to live independently and to prevent social isolation of this population. Information and Communication Technologies (ICT) provide various solutions to help older adults improve their quality of life, stay healthier, and live independently for longer. Ambient Assisted Living (AAL) is a field that investigates innovative technologies to provide assistance as well as healthcare and rehabilitation to impaired seniors. This paper provides a review of the research background and technologies of AAL.
- âŠ