1,705 research outputs found
Supporting Cyber-Physical Systems with Wireless Sensor Networks: An Outlook of Software and Services
Sensing, communication, computation, and control technologies are the essential building blocks of a cyber-physical system (CPS). Wireless sensor networks (WSNs) are a way to support CPS as they provide fine-grained spatio-temporal sensing, communication, and computation at low cost and power. In this article, we explore the fundamental concepts guiding the design and implementation of WSNs. We report the latest developments in WSN software and services for meeting existing requirements and newer demands, particularly in the areas of operating systems, simulators and emulators, programming abstractions, virtualization, IP-based communication and security, time and location services, and network monitoring and management. We also reflect on the ongoing efforts in providing dependable assurances for WSN-driven CPS. Finally, we illustrate the applicability with a case study on smart buildings.
Interoperability and Standards: The Way for Innovative Design in Networked Working Environments
Organised by: Cranfield University
In today's networked economy, strategic business partnerships and outsourcing have become the dominant paradigm: companies focus on core competencies and skills, such as creative design, manufacturing, or selling. However, achieving seamless interoperability is an ongoing challenge for these networks, due to their distributed and heterogeneous nature. Part of the solution relies on the adoption of standards for design and product data representation, but for sectors predominantly characterized by SMEs, such as the furniture sector, implementations need to be tailored to reduce costs. This paper recommends a set of best practices for the fast adoption of the ISO funStep standard modules and presents a framework that enables the use of visualization data as a way to reduce costs in manufacturing and electronic catalogue design.
ImageJ2: ImageJ for the next generation of scientific image data
ImageJ is an image analysis program extensively used in the biological
sciences and beyond. Due to its ease of use, recordable macro language, and
extensible plug-in architecture, ImageJ enjoys contributions from
non-programmers, amateur programmers, and professional developers alike.
Enabling such a diversity of contributors has resulted in a large community
that spans the biological and physical sciences. However, a rapidly growing
user base, diverging plugin suites, and technical limitations have revealed a
clear need for a concerted software engineering effort to support emerging
imaging paradigms, to ensure the software's ability to handle the requirements
of modern science. Due to these new and emerging challenges in scientific
imaging, ImageJ is at a critical development crossroads.
We present ImageJ2, a total redesign of ImageJ offering a host of new functionality. It separates concerns, fully decoupling the data model from the user interface. It emphasizes integration with external applications to maximize interoperability. Its robust new plugin framework allows everything from image formats to scripting languages to visualization to be extended by the community. The redesigned data model supports arbitrarily large, N-dimensional datasets, which are increasingly common in modern image acquisition. Despite the scope of these changes, backwards compatibility is maintained such that this new functionality can be seamlessly integrated with the classic ImageJ interface, allowing users and developers to migrate to these new methods at their own pace. ImageJ2 provides a framework engineered for flexibility, intended to support these requirements as well as to accommodate future needs.
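The extensible plugin framework described above can be illustrated with a small sketch. Note that ImageJ2 itself is Java-based (built on the SciJava plugin mechanism); the Python below is not ImageJ2's actual API, only a minimal, hypothetical illustration of a plugin registry that keeps an N-dimensional data model decoupled from any user interface.

```python
# Illustrative sketch (not ImageJ2's real API): a plugin registry that
# lets community code extend the system without touching the core, and
# a data model that knows nothing about any UI.

class Dataset:
    """Minimal N-dimensional data model: a shape plus flat values."""
    def __init__(self, shape, values):
        self.shape = tuple(shape)
        self.values = list(values)

PLUGINS = {}  # plugin kind -> {name: callable}

def register(kind, name):
    """Decorator used by plugin authors to contribute new behavior."""
    def wrap(fn):
        PLUGINS.setdefault(kind, {})[name] = fn
        return fn
    return wrap

@register("op", "invert")
def invert(ds, vmax=255):
    """An example community-contributed image operation."""
    return Dataset(ds.shape, [vmax - v for v in ds.values])

def run(kind, name, ds, **kwargs):
    """The core dispatches to whatever plugins happen to be registered."""
    return PLUGINS[kind][name](ds, **kwargs)

img = Dataset((2, 2), [0, 64, 128, 255])
out = run("op", "invert", img)
print(out.values)  # [255, 191, 127, 0]
```

Because the core only ever calls through the registry, image formats, scripts, and visualizations can all be added the same way, which is the design property the abstract emphasizes.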
Composition Challenges for Sensor Data Visualization
Connected objects and monitoring systems continuously produce data about their environment. Dashboards are then designed to aggregate and present these data to end-users. From a software engineering point of view, the technologies used to design and implement visualization dashboards are still in their infancy. This paper highlights how this domain could benefit from leveraging separation of concerns and software composition paradigms to support dashboard design.
Distributed Data Mining: The JAM system architecture
This paper describes the system architecture of JAM (Java Agents for Meta-learning), a distributed data mining system that scales up to large and physically separated data sets. An early version of the JAM system was described in an earlier KDD paper (Stolfo et al., 1997). Since then, JAM has evolved both architecturally and functionally, and here we present the final design and implementation details of this system architecture. JAM is an extensible agent-based distributed data mining system that supports the remote dispatch and exchange of agents among participating datasites and employs meta-learning techniques to combine the multiple models that are learned. One of JAM's target applications is fraud and intrusion detection in financial information systems. A brief description of this learning task, together with JAM's applicability and summary results, is also discussed.
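The meta-learning idea JAM builds on, learning models locally at each datasite and then combining them, can be sketched compactly. JAM itself is a Java agent system; the Python below is a hypothetical illustration only, using trivially simple one-feature threshold classifiers and majority voting as the combining strategy.

```python
# Illustrative sketch (not JAM's actual architecture): models are trained
# locally at each datasite, then exchanged and combined by a meta-learner
# (here, a simple majority vote over the site-local models).

def train_threshold(site_data):
    """'Learn' a one-feature classifier on local data: threshold at the
    midpoint between the class means."""
    pos = [x for x, y in site_data if y == 1]
    neg = [x for x, y in site_data if y == 0]
    t = (sum(pos) / len(pos) + sum(neg) / len(neg)) / 2
    return lambda x: 1 if x >= t else 0

def meta_classify(models, x):
    """Combine the site-local models: majority vote."""
    votes = sum(m(x) for m in models)
    return 1 if votes * 2 > len(models) else 0

# Three datasites, each with its own (feature, label) data; e.g. label 1
# might flag a fraudulent transaction amount.
sites = [
    [(1.0, 0), (2.0, 0), (8.0, 1), (9.0, 1)],
    [(0.5, 0), (2.5, 0), (7.0, 1), (9.5, 1)],
    [(1.5, 0), (3.0, 0), (8.5, 1), (10.0, 1)],
]
models = [train_threshold(s) for s in sites]  # learned locally, then exchanged
print(meta_classify(models, 8.2))  # 1
print(meta_classify(models, 1.2))  # 0
```

The key point the abstract makes is that only the learned models (agents), not the raw data, need to travel between sites, which is what lets the approach scale to large, physically separated data sets.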
HANDLING MULTILINGUAL CONTENT IN DIGITAL MEDIA: A CRITICAL ANALYSIS
This document expresses and analyzes the need to define a generic method for representing multilingual information in multimedia data. It describes the basic requirements that would bear upon such representations and establishes the potential link with ISO committee TC 37/SC 4 (Language Resource Management) and with XMT (eXtended MPEG-4 Textual format).
Implicit Incremental Model Analyses and Transformations
When models of a system change, analyses based on them have to be reevaluated in order for the results to stay meaningful. In many cases, the time to get updated analysis results is critical. This thesis proposes multiple, combinable approaches and a new formalism based on category theory for implicitly incremental model analyses and transformations. The advantages of the implementation are validated using seven case studies, partially drawn from the Transformation Tool Contest (TTC).
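The core idea of an incremental analysis, updating a cached result in response to model changes instead of recomputing it from scratch, can be sketched as follows. The names and the observer mechanism here are hypothetical illustrations, not the thesis's formalism.

```python
# Illustrative sketch of an incremental model analysis: the analysis
# subscribes to model-change notifications and patches its cached result
# in O(1) per change, rather than re-running over the whole model.

class Model:
    def __init__(self):
        self.elements = []       # model contents: (name, size) pairs
        self._observers = []

    def subscribe(self, fn):
        self._observers.append(fn)

    def add(self, name, size):
        self.elements.append((name, size))
        for fn in self._observers:
            fn("add", size)

    def remove(self, name):
        i = next(i for i, (n, _) in enumerate(self.elements) if n == name)
        _, size = self.elements.pop(i)
        for fn in self._observers:
            fn("remove", size)

class TotalSizeAnalysis:
    """Keeps 'total size of all elements' up to date as the model changes."""
    def __init__(self, model):
        self.total = sum(size for _, size in model.elements)
        model.subscribe(self._on_change)

    def _on_change(self, kind, size):
        self.total += size if kind == "add" else -size

m = Model()
a = TotalSizeAnalysis(m)
m.add("x", 3)
m.add("y", 5)
m.remove("x")
print(a.total)  # 5
```

"Implicit" incrementality, as the abstract uses the term, means the analysis author writes only the batch specification (here, the sum) and the incremental update behavior is derived automatically, rather than hand-coded as in `_on_change` above.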
Open Government Architecture: The evolution of De Jure Standards, Consortium Standards, and Open Source Software
Conducted for the Treasury Board of Québec, this study seeks to present recent contributions to the evolution, within an enterprise architecture context, of de jure and de facto standards by various actors in the milieu: industrial consortia and international standardization committees active in open source software. In order to achieve its goals of delivering services to citizens and society, the Government of Québec must integrate its computer systems to create a service-oriented open architecture. Following in the footsteps of various other governments and the European Community, such an integration will require the elaboration of an interoperability framework, i.e. a structured set of de jure standards, de facto standards, specifications, and policies allowing computer systems to interoperate. Thus, we recommend that the Government of Québec:
- Pursue its endeavours to elaborate an interoperability framework for its computer systems that is based on open de jure and de facto standards. This framework should not only reflect the criteria enumerated in this study and apply to internal computer systems, but should also extend to Web services supplied to organizations outside of the government. It should explicitly prioritize open de jure and de facto standards and include a policy covering free software. The framework should initially draw on that of the state of Massachusetts; in the medium term, it should be as comprehensive as that of the British government.
- Integrate this interoperability framework into its enterprise architecture.
- Publish this interoperability framework with its enterprise architecture.
- Specify this interoperability framework in its calls for tenders.
- Elaborate a policy of compliance with this framework for all new applications.