A Blockchain-based Approach for Data Accountability and Provenance Tracking
The recent approval of the General Data Protection Regulation (GDPR) imposes
new data protection requirements on data controllers and processors with
respect to the processing of European Union (EU) residents' data. These
requirements consist of a single set of rules that have binding legal status
and should be enforced in all EU member states. In light of these requirements,
we propose in this paper the use of a blockchain-based approach to support data
accountability and provenance tracking. Our approach relies on the use of
publicly auditable contracts deployed in a blockchain that increase the
transparency with respect to the access and usage of data. We identify and
discuss three different models for our approach with different granularity and
scalability requirements where contracts can be used to encode data usage
policies and provenance tracking information in a privacy-friendly way. From
these three models we designed, implemented, and evaluated a model where
contracts are deployed by data subjects for each data controller, and a model
where subjects join contracts deployed by data controllers in case they accept
the data handling conditions. Our implementations show in practice the
feasibility and limitations of contracts for the purposes identified in this
paper.
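The two evaluated models can be sketched abstractly. The following Python classes are a hypothetical illustration only (the paper's actual artifacts are smart contracts deployed on a blockchain; the class and method names here are invented for exposition), contrasting a subject-deployed per-controller contract with a controller-deployed contract that subjects join:

```python
# Hypothetical sketch of the two evaluated contract models; all names
# and fields are illustrative, not the paper's smart-contract interface.

class UsageContract:
    """A publicly auditable record of a data-usage policy and access events."""
    def __init__(self, controller, policy):
        self.controller = controller
        self.policy = policy            # encoded data-usage policy
        self.subjects = set()           # data subjects bound by this contract
        self.log = []                   # provenance: (subject, action) events

    def record_access(self, subject, action):
        # Every access is appended to an auditable log (provenance tracking).
        assert subject in self.subjects, "subject has not deployed/joined"
        self.log.append((subject, action))

# Model A: each data subject deploys one contract per data controller.
def subject_deploys(subject, controller, policy):
    contract = UsageContract(controller, policy)
    contract.subjects.add(subject)
    return contract

# Model B: the controller deploys one contract; a subject joins it only
# if they accept the stated data-handling conditions.
def subject_joins(contract, subject, accepts_conditions):
    if accepts_conditions:
        contract.subjects.add(subject)
    return subject in contract.subjects
```

Model A offers per-subject granularity at the cost of deploying many contracts, while Model B scales better but coarsens granularity, which mirrors the granularity/scalability trade-off the abstract describes.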
The Design of a System Architecture for Mobile Multimedia Computers
This chapter discusses the system architecture of a portable computer, called Mobile Digital Companion, which supports energy-efficient handling of multimedia applications. Because battery life is limited and battery weight is an important factor for the size and weight of the Mobile Digital Companion, energy management plays a crucial role in the architecture. As the Companion must remain usable in a variety of environments, it has to be flexible and adaptable to various operating conditions. The Mobile Digital Companion has an unconventional architecture that saves energy by using system decomposition at different levels of the architecture and exploits locality of reference with dedicated, optimised modules. The approach is based on dedicated functionality and the extensive use of energy reduction techniques at all levels of system design. The system has an architecture with a general-purpose processor accompanied by a set of heterogeneous autonomous programmable modules, each providing an energy-efficient implementation of dedicated tasks. A reconfigurable internal communication network switch exploits locality of reference and eliminates wasteful data copies.
Novel development of distributed manufacturing monitoring systems to support high cost and complexity manufacturing
In the current manufacturing environment, characterized by diverse change sources (e.g. economic, technological, political, social) and integrated supply chains, success demands agility and close cooperation and coordination between stakeholders. Tools and systems based on software agents, intelligent products and virtual enterprises have been developed to meet these demands, but their use has been limited to trial or academic scenarios because of: (i) a focus on a single application; (ii) a focus on a single product; (iii) separation between the product and its information; or (iv) a focus on a single system characteristic (e.g. hardware, software, architecture, requirements). In this thesis a reusable distributed manufacturing monitoring system for harsh environments is presented, capable of addressing traceability and controllability requirements within stakeholders and across high cost and complexity supply chains. [Continues.]
GAUMLESS: Modelling the Capitalization of Human Action on the Internet
The focus of this thesis is on a field of study related to information design, namely visual modelling, and the application of its concepts and frameworks to a case study on the use of Internet cookies. It represents an opportunity to enhance information design's relevancy as an adaptive discipline; i.e., borrowing and learning from various knowledge domains in representing phenomena for the purposes of decision-making and action-generation.
As a critical design project, the thesis endeavors to inform Internet users and other audiences of the exploitation inherent in the data-mining processes employed by websites for generating cookies and to expose the risks to users. This focus was motivated by a concern that many Internet users are ignorant of, or at best only casually aware of, the implications of giving their consent to the use of cookies. The thesis employs a qualitative research methodology that consolidates information design principles, conventions and processes; a distillation of relevant modelling frameworks; and pan-disciplinary philosophical perspectives (i.e., cybernetics, systems theory, and social system theory) into a visual model that represents the cookie system.
The significance of this study's contribution to design theory lies in the manner in which boundaries to its research methodology (based on the study's purpose, goals and targeted audience) were determined and the singular visual modelling process developed in consideration of the myriad relevant knowledge-domains, extensive data sources and esoteric technical aspects of the system under study. Whereas simplification in a visual model is a key factor for knowledge-creation and establishing usability, its effectiveness to inform and inspire is also measured by its level of accuracy and comprehensiveness.
In concentrating on human behaviour and decision-making contexts and applications, information design has the capacity to help meet personal and social needs and consequently can be a societal force for innovation and progress. The thesis' visual model is an example of this potential in its intention to represent the cookie process and to raise awareness of its personal and social implications. The study validates the responsibility of the information designer to not prescribe actions or solutions but rather to impart knowledge, support decision-making, and inspire critical reflection.
Component-based records: a novel method to record transaction design work
The growing pressures from global competitive markets signal the inevitable challenge for companies to
rapidly design and develop new successful products. To continually improve design quality and efficiency,
companies must consider how to speed design processes, minimise human-errors, avoid unnecessary
iterations, and sustain knowledge embedded in the design process. All of these issues strongly
concern one topic: how to make and exploit records of design activities. Using process modelling ideas,
this paper introduces a new method called component-based records, in place of traditional design
reports. The proposed method records transaction elements of the actual design processes undertaken
in a design episode, which aims to continually improve design quality and efficiency, reduce designers'
workload for routine tasks, and sustain the competitiveness of companies.
On Dynamic Monitoring Methods for Networks-on-Chip
Rapid ongoing evolution of multiprocessors will lead to systems with hundreds of processing cores integrated in a single chip. An emerging challenge is the implementation of reliable and efficient interconnection between these cores as well as other components in the systems. Network-on-Chip is an interconnection approach which is intended to solve the performance bottleneck caused by traditional, poorly scalable communication structures such as buses. However, a large on-chip network involves issues related to congestion problems and system control, for instance. Additionally, faults can cause problems in multiprocessor systems. These faults can be transient faults, permanent manufacturing faults, or they can appear due to aging. To solve emerging traffic management and controllability issues, and to maintain system operation regardless of faults, a monitoring system is needed. The monitoring system should be dynamically applicable to various purposes and it should fully cover the system under observation. In a large multiprocessor the distances between components can be relatively long. Therefore, the system should be designed so that the amount of energy-inefficient long-distance communication is minimized.
This thesis presents a dynamically clustered distributed monitoring structure. The monitoring is distributed so that no centralized control is required for basic tasks such as traffic management and task mapping. To enable extensive analysis of different Network-on-Chip architectures, an in-house SystemC based simulation environment was implemented. It allows transaction level analysis without time consuming circuit level implementations during early design phases of novel architectures and features.
The presented analysis shows that the dynamically clustered monitoring structure can be efficiently utilized for traffic management in faulty and congested Network-on-Chip-based multiprocessor systems. The monitoring structure can also be successfully applied for task mapping purposes. Furthermore, the analysis shows that the presented in-house simulation environment is a flexible and practical tool for extensive Network-on-Chip architecture analysis.
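The idea of dynamically clustered, decentralized monitoring can be sketched in miniature. The following Python toy is an illustrative assumption, not the thesis's actual algorithm or metrics: cores are grouped into clusters, each cluster monitor aggregates local congestion and fault reports on its own, and task mapping picks the least loaded healthy core without any central controller.

```python
# Toy sketch of dynamically clustered NoC monitoring; cluster sizes,
# the congestion metric (buffered flits) and the mapping rule are
# illustrative assumptions, not the thesis's actual design.

class ClusterMonitor:
    """Aggregates congestion and fault state locally, so no centralized
    control is needed for basic tasks like traffic management."""
    def __init__(self, cores):
        self.cores = list(cores)
        self.load = {c: 0 for c in self.cores}   # last congestion sample
        self.faulty = set()

    def report(self, core, buffered_flits):
        self.load[core] = buffered_flits          # local congestion sample

    def mark_faulty(self, core):
        self.faulty.add(core)                     # transient or permanent fault

    def best_core(self):
        # Prefer the least congested, non-faulty core in this cluster.
        healthy = [c for c in self.cores if c not in self.faulty]
        return min(healthy, key=lambda c: self.load[c]) if healthy else None

def map_task(monitors):
    # Distributed task mapping: each cluster proposes its best core and
    # the task goes to the least loaded proposal overall.
    candidates = []
    for m in monitors:
        core = m.best_core()
        if core is not None:
            candidates.append((m.load[core], core))
    return min(candidates)[1] if candidates else None
```

Keeping congestion sampling and the first mapping decision inside each cluster limits the long-distance communication that the abstract identifies as energy-inefficient in large multiprocessors.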
From Big Data To Knowledge – Good Practices From Industry
Recent advancements in data gathering technologies have led to the rise of a large amount of data through which useful insights and ideas can be derived. These data sets are typically too large to process using traditional data processing tools and applications and are thus known in the popular press as "big data". It is essential to extract the hidden meanings in the available data sets by aggregating big data into knowledge, which may then positively contribute to decision making. One way to engage in data-driven strategy is to gather contextually relevant data on specific customers, products, and situations, and determine optimised offerings that are most appealing to the target customers based on sound analytics. Corporations around the world have been increasingly applying analytics, tools and technologies to capture, manage and process such data, and derive value out of the huge volumes of data generated by individuals. The detailed intelligence on consumer behaviour, user patterns and other hidden knowledge that was not possible to derive via traditional means can now be used to facilitate important business processes such as real-time control and demand forecasting. The aim of our research is to understand and analyse the significance and impact of big data in today's industrial environment and identify the good practices that can help us derive useful knowledge out of this wealth of information, based on content analysis of 34 firms that have initiated big data analytical projects. Our descriptive and network analysis shows that the goals of a big data initiative are extensible and highlights the importance of data representation. We also find that the data analytical techniques adopted are heavily dependent on the project goals.