151 research outputs found
Towards trustworthy computing on untrustworthy hardware
Historically, hardware was thought to be inherently secure and trusted due to its
obscurity and the isolated nature of its design and manufacturing. In the last two
decades, however, hardware trust and security have emerged as pressing issues.
Modern day hardware is surrounded by threats manifested mainly in undesired
modifications by untrusted parties in its supply chain, unauthorized and pirated
selling, injected faults, and system and microarchitectural level attacks. These threats,
if realized, are expected to push hardware to abnormal and unexpected behaviour
causing real-life damage and significantly undermining our trust in the electronic and
computing systems we use in our daily lives and in safety critical applications. A
large number of detective and preventive countermeasures have been proposed in
the literature. It is a fact, however, that our knowledge of the potential consequences
of real-life threats to hardware trust is lacking, given the limited number of real-life
reports and the plethora of ways in which hardware trust could be undermined. With
this in mind, run-time monitoring of hardware combined with active mitigation of
attacks, referred to as trustworthy computing on untrustworthy hardware, is proposed
as the last line of defence. This last line of defence allows us to face the issue of live
hardware mistrust rather than turning a blind eye to it or being helpless once it occurs.
This thesis proposes three different frameworks towards trustworthy computing
on untrustworthy hardware. The presented frameworks are adaptable to different
applications, independent of the design of the monitored elements, based on
autonomous security elements, and computationally lightweight. The first
framework is concerned with explicit violations and breaches of trust at run-time,
with an untrustworthy on-chip communication interconnect presented as a potential
offender. The framework is based on the guiding principles of component guarding,
data tagging, and event verification. The second framework targets hardware elements
with inherently variable and unpredictable operational latency and proposes a
machine-learning based characterization of these latencies to infer undesired latency
extensions or denial of service attacks. The framework is implemented on a DDR3
DRAM after showing its vulnerability to obscured latency extension attacks. The
third framework studies the possibility of the deployment of untrustworthy hardware
elements in the analog front end, and the consequent integrity issues that might arise
at the analog-digital boundary of system on chips. The framework uses machine
learning methods and the unique temporal and arithmetic features of signals at this
boundary to monitor their integrity and assess their trust level.
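A minimal sketch of the run-time latency monitoring idea in the second framework, assuming hypothetically that a simple statistical profile stands in for the machine-learning characterisation of operational latencies (the thesis targets DDR3 DRAM; the numbers below are illustrative):

```python
import statistics

def fit_latency_model(baseline_latencies):
    """Learn a simple statistical profile of normal operation latencies."""
    mean = statistics.mean(baseline_latencies)
    stdev = statistics.stdev(baseline_latencies)
    return mean, stdev

def is_latency_extension(latency, model, z_threshold=3.0):
    """Flag a request whose latency deviates too far from the learned profile,
    a possible obscured latency-extension or denial-of-service attempt."""
    mean, stdev = model
    return abs(latency - mean) / stdev > z_threshold

# Baseline memory-access latencies in nanoseconds (illustrative values).
baseline = [13.7, 13.9, 14.1, 13.8, 14.0, 13.6, 14.2, 13.9]
model = fit_latency_model(baseline)

print(is_latency_extension(14.0, model))  # normal access
print(is_latency_extension(95.0, model))  # suspicious extension
```

A real deployment would replace the z-score test with the learned model of inherently variable latencies the framework describes.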
Towards a human-centric data economy
Spurred by widespread adoption of artificial intelligence and machine learning, "data" is becoming
a key production factor, comparable in importance to capital, land, or labour in an increasingly
digital economy. In spite of an ever-growing demand for third-party data in the B2B
market, firms are generally reluctant to share their information. This is due to the unique characteristics
of "data" as an economic good (a freely replicable, non-depletable asset holding a highly
combinatorial and context-specific value), which moves digital companies to hoard and protect
their "valuable" data assets, and to integrate across the whole value chain seeking to monopolise
the provision of innovative services built upon them. As a result, most of those valuable assets
still remain unexploited in corporate silos nowadays.
This situation is shaping the so-called data economy around a number of champions, and it is
hampering the benefits of a global data exchange on a large scale. Some analysts have estimated
the potential value of the data economy at US$2.5 trillion globally by 2025. Not surprisingly, unlocking
the value of data has become a central policy of the European Union, which also estimated
the size of the data economy at €827 billion for the EU27 in the same period. Within the scope of
the European Data Strategy, the European Commission is also steering initiatives aimed
at identifying relevant cross-industry use cases involving different verticals, and at enabling sovereign
data exchanges to realise them.
Among individuals, the massive collection and exploitation of personal data by digital firms
in exchange for services, often with little or no consent, has raised a general concern about privacy
and data protection. Apart from spurring recent legislative developments in this direction,
this concern has prompted warnings about the unsustainability of the existing digital
economy (few digital champions, potential negative impact on employment, growing inequality),
some of which propose that people are paid for their data in a sort of worldwide data labour
market as a potential solution to this dilemma [114, 115, 155].
From a technical perspective, we are far from having the required technology and algorithms
that will enable such a human-centric data economy. Even its scope is still blurry, and the question
of the value of data is, to say the least, controversial. Research works from different disciplines have
studied the data value chain, different approaches to the value of data, how to price data assets,
and novel data marketplace designs. At the same time, complex legal and ethical issues with
respect to the data economy have arisen around privacy, data protection, and ethical AI practices.
In this dissertation, we start by exploring the data value chain and how entities trade data assets
over the Internet. We carry out what is, to the best of our understanding, the most thorough survey
of commercial data marketplaces. In this work, we have catalogued and characterised ten different
business models, including those of personal information management systems, companies born
in the wake of recent data protection regulations and aiming at empowering end users to take
control of their data. We have also identified the challenges faced by different types of entities,
and what kind of solutions and technology they are using to provide their services.
Then we present a first-of-its-kind measurement study that sheds light on the prices of data
in the market using a novel methodology. We study how ten commercial data marketplaces categorise
and classify data assets, and which categories of data command higher prices. We also
develop classifiers for comparing data products across different marketplaces, and we study the
characteristics of the most valuable data assets and the features that specific vendors use to set
the price of their data products. Based on this information and adding data products offered by
33 other data providers, we develop a regression analysis to reveal features that correlate with
prices of data products. As a result, we also implement the basic building blocks of a novel data
pricing tool capable of providing a hint of the market price of a new data product using as inputs
just its metadata. This tool would provide more transparency on the prices of data products in
the market, which will help in pricing data assets and in avoiding the inherent price fluctuation of
nascent markets.
Next we turn to topics related to data marketplace design. Particularly, we study how buyers
can select and purchase suitable data for their tasks without requiring a priori access to such
data in order to make a purchase decision, and how marketplaces can distribute payoffs for a
data transaction combining data of different sources among the corresponding providers, be they
individuals or firms. The difficulty of both problems is further exacerbated in a human-centric
data economy where buyers have to choose among data of thousands of individuals, and where
marketplaces have to distribute payoffs to thousands of people contributing personal data to a
specific transaction.
Regarding the selection process, we compare different purchase strategies depending on the
level of information available to data buyers at the time of making decisions. A first methodological
contribution of our work is proposing a data evaluation stage prior to datasets being selected
and purchased by buyers in a marketplace. We show that buyers can significantly improve the
performance of the purchasing process just by being provided with a measurement of the performance
of their models when trained by the marketplace with individual eligible datasets. We
design purchase strategies that exploit such functionality, calling the resulting algorithm Try
Before You Buy. Our work demonstrates over synthetic and real datasets that it can lead to
near-optimal data purchasing in only O(N) execution time, instead of the exponential O(2^N)
time needed to calculate the optimal purchase.
Regarding the payoff distribution problem, we focus on computing the relative value
of spatio-temporal datasets combined in marketplaces for predicting transportation demand and
travel time in metropolitan areas. Using large datasets of taxi rides from Chicago, Porto and
New York we show that the value of data is different for each individual, and cannot be approximated
by its volume. Our results reveal that even approaches based on the
"leave-one-out" value are inaccurate. Instead, more sophisticated and acknowledged notions of value
from economics and game theory, such as the Shapley value, need to be employed if one wishes
to capture the complex effects of mixing different datasets on the accuracy of forecasting algorithms.
However, the Shapley value entails serious computational challenges. Its exact calculation
requires repetitively training and evaluating every combination of data sources and hence O(N!)
or O(2^N) computational time, which is infeasible for complex models or thousands of individuals.
Moreover, our work paves the way to new methods of measuring the value of spatio-temporal
data. We identify heuristics such as entropy or similarity to the average that show a significant
correlation with the Shapley value and therefore can be used to overcome the significant computational
challenges posed by Shapley approximation algorithms in this specific context.
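The exponential cost is easy to see in code. Below is a toy sketch of the exact Shapley computation over three hypothetical datasets, with a made-up accuracy table standing in for the trained forecasting models (datasets "a" and "b" are deliberately redundant):

```python
from itertools import combinations
from math import factorial

def shapley_values(players, utility):
    """Exact Shapley value: weighted marginal contribution of each player over
    all coalitions, requiring O(2^N) utility evaluations."""
    n = len(players)
    values = {}
    for p in players:
        others = [q for q in players if q != p]
        total = 0.0
        for k in range(n):
            for coalition in combinations(others, k):
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                total += weight * (utility(set(coalition) | {p}) - utility(set(coalition)))
        values[p] = total
    return values

# Toy utility: forecasting accuracy achieved with a given set of datasets.
accuracy = {
    frozenset(): 0.0,
    frozenset("a"): 0.6, frozenset("b"): 0.6, frozenset("c"): 0.3,
    frozenset("ab"): 0.6, frozenset("ac"): 0.8, frozenset("bc"): 0.8,
    frozenset("abc"): 0.9,
}
vals = shapley_values(["a", "b", "c"], lambda s: accuracy[frozenset(s)])
print(vals)  # values sum to the accuracy of the full coalition (efficiency)
```

Note that "c" has the lowest standalone accuracy yet a Shapley value close to the others, illustrating why value cannot be read off volume or leave-one-out contributions alone.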
We conclude with a number of open issues and propose further research directions that leverage
the contributions and findings of this dissertation. These include monitoring data transactions
to better measure data markets, and complementing market data with actual transaction prices
to build a more accurate data pricing tool. A human-centric data economy would also require
that the contributions of thousands of individuals to machine learning tasks are calculated daily.
For that to be feasible, we need to further optimise the efficiency of data purchasing and payoff
calculation processes in data marketplaces. In that direction, we also point to some alternatives
to repetitively training and evaluating a model when selecting data with Try Before You Buy and
when approximating the Shapley value. Finally, we discuss the challenges and potential technologies that
help with building a federation of standardised data marketplaces.
The data economy will develop fast in the upcoming years, and researchers from different
disciplines will work together to unlock the value of data and make the most out of it. Maybe
the proposal of getting paid for our data and our contribution to the data economy finally takes off,
or maybe other proposals, such as the robot tax, are instead used to balance the power
between individuals and tech firms in the digital economy. Still, we hope our work sheds light on
the value of data, and contributes to making the price of data more transparent and, eventually, to
moving towards a human-centric data economy.
This work has been supported by IMDEA Networks Institute. Doctoral Programme in Telematic Engineering, Universidad Carlos III de Madrid. Thesis committee: Georgios Smaragdakis (chair), Ángel Cuevas Rumín (secretary), Pablo Rodríguez Rodríguez (examiner).
BDTS: Blockchain-based Data Trading System
Trading data through blockchain platforms makes it hard to achieve fair
exchange, for two reasons. First, guaranteeing fairness between sellers and
consumers is challenging because deception by any participating party is
risk-free. This leads to the second issue: judging the behavior of data
executors (such as cloud service providers) among distrustful parties is
impractical in the context of traditional trading protocols. To fill these
gaps, in this paper we present a blockchain-based data trading system, named
BDTS. BDTS implements a fair-exchange protocol in which benign behavior is
rewarded while dishonest behavior is punished. Our scheme requires the seller
to provide consumers with the correct encryption keys for proper execution and
encourages a rational data executor to behave faithfully to maximize its
rewards. We analyze the strategies of consumers, sellers, and dealers in the
trading game and show that each party maximizes its own interest by behaving
honestly, so the game reaches a Nash equilibrium. Evaluations prove efficiency
and practicability.
Comment: ICICS 2023 (Best Paper Award)
FlexiChain 2.0: NodeChain Assisting Integrated Decentralized Vault for Effective Data Authentication and Device Integrity in Complex Cyber-Physical Systems
Distributed Ledger Technology (DLT) has been introduced using the most common
consensus algorithms, either for electronic cash systems or for decentralized
programmable-asset platforms providing general services. Most established,
reliable networks are unsuitable for applications such as smart city
applications and, in particular, Internet of Things (IoT) and Cyber-Physical
Systems (CPS) applications. The purpose of this paper is to provide a DLT
suitable for IoT and CPS that satisfies their requirements. The proposed work
has been designed based on the requirements of Cyber-Physical Systems.
FlexiChain is proposed as a layer-zero network that can be formed from
independent blockchains. NodeChain is also introduced as a distributed
unique-identifier (UID) aggregation vault that secures all nodes' UIDs.
NodeChain mainly serves FlexiChain for all node security requirements,
targeting the security and integrity of each node. The linked UIDs also create
a chain of narration that keeps track not merely of assets but also of who
authenticated them. The security results show higher resistance against four
types of attacks. Furthermore, the strength of the network is present from its
early stages, compared with blockchain and central-authority baselines.
FlexiChain is introduced as a layer-zero network for all CPS decentralized
applications, taking their requirements into account. FlexiChain relies on
lightweight processing mechanisms and introduces further methods to increase
security.
Plastic circular economy in the EU: Material Flow Analysis and Transition Analysis
Plastic is valued for its versatility, but concerns have been raised over the environmental impacts of plastic waste. A more in-depth investigation of the plastic system is still needed to understand current flows and factors to close the plastic cycle.
This research applied a material flow analysis (MFA) and transition analysis (TA), using multilevel perspectives, to the plastic circular economy transition in the EU. The MFA covers over 400 categories of plastic-containing products with a detailed analysis of the final destination of waste. The TA identifies the interaction of barriers and drivers to use secondary plastics, with a focus on the regime level along the plastic value chain.
The MFA results indicate the EU produced over 66 million tonnes (Mt) of plastic polymers/fibres, with an estimated consumption of plastic products of 73 Mt in 2016. Plastic waste generated amounted to over 37 Mt, and a significant amount of plastic waste was not recovered back into plastics in the EU. The uncertainty analysis of the MFA highlights important data quality issues that need to be addressed.
To understand why using secondary plastics presents challenges, the TA mapped the factors across policies and standards, markets and business models, technology, and consumer preferences and behaviours that create a web of constraints and a web of drivers. TA results highlight that data-information-knowledge is the key gap as most of the aspects are cross-cutting. Different actors are involved in new business networks and play multiple roles in driving the co-evolutionary dynamic.
The thesis concludes that significant data gaps need MFA-based knowledge to inform policies that address the barriers and the potential socio-technical changes that can reshape plastic flows. The cases playing out across the whole value chain and four different application areas provide insights that are potentially more widely applicable to circular economy transition processes in Europe.
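At its core, a material flow analysis rests on mass balances at each node of the system. A toy sketch with illustrative flows loosely echoing the EU 2016 figures above; the closing terms (net imports, recycled tonnage) are invented for demonstration, not the thesis data:

```python
# Illustrative plastic mass balance in Mt/year; numbers are for demonstration.
flows = {
    "polymer_production": 66.0,
    "net_imports_in_products": 7.0,    # hypothetical closing term
    "consumption": 73.0,
    "waste_generated": 37.0,
    "recycled_back_to_plastics": 9.0,  # hypothetical
}

def check_balance(inputs, outputs, tolerance=0.5):
    """An MFA requires the inputs and outputs of each process to balance."""
    return abs(sum(inputs) - sum(outputs)) <= tolerance

# Consumption node: production plus net imports should match consumption.
balanced = check_balance(
    [flows["polymer_production"], flows["net_imports_in_products"]],
    [flows["consumption"]],
)
recovery_rate = flows["recycled_back_to_plastics"] / flows["waste_generated"]
print(balanced, f"recovery rate: {recovery_rate:.0%}")
```

Propagating the tolerance term through every node is where the uncertainty analysis mentioned above comes in.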
Autonomy, Efficiency, Privacy and Traceability in Blockchain-enabled IoT Data Marketplace
Personal data generated from IoT devices is a new economic asset that individuals can trade to generate revenue on the emerging data marketplaces. Blockchain technology can disrupt the data marketplace and make trading more democratic, trustworthy, transparent and secure. Nevertheless, the adoption of blockchain to create an IoT data marketplace requires consideration of autonomy and efficiency, privacy, and traceability.
Conventional centralized approaches are built around a trusted third party that conducts and controls all management operations such as managing contracts, pricing, billing, reputation mechanisms, etc., raising the concern that providers lose control over their data. To tackle this issue, an efficient, autonomous and fully-functional marketplace system is needed, with no trusted third party involved in operational tasks. Moreover, an inefficient allocation of buyers' demands on battery-operated IoT devices poses a challenge for providers to serve multiple buyers' demands simultaneously in real-time without disrupting their SLAs (service level agreements). Furthermore, a poor privacy decision to make personal data accessible to unknown or arbitrary buyers may have adverse consequences and privacy violations for providers. Lastly, a buyer could buy data from one marketplace and, without the knowledge of the provider, resell the bought data to users registered in other marketplaces. This may lead to either monetary loss or privacy violation for the provider. To address such issues, a data ownership traceability mechanism is essential that can track the change in ownership of data due to its trading within and across marketplace systems. However, data ownership traceability is hard because of ownership ambiguity, undisclosed reselling, and dispersal of ownership across multiple marketplaces.
This thesis makes the following novel contributions. First, we propose an autonomous and efficient IoT data marketplace, MartChain, offering key mechanisms for a marketplace leveraging smart contracts to record agreement details, participant ratings, and data prices in blockchain without involving any mediator. Second, MartChain is underpinned by an Energy-aware Demand Selection and Allocation (EDSA) mechanism for optimally selecting and allocating buyers' demands on the provider's IoT devices while satisfying the battery, quality and allocation constraints. EDSA maximizes the revenue of the provider while meeting the buyers' requirements and ensuring the completion of the selected demands without any interruptions. The proof-of-concept implementation on the Ethereum blockchain shows that our approach is viable and benefits the provider and buyer by creating an autonomous and efficient real-time data trading model.
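The demand-selection step of EDSA can be pictured as a constrained optimisation. A minimal knapsack-style sketch, assuming hypothetically that each demand carries a single revenue and battery cost; the thesis's EDSA also handles the quality and allocation constraints omitted here:

```python
def select_demands(demands, battery_budget):
    """0/1 knapsack sketch: pick buyer demands maximising revenue subject to a
    device battery budget. Each demand is a (revenue, battery_cost) pair."""
    # dp[b] = (best_revenue, chosen_indices) using at most b battery units
    dp = [(0, [])] * (battery_budget + 1)
    for i, (revenue, cost) in enumerate(demands):
        new_dp = dp[:]
        for b in range(cost, battery_budget + 1):
            candidate = (dp[b - cost][0] + revenue, dp[b - cost][1] + [i])
            if candidate[0] > new_dp[b][0]:
                new_dp[b] = candidate
        dp = new_dp  # each demand is allocated at most once
    return dp[battery_budget]

# Hypothetical demands as (revenue in tokens, battery cost in units).
demands = [(10, 4), (7, 3), (12, 6), (4, 2)]
revenue, chosen = select_demands(demands, battery_budget=9)
print(revenue, chosen)
```

With a budget of 9 units, the sketch selects demands 0, 1 and 3 for a revenue of 21, rejecting the high-revenue but battery-hungry demand 2.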
Next, we propose KYBChain, a Know-Your-Buyer in the privacy-aware decentralized IoT data marketplace that performs a multi-faceted assessment of various characteristics of buyers and evaluates their privacy rating. Privacy rating empowers providers to make privacy-aware informed decisions about data sharing. Quantitative analysis to evaluate the utility of privacy rating demonstrates that the use of privacy rating by the providers results in a decrease of data leakage risk and generated revenue, correlating with the classical risk-utility trade-off. Evaluation results of KYBChain on Ethereum reveal that the overheads in terms of gas consumption, throughput and latency introduced by our privacy rating mechanism compared to a marketplace that does not incorporate a privacy rating system are insignificant relative to its privacy gains.
Finally, we propose TrailChain, which generates a trusted trade trail for tracking data ownership spanning multiple decentralized marketplaces. Our solution includes mechanisms for detecting any unauthorized data reselling to prevent privacy violations and a fair resell payment sharing scheme to distribute payment among data owners for authorized reselling. We performed qualitative and quantitative evaluations to demonstrate the effectiveness of TrailChain in tracking data ownership using four private Ethereum networks. Qualitative security analysis demonstrates that TrailChain is resilient against several malicious activities and security attacks. Simulations show that our method detects undisclosed reselling within the same marketplace and across different marketplaces. Besides, it also identifies whether the provider has authorized the reselling and fairly distributes the revenue among the data owners at marginal overhead.
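The kind of trade trail TrailChain maintains can be sketched as a hash-linked list of ownership records. The field names and the single-process setting below are illustrative assumptions, not the thesis protocol, which runs across multiple Ethereum networks:

```python
import hashlib

def record_trade(trail, asset_id, seller, buyer, authorised):
    """Append a hash-linked trade record, so that the ownership history of an
    asset can be audited end to end."""
    prev_hash = trail[-1]["hash"] if trail else "0" * 64
    record = {"asset": asset_id, "seller": seller, "buyer": buyer,
              "authorised": authorised, "prev": prev_hash}
    payload = f"{asset_id}|{seller}|{buyer}|{authorised}|{prev_hash}"
    record["hash"] = hashlib.sha256(payload.encode()).hexdigest()
    trail.append(record)
    return record

def undisclosed_resales(trail):
    """Flag resales where the seller is not the current owner on the trail,
    or where the resale was never authorised."""
    owner, flagged = {}, []
    for r in trail:
        if r["asset"] in owner and (owner[r["asset"]] != r["seller"] or not r["authorised"]):
            flagged.append(r)
        owner[r["asset"]] = r["buyer"]
    return flagged

trail = []
record_trade(trail, "dataset-1", "alice", "bob", authorised=True)   # first sale
record_trade(trail, "dataset-1", "bob", "carol", authorised=True)   # authorised resale
record_trade(trail, "dataset-1", "bob", "dave", authorised=False)   # undisclosed resale
print(len(undisclosed_resales(trail)))
```

The hash chain makes retroactive tampering with earlier records detectable; in TrailChain this role is played by the blockchains themselves.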
The interpretation of copyright protection in video game streaming in Europe
This thesis was submitted for the award of Doctor of Philosophy and was awarded by Brunel University London.
Video games play an important role in the economic and cultural landscape in Europe and have been the basis for user-generated content of all kinds. Online video gaming in particular has become very popular worldwide. One of the reasons for the ever-increasing popularity of online video games is that they are available for live game streaming. "Let's Play" (LP) is a term originated by the gaming community to refer to videos of someone playing a video game, with audio commentary of the gameplay, edited to entertain the audience. LP videos are "episodic accounts of a player's journey", are very entertaining in nature, and can be broadcast as pre-recorded videos on video-sharing platforms as well as live streamed.
There are three types of LP videos: reviews, playthrough videos with commentary, and playthrough videos without commentary. The first category constitutes reviews of video games. In the second category a viewer can watch the entire or part of the video game being played, while the gamer gives his/her commentary on their experience. In the third category, viewers can watch videos of the entire game being played, with no commentary of the gamer.
There is a debate about whether streaming video games online constitutes an act of communication to the public and, as such, an online copyright infringement. Article 3 of Directive 2001/29/EC provides that Member States shall provide authors with the exclusive right to authorise or prohibit any communication to the public of their works, by wire or wireless means, including the making available to the public of their works in such a way that members of the public may access them from a place and at a time individually chosen by them. Given that gamers communicate to the public the whole or part of a video game without the authorisation of the rightholder, this constitutes an unauthorised act of communication to the public. However, economic and strategic reasons have led video game developers to tolerate streaming activity, leaving streamers and the platforms that host streaming videos uncertain about the lawfulness of their activities. While review LP videos fall under the exceptions and limitations to the communication to the public right, for the purposes of criticism or review, playthrough videos with and without commentary do not.
The thesis interprets the communication to the public right in video game streaming, and explores whether hosting service providers (platforms) can effectively take down infringing content and whether Internet Service Providers (ISPs) can effectively block access to it. Through doctrinal and comparative analysis, the thesis brings to the surface the limitations of current online copyright enforcement methods and proposes ways to overcome those obstacles. In an effort to strike a fair balance between rightholders' rights, the right to conduct a business, and the freedom of expression, the thesis argues that for LP videos and live streams to continue to exist, without the risk of being taken down after a request by the rightholders, a licence agreement is an alternative and feasible solution. In light of the DSM Directive 2019/790, streaming platforms such as YouTube and Twitch.tv perform an act of communication to the public, or an act of making available to the public, when they give the public access to copyright-protected works or other protected subject matter uploaded by their users. Platforms shall be liable for an unauthorised act of communication to the public unless they obtain authorisation from the rightholder, by concluding a licence agreement, or demonstrate that they have made their best efforts to obtain authorisation. The DSM Directive requires a licence agreement between rightholders and service providers (platforms). It is proposed that the licence agreement, which would allow the streaming of video game content, should be restricted to certain types of video games. Meanwhile, the thesis explores the potential of blockchain technology to facilitate the licence agreement: its capacity to process huge amounts of data, to issue digital certificates, and to track the use of non-licensable works would benefit rightholders, intermediaries, and users.
Digital Copyright Protection: Focus on Some Relevant Solutions
Copyright protection of digital content is considered a relevant problem of the current Internet, since content digitalization and high-performance interconnection networks have greatly increased the possibilities to reproduce and distribute digital content. Digital Rights Management (DRM) systems try to prevent the inappropriate or illegal use of copyrighted digital content. They are promoted by the major global media players, but they are also perceived as proprietary solutions that give rise to classic problems of privacy and fair use. On the other hand, watermarking protocols have become a possible solution to the problem of copyright protection. They have evolved during the last decade, and interesting proposals have been designed. This paper first presents current trends concerning the most significant solutions to the problem of copyright protection based on DRM systems and then focuses on the most promising approaches in the field of watermarking protocols. In this regard, the examined protocols are discussed in order to identify which of them best represent the right trade-off between opposing goals, such as security and ease of use, so as to show that it is possible to implement open solutions compatible with the current web context without resorting to proprietary architectures or impairing the protection of copyrighted digital content.
Data trading based on seller preferences within blockchain smart contract
This thesis was submitted for the award of Master of Philosophy and was awarded by Brunel University London.
Existing online data trading has not addressed controlling data sales
according to data seller preferences (DSP) using blockchain technology. This
research aims to explore DSP using a smart contract over blockchain
within the domain of online data trading. Data trading has been carried out
for several decades, but cutting-edge technologies and cloud services have
grown dramatically worldwide. Industries are gaining benefits from
accessing the data that enabled them to perform mission-critical tasks by
performing data analysis on the massively available data and getting a
higher return on investment (ROI).
This research aims to make online data trading possible only if the buyer
can satisfy the conditions predefined by the seller. For example, DSP can
restrict the data purchase if the participating buyer is doing business from
a specific geographic location, or it can further restrict a particular type and
size of business. Data trading is thus controlled by smart-contract
validation based on DSP; this novel DSP artefact has been built and
evaluated on Ganache, a personal blockchain configured for automatic
mining. Although the DSP Dapp artefact has been explored with a
limited scope of seller preferences and data volume, future
researchers may evolve the framework to support more complex seller
preferences such as ethical selling (e.g., green credentials).
The smart contract serves as an automated agreement, conditioned on DSP, between seller and buyer, without the involvement of any broker or third
party.
The first chapter's introduction sets up the context for chapter two,
which reviews the literature, presents the research question, and sets
the aims and objectives. Chapter three selects the DSR methodology for
this research and analyses the requirements that form the building
blocks for chapters four and five. Chapters four and five fulfil
objective two by designing and developing the DSP artefact using a
smart contract to control data trading. Chapter six validates the DSP
trading system to confirm the novelty of this research, and finally
chapter seven summarises the contribution and future research.
The research proposes a new approach to online data trading that controls
data selling according to DSP within a smart contract over blockchain
and opens new doors for researchers for future work in this area.
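The validation logic that gates a sale can be sketched outside the blockchain. In the thesis it lives in a Solidity smart contract evaluated on Ganache; the preference fields below are hypothetical stand-ins for the seller-defined conditions:

```python
def dsp_allows_purchase(preferences, buyer):
    """Return True only when the buyer satisfies every seller-defined
    predicate, mirroring the smart-contract validation that gates a sale."""
    if buyer["location"] in preferences.get("blocked_locations", []):
        return False
    if buyer["business_type"] in preferences.get("blocked_business_types", []):
        return False
    if buyer["company_size"] < preferences.get("min_company_size", 0):
        return False
    return True

# Hypothetical seller preferences: block a location and a business type,
# and require a minimum company size.
preferences = {
    "blocked_locations": ["CountryX"],
    "blocked_business_types": ["advertising"],
    "min_company_size": 50,
}
ok_buyer = {"location": "CountryY", "business_type": "logistics", "company_size": 200}
blocked_buyer = {"location": "CountryX", "business_type": "logistics", "company_size": 200}
print(dsp_allows_purchase(preferences, ok_buyer), dsp_allows_purchase(preferences, blocked_buyer))
```

On-chain, the same predicates would run inside the contract's purchase function, so the sale reverts automatically when any condition fails.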
Vbswp-CeaH: Vigorous Buyer-Seller Watermarking Protocol without Trusted Certificate Authority for Copyright Protection in Cloud Environment through Additive Homomorphism
Cloud-based storage ensures the secure dissemination of media. Authentication and integrity are important aspects of the distribution of digital media. Encryption-based techniques protect media exchanged between the communicating parties involved in a transaction; the challenge is how to restrict digital media that is illegally redistributed by authorized users. Digital watermarking and encryption-based methods alone, however, are not sufficient to provide copyright protection. A watermarking protocol is used to protect the intellectual property of both the customer and the service provider. This paper presents a vigorous buyer-seller watermarking protocol without a trusted certificate authority for copyright protection in the cloud environment. The work uses the cloud as an infrastructure-as-a-service provider for storing credentials such as public and private keys and the digital certificates of the interacting parties. The scheme uses additive homomorphic encryption with an effective key-exchange algorithm for exchanging digital media. The proposed approach addresses the problems of anonymity and copy deterrence and protects the digital rights of buyer and seller, which are among the most pressing issues in information security. Furthermore, the experimental results show that the proposed protocol is flexible and secure even over a non-secure communication channel. We use performance measures such as PSNR, NCC and time cost to check the integrity of the proposed protocol. The conducted experiments show strong robustness and high imperceptibility of the watermark and watermarked images.
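The additive homomorphism such protocols rely on can be demonstrated with a toy Paillier cryptosystem: multiplying two ciphertexts yields an encryption of the sum of the plaintexts, which is what lets a seller combine an encrypted watermark with content without ever seeing the watermark in the clear. The key size below is deliberately tiny and insecure, and the embedding step is a stand-in, not the paper's actual procedure:

```python
import math
import random

# Toy Paillier parameters (insecure, demonstration only).
p, q = 1009, 1013
n = p * q
n2 = n * n
lam = math.lcm(p - 1, q - 1)
g = n + 1
mu = pow(lam, -1, n)  # valid choice of mu because g = n + 1

def encrypt(m):
    """Paillier encryption: c = g^m * r^n mod n^2 with random r coprime to n."""
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    """Paillier decryption: m = L(c^lam mod n^2) * mu mod n, L(x) = (x-1)/n."""
    return ((pow(c, lam, n2) - 1) // n * mu) % n

watermark, content = 42, 1000
# Multiplying ciphertexts adds the plaintexts under the encryption.
combined = (encrypt(content) * encrypt(watermark)) % n2
print(decrypt(combined))  # 1042
```

A real protocol uses cryptographically sized keys and embeds the watermark coefficient-wise into the media, but the homomorphic identity E(m1) * E(m2) = E(m1 + m2) is the same.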
- …