Reputational Privacy and the Internet: A Matter for Law?
Reputation - we all have one. We do not completely comprehend its workings and are mostly unaware of its import until it is gone. When we lose it, our traditional laws of defamation, privacy, and breach of confidence rarely deliver the vindication and respite we seek due, primarily, to legal systems that cobble new media methods of personal injury onto pre-Internet laws. This dissertation conducts an exploratory study of the relevance of law to loss of individual reputation perpetuated on the Internet. It deals with three interrelated concepts: reputation, privacy, and memory. They are related in that the increasing lack of privacy involved in our online activities has had particularly powerful reputational effects, heightened by the Internet’s duplicative memory. The study is framed within three research questions: 1) how well do existing legal mechanisms address loss of reputation and informational privacy in the new media environment; 2) can new legal or extra-legal solutions fill any gaps; and 3) how is the role of law pertaining to reputation affected by the human-computer interoperability emerging as the Internet of Things? Through a review of international and domestic legislation, case law, and policy initiatives, this dissertation explores the extent of control held by the individual over her reputational privacy. Two emerging regulatory models are studied for improvements they offer over current legal responses: the European Union’s General Data Protection Regulation, and American Do Not Track policies. Underscoring this inquiry are the challenges posed by the Internet’s unique architecture and the fact that the trove of references to reputation in international treaties is not making its way into domestic jurisprudence or daily life. 
This dissertation examines whether online communications might be developing a new form of digital speech requiring new legal responses and new gradients of personal harm; it also proposes extra-legal solutions to the paradox that our reputational needs demand an overt sociality while our desire for privacy has us shunning the limelight. As we embark on the Web 3.0 era of human-machine interoperability and the Internet of Things, our expectations of the role of law become increasingly important.
Eurolanguages-2010: Innovations and Development
A collection of student research papers based on the outcomes of the 8th International Conference "Eurolanguages-2010"
Cybersecurity and the Digital Health: An Investigation on the State of the Art and the Position of the Actors
Cybercrime is exposing the health domain to growing risk. The push to connect citizens closely to health services through digitalization has undisputed advantages. Digital health enables remote care, highly automated medical devices with substantial mechatronic and IT content, and large-scale interconnection of hospital networks with an increasingly effective exchange of data. However, all this requires a great cybersecurity commitment, a commitment that must start with scholars in research and then reach the stakeholders. New devices and technological solutions are increasingly breaking into healthcare and are able to change the processes of interaction in the health domain. This requires cybersecurity to become a vital part of patient safety, through changes in human behaviour, technology, and processes, as part of a complete solution. All professionals involved in cybersecurity in the health domain were invited to contribute with their experiences. This book contains contributions from various experts in different fields. It addresses aspects of cybersecurity in healthcare relating to technological advances and emerging risks, as well as the new boundaries of the field and the impact of COVID-19 on sectors such as mHealth. We dedicate the book to all those, in their different roles, involved in cybersecurity in the health domain.
Designing the replication layer of a general-purpose datacenter key-value store
Online services and cloud applications such as graph applications, messaging systems, coordination services, HPC applications, social networks and deep learning rely on key-value stores (KVSes) in order to reliably store and quickly retrieve data. KVSes are NoSQL databases with a read/write/read-modify-write API. KVSes replicate their dataset in a few servers, such that the KVS can continue operating in the presence of faults (availability). To allow programmers to reason about replication, KVSes specify a set of rules (consistency), which are enforced through the use of replication protocols. These rules must be intuitive to facilitate programmer productivity (programmability). A general-purpose KVS must maximize the number of operations executed per unit of time within a predetermined latency (performance) without compromising on consistency, availability or programmability. However, all three of these guarantees are at odds with performance. In this thesis, we explore the design of the replication layer of a general-purpose KVS, which is responsible for navigating this trade-off by specifying and enforcing the consistency and availability guarantees of the KVS.
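The read/write/read-modify-write API described above can be sketched as a toy in-memory store that applies every write to a handful of replicas. This is an illustration of the interface only, not the thesis's actual protocols; all names (`ReplicatedKVS`, `read_modify_write`) are ours, and a real KVS would replicate over the network and handle failures.

```python
# Toy sketch of a replicated key-value store's API: read, write, and
# read-modify-write. Each replica is a plain dict standing in for one
# server's memory; writes are applied to all replicas so the store can
# keep operating if one is lost. Illustrative only.
class ReplicatedKVS:
    def __init__(self, num_replicas=3):
        self.replicas = [dict() for _ in range(num_replicas)]

    def write(self, key, value):
        # A write is propagated to every replica; a real replication
        # protocol would do this with messages and tolerate faults.
        for r in self.replicas:
            r[key] = value

    def read(self, key):
        # Once writes are fully replicated, any replica can serve reads.
        return self.replicas[0].get(key)

    def read_modify_write(self, key, fn):
        # Read the current value, apply fn, and write the result back.
        new = fn(self.read(key))
        self.write(key, new)
        return new

kvs = ReplicatedKVS()
kvs.write("counter", 0)
kvs.read_modify_write("counter", lambda v: v + 1)
print(kvs.read("counter"))  # 1
```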
We start the exploration by observing that modern, server-grade hardware, with manycore servers and RDMA-capable networks, challenges conventional wisdom in protocol design. To investigate the impact of these advances on protocols and their design, we first create an informal taxonomy of strongly-consistent replication protocols. We focus on strong consistency semantics because they are necessary for a general-purpose KVS and they are at odds with performance. Based on this taxonomy we carefully select 10 protocols for analysis. Secondly, we present Odyssey, a framework tailored towards protocol implementation for multi-threaded, RDMA-enabled, in-memory, replicated KVSes. Using Odyssey, we characterize the design space of strongly-consistent replication protocols by building, evaluating and comparing the 10 protocols.
Our evaluation demonstrates that some of the protocols that were efficient on yesterday's hardware are not so today, because they cannot take advantage of the abundant parallelism and fast networking present in modern hardware. Conversely, some protocols that were inefficient on yesterday's hardware are very attractive today. We distil our findings into a concise set of general guidelines and recommendations for protocol selection and design in the era of modern hardware.
The second step of our exploration focuses on the tension between consistency and performance. The problem is that expensive strongly-consistent primitives are necessary to achieve synchronization, but in typical applications only a small fraction of accesses is actually used for synchronization. To navigate this trade-off, we advocate the adoption of Release Consistency (RC) for KVSes. We argue that RC's one-sided barriers are ideal for capturing the ordering relationship between synchronization and non-synchronization accesses while enabling high performance.
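The distinction between cheap plain accesses and expensive synchronization accesses can be illustrated with a toy single-node sketch: only the release/acquire operations pay for a strong primitive (a lock stands in for it here), while ordinary reads and writes stay on the cheap path. This is our own illustration of the RC idea, not Kite's mechanism; all names are hypothetical.

```python
import threading

# Toy illustration of Release Consistency for a store: plain accesses
# are cheap and unordered, while a release write publishes all prior
# plain writes and an acquire read makes them visible. A lock stands in
# for the expensive strongly-consistent primitive. Illustrative only.
class RCStore:
    def __init__(self):
        self.data = {}
        self._sync = threading.Lock()  # used only on the synchronization path

    def write(self, key, value):
        # Plain (non-synchronization) access: no ordering cost.
        self.data[key] = value

    def read(self, key):
        # Plain access.
        return self.data.get(key)

    def release_write(self, key, value):
        # One-sided barrier: orders all earlier plain writes before it.
        with self._sync:
            self.data[key] = value

    def acquire_read(self, key):
        # One-sided barrier: orders all later plain reads after it.
        with self._sync:
            return self.data.get(key)

store = RCStore()
store.write("payload", 42)          # cheap, unordered write
store.release_write("ready", True)  # publishes "payload"
if store.acquire_read("ready"):
    print(store.read("payload"))    # 42
```

The point of the sketch is the asymmetry: only two of the four accesses touch the lock, mirroring the claim that most accesses need no strong primitive.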
We present Kite, a general-purpose, replicated KVS that enforces RC through a novel fast/slow path mechanism that leverages the absence of failures in the typical case to maximize performance, while relying on the slow path for progress. In addition, Kite leverages our study of replication protocols to select the most suitable protocols for its primitives, and is implemented over Odyssey to make the most out of modern hardware. Finally, Kite does not compromise on consistency, availability or programmability, as it provides sufficient primitives to implement any algorithm (consistency), does not interrupt its operation on a failure (availability), and offers the RC API that programmers are already familiar with (programmability).
Developing sustainable business models for institutions’ provision of open educational resources: Learning from OpenLearn users’ motivations and experiences
Universities across the globe have, for some time, been exploring the possibilities for achieving public benefit and generating business and visibility through releasing and sharing open educational resources (OER). Many have written about the need to develop sustainable and profitable business models around the production and release of OER. Downes (2006), for example, has questioned the financial sustainability of OER production at scale. Many of the proposed business models focus on OER’s value in generating revenue and detractors of OER have questioned whether they are in competition with formal education.
This paper reports on a study intended to broaden the conversation about OER business models to consider the motivations and experiences of OER users as the basis for making a better informed decision about whether OER and formal learning are competitive or complementary with each other. The study focused on OpenLearn - the Open University's (OU) web-based platform for OER, which hosts hundreds of online courses and videos and is accessed by over 3,000,000 users a year. A large scale survey and follow-up interviews with OpenLearn users worldwide revealed that university-provided OER can offer learners a bridge to formal education, allowing them to try out a subject before registering on a formal course and to build confidence in their abilities as learners. In addition, it was found that using OER during formal paid-for study can improve learners' performance and self-reliance, leading to increased retention and satisfaction with the learning experience.
Open educational resources for all? Comparing user motivations and characteristics across The Open University’s iTunes U channel and OpenLearn platform.
With the rise in access to mobile multimedia devices, educational institutions have exploited the iTunes U platform as an additional channel to provide free educational resources with the aim of profile-raising and breaking down barriers to education. For those prepared to invest in content preparation, it is possible to produce interactive, portable material that can be made available globally. Commentators have questioned both the financial implications for platform-specific content production, and the availability of devices for learners to access it (Osborne, 2012).
The Open University (OU) makes its free educational resources available on iTunes U and via its web-based open educational resources (OER) platform, OpenLearn. The OU’s OER on iTunes U reached the 60 million download mark in 2013; its OpenLearn platform boasts 27 million unique visitors since 2006. This paper reports the results of a large-scale study of users of the OU’s iTunes U channel and OpenLearn platform. A survey of several thousand users revealed key differences in demographics between those accessing OER via the web and via iTunes U. In addition, the data allowed comparison between three groups: formal learners, informal learners and educators.
The study raises questions about whether university-provided OER meet the needs of users and makes recommendations for how content can be modified to suit those needs. As the publishing of OER becomes core to business, we reflect on reasons why understanding users' motivations and demographics is vital, allowing for needs-led resource provision and content that is adapted to best achieve learner satisfaction, and to deliver institutions' social mission.
Analysis and design of security mechanisms in the context of Advanced Persistent Threats against critical infrastructures
Industry 4.0 can be defined as the digitization of all components within the industry, by combining productive processes with leading information and communication technologies. Whereas this integration has several benefits, it has also facilitated the emergence of several attack vectors. These can be leveraged to perpetrate sophisticated attacks such as an Advanced Persistent Threat (APT) that ultimately disrupts and damages critical infrastructural operations with a severe impact.
This doctoral thesis aims to study and design security mechanisms capable of detecting and tracing APTs to ensure the continuity of the production line. Although the basic tools to detect individual attack vectors of an APT have already been developed, it is important to integrate holistic defense solutions in existing critical infrastructures that are capable of addressing all potential threats. Additionally, it is necessary to prospectively analyze the requirements that these systems have to satisfy after the integration of novel services in the upcoming years.
To fulfill these goals, we define a framework for the detection and traceability of APTs in Industry 4.0, which aims to fill the gap between classic security mechanisms and APTs. The premise is to retrieve data about the production chain at all levels to correlate events in a distributed way, enabling the traceability of an APT throughout its entire life cycle. Ultimately, these mechanisms make it possible to holistically detect and anticipate attacks in a timely and autonomous way, to deter their propagation and minimize their impact. As a means to validate this framework, we propose some correlation algorithms that implement it (such as the Opinion Dynamics solution) and carry out different experiments that compare the accuracy of response techniques that take advantage of these traceability features. Similarly, we conduct a study on the feasibility of these detection systems in various Industry 4.0 scenarios.
Proceedings of the XIV Jornadas de Ingeniería Telemática (JITEL 2019), Zaragoza (Spain), 22-24 October 2019
On this occasion, the city of Zaragoza hosts the XIV Jornadas de Ingeniería Telemática (JITEL 2019), held from 22 to 24 October 2019. The Jornadas de Ingeniería Telemática (JITEL), organized by the Asociación de Telemática (ATEL), are an established forum for meeting, debate, and dissemination among the groups that teach and conduct research on topics related to telematic networks and services. The event aims to foster, on the one hand, the exchange of experiences and results, and on the other, communication and cooperation among research groups working on telematics. Alongside the traditional sessions that characterize scientific conferences, it seeks to promote more open activities that stimulate the exchange of ideas between experienced and early-career researchers, and to create links and meeting points between the different research groups and teams. To this end, in addition to inviting leading figures in the relevant fields, sessions will be included for the presentation and discussion of these teams' active research lines and projects.