An Overview on Application of Machine Learning Techniques in Optical Networks
Today's telecommunication networks have become sources of enormous amounts of
widely heterogeneous data. This information can be retrieved from network
traffic traces, network alarms, signal quality indicators, users' behavioral
data, etc. Advanced mathematical tools are required to extract meaningful
information from these data and to make decisions about the proper
functioning of the networks. Among these
mathematical tools, Machine Learning (ML) is regarded as one of the most
promising methodological approaches to perform network-data analysis and enable
automated network self-configuration and fault management. The adoption of ML
techniques in the field of optical communication networks is motivated by the
unprecedented growth of network complexity faced by optical networks in the
last few years. This increase in complexity is due to the introduction of a huge
number of adjustable and interdependent system parameters (e.g., routing
configurations, modulation format, symbol rate, coding schemes, etc.) that are
enabled by the usage of coherent transmission/reception technologies, advanced
digital signal processing and compensation of nonlinear effects in optical
fiber propagation. In this paper we provide an overview of the application of
ML to optical communications and networking. We classify and survey relevant
literature dealing with the topic, and we also provide an introductory tutorial
on ML for researchers and practitioners interested in this field. Although a
good number of research papers have recently appeared, the application of ML to
optical networks is still in its infancy: to stimulate further work in this
area, we conclude the paper by proposing possible new research directions.
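As a minimal illustration of the kind of ML-based network-data analysis the survey discusses, the sketch below uses a from-scratch k-nearest-neighbour classifier to map signal-quality indicators to a modulation-format choice. The training samples and feature scales are invented for illustration; a real system would learn from measured link data.

```python
import math

# Toy training set (hypothetical values): each sample is
# (OSNR in dB, link distance in km) -> a suitable modulation format.
TRAIN = [
    ((30.0, 80.0), "64-QAM"),
    ((28.0, 150.0), "64-QAM"),
    ((22.0, 400.0), "16-QAM"),
    ((20.0, 600.0), "16-QAM"),
    ((14.0, 1500.0), "QPSK"),
    ((12.0, 2000.0), "QPSK"),
]

def predict_modulation(osnr_db, distance_km, k=3):
    """Classify by majority vote among the k nearest training samples."""
    # Scale features so OSNR (dB) and distance (km) weigh comparably.
    def dist(sample):
        (s_osnr, s_dist), _ = sample
        return math.hypot((osnr_db - s_osnr) / 10.0,
                          (distance_km - s_dist) / 1000.0)
    nearest = sorted(TRAIN, key=dist)[:k]
    votes = {}
    for _, label in nearest:
        votes[label] = votes.get(label, 0) + 1
    return max(votes, key=votes.get)

print(predict_modulation(26.0, 300.0))   # high-quality, short link
print(predict_modulation(13.0, 1800.0))  # long-haul, low OSNR
```

The surveyed literature applies far richer models (neural networks, reinforcement learning) to the same shape of problem: features extracted from monitors in, a configuration decision out.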
The Global academic research organization network: Data sharing to cure diseases and enable learning health systems.
Introduction: Global data sharing is essential. This is the premise of the Academic Research Organization (ARO) Council, which was initiated in Japan in 2013 and has since been expanding throughout Asia and into Europe and the United States. The volume of data is growing exponentially, providing not only challenges but also a clear opportunity to understand and treat diseases in ways not previously considered. Harnessing the knowledge within these data successfully can provide researchers and clinicians with new ideas for therapies while avoiding repeats of failed experiments. This transfer of knowledge from research into clinical care is at the heart of a learning health system. Methods: The ARO Council seeks to form a worldwide complementary system for the benefit of all patients and investigators, catalyzing more efficient and innovative medical research processes. To this end, it has organized Global ARO Network Workshops to bring interested parties together, focusing on the aspects necessary to make such a global effort successful. Results: This experience report summarizes the presentations and discussions at the Global ARO Network Workshop held in Austin, Texas, in November 2017, where representatives from Japan, Korea, Singapore, Taiwan, Europe, and the United States reported on their efforts to encourage data sharing and to use research to inform care through learning health systems. Themes and recommendations to advance these efforts are explored. Standardization and harmonization are at the heart of these discussions to enable data sharing. In addition, the transformation of clinical research processes through disruptive innovation, while ensuring integrity and ethics, will be key to achieving the ARO Council's goal of overcoming diseases so that people not only live longer but are also healthier and happier as they age.
Conclusions: The achievement of global learning health systems will require further exploration, consensus-building, funding aligned with incentives for data sharing, standardization, harmonization, and actions that support global interests for the benefit of patients.
Medical data processing and analysis for remote health and activities monitoring
Recent developments in sensor technology, wearable computing, the Internet of Things (IoT), and wireless communication have given rise to research in ubiquitous healthcare and remote monitoring of human health and activities. Health monitoring systems involve processing and analysis of data retrieved from smartphones, smart watches, smart bracelets, as well as various sensors and wearable devices. Such systems enable continuous monitoring of patients' physiological and health conditions by sensing and transmitting measurements such as heart rate, electrocardiogram, body temperature, respiratory rate, chest sounds, or blood pressure. Pervasive healthcare, as a relevant application domain in this context, aims at revolutionizing the delivery of medical services through a medical assistive environment and facilitates the independent living of patients. In this chapter, we discuss (1) data collection, fusion, ownership and privacy issues; (2) models, technologies and solutions for medical data processing and analysis; (3) big medical data analytics for remote health monitoring; (4) research challenges and opportunities in medical data analytics; (5) examples of case studies and practical solutions.
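As a small, hypothetical illustration of the processing such monitoring systems perform, the sketch below smooths a stream of heart-rate readings with a short moving average and flags values outside a configurable range. All thresholds and sample values are illustrative, not clinical guidance.

```python
from collections import deque

def monitor_heart_rate(readings, window=3, low=50, high=110):
    """Yield (smoothed_bpm, alert) pairs for a stream of heart-rate readings.

    A short moving average suppresses single-sample sensor noise before
    the smoothed value is checked against the configured normal range.
    """
    recent = deque(maxlen=window)
    for bpm in readings:
        recent.append(bpm)
        smoothed = sum(recent) / len(recent)
        yield smoothed, not (low <= smoothed <= high)

# Example stream: one spurious spike (150) that smoothing absorbs,
# then a sustained elevation that does trigger an alert.
stream = [72, 75, 150, 74, 76, 120, 125, 130]
for smoothed, alert in monitor_heart_rate(stream):
    print(f"{smoothed:6.1f} bpm  alert={alert}")
```

The same pattern (windowed aggregation plus threshold rules) generalizes to the other vitals the chapter lists, such as respiratory rate or blood pressure; production systems add sensor fusion across devices and clinician-tuned alert policies.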
Towards Software-Defined Data Protection: GDPR Compliance at the Storage Layer is Within Reach
Enforcing data protection and privacy rules within large data processing
applications is becoming increasingly important, especially in the light of
GDPR and similar regulatory frameworks. Most modern data processing happens on
top of a distributed storage layer, and securing this layer against accidental
or malicious misuse is crucial to ensuring global privacy guarantees. However,
the performance overhead and the additional complexity of doing so are often
assumed to be significant. In this work we describe a path forward that
tackles both challenges. We propose "Software-Defined Data Protection" (SDP),
an adoption of the "Software-Defined Storage" approach to non-performance
aspects: a trusted controller translates company and application-specific
policies to a set of rules deployed on the storage nodes. These, in turn, apply
the rules at line-rate but do not take any decisions on their own. Such an
approach decouples often changing policies from request-level enforcement and
allows storage nodes to implement the latter more efficiently.
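The controller/node split described above can be caricatured in a few lines of Python. The policy names, roles, and data classes below are invented for illustration; in a real SDP deployment the controller would push compiled rules to smart storage hardware rather than evaluate them in software.

```python
# Hypothetical sketch of the SDP split: a trusted controller compiles
# high-level policies into flat rules; storage nodes only match them.

def compile_policy(policies):
    """Controller side: turn {data_class: allowed_roles} policies into a
    flat lookup table the storage node can evaluate without any logic."""
    rules = {}
    for data_class, allowed_roles in policies.items():
        for role in allowed_roles:
            rules[(data_class, role)] = True
    return rules

def storage_node_check(rules, request):
    """Node side: a single O(1) match per request -- no decisions of its
    own, so it can run at line rate; unknown pairs default to deny."""
    return rules.get((request["data_class"], request["role"]), False)

policies = {
    "eu_personal": {"eu_analyst"},           # e.g., GDPR-scoped data
    "public": {"eu_analyst", "us_analyst"},
}
rules = compile_policy(policies)

print(storage_node_check(rules, {"data_class": "eu_personal", "role": "eu_analyst"}))  # True
print(storage_node_check(rules, {"data_class": "eu_personal", "role": "us_analyst"}))  # False
```

The design point is that policy changes only touch `compile_policy` on the controller; the per-request path on the node stays a fixed, stateless table lookup.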
Even though in-storage processing brings challenges, mainly because it can
jeopardize line-rate processing, we argue that today's Smart Storage solutions
can already implement the required functionality, thanks to the separation of
concerns introduced by SDP. We highlight the challenges that remain, especially
that of trusting the storage nodes. These need to be tackled before we can
reach widespread adoption in cloud environments.
Software-Defined Data Protection: Low Overhead Policy Compliance at the Storage Layer is Within Reach!
Most modern data processing pipelines run on top of a distributed storage layer, and securing the whole system, and the storage layer in particular, against accidental or malicious misuse is crucial to ensuring compliance with rules and regulations. Enforcing data protection and privacy rules, however, stands at odds with the requirement to achieve ever higher access bandwidths and processing rates in large data processing pipelines. In this work we describe our proposal for a path forward that reconciles the two goals. We call our approach "Software-Defined Data Protection" (SDP). Its premise is simple, yet powerful: decoupling often-changing policies from request-level enforcement allows distributed smart storage nodes to implement the latter at line rate. Existing and future data protection frameworks can be translated to the same hardware interface, which allows storage nodes to offload enforcement efficiently both for company-specific rules and for regulations such as GDPR or CCPA. While SDP is a promising approach, several challenges remain before this vision becomes reality. As we explain in the paper, overcoming them will require collaboration across several domains, including security, databases, and specialized hardware design.
clicktatorship and democrazy: Social media and political campaigning
This chapter aims to direct attention to the political dimension of the social media age.
Although current events like the Cambridge Analytica data breach have raised awareness of the
issue, the systematically organized and orchestrated mechanisms at play remain invisible to most.
Alongside dangerous monopoly tendencies among the powerful players in the market, reliance on
automated algorithms for handling content seems to enable large-scale manipulation that is applied for
economic and political purposes alike. The successful replacement of traditional parties by movements
based on personality cults around marketable young faces like Emmanuel Macron or Austria’s Sebastian
Kurz is strongly linked to products and services offered by an industry that simply provides likes and
followers for cash. Inspired by Trump’s monopolization of the Twitter-channel, these new political
actors use the potential of social media for effective message control, allowing them to avoid
confrontations with professional journalists. In addition, an extremely active minority of organized
agitators relies on the viral potential of the web to strongly influence and dictate public discourse –
suggesting a shift from the Spiral of Silence to the dangerous illusion of a Nexus of Noise.
JuxtaLearn D3.2 Performance Framework
This deliverable, D3.2, for Work Package 3, incorporating the pedagogy from WP2 and the orchestration factors mapped in D3.1, reviews aspects of performance in the context of participative video making. It reviews literature on curiosity and engagement, characterizes the interaction mechanisms of public displays, and anticipates requirements for the social network analysis of relevant public videos in WP6 task 6.3. To support JuxtaLearn performance, it proposes a reflective performance framework that encompasses the material environment and objects required, the participants, and the knowledge needed.