A New 2.5D Representation for Lymph Node Detection using Random Sets of Deep Convolutional Neural Network Observations
Automated Lymph Node (LN) detection is an important clinical diagnostic task but very challenging, due to the low contrast between LNs and surrounding structures in Computed Tomography (CT) and to their varying sizes, poses, shapes and sparsely distributed locations. State-of-the-art studies report sensitivities of 52.9% at 3.1 false positives per volume (FP/vol.) or 60.9% at 6.1 FP/vol. for mediastinal LNs, using one-shot boosting on 3D Haar features. In this paper, we first run a preliminary candidate generation stage, aiming at 100% sensitivity at the cost of a high FP level (40 per patient), to harvest volumes of interest (VOI). Our 2.5D approach then decomposes each 3D VOI by resampling 2D reformatted orthogonal views N times, via random scales, translations, and rotations about the VOI centroid. These
random views are then used to train a deep Convolutional Neural Network (CNN)
classifier. In testing, the CNN is employed to assign LN probabilities for all
N random views that can be simply averaged (as a set) to compute the final
classification probability per VOI. We validate the approach on two datasets:
90 CT volumes with 388 mediastinal LNs and 86 patients with 595 abdominal LNs.
We achieve sensitivities of 70%/83% at 3 FP/vol. and 84%/90% at 6 FP/vol. in
mediastinum and abdomen respectively, which drastically improves over the
previous state-of-the-art work.
Comment: This article will be presented at MICCAI (Medical Image Computing and Computer-Assisted Interventions) 201
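The aggregation step this abstract describes, scoring N randomly transformed orthogonal views and averaging the per-view CNN probabilities into one VOI probability, can be sketched as follows. The jitter ranges and the mock scores are illustrative assumptions, not values from the paper:

```python
import random
from statistics import mean

def sample_random_views(n_views, rng):
    """Sketch of the 2.5D sampling: each view is an orthogonal CT plane with
    random scale, translation, and rotation jitter about the VOI centroid.
    Only the sampling parameters are produced here; real code would resample
    the CT volume accordingly before feeding each view to the CNN."""
    return [
        {
            "axis": rng.randrange(3),             # which orthogonal plane
            "scale": rng.uniform(0.8, 1.2),       # illustrative jitter range
            "shift": (rng.randint(-3, 3), rng.randint(-3, 3)),  # voxels
            "angle": rng.uniform(0.0, 360.0),     # in-plane rotation, degrees
        }
        for _ in range(n_views)
    ]

def voi_probability(view_scores):
    """Average the per-view CNN probabilities into one VOI-level score:
    the 'random sets of observations' aggregation step."""
    return mean(view_scores)

rng = random.Random(0)
views = sample_random_views(n_views=5, rng=rng)
scores = [rng.random() for _ in views]  # stand-in for per-view CNN outputs
p = voi_probability(scores)
assert 0.0 <= p <= 1.0
```

Averaging over the random set makes the final score robust to any single unlucky view of the VOI.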
Empirical Big Data Research: A Systematic Literature Mapping
Background: Big Data is a relatively new field of research and technology,
and literature reports a wide variety of concepts labeled with Big Data. The
maturity of a research field can be measured in the number of publications
containing empirical results. In this paper we present the current status of
empirical research in Big Data. Method: We employed a systematic mapping method
with which we mapped the collected research according to the labels Variety,
Volume and Velocity. In addition, we addressed the application areas of Big
Data. Results: We found that 151 of the 1778 assessed contributions contain some form of empirical result and can be mapped to one or more of the three V's, and that 59 address an application area. Conclusions: The share of publications containing empirical results is well below the average for computer science research as a whole. To mature the research on Big Data, we recommend applying empirical methods to strengthen confidence in the reported results. Based on our trend analysis, we consider Volume and Variety to be the most promising uncharted areas in Big Data.
Comment: Submitted to Springer journal Data Science and Engineering
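The headline share can be checked with quick arithmetic (a sanity check, not part of the study):

```python
# The abstract's headline numbers: 151 of the 1778 assessed contributions
# contain empirical results.
empirical, assessed = 151, 1778
share = empirical / assessed
print(f"{share:.1%}")  # roughly 8.5% of contributions report empirical results
```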
Benefits of Computer Based Content Analysis to Foresight
Purpose of the article: This manuscript summarizes the benefits of using computer-based content analysis in the generation phase of foresight initiatives. Possible advantages, disadvantages and limitations of content analysis for foresight projects are discussed as well.
Methodology/methods: In order to specify the benefits and identify the limitations of content analysis within foresight, the results of the generation phase of a particular foresight project, performed first without and subsequently with a computer-based content analysis tool, were compared using two proposed measurements.
Scientific aim: The generation phase of foresight is the most demanding part in terms of analysis duration, costs and resources, due to the significant amount of text to be reviewed. In addition, the conclusions of the foresight evaluation depend on the personal views and perceptions of the foresight analysts, as the evaluation is based merely on reading. Content analysis may partially or even fully replace the reading and provide an important benchmark.
Findings: The use of a computer-based content analysis tool significantly reduced the time needed to conduct the foresight generation phase. The tool produced very similar results to the evaluation performed by standard reading; only ten percent of the results were not revealed by the content analysis tool. On the other hand, the content analysis tool identified several new topics that had been missed by the reading.
Conclusions: The results of the two measurements should be subjected to further testing within more foresight projects to validate them. The computer-based content analysis tool provides a valuable benchmark to foresight analysts and can partially substitute for reading. However, a complete replacement of reading is not recommended, as a deep understanding of weak-signal interpretation is essential for foresight.
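The comparison described above reduces to simple set overlap between the topics found by reading and those found by the tool; the topic names and counts below are hypothetical, chosen only to mirror the "ten percent missed, several new" pattern the study reports:

```python
def coverage_measures(reading_topics, tool_topics):
    """Sketch of the kind of overlap measurement the study describes: how much
    of the reading-based evaluation the content-analysis tool reproduces, and
    what it adds on top."""
    reading, tool = set(reading_topics), set(tool_topics)
    missed = reading - tool                 # found by reading only
    new = tool - reading                    # found by the tool only
    miss_rate = len(missed) / len(reading)  # share of reading results not reproduced
    return miss_rate, sorted(new)

# Hypothetical topic lists for illustration.
reading = ["ageing", "automation", "energy", "migration", "urbanisation",
           "health", "education", "security", "climate", "mobility"]
tool = reading[:9] + ["circular economy", "platform work"]  # misses one, adds two

miss_rate, new_topics = coverage_measures(reading, tool)
assert miss_rate == 0.1
assert new_topics == ["circular economy", "platform work"]
```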
Data Management in Industry 4.0: State of the Art and Open Challenges
Information and communication technologies are permeating all aspects of
industrial and manufacturing systems, expediting the generation of large
volumes of industrial data. This article surveys the recent literature on data
management as it applies to networked industrial environments and identifies
several open research challenges for the future. As a first step, we extract
important data properties (volume, variety, traffic, criticality) and identify
the corresponding data enabling technologies of diverse fundamental industrial
use cases, based on practical applications. Secondly, we provide a detailed
outline of recent industrial architectural designs with respect to their data management philosophy (data presence, data coordination, data computation) and the extent to which they are distributed. Then, we conduct a holistic survey of the
recent literature from which we derive a taxonomy of the latest advances on
industrial data enabling technologies and data centric services, spanning all
the way from the field level deep in the physical deployments, up to the cloud
and applications level. Finally, motivated by the rich conclusions of this
critical analysis, we identify interesting open challenges for future research.
The concepts presented in this article thematically cover the largest part of
the industrial automation pyramid's layers. Our approach is multidisciplinary, as the selected publications were drawn from two fields: the communications, networking and computation field, and the industrial, manufacturing and automation field. The article can help readers understand in depth how data management is currently applied in networked industrial environments, and select interesting open research opportunities to pursue.
A Survey on the Security of Pervasive Online Social Networks (POSNs)
Pervasive Online Social Networks (POSNs) are the extensions of Online Social
Networks (OSNs) which facilitate connectivity irrespective of the domain and
properties of users. POSNs have emerged from the convergence of a plethora of social networking platforms, with the motivation of bridging the gaps between them. Over the last decade, OSNs have seen a tremendous amount of advancement in terms of the number of users as well as technology enablers. A single OSN is the property of one organization, which ensures the smooth functioning of its services so as to provide a quality experience to its users. With POSNs, however, multiple OSNs are coupled through communities, circles, or other shared properties, which makes service provisioning difficult to sustain. Challenges become especially severe when the focus is on the security of cross-platform OSNs, which are an integral part of POSNs. It is therefore of utmost importance to highlight this requirement and to understand the current situation while discussing the available state of the art. With the modernization of OSNs and the convergence towards POSNs, it is necessary to understand the impact and reach of current solutions for enhancing the security of users as well as of associated services. This survey addresses this need and focuses on the studies presented over the last few years, examining their applicability to POSNs.
Comment: 39 Pages, 10 Figures
Analytical Cost Metrics: Days of Future Past
As we move towards the exascale era, the new architectures must be capable of
running the massive computational problems efficiently. Scientists and
researchers are continuously investing in tuning the performance of
extreme-scale computational problems. These problems arise in almost all areas
of computing, ranging from big data analytics, artificial intelligence, search,
machine learning, virtual/augmented reality, computer vision, image/signal
processing to computational science and bioinformatics. With Moore's law
driving the evolution of hardware platforms towards exascale, the dominant
performance metric (time efficiency) has now expanded to also incorporate
power/energy efficiency. Therefore, the major challenge that we face in
computing systems research is: "how to solve massive-scale computational
problems in the most time/power/energy efficient manner?"
Architectures are constantly evolving, making current performance-optimization strategies less applicable and requiring new strategies to be invented. The
solution is for the new architectures, new programming models, and applications
to go forward together. Doing this is, however, extremely hard. There are too
many design choices in too many dimensions. We propose the following strategy
to solve the problem: (i) Models - Develop accurate analytical models (e.g.
execution time, energy, silicon area) to predict the cost of executing a given
program, and (ii) Complete System Design - Simultaneously optimize all the cost
models for the programs (computational problems) to obtain the most
time/area/power/energy efficient solution. Such an optimization problem evokes the notion of codesign.
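A minimal sketch of the kind of analytical cost model advocated above, assuming a simple roofline-style bound on execution time plus a power-times-time energy model. The peak-compute and bandwidth numbers are illustrative assumptions, not values from the paper:

```python
def roofline_time(flops, bytes_moved, peak_flops, peak_bw):
    """Analytical execution-time model: a kernel is bounded either by its
    compute demand (flops / peak compute rate) or by its memory traffic
    (bytes / peak bandwidth), whichever dominates."""
    return max(flops / peak_flops, bytes_moved / peak_bw)

def energy(time_s, avg_power_w):
    """Energy cost model: energy = average power * time (the second cost
    dimension alongside time)."""
    return avg_power_w * time_s

# Example: 1e9 FLOPs over 8e8 bytes on a machine with 1e11 FLOP/s peak
# compute and 1e10 B/s memory bandwidth (illustrative hardware numbers).
t = roofline_time(1e9, 8e8, peak_flops=1e11, peak_bw=1e10)
assert abs(t - 0.08) < 1e-12            # memory-bound: 8e8/1e10 s dominates
assert abs(energy(t, 100.0) - 8.0) < 1e-9  # joules at 100 W average power
```

Given such closed-form models per program, the "complete system design" step becomes an optimization over the free hardware and software parameters.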
Delivery of broadband services to SubSaharan Africa via Nigerian communications satellite
Africa is the least wired continent in the world in terms of robust telecommunications infrastructure and systems to cater for its more than one billion people. African nations are mostly still in the early stages of Information and Communications Technology (ICT) development, as verified by the relatively low ICT Development Index (IDI) values of all countries in the African region. In developing nations, mobile broadband subscriptions and penetration grew far more quickly than fixed broadband subscriptions between 2000 and 2009. To achieve the goal of universal access, African nations and their stakeholders are promoting and implementing communication satellite systems, particularly in Nigeria, to help bridge the digital divide: satellites allow rapid deployment of ICT infrastructure that complements the sparsely distributed terrestrial networks in the hinterlands and leverages the adequate submarine cables along the African coastline. This paper examines the effectiveness of communication satellites in delivering broadband-based services.
IMPROVING SMART GRID SECURITY USING MERKLE TREES
Abstract—Presently, nations worldwide are starting to convert their aging electrical power infrastructures into modern, dynamic power grids. The Smart Grid offers much in the way of efficiency and robustness to the electrical power grid; however, its heavy reliance on communication networks will leave it more vulnerable to attack than present-day grids. This paper looks at the threat that a fully realized quantum computer poses to public-key cryptography systems and how this could impact the Smart Grid. We argue for the use of Merkle trees in place of public-key cryptography for the authentication of devices in the wireless mesh networks used in Smart Grid applications.
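A minimal sketch of hash-based device authentication with a Merkle tree, the construction argued for above. The device names are hypothetical and this is an illustration of the general technique, not the paper's exact scheme:

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Build a Merkle root over device credentials. Only hash functions are
    used, so (unlike RSA/ECC) security does not rest on number-theoretic
    problems a quantum computer is known to break."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:                 # duplicate last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_proof(leaves, index):
    """Authentication path for leaf `index`: the sibling hashes needed to
    recompute the root, plus each sibling's side (left or right)."""
    level = [h(leaf) for leaf in leaves]
    path = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        sib = index ^ 1
        path.append((level[sib], sib < index))  # (hash, sibling_is_left)
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return path

def verify(leaf, path, root):
    """Recompute the root from a leaf and its authentication path."""
    node = h(leaf)
    for sibling, sibling_is_left in path:
        node = h(sibling + node) if sibling_is_left else h(node + sibling)
    return node == root

# A meter proves membership of its credential without public-key crypto.
keys = [f"device-{i}".encode() for i in range(4)]  # hypothetical device IDs
root = merkle_root(keys)
proof = merkle_proof(keys, 2)
assert verify(keys[2], proof, root)
assert not verify(b"rogue-device", proof, root)
```

The verifier only needs the root; each proof is logarithmic in the number of enrolled devices.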
Automatic generation of object shapes with desired functionalities
3D objects (artefacts) are made to fulfill functions. Designing an object
often starts with defining a list of functionalities that it should provide,
also known as functional requirements. Today, the design of 3D object models is
still a slow and largely artisanal activity, with few Computer-Aided Design
(CAD) tools existing to aid the exploration of the design solution space. To
accelerate the design process, we introduce an algorithm for generating object
shapes with desired functionalities. Following the concept of form follows
function, we assume that existing object shapes were rationally chosen to
provide desired functionalities. First, we use an artificial neural network to
learn a function-to-form mapping by analysing a dataset of objects labeled with
their functionalities. Then, we combine forms providing one or more desired
functions, generating an object shape that is expected to provide all of them.
Finally, we verify in simulation whether the generated object possesses the
desired functionalities, by defining and executing functionality tests on it.
Comment: 12 pages, 9 figures, 28 references
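The combine-then-verify loop this abstract outlines can be sketched with a toy stand-in for the learned function-to-form mapping. The functionality names, shape descriptors, combination rule, and test threshold below are all invented for illustration; in the paper the mapping is learned by a neural network and verification runs in simulation:

```python
from statistics import mean

# Hypothetical learned mapping: each functionality maps to a prototypical
# shape descriptor (here a toy 3-D feature vector).
FUNCTION_TO_FORM = {
    "contain-liquid": (1.0, 0.2, 0.0),
    "graspable":      (0.3, 1.0, 0.1),
    "stackable":      (0.0, 0.1, 1.0),
}

def combine_forms(functions):
    """Combine the form descriptors of all desired functionalities into one
    candidate shape descriptor (simple averaging, one of many choices)."""
    forms = [FUNCTION_TO_FORM[f] for f in functions]
    return tuple(mean(axis) for axis in zip(*forms))

def passes_test(shape, function, threshold=0.4):
    """Toy functionality test: the shape must score highly enough on the
    feature axis most associated with the function (a stand-in for the
    simulation-based verification step)."""
    axis = max(range(3), key=lambda i: FUNCTION_TO_FORM[function][i])
    return shape[axis] >= threshold

candidate = combine_forms(["contain-liquid", "graspable"])
assert all(passes_test(candidate, f) for f in ["contain-liquid", "graspable"])
```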
A Roadmap Towards Resilient Internet of Things for Cyber-Physical Systems
The Internet of Things (IoT) is a ubiquitous system connecting many different devices - the things - which can be accessed remotely. Cyber-physical systems (CPS) monitor and control the things from a distance.
As a result, the concepts of dependability and security get deeply intertwined.
The increasing level of dynamicity, heterogeneity, and complexity adds to the
system's vulnerability and challenges its ability to react to faults. This paper summarizes the state of the art of existing work on anomaly detection, fault tolerance and self-healing, and adds a number of other methods applicable for achieving resilience in the IoT. We particularly focus on non-intrusive methods
ensuring data integrity in the network. Furthermore, this paper presents the
main challenges in building a resilient IoT for CPS which is crucial in the era
of smart CPS with enhanced connectivity (an excellent example of such a system
is connected autonomous vehicles). It further summarizes our solutions,
work-in-progress and future work to this topic to enable "Trustworthy IoT for
CPS". Finally, this framework is illustrated on a selected use case: A smart
sensor infrastructure in the transport domain.
Comment: preprint (2018-10-29)
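As one concrete (and deliberately generic) instance of the anomaly-detection family surveyed above, a windowed z-score detector on a sensor stream can be sketched as follows; the window size, threshold, and readings are illustrative assumptions, not the paper's specific method:

```python
from statistics import mean, stdev

def detect_anomalies(readings, window=10, k=3.0):
    """Flag a reading when it lies more than k standard deviations from the
    mean of the preceding window: a minimal, non-intrusive check that can run
    on the data stream without touching the devices themselves."""
    flags = []
    for i, x in enumerate(readings):
        hist = readings[max(0, i - window):i]
        if len(hist) >= 3:
            mu, sigma = mean(hist), stdev(hist)
            flags.append(sigma > 0 and abs(x - mu) > k * sigma)
        else:
            flags.append(False)  # not enough history to judge yet
    return flags

# A temperature stream from a (hypothetical) transport-domain smart sensor.
stream = [20.0, 20.1, 19.9, 20.2, 20.0, 19.8, 20.1, 95.0, 20.0]
flags = detect_anomalies(stream)
assert flags[7]              # the 95.0 spike is flagged
assert not any(flags[:7])    # normal readings pass
```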