Fog-enabled Edge Learning for Cognitive Content-Centric Networking in 5G
By caching content at network edges close to the users, content-centric
networking (CCN) has been considered a way to enable efficient content
retrieval and distribution in fifth generation (5G) networks. Due to the
volume, velocity, and variety of data generated by various 5G users, an
urgent and strategic issue is how to elevate the cognitive ability of the
CCN to realize context-awareness, timely response, and traffic offloading for
5G applications. In this article, we envision that the fundamental work of
designing a cognitive CCN (C-CCN) for the upcoming 5G is exploiting fog
computing to associatively learn and control the states of edge devices
(such as phones, vehicles, and base stations) and in-network resources
(computing, networking, and caching). Moreover, we propose a fog-enabled
edge learning (FEL) framework for C-CCN in 5G, which can aggregate the idle
computing resources of neighbouring edge devices into virtual fogs to handle
heavy, delay-sensitive learning tasks. By leveraging artificial intelligence
(AI) to jointly process sensed environmental data, handle the massive content
statistics, and enforce mobility control at network edges, the FEL makes it
possible for mobile users to cognitively share their data over the C-CCN in
5G. To validate the feasibility of the proposed framework, we design two
FEL-advanced cognitive services for C-CCN in 5G: 1) personalized network
acceleration, and 2) enhanced mobility management. We also present
simulations showing the FEL's efficiency in serving mobile users'
delay-sensitive content retrieval and distribution in 5G.
Comment: Submitted to IEEE Communications Magazine, under review, Feb. 09, 201
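The abstract's core mechanism, pooling idle compute from neighbouring edge devices into a "virtual fog" for a delay-sensitive task, can be sketched as follows. This is an illustrative sketch only, not the paper's algorithm: the device names, capacity figures, and the greedy lowest-latency selection rule are all assumptions.

```python
# Illustrative sketch (not the paper's algorithm): form a "virtual fog"
# by pooling idle compute from neighbouring edge devices until a
# delay-sensitive task's demand is covered within a latency deadline.
from dataclasses import dataclass

@dataclass
class EdgeDevice:
    name: str
    idle_gflops: float   # idle compute the device can contribute (assumed units)
    latency_ms: float    # round-trip latency to the task's origin

def form_virtual_fog(devices, demand_gflops, deadline_ms):
    """Greedily pick the lowest-latency devices whose pooled idle
    compute meets the task's demand within the latency deadline."""
    fog, pooled = [], 0.0
    for d in sorted(devices, key=lambda d: d.latency_ms):
        if d.latency_ms > deadline_ms:
            break                      # too far away to help in time
        fog.append(d)
        pooled += d.idle_gflops
        if pooled >= demand_gflops:
            return fog                 # enough compute aggregated
    return None                        # fall back (e.g., to the cloud)

# Hypothetical neighbourhood of edge devices:
devices = [
    EdgeDevice("phone-1", 2.0, 5.0),
    EdgeDevice("vehicle-1", 8.0, 12.0),
    EdgeDevice("base-station-1", 20.0, 9.0),
]
fog = form_virtual_fog(devices, demand_gflops=25.0, deadline_ms=15.0)
print([d.name for d in fog])  # ['phone-1', 'base-station-1', 'vehicle-1']
```

The greedy latency-first ordering is one plausible policy for delay-sensitive tasks; the paper's actual control logic may differ.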
Effective Caching for the Secure Content Distribution in Information-Centric Networking
The secure distribution of protected content requires consumer authentication
and typically relies on the conventional method of end-to-end encryption.
However, in information-centric networking (ICN), end-to-end encryption makes
content caching ineffective, since encrypted content stored in a cache is
useless to any consumer except those who know the encryption key. For
effective caching of encrypted content in ICN, we propose a novel scheme
called Secure Distribution of Protected Content (SDPC). SDPC ensures that
only authenticated consumers can access the content. SDPC is a lightweight
authentication and key distribution protocol; it allows consumer nodes to
verify the originality of the published content by using symmetric key
encryption. The security of SDPC was proved with BAN logic and Scyther tool
verification.
Comment: 7 pages, 9 figures, 2018 IEEE 87th Vehicular Technology Conference
(VTC Spring)
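The abstract states that SDPC lets authenticated consumers verify content originality with symmetric-key cryptography, but gives no protocol details. The sketch below shows only the general idea, a tag computed under a key that (per the abstract) would be distributed only to authenticated consumers. The use of HMAC-SHA256, the key value, and the function names are assumptions; SDPC's actual message formats and key-distribution steps are not reproduced here.

```python
# Illustrative sketch of symmetric-key content verification, in the
# spirit of (but not identical to) the SDPC scheme described above.
import hmac
import hashlib

def publish(content: bytes, shared_key: bytes) -> bytes:
    """Producer attaches an authentication tag to the content."""
    return hmac.new(shared_key, content, hashlib.sha256).digest()

def verify(content: bytes, tag: bytes, shared_key: bytes) -> bool:
    """An authenticated consumer recomputes the tag and compares in
    constant time; a mismatch means the cached copy is not original."""
    expected = hmac.new(shared_key, content, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

key = b"key-from-sdpc-key-distribution"   # hypothetical shared key
tag = publish(b"protected content", key)
print(verify(b"protected content", tag, key))   # True
print(verify(b"tampered content", tag, key))    # False
```

A consumer without the shared key can neither forge a valid tag nor verify one, which is consistent with the abstract's claim that only authenticated consumers can access the content.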
Interoperability, Trust Based Information Sharing Protocol and Security: Digital Government Key Issues
Improved interoperability between public and private organizations is of key
significance to the success of digital government. Digital government
interoperability, information sharing protocols, and security are considered
the key issues for achieving a refined stage of digital government. Flawless
interoperability is essential to share information between diverse and widely
dispersed organisations across several network environments using
computer-based tools. Digital government must ensure security for its
information systems, including computers and networks, in order to provide
better service to citizens. Governments around the world are increasingly
turning to information sharing and integration to solve problems in programs
and policy areas. Problems of global concern such as disease detection and
control, terrorism, immigration and border control, illegal drug trafficking,
and more demand information sharing, coordination, and cooperation among
government agencies within a country and across national borders. A number of
daunting challenges stand in the way of developing an efficient information
sharing protocol. A secure and trusted information-sharing protocol is
required to enable users to interact and share information easily and
seamlessly across many diverse networks and databases globally.
Comment: 20 pages
A Modern Primer on Processing in Memory
Modern computing systems are overwhelmingly designed to move data to
computation. This design choice goes directly against at least three key trends
in computing that cause performance, scalability and energy bottlenecks: (1)
data access is a key bottleneck as many important applications are increasingly
data-intensive, and memory bandwidth and energy do not scale well, (2) energy
consumption is a key limiter in almost all computing platforms, especially
server and mobile systems, (3) data movement, especially off-chip to on-chip,
is very expensive in terms of bandwidth, energy and latency, much more so than
computation. These trends are felt especially severely in the data-intensive
server and energy-constrained mobile systems of today. At the same time,
conventional memory technology is facing many technology scaling challenges in
terms of reliability, energy, and performance. As a result, memory system
architects are open to organizing memory in different ways and making it more
intelligent, at the expense of higher cost. The emergence of 3D-stacked memory
plus logic, the adoption of error correcting codes inside the latest DRAM
chips, the proliferation of different main memory standards and chips
specialized for different purposes (e.g., graphics, low-power, high bandwidth,
low latency), and the necessity of designing new solutions to serious
reliability and security issues, such as the RowHammer phenomenon, are
evidence of this
trend. This chapter discusses recent research that aims to practically enable
computation close to data, an approach we call processing-in-memory (PIM). PIM
places computation mechanisms in or near where the data is stored (i.e., inside
the memory chips, in the logic layer of 3D-stacked memory, or in the memory
controllers), so that data movement between the computation units and memory is
reduced or eliminated.
Comment: arXiv admin note: substantial text overlap with arXiv:1903.0398
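The abstract's claim that data movement costs much more than computation can be made concrete with a back-of-the-envelope model. The per-operation energies below are assumed, round-number values (real figures vary widely by technology node); only the ratio between them matters for the argument.

```python
# Back-of-the-envelope model of why off-chip data movement dominates:
# compare the energy of a 64-bit DRAM access against a 64-bit add.
# Both constants are assumptions chosen for illustration, in picojoules.
DRAM_ACCESS_PJ = 1000.0   # fetch 64 bits from off-chip DRAM (assumed)
ADD_PJ = 1.0              # one 64-bit integer add (assumed)

def vector_add_energy(n, operands_offchip=True):
    """Energy to compute c[i] = a[i] + b[i] for n 64-bit elements:
    n adds, plus (if the data lives off-chip) 2n loads and n stores."""
    compute = n * ADD_PJ
    movement = 3 * n * DRAM_ACCESS_PJ if operands_offchip else 0.0
    return compute + movement

n = 1_000_000
total = vector_add_energy(n)
print(f"movement share: {1 - n * ADD_PJ / total:.4f}")  # ≈ 0.9997
```

Under these assumed constants, essentially all of the energy goes to moving data rather than computing on it, which is exactly the imbalance that processing-in-memory targets by eliminating the off-chip round trips.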
Digitalization and Its Impact on Commercial Aviation
The purpose of this thesis was to study the concept of digitalization and to examine its impact on the commercial aviation industry. Digitalization and the numerous developments deriving from it constitute a comprehensive framework that is consequential for any industry, and especially for the commercial aviation industry. This particular relevance arises primarily from the rapidly evolving nature of the industry; furthermore, components such as the cost structure, security, and competition intensity play an important part.
Digitalization has historically been defined in multiple dissimilar ways due to its constantly developing nature. The definition used in this thesis will combine the proliferation of mobile devices and internet-based technologies with other significant innovations such as Big Data, Automation and 3D-printing. This definition will not stand alone, but instead it will be combined to fit the context of the commercial aviation industry.
The findings indicate that by investing in significant digital technologies, commercial airlines can potentially increase both their customer satisfaction and their operational efficiency considerably. The specific digital trends contributing to customer satisfaction are the internet and the Internet of Things (IoT), Big Data, and Blockchain, whereas Augmented Reality (AR), Automation, and 3D-printing affect operational flight performance. The successful adoption of these technologies can potentially lead to improvements in the airline's overall efficiency-, cost-, flexibility-, and security-related performance.
However, prior to focusing on the individual trends, it is vital to acknowledge the current capabilities of the firm. Furthermore, the company needs to develop a solid digital strategy and implement that strategy successfully. An ad hoc mindset is advisable, in addition to an approach that promotes trial and error.