Distributed Processes, Distributed Cognizers and Collaborative Cognition
Cognition is thinking; it feels like something to think, and only those who can feel can think. There are also things that thinkers can do. We know neither how thinkers can think nor how they are able to do what they can do. We are waiting for cognitive science to discover how. Cognitive science does this by testing hypotheses about what processes can generate what doing (“know-how”). This is called the Turing Test. It cannot test whether a process can generate feeling, hence thinking -- only whether it can generate doing. The processes that generate thinking and know-how are “distributed” within the heads of thinkers, but not across thinkers’ heads. Hence there is no such thing as distributed cognition, only collaborative cognition. Email and the Web have spawned a new form of collaborative cognition that draws upon individual brains’ real-time interactive potential in ways that were not possible in oral, written or print interactions.
Massive MIMO is a Reality -- What is Next? Five Promising Research Directions for Antenna Arrays
Massive MIMO (multiple-input multiple-output) is no longer a "wild" or
"promising" concept for future cellular networks - in 2018 it became a reality.
Base stations (BSs) with 64 fully digital transceiver chains were commercially
deployed in several countries, the key ingredients of Massive MIMO have made it
into the 5G standard, the signal processing methods required to achieve
unprecedented spectral efficiency have been developed, and the limitation due
to pilot contamination has been resolved. Even the development of fully digital
Massive MIMO arrays for mmWave frequencies - once viewed prohibitively
complicated and costly - is well underway. In a few years, Massive MIMO with
fully digital transceivers will be a mainstream feature at both sub-6 GHz and
mmWave frequencies. In this paper, we explain how the first chapter of the
Massive MIMO research saga has come to an end, while the story has just begun.
The coming wide-scale deployment of BSs with massive antenna arrays opens the
door to a brand new world where spatial processing capabilities are
omnipresent. In addition to mobile broadband services, the antennas can be used
for other communication applications, such as low-power machine-type or
ultra-reliable communications, as well as non-communication applications such
as radar, sensing and positioning. We outline five new Massive MIMO related
research directions: Extremely large aperture arrays, Holographic Massive MIMO,
Six-dimensional positioning, Large-scale MIMO radar, and Intelligent Massive
MIMO.
Comment: 20 pages, 9 figures, submitted to Digital Signal Processing
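The spatial processing the abstract refers to can be illustrated with a toy example (not taken from the paper): zero-forcing precoding, a standard linear technique by which a base station with many antennas serves several users on the same time-frequency resource. The antenna count (64), user count (8), and i.i.d. Rayleigh channel below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
M, K = 64, 8  # 64 BS antennas serving 8 single-antenna users

# Synthetic i.i.d. Rayleigh-fading channel: rows = users, cols = BS antennas
H = (rng.standard_normal((K, M)) + 1j * rng.standard_normal((K, M))) / np.sqrt(2)

# Zero-forcing precoder: the right pseudo-inverse of H nulls inter-user interference
W = np.linalg.pinv(H)           # shape (M, K), one beamforming column per user
W /= np.linalg.norm(W, axis=0)  # normalize each user's beam to unit power

# The effective channel H @ W is (numerically) diagonal:
# each user receives only its own stream
eff = H @ W
leakage = np.max(np.abs(eff - np.diag(np.diag(eff))))
print(leakage)  # near machine precision: interference suppressed
```

With M much larger than K, the channel rows become nearly orthogonal, which is why such simple linear precoders approach optimal performance in Massive MIMO.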
Distributed machine learning for IoT
In the modern world, machine learning relies on big data that is difficult to process on a single computer, so various methods for parallel processing of such data are being developed. But what about microcontrollers? Microcontrollers are common in cloud systems, where they are used to control various devices, and they sometimes have to work with big data. A microcontroller's memory is quite small, and its processor is far less capable than that of a modern supercomputer. Therefore, many researchers propose methods for parallel processing of big data on embedded systems; one such method is proposed by the author of this article.
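The abstract does not describe the author's method, but the data-parallel pattern it alludes to (split the data into shards, reduce each shard locally, combine the partial results) can be sketched generically. The mean computation and node count below are illustrative assumptions, not the paper's algorithm.

```python
def partial_stats(chunk):
    # Local reduction each node (e.g. one microcontroller) performs on its shard
    return (len(chunk), sum(chunk))

def combine(parts):
    # Merge the per-node partial results into the global mean
    n = sum(count for count, _ in parts)
    total = sum(subtotal for _, subtotal in parts)
    return total / n

def distributed_mean(data, nodes=4):
    # Round-robin split keeps shard sizes balanced across nodes
    shards = [data[i::nodes] for i in range(nodes)]
    parts = [partial_stats(s) for s in shards]  # in a real system, runs in parallel
    return combine(parts)

print(distributed_mean(list(range(10))))  # 4.5
```

The key property for memory-constrained devices is that each node only ever holds its own shard and a constant-size partial result.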
The application of artificial intelligence techniques to large distributed networks
Data accessibility and the transfer of information, including the land resources information system pilot, are structured as large computer information networks. These pilot efforts aim to reduce the difficulty of finding and using data, reduce processing costs, and minimize incompatibility between data sources. Artificial Intelligence (AI) techniques were suggested to achieve these goals. The applicability of certain AI techniques is explored in the context of distributed problem solving systems and the pilot land data system (PLDS). The topics discussed include: PLDS and its data processing requirements, expert systems and PLDS, distributed problem solving systems, AI problem solving paradigms, query processing, and distributed databases.
Gaia Data Processing Architecture
Gaia is ESA's ambitious space astrometry mission, the main objective of which
is to astrometrically and spectro-photometrically map 1000 million celestial
objects (mostly in our galaxy) with unprecedented accuracy. The announcement of
opportunity for the data processing will be issued by ESA late in 2006. The
Gaia Data Processing and Analysis Consortium (DPAC) has been formed recently
and is preparing an answer. The satellite will downlink close to 100 TB of raw
telemetry data over 5 years. To achieve its required astrometric accuracy of a
few tens of microarcseconds, highly involved processing of this data is
required.
In addition to the main astrometric instrument Gaia will host a Radial
Velocity instrument, two low-resolution dispersers for multi-color photometry
and two Star Mappers. Gaia is a flying Giga Pixel camera. The various
instruments each require relatively complex processing while at the same time
being interdependent. We describe the overall composition of the DPAC and the
envisaged overall architecture of the Gaia data processing system. We shall
delve further into the core processing - one of the nine, so-called,
coordination units comprising the Gaia processing system.
Comment: 10 pages, 2 figures. To appear in ADASS XVI Proceedings
Towards Trusted Data Processing for Information and Intelligence Systems
Data is a valued asset, and its security is essential for any enterprise or organization. This paper introduces Trusted Data Processing (TDP) and addresses three fundamental questions: 1) what are the essential requirements to achieve TDP? 2) what security mechanisms and safeguards are available to ensure TDP? 3) how can TDP be integrated into practice? Based on the attacks targeting data assets and their consequences, the requirements for achieving TDP -- including data security, data privacy, accountability, transparency, distributed computing, and trusted elements -- are identified. Available security mechanisms and safeguards to ensure TDP are discussed. The paper also summarizes the challenges in achieving TDP and provides practical guidance for achieving it through integration with the NIST Cybersecurity Framework.