
    A Comprehensive Survey of Deep Learning in Remote Sensing: Theories, Tools and Challenges for the Community

    In recent years, deep learning (DL), a re-branding of neural networks (NNs), has risen to the top in numerous areas, namely computer vision (CV), speech recognition, natural language processing, etc. Whereas remote sensing (RS) possesses a number of unique challenges, primarily related to sensors and applications, RS inevitably draws from many of the same theories as CV, e.g., statistics, fusion, and machine learning, to name a few. This means that the RS community should be aware of, if not at the leading edge of, advancements like DL. Herein, we provide the most comprehensive survey of state-of-the-art RS DL research. We also review recent new developments in the DL field that can be used in DL for RS. Namely, we focus on theories, tools and challenges for the RS community. Specifically, we focus on unsolved challenges and opportunities as they relate to (i) inadequate data sets, (ii) human-understandable solutions for modelling physical phenomena, (iii) Big Data, (iv) non-traditional heterogeneous data sources, (v) DL architectures and learning algorithms for spectral, spatial and temporal data, (vi) transfer learning, (vii) an improved theoretical understanding of DL systems, (viii) high barriers to entry, and (ix) training and optimizing the DL.
    Comment: 64 pages, 411 references. To appear in Journal of Applied Remote Sensing
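    The survey names transfer learning (challenge (vi)) as one route around inadequate labeled data sets (challenge (i)). A minimal sketch of that idea, not taken from the survey: reusing an ImageNet-pretrained CNN as a frozen feature extractor and training only a new classification head for a hypothetical land-cover task. The class count, batch, and random data are illustrative assumptions.

```python
# Hedged sketch of transfer learning for remote-sensing scene classification.
# The 10-class head and the random batch are assumptions, not from the survey.
import torch
import torch.nn as nn
from torchvision import models

NUM_RS_CLASSES = 10  # hypothetical number of land-cover classes

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze the pretrained backbone; only the new head is trained, which is
# what makes the approach viable with small labeled RS data sets.
for param in model.parameters():
    param.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, NUM_RS_CLASSES)

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# One illustrative training step on a random batch standing in for
# labeled remote-sensing imagery (3-channel, 224x224 chips).
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, NUM_RS_CLASSES, (8,))
optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
print(f"loss: {loss.item():.4f}")
```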

    An Overview on Application of Machine Learning Techniques in Optical Networks

    Today's telecommunication networks have become sources of enormous amounts of widely heterogeneous data. This information can be retrieved from network traffic traces, network alarms, signal quality indicators, users' behavioral data, etc. Advanced mathematical tools are required to extract meaningful information from these data and to take decisions pertaining to the proper functioning of the networks. Among these tools, Machine Learning (ML) is regarded as one of the most promising methodological approaches to perform network-data analysis and to enable automated network self-configuration and fault management. The adoption of ML techniques in the field of optical communication networks is motivated by the unprecedented growth in network complexity that optical networks have faced in the last few years. This complexity increase is due to the introduction of a huge number of adjustable and interdependent system parameters (e.g., routing configurations, modulation format, symbol rate, coding schemes, etc.) enabled by coherent transmission/reception technologies, advanced digital signal processing, and the compensation of nonlinear effects in optical fiber propagation. In this paper, we provide an overview of the application of ML to optical communications and networking. We classify and survey the relevant literature, and we also provide an introductory tutorial on ML for researchers and practitioners interested in this field. Although a good number of research papers have recently appeared, the application of ML to optical networks is still in its infancy; to stimulate further work in this area, we conclude the paper by proposing new possible research directions.
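    One of the adjustable parameters the abstract mentions is the modulation format. A minimal sketch of the kind of supervised model such surveys discuss, not the paper's own method: a classifier that picks a modulation format from signal-quality indicators. The features (OSNR, link length), the labeling rule, and all thresholds are synthetic assumptions for illustration only.

```python
# Hedged sketch: predicting a modulation format from monitoring data.
# Everything below (features, thresholds, data) is invented for illustration.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000

# Assumed monitoring features: OSNR (dB) and link length (km).
osnr = rng.uniform(10, 30, n)
length = rng.uniform(100, 2000, n)
X = np.column_stack([osnr, length])

# Toy labeling rule standing in for real transmission-feasibility data:
# higher OSNR and shorter links tolerate denser constellations.
margin = osnr - 0.005 * length
y = np.digitize(margin, [14, 20])  # 0 = QPSK, 1 = 16QAM, 2 = 64QAM

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```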

    Intelligent Management and Efficient Operation of Big Data

    This chapter details how Big Data can be used and implemented in networking and computing infrastructures. Specifically, it addresses three main aspects: the timely extraction of relevant knowledge from heterogeneous, and very often unstructured, large data sources; the enhancement of the performance of the processing and networking (cloud) infrastructures that are the most important foundational pillars of Big Data applications and services; and novel ways to efficiently manage network infrastructures with high-level composed policies, supporting the transmission of large amounts of data with distinct requirements (video vs. non-video). A case study involving an intelligent management solution to route data traffic with diverse requirements in a wide-area Internet Exchange Point is presented, discussed in the context of Big Data, and evaluated.
    Comment: In the book Handbook of Research on Trends and Future Directions in Big Data and Web Intelligence, IGI Global, 201
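    A minimal sketch of the high-level composed-policy idea the chapter describes, steering a traffic class (video vs. non-video) onto a path that satisfies its requirements. The policy fields, path names, and bandwidth figures are assumptions, not the chapter's actual rule set.

```python
# Hedged sketch: class-based path selection driven by high-level policies.
# Policy fields and path names are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Policy:
    traffic_class: str      # e.g. "video" or "default"
    min_bandwidth_mbps: int
    preferred_path: str

POLICIES = [
    Policy("video", min_bandwidth_mbps=50, preferred_path="low-latency-path"),
    Policy("default", min_bandwidth_mbps=1, preferred_path="best-effort-path"),
]

def select_path(traffic_class: str, available_mbps: dict) -> str:
    """Return the first matching policy's path whose bandwidth requirement
    is currently satisfied; otherwise fall back to best effort."""
    for p in POLICIES:
        if (p.traffic_class == traffic_class
                and available_mbps.get(p.preferred_path, 0) >= p.min_bandwidth_mbps):
            return p.preferred_path
    return "best-effort-path"

print(select_path("video", {"low-latency-path": 120, "best-effort-path": 900}))
```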

    Mathematical and computer modeling of electro-optic systems using a generic modeling approach

    The conventional approach to modelling electro-optic sensor systems is to develop separate models for individual systems or classes of system, depending on the detector technology employed in the sensor and on the application. However, this ignores the commonality in the design and components of these systems. A generic approach is presented for modelling a variety of sensor systems operating in the infrared waveband, one that also allows systems to be modelled at different levels of detail and at different stages of the product lifecycle. Providing different model types (parametric and image-flow descriptions) within the generic framework allows valuable insights to be gained.
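    A minimal sketch of what a generic framework with interchangeable model types could look like: one abstract sensor-model interface with a coarse parametric description and a more detailed image-flow description behind it. All class and method names, and the toy figure of merit, are assumptions, not taken from the paper.

```python
# Hedged sketch: one generic interface, two levels of modelling detail.
# Names and the scoring logic are illustrative assumptions.
from abc import ABC, abstractmethod
import numpy as np
from scipy.signal import convolve2d

class SensorModel(ABC):
    """Common interface so systems can be modelled at different levels of
    detail within the same framework."""
    @abstractmethod
    def predict(self, scene: np.ndarray) -> float:
        ...

class ParametricModel(SensorModel):
    """Early-lifecycle description: a figure of merit from a few parameters."""
    def __init__(self, noise_equiv_temp_diff: float):
        self.netd = noise_equiv_temp_diff
    def predict(self, scene: np.ndarray) -> float:
        # Crude signal-to-noise estimate from scene contrast and detector NETD.
        return float(scene.std() / self.netd)

class ImageFlowModel(SensorModel):
    """Later-lifecycle description: simulate the sensed image, then score it."""
    def __init__(self, blur_kernel: np.ndarray):
        self.kernel = blur_kernel
    def predict(self, scene: np.ndarray) -> float:
        image = convolve2d(scene, self.kernel, mode="same")
        return float(image.std())

# A random thermal scene (kelvin) run through both model types.
scene = np.random.default_rng(0).normal(300.0, 2.0, (64, 64))
for model in (ParametricModel(0.05), ImageFlowModel(np.full((3, 3), 1 / 9))):
    print(type(model).__name__, model.predict(scene))
```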

    A Versatile Sensor Data Processing Framework for Resource Technology

    Novel sensors with the ability to collect qualitatively new information offer the potential to improve experimental infrastructure and methods in the field of research technology. In order to get full access to this information, the entire range from detector readout and data transfer, over proper data and knowledge models, up to complex application functions has to be covered. The extension of existing scientific instruments comprises the integration of diverse sensor information into existing hardware, based on the expansion of pivotal event schemes and data models. Due to its flexible approach, the proposed framework has the potential to integrate additional sensor types and offers migration capabilities to high-performance computing platforms. Two different implementation setups prove the flexibility of this approach: one extends the material-analysis capabilities of a secondary ion mass spectrometry device, the other implements a functional prototype for the online analysis of recyclate. Both setups can be regarded as complementary parts of the same emerging application field. The requirements and possibilities resulting from different hardware concepts on the one hand and diverse application fields on the other are the basis for the development of a versatile software framework. In order to support complex and efficient application functions under heterogeneous and flexible technical conditions, a software technology is proposed that offers modular processing pipeline structures with internal and external data interfaces, backed by a knowledge base with corresponding configuration and conclusion mechanisms.
    Outline: 1. Introduction 2. Hardware Architecture and Application Background 3. Software Concept 4. Experimental Results 5. Conclusion and Outlook
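    A minimal sketch of the modular-pipeline idea the abstract describes: processing stages chained between a sensor interface and an output, parameterized from a small knowledge base. All names and the stand-in readout data are illustrative assumptions, not the framework's real API.

```python
# Hedged sketch: composable processing stages configured from a knowledge base.
# Stage names, event fields, and parameters are illustrative assumptions.
from typing import Callable, Iterable

Stage = Callable[[dict], dict]

def make_pipeline(stages: Iterable[Stage]) -> Stage:
    """Compose stages into one callable; each stage maps event -> event."""
    def run(event: dict) -> dict:
        for stage in stages:
            event = stage(event)
        return event
    return run

# Knowledge-base entry: configuration that parameterizes the stages.
KNOWLEDGE_BASE = {"threshold": 0.8, "scale": 2.0}

def read_sensor(event: dict) -> dict:
    event["raw"] = [0.1, 0.5, 0.9]  # stand-in for detector readout
    return event

def calibrate(event: dict) -> dict:
    s = KNOWLEDGE_BASE["scale"]
    event["calibrated"] = [x * s for x in event["raw"]]
    return event

def detect(event: dict) -> dict:
    t = KNOWLEDGE_BASE["threshold"]
    event["hits"] = [x for x in event["calibrated"] if x > t]
    return event

pipeline = make_pipeline([read_sensor, calibrate, detect])
print(pipeline({"timestamp": 0}))
```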