497 research outputs found

    Pioneering the use of a plenoptic Adaptive Optics system for Free Space Optical Communications

    Doctoral thesis in Astrophysics, defended on 30 September 2019. In this thesis, an Adaptive Optics proposal is presented and experimentally verified both in the laboratory and at the telescope, with the objective of compensating the atmospheric aberrations in the uplink beam and thereby improving the performance of Free Space Optical Communication links and the generation of Laser Guide Stars for conventional AO systems. The research focuses on the active correction of ground-to-space laser beams (optical links and artificial stars). Downlink communications resemble conventional astronomical observations when applying Adaptive Optics techniques: the light originates in space and travels downwards through the atmosphere to the receiver (where the AO system would be placed), whereas the uplink must be corrected before exiting the launch telescope, by measuring the atmospheric wavefront with an a priori unknown reference source. Uplink pre-compensation therefore entails a scientific and technological challenge. The uplink correction problem was studied in depth by formulating the possible solutions, which were modelled and simulated with an existing Adaptive Optics Matlab toolbox into which new functionalities were coded and integrated (upwards Fresnel propagation, a new-concept wavefront sensor, etc.). Based on the simulation outcomes, requirements were formulated for the design of an uplink-corrector AO system, from the optical elements to the control strategy. After the hardware was acquired (both COTS elements and custom-built components), the uplink-corrector laboratory-scale prototype was built and integrated at the IAC laboratory facilities. Finally, from January to May 2019, the Uplink Wavefront Corrector System was integrated at the Optical Ground Station telescope at Teide Observatory, successfully demonstrating uplink pre-compensation of the laser beam.
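    The simulation work above relies on upwards Fresnel propagation of the laser beam through the atmosphere. As a rough sketch of the kind of numerical step involved (not the thesis toolbox itself), free-space propagation of a sampled beam can be written with the angular-spectrum method; the grid size, sampling, and wavelength below are arbitrary assumptions:

    ```python
    import numpy as np

    def angular_spectrum_propagate(field, wavelength, dx, dz):
        """Propagate a sampled complex field a distance dz through free space.

        field: square complex array (the beam at the launch aperture)
        wavelength, dx, dz: metres (grid spacing dx, propagation distance dz)
        """
        n = field.shape[0]
        fx = np.fft.fftfreq(n, d=dx)                 # spatial frequencies
        fx2 = fx[:, None] ** 2 + fx[None, :] ** 2    # fx^2 + fy^2 grid
        arg = 1.0 - (wavelength ** 2) * fx2          # 1 - (lambda*f)^2
        kz = (2 * np.pi / wavelength) * np.sqrt(np.maximum(arg, 0.0))
        transfer = np.exp(1j * kz * dz) * (arg > 0)  # drop evanescent waves
        return np.fft.ifft2(np.fft.fft2(field) * transfer)

    # Hypothetical example: a collimated Gaussian beam propagated 100 m
    n, dx, wavelength = 128, 1e-3, 1.064e-6
    x = (np.arange(n) - n // 2) * dx
    r2 = x[:, None] ** 2 + x[None, :] ** 2
    beam = np.exp(-r2 / (2 * (8 * dx) ** 2)).astype(complex)
    out = angular_spectrum_propagate(beam, wavelength, dx, 100.0)
    ```

    Because the transfer function is a pure phase for all propagating components, the step conserves beam power, which is a convenient sanity check for this kind of simulation.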

    A Technology for Deblurring Astronomical Images

    This paper focuses on a specific technology for deblurring astronomical images. Data are collected as digital matrices of image pixels, and the technology reconstructs the object via computational manipulation of the blurred image. Dr Stuart Jefferies, Professor in the Department of Physics and Astronomy at GSU, together with several colleagues, developed a sequence of techniques over three decades, based on a single, powerful idea. Supported for a time by the US Air Force AMOS project and drawing on compatible ideas from physics, engineering, mathematics and computer science, they learned to extract more and more useful information from the raw data. Jefferies took the idea of blind deconvolution and, with his network, developed it into a powerful and adaptable tool for scientific knowledge making and an efficient, effective applied technology for an important national security mission. We locate this work in the very long tradition of astronomical image making and in the more recent context of big science projects funded by government for military necessities. Galileo is celebrated as the first person to publish hand-drawn images from observing the night sky through a telescope. Indeed, hand drawings were the only way to convey images of the night sky for over two hundred years, until the development of photography in the mid-1800s; photography then dominated astronomical image making for another hundred years. Today, we live in an era of huge sets of digital data, and the infrastructure required to generate, transmit and analyze such data requires huge budgets; hence the need for public funding. In addition to scientific knowledge making and applied technology development, we note that an important legacy of big, government-funded science projects can be the building of institutions that outlive the original project but carry on related work. The GSU Imaging Innovation Hub, championed by Jefferies, follows that tradition.
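    Jefferies' multi-frame blind-deconvolution pipeline is far more elaborate than any textbook example, but the non-blind step that blind deconvolution alternates can be illustrated with the classical Richardson-Lucy update. This is a generic sketch, and the 1-D "star field" and PSF below are invented:

    ```python
    import numpy as np

    def circ_conv(a, b):
        """Circular convolution via the FFT."""
        return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

    def richardson_lucy(blurred, psf, iterations=200):
        """Richardson-Lucy deconvolution with a known PSF (non-blind step).

        Blind deconvolution alternates multiplicative updates of this kind
        for both the object estimate and the PSF estimate.
        """
        psf_adj = np.roll(psf[::-1], 1)      # adjoint kernel (circular flip)
        estimate = np.full_like(blurred, blurred.mean())
        for _ in range(iterations):
            ratio = blurred / (circ_conv(estimate, psf) + 1e-12)
            estimate = estimate * circ_conv(ratio, psf_adj)
        return estimate

    # Toy 1-D "star field": two point sources blurred by a 3-tap kernel
    truth = np.zeros(64)
    truth[20], truth[40] = 1.0, 0.5
    psf = np.zeros(64)
    psf[[63, 0, 1]] = [0.2, 0.6, 0.2]        # normalized blur kernel
    blurred = circ_conv(truth, psf)
    restored = richardson_lucy(blurred, psf)
    ```

    The multiplicative form keeps the estimate non-negative, which is one reason Richardson-Lucy variants are popular in astronomy, where the object is a non-negative intensity map.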

    Interactive Multidimensional Modeling of Linked Data for Exploratory OLAP

    Exploratory OLAP aims at coupling the precision and detail of corporate data with the information wealth of LOD. While some techniques to create, publish, and query RDF cubes are already available, little has been said about how to contextualize these cubes with situational data in an on-demand fashion. In this paper we describe an approach, called iMOLD, that enables non-technical users to enrich an RDF cube with multidimensional knowledge by discovering aggregation hierarchies in LOD. This is done through a user-guided process that recognizes in the LOD the recurring modeling patterns that express roll-up relationships between RDF concepts, then translates these patterns into aggregation hierarchies to enrich the RDF cube. Two families of aggregation patterns are identified, based on associations and on generalization respectively, and the algorithms for recognizing them are described. To evaluate iMOLD in terms of efficiency and effectiveness, we compare it with a related approach in the literature, propose a case study based on DBpedia, and discuss the results of a test made with real users.
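    The association-based family of patterns essentially looks for predicates that behave as many-to-one relations between concepts, which can then serve as roll-up edges. A minimal in-memory sketch of that check (a stand-in for the paper's actual LOD/SPARQL-based discovery; the triples and predicate names are invented):

    ```python
    from collections import defaultdict

    def rollup_predicates(triples):
        """Return predicates in which every subject maps to exactly one object.

        Such many-to-one (functional) predicates are candidates for
        association-based aggregation hierarchies (e.g. city -> country).
        """
        subj_objs = defaultdict(lambda: defaultdict(set))
        for s, p, o in triples:
            subj_objs[p][s].add(o)
        return {p for p, by_subj in subj_objs.items()
                if all(len(objs) == 1 for objs in by_subj.values())}

    # Invented toy triples: locatedIn is many-to-one, influencedBy is not
    triples = [
        ("Turin", "locatedIn", "Italy"),
        ("Lyon", "locatedIn", "France"),
        ("Rome", "locatedIn", "Italy"),
        ("Dante", "influencedBy", "Virgil"),
        ("Dante", "influencedBy", "Aquinas"),
    ]
    candidates = rollup_predicates(triples)
    ```

    In a real LOD setting the functionality test would be approximate (tolerating some noise) and user-guided, as the abstract describes, rather than the exact check shown here.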

    Validation and Verification of Safety-Critical Systems in Avionics

    This research addresses the verification and validation of safety-critical systems. Safety-critical systems such as avionics systems are complex embedded systems: they are composed of several hardware and software components whose integration requires verification and testing in compliance with the Radio Technical Commission for Aeronautics standards and their supplements (RTCA DO-178C). Avionics software requires certification before its deployment into an aircraft system, and testing is mandatory for certification. Until now, the avionics industry has relied on expensive manual testing, and it is searching for better (quicker and less costly) solutions. This research investigates formal verification and automatic test case generation approaches to enhance the quality of avionics software systems, ensure their conformity to the standard, and provide artifacts that support their certification. The contributions of this thesis are model-based automatic test case generation approaches that satisfy the MC/DC criterion, and bidirectional requirements traceability between low-level requirements (LLRs) and test cases. In the first contribution, we integrate model-based verification of properties and automatic test case generation in a single framework. The system is modeled as an extended finite state machine (EFSM) that supports both the verification of properties and automatic test case generation, capturing the control and data-flow aspects of the system. For verification, we model the system and some properties and ensure that the properties are correctly propagated to the implementation via mandatory testing. For testing, we extend an existing test case generation approach with the MC/DC criterion to satisfy RTCA DO-178C requirements; both local test cases for each component and global test cases for their integration are generated. The second contribution is a model checking-based approach for automatic test case generation.
    In the third contribution, we developed an EFSM-based approach that uses constraint solving to handle test case feasibility and addresses bidirectional requirements traceability between LLRs and test cases. Traceability elements are determined at a low level of granularity, and then identified, linked to their source artifact, created, stored, and retrieved for several purposes. Requirements traceability has been extensively studied, but not at the proposed low level of granularity.
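    The thesis targets the MC/DC criterion over EFSMs; as a much simpler illustration of model-based test case generation, the sketch below derives transition-coverage tests for a toy state machine. The mode logic is invented, and real MC/DC generation would additionally enumerate guard-condition combinations and handle data constraints:

    ```python
    from collections import defaultdict, deque

    def transition_coverage_tests(initial, transitions):
        """One shortest event sequence that fires each reachable transition.

        transitions: list of (source_state, event, target_state).
        This achieves transition coverage only; MC/DC would further require
        showing each guard condition independently affects the outcome.
        """
        adj = defaultdict(list)
        for src, event, dst in transitions:
            adj[src].append((event, dst))
        # BFS: shortest event path from the initial state to every state
        path = {initial: []}
        queue = deque([initial])
        while queue:
            state = queue.popleft()
            for event, dst in adj[state]:
                if dst not in path:
                    path[dst] = path[state] + [event]
                    queue.append(dst)
        return [path[src] + [event]
                for src, event, dst in transitions if src in path]

    # Invented toy mode logic (not from the thesis)
    machine = [
        ("Off", "power_on", "Standby"),
        ("Standby", "arm", "Armed"),
        ("Armed", "engage", "Active"),
        ("Active", "disengage", "Standby"),
        ("Standby", "power_off", "Off"),
    ]
    tests = transition_coverage_tests("Off", machine)
    ```

    Each generated test is an event sequence executable against the implementation, which is the kind of artifact DO-178C certification evidence is built from.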

    Memory-Based Grammatical Relation Finding


    The Development And Evaluation Of Personalized Learning Material Based On A Profiling Algorithm For Polytechnic Students In Learning Algebra

    Mathematics is the foundation of engineering studies, especially for engineering students at Malaysian polytechnics, and algebra is a key topic in mathematics, particularly for engineering programmes. Previous research shows that personalized learning techniques can improve student understanding. This study therefore set out to design and develop an application that utilizes Intelligent Tutoring System (ITS) technology for the personalization of mathematics learning. The technology supports personalized learning by recommending the most suitable learning materials.
    The recommendation is computed with a Case-Based Reasoning (CBR) algorithm that measures the similarity between the newly submitted profile and the profiles stored in the database; the solution attached to the most similar case is used as a reference. A student's learning style and prior knowledge are the two parameters that form the student profile. Two versions of the test material were built: Personalized Learning (PTD), which refers students to the profile with the highest similarity value, and Non-Personalized Learning (PBTD), which refers to the profile with the lowest similarity value.
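    The CBR retrieval step described above can be sketched as a weighted nearest-neighbour lookup. The attribute names, weights, and toy case base below are assumptions for illustration, not the study's actual parameters:

    ```python
    def profile_similarity(new, stored, weights):
        """Weighted similarity between two student profiles, in [0, 1].

        The categorical attribute (learning style) matches exactly (1 or 0);
        the numeric attribute (prior knowledge, 0-100) uses 1 minus the
        normalized distance. Names and weights are assumptions.
        """
        score = weights["learning_style"] * (
            1.0 if new["learning_style"] == stored["learning_style"] else 0.0)
        score += weights["prior_knowledge"] * (
            1.0 - abs(new["prior_knowledge"] - stored["prior_knowledge"]) / 100.0)
        return score

    def retrieve_best_case(new, case_base, weights):
        """CBR retrieval: the stored case most similar to the new profile."""
        return max(case_base, key=lambda c: profile_similarity(new, c, weights))

    weights = {"learning_style": 0.5, "prior_knowledge": 0.5}
    case_base = [
        {"id": 1, "learning_style": "visual", "prior_knowledge": 80,
         "materials": "video-first sequence"},
        {"id": 2, "learning_style": "verbal", "prior_knowledge": 30,
         "materials": "worked-text sequence"},
    ]
    new_student = {"learning_style": "visual", "prior_knowledge": 75}
    best = retrieve_best_case(new_student, case_base, weights)
    ```

    The non-personalized (PBTD) variant would simply take the minimum-similarity case instead of the maximum.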

    The influence of knowledge sharing on performance among Malaysian public sector managers and the moderating role of individual personality

    There have been recent calls for further research into the sharing of managerial tacit knowledge to enhance individual and organisational performance. A lack of knowledge about current knowledge sharing practices, especially in developing countries, motivated this research. The study examines the roles of personality traits in facilitating knowledge sharing practices and managerial tacit knowledge transfer among managers working in high- and low-performance local governments. Specifically, it examines the direct relationship between knowledge sharing practices and tacit knowledge among 308 managers working in local governments. Secondly, it explores the differences in knowledge sharing practices, tacit knowledge and individual performance between managers working in high- and low-performance local governments. Thirdly, it explores the role of personality traits as moderators of the relationship of knowledge sharing practices and tacit knowledge with individual performance. A triangulation approach combining a questionnaire and interviews was used. The questionnaire was distributed to middle managers of 35 Malaysian local governments engaged in a Star Rating System. Of the 358 completed questionnaires returned, 308 were usable. To support the results from the quantitative data, semi-structured interviews were conducted with 8 managers from Malaysian local governments of high and low performance, representing the 4 main categories: City Hall, City Council, Municipal Council, and District Council. The results provided general support for the majority of the study's hypotheses. Specifically, mentoring programme (competence), individual codification, institutional personalization and institutional codification were related to managerial tacit knowledge transfer.
    Tacit knowledge associated with managing oneself, managing tasks and managing others was significantly related to knowledge sharing practices. Unexpectedly, there were no significant differences in knowledge sharing practices, levels of accumulated managerial tacit knowledge, or individual performance between high- and low-performance local governments. Finally, the results indicated that the agreeableness dimension of personality interacted with mentoring programmes to predict individual performance. Furthermore, the agreeableness and conscientiousness dimensions interacted with tacit knowledge associated with managing oneself and managing tasks to influence individual performance, and the openness dimension interacted with tacit knowledge associated with managing others to influence individual performance. This study adds to the limited body of empirical research on knowledge management, particularly within the Malaysian public sector, and represents a comprehensive survey and explanation of knowledge management in Malaysia. The relationships between knowledge sharing practices and tacit knowledge variables, and their interaction with sub-traits of personality in predicting individual performance, suggest that it would be beneficial for the Ministry of Housing and Local Government in Malaysia to manage tacit knowledge as a way of enhancing individual performance. Contributions to theory and practice, limitations and implications of the study are discussed.

    Framework for Interoperable and Distributed Extraction-Transformation-Loading (ETL) Based on Service Oriented Architecture

    Extraction, Transformation and Loading (ETL) are the major functionalities in data warehouse (DW) solutions. Lack of component distribution and interoperability is a gap that leads to many problems in the ETL domain, due to the tightly-coupled components of the current ETL framework. This research discusses how to distribute the Extraction, Transformation and Loading components so as to achieve distribution and interoperability of these ETL components, and shows how the ETL framework can be extended. To achieve that, Service Oriented Architecture (SOA) is adopted to address the missing features of distribution and interoperability by restructuring the current ETL framework. This research contributes to the field of ETL by adding the concepts of distribution and interoperability to the ETL framework, which in turn contributes to the area of data warehousing and business intelligence, because ETL is a core concept in this area. The Design Science Approach (DSA) and Scrum methodologies were adopted for achieving the research goals; their integration provides suitable methods for achieving the research objectives. The new ETL framework is realized by developing and testing a prototype based on it. This prototype was successfully evaluated in three case studies conducted with the data and tools of three different organizations, which use data warehouse solutions to generate statistical reports that help their top management make decisions. Results of the case studies show that distribution and interoperability can be achieved by using the new ETL framework.
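    The loose coupling the framework aims for can be illustrated in miniature: each ETL stage hides behind a uniform interface and exchanges plain records, so any stage could be swapped for a remote (SOA) service without touching the others. All names and data below are invented for the sketch:

    ```python
    from typing import Callable, Iterable, List

    Record = dict

    def extract() -> List[Record]:
        """Extraction service: in reality a wrapper over a source system."""
        return [{"name": " Alice ", "sales": "120"},
                {"name": "Bob", "sales": "85"}]

    def transform(rows: Iterable[Record]) -> List[Record]:
        """Transformation service: cleaning and typing, with no knowledge
        of where the rows came from."""
        return [{"name": r["name"].strip(), "sales": int(r["sales"])}
                for r in rows]

    def load(rows: Iterable[Record]) -> List[Record]:
        """Loading service: here just an in-memory warehouse table."""
        warehouse: List[Record] = []
        warehouse.extend(rows)
        return warehouse

    def run_pipeline(stages: List[Callable]):
        """Each stage sees only the previous stage's output, so stages can
        be distributed (e.g. exposed as web services) independently."""
        data = stages[0]()
        for stage in stages[1:]:
            data = stage(data)
        return data

    table = run_pipeline([extract, transform, load])
    ```

    In the SOA realization each callable would sit behind a service endpoint exchanging a neutral format such as XML or JSON, which is what gives the framework its interoperability.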