292 research outputs found

    Multimedia authoring, development environments, and digital video editing

    Multimedia systems integrate text, audio, video, graphics, and other media and allow them to be utilized in a combined and interactive manner. Using this exciting and rapidly developing technology, multimedia applications can provide extensive benefits in a variety of arenas, including research, education, medicine, and commerce. While there are many commercial multimedia development packages, the easy and fast creation of a useful, full-featured multimedia document is not yet a straightforward task. This paper addresses issues in the development of multimedia documents, ranging from user-interface tools that manipulate multimedia documents to multimedia communication technologies such as compression, digital video editing, and information retrieval. It outlines the basic steps in the multimedia authoring process and some of the requirements that need to be met by multimedia development environments. It also presents the role of video, an essential component of multimedia systems, and the role of programming in digital video editing. A model is described for remote access of distributed video. The paper concludes with a discussion of future research directions and new uses of multimedia documents.

    Sentinel: a co-designed platform for semantic enrichment of social media streams

    We introduce the Sentinel platform that supports semantic enrichment of streamed social media data for the purposes of situational understanding. The platform is the result of a co-design effort between computing and social scientists, iteratively developed through a series of pilot studies. The platform is founded upon a knowledge-based approach, in which input streams (channels) are characterized by spatial and terminological parameters, collected media is preprocessed to identify significant terms (signals), and data are tagged (framed) in relation to an ontology. Interpretation of processed media is framed in terms of the 5W framework (who, what, when, where, and why). The platform is designed to be open to the incorporation of new processing modules, building on the knowledge-based elements (channels, signals, and framing ontology) and accessible via a set of user-facing apps. We present the conceptual architecture for the platform, discuss the design and implementation challenges of the underlying stream-processing system, and present a number of apps developed in the context of the pilot studies, highlighting the strengths and importance of the co-design approach and indicating promising areas for future research.
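    The channel/signal/framing pipeline described in the abstract can be sketched roughly as follows. This is a minimal illustration of the knowledge-based idea, not the platform's actual API: the names (`Channel`, `tag_post`) and the toy ontology are invented for the example.

    ```python
    # Minimal sketch of knowledge-based enrichment: a channel defines spatial
    # and terminological parameters, significant terms (signals) are detected,
    # and signals are framed against a 5W ontology. All names are hypothetical.
    from dataclasses import dataclass, field

    @dataclass
    class Channel:
        # A channel is characterized by spatial and terminological parameters.
        location: str
        terms: set = field(default_factory=set)

    # Toy framing ontology: maps signals to 5W frame slots (who/what/when/where/why).
    ONTOLOGY = {
        "flood": {"what": "flooding-event"},
        "cardiff": {"where": "Cardiff"},
        "yesterday": {"when": "relative-past"},
    }

    def tag_post(channel: Channel, text: str) -> dict:
        """Identify signals in a post and frame them against the ontology."""
        tokens = {t.strip(".,!?").lower() for t in text.split()}
        signals = tokens & (channel.terms | set(ONTOLOGY))
        frame = {}
        for s in signals:
            frame.update(ONTOLOGY.get(s, {}))
        return {"signals": sorted(signals), "frame": frame}

    ch = Channel(location="Cardiff", terms={"flood"})
    print(tag_post(ch, "Flood near Cardiff yesterday!"))
    # → {'signals': ['cardiff', 'flood', 'yesterday'],
    #    'frame': {'what': 'flooding-event', 'where': 'Cardiff', 'when': 'relative-past'}}
    ```

    New processing modules would slot in as further transformations over the same channel/signal/frame structures.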

    Genesis Failure Investigation Report

    The Genesis mission to collect solar-wind samples and return them to Earth for detailed analysis proceeded successfully for 3.5 years. During reentry on September 8, 2004, a failure in the entry, descent, and landing sequence resulted in a crash landing of the Genesis sample return capsule. This document describes the findings of the avionics sub-team that supported the accident investigation of the JPL Failure Review Board.

    SA-FEMIP: A Self-Adaptive Features Extractor and Matcher IP-Core Based on Partially Reconfigurable FPGAs for Space Applications

    Video-based navigation (VBN) is increasingly used in space applications to enable autonomous entry, descent, and landing of spacecraft. VBN algorithms require real-time performance and high computational capabilities, especially to perform features extraction and matching (FEM). In this context, field-programmable gate arrays (FPGAs) can be employed as efficient hardware accelerators. This paper proposes an improved FPGA-based FEM module. Online self-adaptation of the parameters of both the image noise filter and the features extraction algorithm is adopted to improve the algorithm's robustness. Experimental results demonstrate the effectiveness of the proposed self-adaptive module. It introduces a marginal resource overhead and no timing performance degradation when compared with the reference state-of-the-art architecture.
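    The online self-adaptation idea can be illustrated in software: estimate the noise of the current frame, then scale the noise-filter strength and the feature-detector threshold accordingly. This sketch is not the SA-FEMIP hardware design; the noise proxy and the scaling law are hypothetical choices for illustration only.

    ```python
    # Illustrative sketch of online parameter self-adaptation (not the actual
    # SA-FEMIP architecture): noisier frames get stronger filtering and a
    # higher feature-detection threshold.
    import statistics

    def estimate_noise(pixels):
        """Crude noise proxy: population standard deviation of intensities."""
        return statistics.pstdev(pixels)

    def adapt_parameters(pixels, base_sigma=1.0, base_threshold=20.0):
        """Return (filter_sigma, detector_threshold) tuned to measured noise.
        The linear scaling law here is a hypothetical example choice."""
        noise = estimate_noise(pixels)
        scale = 1.0 + noise / 64.0   # noisier frame -> stronger smoothing
        return base_sigma * scale, base_threshold * scale

    quiet = [100, 101, 99, 100] * 16   # low-noise synthetic frame
    noisy = [20, 200, 40, 180] * 16    # high-noise synthetic frame
    assert adapt_parameters(quiet)[0] < adapt_parameters(noisy)[0]
    ```

    In hardware, the same loop would run per frame between the filter stage and the feature-extraction stage, which is why the paper stresses that the adaptation adds only marginal resource overhead.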

    Digital Image Access & Retrieval

    The 33rd Annual Clinic on Library Applications of Data Processing, held at the University of Illinois at Urbana-Champaign in March 1996, addressed the theme of "Digital Image Access & Retrieval." The papers from this conference cover a wide range of topics concerning digital imaging technology for visual resource collections. Papers covered three general areas: (1) systems, planning, and implementation; (2) automatic and semi-automatic indexing; and (3) preservation, with the bulk of the conference focusing on indexing and retrieval.

    The Development of Digital Forensics Workforce Competency on the Example of Estonian Defence League

    On 3 July 2014, Regulation No. 108 was introduced, which regulates the conditions and procedure of the involvement of the Estonian Defence League (EDL) Cyber Defence Unit (CDU) in ensuring cyber security.
    This means that the EDL can be brought in by the Information System Authority, the Ministry of Defence, or the authorities of its area of government within the scope of their tasks, e.g. ensuring the continuity of information and communication technology infrastructure and handling and solving cyber security incidents while applying both active and passive measures. In January 2018 the EDL CDU's Digital Evidence Handling Group had to be re-organized and, thus, presented a proposal for an internal curriculum in order to further instruct Digital Evidence specialists. While mapping the CDU's tasks, it was noted that the CDU's partner institutions and organizations have not mapped out their specialists' current competencies. With this in mind, we set out to create a comprehensive list of needs and constraints (taking into account the community standards of DF) to develop a DF-based competence framework that supports the development of CDU professionals. Hence, we studied the current situation of the CDU and its existing training program, and considered which features needed to be explored for further development. In order to assemble comparable results and to achieve the goal, the model had to be able to solve the following 5 tasks: 1. Competency mapping, 2. Goal setting and reassessment, 3. Scheduling the training plan, 4. Accelerating the recruitment process, and 5. Promoting the continuous development of professionals. The framework was developed on the basis of the National Initiative for Cybersecurity Education (NICE) Cybersecurity Workforce Framework (NICE Framework), which was revised to meet the needs of DF specialists, including the EDL CDU. Additions were made in terms of levels, specializations, and job descriptions.
    The proposals incorporated the DF limitations and standards introduced in the work, which ultimately resulted in a proposal for a Digital Forensics Competency ontology, a change to the EDL CDU structure, suggested instructional strategies for digital forensics use with each level of the revised Bloom's taxonomy, a new DF standard subdivision (Unmanned Systems Forensics), and a Digital Forensic Competency Model Framework. The list of tasks and skills was compiled from internationally recognized certification organizations and curricula focused on DF specialist competencies. The mini-Delphi, or Estimate-Talk-Estimate (ETE), technique was applied to evaluate the proposed model. An initial estimate of competencies and priorities was given to the EDL CDU partner institutions for expert advice and evaluation. Considering the feedback, improvements were made to the model, and a proposal with a future work plan was put forward to the CDU. In general, the proposed competence framework describes the expected scope of competence of a DF specialist in the EDL CDU, to enhance their role as a rapid response team. The framework helps to define the expected competencies and capabilities of digital forensics in practice and guides experts in their choice of specialization. The proposed model takes into account the long-term effect (hire-to-retire). Due to the complexity of the model, the framework has a long implementation phase: the maximum time frame for achieving the full effect for the organization is expected to be 5 years. These proposals were approved by the EDL CDU, and the proposed plan was first launched in April 2019.
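    The first of the five tasks above, competency mapping, amounts to comparing a specialist's current proficiencies against a role profile. A minimal sketch of that gap analysis is shown below; the role, skills, and proficiency levels are invented placeholders, loosely in the spirit of a NICE-style framework, not the thesis's actual ontology.

    ```python
    # Hypothetical sketch of competency-gap mapping (task 1). Role names,
    # skill names, and the 1-5 proficiency scale are illustrative only.
    REQUIRED = {  # role -> skill -> required proficiency level (1-5)
        "Digital Evidence Specialist": {
            "disk-imaging": 3,
            "report-writing": 2,
            "network-forensics": 2,
        },
    }

    def competency_gaps(role, current):
        """Return the skills where a specialist falls short of the role
        profile, mapped to the size of the shortfall."""
        return {skill: need - current.get(skill, 0)
                for skill, need in REQUIRED[role].items()
                if current.get(skill, 0) < need}

    gaps = competency_gaps("Digital Evidence Specialist",
                           {"disk-imaging": 3, "report-writing": 1})
    print(gaps)  # → {'report-writing': 1, 'network-forensics': 2}
    ```

    The remaining tasks (goal setting, training plans, recruitment, continuous development) would then operate on the same role-profile data, which is what makes a single shared competency model useful across the hire-to-retire span.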

    A multi-disciplinary co-design approach to social media sensemaking with text mining

    This thesis presents the development of a bespoke social media analytics platform called Sentinel using an event-driven co-design approach. The performance and outputs of this system, along with its integration into the routine research methodology of its users, were used to evaluate how the application of an event-driven co-design approach to system design improves the degree to which Social Web data can be converted into actionable intelligence, with respect to robustness, agility, and usability. The thesis includes a systematic review of the state-of-the-art technology that can support real-time text analysis of social media data, used to position the text analysis elements of the Sentinel Pipeline. This is followed by research chapters that focus on combinations of robustness, agility, and usability as themes, covering the iterative developments of the system through the event-driven co-design lifecycle. Robustness and agility are covered during initial infrastructure design and early prototyping of bottom-up and top-down semantic enrichment. Robustness and usability are then considered during the development of the Semantic Search component of the Sentinel Platform, which exploits the semantic enrichment developed in the prototype, alpha, and beta systems. Finally, agility and usability are used whilst building upon the Semantic Search functionality to produce a data download functionality for rapidly collecting corpora for further qualitative research. These iterations are evaluated using a number of case studies that were undertaken in conjunction with a wider research programme, within the field of crime and security, that the Sentinel platform was designed to support. The findings from these case studies are used in the co-design process to inform how developments should evolve.
As part of this research programme the Sentinel platform has supported the production of a number of research papers authored by stakeholders, highlighting the impact the system has had in the field of crime and security research.

    Forgotten as data – remembered through information. Social memory institutions in the digital age: the case of the Europeana Initiative

    The study of social memory has emerged as a rich field of research closely linked to cultural artefacts, communication media and institutions as carriers of a past that transcends the horizon of the individual’s lifetime. Within this domain of research, the dissertation focuses on memory institutions (libraries, archives, museums) and the shifts they are undergoing as the outcome of digitization and the diffusion of online media. Very little is currently known about the impact that digitality and computation may have on social memory institutions, specifically, and social memory, more generally – an area of study that would benefit from but, so far, has been mostly overlooked by information systems research. The dissertation finds its point of departure in the conceptualization of information as an event that occurs through the interaction between an observer and the observed – an event that cannot be stored as information but merely as data. In this context, memory is conceived as an operation that filters, thus forgets, the singular details of an information event by making it comparable to other events according to abstract classification criteria. Against this backdrop, memory institutions are institutions of forgetting as they select, order and preserve a canon of cultural heritage artefacts. Supported by evidence from a case study on the Europeana initiative (a digitization project of European libraries, archives and museums), the dissertation reveals a fundamental shift in the field of memory institutions. The case study demonstrates the disintegration of 1) the cultural heritage artefact, 2) its standard modes of description and 3) the catalogue as such into a steadily accruing assemblage of data and metadata. Dismembered into bits and bytes, cultural heritage needs to be re-membered through the emulation of recognizable cultural heritage artefacts and momentary renditions of order. 
In other words, memory institutions forget as binary-based data and remember through computational information.

    Spacelab Life Sciences-1

    This report provides an historical overview of the Spacelab Life Sciences-1 (SLS-1) mission along with the resultant biomaintenance data and investigators' findings. Only the nonhuman elements, developed by Ames Research Center (ARC) researchers, are addressed herein. The STS-40 flight of SLS-1, in June 1991, was the first Spacelab flown after 'return to orbit'; it was also the first Spacelab mission specifically designated as a Life Sciences Spacelab. The experiments performed provided baseline data for both hardware and rodents used in succeeding missions.

    FSEHS Cause and Effect Catalog and Student Handbook 2008-2009
