
    MPEG-SCORM: an ontology of interoperable metadata for integrating multimedia and e-learning standards

    Advisor: Yuzo Iano. Doctoral thesis, Universidade Estadual de Campinas, Faculdade de Engenharia Elétrica e de Computação.
    Abstract: The convergence of digital media calls for an integration of ICT, focused on the telecommunications and multimedia domain (under the responsibility of the Moving Picture Experts Group, subcommittee ISO/IEC JTC1 SC29), with ICTE (ICT for Education, managed by ISO/IEC JTC1 SC36). Prominent here are the MPEG standards, featured as content and as descriptive metadata for multimedia and Digital TV, and the ICTE technologies applied to Distance Education, or e-Learning. This raises the problem of developing an interoperable matching between the two normative bases, leading to an innovative proposal for the convergence of digital telecommunications and e-Learning applications, both essentially multimedia.
    To this end, the thesis proposes creating an ontology for interoperability between educational applications and Digital TV environments and vice versa, simultaneously facilitating the creation of metadata-based learning objects for Digital TV programs and the provision of multimedia video content as learning objects for Distance Education. The ontology is designed as interoperable metadata for the Web, Digital TV, and mobile-device extensions, built on the integration of the MPEG-21 and SCORM metadata standards and employing the XPath language.
    Doctorate: Telecommunications and Telematics. Doctor of Electrical Engineering. CAPE
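The MPEG-21/SCORM metadata crosswalk via XPath can be illustrated with a minimal sketch. The element names below are simplified stand-ins, not the real IMS manifest or DIDL schemas, and Python's ElementTree XPath subset stands in for full XPath:

```python
# Minimal crosswalk sketch: a SCORM-style title is read with an XPath
# expression and re-emitted as an MPEG-21 DIDL-style descriptor.
# Element names are illustrative, not the full IMS/DIDL schemas.
import xml.etree.ElementTree as ET

SCORM_FRAGMENT = """
<manifest>
  <organizations>
    <organization>
      <title>Intro to Digital TV</title>
    </organization>
  </organizations>
</manifest>
"""

def scorm_title_to_didl(scorm_xml: str) -> str:
    root = ET.fromstring(scorm_xml)
    # ElementTree supports a subset of XPath; enough for this mapping.
    title = root.find("./organizations/organization/title").text
    didl = ET.Element("DIDL")
    item = ET.SubElement(didl, "Item")
    desc = ET.SubElement(item, "Descriptor")
    stmt = ET.SubElement(desc, "Statement")
    stmt.text = title
    return ET.tostring(didl, encoding="unicode")

print(scorm_title_to_didl(SCORM_FRAGMENT))
```

The same pattern generalizes to any field pair for which the ontology declares a correspondence: one XPath expression per source field, one target element per mapped field.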

    Media Digitization and Preservation Initiative: A Case Study

    Since its creation nearly a decade ago, the Digital Curation Centre (DCC) Curation Lifecycle Model has become the quintessential framework for understanding digital curation. Organizations and consortia around the world have used the DCC Curation Lifecycle Model as a tool to ensure that all the necessary stages of digital curation are undertaken, to define roles and responsibilities, and to build a framework of standards and technologies for digital curation. Yet, research applying the model to large-scale digitization projects as a way of understanding their digital curation efforts is scant. This paper reports findings of a qualitative case study analysis of Indiana University Bloomington's multi-million-dollar Media Digitization and Preservation Initiative (MDPI), employing the DCC Curation Lifecycle Model as a lens for examining the scope and effectiveness of its digital curation efforts. Findings underscore the success of MDPI in performing digital curation by illustrating the ways it implements each of the model's components. Implications of applying the DCC Curation Lifecycle Model to understand digital curation in mass digitization projects are discussed, as well as directions for future research.

    Multisite adaptive computation offloading for mobile cloud applications

    The sheer number of mobile devices and their fast adaptability have contributed to the proliferation of modern advanced mobile applications. These applications are often latency-critical and demand high availability. They also tend to require intensive computation resources and considerable energy, while a mobile device has limited computation and energy capacity because of its physical size constraints. The heterogeneous mobile cloud environment consists of different computing resources: remote cloud servers in faraway data centres, cloudlets whose goal is to bring the cloud closer to the users, and nearby mobile devices that can be utilised to offload mobile tasks. Heterogeneity across mobile devices and offloading sites spans software, hardware, and technology variations. Resource-constrained mobile devices can leverage this shared resource environment to offload their intensive tasks, conserving battery life and improving overall application performance. However, such a loosely coupled network dominated by mobile devices raises new challenges: how to seamlessly leverage mobile devices alongside all the offloading sites, how to simplify deploying the runtime environment that serves offloading requests, how to identify which parts of the mobile application to offload, how to decide whether to offload them, and how to select the most suitable candidate offloading site, among others. To overcome these challenges, this research work contributes the design and implementation of MAMoC, a loosely coupled end-to-end mobile computation offloading framework. Mobile applications are adapted to the client library of the framework, while the server components are deployed to the offloading sites to serve offloading requests.
    The evaluation of the offloading decision engine demonstrates the viability of the proposed solution for managing seamless and transparent offloading in distributed and dynamic mobile cloud environments. All the implemented components of this work are publicly available at the following URL: https://github.com/mamoc-repo
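The offloading trade-off the abstract describes can be sketched with a generic cost model: each candidate site is scored by estimated transfer time plus compute time. This is an illustration of the idea, not MAMoC's actual decision engine, and all names and numbers are made up:

```python
# Generic offloading-cost sketch (not MAMoC's real engine): score each
# site by estimated completion time = data transfer time + compute time.
from dataclasses import dataclass

@dataclass
class Site:
    name: str
    mips: float        # processing speed (million instructions/s)
    bandwidth: float   # link bandwidth to the device (MB/s); inf = local

def completion_time(task_mi: float, data_mb: float, site: Site) -> float:
    transfer = 0.0 if site.bandwidth == float("inf") else data_mb / site.bandwidth
    return transfer + task_mi / site.mips

def best_site(task_mi, data_mb, sites):
    return min(sites, key=lambda s: completion_time(task_mi, data_mb, s))

sites = [
    Site("local device", mips=1_000, bandwidth=float("inf")),
    Site("nearby cloudlet", mips=10_000, bandwidth=50.0),
    Site("remote cloud", mips=50_000, bandwidth=5.0),
]
# A heavy task with little data favours offloading to the fastest site.
print(best_site(task_mi=100_000, data_mb=10, sites=sites).name)  # remote cloud
```

A data-heavy but computationally light task would instead favour the local device or cloudlet, which is exactly the adaptivity a multisite decision engine must capture.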

    Quality prediction for component-based software development: techniques and a generic environment.

    Cai Xia. Thesis (M.Phil.), Chinese University of Hong Kong, 2002. Includes bibliographical references (leaves 105-110). Abstracts in English and Chinese. Contents:
    Chapter 1 Introduction: 1.1 Component-Based Software Development and Quality Assurance Issues; 1.2 Our Main Contributions; 1.3 Outline of This Thesis.
    Chapter 2 Technical Background and Related Work: 2.1 Development Framework for Component-based Software (2.1.1 Common Object Request Broker Architecture (CORBA); 2.1.2 Component Object Model (COM) and Distributed COM (DCOM); 2.1.3 Sun Microsystems's JavaBeans and Enterprise JavaBeans; 2.1.4 Comparison among Different Frameworks); 2.2 Quality Assurance for Component-Based Systems (2.2.1 Traditional Quality Assurance Issues; 2.2.2 The Life Cycle of Component-based Software Systems; 2.2.3 Differences between Components and Objects; 2.2.4 Quality Characteristics of Components); 2.3 Quality Prediction Techniques (2.3.1 ARMOR: A Software Risk Analysis Tool).
    Chapter 3 A Quality Assurance Model for CBSD: 3.1 Component Requirement Analysis; 3.2 Component Development; 3.3 Component Certification; 3.4 Component Customization; 3.5 System Architecture Design; 3.6 System Integration; 3.7 System Testing; 3.8 System Maintenance.
    Chapter 4 A Generic Quality Assessment Environment: ComPARE: 4.1 Objective; 4.2 Metrics Used in ComPARE (4.2.1 Metamata Metrics; 4.2.2 JProbe Metrics; 4.2.3 Application of Metamata and JProbe Metrics); 4.3 Models Definition (4.3.1 Summation Model; 4.3.2 Product Model; 4.3.3 Classification Tree Model; 4.3.4 Case-Based Reasoning Model; 4.3.5 Bayesian Network Model); 4.4 Operations in ComPARE; 4.5 ComPARE Prototype.
    Chapter 5 Experiments and Discussions: 5.1 Data Description; 5.2 Experiment Procedures; 5.3 Modeling Methodology (5.3.1 Classification Tree Modeling; 5.3.2 Bayesian Belief Network Modeling); 5.4 Experiment Results (5.4.1 Classification Tree Results Using CART; 5.4.2 BBN Results Using Hugin); 5.5 Comparison and Discussion.
    Chapter 6 Conclusion. Appendix A: Classification Tree Report of CART. Appendix B: Publication List. Bibliography.
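The Summation and Product models named under 4.3 can be sketched as follows; the table of contents only names these models, so the formulas, metric names, and weights below are illustrative assumptions, not the values used in ComPARE:

```python
# Sketch of the two simplest quality-prediction models the thesis names:
# a weighted summation and a product of normalized metric values.
# Metric names and weights are illustrative assumptions.
def summation_model(metrics, weights):
    # Q = sum_i w_i * m_i over normalized metrics m_i in [0, 1]
    return sum(weights[k] * metrics[k] for k in metrics)

def product_model(metrics):
    # Q = prod_i m_i ; one very poor metric drags the whole score down
    q = 1.0
    for v in metrics.values():
        q *= v
    return q

m = {"cyclomatic_ok": 0.9, "coverage": 0.8, "coupling_ok": 0.7}
w = {"cyclomatic_ok": 0.5, "coverage": 0.3, "coupling_ok": 0.2}
print(round(summation_model(m, w), 3))  # 0.83
print(round(product_model(m), 3))       # 0.504
```

The classification-tree, case-based-reasoning, and Bayesian-network models listed alongside them trade this simplicity for the ability to learn thresholds and dependencies from historical data.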

    Semantic Model Alignment for Business Process Integration

    Business process models describe an enterprise's way of conducting business and thus form the basis for shaping the organization and engineering the appropriate supporting, or even enabling, IT. A major task in working with such models is their analysis and comparison for the purpose of aligning them. Models can differ semantically not only in the modeling languages used, but even more so in the way natural language has been applied to label the model elements, so correctly identifying the intended meaning of a legacy model is a non-trivial task that thus far has only been solved by humans. In particular at times of reorganization, the set-up of B2B collaborations, or mergers and acquisitions, the semantic analysis of models of different origin that need to be consolidated is a manual effort that is not only tedious and error-prone but also time-consuming, costly, and often repetitive. To facilitate automating this task by means of IT, this thesis presents the new method of Semantic Model Alignment. Applying it extracts and formalizes the semantics of models, relating them on the basis of the modeling language used and determining similarities from the natural language used in model element labels. The resulting alignment supports model-based semantic business process integration. The research follows a design-science oriented approach, and the method has been created together with all its enabling artifacts. These results were published as the research progressed and are presented in this thesis through a selection of peer-reviewed publications comprehensively describing the various aspects.
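The label-comparison step at the heart of such alignment can be sketched with a simple token-overlap similarity; this is a minimal illustration of comparing natural-language labels, not the method's actual similarity measure, which would add synonym and multilingual resources:

```python
# Minimal label-similarity sketch: tokenize two model element labels and
# compare them with the Jaccard coefficient (shared tokens / all tokens).
def jaccard(label_a: str, label_b: str) -> float:
    a, b = set(label_a.lower().split()), set(label_b.lower().split())
    return len(a & b) / len(a | b) if a | b else 1.0

# Two process-step labels from models of different origin:
print(jaccard("Check customer order", "Verify customer order"))  # 0.5
```

A real alignment would recognize "check" and "verify" as synonyms and score this pair far higher, which is precisely the gap that dedicated semantic resources close.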

    A Framework To Model Complex Systems Via Distributed Simulation: A Case Study Of The Virtual Test Bed Simulation System Using the High Level Architecture

    As the size, complexity, and functionality of the systems we need to model and simulate continue to increase, benefits such as interoperability and reusability enabled by distributed discrete-event simulation are becoming extremely important in many disciplines, not only military but also engineering disciplines such as distributed manufacturing, supply chain management, and enterprise engineering. In this dissertation we propose a distributed simulation framework for the modeling and simulation of complex systems. The framework is based on the interoperability of a simulation system enabled by distributed simulation and on gateways that allow Commercial Off-the-Shelf (COTS) simulation packages to interconnect to the distributed simulation engine. In the case study of modeling the Virtual Test Bed (VTB), the framework has been designed as a distributed simulation to facilitate the integrated execution of different simulations (a shuttle process model, a Monte Carlo model, and a Delay and Scrub model), each addressing different mission components, as well as other non-simulation applications (a Weather Expert System and Virtual Range). Although these models were developed independently and at various times, their original purposes have been seamlessly integrated, and they interact with each other through the Run-time Infrastructure (RTI) to simulate shuttle-launch-related processes. This study found that, with the framework, the defining properties of complex systems, interaction and emergence, are realized, and that software life cycle models (including the spiral model and prototyping) can be used as metaphors to manage the complexity of modeling and simulating the system. The system of systems (a complex system is intrinsically a system of systems) continuously evolves to accomplish its goals; during this evolution, subsystems coordinate with one another and adapt to environmental factors such as policies, requirements, and objectives.
    In the case study we first demonstrate how legacy models developed in COTS simulation languages/packages and non-simulation tools can be integrated to address a complicated system of systems. We then describe the techniques that can be used to display the state of remote federates in a local federate in High Level Architecture (HLA) based distributed simulation using COTS simulation packages.
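The way federates share state through the RTI can be sketched with a toy publish/subscribe broker. Every class and method name here is an illustrative stand-in; the real HLA RTI interface (federation management, time management, object classes) is far richer:

```python
# Toy stand-in for the publish/subscribe pattern the RTI provides in HLA:
# federates publish attribute updates to a broker, and subscribed
# federates mirror the remote state locally.
class MiniRTI:
    def __init__(self):
        self.subscribers = {}   # attribute name -> list of callbacks

    def subscribe(self, attribute, callback):
        self.subscribers.setdefault(attribute, []).append(callback)

    def update(self, attribute, value):
        # deliver the new value to every subscribed federate
        for cb in self.subscribers.get(attribute, []):
            cb(attribute, value)

class Federate:
    def __init__(self, name, rti):
        self.name, self.rti = name, rti
        self.remote_state = {}

    def reflect(self, attribute, value):
        # analogous in spirit to HLA's reflectAttributeValues callback
        self.remote_state[attribute] = value

rti = MiniRTI()
viewer = Federate("VirtualRange", rti)
rti.subscribe("shuttle_position", viewer.reflect)
rti.update("shuttle_position", (28.57, -80.65))  # another federate publishes
print(viewer.remote_state["shuttle_position"])
```

Displaying remote-federate state locally, as the case study describes, amounts to rendering whatever arrives through such reflect callbacks.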

    A structured approach to physically-based modeling for computer graphics

    This thesis presents a framework for the design of physically-based computer graphics models. The framework includes a paradigm for the structure of physically-based models, techniques for "structured" mathematical modeling, and a specification of a computer program structure in which to implement the models. The framework is based on known principles and methodologies of structured programming and mathematical modeling. Because the framework emphasizes the structure and organization of models, we refer to it as "Structured Modeling." The Structured Modeling framework focuses on clarity and "correctness" of models, emphasizing explicit statement of assumptions, goals, and techniques. In particular, we partition physically-based models, separating them into conceptual and mathematical models, and posed problems. We control complexity of models by designing in a modular manner, piecing models together from smaller components. The framework places a particular emphasis on defining a complete formal statement of a model's mathematical equations, before attempting to simulate the model. To manage the complexity of these equations, we define a collection of mathematical constructs, notation, and terminology, that allow mathematical models to be created in a structured and modular manner. We construct a computer programming environment that directly supports the implementation of models designed using the above techniques. The environment is geared to a tool-oriented approach, in which models are built from an extensible collection of software objects, that correspond to elements and tasks of a "blackboard" design of models. A substantial portion of this thesis is devoted to developing a library of physically-based model "modules," including rigid-body kinematics, rigid-body dynamics, and dynamic constraints, all built with the Structured Modeling framework. 
    These modules are intended to serve both as examples of the framework and as potentially useful tools for the computer graphics community. Each module includes statements of goals and assumptions, explicit mathematical models and problem statements, and descriptions of software objects that support them. We illustrate the use of the library to build some sample models, and include discussion of various possible additions and extensions to the library. Structured Modeling is an experiment in modeling: an exploration of designing via strict adherence to a dogma of structure, modularity, and mathematical formality. It does not stress issues such as particular numerical simulation techniques or efficiency of computer execution time or memory usage, all of which are important practical considerations in modeling. However, at least for the work carried out in this thesis, Structured Modeling has proven to be a useful aid in the design and understanding of complex physically-based models.
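The separation the framework advocates (conceptual model, mathematical model, posed problem, then simulation) can be sketched for a deliberately trivial module; this module is hypothetical, not one from the thesis's library:

```python
# Structured-Modeling-style module sketch (hypothetical example):
# Conceptual model:   a point mass moving in one dimension, no forces.
# Mathematical model: dx/dt = v, with v constant.
# Posed problem:      given x(0) and v, find x(t) by explicit Euler steps.
def simulate_point_mass(x0: float, v: float, dt: float, steps: int) -> float:
    x = x0
    for _ in range(steps):
        x += v * dt            # discretization of dx/dt = v
    return x

# With constant velocity, Euler integration is exact: x(t) = x0 + v*t.
print(round(simulate_point_mass(x0=0.0, v=2.0, dt=0.1, steps=10), 6))  # 2.0
```

The point of the discipline is that the assumptions and equations are stated before any simulation code is written, so the code can be checked against an explicit mathematical statement rather than against intuition.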

    Ontologies for Legal Relevance and Consumer Complaints. A Case Study in the Air Transport Passenger Domain

    Applying relevant legal information to settle complaints and disputes is a common challenge for all legal practitioners and laymen. However, the analysis of the concept of relevance itself has thus far attracted only sporadic attention. This thesis bridges that gap by analysing the components of complaints and by defining relevant legal information, using computational ontologies and design patterns to represent this knowledge in an explicit and structured way. The work uses as a case study a real situation of consumer disputes in the Air Transport Passenger domain. Two artifacts were built: the Relevant Legal Information in Consumer Disputes Ontology and its specialization, the Air Transport Passenger Incidents Ontology, aimed at modelling relevant legal information; and the Complaint Design Pattern, proposed to conceptualize complaints. To demonstrate the ability of the ontologies to serve as a knowledge base for a computer program providing relevant legal information, a demonstrative application was developed.
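The core slots of a complaint can be sketched as a plain data structure; the field names below are loose illustrations (complainant, addressee, facts, motivation, requested remedy), since the actual artifact is an ontology design pattern rather than code:

```python
# Sketch of a complaint's core slots as a dataclass. Field names are
# illustrative stand-ins for the Complaint Design Pattern's concepts.
from dataclasses import dataclass

@dataclass
class Complaint:
    complainant: str
    addressee: str
    facts: str            # the incident being complained about
    motivation: str       # why the facts are considered wrongful
    request: str          # the remedy the complainant expects

c = Complaint(
    complainant="Passenger A",
    addressee="Airline B",
    facts="Flight delayed by five hours",
    motivation="Delay exceeds the threshold for compensation",
    request="Monetary compensation",
)
print(c.addressee)
```

In the ontology-based version, each slot would instead be a class or property whose values link to the legal norms that make the complaint's motivation relevant.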

    Model representation and documentation in computer simulation.

    Typically, a simulation project is a highly complex process which relies heavily on the expertise and knowledge of the simulation analyst. It also requires researching large amounts of systems data. This comprehensive data, together with the specialist skills of the analyst, is integral to the success of any simulation project, and a record of this information is clearly desirable for future reference. In practice, however, very little or no effort is usually made to record and maintain this significant information. This oversight often removes the opportunity for subsequent use of the model by members of the project team themselves. It also hinders the reuse of simulation models in the development of future models that could use the same data. Hence, proper and complete documentation is an essential requirement for overcoming such situations. A simulation study involves not only developing the model but also managing the process prior to model construction and the subsequent tasks. Documentation in simulation likewise involves not only recording the model description but also the other exhaustive details embraced by the whole project. Clearly, both the project team and model re-users benefit from such in-depth and effective documentation.
    Model Representation and Documentation (MRD) is a new concept for documentation in simulation. It addresses the different purposes and needs of different audiences in respect of the simulation project, model reuse, and other interested parties. No structured documentation methodology, either satisfying this context or encompassing the complete simulation project, has been found in the existing literature or in simulation software.
    However, progressive documentation alongside the model development process can fulfil the needs of different audiences and allows the documentation process to be structured. The proposed MRD process is based on task orientation, a notion drawn from system development methodology in software engineering. It offers the user the ability to manage the documentation process with micro-level task documents and to capture project details as the project progresses. Task documents are then accumulated to produce complete documents serving the different purposes of documentation. Pre-structured forms of task documents, based on typical simulation project procedure and enriched with reusable model elements, not only provide a uniform and consistent structure for capturing task details but also offer a sound foundation for an integrated documentation system.
    An isolated MRD process, even if concurrent with model development, does not improve the present poor state of documentation. Integrating the MRD process with model development lets the user perform both processes simultaneously as a single process, with each benefiting directly and mutually through model exchange. The documentation models, constructed from reusable generic model elements, and the common database, which stores model details in a standard internal structure, make provision for such model exchange. Hence, an integrated MRD process improves not only documentation in simulation but also model reusability. The study has produced a novel approach for documenting the details of simulation projects in an integrated environment.
    Samarakoon M. Piyasena
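The accumulation of micro-level task documents into a complete project document can be sketched as follows; the task names and fields are illustrative assumptions, not the thesis's actual pre-structured forms:

```python
# Sketch of task-oriented documentation: each simulation task yields a
# small structured record, and the complete project document is
# accumulated from those records as the project progresses.
task_docs = []

def record_task(name: str, inputs: str, outputs: str, notes: str):
    task_docs.append({"task": name, "inputs": inputs,
                      "outputs": outputs, "notes": notes})

def complete_document() -> str:
    # accumulate micro-level task documents into one project document
    return "\n".join(
        f"{d['task']}: in={d['inputs']}; out={d['outputs']}; {d['notes']}"
        for d in task_docs
    )

record_task("Problem definition", "client brief", "objectives list",
            "agreed with stakeholders")
record_task("Conceptual modelling", "objectives list", "model scope",
            "assumptions recorded for reuse")
print(complete_document())
```

Because each record names its inputs and outputs, the accumulated document doubles as a dependency trace between tasks, which is what makes later model reuse practical.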

    Design Development Test and Evaluation (DDT and E) Considerations for Safe and Reliable Human Rated Spacecraft Systems

    A team directed by the NASA Engineering and Safety Center (NESC) collected methodologies for how best to develop safe and reliable human rated systems and how to identify the drivers that provide the basis for assessing safety and reliability. The team also identified techniques, methodologies, and best practices to assure that NASA can develop safe and reliable human rated systems. The results are drawn from a wide variety of resources, from experts involved with the space program since its inception to the best practices espoused in contemporary engineering doctrine. This report focuses on safety and reliability considerations and does not duplicate or update any existing references. Neither does it intend to replace existing standards and policy.