346 research outputs found

    CIMLA: A Modular and Modifiable Data Preparation, Organization, and Fusion Infrastructure to Partially Support the Development of Context-aware MMLA Solutions

    Multimodal Learning Analytics (MMLA) solutions aim to provide a more holistic picture of a learning situation by processing multimodal educational data. Taking the contextual information of a learning situation into account is known to help provide more relevant outputs to educational stakeholders. However, most MMLA solutions are still at the prototyping stage and must deal with the different dimensions of an authentic MMLA situation, which involves multiple cross-disciplinary stakeholders such as teachers, researchers, and developers. One reason these solutions remain at the prototyping stage of the development lifecycle is the set of challenges that software developers face at different levels when developing context-aware MMLA solutions. In this paper, we identify the requirements and propose a data infrastructure called CIMLA. It includes different data processing components that follow a standard data processing pipeline, and it considers contextual information according to a defined data structure. It has been evaluated in three authentic MMLA scenarios involving different cross-disciplinary stakeholders, following the Software Architecture Analysis Method: its fitness was analyzed in each of the three scenarios, and developers were interviewed to assess whether it meets functional and non-functional requirements. Results showed that CIMLA supports modularity in developing context-aware MMLA solutions and that each of its modules can be reused, with the required modifications, in the development of other solutions. In the future, the current involvement of a developer in customizing the configuration file to capture contextual information can be investigated.
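    The modular, configuration-driven pipeline described in the abstract can be sketched in miniature. This is a hypothetical illustration, not CIMLA's actual API: the stage functions, the `context` dictionary, and all field names are invented for this example.

```python
# Hypothetical sketch of a modular data-preparation pipeline in which
# each stage is a reusable component and contextual information is
# supplied through a configuration dictionary (not CIMLA's real API).
from typing import Callable, Dict, List

Record = Dict[str, object]
Stage = Callable[[Record, Dict], Record]

def make_pipeline(stages: List[Stage], context: Dict) -> Callable[[Record], Record]:
    # Compose the stages; every stage receives the shared context.
    def run(record: Record) -> Record:
        for stage in stages:
            record = stage(record, context)
        return record
    return run

# Example stages: drop missing values, then annotate with context.
def clean(rec: Record, ctx: Dict) -> Record:
    return {k: v for k, v in rec.items() if v is not None}

def add_context(rec: Record, ctx: Dict) -> Record:
    return {**rec, "activity": ctx.get("activity", "unknown")}

pipeline = make_pipeline([clean, add_context], {"activity": "group-work"})
print(pipeline({"audio_level": 0.4, "gaze": None}))
# → {'audio_level': 0.4, 'activity': 'group-work'}
```

    Because each stage is an independent function, a stage can be swapped or reused in another pipeline without touching the rest, which is the kind of modularity the abstract attributes to CIMLA.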

    Launch Control Systems: Moving Towards a Scalable, Universal Platform for Future Space Endeavors

    The redirection of NASA away from the Constellation program calls for heavy reliance on commercial launch vehicles in the near future in order to reduce costs and shift focus to research and long-term space exploration. To support them, NASA will renovate Kennedy Space Center's launch facilities and make them available for commercial use. However, NASA's current launch software is deeply tied to the now-retired Space Shuttle and is largely incompatible with other vehicles. Therefore, a new Launch Control System must be designed that is adaptable to a variety of launch protocols and vehicles. This paper presents some of the features and advantages of the new system from the perspectives of both the software developers and the launch engineers.

    A fast classifier-based approach to credit card fraud detection

    This thesis addresses the problem of anomaly detection in the context of credit card fraud detection with machine learning. Specifically, the goal is to apply a new approach to two-sample testing based on classifiers, recently developed for new physics searches in high-energy physics. This strategy allows one to compare batches of incoming data with a control sample of standard transactions in a statistically sound way, without prior knowledge of the type of fraudulent activity. The learning algorithm at the basis of this approach is a modern implementation of kernel methods that allows for fast online training and high flexibility. This work is the first attempt to export this method to a real-world use case outside the domain of particle physics.
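    The classifier-based two-sample test mentioned in the abstract can be illustrated with a minimal sketch, using a plain logistic regression in place of the thesis's kernel-method implementation; all function names and data here are invented for illustration. The idea: label the control sample 0 and the incoming batch 1, train a classifier to separate them, and check whether its held-out accuracy exceeds chance.

```python
# Minimal sketch of a classifier two-sample test (illustrative only):
# if a classifier can tell a new batch apart from reference data
# better than chance, the batch likely follows a different distribution.
import numpy as np

def logistic_fit(X, y, lr=0.1, steps=500):
    # Plain batch gradient descent on the logistic loss.
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
        g = p - y
        w -= lr * (X.T @ g) / len(y)
        b -= lr * g.mean()
    return w, b

def two_sample_accuracy(ref, batch, rng):
    # Label reference transactions 0 and the incoming batch 1, split
    # into train/test halves, and measure held-out accuracy.
    X = np.vstack([ref, batch])
    y = np.r_[np.zeros(len(ref)), np.ones(len(batch))]
    idx = rng.permutation(len(y))
    cut = len(y) // 2
    tr, te = idx[:cut], idx[cut:]
    w, b = logistic_fit(X[tr], y[tr])
    pred = (X[te] @ w + b) > 0
    return float((pred == y[te]).mean())

rng = np.random.default_rng(0)
ref = rng.normal(0.0, 1.0, size=(400, 2))      # "standard" transactions
same = rng.normal(0.0, 1.0, size=(400, 2))     # batch from the same law
shifted = rng.normal(1.0, 1.0, size=(400, 2))  # batch with anomalies

print(two_sample_accuracy(ref, same, rng))     # near 0.5: no evidence
print(two_sample_accuracy(ref, shifted, rng))  # well above 0.5: flagged
```

    In practice the accuracy would be turned into a p-value (for example, via a permutation test), which is what makes the comparison statistically sound without a model of the fraud itself.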

    Proceedings of the 3rd Workshop on Domain-Specific Language Design and Implementation (DSLDI 2015)

    The goal of the DSLDI workshop is to bring together researchers and practitioners interested in sharing ideas on how DSLs should be designed, implemented, supported by tools, and applied in realistic application contexts. We are interested both in discovering how already known domains such as graph processing or machine learning can best be supported by DSLs, and in exploring new domains that could be targeted by DSLs. More generally, we are interested in building a community that can drive forward the development of modern DSLs. These informal post-proceedings contain the talk abstracts submitted to the 3rd DSLDI workshop (DSLDI'15) and a summary of the panel discussion on Language Composition.

    Interactive Digital Terrestrial Television: The Interoperability Challenge in Brazil

    This paper introduces the different standards implemented in existing Digital Terrestrial Television Broadcasting systems to enable the use of interactive services and applications through digital Set Top Boxes. It focuses on the interoperability issue between the Brazilian and European architectures. Although the GEM specification was designed to foster wide content compatibility across a range of interactive platforms, it has never reached a final implementation and deployment in Brazil. The interoperability issue has therefore been explored in depth in the BEACON project, and an innovative system architecture has been developed to deploy t-learning services across Europe and Brazil, integrating systems that had previously been unable to interoperate. This work is an important step toward standards interoperability. As a result, an implementation based on MHP and Ginga NCL-Lua appeared to be the best choice for delivering interactive services in an interoperable mode between European and Brazilian digital television.

    CERN UNIX user guide


    Survey of Technologies for Web Application Development

    Web-based application developers face a dizzying array of platforms, languages, frameworks, and technical artifacts to choose from. We survey, classify, and compare technologies supporting Web application development. The classification is based on (1) foundational technologies; (2) integration with other information sources; and (3) dynamic content generation. We further survey and classify software engineering techniques and tools that have been adopted from traditional programming into Web programming. We conclude that, although the infrastructure problems of the Web have largely been solved, the cacophony of technologies for Web-based applications reflects the lack of a solid model tailored for this domain.