
    Standards and Costs for Quality Management of E-Learning Services

    The scale of technological development in communications and information represents an undeniable premise for significant change in every sphere of human life. Combined with the advance of the Internet over the last decade, these changes have enabled the emergence (since the 1990s), operation and development of flexible forms of remote work based on information and communication technology. Education, marketing, advertising and commerce are among the first domains where the impact of technology has been strongest; it manifests as e-learning, cyber-marketing, online advertising and electronic commerce. But the mere use of technology does not automatically guarantee the success of these new forms of activity. The transformation of the traditional into the digital, of the classic into the virtual, must be accompanied by adequate support with respect to the quality of services, standards, platforms, and hardware and software technologies. In the educational domain, this means analyzing the e-learning (or tele-education) phenomenon and its spectacular evolution over such a recent history. Quality is a landmark of major importance in every field of a modern knowledge-based society. From the perspective of tele-education, quality assurance must focus on three main directions: the quality of the educational process itself (class/course materials, platform, technology, etc.); the quality of the instructor (professional training, qualification, specialization, pedagogic ability, teaching method, etc.); and the quality of the learner (prior training, accumulated knowledge, involvement, desire to learn, etc.). As in any activity, reporting against quality standards also implies an economic approach through quality costs: good products and quality services in e-learning are strongly linked to educational multimedia production and to reasonable costs.
    Keywords: quality, standards, e-learning, technology, cost

    Towards an Interaction-based Integration of MKM Services into End-User Applications

    Full text link
    The Semantic Alliance (SAlly) Framework, first presented at MKM 2012, allows the integration of Mathematical Knowledge Management services into typical applications and end-user workflows. From an architecture allowing invasion of spreadsheet programs, it grew into a middleware connecting spreadsheet, CAD, text and image processing environments with MKM services. The architecture presented in the original paper proved to be quite resilient, as it is still used today with only minor changes. This paper explores extensibility challenges we have encountered in the process of developing new services and maintaining the plugins invading end-user applications. After an analysis of the underlying problems, I present an augmented version of the SAlly architecture that addresses these issues and opens new opportunities for document-type-agnostic MKM services.
    Comment: 14 pages, 7 figures
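    The paper's actual middleware protocol is not reproduced in the abstract. As a rough illustration of the idea of document-type-agnostic services behind a middleware, the following sketch (all names and the message shape are assumptions, not SAlly's API) routes a request from an invaded application to a registered service without the service knowing the document type:

    ```python
    from dataclasses import dataclass
    from typing import Callable, Dict

    # Hypothetical message sent from an invaded end-user application
    # (spreadsheet, CAD, text editor) to the middleware.
    @dataclass
    class SemanticRequest:
        document_type: str   # e.g. "spreadsheet", "cad"
        object_id: str       # identifier of the selected object in the document
        service: str         # name of the requested service

    class Middleware:
        """Routes requests to document-type-agnostic services by name."""
        def __init__(self) -> None:
            self._services: Dict[str, Callable[[SemanticRequest], str]] = {}

        def register(self, name: str, handler: Callable[[SemanticRequest], str]) -> None:
            self._services[name] = handler

        def dispatch(self, request: SemanticRequest) -> str:
            handler = self._services.get(request.service)
            if handler is None:
                raise KeyError(f"no such service: {request.service}")
            return handler(request)

    # Example service: look up a definition for the selected object.
    # The service never inspects the host application, only the request.
    def definition_lookup(req: SemanticRequest) -> str:
        return f"definition of {req.object_id} (from {req.document_type})"

    mw = Middleware()
    mw.register("definition-lookup", definition_lookup)
    print(mw.dispatch(SemanticRequest("spreadsheet", "cell:A1", "definition-lookup")))
    ```

    The point of the sketch is the decoupling: adding a new invaded application or a new service only touches the registry, which is the kind of extensibility the augmented architecture targets.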

    Semantic web service architecture for simulation model reuse

    COTS simulation packages (CSPs) have proved popular in an industrial setting with a number of software vendors. In contrast, options for re-using existing models seem more limited. Re-use of simulation component models by collaborating organizations is, however, restricted by the same semantic issues that restrict the inter-organization use of web services. The current representations of web components are predominantly syntactic in nature, lacking the fundamental semantic underpinning required to support discovery on the emerging semantic web. Semantic models, in the form of ontology, utilized by a web service discovery and deployment architecture provide one approach to support simulation model reuse. Semantic interoperation is achieved through the use of simulation component ontology to identify required components at varying levels of granularity (including both abstract and specialized components). Selected simulation components are loaded into a CSP, modified according to the requirements of the new model, and executed. The paper presents the development of ontology, connector software and a web service discovery architecture in order to understand how such ontology are created, maintained and subsequently used for simulation model reuse. The ontology is extracted from health service simulation, comprising hospitals and the National Blood Service. The ontology engineering framework and discovery architecture provide a novel approach to inter-organization simulation, uncovering domain semantics and adopting a less intrusive interface between participants. Although specific to CSPs, the work has wider implications for the simulation community.
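    The abstract does not reproduce the ontology itself. As a minimal sketch of ontology-driven component discovery under entirely assumed annotations (the component names and concepts below are illustrative, not the paper's health-service ontology), a discovery step can be modeled as matching the concepts a new model requires against the concepts each published component is annotated with:

    ```python
    # Hypothetical ontology fragment: each simulation component is annotated
    # with the domain concepts it models, at a chosen level of granularity.
    COMPONENT_ONTOLOGY = {
        "HospitalWard": {"is_a": "CareFacilityComponent", "models": {"patients", "beds"}},
        "BloodBank":    {"is_a": "SupplyComponent",       "models": {"blood_units", "donors"}},
        "Ambulance":    {"is_a": "TransportComponent",    "models": {"patients", "transfers"}},
    }

    def discover(required_concepts: set, ontology: dict) -> list:
        """Return components whose semantic annotations cover the request."""
        return sorted(
            name for name, meta in ontology.items()
            if required_concepts <= meta["models"]
        )

    # A collaborating organization asks for any component that models patients.
    print(discover({"patients"}, COMPONENT_ONTOLOGY))  # ['Ambulance', 'HospitalWard']
    ```

    Matching on shared concepts rather than on syntactic interface signatures is what lets two organizations discover each other's components without agreeing on naming conventions in advance.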

    Are Components the Future of Web–Application Development?

    The software industry is still creating much of its product in a “monolithic” fashion. The products may be more modular and configurable than they used to be, but most projects cannot be said to be truly component based. Even some projects built with component-enabled technologies are not taking full advantage of the component model. It is quite possible to misuse component capabilities and, as a result, to forfeit many of their benefits. Many organizations are becoming aware of the advantages and are getting their developers trained in the new technologies and the proper way to use them. It takes time for an organization to adopt such a significant change in its current practices. Some of the trade magazines would have us believe that the industry is years ahead of where it truly is; those of us in the trenches know that the reaction time is a little longer in the real world. The change to component-based development has begun, however.
    Keywords: component-based development, frameworks, language, market, technology