
    Metamodels of information technology best practices frameworks

    This article deals with the generation and application of ontological metamodels of frameworks of IT best practices. The ontological metamodels represent the logical structures and fundamental semantics of the framework models and constitute adequate tools for the analysis, adaptation, comparison and integration of IT best-practices frameworks. The MetaFrame methodology for constructing the metamodels, founded on the discipline of conceptual metamodelling and on the extended Entity/Relationship methodology, is described herein, as well as the metamodels of the best practices for IT outsourcing, the eSCM-SP v2.01 (eSourcing Capability Model for Service Providers) and the eSCM-CL v1.1 (eSourcing Capability Model for Client Organizations), constructed according to the MetaFrame methodology.
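    As a rough illustration of how an extended Entity/Relationship metamodel can capture a framework's logical structure, the sketch below models a hypothetical fragment; the entity and relationship names are illustrative assumptions, not taken from the eSCM specifications or the MetaFrame methodology itself.

```python
from dataclasses import dataclass, field

@dataclass
class Entity:
    """An entity type in an extended E/R metamodel."""
    name: str
    attributes: list = field(default_factory=list)

@dataclass
class Relationship:
    """A named, directed relationship between two entity types."""
    name: str
    source: Entity
    target: Entity
    cardinality: str = "1:N"

# Hypothetical fragment: capability areas contain practices.
capability_area = Entity("CapabilityArea", ["name", "description"])
practice = Entity("Practice", ["id", "level", "statement"])
contains = Relationship("contains", capability_area, practice, "1:N")

def describe(rel: Relationship) -> str:
    """Render a relationship in a compact textual notation."""
    return f"{rel.source.name} --{rel.name} ({rel.cardinality})--> {rel.target.name}"
```

    A comparison or integration of two frameworks would then operate on such structures rather than on the informal prose of the framework documents.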

    Study of Tools Interoperability

    Interoperability of tools usually refers to a combination of methods and techniques that address the problem of making a collection of tools work together. In this study we survey different notions that are used in this context: interoperability, interaction and integration. We point out the relations between these notions and how they map to the interoperability problem. We narrow the problem area to tool development in academia. Tools developed in such an environment have a small basis for development, documentation and maintenance. We scrutinise some of the problems and potential solutions related to tool interoperability in such an environment. Moreover, we look at two tools developed in the Formal Methods and Tools group, and analyse the use of different integration techniques.

    Technology Management: Lowering Total Cost of Ownership (TCO) Through Thin Client Technology

    IS professionals face the daunting challenge of keeping up with the pace of technology while controlling the Total Cost of Ownership (TCO). Using the TCO model developed by the Gartner Group and the TCA model by the Tolley Group, many IS professionals have confirmed the overall cost of deploying applications within an enterprise. One possible approach to controlling an organization's TCO is thin client technology. A key feature of the recently released Windows 2000 Server is Microsoft's Terminal Services, an enhancement of Windows NT Server 4.0 Terminal Server Edition, which was the server platform for launching thin client technologies.
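    A back-of-the-envelope comparison shows the kind of arithmetic behind such TCO models. The cost categories loosely follow the usual direct/indirect split; all figures are illustrative assumptions, not Gartner or Tolley data.

```python
def tco(hardware: float, software: float, support_per_year: float,
        downtime_per_year: float, years: int) -> float:
    """Simplified total cost of ownership over a device's service life:
    one-off acquisition costs plus recurring support and downtime costs."""
    return hardware + software + years * (support_per_year + downtime_per_year)

# Illustrative per-seat figures over a four-year life (assumed, not measured):
fat_client = tco(hardware=1200, software=400, support_per_year=800,
                 downtime_per_year=300, years=4)   # 6000.0
thin_client = tco(hardware=300, software=100, support_per_year=250,
                  downtime_per_year=100, years=4)  # 1800.0
```

    The point of such models is that the recurring support and downtime terms, not the purchase price, dominate the total, which is where centralized thin-client administration claims its savings.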

    A conceptual architecture for adaptation in remote desktop systems driven by the user perception of multimedia

    Current thin-client remote desktop systems were designed for data-oriented applications over low-quality LAN links, and they do not provide satisfactory end-user performance in enterprise environments for the increasingly popular graphical and multimedia applications. To improve the perception of those applications in a thin-client environment, we propose an architecture for a server-side Quality of Service (QoS) management component responsible for mapping application QoS requirements onto network QoS. We analyze how service differentiation and traffic management techniques, combined with user perception monitoring, can be used to adjust network-level resource allocation when the performance of multimedia applications in a remote desktop environment does not meet user requirements. Our objective is to provide QoS-aware remote desktop systems that are able to manage available resources in an intelligent manner and meet end-user performance expectations. © 2005 IEEE
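    One conventional way to map application-level QoS classes onto network QoS is DiffServ marking. The sketch below uses standard DSCP values (EF, AF41, best effort), but the class names, the frame-rate threshold, and the promotion rule driven by user perception are illustrative assumptions, not the component described in the paper.

```python
# Standard DiffServ code points; the class names are hypothetical.
DSCP = {
    "interactive-multimedia": 46,  # EF (expedited forwarding)
    "streaming": 34,               # AF41 (assured forwarding, high priority)
    "bulk-data": 0,                # best effort
}

def classify(frame_rate_fps: float, user_satisfied: bool) -> str:
    """Pick a traffic class from monitored user perception: when perceived
    multimedia quality drops below an assumed 24 fps threshold, promote
    the flow to the highest-priority class."""
    if not user_satisfied and frame_rate_fps < 24:
        return "interactive-multimedia"
    if frame_rate_fps >= 24:
        return "streaming"
    return "bulk-data"

def dscp_for(frame_rate_fps: float, user_satisfied: bool) -> int:
    """Map the monitored application state to a DSCP marking."""
    return DSCP[classify(frame_rate_fps, user_satisfied)]
```

    A server-side component of this shape would re-mark the remote desktop traffic as perception measurements arrive, letting DiffServ-aware routers do the actual resource allocation.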

    Strategic news frames and public policy debates: Press and television news coverage of the euro in the UK

    There is growing concern amongst observers of the media that news coverage of politics has moved away from a focus on issues and towards political strategy. Research evidencing such concerns has tended to examine strategic news at a macro level and rarely delves into the complexities surrounding its manifestations. This study addresses this issue by conducting a content analysis of a non-election issue in the British news media (press and TV news) over a three-month period, in which strategy news as a frame was examined. The issue chosen for the case study was the "euro debate" of May-June 2003. Findings showed the euro debate to fulfil many typical characteristics of EU reporting in the British media, with coverage cyclical and driven by events, and consequently lacking sustained engagement with the issues. Although there was a roughly equal balance of issue-framed and strategy-framed stories in the press, certain features of coverage gave strategy greater prominence. While many of the content analysis findings confirm the worries of media critics, a number of qualifications emerge, such as the active role that politicians play as sources of strategic news.

    A coarse granular approach to software development allowing non-programmers to build and deploy reliable, web based applications

    Today's software development is mainly performed by IT experts. The application expert participates in the software development process only peripherally: traditionally, he is involved only in the requirements analysis and the acceptance test, and has no influence on the software development itself. The result of this loose integration of the application expert is long delivery times, because IT experts are a scarce resource and misunderstandings lead to fallbacks within the software development cycle. This thesis presents a component-based software development process for Web-based applications, and its concrete application within the ETI project, which directly integrates the application expert into the software development at his level of understanding. Based on a set of building blocks modeling activities of the application domain, the application expert can specify the workflows the application has to implement. Afterwards, the actual application is automatically generated and packaged. During the design of the workflows, the application expert is guided by formal methods which guarantee the consistency of the application at the workflow level.
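    The workflow-level composition of building blocks can be sketched as follows. The block names and the context-dict convention are hypothetical, chosen only to show how an application expert's chain of domain activities could be assembled and executed without writing conventional code.

```python
from typing import Callable, Dict

# A building block is a named activity that takes and returns a context.
Block = Callable[[Dict], Dict]

def block(name: str) -> Block:
    """Create a stub building block that records its execution in a trace."""
    def run(ctx: Dict) -> Dict:
        return {**ctx, "trace": ctx.get("trace", []) + [name]}
    return run

def workflow(*blocks: Block) -> Block:
    """Compose building blocks sequentially into one executable workflow."""
    def run(ctx: Dict) -> Dict:
        for b in blocks:
            ctx = b(ctx)
        return ctx
    return run

# Hypothetical domain activities chained by the application expert:
signup = workflow(block("register_user"),
                  block("confirm_email"),
                  block("create_account"))
```

    In the process described above, a consistency check at this workflow level (e.g., that every chained block's inputs are produced by its predecessors) is what the formal methods would guarantee before the application is generated.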

    MS thesis

    Several methods exist for monitoring software development, but few formal evaluation methods have been applied to measure and improve clinical software application problems once the software has been implemented in the clinical setting. A standardized software problem classification system was developed and implemented at the University of Utah Health Sciences Center. External validity was measured by a survey of 14 University Healthcare Consortium (UHC) hospitals. Internal validation was accomplished by an in-depth analysis of problem details, revision of the problem ticket format, verification from staff within the information systems department, and mapping of old problems to the new classification system. Cohen's Kappa statistic of agreement, used for reliability testing of the new classification system, revealed good agreement (Kappa = .6162) among Help Desk agents in the consistency of classifying problem calls. A monthly quality improvement report template with the following categories was developed from the new classification system: top 25 problems; unplanned server downtimes; problem summaries; customer satisfaction survey results; top problem details; case analyses; and follow-up of case analyses. Continuous Quality Improvement (CQI) methodology was applied to problem reporting within the Office of Information Resources (OIR), and a web-based ticket entry system was implemented. The new system has resulted in the following benefits: reduction of problem resolution times by one third; improved problem ticket information; a shift of 2 FTEs from the call center to dispatch due to the increased efficiency of the Help Desk; and a trend of improving customer satisfaction as measured by an online survey. The study provided an internal quality model for the OIR department and the UUHSC. The report template provided a method for tracking and trending software problems for use in evaluation and quality improvement studies, and supplied data for analysis and improvement of customer satisfaction. The study has further potential as a model for information system departments at other health care institutions implementing quality improvement methods. There is potential for improvement in the information technology, social, organizational, and cultural aspects as key issues emerge over time, and many consequences of change can be studied from the data collected.
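    Cohen's Kappa, used above for reliability testing, corrects the raw agreement between two raters for the agreement expected by chance. A minimal implementation follows; the example labels are invented for illustration, not Help Desk data.

```python
def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa for two raters labelling the same items."""
    n = len(labels_a)
    categories = set(labels_a) | set(labels_b)
    # Observed agreement: fraction of items both raters labelled identically.
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Expected agreement: chance overlap given each rater's label frequencies.
    expected = sum(
        (labels_a.count(c) / n) * (labels_b.count(c) / n) for c in categories
    )
    return (observed - expected) / (1 - expected)

# Two agents classifying the same four problem calls (invented labels):
agent_1 = ["network", "network", "application", "application"]
agent_2 = ["network", "application", "application", "application"]
kappa = cohens_kappa(agent_1, agent_2)  # observed 0.75, expected 0.5 -> 0.5
```

    On the usual interpretation scales, a kappa of .6162 like the one reported above falls in the "good" or "substantial" agreement range.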