
    Four Decades of Computing in Subnuclear Physics - from Bubble Chamber to LHC

    This manuscript addresses selected aspects of computing for the reconstruction and simulation of particle interactions in subnuclear physics. Based on personal experience with experiments at DESY and at CERN, I cover the evolution of computing hardware and software from the era of track chambers, where interactions were recorded on photographic film, up to the LHC experiments with their millions of electronic channels.

    Enabling application agility: software as a service, cloud computing and dynamic languages

    The good news is that application developers are on the verge of being liberated from the tyranny of middleware. Next Generation IT will leverage a new computing platform that makes the development and delivery of applications significantly easier than it is today. This new platform consists of Cloud Computing, Software as a Service and Dynamic Languages. Cloud Computing [1] offers mainframe-class or better infrastructure through a small set of services delivered globally over the Internet.
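    To make the "dynamic languages" part of this claim concrete, the sketch below shows how little code a small service-style application needs in Python when the platform, rather than a middleware stack, handles delivery. The endpoint name and payload are invented for illustration and are not taken from the article.

```python
# Minimal sketch (assumption: Python as the dynamic language) of a small
# service that a cloud platform could host directly, with no middleware.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class QuoteHandler(BaseHTTPRequestHandler):
    """Hypothetical endpoint returning a JSON payload."""

    def do_GET(self):
        body = json.dumps({"service": "quote", "price": 42.0}).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Bind locally; a cloud platform would front this with its own routing.
    HTTPServer(("0.0.0.0", 8080), QuoteHandler).serve_forever()
```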

    Integrating legacy mainframe systems: architectural issues and solutions

    For more than 30 years, mainframe computers have been the backbone of computing systems throughout the world. Even today it is estimated that some 80% of the world's data is held on such machines. However, new business requirements and pressure from evolving technologies, such as the Internet, are pushing these existing systems to their limits, and they are reaching breaking point. The banking and financial sectors in particular have relied on mainframes the longest to do their business, and as a result it is they that feel these pressures the most. In recent years there have been various solutions for enabling a re-engineering of these legacy systems. It quickly became clear that completely rewriting them was not possible, so various integration strategies emerged. Out of these, the CORBA standard by the Object Management Group emerged as the strongest, providing a standards-based solution that enabled mainframe applications to become peers in a distributed computing environment. However, the requirements did not stop there. The mainframe systems were reliable, secure, scalable and fast, so any integration strategy had to ensure that the new distributed systems did not lose any of these benefits. Various patterns, or general solutions to the problem of meeting these requirements, have arisen, and this research looks at applying some of these patterns to mainframe-based CORBA applications. The purpose of this research is to examine some of the issues involved in making mainframe-based legacy applications inter-operate with newer object-oriented technologies.
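    The integration patterns discussed here centre on wrapping a legacy transaction behind an object interface so that it appears as a peer in a distributed system. The sketch below illustrates that wrapper idea in Python with entirely hypothetical names and record layouts; the thesis itself realises it with CORBA IDL, an ORB and generated stubs, none of which are shown here.

```python
# Illustrative sketch of the wrapper pattern: a legacy, record-oriented
# routine is hidden behind an object interface so callers see an ordinary
# method call. All names and layouts are hypothetical.

def legacy_get_balance(record: str) -> str:
    """Stand-in for a mainframe transaction that takes and returns
    fixed-width character records (copybook-style layout)."""
    account = record[:10].strip()
    return f"{account:<10}{1234.56:>12.2f}"

class AccountFacade:
    """Object wrapper that, in a CORBA setting, would be exported via the ORB."""

    def balance(self, account_id: str) -> float:
        record = f"{account_id:<10}"          # marshal into the legacy layout
        reply = legacy_get_balance(record)    # invoke the legacy transaction
        return float(reply[10:22])            # unmarshal the reply field

if __name__ == "__main__":
    print(AccountFacade().balance("ACC-001"))  # -> 1234.56
```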

    Integration using Oracle SOA suite

    Internship carried out at Wipro Retail. Integrated Master's thesis. Informatics and Computing Engineering, Faculty of Engineering, University of Porto. 200

    MS

    Several methods exist for monitoring software development. Few formal evaluation methods have been applied to measure and improve clinical software application problems once the software has been implemented in the clinical setting. A standardized software problem classification system was developed and implemented at the University of Utah Health Sciences Center. External validity was measured by a survey of 14 University Healthcare Consortium (UHC) hospitals. Internal validation was accomplished by an in-depth analysis of problem details, revision of the problem ticket format, verification from staff within the information systems department, and mapping of old problems to the new classification system. Cohen's Kappa statistic of agreement, used for reliability testing of the new classification system, revealed good agreement (Kappa = .6162) among HELP Desk agents in the consistency of classifying problem calls. A monthly quality improvement report template with the following categories was developed from the new classification system: top 25 problems; unplanned server downtimes; problem summaries; customer satisfaction survey results; top problem details; case analyses; and follow-up of case analyses. Continuous Quality Improvement (CQI) methodology was applied to problem reporting within the Office of Information Resources (OIR) and a web-based ticket entry system was implemented. The new system has resulted in the following benefits: reduction of problem resolution times by one third; improved problem ticket information; a shift of 2 FTEs from the call center to dispatch due to the increased efficiency of the HELP Desk; and a trend toward improved customer satisfaction as measured by an online survey. The study provided an internal quality model for the OIR department and the UUHSC. The report template provided a method for tracking and trending software problems for use in evaluation and quality improvement studies, and it also provided data for analysis and improvement of customer satisfaction. The study has further potential as a model for information system departments at other health care institutions implementing quality improvement methods. There is potential for improvement in the information technology, social, organizational, and cultural aspects as key issues emerge over time, and the many consequences of the data collected and of the resulting changes can themselves be studied.
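    The abstract leans on Cohen's Kappa to quantify agreement between HELP Desk agents classifying the same calls. As a reminder of what that statistic computes, the sketch below derives kappa from a two-rater confusion matrix; the ticket counts are invented and are unrelated to the reported value of .6162.

```python
# Hedged sketch: Cohen's Kappa for two raters classifying the same problem
# tickets. kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement
# and p_e is the agreement expected by chance from the marginal totals.
# The confusion-matrix counts below are invented for illustration only.

def cohens_kappa(matrix: list[list[int]]) -> float:
    total = sum(sum(row) for row in matrix)
    p_o = sum(matrix[i][i] for i in range(len(matrix))) / total
    row_marg = [sum(row) for row in matrix]
    col_marg = [sum(col) for col in zip(*matrix)]
    p_e = sum(r * c for r, c in zip(row_marg, col_marg)) / (total * total)
    return (p_o - p_e) / (1 - p_e)

# Rows: rater A's category, columns: rater B's category (e.g. hardware,
# application, network) for the same set of tickets.
tickets = [
    [45,  5,  2],
    [ 6, 30,  4],
    [ 3,  5, 20],
]
print(round(cohens_kappa(tickets), 4))
```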

    To Host a Legacy System to the Web

    The dramatic improvement in global interconnectivity due to intranets, extranets and the Internet has led many enterprises to consider migrating legacy systems to web-based systems. While data remapping is relatively straightforward in most cases, greater challenges lie in adapting legacy application software. This research effort describes an experiment in which a legacy system is migrated to a web-client/server environment. First, this thesis reports on the difficulties and issues arising when porting a legacy system, International Invoice (IIMM), to a web-client/server environment. Next, this research analyzes the underlying issues and offers cautionary guidance to future migrators, and finally this research effort builds a prototype of the legacy system in a web-client/server environment that demonstrates effective strategies for dealing with these issues.

    Off-line computing for experimental high-energy physics

    The needs of experimental high-energy physics for large-scale computing and data handling are explained in terms of the complexity of individual collisions and the need for high statistics to study quantum mechanical processes. The prevalence of university-dominated collaborations adds a requirement for high-performance wide-area networks. The data handling and computational needs of the different types of large experiment, now running or under construction, are evaluated. Software for experimental high-energy physics is reviewed briefly, with particular attention to the success of packages written within the discipline. It is argued that workstations and graphics are important in ensuring that analysis codes are correct, and the worldwide networks which support the involvement of remote physicists are described. Computing and data handling are reviewed, showing how workstations and RISC processors are rising in importance but have not supplanted traditional mainframe processing. Examples of computing systems constructed within high-energy physics are examined and evaluated.

    Maine IT Workforce Skills Management : A study for the Maine State Department of Labor

    Executive Summary: From August 2010 to February 2011, personnel from Information and Innovation at the University of Southern Maine conducted a study of the IT skills needed, possessed and taught in Maine. The goal of this study was to provide fine-grained information to the Maine State Department of Labor to facilitate its workforce development activities. The study concerns the skills sought by employers, possessed by unemployed and employed workers, and taught in education and training establishments with a bricks-and-mortar presence in Maine. It relied on data created by third parties and by study personnel; anecdotal evidence was also gathered from meetings with local industry IT professionals. The study does not attempt to estimate the demand for or supply of a given skill, but it does assess which skills are in greatest and least demand, which skills are in greatest and least supply, and which skills are taught more and less often. The results of the data analysis are presented in a new measure, skill rank disparity, which exposes skill and training gaps and gluts. The study provides certain insights into its results, observing individual cases of skills high in demand and low in supply, for example. Insights are also provided in terms of groups of skills that are often taught, often asked for, and whether these groups are well represented in the Maine IT workforce. The study also provides specific and actionable recommendations.
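    The "skill rank disparity" measure is only described informally here. One plausible reading, sketched below with invented counts, is the difference between a skill's rank by employer demand and its rank by worker supply; this is an illustration of that reading, not the study's actual formula.

```python
# Hypothetical reconstruction of a "skill rank disparity" measure: rank each
# skill by demand (job postings) and by supply (resumes), then take the
# difference in ranks. Large positive values suggest a gap (sought-after but
# scarce); large negative values suggest a glut. All counts are invented.

demand = {"Java": 120, "COBOL": 15, "SQL": 90, "Python": 60}
supply = {"Java": 80,  "COBOL": 40, "SQL": 85, "Python": 20}

def rank(counts: dict[str, int]) -> dict[str, int]:
    """Rank 1 = most frequent skill."""
    ordered = sorted(counts, key=counts.get, reverse=True)
    return {skill: i + 1 for i, skill in enumerate(ordered)}

demand_rank, supply_rank = rank(demand), rank(supply)
disparity = {s: supply_rank[s] - demand_rank[s] for s in demand}

for skill, gap in sorted(disparity.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{skill:8s} demand rank {demand_rank[skill]}  "
          f"supply rank {supply_rank[skill]}  disparity {gap:+d}")
```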

    Y2K on Campus
