    Web services strategy

    Thesis (S.M.M.O.T.)--Massachusetts Institute of Technology, Sloan School of Management, Management of Technology Program, June 2003. Includes bibliographical references (p. 116-123). This electronic version was submitted by the student author; the certified thesis is available in the Institute Archives and Special Collections. By Stephen B. Miles.
    "Everything is connected to everything." El Aleph (1945), by Jorge Luis Borges [1]
    This thesis addresses the need to simplify and streamline Web services network infrastructure and to identify business models that best leverage Web services technology and industry dynamics to generate positive business results. Web services have evolved from the simple page-display protocol of their origin, beyond links that merely updated web data dynamically from corporate databases, to the point where systems can transact automatically. These Web services represent a series of network business technology standards and capabilities that irrevocably change the way in which businesses will do business. Indeed, every business today is a networked business and has opportunities to grow using Web services. This study focuses on the implementation challenges in the financial services market, specifically the On-Line Transaction Processing (OLTP) sector, where legacy mainframes interface with multiple tiers of distribution through proprietary EDI links. The OLTP industry operates under stringent regulatory requirements for availability and auditability, covering not only who performed what transaction but also who had access to information about the information. In this environment, organizational demands on network infrastructure, including hardware, software, and personnel, are changing radically, while Information Technology (IT) budgets are concurrently under pressure. The strategic choices for deploying Web services in this environment may hold lessons for other industries where cost-effective large-scale processing, high availability, security, manageability, and Intellectual Property Rights (IPR) are paramount concerns. In this paper we use a system dynamics model to simulate the impact of market changes, the adoption of innovative technologies, and their commoditization on the industry value chain, with the aim of identifying business models and network topologies that best support the growth of an Open Systems network business. From the results of the simulation we derive strategic recommendations for networked business models and Web services integration strategies that meet Line of Business (LOB) objectives.
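
    The abstract does not reproduce the thesis's model equations, but technology-adoption dynamics of this kind are commonly captured in system dynamics with a Bass-style stock-and-flow formulation. The sketch below is a minimal illustration under assumed parameters (the market size M and the innovation and imitation coefficients p and q are hypothetical), not the model actually used in the thesis.

```python
# Minimal Bass-style diffusion sketch of technology adoption.
# All parameters are illustrative assumptions, not thesis values.

M = 10_000   # hypothetical market size (e.g., number of OLTP firms)
p = 0.03     # innovation coefficient (external influence), assumed
q = 0.38     # imitation coefficient (word of mouth), assumed
dt = 0.25    # time step in years
adopters = 0.0

for step in range(int(10 / dt)):  # simulate 10 years
    potential = M - adopters
    # Adoption flow: external influence plus internal (network) influence
    flow = (p + q * adopters / M) * potential
    adopters += flow * dt
    if step % int(1 / dt) == 0:
        print(f"year {step * dt:4.1f}: {adopters:8.0f} adopters")
```

    Varying p and q (or making them respond to commoditization pressure) is one simple way to explore how market changes shift the adoption curve along the value chain.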

    Aviation System Analysis Capability Executive Assistant Design

    In this technical document, we describe the design developed for the Aviation System Analysis Capability (ASAC) Executive Assistant (EA) Proof of Concept (POC). We describe the genesis and role of the ASAC system, discuss its objectives, provide an overview of its components and models, and present the design process and results of the ASAC EA POC system design. We also describe the evaluation process and results for applicable COTS software. The document has six chapters, a bibliography, three appendices, and one attachment.

    A technology reference model for client/server software development

    In today's highly competitive global economy, information resources representing enterprise-wide information are essential to the survival of an organization. The development of, and increase in the use of, personal computers and data communication networks are supporting or, in many cases, replacing the traditional corporate mainframe. The client/server model combines mainframe processing with desktop applications on personal computers. The aim of the research is to compile a technology model for the development of client/server software. A comprehensive overview of the individual components of a client/server system is given. The different methodologies, tools, and techniques that can be used are reviewed, as well as client/server-specific design issues. The research is intended to create a road map in the form of a Technology Reference Model for Client/Server Software Development. Computing; M.Sc. (Information Systems).
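
    As a rough illustration of the request/response pattern at the heart of the client/server model this dissertation surveys, the sketch below pairs a toy back-end "server" with a desktop "client" over a local TCP socket. The host, port, and toy query are illustrative assumptions; the Technology Reference Model itself is a methodology, not code.

```python
# Toy client/server exchange: a desktop client queries a back-end server.
# Endpoint and message format are invented for illustration.
import socket
import threading

HOST, PORT = "127.0.0.1", 9009  # assumed local endpoint

# Back-end tier: bind and listen first so the client cannot connect too early.
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind((HOST, PORT))
srv.listen(1)

def serve_once() -> None:
    """Accept one connection and answer one request."""
    conn, _ = srv.accept()
    with conn:
        request = conn.recv(1024).decode()
        conn.sendall(f"result for: {request}".encode())

server = threading.Thread(target=serve_once, daemon=True)
server.start()

# Client tier: presentation logic on the desktop issues the request.
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.connect((HOST, PORT))
    cli.sendall(b"balance for account 42")
    print(cli.recv(1024).decode())

server.join()
srv.close()
```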

    Integrative Approach to Online Quality Management: Process Control and Packout Verification in an Intelligent Manufacturing Workcell.

    The current global competition and the economic situation in the United States and the world are forcing industries to produce quality products quickly and at a competitive price. Many industries are aiming toward world-class manufacturing objectives such as responsive delivery, defect-free product, and declining cost. Industries in the present environment can survive and produce quality products to customer expectations only if they implement new technologies. The key element in the success of industries is the use of continuous process improvement strategies such as reducing process variability and reducing response time to process deviations. Achieving quality in manufacturing processes is an important part of the job description of everyone concerned with the manufacturing operation. The greatest savings come when a quality system can immediately inform appropriate personnel when process problems occur, and can then assist in ensuring rapid response at the lowest possible level in the organization. This type of system adds not only to the bottom line but also to the job satisfaction of all concerned [John, 1992]. The quality tools of the future are those that operate under a different scenario: the computer systems that collect the data also automatically perform the analysis, interpretation, detection, and correction, along with exception-based alarming and reporting. In this research, we examine the potential benefits of an integrative approach to on-line quality management. The motivation for this research comes from a field study with a printed circuit board assembly and instrumentation cluster manufacturer. The company made substantial investments in setting up elaborate systems for on-line data collection and monitoring of process status. While these modern quality control systems provided a rich database, their application in quality management was rather limited, partially due to the lack of an appropriate methodology for quality decisions. The objective of this research is to develop an integrative approach to process control and packout verification of products. When the developed approach was implemented, it enabled the company to reduce its Problem Resolution Requests (PRRs) by 25% and yielded a cost avoidance of approximately $600,000 annually.
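
    The abstract does not specify the control logic behind the exception-based alarming it describes, but a minimal sketch of the idea is a Shewhart-style individuals chart: estimate control limits from baseline data, then alarm the moment a measurement falls outside them. The data, limits, and function names below are invented for illustration and are not the thesis's actual method.

```python
# Shewhart-style individuals chart with 3-sigma exception alarming.
# Baseline data and the incoming measurements are invented examples.
from statistics import mean, stdev

baseline = [10.1, 9.8, 10.0, 10.2, 9.9, 10.1, 10.0, 9.7, 10.3, 10.0]
center = mean(baseline)
sigma = stdev(baseline)
ucl, lcl = center + 3 * sigma, center - 3 * sigma  # 3-sigma control limits

def check(measurement: float) -> None:
    """Alarm immediately when the process drifts outside control limits."""
    if not lcl <= measurement <= ucl:
        print(f"ALARM: {measurement} outside [{lcl:.2f}, {ucl:.2f}]")
    else:
        print(f"ok:    {measurement}")

for x in [10.05, 9.92, 11.2, 10.01]:  # 11.2 simulates a process deviation
    check(x)
```

    Routing the alarm to the operator closest to the process, rather than up a reporting chain, is what the abstract means by rapid response "at the lowest possible level in the organization."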

    Architecture for grid-enabled instrumentation in extreme environments

    Technological progress in recent decades has led to sensor networks and robotic explorers becoming principal tools for investigating remote or "hostile" environments where it is difficult, if not impossible, for humans to intervene. These situations include deep-ocean and space environments, where devices can be subject to extreme pressures, temperatures, and radiation levels. It is a costly enterprise to deploy an instrument in such settings, and therefore reliable operation and ease of use are requisite features to build into the basic fabric of the machine. This thesis describes the design and implementation of a modular machine system based on a peer-to-peer, decentralised network topology in which the power supply and electronic hardware resources are distributed homogeneously throughout a network of nodes. Embedded within each node is a minimal, low-power single-board computer running a real-time operating system and a MicroCANopen protocol stack, realising a standard interface to the network. The network is based on a grid paradigm in which nodes act as resource producers and consumers, sharing information so that the machine system as a whole can perform tasks. The resulting architecture supports "plug-and-play" flexibility, allowing users or system developers to reconfigure or expand its capabilities by adding or removing nodes at a later time. An immediate application of this instrument is in-situ sampling of microbes in extreme aqueous habitats. The microbial sampler is targeted at providing improved sampling capabilities for physical, chemical, and biological investigations in deep-ocean hydrothermal vent environments. At these depths the instrument is subject to immense pressures of many thousands of pounds per square inch, where superheated, corrosive, mineral-loaded vent fluids mix with near-freezing seawater. In the longer term, it is anticipated that this flexible, open interface architecture, on which the microbial sampler instrument is based, will be applicable more generally to other sectors, including commercial and scientific markets. EThOS - Electronic Theses Online Service, United Kingdom.
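
    As a conceptual sketch of the grid paradigm described above, the toy registry below lets nodes announce the resources they produce and lets consumers discover them at run time, which is what gives the "plug-and-play" behaviour. This is a plain-Python analogy only; the thesis's actual nodes communicate over a CANopen bus, and none of the class or resource names here come from it.

```python
# Toy resource grid: nodes register producers; consumers look them up.
# All names and values are hypothetical illustrations.
from typing import Callable, Dict

class ResourceGrid:
    """A registry through which peer nodes share resources."""
    def __init__(self) -> None:
        self._producers: Dict[str, Callable[[], float]] = {}

    def register(self, name: str, producer: Callable[[], float]) -> None:
        self._producers[name] = producer      # node joins the grid

    def unregister(self, name: str) -> None:
        self._producers.pop(name, None)       # node removed at run time

    def consume(self, name: str) -> float:
        return self._producers[name]()        # consumer reads a resource

grid = ResourceGrid()
grid.register("power.bus_voltage", lambda: 24.1)   # hypothetical sensor node
grid.register("sampler.pressure", lambda: 8250.0)  # hypothetical vent probe

print(grid.consume("sampler.pressure"))  # another node consumes the value
grid.unregister("power.bus_voltage")     # hot-unplug: grid keeps working
```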

    Software development and continual change: A programmer's attitude problem.

    Software forms around a requirement. Defining this requirement is often regarded as the hardest part of software engineering. The requirement, however, has an additional complexity: once defined, it will change with time. This change of requirement can come either from the user or from the rapid advances in 'computer' technology. How then can software continue to remain 'current', both in terms of requirements and technology, in this forever-changing environment? This thesis examines the issues surrounding 'change' as applied to software and software engineering. Changing requirements are often deemed a 'curse' placed upon software engineers. It has been suggested, however, that the problems associated with change exist only in the attitude of software engineers. This is perhaps understandable considering the training methods and tools available to supposedly 'help' them. The evidence shows that the quality of management and the experience of the personnel involved in development contribute more significantly to the success of a development project than any technical aspect. This unfortunately means that the process is highly susceptible to staff turnover which, if uncontrolled, can lead to impending disaster for the users. This suggests a 'better' system would be developed if 'experience' were maintained at the process level, rather than at the individual level. Conventional methods of software engineering are based upon a defined set of requirements which are determined at the beginning of the software process. This thesis presents an alternative paradigm which requires only a minimal set of requirements at the outset and actively encourages changes and additional requirements, even in a mature software product. The basis of this alternative approach is the form of the 'requirements specification' and the capturing and re-use of the 'experience' maintained by the software process itself.

    Electronic architecture and technology development of astronaut spaceflight load sensors

    Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2001. Includes bibliographical references. By Sylvie Loday.
