
    MULTI-TERRAIN WHEELCHAIR MQP

    Many disabled people struggle to traverse outdoor terrain without specialized equipment. To go outdoors, users must spend money and time, and sacrifice usability, on bulky assistive devices. This project was carried out by a group of four WPI students to design and construct an assistive device for a fellow student who is paralyzed from the waist down. After initial brainstorming and research, the group chose to create a multi-terrain wheelchair. Using organizational, financial, manufacturing, and design methods, the group designed and constructed a prototype consisting of separate subsystems. This wheelchair design and prototype aims not only to assist disabled persons, but also to serve as a basis for future projects to improve upon and help solve the issue.

    Modeling The Secure Boot Protocol Using Actor Network Theory

    M.S. thesis, University of Hawaiʻi at Mānoa, 2017.

    Control System as a Service: Standardizing Service Model and Pricing Principles

    The service business has helped manufacturers supplement their new-equipment sales across many fields. It has proven to generate a steady portion of total revenues, and an even larger share of total profits. Because customers increasingly focus on their core competencies and capabilities, services have grown steadily. Few studies have examined the potential that a control system delivered as a service can provide to a company. Amid market saturation and globally competitive markets, companies face the challenge of operating effectively in the service business, working in a standardized way, and pricing their services optimally. In this master's thesis I explain how services and a service portfolio can be standardized and which pricing principles should be considered for them. First, servitization as a trend in an industrial context is reviewed, along with service strategies and service orientation. The pricing of services and its value aspect are also reviewed. Lastly, the literature review highlights cloud services in comparison with traditional IT services. Next, I study how the offering can be readjusted to deliver a control system as a service with the help of cloud services, which costs must be taken into consideration, and how the service could be priced. Comparing a traditional model with a service model, the total cost of ownership passes through different phases over the lifecycle. The total cost of ownership is calculated to be lower for the traditional model during one traditional lifecycle. However, each new lifecycle begins with large investment costs for software and hardware, which makes the service model cheaper again for the next few years. Consequently, the comparison between the models depends on the customer's preferences and IT strategy: the level of outsourcing it wants to practice, the cost structure it wants to pursue, and how much predictability it needs, since a traditional model is not as scalable as a service model.
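    The lifecycle cost crossover described above can be sketched with a toy calculation. All figures, the ten-year lifecycle, and the flat service fee are illustrative assumptions, not data from the thesis:

    ```python
    # Hypothetical total-cost-of-ownership (TCO) comparison between a
    # traditional (up-front investment) model and a service model.
    # All figures are invented for illustration.

    def traditional_tco(years, capex=100_000, opex=10_000, lifecycle=10):
        """Cumulative cost: a new software/hardware investment at the start
        of every lifecycle, plus a yearly operating cost."""
        renewals = 1 + (years - 1) // lifecycle  # investments incurred so far
        return renewals * capex + years * opex

    def service_tco(years, fee=22_000):
        """Cumulative cost: a flat yearly service fee, no up-front investment."""
        return years * fee

    # Within one lifecycle the traditional model stays cheaper overall, but
    # the renewal investment in year 11 makes the service model cheaper again.
    for years in (5, 10, 11, 12):
        print(years, traditional_tco(years), service_tco(years))
    ```

    Under these assumed numbers the traditional model wins through year 10 and the service model pulls ahead once the renewal capex lands, mirroring the phase behaviour the abstract describes.
    
    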

    An axiom system for sequence-based specification

    This paper establishes an axiomatic foundation and a representation theorem for the rigorous, constructive process, called sequence-based specification, of deriving precise specifications from ordinary (informal) statements of functional requirements. The representation theorem targets a special class of Mealy state machines, and algorithms are presented for converting from the set of sequences that define the specification to the equivalent Mealy machine, and vice versa. Since its inception, sequence-based specification has been effectively used in a variety of real applications, with gains reported in quality and productivity. This paper establishes the mathematical foundation independently of the process itself.
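    The correspondence between a sequence-based specification and a Mealy machine can be illustrated with a toy example. The machine, its states, and its alphabet below are invented for illustration, not taken from the paper:

    ```python
    # A minimal Mealy machine: transitions map (state, input) to
    # (next_state, output). A sequence-based specification maps each
    # input sequence to the output produced for its final stimulus,
    # which is exactly what running the machine computes.

    transitions = {
        ("idle", "press"): ("active", "on"),
        ("active", "press"): ("idle", "off"),
    }

    def respond(sequence, start="idle"):
        """Output assigned to an input sequence by the equivalent machine."""
        state, output = start, None
        for symbol in sequence:
            state, output = transitions[(state, symbol)]
        return output

    print(respond(["press"]))           # "on"
    print(respond(["press", "press"]))  # "off"
    ```

    Enumerating `respond` over all sequences (up to equivalence) recovers the sequence-based specification; the paper's algorithms formalize this round trip in both directions.
    
    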

    Identifying Design Strategies to Mitigate the Risk Introduced into New Product Development by Suppliers

    For every organization, an efficient and effective product development process is key to generating and managing growth opportunities. Strategic relationships with key suppliers and partners are often required because organizations do not have all the competencies that are crucial to the development of a product. This is particularly true for Original Design Manufacturer (ODM) and Joint Development Manufacturer (JDM) supplier relationships, which are characterized by a high degree of supplier involvement in every stage of product development. If the interactions with these key suppliers are not managed properly, there is significant risk that the endeavor will end up missing budget, schedule, and cost goals, particularly for complex systems. Little attention in the literature, however, has been given to the risk introduced by suppliers into the product development process, or to mitigating this risk through appropriate design strategies. This thesis addresses the need for a risk assessment methodology that not only identifies areas of concern but also identifies potential design strategies to mitigate risk. In this work, metrics are derived to quantify the relative importance, degree of change, difficulty of change, and degree of coupling for engineering metrics (EMs) at system and subsystem levels. From these metrics, a framework is developed to quantitatively assess the risk due to supplier interactions. In addition, design strategies identified in the literature are characterized in terms of these same metrics to determine the design strategy best suited to mitigate the risk associated with a particular EM. Finally, a case study is presented for the hypothetical development of a 3D printer, to assess the initial feasibility and utility of the framework.
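    One plausible way to combine the four per-EM scores into a relative risk figure can be sketched as follows. The scoring scale, the product-form aggregation, and the example EMs are illustrative assumptions, not the thesis's actual framework:

    ```python
    # Hypothetical supplier-risk ranking over engineering metrics (EMs).
    # Inputs: (importance, degree of change, difficulty of change, coupling),
    # each scored 1-5. The multiplicative aggregation is an assumption made
    # here for illustration.

    def supplier_risk(importance, change, difficulty, coupling):
        """Higher product => higher risk from supplier involvement."""
        return importance * change * difficulty * coupling

    ems = {
        "print-head alignment": (5, 4, 4, 3),  # invented example EMs
        "chassis stiffness":    (3, 2, 2, 1),
    }
    ranked = sorted(ems, key=lambda em: supplier_risk(*ems[em]), reverse=True)
    print(ranked[0])  # the EM most in need of a mitigating design strategy
    ```

    A design strategy characterized along the same four dimensions could then be matched against the highest-risk EMs first.
    
    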

    Deductive formal verification of embedded systems

    We combine static analysis techniques with model-based deductive verification using SMT solvers to provide a framework that, given an analysis aspect of the source code, automatically generates an analyzer capable of inferring information about that aspect. The analyzer is generated by translating the collecting semantics of a program into a formula in first-order logic over multiple underlying theories. We import the semantics of API invocations as first-order logic assertions. These assertions constitute the models used by the analyzer. A logical specification of the desired program behavior is incorporated as a first-order logic formula. An SMT-LIB solver treats the combined formula as a constraint and solves it. The solved form can be used to identify logical and security errors in embedded programs. We have used this framework to analyze Android applications and MATLAB code. We also report the formal verification of the conformance of the open-source Netgear WNR3500L wireless router firmware implementation to RFC 2131. Formal verification of a software system is essential for its deployment in mission-critical environments. The specifications for the development of routers are provided by RFCs that are described only informally in English. It is prudent to ensure that a router firmware conforms to its corresponding RFC before it can be deployed to manage mission-critical networks. The formal verification process demonstrates the usefulness of inductive types and higher-order logic in software certification.
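    The translation step described above can be sketched with a toy encoder: a straight-line program becomes a first-order constraint in SMT-LIB2 syntax, which a solver such as Z3 could then check. The program, the safety property, and the encoding details are invented for illustration:

    ```python
    # Toy sketch: encode a straight-line program's semantics as an SMT-LIB2
    # constraint. Each assignment becomes an equality assertion; the negated
    # safety property is asserted, so a `sat` answer from a solver would
    # witness a property violation.

    def encode(assignments, property_violation):
        """assignments: list of (variable, SMT-LIB expression) pairs."""
        variables = sorted({var for var, _ in assignments})
        lines = [f"(declare-const {v} Int)" for v in variables]
        lines += [f"(assert (= {v} {expr}))" for v, expr in assignments]
        lines.append(f"(assert {property_violation})")
        lines.append("(check-sat)")
        return "\n".join(lines)

    # x := 3; y := x + 4; ask whether y can be negative (it cannot, so a
    # solver would answer `unsat`, i.e. the property holds on all runs).
    smt = encode([("x", "3"), ("y", "(+ x 4)")], "(< y 0)")
    print(smt)
    ```

    The framework in the thesis additionally conjoins imported API semantics as further assertions over the appropriate theories; this sketch shows only the core program-to-formula shape.
    
    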

    Doctor of Philosophy

    The trusted computing base (TCB) of a computer system comprises the components that must be trusted in order to support its security policy. Research communities have identified the well-known minimal-TCB principle: the TCB of a system should be as small as possible, so that it can be thoroughly examined and verified. This dissertation is an experiment showing how small the TCB for an isolation service based on software fault isolation (SFI) can be for small multitasking embedded systems. The TCB achieved by this dissertation includes just the formal definitions of isolation properties, instruction semantics, program logic, and a proof assistant, besides hardware. There is no compiler, assembler, verifier, rewriter, or operating system in the TCB. To the best of my knowledge, this is the smallest TCB that has ever been shown to guarantee nontrivial properties of real binary programs on real hardware. This is accomplished by combining SFI techniques with high-confidence formal verification. An SFI implementation inserts dynamic checks before dangerous operations, and these checks provide the invariants needed by the formal verification to prove theorems about the isolation properties of ARM binary programs. The high-confidence assurance of the formal verification comes from two facts. First, the verification is based on an existing realistic semantics of the ARM ISA that was independently developed by Cambridge researchers. Second, the verification is conducted in a higher-order proof assistant, the HOL theorem prover, which mechanically checks every verification step by rigorous logic. In addition, the entire verification process, including both specification generation and verification, is automatic. To support proof automation, a novel program logic has been designed, and an automatic reasoning framework for verifying shallow safety properties has been developed.
    The program logic integrates Hoare-style reasoning and Floyd's inductive-assertion reasoning in a small set of definitions, which overcomes shortcomings of Hoare logic and facilitates proof automation. All inference rules of the logic are proven from the instruction semantics and the logic definitions. The framework leverages abstract interpretation to automatically find the function specifications required by the program logic. The results of the abstract interpretation are used to construct the function specifications automatically, and the specifications are proven without human interaction by utilizing intermediate theorems generated during the abstract interpretation. All of these work in concert to create the very small TCB.
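    The core SFI move, a dynamic check inserted before a dangerous operation, can be sketched in miniature. The region layout, mask, and sizes below are illustrative assumptions, not the ARM scheme verified in the dissertation:

    ```python
    # Minimal SFI sketch: before a "dangerous" store, the target address is
    # masked so it provably lands inside the task's sandbox region. This
    # masked-address invariant is what a formal verifier can later rely on
    # when proving isolation. Layout is invented for illustration.

    SANDBOX_BASE = 0x20000000
    SANDBOX_MASK = 0x0000FFFF  # 64 KiB sandbox

    memory = {}

    def sfi_store(addr, value):
        """Inserted check + store: force addr into
        [SANDBOX_BASE, SANDBOX_BASE + 64 KiB)."""
        safe = SANDBOX_BASE | (addr & SANDBOX_MASK)
        memory[safe] = value
        return safe

    # An out-of-sandbox address is redirected into the sandbox.
    print(hex(sfi_store(0xDEADBEEF, 42)))  # 0x2000beef
    ```

    Because the masking happens unconditionally before every store, the isolation proof needs no reasoning about where the address came from, which is one reason the check-and-verify approach keeps the TCB small.
    
    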