
    National Aeronautics and Space Administration (NASA)/American Society for Engineering Education (ASEE) Summer Faculty Fellowship Program: 1995.

    The JSC NASA/ASEE Summer Faculty Fellowship Program was conducted at JSC, including the White Sands Test Facility, by Texas A&M University and JSC. The objectives of the program, which began nationally in 1964 and at JSC in 1965, are (1) to further the professional knowledge of qualified engineering and science faculty members; (2) to stimulate an exchange of ideas between participants and NASA; (3) to enrich and refresh the research and teaching activities of the participants' institutions; and (4) to contribute to the research objectives of the NASA centers. Each faculty fellow spent at least 10 weeks at JSC engaged in a research project in collaboration with a NASA/JSC colleague. In addition to the faculty participants, the 1995 program included five students. This document is a compilation of the final reports on the research projects completed by the faculty fellows and visiting students during the summer of 1995. The reports of two of the students are integrated with those of their respective fellows; three students wrote separate reports.

    Worst-input mutation approach to web services vulnerability testing based on SOAP messages

    The growing popularity and application of Web services have led to increased attention to the vulnerability of software based on these services. Vulnerability testing examines the trustworthiness of software systems and reduces their security risks; however, such testing of Web services has become increasingly challenging due to the cross-platform and heterogeneous characteristics of their deployment. This paper proposes a worst-input mutation approach for testing Web service vulnerability based on SOAP (Simple Object Access Protocol) messages. Based on characteristics of the SOAP messages, the proposed approach uses the farthest-neighbor concept to guide generation of the test suite. The test case generation algorithm is presented, and a prototype Web service vulnerability testing tool is described. The tool was applied to the testing of Web services on the Internet, with experimental results indicating that the proposed approach, which found more vulnerability faults than other related approaches, is both practical and effective.
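The farthest-neighbor idea the abstract mentions can be sketched as a greedy diversity heuristic: each new test case is the candidate whose minimum distance to the already-selected cases is largest. The vector encoding of mutated SOAP messages and the Euclidean distance below are illustrative assumptions, not the paper's actual algorithm:

```python
import math

def distance(a, b):
    """Euclidean distance between two test cases encoded as feature vectors
    (an assumed encoding of mutated SOAP messages)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def farthest_neighbor_select(candidates, k):
    """Greedily pick k test cases, each maximizing its minimum distance
    to the cases already selected (farthest-neighbor heuristic)."""
    selected = [candidates[0]]
    while len(selected) < k:
        best = max(
            (c for c in candidates if c not in selected),
            key=lambda c: min(distance(c, s) for s in selected),
        )
        selected.append(best)
    return selected
```

Selecting for maximum mutual distance spreads the test suite over the input space, which is the intuition behind using diversity to expose more vulnerability faults with fewer test cases.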

    Assessing coastal vulnerability: Advanced modeling methods and dynamic hydraulic characteristics of Gulf Coastal systems

    The United States coastline contains some of the most valued ecological resources, the most populated urban areas, the most complex infrastructure systems, the most prolific economic engines, and the busiest ports of trade. However important the coastline may be to our nation, the history of our coastal communities suggests that they are extremely vulnerable to natural disasters, including hurricane landfall. There are many potential reasons for this vulnerability, and several of them are considered in this work. The common goal of the research presented here is to better understand the hydrodynamic forces developed as hurricanes impact the coast, so that the resulting effects on coastal resources can be better understood and managed, and vulnerability can be significantly reduced. This work begins with consideration of the hydraulic domain at the interface between inland riverine and coastal environments. Regulators, and therefore those being regulated, generally prefer to separate riverine systems from coastal systems in the design and analysis of coastal infrastructure. Although this greatly simplifies analysis, it neglects important synergistic hydrodynamic effects that can dramatically reduce the ability of infrastructure to withstand hurricane impact. The research continues by evaluating how society delineates the coastal flood hazard. Current methods apply a deterministic, steady-state approach to defining this highly dynamic feature, which is influenced by multiple uncertain and variable parameters. By ignoring the variability inherent in the coastal floodplain, society cannot correctly define the flood hazard, and therefore cannot fully assess the risk to which it is exposed. A methodology is presented to more realistically quantify the coastal flood hazard and to calculate an appropriate flood risk metric. Finally, this research considers the reliability of a coastal community's water distribution system under hurricane impact. By understanding system vulnerability and system interdependence, community leaders can provide more reliable infrastructure systems, thereby reducing the magnitude of disaster and shortening the recovery time. A methodology is presented to quantify the reliability of a water system under several hurricane impact scenarios.
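The contrast drawn above, a probabilistic flood hazard metric versus a deterministic steady-state delineation, can be illustrated with a minimal Monte Carlo sketch. The distributions and parameter values below are invented for illustration only and are not the calibrated models used in the dissertation:

```python
import random

def flood_exceedance_probability(threshold_ft, n_trials=100_000, seed=42):
    """Monte Carlo sketch: probability that combined storm surge and tide
    exceed a given flood elevation, treating both as uncertain variables
    (illustrative, uncalibrated distributions)."""
    rng = random.Random(seed)
    exceed = 0
    for _ in range(n_trials):
        surge = rng.gauss(8.0, 2.5)    # hypothetical surge height, ft
        tide = rng.uniform(-1.0, 1.0)  # hypothetical tidal variation, ft
        if surge + tide > threshold_ft:
            exceed += 1
    return exceed / n_trials
```

A deterministic approach would return a single yes/no answer at the threshold; the probabilistic version yields an exceedance probability that can feed directly into a flood risk metric.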

    Quality modelling and metrics of Web-based information systems

    In recent years, the World Wide Web has become a major platform for software applications. Web-based information systems are involved in many areas of everyday life, such as education, entertainment, business, manufacturing, and communication. Because web-based systems are usually distributed, multimedia, interactive, and cooperative, and their production processes usually follow ad hoc approaches, the quality of web-based systems has become a major concern. Existing quality models and metrics do not fully satisfy the needs of quality management of Web-based systems. This study applied and adapted software quality engineering methods and principles to address the following issues: a quality modeling method for deriving quality models of Web-based information systems; and the development, implementation, and validation of quality metrics for key quality attributes of Web-based information systems, including navigability and timeliness. The quality modeling method proposed in this study has the following strengths: it is more objective and rigorous than existing approaches; the quality analysis can be conducted on the design in an early stage of the system life cycle; and it is easy to use and can provide insight into improving the design of systems. Results of case studies demonstrated that the quality modeling method is applicable and practical, and practitioners can use it to develop their own quality models. This study is amongst the first comprehensive attempts to develop quality measurement for Web-based information systems. First, it identified the relationship between website structural complexity and navigability; quality metrics of navigability were defined, investigated, and implemented, and empirical studies were conducted to evaluate them. Second, the study investigated website timeliness and attempted to find direct and indirect measures for this quality attribute; empirical studies validating these metrics were also conducted. The study also suggests four areas of future research that may be fruitful.
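One simple way structural complexity can connect to navigability is through click distance over the site's link graph: the fewer clicks it takes to reach pages from the home page, the more navigable the site. The metric below, mean shortest-path length via breadth-first search, is only an illustrative proxy, not one of the metrics defined in the thesis:

```python
from collections import deque

def avg_click_distance(links, home):
    """Mean shortest-path (click) distance from the home page to every
    reachable page, computed by BFS over the site's link graph.
    `links` maps each page to the pages it links to."""
    dist = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for nxt in links.get(page, []):
            if nxt not in dist:
                dist[nxt] = dist[page] + 1
                queue.append(nxt)
    reachable = [d for p, d in dist.items() if p != home]
    return sum(reachable) / len(reachable) if reachable else 0.0
```

On this proxy, adding cross-links or a site map lowers the average distance, matching the intuition that a flatter link structure is easier to navigate.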

    Expert system verification and validation study: ES V/V Workshop

    The primary purpose of this document is to build a foundation for applying principles of verification and validation (V&V) to expert systems. To achieve this, some background in V&V as applied to conventionally implemented software is required. Part one discusses the background of V&V from the perspective of (1) what V&V of software is and (2) V&V's role in developing software. Part one also overviews some common analysis techniques applied when performing V&V of software. All of these materials are presented on the assumption that the reader has little or no background in V&V or in developing procedural software. The primary purpose of part two is to explain the major techniques that have been developed for V&V of expert systems.

    Essays on recruitment, training, and incentives

    Thesis (Ph. D.)--Massachusetts Institute of Technology, Sloan School of Management, 2013. Cataloged from PDF version of thesis. Includes bibliographical references. This thesis is composed of three papers, each relating to labor market imperfections and their implications for firms' staffing practices. In the first paper, I examine why hospitals provide direct financial support to nursing schools and faculty. This support is striking because nursing education is clearly general, clearly paid for by the firm, and information asymmetries appear minimal. Using AHA and survey data, I find hospitals employing a greater share of their MSA's registered nurses are more likely to provide such support, net of size and other institutional controls. I interpret this result as evidence that technologically general skills training may be made de facto specific by mobility frictions. In the second paper, I present a theory of couples' job search whereby women sort into lower-paying, geographically dispersed occupations due to expectations of future spouses' geographically clustered occupations and (thereby) inability to relocate for work. Results confirm men segregate into geographically clustered occupations, and that these occupations involve more frequent early-career relocations for both sexes. I also find that the minority of men and women who depart from this equilibrium experience delayed marriage, higher divorce rates, and lower earnings. Results are consistent with the theory's implication that marriage and mobility expectations foment a self-fulfilling pattern of occupational segregation, with individual departures deterred by earnings and marriage penalties. In the third paper, I examine the use and misuse of authority and incentives in organizational hierarchies. Through a principal-supervisor-agent model inspired by sales settings, I propose organizations delegate authority over salespeople to front-line sales managers because they can decompose performance measures into ability and luck. The model yields the result that managers on the cusp of a quota have a unique personal incentive to retain and adjust quotas for poor-performing subordinates, permitting me to distinguish managers' interests from those of the firm. I parametrically estimate the model using detailed person-transaction-level microdata from 244 firms that subscribe to a "cloud"-based service for automating transaction processing and compensation. I estimate that 13-15% of quota adjustments and retentions among poor performers are explained by managers' unique personal interest in meeting a quota. I use agency theory to evaluate firms' mitigation practices. by Alan Benson, Ph.D.

    Diabetic retinopathy screening and treatment


    Evaluation of a task performance resource constraint model to assess the impact of offshore emergency management on risk reduction

    In this age of safety awareness, technological emergencies still happen, occasionally with catastrophic results. Often human intervention is the only way of averting disaster. Ensuring that the chosen emergency managers are competent requires a combination of training and assessment. However, assessment currently relies on expert judgement of behaviour rather than its impact on outcome, so it would be difficult to incorporate such data into formal Quantitative Risk Assessments (QRA). Although there is, as yet, no suitable alternative to expert judgement, there is a need for methods of quantifying the impact of emergency management on risk reduction in accidents and incidents. The Task Performance Resource Constraint (TPRC) model is capable of representing the critical factors. It calculates the probability of task success with respect to time, based on uncertainties associated with the task and resource variables. The results can then be used to assess management performance based on the physical outcome of the emergency, thereby providing a measure of the impact of emergency management on risk with a high degree of objectivity. Data obtained from training exercises for offshore and onshore emergency management were measured and successfully used with the TPRC model. The resulting probability-of-success functions also demonstrated a high level of external validity when used with improvements in emergency management, design changes, or real data from the Piper Alpha disaster. The model also appeared to have more external validity than other HRA/QRA techniques because it uses physical data, which have a greater influence on outcome than psychological changes - though this could be because current HRA/QRA techniques view human unreliability as probability of error rather than probability of failure. The simulation data were also used to build up distributions of timings for simple emergency management tasks. Using additional theoretical data, this demonstrated the model's potential for assessing the probability of success for novel situations and future designs.
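The core TPRC calculation described above, probability of task success when both the task duration and the available time (the resource constraint) are uncertain, can be sketched by Monte Carlo sampling. The distribution shapes and parameters below are illustrative assumptions, not the model or data from the thesis:

```python
import random

def p_task_success(task_time_dist, available_time_dist, n=50_000, seed=1):
    """Sketch of a TPRC-style calculation: probability that a task
    completes within the time the emergency allows, with both durations
    drawn from (illustrative) uncertainty distributions."""
    rng = random.Random(seed)
    successes = sum(
        1 for _ in range(n)
        if task_time_dist(rng) <= available_time_dist(rng)
    )
    return successes / n

# Hypothetical example: a task taking ~10 min against a ~15 min window.
p = p_task_success(
    lambda r: r.gauss(10.0, 2.0),   # task completion time, minutes
    lambda r: r.gauss(15.0, 3.0),   # time available before escalation
)
```

Because the result is a probability over a physical outcome rather than a judgement of behaviour, it can be compared across management improvements or design changes, which is the objectivity the abstract emphasizes.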

    Expert system verification and validation study: Workshop and presentation material

    Workshop and presentation material are included. Following an introduction, the basic concepts, techniques, and guidelines are discussed. Handouts and worksheets are included.