
    Re-examining the Information Systems Security Problem from a Systems Theory Perspective

    This theoretical paper discusses a recent shift in cyber attackers' interest away from traditional network and operating system vulnerabilities and towards application-level security flaws in end-user systems. The authors argue that this shift signals a strong need to re-examine the way that security is addressed during the systems development process. Most of the systems development methodologies currently in use contain no formal processes for dealing with the interconnected complexity and risks of today's computing environments. Using systems theory as a theoretical lens, the fundamental processes of current systems development methodologies are analyzed and weaknesses in their ability to deal with these environmental factors are discussed. The authors then present a proposed holistic framework for integrating security into existing systems development methods. The paper concludes with a discussion of the need for more scholarly research in this area and offers suggestions for future research directions.

    DataGauge: A Practical Process for Systematically Designing and Implementing Quality Assessments of Repurposed Clinical Data

    The well-known hazards of repurposing data make Data Quality (DQ) assessment a vital step towards ensuring valid results regardless of analytical methods. However, there is no systematic process for implementing DQ assessments for secondary uses of clinical data. This paper presents DataGauge, a systematic process for designing and implementing DQ assessments to evaluate repurposed data for a specific secondary use. DataGauge is composed of five steps: (1) define information needs, (2) develop a formal Data Needs Model (DNM), (3) use the DNM and DQ theory to develop goal-specific DQ assessment requirements, (4) extract DNM-specified data, and (5) evaluate it against the DQ requirements. DataGauge's main contribution is integrating general DQ theory and DQ assessment methods into a systematic process. This process supports the integration and practical implementation of existing Electronic Health Record-specific DQ assessment guidelines. DataGauge also provides an initial theory-based guidance framework that ties the DNM to DQ testing methods for each DQ dimension to aid the design of DQ assessments. This framework can be augmented with existing DQ guidelines to enable systematic assessment. DataGauge sets the stage for future systematic DQ assessment research by defining an assessment process capable of adapting to a broad range of clinical datasets and secondary uses, and it opens new research directions such as DQ theory integration, DQ requirements portability, DQ assessment tool development, and DQ assessment tool usability.
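
    The abstract defines DataGauge only at the process level. As a concrete illustration, here is a minimal Python sketch of the five steps applied to a toy clinical extract; the field name, the completeness and plausibility checks, and the data structures are all illustrative assumptions, not part of the published method.

```python
from dataclasses import dataclass, field

# Steps 1-2: information needs captured as a formal Data Needs Model (DNM).
@dataclass
class DataNeed:
    variable: str  # hypothetical field name, e.g. "systolic_bp"
    dq_checks: list = field(default_factory=list)  # Step 3: goal-specific DQ requirements

def completeness(values):
    """DQ dimension: completeness -- fraction of non-missing values."""
    return sum(v is not None for v in values) / len(values)

def plausible_range(lo, hi):
    """DQ dimension: plausibility -- fraction of present values inside [lo, hi]."""
    def check(values):
        present = [v for v in values if v is not None]
        return sum(lo <= v <= hi for v in present) / len(present)
    check.__name__ = f"plausible_range_{lo}_{hi}"
    return check

# Step 4: extract only the DNM-specified variables from the repurposed source.
def extract(records, dnm):
    return {need.variable: [r.get(need.variable) for r in records] for need in dnm}

# Step 5: evaluate the extract against each DQ requirement.
def evaluate(extracted, dnm):
    return {(need.variable, check.__name__): check(extracted[need.variable])
            for need in dnm for check in need.dq_checks}

records = [{"systolic_bp": 120}, {"systolic_bp": None}, {"systolic_bp": 400}]
dnm = [DataNeed("systolic_bp", [completeness, plausible_range(60, 260)])]
print(evaluate(extract(records, dnm), dnm))
# roughly: completeness 0.67, plausible_range_60_260 0.5
```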

    Faculty Perspectives on Effective Integration of Simulation into a Baccalaureate Nursing Curriculum

    Research shows that use of high fidelity simulation (HFS) as a teaching strategy requires extensive amounts of faculty time and financial resources for faculty development and equipment. This project study addressed the challenges encountered in the integration of HFS into a Midwestern metropolitan baccalaureate nursing program. The purpose of this qualitative case study was to explore perceptions of nursing faculty about best practice elements for successful integration of HFS into undergraduate nursing programs. Guiding questions were developed using Donabedian's structure-process-outcome model and focused on faculty perceptions related to successful implementation of simulation in their programs. Purposeful sampling was used to select 22 faculty who had integrated HFS into 5 regional baccalaureate nursing programs in metropolitan areas of 2 Midwestern states. Nine participants completed an online interview tool developed by the researcher and designed to elicit responses to open-ended questions about barriers encountered, methods used to overcome those barriers, first impressions about conducting HFS, perceptions of successful integration, and incentives to using HFS. Data were coded and analyzed to identify themes. Emergent themes included the need to identify specific courses for HFS, ensure participation of faculty teaching didactic courses, use nationally recognized principles for HFS implementation, implement consistent methods of debriefing, and use formal written plans. Findings from the study were used to design a staff development initiative to facilitate planning and establishment of HFS in a nursing curriculum. Positive social change may occur when faculty and administrators use project guidelines to develop sound practices for integrating HFS into the nursing curriculum.

    SPEEDY: An Eclipse-based IDE for invariant inference

    SPEEDY is an Eclipse-based IDE for exploring techniques that assist users in generating correct specifications, particularly invariant inference algorithms and tools. It integrates with several back-end tools that propose invariants and will incorporate published algorithms for inferring object and loop invariants. Though the architecture is language-neutral, SPEEDY currently targets C programs. Building and using SPEEDY has confirmed earlier experience demonstrating the importance of showing and editing specifications in the IDEs that developers customarily use, automating as much of the production and checking of specifications as possible, and showing counterexample information directly in the source code editing environment. As in previous work, automation of specification checking is provided by back-end SMT solvers. However, reducing the effort demanded of software developers using formal methods also requires a GUI design that guides users in writing, reviewing, and correcting specifications and automates specification inference. Comment: In Proceedings F-IDE 2014, arXiv:1404.578
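
    The abstract does not show how a back-end SMT solver checks a proposed invariant. The following Python sketch illustrates the standard encoding (an initiation check plus a consecution check) using Z3's Python bindings; the loop, the candidate invariant, and the variable names are illustrative assumptions, and SPEEDY's actual back ends and encodings may differ.

```python
from z3 import Int, And, Implies, Not, Solver, unsat

# Hypothetical C loop under analysis:
#   i = 0; s = 0;
#   while (i < n) { s = s + i; i = i + 1; }
# Candidate invariant proposed by an inference tool: 0 <= i && 0 <= s
i, s, n = Int("i"), Int("s"), Int("n")
inv = And(i >= 0, s >= 0)

def is_valid(claim):
    """A formula is valid iff its negation is unsatisfiable."""
    solver = Solver()
    solver.add(Not(claim))
    return solver.check() == unsat  # a model here would be a counterexample

# Initiation: the invariant holds on loop entry (i == 0, s == 0).
initiation = Implies(And(i == 0, s == 0), inv)

# Consecution: one loop iteration preserves the invariant.
i2, s2 = Int("i2"), Int("s2")  # post-iteration values
inv_post = And(i2 >= 0, s2 >= 0)
consecution = Implies(And(inv, i < n, s2 == s + i, i2 == i + 1), inv_post)

print("initiation holds: ", is_valid(initiation))   # True
print("consecution holds:", is_valid(consecution))  # True
```

    When a check fails, the satisfying model of the negated formula is exactly the counterexample information an IDE like SPEEDY can surface directly in the source editor.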

    A pattern-based approach to a cell tracking ontology

    Time-lapse microscopy has thoroughly transformed our understanding of biological motion and developmental dynamics, from single cells to entire organisms. The increasing amount of cell tracking data demands tools that make extracted data searchable and interoperable across experiments and data types. To address this problem, this paper reports on progress in building the Cell Tracking Ontology (CTO): an ontology framework for describing, querying, and integrating data from complementary experimental techniques in the domain of cell tracking experiments. CTO is based on a basic knowledge structure, the cellular genealogy, which serves as a backbone model for integrating specific biological ontologies into tracking data. As a first step we integrate the Phenotype and Trait Ontology (PATO), one of the most relevant ontologies for annotating cell tracking experiments. The CTO requires both the integration of data at various levels of generality and the proper structuring of the collected information. To give the ontology a sound foundation, we have therefore built on the rich body of work on top-level ontologies and established three generic ontology design patterns that address three modeling challenges in representing cellular genealogies: representing entities that exist in time, that undergo changes over time, and that are organized into more complex structures such as situations.
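
    As one concrete reading of the backbone model, here is a minimal Python sketch of a cellular genealogy: cells as lineage-tree nodes whose time-stamped states carry phenotype annotations. The class names and the PATO identifier below are illustrative placeholders, not the CTO's actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class CellState:
    """A cell's observed state at one time point (an entity changing in time)."""
    frame: int
    x: float
    y: float
    phenotypes: list = field(default_factory=list)  # PATO term IDs

@dataclass
class Cell:
    """One tracked cell; daughter links turn division events into a genealogy."""
    cell_id: str
    states: list = field(default_factory=list)
    daughters: list = field(default_factory=list)

def lineage(cell, depth=0):
    """Yield (depth, cell) pairs for the whole genealogy rooted at `cell`."""
    yield depth, cell
    for daughter in cell.daughters:
        yield from lineage(daughter, depth + 1)

root = Cell("c1", [CellState(0, 1.0, 2.0, ["PATO:0000001"])])  # placeholder PATO ID
root.daughters = [Cell("c1.1"), Cell("c1.2")]  # a division event

for depth, cell in lineage(root):
    print("  " * depth + cell.cell_id)
```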

    Transitioning Applications to Semantic Web Services: An Automated Formal Approach

    Semantic Web Services have been recognized as a promising technology with huge commercial potential, attracting significant attention from both industry and the research community. Despite high expectations, industrial take-up of Semantic Web Service technologies has been slower than expected. One of the main reasons is that many systems were developed without considering the potential of the web for integrating services and sharing resources. Without a systematic methodology and proper tool support, the migration from legacy systems to Semantic Web Service-based systems can be a very tedious and expensive process that carries a definite risk of failure. There is an urgent need for strategies that allow the migration of legacy systems to Semantic Web Services platforms, and for tools to support such strategies. In this paper we propose a methodology for transitioning these applications to Semantic Web Services by taking advantage of rigorous mathematical methods. Our methodology allows users to migrate their applications to a Semantic Web Services platform automatically or semi-automatically.
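
    The abstract describes the methodology only in outline. As one small illustration of the kind of step such tooling can automate, the sketch below derives a skeletal, OWL-S-flavored service description from a legacy function's signature by introspection; the legacy function, the hasInput/hasOutput vocabulary, and the mapping are assumptions for illustration, and the paper's formal method is not reproduced here.

```python
import inspect

def get_quote(symbol: str, currency: str = "USD") -> float:
    """A hypothetical legacy application function to be exposed as a service."""
    return 42.0  # placeholder body

def describe(fn):
    """Derive a skeletal semantic service description from a function signature."""
    sig = inspect.signature(fn)
    lines = [f"Service: {fn.__name__}"]
    for name, param in sig.parameters.items():
        ann = param.annotation
        typ = ann.__name__ if ann is not inspect.Parameter.empty else "unknown"
        lines.append(f"  hasInput: {name} (type: {typ})")
    ret = sig.return_annotation
    typ = ret.__name__ if ret is not inspect.Signature.empty else "unknown"
    lines.append(f"  hasOutput: result (type: {typ})")
    return "\n".join(lines)

print(describe(get_quote))
# Service: get_quote
#   hasInput: symbol (type: str)
#   hasInput: currency (type: str)
#   hasOutput: result (type: float)
```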

    Pernicious assimilation: reframing the integration of the urban informal economy in Southern Africa

    This paper argues that many of the official attempts to "integrate" the urban informal economy into the mainstream economy are fundamentally flawed. An unpacking of the "integrative" agenda as pursued by planning and other governmental practices reveals that "integration", as currently practiced, does not herald the mainstreaming of the informal economy. Drawing on research in Zimbabwe and evidence from other countries in Southern Africa, I argue that what we witness is a sinister stripping away of the lifeblood of informality. This malicious form of integration entails crippling Faustian bargains. In the end, this pernicious assimilation insidiously does away with that which makes informality a livelihood haven for the majority of urbanites. I conclude that this duplicitous integration is unworkable and leaves the big questions of inclusion untouched, hence the persistence of the "problem" of informality.