
    Process of designing robust, dependable, safe and secure software for medical devices: Point of care testing device as a case study

    This paper presents a holistic methodology for the design of medical device software, which encompasses a new way of eliciting requirements, a system design process, security design guidelines, a cloud architecture design, a combinatorial testing process, and agile project management. The paper uses point of care diagnostics as a case study, where the software and hardware must be robust and reliable to provide accurate diagnosis of diseases. As software and software-intensive systems become increasingly complex, the impact of failures can lead to significant property or environmental damage. Within the medical diagnostic device software domain, such failures can result in misdiagnosis, leading to clinical complications and in some cases death. Software faults can arise from the interaction among the software, the hardware, third-party software, and the operating environment. Unanticipated environmental changes and latent coding errors lead to operational faults despite the significant effort usually expended on the design, verification, and validation of the software system. It is becoming increasingly apparent that different approaches are needed to guarantee that a complex software system meets all safety, security, and reliability requirements, in addition to complying with standards such as IEC 62304. Many initiatives have been taken to develop safety- and security-critical systems, at different development phases and in different contexts, ranging from infrastructure design to device design, and different approaches are implemented to design error-free software for safety-critical systems. By adopting the strategies and processes presented in this paper, one can overcome the challenges in developing error-free software for medical devices and other safety-critical systems.
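    One element of the methodology above is a combinatorial testing process. A minimal sketch of 2-way (pairwise) test generation using a greedy all-pairs heuristic; the parameters and values are hypothetical illustrations, not taken from the paper:

    ```python
    # Greedy pairwise (2-way) combinatorial test generation: pick test
    # cases until every value pair of every two parameters is covered.
    from itertools import combinations, product

    def pairwise_suite(parameters):
        names = list(parameters)
        # All parameter-value pairs that must appear in some test case.
        uncovered = {
            (a, va, b, vb)
            for a, b in combinations(names, 2)
            for va in parameters[a]
            for vb in parameters[b]
        }
        suite = []
        while uncovered:
            # Choose the full assignment covering the most uncovered pairs.
            best = max(
                product(*(parameters[n] for n in names)),
                key=lambda vals: sum(
                    (names[i], vals[i], names[j], vals[j]) in uncovered
                    for i, j in combinations(range(len(names)), 2)
                ),
            )
            case = dict(zip(names, best))
            suite.append(case)
            uncovered -= {
                (a, case[a], b, case[b]) for a, b in combinations(names, 2)
            }
        return suite

    if __name__ == "__main__":
        # Hypothetical test parameters for a point-of-care device build.
        params = {
            "os": ["embedded-linux", "rtos"],
            "connectivity": ["offline", "cloud"],
            "assay": ["pathogen", "control"],
        }
        for case in pairwise_suite(params):
            print(case)
    ```

    A greedy suite like this covers all value pairs in far fewer runs than exhaustive testing, which is the usual motivation for combinatorial testing of device configurations.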

    On Dimer Models and Closed String Theories

    We study some aspects of the recently discovered connection between dimer models and D-brane gauge theories. We argue that dimer models are also naturally related to closed string theories on non-compact orbifolds of C^2 and C^3, via their twisted-sector R-charges, and show that perfect matchings in dimer models correspond to twisted-sector states in the closed string theory. We also use this formalism to study the combinatorics of some unstable orbifolds of C^2.
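    The correspondence between perfect matchings and twisted-sector states is combinatorial at heart: the number of perfect matchings of a bipartite graph equals the permanent of its biadjacency matrix. A minimal sketch on a toy graph (an illustrative stand-in, not one of the paper's orbifold dimer models):

    ```python
    # Count perfect matchings of a bipartite graph via the permanent of
    # its biadjacency matrix (brute force; fine for tiny matrices).
    from itertools import permutations
    from math import prod

    def permanent(m):
        n = len(m)
        return sum(prod(m[i][p[i]] for i in range(n)) for p in permutations(range(n)))

    if __name__ == "__main__":
        # Toy bipartite graph: 3 white and 3 black nodes; entry [i][j] = 1
        # when white node i is joined to black node j.
        b = [[1, 1, 0],
             [0, 1, 1],
             [1, 0, 1]]
        print(permanent(b))  # -> 2 perfect matchings
    ```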

    MartiTracks: A Geometrical Approach for Identifying Geographical Patterns of Distribution

    Panbiogeography represents an evolutionary approach to biogeography, using rational, cost-efficient methods to reduce the initial complexity of locality data and depict general distribution patterns. However, few quantitative, automated panbiogeographic methods exist. In this study, we propose a new algorithm, within a quantitative, geometrical framework, to perform panbiogeographical analyses as an alternative to more traditional methods. The algorithm first calculates a minimum spanning tree, an individual track for each species in a panbiogeographic context. Then the spatial congruence among segments of the minimum spanning trees is calculated using five congruence parameters, producing a general distribution pattern. In addition, the algorithm removes the ambiguity and subjectivity often present in a manual panbiogeographic analysis. Results from two empirical examples, using 61 species of the genus Bomarea (2,340 records) and 1,031 genera of both plants and animals (100,118 records) distributed across the Northern Andes, demonstrated that a geometrical approach to panbiogeography is a feasible quantitative method to determine general distribution patterns for taxa, reducing complexity and the time needed for managing large data sets.
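    The first step of the algorithm, an individual track built as a minimum spanning tree over one species' locality records, can be sketched as follows. Coordinates are treated as planar points for brevity and the localities are hypothetical; the published method works on geographic data and adds the five congruence tests:

    ```python
    # Prim's algorithm: the individual track of a species as the minimum
    # spanning tree over its 2-D locality records.
    from math import dist

    def individual_track(points):
        in_tree = {0}
        edges = []
        while len(in_tree) < len(points):
            # Cheapest edge joining the tree to a locality outside it.
            i, j = min(
                ((a, b) for a in in_tree
                 for b in range(len(points)) if b not in in_tree),
                key=lambda e: dist(points[e[0]], points[e[1]]),
            )
            edges.append((i, j))
            in_tree.add(j)
        return edges

    if __name__ == "__main__":
        # Hypothetical locality records (longitude, latitude) for one species.
        localities = [(-75.2, 4.6), (-74.8, 5.0), (-73.9, 4.1), (-75.6, 6.2)]
        print(individual_track(localities))
    ```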

    Surface Rupture of the November 2002 M7.9 Denali Fault Earthquake, Alaska, and Comparison to Other Strike-Slip Ruptures

    On November 3, 2002, a moment-magnitude (Mw) 7.9 earthquake produced 340 km of surface rupture on the Denali fault and two related faults in central Alaska. The rupture, which proceeded from west to east, began with a 40-km-long break on a previously unknown thrust fault. Estimates of surface slip on this thrust were 3–6 m. Next came the principal surface break, along 220 km of the Denali fault. There, right-lateral offset averaged almost 5 m and increased eastward to a maximum of nearly 9 m. Finally, slip turned southeastward onto the Totschunda fault, where dextral offsets up to 3 m continued for another 70 km. This three-part rupture ranks among the longest documented strike-slip events of the past two centuries. The surface-slip distribution supports and clarifies models of seismological and geodetic data that indicated initial thrusting followed by right-lateral strike slip, with the largest moment release near the east end of the Denali fault. The Denali fault ruptured beneath the Trans-Alaska oil pipeline. The pipeline withstood almost 6 m of lateral offset, because engineers designed it to survive such offsets based on pre-construction geological studies. The Denali fault earthquake was typical of large-magnitude earthquakes on major intracontinental strike-slip faults in the length of the rupture, the multiple fault strands that ruptured, and the variable slip along strike.

    Surface Rupture and Slip Distribution of the Denali and Totschunda Faults

    The 3 November 2002 Denali fault, Alaska, earthquake resulted in 341 km of surface rupture on the Susitna Glacier, Denali, and Totschunda faults. The rupture proceeded from west to east and began with a 48-km-long break on the previously unknown Susitna Glacier thrust fault. Slip on this thrust averaged about 4 m (Crone et al., 2004). Next came the principal surface break, along 226 km of the Denali fault, with average right-lateral offsets of 4.5–5.1 m and a maximum offset of 8.8 m near its eastern end. The Denali fault trace is commonly left stepping and north side up. About 99 km of the fault ruptured through glacier ice, where the trace orientation was commonly influenced by local ice fabric. Finally, slip transferred southeastward onto the Totschunda fault and continued for another 66 km, where dextral offsets average 1.6–1.8 m. The transition from the Denali fault to the Totschunda fault occurs over a complex 25-km-long transfer zone of right-slip and normal fault traces. Three methods of calculating average surface slip all yield a moment magnitude of Mw 7.8, in very good agreement with the seismologically determined magnitude of M 7.9. A comparison of strong-motion inversions for moment release with our slip distribution shows they have a similar pattern. The locations of the two largest pulses of moment release correlate with the locations of increasing steps in the average values of observed slip. This suggests that slip-distribution data can be used to infer moment release along other active fault traces.
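    The Mw 7.8 figure quoted above comes from converting average surface slip into seismic moment via the standard relation M0 = mu * area * slip and Mw = (2/3)(log10 M0 − 9.1), with M0 in N·m. A minimal sketch of that conversion; the crustal rigidity and 15 km seismogenic depth are generic assumed values, not the paper's specific parameters:

    ```python
    # Moment magnitude from average surface slip: M0 = mu * A * D,
    # Mw = (2/3) * (log10(M0) - 9.1), with M0 in newton-metres.
    from math import log10

    def moment_magnitude(length_km, depth_km, avg_slip_m, rigidity_pa=3.0e10):
        m0 = rigidity_pa * (length_km * 1e3) * (depth_km * 1e3) * avg_slip_m
        return (2.0 / 3.0) * (log10(m0) - 9.1)

    if __name__ == "__main__":
        # 341 km of rupture, ~4 m average slip, assumed 15 km depth.
        print(f"Mw = {moment_magnitude(341, 15, 4.0):.1f}")  # -> Mw = 7.8
    ```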

    On the geometry of C^3/D_27 and del Pezzo surfaces

    We clarify some aspects of the geometry of a resolution of the orbifold X = C^3/D_27, the noncompact complex manifold underlying the brane quiver standard model recently proposed by Verlinde and Wijnholt. We explicitly realize a map between X and the total space of the canonical bundle over a degree 1 quasi del Pezzo surface, thus defining a desingularization of X. Our analysis relies essentially on the relationship existing between the normalizer group of D_27 and the Hessian group, and on the study of the behaviour of the Hesse pencil of plane cubic curves under the quotient.

    RPA using a multiplexed cartridge for low cost point of care diagnostics in the field

    A point of care device utilising Lab-on-a-Chip technologies that is applicable to biological pathogens was designed, fabricated, and tested, demonstrating sample-in-to-answer-out capability. The purpose of the design was to develop a cartridge with the capability to perform nucleic acid extraction and purification from a sample using a chitosan membrane at an acidic pH. Waste was stored within the cartridge with the use of sodium polyacrylate to solidify or gelate the sample in a single chamber. Nucleic acid elution was conducted using the RPA amplification reagents (alkaline pH). Passive valves were used to regulate the fluid flow, and a multiplexer was designed to distribute the fluid into six microchambers for amplification reactions. Cartridges were produced using soft lithography of silicone from 3D-printed moulds, bonded to glass substrates. The isothermal technique RPA (recombinase polymerase amplification) is employed for amplification. This paper shows the results from two separate experiments: the first using the RPA control nucleic acid, the second showing successful amplification from Chlamydia trachomatis. Endpoint analysis for the RPA control was gel electrophoresis, which showed that a 143 base pair DNA product was amplified successfully for positive samples, whilst negative samples did not show amplification. Endpoint analysis for Chlamydia trachomatis samples was fluorescence detection, which showed successful detection of 1 copy/μL and 10 copies/μL spiked into an MES buffer.

    Hierarchical case-based reasoning to support knitwear design

    Knitwear design is a creative activity that is hard to automate using the computer. The production of the associated knitting pattern, however, is repetitive, time-consuming, and error-prone, calling for automation. Our objectives are twofold: to facilitate the design and to ease the burden of calculations and checks in pattern production. We conduct a feasibility study for applying case-based reasoning in knitwear design: we describe appropriate methods and show their application.
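    The heart of case-based reasoning is retrieving the stored design most similar to a new query. A minimal sketch of weighted nearest-neighbour retrieval; the features and weights are hypothetical, and the paper's hierarchical system goes further by adapting the retrieved pattern's calculations to the new design:

    ```python
    # Case-based reasoning, retrieval step: return the stored knitwear
    # case with the highest weighted feature similarity to the query.
    def retrieve(query, case_base, weights):
        def similarity(case):
            return sum(
                w * (1.0 if case[f] == query[f] else 0.0)
                for f, w in weights.items()
            )
        return max(case_base, key=similarity)

    if __name__ == "__main__":
        # Hypothetical case base of past designs.
        cases = [
            {"garment": "sweater", "stitch": "cable", "yarn": "wool"},
            {"garment": "cardigan", "stitch": "rib", "yarn": "cotton"},
        ]
        query = {"garment": "sweater", "stitch": "rib", "yarn": "wool"}
        weights = {"garment": 0.5, "stitch": 0.3, "yarn": 0.2}
        print(retrieve(query, cases, weights))  # -> the sweater case
    ```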

    Face-space: A unifying concept in face recognition research

    The concept of a multidimensional psychological space, in which faces can be represented according to their perceived properties, is fundamental to the modern theorist in face processing. Yet the idea was not clearly expressed until 1991. The background that led to the development of face-space is explained, and its continuing influence on theories of face processing is discussed. Research that has explored the properties of the face-space and sought to understand caricature, including facial adaptation paradigms, is reviewed. Face-space as a theoretical framework for understanding the effect of ethnicity and the development of face recognition is evaluated. Finally, two applications of face-space in the forensic setting are discussed. From initially being presented as a model to explain distinctiveness, inversion, and the effect of ethnicity, face-space has become a central pillar in many aspects of face processing. It is currently being developed to help us understand adaptation effects with faces. While in principle a simple concept, face-space has shaped, and continues to shape, our understanding of face perception.
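    A caricature in face-space is simply a point pushed further from the average face along its own offset, and an anti-caricature is the same point pulled toward the norm. A minimal sketch; the 4-D vectors are toy stand-ins for real face-space coordinates:

    ```python
    # Caricaturing in face-space: exaggerate a face's offset from the norm.
    import numpy as np

    def caricature(face, norm, k):
        """k > 1 exaggerates distinctiveness; 0 < k < 1 moves toward the norm."""
        return norm + k * (face - norm)

    if __name__ == "__main__":
        norm = np.zeros(4)                       # average face at the origin
        face = np.array([0.4, -0.2, 0.1, 0.3])   # a moderately distinctive face
        print(caricature(face, norm, 1.5))       # caricature
        print(caricature(face, norm, 0.5))       # anti-caricature
    ```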

    Evolving faces from principal components

    A system that uses an underlying genetic algorithm to evolve faces in response to user selection is described. The descriptions of faces used by the system are derived from a statistical analysis of a set of faces. The faces used for generation are transformed to an average shape by defining locations around each face and morphing. The shape-free images and shape vectors are then separately subjected to principal components analysis. Novel faces are generated by recombining the image components ("eigenfaces") and then morphing their shape according to the principal components of the shape vectors ("eigenshapes"). The prototype system indicates that such statistical analysis of a set of faces can produce plausible, randomly generated photographic images.
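    The generation step can be sketched as reconstructing an image from principal-component coefficients and evolving those coefficients under selection. The shapes and random data below are stand-ins for a real eigenface/eigenshape decomposition, and the uniform-crossover operator is an illustrative choice, not necessarily the paper's exact genetic algorithm:

    ```python
    # Generate novel faces from PCA coefficients and evolve the coefficient
    # vectors with crossover and mutation (user selection would pick parents).
    import numpy as np

    rng = np.random.default_rng(0)

    def generate(mean_face, eigenfaces, sdevs, coeffs):
        """Reconstruct an image vector from per-component PCA coefficients."""
        return mean_face + eigenfaces.T @ (coeffs * sdevs)

    def crossover(parent_a, parent_b, mutation=0.1):
        """Uniform crossover of coefficient vectors with Gaussian mutation."""
        mask = rng.random(parent_a.shape) < 0.5
        child = np.where(mask, parent_a, parent_b)
        return child + mutation * rng.standard_normal(child.shape)

    if __name__ == "__main__":
        n_pixels, n_components = 64 * 64, 20
        mean_face = rng.random(n_pixels)                         # toy average face
        eigenfaces = rng.standard_normal((n_components, n_pixels))
        sdevs = np.linspace(1.0, 0.1, n_components)              # component spread
        a = rng.standard_normal(n_components)
        b = rng.standard_normal(n_components)
        child = crossover(a, b)
        print(generate(mean_face, eigenfaces, sdevs, child).shape)  # (4096,)
    ```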