38 research outputs found

    Guiding Complex Design Optimisation Using Bayesian Networks


    Flourishing in the flesh of the interworld: ecophenomenological intertwining and environmental virtue ethics

    Modern Western society faces a number of challenges and risks to both itself and the natural environment. An ongoing debate addresses which aspects of the natural environment should be granted moral considerability and which ethical system or systems should be used to justify this considerability. In this thesis, I critique the major centric positions within contemporary environmental ethics (anthropocentrism, biocentrism, and ecocentrism) and lay out an alternative approach developed from Merleau-Ponty's phenomenological ontology. Dualist approaches to environmental ethics are anthropocentric and favour human dominion. Holist approaches, such as deep ecology (e.g., Naess), are ecocentric and erase the difference within and between the human and the nonhuman. I make use of the ecofeminist critique (e.g., Plumwood) of both dualist shallow and monist deep ecology to argue for a relational identity of the human-nonhuman (inter)world. An account of value in the nonhuman natural world that is anthropogenic but not anthropocentric is needed. Value in the human-nonhuman interrelationship should be accounted for by an ethics grounded in the metaphysics of divergent intertwining. In developing my alternative ecophenomenological approach, I use Merleau-Ponty's ontology of the intertwining; however, Merleau-Ponty never completed the task of developing his ethics. I do so by using his intertwined and interworldly structure to help re-embed human existence in the natural environment without erasing the difference that also characterises this interrelationship. I take my ecophenomenology further by positing a provisional environmental virtue ethics.

    Designing a seismic program for an industrial CCS site: Trials and tribulations

    Designing a seismic characterization and monitoring program for a site with high levels of industrial and cultural infrastructure is by no means trivial. At the MGSC Phase III project site, a combination of 3D surface seismic and VSP surveys will be used for site characterization and to monitor the injected CO2. The sparse existing data have been carefully analyzed to design 3D surface seismic and VSP surveys that will fit within the surface constraints at the site and meet the greater objectives of the project. The seismic data will be used to map formation heterogeneities and characterize fractures.
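    The kind of trade-off that surface constraints force can be illustrated with the standard textbook relations between receiver geometry and common-midpoint (CMP) coverage. The parameter values below are hypothetical and are not taken from the MGSC Phase III design.

    ```python
    # Illustrative survey-geometry estimates using standard textbook relations.
    # All numeric parameters are hypothetical, not actual MGSC design values.

    def cmp_bin_size(receiver_spacing_m: float) -> float:
        """Natural CMP bin size is half the receiver spacing."""
        return receiver_spacing_m / 2.0

    def nominal_fold_2d(n_channels: int, receiver_spacing_m: float,
                        shot_spacing_m: float) -> float:
        """Nominal 2D fold: (channels * receiver spacing) / (2 * shot spacing)."""
        return n_channels * receiver_spacing_m / (2.0 * shot_spacing_m)

    print(cmp_bin_size(25.0))                # 12.5 m bins
    print(nominal_fold_2d(240, 25.0, 50.0))  # 60-fold coverage
    ```

    Tightening shot spacing raises fold (data redundancy) at proportional acquisition cost, which is exactly the balance a constrained industrial site forces the designer to strike.
    
    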

    Managing the Enriched Experience Network - Learning-Outcome Approach to the Experimental Design Life-Cycle

    Experimental design methods have long been used in scientific areas such as agriculture, biology, and physics to minimise error and assure validity. Although most network researchers performing experiments with testbeds and simulations implicitly follow scientific method, until recently there has been little emphasis on improving experimental design methods. Traditional experimental design focuses on experiments where the scope and objectives are relatively constrained; however, network research in innovative areas where there is little or no precedent often has objectives that evolve over time. We describe the learning-outcome approach to the experimental design life-cycle, which applies the concepts of systems development life-cycle models used in software engineering, as well as learning taxonomies used in education. Our approach extends traditional experimental design by providing a more comprehensive and efficient way of decomposing an experimental research project into manageable stages that are designed rather than improvised, leading to a well-structured way of assessing the knowledge gained. We provide insight into our experiences with this approach in the context of experimental research in the management of Alcatel Australia's new-generation Enriched Experience Network.
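    The core idea of decomposing a project into designed stages, each tied to an explicit learning outcome, can be sketched as below. The stage names are invented for illustration; the paper's actual decomposition may differ.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Stage:
        """One designed stage of an experimental life-cycle."""
        name: str
        objective: str          # what the stage does
        learning_outcome: str   # the knowledge the stage is designed to produce
        completed: bool = False

    # Hypothetical decomposition of an experimental research project.
    lifecycle = [
        Stage("scoping", "bound the problem", "variables of interest identified"),
        Stage("pilot", "trial the testbed", "measurement method validated"),
        Stage("full experiment", "run the design", "factor effects quantified"),
        Stage("assessment", "review results", "knowledge gained made explicit"),
    ]

    def progress(stages: list) -> float:
        """Fraction of stages whose learning outcome has been achieved."""
        return sum(s.completed for s in stages) / len(stages)

    lifecycle[0].completed = True
    print(progress(lifecycle))  # 0.25
    ```

    Making the learning outcome a first-class attribute of each stage is what distinguishes this from an ad hoc task list: assessment asks whether the outcome was achieved, not merely whether the activity was performed.
    
    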

    Has the Pascal Experiment Failed ? or Can A Good Language make Good Programmers ?

    In the early 1980s the authors' electrical engineering school switched its initial teaching language from Fortran to Pascal. One of the primary factors leading to this choice was the belief that a well-structured language such as Pascal would inherently lead to the students writing, as a matter of course, well-structured code. The students learning Pascal are separated into two degree courses, electrical engineering (EE) and computer systems engineering (CSE). The EE students complete only basic computing and software engineering material, whereas the CSE students cover a much more intensive software engineering curriculum. Experience over the last decade has shown that the EE students are not capable of consistently writing good code once they have completed the software subjects, whereas the CSE students are significantly more capable. It has become progressively apparent that the EE students cannot be forced into writing good code by restricting the scope of their tools. Rather th..

    Architecture-based design of computer based systems

    This paper presents a practical approach to architecture-based design of computer-based systems. The approach is discussed in relation to other existing methods of performing discovery, abstraction, refinement, and evolution of systems' architectures. It has also been shown that this approach can be supported by formal methods of refinement. The approach assists the designer to maintain a strict focus when reasoning about the architecture and its qualities.

    Dichoptic mfVEP in glaucoma detection, and their functional correlates

    Theoretical thesis. Bibliography: leaves 170-177.

    Chapter 1. Introduction and literature review -- Chapter 2. Materials and methods -- Chapter 3. Dichoptic suppression of mfVEP amplitude: effect of retinal eccentricity and simulated unilateral visual impairment -- Chapter 4. Stimulation speed, but not contrast, determines degree of dichoptic suppression of mfVEP -- Chapter 5. Comparison of low luminance contrast (LLA) and blue-on-yellow (BonY) stimulation with fast and slow presentation in the detection of glaucoma -- Chapter 6. The role of interocular suppression in the detection of glaucomatous defects using dichoptic mfVEP fast blue-on-yellow stimulation -- Chapter 7. Correlation of disc parameters with visual field indices using scanning laser and SD-OCT at different levels of glaucoma severity -- Chapter 8. Structural and functional correlations of novel and conventional perimetry technologies -- Chapter 9. Conclusions and future directions.

    Glaucoma is the leading cause of preventable blindness in the Western world. Effective treatment of glaucoma relies on early detection of a disease that is without symptoms in the initial stages. Identification of damage often relies on visual field testing, which can be unreliable. The multifocal VEP (mfVEP) was developed to provide an objective measure of visual field function in glaucoma. The newly designed dichoptic (binocular) mfVEP creates a testing environment that is most conducive to calculating inter-eye symmetry, as both eyes are tested at the same time, under the same conditions. However, the sensitivity of this testing technology, the role of interocular suppression in asymmetry analysis, and the ideal stimulus for detecting glaucoma using dichoptic mfVEP are largely unknown. Experiments focusing on dichoptic mfVEP and interocular suppression demonstrated sensitivity to eccentricity, contrast, and speed of presentation. Building on these relationships, dichoptic mfVEP with targeted stimuli was shown to be more effective than monocular testing in detecting unilateral early glaucoma. Furthermore, this project sought to provide further insight into the structure/function relationship that underlies the pathogenesis of glaucoma by studying correlations of glaucomatous field defects, as detected by Humphrey visual field testing and mfVEP, with structural changes characterised by high-resolution retinal imaging techniques. The results, showing that newer imaging technologies exhibited closer correlation to glaucomatous change but that the relationship is still variable, highlighted one of the many challenges in monitoring glaucoma. Correlation analysis comparing HVF and mfVEP showed similarities in structure-function patterns.

    Mode of access: World Wide Web. 1 online resource (177 leaves): diagrams, graphs, tables.
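    One common way to quantify the inter-eye comparison that simultaneous dichoptic testing makes possible is a Michelson-style asymmetry index over paired amplitudes. This is an illustrative statistic, not necessarily the one used in the thesis.

    ```python
    def asymmetry_index(amp_left: float, amp_right: float) -> float:
        """Inter-eye asymmetry of mfVEP amplitudes: |L - R| / (L + R).

        Returns 0.0 for perfectly symmetric responses and 1.0 when only
        one eye responds. Illustrative only; not the thesis's exact metric.
        """
        total = amp_left + amp_right
        if total <= 0:
            raise ValueError("at least one eye must show a positive response")
        return abs(amp_left - amp_right) / total

    print(asymmetry_index(2.0, 2.0))  # 0.0 (symmetric responses)
    print(asymmetry_index(3.0, 1.0))  # 0.5 (marked asymmetry)
    ```

    Because both eyes are stimulated under identical conditions, such a ratio cancels many shared nuisance factors (electrode placement, cortical folding), which is why inter-eye asymmetry is attractive for detecting unilateral disease.
    
    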

    Defining Systems Architecture Evolvability: A Taxonomy of Change

    University of Technology, Sydney. Designated contact: David Rowe.
    Evolvability is part of the alchemy of systems engineering. Designing a system that is evolvable is considered best practice in many industry domains. However, what does ‘evolvable’ mean? And in what context does a system evolve? Reviewing the many factors of system change and their associated definitions, we conclude that a single definition for ‘evolvability’ is not adequate. We assert that evolvability is a composite quality which allows a system's architecture to accommodate change in a cost-effective manner while maintaining the integrity of the architecture. In order to define evolvability as a composite, we propose a taxonomy which classifies the different aspects of evolvability. Using this taxonomy to select relevant systems architecting and design approaches, a systems architect can be confident in including those aspects of evolution most suitable to a particular application. The concepts introduced in this paper are applied to the Ericsson AXE telecommunications switching system for illustration and justification.
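    The notion of evolvability as a composite quality can be sketched, purely as an illustration with invented aspect names (the paper's taxonomy defines its own classification), by aggregating per-aspect scores:

    ```python
    # Sketch of evolvability as a composite quality. The aspect names below
    # are invented for illustration; they are not the paper's taxonomy.
    ASPECTS = ("generality", "adaptability", "scalability", "extensibility")

    def composite_evolvability(scores: dict) -> float:
        """Composite evolvability: mean of per-aspect scores, each in [0, 1]."""
        missing = [a for a in ASPECTS if a not in scores]
        if missing:
            raise ValueError(f"unscored aspects: {missing}")
        return sum(scores[a] for a in ASPECTS) / len(ASPECTS)

    print(composite_evolvability(
        {"generality": 1.0, "adaptability": 0.5,
         "scalability": 0.75, "extensibility": 0.75}))  # 0.75
    ```

    Treating evolvability as an aggregate over named aspects, rather than a single number defined in isolation, mirrors the paper's argument that no single definition of ‘evolvability’ is adequate.
    
    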