    Autocontinuity and convergence theorems for the Choquet integral

    Our aim is to provide some convergence theorems for the Choquet integral with respect to various notions of convergence. For instance, the dominated convergence theorem for almost uniform convergence is related to autocontinuous set functions. Autocontinuity can also be related to convergence in measure, strict convergence, or mean convergence. The monotone convergence theorem for almost uniform convergence, on the other hand, is related to monotone autocontinuity, a weaker notion than autocontinuity.
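    As background for the abstract above, the standard definitions it builds on (notation is mine, following the usual textbook formulation rather than the paper itself) can be sketched as:

    ```latex
    % Choquet integral of a nonnegative measurable f w.r.t. a monotone
    % set function \mu (not necessarily additive):
    (C)\!\int f \, d\mu \;=\; \int_0^{\infty} \mu\bigl(\{x : f(x) \ge t\}\bigr) \, dt .

    % \mu is autocontinuous from above if, for all measurable A and all
    % sequences (B_n) of measurable sets,
    \mu(B_n) \to 0 \quad\Longrightarrow\quad \mu(A \cup B_n) \to \mu(A).
    ```

    Monotone autocontinuity weakens the second condition by requiring it only for monotone sequences (B_n), which is why it supports the monotone rather than the dominated convergence theorem.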

    Completeness of Function Spaces Defined by Nonlinear Integrals (Theory of Function Spaces and Related Topics)


    Engineering Systems Integration

    Dreamers may envision our future, but it is the pragmatists who build it. Solve the right problem in the right way, mankind moves forward. Solve the right problem in the wrong way or the wrong problem in the right way, however clever or ingenious the solution, neither credits mankind. Instead, this misfire demonstrates a failure to appreciate a crucial step in pragmatic problem solving: systems integration. The first book to address the underlying premises of systems integration and how to exposit them in a practical and productive manner, Engineering Systems Integration: Theory, Metrics, and Methods looks at the fundamental nature of integration, exposes the subtle premises to achieve integration, and posits a substantial theoretical framework that is both simple and clear. Offering systems managers and systems engineers the framework from which to consider their decisions in light of systems integration metrics, the book isolates two basic questions, 1) Is there a way to express the interplay of human actions and the result of system interactions of a product with its environment?, and 2) Are there methods that combine to improve the integration of systems? The author applies the four axioms of General Systems Theory (holism, decomposition, isomorphism, and models) and explores the domains of history and interpretation to devise a theory of systems integration, develop practical guidance applying the three frameworks, and formulate the mathematical constructs needed for systems integration. The practicalities of integrating parts when we build or analyze systems mandate an analysis and evaluation of existing integrative frameworks of causality and knowledge. Integration is not just a word that describes a best practice, an art, or a single discipline. The act of integrating is an approach, operative in all disciplines, in all we see, in all we do

    Social Sciences: A Critique of Positivism

    Sociology

    Rethinking the risk matrix

    So far risk has mostly been defined as the expected value of a loss, mathematically R = P·L (where P is the probability of an adverse event and L is the loss incurred as a consequence of that event). The so-called risk matrix follows from this definition. This definition of risk is justified in a long-term “managerial” perspective, in which it is conceivable to distribute the effects of an adverse event over a large number of subjects or a large number of recurrences. In other words, this definition is mostly justified on frequentist terms. Moreover, according to this definition, in two extreme situations (high-probability/low-consequence and low-probability/high-consequence) the estimated risk is low. This logic runs against the principles of sustainability and continuous improvement, which should instead impose both a continuous search for lower probabilities of adverse events (higher and higher reliability) and a continuous search for lower impact of adverse events (in accordance with the fail-safe principle). In this work a different definition of risk is proposed, which stems from the idea of safeguard: (1 − Risk) = (1 − P)(1 − L). According to this definition, the risk level can be considered low only when both the probability of the adverse event and the loss are small. Such a perspective, in which the calculation of safeguard is privileged over the calculation of risk, would possibly avoid exposing society to catastrophic consequences, sometimes due to wrong or oversimplified use of probabilistic models. It can therefore be seen as the citizen’s perspective on the definition of risk.
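    The contrast between the two definitions can be sketched numerically. In the snippet below the function names and scenario values are mine, chosen only to illustrate the point the abstract makes: under the classical expected-loss definition both extreme scenarios score as low risk, while the safeguard-based definition, (1 − Risk) = (1 − P)(1 − L), i.e. Risk = P + L − P·L, is low only when both factors are small.

    ```python
    # Classical expected-loss risk vs. the safeguard-based definition from
    # the abstract.  P and L are both normalized to [0, 1]; the scenario
    # values are illustrative, not taken from the paper.

    def classical_risk(p: float, loss: float) -> float:
        """Expected-value definition: low if EITHER factor is small."""
        return p * loss

    def safeguard_risk(p: float, loss: float) -> float:
        """Safeguard definition: low only if BOTH factors are small."""
        return 1.0 - (1.0 - p) * (1.0 - loss)   # = p + loss - p*loss

    scenarios = {
        "high-probability / low-consequence": (0.9, 0.05),
        "low-probability / high-consequence": (0.05, 0.9),
        "both small":                         (0.05, 0.05),
    }

    for name, (p, loss) in scenarios.items():
        print(f"{name}: classical={classical_risk(p, loss):.3f} "
              f"safeguard={safeguard_risk(p, loss):.3f}")
    # Both extreme scenarios give classical=0.045 but safeguard=0.905;
    # only "both small" is low under the safeguard definition (0.098).
    ```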


    Doctor of Philosophy

    Congenital heart defects are classes of birth defects that affect the structure and function of the heart. These defects are attributed to the abnormal or incomplete development of a fetal heart during the first few weeks following conception. The overall detection rate of congenital heart defects during routine prenatal examination is low. This is attributed to the insufficient number of trained personnel in many local health centers, where many cases of congenital heart defects go undetected. This dissertation presents a system to identify congenital heart defects to improve pregnancy outcomes and increase their detection rates. The system was developed and its performance assessed in identifying the presence of ventricular defects (congenital heart defects that affect the size of the ventricles) using four-dimensional fetal echocardiographic images. The designed system consists of three components: 1) a fetal heart location estimation component, 2) a fetal heart chamber segmentation component, and 3) a detection component that detects congenital heart defects from the segmented chambers. The location estimation component is used to isolate a fetal heart in any four-dimensional fetal echocardiographic image. It uses a hybrid region-of-interest extraction method that is robust to the speckle noise degradation inherent in all ultrasound images. The location estimation method's performance was analyzed on 130 four-dimensional fetal echocardiographic images by comparison with a manually identified fetal heart region of interest. The location estimation method showed good agreement with the manually identified standard using four quantitative indexes: the Jaccard index, the Sørensen-Dice index, the sensitivity index, and the specificity index. The average values of these indexes were measured at 80.70%, 89.19%, 91.04%, and 99.17%, respectively.
The fetal heart chamber segmentation component uses velocity vector field estimates computed on frames contained in a four-dimensional image to identify the fetal heart chambers. The velocity vector fields are computed using a histogram-based optical flow technique, which is formulated on local image characteristics to reduce the effect of speckle noise and nonuniform echogenicity on the velocity vector field estimates. Features based on the velocity vector field estimates, voxel brightness/intensity values, and voxel Cartesian coordinate positions were extracted and used with a kernel k-means algorithm to identify the individual chambers. The segmentation method's performance was evaluated on 130 images from 31 patients by comparing the segmentation results with manually identified fetal heart chambers. Evaluation was based on the Sørensen-Dice index, the absolute volume difference, and the Hausdorff distance, with each resulting in per-patient average values of 69.92%, 22.08%, and 2.82 mm, respectively. The detection component uses the volumes of the identified fetal heart chambers to flag the possible occurrence of hypoplastic left heart syndrome, a type of congenital heart defect. An empirical volume threshold defined on the relative ratio of adjacent fetal heart chamber volumes obtained manually is used in the detection process. The performance of the detection procedure was assessed by comparison with a set of images with a confirmed diagnosis of hypoplastic left heart syndrome and a control group of normal fetal hearts. Of the 130 images considered, 18 of 20 (90%) fetal hearts were correctly detected as having hypoplastic left heart syndrome, and 84 of 110 (76.36%) fetal hearts were correctly detected as normal in the control group. These results show that the detection system performs better than the overall detection rate for congenital heart defects, which is reported to be between 30% and 60%.
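The four overlap indexes used above (Jaccard, Sørensen-Dice, sensitivity, specificity) have standard set-based definitions. The sketch below illustrates them on toy binary masks represented as sets of voxel coordinates; the function name and toy data are mine, not taken from the dissertation.

```python
# Standard overlap indexes for comparing an automatic segmentation (pred)
# against a manual reference (truth), both subsets of a voxel grid (universe).

def overlap_indexes(pred, truth, universe):
    """Return (jaccard, dice, sensitivity, specificity) for binary masks."""
    tp = len(pred & truth)             # voxels both masks include
    fp = len(pred - truth)             # predicted but absent from reference
    fn = len(truth - pred)             # reference voxels that were missed
    tn = len(universe - pred - truth)  # voxels correctly excluded by both
    jaccard = tp / (tp + fp + fn)
    dice = 2 * tp / (2 * tp + fp + fn)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return jaccard, dice, sensitivity, specificity

# Toy 1-D "image" of 10 voxels: reference mask {2..6}, prediction {3..7}.
universe = set(range(10))
truth = set(range(2, 7))
pred = set(range(3, 8))
print(overlap_indexes(pred, truth, universe))
# → (0.666..., 0.8, 0.8, 0.8)
```

Note that Dice is always at least as large as Jaccard for the same pair of masks, which is consistent with the reported averages (89.19% vs. 80.70%).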

    Creating an Unbroken Line of Becoming in Live Music Performance

    The dissertation provides an approach to the pedagogy of musical expression in applied music study. Exercises are provided which support a student’s mastery of the mechanics of grouping, and this mastery is put to work in an adaptation of the work of the great Russian acting teacher, Constantine Stanislavski. Mapping of embodiment schema in the manner of Lakoff and Johnson (1980) is asserted as a useful interface between grouped pitch objects and meaning; mappings are utilized by the instrumental music performer in the same manner as Stanislavski taught his acting students to utilize given circumstances. The discourse situates all of these skills in a gradually-emerging present; the exercises challenge the student to achieve what Stanislavski calls an unbroken line—a spontaneous and unique line of continually-restructuring narrative that evolves in real-time. Throughout the work, the phenomenon of grouping is examined in terms of (1) its relationship to meaning, (2) the physiology which supports it, (3) the Gestalt laws of similarity and trajectory which model it, (4) the parameters of unmarked performance nuance (vibrato, articulation, etc.) which can be used by a performer to project it, and (5) the state of continual evolution which characterizes it in the diachronic context of music

    Semantic discovery and reuse of business process patterns

    Patterns currently play an important role in modern information systems (IS) development, and their use has mainly been restricted to the design and implementation phases of the development lifecycle. Given the increasing significance of business modelling in IS development, patterns have the potential of providing a viable solution for promoting reusability of recurrent generalized models in the very early stages of development. As a statement of research in progress, this paper focuses on business process patterns and proposes an initial methodological framework for the discovery and reuse of business process patterns within the IS development lifecycle. The framework borrows ideas from the domain engineering literature and proposes the use of semantics to drive both the discovery of patterns and their reuse.