
    Facilitating evolution in relational database design : a procedure to evaluate and refine novice database designers' schemata : a thesis presented in partial fulfilment of the requirements for the degree of Master of Business Studies in Information Systems at Massey University

    Relational database management systems (RDBMS) have become widely used across many industries in recent years. Latterly these systems have expanded their market by becoming readily available at minimal cost to most users of modern computing technology. The quality of applications developed with an RDBMS, however, depends largely on the quality of the underlying schema. This research examines schema design, and in particular schemata designed by people who have a minimal understanding of relational concepts. It uses a survey and case studies to help define some of the issues involved in this area. A procedure to modify existing schemata is described, and the schema from one of the case studies is used to apply the schema re-design procedure to a real database design. The results are compared with the original schema as well as with a schema designed using a conventional application of the NIAM analysis and design methodology. The research supports the hypothesis that database applications based on schemata designed by lay-persons are currently being used to support business data management requirements. The utility, reliability and longevity of these applications depend to some extent on the quality of the underlying schema and its ability to store the required data and maintain that data's integrity. The application of the schema re-design procedure presented in this thesis reveals refinements to the original schema and provides a method for lay-persons to evaluate and improve existing database designs. A number of issues and questions related to the focus of this research are raised and, although outside the scope of the research, are noted as suggestions for further work.

    Council house building in County Durham 1900-1939: the local implementation of national policy

    There has been a fundamental transformation of the housing supply in England and Wales since the beginning of this century, when most families lived in private rented accommodation, up to the present, when the majority either own their home or rent it from a public authority. This study looks at a vital stage of this development - the growth of the public sector in housing before the Second World War - and examines, in particular, the experience of an important area in North East England. This was the region which at the turn of the century had the most severe housing problems, and where by 1939 local authorities had done much more than most to improve conditions. The study begins by considering briefly the philosophies held by housing reformers in the early twentieth century and the course actually taken by national housing policy up to 1939. The specific problems of County Durham at the turn of the century are then analysed, and an account is given of the attempts made by local government before the First World War to deal with them. The bulk of the study is devoted to the experience of the inter-war years. An analysis is made of the physical achievements of local authority housebuilding within the county, the factors that constrained this action and the alternatives that were sometimes adopted in its place. Finally, the study examines in some depth the new responsibilities assumed by local government in the implementation of national housing policy. These duties involved local government in the twin roles of builder and landlord; and, by drawing on hitherto-unused council records of the inter-war period, the final two chapters of the study examine these roles particularly from the local authority viewpoint.

    Approximate Bayesian Computational methods

    Also known as likelihood-free methods, approximate Bayesian computational (ABC) methods have emerged over the past ten years as the most satisfactory approach to intractable likelihood problems, first in genetics and then in a broader spectrum of applications. However, these methods suffer to some degree from calibration difficulties that make them rather volatile in their implementation and thus render them suspect to users of more traditional Monte Carlo methods. In this survey, we study the various improvements and extensions made to the original ABC algorithm in recent years.
    Comment: 7 figures.
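    The basic rejection scheme that underlies ABC can be stated in a few lines. The sketch below is a generic illustration with a toy Gaussian example, not any specific algorithm from the survey; all function and parameter names are assumptions:

```python
import numpy as np

def abc_rejection(observed, prior_sampler, simulate, distance, eps, n_samples, rng):
    """Basic ABC rejection sampler: keep parameter draws whose
    simulated data fall within eps of the observed data."""
    accepted = []
    while len(accepted) < n_samples:
        theta = prior_sampler(rng)           # draw a parameter from the prior
        x = simulate(theta, rng)             # simulate data given theta
        if distance(x, observed) <= eps:     # accept if simulation is close enough
            accepted.append(theta)
    return np.array(accepted)

# Toy example: infer the mean of a Gaussian with known unit variance.
rng = np.random.default_rng(0)
y = rng.normal(loc=1.0, scale=1.0, size=20)
post = abc_rejection(
    observed=y,
    prior_sampler=lambda r: r.uniform(-5.0, 5.0),
    simulate=lambda th, r: r.normal(loc=th, scale=1.0, size=20),
    distance=lambda x, obs: abs(x.mean() - obs.mean()),
    eps=0.2,
    n_samples=200,
    rng=rng,
)
```

    The accepted draws approximate the posterior; shrinking `eps` toward zero improves the approximation at a rapidly growing simulation cost, which is one face of the calibration difficulty mentioned above.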

    Tram-Line filtering for retinal vessel segmentation

    The segmentation of the vascular network from retinal fundal images is a fundamental step in the analysis of the retina, and may be used for a number of purposes, including diagnosis of diabetic retinopathy. However, due to the variability of retinal images, segmentation is difficult, particularly with images of diseased retinas that include significant distractors. This paper introduces a non-linear filter for vascular segmentation which is particularly robust against such distractors. We demonstrate results on the publicly available STARE dataset that are superior to the STARE method's performance: 57.2% of the vascular network (by length) is successfully located, with a 97.2% positive predictive value measured by vessel length, compared with 57% and 92.2% for the STARE method. The filter is also simple and computationally efficient.
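    For reference, sensitivity and positive predictive value for binary segmentation masks are computed as below. Note this is a pixel-wise illustration, whereas the paper measures both quantities by vessel length, so it is only an approximation of the reported metrics:

```python
import numpy as np

def vessel_metrics(pred, truth):
    """Pixel-wise sensitivity and positive predictive value
    for binary segmentation masks."""
    pred = np.asarray(pred, dtype=bool)
    truth = np.asarray(truth, dtype=bool)
    tp = np.sum(pred & truth)     # vessel pixels correctly detected
    fp = np.sum(pred & ~truth)    # background wrongly marked as vessel
    fn = np.sum(~pred & truth)    # vessel pixels missed
    sensitivity = tp / (tp + fn)
    ppv = tp / (tp + fp)
    return sensitivity, ppv

# Small worked example on 2x4 masks: tp=2, fp=1, fn=1.
pred  = [[1, 1, 0, 0],
         [0, 1, 0, 0]]
truth = [[1, 0, 0, 0],
         [0, 1, 1, 0]]
sens, ppv = vessel_metrics(pred, truth)
```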

    On Particle Learning

    This document is the aggregation of six discussions of Lopes et al. (2010) that we submitted to the proceedings of the Ninth Valencia Meeting, held in Benidorm, Spain, on June 3-8, 2010, in conjunction with Hedibert Lopes' talk at this meeting, and of a further discussion of the rejoinder by Lopes et al. (2010). The main point of those discussions is the potential for degeneracy in the particle learning methodology, related to the exponential forgetting of the past simulations. We illustrate in particular the resulting difficulties in the case of mixtures.
    Comment: 14 pages, 9 figures; discussions on the invited paper of Lopes, Carvalho, Johannes, and Polson, for the Ninth Valencia International Meeting on Bayesian Statistics, held in Benidorm, Spain, on June 3-8, 2010. To appear in Bayesian Statistics 9, Oxford University Press (except for the final discussion).
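    The exponential forgetting behind the degeneracy argument can be seen in a toy experiment. This sketch uses made-up weights rather than the particle learning algorithm itself: repeated multinomial resampling collapses the number of distinct time-0 ancestors, so the particle system "forgets" its own past:

```python
import numpy as np

rng = np.random.default_rng(42)
n_particles, n_steps = 500, 100

# Each particle remembers which time-0 particle it descends from.
ancestors = np.arange(n_particles)
for _ in range(n_steps):
    # Stand-in importance weights; in a real filter these come from
    # the likelihood of the new observation.
    w = rng.random(n_particles)
    w /= w.sum()
    idx = rng.choice(n_particles, size=n_particles, p=w)
    ancestors = ancestors[idx]  # resampling duplicates some lineages, kills others

n_unique = len(np.unique(ancestors))  # far smaller than n_particles
```

    After a hundred resampling steps only a handful of distinct ancestral lineages survive, which is why quantities depending on the early history of the chain are estimated from an effectively tiny sample.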

    Macroscopic electromagnetic stress tensor for ionized media

    Following the arguments presented by Mansuripur [Opt. Express 16, 14821-14835 (2008)], we suggest a form for the macroscopic electromagnetic stress tensor appropriate for ionized media. The generalized Lorentz force includes the effects of polarization forces as well as those on the free charge and current densities. The resulting tensor is written in terms of the fields D, B, E, and H. Its expression for a fully ionized medium subject to an external electromagnetic field is discussed, as are the plasma conservation equations. An apparatus is suggested for its experimental discrimination.
    Comment: 16 pages, 2 figures, fixed some nonsense with the tubular source, to appear in JP
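    For orientation, the standard vacuum Maxwell stress tensor that such macroscopic formulations generalize is

```latex
T_{ij} = \epsilon_0\left(E_i E_j - \tfrac{1}{2}\delta_{ij}E^2\right)
       + \frac{1}{\mu_0}\left(B_i B_j - \tfrac{1}{2}\delta_{ij}B^2\right)
```

    The macroscopic tensor proposed in the paper instead involves the material fields D, B, E, and H together with polarization and free-charge contributions; its exact form is given in the paper itself, not reproduced here.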

    Comment on "Plasma ionization by annularly bounded helicon waves" [Phys. Plasmas 13, 063501 (2006)]

    The neoclassical calculation of the helicon wave theory contains a fundamental flaw. Use is made of a proportional relationship between the magnetic field and its curl to derive the Helmholtz equation describing helicon wave propagation; however, by the fundamental theorem of Stokes, the curl of the magnetic field must be perpendicular to that portion of the field contributing to the local curl. Reexamination of the equations of motion indicates that only electromagnetic waves propagate through a stationary region of constant pressure in a fully ionized, neutral medium.
    Comment: 7 pages, 1 figure, to be published in Phys. Plasmas, http://link.aip.org/link/?PHPAEN/16/054701/