
    Sampling Strategy and Accuracy Assessment for Digital Terrain Modelling

    In this thesis, investigations into some of the problems related to three of the main concerns of digital terrain modelling (i.e. accuracy, cost and efficiency) have been carried out. Special attention has been given to two main issues: the establishment of a family of mathematical models which is comprehensive in theory and reliable in practice, and the development of a procedure for determining an optimum sampling interval for a DTM project with a specified accuracy requirement. Concretely, the following investigations have been carried out:
    i) First, a discussion of the theoretical background to digital terrain modelling has been conducted and an insight into the complex matter of digital terrain surface modelling obtained.
    ii) Investigations into the improvement of the quality of DTM source data have been carried out. In this respect, algorithms for gross error detection have been developed and a procedure for random noise filtering implemented.
    iii) Experimental tests of the accuracy of DTMs derived from various data sources (i.e. aerial photography, space photography and existing contour maps) have been carried out. For DTMs derived from photogrammetrically measured data, the tests were designed to investigate the relationship between DTM accuracy and sampling interval, terrain slope and data pattern. For DTMs derived from digital contour data, the tests were designed to investigate the relationship between DTM accuracy and contour interval, terrain slope and the characteristics of the data set.
    iv) The problems related to the reliability of the DTM accuracy figures obtained from the experimental tests have been investigated, and criteria have been set for the accuracy, number and distribution of check points.
    v) A family of mathematical models has been developed for the prediction of DTM accuracy. These models have been validated with experimental test data and evaluated from a theoretical standpoint; some existing accuracy models have also been evaluated for comparison.
    vi) A procedure for determining the optimum sampling interval for a DTM project with a specified accuracy requirement has been proposed, and a potential sampling strategy based on this procedure has been investigated.
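
    The relationship between sampling interval, terrain slope and DTM accuracy described in points iii), v) and vi) can be illustrated with a small numerical sketch. The Python sketch below is not the family of models developed in the thesis; it assumes a simple, hypothetical accuracy model in which the root-mean-square error grows with the sampling interval and the tangent of the terrain slope, and inverts it numerically to find the largest interval that still meets a specified accuracy requirement. The coefficients a and b, and the form of the model, are illustrative placeholders only.

    import math

    def dtm_rmse(d, slope_deg, a=0.1, b=0.05):
        """Hypothetical accuracy model: predicted RMSE (metres) grows with the
        sampling interval d (metres) and the terrain slope. The coefficients a
        and b are illustrative placeholders, not values from the thesis."""
        return a + b * d * math.tan(math.radians(slope_deg))

    def optimum_interval(target_rmse, slope_deg, d_max=200.0, step=0.5):
        """Largest sampling interval whose predicted RMSE still satisfies the
        specified accuracy requirement (simple linear search)."""
        best = None
        d = step
        while d <= d_max:
            if dtm_rmse(d, slope_deg) <= target_rmse:
                best = d
            d += step
        return best

    # e.g. a 0.5 m RMSE requirement on terrain with a mean slope of 10 degrees
    print(optimum_interval(target_rmse=0.5, slope_deg=10.0))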

    Financial power laws: Empirical evidence, models, and mechanism

    Financial markets (share markets, foreign exchange markets and others) are all characterized by a number of universal power laws. The most prominent example is the ubiquitous finding of a robust, approximately cubic power law characterizing the distribution of large returns. A similarly robust feature is long-range dependence in volatility (i.e., hyperbolic decline of its autocorrelation function). The recent literature adds temporal scaling of trading volume and multi-scaling of higher moments of returns. Increasing awareness of these properties has recently spurred attempts at theoretical explanations of how these key characteristics emerge from the market process. In principle, different types of dynamic processes could be responsible for these power laws. Examples found in the economics literature include multiplicative stochastic processes as well as dynamic processes with multiple equilibria. Though both types of dynamics are characterized by intermittent behavior which occasionally generates large bursts of activity, they can be based on fundamentally different perceptions of the trading process. The present chapter reviews both the analytical background of the power laws emerging from these data-generating mechanisms and the pertinent models proposed in the economics literature.
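
    The "approximately cubic power law" refers to a tail exponent of the return distribution close to three. A standard diagnostic for such tails is the Hill estimator of the tail index; the sketch below applies it to simulated heavy-tailed returns (Student-t with three degrees of freedom, whose tail exponent is exactly three) purely as an illustration of the technique, not as a reproduction of any analysis in the chapter.

    import numpy as np

    def hill_estimator(returns, k):
        """Hill estimator of the tail index, computed from the k largest
        absolute returns."""
        x = np.sort(np.abs(np.asarray(returns)))[::-1]   # descending order
        logs = np.log(x[:k]) - np.log(x[k])
        return k / np.sum(logs)

    rng = np.random.default_rng(0)
    r = rng.standard_t(df=3, size=100_000)   # simulated heavy-tailed returns
    print(hill_estimator(r, k=500))          # estimate should be roughly 3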

    Bore optimisation and impedance modelling of brass musical instruments


    Risk Assessment for National Natural Resource Conservation Programs

    This paper reviews the risk assessments prepared by the U.S. Department of Agriculture (USDA) in support of regulations implementing the Conservation Reserve Program (CRP) and the Environmental Quality Incentives Program (EQIP). These two natural resource conservation programs were authorized as part of the 1996 Farm Bill. The risk assessments were required under the Federal Crop Insurance Reform and Department of Agriculture Reorganization Act of 1994. The framework used for the assessments was appropriate, but the assessments could be improved in the areas of assessment endpoint selection, definition, and estimation. Many of the assessment endpoints were too diffuse or ill-defined to provide an adequate characterization of the program benefits. Two reasons for this lack of clarity were apparent: 1) the large, unprioritized set of natural resource conservation objectives for the two programs, and 2) the lack of agreement about which changes in environmental attributes caused by agriculture should be considered adverse and which may be considered negligible. There is also some "double counting" of program benefits: although the CRP and EQIP are, in part, intended to assist agricultural producers with regulatory compliance, the resultant environmental benefits would occur absent the programs. The paper concludes with a set of recommendations for continuing efforts to conduct regulatory analyses of these major conservation programs. The central recommendation is that future risk assessments go beyond identifying the natural resources at greatest risk from agricultural production activities and instead provide scientific input for analyses of the cost-effectiveness of the conservation programs.

    Integrating the finite element method and genetic algorithms to solve structural damage detection and design optimisation problems

    This thesis documents fundamental new research into a specific application of structural box-section beams, for which weight reduction is highly desirable. It is proposed and demonstrated that the weight of these beams can be significantly reduced by using advanced, laminated fibre-reinforced composites in place of steel. Of the many issues raised during this investigation, two of particular importance are considered in detail: (a) the detection and quantification of damage in composite structures, and (b) the optimisation of laminate design to maximise the performance of loaded composite structures subject to given constraints. It is demonstrated that both these issues can be formulated and solved as optimisation problems using the finite element method, in which an appropriate objective function is minimised (or maximised). In case (a), the difference between the static response obtained from a loaded structure containing damage and that of an equivalent mathematical model of the structure is minimised by iteratively updating the model. This reveals the damage within the model and subsequently allows the residual properties of the damaged structure to be quantified. Within the scope of this work is the ability to resolve damage, consisting of either penny-shaped sub-surface flaws or tearing damage of box-section beams, from surface experimental data. In case (b), an objective function is formulated in terms of a given structural response, or combination of responses, and optimised in order to return an optimal structure rather than merely a satisfactory one. For the solution of these optimisation problems, a novel software tool based on the integration of genetic algorithms with a commercially available finite element (FE) package has been developed. A particular advantage of the described method is its applicability to a wide range of engineering problems. The tool is described and its effectiveness demonstrated with reference to two inverse damage detection and quantification problems and one laminate design optimisation problem. The tool allows the full suite of functions within the FE software to be used to solve non-convex optimisation problems, formulated in terms of both discrete and continuous variables, without explicitly stating the form of the stiffness matrix. Furthermore, a priori knowledge about the problem may be readily incorporated into the method.
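
    As a schematic of the case (a) formulation, the sketch below minimises the mismatch between "measured" static displacements and those predicted by a parameterised model, using a minimal real-coded genetic algorithm. It deliberately replaces the finite element analysis with a hypothetical two-spring chain so that it is self-contained; the spring stiffnesses, damage factors and GA settings are illustrative assumptions, not values or methods taken from the thesis.

    import numpy as np

    rng = np.random.default_rng(1)

    # Two springs in series with assumed undamaged stiffnesses (illustrative values).
    K1, K2 = 80.0, 60.0
    LOAD = 5.0

    def static_response(damage):
        """Stand-in for an FE static analysis of a two-spring chain: returns the
        displacement at each node, with damage = (d1, d2) reducing each stiffness."""
        d1, d2 = damage
        u1 = LOAD / (K1 * (1.0 - d1))          # node between the two springs
        u2 = u1 + LOAD / (K2 * (1.0 - d2))     # free end
        return np.array([u1, u2])

    # Synthetic "experimental" displacements from a known damage state plus noise.
    true_damage = (0.25, 0.10)
    measured = static_response(true_damage) + rng.normal(0, 1e-4, 2)

    def objective(damage):
        """Squared mismatch between measured and model-predicted displacements."""
        return np.sum((measured - static_response(damage)) ** 2)

    def genetic_minimise(bounds, pop_size=60, generations=150, mut_sigma=0.03):
        """Minimal real-coded GA: tournament selection, blend crossover,
        Gaussian mutation and elitism."""
        low, high = np.array(bounds).T
        pop = rng.uniform(low, high, size=(pop_size, len(bounds)))
        for _ in range(generations):
            fitness = np.array([objective(ind) for ind in pop])
            new_pop = [pop[np.argmin(fitness)]]                  # keep the best
            while len(new_pop) < pop_size:
                parents = []
                for _ in range(2):                               # two size-2 tournaments
                    i, j = rng.integers(0, pop_size, 2)
                    parents.append(pop[i] if fitness[i] < fitness[j] else pop[j])
                w = rng.uniform(0, 1, len(bounds))
                child = w * parents[0] + (1 - w) * parents[1]    # blend crossover
                child += rng.normal(0, mut_sigma, len(bounds))   # Gaussian mutation
                new_pop.append(np.clip(child, low, high))
            pop = np.array(new_pop)
        fitness = np.array([objective(ind) for ind in pop])
        return pop[np.argmin(fitness)]

    best = genetic_minimise(bounds=[(0.0, 0.9), (0.0, 0.9)])
    print("recovered damage factors:", best)   # should be close to (0.25, 0.10)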

    5th International Probabilistic Workshop: 28-29 November 2007, Ghent, Belgium

    These are the proceedings of the 5th International Probabilistic Workshop. Even though the 5th anniversary of a conference might not seem particularly significant, it is interesting to note the development of this probabilistic conference. Originally, the series started as the 1st and 2nd Dresdner Probabilistic Symposium, which were launched to present research and applications dealt with mainly at Dresden University of Technology. Since then, the conference has grown into an internationally recognised conference dealing with research on and applications of probabilistic techniques, mainly in the field of structural engineering; other topics, such as ship safety and natural hazards, have also been addressed. Whereas the first conferences in Dresden included about 12 presentations each, the conference in Ghent has attracted nearly 30 presentations. Moving from Dresden to Vienna (University of Natural Resources and Applied Life Sciences) to Berlin (Federal Institute for Material Research and Testing) and finally to Ghent, the conference has steadily evolved towards a truly international level. This can be seen in the language used: the first two conferences were held entirely in German; during the conference in Berlin the change from German to English became apparent, as some presentations were given in German and others in English; and now in Ghent all papers will be presented in English. Participants now come not only from Europe but also from other continents. Although the conference will move back to Germany next year (2008), to Darmstadt, the international concept will remain, since so much work in the field of probabilistic safety evaluations is carried out internationally. In two years (2009) the conference will move to Delft, The Netherlands, and in 2010 it will probably be held in Szczecin, Poland. Coming back to the present: the editors wish all participants a successful conference in Ghent.

    Bare nothingness: Situated subjects in embodied artists' systems

    This chapter examines the current state of digital artworks, arguing that they have not yet made a groundbreaking impact on the cultural landscape of the 21st century and suggesting that one reason for this limited impact is the obsolete model of agency deployed by many digital artists. As an alternative to what is framed as out-of-date forms of interactivity, the chapter highlights evolving research into interactive systems, artists' tools, applications, and techniques, providing readers with an insightful and up-to-date examination of emerging multimedia technology trends. In particular, the chapter looks at situated computing and embodied systems, in which context-aware models of human subjects can be combined with sensor technology to expand the agencies at play in interactive works. The chapter connects these technologies to Big Data, crowdsourcing and other techniques from artificial intelligence that expand our understanding of interaction and participation.