
    Estimating Health Care Costs Among Fragile and Conflict Affected States: an Elastic Net-Risk Measures Approach

    Fragile and conflict-affected states (FCAS) are those in which the government lacks the political will and/or capacity to provide the basic functions necessary for poverty reduction, economic development, and the security of the human rights of their populations. Until recently, unfortunately, most research and universal health care debates have centered on middle-income and emerging economies. As a result, FCAS have been neglected in many global discussions and decisions, and many FCAS consequently lack proper vaccination programmes and antibiotics. Well-estimated health care costs are therefore a necessary stepping stone towards improving the health of citizens in FCAS. Fortunately, developments in statistical learning theory, combined with data obtained from the WBG and Transparency International, make it possible to model health care costs among FCAS accurately. The data used in this paper consisted of 35 countries and 89 variables. Of these 89 variables, health care expenditure (HCE) was the only response variable. With 88 predictor variables, multicollinearity was expected, which occurs when multiple variables share relatively large absolute correlation. Since multicollinearity is expected and the number of variables is far greater than the number of observations, this paper adopts Zou and Hastie's method of regularization via the elastic net (ENET). In order to accurately estimate the maximum and expected maximum HCE among FCAS, well-known risk measures, such as Value at Risk and Conditional Value at Risk, and related quantities were obtained via Monte Carlo simulations. This paper obtained risk measures at the 95% security level.
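
    As a rough illustration of the modelling pipeline this abstract describes, the sketch below fits an elastic net to toy data with many more predictors than observations and then estimates Value at Risk and Conditional Value at Risk at the 95% level from Monte Carlo simulations of the fitted model. The data, variable names and scenario-generation scheme are illustrative assumptions, not the paper's WBG/Transparency International data.

```python
# Hypothetical sketch: elastic-net regression of health care expenditure (HCE)
# on country-level indicators, followed by Monte Carlo VaR/CVaR estimates.
# The simulated data and scenario scheme are illustrative only.
import numpy as np
from sklearn.linear_model import ElasticNetCV

rng = np.random.default_rng(0)
n_countries, n_predictors = 35, 88          # dimensions reported in the abstract
X = rng.normal(size=(n_countries, n_predictors))
y = X[:, :5] @ rng.normal(size=5) + rng.normal(scale=0.5, size=n_countries)  # toy HCE

# Elastic net handles p >> n and multicollinearity by mixing L1 and L2 penalties.
model = ElasticNetCV(l1_ratio=[0.1, 0.5, 0.9], cv=5).fit(X, y)

# Monte Carlo simulation of HCE under resampled, perturbed predictor scenarios,
# then empirical VaR and CVaR at the 95% security level.
scenarios = X[rng.integers(0, n_countries, size=10_000)]
scenarios = scenarios + rng.normal(scale=0.1, size=scenarios.shape)
simulated_hce = model.predict(scenarios)
var_95 = np.quantile(simulated_hce, 0.95)                 # Value at Risk
cvar_95 = simulated_hce[simulated_hce >= var_95].mean()   # Conditional Value at Risk
print(f"VaR(95%) = {var_95:.2f}, CVaR(95%) = {cvar_95:.2f}")
```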

    Numerical Simulation and Customized DACM Based Design Optimization

    PhD thesis in Offshore technology. The diverse numerical modelling, analysis and simulation tools that have been developed and introduced to the market are intended to perform the virtual design and testing of products and systems without the construction of physical prototypes. Digital prototyping in the form of computer modelling and simulation is an important means of numerical model prediction, i.e. design validation and verification. However, as the tools advance towards more precise and diverse applications, their operation eventually becomes more complex, computationally expensive and error prone; this is particularly true for complex multi-disciplinary and multidimensional problems, for instance in multi-body dynamics, fluid-structure interaction (FSI) and high-dimensional numerical simulation problems. On the other hand, integrating design optimization operations into product and system development processes through computer-based applications makes the process even more complex and highly expensive. This thesis analyses and discusses causes of complexity in numerical modelling, simulation and optimization operations and proposes new approaches/frameworks that would help significantly reduce the complexity and the associated computational costs. The proposed approaches mainly integrate, simplify, and decompose or approximate complex numerical-simulation-based optimization problems into simpler, metamodel-based optimization problems. Despite advancing computational technologies in continuum mechanics, design and analysis tools developed in separate directions with regard to the 'basis functions' underlying the technologies until recent developments. Basis functions are the building blocks of every continuous function: continuous functions in every computational tool are linear combinations of specific basis functions in the function space. Since they were first introduced, basis functions in design and modelling tools have developed so rapidly that various complex physical problems can today be designed and modelled to the highest precision. On the other hand, most analysis tools still utilize approximate models of the problems from the latter tools, particularly if the problem involves complex smooth geometric designs. The existing gap between the basis functions of the tools and the increasing precision of models for analysis introduces tremendous computational costs. Moreover, additional effort is required to transfer models from one form of basis function to another. The variation of the basis functions also demands extra effort in numerical-simulation-based optimization processes. This thesis discusses the recently developed integrated modelling and analysis approach that utilizes the state-of-the-art basis function (the NURBS function) for both design and analysis. A numerical-simulation-based shape optimization framework that utilizes this basis function is also presented in a study in the thesis. One common multidisciplinary problem that involves multiple domain models in a single problem, the fluid-structure interaction (FSI) problem, is studied in the thesis. As the name implies, the two domain models involved in any FSI problem are the fluid and the structure domain models. Solving FSI problems usually requires three mathematical components: i) a fluid dynamics model, ii) a structural mechanics model and iii) the FSI coupling model. This thesis presents the challenges in FSI problems and discusses different FSI approaches in numerical analysis. A comparative analysis of computational methods, based on the coupling and temporal discretization schemes, is discussed using a benchmark problem, to give a better understanding of what a multidisciplinary problem is and of the challenges for design optimizations that involve such problems. [...]
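
    As a small illustration of the shared basis-function idea behind the integrated NURBS-based design-and-analysis approach the thesis discusses, the sketch below evaluates B-spline basis functions with the Cox-de Boor recursion and normalises them with weights to obtain rational (NURBS) basis values. The knot vector, degree and weights are arbitrary examples, not taken from the thesis.

```python
# Hypothetical sketch: evaluating NURBS basis functions via the Cox-de Boor
# recursion. The knot vector, degree and weights below are illustrative only.
import numpy as np

def bspline_basis(i, p, u, knots):
    """Value of the i-th B-spline basis function of degree p at parameter u."""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = 0.0
    if knots[i + p] != knots[i]:
        left = (u - knots[i]) / (knots[i + p] - knots[i]) * bspline_basis(i, p - 1, u, knots)
    right = 0.0
    if knots[i + p + 1] != knots[i + 1]:
        right = (knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1]) * bspline_basis(i + 1, p - 1, u, knots)
    return left + right

def nurbs_basis(u, p, knots, weights):
    """Rational (NURBS) basis: weighted B-spline bases normalised to sum to one."""
    raw = np.array([w * bspline_basis(i, p, u, knots) for i, w in enumerate(weights)])
    return raw / raw.sum()

# Quadratic NURBS on an open knot vector with four control points (toy example).
knots = [0, 0, 0, 0.5, 1, 1, 1]
weights = [1.0, 0.8, 1.2, 1.0]
print(nurbs_basis(0.3, 2, knots, weights))
```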

    Simulating Growth and Development of Tomato Crop

    Crop models are powerful tools to test hypotheses, synthesize and convey knowledge, describe and understand complex systems and compare different scenarios. Models may be used for prediction and planning of production, in decision support systems and in control of the greenhouse climate, water supply and nutrient supply. The mechanistic simulation of tomato crop growth and development is described in this paper. The main processes determining yield, growth, development and water and nutrient uptake of a tomato crop are discussed in relation to growth conditions and crop management. Organ initiation is simulated as a function of temperature. Simulation of leaf area expansion is also based on temperature, unless a maximum specific leaf area is reached. Leaf area is an important determinant of the light interception of the canopy. Radiation shows exponential extinction with depth in the canopy. For leaf photosynthesis several models are available. Transpiration is calculated according to the Penman-Monteith approach. Net assimilate production is calculated as the difference between canopy gross photosynthesis and maintenance respiration. The net assimilate production is used for growth of the different plant organs and for growth respiration. Partitioning of assimilates among plant organs is simulated based on the relative sink strengths of the organs. The simulation of plant-nutrient relationships starts with the calculation of the demanded concentrations of the different macronutrients for each plant organ, with the demand depending on the ontogenetic stage of the organ. Subsequently, the demanded nutrient uptake is calculated from these demanded concentrations and the dry weight of the organs. When there is no limitation in availability at the root surface, the actual uptake equals the demanded uptake. When the root system cannot fulfil the demand, uptake is less, the plant nutrient concentration drops and crop production might be reduced. It is concluded that mechanistic crop models accurately simulate yield, growth, development and water and nutrient relations of greenhouse-grown tomato in different climate zones.
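
    A minimal sketch of three of the steps described above, assuming illustrative coefficients rather than the paper's parameters: exponential (Beer's-law) light extinction through the canopy, net assimilate production as gross photosynthesis minus maintenance respiration, and partitioning of the net assimilates by relative sink strengths.

```python
# Hypothetical sketch of light interception, net assimilate production and
# assimilate partitioning. All coefficients are illustrative placeholders.
import numpy as np

K_EXT = 0.7  # canopy light extinction coefficient (assumed)

def intercepted_light(light_above_canopy, lai):
    """Light intercepted by a canopy with leaf area index `lai` (exponential extinction)."""
    return light_above_canopy * (1.0 - np.exp(-K_EXT * lai))

def net_assimilation(gross_photosynthesis, organ_weights, maintenance_coeffs):
    """Net assimilate production = gross photosynthesis - maintenance respiration."""
    maintenance = sum(w * c for w, c in zip(organ_weights, maintenance_coeffs))
    return gross_photosynthesis - maintenance

def partition_assimilates(net_assimilates, sink_strengths):
    """Distribute net assimilates among organs by their relative sink strengths."""
    total = sum(sink_strengths.values())
    return {organ: net_assimilates * s / total for organ, s in sink_strengths.items()}

light = intercepted_light(light_above_canopy=400.0, lai=3.0)       # arbitrary units
gross = 0.05 * light                                               # toy light-use efficiency
net = net_assimilation(gross, organ_weights=[120.0, 60.0, 200.0],  # leaf, stem, fruit
                       maintenance_coeffs=[0.03, 0.015, 0.01])
print(partition_assimilates(net, {"fruit": 0.6, "leaf": 0.25, "stem": 0.1, "root": 0.05}))
```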

    Electrospinning predictions using artificial neural networks

    Electrospinning is a relatively simple method of producing nanofibres. Currently there is no method to predict the characteristics of electrospun fibres produced from a wide range of polymer/solvent combinations and concentrations without first measuring a number of solution properties. This paper shows how artificial neural networks can be trained to make electrospinning predictions using only commonly available prior knowledge of the polymer and solvent. Firstly, a probabilistic neural network was trained to predict which of three outcomes would occur: no fibres (electrospraying), beaded fibres, or smooth fibres, with more than 80% of predictions correct. Secondly, a generalised neural network was trained to predict fibre diameter, with an average absolute percentage error of 22.3% on the validation data. These predictive tools can be used to reduce the parameter space before scoping exercises.
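
    A minimal sketch of the first-stage classifier, implemented here as a basic Gaussian Parzen-window probabilistic neural network; the two input features, the toy training data and the smoothing parameter are assumptions, not the inputs or data used in the paper.

```python
# Hypothetical sketch: a minimal probabilistic neural network (Gaussian
# Parzen-window classifier) for the three electrospinning outcomes. Features,
# training data and smoothing parameter are illustrative only.
import numpy as np

CLASSES = ["electrospraying", "beaded fibres", "smooth fibres"]

def pnn_predict(X_train, y_train, x, sigma=0.5):
    """Return the class with the highest average Gaussian kernel response."""
    scores = []
    for c in range(len(CLASSES)):
        members = X_train[y_train == c]
        d2 = np.sum((members - x) ** 2, axis=1)
        scores.append(np.mean(np.exp(-d2 / (2.0 * sigma ** 2))))
    return CLASSES[int(np.argmax(scores))]

# Toy training data: two normalised polymer/solvent descriptors per sample
# (placeholders for the commonly available prior knowledge the paper uses).
rng = np.random.default_rng(1)
X_train = np.vstack([rng.normal(c, 0.2, size=(20, 2)) for c in (0.2, 0.5, 0.8)])
y_train = np.repeat([0, 1, 2], 20)

print(pnn_predict(X_train, y_train, np.array([0.75, 0.8])))
```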

    The application of ANFIS prediction models for thermal error compensation on CNC machine tools

    Thermal errors can have significant effects on CNC machine tool accuracy. The errors come from thermal deformations of the machine elements caused by heat sources within the machine structure or from ambient temperature change. The effect of temperature can be reduced by error avoidance or numerical compensation. The performance of a thermal error compensation system essentially depends upon the accuracy and robustness of the thermal error model and its input measurements. This paper first reviews different methods of designing thermal error models, before concentrating on employing an adaptive neuro-fuzzy inference system (ANFIS) to design two thermal prediction models: ANFIS by dividing the data space into rectangular sub-spaces (ANFIS-Grid model) and ANFIS by using the fuzzy c-means clustering method (ANFIS-FCM model). Grey system theory is used to obtain the influence ranking of all possible temperature sensors on the thermal response of the machine structure. All the influence weightings of the thermal sensors are clustered into groups using the fuzzy c-means (FCM) clustering method, the groups then being further reduced by correlation analysis. A study of a small CNC milling machine is used to provide training data for the proposed models and then to provide independent testing data sets. The results of the study show that the ANFIS-FCM model is superior in terms of predictive accuracy, with the benefit of fewer rules. The residual error of the proposed model is within ±4 μm. This combined methodology can provide improved accuracy and robustness of a thermal error compensation system.
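
    A minimal sketch of the sensor-grouping step, assuming illustrative influence weightings: a small fuzzy c-means implementation clusters one-dimensional sensor weightings so that similarly behaving temperature sensors can be represented by a single group. The cluster count and fuzzifier are arbitrary choices, not values from the paper.

```python
# Hypothetical sketch: fuzzy c-means (FCM) clustering of 1-D sensor influence
# weightings. The weightings, cluster count and fuzzifier m are illustrative.
import numpy as np

def fuzzy_c_means(x, n_clusters=3, m=2.0, n_iter=100, seed=0):
    """Return cluster centres and the membership matrix for 1-D data `x`."""
    rng = np.random.default_rng(seed)
    u = rng.random((len(x), n_clusters))
    u /= u.sum(axis=1, keepdims=True)                 # memberships sum to 1 per point
    for _ in range(n_iter):
        um = u ** m
        centres = (um.T @ x) / um.sum(axis=0)         # weighted cluster centres
        dist = np.abs(x[:, None] - centres[None, :]) + 1e-12
        u = 1.0 / (dist ** (2.0 / (m - 1)))           # standard FCM membership update
        u /= u.sum(axis=1, keepdims=True)
    return centres, u

# Toy influence weightings for ten candidate temperature sensors.
weightings = np.array([0.91, 0.88, 0.86, 0.55, 0.52, 0.50, 0.48, 0.21, 0.19, 0.17])
centres, memberships = fuzzy_c_means(weightings)
print("cluster centres:", np.round(centres, 2))
print("sensor -> cluster:", memberships.argmax(axis=1))
```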

    From buildings to cities: techniques for the multi-scale analysis of urban form and function

    The built environment is a significant factor in many urban processes, yet direct measures of built form are seldom used in geographical studies. Representation and analysis of urban form and function could provide new insights and improve the evidence base for research. So far, progress has been slow due to limited data availability, computational demands, and a lack of methods to integrate built environment data with aggregate geographical analysis. Spatial data and computational improvements are overcoming some of these problems, but there remains a need for techniques to process and aggregate urban form data. Here we develop a Built Environment Model of urban function and dwelling type classifications for Greater London, based on detailed topographic and address-based data (sourced from Ordnance Survey MasterMap). The multi-scale approach allows the Built Environment Model to be viewed at fine scales for local planning contexts and at city-wide scales for aggregate geographical analysis, allowing an improved understanding of urban processes. This flexibility is illustrated in two examples, urban function and residential type analysis, where both local-scale urban clustering and city-wide trends in density and agglomeration are shown. While we demonstrate the multi-scale Built Environment Model to be a viable approach, a number of accuracy issues are identified, including the limitations of 2D data, inaccuracies in commercial function data and problems with temporal attribution. These limitations currently restrict the more advanced applications of the Built Environment Model.
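
    A minimal sketch of the multi-scale aggregation idea, assuming toy building points rather than the Ordnance Survey MasterMap data behind the Built Environment Model: building-level function labels are counted per grid cell at two different cell sizes, giving a fine local view and a coarse city-wide view of the same underlying data.

```python
# Hypothetical sketch: aggregating building-level function classifications to
# grid cells at two resolutions. Column names, categories and cell sizes are
# illustrative assumptions.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
buildings = pd.DataFrame({
    "x": rng.uniform(0, 10_000, 1_000),            # toy easting (m)
    "y": rng.uniform(0, 10_000, 1_000),            # toy northing (m)
    "function": rng.choice(["residential", "commercial", "industrial"], 1_000,
                           p=[0.7, 0.2, 0.1]),
})

def aggregate(buildings, cell_size):
    """Count building functions per square grid cell of side `cell_size` metres."""
    cells = buildings.assign(
        cell_x=(buildings["x"] // cell_size).astype(int),
        cell_y=(buildings["y"] // cell_size).astype(int),
    )
    return (cells.groupby(["cell_x", "cell_y", "function"])
                 .size()
                 .unstack(fill_value=0))

fine = aggregate(buildings, cell_size=250)      # local planning scale
coarse = aggregate(buildings, cell_size=2_000)  # city-wide scale
print(coarse.head())
```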