
    Integration of geometric modeling and advanced finite element preprocessing

    The structure of a geometry-based finite element preprocessing system is presented. The key features of the system are the use of geometric operators to support all geometric calculations required for analysis model generation, and the use of a hierarchic, boundary-based data structure for the major data sets within the system. The approach presented can support the finite element modeling procedures used today as well as the fully automated procedures under development.
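
    As a rough illustration of what a hierarchic, boundary-based data structure can look like, the sketch below models regions, faces, edges and vertices with analysis attributes attached directly to the geometry; all class names and attributes are invented for this example and are not taken from the paper.

```python
# Minimal sketch (not the paper's system) of a hierarchic, boundary-based
# data structure: regions reference faces, faces reference edges, and edges
# reference vertices, so analysis attributes such as a target mesh size can
# be attached at any level of the geometric hierarchy.
from dataclasses import dataclass

@dataclass
class Vertex:
    x: float
    y: float
    z: float

@dataclass
class Edge:
    start: Vertex
    end: Vertex

@dataclass
class Face:
    edges: list
    mesh_size: float = 0.0   # analysis attribute attached to the geometry

@dataclass
class Region:
    faces: list
    material: str = ""

# A unit square face with a target element size, as a preprocessor might
# store it before handing the geometry to a mesh generator.
v = [Vertex(0, 0, 0), Vertex(1, 0, 0), Vertex(1, 1, 0), Vertex(0, 1, 0)]
edges = [Edge(v[i], v[(i + 1) % 4]) for i in range(4)]
face = Face(edges=edges, mesh_size=0.1)
region = Region(faces=[face], material="steel")
print(len(region.faces[0].edges), "boundary edges, target element size", face.mesh_size)
```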

    Higher-order perturbation solutions to dynamic, discrete-time rational expectations models

    We present an algorithm and software routines for computing nth order Taylor series approximate solutions to dynamic, discrete-time rational expectations models around a nonstochastic steady state. The primary advantage of higher-order (as opposed to first- or second-order) approximations is that they are valid not just locally, but often globally (i.e., over nonlocal, possibly very large compact sets) in a rigorous sense that we specify. We apply our routines to compute first- through seventh-order approximate solutions to two standard macroeconomic models, a stochastic growth model and a life-cycle consumption model, and discuss the quality and global properties of these solutions.
    Keywords: Macroeconomics - Econometric models; Business cycles; Monetary policy
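
    To make the idea of an nth-order approximation concrete, the univariate sketch below evaluates Taylor polynomials of increasing order around a steady state. The toy "policy function", its derivatives and the evaluation point are invented for illustration and are not the paper's routines or models.

```python
# Toy sketch, not the paper's routines: evaluate nth-order Taylor
# approximations of a univariate policy function around a nonstochastic
# steady state x_ss, given its derivatives at x_ss.
import math

def taylor_approx(derivs, x_ss, x):
    """Evaluate sum_k derivs[k] / k! * (x - x_ss)**k, where derivs[k] = g^(k)(x_ss)."""
    dx = x - x_ss
    return sum(d / math.factorial(k) * dx**k for k, d in enumerate(derivs))

# Invented example: g(x) = exp(0.5 * x) with steady state x_ss = 0, so the
# k-th derivative at the steady state is 0.5**k.  Evaluate well away from x_ss.
x_ss, x = 0.0, 2.0
true_val = math.exp(0.5 * x)
for order in (1, 2, 7):
    derivs = [0.5**k for k in range(order + 1)]
    approx = taylor_approx(derivs, x_ss, x)
    print(f"order {order}: approx = {approx:.4f}, abs. error = {abs(approx - true_val):.4f}")
```

    In this toy case the error shrinks rapidly with the order even far from the steady state, which is the qualitative behavior the abstract claims for higher-order approximations over large compact sets.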

    Compression of spectral meteorological imagery

    Data compression is essential to current low-earth-orbit spectral sensors with global coverage, e.g., meteorological sensors. Such sensors routinely produce in excess of 30 Gb of data per orbit (over 4 Mb/s for about 110 min) while typically limited to less than 10 Gb of downlink capacity per orbit (15 minutes at 10 Mb/s). Astro-Space Division develops spaceborne compression systems for compression ratios ranging from as little as three-to-one to as much as twenty-to-one for high-fidelity reconstructions. Current hardware production and development at Astro-Space Division focuses on discrete cosine transform (DCT) systems implemented with the GE PFFT chip, a 32x32 2D-DCT engine. Spectral relations in the data are exploited through block mean extraction followed by an orthonormal transformation. The transformation produces blocks with spatial correlation that are suitable for further compression with any block-oriented spatial compression system, e.g., Astro-Space Division's Laplacian modeler and analytic encoder of DCT coefficients.
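
    The spectral step described above (block mean extraction followed by an orthonormal transform across the bands) can be sketched as follows. The band count, block size, synthetic data and the choice of a DCT as the orthonormal transform are assumptions made for illustration, not details taken from the Astro-Space Division hardware.

```python
# Hedged numpy sketch: remove the per-band block means, then apply an
# orthonormal transform across the spectral axis so that most of the energy
# collapses into a few planes; each resulting plane is still a spatially
# correlated block that a block-oriented spatial coder could compress.
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II matrix of size n x n."""
    k = np.arange(n)[:, None]
    i = np.arange(n)[None, :]
    m = np.cos(np.pi * (2 * i + 1) * k / (2 * n)) * np.sqrt(2.0 / n)
    m[0] /= np.sqrt(2)
    return m

rng = np.random.default_rng(0)
bands, h, w = 8, 32, 32                                  # illustrative sizes
shared = rng.normal(0.0, 5.0, (h, w))                    # detail common to all bands
block = 100.0 + shared[None, :, :] + rng.normal(0.0, 0.5, (bands, h, w))

means = block.mean(axis=(1, 2))                          # block mean per spectral band
residual = block - means[:, None, None]                  # remove the means first

D = dct_matrix(bands)                                    # orthonormal spectral transform
spectral = np.tensordot(D, residual, axes=1)             # decorrelated planes

energy = (spectral ** 2).sum(axis=(1, 2))
print("share of energy in the first spectral plane: %.1f%%"
      % (100 * energy[0] / energy.sum()))
```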

    Modeling Support System for Systems Analytic Research

    A computer-assisted mathematical modeling system based on an advanced interactive methodology is presented. The system aims to strike a trade-off between human mental models and regression-type models built from numerical data, using a flexible combination of statistical methods and graph-theoretical analysis. One of the main advantages of the system is its facilities for structuring both mental and mathematical models.
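
    The abstract does not specify which graph-theoretical techniques are used; as one generic illustration of structuring a model with a graph, the sketch below orders hypothetical model variables into dependency levels over a directed "influences" graph. All variable names and the graph itself are invented.

```python
# Illustrative sketch only: assign each model variable the length of the
# longest chain of influences leading to it, giving a layered structure of
# the model's dependencies.
from collections import defaultdict

# variable -> variables it directly influences (all names hypothetical)
influences = {
    "policy": ["investment"],
    "investment": ["capital"],
    "capital": ["output"],
    "labor": ["output"],
    "output": [],
}

def dependency_levels(graph):
    """Topological layering: level = longest influence chain ending at the node."""
    indeg = defaultdict(int)
    for src, dsts in graph.items():
        for d in dsts:
            indeg[d] += 1
    level = {n: 0 for n in graph}
    frontier = [n for n in graph if indeg[n] == 0]
    while frontier:
        nxt = []
        for n in frontier:
            for d in graph[n]:
                level[d] = max(level[d], level[n] + 1)
                indeg[d] -= 1
                if indeg[d] == 0:
                    nxt.append(d)
        frontier = nxt
    return level

for var, lvl in sorted(dependency_levels(influences).items(), key=lambda kv: kv[1]):
    print(f"level {lvl}: {var}")
```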

    Methods to Construct a Step-By-Step Beginner’s Guide (BG) to Decision Analytic Cost Effectiveness Modelling.

    Background: Although guidance on good research practice in health economic modelling is widely available, there is still a need for a simpler instructive resource that could guide a beginner modeller while modelling for the first time. Aim: To develop a Beginner’s Guide to be used as a hand-held guide alongside the model development process. Methods: A systematic review of best practice guidelines was used to construct a framework of steps undertaken during the model development process. Focused methods reviews supplemented this framework. Consensus was obtained amongst a group of model developers to review and finalise the content of the preliminary Beginner’s Guide. The final Beginner’s Guide was used to develop cost effectiveness models. Results: Thirty-two best practice guidelines were data-extracted, synthesised and critically evaluated to identify steps for model development, which formed a framework for the Beginner’s Guide. Within five phases of model development, eight broad submethods were identified and nineteen methodological reviews were conducted to develop the content of the draft Beginner’s Guide. Two rounds of consensus agreement were undertaken to reach agreement on the final Beginner’s Guide. To assess fitness for purpose (ease of use and completeness), models were developed both independently and by the researcher using the Beginner’s Guide. Conclusion: A combination of systematic review, methods reviews, consensus agreement and validation was used to construct a step-by-step Beginner’s Guide for developing decision analytic cost effectiveness models. The final Beginner’s Guide is a step-by-step resource to accompany the model development process from understanding the problem to be modelled, through model conceptualisation, model implementation and model checking, to reporting of the model results.

    SOCR: Statistics Online Computational Resource

    The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result, a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for interactive distribution modeling, virtual online probability experimentation, statistical data analysis, visualization and integration. Following years of experience in statistical teaching at all college levels using established licensed statistical software packages, such as STATA, S-PLUS, R, SPSS, SAS and Systat, we have attempted to engineer a new statistics education environment, the Statistics Online Computational Resource (SOCR). This resource performs many of the standard types of statistical analysis, much like other classical tools. In addition, it is designed in a plug-in object-oriented architecture and is completely platform independent, web-based, interactive, extensible and secure. Over the past 4 years we have tested, fine-tuned and reanalyzed the SOCR framework in many of our undergraduate and graduate probability and statistics courses and have evidence that SOCR resources build students' intuition and enhance their learning.
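
    The paper describes a plug-in, object-oriented architecture; the sketch below is a language-agnostic illustration of that idea (it is not SOCR's actual API): analysis modules register themselves against a common interface, so new distributions or analyses can be added without modifying the core. All class and method names are invented.

```python
# Illustrative plug-in registry: each analysis implements a shared interface
# and registers itself, so the hosting framework can discover and run it
# without hard-coding the available analyses.
from abc import ABC, abstractmethod

class AnalysisPlugin(ABC):
    name: str

    @abstractmethod
    def run(self, data):
        ...

REGISTRY = {}

def register(cls):
    REGISTRY[cls.name] = cls()
    return cls

@register
class MeanAnalysis(AnalysisPlugin):
    name = "mean"
    def run(self, data):
        return sum(data) / len(data)

@register
class RangeAnalysis(AnalysisPlugin):
    name = "range"
    def run(self, data):
        return max(data) - min(data)

sample = [2.0, 4.0, 6.0, 10.0]
for name, plugin in REGISTRY.items():
    print(name, "=", plugin.run(sample))
```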

    Models of Cognition: Neurological possibility does not indicate neurological plausibility

    Get PDF
    Many activities in Cognitive Science involve complex computer models and simulations of both theoretical and real entities. Artificial Intelligence, and the study of artificial neural nets in particular, are seen as major contributors in the quest for understanding the human mind. Computational models serve as objects of experimentation, and results from these virtual experiments are tacitly included in the framework of empirical science. Cognitive functions, like learning to speak or discovering syntactical structures in language, have been modeled, and these models are the basis for many claims about human cognitive capacities. Artificial neural nets (ANNs) have had some successes in the field of Artificial Intelligence, but the results from experiments with simple ANNs may have little value in explaining cognitive functions. The problem seems to be in relating cognitive concepts that belong to the `top-down' approach to models grounded in the `bottom-up' connectionist methodology. Merging the two fundamentally different paradigms within a single model can obfuscate what is really modeled. When the tools (simple artificial neural networks) are mismatched with the problems (explaining aspects of higher cognitive functions), models with little value in terms of explaining functions of the human mind are produced. The ability to learn functions from data points makes ANNs very attractive analytical tools. These tools can be developed into valuable models if the data are adequate and a meaningful interpretation of the data is possible. The problem is that, with appropriate data and labels that fit the desired level of description, almost any function can be modeled. It is my argument that small networks offer a universal framework for modeling any conceivable cognitive theory, so that neurological possibility can be demonstrated easily with relatively simple models. However, a model demonstrating that a cognitive function can be implemented using a distributed methodology does not necessarily add support to any claim that the cognitive function in question is neurologically plausible.
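
    The point that almost any labeling can be fitted by a small network is easy to demonstrate in a toy setting; the sketch below trains a one-hidden-layer network on an arbitrary set of labeled points. The data, architecture and hyperparameters are invented for illustration and are not drawn from the paper.

```python
# Toy demonstration: a small one-hidden-layer network fitted by gradient
# descent to an arbitrary labeling of points, illustrating why "possibility
# of implementation" results are cheap to produce.
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(-1, 1, 20).reshape(-1, 1)
y = np.sign(np.sin(4 * x))              # an arbitrary target labeling

hidden = 16
W1 = rng.normal(0, 1, (1, hidden)); b1 = np.zeros(hidden)
W2 = rng.normal(0, 1, (hidden, 1));  b2 = np.zeros(1)
lr = 0.1

for step in range(20000):
    h = np.tanh(x @ W1 + b1)            # hidden layer
    pred = h @ W2 + b2                  # linear output
    err = pred - y
    # gradients of the squared error (constant factors folded into lr)
    gW2 = h.T @ err / len(x); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h**2)
    gW1 = x.T @ dh / len(x); gb1 = dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

mse = float(((np.tanh(x @ W1 + b1) @ W2 + b2 - y) ** 2).mean())
print(f"training MSE after fitting the arbitrary labels: {mse:.4f}")
```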