1,251 research outputs found

    Sketching is more than making correct drawings

    Sketching in the context of a design process is not a goal in itself, but can be considered a tool to make better designs. Sketching as a design tool has several useful effects, such as ordering your thoughts, a better understanding of difficult shapes, functioning as a communication tool, and providing an iterative way of developing shapes. In our bachelor curriculum Industrial Design Engineering we developed a series of courses that addresses these effects in particular. The courses are Sketching and concept drawing (SCT), Product Presentation Drawing (PPT) and Applied sketching skills (TTV). This line of courses is built on three pillars:
    - Learning to sketch: theory, speed and control of the materials.
    - Learning from sketching: developing a better insight into complex 3D shapes (Figure 1).
    - Sketching as a design tool: communication, ordering your thoughts, iterative working.
    As a result we see that students who have finished the courses instinctively start sketching in an iterative manner, use sketching as a source of inspiration, and learn that the whole process of iterative sketching helps in structuring, developing and communicating the design process. In this way the students become better sketchers and better designers.

    The New Zealand Resource Management Act

    New Zealand’s Resource Management Act, 1991, greatly altered the legal basis of planning practice in that country. In some respects its provisions represented a dramatic change from what had gone before. The New Zealand Ministry for the Environment self-consciously sought through the Act to take a leading role globally in moving towards a sound basis for sustainable development. As other planning systems come under increasing scrutiny elsewhere in the world, this legislation from a small country in the southern hemisphere deserves attention.

    Advancements in renal protection



    Contextualising: geological time scale and preservation of the fossil record

    Contextualising: geological time scale and preservation of the fossil record. Our point of departure is that the universe came into existence 13 000 million years ago as a result of a big bang. The earth formed in the known galactic system 4 600 million years ago. The concept of geological time and its measurement changed during the course of human history. In the 18th century, some natural scientists formed the view that the earth was very old, although it was not possible to determine the absolute age of the earth before the discovery of radioactivity (1896) and its application in the dating of rocks (1905). The principle of uniformity (the present is the key to the past) of James Hutton (1726-1797) is replaced in modern geology by the constancy of natural laws and uniform processes, while acknowledging that the rates of processes may vary considerably. It is also acknowledged that natural catastrophic occurrences, such as meteorite impacts and earthquakes, can occur and that they are ordinary geological processes. Since the 17th century, geologists in Europe and Britain, and later North America, found that they could determine the relative ages of rock successions and compare rock successions over long distances by applying stratigraphic principles. The result of this research was the geological timescale, as well as a description of the broad history of life on earth. This geological timescale was already completed at the end of the 1830s, some 20 years before the publication of Darwin’s theory of evolution by natural selection in 1859. By using radiometric dating, it became possible to assign absolute ages to the different Periods of the geological timescale. That provided a good test of the reliability of the principles applied when the timescale was compiled. Nowhere were any contradictions found between the relative ages of Periods in the timescale and their absolute ages.
The geological timescale is not invalidated by limitations in the application of radiometric dating, because the timescale is based on relative age and fossil content. The concept of deep space is readily accepted by Christians, but not the concept of deep time. It is important to remember that deep space and deep time are scientific concepts: findings, theories and concepts that may change when new knowledge is gained. Our faith, however, is not subject to scientific theories. We steadfastly stand by what we know and believe: God is the Creator and Sustainer of everything, of the whole universe. Scientific findings serve to bring us a deeper awareness of God’s omnipotence.
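    The absolute-age determination mentioned in this abstract rests on the standard radioactive decay law. A minimal sketch, with illustrative function names of my own choosing, assuming a closed system with no initial daughter isotope:

    ```python
    import math

    def radiometric_age(daughter_parent_ratio, half_life_years):
        """Age from the accumulated daughter/parent isotope ratio:
        t = ln(1 + D/P) / lambda, with lambda = ln(2) / half-life."""
        decay_constant = math.log(2) / half_life_years
        return math.log(1.0 + daughter_parent_ratio) / decay_constant

    # A daughter/parent ratio of 1.0 means exactly one half-life has elapsed.
    age = radiometric_age(1.0, half_life_years=704e6)  # U-235 half-life
    ```

    A ratio of 1.0 yields an age equal to one half-life, here about 704 million years.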

    Kinetics of the low-temperature pyrolysis of polyethene, polypropene and polystyrene modeling, experimental determination and comparison with literature models and data

    The pyrolysis kinetics of low-density polyethylene, high-density polyethylene, polypropylene, and polystyrene has been studied at temperatures below 450 °C. In addition, a literature review on the low-temperature pyrolysis of these polymers has been conducted and has revealed that the scatter in the reported kinetic data is significant, which is most probably due to the use of simple first-order kinetic models to interpret the experimental data. This model type is only applicable in a small conversion range, but was used by many authors over a much wider conversion range. In this investigation the pyrolysis kinetics of the aforementioned polymers and a mixture of polymers has been studied at temperatures below 450 °C by performing isothermal thermogravimetric analysis (TGA) experiments. The TGA experimental data was used to determine the kinetic parameters on the basis of a simple first-order model for high conversions (70-90%) and a model developed in the present study, termed the random chain dissociation (RCD) model, for the entire conversion range. The influence of important parameters, such as molecular weight, extent of branching and β-scission on the pyrolysis kinetics was studied with the RCD model. This model was also used to calculate the primary product spectrum of the pyrolysis process. The effect of the extent of branching and the initial molecular weight on the pyrolysis process was also studied experimentally. The effect of the extent of branching was found to be quite significant, but the effect of the initial molecular weight was minor. These results were found to agree quite well with the predictions obtained from the RCD model. Finally, the behavior of mixtures of the aforementioned polymers was studied and it was found that the pyrolysis kinetics of the polymers in the mixture remains unaltered in comparison with the pyrolysis kinetics of the pure polymers.
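    The simple first-order model criticized in this abstract can be sketched in a few lines. This is a generic illustration (not the authors' RCD model): conversion under isothermal first-order kinetics follows α(t) = 1 − exp(−kt), and the rate constant can be recovered by linearizing −ln(1 − α) = kt.

    ```python
    import numpy as np

    def first_order_conversion(t, k):
        # Isothermal first-order mass-loss model: d(alpha)/dt = k * (1 - alpha)
        return 1.0 - np.exp(-k * t)

    def fit_rate_constant(t, alpha):
        # Linearize -ln(1 - alpha) = k*t and take the least-squares
        # slope through the origin.
        y = -np.log(1.0 - alpha)
        return float(np.sum(y * t) / np.sum(t * t))

    t = np.linspace(0.0, 100.0, 50)            # time, arbitrary units
    alpha = first_order_conversion(t, k=0.03)  # synthetic TGA conversion curve
    k_fit = fit_rate_constant(t, alpha)        # recovers k = 0.03
    ```

    On real TGA data the fitted k would depend strongly on the conversion window chosen, which is exactly the scatter the abstract attributes to applying this model over too wide a range.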

    Hyper-differential sensitivity analysis with respect to model discrepancy: Optimal solution updating

    A common goal throughout science and engineering is to solve optimization problems constrained by computational models. However, in many cases a high-fidelity numerical emulation of systems cannot be optimized due to code complexity and computational costs which prohibit the use of intrusive and many-query algorithms. Rather, lower-fidelity models are constructed to enable intrusive algorithms for large-scale optimization. As a result of the discrepancy between high- and low-fidelity models, optimal solutions determined using low-fidelity models are frequently far from true optimality. In this article we introduce a novel approach that uses post-optimality sensitivities with respect to model discrepancy to update the optimization solution. Limited high-fidelity data is used to calibrate the model discrepancy in a Bayesian framework, which in turn is propagated through post-optimality sensitivities of the low-fidelity optimization problem. Our formulation exploits structure in the post-optimality sensitivity operator to achieve computational scalability. Numerical results demonstrate how an optimal solution computed using a low-fidelity model may be significantly improved with limited evaluations of a high-fidelity model.
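    The solution-updating idea can be illustrated on a scalar toy problem (my own construction, not the article's formulation): parameterize the low-fidelity objective by a discrepancy variable theta, differentiate the first-order optimality condition, and use the resulting sensitivity to correct the low-fidelity optimum once theta has been calibrated.

    ```python
    # Toy low-fidelity objective J(z, theta) = 0.5*q*z**2 - (b + theta)*z,
    # where theta models the (initially unknown) model discrepancy.
    q, b = 4.0, 2.0

    def argmin_J(theta):
        # Closed-form optimizer from dJ/dz = q*z - (b + theta) = 0
        return (b + theta) / q

    # Post-optimality sensitivity: dz*/dtheta = -H^{-1} * d2J/(dz dtheta)
    H = q                          # Hessian d2J/dz2
    cross = -1.0                   # mixed derivative d2J/(dz dtheta)
    sensitivity = -cross / H       # = 1/q

    z_low = argmin_J(0.0)          # optimum with discrepancy ignored
    theta_hat = 0.8                # discrepancy calibrated from hi-fi data
    z_updated = z_low + sensitivity * theta_hat
    ```

    For this quadratic the update is exact; in general it is a first-order correction, which is why limited high-fidelity data can already move the solution substantially toward true optimality.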

    Hyper-differential sensitivity analysis with respect to model discrepancy: Mathematics and computation

    Model discrepancy, defined as the difference between model predictions and reality, is ubiquitous in computational models for physical systems. It is common to derive partial differential equations (PDEs) from first-principles physics, but make simplifying assumptions to produce tractable expressions for the governing equations or closure models. These PDEs are then used for analysis and design to achieve desirable performance. For instance, the end goal may be to solve a PDE-constrained optimization (PDECO) problem. This article considers the sensitivity of PDECO problems with respect to model discrepancy. We introduce a general representation of the discrepancy and apply post-optimality sensitivity analysis to derive an expression for the sensitivity of the optimal solution with respect to the discrepancy. An efficient algorithm is presented which combines the PDE discretization, post-optimality sensitivity operator, adjoint-based derivatives, and a randomized generalized singular value decomposition to enable scalable computation. Kronecker product structure in the underlying linear algebra and corresponding infrastructure in PDECO is exploited to yield a general-purpose algorithm which is computationally efficient and portable across a range of applications. Known physics and problem-specific characteristics of discrepancy are imposed through user-specified weighting matrices. We demonstrate our proposed framework on two nonlinear PDECO problems to highlight its computational efficiency and rich insight.
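    The post-optimality sensitivity invoked here is a standard implicit-function argument; in generic notation (not the article's), with discrepancy parameter \(\delta\) and optimizer \(z^\star(\delta) = \arg\min_z J(z,\delta)\):

    ```latex
    % Differentiate the stationarity condition in delta, assuming the
    % Hessian \nabla_{zz} J is invertible at the optimum:
    \nabla_z J\bigl(z^\star(\delta),\,\delta\bigr) = 0
    \quad\Longrightarrow\quad
    \frac{\mathrm{d} z^\star}{\mathrm{d}\delta}
      = -\bigl(\nabla_{zz} J\bigr)^{-1}\,\nabla_{z\delta} J .
    ```

    The article's contribution lies in evaluating this operator at scale for PDECO, where the Hessian solve uses adjoint-based derivatives and the mixed-derivative term is compressed with a randomized generalized singular value decomposition.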