Application-Specific Number Representation
Reconfigurable devices, such as Field Programmable Gate Arrays (FPGAs), enable application-specific number representations. Well-known number formats include fixed-point, floating-point, logarithmic number system (LNS), and residue number system (RNS). Such different number representations lead to different arithmetic designs and error behaviours, thus producing implementations with different performance, accuracy, and cost.
To investigate the design options in number representations, the first part of this thesis presents
a platform that enables automated exploration of the number representation design space. The
second part of the thesis shows case studies that optimise the designs for area, latency or
throughput from the perspective of number representations.
Automated design space exploration in the first part addresses the following two major issues:
• Automation requires arithmetic unit generation. This thesis provides optimised arithmetic library generators for logarithmic and residue arithmetic units, which support a wide range of bit widths and achieve significant improvement over previous designs.
• Generation of arithmetic units requires specifying the bit widths for each variable. This thesis describes an automatic bit-width optimisation tool called R-Tool, which combines dynamic and static analysis methods, and supports different number systems (fixed-point, floating-point, and LNS numbers).
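The trade-off that bit-width optimisation navigates can be illustrated with a small sketch (not R-Tool itself; the values are illustrative): quantising a signal to a fixed-point format with f fraction bits bounds the rounding error by 2^-(f+1), so each extra bit halves the worst-case error while widening every arithmetic unit in the datapath.

```python
# Illustrative sketch of the fixed-point accuracy/cost trade-off behind
# bit-width optimisation. Not taken from the thesis; values are examples.

def quantize_fixed(x, frac_bits):
    """Round x to the nearest multiple of 2**-frac_bits (fixed-point grid)."""
    scale = 1 << frac_bits
    return round(x * scale) / scale

values = [0.1, 0.333333, 0.718281, 0.9]
for f in (4, 8, 12):
    worst = max(abs(v - quantize_fixed(v, f)) for v in values)
    # rounding error is bounded by half an LSB: 2**-(f + 1)
    assert worst <= 2 ** -(f + 1)
```

A bit-width optimiser searches for the smallest f (per variable) for which the accumulated error still meets the application's accuracy requirement.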
Putting it all together, the second part explores the effects of application-specific number
representation on practical benchmarks, such as radiative Monte Carlo simulation, and seismic
imaging computations. Experimental results show that customising the number representations
brings benefits to hardware implementations: by selecting a more appropriate number format,
we can reduce the area cost by up to 73.5% and improve the throughput by 14.2% to 34.1%; by
performing the bit-width optimisation, we can further reduce the area cost by 9.7% to 17.3%.
On the performance side, hardware implementations with customised number formats achieve 5 to potentially over 40 times speedup over software implementations.
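To make the less familiar of these formats concrete, here is a minimal sketch (not from the thesis; moduli chosen for illustration) of residue number system arithmetic: a value is held as residues modulo pairwise-coprime moduli, multiplication becomes independent carry-free per-channel work, and the Chinese Remainder Theorem recovers the integer.

```python
# Illustrative RNS sketch. Moduli are an assumption for the example; any
# pairwise-coprime set works, with dynamic range equal to their product.

MODULI = (7, 11, 13)          # pairwise coprime; dynamic range 7*11*13 = 1001

def to_rns(x):
    return tuple(x % m for m in MODULI)

def rns_mul(a, b):
    # carry-free, channel-parallel multiply: the property that makes RNS
    # attractive for FPGA datapaths
    return tuple((ai * bi) % m for ai, bi, m in zip(a, b, MODULI))

def from_rns(residues):
    # Chinese Remainder Theorem reconstruction
    M = 1
    for m in MODULI:
        M *= m
    x = 0
    for r, m in zip(residues, MODULI):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)   # pow(.., -1, m): modular inverse
    return x % M

assert from_rns(rns_mul(to_rns(23), to_rns(41))) == 23 * 41  # 943 < 1001
```

The catch, as the abstract's focus on arithmetic library generators suggests, is that comparison, division, and overflow detection are expensive in RNS, so format choice is genuinely application-specific.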
Characterization, Design, and Optimization of Dual-Purpose Wind Turbines and Frost Protection Fans
This thesis report outlines the creation of a MATLAB tool to design reversible machines that can function as both wind turbines and agricultural frost protection fans. Frost protection fans are used to prevent crop loss during radiative freeze events during which a temperature inversion is present. Such a dual-purpose machine fundamentally has the constraint that it must use symmetric airfoils, so a suite of tools for automatically designing an optimized wind turbine blade with symmetric airfoils using the Blade Element Momentum (BEM) theory approach is presented. The BEM code is then re-derived and adapted for use with a frost protection fan, which is analogous to a propeller at zero free-stream windspeed. The relative performance of a blade operating in fan mode is investigated using a turbulent jet entrainment model to predict the time-averaged temperature rise provided by the fan during a thermal inversion event. With these tools, an optimal configuration of blade pitch angle, rotor tilt angle, and tower height can be found for a given wind turbine blade. The models are incorporated into a cohesive program with a graphical user interface. The feasibility of such machines is found to depend heavily on the wind resource at a given site.
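The core of the BEM approach the abstract describes is a fixed-point iteration for the induction factors at each blade element. A minimal sketch follows; all numbers (radius, chord, speeds, thin-airfoil lift for a symmetric section, zero drag, no tip-loss correction) are illustrative assumptions, not values from the thesis.

```python
import math

# Minimal one-element BEM iteration sketch (illustrative assumptions only):
# thin-airfoil lift Cl = 2*pi*sin(alpha) for a symmetric section, Cd ~ 0.

def bem_element(U=8.0, omega=2.5, r=20.0, chord=1.5, blades=3,
                pitch=math.radians(2.0), tol=1e-8, max_iter=500):
    """Solve for axial (a) and tangential (ap) induction at one element."""
    sigma = blades * chord / (2 * math.pi * r)   # local solidity
    a, ap = 0.0, 0.0
    for _ in range(max_iter):
        phi = math.atan2((1 - a) * U, (1 + ap) * omega * r)  # inflow angle
        alpha = phi - pitch                                   # angle of attack
        cl = 2 * math.pi * math.sin(alpha)   # symmetric-airfoil lift coeff.
        cn = cl * math.cos(phi)              # normal force coefficient
        ct = cl * math.sin(phi)              # tangential force coefficient
        # momentum/blade-element balance for the induction factors
        a_new = 1.0 / (4 * math.sin(phi) ** 2 / (sigma * cn) + 1)
        ap_new = 1.0 / (4 * math.sin(phi) * math.cos(phi) / (sigma * ct) - 1)
        if abs(a_new - a) < tol and abs(ap_new - ap) < tol:
            return a_new, ap_new
        a, ap = a_new, ap_new
    return a, ap
```

Reversing the machine for fan mode, as the thesis does, amounts to re-deriving this balance for a propeller at zero free-stream windspeed, where the momentum equations above no longer apply directly.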
Survey of computer programs for heat transfer analysis
An overview is presented of the current capabilities of thirty-eight computer programs that can be used for the solution of heat transfer problems. These programs range from large, general-purpose codes with a broad spectrum of capabilities, a large user community, and comprehensive user support (e.g., ANSYS, MARC, MITAS-II, MSC/NASTRAN, SESAM-69/NV-615) to small, special-purpose codes with a limited user community, such as ANDES, NNTB, SAHARA, SSPTA, TACO, TEPSA, and TRUMP. The capabilities of the programs surveyed are listed in tabular form, followed by a summary of the major features of each program. As with any survey of computer programs, the present one has the following limitations: (1) it is useful only in the initial selection of the programs which are most suitable for a particular application; the final selection of the program to be used should, however, be based on a detailed examination of the documentation and the literature about the program; (2) since computer software continually changes, often at a rapid rate, some means must be found for updating this survey and maintaining some degree of currency.
Pathway using WUDAPT's Digital Synthetic City tool towards generating urban canopy parameters for multi-scale urban atmospheric modeling
The goal of the WUDAPT (World Urban Database and Access Portal Tools) project is to capture consistent information on urban form and function for cities worldwide that can support urban weather, climate, hydrology, and air quality modeling. These data are provided as urban canopy parameters (UCPs) as used by weather, climate, and air quality models to simulate the effects of urban surfaces on the overlying atmosphere. Information is stored with different levels of detail (LOD); with higher LOD, greater spatial precision is provided. At the lowest LOD, Local Climate Zones (LCZ) with nominal UCP ranges are provided (order 100 m or more). To describe the spatial heterogeneity present in cities with greater specificity at different urban scales, we introduce the Digital Synthetic City (DSC) tool to generate UCPs at any desired scale, meeting the fit-for-purpose goal of WUDAPT. 3D building and road elements of entire city landscapes are simulated based on readily available data. Comparisons with real-world urban data are very encouraging. The DSC is customized (C-DSC) to incorporate each city's unique building morphologies based on the types, variations, and spatial distribution of building typologies, architectural features, construction materials, and the distribution of green and pervious surfaces. The C-DSC uses crowdsourcing methods and sampling within city testbeds from around the world. UCP data can be computed from the synthetic images at selected grid sizes and stored such that the coded string provides UCP values for individual grid cells.
Imaging light transport at the femtosecond scale
Paper, milk, clouds, and white paint share a common property: they are opaque disordered media through which light scatters randomly rather than propagating in a straight path. Indeed, for very thick and turbid media, light eventually propagates in a ‘diffusive’ way, i.e. similarly to how tea infuses through hot water. Frequently, though, a material is neither perfectly opaque nor transparent and the simple diffusion model does not hold. In this work, we developed a novel optical-gating setup that allowed us to observe light transport in scattering media with sub-ps time resolution. An array of unexplored aspects of light propagation emerged from this spatio-temporal description, unveiling transport regimes that were previously inaccessible due to the extreme time scales involved and the lack of analytical models.
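The diffusive regime the abstract contrasts against can be demonstrated with a toy Monte Carlo sketch (an assumption for illustration: isotropic scattering with unit mean free path): after many random scattering events, the mean square displacement of "photons" grows linearly with the number of steps, which is the defining signature of diffusion.

```python
import numpy as np

# Toy random-walk model of diffusive light transport. Assumes isotropic
# scattering with unit-length steps; not the paper's experimental setup.

rng = np.random.default_rng(42)

def random_walk_msd(n_walkers=2000, n_steps=100):
    # draw random unit-vector steps in 3-D for every walker
    v = rng.normal(size=(n_walkers, n_steps, 3))
    v /= np.linalg.norm(v, axis=2, keepdims=True)
    r = v.sum(axis=1)                        # end-to-end displacement
    return np.mean(np.sum(r ** 2, axis=1))   # mean square displacement

# diffusive scaling: <r^2> ~ n_steps * (mean free path)^2
msd = random_walk_msd()
```

Intermediate and ballistic regimes, where too few scattering events have occurred for this scaling to hold, are exactly the regimes the sub-ps gating in this work makes observable.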
Man-Computer Interactive Data Access System (McIDAS). Continued development of McIDAS and operation in the GARP Atlantic tropical experiment
The complete output of the Synchronous Meteorological Satellite was recorded on one-inch magnetic tape. A quality-control subsystem tests cloud-track vectors against four sets of criteria: (1) rejection if the best match occurs on a correlation boundary; (2) rejection if the major correlation peak is not distinct and significantly greater than the secondary peak; (3) rejection if the correlation is not persistent; and (4) rejection if the acceleration is too great. A cloud height program determines cloud optical thickness from visible data and computes infrared emissivity. From the infrared data and a temperature profile, the cloud height is determined. A functional description and electronic schematics of the equipment are given.
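The four rejection criteria amount to a sequential filter, which can be sketched as follows. The field names and thresholds here are assumptions for illustration, not values from the McIDAS report.

```python
# Sketch of the four-stage quality-control filter described above.
# Thresholds (peak_ratio_min, accel_max) are illustrative assumptions.

def accept_vector(match_on_boundary, peak_ratio, persistent, acceleration,
                  peak_ratio_min=1.5, accel_max=10.0):
    """Return True if a cloud-track wind vector passes all four tests."""
    if match_on_boundary:            # (1) best match on correlation boundary
        return False
    if peak_ratio < peak_ratio_min:  # (2) primary peak not distinct enough
        return False                 #     relative to the secondary peak
    if not persistent:               # (3) correlation not persistent
        return False
    if acceleration > accel_max:     # (4) implied acceleration too great
        return False
    return True
```

Each test discards a distinct failure mode: spatial truncation of the search window, ambiguous correlation, temporal inconsistency, and physically implausible motion.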
Global sensitivity analysis of leaf-canopy-atmosphere RTMs: Implications for biophysical variables retrieval from top-of-atmosphere radiance data
Knowledge of the key variables driving the top-of-atmosphere (TOA) radiance over a vegetated surface is an important step towards deriving biophysical variables from TOA radiance data, e.g., as observed by an optical satellite. Coupled leaf-canopy-atmosphere Radiative Transfer Models (RTMs) allow linking vegetation variables directly to the at-sensor TOA radiance measured. Global Sensitivity Analysis (GSA) of RTMs enables the computation of the total contribution of each input variable to the output variance. We determined the impacts of the leaf-canopy-atmosphere variables on TOA radiance using GSA to gain insights into retrievable variables. The leaf and canopy RTM PROSAIL was coupled with the atmospheric RTM MODTRAN5. Because of MODTRAN's computational burden and GSA's demand for many simulations, we first developed a surrogate statistical learning model, i.e., an emulator, that approximates RTM outputs through a machine learning algorithm at low computational cost. A Gaussian process regression (GPR) emulator was used to reproduce lookup tables of TOA radiance as a function of 12 input variables with relative errors of 2.4%. GSA total sensitivity results quantified the driving variables of emulated TOA radiance along the 400-2500 nm spectral range at 15 cm-1 resolution (between 0.3-9 nm); overall, the vegetation variables play a more dominant role than the atmospheric variables. This suggests the possibility of retrieving biophysical variables directly from at-sensor TOA radiance data. Particularly promising are leaf chlorophyll content, leaf water thickness, and leaf area index, as these variables are the most important drivers governing TOA radiance outside the water absorption regions. A software framework was developed to facilitate the development of retrieval models from at-sensor TOA radiance data.
As a proof of concept, maps of these biophysical variables have been generated for both TOA (L1C) and bottom-of-atmosphere (L2A) Sentinel-2 data by means of a hybrid retrieval scheme, i.e., training GPR retrieval algorithms on the RTM simulations. The maps obtained from L1C and L2A data are consistent, suggesting that vegetation properties can be retrieved directly from TOA radiance data given a cloud-free sky, without the need for an atmospheric correction.
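The emulation idea underlying this work, replacing an expensive model with a cheap statistical surrogate, can be sketched on a toy problem (a hypothetical 1-D response standing in for an RTM run; none of this reproduces the paper's 12-variable MODTRAN setup).

```python
import numpy as np

# Toy Gaussian-process emulator: fit a GP to a few runs of an "expensive"
# model, then predict cheaply elsewhere. The target function is a made-up
# stand-in for an RTM; kernel length scale is an illustrative assumption.

def expensive_model(x):
    return np.sin(3 * x) + 0.5 * x          # hypothetical smooth response

def rbf(a, b, length=0.3):
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)  # squared-exponential kernel

rng = np.random.default_rng(0)
x_train = rng.uniform(0, 1, 25)              # 25 "expensive" model runs
y_train = expensive_model(x_train)

K = rbf(x_train, x_train) + 1e-8 * np.eye(len(x_train))  # jitter: stability
alpha = np.linalg.solve(K, y_train)

def emulate(x_new):
    # GP posterior mean: k(x*, X) K^{-1} y
    return rbf(np.atleast_1d(x_new), x_train) @ alpha

x_test = np.linspace(0, 1, 100)
err = np.max(np.abs(emulate(x_test) - expensive_model(x_test)))
```

Once trained, the emulator answers in microseconds, which is what makes the many thousands of evaluations a variance-based GSA demands affordable.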
Machine learning in solar physics
The application of machine learning in solar physics has the potential to
greatly enhance our understanding of the complex processes that take place in
the atmosphere of the Sun. By using techniques such as deep learning, we are
now in the position to analyze large amounts of data from solar observations
and identify patterns and trends that may not have been apparent using
traditional methods. This can help us improve our understanding of explosive events like solar flares, which can have a strong effect on the Earth's environment; predicting such hazardous events has become crucial for our technological society. Machine learning can also improve our understanding of the inner workings of the Sun itself by allowing us to go deeper into the data and to propose more complex models to explain them. Additionally, the use of
machine learning can help to automate the analysis of solar data, reducing the
need for manual labor and increasing the efficiency of research in this field.
Comment: 100 pages, 13 figures, 286 references, accepted for publication as a Living Review in Solar Physics (LRSP).