
    Direct Multifield Volume Ray Casting of Fiber Surfaces

    Multifield data are common in visualization, yet reducing these data to comprehensible geometry is a challenging problem. Fiber surfaces, the analogue of isosurfaces for bivariate volume data, are a promising new mechanism for understanding multifield volumes. In this work, we explore direct ray casting of fiber surfaces from volume data without any explicit geometry extraction. We sample directly along rays in domain space and perform geometric tests in range space, where fibers are defined, using a signed distance field derived from the control polygons. Our method requires little preprocessing and enables real-time exploration of data, dynamic modification and pixel-exact rendering of fiber surfaces, and support for higher-order interpolation in domain space. We demonstrate this approach on several bivariate datasets, including analysis of multifield combustion data.
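    The core loop of such a ray caster can be sketched as follows. This is a minimal illustration of the idea rather than the authors' implementation: the bivariate field functions, the polygon representation, and the signed-distance helper are assumptions introduced for the example.

```python
import numpy as np

def signed_distance(point, polygon):
    """Signed distance from a 2D range-space point to a closed control polygon
    (negative inside, positive outside, even-odd rule for the sign)."""
    px, py = point
    dmin, inside = np.inf, False
    n = len(polygon)
    for i in range(n):
        ax, ay = polygon[i]
        bx, by = polygon[(i + 1) % n]
        abx, aby = bx - ax, by - ay
        # distance to the segment (a, b)
        t = np.clip(((px - ax) * abx + (py - ay) * aby) /
                    (abx * abx + aby * aby + 1e-12), 0.0, 1.0)
        dmin = min(dmin, np.hypot(px - (ax + t * abx), py - (ay + t * aby)))
        # even-odd crossing test for the sign
        if (ay > py) != (by > py) and px < ax + (py - ay) * (bx - ax) / (by - ay):
            inside = not inside
    return -dmin if inside else dmin

def cast_ray(origin, direction, f1, f2, polygon, n_steps=256, step=0.01):
    """March along a ray in domain space and report the first point where the
    range-space signed distance changes sign, i.e. where the ray crosses the
    fiber surface. A production renderer would refine the hit with bisection."""
    prev = None
    for k in range(n_steps):
        p = origin + k * step * direction
        d = signed_distance((f1(p), f2(p)), polygon)   # map the sample to range space
        if prev is not None and np.sign(d) != np.sign(prev):
            return p
        prev = d
    return None
```

    Because the geometric test happens in range space, the control polygon can be edited interactively without re-extracting any geometry, which is what makes dynamic modification of the fiber surface cheap.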

    Modelling the geographical distribution of soil-transmitted helminth infections in Bolivia

    The prevalence of infection with the three common soil-transmitted helminths (i.e. Ascaris lumbricoides, Trichuris trichiura, and hookworm) in Bolivia is among the highest in Latin America. However, the spatial distribution and burden of soil-transmitted helminthiasis are poorly documented. We analysed historical survey data using Bayesian geostatistical models to identify determinants of the distribution of soil-transmitted helminth infections, predict the geographical distribution of infection risk, and assess treatment needs and costs in the context of preventive chemotherapy. Rigorous geostatistical variable selection identified the most important predictors of A. lumbricoides, T. trichiura, and hookworm transmission. Results show that precipitation above 400 mm during the wettest quarter favours the distribution of A. lumbricoides. Altitude has a negative effect on T. trichiura. Hookworm is sensitive to temperature during the coldest month. We estimate that 38.0%, 19.3%, and 11.4% of the Bolivian population is infected with A. lumbricoides, T. trichiura, and hookworm, respectively. Assuming independence of the three infections, 48.4% of the population is infected with at least one soil-transmitted helminth. Empirically based estimates, following the treatment recommendations of the World Health Organization, suggest a total of 2.9 million annualised treatments for the control of soil-transmitted helminthiasis in Bolivia. We provide estimates of soil-transmitted helminth infection in Bolivia based on high-resolution spatial prediction and an innovative variable selection approach. However, the scarcity of the data suggests that a national survey is required for more accurate mapping to guide spatial targeting of soil-transmitted helminthiasis control.
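    For context, the "any soil-transmitted helminth" figure follows from the stated independence assumption; applied at each prediction location (before population weighting), the combined risk would be, with the notation introduced here ($p_A$, $p_T$, $p_h$ the species-specific prevalence estimates):

```latex
p_{\text{any}} \;=\; 1 - \bigl(1 - p_{A}\bigr)\bigl(1 - p_{T}\bigr)\bigl(1 - p_{h}\bigr)
```

    Aggregating such per-location risks over the population surface presumably yields a national figure like the 48.4% reported above.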

    Implementation of Linear and Lagrange Interpolation on Compression of Fibrous Peat Soil Prediction

    Previous studies have predicted the compression of fibrous peat soils using the Gibson & Lo method, but the prediction process has been carried out manually and is therefore time-consuming. This research implements linear and Lagrange interpolation methods in Matlab to speed up the prediction process. The results of the two implementations are also compared to determine which is more effective for prediction. Based on the trials and analysis, prediction of fibrous peat soil compression using linear interpolation is more effective than using Lagrange interpolation: the average RMSE of the linear-interpolation predictions is smaller, with a difference in average RMSE of 7.7. In addition, prediction with Lagrange interpolation takes longer, because it iterates over as many points as there are laboratory test data.
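    A minimal sketch of the comparison is given below, in Python rather than the Matlab used in the study; the settlement readings and variable names are purely illustrative, not the paper's laboratory data.

```python
import numpy as np
from scipy.interpolate import lagrange

# Hypothetical consolidation readings: elapsed time (hours) vs. compression (mm).
t_obs = np.array([0.0, 1.0, 2.0, 4.0, 8.0, 24.0])
s_obs = np.array([0.0, 1.2, 1.9, 2.8, 3.6, 4.9])

t_query = np.linspace(0.0, 24.0, 49)

# Linear interpolation: piecewise and local, so its cost stays low
# regardless of how many laboratory readings there are.
s_linear = np.interp(t_query, t_obs, s_obs)

# Lagrange interpolation: one global polynomial through every reading;
# the iteration count grows with the number of laboratory data points.
s_lagrange = lagrange(t_obs, s_obs)(t_query)

def rmse(predicted, reference):
    return np.sqrt(np.mean((predicted - reference) ** 2))

# In the study, RMSE is computed against measured compression; here the two
# interpolants are simply compared against each other as a stand-in.
print("RMSE between the two interpolants:", rmse(s_linear, s_lagrange))
```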

    Sparse Modelling and Multi-exponential Analysis

    The research fields of harmonic analysis, approximation theory and computer algebra are seemingly different domains studied by seemingly separate research communities. However, they are connected to each other in many ways. The connection between harmonic analysis and approximation theory is not accidental: several constructions, among which wavelets and Fourier series, provide major insights into central problems in approximation theory. And the intimate connection between approximation theory and computer algebra has existed even longer: polynomial interpolation is a long-studied and important problem in both symbolic and numeric computing, in the former to counter expression swell and in the latter to construct a simple data model. A common underlying problem statement in many applications is that of determining the number of components, and for each component the value of the frequency, damping factor, amplitude and phase in a multi-exponential model. It occurs, for instance, in magnetic resonance and infrared spectroscopy, vibration analysis, seismic data analysis, electronic odour recognition, keystroke recognition, nuclear science, music signal processing, transient detection, motor fault diagnosis, electrophysiology, drug clearance monitoring and glucose tolerance testing, to name just a few. The general technique of multi-exponential modelling is closely related to what is commonly known as the Padé-Laplace method in approximation theory and to the technique of sparse interpolation in the field of computer algebra. The problem statement is also addressed using a stochastic perturbation method in harmonic analysis. Multi-exponential modelling is an inverse problem and may therefore be severely ill-posed, depending on the relative location of the frequencies and phases. Besides the reliability of the estimated parameters, the sparsity of the multi-exponential representation has become important. A representation is called sparse if it is a combination of only a few elements instead of all available generating elements. In sparse interpolation, the aim is to determine all the parameters from only a small number of data samples, with a complexity proportional to the number of terms in the representation. Despite the close connections between these fields, there is a clear lack of communication in the scientific literature. The aim of this seminar is to bring together researchers from the three fields mentioned above, along with scientists from the varied application domains.

    Output Type: Meeting Report
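    For concreteness, one common way to write the multi-exponential model discussed here is (notation introduced for this note):

```latex
f(t) \;=\; \sum_{j=1}^{n} \alpha_j \, e^{(-\lambda_j + 2\pi i\,\nu_j)\,t},
\qquad \alpha_j = a_j e^{i\phi_j},
```

    where $n$ is the (sparse) number of components and each term carries a frequency $\nu_j$, damping factor $\lambda_j$, amplitude $a_j$ and phase $\phi_j$; sparse interpolation aims to recover all of these from a number of samples proportional to $n$.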

    A Review of Image Super Resolution using Deep Learning

    The image processing methods collectively known as super-resolution have proven useful for creating high-quality images from a group of low-resolution photographic images. Single image super-resolution (SISR) has been applied in a variety of fields. This paper offers an in-depth analysis of several recent image super-resolution works developed in various domains. To understand the most recent developments in image super-resolution systems, these publications have been examined with particular attention to the domain for which each system was designed and whether image enhancement was used, among other factors. To improve the accuracy of image super-resolution, different deep learning techniques might be explored. In particular, further research into image super-resolution for medical imaging could improve the suitability of the data for subsequent analysis. In light of this, there is considerable scope for research in the field of medical imaging.
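    As a point of reference for the surveyed methods, the sketch below shows a tiny SRCNN-style baseline in PyTorch; the layer widths and kernel sizes follow the well-known SRCNN recipe, but the code is only an illustrative example, not any specific system reviewed here.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinySRCNN(nn.Module):
    """Minimal SRCNN-style network: the low-resolution image is first upsampled
    to the target size (e.g. bicubically), then refined by three conv layers."""
    def __init__(self, channels: int = 1):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, 64, kernel_size=9, padding=4),  # patch extraction
            nn.ReLU(inplace=True),
            nn.Conv2d(64, 32, kernel_size=1),                   # non-linear mapping
            nn.ReLU(inplace=True),
            nn.Conv2d(32, channels, kernel_size=5, padding=2),  # reconstruction
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.body(x)

# Usage sketch: upsample a low-resolution patch, then let the network refine it.
lr = torch.rand(1, 1, 32, 32)
upsampled = F.interpolate(lr, scale_factor=2, mode="bicubic", align_corners=False)
sr = TinySRCNN()(upsampled)
```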

    New tools for quantitative analysis of nuclear architecture

    The cell nucleus houses a wide variety of macromolecular substructures, including the cell’s genetic material. The spatial configuration of these substructures is thought to be fundamentally associated with nuclear function, yet the architectural organisation of the cell nucleus is only poorly understood. Advances in microscopy and associated fluorescence techniques have provided a wealth of nuclear image data. Such images offer the opportunity both for visualising nuclear substructures and for quantitative investigation of the spatial configuration of these objects. In this thesis, we present new tools to study and explore the subtle principles behind nuclear architecture. We describe a novel method to segment fluorescence microscopy images of nuclear objects. The effectiveness of this segmentation algorithm is demonstrated using extensive simulation. Additionally, we show that the method performs as well as manual thresholding, which is considered the gold standard. Next, randomisation-based tests from spatial point pattern analysis are employed to inspect spatial interactions of nuclear substructures. The results suggest new and interesting spatial relationships in the nucleus. However, this approach probes only relative nuclear organisation and cannot readily yield a description of absolute spatial preference, which may be a key component of nuclear architecture. To address this problem we have developed methodology based on techniques from statistical shape analysis and image registration. The approach proposes that the nuclear boundary can be used to align nuclei from replicate images into a common coordinate system. Each nucleus and its contents can therefore be registered to the sample mean shape using rigid and non-rigid deformations. These aggregated data allow inference regarding global nuclear spatial organisation. For example, the kernel-smoothed intensity function is computed to return an estimate of the intensity function of the registered nuclear objects. Simulation provides evidence that the registration procedure is sensible and the results accurate. Finally, we have investigated a large database of nuclear substructures using conventional methodology as well as our new tools. We have identified novel spatial relationships between nuclear objects that offer significant clues to their function. We have also examined the absolute spatial configuration of these substructures in registered data. The results reveal dramatic underlying spatial preferences and present new and clear insights into nuclear architecture.
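    The randomisation-based idea can be sketched as follows. This is a generic Monte Carlo test for spatial interaction between two classes of nuclear objects, not the exact test used in the thesis; the coordinates, the sampling of the nuclear mask, and the summary statistic are assumptions made for the example.

```python
import numpy as np

def mean_cross_nn_distance(a, b):
    """Mean distance from each object centre in `a` to its nearest neighbour in `b`."""
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    return d.min(axis=1).mean()

def randomisation_test(objs_a, objs_b, mask_points, n_sim=999, seed=0):
    """Monte Carlo test for attraction between two classes of nuclear objects.
    The observed cross nearest-neighbour distance is compared with distances
    obtained by repositioning class B uniformly at random over candidate
    locations sampled inside the segmented nuclear mask."""
    rng = np.random.default_rng(seed)
    observed = mean_cross_nn_distance(objs_a, objs_b)
    sims = np.empty(n_sim)
    for i in range(n_sim):
        idx = rng.choice(len(mask_points), size=len(objs_b), replace=False)
        sims[i] = mean_cross_nn_distance(objs_a, mask_points[idx])
    # One-sided Monte Carlo p-value: are the two classes closer than expected by chance?
    p_value = (1 + np.sum(sims <= observed)) / (n_sim + 1)
    return observed, sims, p_value
```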

    Remote Sensing of Natural Hazards

    Each year, natural hazards such as earthquakes, cyclones, flooding, landslides, wildfires, avalanches, volcanic eruptions, extreme temperatures, storm surges, and drought result in widespread loss of life, livelihood, and critical infrastructure globally. With the unprecedented growth of the human population, large-scale development activities, and changes to the natural environment, the frequency and intensity of extreme natural events and their consequent impacts are expected to increase in the future.

    Technological interventions provide essential provisions for the prevention and mitigation of natural hazards. The data obtained through remote sensing systems with varied spatial, spectral, and temporal resolutions in particular provide prospects for furthering knowledge of spatiotemporal patterns and the forecasting of natural hazards. The collection of data using earth observation systems has been valuable for alleviating the adverse effects of natural hazards, especially given their near real-time capabilities for tracking extreme natural events. Remote sensing systems on different platforms also serve as important decision-support tools for devising response strategies, coordinating rescue operations, and making damage and loss estimates.

    With these in mind, this book seeks original contributions on advanced applications of remote sensing and geographic information systems (GIS) techniques to understanding various dimensions of natural hazards through new theory, data products, and robust approaches.