222 research outputs found

    Non-Gaussian Minkowski functionals & extrema counts in redshift space

    In the context of upcoming large-scale structure surveys such as Euclid, it is of prime importance to quantify the effect of peculiar velocities on geometric probes. Hence, the formalism for computing the geometrical and topological one-point statistics of mildly non-Gaussian 2D and 3D cosmic fields in redshift space is developed. Leveraging the partial isotropy of the target statistics, the Gram-Charlier expansion of the joint probability distribution of the field and its derivatives is reformulated in terms of the corresponding anisotropic variables. In particular, the cosmic non-linear evolution of the Minkowski functionals, together with the statistics of extrema, is investigated in turn for 3D catalogues and 2D slabs. The amplitude of the non-Gaussian redshift-distortion correction is estimated for these geometric probes. In 3D, gravitational perturbation theory is implemented in redshift space to predict the cosmic evolution of all relevant Gram-Charlier coefficients. Applications to the estimation of the cosmic parameters sigma(z) and beta = f/b1 from upcoming surveys are discussed. Such statistics are of interest for anisotropic fields beyond cosmology. Comment: 35 pages, 15 figures; matches the version published in MNRAS, with a typo corrected in Eq. A1.
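    For reference, a low-order Gram-Charlier expansion of the one-point PDF of a mildly non-Gaussian field takes the schematic form below (probabilists' Hermite polynomials H_n, field rms sigma, normalised field nu); the notation is illustrative and not necessarily that of the paper.

        P(\nu)\,\mathrm{d}\nu \;\simeq\; \frac{e^{-\nu^{2}/2}}{\sqrt{2\pi}}
        \left[ 1 + \sigma\,\frac{\langle \nu^{3}\rangle_{\mathrm{GC}}}{3!}\,H_{3}(\nu)
              + \sigma^{2}\,\frac{\langle \nu^{4}\rangle_{\mathrm{GC}}}{4!}\,H_{4}(\nu) + \dots \right] \mathrm{d}\nu ,

    where the Gram-Charlier coefficients ⟨ν^n⟩_GC are related to the cumulants of the field and, for the redshift-space statistics considered above, generalise to joint moments involving field derivatives parallel and perpendicular to the line of sight.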

    Fiber orientation distributions based on planar fiber orientation tensors of fourth order

    Fiber orientation tensors represent averaged measures of fiber orientations inside a microstructure. Although orientation-dependent material models are commonly used to describe the mechanical properties of representative microstructures, the influence of changing or differing microstructure on the material response is rarely investigated systematically for directional measures which are more precise than second-order fiber orientation tensors. For the special case of planar orientation distributions, a set of admissible fourth-order fiber orientation tensors is known. Fiber orientation distributions reconstructed from given orientation tensors are of interest both for numerical averaging schemes in material models and for visualization of the directional information itself. Focusing on the special case of planar orientations, this paper draws the geometric picture of fiber orientation distribution functions reconstructed from fourth-order fiber orientation tensors. The developed methodology can be adopted to study the dependence of material models on planar fourth-order fiber orientation tensors. Within the set of admissible fiber orientation tensors, a subset of distinct tensors is identified. Advantages and disadvantages of describing planar orientation states in two- or three-dimensional tensor frameworks are demonstrated. Reconstruction of fiber orientation distributions is performed by truncated Fourier series and additionally by deploying a maximum entropy method. The combination of the set of admissible and distinct fiber orientation tensors with the reconstruction methods leads to a variety of reconstructed fiber orientation distributions. This variety is visualized by arrangements of polar plots within the parameter space of fiber orientation tensors. This visualization shows the influence of averaged orientation measures on reconstructed orientation distributions and can be used to study any simulation method or quantity which is defined as a function of planar fourth-order fiber orientation tensors.
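    As a minimal sketch of the truncated-Fourier-series reconstruction mentioned above (not the paper's full methodology), the snippet below recovers a planar fiber orientation distribution psi(phi), with period pi, from components of the second- and fourth-order orientation tensors; the numerical values are hypothetical.

        import numpy as np

        def planar_odf_truncated_fourier(N, A, phi):
            # N: 2x2 second-order orientation tensor <n (x) n>, n = (cos phi, sin phi)
            # A: (A1111, A1122, A2222, A1112, A1222) components of the fourth-order tensor
            # phi: angles (period pi) at which the reconstructed ODF is evaluated
            A1111, A1122, A2222, A1112, A1222 = A
            c2 = N[0, 0] - N[1, 1]             # <cos 2phi>
            s2 = 2.0 * N[0, 1]                 # <sin 2phi>
            c4 = A1111 - 6.0 * A1122 + A2222   # <cos 4phi>
            s4 = 4.0 * (A1112 - A1222)         # <sin 4phi>
            return (1.0 / np.pi) * (1.0
                    + 2.0 * (c2 * np.cos(2 * phi) + s2 * np.sin(2 * phi))
                    + 2.0 * (c4 * np.cos(4 * phi) + s4 * np.sin(4 * phi)))

        # Hypothetical planar orientation state (N and A chosen to be mutually consistent).
        N = np.array([[0.70, 0.10], [0.10, 0.30]])
        A = (0.55, 0.15, 0.15, 0.07, 0.03)
        phi = np.linspace(0.0, np.pi, 181)
        psi = planar_odf_truncated_fourier(N, A, phi)
        print(np.trapz(psi, phi))  # integrates to approximately 1 by construction

    Note that a truncated Fourier reconstruction can become negative for strongly anisotropic states, which is a common motivation for maximum entropy reconstructions such as the one also deployed in the paper.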

    Spatio–temporal analysis of changes of shape for constituent bodies within biomolecular aggregates

    Changes of shape are important in many situations of interest in biology at different typical length scales. Approaches for modelling the behaviour of droplets in suspension and the thermally-driven motion of the molecular chains in enzymes are presented. Both models use orthogonal basis functions to describe the spatial dependences in a spherical geometry. Both models also describe the effect of time-dependent boundary data on the shape of the bodies involved: a stochastic response for the enzyme model (dimensions of the order of 10^-9 m) and a smooth response for the colloidal model (dimensions of the order of 10^-6 m). The first model considers the behaviour of a droplet of fluid surrounded by a thin film of host fluid, both fluids being Newtonian and immiscible, with a well-defined, continuous and smooth interface between these regions. The flows for the droplet and host fluid are assumed axisymmetric with small Reynolds numbers. An extension of traditional lubrication theory is used to model the flow of the host fluid, and a multi-modal Stokes flow is used to derive the flow within the droplet, subject to continuity conditions at the interface between the droplet and host fluid. The interface is free to move in response to the flows, under the effects of interfacial tension. Asymptotic expansions for the flow variables and interface are used to find the simplest behaviour of the system beyond the leading order. The second modelling approach used is the method of Zernike moments. Zernike moments are an extension of spherical harmonics to include more general radial dependence and the ability to model holes, folded layers, etc. within and on the unit sphere. The method has traditionally been used to describe the shape of enzymes in a static, time-independent manner. This approach is extended to give results based on the thermally-driven motion of atoms in molecules about their equilibrium positions. The displacements are assumed to be fitted by normal probability distributions. The precision and accuracy of this model are considered and compared to similar models. Results are plotted and discussed for both regimes, and extensions, improvements and a basis for further work are discussed for both approaches.
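    Zernike moments extend spherical harmonics with radial polynomials; as a minimal sketch of the orthogonal-basis idea (not the thesis' actual Zernike implementation), the snippet below projects a star-shaped surface r(theta, phi) onto complex spherical harmonics using SciPy. The example shape is hypothetical.

        import numpy as np
        from scipy.special import sph_harm

        def sph_harm_coeffs(r_func, lmax, n_theta=64, n_phi=128):
            # Project a star-shaped surface r(theta, phi) onto spherical harmonics
            # by simple quadrature (theta: polar angle, phi: azimuth).
            theta = np.linspace(0.0, np.pi, n_theta)
            phi = np.linspace(0.0, 2.0 * np.pi, n_phi, endpoint=False)
            T, P = np.meshgrid(theta, phi, indexing="ij")
            R = r_func(T, P)
            w = np.sin(T) * (np.pi / n_theta) * (2.0 * np.pi / n_phi)  # area weights
            coeffs = {}
            for l in range(lmax + 1):
                for m in range(-l, l + 1):
                    Y = sph_harm(m, l, P, T)   # SciPy argument order: (m, l, azimuth, polar)
                    coeffs[(l, m)] = np.sum(R * np.conj(Y) * w)
            return coeffs

        # A gently perturbed sphere; the (0, 0) and (2, ±2) modes dominate.
        bumpy = lambda t, p: 1.0 + 0.1 * np.cos(2.0 * p) * np.sin(t) ** 2
        c = sph_harm_coeffs(bumpy, lmax=4)
        print(abs(c[(0, 0)]), abs(c[(2, 2)]))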

    Measurement, optimisation and control of particle properties in pharmaceutical manufacturing processes

    Previously held under moratorium from 2 June 2020 until 6 June 2022. The understanding and optimisation of particle properties connected to their structure and morphology is a common objective for particle engineering applications, either to improve material handling in the manufacturing process or to influence Critical Quality Attributes (CQAs) linked to product performance. This work aims to demonstrate experimental means to support a rational development approach for pharmaceutical particulate systems, with a specific focus on droplet drying platforms such as spray drying. Micro-X-ray tomography (micro-XRT) is widely applied in areas such as geo- and biomedical sciences to enable a three-dimensional investigation of specimens. Chapter 4 elaborates on practical aspects of micro-XRT for the quantitative analysis of pharmaceutical solid products, with an emphasis on the implemented image processing and analysis methodologies. Potential applications of micro-XRT in the pharmaceutical manufacturing process range from the characterisation of single crystals to fully formulated oral dosage forms. Extracted quantitative information can be utilised to directly inform product design and production for process development or optimisation. The non-destructive nature of micro-XRT analysis can further be employed to investigate structure-performance relationships, which might provide valuable insights for modelling approaches. Chapter 5 demonstrates the applicability of micro-XRT to the analysis of ibuprofen capsules as a multi-particulate system, each capsule containing a population of approximately 300 pellets. The in-depth analysis of the collected micro-XRT image data allowed the extraction of more than 200 features quantifying aspects of the pellets' size, shape, porosity, surface and orientation. Feature selection and machine learning methods enabled the detection of broken pellets within a classification model. The classification model has an accuracy of more than 99.55% and a minimum precision of 86.20%, validated with a test dataset of 886 pellets from three capsules. The combination of single droplet drying (SDD) experiments with subsequent micro-XRT analysis was used for a quantitative investigation of the particle design space and is described in Chapter 6. The implemented platform was applied to investigate the solidification of formulated metformin hydrochloride particles using D-mannitol and hydroxypropyl methylcellulose within a selected, pragmatic particle design space. The results indicate a significant impact of hydroxypropyl methylcellulose in reducing liquid evaporation rates and slowing particle drying kinetics. The morphology and internal structure of the formulated particles after drying are dominated by a crystalline core of D-mannitol, which is partially suppressed by increasing hydroxypropyl methylcellulose additions. The characterisation of formulated metformin hydrochloride particles with increasing polymer content demonstrated the importance of an early-stage quantitative assessment of formulation-related particle properties. A reliable and rational spray drying development approach needs to assess parameters of the compound system as well as of the process itself in order to define a well-controlled and robust operational design space. Chapter 7 presents strategies for process implementation to produce peptide-based formulations via spray drying, demonstrated using s-glucagon as a model peptide. The process implementation was supported by an initial characterisation of the lab-scale spray dryer, assessing a range of relevant independent process variables including drying temperature and feed rate. The platform response was captured with available and in-house developed Process Analytical Technology. A B-290 Mini-Spray Dryer was used to verify the development approach and to implement the pre-designed spray drying process. Information on the particle formation mechanism observed in SDD experiments was utilised to interpret the characteristics of the spray dried material.
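    As a schematic sketch (on placeholder data) of the feature-based detection of broken pellets described for Chapter 5, the snippet below trains a generic classifier on a matrix of per-pellet descriptors; the thesis' actual features, model choice and tuning are not reproduced here.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import accuracy_score, precision_score

        # X: one row per pellet with morphological descriptors from micro-XRT
        # (size, shape, porosity, surface, orientation); y: 1 = broken, 0 = intact.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(3000, 200))                    # placeholder feature matrix
        y = (X[:, 0] + 0.5 * X[:, 1] > 1.5).astype(int)     # placeholder labels

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=886, stratify=y, random_state=0)
        clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
        pred = clf.predict(X_te)
        print("accuracy:", accuracy_score(y_te, pred))
        print("precision (broken class):", precision_score(y_te, pred))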

    Computing and visualising intra-voxel orientation-specific relaxation-diffusion features in the human brain

    Diffusion MRI techniques are widely used to study the characteristics of the human brain connectome in vivo. However, resolving and characterising white matter (WM) fibres in heterogeneous MRI voxels remains a challenging problem, typically approached with signal models that rely on prior information and constraints. We have recently introduced a 5D relaxation-diffusion correlation framework wherein multidimensional diffusion encoding strategies are used to acquire data at multiple echo times, increasing the amount of information encoded into the signal and easing the constraints needed for signal inversion. Nonparametric Monte Carlo inversion of the resulting datasets yields 5D relaxation-diffusion distributions where contributions from different sub-voxel tissue environments are separated with minimal assumptions on their microscopic properties. Here, we build on the 5D correlation approach to derive fibre-specific metrics that can be mapped throughout the imaged brain volume. Distribution components ascribed to fibrous tissues are resolved and subsequently mapped to a dense mesh of overlapping orientation bins to define a smooth orientation distribution function (ODF). Moreover, relaxation and diffusion measures are correlated to each independent ODF coordinate, thereby allowing the estimation of orientation-specific relaxation rates and diffusivities. The proposed method is tested on a healthy volunteer, where the estimated ODFs were observed to capture major WM tracts, resolve fibre crossings, and, more importantly, inform on the relaxation and diffusion features along distinct fibre bundles. If combined with fibre-tracking algorithms, the methodology presented in this work has potential for increasing the depth of characterisation of microstructural properties along individual WM pathways.
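    A minimal sketch of the orientation-binning step described above (illustrative only; the 5D Monte Carlo inversion itself is not reproduced): discrete fibre components are spread over a set of orientation bins with an antipodally symmetric kernel, and relaxation and diffusion values are averaged per bin. All values and the kernel choice are assumptions for illustration.

        import numpy as np

        def orientation_binned_means(dirs, weights, r_relax, d_iso, bin_dirs, kappa=20.0):
            # dirs: unit orientation vectors of resolved fibre components (n_comp, 3)
            # weights: component fractions; r_relax, d_iso: per-component relaxation
            # rate and diffusivity; bin_dirs: unit vectors defining orientation bins.
            cos2 = (dirs @ bin_dirs.T) ** 2                 # antipodally symmetric
            K = np.exp(kappa * (cos2 - 1.0))                # Watson-like smoothing kernel
            w = weights[:, None] * K                        # component weight per bin
            odf = w.sum(axis=0)
            r_bin = (w * r_relax[:, None]).sum(axis=0) / np.maximum(odf, 1e-12)
            d_bin = (w * d_iso[:, None]).sum(axis=0) / np.maximum(odf, 1e-12)
            return odf, r_bin, d_bin

        # Hypothetical two-fibre crossing with distinct relaxation/diffusion properties.
        dirs = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
        weights = np.array([0.6, 0.4])
        r_relax = np.array([0.7, 0.9])          # 1/s
        d_iso = np.array([0.8e-9, 0.6e-9])      # m^2/s
        bin_dirs = np.eye(3)
        print(orientation_binned_means(dirs, weights, r_relax, d_iso, bin_dirs))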

    Signal concentration and related concepts in time-frequency and on the unit sphere

    Unit-sphere signal processing is an increasingly active area of research, with applications in computer vision, medical imaging, geophysics, cosmology and wireless communications. However, compared with signal processing in the time-frequency domain, the characterization and processing of signals defined on the unit sphere is relatively unfamiliar to most engineering researchers. In order to better understand and analyse problems posed on the spherical model, such as the analysis of neural electrical activity of the brain in medical imaging and neuroscience, target detection and tracking in radar systems, and earthquake occurrence prediction and seismic origin detection in seismology, it is necessary to establish a systematic theory for unit-sphere signal processing. Efficient analysis and representation of functions defined on the unit sphere is central to unit-sphere signal processing tasks such as filtering, smoothing, detection and estimation in the presence of noise and interference. The Slepian-Landau-Pollak time-frequency energy concentration theory and the essential dimensionality of time-frequency signals under the Fourier transform are fundamental tools for signal processing in the time-frequency domain. Our research therefore starts from the analogies between time-frequency and spatial-spectral signals. In this thesis, we first formulate the k-th moment time-duration weighting measure for a band-limited signal using a general constrained variational method, obtaining a complete, orthonormal set of optimal band-limited functions with the minimum fourth-moment time-duration measure, and prospective applications are discussed. The formulation is then extended to an arbitrary signal with second- and fourth-moment weighting in both the time and frequency domains, and the corresponding optimal functions are obtained, which are useful for practical waveform design in communication systems. Next, we develop a k-th spatially global moment azimuthal measure (GMZM) and a k-th spatially local moment zenithal measure (LMZM) for real-valued spectral-limited signals. The corresponding sets of optimal functions are solved for and compared with the spherical Slepian functions. In addition, a harmonic multiplication operation is developed on the unit sphere. Using this operation, a spectral moment weighting measure for a spatial-limited signal is formulated and the corresponding optimal functions are solved for; the performance of these sets of functions and their prospective real-world applications, such as efficient analysis and representation of spherical signals, remain to be explored. Some spherical quadratic functionals based on the spherical harmonic multiplication operation are also formulated in this thesis. Next, a general quadratic variational framework for signal design on the unit sphere is developed. Using this framework and the quadratic functionals, the general concentration problem of simultaneously maximising the energy of an arbitrary signal on the unit sphere within a finite spatial region and a finite spherical spectrum is solved. Finally, a novel spherical convolution defined through a linear operator is proposed, which not only specializes to the isotropic convolution but also has a well-defined spherical harmonic characterization. Furthermore, using the harmonic multiplication operation on the unit sphere, a noise-free reconstruction strategy using analysis-synthesis filters under three different sampling methods is discussed.
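    For context, the classical Slepian concentration problem on the sphere, solved by the spherical Slepian functions mentioned above, can be stated as follows; the notation is the standard one and may differ from that of the thesis. For a band-limited signal

        f(\hat{x}) \;=\; \sum_{l=0}^{L}\sum_{m=-l}^{l} f_{lm}\, Y_{lm}(\hat{x}),

    one maximises the fraction of energy concentrated in a region R,

        \lambda \;=\; \frac{\int_{R} |f(\hat{x})|^{2}\,\mathrm{d}\Omega}{\int_{\mathbb{S}^{2}} |f(\hat{x})|^{2}\,\mathrm{d}\Omega} \;\to\; \max,

    which reduces to the algebraic eigenvalue problem

        \sum_{l'=0}^{L}\sum_{m'=-l'}^{l'} D_{lm,\,l'm'}\, f_{l'm'} \;=\; \lambda\, f_{lm},
        \qquad D_{lm,\,l'm'} \;=\; \int_{R} Y_{lm}^{*}(\hat{x})\, Y_{l'm'}(\hat{x})\,\mathrm{d}\Omega ,

    with eigenvalues 0 < λ < 1 measuring the fraction of energy of each eigenfunction inside R.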

    Overcomplete Image Representations for Texture Analysis

    Advisor/s: Dr. Boris Escalante-Ramírez and Dr. Gabriel Cristóbal. Date and location of PhD thesis defense: 23rd October 2013, Universidad Nacional Autónoma de México. In recent years, computer vision has played an important role in many scientific and technological areas, mainly because modern society highlights vision over other senses. At the same time, application requirements and complexity have also increased, so that in many cases the optimal solution depends on the intrinsic characteristics of the problem; therefore, it is difficult to propose a universal image model. In parallel, advances in understanding the human visual system have made it possible to propose sophisticated models that incorporate simple phenomena which occur in early stages of the visual system. This dissertation aims to investigate characteristics of vision such as over-representation and orientation of receptive fields in order to propose bio-inspired image models for texture analysis.

    Three-dimensional modeling of the human jaw/teeth using optics and statistics.

    Object modeling is a fundamental problem in engineering, involving talents from computer-aided design, computational geometry, computer vision and advanced manufacturing. The process of object modeling takes three stages: sensing, representation, and analysis. Various sensors may be used to capture information about objects; optical cameras and laser scanners are common with rigid objects, while X-ray, CT and MRI are common with biological organs. These sensors may provide a direct or an indirect inference about the object, requiring a geometric representation in the computer that is suitable for subsequent usage. Geometric representations that are compact, i.e., that capture the main features of the objects with a minimal number of data points or vertices, fall into the domain of computational geometry. Once a compact object representation is in the computer, various analysis steps can be conducted, including recognition, coding, transmission, etc. The subject matter of this dissertation is object reconstruction from a sequence of optical images using shape from shading (SFS) and SFS with shape priors. The application domain is dentistry. Most SFS approaches focus on the computational part of the SFS problem, i.e. the numerical solution. As a result, the imaging model in most conventional SFS algorithms has been simplified under three simple but restrictive assumptions: (1) the camera performs an orthographic projection of the scene, (2) the surface has a Lambertian reflectance, and (3) the light source is a single point source at infinity. Unfortunately, such assumptions no longer hold for the reconstruction of real objects such as the intra-oral imaging environment for human teeth. In this work, we introduce a more realistic formulation of the SFS problem by considering the image formation components: the camera, the light source, and the surface reflectance. This dissertation proposes a non-Lambertian SFS algorithm under perspective projection which benefits from camera calibration parameters. The attenuation of illumination is taken into account due to near-field imaging. The surface reflectance is modeled using the Oren-Nayar-Wolff model, which accounts for the retro-reflection case. In this context, a new variational formulation is proposed that relates an evolving surface model with image information, taking into consideration that the image is taken by a perspective camera with known parameters. A new energy functional is formulated to incorporate brightness, smoothness and integrability constraints. In addition, to further improve the accuracy and practicality of the results, 3D shape priors are incorporated in the proposed SFS formulation. This strategy is motivated by the fact that humans rely on strong prior information about the 3D world around them in order to perceive 3D shape information. Such information is statistically extracted from training 3D models of the human teeth. The proposed SFS algorithms have been used in two different frameworks in this dissertation: a) holistic, which stitches a sequence of images in order to cover the entire jaw and then applies SFS, and b) piece-wise, which focuses on a specific tooth or a segment of the human jaw and applies SFS using physical teeth illumination characteristics. To augment the visible portion, and in order to have the entire jaw reconstructed without the use of CT, MRI or even X-rays, prior information gathered from a database of human jaws was added. This database has been constructed from an adult population with variations in teeth size, degradation and alignment. The database contains both shape and albedo information for the population. Using this database, a novel statistical shape from shading (SSFS) approach has been created. Extending the work on human teeth analysis, Finite Element Analysis (FEA) is adapted for analyzing and calculating stresses and strains of dental structures. Previous Finite Element (FE) studies used approximate 2D models. In this dissertation, an accurate three-dimensional CAD model is proposed. 3D stress and displacement analyses of different tooth types are successfully carried out. A newly developed open-source finite element solver, Finite Elements for Biomechanics (FEBio), has been used. The limitations of the experimental and analytical approaches used for stress and displacement analysis are overcome by exploiting FEA capabilities such as handling complex geometry and complex loading conditions.
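    A generic shape-from-shading energy functional combining brightness, smoothness and integrability terms is sketched below; the dissertation's exact functional, reflectance parameterisation and weights may differ.

        E(z, p, q) \;=\; \int_{\Omega} \big( I(x, y) - R(p, q) \big)^{2}\,\mathrm{d}x\,\mathrm{d}y
        \;+\; \lambda \int_{\Omega} \big( p_{x}^{2} + p_{y}^{2} + q_{x}^{2} + q_{y}^{2} \big)\,\mathrm{d}x\,\mathrm{d}y
        \;+\; \mu \int_{\Omega} \big( (z_{x} - p)^{2} + (z_{y} - q)^{2} \big)\,\mathrm{d}x\,\mathrm{d}y ,

    where I is the observed image irradiance, R the reflectance map (here a non-Lambertian, near-field model under perspective projection), (p, q) the surface gradient estimates, z the depth, and λ, μ the weights on the smoothness and integrability terms.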