Mathematical Analysis and Optimization of Infiltration Processes
A variety of infiltration techniques can be used to fabricate solid materials, particularly composites. In general, these processes can be described with at least one time-dependent partial differential equation describing the evolution of the solid phase, coupled to one or more partial differential equations describing mass transport through a porous structure. This paper presents a detailed mathematical analysis of a relatively simple set of equations used to describe chemical vapor infiltration. The results demonstrate that the process is controlled by only two parameters, alpha and beta. The optimization problem associated with minimizing the infiltration time is also considered. Allowing alpha and beta to vary with time leads to significant reductions in the infiltration time, compared with the conventional case in which alpha and beta are treated as constants.
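As a rough illustration of such a coupled system, the sketch below evolves a reactant concentration and a porosity field in one dimension by explicit finite differences. The parameters `alpha` and `beta` here are illustrative rate constants, not the exact dimensionless groups derived in the paper.

```python
import numpy as np

def infiltrate(alpha, beta, nx=50, nt=2000, dt=1e-4):
    """Minimal 1D sketch: reactant concentration c diffuses into a
    porous preform while deposition closes the porosity eps.
    alpha and beta are illustrative rate parameters only."""
    x = np.linspace(0.0, 1.0, nx)
    dx = x[1] - x[0]
    c = np.zeros(nx)         # reactant concentration
    c[0] = 1.0               # fixed concentration at the surface
    eps = np.ones(nx)        # initial porosity (fully open)
    for _ in range(nt):
        lap = np.zeros(nx)
        lap[1:-1] = (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2
        react = alpha * c * eps          # deposition consumes reactant
        c[1:-1] += dt * (lap[1:-1] - react[1:-1])
        c[0] = 1.0
        c[-1] = c[-2]                    # no-flux far boundary
        eps -= dt * beta * react         # deposition reduces porosity
        eps = np.clip(eps, 0.0, 1.0)
    return c, eps
```

Because the reactant is depleted as it diffuses inward, porosity closes fastest near the surface, which is the density-gradient problem that time-varying process parameters aim to mitigate.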
Process Chain for Numerical Simulation of IMLS
Additive layer manufacturing methods offer, among other advantages, extensive flexibility for realizing mass customization. Despite various efforts toward process enhancement, numerous deficiencies such as part distortion and residual stresses are still observable. The present work deals with the definition of an efficient process chain for the numerical simulation of indirect metal laser sintering (IMLS), in order to improve dimensional accuracy. The underlying method is based on investigations of the dilatometric behavior of iron-based powder, which is integrated into reaction kinetic models and coupled with a finite element analysis (FEA). Individual process steps, e.g. solid-phase sintering, phase transformations, or infiltration, are thus numerically modelled with adequate accuracy. With regard to the thermomechanical simulation, possibilities for pre-scaling of part geometries are presented.
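A reaction-kinetic shrinkage law of the kind fitted to dilatometer data might look like the toy model below, where relative density evolves with an Arrhenius-type rate along a temperature-time schedule. All constants are invented for illustration and are not taken from the paper.

```python
import numpy as np

def densification(temps, dts, A=1e5, Ea=1.5e5):
    """Toy reaction-kinetic model: relative density rho evolves with
    an Arrhenius rate along a schedule of (temperature, time-step)
    pairs, as one might fit to dilatometric data before coupling it
    to an FE analysis. A and Ea are invented illustrative constants."""
    R = 8.314                # gas constant, J/(mol K)
    rho = 0.6                # green (initial) relative density
    for T, dt in zip(temps, dts):
        rate = A * np.exp(-Ea / (R * T)) * (1.0 - rho)
        rho = min(rho + rate * dt, 1.0)
    return rho
```

Evaluating such a law at each integration point of a thermal FE solution is one way the kinetic model and the FEA can be coupled; the resulting shrinkage field then informs the pre-scaling of part geometry.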
Sensitivity analysis and parameter estimation for distributed hydrological modeling: potential of variational methods
Variational methods are widely used for the analysis and control of computationally intensive, spatially distributed systems. In particular, the adjoint state method enables a very efficient calculation of the derivatives of an objective function (response function to be analysed or cost function to be optimised) with respect to model inputs. This contribution demonstrates the potential of variational methods for distributed catchment-scale hydrology. A distributed flash flood model, coupling kinematic wave overland flow and Green-Ampt infiltration, is applied to a small catchment of the Thoré basin and used as a relatively simple (synthetic observations) but didactic application case. It is shown that forward and adjoint sensitivity analysis provide a local but extensive insight into the relation between the assigned model parameters and the simulated hydrological response. Spatially distributed parameter sensitivities can be obtained for a very modest calculation effort (about six times the computing time of a single model run), and the singular value decomposition (SVD) of the Jacobian matrix provides an interesting perspective for the analysis of the rainfall-runoff relation. For the estimation of model parameters, adjoint-based derivatives were found to be exceedingly efficient in driving a bound-constrained quasi-Newton algorithm. The reference parameter set is retrieved independently of the optimization initial condition when the very common dimension reduction strategy (i.e. scalar multipliers) is adopted. Furthermore, the sensitivity analysis results suggest that most of the variability in this high-dimensional parameter space can be captured with a few orthogonal directions. A parametrization based on the SVD leading singular vectors was found very promising, but should be combined with another regularization strategy in order to prevent overfitting.
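The SVD-based view of parameter sensitivity can be sketched as follows: build the Jacobian of model outputs with respect to parameters and inspect its singular values. The toy model and finite-difference Jacobian below stand in for the hydrological model and the adjoint computation, which would deliver the same derivatives far more cheaply for many parameters.

```python
import numpy as np

def model(params, t):
    """Toy response curve standing in for a simulated hydrograph.
    Purely illustrative; not the kinematic-wave/Green-Ampt model."""
    a, b, c = params
    return a * np.exp(-b * t) + c * t

def jacobian(params, t, h=1e-6):
    """Forward-difference Jacobian dQ/dtheta, one column per parameter.
    An adjoint code would compute these derivatives at a cost nearly
    independent of the number of parameters."""
    base = model(params, t)
    J = np.empty((t.size, len(params)))
    for j in range(len(params)):
        p = np.array(params, dtype=float)
        p[j] += h
        J[:, j] = (model(p, t) - base) / h
    return J

t = np.linspace(0.0, 5.0, 40)
J = jacobian([2.0, 1.5, 0.3], t)
s = np.linalg.svd(J, compute_uv=False)   # descending singular values
ratios = s / s[0]                        # relative importance of directions
```

A rapid decay of `ratios` is the signature that a few orthogonal directions capture most of the sensitivity, motivating the reduced parametrization discussed in the abstract.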
Radiotherapy planning for glioblastoma based on a tumor growth model: Improving target volume delineation
Glioblastomas are known to infiltrate the brain parenchyma rather than forming a solid tumor mass with a defined boundary. Only the part of the tumor with high tumor cell density can be localized directly through imaging. In contrast, brain tissue infiltrated by tumor cells at low density appears normal on current imaging modalities. In clinical practice, a uniform margin is applied to account for the microscopic spread of disease.
The current treatment planning procedure can potentially be improved by
accounting for the anisotropy of tumor growth: Anatomical barriers such as the
falx cerebri represent boundaries for migrating tumor cells. In addition, tumor
cells primarily spread in white matter and infiltrate gray matter at lower
rate. We investigate the use of a phenomenological tumor growth model for
treatment planning. The model is based on the Fisher-Kolmogorov equation, which
formalizes these growth characteristics and estimates the spatial distribution
of tumor cells in normal appearing regions of the brain. The target volume for
radiotherapy planning can be defined as an isoline of the simulated tumor cell
density.
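In its commonly used form (the exact parametrization in the study may differ), the Fisher-Kolmogorov equation models the normalized tumor cell density $c$ as

```latex
\[
\frac{\partial c}{\partial t}
  = \nabla \cdot \bigl( D(\mathbf{x}) \, \nabla c \bigr)
  + \rho \, c \, (1 - c),
\]
```

where $D(\mathbf{x})$ is a tissue-dependent diffusion tensor, larger in white matter than in gray matter and vanishing across anatomical barriers such as the falx, and $\rho$ is the proliferation rate. The target volume is then the region where the simulated density exceeds a chosen threshold, $\{\mathbf{x} : c(\mathbf{x}) \geq c_{\mathrm{th}}\}$.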
A retrospective study involving 10 glioblastoma patients has been performed.
To illustrate the main findings of the study, a detailed case study is
presented for a glioblastoma located close to the falx. In this situation, the
falx represents a boundary for migrating tumor cells, whereas the corpus
callosum provides a route for the tumor to spread to the contralateral
hemisphere. We further discuss the sensitivity of the model with respect to the
input parameters. Correct segmentation of the brain appears to be the most
crucial model input.
We conclude that the tumor growth model provides a method to account for
anisotropic growth patterns of glioblastoma, and may therefore provide a tool
to make target delineation more objective and automated.
Some Applications of the Percolation Theory. Brief Review of the Century Beginning
This review briefly describes the state of problems in percolation theory and their numerous applications, analyzed on the basis of papers published in the last 15-20 years. The papers considered study both the cluster structure of a physical body and its impact on the object as a whole, as well as the mathematical tools adequate for describing critical phenomena. Of special interest are two findings: first, the phase transition point of certain percolation systems is not really a point but a critical interval; and second, in the vicinity of the percolation threshold many different infinite clusters are observed, instead of the single infinite cluster that appears in the traditional treatment.
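The kind of Monte Carlo experiment used to probe the percolation threshold can be sketched in a few lines: occupy sites of a square lattice with probability p and test whether an occupied cluster spans from top to bottom. For 2D site percolation the threshold is known numerically to be near p_c ≈ 0.593; the sketch below is illustrative, not a precision estimator.

```python
import numpy as np
from collections import deque

def spans(grid):
    """True if occupied sites connect the top row to the bottom row
    (4-neighbour connectivity), via breadth-first search."""
    n = grid.shape[0]
    seen = np.zeros_like(grid, dtype=bool)
    q = deque((0, j) for j in range(n) if grid[0, j])
    for ij in q:
        seen[ij] = True
    while q:
        i, j = q.popleft()
        if i == n - 1:
            return True
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            a, b = i + di, j + dj
            if 0 <= a < n and 0 <= b < n and grid[a, b] and not seen[a, b]:
                seen[a, b] = True
                q.append((a, b))
    return False

def spanning_probability(p, n=40, trials=100, seed=0):
    """Monte Carlo estimate of the spanning probability for
    site percolation at occupation probability p."""
    rng = np.random.default_rng(seed)
    hits = sum(spans(rng.random((n, n)) < p) for _ in range(trials))
    return hits / trials
```

Sweeping p through the critical region shows the spanning probability rising sharply over an interval that narrows with lattice size, consistent with the "critical interval" behavior noted in the review.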
Toward improved calibration of hydrologic models: Combining the strengths of manual and automatic methods
Automatic methods for model calibration seek to take advantage of the speed and power of digital computers, while being objective and relatively easy to implement. However, they do not provide parameter estimates and hydrograph simulations that are considered acceptable by the hydrologists responsible for operational forecasting, and have therefore not entered into widespread use. In contrast, the manual approach, which has been developed and refined over the years to result in excellent model calibrations, is complicated and highly labor-intensive, and the expertise acquired by one individual with a specific model is not easily transferred to another person (or model). In this paper, we propose a hybrid approach that combines the strengths of each. A multicriteria formulation is used to "model" the evaluation techniques and strategies used in manual calibration, and the resulting optimization problem is solved by means of a computerized algorithm. The new approach provides a stronger test of model performance than methods that use a single overall statistic to aggregate model errors over a large range of hydrologic behaviors. The power of the new approach is illustrated by means of a case study using the Sacramento Soil Moisture Accounting model.
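The multicriteria idea can be sketched as follows: score each candidate parameter set against several error measures of the kind a manual calibrator inspects, then retain the non-dominated (Pareto-optimal) candidates rather than collapsing everything into one statistic. The specific criteria below are illustrative choices, not those of the paper.

```python
import numpy as np

def criteria(sim, obs):
    """Several error measures, as a manual calibrator might use:
    overall RMSE, peak-flow error, and low-flow bias (illustrative)."""
    rmse = np.sqrt(np.mean((sim - obs) ** 2))
    peak = abs(sim.max() - obs.max())
    low_mask = obs < np.median(obs)
    low = abs(np.mean(sim[low_mask]) - np.mean(obs[low_mask]))
    return np.array([rmse, peak, low])

def pareto_front(points):
    """Indices of non-dominated criteria vectors (minimization):
    a point is dominated if another is no worse everywhere and
    strictly better somewhere."""
    keep = []
    for i, p in enumerate(points):
        dominated = any(np.all(q <= p) and np.any(q < p)
                        for j, q in enumerate(points) if j != i)
        if not dominated:
            keep.append(i)
    return keep
```

An optimizer working on this vector objective exposes the trade-offs (e.g. fitting peaks versus baseflow) that a single aggregate statistic hides.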
Cancer therapeutic potential of combinatorial immuno- and vaso-modulatory interventions
Currently, most of the basic mechanisms governing tumor-immune system
interactions, in combination with modulations of tumor-associated vasculature,
are far from being completely understood. Here, we propose a mathematical model
of vascularized tumor growth, where the main novelty is the modeling of the
interplay between functional tumor vasculature and effector cell recruitment
dynamics. Parameters are calibrated on the basis of different in vivo
immunocompromised Rag1-/- and wild-type (WT) BALB/c murine tumor growth
experiments. The model analysis supports that tumor vasculature normalization
can be a plausible and effective strategy to treat cancer when combined with
appropriate immuno-stimulations. We find that improved levels of functional
tumor vasculature, potentially mediated by normalization or stress alleviation
strategies, can provide beneficial outcomes in terms of tumor burden reduction
and growth control. Normalization of tumor blood vessels opens a therapeutic
window of opportunity to augment the antitumor immune responses, as well as to
reduce the intratumoral immunosuppression and induced-hypoxia due to vascular
abnormalities. The potential success of normalizing tumor-associated
vasculature closely depends on the effector cell recruitment dynamics and tumor
sizes. Furthermore, an arbitrary increase of the initial effector cell concentration does not necessarily imply better tumor control. We demonstrate the existence of an optimal concentration range of effector cells for tumor shrinkage. Based on these findings, we suggest a theory-driven therapeutic proposal that optimally combines immuno- and vaso-modulatory interventions.
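A minimal caricature of the vasculature-immune coupling, much simpler than the calibrated model in the paper, might couple logistic tumor growth to effector cell recruitment scaled by a vascular functionality parameter q. All rate constants below are invented for illustration.

```python
def simulate(days=60, dt=0.01, q=0.5, e0=0.1):
    """Caricature ODEs (not the paper's model): tumor burden T grows
    logistically and is killed by effector cells E, whose recruitment
    scales with vascular functionality q in [0, 1]. All constants are
    invented illustrative values."""
    g, K = 0.3, 10.0      # tumor growth rate, carrying capacity
    k = 0.4               # kill rate per unit effector concentration
    r, d = 0.2, 0.1       # effector recruitment and decay rates
    T, E = 1.0, e0
    for _ in range(int(days / dt)):           # forward Euler
        dT = g * T * (1 - T / K) - k * E * T
        dE = r * q * T - d * E
        T = max(T + dt * dT, 0.0)
        E = max(E + dt * dE, 0.0)
    return T, E
```

Even this caricature reproduces the qualitative message that improving vascular functionality (higher q, i.e. better effector recruitment) can reduce the final tumor burden; the non-monotone dependence on initial effector concentration reported in the abstract requires the richer calibrated model.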