Thalamic Neuron Resilience during Osmotic Demyelination Syndrome (ODS) Is Revealed by Primary Cilium Outgrowth and ADP-ribosylation factor-like protein 13B Labeling in Axon Initial Segment
A murine osmotic demyelination syndrome (ODS) model was developed by inducing chronic hyponatremia with desmopressin subcutaneous implants, followed by precipitous sodium restoration. The thalamic ventral posterolateral (VPL) and ventral posteromedial (VPM) relay nuclei were the most demyelinated regions, where neuroglial damage was evident without an immune response. This report shows that 12 h and 48 h after osmolarity was rebalanced following chronic hyponatremia, some resilient neuronal cell bodies amid the ODS-degraded outskirts built up primary cilia and axon hillock regions that extended into axon initial segments (AIS), in which ADP-ribosylation factor-like protein 13B (ARL13B)-immunolabeled rod-like content was revealed. The lengths of these labeled AIS shafts appeared proportional to the distance of the neuronal cell bodies from the damaged ODS epicenter and to the time elapsed after correction of hyponatremia. Fine-structure examination verified that the abundant transcription and translation regions of these neurons, marked by ARL13B labeling, were associated with neurotubules and their complex cytoskeletal macromolecular architecture. Restoring these AIS away from the demyelinated ODS core in the murine model therefore required energetic transport. These labeled structures could substantiate how thalamic neuron resilience occurred, as possible steps in a healing course out of ODS.
Direct Method for Floor Response Spectra Considering Soil-Structure Interaction
Soil-Structure Interaction (SSI) analysis is required in structural dynamic analysis under seismic excitations by the current standards, and it significantly influences the Floor Response Spectra (FRS), which are used in the safety assessment of the secondary systems in nuclear power facilities. A direct spectra-to-spectra method is well developed to generate FRS for fixed-base models efficiently and accurately. Thus it is necessary to address the SSI effect and integrate it into the free-field motion serving as the seismic input for the fixed-base model. For fully or partly embedded nuclear power reactors, earthquake excitations come from both the bottom foundations and the external structures. In this case, the foundations and external structures behave like a foundation system with seismic input at multiple supporting points. The objective of this study is to develop an approach that addresses the SSI effect while considering foundation flexibility and spatially varying ground motions. A substructure method is analytically derived to convert the three-dimensional free-field motion, i.e., the Foundation Input Response Spectra (FIRS) given by site response analysis, into Foundation Level Input Response Spectra (FLIRS). The latter can be used as the seismic input in the direct spectra-to-spectra method to generate FRS considering SSI. Only FIRS, the dynamic soil stiffness, the mass matrix, the geometry of the structure, and basic modal information, including natural frequencies and mode shapes, are needed. Both flexible and rigid foundations are considered under the excitation of spatially varying ground motions or uniform seismic input. Furthermore, a parametric study is performed to examine the influence of foundation flexibility on the SSI analysis and the resultant FRS. It is observed that FLIRS and FRS are amplified significantly due to the SSI effect; this amplification is more severe, and the associated frequency smaller, for a more flexible foundation.
A semi-analytical method is proposed to generate the dynamic soil stiffness of rigid and flexible foundations. Given the soil properties, the Green's influence function is formulated analytically from wave propagation functions, and the Boundary Element Method (BEM) is employed to determine the dynamic stiffness of foundations with arbitrary shapes. The resultant 6M×6M dynamic soil stiffness matrix is then used as the generalized soil springs in the proposed substructure method.
This study presents a fully probabilistic method for addressing the uncertainty resulting from the seismic input and soil properties in the generation of FRS. A large number of FLIRS are developed by Monte Carlo simulations, which enables the uncertainty to be propagated consistently from site response analysis to SSI analysis. A uniform hazard FLIRS is then obtained. Compared to the approach specified in current codes, the uniform hazard FRS lowers the seismic demand significantly, providing a more economical solution for seismic design. At the same time, it overcomes the underestimation of FRS by the current method in some frequency ranges. A realistic and continuous distribution is proposed for the shear wave velocity (Vs) to replace the current practice. A sensitivity study is performed on the correlation coefficient and the standard deviation of Vs; the results show that these two parameters have little influence on the uncertainty analysis.
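The Monte Carlo step described above can be illustrated with a minimal sketch: correlated lognormal samples of Vs are drawn for a hypothetical two-layer profile and propagated to the fundamental site period (a far simpler response quantity than FLIRS, used here only to show the sampling pattern). All numerical values, and the two-layer profile itself, are illustrative assumptions, not data from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-layer profile: median shear wave velocities (m/s)
# and layer thicknesses (m). Illustrative values only.
vs_median = np.array([250.0, 450.0])
thickness = np.array([10.0, 20.0])
sigma_ln = 0.2      # assumed log-standard deviation of Vs
rho = 0.5           # assumed inter-layer correlation coefficient

# Correlated lognormal samples of Vs: sample ln(Vs) from a
# multivariate normal, then exponentiate.
cov = sigma_ln**2 * np.array([[1.0, rho],
                              [rho, 1.0]])
n = 10_000
vs = np.exp(rng.multivariate_normal(np.log(vs_median), cov, size=n))

# Fundamental site period T = 4 * sum(h_i / Vs_i)
# (travel-time approximation for a layered profile).
period = 4.0 * (thickness / vs).sum(axis=1)

# Percentiles characterise the propagated uncertainty,
# analogous to extracting a uniform hazard level.
t16, t50, t84 = np.percentile(period, [16, 50, 84])
print(f"T_site ~ {t50:.3f} s (16th-84th percentile: {t16:.3f}-{t84:.3f} s)")
```

In the full method, each sampled profile would instead feed a site response analysis and the substructure conversion to FLIRS; the percentile extraction at the end is the same idea.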
Based on the proposed method, SSI analysis is performed on a realistic model to develop uniform hazard FLIRS for performance-based seismic design, and the direct spectra-to-spectra method is extended to generate safe and economical FRS considering SSI.
Light Scattering by Non-Spherical Particles
Non-spherical particles are very common in nature as well as in process-engineering applications. In particular, the in-flight detection of ice crystals by commercial aircraft is a problem that has received increasing attention in recent years. While the scattering of a plane electromagnetic wave by a homogeneous, isotropic spherical particle, such as a raindrop, can be regarded as completely solved, this is not the case for non-spherical particles. Owing to a multitude of different difficulties, this remains a focus of research and development on the theoretical, numerical, and experimental sides alike. This work describes various numerical and semi-analytical methods that can be applied to the scattering problem and that together cover the full range of the governing Mie size parameter. These methods are applied to the calibration and interpretation of measurements from the PHIPS instrument, which was tested in a HALO campaign for the characterization of atmospheric ice crystals. The computational methods comprise two variants of geometrical optics, applicable to arbitrary particle geometries with homogeneous as well as inhomogeneous refractive index; the numerically exact Finite Integration technique for the Maxwell equations; and the transition-operator method frequently used in light scattering and quantum mechanics. These methods are applied to a series of example geometries, and the influence of polarization and of averaging over particle orientation is examined. In addition, a procedure was implemented that accounts for the effect of the beam profile of a laser beam on the scattered light intensity, which plays a central role, for example, in Time-Shift measurements. The limits of applicability of the various computational methods are explained. Furthermore, several modern measurement techniques are assessed for their applicability to non-spherical particles, including the Time-Shift technique and interferometric imaging methods; the analysis of their applicability is documented in the experimental part of the work. Measurements of the scattered-light phase functions of natural ice crystals were also carried out, and the specific preparations for investigating ice crystals in an optical experiment are likewise explained. The limited dynamic range of the detectors used was identified as a problem common to many of the techniques. A final important aspect of this work is the production and storage of ice crystals with optical properties as natural as possible in a laboratory environment. For this purpose, a compact cloud chamber was developed that fulfills the required specifications for the production quantity and quality of ice crystals. The design, construction, and operation of the apparatus are described in detail in the last chapter of the dissertation.
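For reference, the Mie size parameter mentioned above relates particle size to wavelength; for a particle of radius $r$ illuminated at wavelength $\lambda$ it is defined as

```latex
x = \frac{2\pi r}{\lambda}
```

so that geometrical-optics approximations become applicable for $x \gg 1$, while numerically exact methods are needed at small and intermediate $x$.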
SCALE: A modular code system for performing standardized computer analyses for licensing evaluation
This Manual represents Revision 5 of the user documentation for the modular code system referred to as SCALE. The history of the SCALE code system dates back to 1969 when the current Computational Physics and Engineering Division at Oak Ridge National Laboratory (ORNL) began providing the transportation package certification staff at the U.S. Atomic Energy Commission with computational support in the use of the new KENO code for performing criticality safety assessments with the statistical Monte Carlo method. From 1969 to 1976 the certification staff relied on the ORNL staff to assist them in the correct use of codes and data for criticality, shielding, and heat transfer analyses of transportation packages. However, the certification staff learned that, with only occasional use of the codes, it was difficult to become proficient in performing the calculations often needed for an independent safety review. Thus, shortly after the move of the certification staff to the U.S. Nuclear Regulatory Commission (NRC), the NRC staff proposed the development of an easy-to-use analysis system that provided the technical capabilities of the individual modules with which they were familiar. With this proposal, the concept of the Standardized Computer Analyses for Licensing Evaluation (SCALE) code system was born. This manual covers an array of modules written for the SCALE package, consisting of drivers, system libraries, cross section and materials properties libraries, input/output routines, storage modules, and help files
Geospatial Computing: Architectures and Algorithms for Mapping Applications
Beginning with the MapTube website (1), which was launched in 2007 for crowd-sourcing maps, this project investigates approaches to exploratory Geographic Information Systems (GIS) using web-based mapping, or "web GIS". Users can log in to upload their own maps and overlay different layers of GIS data sets. This work looks into the theory behind how web-based mapping systems function and whether their performance can be modelled and predicted. One of the important questions when dealing with different geospatial data sets is how they relate to one another. Internet data stores provide another source of information, which can be exploited if more generic geospatial data mining techniques are developed. The identification of similarities between thousands of maps is a GIS technique that can give structure to the overall fabric of the data, once the problems of scalability and comparisons between different geographies are solved. After running MapTube for nine years to crowd-source data, this would mark a natural progression from visualisation of individual maps to wider questions about what additional knowledge can be discovered from the data collected. In the new "data science" age, the introduction of real-time data sets introduces a new challenge for web-based mapping applications. The mapping of real-time geospatial systems is technically challenging, but has the potential to show inter-dependencies as they emerge in the time series. Combined geospatial and temporal data mining of real-time sources can provide archives of transport and environmental data from which to accurately model the systems under investigation. By using techniques from machine learning, the models can be built directly from the real-time data stream. These models can then be used for analysis and experimentation, being derived directly from city data. This then leads to an analysis of the behaviours of the interacting systems. (1) The MapTube website: http://www.maptube.org
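The map-comparison idea described above can be sketched as a simple raster correlation, assuming both layers have already been rasterised onto a common grid (in practice, reconciling different geographies is the hard part). The `map_similarity` helper and the toy arrays are hypothetical illustrations, not MapTube code.

```python
import numpy as np

def map_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Pearson correlation between two maps on the same grid,
    ignoring cells that are missing (NaN) in either layer."""
    mask = ~(np.isnan(a) | np.isnan(b))
    av = a[mask] - a[mask].mean()
    bv = b[mask] - b[mask].mean()
    return float((av @ bv) / np.sqrt((av @ av) * (bv @ bv)))

# Two small synthetic "maps" on a shared 3x3 grid (illustrative data);
# one cell is missing in the first layer.
pop = np.array([[1.0, 2.0, 3.0],
                [2.0, 4.0, 6.0],
                [np.nan, 8.0, 9.0]])
jobs = 2.0 * pop + 1.0   # perfectly linearly related where defined

print(map_similarity(pop, jobs))   # close to 1.0: strong positive similarity
```

Comparing thousands of maps pairwise with such a measure yields a similarity matrix, to which standard clustering can then be applied to "give structure to the overall fabric of the data".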
Proceedings of AUTOMATA 2010: 16th International workshop on cellular automata and discrete complex systems
These local proceedings hold the papers of two categories: (a) short, non-reviewed papers and (b) full papers.
Cylindrical Decomposition under Application-Oriented Paradigms (Zylindrische Dekomposition unter anwendungsorientierten Paradigmen)
Quantifier elimination (QE) is a powerful tool for problem solving. Once a problem is expressed as a formula, such a method converts it to a simpler, quantifier-free equivalent, thus solving the problem. A particularly large number of problems live in the domain of the real numbers, which makes real QE very interesting. Among the methods implemented so far, QE by cylindrical algebraic decomposition (CAD) is the most important complete method. The aim of this thesis is to develop CAD-based algorithms that can solve more problems in practice and/or provide more interesting information as output. An algorithm that satisfies these standards would concentrate on generic cases and postpone special and degenerate ones to be treated separately or abandoned completely. It would give a solution that is locally correct for a region the user is interested in. It would give answers that provide valuable information, in particular for decision problems. It would combine these methods with more specialized ones for the subcases that allow it. It would exploit degrees of freedom in the algorithms by choosing to proceed in a way that promises to be efficient. Treating these challenges is the focus of this dissertation. The algorithms described here are implemented in the computer logic system REDLOG and ship with the computer algebra system REDUCE.
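The effect of real quantifier elimination can be seen in a standard textbook example (not taken from the thesis itself): eliminating the existential quantifier from the solvability condition of a monic quadratic yields its discriminant condition,

```latex
\exists x \,\bigl(x^{2} + b x + c = 0\bigr)
\;\Longleftrightarrow\;
b^{2} - 4c \ge 0 .
```

The right-hand side is quantifier-free and equivalent over the reals; REDLOG's `rlqe` command performs transformations of exactly this kind on first-order formulas.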