Modeling interannual dense shelf water export in the region of the Mertz Glacier Tongue (1992-2007)
Ocean observations around the Australian-Antarctic basin show the importance of coastal latent heat polynyas near the Mertz Glacier Tongue (MGT) to the formation of Dense Shelf Water (DSW) and associated Antarctic Bottom Water (AABW). Here, we use a regional ocean/ice shelf model to investigate the interannual variability of the export of DSW from the Adélie (west of the MGT) and the Mertz (east of the MGT) depressions from 1992 to 2007. The variability in the model is driven by changes in observed surface heat and salt fluxes. The model simulates an annual mean export of DSW through the Adélie sill of about 0.07 ± 0.06 Sv. From 1992 to 1998, the export of DSW through the Adélie (Mertz) sills peaked at 0.14 Sv (0.29 Sv) during July to November. During periods of mean to strong polynya activity (defined by the surface ocean heat loss), DSW formed in the Adélie depression can spread into the Mertz depression via the cavity under the MGT. An additional simulation, where ocean/ice shelf thermodynamics have been disabled, highlights the fact that models without ocean/ice shelf interaction processes will significantly overestimate rates of DSW export. The melt rates of the MGT are 1.2 ± 0.4 m/yr during periods of average to strong polynya activity and can increase to 3.8 ± 1.5 m/yr during periods of sustained weak polynya activity, due to the increased presence of relatively warmer water interacting with the base of the ice shelf. The increased melting of the MGT during a weak polynya state can cause further freshening of the DSW and ultimately limits the production of AABW.
Staying true with the help of others: doxastic self-control through interpersonal commitment
I explore the possibility and rationality of interpersonal mechanisms of doxastic self-control, that is, ways in which individuals can make use of other people in order to get themselves to stick to their beliefs. I look, in particular, at two ways in which people can make interpersonal epistemic commitments, and thereby willingly undertake accountability to others, in order to get themselves to maintain their beliefs in the face of anticipated "epistemic temptations". The first way is through the avowal of belief, and the second is through the establishment of collective belief. I argue that both of these forms of interpersonal epistemic commitment can function as effective tools for doxastic self-control, and, moreover, that the control they facilitate should not be dismissed as irrational from an epistemic perspective.
Nonequilibrium spectral diffusion due to laser heating in stimulated photon echo spectroscopy of low temperature glasses
A quantitative theory is developed, which accounts for heating artifacts in
three-pulse photon echo (3PE) experiments. The heat diffusion equation is
solved and the average value of the temperature in the focal volume of the
laser is determined as a function of the 3PE waiting time. This temperature is
used in the framework of nonequilibrium spectral diffusion theory to calculate
the effective homogeneous linewidth of an ensemble of probe molecules embedded
in an amorphous host. The theory fits recently observed plateaus and bumps
without introducing a gap in the distribution function of flip rates of the
two-level systems or any other major modification of the standard tunneling
model.
Comment: 10 pages, RevTeX, 6 EPS figures; accepted for publication in Phys. Rev.
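The abstract's central quantity is the volume-averaged temperature in the laser focus as a function of the 3PE waiting time. A minimal sketch of that idea, assuming free heat diffusion of an initially Gaussian hot spot (diffusivity, spot size, and temperature rise below are invented round numbers, not the paper's values):

```python
import numpy as np

# Sketch: centre temperature rise after pulsed laser heating, assuming an
# initial Gaussian temperature profile of 1/e radius w (hypothetical values).
D = 1e-7        # thermal diffusivity of the glass host, m^2/s (assumed)
w = 5e-6        # focal-spot 1/e radius, m (assumed)
dT0 = 0.5       # initial temperature rise at the focus, K (assumed)
T_bath = 1.0    # bath temperature, K

def temperature_rise(t):
    """Peak temperature rise of a 3D Gaussian hot spot after time t.

    The free-space heat kernel spreads the Gaussian from width w to
    sqrt(w**2 + 4*D*t); the peak amplitude scales with the inverse cube
    of that width ratio.
    """
    return dT0 * (w**2 / (w**2 + 4.0 * D * t)) ** 1.5

# Effective sample temperature as a function of the 3PE waiting time t_w:
for t_w in [1e-6, 1e-4, 1e-2]:
    print(f"t_w = {t_w:.0e} s  ->  T = {T_bath + temperature_rise(t_w):.4f} K")
```

Feeding such a waiting-time-dependent temperature into a spectral diffusion model is what lets the linewidth develop plateaus and bumps without modifying the tunneling-model rate distribution.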
Decrease in the orbital period of dwarf nova OY Carinae
We have measured the orbital light curve of dwarf nova OY Carinae on 8
separate occasions between 1997 September and 2005 December. The measurements
were made in white light using CCD photometers on the Mt Canopus 1 m telescope.
The time of eclipse in 2005 December was 168 ± 5 s earlier than that predicted
by the Wood et al. (1989) ephemeris. Using the times of eclipse from our
measurements and the compilation of published measurements by Pratt et al.
(1999) we find that the observational data are inconsistent with a constant
period and indicate that the orbital period is decreasing by (5 ± 1) × 10⁻¹² s/s.
This is too fast to be explained by gravitational radiation emission. It is
possible that the change is cyclic with a period greater than about 80 years.
This is much longer than typical magnetic activity cycles and may be due to the
presence of a third object in the system. Preliminary estimates suggest that
this is a brown dwarf with mass about 0.016 Msun and orbital radius ≥ 17 AU.
Comment: 4 pages, 2 figures; submitted to MNRAS. Final proofread version:
discussion modified with a figure showing fits and residuals to models,
statistical significance of fits added, and minor typographical edits.
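A period derivative of this kind is conventionally measured by fitting a quadratic ephemeris, T(E) = T0 + P·E + ½P·Ṗ·E², to the observed eclipse times. A minimal sketch with synthetic timings (the period and noise level below are stand-ins, not the actual OY Car data):

```python
import numpy as np

# Sketch: recovering dP/dt from eclipse times via a quadratic ephemeris fit,
# T(E) = T0 + P*E + 0.5*P*Pdot*E**2 (synthetic data, not the OY Car timings).
P = 0.0631209 * 86400.0          # assumed orbital period in seconds (~90.9 min)
Pdot = -5e-12                    # fractional period change, s/s (from abstract)
rng = np.random.default_rng(1)

E = np.arange(0, 120000, 4000)                    # eclipse cycle numbers
T_true = P * E + 0.5 * P * Pdot * E**2            # true eclipse times, s
T_obs = T_true + rng.normal(0.0, 5.0, size=E.size)  # +-5 s timing noise

# Least-squares quadratic fit; polyfit returns [c2, c1, c0]
c2, c1, c0 = np.polyfit(E, T_obs, 2)
Pdot_fit = 2.0 * c2 / c1          # since c2 = 0.5*P*Pdot and c1 ~ P
print(f"recovered dP/dt = {Pdot_fit:.2e} s/s (input {Pdot:.1e})")
```

The sign of the fitted quadratic coefficient distinguishes a decreasing period (eclipses arriving early, as here) from an increasing one.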
Reply to Comment by Vincent et al.
Peer Reviewed
https://deepblue.lib.umich.edu/bitstream/2027.42/143706/1/tect20719.pdf
https://deepblue.lib.umich.edu/bitstream/2027.42/143706/2/tect20719_am.pd
Contributions of greenhouse gas forcing and the Southern Annular Mode to historical Southern Ocean surface temperature trends
We examine the 1979-2014 Southern Ocean (SO) sea surface temperature (SST) trends simulated in an ensemble of coupled general circulation models and evaluate possible causes of the models' inability to reproduce the observed 1979-2014 SO cooling. For each model we estimate the response of SO SST to step changes in greenhouse gas (GHG) forcing and in the seasonal indices of the Southern Annular Mode (SAM). Using these step-response functions, we skillfully reconstruct the models' 1979-2014 SO SST trends. Consistent with the seasonal signature of the Antarctic ozone hole and the seasonality of SO stratification, the summer and fall SAM exert a large impact on the simulated SO SST trends. We further identify conditions that favor multidecadal SO cooling: 1) a weak SO warming response to GHG forcing; 2) a strong multidecadal SO cooling response to a positive SAM trend; 3) a historical SAM trend as strong as in observations.
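A step-response reconstruction of this kind works by superposing a scaled copy of the step-response function for each yearly increment of the forcing (linear-response theory). A minimal sketch, with an invented exponential step response and an invented forcing ramp standing in for the GHG/SAM series:

```python
import numpy as np

# Sketch: reconstructing a temperature trend from a step-response (Green's)
# function. G_step and the forcing series F are invented for illustration.
years = np.arange(1979, 2015)
n = years.size

# Assumed step response: SST change (K) reached t years after a unit forcing step
tau = 8.0
G_step = 1.0 - np.exp(-np.arange(n) / tau)

# Assumed forcing history (arbitrary units): a steady ramp
F = 0.02 * np.arange(n)
dF = np.diff(F, prepend=0.0)       # yearly forcing increments

# Linear-response reconstruction: each increment dF[s] contributes
# dF[s] * G_step[t - s] to the SST anomaly in year t.
sst = np.array([np.sum(dF[: t + 1] * G_step[t::-1]) for t in range(n)])

trend = np.polyfit(years, sst, 1)[0]
print(f"reconstructed SST trend: {trend:.4f} K/yr")
```

Summing separate reconstructions for GHG forcing and for each seasonal SAM index is what lets the contributions to the total trend be attributed term by term.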
The CMS Event Builder
The data acquisition system of the CMS experiment at the Large Hadron
Collider will employ an event builder which will combine data from about 500
data sources into full events at an aggregate throughput of 100 GByte/s.
Several architectures and switch technologies have been evaluated for the DAQ
Technical Design Report by measurements with test benches and by simulation.
This paper describes studies of an EVB test-bench based on 64 PCs acting as
data sources and data consumers and employing both Gigabit Ethernet and Myrinet
technologies as the interconnect. In the case of Ethernet, protocols based on
Layer-2 frames and on TCP/IP are evaluated. Results from ongoing studies,
including measurements on throughput and scaling are presented.
The architecture of the baseline CMS event builder will be outlined. The
event builder is organised into two stages with intelligent buffers in between.
The first stage contains 64 switches performing a first level of data
concentration by building super-fragments from fragments of 8 data sources. The
second stage combines the 64 super-fragments into full events. This
architecture allows installation of the second stage of the event builder in
steps, with the overall throughput scaling linearly with the number of switches
in the second stage. Possible implementations of the components of the event
builder are discussed and the expected performance of the full event builder is
outlined.
Comment: Conference CHEP0
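The two-stage layout described above fixes most of the bandwidth arithmetic. A back-of-the-envelope sketch (the event size is an assumed round number, not a CMS figure; 500 sources are rounded to 64 × 8 = 512 to match the super-fragment grouping):

```python
# Sketch of the two-stage event-builder arithmetic from the abstract
# (the 1 MB event size is an assumption, not a CMS design figure).
aggregate_throughput = 100e9      # bytes/s, from the abstract
n_sources = 512                   # ~500 data sources, rounded to 64 * 8
sources_per_superfragment = 8
n_superfragments = n_sources // sources_per_superfragment   # -> 64

event_size = 1e6                  # assumed 1 MB average event
event_rate = aggregate_throughput / event_size              # events/s

fragment_size = event_size / n_sources
superfragment_size = fragment_size * sources_per_superfragment

print(f"{n_superfragments} super-fragments of {superfragment_size / 1e3:.1f} kB")
print(f"event rate: {event_rate:.0f} Hz")
print(f"per-source bandwidth: {fragment_size * event_rate / 1e6:.1f} MB/s")
```

Because the second stage only ever sees the 64 super-fragment streams, adding switches there raises the aggregate throughput linearly, which is the incremental-installation property the abstract highlights.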
Platform for automatic patient quality assurance via Monte Carlo simulations in proton therapy
For radiation therapy, it is crucial to ensure that the delivered dose matches the planned dose. Errors in the dose calculations done in the treatment planning system (TPS), treatment delivery errors, other software bugs, or data corruption during transfer might lead to significant differences between predicted and delivered doses. As such, patient-specific quality assurance (QA) of dose distributions, through experimental validation of individual fields, is necessary. These measurement-based approaches, however, are performed with 2D detectors, with limited resolution and in a water phantom. Moreover, they are work intensive and often impose a bottleneck to treatment efficiency. In this work, we investigated the potential to replace the measurement-based approach with a simulation-based patient-specific QA using a Monte Carlo (MC) code as an independent dose calculation engine in combination with treatment log files. Our QA platform is composed of a web interface, servers, and computation scripts, and is capable of autonomously launching simulations and identifying and reporting dosimetric inconsistencies. To validate the beam model of the independent MC engine, in-water simulations of mono-energetic layers and 30 SOBP-type dose distributions were performed. An average gamma passing rate of 99 ± 0.5% was observed for the 2%/2 mm criterion. To demonstrate the feasibility of the proposed approach, 10 clinical cases, including head and neck, intracranial indications, and craniospinal axis, were retrospectively evaluated via the QA platform. The results obtained via the QA platform were compared to QA results obtained by the measurement-based approach. This comparison demonstrated consistency between the methods, while the proposed approach significantly reduced the in-room time required for QA procedures.
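The 2%/2 mm figure refers to a gamma-index comparison: each reference point passes if some nearby evaluated point agrees within 2% of the maximum dose and 2 mm of distance. A minimal 1D sketch of that computation (toy Gaussian dose profiles, not clinical data or the platform's actual implementation):

```python
import numpy as np

# Sketch: a minimal 1D gamma-index computation of the 2%/2 mm type
# (toy dose profiles, not clinical data).
def gamma_index(x, dose_ref, dose_eval, dd=0.02, dta=2.0):
    """Per-point gamma for 1D profiles on a common grid x (mm).

    dd:  dose-difference criterion, fraction of the reference maximum
    dta: distance-to-agreement criterion, mm
    """
    d_max = dose_ref.max()
    gammas = np.empty_like(dose_ref)
    for i, (xi, di) in enumerate(zip(x, dose_ref)):
        dose_term = ((dose_eval - di) / (dd * d_max)) ** 2
        dist_term = ((x - xi) / dta) ** 2
        gammas[i] = np.sqrt(np.min(dose_term + dist_term))  # best match wins
    return gammas

x = np.linspace(0.0, 100.0, 201)                 # 0.5 mm grid
ref = np.exp(-((x - 50.0) / 15.0) ** 2)          # toy reference profile
ev = np.exp(-((x - 50.3) / 15.0) ** 2)           # evaluated profile, 0.3 mm shift

g = gamma_index(x, ref, ev)
print(f"gamma passing rate: {100.0 * np.mean(g <= 1.0):.1f} %")
```

A point with gamma ≤ 1 passes; the passing rate is the fraction of such points, which is the quantity reported as 99 ± 0.5% in the validation above.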
Random variables with completely independent subcollections
We investigate the algebra and geometry of the independence conditions on discrete random variables, in which we consider a collection of random variables and study the condition of independence of some subcollections. We interpret independence conditions as an ideal of algebraic relations. After a change of variables, this ideal is generated by generalized 2×2 minors of multi-way tables and linear forms. In particular, let Δ be a simplicial complex on some random variables and A be the table corresponding to the product of those random variables. If A is a Δ-independent table, then A can be written as the entrywise sum A_I + A_0, where A_I is a completely independent table and A_0 is identically 0 in its Δ-margins. We compute the isolated components of the original ideal, showing that there is only one component that could correspond to probability distributions, and relate the algebra and geometry of the main component to that of the Segre embedding. If Δ has fewer than three facets, we are able to compute generators for the main component, show that it is Cohen–Macaulay, and give a full primary decomposition of the original ideal.
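The simplest instance of the 2×2-minor description is a two-way table: two random variables are independent exactly when their joint probability table is an outer product of marginals, i.e. has rank 1, i.e. all its 2×2 minors vanish. A small numeric check of that fact (the marginals below are arbitrary example values):

```python
import numpy as np

# Sketch: a completely independent two-way table is an outer product of its
# marginals, so every 2x2 minor vanishes (the rank-1 condition).
p = np.array([0.2, 0.3, 0.5])          # marginal of X (example values)
q = np.array([0.4, 0.6])               # marginal of Y (example values)
A = np.outer(p, q)                     # completely independent 3x2 table

# All 2x2 minors A[i,j]*A[k,l] - A[i,l]*A[k,j] of a rank-1 table are zero:
minors = [
    A[i, j] * A[k, l] - A[i, l] * A[k, j]
    for i in range(3) for k in range(i + 1, 3)
    for j in range(2) for l in range(j + 1, 2)
]
print(f"max |2x2 minor| = {max(abs(m) for m in minors):.2e}")
```

The abstract's generalized 2×2 minors play the same role for multi-way tables and partial (Δ-indexed) independence conditions.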
The Weddell Sea and Dronning Maud Land (WSDML) Regional Working Group Virtual Science Workshop, 20-23 October, 2020.
Workshop report from the Weddell Sea and Dronning Maud Land (WSDML) Regional Working Group virtual science workshop, held 20-23 October 2020