A formal proof of the optimal frame setting for Dynamic-Frame Aloha with known population size
In Dynamic-Frame Aloha, subsequent frame lengths must be chosen optimally to
maximize throughput. When the initial population size is known, numerical
evaluations show that the maximum efficiency is achieved by setting the frame
length equal to the backlog size at each subsequent frame; however, to the best
of our knowledge, a formal proof of this result is still missing, and it is
provided here. As a byproduct, we also prove that the asymptotic efficiency in
the optimal case is 1/e, provide upper and lower bounds for the length of the
entire transmission period, and characterize its asymptotic behaviour.Comment: 22 pages, submitted to IEEE Trans. on Information Theory
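The optimal policy above (set each frame length equal to the current backlog) can be illustrated with a small Monte Carlo sketch; this is a hypothetical simulation, not the paper's formal proof. Backlogged nodes pick slots uniformly at random, slots holding exactly one transmission succeed, and the process repeats until the backlog is empty; with this setting the empirical efficiency approaches 1/e, so the whole transmission period takes roughly e*n slots.

```python
import random

def simulate_dfa(n, seed=0):
    """Dynamic-Frame Aloha with each frame length set to the current backlog."""
    rng = random.Random(seed)
    backlog, total_slots = n, 0
    while backlog > 0:
        frame = backlog  # the optimal setting: frame length = backlog size
        total_slots += frame
        slots = [0] * frame
        for _ in range(backlog):
            slots[rng.randrange(frame)] += 1
        # Slots holding exactly one transmission are successes.
        backlog -= sum(1 for s in slots if s == 1)
    return total_slots

n = 200
avg = sum(simulate_dfa(n, seed=s) for s in range(50)) / 50
print(f"average slots to resolve {n} nodes: {avg:.1f} (e*n is about {2.71828 * n:.0f})")
```

Each frame clears about a fraction 1/e of the backlog, so the frame lengths decay geometrically and their sum comes out close to e*n.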
On the Hardness-Intensity Correlation in Gamma-Ray Burst Pulses
We study the hardness-intensity correlation (HIC) in gamma-ray bursts (GRBs).
In particular, we analyze the decay phase of pulse structures in their light
curves. The study comprises a sample of 82 long pulses selected from 66 long
bursts observed by BATSE on the Compton Gamma-Ray Observatory. We find that at
least 57% of these pulses have HICs that can be well described by a power law.
The distribution of the power law indices, obtained by modeling the HIC of
pulses from different bursts, is broad with a mean of 1.9 and a standard
deviation of 0.7. We also compare indices among pulses from the same bursts and
find that their distribution is significantly narrower. The probability of a
random coincidence is shown to be very small. In most cases, the indices are
equal to within the uncertainties. This is particularly relevant when comparing
the external versus the internal shock models. In our analysis, we also use a
new method for studying the HIC, in which the intensity is represented by the
peak value of the E F_E spectrum. This new method gives stronger correlations
and is useful in the study of various aspects of the HIC. In particular, it
produces a better agreement between indices of different pulses within the same
burst. Also, we find that some pulses exhibit a "track jump" in their HICs, in
which the correlation jumps between two power laws with the same index. We
discuss the possibility that the "track jump" is caused by strongly overlapping
pulses. Based on our findings, the constancy of the index is proposed to be
used as a tool for pulse identification in overlapping pulses.Comment: 20 pages with 9 eps figures (emulateapj), ApJ accepted
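The power-law fits described above can be sketched as follows: a power-law hardness-intensity correlation F proportional to H^eta becomes a straight line in log-log space, so eta can be estimated by ordinary least squares. This is a hypothetical illustration on synthetic data; the injected index of 1.9 (the sample mean quoted above) and the noise level are assumptions, not the authors' fitting pipeline.

```python
import math
import random

rng = random.Random(1)

# Synthetic hardness-intensity data for one pulse decay:
# intensity F proportional to H**index, with index = 1.9 assumed.
true_index = 1.9
hardness = [1.0 + 2.0 * i / 39 for i in range(40)]
flux = [h**true_index * math.exp(rng.gauss(0.0, 0.05)) for h in hardness]

# A power law F = A * H**index is linear in log-log space:
# log F = log A + index * log H  ->  ordinary least-squares slope.
lx = [math.log(h) for h in hardness]
ly = [math.log(f) for f in flux]
mx, my = sum(lx) / len(lx), sum(ly) / len(ly)
index = (sum((x - mx) * (y - my) for x, y in zip(lx, ly))
         / sum((x - mx) ** 2 for x in lx))
print(f"fitted HIC power-law index: {index:.2f}")
```

The recovered slope lands close to the injected 1.9, which is the quantity whose burst-to-burst distribution the abstract characterizes.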
Marginal leakage in teeth sealed with provisional restorative materials
Undergraduate thesis (TCC) - Universidade Federal de Santa Catarina, Centro de Ciências da Saúde, Curso de Odontologia. The aim of this in vitro study was to compare the sealing ability of different provisional restorative materials used in Endodontics: Bioplic®, XTemp®, XTemp LC®, Maxxion R®, Riva LC®, and Coltosol®. Forty-two molars were selected and divided into 7 groups of 6 teeth, one of which served as the negative control. After coronal access, a layer of composite resin was light-cured over the canal entrances. A cotton pellet was placed on top of it to standardize the pulp-chamber height at 4 mm. The materials were inserted according to the manufacturers' instructions, and the teeth were waterproofed except for 1 mm around the cavosurface margin. After thermocycling (125 cycles), only the crown and the most cervical portion of the root were immersed in 2% methylene blue and kept at 37 °C. After 72 h the teeth were sectioned longitudinally and the marginal leakage scored as follows: 0 = no leakage or only superficial; 1 = up to half of the cavity wall and seal; 2 = along the entire extent of the cavity wall and seal; 3 = along the entire extent of the cavity wall and seal, reaching the cotton pellet. The data were analyzed with the Kruskal-Wallis test, which detected a significant difference among the materials (p < 0.05). The Mann-Whitney U test was used for pairwise comparisons. Bioplic® gave the best result, with 83.33% of the samples scoring 0. Coltosol®, with 33.33% of the samples scoring 0, showed no statistically significant difference from Bioplic®. The other materials received scores ranging from 1 to 3. It was concluded that Bioplic® provided the best seal, although no material was able to completely prevent marginal dye leakage.
Determining Bolometric Corrections for BATSE Burst Observations
We compare the energy and count fluxes obtained by integrating over the
finite bandwidth of BATSE with a measure proportional to the bolometric energy
flux, the phi-measure, introduced by Borgonovo & Ryde. We do this on a sample
of 74 bright, long, and smooth pulses from 55 GRBs. The correction factors show
a fairly constant behavior over the whole sample, when the
signal-to-noise ratio is high enough. We present the averaged spectral
bolometric correction for the sample, which can be used to correct flux data.Comment: 3 pages, 3 figures, to appear in AIP proc. "Gamma-Ray Burst and
Afterglow Astronomy 2001", Woods Hole, Massachusetts
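A bolometric correction of the kind described above can be sketched by integrating a Band photon spectrum over BATSE's finite bandwidth and over a much wider quasi-bolometric band, then taking the ratio of the two energy fluxes. This is a hypothetical illustration: the spectral parameters and band edges below are assumed typical values, and this is not the phi-measure of Borgonovo & Ryde.

```python
import math

def band_photon_spectrum(e, alpha=-1.0, beta=-2.3, e0=300.0):
    """Band et al. (1993) photon spectrum N(E), normalization A = 1
    (arbitrary units); alpha, beta, e0 are assumed values, energies in keV."""
    ebreak = (alpha - beta) * e0
    if e < ebreak:
        return (e / 100.0) ** alpha * math.exp(-e / e0)
    return ((ebreak / 100.0) ** (alpha - beta) * math.exp(beta - alpha)
            * (e / 100.0) ** beta)

def energy_flux(e_lo, e_hi, n=2000):
    """Energy flux: integral of E * N(E) dE, midpoint rule on a log-spaced grid."""
    ratio = (e_hi / e_lo) ** (1.0 / n)
    total, e = 0.0, e_lo
    for _ in range(n):
        e_next = e * ratio
        mid = math.sqrt(e * e_next)  # geometric midpoint of the sub-interval
        total += mid * band_photon_spectrum(mid) * (e_next - e)
        e = e_next
    return total

batse = energy_flux(25.0, 1900.0)   # BATSE's finite bandwidth (assumed edges)
bolo = energy_flux(1.0, 1.0e4)      # quasi-bolometric band (assumed edges)
correction = bolo / batse
print(f"bolometric correction factor: {correction:.2f}")
```

For these assumed parameters the correction is a modest factor above unity; the abstract's point is that such factors stay fairly constant across pulses with good signal-to-noise.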
Synthesis of Aluminum-Aluminum Nitride Nanocomposites by Gas-Liquid Reactions
An innovative method has been developed for synthesizing aluminum-aluminum nitride nanocomposite materials in which the reinforcing nano-sized aluminum nitride particles are formed in situ in a molten aluminum alloy. This method, which circumvents most issues associated with the traditional ways of making nanocomposites, involves reacting a nitrogen-bearing gas with a specially designed molten aluminum alloy. The method ensures excellent dispersion of the nanoparticles in the matrix alloy, which is reflected in enhanced mechanical properties. In this thesis, the author reviews the limitations of the conventional methods of manufacturing nanocomposites and develops thermodynamic and kinetic models that allow the in-situ gas-liquid process to be optimized to produce quality nanocomposite material. The author also reports the measured room-temperature and elevated-temperature tensile properties of materials made by the optimized process and compares the measured values to their counterparts obtained for the base alloy. A 75 pct. increase in room-temperature yield strength is obtained when the base alloy is reinforced with one pct. of nano-sized aluminum nitride particles, and this significant increase in yield strength is accompanied by only a negligible loss of ductility.
Aluminum Nano-composites for Elevated Temperature Applications
Conventional manufacturing methods are sub-optimal for nano-composite fabrication: inhomogeneous dispersion of the secondary phase and poor scalability are the main obstacles. This work focuses on an innovative method in which the reinforcement is formed in situ in the melt. It involves the reaction of molten aluminum with a nitrogen-bearing gas injected through the melt at around 1273 K; AlN particles are expected to form through this in-situ reaction. A model has been developed to predict the amount of reinforcing phase. Experiments have been carried out to confirm the feasibility of the process, and the mechanism of AlN formation is discussed. The detrimental effect of oxygen in the melt, which hinders the nitridation reaction, has been demonstrated. The effects of process time and of the addition of alloying elements (Mg and Si) have also been investigated.
The Temporal and Spectral Characteristics of "Fast Rise and Exponential Decay" Gamma-Ray Burst Pulses
In this paper we have analyzed the temporal and spectral behavior of 52 Fast
Rise and Exponential Decay (FRED) pulses in 48 long-duration gamma-ray bursts
(GRBs) observed by the CGRO/BATSE, using a pulse model with two shape
parameters and the Band model with three shape parameters, respectively. It is
found that these FRED pulses are distinguished both temporally and spectrally
from those in long-lag pulses. Unlike the case of long-lag pulses, only one
parameter pair shows an evident correlation among the five parameters, which
suggests that at least four parameters are needed to model burst temporal and
spectral behavior. In addition, our studies reveal that these FRED
pulses have correlated properties: (i) long-duration pulses have harder spectra
and are less luminous than short-duration pulses; (ii) the more asymmetric the
pulses are, the steeper the evolutionary curves of the spectral peak energy
(E_peak) during the pulse decay phase. Our statistical results place some
constraints on current GRB models.Comment: 18 pages, 7 figures, accepted for publication in the Astrophysical
Journal
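A two-shape-parameter FRED pulse of the kind analyzed above can be sketched with the Norris et al. (2005) pulse model; this choice is an assumption, since the abstract does not name the specific pulse model used. One parameter (tau1) governs the fast rise, the other (tau2) the exponential decay, and the peak sits at t = sqrt(tau1 * tau2).

```python
import math

def fred_pulse(t, a=1.0, tau1=5.0, tau2=10.0):
    """Norris et al. (2005) FRED pulse shape (parameter values assumed).
    tau1 controls the fast rise, tau2 the exponential decay; the pulse
    peaks with amplitude a at t = sqrt(tau1 * tau2)."""
    if t <= 0.0:
        return 0.0
    lam = math.exp(2.0 * math.sqrt(tau1 / tau2))  # normalizes the peak to a
    return a * lam * math.exp(-tau1 / t - t / tau2)

t_peak = math.sqrt(5.0 * 10.0)  # analytic peak position for the defaults
print(f"peak at t = {t_peak:.2f}, value {fred_pulse(t_peak):.3f}")
```

The asymmetry is visible directly: the pulse climbs faster before t_peak than it falls after it, which is the "fast rise and exponential decay" shape the abstract refers to.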
Fighting the curse of sparsity: probabilistic sensitivity measures from cumulative distribution functions
Quantitative models support investigators in several risk analysis applications. The calculation of sensitivity measures is an integral part of this analysis. However, it becomes a computationally challenging task, especially when the number of model inputs is large and the model output is spread over orders of magnitude. We introduce and test a new method for the estimation of global sensitivity measures. The new method relies on exploiting the empirical cumulative distribution function of the simulator output. This choice allows the estimators of global sensitivity measures to be based on numbers between 0 and 1, thus fighting the curse of sparsity. For density-based sensitivity measures, we devise an approach based on moving averages that bypasses kernel-density estimation. We compare the new method to approaches for calculating popular risk analysis global sensitivity measures, as well as to approaches for computing dependence measures that are gathering increasing interest in the machine learning and statistics literature (the Hilbert–Schmidt independence criterion and distance covariance). The comparison also involves the number of operations needed to obtain the estimates, an aspect often neglected in global sensitivity studies. We let the estimators undergo several tests, first with the wing-weight test case, then with a computationally challenging code with up to k = 30,000 inputs, and finally with the traditional Level E benchmark code.
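The core idea above, replacing the raw simulator output with its empirical CDF values so that estimators work on numbers between 0 and 1, can be sketched as follows. This is a simplified, hypothetical estimator, not the authors' method: transform the output to ranks scaled into (0, 1], then measure how much the conditional mean of the transformed output varies across quantile bins of each input.

```python
import random

def ecdf_values(y):
    """Map each output to its empirical CDF value in (0, 1]."""
    order = sorted(range(len(y)), key=lambda i: y[i])
    u = [0.0] * len(y)
    for rank, i in enumerate(order, start=1):
        u[i] = rank / len(y)
    return u

def binned_sensitivity(x, u, bins=20):
    """Crude first-order sensitivity of the CDF-transformed output:
    variance of the within-bin means of u when conditioning on x,
    normalized by Var(U) = 1/12 for u ~ Uniform(0, 1)."""
    pairs = sorted(zip(x, u))
    n = len(pairs)
    means = []
    for b in range(bins):
        chunk = [v for _, v in pairs[b * n // bins:(b + 1) * n // bins]]
        if chunk:
            means.append(sum(chunk) / len(chunk))
    grand = sum(means) / len(means)
    var_cond = sum((m - grand) ** 2 for m in means) / len(means)
    return var_cond / (1.0 / 12.0)

rng = random.Random(0)
n = 20000
x1 = [rng.random() for _ in range(n)]
x2 = [rng.random() for _ in range(n)]
# Output spread over orders of magnitude; x1 matters, x2 barely does.
y = [10 ** (3 * a) * (1 + 0.01 * b) for a, b in zip(x1, x2)]
u = ecdf_values(y)
s1 = binned_sensitivity(x1, u)
s2 = binned_sensitivity(x2, u)
print(f"sensitivity of x1: {s1:.3f}, of x2: {s2:.3f}")
```

Because the estimator only ever sees ranks, the three-orders-of-magnitude spread in y does not distort the comparison between the influential and the negligible input, which is the "curse of sparsity" the abstract targets.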