
    The anatomy of the Gunn laser

    A monopolar GaAs Fabry–Pérot cavity laser based on the Gunn effect is studied both experimentally and theoretically. Light emission occurs via band-to-band recombination of impact-ionized excess carriers in the propagating space-charge (Gunn) domains. The electroluminescence spectrum from the cleaved end-facet emission of devices with Ga$_{1-x}$Al$_x$As (x = 0.32) waveguides clearly shows a preferential mode at a wavelength around 840 nm at T = 95 K. The threshold laser gain is assessed using an impact ionization coefficient derived from the excess carriers inside the high-field domain.
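    For orientation, here is a minimal sketch of the standard Fabry–Pérot threshold condition against which a threshold gain is typically assessed; the mirror reflectivities $R_1$ and $R_2$, cavity length $L_c$, and internal loss $\alpha_i$ are generic symbols, not device values from the paper:

        g_{\mathrm{th}} = \alpha_i + \frac{1}{2L_c}\,\ln\!\left(\frac{1}{R_1 R_2}\right)

    Lasing starts once the band-to-band gain supplied by the impact-ionized excess carriers in the domain reaches $g_{\mathrm{th}}$.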

    Roughness effects in turbulent forced convection

    We conducted direct numerical simulations (DNSs) of turbulent flow over three-dimensional sinusoidal roughness in a channel. A passive scalar is present in the flow with Prandtl number $Pr = 0.7$, to study heat transfer by forced convection over this rough surface. The minimal channel is used to circumvent the high cost of simulating high-Reynolds-number flows, which enables a range of rough surfaces to be simulated efficiently. The near-wall temperature profile in the minimal channel agrees well with that of the conventional full-span channel, indicating it can be readily used for heat-transfer studies at a much reduced cost compared to conventional DNS. As the roughness Reynolds number $k^+$ is increased, the Hama roughness function $\Delta U^+$ increases in the transitionally rough regime before tending towards the fully rough asymptote $\kappa_m^{-1}\log(k^+) + C$, where $C$ is a constant that depends on the particular roughness geometry and $\kappa_m \approx 0.4$ is the von Kármán constant. In this fully rough regime, the skin-friction coefficient is constant with bulk Reynolds number $Re_b$. Meanwhile, the temperature difference between smooth- and rough-wall flows, $\Delta\Theta^+$, appears to tend towards a constant value, $\Delta\Theta^+_{FR}$. This corresponds to the Stanton number (the temperature analogue of the skin-friction coefficient) monotonically decreasing with $Re_b$ in the fully rough regime. Using shifted logarithmic velocity and temperature profiles, the heat-transfer law as described by the Stanton number in the fully rough regime can be derived once both the equivalent sand-grain roughness $k_s/k$ and the temperature difference $\Delta\Theta^+_{FR}$ are known. In meteorology, this corresponds to the ratio of momentum and heat-transfer roughness lengths, $z_{0m}/z_{0h}$, being linearly proportional to $z_{0m}^+$, the momentum roughness length [continued]... Comment: Accepted (in press) in the Journal of Fluid Mechanics.
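    As a rough illustration of the heat-transfer law described above, the sketch below evaluates shifted logarithmic velocity and temperature profiles at the channel half-height to estimate the skin-friction and Stanton numbers; the scalar log-law constants kappa_h and A_h and the roughness constants C and dTheta_FR are illustrative placeholders, not values from the paper:

        import numpy as np

        # Log-law constants: kappa, A for velocity; kappa_h, A_h for the
        # scalar at Pr = 0.7 are illustrative guesses, not values from the paper.
        kappa, A = 0.4, 5.0
        kappa_h, A_h = 0.46, 3.2

        def cf_and_st(delta_plus, k_plus, C=-3.0, dTheta_FR=4.0):
            """Crude one-point estimate of Cf and St from shifted log laws
            evaluated at the channel half-height. C and dTheta_FR are
            placeholder roughness constants for a given geometry."""
            dU = np.log(k_plus) / kappa + C                 # fully rough Hama function
            U_b = np.log(delta_plus) / kappa + A - dU       # shifted velocity profile
            Th_b = np.log(delta_plus) / kappa_h + A_h - dTheta_FR  # shifted temperature
            return 2.0 / U_b**2, 1.0 / (U_b * Th_b)

        for dp in (1000, 4000, 16000):                      # increasing Reynolds number
            cf, st = cf_and_st(delta_plus=dp, k_plus=dp / 40)  # fixed k/delta ratio
            print(f"delta+={dp:6d}  Cf={cf:.4f}  St={st:.4f}")

    With the roughness height fixed relative to the channel half-height, the printed Cf stays constant while St keeps falling, which is exactly the fully rough behaviour the abstract describes.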

    An Elliptical Galaxy Luminosity Function and Velocity Dispersion Sample of Relevance for Gravitational Lensing Statistics

    We have selected 42 elliptical galaxies from the literature and estimated their velocity dispersions at the effective radius ($\sigma_{R_e}$) and at 0.54 effective radii ($\sigma_{0.54R_e}$). We find by a dynamical analysis that the normalized velocity dispersion of the dark halo of an elliptical galaxy, $\sigma_{DM}$, is roughly $\sigma_{R_e}$ multiplied by a constant, which is almost independent of the core radius or the anisotropy parameter of each galaxy. Our sample analysis suggests that $\sigma_{DM}^{*}$ lies in the range 178-198 km s$^{-1}$. The power-law relation we find between the luminosity and the dark-matter velocity dispersion measured in this way is $(L/L^{*}) = (\sigma_{DM}/\sigma_{DM}^{*})^{\gamma}$, where $\gamma$ is between 2 and 3. These results are of interest for strong gravitational lensing statistics studies. In order to determine the value of $\sigma_{DM}^{*}$, we calculate $M^{*}$ in the same $B_T$ band in which $\sigma_{DM}^{*}$ has been estimated. We select 131 elliptical galaxies as a complete sample set with apparent magnitudes $B_T$ between 9.26 and 12.19. We find that the luminosity function is well fitted by the Schechter form, with parameters $M^{*} = -19.66 + 5\log_{10}h \pm 0.30$, $\alpha = 0.15 \pm 0.55$, and the normalization constant $\phi^{*} = (1.34 \pm 0.30) \times 10^{-3}\,h^{3}$ Mpc$^{-3}$, with the Hubble constant $H_0 = 100\,h$ km s$^{-1}$ Mpc$^{-1}$. This normalization implies that morphology-type E galaxies make up $(10.8 \pm 1.2)$ per cent of all galaxies. Comment: 18 pages LaTeX, with PS figures included. Accepted by New Astronomy (revised to incorporate referee's comments).
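    To make the fitted luminosity function concrete, here is a minimal sketch evaluating the Schechter form with the quoted best-fit parameters (h = 1 assumed for illustration); the magnitude-form conversion used below is the standard one, not spelled out in the abstract:

        import numpy as np

        # Best-fit parameters quoted in the abstract; h = 1 assumed for illustration.
        M_star, alpha, phi_star = -19.66, 0.15, 1.34e-3   # mag, dimensionless, Mpc^-3

        def schechter_mag(M):
            """Schechter luminosity function in absolute-magnitude form,
            phi(M) in galaxies per Mpc^3 per magnitude."""
            x = 10.0 ** (0.4 * (M_star - M))              # L / L*
            return 0.4 * np.log(10) * phi_star * x ** (alpha + 1) * np.exp(-x)

        for M in (-21.0, -19.66, -18.0):
            print(f"M = {M:6.2f}  phi = {schechter_mag(M):.3e} Mpc^-3 mag^-1")

    The exponential cutoff brighter than $M^{*}$ and the shallow faint-end slope ($\alpha$ close to zero) are both visible in the printed values.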

    Impact of edge-removal on the centrality betweenness of the best spreaders

    The control of epidemic spreading is essential to avoid potentially fatal consequences and to lessen unforeseen socio-economic impact. The need for effective control was exemplified during the severe acute respiratory syndrome (SARS) outbreak in 2003, which inflicted nearly a thousand deaths as well as bankruptcies of airlines and related businesses. In this article, we examine the efficacy of control strategies for the propagation of infectious diseases based on removing connections within a real-world airline network, with the associated economic and social costs taken into account through appropriate quantitative measures. We uncover the surprising result that removing less busy connections can be far more effective in hindering the spread of the disease than removing the more popular connections. Since disconnecting the less popular routes tends to incur less socio-economic cost, our finding suggests the possibility of trading a minimal reduction in the connectivity of an important hub for efficiencies in epidemic control. In particular, we demonstrate the performance of various local epidemic control strategies, and show how our approach can predict their cost effectiveness through the spreading control characteristics. Comment: 11 pages, 4 figures.
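    To make the edge-removal comparison concrete, here is a toy sketch on a synthetic scale-free graph (a stand-in for the airline network, with made-up traffic weights) contrasting removal of the busiest versus the least busy connections by their effect on the best spreader's betweenness centrality; this is not the paper's data or exact strategy:

        import random
        import networkx as nx

        random.seed(0)
        # Hypothetical scale-free network standing in for the airline network.
        G = nx.barabasi_albert_graph(200, 3)
        for u, v in G.edges:
            G.edges[u, v]["traffic"] = random.randint(1, 100)  # route busyness proxy

        def top_spreader_betweenness(H):
            """Betweenness centrality of the best spreader (most central node)."""
            return max(nx.betweenness_centrality(H).values())

        def remove_edges(H, busiest, n=30):
            """Remove the n busiest (or least busy) edges by traffic weight."""
            H = H.copy()
            ranked = sorted(H.edges(data="traffic"), key=lambda e: e[2], reverse=busiest)
            H.remove_edges_from((u, v) for u, v, _ in ranked[:n])
            return H

        print("baseline  :", top_spreader_betweenness(G))
        print("busiest   :", top_spreader_betweenness(remove_edges(G, busiest=True)))
        print("least busy:", top_spreader_betweenness(remove_edges(G, busiest=False)))

    Comparing the three printed values gives a simple proxy for how each removal strategy reshapes the influence of the most central spreader.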

    Weak Parity

    We study the query complexity of Weak Parity: the problem of computing the parity of an n-bit input string, where one only has to succeed on a 1/2+eps fraction of input strings, but must do so with high probability on those inputs where one does succeed. It is well known that n randomized queries and n/2 quantum queries are needed to compute parity on all inputs. But surprisingly, we give a randomized algorithm for Weak Parity that makes only O(n/log^0.246(1/eps)) queries, as well as a quantum algorithm that makes only O(n/sqrt(log(1/eps))) queries. We also prove a lower bound of Omega(n/log(1/eps)) in both cases; and using extremal combinatorics, prove lower bounds of Omega(log n) in the randomized case and Omega(sqrt(log n)) in the quantum case for any eps>0. We show that improving our lower bounds is intimately related to two longstanding open problems about Boolean functions: the Sensitivity Conjecture, and the relationship between query complexity and polynomial degree. Comment: 18 pages.
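    This is not the paper's algorithm, but a brute-force check of the success criterion may help fix the definition. The sketch below measures the fraction of inputs on which a naive strategy computes parity correctly; that fraction is exactly 1/2, the bar that Weak Parity must clear by eps while using far fewer queries:

        from itertools import product

        def success_fraction(strategy, n):
            """Fraction of all n-bit inputs on which a deterministic strategy
            returns the correct parity (the Weak Parity success measure)."""
            hits = 0
            for x in product((0, 1), repeat=n):
                hits += strategy(x) == sum(x) % 2
            return hits / 2 ** n

        # Naive strategy: query only the first k bits and output their parity.
        # It is correct exactly when the n-k unqueried bits have even parity,
        # i.e., on exactly half of all inputs.
        def first_k_parity(x, k=3):
            return sum(x[:k]) % 2

        print(success_fraction(first_k_parity, n=8))   # prints 0.5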

    Reliability assessment of microgrid with renewable generation and prioritized loads

    With the increase in awareness about climate change, there has been a tremendous shift towards utilizing renewable energy sources (RES). In this regard, smart grid technologies have been presented to facilitate higher penetration of RES. Microgrids are key components of the smart grid. Microgrids allow the integration of various distributed energy resources (DERs), such as distributed generation (DG) and energy storage systems (ESSs), into the distribution system, and hence remove or delay the need for distribution expansion. One of the crucial requirements for utilities is to ensure that system reliability is maintained with the inclusion of the microgrid topology. Therefore, this paper evaluates the reliability of a microgrid containing prioritized loads and distributed RES through a hybrid analytical-simulation method. The stochasticity of RES introduces complexity to the reliability evaluation. The method takes into account the variability of RES through Monte Carlo state-sampling simulation. The results indicate reliability enhancement of the overall system in the presence of the microgrid topology. In particular, the highest-priority load shows the largest improvement in the reliability indices. Furthermore, a sensitivity analysis is performed to understand the effects of failure of microgrid islanding in the case of a fault in the upstream network.
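    As a schematic of the Monte Carlo state-sampling idea, the toy sketch below samples component up/down states and allocates islanded capacity to loads in priority order; the failure probabilities, capacities, and load sizes are invented for illustration and are not the paper's model:

        import random

        random.seed(1)

        # Toy availability model with made-up parameters.
        P_FAIL = {"grid": 0.02, "pv": 0.30, "storage": 0.10}   # per sampled state
        LOADS = {"high": 40.0, "medium": 30.0, "low": 30.0}    # kW, priority order
        PV_CAPACITY, STORAGE_CAPACITY = 50.0, 30.0             # kW

        def sample_unserved(trials=100_000):
            """Monte Carlo state sampling: expected unserved demand per priority.
            When the grid is up all loads are served; in islanded mode the
            available local capacity is allocated to loads in priority order."""
            unserved = {name: 0.0 for name in LOADS}
            for _ in range(trials):
                if random.random() >= P_FAIL["grid"]:
                    continue                                   # grid up: all served
                capacity = (PV_CAPACITY * (random.random() >= P_FAIL["pv"]) +
                            STORAGE_CAPACITY * (random.random() >= P_FAIL["storage"]))
                for name, demand in LOADS.items():             # highest priority first
                    served = min(demand, capacity)
                    capacity -= served
                    unserved[name] += demand - served
            return {name: v / trials for name, v in unserved.items()}

        print(sample_unserved())   # expected unserved kW; lowest for the high-priority load

    Priority-ordered load shedding is what gives the highest-priority load the best indices, mirroring the qualitative result reported above.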