
    The connection between entropy and the absorption spectra of Schwarzschild black holes for light and massless scalar fields

    We present heuristic arguments suggesting that if EM waves with wavelengths somewhat larger than the Schwarzschild radius of a black hole were fully absorbed by it, the second law of thermodynamics would be violated, under the Bekenstein interpretation of the area of a black hole as a measure of its entropy. Thus, entropy considerations make the well-known fact that large wavelengths are only marginally absorbed by black holes a natural consequence of thermodynamics. We also study numerically the ingoing radial propagation of a scalar field wave in a Schwarzschild metric, relaxing the standard assumption that the wave has zero spatial extent, which leads to the eikonal equation. We find that if these waves have wavelengths larger than the Schwarzschild radius, they are very substantially reflected, fully to numerical accuracy. Interestingly, this critical wavelength approximately coincides with the one derived from entropy considerations of the EM field, and is consistent with well-known limiting results of scattering in the Schwarzschild metric. The propagation speed is also calculated and seen to differ from the value c, for wavelengths larger than R_s, in the vicinity of R_s. As in all classical wave phenomena, whenever the wavelength is larger than or comparable to the physical size of elements in the system, in this case changes in the metric, the zero-extent 'particle' description fails, and the wave nature becomes apparent. Comment: 14 pages, 4 figures. Accepted for publication in the journal Entropy
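
    The numerical experiment described above amounts to evolving a massless scalar wave on a fixed Schwarzschild background and measuring how much of a long-wavelength pulse is reflected. The sketch below is an illustration only, not the authors' code: it assumes the standard reduction of the l = 0 mode to Regge-Wheeler form in tortoise coordinates, with an arbitrary mass M = 1, grid, and pulse width.

```python
# Minimal sketch (not the paper's code): evolve a massless l = 0 scalar wave on a
# Schwarzschild background via the Regge-Wheeler equation in tortoise coordinates,
#   d^2 psi / dt^2 = d^2 psi / dr*^2 - V(r) psi,   V(r) = (1 - 2M/r) * 2M/r^3.
# Units G = c = 1; with M = 1 the Schwarzschild radius is Rs = 2.
import numpy as np
from scipy.special import lambertw

M = 1.0
rstar = np.linspace(-200.0, 200.0, 8000)         # tortoise-coordinate grid
drs = rstar[1] - rstar[0]

# Invert r* = r + 2M ln(r/2M - 1) with the Lambert W function.
r = 2.0 * M * (1.0 + lambertw(np.exp(rstar / (2.0 * M) - 1.0)).real)
V = (1.0 - 2.0 * M / r) * (2.0 * M / r**3)       # l = 0 effective potential

# Ingoing Gaussian pulse; the width sigma sets the wavelength scale (here >> Rs).
sigma, r0 = 10.0, 50.0
psi_prev = np.exp(-((rstar - r0) ** 2) / (2.0 * sigma**2))
dt = 0.5 * drs                                   # satisfies the CFL condition
psi = np.exp(-((rstar - r0 + dt) ** 2) / (2.0 * sigma**2))  # ingoing: f(r* + t)

def laplacian(f):
    out = np.zeros_like(f)
    out[1:-1] = (f[2:] - 2.0 * f[1:-1] + f[:-2]) / drs**2
    return out

for _ in range(8000):                            # leapfrog time stepping
    psi_next = 2.0 * psi - psi_prev + dt**2 * (laplacian(psi) - V * psi)
    psi_prev, psi = psi, psi_next

# Crude diagnostic: field energy remaining outside the potential region,
# i.e. the part of the pulse that was reflected back out.
outside = rstar > 10.0
print("fraction of |psi|^2 at r* > 10:",
      float(np.sum(psi[outside] ** 2) / np.sum(psi ** 2)))
```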

    Cognitive intervention in executive processes for information processing: a program with secondary school teachers and students

    This work presents the results of a study on the executive processes for information processing (encoding, inference, functionalization, and application) carried out during 2006 with a group of teachers and students from several secondary schools in Bogotá. The study was based on a cognitive intervention program grounded in Sternberg's (1985) triarchic theory, following a chronological-samples design. The results showed that, before the intervention program was applied, about 80% of the students in the population did not use the executive processes to assimilate scientific concepts; by the end of the program, about 65% of the students had improved their performance and their use of these executive processes


    Towards automated composition of convergent services: a survey

    A convergent service is defined as a service that exploits the convergence of communication networks and at the same time takes advantage of features of the Web. Building a convergent service is currently not trivial: although there are significant approaches that aim to automate service composition at different levels in the Web and Telecom domains, selecting the most appropriate approach for a specific case study is complex because of the large amount of information involved and the lack of technical considerations. Thus, in this paper, we identify the relevant phases for convergent service composition and explore the existing approaches and their associated technologies for automating each phase. For each technology, we analyse its maturity and results, as well as the elements that must be considered prior to its application in real scenarios. Furthermore, we provide research directions related to the convergent service composition phases

    Freshly Formed Dust in the Cassiopeia A Supernova Remnant as Revealed by the Spitzer Space Telescope

    We performed Spitzer Infrared Spectrograph mapping observations covering nearly the entire extent of the Cassiopeia A supernova remnant (SNR), producing mid-infrared (5.5-35 micron) spectra every 5-10". Gas lines of Ar, Ne, O, Si, S and Fe, and dust continua were strong at most positions. We identify three distinct ejecta dust populations based on their continuum shapes. The dominant dust continuum shape exhibits a strong peak at 21 micron. A line-free map of the 21 micron-peak dust made from the 19-23 micron range closely resembles the [Ar II], [O IV], and [Ne II] ejecta-line maps, implying that the dust is freshly formed in the ejecta. Spectral fitting implies the presence of SiO2, Mg protosilicates, and FeO grains in these regions. The second dust type exhibits a continuum that rises up to 21 micron and then flattens; this "weak 21 micron" dust is likely composed of Al2O3 and C grains. The third dust continuum shape is featureless, with a gently rising spectrum, and is likely composed of MgSiO3 and either Al2O3 or Fe grains. Using the least massive composition for each of the three dust classes yields a total mass of 0.02 Msun; using the most massive composition yields 0.054 Msun. The primary uncertainty in the total dust mass stems from the choice of dust composition needed to fit the featureless dust and the 70 micron flux. The freshly formed dust mass derived from Cas A is sufficient for SNe to explain the lower limit on the dust masses in high-redshift galaxies. Comment: 8 figures. Accepted for publication in Ap
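
    The spectral fitting step described above reduces to representing each mid-infrared spectrum as a non-negative combination of dust emission components. The following is a minimal sketch of that idea, not the paper's pipeline: the emissivity templates, temperatures, and power-law index are placeholder assumptions, whereas a real fit would use laboratory optical constants for SiO2, Mg protosilicates, FeO, Al2O3, and the other candidate grains.

```python
# Minimal sketch (not the paper's pipeline): fit an observed 5.5-35 micron spectrum
# as a non-negative linear combination of dust components, each modelled here as a
# blackbody times an assumed power-law emissivity. All templates are placeholders.
import numpy as np
from scipy.optimize import nnls

h, c, k = 6.626e-34, 2.998e8, 1.381e-23

def planck_lambda(wl_um, T):
    """Blackbody spectral radiance B_lambda(T), arbitrary overall units."""
    wl = wl_um * 1e-6
    return (2 * h * c**2 / wl**5) / np.expm1(h * c / (wl * k * T))

wl = np.linspace(5.5, 35.0, 300)                 # micron grid of the IRS spectra

def template(T, beta=1.5):
    """Hypothetical component: blackbody x lambda^(-beta) emissivity, normalised."""
    t = planck_lambda(wl, T) * wl ** (-beta)
    return t / t.max()

# Three placeholder dust temperatures standing in for distinct grain populations.
A = np.column_stack([template(T) for T in (80.0, 120.0, 200.0)])

# Synthetic 'observed' spectrum: a known mixture plus noise, standing in for data.
rng = np.random.default_rng(0)
truth = np.array([1.0, 0.4, 0.1])
obs = A @ truth + 0.01 * rng.standard_normal(wl.size)

weights, resid = nnls(A, obs)                    # non-negative least squares
print("recovered component weights:", weights, " residual norm:", resid)
```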

    High-resolution backprojection at regional distance: Application to the Haiti M7.0 earthquake and comparisons with finite source studies

    A catastrophic Mw 7.0 earthquake ruptured on 12 January 2010 on a complex fault system near Port-au-Prince, Haiti. Offshore rupture is suggested by aftershock locations and marine geophysics studies, but its extent remains difficult to define using geodetic and teleseismic observations. Here we perform the multitaper multiple signal classification (MUSIC) analysis, a high-resolution array technique, at regional distance with recordings from the Venezuela National Seismic Network to resolve high-frequency (about 0.4 Hz) aspects of the earthquake process. Our results indicate westward rupture with two subevents, roughly 35 km apart. In comparison, a lower-frequency finite source inversion with fault geometry based on new geologic and aftershock data shows two slip patches with centroids 21 km apart. Apparent source time functions from USArray further constrain the intersubevent time delay, implying a rupture speed of 3.3 km/s. The tips of the slip zones coincide with subevents imaged by backprojections. The different subevent locations found by backprojection and source inversion suggest spatial complementarity between high- and low-frequency source radiation, consistent with high-frequency radiation originating from rupture arrest phases at the edges of main slip areas. The centroid moment tensor (CMT) solution and a geodetic-only inversion have similar moment, indicating most of the moment released is captured by geodetic observations and no additional rupture is required beyond where it is imaged in our preferred model. Our results demonstrate the contribution of backprojections of regional seismic array data for earthquakes down to M ≈ 7, especially when incomplete coverage of seismic and geodetic data implies large uncertainties in source inversions.
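
    In essence, MUSIC backprojection scans candidate source directions (or slownesses) with steering vectors and measures how far each lies from the noise subspace of the array's cross-spectral matrix. The narrowband sketch below illustrates only that core idea under simplifying assumptions: a small plane-wave array with made-up station coordinates, a single frequency, and one assumed source. The actual study uses multitaper spectral estimates and regional-distance travel times, which are considerably more involved.

```python
# Minimal sketch (not the study's code): narrowband MUSIC pseudospectrum over a
# horizontal-slowness grid for a small array with placeholder geometry.
import numpy as np

rng = np.random.default_rng(1)
freq = 0.4                                        # Hz, high-frequency band of interest
xy = rng.uniform(-50.0, 50.0, size=(8, 2))        # hypothetical station coords (km)

def steering(sx, sy):
    """Plane-wave steering vector for horizontal slowness (sx, sy) in s/km."""
    delays = xy[:, 0] * sx + xy[:, 1] * sy
    return np.exp(-2j * np.pi * freq * delays)

# Synthetic data: one plane wave with slowness (0.2, -0.1) s/km plus noise,
# observed over many short windows ("snapshots").
true_a = steering(0.2, -0.1)
snapshots = np.array([true_a * np.exp(2j * np.pi * rng.random())
                      + 0.1 * (rng.standard_normal(8) + 1j * rng.standard_normal(8))
                      for _ in range(64)]).T       # shape (stations, windows)

R = snapshots @ snapshots.conj().T / snapshots.shape[1]   # cross-spectral matrix
eigval, eigvec = np.linalg.eigh(R)                 # eigenvalues in ascending order
n_sources = 1
En = eigvec[:, :-n_sources]                        # noise subspace

s_grid = np.linspace(-0.4, 0.4, 81)
P = np.zeros((s_grid.size, s_grid.size))
for i, sx in enumerate(s_grid):
    for j, sy in enumerate(s_grid):
        a = steering(sx, sy)
        # MUSIC pseudospectrum: large when a is nearly orthogonal to the noise subspace.
        P[i, j] = 1.0 / np.real(a.conj() @ En @ En.conj().T @ a)

imax, jmax = np.unravel_index(np.argmax(P), P.shape)
print("peak slowness estimate (s/km):", s_grid[imax], s_grid[jmax])
```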

    QoSatAr: a cross-layer architecture for E2E QoS provisioning over DVB-S2 broadband satellite systems

    This article presents QoSatAr, a cross-layer architecture developed to provide end-to-end quality of service (QoS) guarantees for Internet Protocol (IP) traffic over Digital Video Broadcasting - Second Generation (DVB-S2) satellite systems. The architecture design is based on a cross-layer optimization between the physical layer and the network layer to provide QoS provisioning based on the bandwidth availability in the DVB-S2 satellite channel. Our design is developed at the satellite-independent layers, in compliance with the ETSI-BSM-QoS standards. The architecture is set up inside the gateway; it includes a Re-Queuing Mechanism (RQM) to enhance the goodput of the EF and AF traffic classes and an adaptive IP scheduler to guarantee the high-priority traffic classes, taking into account the channel conditions affected by rain events. One of the most important aspects of the architecture design is that QoSatAr is able to guarantee the QoS requirements of specific traffic flows using a single parameter: the bandwidth availability, which is determined at the physical layer (considering adaptive coding and modulation) and sent to the network layer by means of a cross-layer optimization. The architecture has been evaluated using the NS-2 simulator. In this article, we present evaluation metrics, extensive simulation results and conclusions about the performance of the proposed QoSatAr when it is evaluated over a DVB-S2 satellite scenario. The key results show that the implementation of this architecture makes it possible to keep control of the satellite system load while guaranteeing the QoS levels of the high-priority traffic classes even when bandwidth variations due to rain events are experienced. Moreover, using the RQM mechanism, the user's quality of experience is improved while keeping delay and jitter lower for the high-priority traffic classes. In particular, the AF goodput is enhanced by around 33% on average over the drop-tail scheme
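
    At its core, the adaptive scheduling described above hands out whatever capacity the physical layer currently reports (which varies with the ACM mode during rain fading) to the traffic classes in order of priority, protecting EF and AF before best-effort traffic. The sketch below is an illustrative simplification under that assumption, not the QoSatAr implementation; the class names, demands, and capacities are placeholder values.

```python
# Illustrative sketch (not the QoSatAr implementation): a strict-priority bandwidth
# allocator where the capacity reported by the physical layer (varying with the ACM
# mode / rain fading) is granted to EF, then AF, then BE. Values are placeholders.
from dataclasses import dataclass

@dataclass
class TrafficClass:
    name: str
    demand_kbps: float          # offered load of this class

def allocate(available_kbps: float, classes: list[TrafficClass]) -> dict[str, float]:
    """Strict-priority allocation: earlier classes are served first."""
    remaining = available_kbps
    grants = {}
    for tc in classes:
        grant = min(tc.demand_kbps, remaining)
        grants[tc.name] = grant
        remaining -= grant
    return grants

classes = [TrafficClass("EF", 2000.0),    # e.g. VoIP
           TrafficClass("AF", 5000.0),    # e.g. assured data / streaming
           TrafficClass("BE", 8000.0)]    # best effort

# Capacity drops as the ACM mode becomes more robust during a rain event.
for capacity in (12000.0, 8000.0, 4000.0):
    print(capacity, "kbps available ->", allocate(capacity, classes))
```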