498 research outputs found

    Massive pulsating stars observed by BRITE-Constellation. I. The triple system Beta Centauri (Agena)

    This paper aims to precisely determine the masses and detect pulsation modes in the two massive components of Beta Cen with BRITE-Constellation photometry. In addition, seismic models for the components are considered and the effects of fast rotation are discussed, in order to test the limitations of seismic modeling for this very difficult case. A simultaneous fit of visual and spectroscopic orbits is used to self-consistently derive the orbital parameters, and subsequently the masses, of the components. The derived masses are 12.02 +/- 0.13 and 10.58 +/- 0.18 M_Sun. The parameters of the wider A-B system, presently approaching periastron passage, are constrained. Analysis of the combined blue- and red-filter BRITE-Constellation photometry of the system revealed 19 periodic terms, of which eight are likely g modes, nine are p modes, and the remaining two are combination terms. It cannot be excluded that one or two of the low-frequency terms are rotational frequencies. It is possible that both components of Beta Cen are Beta Cep/SPB hybrids. An attempt to use the apparent changes of frequency to distinguish which modes originate in which component did not succeed, but the method shows promise once more BRITE data become available. Agena seems to be one of very few rapidly rotating massive objects with rich p- and g-mode spectra and precisely known masses. It can therefore be used to better understand the excitation of pulsations in relatively rapidly rotating stars and their seismic modeling. Finally, this case illustrates the potential of BRITE-Constellation data for the detection of rich frequency spectra of small-amplitude modes in massive pulsating stars. Comment: 17 pages (with Appendix), 15 figures, accepted for publication in A&A.
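    The mass determination sketched in this abstract combines a visual orbit (period and angular semi-major axis) with a double-lined spectroscopic orbit (radial-velocity semi-amplitudes). The minimal Python sketch below shows that arithmetic under simplifying assumptions: the semi-major axis is taken as already de-projected by the visual-orbit fit, and all numerical inputs are illustrative placeholders, not the paper's orbital solution for Beta Cen.

```python
# Minimal sketch: component masses from a combined visual + double-lined
# spectroscopic orbit. All numbers are illustrative placeholders, NOT the
# Beta Cen solution derived in the paper.

def component_masses(period_yr, a_arcsec, parallax_arcsec, k1_kms, k2_kms):
    """Return (M1, M2) in solar masses.

    period_yr        orbital period in years
    a_arcsec         de-projected angular semi-major axis of the relative orbit
    parallax_arcsec  trigonometric parallax
    k1_kms, k2_kms   radial-velocity semi-amplitudes of the two components
    """
    a_au = a_arcsec / parallax_arcsec        # linear semi-major axis in AU
    m_total = a_au**3 / period_yr**2         # Kepler's third law, in solar masses
    q = k1_kms / k2_kms                      # mass ratio M2/M1 from the RV amplitudes
    m1 = m_total / (1.0 + q)
    return m1, m_total - m1

# Placeholder orbit, for illustration only.
print(component_masses(period_yr=1.0, a_arcsec=0.025,
                       parallax_arcsec=0.009, k1_kms=60.0, k2_kms=68.0))
```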

    Information Theory and Its Application in Machine Condition Monitoring

    Condition monitoring of machinery is one of the most important aspects of many modern industries. With the rapid advancement of science and technology, machines are becoming increasingly complex, and an exponential increase in demand is driving ever higher requirements on machine output. As a result, in most modern industries machines have to work 24 hours a day. All these factors cause machine health to deteriorate at a higher rate than before. Breakdown of key components of a machine, such as a bearing, gearbox or roller, can have catastrophic consequences in terms of both financial and human costs. From this perspective, it is important not only to detect a fault at its earliest point of inception but also to design the overall monitoring process, including fault classification, fault severity assessment and remaining useful life (RUL) prediction, for better planning of the maintenance schedule. Information theory is one of the pioneering contributions of modern science and has evolved into various forms and algorithms over time. Owing to its ability to address the non-linearity and non-stationarity of machine health deterioration, it has become a popular choice among researchers and an effective technique for extracting features of machines under different health conditions. In this context, this book discusses the potential applications, research results and latest developments of information theory-based condition monitoring of machinery.
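    As an illustration of the kind of information-theoretic feature the book discusses, the sketch below computes the Shannon entropy of a signal's normalised power spectrum; a drift of such an entropy value over time is one commonly used indicator of a changing health state. The synthetic signals and the choice of spectral entropy are assumptions made here for illustration, not taken from the book.

```python
# Hedged sketch: Shannon spectral entropy as a condition-monitoring feature.
# The signals are synthetic; a real application would use accelerometer data.
import numpy as np

def spectral_entropy(signal, eps=1e-12):
    """Shannon entropy (bits) of the normalised power spectrum of `signal`."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    p = spectrum / (spectrum.sum() + eps)   # treat the spectrum as a probability mass
    p = p[p > eps]                          # drop empty bins to avoid log(0)
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 4096, endpoint=False)
baseline = np.sin(2 * np.pi * 50 * t) + 0.1 * rng.standard_normal(t.size)
degraded = baseline + 0.5 * rng.standard_normal(t.size)  # extra broadband content

print(spectral_entropy(baseline), spectral_entropy(degraded))
```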

    IoT for measurements and measurements for IoT

    The thesis is framed within the broad area of the Internet of Things (IoT) and develops along two parallel paths. On the one hand, it identifies operational scenarios in which the IoT paradigm could be innovative and preferable to pre-existing solutions, discussing a couple of applications in detail. On the other hand, it presents methodologies for assessing the performance of technologies and related enabling protocols for IoT systems, focusing mainly on metrics and parameters related to the functioning of the physical layer of these systems.

    The XMM-Newton serendipitous ultraviolet source survey catalogue

    The XMM-Newton Serendipitous Ultraviolet Source Survey (XMM-SUSS) is a catalogue of ultraviolet (UV) sources detected serendipitously by the Optical Monitor (XMM-OM) on board the XMM-Newton observatory. The catalogue contains UV-detected sources collected from 2,417 XMM-OM observations made between 24 February 2000 and 29 March 2007 in up to six broad-band UV and optical filters. The primary contents of the catalogue are source positions, magnitudes and fluxes in one to six passbands, accompanied by profile diagnostics and variability statistics. The XMM-SUSS contains 753,578 UV source detections above a 3-sigma signal-to-noise threshold, which relate to 624,049 unique objects. Taking account of substantial overlaps between observations, the net sky area covered is 29-54 square degrees, depending on UV filter. The magnitude distributions peak at 20.2, 20.9 and 21.2 in UVW2, UVM2 and UVW1 respectively. More than 10 per cent of sources have been visited more than once using the same filter during XMM-Newton operation, and more than 20 per cent of sources are observed more than once per filter during an individual visit. Consequently, the scope for science based on temporal source variability on timescales of hours to years is broad. By comparison with other astrophysical catalogues we test the accuracy of the source measurements and define the nature of the serendipitous UV XMM-OM source sample. The distributions of source colours in the UV and optical filters are shown together with the expected loci of stars and galaxies, and indicate that sources detected in multiple UV bands are predominantly star-forming galaxies and stars of type G or earlier. Comment: Accepted for publication in MNRAS.
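    One common way to exploit the repeated visits mentioned above is a chi-square test of flux constancy per source and filter. The sketch below illustrates that idea in Python; the specific statistic, the SciPy dependency and the example numbers are assumptions for illustration and may differ from the variability statistics actually tabulated in the catalogue.

```python
# Hedged sketch: chi-square test for variability from repeated flux
# measurements of one source in one filter. Example values are illustrative.
import numpy as np
from scipy.stats import chi2

def constancy_p_value(fluxes, flux_errors):
    """p-value of the observed scatter under a constant-flux hypothesis."""
    fluxes = np.asarray(fluxes, dtype=float)
    errors = np.asarray(flux_errors, dtype=float)
    weights = 1.0 / errors**2
    mean_flux = np.sum(weights * fluxes) / np.sum(weights)  # error-weighted mean
    chisq = np.sum(((fluxes - mean_flux) / errors) ** 2)
    dof = fluxes.size - 1
    return chi2.sf(chisq, dof)  # small value -> source is likely variable

# Five hypothetical flux measurements (arbitrary units) with 1-sigma errors.
print(constancy_p_value([1.00, 1.05, 0.97, 1.40, 1.02],
                        [0.05, 0.05, 0.05, 0.05, 0.05]))
```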

    Planck early results. VI. The High Frequency Instrument data processing

    We describe the processing of the 336 billion raw data samples from the High Frequency Instrument (HFI) which we performed to produce six temperature maps from the first 295 days of Planck-HFI survey data. These maps provide an accurate rendition of the sky emission at 100, 143, 217, 353, 545 and 857 GHz with an angular resolution ranging from 9.9 to 4.4 arcmin. The white noise level is around 1.5 μK degree or less in the three main CMB channels (100–217 GHz). The photometric accuracy is better than 2% at frequencies between 100 and 353 GHz and around 7% at the two highest frequencies. The maps created by the HFI Data Processing Centre reach our goals in terms of sensitivity, resolution and photometric accuracy. They are already sufficiently accurate and well characterised to allow the scientific analyses presented in an accompanying series of early papers. At this stage, HFI data appear to be of high quality, and we expect that with further refinements of the data processing we will be able to achieve, or exceed, the science goals of the Planck project.

    The Telecommunications and Data Acquisition Report

    This quarterly publication provides archival reports on developments in programs in space communications, radio navigation, radio science, and ground-based radio and radar astronomy. It reports on activities of the Deep Space Network (DSN) in planning, supporting research and technology, implementation, and operations. Also included are standardization activities at the Jet Propulsion Laboratory for space data and information systems.

    Structural health monitoring of civil infrastructure

    Structural health monitoring (SHM) is a term increasingly used over the last decade to describe a range of systems implemented on full-scale civil infrastructure. Their purpose is to assist and inform operators about the continued 'fitness for purpose' of structures under gradual or sudden changes to their state, and to learn about the load and/or response mechanisms. Arguably, various forms of SHM have been employed in civil infrastructure for at least half a century, but it is only in the last decade or two that computer-based systems have been designed specifically to provide owners and operators of ageing infrastructure with timely information for its continued safe and economic operation. This paper describes the motivations for and recent history of SHM applications to various forms of civil infrastructure and provides case studies on specific types of structure. It ends with a discussion of the present state of the art and of future developments in instrumentation, data acquisition, communication systems, data mining and presentation procedures for the diagnosis of infrastructural 'health'.

    Gaia Data Release 1: Pre-processing and source list creation

    Context. The first data release from the Gaia mission contains accurate positions and magnitudes for more than a billion sources, and proper motions and parallaxes for the majority of the 2.5 million HIPPARCOS and Tycho-2 stars. Aims. We describe three essential elements of the initial data treatment leading to this catalogue: the image analysis, the construction of a source list, and the near real-time monitoring of the payload health. We also discuss some weak points that limit the attainable precision at the present stage of the mission. Methods. Image parameters for point sources are derived from one-dimensional scans, using a maximum likelihood method, under the assumption of a line spread function constant in time and a complete modelling of bias and background; these conditions are, however, not completely fulfilled. The Gaia source list is built starting from a large ground-based catalogue, but even so a significant number of new entries have been added and a large number removed. The autonomous onboard star image detection picks up many spurious images, especially around bright sources, and such unwanted detections must be identified. Another key step of the source list creation consists in arranging the more than 10^(10) individual detections into spatially isolated groups that can be analysed individually. Results. Complete software systems have been built for the Gaia initial data treatment, which manage approximately 50 million focal plane transits daily, delivering transit times and fluxes for 500 million individual CCD images to the astrometric and photometric processing chains. The software also carries out a successful and detailed daily monitoring of Gaia's health.
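    The one-dimensional maximum-likelihood image fit described under Methods can be illustrated with a small sketch: a Gaussian stand-in for the line spread function plus a flat background, fitted to per-sample counts by maximising a Poisson log-likelihood. The Gaussian LSF, the window size, the simulated counts and the optimiser choice are all assumptions for illustration, not the actual Gaia pipeline.

```python
# Hedged sketch of a 1-D maximum-likelihood image-parameter fit: amplitude,
# centroid and background of a Gaussian stand-in for the LSF, fitted to
# per-sample counts with a Poisson likelihood. Illustrative only.
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

SAMPLES = np.arange(12.0)   # along-scan sample positions of one window
LSF_SIGMA = 1.1             # assumed constant-in-time LSF width (in samples)

def model(amplitude, centroid, background):
    return background + amplitude * np.exp(-0.5 * ((SAMPLES - centroid) / LSF_SIGMA) ** 2)

def neg_log_likelihood(params, counts):
    amplitude, centroid, background = params
    mu = np.clip(model(amplitude, centroid, background), 1e-9, None)
    # Poisson log-likelihood: sum_k [ n_k log(mu_k) - mu_k - log(n_k!) ]
    return -np.sum(counts * np.log(mu) - mu - gammaln(counts + 1.0))

# Simulated transit with the true centroid at sample 5.3, plus Poisson noise.
rng = np.random.default_rng(1)
counts = rng.poisson(model(900.0, 5.3, 30.0))

fit = minimize(neg_log_likelihood, x0=[counts.max(), SAMPLES.mean(), counts.min()],
               args=(counts,), method="Nelder-Mead")
amplitude, centroid, background = fit.x
print(f"fitted centroid = {centroid:.3f} samples, amplitude = {amplitude:.1f}")
```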