
    New high-sensitivity, milliarcsecond resolution results from routine observations of lunar occultations at the ESO VLT

    (Abridged) Lunar occultations (LO) are a very efficient and powerful technique that achieves the best combination of high angular resolution and sensitivity possible today at near-infrared wavelengths. Since the events are fixed in time, the sources are occulted randomly, and the telescope time required is minimal, the technique is very well suited to service-mode observations. We have established a program of routine LO observations at the VLT observatory, designed to take advantage of short breaks available between other programs. We have used the ISAAC instrument in burst mode, which produces continuous read-outs at millisecond rates on a suitable subwindow. Given the random nature of the source selection, our primary aim has been to observe a large number of stellar sources at the highest angular resolution in order to detect new binaries. Serendipitous results, such as resolved sources and the detection of circumstellar components, were also anticipated. We recorded the signal from background stars for a few seconds around the predicted time of occultation by the Moon's dark limb. At millisecond time resolution, a characteristic diffraction pattern can be observed. Patterns from two or more sources superimpose linearly, and this property is used to detect binary stars. Detailed analysis of the diffraction fringes can measure specific properties such as the stellar angular size and reveal extended light sources such as a circumstellar shell. We present a list of 191 stars for which LO data could be recorded and analyzed. Results include the detection of 16 binary and 2 triple stars, all but one of which were previously unknown. The projected angular separations are as small as 4 milliarcseconds and the magnitude differences as high as ΔK=5.8 mag.
    Comment: 10 pages, 3 figures, to be published in A&A
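    The light curve described above is the classical Fresnel straight-edge diffraction pattern, and binary detection relies on the fact that the patterns of the two components add linearly. A minimal sketch, not the authors' pipeline — the wavelength, limb velocity, and Earth–Moon distance below are assumed round numbers for illustration:

    ```python
    import numpy as np
    from scipy.special import fresnel

    def edge_pattern(t, t0, v, wl, d):
        """Monochromatic light curve of a point source occulted by a straight
        (lunar) limb: the Fresnel straight-edge diffraction pattern.
        t: time array (s); t0: geometric occultation time (s);
        v: limb velocity projected toward the source (m/s);
        wl: wavelength (m); d: Earth-Moon distance (m)."""
        # dimensionless Fresnel coordinate; positive while the source is visible
        w = (t0 - t) * v * np.sqrt(2.0 / (wl * d))
        S, C = fresnel(w)  # scipy returns (S, C) in that order
        return 0.5 * ((C + 0.5) ** 2 + (S + 0.5) ** 2)

    def binary_curve(t, t0, dt, f1, wl=2.2e-6, v=600.0, d=3.84e8):
        """Two point sources superimpose linearly; the companion's pattern is
        shifted in time by dt = projected separation / limb velocity."""
        return (f1 * edge_pattern(t, t0, v, wl, d)
                + (1.0 - f1) * edge_pattern(t, t0 + dt, v, wl, d))
    ```

    Far before the event the intensity is ~1 (fully illuminated), far after it is ~0, and exactly at the geometric occultation time it is 0.25; a binary shows a second, scaled copy of the fringe pattern displaced in time.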

    Reassessing values for emerging big data technologies: integrating design-based and application-based approaches

    Through the exponential growth in digital devices and computational capabilities, big data technologies are putting pressure on the boundaries of what can and cannot be considered ethically acceptable. Much of the literature on ethical issues related to big data and big data technologies focuses on individual values such as privacy, human dignity, justice or autonomy. More holistic approaches, which allow a more comprehensive view and a better balancing of values, usually follow either a design-based approach, which attempts to embed values in the design of new technologies, or an application-based approach, which attempts to address the ways in which new technologies are used. Some integrated approaches do exist, but they are typically more general in nature. This offers a broad scope of application, but may not always be tailored to the specific nature of big-data-related ethical issues. In this paper we distil a comprehensive set of ethical values from existing design-based and application-based ethical approaches for new technologies and focus these values on the context of emerging big data technologies. Four value lists (from techno-moral values, value-sensitive design, anticipatory emerging technology ethics and biomedical ethics) were selected for this. The integrated list consists of ten values: human welfare, autonomy, non-maleficence, justice, accountability, trustworthiness, privacy, dignity, solidarity and environmental welfare. Together, this set of values provides a comprehensive and in-depth overview of the values to be taken into account for emerging big data technologies.
    Horizon 2020 (H2020) No 731873 (e-SIDES)

    Results and recommendations from an intercomparison of six Hygroscopicity-TDMA systems

    The performance of six custom-built Hygroscopicity Tandem Differential Mobility Analyser (H-TDMA) systems was investigated in the frame of an international calibration and intercomparison workshop held in Leipzig in February 2006. The goal of the workshop was to harmonise H-TDMA measurements and to develop recommendations for atmospheric measurements and their data evaluation. The H-TDMA systems were compared in terms of the sizing of dry particles, relative humidity (RH) uncertainty, and consistency in determining the number fractions of different hygroscopic particle groups. The experiments were performed in an air-conditioned laboratory using ammonium sulphate particles or an external mixture of ammonium sulphate and soot particles. The dry-particle sizing of the six H-TDMA systems was within 0.2 to 4.2% of the selected particle diameter, depending on the investigated size and the individual system. Measurements of ammonium sulphate aerosol showed deviations equivalent to 4.5% RH from the set point of 90% RH when compared with results from previous experiments in the literature. The number fractions of particles within the clearly separated growth-factor modes of a laboratory-generated, externally mixed aerosol were also evaluated. The data from the H-TDMAs were analysed with a single fitting routine to investigate differences caused by the different data-evaluation procedures used for each H-TDMA. The differences between the H-TDMAs were reduced from +12/-13% to +8/-6% when the same analysis routine was applied. We conclude that a common data-evaluation procedure for determining number fractions of externally mixed aerosols will improve the comparability of H-TDMA measurements. It is recommended to ensure proper calibration of all flow, temperature and RH sensors in the systems. It is most important to thermally insulate the aerosol humidification unit and the second DMA and to monitor these temperatures to an accuracy of 0.2 °C. For the correct determination of external mixtures, it is necessary to account for size-dependent diffusion losses in the plumbing between the DMAs and in the aerosol humidification unit.
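    The common fitting step recommended above — deriving number fractions from clearly separated growth-factor modes — can be sketched by fitting a two-mode Gaussian to a growth-factor distribution. This is an illustration on synthetic data, not the workshop's actual routine; the mode positions (GF ≈ 1.0 for hydrophobic soot, GF ≈ 1.7 for ammonium sulphate at 90% RH) are assumed typical values:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def two_modes(gf, a1, mu1, s1, a2, mu2, s2):
        """Sum of two Gaussian growth-factor modes."""
        g = lambda x, a, mu, s: a * np.exp(-0.5 * ((x - mu) / s) ** 2)
        return g(gf, a1, mu1, s1) + g(gf, a2, mu2, s2)

    # synthetic external mixture: 30% soot-like (GF ~ 1.0), 70% sulphate-like (GF ~ 1.7)
    rng = np.random.default_rng(0)
    gfs = np.concatenate([rng.normal(1.0, 0.05, 300), rng.normal(1.7, 0.08, 700)])
    hist, edges = np.histogram(gfs, bins=60, range=(0.8, 2.1))
    centres = 0.5 * (edges[:-1] + edges[1:])

    # fit both modes with a single routine, then compare Gaussian areas
    p, _ = curve_fit(two_modes, centres, hist, p0=[50, 1.0, 0.05, 100, 1.7, 0.08])
    a1, _, s1, a2, _, s2 = p
    area1, area2 = abs(a1 * s1), abs(a2 * s2)  # Gaussian area ∝ amplitude × sigma
    frac_soot = area1 / (area1 + area2)
    ```

    Applying one such shared routine to every instrument's data is what removes the evaluation-procedure component of the inter-system spread.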

    Milliarcsecond angular resolution of reddened stellar sources in the vicinity of the Galactic Center

    For the first time, the lunar occultation technique has been employed on a very large telescope in the near-IR with the aim of systematically achieving milliarcsecond resolution on stellar sources. We have demonstrated the burst mode of the ISAAC instrument, using a fast read-out on a small area of the detector to record many tens of seconds of data at a time on fields of a few square arcseconds. We used the opportunity to record a large number of LO events during a passage of the Moon close to the Galactic Center in March 2006. We have developed a data pipeline for the treatment of LO data, including automated estimation of the main data-analysis parameters using a wavelet-based method, and preliminary fitting and plotting of all light curves. We recorded 51 LO events over about four hours. Of these, 30 were of sufficient quality to enable detailed fitting. We detected two binaries with sub-arcsecond projected separations and three stars with marginally resolved angular diameters of about 2 mas. Two SiO masers were found to be resolved, and in one case we could recover the brightness profile of the extended emission, which is consistent with an optically thin shell. The remaining unresolved stars were used to characterize the performance of the method. The LO technique at a very large telescope is a powerful and efficient method to achieve angular resolution, sensitivity, and dynamic range that are among the best possible today with any technique. The selection of targets is naturally limited and LOs are fixed-time events; however, each observation requires only a few minutes including overheads. As such, LOs are ideally suited to fill small gaps of idle time between standard observations.
    Comment: A&A in press
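    The marginally resolved ~2 mas diameters reported above come from the smearing of the point-source diffraction fringes by the stellar disk. A self-contained sketch of this effect (illustrative only, not the authors' fitting code — the wavelength, Earth–Moon distance, and grids are assumed values): the uniform-disk light curve is the straight-edge pattern convolved with the one-dimensional strip brightness of the disk.

    ```python
    import numpy as np
    from scipy.special import fresnel

    def point_pattern(x, wl=2.2e-6, d=3.84e8):
        """Straight-edge (point-source) pattern vs. limb-to-source distance x (m)
        on the Fresnel screen; x > 0 means the source is still visible."""
        w = x * np.sqrt(2.0 / (wl * d))
        S, C = fresnel(w)
        return 0.5 * ((C + 0.5) ** 2 + (S + 0.5) ** 2)

    def disk_pattern(x, theta_mas, wl=2.2e-6, d=3.84e8):
        """Uniform-disk light curve: the point-source pattern smeared with the
        disk's strip brightness (chord length across the disk)."""
        r = 0.5 * theta_mas * np.radians(1.0 / 3.6e6) * d  # disk radius on the screen (m)
        s = np.linspace(-r, r, 201)
        strip = np.sqrt(r ** 2 - s ** 2)
        strip /= strip.sum()                               # normalized smearing kernel
        return np.array([np.sum(strip * point_pattern(xi - s)) for xi in x])
    ```

    For a 2 mas star the smearing scale is only a couple of metres against a ~20 m Fresnel scale, so the fringes are visibly damped but not erased — which is why such diameters are "marginally" resolved.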

    Foreign Direct Investment and Employment: Home Country Experience in the United States and Sweden

    We compare the relation between foreign affiliate production and parent employment in U.S. manufacturing multinationals with that in Swedish firms. U.S. multinationals appear to have allocated some of their more labor intensive operations selling in world markets to affiliates in developing countries, reducing the labor intensity in their home production. Swedish multinationals produce relatively little in developing countries and most of that has been for sale within host countries with import-substituting trade regimes. The great majority of Swedish affiliate production is in high-income countries, the U.S. and Europe, and is associated with more employment, particularly blue-collar employment, in the parent companies. The small Swedish-owned production that does take place in developing countries is also associated with more white-collar employment at home. The effects on white-collar employment within the Swedish firms have grown smaller and weaker over time.