
    Rotational properties of the binary and non-binary populations in the Trans-Neptunian belt

    We present results for the short-term variability of Binary Trans-Neptunian Objects (BTNOs). We performed CCD photometric observations using the 3.58 m Telescopio Nazionale Galileo, the 1.5 m Sierra Nevada Observatory telescope, and the 1.23 m Centro Astronomico Hispano Aleman telescope at Calar Alto Observatory. We present results based on five years of observations and report the short-term variability of six BTNOs. Our sample contains three classical objects: 2003 MW12, or Varda, 2004 SB60, or Salacia, and 2002 VT130; one detached disk object: 2007 UK126; and two resonant objects: 2007 TY430 and 2000 EB173, or Huya. For each target, possible rotational periods and/or photometric amplitudes are reported. We also derived some physical properties from their lightcurves, such as density, primary and secondary sizes, and albedo. We compiled and analyzed a vast lightcurve database for Trans-Neptunian Objects (TNOs) including centaurs to determine the lightcurve amplitude and spin frequency distributions for the binary and non-binary populations. The mean rotational periods, from the Maxwellian fits to the frequency distributions, are 8.63+/-0.52 h for the entire sample, 8.37+/-0.58 h for the sample without the binary population, and 10.11+/-1.19 h for the binary population alone. Because the centaurs are collisionally more evolved, their rotational periods might not be so primordial. We computed a mean rotational period, from the Maxwellian fit, of 8.86+/-0.58 h for the sample without the centaur population, and of 8.64+/-0.67 h considering a sample without the binary and the centaur populations. According to this analysis, regular TNOs spin faster than binaries, which is compatible with the tidal interaction of the binaries.
Finally, we examined possible formation models for several systems studied in this work and by our team in previous papers.
Comment: Accepted for publication in Astronomy and Astrophysics (June 26th, 2014); minor changes with published version; 21 pages, 17 figures, 7 tables
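
As a hedged illustration of the analysis described above, a Maxwellian can be fit to a spin frequency distribution and its mean converted to a mean rotational period. The sample below is synthetic (assumed units of cycles per day), not the paper's TNO lightcurve database:

```python
import numpy as np
from scipy.stats import maxwell

# Simulate a sample of spin frequencies (cycles/day); in the study this
# would come from the compiled lightcurve database.
rng = np.random.default_rng(42)
freqs = maxwell.rvs(scale=1.1, size=500, random_state=rng)

# Fit a Maxwellian with the location fixed at zero, as appropriate for
# a frequency distribution.
loc, scale = maxwell.fit(freqs, floc=0)

# The mean of a Maxwellian with loc=0 is 2*scale*sqrt(2/pi).
mean_freq = 2.0 * scale * np.sqrt(2.0 / np.pi)  # cycles/day
mean_period_h = 24.0 / mean_freq                # hours
```

Repeating such a fit separately for the binary and non-binary subsamples would yield the kind of mean-period contrast reported above.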

    Geothermal probabilistic cost study

    A tool is presented to quantify the risks of geothermal projects: the Geothermal Probabilistic Cost Model (GPCM). The GPCM was used to evaluate a geothermal reservoir for a binary-cycle electric plant at Heber, California. Three institutional aspects of geothermal risk that can shift the risk among different agents were analyzed: the leasing of geothermal land, contracting between the producer and the user of the geothermal heat, and insurance against faulty performance.

    Abundances of ammonia and carbon disulfide in the Jovian stratosphere following the impact of comet Shoemaker‐Levy 9

    Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/95648/1/grl8459.pd

    Flexible copyright: the law and economics of introducing an open norm in the Netherlands

    This study analyses the law and economics of introducing flexibility in the system of exceptions and limitations in Dutch copyright law. Such flexibility would exist in an open norm, on the basis of which the courts can decide whether certain uses of copyrighted material are permissible or not, instead of explicitly defining this in the law. First, it assesses problem areas where the lack of flexibility creates legal disputes and potential barriers to innovation and commercialisation. Second, it analyses the economic rationale and economic effects of introducing flexibility. The study was commissioned by the Dutch Ministry of Economic Affairs, Agriculture & Innovation. Research methods used are literature review and in-depth interviews. The study includes a case study of Israel, where a fair use exception was introduced in the Copyright Act in 2007. Exceptions and limitations in the current copyright system are meant to balance the protection granted to rights owners with the public interest in making certain unauthorized uses. However, this report identified a number of situations that do not fit well within the current set of exceptions and limitations and attributes this to a lack of flexibility. Among these uses are the activities of search engines, the use of works in User Created Content, cloud computing, data mining, distance learning, and transformative uses by, for instance, documentary filmmakers. Several of these problem areas have given rise to court proceedings with varying outcomes. The interpretation given by courts to existing exceptions and limitations - such as the quotation right, the exception for transient and incidental copying, the private copying exception, and the incidental use exception - is usually too narrow to respond to new technological developments, new developments in the creation process, or new commercialisation models. 
These types of uses generally do not ‘fit’ the narrowly defined exceptions and limitations and therefore lack legal basis. The same is true for things not yet invented. Because the law is not flexible in itself, courts have increasingly found inventive ways to create legal space for uses that are not covered by the exhaustive list of exceptions. In these cases, flexibility with specific evaluation criteria could have been more satisfactory from a legal perspective. Flexibility could be obtained by introducing an open norm in the copyright system. This report defines such an open norm for the purpose of analysing the effects of more flexibility in copyright law. The norm has two main properties. First, it would coexist with the exhaustive list of exceptions and limitations in the current Dutch Copyright Act. Second, a use of a work would only benefit from the open norm if it passes the so-called three-step test, which takes the interests of the author or right holder into account. The first category of economic effects of introducing an open norm is that for some known uses that otherwise require licensing, the open norm would allow unlicensed use. This potentially reduces the reward to the creator of a work and therefore decreases the incentive to create. By contrast, it is also likely to reduce the creator’s costs of using another work as an input when producing a new work, and therefore to increase the incentive to create. It is difficult to predict which of these two opposing effects ultimately turns the scale in specific markets. Traditional creators generally worry about the negative effect on their reward and seem to believe that the first effect dominates. For businesses that use large numbers of protected works as an input for their services, such as Google, the opposite is true. They emphasise the benefits of reduced input costs and are likely to improve their legal position with an open norm. 
Collective rights management organisations in turn fear that their bargaining power vis-à-vis users like UCC-platforms, such as YouTube, would suffer from an open norm. However, given the design of the open norm, it is unlikely that rewards for creators are significantly affected. The application of the open norm by the courts tests for adverse effects on the business model of the rights holder (the previously mentioned three-step test). In case of severe adverse effects on the rights holder, the open norm does not apply. The shift in bargaining power from rights holders to users (platforms) is limited to cases that are currently licensed and where parties are sufficiently confident that the use benefits from the open norm. The second category of economic effects of introducing an open norm is that the legal delineation between infringement and permissible use becomes capable of accommodating developments in technology and society. This enables entrepreneurs to develop new products and services that rely on currently unforeseen use of protected material. On the downside, flexibility may reduce legal certainty in the short run, until jurisprudence on the practice of flexible copyright has developed. The countries that have recently introduced an open norm in their copyright laws have not produced any ex-ante or ex-post studies on the magnitude of these economic effects. The case study of fair use in Israel shows that the change may decrease legal certainty in the short run (as case law needs time to develop), but improve legal certainty in the longer run, as the legal position of acts that do not ‘fit’ a rigid system with an exhaustive list of static exceptions is being clarified. In sum, the main effects of introducing an open norm seem to be of a legal nature: it changes the legal position of some businesses and therefore affects the costs these businesses incur to comply with copyright. ‘Tomorrow’s inventions’ are likely to be facilitated by an open norm. 
Since most businesses currently seem not to be chilled by the lack of flexibility, the effect on products and services available in the market is likely to be secondary to the legal effects.

    Use of Energy Consumption during Milling to Fill a Measurement Gap in Hybrid Additive Manufacturing

    Coupling additive manufacturing (AM) with interlayer peening introduces bulk anisotropic properties within a build across several centimeters. Current methods to map high resolution anisotropy and heterogeneity are either destructive or have a limited penetration depth using a nondestructive method. An alternative pseudo-nondestructive method to map high-resolution anisotropy and heterogeneity is through energy consumption during milling. Previous research has shown energy consumption during milling correlates with surface integrity. Since surface milling of additively manufactured parts is often required for post-processing to improve dimensional accuracy, an opportunity is available to use surface milling as an alternative method to measure mechanical properties and build quality. The variation of energy consumption during the machining of additive parts, as well as hybrid AM parts, is poorly understood. In this study, the use of net cutting specific energy was proposed as a suitable metric for measuring mechanical properties after interlayer ultrasonic peening of 316 stainless steel. Energy consumption was mapped throughout half of a cuboidal build volume. Results indicated the variation of net cutting specific energy increased farther away from the surface and was higher for hybrid AM compared to as-printed and wrought. The average lateral and layer variation of the net cutting specific energy for printed samples was 81% higher than the control, which indicated a significantly higher degree of heterogeneity. Further, it was found that energy consumption was an effective process signature exhibiting strong correlations with microhardness. Anisotropy based on residual strains was measured using net cutting specific energy and validated by hole drilling. 
The proposed technique contributes to filling part of the measurement gap in hybrid additive manufacturing and capitalizes on the preexisting need for machining of AM parts to achieve both goals of surface finish and quality assessment in one milling operation.
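
As a minimal sketch of the metric named above, with assumed power readings and cutting parameters (not the study's instrumentation or values), net cutting specific energy can be computed as net spindle power divided by the material removal rate:

```python
# Net cutting specific energy: net spindle power / material removal rate.
# All numbers below are illustrative assumptions.

def net_cutting_specific_energy(p_cut_w, p_idle_w, doc_mm, woc_mm, feed_mm_s):
    """Specific energy in J/mm^3 from net power and removal rate."""
    net_power_w = p_cut_w - p_idle_w            # W, i.e. J/s
    mrr_mm3_s = doc_mm * woc_mm * feed_mm_s     # mm^3/s
    return net_power_w / mrr_mm3_s              # J/mm^3

# Example: 400 W while cutting, 250 W air-cut baseline, 0.5 mm depth of
# cut, 6 mm width of cut, 10 mm/s feed -> (400 - 250) / 30 = 5 J/mm^3.
u = net_cutting_specific_energy(400.0, 250.0, 0.5, 6.0, 10.0)
```

Mapping this quantity across milling passes at different depths and lateral positions gives the kind of heterogeneity map described above.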

    Psychological response and quality of life after transplantation: a comparison between heart, lung, liver and kidney recipients

    PRINCIPLES: Various non-specific questionnaires were used to measure quality of life and psychological wellbeing of patients after organ transplantation. At present, cross-organ studies dealing specifically with the psychological response to a transplanted organ are non-existent in German-speaking countries. METHODS: The Transplant Effects Questionnaire TxEQ-D and the SF-36 Quality of Life Questionnaire were used to examine the psychological response and quality of life of 370 patients after heart, lung, liver or kidney transplantation. The organ groups were compared with regard to psychosocial parameters. RESULTS: 72% of patients develop a feeling of responsibility for the received organ and its function. This feeling is even stronger towards the patient's key relationships, i.e. family, friends, the treatment team and the donor. 11.6% worry about the transplanted organ. Heart and lung patients report significantly fewer concerns than liver and kidney patients. Overall, only a minority of patients report feelings of guilt towards the donor (2.7%), problems in disclosing their transplant to others (2.4%), or difficulties in complying with medical orders (3.5%). Lung transplant patients show significantly better adherence. CONCLUSIONS: A feeling of responsibility towards those one is close to and towards the donor is a common psychological phenomenon after transplantation of an organ. Conscious feelings of guilt and shame are harboured by only a minority of patients. The fact that heart and lung patients worry less about their transplant might have primarily to do with the greater medical and psychosocial support in this group.

    Models of the SL9 Impacts II. Radiative-hydrodynamic Modeling of the Plume Splashback

    We model the plume "splashback" phase of the SL9 collisions with Jupiter using the ZEUS-3D hydrodynamic code. We modified the ZEUS code to include gray radiative transport, and we present validation tests. We couple the infalling mass and momentum fluxes of SL9 plume material (from Paper I) to a jovian atmospheric model. A strong and complex shock structure results. The modeled shock temperatures agree well with observations, and the structure and evolution of the modeled shocks account for the appearance of high-excitation molecular line emission after the peak of the continuum light curve. The splashback region cools by radial expansion as well as by radiation. The morphology of our synthetic continuum light curves agrees with observations over a broad wavelength range (0.9 to 12 microns). A feature of our ballistic plume is a shell of mass at the highest velocities, which we term the "vanguard". Portions of the vanguard ejected on shallow trajectories produce a lateral shock front, whose initial expansion accounts for the "third precursors" seen in the 2-micron light curves of the larger impacts, and for hot methane emission at early times. Continued propagation of this lateral shock approximately reproduces the radii, propagation speed, and centroid positions of the large rings observed at 3-4 microns by McGregor et al. The portion of the vanguard ejected closer to the vertical falls back with high z-component velocities just after maximum light, producing CO emission and the "flare" seen at 0.9 microns. The model also produces secondary maxima ("bounces") whose amplitudes and periods are in agreement with observations.
Comment: 13 pages, 9 figures (figs 3 and 4 in color), accepted for Ap.J. LaTeX; version including full figures at: http://oobleck.tn.cornell.edu/jh/ast/papers/slplume2-20.ps.g

    The NICMOS Snapshot Survey of nearby Galaxies

    We present "snapshot" observations with the Near Infrared Camera and Multi-Object Spectrometer (NICMOS) on board the Hubble Space Telescope (HST) of 94 nearby galaxies from the Revised Shapley-Ames Catalog. Images with 0.2 arcsec resolution were obtained in two filters, a broad-band continuum filter (F160W, roughly equivalent to the H-band) and a narrow-band filter centered on the Paschen alpha line (F187N or F190N, depending on the galaxy redshift) with the 51x51 arcsec field of view of the NICMOS camera 3. A first-order continuum subtraction is performed, and the resulting line maps and integrated Paschen alpha line fluxes are presented. A statistical analysis indicates that the average Paschen alpha surface brightness in the central regions is highest in early-type (Sa-Sb) spirals.
Comment: Original contained error in flux calibration. Table 1 now has correct Paschen Alpha fluxes. 14 pages LaTeX with JPEG and PS figures. Also available at http://icarus.stsci.edu/~boeker/publications.htm
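
A first-order continuum subtraction of the kind described above can be sketched as follows; the arrays and scale factor here are illustrative placeholders, not NICMOS calibration values:

```python
import numpy as np

# Scale the broad-band (continuum) image to the narrow-band bandpass and
# subtract it, isolating the line emission in the narrow-band image.

def continuum_subtract(narrow, broad, scale):
    """Return a line-only map: narrow-band minus scaled broad-band."""
    return narrow - scale * broad

broad = np.full((4, 4), 10.0)    # pure continuum image (broad-band filter)
narrow = np.full((4, 4), 1.2)    # continuum plus line in the narrow band
line_map = continuum_subtract(narrow, broad, scale=0.1)
```

In practice the scale factor would come from the relative filter bandwidths or from forcing line-free regions of the map to zero.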

    The Tully-Fisher relation at intermediate redshift

    Using the Very Large Telescope in Multi-Object Spectroscopy mode, we have observed a sample of 113 field spiral galaxies in the FORS Deep Field (FDF) with redshifts in the range 0.1<z<1.0. The galaxies were selected by apparent brightness (R<23) and encompass all late spectrophotometric types from Sa to Sdm/Im. Spatially resolved rotation curves have been extracted for 77 galaxies and fitted with synthetic velocity fields taking into account all observational effects, from inclination and slit misalignment to seeing and slit width. We also compared different shapes for the intrinsic rotation curve. To gain robust values of V_max, our analysis is focussed on galaxies with rotation curves which extend well into the region of constant rotation velocity at large radii. If the slope of the local Tully-Fisher relation (TFR) is held fixed, we find evidence for a mass-dependent luminosity evolution which is as large as 2 mag for the lowest-mass galaxies, but is small or even negligible for the highest-mass systems in our sample. In effect, the TFR slope is shallower at z~0.5 in comparison to the local sample. We argue for a mass-dependent evolution of the mass-to-light ratio. An additional population of blue, low-mass spirals does not seem a very appealing explanation. The flatter tilt we find for the distant TFR is in contradiction to the predictions of recent semi-analytic simulations.
Comment: 18 pages, 14 figures, A&A, in press. Section on sample completeness added. Please note that the entire analysis is based on undisturbed, high-quality rotation curves! Potential effects of tidal interactions are also discussed.
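
As an illustrative sketch of the fixed-slope comparison described above (synthetic numbers; the slope, zero point, and helper function are assumptions, not the paper's calibration), luminosity evolution can be read off as the mean magnitude offset of a sample from the local relation M = a*log10(V_max) + b:

```python
import numpy as np

# Assumed local TFR slope and zero point (placeholders for illustration).
a_local, b_local = -7.5, -2.0

def tfr_offset(v_max_kms, abs_mag, a=a_local, b=b_local):
    """Mean offset (mag) of a sample from the fixed-slope local TFR."""
    predicted = a * np.log10(v_max_kms) + b
    return float(np.mean(abs_mag - predicted))

v = np.array([100.0, 150.0, 220.0])          # V_max in km/s
m_local = a_local * np.log10(v) + b_local    # magnitudes on the local TFR
offset = tfr_offset(v, m_local - 1.0)        # sample 1 mag brighter: -1.0
```

Computing such offsets in bins of V_max is one way to expose the mass-dependent evolution the abstract argues for.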