
    Flexible copyright: the law and economics of introducing an open norm in the Netherlands

    This study analyses the law and economics of introducing flexibility into the system of exceptions and limitations in Dutch copyright law. Such flexibility would take the form of an open norm, on the basis of which the courts can decide whether certain uses of copyrighted material are permissible, instead of this being defined explicitly in the law. First, the study assesses problem areas where the lack of flexibility creates legal disputes and potential barriers to innovation and commercialisation. Second, it analyses the economic rationale and economic effects of introducing flexibility. The study was commissioned by the Dutch Ministry of Economic Affairs, Agriculture & Innovation. The research methods used are a literature review and in-depth interviews. The study includes a case study of Israel, where a fair use exception was introduced in the Copyright Act in 2007. Exceptions and limitations in the current copyright system are meant to balance the protection granted to rights owners against the public interest in making certain unauthorized uses. However, this report identifies a number of situations that do not fit well within the current set of exceptions and limitations and attributes this to a lack of flexibility. Among these uses are the activities of search engines, the use of works in User Created Content, cloud computing, data mining, distance learning, and transformative uses by, for instance, documentary filmmakers. Several of these problem areas have given rise to court proceedings with varying outcomes. The interpretation given by courts to existing exceptions and limitations - such as the quotation right, the exception for transient and incidental copying, the private copying exception, and the incidental use exception - is usually too narrow to respond to new technological developments, new developments in the creation process, or new commercialisation models.
These types of uses generally do not ‘fit’ the narrowly defined exceptions and limitations and therefore lack a legal basis. The same is true for things not yet invented. Because the law is not flexible in itself, courts have increasingly found inventive ways to create legal space for uses that are not covered by the exhaustive list of exceptions. In these cases, flexibility with specific evaluation criteria could have been more satisfactory from a legal perspective. Flexibility could be obtained by introducing an open norm in the copyright system. This report defines such an open norm for the purpose of analysing the effects of more flexibility in copyright law. The norm has two main properties. First, it would coexist with the exhaustive list of exceptions and limitations in the current Dutch Copyright Act. Second, a use of a work would only benefit from the open norm if it passes the so-called three-step test, which takes the interests of the author or right holder into account. The first category of economic effects of introducing an open norm is that for some known uses that otherwise require licensing, the open norm would allow unlicensed use. This potentially reduces the reward to the creator of a work and therefore decreases the incentive to create. By contrast, it is also likely to reduce the creator’s costs of using another work as an input when producing a new work, and therefore to increase the incentive to create. It is difficult to predict which of these two opposing effects ultimately tips the scale in specific markets. Traditional creators generally worry about the negative effect on their reward and seem to believe that the first effect dominates. For businesses that use large numbers of protected works as an input for their services, such as Google, the opposite is true. They emphasise the benefits of reduced input costs and are likely to improve their legal position with an open norm.
Collective rights management organisations in turn fear that their bargaining power vis-à-vis users like UCC platforms, such as YouTube, would suffer from an open norm. However, given the design of the open norm, it is unlikely that rewards for creators would be significantly affected. The application of the open norm by the courts tests for adverse effects on the business model of the rights holder (the previously mentioned three-step test). In the case of severe adverse effects on the rights holder, the open norm does not apply. The shift in bargaining power from rights holders to user (platforms) is limited to cases that are currently licensed and where parties are sufficiently confident that the use benefits from the open norm. The second category of economic effects of introducing an open norm is that the legal delineation between infringement and permissible use becomes capable of accommodating developments in technology and society. This enables entrepreneurs to develop new products and services that rely on currently unforeseen uses of protected material. On the downside, flexibility may reduce legal certainty in the short run, until jurisprudence on the practice of flexible copyright has developed. The countries that have recently introduced an open norm in their copyright laws have not produced any ex-ante or ex-post studies on the magnitude of these economic effects. The case study of fair use in Israel shows that the change may decrease legal certainty in the short run (as case law needs time to develop), but improve legal certainty in the longer run, as the legal position of acts that do not ‘fit’ a rigid system with an exhaustive list of static exceptions is clarified. In sum, the main effects of introducing an open norm seem to be of a legal nature: it changes the legal position of some businesses and therefore affects the costs these businesses incur to comply with copyright. ‘Tomorrow’s inventions’ are likely to be facilitated by an open norm.
Since most businesses do not currently seem to be chilled by the lack of flexibility, the effect on the products and services available in the market is likely to be secondary to the legal effects.

    Economic comparison of reactive distillation (RD) to a benchmark conventional flowsheet:Regions of RD applicability and trends in column design

    A novel methodology for the techno-economic assessment of Reactive Distillation (RD) is presented. The developed methodology benchmarks reactive distillation (RD) against a conventional reactor + distillation train flowsheet (R+D) on a cost-optimized basis, with the optimization being performed on the process unit level (reactor sizing, number of stages, feed point(s)) and the internals level (reactive tray design). This methodology is applied to the ideal quaternary system A+B↔C+D with the conventional boiling point order of TC < TA < TB < TD (αAD = 4, αBD = 2, αCD = 8). From this pool of data, a regime map of RD vs. R+D is established in which the attractive regions of either flowsheet option are identified in terms of the chemical reaction rate and chemical equilibrium. It is found that RD can arise as the cost-optimal option for a large range of residence time requirements by virtue of overcoming the external recycle requirements of R+D. This is achieved through optimized reactive tray design. Contrary to conventional distillation design practices, it was found that the preferred use of bubble-cap trays over sieve trays to allow elevated weir heights and designing the column diameter below 80% of flooding become relevant design choices when accommodating high liquid holdup.
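    The equilibrium limitation that RD overcomes by continuously removing products can be sketched with a short calculation for the equimolar A+B↔C+D system above (illustrative only; the equilibrium constant K and conversion X are textbook quantities, and nothing here comes from the study's cost model):

```python
def equilibrium_conversion(K_eq):
    """Equilibrium conversion X for an equimolar A + B <-> C + D feed.

    At equilibrium, K_eq = X**2 / (1 - X)**2, so X = sqrt(K)/(1 + sqrt(K)).
    A conventional R+D flowsheet is capped at this X per reactor pass and
    needs external recycle; RD can exceed it by drawing off C and D as
    they form on the reactive trays.
    """
    s = K_eq ** 0.5
    return s / (1.0 + s)

# A reaction with K_eq = 1 stalls at 50% conversion per pass
print(equilibrium_conversion(1.0))  # prints 0.5
```

    This is why the regime map is drawn in terms of reaction rate and chemical equilibrium: the further K_eq drops below unity, the larger the recycle an R+D flowsheet needs, and the stronger the case for RD.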

    Anticipation of cartel and merger control

    The Netherlands Competition Authority (NMa) enforces the Dutch Competition Act. Under the Competition Act it is, among other things, prohibited to make agreements or carry out mergers that restrict competition. To what extent do firms and their advisers take this supervision by the NMa into account? In other words, what is the anticipation effect of competition enforcement? When anticompetitive behaviour does not take place because of this supervision, there is an anticipation effect. SEO Economisch Onderzoek investigated this effect for concentrations and cartels by means of surveys of firms and advisers and econometric analysis. Surveys of firms show that 5% of notified mergers were modified before notification to remove possible competition concerns. For every 100 merger notifications, there are 13 intended mergers that, because of concentration control, are never pursued to the point of notification. Firms also take the Competition Act into account when drafting contracts, attending meetings, or holding consultations. The survey of lawyers and other advisers shows that for every sanction decision by the NMa there are almost 5 cases unknown to the NMa in which prohibited conduct was stopped or modified on the basis of competition-law advice. Which factors influence the size of the anticipation effect? Econometric analysis was used to examine the effects of market share, remedies, the costs of notification, and the duration of the notification phase. For cartel enforcement, the effects of the NMa Agenda, the leniency programme, personal fines, corporate fines, and negative publicity were examined.

    A tilted interference filter in a converging beam

    Context. Narrow-band interference filters can be tuned toward shorter wavelengths by tilting them from the perpendicular to the optical axis. This can be used as a cheap alternative to real tunable filters, such as Fabry-Pérot interferometers and Lyot filters. At the Swedish 1-m Solar Telescope, such a setup is used to scan through the blue wing of the Ca II H line. Because the filter is mounted in a converging beam, the incident angle varies over the pupil, which causes a variation of the transmission over the pupil, different for each wavelength within the passband. This causes broadening of the filter transmission profile and degradation of the image quality. Aims. We want to characterize the properties of our filter, at normal incidence as well as at different tilt angles. Knowing the broadened profile is important for the interpretation of the solar images. Compensating the images for the degrading effects will improve the resolution and remove one source of image contrast degradation. In particular, we need to solve the latter problem for images that are also compensated for blurring caused by atmospheric turbulence. Methods. We simulate the process of image formation through a tilted interference filter in order to understand the effects. We test the hypothesis that they are separable from the effects of wavefront aberrations for the purpose of image deconvolution. We measure the filter transmission profile and the degrading PSF from calibration data. Results. We find that the filter transmission profile differs significantly from the specifications. We demonstrate how to compensate for the image-degrading effects. Because the filter tilt effects indeed appear to be separable from wavefront aberrations in a useful way, this can be done in a final deconvolution, after standard image restoration with MFBD/Phase Diversity based methods. We illustrate the technique with real data
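    The blueshift produced by tilting can be sketched with the standard thin-film tuning relation (a minimal illustration, not the paper's calibration; the effective refractive index n_eff is a filter property not given in the abstract, so the value used below is a placeholder):

```python
import math

def tilted_passband_center(lambda_0, theta_deg, n_eff):
    """Central wavelength of an interference filter tilted by theta.

    Standard thin-film tilt-tuning relation:
        lambda(theta) = lambda_0 * sqrt(1 - (sin(theta) / n_eff)**2)
    lambda_0 : passband centre at normal incidence (any unit; returned unit)
    n_eff    : effective index of the filter spacer (placeholder value below)
    """
    s = math.sin(math.radians(theta_deg)) / n_eff
    return lambda_0 * math.sqrt(1.0 - s * s)

# Example: a filter centred near the Ca II H line (3968 Angstrom) tilted
# by 5 degrees, assuming n_eff = 2.0, shifts a few Angstrom blueward.
print(tilted_passband_center(3968.0, 5.0, 2.0))
```

    In a converging beam the incidence angle spans a range over the pupil, so each pupil point sees a slightly different passband centre; summing these shifted profiles is what broadens the effective transmission curve described above.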

    The IRX-beta relation on sub-galactic scales in star-forming galaxies of the Herschel Reference Survey

    UV and optical surveys are essential to gain insight into the processes driving galaxy formation and evolution. The rest-frame UV emission is key to measuring the cosmic SFR. However, UV light is strongly reddened by dust. In starburst galaxies, the UV colour and the attenuation are linked, making it possible to correct for dust extinction. Unfortunately, evidence has been accumulating that the relation between UV colour and attenuation is different for normal star-forming galaxies when compared to starburst galaxies. It is still not understood why star-forming galaxies deviate from the UV colour-attenuation relation of starburst galaxies. Previous work and models hint that the shape of the attenuation curve and the age of the stellar populations play an important role. In this paper we aim to understand the fundamental reasons for this deviation. We have used the CIGALE SED fitting code to model the far-UV to far-IR emission of a set of 7 reasonably face-on spiral galaxies from the HRS. We have explored the influence of a wide range of physical parameters to quantify their impact on the accurate determination of the attenuation from the UV colour, and why normal galaxies do not follow the same relation as starburst galaxies. We have found that the deviation can be best explained by intrinsic UV colour differences between different regions in galaxies. Variations in the shape of the attenuation curve can also play a secondary role. Standard age estimators of the stellar populations prove to be poor predictors of the intrinsic UV colour. These results are also retrieved on a sample of 58 galaxies when considering their integrated fluxes. When correcting the emission of normal star-forming galaxies for the attenuation, it is crucial to take into account possible variations in the intrinsic UV colour as well as variations of the shape of the attenuation curve. Comment: Accepted for publication in A&A, 18 pages, 14 figures. The paper with high resolution figures can be downloaded at http://www.oamp.fr/people/mboquien/HRS/boquien_IRX_beta.pd

    DustPedia: Multiwavelength photometry and imagery of 875 nearby galaxies in 42 ultraviolet-microwave bands

    Aims. The DustPedia project is capitalising on the legacy of the Herschel Space Observatory, using cutting-edge modelling techniques to study dust in the 875 DustPedia galaxies – representing the vast majority of extended galaxies within 3000 km s-1 that were observed by Herschel. This work requires a database of multiwavelength imagery and photometry that greatly exceeds the scope (in terms of wavelength coverage and number of galaxies) of any previous local-Universe survey. Methods. We constructed a database containing our own custom Herschel reductions, along with standardised archival observations from GALEX, SDSS, DSS, 2MASS, WISE, Spitzer, and Planck. Using these data, we performed consistent aperture-matched photometry, which we combined with external supplementary photometry from IRAS and Planck. Results. We present our multiwavelength imagery and photometry across 42 UV-microwave bands for the 875 DustPedia galaxies. Our aperture-matched photometry, combined with the external supplementary photometry, represents a total of 21 857 photometric measurements. A typical DustPedia galaxy has multiwavelength photometry spanning 25 bands. We also present the Comprehensive & Adaptable Aperture Photometry Routine (CAAPR), the pipeline we developed to carry out our aperture-matched photometry. CAAPR is designed to produce consistent photometry for the enormous range of galaxy and observation types in our data. In particular, CAAPR is able to determine robust cross-compatible uncertainties, thanks to a novel method for reliably extrapolating the aperture noise for observations that cover a very limited amount of background. Our rich database of imagery and photometry is being made available to the community.
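    The core idea of aperture-matched photometry, one fixed aperture applied to every band on a common pixel grid, can be sketched as follows (a minimal illustration; CAAPR itself also handles background subtraction, foreground-star removal, and the aperture-noise extrapolation mentioned above, and the function name here is our own):

```python
import numpy as np

def elliptical_aperture_sum(image, x0, y0, a, b, pa_deg):
    """Sum pixel values inside an elliptical aperture.

    image      : 2D array on a common pixel grid
    (x0, y0)   : aperture centre in pixel coordinates
    a, b       : semi-major and semi-minor axes in pixels
    pa_deg     : position angle of the major axis, degrees
    """
    ny, nx = image.shape
    y, x = np.mgrid[0:ny, 0:nx]
    pa = np.radians(pa_deg)
    dx, dy = x - x0, y - y0
    # Rotate pixel offsets into the aperture frame
    u = dx * np.cos(pa) + dy * np.sin(pa)
    v = -dx * np.sin(pa) + dy * np.cos(pa)
    inside = (u / a) ** 2 + (v / b) ** 2 <= 1.0
    return image[inside].sum()
```

    Applying this one aperture (defined once per galaxy) to reprojected images in all 42 bands is what makes the resulting fluxes directly comparable across wavelengths.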

    The Imaging Magnetograph eXperiment (IMaX) for the Sunrise balloon-borne solar observatory

    The Imaging Magnetograph eXperiment (IMaX) is a spectropolarimeter built by four institutions in Spain that flew on board the Sunrise balloon-borne telescope in June 2009 for almost six days over the Arctic Circle. As a polarimeter, IMaX uses fast polarization modulation (based on two liquid crystal retarders), real-time image accumulation, and dual-beam polarimetry to reach polarization sensitivities of 0.1%. As a spectrograph, the instrument uses a LiNbO3 etalon in double pass and a narrow-band pre-filter to achieve a spectral resolution of 85 mÅ. IMaX uses the highly Zeeman-sensitive line of Fe I at 5250.2 Å and observes all four Stokes parameters at various points inside the spectral line. This allows vector magnetograms, Dopplergrams, and intensity frames to be produced that, after reconstruction, reach spatial resolutions in the 0.15-0.18 arcsec range over a 50x50 arcsec FOV. Time cadences vary between ten and 33 seconds, although the shortest one only includes longitudinal polarimetry. The spectral line is sampled in various ways depending on the applied observing mode, from just two points inside the line to 11 of them. All observing modes include one extra wavelength point in the nearby continuum. Gauss-equivalent sensitivities are four Gauss for longitudinal fields and 80 Gauss for transverse fields per wavelength sample. The LOS velocities are estimated with statistical errors of the order of 5-40 m/s. The design, calibration and integration phases of the instrument, together with the implemented data reduction scheme, are described in some detail. Comment: 17 figures
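    The Gauss-equivalent longitudinal sensitivity quoted above can be connected to the Stokes data through the weak-field approximation, a textbook way to estimate the line-of-sight field from sampled I and V profiles (a sketch under standard assumptions, not the IMaX inversion pipeline; the effective Landé factor of Fe I 5250.2 Å is 3.0):

```python
import numpy as np

C = 4.67e-13  # weak-field constant, for wavelengths in Angstrom and B in Gauss

def blos_weak_field(wav, stokes_i, stokes_v, lambda0, geff):
    """Least-squares estimate of the line-of-sight field, weak-field regime.

    The weak-field approximation states
        V(lambda) = -C * lambda0**2 * geff * B_los * dI/dlambda,
    so B_los follows from a linear least-squares fit of V against
    -C * lambda0**2 * geff * dI/dlambda over the sampled points.
    """
    didl = np.gradient(stokes_i, wav)
    x = -C * lambda0**2 * geff * didl
    return np.sum(x * stokes_v) / np.sum(x * x)
```

    With only a handful of wavelength points per line (two to 11 in the observing modes above), such least-squares use of every sample is what keeps the noise-equivalent field down at the few-Gauss level.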