137 research outputs found
Anything You Can Use, I Can Use Better: Examining the Contours of Fair Use as an Affirmative Defense for Theatre Artists, Creators, and Producers
Broadway is booming. In a post-Hamilton world, ticket sales and attendance records for the commercial theatre industry continue to break season after season. At the same time (and perhaps not so coincidentally), litigation against theatre artists, creators, and producers has surged, especially in the realm of copyright infringement. Many theatre professionals accused of infringement in recent years have invoked the doctrine of fair use, codified at 17 U.S.C. § 107, as an affirmative defense against such claims. This Note explores cases involving theatre professionals in which fair use was examined and contends that they collectively reflect broader historical trends in fair use jurisprudence. In particular, this Note argues that the fair use doctrine remains analytically unclear and difficult to follow, and proposes that the transformative use inquiry, articulated in 1994 by the Supreme Court in Campbell v. Acuff-Rose Music, Inc., be abandoned in future fair use analyses in favor of expressly following the four statutory factors enumerated in 17 U.S.C. § 107. Lastly, this Note directly addresses theatre artists, creators, and producers, advising them that when writing, developing, or mounting a new theatrical production, any reliance on the fair use doctrine ought to be avoided; instead, alternative avenues should be explored to circumvent copyright ownership challenges.
Inference in receiver operating characteristic surface analysis via a trinormal model-based testing approach
Receiver operating characteristic (ROC) analysis is the methodological framework of choice for the assessment of diagnostic markers and classification procedures in general, in both two-class and multiple-class classification problems. We focus on the three-class problem, for which inference usually involves formal hypothesis testing using a proxy metric such as the volume under the ROC surface (VUS). In this article, we develop an existing approach from the two-class ROC framework. We define a hypothesis-testing procedure that directly compares two ROC surfaces under the assumption of the trinormal model. In the case of the assessment of a single marker, the corresponding ROC surface is compared with the chance plane, that is, with an uninformative marker. A simulation study comparing the proposed tests with existing ones based on the VUS metric follows. Finally, the proposed methodology is applied to a dataset of a panel of pancreatic cancer diagnostic markers. The described testing procedures, along with related graphical tools, are supported in the corresponding R package trinROC, which we have developed for this purpose.
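The VUS metric used in the abstract above has a simple empirical counterpart: the fraction of cross-class triples (one observation per class) that fall in the correct order. A minimal Python sketch, assuming higher marker values for more severe classes; the class means and sample sizes below are invented for illustration and are not the trinROC package's API:

```python
import numpy as np

def empirical_vus(x1, x2, x3):
    """Empirical volume under the ROC surface for a three-class marker:
    the proportion of triples (a, b, c), one value per class, with a < b < c."""
    x1, x2, x3 = map(np.asarray, (x1, x2, x3))
    # Count correctly ordered triples via broadcasting over all combinations.
    ordered = (
        (x1[:, None, None] < x2[None, :, None])
        & (x2[None, :, None] < x3[None, None, :])
    )
    return ordered.sum() / (x1.size * x2.size * x3.size)

# Simulated trinormal marker: three classes with increasing means.
rng = np.random.default_rng(0)
class1 = rng.normal(0.0, 1.0, 200)
class2 = rng.normal(1.5, 1.0, 200)
class3 = rng.normal(3.0, 1.0, 200)
vus = empirical_vus(class1, class2, class3)
# An uninformative marker yields VUS = 1/6; a well-separated one approaches 1.
```

Under the trinormal model the paper tests hypotheses about this quantity in closed form; the brute-force estimate here is only meant to make the metric concrete.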
A geological 3D-model of Austria
GeoSphere Austria (formerly Geologische Bundesanstalt, the Geological Survey of Austria) has produced a supra-regional 3D framework model called "3D AUSTRIA", providing a large-scale geological overview for professional geologists, students, and the public. This model is intended to support subsequent regional modelling projects as well as educational and communication purposes.
The modelled domain covers a rectangular area of 175,000 km² including the national borders of Austria, down to a depth of 60 km below sea level. Model units are defined following the nomenclature of Schmid et al. (2004) and Froitzheim et al. (2008), each unit having a specific paleo-geographic origin and tectono-metamorphic history. Seven modelling units are considered: two continental plates, (1) the Adriatic Plate and (2) the Eurasian Plate; four units from the Alpine orogenic wedge, (3) the South-Alpine Superunit, (4) the Austroalpine Superunit, (5) the Penninic Superunit, and (6) the Sub-Penninic Superunit; and (7) Neogene sedimentary basins in the foreland and within the Alps. Due to the large-scale character of the model, relatively small constituents of the Alpine Orogen are merged (the Meliata Superunit and Inner Western Carpathian Superunit with the Austroalpine Superunit, the Helvetic Superunit and Allochthonous Molasse with the Sub-Penninic Superunit, intrusive rocks along the Periadriatic Fault with their host unit, and minor Neogene basins with the Austroalpine Superunit). The model geometry is constrained by (1) the geological map of Austria 1:1.5M (Schuster et al., 2019), (2) 24 published cross sections, and (3) published contour maps for the Moho discontinuity (Ziegler & Dèzes, 2006) and the large Neogene basins. It has been generated with the SKUA-GOCAD software suite following the workflow of Pfleiderer et al. (2016).
The framework model 3D AUSTRIA can be visualized with the web 3D Viewer of GeoSphere Austria (https://gis.geosphere.at/portal/home/webscene/viewer.html?webscene=c11cd25795294ba8b6f276ab2d072afb) or downloaded from the Tethys Research Data Repository (https://doi.tethys.at/10.24341/tethys.184), allowing the generation of a physical multi-part model using 3D printing technology. It provides a unique way to comprehend the fundamentally 3D nature of sedimentary and tectonic features, such as the unconformity at the base of the Neogene sedimentary basins, the Alpine frontal thrust, or the Tauern Window. The data acquired in the framework of the AlpArray project can be used in the future to refine the geometry of 3D AUSTRIA.
Inference on the symmetry point-based optimal cut-off point and associated sensitivity and specificity with application to SARS-CoV-2 antibody data
In the presence of a continuous response test/biomarker, it is often necessary to identify a cut-off point value to aid binary classification between diseased and non-diseased subjects. The symmetry-point approach, which simultaneously maximizes both types of correct classification, is one way to determine an optimal cut-off point. In this article, we study methods for constructing confidence intervals independently for the symmetry point and its corresponding sensitivity, as well as respective joint nonparametric confidence regions. We illustrate using data on the antibodies elicited two weeks after the second dose of the Pfizer/BioNTech vaccine in adult healthcare workers.
Acknowledgments: This work was supported by grant PID2019-104681RB-I00. Data courtesy of Dr Konstantina Kontopoulou.
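The symmetry point described in this abstract has a straightforward empirical analogue: scan candidate thresholds and keep the one at which sensitivity and specificity are closest to equal. A minimal Python sketch on simulated data; the normal parameters are invented for illustration and are not the SARS-CoV-2 antibody values:

```python
import numpy as np

def symmetry_point(controls, cases):
    """Empirical symmetry-point cut-off: the threshold c at which
    sensitivity P(case >= c) and specificity P(control < c) are as
    close to equal as possible. Assumes higher values indicate disease."""
    controls = np.asarray(controls)
    cases = np.asarray(cases)
    candidates = np.unique(np.concatenate([controls, cases]))
    sens = np.array([np.mean(cases >= c) for c in candidates])
    spec = np.array([np.mean(controls < c) for c in candidates])
    best = np.argmin(np.abs(sens - spec))
    return candidates[best], sens[best], spec[best]

# Simulated marker: healthy ~ N(0, 1), diseased ~ N(2, 1); by symmetry
# the theoretical cut-off is 1.0, with Se = Sp ≈ 0.84.
rng = np.random.default_rng(1)
healthy = rng.normal(0.0, 1.0, 500)
diseased = rng.normal(2.0, 1.0, 500)
cutoff, se, sp = symmetry_point(healthy, diseased)
```

The article's contribution is inference (confidence intervals and joint confidence regions) around this point estimate; the sketch only shows the point estimate itself.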
Automatic discovery of photoisomerization mechanisms with nanosecond machine learning photodynamics simulations
Photochemical reactions are widely used by academic and industrial researchers to construct complex molecular architectures via mechanisms that often require harsh reaction conditions. Photodynamics simulations provide time-resolved snapshots of the molecular excited-state structures required to understand and predict reactivities and chemoselectivities. Molecular excited states are often nearly degenerate and require computationally intensive multiconfigurational quantum mechanical methods, especially at conical intersections. Non-adiabatic molecular dynamics require thousands of these computations per trajectory, which limits simulations to ~1 picosecond for most organic photochemical reactions. Westermayr et al. recently introduced a neural-network-based method to accelerate the predictions of electronic properties and pushed the simulation limit to 1 ns for the model system, the methylenimmonium cation (CH2NH2+). We have adapted this methodology to develop the Python-based Python Rapid Artificial Intelligence Ab Initio Molecular Dynamics (PyRAIMD) software for the cis-trans isomerization of trans-hexafluoro-2-butene and the 4π-electrocyclic ring-closing of a norbornyl hexacyclodiene. We performed a 10 ns simulation for trans-hexafluoro-2-butene in just 2 days; the same simulation would take approximately 58 years with traditional multiconfigurational photodynamics simulations. We generated training data by combining Wigner sampling, geometrical interpolations, and short-time quantum chemical trajectories to adaptively sample sparse data regions along reaction coordinates. The final data sets for the cis-trans isomerization and the 4π-electrocyclic ring-closing model have 6207 and 6267 data points, respectively. The training errors in energy using feedforward neural networks achieved chemical accuracy (0.023-0.032 eV).
The neural network photodynamics simulations of trans-hexafluoro-2-butene agree with the quantum chemical calculations, showing the formation of the cis-product and a reactive carbene intermediate. The neural network trajectories of the norbornyl cyclohexadiene corroborate the low-yielding syn-product, which was absent in the quantum chemical trajectories, and reveal subsequent thermal reactions within 1 ns.
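Of the training-data generation steps listed in the abstract, Wigner sampling has a compact closed form for harmonic modes: the ground vibrational state's Wigner distribution is Gaussian in both position and momentum. A minimal one-mode Python sketch (units with hbar = 1; the mass and frequency values are illustrative, not taken from the molecules studied):

```python
import numpy as np

def wigner_sample(n, mass, omega, hbar=1.0, seed=0):
    """Draw n (position, momentum) pairs from the Wigner distribution of a
    harmonic oscillator's vibrational ground state, which is Gaussian with
    widths sigma_q = sqrt(hbar / (2 m omega)) and sigma_p = sqrt(m hbar omega / 2)."""
    rng = np.random.default_rng(seed)
    sigma_q = np.sqrt(hbar / (2.0 * mass * omega))
    sigma_p = np.sqrt(mass * hbar * omega / 2.0)
    q = rng.normal(0.0, sigma_q, n)
    p = rng.normal(0.0, sigma_p, n)
    return q, p

q, p = wigner_sample(10000, mass=1.0, omega=1.0)
# Sanity check (with m = omega = 1): the mean classical energy of the
# ensemble, <p^2/2m + m omega^2 q^2/2>, should approach the zero-point
# energy hbar*omega/2 = 0.5.
energy = np.mean(p**2 / 2.0 + 0.5 * q**2)
```

In practice each normal mode of the molecule is sampled this way and the displacements are mapped back to Cartesian geometries, giving quantum-consistent initial conditions for the trajectories.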
Baker Center Journal of Applied Public Policy - Vol. IV, No. I
This is the fourth volume of the Baker Center Journal of Applied Public Policy. It includes articles on nuclear nonproliferation, American political development, election issues, Tennessee state trial courts, and attitudes toward rich and poor people, as well as two student articles: one on science, innovation, technology, and economic growth, and one on explosive trace detection at airports.
Percutaneous CT fluoroscopy-guided core biopsy of pancreatic lesions: technical and clinical outcome of 104 procedures during a 10-year period
Background: In unclear pancreatic lesions, a tissue sample can confirm or exclude the suspected diagnosis and help provide an optimal treatment strategy for each patient. To date, only one small study has reported on the outcome of computed tomography (CT) fluoroscopy-guided biopsies of the pancreas. Purpose: To evaluate the technical success and diagnostic rate of all CT fluoroscopy-guided core biopsies of the pancreas performed in a single university center during a 10-year period. Material and Methods: In this retrospective study, we included all patients who underwent a CT fluoroscopy-guided biopsy of a pancreatic mass at our comprehensive cancer center between 2005 and 2014. All interventions were performed under local anesthesia on a 16-row or 128-row CT scanner. Technical success and diagnostic rates, as well as complications and effective patient radiation dose, were analyzed. Results: One hundred and one patients (54 women; mean age, 63.9 ± 12.6 years) underwent a total of 104 CT fluoroscopy-guided biopsies of the pancreas. Ninety-eight of 104 interventions (94.2%) were technically successful, with at least one tissue sample obtained. In 88 of these 98 samples, a definitive pathological diagnosis, consistent with clinical success, could be achieved (89.8%). Overall, 19 minor and three major complications occurred during the intervention or the 30-day post-interventional period; all other interventions were performed without complications, and there was no death attributable to the intervention. Conclusion: CT fluoroscopy-guided biopsy of pancreatic lesions is an effective procedure characterized by a low major complication rate and a high diagnostic rate.
Classifying malware attacks in IaaS cloud environments
In the last few years, research has been motivated to provide a categorization and classification of the security concerns accompanying the growing adoption of Infrastructure as a Service (IaaS) clouds. Studies have been motivated by the risks, threats, and vulnerabilities imposed by the components within the environment and have provided general classifications of related attacks, as well as the respective detection and mitigation mechanisms. Virtual Machine Introspection (VMI) has proven to be an effective tool for malware detection and analysis in virtualized environments. In this paper, we classify attacks in IaaS clouds that can be investigated using VMI-based mechanisms. This implies a special focus on attacks that directly involve Virtual Machines (VMs) deployed in an IaaS cloud. Our classification methodology takes into consideration the source, target, and direction of the attacks. As each actor in a cloud environment can be both a source and a target of attacks, the classification gives any cloud actor the necessary knowledge of the different attacks by which it can threaten or be threatened, and consequently deploy adapted VMI-based monitoring architectures. To highlight the relevance of the attacks, we provide a statistical analysis of the reported vulnerabilities exploited by the classified attacks and of their financial impact on actual business processes.
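The source/target/direction scheme described above can be sketched as a small data model that answers, for a given actor, "which attacks can threaten me, and which can I originate?". A hedged Python sketch; the actor names, direction labels, and catalog entries below are illustrative placeholders, not the paper's actual taxonomy:

```python
from dataclasses import dataclass
from enum import Enum

class Actor(Enum):
    GUEST_VM = "guest VM"
    CO_RESIDENT_VM = "co-resident VM"
    HYPERVISOR = "hypervisor"
    EXTERNAL = "external network"

class Direction(Enum):
    INBOUND = "into the cloud"
    OUTBOUND = "out of the cloud"
    INTERNAL = "within the cloud"

@dataclass(frozen=True)
class Attack:
    name: str
    source: Actor
    target: Actor
    direction: Direction

def threats_against(catalog, actor):
    """Attacks in which the given cloud actor is the target."""
    return [a for a in catalog if a.target is actor]

def threats_from(catalog, actor):
    """Attacks in which the given cloud actor is the source."""
    return [a for a in catalog if a.source is actor]

# Hypothetical catalog entries for illustration only.
catalog = [
    Attack("VM escape", Actor.GUEST_VM, Actor.HYPERVISOR, Direction.INTERNAL),
    Attack("cross-VM side channel", Actor.CO_RESIDENT_VM, Actor.GUEST_VM, Direction.INTERNAL),
    Attack("remote malware infection", Actor.EXTERNAL, Actor.GUEST_VM, Direction.INBOUND),
    Attack("spam relay from compromised VM", Actor.GUEST_VM, Actor.EXTERNAL, Direction.OUTBOUND),
]
```

Querying the catalog from both sides, as the classification intends, then reduces to `threats_against(catalog, actor)` and `threats_from(catalog, actor)` for each actor, which can inform where VMI-based monitoring should be placed.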
- …