Television's new engines
Internationalization is key to the success of television formats. To understand format trade it is necessary to draw distinctions between formats and genres. Engines—innovations in programming engineered by format devisors—allow formats to regenerate and hybridize across genres. The core principle of formats, however, is the practice of franchising. Causal relations can be established between formats, engines and the tradability of television culture. The article shows how formats have affected platforms, markets, labor, audiences and the distribution of TV content.
Put On Your Slippers And Fill Up Your Pipe: You're Not Going Bye-Bye To-Night
https://digitalcommons.library.umaine.edu/mmb-vp/4293/thumbnail.jp
Drama without drama: The late rise of scripted TV formats
This article revisits the history of TV formats – concepts of TV shows that are licensed for local adaptation – focusing on scripted entertainment. While the TV format revolution of the 1990s bypassed scripted formats, they have been catching up in recent years. This paper analyses both the reasons for this late rise and the factors behind the recent growth. It argues that the adaptation of scripted formats is more complex, and the risks remain higher, than for other genres. The underlying economics of their production and distribution also differ from those of non-scripted formats. The stars aligned when demand for drama increased worldwide: Hollywood studios began to mine their catalogues, new exporters and scripted genres emerged, and knowledge-transfer techniques improved. Finally, this paper analyses the significance of the rise of scripted entertainment in the global TV format trading system.
Leveraging GPT-4 for identifying cancer phenotypes in electronic health records: A performance comparison between GPT-4, GPT-3.5-turbo, Flan-T5, Llama-3-8B, and spaCy's rule-based and machine learning-based methods
OBJECTIVE: Accurately identifying clinical phenotypes from Electronic Health Records (EHRs) provides additional insights into patients' health, especially when such information is unavailable in structured data. This study evaluates the application of OpenAI's Generative Pre-trained Transformer (GPT)-4 model to identify clinical phenotypes from EHR text in non-small cell lung cancer (NSCLC) patients. The goal was to identify disease stages, treatments and progression utilizing GPT-4, and compare its performance against GPT-3.5-turbo, Flan-T5-xl, Flan-T5-xxl, Llama-3-8B, and 2 rule-based and machine learning-based methods, namely, scispaCy and medspaCy.
MATERIALS AND METHODS: Phenotypes such as initial cancer stage, initial treatment, evidence of cancer recurrence, and affected organs during recurrence were identified from 13 646 clinical notes for 63 NSCLC patients from Washington University in St. Louis, Missouri. The performance of the GPT-4 model is evaluated against GPT-3.5-turbo, Flan-T5-xxl, Flan-T5-xl, Llama-3-8B, medspaCy, and scispaCy by comparing precision, recall, and micro-F1 scores.
RESULTS: GPT-4 achieved higher F1 scores, precision, and recall than the Flan-T5-xl, Flan-T5-xxl, Llama-3-8B, medspaCy, and scispaCy models. GPT-3.5-turbo performed comparably to GPT-4. The GPT, Flan-T5, and Llama models were not constrained by explicit rule requirements for contextual pattern recognition, whereas the spaCy models relied on predefined patterns, leading to their suboptimal performance.
DISCUSSION AND CONCLUSION: GPT-4 improves clinical phenotype identification due to its robust pre-training and strong pattern-recognition capability over the embedded tokens. It demonstrates data-driven effectiveness even with limited context in the input. While rule-based models remain useful for some tasks, GPT models offer improved contextual understanding of the text and robust clinical phenotype extraction.
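The study compares models by micro-averaged precision, recall, and F1. A minimal sketch of how micro scores are computed: true/false positives and false negatives are pooled across all phenotype classes before the ratios are taken. The per-class counts below are hypothetical, not taken from the paper.

```python
# Micro-averaged precision/recall/F1: pool TP/FP/FN across classes first,
# then compute the ratios once on the pooled counts.

def micro_scores(counts):
    """counts: list of (tp, fp, fn) tuples, one per phenotype class."""
    tp = sum(c[0] for c in counts)
    fp = sum(c[1] for c in counts)
    fn = sum(c[2] for c in counts)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Hypothetical per-class counts: stage, treatment, recurrence, affected organ
counts = [(40, 5, 10), (30, 10, 5), (25, 5, 5), (20, 10, 10)]
p, r, f1 = micro_scores(counts)  # each ~0.793 for these counts
```

Micro averaging weights frequent classes more heavily than macro averaging, which matters when phenotype classes are imbalanced across 13 646 notes.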
Dynamic Peer-to-Peer Competition
The dynamic behavior of a multiagent system in which agent size is variable is studied within a Lotka-Volterra approach. Agent size here means the fraction of a given market that an agent is able to capture (its market share). A Lotka-Volterra system of equations for prey-predator problems is considered, with the competition factor related to the difference in size between agents in one-on-one competition. This mechanism introduces a natural, self-organized dynamic competition among agents. A parameter is introduced into the competition factor to scale the intensity of agent-size similarity, which varies in each iteration cycle. The fixed points of this system are found analytically and their stability analyzed for small systems. Different scenarios are possible, from chaotic to non-chaotic motion with cluster formation, as a function of this parameter and depending on the initial conditions imposed on the system. The aim of the present contribution is to show how a realistic though minimalist nonlinear dynamics model can be used to describe market competition (companies, brokers, decision makers) among other opinion-maker communities.
Comment: 17 pages, 50 references, 6 figures, 1 table
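A minimal numerical sketch of this kind of size-based Lotka-Volterra competition. The specific coupling kernel, the parameter value, and the renormalization step below are illustrative assumptions, not the paper's equations: each agent's share grows or shrinks in proportion to its size difference with its competitors.

```python
import numpy as np

# Sketch of Lotka-Volterra-style market-share dynamics among N agents.
# The coupling below sums the size differences with all competitors;
# with total share normalized to 1, it drives shares toward equality.
rng = np.random.default_rng(0)
N = 5
beta = 0.5                 # assumed scaling parameter for the coupling strength
x = rng.random(N)
x /= x.sum()               # market shares sum to 1

def step(x, beta, dt=0.01):
    growth = np.empty_like(x)
    for i in range(N):
        # pairwise competition factor: size difference with each competitor
        coupling = sum(beta * (x[j] - x[i]) for j in range(N) if j != i)
        growth[i] = x[i] * coupling     # logistic-type LV interaction term
    x = np.clip(x + dt * growth, 0.0, None)
    return x / x.sum()                  # renormalize to total market

for _ in range(1000):
    x = step(x, beta)
```

Iterating the map preserves the two market invariants (non-negative shares summing to 1), which is what makes the fixed-point and stability analysis in the abstract well posed.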
Big Data Strategies for Data Center Infrastructure Management Using a 3D Gaming Platform
High Performance Computing (HPC) is intrinsically linked to effective Data
Center Infrastructure Management (DCIM). Cloud services and HPC have become key
components in Department of Defense and corporate Information Technology
competitive strategies in the global and commercial spaces. As a result, the
reliance on consistent, reliable Data Center space is more critical than ever.
The costs and complexity of providing quality DCIM are constantly being tested
and evaluated by the United States Government and companies such as Google,
Microsoft and Facebook. This paper demonstrates a system in which Big Data
strategies and 3D gaming technology are leveraged to successfully monitor and
analyze multiple HPC systems and a lights-out modular HP EcoPOD 240a Data
Center on a single platform. Big Data technology and a 3D gaming platform
enable near-real-time monitoring of 5,000 environmental sensors and more
than 3,500 IT data points, and display visual analytics of the overall operating
condition of the Data Center from a command center over 100 miles away. In
addition, the Big Data model allows for in-depth analysis of historical trends
and conditions to optimize operations, achieving even greater efficiencies and
reliability.
Comment: 6 pages; accepted to IEEE High Performance Extreme Computing (HPEC) conference 201
Diffuse Hard X-ray Emission in Starburst Galaxies as Synchrotron from Very High Energy Electrons
[Abridged] The origin of the diffuse hard X-ray (2-10 keV) emission from
starburst galaxies is a long-standing problem. We suggest that synchrotron
emission of 10 - 100 TeV electrons and positrons (e+/-) can contribute to this
emission, because starbursts have strong magnetic fields. We consider three
sources of e+/- at these energies: (1) primary electrons directly accelerated
by supernova remnants; (2) pionic secondary e+/- created by inelastic
collisions between CR protons and gas nuclei in the dense ISMs of starbursts;
(3) pair e+/- produced in interactions between 10 - 100 TeV gamma-rays
and the intense far-infrared (FIR) radiation fields of starbursts. We create
one-zone steady-state models of the CR population in the Galactic Center (R <=
112 pc), NGC 253, M82, and Arp 220's nuclei, assuming a power law injection
spectrum for electrons and protons. We compare these models to extant radio and
GeV and TeV gamma-ray data for these starbursts, and calculate the diffuse
synchrotron X-ray and Inverse Compton (IC) luminosities of these starbursts. If
the primary electron spectrum extends to ~PeV energies and has a
proton/electron injection ratio similar to the Galactic value, we find that
synchrotron contributes 2 - 20% of their unresolved, diffuse hard X-ray
emission. Inverse Compton emission is likewise a minority of the unresolved
X-ray emission in these starbursts, from 0.1% in the Galactic Center to 10% in
Arp 220's nuclei. We also model generic starbursts, including submillimeter
galaxies, in the context of the FIR--X-ray relation, finding that synchrotron
contributes up to 2% of the hard X-ray emission in the densest starbursts with
our fiducial assumptions. Neutrino and TeV
gamma-ray data can further constrain the synchrotron X-ray emission of
starbursts. Our models do not constrain hard synchrotron X-ray emission from
any additional hard components of primary e+/- from sources like pulsars in
starbursts.
Comment: Accepted by ApJ; 31 pages, emulateapj format
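An order-of-magnitude check of the core claim: 10 - 100 TeV electrons in starburst-strength magnetic fields do radiate synchrotron photons near the hard X-ray band. The sketch uses the standard characteristic-frequency estimate; the 100 microgauss field is an illustrative assumption (starburst fields are commonly estimated at this order), not a value from the paper.

```python
# Characteristic synchrotron photon energy, E_peak ~ 0.29 * (3/2) * gamma^2 * h * nu_g,
# where nu_g is the non-relativistic gyrofrequency, ~2.8 Hz per microgauss.

def sync_photon_energy_keV(E_e_TeV, B_muG):
    m_e_eV = 0.511e6                       # electron rest energy [eV]
    gamma = E_e_TeV * 1e12 / m_e_eV        # electron Lorentz factor
    nu_g = 2.8 * B_muG                     # gyrofrequency [Hz], B in microgauss
    nu_c = 0.29 * 1.5 * gamma**2 * nu_g    # peak synchrotron frequency [Hz]
    h_eV_s = 4.136e-15                     # Planck constant [eV s]
    return nu_c * h_eV_s / 1e3             # photon energy [keV]

# 100 TeV electron in an assumed 100 microgauss field: ~20 keV photons,
# i.e. at or above the 2-10 keV hard X-ray band.
E = sync_photon_energy_keV(100.0, 100.0)
```

Since the photon energy scales as the square of the electron energy, 10 TeV electrons in the same field radiate near ~0.2 keV, which is why the abstract singles out the upper end of the 10 - 100 TeV range.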
A viscoelastic deadly fluid in carnivorous pitcher plants
Background: The carnivorous plants of the genus Nepenthes, widely
distributed in the Asian tropics, rely mostly on nutrients derived from
arthropods trapped in their pitcher-shaped leaves and digested by their
enzymatic fluid. The genus exhibits a great diversity of prey and pitcher forms
and its mechanism of trapping has long intrigued scientists. The slippery inner
surfaces of the pitchers, which can be waxy or highly wettable, have so far
been considered as the key trapping devices. However, the occurrence of species
lacking such epidermal specializations but still effective at trapping insects
suggests the possible involvement of other mechanisms. Methodology/Principal
Findings: Using a combination of insect bioassays, high-speed video and
rheological measurements, we show that the digestive fluid of Nepenthes
rafflesiana is highly viscoelastic and that this physical property is crucial
for the retention of insects in its traps. Trapping efficiency is shown to
remain strong even when the fluid is highly diluted by water, as long as the
elastic relaxation time of the fluid is higher than the typical time scale of
insect movements. Conclusions/Significance: This finding challenges the common
classification of Nepenthes pitchers as simple passive traps and is of great
adaptive significance for these tropical plants, which are often subjected to
heavy rainfall and variations in fluid concentration. The viscoelastic trap
constitutes a cryptic but potentially widespread adaptation of Nepenthes
species and could be a homologous trait shared through common ancestry with the
sundew (Drosera) flypaper plants. Such a large production of a highly
viscoelastic biopolymer fluid in permanent pools is nevertheless unique in the
plant kingdom and suggests novel applications for pest control.
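The retention criterion in the abstract is a Deborah-number-style comparison: the fluid holds insects when its elastic relaxation time exceeds the timescale of their movements. A minimal sketch, with hypothetical placeholder timescales rather than measurements from the study:

```python
# Retention criterion: the fluid behaves elastically (and retains the insect)
# on timescales shorter than its relaxation time, i.e. when the
# Deborah number De = relaxation_time / movement_time exceeds 1.

def retains_insect(relaxation_time_s, movement_time_s):
    deborah = relaxation_time_s / movement_time_s
    return deborah > 1.0

# Hypothetical values: fluid relaxing over ~1 s vs. ~0.1 s leg strokes
retains_insect(1.0, 0.1)    # True: fluid responds elastically, insect is held
retains_insect(0.01, 0.1)   # False: fluid flows like a liquid, insect escapes
```

This framing also explains the dilution result: dilution matters only once it drops the relaxation time below the movement timescale.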
Public policies, law and bioethics: a framework for producing public health policy across the European Union
Unlike the duties of clinicians to patients, professional standards for ethical practice are not well defined in public health. This is mainly due to public health practice having to reconcile tensions between public and private interest(s). This involves at times being paternalistic, while recognising the importance of privacy and autonomy, and at the same time balancing the interests of some against those of others. The Public Health specialist operates at the macro level, frequently having to infer the wishes and needs of individuals that make up a population and may have to make decisions where the interests of people conflict. This is problematic when devising policy for small populations; however, it becomes even more difficult when there is responsibility for many communities or nation states. Under the Treaty on European Union, the European Commission was given a competence in public health. Different cultures will give different moral weight to protecting individual interests versus action for collective benefit. However, even subtle differences in moral preferences may cause problems in deriving public health policy within the European Union. Understanding the extent to which different communities perceive issues such as social cohesion by facilitating cultural dialogues will be vital if European institutions are to work towards new forms of citizenship. The aim of EuroPHEN was to derive a framework for producing common approaches to public health policy across Europe. Little work has been done on integrating ethical analysis with empirical research, especially on trade-offs between private and public interests. The disciplines of philosophy and public policy have been weakly connected. 
Much of the thinking on public health ethics has hitherto been conducted in the United States of America, and an ethical framework for public health within Europe would need to reflect values such as solidarity and integrity, which are more highly valued in Europe. Towards this aim, EuroPHEN compared the organisation of public health structures and public policy responses to selected public health problems in Member States to examine how public policy in different countries weighs competing claims of private and public interest. Ethical analysis was performed of the tensions between private and public interest in the context of various ethical theories, principles and traditions. During autumn 2003, 96 focus groups were held across 16 European Union Member States exploring public attitudes and values regarding public versus private interests. The groups were constructed to allow examination of differences in attitudes between countries and demographic groups (age, gender, smoking status, educational level, and parental and marital status). Focus group participants discussed issues such as attitudes to community; funding of public services; rights and responsibilities of citizens; rules and regulations; compulsory car seat belts; policies to reduce tobacco consumption; Not-In-My-Back-Yard arguments; banning of smacking of children; legalising cannabis; and parental choice with regard to immunisation. This project proposes a preliminary framework and stresses that a European policy of Public Health will have to adopt a complex, pluralistic and dynamic goal structure, capable of accommodating variations in which specific goals should be prioritised in the specific socio-economic settings of individual countries.