1,429 research outputs found
(Per)versions of U.S. History in the Cinema of Lars von Trier
I Congreso Internacional de Historia y Cine: 5-8 September 2007
Zelig, or the Deconstruction of the Biopic
Proceedings of the Second International Congress on History and Cinema, organised by the Instituto de Cultura y Tecnología Miguel de Unamuno and held from 9 to 11 September 2010 at the Universidad Carlos III de Madrid. This paper analyzes Woody Allen's Zelig (1983) within the framework of what could be called postmodern historical films. It shows how this film, set in the 1920s, 1930s and 1940s, subverts the conventions of genres such as the biographical film and the documentary through a brilliant display of formal and narrative strategies. These ultimately question, from a typically postmodern approach, settled notions about the construction of identity, be it religious, political or national, as well as the role of certain institutions, such as science or the mass media, in the project of modernity. Finally, it is argued that the film's rejection of any univocal interpretation represents a vindication of individual freedom, subjectivity, feelings and humour against the typically modern threats of collective alienation, overstated scientific positivism, and maximalist, totalitarian positions.
Landau-Ginzburg potentials via projective representations
We interpret the Landau-Ginzburg potentials associated to
Gross-Hacking-Keel-Kontsevich's partial compactifications of cluster varieties
as F-polynomials of projective representations of Jacobian algebras. Along the
way, we show that both the projective and the injective representations of
Jacobi-finite quivers with potential are well-behaved under
Derksen-Weyman-Zelevinsky's mutations of representations.
Comment: v2, 27 pages, some editorial changes
(N)NLO+NLL’ accurate predictions for plain and groomed 1-jettiness in neutral current DIS
The possibility to reanalyse data taken by the HERA experiments offers the chance to study modern QCD jet and event-shape observables in deep-inelastic scattering. To address this, we compute resummed and matched predictions for the 1-jettiness distribution in neutral current DIS, with and without grooming of the hadronic final state using the soft-drop technique. Our theoretical predictions also account for non-perturbative corrections from hadronisation through parton-to-hadron level transfer matrices extracted from dedicated Monte Carlo simulations with Sherpa. To estimate parameter uncertainties, in particular for the beam-fragmentation modelling, we derive a family of replica tunes to data from the HERA experiments. While NNLO QCD normalisation corrections to the NLO+NLL’ prediction are numerically small, hadronisation corrections turn out to be quite sizeable. However, soft-drop grooming significantly reduces the impact of non-perturbative contributions. We supplement our study with hadron-level predictions from Sherpa based on the matching of NLO QCD matrix elements with the parton shower. Good agreement between the predictions from the two calculational methods is observed.
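The soft-drop grooming condition mentioned above is a well-established procedure in the jet-substructure literature. As a purely illustrative sketch (not the paper's implementation, and with an invented, simplified representation of a jet's declustering history as a list of momentum fractions and opening angles), the grooming step might look like this:

```python
def soft_drop(declusterings, z_cut=0.1, beta=0.0, R=1.0):
    """Toy soft-drop grooming: walk a jet's declustering sequence,
    given here as (z, theta) pairs of momentum fraction and opening
    angle (an invented simplification), dropping soft branches until
    one passes the condition z > z_cut * (theta / R) ** beta."""
    for z, theta in declusterings:
        if z > z_cut * (theta / R) ** beta:
            return z, theta  # first sufficiently hard splitting survives
    return None  # the whole jet failed the condition and is groomed away
```

With `beta = 0` the angular factor drops out and the condition reduces to a plain momentum-fraction cut `z > z_cut`.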
Basin structure of optimization based state and parameter estimation
Most data-based state and parameter estimation methods require suitable
initial values or guesses to achieve convergence to the desired solution, which
typically is a global minimum of some cost function. Unfortunately, however,
other stable solutions (e.g., local minima) may exist and provide suboptimal or
even wrong estimates. Here we demonstrate for a 9-dimensional Lorenz-96 model
how to characterize the basin size of the global minimum when applying some
particular optimization based estimation algorithm. We compare three different
strategies for generating suitable initial guesses and we investigate the
dependence of the solution on the given trajectory segment (underlying the
measured time series). To address the question of how many state variables have
to be measured for optimal performance, different types of multivariate time
series are considered consisting of 1, 2, or 3 variables. Based on these time
series the local observability of state variables and parameters of the
Lorenz-96 model is investigated and confirmed using delay coordinates. This
result is in good agreement with the observation that correct state and
parameter estimation results are obtained if the optimization algorithm is
initialized with initial guesses close to the true solution. In contrast,
initialization with other exact solutions of the model equations (different
from the true solution used to generate the time series) typically fails, i.e.
the optimization procedure ends up in local minima different from the true
solution. Initialization using random values in a box around the attractor
exhibits success rates depending on the number of observables and the available
time series (trajectory segment).
Comment: 15 pages, 2 figures
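The setup described above, a least-squares cost over a Lorenz-96 trajectory segment, minimized from random initial guesses drawn from a box around the attractor, can be sketched as follows. This is a minimal illustration, not the authors' algorithm: the cost function, integration settings, observed-variable choice, and use of SciPy's Nelder-Mead optimizer are all assumptions.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import minimize

N = 9  # dimension, as in the abstract's 9-dimensional Lorenz-96 model

def l96(t, x, F):
    # Lorenz-96: dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def simulate(x0, F, t_eval):
    sol = solve_ivp(l96, (t_eval[0], t_eval[-1]), x0, args=(F,),
                    t_eval=t_eval, rtol=1e-8, atol=1e-8)
    return sol.y

rng = np.random.default_rng(1)
F_true = 8.0
x0_true = F_true + 0.5 * rng.standard_normal(N)
t = np.linspace(0.0, 0.5, 11)
data = simulate(x0_true, F_true, t)   # synthetic "measurements"
observed = [0, 3, 6]                  # e.g. only 3 of 9 variables measured

def cost(p):
    # p = (initial state, forcing F); least-squares misfit on observed variables
    traj = simulate(p[:N], p[N], t)
    return np.sum((traj[observed] - data[observed]) ** 2)

# one random initial guess drawn from a box around the attractor
guess = np.append(rng.uniform(-5.0, 10.0, N), rng.uniform(5.0, 12.0))
res = minimize(cost, guess, method="Nelder-Mead", options={"maxiter": 300})
```

Estimating the basin of the global minimum then amounts to repeating the last two lines for many random guesses and recording how often the optimizer reaches the cost level of the true solution.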
Evidential strategies in financial statement analysis: a corpus linguistic text mining approach to bankruptcy prediction
The qualitative content of companies’ financial statements provides useful information that can increase the accuracy of bankruptcy prediction models. In this research, a dataset of 924,903 financial statements from 355,704 German companies, classified into solvent, financially distressed, and bankrupt companies using the Amadeus database from Bureau van Dijk, was examined. The results provide empirical evidence that a corpus linguistic approach applying evidential strategy analysis to financial statements helps to distinguish between companies’ financial situations. They show that companies use different approaches and confidence assessments when evaluating their financial position depending on their solvency, and vary their use of evidential strategies accordingly. This leads to the proposition of a procedure to quantify and generate features based on the analysis of evidential strategies that can be used to improve corporate bankruptcy prediction. The results presented here stem from an interdisciplinary adaptation of linguistic findings and provide future research with another means of analysis in the area of text mining.
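Turning evidential strategies into quantitative features, as proposed above, can be illustrated with a toy marker-counting function. The marker lexicon and category names below are invented for illustration; the study's actual evidential-strategy categories and word lists are not reproduced here.

```python
import re

# Hypothetical marker lexicon, purely illustrative -- not the study's lists.
EVIDENTIAL_MARKERS = {
    "reportive":   ["according to", "reportedly", "as stated"],
    "inferential": ["presumably", "apparently", "suggests"],
    "assumptive":  ["we assume", "is expected", "likely"],
}

def evidential_features(text):
    """Relative frequency of each evidential-strategy class per 1,000 tokens.
    Naive substring matching; a real pipeline would tokenize and lemmatize."""
    lowered = text.lower()
    n_tokens = max(len(re.findall(r"\w+", lowered)), 1)
    return {cls: 1000.0 * sum(lowered.count(m) for m in markers) / n_tokens
            for cls, markers in EVIDENTIAL_MARKERS.items()}
```

The resulting per-class frequencies can then serve as numeric input features for a downstream bankruptcy classifier.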
Epistemologies of Integration: On the Circulation of Knowledge about Difference in Civics Textbooks of the Post-Migrant Society
The concept of integration has long been the subject of controversial debate, not least because of the natio-ethno-cultural constructions of difference associated with it. Civics textbooks inscribe themselves into the integration discourse in a twofold way, and not entirely without tension, insofar as they make integration both a subject of study and a learning objective. This contribution outlines the conceptual foundations and selected results of a study that examined the construction of integration as an object of knowledge, and the associated notions of difference and normality, in civics textbooks (2002-2021). In particular, the contribution aims to present practice-theoretically inspired analytical categories developed from the material, which could also guide further research on the (de)construction of difference in educational media and teaching. (DIPF/Orig.)
Measuring Hadronic Higgs Boson Branching Ratios at Future Lepton Colliders
We present a novel strategy for the simultaneous measurement of Higgs-boson
branching ratios into gluons and light quarks at a future lepton collider
operating in the Higgs-factory mode. Our method is based on template fits to
global event-shape observables, and in particular fractional energy
correlations, thereby exploiting differences in the QCD radiation patterns of
quarks and gluons. In a constrained fit of the deviations of the light-flavour
hadronic Higgs-boson branching ratios from their Standard Model expectations,
based on an integrated luminosity of , we obtain confidence level limits of and .
Comment: 12 pages, 6 figures, 2 tables
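The core idea of a template fit to event-shape distributions can be illustrated with a toy example. The Gaussian template shapes and yields below are invented stand-ins, not the paper's observables or results; a real analysis would use a binned likelihood with constraints rather than plain least squares.

```python
import numpy as np

# Toy event-shape templates for gluon- and quark-like Higgs decays
# (invented Gaussian shapes, normalized to unit area)
bins = np.linspace(0.0, 1.0, 20)
t_gluon = np.exp(-((bins - 0.6) / 0.15) ** 2)
t_quark = np.exp(-((bins - 0.3) / 0.15) ** 2)
t_gluon /= t_gluon.sum()
t_quark /= t_quark.sum()

# Pseudo-data built from assumed yields, so the fit target is known
mu_true = np.array([800.0, 200.0])
pseudo_data = mu_true[0] * t_gluon + mu_true[1] * t_quark

# Template fit: solve pseudo_data ~= A @ mu in the least-squares sense,
# exploiting the different shapes of the two radiation patterns
A = np.column_stack([t_gluon, t_quark])
mu_fit, *_ = np.linalg.lstsq(A, pseudo_data, rcond=None)
```

Because the two templates peak in different regions, the linear system is well conditioned and the yields are recovered; overlapping templates would inflate the fit uncertainties.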