TOWARD AN ONTOLOGICAL ETHICS: DERRIDA’S READING OF HEIDEGGER AND LEVINAS
The aim of this work is to examine a few aspects of Jacques Derrida’s reading of the philosophy of Heidegger and Levinas. Specifically, we intend to show that the criticism Derrida directs at certain themes in Levinas’s thought contains, at the same time, a revaluation of Heidegger’s ontology as it was developed during the 1920s, before the so-called Kehre. What this triple hermeneutic comparison puts into play is the relationship between ethics and ontology. In critiquing the relationship between these two concepts in Levinas, Derrida seems to move closer to the way they are described and developed in both Being and Time and Kant and the Problem of Metaphysics. Finally, we will try to show how this revaluation of ontology by Derrida determines his approach to the philosophy of Jean-Luc Nancy, whose ethics, unlike Levinas’s, is an ethics of ontology.
The old-new epistemology of digital journalism: how algorithms and filter bubbles are (re)creating modern metanarratives
In journalism studies, the advent of the World Wide Web and the rise of online journalism are generally associated with a move beyond the objective, normative paradigm rooted in the principles of philosophical and scientific modernity towards a postmodern paradigm centred on subjectivity and relativism. This article offers an alternative reading of the epistemology of online journalism: the fragmentation of audiences into homophilic networks, the formation of ideological bubbles, and the growing polarisation driven by algorithms turn the content circulating online into a reintroduction of modernity’s metanarratives. These metanarratives in no way correspond to principles typical of postmodernism, such as the equivalence of interpretations and openness to dialogue. Journalistic content also comes under this charge: although the narratives it conveys are subjective, they are perceived as absolute truths inside the information bubbles in which they circulate. This phenomenon is driven by processes of “information platformization”. Based on these premises, a new definition of online journalism is proposed: rather than “postmodern”, it is better understood as a fulfilment of the foundational principles of modernism, but in a subjective form.
The Covid-19 News Narrative: The Case of Italian Media
The aim of this work is to study news narratives on the Covid-19 pandemic in Italy. Its point of departure is the accusation that scholars and institutions such as the WHO levelled against media outlets around the world of fuelling a drift defined as an “infodemic”: too many news items in circulation that are not properly verified or are deliberately misleading. During the pandemic, media outlets in Italy, as elsewhere, were accused of oversimplifying messages from the scientific community and experts, creating effects such as undue alarmism in the population. I will assess the accuracy of this accusation by analysing articles published in Italian news outlets, focusing specifically on the elements of news narratives whose form makes them prone to a high degree of simplification: namely, headlines and ancillary content such as teasers appearing on social networks. As I will show, even in the context of the pandemic, distinctions must be made between unwarranted journalistic simplifications, which border on unfounded news, and simplifications that, on the contrary, allow content in the public interest to reach a larger number of readers, thus heightening the level of awareness of pandemic-related issues. My thesis is that in the context of the digital public sphere, linguistic and conceptual simplification is sometimes necessary, and that it is therefore appropriate to distinguish cases that produce beneficial effects for public understanding of a phenomenon such as Covid-19 from those that can be classified as infodemic.
Cultural studies and cultural sociology: Scott Lash in conversation with Luca Serafini
The purpose of this conversation is to analyse the bond between Cultural Studies and social research, and to define analogies and differences between Cultural Studies and Cultural Sociology. Scott Lash, a cultural theorist strongly influenced by philosophical thought, clarifies how the latter can rightfully be part of social research oriented towards forms of life and practices, as well as towards a concrete, rather than utopian, political commitment. The American scholar also thoroughly investigates the notion of "symbolic" in the field of Cultural Sociology, clarifying how it can interact with the speculative models of Cultural Studies.
Projection pursuit based on Gaussian mixtures and evolutionary algorithms
We propose a projection pursuit (PP) algorithm based on Gaussian mixture models (GMMs). The negentropy obtained from a multivariate density estimated by GMMs is adopted as the PP index to be maximised. For a fixed dimension of the projection subspace, the GMM-based density estimate is projected onto that subspace, where an approximation of the negentropy for Gaussian mixtures is computed. Genetic algorithms (GAs) are then used to find the optimal orthogonal projection basis by maximising this approximation. We show that this semi-parametric approach to PP is flexible and allows highly informative structures to be detected by projecting multivariate datasets onto a subspace where the data can be feasibly visualised. The performance of the proposed approach is demonstrated on both artificial and real datasets.
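To make the pipeline concrete, the following is a minimal sketch of the idea described above rather than the authors' implementation: a GMM is fitted once in the full space with scikit-learn, the mixture is projected analytically onto a candidate orthonormal basis, the basis is scored with a negentropy index, and the search over bases is carried out by a bare-bones mutation-and-selection loop standing in for the genetic algorithm. The Monte Carlo entropy estimate and all function names and parameter values are assumptions made here for illustration; the paper uses its own closed-form negentropy approximation for Gaussian mixtures.

```python
# Minimal sketch: GMM-based projection pursuit with a simple evolutionary search.
# Assumptions (not from the paper): Monte Carlo estimation of the mixture entropy
# and a bare-bones mutation/selection loop in place of the authors' GA.
import numpy as np
from scipy.stats import multivariate_normal
from sklearn.mixture import GaussianMixture


def project_gmm(gmm, A):
    """Project a d-dimensional GMM onto the span of the orthonormal columns of A (d x q)."""
    means = gmm.means_ @ A                                            # (k, q)
    covs = np.einsum('ij,kjl,lm->kim', A.T, gmm.covariances_, A)      # (k, q, q)
    return gmm.weights_, means, covs


def negentropy(weights, means, covs, n_mc=2000, rng=None):
    """Negentropy of the projected mixture: entropy of the moment-matched Gaussian
    minus a Monte Carlo estimate of the mixture entropy."""
    rng = np.random.default_rng() if rng is None else rng
    k, q = means.shape
    comp = rng.choice(k, size=n_mc, p=weights)
    x = np.vstack([rng.multivariate_normal(means[c], covs[c]) for c in comp])
    dens = np.zeros(n_mc)
    for w, m, S in zip(weights, means, covs):
        dens += w * multivariate_normal.pdf(x, mean=m, cov=S)
    h_mix = -np.mean(np.log(dens))
    mu = weights @ means
    Sigma = sum(w * (S + np.outer(m - mu, m - mu)) for w, m, S in zip(weights, means, covs))
    h_gauss = 0.5 * np.log(np.linalg.det(2 * np.pi * np.e * Sigma))
    return h_gauss - h_mix


def pp_gmm(X, q=2, n_components=5, pop_size=30, n_gen=40, sigma=0.1, seed=0):
    """Search for an orthonormal d x q basis maximising the GMM negentropy index."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    gmm = GaussianMixture(n_components=n_components, random_state=seed).fit(X)
    # Initial population of random orthonormal bases (reduced QR).
    pop = [np.linalg.qr(rng.standard_normal((d, q)))[0] for _ in range(pop_size)]
    for _ in range(n_gen):
        scores = [negentropy(*project_gmm(gmm, A), rng=rng) for A in pop]
        elite = [pop[i] for i in np.argsort(scores)[::-1][:pop_size // 2]]
        # Offspring: Gaussian mutation of elite members, re-orthonormalised via QR.
        children = [np.linalg.qr(A + sigma * rng.standard_normal((d, q)))[0] for A in elite]
        pop = elite + children
    best = max(pop, key=lambda A: negentropy(*project_gmm(gmm, A), rng=rng))
    return best, X @ best   # optimal basis and the projected, visualisable data
```

Re-orthonormalising each mutated candidate with a QR factorisation is the simplest way to keep the evolutionary search on the set of orthogonal projection bases.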
From Compton Scattering of photons on targets to Inverse Compton Scattering of electron and photon beams
We revisit the kinematics of Compton scattering (electron-photon interactions producing electrons and photons in the exit channel), covering the full range of energy/momentum distributions between the two colliding particles, with a dedicated view to the statistical properties of the secondary beams generated in beam-beam collisions. Starting from Thomson inverse scattering, where the electrons do not recoil and the photons are back-scattered to higher energies by a relativistic Lorentz boost, we analyse three transition points separating four regions. These are, in sequence of increasing electron recoil (numbers denote transition points, letters denote regions): a) Thomson back-scattering; 1) equal sharing of the total energy in the exit channel between electron and photon; b) the deep recoil regime, where the bandwidth/energy spread of the two interacting beams are exchanged in the exit channel; 2) the electron is stopped, i.e. brought to rest in the laboratory frame by the collision with the incident photon; c) the electron back-scattering region, where the incident electron is back-scattered by the incident photon; 3) symmetric scattering, when the incident particles carry equal and opposite momenta, so that in the exit channel they are back-scattered with the same energy/momenta; d) Compton scattering (Arthur Compton, see ref. 4), where the photons carry an energy much larger than that of the colliding electrons. For each region and/or transition point we discuss potential effects of interest in diverse areas, such as generating mono-chromatic gamma-ray beams with spectral purification in the deep recoil region, or possible mechanisms of generation and propagation of very high energy photons in the cosmological domain.
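For orientation among the regimes listed above, the standard two-body kinematics can be summarised as follows; these are textbook relations written in our own notation, not taken from the paper, with $E_\gamma$ the incident photon energy, $E_\gamma'$ the scattered photon energy, $\theta$ the scattering angle, $m_e c^2$ the electron rest energy, and $\gamma$ the Lorentz factor of the incident electron.

```latex
% Compton scattering off an electron at rest: the photon loses energy and the
% electron recoil grows with E_\gamma / (m_e c^2).
\[
  E_\gamma' \;=\; \frac{E_\gamma}{1 + \dfrac{E_\gamma}{m_e c^2}\,\bigl(1 - \cos\theta\bigr)}
\]

% Head-on inverse Compton scattering off a relativistic electron (\gamma \gg 1):
% in the Thomson limit 4\gamma E_\gamma \ll m_e c^2 the recoil is negligible and
% the back-scattered photon is boosted by a factor of order \gamma^2.
\[
  E_\gamma' \;\simeq\; \frac{4\gamma^2 E_\gamma}{1 + 4\gamma E_\gamma / (m_e c^2)}
  \;\approx\; 4\gamma^2 E_\gamma
  \qquad \text{for } 4\gamma E_\gamma \ll m_e c^2 .
\]
```

Roughly speaking, the size of the recoil term $4\gamma E_\gamma/(m_e c^2)$ governs where a given collision falls between the Thomson back-scattering region and the deep recoil and Compton-dominated regions described in the abstract.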
Lessons from the Infodemic: Fact-Checking and the Old-New Ideals of Modern Journalism
The wave of disinformation that has marked recent years led to the emergence of new conceptual categories, such as ‘post-truth’. However, it also generated forms of reaction to the phenomenon, such as fact-checking, a journalistic genre that has recently established itself worldwide. The first theoretical hypothesis of this paper is that the success of fact-checking can be interpreted as a contemporary return to the ideals of so-called ‘modern’ journalism. These ideals emerged between the 19th century and the first decades of the 20th century and were incorporated into journalism through the principles of the Enlightenment and scientific modernity: objectivity, impartiality, and reliance on data and evidence (until they dispersed, in the postmodern turn, with the advent of the web). This paper analyses the potential and limits of this “old-new” model of journalism, using examples from leading international fact-checking projects. Regarding the limits, we discuss how it is impossible to practise fact-checking without adopting a partial point of view; our analysis also sheds light on how, upon closer inspection, the category of ‘disinformation’ itself turns out to be ambiguous. As for the potential, we examine best practices that allow a scrutiny of journalistic narratives on facts, considering examples of ‘good fact-checking’ that does not claim absolute objectivity. We develop the argument that good fact-checking can help to pursue a new model of objectivity and scientificity, based on assumptions such as the falsifiability of statements, the replicability of experiments, and the delimitation of the context of analysis. Finally, we argue that this objectivity should be seen as a form of open rationality rather than a new ‘dogmatism of facts’.
Desktop virtual reality as an exposure method for test anxiety: quantitative and qualitative feasibility study
Large quadratic programs in training Gaussian support vector machines
We consider the numerical solution of the large convex quadratic program arising in training the learning machines known as support vector machines. Since the matrix of the quadratic form is dense and generally large, solution approaches based on explicit storage of this matrix are not practicable. Well-known strategies for this quadratic program are based on decomposition techniques that split the problem into a sequence of smaller quadratic programming subproblems. For the solution of these subproblems we present an iterative projection-type method suited to the structure of the constraints and very effective in the case of Gaussian support vector machines. We develop an appropriate decomposition technique designed to exploit the high performance of the proposed inner solver on medium or large subproblems. Numerical experiments on large-scale benchmark problems allow us to compare this approach with another widely used decomposition technique. Finally, a parallel extension of the proposed strategy is described.
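As an illustration of the general decomposition idea, rather than of the specific projection-type inner solver or the parallel extension proposed in the paper, the sketch below trains a Gaussian-kernel SVM by repeatedly optimising a small working set of dual variables. For brevity the bias term is dropped, so each subproblem has only box constraints and the projection step reduces to clipping; this simplification, like all function names and parameter values, is an assumption made here for illustration.

```python
# Minimal sketch of a decomposition scheme with a projection-type inner step for
# the Gaussian-kernel SVM dual. Assumptions (not from the paper): no bias term,
# so the dual has only the box constraints 0 <= alpha <= C; working set chosen
# by largest KKT violation; inner solver is plain projected gradient.
import numpy as np


def gaussian_kernel(X1, X2, gamma):
    """K(x, z) = exp(-gamma * ||x - z||^2)."""
    d2 = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
    return np.exp(-gamma * d2)


def solve_subproblem(Q_BB, lin, alpha_B, C, n_iter=200):
    """Projected gradient on: min 0.5 a'Q_BB a + lin'a  s.t.  0 <= a <= C."""
    step = 1.0 / (np.linalg.eigvalsh(Q_BB)[-1] + 1e-12)   # 1 / largest eigenvalue
    a = alpha_B.copy()
    for _ in range(n_iter):
        a = np.clip(a - step * (Q_BB @ a + lin), 0.0, C)  # gradient step + box projection
    return a


def train_svm(X, y, C=1.0, gamma=0.5, q=50, n_outer=100, tol=1e-3):
    """Decomposition loop for the dual min 0.5 a'Qa - 1'a, with y in {-1, +1}."""
    n = len(y)
    alpha, grad = np.zeros(n), -np.ones(n)       # gradient of the dual at alpha = 0
    for _ in range(n_outer):
        # KKT violation of each coordinate, given the box constraints.
        viol = np.where(alpha <= 0, np.maximum(0, -grad),
               np.where(alpha >= C, np.maximum(0, grad), np.abs(grad)))
        if viol.max() < tol:
            break
        B = np.argsort(viol)[-q:]                    # most violating working set
        K_B = gaussian_kernel(X, X[B], gamma)        # only q kernel columns are formed
        Q_B = (y[:, None] * K_B) * y[B][None, :]     # corresponding columns of Q = yy' * K
        lin = grad[B] - Q_B[B] @ alpha[B]            # linear term of the subproblem
        new_B = solve_subproblem(Q_B[B], lin, alpha[B], C)
        grad += Q_B @ (new_B - alpha[B])             # cheap rank-q gradient update
        alpha[B] = new_B
    return alpha   # decision value at z: sum_i alpha_i * y_i * K(x_i, z)
```

Because only the kernel columns of the current working set are ever formed, the dense kernel matrix is never stored in full, which is precisely why decomposition techniques are attractive for large training sets.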
The role of platforms in the journalistic ecosystem: innovative business models of Italian media outlets
The increased dependence of the news media system on online platforms has caused a readjustment of the traditional business models that supported media enterprises until a few years ago, and in some cases has resulted in a redefinition of the boundaries of the journalistic field. To test the interplay between different strategies of relating to platforms and different locations within the field, the paper considers four Italian news outlets. Two, Will Media Italia and Factanza, are newcomers: information projects anchored in the logic of online platforms and disseminated almost exclusively on social media. Two, The Post Internazionale and Domani, are established inhabitants of the field: media outlets with stronger links to the traditional logic of journalism. Through semi-structured interviews conducted with selected members of the four news media outlets, the study aims to answer two key questions. First: which forms of revenue do the outlets analyzed pursue in order to maintain their economic sustainability, and how do their characteristics influence the use of different forms of advertising, also as a function of the response of the online and platform environments they inhabit, albeit with different strategies? Second, and closely related: how is the editorial autonomy of the outlets analyzed preserved? That is, how do these media outlets attempt to make their business both economically sustainable and not overly dependent on forms of monetization generated by the platforms?
