Benchmarking GEANT4 nuclear models for hadron therapy with 95 MeV/nucleon carbon ions
In carbon-ion therapy, the interaction of the incoming beam with human tissues
may lead to the production of a large number of nuclear fragments and secondary
light particles. An accurate estimation of the biological dose deposited in
the tumor and the surrounding healthy tissues therefore requires sophisticated
simulation tools based on nuclear reaction models. Validating such models
requires intensive comparisons with as many sets of experimental data as
possible. Up to now, a rather limited set of double-differential carbon
fragmentation cross sections has been measured in the energy range used in
hadron therapy (up to 400 MeV/A). However, new data have recently been obtained
at intermediate energy (95 MeV/A). The aim of this work is to compare the
reaction models embedded in the GEANT4 Monte Carlo toolkit with these new data.
The strengths and weaknesses of each tested model, i.e.
G4BinaryLightIonReaction, G4QMDReaction and INCL++, coupled to two different
de-excitation models, i.e. the generalized evaporation model and the Fermi
break-up model, are discussed
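Model-to-data comparisons of this kind are typically quantified with a chi-square over the measured double-differential cross sections. The sketch below is a minimal illustration: all numbers (energies, cross sections, uncertainties, and the model prediction) are invented for demonstration and are not the 95 MeV/A data set.

```python
import numpy as np

# Hypothetical double-differential cross sections d2(sigma)/dE/dOmega
# (mb / MeV / sr) for one fragment species at a fixed lab angle:
# measured values, their uncertainties, and one model prediction.
energy_mev = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
measured   = np.array([4.2, 6.8, 5.1, 3.0, 1.4])
sigma_meas = np.array([0.4, 0.5, 0.4, 0.3, 0.2])   # experimental uncertainties
model      = np.array([3.8, 7.5, 4.6, 2.6, 1.1])   # model prediction

# Reduced chi-square quantifies the overall model/data agreement.
chi2 = np.sum(((measured - model) / sigma_meas) ** 2)
chi2_reduced = chi2 / len(measured)
print(f"chi2/ndf = {chi2_reduced:.2f}")
```

A per-bin breakdown of the same quantity is what typically exposes where a given de-excitation model over- or under-produces a fragment species.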
Simulation study on light ions identification methods for carbon beams from 95 to 400 MeV/A
Monte Carlo simulations have been performed to evaluate the
efficiencies of several light-ion identification techniques. The detection
system was composed of layers of scintillating material measuring either the
deposited energy or the time-of-flight of ions produced by nuclear reactions
between 12C projectiles and a PMMA target. Well-known techniques such as
ΔE-Range, ΔE-E-ToF and ΔE-E are presented, and their
particle identification efficiencies are compared with one another as a
function of the charge and mass of the particle to be identified. The
simulations allowed the beam energy to be varied to match those used in a
hadron-therapy facility, namely from 95 to 400 MeV/A
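The ΔE-E technique mentioned above exploits the non-relativistic Bethe stopping-power trend dE/dx ∝ A·Z²/E: in a thin transmission detector, the product ΔE·E is roughly constant for a given species, so it separates ions by charge and mass. A minimal sketch, with a hypothetical calibration constant and illustrative residual energies:

```python
def delta_e(z, a, e, k=1.0):
    # Thin-detector energy loss following the non-relativistic Bethe
    # trend dE/dx ~ A * Z^2 / E; k is a hypothetical calibration constant.
    return k * a * z**2 / e

ions = {"proton": (1, 1), "deuteron": (1, 2), "alpha": (2, 4)}
for name, (z, a) in ions.items():
    for e in (50.0, 100.0, 200.0):     # residual kinetic energies (MeV)
        pid = delta_e(z, a, e) * e     # ~ k * A * Z^2, independent of E
        print(f"{name:8s} E={e:5.0f} MeV  PID={pid:5.1f}")
```

Because the PID variable collapses to A·Z² (1 for protons, 2 for deuterons, 16 for alphas here), species fall onto separate bands regardless of their energy, which is what makes the ΔE-E correlation usable across the whole 95-400 MeV/A range.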
French CAC 40 executives: between educational elitism and careers in State service
This article studies the French executives of CAC 40 companies at the end of 2007, with particular attention to the higher-education tracks they followed. Alumni of the most elitist grandes écoles (École polytechnique, HEC, ENA) continue to dominate, and there is still a tendency to recruit from the grands corps of the French State
The WebStand Project
In this paper we present the state of advancement of the French ANR WebStand
project. The objective of this project is to construct a customizable XML-based
warehouse platform to acquire, transform, analyze, store, query and export data
from the web, in particular mailing lists, with the final intention of using
these data to perform sociological studies focused on social groups of the
World Wide Web, with a specific emphasis on the temporal aspects of the data.
We are currently using this system to analyze the standardization process of
the W3C through its social network of standard setters
XML content warehousing: Improving sociological studies of mailing lists and web data
In this paper, we present the guidelines for an XML-based approach to the
sociological study of Web data, such as the analysis of mailing lists or
databases available online. The use of an XML warehouse is a flexible solution
for storing and processing this kind of data. We propose an implemented
solution and show possible applications with our case study of the profiles of
experts involved in W3C standard-setting activity. We illustrate the
sociological use of semi-structured databases by presenting our XML Schema for
mailing-list warehousing. An XML Schema allows data sources to be added or
cross-referenced without modifying existing data sets, while allowing the
structure to evolve. We also show that the existence of hidden data implies
increased complexity for traditional SQL users. XML content warehousing allows
both exhaustive warehousing and recursive queries over content, with far less
dependence on the initial storage. Finally, we present the possibility of
exporting the data stored in the warehouse to commonly used advanced software
devoted to sociological analysis
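As a rough illustration of content-oriented querying over such a warehouse, the sketch below parses a tiny invented mailing-list fragment (the element names are hypothetical, not the project's actual XML Schema) and extracts the distinct senders:

```python
import xml.etree.ElementTree as ET

# Hypothetical mailing-list warehouse fragment; the archive/message/from
# structure is illustrative only.
doc = """
<archive>
  <message id="1">
    <from>alice@example.org</from>
    <date>2007-03-01</date>
    <subject>Draft spec review</subject>
  </message>
  <message id="2">
    <from>bob@example.org</from>
    <date>2007-03-02</date>
    <subject>Re: Draft spec review</subject>
  </message>
</archive>
"""

root = ET.fromstring(doc)
# Content query: all distinct senders, assuming no more structure than
# the path archive/message/from.
senders = sorted({m.findtext("from") for m in root.iter("message")})
print(senders)  # ['alice@example.org', 'bob@example.org']
```

Because the query touches only the elements it names, new child elements (threads, affiliations, temporal annotations) can later be added to each message without breaking it, which is the flexibility argument made above.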
Regulation through ISO standards
From electrical equipment to accounting statements to information and communication technologies, few economic sectors are not subject to so-called technical norms. Technical norms, of which the ISO standards are certainly the best known (notably ISO 9000 and 14000), are production standards intended to improve the quality, safety, or compatibility of goods and services. Indeed, while these norms often pass for an obscure and complex matter involving only a few insiders in the fields concerned, they nonetheless constitute a genuine process of economic regulation on a global scale. The term ISO is polysemous, since it designates both the international organization (International Organization for Standardization) within which the standards are adopted and the norms themselves. Created in 1946 as the successor to the International Federation of the National Standardizing Associations, ISO has a federative structure bringing together the national agencies of 148 countries. The impressive number of published standards (13,700 since 1947) attests to the importance of these provisions for the economic sectors concerned. At a time of debates on globalization, it is worth examining an enterprise of international regulation which, far from reviving classic oppositions (economic liberalism versus regulation, or private interests versus public authority), seems on the contrary to achieve an original syncretism. International standardization policies bring together a multitude of public and private actors: international organizations, national administrations, agencies, research centers, firms, associations, and so on.
This diversity of actors does not, however, exhaust the notion of public policy; rather, it invites us to conceive of it in a new light, one capable of bringing out both the originality of its forms and its effective reach in the cases that interest us. In this admittedly rather dense contribution, we question the classic approaches to analyzing these policies and propose a somewhat different analytical framework, based on the idea of format borrowed from Rémi Barbier and on the monopolization of formats. This framework makes it possible to grasp both how prescriptions are elaborated and how those prescriptions are put to the test in the work situations that operationalize them
The Role of Internal Third-Party Interveners in Civil Resistance Campaigns: The Case of Israeli–Jewish Anti-Occupation Activists
When a non-violent resistance campaign does not have the leverage to challenge powerful opponents, third-party intervention has been shown to help. While the role of external third-party interveners – foreign activists – has been documented, less attention has been paid to intervention by members of the dominant population. Drawing on the literature on civil resistance and through the study of Israeli Jews who intervene in Palestinian resistance campaigns against the Israeli military occupation, I argue that intervention by members of the dominant population is strategically desirable. Through an analysis of three Palestinian campaigns, this article shows that the physical presence of Israeli Jews was needed to ensure that the Palestinians could maintain their resistance efforts and presence on the land despite the repression they faced. Furthermore, the skills and knowledge of the Israelis were needed to help the Palestinians achieve some of their goals, at least in the short term
Standardization and market regulation: mobile telephony in Europe and the United States.
The study analyzes the complex links between the standardization and regulation of mobile-telephony markets from a political-economy perspective, taking into account, in a Schumpeterian vein, the market disequilibria and monopolistic phenomena associated with innovation. It first underlines, for the successive network generations (0G to 4G), the particularity of this industry in terms of return on investment, and the key role that network standardization plays in structuring the market. This key variable of the standard largely explains the economic rent that GSM represented in the industrial and financial dynamics of the sector. The study then explores the relations between standardization policies, which are neither the sole preserve of public actors nor simple industrial-property rules, and the regulatory policies of the sector (licensing, competition rules, etc.). It underlines that the last twenty-five years have made the configurations of expertise increasingly complex and have increased the interdependencies between network entrepreneurs, standardizers, and regulators. From a perspective close to Fligstein's, which emphasizes the various institutional dimensions of market structuring (competition policies, industrial-property rules, wage relations, financial institutions), the study highlights the interdependence between diverse, heavily institutionalized spheres of activity. Keywords: monopolization; standardization; market regulation; mobile telephony
Using Open Standards for Interoperability - Issues, Solutions, and Challenges facing Cloud Computing
Virtualization offers several benefits for optimal resource utilization over
traditional non-virtualized server farms. With improvements in internetworking
technologies and increases in network bandwidth, a new era of computing has
been ushered in: that of grids and clouds. With several commercial cloud
providers emerging, each with its own APIs, application description formats,
and varying support for SLAs, vendor lock-in has become a serious issue for end
users. This article describes the problems, issues, possible solutions, and
challenges in achieving cloud interoperability. These issues are analyzed in
the context of the European project Contrail, which seeks to adopt open
standards alongside available virtualization solutions to enhance users' trust
in clouds by preventing vendor lock-in, supporting and enforcing SLAs, and
providing adequate protection for sensitive data
Predictive factors of clinical assays on hydroxychloroquine for COVID-19 mortality during the first year of the pandemic: A meta-synthesis
Background: The COVID-19 pandemic led to a heated debate about the efficacy of a repurposed drug, hydroxychloroquine (HCQ), and of a new broad-spectrum antiviral (remdesivir), and about randomized controlled trials (RCTs) versus observational studies. To understand conflicting results in the literature, we performed a meta-synthesis to determine whether intrinsic qualitative criteria within studies may predict the apparent efficacy or ineffectiveness of HCQ and remdesivir. Methodology: Predictive criteria were identified through a critical review of studies assessing HCQ and remdesivir for COVID-19 mortality from March to November 2020. Multiple correspondence analysis, comparative meta-analysis, and predictive value were used to explore and identify criteria associated with study outcomes. Results: Among the 61 included studies, potential conflict of interest, a detailed therapeutic protocol, toxic treatment (overdose or use in contraindicated patients), known centers and doctors, and a private data-computing company were the criteria most predictive of the direction of effect of the studies. All 18 observational studies evaluating HCQ that reported a detailed therapeutic protocol and had no conflict of interest concluded in favor of HCQ. Potential conflict of interest was a perfect predictor of reported remdesivir efficacy. RCTs were associated with HCQ inefficacy and potential conflict of interest. The most predictive criteria were validated and allowed perfect classification of 10 additional studies. Conclusion: In therapeutic trials on COVID-19, the major biases predicting the conclusions are neither methodology nor data analysis, but conflict of interest and absence of medical expertise. A thorough search for declared or undeclared, direct or indirect conflicts of interest, and for medical expertise, should be included in the quality criteria for the evaluation of future therapeutic studies in COVID-19 and beyond. A new checklist evaluating not only methodology but also conflict of interest and medical expertise is proposed
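The predictive value of a study-level criterion can be illustrated with a standard 2×2 computation, treating the criterion as a test that predicts a study's direction of effect. The counts below are invented for illustration; they are not the meta-synthesis data.

```python
# 2x2 table for one hypothetical criterion (e.g. "declared conflict of
# interest") versus a study's conclusion about the drug.
tp, fn = 20, 5    # "ineffective" studies with / without the criterion
fp, tn = 2, 34    # "effective" studies with / without the criterion

sensitivity = tp / (tp + fn)   # fraction of negative conclusions the criterion flags
specificity = tn / (tn + fp)   # fraction of positive conclusions it leaves unflagged
ppv = tp / (tp + fp)           # positive predictive value of the criterion
print(f"sens={sensitivity:.2f} spec={specificity:.2f} ppv={ppv:.2f}")
```

A criterion described in the abstract as a "perfect predictor" corresponds to the degenerate case fp = fn = 0, i.e. sensitivity, specificity, and PPV all equal to 1.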