Citizen or consumer facing the PFI: unresolved questions.
The PFI (Private Finance Initiative) is a method for financing major procurement projects that relies on heavy private sector involvement. It has been controversial ever since its introduction on a large scale by the Conservatives in the 1990s. Remarkably, PFI has become popular among New Labour ministers. This paper analyses the impact of PFI on the relationship between citizens and government. The Private Finance Initiative, a system for financing public infrastructure, belongs to a broad set of policies designed to create synergies between the public and private sectors, whose stated ultimate objective is to improve the quantity and quality of the services delivered to users. As such, PFI is only one component of the wider field of public-private partnerships (PPPs). This ambition to move beyond what is perceived as an outdated divide between public and private is not confined to the purely financial side of public investment and permeates the whole of New Labour rhetoric. Beyond the technical discourse, this approach raises a fundamental question about the conception of the citizen that underlies it
A fast monolithic active pixel sensor with pixel level reset noise suppression and binary outputs for charged particle detection
In order to develop precision vertex detectors for the future linear collider, fast monolithic active pixel sensors are studied. A standard 0.25 µm CMOS digital process is used to design a test chip which includes different pixel types, column-level discriminators and a digital control part. In-pixel amplification is implemented together with double sampling. Different charge-to-voltage conversion factors were obtained using amplifiers with different gains or diode sizes. Pixel architectures with DC and AC coupling to the charge sensing element were proposed. So far, hits from conversions of 55Fe photons have been registered for the DC-coupled pixel. Double sampling is functional and allows almost complete cancellation of fixed pattern noise
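The double-sampling scheme described above can be illustrated with a short simulation. This is a minimal sketch of the principle only: the array size, noise amplitudes and signal value are invented for illustration and do not come from the chip described in the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative 8x8 pixel array (arbitrary units). Each pixel has a
    # fixed offset (fixed pattern noise) plus random temporal noise.
    fixed_pattern = rng.normal(0.0, 5.0, size=(8, 8))
    def temporal_noise():
        return rng.normal(0.0, 0.5, size=(8, 8))

    signal = np.zeros((8, 8))
    signal[3, 4] = 20.0  # hypothetical hit in one pixel

    # First sample: frame taken right after reset (offsets only).
    reference = fixed_pattern + temporal_noise()
    # Second sample: frame taken after charge integration.
    measurement = fixed_pattern + signal + temporal_noise()

    # Double sampling: the difference cancels the per-pixel offsets,
    # leaving the signal plus a small temporal-noise contribution.
    frame = measurement - reference

    print("spread before subtraction:", round((measurement - signal).std(), 2))
    print("spread after  subtraction:", round((frame - signal).std(), 2))

Subtracting the post-reset frame removes the pixel-to-pixel offsets entirely; only the (much smaller) temporal noise of the two samples remains.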
Critical assessment of QSAR models of environmental toxicity against Tetrahymena pyriformis: focusing on applicability domain and overfitting by variable selection
The estimation of the accuracy of predictions is a critical problem in QSAR modeling. The "distance to model" can be defined as a metric that quantifies the similarity between the training set molecules and a test set compound for the given property in the context of a specific model. It can be expressed in many different ways, e.g., using the Tanimoto coefficient, leverage, correlation in the space of models, etc. In this paper we used mixtures of Gaussian distributions as well as statistical tests to evaluate six types of distances to models with respect to their ability to discriminate compounds with small and large prediction errors. The analysis was performed for twelve QSAR models of aqueous toxicity against T. pyriformis obtained with different machine-learning methods and various types of descriptors. The distance to model based on the standard deviation of predicted toxicity calculated from the ensemble of models afforded the best results. This distance also successfully discriminated molecules with small and large prediction errors for a mechanism-based model developed using log P and the Maximum Acceptor Superdelocalizability descriptors. Thus, the distance to model metric could also be used to augment mechanistic QSAR models by estimating their prediction errors. Moreover, the accuracy of prediction is mainly determined by the training set data distribution in the chemistry and activity spaces, not by the QSAR approaches used to develop the models. We have shown that incorrect validation of a model may result in the wrong estimation of its performance and suggested how this problem can be circumvented. The toxicity of 3182 and 48774 molecules from the EPA High Production Volume (HPV) Challenge Program and EINECS (European Inventory of Existing Commercial chemical Substances), respectively, was predicted, and the accuracy of prediction was estimated. The developed models are available online at http://www.qspr.org
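The best-performing distance to model above, the standard deviation of predictions over an ensemble of models, is straightforward to compute. The sketch below assumes scikit-learn-style regressors and uses randomly generated descriptors and toxicity values purely as placeholders; the applicability-domain threshold is likewise illustrative, not a value from the paper.

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(42)
    # Placeholder descriptor matrix X and toxicity vector y.
    X = rng.normal(size=(200, 10))
    y = X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.1, size=200)

    # Train an ensemble of models on bootstrap resamples of the training set.
    models = []
    for seed in range(10):
        idx = rng.integers(0, len(X), size=len(X))
        model = RandomForestRegressor(n_estimators=50, random_state=seed)
        models.append(model.fit(X[idx], y[idx]))

    X_test = rng.normal(size=(20, 10))
    preds = np.stack([m.predict(X_test) for m in models])  # (models, compounds)

    prediction = preds.mean(axis=0)
    # Distance to model: per-compound standard deviation over the ensemble.
    dist_to_model = preds.std(axis=0)

    # Compounds where the ensemble disagrees most are expected to carry
    # the largest prediction errors (illustrative 90th-percentile cut).
    outside_domain = dist_to_model > np.percentile(dist_to_model, 90)
    print(dist_to_model.round(3))
    print(outside_domain)

Compounds flagged by a large ensemble spread can then be reported with a warning, or excluded from the applicability domain altogether.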
Addressing a bottleneck for regulation of nanomaterials: quantitative read-across (Nano-QRA) algorithm for cases when only limited data is available
The number and variety of engineered nanoparticles have been growing exponentially. Since the experimental evaluation of nanoparticles that raise public health concerns is expensive and time consuming, efficient computational tools are among the most suitable approaches to identifying potential negative impacts of new nanomaterials on human health and the environment before their production. However, developing computational models complementary to experiments is impossible without consistent, high-quality experimental data. Although the data available in the literature are limited, one may apply read-across techniques, which seem to be an attractive and pragmatic alternative for predicting missing physico-chemical or toxicological data. Unfortunately, the existing methods of read-across depend strongly on expert knowledge. In consequence, the results of estimation may vary with the personal experience of the expert conducting the study and as such cannot be guaranteed to be reproducible. Therefore, it is essential to develop novel read-across algorithm(s) that will provide reliable predictions of the missing data without the need for additional experiments. We propose a novel quantitative read-across approach for nanomaterials (Nano-QRA) that addresses and overcomes a basic limitation of existing methods. It is based on the point-slope form, the two-point formula, or the equation of a plane passing through three points. The proposed Nano-QRA approach is a simple and effective algorithm for filling data gaps in a quantitative manner, providing reliable predictions of the missing data. © The Royal Society of Chemistry
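Since the abstract names the three constructions Nano-QRA builds on (a line through two measured cases, in point-slope or two-point form, and a plane through three), a small sketch can make the gap-filling step concrete. This is not the authors' implementation: the descriptor choices (particle size, zeta potential) and all numerical values are hypothetical.

    import numpy as np

    def two_point(x1, y1, x2, y2, x):
        # Two-point formula: line through (x1, y1) and (x2, y2),
        # written in point-slope form and evaluated at x.
        slope = (y2 - y1) / (x2 - x1)
        return y1 + slope * (x - x1)

    def plane_through_three(p1, p2, p3, x, y):
        # Plane z = f(x, y) through three points (xi, yi, zi):
        # a*(x - x1) + b*(y - y1) + c*(z - z1) = 0, solved for z.
        p1, p2, p3 = map(np.asarray, (p1, p2, p3))
        a, b, c = np.cross(p2 - p1, p3 - p1)
        return p1[2] - (a * (x - p1[0]) + b * (y - p1[1])) / c

    # Hypothetical read-across: an endpoint measured for 10 nm and 40 nm
    # particles is interpolated for a 25 nm particle of the same chemistry.
    print(two_point(10, 2.1, 40, 3.3, 25))

    # With two descriptors (size, zeta potential) and three measured cases:
    print(plane_through_three((10, -20, 2.1), (40, -20, 3.3), (10, -35, 2.6),
                              25, -30))

The missing endpoint is read off the line (one descriptor) or the plane (two descriptors) fitted through the measured cases, which removes the expert-judgment step the abstract criticises.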
A vertex detector for the International Linear Collider based on CMOS sensors
The physics programme at the International Linear Collider (ILC) calls for a vertex detector (VD) providing unprecedented flavour tagging performance, especially for c-quarks and τ leptons. This requirement makes mandatory a very granular, thin, multi-layer VD installed very close to the interaction region. Additional constraints, mainly on read-out speed and radiation tolerance, originate from the beam background, which governs the occupancy and the radiation level the detector must be able to cope with. CMOS sensors are being developed to fulfil these requirements. This report addresses the ILC requirements (largely driven by beamstrahlung), the main advantages and features of CMOS sensors, the performance demonstrated so far, and the specific aspects of a VD based on this technology. The status of the main R&D directions (radiation tolerance, thinning procedure and read-out speed) is also presented
