16 research outputs found

    Making visible the invisible: Why disability-disaggregated data is vital to "leave no-one behind"

    People with disability make up approximately 15% of the world’s population and are therefore a major focus of the ‘leave no-one behind’ agenda. It is well known that people with disabilities face exclusion, particularly in low-income contexts, where 80% of people with disability live. Understanding the detail and causes of exclusion is crucial to achieving inclusion, but this cannot be done without good-quality, comprehensive data. Against the background of the 2006 Convention on the Rights of Persons with Disabilities and the 2030 Agenda for Sustainable Development adopted in 2015, there has never been a better time for the drive towards equality of inclusion for people with disability. Governments have laid out targets across seventeen Sustainable Development Goals (SDGs), with explicit references to people with disability. Good-quality, comprehensive disability data, however, is essential to measuring progress towards these targets and goals, and ultimately their success. It is commonly assumed that there is a lack of disability data, and development actors tend to cite this lack as the reason for failing to plan proactively for the inclusion of people with disabilities within their programming. This assumption is incorrect: a growing amount of disability data is now available. Disability, however, is a notoriously complex phenomenon, and both its definitions and the methodologies used to measure it vary across contexts. The body of disability data that does exist is therefore not comprehensive, is often of low quality, and lacks comparability. The need for comprehensive, high-quality disability data is an urgent priority that is bringing together a number of disability actors, with a concerted response underway.
We argue here that enough data does exist and can readily be disaggregated, as demonstrated by Leonard Cheshire’s Disability Data Portal and by other studies using the Washington Group Question Sets developed by the Washington Group on Disability Statistics. Disaggregated data can improve planning and budgeting for reasonable accommodation to realise the human rights of people with disabilities. We know from existing evidence that disability data has the potential to drive improvements, allowing the monitoring and evaluation so essential to the success of the 2030 agenda of ‘leaving no-one behind’.

    Ordering of analog specification tests based on parametric defect level estimation

    ISBN 978-1-4244-6649-8. This paper presents an approach for ordering analog specification (or functional) tests that is based on a statistical estimation of parametric defect level. A statistical model of n specification tests is obtained by applying a density estimation technique to a small sample of data (obtained from the initial phase of production testing or through Monte Carlo simulation of the design). The statistical model is then sampled to generate a large population of synthetic devices, from which specification tests can be ordered according to their impact on defect level by means of feature selection techniques. An optimal order can be obtained using the Branch and Bound method when n is relatively low. For larger values of n, however, heuristic methods such as genetic algorithms and floating search must be used, which do not guarantee an optimal order. Since n can reach several hundred for advanced analog integrated devices, we have studied a heuristic algorithm that considers combinations of subsets of the overall test set. These subsets are easier to model and to order, and a heuristic approach is used to form an overall order. This test ordering approach is evaluated on different artificial and experimental case studies, including a fully differential operational amplifier. These case studies are simple enough that it is possible to compare the results obtained with the algorithm against an expected reference order.
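
As a rough illustration of the ordering idea, the sketch below fits a multivariate normal to a small sample, samples a large synthetic population from it, and orders tests by greedy forward selection on the estimated defect level. Everything here is an assumption for illustration (the covariance, the specification limit, and the simple greedy heuristic standing in for Branch and Bound or floating search); it is not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 4 specification tests (performances) with correlated
# variation, modelled by a multivariate normal fitted to a small sample.
n_tests = 4
true_cov = np.array([[1.0, 0.8, 0.1, 0.0],
                     [0.8, 1.0, 0.1, 0.0],
                     [0.1, 0.1, 1.0, 0.2],
                     [0.0, 0.0, 0.2, 1.0]])
small_sample = rng.multivariate_normal(np.zeros(n_tests), true_cov, size=200)

# Fit the statistical model and sample a large synthetic population.
mean = small_sample.mean(axis=0)
cov = np.cov(small_sample, rowvar=False)
population = rng.multivariate_normal(mean, cov, size=100_000)

spec_limit = 2.0                         # |performance| <= limit passes
fails = np.abs(population) > spec_limit  # per-device, per-test fail matrix

def defect_level(selected):
    """Fraction of defective devices that escape the selected tests."""
    defective = fails.any(axis=1)
    if not selected:
        return defective.mean()
    caught = fails[:, selected].any(axis=1)
    return (defective & ~caught).mean()

# Greedy forward selection: repeatedly add the test that most reduces
# the estimated defect level (a heuristic, not an optimal ordering).
order, remaining = [], list(range(n_tests))
while remaining:
    best = min(remaining, key=lambda t: defect_level(order + [t]))
    order.append(best)
    remaining.remove(best)

print("test order:", order)
print("defect level after full set:", defect_level(order))
```

Because tests 0 and 1 are strongly correlated in this toy covariance, whichever of them is picked first makes the other nearly redundant, which is exactly the information an ordering exposes.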

    Minimization of functional tests by statistical modelling of analogue circuits

    In this paper, we address the problem of functional test compaction of analogue circuits by using a statistical model of the performances of the Circuit Under Test (CUT). The statistical model is obtained using data from a Monte Carlo simulation and uses a multi-normal law to estimate the joint probability density function (PDF) of the circuit performances at the design stage. The functional test compaction method is based on the minimization of the defect level, again at the design stage, which is calculated from the estimated PDF and the actual specifications of the circuit performances. The suitability of the resulting reduced functional test set for production test is evaluated in terms of its capability of detecting catastrophic faults.
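
The compaction step can be sketched as follows: fit a multi-normal model to Monte Carlo data, estimate the defect level from a large sample of the model, and drop performances by backward elimination while the estimated defect level stays under a budget. All values here (the random covariance, the ±2σ specification limits, the 1% budget) are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical Monte Carlo data: 5 circuit performances, modelled with a
# multivariate normal ("multi-normal law") fitted to the simulation output.
n_perf = 5
A = rng.normal(size=(n_perf, n_perf))
mc_data = rng.multivariate_normal(np.zeros(n_perf),
                                  A @ A.T + np.eye(n_perf), size=500)

mean = mc_data.mean(axis=0)
cov = np.cov(mc_data, rowvar=False)

# Sample from the fitted model and apply the specification limits.
sample = rng.multivariate_normal(mean, cov, size=200_000)
sigma = cov.diagonal() ** 0.5
lo, hi = mean - 2 * sigma, mean + 2 * sigma
passes = (sample >= lo) & (sample <= hi)   # per-device, per-performance

def defect_level(kept):
    """Defective devices (any spec violated) that pass the kept tests."""
    defective = ~passes.all(axis=1)
    escape = defective & passes[:, kept].all(axis=1)
    return escape.mean()

# Backward elimination: drop the performance whose removal increases the
# estimated defect level the least, as long as it stays under the budget.
kept, budget = list(range(n_perf)), 0.01
while len(kept) > 1:
    cand = min(kept, key=lambda t: defect_level([k for k in kept if k != t]))
    if defect_level([k for k in kept if k != cand]) > budget:
        break
    kept.remove(cand)

print("retained tests:", kept, "defect level:", defect_level(kept))
```

With correlated performances, dropping a test loses little coverage because the remaining tests already catch most of the same defective devices; the defect-level budget makes that trade-off explicit.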

    Analog/RF test ordering in the early stages of production testing

    ISBN 978-1-4673-1073-4. Ordering of analog/RF tests is important for the identification of redundant tests. Most methods for test ordering rely on a representative set of defective devices; at the beginning of production testing, however, there is little or no data on defective devices, and obtaining such data through defect and fault simulation is unrealistic for most advanced analog/RF devices. In this work, we present a method for analog/RF test ordering that uses only data from a small set of functional circuits. A statistical model of the device under test is constructed from this data. The model is then used to sample a large number of virtual circuits, which will also include defective ones. These virtual defective circuits are then used for ordering analog/RF tests using feature selection techniques. Experimental results for an IBM RF front-end demonstrate the validity of this technique for test grading and compaction.
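
A minimal sketch of the key idea above: a model fitted only to functional (all-specs-pass) devices can still yield defective ones when sampled in large numbers, because the fitted distribution keeps some mass beyond the specification limits. The distribution, the specification limit, and the sample sizes are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical early-production situation: only a small set of functional
# circuits is available (devices that pass all three tests).
n_tests = 3
spec = 2.5
raw = rng.multivariate_normal(np.zeros(n_tests),
                              [[1.0, 0.6, 0.0],
                               [0.6, 1.0, 0.3],
                               [0.0, 0.3, 1.0]], size=300)
functional = raw[(np.abs(raw) <= spec).all(axis=1)]  # passing devices only

# Fit a statistical model to the functional devices and sample far into
# the tails: the large virtual population now contains defective circuits,
# which can then feed a feature-selection-based test ordering.
mean = functional.mean(axis=0)
cov = np.cov(functional, rowvar=False)
virtual = rng.multivariate_normal(mean, cov, size=500_000)

virtual_defective = (np.abs(virtual) > spec).any(axis=1)
print("virtual defect rate:", virtual_defective.mean())
```

Note that fitting to truncated (pass-only) data shrinks the estimated spread somewhat, so the virtual defect rate underestimates the true one; the virtual defective circuits are still usable for ranking tests.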

    Minimization of functional tests by statistical modelling of analogue circuits

    ISBN 1-4244-1278-1. In this paper, we address the problem of functional test compaction of analogue circuits by using a statistical model of the performances of the Circuit Under Test (CUT). The statistical model is obtained using data from a Monte Carlo simulation and uses a multi-normal law to estimate the joint probability density function (PDF) of the circuit performances at the design stage. The functional test compaction method is based on the minimization of the defect level, again at the design stage, which is calculated from the estimated PDF and the actual specifications of the circuit performances. The suitability of the resulting reduced functional test set for production test is evaluated in terms of its capability of detecting catastrophic faults.

    Human immunodeficiency virus infection in a child revealed by a massive purulent pericarditis mistaken for a liver abscess due to Staphylococcus aureus

    Massive acute purulent pericarditis in children is a life-threatening disease associated with high mortality. It has usually been described as a complication of bronchopulmonary infection, but it is currently uncommon in the era of antibiotics. Acute, massive purulent pericarditis has rarely been reported in children in association with human immunodeficiency virus (HIV) infection. We report the case of a 10-year-old boy who presented with signs of sepsis and cardiac tamponade due to massive staphylococcal purulent pericarditis complicating a previously unknown HIV infection. The child underwent pericardiectomy and intensive treatment, and survived this life-threatening disease.

    Reduction of functional tests using non-parametric estimation techniques

    The share of testing in the cost of designing and manufacturing integrated circuits keeps growing, hence the need to optimise this now unavoidable step. In this work, a new method for reducing the number of performances to be tested is proposed. It is based on modelling the circuit under test with a non-parametric statistical model (the kernel method). This modelling requires no assumption about the distribution of the performances of the circuit under test. The choice of the optimal model parameters is validated using the Kolmogorov-Smirnov test. Once the model is validated, the method generates a large sample of circuits on which the number of performances is reduced according to the minimum value of the estimated defect level (the proportion of defective circuits that pass the test).
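
A minimal sketch of the non-parametric approach described above, assuming a Gaussian-kernel estimator with Scott's-rule bandwidth and a plain Kolmogorov-Smirnov statistic for model validation; the bimodal test distribution and all constants are illustrative, not taken from the paper.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(3)

# Hypothetical 1-D performance with a bimodal, clearly non-Gaussian
# distribution, where a parametric (normal) model would fit poorly.
perf = np.concatenate([rng.normal(-2.0, 0.5, 400), rng.normal(2.0, 0.5, 600)])
n = perf.size

# Non-parametric kernel model: Gaussian kernels, Scott's rule bandwidth.
h = perf.std(ddof=1) * n ** (-1 / 5)

def kde_cdf(x):
    """CDF of the kernel density model at the points x."""
    z = (np.asarray(x)[:, None] - perf[None, :]) / h
    phi = 0.5 * (1.0 + np.vectorize(erf)(z / sqrt(2)))
    return phi.mean(axis=1)

# Kolmogorov-Smirnov statistic: largest gap between the empirical CDF of
# the sample and the model CDF, used here to validate the fitted model.
xs = np.sort(perf)
emp_hi = np.arange(1, n + 1) / n
emp_lo = np.arange(0, n) / n
model = kde_cdf(xs)
ks_stat = max(np.max(emp_hi - model), np.max(model - emp_lo))
print(f"KS statistic: {ks_stat:.4f}")

# Once validated, the model generates a large synthetic sample (smoothed
# bootstrap: resample the data, then add kernel noise), on which the
# defect level of a reduced test set can be estimated.
synthetic = rng.choice(perf, size=100_000) + rng.normal(0.0, h, 100_000)
```

A small KS statistic indicates the kernel model tracks the empirical distribution closely without assuming any parametric form, which is the point of the method for non-Gaussian performances.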

    Functional test compaction by statistical modelling of analogue circuits

    In this paper, we address the problem of functional test compaction of analogue circuits by using a statistical model of the performances of the Circuit Under Test (CUT). The statistical model is obtained using data from a Monte Carlo simulation and uses a multi-normal law to estimate the joint Probability Density Function (PDF) of the circuit performances at the design stage. The functional test compaction method is based on the minimization of the defect level, again at the design stage, which is calculated from the estimated PDF and the actual specifications of the circuit performances. The suitability of the resulting reduced functional test set for production test must then be evaluated in terms of its capability of detecting catastrophic and parametric faults.

    Using Interval Petri Nets and timed automata for diagnosis of Discrete Event Systems (DES)

    A discrete event system (DES) is a dynamic system that evolves in accordance with the abrupt occurrence, at possibly unknown irregular intervals, of physical events. Because of the special nature of these systems, different tools are currently used for their analysis, design and modeling. The main focus of this paper is the presentation of a new modeling approach for discrete event systems. The proposed approach is based on a hybrid model which combines Interval Constrained Petri Nets (ICPN) and Timed Automata. These tools allow us, respectively, to evaluate quality variations and to manage flow-type disturbances. An example analysis illustrates our approach.