10 research outputs found
Risk Assessment of Deadly Economic Socio-Political Crisis with Correlational Network and Convolutional Neural Network
From social analysis to biology to machine learning, graphs occur naturally in a wide range of applications. In contrast to studying data points one at a time, graphs capture structural relationships among data and can therefore yield additional insights. Nevertheless, learning from graphs can be difficult: meaningful connectivity must exist between the data, and the form of the data (text, numbers, or categories) must allow a graph to be built from their relationships. This research investigates hidden patterns in the variation of development indicators and in the severe socio-political crises that have occurred in low-income countries. Evidence of a correlation between socio-political crises and development indicators suggests that a method to assess the risk of crisis should consider the context of each country, as well as the relative means of crisis. This research reviewed different risk assessment methods and proposes a novel method, based on a weighted correlation network and a convolutional neural network, to generate images representing the signature of development indicators that correlate with a crisis. The convolutional neural network, trained to identify changes in indicators, can find countries with similar signatures and provide insights about important indicators that might reduce the number of deadly crises in a country. This research enhances the knowledge of developing a quantitative risk assessment for crisis prevention with development indicators.
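As a rough illustration of the kind of pipeline the abstract describes, the sketch below builds a weighted correlation "image" from one country's indicator time series and feeds it to a small convolutional classifier. It assumes NumPy and PyTorch; the indicator matrix, network layout and class labels are illustrative placeholders, not the authors' actual implementation.

```python
import numpy as np
import torch
import torch.nn as nn

def correlation_image(indicators: np.ndarray) -> np.ndarray:
    """indicators: (years, n_indicators) matrix for one country.
    Returns the pairwise correlation matrix rescaled to [0, 1] for use as an image."""
    corr = np.corrcoef(indicators, rowvar=False)   # Pearson correlation between indicators
    corr = np.nan_to_num(corr)                     # constant indicators would give NaN rows
    return (corr + 1.0) / 2.0                      # map [-1, 1] onto [0, 1]

class SignatureCNN(nn.Module):
    """Tiny convolutional classifier over n_ind x n_ind correlation 'images'."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(16, 2)               # crisis vs. no-crisis

    def forward(self, x):                          # x: (batch, 1, n_ind, n_ind)
        return self.head(self.features(x).flatten(1))

# Usage: img = correlation_image(country_matrix)
#        logits = SignatureCNN()(torch.tensor(img, dtype=torch.float32)[None, None])
```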
Judgement Analysis of Patient Management: General Practitioners' Policies and Self-Insight.
In this thesis, judgement analysis (multiple linear regression techniques) was used to look at both GPs' decisions to prescribe certain types of drug for patients and their judgements of patients' risk of coronary heart disease. All of these were idiographic analyses in that decision making by each GP was modelled separately. Judgement analysis (paramorphically) describes a subject's judgement or decision making policy in terms of the relative influence of different pieces of information.

The amount of information subjects could take into account was limited. For all types of judgement or decision, doctors were influenced on average by only four of the twelve or thirteen cues available.

The decision to prescribe one of the types of drug was modelled not only in terms of the individual effects of cues (judgement analysis) but also in terms of the influence of the doctor's assessment of the patient's risk. Doctors agreed more about judgements of risk, and the factors influencing these, than about prescription. Doctors only prescribed to patients they rated as at high risk, but factors such as smoking behaviour led some doctors not to prescribe to individuals in this group.

Judgement and decision making policies (explicit policies) were also elicited verbally from doctors. These showed greater agreement than the policies captured using judgement analysis (tacit policies) did. When these explicit policies were compared to tacit policies, a moderate amount of correspondence was found. However, doctors tended to over-rate the importance of certain cues. A number of explanations for this pattern of self-insight were investigated, including the possibilities that doctors have self-insight but are unable to state it, and that the pattern was an artefact of linear modelling. Both of these hypotheses were rejected. Subjects' explicit policies were found to resemble the pattern of selection of information more than the pattern of its use. Both the hypothesis that subjects' explicit policies were based on phenomenal knowledge and the hypothesis that they are based on some ideal model (influencing which cues are selected) were supported.
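A minimal sketch of the idiographic judgement-analysis approach described above: an ordinary least-squares model is fitted for each doctor separately, and standardised coefficients are read as the relative influence of each cue. NumPy is assumed; the cue set and data are hypothetical, not the thesis's materials.

```python
import numpy as np

def cue_weights(cues: np.ndarray, judgements: np.ndarray) -> np.ndarray:
    """cues: (n_patients, n_cues); judgements: (n_patients,) risk ratings by one GP.
    Returns standardised regression (beta) weights, read as relative cue influence."""
    X = (cues - cues.mean(0)) / cues.std(0)            # z-score each cue
    y = (judgements - judgements.mean()) / judgements.std()
    betas, *_ = np.linalg.lstsq(X, y, rcond=None)      # OLS; no intercept needed on centred data
    return betas

# Hypothetical example: four cues (age, cholesterol, smoker, blood pressure), one GP.
rng = np.random.default_rng(0)
cues = rng.normal(size=(40, 4))
judgements = 0.6 * cues[:, 1] + 0.3 * cues[:, 2] + rng.normal(scale=0.5, size=40)
print(np.round(cue_weights(cues, judgements), 2))      # relative influence of each cue
```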
Cognitive Foundations for Visual Analytics
In this report, we provide an overview of scientific/technical literature on information visualization and VA. Topics discussed include an update and overview of the extensive literature search conducted for this study, the nature and purpose of the field, major research thrusts, and scientific foundations. We review methodologies for evaluating and measuring the impact of VA technologies as well as taxonomies that have been proposed for various purposes to support the VA community. A cognitive science perspective underlies each of these discussions
CACIC 2015: XXI Congreso Argentino de Ciencias de la Computación. Libro de actas
Proceedings of the XXI Argentine Congress of Computer Science (CACIC 2015), held at the UNNOBA Junín campus, 5-9 October 2015. Red de Universidades con Carreras en Informática (RedUNCI).
A NEW TREATMENT OF LOW PROBABILITY EVENTS WITH PARTICULAR APPLICATION TO NUCLEAR POWER PLANT INCIDENTS
Technological innovation is inescapable if civilisation is to continue in the face of population growth, rising expectations and resource exhaustion. Unfortunately, major innovations, confidently thought to be safe, occasionally fail catastrophically. The fears so engendered are impeding technical progress generally and that of nuclear power in particular. Attempts to allay disquiet about these disastrous Low Probability Events (LPEs) by exhaustive studies of nuclear power plant designs have, so far, been less than successful. The New Treatment adopts instead an approach that, after examination of the LPE in its historical and societal settings, combines theoretical design analysis with construction site and operational realities in pragmatic engineering, the quality of which can be assured by accountable inspection.

The LPE is envisaged as a singularity in a stream of largely mundane, but untoward, incidents described as 'Event-noise'. Predictions of the likelihood of plant LPEs by frequency-theory probability are illusory because the LPE is unique and not part of a stable distribution. Again, noise analysis seems to lead to intractable mathematical expressions. While theoretical LPE prognostications depend on the identification of fault sequences in design that can either be designed out or reduced to plausibly negligible probabilities, the reality of LPE prevention lies with the plant in operation. As absolute safety is unattainable, the approach aims at ensuring that the perceived residual nuclear risk is societally tolerable. An adaptation of elementary Catastrophe theory to model the prospective Event-noise field to be experienced by the plant is proposed, whereby potential, credible LPEs could be more readily discerned and avoided.

In this milieu of increasing sophistication in technology, when management in the traditional administrative mould is proving inadequate, the engineer emerges as the proper central decision-maker. The special intellectual capability needed is acquired during his training and experience, a claim that can draw support from new studies in neuropsychology.

The Nuclear Installations Inspectorate is cited as an exemplar of a body practising the kind of engineering inspection needed to apprehend those human fallibilities to which most catastrophic failures of technology are due. Nevertheless, such regulatory systems lack accountability and, as Gödel's theorem suggests, cannot assess their own efficiency. Independent appraisal by Signal Detection Theory is suggested as a remedy.
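For the Signal Detection Theory appraisal suggested above, a sensitivity index such as d' separates an inspectorate's ability to discriminate genuine faults from its response bias. The sketch below is a generic illustration assuming SciPy; the hit and false-alarm counts are hypothetical inspection outcomes, not data from the thesis.

```python
from scipy.stats import norm

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Sensitivity index d' from an inspector's record of detected vs. missed faults."""
    hit_rate = hits / (hits + misses)
    fa_rate = false_alarms / (false_alarms + correct_rejections)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# Hypothetical record: 45 faults caught, 5 missed, 10 false alarms in 100 clear cases.
print(d_prime(hits=45, misses=5, false_alarms=10, correct_rejections=90))  # ≈ 2.56
```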
Evaluating Evaluations of Clinical Decision Support Systems: Case Studies From NHS Clinical Settings
The NHS is under increasing pressure to cut costs while delivering high-quality care. At the same time, the demand for healthcare services has grown, driven in part by the increasing number of older people in the population. NHS Trusts are adopting clinical decision support systems (CDSSs) to help decision making at the point of care. CDSSs are said to bring benefits such as improvements in guideline adherence, clinical processes and user performance, but evidence of these benefits is not always available and their effectiveness in terms of improving patient outcomes is often open to question. This thesis presents research carried out in a large teaching NHS Trust looking at the evaluations of three CDSSs. Semi-structured interviews were carried out with key informants who were involved in their adoption, use and evaluation. Documentary analysis and observations were also used to augment the interviews. Most evaluations were carried out informally by the developers and were primarily driven by external regulatory pressures rather than patient outcomes and organisational needs. Evaluation documentation was inadequate or missing, making it difficult to assess these evaluations systematically. This thesis contends that evaluations are important to provide decision makers in NHS Trusts with adequate information to make decisions about CDSSs and computerised healthcare information technologies in general. NHS Trusts need to build organisational capacity and readiness to enable them to carry out evaluations effectively, so as to provide meaningful information, gain a better understanding of CDSSs, inform their successful adoption, implementation and usage, and justify the resources allocated. This research shows that the CDSS evaluations investigated took a predominantly narrow view; it thus provides evidence of the need for a more systemic approach to evaluation.
Towards Solving the Table Maker's Dilemma on GPU
Since 1985, the IEEE 754 standard has defined formats, rounding modes and basic operations for floating-point arithmetic. In 2008 the standard was extended, and recommendations were added about the rounding of some elementary functions such as trigonometric functions (cosine, sine, tangent and their inverses), exponentials, and logarithms. However, to guarantee the exact rounding of these functions one has to approximate them with a sufficient precision. Finding this precision is known as the Table Maker's Dilemma. To determine this precision, it is necessary to find the hardest-to-round arguments of these functions. Lefèvre et al. proposed in 1998 an algorithm which improves on the exhaustive search by computing a lower bound on the distance between a line segment and a grid. We present in this paper an analysis of this algorithm in order to deploy it efficiently on GPU. We obtain a speedup of 15.4 on an NVIDIA Fermi GPU over a single high-end CPU core.
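To make the Table Maker's Dilemma concrete, the sketch below shows the naive exhaustive search that Lefèvre-style algorithms accelerate: evaluate the function at high precision and measure how close each result falls to a rounding breakpoint of the 53-bit double format. It assumes the mpmath library; the function (exp), interval and grid size are illustrative choices, not the paper's parameters.

```python
import numpy as np
from mpmath import mp, mpf, exp, frexp, floor

mp.prec = 200                                    # evaluate far beyond double precision

def badness(x: float) -> float:
    """Distance, as a fraction of one ulp of the 53-bit result, from exp(x) to the
    nearest rounding breakpoint (a representable double or a midpoint between two
    doubles); tiny values mark hard-to-round arguments."""
    m, _ = frexp(exp(mpf(x)))                    # significand m in [0.5, 1)
    scaled = m * 2**53                           # scale to the 53-bit significand
    frac = scaled - floor(scaled)                # fractional part decides the rounding
    return float(min(abs(frac - mpf('0.5')), frac, 1 - frac))

# Scan a small grid of double-precision arguments and report the hardest to round.
grid = np.linspace(0.5, 1.5, 10_000)
worst = min(grid, key=badness)
print(worst, badness(worst))
```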