How to use the Kohonen algorithm to simultaneously analyse individuals in a survey
The Kohonen algorithm (SOM; Kohonen, 1984, 1995) is a very powerful tool for
data analysis. It was originally designed to model organized connections
between some biological neural networks. It was also immediately recognized as
a very good algorithm for vector quantization and, at the same time, for
pertinent classification, with nice properties for visualization. If the
individuals are described by quantitative variables (ratios, frequencies,
measurements, amounts, etc.), a straightforward application of the original
algorithm builds code vectors and associates with each of them the class of
all the individuals that are more similar to that code vector than to the
others. But when individuals are described by categorical (qualitative)
variables with a finite number of modalities (as in a survey), it is
necessary to define a specific algorithm. In this paper, we present a new
algorithm inspired by the SOM algorithm, which provides a simultaneous
classification of the individuals and of their modalities.
Comment: Special issue ESANN 0
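As a rough illustration of the quantitative case described above (a minimal sketch only, not the authors' categorical variant), a self-organizing map update step can be written as follows; the grid size, learning rate, and neighbourhood radius are illustrative assumptions.

```python
import numpy as np

# Minimal SOM sketch for quantitative data (not the paper's categorical variant).
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))          # 500 individuals, 4 quantitative variables

grid_h, grid_w = 5, 5
codes = rng.normal(size=(grid_h, grid_w, X.shape[1]))   # code vectors
coords = np.stack(np.meshgrid(np.arange(grid_h), np.arange(grid_w),
                              indexing="ij"), axis=-1)  # unit positions on the grid

def train(X, codes, epochs=20, lr0=0.5, radius0=2.0):
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)
        radius = max(radius0 * (1 - epoch / epochs), 0.5)
        for x in X:
            # best matching unit: nearest code vector to x
            d = np.linalg.norm(codes - x, axis=-1)
            bmu = np.unravel_index(np.argmin(d), d.shape)
            # neighbourhood weights on the grid, centred at the BMU
            g = np.exp(-np.sum((coords - np.array(bmu)) ** 2, axis=-1)
                       / (2 * radius ** 2))
            # move code vectors towards x, weighted by the neighbourhood
            codes += lr * g[..., None] * (x - codes)
    return codes

codes = train(X, codes)
# each individual is then assigned to the class of its nearest code vector
labels = np.array([np.argmin(np.linalg.norm(codes.reshape(-1, X.shape[1]) - x, axis=1))
                   for x in X])
```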
Recurrence Quantification Analysis and Principal Components in the Detection of Short Complex Signals
Recurrence plots were introduced to help aid the detection of signals in
complicated data series. This effort was furthered by the quantification of
recurrence plot elements. We now demonstrate the utility of combining
recurrence quantification analysis with principal components analysis to allow
for a probabilistic evaluation for the presence of deterministic signals in
relatively short data lengths.Comment: 10 pages, 3 figures; Elsevier preprint, elsart style; programs used
for analysis available for download at http://homepages.luc.edu/~cwebbe
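A hedged sketch of the general idea (not the authors' exact procedure): compute a few recurrence quantification measures over sliding windows of a series and feed them to PCA. The window length, recurrence threshold, and the particular measures are illustrative assumptions.

```python
import numpy as np

def rqa_measures(x, eps=0.1):
    """Recurrence rate and a crude determinism-like measure for one window (illustrative)."""
    n = len(x)
    R = (np.abs(x[:, None] - x[None, :]) < eps).astype(float)   # recurrence matrix
    rr = R.sum() / n**2                                         # recurrence rate
    # 'determinism' proxy: fraction of recurrent points with a recurrent diagonal neighbour
    diag_neighbour = np.zeros_like(R)
    diag_neighbour[:-1, :-1] = R[1:, 1:]
    det = (R * diag_neighbour).sum() / max(R.sum(), 1.0)
    return rr, det

rng = np.random.default_rng(1)
series = np.sin(np.linspace(0, 40, 2000)) + 0.5 * rng.normal(size=2000)

win = 100
features = np.array([rqa_measures(series[i:i + win])
                     for i in range(0, len(series) - win, win)])

# PCA via SVD of the centred feature matrix
centred = features - features.mean(axis=0)
U, s, Vt = np.linalg.svd(centred, full_matrices=False)
scores = centred @ Vt.T          # windows projected onto principal components
explained = s**2 / np.sum(s**2)  # variance explained by each component
```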
Predictors and outcome impact of perioperative serum sodium changes in a high-risk population.
BACKGROUND: The perioperative period may be associated with a marked neurohumoral stress response, significant fluid losses, and varied fluid replacement regimes. Acute changes in serum sodium concentration are therefore common, but predictors and outcomes of these changes have not been investigated in a large surgical population.
METHODS: We carried out a retrospective cohort analysis of 27 068 in-patient non-cardiac surgical procedures in a tertiary teaching hospital setting. Data on preoperative conditions, perioperative events, hospital length of stay, and mortality were collected, along with preoperative and postoperative serum sodium measurements up to 7 days after surgery. Logistic regression was used to investigate the association between sodium changes and mortality, and to identify clinical characteristics associated with a deviation from baseline sodium >5 mmol litre⁻¹.
RESULTS: Changes in sodium concentration >5 mmol litre⁻¹ were associated with increased mortality risk (adjusted odds ratio 1.49 for a decrease, 3.02 for an increase). Factors independently associated with a perioperative decrease in serum sodium concentration >5 mmol litre⁻¹ included age >60, diabetes mellitus, and the use of patient-controlled opioid analgesia. Factors associated with a similar increase were preoperative oxygen dependency, mechanical ventilation, central nervous system depression, non-elective surgery, and major operative haemorrhage.
CONCLUSIONS: Maximum deviation from preoperative serum sodium value is associated with increased hospital mortality in patients undergoing in-patient non-cardiac surgery. Specific preoperative and perioperative factors are associated with significant serum sodium changes.
This work was supported by the Cambridge University Division of Anaesthesia. This is the author accepted manuscript. The final version is available from Oxford University Press via http://dx.doi.org/10.1093/bja/aeu40
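A hedged sketch of the kind of analysis described in the METHODS section (logistic regression of in-hospital mortality on indicators of perioperative sodium change, adjusted for a few covariates); the column names, synthetic data, and the use of statsmodels are assumptions, not the authors' code or cohort.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for the cohort (assumed, illustrative column names).
rng = np.random.default_rng(2)
n = 5000
df = pd.DataFrame({
    "died": rng.binomial(1, 0.05, n),
    "sodium_decrease_gt5": rng.binomial(1, 0.15, n),   # fall >5 mmol/L from baseline
    "sodium_increase_gt5": rng.binomial(1, 0.10, n),   # rise >5 mmol/L from baseline
    "age_over_60": rng.binomial(1, 0.4, n),
    "diabetes": rng.binomial(1, 0.2, n),
    "non_elective": rng.binomial(1, 0.3, n),
})

# Logistic regression: odds of death as a function of sodium change, adjusted
# for covariates. Exponentiated coefficients are adjusted odds ratios.
model = smf.logit(
    "died ~ sodium_decrease_gt5 + sodium_increase_gt5 + age_over_60 + diabetes + non_elective",
    data=df,
).fit(disp=False)

odds_ratios = np.exp(model.params)
print(odds_ratios)
```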
On the Schoenberg Transformations in Data Analysis: Theory and Illustrations
The class of Schoenberg transformations, embedding Euclidean distances into
higher dimensional Euclidean spaces, is presented, and derived from theorems on
positive definite and conditionally negative definite matrices. Original
results on the arc lengths, angles and curvature of the transformations are
proposed, and visualized on artificial data sets by classical multidimensional
scaling. A simple distance-based discriminant algorithm illustrates the theory,
which is intimately connected to the Gaussian kernels of Machine Learning.
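As a small illustration (a hedged sketch, not the paper's construction), one standard Schoenberg-type transformation maps squared Euclidean distances D to 1 − exp(−λD), which is linked to the Gaussian kernel; the code below applies such a transformation to an artificial data set and embeds the result by classical multidimensional scaling. The choice of transformation and the value of λ are illustrative assumptions.

```python
import numpy as np

# Apply a Schoenberg-type transformation to squared Euclidean distances and
# embed the result by classical MDS.
rng = np.random.default_rng(3)
X = rng.normal(size=(100, 2))                       # artificial data set

diff = X[:, None, :] - X[None, :, :]
D2 = np.sum(diff**2, axis=-1)                       # squared Euclidean distances

lam = 0.5
D2_t = 1.0 - np.exp(-lam * D2)                      # transformed squared distances

# Classical MDS: double-centre the transformed distances to get a Gram matrix,
# then scale the leading eigenvectors by the square roots of the eigenvalues.
n = D2_t.shape[0]
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ D2_t @ J
vals, vecs = np.linalg.eigh(B)
order = np.argsort(vals)[::-1]
vals, vecs = vals[order], vecs[:, order]
k = 2
Y = vecs[:, :k] * np.sqrt(np.maximum(vals[:k], 0))  # 2-D embedding of the transformed distances
```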
Seeking systematicity in variation: theoretical and methodological considerations on the “variety” concept
A long-standing discussion in linguistics concerns whether languages, or linguistic systems, are essentially homogeneous or rather show “structured heterogeneity.” In this contribution, we address the question of whether and how sociolinguistically defined systems (or ‘varieties’) are to be distinguished in a heterogeneous linguistic landscape: to what extent can structure be found in the myriad of language variants heard in everyday language use? We first elaborate on the theoretical importance of this ‘variety question’ by relating it to current approaches from, among others, generative linguistics (competing grammars), sociolinguistics (style-shifting, polylanguaging), and cognitive linguistics (prototype theory). Possible criteria for defining and detecting varieties are introduced, which are subsequently tested empirically, using a self-compiled corpus of spoken Dutch in West Flanders (Belgium). This empirical study demonstrates that the speech repertoire of the studied West Flemish speakers consists of four varieties, viz. a fairly stable dialect variety, a more or less virtual standard Dutch variety, and two intermediate varieties, which we label ‘cleaned-up dialect’ and ‘substandard.’ On the methodological level, this case study underscores the importance of speech corpora comprising both inter- and intra-speaker variation on the one hand, and the merits of triangulating qualitative and quantitative approaches on the other.
Refining patterns of joint hypermobility, habitus, and orthopedic traits in joint hypermobility syndrome and Ehlers–Danlos syndrome, hypermobility type
Joint hypermobility syndrome (JHS) and Ehlers-Danlos syndrome, hypermobility type (EDS-HT) are two overlapping heritable disorders (JHS/EDS-HT) recognized by separate sets of diagnostic criteria and still lacking a confirmatory test. This descriptive study aimed to better characterize the clinical phenotype of JHS/EDS-HT, with a focus on the available diagnostic criteria, in order to propose novel features and assessment strategies. One hundred and eighty-nine patients (163 females, 26 males; age 2-73 years) from two Italian reference centers were investigated for Beighton score, range of motion in 21 additional joints, rate and sites of dislocations and sprains, recurrent soft-tissue injuries, tendon and muscle ruptures, body mass index, arm span/height ratio, wrist and thumb signs, and 12 additional orthopedic features. Crude rates were compared by age, sex, and handedness with a series of parametric and non-parametric tools. Multiple correspondence analysis was carried out to look for possible co-segregation of features. Beighton score and hypermobility at other joints were influenced by age at diagnosis. Rate and sites of joint instability complications did not vary by age at diagnosis, except for soft-tissue injuries. No major difference was registered by sex or by dominant versus non-dominant body side. At multiple correspondence analysis, selected features tended to co-segregate in a dichotomous distribution. Dolichostenomelia and arachnodactyly segregated independently. This study pointed out a more protean musculoskeletal phenotype than previously considered according to the available diagnostic criteria for JHS/EDS-HT. Our findings corroborate the need for a re-thinking of JHS/EDS-HT on clinical grounds in order to find better therapeutic and research strategies.
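For readers unfamiliar with the method mentioned above, the following is a hedged sketch of multiple correspondence analysis on binary clinical features via the indicator matrix; the feature names and data are invented placeholders, not the study's variables.

```python
import numpy as np
import pandas as pd

# Invented binary features standing in for the clinical traits.
rng = np.random.default_rng(4)
n = 189
df = pd.DataFrame({
    "recurrent_dislocations": rng.binomial(1, 0.4, n),
    "soft_tissue_injuries":   rng.binomial(1, 0.5, n),
    "arachnodactyly":         rng.binomial(1, 0.2, n),
    "wrist_sign":             rng.binomial(1, 0.3, n),
})

# Indicator (disjunctive) matrix: one column per category of each variable.
Z = pd.get_dummies(df.astype(str)).to_numpy(dtype=float)

# Correspondence analysis of the indicator matrix (i.e. MCA).
P = Z / Z.sum()
r = P.sum(axis=1)                                     # row masses
c = P.sum(axis=0)                                     # column masses
S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))    # standardized residuals
U, s, Vt = np.linalg.svd(S, full_matrices=False)

row_coords = (U * s) / np.sqrt(r)[:, None]      # patients in principal coordinates
col_coords = (Vt.T * s) / np.sqrt(c)[:, None]   # feature categories in principal coordinates
inertia = s**2 / np.sum(s**2)                   # share of inertia per dimension
```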
Learning from failure: a case study of applying Creative Problem Solving to the school Educational Project
The Educational Project (Projeto Educativo), as a fundamental instrument for school autonomy in Portugal, must bring together the main expectations of the school community, which demands rigour both in the research methodology used to draw it up and in the implementation of the required change. This article reports how this was achieved in a secondary school over more than a year, through the use of the Creative Problem Solving method, which involved the whole school community. Following the defined plan, the preparation of the Educational Project followed the steps of a research project and resulted in a strategic and operational document, which was subsequently subject to several implementation attempts that, however, achieved only partial success. The discussion of the conditions needed for the final document to serve as a basis for implementing the defined policies and actions, which would simplify the whole scheme of school operation from a perspective of effective organizational management, is begun here and suggested for future research.
An evaluation methodology for crowdsourced design
In recent years, the “power of the crowd” has been repeatedly demonstrated, and various Internet platforms have been used to support applications of collaborative intelligence to tasks ranging from open innovation to image analysis. However, crowdsourcing applications in the fields of design research and creative innovation have been much slower to emerge. So, although there have been reports of systems and researchers using Internet crowdsourcing to carry out generative design, there are still many gaps in knowledge about the capability and limitations of the technology. Indeed, the process models developed to support traditional commercial design (e.g. Pugh’s Total Design, Agile, Double-Diamond, etc.) have yet to be established for Crowdsourced Design. As a contribution to the development of such a general model, this paper proposes the cDesign framework to support the creation of Crowdsourced Design activities. Within the cDesign framework, the effective evaluation of design quality is identified as a key component that not only enables the leveraging of a large virtual workforce’s creative activities but is also fundamental to most iterative and optimisation processes. This paper reports an experimental investigation (developed using the cDesign framework) into two different Crowdsourced Design evaluation approaches: free evaluation and ‘crowdsourced Design Evaluation Criteria’ (cDEC). The results are benchmarked against an evaluation carried out by a panel of experienced designers. The results suggest that the cDEC approach produces design rankings that correlate strongly with the judgements of an “expert panel”. The paper concludes that the cDEC assessment methodology demonstrates how Crowdsourcing can be effectively used to evaluate, as well as generate, new design solutions.
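A hedged sketch of the benchmarking step described above: comparing design rankings produced by two crowd evaluation approaches against an expert panel using Spearman rank correlation. The scores are invented placeholders, not the study's data.

```python
import numpy as np
from scipy.stats import spearmanr

# Placeholder scores for ten candidate designs (invented for illustration).
expert_scores    = np.array([8.1, 6.4, 7.9, 5.2, 9.0, 4.8, 6.9, 7.2, 5.9, 8.4])
free_eval_scores = np.array([7.0, 6.8, 7.1, 5.5, 8.2, 5.9, 6.1, 7.4, 6.0, 7.8])
cdec_scores      = np.array([7.9, 6.2, 7.7, 5.0, 8.8, 5.1, 6.7, 7.0, 6.1, 8.2])

# Spearman rank correlation between each crowd approach and the expert panel.
rho_free, p_free = spearmanr(expert_scores, free_eval_scores)
rho_cdec, p_cdec = spearmanr(expert_scores, cdec_scores)

print(f"free evaluation vs expert panel: rho={rho_free:.2f} (p={p_free:.3f})")
print(f"cDEC vs expert panel:            rho={rho_cdec:.2f} (p={p_cdec:.3f})")
```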
