14,124 research outputs found

    Category deficits and paradoxical dissociations in Alzheimer's disease and Herpes Simplex Encephalitis

    Get PDF
    Most studies examining category specificity are single-case studies of patients with living or non-living deficits. Nevertheless, no explicit or agreed criteria exist for establishing category-specific deficits in single cases regarding the type of analyses, whether to compare with healthy controls, the number of tasks, or the type of tasks. We examined two groups of patients with neurological pathology frequently accompanied by impaired semantic memory (19 patients with Alzheimer's disease and 15 with Herpes Simplex Encephalitis). Category knowledge was examined using three tasks (picture naming, naming-to-description and feature verification). Both patient groups were compared with age- and education-matched healthy controls. The profile of each patient was examined for consistency across tasks and across different analyses; however, both proved to be inconsistent. One striking finding was the presence of a paradoxical dissociation (i.e., patients who were impaired on living things on one task and on non-living things on another task). The findings have significant implications for how we determine category effects and, more generally, for the methods used to document double dissociations across individual cases in this literature.

    Regional and Industry Cycles in Australasia: Implications for a Common Currency

    Get PDF
    If two countries experience similar cycles, loss in monetary sovereignty following currency union may not be severe. Analysis of cyclical similarity is frequently carried out at the overall industry level, then interpreted with reference to regional industrial structures. By contrast, this paper explicitly incorporates regional industry structure into an examination of Australasian cycles. Since 1991, NZ and Australasian cycles have been highly correlated, but there is little evidence that the NZ cycle has been "caused" by Australian regional or industry cycles. We test whether the NZDAUD exchange rate has insulated NZ from Australian shocks, but find it has not played a major buffering role in response to Australian industry shocks (including mining shocks). Instead, the strongest impacts on the NZDAUD stem from the NZ cycle. An important loss of monetary sovereignty under currency union may therefore arise in response to NZ-specific shocks.

    Non-invasive estimation of left atrial dominant frequency in atrial fibrillation from different electrode sites: Insight from body surface potential mapping

    Get PDF
    © 2014, CardioFront LLC. All rights reserved. The dominant driving sources of atrial fibrillation are often found in the left atrium, but the expression of left atrial activation on the body surface is poorly understood. Using body surface potential mapping and simultaneous invasive measurements of left atrial activation, our aim was to describe the expression of the left atrial dominant fibrillation frequency across the body surface. Twenty patients in atrial fibrillation were studied. The spatial distributions of the dominant atrial fibrillation frequency across anterior and posterior sites on the body surface were quantified. Their relationship with invasive left atrial dominant fibrillation frequency was assessed by linear regression analysis, and the coefficient of determination was calculated for each body surface site. The correlation between intracardiac and body surface dominant frequency was significantly higher with posterior compared with anterior sites (coefficient of determination 67±8% vs 48±2%).
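
    As a rough, self-contained illustration of the quantities described above, the following Python sketch estimates a dominant frequency as the largest spectral peak within an assumed atrial band and computes a per-site coefficient of determination by linear regression. The signal arrays, sampling rate and 3-12 Hz band are illustrative assumptions; the study's actual preprocessing (e.g. ventricular signal cancellation) is not reproduced here.

    import numpy as np
    from scipy.signal import welch
    from scipy.stats import linregress

    def dominant_frequency(signal, fs, band=(3.0, 12.0)):
        # Largest power-spectral-density peak within the assumed atrial band (Hz).
        freqs, psd = welch(signal, fs=fs, nperseg=min(len(signal), 4096))
        mask = (freqs >= band[0]) & (freqs <= band[1])
        return freqs[mask][np.argmax(psd[mask])]

    def site_r_squared(surface_dfs, la_dfs):
        # Coefficient of determination between one body-surface site's dominant
        # frequency and the invasive left atrial dominant frequency across patients.
        slope, intercept, r, p, se = linregress(surface_dfs, la_dfs)
        return r ** 2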

    Intrinsic dimension of a dataset: what properties does one expect?

    Full text link
    We propose an axiomatic approach to the concept of an intrinsic dimension of a dataset, based on a viewpoint of geometry of high-dimensional structures. Our first axiom postulates that high values of dimension be indicative of the presence of the curse of dimensionality (in a certain precise mathematical sense). The second axiom requires the dimension to depend smoothly on a distance between datasets (so that the dimension of a dataset and that of an approximating principal manifold would be close to each other). The third axiom is a normalization condition: the dimension of the Euclidean n-sphere S^n is Θ(n). We give an example of a dimension function satisfying our axioms, even though it is in general computationally unfeasible, and discuss a computationally cheap function satisfying most but not all of our axioms (the "intrinsic dimensionality" of Chávez et al.). Comment: 6 pages, 6 figures, 1 table, latex with IEEE macros, final submission to Proceedings of the 22nd IJCNN (Orlando, FL, August 12-17, 2007)
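
    For reference, the computationally cheap estimator mentioned above (the "intrinsic dimensionality" of Chávez et al.) is simple to sketch: it is the squared mean of the pairwise distances divided by twice their variance. A minimal Python version follows; the sample dataset is an illustrative assumption.

    import numpy as np
    from scipy.spatial.distance import pdist

    def chavez_intrinsic_dimension(X, metric="euclidean"):
        # rho = mu^2 / (2 * sigma^2), where mu and sigma^2 are the mean and
        # variance of the pairwise-distance distribution of the dataset X.
        d = pdist(X, metric=metric)
        return d.mean() ** 2 / (2.0 * d.var())

    # Illustrative example: 500 points drawn uniformly from a 20-dimensional cube.
    rng = np.random.default_rng(0)
    print(chavez_intrinsic_dimension(rng.uniform(size=(500, 20))))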

    Mobile unemployment in a post-industrial society: The case of Sweden

    Get PDF
    Since the early 1990s, every region in Sweden has been struck by high unemployment, especially among young people. In the same period, the unemployed have been overrepresented among inter-regional migrants. Increasingly, however, this mobility does not lead to employment. Yet this mobility is still largely explained by economic factors. There seems to be a dual spatial pattern to this phenomenon: the unemployed migrate predominantly to (a) metropolitan regions and (b) rural areas. This indicates that the phenomenon is multifaceted. The mobile unemployed have different backgrounds and different aspirations. Unemployed migrants to rural areas are predominantly low-cost seekers who no longer look for a regular job. Unemployed migrants to metropolitan regions are to a large extent recent immigrants and/or young people attracted by the informal segments of the urban labor market. The aim of the paper is to describe emerging patterns of inter-regional migration of the unemployed, to analyze the socio-economic careers of different migrant groups, and to analyze factors leading to mobile unemployment. The factors analyzed include changes in the welfare system and in labor market policy. Finally, the paper discusses the regional economic consequences of the emerging pattern and the policy implications.

    Using machine learning to model dose–response relationships

    Full text link
    Rationale, aims and objectives: Establishing the relationship between various doses of an exposure and a response variable is integral to many studies in health care. Linear parametric models, widely used for estimating dose-response relationships, have several limitations. This paper employs the optimal discriminant analysis (ODA) machine-learning algorithm to determine the degree to which exposure dose can be distinguished based on the distribution of the response variable. By framing the dose-response relationship as a classification problem, machine learning can provide the same functionality as conventional models, but can additionally make individual-level predictions, which may be helpful in practical applications like establishing responsiveness to prescribed drug regimens. Method: Using data from a study measuring the responses of blood flow in the forearm to the intra-arterial administration of isoproterenol (separately for 9 black and 13 white men, and pooled), we compare the results estimated from a generalized estimating equations (GEE) model with those estimated using ODA. Results: Generalized estimating equations and ODA both identified many statistically significant dose-response relationships, separately by race and for pooled data. Post hoc comparisons between doses indicated ODA (based on exact P values) was consistently more conservative than GEE (based on estimated P values). Compared with ODA, GEE produced twice as many instances of paradoxical confounding (findings from analysis of pooled data that are inconsistent with findings from analyses stratified by race). Conclusions: Given its unique advantages and greater analytic flexibility, maximum-accuracy machine-learning methods like ODA should be considered as the primary analytic approach in dose-response applications. Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/134965/1/jep12573_am.pdf http://deepblue.lib.umich.edu/bitstream/2027.42/134965/2/jep12573.pd
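
    A minimal sketch of the classification framing described in this abstract: for two dose levels, search for the response cut-point that maximizes classification accuracy, which is the core idea behind ODA with a binary class variable. The data, function names and accuracy criterion below are illustrative assumptions; the published analysis relied on exact permutation P values and GEE models, which are not reproduced here.

    import numpy as np

    def best_cutpoint(response, dose_label):
        # Return the (cutpoint, accuracy) that best separates the two dose labels
        # using the rule "predict 1 if response > cutpoint" (or its reverse).
        order = np.argsort(response)
        r, y = response[order], dose_label[order]
        best_cut, best_acc = None, 0.0
        for cut in (r[:-1] + r[1:]) / 2.0:  # midpoints between adjacent sorted values
            pred = (r > cut).astype(int)
            acc = max((pred == y).mean(), (pred != y).mean())  # allow either direction
            if acc > best_acc:
                best_cut, best_acc = cut, acc
        return best_cut, best_acc

    # Hypothetical blood-flow responses at a low and a high isoproterenol dose.
    rng = np.random.default_rng(1)
    response = np.concatenate([rng.normal(2.0, 1.0, 13), rng.normal(4.0, 1.0, 13)])
    labels = np.concatenate([np.zeros(13, dtype=int), np.ones(13, dtype=int)])
    print(best_cutpoint(response, labels))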

    Why Congo Persists: Sovereignty, Globalization and the Violent Reproduction of a Weak State

    Get PDF
    Wherever one looks, many elements conspire to suggest that the Democratic Republic of Congo should have collapsed some time ago under the multiple assaults of its own inadequacies as a state, the extreme heterogeneity and polarization of its populations, and the dislocations of globalization and foreign occupation. Yet Congo has gone on defying such expectations and has continued to display a stunning propensity for resilience. This paper tries to explain why Congo persists amid these overwhelming structural obstacles. It focuses particularly on the more recent period when state weakness, foreign invasions, the exploitation of its natural resources by transnational and informal networks, and the multiplicity of domestic rebellions linked to foreign interests have not managed to dent, however slightly, the generalized support that exists for the reproduction of the Congolese state among its elites and regular citizens, foreign political and economic interests, and the international community at large. Observing that, in many parts of Congo, local grievances against the state and the greed of political elites have been magnified by the circumstances of post-Cold War Africa, it takes as paradoxical the continued, broadly unchallenged existence of Congo.

    Photometric redshifts for Quasars in multi band Surveys

    Get PDF
    MLPQNA stands for Multi-Layer Perceptron with Quasi-Newton Algorithm and it is a machine learning method which can be used to cope with regression and classification problems on complex and massive data sets. In this paper we give the formal description of the method and present the results of its application to the evaluation of photometric redshifts for quasars. The data set used for the experiment was obtained by merging four different surveys (SDSS, GALEX, UKIDSS and WISE), thus covering a wide range of wavelengths from the UV to the mid-infrared. The method is able i) to achieve a very high accuracy; ii) to drastically reduce the number of outliers and catastrophic objects; iii) to discriminate among parameters (or features) on the basis of their significance, so that the number of features used for training and analysis can be optimized in order to reduce both the computational demands and the effects of degeneracy. The best experiment, which makes use of a selected combination of parameters drawn from the four surveys, leads, in terms of DeltaZnorm (i.e. (zspec - zphot)/(1 + zspec)), to an average of DeltaZnorm = 0.004, a standard deviation sigma = 0.069 and a Median Absolute Deviation MAD = 0.02 over the whole redshift range (i.e. zspec <= 3.6), defined by the 4-survey cross-matched spectroscopic sample. The fraction of catastrophic outliers, i.e. of objects with photo-z deviating more than 2 sigma from the spectroscopic value, is < 3%, leading to a sigma = 0.035 after their removal, over the same redshift range. The method is made available to the community through the DAMEWARE web application. Comment: 38 pages, Submitted to ApJ in February 2013; Accepted by ApJ in May 201
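
    The quoted statistics follow directly from the stated definitions. The sketch below assumes hypothetical arrays of spectroscopic and photometric redshifts; where the abstract leaves conventions implicit (MAD about zero vs. about the median, and whether the 2-sigma outlier cut applies to the normalized residual), the choices made here are assumptions.

    import numpy as np

    def photoz_statistics(z_spec, z_phot):
        # DeltaZnorm = (zspec - zphot) / (1 + zspec), as defined in the abstract.
        dz = (z_spec - z_phot) / (1.0 + z_spec)
        sigma = dz.std()
        catastrophic = np.abs(dz) > 2.0 * sigma     # assumed 2-sigma convention
        return {
            "mean": dz.mean(),
            "sigma": sigma,
            "MAD": np.median(np.abs(dz)),           # assumed: deviation about zero
            "outlier_fraction": catastrophic.mean(),
            "sigma_after_removal": dz[~catastrophic].std(),
        }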