
    Down-Sampling coupled to Elastic Kernel Machines for Efficient Recognition of Isolated Gestures

    In the field of gestural action recognition, many studies have focused on dimensionality reduction along the spatial axis, to reduce both the variability of gestural sequences expressed in the reduced space and the computational complexity of their processing. Noticeably, very few of these methods have explicitly addressed dimensionality reduction along the time axis. This is, however, a major issue with regard to the use of elastic distances, which are characterized by a quadratic complexity. To partially fill this apparent gap, we present in this paper an approach based on temporal down-sampling associated with elastic kernel machine learning. We show experimentally, on two data sets that are widely referenced in the domain of human gesture recognition and very different in terms of motion capture quality, that it is possible to significantly reduce the number of skeleton frames while maintaining a good recognition rate. The method gives satisfactory results at a level currently reached by state-of-the-art methods on these data sets, and the reduction in computational complexity makes the approach eligible for real-time applications. Comment: ICPR 2014, International Conference on Pattern Recognition, Stockholm, Sweden (2014).
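    The core idea, shortening skeleton sequences before applying an elastic (DTW-based) kernel, can be illustrated with a minimal sketch. The uniform sub-sampling rate, the Gaussian kernel over the DTW distance, and all function names below are illustrative assumptions; the paper's exact down-sampling scheme and kernel formulation may differ.

```python
import numpy as np

def downsample(sequence, keep_every=4):
    """Uniform temporal down-sampling: keep one skeleton frame out of `keep_every`."""
    return sequence[::keep_every]

def dtw_distance(a, b):
    """Dynamic-time-warping distance between two (n_frames, n_features) arrays.

    Per-frame cost is the Euclidean distance; overall cost is O(len(a) * len(b)),
    which is why shortening both sequences first pays off.
    """
    n, m = len(a), len(b)
    acc = np.full((n + 1, m + 1), np.inf)
    acc[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            acc[i, j] = cost + min(acc[i - 1, j], acc[i, j - 1], acc[i - 1, j - 1])
    return acc[n, m]

def elastic_kernel(a, b, gamma=0.1, keep_every=4):
    """Gaussian kernel over the DTW distance of down-sampled sequences
    (one plausible 'elastic kernel'; illustrative, not the paper's exact kernel)."""
    d = dtw_distance(downsample(a, keep_every), downsample(b, keep_every))
    return np.exp(-gamma * d ** 2)
```

    Because DTW is quadratic in sequence length, keeping one frame in four reduces the per-pair cost by roughly a factor of sixteen; the resulting kernel matrix could then be passed to a kernel machine such as an SVM with a precomputed kernel.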

    Wide QRS-complex tachycardia with alternation of the QRS complexes and RR intervals

    A patient with sustained monomorphic ventricular tachycardia developed a second tachycardia of incessant character, with alternating RR intervals and alternating QRS complexes, after treatment with amiodarone. A retrograde His bundle deflection was recorded during the latter arrhythmia, which was induced with both atrial and ventricular stimulation. The "alternating" tachycardia could be explained by alternating bundle branch reentry and alternating conduction in the myocardium. This view is supported by the finding of an alternating mid-diastolic potential. This case report introduces a new electrocardiographic entity in the spectrum of wide QRS tachycardias.

    Elacestrant (oral selective estrogen receptor degrader) Versus Standard Endocrine Therapy for Estrogen Receptor–Positive, Human Epidermal Growth Factor Receptor 2–Negative Advanced Breast Cancer: Results From the Randomized Phase III EMERALD Trial

    Keywords: Elacestrant; Breast cancer. PURPOSE: Patients with pretreated estrogen receptor (ER)–positive/human epidermal growth factor receptor 2 (HER2)–negative advanced breast cancer have a poor prognosis. Elacestrant is a novel, oral selective ER degrader that demonstrated activity in early studies. METHODS: This randomized, open-label, phase III trial enrolled patients with ER-positive/HER2-negative advanced breast cancer who had received one or two prior lines of endocrine therapy (pretreatment with a cyclin-dependent kinase 4/6 inhibitor was required) and no more than one line of chemotherapy. Patients were randomly assigned to elacestrant 400 mg orally once daily or standard-of-care (SOC) endocrine monotherapy. Primary end points were progression-free survival (PFS) by blinded independent central review in all patients and in patients with detectable ESR1 mutations. RESULTS: Patients were randomly assigned to elacestrant (n = 239) or SOC (n = 238). An ESR1 mutation was detected in 47.8% of patients, and 43.4% had received two prior endocrine therapies. PFS was prolonged in all patients (hazard ratio = 0.70; 95% CI, 0.55 to 0.88; P = .002) and in patients with ESR1 mutation (hazard ratio = 0.55; 95% CI, 0.39 to 0.77; P = .0005). Treatment-related grade 3/4 adverse events occurred in 7.2% of patients receiving elacestrant and 3.1% receiving SOC. Treatment-related adverse events leading to treatment discontinuation occurred in 3.4% of the elacestrant arm versus 0.9% of the SOC arm. Nausea of any grade occurred in 35.0% receiving elacestrant and 18.8% receiving SOC (grade 3/4, 2.5% and 0.9%, respectively). CONCLUSION: Elacestrant is the first oral selective ER degrader to demonstrate a significant PFS improvement versus SOC, both in the overall population and in patients with ESR1 mutations, with manageable safety in a phase III trial for patients with ER-positive/HER2-negative advanced breast cancer.

    ESMO recommendations on the use of circulating tumour DNA assays for patients with cancer: a report from the ESMO Precision Medicine Working Group

    Keywords: Circulating tumour DNA (ctDNA); Liquid biopsy; Precision medicine. Circulating tumour DNA (ctDNA) assays conducted on plasma are rapidly developing a strong evidence base for use in patients with cancer. The European Society for Medical Oncology convened an expert working group to review the analytical and clinical validity and utility of ctDNA assays. For patients with advanced cancer, validated and adequately sensitive ctDNA assays have utility in identifying actionable mutations to direct targeted therapy, and may be used in routine clinical practice, provided the limitations of the assays are taken into account. Tissue-based testing remains the preferred test for many cancer patients, owing to the limitations of ctDNA assays in detecting fusion events and copy number changes, although ctDNA assays may be routinely used when faster results will be clinically important, or when tissue biopsies are not possible or inappropriate. Reflex tumour testing should be considered following a non-informative ctDNA result, because of the possibility of false-negative ctDNA results. In patients treated for early-stage cancers, detection of molecular residual disease or molecular relapse has strong evidence of clinical validity in anticipating future relapse in many cancers. Molecular residual disease/molecular relapse detection cannot, however, be recommended in routine clinical practice, as there is currently no evidence of clinical utility in directing treatment. Additional potential applications of ctDNA assays, under research development and not recommended for routine practice, include identifying patients not responding to therapy through early dynamic changes in ctDNA levels, monitoring therapy for the development of resistance mutations before clinical progression, and screening asymptomatic people for cancer. Recommendations are made for the reporting of results, the future development of ctDNA assays and future clinical research. This project was funded by the European Society for Medical Oncology (no grant number).

    Appropriate sequence for afatinib and cisplatin combination improves anticancer activity in head and neck squamous cell carcinoma

    Despite a better understanding of the pathogenesis of head and neck tumors, as well as improvements in radiotherapy and surgery, locally advanced head and neck squamous cell carcinoma (HNSCC) still carries a poor prognosis. One promising target is the epidermal growth factor receptor (EGFR), which is overexpressed in the majority of HNSCC and is associated with tumor progression and resistance to treatment. However, in several clinical trials, the combination of EGFR inhibitors with chemotherapy and/or radiotherapy has produced only moderate results. In this study, we investigated the anti-tumor activity of afatinib, an irreversible pan-EGFR inhibitor, combined with cisplatin under different schedules of exposure. To this end, we used two human EGFR wild-type HNSCC cell lines and evaluated the cytotoxicity of the two drugs combined in different sequences. The efficacy of each strategy was assessed by evaluating the effects on cell cycle distribution, DNA damage, cell death and downstream pathways of ErbB family receptors. We demonstrated that cisplatin treatment followed by afatinib exposure had a stronger cytotoxic effect than the opposite timing or than simultaneous exposure. This higher anticancer activity is probably due to afatinib-induced cell cycle arrest, which prevents the repair of cisplatin-induced DNA damage and promotes cell death by various mechanisms including apoptosis. These data highlight the importance of an appropriate timing of administration between an EGFR inhibitor and conventional chemotherapy in order to obtain the best clinical benefit for patients with head and neck cancer.

    Negotiating the Web Science Curriculum through Shared Educational Artefacts

    EXTENDED ABSTRACT

    The far-reaching impact of the Web on society is widely recognised and acknowledged. The interdisciplinary study of this impact has crystallised in the field of study known as Web Science. However, defining an agreed, shared understanding of what constitutes Web Science requires complex negotiation and translation of understandings across component disciplines, national cultures and educational traditions. Some individual institutions have already established particular curricula, and discussions in the Web Science Curriculum Workshop series have marked out the territory to some extent. This paper reports on a process being adopted across a consortium of partners to systematically create a shared understanding of what constitutes Web Science. It records and critiques the processes instantiated to agree a common curriculum, and presents a framework for future discussion and development.

    The need to study the Web in its complexity, development and impact led to the creation of Web Science. Web Science is inherently interdisciplinary. Its goal is to: a) understand the mechanisms of Web growth; b) create approaches that allow new, more powerful and more beneficial mechanisms to emerge. Teaching Web Science is a unique experience because the emerging discipline combines two essential features. On the one hand, the analysis of microscopic laws extrapolated to the macroscopic realm generates the observed behaviour. On the other hand, languages and algorithms on the Web are built in order to produce novel desired computer behaviour that should be put in context. Finding a suitable curriculum that goes beyond the study of languages, algorithms, interaction patterns and business processes is thus an important and challenging task, for the simple reason that we believe the future of sociotechnical systems will lie in their innovative power (inventing new ways to solve problems) rather than in their capacity to optimise current practices.

    The Web Science Curriculum Development (WSCD) Project focuses European expertise on this interdisciplinary endeavour, with the ultimate aim of designing a joint master's programme in Web Science between the partner universities. The process of curriculum definition is being addressed through a negotiation process which mirrors the Web Science and engineering approach described by Berners-Lee (Figure 1). The process starts on the engineering side (right). From the technical design point of view, the consortium is creating an open repository of shared educational artefacts using EdShare [1] (based on EPrints) to collect or reference the whole range of educational resources being used in our various programmes. Socially, these resources will be annotated against a curriculum categorisation [2], which is itself subject to negotiation and change, currently via a wiki. This last process is represented by the complexity and collaboration at the bottom of the diagram. The resources necessarily extend beyond artefacts used in the lecture and seminar room, encompassing artefacts associated with the administrative and organisational processes which are necessary to assure the comparability of the educational resources and underwrite the quality standards of the associated awards. (Figure 1: Web Science and Engineering Approach; see http://www.w3.org/2007/Talks/0314-soton-tbl/#%2811%29.) From the social point of view, the contributions will be discussed and peer reviewed by members of the consortium.

    Our intention is that, by sharing the individual components of the teaching and educational process and quality-assuring them through peer review, we will provide concrete examples of our understanding of the discipline. However, as Berners-Lee observes, it is in the move from the micro to the macro that the magic (complexity) is involved. The challenge for our consortium, once our community repository is adequately populated, is to involve the wider community in the contribution, discussion and annotation that will lead to the evolution of a negotiated and agreed but evolving curriculum for Web Science. Others have worked on community approaches to developing curricula. For example, in the Computer Science community there is a repository of existing syllabi [3] that enables designers of new courses to understand how others have approached the problem, and the Information Science community is using a wiki [4] to enable the whole community to contribute to the dynamic development of the curriculum. What makes this project unique is that, rather than taking a top-down structured approach to curriculum definition, it takes a bottom-up approach, using the actual teaching materials as the basis on which to iteratively negotiate and refine the definition of the curriculum.
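    The repository-plus-annotation design described above (shared educational artefacts annotated against a negotiated curriculum categorisation and then peer reviewed) can be sketched as a simple data model. All class and field names below are illustrative assumptions and do not reflect the actual EdShare/EPrints schema used by the WSCD project.

```python
from dataclasses import dataclass, field

@dataclass
class EducationalArtefact:
    """One shared teaching resource in the consortium repository (illustrative schema)."""
    title: str
    contributor: str
    url: str
    curriculum_categories: list[str] = field(default_factory=list)  # terms from the negotiated taxonomy
    peer_reviews: list[str] = field(default_factory=list)           # free-text review comments

def artefacts_by_category(artefacts, category):
    """View used when discussing how a given curriculum category is actually being taught."""
    return [a for a in artefacts if category in a.curriculum_categories]

# Example: one lecture annotated against two hypothetical curriculum categories.
lecture = EducationalArtefact(
    title="Network analysis of Web growth",
    contributor="Partner university A",
    url="https://example.org/edshare/1234",
    curriculum_categories=["Web graph structure", "Micro-to-macro modelling"],
)
```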

    A new deep branch of Eurasian mtDNA macrohaplogroup M reveals additional complexity regarding the settlement of Madagascar

    BACKGROUND: Current models propose that mitochondrial DNA macrohaplogroups M and N evolved from haplogroup L3 soon after modern humans left Africa. Increasingly, however, analysis of isolated populations is filling in the details of, and in some cases challenging, aspects of this general model. RESULTS: Here, we present the first comprehensive study of three such isolated populations from Madagascar: the Mikea hunter-gatherers, the neighbouring Vezo fishermen, and the Merina central highlanders (n = 266). Complete mitochondrial DNA genome sequences reveal several unresolved lineages, and a new, deep branch of the out-of-Africa founder clade M has been identified. This new haplogroup, M23, has a limited global distribution and is restricted to Madagascar and a small range of African and Southwest Asian groups. CONCLUSIONS: The geographic distribution, phylogenetic placement and molecular age of M23 suggest that the colonization of Madagascar was more complex than previously thought.

    ‘We are (the) Middle Class’. The new Turkish middle class: identification, behaviors and expectations

    This article aims to identify and characterise the Turkish middle class. Our objective is to improve its description by implementing a clustering method that combines an economic and a sociological approach. Using the Income and Living Conditions Survey (2014), we first identify the middle class on the basis of an income interval. We then use information about employment and education to characterise the heterogeneity of this middle-income class. The distinctive behaviors and aspirations of four middle-class groups are explored in depth using the results of original qualitative field research carried out among middle-class households from two contrasting regions.
    Keywords: Social stratification, Income distribution, Middle class, Turkey, Clustering methods.
    JEL: O15, D13, E25
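    The two-step procedure described in the abstract, an income-interval filter followed by a clustering on employment and education, can be sketched as follows. The income bounds, the column names, and the choice of k-means on one-hot-encoded variables are all illustrative assumptions; the article's actual thresholds and clustering algorithm are not specified here.

```python
import pandas as pd
from sklearn.cluster import KMeans

def identify_middle_class(df, low=0.75, high=2.0):
    """Step 1: keep households whose income lies in a middle-income interval,
    defined here relative to the median (bounds are illustrative, not the paper's)."""
    median = df["household_income"].median()
    return df[df["household_income"].between(low * median, high * median)].copy()

def cluster_middle_class(df, n_clusters=4):
    """Step 2: characterise heterogeneity by clustering on employment and education.
    K-means on one-hot-encoded categories is one plausible choice; the article's
    exact clustering method may differ."""
    X = pd.get_dummies(df[["employment_status", "education_level"]])
    df["group"] = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(X)
    return df
```

    Setting n_clusters=4 mirrors the four middle-class groups mentioned in the abstract.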
