52 research outputs found

    Growth in Children with Cerebral Palsy during five years after Selective Dorsal Rhizotomy: a practice-based study

    Background: Overweight is reported as a side effect of selective dorsal rhizotomy (SDR). The aims were to study the development of weight, height and body mass index (BMI) during the five years after SDR. Methods: This prospective, longitudinal, practice-based study included all 56 children with CP spastic diplegia undergoing SDR from the start in March 1993 to April 2003 in our hospital. The preoperative Gross Motor Function Classification System (GMFCS) levels were I–II in 17, III in 15 and IV–V in 24 children. Median age at SDR was 4.3 years (range 2.4–7.4 years). Weight and height/recumbent length were measured. Swedish growth charts for typically developing children generated weight, height and BMI z-scores for age and gender. Results: The preoperative median z-scores were −1.92 for height and −0.22 for BMI. Five years later, the median BMI z-score had increased by +0.57, and the proportion of children with overweight or obesity (BMI > +2 SD) had increased (p < 0.05). Baseline BMI and age at the start of follow-up influenced the BMI change during the five years (p < 0.001 and p < 0.05, respectively). Individual growth was highly variable, but a tendency towards increasing stunting with age was seen in severe gross motor dysfunction (GMFCS levels IV–V), and the opposite, a slight catch-up of height, in children with walking ability (GMFCS levels I–III). Conclusions: These are the first available subtype- and GMFCS-specific longitudinal growth data for children with CP spastic diplegia. Their growth potential according to these data should be regarded as a minimum, as some children were undernourished. It is unknown whether the spasticity reduction through SDR increased the weight gain velocity, or whether the relative weight increase was part of the general "obesity epidemic". For some children the weight increase was highly desirable; in others, it resulted in overweight and obesity with a risk of negative health effects. Weight and height should be monitored to enable early prevention of weight aberrations, which also cause problems with mobility, activity and participation.
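The abstract above converts each child's measurements into growth-chart z-scores for age and gender. As an illustrative sketch only (not the study's actual procedure, and with made-up reference values), a z-score expresses a measurement in reference-SD units; most modern growth references use Cole's LMS method, shown alongside:

```python
import math

# Hedged sketch of growth-chart z-scores. The reference values below are
# invented for illustration and are NOT from the study or the Swedish charts.

def z_score(value: float, ref_mean: float, ref_sd: float) -> float:
    """Plain z-score: how many reference SDs the value lies from the mean."""
    return (value - ref_mean) / ref_sd

def lms_z_score(value: float, L: float, M: float, S: float) -> float:
    """LMS z-score (Cole's method), used by most growth references.
    L = Box-Cox power, M = median, S = coefficient of variation."""
    if L == 0:
        return math.log(value / M) / S
    return ((value / M) ** L - 1) / (L * S)

# Example: a BMI of 16.5 against a hypothetical reference mean 15.8, SD 1.3.
print(round(z_score(16.5, 15.8, 1.3), 2))  # -> 0.54
```

With L = 1 the LMS formula reduces to the plain z-score (with S = SD/median), which is a useful sanity check when implementing either form.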

    Long-term outcomes five years after selective dorsal rhizotomy

    Background: Selective dorsal rhizotomy (SDR) is a well-accepted neurosurgical procedure performed for the relief of spasticity interfering with motor function in children with spastic cerebral palsy (CP). The goal is to improve function, but long-term outcome studies are rare. The aims of this study were to evaluate long-term functional outcomes, safety and side effects during five postoperative years in all children with diplegia undergoing SDR combined with physiotherapy. Methods: The study group consisted of 35 consecutively operated children with spastic diplegia, of whom 26 were Gross Motor Function Classification System (GMFCS) levels III–V. Mean age was 4.5 years (range 2.5–6.6). All were assessed by the same multidisciplinary team preoperatively and at 6, 12 and 18 months and 3 and 5 years postoperatively. Clinical and demographic data, complications and the number of rootlets cut were registered prospectively. Deep tendon reflexes and muscle tone were examined, the latter graded with the modified Ashworth scale. Passive range of motion (PROM) was measured with a goniometer. Motor function was classified according to the GMFCS and measured with the Gross Motor Function Measure (GMFM-88), from which GMFM-66 scores were derived. Parents' opinions about the children's performance of skills and activities and the amount of caregiver assistance were measured with the Pediatric Evaluation of Disability Inventory (PEDI). Results: The mean proportion of rootlets cut in S2–L2 was 40%. Muscle tone was immediately reduced in adductors, hamstrings and dorsiflexors (p < 0.001), with no recurrence of spasticity over the 5 years. For GMFCS subgroups I–II, III and IV–V, significant improvements during the five years were seen in PROM for hip abduction, popliteal angle and ankle dorsiflexion (p = 0.001), capacity of gross motor function (GMFM) (p = 0.001), and performance of functional skills and independence in self-care and mobility (PEDI) (p = 0.001). Conclusion: SDR is a safe and effective method for reducing spasticity permanently without major negative side effects. In combination with physiotherapy, in a group of carefully selected and systematically followed young children with spastic diplegia, it provides lasting functional benefits over a period of at least five years postoperatively.

    Ovarian cancer

    Ovarian cancer is not a single disease and can be subdivided into at least five different histological subtypes that have different identifiable risk factors, cells of origin, molecular compositions, clinical features and treatments. Ovarian cancer is a global problem, is typically diagnosed at a late stage and has no effective screening strategy. Standard treatments for newly diagnosed cancer consist of cytoreductive surgery and platinum-based chemotherapy. In recurrent cancer, chemotherapy, anti-angiogenic agents and poly(ADP-ribose) polymerase inhibitors are used, and immunological therapies are currently being tested. High-grade serous carcinoma (HGSC) is the most commonly diagnosed form of ovarian cancer and at diagnosis is typically very responsive to platinum-based chemotherapy. However, as with the other histologies, HGSCs frequently relapse and become increasingly resistant to chemotherapy. Consequently, understanding the mechanisms underlying platinum resistance and finding ways to overcome them are active areas of study in ovarian cancer. Substantial progress has been made in identifying genes that are associated with a high risk of ovarian cancer (such as BRCA1 and BRCA2), as well as a precursor lesion of HGSC called serous tubal intraepithelial carcinoma, which holds promise for identifying individuals at high risk of developing the disease and for developing prevention strategies.

    Pan-cancer analysis of whole genomes

    Cancer is driven by genetic change, and the advent of massively parallel sequencing has enabled systematic documentation of this variation at the whole-genome scale(1-3). Here we report the integrative analysis of 2,658 whole-cancer genomes and their matching normal tissues across 38 tumour types from the Pan-Cancer Analysis of Whole Genomes (PCAWG) Consortium of the International Cancer Genome Consortium (ICGC) and The Cancer Genome Atlas (TCGA). We describe the generation of the PCAWG resource, facilitated by international data sharing using compute clouds. On average, cancer genomes contained 4-5 driver mutations when combining coding and non-coding genomic elements; however, in around 5% of cases no drivers were identified, suggesting that cancer driver discovery is not yet complete. Chromothripsis, in which many clustered structural variants arise in a single catastrophic event, is frequently an early event in tumour evolution; in acral melanoma, for example, these events precede most somatic point mutations and affect several cancer-associated genes simultaneously. Cancers with abnormal telomere maintenance often originate from tissues with low replicative activity and show several mechanisms of preventing telomere attrition to critical levels. Common and rare germline variants affect patterns of somatic mutation, including point mutations, structural variants and somatic retrotransposition. A collection of papers from the PCAWG Consortium describes non-coding mutations that drive cancer beyond those in the TERT promoter(4); identifies new signatures of mutational processes that cause base substitutions, small insertions and deletions and structural variation(5,6); analyses timings and patterns of tumour evolution(7); describes the diverse transcriptional consequences of somatic mutation on splicing, expression levels, fusion genes and promoter activity(8,9); and evaluates a range of more-specialized features of cancer genomes(8,10-18).

    The Transformation from Traditional Nonprofit Organizations to Social Enterprises: An Institutional Entrepreneurship Perspective

    The development of commercial revenue streams allows traditional nonprofit organizations to increase financial certainty in response to the reduction of traditional funding sources and increased competition. In order to capture commercial revenue-generating opportunities, traditional nonprofit organizations need to deliberately transform themselves into social enterprises. Through the theoretical lens of institutional entrepreneurship, we explore the institutional work that supports this transformation by analyzing field interviews with 64 institutional entrepreneurs from UK-based social enterprises. We find that the route to incorporating commercial processes and converting traditional nonprofit organizations into social enterprises requires six distinct kinds of institutional work across three domains: "engaging commercial revenue strategies", "creating a professionalized organizational form", and "legitimating a socio-commercial business model". In elaborating on social entrepreneurship research and practice, we offer a comprehensive framework delineating the key practices contributing to the transformation from traditional nonprofit organizations to social enterprises. This extends our understanding of the ex-ante strategy of incorporating commercial processes within social organizations. Furthermore, the identification of these practices also offers an important tool for scholars in this field to examine the connection (or disconnection) of each practice with the different ethical concerns of social entrepreneurship in greater depth.

    Volume controlled ventilation versus pressure controlled ventilation in a canine acute lung injury model: effects on cardiorespiratory parameters and oxygen cost of breathing

    Background: It is questionable whether pressure-controlled ventilation (PCV) has mechanical or gas-exchange advantages over volume-cycled ventilation (VCV). Objectives: To compare PCV with VCV with a decelerating flow profile during assisted and controlled modes in an experimental model of acute lung injury. Methods: Severe acute lung injury (PaO2/FIO2 < 100 mmHg) was induced by intravenous oleic acid infusion (0.05 mg/kg) in seven dogs. The animals were ventilated with PCV and VCV in a randomized sequence. After 40 minutes in the assisted mode, ventilation was changed to the controlled mode after neuromuscular blockade. Tidal volume and inspiratory time were kept constant throughout the experiment. Results: There were no differences in gas exchange (PaO2 and PaCO2), cardiac output or oxygen delivery (DO2) between VCV and PCV; the same was observed for maximum airway and plateau pressures and for static compliance. Oxygen consumption (VO2) after neuromuscular blockade was similar (124 ± 48 ml/min in VCV versus 143 ± 50 ml/min in PCV, p = 0.42). In the assisted mode, however, there was a trend towards a higher VO2 in PCV (219 ± 72 versus 154 ± 67 ml/min in VCV, p = 0.06), associated with a trend towards a higher oxygen cost of breathing (OCB), although without statistical significance (31 ± 77 ml/min in VCV versus 75 ± 96 ml/min in PCV, p = 0.23), and with a lower PvO2 (34 ± 7 in PCV versus 42 ± 6 mmHg in VCV, p = 0.02). This occurred despite a higher peak inspiratory flow in assisted PCV (58 ± 9 versus 48 ± 4 L/min in VCV, p = 0.01). In both VCV and PCV, the institution of controlled ventilation by neuromuscular blockade reduced cardiac output and DO2 by about 20% relative to the assisted mode. Conclusions: In a model of severe respiratory failure with a high OCB, controlled ventilation improved the relationship between oxygen delivery and consumption relative to assisted ventilation. PCV offered no benefits over VCV in gas exchange or respiratory mechanics, and it may increase the OCB in the assisted mode in this model.
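The derived quantities in the abstract above reduce to simple arithmetic. This is a hedged illustration, not the study's analysis code: estimating the oxygen cost of breathing as the VO2 difference between assisted breathing and ventilation after neuromuscular blockade is a common convention, and the standard DO2 formula is used; the example numbers are the reported PCV group means, reused only for illustration.

```python
# Hedged sketch of the quantities discussed in the abstract; the function
# names and the use of group means are illustrative assumptions.

def oxygen_cost_of_breathing(vo2_assisted_ml_min: float,
                             vo2_controlled_ml_min: float) -> float:
    """OCB estimated as VO2(assisted) - VO2(after neuromuscular blockade), ml/min."""
    return vo2_assisted_ml_min - vo2_controlled_ml_min

def oxygen_delivery(cardiac_output_l_min: float, cao2_ml_dl: float) -> float:
    """DO2 (ml/min) = cardiac output (L/min) x arterial O2 content (ml/dL) x 10."""
    return cardiac_output_l_min * cao2_ml_dl * 10

# Reported PCV means: VO2 219 ml/min assisted vs 143 ml/min after blockade.
print(oxygen_cost_of_breathing(219, 143))  # -> 76
```

Note that subtracting group means, as done here, differs from the paired per-animal differences the study reports (75 ± 96 ml/min), which is why the two figures are close but not identical.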

    Environmental impact on the bacteriological quality of domestic water supplies in Lagos, Nigeria

    OBJECTIVE: To assess the impact of town planning, infrastructure, sanitation and rainfall on the bacteriological quality of domestic water supplies. METHODS: Water samples obtained from deep and shallow wells, boreholes and public taps were cultured to determine the most probable number of Escherichia coli and total coliforms using the multiple-tube technique. The presence of enteric pathogens was detected using selective and differential media. Samples were collected during periods of both heavy and low rainfall and from municipalities with distinct infrastructure, town planning and sanitation characteristics. RESULTS: Contamination of treated, pipe-distributed water was related to the distance of the collection point from a utility station. Faults in pipelines increased the rate of contamination (p < 0.5), mostly in densely populated areas with dilapidated infrastructure. Wastewater from drains was the main source of contamination of pipe-borne water. Shallow wells were more contaminated than deep wells and boreholes, and contamination was higher during periods of heavy rainfall (p < 0.05). E. coli and enteric pathogens were isolated from contaminated supplies. CONCLUSIONS: Poor town planning, dilapidated infrastructure and indiscriminate siting of wells and boreholes contributed to the low bacteriological quality of domestic water supplies. Rainfall accentuated the impact.
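The multiple-tube technique mentioned above estimates coliform density as a most probable number (MPN). As a hedged sketch, Thomas' simple approximation is shown below; the study would more likely have read results from standard MPN tables, and the tube counts here are invented for illustration.

```python
import math

# Hedged sketch: Thomas' simple formula for approximating the Most Probable
# Number (MPN) of coliforms per 100 mL from a multiple-tube fermentation test.

def mpn_thomas(positive_tubes: int,
               ml_in_negative_tubes: float,
               ml_in_all_tubes: float) -> float:
    """MPN/100 mL = (positives x 100) / sqrt(mL in negative tubes x mL in all tubes)."""
    return (positive_tubes * 100) / math.sqrt(ml_in_negative_tubes * ml_in_all_tubes)

# Illustrative run: 5 tubes each at 10, 1 and 0.1 mL inocula;
# 4, 2 and 1 tubes positive at the three dilutions, respectively.
positives = 4 + 2 + 1
neg_volume = 1 * 10 + 3 * 1 + 4 * 0.1      # 13.4 mL of sample in negative tubes
total_volume = 5 * 10 + 5 * 1 + 5 * 0.1    # 55.5 mL of sample inoculated in total
print(round(mpn_thomas(positives, neg_volume, total_volume), 1))  # -> 25.7
```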