
    Matrix Development in Self-Assembly of Articular Cartilage

    Articular cartilage is a highly functional tissue which covers the ends of long bones and serves to ensure proper joint movement. A tissue engineering approach that recapitulates the developmental characteristics of articular cartilage can be used to examine the maturation and degeneration of cartilage and produce fully functional neotissue replacements for diseased tissue. This study examined the development of articular cartilage neotissue within a self-assembling process in two phases. In the first phase, articular cartilage constructs were examined at 1, 4, 7, 10, 14, 28, 42, and 56 days immunohistochemically, histologically, and through biochemical analysis for total collagen and glycosaminoglycan (GAG) content. Based on statistical changes in GAG and collagen levels, four time points from the first phase (7, 14, 28, and 56 days) were chosen to carry into the second phase, where the constructs were studied in terms of their mechanical characteristics, relative amounts of collagen types II and VI, and specific GAG types (chondroitin 4-sulfate, chondroitin 6-sulfate, dermatan sulfate, and hyaluronan). Collagen type VI was present in initial abundance and then localized to a pericellular distribution at 4 wks. N-cadherin activity also spiked at early stages of neotissue development, suggesting that self-assembly is mediated through a minimization of free energy. The percentage of collagen type II to total collagen significantly increased over time, while the proportion of collagen type VI to total collagen decreased between 1 and 2 wks. The chondroitin 6- to 4-sulfate ratio decreased steadily during construct maturation. In addition, the compressive properties reached a plateau and tensile characteristics peaked at 4 wks. The indices of cartilage formation examined in this study suggest that tissue maturation in self-assembled articular cartilage mirrors known developmental processes for native tissue. In terms of tissue engineering, it is suggested that exogenous stimulation may be necessary after 4 wks to further augment the functionality of developing constructs.

    Prognostic model to predict postoperative acute kidney injury in patients undergoing major gastrointestinal surgery based on a national prospective observational cohort study.

    Background: Acute illness, existing co-morbidities and the surgical stress response can all contribute to postoperative acute kidney injury (AKI) in patients undergoing major gastrointestinal surgery. The aim of this study was to prospectively develop a pragmatic prognostic model to stratify patients according to risk of developing AKI after major gastrointestinal surgery. Methods: This prospective multicentre cohort study included consecutive adults undergoing elective or emergency gastrointestinal resection, liver resection or stoma reversal in 2-week blocks over a continuous 3-month period. The primary outcome was the rate of AKI within 7 days of surgery. Bootstrap stability was used to select clinically plausible risk factors into the model. Internal model validation was carried out by bootstrap validation. Results: A total of 4544 patients were included across 173 centres in the UK and Ireland. The overall rate of AKI was 14·2 per cent (646 of 4544) and the 30-day mortality rate was 1·8 per cent (84 of 4544). Stage 1 AKI was significantly associated with 30-day mortality (unadjusted odds ratio 7·61, 95 per cent c.i. 4·49 to 12·90; P < 0·001), with increasing odds of death with each AKI stage. Six variables were selected for inclusion in the prognostic model: age, sex, ASA grade, preoperative estimated glomerular filtration rate, planned open surgery and preoperative use of either an angiotensin-converting enzyme inhibitor or an angiotensin receptor blocker. Internal validation demonstrated good model discrimination (c-statistic 0·65). Discussion: Following major gastrointestinal surgery, AKI occurred in one in seven patients. This preoperative prognostic model identified patients at high risk of postoperative AKI. Validation in an independent data set is required to ensure generalizability.
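
    The abstract does not state the model's functional form; as an illustration only, the sketch below assumes a multivariable logistic regression over the six selected predictors and a bootstrap optimism correction of the c-statistic, in the spirit of the internal validation described. All data, variable codings and settings in it are simulated placeholders, not the study's.

    ```python
    # Hypothetical sketch: fit a prognostic model on the six predictors named in
    # the abstract and estimate an optimism-corrected c-statistic by bootstrap.
    # A logistic regression form is assumed; all data below are simulated.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.utils import resample

    rng = np.random.default_rng(0)
    n = 4544  # cohort size reported in the abstract

    X = np.column_stack([
        rng.normal(65, 12, n),     # age (years)
        rng.integers(0, 2, n),     # sex (1 = male)
        rng.integers(1, 5, n),     # ASA grade
        rng.normal(80, 20, n),     # preoperative eGFR (mL/min/1.73 m2)
        rng.integers(0, 2, n),     # planned open surgery
        rng.integers(0, 2, n),     # preoperative ACE inhibitor / ARB use
    ])
    y = rng.binomial(1, 0.142, n)  # AKI outcome at the reported 14.2% rate

    model = LogisticRegression(max_iter=1000).fit(X, y)
    apparent_c = roc_auc_score(y, model.predict_proba(X)[:, 1])

    # Bootstrap optimism correction: refit on resamples, compare performance on
    # the resample versus the original data, and subtract the mean optimism.
    optimism = []
    for _ in range(200):
        Xb, yb = resample(X, y)
        mb = LogisticRegression(max_iter=1000).fit(Xb, yb)
        c_boot = roc_auc_score(yb, mb.predict_proba(Xb)[:, 1])
        c_orig = roc_auc_score(y, mb.predict_proba(X)[:, 1])
        optimism.append(c_boot - c_orig)

    print("optimism-corrected c-statistic:", apparent_c - np.mean(optimism))
    ```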

    Transition Techniques of the Future Internet Protocol-IPv6

    The Internet Protocol (IP) was designed in the 1970s after much deliberation. Thirty years after its deployment, the resulting design (IPv4) has proved more than sufficient, even though certain techniques had to be introduced to slow the rapid depletion of the IPv4 address space caused by the exponential growth of the Internet. However, these techniques are only temporary measures and have serious limitations. It should no longer be news that the transition from IPv4 to the new Internet Protocol (IPv6) has to happen now. The techniques for achieving this transition, and the benefits that will accrue from the new addressing scheme, are the focus of this work. The paper makes clear that, for a successful transition to take place, organizations must first begin to run IPv4 and IPv6 in parallel. The vulnerabilities introduced by the transition, and the appropriate solutions, are also explained.
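
    As a concrete illustration of the "run IPv4 and IPv6 in parallel" step the paper recommends, the sketch below sets up a dual-stack TCP listener with Python sockets; the port number and behaviour are assumptions for the example, not taken from the paper.

    ```python
    # Illustrative dual-stack listener: one socket that accepts both IPv4 and
    # IPv6 clients, i.e. running the two protocols in parallel on a host.
    import socket

    srv = socket.socket(socket.AF_INET6, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    # Clear IPV6_V6ONLY so IPv4 clients reach the socket as IPv4-mapped
    # addresses (::ffff:a.b.c.d); supported on Linux, Windows and macOS.
    srv.setsockopt(socket.IPPROTO_IPV6, socket.IPV6_V6ONLY, 0)
    srv.bind(("::", 8080))   # "::" listens on all IPv6 (and mapped IPv4) addresses
    srv.listen(5)
    print("Dual-stack listener on port 8080")

    while True:  # simple serve-forever loop for the sketch
        conn, addr = srv.accept()
        # addr[0] is either a native IPv6 address or an IPv4-mapped one
        family = "IPv4 (mapped)" if addr[0].startswith("::ffff:") else "IPv6"
        conn.sendall(f"Hello over {family}\n".encode())
        conn.close()
    ```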

    Changes in aortic volumes following endovascular sealing of abdominal aortic aneurysms with the Nellix endoprosthesis

    PURPOSE: To investigate the effects on aortic volumes of endovascular aneurysm sealing (EVAS) with the Nellix device. METHODS: Twenty-five consecutive patients (mean age 78±7 years; 17 men) with abdominal aortic aneurysms containing thrombus were treated with EVAS. Their pre- and post-EVAS computed tomography (CT) scans were reviewed to document volume changes in the entire aneurysmal aorta, the lumen, and the intraluminal thrombus. The changes are reported as the mean and 95% confidence interval (CI). RESULTS: Total aortic volume was greater on postoperative scans by a mean 17 mL (95% CI 10.0 to 23.5, p<0.001). The volume occupied by the endobags was greater than the preoperative lumen volume by a mean 28 mL (95% CI 24.7 to 31.7, p=0.002). Postoperatively, the aortic volume occupied by thrombus had decreased by a mean 11 mL (95% CI 4.7 to 18.2, p<0.001). There were good correlations between changes in aneurysm and thrombus volumes (r=0.864, p<0.001), between the planning CT/EVAS time interval and the change in aneurysm volume (r=0.640, p=0.001), and between the planning CT/EVAS time interval and the change in thrombus volume (r=0.567, p=0.003). CONCLUSION: There are significant changes in aortic volumes post-EVAS. These changes may be a direct consequence of the technique and have implications for the planning and performance of EVAS.
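
    For readers unfamiliar with the statistics reported above, the sketch below shows how paired pre-/post-EVAS volume changes, their 95% confidence intervals and the Pearson correlations could be computed; the input arrays are simulated placeholders, not the patients' CT measurements.

    ```python
    # Illustrative sketch (not study code): paired volume-change statistics of
    # the kind reported in the abstract, computed on simulated data.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    n = 25  # patients in the series

    pre_total = rng.normal(180, 40, n)               # pre-EVAS aortic volume (mL)
    post_total = pre_total + rng.normal(17, 15, n)   # post-EVAS aortic volume (mL)

    diff = post_total - pre_total
    mean_diff = diff.mean()
    ci_low, ci_high = stats.t.interval(0.95, df=n - 1, loc=mean_diff,
                                       scale=stats.sem(diff))
    _, p_val = stats.ttest_rel(post_total, pre_total)
    print(f"mean change {mean_diff:.1f} mL "
          f"(95% CI {ci_low:.1f} to {ci_high:.1f}), p={p_val:.3f}")

    # Correlation between two paired change measures, e.g. aneurysm vs thrombus
    thrombus_change = -0.5 * diff + rng.normal(0, 5, n)
    r, p = stats.pearsonr(diff, thrombus_change)
    print(f"Pearson r={r:.3f}, p={p:.3f}")
    ```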

    Crab meat: a novel vehicle for E. coli


    Neuroenhancement in Military Personnel: Conceptual and Methodological Promises and Challenges

    Military personnel are subjected to prolonged operations in harsh and undesirable conditions characterized by severe environmental exposures, resource scarcity, and physical and mental encumbrance. Prolonged military operations under these conditions can degrade the already limited perceptual, cognitive, and emotional resources necessary to sustain performance on mission-related tasks. The complex multi-domain operations of the future battlespace are expected to further increase demands at even the lowest levels of the military echelon. These demands will be characterized by increasingly prolonged operations of small units in austere environments with limited resupply and degraded technological capabilities. It is therefore critical to identify new training and technological approaches to enable sustained, optimized, and/or enhanced performance of military personnel. Research in the international defence science community, academia, and industry has developed several promising neuroscientific strategies for pursuing this goal, including neuromodulatory and neurofeedback techniques. The present paper reviews the state of the art in cognitive neuroenhancement research and development from six participating nations: Canada, Germany, Italy, the Netherlands, the United Kingdom, and the United States of America. Six neuromodulation techniques are reviewed, including transcranial magnetic stimulation (TMS), transcranial focused ultrasound stimulation (tFUS), transcranial electrical stimulation (tES), transcutaneous peripheral nerve stimulation (tPNS), photobiomodulation, and cranial electrotherapy stimulation (CES). Three neurofeedback techniques are considered, including the use of electroencephalography (EEG), functional magnetic resonance imaging (fMRI), and functional near-infrared spectroscopy (fNIRS) for monitoring brain states, with feedback loops enabled through machine learning and artificial intelligence. Participating nations summarize basic and applied research leveraging one or more of these neuromodulation and neurofeedback technologies for the purposes of enhancing Warfighter cognitive performance. The report continues by detailing the inherent methodological challenges of cognitive neuroenhancement and other considerations for conducting research, development, and engineering in this domain. The report concludes with a discussion of promising future directions in neuroenhancement, including biosensing, improved mechanistic and predictive modelling and software tools, developing non-invasive forms of deep-brain stimulation, testing emerging theoretical models of brain and behavior, and developing closed-loop neuroenhancement and human-machine teaming methods. Emphasis is placed on the conceptual and methodological promises and challenges associated with planning, executing, and interpreting neuroenhancement research and development efforts in the context of Warfighter selection, training, operations, and recovery.
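
    As a rough illustration of the closed-loop neurofeedback concept mentioned above (not any participating nation's system), the sketch below estimates EEG alpha-band power in a sliding window and issues feedback when it crosses a threshold; the sampling rate, band, window length, threshold and simulated signal are all assumptions for the example.

    ```python
    # Minimal closed-loop neurofeedback sketch: estimate alpha-band EEG power in
    # a sliding window and trigger a feedback action when it crosses a threshold.
    # All parameters here are illustrative; real systems use calibrated,
    # subject-specific values and a live amplifier stream.
    import numpy as np

    FS = 250          # assumed EEG sampling rate (Hz)
    WINDOW = FS       # 1-second analysis window
    ALPHA = (8, 12)   # alpha band (Hz)
    THRESHOLD = 2.0   # arbitrary band-power threshold for feedback

    def band_power(window: np.ndarray, band: tuple[float, float], fs: int) -> float:
        """Mean spectral power of `window` within `band`, via a simple FFT."""
        freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
        psd = np.abs(np.fft.rfft(window)) ** 2 / len(window)
        mask = (freqs >= band[0]) & (freqs <= band[1])
        return float(psd[mask].mean())

    def feedback(power: float) -> str:
        """Placeholder feedback action; a real system might drive a display or tone."""
        return "reward" if power >= THRESHOLD else "no reward"

    # Simulated signal standing in for an EEG acquisition API.
    rng = np.random.default_rng(42)
    stream = rng.normal(0.0, 1.0, FS * 10)  # 10 s of noise-like "EEG"

    for start in range(0, len(stream) - WINDOW + 1, WINDOW):
        p = band_power(stream[start:start + WINDOW], ALPHA, FS)
        print(f"t={start / FS:4.1f}s  alpha power={p:6.3f}  -> {feedback(p)}")
    ```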

    FEDARGOS-V1: A Monitoring Architecture for Federated Cloud Computing Infrastructures

    Resource management in cloud infrastructure is one of the key elements of the quality of service provided by cloud service providers. Resource management has its own taxonomy, which includes resource discovery, resource selection, resource allocation, resource pricing, disaster management, and resource monitoring. Specifically, monitoring provides the means of knowing the status and availability of the physical resources and services within the cloud infrastructure. This makes "monitoring of resources" one of the key aspects of the cloud resource management taxonomy. However, managing resources in a secure and scalable manner is not easy, particularly in a federated cloud environment. A federated cloud is used and shared by many multi-cloud tenants at various levels of the cloud software stack. As a result, there is a need to reconcile all the tenants' diverse monitoring requirements. To cover all aspects relating to the monitoring of resources in a federated cloud environment, we present the FEDerated Architecture for Resource manaGement and mOnitoring in cloudS Version 1.0 (FEDARGOS-V1), a cloud resource monitoring architecture for federated cloud infrastructures. The architecture focuses mainly on the ability to access information while monitoring services, enabling early identification of resource constraints within short time intervals in federated cloud platforms. The monitoring architecture was deployed in a real-time OpenStack-based FEDerated GENomic (FEDGEN) cloud testbed. We present experimental results in order to evaluate our design and compare it both qualitatively and quantitatively to a number of existing cloud monitoring systems similar to ours. The architecture presented here can be deployed in private or public federated cloud infrastructures for faster and more scalable resource monitoring.
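
    As a rough sketch of the kind of federated monitoring loop described above (not FEDARGOS-V1's actual implementation, whose interfaces are not given in the abstract), the code below polls a per-cloud metrics endpoint at short intervals and flags resource constraints; the endpoint URLs, metric names and threshold are assumptions for the example.

    ```python
    # Hypothetical federated-cloud monitoring loop: poll a metrics endpoint per
    # member cloud at short intervals and flag over-utilised resources.
    import json
    import time
    import urllib.request

    CLOUDS = {
        # hypothetical per-cloud metric endpoints exposed to the federation monitor
        "cloud-a": "http://cloud-a.example/metrics",
        "cloud-b": "http://cloud-b.example/metrics",
    }
    THRESHOLD = 0.90      # flag any resource above 90% utilisation
    POLL_INTERVAL = 10    # seconds between polling rounds

    def fetch_metrics(url: str) -> dict:
        """Fetch a JSON document such as {"cpu": 0.72, "memory": 0.95, "disk": 0.40}."""
        with urllib.request.urlopen(url, timeout=5) as resp:
            return json.load(resp)

    def check_cloud(name: str, url: str) -> None:
        try:
            metrics = fetch_metrics(url)
        except OSError as exc:
            print(f"[{name}] unreachable: {exc}")
            return
        for resource, utilisation in metrics.items():
            if utilisation >= THRESHOLD:
                print(f"[{name}] constraint: {resource} at {utilisation:.0%}")

    if __name__ == "__main__":
        while True:
            for name, url in CLOUDS.items():
                check_cloud(name, url)
            time.sleep(POLL_INTERVAL)
    ```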