
    Orlando Gaming Overview


    Quantitative analysis of breast lumpectomies using histology and micro-CT data

    OBJECTIVE: Breast cancer represents a significant risk to women's health, affecting many women worldwide. Current treatment in the U.S. involves a multidisciplinary approach, most often beginning with surgery to remove cancerous tissue. Evaluation of the margins of excised tissue for cancer is an important part of surgery and an important predictor of survival. As a result, there has been a great deal of research interest in intraoperative margin assessment, with a focus on fast and accurate results. Micro-computed tomography (micro-CT) has emerged as a promising avenue to this end. We hypothesize that micro-CT scans will show a statistically significant difference in radiodensity between cancerous and non-cancerous tissue at intraoperative scan times. METHODS: 15 breast lumpectomy specimens were collected from patients undergoing surgery at Massachusetts General Hospital (MGH). Lumpectomies were scanned with a Nikon XTH225 micro-CT scanner. Corresponding histology slides were digitized with a whole-slide scanner and matched with the micro-CT scans. Representative areas of cancerous and non-cancerous tissue were segmented from the micro-CT scans, and their radiodensity differences were tested for statistical significance. RESULTS: 9 of 15 lumpectomy cases were successfully matched with histology sections. Of the 9 matched cases, 8 showed a statistically significant difference in mean radiodensity. CONCLUSION: Because of potential confounds in the study, the results are difficult to deem conclusive. However, micro-CT remains a promising tool for margin assessment and could be fit for clinical use with further study.
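    The kind of comparison the abstract describes can be sketched with a two-sample test on segmented voxel intensities. This is a minimal illustration, not the study's actual analysis: the grey-level samples below are made up, and the study does not specify which statistical test was used (Welch's t-test is one common choice when variances may differ).

    ```python
    # Hedged sketch: compare mean radiodensity of two segmented regions.
    # The voxel intensity values are hypothetical, for illustration only.
    from statistics import mean, variance

    def welch_t(a, b):
        """Welch's t statistic for two samples with possibly unequal variances."""
        va, vb = variance(a) / len(a), variance(b) / len(b)
        return (mean(a) - mean(b)) / (va + vb) ** 0.5

    # Hypothetical grey levels from cancerous vs. non-cancerous micro-CT regions
    cancerous = [182, 175, 190, 188, 179, 185]
    normal = [150, 158, 149, 155, 152, 160]

    t = welch_t(cancerous, normal)
    print(round(t, 2))  # a large |t| suggests a real difference in means
    ```

    A large t statistic (compared against the t distribution with Welch–Satterthwaite degrees of freedom) would correspond to the "statistically significant difference in mean radiodensity" the abstract reports.
    
    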

    Dispatch: distributed peer-to-peer simulations

    Recently there has been increasing demand for efficient mechanisms for carrying out computations that exhibit coarse-grained parallelism. Examples of this class of problems include simulations involving Monte Carlo methods, computations where numerous similar but independent tasks are performed to solve a large problem, and any solution that relies on ensemble averages, where a simulation is run under a variety of initial conditions whose results are then combined. With the ever-increasing complexity of such applications, large amounts of computational power are required over long periods of time, and economic constraints make it impractical to deploy specialized hardware to satisfy this growing demand. We address this issue with Dispatch, a peer-to-peer framework for sharing computational power. In contrast to grid computing and other institution-based CPU-sharing systems, Dispatch targets an open environment, one that is accessible to all users and does not require any sort of membership or accounts; any machine connected to the Internet can be part of the framework. Dispatch allows dynamic and decentralized organization of these computational resources. It empowers users to utilize heterogeneous computational resources spread across geographic and administrative boundaries to run their tasks in parallel. As a first step, we address a number of challenging issues involved in designing such distributed systems: forming a decentralized and scalable network of computational resources; finding a sufficient number of idle CPUs in the network for participants; allocating simulation tasks optimally so as to reduce computation time; allowing new participants to join the system and run their tasks irrespective of geographical location; letting users interact with their tasks (pausing, resuming, stopping) in real time; and implementing security features to prevent malicious users from compromising the network and remote machines. As a second step, we evaluate the performance of Dispatch on a large-scale network of 10–130 machines. For one particular simulation, we achieved up to 1,500 million iterations per second, compared to 10 million iterations per second on one machine. We also test Dispatch over a wide-area network where it is deployed on machines that are geographically apart and belong to different administrative domains.
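    The coarse-grained parallelism Dispatch targets can be illustrated with a standard embarrassingly parallel pattern: many independent Monte Carlo tasks whose results are combined by an ensemble average. This sketch uses Python's standard process pool, not Dispatch's own API (which the abstract does not describe); the task, sample counts, and seeds are illustrative.

    ```python
    # Hedged sketch of coarse-grained parallelism: independent Monte Carlo
    # tasks combined by an ensemble average. Not Dispatch's actual API.
    import random
    from concurrent.futures import ProcessPoolExecutor

    def estimate_pi(samples, seed):
        """One independent task: Monte Carlo estimate of pi by dart-throwing."""
        rng = random.Random(seed)
        hits = sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0
                   for _ in range(samples))
        return 4.0 * hits / samples

    if __name__ == "__main__":
        seeds = range(8)  # eight independent simulations, e.g. one per idle CPU
        with ProcessPoolExecutor() as pool:
            results = list(pool.map(estimate_pi, [100_000] * 8, seeds))
        print(sum(results) / len(results))  # ensemble average of all tasks
    ```

    Because each task shares no state with the others, adding machines scales throughput almost linearly, which is consistent with the 10 million to 1,500 million iterations-per-second range the abstract reports when going from one machine to the full network.
    
    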

    Post Procedural Care of Patients Receiving Percutaneous Transhepatic Biliary Drainage Catheter Placement

    A recent review of patients undergoing percutaneous transhepatic biliary drainage catheter placement showed a 30-day readmission rate of 28%. New post-procedural processes were created to standardize the care of this patient population, with the goals of decreasing readmission rates and improving patient satisfaction.

    FORMULATION AND CHARACTERIZATION OF SUSTAINED RELEASE MATRIX TABLETS OF IVABRADINE USING 3² FULL FACTORIAL DESIGN

    Objective: Ivabradine (IB) is an anti-ischemic drug used for the symptomatic management of stable angina pectoris. IB acts by reducing the heart rate through a mechanism different from that of beta blockers and calcium channel blockers, two commonly prescribed classes of anti-anginal drugs. IB has a short biological half-life and a dose of 5/7.5 mg twice a day. In the present study, an attempt was made to prepare sustained-release tablets of IB to achieve the desired drug release. Methods: The sustained-release polymers hydroxypropyl methylcellulose K100M (HPMC K100M), guar gum (GG) and xanthan gum (XG) were taken for a preliminary trial, from which guar gum and xanthan gum showed better drug release. Initially, drug-excipient compatibility studies were carried out using Fourier-transform infrared spectroscopy (FTIR) and differential scanning calorimetry (DSC), which showed no interaction between the drug and excipients. Tablets were prepared by the wet granulation technique and evaluated for pre-compression and post-compression parameters. Results: A 3² full factorial design was applied to achieve controlled drug release for up to 24 h. The concentrations of GG (X1) and XG (X2) were selected as independent variables, and the % cumulative drug release (% CDR) at 2 h (Y1) and 18 h (Y2) were taken as dependent variables. The in vitro drug release study revealed that as the amount of polymer increased, % CDR decreased. Conclusion: Contour as well as response surface plots were constructed to show the effect of X1 and X2 on % CDR, and a maximized response was predicted at independent-variable concentrations of X1 (10 mg) and X2 (10 mg). The optimized batch (O1) was kept for a stability study at 40±2 °C/75±5% RH for a period of 6 months according to ICH guidelines and was found to be stable.
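    A 3² full factorial design simply crosses two factors at three levels each, yielding nine formulation runs. The sketch below enumerates such a design; the level values (in mg) are hypothetical, since the abstract only identifies 10 mg as the optimized amount of each polymer, not the full level set.

    ```python
    # Hedged sketch: enumerate a 3^2 full factorial design for two factors,
    # the GG (X1) and XG (X2) amounts. Level values are illustrative.
    from itertools import product

    levels = [10, 20, 30]  # hypothetical low / mid / high amounts in mg
    design = list(product(levels, repeat=2))  # every (X1, X2) combination

    for run, (x1, x2) in enumerate(design, start=1):
        print(f"Run {run}: GG = {x1} mg, XG = {x2} mg")

    print(len(design))  # 3^2 = 9 runs
    ```

    Fitting a quadratic model to the nine measured responses (Y1, Y2) is what allows the contour and response-surface plots described in the conclusion to be drawn and an optimum predicted.
    
    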

    Selective Serotonin-Norepinephrine Reuptake Inhibitors-Induced Takotsubo Cardiomyopathy

    CONTEXT: Takotsubo translates to "octopus pot" in Japanese. Takotsubo cardiomyopathy (TTC) is characterized by transient regional systolic dysfunction of the left ventricle. Catecholamine excess is one of the most studied and favored theories explaining the pathophysiology of TTC. CASE REPORT: We present the case of a 52-year-old Hispanic female admitted for venlafaxine-induced TTC, with a review of the literature on all cases of serotonin-norepinephrine reuptake inhibitor (SNRI)-associated TTC published so far. CONCLUSION: SNRIs inhibit the reuptake of catecholamines into the presynaptic neuron, resulting in a net gain in the concentration of norepinephrine and serotonin in the neuronal synapses and causing iatrogenic catecholamine excess, ultimately leading to TTC.