    Database Queries that Explain their Work

    Provenance for database queries or scientific workflows is often motivated as providing explanation, increasing understanding of the underlying data sources and processes used to compute the query, and reproducibility, the capability to recompute the results on different inputs, possibly specialized to a part of the output. Many provenance systems claim to provide such capabilities; however, most lack formal definitions or guarantees of these properties, while others provide formal guarantees only for relatively limited classes of changes. Building on recent work on provenance traces and slicing for functional programming languages, we introduce a detailed tracing model of provenance for multiset-valued Nested Relational Calculus, define trace slicing algorithms that extract subtraces needed to explain or recompute specific parts of the output, and define query slicing and differencing techniques that support explanation. We state and prove correctness properties for these techniques and present a proof-of-concept implementation in Haskell.
    Comment: PPDP 201
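
The slicing idea described in this abstract can be illustrated with a toy sketch (this is not the paper's formal NRC model or its Haskell implementation; the table, query, and names are made up): each input row carries an identifier, identifiers propagate to the output rows they contribute to, and a slice keeps only the inputs needed to explain a chosen part of the output.

```python
# Toy where-provenance sketch: tag each input row with an id, propagate the
# tags through a simple select/project query, then "slice" the input down to
# just the rows needed to explain one output row.

def run_query(table):
    # query: select name where dept == "cs"; provenance = set of source row ids
    return [({"name": row["name"]}, {rid})
            for rid, row in table
            if row["dept"] == "cs"]

def slice_input(table, provenance):
    # keep only the source rows mentioned in the provenance of the output part
    return [(rid, row) for rid, row in table if rid in provenance]

employees = [
    (0, {"name": "ada",   "dept": "cs"}),
    (1, {"name": "bob",   "dept": "ee"}),
    (2, {"name": "carol", "dept": "cs"}),
]

output = run_query(employees)
# explain the first output row: which input rows does it depend on?
needed = slice_input(employees, output[0][1])
```

Here `needed` contains only the "ada" row: the rest of the input is irrelevant to that output part, which is exactly the intuition behind trace slicing.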

    Directed motion of C60 on a graphene sheet subjected to a temperature gradient

    Nonequilibrium molecular dynamics simulations are used to study the motion of a C60 molecule on a graphene sheet subjected to a temperature gradient. The C60 molecule is actuated and moves along the system while merely dancing randomly in the perpendicular direction. Increasing the temperature gradient increases the directed velocity of the C60. It is found that the free energy decreases as the C60 molecule moves toward the cold end. The driving mechanism based on the temperature gradient suggests the construction of nanoscale graphene-based motors.
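
A minimal caricature of gradient-driven motion (this is not the paper's molecular dynamics simulation; the profile, units, and parameters are illustrative) is an overdamped particle in one dimension whose noise amplitude follows a position-dependent temperature. With the Ito convention, particles accumulate where the diffusion is weakest, i.e. at the cold end:

```python
import numpy as np

# Toy 1-D Langevin sketch: diffusion coefficient proportional to a linear
# temperature profile, hot at x = 0 and cold at x = 1, reflecting walls.
rng = np.random.default_rng(0)

def simulate(n_particles=500, n_steps=20000, dt=1e-4):
    x = np.full(n_particles, 0.5)              # start in the middle
    for _ in range(n_steps):
        temp = 1.5 - x                          # temperature: 1.5 (hot) -> 0.5 (cold)
        x = x + np.sqrt(2.0 * temp * dt) * rng.standard_normal(n_particles)
        x = np.where(x < 0.0, -x, x)            # reflect at the hot wall
        x = np.where(x > 1.0, 2.0 - x, x)       # reflect at the cold wall
    return x

final = simulate()
# the ensemble mean drifts toward the cold end (x = 1)
```

The steady-state density of this toy model is inversely proportional to the local diffusion coefficient, so the population shifts toward the cold side, qualitatively echoing the directed motion reported in the abstract.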

    Could the Anti-Chaperone VER155008 Replace Temozolomide for Glioma Treatment?

    The cancer-inducible molecular chaperone HSP90 is of great importance as an anticancer target. Proteomic analysis showed that inhibiting HSP90 with the geldanamycin derivative 17-AAG elevated the expression of the co-chaperone Hsp70. In this study we used the HSP90-selective inhibitor 17-AAG and the dual HSP70/90 inhibitor VER155008 (VER) in U87-MG glioma cells. miRNA microarray technology was used to evaluate the efficacy of these inhibitors compared with temozolomide (TMZ), used as a standard treatment for glioma. Microarray data identified 154 differentially expressed miRNAs using stringent or unstringent parameters. Sixteen miRNAs overlapped between treatments; 13 upregulated and one downregulated miRNA overlapped between TMZ and VER. miRNA target prediction software was applied to these overlapping miRNAs and identified that 6 of the 13 upregulated miRNAs target methyltransferase genes. The IC50 data, together with the Akt, HSP70, and HSP90 protein level data, favour VER and TMZ over 17-AAG; however, given the selectivity of VER for cancer cells as a potent anti-chaperone, it may be more favourable than the standard TMZ.
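
The overlap analysis described above reduces to a set intersection over per-treatment lists of differentially expressed miRNAs. A sketch with made-up miRNA names and fold-change values (not the study's data):

```python
# Differentially expressed miRNAs per treatment, modelled as dicts of
# miRNA -> log2 fold change; the overlap is a plain set intersection,
# split by direction of regulation.

def overlap(a, b):
    shared = set(a) & set(b)
    up   = {m for m in shared if a[m] > 0 and b[m] > 0}
    down = {m for m in shared if a[m] < 0 and b[m] < 0}
    return up, down

# hypothetical treatment results (names and values are illustrative)
tmz = {"miR-1": 1.2, "miR-2": 0.8, "miR-3": -1.5, "miR-4": 2.0}
ver = {"miR-1": 0.9, "miR-3": -0.7, "miR-4": 1.1, "miR-5": 1.4}

up, down = overlap(tmz, ver)
# up == {"miR-1", "miR-4"}, down == {"miR-3"}
```

The same intersection, fed into a target-prediction tool, is what links the shared upregulated miRNAs to candidate target genes such as the methyltransferases mentioned above.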

    Energy conversion and storage via photoelectrochemical methods

    Photoelectroanalytical chemistry provides an elegant technique by which to explore, amongst others, various industrial and environmental applications. To this end, four areas of photoelectroanalytical chemistry are investigated in order to develop industrially and environmentally relevant galvanic and photogalvanic cells, together with exploring the electro-generation of an industrially important molecule and the diffusion factors that may affect this generation.

    The first study investigated long-range charge transfer, using tert-butylferrocene (tBuFc) as a model hydrophobic system. It is found that the apparent one-dimensional diffusion coefficient depends on the tBuFc loading. It is suggested that an efficient relay mechanism for electron transfer operates through the partitioning of the oxidised form between the two subphases, with inter-pseudophase reaction.

    The second study investigated normal lyotropic liquid crystals (in the lamellar or hexagonal phases) as a route to afford a structured, three-dimensional, quasi-biphasic framework within which electron transfer cascades may take place, studied using cyclic voltammetry. It is shown that these cascades can take place through reagent partitioning between the hydrophobic and hydrophilic subphases, and it is illustrated how the structure and its orientation, the nature of the ionic doping of the framework, and the hydrophobicity of the redox analyte may give rise to changes in the observed voltammetric waveshape. For the case of an artificial mimic of the first few stages of Photosystem I, it is demonstrated that photo-induced electron transfer is likewise affected by the orientation, and a system with a photon efficiency of ~0.1% is developed.

    Thirdly, a novel attempt at power production was made through the construction and optimisation of a photogalvanic cell system. A literature review was conducted and a system proposed utilising 10-methylphenothiazine (NMP) as a light harvester and zinc as a sacrificial electrode, with tetrabutylammonium chloride (TBAP) as a supporting electrolyte and chloroform as a mediator. The study aimed to create a cell that could be produced using industrial run-off or other waste water supplies. A series of cells was produced with varying concentrations of both the zinc and NMP solutions, and the power conversion of each system was studied by producing a voltage-current plot. A system exhibiting 9.02% conversion efficiency was obtained; further studies were conducted to show whether the zinc species affected the power conversion or whether silver would act in a similar way. A mechanism was proposed for the power production process, and so studies using 2,4-dichlorophenol (DCP) rather than chloroform were conducted; it was believed that the dissociation step for DCP was stepwise rather than concerted. Lower power production was seen in these cells, as predicted by the reaction mechanism. Tris-(4-bromophenyl)-amine (TBA), an alternative light harvester to NMP, was used to see if altering the active chemical agent resulted in a change in efficiency. Finally, a photogalvanic cell that employs 2,4-dichlorophenol as a fuel source, an N-substituted phenothiazine as light harvester, and a sacrificial zinc anode is presented, and shown to afford a ca. 4% light-to-electrical power conversion efficiency in violet light.
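
The efficiency figures quoted above come from voltage-current sweeps. A minimal sketch of that calculation (the curve and the incident power below are illustrative numbers, not the thesis data): the light-to-electrical conversion efficiency is the maximum electrical power found on the voltage-current curve divided by the incident optical power.

```python
# Power conversion efficiency from a voltage-current sweep of a
# photogalvanic cell (hypothetical data).

def conversion_efficiency(v_i_pairs, incident_power_w):
    p_max = max(v * i for v, i in v_i_pairs)   # maximum power point on the curve
    return 100.0 * p_max / incident_power_w    # as a percentage

# hypothetical voltage (V) / current (A) sweep under 1 mW of incident light
curve = [(0.00, 1.0e-3), (0.10, 0.9e-3), (0.20, 0.45e-3), (0.30, 0.0)]
eff = conversion_efficiency(curve, incident_power_w=1.0e-3)
# eff == 9.0 percent: the maximum of V*I on this curve is 9e-5 W
```

In practice one would sweep many operating points and interpolate, but the efficiency definition itself is just this ratio at the maximum power point.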

    Could Upregulated Hsp70 Protein Compensate for the Hsp90-Silence-Induced Cell Death in Glioma Cells?

    The molecular chaperone heat shock protein 90 alpha (Hsp90α) has been recognized in various tumours including glioma. This pilot study using a proteomic approach analyses the downstream effects of Hsp90 inhibition using 17-allylamino-17-demethoxygeldanamycin (17AAG) and a short hairpin RNA (shRNA) oligonucleotide targeting hsp90α (shhsp90α) in the U87-MG glioma cell line. Preliminary data coupled with bioinformatic analysis identified several known and unknown Hsp90 client proteins that demonstrated a change in their protein expression after Hsp90 inhibition, signifying an alteration in the canonical pathways of cell cycle progression, apoptosis, cell invasion, angiogenesis, and metastasis. Members of the glycolysis pathway were upregulated, demonstrating an increased dependency on glycolysis as an energy source by the treated glioma cells. Upregulated proteins also include Hsp70 and members of its family such as Hsp27 and gp96, thereby suggesting the role of Hsp90 co-chaperones in compensating for Hsp90 function after Hsp90 inhibition. Considering Hsp70's role in antiapoptosis, it was postulated that a combination therapy involving a multitarget approach could be carried out. Consequently, inhibition of both Hsp90 and Hsp70 in U87-MG glioma cells resulted in 60% cell death, indicating the importance of combination therapy for glioma therapeutics.

    Lung nodule modeling and detection for computerized image analysis of low dose CT imaging of the chest.

    From a computerized image analysis perspective, early diagnosis of lung cancer involves detection of suspicious nodules and their classification into different pathologies. The detection stage involves a detection approach, usually by template matching, and an authentication step to reduce false positives, usually conducted by a classifier of one form or another; statistical, fuzzy logic, and support vector machine approaches have been tried. The classification stage matches, according to a particular approach, the characteristics (e.g., shape, texture, and spatial distribution) of the detected nodules to the common characteristics (again, shape, texture, and spatial distribution) of nodules with known pathologies (confirmed by biopsies). This thesis focuses on the first step, i.e., nodule detection. Specifically, the thesis addresses three issues: a) understanding the CT data of typical low dose CT (LDCT) scanning of the chest, and devising an image processing approach to reduce the inherent artifacts in the scans; b) devising an image segmentation approach to isolate the lung tissues from the rest of the chest and thoracic regions in the CT scans; and c) devising a nodule modeling methodology to enhance the detection rate and benefit the ultimate step in computerized image analysis of LDCT of the lungs, namely associating a pathology with the detected nodule. The methodology for reducing the noise artifacts is based on noise analysis and examination of typical LDCT scans that may be gathered in a repetitive fashion, since a reduction in the resolution is inevitable to avoid excessive radiation. Two optimal filtering methods were tested on samples of the ELCAP screening data: the Wiener and anisotropic diffusion filters. Preference is given to the anisotropic diffusion filter, which can be implemented on 7x7 blocks/windows of the CT data.
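
The anisotropic diffusion filter preferred above is commonly implemented as the Perona-Malik scheme. A minimal sketch (one common formulation; the thesis applies it to 7x7 blocks, whereas this runs on a whole 2-D array, and the parameters are illustrative):

```python
import numpy as np

# Perona-Malik anisotropic diffusion: smooth within regions while the
# edge-stopping function suppresses diffusion across strong gradients.
def anisotropic_diffusion(img, n_iter=10, kappa=30.0, lam=0.2):
    img = img.astype(float).copy()
    for _ in range(n_iter):
        # differences toward the four neighbours (zero flux at the borders)
        dn = np.zeros_like(img); dn[1:, :]  = img[:-1, :] - img[1:, :]
        ds = np.zeros_like(img); ds[:-1, :] = img[1:, :]  - img[:-1, :]
        de = np.zeros_like(img); de[:, :-1] = img[:, 1:]  - img[:, :-1]
        dw = np.zeros_like(img); dw[:, 1:]  = img[:, :-1] - img[:, 1:]
        # exponential edge-stopping conduction coefficient
        g = lambda d: np.exp(-(d / kappa) ** 2)
        img += lam * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return img

# smoothing a noisy constant patch should reduce the variance
noisy = 100.0 + np.random.default_rng(1).normal(0, 5, (32, 32))
smoothed = anisotropic_diffusion(noisy)
```

The update weight `lam` is kept at or below 0.25 for numerical stability of the explicit four-neighbour scheme, and `kappa` sets the gradient magnitude treated as an edge.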
    The methodology for lung segmentation is based on the inherent characteristics of the LDCT scans, which exhibit a distinct bi-modal gray-scale histogram. A linear model is used to describe the histogram (the joint probability density function of the lung and non-lung tissues) by a linear combination of weighted kernels. Gaussian kernels were chosen, and the classic Expectation-Maximization (EM) algorithm was employed to estimate the marginal probability densities of the lung and non-lung tissues and to select an optimal segmentation threshold. The segmentation is further enhanced using standard shape analysis based on mathematical morphology, which improves the continuity of the outer and inner borders of the lung tissues. This approach (a preliminary version of which appeared in [14]) is found to be adequate for lung segmentation as compared to more sophisticated approaches developed at the CVIP Lab (e.g., [15][16]) and elsewhere. The methodology developed for nodule modeling is based on understanding the physical characteristics of the nodules in LDCT scans, as identified by human experts. An empirical model is introduced for the probability density of the image intensity (or Hounsfield units) versus the radial distance measured from the centroid (center of mass) of typical nodules. This probability density showed that the nodule spatial support is within a circle/square of size 10 pixels, i.e., limited to 5 mm in length, which is within the range that radiologists specify to be of concern. This probability density is used to fill in the intensity (or Hounsfield units) of parametric nodule models. For these models (e.g., circles or semi-circles), given a certain radius, we calculate the intensity (or Hounsfield units) using an exponential expression for the radial distance, with parameters specified from the histogram of an ensemble of typical nodules.
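
The EM-based bimodal thresholding described above can be sketched with a hand-rolled two-component Gaussian mixture fit (synthetic intensities roughly mimicking dark lung and brighter tissue modes; not the thesis CT data or its exact estimator):

```python
import numpy as np

# Fit a two-component Gaussian mixture by EM; the fitted means/weights then
# define a threshold between the two histogram modes.
def em_two_gaussians(x, n_iter=50):
    # crude initialisation from the data quartiles
    mu = np.array([np.percentile(x, 25), np.percentile(x, 75)])
    sigma = np.array([x.std(), x.std()])
    pi = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each sample
        pdf = (pi / (sigma * np.sqrt(2 * np.pi))
               * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2))
        r = pdf / pdf.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and standard deviations
        pi = r.mean(axis=0)
        mu = (r * x[:, None]).sum(axis=0) / r.sum(axis=0)
        sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / r.sum(axis=0))
    return pi, mu, sigma

rng = np.random.default_rng(2)
# synthetic bimodal sample: a dark "lung-like" mode and a brighter "tissue" mode
x = np.concatenate([rng.normal(-600, 50, 3000), rng.normal(40, 60, 3000)])
pi, mu, sigma = em_two_gaussians(x)
# the fitted means land near the two modes; the segmentation threshold is
# chosen between them, e.g. where the weighted component densities are equal
```

With well-separated modes like these, the quartile initialisation is already close and EM converges in a few iterations.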
    This work is similar in spirit to the earlier work of Farag et al., 2004 and 2005 [18][19], except that the empirical density of the radial distance and the histogram of typical nodules provide a data-driven guide for estimating the intensity (or Hounsfield units) of the nodule models. We examined the sensitivity and specificity of parametric nodules in a template-matching framework for nodule detection. We show that false positives are an inevitable problem with typical machine learning methods of automatic lung nodule detection, which invites further efforts and perhaps fresh thinking into automatic nodule detection. A new approach for nodule modeling is introduced in Chapter 5 of this thesis, which holds high promise for both the detection and the classification of nodules. Using the ELCAP study, we created an ensemble of four types of nodules and generated a nodule model for each type based on optimal data reduction methods. The resulting nodule model, for each type, has led to drastic improvements in the sensitivity and specificity of nodule detection. This approach may be used for classification as well. In conclusion, the methodologies in this thesis are based on understanding the LDCT scans and what is to be expected in terms of image quality. Noise reduction and image segmentation are standard. The thesis illustrates that proper nodule models are possible and that a computerized image analysis approach to detect and classify lung nodules is indeed feasible. Extensions to the results in this thesis are immediate, and the CVIP Lab has devised plans to pursue subsequent steps using clinical data.
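
The parametric-nodule template matching described above can be sketched as a circular template whose intensity decays exponentially with radial distance, scored against image patches by normalised cross-correlation (the template size, decay constant, and test patches below are illustrative, not the thesis parameters):

```python
import numpy as np

# Circular nodule template with exponentially decaying radial intensity.
def nodule_template(size=11, peak=1.0, decay=0.4):
    c = size // 2
    y, x = np.mgrid[0:size, 0:size]
    r = np.hypot(x - c, y - c)              # radial distance from the centroid
    tmpl = peak * np.exp(-decay * r)        # intensity falls off with radius
    tmpl[r > c] = 0.0                       # restrict support to a disc
    return tmpl

# Naive normalised cross-correlation between a patch and the template.
def ncc(patch, tmpl):
    a = patch - patch.mean()
    b = tmpl - tmpl.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

tmpl = nodule_template()
# a patch containing the template itself correlates perfectly ...
score_match = ncc(tmpl.copy(), tmpl)
# ... while a pure-noise patch scores much lower
rng = np.random.default_rng(3)
score_noise = ncc(rng.normal(0, 1, tmpl.shape), tmpl)
```

In a full detector the template is slid over the scan and local maxima of the correlation score become nodule candidates; thresholding that score is where the sensitivity/specificity trade-off discussed above appears.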