
    Targeted therapy in melanoma

    Malignant melanoma is a highly lethal disease unless detected early. Single-agent chemotherapy is well tolerated but is associated with very low response rates. Combination chemotherapy and biochemotherapy may improve objective response rates but do not prolong survival and are associated with greater toxicity. Immunotherapeutic approaches such as high-dose interleukin-2 are associated with durable responses in a small percentage of patients, but are impractical for many patients due to accessibility and toxicity issues. Elucidation of the molecular mechanisms of carcinogenesis in melanoma has expanded the horizon of opportunity to alter the natural history of the disease. Multiple signal transduction pathways appear to be aberrant, and drugs that target them have been and continue to be in development. In this review we present data on the most promising targeted agents in development, including B-Raf inhibitors and other signal transduction inhibitors, oligonucleotides, proteasome inhibitors, and inhibitors of angiogenesis. Most agents are in early-phase trials, although some have already reached phase III evaluation. As knowledge and experience with targeted therapy advance, new challenges are arising, particularly in terms of resistance and appropriate patient selection.

    Multi-Object Classification and Unsupervised Scene Understanding Using Deep Learning Features and Latent Tree Probabilistic Models

    Deep learning has shown state-of-the-art classification performance on datasets such as ImageNet, which contain a single object in each image. However, multi-object classification is far more challenging. We present a unified framework which leverages the strengths of multiple machine learning methods, viz. deep learning, probabilistic models and kernel methods, to obtain state-of-the-art performance on Microsoft COCO, which consists of non-iconic images. We incorporate contextual information in natural images through a conditional latent tree probabilistic model (CLTM), where object co-occurrences are conditioned on the fc7 features extracted from a pre-trained ImageNet CNN. We learn the CLTM tree structure using conditional pairwise probabilities for object co-occurrences, estimated through kernel methods, and we learn its node and edge potentials by training a new 3-layer neural network, which takes fc7 features as input. Object classification is carried out via inference on the learnt conditional tree model, and we obtain significant gains in precision-recall and F-measures on MS-COCO, especially for difficult object categories. Moreover, the latent variables in the CLTM capture scene information: the images with top activations for a latent node have common themes, such as grassland or food scenes, and so on. In addition, we show that a simple k-means clustering of the inferred latent nodes alone significantly improves scene classification performance on the MIT-Indoor dataset, without the need for any retraining, and without using scene labels during training. Thus, we present a unified framework for multi-object classification and unsupervised scene understanding.
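    The unsupervised scene-clustering step described in the abstract can be illustrated with a small sketch. This is not the authors' pipeline: the latent-node activations here are synthetic stand-ins for two hypothetical scene themes, and the plain k-means below (with a deterministic farthest-point initialization) only demonstrates that clustering activation vectors alone can separate scene groups without labels.

    ```python
    import numpy as np

    def kmeans(X, k, iters=50):
        """Plain k-means over latent-node activation vectors."""
        # Deterministic farthest-point initialization.
        centers = [X[0]]
        for _ in range(k - 1):
            d = np.min([((X - c) ** 2).sum(axis=1) for c in centers], axis=0)
            centers.append(X[np.argmax(d)])
        centers = np.array(centers)
        for _ in range(iters):
            # Assign each vector to its nearest center, then recompute centers.
            labels = np.argmin(((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2), axis=1)
            for j in range(k):
                if np.any(labels == j):
                    centers[j] = X[labels == j].mean(axis=0)
        return labels

    # Synthetic stand-ins for inferred latent-node activations of two scene themes.
    rng = np.random.default_rng(1)
    grass = rng.normal(0.0, 0.1, size=(20, 8))   # hypothetical "grassland" theme
    food = rng.normal(1.0, 0.1, size=(20, 8))    # hypothetical "food" theme
    X = np.vstack([grass, food])
    labels = kmeans(X, k=2)
    ```

    With well-separated activation patterns, the two themes fall into distinct clusters even though no scene labels were used, which is the intuition behind the MIT-Indoor result.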

    Ontology based data warehouse modeling and managing ecology of human body for disease and drug prescription management

    The health care sector is currently experiencing a major crisis of information overload. With the increasing prevalence of chronic diseases and the ageing population, the amount of paperwork is greater than ever before. In the US, a hospital admission of one patient generates an estimated 60 pieces of paper. The federal governments of various countries have passed policies and initiatives that focus on introducing information systems into the health care sector. Technology will immensely reduce the cost of managing patients and even reduce the risks of mis-diagnosing and prescribing incorrect medications to patients. This paper primarily focuses on introducing the concept of ontology based warehouse modelling and managing the ecology of the human body for disease and drug prescription management. Disorders of the human body and factors such as the patient's age, living and working conditions, and familial and genetic influences can be simulated as metadata in a warehousing environment. In this environment, various relationships are identified and described between these factors and the diseases. Secondly, we also introduce an ontological representation of the various human body systems, such as the digestive, musculoskeletal and nervous systems, in disease processes. Although this is an extensive and complex knowledge domain, the work in this paper is one of the first to attempt to introduce the use of ontology based data warehousing and data mining conceptually. We also aim at implementing and applying this research in practice.
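    The factor-to-disease relationships described above can be made concrete with a toy sketch. The vocabulary here (disease names, body systems, risk factors) is purely illustrative and not drawn from the paper's ontology; it only shows how warehouse metadata linking patient factors to diseases could be queried.

    ```python
    # Toy metadata: each disease dimension is linked to a body system and to
    # the patient-factor dimensions described in the abstract (all illustrative).
    ontology = {
        "Type2Diabetes": {
            "body_system": "digestive",
            "risk_factors": ["age", "familial_history", "working_conditions"],
        },
        "Osteoarthritis": {
            "body_system": "musculoskeletal",
            "risk_factors": ["age", "living_conditions"],
        },
    }

    def diseases_linked_to(factor):
        """Query the metadata: which diseases list this factor as a risk factor?"""
        return sorted(d for d, info in ontology.items() if factor in info["risk_factors"])

    print(diseases_linked_to("age"))  # ['Osteoarthritis', 'Type2Diabetes']
    ```

    In a real warehouse these relationships would live in dimension and bridge tables rather than a dictionary, but the query pattern is the same.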

    Multidimensional ontology modeling of human digital ecosystems affected by social behavioural data patterns

    Relational and hierarchical data modeling studies are carried out using simple and explicit comparison based ontology. The comparison is performed on relationally and hierarchically structured data entities/dimensions. This methodology is adopted to understand the human ecosystem as affected by human behavioural and social disorder data patterns. For example, comparisons may be made among human systems: between male and female, fat and slim, disabled and able-bodied (physical impairment), normal and abnormal (psychological), smokers and non-smokers, and among different age group domains. There may be different hierarchies in which super-type dimensions are conceptualized into several subtype dimensions and integrated by connecting the interrelated common data attributes. Domain ontologies are built based on known-knowledge mining, and unknown relationships affected by social behaviour data patterns are thus modeled. This study is useful in understanding human situations, behavioural patterns and social ecology, and can assist health and medical practitioners, social workers and psychologists in treating their patients and clients.

    Modeling the fate and transport of chlorinated benzenes in Baton Rouge Bayou

    Knowledge of the fate and transport of chlorinated benzenes is necessary at certain sites for effective remediation of contaminated soils using plants. Current research is examining the capability of wetland plants to catalyze degradation or attenuate migration, which again requires knowledge of the uptake of contaminants from an aqueous environment. A two-stage model was used to estimate the rate of uptake of contaminants from sediments. The first stage of the model predicts pollutant leaching rates from sediments to the overlying water, which are then fed into the plant or bioreactor model systems. In this study the flux from contaminated sediments into the overlying water is simulated using a diffusive transport model. This flux is used to calculate the aqueous phase concentration leaving the bed, which is then fed into a plant reactor. Concentrations coming out of the bed and the plant reactor are both measured. A mathematical model was proposed and developed for the plant or bioreactor which incorporates the transport model, and is thus able to accurately predict the buffering potential of willows on chlorinated benzenes. This model was used to estimate the buffering effect of willows on sediments contaminated with lower chlorinated benzenes in Baton Rouge Bayou, near the Petro Processors, Inc. Superfund site. The model predictions were calibrated and verified against laboratory work, and the model was then scaled up to determine the uptake. Uptake rates for willows, based on retention times, were estimated for different flow conditions of the bayou (slow, fast and cyclic). The uptake rate for the 1.34-acre study area was highest for fast flow conditions at 3.8 kg/year, while for slow and cyclic flows it was 3.62 and 3.01 kg/year, respectively. The average total contaminant uptake for 100 acres over 20 years is estimated to be around 1.56 tons/year. The models used here adequately fit the lab data based on a paired t-test and correlation coefficient. This model can be used at similar sites in other places and for different compounds.
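    The two-stage structure described above (sediment flux feeding a plant reactor) can be sketched with a minimal steady-state mass balance. All parameters below are hypothetical round numbers, not the calibrated bayou values, and the well-mixed reactor with first-order uptake is a simplification of the thesis model.

    ```python
    def reactor_outlet(c_in, flow, volume, k_uptake):
        """Steady-state outlet concentration of a well-mixed plant reactor.

        Mass balance: 0 = Q*(c_in - c_out) - k*V*c_out
        =>            c_out = c_in / (1 + k*V/Q)
        """
        return c_in / (1.0 + k_uptake * volume / flow)

    def uptake_rate(c_in, flow, volume, k_uptake):
        """Contaminant mass removed by the plants per unit time: Q*(c_in - c_out)."""
        c_out = reactor_outlet(c_in, flow, volume, k_uptake)
        return flow * (c_in - c_out)

    # Illustrative numbers only (not the study's parameters):
    c_in = 0.05      # mg/L leaving the sediment bed (stage 1 output)
    flow = 2000.0    # L/day through the willow stand
    volume = 5000.0  # L effective reactor volume
    k = 0.1          # 1/day first-order uptake constant

    print(round(uptake_rate(c_in, flow, volume, k), 2), "mg/day removed")
    ```

    Scaling such a per-reactor rate by area and flow regime is what yields the kg/year figures reported for the study site.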

    Ontology based data warehousing for mining of heterogeneous and multidimensional data sources

    Heterogeneous and multidimensional big-data sources are prevalent in virtually all business environments. System and data analysts are often unable to access and rapidly analyse these big-data sources. A robust and versatile data warehousing system is developed, integrating domain ontologies from multidimensional data sources. For example, petroleum digital ecosystems and digital oil field solutions, derived from big-data petroleum (information) systems, are in increasing demand in multibillion dollar resource businesses worldwide. This work is recognized by the IEEE Industrial Electronics Society and has appeared in more than 50 international conference proceedings and journals.

    SENSING CHARACTERISTICS OF MULTIWALLED CARBON NANOTUBE (MWCNT) SENSORS EMBEDDED IN POROUS ALUMINA MEMBRANES

    A theoretical model is developed for calculating the sensitivity of resistive sensors based on aligned multiwall carbon nanotubes (MWCNTs) embedded in the pores of alumina membranes. Aligned MWCNTs offer more surface area, as each CNT acts as a landing site for detecting gas species. The MWCNTs behave as a p-type semiconducting layer; when bus-bar contacts are placed at either end of the top surface, the resistance between the contacts responds to oxidizing gases (resistance decreases) and reducing gases (resistance increases). The model presented in this thesis aims to understand the dependence of the device resistance on the MWCNT resistance, and the dependence of the sensitivity on the device structure and design. The model was utilized for enhancing the sensitivity of MWCNT sensors for ammonia (30% sensitivity) and nitrogen dioxide (40% sensitivity). Experimental results from sensitivity measurements are compared with theoretical predictions.
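    The sensitivity figures quoted above follow the usual definition for resistive gas sensors, which can be shown in a short sketch. The resistance values below are invented for illustration; only the sign conventions (p-type layer: reducing gas raises resistance, oxidizing gas lowers it) come from the abstract.

    ```python
    def sensitivity(r_air, r_gas):
        """Percent sensitivity of a resistive sensor: |R_gas - R_air| / R_air * 100."""
        return abs(r_gas - r_air) * 100.0 / r_air

    # Hypothetical resistances consistent with p-type behaviour:
    r_air = 100.0                     # kOhm baseline in air
    print(sensitivity(r_air, 130.0))  # NH3 (reducing, resistance rises): 30.0 %
    print(sensitivity(r_air, 60.0))   # NO2 (oxidizing, resistance falls): 40.0 %
    ```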

    A HIGHLY RELIABLE NON-VOLATILE FILE SYSTEM FOR SMALL SATELLITES

    Recent advancements in solid-state memories have resulted in packing several gigabytes (GB) of memory into tiny, postage-stamp-sized memory cards. Of late, Secure Digital (SD) cards have become a de facto standard for portable handheld devices. They have a growing presence in almost all embedded applications where huge volumes of data need to be handled and stored. For the same reason, SD cards are also widely used in space applications. Using SD cards in space applications requires robust, radiation-hardened cards and highly reliable, fault-tolerant file systems to manage them. The present work is focused on developing a highly reliable, fault-tolerant, SD-card-based FAT16 file system for space applications.
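    One common fault-tolerance building block for such file systems is redundant storage with checksums, sketched below. This is an illustrative technique only, not the thesis design: every record is stored twice with a CRC32, and a read falls back to the backup copy when the primary fails its checksum (as it might after a radiation-induced bit flip).

    ```python
    import struct
    import zlib

    def pack_record(data: bytes) -> bytes:
        """Prefix a payload with its CRC32 (little-endian 4-byte header)."""
        return struct.pack("<I", zlib.crc32(data)) + data

    def unpack_record(blob: bytes):
        """Return the payload if its CRC32 checks out, else None."""
        crc = struct.unpack("<I", blob[:4])[0]
        payload = blob[4:]
        return payload if zlib.crc32(payload) == crc else None

    def reliable_read(primary: bytes, backup: bytes):
        """Read the primary copy; fall back to the backup on corruption."""
        return unpack_record(primary) or unpack_record(backup)

    entry = b"FILE0001TXT"          # a FAT-style 8.3 directory entry payload
    primary = bytearray(pack_record(entry))
    backup = pack_record(entry)
    primary[5] ^= 0xFF              # simulate a single-event upset in the payload
    recovered = reliable_read(bytes(primary), backup)
    ```

    FAT16 itself already keeps two copies of the file allocation table; adding checksummed reads on top is one way a file system can detect, not just mask, corruption.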

    Ontology based warehouse modeling of fractured reservoir ecosystems - for effective borehole and petroleum production management


    Ontology based data warehouse modeling and mining of earthquake data: prediction analysis along Eurasian-Australian continental plates

    Seismological observatories archive volumes of heterogeneous types of earthquake data. These organizations, by virtue of their geographic operations, handle complicated hierarchical data structures. In order to perform seismological observatories' business activities effectively and efficiently, the flow of data and information must be consistent, and information must be shared among units situated at different geographic locations. To improve information sharing among observatories, the heterogeneous earthquake data from various sources must be intelligently integrated. A data warehouse is a solution in which earthquake data entities are modeled using an ontology-based multidimensional representation. These data are structured and stored in multiple dimensions in a warehousing environment to minimize the complexity of heterogeneous data. The authors are of the view that the data integration process adds value to knowledge building and information sharing among different observatories, and suggest that warehoused data modeling facilitates earthquake prediction analysis more effectively.