587 research outputs found

    Approaches in biotechnological applications of natural polymers

    Natural polymers, such as gums and mucilages, are biocompatible, inexpensive, readily available and non-toxic materials of natural origin. They are increasingly preferred over synthetic materials for industrial applications because of their intrinsic properties, and they are considered alternative sources of raw material owing to their sustainability, biodegradability and biosafety. By definition, gums and mucilages are polysaccharides or complex carbohydrates consisting of one or more monosaccharides or their derivatives joined in a bewildering variety of linkages and structures. Natural gums occur in a variety of plant seeds and exudates, tree or shrub exudates, seaweed extracts, fungi, bacteria, and animal sources. Water-soluble gums, also known as hydrocolloids, are exudates and pathological products; they therefore do not form part of the cell wall. Mucilages, on the other hand, are part of the cell and are physiological products. Notably, gums account for the largest share of polymer materials derived from plants. They have broad applications in both food and non-food industries, being commonly used as thickening, binding, emulsifying, suspending and stabilizing agents and as matrices for drug release in the pharmaceutical and cosmetic industries. In the food industry, their gelling properties and their ability to form edible films and coatings are extensively studied. The use of gums depends on the intrinsic properties they provide, often at a cost below that of synthetic polymers. To increase their value, gums are being processed into various forms, including, most recently, nanomaterials, for various biotechnological applications. This article therefore reviews the main natural polymers, including galactomannans, cellulose, chitin, agar, carrageenan, alginate, cashew gum, pectin and starch, along with current research on them.

    To the Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq) for fellowships (LCBBC and MGCC) and the Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES) (PBSA). This study was supported by the Portuguese Foundation for Science and Technology (FCT) under the scope of the strategic funding of the UID/BIO/04469/2013 unit, the project RECI/BBB-EBI/0179/2012 (FCOMP-01-0124-FEDER-027462) and COMPETE 2020 (POCI-01-0145-FEDER-006684) (JAT).

    Effects of rapid urbanisation on the urban thermal environment between 1990 and 2011 in Dhaka Megacity, Bangladesh

    This study investigates the influence of land-use/land-cover (LULC) change on land surface temperature (LST) in Dhaka Megacity, Bangladesh, during a period of rapid urbanisation. LST was derived from Landsat 5 TM scenes captured in 1990, 2000 and 2011 and compared to contemporaneous LULC maps. We compared index-based and linear spectral mixture analysis (LSMA) techniques for modelling LST. LSMA-derived biophysical parameters corresponded more strongly to LST than those produced using index-based parameters. Results indicated that vegetation and water surfaces had relatively stable LST, but it increased by around 2 °C when these surfaces were converted to built-up areas with extensive impervious surfaces. Knowledge of the expected change in LST when one land cover is converted to another can inform land planners of the potential impact of future changes and motivates the development of better management strategies.
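
    As context for the LST derivation mentioned above (the study's exact processing chain is not given in this abstract), a common first step with Landsat 5 TM thermal data is converting band 6 digital numbers to at-sensor radiance and then to brightness temperature; emissivity correction and comparison with LULC maps would follow. The sketch below is a minimal, assumed illustration of that step only: it uses NumPy, the published TM band 6 calibration constants K1 and K2, and placeholder gain/offset values standing in for the ones read from a real scene's metadata.

    ```python
    import numpy as np

    # Published calibration constants for Landsat 5 TM band 6 (thermal):
    K1 = 607.76   # W / (m^2 * sr * um)
    K2 = 1260.56  # Kelvin

    def brightness_temperature(dn, gain, offset):
        """Convert band 6 digital numbers to at-sensor brightness temperature (K).

        gain/offset normally come from the scene metadata (MTL file); the values
        used in the example call below are illustrative placeholders, not the
        ones used in the study.
        """
        radiance = gain * dn.astype(float) + offset   # DN -> spectral radiance
        return K2 / np.log(K1 / radiance + 1.0)       # radiance -> Kelvin

    # Tiny synthetic "scene" of 8-bit digital numbers:
    dn = np.array([[120, 135], [150, 170]], dtype=np.uint8)
    t_kelvin = brightness_temperature(dn, gain=0.055, offset=1.238)
    print(np.round(t_kelvin - 273.15, 1))  # degrees Celsius
    ```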

    On the Representability of Complete Genomes by Multiple Competing Finite-Context (Markov) Models

    A finite-context (Markov) model of order k yields the probability distribution of the next symbol in a sequence of symbols, given the recent past up to depth k. Markov modeling has long been applied to DNA sequences, for example to find gene-coding regions. With the first studies came the discovery that DNA sequences are non-stationary: distinct regions require distinct model orders. Since then, Markov and hidden Markov models have been extensively used to describe the gene structure of prokaryotes and eukaryotes. However, to our knowledge, a comprehensive study of the potential of Markov models to describe complete genomes is still lacking. We address this gap in this paper. Our approach relies on (i) multiple competing Markov models of different orders, (ii) careful programming techniques that allow orders as large as sixteen, (iii) adequate inverted-repeat handling, and (iv) probability estimates suited to the wide range of context depths used. To measure how well a model fits the data at a particular position in the sequence, we use the negative logarithm of the probability estimate at that position. This measure yields information profiles of the sequence, which are of independent interest. Its average over the entire sequence, which amounts to the average number of bits per base needed to describe the sequence, is used as a global performance measure. Our main conclusion is that, from a probabilistic or information-theoretic point of view and according to this performance measure, multiple competing Markov models explain entire genomes almost as well as, or even better than, state-of-the-art DNA compression methods such as XM, which rely on very different statistical models. This is surprising because Markov models are local (short-range), whereas the statistical models underlying the other methods exploit the extensive data repetitions in DNA sequences and therefore have a non-local character.
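
    The per-position measure described above lends itself to a compact illustration. The sketch below is a minimal, assumed implementation, not the authors' tool (which combines multiple competing models, orders up to sixteen and inverted-repeat handling): it updates a single order-k finite-context model adaptively along a sequence, records the negative log2 of each probability estimate as the information profile, and averages it to obtain bits per base. The additive smoothing constant alpha is an assumption for the sketch, not the estimator from the paper.

    ```python
    from collections import defaultdict
    from math import log2

    ALPHABET = "ACGT"

    def information_profile(seq, k=3, alpha=1.0):
        """Per-position -log2 P(next symbol | previous k symbols) for a single
        finite-context model, estimated adaptively along the sequence."""
        counts = defaultdict(lambda: defaultdict(int))  # context -> symbol -> count
        profile = []
        for i, sym in enumerate(seq):
            ctx = seq[max(0, i - k):i]
            ctx_counts = counts[ctx]
            total = sum(ctx_counts.values())
            # Additive (Laplace-style) smoothing keeps every probability nonzero.
            p = (ctx_counts[sym] + alpha) / (total + alpha * len(ALPHABET))
            profile.append(-log2(p))   # information content of this base, in bits
            ctx_counts[sym] += 1       # update the model after coding the symbol
        return profile

    seq = "ACGTACGTACGTTTTACGT"
    prof = information_profile(seq, k=3)
    print(f"average bits per base: {sum(prof) / len(prof):.3f}")
    ```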

    Search for new phenomena in final states with an energetic jet and large missing transverse momentum in pp collisions at √s = 8 TeV with the ATLAS detector

    Results of a search for new phenomena in final states with an energetic jet and large missing transverse momentum are reported. The search uses 20.3 fb^-1 of √s = 8 TeV data collected in 2012 with the ATLAS detector at the LHC. Events are required to have at least one jet with pT > 120 GeV and no leptons. Nine signal regions are considered with increasing missing transverse momentum requirements between E_T^miss > 150 GeV and E_T^miss > 700 GeV. Good agreement is observed between the number of events in data and Standard Model expectations. The results are translated into exclusion limits on models with either large extra spatial dimensions, pair production of weakly interacting dark matter candidates, or production of very light gravitinos in a gauge-mediated supersymmetric model. In addition, limits on the production of an invisibly decaying Higgs-like boson leading to similar topologies in the final state are presented.

    Search for the neutral Higgs bosons of the minimal supersymmetric standard model in pp collisions at √s = 7 TeV with the ATLAS detector

    A search for neutral Higgs bosons of the Minimal Supersymmetric Standard Model (MSSM) is reported. The analysis is based on a sample of proton-proton collisions at a centre-of-mass energy of 7 TeV recorded with the ATLAS detector at the Large Hadron Collider. The data were recorded in 2011 and correspond to an integrated luminosity of 4.7 fb^-1 to 4.8 fb^-1. Higgs boson decays into oppositely-charged muon or τ lepton pairs are considered for final states requiring either the presence or absence of b-jets. No statistically significant excess over the expected background is observed and exclusion limits at the 95% confidence level are derived. The exclusion limits are for the production cross-section of a generic neutral Higgs boson, φ, as a function of the Higgs boson mass and for h/A/H production in the MSSM as a function of the parameters m_A and tan β in the m_h^max scenario for m_A in the range of 90 GeV to 500 GeV. Copyright CERN

    Jet energy measurement with the ATLAS detector in proton-proton collisions at √s = 7 TeV

    The jet energy scale and its systematic uncertainty are determined for jets measured with the ATLAS detector at the LHC in proton-proton collision data at a centre-of-mass energy of √s = 7 TeV corresponding to an integrated luminosity of 38 pb^-1. Jets are reconstructed with the anti-kt algorithm with distance parameters R = 0.4 or R = 0.6. Jet energy and angle corrections are determined from Monte Carlo simulations to calibrate jets with transverse momenta pT ≥ 20 GeV and pseudorapidities |η| < 4.5. The jet energy systematic uncertainty is estimated using the single isolated hadron response measured in situ and in test-beams, exploiting the transverse momentum balance between central and forward jets in events with dijet topologies, and studying systematic variations in Monte Carlo simulations. The jet energy uncertainty is less than 2.5% in the central calorimeter region (|η| < 0.8) for jets with 60 ≤ pT < 800 GeV, and is at most 14% for pT < 30 GeV in the most forward region 3.2 ≤ |η| < 4.5. The jet energy is validated for jet transverse momenta up to 1 TeV to the level of a few percent using several in situ techniques, by comparing against a well-known reference such as the recoiling photon pT, the sum of the transverse momenta of tracks associated to the jet, or a system of low-pT jets recoiling against a high-pT jet. More sophisticated jet calibration schemes are presented based on calorimeter cell energy density weighting or hadronic properties of jets, aiming for an improved jet energy resolution and a reduced flavour dependence of the jet response. The systematic uncertainty of the jet energy determined from a combination of in situ techniques is consistent with the one derived from single hadron response measurements over a wide kinematic range. The nominal corrections and uncertainties are derived for isolated jets in an inclusive sample of high-pT jets. Special cases such as event topologies with close-by jets, or selections of samples with an enhanced content of jets originating from light quarks, heavy quarks or gluons, are also discussed and the corresponding uncertainties are determined. © 2013 CERN for the benefit of the ATLAS collaboration

    A Directed Molecular Evolution Approach to Improved Immunogenicity of the HIV-1 Envelope Glycoprotein

    A prophylactic vaccine is needed to slow the spread of HIV-1 infection. Optimization of the wild-type envelope glycoproteins to create immunogens that can elicit effective neutralizing antibodies is a high priority. Starting with ten genes encoding subtype B HIV-1 gp120 envelope glycoproteins and using in vitro homologous DNA recombination, we created chimeric gp120 variants that were screened for their ability to bind neutralizing monoclonal antibodies. Hundreds of variants were identified with novel antigenic phenotypes that exhibit considerable sequence diversity. Immunization of rabbits with these gp120 variants demonstrated that the majority can induce neutralizing antibodies to HIV-1. One novel variant, called ST-008, induced significantly improved neutralizing antibody responses when assayed against a large panel of primary HIV-1 isolates. Further study of various deletion constructs of ST-008 showed that the enhanced immunogenicity results from a combination of effective DNA priming, an enhanced V3-based response, and an improved response to the constant backbone sequences.

    Constraints on parton distribution functions and extraction of the strong coupling constant from the inclusive jet cross section in pp collisions at √s=7 TeV

    Peer reviewed

    Study of hadronic event-shape variables in multijet final states in pp collisions at √s=7 TeV

    Peer reviewed