
    The mechano-ubiquitinome of articular cartilage: differential ubiquitination and activation of a group of ER-associated DUBs and ER stress regulators

    Understanding how connective tissue cells respond to mechanical stimulation is important to human health and to disease processes in musculoskeletal diseases. Injury to articular cartilage is a key risk factor predisposing the tissue to damage and degenerative osteoarthritis. Recently, we discovered that mechanical injury to connective tissues, including murine and porcine articular cartilage, causes a significant increase in lysine 63-linked polyubiquitination. Here we identified the ubiquitin signature that is unique to injured articular cartilage tissue post mechanical injury (the “mechano-ubiquitinome”). A total of 463 ubiquitinated peptides were identified, with an enrichment of ubiquitinated peptides of proteins involved in protein processing in the endoplasmic reticulum (ER), also known as the ER-associated degradation (ERAD) response, including YOD1, BRCC3, ATXN3 and USP5, as well as the ER stress regulators RAD23B, VCP/p97 and Ubiquilin 1. Enrichment of these proteins suggested an injury-induced ER stress response; for instance, the ER stress markers DDIT3/CHOP and BIP/GRP78 were upregulated following cartilage injury at both the protein and gene expression levels. Similar ER stress induction was also observed in response to tail fin injury in zebrafish larvae, suggesting a generic response to tissue injury. Furthermore, using DUB-specific activity probes, we observed a rapid increase in global DUB activity following injury and significant activity in human osteoarthritic cartilage. Combined, these results implicate ubiquitination events and activation of a set of DUBs and ER stress regulators in cellular responses to cartilage tissue injury and in osteoarthritic cartilage tissues. This link through the ERAD pathway makes this protein set attractive for further investigation in in vivo models of tissue injury and for targeting in osteoarthritis and related musculoskeletal diseases.

    HI in the Outskirts of Nearby Galaxies

    The HI in disk galaxies frequently extends beyond the optical image, and can trace the dark matter there. I briefly highlight the history of high spatial resolution HI imaging, the contribution it made to the dark matter problem, and the current tension between several dynamical methods to break the disk-halo degeneracy. I then turn to the flaring problem, which could in principle probe the shape of the dark halo. Instead, however, a lot of attention is now devoted to understanding the role of gas accretion via galactic fountains. The current Λ cold dark matter theory has problems on galactic scales, such as the core-cusp problem, which can be addressed with HI observations of dwarf galaxies. For a similar range in rotation velocities, galaxies of type Sd have thin disks, while those of type Im are much thicker. After a few comments on modified Newtonian dynamics and on irregular galaxies, I close with statistics on the HI extent of galaxies.
    Comment: 38 pages, 17 figures, invited review, book chapter in "Outskirts of Galaxies", Eds. J. H. Knapen, J. C. Lee and A. Gil de Paz, Astrophysics and Space Science Library, Springer, in press.

    Accounting for International War: The State of the Discipline

    In studies of war it is important to observe that the processes leading to so frequent an event as conflict are not necessarily those that lead to so infrequent an event as war. Also, many models fail to recognize that a phenomenon irregularly distributed in time and space, such as war, cannot be explained on the basis of relatively invariant phenomena. Much research on periodicity in the occurrence of war has yielded little result, suggesting that the direction should now be to focus on such variables as diffusion and contagion. Structural variables, such as bipolarity, show contradictory results with some clear inter-century differences. Bipolarity, some results suggest, might have different effects on different social entities. A considerable number of studies analysing dyadic variables show a clear connection between equal capabilities among contending nations and escalation of conflict into war. Finally, research into national attributes often points to strength and geographical location as important variables. In general, the article concludes, there is room for modest optimism, as research into the question of war is no longer moving in non-cumulative circles. Systematic research is producing results and there is even a discernible tendency of convergence, in spite of a great diversity in theoretical orientations.

    Propaganda in an Age of Algorithmic Personalization: Expanding Literacy Research and Practice

    In this commentary, the author considers the rise of algorithmic personalization and the power of propaganda as they shift the dynamic landscape of 21st‐century literacy research and practice. Algorithmic personalization uses data from the behaviors, beliefs, interests, and emotions of the target audience to provide filtered digital content, targeted advertising, and differential product pricing to online users. As persuasive genres, advertising and propaganda may demand different types of reading practices than texts whose purpose is primarily informational or argumentative. Understanding the propaganda function of algorithmic personalization may lead to a deeper consideration of texts that activate emotion and tap into audience values for aesthetic, commercial, and political purposes. Increased attention to algorithmic personalization, propaganda, and persuasion in the context of K–12 literacy education may also help people cope with sponsored content, bots, and other forms of propaganda and persuasion that now circulate online.

    Variability of humidity conditions in the Arctic during the first International Polar Year, 1882-83

    Of all the early instrumental data for the Arctic, the meteorological data gathered during the first International Polar Year, in 1882–83 (IPY-1), are the best in terms of coverage, quality and resolution. Research carried out during IPY-1 scientific expeditions brought a significant contribution to the development of hygrometry in polar regions at the end of the 19th century. The present paper gives a detailed analysis of a unique series of humidity measurements that were carried out during IPY-1 at hourly resolution at nine meteorological stations, relatively evenly distributed across the High Arctic. It gives an overall view of the humidity conditions prevalent in the Arctic at that time. The results show that the spatial distribution of atmospheric water vapour pressure (e) and relative humidity (RH) in the Arctic during IPY-1 was similar to the present. In the annual course the highest values of e were noted in July and August, while the lowest occurred in the cold half of the year. In comparison to present-day conditions (1961–1990), the mean values of RH in the IPY-1 period (September 1882 to July 1883) were higher by 2.4–5.6%. Most of the differences observed between historical and modern RH values are not significant. The majority of historical daily RH values lie within two standard deviations of the current long-term monthly means.
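    The two-standard-deviation comparison described above can be sketched in a few lines. The modern mean, standard deviation, and historical daily values below are invented for illustration; they are not IPY-1 or 1961–1990 data.

```python
# Illustrative sketch: what share of historical daily relative-humidity (RH)
# values lies within two standard deviations of a modern long-term monthly
# mean? All numbers here are hypothetical, not IPY-1 measurements.

def share_within_two_sd(values, mean, sd):
    """Fraction of values v with |v - mean| <= 2 * sd."""
    hits = sum(1 for v in values if abs(v - mean) <= 2 * sd)
    return hits / len(values)

modern_mean_rh = 82.0   # hypothetical 1961-1990 monthly mean RH, in %
modern_sd_rh = 4.0      # hypothetical daily standard deviation, in %
historical_daily_rh = [79.5, 85.2, 88.0, 91.5, 76.0, 84.4, 90.2, 80.1]

share = share_within_two_sd(historical_daily_rh, modern_mean_rh, modern_sd_rh)
```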

    Planck early results. II. The thermal performance of Planck

    The performance of the Planck instruments in space is enabled by their low operating temperatures, 20 K for LFI and 0.1 K for HFI, achieved through a combination of passive radiative cooling and three active mechanical coolers. The scientific requirement for very broad frequency coverage led to two detector technologies with widely different temperature and cooling needs. Active coolers could satisfy these needs; a helium cryostat, as used by previous cryogenic space missions (IRAS, COBE, ISO, Spitzer, AKARI), could not. Radiative cooling is provided by three V-groove radiators and a large telescope baffle. The active coolers are a hydrogen sorption cooler (<20 K), a 4He Joule-Thomson cooler (4.7 K), and a 3He-4He dilution cooler (1.4 K and 0.1 K). The flight system was at ambient temperature at launch and cooled in space to operating conditions. The HFI bolometer plate reached 93 mK on 3 July 2009, 50 days after launch. The solar panel always faces the Sun, shadowing the rest of Planck, and operates at a mean temperature of 384 K. At the other end of the spacecraft, the telescope baffle operates at 42.3 K and the telescope primary mirror operates at 35.9 K. The temperatures of key parts of the instruments are stabilized by both active and passive methods. Temperature fluctuations are driven by changes in the distance from the Sun, sorption cooler cycling and fluctuations in gas-liquid flow, and fluctuations in cosmic ray flux on the dilution and bolometer plates. These fluctuations do not compromise the science data.

    Planck Early Results. VII. The Early Release Compact Source Catalogue

    A brief description of the methodology of construction, contents and usage of the Planck Early Release Compact Source Catalogue (ERCSC), including the Early Cold Cores (ECC) and the Early Sunyaev-Zeldovich (ESZ) cluster catalogue, is provided. The catalogue is based on data consisting of one complete mapping of the sky, and a second mapping of 60% of the sky, by Planck, thereby comprising the first high sensitivity radio/submillimetre observations of the entire sky. Four source detection algorithms were run as part of the ERCSC pipeline. A Monte-Carlo algorithm based on the injection and extraction of artificial sources into the Planck maps was implemented to select reliable sources among all extracted candidates such that the cumulative reliability of the catalogue is ≥90%. There is no requirement on completeness for the ERCSC. As a result of the Monte-Carlo assessment of the reliability of sources from the different techniques, an implementation of the PowellSnakes source extraction technique was used at the five frequencies between 30 and 143 GHz, while the SExtractor technique was used between 217 and 857 GHz. The 10σ photometric flux density limit of the catalogue at |b| > 30° is 0.49, 1.0, 0.67, 0.5, 0.33, 0.28, 0.25, 0.47 and 0.82 Jy at each of the nine frequencies between 30 and 857 GHz. Sources which are up to a factor of ∼2 fainter than this limit, and which are present in “clean” regions of the Galaxy where the sky background due to emission from the interstellar medium is low, are included in the ERCSC if they meet the high reliability criterion. The Planck ERCSC sources have known associations to stars with dust shells, stellar cores, radio galaxies, blazars, infrared luminous galaxies and Galactic interstellar medium features. A significant fraction of unclassified sources is also present in the catalogues.
    In addition, two early release catalogues that contain 915 cold molecular cloud core candidates and 189 SZ cluster candidates, generated using multifrequency algorithms, are presented. The entire source list, with more than 15000 unique sources, is ripe for follow-up characterisation with Herschel, ATCA, VLA, SOFIA, ALMA and other ground-based observing facilities.
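    The injection-and-extraction reliability assessment mentioned above can be illustrated with a toy one-dimensional version. The noise model, source amplitude, and detection threshold below are invented for illustration and bear no relation to the actual ERCSC pipeline.

```python
import random

# Toy sketch of a Monte-Carlo reliability estimate: inject artificial
# sources into a noise "map", run a threshold detector, and measure the
# fraction of detections that correspond to injected sources.
random.seed(0)

N_PIX, N_SRC = 10_000, 50

def detect(pixels, threshold):
    """Indices of pixels whose value exceeds the detection threshold."""
    return [i for i, v in enumerate(pixels) if v > threshold]

pixels = [random.gauss(0.0, 1.0) for _ in range(N_PIX)]  # unit-variance noise
injected = set(random.sample(range(N_PIX), N_SRC))
for i in injected:
    pixels[i] += 8.0  # add a bright artificial source (8 sigma)

candidates = detect(pixels, threshold=5.0)
reliability = sum(1 for i in candidates if i in injected) / len(candidates)
```

In a real pipeline the injection, extraction, and reliability measurement are repeated over many realisations and flux levels; here a single realisation suffices to show the bookkeeping.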

    Global, regional, and national comparative risk assessment of 79 behavioural, environmental and occupational, and metabolic risks or clusters of risks, 1990-2015: A systematic analysis for the Global Burden of Disease Study 2015

    Background: The Global Burden of Diseases, Injuries, and Risk Factors Study 2015 provides an up-to-date synthesis of the evidence for risk factor exposure and the attributable burden of disease. By providing national and subnational assessments spanning the past 25 years, this study can inform debates on the importance of addressing risks in context. Methods: We used the comparative risk assessment framework developed for previous iterations of the Global Burden of Disease Study to estimate attributable deaths, disability-adjusted life-years (DALYs), and trends in exposure by age group, sex, year, and geography for 79 behavioural, environmental and occupational, and metabolic risks or clusters of risks from 1990 to 2015. This study included 388 risk-outcome pairs that met World Cancer Research Fund-defined criteria for convincing or probable evidence. We extracted relative risk and exposure estimates from randomised controlled trials, cohorts, pooled cohorts, household surveys, census data, satellite data, and other sources. We used statistical models to pool data, adjust for bias, and incorporate covariates. We developed a metric that allows comparisons of exposure across risk factors—the summary exposure value. Using the counterfactual scenario of theoretical minimum risk level, we estimated the portion of deaths and DALYs that could be attributed to a given risk. We decomposed trends in attributable burden into contributions from population growth, population age structure, risk exposure, and risk-deleted cause-specific DALY rates. We characterised risk exposure in relation to a Socio-demographic Index (SDI). Findings: Between 1990 and 2015, global exposure to unsafe sanitation, household air pollution, childhood underweight, childhood stunting, and smoking each decreased by more than 25%. Global exposure for several occupational risks, high body-mass index (BMI), and drug use increased by more than 25% over the same period. 
All risks jointly evaluated in 2015 accounted for 57·8% (95% CI 56·6–58·8) of global deaths and 41·2% (39·8–42·8) of DALYs. In 2015, the ten largest contributors to global DALYs among Level 3 risks were high systolic blood pressure (211·8 million [192·7 million to 231·1 million] global DALYs), smoking (148·6 million [134·2 million to 163·1 million]), high fasting plasma glucose (143·1 million [125·1 million to 163·5 million]), high BMI (120·1 million [83·8 million to 158·4 million]), childhood undernutrition (113·3 million [103·9 million to 123·4 million]), ambient particulate matter (103·1 million [90·8 million to 115·1 million]), high total cholesterol (88·7 million [74·6 million to 105·7 million]), household air pollution (85·6 million [66·7 million to 106·1 million]), alcohol use (85·0 million [77·2 million to 93·0 million]), and diets high in sodium (83·0 million [49·3 million to 127·5 million]). From 1990 to 2015, attributable DALYs declined for micronutrient deficiencies, childhood undernutrition, unsafe sanitation and water, and household air pollution; reductions in risk-deleted DALY rates rather than reductions in exposure drove these declines. Rising exposure contributed to notable increases in attributable DALYs from high BMI, high fasting plasma glucose, occupational carcinogens, and drug use. Environmental risks and childhood undernutrition declined steadily with SDI; low physical activity, high BMI, and high fasting plasma glucose increased with SDI. In 119 countries, metabolic risks, such as high BMI and fasting plasma glucose, contributed the most attributable DALYs in 2015. Regionally, smoking still ranked among the leading five risk factors for attributable DALYs in 109 countries; childhood underweight and unsafe sex remained primary drivers of early death and disability in much of sub-Saharan Africa. Interpretation: Declines in some key environmental risks have contributed to declines in critical infectious diseases. 
    Some risks appear to be invariant to SDI. Increasing risks, including high BMI, high fasting plasma glucose, drug use, and some occupational exposures, contribute to rising burden from some conditions, but also provide opportunities for intervention. Some highly preventable risks, such as smoking, remain major causes of attributable DALYs, even as exposure is declining. Public policy makers need to pay attention to the risks that are increasingly major contributors to global burden. Funding: Bill & Melinda Gates Foundation.
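    The counterfactual attribution step described in the Methods can be illustrated with the standard population attributable fraction (PAF) formula. The exposure categories, prevalences, and relative risks below are hypothetical, not GBD 2015 estimates.

```python
# Sketch of attributable burden via a population attributable fraction:
#   PAF = (sum(p * RR) - sum(p' * RR)) / sum(p * RR)
# where p is the observed exposure distribution and p' the counterfactual
# (theoretical minimum risk) distribution. All numbers are hypothetical.

def paf(observed, counterfactual, relative_risk):
    """Population attributable fraction for categorical exposure levels."""
    num = sum(p * rr for p, rr in zip(observed, relative_risk))
    den = sum(p * rr for p, rr in zip(counterfactual, relative_risk))
    return (num - den) / num

p_obs = [0.5, 0.3, 0.2]    # observed prevalence: none / moderate / high exposure
p_tmrel = [1.0, 0.0, 0.0]  # counterfactual: everyone at theoretical minimum risk
rr = [1.0, 1.5, 2.5]       # relative risk at each exposure level

fraction = paf(p_obs, p_tmrel, rr)
attributable_dalys = fraction * 100_000  # attributable share of 100 000 cause DALYs
```

Multiplying the fraction by total cause-specific deaths or DALYs gives the attributable burden; the GBD framework layers mediation, joint-risk aggregation, and uncertainty propagation on top of this basic step.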

    Search for the associated production of the Higgs boson with a top-quark pair

    A search for the standard model Higgs boson produced in association with a top-quark pair (ttH) is presented, using data samples corresponding to integrated luminosities of up to 5.1 fb⁻¹ and 19.7 fb⁻¹ collected in pp collisions at center-of-mass energies of 7 TeV and 8 TeV, respectively. The search is based on the following signatures of the Higgs boson decay: H → hadrons, H → photons, and H → leptons. The results are characterized by an observed ttH signal strength relative to the standard model cross section, μ = σ/σ_SM, under the assumption that the Higgs boson decays as expected in the standard model. The best fit value is μ = 2.8 ± 1.0 for a Higgs boson mass of 125.6 GeV.

    Performance of reconstruction and identification of τ leptons decaying to hadrons and ντ in pp collisions at √s = 13 TeV

    The algorithm developed by the CMS Collaboration to reconstruct and identify τ leptons produced in proton-proton collisions at √s = 7 and 8 TeV, via their decays to hadrons and a neutrino, has been significantly improved. The changes include a revised reconstruction of π⁰ candidates, and improvements in multivariate discriminants to separate τ leptons from jets and electrons. The algorithm is extended to reconstruct τ leptons in highly Lorentz-boosted pair production, and in the high-level trigger. The performance of the algorithm is studied using proton-proton collisions recorded during 2016 at √s = 13 TeV, corresponding to an integrated luminosity of 35.9 fb⁻¹. The performance is evaluated in terms of the efficiency for a genuine τ lepton to pass the identification criteria and of the probabilities for jets, electrons, and muons to be misidentified as τ leptons. The results are found to be very close to those expected from Monte Carlo simulation.