Testing and Learning on Distributions with Symmetric Noise Invariance
Kernel embeddings of distributions and the Maximum Mean Discrepancy (MMD),
the resulting distance between distributions, are useful tools for fully
nonparametric two-sample testing and learning on distributions. However, it is
rarely the case that all possible differences between samples are of interest --
discovered differences can be due to different types of measurement noise, data
collection artefacts or other irrelevant sources of variability. We propose
distances between distributions which encode invariance to additive symmetric
noise, aimed at testing whether the assumed true underlying processes differ.
Moreover, we construct invariant features of distributions, leading to learning
algorithms robust to the impairment of the input distributions with symmetric
additive noise.
Comment: 22 pages
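As background for the MMD-based tests discussed above, the following is a minimal sketch of the standard unbiased squared-MMD estimator with an RBF kernel (the bandwidth `gamma` is an arbitrary illustrative choice; the paper's noise-invariant distances build on quantities of this kind, not on this exact estimator):

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    # Pairwise RBF kernel k(x, y) = exp(-gamma * ||x - y||^2)
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * d2)

def mmd2_unbiased(X, Y, gamma=0.5):
    # Unbiased estimate of the squared Maximum Mean Discrepancy
    # between samples X (m, d) and Y (n, d).
    m, n = len(X), len(Y)
    Kxx = rbf_kernel(X, X, gamma)
    Kyy = rbf_kernel(Y, Y, gamma)
    Kxy = rbf_kernel(X, Y, gamma)
    term_x = (Kxx.sum() - np.trace(Kxx)) / (m * (m - 1))  # drop diagonal terms
    term_y = (Kyy.sum() - np.trace(Kyy)) / (n * (n - 1))
    return term_x + term_y - 2 * Kxy.mean()
```

For two samples from the same distribution the statistic fluctuates around zero, while a mean shift between the samples drives it clearly positive, which is what a two-sample test thresholds on.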
Countering Personalized Speech
Social media platforms use personalization algorithms to make content curation decisions for each end user. These personalized recommendation decisions are essentially speech conveying a platform's predictions on content relevance for each end user. Yet, they are causing some of the worst problems on the internet. First, they facilitate the precipitous spread of mis- and disinformation by exploiting the very same biases and insecurities that drive end user engagement with such content. Second, they exacerbate social media addiction and related mental health harms by leveraging users' affective needs to drive engagement to greater and greater heights. Lastly, they erode end user privacy and autonomy as both sources and incentives for data collection.
As with any harmful speech, the solution is often counterspeech. Free speech jurisprudence considers counterspeech the most speech-protective weapon to combat false or harmful speech. Thus, to combat problematic recommendation decisions, social media platforms, policymakers, and other stakeholders should embolden end users to use counterspeech to reduce the harmful effects of platform personalization.
One way to implement this solution is through end user personalization inputs. These inputs reflect end user expression about a platform's recommendation decisions. However, industry-standard personalization inputs are failing to provide effective countermeasures against problematic recommendation decisions. On most, if not all, major social media platforms, the existing inputs confer limited ex post control over the platform's recommendation decisions. In order for end user personalization to achieve the promise of counterspeech, I make several proposals along key regulatory modalities, including revising the architecture of personalization inputs to confer robust ex ante capabilities that filter by content type and characteristics.
Structure-based design of pantothenate kinase inhibitors as lead structures for new antibiotics
Hyperparameter Learning via Distributional Transfer
Bayesian optimisation is a popular technique for hyperparameter learning but
typically requires initial exploration even in cases where similar prior tasks
have been solved. We propose to transfer information across tasks using learnt
representations of training datasets used in those tasks. This results in a
joint Gaussian process model on hyperparameters and data representations.
Representations make use of the framework of distribution embeddings into
reproducing kernel Hilbert spaces. The developed method converges faster than
existing baselines, in some cases requiring only a few evaluations of the
target objective.
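The core idea above — a joint Gaussian process over hyperparameters and dataset representations — can be sketched in a few lines. This is a toy illustration only: simple summary statistics stand in for the RKHS mean embeddings used in the paper, and the squared-exponential GP with fixed lengthscale is an arbitrary modelling choice:

```python
import numpy as np

def embed(dataset):
    # Crude dataset representation (stand-in for a kernel mean embedding):
    # per-dimension mean and standard deviation, concatenated.
    return np.concatenate([dataset.mean(axis=0), dataset.std(axis=0)])

def rbf(A, B, ell=1.0):
    # Squared-exponential covariance between rows of A and B.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-0.5 * d2 / ell**2)

def gp_posterior_mean(Z_train, y_train, Z_test, ell=1.0, noise=1e-4):
    # GP regression posterior mean on joint (embedding, hyperparameter) inputs,
    # so observed objective values from previous tasks inform new ones.
    K = rbf(Z_train, Z_train, ell) + noise * np.eye(len(Z_train))
    return rbf(Z_test, Z_train, ell) @ np.linalg.solve(K, y_train)
```

A joint input for one evaluation would be `z = np.concatenate([embed(D), [lam]])`, pairing a dataset `D` with a candidate hyperparameter `lam`; stacking such rows across tasks lets the GP transfer information between tasks with similar data representations.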
PRAS40 suppresses atherogenesis through inhibition of mTORC1-dependent pro-inflammatory signaling in endothelial cells
Endothelial pro-inflammatory activation plays a pivotal role in atherosclerosis, and many pro-inflammatory and atherogenic signals converge upon mechanistic target of rapamycin (mTOR). Inhibitors of mTOR complex 1 (mTORC1) reduced atherosclerosis in preclinical studies, but side effects including insulin resistance and dyslipidemia limit their clinical use in this context. Therefore, we investigated PRAS40, a cell type-specific endogenous modulator of mTORC1, as an alternative target. Indeed, we previously found PRAS40 gene therapy to improve the metabolic profile; however, its function in endothelial cells and its role in atherosclerosis remain unknown. Here we show that PRAS40 negatively regulates endothelial mTORC1 and pro-inflammatory signaling. Knockdown of PRAS40 in endothelial cells promoted TNFα-induced mTORC1 signaling, proliferation, upregulation of inflammatory markers and monocyte recruitment. In contrast, PRAS40-overexpression blocked mTORC1 and all measures of pro-inflammatory signaling. These effects were mimicked by pharmacological mTORC1-inhibition with torin1. In an in vivo model of atherogenic remodeling, mice with induced endothelium-specific PRAS40 deficiency showed enhanced endothelial pro-inflammatory activation as well as increased neointimal hyperplasia and atherosclerotic lesion formation. These data indicate that PRAS40 suppresses atherosclerosis via inhibition of endothelial mTORC1-mediated pro-inflammatory signaling. In conjunction with its favourable effects on metabolic homeostasis, this renders PRAS40 a potential target for the treatment of atherosclerosis.
Integrating Quality of Life and Survival Outcomes in Cardiovascular Clinical Trials.
Background Survival and health status (eg, symptoms and quality of life) are key outcomes in clinical trials of heart failure treatment. However, health status can only be recorded on survivors, potentially biasing treatment effect estimates when there is differential survival across treatment groups. Joint modeling of survival and health status can address this bias. Methods and Results We analyzed patient-level data from the PARTNER 1B trial (Placement of Aortic Transcatheter Valves) of transcatheter aortic valve replacement versus standard care. Health status was quantified with the Kansas City Cardiomyopathy Questionnaire (KCCQ) at randomization, 1, 6, and 12 months. We compared hazard ratios for survival and mean differences in KCCQ scores at 12 months using several models: the original growth curve model for KCCQ scores (ignoring death), separate Bayesian models for survival and KCCQ scores, and a Bayesian joint longitudinal-survival model fit to either 12 or 30 months of survival follow-up. The benefit of transcatheter aortic valve replacement on 12-month KCCQ scores was greatest in the joint-model fit to all survival data (mean difference, 33.7 points; 95% credible intervals [CrI], 24.2-42.4), followed by the joint-model fit to 12 months of survival follow-up (32.3 points; 95% CrI, 22.5-41.5), a Bayesian model without integrating death (30.4 points; 95% CrI, 21.4-39.3), and the original growth curve model (26.0 points; 95% CI, 18.7-33.3). At 12 months, the survival benefit of transcatheter aortic valve replacement was also greater in the joint model (hazard ratio, 0.50; 95% CrI, 0.32-0.73) than in the nonjoint Bayesian model (0.54; 95% CrI, 0.37-0.75) or the original Kaplan-Meier estimate (0.55; 95% CI, 0.40-0.74). 
Conclusions In patients with severe symptomatic aortic stenosis and prohibitive surgical risk, the estimated benefits of transcatheter aortic valve replacement on survival and health status compared with standard care were greater in joint Bayesian models than with other approaches.
Examining reliability of seasonal to decadal sea surface temperature forecasts: the role of ensemble dispersion
Useful probabilistic climate forecasts on decadal timescales should be reliable (i.e. forecast probabilities match the observed relative frequencies), but this is seldom examined. This paper assesses a necessary condition for reliability, namely that the ratio of ensemble spread to forecast error be close to one, for seasonal to decadal sea surface temperature retrospective forecasts from the Met Office Decadal Prediction System (DePreSys). Factors which may affect reliability are diagnosed by comparing this spread-error ratio for an initial condition ensemble and two perturbed physics ensembles for initialized and uninitialized predictions. At lead times less than 2 years, the initialized ensembles tend to be under-dispersed, and hence produce overconfident and thus unreliable forecasts. For longer lead times, all three ensembles are predominantly over-dispersed. Such over-dispersion is primarily related to excessive inter-annual variability in the climate model. These findings highlight the need to carefully evaluate simulated variability in seasonal and decadal prediction systems.
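The spread-error diagnostic described above is simple to compute. A minimal sketch follows (the array shapes and the use of the ensemble-mean RMSE as "forecast error" are illustrative assumptions, not DePreSys conventions):

```python
import numpy as np

def spread_error_ratio(ensemble, obs):
    # ensemble: (n_members, n_forecasts) forecast values
    # obs: (n_forecasts,) verifying observations
    mean = ensemble.mean(axis=0)
    spread = ensemble.std(axis=0, ddof=1).mean()      # mean ensemble spread
    rmse = np.sqrt(np.mean((mean - obs) ** 2))        # error of ensemble mean
    # Ratio < 1: under-dispersed (overconfident); > 1: over-dispersed.
    return spread / rmse
```

A ratio well below one flags the overconfident short-lead behaviour reported above, while a ratio well above one flags the over-dispersion seen at longer leads.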
The AMIGA sample of isolated galaxies. XI. Optical characterisation of nuclear activity
Context.- This paper is part of a series involving the AMIGA project
(Analysis of the Interstellar Medium of Isolated GAlaxies), which identifies
and studies a statistically-significant sample of the most isolated galaxies in
the northern sky. Aims.- We present a catalogue of nuclear activity, traced by
optical emission lines, in a well-defined sample of the most isolated galaxies
in the local Universe, which will be used as a basis for studying the effect of
the environment on nuclear activity. Methods.- We obtained spectral data from
the 6th Data Release of the Sloan Digital Sky Survey, which were inspected in a
semi-automatic way. We subtracted the underlying stellar populations from the
spectra (using the software Starlight) and modelled the nuclear emission
features. Standard emission-line diagnostics diagrams were applied, using a new
classification scheme that takes into account censored data, to classify the
type of nuclear emission. Results.- We provide a final catalogue of
spectroscopic data, stellar populations, emission lines and classification of
optical nuclear activity for AMIGA galaxies. The prevalence of optical active
galactic nuclei (AGN) in AMIGA galaxies is 20.4%, or 36.7% including transition
objects. The fraction of AGN increases steeply towards earlier morphological
types and higher luminosities. We compare these results with a matched analysis
of galaxies in isolated denser environments (Hickson Compact Groups). After
correcting for the effects of the morphology and luminosity, we find that there
is no evidence for a difference in the prevalence of AGN between isolated and
compact group galaxies, and we discuss the implications of this result.
Conclusions.- We find that a major interaction is not a necessary condition for
the triggering of optical AGN.
Comment: 16 pages, 11 figures, 12 tables, published in Astronomy and
Astrophysics. Figure 5 corrected: [OI] diagram added