4,643 research outputs found
Deletion of myeloid-PTP1B decreases MHC Class I expression and peptide presentation through an IL-10 dependent mechanism in response to LPS challenge
Peer reviewed. Publisher PDF.
Protein Tyrosine Phosphatase 1B (PTP1B) in the immune system
Journal not available online when checked 02/04/19. DOI: 10.14800/ics.965. Peer reviewed. Publisher PDF.
Léon Marillier and the veridical hallucination in late-nineteenth- and early-twentieth-century French psychology and psychopathology.
Recent research on the professionalization of psychology at the end of the nineteenth century shows how objects of knowledge which appear illegitimate to us today shaped the institutionalization of disciplines. The veridical or telepathic hallucination was one of these objects, constituting a field both of division and exchange between nascent psychology and disciplines known as 'psychic sciences' in France, and 'psychical research' in the Anglo-American context. In France, Léon Marillier (1862-1901) was the main protagonist in discussions concerning the concept of the veridical hallucination, which gave rise to criticisms by mental specialists and psychopathologists. After all, not only were these hallucinations supposed to occur in healthy subjects, but they also failed to correspond to the Esquirolian definition of hallucinations through being corroborated by their representation of external, objective events. Andreas Sommer's contribution to this article was made possible through support by the Perrott-Warrick Fund, Trinity College, University of Cambridge, and Cedar Creek Institute, Charlottesville, VA. This is the author accepted manuscript. The final version is available from SAGE via http://dx.doi.org/10.1177/0957154X1456275
Solving the stationary Liouville equation via a boundary element method
Intensity distributions of linear wave fields are, in the high frequency
limit, often approximated in terms of flow or transport equations in phase
space. Common techniques for solving the flow equations for both time dependent
and stationary problems are ray tracing or level set methods. In the context of
predicting the vibro-acoustic response of complex engineering structures,
reduced ray tracing methods such as Statistical Energy Analysis or variants
thereof have found widespread applications. Starting directly from the
stationary Liouville equation, we develop a boundary element method for solving
the transport equations for complex multi-component structures. The method,
which is an improved version of the Dynamical Energy Analysis technique
introduced recently by the authors, interpolates between standard statistical
energy analysis and full ray tracing, containing both of these methods as
limiting cases. We demonstrate that the method can be used to efficiently deal
with complex large scale problems giving good approximations of the energy
distribution when compared to exact solutions of the underlying wave equation.
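For reference, the stationary Liouville equation named in the abstract transports a phase-space density ρ(x, p) along the ray (Hamiltonian) flow. In the standard form (not copied from the paper), with Hamiltonian H(x, p), the stationarity condition is the vanishing of the Poisson bracket:

```latex
\{\rho, H\} \;=\; \nabla_{p} H \cdot \nabla_{x} \rho \;-\; \nabla_{x} H \cdot \nabla_{p} \rho \;=\; 0 .
```

For a homogeneous medium with H(x, p) = c|p|, this reduces to straight-line ray transport, which is the limit that full ray tracing resolves and that statistical energy analysis averages over.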
Scale-aware neural calibration for wide swath altimetry observations
Sea surface height (SSH) is a key geophysical parameter for monitoring and
studying meso-scale surface ocean dynamics. For several decades, the mapping of
SSH products at regional and global scales has relied on nadir satellite
altimeters, which provide one-dimensional-only along-track satellite
observations of the SSH. The Surface Water and Ocean Topography (SWOT) mission
deploys a new sensor that acquires for the first time wide-swath
two-dimensional observations of the SSH. This provides new means to observe the
ocean at previously unresolved spatial scales. A critical challenge for the
exploitation of SWOT data is the separation of the SSH from other signals present
in the observations. In this paper, we propose a novel learning-based approach
for this SWOT calibration problem. It benefits from calibrated nadir altimetry
products and a scale-space decomposition adapted to SWOT swath geometry and the
structure of the different processes in play. In a supervised setting, our
method reaches the state-of-the-art residual error of ~1.4cm while proposing a
correction over the entire spectral range from 10km to 1000km.
Comment: 8 pages, 7 figures, Preprint
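The scale-space decomposition mentioned above is not spelled out in the abstract; as a minimal illustration of the general idea, a swath-shaped field can be split into large-, intermediate-, and small-scale components with Gaussian filters, such that the components sum back to the original. This is a generic difference-of-Gaussians sketch, not the paper's calibration scheme:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def scale_band(field, sigma_small, sigma_large):
    """Band-pass component between two Gaussian scales (difference of Gaussians)."""
    return gaussian_filter(field, sigma_small) - gaussian_filter(field, sigma_large)

rng = np.random.default_rng(0)
swath = rng.standard_normal((512, 64))     # synthetic along-track x cross-track field

large = gaussian_filter(swath, 16)         # large-scale component
band = scale_band(swath, 2, 16)            # intermediate scales
small = swath - gaussian_filter(swath, 2)  # small-scale residual

recon = large + band + small               # decomposition is exactly invertible
print(np.allclose(recon, swath, atol=1e-6))
```

A calibration model can then act on each band separately, which is one way to respect the different spatial scales of the SSH signal and the instrument errors.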
Training neural mapping schemes for satellite altimetry with simulation data
Satellite altimetry combined with data assimilation and optimal interpolation
schemes has deeply renewed our ability to monitor sea surface dynamics.
Recently, deep learning (DL) schemes have emerged as appealing solutions to
address space-time interpolation problems. The scarcity of real altimetry
datasets, in terms of space-time coverage of the sea surface, however, impedes
the training of state-of-the-art neural schemes on real-world case-studies.
Here, we leverage both simulations of ocean dynamics and satellite altimeters
to train simulation-based neural mapping schemes for the sea surface height and
demonstrate their performance for real altimetry datasets. We analyze further
how the ocean simulation dataset used during the training phase impacts this
performance. This experimental analysis covers the resolution, from
eddy-present configurations to eddy-rich ones, forced simulations vs.
reanalyses using data assimilation, and tide-free vs. tide-resolving
simulations. Our benchmarking framework focuses on a Gulf Stream region for a
realistic 5-altimeter constellation using NEMO ocean simulations and 4DVarNet
mapping schemes. All simulation-based 4DVarNets outperform the operational
observation-driven and reanalysis products, namely DUACS and GLORYS. The more
realistic the ocean simulation dataset used during the training phase, the
better the mapping. The best 4DVarNet mapping was trained from an eddy-rich and
tide-free simulation dataset. It improves the resolved longitudinal scale from
151 kilometers for DUACS and 241 kilometers for GLORYS to 98 kilometers and
reduces the root mean squared error (RMSE) by 23% and 61%, respectively. These results open
research avenues for new synergies between ocean modelling and ocean
observation using learning-based approaches.
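The "resolved scale" quoted above (98 km vs 151 km and 241 km) is commonly computed from the spectral error-to-signal ratio: the smallest wavelength at which the mapping error's power spectral density stays below half the true signal's. That definition is an assumption here, not taken from the abstract; a minimal sketch on synthetic spectra:

```python
import numpy as np

def resolved_scale(wavelength_km, psd_err, psd_true):
    """Smallest wavelength at which the PSD score 1 - PSD(err)/PSD(true)
    reaches 0.5 -- a common effective-resolution metric for SSH maps
    (assumed definition, not quoted from the paper)."""
    score = 1.0 - psd_err / psd_true       # 1 = perfect, 0 = error as large as signal
    ok = wavelength_km[score >= 0.5]
    return float(ok.min()) if ok.size else float("inf")

# synthetic spectra for illustration: error dominates at small scales
wl = np.linspace(10, 1000, 100)            # wavelengths in km
psd_true = wl ** 2
psd_err = psd_true * np.clip(100.0 / wl, 0.0, 1.0)

result = resolved_scale(wl, psd_err, psd_true)
print(result)                              # 200.0 for this synthetic case
```

With real maps, the spectra would be estimated from along-track differences between the gridded product and independent nadir altimeter data.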
A posteriori learning for quasi-geostrophic turbulence parametrization
The use of machine learning to build subgrid parametrizations for climate
models is receiving growing attention. State-of-the-art strategies address the
problem as a supervised learning task and optimize algorithms that predict
subgrid fluxes based on information from coarse resolution models. In practice,
training data are generated from higher resolution numerical simulations
transformed in order to mimic coarse resolution simulations. By essence, these
strategies optimize subgrid parametrizations to meet so-called a priori
criteria. But the actual purpose of a subgrid parametrization is to
obtain good performance in terms of a posteriori metrics, which imply
computing entire model trajectories. In this paper, we focus on the
representation of energy backscatter in two dimensional quasi-geostrophic
turbulence and compare parametrizations obtained with different learning
strategies at fixed computational complexity. We show that strategies based on
a priori criteria yield parametrizations that tend to be unstable in
direct simulations and describe how subgrid parametrizations can alternatively
be trained end-to-end in order to meet a posteriori criteria. We
illustrate that end-to-end learning strategies yield parametrizations that
outperform known empirical and data-driven schemes in terms of performance,
stability and ability to apply to different flow configurations. These results
support the relevance of differentiable programming paradigms for climate
models in the future.
Comment: 36 pages, 14 figures, submitted to Journal of Advances in Modeling Earth Systems (JAMES)
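The distinction the abstract draws, fitting subgrid fluxes offline versus training end-to-end through whole model trajectories, can be sketched with a toy differentiable model. The following is purely illustrative (a scalar "subgrid" coefficient in a 1D decay model, nothing like the paper's quasi-geostrophic setup); it shows an a posteriori loss defined on rolled-out trajectories and optimized by differentiating through the solver:

```python
import jax
import jax.numpy as jnp

# Toy coarse model: linear decay plus an unknown extra term that the
# "parametrization" theta must supply (hypothetical setup for illustration).
def step(u, theta, dt=0.1):
    return u + dt * (-0.5 * u + theta * u)

def rollout(u0, theta, n=20):
    def body(u, _):
        u = step(u, theta)
        return u, u
    _, traj = jax.lax.scan(body, u0, None, length=n)
    return traj

def a_posteriori_loss(theta, u0, ref_traj):
    """End-to-end loss on entire model trajectories (an 'a posteriori' criterion)."""
    return jnp.mean((rollout(u0, theta) - ref_traj) ** 2)

u0 = jnp.ones(8)
ref = rollout(u0, -0.2)                 # pretend the "truth" uses theta = -0.2
grad = jax.jit(jax.grad(a_posteriori_loss))

theta = 0.0
for _ in range(200):                    # plain gradient descent through the solver
    theta -= 0.5 * grad(theta, u0, ref)
print(float(theta))                     # converges close to -0.2
```

Because the loss is computed on simulated trajectories rather than on instantaneous subgrid fluxes, instabilities that only appear during time integration are penalized directly, which is the stability argument made in the abstract.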
Can we map the interannual variability of the whole upper Southern Ocean with the current database of hydrographic observations?
With the advent of Argo floats, it now seems feasible to study the interannual variations of upper ocean hydrographic properties of the historically undersampled Southern Ocean. To do so, scattered hydrographic profiles often first need to be mapped. To investigate biases and errors associated both with the limited space-time distribution of the profiles and with the mapping methods, we colocate the mixed-layer depth (MLD) output from a state-of-the-art 1/12° DRAKKAR simulation onto the latitude, longitude, and date of actual in situ profiles from 2005 to 2014. We compare the results obtained after remapping using a nearest neighbor (NN) interpolation and an objective analysis (OA) with different spatiotemporal grid resolutions and decorrelation scales. NN is improved with a coarser resolution. OA performs best with low decorrelation scales, avoiding overly strong smoothing, but returns values over larger areas with large decorrelation scales and low temporal resolution, as more points are available. For all resolutions, OA represents the annual extreme values better than NN. Both methods underestimate the seasonal cycle in MLD. MLD biases are lower than 10 m on average but can exceed 250 m locally in winter. We argue that current Argo data should not be mapped to infer decadal trends in MLD, as all methods are unable to reproduce existing trends without creating unrealistic extra ones. We also show that regions of the subtropical Atlantic, Indian, and Pacific Oceans, and the whole ice-covered Southern Ocean, still cannot be mapped even by the best method because of the lack of observational data.
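The two mapping methods compared in this abstract can be sketched in a few lines. This is a generic nearest-neighbor gridding and a minimal objective analysis (optimal interpolation) with an assumed Gaussian covariance; the decorrelation scale `L` and noise level are hypothetical parameters, not the study's values:

```python
import numpy as np
from scipy.spatial import cKDTree

def nearest_neighbor_map(obs_xy, obs_val, grid_xy):
    """Nearest-neighbour (NN) gridding of scattered profile values."""
    _, idx = cKDTree(obs_xy).query(grid_xy)
    return obs_val[idx]

def objective_analysis(obs_xy, obs_val, grid_xy, L=2.0, noise=0.1):
    """Minimal objective analysis (OA) with a Gaussian covariance of
    decorrelation scale L (illustrative parameters, not the study's)."""
    d_oo = np.linalg.norm(obs_xy[:, None] - obs_xy[None, :], axis=-1)
    d_go = np.linalg.norm(grid_xy[:, None] - obs_xy[None, :], axis=-1)
    C_oo = np.exp(-(d_oo / L) ** 2) + noise * np.eye(len(obs_xy))
    C_go = np.exp(-(d_go / L) ** 2)
    anomaly = obs_val - obs_val.mean()
    return obs_val.mean() + C_go @ np.linalg.solve(C_oo, anomaly)

rng = np.random.default_rng(1)
obs_xy = rng.uniform(0, 10, size=(50, 2))                  # scattered profiles
obs_val = np.sin(obs_xy[:, 0]) + 0.1 * rng.standard_normal(50)
grid_xy = np.stack(np.meshgrid(np.linspace(0, 10, 11),
                               np.linspace(0, 10, 11)), -1).reshape(-1, 2)

nn = nearest_neighbor_map(obs_xy, obs_val, grid_xy)
oa = objective_analysis(obs_xy, obs_val, grid_xy)
print(nn.shape, oa.shape)
```

The trade-off the abstract describes is visible in this form: NN copies the nearest observation regardless of distance, while OA smooths according to `L` and only returns meaningful values where the covariance with observations is non-negligible.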