Parametric instability and wave turbulence driven by tidal excitation of internal waves
We investigate the stability of stratified fluid layers undergoing
homogeneous and periodic tidal deformation. We first introduce a local model
that allows us to study velocity and buoyancy fluctuations in a Lagrangian domain
periodically stretched and sheared by the tidal base flow. By retaining only the
key physical ingredients, this model efficiently simulates planetary
regimes where tidal amplitudes and dissipation are small. With this model, we
prove that tidal flows are able to drive parametric subharmonic resonances of
internal waves, in a way reminiscent of the elliptical instability in rotating
fluids. The growth rates computed via Direct Numerical Simulations (DNS) are in
very good agreement with WKB analysis and Floquet theory. We also investigate
the turbulence driven by this instability mechanism. With spatio-temporal
analysis, we show that it is a weak internal wave turbulence occurring at small
Froude and buoyancy Reynolds numbers. When the gap between the excitation and
the Brunt-Väisälä frequencies is increased, the frequency spectrum of
this wave turbulence displays a -2 power law reminiscent of the high-frequency
branch of the Garrett and Munk spectrum (Garrett & Munk 1979), which has been
measured in the oceans. In addition, we find that the mixing efficiency is
altered compared with that computed in DNS of stratified turbulence excited at
small Froude and large buoyancy Reynolds numbers, and is consistent with a
superposition of waves.

Comment: Accepted for publication in Journal of Fluid Mechanics, 27 pages, 21 figures.
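The parametric subharmonic resonance mechanism can be illustrated in a much simpler setting than the stratified model above: the Mathieu equation, whose Floquet multipliers over one forcing period decide stability. The sketch below is a generic illustration, not the paper's Lagrangian model; the parameters ω0, ε, and ω are arbitrary choices.

```python
import numpy as np

def floquet_growth_rate(omega0=1.0, eps=0.2, omega=2.0, steps=4000):
    """Growth rate of x'' + omega0^2 * (1 + eps*cos(omega*t)) * x = 0,
    obtained from the monodromy matrix over one forcing period."""
    T = 2 * np.pi / omega
    dt = T / steps

    def rhs(t, y):
        x, v = y
        return np.array([v, -omega0**2 * (1 + eps * np.cos(omega * t)) * x])

    # Integrate the two unit initial conditions with classical RK4
    # to build the monodromy matrix column by column.
    M = np.zeros((2, 2))
    for j, y0 in enumerate([np.array([1.0, 0.0]), np.array([0.0, 1.0])]):
        y, t = y0.copy(), 0.0
        for _ in range(steps):
            k1 = rhs(t, y)
            k2 = rhs(t + dt / 2, y + dt / 2 * k1)
            k3 = rhs(t + dt / 2, y + dt / 2 * k2)
            k4 = rhs(t + dt, y + dt * k3)
            y += dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
            t += dt
        M[:, j] = y

    mu = np.abs(np.linalg.eigvals(M)).max()  # largest Floquet multiplier
    return np.log(mu) / T                    # exponential growth rate

# Subharmonic resonance: forcing near twice the natural frequency destabilizes.
print(floquet_growth_rate(omega=2.0))  # positive -> instability
print(floquet_growth_rate(omega=3.1))  # ~0 -> stable
```

At exact resonance the computed rate matches the classical estimate σ ≈ εω0/4 for the first instability tongue.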
A Quasi-Bayesian Perspective to Online Clustering
When faced with high-frequency streams of data, clustering poses theoretical
and algorithmic challenges. We introduce a new and adaptive online clustering
algorithm relying on a quasi-Bayesian approach, with a dynamic (i.e.,
time-dependent) estimation of the (unknown and changing) number of clusters. We
prove that our approach is supported by minimax regret bounds. We also provide
an RJMCMC-flavored implementation (called PACBO, see
https://cran.r-project.org/web/packages/PACBO/index.html) for which we give a
convergence guarantee. Finally, numerical experiments illustrate the potential
of our procedure.
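The online setting can be made concrete with a deliberately simple baseline: assign each arriving point to its nearest center if it is close enough, otherwise open a new cluster, so the number of clusters adapts to the stream. This is an illustration of online clustering with a time-dependent number of clusters only, not the quasi-Bayesian PACBO algorithm; the `radius` threshold is an assumption.

```python
import numpy as np

def online_cluster(stream, radius=1.0):
    """Toy sequential clustering with a data-driven number of clusters:
    each arriving point joins the nearest center if within `radius`,
    otherwise it opens a new cluster. Illustration only -- not PACBO."""
    centers, counts = [], []
    for x in stream:
        x = np.asarray(x, dtype=float)
        if centers:
            dists = [np.linalg.norm(x - c) for c in centers]
            j = int(np.argmin(dists))
            if dists[j] <= radius:
                counts[j] += 1
                centers[j] += (x - centers[j]) / counts[j]  # running mean
                continue
        centers.append(x.copy())
        counts.append(1)
    return centers

rng = np.random.default_rng(0)
stream = np.concatenate([rng.normal(0, 0.1, (50, 2)),
                         rng.normal(5, 0.1, (50, 2))])
print(len(online_cluster(stream)))  # two well-separated blobs -> 2 centers
```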
The linear instability of the stratified plane Couette flow
We present the stability analysis of a plane Couette flow which is stably
stratified in the vertical direction, orthogonal to the horizontal shear.
Interest in such a flow comes from geophysical and astrophysical applications
where background shear and vertical stable stratification commonly coexist. We
perform the linear stability analysis of the flow in a domain which is periodic
in the streamwise and vertical directions and confined in the cross-stream
direction. The stability diagram is constructed as a function of the Reynolds
number Re and the Froude number Fr, which compares the importance of shear and
stratification. We find that the flow becomes unstable when shear and
stratification are of the same order (i.e. Fr ∼ 1) and above a moderate
value of the Reynolds number (Re ≈ 700). The instability results from a
resonance mechanism already known in the context of channel flows, for instance
the unstratified plane Couette flow in the shallow water approximation. The
result is confirmed by fully nonlinear direct numerical simulations and, to the
best of our knowledge, constitutes the first evidence of linear instability in
a vertically stratified plane Couette flow. We also report the study of a
laboratory flow generated by a transparent belt entrained by two vertical
cylinders and immersed in a tank filled with salty water linearly stratified in
density. We observe the emergence of a robust spatio-temporal pattern close to
the threshold values of Fr and Re indicated by the linear analysis, and explore
the accessible part of the stability diagram. With the support of numerical
simulations we conclude that the observed pattern is a signature of the same
instability predicted by the linear theory, although slightly modified due to
streamwise confinement.
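The two control parameters of the stability diagram can be made concrete: with a shear rate S = U/h and Brunt-Väisälä frequency N, one common convention is Re = Uh/ν and Fr = S/N. The values below are hypothetical laboratory numbers chosen for illustration, not those of the experiment described above.

```python
# Hypothetical laboratory values (illustrative only, not the experiment's).
U = 0.05      # belt half-speed [m/s]
h = 0.035     # gap half-width [m]
nu = 1.0e-6   # kinematic viscosity of water [m^2/s]
N = 1.5       # Brunt-Vaisala frequency [rad/s]

S = U / h        # shear rate [1/s]
Re = U * h / nu  # Reynolds number: inertia vs viscosity
Fr = S / N       # Froude number: shear vs stratification

print(f"Re = {Re:.0f}, Fr = {Fr:.2f}")
```

With these numbers the flow sits near Fr ∼ 1 at Re well above 700, i.e. inside the unstable region of the diagram.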
Scaling Deep Learning on GPU and Knights Landing clusters
The speed of deep neural networks training has become a big bottleneck of
deep learning research and development. For example, training GoogLeNet on the
ImageNet dataset with a single Nvidia K20 GPU takes 21 days. To speed up the
training process, current deep learning systems rely heavily on hardware
accelerators. However, these accelerators have limited on-chip memory compared
with CPUs. To handle large datasets, they need to fetch data from either CPU
memory or remote processors. We use both self-hosted Intel Knights Landing
(KNL) clusters and multi-GPU clusters as our target platforms. From an
algorithm aspect, current distributed machine learning systems are mainly
designed for cloud systems. These methods are asynchronous because of the slow
network and high fault-tolerance requirement on cloud systems. We focus on
Elastic Averaging SGD (EASGD) to design algorithms for HPC clusters. The
original EASGD uses a round-robin method for communication and updating, ordered
by machine rank ID, which is inefficient on HPC clusters.
First, we redesign four efficient algorithms for HPC systems to improve
EASGD's poor scaling on clusters. Async EASGD, Async MEASGD, and Hogwild EASGD
are faster than their existing counterparts (Async SGD,
Async MSGD, and Hogwild SGD, resp.) in all the comparisons. Finally, we design
Sync EASGD, which ties for the best performance among all the methods while
being deterministic. In addition to the algorithmic improvements, we use some
system-algorithm codesign techniques to scale up the algorithms. By reducing
the percentage of communication from 87% to 14%, our Sync EASGD achieves 5.3x
speedup over original EASGD on the same platform. We get 91.5% weak scaling
efficiency on 4253 KNL cores, which is higher than the state-of-the-art
implementation.
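The elastic coupling at the heart of EASGD (Zhang et al. 2015) can be sketched in one dimension: each worker follows its own gradient plus a pull ρ(x_i − x̃) toward a center variable, which in turn drifts toward the workers. Below is a toy synchronous version on a quadratic loss; the step sizes, coupling strength, and worker count are illustrative choices, not the paper's HPC implementation.

```python
import numpy as np

def sync_easgd(grad, x0, workers=4, eta=0.1, rho=0.5, steps=200):
    """One-dimensional sketch of synchronous Elastic Averaging SGD:
    each worker takes a gradient step plus an elastic pull toward the
    center variable, which simultaneously moves toward the workers."""
    rng = np.random.default_rng(1)
    xs = x0 + rng.normal(0, 1.0, workers)  # perturbed worker copies
    center = float(x0)
    for _ in range(steps):
        # Worker update: gradient descent + elastic term rho*(x_i - center).
        new_xs = xs - eta * (grad(xs) + rho * (xs - center))
        # Center update: move toward the (old) worker positions.
        center = center + eta * rho * np.sum(xs - center)
        xs = new_xs
    return center

# Toy objective f(x) = (x - 3)^2 / 2, so grad(x) = x - 3.
print(sync_easgd(lambda x: x - 3.0, x0=0.0))  # converges near 3
```

The same update rule parallelizes naturally: in a synchronous HPC setting the center update becomes a single allreduce over the worker deviations.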
Research methods & statistics for psychology: OER course packet
This packet was developed to teach psychological research methods and statistics as a no-cost, open-access course. Secondary goals are to teach computational reproducibility and to create teaching materials that can be shared among colleagues teaching similar courses at Haverford College and around the globe. The packet includes links to video lectures, video tutorials, open-access textbooks, statistical analysis activities, and a data-sharing repository. It was developed during the 2020/21 academic year, while the course was being taught remotely due to COVID-19, and the syllabus and course materials were designed with asynchronous versus synchronous learning in mind.
Use is governed by a CC-BY-NC license.
MAP7 regulates axon morphogenesis by recruiting kinesin-1 to microtubules and modulating organelle transport.
Neuronal cell morphogenesis depends on proper regulation of microtubule-based transport, but the underlying mechanisms are not well understood. Here, we report our study of MAP7, a unique microtubule-associated protein that interacts with both microtubules and the motor protein kinesin-1. Structure-function analysis in rat embryonic sensory neurons shows that the kinesin-1-interacting domain of MAP7 is required for axon and branch growth but not for branch formation. We also identify two unique microtubule-binding sites in MAP7 with distinct dissociation kinetics, both of which are required for branch formation. Furthermore, MAP7 recruits kinesin-1 dynamically to microtubules, leading to alterations in organelle transport behaviors, particularly pause/speed switching. As MAP7 is localized to branch sites, our results suggest a novel mechanism mediated by the dual interactions of MAP7 with microtubules and kinesin-1 in the precise control of microtubule-based transport during axon morphogenesis.
Order Out of Chaos: Slowly Reversing Mean Flows Emerge from Turbulently Generated Internal Waves
We demonstrate via direct numerical simulations that a periodic, oscillating
mean flow spontaneously develops from turbulently generated internal waves. We
consider a minimal physical model where the fluid self-organizes in a
convective layer adjacent to a stably stratified one. Internal waves are
excited by turbulent convective motions, then nonlinearly interact to produce a
mean flow reversing on timescales much longer than the waves' period. Our
results demonstrate for the first time that the three-scale dynamics due to
convection, waves, and mean flow is generic and hence can occur in many
astrophysical and geophysical fluids. We discuss efforts to reproduce the mean
flow in reduced models, where the turbulence is bypassed. We demonstrate that
wave intermittency, resulting from the chaotic nature of convection, plays a
key role in the mean-flow dynamics, which thus cannot be captured using only
second-order statistics of the turbulent motions.
Preferred sizes and ordering in surface nanobubble populations
Two types of homogeneous surface nanobubble populations, created by different
means, are analyzed statistically on both their sizes and spatial positions. In
the first type (created by droplet-deposition, case A) the bubble size R is
found to be distributed according to a generalized gamma law with a preferred
radius R*=20 nm. The radial distribution function shows a preferred spacing at
~5.5 R*. These characteristics do not show up in comparable Monte-Carlo
simulations of random packings of hard disks with the same size distribution
and the same density, suggesting a structuring effect in the nanobubble
formation process. The nanobubble size distribution of the second population
type (created by ethanol-water exchange, case B) is a mixture of two clearly
separated distributions, hence, with two preferred radii. The local ordering is
less significant, due to the looser packing of the nanobubbles.

Comment: 5 pages, 5 figures.
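The radial distribution function used to detect the preferred spacing can be estimated from a pair-distance histogram normalized by the count expected for an uncorrelated (ideal-gas) pattern at the same density, so that g(r) ≈ 1 signals no structure and a peak signals a preferred spacing. Below is a naive 2D sketch under assumed conditions (periodic box, uniform random points), not the paper's analysis pipeline.

```python
import numpy as np

def radial_distribution(points, box, dr=0.5, r_max=10.0):
    """Naive radial distribution function g(r) for 2D points in a periodic
    box: histogram of unique pair distances, normalized by the pair count
    expected for an uncorrelated pattern at the same number density."""
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    rho = n / box**2
    diff = pts[:, None, :] - pts[None, :, :]
    diff -= box * np.round(diff / box)  # minimum-image convention
    r = np.sqrt((diff**2).sum(-1))[np.triu_indices(n, k=1)]
    edges = np.arange(0.0, r_max + dr, dr)
    hist, _ = np.histogram(r, bins=edges)
    shell = np.pi * (edges[1:]**2 - edges[:-1]**2)  # annulus areas
    ideal = 0.5 * n * rho * shell  # expected unique-pair counts
    return edges[:-1] + dr / 2, hist / ideal

rng = np.random.default_rng(2)
r_mid, g = radial_distribution(rng.uniform(0, 100, (1000, 2)), box=100.0)
print(g.mean())  # ~1 for a spatially random (Poisson) pattern
```

A structured population like case A would instead show g(r) suppressed at short range and peaked near the preferred spacing (~5.5 R* above).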