
    Interior regularity criteria for suitable weak solutions of the Navier-Stokes equations

    We present new interior regularity criteria for suitable weak solutions of the 3-D Navier-Stokes equations: a suitable weak solution is regular near an interior point $z$ if either the scaled $L^{p,q}_{x,t}$-norm of the velocity with $3/p + 2/q \leq 2$, $1 \leq q \leq \infty$, or the $L^{p,q}_{x,t}$-norm of the vorticity with $3/p + 2/q \leq 3$, $1 \leq q < \infty$, or the $L^{p,q}_{x,t}$-norm of the gradient of the vorticity with $3/p + 2/q \leq 4$, $1 \leq q$, $1 \leq p$, is sufficiently small near $z$.
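
    For concreteness, scaled norms in criteria of this kind are conventionally defined over parabolic cylinders. The following is a plausible form under the standard convention (the paper's exact normalization should be checked against the full text):

        \[
          Q_r(z) = B_r(x) \times (t - r^2,\, t), \qquad z = (x, t),
        \]
        \[
          \|u\|_{L^{p,q}_{x,t}(Q_r)} = \Big\| \, \|u(\cdot, s)\|_{L^p_x(B_r(x))} \, \Big\|_{L^q_s(t - r^2,\, t)},
        \]
        \[
          \text{(velocity)} \quad r^{-(3/p + 2/q - 1)} \|u\|_{L^{p,q}_{x,t}(Q_r(z))}, \qquad
          \text{(vorticity)} \quad r^{-(3/p + 2/q - 2)} \|\omega\|_{L^{p,q}_{x,t}(Q_r(z))},
        \]

    with the exponent shifting to $-(3/p + 2/q - 3)$ for $\nabla\omega$. Each prefactor is chosen so the quantity is invariant under the Navier-Stokes scaling $u_\lambda(x, t) = \lambda u(\lambda x, \lambda^2 t)$, which is why the three conditions on $3/p + 2/q$ differ by exactly one.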

    Conventional Forces can Explain the Anomalous Acceleration of Pioneer 10

    Anderson et al. find that the measured trajectories of the Pioneer 10 and 11 spacecraft deviate from the trajectories computed from known forces acting on them. This unmodelled acceleration (and the less well known, but similar, unmodelled torque) can be accounted for by non-isotropic radiation of spacecraft heat. Various forms of non-isotropic radiation were proposed by Katz, Murphy, and Scheffer, but Anderson et al. felt that none of these could explain the observed effect. This paper calculates the known effects in more detail and considers new sources of radiation, all based on spacecraft construction. These effects are then modelled over the duration of the experiment. The model reproduces the acceleration from its appearance at a heliocentric distance of 5 AU to the last measurement at 71 AU to within 10 percent. However, it predicts a larger decrease in acceleration between intervals I and III of the Pioneer 10 observations than is observed. This is a 2 sigma discrepancy from the average of the three analyses (SIGMA, CHASMP, and Markwardt). A more complex (but more speculative) model provides a somewhat better fit. Radiation forces can also plausibly explain the previously unmodelled torques, including the spin-down of Pioneer 10 that is directly proportional to spacecraft bus heat, and the slow but constant spin-up of Pioneer 11. In any case, by accounting for the bulk of the acceleration, the proposed mechanism makes it much more likely that the entire effect can be explained without the need for new physics.
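
    The underlying physics is photon-recoil bookkeeping: power P radiated preferentially in one direction exerts a force F = P/c on the spacecraft. A minimal back-of-the-envelope sketch of the magnitudes involved follows; the mass, acceleration, and heat-budget figures are round published values used purely for illustration, not numbers taken from this paper:

        # Back-of-the-envelope check: how much anisotropically radiated heat
        # would be needed to produce the reported anomalous acceleration?
        C = 299_792_458.0  # speed of light, m/s

        def recoil_acceleration(directed_power_w: float, mass_kg: float) -> float:
            """Acceleration from power radiated in a single direction: a = P / (m c)."""
            return directed_power_w / (mass_kg * C)

        # Illustrative round numbers (assumptions, not values from the paper):
        mass = 241.0            # approximate Pioneer 10 mass, kg
        a_anomalous = 8.74e-10  # commonly quoted anomalous acceleration, m/s^2

        # Directed power required to account for the whole effect:
        p_needed = a_anomalous * mass * C
        print(f"directed power needed: {p_needed:.0f} W")  # ~63 W

        # Against a total heat budget of very roughly 2.5 kW (RTGs plus bus
        # electronics), only a few percent of the spacecraft's heat need be
        # radiated anisotropically to produce the entire effect.
        print(f"fraction of 2.5 kW budget: {p_needed / 2500:.1%}")  # ~2.5%

    This is why small construction details (antenna geometry, louver behavior, RTG fin asymmetry) are plausible candidates: the required anisotropy is a small fraction of the available heat.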

    Analysis Tools for Large Connectomes

    New reconstruction techniques are generating connectomes of unprecedented size. These must be analyzed to generate human-comprehensible results. The analyses being used fall into three general categories. The first is interactive tools used during reconstruction, to help guide the effort, look for possible errors, identify potential cell classes, and answer other preliminary questions. The second type of analysis is support for formal documents such as papers and theses. Scientific norms here require that the data be archived and accessible, and the analysis reproducible. In contrast to some other “omic” fields such as genomics, where a few specific analyses dominate usage, connectomics is rapidly evolving and the analyses used are often specific to the connectome being analyzed. These analyses are typically performed in a variety of conventional programming languages, such as Matlab, R, Python, or C++, and read the connectomic data either from a file or through database queries, neither of which is standardized. In the short term we see no alternative to the use of specific analyses, so the best that can be done is to publish the analysis code, and the interface by which it reads connectomic data. A similar situation exists for archiving connectome data. Each group independently makes its data available, but there is no standardized format, and long-term accessibility is neither enforced nor funded. In the long term, as connectomics becomes more common, a natural evolution would be a central facility for storing and querying connectomic data, playing a role similar to that of the National Center for Biotechnology Information for genomes. The final form of analysis is the import of connectome data into downstream tools such as neural simulation or machine learning. In this process, there are two main problems that need to be addressed. First, the reconstructed circuits contain huge amounts of detail, which must be intelligently reduced to a form the downstream tools can use. Second, much of the data needed for these downstream operations must be obtained by other methods (such as genetic or optical) and must be merged with the extracted connectome.
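
    As an illustration of the reduction step described above, the sketch below collapses a synapse-level connectome into a cell-type-level connection matrix of the kind a simulator could consume. The data layout and field names are hypothetical, chosen only to make the example self-contained; real data would arrive in a group-specific file format or via database queries, as the abstract notes:

        from collections import defaultdict

        # Hypothetical synapse-level connectome: (pre_cell, post_cell, synapse_count).
        synapses = [
            ("n001", "n002", 12),
            ("n001", "n003", 3),
            ("n002", "n003", 7),
        ]

        # Hypothetical cell-type annotations, e.g. from morphology-based classification.
        cell_type = {"n001": "Mi1", "n002": "Tm3", "n003": "T4"}

        def reduce_to_type_matrix(synapses, cell_type):
            """Collapse a cell-to-cell synapse list into a type-to-type weight
            matrix, a reduced form that downstream simulation or machine-learning
            tools can actually use."""
            matrix = defaultdict(int)
            for pre, post, count in synapses:
                matrix[(cell_type[pre], cell_type[post])] += count
            return dict(matrix)

        print(reduce_to_type_matrix(synapses, cell_type))
        # {('Mi1', 'Tm3'): 12, ('Mi1', 'T4'): 3, ('Tm3', 'T4'): 7}

    The same aggregation pattern extends to whatever attributes the downstream tool needs (signs, delays, spatial positions), which is why publishing the reduction code alongside the data-reading interface matters.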