
    The triviality of the 61-stem in the stable homotopy groups of spheres

    We prove that the 2-primary $\pi_{61}$ is zero. As a consequence, the Kervaire invariant element $\theta_5$ is contained in the strictly defined 4-fold Toda bracket $\langle 2, \theta_4, \theta_4, 2 \rangle$. Our result has a geometric corollary: the 61-sphere has a unique smooth structure, and it is the last odd-dimensional case; the only odd-dimensional spheres with a unique smooth structure are $S^1$, $S^3$, $S^5$, and $S^{61}$. Our proof is a computation of homotopy groups of spheres. A major part of this paper is devoted to proving the Adams differential $d_3(D_3) = B_3$. We prove this differential by introducing a new technique based on the algebraic and geometric Kahn-Priddy theorems. The success of this technique suggests a theoretical way to prove Adams differentials in the sphere spectrum inductively by using differentials in truncated projective spectra.

    Comment: 67 pages, minor changes, accepted version
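    In display form, the main theorem and its stated consequence read (with $(\pi_{61})_{(2)}$ denoting the 2-primary component of the stable 61-stem):

        \[
            (\pi_{61})_{(2)} = 0
            \quad\Longrightarrow\quad
            \theta_5 \in \langle 2, \theta_4, \theta_4, 2 \rangle .
        \]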

    The special fiber of the motivic deformation of the stable homotopy category is algebraic

    For each prime $p$, we define a $t$-structure on the category $\widehat{S^{0,0}}/\tau\text{-}\mathbf{Mod}_{harm}^{b}$ of harmonic $\mathbb{C}$-motivic left module spectra over $\widehat{S^{0,0}}/\tau$ whose $MGL$-homology has bounded Chow-Novikov degree, such that its heart is equivalent to the abelian category of $p$-completed $BP_*BP$-comodules that are concentrated in even degrees. We prove that $\widehat{S^{0,0}}/\tau\text{-}\mathbf{Mod}_{harm}^{b}$ is equivalent to $\mathcal{D}^{b}(BP_*BP\text{-}\mathbf{Comod}^{ev})$ as stable $\infty$-categories equipped with $t$-structures. As an application, for each prime $p$, we prove that the motivic Adams spectral sequence for $\widehat{S^{0,0}}/\tau$, which converges to the motivic homotopy groups of $\widehat{S^{0,0}}/\tau$, is isomorphic to the algebraic Novikov spectral sequence, which converges to the classical Adams-Novikov $E_2$-page for the sphere spectrum $\widehat{S^0}$. This isomorphism of spectral sequences allows Isaksen and the second and third authors to compute the stable homotopy groups of spheres at least to the 90-stem, with ongoing computations into even higher dimensions.

    Comment: Accepted version, 85 pages
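    The paper's central equivalence, displayed for readability:

        \[
            \widehat{S^{0,0}}/\tau\text{-}\mathbf{Mod}_{harm}^{b}
            \;\simeq\;
            \mathcal{D}^{b}\bigl(BP_*BP\text{-}\mathbf{Comod}^{ev}\bigr)
        \]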

    Stable homotopy groups of spheres

    We discuss the current state of knowledge of stable homotopy groups of spheres. We describe a new computational method that yields a streamlined computation of the first 61 stable homotopy groups and gives new information about the stable homotopy groups in dimensions 62 through 90. The method relies more heavily on machine computations than previous methods and is therefore less prone to error. The main mathematical tool is the Adams spectral sequence.

    The Tradeoff Between Privacy and Accuracy in Anomaly Detection Using Federated XGBoost

    Privacy has raised considerable concern recently, especially with the advent of the information explosion and the numerous data mining techniques that explore the information inside large volumes of data. In this context, a new distributed learning paradigm termed federated learning has recently become prominent for tackling privacy issues in distributed learning: only learning models, not users' own data, are transmitted from the distributed nodes to the server, thereby protecting user privacy. In this paper, we propose a horizontal federated XGBoost algorithm to solve the federated anomaly detection problem, where anomaly detection aims to identify abnormalities in extremely unbalanced datasets and can be regarded as a special classification problem. Our proposed federated XGBoost algorithm incorporates data aggregation and sparse federated update processes to balance the tradeoff between privacy and learning performance. In particular, we introduce virtual data samples by aggregating a group of users' data together at a single distributed node. We compute parameters based on these virtual data samples in the local nodes and aggregate the learning model in the central server. During model updates, we focus more on the previously misclassified data within the virtual samples, which yields sparse learning model parameters. By carefully controlling the size of these groups of samples, we can tune the tradeoff between privacy and learning performance. Our experimental results show the effectiveness of the proposed scheme in comparison with existing state-of-the-art methods.
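    As a rough illustration of the mechanism sketched in this abstract, here is a minimal Python sketch of one federated boosting round with virtual samples and a sparse update. It assumes a plain logistic-loss, XGBoost-style leaf computation; all names (make_virtual_samples, local_stats, group_size) are illustrative assumptions, not the authors' actual algorithm or code.

        import numpy as np

        rng = np.random.default_rng(0)

        def make_virtual_samples(X, y, group_size):
            """Aggregate disjoint groups of raw user rows into 'virtual' samples:
            the feature centroid and the majority label of each group. Larger
            groups hide more about any individual user but coarsen the training
            signal (the privacy/accuracy tradeoff discussed in the abstract)."""
            n_groups = len(X) // group_size
            idx = rng.permutation(len(X))[: n_groups * group_size]
            idx = idx.reshape(n_groups, group_size)
            return X[idx].mean(axis=1), (y[idx].mean(axis=1) > 0.5).astype(float)

        def local_stats(pred, y_virtual, sparse=True):
            """Gradient/Hessian sums (logistic loss) computed at one node.
            With sparse=True, only currently misclassified virtual samples
            contribute, mimicking the sparse federated update described above."""
            p = 1.0 / (1.0 + np.exp(-pred))
            grad, hess = p - y_virtual, p * (1.0 - p)
            if sparse:
                mask = (p > 0.5) != (y_virtual > 0.5)  # wrongly classified only
                grad, hess = grad * mask, hess * mask
            return grad.sum(), hess.sum()

        # One round for a single (stump-like) leaf: each node reports only
        # aggregate statistics, never raw user data.
        nodes = [(rng.normal(size=(40, 5)), rng.integers(0, 2, 40)) for _ in range(3)]
        G = H = 0.0
        for X, y in nodes:
            Xv, yv = make_virtual_samples(X, y, group_size=4)
            g, h = local_stats(np.zeros(len(yv)), yv)  # predictions start at 0
            G, H = G + g, H + h
        print(f"aggregated leaf weight: {-G / (H + 1.0):.4f}")  # lambda = 1

    In this sketch, growing group_size strengthens privacy (each virtual sample blends more users) at the cost of a coarser training signal, which is the knob behind the tradeoff the abstract describes.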