
    Gaps between classes of matrix monotone functions

    We prove the existence of gaps between all the different classes of matrix monotone functions defined on an interval, provided the interval is nontrivial and different from the whole real line. We then show how matrix monotone functions may be used to characterize certain C*-algebras, as an alternative to studying the matricial structure via positive linear maps.
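    As a concrete illustration of why these classes can differ (a standard textbook example, not taken from the paper): f(t) = t² is monotone on scalars but fails to be matrix monotone already at order 2, where A ⪯ B means B − A is positive semidefinite (the Loewner order).

    ```python
    import numpy as np

    # Illustration (not from the paper): t -> t^2 is monotone on scalars
    # but not matrix monotone of order 2.
    # Loewner order: A <= B  iff  B - A is positive semidefinite (PSD).

    def is_psd(M, tol=1e-10):
        # A Hermitian matrix is PSD iff all its eigenvalues are >= 0.
        return bool(np.all(np.linalg.eigvalsh(M) >= -tol))

    A = np.array([[1.0, 1.0], [1.0, 1.0]])
    B = np.array([[2.0, 1.0], [1.0, 1.0]])

    print(is_psd(B - A))            # True: A <= B in the Loewner order
    print(is_psd(B @ B - A @ A))    # False: A^2 <= B^2 fails
    ```

    Here B − A = diag(1, 0) is PSD, yet B² − A² = [[3, 1], [1, 0]] has negative determinant, so squaring does not respect the Loewner order.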

    Similarity-preserving linear maps on B(H)

    Abstract: We determine the kernels of similarity-preserving bounded linear maps on B(H) and characterize the elementary operators of length 1 that are similarity-preserving.

    Maps preserving operator pairs whose products are projections

    Abstract: Let B(H) be the algebra of all bounded linear operators on a complex Hilbert space H with dim H ⩾ 2. It is proved that a surjective map φ on B(H) preserves operator pairs whose products are nonzero projections in both directions if and only if there exist a unitary or anti-unitary operator U on H and a constant λ with λ² = 1 such that φ(A) = λU*AU for all A in B(H). Related results are also obtained for surjective maps preserving, in both directions, operator pairs whose Jordan triple products are nonzero projections. These results show that operator pairs whose products or Jordan triple products are nonzero projections are isometric invariants of B(H).
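    One direction of the stated equivalence is elementary to check numerically (an illustrative sanity check, not from the paper): any map of the form φ(A) = λU*AU with λ² = 1 preserves pairs whose product is a projection, since φ(A)φ(B) = λ²U*(AB)U = U*(AB)U.

    ```python
    import numpy as np

    # Sanity check (illustrative, not the paper's code): phi(A) = lam * U^* A U
    # with lam^2 = 1 sends a pair (A, B) with AB a projection to a pair whose
    # product is again a projection.

    rng = np.random.default_rng(0)

    # A random unitary U obtained from a QR decomposition.
    Z = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
    U, _ = np.linalg.qr(Z)
    lam = -1.0                             # lam^2 = 1

    P = np.diag([1.0, 1.0, 0.0])           # a projection: P^2 = P = P^*
    A, B = P, np.eye(3)                    # so AB = P is a projection

    phiA = lam * U.conj().T @ A @ U
    phiB = lam * U.conj().T @ B @ U
    prod = phiA @ phiB                     # equals U^* (AB) U since lam^2 = 1

    print(np.allclose(prod @ prod, prod))  # True: the product is a projection
    ```

    The hard content of the theorem is of course the converse: that every surjective map preserving such pairs in both directions must have this form.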

    A note on maximality of analytic crossed products

    Abstract: Let G be a compact abelian group with totally ordered dual group Ĝ, and let Ĝ₊ denote its positive semigroup. Let N be a von Neumann algebra and α = {α_ĝ}_{ĝ∈Ĝ} an action of Ĝ on N by automorphisms. We write N ⋊_α Ĝ₊ for the analytic crossed product determined by N and α. We show that if N ⋊_α Ĝ₊ is a maximal σ-weakly closed subalgebra of N ⋊_α Ĝ, then Ĝ₊ induces an archimedean order on Ĝ.
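    For intuition about the archimedean condition (a standard example, not from the paper): an order on an abelian group is archimedean if for any positive elements g and h, some multiple ng exceeds h. The lexicographic order on Z² is the usual non-archimedean counterexample.

    ```python
    # Illustration (not from the paper): the lexicographic order on Z^2
    # is a totally ordered abelian group order that is NOT archimedean.

    def lex_positive(x):
        # (a, b) > 0 lexicographically iff a > 0, or a == 0 and b > 0.
        a, b = x
        return a > 0 or (a == 0 and b > 0)

    def lex_greater(x, y):
        return lex_positive((x[0] - y[0], x[1] - y[1]))

    g, h = (0, 1), (1, 0)   # both lexicographically positive
    # No multiple n*g = (0, n) ever exceeds h = (1, 0):
    print(any(lex_greater((0, n), h) for n in range(1, 10_000)))  # False
    ```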

    Towards artificial general intelligence via a multimodal foundation model

    The fundamental goal of artificial intelligence (AI) is to mimic the core cognitive activities of humans. Despite tremendous success in AI research, most existing methods possess only a single cognitive ability. To overcome this limitation and take a solid step towards artificial general intelligence (AGI), we develop a foundation model pre-trained on huge amounts of multimodal data, which can be quickly adapted to various downstream cognitive tasks. To achieve this goal, we propose to pre-train our foundation model by self-supervised learning on weakly semantically correlated data crawled from the Internet, and show that promising results can be obtained on a wide range of downstream tasks. In particular, using the model-interpretability tools we developed, we demonstrate that our foundation model now possesses strong imagination ability. We believe our work makes a transformative stride towards AGI, from the common practice of "weak or narrow AI" to that of "strong or generalized AI".
    Comment: Published in Nature Communications, see https://www.nature.com/articles/s41467-022-30761-
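    The self-supervised pre-training described above is typically realized, in models of this kind, as a contrastive objective over weakly correlated image-text pairs: matched pairs in a batch are pulled together, mismatched pairs pushed apart. The NumPy sketch below shows the idea; it is an assumption about the general technique, not the paper's actual code, and all names are illustrative.

    ```python
    import numpy as np

    # Sketch of a symmetric contrastive (CLIP-style) loss over a batch of
    # image/text embeddings. The matching pair for row i is column i, so the
    # diagonal of the similarity matrix holds the positives.

    def contrastive_loss(img_emb, txt_emb, temperature=0.07):
        # Normalize so the dot product is cosine similarity.
        img = img_emb / np.linalg.norm(img_emb, axis=1, keepdims=True)
        txt = txt_emb / np.linalg.norm(txt_emb, axis=1, keepdims=True)
        logits = img @ txt.T / temperature       # (batch, batch) similarities
        idx = np.arange(len(logits))

        def xent(l):
            # Numerically stable log-softmax over each row.
            m = l.max(axis=1, keepdims=True)
            log_probs = (l - m) - np.log(np.exp(l - m).sum(axis=1, keepdims=True))
            return -log_probs[idx, idx].mean()   # cross-entropy on the diagonal

        # Average the image-to-text and text-to-image directions.
        return (xent(logits) + xent(logits.T)) / 2

    rng = np.random.default_rng(0)
    x = rng.standard_normal((4, 8))
    # Perfectly aligned embeddings: loss falls below the uniform baseline log(4).
    print(contrastive_loss(x, x) < np.log(4))    # True
    ```

    With perfectly aligned embeddings the diagonal logit is strictly the largest in each row, so the loss is strictly below the log(batch-size) value a uniform predictor would attain; training drives real image/text encoders toward this regime.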