
    Properties of Noncommutative Renyi and Augustin Information

    The scaled R\'enyi information plays a significant role in evaluating the performance of information processing tasks by virtue of its connection to error exponent analysis. In quantum information theory, there are three generalizations of the classical R\'enyi divergence that possess meaningful operational interpretations: the Petz, sandwiched, and log-Euclidean versions. However, these scaled noncommutative R\'enyi informations are much less explored than their classical counterpart, and the lack of crucial properties hinders their application to refined performance analysis. The goal of this paper is thus to analyze fundamental properties of the scaled R\'enyi information from a noncommutative measure-theoretic perspective. First, we prove uniform equicontinuity for all three quantum versions of the R\'enyi information, which yields the joint continuity of these quantities in the orders and priors. Second, we establish concavity in the order $s\in(-1,0)$ for both the Petz and sandwiched versions. This resolves open questions raised by Holevo [\href{https://ieeexplore.ieee.org/document/868501/}{\textit{IEEE Trans.~Inf.~Theory}, \textbf{46}(6):2256--2261, 2000}] and by Mosonyi and Ogawa [\href{https://doi.org/10.1007/s00220-017-2928-4/}{\textit{Commun.~Math.~Phys.}, \textbf{355}(1):373--426, 2017}]. As applications, we show that the strong converse exponent in classical-quantum channel coding satisfies a minimax identity. The established concavity is further employed to prove an entropic duality between classical data compression with quantum side information and classical-quantum channel coding, and a Fenchel duality in joint source-channel coding with quantum side information in forthcoming papers.
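    For orientation, the three noncommutative R\'enyi divergences named above are commonly written, for states $\rho$, $\sigma$ and order $\alpha$, in the following standard forms; how the scaled information quantities and the order $s$ in the abstract are built from these follows the paper's own conventions, which are not reproduced here:

        \bar{D}_\alpha(\rho\|\sigma) = \tfrac{1}{\alpha-1}\log\operatorname{Tr}\!\left[\rho^{\alpha}\sigma^{1-\alpha}\right]                                      (Petz)
        \widetilde{D}_\alpha(\rho\|\sigma) = \tfrac{1}{\alpha-1}\log\operatorname{Tr}\!\left[\left(\sigma^{\frac{1-\alpha}{2\alpha}}\rho\,\sigma^{\frac{1-\alpha}{2\alpha}}\right)^{\alpha}\right]   (sandwiched)
        D^{\flat}_\alpha(\rho\|\sigma) = \tfrac{1}{\alpha-1}\log\operatorname{Tr}\!\left[\exp\!\left(\alpha\log\rho+(1-\alpha)\log\sigma\right)\right]             (log-Euclidean)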

    Electric Vehicle Promotion Policy in Taiwan

    The developmental patterns of automotive industries in developing countries differ from those in developed countries. Nations should actively and effectively develop an electric vehicle (EV) industry to reduce carbon dioxide emissions and energy consumption, especially during this period of rising fuel prices and emphasis on saving energy and reducing carbon emissions. From interdisciplinary perspectives, this study analyzed the methods used to promote the EV industry in Taiwan. In addition, we suggest that the Taiwanese government should use its advantages in Central Taiwan to assemble mature precision-machinery suppliers in this area to facilitate long-term research and development for the EV industry. This study provides empirical experience for emerging cities in developing countries regarding the development of the EV industry and is an appropriate reference for the creation of EV industry clusters.

    Pattern Anomaly Detection based on Sequence-to-Sequence Regularity Learning

    Anomaly detection in traffic surveillance videos is a challenging task due to the ambiguity of anomaly definitions and the complexity of scenes. In this paper, we propose to detect anomalous trajectories for vehicle behavior analysis by learning regularities in the data. First, we train a sequence-to-sequence model under the autoencoder architecture and propose a new reconstruction error function for model optimization and anomaly evaluation. As such, the model is forced to learn regular trajectory patterns in an unsupervised manner. Then, at the inference stage, we use the learned model to encode a test trajectory into a compact representation and to generate a new trajectory sequence following the learned regular patterns. An anomaly score is computed from the deviation of the generated trajectory from the test sample. Finally, anomalous trajectories are identified with an adaptive threshold. We evaluate the proposed method on two real-world traffic datasets, and the experiments show favorable results against state-of-the-art algorithms. This paper's research on sequence-to-sequence regularity learning can provide theoretical and practical support for pattern anomaly detection.
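    A minimal sketch of this pipeline (an LSTM-based sequence autoencoder scored by reconstruction error) is given below. It is illustrative only: the paper's exact architecture, its proposed reconstruction error function, and the adaptive thresholding rule are not reproduced, and all identifiers are hypothetical.

        # Sketch: trajectory anomaly scoring with a sequence autoencoder (PyTorch).
        # Plain MSE is used here in place of the paper's reconstruction error function.
        import torch
        import torch.nn as nn

        class TrajectoryAutoencoder(nn.Module):
            def __init__(self, input_dim=2, hidden_dim=64):
                super().__init__()
                self.encoder = nn.LSTM(input_dim, hidden_dim, batch_first=True)
                self.decoder = nn.LSTM(hidden_dim, hidden_dim, batch_first=True)
                self.out = nn.Linear(hidden_dim, input_dim)

            def forward(self, x):                    # x: (batch, seq_len, 2) trajectory points
                _, (h, _) = self.encoder(x)          # final hidden state = compact representation
                rep = h[-1].unsqueeze(1).repeat(1, x.size(1), 1)  # repeat the code at every step
                dec, _ = self.decoder(rep)
                return self.out(dec)                 # regenerated trajectory in the learned regular pattern

        def anomaly_score(model, traj):
            """Deviation of the regenerated trajectory from the test sample (per-sample MSE)."""
            model.eval()
            with torch.no_grad():
                recon = model(traj)
                return torch.mean((recon - traj) ** 2, dim=(1, 2))

        # Usage: train with nn.MSELoss() on normal trajectories only, then flag test
        # trajectories whose anomaly_score exceeds an adaptively chosen threshold.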

    Search for a heavy dark photon at future $e^+e^-$ colliders

    A coupling of a dark photon $A'$ from a $U(1)_{A'}$ with the standard model (SM) particles can be generated through kinetic mixing, represented by a parameter $\epsilon$. A non-zero $\epsilon$ also induces a mixing between $A'$ and $Z$ if the dark photon mass $m_{A'}$ is not zero. This mixing can be large when $m_{A'}$ is close to $m_Z$, even if the parameter $\epsilon$ is small. Many efforts have been made to constrain the parameter $\epsilon$ for a dark photon mass $m_{A'}$ that is low compared with the $Z$ boson mass $m_Z$. We study the search for a dark photon in $e^+e^- \to \gamma A' \to \gamma \mu^+\mu^-$ for a dark photon mass $m_{A'}$ as large as kinematically allowed at future $e^+e^-$ colliders. For large $m_{A'}$, care should be taken to properly treat the possibly large mixing between $A'$ and $Z$. We obtain sensitivities to the parameter $\epsilon$ for a wide range of dark photon masses at planned $e^+e^-$ colliders, such as the Circular Electron Positron Collider (CEPC), the International Linear Collider (ILC), and the Future Circular Collider (FCC-ee). For the dark photon mass $20~\text{GeV}\lesssim m_{A'}\lesssim 330~\text{GeV}$, the $2\sigma$ exclusion limits on the mixing parameter are $\epsilon\lesssim 10^{-3}$-$10^{-2}$. The CEPC with $\sqrt{s}=240~\text{GeV}$ and the FCC-ee with $\sqrt{s}=160~\text{GeV}$ are more sensitive than the constraint from the current LHCb measurement once the dark photon mass $m_{A'}\gtrsim 50~\text{GeV}$. For $m_{A'}\gtrsim 220~\text{GeV}$, the sensitivity at the FCC-ee with $\sqrt{s}=350~\text{GeV}$ and $1.5~\text{ab}^{-1}$ is better than that at the 13~TeV LHC with $300~\text{fb}^{-1}$, while the sensitivity at the CEPC with $\sqrt{s}=240~\text{GeV}$ and $5~\text{ab}^{-1}$ can be even better than that at the 13~TeV LHC with $3~\text{ab}^{-1}$ for $m_{A'}\gtrsim 180~\text{GeV}$. Comment: 21 pages, 5 figures, 2 tables
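    For context, the kinetic-mixing coupling referred to above is often written (in one frequently used convention; the paper's normalization may differ) as

        \mathcal{L} \supset -\frac{\epsilon}{2\cos\theta_W}\, B_{\mu\nu} F'^{\mu\nu},

    where $B_{\mu\nu}$ is the hypercharge field strength and $F'_{\mu\nu}$ that of the dark photon $A'$. After electroweak symmetry breaking this term mixes $A'$ with both the photon and the $Z$, which is why the $A'$-$Z$ mixing grows as $m_{A'}$ approaches $m_Z$.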

    Logistics Data Exchange for the EDI Customs Clearance System based on XML

    Because of the disconnection between the logistics services trading platform and the EDI customs clearance system, logistics clearance data had to be gathered manually, and the efficiency of customs clearance was rather low. To address this problem, a logistics data exchange method based on XML technology is proposed. The method first performs batch extraction and conversion of the logistics clearance data from the logistics services trading platform; the data are then transferred to the customs broker, parsed by deserialization, and finally submitted to the EDI customs clearance system automatically. This logistics data exchange method connects the logistics services trading platform with the EDI customs clearance system and raises the efficiency of customs clearance.
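    A minimal sketch of the serialize/deserialize step on each side of such an exchange is shown below, assuming a flat record structure; the element names, the message schema, and the submission interface to the EDI system are hypothetical and not taken from the paper.

        # Sketch: XML serialization on the trading-platform side and
        # deserialization on the customs-broker side (hypothetical field names).
        import xml.etree.ElementTree as ET

        def to_xml(record: dict) -> str:
            """Serialize one clearance record extracted from the trading platform."""
            root = ET.Element("ClearanceRecord")
            for field, value in record.items():
                ET.SubElement(root, field).text = str(value)
            return ET.tostring(root, encoding="unicode")

        def from_xml(xml_text: str) -> dict:
            """Deserialize the received message before submitting it to the EDI
            customs clearance system."""
            root = ET.fromstring(xml_text)
            return {child.tag: child.text for child in root}

        # Usage
        message = to_xml({"WaybillNo": "WB20240001", "HSCode": "847130", "Quantity": 12})
        print(from_xml(message))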