Desingularization in Computational Applications and Experiments
After briefly recalling some computational aspects of blowing up and of
representation of resolution data common to a wide range of desingularization
algorithms (in the general case as well as in special cases like surfaces or
binomial varieties), we shall proceed to computational applications of
resolution of singularities in singularity theory and algebraic geometry, also
touching on relations to algebraic statistics and machine learning. Namely, we
explain how to compute the intersection form and dual graph of resolution for
surfaces, how to determine discrepancies, the log-canonical threshold and the
topological zeta function on the basis of desingularization data. We shall also
briefly see how resolution data comes into play for Bernstein-Sato polynomials,
and we mention some settings in which desingularization algorithms can be used
for computational experiments. The latter is simply an invitation to the
readers to think for themselves about experiments using existing software,
whenever it seems suitable for their own work.
Comment: notes of a summer school talk; 16 pages; 1 figure
ZETA - Zero-Trust Authentication: Relying on Innate Human Ability, not Technology
Reliable authentication requires the devices and
channels involved in the process to be trustworthy; otherwise
authentication secrets can easily be compromised. Given the
unceasing efforts of attackers worldwide, such trustworthiness
is increasingly not a given. A variety of technical solutions,
such as utilising multiple devices/channels and verification
protocols, has the potential to mitigate the threat of untrusted
communications to a certain extent. Yet such technical solutions
make two assumptions: (1) users have access to multiple
devices and (2) attackers will not resort to hacking the human,
using social engineering techniques. In this paper, we propose
and explore the potential of using human-based computation
instead of solely technical solutions to mitigate the threat of
untrusted devices and channels. ZeTA (Zero Trust Authentication
on untrusted channels) has the potential to allow people to
authenticate despite compromised channels or communications
and easily observed usage. Our contributions are threefold:
(1) We propose the ZeTA protocol with a formal definition
and security analysis that utilises semantics and human-based
computation to ameliorate the problem of untrusted devices
and channels. (2) We outline a security analysis to assess
the envisaged performance of the proposed authentication
protocol. (3) We report on a usability study that explores the
viability of relying on human computation in this context.
PKM and the maintenance of memory.
How can memories outlast the molecules from which they are made? Answers to this fundamental question have been slow in coming but are now emerging. A novel kinase, an isoform of protein kinase C (PKC) called PKMzeta, has been shown to be critical to the maintenance of some types of memory. Inhibiting the catalytic activity of this kinase can erase well-established memories without altering the ability of the erased synapse to be retrained. This article provides an overview of the literature linking PKMzeta to memory maintenance and identifies some of the controversial issues that surround the bold implications of the existing data. It concludes with a discussion of the future directions of this domain.
The L-functions and modular forms database project
The Langlands Programme, formulated by Robert Langlands in the 1960s and
since much developed and refined, is a web of interrelated theory and
conjectures concerning many objects in number theory, their interconnections,
and connections to other fields. At the heart of the Langlands Programme is the
concept of an L-function.
The most famous L-function is the Riemann zeta-function, and as well as being
ubiquitous in number theory itself, L-functions have applications in
mathematical physics and cryptography. Two of the seven Clay Mathematics
Institute Millennium Prize Problems, the Riemann Hypothesis and the Birch and
Swinnerton-Dyer Conjecture, deal with their properties. Many different
mathematical objects are connected in various ways to L-functions, but the
study of those objects is highly specialized, and most mathematicians have only
a vague idea of the objects outside their specialty and how everything is
related. Helping mathematicians to understand these connections was the
motivation for the L-functions and Modular Forms Database (LMFDB) project. Its
mission is to chart the landscape of L-functions and modular forms in a
systematic, comprehensive and concrete fashion. This involves developing their
theory, creating and improving algorithms for computing and classifying them,
and hence discovering new properties of these functions, and testing
fundamental conjectures.
In the lecture I gave a very brief introduction to L-functions for
non-experts, and explained and demonstrated how the large collection of data in
the LMFDB is organized and displayed, showing the interrelations between linked
objects, through our website www.lmfdb.org. I also showed how this has been
created by a world-wide open source collaboration, which we hope may become a
model for others.
Comment: 14 pages with one illustration. Based on a plenary lecture given at FoCM'14, December 2014, Montevideo, Uruguay
On the correction of anomalous phase oscillation in entanglement witnesses using quantum neural networks
Entanglement of a quantum system depends upon relative phase in complicated
ways, which no single measurement can reflect. Because of this, entanglement
witnesses are necessarily limited in applicability and/or utility. We propose
here a solution to the problem using quantum neural networks. A quantum system
contains the information of its entanglement; thus, if we are clever, we can
extract that information efficiently. As proof of concept, we show how this can
be done for the case of pure states of a two-qubit system, using an
entanglement indicator corrected for the anomalous phase oscillation. Both the
entanglement indicator and the phase correction are calculated by the quantum
system itself acting as a neural network.