The analysis of scanner data for crop inventories
There are no author-identified significant results in this report
How to Issue a Central Bank Digital Currency
With the emergence of Bitcoin and recently proposed stablecoins from
BigTechs, such as Diem (formerly Libra), central banks face growing competition
from private actors offering their own digital alternative to physical cash. We
do not address the normative question whether a central bank should issue a
central bank digital currency (CBDC) or not. Instead, we contribute to the
current research debate by showing how a central bank could do so, if desired.
We propose a token-based system without distributed ledger technology and show
how earlier-deployed, software-only electronic cash can be improved upon to
preserve transaction privacy, meet regulatory requirements in a compelling way,
and offer a level of quantum-resistant protection against systemic privacy
risk. Neither monetary policy nor financial stability would be materially
affected because a CBDC with this design would replicate physical cash rather
than bank deposits.
Comment: Swiss National Bank Working Paper 3/202
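The "earlier-deployed, software-only electronic cash" the authors build on descends from Chaum-style blind signatures, whose core can be sketched as follows. This is a toy illustration with tiny RSA parameters and no hashing or padding, not the paper's actual scheme; real systems use full-domain hashing and large keys.

```python
# Toy Chaum-style blind signature: the bank signs a coin's serial number
# without ever seeing it, which is what preserves transaction privacy.
import secrets
from math import gcd

# Bank's RSA key (toy primes chosen for readability, NOT secure).
p, q = 1000003, 1000033
n = p * q
e = 65537
d = pow(e, -1, (p - 1) * (q - 1))

def blind(serial: int) -> tuple[int, int]:
    """User blinds a coin serial before sending it to the bank."""
    while True:
        r = secrets.randbelow(n - 2) + 2
        if gcd(r, n) == 1:
            break
    return (serial * pow(r, e, n)) % n, r

def bank_sign(blinded: int) -> int:
    """Bank signs the blinded value; it learns nothing about the serial."""
    return pow(blinded, d, n)

def unblind(blind_sig: int, r: int) -> int:
    """User strips the blinding factor, leaving a valid signature."""
    return (blind_sig * pow(r, -1, n)) % n

def verify(serial: int, sig: int) -> bool:
    return pow(sig, e, n) == serial % n

serial = secrets.randbelow(n)
blinded, r = blind(serial)
sig = unblind(bank_sign(blinded), r)
assert verify(serial, sig)   # the coin verifies...
assert blinded != serial     # ...yet the bank only ever saw the blinded value
```

The unlinkability between what the bank signs and what it later sees redeemed is the property the paper's cash-like design relies on.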
Vendors are Undermining the Structure of U.S. Elections
As we approach the 2008 general election, the structure of elections in the United States -- once reliant on local representatives accountable to the public -- has become almost wholly dependent on large corporations, which are not accountable to the public. Most local officials charged with running elections are now unable to administer elections without the equipment, services, and trade-secret software of a small number of corporations. If the vendors withdrew their support for elections now, our election structure would collapse. Case studies presented in this report give examples of the pervasive control voting system vendors now have over election administration in almost every state, and the consequences some jurisdictions are already experiencing. However, some states and localities are recognizing the threat that vendor dependency poses to elections and are using ingenuity and determination to begin reversing this trend. This report examines the situation, how we got here, and steps we can take to limit corporate control of our elections in 2008 and reduce it even further in the future.
Group Diffie-Hellman Key Exchange Secure against Dictionary Attacks
Group Diffie-Hellman schemes for password-based key exchange are designed to provide a pool of players, communicating over a public network and sharing just a human-memorable password, with a session key (e.g., a key used for multicast data integrity and confidentiality). The fundamental security goal in this scenario is security against dictionary attacks. While solutions have been proposed to solve this problem, no formal treatment has ever been suggested. In this paper, we define a security model and then present a protocol with its security proof in both the random oracle model and the ideal-cipher model.
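The algebraic core of group Diffie-Hellman, n parties agreeing on the value g^(x1·x2·…·xn), can be sketched as below. This is an unauthenticated toy simulation with illustrative parameters, not the password-authenticated protocol the paper analyzes; in a real protocol the partial values reach each party through message flows rather than direct computation.

```python
# Unauthenticated n-party Diffie-Hellman key agreement (algebraic core only).
import secrets
from math import prod

p = 2**127 - 1   # Mersenne prime (toy group; use a standardized group in practice)
g = 3

n_parties = 4
xs = [secrets.randbelow(p - 2) + 1 for _ in range(n_parties)]

# In the real protocol, party i obtains g^{prod of the other parties' secrets}
# via up-flow/down-flow messages; here we compute it directly for illustration.
keys = []
for i in range(n_parties):
    others = prod(x for j, x in enumerate(xs) if j != i)
    partial = pow(g, others, p)          # what party i would receive
    keys.append(pow(partial, xs[i], p))  # party i raises it to its own secret

assert len(set(keys)) == 1  # every party derives the same session key
```

Without authentication, this exchange is vulnerable to active attackers; binding it to a shared low-entropy password without enabling offline dictionary attacks is exactly the problem the paper formalizes.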
Accountable Privacy-Preserving Data Processing via Distributed Ledgers
Data are gathered constantly, grow exponentially, and are considered a
valuable asset. The need for extensive analysis has been highlighted by various
organizations and researchers. However, the data can be sensitive, private,
and protected by privacy legislation, making data processing by
third parties almost impossible. We propose a protocol for data processing
where data controllers can register their datasets and entities can
request data processing operations by data processors. A distributed
ledger is used as the controller of the system serving as an immutable
history log of all actions taken by the participants. The blockchain-based
distributed ledger provides data accountability, auditability and
provenance tracking. We also use a Zero Knowledge Verifiable Computation
scheme in which a data processor is required to produce a proof of
correct computation, which the requestor verifies, without revealing the
dataset itself. This records the fact that correct processing has
taken place without disclosing any further information about the data.
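The registration/request/audit flow described above can be sketched as follows. This is a hypothetical illustration only: the ledger is a plain append-only hash chain, the zero-knowledge proof is replaced by a placeholder commitment check, and every function name is invented; a real deployment would use an actual ZK verifiable-computation scheme (e.g. a zk-SNARK).

```python
# Sketch of a ledger-mediated accountable data-processing flow.
import hashlib, json

ledger: list[dict] = []   # stand-in for the distributed ledger (append-only log)

def log(event: str, payload: dict) -> None:
    """Append a hash-chained entry, giving an immutable history of actions."""
    prev = ledger[-1]["hash"] if ledger else "0" * 64
    entry = {"event": event, "payload": payload, "prev": prev}
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()).hexdigest()
    ledger.append(entry)

def register_dataset(data: bytes) -> str:
    """Data controller registers a commitment; the data never goes on-chain."""
    commitment = hashlib.sha256(data).hexdigest()
    log("register", {"commitment": commitment})
    return commitment

def process(data: bytes, commitment: str):
    """Data processor runs the requested computation and emits a 'proof'."""
    result = sum(data)  # the requested computation (toy example)
    # Placeholder proof binding the result to the committed dataset.
    # A real ZK proof convinces the requestor without this re-hash trick.
    proof = hashlib.sha256(commitment.encode() + str(result).encode()).hexdigest()
    log("process", {"commitment": commitment, "result": result, "proof": proof})
    return result, proof

def verify(commitment: str, result, proof: str) -> bool:
    """Requestor checks the proof against the on-chain commitment."""
    return proof == hashlib.sha256(
        commitment.encode() + str(result).encode()).hexdigest()

data = bytes(range(10))
c = register_dataset(data)
result, proof = process(data, c)
assert verify(c, result, proof)
assert ledger[1]["prev"] == ledger[0]["hash"]  # auditable, tamper-evident chain
```

The hash-chained log captures the accountability and provenance-tracking role the abstract assigns to the distributed ledger; the ZK component is the part this sketch deliberately stubs out.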
AgRISTARS: Foreign commodity production forecasting. The 1980 US corn and soybeans exploratory experiment
The U.S. corn and soybeans exploratory experiment is described, which consisted of evaluations of two technology components of a production forecasting system: classification procedures (crop labeling and proportion estimation at the level of a sampling unit) and sampling and aggregation procedures. The results from the labeling evaluations indicate that the corn and soybeans labeling procedure works very well in the U.S. corn belt with full-season (after tasseling) LANDSAT data. The procedure should be readily adaptable to the corn and soybeans labeling required for subsequent exploratory experiments or pilot tests. The machine classification procedures evaluated in this experiment were not effective in improving the proportion estimates. The corn proportions produced by the machine procedures had a large bias when bias correction was not performed; this bias was caused by the manner in which the machine procedures handled spectrally impure pixels. The simulation test indicated that the weighted aggregation procedure performed quite well. Although further work can be done to improve both the simulation tests and the aggregation procedure, the results of this test show that the procedure should serve as a useful baseline procedure in future exploratory experiments and pilot tests.
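The general idea behind a weighted aggregation of sampling-unit proportion estimates can be illustrated as follows. The numbers and the formula are illustrative only, not the experiment's actual procedure: each unit's estimated crop proportion is weighted by the cropland area it represents.

```python
# Area-weighted aggregation of per-sampling-unit crop proportion estimates
# into a regional crop-area estimate (illustrative values).
# Each tuple: (estimated corn proportion, cropland area represented, km^2)
units = [(0.42, 1200.0), (0.35, 800.0), (0.50, 1500.0)]

total_area = sum(area for _, area in units)
weighted_prop = sum(p * area for p, area in units) / total_area  # area-weighted mean
corn_area = weighted_prop * total_area                           # regional estimate

# The regional estimate equals the sum of per-unit contributions.
assert abs(corn_area - sum(p * area for p, area in units)) < 1e-6
```

Weighting by represented area keeps a small sampling unit in a sparsely cropped stratum from dominating the regional estimate, which is the usual motivation for such schemes.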
The heavy top quark and supersymmetry
Three aspects of supersymmetric theories are discussed: electroweak symmetry
breaking, the issues of flavor, and gauge unification. The heavy top quark
plays an important, sometimes dominant, role in each case. Additional
symmetries lead to extensions of the standard model which can provide an
understanding for many of the outstanding problems of particle physics. A
broken supersymmetric extension of spacetime allows electroweak symmetry
breaking to follow from the dynamics of the heavy top quark; an extension of
isospin provides a constrained framework for understanding the pattern of quark
and lepton masses; and a grand unified extension of the standard model gauge
group provides an elegant understanding of the gauge quantum numbers of the
components of a generation. Experimental signatures for each of these
additional symmetries are discussed.
Comment: 60 pages, 1 ps file; lectures delivered at the 1995 SLAC Summer Institute
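The mechanism by which the heavy top quark can drive electroweak symmetry breaking is usually traced to the one-loop renormalization-group running of the up-type Higgs soft mass in the MSSM; the standard textbook form of the dominant term is quoted here as context rather than from the lectures themselves.

```latex
% One-loop running of the up-type Higgs soft mass-squared, dominated by
% the top Yukawa coupling y_t (MSSM, standard conventions):
16\pi^2 \, \frac{d\, m_{H_u}^2}{d \ln\mu}
  \;\supset\; 6\,|y_t|^2 \left( m_{H_u}^2 + m_{Q_3}^2 + m_{\bar{u}_3}^2 + |A_t|^2 \right)
```

Because y_t is large, running down from a high scale drives m_{H_u}^2 negative, destabilizing the origin of the Higgs potential and triggering electroweak symmetry breaking radiatively, which is the sense in which the top quark's dynamics dominate the first topic above.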