3,514 research outputs found

    Low Rank Matrix Completion with Exponential Family Noise

    Full text link
    The matrix completion problem consists in reconstructing a matrix from a sample of entries, possibly observed with noise. A popular class of estimators, known as nuclear norm penalized estimators, is based on minimizing the sum of a data-fitting term and a nuclear norm penalization. Here, we investigate the case where the noise distribution belongs to the exponential family and is sub-exponential. Our framework allows for a general sampling scheme. We first consider an estimator defined as the minimizer of the sum of a log-likelihood term and a nuclear norm penalization and prove an upper bound on the Frobenius prediction risk. The rate obtained improves on previous works on matrix completion for exponential family noise. When the sampling distribution is known, we propose another estimator and prove an oracle inequality w.r.t. the Kullback-Leibler prediction risk, which translates immediately into an upper bound on the Frobenius prediction risk. Finally, we show that all the rates obtained are minimax optimal up to a logarithmic factor.
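    For the Gaussian member of the exponential family, the nuclear norm penalized estimator reduces to squared-loss matrix completion, which can be solved by proximal gradient descent with singular value thresholding. A minimal sketch of that special case, assuming a 0/1 observation mask and zero-filled unobserved entries; the function names, step size and penalty level are illustrative, and the general exponential family case would replace the squared data-fitting term with the negative log-likelihood:

        import numpy as np

        def svt(A, tau):
            """Singular value thresholding: the prox operator of tau * ||.||_*."""
            U, s, Vt = np.linalg.svd(A, full_matrices=False)
            return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

        def complete(Y, mask, lam=1.0, step=1.0, iters=200):
            """Proximal gradient for: 0.5 * ||mask * (X - Y)||_F^2 + lam * ||X||_*.
            Y holds the observed entries (zeros elsewhere); mask is 1 where observed."""
            X = np.zeros_like(Y)
            for _ in range(iters):
                grad = mask * (X - Y)          # gradient of the data-fitting term
                X = svt(X - step * grad, step * lam)
            return X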

    How brokers can optimally plot against traders

    Full text link
    Traders buy and sell financial instruments in hopes of making a profit, and brokers are responsible for the transaction. Some brokers, known as market-makers, take the position opposite to the trader's: if the trader buys, they sell; if the trader sells, they buy. Said differently, brokers make money whenever their traders lose money. From this somewhat strange mechanism emerge various conspiracy theories, notably that brokers manipulate prices in order to maximize their traders' losses. In this paper, our goal is to perform this evil task optimally. Assuming total control over the price of an asset (ignoring the usual aspects of finance such as market conditions, external influence or stochasticity), we show how, given a set of trades each specified by a stop-loss and a take-profit price, a broker can find a maximum-loss price movement in cubic time. We also study the same problem under a model of probabilistic trades. We finally look at the online trade setting, where broker and trader take turns, each trying to make a profit. We show that the best option for the trader is to never trade.
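    To make the market-maker mechanism concrete, here is a toy sketch, assuming one-unit long trades and a price path fully controlled by the broker; the paper's cubic-time algorithm optimizes over all price movements, which this fragment does not attempt:

        def trader_pnl(path, trades):
            """Toy model: each trade is (entry_price, stop_loss, take_profit) for a
            long position of one unit, open from the start of the path. The trade
            closes at the first price touching its stop or take; the market-maker
            books the opposite of the trader's profit and loss."""
            total = 0.0
            for entry, stop, take in trades:
                exit_price = path[-1]              # default: close at the last price
                for p in path:
                    if p <= stop or p >= take:
                        exit_price = stop if p <= stop else take
                        break
                total += exit_price - entry
            return total

        # The broker controls the path, so it can steer prices through the stops:
        trades = [(100, 95, 110), (100, 90, 105)]
        crash = [100, 97, 94, 89]                  # dives through both stop-losses
        print(trader_pnl(crash, trades))           # -15.0, so the broker gains 15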

    Women and Combat: Why They Serve

    Get PDF

    Hamiltonian chordal graphs are not cycle extendible

    Full text link
    In 1990, Hendry conjectured that every Hamiltonian chordal graph is cycle extendible; that is, the vertices of any non-Hamiltonian cycle are contained in a cycle of length one greater. We disprove this conjecture by constructing counterexamples on $n$ vertices for any $n \geq 15$. Furthermore, we show that there exist counterexamples where the ratio of the length of a non-extendible cycle to the total number of vertices can be made arbitrarily small. We then consider cycle extendibility in Hamiltonian chordal graphs where certain induced subgraphs are forbidden, notably $P_n$ and the bull. Comment: Some results from Section 3 were incorrect and have been removed. To appear in SIAM Journal on Discrete Mathematics.
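    Hendry's definition can be checked directly by brute force on small graphs: enumerate every non-Hamiltonian cycle and look for a cycle one vertex longer that contains it. A naive exponential-time sketch (the counterexamples here have at least 15 vertices, so this is only practical on toy instances):

        from itertools import combinations, permutations

        def cycles(adj, k):
            """Yield each k-cycle of the undirected graph adj (dict of vertex ->
            neighbor set) exactly once, as a vertex tuple."""
            for verts in combinations(sorted(adj), k):
                v0 = verts[0]
                for rest in permutations(verts[1:]):
                    if rest[0] > rest[-1]:
                        continue                   # skip the mirrored traversal
                    cyc = (v0,) + rest
                    if all(b in adj[a] for a, b in zip(cyc, cyc[1:] + (v0,))):
                        yield cyc

        def is_cycle_extendible(adj):
            """Check Hendry's property: every non-Hamiltonian cycle's vertex set
            is contained in some cycle of length one greater."""
            n = len(adj)
            for k in range(3, n):
                for cyc in cycles(adj, k):
                    cset = set(cyc)
                    if not any(cset <= set(big) for big in cycles(adj, k + 1)):
                        return False
            return True

        K4 = {v: {u for u in range(4) if u != v} for v in range(4)}
        print(is_cycle_extendible(K4))             # True: every triangle extends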

    Long-run dynamics of the U.S. patent classification system

    Full text link
    Almost by definition, radical innovations create a need to revise existing classification systems. In this paper, we argue that classification system changes and patent reclassification are common and reveal interesting information about technological evolution. To support our argument, we present three sets of findings regarding classification volatility in the U.S. patent classification system. First, we study the evolution of the number of distinct classes. Reconstructed time series based on the current classification scheme are very different from historical data, which suggests that using the current classification to analyze the past produces a distorted view of the evolution of the system. Second, we study the relative sizes of classes. The size distribution is exponential, so classes differ widely in size, but the largest classes are not necessarily the oldest. To explain this pattern with a simple stochastic growth model, we introduce the assumption that classes have a regular chance to be split. Third, we study reclassification. The share of patents that are now in a different class than the one they were born in can be quite high. Reclassification mostly occurs across classes belonging to the same 1-digit NBER category, but not always. We also document that reclassified patents tend to be more cited than non-reclassified ones, even after controlling for grant year and class of origin.
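    The splitting mechanism is easy to prototype. A toy simulation, assuming size-proportional growth and half-splits of a uniformly chosen class; these specific rules are illustrative assumptions, not necessarily the paper's exact model:

        import random

        def simulate_classes(n_patents=50000, p_split=5e-4, seed=0):
            """Each new patent joins an existing class with probability
            proportional to its size; after each arrival, with probability
            p_split a uniformly chosen class splits into two halves, so large
            classes are not necessarily the oldest ones."""
            random.seed(seed)
            sizes = [1]
            for _ in range(n_patents):
                k = random.choices(range(len(sizes)), weights=sizes)[0]
                sizes[k] += 1
                if random.random() < p_split:
                    j = random.randrange(len(sizes))
                    half = sizes[j] // 2
                    if half > 0:
                        sizes[j] -= half
                        sizes.append(half)
            return sorted(sizes, reverse=True)

        print(simulate_classes()[:10])             # the largest simulated classes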

    Local Agenda 21 and ecological urban restructuring: A European model project in Leipzig

    Get PDF
    AGENDA 21, one of the final resolutions of the UN Conference on Environment and Development (UNCED) held in Rio de Janeiro in June 1992, stresses in chapter 28 the important role of cities, towns and communities in globally sustainable development. One of the most important European model projects in this respect was carried out in Leipzig from 1993 to 1997. The Leipzig Ostraum Project was supported through the LIFE program of the European Union with the largest subsidy awarded up to that time (4.3 million DM, within a total project budget of about 20 million DM). The central goal of this project was the use, testing and further development of present knowledge on sustainable urban restructuring, in combination with innovative strategies of economic and employment policy. The scope of traditional urban ecology is extended to comprise adjacent rural areas and to revitalize regional material flows. In the meantime, affiliated projects have won support from the THERMIE program of the European Union and the EXWOST program of the German Federal Ministry of Construction. This paper reports on the most important results of the Leipzig Project to date. In particular, the authors show that the concept of ecological urban restructuring and the Local Agenda 21 are in harmony with one another and can play a decisive role in building consensus on future urban development. All urban actors can be winners in this process. Recently, however, the project ran into difficulties through the project agency's improper handling of the grant funds.

    How predictable is technological progress?

    Get PDF
    Recently it has become clear that many technologies follow a generalized version of Moore's law, i.e. costs tend to drop exponentially, at rates that depend on the technology. Here we formulate Moore's law as a correlated geometric random walk with drift and apply it to historical data on 53 technologies. We derive a closed-form expression approximating the distribution of forecast errors as a function of time. Based on hindcasting experiments, we show that this works well, making it possible to collapse the forecast errors for many different technologies at different time horizons onto the same universal distribution. This is valuable because it allows us to make forecasts for any given technology with a clear understanding of the quality of the forecasts. As a practical demonstration, we make distributional forecasts at different time horizons for solar photovoltaic modules, and show how our method can be used to estimate the probability that a given technology will outperform another at a given point in the future.
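    A minimal sketch of the forecasting setup, assuming i.i.d. normal increments; the paper additionally models autocorrelated noise and derives a closed-form approximation to the forecast error distribution, which this fragment replaces with plain simulation:

        import numpy as np

        rng = np.random.default_rng(0)

        def forecast_log_cost(log_costs, horizon, n_samples=10000):
            """Moore's law as a geometric random walk with drift on log costs:
            y_{t+1} = y_t + mu + noise. Fit mu and sigma on observed differences,
            then simulate forward to obtain a distributional forecast."""
            diffs = np.diff(log_costs)
            mu, sigma = diffs.mean(), diffs.std(ddof=1)
            steps = rng.normal(mu, sigma, size=(n_samples, horizon))
            return log_costs[-1] + steps.cumsum(axis=1)

        # Made-up cost series for one technology; quantiles give the forecast band.
        y = np.log([100.0, 80.0, 66.0, 52.0, 43.0, 35.0, 29.0])
        paths = forecast_log_cost(y, horizon=5)
        print(np.percentile(paths[:, -1], [5, 50, 95]))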

    The Tandem Duplication Distance Is NP-Hard

    Get PDF
    In computational biology, tandem duplication is an important biological phenomenon which can occur either at the genome or at the DNA level. A tandem duplication takes a copy of a genome segment and inserts it right after the segment; this can be represented as the string operation AXB → AXXB. Tandem exon duplications have been found in many species such as human, fly or worm, and have been largely studied in computational biology. The Tandem Duplication (TD) distance problem we investigate in this paper is defined as follows: given two strings S and T over the same alphabet, compute the shortest sequence of tandem duplications required to convert S to T. The natural question of whether the TD distance can be computed in polynomial time was posed in 2004 by Leupold et al. and had remained open, despite the fact that tandem duplications have received much attention ever since. In this paper, we prove that this problem is NP-hard, settling the 16-year-old open problem. We further show that this hardness holds even if all characters of S are distinct. This is known as the exemplar TD distance, which is of special relevance in bioinformatics. One of the tools we develop for the reduction is a new problem called the Cost-Effective Subgraph, for which we obtain W[1]-hardness results that might be of independent interest. We finally show that computing the exemplar TD distance between S and T is fixed-parameter tractable. Our results open the door to many other questions, and we conclude with several open problems.
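    For very short strings the TD distance can be computed exactly by breadth-first search over the duplication operation; since duplications only lengthen a string, the search space is finite. A brute-force sketch (exponential time; the NP-hardness result above explains why nothing like this scales):

        from collections import deque

        def td_distance(S, T):
            """Minimum number of tandem duplications AXB -> AXXB turning S into T,
            found by breadth-first search; returns None if T is unreachable."""
            if S == T:
                return 0
            frontier, dist = deque([S]), {S: 0}
            while frontier:
                u = frontier.popleft()
                for i in range(len(u)):
                    for j in range(i + 1, len(u) + 1):
                        v = u[:i] + u[i:j] + u[i:]   # duplicate the segment u[i:j]
                        if len(v) > len(T) or v in dist:
                            continue
                        if v == T:
                            return dist[u] + 1
                        dist[v] = dist[u] + 1
                        frontier.append(v)
            return None

        print(td_distance("ab", "aabb"))             # 2: ab -> aab -> aabb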

    Early identification of important patents through network centrality

    Full text link
    One of the most challenging problems in technological forecasting is to identify as early as possible those technologies that have the potential to lead to radical changes in our society. In this paper, we use the US patent citation network (1926-2010) to test our ability to identify a list of historically significant patents early through citation network analysis. We show that in order to effectively uncover these patents shortly after they are issued, we need to go beyond raw citation counts and take into account both the citation network topology and temporal information. In particular, an age-normalized measure of patent centrality, called rescaled PageRank, allows us to identify the significant patents earlier than citation count and PageRank score. In addition, we find that while high-impact patents tend to rely on other high-impact patents in much the same way as scientific papers do, the citation dynamics of patents is significantly slower than that of papers, which makes the early identification of significant patents more challenging than that of significant papers. Comment: 14 pages
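    The rescaled PageRank used here has a simple core: z-score each patent's PageRank against patents of similar age. A sketch assuming a networkx citation graph; the window size and damping factor below are illustrative choices, not the paper's calibrated values:

        import numpy as np
        import networkx as nx

        def rescaled_pagerank(G, issue_year, window=1000, alpha=0.5):
            """G is the citation DiGraph (edge u -> v means u cites v) and
            issue_year maps each patent to its grant year. Each patent's
            PageRank is z-scored against the `window` patents closest in age."""
            pr = nx.pagerank(G, alpha=alpha)
            nodes = sorted(G, key=lambda v: issue_year[v])
            scores = np.array([pr[v] for v in nodes])
            rescaled = {}
            for k, v in enumerate(nodes):
                lo = max(0, k - window // 2)
                hi = min(len(nodes), lo + window)
                lo = max(0, hi - window)
                w = scores[lo:hi]
                rescaled[v] = (pr[v] - w.mean()) / (w.std() + 1e-12)
            return rescaled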