
    From Black-Scholes to Online Learning: Dynamic Hedging under Adversarial Environments

    Full text link
    We consider a non-stochastic online learning approach to pricing financial options by modeling the market dynamics as a repeated game between nature (the adversary) and the investor. We demonstrate that such a framework yields a structure analogous to the Black-Scholes model, the widely used option pricing model in stochastic finance, for both European and American options with convex payoffs. In the case of non-convex options, we construct approximate pricing algorithms and demonstrate that their efficiency can be analyzed through the introduction of an artificial probability measure, in parallel to the so-called risk-neutral measure in the finance literature, even though our framework is completely adversarial. Continuous-time convergence results and extensions to incorporate price jumps are also presented.
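    The abstract compares its adversarial framework against the classical Black-Scholes model; purely as a point of reference, here is a minimal sketch of the Black-Scholes price of a European call. The function and parameter names (`spot`, `strike`, `rate`, `vol`, `maturity`) are illustrative, and none of the paper's online-learning algorithm is reproduced here.

    ```python
    from math import log, sqrt, exp, erf

    def norm_cdf(x: float) -> float:
        """Standard normal CDF via the error function."""
        return 0.5 * (1.0 + erf(x / sqrt(2.0)))

    def black_scholes_call(spot, strike, rate, vol, maturity):
        """Classical Black-Scholes price of a European call option."""
        d1 = (log(spot / strike) + (rate + 0.5 * vol**2) * maturity) / (vol * sqrt(maturity))
        d2 = d1 - vol * sqrt(maturity)
        return spot * norm_cdf(d1) - strike * exp(-rate * maturity) * norm_cdf(d2)

    # Example: at-the-money call, 1 year to maturity, 2% rate, 20% volatility.
    print(round(black_scholes_call(100.0, 100.0, 0.02, 0.20, 1.0), 2))
    ```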

    Technology diffusion in communication networks

    Full text link
    The deployment of new technologies in the Internet is notoriously difficult, as evidenced by the myriad of well-developed networking technologies that still have not seen widespread adoption (e.g., secure routing, IPv6, etc.). A key hurdle is the fact that the Internet lacks a centralized authority that can mandate the deployment of a new technology. Instead, the Internet consists of thousands of nodes, each controlled by an autonomous, profit-seeking firm, that will deploy a new networking technology only if it obtains sufficient local utility by doing so. For the technologies we study here, local utility depends on the set of nodes that can be reached by traversing paths consisting only of nodes that have already deployed the new technology. To understand technology diffusion in the Internet, we propose a new model inspired by work on the spread of influence in social networks. Unlike traditional models, where a node's utility depends only on its immediate neighbors, in our model a node can be influenced by the actions of remote nodes. Specifically, we assume node v activates (i.e., deploys the new technology) when it is adjacent to a sufficiently large connected component in the subgraph induced by the set of active nodes; namely, one of size exceeding node v's threshold value \theta(v). We are interested in the problem of choosing the right seed set of nodes to activate initially, so that the rest of the nodes in the network have sufficient local utility to follow suit. We take the graph and threshold values as input to our problem. We show that our problem is NP-hard and does not admit a (1-o(1)) ln|V| approximation on general graphs. We then restrict our study to technology diffusion problems where (a) the maximum distance between any pair of nodes in the graph is r, and (b) there are at most \ell possible threshold values. This set of restrictions is quite natural, given that (a) the Internet graph has constant diameter, and (b) limiting the granularity of the threshold values makes sense given the difficulty of obtaining empirical data that parameterizes deployment costs and benefits. We present an algorithm that obtains a solution with a guaranteed approximation ratio of O(r^2 \ell \log|V|), which is asymptotically optimal given our hardness results. Our approximation algorithm combines a linear-programming relaxation of a 0-1 integer program with a novel randomized rounding scheme. National Science Foundation (S-1017907, CCF-0915922)
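    The activation rule is stated precisely in the abstract: node v deploys once it is adjacent to an active connected component whose size exceeds \theta(v). Below is a minimal simulation sketch of that rule given a graph, thresholds, and a seed set. The adjacency-set representation and function names are illustrative, whether the size comparison is strict is an assumption, and the paper's LP-relaxation and randomized-rounding algorithm for choosing the seed set is not reproduced here.

    ```python
    def active_components(active, adj):
        """Connected components of the subgraph induced by the active node set."""
        seen, comps = set(), []
        for v in active:
            if v in seen:
                continue
            comp, stack = set(), [v]
            while stack:
                u = stack.pop()
                if u in comp:
                    continue
                comp.add(u)
                stack.extend(w for w in adj[u] if w in active and w not in comp)
            seen |= comp
            comps.append(comp)
        return comps

    def diffuse(adj, theta, seeds):
        """Iterate the activation rule from the seed set until no further node activates."""
        active = set(seeds)
        changed = True
        while changed:
            changed = False
            comps = active_components(active, adj)
            for v in set(adj) - active:
                # v activates if it touches an active component whose size meets
                # its threshold theta[v] (strictness of the comparison is an assumption).
                if any(comp & adj[v] and len(comp) >= theta[v] for comp in comps):
                    active.add(v)
                    changed = True
        return active

    # Toy example: a path a-b-c-d whose nodes require progressively larger active components.
    adj = {"a": {"b"}, "b": {"a", "c"}, "c": {"b", "d"}, "d": {"c"}}
    theta = {"a": 1, "b": 1, "c": 2, "d": 3}
    print(sorted(diffuse(adj, theta, seeds={"a"})))  # all four nodes end up active
    ```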

    From which world is your graph?

    Full text link
    Discovering statistical structure from links is a fundamental problem in the analysis of social networks. Choosing a misspecified model, or equivalently an incorrect inference algorithm, will result in an invalid analysis, or may even falsely uncover patterns that are in fact artifacts of the model. This work focuses on unifying two of the most widely used link-formation models: the stochastic blockmodel (SBM) and the small-world (or latent space) model (SWM). Integrating techniques from kernel learning, spectral graph theory, and nonlinear dimensionality reduction, we develop the first statistically sound polynomial-time algorithm to discover latent patterns in sparse graphs for both models. When the network comes from an SBM, the algorithm outputs a block structure. When it is from an SWM, the algorithm outputs estimates of each node's latent position. Comment: To appear in NIPS 201
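    The abstract does not spell out its unified algorithm. As a rough illustration of the spectral-graph-theory ingredient it mentions, here is a minimal sketch that samples a two-block SBM and recovers the planted blocks from the sign of the second eigenvector of the adjacency matrix. This is a textbook spectral heuristic, not the paper's method; all names and parameter values are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def sample_sbm(sizes, p_in, p_out):
        """Sample a symmetric adjacency matrix from a planted two-block SBM."""
        n = sum(sizes)
        labels = np.repeat(np.arange(len(sizes)), sizes)
        probs = np.where(labels[:, None] == labels[None, :], p_in, p_out)
        upper = np.triu(rng.random((n, n)) < probs, k=1)
        return (upper | upper.T).astype(float), labels

    def spectral_blocks(adj):
        """Split nodes by the sign of the eigenvector of the second-largest eigenvalue."""
        vals, vecs = np.linalg.eigh(adj)   # eigenvalues in ascending order
        second = vecs[:, -2]
        return (second > 0).astype(int)

    adj, truth = sample_sbm([50, 50], p_in=0.30, p_out=0.05)
    pred = spectral_blocks(adj)
    agreement = max(np.mean(pred == truth), np.mean(pred != truth))  # handle label switching
    print(f"block recovery agreement: {agreement:.2f}")
    ```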

    Texas Instruments

    Get PDF
    In this report, I analyze the company starting from a sector and economic overview, use a SWOT analysis to assess the company and identify potential risks, and use scenario analysis to arrive at a target value of $212.38 for the company.

    Seismic Hazard Assessment and Design Ground Motion: Lessons Learned From Recent Earthquakes

    Get PDF
    Recent earthquakes, in particular the 2008 Wenchuan, 2009 L’Aquila, 2010 Haiti, and 2011 Christchurch and Japan events, have called attention to the probabilistic seismic hazard maps, particularly the ground motions with 10, 5, and 2 percent probabilities of exceedance in 50 years. As discussed in this paper, these ground motions are artifacts because they were produced by probabilistic seismic hazard analysis (PSHA). PSHA is a mathematical formulation derived from a rigorous probability analysis of the distribution of earthquake magnitudes, locations, and ground-motion attenuation. Some of the assumptions and distributions that PSHA is based on, however, have been found to be invalid in earth science. In addition, PSHA contains a mathematical error: it equates a dimensionless quantity (the annual probability of exceedance, i.e., the probability of exceedance in one year) with a dimensional quantity (the annual frequency of exceedance, with units of per year [1/yr]). Thus, PSHA is scientifically flawed, and the resulting seismic-hazard and seismic-risk estimates are artifacts. Use of the probabilistic ground-motion maps could lead to either unsafe or overly conservative engineering design. On the other hand, recent earthquakes, the 2010 Chile and 2011 Japan events in particular, have also shown that ground motions derived from deterministic seismic hazard analysis (DSHA) provide an appropriate basis for engineering design to prevent earthquake disasters.
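    The dimensional distinction the abstract draws can be written out with the conventional Poisson model of exceedances, in which the annual rate \lambda carries units of 1/yr while the exceedance probability over an exposure time t is dimensionless; the familiar "2 percent in 50 years" level then corresponds to a return period of roughly 2,475 years. This is a sketch of the standard relation only, not of the paper's full argument.

    ```latex
    % Poisson model of exceedances: the rate \lambda has units of 1/yr and the
    % exposure time t is in yr, so \lambda t (and hence p) is dimensionless.
    p(t) \;=\; 1 - e^{-\lambda t},
    \qquad
    p(1\,\mathrm{yr}) \;\approx\; \lambda \cdot 1\,\mathrm{yr}
      \quad\text{for } \lambda \cdot 1\,\mathrm{yr} \ll 1,
    \qquad
    p(50\,\mathrm{yr}) = 0.02
      \;\Longrightarrow\;
      \lambda = -\tfrac{\ln 0.98}{50\,\mathrm{yr}} \approx \tfrac{1}{2475\,\mathrm{yr}}.
    ```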

    Optimal Inseason Management Of Pink Salmon Given Uncertain Run Sizes And Declining Economic Value

    Get PDF
    Thesis (Ph.D.), University of Alaska Fairbanks, 2001. This is a comprehensive study of the fishery and management system (including the inseason stock abundance dynamics, the purse seine fleet dynamics, and the inseason management) of pink salmon (Oncorhynchus gorbuscha) in the northern Southeast Alaska inside waters (NSE). First, we presented a hierarchical Bayesian modelling approach (HBM) for estimating salmon escapement abundance and timing from stream count data, which improves estimates in years when data are sparse by "borrowing strength" from counts in other years. We presented a model of escapement and of count data, a hierarchical Bayesian statistical framework, a Gibbs sampling approach for estimating posterior distributions, and model determination techniques. We then applied the HBM to estimating historical escapement parameters for pink salmon returns to Kadashan Creek in Southeast Alaska. Second, a simulation study was conducted to compare the performance of the HBM to that of separate maximum likelihood estimation of each year's escapement. We found that the HBM was much better able to estimate escapement parameters in years when few or no counts were made after the peak of escapement. Separate estimates for such years could be wildly inaccurate. However, even a single post-peak count could dramatically improve the estimability of escapement parameters. Third, we defined major stocks and their migratory pathways for the NSE pink salmon. We estimated the escapement timing parameters of these stocks with the HBM. A boxcar migration model was then used to reconstruct the catch and abundance histories of these stocks from 1977 to 1998. Finally, we developed a stochastic simulation model that simulates this fishery and management system. Uncertainties in annual stock size and run timing, fleet dynamics, and both preseason and inseason forecasts were accounted for explicitly in this simulation. The simulation model was applied to evaluate four kinds of management strategies with different fishing opening schedules and decision rules. When only flesh quality is considered, the present strategy and a more aggressive strategy, both of which are adaptive to the run strength of the stocks, are able to provide higher-quality fish without compromising the escapement objectives.
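    The thesis's escapement-timing model is not reproduced here. Purely to illustrate the "borrowing strength" mechanism of a hierarchical Bayesian model fit by Gibbs sampling, here is a minimal sketch of a normal hierarchical model in which per-year means are shrunk toward a shared mean; the data, fixed variances, and all names are made up for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Toy data: observed counts per "year"; the data-poor year has a single
    # observation, so its estimate leans on the shared across-year mean, which
    # is the partial-pooling ("borrowing strength") idea in the abstract.
    years = [np.array([120., 135., 128.]),
             np.array([95., 102., 99., 97.]),
             np.array([150.])]                  # data-poor year

    sigma2, tau2 = 100.0, 400.0                 # assumed known within/between-year variances
    n_iter, K = 5000, len(years)
    mu = np.array([y.mean() for y in years])    # per-year means
    mu0 = mu.mean()                             # shared hyper-mean
    draws = np.zeros((n_iter, K))

    for it in range(n_iter):
        # Update each year's mean given the shared mean (conjugate normal update).
        for k, y in enumerate(years):
            prec = len(y) / sigma2 + 1.0 / tau2
            mean = (y.sum() / sigma2 + mu0 / tau2) / prec
            mu[k] = rng.normal(mean, np.sqrt(1.0 / prec))
        # Update the shared mean given the per-year means (flat prior on mu0).
        mu0 = rng.normal(mu.mean(), np.sqrt(tau2 / K))
        draws[it] = mu

    post = draws[1000:].mean(axis=0)            # discard burn-in
    print("posterior means by year:", np.round(post, 1))
    ```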

    Ground Motion for the Maximum Credible Earthquake in Kentucky

    Get PDF
    Although they are not frequent, earthquakes occur in and around Kentucky and pose certain hazards. Assessing seismic hazards is challenging, however, because of a lack of observations. The best estimates of ground motions that could be expected if the maximum credible earthquake occurs in or around Kentucky are depicted in maps showing peak ground acceleration and short-period (0.2 second) and long-period (1.0 second) response accelerations with 5 percent critical damping on hard rock. Another consideration for seismic safety is that the maximum credible earthquake has a long recurrence interval: from 500 to 1,000 years in the New Madrid Seismic Zone and from 2,000 to 5,000 years in the Wabash Valley Seismic Zone. These maps can be used for seismic safety design for buildings, bridges, dams, and other structures. In combination with local geologic and geotechnical information, these maps can also be used to develop a variety of hazard mitigation strategies, such as land-use planning, emergency planning and preparedness, and lifeline planning.