    Understanding XCP: Equilibrium and Fairness

    Worldline Monte Carlo for fermion models at large N_f

    Strongly-coupled fermionic systems can support a variety of low-energy phenomena, giving rise to collective condensation, symmetry breaking and a rich phase structure. We explore the potential of worldline Monte Carlo methods for analyzing the effective action of fermionic systems at large flavor number N_f, using the Gross-Neveu model as an example. Since the worldline Monte Carlo approach does not require a discretized spacetime, fermion doubling problems are absent, and chiral symmetry can manifestly be maintained. As a particular advantage, fluctuations in general inhomogeneous condensates can conveniently be dealt with analytically or numerically, while the renormalization can always be performed uniquely and analytically. We also critically examine the limitations of a straightforward implementation of the algorithms, identifying potential convergence problems in the presence of fermionic zero modes as well as in the high-density region. (Comment: 40 pages, 13 figures)
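
    For orientation (added here for context, not part of the abstract above, and written only up to sign and normalization conventions), the Gross-Neveu model for N_f fermion flavors is conventionally defined by the action

        \[
        S[\bar\psi,\psi] \;=\; \int d^d x \left[\, \bar\psi_a\, i\slashed{\partial}\, \psi_a \;+\; \frac{g^2}{2 N_f}\,\big(\bar\psi_a \psi_a\big)^2 \right], \qquad a = 1,\dots,N_f .
        \]

    Introducing an auxiliary scalar field \sigma(x) and integrating out the fermions gives an effective action of the schematic form \Gamma[\sigma] = N_f \big[ \int d^d x\, \sigma^2/(2 g^2) - \ln\det(i\slashed{\partial} - \sigma) \big]; the overall factor of N_f is what makes the large-N_f limit tractable, and the log-determinant is the object that the worldline approach rewrites as a proper-time path integral over closed loops and then samples by Monte Carlo.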

    Transport Architectures for an Evolving Internet

    In the Internet architecture, transport protocols are the glue between an application’s needs and the network’s abilities. But as the Internet has evolved over the last 30 years, the implicit assumptions of these protocols have held less and less well. This can cause poor performance on newer networks, such as cellular networks and datacenters, and makes it challenging to roll out networking technologies that break markedly with the past. Working with collaborators at MIT, I have built two systems that explore an objective-driven, computer-generated approach to protocol design. My thesis is that making protocols a function of stated assumptions and objectives can improve application performance and free network technologies to evolve. Sprout, a transport protocol designed for videoconferencing over cellular networks, uses probabilistic inference to forecast network congestion in advance. On commercial cellular networks, Sprout gives 2-to-4 times the throughput and 7-to-9 times less delay than Skype, Apple Facetime, and Google Hangouts. This work led to Remy, a tool that programmatically generates protocols for an uncertain multi-agent network. Remy’s computer-generated algorithms can achieve higher performance and greater fairness than some sophisticated human-designed schemes, including ones that put intelligence inside the network. The Remy tool can then be used to probe the difficulty of the congestion control problem itself: how easy is it to “learn” a network protocol that achieves desired goals, given a necessarily imperfect model of the networks where it will ultimately be deployed? We found weak evidence of a tradeoff between the breadth of the operating range of a computer-generated protocol and its performance, but also that a single computer-generated protocol was able to outperform existing schemes over a thousand-fold range of link rates.
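
    The "probabilistic inference" step in Sprout can be pictured with a short sketch. The code below is a simplified illustration written for this summary, not the authors' implementation: it assumes the bottleneck link's per-tick delivery rate does a random walk, represents the sender's belief as a handful of candidate rates, and sizes the amount of data in flight to a cautious lower percentile of the forecast deliveries over a short horizon. The tick length, horizon, percentile, and all names below are assumptions chosen for illustration.

        import math, random

        TICK_MS = 20        # forecast granularity (assumed)
        HORIZON_TICKS = 5   # roughly 100 ms of look-ahead (assumed)
        PERCENTILE = 0.05   # "cautious" lower-tail forecast (assumed)

        def poisson(lam):
            """Knuth's method: sample a Poisson-distributed count with mean lam."""
            if lam <= 0.0:
                return 0
            threshold, k, p = math.exp(-lam), 0, 1.0
            while p > threshold:
                k += 1
                p *= random.random()
            return k - 1

        def cautious_window(rate_belief, sigma=0.5, n_draws=200):
            """Lower-percentile forecast of packets deliverable over the horizon.

            rate_belief: candidate delivery rates (packets per TICK_MS tick)
            representing the sender's current belief about the link.
            """
            totals = []
            for _ in range(n_draws):
                rate = random.choice(rate_belief)                     # sample the belief
                delivered = 0
                for _ in range(HORIZON_TICKS):
                    rate = max(0.0, rate + random.gauss(0.0, sigma))  # rate random walk
                    delivered += poisson(rate)                        # packets this tick
                totals.append(delivered)
            totals.sort()
            return totals[int(PERCENTILE * (len(totals) - 1))]

        # Example: the sender currently believes the link delivers ~8 packets per tick.
        print(cautious_window([8.0] * 50))

    A real Sprout-like sender would refresh the belief from observed packet arrivals every tick and cap the forecast so queued data drains within its delay budget; the sketch only shows the cautious-forecast idea.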

    Entry Deterrence in a Duopoly Market

    In a homogeneous-good Cournot duopoly model, entry may occur even when the potential entrant has no cost advantage and no independent access to distribution. By sinking its costs of production before negotiating with the incumbents, the entrant creates an externality that induces the incumbents to bid more aggressively for the distribution rights to its output. Each incumbent is willing to pay up to the incremental profit earned from the additional output plus the incremental loss avoided by keeping the output away from its rival. This implies that the incumbents are willing to pay up to the market price for each unit of available output. A sequential game in which the incumbents produce first is analyzed, and the conditions under which entry is deterred by the incumbents' preemptive capacity expansions are derived.
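
    The step from "incremental profit plus avoided loss" to "willing to pay up to the market price" can be made explicit with a one-line calculation (the notation is introduced here for illustration and is not taken from the paper). Suppose the entrant has already sunk output q_e, the incumbents' own quantities q_1 and q_2 are fixed with Q = q_1 + q_2, and the entrant's output is resold in full by whichever incumbent wins it, so total quantity is Q + q_e either way and the price P(Q + q_e) is the same whether incumbent i wins or loses. Incumbent i's willingness to pay is the value of winning minus the value of losing:

        \[
        P(Q+q_e)\,(q_i + q_e) \;-\; P(Q+q_e)\,q_i \;=\; P(Q+q_e)\,q_e ,
        \]

    i.e. the full market price for every unit of the entrant's output. Losing does not merely forgo the incremental profit; it also hands that output to the rival, which is exactly the externality the abstract describes.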

    A study of the selective laser alloying of elemental titanium and boron powder

    To better understand the selective laser alloying process of elemental titanium and boron powder, theoretical models are created that take material-related, reaction-related, laser-related, and other relevant factors into consideration. These models help interpret the corresponding experimental results, and the experimental results in turn validate the models. In practice, experiments with two titanium-to-boron molar ratios, 1:2 and 4:1, are carried out. The large amount of energy released by the reaction between titanium and boron during laser irradiation makes the alloying process hard to understand and control. At the same time, this released energy can be exploited to lower the laser power and increase the scanning speed. In the end, the laser parameters are optimized to build solid parts with a good top-surface finish and only a few internal defects.
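
    As a rough companion to the energy argument above (an illustration added here; every number and name below is an assumption for this sketch, not a value from the work itself), the laser input in powder-bed processes is often summarized as a volumetric energy density E = P/(v·h·t), and the exothermic formation of TiB2 from the 1:2 Ti:B mixture adds a chemical contribution on top of it, which is why a lower laser power or a faster scan can still achieve full melting.

        # Back-of-the-envelope comparison of laser energy density with the heat
        # released by Ti + 2B -> TiB2. Illustrative only; all values are assumed.

        def laser_energy_density(power_w, speed_mm_s, hatch_mm, layer_mm):
            """Volumetric energy density E = P / (v * h * t), in J/mm^3."""
            return power_w / (speed_mm_s * hatch_mm * layer_mm)

        def reaction_energy_density(dh_j_per_mol=-3.2e5,    # ~ -320 kJ/mol TiB2 (assumed)
                                    molar_vol_mm3=1.5e4,    # ~ 15 cm^3/mol TiB2 (assumed)
                                    reacted_fraction=1.0):  # fraction of powder that reacts
            """Heat released by the reaction per unit volume of product, in J/mm^3."""
            return -dh_j_per_mol * reacted_fraction / molar_vol_mm3

        laser = laser_energy_density(power_w=100, speed_mm_s=200, hatch_mm=0.1, layer_mm=0.05)
        chem = reaction_energy_density()
        print(f"laser ~{laser:.0f} J/mm^3, reaction ~{chem:.0f} J/mm^3")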