
    Phase transition of color-superconductivity and cooling behavior of quark stars

    We discuss color superconductivity and its effect on the cooling behavior of strange quark stars. The neutrino emissivity and specific heat of quark matter are calculated within BCS theory. In the superconducting phase the emissivity decreases, suppressing the cooling rate. It is shown that the phase transition, acting together with the specific heat, leads to a sudden, discontinuous suppression of the cooling rate.

    Comment: 7 pages, 3 figures
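The suppression described above can be illustrated with a toy calculation: once a BCS-like gap $\Delta(T)$ opens below the critical temperature, emission processes acquire a Boltzmann-like factor $e^{-\Delta/T}$. This is a minimal sketch only; the gap interpolation and the numerical values are illustrative assumptions, not the paper's full BCS calculation.

```python
import math

def bcs_gap(T, Tc, delta0):
    """Toy BCS-like gap: delta0 * sqrt(1 - (T/Tc)^2) below Tc, zero above.
    (An illustrative interpolation, not a solution of the gap equation.)"""
    if T >= Tc:
        return 0.0
    return delta0 * math.sqrt(1.0 - (T / Tc) ** 2)

def suppression(T, Tc, delta0):
    """Boltzmann-like factor exp(-Delta/T) multiplying the normal-phase
    neutrino emissivity once the gap opens."""
    return math.exp(-bcs_gap(T, Tc, delta0) / T)

# Above Tc there is no suppression; well below Tc the emissivity
# (and hence the cooling rate) drops sharply and discontinuously in slope.
print(suppression(1.2, 1.0, 10.0))  # normal phase: factor 1.0
print(suppression(0.5, 1.0, 10.0))  # superconducting phase: strongly suppressed
```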

    Confinement Properties in the Multi-Instanton System

    We investigate the confinement properties of the multi-instanton system, where the size distribution is assumed to behave as $\rho^{-5}$ for large instanton size $\rho$. We find that the instanton vacuum gives area-law behavior of the Wilson loop, which indicates the existence of a linear confining potential. In the multi-instanton system, the string tension increases monotonically with the instanton density and takes the standard value $\sigma \simeq 1\,\mathrm{GeV/fm}$ at the density $(N/V)^{1/4} = 200\,\mathrm{MeV}$. Thus, instantons relate directly to color confinement.

    Comment: Talk presented by M. Fukushima at ``Lattice '97'', the International Symposium on Lattice Field Theory, 22-26 July 1997, Edinburgh, Scotland; 3 pages, plain LaTeX
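The connection between the area law and a linear potential can be sketched numerically: if an $R \times T$ Wilson loop obeys $W(R,T) = e^{-\sigma R T}$, then the static potential $V(R) = -\frac{1}{T}\ln W(R,T)$ is exactly $\sigma R$. The value of the string tension below is a placeholder, not the paper's result.

```python
import math

SIGMA = 5.0  # string tension in fm^-2 (hypothetical value for illustration)

def wilson_loop(R, T, sigma=SIGMA):
    """Area-law ansatz W(R,T) = exp(-sigma * R * T) for an R x T loop."""
    return math.exp(-sigma * R * T)

def static_potential(R, T=10.0, sigma=SIGMA):
    """V(R) = -(1/T) ln W(R,T); with a pure area law this equals sigma*R,
    i.e. a linear confining potential."""
    return -math.log(wilson_loop(R, T, sigma)) / T

# The potential grows linearly with separation: the signature of confinement.
for R in (0.5, 1.0, 1.5):
    print(R, static_potential(R))
```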

    Real-time Loss Estimation for Instrumented Buildings

    Motivation. A growing number of buildings have been instrumented to measure and record earthquake motions and to transmit these records to seismic-network data centers, where they are archived and disseminated for research purposes. At the same time, sensors are growing smaller, less expensive to install, and capable of sensing and transmitting other environmental parameters in addition to acceleration. Finally, recently developed performance-based earthquake engineering methodologies employ structural-response information to estimate probabilistic repair costs, repair durations, and other metrics of seismic performance. The opportunity therefore presents itself to combine these developments into the capability to estimate automatically, in near-real-time, the probabilistic seismic performance of an instrumented building shortly after the cessation of strong motion. We refer to this capability as (near-) real-time loss estimation (RTLE).

    Methodology. This report presents a methodology for RTLE for instrumented buildings. Seismic performance is measured in terms of probabilistic repair cost, likely locations of physical damage, operability, and life safety. The methodology uses the instrument recordings and a Bayesian state-estimation algorithm called a particle filter to estimate the probabilistic structural response of the system in terms of member forces and deformations. The structural-response estimate is then used as input to component fragility functions to estimate the probabilistic damage state of structural and nonstructural components. The probabilistic damage state can direct structural engineers to likely locations of physical damage, even where damage is concealed behind architectural finishes. The damage state is used with construction cost-estimation principles to estimate probabilistic repair cost, and as input to a quantified, fuzzy-set version of the FEMA-356 performance-level descriptions to estimate probabilistic safety and operability levels.

    CUREE demonstration building. The procedure for estimating damage locations, repair costs, and post-earthquake safety and operability is illustrated in parallel demonstrations by the CUREE and Kajima research teams. The CUREE demonstration is performed using a real 1960s-era, 7-story, nonductile reinforced-concrete moment-frame building located in Van Nuys, California. The building is instrumented with 16 channels at five levels: the ground level; floors 2, 3, and 6; and the roof. We used the records obtained in the 1994 Northridge earthquake to hindcast performance in that earthquake, analyzing the building in its condition prior to the event. It is found that, while hindcasting of the overall system performance level was excellent, prediction of detailed damage locations was poor, implying that actual conditions differed substantially from those shown on the structural drawings, that inappropriate fragility functions were employed, or both. We also found that Bayesian updating of the structural model using observed structural response above the base of the building adds little information to the performance prediction, probably because structural uncertainties have only a secondary effect on performance uncertainty compared with the uncertainty in assembly damageability as quantified by the fragility functions. The implication is that real-time loss estimation is not sensitive to structural uncertainties (saving costly multiple simulations of structural response) and does not benefit significantly from measuring instruments other than those at the base of the building.

    Kajima demonstration building. The Kajima demonstration is performed using a real 1960s-era office building in Kobe, Japan. The building, a 7-story reinforced-concrete shearwall building, was not instrumented in the 1995 Kobe earthquake, so instrument recordings are simulated. The building is analyzed in its condition prior to the earthquake. It is found that, while hindcasting of the overall repair cost was excellent, prediction of detailed damage locations was poor, again implying that as-built conditions differed substantially from those shown on the structural drawings, that inappropriate fragility functions were used, or both. We find that the parameters of the detailed particle filter needed significant tuning, which would be impractical in actual application; work is needed to prescribe values of these parameters in general.

    Opportunities for implementation and further research. Because much of the cost of applying this RTLE algorithm results from the cost of instrumentation and the effort of setting up a structural model, the readiest application would be to instrumented buildings whose structural models are already available, and to important facilities. It would be useful to study under what conditions RTLE would be economically justified. Two other interesting possibilities for further study are (1) updating performance estimates using readily observable damage; and (2) quantifying the value of information for expensive inspections: e.g., if one inspects a connection with a modeled 50% failure probability and finds that the connection is undamaged, is it necessary to examine one with a 10% failure probability?
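The particle-filter step of the methodology can be sketched in miniature. The toy below uses a scalar random-walk process model observed in Gaussian noise as a stand-in for the building's structural-response model; the system, noise levels, and particle count are all illustrative assumptions, not the report's actual filter.

```python
import random, math

def particle_filter(observations, n_particles=2000,
                    process_std=0.5, obs_std=1.0):
    """Minimal bootstrap particle filter: propagate particles through the
    process model, weight them by the observation likelihood, estimate the
    posterior mean, and resample. A toy stand-in for the structural-response
    estimator described above."""
    particles = [random.gauss(0.0, 1.0) for _ in range(n_particles)]
    estimates = []
    for z in observations:
        # propagate through the (toy) process model
        particles = [p + random.gauss(0.0, process_std) for p in particles]
        # weight by the Gaussian observation likelihood
        weights = [math.exp(-0.5 * ((z - p) / obs_std) ** 2) for p in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        # posterior-mean state estimate
        estimates.append(sum(w * p for w, p in zip(weights, particles)))
        # multinomial resampling
        particles = random.choices(particles, weights=weights, k=n_particles)
    return estimates

random.seed(0)
true_state = 3.0
obs = [true_state + random.gauss(0.0, 1.0) for _ in range(30)]
est = particle_filter(obs)
print(round(est[-1], 2))  # should settle near the true state, 3.0
```

In the report's setting, the posterior samples would feed component fragility functions rather than a simple mean.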

    Impact of Seismic Risk on Lifetime Property Values

    This report presents a methodology for establishing the uncertain net asset value, NAV, of a real-estate investment opportunity, considering both market risk and seismic risk for the property. It also presents a decision-making procedure to assist in making real-estate investment choices under conditions of uncertainty and risk aversion. It is shown that market risk, as measured by the coefficient of variation of NAV, is at least 0.2 and may exceed 1.0. In a situation of such high uncertainty, where potential gains and losses are large relative to a decision-maker's risk tolerance, it is appropriate to adopt a decision-analysis approach to real-estate investment decision-making, and a simple equation for doing so is presented. The decision-analysis approach uses the certainty equivalent, CE, rather than NAV as the basis for investment decision-making: when faced with multiple investment alternatives, one should choose the alternative that maximizes CE. It is shown that CE is less than the expected value of NAV by an amount proportional to the variance of NAV and to the inverse of the decision-maker's risk tolerance, $\rho$. The procedure for establishing NAV and CE is illustrated in parallel demonstrations by the CUREE and Kajima research teams.

    The CUREE demonstration is performed using a real 1960s-era hotel building in Van Nuys, California. The building, a 7-story non-ductile reinforced-concrete moment-frame building, is analyzed using the assembly-based vulnerability (ABV) method developed in Phase III of the CUREE-Kajima Joint Research Program. The building is analyzed three ways: in its condition prior to the 1994 Northridge Earthquake, with a hypothetical shearwall upgrade, and with earthquake insurance. This is the first application of ABV to a real building, and the first time ABV has incorporated stochastic structural analyses that consider uncertainties in the mass, damping, and force-deformation behavior of the structure, along with uncertainties in ground motion, component damageability, and repair costs. New fragility functions are developed for the reinforced-concrete flexural members using published laboratory test data, and new unit repair costs for these components are developed by a professional construction cost estimator. Four investment alternatives are considered: do not buy; buy; buy and retrofit; and buy and insure. It is found that the best alternative for most reasonable values of discount rate, risk tolerance, and market risk is to buy and leave the building as-is. However, risk tolerance and market risk (variability of income) both materially affect the decision: for certain ranges of each parameter, the best investment alternative changes, indicating that expected-value decision-making is inappropriate for some decision-makers and investment opportunities. It is also found that the majority of the economic seismic risk results from shaking of $S_a < 0.3g$, i.e., shaking with return periods on the order of 50 to 100 yr that causes primarily architectural damage, rather than from the strong, rare events of which common probable maximum loss (PML) measurements are indicative.

    The Kajima demonstration is performed using three Tokyo buildings. A nine-story steel-reinforced-concrete building built in 1961 is analyzed as two designs: as-is, and with a steel-braced-frame structural upgrade. The third building is a 29-story steel-frame structure built in 1999. The three buildings are intended to meet collapse-prevention, life-safety, and operational performance levels, respectively, in shaking with 10% exceedance probability in 50 years. The buildings are assessed using levels 2 and 3 of Kajima's three-level analysis methodology. These are semi-assembly-based approaches, which subdivide a building into categories of components, estimate the loss for these component categories for given ground motions, and combine the losses for the entire building. The two methods are used to estimate annualized losses and to create curves that relate loss to exceedance probability. The results are incorporated in the input to a sophisticated program developed by the Kajima Corporation, called Kajima D, which forecasts cash flows for office, retail, and residential projects for purposes of property screening, due diligence, negotiation, financial structuring, and strategic planning. The result is an estimate of NAV for each building. A parametric study of CE for each building is presented, along with a simplified model for calculating CE as a function of the mean and coefficient of variation of NAV. The equation agrees with that developed in parallel by the CUREE team.

    Both the CUREE and Kajima teams collaborated with a number of real-estate investors to understand their seismic-risk-management practices and to formulate and assess the viability of the proposed decision-making methodologies. Investors were interviewed to elicit their risk tolerance, $\rho$, using scripts developed and presented here in English and Japanese. Results of 10 such interviews are presented; they show a strong relationship between a decision-maker's annual revenue, R, and his or her risk tolerance, $\rho \approx 0.0075 R^{1.34}$. The interviews show that earthquake risk is a marginal consideration in current investment practice. Probable maximum loss (PML) is the only earthquake-risk parameter these investors consider, and they typically do not use seismic risk at all in their financial analysis of an investment opportunity. For competitive reasons, a public investor interviewed here would not wish to account for seismic risk in his financial analysis unless rating agencies required him to do so or such consideration otherwise became standard practice. However, in cases where seismic risk is high enough to significantly reduce return, a private investor expressed the desire to account for seismic risk via expected annualized loss (EAL) if it were inexpensive to do so, i.e., if the cost of calculating the EAL were not substantially greater than that of PML alone.

    The study results point to a number of interesting opportunities for future research: improve the market-risk stochastic model, including comparison of actual long-term income with initial income projections; improve the risk-attitude interview; account for uncertainties in repair method and in the relationship between repair cost and loss; relate the damage state of structural elements to points on the force-deformation relationship; examine simpler dynamic analysis as a means to estimate vulnerability; examine the relationship between simplified engineering demand parameters and performance; enhance category-based vulnerability functions by compiling a library of building-specific ones; and work with lenders and real-estate industry analysts to determine the conditions under which seismic risk should be reflected in investors' financial analyses.
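The decision rule described above can be sketched numerically. The standard exponential-utility approximation $CE \approx E[\mathrm{NAV}] - \mathrm{Var}(\mathrm{NAV})/(2\rho)$ matches the report's statement that CE falls below expected NAV by an amount proportional to the variance and to $1/\rho$, and the interview fit $\rho \approx 0.0075R^{1.34}$ supplies the risk tolerance. The dollar figures below are hypothetical.

```python
def risk_tolerance_from_revenue(annual_revenue):
    """Empirical fit elicited from the investor interviews:
    rho ~ 0.0075 * R^1.34 (R and rho in the same currency units)."""
    return 0.0075 * annual_revenue ** 1.34

def certainty_equivalent(mean_nav, cov_nav, risk_tolerance):
    """CE ~ E[NAV] - Var(NAV) / (2*rho), the exponential-utility
    approximation; cov_nav is the coefficient of variation of NAV."""
    var = (cov_nav * mean_nav) ** 2
    return mean_nav - var / (2.0 * risk_tolerance)

# Hypothetical investor: $100M annual revenue, evaluating a property
# with $10M expected NAV and coefficient of variation 0.4.
rho = risk_tolerance_from_revenue(100e6)
ce = certainty_equivalent(10e6, 0.4, rho)
print(ce < 10e6)  # risk aversion pulls CE below the expected NAV
```

Among competing alternatives (buy, retrofit, insure, walk away), one would compute CE for each and pick the maximum.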

    Chiral symmetry breaking and stability of strangelets

    We discuss the stability of strangelets by considering dynamical chiral symmetry breaking and confinement. We use a $U(3)_{L} \times U(3)_{R}$ symmetric Nambu--Jona-Lasinio model for chiral symmetry breaking, supplemented by a boundary condition for confinement. It is shown that strangelets with baryon number $A < 2 \times 10^{3}$ can exist stably. For the observables, we obtain the masses and the charge-to-baryon-number ratios of the strangelets, and compare these quantities with the observed data for exotic particles.

    Comment: 10 pages, 9 figures, submitted to Physical Review

    Systematic study of autocorrelation time in pure SU(3) lattice gauge theory

    Results of our autocorrelation measurements performed on the Fujitsu AP1000 are reported. We analyze (i) the typical autocorrelation time, (ii) the optimal mixing ratio between overrelaxation and pseudo-heatbath, and (iii) the critical behavior of the autocorrelation time around the cross-over region, with high statistics over a wide range of $\beta$ for pure SU(3) lattice gauge theory on $8^4$, $16^4$, and $32^4$ lattices. For the mixing ratio K, a small value (3-7) appears optimal in the confined region, reducing the integrated autocorrelation time by a factor of 2-4 compared to pseudo-heatbath alone. In the deconfined phase, on the other hand, correlation times are short and overrelaxation does not seem to matter. For a fixed value of K (= 9 in this paper), the dynamical exponent of overrelaxation is consistent with 2. An autocorrelation measurement of the topological charge on a $32^3 \times 64$ lattice at $\beta = 6.0$ is also briefly mentioned.

    Comment: 3 pages of A4 format including 7 figures

    Autocorrelation in Updating Pure SU(3) Lattice Gauge Theory by the use of Overrelaxed Algorithms

    We measure the sweep-to-sweep autocorrelations of blocked loops below and above the deconfinement transition for SU(3) on a $16^4$ lattice using 20000-140000 Monte Carlo updating sweeps. A divergence of the autocorrelation time toward the critical $\beta$ is seen at high blocking levels. The peak is near $\beta = 6.33$, where we observe an autocorrelation time of 440 $\pm$ 210 for the $1 \times 1$ Wilson loop on the $2^4$ blocked lattice. A mix of 7 Brown-Woch overrelaxation steps followed by one pseudo-heatbath step appears optimal for reducing the autocorrelation time below the critical $\beta$. Above the critical $\beta$, however, no clear difference between the two algorithms is seen, and the system decorrelates rather fast.

    Comment: 4 pages of A4 format including 6 figures
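The quantity measured in the two abstracts above, the integrated autocorrelation time, can be estimated from any Monte Carlo time series as $\tau_{\mathrm{int}} = \frac{1}{2} + \sum_{t \ge 1} \rho(t)$, truncating the sum when the normalized autocorrelation $\rho(t)$ becomes non-positive. This is a generic textbook estimator, not the papers' exact analysis; the simple positivity cutoff is an assumption standing in for a proper windowing procedure.

```python
import random

def integrated_autocorrelation_time(series, window=None):
    """Estimate tau_int = 1/2 + sum_t rho(t) from a Monte Carlo time
    series, truncating at `window` lags or at the first non-positive
    autocorrelation (a crude cutoff; real analyses use e.g. a
    self-consistent window)."""
    n = len(series)
    mean = sum(series) / n
    c0 = sum((x - mean) ** 2 for x in series) / n
    if window is None:
        window = n // 4
    tau = 0.5
    for t in range(1, window):
        ct = sum((series[i] - mean) * (series[i + t] - mean)
                 for i in range(n - t)) / (n - t)
        rho = ct / c0
        if rho <= 0:  # positivity cutoff
            break
        tau += rho
    return tau

# For i.i.d. samples tau_int should be close to 1/2; a slowly mixing
# Markov chain (e.g. pure pseudo-heatbath near the transition) gives
# a much larger value, which is what motivates mixing in overrelaxation.
random.seed(1)
iid = [random.gauss(0.0, 1.0) for _ in range(5000)]
print(integrated_autocorrelation_time(iid))
```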