    Efficient transfer entropy analysis of non-stationary neural time series

    Information theory allows us to investigate information processing in neural systems in terms of information transfer, storage and modification. The measure of information transfer in particular, transfer entropy, has seen a dramatic surge of interest in neuroscience. Estimating transfer entropy between two processes requires the observation of multiple realizations of these processes to estimate the associated probability density functions. To obtain these observations, available estimators assume stationarity of the processes so that observations can be pooled over time. This assumption, however, is a major obstacle to the application of these estimators in neuroscience, as observed processes are often non-stationary. As a solution, Gomez-Herrero and colleagues showed theoretically that the stationarity assumption may be avoided by estimating transfer entropy from an ensemble of realizations. Such an ensemble is often readily available in neuroscience experiments in the form of experimental trials. In this work we therefore combine the ensemble method with a recently proposed transfer entropy estimator to make transfer entropy estimation applicable to non-stationary time series. We present an efficient implementation of the approach that deals with the increased computational demand of the ensemble method's practical application. In particular, we use a massively parallel implementation for a graphics processing unit to handle the most computationally demanding aspects of the ensemble method. We test the performance and robustness of our implementation on data from simulated stochastic processes and demonstrate the method's applicability to magnetoencephalographic data. While we mainly evaluate the proposed method on neuroscientific data, we expect it to be applicable in a variety of fields concerned with the analysis of information transfer in complex biological, social, and artificial systems.
    Comment: 27 pages, 7 figures, submitted to PLOS ONE
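
    As a rough illustration of the ensemble idea, the sketch below estimates transfer entropy at a single time point by pooling observations across trials instead of over time. It uses a simple plug-in (histogram) estimator on discretised amplitudes with a one-sample past; the function name, binning and embedding are illustrative assumptions, not the paper's GPU-based nearest-neighbour estimator.

    import numpy as np
    from collections import Counter

    def ensemble_transfer_entropy(x, y, t, bins=4):
        """Plug-in estimate of TE from X to Y at time t, pooled over trials.

        x, y: arrays of shape (n_trials, n_samples). Illustrative histogram
        estimator; the paper uses a nearest-neighbour estimator on a GPU.
        """
        # Discretise amplitudes into a small alphabet
        xd = np.digitize(x, np.quantile(x, np.linspace(0, 1, bins + 1)[1:-1]))
        yd = np.digitize(y, np.quantile(y, np.linspace(0, 1, bins + 1)[1:-1]))

        # Ensemble method: realisations are trials at a fixed time point,
        # so stationarity over time is not assumed.
        triples = list(zip(yd[:, t + 1], yd[:, t], xd[:, t]))
        n = len(triples)
        p_full = Counter(triples)                           # (y_next, y_past, x_past)
        p_joint_past = Counter((b, c) for _, b, c in triples)
        p_target = Counter((a, b) for a, b, _ in triples)
        p_past = Counter(b for _, b, _ in triples)

        te = 0.0
        for (a, b, c), count in p_full.items():
            p_abc = count / n                                # p(y_next, y_past, x_past)
            cond_full = count / p_joint_past[(b, c)]         # p(y_next | y_past, x_past)
            cond_self = p_target[(a, b)] / p_past[b]         # p(y_next | y_past)
            te += p_abc * np.log2(cond_full / cond_self)
        return te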

    Titan Science with the James Webb Space Telescope (JWST)

    The James Webb Space Telescope (JWST), scheduled for launch in 2018, is the successor to the Hubble Space Telescope (HST) but with a significantly larger aperture (6.5 m) and advanced instrumentation focusing on infrared science (0.6-28.0 ÎŒm). In this paper we examine the potential for scientific investigation of Titan using JWST, primarily with three of the four instruments: NIRSpec, NIRCam and MIRI, noting that science with NIRISS will be complementary. Five core scientific themes are identified: (i) surface, (ii) tropospheric clouds, (iii) tropospheric gases, (iv) stratospheric composition and (v) stratospheric hazes. We discuss each theme in depth, including the scientific purpose, the capabilities and limitations of the instrument suite, and suggested observing schemes. We pay particular attention to saturation, which is a problem for all three instruments, but may be alleviated for NIRCam by reading out small sub-arrays of the detectors, sufficient to encompass Titan but with significantly faster read-out times. We find that JWST has very significant potential for advancing Titan science, with a spectral resolution exceeding the Cassini instrument suite at near-infrared wavelengths, and a spatial resolution exceeding HST at the same wavelengths. In particular, JWST will be valuable for time-domain monitoring of Titan, given the observatory's expected five-to-ten-year lifetime, for example monitoring the seasonal appearance of clouds. JWST observations in the post-Cassini period will complement those of other large facilities such as HST, ALMA, SOFIA and next-generation ground-based telescopes (TMT, GMT, EELT).
    Comment: 50 pages, including 22 figures and 2 tables
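
    The spatial-resolution claim follows largely from the aperture difference. A back-of-envelope check (not from the paper; the Rayleigh criterion and the 2 ÎŒm wavelength are illustrative assumptions):

    import math

    RAD_TO_ARCSEC = 180.0 / math.pi * 3600.0

    def rayleigh_limit_arcsec(wavelength_m, aperture_m):
        # Diffraction-limited angular resolution, Rayleigh criterion
        return 1.22 * wavelength_m / aperture_m * RAD_TO_ARCSEC

    for name, aperture in [("JWST", 6.5), ("HST", 2.4)]:
        print(name, round(rayleigh_limit_arcsec(2.0e-6, aperture), 3), "arcsec at 2 um")
    # ~0.077 arcsec for JWST vs ~0.210 arcsec for HST: roughly the 6.5/2.4 aperture ratio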

    On the Numerical Accuracy of Spreadsheets

    This paper discusses the numerical precision of five spreadsheets (Calc, Excel, Gnumeric, NeoOffice and Oleo) running on two hardware platforms (i386 and amd64) and on three operating systems (Windows Vista, Ubuntu Intrepid and Mac OS Leopard). The methodology consists of checking the number of correct significant digits returned by each spreadsheet when computing the sample mean, standard deviation, first-order autocorrelation, F statistic in ANOVA tests, linear and nonlinear regression and distribution functions. A discussion about the algorithms for pseudorandom number generation provided by these platforms is also conducted. We conclude that there is no safe choice among the spreadsheets here assessed: they all fail in nonlinear regression and they are not suited for Monte Carlo experiments.
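
    The "number of correct significant digits" in such studies is usually quantified with the log relative error (LRE) against a certified reference value (e.g., from the NIST StRD suites). A minimal sketch of that metric, with the function name and example values chosen for illustration rather than taken from the paper:

    import math

    def lre(computed, certified, max_digits=15):
        """Log relative error: roughly the number of correct significant digits."""
        if computed == certified:
            return max_digits
        if certified == 0.0:
            # fall back to the absolute error when the certified value is zero
            return max(0.0, min(max_digits, -math.log10(abs(computed))))
        rel = abs(computed - certified) / abs(certified)
        return max(0.0, min(max_digits, -math.log10(rel)))

    # A spreadsheet returning 0.2630 for a certified autocorrelation of 0.2631
    # scores about 3.4 correct digits.
    print(round(lre(0.2630, 0.2631), 1))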

    Impact of Seismic Risk on Lifetime Property Values

    This report presents a methodology for establishing the uncertain net asset value, NAV, of a real-estate investment opportunity considering both market risk and seismic risk for the property. It also presents a decision-making procedure to assist in making real-estate investment choices under conditions of uncertainty and risk-aversion. It is shown that market risk, as measured by the coefficient of variation of NAV, is at least 0.2 and may exceed 1.0. In a situation of such high uncertainty, where potential gains and losses are large relative to a decision-maker's risk tolerance, it is appropriate to adopt a decision-analysis approach to real-estate investment decision-making. A simple equation for doing so is presented. The decision-analysis approach uses the certainty equivalent, CE, as opposed to NAV, as the basis for investment decision-making. That is, when faced with multiple investment alternatives, one should choose the alternative that maximizes CE. It is shown that CE is less than the expected value of NAV by an amount proportional to the variance of NAV and the inverse of the decision-maker's risk tolerance, ρ. The procedure for establishing NAV and CE is illustrated in parallel demonstrations by the CUREE and Kajima research teams. The CUREE demonstration is performed using a real 1960s-era hotel building in Van Nuys, California. The building, a 7-story non-ductile reinforced-concrete moment-frame building, is analyzed using the assembly-based vulnerability (ABV) method, developed in Phase III of the CUREE-Kajima Joint Research Program. The building is analyzed three ways: in its condition prior to the 1994 Northridge Earthquake, with a hypothetical shearwall upgrade, and with earthquake insurance. This is the first application of ABV to a real building, and the first time ABV has incorporated stochastic structural analyses that consider uncertainties in the mass, damping, and force-deformation behavior of the structure, along with uncertainties in ground motion, component damageability, and repair costs. New fragility functions are developed for the reinforced-concrete flexural members using published laboratory test data, and new unit repair costs for these components are developed by a professional construction cost estimator. Four investment alternatives are considered: do not buy; buy; buy and retrofit; and buy and insure. It is found that the best alternative for most reasonable values of discount rate, risk tolerance, and market risk is to buy and leave the building as-is. However, risk tolerance and market risk (variability of income) both materially affect the decision. That is, for certain ranges of each parameter, the best investment alternative changes. This indicates that expected-value decision-making is inappropriate for some decision-makers and investment opportunities. It is also found that the majority of the economic seismic risk results from shaking of S_a < 0.3g, i.e., shaking with return periods on the order of 50 to 100 yr that causes primarily architectural damage, rather than from the strong, rare events of which common probable maximum loss (PML) measurements are indicative. The Kajima demonstration is performed using three Tokyo buildings. A nine-story, steel-reinforced-concrete building built in 1961 is analyzed as two designs: as-is, and with a steel-braced-frame structural upgrade. The third building is a 29-story steel-frame structure built in 1999.
The three buildings are intended to meet collapse-prevention, life-safety, and operational performance levels, respectively, in shaking with 10% exceedance probability in 50 years. The buildings are assessed using levels 2 and 3 of Kajima's three-level analysis methodology. These are semi-assembly-based approaches, which subdivide a building into categories of components, estimate the loss of these component categories for given ground motions, and combine the losses for the entire building. The two methods are used to estimate annualized losses and to create curves that relate loss to exceedance probability. The results are incorporated in the input to a sophisticated program developed by the Kajima Corporation, called Kajima D, which forecasts cash flows for office, retail, and residential projects for purposes of property screening, due diligence, negotiation, financial structuring, and strategic planning. The result is an estimate of NAV for each building. A parametric study of CE for each building is presented, along with a simplified model for calculating CE as a function of mean NAV and coefficient of variation of NAV. The equation agrees with that developed in parallel by the CUREE team. Both the CUREE and Kajima teams collaborated with a number of real-estate investors to understand their seismic risk-management practices, and to formulate and assess the viability of the proposed decision-making methodologies. Investors were interviewed to elicit their risk tolerance, ρ, using scripts developed and presented here in English and Japanese. Results of 10 such interviews are presented, which show that a strong relationship exists between a decision-maker's annual revenue, R, and his or her risk tolerance, ρ ≈ 0.0075R^1.34. The interviews show that earthquake risk is a marginal consideration in current investment practice. Probable maximum loss (PML) is the only earthquake risk parameter these investors consider, and they typically do not use seismic risk at all in their financial analysis of an investment opportunity. For competitive reasons, a public investor interviewed here would not wish to account for seismic risk in his financial analysis unless rating agencies required him to do so or such consideration otherwise became standard practice. However, in cases where seismic risk is high enough to significantly reduce return, a private investor expressed the desire to account for seismic risk via expected annualized loss (EAL) if it were inexpensive to do so, i.e., if the cost of calculating the EAL were not substantially greater than that of PML alone. The study results point to a number of interesting opportunities for future research, namely: improve the market-risk stochastic model, including comparison of actual long-term income with initial income projections; improve the risk-attitude interview; account for uncertainties in repair method and in the relationship between repair cost and loss; relate the damage state of structural elements to points on the force-deformation relationship; examine simpler dynamic analysis as a means to estimate vulnerability; examine the relationship between simplified engineering demand parameters and performance; enhance category-based vulnerability functions by compiling a library of building-specific ones; and work with lenders and real-estate industry analysts to determine the conditions under which seismic risk should be reflected in investors' financial analyses.
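
    The CE relation described above is consistent with the standard exponential-utility approximation CE ≈ E[NAV] - Var(NAV) / (2ρ). The sketch below applies it together with the revenue-based risk-tolerance relation quoted in the abstract; the 1/2 factor, the example figures and the unit conventions are assumptions for illustration, not values from the report.

    def risk_tolerance(annual_revenue):
        """Empirical relation reported in the study: rho ~ 0.0075 * R**1.34.

        The exponent makes this unit-dependent; R must be expressed in the
        same units as in the original report.
        """
        return 0.0075 * annual_revenue ** 1.34

    def certainty_equivalent(mean_nav, cov_nav, rho):
        """CE under the common exponential-utility approximation.

        CE ~ E[NAV] - Var(NAV) / (2 * rho); the 1/2 factor is the usual
        approximation and the report's exact constant may differ.
        """
        var_nav = (cov_nav * mean_nav) ** 2   # coefficient of variation -> variance
        return mean_nav - var_nav / (2.0 * rho)

    # Illustrative figures only: $10M expected NAV, CoV of 0.4, rho of $8M.
    print(certainty_equivalent(10e6, 0.4, 8e6))   # about $9.0M, below E[NAV]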

    Symbolic energy estimation model with optimum start algorithm implementation

    The drive to reduce the carbon emissions and energy use directly associated with dwellings, and to achieve the zero-carbon home, suggests that the assessment of energy ratings will have an increasingly prioritised role in the built environment. Created by the Building Research Establishment (BRE), the Standard Assessment Procedure (SAP) is the UK Government's recommended method of assessing the energy ratings of dwellings. This paper describes a new, simplified dynamic method (henceforth known as IDEAS: Inverse Dynamics based Energy Analysis and Simulation) of assessing the controllability of a building and its servicing systems. The IDEAS method produces results that are comparable to SAP. An optimum start algorithm is explored in this paper to allow heating systems of differing responsiveness and size to be integrated into the IDEAS framework. Results suggest that this design approach could enhance the SAP methodology through the addition of advanced system controllability and dynamic values.
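
    Generically, an optimum start algorithm switches the heating on just early enough to reach the setpoint by occupancy time, starting slower or undersized systems earlier. The sketch below is one simple heuristic of that kind, given only to illustrate the idea; the coefficients, inputs and function name are assumptions and it is not the IDEAS or SAP implementation.

    def optimum_start_minutes(t_indoor, t_setpoint, t_outdoor,
                              min_per_deg_indoor=20.0, min_per_deg_outdoor=2.0,
                              max_preheat=240.0):
        """Preheat lead time from the indoor deficit and outdoor conditions.

        A slow or undersized heating system would be modelled with a larger
        min_per_deg_indoor coefficient, so it starts earlier.
        """
        indoor_deficit = max(0.0, t_setpoint - t_indoor)
        outdoor_deficit = max(0.0, t_setpoint - t_outdoor)
        lead = (min_per_deg_indoor * indoor_deficit
                + min_per_deg_outdoor * outdoor_deficit)
        return min(max_preheat, lead)

    # E.g. 16 C indoors, 21 C setpoint, 4 C outdoors -> start about 134 min early.
    print(optimum_start_minutes(16.0, 21.0, 4.0))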

    Effects of Orientations, Aspect Ratios, Pavement Materials and Vegetation Elements on Thermal Stress inside Typical Urban Canyons

    The analysis of local climate conditions to test artificial urban boundaries and related climate hazards through modelling tools should become common practice to inform public authorities about the benefits of planning alternatives. Different finishing materials and sheltering objects within urban canyons (UCs) can be tested, predicted and compared through quantitative and qualitative understanding of the relationships between the microclimatic environment and subjective thermal assessment. This process can serve as a supporting planning instrument in the early design phases, as demonstrated in this study, which analyzes the thermal stress within typical UCs of Bilbao (Spain) in summertime through the evaluation of the Physiologically Equivalent Temperature (PET) using ENVI-met. The UCs are characterized by different orientations, height-to-width aspect ratios, pavement materials, tree dimensions and planting patterns. Firstly, the current situation was analyzed; secondly, the effects of asphalt and red brick stones as street pavement materials were compared; thirdly, the benefits of vegetation elements were tested. The analysis demonstrated that orientation and aspect ratio strongly affect the magnitude and duration of the thermal peaks at pedestrian level, while the vegetation elements improve thermal comfort by up to two thermophysiological assessment classes. The outcomes of this study were translated and visualized as green planning recommendations for new and consolidated urban areas in Bilbao.
    The work leading to these results has received funding from COST Action TU0902, the European Community's Seventh Framework Programme under Grant Agreement No. 308497, Project RAMSES (Reconciling Adaptation, Mitigation and Sustainable Development for Cities, 2012-2017), and Diputación Foral de Bizkaia Exp. 6-12-TK-2010-0027, Project SICURB-ITS (Desarrollo de Sistemas para el análisis de la Contaminación atmosférica en zonas URBanas integrados en ITS, 2010-2011).
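
    For reference, the thermophysiological assessment classes mentioned above are usually the PET bands after Matzarakis and Mayer. The sketch below uses those widely cited central-European thresholds; the study may apply limits adapted to Bilbao's climate, so the exact band edges are an assumption.

    # Widely cited PET assessment bands (upper limits in degrees C).
    PET_CLASSES = [
        (4, "extreme cold stress"), (8, "strong cold stress"),
        (13, "moderate cold stress"), (18, "slight cold stress"),
        (23, "no thermal stress"), (29, "slight heat stress"),
        (35, "moderate heat stress"), (41, "strong heat stress"),
    ]

    def pet_class(pet_celsius):
        for upper_limit, label in PET_CLASSES:
            if pet_celsius <= upper_limit:
                return label
        return "extreme heat stress"

    # An improvement of two classes at pedestrian level, e.g. under tree shading:
    print(pet_class(38.0), "->", pet_class(27.0))   # strong heat -> slight heat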

    Topological exploration of artificial neuronal network dynamics

    One of the paramount challenges in neuroscience is to understand the dynamics of individual neurons and how they give rise to network dynamics when interconnected. Historically, researchers have resorted to graph theory, statistics, and statistical mechanics to describe the spatiotemporal structure of such network dynamics. Our novel approach employs tools from algebraic topology to characterize the global properties of network structure and dynamics. We propose a method based on persistent homology to automatically classify network dynamics using topological features of spaces built from various spike-train distances. We investigate the efficacy of our method by simulating activity in three small artificial neural networks with different sets of parameters, giving rise to dynamics that can be classified into four regimes. We then compute three measures of spike-train similarity and use persistent homology to extract topological features that are fundamentally different from those used in traditional methods. Our results show that a machine learning classifier trained on these features can accurately predict the regime of the network it was trained on and also generalize to other networks that were not presented during training. Moreover, we demonstrate that using features extracted from multiple spike-train distances systematically improves the performance of our method.
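
    A rough sketch of the shape of such a pipeline: build a pairwise spike-train distance matrix (here a van Rossum-style metric) and extract 0-dimensional persistence lifetimes, which for a Rips filtration coincide with single-linkage merge heights. The paper uses several distances and richer persistent-homology features, so the specific metric, the restriction to H0 and the parameter values below are assumptions made for illustration.

    import numpy as np
    from scipy.cluster.hierarchy import linkage

    def van_rossum_distance(train_a, train_b, tau=0.05, t_max=1.0, dt=1e-3):
        """Distance between two spike trains via exponentially filtered traces."""
        t = np.arange(0.0, t_max, dt)
        def filtered(spikes):
            trace = np.zeros_like(t)
            for s in spikes:
                trace += (t >= s) * np.exp(-(t - s) / tau)
            return trace
        diff = filtered(train_a) - filtered(train_b)
        return np.sqrt(np.sum(diff ** 2) * dt / tau)

    def h0_persistence_features(spike_trains, n_features=5):
        """Largest H0 lifetimes of the Rips filtration on the distance matrix.

        For 0-dimensional homology these equal the single-linkage merge
        heights, so hierarchical clustering suffices for this sketch.
        """
        n = len(spike_trains)
        d = np.zeros((n, n))
        for i in range(n):
            for j in range(i + 1, n):
                d[i, j] = d[j, i] = van_rossum_distance(spike_trains[i], spike_trains[j])
        condensed = d[np.triu_indices(n, k=1)]              # pdist ordering
        merge_heights = linkage(condensed, method="single")[:, 2]
        return np.sort(merge_heights)[::-1][:n_features]

    # Toy usage: random spike trains; the features would feed a classifier.
    rng = np.random.default_rng(0)
    trains = [np.sort(rng.uniform(0.0, 1.0, rng.integers(5, 30))) for _ in range(12)]
    print(h0_persistence_features(trains))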

    Indoor Air Quality Assessment: Comparison of Ventilation Scenarios for Retrofitting Classrooms in a Hot Climate

    Current energy efficiency policies in buildings foster the promotion of energy retrofitting of the existing stock. In southern Spain, the most extensive public building stock is that of educational buildings, which is especially subject to significant internal loads due to high occupancy. A large fraction of the energy retrofit strategies conducted to date have focused on energy aspects and indoor thermal comfort, repeatedly disregarding indoor air quality criteria. This research assesses indoor air quality in a school located in the Mediterranean area, with the objective of proposing different ventilation scenarios based on occupancy patterns and carbon dioxide levels monitored on site. Results show that manual ventilation cannot guarantee the minimum indoor air quality levels required by current standards. A constant ventilation rate based on CO2 levels allows 15% more thermal comfort hours a year to be reached, compared to CO2-based optimized demand-controlled ventilation. Nevertheless, the latter ensures 35% annual energy savings compared to a constant CO2-based ventilation, and 37% more annual energy savings than a constant ventilation rate of outdoor air per person.
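
    Demand-controlled ventilation of the kind compared here rests on a single-zone CO2 mass balance. A minimal sketch, assuming well-mixed air, a fixed per-person CO2 generation rate and a simple proportional control rule; the room volume, thresholds and rates are illustrative, not the monitored classroom's values.

    def simulate_co2(occupancy_per_min, volume_m3=180.0, c_out_ppm=420.0,
                     gen_lps_per_person=0.005, q_min_lps=30.0, q_max_lps=400.0,
                     setpoint_ppm=900.0, dt_s=60.0):
        """Single-zone, well-mixed CO2 balance with demand-controlled ventilation.

        V * dC/dt = G + Q * (C_out - C); the supply airflow Q is raised
        proportionally once indoor CO2 exceeds the setpoint.
        """
        c = c_out_ppm
        trace = []
        for n_people in occupancy_per_min:
            error = max(0.0, c - setpoint_ppm)
            q_m3s = min(q_max_lps, q_min_lps + 2.0 * error) / 1000.0   # P-control
            g_m3s = n_people * gen_lps_per_person / 1000.0             # exhaled CO2
            dc_dt = (g_m3s * 1e6 + q_m3s * (c_out_ppm - c)) / volume_m3
            c += dc_dt * dt_s
            trace.append(c)
        return trace

    # Toy schedule: 25 pupils for two hours, a short break, two more hours.
    schedule = [25] * 120 + [0] * 20 + [25] * 120
    print(round(max(simulate_co2(schedule))), "ppm peak")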
    • 

    corecore