Banking on Shared Value: How Banks Profit by Rethinking Their Purpose
This paper articulates a new role for banks in society using the lens of shared value. It is intended to help bank leaders, their partners, and industry regulators seize opportunities to create financial value while addressing unmet social and environmental needs at scale. The concepts included here apply across different types of banking, across different bank sizes, and across developed and emerging economies alike, although their implementation will naturally differ based on context.
A non-linear stochastic asset model for actuarial use
This paper reviews the stochastic asset model described in Wilkie (1995) and previous work on refining this model. The paper then considers the application of non-linear modelling to investment series, considering both ARCH techniques and threshold modelling. The paper suggests a threshold autoregressive (TAR) system as a useful progression from the Wilkie (1995) model. The authors are making available (by email, on request) a collection of spreadsheets which they have used to simulate the stochastic asset models considered in this paper.
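As a rough illustration of the kind of model the paper points towards (not the authors' fitted system or the Wilkie structure), the sketch below simulates a two-regime TAR(1) series in Python; the threshold and the regime parameters phi_low, phi_high, c_low, c_high and sigma are made-up placeholders.

```python
import numpy as np

def simulate_tar1(n=500, threshold=0.0, phi_low=0.9, phi_high=0.4,
                  c_low=0.0, c_high=0.05, sigma=0.02, seed=0):
    """Simulate a two-regime threshold autoregressive TAR(1) series.

    The autoregressive coefficient switches according to whether the
    previous value lies below or above the threshold. All parameter
    values are illustrative, not estimates from the paper.
    """
    rng = np.random.default_rng(seed)
    y = np.zeros(n)
    for t in range(1, n):
        if y[t - 1] <= threshold:
            y[t] = c_low + phi_low * y[t - 1] + sigma * rng.standard_normal()
        else:
            y[t] = c_high + phi_high * y[t - 1] + sigma * rng.standard_normal()
    return y

series = simulate_tar1()
print(series[:5])
```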
Towards the “ultimate earthquake-proof” building: Development of an integrated low-damage system
The 2010–2011 Canterbury earthquake sequence has highlighted the severe mismatch between societal expectations and the reality of the seismic performance of modern buildings. A paradigm shift in performance-based design criteria and objectives towards a damage-control or low-damage design philosophy and technologies is urgently required. The increased awareness by the general public, tenants, building owners, territorial authorities as well as (re)insurers of the severe socio-economic impacts of moderate-to-strong earthquakes in terms of damage/dollars/downtime has indeed stimulated and facilitated the wider acceptance and implementation of cost-efficient damage-control (or low-damage) technologies. The ‘bar’ has been raised significantly with the request to fast-track the development of what the wider general public would hope, and somehow expect, to live in, i.e. an “earthquake-proof” building system capable of sustaining the shaking of a severe earthquake basically unscathed. The paper provides an overview of recent advances through extensive research carried out at the University of Canterbury in the past decade towards the development of a low-damage building system as a whole, within an integrated performance-based framework, including the skeleton of the superstructure, the non-structural components and the interaction with the soil/foundation system. Examples of real on-site applications of such technology in New Zealand, using concrete, timber (engineered wood), steel or a combination of these materials, and featuring some of the latest innovative technical solutions developed in the laboratory, are presented as examples of the successful transfer of the performance-based seismic design approach and advanced technology from theory to practice.
Using Lunar Observations to Validate Pointing Accuracy and Geolocation, Detector Sensitivity Stability and Static Point Response of the CERES Instruments
Validation of in-orbit instrument performance is a function of stability in both the instrument and the calibration source. This paper describes a method using lunar observations, scanning the Moon near full phase, by the Clouds and the Earth's Radiant Energy System (CERES) instruments. The Moon offers an external source whose signal variance is predictable and non-degrading. From 2006 to the present, these in-orbit observations have been standardized and compiled for Flight Models 1 and 2 aboard the Terra satellite, Flight Models 3 and 4 aboard the Aqua satellite, and, beginning in 2012, Flight Model 5 aboard Suomi-NPP. The instrument performance measures studied are detector sensitivity stability, pointing accuracy and the static detector point response function. This validation method also shows trends of 0.8% per decade or less per CERES data channel for Flight Models 1-4. Using instrument gimbal data and the computed lunar position, the pointing error of each detector telescope and the accuracy and consistency of the alignment between the detectors can be determined. The maximum pointing error was 0.2 Deg. in azimuth and 0.17 Deg. in elevation, which corresponds to a geolocation error near nadir of 2.09 km. With the exception of one detector, all instruments were found to have consistent detector alignment from 2006 to the present. All alignment errors were within 0.1 Deg., with most detector telescopes showing a consistent alignment offset of less than 0.02 Deg.
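As a quick check on the quoted figures, the sketch below converts an angular pointing error into an approximate geolocation error near nadir via h*tan(theta); the 705 km orbit altitude is an assumed nominal value for Terra/Aqua, not a number taken from the abstract.

```python
import math

# Approximate nadir geolocation shift caused by a small angular pointing
# error theta seen from orbit altitude h: shift ~ h * tan(theta).
altitude_km = 705.0  # assumed nominal Terra/Aqua altitude, not from the abstract
for label, error_deg in [("azimuth", 0.20), ("elevation", 0.17)]:
    shift_km = altitude_km * math.tan(math.radians(error_deg))
    print(f"{label}: {error_deg:.2f} deg -> ~{shift_km:.2f} km near nadir")
# elevation: 0.17 deg -> ~2.09 km, consistent with the geolocation error quoted above
```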
Future Flight Opportunities and Calibration Protocols for CERES: Continuation of Observations in Support of the Long-Term Earth Radiation Budget Climate Data Record
The goal of the Clouds and the Earth's Radiant Energy System (CERES) project is to provide a long-term record of the radiation budget at the top of the atmosphere (TOA), within the atmosphere, and at the surface, with consistent cloud and aerosol properties, at climate accuracy. CERES consists of an integrated instrument-algorithm-validation science team that develops the higher-level products (Levels 1-3) and carries out investigations. It involves a high level of data fusion, merging inputs from 25 unique input data sources to produce 18 CERES data products. Over 90% of the CERES data product volume involves two or more instruments. Continuation of the Earth Radiation Budget (ERB) Climate Data Record (CDR) has been identified as critical in the 2007 NRC Decadal Survey, the Global Climate Observing System WCRP report, and an assessment titled "Impacts of NPOESS Nunn-McCurdy Certification on Joint NASA-NOAA Climate Goals". Five CERES instruments have flown on three different spacecraft: TRMM, EOS-Terra and EOS-Aqua. In response, NASA, NOAA and NPOESS have agreed to fly the existing CERES Flight Model (FM-5) on the NPP spacecraft in 2011 and to procure an additional CERES sensor with modest upgrades for flight on the JPSS C1 spacecraft in 2014, followed by a CERES follow-on sensor for flight in 2018. CERES is a scanning broadband radiometer that measures filtered radiance in the shortwave (SW, 0.3-5 µm), total (TOT, 0.3-200 µm) and window (WN, 8-12 µm) regions. Pre-launch calibration is performed on each Flight Model to meet accuracy requirements of 1% for SW and 0.5% for outgoing LW observations. Ground-to-flight or in-flight changes are monitored using protocols employing onboard and vicarious calibration sources. Studies of flight data show that the SW response can change dramatically due to optical contamination, with the greatest impact in blue-to-UV radiance, where tungsten lamps are largely devoid of output. While the science goals remain unchanged for the ERB Climate Data Record, it is now understood that achieving them is more difficult for two reasons. The first is an increased understanding of the dynamics of the Earth/atmosphere system, which demonstrates that separating natural variability from anthropogenic change on decadal time scales requires observations with higher accuracy and stability.
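As a simplified, hypothetical illustration of the kind of in-flight monitoring described above (not the actual CERES calibration protocols), the sketch below fits a linear drift to a synthetic gain time series and compares the implied change per decade with the quoted 1% (SW) and 0.5% (LW) accuracy requirements.

```python
import numpy as np

# Synthetic gain series standing in for in-flight calibration measurements;
# the comparison against the pre-launch accuracy requirements is illustrative.
requirements_pct = {"SW": 1.0, "LW": 0.5}

rng = np.random.default_rng(1)
t_years = np.arange(0.0, 15.0, 0.25)                         # quarterly samples
gain = 1.0 - 0.0004 * t_years + 0.001 * rng.standard_normal(t_years.size)

slope, intercept = np.polyfit(t_years, gain, 1)              # linear drift fit
drift_pct_per_decade = 100.0 * slope * 10.0 / intercept

for band, requirement in requirements_pct.items():
    status = "within" if abs(drift_pct_per_decade) < requirement else "exceeds"
    print(f"{band}: fitted drift {drift_pct_per_decade:+.2f}%/decade "
          f"({status} the {requirement}% requirement)")
```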
X-ray and optical variability of Seyfert 1 galaxies as observed with XMM-Newton
We have examined simultaneous X-ray and optical light curves of a sample of eight nearby Seyfert 1 galaxies observed using the EPIC X-ray cameras and the Optical Monitor on board XMM-Newton. The observations span ~1 day and revealed optical variability in four of the eight objects studied. In all cases, the X-ray variability amplitude exceeded that of the optical in both fractional and absolute luminosity terms. No clearly significant correlations were detected between wavebands using cross-correlation analysis. We conclude that, in three of the four objects in which optical variability was detected, reprocessing mechanisms between wavebands do not dominate either the optical or the X-ray variability on the time-scales probed. Comment: 9 pages, 2 figures, accepted for publication in MNRAS.
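As a minimal sketch of the quantities discussed (not the paper's actual analysis), the example below computes a fractional variability amplitude and a simple normalized cross-correlation for two evenly sampled light curves; the input arrays are synthetic placeholders for EPIC and Optical Monitor count rates.

```python
import numpy as np

def fractional_variability(rate, rate_err):
    """F_var = sqrt((sample variance - mean squared error) / mean^2)."""
    excess = np.var(rate, ddof=1) - np.mean(rate_err ** 2)
    return np.sqrt(max(excess, 0.0)) / np.mean(rate)

def cross_correlation(x, y):
    """Normalized cross-correlation of two equal-length, evenly sampled series."""
    x = (x - x.mean()) / (x.std() * len(x))
    y = (y - y.mean()) / y.std()
    return np.correlate(x, y, mode="full")   # zero lag sits at index len(x) - 1

rng = np.random.default_rng(0)
xray = 5.0 + rng.normal(0.0, 0.8, 200)       # placeholder X-ray light curve
optical = 2.0 + rng.normal(0.0, 0.1, 200)    # placeholder optical light curve

print("F_var (X-ray):", round(fractional_variability(xray, np.full(200, 0.2)), 3))
ccf = cross_correlation(xray, optical)
print("peak CCF:", round(float(ccf.max()), 3), "at lag offset", int(ccf.argmax()) - 199)
```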
Developing the Inclusive Course Design Tool: a tool to support staff reflection on their inclusive practice
Inclusivity is fundamental to higher education, its course design, its assessment and its delivery. The principles of inclusivity offer all students the opportunity to achieve to the best of their ability. The purpose of this case-study-based paper is to outline the context, process, development and initial evaluation of a newly generated tool designed for academic colleagues. The Inclusive Course Design Tool (ICDT) offers a series of reflective questions and supporting guidance rooted in theory and research on inclusion, pedagogy, multiculturalism, universal design for learning, and implicit and unconscious bias. This first version of the Tool encourages course teams to reflect on and interrogate the nature of inclusive academic practice in their courses, in their course curricula, their classrooms (virtual or physical) and their approaches to student learning and support. The contextualised rationale for the Tool, its design, the consultation process, its early evaluation and future considerations as an institutional tool are explored. Its use in trying to reduce the Black, Asian and minority ethnic (BAME) student attainment gap, enhance success and graduate outcomes, and enhance academic practice and reflection is specifically explored.
Discussion of “The effect of energy concentration of earthquake ground motions on the nonlinear response of RC structures” by H. Cao, M.I. Friswell
This discussion raises a few comments and questions on the paper by Cao and Friswell [Cao H, Friswell MI. The effect of energy concentration of earthquake ground motions on the nonlinear response of RC structures. Soil Dyn Earthquake Eng 2009; 29: 292-9.]. The authors consider an interesting problem in the seismic response analysis of nonlinear structures. Specifically, the study examines the implication of the energy concentration of the adopted ground acceleration record on the nonlinear response of reinforced concrete structures. The paper employs the wavelet transform to characterize the energy content of the ground acceleration in the time and frequency domains.
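The paper itself uses the wavelet transform; as a simpler, related illustration of how the energy of a ground motion concentrates in time, the sketch below computes the normalized cumulative Arias intensity (a Husid plot) and the 5-95% significant duration for a synthetic accelerogram.

```python
import numpy as np

# Synthetic accelerogram: decaying-envelope noise standing in for a real record.
g = 9.81                          # m/s^2
dt = 0.01                         # s, assumed sampling interval
t = np.arange(0.0, 40.0, dt)
rng = np.random.default_rng(0)
accel = np.exp(-0.1 * t) * rng.normal(0.0, 1.0, t.size)     # m/s^2

# Arias intensity I_A = pi/(2g) * integral of a(t)^2 dt, accumulated in time.
arias = np.pi / (2.0 * g) * np.cumsum(accel ** 2) * dt
husid = arias / arias[-1]                                    # normalized 0..1
t5 = t[np.searchsorted(husid, 0.05)]
t95 = t[np.searchsorted(husid, 0.95)]
print(f"Total Arias intensity: {arias[-1]:.3f} m/s")
print(f"5-95% significant duration: {t95 - t5:.1f} s")
```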
Testing For Nonlinearity Using Redundancies: Quantitative and Qualitative Aspects
A method for testing nonlinearity in time series is described, based on information-theoretic functionals (redundancies), whose linear and nonlinear forms allow either qualitative or, after incorporating the surrogate data technique, quantitative evaluation of the dynamical properties of the scrutinized data. An interplay of quantitative and qualitative testing on both the linear and nonlinear levels is analyzed, and the robustness of this combined approach against spurious nonlinearity detection is demonstrated. Evaluation of redundancies and redundancy-based statistics as functions of time lag and embedding dimension can further enhance insight into the dynamics of a system under study. Comment: 32 pages, 1 table and 12 figures; to be published in Physica D.
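As a minimal sketch of the general idea (not the authors' redundancy functionals), the example below estimates the time-lagged mutual information of a series by histogram binning and compares it with phase-randomized surrogates, which preserve the linear autocorrelation but destroy nonlinear structure; the logistic-map test signal, lag and bin count are illustrative choices.

```python
import numpy as np

def lagged_mutual_information(x, lag, bins=16):
    """Histogram estimate of the mutual information between x(t) and x(t + lag)."""
    a, b = x[:-lag], x[lag:]
    joint, _, _ = np.histogram2d(a, b, bins=bins)
    p_ab = joint / joint.sum()
    p_a = p_ab.sum(axis=1, keepdims=True)
    p_b = p_ab.sum(axis=0, keepdims=True)
    nz = p_ab > 0
    return float(np.sum(p_ab[nz] * np.log(p_ab[nz] / (p_a @ p_b)[nz])))

def phase_randomized_surrogate(x, rng):
    """FFT surrogate: keep the amplitude spectrum, randomize the phases."""
    spec = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2.0 * np.pi, spec.size)
    phases[0] = 0.0
    return np.fft.irfft(np.abs(spec) * np.exp(1j * phases), n=x.size)

# Nonlinear test signal: the logistic map in its chaotic regime.
x = np.empty(2000)
x[0] = 0.4
for i in range(1, x.size):
    x[i] = 3.9 * x[i - 1] * (1.0 - x[i - 1])

rng = np.random.default_rng(2)
lag = 1
observed = lagged_mutual_information(x, lag)
surrogates = [lagged_mutual_information(phase_randomized_surrogate(x, rng), lag)
              for _ in range(39)]
# An observed value well above the surrogate range suggests nonlinearity.
print(f"observed MI: {observed:.3f}, surrogate max: {max(surrogates):.3f}")
```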
A reference framework for process-oriented software development organizations
In this paper, a proposal for a generic framework for process-oriented software development organizations is presented. Additionally, the corresponding way of managing the process model is suggested, along with the instantiation of their processes using the Rational Unified Process (RUP) disciplines, whenever these are available, or other kinds of processes. The proposals made here were consolidated with experiences from real projects, and we report the main results from one of those projects. FCT - Fuel Cell Technologies Program (POSI/37334/CHS/2001).