
    Modeling the risk process in the XploRe computing environment

    A user-friendly approach to modeling the risk process is presented. It utilizes the insurance library of the XploRe computing environment, which is accompanied by on-line, hyperlinked manuals and e-books that are freely downloadable from the web. An empirical analysis of Danish fire losses for the years 1980-90 is conducted, and the best fit of the risk process to the data is illustrated.
    Keywords: risk process, Monte Carlo simulation, XploRe computing environment
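The risk process described above is typically a compound Poisson surplus process, R(t) = u + c·t − Σ claims. A minimal Monte Carlo sketch of such a process (not the XploRe insurance library itself; all parameter values and the lognormal claim-severity choice are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

def ruin_probability(u=10.0, c=2.0, lam=1.0, horizon=100.0, n_paths=2000):
    """Estimate P(ruin before `horizon`) for a compound Poisson risk
    process R(t) = u + c*t - sum of claims, by Monte Carlo simulation.

    u       -- initial capital (illustrative)
    c       -- premium rate (illustrative)
    lam     -- Poisson claim-arrival intensity
    """
    ruined = 0
    for _ in range(n_paths):
        t, claims = 0.0, 0.0
        while True:
            t += rng.exponential(1.0 / lam)        # next claim arrival time
            if t > horizon:
                break                              # survived the horizon
            claims += rng.lognormal(0.0, 1.0)      # assumed claim severity
            if u + c * t - claims < 0.0:           # surplus below zero: ruin
                ruined += 1
                break
    return ruined / n_paths

# Usage: ruin_probability(n_paths=500) returns an estimate in [0, 1].
```

In a real calibration exercise (such as the Danish fire-loss study), the claim-size distribution would first be fitted to the data rather than assumed.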

    Yxilon – a Modular Open-Source Statistical Programming Language

    Statistical research has always been at the edge of available computing power. Huge datasets, e.g. in data mining or quantitative finance, and computationally intensive techniques such as bootstrap methods always require a little more computing power than is currently available. But the most popular statistical programming language, R, as well as statistical programming languages like S or XploRe, are interpreted, which makes them slow in compute-intensive areas. The common solution is to implement these routines in low-level programming languages like C/C++ or Fortran and subsequently integrate them into the statistical programming language as dynamic link libraries (DLLs) or shared object libraries (SOs).
    Keywords: statistical programming language, XploRe, Yxilon, Java, dynamic link libraries, shared object libraries
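The DLL/SO integration pattern described above is not specific to R or Yxilon. As a minimal illustration of the general mechanism, an interpreted language can load a compiled shared object and call its routines directly; the sketch below uses Python's `ctypes` to call the C math library, which stands in for any compiled numerical routine:

```python
import ctypes
import ctypes.util
import math

# Locate and load the C math library as a shared object (libm on
# Linux/macOS). This illustrates the general pattern of calling
# compiled low-level code from an interpreted language.
libm = ctypes.CDLL(ctypes.util.find_library("m"))

# Declare the C signature: double cos(double)
libm.cos.restype = ctypes.c_double
libm.cos.argtypes = [ctypes.c_double]

print(libm.cos(math.pi))   # calls the compiled C routine: approx. -1.0
```

R's `.C`/`.Call` interfaces and Java's JNI follow the same idea: the hot loop runs as compiled machine code while the interpreted language keeps its convenience.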

    Community Perspectives- How Study Abroad with Service Learning Impacts the Locals

    This case study focuses on the community perspectives of homestay families, partner organizations, and local program staff who collaborated with Xplore USA Summer Language Adventure Camps in the summer of 2011 in Asheville, North Carolina. The researcher focused on the service work aspect of the Xplore programming and its impact on the local community, as seen from the community's own perspective, to inform the reader about an underexplored subject. Interviews and survey results showed that the volunteer service projects performed by Xplore students and their local brothers and sisters were perceived as beneficial by an overwhelming majority of all local parties concerned: Xplore staff, host family siblings and parents, and local recipient organizations. The case study illustrates the need to collect local community feedback for proper evaluation of international service-learning experiences.

    A joint-channel diagonalization for multiuser MIMO antenna systems

    In this paper, we address the problem of improving the performance of multiuser space-division multiplexing (SDM) systems where multiple independent signal streams can be transmitted in the same frequency and time slot. The problem is important in multiuser multiple-input multiple-output systems where communication from one base station to many mobile stations can occur simultaneously. Our objective is to devise a multiuser linear space-time precoder for simultaneous channel diagonalization of the multiuser channels, enabling SDM. Our new approach is based on diagonalizing the multiuser channel matrices using a variation of successive Jacobi rotations. In addition to the diagonalization, our approach attempts to optimize the resultant channel gains for performance enhancement. Our method is valid for both frequency-flat and frequency-selective fading channels, but we assume that the base station knows all the channels and that they are quasi-stationary.
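The building block behind the approach above is the classical Jacobi rotation, which zeroes one off-diagonal element per step. A minimal single-matrix sketch (not the multiuser joint-diagonalization precoder itself, which must handle several channel matrices at once):

```python
import numpy as np

def jacobi_diagonalize(A, sweeps=10):
    """Diagonalize a real symmetric matrix by successive Jacobi rotations.

    Returns (D, V) with V orthogonal and V.T @ A @ V == D (approximately
    diagonal). Each rotation zeroes one off-diagonal pair (p, q)."""
    A = A.astype(float).copy()
    n = A.shape[0]
    V = np.eye(n)
    for _ in range(sweeps):
        for p in range(n - 1):
            for q in range(p + 1, n):
                if abs(A[p, q]) < 1e-12:
                    continue
                # Angle that annihilates A[p, q]: tan(2*theta) =
                # 2*A[p,q] / (A[q,q] - A[p,p])
                theta = 0.5 * np.arctan2(2.0 * A[p, q], A[q, q] - A[p, p])
                c, s = np.cos(theta), np.sin(theta)
                J = np.eye(n)
                J[p, p] = J[q, q] = c
                J[p, q], J[q, p] = s, -s
                A = J.T @ A @ J        # similarity transform
                V = V @ J              # accumulate the rotations
    return A, V
```

For the multiuser case, the paper's method applies such rotations jointly so that all users' effective channels become (approximately) diagonal at once.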

    Using Self-Contradiction to Learn Confidence Measures in Stereo Vision

    Learned confidence measures gain increasing importance for outlier removal and quality improvement in stereo vision. However, acquiring the necessary training data is typically a tedious and time-consuming task that involves manual interaction, active sensing devices and/or synthetic scenes. To overcome this problem, we propose a new, flexible, and scalable way of generating training data that only requires a set of stereo images as input. The key idea of our approach is to use different viewpoints for reasoning about contradictions and consistencies between multiple depth maps generated with the same stereo algorithm. This enables us to generate a huge amount of training data in a fully automated manner. Among other experiments, we demonstrate the potential of our approach by boosting the performance of three learned confidence measures on the KITTI2012 dataset by simply training them on a vast amount of automatically generated training data rather than a limited amount of laser ground-truth data.
    Comment: This paper was accepted to the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2016. The copyright was transferred to IEEE (https://www.ieee.org). The official version of the paper will be made available on IEEE Xplore (http://ieeexplore.ieee.org). This version of the paper also contains the supplementary material, which will not appear in IEEE Xplore.
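The paper's cross-view contradiction reasoning is more general than a two-view check, but the flavor of such consistency tests can be sketched with the classic left-right disparity verification: a left-image pixel with disparity d should land on a right-image pixel carrying roughly the same disparity. All shapes, values and the threshold below are illustrative:

```python
import numpy as np

def lr_consistency_mask(disp_left, disp_right, tau=1.0):
    """Flag pixels whose left-view disparity contradicts the right map.

    A left pixel (y, x) with disparity d should match right pixel
    (y, x - d); if the disparities there differ by more than `tau`,
    the pixel is a likely outlier (returned as False)."""
    h, w = disp_left.shape
    ys = np.arange(h)[:, None].repeat(w, axis=1)
    xs = np.arange(w)[None, :].repeat(h, axis=0)
    # Corresponding column in the right view, clipped to the image.
    xr = np.clip(np.rint(xs - disp_left).astype(int), 0, w - 1)
    diff = np.abs(disp_left - disp_right[ys, xr])
    return diff <= tau          # True = consistent, False = contradiction
```

Contradiction labels like these can serve as automatically generated negatives when training a confidence measure, without any manually annotated ground truth.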

    Blind hyperspectral unmixing using an Extended Linear Mixing Model to address spectral variability

    Spectral unmixing is one of the main research topics in hyperspectral imaging. It can be formulated as a source separation problem whose goal is to recover the spectral signatures of the materials present in the observed scene (called endmembers) as well as their relative proportions (called fractional abundances), for every pixel in the image. A Linear Mixture Model is often used for its simplicity and ease of use, but it implicitly assumes that a single spectrum can be completely representative of a material. However, in many scenarios this assumption does not hold, since many factors, such as illumination conditions and intrinsic variability of the endmembers, induce modifications of the spectral signatures of the materials. In this paper, we propose an algorithm to unmix hyperspectral data using a recently proposed Extended Linear Mixing Model. The proposed approach allows a pixelwise, spatially coherent local variation of the endmembers, leading to scaled versions of reference endmembers. We also show that the classic nonnegative least squares, as well as other approaches to tackle spectral variability, can be interpreted in the framework of this model. The results of the proposed algorithm on two different synthetic datasets, including one simulating the effect of topography on the measured reflectance through physical modelling, and on two real datasets, show that the proposed technique outperforms other methods aimed at addressing spectral variability, and can provide an accurate estimation of endmember variability across the scene thanks to the scaling-factor estimation.
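A toy numerical sketch of the Extended Linear Mixing Model (not the authors' solver): each pixel is x = M @ (psi * a), where M holds reference endmember spectra, a the fractional abundances, and psi the per-pixel scaling factors (e.g. from illumination or topography). All dimensions and values below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
L, P = 50, 3                                  # spectral bands, endmembers
M = rng.uniform(0.1, 1.0, size=(L, P))        # reference endmember spectra
a = np.array([0.5, 0.3, 0.2])                 # abundances (sum to one)
psi = np.array([1.2, 0.8, 1.0])               # per-endmember scaling factors
x = M @ (psi * a)                             # observed pixel under ELMM

# Unconstrained least squares recovers only the products b_k = psi_k * a_k;
# separating psi from a is what requires the spatial-coherence and
# nonnegativity machinery of a full ELMM solver.
b, *_ = np.linalg.lstsq(M, x, rcond=None)
a_hat = b / b.sum()                           # sum-to-one renormalization
```

Note that `a_hat` equals the scaled abundances renormalized, not `a` itself, which is precisely why plain (nonnegative) least squares can be read as a special case of this model rather than a substitute for it.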

    Frequency and fundamental signal measurement algorithms for distributed control and protection applications

    Increasing penetration of distributed generation within electricity networks leads to the requirement for cheap, integrated protection and control systems. To minimise cost, algorithms for the measurement of AC voltage and current waveforms can be implemented on a single microcontroller, which also carries out other protection and control tasks, including communication and data logging. This limits the frame rate of the major algorithms, although analogue-to-digital converters (ADCs) can be oversampled using peripheral control processors on suitable microcontrollers. Measurement algorithms also have to be tolerant of poor power quality, which may arise within grid-connected or islanded (e.g. emergency, battlefield or marine) power system scenarios. This study presents a 'Clarke-FLL hybrid' architecture, which combines a three-phase Clarke transformation measurement with a frequency-locked loop (FLL). This hybrid contains suitable algorithms for the measurement of frequency, amplitude and phase within dynamic three-phase AC power systems. The Clarke-FLL hybrid is shown to be robust and accurate, with harmonic content up to and above 28% total harmonic distortion (THD), and with the major algorithms executing at only 500 samples per second. This is achieved by careful optimisation and cascaded use of exact-time averaging techniques, which prove to be useful at all stages of the measurements: from DC bias removal through low-sample-rate Fourier analysis to sub-harmonic ripple removal. Platform-independent algorithms for three-phase nodal power flow analysis are benchmarked on three processors, including the Infineon TC1796 microcontroller, on which only 10% of the 2000 µs frame time is required, leaving the remainder free for other algorithms.
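The Clarke transformation front end of such a hybrid is standard: balanced three-phase samples (a, b, c) map to an (alpha, beta) pair whose magnitude is the phase amplitude and whose angle tracks the instantaneous phase, which a frequency-locked loop can then follow. A sketch using the amplitude-invariant form (signal amplitude, frequency and the 500 S/s rate are illustrative; this is not the paper's optimised implementation):

```python
import numpy as np

def clarke(a, b, c):
    """Amplitude-invariant Clarke transform of three-phase samples."""
    alpha = (2.0 / 3.0) * (a - 0.5 * b - 0.5 * c)
    beta = (2.0 / 3.0) * (np.sqrt(3.0) / 2.0) * (b - c)
    return alpha, beta

# Balanced 50 Hz test signal, amplitude 325 V, sampled at 500 S/s
t = np.arange(0.0, 0.04, 1.0 / 500.0)
A, f = 325.0, 50.0
va = A * np.cos(2 * np.pi * f * t)
vb = A * np.cos(2 * np.pi * f * t - 2 * np.pi / 3)
vc = A * np.cos(2 * np.pi * f * t + 2 * np.pi / 3)

alpha, beta = clarke(va, vb, vc)
amplitude = np.sqrt(alpha**2 + beta**2)   # constant, equal to A, per sample
phase = np.arctan2(beta, alpha)           # instantaneous phase for the FLL
```

Under distortion (harmonics, DC bias), the raw amplitude and phase ripple, which is where the cascaded exact-time averaging stages described above come in.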

    Delay distributions of slotted ALOHA and CSMA

    We derive the closed-form delay distributions of slotted ALOHA and nonpersistent carrier sense multiple access (CSMA) protocols under steady state. Three retransmission policies are analyzed. We find that under a binary exponential backoff retransmission policy, finite average delay and finite delay variance can be guaranteed for G<2S and G<4S/3, respectively, where G is the channel traffic and S is the channel throughput. As an example, in slotted ALOHA, S<(ln2)/2 and S<3(ln4-ln3)/4 are the operating ranges for finite first and second delay moments. In addition, the blocking probability and delay performance as a function of r_max (the maximum number of retransmissions allowed) are also derived.
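The slotted-ALOHA numbers follow from the classical throughput relation S = G·e^(-G): the condition G < 2S reduces to e^(-G) > 1/2, i.e. G < ln 2, and the boundary throughput is then (ln 2)/2; likewise G < 4S/3 gives G < ln(4/3) with boundary throughput 3(ln 4 - ln 3)/4. A quick numerical check:

```python
import math

def throughput(G):
    """Classical slotted-ALOHA throughput S = G * exp(-G)."""
    return G * math.exp(-G)

# Finite mean delay: G < 2S  <=>  G < ln 2; boundary S = (ln 2)/2
G1 = math.log(2.0)
print(throughput(G1), math.log(2.0) / 2.0)                  # both approx. 0.34657

# Finite delay variance: G < 4S/3  <=>  G < ln(4/3);
# boundary S = 3*(ln 4 - ln 3)/4
G2 = math.log(4.0 / 3.0)
print(throughput(G2), 3.0 * (math.log(4.0) - math.log(3.0)) / 4.0)
```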

    Requirements Prioritization Based on Benefit and Cost Prediction: An Agenda for Future Research

    In early phases of the software life cycle, requirements prioritization necessarily relies on the specified requirements and on predictions of the benefit and cost of individual requirements. This paper presents the results of a systematic review of the literature, which investigates how existing methods approach the problem of requirements prioritization based on benefit and cost. From this review, it derives a set of under-researched issues that warrant future efforts and sketches an agenda for future research in this area.
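A common baseline among the benefit/cost-based methods such a review surveys is ratio prioritization: rank each requirement by predicted benefit divided by predicted cost. A minimal sketch (the requirement names and estimates below are invented for illustration):

```python
# Hypothetical requirements with predicted benefit and cost estimates
# (e.g. elicited from stakeholders on a relative scale).
requirements = {
    "single sign-on": {"benefit": 8, "cost": 5},   # ratio 1.60
    "audit logging":  {"benefit": 6, "cost": 2},   # ratio 3.00
    "dark mode":      {"benefit": 3, "cost": 4},   # ratio 0.75
}

# Highest benefit-per-unit-cost first.
ranked = sorted(requirements.items(),
                key=lambda kv: kv[1]["benefit"] / kv[1]["cost"],
                reverse=True)

for name, est in ranked:
    print(name, est["benefit"] / est["cost"])
```

The review's point is precisely that such predictions are uncertain in early phases, so how the benefit and cost estimates are obtained matters at least as much as the ranking rule itself.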

    Ocean energy:the wave of the future

    This PowerPoint presentation discussed the developing technology of ocean energy, where designs have converged for tidal devices but not yet for wave devices. Today's technologies will help solve immediate needs, but we must work hard today at nurturing tomorrow's low-carbon technologies. Ocean energy represents one of the more difficult forms of renewable energy to harness. The UK leads internationally in the development of marine energy, but further development investment is needed to move the technology forward. Marine energy could supply up to 2 GW of UK electricity demand by 2020, and significantly more than this by 2050. The presentation briefly reviews the development of ocean energy and promising ocean-driven machines, their operating conditions, the suitability of different types of hydro turbines as power take-off options, recent international experience, and how the technology is developing.