
    Flare: Architecture for rapid and easy development of Internet-based Applications

    We propose Flare, an architecture that provides a structured and easy way to develop applications rapidly, in a multitude of languages, making use of online storage of data and management of users. The architecture eliminates the need, in most cases, for server-side programming, for creating and managing online database storage servers, for re-creating user management schemes, and for writing a lot of unnecessary code to access different web-based services through their APIs. A Web API provides a common API for various web-based services such as Blogger [2], Wordpress, MSN Live, Facebook [3], etc. Access libraries provided for major programming languages and platforms make it easy to develop applications using the Flare Web Service. We demonstrate a simple micro-blogging service developed using these APIs in two modes: a graphical browser-based mode and a command-line mode in C++, which provide two different interfaces to the same account and data. Comment: 4 pages, 5 figures
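    The abstract does not specify the Flare API itself, so the following is only a minimal sketch, assuming a hypothetical REST-style access library (the names FlareClient, login, put, and get are illustrative, not Flare's actual interface). It shows how a client application could rely on a common Web API for user management and online storage instead of writing server-side code.

# Hypothetical sketch of a client using a Flare-style access library.
# The class and method names are illustrative assumptions, not the
# actual Flare API, which the abstract does not specify.
import json
import urllib.request


class FlareClient:
    """Minimal REST-style wrapper: one common API for user management
    and online key-value storage, so the app needs no server-side code."""

    def __init__(self, base_url, app_id):
        self.base_url = base_url.rstrip("/")
        self.app_id = app_id
        self.token = None

    def _call(self, path, payload):
        req = urllib.request.Request(
            f"{self.base_url}/{path}",
            data=json.dumps(payload).encode(),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())

    def login(self, user, password):
        self.token = self._call("login", {"app": self.app_id,
                                          "user": user,
                                          "password": password})["token"]

    def put(self, key, value):
        return self._call("store/put", {"token": self.token,
                                        "key": key, "value": value})

    def get(self, key):
        return self._call("store/get", {"token": self.token,
                                        "key": key})["value"]


# A micro-blogging post, as in the demo application, could then reduce to:
# client = FlareClient("https://flare.example.org/api", app_id="microblog")
# client.login("alice", "secret")
# client.put("post/2011-01-01T12:00", "Hello from the command line!")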

    Exploiting skewness to build an optimal hedge fund with a currency overlay

    This paper documents an investigation into the use of portfolio selection methods to construct a hedge fund with a currency overlay. The fund, which is based on a number of international stock and bond market indices and is constructed from the perspective of a Sterling investor, allows the individual exposures in the currency overlay to be optimally determined. As well as using traditional mean-variance optimisation, the paper constructs the hedge funds using portfolio selection methods that incorporate skewness in the optimisation process. These methods are based on the multivariate skew-normal distribution, which motivates the use of a linear skewness shock. An extension to Stein's lemma makes it possible to explore the mean-variance-skewness efficient surface without needing to be concerned with the precise form of an individual investor's utility function. The results suggest that it is possible to use mean-variance optimisation methods to build a hedge fund based on the assets and return forecasts described. The results also suggest that the inclusion of a skewness component in the optimisation is beneficial: in many of the cases reported, the skewness term contributes to an improvement in performance over and above that given by mean-variance methods.
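    As a rough illustration of a mean-variance-skewness optimisation (not the paper's skew-normal and Stein's-lemma formulation), the sketch below maximises expected return penalised by variance and rewarded by coskewness; the trade-off parameters lambda_ and gamma_, the unit-sum constraint, and the weight bounds are assumptions.

# Illustrative mean-variance-skewness portfolio sketch (assumed objective,
# not the paper's exact method).
import numpy as np
from scipy.optimize import minimize


def mvs_weights(mu, cov, coskew, lambda_=2.0, gamma_=1.0):
    """mu: (n,) expected returns; cov: (n, n) covariance matrix;
    coskew: (n, n, n) coskewness tensor, e.g. E[(r-mu)_i (r-mu)_j (r-mu)_k]."""
    n = len(mu)

    def neg_objective(w):
        port_mean = w @ mu
        port_var = w @ cov @ w
        port_skew = np.einsum("i,j,k,ijk->", w, w, w, coskew)
        return -(port_mean - lambda_ * port_var + gamma_ * port_skew)

    cons = ({"type": "eq", "fun": lambda w: w.sum() - 1.0},)
    bounds = [(-1.0, 1.0)] * n          # allow short positions in the overlay
    w0 = np.full(n, 1.0 / n)
    return minimize(neg_objective, w0, bounds=bounds, constraints=cons).x


# Toy usage: three assets, mild positive skewness in the first asset.
mu = np.array([0.06, 0.04, 0.05])
cov = np.diag([0.04, 0.02, 0.03])
coskew = np.zeros((3, 3, 3))
coskew[0, 0, 0] = 0.002
print(mvs_weights(mu, cov, coskew).round(3))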

    Accelerating Curriculum Design: A Love It, Don't Leave It Approach to Creative Process and Idealized Design

    Purpose and Background: The Institute of Medicine’s (IOM) report (2010) on the “Future of Nursing” emphasized the need for nurses to lead health care change. One of the key messages in this report is a call to action for nursing schools to re-envision nursing education that focuses on a population-based perspective and emerging roles for nurses across the care continuum. With an evolving focus on primary and community-based care rather than acute care, and recognition of the importance of coordinating care and managing transitions across providers and settings of care, registered nurses now and in the future will need to be prepared with a breadth of knowledge, skills, and competencies. In response, the Jefferson College of Nursing (JCN) embarked on the ambitious task of designing a new 21st century baccalaureate nursing curriculum over a 13-month period. Nursing curriculum design varies widely and can span the course of two to five years. To reduce the lengthy process and ensure faculty commitment, JCN leadership selected a core team of nine faculty members to navigate the full faculty through the design of the curriculum. Each team member was assigned three teaching credits for curriculum development and design. Although a 13-month turnaround time for curriculum design is unprecedented, what is most unique about JCN’s initiative is that it began with a charge of developing an idealized curriculum from a blank slate. To ensure that the curriculum reflected multiple perspectives, the team recruited six stakeholders including a nurse practice partner, health care consumer, community leader, alumnus, current student, and adjunct clinical faculty. Poster presented at: NLN Education Summit 2015: Bridging Practice and Education, Las Vegas, Nevada, September 30-October 2, 2015.

    Analysis of Neighbourhoods in Multi-layered Dynamic Social Networks

    Social networks existing among employees, customers, or users of various IT systems have become a research area of growing importance. A social network consists of nodes (social entities) and edges linking pairs of nodes. In regular, one-layered social networks, two nodes, i.e. people, are connected by a single edge, whereas in multi-layered social networks there may be many links of different types for a pair of nodes. Nowadays, the data about people and their interactions that exists in all social media provides information about many different types of relationships within one network. Analysing this data, one can obtain knowledge not only about the structure and characteristics of the network but also gain understanding of the semantics of human relations. Are they direct or not? Do people tend to sustain single or multiple relations with a given person? Which types of communication are the most important to them? Answers to these and other questions enable us to draw conclusions about the semantics of human interactions. Unfortunately, most of the methods used for social network analysis (SNA) may be applied only to one-layered social networks. Thus, some new structural measures for multi-layered social networks are proposed in the paper, in particular: the cross-layer clustering coefficient, cross-layer degree centrality, and various versions of multi-layered degree centralities. The authors also investigated the dynamics of multi-layered neighbourhoods for five different layers within the social network. An evaluation of the presented concepts on a real-world dataset is presented. The measures proposed in the paper may be used directly in various methods for collective classification, in which nodes are assigned labels according to their structural input features. Comment: 16 pages, International Journal of Computational Intelligence Systems
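    As a hedged illustration of one such structural measure, the sketch below computes a cross-layer degree centrality under an assumed definition: a node's score is the number of distinct neighbours it has on any layer, normalised by the number of other nodes. The paper's own definitions may differ in detail.

# Assumed variant of cross-layer degree centrality for a multi-layered network.
from collections import defaultdict


def cross_layer_degree(layers, nodes):
    """layers: dict layer_name -> set of undirected edges (u, v);
    nodes: iterable of all node identifiers in the network."""
    neighbours = defaultdict(set)
    for edges in layers.values():
        for u, v in edges:
            neighbours[u].add(v)
            neighbours[v].add(u)
    n = len(set(nodes))
    # Distinct neighbours across all layers, normalised by the n-1 other nodes.
    return {v: len(neighbours[v]) / (n - 1) for v in nodes}


layers = {
    "email":   {("a", "b"), ("a", "c")},
    "phone":   {("a", "b"), ("b", "c")},
    "comment": {("c", "d")},
}
print(cross_layer_degree(layers, ["a", "b", "c", "d"]))
# 'a' has distinct neighbours {b, c} across layers -> score 2/3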

    Developing a Business Case for the Care Coordination and Transition Management Model: Need, Metrics, and Measures

    In this descriptive qualitative study, nurse and healthcare leaders' experiences, perceptions of care coordination and transition management (CCTM®), and insights as to how to foster adoption of the CCTM RN role in nursing education, practice across the continuum, and policy were explored. Twenty-five barriers to recognition and adoption of CCTM RN practice across the continuum were identified and categorized. Implications of these findings, recommendations for adoption of CCTM RN practice across the care continuum, and strategies for reimbursement policies are discussed.

    Species-specific forest variable estimation using non-parametric modeling of multi-spectral photogrammetric point cloud data

    Recent developments in software for automatic photogrammetric processing of multispectral aerial imagery, and the growing nation-wide availability of Digital Elevation Model (DEM) data, are about to revolutionize data capture for forest management planning in Scandinavia. Using only already available aerial imagery and ALS-assessed DEM data, raster estimates of the forest variables mean tree height, basal area, total stem volume, and species-specific stem volumes were produced and evaluated. The study was conducted at a coniferous hemi-boreal test site in southern Sweden (lat. 58° N, long. 13° E). Digital aerial images from the Zeiss/Intergraph Digital Mapping Camera system were used to produce 3D point-cloud data with spectral information. Metrics were calculated for 696 field plots (10 m radius) from the point-cloud data and used in the k-MSN method to estimate forest variables. For these stands, tree height ranged from 1.4 to 33.0 m (mean 18.1 m), stem volume from 0 to 829 m³ ha⁻¹ (mean 249 m³ ha⁻¹), and basal area from 0 to 62.2 m² ha⁻¹ (mean 26.1 m² ha⁻¹), with a mean stand size of 2.8 ha. Estimates made using digital aerial images corresponding to the standard acquisition of the Swedish National Land Survey (Lantmäteriet) showed RMSEs (in percent of the surveyed stand mean) of 7.5% for tree height, 11.4% for basal area, 13.2% for total stem volume, 90.6% for pine stem volume, 26.4% for spruce stem volume, and 72.6% for deciduous stem volume. The results imply that photogrammetric matching of digital aerial images has significant potential for operational use in forestry.
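    A simplified sketch of the estimation step is given below: it substitutes an ordinary k-nearest-neighbour regressor for the k-MSN imputation actually used (so k-MSN's canonical-correlation-based distance is not reproduced), and it uses randomly generated placeholder data in place of the plot metrics and field measurements.

# Simplified stand-in for k-MSN: multi-output k-NN regression from
# point-cloud metrics to field-measured forest variables.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor
from sklearn.preprocessing import StandardScaler

# X: per-plot metrics from the photogrammetric point cloud (height percentiles,
# densities, spectral means); Y: field-measured variables (height, basal area,
# total and species-specific volumes). Placeholder values only.
rng = np.random.default_rng(0)
X = rng.normal(size=(696, 12))
Y = rng.normal(size=(696, 4))

scaler = StandardScaler().fit(X)
model = KNeighborsRegressor(n_neighbors=5, weights="distance")
model.fit(scaler.transform(X), Y)

# Wall-to-wall raster estimation: apply the model to the metrics of every cell.
cell_metrics = rng.normal(size=(1000, 12))
raster_estimates = model.predict(scaler.transform(cell_metrics))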

    I/O Schedulers for Proportionality and Stability on Flash-Based SSDs in Multi-Tenant Environments

    The use of flash-based Solid State Drives (SSDs) has expanded rapidly into the cloud computing environment. In cloud computing, ensuring the service level objective (SLO) of each server is the major criterion in designing a system. In particular, eliminating performance interference among virtual machines (VMs) on shared storage is a key challenge. However, studies on SSD performance that guarantee SLOs in such environments are limited. In this paper, we present an analysis of I/O behavior for a shared SSD used as storage, in terms of proportionality and stability. We show that the performance SLOs of SSD-based storage systems shared by VMs or tasks are not satisfied. We present and analyze the reasons behind this unexpected behavior by examining SSD components such as channels, the DRAM buffer, and Native Command Queuing (NCQ). Based on our analysis and findings, we introduce two novel SSD-aware host-level I/O schedulers on Linux, called A+CFQ and H+BFQ. Through experiments on Linux, we analyze I/O proportionality and stability in multi-tenant environments. In addition, through experiments using real workloads, we analyze the performance interference between workloads on a shared SSD. We then show that the proposed I/O schedulers almost eliminate the interference effect seen in CFQ and BFQ, while still providing I/O proportionality and stability for various I/O-weighted scenarios.
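    The abstract does not give the paper's exact proportionality and stability metrics, so the sketch below uses assumed definitions: proportionality as the ratio of each tenant's throughput share to its weight share, and stability as the coefficient of variation of a tenant's throughput over time.

# Assumed proportionality and stability measures for tenants sharing an SSD.
import statistics


def proportionality(weights, throughputs):
    """Ratio of each tenant's normalised throughput share to its weight share;
    1.0 for every tenant means perfectly proportional service."""
    w_total = sum(weights)
    t_total = sum(throughputs)
    return [(t / t_total) / (w / w_total) for w, t in zip(weights, throughputs)]


def stability(samples):
    """Coefficient of variation of one tenant's throughput over time;
    lower values mean more stable service."""
    return statistics.pstdev(samples) / statistics.mean(samples)


# Two VMs with scheduler weights 800 and 200 sharing one SSD:
print(proportionality([800, 200], [350.0, 150.0]))   # [0.875, 1.5] -> skewed
print(stability([120.0, 118.0, 35.0, 122.0]))        # throughput spike -> unstable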