
    Towards an aims-led curriculum


    Information Literacy Opportunities within the Discovery Tool Environment

    Discovery tools such as Primo, EBSCO Discovery Service, Summon, and WorldCat Local aim to make scholarly research more intuitive for students, in part because of their single interface for searching across multiple platforms, including the library, fee-based databases, and unique digital collections. Discovery tools are in sync with the way many undergraduates look for information because they offer a more “Google-like” experience, in contrast with previous methods of research that required first knowing which database to use, then searching each one differently according to its specifications. However, broad searches across multiple formats with different systems of controlled vocabulary force instructors to rethink the way they teach students to find information. This article establishes best practices to help librarians develop instruction classes in which students conduct research using a discovery tool.

    UR5 Robot Guided by Machine Vision (Robot UR5 guiado por visión artificial)

    The aim of this project is to use a collaborative robot arm with six joints and a wide range of flexibility to feed a conveyor belt. Collaborative robot arms are designed to mimic the range of motion of a human arm and can be used safely within a human workspace. The main challenge of this project is that the robot does not know, at any point in the program, the size, position, or orientation of the pieces it has to pick. This is solved with a vision system that guides the robot to locate and pick up the pieces from the pickup zone and then place them at the start of the conveyor belt. However, the conveyor belt and the base of the robot may not always keep the same offset, so the robot needs a reference point on the conveyor belt from which to transform and interpolate all the waypoints of the trajectories used to leave a piece on it. To accomplish this, a URCap is used: the Wrist Camera from Robotiq, which requires a vision server running in the controller, with the licence key on a USB drive. To preserve the collaborative nature of the robot, a collaborative gripper is used, in which the force and the speed of the opening and closing operations can be set. In conclusion, the Wrist Camera can detect any taught object very quickly, and it also detects dropped pieces and orients them correctly so the robot can pick them up. Communication between the PLC and the robot works in real time, and the robot tries to keep the PLC always busy.
    Departamento de Química Analítica. Grado en Ingeniería en Electrónica Industrial y Automática.
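The reference-point idea described above — re-expressing placement waypoints relative to a detected conveyor feature so the program survives changes in the robot/conveyor offset — can be sketched as a planar rigid transform. This is a minimal illustration, not the actual URCap or URScript API; the function name and the 2D pose representation are assumptions made for the sketch.

```python
import math

def transform_waypoints(waypoints, ref_x, ref_y, ref_theta):
    """Re-express waypoints defined relative to the conveyor reference
    point in the robot base frame, given that reference point's pose
    (ref_x, ref_y, ref_theta) as seen from the base frame."""
    cos_t, sin_t = math.cos(ref_theta), math.sin(ref_theta)
    result = []
    for wx, wy in waypoints:
        # 2D rigid transform: rotate by ref_theta, then translate.
        bx = ref_x + cos_t * wx - sin_t * wy
        by = ref_y + sin_t * wx + cos_t * wy
        result.append((bx, by))
    return result

# Drop-off point 0.1 m along the conveyor; conveyor reference point at
# (0.5, 0.2) in the base frame, rotated 90 degrees relative to it:
pts = transform_waypoints([(0.1, 0.0)], 0.5, 0.2, math.pi / 2)
print(pts)
```

Because every placement waypoint is stored relative to the conveyor reference point, re-teaching only that single point after the conveyor moves re-localises the whole trajectory.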

    Quality-Driven Disorder Handling for M-way Sliding Window Stream Joins

    Sliding window join is one of the most important operators for stream applications. To produce high-quality join results, a stream processing system must deal with the disorder that is ubiquitous in input streams, caused by network delay, asynchronous source clocks, etc. Disorder handling involves an inevitable tradeoff between the latency and the quality of the produced join results. To meet the differing requirements of stream applications, it is desirable to provide a user-configurable result-latency vs. result-quality tradeoff. Existing disorder handling approaches either do not provide such configurability or support only user-specified latency constraints. In this work, we advocate the idea of quality-driven disorder handling and propose a buffer-based disorder handling approach for sliding window joins, which minimizes the sizes of the input-sorting buffers, and thus the result latency, while respecting user-specified result-quality requirements. The core of our approach is an analytical model that directly captures the relationship between the sizes of the input buffers and the produced result quality. Our approach is generic: it supports m-way sliding window joins with arbitrary join conditions. Experiments on real-world and synthetic datasets show that, compared to the state of the art, our approach can reduce the result latency incurred by disorder handling by up to 95% while providing the same level of result quality.
    Comment: 12 pages, 11 figures, IEEE ICDE 201
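The buffer-based mechanism the abstract describes can be illustrated with a fixed-size input-sorting buffer: out-of-order tuples are held in a min-heap keyed on timestamp and released in order once the buffer overflows, so a larger buffer means higher latency but fewer tuples arriving too late to be joined in order. This is a simplified sketch of the general technique, not the paper's analytical model or its actual buffer-sizing policy.

```python
import heapq

class SortingBuffer:
    """A size-k input buffer that reorders a disordered stream.
    A larger k raises result latency but lets more late tuples be
    reordered -- the latency/quality tradeoff being tuned."""
    def __init__(self, k):
        self.k = k
        self.heap = []        # min-heap on timestamp
        self.watermark = -1   # largest timestamp already emitted

    def insert(self, ts, value):
        emitted, late = [], []
        if ts <= self.watermark:
            late.append((ts, value))  # too late: emitting would break order
        else:
            heapq.heappush(self.heap, (ts, value))
        while len(self.heap) > self.k:
            tup = heapq.heappop(self.heap)
            self.watermark = tup[0]
            emitted.append(tup)       # released in timestamp order
        return emitted, late

buf = SortingBuffer(k=2)
out = []
for ts, v in [(1, 'a'), (3, 'c'), (2, 'b'), (5, 'e'), (4, 'd')]:
    emitted, late = buf.insert(ts, v)
    out.extend(emitted)
print(out)
```

In an m-way join, one such buffer would sit in front of each input stream, and the join operator would only consume tuples the buffers have released.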

    Eligible assets, investment strategies and investor protection in light of modern portfolio theory: Towards a risk-based approach for UCITS. ECMI Policy Briefs No. 2, 18 September 2006

    As the European Commission is currently preparing its White Paper on the enhancement of the EU framework for investment funds (scheduled for November 2006), now is a good time to reflect on whether the UCITS framework needs a radical overhaul if the regulatory landscape is to adapt to the reality of market evolution. European Capital Markets Institute (ECMI) Head of Research Jean-Pierre Casey contributes to this important debate with the second ECMI Policy Brief, in which he argues that UCITS ought to move to a risk-based approach rather than relying on the product approach. Casey concludes that both the product approach, which necessitates defining eligible assets – a laborious exercise – and the investment restrictions that form the other cornerstone of investor protection in UCITS are outdated and out of sync with the lessons of modern portfolio theory. ECMI is an independent research body specialising in research on capital markets; it is managed by CEPS staff.

    When Learning Counts: Rethinking Licenses for School Leaders

    Recommends restructuring state licensing systems to focus on the skills and knowledge leaders need to improve learning, and better aligning licenses with the current job demands on principals.

    The Maintenance of Virtue Over Time: Notes on Changing Household Lives in Post-Disaster Nepal

    Although it is banal to say the series of earthquakes that hit Nepal in Spring 2015 will radically change the country, what this change will consist of still remains undetermined. As many earthquake victims learn to make do in broken houses, tents, or corrugated tin structures, post-earthquake Nepal seems held within a frustrating stasis, wherein temporary hardship is often impossible to distinguish from lasting consequence. Yet this sense of stasis is in part misleading. While the act of building remains slow, households who lost their homes have been scrambling to rethink their financial futures in order to afford reconstruction. In doing so, many earthquake victims have begun to enact changes in their households, accelerating divisions and unearthing tensions that had hitherto been allowed to lie dormant. Revitalizing Meyer Fortes’ classic discussions of amity and the development cycle, I introduce the stories of three informants who attempt to maintain the virtues of kinship in spite of the financial pressures they bear. I also explore how their actions reflect a reckoning between legal ownership and everyday household ownership practices – a reckoning that has affected how household members interact, often in unpredictable ways.

    Progger: an efficient, tamper-evident kernel-space logger for cloud data provenance tracking

    Cloud data provenance, or "what has happened to my data in the cloud", is a critical data security component which addresses pressing data accountability and data governance issues in cloud computing systems. In this paper, we present Progger (Provenance Logger), a kernel-space logger which potentially empowers all cloud stakeholders to trace their data. Logging from the kernel space enables security analysts to collect provenance from the lowest possible level of atomic data actions, and allows several higher-level tools to be built for effective end-to-end tracking of data provenance. Within the last few years, an increasing number of kernel-space provenance tools have been proposed, but they face several critical data security and integrity problems. Limitations of these prior tools include (1) the inability to provide log tamper-evidence and to prevent fake/manual entries, (2) the lack of accurate and granular timestamp synchronisation across several machines, (3) log space requirements and growth, and (4) inefficient logging of root usage of the system. Progger resolves all these critical issues and, as such, provides high assurance of data security and data activity auditing. With this in mind, the paper discusses these elements of high-assurance cloud data provenance, describes the design of Progger and its efficiency, and presents compelling results which pave the way for Progger to become a foundation tool for data activity tracking across cloud systems.
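Log tamper-evidence, the first limitation listed above, is commonly achieved by hash-chaining entries: each entry's digest covers the previous entry's digest, so altering or deleting any earlier record invalidates every later one. The sketch below illustrates that general technique only; it is not Progger's actual mechanism, and the entry format and function names are assumptions made for the example.

```python
import hashlib

GENESIS = "0" * 64  # digest used before the first entry

def append_entry(log, message):
    """Append an entry whose digest chains over the previous digest."""
    prev = log[-1][1] if log else GENESIS
    digest = hashlib.sha256((prev + message).encode()).hexdigest()
    log.append((message, digest))

def verify(log):
    """Recompute the chain; any altered entry breaks verification."""
    prev = GENESIS
    for message, digest in log:
        if hashlib.sha256((prev + message).encode()).hexdigest() != digest:
            return False
        prev = digest
    return True

log = []
append_entry(log, "open /etc/passwd by uid=0")
append_entry(log, "read 512 bytes")
ok_before = verify(log)
log[0] = ("open /etc/shadow by uid=0", log[0][1])  # tamper with entry 0
ok_after = verify(log)
print(ok_before, ok_after)
```

A kernel-space logger would additionally need to protect the chain head (e.g. by shipping digests off-host), since an attacker who can rewrite the whole chain can re-chain it consistently.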