8 research outputs found

    Algorithms for Image Analysis in Traffic Surveillance Systems

    The presence of various surveillance systems in many areas of modern society is indisputable, and video surveillance systems are the most perceptible among them. This thesis mainly describes a novel algorithm for vision-based estimation of parking lot occupancy, together with the closely related topic of pre-processing images captured under harsh conditions. The developed algorithms have practical application in parking guidance systems, which are becoming increasingly popular. One part of this work also contributes to the area of computer graphics known as direct volume rendering (DVR).
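    The abstract names the task but not the algorithm. As a minimal, hedged sketch of how a vision-based occupancy detector can work (this is not the algorithm developed in the thesis), the following classifies each annotated parking-spot region of a camera frame by edge density: a vacant spot of mostly uniform asphalt yields few edges, a parked car yields many. The spot coordinates, Canny thresholds, and the 0.08 density threshold are illustrative assumptions.

```python
# Hypothetical sketch: per-spot occupancy from edge density.
# Requires OpenCV (cv2) and NumPy; all constants are illustrative,
# not taken from the thesis.
import cv2
import numpy as np

def spot_occupied(frame_gray: np.ndarray, spot: tuple, threshold: float = 0.08) -> bool:
    """Classify one parking spot (x, y, w, h) as occupied or vacant.

    Real systems add shadow and illumination compensation, which
    this sketch omits.
    """
    x, y, w, h = spot
    roi = frame_gray[y:y + h, x:x + w]
    edges = cv2.Canny(roi, 50, 150)                 # binary edge map
    edge_density = np.count_nonzero(edges) / edges.size
    return edge_density > threshold

# Usage: evaluate every annotated spot in one camera frame.
frame = cv2.imread("parking_lot.png", cv2.IMREAD_GRAYSCALE)
spots = [(10, 20, 60, 120), (80, 20, 60, 120)]      # illustrative coordinates
occupancy = [spot_occupied(frame, s) for s in spots]
```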

    Analysis of Android Device-Based Solutions for Fall Detection

    Falls are a major cause of health and psychological problems, as well as hospitalization costs, among older adults. Thus, research on automatic Fall Detection Systems (FDSs) has received special attention from the research community during the last decade. In this area, the widespread popularity, decreasing price, computing capabilities, built-in sensors and multiplicity of wireless interfaces of Android-based devices (especially smartphones) have fostered the adoption of this technology to deploy wearable and inexpensive architectures for fall detection. This paper presents a critical and thorough analysis of existing fall detection systems that are based on Android devices. The review systematically classifies and compares the proposals in the literature according to different criteria, such as the system architecture, the employed sensors, the detection algorithm, and the response in case of a fall alarm. The study emphasizes the analysis of the evaluation methods employed to assess the effectiveness of the detection process. The review reveals the complete lack of a reference framework to validate and compare the proposals. In addition, the study shows that most research works do not evaluate the actual applicability of Android devices (with limited battery and computing resources) to fall detection solutions.
    Funding: Ministerio de Economía y Competitividad TEC2013-42711-
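    Although the surveyed systems differ in their detection algorithms, many of them threshold the accelerometer magnitude. The sketch below is a minimal, hedged illustration of such a detector, assuming the classic free-fall dip followed by an impact spike; the threshold values and window length are illustrative, not drawn from any surveyed system.

```python
# Minimal sketch of a threshold-based fall detector over a stream
# of accelerometer magnitudes (in g). All constants are assumed.
from collections import deque

FREE_FALL_G = 0.4   # dip below ~0.4 g suggests free fall (assumed value)
IMPACT_G = 2.5      # spike above ~2.5 g suggests impact (assumed value)
WINDOW = 50         # samples between dip and spike (~1 s at 50 Hz)

def detect_fall(samples):
    """Yield True once per detected fall in an iterable of magnitudes."""
    recent = deque(maxlen=WINDOW)
    for g in samples:
        # Fall pattern: a recent free-fall dip followed by an impact spike.
        if g > IMPACT_G and any(m < FREE_FALL_G for m in recent):
            yield True
            recent.clear()
        recent.append(g)

# Usage with a synthetic trace: rest, free fall, impact, rest.
trace = [1.0] * 20 + [0.2] * 10 + [3.1] + [1.0] * 20
print(any(detect_fall(trace)))  # True
```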

    Approximate computing: An integrated cross-layer framework

    A new design approach, called approximate computing (AxC), leverages the flexibility provided by intrinsic application resilience to realize hardware or software implementations that are more efficient in energy or performance. Approximate computing techniques forsake exact (numerical or Boolean) equivalence in the execution of some of an application's computations, while ensuring that the output quality is acceptable. While early efforts in approximate computing demonstrated great potential, they consist of ad hoc techniques applied to a very narrow set of applications, leaving in question the applicability of approximate computing in a broader context. The primary objective of this thesis is to develop an integrated cross-layer approach to approximate computing, and thereby to establish its applicability to a broader range of applications. The proposed framework comprises three key components: (i) at the circuit level, systematic approaches to design approximate circuits, i.e., circuits that realize a slightly modified function with improved efficiency; (ii) at the architecture level, the use of approximate circuits to build programmable approximate processors; and (iii) at the software level, methods to apply approximate computing to machine learning classifiers, an important class of applications used across the computing spectrum. Towards this end, the thesis extends the state of the art in approximate computing in the following directions.
    Synthesis of Approximate Circuits: First, the thesis proposes a rigorous framework for the automatic synthesis of approximate circuits, which are the hardware building blocks of approximate computing platforms. Designing approximate circuits involves making judicious changes to the function implemented by the circuit such that its hardware complexity is lowered without violating the specified quality constraint. Inspired by classical approaches to Boolean optimization in logic synthesis, the thesis proposes two synthesis tools, SALSA and SASIMI, that are general, i.e., applicable to any given circuit and quality specification. The framework is further extended to automatically design quality-configurable circuits, which are approximate circuits with the capability to reconfigure their quality at runtime. Over a wide range of arithmetic circuits, complex modules and complete datapaths, the circuits synthesized using the proposed framework demonstrate significant benefits in area and energy.
    Programmable AxC Processors: Next, the thesis extends approximate computing to the realm of programmable processors by introducing the concept of quality-programmable processors (QPPs). A key principle of QPPs is that the notion of quality is explicitly codified in their HW/SW interface, i.e., the instruction set. Instructions in the ISA are extended with quality fields, enabling software to specify the accuracy level that must be met during their execution. The micro-architecture is designed with hardware mechanisms to understand these quality specifications and translate them into energy savings. As a first embodiment of QPPs, the thesis presents QP-Vec, a quality-programmable 1D/2D vector processor containing a three-tiered hierarchy of processing elements. Based on an implementation of QP-Vec with 289 processing elements, energy benefits of up to 2.5X are demonstrated across a wide range of applications.
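    As a minimal sketch of the kind of accuracy/efficiency trade-off that approximate circuits and quality fields expose (assuming a simple bit-truncation scheme; this is not the SALSA/SASIMI synthesis flow nor the QP-Vec ISA), the following models a quality-configurable adder whose runtime knob selects how many low-order bits are ignored, which in hardware would correspond to a shorter, cheaper carry chain.

```python
# Sketch of a quality-configurable approximate adder: the knob
# selects how many low-order bits are truncated before the add.
# Illustrative only; bit widths and the error metric are assumed.
def approx_add(a: int, b: int, truncate_bits: int = 0) -> int:
    """Add two unsigned ints, ignoring the lowest `truncate_bits` bits."""
    mask = ~((1 << truncate_bits) - 1)
    return (a & mask) + (b & mask)

# Accuracy/efficiency trade-off across quality levels on random inputs.
import random
random.seed(0)
pairs = [(random.randrange(2**16), random.randrange(2**16)) for _ in range(1000)]
for t in (0, 2, 4, 8):
    mean_err = sum(abs((a + b) - approx_add(a, b, t)) for a, b in pairs) / len(pairs)
    print(f"truncate {t} bits: mean absolute error = {mean_err:.1f}")
```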
Software and Algorithms for AxC: Finally, the thesis addresses the problem of applying approximate computing to an important class of applications, viz. machine learning classifiers such as deep learning networks. To this end, the thesis proposes two approaches: AxNN and scalable effort classifiers. Both approaches leverage domain-specific insights to transform a given application into an energy-efficient approximate version that meets a specified application output quality. In the context of deep learning networks, AxNN adapts backpropagation to identify neurons that contribute less significantly to the network's accuracy, approximates these neurons (e.g., by using lower precision), and incrementally re-trains the network to mitigate the impact of the approximations on output quality. Scalable effort classifiers, on the other hand, leverage the heterogeneity in the inherent classification difficulty of inputs to dynamically modulate the effort expended by machine learning classifiers. This is achieved by building a chain of classifiers of progressively growing complexity (and accuracy) such that the number of stages used for classification scales with input difficulty; a sketch of this idea follows below. Scalable effort classifiers yield substantial energy benefits, as a majority of inputs in real-world datasets require very low effort. In summary, the concepts and techniques presented in this thesis broaden the applicability of approximate computing, taking a significant step towards bringing approximate computing to the mainstream. (Abstract shortened by ProQuest.)
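A minimal, hedged sketch of the scalable-effort idea, assuming toy stage models and a fixed confidence threshold; the thesis's actual classifiers and tuning are not reproduced here.

```python
# Sketch of a scalable-effort classifier cascade. Each stage returns
# (label, confidence); cheap stages run first, and an input only
# reaches the expensive stages when earlier ones are unsure.
# Stage models and the confidence threshold are assumptions.
from typing import Callable, List, Tuple

Stage = Callable[[list], Tuple[int, float]]  # input -> (label, confidence)

def classify(x: list, stages: List[Stage], threshold: float = 0.9) -> Tuple[int, int]:
    """Return (label, stages_used); later stages cost more energy."""
    for i, stage in enumerate(stages, start=1):
        label, confidence = stage(x)
        # Stop early when confident, or when the final stage is reached.
        if confidence >= threshold or i == len(stages):
            return label, i

# Toy stages: a linear rule (cheap) and a stand-in "big model" (costly).
cheap = lambda x: (int(sum(x) > 0), min(1.0, abs(sum(x)) / 3))
costly = lambda x: (int(x[0] > 0), 1.0)
print(classify([2.5, 1.0], [cheap, costly]))   # easy input: stage 1 suffices
print(classify([0.1, -0.2], [cheap, costly]))  # hard input: escalates to stage 2
```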

    Exploiting general-purpose background knowledge for automated schema matching

    The schema matching task is an integral part of the data integration process; it is usually the first step in integrating data. Schema matching is typically very complex and time-consuming, and it is therefore still largely carried out by humans. One reason for the low degree of automation is that schemas are often defined with deep background knowledge that is not itself present within the schemas. Overcoming the problem of missing background knowledge is a core challenge in automating the data integration process. In this dissertation, the task of matching semantic models, so-called ontologies, with the help of external background knowledge is investigated in depth in Part I. Throughout this thesis, the focus lies on large, general-purpose resources, since domain-specific resources are rarely available for most domains. Besides new knowledge resources, this thesis also explores new strategies to exploit such resources. A technical base for the development and comparison of matching systems is presented in Part II. The framework introduced here allows for simple and modularized matcher development (with background knowledge sources) and for extensive evaluations of matching systems. Among the largest structured sources of general-purpose background knowledge are knowledge graphs, which have grown significantly in size in recent years. However, exploiting such graphs is not trivial. In Part III, knowledge graph embeddings are explored, analyzed, and compared, and multiple improvements to existing approaches are presented. In Part IV, numerous concrete matching systems which exploit general-purpose background knowledge are presented (one such exploitation strategy is sketched after this abstract), and exploitation strategies and resources are analyzed and compared. This dissertation closes with a perspective on real-world applications.
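    As a minimal, hedged illustration of one such exploitation strategy, the sketch below proposes correspondences between two schemas' concepts by cosine similarity of their knowledge-graph embeddings. The tiny embedding table, the concept labels, and the 0.8 threshold are illustrative assumptions, not artifacts of the dissertation.

```python
# Sketch: schema matching via knowledge-graph embeddings. Concepts
# from two schemas are paired when their embedding vectors are close.
# The embedding table and the threshold are assumptions.
import numpy as np

embeddings = {                       # stand-in for trained KG embeddings
    "Author":    np.array([0.9, 0.1, 0.0]),
    "Writer":    np.array([0.85, 0.15, 0.05]),
    "Book":      np.array([0.1, 0.9, 0.2]),
    "Publisher": np.array([0.0, 0.3, 0.9]),
}

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def match(schema_a, schema_b, threshold=0.8):
    """Return concept pairs whose embeddings exceed the similarity threshold."""
    return [(a, b, round(cosine(embeddings[a], embeddings[b]), 3))
            for a in schema_a for b in schema_b
            if cosine(embeddings[a], embeddings[b]) >= threshold]

print(match(["Author", "Book"], ["Writer", "Publisher"]))
# [('Author', 'Writer', 0.996)] -- 'Book' finds no counterpart here
```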

    On User Privacy for Location-based Services

    This thesis investigates user privacy concerns associated with the use of location-based services. We begin by introducing various privacy schemes relevant to the use of location-based services. We introduce the notion of constraints, i.e. statements limiting the use and distribution of Location Information (LI), i.e. data providing information regarding a subject's location. Constraints can be securely bound to LI, and are designed to reduce threats to privacy by controlling its dissemination and use. The various types of constraint which may be required are also considered, and the issues and risks of using constraints are discussed, as are possible solutions to these hazards.
    To address some of the problems identified with the use of constraints, we introduce the notion of an LI Preference Authority (LIPA). A LIPA is a trusted party which can examine LI constraints and make decisions about LI distribution without revealing the constraints to the entity requesting the LI. This is achieved by encrypting both the LI and the constraints with a LIPA encryption key, ensuring that the LI is only revealed at the discretion of the LIPA (a sketch of this flow appears after this abstract).
    We further show how trusted computing can be used to enhance privacy for LI. We focus on how the mechanisms in the Trusted Computing Group specifications can be used to enable the holder of LI to verify the trustworthiness of a remote host before transferring the LI to that remote device. This provides greater assurance to end users that their expressed preferences for the handling of personal information will be respected.
    The model for the control of LI described in this thesis has close parallels to models controlling the dissemination and use of other personal information. In particular, Park and Sandhu have developed a general access control model intended to address issues such as Digital Rights Management, code authorisation, and the control of personal data. We show how our model for LI control fits into this general access control model.
    We present a generic service which allows a device to discover the location of other devices in ad hoc networks. The advantages of the service are discussed in several scenarios where reliance on an infrastructure such as GPS satellites or GSM cellular base stations is not needed. An outline of the technology needed to realise the service is given, along with a look at the security issues surrounding the use of this location discovery service. Finally, we provide conclusions and suggestions for future work.
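    As a minimal, hedged sketch of the LIPA flow described above (not the thesis's actual protocol or cipher suite), the following encrypts LI together with its constraints under a key held only by the LIPA, which then enforces the constraints before releasing the LI. The JSON bundle format, the example constraint, and the use of Fernet symmetric encryption are illustrative assumptions.

```python
# Sketch of the LIPA flow: LI and its constraints are encrypted
# together under the LIPA's key, so only the LIPA can decide whether
# to release the LI. Fernet stands in for whatever cipher a real
# deployment would use; the bundle format is assumed.
import json
from cryptography.fernet import Fernet

lipa_key = Fernet.generate_key()     # held by the LIPA only
lipa = Fernet(lipa_key)

# The subject binds constraints to LI and encrypts the bundle for the LIPA.
bundle = json.dumps({
    "li": {"lat": 51.4256, "lon": -0.5623},
    "constraints": {"allowed_requesters": ["navigation-service"]},
}).encode()
token = lipa.encrypt(bundle)

def lipa_release(token: bytes, requester: str):
    """LIPA decrypts, checks the constraints, and releases LI or refuses."""
    data = json.loads(lipa.decrypt(token))
    if requester in data["constraints"]["allowed_requesters"]:
        return data["li"]            # policy satisfied: release LI
    return None                      # constraints violated: withhold LI

print(lipa_release(token, "navigation-service"))  # LI released
print(lipa_release(token, "ad-network"))          # None: withheld
```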

    Quantifying Quality of Life

    - Describes technological methods and tools for objective and quantitative assessment of QoL
    - Appraises technology-enabled methods for incorporating QoL measurements in medicine
    - Highlights the success factors for adoption and scaling of technology-enabled methods

    This open access book presents the rise of technology-enabled methods and tools for objective, quantitative assessment of Quality of Life (QoL), following the WHOQOL model. It is an in-depth resource describing and examining state-of-the-art, minimally obtrusive, ubiquitous technologies. Highlighting the factors required for adoption and scaling of technology-enabled methods and tools for QoL assessment, it also describes how these technologies can be leveraged for behavior change, disease prevention, health management and long-term QoL enhancement in populations at large. Quantifying Quality of Life: Incorporating Daily Life into Medicine fills a gap in the field of QoL by providing assessment methods, techniques and tools. These assessments differ from current methods, which are mostly infrequent, subjective, qualitative, memory-based, context-poor and sparse. It is therefore an ideal resource for physicians, physicians in training, software and hardware developers, computer scientists, data scientists, behavioural scientists, entrepreneurs, and healthcare leaders and administrators seeking an up-to-date resource on this subject.


    Actas de las XXXIV Jornadas de Automática

    Postprint (published version)