18 research outputs found

    Origin and insertion of the medial patellofemoral ligament: a systematic review of anatomy.

    PURPOSE: The medial patellofemoral ligament (MPFL) is the major medial soft-tissue stabiliser of the patella, originating from the medial femoral condyle and inserting onto the medial patella. The exact position reported in the literature varies. Understanding the true anatomical origin and insertion of the MPFL is critical to successful reconstruction. The purpose of this systematic review was to determine these locations. METHODS: A systematic search of published (AMED, CINAHL, MEDLINE, EMBASE, PubMed and Cochrane Library) and unpublished literature databases was conducted from their inception to 3 February 2016. All papers investigating the anatomy of the MPFL were eligible. Methodological quality was assessed using a modified CASP tool. A narrative analysis approach was adopted to synthesise the findings. RESULTS: After screening and review of 2045 papers, a total of 67 studies investigating the relevant anatomy were included. From these, the origin appears to be an area on the medial femoral condyle rather than, as previously reported, a single point. The weighted average length was 56 mm, with an 'hourglass' shape fanning out at both ends of the ligament. CONCLUSION: The MPFL is an hourglass-shaped structure running from a triangular space between the adductor tubercle, the medial femoral epicondyle and the gastrocnemius tubercle, and inserting onto the superomedial aspect of the patella. Awareness of this anatomy is critical for assessment, anatomical repair and successful surgical patellar stabilisation. LEVEL OF EVIDENCE: Systematic review of anatomical dissections and imaging studies, Level IV.

    TopoCad – A unified system for geospatial data and services

    "E-government" is a leading trend in public sector activities in recent years. The Survey of Israel set as a vision to provide all of its services and datasets online. The TopoCad system is the latest software tool developed in order to unify a number of services and databases into one on-line and user friendly system. The TopoCad system is based on Web 1.0 technology; hence the customer is only a consumer of data. All data and services are accessible for the surveyors and geo-information professional in an easy and comfortable way. The future lies in Web 2.0 and Web 3.0 technologies through which professionals can upload their own data for quality control and future assimilation with the national database. A key issue in the development of this complex system was to implement a simple and easy (comfortable) user experience (UX). The user interface employs natural language dialog box in order to understand the user requirements. The system then links spatial data with alpha-numeric data in a flawless manner. The operation of the TopoCad requires no user guide or training. It is intuitive and self-taught. The system utilizes semantic engines and machine understanding technologies to link records from diverse databases in a meaningful way. Thus, the next generation of TopoCad will include five main modules: users and projects information, coordinates transformations and calculations services, geospatial data quality control, linking governmental systems and databases, smart forms and applications. The article describes the first stage of the TopoCad system and gives an overview of its future development

    UPDATING NATIONAL TOPOGRAPHIC DATA BASE USING CHANGE DETECTION METHODS

    The traditional method of updating a topographic database on a national scale is a complex process that requires human resources, time and the development of specialized procedures. In many National Mapping and Cadaster Agencies (NMCAs), the updating cycle takes a few years. Today, reality is dynamic and changes occur every day; users therefore expect the existing database to portray the current reality. Global mapping projects based on community volunteers, such as OSM, update their databases every day through crowdsourcing. In order to fulfil users' requirements for rapid updating, a new methodology that maps major areas of interest while preserving the associated interpretation information should be developed. Until recently, automated processes did not yield satisfactory results; a typical process included comparing images from different periods. Success rates in identifying objects were low, and most results were accompanied by a high percentage of false alarms. As a result, the automatic process required significant editorial work that made it uneconomical. In recent years, developments in mapping technologies and advances in image-processing and computer-vision algorithms, together with the introduction of digital aerial cameras with a NIR band and Very High Resolution satellites, have made a cost-effective automated process feasible. The automatic process is based on high-resolution Digital Surface Model (DSM) analysis, multispectral (MS) classification, MS segmentation, object analysis and shape-forming algorithms. This article reviews the results of a novel change detection methodology as a first step towards updating the National Topographic Database (NTDB) at the Survey of Israel.
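    As a rough illustration of the DSM-analysis step, the sketch below differences two co-registered Digital Surface Models and flags cells whose elevation changed beyond a threshold. It is a minimal example under assumed inputs (the file names and the 2.5 m threshold are invented), not the Survey of Israel pipeline, and uses the open-source rasterio and numpy packages.

```python
# Minimal sketch of DSM-based change detection: difference two co-registered
# Digital Surface Models and flag significant elevation changes.
# Illustrative only; file names and the threshold are assumptions.
import numpy as np
import rasterio

def detect_dsm_changes(dsm_old_path, dsm_new_path, threshold=2.5):
    with rasterio.open(dsm_old_path) as old, rasterio.open(dsm_new_path) as new:
        dsm_old = old.read(1).astype("float32")
        dsm_new = new.read(1).astype("float32")

    # Positive differences suggest new construction, negative ones demolition.
    diff = dsm_new - dsm_old
    change_mask = np.abs(diff) > threshold  # boolean raster of candidate changes
    return diff, change_mask

if __name__ == "__main__":
    diff, mask = detect_dsm_changes("dsm_2014.tif", "dsm_2016.tif")
    print(f"Candidate change pixels: {int(mask.sum())}")
```

    In a full workflow these candidate cells would then be passed to the MS classification, segmentation and object-analysis stages mentioned in the abstract to filter false alarms.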

    SEMI AUTOMATED LAND COVER LAYER UPDATING PROCESS UTILIZING SPECTRAL ANALYSIS AND GIS DATA FUSION

    Technological improvements in mass data gathering and analysis in recent years have influenced the traditional methods of updating and forming the national topographic database. They have brought a significant increase in the number of use cases and in the demand for detailed geo-information. Processes intended to serve as alternatives to traditional data collection methods have been developed in many National Mapping and Cadaster Agencies. There has been significant progress in semi-automated methodologies aimed at facilitating the updating of a national topographic geodatabase, and implementing them is expected to allow a considerable reduction in updating costs and operation times. Our previous work focused on automatic building extraction (Keinan, Zilberstein et al., 2015). Before semi-automatic updating methods, it was common for the interpreter's identification to be as detailed as possible so that the database would ultimately be as reliable as possible. When using semi-automatic updating methodologies, the ability to insert knowledge based on human insight is limited. Our motivation was therefore to reduce this gap by allowing end-users to add their own data inputs to the basic geometric database. In this article, we present a simple land cover database updating method which combines insights extracted from the analyzed image with given vector-layer spatial data. The main stages of the proposed practice are multispectral image segmentation and supervised classification, together with geometric fusion of the given vector data, while keeping the required shape editing work to a minimum. All coding was done utilizing open-source software components.
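    A minimal sketch of the segmentation-plus-classification idea described above is given below. It assumes the open-source scikit-image and scikit-learn packages and invented variable names; it illustrates the general technique, not the authors' code.

```python
# Sketch of the land cover updating idea: segment a multispectral image,
# classify each segment with a supervised model, and return a class map
# that can later be vectorized and fused with existing GIS vector layers.
# Library choices, parameters and the integer label encoding are assumptions.
import numpy as np
from skimage.segmentation import slic
from sklearn.ensemble import RandomForestClassifier

def classify_segments(image, training_pixels, training_labels, n_segments=500):
    """image: (rows, cols, bands) multispectral array; labels are integer codes."""
    # 1. Multispectral segmentation into spectrally homogeneous regions.
    segments = slic(image, n_segments=n_segments, compactness=10.0,
                    channel_axis=-1, start_label=0)

    # 2. Supervised classification model trained on labelled pixel samples.
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(training_pixels, training_labels)

    # 3. Assign each segment the class predicted for its mean spectral signature.
    class_map = np.zeros(segments.shape, dtype=np.int32)
    for seg_id in range(segments.max() + 1):
        mask = segments == seg_id
        mean_spectrum = image[mask].mean(axis=0).reshape(1, -1)
        class_map[mask] = clf.predict(mean_spectrum)[0]
    return class_map
```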

    Steps towards 3D cadastre and ISO 19152 (LADM) in Israel

    This paper presents the results of 3D Cadastre and LADM (Land Administration Domain Model) investigations in the context of a possible future renewal of the cadastral database at the Survey of Israel. The two topics of 3D cadastres and LADM are highly related, and this paper therefore covers both aspects. After recapping past 3D cadastre investigations in Israel and analyzing the current Israeli cadastral procedures, an initial step towards a 3D LADM country profile is presented, together with recommendations for realizing the inclusion of 3D in the future registration workflow.
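    To give a feel for what an LADM country profile builds on, the toy sketch below models a few core ISO 19152 classes (LA_Party, LA_BAUnit, LA_SpatialUnit and a right from the LA_RRR family) as Python dataclasses with a hypothetical 3D boundary representation. It illustrates the standard's structure, not the Israeli country profile discussed in the paper.

```python
# Toy illustration of core ISO 19152 (LADM) classes as Python dataclasses,
# extended with a hypothetical 3D boundary attribute. Not the Israeli
# country profile; field names beyond the LADM class names are assumptions.
from dataclasses import dataclass, field
from typing import List, Tuple

Point3D = Tuple[float, float, float]  # (x, y, z) in a projected CRS

@dataclass
class LASpatialUnit:              # LADM: LA_SpatialUnit
    su_id: str
    # A 3D spatial unit can be bounded by a closed set of planar faces.
    boundary_faces: List[List[Point3D]] = field(default_factory=list)

@dataclass
class LAParty:                    # LADM: LA_Party
    party_id: str
    name: str

@dataclass
class LABAUnit:                   # LADM: LA_BAUnit (basic administrative unit)
    ba_id: str
    spatial_units: List[LASpatialUnit] = field(default_factory=list)

@dataclass
class LARight:                    # LADM: a right from the LA_RRR family
    right_type: str               # e.g. "ownership"
    party: LAParty
    ba_unit: LABAUnit
```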