54 research outputs found

    Rupture of the ilio-psoas tendon after a total hip arthroplasty: an unusual cause of radio-lucency of the lesser trochanter simulating a malignancy

    Avulsion fracture or progressive radiolucency of the lesser trochanter is considered a pathognomonic finding in patients with malignancies. Although surgical release of the iliopsoas tendon may be required during a total hip arthroplasty (THA), there is no literature on spontaneous rupture of the iliopsoas tendon after a THA causing significant functional impairment. We report such a case, in which progressive radiolucency of the lesser trochanter developed over six years after a THA, simulating a malignancy. The diagnosis was confirmed by MRI. Because of the chronic nature of the lesion, gross retraction of the tendon into the pelvis, and the low demand of our patient, he was treated with physiotherapy and gait training. Injury to the iliopsoas tendon can occur at various steps of a THA, and extreme care should be taken to avoid it. Prevention during surgery is preferable, as there are no reports of repair in the THA setting. This condition should be considered in patients who present with progressive radiolucency of the lesser trochanter, especially in the setting of hip or pelvic surgery. Awareness and earlier recognition of its signs and symptoms will aid diagnosis and direct appropriate management.

    Query-Time Record Linkage and Fusion over Web Databases

    Data-intensive Web applications usually require integrating data from Web sources at query time. The sources may refer to the same real-world entity in different ways, and some may even provide outdated or erroneous data. An important task is to recognize and merge the records that refer to the same real-world entity at query time. Most existing duplicate detection and fusion techniques work in the off-line setting and do not meet the online constraint. At least two aspects differentiate online duplicate detection and fusion from its offline counterpart: (i) the latter assumes that the entire data set is available, while the former cannot make such an assumption; (ii) several query submissions may be required to compute the "ideal" representation of an entity in the online setting. This paper presents a general framework for the online setting based on an iterative record-based caching technique: a set of frequently requested records is deduplicated off-line and cached for future reference, and newly arriving records in response to a query are deduplicated jointly with the records in the cache, presented to the user, and appended to the cache. Experiments with real and synthetic data show the benefit of our solution over traditional record linkage techniques applied to an online setting.
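    As a rough illustration of the iterative record-based caching idea described above, the Python sketch below deduplicates incoming query results against a cache of previously fused records. The similarity measure, threshold, fusion rule, and all names (`RecordCache`, `integrate`, `SIM_THRESHOLD`) are illustrative assumptions, not the paper's actual algorithm.

```python
from difflib import SequenceMatcher

SIM_THRESHOLD = 0.85  # assumed cutoff for declaring two records duplicates


def similarity(a: dict, b: dict) -> float:
    """Average string similarity over fields that are non-empty in both records."""
    shared = [k for k in a if k in b and a[k] and b[k]]
    if not shared:
        return 0.0
    return sum(
        SequenceMatcher(None, str(a[k]).lower(), str(b[k]).lower()).ratio()
        for k in shared
    ) / len(shared)


def fuse(canonical: dict, new: dict) -> dict:
    """Naive fusion rule: keep existing values, fill missing fields from the new record."""
    merged = dict(canonical)
    for key, value in new.items():
        if value and not merged.get(key):
            merged[key] = value
    return merged


class RecordCache:
    """Cache of previously deduplicated records, updated jointly with each query's results."""

    def __init__(self, seed_records=None):
        # Seed with frequently requested records deduplicated off-line.
        self.records = list(seed_records or [])

    def integrate(self, incoming):
        """Deduplicate incoming records against the cache, fuse matches, append the rest."""
        for record in incoming:
            match = next(
                (c for c in self.records if similarity(c, record) >= SIM_THRESHOLD),
                None,
            )
            if match is not None:
                self.records[self.records.index(match)] = fuse(match, record)
            else:
                self.records.append(record)
        return self.records


# Two sources describing the same entity slightly differently.
cache = RecordCache(seed_records=[{"name": "Joe's Pizza", "city": "New York", "phone": ""}])
print(cache.integrate([{"name": "Joes Pizza", "city": "New York", "phone": "555-0100"}]))
```

    In the usage example, the second source's record is recognized as the same entity as the cached one, so its phone number is fused into the cached representation instead of creating a duplicate entry.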

    Data Civilizer 2.0: A holistic framework for data preparation and analytics

    Data scientists spend over 80% of their time (1) parameter-tuning machine learning models and (2) iterating between data cleaning and machine learning model execution. While there are existing efforts to support the first requirement, there is currently no integrated workflow system that couples data cleaning and machine learning development. The previous version of Data Civilizer was geared towards data cleaning and discovery using a set of pre-defined tools. In this paper, we introduce Data Civilizer 2.0, an end-to-end workflow system that satisfies both requirements and additionally provides a sophisticated data debugger and a workflow visualization system. In this demo, we show how we used Data Civilizer 2.0 to help scientists at the Massachusetts General Hospital build their cleaning and machine learning pipeline on their 30 TB brain activity dataset.
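    The clean-then-train iteration that motivates the system can be sketched in a few lines of Python. This is not Data Civilizer's API: the synthetic dataset, the two cleaning steps, and the model below are placeholder assumptions meant only to show the loop of applying a cleaning step, re-running the model, and keeping the step if the validation score improves.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic "dirty" dataset: the label depends on a signal that we then corrupt
# with missing values and outliers, standing in for a real brain-activity table.
signal = rng.normal(0.0, 1.0, 500)
df = pd.DataFrame({"signal": signal, "label": (signal > 0).astype(int)})
df.loc[rng.choice(500, 40, replace=False), "signal"] = np.nan     # missing values
df.loc[rng.choice(500, 10, replace=False), "signal"] = 1_000.0    # gross outliers

# Candidate cleaning steps to try, each returning a new version of the data.
cleaning_steps = [
    ("impute_missing", lambda d: d.assign(signal=d["signal"].fillna(d["signal"].median()))),
    ("clip_outliers", lambda d: d.assign(signal=d["signal"].clip(-3.0, 3.0))),  # assumed valid range
]


def evaluate(d: pd.DataFrame) -> float:
    """Cross-validated accuracy of a simple model on the current data version."""
    X = d[["signal"]].fillna(0.0)  # the model itself cannot handle NaN
    return cross_val_score(LogisticRegression(max_iter=1000), X, d["label"], cv=5).mean()


# Iterate between cleaning and model execution: keep a step only if it helps the model.
best_score = evaluate(df)
for name, step in cleaning_steps:
    candidate = step(df)
    score = evaluate(candidate)
    print(f"{name}: accuracy {score:.3f} (current best {best_score:.3f})")
    if score > best_score:
        df, best_score = candidate, score
```

    Data Civilizer 2.0, as described above, packages this kind of iteration, together with debugging and workflow visualization, into a single workflow system rather than leaving it to ad hoc scripts.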
