
    Necessity of informing the patient: An analytical study

    This study examines the essence and nature of the obligation to inform the patient as a legal duty incumbent on the physician under the medical treatment contract. It aims to determine clearly the legal status of the doctor and the patient as the two principal parties to that contract, and it examines the obligation to inform in terms of its nature, scope, forms, and applications. To achieve this aim, the study draws on legislative texts, the opinions of legal scholars, and judicial rulings.

    Online Data Cleaning

    Data-centric applications have never been more ubiquitous in our lives, e.g., search engines, route navigation, and social media. This has brought along a new age where digital data is at the core of many decisions we make, as individuals, e.g., looking for the most scenic route to plan a road trip, or as professionals, e.g., analysing customers' transactions to predict the best time to restock different products. However, the surge in data generation has also led to creating massive amounts of dirty data, i.e., inaccurate or redundant data. Using dirty data to inform business decisions comes with dire consequences; for instance, an IBM report estimates that dirty data costs the U.S. $3.1 trillion a year. Dirty data is the product of many factors, including data entry errors and the integration of several data sources. Data integration of multiple sources is especially prone to producing dirty data. For instance, while individual sources may not contain redundant data, they often carry redundant data across each other. Furthermore, different data sources may obey different business rules (sometimes not even known), which makes it challenging to reconcile the integrated data. Even if the data is clean at the time of integration, data updates would compromise its quality over time. There is a wide spectrum of errors that can be found in the data, e.g., duplicate records, missing values, obsolete data, etc. To address these problems, several data cleaning efforts have been proposed, e.g., record linkage to identify duplicate records, data fusion to fuse duplicate data items into a single representation, and enforcing integrity constraints on the data. However, most existing efforts make two key assumptions: (1) data cleaning is done in one shot; and (2) the data is available in its entirety. These two assumptions do not hold in our age, where data is highly volatile and integrated from several sources. This calls for a paradigm shift in approaching data cleaning: it has to be made iterative, where data comes in chunks and not all at once. Consequently, cleaning the data should not be repeated from scratch whenever the data changes but, instead, should be done only for the data items affected by the updates. Moreover, the repair should be computed efficiently to support applications where cleaning is performed online (e.g., query-time data cleaning). In this dissertation, we present several proposals to realize this paradigm for two major types of data errors: duplicates and integrity constraint violations. We first present a framework that supports online record linkage and fusion over Web databases. Our system processes queries posted to Web databases; query results are deduplicated, fused, and then stored in a cache for future reference, and the cache is updated iteratively with new query results. This makes it possible to perform record linkage and fusion not only efficiently but also effectively, i.e., the cache contains data items seen in previous queries, which are jointly cleaned with incoming query results. To address integrity constraint violations, we propose a novel way to approach Functional Dependency repairs, develop a new class of repairs, and demonstrate that it is superior to existing efforts in both runtime and accuracy. We then show how our framework can be easily tuned to work iteratively to support online applications. We implement a proof-of-concept query answering system to demonstrate the iterative capability of our system.
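
    The central idea here, cleaning each incoming chunk against a cache of previously seen records rather than re-cleaning everything from scratch, can be sketched as follows. This is a minimal illustration and not the dissertation's actual system: the record schema, the fuzzy matching rule, and the fusion policy (prefer fresh non-null fields) are all assumptions made for the example.

    # Minimal sketch of iterative, query-time record linkage and fusion.
    # Assumptions for illustration: records are dicts with a "name" field,
    # duplicates are detected by string similarity on that field, and
    # fusion keeps the freshest non-null value per attribute.
    from difflib import SequenceMatcher

    class CleaningCache:
        def __init__(self, threshold=0.9):
            self.records = []            # cleaned records seen so far
            self.threshold = threshold   # similarity cutoff for duplicates

        def _is_duplicate(self, a, b):
            return SequenceMatcher(None, a["name"], b["name"]).ratio() >= self.threshold

        def _fuse(self, old, new):
            # Fuse two duplicates into a single representation.
            return {k: (new.get(k) if new.get(k) is not None else old.get(k))
                    for k in old.keys() | new.keys()}

        def add_results(self, query_results):
            """Clean an incoming chunk against the cache, not from scratch."""
            for rec in query_results:
                for i, cached in enumerate(self.records):
                    if self._is_duplicate(cached, rec):
                        self.records[i] = self._fuse(cached, rec)
                        break
                else:
                    self.records.append(rec)   # genuinely new record
            return self.records

    cache = CleaningCache()
    cache.add_results([{"name": "Jon Smith", "phone": None}])
    # A later query returns a near-duplicate; it is fused, not re-added.
    print(cache.add_results([{"name": "John Smith", "phone": "555-0100"}]))

    Only the incoming chunk is compared against the cache, so the two near-duplicate "Smith" records above end up as one fused record with the phone number filled in, rather than as two entries.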

    Experimental investigation on the optimal formulation of dune sand concrete reinforced with date palm fibers

    The main objective of this study is to propose an optimal formulation of sand concrete. The experimental approach consists of fixing the cement dosage at 350 kg/m³ and, in a first phase, investigating the optimum proportions of each constituent (limestone filler, silica fume, water, and superplasticizer admixture). The second phase of the study addressed the effect of date palm fibers on the optimal formulation deduced from the first phase's experimental program; the fiber mass rates studied are 0.5%, 1%, 1.5%, and 2%. The results of the first phase fixed the optimal composition for a unit volume (1 m³) of sand concrete: with the cement dosage maintained at 350 kg, the optimized quantities of dune sand, alluvial sand, limestone filler, silica fume, water, and superplasticizer were, respectively, 423.98 kg, 980.34 kg, 165 kg, 35 kg, 210 kg, and 2%. The results of the second phase show different trends in compression than in indirect tension. Compressive strength decreased as the date palm fiber reinforcement rate increased; this progressive decrease reaches 27% at a reinforcement rate of 2%. Tensile strength, by contrast, fluctuated between 5 and 6 MPa, with the maximum reached at a date palm fiber content of 1%. Mass-loss measurements clearly showed that increasing the fiber reinforcement rate reduces the mass loss, by an estimated 23.5%.
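
    As a quick arithmetic check of the reported mix, the sketch below recomputes the water/cement ratio and the fiber dosages. The assumption that the fiber and superplasticizer percentages are taken relative to the cement mass is ours, since the abstract does not state their basis.

    # Worked check of the optimized 1 m³ mix reported above.
    mix_kg = {
        "cement": 350.0,
        "dune_sand": 423.98,
        "alluvial_sand": 980.34,
        "limestone_filler": 165.0,
        "silica_fume": 35.0,
        "water": 210.0,
    }

    print(f"water/cement ratio: {mix_kg['water'] / mix_kg['cement']:.2f}")  # 0.60

    # Assumed basis (not stated in the abstract): percentages of cement mass.
    print(f"superplasticizer at 2%: {0.02 * mix_kg['cement']:.1f} kg")      # 7.0 kg
    for rate in (0.005, 0.010, 0.015, 0.020):   # studied fiber rates
        print(f"fibers at {rate:.1%}: {rate * mix_kg['cement']:.2f} kg per m³")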