
    Pattern of abdominal wall herniae in females: a retrospective analysis

    Background: Gender differences are expected to influence the pattern and outcome of management of abdominal wall hernias. Some of these differences remain speculative, with few published articles on hernias in females. Objectives: To describe the clinical pattern of abdominal wall hernias in females. Method: A 5-year retrospective review. Result: There were 181 female patients with 184 hernias, representing 27.9% of the total number of hernia patients operated on. Mean age was 41.66±24.46 years, with a bimodal peak in the 1st and 7th decades. Inguinal hernia accounted for the majority (50.5%), but incisional hernia predominated in the 30-49 age group, while only inguinal and umbilical hernias were seen in the first two decades (p=0.04). There was no side predilection in the cases of inguinal hernia. There were 12 (6.6%) emergency presentations, most of which occurred in the 6th decade and above and none below 30 years (p=0.02). Umbilical (4 cases) and femoral hernias (3 cases) accounted for most of these cases. Incisional hernia was the commonest cause of recurrent hernias. Conclusion: Inguinal hernia is the commonest hernia type in females, followed by incisional hernias, which also accounted for most recurrent cases. Age appears to be a risk factor for developing complications. Keywords: Female, hernia

    Cross-Layer Automated Hardware Design for Accuracy-Configurable Approximate Computing

    Approximate Computing trades off computation accuracy against performance or energy efficiency. It is a design paradigm that arose in the last decade as an answer to diminishing returns from Dennard's scaling and a shift in the prominent workloads. A range of modern workloads, categorized mainly as recognition, mining, and synthesis, features an inherent tolerance to approximations. Their characteristics, such as redundancies in their input data and robust-to-noise algorithms, allow them to produce outputs of acceptable quality despite an approximation in some of their computations. Approximate Computing leverages this application tolerance by relaxing the exactness in computation towards the primary design goals of increasing performance or improving energy efficiency. Existing techniques span the abstraction layers of computer systems, where cross-layer techniques are shown to offer a larger design space and yield higher savings. Currently, the majority of the existing work aims at meeting a single accuracy target. The extent of approximation tolerance, however, varies significantly with changes in input characteristics and applications. In this dissertation, methods and implementations are presented for cross-layer and automated design of accuracy-configurable Approximate Computing to maximally exploit the performance and energy benefits. In particular, this dissertation addresses the following challenges and introduces novel contributions:
    A main Approximate Computing category in hardware is to scale either voltage or frequency beyond the safe limits for power or performance benefits, respectively. The rationale is that timing errors would be gradual and, for an initial range, tolerable. This scaling enables fine-grain accuracy configurability by varying the timing error occurrence. However, conventional synthesis tools aim at meeting a single delay for all paths within the circuit. Consequently, with voltage or frequency scaling, either all paths succeed, or a large number of paths fail simultaneously, with a steep increase in error rate and magnitude. This dissertation presents an automated method for minimizing path delays by individually constraining the primary outputs of combinational circuits. As a result, it reduces the number of failing paths and makes the timing errors significantly more gradual, as well as rarer and smaller on average. Additionally, it reveals that delays can be significantly reduced towards the least significant bit (LSB), allowing operation at a higher frequency when small operands are computed.
    Precision scaling, i.e., reducing the representation of data and its accuracy, is widely used across abstraction layers in Approximate Computing. Reducing data precision also reduces transistor toggling, and therefore dynamic power consumption. Application- and architecture-level precision scaling results in using only the LSBs of the circuit. Arithmetic circuits often have less complexity and logic depth in the LSBs than in the most significant bits (MSBs). To take advantage of this circuit property, a delay-altering synthesis methodology is proposed. The method finds energy-optimal delay values under configurable precision usage and assigns them to the primary outputs used for different precisions. Thereby, it enables dynamic frequency-precision scalable circuits for energy efficiency.
    Within the hardware architecture, it is possible to instantiate multiple units with the same functionality at different fixed approximation levels, where each block benefits from having fewer transistors and from synthesis relaxations. These blocks can be selected dynamically and thus allow the accuracy to be configured at runtime. Instantiating such approximate blocks can be a lower-dynamic-power but higher-area-and-leakage alternative to state-of-the-art gating mechanisms, which switch off a group of paths in the circuit to reduce toggling activity. Jointly, instantiating multiple blocks and gating mechanisms produce a large design space of accuracy-configurable hardware, where energy-optimal solutions require a cross-layer search at the architecture and circuit levels. To that end, an approximate hardware synthesis methodology is proposed with joint optimizations in architecture and circuit for dynamic accuracy scaling, thereby enabling energy vs. area trade-offs.
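    As a rough software illustration of the accuracy-configurability described above (not the dissertation's synthesis flow, which operates on hardware circuits), the following Python sketch models truncation-based precision scaling and runtime selection among approximation levels; the function names and the level-to-truncation table are hypothetical.

    # Illustrative sketch only: a software model of precision scaling and runtime
    # accuracy selection. Names, levels, and the adder model are assumptions.

    def approx_add(a: int, b: int, dropped_lsbs: int) -> int:
        """Add two non-negative integers while ignoring the lowest `dropped_lsbs` bits.

        Zeroing LSBs mimics precision scaling: fewer toggling bits means less dynamic
        power, at the cost of an error of at most 2**(dropped_lsbs + 1) - 2.
        """
        mask = ~((1 << dropped_lsbs) - 1)
        return (a & mask) + (b & mask)

    def configurable_add(a: int, b: int, accuracy_level: int) -> int:
        """Select among 'blocks' with different fixed approximation levels at runtime."""
        levels = {0: 0, 1: 2, 2: 4, 3: 8}  # hypothetical level-to-truncation table
        return approx_add(a, b, levels[accuracy_level])

    if __name__ == "__main__":
        a, b = 1000, 777
        for level in range(4):
            result = configurable_add(a, b, level)
            print(f"level {level}: result={result}, error={a + b - result}")

    The sketch mirrors the trade-off discussed in the abstract: higher accuracy levels use cheaper, less precise "blocks", and the error they introduce is bounded and grows gradually with the configured level.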

    Working memory constellations

    Evidence is presented that supports the view that most models of short-term memory cannot account for the flexibility of the primary memory system. It is argued that the working memory model outlined by Baddeley and Hitch (1974) is, however, a potentially adequate model. Working memory, in this thesis, is depicted as a system that assembles 'constellations' consisting of the central executive and one or more sub-systems. This view suggests a formulation that is considerably more complex than the 1974 model. The empirical studies examine the role of the visuo-spatial scratch pad in the formation and maintenance of working memory constellations. It is concluded from these studies that the scratch pad is independent of the articulatory loop but is usually coupled to the central executive except during maintenance rehearsal. Furthermore, it can be used concurrently with the articulatory loop to process spatial aspects of highly verbal tasks. However, a constellation consisting of the executive, the loop and the scratch pad is vulnerable to a wider range of interference effects than a simpler constellation. Paivio (1971) suggested that 'dual coding' leads to better memory performance; however, this is only the case when no distractors are present. The final two chapters present some speculations on how working memory research might proceed in the future. It is concluded that the current trend towards collecting convergent evidence and the emphasis on testing theory in applied situations should give us insights into memory that were not available to Ebbinghaus and other early memory researchers.

    Investigation of light scattering in highly reflecting pigmented coatings. Volume 3 - Monte Carlo and other statistical investigations Final report, 1 May 1963 - 30 Sep. 1966

    Monte Carlo methods, Mie theory, and random walk and screen models for predicting reflective properties of paint film
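    As a minimal illustration of the kind of random-walk Monte Carlo approach named here (not the report's actual model; the albedo and optical-thickness values are hypothetical), a one-dimensional photon-walk estimate of film reflectance might look like this in Python:

    # Illustrative sketch only: 1-D random-walk Monte Carlo estimate of diffuse
    # reflectance for a scattering/absorbing film. Parameters are assumptions.
    import random

    def estimate_reflectance(albedo: float, optical_thickness: float,
                             n_photons: int = 100_000) -> float:
        """Return the fraction of photons that re-emerge from the illuminated surface."""
        reflected = 0
        for _ in range(n_photons):
            depth = 0.0
            direction = +1.0  # +1 heads into the film, -1 back toward the surface
            while True:
                depth += direction * random.expovariate(1.0)  # free path, optical-depth units
                if depth <= 0.0:                    # escaped the front surface: reflected
                    reflected += 1
                    break
                if depth >= optical_thickness:      # passed out the back: transmitted
                    break
                if random.random() > albedo:        # absorbed at this scattering event
                    break
                direction = random.choice((+1.0, -1.0))  # isotropic scatter, 1-D approximation
        return reflected / n_photons

    if __name__ == "__main__":
        print(estimate_reflectance(albedo=0.95, optical_thickness=5.0))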

    Analytic and constructive processes in the comprehension of text

    This thesis explores the process of comprehension as a purposeful interaction between a reader and the information in a text. The review begins by discussing the difference between educational and psychological perspectives on comprehension. Approaches to the analysis of text structure are then described, and models and theories of the representation of knowledge are evaluated. It is argued that these are limited in that they tend to focus either on the text or the reader: they either examine those procedures that are necessary for text analysis or the knowledge structures required for comprehension, storage and retrieval. Those that come nearest to examining the interaction between text and knowledge structures tend to be limited in terms of the texts they can deal with, and they do not deal adequately with the predictive aspects of comprehension. Experiments are reported which look at the ongoing predictions made by readers, and how these are affected by factors such as text structure and ‘interestingness’. The experiments provided the opportunity for examining the potential of alternative methodologies (such as the content analysis of open-ended questions). It is felt that it is necessary to examine comprehension using methods which are direct but not intrusive. The studies reported demonstrate that it is possible to obtain reliable measures of a reader's predictions and that these are systematically affected by the structure and content of the text.

    The Death of Contract

    The Death of Contract collects Professor Gilmore's lectures given at Ohio State University Law School in 1970, with footnotes added to provide further explanation, qualification, and documentation. It is easy to tell that these were lectures, not because of their tone of urbane chattiness (Gilmore's gift of style makes some of his most technical work sound like Talleyrand's table talk), but because of the looseness of their design and the casualness of their execution. The speaker frequently drops the thread of his narrative to break into anecdote or digression and, when he again picks up the narrative, it is not always by the same thread. But for all their informality these lectures are of extraordinary interest. They tell us how a great commercial lawyer (who is also a legal historian and contracts casebook editor) views what happened to the law of contract in the 20th century. Though expounded with rare felicity and supported by an unusual breadth of historical learning, this perspective is, I believe, a common one among scholars of contract law. I shall be arguing here that it is also a fundamentally distorted one: not so much erroneous as myopic. But first, a summary of the book.

    English historical novels on the first century A.D. as reflecting the trends of religious thought during the nineteenth and twentieth centuries

    1. The Outside Tradition • 2. The Novel on Early Christian Times 1820-1850 • 3. The Early Christian Novel Enters Controversy • 4. The Imaginative Approach to the New Testament Outside the Novel • 5. The Early Christian Romancers, 1860-1900 • 6. Modernism, and the Reconstruction of a Point of View Towards Christian Origins • 7. The First Two Decades of the Twentieth Century • 8. Novels, mostly of Scepticism, 1920-1939 • 9. Novels, chiefly of Belief, 1940-1955 • 10. The Continuing Outside Tradition

    The letters of Charlotte Mary Yonge (1823-1901) edited by Charlotte Mitchell, Ellen Jordan and Helen Schinske.

    Charlotte Yonge is one of the most influential and important of Victorian women writers, but study of her work has been handicapped by a tendency to patronise both her and her writing, by the vast number of her publications and by a shortage of information about her professional career. Scholars have had to depend mainly on the work of her first biographer, a loyal disciple, a situation which has long been felt to be unsatisfactory. We hope that this edition of her correspondence will provide for the first time a substantial foundation of facts for the study of her fiction, her historical and educational writing and her journalism, and help to illuminate her biography as well as her significance in the cultural and religious history of the Victorian age.