
    Improved image classification with neural networks by fusing multispectral signatures with topological data

    Automated schemes are needed to classify multispectral remotely sensed data. Human intelligence is often required to correctly interpret images from satellites and aircraft. Humans succeed because they use various types of cues about a scene to accurately define the contents of the image. Consequently, it follows that computer techniques that integrate and use different types of information should perform better than single-source approaches. This research illustrated that multispectral signatures and topographical information could be used in concert. Significantly, this dual-source tactic classified a remotely sensed image better than the multispectral classification alone. These classifications were accomplished by fusing spectral signatures with topographical information using neural network technology. A neural network was trained to classify Landsat multispectral signatures. A file of georeferenced ground truth classifications was used as the training criterion. The network was trained to classify urban, agriculture, range, and forest with an accuracy of 65.7 percent. Another neural network was programmed and trained to fuse these multispectral signature results with a file of georeferenced altitude data. This topographical file contained 10 levels of elevation. When this nonspectral elevation information was fused with the spectral signatures, the classifications improved to 73.7 and 75.7 percent.
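    The fusion step described in this abstract can be sketched in a few lines. The sketch below is a minimal illustration on synthetic data; the dimensions (4 spectral bands, 10 elevation levels, 4 land-cover classes, 8 hidden units) and the random weights are assumptions for illustration, not the paper's actual trained network. The idea shown is simply input-level fusion: each pixel's spectral signature is concatenated with a one-hot encoding of its elevation level before the forward pass.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins: 4 Landsat bands per pixel, one of 10 elevation levels.
n_pixels, n_bands, n_levels, n_classes = 100, 4, 10, 4
spectra = rng.random((n_pixels, n_bands))
elevation_level = rng.integers(0, n_levels, size=n_pixels)

# Fuse the two sources: concatenate the spectral signature with a
# one-hot encoding of the pixel's elevation level.
elev_onehot = np.eye(n_levels)[elevation_level]
fused = np.concatenate([spectra, elev_onehot], axis=1)   # shape (100, 14)

# One-hidden-layer forward pass; in practice the weights would be trained
# against the georeferenced ground-truth file.
W1 = rng.normal(size=(n_bands + n_levels, 8))
W2 = rng.normal(size=(8, n_classes))
hidden = np.tanh(fused @ W1)
logits = hidden @ W2
pred = logits.argmax(axis=1)   # 0..3: urban, agriculture, range, forest
```

Because the elevation code rides alongside the spectral bands, the network can learn class boundaries that depend on altitude (for example, separating range from forest at similar spectral signatures), which is the mechanism the reported accuracy gain relies on.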

    German accounting profession -- 1931 and before: A reflection of national ideologies

    The purpose of this paper is to examine how the demand for independent audits and the German accounting profession evolved from the late 1800s to the early 1930s despite the absence of competitive market forces. The paper posits that cultural ideologies, specifically with respect to nationalism, paternalism and anti-individualism, provide reasons for the unique configuration not only of the German corporate/banking structures responsible for originating financial reports but also of the accounting profession that audited them. As the German accounting profession was in an embryonic stage, it was not capable of successfully confronting the corporate/banking alliance to significantly impact financial reporting or the demand for audits. Economic crises served as the dominant pressure for business reform and legislation mandating audits in Germany.

    Integrins Are the Necessary Links to Hypertrophic Growth in Cardiomyocytes

    To compensate for hemodynamic overload of the heart, an event that stretches the myocardium, growth and survival signaling are activated in cardiac muscle cells (cardiomyocytes). Integrins serve as the signaling receptors of cardiomyocytes responsible for mechanotransduction toward intracellular signaling. The main integrin heterodimers on the cardiomyocyte surface are α5β1 and αvβ3, and elimination of either β1 or β3 integrins impedes pressure-induced hypertrophic signaling and leads to increased mortality. The growth signaling pathways downstream of β1 and β3 integrins are well characterized. However, new integrin pathways responsible for inhibiting apoptosis induced by hemodynamic overload are emerging. β1 and β3 integrins activate differential survival signaling, yet both integrins initiate survival signaling downstream of ubiquitination and the kinase pathway including phosphoinositide 3-kinase (PI3K)/Akt. Further characterization of these integrin-signaling mechanisms may lead to drug targets to prevent decompensation to heart failure.

    Evolution of professional enforcement in Texas: An examination of violations and sanctions

    The purpose of this paper is to examine the enforcement of the Texas Rules of Professional Conduct (Rules) from 1946 to 1978. This period encompasses the early regulation of the Texas accounting profession after the passage of the Texas Public Accountancy Act (Act) in 1945. The Act and accompanying Rules remained in effect until 1979, when the Texas legislature enacted new accountancy legislation which inaugurated a more regulatory era. Results indicate that enforcement of the Rules of Conduct was a process evolving over time as both the state and professional political systems impacted the behavior of the Texas State Board of Public Accountancy. During the period under study, internal professional competition between certified public accountants and non-certified public accountants surfaced as a substantial explanatory factor behind rule promulgation and enforcement. Violators differed from non-violators in level of education, type of training, and type of practice. In total numbers, certified public accountants were subject to more hearings and sanctions than non-certified public accountants. However, in accordance with expectations, the public accountants received a disproportionate share of alleged violations and sanctions. Violations implying practice incompetence and those impairing professional integrity were subject to more severe disciplinary actions, but the Board heard more competitive behavior allegations than those involving malpractice.

    MIDAS: Deep learning human action intention prediction from natural eye movement patterns

    Eye movements have long been studied as a window into the attentional mechanisms of the human brain, and have been made accessible as novel human-machine interfaces. However, not everything that we gaze upon is something we want to interact with; this is known as the Midas Touch problem for gaze interfaces. To overcome the Midas Touch problem, present interfaces tend not to rely on natural gaze cues, but rather use dwell time or gaze gestures. Here we present an entirely data-driven approach to decode human intention for object manipulation tasks based solely on natural gaze cues. We run data collection experiments in which 16 participants are given manipulation and inspection tasks to be performed on various objects on a table in front of them. The subjects' eye movements are recorded using wearable eye-trackers, allowing the participants to freely move their heads and gaze upon the scene. We use our Semantic Fovea, a convolutional neural network model, to identify the objects in the scene and their relation to gaze traces at every frame. We then evaluate the data and examine several ways to model the classification task for intention prediction. Our evaluation shows that intention prediction is not a naive result of the data, but rather relies on non-linear temporal processing of gaze cues. We model the task as a time series classification problem and design a bidirectional Long Short-Term Memory (LSTM) network architecture to decode intentions. Our results show that we can decode human intention of motion purely from natural gaze cues and object relative position, with 91.9% accuracy. Our work demonstrates the feasibility of natural gaze as a Zero-UI interface for human-machine interaction, i.e., users need only act naturally, and do not need to interact with the interface itself or deviate from their natural eye movement patterns.
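    The temporal modeling step can be sketched as follows. This is a rough illustration in plain NumPy of one forward pass of an LSTM cell over a synthetic gaze-feature sequence; it is unidirectional for brevity (the paper's network is bidirectional), and all dimensions, weights, and the feature layout are assumptions for illustration, not the authors' architecture.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical per-frame gaze features (e.g. gaze point and object-relative
# offsets): 30 frames, 6 features per frame, 16 hidden units.
T, F, H = 30, 6, 16
seq = rng.random((T, F))

def lstm_step(x, h, c, W, U, b):
    """One LSTM cell step; gates stacked as [input, forget, cell, output]."""
    z = W @ x + U @ h + b
    i, f, g, o = np.split(z, 4)
    i, f, o = 1/(1+np.exp(-i)), 1/(1+np.exp(-f)), 1/(1+np.exp(-o))
    c = f * c + i * np.tanh(g)           # update cell state
    h = o * np.tanh(c)                   # emit hidden state
    return h, c

W = rng.normal(scale=0.1, size=(4*H, F))
U = rng.normal(scale=0.1, size=(4*H, H))
b = np.zeros(4*H)

h = c = np.zeros(H)
for x in seq:                            # forward pass over the gaze sequence
    h, c = lstm_step(x, h, c, W, U, b)

# The final hidden state would feed a classifier head, here a single
# logistic unit for a manipulate-vs-inspect decision.
w_out = rng.normal(size=H)
p_manipulate = 1/(1+np.exp(-(w_out @ h)))
```

The point of the recurrence is that the decision depends on the whole gaze trajectory rather than any single fixation, which is what allows intention to be decoded from natural eye movements instead of deliberate dwell or gesture commands.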

    ICA-based denoising for ASL perfusion imaging

    Arterial Spin Labelling (ASL) imaging derives a perfusion image by tracing the accumulation of magnetically labeled blood water in the brain. As the image generated has an intrinsically low signal-to-noise ratio (SNR), multiple measurements are routinely acquired and averaged, at the penalty of increased scan duration and opportunity for motion artefact. However, this strategy alone might be ineffective in clinical settings where the time available for acquisition is limited and patient motion is increased. This study investigates the use of an Independent Component Analysis (ICA) approach for denoising ASL data, and its potential for automation.

    72 ASL datasets (pseudo-continuous ASL; 5 different post-labeling delays: 400, 800, 1200, 1600, 2000 ms; total volumes = 60) were collected from thirty consecutive acute stroke patients. The effects of ICA-based denoising (manual and automated) were compared to two different denoising approaches: aCompCor, a Principal Component-based method, and Enhancement of Automated Blood Flow Estimates (ENABLE), an algorithm based on the removal of corrupted volumes. Multiple metrics were used to assess the changes in the quality of the data following denoising, including changes in cerebral blood flow (CBF) and arterial transit time (ATT), SNR, and repeatability. Additionally, the relationship between SNR and the number of repetitions acquired was estimated before and after denoising the data.

    The use of an ICA-based denoising approach resulted in significantly higher mean CBF and ATT values (p < 0.001), lower CBF and ATT variance (p < 0.001), increased SNR (p < 0.001), and improved repeatability (p < 0.05) when compared to the raw data. The performance of manual and automated ICA-based denoising was comparable. These results went beyond the effects of aCompCor or ENABLE. Following ICA-based denoising, the SNR was higher using only 50% of the ASL dataset collected than when using the whole raw data.

    The results show that ICA can be used to separate signal from noise in ASL data, improving the quality of the data collected. In fact, this study suggests that the acquisition time could be reduced by 50% without penalty to data quality, something that merits further study. Independent component classification and regression can be carried out either manually, following simple criteria, or automatically.
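    The component classification-and-regression idea can be illustrated on synthetic data. The sketch below uses an SVD decomposition as a stand-in for ICA unmixing, and a crude zero-crossing criterion as a stand-in for the study's manual or automated classification rules; every detail here is an assumption for illustration, not the study's pipeline.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic ASL-like series: 60 volumes x 500 voxels, a stable perfusion
# pattern plus a strong oscillatory structured-noise source.
n_vols, n_vox = 60, 500
signal = np.outer(np.ones(n_vols), rng.random(n_vox))        # stable over time
noise = np.outer(np.sin(np.arange(n_vols)), rng.random(n_vox)) * 3
data = signal + noise + rng.normal(scale=0.1, size=(n_vols, n_vox))

# Decompose the mean-centered series into components.
U, s, Vt = np.linalg.svd(data - data.mean(0), full_matrices=False)

# "Classify" components: flag those whose time course oscillates rapidly,
# a crude proxy for the classification criteria used in the study.
zero_crossings = (np.diff(np.sign(U), axis=0) != 0).sum(0)
noise_comp = zero_crossings > n_vols // 4

# Regress the flagged components out and restore the mean image.
clean = data.mean(0) + (U[:, ~noise_comp] * s[~noise_comp]) @ Vt[~noise_comp]

residual_before = np.abs(data - signal).mean()
residual_after = np.abs(clean - signal).mean()   # smaller after denoising
```

Whether components are flagged by hand or by an automated criterion, the reconstruction step is the same regression shown on the last `clean` line, which is why manual and automated variants can perform comparably.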

    The Uptake and Usage of Object Technology: A Review

    Two years ago, the authors conducted a major survey of IT projects in the UK to determine which methods, techniques and tools were being used to develop computer-based information systems. The survey also sought to establish a relationship between their usage, the types of system that developers were seeking to develop and the outcomes, successful or otherwise, of the development process. The authors were particularly concerned to assess the extent to which the tools, techniques and principles of Object Technology (OT) had been adopted. The results of the survey indicated widespread use of OT and suggested that systems developers considered it to be a mature technology. Two years on, the authors have conducted a second survey of IT projects in the UK. The intention was twofold: to take a snapshot of information systems development practices as the IT industry moves into the new millennium, and to identify any trends that have developed during the last two years. The results of this second survey confirm many of the findings from the earlier survey, particularly in relation to the use of OT, but also reveal some interesting and unexpected changes in the use of system development methods in general.

    Which user interaction for cross-language information retrieval? Design issues and reflections

    A novel and complex form of information access is cross-language information retrieval: searching for texts written in foreign languages based on native language queries. Although the underlying technology for achieving such a search is relatively well understood, the appropriate interface design is not. The authors present three user evaluations undertaken during the iterative design of Clarity, a cross-language retrieval system for low-density languages, and show how the user-interaction design evolved depending on the results of usability tests. The first test was instrumental in identifying weaknesses in both functionality and interface; the second was run to determine whether query translation should be shown or not; the final was a global assessment focused on user satisfaction criteria. Lessons were learned at every stage of the process, leading to a much more informed view of what a cross-language retrieval system should offer to users.