20 research outputs found

    Characterization of neurological disorders using evolutionary algorithms

    The increase in life expectancy over the last few decades has led to a wide diffusion of age-related neurodegenerative diseases such as Parkinson’s disease. Neurodegenerative diseases belong to the broad category of neurological disorders, which comprises all disorders affecting the central nervous system. These conditions have a severe impact on the quality of life of both patients and their families, as well as on the costs that society bears for their diagnosis and management. To reduce this impact on individuals and society, better strategies for the diagnosis and monitoring of neurological disorders are needed. The main aim of this study is to investigate the use of artificial intelligence techniques as a tool to help clinicians in the diagnosis and monitoring of two specific neurological disorders (Parkinson’s disease and dystonia), for which no objective clinical assessments exist. Evolutionary algorithms are chosen as the artificial intelligence technique used to evolve the classifiers. The classifiers evolved by this technique are then compared with those produced by two popular, well-known techniques: artificial neural networks and support vector machines. All the evolved classifiers are able to distinguish not only between patients and healthy subjects but also among different subgroups of patients. For Parkinson’s disease, two cognitive impairment subgroups of patients are considered, with the aim of earlier diagnosis and better monitoring. For dystonia, two kinds of dystonia patients are considered (organic and functional) in order to gain better insight into the separation of the two groups. The results obtained for Parkinson’s disease are encouraging and reveal some differences among the cognitive impairment subgroups. The dystonia results are not satisfactory at this stage, but the study has some limitations that could be overcome in future work.
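
    As a hedged illustration of the classifier comparison described above, the following Python sketch evolves a simple linear classifier with a genetic algorithm and compares it against support vector machine and neural network baselines. The synthetic dataset, the GA design and all hyper-parameters are illustrative assumptions, not the study's actual features or method.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPClassifier
        from sklearn.svm import SVC

        # Synthetic stand-in for the clinical feature vectors (patients vs. controls).
        X, y = make_classification(n_samples=200, n_features=10, random_state=0)
        X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

        def fitness(w, X, y):
            """Training accuracy of the linear rule sign(X @ w[:-1] + w[-1])."""
            pred = (X @ w[:-1] + w[-1] > 0).astype(int)
            return (pred == y).mean()

        rng = np.random.default_rng(0)
        pop = rng.normal(size=(50, X.shape[1] + 1))        # 50 candidate weight vectors
        for generation in range(100):
            scores = np.array([fitness(w, X_train, y_train) for w in pop])
            parents = pop[np.argsort(scores)[-25:]]        # keep the fitter half
            children = parents + rng.normal(scale=0.1, size=parents.shape)  # mutate
            pop = np.vstack([parents, children])

        best = max(pop, key=lambda w: fitness(w, X_train, y_train))
        print("evolved linear classifier:", fitness(best, X_test, y_test))
        print("SVM baseline:", SVC().fit(X_train, y_train).score(X_test, y_test))
        print("MLP baseline:", MLPClassifier(max_iter=2000, random_state=0)
              .fit(X_train, y_train).score(X_test, y_test))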

    On the shortage of engineering in recent information systems research

    In this paper we argue that the so-called 'positivism'-versus-'interpretivism' conflict raised by some constructivist, postmodernist, relativist philosophers and methodologists in information systems research is merely a pseudo-problem which has no basis in reality. This pseudo-problem of so-called 'positivism' versus 'interpretivism' only distracts from the genuine problem of the information systems discipline, namely the design and construction of reliable devices from reasonable specifications, for well-defined purposes, on the basis of scientifically acceptable principles. In contrast to those relativist 'philosophies', we show that information systems research actually belongs to the domain of engineering, which already has its time-tested methodology and epistemology, including a trinity of scientific-nomothetic, hermeneutic-idiographic, and pragmatic-normative elements. By accepting the fact that information systems research is a specific instance of engineering research, which also includes (and has always included) the unquantifiable 'human dimension', a number of fruitless debates can be terminated for the sake of genuine progress in information systems theory, design and deployment.

    The autonomous acoustic buoy

    Work developed within the framework of the 'European Project Semester' programme. The Autonomous Acoustic Buoy (AAB) has been designed and manufactured by the Laboratori d’Aplicacions Bioacústiques (LAB) with the assistance of the Universitat Politècnica de Catalunya (UPC) over the past two years, in response to the need to measure and monitor the correlation between human activity and the presence of marine mammals in the marine environment. The buoy is used for recording sound patterns in the underwater environment, specifically the physiological and neurophysiological processes by which sounds are produced, received and processed [42]. Furthermore, the AAB plays an important role in monitoring and identifying communication between marine mammals. This paper presents further advancements and improvements in the design and functionality of the AAB, as well as a concise introduction to the buoy's capabilities and a brief overview of the history, purpose and organisation of the LAB. With regard to the improvements made, the article details the research undertaken into dolphin whistle analysis and a well-structured marketing plan. The mechanical and electrical features concerning the safe operating conditions of the buoy offshore are also presented in this report.
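
    As a rough sketch of the kind of dolphin whistle analysis mentioned above, the following Python snippet extracts a whistle's frequency contour from a spectrogram. The synthetic chirp, sample rate and window settings are illustrative assumptions, not the LAB's actual recording chain or detection pipeline.

        import numpy as np
        from scipy import signal

        fs = 96_000                                    # assumed hydrophone sample rate (Hz)
        t = np.arange(0, 2.0, 1 / fs)
        # Synthetic stand-in for a dolphin whistle: an upward frequency sweep 5-15 kHz.
        whistle = signal.chirp(t, f0=5_000, f1=15_000, t1=2.0) * np.hanning(t.size)
        recording = whistle + 0.2 * np.random.default_rng(0).normal(size=t.size)

        f, times, Sxx = signal.spectrogram(recording, fs=fs, nperseg=2048, noverlap=1536)
        # Ridge of maximum energy per time frame approximates the whistle contour.
        contour_hz = f[np.argmax(Sxx, axis=0)]
        mid = slice(len(times) // 4, 3 * len(times) // 4)
        print("whistle sweeps roughly %.1f-%.1f kHz"
              % (contour_hz[mid].min() / 1e3, contour_hz[mid].max() / 1e3))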

    Temporal behavior of defect detection performance in design documents

    The quality of software requirements and design documents is a success-critical issue in software engineering (SE) practice. Organizational measures, e.g., software processes, help structure the development process along the project life-cycle, constructive approaches support building software products, and analytical approaches aim at investigating deliverables with respect to defects and product deviations. Software inspection and testing are well-known and common techniques in software engineering to identify defects in code documents, specifications, and requirements documents in various phases of the project life-cycle. A major goal of analytical quality assurance activities, e.g., inspection and testing, is the detection of defects as early as possible, because rework effort and cost increase if defects are identified late in the project. Software inspection (SI) focuses on defect detection in early phases of software development without the need for executable software code; thus, SI is applicable to written text documents, e.g., specification and requirements documents. Traditional testing approaches focus on test case definition and execution in later phases of development, because testing requires executable code. We therefore see the need to combine test case generation and software inspection early in the software project to improve software product quality and derive test cases early. Bundling the benefits of early defect detection (SI application) and early test case definition based on SI results can help (a) identify defects early and (b) derive test case definitions for systematic testing based on requirements and use cases. Our approach, inspection-based testing, leads to a test-first strategy at the requirements level. This thesis investigates an inspection-based testing approach and software inspection with respect to the temporal behavior of defect detection, with emphasis on critical defects in requirements and specification documents. The outcomes concerning temporal behavior revealed some interesting results: UBR performs very effectively and efficiently within the first 120 minutes, whereas UBT-i needs about 44 % more testing time to achieve defect detection results as good as those of UBR. The comparison of the two defect detection techniques showed that UBR is, on the whole, not the superior technique; because of inconsistent findings across the experiment sessions, no clear favorite can be named. Concerning false positives, the expected temporal behavior (that the fewest false positives would be reported in the first 120 minutes) could not be observed, and the corresponding hypothesis had to be rejected. A controlled experiment in an academic environment was conducted to investigate the defect detection performance of individuals and its temporal behavior for a business IT software solution. The results can help project and quality managers to better plan analytical quality assurance activities, i.e., inspection and test case generation, with respect to the temporal behavior of both defect detection approaches.
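
    A minimal sketch of how defect detection effectiveness and efficiency per time interval could be computed from inspection logs is given below; the defect timestamps and the number of seeded defects are invented for illustration and are not the experiment's data.

        from bisect import bisect_right

        # Hypothetical inspection logs: minutes at which each true defect was reported.
        ubr_defects  = [5, 12, 20, 33, 47, 61, 78, 95, 110, 140, 170]
        ubti_defects = [9, 25, 44, 70, 92, 118, 135, 160, 185, 200]
        seeded_defects = 20          # assumed number of known defects in the document

        def effectiveness_and_efficiency(found_minutes, cutoff_min):
            """Share of seeded defects found and defects found per hour up to a cutoff."""
            n = bisect_right(sorted(found_minutes), cutoff_min)
            return n / seeded_defects, n / (cutoff_min / 60.0)

        for name, log in [("UBR", ubr_defects), ("UBT-i", ubti_defects)]:
            eff, rate = effectiveness_and_efficiency(log, 120)
            print(f"{name}: {eff:.0%} of defects in 120 min, {rate:.1f} defects/hour")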

    Characterizing radio channels : the science and technology of propagation and interference, 1900-1935

    Thesis (Ph.D. in History and Social Study of Science and Technology (HASTS)), Massachusetts Institute of Technology, Program in Science, Technology and Society, 2004. Includes bibliographical references (p. 409-429). Guglielmo Marconi's trans-Atlantic wireless experiment in 1900 marked the beginning of a communication revolution that transformed the open space above the earth into channels of information flow. This dissertation grapples with the historical conditions that gave rise to such a transformation: the studies of radio-wave propagation and the treatment of radio interference in early twentieth-century America and Western Europe. The part on propagation examines the debate between the surface diffraction theory and the atmospheric reflection theory for long waves, the development of the ionic refraction theory for short waves, the evidential quests for the existence of the ionosphere, and the studies of the geomagnetic effects on propagation. The part on interference focuses on the engineering efforts toward the characterization of atmospheric noise and signal-intensity fluctuations, the policies of radio-channel allocation for fighting man-made interference, and the scientific research into electronic tube noise. By the mid-1930s, the results of these endeavors had considerably improved the quality of radio communication. Characterizing Radio Channels builds a bridge between the history of science and the history of technology by inspecting an immaterial engineering entity, radio channels, whose control required significant scientific research. In the history of science, it contributes to an integrated study of electrical physics and geophysics. In the history of technology, it enriches radio history, the epistemology of engineering knowledge, consumer studies, and the study of technological policies. Combining both fields through the concept of radio channels enables a new understanding of the historical conditions that made the information society possible and the social factors that facilitated the modern research organizations in academia, industry, government and the military. By Chen-Pang Yeang.

    Measurement of epithelial electrical passive parameters and its application to study gastric defence against acid and ulcerogenic agents

    The aim of this study was to develop a reliable method for measuring epithelial membrane and shunt resistances. This was accomplished by improving the intraepithelial two-dimensional cable analysis, using multiple electrodes simultaneously and sequentially applying the intraepithelial current through different electrodes, thus taking advantage of their spatial relationship. The improvement achieved with this novel method is its excellent temporal resolution; changes in the membrane and shunt pathway resistances can typically be measured in 9-20 seconds. The actual measurement time depends on the target tissue, number of electrodes, electrode noise and distance configuration. This technique was applied to investigate the effects of luminal acid on membrane resistances of Necturus gastric (antral) mucosa. The main finding was that luminal acid closes sodium-selective, amiloride-blockable channels on the apical cell membrane, probably by protonating 1-2 amino acid residues of the channel molecule itself. These findings suggest that the epithelium can generate a protective barrier against the luminal acidic offence by closing its apical cell membrane channels. Besides direct protection against H+ influx, another possible advantage gained by closure of the Na+-selective channels in the apical cell membrane is the maintenance of a sufficient transmembrane Na+ gradient for Na+-dependent acid equivalent transport processes across the basolateral cell membrane. The method was also used to elucidate the effects of luminal ethanol on the epithelial membrane resistances of Necturus gastric mucosa. Surprisingly, the first effects were seen on the basolateral cell membrane, not on the apical cell membrane or on the shunt pathway, as would have been expected. With ion substitution and channel blocker experiments, it was deduced that potassium-selective channels on the basolateral cell membrane were opened by luminal ethanol exposure. This opening of potassium channels decreased cell volume. The present data indicate that opening of basolateral K+ channels, with resultant epithelial cell shrinkage, is among the earliest functional perturbations that might precede and underlie ethanol-induced gastric mucosal injury. The subsequent opening of apical Na+-selective channels, with a consequent increase in intracellular Na+ load after more prolonged ethanol exposure, suggests further functional deterioration of the epithelium. On the other hand, the profound changes in intraepithelial resistances provoked by a stronger ethanol insult (i.e. collapse of Ra, decrease in Rs and closing of the gap junctions, as judged from the increased Rx) are more compatible with structural damage of the epithelium and probably reflect emerging disruption of the surface epithelium.
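
    As a back-of-the-envelope illustration of the lumped equivalent circuit underlying such measurements, the following Python sketch solves the apical and basolateral membrane resistances from the transepithelial resistance, a shunt resistance (e.g., obtained from cable analysis) and the microelectrode voltage-divider ratio. The numbers and the simplified circuit are illustrative assumptions, not the thesis's measurement procedure.

        # Lumped equivalent circuit of a leaky epithelium: apical (Ra) and basolateral (Rb)
        # membrane resistances in series, in parallel with the paracellular shunt (Rs).
        def membrane_resistances(r_te, r_shunt, divider_ratio):
            """Solve Ra and Rb (ohm*cm^2) from the transepithelial resistance r_te,
            the shunt resistance r_shunt (e.g. from cable analysis) and the
            voltage-divider ratio a = Ra/Rb measured with a microelectrode."""
            r_cell = r_te * r_shunt / (r_shunt - r_te)   # Ra + Rb of the cellular path
            r_b = r_cell / (1.0 + divider_ratio)
            r_a = r_cell - r_b
            return r_a, r_b

        # Illustrative numbers only (not measurements from the study).
        ra, rb = membrane_resistances(r_te=250.0, r_shunt=800.0, divider_ratio=3.0)
        print(f"Ra = {ra:.0f} ohm*cm^2, Rb = {rb:.0f} ohm*cm^2")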

    Registration techniques for computer assisted orthopaedic surgery

    The registration of 3D preoperative medical data to patients is a key task in developing computer assisted surgery systems. In computer assisted surgery, the patient in the operating theatre must be aligned with the coordinate system in which the preoperative data have been acquired, so that the surgery planned on the preoperative data can be carried out under the guidance of the computer assisted surgery system. The aim of this research is to investigate registration algorithms for developing computer assisted bone surgery systems. We start with reference mark registration. New interpretations are given to the development of well-known algorithms based on singular value decomposition, polar decomposition techniques and the unit quaternion representation of the rotation matrix. In addition, a new algorithm is developed based on the estimate of the rotation axis. For non-landmark registration, we first develop iterative closest line segment and iterative closest triangle patch registration, analogous to the well-known iterative closest point registration, for the case where the preoperative data are dense enough. We then move to the situation where the preoperative data are not dense enough; implicit fitting is considered to interpolate the gaps between the data. A new ellipsoid fitting algorithm and a new constructive implicit fitting strategy are developed. Finally, a region-to-region matching procedure is proposed based on our novel constructive implicit fitting technique. Experiments demonstrate that the new algorithm is very stable and very efficient.
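
    As a minimal sketch of the singular value decomposition approach to reference mark registration, the following Python snippet computes a least-squares rigid transform between two point sets (the generic Kabsch/Umeyama solution); the synthetic fiducials are illustrative and this is not the thesis's exact formulation.

        import numpy as np

        def rigid_register(src, dst):
            """Least-squares rigid transform (R, t) mapping src points onto dst,
            via singular value decomposition (the Kabsch/Umeyama solution)."""
            src_c, dst_c = src - src.mean(0), dst - dst.mean(0)
            U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
            D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
            R = Vt.T @ D @ U.T
            t = dst.mean(0) - R @ src.mean(0)
            return R, t

        # Illustrative fiducial marks: recover a known rotation/translation from noisy data.
        rng = np.random.default_rng(0)
        marks = rng.normal(size=(6, 3))
        angle = np.deg2rad(30.0)
        R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                           [np.sin(angle),  np.cos(angle), 0.0],
                           [0.0, 0.0, 1.0]])
        measured = marks @ R_true.T + np.array([10.0, -4.0, 2.0]) + 0.01 * rng.normal(size=(6, 3))
        R, t = rigid_register(marks, measured)
        print("rotation error:", np.linalg.norm(R - R_true))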