8,454 research outputs found

    Digital Image Access & Retrieval

    Get PDF
    The 33rd Annual Clinic on Library Applications of Data Processing, held at the University of Illinois at Urbana-Champaign in March 1996, addressed the theme of "Digital Image Access & Retrieval." The papers from this conference cover a wide range of topics concerning digital imaging technology for visual resource collections. They fall into three general areas: (1) systems, planning, and implementation; (2) automatic and semi-automatic indexing; and (3) preservation, with the bulk of the conference focusing on indexing and retrieval. Published or submitted for publication.

    ImageJ2: ImageJ for the next generation of scientific image data

    Full text link
    ImageJ is an image analysis program extensively used in the biological sciences and beyond. Due to its ease of use, recordable macro language, and extensible plug-in architecture, ImageJ enjoys contributions from non-programmers, amateur programmers, and professional developers alike. Enabling such a diversity of contributors has resulted in a large community that spans the biological and physical sciences. However, a rapidly growing user base, diverging plugin suites, and technical limitations have revealed a clear need for a concerted software engineering effort to support emerging imaging paradigms and to ensure the software's ability to handle the requirements of modern science. Due to these new and emerging challenges in scientific imaging, ImageJ is at a critical development crossroads. We present ImageJ2, a total redesign of ImageJ offering a host of new functionality. It separates concerns, fully decoupling the data model from the user interface. It emphasizes integration with external applications to maximize interoperability. Its robust new plugin framework allows everything from image formats to scripting languages to visualization to be extended by the community. The redesigned data model supports arbitrarily large, N-dimensional datasets, which are increasingly common in modern image acquisition. Despite the scope of these changes, backwards compatibility is maintained such that this new functionality can be seamlessly integrated with the classic ImageJ interface, allowing users and developers to migrate to these new methods at their own pace. ImageJ2 provides a framework engineered for flexibility, intended to support these requirements as well as accommodate future needs.
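
    The scriptable, plugin-driven design described above can also be driven from Python through the pyimagej bridge. The following is a minimal sketch, assuming pyimagej (plus a JDK and Maven) is installed; the file path is hypothetical and the calls follow the forms commonly shown in pyimagej documentation rather than anything specific to this paper.

        # Minimal sketch: driving ImageJ2 from Python via the pyimagej bridge
        # (assumed installed with: pip install pyimagej, plus a JDK and Maven).
        import imagej

        # Initialise an ImageJ2 gateway; the artifact is fetched on first use.
        ij = imagej.init("net.imagej:imagej")

        # Open an image (hypothetical path) and view it as a NumPy-backed array.
        img = ij.io().open("samples/cells.tif")
        arr = ij.py.from_java(img)
        print("image dims:", arr.shape)

        # Apply a Gaussian blur through the ImageJ Ops framework.
        blurred = ij.op().filter().gauss(img, 2.0)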

    AI Solutions for MDS: Artificial Intelligence Techniques for Misuse Detection and Localisation in Telecommunication Environments

    Get PDF
    This report considers the application of Artificial Intelligence (AI) techniques to the problem of misuse detection and misuse localisation within telecommunications environments. A broad survey of techniques is provided, covering inter alia rule-based systems, model-based systems, case-based reasoning, pattern matching, clustering and feature extraction, artificial neural networks, genetic algorithms, artificial immune systems, agent-based systems, data mining and a variety of hybrid approaches. The report then considers the central issue of event correlation, which is at the heart of many misuse detection and localisation systems. The notion of being able to infer misuse by the correlation of individual temporally distributed events within a multiple data stream environment is explored, along with a range of techniques covering model-based approaches, 'programmed' AI and machine learning paradigms. It is found that, in general, correlation is best achieved via rule-based approaches, but that these suffer from a number of drawbacks, such as the difficulty of developing and maintaining an appropriate knowledge base, and the lack of ability to generalise from known misuses to new, unseen misuses. Two distinct approaches are evident. One attempts to encode knowledge of known misuses, typically within rules, and use this to screen events. This approach cannot generally detect misuses for which it has not been programmed, i.e. it is prone to issuing false negatives. The other attempts to 'learn' the features of event patterns that constitute normal behaviour and, by observing patterns that do not match expected behaviour, detect when a misuse has occurred. This approach is prone to issuing false positives, i.e. inferring misuse from innocent patterns of behaviour that the system was not trained to recognise. Contemporary approaches are seen to favour hybridisation, often combining detection or localisation mechanisms for both abnormal and normal behaviour, the former to capture known cases of misuse, the latter to capture unknown cases. In some systems, these mechanisms even work together to update each other, to increase detection rates and lower false positive rates. It is concluded that hybridisation offers the most promising future direction, but that a rule or state based component is likely to remain, being the most natural approach to the correlation of complex events. The challenge, then, is to mitigate the weaknesses of canonical programmed systems such that learning, generalisation and adaptation are more readily facilitated.
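
    As a toy illustration of the hybrid strategy the report favours (not a reconstruction of any surveyed system), the sketch below pairs a small signature rule base for known misuse with a simple statistical baseline of normal per-user call rates; the event fields, thresholds and rules are all hypothetical.

        # Toy hybrid misuse detector: signature rules catch known misuse patterns,
        # while a learned per-user call-rate baseline flags unknown behaviour.
        from collections import defaultdict
        from statistics import mean, stdev

        KNOWN_MISUSE_RULES = [
            # e.g. zero-duration premium calls, or repeated login failures
            lambda e: e.get("type") == "call" and e.get("duration_s") == 0 and e.get("premium"),
            lambda e: e.get("type") == "login" and e.get("failures", 0) >= 5,
        ]

        def rule_screen(event):
            """True if a known-misuse signature fires (blind to novel misuse)."""
            return any(rule(event) for rule in KNOWN_MISUSE_RULES)

        def anomaly_screen(baseline, calls_this_hour, z_threshold=3.0):
            """True if the hourly call count deviates strongly from learned 'normal'
            behaviour (prone to false positives on unseen but innocent patterns)."""
            if len(baseline) < 10:
                return False  # not enough history to model normality yet
            mu, sigma = mean(baseline), stdev(baseline)
            return sigma > 0 and (calls_this_hour - mu) / sigma > z_threshold

        history = defaultdict(list)  # user id -> hourly call counts seen so far
        history["alice"] = [3, 4, 2, 5, 3, 4, 3, 2, 4, 3]

        print(rule_screen({"type": "call", "duration_s": 0, "premium": True}))  # True
        print(anomaly_screen(history["alice"], calls_this_hour=40))             # True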

    Multipitch Analysis and Tracking for Automatic Music Transcription

    Get PDF
    Music has always played a large role in human life. The technology behind the art has progressed and grown over time in many areas, for instance the instruments themselves, the recording equipment used in studios, and the reproduction through digital signal processing. One facet of music that has seen very little attention over time is the ability to transcribe audio files into musical notation. In this thesis, a method of multipitch analysis is used to track multiple simultaneous notes through time in an audio music file. The analysis method is based on autocorrelation and a specialized peak pruning method to identify only the fundamental frequencies present at any single moment in the sequence. A sliding Hamming window is used to step through the input sound file and track through time. Results show the tracking of nontrivial musical patterns over two octaves in range and varying tempos.
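
    A bare-bones sketch of the windowed autocorrelation idea is given below. It is a single-pitch simplification of the multipitch method described above, not the thesis code; the frame length, hop size and the "strongest early peak" pruning rule are illustrative choices.

        # Sketch: sliding Hamming-window autocorrelation for rough F0 tracking
        # (single-pitch simplification; parameters are illustrative).
        import numpy as np

        def track_f0(signal, sr, frame_len=2048, hop=512, fmin=60.0, fmax=1000.0):
            """Return one rough F0 estimate (Hz) per frame, 0.0 where none is found."""
            window = np.hamming(frame_len)
            lag_min, lag_max = int(sr / fmax), int(sr / fmin)
            f0s = []
            for start in range(0, len(signal) - frame_len, hop):
                frame = signal[start:start + frame_len] * window
                ac = np.correlate(frame, frame, mode="full")[frame_len - 1:]
                ac /= ac[0] + 1e-12                     # normalise by zero-lag energy
                search = ac[lag_min:lag_max]
                lag = lag_min + int(np.argmax(search))  # crude pruning: strongest lag
                f0s.append(sr / lag if search.max() > 0.3 else 0.0)
            return np.array(f0s)

        # A 440 Hz test tone should yield estimates near 440 Hz.
        sr = 16000
        t = np.arange(sr) / sr
        print(track_f0(np.sin(2 * np.pi * 440 * t), sr)[:5])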

    A workshop on the gathering of information for problem formulation

    Get PDF
    Issued as Quarterly progress reports no. [1-5], Proceedings, and Final contract report, Project no. G-36-651. Papers presented at the Workshop/Symposium on Human Computer Interaction, March 26 and 27, 1981, Atlanta, GA.

    A Parametric Sound Object Model for Sound Texture Synthesis

    Get PDF
    This thesis deals with the analysis and synthesis of sound textures based on parametric sound objects. An overview is provided about the acoustic and perceptual principles of textural acoustic scenes, and technical challenges for analysis and synthesis are considered. Four essential processing steps for sound texture analysis are identified, and existing sound texture systems are reviewed, using the four-step model as a guideline. A theoretical framework for analysis and synthesis is proposed. A parametric sound object synthesis (PSOS) model is introduced, which is able to describe individual recorded sounds through a fixed set of parameters. The model, which applies to harmonic and noisy sounds, is an extension of spectral modeling and uses spline curves to approximate spectral envelopes, as well as the evolution of parameters over time. In contrast to standard spectral modeling techniques, this representation uses the concept of objects instead of concatenated frames, and it provides a direct mapping between sounds of different length. Methods for automatic and manual conversion are shown. An evaluation is presented in which the ability of the model to encode a wide range of different sounds has been examined. Although there are aspects of sounds that the model cannot accurately capture, such as polyphony and certain types of fast modulation, the results indicate that high quality synthesis can be achieved for many different acoustic phenomena, including instruments and animal vocalizations. In contrast to many other forms of sound encoding, the parametric model facilitates various techniques of machine learning and intelligent processing, including sound clustering and principal component analysis. Strengths and weaknesses of the proposed method are reviewed, and possibilities for future development are discussed.
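
    As a rough illustration of the spline-envelope idea (the PSOS model itself is considerably richer), the sketch below fits a smoothing spline to the log-magnitude spectrum of a single synthetic frame; the frame length and smoothing factor are arbitrary choices, not values from the thesis.

        # Sketch: approximating one frame's spectral envelope with a smoothing spline.
        import numpy as np
        from scipy.interpolate import UnivariateSpline

        sr = 16000
        t = np.arange(2048) / sr
        frame = np.sin(2 * np.pi * 220 * t) + 0.5 * np.sin(2 * np.pi * 440 * t)

        spectrum = np.abs(np.fft.rfft(frame * np.hamming(len(frame))))
        freqs = np.fft.rfftfreq(len(frame), d=1.0 / sr)
        log_mag = 20 * np.log10(spectrum + 1e-9)

        # The smoothing factor s controls how coarsely the spline follows partials.
        envelope = UnivariateSpline(freqs, log_mag, s=len(freqs) * 50.0)

        print("envelope at 220 Hz and 440 Hz (dB):", envelope(220.0), envelope(440.0))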

    High-Energy-Physics Event Generation with PYTHIA 6.1

    Get PDF
    PYTHIA version 6 represents a merger of the PYTHIA 5, JETSET 7 and SPYTHIA programs, with many improvements. It can be used to generate high-energy-physics 'events', i.e. sets of outgoing particles produced in the interactions between two incoming particles. The objective is to provide as accurate as possible a representation of event properties in a wide range of reactions. The underlying physics is not understood well enough to give an exact description; the programs therefore contain a combination of analytical results and various models. The emphasis in this article is on new aspects, but a few words of general introduction are included. Further documentation is available on the web. Comment: 1 + 27 pages, submitted to Computer Physics Communications.
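
    PYTHIA 6.1 itself is a Fortran library driven through calls such as PYINIT and PYEVNT. As a loose illustration of the same generate-and-inspect workflow, the sketch below uses the Python bindings of the successor generator PYTHIA 8 (an assumption: those bindings, not PYTHIA 6.1, are what is scripted here).

        # Sketch of an event-generation loop. PYTHIA 6.1 is driven from Fortran
        # (PYINIT / PYEVNT); here the PYTHIA 8 Python bindings stand in for that
        # workflow and are assumed to be built and importable.
        import pythia8

        pythia = pythia8.Pythia()
        pythia.readString("Beams:eCM = 14000.")   # 14 TeV proton-proton collisions
        pythia.readString("HardQCD:all = on")     # enable hard QCD processes
        pythia.init()

        charged_multiplicities = []
        for _ in range(10):                       # generate a handful of events
            if not pythia.next():
                continue
            n_charged = sum(
                1 for i in range(pythia.event.size())
                if pythia.event[i].isFinal() and pythia.event[i].isCharged()
            )
            charged_multiplicities.append(n_charged)

        print("charged multiplicities:", charged_multiplicities)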

    An Aerial Gamma Ray Survey of Springfields and the Ribble Estuary in September 1992

    Get PDF
    A short aerial gamma ray survey was conducted in the vicinity of the Springfields site and Ribble Estuary from 1st-5th September 1992, to define existing background radiation levels against which any future changes can be assessed. A twin-engine AS 355 "Squirrel" helicopter chartered from Dollar Helicopters was used for this work. It was loaded with a 16 litre NaI(Tl) gamma ray detector and spectroscopy system on 31st August, and during the following days over 2700 separate spectra were recorded within a survey area of 20 x 12 km. Gamma ray spectra were recorded every 5 seconds at a survey speed and altitude of 120 kph and 75 m respectively. A flight line spacing of 0.3 km was chosen for the main survey area. On 3rd September a low-altitude, high-spatial-resolution survey (flight line spacing 100 m and altitude 30 m) was made over Banks Marsh (an area frequented by local wild fowlers).

    Survey results have been stored archivally and used to map the naturally occurring radionuclides 40K, 214Bi and 208Tl together with 137Cs and total gamma ray flux. In addition, for the first time, estimates of 234mPa in terms of deconvoluted count rate (normalised to 100 m altitude) were made in the presence of 228Ac interference, probably in disequilibrium with its parent thorium series.

    The maps provide a clear indication of the distribution and sources of environmental radioactivity in the Ribble at the time of the survey. The Ribble estuary is subject to regular and ongoing ground-based studies by BNF, MAFF, HMIP, and university-based groups, as a result of the authorised discharges of low level radioactivity from the Springfields site. The results of this survey complement this ground-based work, and add to confidence that the estuarine system, its associated sediments, tide-washed pastures, salt marshes and river banks have been thoroughly examined. There is support for earlier conclusions that the Cs on the salt marshes is the dominant source of external gamma exposure, and that the Springfields contribution to these locations is minor in comparison with this Sellafield-derived signal. Upstream the situation is more complex, particularly where the dynamic sources of beta radiation are considered. As far as critical group assessments are concerned, the survey provides clear evidence that the areas affected by 137Cs, where external gamma dose and possible food chain effects are of greatest interest, are in the lower reaches of the Ribble, whereas at the time of the survey the 234mPa distribution was in the upper reaches of the river. This not only confirms the findings of ground-based work, but provides some assurance that the different exposure paths (external gamma dose, skin dose) are not entirely synergistic. The discovery of possible transient sources of natural 228Ac in the salt marsh environment, as a consequence of Th series disequilibrium immediately following spring tides, is extremely interesting. If substantiated by further studies using semiconductor detectors, this provides a new insight into the dynamic radiation environment of tide-washed contexts.

    Aerial survey can potentially provide a rapid and cost-effective means of studying environmentally dynamic sources such as 234mPa. In the case of the Ribble it would be necessary to reduce survey height to below 50 m ground clearance to improve spatial resolution. Possible inconvenience to residents and property owners of such low-altitude flights would have to be considered, in addition to the potential value of environmental knowledge of the behaviour of short-lived nuclides in a dynamic system such as the Ribble estuary. There is nonetheless considerable potential for time series studies of this location. Recent flight trials by SURRC incorporating high-efficiency germanium semiconductor detectors have verified the feasibility and potential of a hybrid scintillation/semiconductor spectrometer. Such a device can resolve any ambiguities arising from overlapping gamma ray peaks. This is particularly relevant to the confirmation of 228Ac in salt marshes. Ground-based sampling at the time of measurement would enable concentration calibrations to be made for these dynamic sources. Further ground-based measurements would be desirable to establish the extent to which low energy photons contribute to external gamma ray dose rates from sources with pronounced subsurface activity maxima.
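
    The 100 m normalisation mentioned above is, in general aerial-survey practice, an exponential air-attenuation correction. The sketch below is illustrative only and is not the SURRC processing chain; the attenuation coefficient is a placeholder, not a value from this report.

        # Illustrative only: scaling an observed aerial count rate to a 100 m
        # reference altitude with a simple exponential air-attenuation model.
        # The attenuation coefficient is a placeholder, not a survey value.
        import math

        def normalise_to_reference(count_rate, altitude_m,
                                   ref_altitude_m=100.0, attenuation_per_m=0.009):
            """Scale a count rate observed at altitude_m to its ref_altitude_m equivalent."""
            return count_rate * math.exp(attenuation_per_m * (altitude_m - ref_altitude_m))

        # A rate observed at the 75 m survey altitude maps to a smaller 100 m equivalent.
        print(normalise_to_reference(520.0, 75.0))   # roughly 415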

    MPEG-SCORM: an interoperable metadata ontology for the integration of multimedia and e-learning standards

    Get PDF
    Advisor: Yuzo Iano. Doctoral thesis, Universidade Estadual de Campinas, Faculdade de Engenharia Elétrica e de Computação. Abstract (translated from Portuguese): The convergence of digital media proposes an integration between the ICT, focused on the multimedia domain (under the responsibility of the Moving Picture Experts Group, constituting subcommittee ISO/IEC JTC1 SC29), and the ICTE (ICT for Education, managed by ISO/IEC JTC1 SC36), highlighting the MPEG standards, employed as content and metadata for multimedia, and the ICTE applied to Distance Education, or e-Learning. This raises the problem of developing an interoperable correspondence between normative bases, thus arriving at an innovative proposal for the convergence between digital media and essentially multimedia e-Learning applications. To this end, it is proposed to create and apply an ontology of interoperable metadata for the web, digital TV and extensions to mobile devices, based on the integration of the MPEG-21 and SCORM metadata standards and employing the XPath language. Abstract: The convergence of digital media offers an integration of the ICT, focused on the telecommunications and multimedia domain (under the responsibility of the Moving Picture Experts Group, ISO/IEC JTC1 SC29), with the ICTE (the ICT for Education, managed by the ISO/IEC JTC1 SC36), highlighting the MPEG formats, featured as content and as description metadata potentially applied to multimedia or digital TV and as a technology applied to e-Learning. This presents the problem of developing an interoperable matching of normative bases, achieving an innovative proposal for the convergence between digital telecommunications and applications for e-Learning, both essentially multimedia. To achieve this purpose, it is proposed to create an ontology for interoperability between educational applications and Digital TV environments and vice versa, simultaneously facilitating the creation of learning-metadata-based objects for Digital TV programs as well as providing multimedia video content as learning objects for Distance Education. This ontology is designed as interoperable metadata for the Web, Digital TV and e-Learning, built on the integration between the MPEG-21 and SCORM metadata standards, employing the XPath language. Doctorate in Telecommunications and Telematics. Doctor of Electrical Engineering. CAPES.
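
    To illustrate the role of XPath in querying one metadata standard so that another can be populated, the sketch below pulls a title out of a SCORM-style manifest and wraps it in an MPEG-21 DIDL-style element using lxml; the XML snippets and element choices are simplified, hypothetical stand-ins rather than the ontology developed in the thesis.

        # Sketch: using XPath (via lxml) to read a title from a SCORM-style manifest
        # and carry it into an MPEG-21 DIDL-style descriptor. The XML and element
        # names are simplified, hypothetical illustrations.
        from lxml import etree

        scorm_manifest = etree.fromstring(
            b"<manifest><organizations><organization>"
            b"<title>Intro to Digital TV</title>"
            b"</organization></organizations></manifest>"
        )

        # XPath query against the (namespace-free, simplified) SCORM manifest.
        titles = scorm_manifest.xpath("/manifest/organizations/organization/title/text()")

        # Build a minimal DIDL-like wrapper carrying the same metadata.
        didl = etree.Element("DIDL")
        item = etree.SubElement(didl, "Item")
        descriptor = etree.SubElement(item, "Descriptor")
        statement = etree.SubElement(descriptor, "Statement")
        statement.text = titles[0] if titles else "untitled"

        print(etree.tostring(didl, pretty_print=True).decode())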

    AXMEDIS 2007 Conference Proceedings

    Get PDF
    The AXMEDIS International Conference series has been running since 2005 and is focused on research, developments and applications in the cross-media domain, exploring innovative technologies to meet the challenges of the sector. AXMEDIS 2007 deals with all subjects and topics related to cross-media and digital-media content production, processing, management, standards, representation, sharing, interoperability, protection and rights management. It addresses the latest developments and future trends of the technologies and their applications, their impact and exploitation within academic, business and industrial communities.