591 research outputs found

    Methylene tetrahydrofolate reductase gene and coronary artery disease

    Abstract is not provided by the author/publisher.

    Advanced solutions for quality-oriented multimedia broadcasting

    Multimedia content is increasingly being delivered via different types of networks to viewers in a variety of locations and contexts using a variety of devices. The ubiquitous nature of multimedia services comes at a cost, however. The successful delivery of multimedia services will require overcoming numerous technological challenges, many of which have a direct effect on the quality of the multimedia experience. For example, due to dynamically changing requirements and networking conditions, the delivery of multimedia content has traditionally adopted a best-effort approach. However, this approach has often degraded the end-user perceived quality of multimedia-based services. Yet the quality of multimedia content is a vital issue for the continued acceptance and proliferation of these services. Indeed, end-users are becoming increasingly quality-aware in their expectations of the multimedia experience and demand an ever-widening spectrum of rich multimedia-based services. As a consequence, there is a continuous and extensive research effort, by both industry and academia, to find solutions for improving the quality of multimedia content delivered to users; in addition, international standards bodies, such as the International Telecommunication Union (ITU), are renewing their efforts on the standardization of multimedia technologies. Research has pursued very different directions in seeking to improve the quality of the rich media content delivered over various network types. It is in this context that this special issue on broadcast multimedia quality of the IEEE Transactions on Broadcasting illustrates some of these avenues and presents some of the most significant research results obtained by various teams of researchers from many countries. This special issue provides an example, albeit inevitably limited, of the richness and breadth of the current research on multimedia broadcasting services. The research issues addressed in this special issue include, among others, factors that influence user perceived quality, encoding-related quality assessment and control, transmission- and coverage-based solutions, and objective quality measurements.

    Gene markers and complex disorders: A review


    Network coding meets multimedia: a review

    While every network node only relays messages in a traditional communication system, the recent network coding (NC) paradigm proposes to implement simple in-network processing with packet combinations in the nodes. NC extends the concept of "encoding" a message beyond source coding (for compression) and channel coding (for protection against errors and losses). It has been shown to increase network throughput compared to traditional network implementations, to reduce delay and to provide robustness to transmission errors and network dynamics. These features are so appealing for multimedia applications that they have spurred a large research effort towards the development of multimedia-specific NC techniques. This paper reviews the recent work in NC for multimedia applications and focuses on the techniques that fill the gap between NC theory and practical applications. It outlines the benefits of NC and presents the open challenges in this area. The paper initially focuses on multimedia-specific aspects of network coding, in particular delay, in-network error control, and media-specific error control. These aspects make it possible to handle varying network conditions as well as client heterogeneity, which are critical to the design and deployment of multimedia systems. After introducing these general concepts, the paper reviews in detail two applications that lend themselves naturally to NC via the cooperation and broadcast models, namely peer-to-peer multimedia streaming and wireless networking.
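    To make the packet-combination idea described in this abstract concrete, here is a minimal sketch of XOR-based (GF(2)) network coding at a relay node. The packet contents and the two-receiver scenario are hypothetical illustrations, not material from the paper.

```python
# Minimal sketch of XOR-based network coding (GF(2)): a relay node forwards
# a combination of packets instead of relaying each one separately.
# Packet contents and the two-receiver setup below are hypothetical.

def xor_combine(pkt_a: bytes, pkt_b: bytes) -> bytes:
    """Bitwise XOR of two equal-length packets (the in-network combination)."""
    return bytes(a ^ b for a, b in zip(pkt_a, pkt_b))

# Source packets: receiver 1 already holds p1 and wants p2, and vice versa.
p1 = b"HELLO_R2"
p2 = b"HELLO_R1"

# The relay broadcasts a single coded packet instead of forwarding both.
coded = xor_combine(p1, p2)

# Each receiver decodes by XOR-ing the coded packet with the packet it holds.
decoded_at_r1 = xor_combine(coded, p1)   # recovers p2
decoded_at_r2 = xor_combine(coded, p2)   # recovers p1

assert decoded_at_r1 == p2 and decoded_at_r2 == p1
print("both receivers recovered their packet from one broadcast transmission")
```

    One broadcast transmission serves both receivers, which is the throughput benefit the abstract refers to.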

    Rate distortion optimized graph partitioning for omnidirectional image coding

    Omnidirectional images are spherical signals captured by cameras with a 360-degree field of view. In order to be compressed using existing encoders, these signals are mapped to the planar domain. A commonly used planar representation is the equirectangular one, which corresponds to a non-uniform sampling pattern on the spherical surface. This particularity is not exploited by traditional image compression schemes, which treat the input signal as a classical perspective image. In this work, we build a graph-based coder adapted to the spherical surface, constructing a graph directly on the sphere. Then, to keep the graph transforms computationally feasible, we propose a rate-distortion optimized graph partitioning algorithm to achieve an effective trade-off between the distortion of the reconstructed signals, the smoothness of the signal on each subgraph, and the cost of coding the graph partitioning description. Experimental results demonstrate that our method outperforms JPEG coding of planar equirectangular images.
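    The rate-distortion trade-off driving such a partitioning choice can be sketched as a Lagrangian cost J = D + λR, where D is the reconstruction distortion and R covers both the transform coefficients and the partition description. The candidate partitions and all numbers below are invented placeholders, not the paper's actual coder.

```python
# Hypothetical sketch of a Lagrangian rate-distortion decision of the kind
# used to choose among candidate graph partitions: pick the partition that
# minimizes J = D + lambda * R. All values are illustrative placeholders.

def rd_cost(distortion: float, rate_bits: float, lam: float) -> float:
    """Lagrangian cost combining reconstruction distortion and coding rate."""
    return distortion + lam * rate_bits

# Each candidate: (label, distortion of reconstructed signal,
#                  bits for transform coefficients + partition description)
candidates = [
    ("coarse partition (few large subgraphs)", 12.5, 800 + 40),
    ("medium partition",                        8.1, 950 + 90),
    ("fine partition (many small subgraphs)",   6.4, 1100 + 260),
]

lam = 0.01  # Lagrange multiplier, set by the target rate/quality point
best = min(candidates, key=lambda c: rd_cost(c[1], c[2], lam))
for name, d, r in candidates:
    print(f"{name}: J = {rd_cost(d, r, lam):.2f}")
print("selected:", best[0])
```

    A finer partition lowers distortion but pays more bits to describe itself, which is exactly the trade-off the abstract's partitioning algorithm optimizes.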

    Association study of two interleukin-1 gene loci with essential hypertension in a Pakistani Pathan population

    An association study of IL-1 beta -511C/T and IL-1RN 86 bp VNTR polymorphisms with essential hypertension was carried out in a randomly selected sample of 500 Pakistani Pathan subjects, comprising 235 subjects with hypertension and 265 controls. The distributions of both genotypes and alleles did not differ significantly between cases and controls. In conclusion, IL-1 beta -511C/T and IL-1RN 86 bp VNTR do not contribute to the aetiology of essential hypertension in the Pakistani Pathan population investigated here.
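    The reported comparison of genotype distributions is the kind of case-control contingency test sketched below; the genotype counts are invented for illustration and do not reproduce the study's data.

```python
# Hypothetical illustration of a case-control genotype comparison using a
# chi-square contingency test. The counts below are invented and do NOT
# reproduce the study's data.
from scipy.stats import chi2_contingency

# Rows: hypertensive cases, normotensive controls
# Columns: genotype counts for CC, CT, TT (e.g., for a -511C/T polymorphism)
table = [
    [100, 105, 30],   # cases (n = 235)
    [115, 115, 35],   # controls (n = 265)
]

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.3f}")
# A p-value above the chosen significance level (e.g., 0.05) is consistent
# with the reported lack of association between genotype and hypertension.
```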

    Biomechanical analyses of the performance of Paralympians: From foundation to elite level

    Biomechanical analysis of sport performance provides an objective method of determining the performance of a particular sporting technique. In particular, it aims to add to the understanding of the mechanisms influencing performance and the characterization of athletes, and to provide insights into injury predisposition. Whilst the sporting performance of able-bodied athletes is well recognised in the literature, less is known about the complexity, constraints and demands placed on the body of an individual with a disability. This paper provides a dialogue that outlines scientific issues of performance analysis of multi-level athletes with a disability, including Paralympians. Four integrated themes are explored, the first of which focuses on how biomechanics can contribute to the understanding of sport performance in athletes with a disability and how it may be used as an evidence-based tool. This latter point questions the potential for a cultural shift led by the emergence of user-friendly instruments. The second theme briefly discusses the role of reliability in sport performance analysis and addresses the debate between two-dimensional and three-dimensional analysis. The third theme addresses key biomechanical parameters and provides guidance to clinicians and coaches on the approaches adopted in biomechanical/sport performance analysis, from an athlete with a disability starting out through to the emerging and elite Paralympian. For completeness of this discourse, the final theme presents the controversial issues surrounding the role of assistive devices and the inclusion of Paralympians in able-bodied sport. All combined, this dialogue highlights the intricate relationship between biomechanics and the training of individuals with a disability. Furthermore, it illustrates the complexity of modern athlete training, which can only lead to a better appreciation of the performances to be delivered at the London 2012 Paralympic Games.

    A Posteriori Quantization of Progressive Matching Pursuit Streams

    This paper proposes a rate-distortion optimal a posteriori quantization scheme for Matching Pursuit coefficients. The a posteriori quantization applies to a Matching Pursuit expansion that has been generated off-line, and cannot benefit from any feedback loop to the encoder to compensate for the quantization noise. The redundancy of the Matching Pursuit dictionary provides an indicator of the relative importance of coefficients and atom indices, and consequently of the quantization error. It is used to define a universal upper bound on the decay of the coefficients, sorted in decreasing order of magnitude. A new quantization scheme is then derived, where this bound is used as an Oracle for the design of an optimal a posteriori quantizer. The latter turns the entropy-constrained quantization problem for exponentially distributed coefficients into a simple uniform quantization problem. Using simulations with random dictionaries, we show that the proposed exponentially upper-bounded quantization (EUQ) clearly outperforms classical schemes. Building on the ideal Oracle-based approach, a sub-optimal adaptive scheme is then designed that approximates the EUQ but still outperforms competing quantization methods in terms of rate-distortion characteristics. Finally, the proposed quantization method is studied in the context of image coding. It performs similarly to state-of-the-art coding methods (and even better at low rates), while providing a progressive stream that is very easy to transcode and adapt to changing rate constraints.
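    A minimal sketch of the exponentially upper-bounded quantization idea follows: coefficients sorted by decreasing magnitude are assumed to lie under an exponential envelope, normalized by it, and then quantized uniformly. The decay factor, number of levels and coefficient values are illustrative assumptions, not the paper's actual design.

```python
# Hedged sketch of exponentially upper-bounded quantization: assume the sorted
# coefficient magnitudes satisfy |c_n| <= c_0 * decay**n, normalize each
# coefficient by that envelope, then quantize the normalized value uniformly.
# Decay factor, level count and coefficients are illustrative placeholders.
import numpy as np

def euq_quantize(coeffs, decay=0.9, levels=16):
    """Quantize MP coefficients given in decreasing order of magnitude."""
    coeffs = np.asarray(coeffs, dtype=float)
    n = np.arange(len(coeffs))
    envelope = np.abs(coeffs[0]) * decay ** n   # assumed exponential upper bound
    normalized = coeffs / envelope               # normalized values lie in [-1, 1]
    step = 2.0 / levels
    indices = np.round(normalized / step).astype(int)   # uniform quantization
    return indices, envelope, step

def euq_dequantize(indices, envelope, step):
    """Reconstruct coefficients from quantization indices and the envelope."""
    return indices * step * envelope

coeffs = np.array([1.00, -0.72, 0.55, 0.38, -0.25, 0.16])
idx, env, step = euq_quantize(coeffs)
recon = euq_dequantize(idx, env, step)
print("max reconstruction error:", np.max(np.abs(coeffs - recon)))
```

    Because the quantization step scales with the envelope, later (smaller) coefficients are automatically quantized more finely in absolute terms, which is the effect that lets a uniform quantizer stand in for an entropy-constrained one in this sketch.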