
    Real-time camera motion tracking in planar view scenarios

    We propose a novel method for real-time camera motion tracking in planar view scenarios. The method relies on the geometry of a tripod, an initial estimate of the camera pose for the first video frame, and a primitive-tracking procedure. This procedure uses lines and circles as primitives, which are extracted by applying a classification and regression tree. We have applied the proposed method to high-definition videos of soccer matches. Experimental results show that our proposal can process high-definition video in real time. We validate the procedure by inserting virtual content into the video sequence.
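
    As an illustration of the primitive-tracking step described above, the sketch below extracts line and circle primitives (pitch markings, centre circle) from a frame using OpenCV. The colour threshold standing in for the paper's classification and regression tree, and all parameter values, are illustrative assumptions rather than the authors' implementation.

```python
# Minimal sketch: extracting line and circle primitives from a soccer frame.
# The paper classifies pixels with a classification and regression tree (CART);
# here a simple white-marking brightness threshold stands in for that step.
import cv2
import numpy as np

def extract_primitives(frame_bgr):
    """Return candidate line segments and circles (e.g. the centre circle)."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    # Stand-in for the CART pixel classifier: keep bright (white-line) pixels.
    _, mask = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)

    # Line primitives (pitch markings) via the probabilistic Hough transform.
    lines = cv2.HoughLinesP(mask, rho=1, theta=np.pi / 180, threshold=80,
                            minLineLength=50, maxLineGap=10)

    # Circle primitives (centre circle, arcs) via the Hough circle transform.
    circles = cv2.HoughCircles(mask, cv2.HOUGH_GRADIENT, dp=1.2, minDist=200,
                               param1=100, param2=40, minRadius=40, maxRadius=300)
    return lines, circles
```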

    HIF-transcribed p53 chaperones HIF-1α

    Chronic hypoxia is associated with a variety of physiological conditions such as rheumatoid arthritis, ischemia/reperfusion injury, stroke, diabetic vasculopathy, epilepsy and cancer. At the molecular level, hypoxia manifests its effects via activation of HIF-dependent transcription. On the other hand, an important transcription factor, p53, which controls a myriad of biological functions, is rendered transcriptionally inactive under hypoxic conditions. p53 and HIF-1α are known to share a mysterious relationship and play an ambiguous role in the regulation of hypoxia-induced cellular changes. Here we demonstrate a novel pathway in which HIF-1α transcriptionally upregulates both wild-type (WT) and mutant (MT) p53 by binding to five response elements in the p53 promoter. In hypoxic cells, this HIF-1α-induced p53 is transcriptionally inefficient but is abundantly available for protein-protein interactions. Further, both WT and MT p53 proteins bind and chaperone HIF-1α to stabilize its binding at its downstream DNA response elements. This p53-mediated chaperoning of HIF-1α increases the synthesis of HIF-regulated genes and thus the efficiency of hypoxia-induced molecular changes. This basic biology finding has important implications not only for the design of anti-cancer strategies but also for other physiological conditions in which hypoxia results in disease manifestation.

    SCORE: Exploiting Global Broadcasts to Create Offline Personal Channels for On-Demand Access

    The last 5 years have seen a dramatic shift in media distribution. For decades, TV and radio were provisioned solely using push-based broadcast technologies, forcing people to adhere to fixed schedules. The introduction of catch-up services, however, has now augmented such delivery with online pull-based alternatives. Typically, these allow users to fetch content for a limited period after the initial broadcast, giving them flexibility in accessing content. Whereas previous work has investigated each of these technologies, this paper explores and contrasts them, focusing on the network consequences of moving towards this multifaceted delivery model. Using traces from nearly 6 million users of BBC iPlayer, one of the largest catch-up TV services, we study this shift from push- to pull-based access. We propose a novel technique for unifying push- and pull-based delivery: the Speculative Content Offloading and Recording Engine (SCORE). SCORE operates as a set-top box that interacts with both broadcast push and online pull services. Whenever users wish to access media, it automatically switches between these distribution mechanisms in an attempt to optimize energy efficiency and network resource utilization. SCORE can also predict user viewing patterns, automatically recording certain shows from the broadcast interface. Evaluations using our BBC iPlayer traces show that, depending on parameter settings, an oracle with complete knowledge of user consumption can save nearly 77% of the energy, and over 90% of the peak bandwidth, of pure IP streaming. Optimizing for energy consumption, SCORE can recover nearly half of both the traffic and energy savings.
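
    The sketch below illustrates, under stated assumptions, the kind of speculative-recording decision described above: record a show from the broadcast interface now if the expected energy cost is lower than streaming it over IP on demand later. The class names, probabilities, and energy figures are hypothetical and not taken from the paper or the BBC iPlayer traces.

```python
# Minimal sketch of SCORE-style speculative recording logic.
# All names, probabilities, and energy figures are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Show:
    title: str
    watch_probability: float   # predicted from the user's viewing history
    duration_min: float

# Hypothetical per-minute energy costs (joules) for the two delivery paths.
ENERGY_RECORD_BROADCAST = 2.0   # capture now from the push (broadcast) interface
ENERGY_STREAM_IP = 5.0          # pull the same content later over IP

def should_record(show: Show) -> bool:
    """Record speculatively if recording now costs less energy, in expectation,
    than streaming the show over IP only if the user actually watches it."""
    expected_record = ENERGY_RECORD_BROADCAST * show.duration_min
    expected_stream = show.watch_probability * ENERGY_STREAM_IP * show.duration_min
    return expected_record < expected_stream

if __name__ == "__main__":
    guide = [Show("Evening News", 0.9, 30), Show("Late Film", 0.2, 120)]
    for s in guide:
        print(s.title, "-> record" if should_record(s) else "-> stream on demand")
```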

    Real-Time Production and Delivery of 3D Media

    The Prometheus project has investigated new ways of creating, distributing and displaying 3D television. The tools developed will also help today’s virtual studio production. 3D content is created by extending the principles of a virtual studio to include realistic 3D representation of actors. Several techniques for this have been developed:
    • Texture-mapping of live video onto rough 3D actor models.
    • Fully-animated 3D avatars:
        • Photo-realistic body model generated from several still images of a person from different viewpoints.
        • Addition of a detailed head model taken from two close-up images of the head.
        • Tracking of face and body movements of a live performer using several cameras, to derive animation data which can be applied to the face and body.
        • Simulation of virtual clothing which can be applied to the animated avatars.
    MPEG-4 is used to distribute the content in its original 3D form. The 3D scene may be rendered in a form suitable for display on a ‘glasses-free’ 3D display, based on the principle of Integral Imaging. By assembling these elements in an end-to-end chain, the project has shown how a future 3D TV system could be realised. Furthermore, the tools developed will also improve the production methods available for conventional virtual studios, by focusing on sensor-free and markerless motion capture technology, methods for the rapid creation of photo-realistic virtual humans, and real-time clothing simulation.
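
    To make the first technique in the list above concrete, the sketch below shows the core of texture-mapping live video onto a rough 3D actor model: projecting each model vertex through the studio camera's projection matrix to obtain texture coordinates into the live frame. The function name, the 3×4 camera matrix, and the vertex list are illustrative assumptions, not the project's implementation.

```python
# Minimal sketch of projective texture mapping of a live video frame
# onto a rough 3-D actor model. The camera matrix and vertices are
# supplied by the caller; everything here is an illustrative assumption.
import numpy as np

def video_texture_coords(vertices_xyz, camera_P, frame_w, frame_h):
    """Project 3-D model vertices into the video frame and return normalised UVs."""
    vertices_xyz = np.asarray(vertices_xyz, dtype=float)
    ones = np.ones((len(vertices_xyz), 1))
    homogeneous = np.hstack([vertices_xyz, ones])      # N x 4 homogeneous points
    projected = homogeneous @ camera_P.T               # N x 3 rows of (x, y, w)
    uv_pixels = projected[:, :2] / projected[:, 2:3]   # perspective divide -> pixels
    return uv_pixels / np.array([frame_w, frame_h])    # normalise to [0, 1] texture space
```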

    Real-time Camera Tracking in the Matris Project

    In order to insert a virtual object into a TV image, the graphics system needs to know precisely how the camera is moving, so that the virtual object can be rendered in the correct place in every frame. Nowadays this can be achieved relatively easily in post-production, or in a studio equipped with a special tracking system. However, for live shooting on location, or in a studio that is not specially equipped, installing such a system can be difficult or uneconomic. To overcome these limitations, the MATRIS project is developing a real-time system for measuring the movement of a camera. The system uses image analysis to track naturally occurring features in the scene, combined with data from an inertial sensor. No additional sensors, special markers, or camera mounts are required. This paper gives an overview of the system and presents some results.
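
    A minimal sketch of the hybrid vision/inertial idea follows: dead-reckon the camera orientation from gyro rates at a high update rate, and correct the accumulated drift whenever image-based tracking of natural features yields a pose estimate. A single pan angle stands in for the full camera pose and the blend factor is an illustrative assumption; the actual MATRIS fusion is more sophisticated.

```python
# Toy complementary-filter fusion of inertial and vision-based orientation,
# illustrating the hybrid tracking idea. All values are illustrative.

def fuse_orientation(pan_deg, gyro_rate_dps, dt, vision_pan_deg=None, blend=0.05):
    """One filter step: dead-reckon with the gyro, then pull the estimate
    towards the vision measurement whenever one is available."""
    pan_deg += gyro_rate_dps * dt              # inertial prediction (smooth, drifts)
    if vision_pan_deg is not None:             # vision correction (removes drift)
        pan_deg += blend * (vision_pan_deg - pan_deg)
    return pan_deg

# Usage: 200 Hz gyro samples, with a vision pose every 25th step (~8 Hz).
pan = 0.0
for step in range(200):
    vision = 10.0 if step % 25 == 0 else None  # hypothetical vision estimate
    pan = fuse_orientation(pan, gyro_rate_dps=1.0, dt=0.005, vision_pan_deg=vision)
```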