301 research outputs found
Service Platform for Converged Interactive Broadband Broadcast and Cellular Wireless
A converged broadcast and telecommunication service platform is presented that is able to create, deliver, and manage interactive multimedia content and services for consumption on three different terminal types. The motivations of service providers for designing converged interactive multimedia services, crafted for their individual requirements, are investigated. The overall design of the system is presented, with particular emphasis on the operational features of each of the sub-systems, the flows of media and metadata through the sub-systems, and the formats and protocols required for inter-communication between them. The key features of the tools required for creating converged interactive multimedia content for a range of different end-user terminal types are examined. Finally, possible enhancements to the system are discussed. This study is of particular interest to organizations currently conducting trials and commercial launches of DVB-H services because it provides them with insight into the various additional functions required in service provisioning platforms to provide fully interactive services to a range of different mobile terminal types.
Delivery of Personalized and Adaptive Content to Mobile Devices: A Framework and Enabling Technology
Many innovative wireless applications that aim to provide mobile information access are emerging. Since people have different information needs and preferences, one of the challenges for mobile information systems is to take advantage of the convenience of handheld devices and provide personalized information to the right person in a preferred format. However, the unique features of wireless networks and mobile devices pose challenges to personalized mobile content delivery. This paper proposes a generic framework for delivering personalized and adaptive content to mobile users. It introduces a variety of enabling technologies and highlights important issues in this area. The framework can be applied to many applications, such as mobile commerce and context-aware mobile services.
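The adaptation step in such a framework can be illustrated with a small sketch. The device-profile fields, variant ordering, and function names below are hypothetical, not taken from the paper; they only show the general idea of matching content variants to device capabilities:

```python
from dataclasses import dataclass

@dataclass
class DeviceProfile:
    screen_width: int        # display width in pixels
    supports_video: bool     # whether the device can decode video

@dataclass
class ContentVariant:
    kind: str                # "video", "image", or "text"
    min_width: int           # minimum screen width this variant needs

def select_variant(profile: DeviceProfile,
                   variants: list[ContentVariant]) -> ContentVariant:
    """Return the first variant the device can render, assuming the
    list is ordered from richest to plainest representation."""
    for v in variants:
        if v.kind == "video" and not profile.supports_video:
            continue
        if profile.screen_width < v.min_width:
            continue
        return v
    raise ValueError("no suitable content variant for this device")
```

A real framework would also weigh user preferences and network conditions; this sketch covers only the device-capability dimension.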
Multimodal Content Delivery for Geo-services
This thesis describes a body of work carried out over several research projects in the area of multimodal interaction for location-based services. Research in this area has progressed from using simulated mobile environments to demonstrate the visual modality to the ubiquitous delivery of rich media using multimodal interfaces (geo-services). To deliver these services effectively, the research focused on innovative solutions to real-world problems in a number of disciplines, including geo-location, mobile spatial interaction, location-based services, rich media interfaces, and auditory user interfaces. My original contributions to knowledge are made in the areas of multimodal interaction, underpinned by advances in geo-location technology and supported by the proliferation of mobile device technology into modern life. Accurate positioning is a known problem for location-based services; contributions in the area of mobile positioning demonstrate a hybrid positioning technology for mobile devices that uses terrestrial beacons to trilaterate position. Information overload is an active concern for location-based applications that struggle to manage large amounts of data; contributions in the area of egocentric visibility, which filter data based on field of view, demonstrate novel forms of multimodal input. One of the more pertinent characteristics of these applications is the delivery or output modality employed (auditory, visual, or tactile). Further contributions are made in the area of multimodal content delivery, where multiple modalities are used to deliver information using graphical user interfaces, tactile interfaces, and, more notably, auditory user interfaces. It is demonstrated how a combination of these interfaces can be used to synergistically deliver context-sensitive rich media to users, in a responsive way, based on usage scenarios that consider the affordance of the device, the geographical position and bearing of the device, and the location of the device.
Wearable Mixed Reality System In Less Than 1 Pound
We have designed a wearable Mixed Reality (MR) framework that renders game-like 3D scenes in real time on see-through head-mounted displays (see-through HMDs) and localizes the user's position within a known wireless network area. Our equipment weighs less than 1 pound (0.45 kg). The information visualized on the mobile device can be sent on demand from a remote server and rendered in real time on board. We present our PDA-based platform as a valid alternative for wearable MR contexts with fewer mobility and encumbrance constraints: our approach eliminates the typical backpack with a laptop, a GPS antenna, and a heavy HMD usually required in these cases. A discussion of our results and user experiences with this approach, using a handheld device for 3D rendering, is presented as well.
- …