    Full Geo-localized Mobile Video in Android Mobile Telephones

    The evolution of mobile telephones has produced smart devices that not only allow the user to talk but also to use many telematic services. Smart mobile telephones produce high-quality photos and videos, and the Global Positioning System available in mobile telephones allows users to tag them with location data. Several applications exist for tagging photos and whole videos, but there is no mobile application that lets users tag individual video frames. This full tagging process allows the mobile user to tag independent video frames in order to exploit the photo-like properties of the video. In this paper we present a mobile application and a server application that allow the mobile user to fully tag mobile videos and share them with other users registered on the server. We also present some tradeoffs in the design of the tagging process.

    Macias Lopez, EM.; Abdelfatah, H.; Suarez Sarmiento, A.; Canovas Solbes, A. (2011). Full Geo-localized Mobile Video in Android Mobile Telephones. Network Protocols and Algorithms, 3(1):64-81. doi:10.5296/npa.v3i1.641
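As a rough illustration of the frame-level tagging the abstract describes, each video frame can be paired with the GPS fix nearest to it in time. This is a minimal sketch, not the paper's implementation; the type names and the assumption that frames are evenly spaced across the recording interval are ours.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class GeoTag:
    latitude: float    # degrees
    longitude: float   # degrees
    timestamp_ms: int  # time of the GPS fix

@dataclass
class TaggedFrame:
    frame_index: int
    tag: GeoTag

def tag_frames(frame_count: int, fixes: List[GeoTag]) -> List[TaggedFrame]:
    """Assign each video frame the GPS fix nearest in time.

    `fixes` must be sorted by timestamp; frames are assumed to be
    evenly spaced over the interval covered by the fixes (a
    simplifying assumption, not from the paper).
    """
    if not fixes:
        return []
    start, end = fixes[0].timestamp_ms, fixes[-1].timestamp_ms
    span = max(end - start, 1)
    tagged = []
    for i in range(frame_count):
        # Interpolated capture time of frame i.
        t = start + span * i // max(frame_count - 1, 1)
        nearest = min(fixes, key=lambda f: abs(f.timestamp_ms - t))
        tagged.append(TaggedFrame(i, nearest))
    return tagged
```

A server-side sharing component would then only need to store the `TaggedFrame` list alongside the video so other registered users can query frames by location.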

    Hyperlinking Reality using a Camera Phone

    The user interface concept for camera phones presented in this thesis, called "Hyperlinking Reality via Camera Phones", provides a solution to one of the main challenges facing mobile user interfaces: the selection and visualization of actions relevant to the user in his current context. Instead of typing keywords on the small, inconvenient keypad of a mobile device, a user of our system simply snaps a photo of his surroundings, and objects in the image become hyperlinks to information. Our method commences by matching a query image to reference panoramas depicting the same scene that were collected and annotated with information beforehand. Once the query image is related to the reference panoramas, we transfer the relevant information from the reference panoramas to the query image. By visualizing the information on the query image and displaying it on the camera phone's (multi-)touch screen, the query image augmented with hyperlinks gives the user intuitive access to information. In addition, we provide the user with information about his position and orientation, thus augmenting the built-in GPS. The user interface concept presented in this thesis is enabled by a novel high-dimensional feature matching method based on the concept of meaningful nearest neighbors, and a novel approximate nearest neighbors search method that provides a ten-fold speed-up over exhaustive search, even in high-dimensional spaces, while retaining an excellent approximation to an exact nearest neighbors search.
Our novel high-dimensional feature matching method improves the effectiveness of image matching methods based on local invariant features, while the speed-up provided by the novel approximate nearest neighbors search method brings our system closer to real-time interactivity. The "Hyperlinking Reality via Camera Phones" mobile user interface concept requires a data set of reference panoramas collected and annotated with information beforehand. The Graz Urban Image Data Set consists of 107 reference panoramas shot from accurately measured positions, while the camera orientations were acquired in a post-processing stage using computer vision techniques followed by manual verification. On each reference panorama, a few dozen buildings, logos, banners, monuments, and other objects of interest were annotated using the hyperlink annotation tool.
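The descriptor-matching step can be sketched as a brute-force nearest-neighbor search with a distance-ratio check that rejects ambiguous matches. This is only a simplified stand-in for the thesis's "meaningful nearest neighbors" criterion and its ten-fold-faster approximate search, which we do not reproduce; the function name and the ratio threshold are our own choices.

```python
import math
from typing import List, Optional, Tuple

def match_descriptor(query: List[float],
                     references: List[List[float]],
                     ratio: float = 0.8) -> Optional[Tuple[int, float]]:
    """Match one query descriptor against a list of reference descriptors.

    Returns (index, distance) of the nearest reference if it passes a
    distance-ratio test against the second nearest; returns None when
    the match is ambiguous. The ratio test is a common heuristic used
    here in place of the thesis's own matching criterion.
    """
    def dist(a: List[float], b: List[float]) -> float:
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    if not references:
        return None
    scored = sorted((dist(query, r), i) for i, r in enumerate(references))
    if len(scored) < 2:
        return (scored[0][1], scored[0][0])
    best, second = scored[0], scored[1]
    if best[0] < ratio * second[0]:
        return (best[1], best[0])  # clearly better than the runner-up
    return None  # ambiguous: nearest and second nearest are too close
```

In the actual system, the exhaustive scan over `references` would be replaced by the approximate nearest neighbors search to reach interactive speeds on the full panorama data set.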