Whilst there has been considerable progress in augmented reality in recent years, it has principally involved either marker-based or a priori mapped systems, which limits the opportunity for wide-scale deployment. Recent advances in marker-less systems that require no a priori information, using techniques borrowed from robotic vision, are now finding their way into mobile augmented reality and are producing exciting results. However, unlike marker-based and a priori tracking systems, these techniques are independent of scale, which is a vital component in ensuring that augmented objects are contextually sensitive to the environment onto which they are projected. In this paper we address the problem of scale by adapting a Depth From Focus (DFF) technique, previously limited to high-end cameras, to a commercial mobile phone. The results clearly show that the technique is viable and, with the ever-improving quality of camera-phone optics, will add considerably to mobile augmented reality solutions. Furthermore, as it simply requires a platform with an auto-focusing camera, the solution is applicable to other AR platforms.
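The core idea of Depth From Focus can be illustrated with a minimal sketch: capture frames at several focus settings, score each frame with a focus measure (here the variance of a discrete Laplacian, a common choice), and take the focus distance of the sharpest frame as the depth estimate. This is an illustrative toy with synthetic data, not the paper's implementation; the function names, the blur model, and the focus distances are all assumptions.

```python
import numpy as np

def sharpness(img):
    # Focus measure: variance of a 4-neighbour discrete Laplacian.
    # Sharp (in-focus) frames contain more high-frequency energy.
    lap = (-4 * img[1:-1, 1:-1]
           + img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:])
    return lap.var()

def depth_from_focus(stack, focus_distances):
    # Depth estimate: focus distance of the sharpest frame in the stack.
    scores = [sharpness(frame) for frame in stack]
    return focus_distances[int(np.argmax(scores))]

# Synthetic focal stack: one textured "in-focus" frame plus
# box-blurred copies standing in for out-of-focus captures.
def blur(img, passes):
    for _ in range(passes):
        img = (img + np.roll(img, 1, 0) + np.roll(img, -1, 0)
                   + np.roll(img, 1, 1) + np.roll(img, -1, 1)) / 5.0
    return img

rng = np.random.default_rng(0)
in_focus = rng.random((64, 64))
stack = [blur(in_focus.copy(), 4), in_focus, blur(in_focus.copy(), 2)]

# The middle frame is sharpest, so its focus distance is returned.
print(depth_from_focus(stack, [0.5, 1.0, 2.0]))  # → 1.0
```

On a phone, the stack would come from sweeping the auto-focus mechanism rather than from synthetic blurring, and the focus-distance labels would come from the camera driver's lens-position calibration.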