This thesis presents a real-time automated building progress monitoring solution for indoor environments using a mobile device. Such a system would enable accurate and timely assessment of work progress, allowing managers to make adjustments and minimise both time and cost overruns when deviations from the schedule occur.
Although many researchers have proposed approaches for progress monitoring in outdoor scenes, these cannot operate in real time and do not transfer to complex interior environments. Research efforts for indoor environments remain only partially automated and produce errors in more complex scenes. Systems based on mobile devices could potentially enhance the inspection process and reduce the required time by allowing the inspector to acquire progress data by simply walking around the site. The main challenge for such systems is tracking the pose of the camera to achieve accurate alignment between the 3D design model and the real-world scene. Existing methods for estimating the user's pose rely on a) tags on each target of interest, which require additional time and cost for installation and maintenance; b) pre-selected user locations, which restrict the user to those locations only; or c) GPS on the augmented reality headset, which only applies to outdoor inspections. Additionally, current mobile-based inspection systems do not perform any comparison between the captured as-built and the as-planned data.
In this research, different candidate marker-less Augmented Reality (AR) tracking methods were implemented and tested to identify the most robust tracking solution. The Microsoft HoloLens was found to be the top performer both for tracking the user's pose and for overall user experience. Next, a semi-automated method was developed for initially registering the 3D model to the real environment by exploiting information from detected floor and wall surfaces. Results showed that this method reduces the time of the initial registration by 58%. With the 3D model aligned to the real environment and the pose of the camera known at every moment, an automated method was developed that exploits the as-built surface mesh data captured by the mobile device, compares it against the 3D design model, and identifies in real time whether an object has been built according to plan. Different parameter values were tested to find the optimal combination for the current quality of the mesh data; if the mesh quality changes, new parameters should be explored. Finally, the proposed solution was tested in real site conditions, resulting in 76.6% precision, 100.0% recall, and 83.5% accuracy.
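The as-built versus as-planned comparison described above can be illustrated with a minimal sketch: sample points on a design object's surface, measure each point's distance to the captured mesh, and classify the object as built when enough points lie within a tolerance. The tolerance and coverage values here are illustrative assumptions, not the parameters tuned in the thesis, and the nearest-neighbour search is deliberately naive.

```python
import math

def nearest_dist(p, cloud):
    # Euclidean distance from point p to its nearest neighbour in the cloud
    return min(math.dist(p, q) for q in cloud)

def is_built(design_pts, asbuilt_pts, tol=0.05, coverage=0.8):
    """Classify a design object as built if at least `coverage` of its
    sampled surface points lie within `tol` metres of the captured mesh.
    Both thresholds are illustrative assumptions."""
    near = sum(1 for p in design_pts if nearest_dist(p, asbuilt_pts) <= tol)
    return near / len(design_pts) >= coverage

# Toy example: a 1 m x 1 m wall sampled on a grid.
design = [(x * 0.1, y * 0.1, 0.0) for x in range(10) for y in range(10)]
built_scan = [(x, y, 0.01) for (x, y, _) in design]  # scan 1 cm off-plane
empty_scan = [(x, y, 1.0) for (x, y, _) in design]   # nothing built here

print(is_built(design, built_scan))  # True
print(is_built(design, empty_scan))  # False
```

In practice the captured mesh vertices would replace the toy point lists, and a spatial index would replace the brute-force nearest-neighbour search.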
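The reported evaluation figures follow the standard confusion-matrix definitions of precision, recall, and accuracy. The sketch below uses hypothetical counts chosen only so that the percentages match those reported; the actual counts from the site test are not given in this section.

```python
def metrics(tp, fp, fn, tn):
    # Standard classification metrics from confusion-matrix counts
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return precision, recall, accuracy

# Hypothetical counts for illustration only (fn = 0 since recall is 100%)
p, r, a = metrics(tp=49, fp=15, fn=0, tn=27)
print(round(p, 3), round(r, 3), round(a, 3))  # 0.766 1.0 0.835
```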