
    Bright paint makes interior-space surfaces appear farther away

    Previous studies have reported that bright ceilings appear higher than dark ceilings, irrespective of the other colorimetric properties of the ceiling color (hue, saturation) and irrespective of the luminance of the remaining room surfaces (walls, floor). In the present study, we extend these findings to width and depth estimates. We presented stereoscopic full-scale room simulations on a head-mounted display and varied the luminance of the side walls, rear wall, and ceiling independently of each other. Participants judged the width and depth of the simulated rooms. Our results show that the perceived spatial layout of a given room is significantly influenced by the luminance of the directly bounding surfaces (e.g., the side walls when judging perceived width) but less affected by the luminance of the other surfaces. In the discussion, we provide an overall picture of the effects of surface luminance on the perceived layout of interior spaces and discuss the conclusions in the context of existing interior-design guidelines.

    Understanding the immersive experience: Examining the influence of visual immersiveness and interactivity on spatial experiences and understanding

    Advances in computer graphics have enabled us to generate more compelling 3D virtual environments. The 'immersive experience' in these environments results from a combination of immersion and interactivity. As such, various disciplines have begun adopting 3D technology to enhance spatial understanding and experience, but the impact of the immersive experience on spatial understanding and experience remains unclear. This study used a controlled, between-subjects experiment to systematically manipulate a virtual reality system's technology affordances (stereoscopy, field of view, and navigability) and measure their impact. Participants (N = 120) explored a virtual office and completed a questionnaire on the experience, along with tasks evaluating their understanding of the space. The results indicated that visual immersion had the greatest impact on understanding, but better experiences were reported when visual immersion was combined with greater interactivity. These findings support the notion that the immersive experience is important for the comprehension of virtual spaces. The findings advance theories of spatial presence and immersion, support methods that treat technology as affordances rather than entities, and support the use of 3D technology for communicating spatial information, as in architecture and fire-fighter training.

    Understanding the challenges of immersive technology use in the architecture and construction industry: A systematic review

    Despite the increasing scholarly attention being given to immersive technology applications in the architecture and construction industry, very few studies have explored the key challenges associated with their usage, with no aggregation of findings or knowledge. To bridge this gap and gain a better understanding of state-of-the-art immersive technology application in the architecture and construction sector, this study reviews and synthesises the existing research evidence through a systematic review. Based on rigorous inclusion and exclusion criteria, 51 eligible articles published between 2010 and 2019 (inclusive) were selected for the final review. Drawing on a wide range of scholarly journals, this study develops a generic taxonomy consisting of various dimensions. The results revealed nine critical challenges, ranked in the following order: Infrastructure; Algorithm Development; Interoperability; General Health and Safety; Virtual Content Modelling; Cost; Skills Availability; Multi-Sensory Limitations; and Ethical Issues.

    Saliency prediction in 360° architectural scenes: Performance and impact of daylight variations

    Saliency models are image-based prediction models that estimate human visual attention. Such models, when applied to architectural spaces, could pave the way for design decisions where visual attention is taken into account. In this study, we tested the performance of eleven commonly used saliency models that combine traditional and deep learning methods on 126 rendered interior scenes with associated head tracking data. The data was extracted from three experiments conducted in virtual reality between 2016 and 2018. Two of these datasets pertain to the perceptual effects of daylight and include variations of daylighting conditions for a limited set of interior spaces, thereby allowing us to test the influence of light conditions on human head movement. Ground truth maps were extracted from the collected head tracking logs, and the prediction accuracy of the models was tested via the correlation coefficient between ground truth and prediction maps. To address the possible inflation of results due to the equator bias, we conducted complementary analyses by restricting the area of investigation to the equatorial image regions. Although limited to immersive virtual environments, the promising performance of some traditional models such as GBVS360eq and BMS360eq for colored and textured architectural rendered spaces offers us the prospect of their possible integration into design tools. We also observed a strong correlation in head movements for the same space lit by different types of sky, a finding whose generalization requires further investigations based on datasets more specifically developed to address this question.
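The abstract does not spell out the scoring formula; a minimal sketch of the evaluation it describes, assuming Pearson's correlation coefficient (CC) between same-size saliency maps and a hypothetical `equator_band` parameter to restrict scoring to the equatorial rows of an equirectangular 360° image:

```python
import numpy as np

def saliency_cc(pred, gt, equator_band=None):
    """Pearson correlation coefficient between two same-shape saliency maps.

    equator_band: optional fraction of the image height, centred on the
    equator, to keep (e.g. 0.5 keeps the middle half of the rows), as a
    rough counter to the equator bias in 360° viewing data.
    """
    if equator_band is not None:
        h = pred.shape[0]
        keep = int(h * equator_band)
        top = (h - keep) // 2
        pred = pred[top:top + keep]
        gt = gt[top:top + keep]
    p = pred.ravel().astype(float)
    g = gt.ravel().astype(float)
    # Z-score both maps, then average the product of z-scores (Pearson r).
    p = (p - p.mean()) / p.std()
    g = (g - g.mean()) / g.std()
    return float(np.mean(p * g))
```

A map scored against itself yields 1.0, and against its inversion −1.0; how the cited models' scores were aggregated across the 126 scenes is not specified here.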

    Human experience in the natural and built environment : implications for research policy and practice

    22nd IAPS conference. Edited book of abstracts, 427 pp. University of Strathclyde, Sheffield and West of Scotland Publication. ISBN: 978-0-94-764988-3.

    Effects of Clutter on Egocentric Distance Perception in Virtual Reality

    To assess the impact of clutter on egocentric distance perception, we performed a mixed-design study with 60 participants in four different virtual environments (VEs) with three levels of clutter. Additionally, we compared indoor/outdoor VE characteristics and the HMD's field of view (FOV). Participants wore a backpack computer and a wide-FOV head-mounted display (HMD) as they blind-walked towards three distinct targets at distances of 3 m, 4.5 m, and 6 m. The HMD's FOV was programmatically limited to 165°×110°, 110°×110°, or 45°×35°. The results showed that increased clutter in the environment led to more precise distance judgments and less underestimation, independent of the FOV. Indoor VEs yielded more accurate distance judgments than outdoor VEs. Additionally, participants made more accurate judgments when viewing the VEs through wider FOVs. Comment: preprint, not yet published in any venue; ACM conference format; 10 pages, 10 figures.
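The abstract reports underestimation without defining its measure; a minimal sketch assuming the common blind-walking convention, where accuracy is the ratio of walked distance to actual target distance and ratios below 1.0 indicate underestimation (function names here are illustrative, not from the paper):

```python
def judgment_ratio(walked_m, target_m):
    """Ratio of walked to actual distance; < 1.0 means underestimation."""
    return walked_m / target_m

def mean_judgment_ratio(trials):
    """trials: iterable of (walked_m, target_m) pairs for one condition.

    Returns the mean ratio, a per-condition accuracy summary that can be
    compared across clutter levels, FOVs, or indoor/outdoor VEs.
    """
    ratios = [judgment_ratio(w, t) for w, t in trials]
    return sum(ratios) / len(ratios)
```

For example, a participant walking 4.05 m toward a 4.5 m target scores 0.9, i.e. a 10% underestimation.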