Human-aware robot navigation is important in many applications in human-robot shared environments. In some situations, people must move with limited visual and auditory perception. In such cases, a robot can improve navigation efficiency in noisy, low-visibility conditions, and haptics becomes the most reliable communication channel when other modalities degrade. We used a rein, driven by a 1-DoF robotic arm, to perturb the human's arm and guide them to a desired point. The novelty of our work is a set of behavioral metrics, based on a novel predictive model, for strategically positioning humans in a human-robot shared environment under low-visibility and low-audibility conditions. We found that humans start with a second-order reactive autoregressive following model and shift to a predictive model with training. These results could help enhance human safety and comfort in robot-led navigation in shared environments.
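To make the behavioral finding concrete, a minimal sketch of the two model classes is given below; the notation ($x_h(t)$: the human's tracked state, $x_r(t)$: the rein input from the robot, $a_i$, $b_i$: coefficients, $\epsilon(t)$: noise) and the exact forms are illustrative assumptions, not the fitted models from this study:
\begin{align}
  x_h(t) &= a_1 x_h(t-1) + a_2 x_h(t-2) + b_1 x_r(t-1) + \epsilon(t) && \text{(second-order reactive AR following)} \\
  x_h(t) &= a_1 x_h(t-1) + a_2 x_h(t-2) + b_0 \hat{x}_r(t+1) + \epsilon(t) && \text{(predictive: anticipates the robot's next input } \hat{x}_r\text{)}
\end{align}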