The combination of Brain-Computer Interface (BCI) technology, which allows online monitoring and decoding of brain activity, with virtual and mixed reality (MR) systems may help shape and guide implicit and explicit learning in ecological scenarios. Real-time information about ongoing brain states acquired through a BCI could be exploited to control data presentation in virtual environments. Discriminating brain states during the MR experience is thus critical for adapting specific data features to the user's contingent brain activity. In this study we recorded electroencephalographic (EEG) data while participants experienced MR scenarios implemented in the eXperience Induction Machine (XIM). The XIM is a novel framework that integrates a sensing system, which evaluates and measures physiological and psychological states, with a set of actuators and effectors that react coherently to the user's actions. We then assessed continuous EEG-based discrimination of spatial navigation, reading, and calculation performed in MR, using linear discriminant analysis (LDA) and support vector machine (SVM) classifiers. Dynamic single-trial classification showed high accuracy of both LDA and SVM classifiers in detecting multiple brain states as well as in differentiating between high and low mental workload, using a 5 s time window shifted every 200 ms. Our results indicate overall better performance of LDA than SVM and suggest that our approach is applicable in a BCI-controlled MR scenario. Ultimately, successful prediction of brain states might be used to drive the adaptation of data representation and thereby boost information processing in MR.

The present study was supported by EU grants FP7-ICT-2009-258749 (CEEDs: The Collective Experience of Empathic Data Systems) and FP7-ICT-2013-609593 (BNCI Horizon 2020: The Future of Brain/Neural Computer Interaction), and by the Italian Ministry of Health (GR-2009-1591908).
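The sliding-window classification scheme described above could be sketched roughly as follows. This is a minimal, hypothetical Python example, not the authors' implementation: it assumes preprocessed multichannel EEG, per-sample task labels, and simple band-power features. The window length (5 s) and shift (200 ms) follow the values reported in the abstract, while the sampling rate, channel count, feature choice, and scikit-learn classifiers are illustrative assumptions.

```python
"""Sketch of sliding-window, single-trial EEG classification with LDA and SVM.

Assumptions (not from the paper): 250 Hz sampling rate, 32 channels,
log band-power features, and synthetic data standing in for real EEG.
"""
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

FS = 250                      # assumed sampling rate (Hz)
WIN = int(5.0 * FS)           # 5 s analysis window (as in the abstract)
STEP = int(0.2 * FS)          # 200 ms window shift (as in the abstract)


def band_power_features(window):
    """Illustrative features: log power per channel in theta/alpha/beta bands."""
    spec = np.abs(np.fft.rfft(window, axis=1)) ** 2
    freqs = np.fft.rfftfreq(window.shape[1], d=1.0 / FS)
    bands = [(4, 8), (8, 13), (13, 30)]
    feats = [np.log(spec[:, (freqs >= lo) & (freqs < hi)].mean(axis=1))
             for lo, hi in bands]
    return np.concatenate(feats)


def sliding_windows(eeg, labels):
    """Cut continuous EEG into overlapping windows, keeping single-task windows."""
    X, y = [], []
    for start in range(0, eeg.shape[1] - WIN + 1, STEP):
        lab = labels[start:start + WIN]
        if np.all(lab == lab[0]):               # discard windows spanning two tasks
            X.append(band_power_features(eeg[:, start:start + WIN]))
            y.append(lab[0])
    return np.array(X), np.array(y)


# Hypothetical data: 60 s of 32-channel EEG with per-sample task labels
# (e.g. 0 = navigation, 1 = reading, 2 = calculation).
rng = np.random.default_rng(0)
eeg = rng.standard_normal((32, 60 * FS))
labels = np.repeat([0, 1, 2], 20 * FS)

X, y = sliding_windows(eeg, labels)
for name, clf in [("LDA", LinearDiscriminantAnalysis()),
                  ("SVM", SVC(kernel="linear", C=1.0))]:
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: mean CV accuracy = {acc:.2f}")
```

One caveat of this sketch: consecutive windows overlap heavily, so naive cross-validation over windows is optimistic; in practice one would split by task run or recording block so that training and test windows never share samples.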