We consider the problem of representing causal models that encode
context-specific information for discrete data using a proper subclass of
staged tree models which we call CStrees. We show that the context-specific
information encoded by a CStree can be equivalently expressed via a collection
of DAGs. As not all staged tree models admit this property, CStrees are a
subclass that provides a transparent, intuitive and compact representation of
context-specific causal information. We prove that CStrees admit a global
Markov property which yields a graphical criterion for model equivalence
generalizing that of Verma and Pearl for DAG models. These results extend to
the general interventional model setting, making CStrees the first family of
context-specific models admitting a characterization of interventional model
equivalence. We also provide a closed-form formula for the maximum likelihood
estimator of a CStree and use it to show that the Bayesian information
criterion is a locally consistent score function for this model class. The
performance of CStrees is analyzed on both simulated and real data, where we
see that modeling with CStrees instead of general staged trees does not result
in a significant loss of predictive accuracy, while affording DAG
representations of context-specific causal information.

Comment: 28 pages, supplementary material 15 pages