We consider an infinite horizon optimal control problem for a continuous-time
Markov chain X in a finite set I with noise-free partial observation. The
observation process is defined as Y_t = h(X_t), t ≥ 0, where h is a
given map defined on I. The observation is noise-free in the sense that the
only source of randomness is the process X itself. The aim is to minimize a
discounted cost functional and study the associated value function V. After
transforming the control problem with partial observation into one with
complete observation (the separated problem) using filtering equations, we
provide a link between the value function v associated with the latter control
problem and the original value function V. Then, we present two different
characterizations of v (and indirectly of V): on one hand as the unique
fixed point of a suitably defined contraction mapping and on the other hand as
the unique constrained viscosity solution (in the sense of Soner) of an HJB
integro-differential equation. Under suitable assumptions, we finally prove the
existence of an optimal control.
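
As a sketch of the setup, under standard assumptions, a discounted cost functional and the associated value function typically take the following form (the running cost c and discount rate β are generic placeholders, not taken from the abstract):

```latex
% Generic discounted cost for a controlled Markov chain X with
% control process u; c and \beta are hypothetical placeholders.
J(x, u) = \mathbb{E}_x\!\left[ \int_0^\infty e^{-\beta t}\, c(X_t, u_t)\, \mathrm{d}t \right],
\qquad
V(x) = \inf_{u} J(x, u).
```

Under partial observation, the infimum is restricted to controls adapted to the observation filtration generated by Y, which motivates passing to the separated problem via filtering.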