Inference after model selection presents computational challenges when
dealing with intractable conditional distributions. Markov chain Monte Carlo
(MCMC) is a common method for sampling from these distributions, but its slow
convergence often limits its practicality. In this work, we introduce a method
tailored for selective inference in cases where the selection event can be
characterized by a polyhedron. The method transforms the variables constrained
by a polyhedron into variables within a unit cube, allowing for efficient
sampling using conventional numerical integration techniques. Compared with MCMC,
the proposed sampling method is highly accurate and comes with an explicit error
estimate. Additionally, we introduce an approach that uses a single batch of
samples for hypothesis testing and confidence interval construction across
multiple parameters, reducing the need for repeated sampling. Furthermore,
our method enables fast and precise computation of the maximum likelihood
estimator (MLE) based on the selection-adjusted likelihood, enhancing the reliability
of MLE-based inference. Numerical results demonstrate the superior performance
of the proposed method compared with alternative approaches to selective
inference.
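
As a concrete illustration of the unit-cube idea, the following is a minimal sketch, not the implementation described above. It treats the simplest polyhedral constraint, an axis-aligned box a <= x <= b under a standard normal density: the inverse-CDF transform maps the unit cube onto the truncated region, and expectations are then estimated by quasi-Monte Carlo integration over [0, 1]^d, with a simple error estimate obtained from independently scrambled replicates. The truncation limits are hypothetical, and a general polyhedron {x : Ax <= c} would require a sequential (Genz-style) conditioning step that is omitted here.

```python
import numpy as np
from scipy.stats import norm, qmc

# Hypothetical box truncation limits a <= x <= b for a standard normal x in R^3.
a = np.array([0.5, -1.0, 0.0])
b = np.array([2.0, 1.0, 1.5])

def cube_to_truncated_normal(u, a, b):
    """Map points u in (0, 1)^d to draws from N(0, I) truncated to the box [a, b]."""
    Fa, Fb = norm.cdf(a), norm.cdf(b)
    return norm.ppf(Fa + u * (Fb - Fa))

# Estimate E[x_0] under the truncated distribution by quasi-Monte Carlo
# integration over the unit cube; independent scrambled Sobol replicates
# provide a crude standard-error estimate.
estimates = []
for seed in range(8):
    u = qmc.Sobol(d=3, scramble=True, seed=seed).random(2**11)
    x = cube_to_truncated_normal(u, a, b)
    estimates.append(x[:, 0].mean())

print(f"E[x_0] = {np.mean(estimates):.4f} "
      f"+/- {np.std(estimates) / np.sqrt(len(estimates)):.4f}")
```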