We consider adaptive approximations of the parameter-to-solution map for
elliptic operator equations depending on a large or infinite number of
parameters, comparing approximation strategies of different degrees of
nonlinearity: sparse polynomial expansions, general low-rank approximations
separating spatial and parametric variables, and hierarchical tensor
decompositions separating all variables. We describe corresponding adaptive
algorithms based on a common generic template and show their near-optimality
with respect to natural approximability assumptions for each type of
approximation. Central to the resulting bounds for the total computational
complexity are new operator compression results for the case of
infinitely many parameters. We conclude with a comparison of the complexity
estimates based on the actual approximability properties of classes of
parametric model problems, which shows that the computational costs of
optimized low-rank expansions can be significantly lower or higher than those
of sparse polynomial expansions, depending on the particular type of parametric
problem.
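
For orientation, the first two approximation formats can be sketched in common
notation (the symbols $u$, $x$, $y$, $\Lambda$, and $r$ are illustrative
choices, not notation fixed in this abstract):
% illustrative notation only: u denotes the parametric solution, x the spatial
% variable, y = (y_j) the parameter sequence, \Lambda a finite index set, r a rank
\begin{align*}
  u(x,y) &\approx \sum_{\nu \in \Lambda} u_\nu(x)\, y^\nu,
    \qquad y^\nu = \prod_{j} y_j^{\nu_j},
    && \text{(sparse polynomial expansion)}\\
  u(x,y) &\approx \sum_{k=1}^{r} u_k(x)\, v_k(y),
    && \text{(low-rank separation of $x$ and $y$)}
\end{align*}
while hierarchical tensor decompositions additionally separate the individual
parameters $y_1, y_2, \dots$ within the parametric factors.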