
    Buying Data over Time: Approximately Optimal Strategies for Dynamic Data-Driven Decisions

    We consider a model in which an agent faces a repeated decision and wishes to maximize their total payoff. Payoffs are influenced both by the action the agent takes and by an unknown state of the world that evolves over time. Before choosing an action each round, the agent can purchase noisy samples of the state of the world. The agent has a budget to spend on these samples, and has flexibility in deciding how to spread that budget across rounds. We investigate the problem of choosing a sampling algorithm that optimizes total expected payoff. For example: is it better to buy samples steadily over time, or to buy samples in batches? We solve for the optimal policy and show that it is a natural instantiation of the latter. Under a more general model that includes per-round fixed costs, we prove that a variation on this batching policy is a 2-approximation.
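
    The steady-versus-batches tradeoff is easy to see in a toy simulation. Below is a minimal sketch, not the paper's model: it assumes the state follows a Gaussian random walk, each purchased sample is the state plus Gaussian noise, the agent's action is its current estimate, and the per-round loss is squared error. All names and parameters (T, B, the noise levels, the batch size k) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

T = 200             # rounds (assumed horizon, illustrative)
B = 50              # total sample budget (assumption)
SIGMA_STATE = 0.5   # std of the random-walk state innovation (assumption)
SIGMA_NOISE = 2.0   # std of each purchased sample's noise (assumption)

def simulate(schedule, n_trials=1000):
    """Average per-round quadratic loss under a purchase schedule.

    schedule[t] = number of samples bought in round t (sums to B).
    The agent's action is its current state estimate; per-round
    payoff is -(action - state)^2, so we report the mean loss.
    """
    losses = np.zeros(n_trials)
    for i in range(n_trials):
        theta = 0.0       # true (hidden) state
        estimate = 0.0    # agent's current estimate of theta
        total = 0.0
        for t in range(T):
            theta += rng.normal(0.0, SIGMA_STATE)  # state evolves
            n = schedule[t]
            if n > 0:
                # Average of n fresh noisy samples of the current state.
                samples = theta + rng.normal(0.0, SIGMA_NOISE, size=n)
                estimate = samples.mean()
            total += (estimate - theta) ** 2
        losses[i] = total / T
    return losses.mean()

# Steady policy: spread the budget evenly, one sample every T/B rounds.
steady = np.zeros(T, dtype=int)
steady[:: T // B] = 1

# Batching policy: spend the budget in bursts of k samples at a time.
k = 5
batch = np.zeros(T, dtype=int)
batch[:: k * T // B] = k

print("steady loss:", simulate(steady))
print("batch  loss:", simulate(batch))
```

    Larger batches give a sharper estimate right after a purchase but let the estimate drift longer between purchases; the simulation makes that tension concrete for any schedule you pass in.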

    Joint Block-Sparse Recovery Using Simultaneous BOMP/BOLS

    We consider greedy algorithms for the joint recovery of high-dimensional sparse signals under the block multiple measurement vector (BMMV) model in compressed sensing (CS). To this end, we first put forth two versions of simultaneous block orthogonal least squares (S-BOLS) as baselines for the OLS framework. The cornerstone of both is to sequentially evaluate candidate blocks and select the support block that minimizes the residual power. We then develop a parallel performance analysis for the existing simultaneous block orthogonal matching pursuit (S-BOMP) and the two proposed S-BOLS algorithms. It shows that, under conditions based on the mutual incoherence property (MIP) and the decaying magnitude structure of the nonzero blocks of the signal, the algorithms select all the significant blocks before possibly choosing incorrect ones. In addition, we consider the problem of sufficient data volume for reliable recovery and provide closed-form MIP-based bounds. Together, these results highlight the key role of the block structure in addressing the weak-sparse issue, i.e., the scenario where the overall sparsity is too large. The derived theoretical results are also valid for conventional block-greedy algorithms and non-block algorithms, obtained by setting the number of measurement vectors and the block length to 1, respectively.

    Comment: This work has been submitted to the IEEE for possible publication.
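
    As a rough illustration of the S-BOLS selection rule described above, here is a minimal NumPy sketch: at each iteration it tries every remaining block, solves a least-squares fit over the trial support, and keeps the block that minimizes the total residual power across all measurement vectors. This is a naive rendering of the OLS selection criterion under assumed dimensions, not the authors' algorithm; the function name, the toy demo, and all sizes are hypothetical.

```python
import numpy as np

def s_bols(A, Y, block_len, n_blocks_to_select):
    """Simultaneous block orthogonal least squares (naive sketch).

    A : (m, n) sensing matrix whose columns form blocks of length block_len.
    Y : (m, L) measurements, one column per measurement vector (BMMV model).
    Each iteration appends the block whose inclusion minimizes the
    residual power ||Y - A_S X_S||_F^2 over the trial support S.
    """
    m, n = A.shape
    n_blocks = n // block_len
    support = []  # indices of selected blocks
    for _ in range(n_blocks_to_select):
        best_block, best_power = None, np.inf
        for b in range(n_blocks):
            if b in support:
                continue
            cols = [i for s in support + [b]
                    for i in range(s * block_len, (s + 1) * block_len)]
            A_S = A[:, cols]
            # Least-squares fit over the trial support; residual power
            # is summed across all L measurement vectors (Frobenius norm).
            X_S, *_ = np.linalg.lstsq(A_S, Y, rcond=None)
            power = np.linalg.norm(Y - A_S @ X_S, "fro") ** 2
            if power < best_power:
                best_power, best_block = power, b
        support.append(best_block)
    # Final coefficient estimate restricted to the selected blocks.
    cols = [i for s in sorted(support)
            for i in range(s * block_len, (s + 1) * block_len)]
    X = np.zeros((n, Y.shape[1]))
    X_S, *_ = np.linalg.lstsq(A[:, cols], Y, rcond=None)
    X[cols, :] = X_S
    return sorted(support), X

# Toy demo (assumed sizes): recover a 2-block-sparse signal from noisy
# BMMV measurements with m=40, n=60, block length d=4, L=3 vectors.
rng = np.random.default_rng(1)
m, n, d, L = 40, 60, 4, 3
A = rng.normal(size=(m, n)) / np.sqrt(m)
X_true = np.zeros((n, L))
X_true[8:12] = rng.normal(size=(d, L))    # block 2 active
X_true[20:24] = rng.normal(size=(d, L))   # block 5 active
Y = A @ X_true + 0.01 * rng.normal(size=(m, L))
support, X_hat = s_bols(A, Y, block_len=d, n_blocks_to_select=2)
print("selected blocks:", support)        # expect [2, 5]
```

    Setting L = 1 reduces the sketch to ordinary block OLS, and block_len = 1 reduces it to conventional (non-block) simultaneous OLS, mirroring the reductions noted at the end of the abstract.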