In this paper we propose an algorithm for the detection of edges in images
that is based on topological asymptotic analysis. Motivated by the
Mumford--Shah functional, we consider a variational functional that penalizes
oscillations outside some approximate edge set, which we represent as the union
of a finite number of thin strips, the width of which is an order of magnitude
smaller than their length. In order to find a near-optimal placement of these
strips, we compute an asymptotic expansion of the functional with respect to
the strip size. This expansion is then employed to define a (topological)
gradient-descent-like minimization method. In contrast to a recently proposed
method by some of the authors, which uses coverings with balls, the use of
strips incorporates directional information into the method, which can be
exploited to obtain finer edges and can also reduce computation times.
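For context, the classical Mumford--Shah functional (a standard formulation; the notation and weighting here are generic and not necessarily those used in the paper) seeks a piecewise-smooth approximation $u$ of an image $f$ on a domain $\Omega$ together with an edge set $K$:

```latex
E(u, K) = \alpha \int_{\Omega} (u - f)^2 \,\mathrm{d}x
        + \beta \int_{\Omega \setminus K} |\nabla u|^2 \,\mathrm{d}x
        + \gamma \, \mathcal{H}^1(K),
```

where $\mathcal{H}^1(K)$ denotes the one-dimensional Hausdorff measure (the length) of the edge set. The approach sketched above replaces the free edge set $K$ by a union of thin strips and penalizes oscillations, via the gradient term, only outside these strips.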