Let $V$ be a nonempty finite set and let $A=(a_{ij})_{i,j\in V}$ be a matrix with
entries in a field $K$. For a subset $X$ of $V$, we denote by $A[X]$
the submatrix of $A$ whose rows and columns are indexed by $X$. We study the
following problem: given a positive integer $k$, what is the relationship
between two matrices $A=(a_{ij})_{i,j\in V}$ and $B=(b_{ij})_{i,j\in V}$ with
entries in $K$ such that $\det(A[X])=\det(B[X])$ for every subset $X$ of $V$ of size at most $k$? The theorem we
obtain in this Note improves a result of R. Loewy [5] for
skew-symmetric matrices all of whose off-diagonal entries are nonzero.
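The hypothesis of the problem above can be tested numerically. The following sketch (my own illustration, not part of the Note; the function name `equal_principal_minors` is hypothetical) checks whether two square matrices have equal principal minors $\det(A[X])$ and $\det(B[X])$ for every index set $X$ of size at most $k$:

```python
from itertools import combinations

import numpy as np


def equal_principal_minors(A, B, k):
    """Return True iff det(A[X]) == det(B[X]) for every index set X
    with |X| <= k. Illustrative brute-force check, not the Note's proof."""
    n = A.shape[0]
    for size in range(1, k + 1):
        for X in combinations(range(n), size):
            idx = np.ix_(X, X)  # principal submatrix on rows/columns X
            if not np.isclose(np.linalg.det(A[idx]), np.linalg.det(B[idx])):
                return False
    return True


# Example: B = A^T has the same principal minors of every size as A,
# since det(M^T) = det(M) for any square matrix M.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 10.0]])
print(equal_principal_minors(A, A.T, 3))  # True
```

The transpose example shows that equality of all principal minors does not force $A = B$, which is why the problem asks for the precise relationship between such matrices.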