Efficient vectors for reciprocal matrices

Susana Furtado

Abstract: In the Analytic Hierarchy Process, a method used in decision-making to rank
alternatives, a vector of weights should be extracted from a reciprocal
matrix (that is, a positive square matrix $[a_{ij}]$ with $a_{ij}=\frac{1}{a_{ji}}$ for all $i,j$). Several proposals have been made for the choice of
the vector of weights, such as the Perron eigenvector of the reciprocal matrix or
the geometric mean of its columns. A property that the vector should satisfy
is efficiency (also called Pareto optimality). Informally, a vector is
efficient for a reciprocal matrix if no other vector approximates the entries
of the matrix, by the ratios of its components, at least as well in every
position and strictly better in some position. It is known that the Perron eigenvector of a reciprocal matrix
may not be efficient, whereas the geometric mean vector of the columns of a
reciprocal matrix is always efficient. In this talk we present some recent
developments in the study of efficient vectors for reciprocal matrices.
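
For reference, the informal notion of efficiency above admits a standard formalization in the literature; the notation $A$, $w$, $v$ below is ours and not taken from the abstract. A positive vector $w=(w_1,\ldots,w_n)$ is efficient for an $n\times n$ reciprocal matrix $A=[a_{ij}]$ if there is no positive vector $v=(v_1,\ldots,v_n)$ such that
\[
\left|a_{ij}-\frac{v_i}{v_j}\right|\le\left|a_{ij}-\frac{w_i}{w_j}\right|\quad\text{for all }i,j,
\]
with strict inequality for at least one pair $(i,j)$. The geometric mean vector of the columns of $A$ is, entrywise, the vector $w$ with $w_i=\bigl(\prod_{j=1}^{n}a_{ij}\bigr)^{1/n}$ for $i=1,\ldots,n$.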