One of the fundamental theorems of mathematical statistics is the one that gives an explicit form of the coefficients of the linear regression function of two random variables with finite variances. In this paper the authors formulate and prove a generalisation of this theorem to two random vectors, both when they are of the same dimension and when their dimensions are arbitrary. In this case the coefficients, for obvious reasons, form a matrix. At the core of the considerations are the concept of the power of a vector in a Hilbert space and the definition of moments of arbitrary order (ordinary and central) of a random vector with values in a Hilbert space. Some properties of the Gram determinant, borrowed from linear algebra, are used to prove the theorems.
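For orientation, the classical scalar theorem referred to above can be stated as follows (a standard formulation, not quoted from the paper): for random variables $X$ and $Y$ with finite variances and $\operatorname{Var}(X) > 0$, the best linear predictor $\hat{Y} = a + bX$ (in the mean-square sense) has coefficients
\[
b = \frac{\operatorname{Cov}(X,Y)}{\operatorname{Var}(X)}, \qquad a = \mathbb{E}Y - b\,\mathbb{E}X .
\]
In the vector-valued setting the coefficient becomes a matrix; as a hedged sketch of the kind of generalisation discussed, if $X$ and $Y$ are random vectors with $\Sigma_{XX} = \operatorname{Cov}(X,X)$ invertible and $\Sigma_{YX} = \operatorname{Cov}(Y,X)$, a standard form of the best linear predictor is
\[
\hat{Y} = a + B X, \qquad B = \Sigma_{YX}\,\Sigma_{XX}^{-1}, \qquad a = \mathbb{E}Y - B\,\mathbb{E}X ,
\]
where the paper's actual statement, formulated via powers of vectors in a Hilbert space and Gram determinants, should be consulted for the precise hypotheses and form of the result.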