Limitations on Detecting Row Covariance in the Presence of Column Covariance
Abstract

Many inference techniques for multivariate data analysis assume that the rows of the data matrix are realizations of independent and identically distributed random vectors. Such an assumption will be met, for example, if the rows of the data matrix are multivariate measurements on a set of independently sampled units. In the absence of an independent random sample, a relevant question is whether a statistical model that assumes such row exchangeability is plausible. One method for assessing this plausibility is a statistical test of row covariation. A constant type I error rate, regardless of the column covariance or matrix mean, can be maintained by a test that is invariant under an appropriate group of transformations. In the context of a class of elliptically contoured matrix regression models (such as matrix normal models), I show that there are no non-trivial invariant tests if the number of rows is not sufficiently larger than the number of columns. Furthermore, I show that even if the number of rows is large, there are no non-trivial invariant tests that have power to detect arbitrary row covariance in the presence of arbitrary column covariance. However, biased tests can be constructed that have power to detect certain types of row covariance that may be encountered in practice.

Keywords: hypothesis test, invariance, random matrix, regression, separable covariance.
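To make the setting concrete, the following minimal sketch simulates from a matrix normal model with separable (Kronecker-structured) covariance, the model class discussed in the abstract. All names, dimensions, and the AR(1) row-covariance choice are illustrative assumptions, not part of the paper; the row covariance plays the role of the structure a row-covariation test would try to detect, while the column covariance is a nuisance parameter.

```python
import numpy as np

rng = np.random.default_rng(0)

n, p = 20, 5  # rows (sampled units) and columns (variables); illustrative sizes

# Hypothetical AR(1)-type row covariance: dependence across rows that
# violates the i.i.d.-rows (row exchangeability) assumption.
rho = 0.5
Sigma_row = rho ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))

# Arbitrary positive-definite column covariance (the nuisance parameter).
G = rng.standard_normal((p, p))
Sigma_col = G @ G.T + p * np.eye(p)

# Draw Y ~ matrix normal(0, Sigma_row, Sigma_col) via Y = A Z B^T,
# where A A^T = Sigma_row, B B^T = Sigma_col, and Z has i.i.d. N(0,1) entries.
# This gives Cov(vec(Y)) = Sigma_col (Kronecker) Sigma_row.
A = np.linalg.cholesky(Sigma_row)
B = np.linalg.cholesky(Sigma_col)
Z = rng.standard_normal((n, p))
Y = A @ Z @ B.T

# Under the null of i.i.d. rows, Sigma_row = I and the rows of Y are exchangeable.
```

The separability of the covariance is what makes the problem hard: a left-multiplication of Y mixes rows while preserving the column covariance up to scale, which is the kind of transformation an invariant test must contend with.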