The exploration of large random matrices through asymptotic deterministic equivalents has been approached by a multitude of techniques. One approach employs the matrix Dyson equation to establish an asymptotic equivalence between a random resolvent and the solution of a matrix fixed-point equation. Another, the linearization trick, has proven effective in studying rational functions of random matrices: it embeds a matrix expression into a larger random matrix with a simplified correlation structure, known as a linear matrix pencil. In this presentation, we introduce an extension of the matrix Dyson equation framework tailored specifically to linearizations, extending previous work that has focused primarily on pencils with blocks of canonical Wigner or circular type. Within this framework, we derive an anisotropic global law for a broad class of pseudo-resolvents with general correlation structures. To highlight the practical implications of the framework, we apply it to a problem arising in machine learning: we derive an exact asymptotic expression for the validation error of random features ridge regression and establish a general Gaussian equivalence result.
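As an illustrative sketch (the notation below is a standard convention, not taken from this abstract), the matrix Dyson equation for a Hermitian random matrix $H$ with expectation $A = \mathbb{E}[H]$ and covariance map $\mathcal{S}[X] = \mathbb{E}\big[(H-A)\,X\,(H-A)\big]$ characterizes the deterministic equivalent $M(z)$ of the resolvent $(H-z)^{-1}$ as the solution of a fixed-point equation:

```latex
% Matrix Dyson equation (standard form): the deterministic equivalent
% M(z) of the resolvent (H - z)^{-1} solves
-\,M(z)^{-1} \;=\; z\,I \;-\; A \;+\; \mathcal{S}\!\left[M(z)\right],
\qquad \operatorname{Im} z > 0 .

% Toy example of a linear matrix pencil: for a Wigner matrix W, the
% resolvent of the rational expression W^2 is recovered from the larger
% linear-in-W block matrix
L(z) \;=\;
\begin{pmatrix}
 -z\,I & W \\
 W & -I
\end{pmatrix},
% whose upper-left block of the inverse is, by the Schur complement,
\left[L(z)^{-1}\right]_{11} \;=\; \left(W^2 - z\,I\right)^{-1}.
```

The point of the pencil is that $L(z)$ depends only linearly on the random entries, so tools developed for linear models (such as the matrix Dyson equation above) apply to the rational expression encoded in its corner block.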