Yes, for example if you choose $\rho$ small enough to ensure that your matrix is strictly diagonally dominant, then it is guaranteed to be positive definite. In this case "small enough" means $|\rho|<1/r$, where $r$ is the valency of the regular graph.
But possibly you do not want to choose $\rho$ so small. A useful thing to remember here is that a symmetric matrix is positive definite if and only if its eigenvalues are all positive. And since you are constructing your matrix as
$$ M = I + \rho A$$
where $A$ is the adjacency matrix of the graph, it follows that the eigenvalues of $M$ are of the form $1+\rho\lambda$ as $\lambda$ ranges over the eigenvalues of $A$. So if you have a particular $\rho>0$ in mind, then the graphs that will work are precisely those for which all eigenvalues (of the adjacency matrix) satisfy the bound $\lambda > -1/\rho$. In other words, you need graphs whose negative eigenvalues aren't too large in magnitude. Note that for a regular graph with valency $r$, all its eigenvalues satisfy $|\lambda| \leq r$, which leads to the same sufficient condition $|\rho| < 1/r$ described above.
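As a quick sanity check, here is a small Python/NumPy sketch of this eigenvalue relationship (the 4-cycle is an assumed example; any regular graph's adjacency matrix would do):

```python
import numpy as np

# Verify the eigenvalue condition for M = I + rho*A on a small regular graph.
# Here A is the adjacency matrix of the 4-cycle (valency r = 2).
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)

rho = 0.4                      # satisfies |rho| < 1/r = 1/2
M = np.eye(4) + rho * A

eig_A = np.linalg.eigvalsh(A)  # eigenvalues of the adjacency matrix: -2, 0, 0, 2
eig_M = np.linalg.eigvalsh(M)  # each equals 1 + rho*lambda for some eigenvalue lambda of A

print(np.all(eig_M > 0))       # True: every eigenvalue of M is at least 1 - 2*rho = 0.2
```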
There is quite a bit of information available about graphs whose most negative eigenvalue isn't too large in magnitude; this falls within the subject of spectral graph theory. In particular, the problem of characterizing graphs whose eigenvalues satisfy $\lambda \geq -2$ is treated in the book
Spectral Generalizations of Line Graphs: On Graphs with Least Eigenvalue -2. It contains the following result, showing that with fairly trivial exceptions the bound $\lambda\geq -2$ is the best that we can hope for a regular graph to satisfy:
Corollary 2.3.22. If $G$ is a connected regular graph with least eigenvalue greater than $-2$, then $G$ is either a complete graph or an odd cycle.
There are methods of constructing broad families of regular graphs which attain this bound, i.e. whose least eigenvalue is -2. The most basic one is the construction of a line graph. If you start with any graph $G$, you can construct a new graph $L(G)$, whose vertices correspond to the edges of $G$ and whose edges correspond to edge-incidences of $G$. This graph $L(G)$ is called the line graph of $G$, and it is guaranteed that its eigenvalues satisfy $\lambda \geq -2$, no matter which graph $G$ you start with. Moreover, if you start with a regular graph $G$ with valency $r$, then $L(G)$ will also be regular, with valency $2(r-1)$. This gives you a way to construct regular graphs for which you can take $\rho$ to be any value satisfying $|\rho|<1/2$ and end up with M being positive definite. In light of the result cited above, this is the best that is possible, unless you want to go with a complete graph (which allows $\rho$ to be arbitrarily close to 1) or an odd cycle (which allows $\rho$ to be a little larger than $1/2$, but by an amount which approaches zero as the size of the cycle increases), or a disjoint union of complete graphs and odd cycles.
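The construction is easy to carry out directly from an edge list. A sketch in Python/NumPy (the starting graph $K_4$ is an assumed example, chosen because it is 3-regular):

```python
import numpy as np
from itertools import combinations

# Build the line graph L(G) for G = K_4 (3-regular) and check that
# L(G) is 2(r-1) = 4-regular with least adjacency eigenvalue -2.
edges = list(combinations(range(4), 2))   # the 6 edges of K_4

# Vertices of L(G) are the edges of G; two are adjacent iff they share an endpoint.
m = len(edges)
A = np.zeros((m, m))
for i, j in combinations(range(m), 2):
    if set(edges[i]) & set(edges[j]):
        A[i, j] = A[j, i] = 1

print(A.sum(axis=1))                  # each row sums to 4: L(K_4) is 4-regular
print(np.linalg.eigvalsh(A).min())    # -2 (up to rounding): the bound is attained
```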
If it is unsatisfactory to restrict to regular graphs with even valency, it's worth noting that you don't have to start with a regular graph $G$ in order for $L(G)$ to be regular. For instance, you could instead start with $G$ being a semiregular bipartite graph, where one of the two valencies is even and the other is odd, and this would result in $L(G)$ being regular with odd valency.
Best Answer
To follow up on @cardinal's comment: your $x$, $y$, and $z$ define a $(3 \times 3)$ correlation matrix $\boldsymbol{R}$. Since a correlation matrix is also a valid covariance matrix (of standardized variables), it has to be positive definite, which is the case exactly when all of its eigenvalues are $> 0$. If $\boldsymbol{R}$ is indeed positive definite, then any vector $\boldsymbol{s}$ of variances (i.e., numbers $> 0$) will turn $\boldsymbol{R}$ into a positive definite covariance matrix $\boldsymbol{\Sigma} = \boldsymbol{D}_{s}^{1/2} \boldsymbol{R} \boldsymbol{D}_{s}^{1/2}$, where $\boldsymbol{D}_{s}^{1/2}$ is the square root of the diagonal matrix made from $\boldsymbol{s}$.
So just construct $\boldsymbol{R}$ from $x, y, z$ and check whether its eigenvalues are all $> 0$. If so, you're good, and you can transform any set of data to have a corresponding covariance matrix with arbitrary variances.
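For instance (a Python/NumPy sketch; the values of $x$, $y$, $z$ are assumed, not taken from the question):

```python
import numpy as np

# Assumed example correlations.
x, y, z = 0.5, 0.3, 0.2
R = np.array([[1, x, y],
              [x, 1, z],
              [y, z, 1]], dtype=float)

eig = np.linalg.eigvalsh(R)   # eigenvalues of the symmetric matrix R
print(eig)
print(np.all(eig > 0))        # True: R is positive definite
```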
So our $\boldsymbol{R}$ here is positive definite. Now construct the corresponding covariance matrix from arbitrary variances.
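A sketch of this step in Python/NumPy (the entries of $\boldsymbol{R}$ and the variances $\boldsymbol{s}$ are assumed example values):

```python
import numpy as np

# Turn a positive definite correlation matrix R into a covariance matrix
# via Sigma = D^{1/2} R D^{1/2}; the variances s are arbitrary positive numbers.
R = np.array([[1.0, 0.5, 0.3],
              [0.5, 1.0, 0.2],
              [0.3, 0.2, 1.0]])
s = np.array([4.0, 9.0, 1.0])      # assumed variances
D_sqrt = np.diag(np.sqrt(s))       # square root of the diagonal matrix diag(s)

Sigma = D_sqrt @ R @ D_sqrt
print(np.all(np.linalg.eigvalsh(Sigma) > 0))  # True: Sigma is positive definite
```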
Generate some data matrix $\boldsymbol{X}$ that we will transform to later have exactly that covariance matrix.
To do that, we first orthonormalize matrix $\boldsymbol{X}$, giving matrix $\boldsymbol{Y}$ with covariance matrix $\boldsymbol{I}$ (identity).
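One way to do this orthonormalization (a Python/NumPy sketch; using QR here is an assumed choice of method, and the random $\boldsymbol{X}$ is a stand-in for any data matrix with rows as observations):

```python
import numpy as np

# Center a data matrix X and orthonormalize its columns so that the
# result Y has covariance matrix I.
rng = np.random.default_rng(0)
n, p = 100, 3
X = rng.normal(size=(n, p))

Xc = X - X.mean(axis=0)      # center the columns
Q, _ = np.linalg.qr(Xc)      # orthonormal columns spanning the same space
Y = Q * np.sqrt(n - 1)       # rescale so that cov(Y) = Q^T Q = I

print(np.allclose(np.cov(Y, rowvar=False), np.eye(p)))  # True
```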
Transform matrix $\boldsymbol{Y}$ to have covariance matrix $\boldsymbol{\Sigma}$ and centroid $\boldsymbol{\mu}$.
Edit: here is what is going on: do a spectral decomposition $\boldsymbol{\Sigma} = \boldsymbol{G} \boldsymbol{D} \boldsymbol{G}^{t}$, where $\boldsymbol{G}$ is the matrix of normalized eigenvectors of $\boldsymbol{\Sigma}$ and $\boldsymbol{D}$ is the corresponding diagonal matrix of eigenvalues. The matrix $\boldsymbol{G} \boldsymbol{D}^{1/2} \boldsymbol{Y}$ then has covariance matrix $\boldsymbol{G} \boldsymbol{D}^{1/2} \mathrm{Cov}(\boldsymbol{Y}) \boldsymbol{D}^{1/2} \boldsymbol{G}^{t} = \boldsymbol{G} \boldsymbol{D} \boldsymbol{G}^{t} = \boldsymbol{\Sigma}$, since $\mathrm{Cov}(\boldsymbol{Y}) = \boldsymbol{I}$.
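The whole transformation can be sketched in Python/NumPy as follows (the target $\boldsymbol{\Sigma}$, the centroid $\boldsymbol{\mu}$, and the random starting data are assumed example values; rows are observations, so the multiplication appears transposed relative to the formula above):

```python
import numpy as np

# Map a matrix Y with cov(Y) = I onto covariance Sigma and mean mu,
# using the spectral decomposition Sigma = G D G^T.
rng = np.random.default_rng(1)
n, p = 200, 3
X = rng.normal(size=(n, p))
Q, _ = np.linalg.qr(X - X.mean(axis=0))
Y = Q * np.sqrt(n - 1)                  # cov(Y) = I

Sigma = np.array([[4.0, 3.0, 0.6],      # assumed positive definite target
                  [3.0, 9.0, 0.6],
                  [0.6, 0.6, 1.0]])
mu = np.array([10.0, -5.0, 0.0])        # assumed centroid

evals, G = np.linalg.eigh(Sigma)        # Sigma = G D G^T
D_sqrt = np.diag(np.sqrt(evals))
Z = Y @ (G @ D_sqrt).T + mu             # rows transformed, then shifted to mu

print(np.allclose(np.cov(Z, rowvar=False), Sigma))  # True
print(np.allclose(Z.mean(axis=0), mu))              # True
```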
Check that the correlation matrix is really $\boldsymbol{R}$.
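This check is just the reverse rescaling, $\boldsymbol{D}^{-1/2} \boldsymbol{\Sigma} \boldsymbol{D}^{-1/2}$ (a Python/NumPy sketch with the same assumed example values for $\boldsymbol{R}$ and the variances):

```python
import numpy as np

# Recover the correlation matrix from the covariance matrix Sigma and
# compare it to the original R.
R = np.array([[1.0, 0.5, 0.3],
              [0.5, 1.0, 0.2],
              [0.3, 0.2, 1.0]])
s = np.array([4.0, 9.0, 1.0])
Sigma = np.diag(np.sqrt(s)) @ R @ np.diag(np.sqrt(s))

d_inv = np.diag(1 / np.sqrt(np.diag(Sigma)))   # D^{-1/2}
print(np.allclose(d_inv @ Sigma @ d_inv, R))   # True: the correlation matrix is R
```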
For other purposes, the question might now be: How do I find a positive definite matrix that is "very similar" to a pre-specified one that is not positive definite. That I don't know.
Edit: corrected some square roots