A, B, C, and D are boolean variables, meaning that each takes the value
"true" or "false". More complex expressions have value "true" or "false"
depending on the values of these variables; writing X' for "not X" and
juxtaposition for "and", the expression A'BD' is true if A is false, B is
true, and D is false, regardless of whether C is true or false.
To say that F = G, where F and G are complex expressions, means that, no
matter what values the boolean variables have, the value of F is the same
as the value of G (that is, F and G are either both true or both false).
So, for example, we have
AB + AC = A(B+C)
because if A is true and either B or C is true, then both sides are true,
and in any other case both sides are false---there's no way to assign
values to A, B, C such that the two sides come out differently.
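Such an exhaustive check is easy to mechanize; for instance, in Python:

```python
from itertools import product

# Brute-force check of the identity AB + AC = A(B+C):
# enumerate every assignment of True/False to A, B, C and
# compare the two sides.
def check_distributive():
    for a, b, c in product([False, True], repeat=3):
        lhs = (a and b) or (a and c)   # AB + AC
        rhs = a and (b or c)           # A(B+C)
        if lhs != rhs:
            return False
    return True

print(check_distributive())  # True: the identity holds
```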
As previous posters have mentioned, you can always prove (or disprove!) an
equality by going through all possible assignments of "true" and "false" to
the variables. The goal seems to be to prove "by theorems", that is, using
operations previously proven true. As you say, we can either manipulate
one side of the equation until it has the form of the other side, or we
could manipulate both sides and get them into a common form.
In this case, the first thing to notice is that the right-hand side (RHS)
is a copy of the LHS with some extra stuff tacked onto the end. For this
reason it's simplest to manipulate just the RHS to get rid of the
surplusage! We can start with this basic theorem:
if Q is true whenever P is true, then Q = Q + P
We can prove this theorem by systematically considering all possibilities
for P and Q. Or look at it this way: If Q is true, then both sides of the
equation are true. And if Q is false, then P must be false (since, by
assumption, if P were true Q would be true) hence both sides are false.
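This case analysis can also be confirmed mechanically by running through all four (P, Q) combinations; for instance:

```python
from itertools import product

# Check the theorem: if Q is true whenever P is true (i.e. P implies Q),
# then Q = Q + P. We enumerate all four (P, Q) combinations and verify
# the equality in every case where the hypothesis holds.
def check_absorption_theorem():
    for p, q in product([False, True], repeat=2):
        p_implies_q = (not p) or q   # "Q is true whenever P is true"
        if p_implies_q and q != (q or p):
            return False
    return True

print(check_absorption_theorem())  # True: the theorem holds
```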
Given this theorem we prove:
XY + X'Z = XY + X'Z + YZ
Proof: Let P be YZ and let Q be XY+X'Z. Suppose P is true, that is, Y and
Z are both true. But if Y and Z are both true, then XY+X'Z must be true.
(Reason: If X is true then, since Y is true, XY is true. And if X is
false then, since Z is true, X'Z is true. So in either case XY+X'Z is
true.) So we've shown that Q is true whenever P is true, hence by the
previous theorem Q = Q + P, which in this case is what we wanted to prove.
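As a sanity check, this identity (the consensus theorem) can be verified by enumerating all eight assignments of X, Y, Z:

```python
from itertools import product

# Verify the consensus identity XY + X'Z = XY + X'Z + YZ
# over all eight assignments of X, Y, Z.
def check_consensus():
    for x, y, z in product([False, True], repeat=3):
        lhs = (x and y) or ((not x) and z)   # XY + X'Z
        rhs = lhs or (y and z)               # XY + X'Z + YZ
        if lhs != rhs:
            return False
    return True

print(check_consensus())  # True: the identity holds
```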
Now we can prove
A'D' + AC' = A'D' + AC' + C'D'
This is just the same as the previous theorem, putting A for X, D' for Y,
and C' for Z.
This last theorem implies
B(A'D' + AC') = B(A'D' + AC' + C'D')
which is the same as
A'BD' + ABC' = A'BD' + ABC' + BC'D'
Given this, we can take the RHS of the original and substitute A'BD' + ABC'
for A'BD' + ABC' + BC'D', which is to say, we can drop the term BC'D'.
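That substitution step rests on the identity just derived, which a quick enumeration over all sixteen assignments of A, B, C, D confirms:

```python
from itertools import product

# Verify A'BD' + ABC' = A'BD' + ABC' + BC'D' over all sixteen
# assignments of A, B, C, D; this justifies dropping the term BC'D'.
def check_drop_bcd():
    for a, b, c, d in product([False, True], repeat=4):
        lhs = ((not a) and b and (not d)) or (a and b and (not c))
        rhs = lhs or (b and (not c) and (not d))
        if lhs != rhs:
            return False
    return True

print(check_drop_bcd())  # True: the term BC'D' is redundant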
With steps similar to the above we can prove these two theorems:
A'BD' + BCD = A'BD' + BCD + A'BC
BCD + ABC' = BCD + ABC' + ABD
which permit us to drop the final two terms of the RHS of the original,
completing the proof.
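Both of these can likewise be checked mechanically over all sixteen assignments:

```python
from itertools import product

# Exhaustively verify the two remaining identities used to drop
# the terms A'BC and ABD from the RHS of the original equation.
def check_remaining_identities():
    for a, b, c, d in product([False, True], repeat=4):
        # A'BD' + BCD = A'BD' + BCD + A'BC
        lhs1 = ((not a) and b and (not d)) or (b and c and d)
        if lhs1 != (lhs1 or ((not a) and b and c)):
            return False
        # BCD + ABC' = BCD + ABC' + ABD
        lhs2 = (b and c and d) or (a and b and (not c))
        if lhs2 != (lhs2 or (a and b and d)):
            return False
    return True

print(check_remaining_identities())  # True: both identities hold
```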
Algorithmically, this is the #SAT problem (also known as Sharp-SAT). The general problem statement is: given a propositional logic formula $\phi(x_1,\dots,x_n)$ on $n$ boolean variables $x_1,\dots,x_n$, count the number of satisfying assignments. In other words, count the size of the set $\{(x_1,\dots,x_n) : \phi(x_1,\dots,x_n)\}$.
The #SAT problem is #P-complete (and in particular at least as hard as the NP-complete problems), which likely means there is no general algorithm for this problem that is efficient.
If $n$ is small, you can enumerate all $2^n$ assignments and check which ones are satisfying assignments, but this becomes inefficient very quickly as $n$ becomes large. If $\phi$ has some special structure, you may be able to count the number of satisfying assignments more quickly than enumerating them, but this requires cleverness, and it is believed there is no general technique (no technique that works for all $\phi$) to do this efficiently.
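For concreteness, here is what the naive enumeration looks like in Python; the particular formula passed in below is just an illustrative example:

```python
from itertools import product

# A minimal brute-force #SAT counter: phi is any function of n booleans.
# We enumerate all 2^n assignments and count the satisfying ones.
# Fine for small n, but exponential in general.
def count_sat(phi, n):
    return sum(1 for bits in product([False, True], repeat=n)
               if phi(*bits))

# Example: (x1 OR x2) has 3 satisfying assignments out of 4.
print(count_sat(lambda x1, x2: x1 or x2, 2))  # 3
```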
Nonetheless, there has been significant work on heuristics and techniques for solving the #SAT problem, and they can be effective enough for many conditions you run into in practice, because real-world formulas often have structure that can be exploited to speed up the counting. The computer science literature on this subject is rich and detailed, too deep to cover in this answer, but if you search for research on #SAT, you should find some entry points.
This question would probably be better asked on the Computer Science or Theoretical Computer Science sites -- you'll get better answers there.
If you just want to get the answer rapidly for a particular formula, your best bet is to find an existing #SAT solver and use it. It'll implement sophisticated techniques, but you don't need to know how those techniques work to use the solver and (with any luck) get an answer.
You would treat this problem the same way you would treat one with only one output: create two separate Karnaugh Maps, one for each output. I quickly made two with your example data which resulted in the following expressions:
Z1 = A'B'D'
Z2 = A'B'C'D'
Here is a Wikipedia article on Karnaugh Maps if you are not familiar with them: https://en.wikipedia.org/wiki/Karnaugh_map
Sources: As an Electrical Engineering student I spent a lot of time creating logic circuitry with multiple outputs and this was the method I used.
Edit: I'm not sure if this was what you were trying to get at, but if you wanted Z1 to depend on Z2, you would simply add Z2 as an additional input to Z1's equation. I hope this helps!