[Math] Dilworth’s Theorem is equivalent to Hall’s

Tags: combinatorics, graph-theory

I have been told that Dilworth's Theorem implies Hall's Theorem.
I don't seem to be able to show this implication, especially because I haven't done any graph theory in a while!
Any help?
If possible, I would like a graph-theoretic proof (not one using systems of distinct representatives).

I believe the main idea is to partially order the vertices of the bipartite graph $G$ with parts $X$ and $Y$ by letting $x < y$ if and only if $x \in X$, $y \in Y$, and there is an edge between $x$ and $y$. Now if I could show that I may assume the maximum antichain in $G$ contains the whole of $X$, I would be done. But I am not sure this is true!
I tried to apply Hall's Condition to a possibly "larger" antichain which omits some elements of $X$, and to show that this is impossible, but I was unsuccessful.
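One useful observation about the proposed order: since all relations go from $X$ to $Y$, an antichain in this poset is exactly an independent set of $G$. As a sanity check (the graph below is a small hypothetical example, not from the question), a brute-force script can verify this on a tiny bipartite graph:

```python
# Sanity check: with the order  x < y  iff  x in X, y in Y, and xy is an edge,
# the antichains of the poset coincide with the independent sets of G.
# Small hypothetical example graph (not from the original post).
from itertools import combinations

X = {0, 1, 2}                           # one side of the bipartition
Y = {3, 4}                              # the other side
E = {(0, 3), (1, 3), (1, 4), (2, 4)}    # edges stored as (x, y) pairs
V = X | Y

def is_antichain(S):
    # No two elements of S are comparable, i.e. no relation x < y within S.
    return all((x, y) not in E for x in S for y in S)

def is_independent(S):
    # No edge of G has both endpoints in S (check both orientations).
    return all((u, v) not in E and (v, u) not in E for u in S for v in S)

subsets = [frozenset(c) for r in range(len(V) + 1) for c in combinations(V, r)]
antichains = {S for S in subsets if is_antichain(S)}
independents = {S for S in subsets if is_independent(S)}
assert antichains == independents
print(len(antichains))
```

Because the edge pairs are always stored with the $X$-endpoint first, the two predicates test the same condition, which is precisely why the identification works.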

Best Answer

More generally, you can show that Dilworth's Theorem implies König's Theorem (in a bipartite graph, the size of a maximum matching equals the size of a minimum vertex cover), and vice versa; Hall's Theorem then follows from König's.

As a hint, one direction is easy: the complement of a minimum vertex cover is an antichain (in fact, a maximum one). I leave the other direction to you!
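The claimed equality can be checked by brute force on a small hypothetical bipartite graph (the graph below is made up for illustration; this is a sanity check, not a proof):

```python
# Brute-force check of König's Theorem on a tiny bipartite graph:
# the size of a maximum matching equals the size of a minimum vertex cover.
# Hypothetical example graph, chosen only for illustration.
from itertools import combinations

X = {0, 1, 2}
Y = {3, 4}
E = [(0, 3), (1, 3), (1, 4), (2, 4)]    # edges as (x, y) pairs
V = X | Y

def is_matching(M):
    # A matching uses each vertex at most once.
    used = [v for e in M for v in e]
    return len(used) == len(set(used))

def is_cover(C):
    # A vertex cover touches every edge.
    return all(u in C or v in C for (u, v) in E)

max_matching = max(len(M) for r in range(len(E) + 1)
                   for M in combinations(E, r) if is_matching(M))
min_cover = min(len(C) for r in range(len(V) + 1)
                for C in combinations(V, r) if is_cover(C))
assert max_matching == min_cover
print(max_matching, min_cover)
```

Here $\{3, 4\}$ is a minimum vertex cover, and its complement $\{0, 1, 2\}$ is an antichain in the order from the question, illustrating the hint above.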