Peter Smith’s approach is probably the best way to think about such problems in general. In this case there’s another fairly simple approach. Notice first that the connective $\#$ is symmetric in its three arguments: if $XYZ$ is any permutation of $ABC$, $\#ABC$ is equivalent to $\#XYZ$. From this it’s easy to see that all combinations of exactly two sentence letters $A$ and $B$ with $\#$ are equivalent either to $\#AAB$ or to $\#ABB$. Moreover, $\#AAB$ is equivalent simply to $A$, and $\#ABB$ to $B$; this suggests trying to show that as long as we work with at most two sentence letters, $\#$ essentially does nothing.
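The equivalences just listed — symmetry, $\#AAB \equiv A$, $\#ABB \equiv B$ — are exactly the properties of the ternary majority function, and indeed they force $\#$ to be majority. Reading $\#$ as majority (an assumption made explicit here, since the original exercise's truth table for $\#$ is not reproduced above), the reductions can be checked by brute force:

```python
from itertools import product

def maj(a, b, c):
    """Ternary majority: true iff at least two of the arguments are true."""
    return (a and b) or (a and c) or (b and c)

# Symmetry: maj is invariant under permutations of its arguments.
for a, b, c in product([False, True], repeat=3):
    assert maj(a, b, c) == maj(b, a, c) == maj(c, b, a) == maj(b, c, a)

# #AAB is equivalent to A, and #ABB to B.
for a, b in product([False, True], repeat=2):
    assert maj(a, a, b) == a
    assert maj(a, b, b) == b
```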
Claim: Every wff that can be constructed from the sentence letters $A$ and $B$ and the connectives $\neg$ and $\#$ is equivalent to one of $A,B,\neg A$, and $\neg B$.
Proof: The claim is easily proved by induction on the complexity of the wff. It’s certainly true for wffs with no connective: they are simply $A$ and $B$. Suppose that $\varphi$ has at least one connective. Then $\varphi$ is either $\neg\psi$ for some wff $\psi$, or $\#\psi_0\psi_1\psi_2$ for wffs $\psi_0,\psi_1$, and $\psi_2$. Suppose that $\varphi=\neg\psi$. The induction hypothesis ensures that $\psi$ is equivalent to one of $A,B,\neg A$, and $\neg B$, and $\varphi$ is therefore equivalent to one of $\neg A,\neg B,A$, and $B$, respectively.
Now suppose that $\varphi=\#\psi_0\psi_1\psi_2$. Again the induction hypothesis ensures that each of $\psi_0,\psi_1$, and $\psi_2$ is equivalent to one of $A,B,\neg A$, and $\neg B$. Any set of three chosen from $\{A,B,\neg A,\neg B\}$ necessarily contains either two copies of one of the four wffs or a wff and its negation, so $\varphi$ is equivalent to $\#XXY$ or $\#X\neg XY$ for some $X,Y\in\{A,B,\neg A,\neg B\}$. But $\#XXY$ is equivalent to $X$, and $\#X\neg XY$ is equivalent to $Y$, so in every case $\varphi$ is equivalent to one of $A,B,\neg A$, and $\neg B$. The claim now follows by induction. $\dashv$
Since $A\land B$ is not equivalent to $A,B,\neg A$, or $\neg B$, $\{\neg,\#\}$ is not complete.
(I’ve implicitly used the fact that if $\psi_0,\psi_1$, and $\psi_2$ are equivalent to $\psi_0',\psi_1'$, and $\psi_2'$, respectively, then $\#\psi_0\psi_1\psi_2$ is equivalent to $\#\psi_0'\psi_1'\psi_2'$. This is clear from consideration of the relevant truth tables.)
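The induction can also be verified mechanically: represent each wff in $A$ and $B$ by its truth table over the four valuations of the pair, close under $\neg$ and $\#$ (again read as majority, per the assumption above), and observe that the closure stabilizes at exactly $\{A, B, \neg A, \neg B\}$, with $A \land B$ absent:

```python
from itertools import product

def maj(a, b, c):
    return (a and b) or (a and c) or (b and c)

# A wff in the letters A, B is represented by its truth table over the
# four valuations of (A, B), as a 4-tuple of booleans.
vals = list(product([False, True], repeat=2))
A = tuple(a for a, b in vals)
B = tuple(b for a, b in vals)

closure = {A, B}
while True:
    new = {tuple(not x for x in f) for f in closure}
    new |= {tuple(maj(f[i], g[i], h[i]) for i in range(4))
            for f in closure for g in closure for h in closure}
    if new <= closure:
        break
    closure |= new

AND = tuple(a and b for a, b in vals)
print(len(closure))    # 4: exactly A, B, ¬A, ¬B
print(AND in closure)  # False
```

This is precisely the content of the claim: $\{\neg,\#\}$ generates only four functions of the two letters, so it cannot be complete.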
What we would like to prove is a conjunction, so it suffices to prove each conjunct separately and then glue them together at the end. This problem would probably be easier and more intuitive using proof by contradiction, but after talking with the asker, I will provide a direct proof.
$$\begin{array}{lll}
1. & (P \rightarrow Q)\wedge(Q \rightarrow R) & \text{Premise} \\
2. & P \rightarrow Q &\text{Simplification, 1}\\
3. & Q \rightarrow R & \text{Simplification, 1}\\
4. & \neg{P} \vee Q & \text{Conditional Law, 2}\\
5. & \neg{Q} \vee R & \text{Conditional Law, 3}\\
6. & Q \vee \neg{Q} & \text{Tautology} \\
7. & \neg{P} \vee R & \text{Constructive Dilemma, 4,5,6}\\
8. & P \rightarrow R & \text{Conditional Law, 7}\\
9. & (P \rightarrow Q) \vee (Q \rightarrow R) &\text{Addition, 2}\\
10. & (Q \rightarrow P) \vee (Q \rightarrow R) &\text{Addition, 3}\\
11. & (P \rightarrow Q) \vee (R \rightarrow Q) &\text{Addition, 2}\\
12. & Q \vee \neg{Q} & \text{Tautology}\\
13. & (Q \vee \neg{Q}) \vee (P \vee \neg{R}) & \text{Addition, 12}\\
14. & (\neg{Q} \vee P) \vee (\neg{R} \vee Q) & \text{Commutative and Associative Laws, 13}\\
15. & (Q \rightarrow P) \vee (R \rightarrow Q) & \text{Conditional Law, 14}\\
16. & \big((P \rightarrow Q) \vee (Q \rightarrow R)\big)\wedge \big((Q \rightarrow P) \vee (Q \rightarrow R)\big) & \text{Conjunction, 9,10}\\
17. & \big((P \rightarrow Q) \vee (R \rightarrow Q)\big)\wedge \big((Q \rightarrow P) \vee (R \rightarrow Q)\big) & \text{Conjunction, 11,15}\\
18. & \big((P \rightarrow Q) \wedge (Q \rightarrow P)\big)\vee (Q \rightarrow R) & \text{Distributive Law, 16}\\
19. & \big((P \rightarrow Q) \wedge (Q \rightarrow P)\big)\vee (R \rightarrow Q) & \text{Distributive Law, 17}\\
20. & \Big(\big((P \rightarrow Q) \wedge (Q \rightarrow P)\big)\vee (Q \rightarrow R)\Big) \wedge & \\
&\Big(\big((P \rightarrow Q) \wedge (Q \rightarrow P)\big)\vee (R \rightarrow Q)\Big) & \text{Conjunction, 18,19}\\
21. & \big((P \rightarrow Q) \wedge (Q \rightarrow P)\big)\vee \big((Q \rightarrow R) \wedge (R \rightarrow Q) \big) & \text{Distributive Law, 20}\\
22. & (P \equiv Q) \vee (Q \equiv R) & \text{Definition of Biconditional, 21}\\
\therefore & (P \rightarrow R)\wedge \big((P \equiv Q) \vee (Q \equiv R)\big) & \text{Conjunction, 8,22}
\end{array}$$
As desired.
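Every rule used in the derivation is truth-preserving, so the conclusion should also hold semantically; a brute-force sweep over the eight valuations of $P$, $Q$, $R$ (a sanity check, not part of the formal derivation) confirms the entailment:

```python
from itertools import product

def implies(x, y):
    return (not x) or y

for P, Q, R in product([False, True], repeat=3):
    premise = implies(P, Q) and implies(Q, R)
    conclusion = implies(P, R) and ((P == Q) or (Q == R))
    assert not premise or conclusion  # premise entails conclusion

print("entailment holds in all 8 valuations")
```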
There is a result in Robert Reckhow's thesis that characterizes the adequate sets of connectives. The result says that for a set of connectives to be complete one needs the following:

- an odd connective, and
- a non-monotone connective.
These are necessary and sufficient conditions for a set of connectives to be adequate. If a set of connectives is not adequate then it lacks one of these. If you have an odd connective, you can fix the inputs with T and F to get either $\vee$ or $\wedge$. If you have a non-monotone connective, you can fix the inputs with T and F to get $\lnot$. Any boolean function can then be constructed from these as a CNF.
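As one concrete instance of the non-monotone case (the choice of $\rightarrow$ as the example connective is mine, not the answer's): implication is non-monotone in its first argument, and fixing its second input to F yields negation:

```python
def implies(p, q):
    return (not p) or q

# Non-monotone in the first argument: raising p from F to T
# can drop the output from T to F.
assert implies(False, False) and not implies(True, False)

# Fixing the second input to F turns -> into negation.
for p in [False, True]:
    assert implies(p, False) == (not p)
```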
Note that the even connectives are closed under composition, as are the monotone connectives. So to prove that a set of connectives is inadequate, you typically need to show that either every connective in the set is even, or every connective in the set is monotone.
For your examples, the first is a set of monotone connectives and the second a set of even connectives, so neither can be adequate.
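The monotone case can be made concrete with a small closure computation. Taking $\{\wedge, \vee\}$ as a stand-in monotone set (the original examples are not reproduced here), every one-variable function generated from $P$ and the constants T, F is monotone, so $\neg P$ never appears:

```python
# Truth tables over the two valuations of a single letter P,
# represented as 2-tuples (value at P=F, value at P=T).
P = (False, True)
closure = {P, (False, False), (True, True)}  # P plus the constants F and T

while True:
    new = {tuple(f[i] and g[i] for i in range(2))
           for f in closure for g in closure}
    new |= {tuple(f[i] or g[i] for i in range(2))
            for f in closure for g in closure}
    if new <= closure:
        break
    closure |= new

NOT_P = (True, False)
print(NOT_P in closure)  # False: no composition of ∧, ∨ expresses negation
```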