After all, our goal is to explain physical phenomena... what if we venture into this jungle of real functions and come up with a totally different theory that explains physical phenomena?
Good luck to you.
First of all, you are wrong that classical physics did not use complex functions. The solutions of Maxwell's equations expressed as complex exponentials are more general and universal than sines and cosines.
The simplest set of solutions to the wave equation results from assuming sinusoidal waveforms of a single frequency in separable form:
$$\mathbf{E}(\mathbf{r}, t)=\mathrm{Re}\{\mathbf{E}(\mathbf{r})\, e^{i\omega t}\}$$
Complex functions are a useful tool in integration and in the description of real data.
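As a small illustration (the amplitude and frequency below are arbitrary values chosen for the sketch), taking the real part of the complex-exponential solution reproduces an ordinary cosine wave, so the sinusoidal solutions are contained in the complex ones:

```python
import numpy as np

# Sketch: the real part of the complex-exponential solution
# Re{E0 * exp(i*omega*t)} is an ordinary cosine, so sines and
# cosines are special cases of the complex form.
# E0 and omega are arbitrary illustrative values.
E0 = 2.5
omega = 3.0
t = np.linspace(0.0, 2 * np.pi / omega, 1000)

E_complex = np.real(E0 * np.exp(1j * omega * t))  # Re{E0 e^{i omega t}}
E_cosine = E0 * np.cos(omega * t)                 # the "classical" sinusoid

print(np.allclose(E_complex, E_cosine))  # → True, the two waveforms coincide
```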
(If you ask me to do research in theoretical physics, I'll throw all the QM books in the garbage (no disregard though) and start thinking from this point of view... That's my style of working!)
With such blind spots I am sure nobody will ask you to do research in theoretical physics.
The difference between the classical use of complex functions in the solutions of wave equations and the quantum mechanical one is the postulate that posits that the modulus squared of the wavefunction is real and gives the probability of observing an elementary-particle (or nuclear) interaction. In the microcosm, where quantum mechanics reigns, one cannot take a ruler, mark it, and measure; it was found that theory and data agreed when the probability postulate was imposed. One has to make many measurements and obtain the probability distribution for the particular value of interest.
The above link discusses the postulates of quantum mechanics, which were not imposed out of freak imagination but were necessary in order to calculate and fit known observations, like the hydrogen atom, and to predict the outcomes of experiments and observations.
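The probability postulate can be illustrated with a toy two-state system (the amplitudes below are arbitrary illustrative choices): repeating the measurement many times reproduces the modulus-squared probabilities.

```python
import numpy as np

# Sketch of the probability postulate (Born rule) on a two-state system.
# The amplitudes are arbitrary illustrative values.
rng = np.random.default_rng(42)

psi = np.array([0.6, 0.8j])      # normalized: 0.36 + 0.64 = 1
probs = np.abs(psi)**2           # modulus squared -> probabilities

# "One has to make many measurements and obtain the probability distribution":
outcomes = rng.choice(2, size=100_000, p=probs)
freq = np.bincount(outcomes) / outcomes.size

print(probs)    # ≈ [0.36, 0.64]
print(freq)     # close to [0.36, 0.64]; improves with more measurements
```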
EDIT to address the last part of the question:
(If you ask me to do research in theoretical physics, I'll throw all the QM books in the garbage (no disregard though) and start thinking from this point of view... That's my style of working!)
That works for art; art is much less dependent on databases of observations and on the tools that can be used.
The fact that for two thousand years people have been creating models of physical observations, and particularly over the last 300 years a database of mathematical tools too, constrains creativity in science. The mathematical tools have been used to model all observations up to now. These models are, in a way, a shorthand description of nature that can be used in many ways instead of going back to the data itself. There exists a frontier of experimental research where the models have not been validated, and that is where new thinking can come in.
My expected answer is in this spirit: "Hey, if you go in that direction, you are bound to end up in quicksand, for such and such a reason."
If you go in the direction of throwing everything away, you will end up with vague models like the Democritus atomic model or the phlogiston theory, in your own words. The mathematical models used now are validated, some of them to great accuracy. New mathematical tools to model the already-modeled data would be worth attention only if something new and unexpected were predicted and then found in experiments.
There are people working on off-the-beaten-track theories, trying to explain quantum mechanics by underlying deterministic theories. These people have a thorough knowledge of the existing mathematical tools and of the physics models that have been validated. They simply work at the frontier despite mainstream physics considering their effort contradictory, or impossible/prohibited by the postulates of quantum mechanics and special relativity. An example is the current research interest of G. 't Hooft, who also participated here a while ago.
So if you go in that direction, you will surely end up in quicksand if you do not have a thorough knowledge of the data and of the mathematical tools used by physics up to now. If you make the effort to acquire them, then of course you are free to prove mainstream physics "wrong", as long as your new theory can accommodate the data summarized by the models up to now. All new theories, as they appeared in physics, joined smoothly with the old ones as limiting cases.
In order to test whether a system is in a state with some wide uncertainty in position or momentum, you would have to do an interference experiment. The smallness of $\hbar$ does not explain why you can't do such an experiment with a macroscopic system.
Rather, what happens is that the system interacts with its environment and undergoes decoherence. Decoherence would select a set of states that are narrow in position and momentum on a macroscopic scale. The system would then exist in each of those states, but the states would be unable to undergo interference as a result of the decoherence. As a result, each version of you would see the system in one of the allowed states, none of which is wide in position and momentum. For more explanation of decoherence see
https://arxiv.org/abs/quant-ph/0306072
https://arxiv.org/abs/1212.3245
and references therein.
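A toy numerical sketch of the mechanism (the per-step damping factor `gamma` is an invented illustrative parameter, not taken from the papers above): damping the off-diagonal density-matrix elements destroys the ability to interfere while leaving the outcome probabilities intact.

```python
import numpy as np

# Toy sketch of decoherence as dephasing: interaction with the environment
# damps the off-diagonal terms of the density matrix, suppressing
# interference, while the outcome probabilities (diagonal) are untouched.
# gamma is an illustrative damping parameter.
psi = np.array([1.0, 1.0]) / np.sqrt(2)   # equal superposition
rho = np.outer(psi, psi.conj())           # pure-state density matrix

gamma = 0.99                              # per-step off-diagonal damping
for _ in range(1000):
    rho[0, 1] *= gamma
    rho[1, 0] *= gamma

print(np.diag(rho).real)   # probabilities unchanged: [0.5, 0.5]
print(abs(rho[0, 1]))      # coherence ~ 0: interference suppressed
```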
The uncertainty principle is much more general than anything you might say about the wave-particle duality. In particular, wave-particle duality is a vague and imprecise statement about how certain types of quantum systems qualitatively behave, while the uncertainty principle is a very general and quantitative statement about the standard deviations of operators.
While, in settings like the double-slit, it is true that you may think about the quantum objects as being represented by a probability wave, this breaks down whenever one considers finite-dimensional Hilbert spaces, as they occur e.g. in the setting of quantum information and its qubits. There is no continuous set of generalized position eigenstates - indeed no position operator at all - and hence no "wavefunction". Nevertheless, the relation $$ \sigma_A(\psi)\sigma_B(\psi) \geq \frac{1}{2}\lvert\langle \psi \vert [A,B] \vert \psi \rangle\rvert$$ holds for all operators $A,B$ and all states $\psi$.
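This can be checked numerically on a qubit. The sketch below (the choice $A = \sigma_x$, $B = \sigma_y$ with Pauli matrices is an illustrative one, not the only possibility) verifies the relation on random states of a system that has no position operator or wavefunction at all:

```python
import numpy as np

# Sketch: check the Robertson relation
#   sigma_A(psi) * sigma_B(psi) >= (1/2) |<psi|[A,B]|psi>|
# on a qubit, where no position operator or wavefunction exists.
# The choice A = sigma_x, B = sigma_y (Pauli matrices) is illustrative.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)

def stddev(op, psi):
    """Standard deviation of a Hermitian operator in state psi."""
    mean = np.vdot(psi, op @ psi).real
    mean_sq = np.vdot(psi, op @ op @ psi).real
    return np.sqrt(max(mean_sq - mean**2, 0.0))

rng = np.random.default_rng(0)
for _ in range(100):
    psi = rng.normal(size=2) + 1j * rng.normal(size=2)
    psi /= np.linalg.norm(psi)       # random normalized qubit state

    lhs = stddev(sx, psi) * stddev(sy, psi)
    comm = sx @ sy - sy @ sx         # [sigma_x, sigma_y] = 2i sigma_z
    rhs = 0.5 * abs(np.vdot(psi, comm @ psi))
    assert lhs >= rhs - 1e-12        # the bound holds for every state

print("Robertson bound holds for 100 random qubit states")
```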
And even in the infinite-dimensional setting where you might claim that we have a "wave nature" and a "particle nature", this relation holds for all operators, not just position and momentum, and the proof just relies on basic properties of Hilbert spaces like the Cauchy-Schwarz inequality.
To stress this crucial fact: the uncertainty relation is a general consequence of the axioms that states are rays in a Hilbert space, together with the rule for how these states yield expectation values. No conception of "particle" or "wave" ever enters the derivation, and the fact that waves also exhibit a type of uncertainty relation in their widths is a simple consequence of the properties of the Fourier transform. Since the Fourier transform is also intimately related to the position and momentum operators by the Stone-von Neumann theorem about their essentially unique representation as multiplication and differentiation, this explains the similarity without any reference to "wave-particle duality".
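The Fourier width relation mentioned above can be seen numerically. In this sketch (grid sizes and the width parameter `s` are arbitrary illustrative choices), a Gaussian amplitude saturates the bound: the product of its position width and wavenumber width comes out to 1/2.

```python
import numpy as np

# Sketch: the width relation for waves follows from the Fourier transform.
# For a Gaussian amplitude, the product of the position width and the
# wavenumber width (both computed from the normalized intensities)
# equals 1/2, the minimum. Grid sizes and s are illustrative choices.
n, dx = 4096, 0.025
x = (np.arange(n) - n // 2) * dx
s = 1.7
f = np.exp(-x**2 / (4 * s**2))    # Gaussian amplitude; |f|^2 has std s

# width in x from the normalized intensity |f|^2
p = np.abs(f)**2
p /= p.sum() * dx
dx_width = np.sqrt(np.sum(x**2 * p) * dx)

# Fourier transform and width in k from |F|^2
F = np.fft.fftshift(np.fft.fft(np.fft.ifftshift(f)))
k = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(n, d=dx))
dk = k[1] - k[0]
q = np.abs(F)**2
q /= q.sum() * dk
dk_width = np.sqrt(np.sum(k**2 * q) * dk)

print(dx_width * dk_width)   # ≈ 0.5, the Fourier analogue of the bound
```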