More physically than a lot of the other answers here (many of which amount to "the formalism of quantum mechanics uses complex numbers, so quantum mechanics should have complex numbers"), you can account for the complex nature of the wave function by writing it as $\Psi (x) = |\Psi (x)|e^{i \phi (x)}$, where $e^{i\phi(x)}$ is a complex phase factor with real phase $\phi(x)$. It turns out that this phase is not directly measurable, but it has many measurable consequences, such as the interference pattern in the double-slit experiment and the Aharonov-Bohm effect.
Why are complex numbers essential for explaining these things? Because you need a representation that does not induce unphysical time or space dependence in the probability density $|\Psi (x)|^{2}$ (as multiplying by a real amplitude factor would), AND that DOES allow for interference effects like those cited above. The most natural way to satisfy both requirements is to multiply the wave amplitude by a complex phase.
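To make this concrete, here is a small numerical sketch (the wave packet and phase function are hypothetical, chosen only for illustration): an overall position-dependent phase drops out of $|\Psi|^2$, while a relative phase between two interfering amplitudes does not.

```python
import numpy as np

x = np.linspace(-10, 10, 2001)

# A sample complex wave packet (illustrative only).
psi = np.exp(-x**2 / 4) * np.exp(1j * 2.0 * x)

# Multiplying by a position-dependent complex phase leaves |psi|^2 unchanged...
phi = 0.7 * x  # arbitrary real phase function
psi_rotated = psi * np.exp(1j * phi)
assert np.allclose(np.abs(psi)**2, np.abs(psi_rotated)**2)

# ...but a relative phase between two interfering amplitudes DOES change the
# total |Psi|^2 -- this is where interference effects live.
psi1 = np.exp(-(x - 2)**2 / 4)
psi2 = np.exp(-(x + 2)**2 / 4)
no_shift = np.abs(psi1 + psi2)**2
with_shift = np.abs(psi1 + np.exp(1j * np.pi) * psi2)**2
print(np.allclose(no_shift, with_shift))  # False: the relative phase is observable
```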
After all, our goal is to explain physical phenomena... what if we venture into this jungle of real functions and come up with a totally different theory that explains the same phenomena?
Good luck to you.
First of all, you are wrong that classical physics did not use complex functions. The solutions of Maxwell's equations expressed as complex exponentials are more general and universal than sines and cosines.
The simplest set of solutions to the wave equation results from assuming sinusoidal waveforms of a single frequency in separable form:
$$\mathbf{E}(\mathbf{r}, t)=\mathrm{Re}\{\mathbf{E}(\mathbf{r})\, e^{i\omega t}\}$$
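As a sanity check on this real-part convention, a short numerical sketch (the frequency and complex amplitude are arbitrary illustrative values): the real part of the rotating phasor is exactly the familiar sinusoid with the phasor's magnitude and phase.

```python
import numpy as np

omega = 2 * np.pi * 1.0        # angular frequency (arbitrary value)
t = np.linspace(0, 2, 500)

# Complex amplitude (phasor) for one field component: magnitude 1.5, phase 0.3 rad.
E0 = 1.5 * np.exp(1j * 0.3)

# The physical, real field is the real part of the rotating phasor:
E_real = np.real(E0 * np.exp(1j * omega * t))

# ...which is identical to the sinusoidal form |E0| cos(omega t + arg E0):
assert np.allclose(E_real, np.abs(E0) * np.cos(omega * t + np.angle(E0)))
print("phasor and cosine forms agree")
```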
Complex functions are a useful tool in integration and in the description of real data.
(If you ask me to do research in theoretical physics, I'll throw all the QM books in the garbage (no disregard though) and start thinking from this point of view... That's my style of working!)
With such blind spots I am sure nobody will ask you to do research in theoretical physics.
The difference between the classical use of complex functions in the solutions of wave equations and the quantum mechanical one is the postulate that posits that the squared modulus of the wavefunction is real and gives the probability of observing an elementary particle (or nuclear) interaction. In the microcosm, where quantum mechanics reigns, one cannot take a ruler, mark it, and measure; it was found that theory and data agreed when the probability postulate was imposed. One has to make many measurements and build up the probability distribution for a particular desired value.
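A minimal numerical illustration of that probability postulate, using a hypothetical one-dimensional wavefunction $\psi(x) = e^{-|x|}$ (not from any particular problem): normalize so the total probability is 1, then integrate $|\psi|^2$ over an interval to get the probability of finding the particle there.

```python
import numpy as np

# Hypothetical 1D wavefunction, for illustration only.
x = np.linspace(-20, 20, 40001)
dx = x[1] - x[0]
psi = np.exp(-np.abs(x))

# Normalize so that integral |psi|^2 dx = 1 (simple Riemann sum on the grid).
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)

# Born rule: probability of finding the particle in [0, 1].
mask = (x >= 0) & (x <= 1)
p = np.sum(np.abs(psi[mask])**2) * dx
print(p)  # analytically this is (1 - e^{-2}) / 2
```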
The postulates of quantum mechanics were not imposed out of a freak imagination; they were necessary to be able to calculate and fit known observations, like the hydrogen atom, and to predict the outcomes of experiments and observations.
EDIT to address the last part of the question:
(If you ask me to do research in theoretical physics, I'll throw all the QM books in the garbage (no disregard though) and start thinking from this point of view... That's my style of working!)
That works for art; art is much less dependent on databases of observations and on the tools that can be used.
The fact that for two thousand years people have been creating models of physical observations, and in particular for the last 300 years a database of mathematical tools, constrains creativity in science. The mathematical tools have been used to model all observations up to now. These models are, in a way, a shorthand description of nature that can be used in many ways instead of going back to the data itself. There exists a frontier of experimental research where the models have not been validated, and that is where new thinking can come in.
My expected answer is in this spirit: "Hey, if you go in that direction, you are bound to end up in quicksand, for such and such a reason."
If you go in the direction of throwing everything away, you will end up with vague models like the Democritus atomic model or the phlogiston theory, in your own words. The mathematical models used now are validated, some of them to great accuracy. New mathematical tools to model the already modeled data would only be worth attention if something new and unexpected were predicted and then found in experiments.
There are people working on off-the-beaten-track theories, trying to explain quantum mechanics by underlying deterministic theories. These people have a thorough knowledge of the existing mathematical tools and of the physics models that have been validated. They choose to work at that frontier even though mainstream physics considers their effort contradictory, or impossible/prohibited by the postulates of quantum mechanics and special relativity. An example is the current research interest of G. 't Hooft, who also participated here a while ago.
So if you go in that direction you will surely end up in quicksand if you do not have a thorough knowledge of the data and the mathematical tools used by physics up to now. If you make the effort to acquire them, then of course you are free to prove mainstream physics "wrong", as long as your new theory can accommodate the data shorthand of the models up to now. All new theories, as they appeared in physics, joined smoothly with the old ones as limiting cases.
Best Answer
You are looking at a solution of the time-independent Schrödinger equation, since your $\psi(x)$ does not have any time dependence, and the basic solutions of the time-independent equation can often be chosen real. Linear combinations of these basic solutions can be complex.
The solutions to the time-dependent Schrodinger equation are always linear combinations of the form $$ \Psi(x,t)=\sum_n c_n e^{-iE_nt/\hbar} \psi_n(x) $$ and will be complex even if the time-independent functions $\psi_n(x)$ are real.
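A small sketch of this, using the textbook infinite square well (in units $\hbar = m = L = 1$) as an illustrative choice of real $\psi_n(x)$: a single stationary state has a time-independent $|\Psi|^2$, while a two-term superposition is genuinely complex and its $|\Psi|^2$ changes in time.

```python
import numpy as np

# Infinite square well on [0, 1] with hbar = m = 1 (standard textbook setup).
L = 1.0
x = np.linspace(0, L, 501)

def psi_n(n, x):
    return np.sqrt(2 / L) * np.sin(n * np.pi * x / L)   # real stationary states

def E_n(n):
    return (n * np.pi)**2 / 2

def Psi(x, t, coeffs):
    # Psi(x, t) = sum_n c_n exp(-i E_n t) psi_n(x)
    return sum(c * np.exp(-1j * E_n(n) * t) * psi_n(n, x)
               for n, c in coeffs.items())

# Single stationary state: |Psi|^2 does not change in time.
one = {1: 1.0}
assert np.allclose(np.abs(Psi(x, 0.0, one))**2, np.abs(Psi(x, 0.7, one))**2)

# Superposition of two real psi_n: Psi is complex and |Psi|^2 oscillates.
two = {1: 1 / np.sqrt(2), 2: 1 / np.sqrt(2)}
assert not np.allclose(np.abs(Psi(x, 0.0, two))**2, np.abs(Psi(x, 0.7, two))**2)
print("stationary state static, superposition oscillates")
```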
To relate $a$ to the uncertainty relation, you would need to compute $\Delta x^2$ and $\Delta p^2$ using your $\psi(x)$ (which you will first have to normalize) and then find how $a$ enters the product $\Delta x\,\Delta p$.
To give you a hint, I'm including the plot of $\psi(x)^2$ for $a=1$ (black), $a=2$ (blue), and $a=1/2$ (red).
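Since the question's $\psi(x)$ is not reproduced here, the sketch below uses a Gaussian $\psi(x) = e^{-x^2/(2a^2)}$ purely as a stand-in; the numerical recipe (normalize, compute $\Delta x$ and $\Delta p$, multiply) carries over to any wavefunction.

```python
import numpy as np

def uncertainty_product(a, hbar=1.0):
    """Numerically compute Delta x * Delta p for psi(x) = exp(-x^2 / (2 a^2)).

    The Gaussian is only a stand-in for the question's psi(x); the recipe
    (normalize, then take moments of x and of the derivative) is generic.
    """
    x = np.linspace(-20 * a, 20 * a, 20001)
    dx = x[1] - x[0]
    psi = np.exp(-x**2 / (2 * a**2))
    psi /= np.sqrt(np.sum(psi**2) * dx)        # normalize: integral |psi|^2 = 1

    mean_x = np.sum(x * psi**2) * dx
    var_x = np.sum((x - mean_x)**2 * psi**2) * dx

    # For a real psi, <p> = 0 and <p^2> = hbar^2 * integral (dpsi/dx)^2 dx.
    dpsi = np.gradient(psi, dx)
    mean_p2 = hbar**2 * np.sum(dpsi**2) * dx

    return np.sqrt(var_x * mean_p2)

# For a Gaussian the product is independent of a and equals hbar/2,
# the minimum allowed by the uncertainty relation:
print(uncertainty_product(0.5), uncertainty_product(1.0), uncertainty_product(2.0))
```

This also shows the trade-off behind the plot above: larger $a$ spreads $\psi(x)^2$ in position while narrowing the momentum distribution, and vice versa, keeping the product fixed.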