Let $T(n)$ denote the time complexity of multiplying two $n$-digit numbers.
Your algorithm assumes that addition and subtraction are cheap (constant time). It also performs a multiplication of two $2n$-digit numbers via four multiplications of $n$-digit numbers: $AC$, $AD$, $BC$, $BD$. However, you can reduce this to three multiplications by noting that $AD+BC = (A+B)\cdot(C+D)-AC-BD$. So,
$$(10^nA + B)\cdot(10^nC + D) = 10^{2n}AC + 10^n\big((A+B)\cdot(C+D)-AC-BD\big) + BD $$
Note that each of $(A+B)$ and $(C+D)$ has at most $n+1$ digits. So multiplying two $2n$-digit numbers boils down to two $n$-digit multiplications ($AC$ and $BD$), one $(n+1)$-digit multiplication ($(A+B)\cdot(C+D)$), two additions, and two subtractions. Addition and subtraction are constant time by assumption. Also note that multiplying two $2n$-digit numbers takes $T(2n)$. Hence,
$$T(2n) = 2T(n) + T(n+1) + \Theta(1)$$
or, $$T(n) = 3T(n/2) + \Theta(1)$$
Note that in the second equation I have dropped the ceiling and floor functions, so $T((n+1)/2)$ becomes $T(n/2)$. This is a very common slight abuse of notation when calculating asymptotic complexities. Now you can apply the Master theorem to this recurrence relation.
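As a concrete illustration, here is a minimal Python sketch of the three-multiplication scheme described above (the function name and the digit-splitting details are my own choices, not part of the original answer):

```python
def karatsuba(x: int, y: int) -> int:
    """Multiply x and y using only the products AC, BD, and (A+B)(C+D)."""
    if x < 10 or y < 10:                      # base case: a single-digit factor
        return x * y
    half = max(len(str(x)), len(str(y))) // 2
    p = 10 ** half
    a, b = divmod(x, p)                       # x = 10^half * A + B
    c, d = divmod(y, p)                       # y = 10^half * C + D
    ac = karatsuba(a, c)
    bd = karatsuba(b, d)
    mid = karatsuba(a + b, c + d) - ac - bd   # AD + BC, via one multiplication
    return ac * p * p + mid * p + bd
```

In Python this is of course far slower than the built-in `*`; it only makes the recursion behind the $T(2n) = 2T(n) + T(n+1) + \Theta(1)$ count visible.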
For this recurrence, $T(n)=T(\frac{n}{3})+T(\frac{2n}{3})+n$, you can use the Akra–Bazzi method, but a recursion tree can help equally well.
If $T(n)$ is the top node, then at the next level you have these nodes (whose sum gives $T(n)$):
$$T(\frac{n}{3})|T(\frac{2n}{3})|n$$
The next level has these nodes (the $n$ carried over from the previous level is listed at the end):
$$T(\frac{n}{9}),T(\frac{2n}{9}),\frac{n}{3}|T(\frac{2n}{9}),T(\frac{4n}{9}),\frac{2n}{3},n$$
So at this level you have
$$T(\tfrac{n}{9}),\;(2)\,T(\tfrac{2n}{9}),\;T(\tfrac{4n}{9}),\;2n$$
Notice that because $\frac{n}{3}+\frac{2n}{3}=n$, each level contributes a cost of $n$, so the accumulated cost after two levels is $2n$. The $(2)$ above signifies the number of copies of that node.
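To see the pattern concretely, here is a small Python sketch (my own illustration, not part of the original argument) that expands the tree level by level; the node arguments at every level sum to $n$:

```python
def levels(n: float, depth: int):
    """Yield the recursion-tree node sizes, level by level,
    for T(n) = T(n/3) + T(2n/3) + n."""
    level = [n]
    for _ in range(depth):
        yield level
        # each node of size m spawns children of sizes m/3 and 2m/3
        level = [part for m in level for part in (m / 3, 2 * m / 3)]
```

For example, the third level of `levels(81.0, 3)` is `[9.0, 18.0, 18.0, 36.0]`, i.e. $\frac{n}{9}$, $\frac{2n}{9}$ (twice), and $\frac{4n}{9}$, and each level sums to $81 = n$.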
Instead of doing this manually one level at a time, note that the structure of the nodes after $N$ steps is the same as the symbolic expansion of
$$(x\frac{1}{3}+y\frac{2}{3})^N=\sum_{k=0}^{N}\binom{N}{k}x^{N-k}\frac1{3^{N-k}}y^{k}\frac{2^k}{3^k}$$
Now simply replace each term $x^{N-k}y^{k}\frac{2^k}{3^N}$ with $T(\frac{2^k}{3^N}n)$ and you have the structure after $N$ steps; of course we also add the per-level costs, $Nn$:
$$\sum_{k=0}^{N} \binom{N}{k}T(\frac{2^k}{3^N}n)+Nn$$
The question of how many levels you have is easily answered by looking at the argument $\frac{2^k}{3^N}n$: the recursion stops once it reaches $1$, so it never goes below $1$. Hence $N \sim \log(n)$, because $\frac{2^k}{3^N}$ shrinks exponentially.
So in the end we can safely assume $T(n) \sim n\log(n)$. We do not know $T(1)$, and its value affects the result, since it is multiplied by the number of leaf nodes; so we write $T(n) \cong a\,n\log(n)$ and try to calculate the asymptotic value of $a$. Even though it is not strictly necessary, we just want to confirm the finding.
$$an\log(n)=\frac{1}{3}an\log(\frac{n}{3})+\frac{2}{3}an\log(\frac{2n}{3})+n$$
Dividing by $n$ and cancelling the $a\log(n)$ terms on both sides leaves
$$\frac{1}{3}a\log\left(\frac{1}{3}\right)+\frac{2}{3}a\log\left(\frac{2}{3}\right)+1=0$$
or
$$a=\frac{3}{3\log(3)-2\log(2)}$$
and there you have it:
$$T(n) \cong \frac{3}{3\log(3)-2\log(2)}\, n\log(n)$$
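If you want to sanity-check the constant numerically, here is a short Python sketch (the base case $T(n)=1$ for $n\le 2$ and the use of floors are my own arbitrary choices; the leading constant does not depend on them):

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n: int) -> int:
    """Exact values of T(n) = T(floor(n/3)) + T(n - floor(n/3)) + n,
    with T(n) = 1 for n <= 2 (arbitrary base case)."""
    if n <= 2:
        return 1
    return T(n // 3) + T(n - n // 3) + n

# the constant a = 3 / (3 log 3 - 2 log 2), approximately 1.571
a = 3 / (3 * math.log(3) - 2 * math.log(2))
n = 3 ** 10
ratio = T(n) / (n * math.log(n))   # approaches a as n grows
```

Already at $n = 3^{10}$ the ratio $T(n)/(n\log n)$ agrees with $a$ to within a few percent.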
The number of recursion levels should rather be $\lceil \log_b n \rceil$.
You can see this quite easily by considering the special case $n=b^m$: each round reduces the task size by a factor of $b$, so after exactly $m$ rounds you are down to size $1$. Obviously, for $1\leq n < b$ you also need at least one round. So the number of recursion levels here is $\lceil \log_b n \rceil$.
Similarly:
For $b^{m-1}+1 \leq n < b^m$ you also have $m = \lceil \log_b n \rceil$ recursions, because after $m-1$ recursions the remaining task size still satisfies $b \geq \lceil n/b^{m-1}\rceil > 1$, so one more round is needed.
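The count is easy to check mechanically. Here is a small Python sketch (my own illustration, assuming each round shrinks the task from size $n$ to size $\lceil n/b \rceil$) that counts the rounds until the task has size $1$:

```python
def rounds(n: int, b: int) -> int:
    """Number of recursion rounds until a task of size n,
    shrunk to ceil(n/b) per round, reaches size 1."""
    count = 0
    while n > 1:
        n = -(-n // b)   # ceiling division: the surviving subtask has size ceil(n/b)
        count += 1
    return count
```

Verifying `b**rounds(n, b) >= n > b**(rounds(n, b) - 1)` for all $n \ge 2$ up to a few thousand confirms that `rounds(n, b)` equals $\lceil \log_b n \rceil$.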