I'm not sure I completely get the question -- you are having trouble understanding Dijkstra's algorithm? The Wikipedia article is good, and there are many nice worked examples on the web. I would try to work through some of these examples, and then ask specific questions if you're still confused about how the algorithm works.
As for the homework problem you've quoted, here's a hint: any algorithm, Dijkstra or otherwise, for finding a shortest path will only work if such a path actually exists. Can you come up with an example of a connected weighted graph where no shortest path exists between some pair of nodes?
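If it helps to see the algorithm in motion, here is a minimal sketch of Dijkstra's algorithm in Python (the graph representation and names are my own choices, not from any particular source):

```python
import heapq

def dijkstra(graph, source):
    """Shortest-path distances from source, assuming non-negative
    edge weights; graph maps u -> list of (v, weight) pairs."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry; a shorter path was already found
        for v, w in graph[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

g = {
    "a": [("b", 1), ("c", 4)],
    "b": [("c", 2)],
    "c": [],
}
print(dijkstra(g, "a"))  # {'a': 0, 'b': 1, 'c': 3}
```

Note that the non-negativity assumption in the docstring is exactly what the hint above is poking at.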
For context: a result of Pósa says that a random graph with $C n \log n$ edges already contains a Hamiltonian path with probability tending to $1$, and here we have a uniformly random graph (with $\frac{n^2}{4}$ edges on average). So we can be fairly wasteful.
As far as an algorithm goes, we can take the algorithm used to prove Dirac's theorem. The hypotheses of that theorem don't hold here, but the algorithm will still work with high probability. The strategy is this:
- Pick a path greedily: start at a vertex $v_1$, pick a vertex $v_2$ it is adjacent to, then pick a vertex $v_3$ adjacent to $v_2$, and so on. Repeat until you get stuck.
- Turn the path into a cycle: if it has endpoints $v_1$ and $v_\ell$, find adjacent vertices $v_i$ and $v_{i+1}$ on the path such that $v_{i+1}$ is adjacent to $v_1$ and $v_i$ is adjacent to $v_\ell$. Then the cycle is $v_1, \dots, v_i, v_\ell, \dots, v_{i+1}, v_1$.
- Turn the cycle into a longer path: if $v_{\ell+1}$ is a vertex not on the cycle, find a vertex $v$ on the cycle adjacent to $v_{\ell+1}$, and take the path starting next to $v$, going around the cycle, and then going to $v_{\ell+1}$.
- Repeat steps 2 and 3 until the path contains all the vertices.
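The four steps above can be sketched as follows. This is a toy version, assuming an adjacency-set representation; it simply gives up (returns `None`) when a rotation step fails, rather than retrying with fresh edges as the layered argument later in this answer does.

```python
import random

def rotation_extension(adj, n, seed=0):
    """Sketch of the rotation-extension strategy described above.
    adj[u] is the set of neighbours of vertex u; vertices are 0..n-1.
    Returns a Hamiltonian path, or None if step 2 or 3 fails
    (which should be rare on a dense random graph)."""
    rng = random.Random(seed)
    # Step 1: grow a path greedily until stuck.
    path, on_path = [0], {0}
    while True:
        choices = [v for v in adj[path[-1]] if v not in on_path]
        if not choices:
            break
        path.append(rng.choice(choices))
        on_path.add(path[-1])
    # Steps 2 and 3, repeated until the path spans all vertices.
    while len(on_path) < n:
        v1, vl = path[0], path[-1]
        cycle = None
        # Step 2: find i with v_{i+1} adjacent to v_1 and v_i adjacent to v_l.
        for i in range(len(path) - 1):
            if path[i + 1] in adj[v1] and path[i] in adj[vl]:
                # Cycle v_1, ..., v_i, v_l, ..., v_{i+1} (and back to v_1).
                cycle = path[: i + 1] + path[i + 1 :][::-1]
                break
        if cycle is None:
            return None
        # Step 3: absorb a vertex w outside the cycle.
        w = next(u for u in range(n) if u not in on_path)
        for j, v in enumerate(cycle):
            if v in adj[w]:
                # Start just after v, go around the cycle, end at v, then w.
                path = cycle[j + 1 :] + cycle[: j + 1] + [w]
                on_path.add(w)
                break
        else:
            return None
    return path
```

On a complete graph step 1 already finishes the job; on a dense random graph the rotation steps do the rest with high probability, which is what the analysis below quantifies.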
All of these steps are quite likely to work individually. The problem is that to analyze how likely they are to work as a whole, we'd have to check edges of the graph multiple times, which doesn't preserve independence.
So instead, we will write our uniformly random graph $G$ as the union of independent graphs $G_0, G_1, \dots, G_{10 \log n}$, where $G_0$ contains each edge independently with probability $\frac14$, and each of $G_1, \dots, G_{10 \log n}$ contains each edge independently with probability $p$, chosen so that $\frac34(1-p)^{10 \log n} = \frac12$. Then the union contains each edge independently with probability $\frac12$, so it is distributed exactly like the uniformly random graph. Solving for $p$ gives $p = \Theta(\frac{1}{\log n})$, but all we'll really need is for $p$ to be asymptotically bigger than $\frac{1}{\sqrt n}$.
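As a quick numerical sanity check (with $\log$ the natural logarithm; the value $n = 10^6$ is just an illustration):

```python
import math

n = 10**6
k = round(10 * math.log(n))        # number of sparse layers G_1, ..., G_k
# Solve (3/4) * (1 - p)**k = 1/2 for p:
p = 1 - (2 / 3) ** (1 / k)
# Probability that a fixed edge appears in the union of G_0, G_1, ..., G_k:
prob_edge = 1 - (3 / 4) * (1 - p) ** k
print(p, prob_edge)                # p is about ln(3/2)/k, prob_edge is 1/2
print(p > 1 / math.sqrt(n))        # p is comfortably above 1/sqrt(n)
```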
Then I claim that:
- A greedy path chosen in $G_0$ will reach length $n - 5\log n$ with very high probability.
- Doing step 2 of the algorithm in any of the $G_1, \dots, G_{10 \log n}$ will work with very high probability.
- Doing step 3 of the algorithm in any of the $G_1, \dots, G_{10 \log n}$ will work with very high probability.
Then we can just look at the graphs $G_0, G_1, \dots$ sequentially as we go through the algorithm: at each stage we only reveal edges of a fresh $G_i$, which preserves independence, and any edge we find in $G_i$ is also an edge of the union $G$.
For the first claim: the probability that the greedy path gets stuck at a given step while there are still $5 \log n$ vertices left to pick from is $(\frac34)^{5\log n} = n^{-5\log \frac43}$, so by a union bound over the at most $n$ steps, the probability that it ever gets stuck early is at most $n^{1 - 5\log \frac43} \approx n^{-0.44}$.
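The identity used here is just $a^{\log n} = n^{\log a}$ (with $\log$ the natural logarithm); a quick check of the numbers:

```python
import math

n = 10**4
lhs = (3 / 4) ** (5 * math.log(n))  # all 5*log(n) candidates non-adjacent
rhs = n ** (-5 * math.log(4 / 3))   # the same quantity as a power of n
print(lhs, rhs)                     # agree up to floating point
print(1 - 5 * math.log(4 / 3))      # about -0.44: the union-bound exponent
```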
For the second claim: we have at least $\frac n2$ options for vertices $v_i$ and $v_{i+1}$, and each one works with probability $p^2$. The probability that none of them work is $(1-p^2)^{n/2} \le e^{-p^2n/2}$, which approaches $0$ as $n\to \infty$. (This is where we want $p \gg \frac{1}{\sqrt n}$, so that the exponent goes to $-\infty$ as $n\to\infty$. We actually want a bit better than that, because we want all $O(\log n)$ of these steps to work, so the probability should go to $0$ faster than $\frac{1}{\log n}$. But our $p$ is actually quite a bit larger than $\frac{1}{\sqrt n}$, so we have that wiggle room.)
For the third claim: we have at least $\frac n2$ options for the vertex $v$, and each works with probability $p$. The probability that none of them work is $(1-p)^{n/2} \le e^{-pn/2}$, which approaches $0$ as $n\to \infty$.
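To see that there really is enough room for all $O(\log n)$ rounds, one can plug in concrete numbers (here $n = 10^8$; $p$ is computed exactly as in the decomposition above):

```python
import math

n = 10**8
k = round(10 * math.log(n))            # number of rounds / sparse layers
p = 1 - (2 / 3) ** (1 / k)             # per-layer edge probability
fail_step2 = math.exp(-p * p * n / 2)  # bound for one step-2 failure
fail_step3 = math.exp(-p * n / 2)      # bound for one step-3 failure
total = k * (fail_step2 + fail_step3)  # union bound over all k rounds
print(total)                           # vanishingly small
```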
Best Answer
I'm fairly sure this proof works, though I haven't formalized it, so take it with a grain of salt. It uses the same argument as the lower bound for comparison-based sorting (if you haven't seen that argument, I doubt this is the solution your professor/textbook expects).
First show that for each permutation of the $n$ vertices you can construct a tournament whose unique Hamiltonian path is that permutation. Then think of any algorithm as a decision tree: every time it looks at a cell of the adjacency matrix, the answer is either $\text{true}$ or $\text{false}$, and it takes a different branch in the decision tree accordingly. So you start with a set $S$ of $n!$ possible graphs and, in the worst case, the algorithm always gets an unfortunate answer back (i.e. it sees $T[i,j] = \text{false}$, so it has to take the larger "half" of $S$). Ultimately the binary tree has $n!$ leaf nodes, so some root-to-leaf path has length at least $\log_2(n!)$. The rest falls out... (I hope)