Typical implementations of quicksort don't append the pivot to one of the lists. Instead, they are implemented (in Haskell notation) as
qsort [] = []
qsort (pivot:rest) = qsort left ++ [pivot] ++ qsort right
  where left  = filter (< pivot) rest
        right = filter (>= pivot) rest
which keeps the symmetry between the left and right lists (if the list elements are distinct) and results in the same number of comparisons for an initial list that is sorted in increasing or in decreasing order.
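With this pivot-first scheme the comparison count is easy to make explicit: every element of the tail is compared against the pivot exactly once per partition. A minimal sketch (the `comparisons` helper is mine, not part of the original answer):

```haskell
import Data.List (partition)

-- Count the comparisons made by the qsort above: partitioning a
-- list with pivot x and tail xs costs (length xs) comparisons,
-- plus whatever the two recursive calls cost.
comparisons :: Ord a => [a] -> Int
comparisons []     = 0
comparisons (x:xs) = length xs + comparisons left + comparisons right
  where (left, right) = partition (< x) xs
```

For example, `comparisons [1,2,3,4]` and `comparisons [4,3,2,1]` both give 6, matching the symmetry claim above.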
Note that the worst case isn't reached only by lists sorted in increasing or decreasing order. For example, the following lists all require six comparisons to sort:
[1,2,3,4]
[1,2,4,3]
[1,4,2,3]
[1,4,3,2]
[4,3,2,1]
[4,3,1,2]
[4,1,2,3]
[4,1,3,2]
In general, any ordering which results in the least or greatest element being selected as the pivot at every stage of the recursion will give you worst-case performance.
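That claim can be checked by brute force: enumerating all permutations of [1,2,3,4] and keeping those that need six comparisons recovers exactly the eight lists above. A sketch, assuming the head-as-pivot scheme from the first answer (the helper names are mine):

```haskell
import Data.List (partition, permutations, sort)

-- Comparison count of head-as-pivot quicksort.
comparisons :: Ord a => [a] -> Int
comparisons []     = 0
comparisons (x:xs) = length xs + comparisons left + comparisons right
  where (left, right) = partition (< x) xs

-- All orderings of [1..4] that hit the worst case of 6 comparisons.
worstCases :: [[Int]]
worstCases = sort [ p | p <- permutations [1..4], comparisons p == 6 ]
```

`worstCases` yields exactly the eight permutations listed above: those whose pivot is the least or greatest remaining element at every stage.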
Consider the set $Z_{ij} = \{z_i, z_{i+1}, \dots, z_{j-1}, z_j\}$, where the elements are indexed by value (not by their position in the array), i.e. $z_i < z_j$ whenever $i < j$. As long as none of these elements is chosen as the pivot, all of them are passed to the same recursive call: if the pivot $x$ satisfies $x > z_j$ or $x < z_i$, the whole set $Z_{ij}$ ends up on the same side of the partition.
Now consider the first time one of $z_i, \dots, z_j$ is chosen as the pivot:
a) if $z_i$ or $z_j$ is chosen first, then $z_i$ and $z_j$ get compared;
b) if one of $z_{i+1}, \dots, z_{j-1}$ is chosen first, then $z_i$ and $z_j$ are never compared, since they are sent to different sides of the partition.
Letting $X$ denote that first pivot, each of the $j-i+1$ elements is equally likely to be chosen, so $\Pr\{X = z_i\} = \Pr\{X = z_j\} = 1/(j-i+1)$ and
$$\Pr\{z_i \text{ compared with } z_j\} = \Pr\{X = z_i\} + \Pr\{X = z_j\} = \frac{2}{j-i+1}.$$
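Summing this probability over all pairs gives the expected total number of comparisons, which is the last step this argument needs (a standard computation, sketched here with the substitution $k = j - i$):

$$
E[\text{comparisons}]
= \sum_{i=1}^{n-1} \sum_{j=i+1}^{n} \frac{2}{j-i+1}
= \sum_{i=1}^{n-1} \sum_{k=1}^{n-i} \frac{2}{k+1}
< \sum_{i=1}^{n-1} \sum_{k=1}^{n} \frac{2}{k}
= O(n \log n).
$$

So although the worst case is $O(n^2)$, a random pivot gives $O(n \log n)$ comparisons in expectation.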
I hope it helps.
Best Answer
The $ck$ term is the additional cost you incur by adding another element to sort. For example, as soon as you have "sorted" one object, you just need to add the cost for the remaining $k-1$, hence $T(k) = T(k-1) + ck$, where $c$ is some constant. The $k$ on the right-hand side comes from sorting the $k$-th additional input, which has cost proportional to $k$ (since in the worst case it must be compared against the whole sequence).
Unrolling the recurrence down to $T(0)$, you arrive at the sum $\sum_{i=1}^N ci$. By the formula for arithmetic series, $$\sum_{i=1}^N ci = \frac{cN(N+1)}{2} = \frac{cN^2}{2} + \frac{cN}{2},$$ hence we get complexity $O(N^2)$.
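As a sanity check, the recurrence can be unrolled numerically and compared against the closed form; a small sketch, under the illustrative assumptions $c = 1$ and $T(0) = 0$:

```haskell
-- Worst-case recurrence T(k) = T(k-1) + c*k,
-- with c = 1 and T(0) = 0 (assumptions for illustration).
t :: Int -> Int
t 0 = 0
t k = t (k - 1) + k

-- Closed form from the arithmetic series: c*N*(N+1)/2, with c = 1.
closedForm :: Int -> Int
closedForm n = n * (n + 1) `div` 2
```

Here `t n == closedForm n` for every `n`, confirming the quadratic growth of the worst case.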