
Friday, 10 February 2012

Sorting algorithms


Introduction

I grew up with the bubble sort, in common, I am sure, with many colleagues. Having learnt one sorting algorithm, I saw little point in learning any others; it hardly seemed an exciting area of study. Mundane sorting may be, but it is also central to many tasks carried out on computers. Prompted by its inclusion in the AEB 'A' level syllabus, I looked at the process again in detail. The efficiency with which sorting is carried out will often have a significant impact on the overall efficiency of a program. Consequently there has been much research, and it is interesting to see the range of alternative algorithms that have been developed.
It is not always possible to say that one algorithm is better than another, as relative performance can vary depending on the type of data being sorted. In some situations, most of the data are in the correct order, with only a few items needing to be sorted. In other situations the data are completely mixed up in a random order and in others the data will tend to be in reverse order. Different algorithms will perform differently according to the data being sorted. Four common algorithms are the exchange or bubble sort, the selection sort, the insertion sort and the quick sort.
The selection sort is a good one to use with students. It is intuitive and very simple to program. It offers quite good performance, its particular strength being the small number of exchanges needed. For a given number of data items, the selection sort always goes through a set number of comparisons and exchanges, so its performance is predictable.
procedure SelectionSort (var d: DataArrayType; n: integer); {n is the number of elements}
var k, j, small: integer;
begin
  for k := 1 to n-1 do
  begin
    small := k;
    for j := k+1 to n do
      if d[ j ] < d[small] then small := j;
    {Swap elements k and small}
    Swap(d, k, small)
  end
end;
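For readers who prefer a runnable version, here is the same algorithm in Python (a sketch; the Pascal-style DataArrayType and Swap become a plain list and a tuple swap):

```python
def selection_sort(d):
    """Sort list d in place by repeatedly selecting the smallest remaining element."""
    n = len(d)
    for k in range(n - 1):
        small = k
        # find the index of the smallest element in d[k:]
        for j in range(k + 1, n):
            if d[j] < d[small]:
                small = j
        # swap elements k and small (at most one exchange per pass)
        d[k], d[small] = d[small], d[k]
    return d

data = [29, 10, 14, 37, 13]
selection_sort(data)
print(data)  # → [10, 13, 14, 29, 37]
```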

The following are different sorting algorithms:
  1. Exchange (Bubble) Sort
  2. Insertion Sort
  3. Selection Sort
  4. Shell Sort
  5. Merge Sort
  6. Heap Sort
  7. Quick Sort
  8. Quick3 Sort

Comparing the Algorithms

There are two important factors when measuring the performance of a sorting algorithm. The algorithms have to compare the magnitude of different elements and they have to move the different elements around. So counting the number of comparisons and the number of exchanges or moves made by an algorithm offer useful performance measures. When sorting large record structures, the number of exchanges made may be the principal performance criterion, since exchanging two records will involve a lot of work. When sorting a simple array of integers, then the number of comparisons will be more important.
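One simple way to take such measurements, without altering the algorithms themselves, is to wrap each element in a small class that tallies every comparison. A Python sketch (the Counted helper and comparisons_used are illustrative names of my own, not from the article):

```python
class Counted:
    """Wraps a value and tallies every '<' comparison made on it."""
    comparisons = 0

    def __init__(self, value):
        self.value = value

    def __lt__(self, other):
        Counted.comparisons += 1
        return self.value < other.value


def comparisons_used(sort_fn, values):
    """Count the comparisons sort_fn makes while sorting these values."""
    Counted.comparisons = 0
    sort_fn([Counted(v) for v in values])
    return Counted.comparisons


def selection_sort(d):
    for k in range(len(d) - 1):
        small = k
        for j in range(k + 1, len(d)):
            if d[j] < d[small]:
                small = j
        d[k], d[small] = d[small], d[k]
    return d

# Selection sort always makes n(n-1)/2 comparisons regardless of the data:
# 50*49/2 = 1225, matching the 50-element entry in the table below.
print(comparisons_used(selection_sort, list(range(50))))  # → 1225
```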
It has been said that the only thing going for the bubble (exchange) sort is its catchy name. The logic of the algorithm is simple to understand and it is fairly easy to program. It can also be programmed to detect when it has finished sorting. The selection sort, by comparison, always goes through the same amount of work regardless of the data and the quick sort performs particularly badly with ordered data. However, in general the bubble sort is a very inefficient algorithm.
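The "detect when it has finished sorting" behaviour comes from a flag recording whether a pass made any exchange; a minimal Python sketch of an ascending bubble sort with this check:

```python
def bubble_sort(d):
    """Bubble sort that stops as soon as a full pass makes no exchange."""
    n = len(d)
    for end in range(n - 1, 0, -1):
        swapped = False
        for j in range(end):
            if d[j] > d[j + 1]:
                d[j], d[j + 1] = d[j + 1], d[j]
                swapped = True
        if not swapped:  # no exchange made: the data are already sorted
            break
    return d

print(bubble_sort([5, 1, 4, 2, 8]))  # → [1, 2, 4, 5, 8]
```

On already-sorted data the first pass makes no exchange and the sort finishes after a single scan.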
The insertion sort is a little better and whilst it cannot detect that it has finished sorting, the logic of the algorithm means that it comes to a rapid conclusion when dealing with sorted data.
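That rapid conclusion on sorted data comes from the inner loop, which never runs when each new element is already in place; a minimal Python sketch:

```python
def insertion_sort(d):
    """Insert each element into the already-sorted prefix d[:i]."""
    for i in range(1, len(d)):
        key = d[i]
        j = i - 1
        # On already-sorted data this loop body never runs,
        # so the whole sort drops to O(n).
        while j >= 0 and d[j] > key:
            d[j + 1] = d[j]
            j -= 1
        d[j + 1] = key
    return d

print(insertion_sort([31, 41, 59, 26, 53]))  # → [26, 31, 41, 53, 59]
```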
The selection sort, as noted earlier, goes through a fixed number of comparisons and exchanges for a given number of items, so its performance is predictable regardless of the initial order; its particular strength is the small number of exchanges needed.
The first three algorithms all offer O(n²) performance; that is, sorting times increase with the square of the number of elements being sorted. That means that if you double the number of elements being sorted, there will be a four-fold increase in the time taken. Ten times more elements increases the time taken by a factor of 100! This is not a problem with small data sets, but with hundreds or thousands of elements it becomes very significant. With most large data sets, the quick sort is a vastly superior algorithm (although, as you might expect, it is much more complex), as the table below shows.

Random Data Set: Number of comparisons made

Sort \ Elements      50     100     200     300     400     500
Selection Sort     1225    4950   19900   44850   79800  124750
Exchange Sort      1410    5335   20300   45650   79866  126585
Insertion Sort     1391    5399   20473   44449   78779  123715
Quick Sort          399     990    1954    3384    5066    6256
It should be pointed out that the methods above all belong to one family, they are all internal sorting algorithms. This means that they can only be used when the entire data structure to be sorted can be held in the computer's main memory. There will be situations where this is not possible, for example when sorting a very large transaction file which is stored on, say, magnetic tape or disc. Then an external sorting algorithm will be needed.

Quick3 Sort


Discussion

The 3-way partition variation of quick sort has slightly higher overhead compared to the standard 2-way partition version. Both have the same best, typical, and worst case time bounds, but this version is highly adaptive in the very common case of sorting with few unique keys.
When stability is not required, 3-way partition quick sort is the general purpose sorting algorithm of choice.
The 3-way partitioning code shown below is written for clarity rather than optimal performance; it exhibits poor locality, and performs more swaps than necessary. A more efficient but more elaborate 3-way partitioning method is given in Quicksort is Optimal by Robert Sedgewick and Jon Bentley.

Algorithm

# choose pivot
swap a[n,rand(1,n)]

# 3-way partition
i = 1, k = 1, p = n
while i < p,
  if a[i] < a[n], swap a[i++,k++]
  else if a[i] == a[n], swap a[i,--p]
  else i++
end
→ invariant: a[p..n] all equal
→ invariant: a[1..k-1] < a[p..n] < a[k..p-1]

# move pivots to center
m = min(p-k,n-p+1)
swap a[k..k+m-1,n-m+1..n]

# recursive sorts
sort a[1..k-1]
sort a[n-p+k+1..n]
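As a cross-check, here is the same idea in Python, using the classic Dijkstra-style 3-way partition rather than a line-for-line translation of the pseudocode above (a clarity-first sketch; like the pseudocode, it performs more swaps than the Sedgewick-Bentley scheme):

```python
import random

def quick3(a, lo=0, hi=None):
    """3-way quicksort: partition into < pivot, == pivot, > pivot regions."""
    if hi is None:
        hi = len(a) - 1
    if lo >= hi:
        return a
    # a random pivot guards against the sorted-input worst case
    p = random.randint(lo, hi)
    a[lo], a[p] = a[p], a[lo]
    pivot = a[lo]
    lt, i, gt = lo, lo + 1, hi
    while i <= gt:
        if a[i] < pivot:
            a[lt], a[i] = a[i], a[lt]
            lt += 1
            i += 1
        elif a[i] > pivot:
            a[i], a[gt] = a[gt], a[i]
            gt -= 1
        else:
            i += 1
    # a[lt..gt] now holds every copy of the pivot; neither side recurses on it,
    # which is why the sort is O(n) when there are only O(1) unique keys
    quick3(a, lo, lt - 1)
    quick3(a, gt + 1, hi)
    return a

print(quick3([3, 1, 3, 2, 3, 0]))  # → [0, 1, 2, 3, 3, 3]
```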

Properties

  • Not stable
  • O(lg(n)) extra space
  • O(n²) time in the worst case, but typically O(n·lg(n)) time
  • Adaptive: O(n) time when O(1) unique keys

References

Programming Pearls by Jon Bentley. Addison Wesley, 1986.
Quicksort is Optimal by Robert Sedgewick and Jon Bentley, Knuthfest, Stanford University, January, 2002.
Bubble-sort with Hungarian ("Csángó") folk dance YouTube video, created at Sapientia University, Tirgu Mures (Marosvásárhely), Romania.
Select-sort with Gypsy folk dance YouTube video, created at Sapientia University, Tirgu Mures (Marosvásárhely), Romania.
The Beauty of Sorting YouTube video, Dynamic Graphics Project, Computer Systems Research Group, University of Toronto.