Sorting algorithm
From Wikipedia, the free encyclopedia
A sorting algorithm is an algorithm that puts elements of a list in a certain order. The most-used orders are numerical order and lexicographical order. Efficient sorting is important for optimizing the use of other algorithms (such as search and merge algorithms) which require input data to be in sorted lists; it is also often useful for canonicalizing data and for producing human-readable output. More formally, the output must satisfy two conditions:

1. The output is in nondecreasing order (each element is no smaller than the previous element according to the desired total order);
2. The output is a permutation (reordering) of the input.
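As a minimal illustration, the following Python sketch (the function name is our own, not part of any standard library) checks both conditions for a candidate output:

```python
from collections import Counter

def is_sorted_permutation(output, original):
    """Check the two conditions a sorting algorithm's output must satisfy."""
    # Condition 1: nondecreasing order (each element no smaller than the previous).
    in_order = all(output[i] <= output[i + 1] for i in range(len(output) - 1))
    # Condition 2: the output is a permutation (reordering) of the input,
    # i.e. the same multiset of elements.
    is_permutation = Counter(output) == Counter(original)
    return in_order and is_permutation

print(is_sorted_permutation([1, 2, 2, 3], [3, 2, 1, 2]))  # True
print(is_sorted_permutation([1, 2, 3], [3, 3, 1]))        # False: not a permutation
```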
Further, the data is often taken to be in an array, which allows random access, rather than a list, which only allows sequential access, though often algorithms can be applied with suitable modification to either type of data.
Since the dawn of computing, the sorting problem has attracted a great deal of research, perhaps due to the complexity of solving it efficiently despite its simple, familiar statement. For example, bubble sort was analyzed as early as 1956.[1] A fundamental limit of comparison sorting algorithms is that they require linearithmic time, O(n log n), in the worst case, though better performance is possible on real-world data (such as almost-sorted data), and algorithms not based on comparison, such as counting sort, can have better performance. Although many consider sorting a solved problem (asymptotically optimal algorithms have been known since the mid-20th century), useful new algorithms are still being invented, with the now widely used Timsort dating to 2002, and the library sort being first published in 2006.
Sorting algorithms are prevalent in introductory computer science classes, where the abundance of algorithms for the problem provides a gentle introduction to a variety of core algorithm concepts, such as big O notation, divide and conquer algorithms, data structures such as heaps and binary trees, randomized algorithms, best, worst and average case analysis, time-space tradeoffs, and upper and lower bounds.
Contents
- 1 Classification
  - 1.1 Stability
- 2 Comparison of algorithms
- 3 Popular sorting algorithms
  - 3.1 Simple sorts
    - 3.1.1 Insertion sort
    - 3.1.2 Selection sort
  - 3.2 Efficient sorts
    - 3.2.1 Merge sort
    - 3.2.2 Heapsort
    - 3.2.3 Quicksort
  - 3.3 Bubble sort and variants
    - 3.3.1 Bubble sort
    - 3.3.2 Shell sort
    - 3.3.3 Comb sort
  - 3.4 Distribution sort
    - 3.4.1 Counting sort
    - 3.4.2 Bucket sort
    - 3.4.3 Radix sort
- 4 Memory usage patterns and index sorting
- 5 Inefficient sorts
- 6 Related algorithms
- 7 See also
- 8 References
- 9 Further reading
- 10 External links
Classification
Sorting algorithms are often classified by:
- Computational complexity (worst, average and best behavior) of element comparisons in terms of the size of the list (n). For typical serial sorting algorithms, good behavior is O(n log n), with parallel sort in O(log² n), and bad behavior is O(n²). (See Big O notation.) Ideal behavior for a serial sort is O(n), but this is not possible in the average case. Optimal parallel sorting is O(log n). Comparison-based sorting algorithms, which evaluate the elements of the list via an abstract key comparison operation, need at least O(n log n) comparisons for most inputs.
- Computational complexity of swaps (for "in place" algorithms).
- Memory usage (and use of other computer resources). In particular, some sorting algorithms are "in place". Strictly, an in-place sort needs only O(1) memory beyond the items being sorted; sometimes O(log n) additional memory is considered "in place".
- Recursion: some algorithms are either recursive or non-recursive, while others may be both (e.g., merge sort).
- Stability: stable sorting algorithms maintain the relative order of records with equal keys (i.e., values).
- Whether or not they are a comparison sort. A comparison sort examines the data only by comparing two elements with a comparison operator (see the sketch after this list).
- General method: insertion, exchange, selection, merging, etc. Exchange sorts include bubble sort and quicksort. Selection sorts include shaker sort and heapsort. Also whether the algorithm is serial or parallel. The remainder of this discussion almost exclusively concentrates on serial algorithms and assumes serial operation.
- Adaptability: whether or not the presortedness of the input affects the running time. Algorithms that take this into account are known to be adaptive.
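To make the "comparison sort" classification concrete, here is a small Python sketch of insertion sort that touches the data only through a single comparison operation; the `less` parameter is our own illustrative convention, not a standard interface:

```python
def insertion_sort(items, less=lambda a, b: a < b):
    """Comparison sort: examines the data only via the `less` operator.

    Stable and in place on the copy it builds; O(n^2) worst case, O(n) on
    already-sorted input, which also makes it adaptive.
    """
    items = list(items)  # sort a copy so the caller's list is untouched
    for i in range(1, len(items)):
        current = items[i]
        j = i - 1
        # Shift larger elements one slot right until current's position is found.
        while j >= 0 and less(current, items[j]):
            items[j + 1] = items[j]
            j -= 1
        items[j + 1] = current
    return items

print(insertion_sort([5, 2, 4, 1, 3]))  # [1, 2, 3, 4, 5]
# Any total order works; here, compare strings by length instead of alphabetically.
print(insertion_sort(["pear", "fig", "plum"], less=lambda a, b: len(a) < len(b)))
```

Because the algorithm never inspects the elements except through `less`, it works for any data type with a total order, which is exactly what defines a comparison sort.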
Stability
When sorting some kinds of data, only part of the data is examined when determining the sort order. For example, in the card sorting example to the right, the cards are being sorted by their rank, and their suit is being ignored. This allows the possibility of multiple different correctly sorted versions of the original list. Stable sorting algorithms choose one of these, according to the following rule: if two items compare as equal, like the two 5 cards, then their relative order will be preserved, so that if one came before the other in the input, it will also come before the other in the output.
More formally, the data being sorted can be represented as a record or tuple of values, and the part of the data that is used for sorting is called the key. In the card example, cards are represented as a record (rank, suit), and the key is the rank. A sorting algorithm is stable if whenever there are two records R and S with the same key, and R appears before S in the original list, then R will always appear before S in the sorted list.

When equal elements are indistinguishable, such as with integers, or more generally, any data where the entire element is the key, stability is not an issue. Stability is also not an issue if all keys are different.

Unstable sorting algorithms can be specially implemented to be stable. One way of doing this is to artificially extend the key comparison, so that comparisons between two objects with otherwise equal keys are decided using the order of the entries in the original input list as a tie-breaker. Remembering this order, however, may require additional time and space.

[Figure: An example of stable sorting on playing cards. When the cards are sorted by rank with a stable sort, the two 5s must remain in the same order in the sorted output that they were originally in. When they are sorted with a non-stable sort, the 5s may end up in the opposite order in the sorted output.]
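As a sketch of this key-extension technique (the record layout and names below are our own illustration), each record is decorated with its original index, so ties are broken by input order regardless of whether the underlying sort is stable:

```python
# Records are (rank, suit) pairs; we want to sort by rank only.
cards = [(5, "hearts"), (2, "clubs"), (5, "spades"), (3, "diamonds")]

# Decorate each record with its original position, sort by the extended key
# (rank, original position), then strip the decoration. Even an unstable
# underlying sort must now preserve the input order of equal ranks, because
# the extended keys are all distinct.
decorated = [(rank, i, suit) for i, (rank, suit) in enumerate(cards)]
decorated.sort(key=lambda t: (t[0], t[1]))
stable_result = [(rank, suit) for rank, _, suit in decorated]

print(stable_result)
# [(2, 'clubs'), (3, 'diamonds'), (5, 'hearts'), (5, 'spades')]
# The 5 of hearts still precedes the 5 of spades, as in the input.
```

The extra index column is the "additional time and space" the text mentions: one integer per record, plus the cost of building and stripping the decorated list.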
One application for stable sorting algorithms is sorting a list using a primary and secondary key. For example, suppose we wish to sort a hand of cards such that the suits are in the order clubs (♣), diamonds (♦), hearts (♥), spades (♠), and within each suit, the cards are sorted by rank. This can be done by first sorting the cards by rank (using any sort), and then doing a stable sort by suit.
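A small Python sketch of this two-pass idea (Python's built-in `sorted` is stable, so it can stand in for the stable sort; the `suit_order` mapping is our own illustrative convention):

```python
cards = [(7, "spades"), (5, "hearts"), (7, "clubs"), (2, "hearts")]
suit_order = {"clubs": 0, "diamonds": 1, "hearts": 2, "spades": 3}

# Pass 1: sort by the secondary key (rank) -- any sorting algorithm will do.
by_rank = sorted(cards, key=lambda card: card[0])

# Pass 2: stable sort by the primary key (suit). Because the sort is stable,
# cards within each suit keep the rank order established by pass 1.
by_suit_then_rank = sorted(by_rank, key=lambda card: suit_order[card[1]])

print(by_suit_then_rank)
# [(7, 'clubs'), (2, 'hearts'), (5, 'hearts'), (7, 'spades')]
```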
Within each suit, the stable sort preserves the ordering by rank that was already done. This idea can be extended to any number of keys, and is leveraged by radix sort. The same effect can be achieved with an unstable sort by using a lexicographic key comparison, which e.g. compares first by suits, and then compares by rank if the suits are the same.

Comparison of algorithms
In this table, n is the number of records to be sorted. The columns "Average" and "Worst" give the time complexity in each case, under the assumption that the length of each key is constant, and that therefore all comparisons, swaps, and other needed operations can proceed in constant time. "Memory" denotes the amount of auxiliary storage needed beyond that used by the list itself, under the same assumption. The run times and the memory requirements listed below should be understood to be inside big O notation. Logarithms are of any base; the notation log² n means (log n)².
These are all comparison sorts, and so cannot perform better than O(n log n) in the average or worst case.

Comparison sorts

| Name | Best | Average | Worst | Memory | Stable | Method | Other notes |
|---|---|---|---|---|---|---|---|
| Quicksort | n log n | n log n | n² | log n on average, worst case n; the Sedgewick variation is log n worst case | Typical in-place sort is not stable; stable versions exist | Partitioning | Quicksort is usually done in place with O(log n) stack space. Most implementations are unstable, as stable in-place partitioning is more complex. Naive variants use an O(n) space array to store the partition. A quicksort variant using three-way partitioning takes O(n) comparisons when sorting an array of equal keys. |
| Merge sort | n log n | n log n | n log n | n | Yes | Merging | Highly parallelizable (up to O(log n) using the Three Hungarians' Algorithm or, more practically, Cole's parallel merge sort) for processing large amounts of data. |
| In-place merge sort | — | — | n log² n | 1 | Yes | Merging | Can be implemented as a stable sort based on stable in-place merging. |
| Heapsort | n log n | n log n | n log n | 1 | No | Selection | |
| Insertion sort | n | n² | n² | 1 | Yes | Insertion | O(n + d), in the worst case over sequences that have d inversions. |
| Introsort | n log n | n log n | n log n | log n | No | Partitioning & Selection | Used in several STL implementations. |
| Selection sort | n² | n² | n² | 1 | No | Selection | Stable with O(n) extra space, for example using lists. |
| Timsort | n | n log n | n log n | n | Yes | Insertion & Merging | Makes n comparisons when the data is already sorted or reverse sorted. |
| Cubesort | n | n log n | n log n | n | Yes | Insertion | Makes n comparisons when the data is already sorted or reverse sorted. |
| Shell sort | n | n log² n | Depends on gap sequence; best known is n log² n | 1 | No | Insertion | Small code size, no use of call stack, reasonably fast; useful where memory is at a premium, such as embedded and older mainframe applications. |
| Bubble sort | n | n² | n² | 1 | Yes | Exchanging | Tiny code size. |
| Binary tree sort | n | n log n | n log n (balanced) | n | Yes | Insertion | When using a self-balancing binary search tree. |
| Cycle sort | — | n² | n² | 1 | No | Insertion | In-place with a theoretically optimal number of writes. |
| Library sort | — | n log n | n² | n | Yes | Insertion | |
| Patience sorting | — | — | n log n | n | No | Insertion & Selection | Finds all the longest increasing subsequences in O(n log n). |
| Smoothsort | n | n log n | n log n | 1 | No | Selection | An adaptive sort: n−1 comparisons when the data is already sorted, and 0 swaps. |
| Strand sort | n | n² | n² | n | Yes | Selection | |
| Tournament sort | — | n log n | n log n | n | No | Selection | Variation of heapsort. |
| Cocktail sort | n | n² | n² | 1 | Yes | Exchanging | |
| Comb sort | n | n² | n² | 1 | No | Exchanging | Small code size. |
| Gnome sort | n | n² | n² | 1 | Yes | Exchanging | Tiny code size. |
| UnShuffle sort | kN | kN | kN | In place for linked lists; N × sizeof(link) for arrays | No | Distribution & Merge | No exchanges are performed. The parameter k is proportional to the entropy in the input; k = 1 for ordered or reverse-ordered input. |
| Franceschini's method | — | n log n | n log n | 1 | Yes | ? | |
| Block sort | n | n log n | n log n | 1 | Yes | Insertion & Merging | Combines a block-based O(n) in-place merge algorithm with a bottom-up merge sort. |
| Odd-even sort | n | n² | n² | 1 | Yes | Exchanging | Can be run on parallel processors easily. |
The following table describes integer sorting algorithms and other sorting algorithms that are not comparison sorts. As such, they are not limited by an Ω(n log n) lower bound. Complexities below assume n items to be sorted, with keys of size k, digit size d, and r the range of numbers to be sorted. Many of them are based on the assumption that the key size is large enough that all entries have unique key values, and hence that n ≪ 2^k.
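As a concrete non-comparison example, here is a minimal counting sort sketch in Python; it assumes the keys are non-negative integers and that the range bound `r` is known in advance:

```python
def counting_sort(keys, r):
    """Sort non-negative integer keys in O(n + r) time without comparisons.

    `r` is the (exclusive) upper bound of the key range. The algorithm counts
    occurrences of each key and rebuilds the list from the counts, so it is
    not subject to the Omega(n log n) comparison lower bound.
    """
    counts = [0] * r
    for key in keys:
        counts[key] += 1
    result = []
    for value in range(r):
        # Emit each key as many times as it occurred in the input.
        result.extend([value] * counts[value])
    return result

print(counting_sort([4, 1, 3, 1, 0], r=5))  # [0, 1, 1, 3, 4]
```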