In-situ Merge 

Given an array of N elements, A[0..N)^{*}, such that segment A[0..N1) is sorted and segment A[N1..N) is sorted, where 0 ≤ N1 ≤ N, rearrange A[0..N) so that it is sorted. Merging lies at the heart of the mergesort algorithm (where initially A[p..q) and A[q..r) are sorted, 0 ≤ p ≤ q ≤ r ≤ N, and A[p..r) must be made sorted).
The obvious ways to do the merge use extra workspace. For example, copy A[0..N1) into workspace W[0..N1), and set pointers p1 to zero, p2 to N1 and p to zero. Then repeatedly copy the smaller of W[p1] and A[p2] to A[p], advancing p and either p1 or p2 as appropriate; when one of the segments is exhausted, copy over the remainder of the other one. The obvious methods take linear time but use up to O(N) extra workspace. It turns out that it is possible to do the merge in situ, that is, using only a constant amount of extra workspace, and still in linear time. Such algorithms involve considering A[0..N1) and A[N1..N) to be made up of consecutive blocks of size √N. The algorithms are easiest to understand when N1 and (N−N1) are multiples of √N but they can be made to work when this is not the case. There are even stable versions of the algorithms although the degree of difficulty is much higher. Some basic operations, such as swapping two equal-length segments (swapSegs), are used by the various merge algorithms.
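The obvious workspace-based merge just described can be sketched as follows (a sketch in Python; the function name is illustrative):

```python
def merge_with_workspace(A, N1):
    """Merge sorted A[0..N1) and A[N1..N) using O(N1) extra workspace."""
    W = A[:N1]                      # copy seg_1 into workspace W[0..N1)
    p1, p2, p = 0, N1, 0
    while p1 < len(W) and p2 < len(A):
        if W[p1] <= A[p2]:          # <= keeps the merge stable
            A[p] = W[p1]; p1 += 1
        else:
            A[p] = A[p2]; p2 += 1
        p += 1
    while p1 < len(W):              # seg_2 exhausted: copy the remainder of W
        A[p] = W[p1]; p1 += 1; p += 1
    # if W is exhausted first, the rest of seg_2 is already in place
    return A
```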
Huang and Langston's unstable algorithm (1988): For simplicity, and for now, assume that N is a perfect square. The √N largest elements in A[] must be at the end of seg_{1} and/or seg_{2}.
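Because both segments are sorted, the k largest elements can be located by walking two tail pointers inwards; a sketch (split_of_k_largest is a hypothetical helper, with k = √N in mind):

```python
def split_of_k_largest(A, n1, k):
    """Given sorted A[0..n1) and A[n1..N), return (k1, k2), k1+k2 == k,
    such that the k largest elements of A are the last k1 elements of
    segment 1 together with the last k2 elements of segment 2."""
    i, j = n1, len(A)               # exclusive tail pointers into each segment
    for _ in range(k):
        # take the larger of the two current tail elements
        if j > n1 and (i == 0 or A[i - 1] <= A[j - 1]):
            j -= 1
        else:
            i -= 1
    return n1 - i, len(A) - j
```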
Using swapSegs operations, move the √N largest to the front of the array (in any order).
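The swapSegs operation itself is elementary: exchange two equal-length, non-overlapping segments element by element (a sketch; the name follows the text):

```python
def swap_segs(A, i, j, length):
    """Swap the equal-length segments A[i..i+length) and A[j..j+length)
    in place, using O(1) extra space."""
    for k in range(length):
        A[i + k], A[j + k] = A[j + k], A[i + k]
```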
Assume, for now, that the two remaining sorted segments each consist of whole blocks of size √N.
Consider the blocks after block_{1}. Using (a variation on) selection sort, rearrange these blocks so that their last elements are monotonic. This takes O((√N)^{2}) comparisons of last elements and O(√N) moves of blocks of size O(√N), so it can be done in O(N) time. Next find series_{1} and series_{2}: series_{1} starts with block_{2}; series_{2} is the first block (only) whose first element is smaller than the last element of the previous block.
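The block selection sort can be sketched like so (sort_blocks_by_last is an illustrative name; whole blocks are swapped so that their last elements become monotonic):

```python
def sort_blocks_by_last(A, start, b):
    """Selection-sort the size-b blocks of A[start..len(A)) so that their
    last elements are non-decreasing: O((N/b)^2) comparisons of last
    elements and O(N/b) swaps of whole blocks."""
    n_blocks = (len(A) - start) // b
    for i in range(n_blocks):
        m = i
        for j in range(i + 1, n_blocks):        # find block with least last elt
            if A[start + j*b + b - 1] < A[start + m*b + b - 1]:
                m = j
        if m != i:                              # swap whole blocks i and m
            for k in range(b):
                A[start + i*b + k], A[start + m*b + k] = \
                    A[start + m*b + k], A[start + i*b + k]
```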
Note that the last element of series_{2} is at least as large as the last element of series_{1} because of the earlier rearrangement of blocks. Using block_{1} as (moving) workspace, merge series_{1} and series_{2} into the start of A, favouring series_{1} when breaking ties. Continue until all elements of series_{1} have been correctly placed. Note that the first part of the array is now sorted and the largest elements have migrated to immediately follow this sorted part. Now find a new series_{1} and series_{2}.
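The key primitive here, merging two sorted runs leftwards into a preceding buffer by swapping so that the buffer's contents migrate into the vacated positions rather than being overwritten, can be sketched as below. This is a simplified sketch, not Huang and Langston's full bookkeeping: it assumes the buffer is at least as long as the second run, which holds in the algorithm because series_{2} is a single block of size √N and the buffer (block_{1}) also has size √N.

```python
def buffer_merge(A, b0, s1, s2, end):
    """Merge sorted runs A[s1..s2) and A[s2..end) leftwards into the buffer
    A[b0..s1), using swaps so that no element is lost.  Requires the buffer
    length s1-b0 to be >= the second run's length end-s2, which guarantees
    the output pointer never overtakes the input pointers.  Afterwards the
    merged output occupies A[b0..b0+(end-s1)) and the buffer's elements sit,
    in some order, in the trailing positions."""
    assert s1 - b0 >= end - s2
    out, p1, p2 = b0, s1, s2
    while p1 < s2 and p2 < end:
        if A[p1] <= A[p2]:                      # <= favours run 1 on ties
            A[out], A[p1] = A[p1], A[out]; p1 += 1
        else:
            A[out], A[p2] = A[p2], A[out]; p2 += 1
        out += 1
    while p1 < s2:                              # remainder of run 1
        A[out], A[p1] = A[p1], A[out]; p1 += 1; out += 1
    while p2 < end:                             # remainder of run 2
        A[out], A[p2] = A[p2], A[out]; p2 += 1; out += 1
```

For example, with buffer elements 9, 8 in front of runs [1, 3] and [2, 4], the merge leaves [1, 2, 3, 4] at the front and the buffer's 8, 9 (order scrambled) at the end.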
The new series_{1} starts with the first unmoved element of the old series_{2}. The new series_{2} is found as before. Repeat until the series are empty. Eventually most of the array is sorted and the largest elements have migrated to the right-hand end of the array. The only problem is that the relative order of the largest elements may have been scrambled, but there are only √N of them: sort them, in O((√N)^{2}) time, i.e., in O(N) time. Done.

Of course the initial assumptions may not hold: N may not be a perfect square and, even if it is, the segments may not consist of whole blocks of size √N. If N is not a perfect square, let b = ⌊√N⌋, and N' = N − (N mod b). Find the N − N' largest elements of the array – these must be at the end of seg_{1} and/or seg_{2} – use a swap to move them to the right-hand end of the array and sort them (in at worst O(N) time), then work on A[0..N'). Even if N is a perfect square, or has been made a multiple of a block size that is approximately √N, the lengths of the two segments may not be multiples of the block size. If so, find the largest elements at the end of seg_{1} and/or seg_{2} such that swapping positions will result in two (sorted) segments with lengths that are both multiples of the block size. Proceed as before.

As a practical matter, if the array has fewer than some small fixed number of elements, k, do not apply the algorithm above, just sort it in O(k^{2}) time, e.g., by insertion sort.

Is in-situ merge of any practical use? After all, modern computers have enormous memory and even larger virtual memory. Well maybe, because although in-situ merge has a worse constant of proportionality in its time complexity, it has better locality – about half the working set of other merges: think about (multi-level) caches.

— L.A. 29 September 2018

References
Also search for [insitu merge] in the bibliography.

^{*} A[i..j) = {A[i], ..., A[j−1]} if i<j, and is empty otherwise. 

© L. Allison, www.allisons.org/ll/ (or as otherwise indicated). 