How to master in-place array modification algorithms? - Stack Overflow



In-place modification algorithms can become very hard to handle.

Consider a couple of examples:

  • In-place out-shuffle in linear time. Uses number theory.
  • In-place merge sort, which was an open problem for a few years. An algorithm eventually appeared, but it was too complicated to be practical; it relies on very intricate bookkeeping.

Sorry if this sounds discouraging, but there is no magic elixir that will solve all in-place algorithm problems for you. You need to work with the problem, figure out its properties, and try to exploit them (as is the case with most algorithms).

That said, for array modifications that result in a permutation of the original array, you can try the method of following the cycles of the permutation. Any permutation can be written as a set of disjoint cycles (see John's answer too). For instance, the permutation:

1 4 2 5 3 6  

of 1 2 3 4 5 6 can be written as

1 -> 1  2 -> 3 -> 5 -> 4 -> 2  6 -> 6.  

You can read the arrow as 'goes to'.

So to permute the array 1 2 3 4 5 6 you follow the three cycles:

1 goes to 1.

6 goes to 6.

2 goes to 3, 3 goes to 5, 5 goes to 4 and 4 goes to 2.

To follow this long cycle, you can use just one temp variable. Store 3 in it. Put 2 where 3 was. Now put 3 where 5 was, storing 5 in the temp first, and so on. Since you use only constant extra temp space to follow a particular cycle, you are doing an in-place modification of the array for that cycle.
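
To make the single-temp walk concrete, here is a minimal Python sketch (the function name and the goes_to mapping are mine, not from the answer) that follows one cycle of a permutation, carrying exactly one value in a temp variable:

    def follow_cycle(a, goes_to, start):
        """Move the elements of a along one cycle of the permutation.

        goes_to(i) is the index that the element currently at index i
        should end up at.  Only one temp variable is used.
        """
        pos = start
        temp = a[start]                      # value being carried forward
        while True:
            dest = goes_to(pos)              # where the carried value belongs
            temp, a[dest] = a[dest], temp    # drop it off, pick up the displaced value
            pos = dest
            if pos == start:                 # back at the start: cycle done
                break

    # 0-based version of the cycles above: 1 -> 1, 2 -> 3 -> 5 -> 4 -> 2, 6 -> 6
    goes_to = lambda i: {0: 0, 1: 2, 2: 4, 3: 1, 4: 3, 5: 5}[i]
    a = [1, 2, 3, 4, 5, 6]
    follow_cycle(a, goes_to, 1)              # follow the long cycle starting at 2
    print(a)                                 # [1, 4, 2, 5, 3, 6]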

Now, if I gave you a formula for computing where an element goes, all you need is the set of starting elements of each cycle.

A judicious choice of the starting points of the cycles can make the algorithm easy. If you come up with the starting points in O(1) space, you now have a complete in-place algorithm. This is where you might actually have to get familiar with the problem and exploit its properties.

Even if you didn't know how to compute the starting points of the cycles, but had a formula to compute the next element, you could use this method to get an O(n) time in-place algorithm in some special cases.

For instance, suppose you knew that the array of signed integers held only positive values.

You can now follow the cycles, negating the numbers in them as an indicator of 'visited' elements. Walk the array, pick the first positive number you come across, follow its cycle (negating the elements of that cycle as you go), and keep scanning for untouched elements. At the end, make all the elements positive again to get the resulting permutation.

You get an O(n) time and O(1) space algorithm! Of course, we kind of 'cheated' by using the sign bits of the array integers as our personal 'visited' bitmap.
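
Here is a minimal sketch of that sign-bit trick (assuming the array holds only positive integers; the function name is mine):

    def permute_in_place(a, goes_to):
        """Apply the permutation goes_to (index -> index) to a in place.

        Assumes every element is a positive integer, so the sign bit can be
        borrowed as a 'visited' marker: O(n) time, O(1) extra space.
        """
        n = len(a)
        for start in range(n):
            if a[start] < 0:                     # already placed by an earlier cycle
                continue
            pos = start
            temp = a[start]
            while True:
                dest = goes_to(pos)
                temp, a[dest] = a[dest], -temp   # place the value, negated = visited
                pos = dest
                if pos == start:
                    break
        for i in range(n):                       # restore the signs
            a[i] = -a[i] if a[i] < 0 else a[i]

    a = [10, 20, 30, 40, 50, 60]
    goes_to = lambda i: {0: 0, 1: 2, 2: 4, 3: 1, 4: 3, 5: 5}[i]
    permute_in_place(a, goes_to)
    print(a)                                     # [10, 40, 20, 50, 30, 60]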

Even when the array does not hold integers, this method (following the cycles, that is, not the sign-bit hack :-)) can actually be used to tackle the two problems you state:

  • The in-shuffle (or out-shuffle) problem: When 2n+1 is a power of 3, it can be shown (using number theory) that 1, 3, 3^2, etc. lie in different cycles and that those cycles together cover all positions. Combine this with the fact that the in-shuffle is amenable to divide and conquer, and you get an O(n) time, O(1) space algorithm (the formula is i -> 2*i modulo 2n+1). Refer to the paper mentioned in the first bullet above for more details; a sketch of the cycle-leader step appears after this list.

  • The cyclic-shift-an-array problem: Cyclically shifting an array of size n by k is also a permutation of the array (given by the formula i goes to (i+k) modulo n), and it too can be solved in linear time and in place using the cycle-following method (a sketch follows this list). In fact, in terms of the number of element exchanges, the cycle-following method is better than the three-reversals algorithm. Of course, cycle following can kill the cache because of its access patterns, and in practice the three-reversals algorithm might actually fare better.
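
For the in-shuffle, here is a minimal sketch of the cycle-leader step for the special case where 2n+1 is a power of 3, using the formula i -> 2*i modulo 2n+1 and leaders 1, 3, 3^2, ...; the general case additionally needs the divide-and-conquer wrapper mentioned above (the function name is mine):

    def in_shuffle_pow3(a):
        """In-shuffle a1..an b1..bn -> b1 a1 b2 a2 ... in place.

        Handles only the special case where 2n+1 is a power of 3.
        The permutation is i -> 2*i mod (2n+1) on 1-indexed positions,
        and the cycle leaders are 1, 3, 9, ...
        """
        m = len(a) + 1                           # m = 2n + 1, assumed a power of 3
        leader = 1
        while leader < m:
            pos = leader                         # follow one cycle with a single temp
            temp = a[pos - 1]                    # 1-indexed position -> 0-indexed slot
            while True:
                dest = (2 * pos) % m
                temp, a[dest - 1] = a[dest - 1], temp
                pos = dest
                if pos == leader:
                    break
            leader *= 3

    a = ['a1', 'a2', 'a3', 'a4', 'b1', 'b2', 'b3', 'b4']   # 2n+1 = 9 = 3^2
    in_shuffle_pow3(a)
    print(a)   # ['b1', 'a1', 'b2', 'a2', 'b3', 'a3', 'b4', 'a4']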

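And here is a sketch of the cyclic shift done by following cycles (the classic 'juggling' formulation; the function name is mine). The permutation i -> (i+k) mod n splits into gcd(n, k) cycles, and indices 0 .. gcd(n, k)-1 are one valid choice of cycle leaders, so the starting points come for free:

    from math import gcd

    def rotate_right(a, k):
        """Cyclic shift in place: the element at index i ends up at (i + k) % n.

        Follows the gcd(n, k) cycles of that permutation, writing each
        element exactly once: O(n) time, O(1) extra space.
        """
        n = len(a)
        if n == 0:
            return
        k %= n
        if k == 0:
            return
        for start in range(gcd(n, k)):           # one leader per cycle
            pos = start
            temp = a[start]
            while True:
                dest = (pos + k) % n
                temp, a[dest] = a[dest], temp
                pos = dest
                if pos == start:
                    break

    a = [1, 2, 3, 4, 5, 6]
    rotate_right(a, 2)
    print(a)   # [5, 6, 1, 2, 3, 4]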

Read full article from How to master in-place array modification algorithms? - Stack Overflow

