Why does array size have to be 3^k+1 for cycle leader iteration algorithm to work?


The cycle leader iteration algorithm is an algorithm for shuffling an array by moving all even-numbered entries to the front and all odd-numbered entries to the back while preserving their relative order. For example, given this input:

a 1 b 2 c 3 d 4 e 5 

the output would be

a b c d e 1 2 3 4 5 

This algorithm runs in O(n) time and uses only O(1) space.

One unusual detail of the algorithm is that it works by splitting the array up into blocks of size 3^k + 1. Apparently this is critical for the algorithm to work correctly, but I have no idea why it is.

Why is the choice of 3^k + 1 necessary in this algorithm?

Thanks!

This is going to be a long answer. The answer to your question isn't simple and requires some number theory to answer fully. I've spent about half a day working through the algorithm and I now have an answer, but I'm not sure I can describe it succinctly.

The short version:

  • Breaking the input into blocks of size 3^k + 1 breaks the input apart into blocks of size 3^k - 1 surrounded by two elements that do not end up moving.

  • The remaining 3^k - 1 elements in the block move according to an interesting pattern: each element moves to the position given by dividing its index by two modulo 3^k.

  • This particular motion pattern is connected to a concept from number theory and group theory called primitive roots.

  • Because the number two is a primitive root modulo 3^k, beginning with the numbers 1, 3, 9, 27, etc. and running the pattern is guaranteed to cycle through all the elements of the array exactly once, putting each into its proper place.

  • This pattern is highly dependent on the fact that 2 is a primitive root of 3^k for any k ≥ 1. Changing the size of the array to another value will almost certainly break this, because the wrong property would have to be preserved.

The Long Version

To present this answer, I'm going to proceed in steps. First, I'm going to introduce cycle decompositions as motivation for an algorithm that can efficiently shuffle the elements around in the right order, subject to an important caveat. Next, I'm going to point out an interesting property of how the elements happen to move around in the array when you apply this permutation. Then, I'll connect this to a number-theoretic concept called primitive roots to explain the challenges involved in implementing this algorithm correctly. Finally, I'll explain why this leads to the choice of 3^k + 1 as the block size.

Cycle Decompositions

Let's suppose that you have an array A and a permutation of the elements of that array. Following the standard mathematical notation, we'll denote the permutation of that array as σ(A). We can line the initial array A up on top of the permuted array σ(A) to get a sense for where every element ended up. For example, here's an array and one of its permutations:

    A      0 1 2 3 4
 σ(A)      2 3 0 4 1

One way that we can describe this permutation is just to list off the new elements inside that permutation. However, from an algorithmic perspective, it's often more helpful to represent the permutation as a cycle decomposition, a way of writing out a permutation by showing how to form that permutation by beginning with the initial array and then cyclically permuting some of its elements.

Take a look at the above permutation. First, look at where the 0 ended up. In σ(A), the element 0 ended up taking the place of where the element 2 used to be. In turn, the element 2 ended up taking the place of where the element 0 used to be. We denote this by writing (0 2), indicating that 0 should go where 2 used to be, and that 2 should go where 0 used to be.

Now, look at the element 1. The element 1 ended up where 4 used to be. The number 4 then ended up where 3 used to be, and the element 3 ended up where 1 used to be. We denote this by writing (1 4 3), meaning that 1 should go where 4 used to be, that 4 should go where 3 used to be, and that 3 should go where 1 used to be.

Combining these together, we can represent the overall permutation of the above elements as (0 2)(1 4 3) - we should swap 0 and 2, then cyclically permute 1, 4, and 3. If we do that starting with the initial array, we'll end up at the permuted array that we want.
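If you want to see this in action, here's a quick sketch in Java (the class and helper names are mine) that applies the cycles (0 2) and (1 4 3) to the array above and recovers σ(A):

```java
import java.util.Arrays;

class CycleDemo {
    // Applies one cycle: the element at cycle[0] moves to cycle[1]'s slot,
    // cycle[1]'s element moves to cycle[2]'s slot, etc., and the last
    // element wraps around into cycle[0]'s slot.
    static void applyCycle(int[] array, int[] cycle) {
        int temp = array[cycle[0]];
        for (int i = 1; i < cycle.length; i++) {
            int displaced = array[cycle[i]];
            array[cycle[i]] = temp;
            temp = displaced;
        }
        array[cycle[0]] = temp;
    }

    public static void main(String[] args) {
        int[] a = {0, 1, 2, 3, 4};
        applyCycle(a, new int[]{0, 2});     // swap 0 and 2
        applyCycle(a, new int[]{1, 4, 3});  // cyclically permute 1, 4, 3
        System.out.println(Arrays.toString(a));  // [2, 3, 0, 4, 1]
    }
}
```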

Cycle decompositions are extremely useful for permuting arrays in place, because it's possible to permute any individual cycle in O(C) time and O(1) auxiliary space, where C is the number of elements in the cycle. For example, suppose that you have the cycle (1 6 8 4 2). You can permute the elements in this cycle with code like this:

int[] cycle = {1, 6, 8, 4, 2};

int temp = array[cycle[0]];
for (int i = 1; i < cycle.length; i++) {
    int displaced = array[cycle[i]];  // swap temp with this slot
    array[cycle[i]] = temp;
    temp = displaced;
}
array[cycle[0]] = temp;

This works by just swapping things around until everything comes to rest. Aside from the space required to store the cycle itself, it needs only O(1) auxiliary storage space.

In general, if you want to design an algorithm that applies a particular permutation to an array of elements, you can usually do so by using cycle decompositions. The general algorithm is the following:

for (each cycle in the cycle decomposition of the permutation) {
    apply the above algorithm to cycle those elements;
}

The overall time and space complexity of this algorithm depend on the following:

  1. How quickly can we determine the cycle decomposition we want?
  2. How efficiently can we store that cycle decomposition in memory?

To get an O(n)-time, O(1)-space algorithm for the problem at hand, we're going to show that there's a way to determine the cycle decomposition in O(1) time and space. Since everything gets moved exactly once, the overall runtime will be O(n) and the overall space complexity O(1). It's not easy to get there, as you'll see, but then again, it's not too awful either.

The Permutation Structure

The overarching goal of this problem is to take an array of 2n elements and shuffle it so that the even-positioned elements end up at the front of the array and the odd-positioned elements end up at the end of the array. Let's suppose for now that we have 14 elements, like this:

 0  1  2  3  4  5  6  7  8  9 10 11 12 13 

We want to shuffle the elements so that they come out like this:

 0  2  4  6  8 10 12  1  3  5  7  9 11 13 

There are a couple of useful observations we can make about the way this permutation arises. First, notice that the first element does not move in this permutation, because the even-indexed elements are supposed to show up at the front of the array and it's the first even-indexed element. Next, notice that the last element does not move in this permutation, because the odd-indexed elements are supposed to end up at the back of the array and it's the last odd-indexed element.

These two observations, put together, mean that if we want to permute the elements of the array in the desired fashion, we only need to permute the subarray consisting of the overall array with the first and last elements dropped off. Therefore, going forward, we are purely going to focus on the problem of permuting the middle elements. If we can solve that problem, then we've solved the overall problem.

Now, let's look at just the middle elements of the array. From our above example, that means we're going to start with an array like this one:

 Element    1  2  3  4  5  6  7  8  9 10 11 12
 Index      1  2  3  4  5  6  7  8  9 10 11 12

We want to get the array to look like this:

 Element    2  4  6  8 10 12  1  3  5  7  9 11
 Index      1  2  3  4  5  6  7  8  9 10 11 12

Because this array was formed by taking a 0-indexed array and chopping off the first and last element, we can treat it as a one-indexed array. That's going to be critically important going forward, so be sure to keep that in mind.

So how exactly can we go about generating this permutation? Well, for starters, it doesn't hurt to take a look at each element and try to figure out where it began and where it ended up. If we do so, we can write things out like this:

  • The element at position 1 ended up at position 7.
  • The element at position 2 ended up at position 1.
  • The element at position 3 ended up at position 8.
  • The element at position 4 ended up at position 2.
  • The element at position 5 ended up at position 9.
  • The element at position 6 ended up at position 3.
  • The element at position 7 ended up at position 10.
  • The element at position 8 ended up at position 4.
  • The element at position 9 ended up at position 11.
  • The element at position 10 ended up at position 5.
  • The element at position 11 ended up at position 12.
  • The element at position 12 ended up at position 6.

If you look at this list, you can spot a few patterns. First, notice that the final index of each even-numbered element is always half the position of that element. For example, the element at position 4 ended up at position 2, the element at position 12 ended up at position 6, etc. This makes sense - we pushed all the even elements to the front of the array, so half of the elements that came before them have been displaced and moved out of the way.

Now, what about the odd-numbered elements? Well, there are 12 total elements. Each odd-numbered element gets pushed into the second half, so the odd-numbered element at position 2k+1 gets pushed to at least position 7. Its position within the second half is given by the value of k. Therefore, the element at odd position 2k+1 gets mapped to position 7 + k.

We can take a minute to generalize this idea. Suppose that the array we're permuting has length 2n. An element at position 2x gets mapped to position x (again, even numbers are halved), and an element at position 2x+1 gets mapped to position n + 1 + x. Restating this:

The final position of an element at position p is determined as follows:

  • If p = 2x for some integer x, then 2x ↦ x.
  • If p = 2x+1 for some integer x, then 2x+1 ↦ n + 1 + x.
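As a sanity check, this rule is easy to express in code. Here's a small sketch (the class and method names are mine) that reproduces the list above for the twelve-element case, n = 6:

```java
class PiecewiseRule {
    // Final 1-indexed position of the element at 1-indexed position p
    // in an array of 2n elements.
    static int destination(int p, int n) {
        if (p % 2 == 0) {
            return p / 2;          // p = 2x maps to x
        } else {
            return n + 1 + p / 2;  // p = 2x+1 maps to n + 1 + x
        }
    }

    public static void main(String[] args) {
        int n = 6;  // the twelve-element example above
        for (int p = 1; p <= 2 * n; p++) {
            System.out.println("element at position " + p
                               + " ends up at position " + destination(p, n));
        }
    }
}
```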

And now we're going to do something entirely crazy and unexpected. Right now, we have a piecewise rule for determining where each element ends up: we either divide by two, or we do something weird involving n + 1. However, from a number-theoretic perspective, there is a single, unified rule explaining where all the elements are supposed to end up.

The insight we need is that in both cases, it seems like, in some way, we're dividing the index by two. In the even case, the new index really is formed by dividing by two. In the odd case, the new index kinda looks like it's formed by dividing by two (notice that 2x+1 went to x + (n + 1)), but there's an extra term in there. In a number-theoretic sense, though, both of these really do correspond to division by two. Here's why.

Rather than taking the source index and dividing by two to get the destination index, what happens if we take the destination index and multiply it by two? If we do that, an interesting pattern emerges.

Suppose our original number is 2x. The destination is then x, and if we double the destination index to get 2x, we end up with the source index.

Now suppose our original number is 2x+1. The destination is then n + 1 + x. What happens if we double this destination index? If we do that, we get 2n + 2 + 2x. If we rearrange this, we can alternatively rewrite it as (2x+1) + (2n+1). In other words, we've gotten back the original index, plus an extra (2n+1) term.

Now for the kicker: what if all of our arithmetic is done modulo 2n + 1? In that case, if our original number is 2x + 1, then twice the destination index is (2x+1) + (2n+1) = 2x + 1 (modulo 2n+1). In other words, the destination index really is half of the source index, just done modulo 2n+1!

This leads us to a very, very interesting insight: the ultimate destination of each of the elements in a 2n-element array is given by dividing that index by two, modulo 2n+1. This means there really is a nice, unified rule for determining where everything goes. We just need to be able to divide by two modulo 2n+1. It happens to work out that in the even case, this is normal integer division, and in the odd case, it takes the form n + 1 + x.

Consequently, we can reframe our problem in the following way: given a 1-indexed array of 2n elements, how do we permute the elements so that each element that starts at index x ends up at position x/2 mod (2n+1)?
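To make this concrete, here's a short sketch (names are mine) checking that the unified rule matches the piecewise rule. For odd m = 2n+1, "x/2 mod m" means the unique y with 2y ≡ x (mod m), which is x/2 when x is even and (x + m)/2 when x is odd:

```java
class UnifiedRule {
    // "Divide by two modulo m" for odd m: the unique y in [0, m) with 2y ≡ x (mod m).
    static int halveMod(int x, int m) {
        return (x % 2 == 0) ? x / 2 : (x + m) / 2;
    }

    public static void main(String[] args) {
        int n = 6, m = 2 * n + 1;  // twelve elements, arithmetic done mod 13
        for (int x = 1; x <= 2 * n; x++) {
            int piecewise = (x % 2 == 0) ? x / 2 : n + 1 + x / 2;
            if (piecewise != halveMod(x, m)) {
                throw new AssertionError("rules disagree at index " + x);
            }
        }
        System.out.println("piecewise and unified rules agree for all 12 indices");
    }
}
```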

Cycle Decompositions Revisited

At this point, we've made quite a lot of progress. Given any element, we know where that element should end up. If we can figure out a nice way to get the cycle decomposition of the overall permutation, we're done.

This is, unfortunately, where things get complicated. Suppose, for example, that our array has 10 elements. In that case, we want to transform the array like this:

 Initial:  1  2  3  4  5  6  7  8  9 10
 Final:    2  4  6  8 10  1  3  5  7  9

The cycle decomposition of this permutation is (1 6 3 7 9 10 5 8 4 2). If our array has 12 elements, we want to transform it like this:

 Initial:  1  2  3  4  5  6  7  8  9 10 11 12
 Final:    2  4  6  8 10 12  1  3  5  7  9 11

This has cycle decomposition (1 7 10 5 9 11 12 6 3 8 4 2). If our array has 14 elements, we want to transform it like this:

 Initial:  1  2  3  4  5  6  7  8  9 10 11 12 13 14
 Final:    2  4  6  8 10 12 14  1  3  5  7  9 11 13

This has cycle decomposition (1 8 4 2)(3 9 12 6)(5 10)(7 11 13 14). If our array has 16 elements, we want to transform it like this:

 Initial:  1  2  3  4  5  6  7  8  9 10 11 12 13 14 15 16
 Final:    2  4  6  8 10 12 14 16  1  3  5  7  9 11 13 15

This has cycle decomposition (1 9 13 15 16 8 4 2)(3 10 5 11 14 7 12 6).

The problem here is that these cycles don't seem to follow any predictable patterns. This is a real problem if we're going to try to solve this in O(1) space and O(n) time. Even though, given any individual element, we can figure out what cycle contains it and can efficiently shuffle that cycle, it's not clear how to figure out what elements belong to what cycles, how many different cycles there are, etc.
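For what it's worth, the decompositions above can be recovered by brute force: follow each unvisited index until the walk returns to its start. Here's a sketch of that (all names mine) for the 14-element case:

```java
class CycleFinder {
    // Where the element at 1-indexed position x ends up, in an array of 2n elements.
    static int destination(int x, int n) {
        return (x % 2 == 0) ? x / 2 : n + 1 + x / 2;
    }

    public static void main(String[] args) {
        int n = 7;  // the 14-element case
        boolean[] seen = new boolean[2 * n + 1];
        StringBuilder cycles = new StringBuilder();
        for (int start = 1; start <= 2 * n; start++) {
            if (seen[start]) continue;
            seen[start] = true;
            cycles.append('(').append(start);
            // Walk the cycle until we come back around to where we started.
            for (int x = destination(start, n); x != start; x = destination(x, n)) {
                seen[x] = true;
                cycles.append(' ').append(x);
            }
            cycles.append(')');
        }
        System.out.println(cycles);  // (1 8 4 2)(3 9 12 6)(5 10)(7 11 13 14)
    }
}
```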

Primitive Roots

This is where number theory comes in. Remember that each element's new position is formed by dividing its index by two, modulo 2n+1. Thinking about this backwards, we can figure out which number will take the place of each number by multiplying by two modulo 2n+1. Therefore, we can think of this problem as finding the cycle decomposition in reverse: pick a number, keep multiplying it by two and modding by 2n+1, and repeat until we're done with the cycle.

This gives rise to a well-studied problem. Suppose that we start with the number k and think about the sequence k, 2k, 2^2 k, 2^3 k, 2^4 k, etc., all done modulo 2n+1. Doing this gives rise to different patterns depending on which odd number 2n+1 you're modding by. This explains why the above cycle patterns seem somewhat arbitrary.

I have no idea how anyone ever figured this out, but it turns out that there's a beautiful result from number theory that talks about what happens if you take this pattern mod 3^k for some number k:

Theorem: Consider the sequence of numbers 3^s, 3^s · 2, 3^s · 2^2, 3^s · 2^3, 3^s · 2^4, etc., all modulo 3^k for some k ≥ s. This sequence cycles through every number between 1 and 3^k, inclusive, that is divisible by 3^s but not divisible by 3^(s+1).

We can try this out on a few examples. Let's work modulo 27 = 3^3. The theorem says that if we look at 3, 3 · 2, 3 · 4, etc., all modulo 27, then we should see all the numbers less than 27 that are divisible by 3 and not divisible by 9. Well, let's see what we get:

  • 3 · 2^0 = 3 · 1 = 3 = 3 mod 27
  • 3 · 2^1 = 3 · 2 = 6 = 6 mod 27
  • 3 · 2^2 = 3 · 4 = 12 = 12 mod 27
  • 3 · 2^3 = 3 · 8 = 24 = 24 mod 27
  • 3 · 2^4 = 3 · 16 = 48 = 21 mod 27
  • 3 · 2^5 = 3 · 32 = 96 = 15 mod 27
  • 3 · 2^6 = 3 · 64 = 192 = 3 mod 27

We ended up seeing 3, 6, 12, 15, 21, and 24 (though not in that order), which are indeed all the numbers less than 27 that are divisible by 3 but not divisible by 9.

We can also try working mod 27 and considering 1, 2, 2^2, 2^3, 2^4, etc. mod 27, and we should see all the numbers less than 27 that are divisible by 1 and not divisible by 3. In other words, this should give back all the numbers less than 27 that aren't divisible by 3. Let's see if that's true:

  • 2^0 = 1 = 1 mod 27
  • 2^1 = 2 = 2 mod 27
  • 2^2 = 4 = 4 mod 27
  • 2^3 = 8 = 8 mod 27
  • 2^4 = 16 = 16 mod 27
  • 2^5 = 32 = 5 mod 27
  • 2^6 = 64 = 10 mod 27
  • 2^7 = 128 = 20 mod 27
  • 2^8 = 256 = 13 mod 27
  • 2^9 = 512 = 26 mod 27
  • 2^10 = 1024 = 25 mod 27
  • 2^11 = 2048 = 23 mod 27
  • 2^12 = 4096 = 19 mod 27
  • 2^13 = 8192 = 11 mod 27
  • 2^14 = 16384 = 22 mod 27
  • 2^15 = 32768 = 17 mod 27
  • 2^16 = 65536 = 7 mod 27
  • 2^17 = 131072 = 14 mod 27
  • 2^18 = 262144 = 1 mod 27

Sorting these, we got back the numbers 1, 2, 4, 5, 7, 8, 10, 11, 13, 14, 16, 17, 19, 20, 22, 23, 25, 26 (though we didn't see them in that order). These are exactly the numbers between 1 and 26 that aren't multiples of three!
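Both of these checks are mechanical enough to script. Here's a sketch (class and method names mine) that collects the repeated doublings mod 27 and confirms both claims from the theorem:

```java
import java.util.TreeSet;

class PrimitiveRootCheck {
    // All distinct values of start * 2^i mod m, for i = 0, 1, 2, ...
    static TreeSet<Integer> orbit(int start, int m) {
        TreeSet<Integer> seen = new TreeSet<>();
        int value = start;
        do {
            seen.add(value);
            value = (value * 2) % m;
        } while (value != start);
        return seen;
    }

    public static void main(String[] args) {
        // Starting at 3: the multiples of 3 below 27 that aren't multiples of 9.
        System.out.println(orbit(3, 27));  // [3, 6, 12, 15, 21, 24]

        // Starting at 1: every number below 27 that isn't a multiple of 3.
        TreeSet<Integer> expected = new TreeSet<>();
        for (int i = 1; i < 27; i++) {
            if (i % 3 != 0) expected.add(i);
        }
        System.out.println(orbit(1, 27).equals(expected));  // true
    }
}
```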

This theorem is crucial to the algorithm for the following reason: if 2n+1 = 3^k for some number k, then processing the cycle containing 1 will shuffle all the numbers that aren't multiples of three. If we then start a cycle at 3, we will shuffle all the numbers divisible by 3 but not by 9. If we then start a cycle at 9, we will shuffle all the numbers divisible by 9 but not by 27. More generally, if we use the cycle shuffle algorithm on the numbers 1, 3, 9, 27, 81, etc., then we will reposition all the elements in the array exactly once and will not have to worry that we missed anything.

So how does this connect to 3^k + 1? Well, we need to have 2n + 1 = 3^k, so we need to have 2n = 3^k - 1. But remember - we dropped the first and last elements of the array when we did this! Adding those back in tells us that we need blocks of size 3^k + 1 for this procedure to work correctly. If the blocks are this size, then we know for certain that the cycle decomposition will consist of a cycle containing 1, a nonoverlapping cycle containing 3, a nonoverlapping cycle containing 9, etc., and that these cycles will contain all the elements of the array. Consequently, we can just start cycling with 1, 3, 9, 27, etc. and be absolutely guaranteed that everything gets shuffled around correctly. That's amazing!
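Putting all of this together, here is a sketch of how the shuffle for one block might look (all names here are mine, not the original algorithm's): the first and last elements are left alone, and each cycle leader 1, 3, 9, ... is followed through the x ↦ x/2 mod 3^k pattern:

```java
import java.util.Arrays;

class CycleLeader {
    // Destination of the element at 1-indexed interior position x: x / 2 mod m,
    // where m = 3^k is odd.
    static int dest(int x, int m) {
        return (x % 2 == 0) ? x / 2 : (x + m) / 2;
    }

    // Shuffles one block in place; block.length must be 3^k + 1 for some k.
    static void shuffleBlock(int[] block) {
        int m = block.length - 1;  // m = 3^k; the interior occupies indices 1..m-1
        for (int leader = 1; leader < m; leader *= 3) {
            int temp = block[leader];
            int x = dest(leader, m);
            while (x != leader) {
                int displaced = block[x];
                block[x] = temp;   // drop the carried element into its destination
                temp = displaced;  // and carry the displaced one onward
                x = dest(x, m);
            }
            block[leader] = temp;
        }
    }

    public static void main(String[] args) {
        // A block of size 3^3 + 1 = 28 holding the values 0..27. After shuffling,
        // the evens 0, 2, ..., 26 occupy the front half and the odds 1, 3, ..., 27
        // the back half, in order.
        int[] block = new int[28];
        for (int i = 0; i < 28; i++) block[i] = i;
        shuffleBlock(block);
        System.out.println(Arrays.toString(block));
    }
}
```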

And why is this theorem true? It turns out that a number k for which 1, k, k^2, k^3, etc. mod p^n cycles through all the numbers that aren't multiples of p (assuming p is prime) is called a primitive root of the number p^n. There's a theorem that says that 2 is a primitive root of 3^k for all numbers k, which is why this trick works. If I have the time, I'd like to come back and edit this answer to include a proof of this result, though unfortunately my number theory isn't at a level where I know how to do so.

Summary

This problem was tons of fun to work on. It involves cute tricks with dividing by two modulo odd numbers, cycle decompositions, primitive roots, and powers of three. I'm indebted to this arXiv paper, which described a similar (though quite different) algorithm and gave me a sense for the key trick behind the technique, which then let me work out the details for the algorithm you described.

Hope this helps!

