February 07, 2013 - Re: nextPermutation and ranges
Posted in reply to bearophile

On Thu, Feb 07, 2013 at 08:47:39PM +0100, bearophile wrote:
> H. S. Teoh:
>
> > 1) To avoid excessive allocations, it would have to be a transient
> > range.
>
> See the doCopy boolean template argument in my code. Note that it's
> true by default. (D Zen: do the safe thing by default, and the fast
> thing on request.)

Good point.

> > 2) If the starting range is not sorted, then the permutation range
> > needs to make a copy of the original range so that it knows when all
> > permutations have been enumerated. But there is no generic way to
> > make a copy of a range
>
> Isn't it possible to call array() on the input range? (Performing
> array() takes a microscopic amount of time compared to computing its
> permutations.)

array() doesn't work on transient ranges. But I suppose it's OK to forego that, if we're going to be safe by default.

> > I think this is an unfair comparison: using factorial assumes that
> > all array elements are unique.
>
> It's a fair comparison because it tests a common usage case in my code.
>
> I'm not asking to remove nextPermutation from Phobos. I think
> nextPermutation is useful, but in many cases my items are unique
> (example: I make them unique before giving them to permutations()).
> (And I think it's not good to slow down this very common case
> because in general they aren't unique.)
[...]

We could make a variant of nextPermutation (or a range incarnation thereof) that assumes uniqueness. I think that would be a great addition to Phobos.

Nevertheless, any implementation based on factorial would have to address the issue of how to handle the exponential growth. I think it's unacceptable for the standard library to support permutations only up to ranges of 20 elements or less, because 21! > 2^64.


T

--
"Real programmers can write assembly code in any language. :-)" -- Larry Wall
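For context, the in-place primitive being debated is normally driven with a do/while idiom; a minimal sketch follows (bearophile's permutations() range and its doCopy flag live in his own code and are not shown here):

```d
import std.algorithm : nextPermutation, sort;
import std.stdio : writeln;

void main()
{
    int[] items = [3, 1, 2];
    sort(items);  // nextPermutation enumerates from the sorted (smallest) arrangement
    do
    {
        writeln(items);  // each permutation is produced in place, with no allocation
    }
    while (nextPermutation(items));  // returns false once the range wraps back to sorted order
}
```

This prints all 3! = 6 orderings; the question in this thread is what a lazy, range-based wrapper over this primitive should look like.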
February 07, 2013 - Re: nextPermutation and ranges
Posted in reply to H. S. Teoh

H. S. Teoh:
> any implementation based on factorial would have to
> address the issue of how to handle the exponential growth. I think it's unacceptable for the standard library to support
> permutations only up to ranges of 20 elements or less,
> because 21! > 2^64.
What are the use cases for generating the permutations of a set of more than about 20 items? (In my code I don't remember having to permute more than a few items.)
One thing I was forgetting: thank you for your work, H. S. Teoh.
Bye,
bearophile
February 07, 2013 - Re: nextPermutation and ranges
Posted in reply to bearophile

On Thu, Feb 07, 2013 at 09:24:24PM +0100, bearophile wrote:
> H. S. Teoh:
>
> > any implementation based on factorial would have to address the
> > issue of how to handle the exponential growth. I think it's
> > unacceptable for the standard library to support permutations only
> > up to ranges of 20 elements or less, because 21! > 2^64.
>
> What are the use cases for generating the permutations of a set of
> more than about 20 items? (In my code I don't remember having to
> permute more than a few items.)

Combinatorial puzzles come to mind (Rubik's cube solvers and its ilk, for example). Maybe other combinatorial problems that require some kind of exhaustive state space search. Those things easily go past 20! once you get beyond the most trivial cases.

> One thing I was forgetting: thank you for your work, H. S. Teoh.
[...]

You're welcome.


T

--
Never ascribe to malice that which is adequately explained by incompetence. -- Napoleon Bonaparte
February 07, 2013 - Re: nextPermutation and ranges
Posted in reply to H. S. Teoh

H. S. Teoh:
> Combinatorial puzzles come to mind (Rubik's cube solvers and its ilk,
> for example). Maybe other combinatorial problems that require some kind
> of exhaustive state space search. Those things easily go past 20! once
> you get beyond the most trivial cases.
I know many situations/problems where you have more than 20! cases, but in those problems you don't iterate over all the permutations, because the program would take ages to do it. In those programs you don't use nextPermutation; you scan the search space in a different and smarter way.
I don't know of any use case for permuting such large sets of items.
Bye,
bearophile
February 07, 2013 - Re: nextPermutation and ranges
Posted in reply to bearophile

On Thu, Feb 07, 2013 at 09:42:34PM +0100, bearophile wrote:
> H. S. Teoh:
>
> > Combinatorial puzzles come to mind (Rubik's cube solvers and its
> > ilk, for example). Maybe other combinatorial problems that require
> > some kind of exhaustive state space search. Those things easily go
> > past 20! once you get beyond the most trivial cases.
>
> I know many situations/problems where you have more than 20! cases,
> but in those problems you don't iterate all permutations, because the
> program takes ages to do it. In those programs you don't use
> nextPermutation, you scan the search space in a different and smarter
> way.
>
> I don't know of any use case for permuting so large sets of items.
[...]

It depends; sometimes in complex cases you have no choice but to do an exhaustive search. I agree that it's very rare, though.


T

--
If creativity is stifled by rigid discipline, then it is not true creativity.
February 08, 2013 - Re: nextPermutation and ranges
Posted in reply to H. S. Teoh

On Thu, 7 Feb 2013 13:52:01 -0800, "H. S. Teoh" <hsteoh@quickfur.ath.cx> wrote:

> On Thu, Feb 07, 2013 at 09:42:34PM +0100, bearophile wrote:
> > H. S. Teoh:
> >
> > > Combinatorial puzzles come to mind (Rubik's cube solvers and its
> > > ilk, for example). Maybe other combinatorial problems that require
> > > some kind of exhaustive state space search. Those things easily go
> > > past 20! once you get beyond the most trivial cases.
> >
> > I know many situations/problems where you have more than 20! cases,
> > but in those problems you don't iterate all permutations, because
> > the program takes ages to do it. In those programs you don't use
> > nextPermutation, you scan the search space in a different and
> > smarter way.
> >
> > I don't know of any use case for permuting so large sets of items.
> [...]
>
> It depends, sometimes in complex cases you have no choice but to do
> exhaustive search. I agree that it's very rare, though.
>
>
> T

So right now we can handle 20! = 2,432,902,008,176,640,000 permutations. If every check took only 20 clock cycles of a 4 GHz CPU, it would take you ~386 years to go through the list. For the average human researcher this is plenty of time.

--
Marco
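A back-of-the-envelope sketch of that estimate; the 20 cycles per check and the 4 GHz clock are Marco's assumptions above, not measurements:

```d
import std.stdio : writefln;

void main()
{
    immutable double perms = 2_432_902_008_176_640_000.0;  // 20!
    immutable double cyclesPerCheck = 20;                   // assumed cost of one check
    immutable double clockHz = 4e9;                         // assumed 4 GHz CPU
    immutable double seconds = perms * cyclesPerCheck / clockHz;
    immutable double years = seconds / (365 * 24 * 3600);
    writefln("~%.0f years", years);                          // prints ~386 years
}
```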
February 08, 2013 - Re: nextPermutation and ranges
Posted in reply to Marco Leise

On Friday, 8 February 2013 at 06:59:20 UTC, Marco Leise wrote:
> On Thu, 7 Feb 2013 13:52:01 -0800, "H. S. Teoh"
> <hsteoh@quickfur.ath.cx> wrote:
>
>> On Thu, Feb 07, 2013 at 09:42:34PM +0100, bearophile wrote:
>> > H. S. Teoh:
>> >
>> > >Combinatorial puzzles come to mind (Rubik's cube solvers and its ilk,
>> > >for example). Maybe other combinatorial problems that require some
>> > >kind of exhaustive state space search. Those things easily go past
>> > >20! once you get beyond the most trivial cases.
>> >
>> > I know many situations/problems where you have more than 20! cases,
>> > but in those problems you don't iterate all permutations, because the
>> > program takes ages to do it. In those programs you don't use
>> > nextPermutation, you scan the search space in a different and smarter
>> > way.
>> >
>> > I don't know of any use case for permuting so large sets of items.
>> [...]
>>
>> It depends, sometimes in complex cases you have no choice but to do
>> exhaustive search. I agree that it's very rare, though.
>>
>>
>> T
>>
>
> So right now we can handle 20! = 2,432,902,008,176,640,000
> permutations. If every check took only 20 clock cycles of a 4
> Ghz CPU, it would take you ~386 years to go through the list.
> For the average human researcher this is plenty of time.
On a modern supercomputer this would take well under 2 months. (I calculated it as ~44 days on Minerva at Warwick, UK.) 19! would take less than 3 days.
In a parallel setting, such large numbers are assailable.
February 08, 2013 - Re: nextPermutation and ranges
Posted in reply to John Colvin

On Friday, 8 February 2013 at 12:27:50 UTC, John Colvin wrote:
> On Friday, 8 February 2013 at 06:59:20 UTC, Marco Leise wrote:
>>
>> So right now we can handle 20! = 2,432,902,008,176,640,000 permutations. If every check took only 20 clock cycles of a 4 Ghz CPU, it would take you ~386 years to go through the list. For the average human researcher this is plenty of time.
>
> On a modern supercomputer this would take well under 2 months. (I calculated it as ~44 days on Minerva at Warwick, UK). 19! would take less than 3 days.
>
> In a parallel setting, such large numbers are assailable.
If we have such a large number of computations, then either cent will have to be implemented, or we use BigInt, or an internal array that handles locational information. That would raise the limit from 20 elements to either 255 or 65535 (assuming you EVER need that many). Of course, rummaging through the array for the next computation becomes more difficult the larger the number of elements.
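A minimal sketch of the cutoff being discussed: counting permutations in a built-in integer stops working past 20 elements, while std.bigint has no such limit (the factorial helper below is only for illustration, not code from this thread):

```d
import std.bigint : BigInt;
import std.stdio : writeln;

// Compute n! exactly, without the 64-bit overflow that caps nextPermutation-style counting at 20 elements.
BigInt factorial(uint n)
{
    auto result = BigInt(1);
    foreach (i; 2 .. n + 1)
        result *= i;
    return result;
}

void main()
{
    writeln(factorial(20));  // 2432902008176640000, still fits in a ulong
    writeln(factorial(21));  // 51090942171709440000, already exceeds ulong.max
}
```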
February 09, 2013 - Re: nextPermutation and ranges
Posted in reply to Era Scarecrow

On Friday, 8 February 2013 at 21:07:58 UTC, Era Scarecrow wrote:
> On Friday, 8 February 2013 at 12:27:50 UTC, John Colvin wrote:
>> On Friday, 8 February 2013 at 06:59:20 UTC, Marco Leise wrote:
>>>
>>> So right now we can handle 20! = 2,432,902,008,176,640,000 permutations. If every check took only 20 clock cycles of a 4 Ghz CPU, it would take you ~386 years to go through the list. For the average human researcher this is plenty of time.
>>
>> On a modern supercomputer this would take well under 2 months. (I calculated it as ~44 days on Minerva at Warwick, UK). 19! would take less than 3 days.
>>
>> In a parallel setting, such large numbers are assailable.
>
> If we have such a large number of computations, then either cent will have to be implemented, use BigInt, or an internal array that handles locational information. That would remove the limitations of 20 to either 255, or 65535 (assuming you EVER need that many). Course rummaging through the array for the next computation becomes more difficult the larger the number of elements.
Seeing as 61! is of the order of the number of atoms in the observable universe, I don't think there's much need to plan for any higher than that!