Thread overview: how to handle very large array?
- MichaelBi (Feb 09, 2022)
- MichaelBi (Feb 09, 2022)
- MichaelBi (Feb 09, 2022)
- ag0aep6g (Feb 09, 2022)
- user1234 (Feb 09, 2022)
- bauss (Feb 09, 2022)
- MoonlightSentinel (Feb 09, 2022)
- Ali Çehreli (Feb 09, 2022)
- H. S. Teoh (Feb 09, 2022)
- MichaelBi (Feb 10, 2022)
- H. S. Teoh (Feb 10, 2022)
- forkit (Feb 10, 2022)
- Era Scarecrow (Feb 12, 2022)
- H. S. Teoh (Feb 12, 2022)
- MichaelBi (Feb 14, 2022)
- Matheus (Feb 14, 2022)
February 09, 2022 (MichaelBi)

Day 6 of Advent of Code 2021 requires handling an array of length 10^12, or even bigger, plus changing and appending elements. Normal approaches such as length, appender, and ref element access don't seem able to handle an array that big. Is there an alternative data structure or algorithm that can handle such a large array properly? Thanks.

February 09, 2022 (MichaelBi)

On Wednesday, 9 February 2022 at 10:03:21 UTC, MichaelBi wrote:

> Day 6 of Advent of Code 2021 requires handling an array of length 10^12, or even bigger, plus changing and appending elements. Normal approaches such as length, appender, and ref element access don't seem able to handle an array that big. Is there an alternative data structure or algorithm that can handle such a large array properly? Thanks.

Got an OutOfMemoryError:

core.exception.OutOfMemoryError@src\core\lifetime.d(126): Memory allocation failed

February 09, 2022 (MichaelBi)

On Wednesday, 9 February 2022 at 10:05:23 UTC, MichaelBi wrote:

> Got an OutOfMemoryError:
> core.exception.OutOfMemoryError@src\core\lifetime.d(126): Memory allocation failed

https://adventofcode.com/2021/day/6#part2

"Suppose the lanternfish live forever and have unlimited food and space. Would they take over the entire ocean?

After 256 days in the example above, there would be a total of 26984457539 lanternfish!

How many lanternfish would there be after 256 days?"

26984457539 above is the required array length.

February 09, 2022 (ag0aep6g)
On 09.02.22 11:09, MichaelBi wrote:
> On Wednesday, 9 February 2022 at 10:05:23 UTC, MichaelBi wrote:
[...]
>> got outofmemory error:
>> core.exception.OutOfMemoryError@src\core\lifetime.d(126): Memory allocation failed
> 
> https://adventofcode.com/2021/day/6#part2
> 
> "Suppose the lanternfish live forever and have unlimited food and space. Would they take over the entire ocean?
> 
> After 256 days in the example above, there would be a total of 26984457539 lanternfish!
> 
> How many lanternfish would there be after 256 days"
> 
> 26984457539 in above is the array length.

If you store one byte per lanternfish, that's 25 GiB. You don't seem to have enough RAM for such a large array.

Try to think of a more efficient way of storing the information.
February 09, 2022 (bauss)

On Wednesday, 9 February 2022 at 10:03:21 UTC, MichaelBi wrote:

> Day 6 of Advent of Code 2021 requires handling an array of length 10^12, or even bigger, plus changing and appending elements. Normal approaches such as length, appender, and ref element access don't seem able to handle an array that big. Is there an alternative data structure or algorithm that can handle such a large array properly? Thanks.

Use a memory-mapped file that holds the array values; in theory it can then be almost unlimited in size.

Since each array entry has a known size, you can treat the file as an array.

Let's say you have an array of ints. A memory-mapped file only gives you bytes to work with, so each entry occupies 4 bytes, since a 32-bit integer (int) is 4 bytes.

To read the element at a specific index, you read at byte offset N x I, where N is the size of the type and I is the index you want.

Otherwise, the size of an array cannot exceed the RAM you have, unless your system can use disk space as RAM (swap), of course.
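A minimal sketch of this idea using Phobos' std.mmfile, where the file name and length are illustrative:

```d
import std.mmfile : MmFile;

void main()
{
    enum length = 1_000_000; // illustrative; could be far larger than RAM
    // Create (or open) a file big enough for `length` ints, mapped read/write.
    auto mmf = new MmFile("data.bin", MmFile.Mode.readWrite,
                          length * int.sizeof, null);
    // Treat the mapped bytes as an int[]: entry i lives at byte offset i * int.sizeof.
    auto arr = cast(int[]) mmf[];
    arr[42] = 123; // writes go to the mapping; the OS pages them to disk
    assert(arr[42] == 123);
}
```

The OS pages the file in and out on demand, so only the working set needs to fit in physical memory.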

February 09, 2022 (user1234)
On Wednesday, 9 February 2022 at 10:29:03 UTC, ag0aep6g wrote:
> Try to think of a more efficient way of storing the information.

I can't agree more. The OP's problem is not dynamic arrays; it's that he is using an inadequate data structure.
February 09, 2022 (MoonlightSentinel)

On Wednesday, 9 February 2022 at 10:03:21 UTC, MichaelBi wrote:

> Day 6 of Advent of Code 2021 requires handling an array of length 10^12, or even bigger...

As others have mentioned, huge arrays require appropriate amounts of memory or the use of memory-mapped files to actually store them. But the calculation would take a long time even if your program could allocate an array of that size.

There's a way to use a much smaller array to manage the lanternfish population...
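For reference (spoiler for day 6), the hinted-at trick can be sketched as follows: instead of one entry per fish, keep a 9-element array of counts indexed by each fish's timer value and rotate it once per simulated day. With the puzzle's example input [3, 4, 3, 1, 2], 256 days yields 26984457539, matching the number quoted earlier in the thread.

```d
ulong simulate(int[] timers, int days)
{
    ulong[9] counts; // counts[t] = number of fish whose internal timer is t
    foreach (t; timers)
        counts[t]++;

    foreach (day; 0 .. days)
    {
        immutable spawning = counts[0]; // fish at timer 0 reproduce today
        foreach (t; 1 .. 9)             // every other timer decreases by one
            counts[t - 1] = counts[t];
        counts[6] += spawning;          // parents reset their timer to 6
        counts[8] = spawning;           // each parent spawns a fish with timer 8
    }

    ulong total = 0;
    foreach (c; counts)
        total += c;
    return total;
}

void main()
{
    assert(simulate([3, 4, 3, 1, 2], 256) == 26984457539);
}
```

The array stays at a fixed 9 elements regardless of how large the population grows; only the counters themselves grow, and a ulong comfortably holds the day-256 totals.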

February 09, 2022 (Ali Çehreli)
On 2/9/22 10:38, MoonlightSentinel wrote:

> There's a way to use a much smaller array to manage the lanternfish
> population...

As soon as I read your comment, I was reminded of a certain ingenious sorting algorithm that is O(N). ;) After reading the problem description, I see your hint was useful.

Ali

February 09, 2022 (H. S. Teoh)
On Wed, Feb 09, 2022 at 11:05:23AM -0800, Ali Çehreli via Digitalmars-d-learn wrote:
> On 2/9/22 10:38, MoonlightSentinel wrote:
> 
> > There's a way to use a much smaller array to manage the lanternfish population...
> 
> As soon as I read your comment, I was reminded of a certain ingenious sorting algorithm that is O(N). ;) After reading the problem description, I see your hint was useful.
[...]

If you know beforehand that your data is discrete and bounded, you can achieve O(N) sort complexity (as opposed to O(N log N), which is the lower bound if you cannot make any assumptions about your data beyond their ordering).  You can use pigeonhole sort, for example, i.e., create an M-element array of counters where M is the number of distinct values your elements may have, then just iterate over your data and increment the counter corresponding to each element you find (using array indexing for O(1) access to each counter). Then iterate over the array of counters in order and replace the input data with that many copies of each element. Your overall complexity then would be O(N+M), which would be O(N) if M is about the same order of magnitude as N or less.
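A minimal sketch of the pigeonhole/counting sort just described, assuming the elements are known to lie in the range 0 .. m:

```d
// Counting sort: O(N + M) for N input elements known to lie in [0, m).
int[] countingSort(int[] input, int m)
{
    auto counts = new ulong[m]; // one counter per possible value
    foreach (x; input)
        counts[x]++;            // O(1) per element via array indexing

    // Second pass: emit each value as many times as it was counted.
    auto result = new int[input.length];
    size_t pos = 0;
    foreach (value, count; counts)
        foreach (_; 0 .. count)
            result[pos++] = cast(int) value;
    return result;
}

void main()
{
    assert(countingSort([3, 1, 4, 1, 5, 9, 2, 6], 10) ==
           [1, 1, 2, 3, 4, 5, 6, 9]);
}
```

Note how the cost depends on m as well as the input length, which is exactly why the approach breaks down when the value range is much larger than the input.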

However, this algorithm quickly becomes impractical when M grows large, because you will soon need to create many more slots than there are input elements. For example, sorting a general int[] this way would require 2^32 counters, which is usually overkill unless your input has close to 2^32 elements. (You could do it if you knew your ints are ≤ M for some small value of M, though. But it won't work for a general int[] that can contain any value.) And trying to sort a general ulong[] this way will not work, because you would need an array of 2^64 counters, which would not only exhaust all available RAM in today's computers but also take a ridiculously long time to iterate in the second step.

The insight here is *taking advantage of additional structure in your data*.  If you take the general, minimal-assumptions approach, the best you can do is the general O(N log N) algorithm.  However, by exploiting additional structure in your data, you can do better.  Similarly, if you take the naïve approach to modelling your lanternfish, you'll end up with an unwieldy array that's far too large to fit in RAM.  If you exploit the additional structure in your data, however, you can accomplish the same task with a much smaller array.


T

-- 
Latin's a dead language, as dead as can be; it killed off all the Romans, and now it's killing me! -- Schoolboy
February 10, 2022 (MichaelBi)
On Wednesday, 9 February 2022 at 19:48:49 UTC, H. S. Teoh wrote:
> [...]

Thanks, very helpful! I am using an associative array now...