February 09, 2022
On Thu, Feb 10, 2022 at 01:32:00AM +0000, MichaelBi via Digitalmars-d-learn wrote:
> On Wednesday, 9 February 2022 at 19:48:49 UTC, H. S. Teoh wrote:
> > [...]
> 
> thanks, very helpful! i am using a assocArray now...

Are you sure that's what you need?


T

-- 
Computers are like a jungle: they have monitor lizards, rams, mice, c-moss, binary trees... and bugs.
February 10, 2022
On Thursday, 10 February 2022 at 01:43:54 UTC, H. S. Teoh wrote:
> On Thu, Feb 10, 2022 at 01:32:00AM +0000, MichaelBi via Digitalmars-d-learn wrote:
>> On Wednesday, 9 February 2022 at 19:48:49 UTC, H. S. Teoh wrote:
>> > [...]
>> 
>> thanks, very helpful! i am using a assocArray now...
>
> Are you sure that's what you need?
>
>
> T

https://youtu.be/yJjpXJm7x0o?t=213
February 12, 2022

On Thursday, 10 February 2022 at 01:43:54 UTC, H. S. Teoh wrote:
> On Thu, Feb 10, 2022 at 01:32:00AM +0000, MichaelBi via Digitalmars-d-learn wrote:
>> thanks, very helpful! i am using a assocArray now...
>
> Are you sure that's what you need?

Depends. If you use, say, TYPE[long] or TYPE[int], you effectively have a sparse array; if there are only a few entries (say, a hundred million or so) it will probably be fine.
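
A minimal sketch of that idea (names are just for illustration): a D associative array keyed by a wide integer behaves like a sparse array, allocating only for the keys actually written.

```d
void main()
{
    // Sparse "array": ubyte values keyed by a wide integer index;
    // only the entries actually written consume memory.
    ubyte[long] sparse;

    sparse[0]             = 1;
    sparse[1_000_000_000] = 7;   // no billion-element allocation happens

    // Probe missing keys with `in` or .get, which don't insert.
    assert((5 in sparse) is null);
    assert(sparse.get(5, 0) == 0);
    assert(sparse.length == 2);
}
```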

Depending on what you're storing (say, a few bits per entry), you could probably use a BitArray to pack the values. Still, 10^12 entries would take up.... 119 GB? That won't work. I wonder how the 25 GB figure was calculated.
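
For the one-bit-per-entry case, Phobos's std.bitmanip.BitArray already does the packing; multi-bit values would need manual packing on top. A rough sketch:

```d
import std.bitmanip : BitArray;

void main()
{
    // One bit per entry: 100 million flags take roughly 12 MB,
    // versus 100 MB for a bool[] (each bool occupies a full byte).
    BitArray flags;
    flags.length = 100_000_000;

    flags[42] = true;
    assert(flags[42]);
    assert(!flags[43]);
}
```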

Though data of that size sounds more like a job for a database. So an index-access wrapper that reads/writes the backing file on demand might be better, swapping a single 1 MB block (or some other power-of-two size) at a time. If the sparseness/density isn't heavy, you could instead get away with a zram drive, leaving the allocation/compression to the OS so it all remains in memory (though that's Linux-only and not a universal workaround); or you could compress blocks of data yourself with zlib, with the array-like object swapping them in and out, but I'm more iffy on that.
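
The zlib variant could look roughly like this (a hypothetical type, heavily simplified: no caching, no eviction, no write-back), using std.zlib from Phobos:

```d
import std.zlib : compress, uncompress;

enum size_t blockSize = 1 << 20;   // 1 MB blocks

// Sketch: data is held as zlib-compressed blocks, and one block is
// inflated per element access.
struct CompressedStore
{
    ubyte[][] blocks;   // blocks[k] is block k, compressed

    ubyte opIndex(size_t i)
    {
        auto blk = cast(ubyte[]) uncompress(blocks[i / blockSize], blockSize);
        return blk[i % blockSize];
    }
}

void main()
{
    auto raw = new ubyte[blockSize];   // one block, mostly zeros
    raw[123] = 42;
    auto store = CompressedStore([cast(ubyte[]) compress(raw)]);
    assert(store[123] == 42);
}
```

Decompressing a whole block per access is obviously slow; a real version would keep the most recently used block inflated.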

If the array just holds, say, the results of a formula, you could instead provide a range with index support that generates each value on demand. That uses very little space (only whatever minimum state the formula needs), though it may be slower than direct memory access.
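
Something like this sketch (hypothetical names): a "virtual array" whose opIndex computes elements from a formula, so memory use is O(1) no matter how long it claims to be.

```d
// f is whatever formula generates the data.
struct Generated(alias f)
{
    size_t length;
    auto opIndex(size_t i) { return f(i); }
}

void main()
{
    // A trillion "elements", zero storage for them.
    auto squares = Generated!(i => i * i)(1_000_000_000_000);
    assert(squares[1_000_000] == 1_000_000_000_000);
}
```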

February 12, 2022
On Sat, Feb 12, 2022 at 06:41:14PM +0000, Era Scarecrow via Digitalmars-d-learn wrote:
> On Thursday, 10 February 2022 at 01:43:54 UTC, H. S. Teoh wrote:
> > On Thu, Feb 10, 2022 at 01:32:00AM +0000, MichaelBi via Digitalmars-d-learn wrote:
> > > thanks, very helpful! i am using a assocArray now...
> > 
> > Are you sure that's what you need?
> 
> Depends. If you use, say, TYPE[long] or TYPE[int], you effectively have a sparse array; if there are only a few entries (*say, a hundred million or so*) it will probably be fine.
> 
> Depending on what you're storing (say, a few bits per entry), you could probably use a BitArray to pack the values. Still, 10^12 entries would take up....  119 GB? That won't work. I wonder how the 25 GB figure was calculated.
[...]

That was not my point. My point was to question whether the OP has discovered the insight that would allow him to accomplish his task with a LOT less space than the naïve approach of storing everything in a gigantic array.  Substituting an AA for a gigantic array matters little as long as the basic approach remains the same -- you have only modified the implementation details but the algorithm is still a (highly) suboptimal one.


--T
February 14, 2022
On Saturday, 12 February 2022 at 20:31:29 UTC, H. S. Teoh wrote:
> On Sat, Feb 12, 2022 at 06:41:14PM +0000, Era Scarecrow via Digitalmars-d-learn wrote:
>> [...]
> [...]
>
> That was not my point. My point was to question whether the OP has discovered the insight that would allow him to accomplish his task with a LOT less space than the naïve approach of storing everything in a gigantic array.  Substituting an AA for a gigantic array matters little as long as the basic approach remains the same -- you have only modified the implementation details but the algorithm is still a (highly) suboptimal one.
>
>
> --T

thanks, you are all correct. I just changed the algorithm and used an AA; previously I was using the naïve method... :) Now solved perfectly. Thanks again.
February 14, 2022
On Monday, 14 February 2022 at 13:20:45 UTC, MichaelBi wrote:
> thanks, you are all correct. I just changed the algorithm and used an AA; previously I was using the naïve method... :) Now solved perfectly. Thanks again.

You could have used a normal int array for this task too: you're dealing with counts, so an array of 9 elements (indices 0..8) would work; move the data circularly and add the new lives whenever a timer hits -1.
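
If this is the lanternfish-style puzzle that the 256 days suggests (my assumption: a timer at 0 resets to 6 and spawns a new timer at 8), that circular-counting idea looks roughly like this in D:

```d
import std.algorithm : sum;
import std.stdio : writeln;

void main()
{
    // counts[t] = how many creatures currently have timer value t;
    // nine ulongs replace any per-creature storage.
    ulong[9] counts;
    foreach (t; [3, 4, 3, 1, 2])   // hypothetical starting timers
        counts[t]++;

    foreach (day; 0 .. 256)
    {
        immutable spawning = counts[0];      // the timers hitting "-1"
        foreach (i; 1 .. 9)
            counts[i - 1] = counts[i];       // rotate everyone down a day
        counts[6] += spawning;               // parents reset to 6
        counts[8]  = spawning;               // offspring start at 8
    }

    writeln(counts[].sum);   // population after 256 days
}
```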

I'm pretty sure there is a mathematical way to solve this, I mean without any arrays, but I didn't bother with it since even my basic JavaScript implementation runs in less than 1 ms for 256 days on an old computer.

Matheus.