Posted by rikki cattermole in reply to Ola Fosheim Grøstad
|
On 21/01/2022 11:58 PM, Ola Fosheim Grøstad wrote:
> It would work for D, I guess.
Not likely; it would need a lot of work to bring that up to today's requirements.
For context:
A heap size of less than 1 megabyte may work very well. Duplicating a heap of
100,000 bytes costs only about 2% extra CPU time on my single-processor personal
computer from 2002-2003 running a blocksize of 128.
(Note that a more common blocksize of 1024 would yield a CPU overhead of 1/8 of 2%.)
Furthermore, since today's computers are many times faster than
this, especially those with 4 or 8 cores (which the garbage collector
is able to fully utilize; well, at least that's the plan), the number
of bytes in the heap can probably be multiplied many times.
As long as the heap is reasonably small, which it probably should be while doing signal-processing
tasks, duplicating the entire heap should not be a costly operation on current personal
computers. On a multiprocessor machine, using heaps of many megabytes should not
be a problem. I've also successfully tried using a 20 MB heap on a dual-core Intel machine.