May 13, 2005
If I may, I do have one question:

> Capped static array size at 16Mb.

Why?  I realize that having larger arrays is sometimes bad and makes the length specifiers harder to work with (which only seems like a problem with dynamic arrays to me), but I imagine there's some bigger reason I'm missing?
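For concreteness, here's roughly the distinction I have in mind -- just a sketch, with the sizes picked arbitrarily:

// Static array: the length is part of the type and the storage is laid out
// at compile time.  Anything over the new 16Mb cap should now be rejected.
ubyte[32 * 1024 * 1024] table;                  // 32Mb -- over the cap

// Dynamic array: length chosen at run time, storage comes from the GC heap.
ubyte[] buffer = new ubyte[32 * 1024 * 1024];   // 32Mb -- unaffected by the cap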

Thanks,
-[Unknown]


> The usual <g>.
> 
> http://www.digitalmars.com/d/changelog.html
> 
> 
> 
May 13, 2005
"Unknown W. Brackets" <unknown@simplemachines.org> wrote in message news:d61jpa$1m0l$1@digitaldaemon.com...
>  > Capped static array size at 16Mb.
> Why?  I realize that having larger arrays is sometimes bad and makes the
> length specifiers harder to work with (which only seems like a problem with
> dynamic arrays to me), but I imagine there's some bigger reason I'm missing?

1) Gigantic static arrays are often either the result of a typo or a newbie
mistake.
2) Such arrays require a lot of memory for the compiler to handle. Before the
OS officially runs out of memory, it goes to greater and greater lengths to
scavenge memory for the compiler, often bringing the computer to its knees
in desperation.
3) D needs to be a portable language, and by capping the array size a
program is more likely to be portable.
4) Giant arrays are reflected in a correspondingly giant size for the exe
file.
5) There simply isn't a need I can think of for such arrays. There shouldn't
be a problem with allocating them dynamically.
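To put a concrete (if arbitrary) example behind point 5 -- a minimal sketch of the dynamic alternative:

// Instead of a 64Mb static array, which the cap now rejects:
//     int[16 * 1024 * 1024] table;        // 64Mb of static data
// allocate it at run time:
int[] table = new int[16 * 1024 * 1024];   // 64Mb on the GC heap, zero-initialized
table[5] = 42;                             // used exactly like the static version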


May 13, 2005
Wouldn't it be safer to have a similar limit for dynamic arrays too?

I've caught some bugs in my own code by adding an assertion to my array class's resize. I've limited it to 0x100000 items.
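Something like this -- a sketch only; the names and the exact cap are just what I happen to use:

// A wrapper whose resize refuses absurd lengths.
class Array(T)
{
    const size_t MAX_ITEMS = 0x100000;   // arbitrary sanity cap

    private T[] data;

    void resize(size_t newLength)
    {
        assert(newLength <= MAX_ITEMS);  // catches typos and runaway growth early
        data.length = newLength;
    }

    size_t length() { return data.length; }
}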

1) resizing big dynamic arrays will be extremely slow;
2) one would be better off using direct OS functions (VirtualAlloc) for really
large arrays (see the sketch below);
3) again, it catches 'newbie' mistakes and typos;
4) 'zeroing' the data will be expensive.
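For (2), a rough Win32 sketch of what I mean (prototypes and constants written out by hand so it stands alone; error handling kept minimal):

extern (Windows)
{
    void* VirtualAlloc(void* lpAddress, size_t dwSize, uint flAllocationType,
                       uint flProtect);
    int   VirtualFree(void* lpAddress, size_t dwSize, uint dwFreeType);
}

const uint MEM_COMMIT     = 0x1000;
const uint MEM_RESERVE    = 0x2000;
const uint MEM_RELEASE    = 0x8000;
const uint PAGE_READWRITE = 0x04;

void main()
{
    size_t n = 256 * 1024 * 1024;            // 256Mb -- too big to want on the GC heap
    void* p = VirtualAlloc(null, n, MEM_COMMIT | MEM_RESERVE, PAGE_READWRITE);
    assert(p != null);

    ubyte[] buf = (cast(ubyte*) p)[0 .. n];  // slice over the raw block, no copying
    buf[0] = 1;                              // use it like any other array

    VirtualFree(p, 0, MEM_RELEASE);
}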

See also: http://www.digitalmars.com/drn-bin/wwwnews?digitalmars.D/15486

L.


May 13, 2005
What you're saying makes sense in practice /right now/, but what if a research team 3 years from now wants to use D (because it's so sweet) on some supercomputer, modeling planetary physics, and decides they want (for some reason unknown to us, but one that makes perfect sense in their model) a huge static array? That might be a stretch scenario, but only because it hasn't happened yet.

It's just always seemed better to me to let the programmer learn from their own mistakes rather than hold their hand and say "no", especially with something like memory, which has a tendency to grow quite rapidly.

I'm just suggesting that D look to be desirable now & 20 years in the future. That requires forward-thinking in its design.

My $0.02.

-Kramer


In article <d61n7n$1pp3$1@digitaldaemon.com>, Walter says...
>
>
>"Unknown W. Brackets" <unknown@simplemachines.org> wrote in message news:d61jpa$1m0l$1@digitaldaemon.com...
>>  > Capped static array size at 16Mb.
>> Why?  I realize that having larger arrays is sometimes bad and makes the
>> length specifiers harder to work with (which only seems like a problem with
>> dynamic arrays to me), but I imagine there's some bigger reason I'm missing?
>
>1) Gigantic static arrays are often either the result of a typo or a newbie
>mistake.
>2) Such arrays require a lot of memory for the compiler to handle. Before the
>OS officially runs out of memory, it goes to greater and greater lengths to
>scavenge memory for the compiler, often bringing the computer to its knees
>in desperation.
>3) D needs to be a portable language, and by capping the array size a
>program is more likely to be portable.
>4) Giant arrays are reflected in a correspondingly giant size for the exe
>file.
>5) There simply isn't a need I can think of for such arrays. There shouldn't
>be a problem with allocating them dynamically.
>
>


May 13, 2005
"Kramer" <Kramer_member@pathlink.com> wrote in message news:d62k88$2fr1$1@digitaldaemon.com...
> What you're saying makes sense in practice /right now/, but what if a
> research team 3 years from now wants to use D (because it's so sweet) on
> some supercomputer, modeling planetary physics, and decides they want (for
> some reason unknown to us, but one that makes perfect sense in their model)
> a huge static array? That might be a stretch scenario, but only because it
> hasn't happened yet.
>
> It's just always seemed better to me to let the programmer learn from their
> own mistakes rather than hold their hand and say "no", especially with
> something like memory, which has a tendency to grow quite rapidly.
>
> I'm just suggesting that D look to be desirable now & 20 years in the
> future. That requires forward-thinking in its design.

I don't think putting a limit on it is really closing such a door. It's easy for a vendor to up the cap.


May 13, 2005
On Fri, 13 May 2005 12:22:32 -0400, Kramer <Kramer_member@pathlink.com> wrote:

> What you're saying makes sense in practice /right now/, but what if a
> research team 3 years from now wants to use D (because it's so sweet) on
> some supercomputer, modeling planetary physics, and decides they want (for
> some reason unknown to us, but one that makes perfect sense in their model)
> a huge static array? That might be a stretch scenario, but only because it
> hasn't happened yet.

How about having a documented pragma that allows the limit to be changed..
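Purely to illustrate the idea (nothing like this exists in DMD; the pragma name and argument are invented):

// Hypothetical, not implemented: let a module raise the cap explicitly.
pragma(static_array_limit, 256 * 1024 * 1024);

double[32 * 1024 * 1024] grid;   // 256Mb -- allowed only because the cap was raised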
May 13, 2005
So then that sounds like it's more of a DMD implementation detail rather than a design decision in the language.  That's fair. <g>

-Kramer

In article <d62mo5$2hnk$1@digitaldaemon.com>, Walter says...
>
>
>"Kramer" <Kramer_member@pathlink.com> wrote in message news:d62k88$2fr1$1@digitaldaemon.com...
>> What you're saying makes sense in practice /right now/, but what if a
>> research team 3 years from now wants to use D (because it's so sweet) on
>> some supercomputer, modeling planetary physics, and decides they want (for
>> some reason unknown to us, but one that makes perfect sense in their model)
>> a huge static array? That might be a stretch scenario, but only because it
>> hasn't happened yet.
>>
>> It's just always seemed better to me to let the programmer learn from their
>> own mistakes rather than hold their hand and say "no", especially with
>> something like memory, which has a tendency to grow quite rapidly.
>>
>> I'm just suggesting that D look to be desirable now & 20 years in the
>> future. That requires forward-thinking in its design.
>
>I don't think putting a limit on it is really closing such a door. It's easy for a vendor to up the cap.
>
>


May 13, 2005
"Vathix" <vathix@dprogramming.com> wrote in message news:op.sqp5xdcwkcck4r@esi...
> On Fri, 13 May 2005 12:22:32 -0400, Kramer <Kramer_member@pathlink.com> wrote:
>
> > What you're saying makes sense in practice /right now/, but what if a
> > research team 3 years from now wants to use D (because it's so sweet) on
> > some supercomputer, modeling planetary physics, and decides they want (for
> > some reason unknown to us, but one that makes perfect sense in their model)
> > a huge static array? That might be a stretch scenario, but only because it
> > hasn't happened yet.
>
> How about having a documented pragma that allows the limit to be changed..

I understand how it feels like a limitation, but it really isn't.


May 14, 2005
[followup set to digitalmars.D.bugs]

Shawn Liu wrote on Thu, 12 May 2005 03:03:09 +0000 (UTC):
> In article <d5tvdk$1p7k$1@digitaldaemon.com>, Walter says...
>
> I have detected only one thing so far.
> When an "auto" keyword is found in a function, the compiler complains that "no
> return at end of function".

Do you have any sample code?

Thomas

