December 16, 2005
>
>I agree this can happen - and it does happen with every language. The flip side is that if D doesn't constantly improve as a language, it will die. Show me a language that is unchanging and I'll show you a dead language. D has no choice but to move forward.
>
>That said, there is good reason to fork D once 1.0 is set - one fork will be purely bug fixes; the language improvements go into the other fork.
>
>

This admission alone suggests that D should turn 1.0 soon. I'm certainly for it.

-JJR


December 16, 2005
> ...Most of the engines you mentioned are nowhere near production quality, although I
> think bits like the Open Dynamics Engine can save a company a LOT of money.

True, though Torque was used to create Tribes (yeah, that code is a mess), and Dave Eberly worked for NetImmerse (which is now Gamebryo, I believe), so I imagine his design and coding style would be similar since he was the lead engineer there. Also, OSG is used quite a bit in the VIS/SIM industry for production, though not as much for games. But Pirates of the XXI Century is using OSG:
http://www.openscenegraph.org/osgwiki/pmwiki.php/Screenshots/DIOSoftPirates

But alas, I don't quite have enough money for something like the Unreal Engine or Gamebryo.
December 16, 2005
clayasaurus wrote:
> Hrm.. well it isn't that bad. You can help out the Sinbad folks once that gets rolling ( http://www.dsource.org/projects/sinbad/ ).
> 
> Also, you don't necessarily have to convert all C++ code to D, you can create a C++ --> C --> D wrapper.

Yeah, I've looked at Sinbad and was somewhat interested in it, but then I switched from OGRE to OSG for various reasons. But if the Sinbad team really does do a rewrite, I'd be up for that. Wrappers for some reason rub me the wrong way (but I write them when I must).
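
For what it's worth, here is a minimal sketch of the D side of such a C++ --> C --> D wrapper (hypothetical names, untested), assuming a C shim compiled from the C++ code that flattens a class into extern "C" functions:

# // Declarations for the hypothetical C shim exports:
# extern (C)
# {
#     void* engine_create();
#     void  engine_render(void* handle);
#     void  engine_destroy(void* handle);
# }
#
# // Thin D class over the opaque C++ object:
# class Engine
# {
#     private void* handle;
#
#     this()        { handle = engine_create(); }
#     ~this()       { engine_destroy(handle); }
#     void render() { engine_render(handle); }
# }

The D side stays small; the tedious part is writing and maintaining the C shim itself, which is probably why wrappers rub people the wrong way.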
December 16, 2005
Walter Bright wrote:
> In article <dnoru3$mgr$1@digitaldaemon.com>, Niko Korhonen says...
> 
>>I also would like to see bit arrays and AA's in the standard library instead of the language; IMO D isn't a high-enough-level language for that.
> 
> 
> While I agree that bit arrays in the core language were, in retrospect, probably
> a mistake, they are in the language, are used, and so need to stay.

Whoa, am I reading this right? Isn't that against the philosophy of D? I mean, we complain about C++'s nuisances related to backwards compatibility, yet when we find a feature in D that is regrettable, it's decided that it can't be removed, even though D is still in development?

-- 
Bruno Medeiros - CS/E student
"Certain aspects of D are a pathway to many abilities some consider to be... unnatural."
December 16, 2005
Niko Korhonen wrote:
> Walter Bright wrote:
> 
>> The reason to have the stack allocated objects is that it opens up a large
>> avenue of applications for templates.
> 
> 
> I couldn't help noticing that a very large portion of the recent D feature additions has to do with template (meta)programming.
> 
> Does allowing templates in a language necessarily lead to gradually increasing complexity until the language's template mechanism and supporting features start to look like C++?
> 

We have hindsight on our side, so I hope not.
December 16, 2005
Walter Bright wrote:
> That said, there is good reason to fork D once 1.0 is set - one fork will be
> purely bug fixes; the language improvements go into the other fork.
> 

This is exactly what I wanted to hear, thanks.
December 18, 2005
In article <dnssku$r3g$1@digitaldaemon.com>, Kris says...
>
>
>"Derek Parnell" <derek@psych.ward> wrote in message news:1ltc1ir9rny3r.1vwocglyyu3uu.dlg@40tude.net...
>> On Thu, 15 Dec 2005 10:40:45 -0800, Kris wrote:
>>
>>> Since Walter feels that bit-arrays in the core-language were (in
>>> retrospect)
>>> probably a mistake, I think there's a really good opportunity to heal a
>>> wart.
>>>
>>> Suppose there were a nice library-based bitset ~ just how much effort
>>> would
>>> be needed to clean up now rather than later? Is most of the usage
>>> actually
>>> within Phobos? Would those people who actually use bit-arrays kick and
>>> scream all the way to hell and back?
>>>
>>> How about it?
>>
>> I don't use bit arrays, but I do use bool associative arrays.
>
>
>me too, but I think that's implemented like a byte-AA instead? That is, I doubt it would be impacted.

I've used bit arrays where in C I would use bit fields. (But IMHO, the C standard did the wrong thing by allowing the order of fields in bit fields to become endian dependent.)

I tried implementing a BitArray class in D, just to see what would turn up.

This is how it works:

BitArray - bit[] replacement
StaticBitArray!(#) - bit[#] replacement
BitField!(member1,size1,member2,size2,...) - C bit field replacement

BitArray supports unaligned slices, concatenation etc...

# StaticBitArray!(13) arr;             // 13 bits pack into two bytes
# assert(arr.sizeof == 2);
# arr[2] = 1;
# BitArray t = arr[1..3] ~ arr[5..9];  // unaligned slices + concatenation
#
# foreach(inout bit b; t)              // flip every bit in place
#   b = ~b;

BitField also supports named fields:

# typedef void ready, write_protected, loaded, error_code, command;
# BitField!(
#   ready, 1,
#   write_protected, 1,
#   loaded, 1,
#   error_code, 8,
#   command, 5                     // 1+1+1+8+5 = 16 bits total
# ) status;
#
# assert(status.sizeof == 2);      // 16 bits = 2 bytes
#
# status.ref!(error_code)() = 137;
# status.ref!(command)() = 3;
# status.ref!(ready)() = 1;
#
# BitArray something = status.ref!(error_code)() ~ status.ref!(command)();

I'm not sure why dmd requires () after templated member functions...

I can post the code if anyone is interested.

/Oskar


December 19, 2005
"Niko Korhonen" <niktheblak@hotmail.com> wrote in message news:dntrhf$1g0e$1@digitaldaemon.com...
> Walter Bright wrote:
>> The reason to have the stack allocated objects is that it opens up a
>> large
>> avenue of applications for templates.
>
> I couldn't help noticing that a very large portion of the recent D feature additions has to do with template (meta)programming.
>
> Does allowing templates in a language necessarily lead to gradually increasing complexity until the language's template mechanism and supporting features start to look like C++?

That's a very legitimate concern. But I think what is wrong with C++ templates is not the idea, but the expression of the idea. C++ templates broke a lot of new ground, and just like early airplanes, once we know where we want to get to, we can devise a better design.


January 02, 2006
For some months I have had a thought... and I have just found the answer.

>>>> Since Walter feels that bit-arrays in the core-language were (in
>>>> retrospect)
>>>> probably a mistake, I think there's a really good opportunity to heal a
>>>> wart.

Yes. Bit-arrays should not be embedded into the language itself.

There are a lot of problems with the built-in bit type:
- Returning a bit from a function, bit-type out parameters, and so on.
- Bit arrays are not able to replace C-style bit fields.
- Bit arrays cannot be used as normal sets; the basic operations (in, or,
and, xor) are missing.

And finally I found some posts in the D newsgroups that explained to me that most of these problems can be solved elegantly with templates/mixins in a convenient library, as sketched below.
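
For example, a minimal sketch of what such a library type might look like (hypothetical and untested; a real version would use uint[] storage and a template parameter for the size, rather than a fixed 32 bits):

# struct BitSet32
# {
#     uint bits;   // backing storage for up to 32 members
#
#     // The set operations missing from the built-in bit arrays:
#     BitSet32 opAnd(BitSet32 o) { BitSet32 r; r.bits = bits & o.bits; return r; }
#     BitSet32 opOr (BitSet32 o) { BitSet32 r; r.bits = bits | o.bits; return r; }
#     BitSet32 opXor(BitSet32 o) { BitSet32 r; r.bits = bits ^ o.bits; return r; }
#
#     // Membership test, enabling the (n in set) syntax:
#     bool opIn_r(uint n) { return (bits & (1u << n)) != 0; }
# }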

>I've used bit arrays where I in C would use bit fields. (but IMHO, the C
>standard has done the wrong thing by allowing the order of fields in bit fields
>to become endian dependent)
>...
>I can post the code if anyone is interested.

I am interested. Please send a link if possible.

Thanks,
Tamas Nagy


January 23, 2006

Niko Korhonen wrote:

...

> Each time you fork a language construct in two, you get two schools of thought. These two schools of thought will wage war on each other indefinitely. This is exactly what happened with C++ (in an extreme manner), and nowadays every C++ programmer knows and uses a different subset of the language and wages war with the other programmers.
> 
> Consider:
> 
> int x = 5;
> int x(5);
> 
> Which is better?
> 
> MyClass a;
> MyClass a = MyClass();
> 
> Again, which is better?
> 
> void func(const MyClass& myClass);
> void func(const MyClass* myClass);
> 
> Again, which is better?
> 
> C++ has created millions of ways to do the same thing, i.e. the language is full of redundant constructs. A couple of recent additions to D (type inference and stack allocation) have forked/will fork the language in two; now we have to choose between:
> 
> auto c = new MyClass();
> MyClass c = new MyClass(); // Which is better?
> 
> and soon:
> 
> auto c = MyClass();
> auto c = new MyClass();
> 
> People are bound to create Coding Guidelines for D with one saying 'Always use type inference' and the other saying 'Never use type inference'. I've read three different company internal style guides for C++ and all of them are completely different. I don't want the same to happen to D.
> 
> The point that I'm desperately (and rather poorly, come to think of it) trying to make is that we should keep the number of language constructs to a bare minimum and absolutely ban any redundant constructs. This helps keep the language shareable, clean, easy to understand, and easy to parse.
> 
> Whenever there is a choice between language constructs, programmers will fight about which one is better. The only historically proven way to alleviate this issue is to reduce the number of choices. The C++ gurus' standard answer of 'educating the programmers better' hasn't worked in the real world.
> 
> If and only if there are no choices between language constructs to fight about will programmers not fight about them. Otherwise they will.

I tend to agree.

Parallel ways of doing things confuse everyone who starts with the language. They create war camps and religions. (The more trivial the difference, the harder the fight.)

Later, when more changes get into the language, one often finds that one of the parallel ways is changed and the other not ("hey, we have these two ways, let's change only one of them, thus creating more choice for the programmer!"), leading ultimately to additional confusion for newcomers, to not-well-thought-out features, and to a loss of coherency and focus.

Language design should remember to prune and not just add.

"A language is like a garden. Let it grow without weeding, and you'll end up using a machete just to get through."