June 10, 2019
On Monday, 10 June 2019 at 08:33:55 UTC, Ola Fosheim Grøstad wrote:
> struct A { char a; /* 3 unused bytes */ double b; }
> struct B : A { char c; }
>
> Should result in { char a; char c; /*3 unused bytes*/ double b;}

Err, that should be:

Should result in { char a; char c; /*2 unused bytes*/ double b;}
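
In today's D the flattened layout can be sketched and checked directly (just an illustration; `Flattened` is a made-up name, and `align(4)` is used to reproduce the 4-byte double alignment assumed in the example, since double.alignof is typically 8 on 64-bit targets):

    struct Flattened
    {
        char a;            // offset 0
        char c;            // offset 1 -- fills one byte of A's padding
        // 2 unused bytes here
        align(4) double b; // offset 4, matching the alignment assumed above
    }

    static assert(Flattened.c.offsetof == 1);
    static assert(Flattened.b.offsetof == 4);
    static assert(Flattened.sizeof == 12);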


June 10, 2019
On 6/10/19 4:03 AM, Manu wrote:
> On Mon, Jun 10, 2019 at 12:30 AM Andrei Alexandrescu via Digitalmars-d
> <digitalmars-d@puremagic.com> wrote:
>>
>> On 6/9/19 5:13 PM, Manu wrote:
>>> This is NOT polymorphism, we're not talking about polymorphism, I wish
>>> people would not change the topic.
>>
>> The problem here is that it's difficult to define subtyping without
>> polymorphism. C++ does technically allow it, but code such as:
>>
>> struct my_vector : public std::vector<int> { ... }
>>
>> is universally reviled. I really think it wouldn't help D much to add
>> struct inheritance.
> 
> We do it anyway though, it's just terrible.
> How will it cause any harm to anybody?

The same way it causes harm in C++. Crack open literally any C++ introductory book to find the explanation.

> I hate this idea that something
> has been shown to be sorely missed for so long, and we still can't
> reconsider it because some moron out there might write bad code?
> D almost exclusively attracts power-users, who have unbelievably
> complex software to get on with. Tools are good, especially tools like
> this which only make a shitty piece of existing language much cleaner
> and simpler to reason about.

Nonsense. Again: the feature exists in C++ and is explicitly forbidden by all coding standards. If they could turn back time, they wouldn't have it. So... why would we now imitate it along with its mistakes? If you have a great idea on how to improve on it, sure. Otherwise, just don't.
June 10, 2019
On Monday, 10 June 2019 at 14:42:41 UTC, Andrei Alexandrescu wrote:
> Nonsense. Again: the feature exists in C++ and is explicitly forbidden by all coding standards. If they could turn back time, they wouldn't have it. So... why would we now imitate it along with its mistakes? If you have a great idea on how to improve on it, sure. Otherwise, just don't.

Err... that claim makes no sense to me.

Just look at the implementation of the C++ standard library: inheritance is used all over the place!  With multiple inheritance, even. For good effect, actually. (Although I seldom use multiple inheritance myself.)

C++ makes no real distinction between class and "struct"; the slight difference (default public access) is only there for compatibility with C.

Do you really think that Bjarne Stroustrup would remove inheritance from C++?

That would be truly shocking…

June 10, 2019
On Monday, 10 June 2019 at 14:54:06 UTC, Ola Fosheim Grøstad wrote:
> On Monday, 10 June 2019 at 14:42:41 UTC, Andrei Alexandrescu wrote:
>> Nonsense. Again: the feature exists in C++ and is explicitly forbidden by all coding standards. If they could turn back time, they wouldn't have it. So... why would we now imitate it along with its mistakes? If you have a great idea on how to improve on it, sure. Otherwise, just don't.
>
> Err... that claim makes no sense to me.
>
> Just look at the implementation of the C++ standard library: inheritance is used all over the place!  With multiple inheritance, even. For good effect, actually. (Although I seldom use multiple inheritance myself.)

IMO, that's a fallacy. There are constraints on a language's standard library that don't exist for user code, such that sometimes weird stuff has to be done to avoid or limit breaking changes (I'm not saying this necessarily *is* the case for C++'s stdlib; I'm not familiar enough with the implementation to comment).

Just look at all the outdated and suboptimal stuff in Phobos.
June 10, 2019
On Monday, 10 June 2019 at 16:44:35 UTC, Meta wrote:
> IMO, that's a fallacy. There are constraints on a language's standard library that don't exist for user code, such that sometimes weird stuff has to be done to avoid or limit breaking changes

I don't think so. Anyway, C++ is already a language where the demand for proficiency is high. So it doesn't matter; inheritance is the least problematic aspect of the language… Well, actually, inheritance is the sole purpose for the language's existence… so it would be very odd if that were a regret. Especially since Bjarne deeply respects Kristen Nygaard and Ole-Johan Dahl, based on what he has said in interviews.

Besides, it is reasonable to require that people who write libraries be skilled. Generally, you need at least some experts to provide, and sustain, a solid library in any language, IMHO.

Maybe it is reasonable to make a separation between features for framework code and features for application/user-interface code, rather than drawing the line at @safe or not.

> Just look at all the outdated and suboptimal stuff in Phobos.

Ok, but I think that with any language that currently provides metaprogramming, the implementation of a standard library will look messy. How to do it really well, in terms of legibility, is simply not known at this point.

I think you need separate layers for API specification and implementation to get around that, and to get that kind of shift we will have to wait for a decade of comp-sci research to be done, i.e. a shift towards more constraint-based library programming. I don't think it is possible to do this now in a way that scales, except maybe for embedded programming with small code bases.


June 10, 2019
On 6/10/2019 7:42 AM, Andrei Alexandrescu wrote:
> The same way it causes harm in C++. Crack open literally any C++ introductory book to find the explanation.

Some reasons:

https://stackoverflow.com/questions/4353203/thou-shalt-not-inherit-from-stdvector

https://stackoverflow.com/questions/2034916/is-it-okay-to-inherit-implementation-from-stl-containers-rather-than-delegate

Scott Meyers' "Effective C++ 3rd Edition" points out more failings in "Item 38: Model 'has-a' or 'is-implemented-in-terms-of' through composition".
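
In D terms, the composition advice from that item looks roughly like this (a sketch, not taken from the book; `Stack` is just an illustrative name):

    // "is-implemented-in-terms-of" via composition rather than inheritance
    struct Stack(T)
    {
        private T[] store;  // has-a, not is-a

        void push(T v) { store ~= v; }

        T pop()
        {
            auto v = store[$ - 1];
            store = store[0 .. $ - 1];
            return v;
        }

        bool empty() const { return store.length == 0; }
    }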
June 10, 2019
On Monday, 10 June 2019 at 21:15:52 UTC, Walter Bright wrote:
> On 6/10/2019 7:42 AM, Andrei Alexandrescu wrote:
>> The same way it causes harm in C++. Crack open literally any C++ introductory book to find the explanation.
>
> Some reasons:
>
> https://stackoverflow.com/questions/4353203/thou-shalt-not-inherit-from-stdvector

The highly rated answers actually state that there is no problem with inheriting from std::vector…

> https://stackoverflow.com/questions/2034916/is-it-okay-to-inherit-implementation-from-stl-containers-rather-than-delegate

The most highly rated answer says that there is no problem with inheriting from std::vector if you don't rely on having a virtual destructor…

So, not the best source for building an argument.

June 10, 2019
On Monday, 10 June 2019 at 21:59:51 UTC, Ola Fosheim Grøstad wrote:
> On Monday, 10 June 2019 at 21:15:52 UTC, Walter Bright wrote:
>> On 6/10/2019 7:42 AM, Andrei Alexandrescu wrote:
>>> The same way it causes harm in C++. Crack open literally any C++ introductory book to find the explanation.
>>
>> Some reasons:
>>
>> https://stackoverflow.com/questions/4353203/thou-shalt-not-inherit-from-stdvector
>
> The highly rated answers actually state that there is no problem with inheriting from std::vector…
>
>> https://stackoverflow.com/questions/2034916/is-it-okay-to-inherit-implementation-from-stl-containers-rather-than-delegate
>
> The most highly rated answer says that there is no problem with inheriting from std::vector if you don't rely on having a virtual destructor…
>
> So, not the best source for building an argument.

Just because the code is semantically correct or "safe" doesn't mean it's good code. Inheriting from a container that you either aren't directly extending or don't own is a huge code smell, in my view.
June 10, 2019
On Mon, Jun 10, 2019 at 2:55 AM Walter Bright via Digitalmars-d <digitalmars-d@puremagic.com> wrote:
>
> On 6/9/2019 3:03 PM, Manu wrote:
> > I can construct a misuse failure for practically any language feature. It's possible we could identify some common-sense restrictions here to inhibit the worst of them, but the current restriction is anti-productive; the line in the sand is drawn incorrectly.
>
> Everyone, and I mean everyone, who misuses a feature is thoroughly convinced their use is productive. For an ancient example:
>
>      #define BEGIN {
>      #define END }
>
> Yep. That. :-) And it's not remotely the worst of the preprocessor abuses I've seen, all of which are stoutly defended.

That's absolutely nothing like using struct inheritance to *do struct
inheritance*.
Using `alias this` to do struct inheritance is a much closer analogy
to your example above; i.e., using a feature to do something that people
may not expect and/or could be expressed a whole lot more cleanly or
directly.
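
For the record, the emulation in question looks roughly like this (a sketch only; the names are made up):

    struct Base
    {
        int id;
        int twice() { return id * 2; }
    }

    // What we write today to get "struct inheritance":
    struct Thing
    {
        Base base;
        alias base this; // forwards Base's members, allows implicit "upcast"
        string name;
    }

    // versus the hypothetical: struct Thing : Base { string name; }

    unittest
    {
        Thing t;
        t.id = 21;               // forwarded to t.base.id
        assert(t.twice() == 42); // forwarded to t.base.twice()
        Base b = t;              // implicit conversion to the "base"
        assert(b.id == 21);
    }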

> > For contrast, I've been arguing on bug reports recently that people
> > think interactions with uninitialised unions (or =void initialised
> > code) are @safe... 🤯🤯🤯
> > which just shows how astonishingly arbitrary this shit is.
>
> Uninitialized non-pointers are @safe, because @safe refers to memory safety, not "no bugs". Perhaps I was wrong to define it that way, but it is deliberate, and not arbitrary.

Uninitialised data, which is guaranteed to have no valid state, and has absolutely no chance of proper program execution under any circumstance, and may even be leaking internal/private state(!), is not safe. You can't convince me otherwise.
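
To make this concrete, here is the kind of code in question, which compiles as @safe today (a minimal sketch; the function names are made up, and the exact acceptance rules may vary between compiler releases):

    @safe int fromUninitialised()
    {
        int x = void; // deliberately uninitialised; allowed because int
                      // contains no pointers
        return x;     // accepted as @safe, but the value is whatever
                      // happened to be in that stack slot
    }

    @safe float reinterpret()
    {
        union U { int i; float f; }
        U u;
        u.i = 0x3F80_0000; // bit pattern of 1.0f
        return u.f;   // reading a different member than was written is
                      // also accepted in @safe for non-pointer fields
    }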

It is arbitrary. Almost everything comes down to "I like this", or "I
don't like this", and the noise in the middle is mostly a waste of our
time.
Occasionally argument improves outcomes, but for the most part, we're
all just wasting time here.

I'm 100% certain that it's crazy to allow accessing uninitialised memory freely without any compiler intervention in allegedly @safe code. Program error is practically guaranteed, and you have absolutely no idea how that error may manifest (it might even violate memory safety). But there's nothing I can do to convince you of that. Meanwhile, you'll suggest I'm crazy for wanting to use struct inheritance to do struct inheritance rather than ugly hacks to emulate it...

If this isn't all just completely arbitrary, I don't know what is. ...but let's not get off topic ;)

June 10, 2019
On Mon, Jun 10, 2019 at 7:45 AM Andrei Alexandrescu via Digitalmars-d <digitalmars-d@puremagic.com> wrote:
>
> On 6/10/19 4:03 AM, Manu wrote:
> > On Mon, Jun 10, 2019 at 12:30 AM Andrei Alexandrescu via Digitalmars-d <digitalmars-d@puremagic.com> wrote:
> >>
> >> On 6/9/19 5:13 PM, Manu wrote:
> >>> This is NOT polymorphism, we're not talking about polymorphism, I wish people would not change the topic.
> >>
> >> The problem here is that it's difficult to define subtyping without polymorphism. C++ does technically allow it, but code such as:
> >>
> >> struct my_vector : public std::vector<int> { ... }
> >>
> >> is universally reviled. I really think it wouldn't help D much to add struct inheritance.
> >
> > We do it anyway though, it's just terrible.
> > How will it cause any harm to anybody?
>
> The same way it causes harm in C++. Crack open literally any C++ introductory book to find the explanation.

And yet we're doing this anyway... except the code to do it is obscured, and therefore much more sinister, and prone to accidental misuse.

> > I hate this idea that something
> > has been shown to be sorely missed for so long, and we still can't
> > reconsider it because some moron out there might write bad code?
> > D almost exclusively attracts power-users, who have unbelievably
> > complex software to get on with. Tools are good, especially tools like
> > this which only make a shitty piece of existing language much cleaner
> > and simpler to reason about.
>
> Nonsense. Again: the feature exists in C++ and is explicitly forbidden by all coding standards. If they could turn back time, they wouldn't have it. So... why would we now imitate it along with its mistakes? If you have a great idea on how to improve on it, sure. Otherwise, just don't.

I mean, supporting `struct Thing : Base` is the obvious idea to
improve it... but that settles that. I'll desist.
TL;DR, we do it all the time and the world hasn't ended. We will keep
doing it. Saying it's bad will not stop us from doing it.
In C++ I've been doing this for like 20 years, and still never
encountered or imagined any sort of issue. Yes, I understand there is
text that says it's bad; I think it's extremely overrated text.
Our code suffers from being noisy and hard to understand at a glance,
and not playing well with the editor/debugger. These things are all
*actual* bad things, and in my opinion, they are MUCH worse things
than the alleged issue that's never caused me a moment's harm in my
life (or presumably any of my colleagues who work in the same
codebases...)
There is just no way I could find to balance the risk vs reward
differently. Something that interferes with my work every day, vs some
theoretical issue that's never been a problem in my life...