June 11, 2002
Matthew Wilson wrote:

> Why cannot we get rid of delete, in favour of assignment to the reference causing the destructor to be fired at that point rather than at the exit of the declaring scope?

In places where the compiler can determine at compile time that no external
references exist, it could insert an automatic delete there.  However, this has
a few downsides:
    1) If the reference is ever passed to another function, then the compiler
has to track into that function to see if the reference is ever "saved off"
anywhere (see the sketch after this list).  This can be done, but it gets
complex, and may not catch all instances.
    2) Different compilers may detect different conditions, meaning that in some
compilers your object is cleaned up immediately, while others wait for the GC to
find it.  To fix that, you have to define a new standard for their search
algorithm...
    3) As Walter noted on the D website, sometimes you don't want immediate
destruction even if it's possible.  Sometimes it makes more sense to just let
the program run ahead and clean up the garbage later, when you hit an idle time
or you actually run out of memory.  Before then (in some cases) running the
destructor just slows the program down.
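
To make (1) concrete, here is a minimal sketch (hypothetical names, and
assuming D's reference semantics for class objects):

class Foo { }

Foo saved;                    // module-level reference

void maybeSave(Foo f)
{
    saved = f;                // the reference escapes ("saved off")
}

void user()
{
    Foo f = new Foo();
    maybeSave(f);             // the compiler must analyze maybeSave's body
    // inserting an automatic delete here would leave `saved` dangling
}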

IMHO, if you want automatic destruction, you should do something to note that. Either have a keyword that tells the compiler to auto-destruct it, or use different declaration syntax for auto-destructed and garbage-collected objects.
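
For example (purely hypothetical syntax, just to show the two flavours side
by side):

auto Foo f = new Foo();   // keyword form: destroyed at scope exit
Foo g = new Foo();        // plain form: left to the garbage collector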

--
The Villagers are Online! villagersonline.com

.[ (the fox.(quick,brown)) jumped.over(the dog.lazy) ]
.[ (a version.of(English).(precise.more)) is(possible) ]
?[ you want.to(help(develop(it))) ]


June 11, 2002
A keyword is one solution, but it can lead to error. The need for deterministic destruction is a function of the type rather than of its location (in the majority of cases, anyway).

I think it would be better to have something denoting this in the class
definition. It could be some kind of attribute (which stinks a bit), or it
could be the presence of a destructor. There may be some need for late
destruction, in which case you could have both a destructor (~MyClass()) and a
finaliser (Finalise()), but that sucks a bit as well. Perhaps you could have


class MyClass
{
    ...

    deterministic ~MyClass() // deterministic destruction
    {
    }

    ...
}

as opposed to either

class MyClass
{
    ...

    ~MyClass() // garbage-collect time finalisation
    {
    }

    ...
}

and

class MyClass
{
    ...

    // no dtor means nothing to be done at garbage collection

    ...
}
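
Usage would then be something like this (a sketch of the proposed semantics,
not anything D currently promises):

void f()
{
    MyClass a = new MyClass();
    // ... use a ...
}   // deterministic ~MyClass() would run here, at scope exit;
    // a GC-time finaliser would run whenever the collector finds the object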

Does this sound ok?

"Russ Lewis" <spamhole-2001-07-16@deming-os.org> wrote in message news:3D0672F2.794C7225@deming-os.org...


June 12, 2002
Matthew Wilson wrote:

> C# has really surprisingly good performance so, whilst I think D will still win out there, the other "advantages" of C# (it being promulgated by M$, so lots of corporate buy-in) will not sufficiently distinguish D in the minds of "managers". But were D to employ deterministic destructors, the push will come from software engineers (=== people who use C++ ;), since the Java/.NET Finalize model is almost universally unpopular with such

Well, don't forget that C# needs the baggage of 60+ MB (20 MB compressed) of the .NET runtime!  Which in turn needs a bloated OS under it.

Current rumor has it that there are problems getting .NET ported to Microsoft's "other" platform, Mac OS X.

D will be everywhere long before C# and .NET are anywhere other than Windows.

Features being discussed for D make it a strong contender for the embedded market.  Over 85% of all microprocessors sold go into the embedded market.  You probably own over 30 microprocessors yourself, but only one of them is in your PC.  The majority of software applications you interact with on a daily basis do not run on a PC.  They run your microwave, refrigerator, phone, car, TV, remote control, VCR, radio, CD player, monitor, keyboard, mouse, ...

There is more code written for the PC than for embedded systems, but much of that code is never used, and never sees the light of day.  It is not successful code.  A much higher proportion of embedded code actually ships.  There is more code running in embedded systems than in all the world's PCs combined.

Put D where embedded programmers can use it, and it will be everywhere.

At least .NET abstracts the custom code from the bulk.  An application being written at work, the GUI for the embedded product I'm building, is written in C#.  It is a large, complex program.  The beta I got today fits on a floppy (though I didn't put it on one, of course).

MFC is dead.  Microsoft killed it with .NET.  Thank God I never learned MFC or the underlying Windows API.  And VS.NET may even wind up killing the beast that is C++.  (C++ never made much of a dent in the embedded world.)

Remember, the QNX folks put an entire real-time PC Web server on a floppy, including the OS and networking stacks, with no need for 60 MB of support code and an OS that now consumes nearly a gigabyte of disk.

These are the opposite extremes of the PC application world.

I think we'll do even better with D: bigger and better apps that go fast and are also fast to code and debug.  This is what comes when the language is targeted at the happy middle between power, features, and simplicity.

C# does that with its syntax.  D does it with syntax and object code.

In that part of the world where Windows is running on multi-gigahertz processors with 100 GB disks, C# is tough to beat.

That leaves the rest of the world for D.

Aim there.


-BobC


June 12, 2002
Once again, I agree almost entirely. Where I differ is that I am a little more circumspect wrt what may eventuate, since I believe we should not underestimate the commercial might of M$ (clearly one is unable to underestimate their technical might :).

My original point stands in concert with yours. We should aim to have D be all that M$'s offspring are not: small, efficient, advanced but not bloated, answering the needs of developers and compiler walters equally. That means embedded, powerful syntax, proper dtors, and many of the other goodies that have been mentioned lately.

"Robert W. Cunningham" <rcunning@acm.org> wrote in message news:3D06C338.49A0D997@acm.org...


June 12, 2002
btw, Robert, this sounds like another opinion piece for "The D Journal"

"Robert W. Cunningham" <rcunning@acm.org> wrote in message news:3D06C338.49A0D997@acm.org...


June 12, 2002
"Matthew Wilson" <dm@synesis-group.com> wrote in message news:ae6hk3$13h0$1@digitaldaemon.com...
> answering the needs of developers and compiler walters equally.

As a "compiler walter", I agree <g>.


June 12, 2002
9 out of 10 compiler walters agree that D is spiffy.

Sean

"Walter" <walter@digitalmars.com> wrote in message news:ae6omh$1aen$1@digitaldaemon.com...
>
> "Matthew Wilson" <dm@synesis-group.com> wrote in message news:ae6hk3$13h0$1@digitaldaemon.com...
> > answering the needs of developers and compiler walters equally.
>
> As a "compiler walter", I agree <g>.



June 12, 2002
Have I coined a new term? :-)

"Sean L. Palmer" <seanpalmer@earthlink.net> wrote in message news:ae6qql$1cil$1@digitaldaemon.com...
> 9 out of 10 compiler walters agree that D is spiffy.
>
> Sean
>
> "Walter" <walter@digitalmars.com> wrote in message news:ae6omh$1aen$1@digitaldaemon.com...
> >
> > "Matthew Wilson" <dm@synesis-group.com> wrote in message news:ae6hk3$13h0$1@digitaldaemon.com...
> > > answering the needs of developers and compiler walters equally.
> >
> > As a "compiler walter", I agree <g>.
>
>
>


June 12, 2002
"Russ Lewis" <spamhole-2001-07-16@deming-os.org> wrote in message news:3D064768.A1C560DA@deming-os.org...
> Juan Carlos Arevalo Baeza wrote:
>
> >    The problem is in the semantics and potential hidden bugs in the
program.
> > I'm not sure what D specifies will happen if you explicitly "delete" an object (you _could_ explicitly delete an object, couldn't you?) and then
try
> > to use it (for example, if "whatever()" leaves copies of the reference
lying
> > around). Still, I believe Walter's express desire is that such a
dangerous
> > "delete" shouldn't happen behind the programmer's back. Dangerous
statements
> > like that one are necessary to enhance the programmer's arsenal, but
they
> > are indeed problematic when they happen without the programmer's
awareness
> > of it.
>
> Walter has said that you can use "delete" against something with
references
> remaining...but it's undefined behavior.  As I understand it, calling
"delete"
> forces immediate cleanup of the object...including running its
destructors.

Please, please, no Undefined Behaviour in language specs.
If you delete an object, all references to it should magically become null.
Or invalid. Or get a GPF, or ... format the hard disk in a documented
manner.
Anything but the UB.
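
Something like this, say (a sketch of the semantics I am asking for, not what
D specifies; note that it would require the runtime to track every live
reference):

class Foo { }

void main()
{
    Foo a = new Foo();
    Foo b = a;      // second reference to the same object
    delete a;       // destructor (if any) runs now
    // desired: both a and b are now null (or trap on use), never dangling
}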

Sandor



June 12, 2002
I really like the idea of doing this at the class level rather than the usage
level!  However, there are some questions to address:
1) How to enforce that the user doesn't keep extra references to a deterministic
object type?  That is, if you pass a reference to a deterministic object to
another function, how do you enforce that it doesn't save the reference?  I'm
not really comfortable with "caveat emptor".  Somebody suggested that maybe you
shouldn't be able to pass these types of references to other functions...but
then how do you pass an open file handle to another function?  (See the sketch
after this list.)
2) If you include a reference to this type of object in another class, does the
other class have to become deterministic, too, or not?
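
To illustrate question 1, using the hypothetical "deterministic" syntax from
earlier in the thread (all names are made up):

deterministic class File
{
    ~File() { /* close the OS handle */ }
}

File savedFile;              // a global that outlives any one call

void process(File f)
{
    savedFile = f;           // should the compiler reject this line?
}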

--
The Villagers are Online! villagersonline.com

.[ (the fox.(quick,brown)) jumped.over(the dog.lazy) ]
.[ (a version.of(English).(precise.more)) is(possible) ]
?[ you want.to(help(develop(it))) ]