October 23, 2007
On Tue, 23 Oct 2007 07:28:42 -0400, Jascha Wetzel <firstname@mainia.de> wrote:

> Vladimir Panteleev wrote:
>> Except, they're not really as easy to use.
>>  With .NET, you can derive from a class in a compiled assembly without having access to the source. You just add the assembly in the project's dependencies and import the namespace with "using". In C, you must use the included .h files (and .h files are a pain to maintain anyway since you must maintain the declaration and implementation separately, but that's not news to you). You must still use .lib and .di files with D and such - although they can be automated in the build process, it's still a hassle.  Besides that, statically linking in the runtime seems to be a too common practice, as "DLL hell" has been a discouragement for dynamically-linked libraries in the past (side-by-side assemblies is supposed to remedy that though). I guess the fault is not in the DLLs themselves, it's how people and Microsoft used them...
>
> That is correct, but the obvious solution to that problem is to support the OO paradigm in dynamic linking. That is, we don't need a VM, we need DDL.
> Had C++ standardized its ABI, this problem would probably not exist today.

  http://www.codesourcery.com/cxx-abi/
I don't know the whole deal, but I guess some compilers decided not to follow it; I don't even know whether DMC does or not.
October 23, 2007
Bruce Adams wrote:
> 
> An interpreter itself is relatively small. I can only assume that a
> lot of the bloat is down to bad coding. If you look at games these
> days they weigh in at a ridiculous 4Gb install. No amount of
> uncompressed data for performance gain excuses that. 

It's not the code that makes modern games eat up 4Gb of space, it's the textures, animations, 3D models, audio, video cut scenes, etc.  The code is a pretty small part of that.

--bb
October 24, 2007
Vladimir Panteleev wrote:
> With .NET, you can derive from a class in a compiled assembly without
> having access to the source. You just add the assembly in the
> project's dependencies and import the namespace with "using". In C,
> you must use the included .h files (and .h files are a pain to
> maintain anyway since you must maintain the declaration and
> implementation separately, but that's not news to you).

Yes, but that's a language bug, not anything inherent to native compilers.

> You must
> still use .lib and .di files with D and such - although they can be
> automated in the build process, it's still a hassle.

D has the potential to do better, it's just that it's a bit mired in the old school.


> Besides that, statically linking in the runtime seems to be a too
> common practice, as "DLL hell" has been a discouragement for
> dynamically-linked libraries in the past (side-by-side assemblies is
> supposed to remedy that though). I guess the fault is not in the DLLs
> themselves, it's how people and Microsoft used them...

The solution to this is to automatically generate a version for each build of a DLL/shared library. I imagine .NET does the same thing for assemblies.
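On Linux this per-build versioning already exists in the form of sonames: the soname names the binary-compatibility version, the file name the exact release, and the dynamic linker resolves through symlinks. A hedged sketch of the usual layout (the library name libfoo is illustrative):

```shell
# Two releases of the same library; both declare soname libfoo.so.1,
# so they are interchangeable for anything linked against version 1.
gcc -shared -fPIC -Wl,-soname,libfoo.so.1 -o libfoo.so.1.0.0 foo.c
gcc -shared -fPIC -Wl,-soname,libfoo.so.1 -o libfoo.so.1.2.0 foo.c

# The runtime linker loads through the soname symlink, so a compatible
# build can be dropped in without relinking any client:
ln -sf libfoo.so.1.2.0 libfoo.so.1   # runtime name
ln -sf libfoo.so.1     libfoo.so     # link-time name used with -lfoo
```

An incompatible change would bump the soname to libfoo.so.2, letting old and new versions coexist side by side — much like side-by-side assemblies on Windows.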

October 24, 2007
Chris Miller wrote:
>   http://www.codesourcery.com/cxx-abi/
> I don't know the whole deal, but I guess some decided not to go by this; I don't even know if DMC does or not.

DMC++ follows the Microsoft C++ ABI under Windows.
October 24, 2007
serg kovrov wrote:
> Walter Bright wrote:
>> This problem is addressed by DLLs (Windows) and shared libraries (Linux).
> 
> I've wanted to ask for a long time: will the D runtime be available as a dll/so?

Eventually, yes. It just lacks someone working on it.
October 24, 2007
Bill Baxter Wrote:

> Bruce Adams wrote:
> > 
> > An interpreter itself is relatively small. I can only assume that a lot of the bloat is down to bad coding. If you look at games these days they weigh in at a ridiculous 4Gb install. No amount of uncompressed data for performance gain excuses that.
> 
> It's not the code that makes modern games eat up 4Gb of space, it's the textures, animations, 3D models, audio, video cut scenes, etc.  The code is a pretty small part of that.
> 
> --bb

That's partly my point. A lot of that could be achieved programmatically or with better compression. You get map and model files (effectively data structures representing maps and models) that are huge, and hugely inefficient with it, describing low-level details with little or no abstraction. E.g. a pyramid might be made of points rather than recognising a pyramid as an abstraction. Some bright sparks have decided to use XML as their data format. It's only a little bigger and only takes a little extra time to parse. This costs little on a modern machine but can hardly be considered compact and efficient.

October 24, 2007
Bruce Adams wrote:
> Bill Baxter Wrote:
> 
>> Bruce Adams wrote:
>>> An interpreter itself is relatively small. I can only assume that a
>>> lot of the bloat is down to bad coding. If you look at games these
>>> days they weigh in at a ridiculous 4Gb install. No amount of
>>> uncompressed data for performance gain excuses that. 
>> It's not the code that makes modern games eat up 4Gb of space, it's the textures, animations, 3D models, audio, video cut scenes, etc.  The code is a pretty small part of that.
>>
>> --bb
> 
> That's partly my point. A lot of that could be achieved programmatically or with better compression. You get map and model files (effectively data structures representing maps and model) that are huge and hugely inefficient with it, describing low level details with little or no abstraction. E.g. a pyramid might made of points rather than recognising a pyramid as an abstraction. Some bright sparks have decided to use XML as their data format. Its only a little bigger and only takes a little extra time to parse. This costs little on a modern machine but can hardly be considered compact and efficient.
> 

Map and model file formats for most modern games that I know of *do* provide a way to factor out common geometry elements, so you only store one copy of the geometry for a streetlight (say) rather than repeating it for every streetlight in the game world.  Even so, a modern game involves a hell of a lot of content.  That's just the way it is.

I'm not sure how compressed that data is on the hard drive.  It's possible that they could shrink the data significantly with more attention to compression.  However, that probably adversely impacts level loading times which are already long enough (I was playing the latest installment of Half-Life the other day, and seeing approx. 20-30 second load times).  Despite your opinion about uncompressed data for performance's sake, a lot of gamers *would* rather the game take up 4GB of space than add to the load times.

Thanks,
Nathan Reed
October 24, 2007
Nathan Reed Wrote:

> Bruce Adams wrote:
> > Bill Baxter Wrote:
> > 
> >> Bruce Adams wrote:
> >>> An interpreter itself is relatively small. I can only assume that a lot of the bloat is down to bad coding. If you look at games these days they weigh in at a ridiculous 4Gb install. No amount of uncompressed data for performance gain excuses that.
> >> It's not the code that makes modern games eat up 4Gb of space, it's the textures, animations, 3D models, audio, video cut scenes, etc.  The code is a pretty small part of that.
> >>
> >> --bb
> > 
> > That's partly my point. A lot of that could be achieved programmatically or with better compression. You get map and model files (effectively data structures representing maps and model) that are huge and hugely inefficient with it, describing low level details with little or no abstraction. E.g. a pyramid might made of points rather than recognising a pyramid as an abstraction. Some bright sparks have decided to use XML as their data format. Its only a little bigger and only takes a little extra time to parse. This costs little on a modern machine but can hardly be considered compact and efficient.
> > 
> 
> Map and model file formats for most modern games that I know of *do* provide a way to factor out common geometry elements, so you only store one copy of the geometry for a streetlight (say) rather than repeating it for every streetlight in the game world.  Even so, a modern game involves a hell of a lot of content.  That's just the way it is.
> 
> I'm not sure how compressed that data is on the hard drive.  It's possible that they could shrink the data significantly with more attention to compression.  However, that probably adversely impacts level loading times which are already long enough (I was playing the latest installment of Half-Life the other day, and seeing approx. 20-30 second load times).  Despite your opinion about uncompressed data for performance's sake, a lot of gamers *would* rather the game take up 4GB of space than add to the load times.
> 
> Thanks,
> Nathan Reed

Don't get hung up on the geometry example. My example generator is broken. It is my contention that both the performance and the compactness can be improved, given the time and effort.
I imagine it varies a lot from shop to shop, but typically, from what I hear, they are working to tight deadlines with poor processes. Hopefully they still at least use the rule "get it right, then get it fast", but they miss off the "then get it small" at the end.
The huge install sizes and huge patches to supposedly "complete" games are one result of this. Battlefield 2 is painfully slow to load each tiny level and yet still has a 4Gb install.
It's almost part of the package now. If someone released a game that only needed a CD and not a DVD, a lot of people would (wrongly) assume it was less feature-rich than the DVD version.
Take a look at a good shareware game and you see more of the full craft at work, partly because download sizes are restrictive (though less so than they were).

October 24, 2007
Walter Bright wrote:
> Vladimir Panteleev wrote:
>> With .NET, you can derive from a class in a compiled assembly without
>> having access to the source. You just add the assembly in the
>> project's dependencies and import the namespace with "using". In C,
>> you must use the included .h files (and .h files are a pain to
>> maintain anyway since you must maintain the declaration and
>> implementation separately, but that's not news to you).
> 
> Yes, but that's a language bug, not anything inherent to native compilers.
> 
>> You must
>> still use .lib and .di files with D and such - although they can be
>> automated in the build process, it's still a hassle.
> 
> D has the potential to do better, it's just that its a bit mired in the old school.
> 

What do you envision as better for the future? Or were you just speaking hypothetically? Will link compatibility be kept for 2.0, 3.0, etc.?

> 
>> Besides that, statically linking in the runtime seems to be a too
>> common practice, as "DLL hell" has been a discouragement for
>> dynamically-linked libraries in the past (side-by-side assemblies is
>> supposed to remedy that though). I guess the fault is not in the DLLs
>> themselves, it's how people and Microsoft used them...
> 
> The solution to this is to have automatically generated versions for each build of a DLL/shared library. I imagine that .net does the same thing for assemblies.
> 
October 24, 2007
On Wed, 24 Oct 2007 08:54:16 +0800, Walter Bright <newshound1@digitalmars.com> wrote:

> Vladimir Panteleev wrote:
>> With .NET, you can derive from a class in a compiled assembly without
>> having access to the source. You just add the assembly in the
>> project's dependencies and import the namespace with "using". In C,
>> you must use the included .h files (and .h files are a pain to
>> maintain anyway since you must maintain the declaration and
>> implementation separately, but that's not news to you).
>
> Yes, but that's a language bug, not anything inherent to native compilers.
>
>> You must
>> still use .lib and .di files with D and such - although they can be
>> automated in the build process, it's still a hassle.
>
> D has the potential to do better, it's just that its a bit mired in the old school.
>
>
>> Besides that, statically linking in the runtime seems to be a too
>> common practice, as "DLL hell" has been a discouragement for
>> dynamically-linked libraries in the past (side-by-side assemblies is
>> supposed to remedy that though). I guess the fault is not in the DLLs
>> themselves, it's how people and Microsoft used them...
>
> The solution to this is to have automatically generated versions for each build of a DLL/shared library. I imagine that .net does the same thing for assemblies.
>

The solution is to ban those guys from shipping DLLs/shared libraries with ever-changing interfaces. They just have no idea what DLLs are or what DLLs should be. Generating versions is a bad idea. Consider Firefox with its tons of plugins. Almost all the plugins I use actually work well with *any* Firefox version; it just bothers me to have to change the version number in the jar file.

That's because the Firefox APIs and JavaScript are fixed, so the interface the plugins rely on is fixed. That's basically how DLLs should be: the interface reflects the design, and I can't imagine a good design yielding a constantly changing interface.



-- 
Using Opera's revolutionary e-mail client: http://www.opera.com/mail/