December 17, 2012
On 2012-12-17 10:13, eles wrote:

> WRT all the opinions above (i.e. binary vs. text, what to put in, etc.)
>
> I gave this some thought a while ago: how about bundling a
> "header" file (that would be the .di file) and a binary file (the
> compiled .d file, that is, the .obj file) into a single .zip (albeit with
> another extension) that would be recognized and processed by the D
> compiler (let's call that file a .dobj)?
>
> The idea may seem a bit crazy, but consider the following:
>
> - the standard .zip format could be used by a user of that object/library
> to learn the interface of the functions provided by the object (just
> like a C header file)
> - if he's a power user, he can simply extract the .zip/.dobj, modify the
> included header (adding comments, for example), then archive it back
> and present the compiler with a "fresh" .dobj/library file

Sounds a lot like frameworks and other types of bundles on Mac OS X. A framework is a folder with the .framework extension, containing a dynamic library, header files and all other necessary resource files, like images and so on.

> The responsibility of keeping the .obj and the header in sync would
> fall to the compiler, or to the power user if the latter edits it manually.
> Moreover, IDEs could simply extract the relevant header information from the
> .zip archive and use it for code completion, documentation and so on.
>
> Basically, this would be like bundling a .h file with the corresponding
> .obj file (if we speak C++), all under a transparent format. The
> code is hidden and obfuscated, just like in a standard library (think
> -lstdc++ vs <iostream>). The use of a single file greatly facilitates
> synchronization, while the use of the standard .zip format allows a
> plethora of tools to manually tune the file (if desired).
>
> This could also be extended to entire .dlibs (that is, archives of .dobjs),
> which would become self-documenting that way. I have kind of dreamed about
> this since programming in C++, where I always needed to have the headers and
> the libs with me. Why not include the headers in the lib, in a
> transparent and manually readable/editable format?
>
> A checksum could also guarantee that the header information and the
> binary information are in sync inside the .zip archive.
>
> What do you think?

In general I think it's better to have a package manager handle this.
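
That said, the mechanics of the bundling itself are trivial. Here's a minimal sketch using Phobos' std.zip; the .dobj member names and the checksum layout are just assumptions based on the proposal, not any real format:

import std.digest.md : md5Of, toHexString;
import std.file : read, write;
import std.zip;

void main()
{
    auto di  = cast(ubyte[]) read("mylib.di");   // the "header" part
    auto obj = cast(ubyte[]) read("mylib.obj");  // the compiled part

    auto archive = new ZipArchive();

    void add(string name, ubyte[] data)
    {
        auto member = new ArchiveMember();
        member.name = name;
        member.expandedData = data;
        member.compressionMethod = CompressionMethod.deflate;
        archive.addMember(member);
    }

    add("mylib.di", di);
    add("mylib.obj", obj);
    // A checksum over both parts, so tools can verify they are in sync.
    add("checksum.md5", cast(ubyte[]) toHexString(md5Of(di ~ obj)).dup);

    write("mylib.dobj", archive.build());
}

An IDE or the compiler would unzip this in memory, read mylib.di for code completion or type checking, verify the checksum, and hand mylib.obj to the linker.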

-- 
/Jacob Carlborg
December 17, 2012
AFAIK those are more like the Windows API & ABI reverse engineered and reimplemented, and that is a huge difference.

On Monday, 17 December 2012 at 10:01:35 UTC, Jacob Carlborg wrote:
> On 2012-12-17 09:21, Walter Bright wrote:
>
>> I know what I'm talking about with this. The only time they get reverse
>> engineered is when somebody really really REALLY wants to do it, an
>> expert is necessary to do the job, and it's completely impractical for
>> larger sets of files. You cannot build a tool to do it, it must be done
>> by hand, line by line. It's the proverbial turning of hamburger back
>> into a cow.
>
> Ever heard of Wine or ReactOS? They're basically Windows reverse engineered.

December 17, 2012
> Sounds a lot like frameworks and other types of bundles on Mac OS X. A framework is a folder with the .framework extension, containing a dynamic library, header files and all other necessary resource files, like images and so on.

I don't know about such frameworks, but the idea is that these kinds of files should be handled by the compiler, not by the operating system. They are not meant to be applications, but libraries.
December 17, 2012
On Monday, 17 December 2012 at 09:58:28 UTC, Walter Bright wrote:
> On 12/17/2012 1:35 AM, Paulo Pinto wrote:
>> It suffices to get the general algorithm behind the code, and that is impossible
>> to hide, unless the developer resorts to cryptography.
>
> I'll say again, with enough effort, an expert *can* decompile object files by hand. You can't make a tool to do that for you, though.
>
> It can also be pretty damned challenging to figure out the algorithm used in a bit of non-trivial assembler after it's gone through a modern compiler optimizer.
>
> I know nobody here wants to believe me, but it is trivial to automatically turn Java bytecode back into source code.
>
> Google "convert .class file to .java":
>
>     http://java.decompiler.free.fr/
>
> Now try:
>
> Google "convert object file to C"
>
> If you don't believe me, remember this comes from a guy who's been working on C compilers for 30 years, and who also wrote a Java compiler; that should be a helpful data point.

Of course I believe you and respect your experience.

The point I was trying to make is that if someone really wants your code, they will get it, even if that means reading assembly instructions manually.

At one company I used to work for, we rewrote the Tcl parser to read encrypted files to avoid delivering plain text to the customer, hoping that it would be enough to deter most people.

--
Paulo
December 17, 2012
On 2012-12-17 10:58, Walter Bright wrote:

> Google "convert object file to C"

A few seconds on Google resulted in this:

http://www.hex-rays.com/products/decompiler/index.shtml

-- 
/Jacob Carlborg
December 17, 2012
> If we want D to fit into various niche markets overlooked by C++, encryption could be added for extra security, where the person compiling encrypted .di files would have to supply a key. That would work only for certain situations, not for mass distribution, but it may be useful to enough people.

I can't imagine a situation where encrypting .di files would make any sense. Such files would be completely useless without the key, so you would have to either distribute the key along with the files or the compiler would need to contain the key. The former obviously makes encryption pointless and you could only make the latter work by attempting to hide the key inside the compiler. The fact that the compiler is open source would make that harder and someone would eventually manage to extract the key in any case. This whole DRM business would also prevent D from ever being added to GCC.
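
To make that concrete, "the compiler contains the key" would amount to something like this (hypothetical scheme; a XOR loop stands in for real crypto):

// The "secret" key has to ship inside the compiler, so anyone with the
// compiler's source, or a disassembler, can read it right back out.
immutable ubyte[8] builtinKey = [0x13, 0x37, 0xD0, 0x0D, 0xCA, 0xFE, 0xBA, 0xBE];

ubyte[] decryptDi(const(ubyte)[] encrypted)
{
    auto plain = new ubyte[encrypted.length];
    foreach (i, b; encrypted)
        plain[i] = b ^ builtinKey[i % builtinKey.length];
    return plain;
}

Whatever decryptDi actually does, a user can call it (or reimplement it) to dump the plain text .di, so the encryption buys nothing.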
December 17, 2012
On 17 December 2012 12:54, jerro <a@a.com> wrote:

>> If we want D to fit into various niche markets overlooked by C++,
>> encryption could be added for extra security, where the person compiling encrypted .di files would have to supply a key. That would work only for certain situations, not for mass distribution, but it may be useful to enough people.
>>
>
> I can't imagine a situation where encrypting .di files would make any sense. Such files would be completely useless without the key, so you would have to either distribute the key along with the files or the compiler would need to contain the key. The former obviously makes encryption pointless and you could only make the latter work by attempting to hide the key inside the compiler. The fact that the compiler is open source would make that harder and someone would eventually manage to extract the key in any case. This whole DRM business would also prevent D from ever being added to GCC.
>

It's not as if Phobos would be distributed that way. And even if it was, there'd be an uproar and a fork of the project.

-- 
Iain Buclaw

*(p < e ? p++ : p) = (c & 0x0f) + '0';


December 17, 2012
> It's not as if Phobos would be distributed that way. And even if it was,
> there'd be an uproar and a fork of the project.

I don't think the FSF would be too happy about adding a front end with DRM support to GCC, even if no encrypted libraries were added along with it. Of course, a fork without DRM support could still be added to GCC, but if support for DRM'd libraries became part of D, this would cause problems when some people chose to actually use the feature with their closed-source libraries, which then couldn't be used with GDC.
December 17, 2012
On 2012-12-17 12:20, eles wrote:

> I don't know about such frameworks, but the idea is that these kinds of
> files should be handled by the compiler, not by the operating system.
> They are not meant to be applications, but libraries.

They are handled by the compiler. GCC has the -framework flag.

https://developer.apple.com/library/mac/#documentation/MacOSX/Conceptual/BPFrameworks/Concepts/WhatAreFrameworks.html

The Finder also knows about these frameworks and bundles and treats them as a single file.

-- 
/Jacob Carlborg
December 17, 2012
On 12/17/2012 4:38 AM, Jacob Carlborg wrote:
> On 2012-12-17 10:58, Walter Bright wrote:
>
>> Google "convert object file to C"
>
> A few seconds on Google resulted in this:
>
> http://www.hex-rays.com/products/decompiler/index.shtml
>

Hex-Rays is an interactive tool. Its "decompiled" output looks like this:

v81 = 9;
v63 = *(_DWORD *)(v62 + 88);
if ( v63 )
{
   v64 = *(int (__cdecl **)(_DWORD, _DWORD, _DWORD, _DWORD, _DWORD))(v63 + 24);
   if ( v64 )
     v62 = v64(v62, v1, *(_DWORD *)(v3 + 16), *(_DWORD *)(v3 + 40), bstrString);
}

It has some wired-in recognition of patterns that call standard functions like strcpy and strlen, but as far as I can see not much else. It's interactive in that you have to supply the interpretation yourself.

It's pretty simple to decompile asm by hand into the above, but the work is only just beginning. For example, what is v3+16? Some struct member?

Note that there is no type information in the Hex-Rays output.
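
To see how much is gone, here is one purely hypothetical guess, sketched in D, at the kind of source the snippet above could have come from. Every name, type and member layout is invented, and the offsets assume 32-bit pointers:

struct Handler
{
    ubyte[24] unknown;                 // whatever lives at v63 + 0 .. 23
    int function(Context*, void*, uint, uint, void*) invoke;  // v63 + 24
}

struct Context
{
    ubyte[88] unknown;                 // whatever lives at v62 + 0 .. 87
    Handler* handler;                  // v62 + 88
}

int dispatch(Context* ctx, void* v1, uint arg1, uint arg2, void* bstrString)
{
    int result = 0;
    if (auto h = ctx.handler)          // v63 = *(_DWORD *)(v62 + 88)
    {
        // arg1 and arg2 stand in for the *(v3 + 16) and *(v3 + 40) loads,
        // presumably members of yet another unrecovered struct.
        if (h.invoke)                  // v64 = *(...)(v63 + 24)
            result = h.invoke(ctx, v1, arg1, arg2, bstrString);
    }
    return result;
}

Even if this guess happened to be right, nothing in the Hex-Rays output tells you so. Recovering the struct definitions, the field meanings and the names is exactly the part that has to be done by hand.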