Thread overview
[idea] self-checking option for release versions
Jan 14, 2005 - Manfred Nowak
Jan 15, 2005 - parabolis
Jan 15, 2005 - Manfred Nowak
Jan 15, 2005 - Asaf Karagila
Jan 15, 2005 - Manfred Nowak
Jan 15, 2005 - Asaf Karagila
Jan 15, 2005 - parabolis
Jan 15, 2005 - Manfred Nowak
Jan 15, 2005 - Asaf Karagila
Jan 15, 2005 - Manfred Nowak
Jan 15, 2005 - parabolis
Jan 16, 2005 - Juanjo Álvarez
Jan 16, 2005 - Thomas Kuehne
Jan 17, 2005 - Juanjo Álvarez
Jan 17, 2005 - Asaf Karagila
Jan 15, 2005 - Norbert Nemec
Jan 15, 2005 - Manfred Nowak
Jan 15, 2005 - Norbert Nemec
Jan 15, 2005 - Manfred Nowak
Jan 16, 2005 - Norbert Nemec
Jan 16, 2005 - Asaf Karagila
Jan 19, 2005 - Georg Wrede
Jan 17, 2005 - Gold Dragon
Jan 17, 2005 - Asaf Karagila
January 14, 2005
A couple of months ago I had a discussion in this forum on usage-prevention-chemes.

www.strongbit.com has a software product called execryptor that promises even more than what I was talking about, i.e. making the task of analyzing an executable NP-hard.

Maybe this is a technique that should be incorporated into compilers in the long run.

As a first step it may be sufficient to protect executables against the best-known automatic plague: viruses.

A mounted cheme that reads an md5 checksum incorporated into the executable and checks the non-checksum part of the executable for consistency with the checksum would detect the result of every modification made by a virus that is unaware of the integrated self-checking cheme.

On the other hand, every virus that is aware of the self-checking option must either be able to find it and switch it off, or must find the md5 checksum and update it to match the changes that the virus wants to make to the executable.

Hopefully, this searching for the critical points of the mounted cheme of self-checking creates a signature that all self-checking-aware viruses must exhibit, thereby easing the task of antivirus software.
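[To make the proposal concrete, here is a minimal sketch of the kind of self-check described above. It is not from the original post, and it assumes one particular layout: the MD5 digest is simply appended as the final 16 bytes of the executable file, written by a one-time post-build "seal" step.]

```python
import hashlib

CHECKSUM_LEN = 16  # an MD5 digest is 16 bytes

def self_check(path):
    """Return True if the file's trailing MD5 digest matches the rest of the file."""
    with open(path, "rb") as f:
        data = f.read()
    body, stored = data[:-CHECKSUM_LEN], data[-CHECKSUM_LEN:]
    return hashlib.md5(body).digest() == stored

def seal(path):
    """Append the MD5 digest of the file's current contents (run once, post-build)."""
    with open(path, "rb") as f:
        body = f.read()
    with open(path, "ab") as f:
        f.write(hashlib.md5(body).digest())
```

A program could run `self_check` against its own binary at startup; any byte changed by a checksum-unaware virus makes the comparison fail. Note that MD5 detects accidental or naive modification only; a checksum-aware attacker can simply recompute and rewrite the trailing digest.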

Is this an idea without any impact, or is D a candidate to be the first language that incorporates some antivirus strategy?

-manfred



January 15, 2005
Manfred Nowak wrote:
> A couple of months ago I had a discussion in this forum on
> usage-prevention-chemes.

I found your repeated use of 'cheme' to be both mauve and gustipating. I am assuming you meant 'scheme'?

> www.strongbit.com has a software product called execryptor that promises
> even more than what I was talking about, i.e. making the task of analyzing
> an executable NP-hard.

A security app that promises a new paradigm, does not publish its algorithm, and uses phrases like 'nondeterminate transformations'. I have seen this one before. It does not turn out well.

Further reading just made me wince more. EXECryptor sounds like it should be named the AntiOptimizer. It bloats the compiled code to 'a couple of dozens times' the original size and sounds like it kills almost any attempt at optimization in the process.

> As a first step it may be sufficient to protect executables against the
> best-known automatic plague: viruses.
> 
> A mounted cheme that reads an md5 checksum incorporated into the executable
> and checks the non-checksum part of the executable for consistency with the
> checksum would detect the result of every modification made by a virus that
> is unaware of the integrated self-checking cheme.

It would, provided that a virus which prepends itself to the compiled code ever decides to actually pass control back to your code, and does not remove itself from the compiled code before restarting the executable.
January 15, 2005
> www.strongbit.com has a software product called execryptor that promises even more than that I was talking about, i.e. making the task of analyzing an executable NP-Hard.

Execryptors are not just a mere md5 checksum; they are usually used against crackers and reversers to protect the code. Most of them have been unpacked successfully. Because of this, the code is encrypted in several layers, and sometimes an "evaluation notify" screen is added. They are mostly used by lazy programmers who have no clue how to protect their code properly. Viruses, on the other hand, are far trickier, since they are parasitic on your code. They load themselves before your code is actually executed, and thus have the ability to disappear (as parabolis mentioned), or to analyze your code (in search of a D compiler signature) and alter any built-in check.

Although it would be a nice feature to have a self-checksum integrated into the code by default, it is pretty useless against serious pests or smart crackers.

- Asaf.


January 15, 2005
Manfred Nowak wrote:
> Maybe this is a technique that should be incorporated into compilers in the long run.

Making something in that direction part of the compiler seems like a bad idea. Post-processing executables for specific purposes can be done perfectly well in a separate stage after compilation. There are many different needs for post-processing (compression, encryption, checksumming, prelinking, etc.) and the technology evolves quickly. Building anything like that into the compiler will kill flexibility and unnecessarily increase the complexity of the compiler. Modularity seems a far better approach, unless the individual stages are inherently interconnected.

January 15, 2005
parabolis wrote:
> I found your repeated use of 'cheme' to be both mauve and gustipating. I am assuming you meant 'scheme'?

Thanks for pointing this out correctly. It was 'worng', 'Bad speling' and may be chosen as an indicator of my severe 'dain bramage' :-)

> Further reading just made me wince more. EXECryptor sounds like it should be named the AntiOptimizer. It bloats the compiled code to 'a couple of dozens times' the original size and sounds like it kills almost any attempt at optimization in the process.

At first glance I agree with you.

> It would, provided that a virus which prepends itself to the compiled code ever decides to actually pass control back to your code, and does not remove itself from the compiled code before restarting the executable.

At present you are right again. But with the growing number of processors
that incorporate hardware protection against writing into disallowed
areas, the types of viruses that stay resident in main memory, or want to
remove themselves from it, will go to their happy hunting grounds.
And whether code is classified as a virus depends on the definition. I
prefer the definition that a virus is executable code that propagates itself
by turning other executables into trojan horses ... and an executable
that does not do anything is clearly not a trojan horse.

-manfred


January 15, 2005
Asaf Karagila wrote:
> Although it would be a nice feature to have a self-checksum integrated into the code by default, it is pretty useless against serious pests or smart crackers.

Smart people are unstoppable by definition :-)
And you are right so far: because a complete shield against all types of
viruses needs a supervising instance, that instance will become the target
of virus attacks until it is in turn shielded by another, super-supervising
instance ... and so on. The only hope of breaking this coevolving cycle is
that the code a virus needs becomes so bloated that its sheer mass makes it
easy to detect. If someone can prove that this will never happen, then there
is no need to start at all. If someone can prove the converse, then there is
an urgent need to start. Shouldn't one start in the hope that someone can
prove the converse?

-manfred


January 15, 2005
"Manfred Nowak" <svv1999@hotmail.com> wrote in message news:csb073$2158$1@digitaldaemon.com...
> And you are right so far: because a complete shield against all types of viruses needs a supervising instance

Well, there was a contest for protecting an application on some reverse
engineering site. One of the submitted solutions injected code into random
processes and created remote threads that would checksum the original code;
it also included a lot of protection code. It was a brilliant solution, and
won first place, I believe. But the contest site is down. If I manage to dig
up the files, or if the site comes up again, I'll post them here.

- Asaf.


January 15, 2005
Norbert Nemec wrote:
[...]
> Post-processing executables for specific purposes can be done perfectly well in a separate stage after compilation.
Is this a definition? That would mean that those tasks that cannot be done perfectly well in a separate stage after compilation have to be built into the compiler. Is it imperfect enough if you have to reanalyze the just-generated program to find the places where you can insert checksums and checking routines, thereby increasing the needed CPU time from one unit to at least units on the order of the length of the program?

> Modularity seems a far better approach, unless
> the individual stages are inherently interconnected.
So please explain what the currently implemented release option is doing that is inherently interconnected with the compiling phase and cannot perfectly well be done in a separate stage after compilation.

-manfred


January 15, 2005
Manfred Nowak wrote:

> 
> Norbert Nemec wrote:
> [...]
>> Post-processing executables for specific purposes can be done perfectly well in a separate stage after compilation.
> Is this a definition? That would mean that those tasks that cannot be done perfectly well in a separate stage after compilation have to be built into the compiler. Is it imperfect enough if you have to reanalyze the just-generated program to find the places where you can insert checksums and checking routines, thereby increasing the needed CPU time from one unit to at least units on the order of the length of the program?

OK, if the checksum - or whatever - algorithm goes so deep into the structure of the program, it would make sense to build it into the compiler.

>> Modularity seems a far better approach, unless
>> the individual stages are inherently interconnected.
> So please explain what the currently implemented release option is doing that is inherently interconnected with the compiling phase and cannot perfectly well be done in a separate stage after compilation.

I'm not sure what you mean by "release option". If you are talking about "release" vs. "debug" mode, then this is not a separate stage after compilation but an option that affects the code generation stage.

In any case: I still think that self-checking binaries should be left as a separate project. There are many different purposes for checksumming binaries (protection against viruses, intrusion, general corruption, and probably more). Finding one "general purpose" checksumming method will result in something that is suboptimal for at least some of those purposes. If a system administrator has a special need for checksumming, it should be up to him to decide on the details. In any other case, checksumming would just cost space and performance with little gain. (Actually, for my personal machine under Linux, I see no point why certain applications should be self-checking. If I knew that every binary was self-checking, there might be a certain gain, but if it depends on the programming language, the feature is mostly worthless.)

January 15, 2005
Manfred Nowak wrote:

> Smart people are unstoppable by definition :-)
> And you are right so far: because a complete shield against all types of
> viruses needs a supervising instance, that instance will become the target
> of virus attacks until it is in turn shielded by another, super-supervising
> instance ... and so on. The only hope of breaking this coevolving cycle is
> that the code a virus needs becomes so bloated that its sheer mass makes it
> easy to detect.

I think Microsoft found an answer to the 'supervising instance' problem. They are working with hardware people to ensure that signed startup code is executed only if it has not been changed.

This startup code can then be used to ensure the rest of the OS code has not been mucked with, and will also allow the OS to guarantee that your signed code has not been changed either.
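[The chain-of-trust idea described here can be illustrated with a toy sketch. This is an illustration of the general principle only, not Microsoft's actual mechanism; all names and the data layout are made up. Each stage carries the expected hash of the next stage, and the root hash is anchored somewhere the attacker cannot write, e.g. hardware.]

```python
import hashlib

def digest(blob):
    """SHA-256 hex digest of a code blob."""
    return hashlib.sha256(blob).hexdigest()

def verify_chain(hardware_root_hash, stages):
    """stages: list of (code_blob, expected_hash_of_next_stage) tuples,
    ordered from bootloader to application; the last stage's link is None.
    Returns True only if every stage matches the hash recorded one level up."""
    expected = hardware_root_hash
    for blob, next_hash in stages:
        if digest(blob) != expected:
            return False  # this stage was tampered with: refuse to run it
        expected = next_hash
    return True
```

The point of the construction is that the supervising instance regress stops at the root hash: tampering with any stage breaks the link above it, and the topmost link is checked by hardware rather than by more software.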