March 20, 2007
http://d.puremagic.com/issues/show_bug.cgi?id=1071

           Summary: DoS code on Windows Platform
           Product: D
           Version: 1.009
          Platform: PC
        OS/Version: Windows
            Status: NEW
          Severity: normal
          Priority: P2
         Component: DMD
        AssignedTo: bugzilla@digitalmars.com
        ReportedBy: davidl@126.com


static char[] hello()
{
    char[] result = "";
hello:
    result ~= `abc`;
    goto hello;            // loops forever: CTFE never terminates
    return result;         // unreachable
}
void main()
{
    pragma(msg, hello());  // forces hello() to run at compile time
}

What I want to say is that DMD should limit the memory used by CTFE. If CTFE keeps consuming memory past a certain level, releasing some allocations and reallocating more, then compiling becomes a disaster.

Probably we should have a compile option to limit the compiler's use of memory; if it is exceeded, the compilation is aborted.

Buggy code can easily fall into an endless loop in CTFE.


-- 

March 20, 2007
http://d.puremagic.com/issues/show_bug.cgi?id=1071


smjg@iname.com changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
                 CC|                            |smjg@iname.com




------- Comment #1 from smjg@iname.com  2007-03-20 10:46 -------
The user shouldn't have to remember to set a compiler flag each time just in case.  There therefore ought to be a default limit, which the user can override if necessary.


-- 

July 20, 2007
http://d.puremagic.com/issues/show_bug.cgi?id=1071


vietor@zettabytestorage.com changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
                 CC|                            |vietor@zettabytestorage.com




------- Comment #2 from vietor@zettabytestorage.com  2007-07-19 23:41 -------
Requiring a flag to act normally is counterintuitive.

Having a default memory limit for the compiler would result in unexpected failure during normal and valid operation. Causing the compiler to run out of memory is, by no stretch of the imagination, an uncommon case. Thus if anything should require passing an additional argument to the compiler, imposing a limit should, not removing one.

If the user is worried about creating a situation where infinite allocation could occur, and running on an OS that will not handle it in at least a semi-graceful manner, then it should be their responsibility to remember to limit memory usage.
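For comparison, the OS-level approach suggested here can already be sketched with shell resource limits, no compiler flag required. This is a minimal illustration assuming a Linux-like system where the shell enforces `ulimit -v` (it maps to `RLIMIT_AS`); the `dmd buggy.d` line and the 200 MiB figure are illustrative, not prescriptive:

```shell
# Cap the compiler's virtual memory from the shell. Running inside a
# subshell keeps the limit from affecting the rest of the session;
# `ulimit -v` takes kibibytes.
(
    ulimit -v $((200 * 1024))   # limit virtual memory to 200 MiB
    ulimit -v                   # confirm the cap: prints 204800
    # dmd buggy.d               # a runaway CTFE loop would now die with an
                                # out-of-memory error instead of filling swap
)
```

The subshell is the key design point: the limit is scoped to one compiler invocation, so the rest of the user's session is unaffected.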


-- 

July 21, 2007
http://d.puremagic.com/issues/show_bug.cgi?id=1071





------- Comment #3 from vietor@zettabytestorage.com  2007-07-20 20:34 -------
In the above, "an uncommon case" should read "a common case".


-- 

July 23, 2007
http://d.puremagic.com/issues/show_bug.cgi?id=1071





------- Comment #4 from davidl@126.com  2007-07-22 20:31 -------
No. Obviously most apps *wouldn't* use memory over a certain level; this is the common case. As for the special case, apps that would use a great amount of memory, exceeding the compiler's limit, should turn on the compiler switch, after the author has tested his piece of code thoroughly.


-- 

July 23, 2007
http://d.puremagic.com/issues/show_bug.cgi?id=1071





------- Comment #5 from thecybershadow@gmail.com  2007-07-23 08:54 -------
I agree with Stewart and disagree with Vietor. It is an uncommon case for a program's CTFE code to use large amounts of memory, and it is always best to have safety options enabled by default. Not all users can be informed and constantly aware that compiling a program is a potential security risk, and in a multi-user environment it can become disastrous as a result of simply compiling a program, which at first sight sounds like a rather harmless operation.


-- 

July 24, 2007
http://d.puremagic.com/issues/show_bug.cgi?id=1071





------- Comment #6 from vietor@zettabytestorage.com  2007-07-23 22:03 -------
Seems I'm strongly in the minority here.

My chief concern is this: what is a sane value for a memory limit, and what do you plan to do in 10 years when it's no longer a sane value? Additionally, sane for whom?

Calling compiler-induced memory exhaustion a security risk is making a mountain out of a molehill. At best it's a fairly weak DoS that, though it will dramatically reduce system performance as it pushes into swap, will be automatically resolved by the OS when it hits the end of swap and is killed. The likelihood of the memory allocation subsystem killing anything other than the compiler gone wild is very small, but at worst this could result in randomly killing other processes.

Calling this disastrous is playing to hysteria. If you are serious about reliability in a multi-user environment, yet do not have per user resource limits in order to prevent this sort of problem, then your sysadmin is not doing their job.

Additionally, this is a compiler, it's a development tool. If you are running it on mission critical servers that cannot withstand an easily contained memory exhaustion, then you have far greater problems than a "misbehaving" compiler.

Solving this sort of problem by demanding that each application decide upon an arbitrary memory limit to impose upon itself is asking for trouble. Any situation in which this behavior will be a problem and not just an inconvenience, almost certainly has far greater threats to worry about.

I recognize that I am being perhaps overly passionate about a trivial issue in only one of two compilers for a language that is hardly mainstream, and that regardless of how it's decided it will probably never affect me, as I detest gratuitous preprocessing and compile-time shenanigans. However, am I the only one who thinks that creating situations in which valid operations will fail without additional effort, in order to provide an expedient solution to a problem better solved by other means, is the wrong thing to do?


-- 

September 04, 2007
http://d.puremagic.com/issues/show_bug.cgi?id=1071





------- Comment #7 from davidl@126.com  2007-09-04 05:08 -------
200M is a sane value; few sources seem to use more than that. And with an option such as -MLimit=400M (or so), you could use more than 200M. It's sane to limit the memory usage.


--