June 26, 2014 Software Assurance Reference Dataset
http://samate.nist.gov/SRD/view.php?count=20&first=0&sort=asc

This is a list of security vulnerabilities in languages including C/C++. 88,737 of them (!).

It's an interesting list, and an opportunity for D. I once said that my job was to put Coverity out of business. The more of these issues D can automatically prevent with @safe, the better.
June 26, 2014 Re: Software Assurance Reference Dataset
Posted in reply to Walter Bright

Walter Bright:

> It's an interesting list, and an opportunity for D. I once said that my job was to put Coverity out of business.

Even if D has wide success, I don't think D will delete all the C and C++ code out of existence, so I don't think D will ever put Coverity out of business :-)

> The more of these issues D can automatically prevent with @safe, the better.

I think D will need better guns for that, like a more principled (formalized, written fully down in specs, eventually even proved) management of uniqueness, etc.

Bye,
bearophile
June 26, 2014 Re: Software Assurance Reference Dataset
Posted in reply to Walter Bright

On Thursday, 26 June 2014 at 00:36:58 UTC, Walter Bright wrote:
> http://samate.nist.gov/SRD/view.php?count=20&first=0&sort=asc
>
> This is a list of security vulnerabilities in languages including C/C++. 88,737 of them (!).
>
> It's an interesting list, and an opportunity for D. I once said that my job was to put Coverity out of business. The more of these issues D can automatically prevent with @safe, the better.
As expected, most of them would be automatically fixed by using a descendant of the Algol/Mesa/Cedar branch of languages.
I look forward to @safe providing a similar set of guarantees.
--
Paulo
June 26, 2014 Re: Software Assurance Reference Dataset
Posted in reply to Walter Bright

Walter Bright:
> It's an interesting list, and an opportunity for D. I once said that my job was to put Coverity out of business. The more of these issues D can automatically prevent with @safe, the better.
One kind of problem left is to avoid stack overflows. I have had several such cases in my D code (perhaps because I use fixed-sized arrays a lot).
I think they can be caused by:
1) Too many large allocations in stack frames;
2) alloca();
3) Recursion and co-recursion chains in the call graph.
The first cause is probably sufficiently easy to solve mechanically, with a conservative call tree analysis of the code. This is a job for external tools.
The second cause can be solved with a formal proof of the upper bounds of the arguments of alloca, or even more conservatively disallowing alloca.
The third cause, a cycle in the call graph, can be found with an external tool, but in theory it's also easy to avoid with a @no_call_cycles annotation :-) Such an annotation is useful only if you want to avoid stack overflows, so it's better to use a more general annotation that also forbids alloca(). Something like @fixed_stack or @constant_memory.
I think adding this annotation to D is a little overkill (even though SPARK and Ada analyzers have been doing stack size analysis for many years). So I think a better solution (besides creating an external tool) is to add to D a more general feature that attaches compile-time semantics to user-defined annotations. To do this you need a trait that returns a sequence of all the functions called by a function.
Bye,
bearophile
June 26, 2014 Re: Software Assurance Reference Dataset
Posted in reply to bearophile

On 6/26/2014 2:19 AM, bearophile wrote:
> One kind of problem left is to avoid stack overflows.
In general, stack overflow checking at compile time is the halting problem. It needs a runtime check.
Stack overflows are not safety problems when a guard page is used past the end of the stack. Then, overflow checking is done in hardware. Guard pages aren't currently used for fibers, so overflows are a real danger there.
In 64-bit code, however, one should be able to use guard pages for fibers.
June 26, 2014 Re: Software Assurance Reference Dataset
Posted in reply to Walter Bright

Walter Bright:

> In general, stack overflow checking at compile time is the halting problem. It needs a runtime check.

There are several systems, including SPARK, that perform a conservative and apparently acceptable stack overflow check at compile time. If you don't agree with what I've written in my post, then please give a more detailed answer to the points I've written above.

> Stack overflows are not safety problems when a guard page is used past the end of the stack.

It's not a safety problem in Erlang/Rust, because those languages are designed to manage such failures in a good way. In most other languages it's a "safety" problem, if your program execution has some importance.

Bye,
bearophile
June 26, 2014 Re: Software Assurance Reference Dataset
Posted in reply to Walter Bright

On 06/26/2014 11:35 AM, Walter Bright wrote:
> On 6/26/2014 2:19 AM, bearophile wrote:
>> One kind of problem left is to avoid stack overflows.
>
> In general, stack overflow checking at compile time is the halting
> problem.
That is irrelevant to his point, because he is not suggesting solving the general problem precisely. Analogously: in general, checking whether some variable in e.g. Python is ever assigned a string value is undecidable as well, but this does not imply we cannot have 'int' variables in D.
June 26, 2014 Re: Software Assurance Reference Dataset
Posted in reply to bearophile

On 6/26/2014 2:50 AM, bearophile wrote:

> Walter Bright:
>
>> In general, stack overflow checking at compile time is the halting problem. It needs a runtime check.
>
> There are several systems, including SPARK, that perform a conservative and apparently acceptable stack overflow check at compile time. If you don't agree with what I've written in my post, then please give a more detailed answer to the points I've written above.

SPARK is a research language that does not work, as I've discovered and discussed with you before. The maximum stack usage cannot be determined at compile time; again, this is the halting problem.

>> Stack overflows are not safety problems when a guard page is used past the end of the stack.
>
> It's not a safety problem in Erlang/Rust, because those languages are designed to manage such failures in a good way.

Please explain.

> In most other languages it's a "safety" problem, if your program execution has some importance.

I mean "safety" in the sense of being a security problem, which is the context of this thread.
June 26, 2014 Re: Software Assurance Reference Dataset
Posted in reply to Timon Gehr

On 6/26/2014 2:52 AM, Timon Gehr wrote:
> On 06/26/2014 11:35 AM, Walter Bright wrote:
>> On 6/26/2014 2:19 AM, bearophile wrote:
>>> One kind of problem left is to avoid stack overflows.
>>
>> In general, stack overflow checking at compile time is the halting
>> problem.
>
> That is irrelevant to his point because he is not suggesting to solve the
> general problem precisely. Analogously: In general, checking whether some
> variable in e.g. Python is ever assigned a string value is undecidable as well,
> but this does not imply we cannot have 'int' variables in D.
When you're dealing with security issues, which is what this is about, you'll need a guarantee about stack overflow. Adding annotations is not helpful with this because they are not checkable.
Again, what WORKS is a runtime check.
June 26, 2014 Re: Software Assurance Reference Dataset
Posted in reply to Walter Bright

> Spark is a research language that does not work, as I've discovered and discussed with you before. It cannot be determined the max stack usage at compile time, again, this is the halting problem.
What?! It's easily solvable: Forbid recursion and indirect
function calls and it's guaranteed that the program only requires
a fixed size stack and you can compute an upper bound of the
required stack size at compile-time. Which is BTW exactly what
OpenCL does as GPUs tend to have no stacks.
In what way is Spark a "research language that does not work"?
And how many language design issues need to be discovered until
you admit that Safe-D is a "research language that does not work"?
Copyright © 1999-2021 by the D Language Foundation