D is nice, what's really wrong with GC?
December 18, 2023

Just look at this. I know it's overdesigned; I'm just trying to get a feel for how an API can be designed. I'm still new, but the fact that you can build an API like this and have it not break is amazing.

But what is it with these people and the GC?
Just don't allocate new memory or invoke the GC.
You can use scopes to temporarily work on immutable slices that clean up automatically (a rough sketch below).
The list goes on.
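
For instance, here is a rough illustration of the scopes point (my sketch, not part of the API below): a slice over immutable data that is used only inside a block scope costs no allocation and needs no cleanup.

void main()
{
    immutable int[4] data = [1, 2, 3, 4];

    {
        // a temporary view over immutable data, alive only inside this scope
        immutable(int)[] view = data[1 .. 3];
        assert(view == [2, 3]);
    }   // the view simply goes out of scope here; nothing for the GC to do
}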

And you don't need to use pointers at all!

I honestly see nothing wrong with the GC.

Of course, D has some downsides:
the docs aren't very good compared to some other languages,
IDE support isn't great, but it works sometimes
(I use Helix and Lapce, and occasionally IntelliJ; it works best in Helix),
and D is missing some minor libraries.

import std.stdio : writeln, readln;

// A fluent, chainable game API: players are loaded via `load`, then `play` runs.
struct Game
{
    string title;
    private Board _board;
    private const(Player)[] _players;

    auto load(T)(T any)
    {
        static if (is(T == Player))
            _pushPlayer(any);
        return this;
    }

    auto play()
    {
        assert(_isPlayersFull, "requires 2 players");
        "playing the game".writeln;
    }

    auto _end() {}

    auto _currentPlayers() const { return _players.length; }

    enum _playerLimit = 2;

    auto _isPlayersFull() const { return _currentPlayers == _playerLimit; }

    import std.format : format;

    auto _pushPlayer(T : Player)(T any)
    {
        if (_isPlayersFull)
            assert(false, "require %s players".format(_playerLimit));
        _players.reserve(_playerLimit);
        _players ~= any;
    }
}

private struct Board {}

enum symbol { none, x, o }

private struct Player
{
    const(string) _name;
    symbol _hand;
    @disable this();
    public this(in string n) { _name = n; }
}

alias game = Game;
alias player = Player;
alias board = Board;

void main()
{
    import std.string : strip;

    game()
        .load(player(readln().strip))
        // .matchmake
        .load(player(readln().strip))
        .play;
}
December 18, 2023
On Mon, Dec 18, 2023 at 04:44:11PM +0000, Bkoie via Digitalmars-d-learn wrote: [...]
> But what is it with these people and the GC?
[...]

It's called GC phobia, a knee-jerk reaction malady common among C/C++ programmers (I'm one of them, though I got cured of GC phobia thanks to D :-P).  95% of the time the GC helps far more than it hurts.  And the 5% of the time when it hurts, there are plenty of options for avoiding it in D.  It's not shoved down your throat like in Java, there's no need to get all worked up about it.
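
For instance, a minimal sketch: marking a hot function @nogc makes the compiler statically reject any GC allocation inside it (the function and data here are just illustrative).

@nogc void mix(float[] dst, const(float)[] a, const(float)[] b)
{
    foreach (i, ref s; dst)
        s = a[i] + b[i];    // plain loop, no allocation
    // dst ~= 0.0f;         // would be a compile-time error under @nogc
}

void main()
{
    float[3] a = 1, b = 2, dst = 0;
    mix(dst[], a[], b[]);
}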


T

-- 
Computerese Irregular Verb Conjugation: I have preferences.  You have biases.  He/She has prejudices. -- Gene Wirchenko
December 20, 2023
On Monday, 18 December 2023 at 17:22:22 UTC, H. S. Teoh wrote:
> On Mon, Dec 18, 2023 at 04:44:11PM +0000, Bkoie via Digitalmars-d-learn wrote: [...]
>> But what is it with these people and the GC?
> [...]
>
> It's called GC phobia, a knee-jerk reaction malady common among C/C++ programmers (I'm one of them, though I got cured of GC phobia thanks to D :-P).  95% of the time the GC helps far more than it hurts.  And the 5% of the time when it hurts, there are plenty of options for avoiding it in D.  It's not shoved down your throat like in Java, there's no need to get all worked up about it.
>
>
> T

Truth
December 22, 2023

On Monday, 18 December 2023 at 16:44:11 UTC, Bkoie wrote:

> But what is it with these people and the GC?
> [...]

I'm a C++ programmer in my day job. Personally, I have no problem with a GC, but one of my colleague is a total C fanboy, so I feel qualified to answer your question. :)

I think the problem most "old school" programmers have with automatic garbage collection, or any kind of "managed" code, really, is not the GC itself, but that it demonstrates a wrong mindset.

If you use (or even feel tempted to use) a GC, it means that you don't care about your memory. Neither about its layout nor its size, nor when chunks of it are allocated or deallocated, etc.
And if you don't care about these things, you should not call yourself a programmer. You are the reason why modern software sucks and everything gets slower and slower despite the processors getting faster and faster. In fact, you probably should get another job, like flooring inspector or something. :)

And although this is not my opinion (otherwise I wouldn't use D), I have to admit that this isn't completely wrong. I like my abstractions because they make my life easier, but yeah, they detach me from the hardware, which often means things are not quite as fast as they could possibly be. It's a tradeoff.

Of course, people with a "purer" mindset could always use the "BetterC" subset of D... but then again, why should they? C is perfect, right? :)

December 22, 2023

On Friday, 22 December 2023 at 12:53:44 UTC, bomat wrote:

> I think the problem most "old school" programmers have with automatic garbage collection, or any kind of "managed" code, really, is not the GC itself, but that it demonstrates a wrong mindset.
>
> If you use (or even feel tempted to use) a GC, it means that you don't care about your memory. Neither about its layout nor its size, nor when chunks of it are allocated or deallocated, etc.
> And if you don't care about these things, you should not call yourself a programmer. You are the reason why modern software sucks and everything gets slower and slower despite the processors getting faster and faster. In fact, you probably should get another job, like flooring inspector or something. :)

And that's the reason why modern programs are getting bigger, slower, and leak memory. No one should be manually managing memory. Rust is a prime example of that, yet now it's "the borrow checker is the issue" or "too many unsafe blocks". And as one guy said above, you can avoid the GC in D, so...

December 22, 2023

On Friday, 22 December 2023 at 12:53:44 UTC, bomat wrote:

> If you use (or even feel tempted to use) a GC, it means that you don't care about your memory. Neither about its layout nor its size, nor when chunks of it are allocated or deallocated, etc.
> And if you don't care about these things, you should not call yourself a programmer. You are the reason why modern software sucks and everything gets slower and slower despite the processors getting faster and faster. In fact, you probably should get another job, like flooring inspector or something. :)

Given how fast computers are today, the folks that focus on memory and optimizing for performance might want to apply for jobs as flooring inspectors, because they're often solving problems from the 1990s. That's not to say it's never needed, but the number of cases where idiomatic D, Go, or Java will be too slow is shrinking rapidly. And there's a tradeoff. In return for solving a problem that doesn't exist, you get bugs, increased development time, and difficulty changing approaches.

I say this as I'm in the midst of porting C code to D. The biggest change by far is deleting line after line of manual memory management. Changing anything in that codebase would be miserable.

December 22, 2023
> It's called GC phobia, a knee-jerk reaction malady common among C/C++ programmers

I'd like to use D in hard realtime apps (gaming can be thought of as one of them, but I mostly mean realtime dynamic multimedia and digital signal processing).

So the GC is commonly assumed to be unacceptable in such applications. In contrast, I can find some PhD theses about realtime GC, prioritized message passing, and maybe RDMA-based clustering.

Unfortunately, I have no hope that D is popular enough for somebody knowledgeable in the topic to rewrite its runtime and GC to be usable in more or less hard realtime apps.

December 22, 2023
On Fri, Dec 22, 2023 at 07:22:15PM +0000, Dmitry Ponyatov via Digitalmars-d-learn wrote:
> > It's called GC phobia, a knee-jerk reaction malady common among C/C++ programmers
> 
> I'd like to use D in hard realtime apps (gaming can be thought of as one of them, but I mostly mean realtime dynamic multimedia and digital signal processing).

For digital signal processing, couldn't you just preallocate beforehand? Even if we had a top-of-the-line incremental GC I wouldn't want to allocate wantonly in my realtime code. I'd preallocate whatever I can, and use region allocators for the rest.
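
Something along those lines (a rough sketch using std.experimental.allocator; the arena name and sizes are made up): allocate one region up front and carve per-frame scratch buffers out of it, so the realtime loop never touches the GC.

import std.experimental.allocator : makeArray;
import std.experimental.allocator.building_blocks.region : Region;
import std.experimental.allocator.mallocator : Mallocator;

void main()
{
    auto arena = Region!Mallocator(1024 * 1024);     // one upfront allocation

    foreach (frame; 0 .. 3)
    {
        auto scratch = arena.makeArray!float(4096);  // carved from the arena, no GC
        scratch[] = 0.0f;
        // ... process one frame using scratch ...
        arena.deallocateAll();                       // reset the arena between frames
    }
}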


> So the GC is commonly assumed to be unacceptable in such applications. In contrast, I can find some PhD theses about realtime GC, prioritized message passing, and maybe RDMA-based clustering.

I'm always skeptical of general claims like this. Until you actually profile and identify the real hotspots, it's just speculation.


> Unfortunately, I have no hope that D is popular enough for somebody knowledgeable in the topic to rewrite its runtime and GC to be usable in more or less hard realtime apps.

Popularity has nothing to do with it. The primary showstopper here is the lack of write barriers (and Walter's reluctance to change this). If we had write barriers a lot more GC options would open up.


T

-- 
What is Matter, what is Mind? Never Mind, it doesn't Matter.
December 22, 2023

On Friday, 22 December 2023 at 16:51:11 UTC, bachmeier wrote:

> Given how fast computers are today, the folks that focus on memory and optimizing for performance might want to apply for jobs as flooring inspectors, because they're often solving problems from the 1990s.

Generally speaking, I disagree. Think of the case of GTA V where several minutes of loading time were burned just because they botched the implementation of a JSON parser.
Of course, this was unrelated to memory management. But it goes to show that today's hardware being super fast doesn't absolve you from knowing what you're doing... or at least question your implementation once you notice that it's slow.
But that is true for any language, obviously.
I think there is a big danger of people programming in C/C++ and thinking that it must be performing well just because it's C/C++. The C++ codebase I have to maintain in my day job is a really bad example for that as well.

> I say this as I'm in the midst of porting C code to D. The biggest change by far is deleting line after line of manual memory management. Changing anything in that codebase would be miserable.

I actually hate C with a passion.
I have to be fair though: What you describe doesn't sound like a problem of the codebase being C, but the codebase being crap. :)
If you have to delete "line after line" of manual memory management, I assume you're dealing with micro-allocations on the heap - which are performance poison in any language.
A decent system would allocate memory in larger blocks and manage access to it via handles. That way you never do micro-allocations and never have ownership problems.
Essentially, it's still a "memory manager" that owns all the memory, the only difference being that it's self-written.
Porting a codebase like that would actually be very easy because all the mallocs would be very localized.
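
A tiny sketch of that idea (hypothetical Pool and Handle types, not from the codebase being discussed): one big block owned by a pool, and callers hold opaque handles instead of raw pointers.

struct Handle { uint index; }

struct Pool(T)
{
    private T[] storage;   // the one big block
    private size_t used;

    this(size_t capacity) { storage = new T[capacity]; }   // a single allocation

    Handle create(T value)
    {
        assert(used < storage.length, "pool exhausted");
        storage[used] = value;
        return Handle(cast(uint) used++);
    }

    ref T get(Handle h) { return storage[h.index]; }
}

unittest
{
    auto pool = Pool!int(64);
    auto h = pool.create(42);
    assert(pool.get(h) == 42);
}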

Of course, this directly leads to the favorite argument of C defenders, which I absolutely hate: "Why, it's not a problem if you're doing it right."

By this logic, you have to do all these terrible mistakes while learning your terrible language, and then you'll be a good programmer and can actually be trusted with writing production software - after like, what, 20 years of shooting yourself in the foot and learning everything the hard way? :)
And even then, the slightest slipup will give you dramatic vulnerabilities.
Such a great concept.

December 22, 2023
On Fri, Dec 22, 2023 at 09:40:03PM +0000, bomat via Digitalmars-d-learn wrote:
> On Friday, 22 December 2023 at 16:51:11 UTC, bachmeier wrote:
> > Given how fast computers are today, the folks that focus on memory and optimizing for performance might want to apply for jobs as flooring inspectors, because they're often solving problems from the 1990s.
> 
> *Generally* speaking, I disagree. Think of the case of GTA V where several *minutes* of loading time were burned just because they botched the implementation of a JSON parser.

IMNSHO, if I had very large data files to load, I wouldn't use JSON. Precompile the data into a more compact binary form that's already ready to use, and just mmap() it at runtime.
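
A sketch of the mmap() route in D (the file name is made up): std.mmfile maps the precompiled blob so it can be used in place, with no parsing pass at load time.

import std.mmfile : MmFile;

void main()
{
    auto mmf = new MmFile("assets.bin");         // maps the file read-only
    auto bytes = cast(const(ubyte)[]) mmf[];     // a view into the mapping, no copy
    // interpret `bytes` directly, e.g. overlay fixed-layout structs on it
}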


> Of course, this was unrelated to memory management. But it goes to show that today's hardware being super fast doesn't absolve you from knowing what you're doing... or at least question your implementation once you notice that it's slow.

My favorite example in this area is the poor selection of algorithms, a very common mistake being choosing an O(n²) algorithm because it's easier to implement than the equivalent O(n) algorithm, and not very noticeable on small inputs. But on large inputs it slows to an unusable crawl. "But I wrote it in C, why isn't it fast?!" Because O(n²) is O(n²), and that's independent of language. Given large enough input, an O(n) Java program will beat the heck out of an O(n²) C program.
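
A minimal D illustration of that trap (my example, not from the original message): the same job done with quadratic repeated concatenation versus a linear appender.

import std.array : appender;

string joinSlow(const(string)[] parts)
{
    string s;
    foreach (p; parts) s = s ~ p;   // re-copies everything built so far: O(n²)
    return s;
}

string joinFast(const(string)[] parts)
{
    auto buf = appender!string();
    foreach (p; parts) buf.put(p);  // amortized O(1) per append: O(n) overall
    return buf.data;
}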


> But that is true for any language, obviously.
>
> I think there is a big danger of people programming in C/C++ and thinking that it *must* be performing well just because it's C/C++. The C++ codebase I have to maintain in my day job is a really bad example for that as well.

"Elegant or ugly code as well as fine or rude sentences have something in common: they don't depend on the language." -- Luca De Vitis

:-)


> > I say this as I'm in the midst of porting C code to D. The biggest change by far is deleting line after line of manual memory management.  Changing anything in that codebase would be miserable.
> 
> I actually hate C with a passion.

Me too. :-D


> I have to be fair though: What you describe doesn't sound like a problem of the codebase being C, but the codebase being crap. :)

Yeah, I've seen my fair share of crap C and C++ codebases. C code that makes you do a double take and stare real hard at the screen to ascertain whether it's actually C and not some jokelang or exolang purposely designed to be unreadable/unmaintainable. (Or maybe it would qualify as an IOCCC entry. :-D)  And C++ code that looks like ... I dunno what.  When business logic is being executed inside of a dtor, you *know* that your codebase has Problems(tm), real big ones at that.



> If you have to delete "line after line" of manual memory management, I assume you're dealing with micro-allocations on the heap - which are performance poison in any language.

Depends on what you're dealing with.  Some micro-allocations are totally avoidable, but if you're manipulating a complex object graph composed of nodes of diverse types, it's hard to avoid. At least, not without uglifying your APIs significantly and introducing long-term maintainability issues.  One of my favorite GC "lightbulb" moments is when I realized that having a GC allowed me to simplify my internal APIs significantly, resulting in much cleaner code that's easy to debug and easy to maintain. Whereas the equivalent bit of code in the original C++ codebase would have required disproportionate amounts of effort just to navigate the complex allocation requirements.

These days my motto is: use the GC by default, when it becomes a problem, then use a more manual memory management scheme, but *only where the bottleneck is* (as proven by an actual profiler, not where you "know" (i.e., imagine) it is).  A lot of C/C++ folk (and I speak from my own experience as one of them) spend far too much time and energy optimizing things that don't need to be optimized, because they are nowhere near the bottleneck, resulting in lots of sunk cost and added maintenance burden with no meaningful benefit.
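
For the "prove it with a profiler" part, druntime can report what the GC is actually doing: run the program with --DRT-gcopt=profile:1 for a summary at exit, or query it from code, e.g. (a small sketch of mine):

import core.memory : GC;
import std.stdio : writeln;

void main()
{
    const before = GC.stats();
    auto data = new int[1_000_000];   // deliberately GC-allocated
    data[] = 1;
    const after = GC.stats();
    writeln("GC heap grew by ", after.usedSize - before.usedSize, " bytes");
}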


[...]
> Of course, this directly leads to the favorite argument of C defenders, which I absolutely hate: "Why, it's not a problem if you're doing it *right*."
> 
> By this logic, you have to do all these terrible mistakes while learning your terrible language, and then you'll be a good programmer and can actually be trusted with writing production software - after like, what, 20 years of shooting yourself in the foot and learning everything the hard way?  :) And even then, the slightest slipup will give you dramatic vulnerabilities.  Such a great concept.

Year after year I see reports of security vulnerabilities, the most common of which are buffer overflows, use-after-free, and double-free. All of which are caused directly by using a language that forces you to manage memory manually.  If C were only 10 years old, I might concede that C coders are just inexperienced, give them enough time to learn from field experience and the situation will improve. But after 50 years, the stream of memory-related security vulnerabilities still hasn't ebbed.  I think it's beyond dispute that even the best C coders make mistakes -- because memory management is HARD, and using a language that gives you no help whatsoever in this department is just inviting trouble. I've personally seen the best C coders commit blunders, and in C, all it takes is *one* blunder among millions of lines of code that manage memory, and you have a glaring security hole.

It's high time people stepped back to think hard about why this is happening, and why 50 years of industry experience and hard-earned best practices has not improved things.

And also think hard about why you would eschew the GC when it could single-handedly remove this entire category of bugs from your program in one fell swoop.

(Now, just below memory-related security bugs are data sanitization bugs. Unfortunately, the choice of language isn't going to help you very much there...)


T

-- 
In theory, software is implemented according to the design that has been carefully worked out beforehand. In practice, design documents are written after the fact to describe the sorry mess that has gone on before.