March 11, 2012
On 3/10/2012 2:44 PM, Lars T. Kyllingstad wrote:
> But... writefln is still there. Is it incompatible with the D1 one in some way?

Try writefln(3);
March 11, 2012
"Walter Bright" <newshound2@digitalmars.com> wrote in message news:jjho0a$2qdu$1@digitalmars.com...
> On 3/10/2012 2:44 PM, Lars T. Kyllingstad wrote:
>> But... writefln is still there. Is it incompatible with the D1 one in some way?
>
> Try writefln(3);

Having been accustomed to C's printf, I don't think it would have ever even occurred to me to try that. (FWIW/YMMV)
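
For anyone else who would never have tried it either, here is roughly the difference as I understand it (a sketch, not the exact Phobos signatures):

    import std.stdio;

    void main()
    {
        // Under D1/Phobos 1, writefln happily printed any argument list,
        // so writefln(3) just printed "3".
        // D2's writefln expects a format string as its first argument,
        // so the old call no longer works the same way. The replacements:
        writeln(3);          // plain printing, prints "3"
        writefln("%s", 3);   // formatted equivalent
    }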


March 11, 2012
I find the point on developing on a slower computer very interesting, and here's my story.

I've been using an EeePC for everything for the past 2.5 years, and until now I could cope. I'm getting a new laptop this week because I sorely need it at the faculty (some robotics/image processing/computer vision work, which there is no way to run on an EeePC in real time).

However, I noticed an interesting trend when comparing my colleagues' programs with mine. For example, solving the 15-puzzle with heuristics took ~0.01 seconds on the Eee, while most of the others' programs took several seconds and found worse solutions (not all of them, of course, but most). When doing some local search optimisation, the difference was seconds versus HOURS. I guess someone was really sloppy, but still.

This has also been one of the reasons I became interested in languages like C and D. Believe it or not, at our university you never get to see C officially; you have to learn it yourself. I consider this pathetic. The official and only taught language is Java. Which, I grant them, is at least cross-platform, but I believe that every university-educated programmer must know C.

I am convinced that my university produces bad programmers, and as such I don't find it surprising that newly written programs are terribly slow, if they even work at all.

Matej
March 11, 2012
Le 09/03/2012 23:32, Walter Bright a écrit :
> This statement is from Linus Torvalds about breaking binary compatibility:
>
> https://lkml.org/lkml/2012/3/8/495
>
> While I don't think we need to worry so much at the moment about
> breaking binary compatibility with new D releases, we do have a big
> problem with breaking source code compatibility.
>
> This is why we need to have a VERY high bar for breaking changes.

I think Linus is mostly right. But I don't think this is a reason not to put the bar very high, especially with the current state of D. This is more about providing a nice, and long, transition process.

The way @property evolved is a good example of what we should do.
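
For readers who haven't followed that discussion, here is a minimal sketch of what @property changes (the type is made up for illustration; as far as I know, strict enforcement was phased in behind the opt-in -property compiler switch rather than breaking code overnight):

    // Hypothetical type, for illustration only.
    struct Temperature
    {
        private double celsius_;

        // Marked @property: intended to be used with field syntax, no parens.
        @property double celsius() const { return celsius_; }
        @property void celsius(double v) { celsius_ = v; }

        // Ordinary method: keeps call syntax.
        double fahrenheit() const { return celsius_ * 9.0 / 5.0 + 32.0; }
    }

    void main()
    {
        Temperature t;
        t.celsius = 21.5;          // setter used with assignment syntax
        auto f = t.fahrenheit();   // normal call
    }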

PHP is an example of managing change (even breaking change). PHP improved so much between v4 and v5, and again between v5 and v5.3, that it is factual proof that breaking changes can be made to great benefit.
March 11, 2012
On Sun, Mar 11, 2012 at 04:12:12AM -0400, Nick Sabalausky wrote:
> "so" <so@so.so> wrote in message news:pzghdzojddybajuguxwa@forum.dlang.org...
[...]
> > No matter how much hardware you throw at it, somehow it gets slower and slower.  New hardware can't keep up with (ever increasing) writing bad software.
> >
> > http://www.agner.org/optimize/blog/read.php?i=9
> >
> 
> That is a *FANTASTIC* article. Completely agree, and it's very well-written.

I really liked the point about GUIs. Many resources are used for graphical elements that have only aesthetic value AND tend to distract the user from actual work. IOW, you're wasting CPU, RAM, and disk time (which comes from spending lots of hard-earned cash on that expensive hardware upgrade) just for some silly eye-candy that has no value whatsoever except to distract from the task at hand, that is, accomplishing what you set out to do in the first place.

That's why I use ratpoison as my WM. Who needs title bars with fancy colored icons, gradient shading, and *shadows*?! I mean, c'mon. You're trying to get work done, not admire how clever the UI designers were and how cool a color gradient is. If I wanted to admire eye-candy, I'd be playing computer games, not working. (That said, though, I did at one point have a Compiz installation for the sole purpose of showing off Linux to clueless people. :-P)

Then there are the points about background processes, auto-updates, and boot-up times. These are things about Windows that consistently drive me up the wall. Background processes are all nice and good as long as they are (1) necessary, and (2) don't do stupid things like hog your CPU or thrash your disk every 12 seconds. But the way Windows works, every time you install something, it insists on starting up at boot time, incessantly checking for auto-updates every 12 seconds, downloading crap from online without your knowledge, and THEN popping up those intrusive, distracting, and utterly annoying "Plz Update Meeee!" dialogs. Ugh. Every time I see one of those dialogs I have the urge to delete the app and expunge all traces of it from the system with extreme prejudice.

At least on Linux you can turn off this crap and/or otherwise prevent it from doing stupid things. But on Windows you have no choice. Attempting to disable stuff usually breaks said apps, or affects system usability in some way.


> That's actually one of reasons I like to *not* use higher-end hardware.  Every programmer in the world, no exceptions, has a natural tendancy to target the hardware they're developing on. If you're developing on high-end hardware, your software is likely to end up requiring high-end hardware even without your noticing. If you're developing on lower-end hardware, your software is going to run well on fucking *everything*.

True. I suppose it's a good thing at my day job that we don't get free upgrades. Whatever was current when we first got the job is whatever we have today. It does have a certain value to it, in that we notice how idiotically long it takes to compile the software we're working on, and how much CPU and RAM a particular ludicrously-long linker command-line eats up at a certain point in the build (which, not too surprisingly, is the linking of the GUI component). It does provide a disincentive against doing more stupid things to make this worse.

Now if only everyone (particularly the people working on the GUI component :-P) had 5-year-old development machines, perhaps that ludicrously long linker command would never have existed in the first place. Well, I can dream. :-)


T

-- 
Ignorance is bliss... but only until you suffer the consequences!
March 11, 2012
Le 10/03/2012 00:24, Jonathan M Davis a écrit :
> On Friday, March 09, 2012 15:14:39 H. S. Teoh wrote:
>> On Fri, Mar 09, 2012 at 11:46:24PM +0100, Alex Rønne Petersen wrote:
>>> On 09-03-2012 23:32, Walter Bright wrote:
>>>> This statement is from Linus Torvalds about breaking binary
>>>> compatibility:
>>>>
>>>> https://lkml.org/lkml/2012/3/8/495
>>>>
>>>> While I don't think we need to worry so much at the moment about
>>>> breaking binary compatibility with new D releases, we do have a big
>>>> problem with breaking source code compatibility.
>>>>
>>>> This is why we need to have a VERY high bar for breaking changes.
>>>
>>> If we want to start being able to avoid breaking changes, we
>>> *really* need to finally deprecate the stuff that's been slated for
>>> deprecation for ages...
>>
>> [...]
>>
>> Does that include std.stdio and std.stream? When are we expecting std.io
>> to be ready?
>>
>> IMHO, this is one major change that needs to happen sooner rather than
>> later. The current lack of interoperability between std.stdio and
>> std.stream is a big detraction from Phobos' overall quality.
>
> Note that he didn't say that we should _never_ make breaking changes but
> rather that we need to have a very high bar for making such changes. In
> particular, it's stuff like renaming functions without changing functionality
> that he's against.
>

On that note, having a consistent naming convention is an issue of the first importance. In the topic in question I argued for a particular convention, but let's look at the bigger picture and how it relates to this thread. Whatever the naming convention is, it is mandatory to have one.

So at some point renaming is going to happen, or Phobos will become completely opaque as it grows.

Renaming just one function isn't important enough to justify breaking compatibility. Having a consistent naming convention is.
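
To make that concrete, here is one way a convention-driven rename can be phased in without immediately breaking anyone's code (the module and names are made up; the pattern is just a deprecated alias kept around for a transition period):

    module mylib.path;   // hypothetical module, for illustration only

    import std.path : buildPath;

    /// New name, following the agreed camelCase convention.
    string joinedPath(string dir, string file)
    {
        return buildPath(dir, file);
    }

    /// Old name kept for a release cycle or two, so existing code keeps
    /// compiling (with a deprecation warning) instead of breaking outright.
    deprecated alias joined_path = joinedPath;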
March 11, 2012
On Sun, Mar 11, 2012 at 12:20:43PM +0100, Matej Nanut wrote: [...]
> This has also been one of the reasons I became interested in languages like C and D. Believe it or not, at our university you never get to see C officially; you have to learn it yourself. I consider this pathetic. The official and only taught language is Java.

Ugh.


> Which, I grant them, is at least cross-platform, but I believe that every university-educated programmer must know C.

+1.

Java is a not-bad language. In fact, as a language it has quite a few good points. However, one thing I could never stand about Java culture is what I call the bandwagon-jumping attitude. It's this obsessive belief that Java is the best thing invented since coffee (har har), that it's the panacea to solve all programming problems, cure world hunger, and achieve world peace, and that whoever doesn't use Java must therefore be some inferior antiquated dinosaur from the last ice age. Every new trend that comes out must be blindly adopted without any question, because obviously new == good, and therefore whatever diseased fancy some self-appointed genius dreamed up one night must be adopted without regard for whether it actually adds value. C is therefore a fossilized relic from bygone times that nobody uses anymore, and we've never heard of what on earth an assembler is, nor do we care, since the JVM is obviously superior anyway.

As the saying goes, if you don't know history, you'll end up repeating it.


> I am convinced that my university produces bad programmers, and as such I don't find it surprising that newly written programs are terribly slow, if they even work at all.
[...]

Obligatory quote:

	If Java had true garbage collection, most programs would delete
	themselves upon execution. -- Robert Sewell

:-)


T

-- 
Question authority. Don't ask why, just do it.
March 11, 2012
Le 10/03/2012 20:37, Walter Bright a écrit :
> On 3/10/2012 10:58 AM, H. S. Teoh wrote:
>> Win9x's success is mainly attributable to Microsoft's superior marketing
>> strategies. It can hardly be called a success technology-wise.
>
> Oh, I disagree with that. Certainly, Win9x was a compromise, but it
> nailed being a transition operating system from 16 to 32 bit, and it
> nailed making Windows an attractive target for game developers.

Windows 3.1 had patches provided by Microsoft to handle 32 bits (Win32s). But this is quite off topic. Win9x was good back then; now it is crap.

When doing something new (like D), it isn't enough to provide something as good as what existed before. Actually, providing something better isn't enough either. You need to provide enough benefit to compensate for the cost of the change, and on top of that, communication/marketing must convince users to switch.
March 11, 2012
Le 10/03/2012 06:47, Walter Bright a écrit :
> On 3/9/2012 8:40 PM, Adam D. Ruppe wrote:
>> On Windows though, even if you relied on bugs twenty
>> years ago, they bend over backward to keep your app
>> functioning. It is really an amazing feat they've
>> accomplished, both from technical and business
>> perspectives, in doing this while still moving
>> forward.
>
> I agree that Windows does a better job of it than Linux. MS really does
> pour enormous effort into backwards compatibility. You could
> legitimately call it heroic - and it has paid off for MS.

Microsoft being incompatible with mostly everything else, they sure can't afford not to be compatible with themselves.

This is of strategic importance for Microsoft. I don't think it is as important for us as it is for Microsoft (even though it still matters).
March 11, 2012
Le 10/03/2012 20:38, Walter Bright a écrit :
> On 3/10/2012 11:31 AM, Nick Sabalausky wrote:
>> I still like the name better. Do we really need an alphabet soup
>> appended to
>> "write" just to spit out one string?
>
> It's not about whether it was a better name. It was about having to
> constantly edit code.

I do think the better name isn't the real problem. The problem is consistency, and it will persist as long as we don't agree on a naming guideline for Phobos.

Changing a name just for the sake of changing it isn't worth the cost, unless the original name is horribly misleading - a rare case. But getting the naming convention consistent is of much greater importance, and justifies breaking code.