July 30, 2012
On Sun, 2012-07-29 at 21:35 +0200, Paulo Pinto wrote:
[…]
> The problem is that in the enterprise world with expendable programmers, is very hard to do JVM based projects with anything other than Java.

My experience from various training courses I have given is that in the large corporate Web applications world, the average programmer knows Java, just enough Java, and isn't that interested in anything else. A gross oversimplification, but a good indicator. However, within the financial sector, some of the large players are ditching Java and monolithic systems in favour of a small Scala core and Python, using what is effectively an SOA architecture. In all these organizations there are large bodies of C++ code, hence a move away from Java and C# to Python. In other organizations Java remains the core, but Groovy, JRuby and Clojure are admitted, which sounds like your experience…

> I was very happy when on a project for a new internal proprietary JSF based framework, we were allowed to have Groovy as part of the framework's supported languages.
> 
> That only happened because the said company was replacing the Perl scripts by Groovy scripts in their deployment infrastructure.

Perl is a fine language in many ways and not so fine in others. It seems though that it is increasingly seen as legacy in the way COBOL is.

-- 
Russel.
=============================================================================
Dr Russel Winder      t: +44 20 7585 2200   voip: sip:russel.winder@ekiga.net
41 Buckmaster Road    m: +44 7770 465 077   xmpp: russel@winder.org.uk
London SW11 1EN, UK   w: www.russel.org.uk  skype: russel_winder


July 30, 2012
On Monday, 30 July 2012 at 06:04:31 UTC, Russel Winder wrote:
> On Sun, 2012-07-29 at 21:35 +0200, Paulo Pinto wrote:
> […]
>> The problem is that in the enterprise world with expendable programmers, is very hard to do JVM based projects with anything other than Java.
>
> My experience from various training courses I have given is that in the
> large corporate Web applications world, the average programmer knows
> Java, just enough Java, and isn't that interested in anything else.
> Gross oversimplification but a good indicator.

That is actually the case.
July 30, 2012
On Monday, 30 July 2012 at 07:26:51 UTC, Paulo Pinto wrote:
> On Monday, 30 July 2012 at 06:04:31 UTC, Russel Winder wrote:
>> On Sun, 2012-07-29 at 21:35 +0200, Paulo Pinto wrote:
>>> The problem is that in the enterprise world with expendable programmers, is very hard to do JVM based projects with anything other than Java.
>>
>> My experience from various training courses I have given is that in the large corporate Web applications world, the average programmer knows Java, just enough Java, and isn't that interested in anything else. Gross oversimplification but a good indicator.
>
> That is actually the case.

 The only thing I walked away with after learning Java as part of a programming course was how polymorphism and interfaces work in OO programming.

 I just really wish they would change their courses away from "this is inheritance with animals!", which makes sense logically (in a dictionary/Wikipedia way) but not structurally or programming-wise. Describing inheritance as a building block for a game (like an item hierarchy) would have made much more sense.
July 30, 2012
On Monday, 30 July 2012 at 07:57:47 UTC, Era Scarecrow wrote:
> On Monday, 30 July 2012 at 07:26:51 UTC, Paulo Pinto wrote:
>> On Monday, 30 July 2012 at 06:04:31 UTC, Russel Winder wrote:
>>> On Sun, 2012-07-29 at 21:35 +0200, Paulo Pinto wrote:
>>>> The problem is that in the enterprise world with expendable programmers, is very hard to do JVM based projects with anything other than Java.
>>>
>>> My experience from various training courses I have given is that in the large corporate Web applications world, the average programmer knows Java, just enough Java, and isn't that interested in anything else. Gross oversimplification but a good indicator.
>>
>> That is actually the case.
>
>  The only thing I walked away with after I learned java as part of a programming course, was how polymorphism and interfaces worked in OO programming.
>
>  I just really wish they would change their courses for 'this is inheritance with animals!' which although logically makes sense (in a dictionary wikipedia way) but structurally and programming-wise doesn't. Describing it as a building block for a game (like item) would have made much more sense.

It always depends on how lucky you get with who teaches you. I learned OO with
Turbo Pascal 6.0, while preparing a class about OO in high school.

Afterwards it was time to learn with OO the C++ way.

Before Java I was exposed to OO in Smalltalk, CLOS, Eiffel, Modula-3, Oberon,
Python.

Not only the languages, but also abstract OO design in the form of the Booch and UML methods.

So when I came to learn Java, many things felt natural, and I was not contaminated, like many, with the idea that OOP == Java.

--
Paulo



July 30, 2012
On Sunday, 29 July 2012 at 22:22:33 UTC, Walter Bright wrote:
> On 7/27/2012 7:38 PM, Jonathan M Davis wrote:
>> True, but I'm kind of shocked that anything 16-bit even still exists. _32-bit_
>> is on its way out. I thought that 16-bit was dead _years_ ago. I guess that
>> some embedded stuff must use it. But really, I wouldn't expect the lack of 16-
>> bit support to be much of an impediment - if any at all - and in the long run,
>> it'll mean absolutely nothing.
>
> For those who may not realize it, C++ is simply not suitable for 16 bit systems either. It theoretically supports 16 bit code, but in practice, full C++ will never work on them.
>
> So, you might ask, why was 16 bit C++ popular on 16 bit MSDOS in the 80's? That was C++ before exception handling and RTTI, both of which were unimplementable on 16 bit machines. (Yes, you could do it, but the result was practically unusable.)

I remember using both features with Borland compilers.

But then I was around 16 years old and was not doing anything serious
with C++, besides getting to know the language.

In those days, Turbo Pascal was my number one choice for serious software.

>
> C and 16 bits go reasonably well together, but even so, the best programs were written all in asm.


July 30, 2012
Jonathan M Davis, in message (digitalmars.D:173382), wrote:
> scope on local variables is going away for pretty much the same reason that delete is. They're unsafe, and the fact that they're in the core language encourages their use. So, they're being removed and put into the standard library instead.
> 

I don't mind scope going away, since it can be replaced with a library solution. But scope is no more dangerous than a static array, or plain function-local variables. Slice them, or take their reference, and you're in for trouble. Do you think those should be removed as core features of the language too?
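A minimal sketch of the kind of trouble meant here (illustrative D, not from the original post): letting a slice of a stack-allocated static array escape its frame leaves a dangling reference, just as an escaping reference to a scope variable would:

```d
int[] escape()
{
    int[4] buf = [1, 2, 3, 4]; // static array, lives on this stack frame
    return buf[];              // implicit slice of stack memory escapes the frame
}

void main()
{
    int[] s = escape(); // s now points into a dead stack frame: undefined behavior
    // reading s[0] here may yield garbage or crash
}
```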
July 30, 2012
On 07/27/12 21:22, Stuart wrote:
> On Friday, 27 July 2012 at 19:17:10 UTC, H. S. Teoh wrote:
>>
>> Nevertheless, D has gotten to the point where it's powerful enough that most feature requests can be implemented in a library rather than as a language extension.
> 
> How could Yield be implemented as a library feature? Something like:
> 
>    return ITERATOR(
>       { ...setup code ... },
>       { ...loop body code... })
> 
> ?
> 
> I don't see how that would work.

The threading "yield" and the iterator "yield" are actually strongly related. By yielding from a coroutine, you can effectively suspend a routine
and resume it later. Let me give you an example:

int delegate() genEvenNumbers() {
  // not actual api
  return startCoroutineIterator!int((void delegate(int) yield) {
    for (int i = 0; true; i++) if (i % 2 == 0) yield(i);
  });
}

// long form of the above function, so you can see how you gain iterator-style behavior via coroutines
int delegate() genEvenNumbers() {
  int* resultp = new int;
  void cofun(void delegate(int) yield) {
    for (int i = 0; true; i++) if (i % 2 == 0) yield(i);
  }
  // still not actual api
  auto coroutine = new Coroutine(1024*1024 /* stack size */);
  // set the 'main function' of the coroutine
  coroutine.fun = {
    // yield, when called, will cause the coroutine to suspend and run() to return
    cofun((int i) { *resultp = i; coroutine.yield(); });
  };
  // run the coroutine up to the next yield, then return yielded value
  return { coroutine.run(); return *resultp; };
}

You can fully implement Coroutine as a library, provided you're somewhat confident with x86 assembly and stack frame layouts.
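For completeness, a usage sketch under the same assumed (non-actual) API: each call to the returned delegate resumes the coroutine up to its next yield, so successive calls walk the even numbers:

```d
import std.stdio;

void main()
{
    auto next = genEvenNumbers(); // from the example above
    writeln(next()); // 0
    writeln(next()); // 2
    writeln(next()); // 4
}
```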
July 30, 2012
On Monday, July 30, 2012 11:23:06 Christophe Travert wrote:
> Jonathan M Davis , dans le message (digitalmars.D:173382), a écrit :
> > scope on local variables is going away for pretty much the same reason
> > that
> > delete is. They're unsafe, and the fact that they're in the core language
> > encourages their use. So, they're being removed and put into the standard
> > library instead.
> 
> I don't mind scope going away, since it can be replaced with a library solution. But scope is not more dangerous than a static array, or simple function variables. Slice them, or take their reference, and you're up for troubles. Do you think they should be removed as core features of the langage?

I don't think that you _can_ implement them in a library.

I _do_ think that implicit slicing of static arrays should go away, since it's just begging for bugs. But there's pretty much no way that that's going to change at this point.

- Jonathan M Davis
July 30, 2012
Le 28/07/2012 18:07, Andrei Alexandrescu a écrit :
>> Microsoft wouldn't have brought F# into Visual Studio if it wasn't worth
>> it, Microsoft is a business, not a language charity company.
>
> It was for Basic :o). Anyhow, indeed, the tools around it make F# pretty
> cool (just not all that original as a language).
>
> Andrei

Indeed, it is damn close to Caml.
July 30, 2012
On Mon, 30 Jul 2012 07:04:21 +0100
Russel Winder <russel@winder.org.uk> wrote:
>
> My experience from various training courses I have given is that in the large corporate Web applications world, the average programmer knows Java, just enough Java, and isn't that interested in anything else. Gross oversimplification but a good indicator.

Ugh, such "programmers" don't deserve a paycheck. It's like hiring someone who never learned arithmetic as an accountant. It makes no fucking sense at all - they simply *can't* do their fucking trade, period. The "programmer" and the hiring manager should both be fired and pointed to the nearest (possibly hiring) fast food joint.

(Incarceration for "impersonating a programmer" would perhaps be more
appropriate ;) )