February 02, 2012
On 02/02/2012 08:32 PM, H. S. Teoh wrote:
> On Thu, Feb 02, 2012 at 08:20:55PM +0100, dsimcha wrote:
>> On Thursday, 2 February 2012 at 18:55:02 UTC, Andrej Mitrovic wrote:
>>> On 2/2/12, Manu <turkeyman@gmail.com> wrote:
>>>> PCs are an endangered and dying species...
>>>
>>> Kind of like when we got rid of cars and trains and ships once we
>>> started making jumbo jets.
>>>
>>> Oh wait, that didn't happen.
>>
>> Agreed.  I just recently got my first smartphone and I love it.  I see
>> it as a complement to a PC, though, not as a substitute.  It's great
>> for when I'm on the go, but when I'm at home or at work I like a
>> bigger screen, a full keyboard, a faster processor, more memory, etc.
>> Of course smartphones will get more powerful but I doubt any will ever
>> have dual 22 inch monitors.
> [...]
>
> Funny you should mention that, a number of years ago I got to know a guy
> who worked in a lab that was researching organic polymer-based
> electronics, which lets you build things like screens and keyboards that
> can be rolled up or folded and put into your pocket. It was still
> experimental technology at the time, though, and I'm not expecting it to
> be publicly available anytime soon. But the day may come when your
> smartphone *can* literally have a 22" monitor (that folds into a
> pocket-sized card).
>
>
> T
>

Chances are that RegionAllocator can run on mobile phones just fine by the time that happens. ;)
February 02, 2012
Long term I suspect the typical desktop experience will just be a phone or other mobile device talking wirelessly to input and display peripherals.

On Feb 2, 2012, at 11:20 AM, "dsimcha" <dsimcha@yahoo.com> wrote:

> On Thursday, 2 February 2012 at 18:55:02 UTC, Andrej Mitrovic wrote:
>> On 2/2/12, Manu <turkeyman@gmail.com> wrote:
>>> PCs are an endangered and dying species...
>> 
>> Kind of like when we got rid of cars and trains and ships once we started making jumbo jets.
>> 
>> Oh wait, that didn't happen.
> 
> Agreed.  I just recently got my first smartphone and I love it.  I see it as a complement to a PC, though, not as a substitute.  It's great for when I'm on the go, but when I'm at home or at work I like a bigger screen, a full keyboard, a faster processor, more memory, etc.  Of course smartphones will get more powerful but I doubt any will ever have dual 22 inch monitors.
February 02, 2012
On Thu, Feb 02, 2012 at 01:22:54PM -0800, Sean Kelly wrote:
> Long term I suspect the typical desktop experience will just be a phone or other mobile device talking wirelessly to input and display peripherals.
[...]

Funny, it comes full circle to the good ole days of dumb X terminals that only handle input/display, and leave the actual computing to the server. Except now the server is no longer an oversized ceiling-high machine in the server room, but a tablet you can fit into your pocket. :)

However, due to the inconvenience of needing to be near a bulky monitor and keyboard, I suspect rather that the mobile device will come with a built-in projector and a foldable keyboard. The latter already exists; the former would make it usable anywhere there's a large flat surface.


T

-- 
Heuristics are bug-ridden by definition. If they didn't have bugs, they'd be algorithms.
February 02, 2012
On 2 February 2012 20:13, Jonathan M Davis <jmdavisProg@gmx.com> wrote:

> On Thursday, February 02, 2012 20:06:14 Manu wrote:
> > On 2 February 2012 17:40, dsimcha <dsimcha@yahoo.com> wrote:
> > > On Thursday, 2 February 2012 at 04:38:49 UTC, Robert Jacques wrote:
> > >> An XML parser would probably want some kind of stack segment growth schedule, which, IIRC, isn't supported by RegionAllocator.
> > >
> > > at least assuming we're targeting PCs and not embedded devices.
> >
> > I don't know about the implications of your decision, but that comment
> > makes me feel uneasy.
> >
> > I don't know how you can possibly make that assumption? Have you looked
> > around at the devices people actually use these days?
> > PCs are an endangered and dying species... I couldn't imagine a worse
> > assumption if it influences the application of D on different systems.
>
> PCs are not endangered in the least. It's just that we're getting an
> increase in other devices (particularly smart phones and tablets). PCs are
> _heavily_ used, and there's no way that smart phones or tablets could
> replace them. They do different stuff. It _is_ true that applications are
> increasingly being written for non-PCs, but PCs definitely aren't dying off.
>
> Also, how much do you really treat smart phones or tablets like embedded devices rather than PCs? They're certainly more like PCs than the embedded devices of yore. True, they have stricter performance requirements, but they're nowhere near as restrictive as they used to be.


They're restrictive devices with weaker hardware, yet people generally seem
to expect roughly the same experience from them as they get from their PC.
A modern PC's software can be written in any language the programmer likes,
with the expectation that the result will generally run well. Performance is
almost a 'solved' problem on PCs for most applications.

I really just argue that D, a so-called systems language [this is surely D's primary offering/niche, or am I mistaken? It basically competes with C/C++, and on roughly the same terms], needs to constantly consider (and potentially even favour) the performance requirements of these weaker systems. They are the devices with the highest demand for efficiency placed on them.

For productivity applications, PCs will remain dominant, sure, but consider the number of productivity applications vs. the number of entertainment products/games/apps/toys. The scope doesn't even compare. Also consider that productivity software is rarely written from scratch. Most productivity software people use is already written; it only gets updated now, and it isn't going to be rewritten in D any time soon.

Games/apps/toys, in terms of the number of pieces of software developed, and
the number of developers writing such software (read: **NEW** software, where
the choice of using a new language like D is a *real* possibility),
completely blows the PC developer base SOOO far out of the water it's
barely quantifiable, probably thousands to 1. That developer base is not
working on x86 platforms; they're targeting games consoles, phones,
tablets... TVs will be big very soon.
They are a large community, grown very tired of C/C++, who want a language
that's not shit, and that is exactly as efficient as C/C++ when used
correctly. D fits this bill like a glove, and it also has the advantage of
being compatible with their existing libraries.

Embedded platforms toyed with the notion of using managed languages for a
little while, not even offering C toolchains at first (Android, Windows
Phone), but that failed, and both revoked that policy. The bar is rising
fast on those platforms. Everyone producing high-end products requires
native development to remain competitive.
There's really not a lot of modern native-language competition out there; D
is positioned well, and with this fact in mind, D's performance must be as
comparable as possible to C/C++'s.
In fact, D has the potential for many performance *advantages* over C/C++,
but the GC is a key concern for me personally... and if it's possible to tune
it for these systems, it should be. They need it more, and PCs honestly won't
care that much :)

Apologies for once again derailing an unrelated topic! :)
I really just can't handle seeing this sort of presumption constantly
popping up all over the newsgroup. This PC-centric mentality has to go if
D is to have any commercial success. The commercial software industry moved
on years ago; it's not on the PC anymore... the D community needs to
become intimately aware of that fact, whether they like it or not.
(presuming, of course, that the intent here is for D to be successful :)


February 03, 2012
On 2 February 2012 21:15, dsimcha <dsimcha@yahoo.com> wrote:

> On Thursday, 2 February 2012 at 18:06:24 UTC, Manu wrote:
>
>> On 2 February 2012 17:40, dsimcha <dsimcha@yahoo.com> wrote:
>>
>>> On Thursday, 2 February 2012 at 04:38:49 UTC, Robert Jacques wrote:
>>>
>>>> An XML parser would probably want some kind of stack segment growth
>>>> schedule, which, IIRC, isn't supported by RegionAllocator.
>>>
>>> at least assuming we're targeting PCs and not embedded devices.
>>>
>> I don't know about the implications of your decision, but that comment makes me feel uneasy.
>>
>> I don't know how you can possibly make that assumption? Have you looked
>> around at the devices people actually use these days?
>> PCs are an endangered and dying species... I couldn't imagine a worse
>> assumption if it influences the application of D on different systems.
>>
>
> I'm not saying that embedded isn't important.  It's just that for low level stuff like memory management it requires a completely different mindset.  RegionAllocator is meant to be fast and simple at the expense of space efficiency.  In embedded you'd probably want completely different tradeoffs.  Depending on how deeply embedded, space efficiency might be the most important thing.  I don't know exactly what tradeoffs you'd want, though, since I don't do embedded development.  My guess is that you'd want something completely different, not RegionAllocator plus a few tweaks that would complicate it for PC use.  Therefore, I designed RegionAllocator for PCs with no consideration for embedded environments.
>

Okay, this reasoning seems sound to me. I wonder, though: is it easy (or
even possible) to plug in a compatible new back end for embedded systems? It
should definitely conserve memory at all costs, but above all, it needs to
be fast.
Short term, embedded platforms will surely have some leniency with memory
size, but they demand high performance.

What sort of overheads are we talking about for these allocators? I wonder what the minimum footprint of a D app would be, i.e. exe size + minimum memory allocation for the runtime, etc.?
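To make the overhead question concrete, here's a minimal sketch of the bump-pointer scheme that RegionAllocator-style allocators typically use. This is not RegionAllocator's actual implementation, just an illustration, with a hypothetical doubling growth schedule of the kind Robert mentioned:

    import core.stdc.stdlib : malloc;

    struct Region
    {
        ubyte* base;      // current segment
        size_t used;      // bytes handed out in this segment
        size_t capacity;  // size of this segment

        void* allocate(size_t nbytes)
        {
            nbytes = (nbytes + 15) & ~cast(size_t) 15;  // 16-byte alignment
            if (base is null || used + nbytes > capacity)
            {
                // Hypothetical growth schedule: each new segment doubles.
                capacity = capacity ? capacity * 2 : 4096;
                if (capacity < nbytes)
                    capacity = nbytes;
                base = cast(ubyte*) malloc(capacity);
                used = 0;
                // A real allocator would chain segments so they can all
                // be freed when the region dies; omitted for brevity.
            }
            void* p = base + used;
            used += nbytes;
            return p;
        }
    }

The entire per-allocation cost is an add and a compare, plus an occasional malloc, which is why this style of allocator stays fast even on weak hardware.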


February 03, 2012
People and, more importantly, professionals will always want the most power they can get to do their work. Whether that power remains on the client system or is offloaded to the home/office server or the cloud is the big question.

I think the future will see an increase in local servers which intelligently allocate resources to transport-friendly modular client devices in both homes and businesses.
February 03, 2012
On 2 February 2012 21:20, dsimcha <dsimcha@yahoo.com> wrote:

> On Thursday, 2 February 2012 at 18:55:02 UTC, Andrej Mitrovic wrote:
>
>> On 2/2/12, Manu <turkeyman@gmail.com> wrote:
>>
>>> PCs are an endangered and dying species...
>>>
>>
>> Kind of like when we got rid of cars and trains and ships once we started making jumbo jets.
>>
>> Oh wait, that didn't happen.
>>
>
> Agreed.  I just recently got my first smartphone and I love it.  I see it as a complement to a PC, though, not as a substitute.  It's great for when I'm on the go, but when I'm at home or at work I like a bigger screen, a full keyboard, a faster processor, more memory, etc.  Of course smartphones will get more powerful but I doubt any will ever have dual 22 inch monitors.
>

You're obviously a nerd, though, hanging out in a place like this :)

There is no small number of people who use their PCs purely for communication: Facebook, email, Skype, etc., and there are well-documented trends showing that more and more people are performing those tasks exclusively from their phones/portable devices. They are now, already, ex-PC users. That trend is accelerating (I wish I could remember where the slides were >_<)

My girlfriend, for instance: I bought an iPad to do some dev on... she
confiscated it one day after I showed her the eBook app, which she loved,
and she promptly threw away her whole hard-copy library...
Funnily enough, and completely unconsciously on her part, I haven't seen
her open her computer more than once or twice since then. It has
everything she ever used her computer for, it's more portable and
convenient, it fits in her bag (she takes it to class), and it even has a
recipe app and conveniently sits on the bench while she's baking :)
I have had the same discussion in the office, and I'm not the only one who
has seen a similar transition first-hand.

Most people aren't computer nerds... they don't give 2 shits about
computers, it's just a tool, and the second something more practical or
convenient comes along for their purposes, they'll use that instead.
That time has already passed, the trend is accelerating quickly, the
snowball is rolling, and even REAL nerds are starting to slip. (I must
confess, I'm seriously tempted by a Transformer Prime...)


I think the MOST interesting part about this trend, though, is that many of
these people who used their laptop/PC *exclusively* for communication have
now discovered these mobile app stores, and they're being exposed to apps
that do all sorts of things. Far more than they ever did on their PCs of
days past. Even games; people who would say they have no interest in video
games at all are picking up simple mobile/tablet games at an alarming rate.
This is a huge and fast-growing software market, almost all the software is
written from scratch (read: developers *could* choose D if it were
available), and D should be a part of it!
The language landscape in this space is very turbulent at the moment, but
it will settle, and D needs to be in it before it does, or it will miss the
boat.


February 03, 2012
On 2 February 2012 23:22, Sean Kelly <sean@invisibleduck.org> wrote:

> Long term I suspect the typical desktop experience will just be a phone or other mobile device talking wirelessly to input and display peripherals.


'Long term'? I think you're being a bit pessimistic. It's already here! http://eee.asus.com/eeepad/transformer-prime/features/ (do want!)

I give it... maybe 2-3 years before it's standard (a very significant portion of the hardware landscape). I don't think that's very long... certainly not a lot of time for D to get its shit together! ;)


February 03, 2012
On 02/02/2012 17:11, Jesse Phillips wrote:
> for me, disabling the GC during load doesn't change load time,
> but I'm not using the document loader.
>


The GC hit is related to the number of DOM nodes that exist at one time, I think: the visitor approach doesn't allocate the whole DOM tree, so there are far fewer items (and less allocated memory for the GC to scan).
For comparison, parsing my test file using XmlVisitor takes less than 3 seconds (over twice as fast as the DOM version).


I looked into it a bit and found this: http://www.dsource.org/projects/xmlp/ticket/10

Seems that it's calling GC.qmalloc 350000+ times, mostly for 16-byte blocks, when normalizing attributes. This doesn't seem hugely clever :-(
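For reference, reproducing the experiment Jesse mentions takes only a couple of calls from core.memory; parseDocument below is a hypothetical stand-in for whichever loader is being timed:

    import core.memory : GC;

    // Hypothetical stand-in for the xmlp document loader under test.
    Object parseDocument(string path) { return null; }

    void loadWithoutGC(string path)
    {
        GC.disable();              // no collections can run during the parse
        scope (exit) GC.enable();  // re-enabled even if parsing throws
        auto doc = parseDocument(path);
        // Any deferred collection now happens once, after the whole DOM
        // exists, rather than repeatedly scanning a growing node graph
        // mid-parse.
    }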
February 03, 2012
Speaking of GC improvements, I googled around a bit and found this thread from 2 years ago:

	http://www.digitalmars.com/d/archives/digitalmars/D/Is_there_a_modern_GC_for_D_105967.html

Especially interesting is this paper:

	Nonintrusive Cloning Garbage Collector with Stock Operating
	System Support.
	        http://www.cs.purdue.edu/homes/grr/snapshot-gc.ps

Has anything been done along these lines since then? It seems like this particular GC algorithm is exactly the kind we need for D. It's a conservative mark-and-sweep algorithm with very low overhead(*), a mark phase that runs concurrently with the mutator thread(s), and lazy incremental sweeping at allocation time.  Synchronization is handled automatically by standard OS kernel mechanisms (copy-on-write memory pages).
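The kernel-assisted part is simple enough to sketch in D on a POSIX system (the helper names below are hypothetical; the paper's actual collector is of course far more involved):

    import core.sys.posix.unistd : fork, _exit;

    // Hypothetical hooks standing in for the collector's internals.
    void markFromRoots() {}        // trace the frozen snapshot
    void publishMarkResults() {}   // e.g. hand mark bits back via a pipe

    void snapshotCollect()
    {
        auto pid = fork();
        if (pid == 0)
        {
            // Child: sees a copy-on-write snapshot of the heap, frozen
            // at the instant of fork(), and can mark it at leisure while
            // the parent keeps mutating its own (diverging) pages.
            markFromRoots();
            publishMarkResults();
            _exit(0);
        }
        // Parent: returns immediately and keeps allocating; blocks the
        // child reports as dead are swept lazily at allocation time.
    }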

More to the point, how easy/hard is it to switch between GCs in the current D implementation(s)? I think it would be helpful if these kinds of experimental GCs were available in addition to the current default GC, so that people could play around with them and find out which one(s) are the cream of the crop. Otherwise we're just bottlenecked on the small number of people who can actually play with GC algos for D -- which means improvements will be slow.


(*) True on *nix systems anyway, but judging from comments in
that thread, Windoze was also acquiring equivalent functionality -- and
that being 2 years ago, I'm assuming it's available now.


--
T