November 23, 2005
Re: Var Types
Georg Wrede wrote:
> One thing that comes to mind is cryptography. Doing serious encrypting 
> on the fly would benefit from having, say, 1024 bit processors.

I'm no CPU engineer, but I think there must be a trade-off between 
extremely wide hardware registers and other CPU optimizations like 
parallelism. Still, current arbitrary-precision math libraries run 
significantly faster on 64-bit CPUs:

http://www.swox.com/gmp/32vs64.html

> Anybody remember the IBM boss? Or the Intel boss? Or Bill? They all said 
> "xxx ought to be plenty enough forever".
> 

So you would like an address space bigger than it is physically possible 
to implement using all the material we have on Earth?!

> Currently (I believe) none of the 64 bit cpus actually have address 
> buses that are 64 bits wide. Nobody is the wiser, but when you go to the 
> pc vendor and ask how much memory one can put on this or that 64 bit PC, 
> the usual answer is like "16GB".

These 64-bit machines cannot use traditional page tables; otherwise you 
would end up filling all your available memory with the page tables 
themselves. I think there may be other addressing problems as well, but 
at the very least you need to use inverted page tables.

> The one thing that guarantees us a 256 bit cpu on everybody's table top 
> eventually, has nothing to do with computers per se. It's to do with the 
> structure of the Western society, and also with our inborn genes.
> 
> Better computer games, anybody?

The best computer games are usually small enough to fit on a single 
floppy disk. :)

> 
> Speech recognition that actually works?

AFAIK modern speech synthesis and recognition are based on linguistics 
and highly sophisticated neural networks. In fact they need less space 
than those cut-and-paste systems of a decade ago.


Jari-Matti
November 23, 2005
Re: Var Types
In article <43848FF5.10909@nospam.org>, Georg Wrede says...
>
>Don Clugston wrote:
>> Tomás Rossi wrote:
>>> In article <dlv33k$1o7u$1@digitaldaemon.com>, xs0 says...
>>> 
>>>>>> What happens to our short, int, long language types when
>>>>>> 256-bit processors come along?  We'd find it hard to address
>>>>>> a 16-bit integer in that system limited to only three type
>>>>>> names.
>>>>> 
>>>>> Exactly, what would happen? Would "we" have to engineer another
>>>>> language, would it be D v2 :P? Certainly platform-dependent
>>>>> integral types are THE choice. Aliases of the form intXXX would
>>>>> always be necessary.
>>>> 
>>>> I think you guys are exaggerating the problem. Even 64-bit CPUs
>>>> were developed (afaik) mainly because of the need to cleanly
>>>> address more than 4GB of RAM, not because there's some
>>>> overwhelming need for 64-bit calculations. Considering how much
>>>> RAM/disk/whatever 2^64 is, I don't think anyone will need a CPU
>>>> that is 128-bit, let alone 256-bit any time soon (and even if
>>>> developed because of marketing purposes, I see no reason to use
>>>> 32-byte variables to have loop counters from 0 to 99).
>
>(Hmm, so   for(byte b=0; b<99; b++)  is what one writes today?)
>
>;-)
>
>One thing that comes to mind is cryptography. Doing serious encrypting 
>on the fly would benefit from having, say, 1024 bit processors.
>
>Oh yes, and the NSA and other spooks really need double the width that 
>everyone else has. This is a law of nature. :-)
>
>I remember reading about a graphics card that had a 256 bit cpu. This 
>was so long ago that I think it's on the market already.
>
>>> Some people said the same thing about 32-bit machines before those
>>> were developed, and now we have 64-bit CPUs. Plus, I'm sure that
>>> 128/256-bit CPUs already exist nowadays, maybe not for home PCs,
>>> but who says D only has to run on home computers? For example, the
>>> PlayStation 2 platform is built upon a 128-bit CPU!
>
>Anybody remember the IBM boss? Or the Intel boss? Or Bill? They all said 
>"xxx ought to be plenty enough forever".
>
>> No, it's a fundamentally different situation. We're running up
>> against the laws of physics. 2^64 is a fantastically large number.
>> 
>> (a) Address buses. If you could store one bit of RAM per silicon
>> atom, a memory chip big enough to require 65 bit addressing would be
>> one cubic centimetre in size. Consider that existing memory chips are
>> only 2D, and you need wiring to connect to each bit. Even if the
>> cooling issues are overcome, it's really hard to imagine that
>> happening. A memory chip big enough to require 129 bit addressing
>> would be larger than the planet.
>> 
>> The point is, that we're approaching Star Trek territory. The
>> Enterprise computer probably only has a 128 bit address bus.
>> 
>> Many people think that doubling the number of bits is exponential 
>> growth. It's not. Adding one more bit is exponential growth! It's
>> exp(exp(x)), which is a frighteningly fast function. Faster than a
>> factorial!
>> 
>> (b) Data buses
>> 
>> I began programming with 8 bit data registers. That was limiting.
>> Almost all useful numbers are greater than 256. 16 bits was better,
>> but still many quanties are > 65536. But almost everything useful
>> fits into a 32 bit register. 32 bits really is a natural size for an
>> integer. The % of applications where each of these is inadequate is
>> decreasing exponentially. Very few applications need 128 bit
>> integers.
>> 
>> But almost everything can use increased parallellism, hence 128 bit
>> SIMD instructions. But they're still only using 32 bit integers.
>> 
>> I think that even the 60 year timeframe for 128 bit address buses is
>> a bit optimistic. (But I think we will see 1024-core processors long
>>  before then. Maybe even by 2015. And we'll see 64K core systems).
>
>Depends.
>
>I started doing programming on an HP handheld. It had a 4 bit cpu. (Yes, 
>four bits.) Its address bus was wider, though. Programming it was done 
>in assembly, although they never said it in the manual, probably so as 
>not to frighten folks away.
>
>My next assembly I wrote on the 6502, which was an 8 bit cpu. The 
>address bus was 16 bits. Then I went on to the PC, which was touted as a 
>16 bit machine. True, the 8086 was 16 bits, but because that needed an 
>expensive motherboard and memory hardware, a cheaper version was built 
>for the masses, the 8088, so we had 16 bit PCs with an 8 bit data bus. 
>Slow yes, but cheaper. But still 16 bit. (The software never knew.)
>
>Currently (I believe) none of the 64 bit cpus actually have address 
>buses that are 64 bits wide. Nobody is the wiser, but when you go to the 
>pc vendor and ask how much memory one can put on this or that 64 bit PC, 
>the usual answer is like "16GB".
>
>It is also conceivable (somebody here know the fact?) that most of those 
>64 bit modern PCs actually use a 32 bit data bus.
>
>So, historically, the data bus, the address buss, and the accumulator 
>(where integer math is done, and the width of which is often taken to be 
>the "width of the cpu") have usually not all had the same width -- 
>although folks seem to believe so.
>
>---
>
>What we however do need, is a virtual address space that is large enough 
>to accommodate the most demanding applications and data. This makes 
>writing software (and especially operating systems and compilers) a lot 
>easier, because we then don't have to start constructing kludges for the 
>situations where we bang into the end of the memory range. (This is a 
>separate issue from running out of memory.)
>
>---
>
>The one thing that guarantees us a 256 bit cpu on everybody's table top 
>eventually, has nothing to do with computers per se. It's to do with the 
>structure of the Western society, and also with our inborn genes.
>
>(What???) First, the society thing: the world (within the foreseeable 
>future) is based on capitalism. (I'm no commie, so it's fine with me.) 
>This in itself makes vendors compete. And that's good, otherwise we'd 
>still all be driving T-Fords.
>
>But this leads to bigger, faster, fancier, cooler, ... ad absurdum. Laws 
>of physics, bah. In the eighties, it was common knowledge that we 
>wouldn't have hard disks by the time a hard disk goes beyond gigabyte 
>size. It  was supposed to be physically impossible. It would have to be 
>some kind of solid state tech instead. And now I read about Nokia phones 
>having internal hard disks with multi gig capacity.
>
>Second, it's in our genes. There's a revealing commercial on my TV: day 
>care kids on a break. "My mom makes better food than yours." "My mom 
>makes better food than both of your mothers." A teenager walks by and 
>says "My mother makes better food than any of yours." And then this 
>2-year old from the other kindergarten says over the fence: "My mother 
>makes the food all your mothers serve." (Think Nestle, Kraft, whatever.)
>
>Suits and non-nerds live to brag. "A trillion bit cpu? Naaa, get out of 
>here!" That day is nearer than Doomsday. I don't even bother to bet on 
>it, would be like stealing the money.
>
>---
>
>Ever heard "no matter how big your garage, it fills up with crap, and 
>soon your car stays outside"?
>
>Ever heard "it makes no difference how big your hard disk is, it takes 
>the same amount of time before it gets full"?
>
>Ever heard "it makes no difference how fast the PC is, the next Windows 
>puts it on its knees anyhow"?
>
>---
>
>Better computer games, anybody?
>
>Who wouldn't like to have a 6-month accurate weather forecast in the 
>pocket the day Boss asks when you'd like to have this year's vacation?
>
>We need a good computer model of the human body, so we don't have to 
>kill mice, pigs and apes whenever a new drug is developed.
>
>Earthquakes, anybody?
>
>Speech recognition that actually works?
>
>How about a computer that the law makers can ask every time they've 
>invented a new law? They could get an answer to how the new law _really_ 
>would impact citizens, tax revenue, and other laws!

You're out of your mind man but I like it :P

I don't like binding short/int/long to any specific size, because we don't
know for sure what will happen in the coming years... maybe with future
quantum computers 32-bit integers will turn out to be ridiculously small, or
may not even exist anymore. Fixing integer widths instead of making them
platform-specific makes D a lot less scalable, and it would be nice if D
could be used on distant-future platforms as well, without changing its spec.

Tom
November 23, 2005
Re: Var Types
Tom wrote:

> I don't like binding short/int/long to any specific size, because we don't
> know for sure what will happen in the coming years... maybe with future
> quantum computers 32-bit integers will turn out to be ridiculously small, or
> may not even exist anymore. Fixing integer widths instead of making them
> platform-specific makes D a lot less scalable, and it would be nice if D
> could be used on distant-future platforms as well, without changing its spec.

Well, if I understand the article

http://www.iiap.res.in/outreach/blackhole5.html

correctly, any device can only process about 10^44 bits per second (where 
"any device" means _any_ device, even the entire universe), so even in a 
trillion years you can only get about 10^63 bits processed, which is 
about 2^210. Considering how small a part of space-time we occupy, and 
how the universe is not trying hard to produce information useful to 
humans, I think it's safe to say we'll _never_ need more than 128-bit 
addressing, at least in this universe :)

As for data itself, can you think of any single quantity one would want 
to commonly represent in a computer using more than 128 bits? If not, D 
has it all covered (btw, also note that you can't measure things with 
infinite precision (again regardless of technology), so something like a 
2048-bit double for that extra precision is not a good answer, at least 
if you're not into marketing ;)


xs0
November 23, 2005
Re: Var Types
Jari-Matti Mäkelä wrote:
> Georg Wrede wrote:
> 
>> One thing that comes to mind is cryptography. Doing serious
>> encrypting on the fly would benefit from having, say, 1024 bit
>> processors.
> 
> I'm no CPU engineer, but I think there must be a trade-off between 
> extremely wide hardware registers and other CPU optimizations like 
> parallelism. Still, current arbitrary-precision math libraries run 
> significantly faster on 64-bit CPUs:
> 
> http://www.swox.com/gmp/32vs64.html

Excellent example!

>> Anybody remember the IBM boss? Or the Intel boss? Or Bill? They all
>> said "xxx ought to be plenty enough forever".
> 
> So you would like an address space bigger than it is physically possible
> to implement using all the material we have on Earth?!

Weren't we supposed to colonize other planets too?

But seriously, the day a machine with "too much" address space gets 
brought into the software office, the Pointy-Haired Boss will decree 
every developer his own address space.

And thereafter it gets all kinds of uses we just don't have the time to 
invent now. Not that it's what I want -- it's what's gonna happen.

>> Better computer games, anybody?
> 
> The best computer games are usually small enough to fit on a single 
> floppy disk. :)

For us here, yes!

>> Speech recognition that actually works?
> 
> AFAIK modern speech synthesis and recognition are based on linguistics
> and highly sophisticated neural networks. In fact they need less space
> than those cut-and-paste systems of a decade ago.

That actually works, was the phrase. :-)
November 23, 2005
Re: Var Types
Tom wrote:
> In article <43848FF5.10909@nospam.org>, Georg Wrede says...
> 
>> Better computer games, anybody?
>> 
>> Who wouldn't like to have a 6-month accurate weather forecast in
>> the pocket the day Boss asks when you'd like to have this year's
>> vacation?
>> 
>> We need a good computer model of the human body, so we don't have
>> to kill mice, pigs and apes whenever a new drug is developed.
>> 
>> Earthquakes, anybody?
>> 
>> Speech recognition that actually works?
>> 
>> How about a computer that the law makers can ask every time they've
>>  invented a new law? They could get an answer to how the new law
>> _really_ would impact citizens, tax revenue, and other laws!
> 
> You're out of your mind man but I like it :P

Gotta be; DMD is getting to be absolutely the most expensive compiler 
I've ever used!

> I don't like to bound short/int/long to any specific size because we
> don't know for sure what will happen in the forecoming years... maybe
> with future quantum computers 32-bit integers would end to be a
> ridiculous small precision to use or may even just not exist anymore.
> Not making integers width platform-specific makes D a lot more
> "unscalable" and it'll be nice that D could be used in distant future
> platforms as well, without changing it's spec.

The current crop of int size definitions (damn, I just forgot where they 
were in the D documentation) is adequate, for the time being.

No code breaks if we later add a few wider integral types, so that is 
not a problem.

And anywhere it does make a difference to the programmer, he will choose 
a specific width for his integers anyhow. (Within his own imagination 
and foresight, of course.)

But anywhere it does not make a difference (like in for-loop indexes, 
etc.), he'll use whatever is fastest anyhow.

I see no problem with this.


And if you're a heavy-duty mathematician/programmer writing a new GMP 
library, then -- compared to the overall effort -- it is a tiny thing to 
write a couple of conditional typedefs right at the beginning, so your 
code works fine on several CPU widths or endiannesses.
November 23, 2005
Re: Var Types
xs0 wrote:
> 
> Well, if I understand the article
> 
> http://www.iiap.res.in/outreach/blackhole5.html

That article didn't account for the fact that because of space-time 
curvature under high gravitation, the actual volume of the black hole is 
larger than what appears when one only looks at the diameter.

> correctly, any device can only process 10^44 bits a second (where any 
> device means _any_ device, even the entire universe), so even in a 
> trillion years, you can only get about 10^63 bits processed, which is 
> about 2^210. Considering how much smaller part of the time-space we are, 
> and how the universe is not trying hard to produce information useful to 
> humans, I think it's safe to say we'll _never_ need more than 128-bit 
> addressing, at least in this universe :)

So, your figures are off by a factor of ln(2^(V/v)*c^5), where V and v 
are the real and the apparent volume, respectively, and c is the speed 
of light in vacuum.

> As for data itself, can you think of any single quantity one would want 
> to commonly represent in a computer using more than 128 bits? If not, D 
> has it all covered (btw, also note that you can't measure things with 
> infinite precision (again regardless of technology), so something like a 
> 2048-bit double for that extra precision is not a good answer, at least 
> if you're not into marketing ;)

Na, just kidding.
November 23, 2005
Re: Var Types
In article <4384DE09.7060301@nospam.org>, Georg Wrede says...
>
>
>
>xs0 wrote:
>> 
>> Well, if I understand the article
>> 
>> http://www.iiap.res.in/outreach/blackhole5.html
>
>That article didn't account for the fact that because of space-time 
>curvature under high gravitation, the actual volume of the black hole is 
>larger than what appears when one only looks at the diameter.

BUAAAAHHHAHAHAHAH!

>> correctly, any device can only process 10^44 bits a second (where any 
>> device means _any_ device, even the entire universe), so even in a 
>> trillion years, you can only get about 10^63 bits processed, which is 
>> about 2^210. Considering how much smaller part of the time-space we are, 
>> and how the universe is not trying hard to produce information useful to 
>> humans, I think it's safe to say we'll _never_ need more than 128-bit 
>> addressing, at least in this universe :)
>
>So, your figures are off by a factor of ln(2^(V/v)*c^5), where V and v 
>are the real and the apparent volume, respectively, and c is the speed 
>of light in vacuum.

BUAHAHAHAHAHAHAHHAAHHAHAHAHAHAHAHA! (I can't describe with onomatopoeia how
hard I laughed after reading this; you definitely brought JOY to my day :D !!)

>> As for data itself, can you think of any single quantity one would want 
>> to commonly represent in a computer using more than 128 bits? If not, D 
>> has it all covered (btw, also note that you can't measure things with 
>> infinite precision (again regardless of technology), so something like a 
>> 2048-bit double for that extra precision is not a good answer, at least 
>> if you're not into marketing ;)
>
>Na, just kidding.

Ok, enough, I can't bear it anymore! 
I promise not to post about this issue ever again, no matter what I really
think about the subject! I'll be happy with the current D way and I'll pray
every night for it to stay just as it is now. God bless Walter and his
magnificent, omnipotent language... Just stop with all that PHYSICS crap!! :P

PS: I can prove that God really exists, it exists because of:
http://www.iiap.res.in/outreach/blackhole5.html
:P

Tom
November 24, 2005
Re: Var Types
Tom wrote:
> In article <4384DE09.7060301@nospam.org>, Georg Wrede says...
> 
>>Na, just kidding.
> 
> Ok, enough, I can't bear it anymore! 

Thanks!

Your comments made me laugh so hard I had tears in my eyes!

georg
November 24, 2005
Re: Var Types
Lionello Lunesu wrote:
> 
> But there's one reason this doesn't quite add up. Most of the time I don't 
> care what 'int' I use. I use 'int' for any number. I use it in for-loops, in 
> simple structs, in interfaces. Most of the time the size of the int doesn't 
> matter and I just want a number. Only when serializing to disk or network I 
> want fixed size types.
> 
> It's nice that D fixes the size of an int, but it's not really helping me. 
> On another platform the registers might not have 32-bits but now all my 
> numbers I don't care about will generate excessive code since D forces them 
> to 32-bits.
> 
> It's "the wrong way around" because D makes me create and use an alias for 
> the cases I didn't care about. Now I have to care about those cases for 
> portability.
> 
> L. 
> 
> 

So use "auto". It won't work for all situations (like your "simple struct" 
example), but I think it'll cover the rest.

-- 
Carlos Santander Bernal
November 24, 2005
Re: Var Types
Georg Wrede wrote:
> 
>>> Some people said the same thing about 32-bit machines before those
>>> were developed, and now we have 64-bit CPUs. Plus, I'm sure that
>>> 128/256-bit CPUs already exist nowadays, maybe not for home PCs,
>>> but who says D only has to run on home computers? For example, the
>>> PlayStation 2 platform is built upon a 128-bit CPU!
> 
> 
> Anybody remember the IBM boss? Or the Intel boss? Or Bill? They all said 
> "xxx ought to be plenty enough forever".

With respect, Bill was an idiot, and it was obvious at the time. The 
infamous "640K ought to be enough for anybody" was said at a time when 
mainframes already had far more RAM than that, and it was still 
increasing. From memory, some of the machines with ferrite-core memories 
had 16K of RAM.

>> The point is, that we're approaching Star Trek territory. The
>> Enterprise computer probably only has a 128 bit address bus.
>> But almost everything can use increased parallellism, hence 128 bit
>> SIMD instructions. But they're still only using 32 bit integers.
>>
>> I think that even the 60 year timeframe for 128 bit address buses is
>> a bit optimistic. (But I think we will see 1024-core processors long
>>  before then. Maybe even by 2015. And we'll see 64K core systems).
> 
> 
> Depends.

> It is also conceivable (somebody here know the fact?) that most of those 
> 64 bit modern PCs actually use a 32 bit data bus.

Actually many of the 32 bit Pentiums use a 64 bit or 128 bit data bus!

> So, historically, the data bus, the address buss, and the accumulator 
> (where integer math is done, and the width of which is often taken to be 
> the "width of the cpu") have usually not all had the same width -- 
> although folks seem to believe so.

All this is true. There's no intrinsic restriction on the width of 
registers. But there are just not many uses for really large 
fixed-precision numbers. (For arbitrary precision, there are; and the
main use of wide arithmetic is to speed up the arbitrary precision case).

> 
> What we however do need, is a virtual address space that is large enough 
> to accommodate the most demanding applications and data. This makes 
> writing software (and especially operating systems and compilers) a lot 
> easier, because we then don't have to start constructing kludges for the 
> situations where we bang into the end of the memory range. (This is a 
> separate issue from running out of memory.)

> ---
> 
> The one thing that guarantees us a 256 bit cpu on everybody's table top 
> eventually, has nothing to do with computers per se. It's to do with the 
> structure of the Western society, and also with our inborn genes.
> 
> (What???) First, the society thing: the world (within the foreseeable 
> future) is based on capitalism. (I'm no commie, so it's fine with me.) 
> This in itself makes vendors compete. And that's good, otherwise we'd 
> still all be driving T-Fords.

<cynic> And the US would still be using imperial measurement units.</cynic>

> But this leads to bigger, faster, fancier, cooler, ... ad absurdum. Laws 
> of physics, bah. In the eighties, it was common knowledge that we 
> wouldn't have hard disks by the time a hard disk goes beyond gigabyte 
> size. It  was supposed to be physically impossible. It would have to be 
> some kind of solid state tech instead. And now I read about Nokia phones 
> having internal hard disks with multi gig capacity.

Those arguments were not based on physics. They were based on 
assumptions about manufacturing technology.

The technological changes required for a 128-bit address bus are so 
huge that the change in size_t would be irrelevant by comparison.

To have RAM that actually needs 128-bit addressing, you would need to 
store more than one bit per atom. That would mean quantum computers were 
already well developed.

Here's how I see it:

High probability:
Multi-CPU systems with huge number of cores.
Cures for most cancers.

Medium probability:
Quantum computers.
Cures for all cancers.
Colonisation of other planets.

Low probability:
128 bit physical address buses.

:-)

Sure, maybe we'll reach the end of that list. But the ones in the middle 
will have more impact (even on programmers!) than the last one.

This is not something D needs to worry about. But thousand-core CPUs? 
Definitely.