Archive: January 16, 2008


as the memory turns

Wednesday,  01/16/08  06:21 PM

<rant optional=yes>

You all know my status as a dinosaur; I can remember when all we had were zeros, and how great it was when we first got ones.  (There are 10 kinds of people in the world, those who understand binary, and those who don’t.)

So in the bad old days of 16-bit computing, the biggest programming problem was the size of your address space.   With only 64K to work with, and typically more physical memory than logical address space, you had to page stuff in and out in order to deal with it.  In those days every malloc was surrounded by an if(), because memory allocations could and did fail.
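That old every-malloc-gets-an-if() discipline looked something like this minimal sketch (the helper name `checked_alloc` and the fallback behavior are my own illustration, not any particular 16-bit runtime):

```cpp
#include <cstdio>
#include <cstdlib>

// Old-school checked allocation: assume the allocation CAN fail,
// and give the caller a chance to react (page something out, shed a
// cache entry, or fail the operation gracefully).
void *checked_alloc(size_t bytes) {
    void *p = malloc(bytes);
    if (p == NULL) {
        // In the 64K days this branch was exercised routinely.
        fprintf(stderr, "allocation of %zu bytes failed\n", bytes);
    }
    return p;
}
```

Every call site then wrapped the result in an if(), e.g. `char *buf = (char *)checked_alloc(bufsize); if (buf == NULL) { /* recover */ }` - tedious, but it meant running out of memory was an expected event rather than a crash.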

In the good new days of 32-bit computing, the biggest programming problem is the size of physical memory, and avoiding paging.  With 2GB to work with, and typically more logical address space than physical memory, you can allocate virtual storage with impunity.   In these days every new goes unchecked, because memory allocations don’t usually fail.

Well, we're entering some newer new days now, with machines that have more physical memory than logical address space again.   It is quite common to have 4GB on a machine, and yet the address space is “only” 2GB.   (Windows lamely reserves the addresses with the high-order bit set for the kernel, so you don’t get all 2^32.)   Which means once again you have to page stuff in and out in order to deal with it, and once again you have to check whether a virtual storage allocation has failed.

I suppose soon we'll all be running 64-bit operating systems and applications, and so this is a temporary situation; once we have a 2^64 address space we'll once again be worried about physical memory size and paging, and not about allocating virtual storage.

But for now, this is a problem.

You may know, a little while ago I made the world’s largest TIFF file, containing nearly 3TB of information.   I discovered, 5 hours into an 8-hour compression run, that I had run out of virtual storage.   I was running on a machine with 4GB of RAM, but my address space was “only” 2GB.   And so yes, after a while – a long while – I allocated so much stuff that I hit the address space limit of my local heap, and news began to fail.  And of course my code didn't expect news to fail, so it died a horrible death.  I had to rearchitect the cache I was using to check for virtual storage availability in addition to physical storage availability.  A lot of work for an artificial limit.

So, what do we do?

We can surround every new with an if(), and attempt to gracefully handle memory allocation failures.   That is too hard and too ugly to be right.  Anyway, what do you do if one fails?   Most of the memory allocations are little pissant buffers and arrays; it is only the accumulation of literally millions of them that results in an overall failure.  We can catch the exceptions thrown by the C++ runtime when a new fails - that is better than checking every new - but it still leaves the problem of what to do when you catch a failure.  We can move buffers into shared memory segments - kind of complex - or we can wait for 64-bit computing to be ubiquitous.
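The catch-the-exception option at least centralizes the failure handling.  Here is a minimal sketch of the idea (the function `fill_cache` and the back-out policy are hypothetical, just to show the shape): let new throw std::bad_alloc, and handle it in one place instead of at millions of call sites.

```cpp
#include <new>
#include <vector>
#include <cstddef>

// Allocate 'count' buffers of 'bytes' each, as a cache might.
// Instead of an if() around every allocation, one catch handles the
// failure: release everything allocated so far and tell the caller.
bool fill_cache(std::vector<char *> &chunks, size_t count, size_t bytes) {
    try {
        for (size_t i = 0; i < count; i++)
            chunks.push_back(new char[bytes]);  // may throw std::bad_alloc
        return true;
    } catch (const std::bad_alloc &) {
        // The accumulation finally exhausted the address space;
        // back out and report failure so the caller can shed load.
        for (char *p : chunks) delete[] p;
        chunks.clear();
        return false;
    }
}
```

Of course this only relocates the hard question - the caller still has to decide what "shed load" means - which is exactly the problem noted above.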

I do think that 64-bit will be the final frontier; it is unimaginable that 2^64 wouldn't be a big enough address space for everything.  Remind me I said that :)

</rant>

 

Wednesday,  01/16/08  08:33 PM

I'm back!  (on my new laptop drive)  Considering the potential for disaster, it was relatively painless.  A mere matter of reinstalling Windows and reloading the entire 150GB of data which constitutes my online / business life.  {Let me just put in a plug for Acronis TrueImage which is a great backup/restore tool.}  Have I ever told you how much I hate file permissions?  Yeah, well I do.

Okay, with that out of the way, let's see what's happening...

The blogosphere is having a field day digesting yesterday's Apple announcements, of course.  Engadget has a ton of product details and hands-on reviews.  You might find this interview with Walt Mossberg interesting - he makes some sensible points (e.g. the significance of multitouch gestures on a trackpad).  Journalists like Walt have now become celebrities to the point where they are the subject of interviews!  BusinessWeek has an interesting series on Apple's New Friends and Foes.  Of course as a media distribution company (yes Virginia, that's what they are) they have relationships with a lot of content providers like music labels and movie studios, and with a lot of distribution points like cellular carriers.

My own morning-after reaction: I still think iTunes Movie Rentals and the Apple TV are brilliant; they are going to do for movies what the iTunes store and iPods did for music.  The MacBook Air puzzles me, however; why doesn't it "stay on the air" all the time, with Internet access via EDGE or EVDO?  The iPhone does...  This must be coming, right?  John Gruber and Paul Boutin wonder the same...  Reinforcing this for me, I actually watched part of the Jobsnote in my car while driving from San Diego to Los Angeles (don't ask) using my laptop's EVDO card....

And I'm wondering as I'm sure is Steve Jobs: how long before someone hacks Apple's DRM to allow "rented" movies to be owned?

Uncov skewers rewriting in Ajax: Just because you can, doesn't mean you should.  "The proliferation of stupid is the cancer that is killing the internet.  In the quest to re-implement every conceivable desktop application in Ajax, you mental midgets are setting computing back 10 years.  The worst part about it is, you think that you're innovating."  Indeed. 

Kind of reminds me of Russell Beattie's WTF 2.0?  "I really do think there should be a litmus test for new web apps launched from now on - something very basic and if they don't pass, they don't qualify for any buzz or linkage.  It's a simple test: Will they take my credit card?"

Random note: Have you ever noticed that Chick Hearn appears on Pink Floyd's The Wall?  Yep, right near the end of Don't Leave Me Now, seems like maybe a Lakers / Bulls Game... 

I saw this in The Scientist; seems more and more scientists are "going digital".
