# Wednesday, April 26, 2006

Did you know...

...that there are (more than) 21 algorithms to compute the factorial of a number?
I discovered this fact today while implementing Fisher's exact test for a 2x2 contingency table.
(For those interested: Fisher's exact test for a 2x2 contingency table, like the more famous Chi-square test, is used for the analysis of categorical frequency data: its aim is to determine whether the two categorical variables are associated.
We use it to extract the most significant genes from a gene set experiment: the two categories are "clue present/clue absent", while the two data sets are the control set and the experiment set.)
Fisher's exact test requires the computation of 9 factorials for each iteration, one of which is on the total. Typically, the applets and code you can find on the internet can handle data up to a total of about 100-200 before giving up due to overflow problems. Even code that uses arbitrary-precision arithmetic libraries can only handle numbers up to 6000-12000 before becoming unusable.
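For each table the test evaluates, the point probability under the hypergeometric distribution is exactly that ratio of 9 factorials (4 marginals and the total over the 4 cells). Here is a minimal Python sketch (the function names are mine; Python's built-in arbitrary-precision integers conveniently dodge the overflow problem, though not the speed one):

```python
from fractions import Fraction
from math import factorial

def hypergeom_p(a, b, c, d):
    """Point probability of the 2x2 table [[a, b], [c, d]]:
    a ratio of 9 factorials (4 marginals and the total / 4 cells)."""
    n = a + b + c + d
    num = (factorial(a + b) * factorial(c + d) *
           factorial(a + c) * factorial(b + d))
    den = (factorial(n) * factorial(a) * factorial(b) *
           factorial(c) * factorial(d))
    return Fraction(num, den)

def fisher_one_sided(a, b, c, d):
    """One-sided p-value: this table plus every more extreme table
    (larger a) with the same row and column margins."""
    p = Fraction(0)
    while b >= 0 and c >= 0:
        p += hypergeom_p(a, b, c, d)
        a, b, c, d = a + 1, b - 1, c - 1, d + 1
    return p
```

For Fisher's classic tea-tasting table [[3, 1], [1, 3]] this yields 17/70, about 0.243.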

Here, we are facing two problems:
  • numbers (integers and doubles) provided by the machine (physical CPU or VM) have finite precision and range
  • the simple and dumb algorithm (the one that performs all the multiplications, one by one) is very SLOW
The solution is to adopt a good library that handles multiplication of very large numbers efficiently (using Karatsuba or FFT-based multiplication), and to forget about the naive algorithm. At this page you can find implementations and benchmarks for the 21 different algorithms I mentioned. The most efficient ones use prime numbers (or, better, primorials, i.e. the product of the first n primes), but even the simple and elegant Split Recursive algorithm performs a lot better than the naive one!
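To give the flavor of the Split Recursive idea (this is a simplified sketch of binary splitting, not the exact algorithm from that page): instead of multiplying 1 * 2 * ... * n left to right, split the range in half recursively. That way the expensive multiplications combine operands of similar size, which is exactly where Karatsuba and FFT methods pay off.

```python
def prod_range(lo, hi):
    """Product of the integers in [lo, hi], by binary splitting."""
    if lo > hi:
        return 1
    if lo == hi:
        return lo
    mid = (lo + hi) // 2
    # Balanced subproducts: both operands of the final multiply
    # have roughly the same number of digits.
    return prod_range(lo, mid) * prod_range(mid + 1, hi)

def factorial_split(n):
    return prod_range(2, n) if n > 1 else 1
```

Even in plain Python this beats the left-to-right loop for large n, because CPython itself switches to Karatsuba multiplication once the integers get big.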

Notable news

Some funny / useful things I found browsing last week:
  • From Brad Adams, .NET Framework 2.0 Poster
    It is the successor to the version 1.1 poster I found shipped with Visual Studio 2003; unfortunately, no such poster was in my Visual Studio 2005 box. Great!
  • selk'bag
    The SELK’BAG is a sleeping bag you wear. The entire bag fits like a glove. You can walk in it and sleep in a sitting position... I think it would be great to have one of these at my workplace in summer: the ice-conditioning system makes our offices freezing cold!
  • TiwyFeeds
     "Take It With You" Feed Reader: the idea is great, the realization is poor...

    But I'd definitely love a feed reader à la Gmail, where you could store all your feeds and news for an unlimited amount of time, behind a searchable interface!
  • Krugle
    A search engine for developers. Personally, I use Google + MSN Desktop Search (for code on my machine), but a tool tailored to developers is surely great! I'd like to have Visual Studio's search and navigation facilities in a desktop search application...


# Wednesday, April 19, 2006

User interfaces

There is a lot of hype around new user interfaces lately: the new Vista glass look, how it compares to the ever-improving Mac OS X interface, and the Linux response (Xgl).
These are all nice, eye-candy evolutions of existing interfaces, though. As I discussed in a previous post, and as Jeff Atwood frequently reports in his articles, there is a lot of room for UI improvement. Embedded and ubiquitous search is a key improvement, in my opinion. But surely future interfaces will bring more interesting features, even without a radical paradigm shift. In the latest "In Our Time" column of IEEE Computer (March 2006), David Alan Grier talks about pervasive computing. The definition of pervasive computing dates back to 1991 and is due to Mark Weiser. Mark wrote that "the most profound technologies are those that disappear": they "weave themselves into the fabric of everyday life".
Mark was an engineer working at (guess where?) Xerox PARC. Exactly: the same lab where GUIs as we know them today were born.
He was aware of the importance of such ideas, but he also felt that they were incompatible with the idea of a pervasive computing environment. A windowing operating system, no matter how sophisticated, only makes "the computer screen a demanding focus of attention, it does not fade into the background".
I agree with him: where I work, I often see people who are so focused on the screen that they somehow forget their primary work. They are so focused on how, they forget what.

But is it really a defect of the window paradigm?
I don't think so. Especially after seeing "The Island". Take a look at this desktop environment:

Do you see the computer? The operating system?
Maybe a closer look...

The desktop... is the desk. It is amazing to see it in action! Using some sort of mouse, the principal (the character with glasses) "throws" a window at Lincoln (the guy in white). Then Lincoln begins to draw, as if on a sheet of paper on the desk. And when he finishes, he passes the window with the drawing back to the principal.
And there is no shift of paradigm. There are windows, and folders, and documents on a desk. A mouse to grab and move them. But the desktop IS the desk!

This is pervasive computing, for me. And I really want one of those desks... :)
# Wednesday, April 05, 2006

I am Windows 2000

This is ironic...

You are Windows 2000 SP3.  You're a steady and reliable friend.  People think you're all business, but with your recent therapy you've become a little more playful.
Which OS are You?

But I like Win2000 very much!