# Tuesday, 03 January 2006

Happy New Year!

2006 has finally arrived!
Whew, it was a rather tiring December. I had a lot of "collateral" work: new servers, a web infrastructure to build, porting a web application to MySQL (and Castor still refuses to collaborate!). I spent a very good Christmas with my family, did some good reading (my heap of to-read papers/articles/books shrank a little) and went to Salzburg with my girlfriend.
On the informatics side, I picked my 2006 LOTY: Lisp! I will dedicate a post to my decision (Why Lisp?), but for the moment let me say that Practical Common Lisp is a very good book! I look forward to learning to use macros: in the second chapter Peter Seibel gives you a whole bunch of database bindings (with very easy to use select, update and delete functions) in only 50 lines of code! Amazing... It works only on property lists, but it resembles LINQ a lot.
I am really thrilled to see whether Lisp turns out to be the language for building languages it claims to be. I have always thought that the most annoying thing a programming language can do is NOT expose itself, i.e. not make the constructs of the language itself available to the programmer.
Take Java: I find it highly annoying that the language has cast operators for classes and primitive types, and operator overloading for classes in the Java class library (like + on String), yet does not let the programmer use them! Until now, C++ has been my language of choice for this very reason: it lets you "adapt" the language, building a meta-language that fits your needs and your application domain.
Maybe Lisp macros will get a hold on me.. =)

# Thursday, 22 December 2005

LOTY: 2005

My 2005 experience with languages was less interesting. I finished university and started to work at a research center on Bioinformatics, trying to decide what I should do next (work? PhD? try to make a startup?).

I really liked bioinformatics as it was introduced to us back at university: it was all about concurrency, synchronization and programming languages (as I learned later, this branch is called Systems Biology).
Well, in the real world (TM) the language of choice of bioinformaticians is.. Perl.
And they (too often) use it NOT the way it was meant to be used: as the 'duct tape' of programming languages, to glue together different pieces written by different people in different environments. For that task, Perl works like a charm. And it is very easy too... but Larry Wall's claim (Perl makes easy things easy, and difficult things not impossible) stops at the first half. The second half is not wrong, but having seen some Perl hacks, it would be better stated as "and difficult things very messy".

And biologists use it because it is messy. Declare variables? Why? Not use goto? It is one of the instructions; why can't I use it?

All in all, I am glad I learned Perl. It has some very interesting stuff (hash tables, file manipulation, a neat database interface that many should learn from) and, above all, it gave us Perl regular expressions. It is unmatched in the field of text-file manipulation.
But it is better to remember that you don't build a house out of duct tape..

# Wednesday, 21 December 2005

LOTY: 2004

2004 was my C# year. I think I did a nice job.. Coming from C++ was not so difficult: at first it was like using a "scripting C++", i.e. C++ as a two-level language. (Little digression: one of the things I like most about C++ is its ability to be used as a language for languages. Nobody should use C++ low-level features to build a commercial application directly; instead, she should create a very well designed and optimized library, with all the quick-and-dirty tricks of the language, and then use this newly created environment as her production language. That's the power of C++!)

But C# is much more than a simplified version of C++. It has functional seeds in it that are growing more and more with each version: first delegates, now full closures (anonymous delegates), iterators and yield, and in the future anonymous types, extension methods, expression trees and lambda functions! I am very thrilled!
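
To make that list concrete, here is a tiny self-contained C# 2.0 sketch of mine (the names IntPredicate, Where and FunctionalSeeds are made up for illustration, not taken from any library): a delegate type, an anonymous delegate that closes over a local variable, and an iterator built with yield return.

using System;
using System.Collections.Generic;

delegate bool IntPredicate(int n);

static class FunctionalSeeds
{
    // Iterator method: the sequence is produced lazily, one element per yield return.
    static IEnumerable<int> Where(IEnumerable<int> source, IntPredicate keep)
    {
        foreach (int n in source)
            if (keep(n))
                yield return n;
    }

    static void Main()
    {
        int threshold = 3; // captured by the anonymous delegate below (a closure)

        IEnumerable<int> big = Where(new int[] { 1, 2, 3, 4, 5 },
                                     delegate(int x) { return x > threshold; });

        foreach (int n in big)
            Console.WriteLine(n); // prints 4 and 5
    }
}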
   
You may wonder why it took me one year (and why I think I did a nice job) to learn C#. Well, I was productive in an evening (it is a very simple language), but I wanted to go "under the hood". I think the best starting point is still Don Box's "Essential .NET". It is a .NET book, but it describes very well the interactions between the language and the compiler. Then I downloaded Rotor and took a peek inside (as you can see here.. Boy, that was interesting!). And then I went for C# variations: Spec#, Comega... and my own contract language! =)

2004 was a nice year. I like C#, and I found it very productive (truth be told, this is mainly thanks to .NET), and the functional flavour it is taking on is good!

# Monday, 19 December 2005

Language of the Year

"Learn at least one new [programming] language every year. Different languages solve the same problems in different ways. By learning several different approaches, you can help broaden your thinking and avoid getting stuck in a rut."
   --- The Pragmatic Programmer



I embraced this philosophy some years ago, and I have found it very fruitful. Learning a new programming language is surely simpler than learning a new natural language: you can write simple programs in a night, and become productive in a week (of course, it takes a lot more to "master" it, hence the "one language per year"). It has, however, the same advantages. Knowing many natural languages lets you speak directly with different people, easing your job; the same is true for programming languages: you can "speak" directly to many different codebases, which will definitely ease your work!

Another advantage is that you can use the right tool for the right job. This was not quite true in the past: if your program was in C and you had to write a little reasoner, it was hard to write it in another language (say Lisp) and then integrate it into the main program. Often the choice did not pay off, because of performance, reliability, the difficulty of interfacing the two worlds through strange mechanisms, and sometimes even the need to write such interfaces yourself.

If .NET managed to make a really good impression on me, its ability to seamlessly integrate different programming languages without imposing a "standard language" on you was surely an important reason. I found myself re-opening my CS books on functional programming and using ML and Caml again, to my greatest pleasure.

I think my two points can be summed up by this code snippet I found in an interesting blog post:

sort           [] = []
sort (pivot:rest) = sort [y | y <- rest, y < pivot]
                    ++ [pivot] ++
                    sort [y | y <- rest, y >= pivot]


It's Haskell. If you don't know Haskell, or a similar programming language, you may have the wrong reaction: throw this code in the trash, and write a 20-line version of the algorithm in your "standard language". Or, if you have learned a different programming language once in your life, you can appreciate its beauty and simplicity.
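
For contrast, here is a rough C# 2.0 rendering of the same pivot-based algorithm (my own sketch, with the made-up name QuickSortSketch, not code from that blog post):

using System.Collections.Generic;

static class QuickSortSketch
{
    // Same structure as the Haskell snippet: split around the first element
    // as pivot, sort the two parts recursively, then concatenate the results.
    static List<int> Sort(List<int> xs)
    {
        if (xs.Count == 0)
            return new List<int>();

        int pivot = xs[0];
        List<int> smaller = new List<int>();
        List<int> larger = new List<int>();

        for (int i = 1; i < xs.Count; ++i)
        {
            if (xs[i] < pivot)
                smaller.Add(xs[i]);
            else
                larger.Add(xs[i]);
        }

        List<int> sorted = Sort(smaller);
        sorted.Add(pivot);
        sorted.AddRange(Sort(larger));
        return sorted;
    }
}

It does exactly the same thing, in roughly the twenty lines I was talking about.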

Of course, Haskell is terrible at other things; but you can compile the code into an assembly and reuse it from another language. The right tool for the right job.

Last but not least: as the opening quotation says, learning a new language is food for your mind.

# Wednesday, 14 December 2005

Meaningful programming T-shirts

From Jeff Prosise

Comment my code?

Why do you think they call it code?

Perfectly true. The same point is made in a book I'm reading (Hackers and Painters, by Paul Graham - an interesting book, with some very good points and some very weak spots.. more on this in a later post, when I finish it).

The shirt is obviously meant to be ironic.. but there is truth in the statement. A programming language is the perfect way to express algorithms; if you have to go and comment it.. you are not writing good code! Comments should be placed only in the right and meaningful spots (bleargh to automatic tools - what were they thinking when they promoted the use of such tools?)

On a similar note, another T-shirt (or cup) I always wanted to have is

It compiles!

Let's ship it!

(It's also the subtitle of my blog.) The message is funny, but it shouldn't be: I hate testing, and my dream for the future is a truly automatic and clever tool that runs when you build your program. Love static analysis, and remember: an error the compiler can catch is one less bug in your product!

# Tuesday, 13 December 2005

On the way of OS history

In my last post I wrote that Windows/386 (on a Compaq Prolinea 486) was my first OS. Next came Windows 3.1 (and 3.11), and then? If you are guessing Windows 95, you are wrong.. My next OS, in 1994 if I recall correctly, was Windows NT 3.1! A friend of mine was a system administrator at my father's office, so when he bought NT 3.5 he gave me NT 3.1! What a big leap =) I still dual-booted with DOS (for games), but I think I can say I pioneered the 32-bit era. I never left NT after that, and I always had dual-boot machines (though OSes were really expensive at that time! I remember changing machines when it was OS upgrade time, to get a cheaper OEM version): Win95 / NT 4.0, Win95 / Win2000.. and finally a single OS with XP!

My dearest OS still remains NT 3.5. I never had it on one of my own machines, but I used to work on it. It was beautiful, a "pure" OS, if you get what I mean. On the other hand, the one I admired the most is Windows 95. It was, and is, an engineering masterpiece of balance. How they managed to make it so compatible with a plethora of legacy applications, bring 32 bits to userland, and keep it fairly stable.. it's a mystery!

# Monday, 12 December 2005

I wish I was there...

...but at the original presentation I was only 5!
Surely, this amusing presentation by Charles Petzold is a bit late =) , but I found it fascinating. I hope we will not lose the memories of the early programming days, as too often happens in many other fields!
I've never used Windows 1.0; my first version was Windows/386, a re-compiled version of Windows 2.0 (which was meant to run on the 286, I believe). I still have the diskettes and manuals!

I grew up as a computer enthusiast first, and as a programmer later, in the Windows world, so it is no surprise that I try my best to defend it against pointless attacks about how buggy or annoying or insecure... or whatever Windows is!

Happy birthday Windows!
# Sunday, 11 December 2005

Sorry for the delay

Wow! It's been a long time since my last post.. Too much to do at work, plus a little holiday with a trip to Florence.. =)
Here in Italy we had a pleasant long weekend (we call it a "bridge" when a holiday falls close to the weekend), and I especially enjoyed it at my grandma's in Florence.
It snowed a little the week before; it was only about 20 cm, but I think we are no longer used to snow: traffic was a mess, and the children were so happy =) It used to snow a lot more when I was a child: when I was five we had a huge snowfall that brought us almost a meter of snow!

So, back to this blog's subject: programming! At work we had a lot to do because our servers finally arrived: four beautiful 2.6 GHz AMD Opterons that I had to put together into an HPC cluster. An HPC cluster is not about redundancy or fail-over, nor about load distribution in a web-server fashion: it is about high performance computing. So I had to learn about ways of configuring and using the cluster, and since I found the information both interesting and amusing, the next few entries will be about clusters!
Bye

# Sunday, 27 November 2005

Jingle Bells time again...

Yesterday we had a snow shower, and today is the first Sunday of Advent.. What could be more appropriate for a practical example than Jingle Bells? =)

Trono designed the Santa Claus problem as a new exercise in concurrent programming. He wanted an exercise harder to solve than the usual mutual exclusion problems, and indeed the Santa Claus problem involves three sorts of processes that need to synchronize and cooperate at certain moments. The problem may be stated as follows:

Santa repeatedly sleeps until wakened by either all of his nine reindeer,
back from their holidays, or by a group of three of his ten elves.
If awakened by the reindeer, he harnesses each of them to his sleigh,
delivers toys with them and finally unharnesses them (allowing them
to go off on holiday). If awakened by a group of elves, he shows each
of the group into his study, consults with them on toy R&D and
finally shows them each out (allowing them to go back to work).
Santa should give priority to the reindeer in the case that there is
both a group of elves and a group of reindeer waiting.

Some parts of the problem, if solved with simple primitives like monitors or semaphores, are very tricky: Trono's original implementation itself had a bug, as pointed out by Ben-Ari. Ben-Ari solved the problem in a quite simple way using Ada95 synchronization primitives. He also showed that Java monitors are not much better at this task than the simple P-V semaphores used by Trono.
Unlike Java, Ada95 does not use monitors with condition variables that must be explicitly signalled and waited upon. Instead, the language uses a construct similar to our wait preconditions: a boolean expression (called a barrier) associated with each method. If the barrier evaluates to true, the thread calling the method is allowed to execute it, provided nobody else is already 'holding' that object (access to objects is mutually exclusive).
If the barrier is false, threads calling the method are queued. When a method exits, the other threads are signalled, so they can re-evaluate their barriers and see whether they are now allowed to execute the method.

Wait preconditions act in the same manner, with two exceptions:
  1. it is possible to use temporal logic
  2. it is possible to refer to meta-variables representing methods.

Later, Nick Benton solved the problem using Polyphonic C# (now Comega). Comega is based on a paradigm derived from the join calculus, whose basic constructs are chords and async methods. In short, synchronization is message-based, and execution is driven by the ability of a chord to 'consume' messages.

For example:

static void reindeerBack() & static async reinWaiting(int r)
{
    if (r == 8)
        reindeerReady(); // last reindeer
    else
        reinWaiting(r + 1);
}


The method reindeerBack will be executed only if there is a reinWaiting 'message' ready to be consumed. It then sends the reindeerReady 'message', which will be consumed by Santa's thread: Santa will wake up and do his job with the reindeer.

The Polyphonic C# solution is nice, and the code is very compact. However, it can be quite hard to understand if you are not familiar with process algebras, or with message-based concurrency.

For our solution, we take an approach that combines the two. Synchronization contracts let us do so, since it is possible to use wait preconditions like Ada95 barriers. On the other hand, we can mimic (in a very restricted way, but one sufficient for this example) the method/message consumption of Comega.

We use the Room class from Ben-Ari's solution (we'll call it Queue; this class replaces the nway class in the Comega solution, since they solve the same problem).

class Queue
{
    private int queueSize;
    // Other state used below: queue (the group members gathered so far), waiting
    // (callers blocked in Entry), enterPossible and roomDoor (door flags), santa.

    public Queue(int size)
    {
        queueSize = size;
        enterPossible = true; // the queue starts open
    }

    public void Accept() // Open_Door
    {
        queue.clear();
        enterPossible = true;
    }

    public void Entry(Object o)
    requires_wait(enterPossible == true)
    {
        queue.add(o);
        waiting = waiting + 1;
        if (waiting < queueSize)
        {
            waitForLastMember();
        }
        else
        {
            waiting = waiting - 1;
            enterPossible = false;
            roomDoor = true;
        }
    }

    private void waitForLastMember()
    requires_wait(queue.size() == queueSize)
    {
        waiting = waiting - 1;
        if (waiting == 0)
        {
            // I'm the last one
            roomDoor = false;
            santa.Wake(this.queue);
        }
    }

    public bool Full()
    {
        return queue.size() == queueSize;
    }

    public void WaitFull()
    requires_wait(queue.size() == queueSize);
}

Then, to wake Santa, we do not use a special keyword as Ada95 does (we do not have one, at this point), but something similar to Comega:

class Santa
{
    // ...

    void WakedByReindeer()
    {
        harness.WaitFull();
        DeliverToys();
        unharness.WaitFull();
        ReindeerQueue.Accept();
    }

    void Wake(Object group) { }

    void Sleep()
    requires_wait(pre(@Wake))
    {
        if (ReindeerQueue.Full())
        {
            // Santa wakes up and goes to deliver (once all reindeer are harnessed..)
            WakedByReindeer();
        }
        else
            WakedByElves();
    }
}

With the current implementation we do not allow declaring the same method twice with the same signature but different wait preconditions; otherwise the Comega solution:

static void waittobewoken() & static async reindeerready() { }
static void waittobewoken() & static async elvesready() { }

could be mapped to:

void Sleep()
requires_wait(pre(@ReindeerWake));
void Sleep()
requires_wait(pre(@ElvesWake));


Surely, the Comega solution is more elegant on this point, but our solution is not so bad, and we did not need to introduce a keyword like Ada95's select.

The key point is that synchronization contracts are flexible; they allow the programmer to use the style best tailored to the task, or best tailored to her own experience.

References:

N. Benton. Jingle Bells: Solving the Santa Claus Problem in Polyphonic C#. Microsoft Research, 2003.

M. Ben-Ari. How to Solve the Santa Claus Problem. Concurrency: Practice & Experience, 1998.

# Saturday, 26 November 2005

Synchronization contracts: an example of syntax

I'd like to introduce very briefly the syntax of synchronization contracts.
As we have seen in the previous two posts, we use them to express
  • assertions on temporal behaviour
  • synchronization
In the first group we have three keywords, borrowed from DbC (those familiar with the concept will feel at home here): requires, ensures and invariant.

In the second group, we have the variants of preconditions and postconditions we introduced last time: requires_wait and ensures_wait. In the base form of a synchronization contract language, these keywords are all we need. Well, plus the LTL keywords, of course.
Their meaning is pretty obvious: for a complete introduction, see chapter 5 of my thesis. Here I will introduce only some of them:
  • pre(cond): cond was true at the previous step
  • once(cond): cond was true at least once in the past
  • cond1 since cond2: cond2 became true at some point, and cond1 has held ever since
  • cond1 until cond2: cond1 must hold until cond2 is true
These operators will be useful to express conditions on the flow of the program: the first three are PLTL operators, the last one is a classical LTL operator.

class Stream
{
    bool ready;

    invariant(@receive -> (!@send until @endReceive));
    invariant(@receive -> finally(@endReceive));

    void init();

    void receive()
    requires(ready && once(@init));

    void endReceive();
    void send();
    void endSend();
}

Pretty simple, huh? The @method construct expresses the concept "this method has been executed", and is a handy shortcut that avoids introducing a ton of state variables. Here synchronization contracts are used to specify behaviour, and to check it. But let's see another example:

public class HistoryBuffer
{
    invariant(current == max -> (!@Put until @Get));

    public virtual void Put(object o)
    requires_wait(current < max);

    public virtual object Get()
    requires_wait(current > 0)
    {
        --current;
        object ret = buf[current];
        return ret;
    }

    public virtual object GGet()
    requires_wait(pre(@Put))
    {
        return Get();
    }
}

Now we have checks (the invariant) and also synchronization (the guards on Put, Get and GGet). The GGet method is required to execute only if the previous operation was a Put (i.e. it does not permit two consecutive Get operations). See how simple it is to express such a concept, much simpler than in a standard programming language.
Next time we'll see an interesting synchronization problem and how it can be solved with synchronization contracts.