# Sunday, 26 May 2013

Back to C#

Three years ago, give or take, I was asked to help in assessing the technical choices for an interesting project.
The project aimed to build a platform (intranet, web) to help a company deal with all its quality-related processes and procedures; in particular, it should help in preparing and maintaining the high quality levels required by international standards (ISO 9001 and the like).

A very good friend of mine was asked to act as a program manager/product owner, and work as a proxy between the initial set of companies interested in the product (the customers) and the ISV that was in charge of developing the software.
My friend designed and detailed the specification for every aspect needed in the software, which was composed of three main modules: storage, review and approval processes for corporate documents and handbooks; collection, review and resolution of quality-related issues and process enhancement proposals; tracking and monitoring of supplies and suppliers.
All these processes fit the workflow pattern nicely: clear steps, well-defined responsibilities, and long spans between steps. In short, reactive programs that need to be durable, recorded (for auditing) and persisted.

That said, we decided (in agreement with the ISV) to use Windows Workflow Foundation as the basic engine: it seemed a very good fit for the problem at hand. In particular, the ISV decided to build on SharePoint 2010 (which was in Beta at that time). Personally, I think it was a good choice: I already had experience with WF, and SP seemed to offer all the missing bits and a lot of bells and whistles.

However, the ISV had no prior knowledge of SP 2010, and that showed in how they implemented the solution; it worked, but it took time to implement and was not nearly as flexible as expected. When faced with something more complex, something that required some effort to customize, they fell back to writing custom code. A lot of code was written, and a little too much was hard-wired: for example, modifying a workflow (adding an activity like "send an email", or swapping two steps) should have been easily done through the SharePoint Designer, but ended up being doable only in code, because most of the activities, and the code to compose them, were custom. The same goes for the web forms, for the workflow data (stored as XML in custom SQL Server tables instead of SP lists), and for Excel reports (which are provided out of the box... but only for SP lists).

Nonetheless, the software did its job, and did it quite well; the design phase was done carefully, and it met the expectations. More companies wanted to use it, and of course they wanted customizations and improvements.
I know all these details because I was asked to take over from where the ISV left off, and to help with bug fixing, modifications and the deployment of new installations. It was hard, more than it should have been; some things that should have been done directly by the customers were possible only through code.
And deploying SharePoint solutions on top of an existing, running instance is a real, royal pain.
In short, we arrived at a point where some of the requests were simply impossible to fulfill. Some of the shortcomings are due to SP 2010 itself (finer-grained permissions and access control, a lighter footprint and lower resource usage, a more customizable user interface, easy and flexible deployment options) and we could not address them easily. Two features especially were a no-go for further development: deployment in a shared hosting environment (cloud?) and, linked to it, the ability to use a storage engine that was NOT SQL Server.

Earlier this year, my friend and I met, and we decided to try to build a solution independent from SP, one that retained the good bits (workflows, lists, reports) and improved upon them (finer-grained permissions and access control, workflows that are easier to author, persistence over different storage engines, a more modern UI).
We also decided to try to do it in what we thought was the right way, more for "fun" than for "profit": that meant no heavy time pressure, the chance to learn new stuff (ASP.NET MVC, WebSockets, ...) and to re-build the parts of the stack we saw as necessary (I have a daily job for a living in a software company, so this project was just for fun and to keep my abilities sharp).
I was happy to have an opportunity to go back to C#, to keep my training on .NET going, and to stay up-to-date on the latest .NET technologies: it has been almost two years now since I used C# professionally, and I was missing it! So, I said yes.
I also decided to keep a journal here, explaining what I have done and what I will do, what is working and what I will learn.

Next time: building a SharePoint "clone"
# Monday, 20 May 2013

I'm back!

After an interval of something like 7 years (yikes!)... I decided it was time to go back and blog again!

The first step was to refresh my old blog: the domain was still there, registered and paid for. And unbelievably, after upgrading dasBlog with the right tools, the old posts were there too!

Disclaimer: take them for what they are; old posts about my past coding experiences! :)


They were a fun read; some of them look funny (wow, my writing style really changed - I hope for the best - in the last few years!), but I decided to leave them there anyway: someone may still find them useful.
However, I am definitely not going to blog about the same things.

In fact, the main reason I wanted to resume blogging is because I am going to start a new project, and I want to keep a record, a journal of how it will evolve. Even if it is only for myself. Hence, the title: coding for fun. I will talk mainly about coding, and the fun that it (still) gives to me.

# Saturday, 08 July 2006

One thing I really don't like...

...is when search engines (like Google, but also MSN) "automatically" sense which language I want to use. I live in Italy, therefore Google assumes I want results in Italian, or from Italian web pages. Well, that's NOT true. I want results and pages in English (just as I want to type on an English keyboard and I want my OS and programs to be in English). Especially if I type google.com instead of google.it. Also, when I search for development-related answers, as a result of this behaviour I get really poor search results.

Speaking of translations: one of the worst things I have ever seen? Compiler errors (C++) translated into Italian. Even worse, VBA keywords used to be translated into Italian in some old release of Office. THAT was really crazy! :)

# Thursday, 06 July 2006

Windows Vista and Office 2007 Beta 2 (2)

Overall, I liked Office 2007 very much. The only thing I could not test was weblog posting, since dasBlog is not currently supported in the Beta version (it will be in the final version). I surely hope it will work well!

Instead, I have mixed feelings about Windows Vista. I followed Longhorn closely from the first PDC (2003?) and I was really looking forward to seeing it. Remember the "native kernel + managed subsystems" part? The three pillars, Indigo, Avalon and WinFS? Well, Indigo and Avalon are great, but they will be part of .NET 3.0, also available for XP, and WinFS is dead.

So, what’s the point of Windows Vista? With the three pillars gone, what remains is the new UI, the improvements to the kernel and window manager, and the improved security model.

Running as a simple User and having to use tedious runas commands for very common tasks on my notebook (such as changing IP, power profile or directory permissions), I thought the new LUA model of Vista would be great for me. The default user is still marked as “Administrator”, but I think (hope?) it is a simple User account in disguise, and when performing security-related operations (i.e. clicking on buttons with the little shield) the security token is upgraded and substituted with one from the Administrator group, if the user grants permission.
This is my first complaint: why did they do that? Be clear: use a default account from the Users group, and simply ask for an Administrator password before running administrative programs, or on their first security-related operation; then make the admin take ownership of the whole program. Surely, it is safest to ask every time whether a program can do this or that… or not? People get bored very easily, and do not always read what is written on dialog boxes. Normal users almost never do; they only try to get rid of that annoying (or, for some, scary) box that prevents them from “using my computer”. Despite this, the new LUA is still better than the previous situation.

The new window manager and UI, instead, are great. And I’m not only speaking about all the eye candy (transparencies, the new Flip 3D I already love, the shiny new icons, the Sidebar, etc.) but also about usability. I love the new Explorer UI and the new Open and Save dialog boxes. Finally we went a step further, stripping away the file menu where it is no longer useful (like in Office 2007, where it “evolved” into the ribbon) or necessary (like in Explorer and Internet Explorer, where it is… no more!). The clickable locations on the address bar, the search box in the toolbar (yes! No more stupid search dogs…) and the new copy/move progress dialogs are things I have been waiting for, and they are really great. And the Sidebar is both useful and beautiful to see (only one complaint: why is it not easy to hide and show it? Maybe with an F-something key, à la Exposé?).
On the negative side, I have found the new UI very poorly configurable and customizable: if you choose Aero, you can’t even change colors. Very little can be done, but maybe this is the price to pay for a mainstream and “standardized” OS.

Finally, I know this is only a Beta, but I had a LOT of problems installing programs: Cygwin does not work, it is impossible to register file extensions for some programs (7-Zip comes to mind), other programs crash without reason. Even SQL Server 2005 needs a patch to work correctly! There is still much work to be done, and a lot of time has passed. Maybe Mini is right, and the Windows team needs to change direction.
The course events took really disappoints me. Vista is great, but not so great, and not so radically different, as to justify for me the switch from XP (a 5-year-old OS!). I love .NET, and the managed and secure world, and I’m with Gary McGraw when he says that Vista is a missed opportunity for a new, modern and secure OS; the Longhorn of three years ago still looks better to me than today’s Vista. I’ll have to wait for Singularity… :)

# Tuesday, 04 July 2006

Windows Vista and Office 2007 Beta 2

I could not resist, and installed both :) I am very pleased with Office 2007. I think it is the best release of Office I have ever used! The ribbon is great, although some commands are “misplaced”; but that is just my opinion, and it is based on the way I use Office (Word, in this case). Everybody uses Word or Excel in their own way, so the Office team has done a great job examining usage patterns, correlations between command usage and position, and so on. Just read Jensen Harris’s blog; as an example, I found his post on “command lingering” particularly amusing.

I have used Word, Excel and PowerPoint for about a month now, and I have to say I really like the ribbon, the mini-toolbar, even the Office “Start” button I initially feared so much. And it looks great on both XP and Vista! I have found myself using and discovering more and more Office commands that, I am sure, were there even before Office 2007 but were essentially “ghost commands”, buried in menus and not readily accessible. A really great piece of software, even in the Beta version.

I'm here again!

Sorry, I have little excuse for not posting for such a long time (more than one month!). The only justification is that I have been abroad: I spent the last month in Salt Lake City, UT, working on the project I am currently employed on at IASMA.

In the last year, starting in July 2005 when I began working here on the Grape Genome Project, we prepared all the programs, databases and informatics infrastructure to house and analyze data from the grape genome. This included databases for handling huge data volumes, applications for mass analysis of data (gene predictions, automatic annotation of as many genes as possible using different sources, etc.) and a first infrastructure to query and retrieve data on the genome.

This part involved a lot of scripting, being more a matter of handling data and gluing existing programs together, and this was the reason I had to learn Perl. But there was also room for building interesting programs that make good use of structured data, such as the Gene Ontology. Ontologies have a lot of limitations (I will probably dedicate a post to this subject), but they are a great leap forward from the current state of biological data handling. The structure of the Gene Ontology allows us to do some very interesting things, like inferring functions in a more precise manner, taking into consideration more than one source, the “informativeness” of the assigned term, and the degree of confidence we have; it is possible, for example, to see whether a term is supported consistently among different sources. We also used the Gene Ontology to create a new and flexible tool for analyzing the expression of different classes of biological processes from microarray experiments, and a new data-mining tool that queries genes not with textual searches (as most public databases do) or sequence similarity searches, but with “semantic” queries, i.e. queries based on the “type” of the gene.
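To give a flavor of what a "semantic" query looks like, here is a minimal Python sketch (a toy ontology fragment and hypothetical gene annotations, purely illustrative, not our actual schema or data): it exploits the DAG structure of the ontology and the true path rule to retrieve genes by "type" rather than by text matching.

```python
# Toy Gene Ontology fragment: term -> set of parent terms (is-a edges).
# Real GO terms have identifiers like GO:0016301; names are used here
# only for readability.
PARENTS = {
    "protein kinase activity": {"kinase activity"},
    "kinase activity": {"catalytic activity"},
    "catalytic activity": {"molecular_function"},
    "molecular_function": set(),
}

# Hypothetical annotations: gene -> its most specific assigned GO term
ANNOTATIONS = {
    "geneA": "protein kinase activity",
    "geneB": "kinase activity",
    "geneC": "molecular_function",
}

def ancestors(term):
    """All terms reachable upward from `term`, including itself.
    This encodes the 'true path rule': a gene annotated to a term is
    implicitly annotated to all of that term's ancestors."""
    seen = set()
    stack = [term]
    while stack:
        t = stack.pop()
        if t not in seen:
            seen.add(t)
            stack.extend(PARENTS.get(t, ()))
    return seen

def genes_of_type(query_term):
    """Semantic query: genes whose annotation is the query term
    or any specialization (descendant) of it."""
    return {g for g, t in ANNOTATIONS.items() if query_term in ancestors(t)}
```

For example, querying for "kinase activity" returns both geneA and geneB, because "protein kinase activity" is a specialization of the query term; a textual search on the annotation strings would have missed that relationship.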

Be aware that ontologies and their terms are not type systems and types; personally I think they are a starting point on which to build more formal and complete techniques. But this will be part of future work, maybe even my future work… :)

# Wednesday, 26 April 2006

Did you know...

...that there are (more than) 21 algorithms to compute the factorial of a number?
I discovered this fact today, while implementing the Fisher's exact test for a 2x2 contingency table.
(For those interested: Fisher's exact test for a 2x2 contingency table, like the more famous Chi-square test, is used for the analysis of categorical frequency data: its aim is to determine whether the two categorical variables are associated.
We use it to extract the most significant genes from a gene set experiment: the two categories are "clue present/clue absent", while the two data sets are the control set and the experiment set)
Fisher's exact test requires the computation of 9 factorials for each iteration, one of them on the grand total. Typically, the applets and code you can find on the internet can handle data up to a total of about 100-200 before giving up due to overflow problems. Even code that uses arbitrary-precision arithmetic libraries can handle numbers only up to 6000-12000 before becoming unusable.
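As an illustration of one way to dodge the overflow (this is a sketch, not the code we actually wrote): working with logarithms of factorials, via the log-gamma function, keeps every intermediate value small, so the table total can grow far beyond the 100-200 limit mentioned above. A minimal Python version:

```python
from math import exp, lgamma

def log_fact(n):
    # log(n!) computed via the log-gamma function: no overflow,
    # even for very large n
    return lgamma(n + 1)

def fisher_p(a, b, c, d):
    # Hypergeometric probability of a single 2x2 table [[a, b], [c, d]]:
    # row1! row2! col1! col2! / (n! a! b! c! d!), done entirely in log space
    n = a + b + c + d
    return exp(log_fact(a + b) + log_fact(c + d)
               + log_fact(a + c) + log_fact(b + d)
               - log_fact(n) - log_fact(a) - log_fact(b)
               - log_fact(c) - log_fact(d))

def fisher_exact(a, b, c, d):
    # Two-sided test: sum the probabilities of all tables with the same
    # margins that are no more likely than the observed one
    n = a + b + c + d
    row1, col1 = a + b, a + c
    p_obs = fisher_p(a, b, c, d)
    total = 0.0
    for x in range(max(0, row1 + col1 - n), min(row1, col1) + 1):
        px = fisher_p(x, row1 - x, col1 - x, n - row1 - col1 + x)
        if px <= p_obs * (1 + 1e-7):  # small tolerance for float rounding
            total += px
    return total
```

On Fisher's classic "lady tasting tea" table [[3, 1], [1, 3]] this gives the textbook two-sided p-value of 34/70 ≈ 0.486, and the same code handles tables with totals in the tens of thousands without any overflow.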

Here, we are facing two problems:
  • numbers (integers and doubles) provided by the machine (physical CPU or VM) have finite precision and range
  • the simple and dumb algorithm (the one that performs all the multiplications) is very SLOW
The solution is to adopt a good library that handles multiplication of very large numbers efficiently (using Karatsuba or FFT-based multiplication), and to forget about the naive algorithm. At this page you can find implementations and benchmarks for the 21 different algorithms I mentioned. The most efficient ones use prime numbers (or, better, primorials, i.e. the product of the first n primes), but even the simple and elegant Split Recursive performs a lot better than the naive algorithm!
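To give an idea of the splitting approach (a simplified binary-splitting sketch, not Luschny's exact Split Recursive algorithm): instead of multiplying 2·3·4·…·n left to right, split the range in halves and multiply balanced sub-products, so the few expensive multiplications happen between operands of similar size, which is exactly where Karatsuba/FFT multiplication pays off.

```python
def fact_naive(n):
    # Schoolbook factorial: n - 1 multiplications, each one "huge * tiny"
    r = 1
    for i in range(2, n + 1):
        r *= i
    return r

def prod_range(lo, hi):
    # Balanced product of the integers lo..hi: recursively split the
    # range so the two operands of each multiplication stay comparable
    # in size
    if lo > hi:
        return 1
    if lo == hi:
        return lo
    mid = (lo + hi) // 2
    return prod_range(lo, mid) * prod_range(mid + 1, hi)

def fact_split(n):
    return prod_range(2, n)
```

Both functions return the same exact integer; the difference is purely in how the multiplications are arranged, which dominates the running time once the partial products grow to thousands of digits.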

Notable news

Some funny / useful things I found browsing last week:
  • From Brad Adams, .NET Framework 2.0 Poster
    It is the follow-up to the version 1.1 poster I found shipped with Visual Studio 2003; unfortunately, no such poster was in my Visual Studio 2005 box. Great!
  • selk'bag
    The SELK’BAG is a sleeping bag you wear. The entire bag fits like a glove. You can walk in it, sleep in a sitting position... I think it would be great to have one of these for my workplace in summer: the ice-conditioning system makes our offices freezing cold!
  • TiwyFeeds
     "Take It With You" Feed Reader: the idea is great, the realization is poor...

    But I'd definitely love a feed reader 'a-la' gmail, where you can store all your feeds and news for an unlimited amount of time, with a searchable interface!
  • Krugle
    A search engine for developers: personally I use Google + MSN Desktop Search (for code on my machine). But a tool tailored to developers is surely great! I'd like to have Visual Studio's search and navigation facilities in a desktop search application...

# Wednesday, 19 April 2006

User interfaces

There is a lot of hype around new user interfaces lately: the new Vista glass look, how it compares to the always improving Mac OSX interface, and the Linux response (Xgl).
These are all nice, eye-candy evolutions of existing interfaces, though. As I discussed in a previous post, and as Jeff Atwood frequently reports in his articles, there is a lot of room for UI improvement. Embedded and ubiquitous search is a key improvement, in my opinion. But surely future interfaces will bring more interesting features, even without a radical shift of paradigm. In the last "In Our Time" column of IEEE Computer (March 2006), David Alan Grier talks about pervasive computing. The definition of pervasive computing dates back to 1991, and is due to Mark Weiser. Mark wrote that "the most profound technologies are those that disappear", those that "weave themselves into the fabric of everyday life".
Mark was an engineer working at (guess where?) Xerox PARC. Exactly: the same lab where GUIs as we know them today were born.
He was aware of the importance of such ideas, but he also felt that they were incompatible with the idea of a pervasive computing environment. A windowing operating system, no matter how sophisticated, only makes "the computer screen a demanding focus of attention, it does not fade into the background".
I agree with him: where I work, I often see people who are so focused on the screen that they somehow forget their primary work. They are so focused on how, they forget what.

But is it really a defect of the window paradigm?
I don't think so. Especially after seeing "The Island". Take a look at this desktop environment:

do you see the computer? The operating system?
Maybe a closer look...

The desktop... is the desk. It is amazing to see it in action! Using some sort of mouse, the principal (the character with glasses) "throws" a window at Lincoln (the guy in white). Then Lincoln begins to draw, as if on a sheet of paper on the desk. And when he finishes, he passes the window with the drawing back to the principal.
And there is no shift of paradigm. There are windows, and folders, and documents on a desk. A mouse to grab and move them. But the desktop IS the desk!

This is pervasive computing, for me. And I really want one of those desks... :)
# Wednesday, 05 April 2006

I am Windows 2000

This is ironic...

You are Windows 2000 SP3.  You're a steady and reliable friend.  People think you're all business, but with your recent therapy you've become a little more playful.
Which OS are You?


But I like Win2000 very much!