Monday, June 04, 2007

Tech Ed Day 1

So here I am in the land of the US, where big cars, wide roads and really thick blades of grass seem to be the order of the day.

Tech Ed started in earnest this morning as Bob Muglia (Senior VP of the Server and Tools Business at Microsoft) kicked off a rather impressive keynote that involved an introductory video with Christopher Lloyd decked out as Doc Brown from Back to the Future (complete with the DeLorean time machine). The short film mocked previous keynotes' tendency to promote Microsoft's "visions" for the future (called MS BS by Lloyd), and made inside jokes about Microsoft Bob and the Office paper clip, among other things ("It looks like you want to scream" was the question it asked, I think). Lloyd's joke referring to Bill Gates' alleged "640K ought to be enough for anybody" quote raised a few laughs among the attending folks.

After this, the keynote departed from the usual "vision" statements and instead concentrated on introducing upcoming Microsoft products and how they can be used in the real world. Muglia touched on Windows Server 2008, Operations Manager 2007, SQL Server 2008, the 2006 R2 release of BizTalk, Visual Studio 2008 and Silverlight in a keynote address that was delayed, most likely due to the massive queue of people waiting to register for their conference ID badges.

This delay pushed most other scheduled sessions back by about 15 minutes, which meant that I only found the food hall, contained within a huge expanse in the middle of the South building, at about 2pm (where I consumed what I believe to be some kind of beef). I once attended Internet World back in the UK at Earls Court, and this room alone is about twice the size of that venue (remember what I said about things being large?).

So, onto the sessions for today:

Ron Jacobs on the Architecture Landscape extolled the virtues of Test Driven Development (TDD) and of separating the user interface from business logic. We develop using XP as a methodology at Esendex, so instruction on how to separate logic so that it can be tested wasn't of much use to me.

What was useful was his statement that unit tests should be as simple as possible and able to run completely contained within themselves. So if a test requires a lookup in the database, that test should still be able to run even if the database is not available. The way to achieve this is by using "mock" objects, and we were pointed in the direction of NMock. I haven't had a chance to look into it yet, but plan to later.
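Since I haven't tried NMock yet, here's a rough sketch just to pin the idea down for myself: the names are entirely my own invention, and I've used a hand-rolled fake where NMock would generate the stand-in for you from the interface. The business logic is exercised without ever touching a database.

    using NUnit.Framework;

    public interface ICustomerRepository
    {
        decimal GetCreditLimit(int customerId);
    }

    // Fake used only by the tests; no database connection involved.
    public class FakeCustomerRepository : ICustomerRepository
    {
        public decimal GetCreditLimit(int customerId)
        {
            return 500m;
        }
    }

    public class OrderValidator
    {
        private readonly ICustomerRepository repository;

        public OrderValidator(ICustomerRepository repository)
        {
            this.repository = repository;
        }

        public bool CanPlaceOrder(int customerId, decimal orderValue)
        {
            return orderValue <= repository.GetCreditLimit(customerId);
        }
    }

    [TestFixture]
    public class OrderValidatorTests
    {
        [Test]
        public void OrderWithinCreditLimitIsAccepted()
        {
            OrderValidator validator = new OrderValidator(new FakeCustomerRepository());
            Assert.IsTrue(validator.CanPlaceOrder(1, 100m));
        }
    }

The test runs happily on a build server with no database in sight, which I think is exactly the point he was making.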

What I found amusing, though, were some of the questions in the Q&A section after the talk. People asked how to convince their bosses that TDD worked, as it seemed like working backwards. It's a valid question in a way, and I think I only find it amusing because we've already successfully adopted TDD at Esendex. TDD and pair programming have definitely reduced the defect rate in the code we produce and have also improved our performance in quite a few places.

I don't have any concrete metrics for anyone wanting to convince management to adopt TDD, only to say that it works.

Then there was the lunch session, snappily titled The .NET StockTrader Application Service-Orientation Case Study: Building High-Performance, High Reliability Systems with .NET 3.0 and Windows Communication Foundation. This was interesting on many levels, and centred on the StockTrader MSDN example. One thing I need to look into when the source is released later this week is the concept of centralising your configuration so that it is read from a SQL Server. The problem this solves is one the operations team at Esendex face every day.

Reading settings from a config file is fine in stand-alone apps, but when (as Esendex does) you have multiple Windows Services running on multiple servers, keeping that information consistent is a problem. Internally we've talked about how to centralise the information a number of times, so it will be interesting to see how this is managed in the StockTrader example.
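I don't know how StockTrader actually does it yet, but I imagine the shape of it is something like this (table and column names entirely invented on my part): each service loads its settings from a shared SQL Server table at start-up rather than from its own local config file.

    using System.Collections.Generic;
    using System.Data.SqlClient;

    // Rough sketch of centralised configuration: every service reads its
    // settings from one shared table instead of a local .config file.
    public static class CentralConfiguration
    {
        public static Dictionary<string, string> Load(string connectionString, string serviceName)
        {
            Dictionary<string, string> settings = new Dictionary<string, string>();

            using (SqlConnection connection = new SqlConnection(connectionString))
            using (SqlCommand command = new SqlCommand(
                "SELECT SettingKey, SettingValue FROM ServiceSettings WHERE ServiceName = @service",
                connection))
            {
                command.Parameters.AddWithValue("@service", serviceName);
                connection.Open();

                using (SqlDataReader reader = command.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        settings[reader.GetString(0)] = reader.GetString(1);
                    }
                }
            }

            return settings;
        }
    }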

This session was my introduction to WCF, so much of it was new to me. What wasn't new, though, was the concept of different services communicating with each other, and that's exactly what WCF was designed for. WCF is the plumbing that enables services to talk to each other, and it abstracts it all away so that the developer doesn't need to worry about exactly what transport mechanism or encoding the underlying channel is using: it is all configurable.

That session really should have been scheduled before my final one of the day, which was Programming Microsoft Windows Communication Foundation: A Developer's Primer. In this session, fast-talking "Software Legend" (I tried to find a definition of the term, but can't) Juval Lowy gave the ultimate lowdown on WCF. This session was probably the most interesting of the day, and certainly the most entertaining (notwithstanding the time-travelling keynote, that is).

What was most intriguing was that although WCF is part of .NET 3.0, you can use it from .NET 2.0 code if you get the WCF bits from .NET 3.0. Now, exactly how you do this I'm not sure yet, but we currently develop on .NET 2.0, so if we didn't want to make the jump to 3.0 yet we could still use WCF.

WCF is described as an SDK for building SOA applications: it is the plumbing between services. Nobody cares about plumbing, Lowy says, so why spend any time creating it?

Why indeed? And with something as rich as WCF there really is no need.

WCF requires a shift in thought process. Syntactically it is simple: a normal C# developer will be able to look at the code and understand what it is doing. The learning curve is in figuring out how to apply it in your system.

There is a clear separation between interface and implementation in WCF. It even catches any exceptions thrown so that the interface contract is maintained. The service exposes metadata that describes it, and the client creates a proxy from this metadata and uses it.
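To give a flavour of what that separation looks like, here is my own toy example rather than anything from the session: the contract is just an attributed interface, and the implementation is a plain class behind it.

    using System.ServiceModel;

    // The contract is an interface decorated with WCF attributes; clients
    // only ever see this, never the implementing class.
    [ServiceContract]
    public interface IQuoteService
    {
        [OperationContract]
        decimal GetQuote(string symbol);
    }

    // The implementation is a plain class; WCF handles the wire format
    // and metadata around it.
    public class QuoteService : IQuoteService
    {
        public decimal GetQuote(string symbol)
        {
            return 42.0m; // stand-in for real pricing logic
        }
    }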

In many ways it's similar to web services, but without the restrictions that web services have. Web services can only run over HTTP and use XML documents; WCF doesn't have these restrictions: you can have a binding that specifies TCP and binary encoding if you want. In fact, this binding is effectively the replacement for the old .NET Remoting concept.
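From what I understood, switching to TCP and binary is just a matter of choosing a different binding when the service is hosted. A sketch, reusing the IQuoteService contract from above (the addresses are made up):

    using System;
    using System.ServiceModel;

    // Host the same contract over HTTP/XML and TCP/binary simply by adding
    // two endpoints with different bindings; the service code itself is untouched.
    class Host
    {
        static void Main()
        {
            using (ServiceHost host = new ServiceHost(typeof(QuoteService)))
            {
                host.AddServiceEndpoint(typeof(IQuoteService),
                    new BasicHttpBinding(), "http://localhost:8000/quotes");
                host.AddServiceEndpoint(typeof(IQuoteService),
                    new NetTcpBinding(), "net.tcp://localhost:8001/quotes");

                host.Open();
                Console.WriteLine("Service running. Press Enter to stop.");
                Console.ReadLine();
            }
        }
    }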

The in-process calls that WCF allows are also quite interesting. Communicating through WCF removes any assembly-specific dependencies that an application might have. In any large system you will have assemblies referencing specific versions of other assemblies. If these dependencies ever change, the referencing assemblies need to be rebuilt, or be configured to use a different version if the interface hasn't changed.

WCF allows you to separate these completely. Now I can have one assembly that communicates through WCF to tell another assembly to do something. I don't have to worry about how it is doing it, or how WCF is sending the messages, or even where the assembly is hosted. I can build a completely scalable, totally decoupled system that is easy to maintain.
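My mental model of the client side (again my own sketch, reusing the contract from above) is that the calling code compiles against the interface and an address only, with no reference to the assembly that implements the service.

    using System;
    using System.ServiceModel;

    // The client knows only the IQuoteService interface; where the service
    // lives and how messages travel is decided by the binding and address.
    class Client
    {
        static void Main()
        {
            ChannelFactory<IQuoteService> factory = new ChannelFactory<IQuoteService>(
                new NetTcpBinding(),
                new EndpointAddress("net.tcp://localhost:8001/quotes"));

            IQuoteService proxy = factory.CreateChannel();
            Console.WriteLine(proxy.GetQuote("MSFT"));

            factory.Close();
        }
    }

If the service ever moves to another machine, only the address changes; the calling code stays the same.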

WCF is certainly something to look to for future systems.

That's a lot of stuff for just one day....

1 comment:

Anonymous said...

I think .Net 3.0 is .Net 2.0 plus some add-ons (WCF and a couple of other things).

The extra stuff still runs on the .Net 2.0 CLR, so it isn't really a new Framework at all, more like .Net 2.0+.

That's my understanding of it anyway.

So presumably you can just download the extra bits.

Good article I was sent here:
http://www.grimes.demon.co.uk/dotnet/vistaAndDotnet.htm