Slides from ANUG lightning talk on webservices

by DotNetNerd 26. April 2012 14:30

Thanks to everyone who participated in the first ANUG lightning talk this Wednesday. It was fun to dive into the subtle differences between Nancy, ServiceStack and WebAPI. As with everything else – it is always more fun when you win the battle, of course :)

The demo and slides are available at - along with a small sample of using type providers in F# for building a WebAPI service.

F# Type Providers and WebAPI

by dotnetnerd 4. April 2012 10:35

Later this month, on the 25th, I will be speaking along with two other speakers about web frameworks at an ANUG meeting. The frameworks covered are ServiceStack, Nancy and WebAPI, where I will cover the latter.

While preparing, my eye caught a new feature in F#, which I think is one of the first really good stories for F# that makes sense outside academia and science. Type Providers give us a way to access data sources that provide their own metadata, and this fits nicely with building WebAPI services.


Nancy Bootstrapper for Castle Windsor

by DotNetNerd 18. January 2012 19:37

On a current project I have decided to use NancyFx for services that expose data to the client via Ajax. The solution already uses Umbraco for CMS capabilities, and everything is wired up using Castle Windsor for DI.

From the start I was hoping to just install the NuGet packages for hosting in an ASP.NET application and for bootstrapping with Windsor. As it turned out, neither worked in my case.
Getting Nancy to run alongside an existing site is pretty well documented, so that went pretty smoothly once I gave up on the package and just followed the documentation.
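To give an idea of where Windsor hooks in, here is a minimal sketch of a custom bootstrapper. The base class name and override signature are assumptions based on the Nancy.Bootstrappers.Windsor package, and IMyService/MyService are purely hypothetical registrations – the exact types may differ from what ended up in the project.

using Castle.MicroKernel.Registration;
using Castle.Windsor;
using Nancy.Bootstrappers.Windsor;

public class SiteBootstrapper : WindsorNancyBootstrapper
{
    // Register application-wide components in the Windsor container.
    // Base class and override are assumed from the Nancy.Bootstrappers.Windsor package.
    protected override void ConfigureApplicationContainer(IWindsorContainer container)
    {
        container.Register(Component.For<IMyService>().ImplementedBy<MyService>());
    }
}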


Using the MVC-Mini-Profiler with Umbraco

by dotnetnerd 16. January 2012 08:50


Over the last year I have been posting more or less the same blog posts here and at – but my plan is to stop that going forward. I have been asked to do more blogging in Danish, and that fits pretty well with the fact that I have been postponing a decision on which blog to stick with, since I don’t care much for the cross-posting. So going forward this blog will be kept in English, and it will contain topics that are either micro-posts or a bit off topic for what I do at Vertica. So let’s get started!

Mini-Profiler colliding with Umbraco

For some time I have been using the MVC-Mini-Profiler to get a quick look at how my pages perform during development. The good thing about the mini-profiler is that it is lightweight, requires little setup and is a lot less intrusive than most profiling tools.

In spite of its name, the profiler works just fine with regular ASP.NET websites and applications; however, I did run into a little twist when using it with Umbraco, which is what I want to share today.

To get started with the mini-profiler you simply add the NuGet package to your project and do the regular setup: adding the scripts to your layout and calling the start and stop methods on the profiler. If you run your site now, nothing will happen that is visible to the naked eye – doh!
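For reference, that setup usually looks something like the sketch below, assuming the 2011-era MvcMiniProfiler package – namespaces and helper names may differ in other versions. The scripts are added by calling @MvcMiniProfiler.MiniProfiler.RenderIncludes() in the layout, and start/stop go in Global.asax:

// Global.asax.cs – start and stop the profiler per request (assumed MvcMiniProfiler API)
protected void Application_BeginRequest()
{
    if (Request.IsLocal) // only profile while developing locally
        MvcMiniProfiler.MiniProfiler.Start();
}

protected void Application_EndRequest()
{
    MvcMiniProfiler.MiniProfiler.Stop();
}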

The solution

If you take a look at the DOM using your favorite browser's developer tools or Firebug, you will see that some container elements have been added to the page, but with no content. This is because the elements are loaded asynchronously – and the network tab reveals that the call to mini-profiler-results returns a 404.

So as you may have guessed by now, if you are used to working with Umbraco, you also need to add mini-profiler-results to the umbracoReservedUrls setting in web.config, making it look something like this:

<add key="umbracoReservedUrls" value="~/config/splashes/booting.aspx,~/install/default.aspx,~/config/splashes/noNodes.aspx,~/mini-profiler-results" />

Now if you run the site, you should get the profiler box in the top-left corner of the site, and you are ready to go hunting for performance bottlenecks.
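Once it is up and running, it is also worth knowing that you can time individual parts of your code as named steps. A small sketch, again assuming the MvcMiniProfiler API of the time – LoadFrontpageNodes is just a hypothetical helper used for illustration:

using MvcMiniProfiler;

using (MiniProfiler.Current.Step("Load frontpage nodes"))
{
    // everything inside the block shows up as a named step in the profiler popup
    var nodes = LoadFrontpageNodes(); // hypothetical helper, not from the original post
}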

2011 debriefing

by DotNetNerd 23. December 2011 09:00

So, the time has come to look back on the year that has passed and reflect a bit. I am usually more focused on making plans than writing history books, but once in a while it can be beneficial to take a look at what you have been doing, so you can make better conscious choices in the future. I do tend to get lost in all my little projects, so sometimes it is a nice reminder to look back and get an idea of why the year went by so fast.

So what were the more interesting parts of my year? Well, I

  • Built MiniMe, which has had > 370 downloads since July and a couple of contributors.
  • Blogged here and on – writing 18 blog posts in total, and got a pat on the shoulder as blogger of the year at Vertica along with another good colleague.
  • Hosted an ANUG code camp on IronRuby.
  • Did an ANUG podcast about the NHibernate 3.0 Cookbook.
  • Played around with a bunch of technologies, and found use for some of them at work – doing faceted search with MongoDB and KnockoutJS as one of the more exciting solutions.
  • Did hobby projects to fool around with WebMatrix, MVC3 and Entity Framework.
  • Read a handful of books on web development, ranging from Ruby on Rails 3 Tutorial to Dependency Injection in .NET.
  • Attended the HTML5 fast-track course, Commerce Server training and a bunch of ANUG and Trifork events and code camps.
  • Was SEEE certified – mostly getting to know why not to use it.
  • Helped win the e-commerce award for start-ups in Denmark – which I actually worked on last year.
  • My first Windows Phone 7 app, Blue Orb Player, just turned one year old and has been installed from the Marketplace > 285 times.

At work it has been a pretty versatile year, where I have had some consulting tasks and worked with Windows CE and Silverlight as well as the more regular ASP.NET projects – where BD, Bolia and Trollbeads have taken up most of my time. I can only hope that 2012 will be just as exciting, and with HTML5, devices and more client-driven solutions shaping up to become important themes, I will probably have as much fun as I did this year.

Merry xmas and happy new year everyone :)

HTML5 fast-track

by DotNetNerd 20. December 2011 10:42

This December I spent a couple of days at the HTML5 fast-track course, getting up to speed on various new and upcoming browser features and APIs. The course was hosted by Trifork, who had invited Peter Lubbers – author of the book Pro HTML5 Programming – to go over the umbrella that is the HTML5 specification. He and his co-host Mathias Bynens did a great job of presenting the most important aspects of the spec. They also managed to communicate the more blurry parts in a way that left us with a good overview of the maturity of the different browser implementations. So today I will write a bit about some of the things we went over, and hopefully help you learn a bit more about HTML5, while providing you with a bunch of links where you can learn more.


Implementing faceted search with MongoDB

by DotNetNerd 8. December 2011 08:29

After my last post on faceted search I was asked to elaborate on how it was implemented with MongoDB. So that is just what I will do with this post – which gives me the chance to comment a bit on the good and the bad experiences. Even though it was a good overall experience, there will always be things to wish for – the day I say otherwise is the day I should stop being a developer.


Faceted search with MongoDB and KnockoutJS

by DotNetNerd 5. November 2011 16:52

Almost two years ago I took my first look at MongoDB as my first exploration into NoSQL. Today I still find it to be one of the most interesting NoSQL tools – rivaled mainly by RavenDB – for most of what I would call common scenarios. Recently I finally had the chance to use MongoDB in a real project, when we were talking over our options for doing faceted search, where the main design criterion was speed.

To get a good, smooth user experience, the performance of the faceted search is critical. One of the limitations of a traditional SQL database is that everything is modelled as rows and columns. So when you need complex structured objects, you rely on joining tables together and mapping them into objects. This is all fine and well in most cases, but when you aim to get the very best performance and the objects are well defined, we can do better with tools like MongoDB. Not having to deal with a schema reduces development time when building something that should be denormalized, and avoiding mapping and multiple joins cuts the runtime cost of a query.
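To illustrate the point, a denormalized product document can carry its facet values directly, so a single document read gives everything needed to render a result and count facets. The class below is purely illustrative – the actual model from the project was not part of the original post:

using System.Collections.Generic;

// A hypothetical denormalized document – everything needed for search results
// and facet counting lives on the product itself, so no joins are required.
public class Product
{
    public string Id { get; set; }
    public string Name { get; set; }
    public decimal Price { get; set; }
    public string Brand { get; set; }        // facet
    public string Category { get; set; }     // facet
    public List<string> Colors { get; set; } // multi-valued facet
}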

Takes two to perform

Just as it takes two to tango, it also takes good performance on both the backend and the frontend to provide the experience of good performance. To facilitate this I chose to use KnockoutJS – another tool I have blogged about earlier. Knockout was used to handle two-way binding between the model and the elements on the page, and Ajax was used to request the search results from the server and update the model.

Snapping together the Lego pieces

As Scott Hanselman often describes it, modern tools should fit together well, giving the same feeling as Lego pieces that snap together. This really was the feeling I had when I implemented the faceted search. Defining the model server-side, passing objects on through a service, and then having them serialized to JSON, which in turn was turned into KnockoutJS observables, just felt smooth and painless.

The only thing I had to reconsider was using the LINQ implementation in NoRM, which isn’t quite good enough yet. This was, however, a small hiccup, as the more native API that NoRM provides worked fine and was easy to use.

Looking back, the actual implementation, including creating indexes, did not take long, and the performance just rocks. So this is without a doubt one of the more fun challenges I have had lately, and a solution that I feel proud of.

Dynamic data access with WebMatrix WebPages

by dotnetnerd 11. October 2011 22:06

Today I went to a talk by Hadi Hariri about dynamics, which was arranged for the ANUG user group in cooperation with the GOTO conference. The talk happened to fit very well with the first topic I had planned for my series on WebMatrix WebPages, which is data access.

The talk was about not fearing dynamics, and some of the scenarios where they can provide benefits over static types. Some of the scenarios Hadi talked about were DTOs, view objects and data access – the last of which is exactly what WebMatrix WebPages utilizes.

The flood of Micro-ORMs

Over the last year or so a lot of so-called Micro-ORMs have seen the light of day, with some of the more popular ones being Simple.Data, Dapper, PetaPoco and Massive. The reason for their popularity is that they provide a sense of simplicity in the “Vietnam of computer science”, as Ted Neward put it.

Each of these ORMs has its own focus, strengths and weaknesses – and some are more “micro” than others. Compared to NHibernate or Entity Framework they are all very simple to get started with. For quite a few of them, part of the strength is the return of good old SQL instead of LINQ or some other abstraction.


When a new WebMatrix WebPages project is created, it comes with its own Micro-ORM out of the box. The ORM allows queries and commands to be executed, expressed as SQL statements. For queries, data is returned as dynamic objects. So a regular query could be done like this:

var db = Database.Open("myDatabase");
dynamic user = db.QuerySingle("SELECT * FROM Users WHERE User_Id = @0", 123);

The big advantage of this approach is that you can select any fields, use calculated fields and join with other tables to your heart's content, and you won’t have to write a class to represent each shape of the data returned. This also means that we can use the power of SQL and avoid overcomplicating things. Because the distance from database to query and then to the view is so short, working with dynamic objects is not a problem. So if your domain is not too vast and complex, life is good.
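As a small illustration of that freedom, here is a sketch of a page that joins two tables and binds the dynamic rows straight into the markup. The Orders and Users tables are just assumed example tables, not taken from a real project:

@{
    var db = Database.Open("myDatabase");
    // Join and shape the result freely – no class is needed to represent the rows.
    var orders = db.Query("SELECT o.Order_Id, o.Total, u.Name FROM Orders o INNER JOIN Users u ON u.User_Id = o.User_Id");
}
<ul>
@foreach (var order in orders) {
    <li>@order.Name – @order.Total</li>
}
</ul>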

Doing inserts and updates is equally easy, but it is one of the areas where I find the ORM lacking. Most annoying is that it does not handle converting null to DBNull. Also, while you do get extension methods to convert strings to int, DateTime etc., there is no option to get null instead of the default value of the data type. So the code tends to get cluttered with parsing and conversions – if you don’t write the extensions yourself.
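If you do end up writing them yourself, the kind of extension hinted at above could look something like this – a purely hypothetical helper that returns null when parsing fails, instead of the type's default value:

// Hypothetical helper, not part of WebPages – returns null instead of 0 when parsing fails.
public static class ParsingExtensions
{
    public static int? AsNullableInt(this string value)
    {
        int result;
        return int.TryParse(value, out result) ? (int?)result : (int?)null;
    }
}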

Clean Ajax

A nice surprise for me was how easy it is to expose data as JSON to enable Ajax when working with WebPages. All you have to do is create a WebPage that retrieves data, pass it to Json.Encode and write it to the response, as this example shows.

var json = Json.Encode(user);

Life does not get much simpler than this, and it leaves you with a smooth feeling when moving data between server and client.


The data access bits of WebMatrix have been fun to work with, and they have given me a great sense of freedom to get things done without having to write view objects, mappers and a bunch of configuration.

The fluidity of working with dynamic data and doing Ajax has certainly opened my eyes with regards to the value that this kind of framework can provide. “The return of SQL” has also reminded me that LINQ is not all rainbows and smiles. The power of micromanaging a JOIN statement and doing UPDATEs and DELETEs should not be overlooked.

Using the right tool for the right job is still the key phrase, though. It has been a good match for the project I am working on, but I would not want to use it for an enterprise application. Testability is clearly not a goal of the framework, refactoring is error-prone, and when complexity increases the code tends to get messy. So if anything, I will argue that it proves that the place for WebMatrix is hobbyists, simple projects, startups and prototyping.

What is the Matrix?

by DotNetNerd 29. September 2011 20:50

Currently some of my spare time is spent building an application using WebPages in WebMatrix and Visual Studio Express, which has been fun and has given me the chance to look at web development and quality from a different angle. WebMatrix is a free Microsoft tool targeted at hobbyists and developers doing lightweight development, enabling users to build websites either from scratch or on top of open source systems such as Umbraco, ScrewTurn or dasBlog. All in all there are currently 59 sites to choose from, spread across the categories blogs, CMS, eCommerce, forums, galleries, tools and wikis.


The application I am working on is built from scratch, because the requirements don’t really fit any of the generic systems. For this purpose WebMatrix has the option to build a completely custom site using a framework called WebPages, which utilizes the Razor view engine.

At first glance, WebPages might make you think of classic ASP or PHP, because it also works by mixing markup and code in the same file. I will admit that at first this made me cringe – as I think it will most professional developers who are used to building enterprise-scale applications. Having read a bit more about it on various blogs over the last year, it caught my interest for some specific scenarios. The point to me is that it is quite powerful for quickly putting together applications where the complexity is manageable. So to get a startup on its feet quickly, to do a proof of concept, or for those who dabble in web development as a hobby, it can definitely provide good value.

In my case it is actually a startup that I am looking to help my brother with – who just happens to dabble a bit in web development once in a while. So for a guy who understands the basics of the web, but is not familiar with the amount of abstraction involved in building an ASP.NET or MVC application, WebMatrix and WebPages seem like a good fit. Having implemented the first couple of pages and a basic structure I am pretty happy – hopefully I won’t wake up and think: why, oh why, didn’t I take the blue pill.

And what is Quality?

A word that is often heard when discussing a framework like this is “quality” – or more specifically the lack thereof. For a while I have actually thought about writing a comment on what quality is, because it is something everyone agrees they can and will deliver, but when it comes time to define it, people start arguing. If anything, the tendency is that people argue that the skill set they personally possess is what defines quality. TDD guys argue for testability and code coverage, designers argue for the touch and feel of a site, and so on and so forth.

Personally I always think back to one of my teachers, who said that quality is the degree to which a product lives up to what the customer expects. This is pretty close to what Google comes up with when referring to Wikipedia and the definition based on the ISO 9000 standard, which defines it as the “degree to which a set of inherent characteristics fulfills requirements”. While I think this is a good definition, it also leaves me with a sense that an important aspect is overlooked. As Henry Ford said, “If I'd asked customers what they wanted, they would have said a faster horse”. This could be interpreted to mean that the most important thing is guiding the customer and telling them what they never knew they always wanted. So my personal definition is:

Quality exceeds the customer's original expectations and fulfills the requirements to a high degree.

At least, that is the experience I want as a customer, and it is what gives me that amazing feeling of victory when we are able to provide it.

Reflecting on the topic of this blog post, I think that WebMatrix and WebPages can indeed deliver quality. I think so for a number of reasons, the obvious ones being when the requirements revolve around speed of delivery or the customer's ability to participate with a basic knowledge of web development.

Now what?

This blog post has been quite unusual for me, because it contains no code at all! Rest assured though, my intent is to turn this into a little series of blog posts – I just wanted to lay the foundation. The current idea is that I will look at how deep the rabbit hole goes by digging into some of the building blocks of a WebPages application. Hopefully I will be uncovering what WebPages can be used for, as well as some techniques and tools that might be used in other contexts.

Who am I?

My name is Christian Holm Diget, and I work as an independent consultant in Denmark, where I write code, give advice on architecture and help with training. On the side I get to do a bit of speaking and help with miscellaneous community events.

Some of my primary focus areas are code quality, programming languages and using new technologies to provide value.

Microsoft Certified Professional Developer

Microsoft Most Valuable Professional
