I’m gonna build my own blog engine, because it’s never been done before, or maybe I can just do it better!

Today is the first day of my paternity leave; I’ll be home with my little daughter for at least six months. Which is great! But I immediately started to get the itch to write some code. I needed a hobby project. Since I suck at coming up with great ideas, I decided to be a good Jedi and build my own blog engine!

Yes, I know, there really is no need for another blog engine. Except that there is. The one I would like to use doesn’t seem to be out there. Also, it seems like a perfect way to try out some new technology that I’ve been wanting to use, without a client (or a manager) having opinions about it. And since I replaced the native comments of Subtext with Disqus a few weeks ago, I realized that there really aren’t that many features needed for a blog anymore.

So, my plan right now is to build it on ASP.NET MVC 3, with the Razor view engine. I’ve been wanting to try this whole NoSQL-thing for a while, so I’ll use RavenDB for persistence. I’ve also been wanting to try a distributed version control system (getting reeaaally tired of TFS!), so I’ll use Mercurial and host it on Bitbucket. I’m sure I’ll think of other sweet stuff to use as well, and maybe I’ll change my mind on some of these along the way.

But now it’s on its way. I called the project “Alpha”, because that’s what it’ll always be. But I aim to replace Subtext with Alpha before the end of this year. And hopefully the process of building it will give me reasons to write some blog posts, since I will not be writing blog posts about the process of taking care of my daughter.

Anyway, the project is available at https://bitbucket.org/nahojd/alpha/overview. Not much to see yet, though.

Extension methods are teh shit!

I’m really starting to like extension methods! I must admit I was a bit sceptical at first when they appeared in C# 3.0 (wow, that was 3 years ago!). I was worried that they would make the code hard to follow, and thought that you could just make a static helper class instead (which is, after all, what extension methods really are).

//After all, what's wrong with this?
cleanString = StringHelper.RemoveHtml(htmlString);

//On the other hand, this does look nicer...
cleanString = htmlString.RemoveHtml();

So at first, I didn’t use them for much, but now I’ve really started to like them, and I find myself extracting anything that operates on classes I can’t change into extension methods. For example, don’t you hate that List&lt;T&gt;.ForEach isn’t available on IEnumerable&lt;T&gt;? Don’t worry, there’s an extension for that:

public static void ForEach<T>( this IEnumerable<T> enumeration, Action<T> action )
{
    foreach (var item in enumeration)
    {
        action( item );
    }
}

And the ugly !string.IsNullOrEmpty(someStringThatMayBeNullOrEmpty)? Well, how about

public static bool IsNullOrEmpty(this string @string, bool trim = false)
{
    if (@string == null)
        return true;
    return String.IsNullOrEmpty(trim ? @string.Trim() : @string);
}

And suddenly you can call it like this instead: !someStringThatMayBeNullOrEmpty.IsNullOrEmpty(). And yes, this works even if the string is null, since you’re not actually calling a method on the string object, but rather a static method with the string as a parameter!

I’m also very fond of a generic extension that lets me parse strings to enums with an optional fall-back value if the string doesn’t parse: enumString.ParseToEnum<MyEnum>(MyEnum.DefaultValue). I expect that if this keeps up, all logic will be in extension methods by the end of the year.
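For reference, a minimal sketch of what such an extension can look like (the name matches my usage above, but the exact signature and implementation here are just an illustration; Enum.TryParse requires .NET 4):

```csharp
using System;

public static class EnumExtensions
{
    //Sketch of a string-to-enum parser with a fall-back value.
    //Enum.TryParse handles null and unrecognized input by returning false,
    //so we never throw, we just fall back.
    public static T ParseToEnum<T>(this string value, T fallback) where T : struct
    {
        T result;
        return Enum.TryParse(value, true, out result) ? result : fallback;
    }
}
```

So "Friday".ParseToEnum&lt;DayOfWeek&gt;(DayOfWeek.Monday) gives DayOfWeek.Friday, while a string that doesn’t parse simply falls back to DayOfWeek.Monday.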

For some reason, my colleagues thought that I went a bit too far with this one:

public static bool ContainsSomething( this string str )
{
    return !str.IsNullOrEmpty();
}

But I think they’re just jealous.

Update: Fixed some bad formatting…

Testing EPiServer sure takes a lot of mocking…

A quick follow-up to my last post. Even with EPiAbstractions, testing takes a lot of mocking and setup code…

Testing this

public void CreateOrtkoppling( int verksamhetsId )
{
    var ortContainer = GetOrtContainer() ?? CreateOrtContainer();

    CreateVerksamhetPage( ortContainer, verksamhetsId );
}

public PageData GetOrtContainer()
{
    return (from p in _dataFactory.GetChildren( _page.PageLink )
            where p.PageName == OrtContainerPageName
            select p).SingleOrDefault();
}

private void CreateVerksamhetPage( PageData ortContainer, int verksamhetsId )
{
    var verksamhet = _verksamhetRepository.Get( verksamhetsId );
    if (verksamhet == null)
        throw new ArgumentException( "Det finns ingen verksamhet med id " + verksamhetsId );

    var page = _dataFactory.GetDefaultPageData( ortContainer.PageLink, OrtPageTypeId );
    page.PageName = string.Format( "{0}, {1}", verksamhet.Ort, verksamhet.Besöksadress );
    page.Property[VerksamhetsIdProperty].Value = verksamhetsId;
    _dataFactory.Save( page, EPiServer.DataAccess.SaveAction.Publish, EPiServer.Security.AccessLevel.NoAccess );
}

took me this much code:

private Mock<IDataFactoryFacade> dataFactoryMock;
private Mock<IVerksamhetRepository> verksamhetRepositoryMock;
private Mock<PagePropertyUtils> pagePropertyUtilsMock;
private Mock<IPageReferenceFacade> pageReferenceMock;
private PageData utbildningsPage;
private UtbildningPageHandler utbildning;

[SetUp]
public void Setup()
{
    int utbildningsPageId = 4711;
    utbildningsPage = CreatePageDataObject( "Testutbildning", utbildningsPageId );
    verksamhetRepositoryMock = new Mock<IVerksamhetRepository>();
    pagePropertyUtilsMock = new Mock<PagePropertyUtils>();
    pageReferenceMock = new Mock<IPageReferenceFacade>();
    dataFactoryMock = new Mock<IDataFactoryFacade>();
    dataFactoryMock.Setup( d => d.GetPage( It.Is<PageReference>( p => p.ID == utbildningsPageId ) ) )
        .Returns( utbildningsPage );
    utbildning = new UtbildningPageHandler( dataFactoryMock.Object, verksamhetRepositoryMock.Object,
        pagePropertyUtilsMock.Object, pageReferenceMock.Object, utbildningsPageId );
}

private int MockOrtContainerPage()
{
    var pageName = GetPrivateField<string>( utbildning, "OrtContainerPageName" );
    int containerPageId = 17;

    var containerPage = CreatePageDataObject( pageName, containerPageId );

    dataFactoryMock.Setup( d => d.GetChildren( It.Is<PageReference>( p => p.ID == utbildningsPage.PageLink.ID ) ) ).Returns(
        new PageDataCollection( new List<PageData> {
            containerPage
        } )
    );
    return containerPageId;
}

protected static PageData CreateDefaultPageDataObject()
{
    var page = new PageData();
    page.Property.Add( "PageName", new PropertyString() );
    var propertyPageReference = new PropertyPageReference( PageReference.EmptyReference );
    SetPrivateField<Guid>( propertyPageReference, "_pageGuid", Guid.NewGuid() );
    page.Property.Add( "PageLink", propertyPageReference );
    page.ACL = new EPiServer.Security.AccessControlList();
    return page;
}

protected static T GetPrivateField<T>( object o, string fieldName )
{
    Type type = o.GetType();
    BindingFlags privateBindings = BindingFlags.NonPublic | BindingFlags.Instance | BindingFlags.Static;
    FieldInfo match = type.GetField( fieldName, privateBindings );
    if (match != null)
    {
        return (T)match.GetValue( o );
    }
    return default(T);
}

[Test]
public void CreateOrtkoppling_Will_Create_Page_And_Save_VerksamhetsId_As_Property()
{
    var containerPageId = MockOrtContainerPage();
    var verksamhet = new Verksamhet { Id = 17, Besöksadress = "Gatan 1", Ort = "Staden", Postadress = "123 45" };
    verksamhetRepositoryMock.Setup( v => v.Get( verksamhet.Id ) ).Returns( verksamhet );
    var defaultPageData = CreateDefaultPageDataObject();
    var propertyName = GetPrivateField<string>( utbildning, "VerksamhetsIdProperty" );
    defaultPageData.Property.Add( propertyName, new PropertyNumber() );
    var ortPageTypeId = 19;
    pagePropertyUtilsMock.Setup( p => p.GetPropertyValue<int>( It.IsAny<PageReference>(), "UtbildningsOrtPageType" ) ).Returns( ortPageTypeId );
    dataFactoryMock.Setup( d => d.GetDefaultPageData( It.Is<PageReference>( p => p.ID == containerPageId ), ortPageTypeId ) )
        .Returns( defaultPageData );
    dataFactoryMock.Setup( d => d.Save( defaultPageData, EPiServer.DataAccess.SaveAction.Publish, EPiServer.Security.AccessLevel.NoAccess ) )
        .Callback<PageData, EPiServer.DataAccess.SaveAction, EPiServer.Security.AccessLevel>( ( pageToSave, saveAction, accessLevel ) =>
        {
            var id = (int)pageToSave.Property[propertyName].Value;
            Assert.That( id, Is.EqualTo( verksamhet.Id ) );
            Assert.That( pageToSave.PageName, Is.EqualTo( string.Format( "{0}, {1}", verksamhet.Ort, verksamhet.Besöksadress ) ) );
        } );

    utbildning.CreateOrtkoppling( verksamhet.Id );

    dataFactoryMock.VerifyAll();
    verksamhetRepositoryMock.VerifyAll();
}

Now, most of that code (everything except for the actual test) is reused in other tests, but still! It’s a bit exhausting!

Unit testing saving pages in EPiServer CMS 6

I’ve been working quite a lot with EPiServer in the last six years or so, and unit testing has always been a pain in the ass (not that I was writing unit tests anyway six years ago, I hardly knew what I was doing back then… can’t believe I actually got paid! :-)). And it still is, even in the latest version.

Luckily there are some EPiServer developers that are either smarter than me, or just have more free time, for example Joel Abrahamson who has created a wrapper project, EPiAbstractions, for wrapping the unmockable EPiServer classes in mockable classes and interfaces. Wonderful!

We started using this in our current project recently, and mostly it works fine. Instead of using EPiServer.DataFactory inside our business classes, we now inject IDataFactoryFacade from EPiAbstractions, and use that instead. Like this:

//Old way
public class Foo
{
    public PageData GetPageById(int id)
    {
        return EPiServer.DataFactory.Instance.GetPage(
            new PageReference(id));
    }
}

//New way
public class Foo
{
    private IDataFactoryFacade dataFactory;

    public Foo(IDataFactoryFacade dataFactory) {
        this.dataFactory = dataFactory;
    }

    public PageData GetPageById(int id)
    {
        return dataFactory.GetPage(new PageReference(id));
    }
}

Simple enough. It lets us mock the DataFactory with Moq, and we can use an IoC container like StructureMap to inject the DataFactory class, or we can create a constructor overload that uses DataFactoryFacade.Instance.
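That last option is just a second constructor chaining to the first; a sketch (DataFactoryFacade.Instance is the EPiAbstractions singleton mentioned above):

```csharp
public class Foo
{
    private IDataFactoryFacade dataFactory;

    //Production code can use the parameterless overload...
    public Foo() : this(DataFactoryFacade.Instance) { }

    //...while tests inject a mock through this one
    public Foo(IDataFactoryFacade dataFactory)
    {
        this.dataFactory = dataFactory;
    }
}
```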

The problem

So, with all these nice interfaces that we are able to mock, what is the problem? Well, I was trying to write a test for this method:

public void SetAnmälanTillåten(int verksamhetPageId, bool anmälanTillåten)
{
    var page = _dataFactory.GetPage(new PageReference(verksamhetPageId));
    if (page == null)
        return;

    var writeablePage = page.CreateWritableClone();
    writeablePage.Property["TillåtAnmälan"].Value = anmälanTillåten;
    _dataFactory.Save(writeablePage, EPiServer.DataAccess.SaveAction.Publish, EPiServer.Security.AccessLevel.NoAccess);
}

This is a pretty simple method, it gets a page from the DataFactory, sets a property, and saves it again. So we just need to mock GetPage and Save, right? I tried it, and I got this error:

The type initializer for 'EPiServer.Web.PermanentLinkMapStore' threw an exception.
System.TypeInitializationException
EPiServer.Web.PermanentPageLinkMap Find(EPiServer.Core.PageReference)
System.Guid get_GuidValue()
EPiServer.Core.PropertyData CreateWritableClone()
EPiServer.Core.PropertyDataCollection CreateWritableClone()
EPiServer.Core.PageData CreateWritableClone()

GetPage was mocked like this (somewhat simplified):

var page = new PageData();
page.Property.Add( "PageName", new PropertyString( "Name" ) );
page.Property.Add( "PageLink", new PropertyPageReference(
    new PageReference( 4711 ) ) );
page.Property.Add( "TillåtAnmälan", new PropertyBoolean() );
dataFactoryMock.Setup( d => d.GetPage(
    It.Is<PageReference>( p => p.ID == 4711 ) ) ).Returns( page );

So, what is the problem? Well, a look inside EPiServer.dll told me that when you call CreateWritableClone on the PageData object, it in turn calls CreateWritableClone on each property in the PropertyDataCollection. When it comes to the PageLink property, that means calling PropertyPageReference.CreateWritableClone:

PropertyPageReference.CreateWritableClone() implementation from EPiServer CMS 6

This doesn’t look so bad, does it? But the problem is in this.GuidValue:

PropertyPageReference.GuidValue implementation from EPiServer CMS 6

So, unless the field _pageGuid has a non-empty value, it will call PermanentLinkMapStore.Find, and we will get the error. Unfortunately, since the setter won’t let us manually set the value without calling Modified(), which will also make external calls, we can’t use it to set the value.

The solution

System.Reflection to the rescue! With a little helper method for setting the private field, I was able to get the test working:

private static void SetPrivateField<T>( object o, string fieldName, T newValue )
{
    Type type = o.GetType();
    BindingFlags privateBindings = BindingFlags.NonPublic | BindingFlags.Instance | BindingFlags.Static;
    FieldInfo field = type.GetField( fieldName, privateBindings );
    field.SetValue( o, newValue );
}

//Mocking code
var page = new PageData();
page.Property.Add( "PageName", new PropertyString( pageName ) );
var propertyPageReference = new PropertyPageReference( new PageReference( 4711 ) );
SetPrivateField<Guid>( propertyPageReference, "_pageGuid", Guid.NewGuid() );
page.Property.Add( "PageLink", propertyPageReference );

dataFactoryMock.Setup( d => d.GetPage( It.Is<PageReference>( p => p.ID == 4711 ) ) )
    .Returns( page );

And, yay, we have a working test! (Of course, the actual test also had to do a setup for Save, but I’ll leave that as an exercise for the reader.)

Finally, there may well be an easier way to do this. If so, I would be most interested to hear of it! Now, I’ll get back to writing unit tests for my project (we still have pretty bad code coverage…). I have a feeling I’ll have to revisit this topic in the future, as more interesting mocking scenarios appear, though.

PowerShell makes Windows Server AppFabric tolerable

I have been working quite a bit with Windows Server AppFabric lately, especially the WF instancing. AppFabric is kind of nice, but unfortunately the GUI inside IIS is mind-numbingly slow! Luckily, there is a better and faster way of querying and working with AppFabric: PowerShell.

It turns out that AppFabric comes with a whole bunch of PowerShell cmdlets that allow you to do just about anything you can do in the GUI (and lots of stuff you can’t do in the GUI). If you start Windows PowerShell Modules, and type in

> Get-Command -Module ApplicationServer

you will get a long list of available commands (or cmdlets).

A long list of available commands

I won’t go into details, but rather just give you a very useful example of what you can do. When developing workflows, you get a lot of broken workflow instances in your instance store. You often need to find these and delete them. You can do this in the GUI, of course, but it takes a long time. In PowerShell, you can use the commands Get-ASAppServiceInstance and Remove-ASAppServiceInstance. First, we get all the instances for a specific web site:

> Get-ASAppServiceInstance -Sitename "Default Web Site"

List of workflow instances

Now, if we just want to remove all persisted instances, we can do that by piping the results to Remove-ASAppServiceInstance:

> Get-ASAppServiceInstance -Sitename "Default Web Site" | Remove-ASAppServiceInstance

And just like that, all instances are gone! This is just a taste of what you can do, for more info on the commands, check out the documentation on MSDN. It’s still a bit painful to work with AppFabric and WF, but this at least makes it tolerable.

Resizing cross-domain Iframes

UPDATE 2013-03-20
This example is very obsolete, now that we have HTML5 and window.postMessage for sending messages between windows. Unless you are specifically targeting antique browsers like IE7, you should use that instead of this pile of garbage.

—————– WARNING: OBSOLETE JUNK FOLLOWS ————————-

Recently (last week, actually) I came across a problem that turned out to be quite a challenge for me. Possibly due to me lacking some javascript skillz. Anyway, I need to write it down for future reference, as I’m rather happy with the solution in the end.

We have an application that is shown in an iframe inside a SharePoint site. Initially it was deployed as an application within the SharePoint site on the same server, but for various reasons we are moving it to another server. Since it is an iframe, and we don’t want that to be visible, we need to resize the iframe whenever the content changes size. If the top window and the page in the iframe are on the same server (same domain, actually), this is not a problem. We just call a resize method in the top window on load of the iframe, or whenever we load something ajaxy, like this:

//In the top window
resizeIframe = function(height) {
    var theFrame = document.getElementById('theFrame');
    theFrame.height = height;
};

//In the frame page, called whenever something changes the content
resize = function() {
    height = document.body.clientHeight + 50;
    top.resizeIframe(height);
};

Piece of cake! When we move the application to another domain, however, this won’t work anymore. We are not allowed to call top.resizeIframe from the frame page anymore, since that is considered cross-site scripting. What to do, what to do… Well, in this excellent article, Michael Mahemoff lays out a few options for handling this. The one that seemed to have the most appeal is what he calls “the Marathon version”, which leverages the fact that you can open another iframe inside your iframe, loading a page on the server containing the top page. This new page is allowed to call functions and access the DOM in the top page, and by passing parameters to this page you can set the height. This is basically how it works:

Diagram of the iframe-in-iframe resize flow

Of course, the new iframe is invisible, and is also removed after a few seconds. So, my application now contains this instead:

runXDomainScript = function(params) {
    var iframe = document.createElement("iframe");
    iframe.src = 'http://portal.example.com/xss.html?' + params;
    document.getElementById('xdomaincontainer').appendChild(iframe);
    setTimeout("document.getElementById('xdomaincontainer').innerHTML = ''", 2000);
};

resize = function() {
    height = document.body.clientHeight + 50;
    runXDomainScript('height=' + height);
};

And portal.example.com/xss.html is a really simple file that looks like this:

<html>
<head>
    <meta http-equiv="cache-control" content="public">
    <script type="text/javascript">
        var height = getQuerystring('height');
        top.setFrameHeight(height);
    </script>
</head>
</html>

And yeah, getQuerystring is a utility method to read a parameter from the querystring which I will not clutter the code with here. It’s in the example files at the end of this already very long post. Now, when I want to resize my iframe, I just create a new iframe with the .src set to xss.html, with the height as a querystring parameter. And since xss.html is located on the same server as the top window, it is allowed to change the height!

Great! Only, there was another problem. In our application, we are using modal dialogs (taking advantage of the excellent nyroModal library). Now, these dialogs are automatically positioned in the center of the window, which works fine if you’re not in an iframe. So what we need to do is find out how far the user has scrolled the top window, and then move the modal dialog (let’s just call it the div) to the visible portion of the screen. So, we try a function like this:

//In the framed application
moveModal = function(id) {
    var offsetHeight = top.window.pageYOffset || top.document.documentElement.scrollTop || top.document.body.scrollTop;
    var topPos = Math.max(offsetHeight - 100, 20);
    document.getElementById(id).style.top = topPos + 'px';
};

Now, as you might guess, this won’t work. Why? Because we are not allowed to read the pageYOffset property from top.window, because that would be cross-site scripting (nor documentElement.scrollTop or document.body.scrollTop; those are only there to make this cross-browser, which I hate). So now we have a double problem. We need to

  1. Call a function whenever a modal dialog is shown.
  2. Read a property from the top window
  3. Do some calculations
  4. Set a property in the framed page

1 can only be done in the framed application, 2 can only be done in the top window domain (portal.example.com), 3 can be done anywhere and 4 can only be done in the framed application. How do we solve this? Why, the same way as we solved the resizing, only in more steps. Enter the xsscallback page! Here is how it will work:

Diagram of the cross-domain callback flow

Confusing? A little. But this is the same as earlier, only once we’re done with calculating the offset in xss.html on the portal site, instead of just quitting, we open another iframe, back to the framed site, and open another static html file there, sending the result of the calculation back in the querystring. This html file, xsscallback.html, can then call the function in the original framed page to actually move the div. Code:

//In the framed page
centerModal = function(id) {
    runXDomainScript('cmd=offset&id=' + id);
};

moveModalTo = function(id, offsetHeight) {
    var topPos = Math.max(offsetHeight - 100, 20);
    document.getElementById(id).style.top = topPos + 'px';
};
//In xss.html in the portal site
callback = function(params) {
    var url = decodeURI(getQuerystring('url'));
    var iframe = document.createElement("iframe");
    iframe.src = url + '?' + params;
    document.getElementById('xDomainScriptContainer').appendChild(iframe);
};

if (cmd == 'offset') {
    var offsetHeight = top.window.pageYOffset || top.document.documentElement.scrollTop || top.document.body.scrollTop;
    var id = getQuerystring('id');
    callback('cmd=offset&h=' + offsetHeight + '&id=' + id);
}

And finally, the xsscallback.html file:

<html>
<head>
    <meta http-equiv="cache-control" content="public">
    <script type="text/javascript">
        var cmd = getQuerystring('cmd');
        if (cmd == 'offset') {
            var h = getQuerystring('h');
            var id = getQuerystring('id');
            parent.parent.moveModalTo(id, parseInt(h, 10));
        }
    </script>
</head>
</html>

And voilà – the div gets moved. In order to make this work flawlessly, you should also make sure that the xss html files are cacheable and served as quickly as possible. But from my experience, this seems to work fine even on slow machines and slow networks. You could also increase the time until the iframes get removed.

Since I have omitted quite a few details in this post, I’ll end by giving you some working code. These are meant to be deployed under different domains in your web server, for example portal.local and app.local. They are not exactly like my examples here, but close enough.

Get the examples

By the way, I’ll leave it as an exercise for the reader to figure out why I couldn’t simply use position: fixed for the modal dialog.

Debugging Silverlight needs IE?

I’ve decided to learn at least a little Silverlight, since it seems to be pretty cool, and I feel the need to learn something new (yeah, yeah, I’ll learn Erlang or something equally hardcore next). Since I basically don’t know any Silverlight at all, I thought I’d start by following a tutorial of some kind. And since the Silverlight Training Course for Silverlight 4 on Channel 9 was announced pretty much the same day, I chose that one.

In Lab 3, I had some difficulties (probably due to my inability to press the correct keys on the keyboard), and felt that I needed to set a few breakpoints and debug. So I did, and pressed F5. But unfortunately, my breakpoints were not hit. Instead I got the always equally entertaining message

The breakpoint will not currently be hit. No symbols have been loaded for this document.

Ok, no biggie, we’ve all seen this before. Remove all bin and obj folders, clean, rebuild, shift-F5 to force reload in the browser – but no, still nothing.

However, one thing struck me. I had noticed that my default browser, Firefox, was bad at loading the latest version of the Silverlight app when I started debugging. So I just pasted the URL into Internet Explorer instead and, if necessary, pressed Shift-F5 to force a reload. But now I tried to actually set IE as the default browser (which is insane, I know), and lo and behold - debugging works!

So I draw the conclusion that you need to have IE as the default browser to get Silverlight debugging to work. If so, it should probably be pointed out somewhere, since pretty much every web developer has Firefox or Chrome as the default browser (can’t live without Firebug)…

A little more experimentation has led me to believe that IE does not need to be the default browser of the system, it just needs to be the default browser for the debug start page. If I right-clicked on the start page in the Web project and clicked “Browse with…”, and set IE as default, debugging works. If I set Firefox or Chrome as default, debugging doesn’t work, even if I open the page manually in IE.

I have yet to find out if this is by design or by accident…

Getting an ASP.NET 4 application to work on IIS6

Now, for the last 5 months or so, I’ve been involved in a project where we develop an application with ASP.NET MVC on .NET 4 (we started on the beta 2). The main part of this application is hosted on servers running Windows 2008 Server R2 and IIS 7.5. But unfortunately, one part of the application has to be deployed to servers running Windows 2003 Server, and thus IIS 6.0.

Turns out, it’s a bit tricky to get .NET 4 working on IIS 6.0.

I did the usual stuff, I installed .NET 4 Framework, restarted the server, created a new application in IIS, running in its own Application Pool, and changed the ASP.NET version to 4. Then I tried to access my application. I was greeted by this:

The page cannot be found - WTF??

Strange. I’m pretty sure this is not what should happen. Now, since I’m running ASP.NET MVC, and have to do some mapping stuff on IIS6 (described for example by Phil Haack back in 2008), I naturally suspected this to be the problem. So I created a new application, and just added a default.aspx that prints the current date. Exactly the same result. Weird.

After some extensive googling (Yes, googling. Not Binging.) I got the tip to look at my IIS logfile to see what the error was, and I found this row:

2010-04-13 14:27:05 W3SVC674436196 127.0.0.1 GET / - 80 - 127.0.0.1 Mozilla/4.0+(*snip*) 404 2 1260

So, the error is 404.2. A look at the IIS status codes told me what this means: 404.2 – Lockdown policy prevents this request. WTF? Well, it turns out that just because you install .NET 4 on Windows 2003 Server, you’re not automatically allowed to use it in IIS! The article provided a few useful links to the Knowledge Base:

  • 328505 How to list Web Server Extensions and Extension files in IIS 6.0
  • 328360 How to enable and disable ISAPI extensions and CGI applications in IIS 6.0

Following the instructions from the first of these, I checked what extensions were available on my server:

Picture of ASP.NET 4 not being allowed.

Notice that the Status for the .NET 4 ASP.NET ISAPI extension is 0? Well, that means that it is disabled. Now, if you read the other knowledge base article, it will tell you how to enable it. Just run the following command:

cscript iisext.vbs /EnFile C:\WINDOWS\Microsoft.NET\Framework\v4.0.30319\aspnet_isapi.dll

And if I now run the /ListFile command again, I will see that the ASP.NET 4 extension is now showing a little “1” as status, telling me that it is now enabled.

Yay, ASP.NET 4 is enabled!

And after this, my application worked fine. And not only the test application that printed today’s date, but even my ASP.NET MVC application!

Do NOT forget to close() your WCF ServiceClients!

We had a very disturbing problem today. Our client was doing acceptance testing on the application we’re building for them, and had assembled a team of 15 people to do it. The application ran smoothly, until after a few minutes it came to a grinding halt and simply stopped responding. We checked the server, and the CPU was fine for a while; after a minute or so it went up to 100%.

The application is an ASP.NET MVC web client, talking to WCF services, which are talking to an SQL Server. Everything running on .NET 4 RC. Well, we recycled the application pool, and everything was fine again, for a few minutes…

Some profiling on the server told us that not only did the application freeze after a few minutes with the CPU at 100%, but we also saw a shitload of exceptions. Well, to cut to the chase, some more testing revealed that the only page that was still responsive was the only page that didn’t talk to the WCF service at all. Aha!

In our application, we use the StructureMapControllerFactory from MvcContrib to let StructureMap inject dependencies into our controllers. And our controllers are depending on repositories, that are in turn depending on ServiceClients for the WCF services. So basically, we have a repository like this:

public class FooRepository : IFooRepository
{
    private ServiceClients.IFooService client;

    public FooRepository(ServiceClients.IFooService client)
    {
        this.client = client;
    }

    public Bar GetBar(int id)
    {
        return client.GetBar(id);
    }
}

And a StructureMap configuration like this:

ForRequestedType<ServiceClients.IFooService>()
    .TheDefault.Is.ConstructedBy(() => new ServiceClients.FooServiceClient());

See the problem? Yep, we never close the client. And I can even remember thinking to myself, back in December when we wrote this: “I wonder if the fact that we don’t close the clients will cause a problem… Well, we’ll cross that bridge when we get to it.”

Today we got to the bridge. And it was under enemy fire. So why didn’t we notice this earlier? This is my guess. We have a finite number of available connections to the WCF service. Let’s say 200. And every time we make a request, we create a new client, and use one of these. But, after a while, usually 1 minute, the connection times out and closes. So we need to make enough requests to use all the available connections before they time out. And what happens then? The requests end up in a queue waiting for a connection, and the application stops responding.

So what is the solution? In the WCF textbook, you are supposed to use a service client like this (and some try and catch and other things, of course):

public Bar GetBar(int id) {
    var client = new ServiceClients.FooServiceClient();
    var bar = client.GetBar(id);
    client.Close();
    return bar;
}
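The “some try and catch and other things” part actually matters, by the way: Close() itself throws if the channel is faulted, so the usual pattern is to Close() on success and Abort() on failure. A sketch, using the same hypothetical FooServiceClient:

```csharp
public Bar GetBar(int id)
{
    var client = new ServiceClients.FooServiceClient();
    try
    {
        var bar = client.GetBar(id);
        client.Close();
        return bar;
    }
    catch
    {
        //Close() would throw again on a faulted channel, so abort instead
        client.Abort();
        throw;
    }
}
```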

But that doesn’t work too well for us, because we want to inject the service client into the constructor, in order to be able to mock it away when testing. So how do we ensure that the service client gets closed? Why, the destructor, of course! I added a destructor to each of my repositories, closing the client:

~FooRepository()
{
    var clientObject = client as ServiceClients.FooServiceClient;
    if (clientObject != null && clientObject.State == System.ServiceModel.CommunicationState.Opened)
    {
        clientObject.Close();
    }
}

First, we need to cast the client to the underlying class, because the interface doesn’t have the State property or the Close method. Then we check that it is not already closed. And then we close it.

I was actually not sure that this would work, because I wasn’t sure that the repositories would be garbage collected in time. But they were, and it did. So now we’re happy again!

And why did the CPU go up to 100%? Well, when the requests started timing out, we started to get lots of exceptions, which our logger couldn’t handle. We’ll check on that tomorrow. :-)

Keeping things in sync part 2 – Dropbox and Junction

This is kind of a follow-up to my ancient post from November 2008, Keeping things in sync. I still use more than one computer, three to be precise: a Thinkpad X301 at work, a workstation of sorts at home, and a netbook in front of the TV. And I want the switch between them to go as smoothly as possible. So, just as I did in 2008, I use Dropbox to store all my documents. Great!

But, as it turns out, not all applications let you choose where to store their data. About 15 minutes ago, I sat myself down in front of my computer at home to start writing a blog post, a tutorial on Windows Workflow Foundation 4. I started Windows Live Writer (which is a great app for writing blog posts), and suddenly recalled that I had already started on that post, but on my laptop.

Live Writer stores its data (posts and drafts) in a folder called “My Weblog Posts” in the user’s “Documents” folder. That is not configurable. But I really would like to keep it in my Dropbox instead. If only there was a way…

Wait, there is! Junction to the rescue! As it turns out, Windows (or rather NTFS) has supported directory symbolic links, or symlinks, since Windows 2000. A symbolic link is an alias for a directory in a different location, and to applications there is no difference between a symbolic link and the actual directory. Unfortunately, there is no built-in tool for creating or managing these in Windows. There is, however, a free downloadable tool, called Junction.

So, here is what I did:

  1. I created a folder in My Dropbox called “Apps”, and under that I created a folder called “My Weblog Posts”.
  2. I moved all the content from “Documents\My Weblog Posts” to “My Dropbox\Apps\My Weblog Posts”.
  3. I deleted “Documents\My Weblog Posts”.
  4. I opened a command window, and executed the following command
> junction.exe "C:\Users\Johan\Documents\My Weblog Posts" "C:\Users\Johan\Documents\My Dropbox\Apps\My Weblog Posts"

And voilà, I now have a symlink in my Documents folder, pointing to the folder in My Dropbox. Rinse, and repeat this on my laptop, and suddenly my drafts are available on both!

This is, of course, not only usable for Windows Live Writer, but for all applications that keep their data files in some unconfigurable folder somewhere that you would like to have available on multiple computers.

Hmmm, maybe I should get back to writing that WF4 tutorial now…