
Thursday, December 15, 2011

I am adding the module now to the shell. I'm looking at the UI Composition QuickStart in the guidance to get an idea of how to start. I want to have a main region and within that a tab region that I can add a tab to. I started by adding the TabControl and a TabItem template to the shell. Had to work out the fact that I didn't need the Silverlight adapter and other related markup.

Added the module project as a simple class library and added the Views directory to it. Added the UserManagementModule class as a normal class. It implements IModule and its constructor takes the UnityContainer and RegionManager interfaces. In the Initialize() method we register the UserManagement view with the tab region on the shell. So... we need to add the UserManagement view. We add that to the Views directory as a UserControl. I also just added a TextBlock in the view to indicate that it is loaded. I had to explicitly override ConfigureModuleCatalog in the Bootstrapper and add the UserManagementModule to the catalog for now. Like I said before, I want to use the directory discovery method to get the modules, but we are taking baby steps here.

So when I run this now an odd thing happens: I get a shell with two tabs. The first has no content. I think it has to do with the fact that I am doing view injection rather than the view discovery of the QuickStart for the main region, and I am getting the two methods crossed. The second tab has my content. It doesn't have the name on the tab, which is supposed to be bound to the view's ViewName property. Took me a few minutes to realize that I need to put that in the ViewModel object for the view. (Duh! Isn't that how all this is supposed to work in the first place?)

So next I will be adding the ViewModel for the UserManagement view, adding the ViewName property to it, and seeing if it gets displayed in my tab. The UserManagementModel gets added as a plain class that implements INotifyPropertyChanged, so it has a PropertyChanged event. The model is injected in the constructor of the view. Then we set the DataContext of the view to the model. This is a change from the MVP architecture we have used in the past, where an interface to the view was injected into the constructor of a presenter and the view basically had no knowledge of its presenter. With MVVM the view will actively subscribe to events from the model and bind to its collections and commands. I'm wondering at this point if I should create an interface for the ViewModel, but I don't see why at the moment, so I'm leaving it as is.

It turns out the WPF TabControl is done differently than the Silverlight example in the QuickStart, and that is what was giving me the two tabs and not correctly showing the view name on the tab as I wanted. I had to bind to Content.DataContext.ViewName rather than just ViewName as described in the QuickStart, or DataContext.ViewName as some examples on the internets showed.

Now I need to activate the tab, hook up some data service and bind users to a list control, add a second tab, and probably localize the region names.
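For my own reference, here is roughly what the module, view and view model wiring looks like at this point. The Prism 4/Unity namespaces are real, but the region name and a few details are from memory, so treat this as a sketch rather than the actual project code:

using System.ComponentModel;
using System.Windows.Controls;
using Microsoft.Practices.Prism.Modularity;
using Microsoft.Practices.Prism.Regions;
using Microsoft.Practices.Unity;

public class UserManagementModule : IModule
{
    private readonly IUnityContainer container;
    private readonly IRegionManager regionManager;

    public UserManagementModule(IUnityContainer container, IRegionManager regionManager)
    {
        this.container = container;
        this.regionManager = regionManager;
    }

    public void Initialize()
    {
        // View injection: resolve the view and add it directly to the tab region on the shell.
        var view = container.Resolve<UserManagementView>();
        regionManager.Regions["TabRegion"].Add(view);
    }
}

public class UserManagementModel : INotifyPropertyChanged
{
    public event PropertyChangedEventHandler PropertyChanged;

    // Bound from the TabItem header template via Content.DataContext.ViewName.
    public string ViewName
    {
        get { return "User Management"; }
    }

    protected void OnPropertyChanged(string propertyName)
    {
        var handler = PropertyChanged;
        if (handler != null)
        {
            handler(this, new PropertyChangedEventArgs(propertyName));
        }
    }
}

public partial class UserManagementView : UserControl
{
    public UserManagementView(UserManagementModel model)
    {
        InitializeComponent(); // supplied by the XAML partial

        // The model is injected and becomes the DataContext for binding.
        DataContext = model;
    }
}

// And in the Bootstrapper, the module is registered explicitly for now:
// protected override void ConfigureModuleCatalog()
// {
//     base.ConfigureModuleCatalog();
//     ((ModuleCatalog)ModuleCatalog).AddModule(typeof(UserManagementModule));
// }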

Wednesday, December 14, 2011

Prism, WPF and My First Module

To start off with, I would like to create my shell, bootstrapper and first module. I'm choosing the User Management interface of our current project as that initial module, though to begin with all I plan on doing is loading a tab control into the main region. My current plan is to create a shell with a main menu and a main content region. One of the menu items will be Administration and one of the sub-menus under that will be User Management. When the user selects User Management the tab control with the title User Management should load in the main content region.

At first the menu content will just be static, but it would be nice if I could dynamically add the menu item at runtime. I'm not sure that I can specify that the User Management module should load OnDemand rather than OnStartup and still be able to populate the menu. That is one of the things to figure out. I am going to try populating the module catalog with directory discovery. Another option I think we COULD possibly use is a ConfigurationModuleCatalog. In either case we just need to make sure we are doing things securely so that no one can drop in a rogue assembly that could be discovered and loaded. Also I would like to be able to "unload" the tab control and not just hide it, so that the control is not using overhead memory. That is another thing to experiment with.
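For what it's worth, here is a minimal sketch of what the directory-discovery override might look like in the bootstrapper. I haven't tried this yet, and the Modules folder name is just an assumption on my part:

// In the Bootstrapper (types are in Microsoft.Practices.Prism.Modularity):
protected override IModuleCatalog CreateModuleCatalog()
{
    // Scan a folder next to the executable for assemblies containing IModule implementations.
    return new DirectoryModuleCatalog { ModulePath = @".\Modules" };
}

// OnDemand loading, if it plays nicely with the menu idea, would be requested on the module itself:
// [Module(ModuleName = "UserManagementModule", OnDemand = true)]
// public class UserManagementModule : IModule { ... }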

Then the user management view is probably going to be composed of two related views: a Roles control and a Users control. Each of those controls will have a list of its respective model data as well as a detail view for selected items. So my next task will be to wire up the model to correctly display the system's roles and users, as well as the command(s) for selecting an object from the list and showing its properties in the details view.
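What I have in mind for the Users side is roughly this shape. The class, the User type and the property names here are placeholders of mine; DelegateCommand is from the Prism commands namespace:

using System.Collections.ObjectModel;
using System.ComponentModel;
using Microsoft.Practices.Prism.Commands;

// Placeholder domain type, just for the sketch.
public class User
{
    public string UserName { get; set; }
    public bool IsDisabled { get; set; }
}

public class UsersModel : INotifyPropertyChanged
{
    private User selectedUser;

    public UsersModel()
    {
        Users = new ObservableCollection<User>();

        // Selecting a user in the list populates the detail view through SelectedUser.
        SelectUserCommand = new DelegateCommand<User>(user => SelectedUser = user);
    }

    public event PropertyChangedEventHandler PropertyChanged;

    public ObservableCollection<User> Users { get; private set; }

    public DelegateCommand<User> SelectUserCommand { get; private set; }

    public User SelectedUser
    {
        get { return selectedUser; }
        set
        {
            selectedUser = value;
            OnPropertyChanged("SelectedUser");
        }
    }

    private void OnPropertyChanged(string propertyName)
    {
        var handler = PropertyChanged;
        if (handler != null)
        {
            handler(this, new PropertyChangedEventArgs(propertyName));
        }
    }
}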

In the Users control there will be commands for adding, editing and disabling users. (There is no deleting of created users.) This will give me a chance to do error handling and business rules regarding the users (like password strength and that the default Admin user cannot be disabled).

For each of the controls there are a couple view-related features like the refresh and filter commands. They may be interesting to experiment with in the MVVM framework, but aren't terribly important.

Finally, there is the relatively simple task of adding and removing users from roles, which will give me the opportunity to see how my two controls will communicate with each other.

So on to creating my application and creating the shell and bootstrapper.

Following the walk-throughs for Prism Guidance:

In Visual Studio 2010: New Project > Windows > WPF Application. I'm naming it AdjudicationPoC.

Change MainWindow.xaml to Shell.xaml. VS changes the code-behind name to Shell.xaml.cs, but I need to change the actual class name and constructor, and also go into App.xaml and remove the StartupUri attribute. The Bootstrapper will handle what starts, and in fact we can go into the App.xaml.cs code-behind now and override the OnStartup method to create and run our Bootstrapper. Of course, now VS is going to bitch that we don't have a Bootstrapper class, so let's add that to the project now too.
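The App.xaml.cs change amounts to something like this (assuming the bootstrapper class is simply named Bootstrapper):

using System.Windows;

public partial class App : Application
{
    protected override void OnStartup(StartupEventArgs e)
    {
        base.OnStartup(e);

        // The bootstrapper takes over from StartupUri: it creates, initializes and shows the shell.
        var bootstrapper = new Bootstrapper();
        bootstrapper.Run();
    }
}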

Now I'm going to make the executive decision to use Unity as our IoC container, so our Bootstrapper is going to inherit from UnityBootstrapper, and to do this we need to include the Microsoft Prism Libraries. Again following the walk-through in the Guidance I created a Libraries folder in my solution folder. I copied in the Prism libraries (I'm just leaving out the MEF libraries, since, at least for now, I won't be using them), and added the references to my project. Our development team usually keeps our 3rd party libraries outside the solution, so that will probably change, but good for now.

After adding those references I just need to override the CreateShell and InitializeShell methods of the UnityBootstrapper to create an instance of the Shell, set it as the MainWindow of our App and show it. I'm sort of not crazy about the reference to the App here, since it would be nice to be able to unit test the bootstrapper (and, more importantly, automate that test) without it. It isn't doing much at the moment, plus I may see what MSTest can do for me in order to really test it. That's for tomorrow.
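For the record, here is roughly what those overrides look like, following the walk-through (the App reference I'm grumbling about is the App.Current bit in InitializeShell):

using System.Windows;
using Microsoft.Practices.Prism.UnityExtensions;
using Microsoft.Practices.Unity;

public class Bootstrapper : UnityBootstrapper
{
    protected override DependencyObject CreateShell()
    {
        // Let the container build the shell so its dependencies can be injected.
        return Container.Resolve<Shell>();
    }

    protected override void InitializeShell()
    {
        base.InitializeShell();

        // This is the App reference I'd rather not have when unit testing the bootstrapper.
        App.Current.MainWindow = (Window)Shell;
        App.Current.MainWindow.Show();
    }
}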

Wednesday, December 7, 2011

Playing with Prism 4

We're at the end of this project I've been working on for some time now. As per usual, we kind of got into this fire drill situation of trying to get everything all wrapped up at the end. So things like trying to implement acceptance testing got pushed aside in favor of squashing all these bugs that popped up precisely because we didn't implement acceptance testing in the first place. That is all for another series of blog posts altogether I think.

Anyway, also as per usual we got to the end of this project and came to the realization that the architecture of the project was deficient to begin with. I don't mean to say that this just dawned on us. I mean I think we knew before that the architecture was deficient. I mean "realization" in the sense that the problem "manifested" (perhaps a better word) itself at the end. The core components of the application are a WPF client that interfaces with a web service as well as a windows service which handles data loading. The web service can interface with the windows service to report back to the client when loading occurs. Both services are WCF services, and we could probably improve on their architecture to an extent, but that was a bullet we bit about halfway through the project. At that time we made some significant changes to the service side, so while there are definitely some changes we could make, the architecture is ok.

On the client side though we ran into issues. The client-side architecture is a bastardization of one we used on a previous project. That project used the Composite Application Library and an MVP architecture. It was a big, complex application. We thought we could scale that down and cut out some things we wouldn't really need. It turned into sort of an ugly bastard child in every sense of the word.

So, after the fact (which is totally the way I LOVE to do things), I am looking at Prism 4, seeing how it improves on what we did with CAL and figuring out how we can use it for the second release of this project, which we'll probably start after the Holidays. So what follows in the next few episodes are my rambling thoughts as I experiment with Prism and WPF and reassemble pieces of the existing client into something that I don't want to throw out a window.

Wednesday, November 9, 2011

Integrating Fitnesse with CCNet

Just have to laugh because the CURRENT CruiseControl.Net documentation on integrating Fitnesse and CCNet says, "Use the TestRunner which comes as part of the standard fitnesse distribution to run all the fitnesse tests and generate the results. You will need to use the task," and dates from 2006. This is somewhat akin to teaching your grandmother how to send email by saying, "Use Outlook."

Anyway, I have a test running automatically with the build and passing. The problem is with my report. I want to have a link from the CCNet dashboard for the project build to a nice, neat fitnesse report. CCNet 1.6 has something in the packages called Fitnesse Results package, but there is, of course, no documentation on using it.

Monday, October 31, 2011

CruiseControl.Net, or Is There A Limit to my Patience?

Continuing with my trials and tribulations in regards to CCNet

Building my projects: I needed to update the svn settings file we have with the path to the svn executable. Then, since we are using Rodemeyer.MsBuildToCCNet.dll for the MSBuild tasks, the server needed that library, so I had to copy it to the CCNet server directory.

I ended up just installing Visual Studio 2010 on the build server, though I wanted to avoid it. We use utilities like resgen.exe, and our unit tests use MSTest. I'm aware I can jump through hoops to install these separately from Visual Studio, but it ended up being a lot easier to just install VS.

I am setting up CCNet to run an NCover task to run our unit tests in MSTest (apparently I was on drugs when I was thinking earlier that our tests were done in NUnit; I'm sort of wishing now that we had stuck with NUnit rather than using MSTest). The path to MSTest is in its regular place after installing Visual Studio: C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\MSTest.exe. NCover is at its default installation place: C:\Program Files (x86)\NCover\ncover.console.exe. Hopefully this is as simple as putting the NCover profiler task into my CCNet project definition (http://www.cruisecontrolnet.org/projects/ccnet/wiki/NCover_Profiler_Task).

<ncoverProfile>
  <executable>C:\Program Files (x86)\NCover\NCover.Console.exe</executable>
  <program>C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\MSTest.exe</program>
  <testProject>myproject.test.dll</testProject>
  <workingDir>build\unittests</workingDir>
  <includedAssemblies>myproject.*.dll</includedAssemblies>
</ncoverProfile>

Yeah... definitely not that easy. First off, the ncoverProfile tag has a bug and doesn't handle spaces in the paths, at least in the 3.3 version of NCover I was using to begin with. I got around that by using the DOS short names in the paths. CCNet just passes the testProject value as the first argument to MSTest, which also causes a problem: you get the error from MSTest, 'Invalid switch "c"'. The assembly name in the testProject node needs to be prefixed with the /testcontainer switch. Then I had set the includedAssemblies to *.dll, since I figured that would include all the dlls in my working directory. I got an error from NCover that the includedAssemblies included an invalid wildcard '*'. Really?!? Ok, so I just left the includedAssemblies value blank, figuring NCover would be smart enough to include the dlls in the working directory, and it seems that was close enough anyway, because I finally got the MSTest command to at least run my unit tests. My task at this point looked like this:

<ncoverProfile>
  <executable>C:\PROGRA~2\NCover\NCover.Console.exe</executable>
  <program>C:\PROGRA~2\MICROS~2.0\Common7\IDE\MSTest.exe</program>
  <testProject>/testcontainer:MembershipTests.dll</testProject>
  <workingDir>build\unittests</workingDir>
  <includedAssemblies></includedAssemblies>
</ncoverProfile>

MSTest was running, but NCover was erroring out, saying that there was no coverage data: "NCover.Console is returning exit code #20000". Supposedly this can happen when you use 64-bit NCover to cover a 32-bit assembly or vice-versa. My assembly was built for AnyCPU, so I thought I should be ok. I tried running the tests through NCoverExplorer and was getting a similar error, and since only the 32-bit version of NCover was installed on this build machine (that's the installer we had locally available), I figured it wouldn't hurt to install the 64-bit version and see if that was indeed the problem.

Once I had the 64-bit version installed, my tests ran fine through NCoverExplorer even though I was still referencing the 32-bit installation of NCover on the machine! So, I was like... ok, let's try this out through CCNet now. After working through an issue with an old NCover task being in the MSBuild project (it suddenly reared its head because it HAD been looking for NCover in C:/Program Files/NCover instead of C:/Program Files (x86)/NCover and was simply failing to even load until I installed the 64-bit version on the new machine *sheepish*), I am getting 42 passing tests and 1 failure, with 72% coverage. Not horrible considering it has been a while since we've gotten these tests to run at all. Just gotta go fix that one test, get my build to go green (it fails automatically on the failed unit test) and see if I can't get the coverage up at least to our 85% target. Then the more daunting task: integrating this into the other Visual Studio projects that comprise the system. While the Membership providers were written in a TDD manner, most of the rest of the system was not. Last I looked, our unit test coverage was somewhere around 15%. Ugh. Finally, our real goal here is to get some decent coverage with our acceptance testing. I still have to get Fitnesse running with CCNet on the new machine.

I frankly am wondering what sort of testing ThoughtWorks does before releasing some of their updates. I have to guess they aren't doing much testing on a Windows environment with Microsoft tools. I have to agree that it seems more and more that CCNet and MSBuild with MSTest simply don't mix well and that, though I have gotten this far, Continuous Integration shouldn't be this hard.

Thursday, October 27, 2011

Getting CruiseControl.net Running

I have CCNet installed on the build server and a couple packages installed as well, the same ones I added to Jenkins, essentially: NCover, NUnit, MSBuild, and Fitnesse reporting along with StyleCop reporting. I need to make a note to myself to look for that with Jenkins. Alas, there is no ChuckNorris package built-in with CCNet. I restarted the web server but still there is no evidence that those packages are installed other than they say "[Installed]" in the package list. I don't have any build projects created, yet, though. I like that I can administer packages for CCNet like this now. Not thrilled that I need to restart the web server through IIS Mgr to see the configuration changes in general. Not a huge deal though.

I need to install Subversion on here first of all, and then I can check out the ccnet and msbuild scripts from our version control. I'll try this installer for CollabNet client 1.6.6.4 to install to "~Programs\CollabNet\Subversion Client". I tried 'svn' on the command line to confirm and it told me to "Type 'svn help' for usage.", so that's good.

Now I need to create the directory structure to check out my code. Also, this is going to go easier if I have TortoiseSVN installed as well, so I add that and reboot.
When I get back on there is a CollabNet settings manager in my start bar. Let's lookee and see what we can configure. Ooo! Turn OFF the automatic updates. We fear change, particularly on servers that we barely otherwise pay attention to except when they tell us our client application is outdated and all work must halt until they are brought back up to speed.

My TortoiseSvn commands aren't showing up in the context menu. Maybe the version I have doesn't work in Windows Server 2008. I check the MSBuild scripts and CCNet scripts out manually with svn on the command line.

Now to change the ccnet config files to start building my projects.
Issue: the CCNet server starts and then stops, apparently because of the notification attribute in the email group node of the email settings file we have. "notification" has been changed to "notifications". The error in the event log was: 'ERROR CruiseControl.NET [(null)] - Exception: Unused node detected: notification="always"' for those who are interested.

Ok! CCNet is running now. I have a couple of our basic projects that are almost set to build. I need to go modify the build scripts for them to reflect the directory structure on this new server and we should be set to go.

*Grumble* CCNet

I have my new build server. So in the interest of time (or so I thought), I am installing CCNet on the new server. I'm even taking advantage of the opportunity to upgrade to 1.6. We were running 1.3. But really, CCNet, I need to give permissions to the AppPool user to update the files that you just installed? So far this is not going smoothly. I'm coming in this weekend to install Jenkins here too and have everything built and tested by Jenkins come Monday. (♪♫ It'll be all right. ♪♫)

Wednesday, October 26, 2011

MSBuild and Jenkins

The available build steps that Jenkins has out of the box are Execute Windows batch command, Execute shell, Invoke Ant, and Invoke top-level Maven targets. I am pretty sure that I've used Jenkins to build using MSBuild, back when I was playing around with it last spring, so either I need a plug-in or I need to do some Googling to see how others have done it by invoking a batch command.

So I go to Manage Jenkins > Manage Plug-ins and begin to peruse... Of course this can be very distracting when you begin to find all the incredible plug-ins readily available for Jenkins. (A lava lamp notifier!?! Awesome!) I find the MSBuild plug-in and also select the Fitnesse, NUnit and NCover plug-ins since I am pretty sure I'll be using those as well. I congratulate myself on my self-control for not also getting the Twitter notifier or Google Calendar Plug-in. I DID get the Chuck Norris plug-in, but only because Lisa Crispin told us we had to. I have to manually restart Jenkins again when the installations are complete.

I then go back to my initial build project in the dashboard and click 'Configure'. Now I have the option to create a build task of type 'Build a Visual Studio project or solution using MSBuild'. It asks for an MSBuild version, but my only option is 'Default'. It also asks for the name of the MSBuild file - I input the name of the .sln solution file - and for any command line arguments - I leave those blank for now. As a post-build action I, of course, select 'Activate Chuck Norris'.
I save that, go back to my dashboard and schedule a build. Again I need to refresh to update the status, which, to my dismay, is failed. When I click on the little number that shows that build #2 failed I am taken to a page where I can choose to see more details. Also Chuck Norris boasts that no statement can catch his exceptions...

I click on Console Output and see the error from the build: msbuild.exe is not recognized as an internal or external command. Hmm, I knew I was getting away with too little configuration for this thing. Let's go see where I need to set that path to MSBuild. I'm guessing in the management of the MSBuild plug-in.
It is actually in Manage Jenkins > Configure System under a section for MSBuild. You have to provide the path there. I chose C:\Windows\Microsoft.NET\Framework64\v4.0.30319\MSBuild.exe though maybe I should have just selected the one under plain-ol' C:\Windows\Microsoft.NET\Framework. I'm going for broke.

I'll leave the MSBuild entry for my project as (Default) and see what happens. I may need to change it.

Turns out you do need to go back into your build configuration and select the version of MSBuild to run. I do this and WOOHOO... another failure. I'm missing a whole slew of other libraries that MSBuild looks for, but does not find. However, clearly MSBuild is taking control and attempting to build my solution. Only a matter of time and I will be checking out those other libraries and building these in some good sort of order... successfully.

Getting Jenkins to Check Out from Subversion

Jenkins comes with an SVN plug-in. When I went to Manage Jenkins > Manage Plugins it alerted me to an update to the SVN plug-in, which I updated. I checked the little box to have Jenkins re-start once the installation was complete, but it didn't. I had to go to the "Installed" tab of the plug-ins view and click the button there to re-start Jenkins.

Then I created a Jenkins project to just check out some code from Subversion. After choosing to start a new build project I checked the Subversion radio control under Source Code Management, specified the URL to our SVN repository and set the Local module directory to '.\TestBuild' so I could evaluate where Jenkins puts things. Jenkins tried to immediately access the repository and I was warned that it could not access it because it appeared I was missing credentials. I added those credentials through the separate pop-up window that displayed. Jenkins then confirmed that it could access the repository. I left the other options at their default values. I didn't add a build step nor any post-build options.

I went to the Dashboard and scheduled a build. The build indicator showed that a build was in progress. After a couple of minutes it had not changed, so I refreshed the dashboard and was shown that the last build was successful 2 minutes ago and only took 5.1 seconds. Having to refresh was no big deal, since the same thing happens with the CCNet web interface.

Now I just needed to confirm that Jenkins was able to check out the source code. In my Jenkins installation folder is a directory called jobs. Within that was a directory named the same as the name I gave my Jenkins job. Under that, along with some configuration files, are now two directories: builds and workspace. The builds directory has a directory named with a date-time stamp that is essentially empty. Under workspace is my TestBuild directory and under that the source code. I should have left the workspace path as '.'.

So now I'm moving on to creating a build step to have Jenkins use MSBuild to build my project.

Installing Jenkins for Windows 7

Here is all the "trouble" I had to go through (since my last blog post, mind you) to get the Jenkins server up and running on my local machine.

1) Download native Windows package from jenkins-ci.org. (jenkins 1.436)
2) Run setup to install to C:\Program Files (x86)\Jenkins\
3) Everything else went automatically to start the server on port 8080; however, when the browser came up directed at http://localhost:8080 it got a 404. Turns out the web service just took a little extra time to start up and refreshing the browser showed the Jenkins dashboard.

Automating With Jenkins

I'm re-configuring our project build to use Jenkins instead of CruiseControl. I'm sure the entire world is bound to find this extremely fascinating, so I am blogging about it. There are a number of little reasons for this, but mostly it's simply because I find CCNet to be tired and old and Jenkins to be the new hotness. Actually, I am just up for learning something new, and since we need to create a new build environment anyway, I figured I'd take a shot at using Jenkins.

A bit about the project


It is a .Net project that uses the .Net 4.0 framework, with a WPF client that communicates with a WCF web service, which in turn communicates with a WCF Windows service. It's not a bad architecture, though probably not the BEST architecture. It is definitely not an architecture that is easily testable. It uses SQL Server 2008 as the data store and our version control is SVN.

Our old build environment, like I said, uses CruiseControl.Net. It was what we were using for previous projects, and it was adequate for simply doing our build. Now that we are getting into automating our unit and acceptance tests, however, its age is showing. We generally build for x64 processors and the build environment is an x86 machine. SQL Server 2005 is installed, but not 2008. The versions of NCover and NUnit that are installed are outdated, and NUnit 2.4.8 will not handle assemblies written for .Net 4.0.

Onward!


So while our IT department is building me a virtual machine with the build environment that I want, I am going to try to get an automated build running on my laptop with Jenkins. Wish me luck.