OpenNETCF Scanner Compatibility Library

Some days I think I have too much code “lying around”.  As you would expect from many years as a developer, I have utility libraries for all sorts of tasks.  Generally when I think something is likely to be useful for others I like to make it publicly available for anyone to use – just take a look at the list of Codeplex projects I admin.

This morning I saw a question on StackOverflow about intelligently detecting a platform and loading the proper binaries for it.  In this case it was specific to doing so with Windows Mobile barcode scanners.  I immediately thought, “hey, I have a library for that” and went to answer and give a link.  Except the link didn’t exist.  I never created the open source project for it, so the code has just been sitting here doing nothing.

Yes, this code is probably 5 years or more past its prime useful period due to the decline in use of Windows Mobile, but hey, I just used it on a project last week, so it’s got some life left in it.

So, here’s yet another open source library from my archive – the OpenNETCF Barcode Scanner Compatibility Library.
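
In case you’re curious what “intelligently detecting a platform” means in practice without digging through the source, the core trick is to ask the OS what device you’re running on, then load the vendor-specific scanner assembly that matches.  Here’s a rough sketch of that pattern; the OEM strings and DLL names below are made up for illustration, and this is the general technique rather than the library’s actual API:

using System;
using System.Reflection;
using System.Runtime.InteropServices;
using System.Text;

public static class ScannerLoader
{
    // SPI_GETOEMINFO asks the OS for the OEM's device identification string
    private const int SPI_GETOEMINFO = 258;

    [DllImport("coredll.dll", SetLastError = true)]
    private static extern bool SystemParametersInfo(int uiAction, int uiParam,
        StringBuilder pvParam, int fWinIni);

    public static Assembly LoadScannerAssembly()
    {
        // find out what hardware we're actually running on
        var oemInfo = new StringBuilder(128);
        SystemParametersInfo(SPI_GETOEMINFO, oemInfo.Capacity, oemInfo, 0);
        string oem = oemInfo.ToString();

        // map the OEM string to the wrapper assembly for that scanner family
        // (hypothetical names for illustration only)
        string assemblyFile;
        if (oem.IndexOf("Intermec") >= 0) assemblyFile = "Scanner.Intermec.dll";
        else if (oem.IndexOf("Symbol") >= 0) assemblyFile = "Scanner.Symbol.dll";
        else throw new NotSupportedException("No scanner support for: " + oem);

        // load the right binaries at run time
        return Assembly.LoadFrom(assemblyFile);
    }
}

The application then codes against a common scanner interface and never needs to know which vendor’s SDK is doing the work underneath.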

OpenNETCF ORM Updates: Dream Factory and Azure Tables

We’ve been busy lately.  Very, very busy with lots of IoT work.  A significant amount of that work has been using the Dream Factory DSP for cloud storage, and as such we’ve done a lot of work to make the Dream Factory implementation of the OpenNETCF ORM more solid and reliable (as well as a pretty robust, stand-alone .NET SDK for the Dream Factory DSP as a byproduct).  It also shook out a few more bugs and added a few more features to the ORM core itself.

I’ve pushed a set of code updates (though not an official release yet) up to the ORM Codeplex project that includes these changes, plus an older Azure Table Service implementation I had been working on a while back, in case anyone is interested in playing with it, using it or extending it.  The interesting thing about the Azure implementation is that it includes an Azure Table Service SDK that is Compact Framework-compatible.

As always, feel free to provide feedback, suggestions, patches or whatever over on the project site.

Using Jenkins to Build from Git without the Git plug-in

A few months ago we decided to upgrade our source control provider and moved everything over to Visual Studio Online.  It’s been working great for source control, though getting used to using Git instead of TFS source control is a bit of work.  For a few reasons we’re not using the build features of Visual Studio Online, but instead are using a Jenkins build server.  Jenkins is really nice and can do just about anything you could want, which is a big plus.  The only down side is that it’s all in Java.  Why is that a down side, you may wonder?  Well if things get broken, you’re in a pickle.

We were running all well and good for over a month.  Nightly builds were running.  Version numbers were auto-incrementing.  Releases for Windows and Mono were getting auto-generated and FTPed up to the public release directory.  Things were great.  Until about a week before Christmas, when an update for the Git plug-in was released.  The Git plug-in is what allows you to configure Jenkins to easily connect to a Git server and pull your source code.  Well the plug-in update broke the ability to get code from a Git server on Windows.  Now Jenkins has a rollback feature, and had I understood what the failure actually was (it wasn’t obvious that it was a plug-in failure) then I could have rolled back and probably been fine.  But I didn’t know.  And in my effort to “fix” things, I wiped out the archived roll-back version.

So the option was to either install a Java environment and try to learn how Jenkins works and fix it, or to wait for the community to fix the problem.  I opted to do the latter, because surely it would break other people and would get straightened out quickly, right?  Hmm, not so much, it seems.  I found a reported bug and asked for a time estimate.  I waited a few days.  No fix.  I left the office for a week of “unplugged” vacation and came back.  Still no fix.  I then learned that you can access the nightly builds of the plug-ins themselves (which is actually pretty cool), so I tried manually installing the latest build of the plug-in.  Turns out it was still broken.

While I was trying to figure out what was broken, I also appear to have broken something in the Git workspace on the server, so it was hard to tell if the plug-in was failing or if Git was confused.  I know that I was confused.  So today I decided that I really needed to get this stuff working again.  I changed the Job to no longer use source control, but instead to just run Windows batch files.

REM make sure nothing is hidden 
attrib -H /S
REM recursively remove child folders 
for /d %%X in (*.*) do rd /s /q "%%X"
REM delete files in root folder 
del /q /f *
REM get the source 
git init 
git clone https://[username]:[password]@opennetcf.visualstudio.com/DefaultCollection/_git/SolutionFamily
REM check out
git checkout master
git add ./Common/SolutionFamily.VersionInfo.cs
REM increment the build number
powershell -File "%WORKSPACE%\Utility\SetFamilyVersion.ps1" 2.1.%BUILD_NUMBER%.0
REM commit the change
git commit -a -m auto-version
git push https://[username]:[password]@opennetcf.visualstudio.com/DefaultCollection/_git/SolutionFamily

Once that was done, the MSBUILD plug-in was then able to build from the workspace, though the source code directory was now one level deeper than where the Git plug-in had been pulling code.  If I had wanted to, I could have had my command do the build as well and not even used the MSBUILD plug-in by adding this to the end:

C:\Windows\Microsoft.NET\Framework\v4.0.30319\msbuild.exe "/p:Configuration=Debug;Platform=Any CPU" /m "%WORKSPACE%\SolutionFamily\SolutionEngine.FFx.sln" && exit %%ERRORLEVEL%%

Once the Git plug-in is actually fixed, I’ll post how to use it to connect to Visual Studio Online.  It actually seems to be working “somewhat” this morning.  I say “somewhat” because while it actually is pulling the code and behaving properly, when you do the configuration you get an error, which makes it look like it’s going to fail.  Until that’s ironed out I’m going to wait.

Lots of ORM Updates

We use the OpenNETCF ORM in the Solution Family products.  Unfortunately I haven’t figured out a good way to keep the code base for the ORM stuff we use in Solution Family in sync with the public code base on CodePlex, so occasionally I have to go in and use Araxis Merge to push changes into the public tree, then check them into the public source control server.  What that means to you is that you’re often working with stale code.  Sorry, that’s just how the cookie crumbles, and until I figure out how to clone myself Multiplicity-style, it’s not likely to change.

At any rate, we’re pretty stable on the Solution Family side of things, so I did a large merge back into the public tree this evening.  I still have to do a full release package, but the code is at least up to date as of change set 104901 and all of the projects (at least I hope) properly build.

Most of the changes revolve around work I’ve been doing with the Dream Factory cloud implementation, so there are lots of changes there, but I also have been doing more with DynamicEntities, so some changes were required for that too.  Of course there are assorted bug fixes as well, most of them in the SQLite implementation.  I leave it to you and your own diff skills if you really, really want to know what they are.

Go get it.  Use it.  And for Pete’s sake, quit writing SQL statements!
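
If you’ve never used the ORM, here’s roughly what it looks like against the SQL Compact implementation.  Consider this a from-memory sketch rather than gospel – check the documentation on the project site for the exact attribute and method names:

using OpenNETCF.ORM;

[Entity(KeyScheme.Identity)]
public class Author
{
    [Field(IsPrimaryKey = true)]
    public int AuthorID { get; set; }

    [Field]
    public string Name { get; set; }
}

public class OrmExample
{
    public static void Run()
    {
        // point the ORM at a SQL Compact file and register the entity type
        var store = new SqlCeDataStore("test.sdf");
        store.AddType<Author>();

        // let the ORM generate the schema - no CREATE TABLE statements here
        if (!store.StoreExists)
        {
            store.CreateStore();
        }

        // and no INSERT or SELECT statements either
        store.Insert(new Author { Name = "Theodore Geisel" });
        var authors = store.Select<Author>();
    }
}

Swap the SqlCeDataStore for one of the other implementations (SQLite, Dream Factory, Azure Tables) and the rest of the code stays the same – which is the whole point.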

Our New Cross-Platform Build, Test, Store and Deploy Architecture

First, let me preface this by saying no, I’ve not migrated any Compact Framework application to Visual Studio 2013. We’re still using Visual Studio 2008 for CF apps, so don’t get too excited. That said, we’ve done some pretty interesting work over the last week, so please read on.

Microsoft recently announced the availability of not just Visual Studio 2013, but also Visual Studio Online, which is effectively a hosted version of Team Foundation Server 2013.  We use the older TFS 2010 internally as our source control provider as well as for running unit tests, but it’s got some significant limitations for our use case.

The biggest problem is that our flagship Solution Engine product runs on a lot of platforms – Windows CE, Windows Desktop and several Linux variants.  For Linux we’re using Mono as the runtime, which means we’re using XBuild to compile and Xamarin Studio for code editing and debugging.  Well Mono, XBuild and Xamarin Studio don’t really play well with TFS 2010.  To put it bluntly, it’s a pain in the ass using them together.  You have to train yourself to have Visual Studio and Xamarin Studio open side by side and to absolutely always do code edits in Visual Studio so the file gets checked out, but do the debugging in Xamarin Studio.  Needless to say, we lose a lot of time dealing with conflicts, missing files, missing edits and the like when we go to do builds.

TFS 2013 supports not just the original TFS SCC, it also supports Git as an SCC, which is huge, since Xamarin Studio also supports Git. The thinking was that this would solve our cross-platform source control problem, so even if everything else stayed the same, we’d end up net positive.

I decided that if we were going to move to TFS 2013, we might as well look at having it hosted by Microsoft at the same time.  The server we’re running TFS 2010 on is pretty old, and to be honest I hate doing server maintenance.  I loathe it.  I don’t want to deal with getting networking set up.  I don’t like doing Hyper-V machines.  I don’t like dealing with updates, potential outages and all the other crap associated with having a physical server.  Even worse, that server isn’t physically where I am (all of the other servers we have are) so I have to deal with all of that remotely.  So I figured I’d solve problem #2 at the same time by moving to the hosted version of TFS 2013.

Of course I like challenges, and Solution Engine is a mission-critical product for us.  We have to be able to deliver updates routinely. It’s effectively a continuously updated application – features are constantly rolling into the product instead of defined periodic releases with a set number of features.  We’ll add bits and pieces of a feature incrementally over weeks to allow us to get feedback from end users and to allow feature depth to grow organically based on actual usage.  What this means is that the move had to happen pretty seamlessly – we can’t tolerate more than probably 2 or 3 days of down time.  So how did I handle that?  Well, by adding more requirements to my plate, of course!

If I was going to stop putting my attention toward architecting and developing features and shift to our build and deployment system, I decided it was an excellent opportunity to implement some other things I’ve wanted to do.  So my list of goals started to grow:

  1. Move all projects (that are migratable) to Visual Studio 2013
  2. Move source control to Visual Studio Online
  3. Abandon our current InstallAware installer and move to NSIS, which meant:
    1. Learn more about NSIS than just how to spell it
    2. Split each product into separate installers with selectable features
  4. Automate the generation of a tarball for installation on Linux
  5. Automate FTPing all of the installers to a public FTP
  6. Set up that FTP server
  7. Set up a nightly build for each product on each platform that would also do the installers and the FTP actions
  8. Set up a Continuous Integration build for each product on each platform with build break notifications

Once I had my list, I started looking at the hosted TFS features and what it could do to help me get some of the rest of the items on my list done.  Well it turns out that it does have a Build service and a Test service, so it could do the CI builds for me – well the non-Mono CI builds anyway.  The nightly builds could be done, but no installer and FTP actions would be happening.  And it looked like I was only going to get 60 minutes of build time per month for free.  Considering that a build of just Engine and Builder for Windows takes roughly 6 minutes, nightly builds alone would burn through about 180 minutes a month – triple the free allowance – so I clearly needed to think outside the box.

I did a little research and ended up installing Jenkins on a local server here in my office (yes, I was trying to get away from a server and ended up just trading our SCC server for a build server).  The benefit here is that I’ve now got it configured to pull code for each product as check-ins happen and then to do CI builds to check for regressions.  If a check-in breaks any platform, everyone gets an email.  So if a Mono change breaks the CF build, we know.  If a CF change breaks the desktop build, we know.  That’s a powerful feature that we didn’t have before.

Jenkins also does our nightly builds, compiles the NSIS installers and builds the Linux tarballs.  It FTPs them to our web site so a new installation is available to us or customers every morning just like clockwork, and it emails us if there’s a problem.

It was not simple or straightforward to set all of this up – it was actually a substantial learning curve for me on a whole lot of disparate technologies.  But it’s working and working well, and it only took about 6 days to get going.  We had a manual workaround for being able to generate builds after only 2 days, so there was no customer impact.  The system isn’t yet “complete” – I still have some more Jobs I want to put into Jenkins, and I need to do some other housekeeping like getting build numbers auto-incrementing and showing up in the installers, but it’s mostly detail work that’s left.  All of the infrastructure is set up and rolling.  I plan to document some of the Jenkins work here in the next few days, since it’s not straightforward, especially if you’re not familiar with Git or Jenkins, plus I found a couple of bugs along the way that you have to work around.

In the end, though, what we ended up with is an extremely versatile cross-platform infrastructure.  I’m really liking the power and flexibility it has already brought to our process, and I’ve already got a lot of ideas for additions to it.  If you’re looking to set up something similar, here’s the checklist of what I ended up with (hopefully I’m not missing anything).

Developer workstations with:

  • Visual Studio 2013 for developing Windows Desktop and Server editions of apps
  • Xamarin Studio for developing Linux, Android and iOS editions
  • Visual Studio 2008 for developing Compact Framework editions

A server with:

  • Visual Studio 2013 and 2008 (trying to get msbuild and mstest running without Studio proved too frustrating inside my time constraints)
  • Mono 3.3
  • 7-Zip (for making the Linux tarballs)
  • NSIS (for making the desktop installers)
  • Jenkins with the following plug-ins:
    • msbuild
    • mstest
    • git
    • ftp

Should You Develop Software for Free?

I just read an excellent opinion piece over on the New York Times site. I’d recommend you read it, but if you’ve not got the time, or are generally just too lazy, let me summarize. The author is, not unexpectedly, a writer. Often he is asked to do writing or give speeches and is offered $0.00 in return. Basically people want him to give away the output of his “craft”, yet those same people would never ask for or expect “a keychain or a Twizzler” for free. What gives? How do you tell them to pound sand, but in a polite way? And should you?

Well it’s not so different here in the computer world. Many people feel software should be free. After all, it’s just electrons, and I can shuffle my feet across the carpet and get electrons for free, so why should you buy electrons from someone else? In fact the idea is so popular that there’s an entire sect that firmly believes that all software should be free. Amazingly, these people even tend to be software developers.

Yes, I’m somewhat of a contributor to this. I have written a boatload of software that I give away. I even encourage people to go get it for free. But let’s be clear, that software was paid for. It was developed over many years and many projects. The customers I was working for at the time paid me to solve their problems, and part of those solutions involved adding to or fixing some of those core libraries. Still, it was paid for. I simply reserved a small portion of the work as part of a “free” library that carried over to the next project.

And here’s another dirty little secret. I didn’t do it out of altruism. No, I didn’t do it for “exposure”. That’s a line that someone trying to wheedle free stuff from you would use. I kept those libraries “free” so that a customer can’t say “hey, you did that work on my dime, I own it!” No, sir, you own the solution to your problem, part of which uses a set of free libraries.  You’ve gained from work done for countless other projects in the past. Those libraries allowed me to solve your problem much faster than if I had to start from scratch. So the reason I provide free software is to save myself from doing repetitive work and to, generally, keep a set of base libraries that I’m free to use anywhere, on any project.

So why do I also give those libraries away? I don’t know, maybe I am an altruist. Maybe I have the delusion that some day when I’m called in to fix a broken project they’ll already be using my libraries and it will save me the headache and frustration of coming up to speed. Maybe I just hate seeing people reinvent the wheel. Maybe I’m just crazy. Maybe it’s a bit of all of the above, but let’s be clear: I don’t do work for free.

I don’t feel that anyone has a right to ask for software that is not paid for. You may be able to sway me somewhat with the argument that you should be free to run and modify the software you paid for, but you’ll have a harder time on the copy-and-distribute side of the argument.  We, as software developers, still have to pay our bills.  We’ve spent years honing our skills to be able to solve these problems (admittedly, some have done a better job of honing than others) and it takes actual time and work to solve them.  Sometimes I work on problems or with customers where it, most decidedly, is work.  Even drudgery.  So no, I’m not a believer that software, music, writing, photographs or really any other form of “art” should be free.  You wouldn’t expect an electrician to work for free, so don’t expect it of a “content creator” either.

Memory Management in the Compact Framework

Way back in 2006 I presented a talk on memory management in the Compact Framework. Well, memory management in the CF, at least up through 3.5, is no different than it was back then, and the content from that talk is still valid. I had also done the talk as a webcast that Microsoft hosted, but evidently that content has been purged from their servers. Unfortunately I don’t have any backup of the actual talk, but I do have a backup of the presentation PowerPoint. You can get it here.

    Update! (Oct 28, 2013)

Hooray! Microsoft recovered the talk from a backup. I’m not certain if or when they’ll post it back on their site, but as insurance I’m also keeping my own copy. You can download it directly here.