Diskprep Availability

Diskprep.exe is a useful tool for making a bootable USB disk with an OS, but recently it seems to have disappeared from Microsoft’s downloads.  I can’t say if it’s another one of those subtle hints about the future of Windows CE, an oversight due to the lack of resources dedicated to Windows CE, or just a simple mistake that will get corrected shortly.  Regardless of the cause, there are people who still find the tool useful, so I’m providing a download mirror of the tool here.

MJPEG (and other camera work)

Back in 2009 I was doing a fair bit of work for some customers in the security field.  I did some proof-of-concept work and ended up with some code that, while not groundbreaking, might at least be useful to others.  It’s really too small to bother starting a CodePlex project for, unless I get some pull requests, in which case I’ll turn it into a full project.  In the meantime, feel free to download the source.
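
For anyone curious what that sort of camera work involves: MJPEG over HTTP is essentially an endless multipart stream of individual JPEG images.  The sketch below is purely illustrative – it is not the code in the download, and the class and method names are mine – it just pulls frames out of such a stream by watching for the JPEG start and end markers.

// A minimal, purely illustrative sketch (NOT the downloadable source):
// MJPEG over HTTP is a multipart stream of JPEG images, so one crude way
// to pull frames is to scan for the JPEG start/end markers.
using System;
using System.IO;
using System.Net;

public static class MjpegReader
{
    // Reads up to maxFrames JPEG frames from an MJPEG URL and hands each one to onFrame.
    public static void ReadFrames(string url, Action<byte[]> onFrame, int maxFrames)
    {
        var request = (HttpWebRequest)WebRequest.Create(url);
        var response = request.GetResponse();
        try
        {
            using (var stream = response.GetResponseStream())
            {
                var frame = new MemoryStream();
                int previous = -1;
                int current;
                bool inFrame = false;
                int count = 0;

                while (count < maxFrames && (current = stream.ReadByte()) != -1)
                {
                    if (!inFrame)
                    {
                        // 0xFF 0xD8 is the JPEG start-of-image marker
                        if (previous == 0xFF && current == 0xD8)
                        {
                            inFrame = true;
                            frame.SetLength(0);
                            frame.WriteByte(0xFF);
                            frame.WriteByte(0xD8);
                        }
                    }
                    else
                    {
                        frame.WriteByte((byte)current);

                        // 0xFF 0xD9 is the JPEG end-of-image marker
                        if (previous == 0xFF && current == 0xD9)
                        {
                            onFrame(frame.ToArray());
                            count++;
                            inFrame = false;
                        }
                    }

                    previous = current;
                }
            }
        }
        finally
        {
            response.Close();
        }
    }
}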

New ORM Release: v1.0.14007

I’ve finally gotten around to wrapping up all of the changes I’ve made in the last year (has it really been that long since the last release?) to the OpenNETCF ORM library.  The changes have always been available in the change set browser, but I actually have them as binary and source downloads now.  I probably should find the time to create a NuGet package for it (and IoC) now.

Using Jenkins to Build from Git without the Git plug-in

A few months ago we decided to upgrade our source control provider and moved everything over to Visual Studio Online.  It’s been working great for source control, though getting used to Git instead of TFS source control is a bit of work.  For a few reasons we’re not using the build features of Visual Studio Online, but instead are using a Jenkins build server.  Jenkins is really nice and can do just about anything you could want, which is a big plus.  The only downside is that it’s all in Java.  Why is that a downside, you may wonder?  Well, if things break, you’re in a pickle.

We were running well and good for over a month.  Nightly builds were running.  Version numbers were auto-incrementing.  Releases for Windows and Mono were getting auto-generated and FTPed up to the public release directory.  Things were great.  Until about a week before Christmas, when an update for the Git plug-in was released.  The Git plug-in is what allows you to configure Jenkins to easily connect to a Git server and pull your source code.  Well, the plug-in update broke the ability to get code from a Git server on Windows.  Now Jenkins has a rollback feature, and had I understood what the failure actually was (it wasn’t obvious that it was a plug-in failure), then I could have rolled back and probably been fine.  But I didn’t know.  And in my effort to “fix” things, I wiped out the archived roll-back version.

So the options were to either install a Java environment and try to learn how Jenkins works and fix it, or to wait for the community to fix the problem.  I opted for the latter, because surely it would break things for other people and would get straightened out quickly, right?  Hmm, not so much it seems.  I found a reported bug and asked for a time estimate.  I waited a few days.  No fix.  I left the office for a week of “unplugged” vacation and came back.  No fix.  I then learned that you can access the nightly builds of the plug-ins themselves (which is actually pretty cool), so I tried manually installing the latest builds of the plug-in.  Turns out it was still broken.

While I was trying to figure out what was broken, I also appear to have broken something in the Git workspace on the server, so it was hard to tell if the plug-in was failing or if Git was confused.  I know that I was confused.  So today I decided that I really needed to get this stuff working again.  I changed the job to no longer use source control, but instead to just run Windows batch files.

REM make sure nothing is hidden 
attrib -H /S
REM recursively remove child folders 
for /d %%X in (*.*) do rd /s /q "%%X"
REM delete files in root folder 
del /q /f *
REM get the source 
git init 
git clone https://[username]:[password]@opennetcf.visualstudio.com/DefaultCollection/_git/SolutionFamily
REM check out
git checkout master
git add ./Common/SolutionFamily.VersionInfo.cs
REM increment the build number
powershell -File "%WORKSPACE%\Utility\SetFamilyVersion.ps1" 2.1.%BUILD_NUMBER%.0
REM commit the change
git commit -a -m auto-version
git push https://[username]:[password]@opennetcf.visualstudio.com/DefaultCollection/_git/SolutionFamily

Once that was done, the MSBUILD plug-in was then able to build from the workspace, though the source code directory had moved down one level compared to where the Git plug-in had been pulling code.  If I had wanted to, I could have had my command do the build as well and not even used the MSBUILD plug-in by adding this to the end:

C:\Windows\Microsoft.NET\Framework\v4.0.30319\msbuild.exe "/p:Configuration=Debug;Platform=Any CPU" /m "%WORKSPACE%\SolutionFamily\SolutionEngine.FFx.sln" && exit %%ERRORLEVEL%%

Once the Git plug-in is actually fixed, I’ll post how to use it to connect to Visual Studio Online.  It actually seems to be working “somewhat” this morning.  I say “somewhat” because while it is pulling the code and behaving properly, when you do the configuration you get an error, which makes it look like it’s going to fail.  Until that’s ironed out, I’m going to wait.

Lots of ORM Updates

We use the OpenNETCF ORM in the Solution Family products.  Unfortunately I haven’t figured out a good way to keep the code base for the ORM stuff we use in Solution Family in sync with the public code base on CodePlex, so occasionally I have to go in and use Araxis Merge to push changes into the public tree, then check them into the public source control server.  What that means to you is that you’re often working with stale code.  Sorry, that’s just how the cookie crumbles, and until I figure out how to clone myself Multiplicity-style, it’s not likely to change.

At any rate, we’re pretty stable on the Solution Family side of things, so I did a large merge back into the public tree this evening.  I still have to do a full release package, but the code is at least up to date as of change set 104901 and all of the projects (at least I hope) properly build.

Most of the changes revolve around work I’ve been doing with the Dream Factory cloud implementation, so there are lots of changes there, but I also have been doing more with DynamicEntities, so some changes were required for that too.  Of course there are assorted bug fixes as well, most of them in the SQLite implementation.  I leave it to you and your own diff skills if you really, really want to know what they are.

Go get it.  Use it.  And for Pete’s sake, quit writing SQL statements!
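
If you haven’t used the ORM before, here’s roughly what replacing those SQL statements looks like.  This is a sketch from memory of the public samples, not code cut from the release, so check the actual source for the exact names and signatures – SQLiteDataStore, AddType, Insert and Select are the pieces I’m assuming here.

// Rough usage sketch from memory -- verify names against the actual release.
// The point is that you define entities and let the ORM handle the SQL.
using System;
using OpenNETCF.ORM;

[Entity(KeyScheme.Identity)]
public class LogEntry
{
    [Field(IsPrimaryKey = true)]
    public int ID { get; set; }

    [Field]
    public string Message { get; set; }
}

class Program
{
    static void Main()
    {
        // the SQLite implementation mentioned above; the other data stores work the same way
        var store = new SQLiteDataStore("app.db");
        if (!store.StoreExists)
        {
            store.CreateStore();
        }
        store.AddType<LogEntry>();

        store.Insert(new LogEntry { Message = "no hand-written SQL here" });

        foreach (var entry in store.Select<LogEntry>())
        {
            Console.WriteLine(entry.Message);
        }
    }
}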

HOWTO: Add the Win32 file version to your .NET Compact Framework assemblies

[NOTE: This is an old post from November 15, 2004 by Neil Cowburn that is hit fairly frequently and that I’ve recovered using the Wayback Machine]

Currently, there is only one supported method of setting the Win32 file version of your .NET Compact Framework assemblies: compiling your project from the command line with csc.exe, using the “/win32res” switch and a Win32 resource file. This is definitely not an optimal solution if you are not familiar with command-line compiling .NET CF apps.
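
For reference, that route looks roughly like this (the file names here are placeholders, not from the original post): author a VERSIONINFO resource script, compile it to a .res file with rc.exe, then hand it to csc.exe with the /win32res switch:

rc.exe MyVersion.rc
csc.exe /target:library /out:SmartDeviceLib.dll /win32res:MyVersion.res Class1.cs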

In the .NET Framework, those lucky developers are able to set the Win32 file version using a special attribute in the AssemblyInfo file. However, this attribute, System.Reflection.AssemblyFileVersionAttribute, is missing from the .NET Compact Framework. How can we fix this so that we can easily set the Win32 file version? Easy! Add the following code to your project:

using System;
namespace System.Reflection
{
    [AttributeUsage(AttributeTargets.Assembly, AllowMultiple=false)]
    public class AssemblyFileVersionAttribute : Attribute
    {
        private string version;
        public string Version
        {
            get { return version; }
        }
        public AssemblyFileVersionAttribute(string version)
        {
            if(version == null)
            {
                throw new ArgumentNullException("version");
            }
            this.version = version;
        }
    }
}

And then, in your AssemblyInfo file, add the following attribute:

[C#]

[assembly: AssemblyFileVersion("1.0.0")]


[VB]

<Assembly: AssemblyFileVersion("1.0.0")>


Compile your project and then check out its property page using Windows Explorer. You should see that the File Version information has been successfully added to your assembly.

Developing Compact Framework Apps in Visual Studio 2013

A friend, colleague and fellow MVP, Pete Vickers, brought an interesting product to my attention this weekend.  iFactr has a Compact Framework plug-in for Studio 2013.  I’ve not tried the plug-in, so this isn’t an endorsement, just a bit of information.  I also don’t know how they’re pulling it off.  It looks like they have WinMo 6.5 and emulator support, and it requires an MSDN subscription.  I suspect that it requires you to install Studio 2008 so you get the compilers, emulators and all of that goodness on your development system, and it then hooks into those pieces from Studio 2013.

It most certainly is not adding any new language features – you’re still going to be targeting CF 3.5 in all its glory – but the ability to use a newer toolset is a welcome addition.  If they are somehow pulling it off without requiring Visual Studio 2008, that will be really nice.  If you’ve tried the plug-in, let me know how it went in the comments.

Windows CE on Arduino?

If you do much “maker” stuff, you’re probably aware of the Netduino, an Arduino-compatible board that runs the .NET Micro Framework.  Cool stuff and it allows you to run C# code on a low-cost device that could replace a lot of microcontroller solutions out there.

It just came to my attention today that there’s a new game in town – 86duino, an Arduino-compatible x86 board.  Say what?!  Basically we have an Arduino-sized, Arduino-priced ($39 quantity-one retail price, hello!) device that can run any OS that runs on x86.  Let’s see, an OS that runs on x86, does well in a headless environment, runs managed code, can be real-time, has a small footprint and low resource utilization?  How about Windows CE?  There’s no BSP for it yet that I see, but it’s x86, so the CEPC BSP is probably most of what you need for bring-up.

I’ll be looking to build up a managed code library to access all of the I/O on this and some popular shields.  Any requests/thoughts on “must-have” shield support?
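
To be clear, nothing like this exists yet – the following is purely a sketch of the shape such a managed I/O layer might take, and every name in it is hypothetical:

// Entirely hypothetical -- no such library exists yet. Just the rough shape
// a managed I/O layer for an x86 Arduino-compatible board might take.
public enum PinDirection
{
    Input,
    Output
}

public interface IDigitalPin
{
    PinDirection Direction { get; set; }
    bool Read();
    void Write(bool state);
}

public interface IBoard
{
    IDigitalPin GetDigitalPin(int pinNumber);
    int ReadAnalog(int channel);                   // raw ADC counts
    void SetPwm(int pinNumber, double dutyCycle);  // 0.0 to 1.0
}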

Building a Mono Solution with Jenkins

I recently switched our entire build system over to a Jenkins build server.  It’s been running for a couple of weeks now and I keep expanding the jobs I have it doing, and all in all I’m very happy with how it’s going.  It’s certainly saved me a load of time, and the fact that we now get automated nightly builds of all of our installers is extremely valuable to us and our customers.

One of the challenges in getting things working was automating the build of the Linux installer for Solution Engine.  The Jenkins server is a Windows Server machine and Solution Engine is built using Mono, but the actual deployment package is a tarball.  Generating tarballs on a Windows platform really isn’t well documented or outlined anywhere that I could find, but I was able to piece the process together from some help files, man pages and a lot of iterations.

In the end, the build portion of the job is done through four separate “Execute Windows Batch Command” steps.

Step 1: Compile the Solution

This one was pretty straightforward.  You simply use the xbuild.exe application that ships with Mono and point it at your Visual Studio/Xamarin Solution File:

"C:Program Files (x86)Mono-3.2.3libmono4.5xbuild.exe" /p:Configuration=Debug SolutionEngine.Mono.sln

Step 2: Copy the results

xbuild puts the files into the output structure I want (that’s how the Visual Studio solution is architected), but it also generates a lot of cruft.  I use robocopy, which is already part of Windows Server, to copy all of my files except those with a *.pdb or *.mdb extension to a temporary output location:

robocopy Publish\SolutionEngine\Debug\Mono Installers\output /s /xf *.pdb *.mdb

Step 3: Put the results into a tarball

Next I use 7-zip to build the tarball.  This part was surprisingly confusing.  7-zip has some documentation, but it’s far from clear, and I even found a page that had “command line examples,” but I’m not sure it really gave me any more clarity.  In the end I just did lots and lots of iterations on the build, checking the output every time to see what happened.  This is the command I finally ended up with:

"C:Program Files (x86)7-Zip7z.exe" a -ttar SolutionEngine.tar "%WORKSPACE%Installersoutput*.*" -mmt -r

This breaks down as follows:

  • The ‘a’ flag means I’m creating (as opposed to extracting from) an archive
  • -ttar is a switch meaning “type tar” – I’m using a tar container (as opposed to say a zip, iso or whatever)
  • SolutionEngine.tar is the output file.  It ends up in the working folder where I’m running the command.
  • The next section is the “source”.  %WORKSPACE% is an environment variable Jenkins sets and the rest of the path you can see is the destination from Step 2 above.  *.* means I want everything found at the source location in the tar.
  • -mmt means “use multi-threading when compressing” which I think ends up using multiple cores if available.  I didn’t compare times with and without, so I don’t know how much it helps and it’s probably negligible – my end tar is only about 9MB.  Having the switch doesn’t cause problems, so I’m leaving it in.
  • The -r flag means “recurse the source.”  My source folder has several layers of subdirectories and I want them all in the tarball, maintaining the folder structure.  This flag achieves that.

Step 4: Compress the tarball

For those unaware, a “tar.gz” file is a double operation – package, then compress the package.  Step 3 built the package, but to be friendly to both my upload process and our customers who have to download it, I also compress the file.  To compress the tarball with gzip compression I again use 7-zip.

"C:Program Files (x86)7-Zip7z.exe" a -tgzip SolutionEngine.tar.gz "%WORKSPACE%SolutionEngine.tar"

This one is simpler than Step 3 and breaks down as follows:

  • The ‘a’ flag, again, means I’m creating an archive
  • -tgzip is a switch meaning “compress using type gzip”
  • SolutionEngine.tar.gz is the output file.  Again, it ends up in the working folder where I’m running the command.
  • The next section is the “source” file, which was the tar file output by Step 3 above.

New Blog for IoT/Intelligent Systems Topics

I’ve created a new blog page specifically for information on Intelligent Systems, IoT and M2M topics.  This blog will continue to have the same type of content I’ve typically posted: developer-centric information covering cross-platform and embedded devices for the .NET developer, but topics that are specifically looking at IoT and our Solution Family products will now be over on our sister blog.