Sunday, 24 December 2017

MapGuide tidbits: Windows 10 Fall Creators Update

In case anyone else might encounter this problem.

I had a copy of MapGuide Open Source 3.1 installed on my laptop running Windows 10. Recently it was upgraded to the Fall Creators Update and suddenly, the web tier (an IIS/.net configuration) of this installation stopped working.

The non-functional web tier manifested as HTTP 503 errors when I tried to connect with MapGuide Maestro.



If you go into IIS manager and try to recycle the associated Application Pool, you may see an error like this



If we take a look at the Windows Event Viewer, we see log entries like this



It just so happens that if we search online for "\\?\", we find a similar issue relating to a completely different product (Team Foundation Server). Still, that link gives us a few useful leads.

Per that link, it appears to be a symlink issue regarding the Application Pool. To verify that this issue also applies to us, we check C:\inetpub\temp\appPools and indeed see a symlink for our Application Pool name.



If we try to open it, Windows throws an error



So with this confirmation of a broken symlink, we can fix the issue by following the prescription in that linked issue:
  1. Delete the broken symlink in question under C:\inetpub\temp\appPools
  2. Restart IIS and the Application Pool in question.
After this, I can connect to the Web Tier again with Maestro, confirming that the issue is resolved.


Thursday, 23 November 2017

FDO road test: SQL Server 2017 on Linux

You can consider this post the 2017 edition of this earlier post.

So, for some background: there have been several annoyances I've personally been experiencing with the SQL Server FDO provider, and they have given me sufficient motivation to fix the problem right at the source (code). However, before I could go down that road, I needed to set up a local dev installation of SQL Server, as my dev environment is geared more towards MapGuide than individual FDO providers.

But just like my previous adventure with the King Oracle FDO provider, I didn't want to have to actually find/download a SQL Server installer and proceed to pollute my dev environment with a whole assortment of junk and bloat. We now live in the era of docker containers! Spinning up a SQL Server environment should be a docker pull away and when I no longer need the environment, I can cleanly blow it away without leaving lots of junk behind.

And it just so happens that with the latest release of SQL Server 2017, not only is running it inside a docker container a first-class user story, it is also the first release of SQL Server that natively runs on Linux.

So through the exercise of spinning up a SQL Server 2017 Linux container we can kill multiple birds with one stone:

  • We'll know if MapGuide/FDO in its current form can work with SQL Server 2017
  • We'll also know how well it works with the Linux version of SQL Server (given its feature set is not at parity with the equivalent Windows version)
  • If MapGuide/FDO works, we'd then have a SQL Server environment ready to go which can be spun up and torn down on demand to then start fixing various problems with the FDO provider.

Spinning up the SQL Server 2017 Linux docker container

This was easy because Microsoft provides an official docker image. So it was just a case of pulling down the docker image, setting some environment parameters to use a custom SQL Server sa login when we docker run the container, and defining port mappings so we can connect to this container from the docker host OS.
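For reference, the commands involved looked something like this (the image name and environment variable names are per Microsoft's documentation for the 2017-era image; the sa password and container name here are just placeholders):

docker pull microsoft/mssql-server-linux

docker run -e "ACCEPT_EULA=Y" -e "SA_PASSWORD=MyS3cret!Passw0rd" -p 1433:1433 --name sql2017 -d microsoft/mssql-server-linux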

The FDO Toolbox bootstrapping test

This was an easy way to determine if the SQL Server FDO provider works with SQL Server 2017. FDO Toolbox has the ability to:
  1. Create a SQL Server data store
  2. Bulk Copy spatial data into it
  3. Query/Preview data from it
If we can do all 3 things above in FDO Toolbox against the freshly spun up SQL Server 2017 linux container, that's a very good sign that everything works.

Creating the FDO data store

FDO Toolbox has a specialized UI for creating SQL Server data stores that is accessible by right-clicking the FDO Data Sources node and choosing Create Data Store - Create SQL Server


This gives us the UI to set up a new SQL Server data store


The first real test is to see if the FDO provider can connect to our SQL Server container, which is a case of filling in all the required connection properties and clicking the Test button. This gives us:


So far so good. Now that we know the FDO provider can connect to the container, we can fill out the data store parameters and click OK to create the data store, which gave us another good sign:


Now just to be sure that the FDO provider did actually create the database, I connected to this SQL Server instance through alternative tools (such as the new SQL Operations Studio) and we can see that the database is indeed there.


So now we can bulk copy some spatial data into it, which will be a nice solid verification that the feature and schema manipulation functionality of the FDO provider work in SQL Server 2017.

So I set up a bulk copy using a whole bunch of test SHP files. A few moments later, we got another positive sign:


Again, for verification we can look at this database in a different tool and can see that the FDO provider correctly created the database tables.


And that data was actually being copied in


Just as an aside: SQL Operations Studio doesn't do spatial data previews like its big brother SQL Server Management Studio.

A shame really. Oh well, at least we can do that in FDO Toolbox :)


Which is also confirmation that FDO is getting the geometry data out of our SQL Server 2017 linux container without any problems.

So based on all these findings, I feel comfortable in saying that FDO (and applications using it, like MapGuide) works just fine with SQL Server 2017, including its Linux version.

Now to deal with these actual annoyances in the FDO provider itself ...

An introduction to MgTileSeeder

I previously said I'd cover this tool in a future post, and that future is now.

MgTileSeeder (introduced as a standalone companion release to MapGuide Maestro 6.0m8) is a new command-line tile seeding application that is the successor to the current MgCooker tile seeder.

This tool is the offspring of an original thought experiment about how one could possibly build a multi-threaded tile seeder using 2017-era .net libraries and tools. It turns out the actual implementation didn't differ that much from my hypothetical code sample from the original post!

But besides being a ground-up rewrite, MgTileSeeder has the following unique features over MgCooker:

  • If your MapGuide Server is 2.6 or newer, we will use CREATERUNTIMEMAP to automatically infer the required meters-per-unit value that is critical in determining how many tiles we need to actually seed.
  • MgTileSeeder is a cross-platform and self-contained .net core application taking advantage of the newly netstandard-ized Maestro API.
  • More importantly, MgTileSeeder finally supports seeding of XYZ tilesets. In fact, the way this support has been designed, you can use MgTileSeeder as a generic tile cache seeder for any XYZ tileset, not just ones served by MapGuide itself.
Seeding standard tiled maps

The minimal command to start seeding a tiled map is simply:

MgTileSeeder mapguide -m --map

Here's an example MgTileSeeder invocation to seed a tile set

MgTileSeeder mapguide -m http://localhost/mapguide/mapagent/mapagent.fcgi --map Library://Samples/Sheboygan/TileSets/Sheboygan.TileSetDefinition

This will use CREATERUNTIMEMAP to auto-infer the required meters-per-unit (for tile sets, we make a temporary Map Definition that links to the tile set and run CREATERUNTIMEMAP against that) and then proceed to display a running progress report that updates every second:


There are other options available, such as:
  • Restricting tile seeding to a specific extent
  • Restricting tile seeding to specific base layer groups
  • Manually passing in the meters-per-unit value

Seeding XYZ tile sets

Seeding XYZ tile sets uses a completely different set of parameters. The minimal command to seed an XYZ tile set is:

MgTileSeeder xyz --url --minx --miny --maxx --maxy

An example of tiling an XYZ tile set (eg. Library://Samples/Sheboygan/TileSets/SheboyganXYZ.TileSetDefinition) in MapGuide would look like this:

MgTileSeeder xyz --url "http://localhost/mapguide/mapagent/mapagent.fcgi?OPERATION=GETTILEIMAGE&VERSION=1.2.0&CLIENTAGENT=OpenLayers&USERNAME=Anonymous&MAPDEFINITION=Library://Samples/Sheboygan/TileSets/SheboyganXYZ.TileSetDefinition&BASEMAPLAYERGROUPNAME=Base+Layer+Group&TILECOL={y}&TILEROW={x}&SCALEINDEX={z}" --minx -87.7978 --miny 43.6868 --maxx -87.6645 --maxy 43.8037

Unlike the standard tiling mode, you are required to define the bounds (in lat/long) of the area you wish to seed. You can also see here that the XYZ tiling mode accepts any arbitrary URL that has {x}, {y} and {z} placeholders. This means you can use MgTileSeeder for tiling any XYZ tile set (eg. your own custom OpenStreetMap tile set), not just ones served by MapGuide. You just need to make sure your URL provides the required XYZ placeholders.



And that concludes our introduction to the MgTileSeeder tool.

Happy tiling!


Friday, 17 November 2017

Announcing: MapGuide Maestro 6.0m8

Here's another new milestone of MapGuide Maestro 6.0. This release is somewhat light in new features, with more emphasis on changes under-the-hood and the surrounding ecosystem.

Let's start with the new features first.

Feature Count for Thematic Rules

When dealing with thematic layers, sometimes one might want to know exactly how many features are covered by each thematic layer rule. There's now a Feature Count button to crunch those numbers for you.


Clicking it will crunch the feature counts of each individual style rule with a filter (the default rule is omitted) and present the totals in a new dialog.


MgTileSeeder (the successor to MgCooker)

Not bundled with Maestro yet, but included as a standalone package available for download alongside this release is MgTileSeeder, a new command-line tile seeding application that is the successor to MgCooker and will eventually replace it in a future release.

I'll cover this tool in more detail in a future post.

New project site

Since MapGuide Maestro is now on GitHub, I've activated the GitHub Pages feature and spun up a new project web site for it.

On this site you will also find the user guide, developer's guide and the API reference for Maestro API and friends.

So speaking of Maestro API ...

Where's the SDK package?

The SDK story is going through a bit of churn at the moment. This milestone release is primarily focused around Maestro (the application) and not the API/SDK, so whatever things I had intended to finish regarding the Maestro API/SDK have taken a back seat so I can get Maestro (the application) out the door.

So as it stands, there is no SDK package with this release, nor will there be with any future releases. This is due to major under-the-hood work to port the MapGuide Maestro API and supporting libraries over to target .net standard.

The end result of this is that the primary way to acquire the Maestro API is now via a NuGet package

And since the API reference is now online, this makes the SDK package somewhat redundant.

The various sample code and supporting tools in the SDK have been shipped off to a separate repository that will be revealed in due course, once they have all been updated to work in this new .net world we live in.

If you are an existing consumer of the Maestro API, it should be as simple as removing all your current assembly references to Maestro API and friends and installing the NuGet packages in the affected projects.
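For example (assuming the package id mirrors the assembly name; check nuget.org for the actual id and current version), that boils down to something like this in the Package Manager Console:

Install-Package OSGeo.MapGuide.MaestroAPI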

.net Framework 4.6.1 required

Due to porting the Maestro API to target .net standard 2.0, .net Framework 4.6.1 is the minimum version of the .net Framework required.

The Windows installer will automatically download and install this for you if you don't have it. It will also automatically install the Visual C++ redistributable so the local connection (mg-desktop) mode will also work out of the box.

Other changes/fixes
  • Fix a long-standing annoyance where setting WMS bounds on a published layer would set the coordinate system to EPSG:????, requiring you to manually enter the EPSG code. This should now be automatic most of the time. The bounds will also be automatically transformed to EPSG:4326 if required.
  • Now uses ICSharpCode.TextEditor for dialog to edit raw resource header XML
  • New resources validation rules around WMS-published Layer Definitions
  • Basic line styles no longer trashed on cancellation of the Edit Style dialog
  • Can now read configuration documents where FDO-related attributes have incorrect casing
  • No-op any map viewer rendering requests if any display parameter is <= 0
  • Disable local map preview if connecting to a MapGuide Server older than 2.1
  • Fusion editor no longer adds obsolete VirtualEarthScript element when adding Bing Maps layers
  • Now gracefully handles invalid resources with open editors instead of crashing out to desktop.


Thursday, 19 October 2017

The journey of porting the MapGuide Maestro API to .net standard

So what prompted the push to port the MapGuide Maestro API to .net standard was Microsoft recently releasing a whole slate of developer goodies. Of particular relevance to the subject of this post is .net standard 2.0.

For those who don't know, .net standard is (you guessed it) a versioned standard against which one can write portable, cross-platform class libraries that will work in any .net runtime environment supporting the version of .net standard you are targeting. If you do Android development, this is similar to API levels.

.net standard is of interest to me because the MapGuide Maestro API is currently a set of class libraries that target the full .net Framework. Having it target .net standard instead would give us guaranteed cross-platform portability across .net runtime environments that support .net standard (Mono) and/or support for platforms that were never possible before (.net Core/Xamarin/UWP).

I previously attempted to port the Maestro API to earlier versions of .net standard, with mixed success:
  • The ObjectModels library could be ported to .net standard 1.6, but required installing many piecemeal System.* nuget packages to fill in the missing APIs.
  • Maestro API itself could not be ported due to its reliance on XML schema functionality and HttpWebRequest, which no version of .net standard before 2.0 supported.
  • Maestro API had upstream dependencies (eg. NetTopologySuite) that were not ported to .net standard.
  • More importantly, for the bits I was able to port across (ObjectModels), I couldn't run their respective (full-framework) unit test libraries from the VS test explorer due to cryptic assembly loading errors caused by the assembly manifests of the various piecemeal System.* assemblies not matching their assembly references. With no way to run these tests, the porting effort wasn't worth continuing.
Around this time, I heard of what the upcoming (at the time) .net standard 2.0 would bring to the table:
  • Over 2x the API surface of netstandard1.6, including key missing APIs needed by the Maestro API like the XML schema APIs and HttpWebRequest
  • A compatibility mode for full .net Framework. If this works as hoped, it means we can skip waiting on upstream dependencies like NetTopologySuite and friends needing to have netstandard-compatible ports and use the existing versions as-is.
Given the compelling points of .net standard 2.0 and mixed results with porting to the (then) current iteration on .net standard, I decided to put these porting efforts on ice and wait until the time when .net standard 2.0 and its supporting tooling comes out.

Now that .net standard 2.0 and its supporting tooling are out, it was time to give this porting effort another try ... and I could not believe how much less painful the whole process was! This was basically all I had to do to port the following libraries to .net standard 2.0:

Preparation Work

To be able to use our (ported to .net standard 2.0) MaestroAPI in the (full framework) Maestro windows application, we first needed to re-target all affected project files to .net Framework 4.6.1, as this is the minimum version of the full .net Framework that supports .net standard 2.0.
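For the old-style (full framework) project files, this is just a matter of bumping the target framework version property, something along these lines:

<PropertyGroup>
  <TargetFrameworkVersion>v4.6.1</TargetFrameworkVersion>
</PropertyGroup>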

OSGeo.FDO.Expressions

This is a class library that uses the Irony grammar parser to parse FDO expression strings into an object-oriented form. Maestro uses this library to analyze FDO expressions for validation purposes (eg. checking that an FDO expression doesn't reference a property that doesn't exist).

My process of converting the existing full framework csproj file to .net standard was to basically just replace the entire contents of the original csproj file with the minimum required content for a .net standard 2.0 class library.


<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netstandard2.0</TargetFramework>
  </PropertyGroup>
</Project>

That's right, the content of a minimal .net standard 2.0 class library is just 5 lines of XML! All .cs files are now implicitly included when building this project, which greatly contributes to the simplicity of the new csproj format.

Now obviously this project file as-is won't compile, as we need to reference Irony and use VS2017 to regenerate the resx string bundles and link in the shared assembly info files. After those changes were made, the project builds, with the only notable warning being NU1701, which is the warning emitted by the new tooling when we reference full-framework libraries/packages from a netstandard2.0 class library (something the new tooling allows us to do for compatibility purposes).
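(As an aside, under the new csproj format a dependency like Irony is just a PackageReference entry; the version shown here is indicative only.)

<ItemGroup>
  <PackageReference Include="Irony" Version="0.9.1" />
</ItemGroup>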

It was around this time that I discovered that someone had made a netstandard-compatible port of Irony, so we replaced the existing Irony reference with that netstandard-compatible port instead. This library was now fully ported across.

ObjectModels

This is the class library that describes all of our XML resources in MapGuide as strongly-typed classes with full XML (de)serialization support to and from both forms at various schema versions.

The original porting attempt targeted netstandard 1.6. While this was mostly painless, I had to reference tons of piecemeal System.* nuget packages, which then flowed down to anything that was referencing it.

For this attempt, we target .net standard 2.0 using the same technique of pasting a minimal netstandard2.0 class library template into the existing csproj file. Like the previous attempt, building this project failed due to dependencies on System.Drawing as a result of usages of System.Drawing.Font. Further analysis showed that we were using Font as a glorified DTO, so it was just a case of adding a new type that carries the same properties we were capturing with the System.Drawing.Font objects being passed around.
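A minimal sketch of the idea (the type name and property set here are illustrative, not the actual ObjectModels class):

//Hypothetical DTO that stands in for System.Drawing.Font, which was only ever used to carry font settings around
public class FontInfo
{
    public string Name { get; set; }
    public float SizeInPoints { get; set; }
    public bool Bold { get; set; }
    public bool Italic { get; set; }
    public bool Underline { get; set; }
}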

Because the NETStandard.Library metapackage is referenced by default, this attempt did not require referencing piecemeal System.* nuget packages like the previous attempt did. So that's another library ported across.

MaestroAPI

Now for the main event. Maestro API needed to be netstandard-compatible, otherwise this whole porting effort would be a waste. The previous attempt (to target netstandard1.6) was cut short because APIs such as the XML schema support were not there. In .net standard 2.0 these missing APIs are back, so porting MaestroAPI across should be a much simpler affair.

And indeed it was.

Just like the ObjectModels porting effort, we hit some snags around references to System.Drawing. Unlike ObjectModels, we were using full-blown Images and Bitmaps from System.Drawing, not just things like Font that we were merely using to sling font information around.

To address this problem, a new full-framework library (OSGeo.MapGuide.MaestroAPI.FxBridge) was introduced, to which the classes using these incompatible types were relocated. There were also service interfaces that returned System.Drawing.Image objects (IMappingService). These APIs have been modified to return raw System.IO.Stream objects instead, with the FxBridge library providing extension methods to "polyfill" the old image-returning APIs. Thus, code that used the affected APIs can just reference the FxBridge library in addition to MaestroAPI and work as before.
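To illustrate the "polyfill" idea (the method names, parameters and namespaces below are assumptions, not the actual FxBridge signatures), such an extension method essentially just wraps the Stream-returning API:

using System.Drawing;
using System.IO;
using OSGeo.MapGuide.MaestroAPI.Mapping;
using OSGeo.MapGuide.MaestroAPI.Services;

//Hypothetical sketch of an FxBridge-style polyfill: names are assumptions, not the real signatures
public static class MappingServiceImageExtensions
{
    //The netstandard MaestroAPI method is assumed to return a raw Stream; this extension
    //rehydrates it into a System.Drawing.Image for full-framework callers wanting the old shape
    public static Image GetMapImage(this IMappingService mappingSvc, RuntimeMap map, int width, int height)
    {
        using (Stream stream = mappingSvc.RenderMap(map, width, height)) //assumed Stream-returning API
        {
            //Copy out of the source stream so the image doesn't depend on the stream staying open
            return new Bitmap(Image.FromStream(stream));
        }
    }
}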

After sectioning off these incompatible types to the FxBridge library, the next potential roadblock in our porting efforts was our upstream dependencies. In particular, we were using NetTopologySuite, GeoAPI and Proj.NET to give Maestro API a strongly-typed geometry model and some basic coordinate system transformation capabilities. These were all full framework packages, meaning our previous porting attempt (to target netstandard1.6) was stopped in its tracks.

Because netstandard2.0 has a full-framework compatibility shim, we were able to reference these existing packages with the standard NU1701 compatibility warnings spat out by NuGet. However, since the previous porting attempt, the authors of NetTopologySuite, GeoAPI and Proj.NET have released netstandard-compatible (albeit prerelease) versions of their respective libraries, so as a result we were able to fully netstandard-ify all our dependencies as well.

However, we had to turn off strong naming of our assembly in the process because our upstream dependencies did not offer strong-named netstandard assemblies.

And with that, the Maestro API was ported to .net standard 2.0

MaestroAPI HTTP Provider

However, the Maestro API would not be useful without a functional HTTP provider to communicate with the mapagent. So this library also needed to be netstandard-compatible.

The previous porting attempt (to netstandard1.6) was roadblocked because the HTTP provider uses HttpWebRequest to communicate with the mapagent. While we could have just replaced HttpWebRequest with the newer HttpClient, that would have required a full async/await-ification of the whole code base and then dealing properly with the leaky abstractions known as SynchronizationContext and ConfigureAwait, to ensure our async/await-ified HTTP provider is usable in both ASP.net and desktop windows application contexts without deadlocking on one or the other.
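To illustrate the concern (this is a sketch, not provider code; the method names are made up):

using System.Net.Http;
using System.Threading.Tasks;

public class HttpClientSketch
{
    private static readonly HttpClient _http = new HttpClient();

    //The existing MaestroAPI surface is synchronous, so a naive HttpClient swap ends up blocking on async code
    public string GetResourceXml(string url)
    {
        //Blocking with .Result on a UI thread or in a classic ASP.net request context deadlocks
        //if the awaited code tries to resume on that same captured SynchronizationContext
        return GetResourceXmlAsync(url).Result;
    }

    private static async Task<string> GetResourceXmlAsync(string url)
    {
        //Every await in the chain needs ConfigureAwait(false) to avoid capturing the context;
        //miss one and the synchronous wrapper above can deadlock
        return await _http.GetStringAsync(url).ConfigureAwait(false);
    }
}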

While having a fully async HTTP provider is good, I wanted to have a functional one first before undertaking the task of async/await-ifying it. The development effort involved was such that it was better to just wait for .net standard 2.0 to arrive (where HttpWebRequest was supported) than to try to modify the HTTP provider to use HttpClient.

And just like the porting of the ObjectModels/MaestroAPI projects, this was a case of taking the existing csproj file, replacing the contents with the minimal netstandard class library template and manually adding in the required references and various settings until the project built again.

Caught in a snag

So all the key parts of the Maestro API had been ported across to .net standard 2.0 and the code all built, so now it was time to run our unit tests to make sure everything was still green.

All green they were indeed. All good! Now to run the thing.

Most things seemed to work until I validated a Map Definition and got this message.



Assembly manifest what? I have no idea! This error is also thrown when I use any part of the MaestroAPI that uses NetTopologySuite -> GeoAPI.

My first port of call was to look at this known issue and try all the workarounds listed (the project settings involved are sketched after the list):
  • Force all our projects to use PackageReference mode for installing/restoring nuget packages
  • Enable automatic binding redirect generation on all executable projects
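For the record, those two workarounds boil down to project settings along these lines (shown here out of context):

<PropertyGroup>
  <!-- Workaround 1: restore nuget packages via PackageReference instead of packages.config -->
  <RestoreProjectStyle>PackageReference</RestoreProjectStyle>
  <!-- Workaround 2: have MSBuild generate binding redirects for the executable project -->
  <AutoGenerateBindingRedirects>true</AutoGenerateBindingRedirects>
  <GenerateBindingRedirectsOutputType>true</GenerateBindingRedirectsOutputType>
</PropertyGroup>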
After trying these workarounds, the assembly manifest errors still persisted. At this point I was stuck and was on the verge of giving up on this porting effort until some part of my brain told me to take a look at the assemblies that were in the output directory.

Since the error in question referred to GeoAPI.dll, I thought I'd crack that assembly open in ILSpy and see what interesting information I could find about it.



Well this was certainly most interesting! Why is a full-framework GeoAPI.dll being copied out? The only direct consumer of GeoAPI (OSGeo.MapGuide.MaestroAPI.dll) is netstandard2.0, and it is referencing the netstandard target of GeoAPI.

Here's a diagram of what I was expecting to see:



After digging around some more, it appears from observation that there is a bug (or is it a feature?) in MSBuild where, given a nuget package that offers both netstandard and full-framework targets, it will prefer the full-framework target over the netstandard one. This means that in the case of GeoAPI, because our root application is a full-framework one, MSBuild chose the full-framework target offered by GeoAPI instead of the netstandard one.

So what's the assembly manifest error all about? The FusionLog property of the exception reveals the answer.



GeoAPI is strong-named for full-framework. GeoAPI is not strong-named for netstandard. The assembly manifest error arises because our netstandard-targeting MaestroAPI references the netstandard target of GeoAPI (not strong-named), but because our root application is a full-framework one, MSBuild gave us the full-framework GeoAPI assembly instead. At runtime, .net could not reconcile that a strong-named GeoAPI was being loaded when our netstandard-targeting MaestroAPI references the netstandard GeoAPI, which is not strong-named. Hence the assembly manifest error.

Multi-targeting for the ... win?

Okay, so now we know why it's happening, what can we do about it? Well, the other major thing that the new MSBuild and csproj file format gives us is the ability to easily multi-target the project for different frameworks and runtimes.

By changing the TargetFramework element in our project to TargetFrameworks (plural) and specifying a semi-colon-delimited list of TFMs, we now have a class library that can build for each one of the TFMs specified.

For example, a netstandard 2.0 class library like this:

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netstandard2.0</TargetFramework>
  </PropertyGroup>
</Project>

Can be made to multi-target like this:

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFrameworks>netstandard2.0;net461</TargetFrameworks>
  </PropertyGroup>
</Project>

If MSBuild insists on giving us full-framework dependencies when given the choice between full-framework and netstandard (even when both are compatible), then the solution is basically to multi-target the MaestroAPI class library so that we offer 2 flavors of the assembly:
  • A full-framework one (net461) that will be selected by MSBuild if the consuming application is a full-framework one.
  • The netstandard one (netstandard2.0) that will be selected by MSBuild if the consuming application is .net Core, Xamarin, etc.
Under this setup, MSBuild will choose the full-framework Maestro API over the netstandard one when building the Maestro windows application. Since we're now building for multiple frameworks/runtimes and explicitly targeting full-framework again, we can re-activate strong naming on the full-framework (net461) target, ensuring the full-framework dependency chain of MaestroAPI is fully strong-named (as it was before we started this porting effort). With that, our assembly manifest error goes away when running unit tests and the Maestro application itself whenever we hit functionality that uses GeoAPI/NetTopologySuite.
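In csproj terms, that re-activation is just a matter of conditioning the signing properties on the target framework (the key file name here is illustrative):

<PropertyGroup Condition="'$(TargetFramework)' == 'net461'">
  <SignAssembly>true</SignAssembly>
  <AssemblyOriginatorKeyFile>maestroapi.snk</AssemblyOriginatorKeyFile>
</PropertyGroup>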

So the problem is effectively solved, but the whole process feels somewhat anti-climactic.

I mean ... the whole premise of .net standard and why I wanted to port MaestroAPI to target it was the promise of one unified target (an interface if you will) with many supporting runtimes (ie. various implementations of this interface). Target the standard and your class library will work across the supporting runtimes, in theory.

Unfortunately, in practice, strong-naming (and MSBuild choosing full-framework targets over netstandard ones, even when both are compatible) was the leaky abstraction that threw a monkey wrench into this whole concept, especially when some targets are strong-named and some are not. Having to multi-target the Maestro API as a workaround feels unnecessary.

But at the end of the day, we still achieved our goal of a netstandard-compatible Maestro API that can be used in .net Core, Xamarin, etc. We just had to take a very long detour to get from A to B, and all I could think was: was this (multi-targeting) absolutely necessary?

Some Changes and Compromises

Although we now have .net standard and full framework compatible versions of the Maestro API, we had to make some changes and compromises around the developer and acquisition experience for this to work in a cross-platform .net world.

1. For reasons previously stated, we had to disable strong-naming of the Maestro API for the .net standard target. This is brought upon us by our upstream dependencies (the netstandard flavors of GeoAPI and NetTopologySuite), which we can't do anything about. The full framework target, however, is still strong-named as before.

2. The SDK package in its current form will most likely go away. This is because turning Maestro API into a .net standard library forces us to use nuget packages as the main delivery mechanism, which is a good thing, because nobody should be manually referencing assemblies for consuming libraries in this day and age. The tooling now is just so brain-dead simple that we have no excuse not to make nuget packages. No SDK package also means that we can look at alternative means of generating API documentation (docfx looks like a winner) instead of Sandcastle, as making CHM files is kind of pointless and the only reason I made them was to bundle them with the SDK package.

The sample code and supporting tools that were previously part of the SDK package will be offloaded to a separate GitHub repository that I'll announce in due course. I'll need to re-think the main ASP.net code sample as well, because the old example required:

  • Manually setting up a web application in local IIS (not IIS Express)
  • Manually referencing a whole bunch of assemblies
  • Needing to run Visual Studio as administrator to debug the code sample due to the local IIS constraint.

These are things that should not be done in 2017!

3. Because nuget packages are the strongly preferred way of consuming libraries, it meant that having the HTTP provider as a separate library just complicates things (having to register this provider in ConnectionProviders.xml and automating it when installing its theoretical nuget package). The Maestro API on its own is pretty useless without the HTTP provider anyways, so in the interest of killing two birds with one stone, the HTTP provider has been integrated into the Maestro API assembly itself. This means that you don't even need ConnectionProviders.xml unless you need to use the (mg-desktop wrapper) local connection provider, or you need to use a (roll your own wrapper around the official MapGuide API) local-native connection provider.

4. The CI machinery needed some adjustments. I couldn't get OpenCover to work against our newly ported netstandard libraries using (dotnet test) as the runner, so I had to temporarily disable the OpenCover instrumentation while the unit tests ran in AppVeyor. But as a result of needing to multi-target MaestroAPI (for reasons already stated), I decided on this CI matrix:

  • Use AppVeyor to run the Maestro API unit tests for the full-framework target on Windows. Because we're running the tests under a full-framework runner, the OpenCover instrumentation can be restored, allowing us to upload code coverage reports again to coveralls.io
  • Use TravisCI to run the Maestro API unit tests for the netstandard target under .net Core 2.0 on Linux. The whole motivation for netstandard-izing MaestroAPI was to get it to run on these non-windows .net platforms, so let TravisCI handle and verify/validate that aspect for us. We have no code coverage stats here, but surely they can't be radically different from the stats we would have had running the same test suite on Windows with OpenCover instrumentation.
Where to from here?

Now that the porting efforts have been completed, the next milestone release should follow shortly. 

This milestone will probably only concern the application itself as the SDK story needs revising and I don't want that to hold up on a new release of Maestro (the application).

A simpler MgCooker tile seeding process

I don't know if you've ever seen the guts of the tile seeding code used by MgCooker; it isn't the prettiest of things, but for the most part it works.

Besides some cosmetic restructuring of the code, I haven't really touched this part of Maestro ever.

Consider the history of this tiling code. It originated around 2009. Things we now take for granted like async/await and the Task Parallel Library probably didn't exist around that time, so you had no choice but to dive deep into wait handles, auto-reset events and manual thread management.

If I had to write MgCooker from scratch today, I'd probably cook up (pun intended) something like this:

using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Threading;
using System.Threading.Tasks;
...
public class TileSeeder
{
    public void SeedTiles()
    {
        List<(int row, int col, int scale)> tiles = ComputeTileRequestList();
        int total = tiles.Count;
        int rendered = 0;
        var sw = new Stopwatch();
        sw.Start();

        //The magic sauce that multi-threads our tile seeding and takes care of all our threading concerns for us!
        Parallel.ForEach(tiles, (tile) => 
        {
            //Send a HTTP request to GETTILE mapagent API with tile.row, tile.col and tile.scale
            ...
            Interlocked.Increment(ref rendered);
            Console.WriteLine($"Rendered {rendered}/{total} tiles");
        });

        //Parallel.ForEach blocks until all tiles are processed, so if we get to this point, the tiling process has finished.
        sw.Stop();
        Console.WriteLine($"Rendered {rendered} tiles in {sw.Elapsed}");
    }
}

Isn't this much easier to read and comprehend?

The implementation of the ComputeTileRequestList method referenced here is omitted for brevity, but we can just reuse what is in the current iteration of MgCooker. Most of the settings in MgCooker mainly affect the generation of the row/col/scale list anyway.

The core multi-threaded "render/cache all these tiles" work is just one simple Parallel.ForEach call, baked right into the .net Framework itself!

MgCooker is overdue for a rewrite anyways. I just didn't really think it would be so conceptually simple with today's .net libraries and C# language constructs!

Saturday, 9 September 2017

Not all quiet on the MapGuide Maestro front

Just because MapGuide Maestro is a Windows-only .net application does not mean the Maestro API driving it (and your own custom MapGuide applications using the Maestro API) needs to bear the same platform constraint.

We have .net Core to run .net code cross-platform. We have .net standard as a specification for building truly cross-platform .net class libraries. The API surface used by the MaestroAPI largely overlaps with what is provided by .net standard. All the signs say that we have the means to make MaestroAPI a truly cross-platform, .net standard compatible class library, and that we should probably do it.

So since the last release, my main focus has been to port the Maestro API to target .net standard 2.0, thus expanding its potential reach beyond the full .net Framework on Windows.

It is most encouraging to see this on a Linux terminal after all this effort.



The tale of getting to this point is worthy of a blog post itself. The internet could do with more .net standard porting stories.

Friday, 8 September 2017

Announcing: mapguide-react-layout 0.10

Here's a new release of mapguide-react-layout with a metric bucketload of new features.

QuickPlot now feature complete

As blogged previously, the final missing piece of QuickPlot (capture box rotation) has been implemented, bringing this component to feature parity with the Fusion original. The rotation is controlled through a new numerical slider in the QuickPlot UI.


Selection Panel sub-selection

Through creative use of QUERYMAPFEATURES, the Selection Panel can now support highlighting selected features within a selection set. Useful for visually sifting through big selections.



Persistent Load Indicator

A user raised a very valid point that if you don't have the navigator (aka. zoom slider) active, there is no load indicator visible, as the only place we have one is on the navigator component itself.

This release adds a persistent load indicator that is represented by a thin blue animated bar at the top.

Blink and you might miss it in the GIF below.


Coordinate Tracker

This release includes the missing Coordinate Tracker widget.


Init Warnings

Any warnings that we encounter during viewer initialization are now collected and displayed at the end. Here are some warnings that could be shown:



API enhancements

  • NPM module: Now supports custom redux application state and reducers. Check out the updated example to see how this is done.
  • NPM module: Selection Panel supports custom rendering of the body when a selected feature is to be displayed (if you don't like the default table-based attribute display)
  • NPM module: There is now a flat "mapguide-react-layout" module where you can import everything from, instead of having to import everything piecemeal.
  • Browser Globals: You can now specify locale as a mount option
  • Browser Globals: You can now specify a post-init hook function as a mount option
  • Browser Globals: Redux getState and dispatch functions exposed through MapGuide.Application class
  • Browser Globals: All dispatchable redux actions available under MapGuide.Actions namespace
  • Browser Globals: You can now retrieve and invoke registered commands through getCommand()
Other Changes
  • OpenLayers updated to 4.3.2
  • Blueprint updated to 1.27
  • Now built with TypeScript 2.5
  • Removed es6-promise from the viewer bundle. All template HTML files now include the es6-promise polyfill via a separate script tag so that the viewer will continue to work in Internet Explorer
  • Ported across the view size status bar component
  • Full extension property support for the CursorPosition widget


Project Home Page
Download
mapguide-react-layout on npm

Monday, 28 August 2017

React-ing to the need for a modern MapGuide viewer (Part 19): Highlighting selected features

When I was implementing the Selection Panel for mapguide-react-layout, this super-ancient Fusion ticket was in the back of my mind, as being able to highlight selected features is a very useful feature to have.

Whereas I didn't really have an idea of how this could be done in Fusion, for mapguide-react-layout I did have an idea.

The trick is to use the v2.6 QUERYMAPFEATURES operation with the following parameters:

  • REQUESTDATA=2 (asking for inline selection image only)
  • LAYERATTRIBUTEFILTER=0 (so that selection is not constrained by layer selectability and our current view)
  • FEATUREFILTER=(the selection XML sub-fragment that represents the selected feature)
  • PERSIST=0 (the most important parameter. We want this operation to not alter the selection set)
Once the QUERYMAPFEATURES operation is sent and we get a response, the key is to be able to line up the inline selection image (data URI) with the current map. To handle this part, we use an OL static image source to store the inline selection image.
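A rough sketch of the idea (the helper names and the response property names here are assumptions; the real viewer code differs):

//Sketch: highlight a single selected feature without disturbing the real selection set
function highlightFeature(mapAgentUrl: string, session: string, mapName: string, featureXml: string): void {
    const params: { [key: string]: string | number } = {
        OPERATION: "QUERYMAPFEATURES",
        VERSION: "2.6.0",
        SESSION: session,
        MAPNAME: mapName,
        REQUESTDATA: 2,            //inline selection image only
        LAYERATTRIBUTEFILTER: 0,   //don't constrain by layer selectability or the current view
        FEATUREFILTER: featureXml, //the selection XML sub-fragment for the selected feature
        PERSIST: 0,                //most important: do not alter the selection set
        FORMAT: "application/json"
    };
    const qs = Object.keys(params).map(k => `${k}=${encodeURIComponent(String(params[k]))}`).join("&");
    fetch(`${mapAgentUrl}?${qs}`)
        .then(r => r.json())
        .then(res => {
            //Assumed response shape: a mime type plus base64 content for the inline selection image
            const img = res.FeatureInformation.InlineSelectionImage;
            const source = new ol.source.ImageStatic({
                url: `data:${img.MimeType};base64,${img.Content}`,
                imageExtent: getCurrentViewExtent(), //hypothetical helper: the extent the image lines up with
                projection: getMapProjection()       //hypothetical helper: the map's projection
            });
            selectionHighlightLayer.setSource(source); //hypothetical OL image layer reserved for highlights
        });
}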

When all the pieces are brought together, we finally have the ability to highlight selected features.


This will be in the next release of mapguide-react-layout.

Thursday, 10 August 2017

React-ing to the need for a modern MapGuide viewer (Part 18): Restoring the Quick Plot capture box

If you've been playing with mapguide-react-layout, you'll no doubt have noticed a glaring omission in the ported Quick Plot component.

The interactive map capture box.


Porting across the capture box sounded mostly simple:

  • Manage the temporary OL vector layer containing the capture box.
  • Attach an OL translate interaction so the capture box feature can be easily moved around by dragging the box around.
  • Auto-update box geometry based on change in paper size / orientation / scale
There was just one technical hurdle: Rotating the box.

For the longest time, I was racking my brain figuring out how to replicate the rotation "grip handle" using the new OpenLayers APIs, and I couldn't figure it out. The APIs were there to rotate feature geometry. I heard about ol-rotate-feature and gave it a try, but I couldn't grasp how that interaction actually rotates features.

So I went back to the mental drawing board. OpenLayers has APIs to rotate features, so what we really needed was an intuitive UI input to enter the box rotation. If I couldn't figure out how to replicate the rotation "grip handle", then the next best thing was ... a numerical slider.
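Conceptually (the variable names here are made up, not the actual component code), the slider handler just rotates the capture box geometry by the delta from its previous value:

//Sketch: rotate the Quick Plot capture box feature when the rotation slider changes
function onRotationSliderChanged(captureBox: ol.Feature, prevDegrees: number, newDegrees: number): void {
    const geom = captureBox.getGeometry();
    if (!geom) {
        return;
    }
    //Rotate about the box centre; OL geometries rotate counter-clockwise in radians
    const center = ol.extent.getCenter(geom.getExtent());
    const deltaRadians = (newDegrees - prevDegrees) * Math.PI / 180;
    geom.rotate(deltaRadians, center);
}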

And with that, the missing piece was found and I was finally able to port over the final piece of functionality of the original Fusion QuickPlot widget.


In the above GIF, you might have noticed a message flash by.


This is merely a simple countermeasure against the fact that with current OpenLayers, you can rotate the map as well. So rather than deal with number crunching 2 sets of rotations, it is easier to just reset the rotation on the map and prevent the ability to rotate the map while the map capture box is active.

And with that, our Quick Plot component has reached feature parity with the original Fusion widget.

This (now at feature parity) Quick Plot will be available in the next release of mapguide-react-layout, whenever I decide that will be.

Wednesday, 2 August 2017

Announcing: mapguide-react-layout 0.9.6

This is a quick release to make the awesome new layer transparency sliders from the previous release not break down when used on a map that doesn't actually have external base layers (OSM, Bing, etc).

Project Home Page
Download
mapguide-react-layout on npm

Thursday, 27 July 2017

Announcing: mapguide-react-layout 0.9.5

This was originally going to be versioned 0.9.2, but the volume of changes was too big for a bugfix-level single patch version bump, while at the same time not enough to warrant a minor version bump, so I decided to go halfway with 0.9.5.

Here's what's new in this release.

Toggle-able Layer Transparency

The viewer options UI is now fleshed out to allow you to toggle transparency of:

  • The MapGuide map (and any tiled layer groups)
  • The MapGuide selection overlay
  • External Base Layers
To illustrate this, here's a self-explanatory GIF



This allows one to easily compare the MapGuide map against its base layer backdrops without requiring actual visibility toggling.

And yes, this works even on IE (11, the only version I care to support)

Sprite Icon Support

This release now supports the standard Fusion icon sprite. The viewer will no longer load the individual icons for commands and widgets if it is clear they reference the standard icon sprite.


Targeted Command Support

If a command or widget requires execution in a New Window or a specific frame, the viewer will now support it. Note that if a command or widget is set to execute in a New Window, we won't actually spawn a new physical browser window, we'll run it in an iframe inside a BlueprintJS dialog component instead.

Other Changes/Fixes
  • Added support for extension properties for Buffer, FeatureInfo, Query, Search, SelectWithin, Theme.
  • Fixed Fusion MapMessage bar emulation
  • Fixed tooltip queries not being sent with pixel-buffered polygon geometries
  • Fixed zoom requests not snapping the scale to the closest finite list if viewing a tiled map
  • Legend now properly renders layers with multiple geometry styles
  • Fix excessive BlueprintJS toaster components being created and not cleaned up
  • Fix flyout menus requiring double-click to re-open (after clicking a menu item inside the first time)

Friday, 21 July 2017

So ... where's MapGuide Open Source 3.2?

Here's the story, since I gather not everyone reads the mapguide-users mailing list where I mentioned this subject many months ago.

I've decided (many months ago) to skip on making this release.

The difference between 3.1 and a 3.2 release (if I had decided to make one) is so small that it isn't worth investing the build resources in a 3.2 release cycle.

Since I'm skipping the 3.2 release, we have a good year-long window of solid development time to get some compelling features into the release after it (currently slated as 3.3). Some of this development work is already starting to bear fruit.

Now that's not to say there isn't going to be a MapGuide release sometime between now and when 3.3 is out. I still do hope to put out the (hinted previously) patch releases for MGOS 2.6, 3.0 and 3.1 in between, but that requires me rebuilding my build infrastructure first and that is currently taking a back seat to landing some solid features into 3.3 first, so that's where things are at.

And as always, as these features land, you can expect this blog to talk about them.

Sunday, 9 July 2017

React-ing to the need for a modern MapGuide viewer (Part 17): Reason number 5537485 why react was the right choice

An issue cropped up where the legend was not properly rendering a given layer that has multiple geometry styles. This issue was easily reproducible with the Redline widget.

We were expecting to see this after creating a redline layer and drawing some objects.


But we got this instead


Because this legend is a react component, we can inspect it (and the problem layer node) with the React developer tools


Remember the important React motto: The UI is a function of props and state. The HTML content of the LayerNode should be reflective of the props and state given to it. We should've seen something that resembled 3 style icons. But nothing's there.

So let's just check that the layer model for this LayerNode component is indeed a layer with multiple geometry styles


Indeed it is, so that means that the LayerNode component is the culprit. It is not handling the case of multiple geometry styles properly.

As we've already set up our test infrastructure to make it easy to write and run tests, it should be easy to write up an enzyme unit test that shows what we were actually expecting to see when a LayerNode renders a layer that has multiple geometry styles

component.legend.spec.tsx


import * as React from "react";
import { shallow, mount, render } from "enzyme";
import { MapLayer } from "../src/api/contracts/runtime-map";
import { LayerNode } from "../src/components/legend";
import { ILegendContext } from "../src/components/context";

// Mocks the ILegendContext needed by LayerNode and other legend sub-components
function mockContext(): ILegendContext {
    return {
        getIconMimeType: () => "image/png",
        getStdIcon: (path: string) => path,
        getChildren: (id) => [],
        getCurrentScale: () => 1, //a fixed scale is fine for this mock
        getTree: () => {},
        getGroupVisibility: (group) => group.ActuallyVisible,
        getLayerVisibility: (layer) => layer.ActuallyVisible,
        setGroupVisibility: () => {},
        setLayerVisibility: () => {},
        getLayerSelectability: (layer) => true,
        setLayerSelectability: () => {},
        getGroupExpanded: (group) => true,
        setGroupExpanded: () => {},
        getLayerExpanded: (layer) => true,
        setLayerExpanded: () => {}
    };
}

describe("components/legend", () => {
    it("renders a multi-geom-style layer with a rule for each geom style", () => {
        const layer: MapLayer = {
            Type: 1,
            Selectable: true,
            LayerDefinition: "Session:841258e8-63f9-11e7-8000-0a002700000f_en_MTI3LjAuMC4x0AFC0AFB0AFA//testing.LayerDefinition",
            Name: "_testing",
            LegendLabel: "testing",
            ObjectId: "abcd12345",
            DisplayInLegend: true,
            ExpandInLegend: true,
            Visible: true,
            ActuallyVisible: true,
            ScaleRange: [
                {
                    MinScale: 0,
                    MaxScale: 10000,
                    FeatureStyle: [
                        {
                            Type: 4,
                            Rule: [
                                {
                                    Icon: "iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAYAAAAf8/9hAAAABHNCSVQICAgIfAhkiAAAAB1JREFUOI1j/M/A8J+BAsBEieZRA0YNGDVgMBkAAFhtAh6Zl924AAAAAElFTkSuQmCC"
                                }
                            ]
                        },
                        {
                            Type: 4,
                            Rule: [
                                {
                                    Icon: "iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAYAAAAf8/9hAAAABHNCSVQICAgIfAhkiAAAACVJREFUOI1jYBgFwwAwMjD8bcAjL8rAwKCNR56LibruGQVDFAAACkEBy4yPOpAAAAAASUVORK5CYII="
                                }
                            ]
                        },
                        {
                            Type: 4,
                            Rule: [
                                {
                                    Icon: "iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAYAAAAf8/9hAAAABHNCSVQICAgIfAhkiAAAAFhJREFUOI3t0D0OQFAUROHPIwoNS7ZJe5BoKJR+OpVH4jUkTjv3TDI3w6Z2zooNeSSfKNQYIwd3NISH6sFfQLCkFmQJdkVIGlG+44mfLyjMaCPpgO7C7tkBAXgKXzBhmUQAAAAASUVORK5CYII="
                                }
                            ]
                        }
                    ]
                }
            ]
        };
        const wrapper = shallow(<LayerNode layer={layer} />, {
            context: mockContext()
        });
        const rules = wrapper.find("RuleNode");
        expect(rules.length).toBe(3); //One for each geom style
    });
});

Running this in jest confirms our expectations were not met:

Summary of all failing tests
 FAIL  test\component.legend.spec.tsx (6.75s)
  ● components/legend › renders a multi-geom-style layer with a rule for each geom style

    expect(received).toBe(expected)

    Expected value to be (using ===):
      3
    Received:
      0

      at Object.<anonymous> (test/component.legend.spec.tsx:149:30)
      at Promise.resolve.then.el (node_modules/p-map/index.js:42:16)
      at process._tickCallback (internal/process/next_tick.js:103:7)


Test Suites: 1 failed, 22 passed, 23 total
Tests:       1 failed, 94 passed, 95 total
Snapshots:   0 total
Time:        14.532s
Ran all test suites.

We expected 3 RuleNode components (one for each geometry style) to have been rendered, but we got none.

A look at the LayerNode rendering shows why. It only considered the first feature style of any layer's scale range.

So once that was fixed, not only does our test pass, but we have visual confirmation that multi-geometry-style layers now render like they did in the Fusion and AJAX viewers.


So the reason for writing this post was just a re-affirmation of my choice of React for building this viewer.

  • The top-quality developer/debugging experience.
  • The react way of thinking about UIs that allowed me to easily identify the culprit (the LayerNode component)
  • The top-quality testing ecosystem around React (Jest, enzyme) that allowed me to easily write a unit test on this component to confirm and verify my expectations