Medical 3D and Maya: Coronary Arteries

I’ve used 3DS Max for years to develop some of the images on fpnotebook.com, such as this  image.

The 3D packages tend to be expensive and have steep learning curves, and I had already invested in an expensive 3D human model for 3DS Max, so I had resisted updating my 2010 version of 3DS Max. However, I was blown away by Emily Holden’s Lynda.com course, Medical Animations in Maya. I’ve also found her website’s Photoshop and Illustrator pearls for medical illustration very helpful. I had been considering moving to Maya for several years, since it seems to be the direction Autodesk is heading, and this seemed like a good time to learn Maya.

One of the problems in moving to Maya is migrating the expensive 3D human model from 3DS Max. These models transfer to Maya fairly well for rendering via FBX files, but I had loads of trouble working with their geometry, since they are 3DS Max triangle meshes rather than Maya quad meshes.

After having completed several online Maya courses, my first independent goal was to draw arteries on the surface of a pre-existing heart model.  After several fruitless attempts, here is the successful recipe I used:

  1. Export FBX from 3DS Max with quad geometry (based on this forum response):
    1. Select the geometry in 3DS Max and convert it to an Editable Poly
    2. Export to FBX, and uncheck “preserve edge orientation” in the dialog box
  2. Create a curve on surface in Maya
    1. Duplicate the target surface and make the surface live
      1. This may not be necessary
    2. Choose one of the curve drawing tools
    3. Draw the course of the coronary artery on the surface
    4. Turn off live surface when done
  3. Extrude a circle along the surface curve
    1. Create a NURBS circle with the diameter of the artery
    2. Rotate the circle to be perpendicular to the start of the curve
      1. This is important – the artery will be irregular if the angle is not 90 degrees
      2. Alternatively, you could draw the curve first in 2D and then drop it onto the surface
    3. Align the circle center vertex with the curve end control vertex
      1. Select both the circle and curve
      2. Right click on the curve and select “control vertex”
      3. Select the control vertex that is at the start of the extrude
        1. Should be at the end of the curve that was drawn first
        2. If starting at the opposite end (and the result renders black), reverse the normals
      4. Choose the move tool (W)
      5. Hold down the “V” Key to snap to points
      6. Move the circle and it will align with the start point
    4. Open the Surfaces > Extrude dialog
      1. Style: Tube
      2. Result position: At Path
      3. Orientation: Profile Normal
      4. Output Geometry: Polygons
      5. Type: Quads
      6. Tessellation: Standard Fit
    5. Modify the extruded curve
      1. Rename as coronary artery
      2. Can modify the extrude in the channel box editor
        1. Set “Use Component Pivot” to Component Pivot
      3. Right click and Assign New Material

 

Addendum. I had a few complex 3DS Max meshes that needed conversion to poly quads, and the simple approach above did not work. Here is a MAXScript I created that automates the process, based on this knowledge base page:

--script for 3dsMax to convert mesh to polys, and polys to quads
--export this to fbx
select objects
turnToPoly = Turn_to_Poly ()
turnToPoly.limitPolySize = on
turnToPoly.maxPolySize = 4
modPanel.addModToSelection (turnToPoly) 
modPanel.addModToSelection (Edit_Poly ()) 
maxFileName --copy this and paste in export dialog
-- OpenFbxSetting()
Posted in Uncategorized

NDC Minnesota

I have for some time been working on implementing federated sign-in with Identity Server in ASP.NET Core. I have found this challenging, but my project has slowly been coming together. The Identity Server documentation and GitHub samples are excellent, as are the courses on Pluralsight, but the topic is complicated, and the best source is Brock Allen and Dominick Baier, the framework’s developers. In the last couple of years, I noticed that the Norwegian Developers Conference (NDC) not only had two-day Identity Server workshops (by its developers), but attracted a long list of my favorite speakers (e.g. Scott Allen). Unfortunately, the conference had only been held in Oslo, London, and Australia. While I was waiting for winter to end this February, browsing Reddit, I saw an advertisement for something called “NDC Minnesota”. I thought that must be a knock-off conference; it couldn’t possibly be the real NDC, only 20 miles from my home.

Well, it was the real NDC, and I write this after attending four days at the conference: two days of Brock Allen’s Identity Server workshop and two days of general sessions. The workshop was excellent and set me up to complete my Identity Server implementation. Brock stepped through each part of authentication and authorization, with associated labs and great commentary on what not to do.

The main conference was also excellent – great speakers, conversation, food, and location. Best of all, the small attendance (300+ instead of Oslo’s 2000+) at this first year of NDC in Minnesota (held at the same time as MS Build and Google I/O) allowed for more conversation with and access to speakers, with no crowds.

If ngConf, a few weeks ago, was abundant in ngRx state-management lectures, NDC Minnesota was abundant in microservices. The attraction appears to be their disconnected, maintainable, scalable, modifiable units. Udi Dahan, developer of NServiceBus, held a two-day pre-conference workshop here. I attended a couple of his lectures during the general conference, and I am still trying to get my head around vertical slicing of data and business rules… instead of sending the client everything, a given page sends out a message, “I’m about to deliver content with X, Y, Z”, and any subscribing suppliers of that info respond asynchronously with the data. A price API, locale API, and product-description API may all reply with separate streams of data that are concatenated as the content is delivered.

I cannot afford NServiceBus, but container-based nodes might be more in my price range. Since my backend is written in ASP.NET Core and I’m on a Microsoft server, I have so far had no use for Docker containers. However, there are several frameworks, including Kubernetes, that allow for automated management of multiple container instances. I send a master controller a manifest, letting it know to spin up a few nodes, each node with a few pods, each of which contains one or more containers. One pod might be an API, another pod a database, and another the frontend. If one pod fails, the controller replaces it with another. Scaling up might simply mean adding a few extra pods. Bridget Kromhout led a few-hour workshop on Kubernetes that I was able to follow. It may not yet be ready for production (at least by mere mortals), but stay tuned.
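The manifest handed to the controller can be sketched as a minimal Kubernetes Deployment (the names and image below are placeholders; real manifests add services, health probes, and so on):

```yaml
# Ask the controller for 3 replicas of a pod running one container.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: product-api
spec:
  replicas: 3                # scaling up is just raising this number
  selector:
    matchLabels:
      app: product-api
  template:
    metadata:
      labels:
        app: product-api
    spec:
      containers:
        - name: product-api
          image: myregistry/product-api:1.0   # placeholder image
          ports:
            - containerPort: 80
```

If a pod dies, the Deployment controller notices the replica count is short and schedules a replacement automatically.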

Finally, Steve Sanderson (of Knockout fame) introduced a Microsoft project, Blazor, currently in alpha; I am very hopeful that Microsoft pursues this fully. Use C# for both frontend and backend, with some shared code, and the client code compiled to run in WebAssembly. Very cool.

NDC Minnesota 2018 was a great conference.  I hope to return next year.

 


Redirecting URLS to HTTPS and Non-WWW

Behemoth tech companies have a way of stimulating change, in some cases for the better. Flash met its early end when Apple would not allow it to run on iOS.

In March of 2018, now only a week away, Google will start marking non-HTTPS (non-SSL) websites as insecure. Although I bought an SSL certificate nearly two years ago, I have been slow to make the transition to HTTPS. The barrier has mostly been fixing everything that would break under HTTPS. Every link, every source file in static folders, ASP.NET legacy and Core applications, Angular applications… all had to use the host name //fpnotebook.com. Other versions of the protocol and host name (http://www.fpnotebook.com, https://www.fpnotebook.com, http://fpnotebook.com) would break.

In mid-February, I started fixing all the code. FPNotebook’s three web versions are all compiled, so the underlying code for each took some clean-up. Then there was the backend search and other functionality relying on ASP.NET. Unfortunately, this was all legacy code that I could not get to recompile, so I recreated it in ASP.NET Core. Then there were all 15-20 web apps.

When I finally finished fixing all the links, I thought I was done with the hard part – now simply flip the switch and have all web traffic redirect to https://fpnotebook.com.

This turned out to be much more frustrating than I could have imagined. I found various examples using IIS URL redirect rules, such as this one: StackOverflow

However, these solutions only worked in some cases, and when they did not (specifically, https://www.fpnotebook.com would not redirect to https://fpnotebook.com), they would leave the user with partially broken webpages.

I could redirect http://mywebsite.com to https://mywebsite.com
I could redirect http://www.mywebsite.com to https://mywebsite.com
But I could NOT redirect https://www.mywebsite.com to https://mywebsite.com

I tried multiple URL rewrite/redirect rules in IIS using the examples above, other Stack Overflow options, MSDN, IIS blogs and Scott Forsyth’s blog and Pluralsight’s course.

 

The solution for me was to create a new, dummy “www redirect” website in IIS.
I set its bindings in IIS to handle http://www.mywebsite.com and https://www.mywebsite.com. This dummy website has only one file, a web.config.
In the web.config, a simple rewrite rule:

<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <clear />
        <rule name="Redirect all traffic to https non-www mywebsite.com" stopProcessing="true">
          <match url="(.*)" />
          <action type="Redirect" url="https://mywebsite.com/{R:1}" redirectType="Permanent" appendQueryString="true" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>

 

Then, in the main website, change the bindings so they no longer handle the www hostnames for http and https (allowing these to flow to the dummy site).
Now, all traffic reaching the main site will be non-www.
All that is needed is a simple web.config redirect rule from HTTP to HTTPS:

<rule name="Redirect to https" stopProcessing="true">
  <match url="(.*)" />
  <conditions>
    <add input="{HTTPS}" pattern="off" ignoreCase="true" />
  </conditions>
  <action type="Redirect" url="https://mywebsite.com/{R:1}" redirectType="Permanent" appendQueryString="true" />
</rule>

Thanks to Bryan C. at Rackspace for helping me find this solution.  Thanks to Bryan, I still have a few hairs that I did not pull out prior to finding this solution.


Solution: Entity Framework Core and Net Core Class Libraries 2.0

First, to WordPress – thanks for letting me blog for free here, but your formatting of programming code truly is awful, and the WYSIWYG-to-HTML translation easily corrupts pasted code. There are plugins that fix your inadequacies, WordPress, but since plugins are now only part of an expensive monthly subscription, I am not using them. If anyone has a recommendation for pasting code into WordPress without plugins, or for using plugins with the free WordPress tier, please let me know.

On to my Microsoft complaints.

Having completed an Identity Server 4 implementation, I am now able to return to my main project: an Angular frontend with an ASP.NET Core backend. In the first iteration of the project, I used a single ASP.NET Core project containing the data/EF entities, the Web API, the MVC app (for backend admin), and the Angular app build copied into the wwwroot folder of the same project.

However, I felt this was too much for a single ASP.NET Core project, and I wanted to split the MVC app from the Web API. This required creating a new class library containing a data project that could be consumed by both the MVC and Web API projects.

With VS 2017 and ASP.NET Core 2.0, the tooling has improved significantly, so setting up a .NET Core class library is straightforward (prior versions required setting this up from the dotnet CLI). My only complaint on initial setup is that projects are created outside the “src” directory (i.e. Solution Explorer shows the project under the “src” folder, but not on the file system). I’ve seen this bug since I started using ASP.NET Core, and hope it will someday be fixed.

I’ve watched Julie Lerman’s Pluralsight course EF Core: Getting Started (excellent as usual), including her setup of data class libraries. She separates the domain models into one class library project and the EF data project into another. I’ve also seen Shawn Wildermuth caution against too many projects (focusing projects on individually deployable units). I’ve chosen to create a single data project containing the domain models, the EF entities/migrations, and the repositories (with the MVC/Web API projects containing the view models/DTOs).

There are gotchas in setting up .NET Core 2.0 class libraries to house Entity Framework migrations, and these problems and solutions evolve with each ASP.NET Core release. The ASP.NET Core documentation does not fully address them, and neither does Julie Lerman’s course (recorded with ASP.NET Core 1.1), nor most of my online search results, which focused on solutions for ASP.NET Core 1.1 rather than 2.0. Below is the process I used to get this done.

From NuGet: Install-Package Microsoft.EntityFrameworkCore.SqlServer and Microsoft.EntityFrameworkCore.Design.

Next is deciding which command line tool to use (both execute the same underlying code ultimately):

  1. Powershell commands
    1. Commands: Add-Migration, Script-Migration, Update-Database
    2. Install-Package Microsoft.EntityFrameworkCore.Tools
  2. Dotnet Entity Framework CLI commands
    1. Commands: dotnet ef migrations add, dotnet ef migrations script, dotnet ef database update
    2. Install-Package Microsoft.EntityFrameworkCore.Tools.DotNet
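For reference, the corresponding invocations look like this (the migration name is a placeholder; the dotnet ef commands are run from the data class library folder):

```
# Package Manager Console (PowerShell) flavor:
Add-Migration InitialCreate
Update-Database

# Dotnet CLI flavor:
dotnet ef migrations add InitialCreate
dotnet ef database update
```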

I found that the reference to Microsoft.EntityFrameworkCore.Tools.DotNet needs to be specially set up in the .csproj file, or the dotnet ef commands will not work:

<ItemGroup>
 <DotNetCliToolReference Include="Microsoft.EntityFrameworkCore.Tools.DotNet" Version="2.0.0" />
 <DotNetCliToolReference Include="Microsoft.VisualStudio.Web.CodeGeneration.Tools" Version="2.0.0" />
</ItemGroup>

 

I then ran into a cryptic error (that is not well explained by the link):

Unable to create an object of type 'Fpn2107DbContext'. Add an implementation of 'IDesignTimeDbContextFactory<Fpn2107DbContext>' to the project, or see https://go.microsoft.com/fwlink/?linkid=851728 for additional patterns supported at design time.

 

However, the solution described here worked perfectly. In that example, the author uses ConfigurationBuilder to get the connection string from appsettings.json. Here is a basic version of that code. My understanding is that this version of the connection string is used only for the Entity Framework migration and database update.

public class Fpn2107DesignTimeDbContextFactory : IDesignTimeDbContextFactory<Fpn2107DbContext>
{
    public Fpn2107DbContext CreateDbContext(string[] args)
    {
        // Build configuration so the design-time tools can read appsettings.json
        // (requires Microsoft.Extensions.Configuration.Json).
        var configuration = new ConfigurationBuilder()
            .SetBasePath(Directory.GetCurrentDirectory())
            .AddJsonFile("appsettings.json")
            .Build();

        // "DefaultConnection" is the connection-string name in appsettings.json;
        // substitute whatever name your file uses.
        var connectionString = configuration.GetConnectionString("DefaultConnection");

        var builder = new DbContextOptionsBuilder<Fpn2107DbContext>();
        builder.UseSqlServer(connectionString);
        return new Fpn2107DbContext(builder.Options);
    }
}

 

Other prior workarounds for getting EF working in a class library are no longer needed. You no longer need to override OnConfiguring in the DbContext class to define the connection string. Prior examples and online sources noted that you had to run the migrations from a startup executable project (not the data class library), but with this setup, I was able to run the migrations directly from the class library.

 


Batteries Not Included

I have been coding in Microsoft editors since the early 90s, after I left Borland’s Turbo Pascal.  I liked that Microsoft frameworks did not require re-creating the wheel for each project; the base components were quite functional.  Early on, it was VB with its database form binding,  then later C# with WPF and the now defunct Silverlight, and finally asp.net core and entity framework – all of these come with excellent functionality out of the box.

If you have read any of my recent posts, you know the one glaring exception has been Identity (at least production-ready Identity). In 2017, nearly every project needs identity management – authentication and authorization – yet the out-of-the-box components only take you as far as MS Identity goes (database and repository service). So, as I was finishing setting up my Identity Server (see my last post), I thought I would “quickly” add a Region/State/Province lookup for the selected country. The keen reader will detect some foreshadowing: this was not in fact very quick (but was contained in a single day of work).

I started by checking whether there was any built-in functionality from Microsoft. There was not. I had previously used the .NET Framework to extract a country list:

public static SelectList GetCountryCodeSelectList(
    bool useThreeLetterIsoCountryCode = false)
{
    var codeArray = CultureInfo.GetCultures(CultureTypes.SpecificCultures)
        .Select(c =>
        {
            var country = new RegionInfo(new CultureInfo(c.Name, false).LCID);
            return new
            {
                Id = useThreeLetterIsoCountryCode ?
                    country.ThreeLetterISORegionName :
                    country.TwoLetterISORegionName,
                Value = country.DisplayName
            };
        })
        .OrderBy(c => c.Value)
        .Distinct()
        .ToArray();

    return new SelectList(codeArray, "Id", "Value", "US");
}

So I started with a states JSON file and spent 20 minutes reformatting it into a C# object, before I realized this was a fool’s errand for two reasons:

    1. U.S. states alone were inadequate – I wanted to cover provinces, regions, etc. from other countries. Fortunately, Zeke Sikelianos has already produced this: provinces.json.
    2. Why not keep the file in JSON and just load it into C# with Newtonsoft?

As an aside, did you know that you can paste JSON inside Visual Studio and it will generate a C# object?

    1. Edit > Paste Special > Paste JSON as Classes
    2. This is great for prototyping a front end, and for creating view models on the back end from JSON. Nice work, MS.
        public Province[] GetProvinces()
        {
            string filePath = $"{_env.WebRootPath}/js/provinces.json";

            using (StreamReader reader = new StreamReader(filePath))
            {
                var json = reader.ReadToEnd();
                var provinces = JsonConvert.DeserializeObject<Province[]>(json);
                return provinces;
            }
        }

       public PlaceNameWithAbbreviation[] GetProvincesForSelectList()
        {
            var provinces = _provinces;

            PlaceNameWithAbbreviation[] cleanedProvinces = provinces
                .Select(p =>
                {

                    var hasEnglish = (p.English != null);
                    var hasName = (p.Name != null);
                    var hasTranslation = hasEnglish && hasName;

                    var hasRegion = (p.Region != null);
                    var hasShort = (p.Short != null);


                    var name = (hasTranslation) ? $"{p.English} ({p.Name})" : ((hasEnglish)? p.English : p.Name);

                    var nameAndRegion = (hasRegion) ? $"{name}, {p.Region}" : name;
 
                    var abbrev = (hasShort) ? p.Short : name;
                    var country = p.Country;
                    return new PlaceNameWithAbbreviation()
                    {
                        Country = country,
                        Name = nameAndRegion,
                        Abbreviation = abbrev
                    };
                })
                .ToArray();

            return cleanedProvinces;
        }
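For completeness, the Province class assumed by the code above might look like the following. This is a sketch based only on the fields the code references; the JSON file itself uses lowercase property names, which Json.NET matches case-insensitively.

```csharp
// Sketch of the class that Paste JSON as Classes would generate for
// provinces.json, limited to the fields referenced above.
public class Province
{
    public string Name { get; set; }      // local-language name
    public string English { get; set; }   // English translation, if any
    public string Region { get; set; }    // containing region, if any
    public string Short { get; set; }     // abbreviation (e.g. "MN")
    public string Country { get; set; }   // country code (e.g. "US")
}
```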

My next task was to have a State/Region/Province selection box update based on the Country selection box. Most online examples perform this by making an Ajax call to the server to update each time the Country is changed (onchange calls a js function).

I find intermixing JavaScript with MVC painful compared to my disconnected workflows with an Angular/Core API. I primarily use Angular as a frontend framework, but I use ASP.NET Core MVC for the admin/identity pages. This is typically easy for simple forms and content that do not need much Ajax communication with the server. For the MVC pages, I generally use plain JavaScript. Maybe React would work well with these admin MVC components?

The State/Region/Province JSON data is relatively small and can be loaded with the initial MVC page rendering, by writing the JSON data directly into the page in a variable. This avoids multiple callbacks to the server; the data is filtered on the client based on the country selection:

@section Scripts {
    @await Html.PartialAsync("_ValidationScriptsPartial")

    <script>
        var provinceList = @Html.Raw(Json.Serialize(@Model.ProvinceCodes));
        var initProvince = "@Model.State";
    </script>
    <script src="~/js/RefreshProvinceList.js"></script>
}

$("#Country").on("change", function () {
    RefreshProvinceList();
});

function filterByCountryCode(province) {
    var countryCode = $('#Country').val().toUpperCase();
    return province.country.toUpperCase() == countryCode;
}

function RefreshProvinceList() {
    // http://www.c-sharpcorner.com/article/bind-dropdown-on-selection-change-of-another-dropdown-in-mvc/
    var province$ = $('#State');
    province$.html("");
    var filteredProvinceList = provinceList.filter(filterByCountryCode);
    if (!filteredProvinceList || filteredProvinceList.length < 1) {
        return;
    }
    $.each(filteredProvinceList, function (i, province) {
        province$.append(
            $('<option/>').val(province.abbreviation).html(province.name));
    });

    // For the first run, use the ASP.NET view model's value.
    if (initProvince != null) {
        province$.val(initProvince);
        initProvince = null;
    }
}

// Run on load.
RefreshProvinceList();

 

As an aside, in case you are wondering about some of the formatting problems above: thank you, WordPress. Despite using the “pre” and “code” tags in various combinations while moving between Visual and HTML views (for those who use WordPress), my text was reformatted, text was deleted, and new tags were inserted… It was a mess (channeling the old MS FrontPage dysfunction), and it dragged this blog post out into taking far more time than it should have.


Pecked to Death by Small Bills… and other random thoughts

Software as a service (SaaS) has a way of eroding the checkbook (for those under age 20, this is a bound pad of checks that you would pay bills with). Honestly, everything-as-a-service (Pluralsight/Lynda, Azure, app stores, cloud storage, Adobe CC/Office 365) as a group seems to generate near-daily auto-payment notification emails. You were charged $3 or $30 or $54.99. Flashbacks to Hitchcock’s “The Birds”.

ASP.NET has recently saved me from at least one recurring bill: identity services. Setting up Identity Server is not an afternoon project. It is, instead, the renovation project you started last year, where the stairs, the kitchen, and really the entire first floor are still torn up. But, now nearly complete, it will give me a working solution for every future project – the production component that is left out of nearly every online course. Most courses defer this, at best, to trials of Auth0 (a very nice solution, but one that incurs a monthly charge – see above), or at worst, to insecure mock-ups with the caveat “do not use in production”.

This reminded me of another potential SaaS dodge: server email. This is possible in ASP.NET Core, but example code online can be difficult to find, and the ASP.NET Core MVC starter code does not implement a solution by default. Here is one set of example code.
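Separately, here is a minimal sketch of what such server email can look like, using the built-in System.Net.Mail types. The host, port, credentials, and addresses are placeholders, and MailKit is the commonly recommended alternative for production use.

```csharp
using System.Net;
using System.Net.Mail;

public static class EmailSender
{
    // Build the message separately so it can be inspected without an SMTP server.
    public static MailMessage BuildMessage(string to, string subject, string body)
    {
        return new MailMessage("noreply@example.com", to)  // placeholder sender
        {
            Subject = subject,
            Body = body
        };
    }

    public static void Send(MailMessage message)
    {
        // Placeholder SMTP host and credentials; pull these from configuration.
        using (var client = new SmtpClient("smtp.example.com", 587))
        {
            client.EnableSsl = true;
            client.Credentials = new NetworkCredential("user", "password");
            client.Send(message);
        }
    }
}
```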

Microsoft Hydra: Developer Software & Everything Else

On a “make the most of what you do pay for” note, for those who program and use OneNote, there is finally a way to paste formatted code into OneNote. Why is this not integrated into OneNote? I get the sense that work at Microsoft is siloed: the behemoth, like Apple and Google, may not always be aware of how the software written in one cave (e.g. Visual Studio or VS Code) might better integrate with software written in another cave (e.g. Office 365 or OneNote).

I really do like what Microsoft is doing for developers, but I feel no great love for the Office apps or the other generic offerings. OneNote may be the exception. Excel is OK (but only VBA or VSTO as choices for extensibility? And the stats add-ins seem dated). Microsoft, maybe you should try mating Power BI with Excel.

Outlook messes up my schedule (anyone else have overnight shifts that make it difficult to see which days you start work?), and I really liked dragging a message onto the calendar, which no longer seems to work. Visio had some very cool programmable functionality, with assignment of data points within the models, that disappeared in newer versions.

Access could have potential if it used local SQL Server databases as a backend and built a robust front end (e.g. what if it generated stand-alone, data-centric C# apps, designed with a nice interface and all wired up, that could then be edited in Visual Studio?).

I get the sense that Office is stagnating under 30+ years of legacy-user expectations and baggage.

An Unrelated Asp.Net Core Recipe

Unrelated to added costs, but here is some helpful code for generating a selection list of countries with USA as the default (to be used on a form), placed in a view model:

public SelectList CountryCodes
{
    get
    {
        var codeArray = CultureInfo.GetCultures(CultureTypes.SpecificCultures)
            .Select(c =>
            {
                var country = new RegionInfo(new CultureInfo(c.Name, false).LCID);
                return new
                {
                    Id = country.ThreeLetterISORegionName,
                    Value = country.DisplayName
                };
            })
            .OrderBy(c => c.Value)
            .Distinct()
            .ToArray();

        return new SelectList(codeArray, "Id", "Value", "USA");
    }
}

 

 


Working Identity

In my last blog entry, I reviewed the ups and downs (mostly downs) of trying to get a federated identity server (federated login) working using ASP.NET Core Identity with Identity Server. Both frameworks are impressive feats and very well orchestrated, but identity is not simple, and demos and documentation only take you so far, especially when the technologies keep evolving individually and each tutorial quickly goes out of date. After having re-worked the Identity Server tutorials and plenty of other courses (see my epic account here), I was determined to finally deploy a working Identity Server this month and get on with my other projects:

  1. Start with Quickstart #6: IdentityServer and ASP.NET Identity
  2. Work through Securing ASP.NET Core with OAuth2 and OpenID Connect by Kevin Dockx (Pluralsight)
    1. Excellent course, but it has several pain points. I’ve watched this course three times and worked through most of the coding demos.
    2. Kevin uses ASP.NET Core 1.1; I used ASP.NET Core 2.0.
    3. Kevin creates his own identity store instead of using ASP.NET Core Identity. At first, I wondered why. On re-watching segments, I recognized that building your own identity persistence model allows for a better understanding of how any model interacts with Identity Server (including ASP.NET Core Identity) and bypasses the black-box magic built into ASP.NET Core that might complicate learning.
    4. Kevin also walks through using a claims table for storing address, first name, last name, etc., instead of deriving a custom class from ApplicationUser (which is what I did in the past).
    5. I need to write a review of Kevin’s course – top notch.
  3. Deploy my identity solution and use this to supply identity to apps on development machine

I came across several problems, which I’ll document here:

Creating Self Signed Certificates for Tokens

Kevin Dockx describes how to create self-signed certificates with makecert.exe here. Brock Allen describes the makecert command here. You basically create the certificate with makecert, and then, in the MMC utility (after adding the certificates snap-in), you copy the certificate to the “Trusted Root Certification Authorities” folder. You then use the certificate thumbprint to retrieve the certificate in ASP.NET Core.
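As a reminder of the general shape of that command (the subject name and dates below are placeholders; check Brock Allen’s post for the exact flags he recommends):

```
makecert -r -pe -n "CN=MyTokenSigningCert" -b 01/01/2018 -e 01/01/2023 ^
    -sky signature -a sha256 -len 2048 -ss My -sr LocalMachine
```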

At first, I could not get the thumbprint to match a certificate. Thumbprints are easy to enter incorrectly (when copied from the MMC details dialog).

Here is the code to load the certificate:

using (var store = new X509Store(StoreName.My, StoreLocation.LocalMachine))
{
    store.Open(OpenFlags.ReadOnly);

    var certCollection = store.Certificates.Find(
        X509FindType.FindByThumbprint, thumbprint_fpnIdSvrSigningCert_3, true);
    if (certCollection.Count == 0)
    {
        throw new Exception("The specified certificate was not found");
    }
    return certCollection[0];
}

 

To determine why these were not matching, I wrote out the certificate metadata to the console:

private static void write_certificates_in_store(X509Store store)
{
    X509Certificate2Collection collection = store.Certificates;
    foreach (X509Certificate2 x509 in collection)
    {
        try
        {
            Console.WriteLine("------------------------");
            Console.WriteLine("Thumbprint: {0}{1}", x509.Thumbprint, Environment.NewLine);
            Console.WriteLine("Friendly Name: {0}{1}", x509.FriendlyName, Environment.NewLine);
            Console.WriteLine("Certificate Verified?: {0}{1}", x509.Verify(), Environment.NewLine);
            Console.WriteLine("Simple Name: {0}{1}", x509.GetNameInfo(X509NameType.SimpleName, true), Environment.NewLine);
            Console.WriteLine("Signature Algorithm: {0}{1}", x509.SignatureAlgorithm.FriendlyName, Environment.NewLine);
            // Console.WriteLine("Private Key: {0}{1}", x509.PrivateKey.ToXmlString(false), Environment.NewLine);
            // Console.WriteLine("Public Key: {0}{1}", x509.PublicKey.Key.ToXmlString(false), Environment.NewLine);
            Console.WriteLine("Certificate Archived?: {0}{1}", x509.Archived, Environment.NewLine);
        }
        catch (CryptographicException)
        {
            Console.WriteLine("Information could not be written out for this certificate.");
        }
    }
}

 

After getting the thumbprint to match, the certificate would work on my desktop, but not on the server. After some cryptic errors (CryptographicException: keyset does not exist), I finally found this post. In MMC, right-click the private certificate and choose “Manage Private Keys”, then add “IIS_IUSRS” to the permissions.

 

Adding Claims to the Identity Token

I could not get roles and other claims included in the identity token – that is, until I added “AlwaysIncludeUserClaimsInIdToken = true” to the Identity Server client configuration. Until I found this, I was manually adding the roles as claims via services.AddProfileService&lt;myCustomService&gt;().
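For context, here is a sketch of where that flag lives, assuming IdentityServer4’s Client configuration model (the client id, grant type, and scopes are placeholders):

```csharp
var client = new Client
{
    ClientId = "mvc",
    AllowedGrantTypes = GrantTypes.Hybrid,
    // Without this, user claims such as roles are only available
    // from the userinfo endpoint, not inside the identity token:
    AlwaysIncludeUserClaimsInIdToken = true,
    AllowedScopes = { "openid", "profile", "roles" }
};
```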

 

Registering a User with An External Provider

For a while, I wondered about the right way to sync a user authenticated with an external provider (Google, Facebook, Windows) with an email/account on our Identity Server. This is included in the ASP.NET Core Identity framework, but it is rudimentary. After external authentication, the user’s identifying info, including email, is matched against accounts already in our Identity Server database. If no match is found, the user is redirected to enter their registration information (email is the only field the framework defaults). The user enters their email, and the account is added with a link to the external provider login.

However, the email the user enters is, by default, unverified, and when the user logs out and back in, their account with that email is found; but if the configuration requires email verification, the login fails (unverified email). So the user is prompted again for registration info, and when they type in their email, they are told that the email is already taken. Of course it is. This was not an easy error to find, and it was a little tricky to fix (deciding where to add the email verification).

 
