Thursday, October 23, 2014

Famous Omahans

OK, so not really a technical post, but being from Omaha, it just hit me that I should catalog some of the famous among us.  (Listed by birth date.)

Past:
Fred Astaire - b. 5/10/1899 d. 6/22/1987 -- Actor, Dancer, Choreographer, Singer, Musician
Henry Fonda - b. 5/16/1905 d. 8/12/1982 -- Actor
Gerald R. Ford - b. 7/14/1913 d. 12/26/2006 -- President of the United States
Marlon Brando - b. 4/3/1924 d. 7/1/2004 -- Actor
Malcolm X - b. 5/19/1925 d. 2/21/1965 -- Muslim Minister & Activist

Living:
Warren Buffett - b. 8/30/1930 -- 3rd richest man in the world (Forbes Profile)
Nick Nolte - b. 2/8/1941 -- Actor
Joe Ricketts - b. 7/16/1941 -- Owner of the Chicago Cubs & former CEO of TD Ameritrade
Gale Sayers - b. 5/30/1943 -- NFL Running back (raised in Omaha)
John Beasley - b. 6/26/1943 -- Actor
Swoosie Kurtz - b. 9/6/1944 -- Actor
Johnny Rodgers - b. 7/5/1951 -- Heisman Trophy Winner
Paula Zahn - b. 2/24/1956 -- Journalist & Newscaster
Wade Boggs - b. 6/15/1958 -- Professional Baseball Third Baseman
James M. Connor - b. 6/16/1960 -- Actor
Alexander Payne - b. 2/10/1961 -- Director, Screenwriter & Producer
Nicholas Sparks - b. 12/31/1965 -- Novelist, Screenwriter, & Producer
Calvin Jones - b. 11/27/1970 -- Raiders, Packers & Nebraska Cornhusker Running back
Houston Alexander - b. 3/22/1972 -- Mixed Martial Artist
Gabrielle Union - b. 10/29/1972 -- Actor
Ahman Green - b. 2/16/1977 -- Seahawks, Packers & Nebraska Cornhusker Running back
Bryan Greenberg - b. 5/24/1978 -- Actor
Eric Crouch - b. 9/16/1978 -- Heisman Trophy Winner & Sports Analyst
Chris Klein - b. 3/14/1979 -- Actor (Graduated from Millard West High School)
Andy Roddick - b. 8/30/1982 -- Former World #1 Professional Tennis Player
311 - formed 1988 -- OK, technically a band and not a person



Friday, August 29, 2014

SQL Server Access Control & Synonyms

OK. So, say you have a SQL Server database and you want to provide varying levels of access to different user groups.  The database was created for one application, and now you're being asked to provide access to a report development team.  How do you go about doing that in some rational way, say, the way an admin assigns access to files on a file server?

Well, that would be great, only there aren't any folders in SQL Server.  BUT, since SQL Server 2005 we've had real schema objects, and synonyms to boot.

SO, how does that help us?  Let's take a look.  Say you built your database and, like any rational developer, you built everything in DBO.  Here's a list of your tables:

dbo.Customers
dbo.PurchaseOrders
dbo.Invoices
dbo.UserLogins
dbo.ApplicationSettings

Obviously you don't want the report developers having access to the UserLogins and ApplicationSettings.  One, they just don't need it, and two, there's sensitive stuff in there.

Our approach:
  • Use an Active Directory group (MY_DOMAIN\Report Writers) to control access.
  • Create a schema for the report writers to access.
  • Assign access to the new schema and not the old one.

Step 1) Create the new schema:
CREATE SCHEMA reports

Step 2) Add some table synonyms to the schema:
CREATE SYNONYM reports.PurchaseOrders FOR dbo.PurchaseOrders
CREATE SYNONYM reports.Invoices FOR dbo.Invoices
CREATE SYNONYM reports.Customers FOR dbo.Customers

Step 3) Map the AD group to your database:
CREATE LOGIN [MY_DOMAIN\Report Writers] FROM WINDOWS WITH DEFAULT_DATABASE=[master]
GO
CREATE USER [MY_DOMAIN\Report Writers] FOR LOGIN [MY_DOMAIN\Report Writers]

(This is the same as adding a Login at the server level, and mapping to the public role on a database catalog).

Step 4) Give the report writers access to the reports schema:
GRANT SELECT ON SCHEMA :: reports TO [MY_DOMAIN\Report Writers]

What have we accomplished?
  • Your report writer team can log into your database
  • Your report writer team can view all of the table synonyms in Management Studio
  • Your report writer team doesn't have any write permissions (INSERT/UPDATE/DELETE) to anything.
  • Your report writer team cannot query the objects in DBO directly, so they don't have access to sensitive tables like UserLogins and ApplicationSettings.
Why did this work?

Turns out that in SQL Server, synonyms are like file system hard links.  If you had a file in one directory and took away permissions on that directory, then created a hard link to it in another directory and granted permissions there, the user would have access through the link.  The same idea works here.  Since the report writers don't have access to the DBO schema, they can't view the tables there.  But since they have access to REPORTS, they can read the synonyms and query through them.

Turns out you can also customize access to the synonyms once they're created.  All of the GRANT/DENY/REVOKE commands work the same.  You'll even be able to apply column-level security!
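For instance, here's a minimal sketch that hides one sensitive column while leaving the schema-level grant in place (CreditLimit is a hypothetical column name):

-- Column-level DENY on the synonym; the rest of reports.Customers stays readable
DENY SELECT (CreditLimit) ON OBJECT::reports.Customers TO [MY_DOMAIN\Report Writers]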


Tuesday, August 12, 2014

Anonymous Performance Point Dashboards (SP2013)

Performance Point (PPS) became part of the Enterprise offering of SharePoint starting with Microsoft Office SharePoint Server 2007.  As a tool it was branded as "Bringing BI to the Masses."  In SharePoint 2010, it was possible to deploy PPS dashboards to BI sites with anonymous access.  SharePoint 15 (2013) broke this, either on purpose or by mistake, and here's how it happened:

Assembly: Microsoft.PerformancePoint.ScoreCard.WebControls.dll
Version: 14.0.0.0
Class: Microsoft.PerformancePoint.ScoreCard.OlapViewCache
Derives From: System.Web.UI.Page

Assembly: Microsoft.PerformancePoint.ScoreCard.WebControls.dll
Version: 15.0.0.0
Class: Microsoft.PerformancePoint.ScoreCard.OlapViewCache
Derives From: Microsoft.SharePoint.WebControls.LayoutsPageBase

Differences between version 14 & 15: other than the class it derives from, none.

Result of the change: _layouts/PPSWebParts/OlapViewCache.aspx requires user authentication in SharePoint 2013 (v15), whereas SharePoint 2010 (v14) did not.  This means that while the ASPX application page generated by SharePoint Designer can be placed in an anonymous-access document library, elements referenced on the page via image (<img src=""/>) tags require authentication.  Failure to provide credentials means the chart elements don't render, which is a critical failure of the dashboard on anonymous-access sites.

Here's the workaround we implemented.

  1. Create an ASPX page which duplicates the operations of Microsoft.PerformancePoint.ScoreCard.OlapViewCache.
  2. Copy the ASPX page from (1) to:
    • 15\TEMPLATE\LAYOUTS\PPSWebParts
    • 14\TEMPLATE\LAYOUTS\PPSWebParts

Note: an IISRESET may be required after placing the files in the 14 & 15 hives.
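Here's a quick PowerShell sketch of step (2); the staging path is hypothetical, and you'd run it on each WFE:

# Copy the replacement page into both hives, then recycle IIS
$src = "C:\Deploy\OlapViewCache.aspx"
foreach ($hive in "14", "15") {
    Copy-Item $src ("C:\Program Files\Common Files\Microsoft Shared\" +
        "Web Server Extensions\$hive\TEMPLATE\LAYOUTS\PPSWebParts\") -Force
}
iisreset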

The following content implements the replacement OlapViewCache.aspx which derives from Page instead of LayoutsPageBase.



<%@ Page Language="C#" %>
<%@ Assembly Name="Microsoft.PerformancePoint.ScoreCards.ServerCommon, 
        Version=15.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" %>
<%@ Import Namespace="Microsoft.PerformancePoint.Scorecards"  %>
<%@ Import Namespace="Microsoft.SharePoint.WebControls"  %>
<%@ Import Namespace="System"  %>
<%@ Import Namespace="System.Globalization"  %>
<%@ Import Namespace="System.Web"  %>
<%--
    Name:                   OlapViewCache.aspx
    Deployment Location:    15\TEMPLATE\LAYOUTS\PPSWebParts
    Description:
        Replaces the SharePoint 2013 OlapViewCache.aspx utility page.  The script
        code in this file was produced to replicate
        Microsoft.PerformancePoint.Scorecards.WebControls, which changed its base
        class to LayoutsPageBase in SharePoint v15 (2013).  In v14, the base class
        was System.Web.UI.Page.  The change in v15 caused the page to require
        authentication while other dashboard components could be used
        anonymously.  This ASPX class derives from Page once more.
--%>
<script runat="server" type="text/C#">
    private void Page_Load(object sender, EventArgs e) {
        // Arguments passed on the query string by the dashboard's <img> references
        string externalkey = Request.QueryString["cacheID"];
        string s1 = Request.QueryString["height"];
        string s2 = Request.QueryString["width"];
        string tempFcoLocation = Request.QueryString["tempfco"];
        string str1 = Request.QueryString["cs"];       // column start
        string str2 = Request.QueryString["cc"];       // column count
        int height;
        int width;
        
        try {
            height = int.Parse(s1, (IFormatProvider)CultureInfo.InvariantCulture);
            width = int.Parse(s2, (IFormatProvider)CultureInfo.InvariantCulture);
        } catch {
            height = 480;
            width = 640;
        }
        
        int colStart = 1;
        int colCount = 100;
        try {
            if (str1.Length > 0)
                colStart = Convert.ToInt32(str1, 
                        (IFormatProvider)CultureInfo.CurrentCulture);
            if (str2.Length > 0)
                colCount = Convert.ToInt32(str2, 
                        (IFormatProvider)CultureInfo.CurrentCulture);
        } catch {
            colStart = 1;
            colCount = 100;
        }
        
        string mimeType;
        string viewHtml;
        byte[] bytesImageData;
        if (!BIMonitoringServiceApplicationProxy.Default
                .GetReportViewImageData(tempFcoLocation, externalkey, 
                    height, width, colStart, colCount, out mimeType, 
                    out viewHtml, out bytesImageData))
            return;
        
        if (mimeType.IndexOf("TEXT", StringComparison.OrdinalIgnoreCase) >= 0) {
            Response.ContentType = mimeType;
            Response.Write(viewHtml);
        } else {
            if (bytesImageData.Length <= 0)
                return;
            Response.Clear();
            Response.ContentType = mimeType;
            Response.BinaryWrite(bytesImageData);
            HttpContext.Current.ApplicationInstance.CompleteRequest();
        }
    }
</script>

Friday, June 13, 2014

Friday Fun: l337 Speek Translator

So, maybe you're not L337 'nough to make your own l337 5P33|< translations.  Here's some help along the way.  The following HTML page gives you a really simple way to translate plain text.

-Matt


<html>
<head>
<title>L337 #4X0r 5p3e|&lt;</title>
<script type="text/javascript">

    var x1at = [
        ['A', 'a', '4', '@', '/-\\' ],
        ['B','b','5','&amp;','8'],
        ['C','c','&cent;','(','{','['],
        ['D','d','|)','|}','|]','|&gt;'],
        ['E','e','3','3','3','3'],
        ['F','f'],
        ['G','6','g'],
        ['H','#','h'],
        ['I','i','1','|','l'],
        ['J','j','|'],
        ['K','k', '|&lt;'],
        ['L','l','1','!','|_'],
        ['M','m','|\\/|','^^'],
        ['N','^','n','|\\|'],
        ['O','o','0','()','[]','{}','&lt;&gt;'],
        ['P','p','p','P'],
        ['Q','q','9','q','9'],
        ['R','r','rrr','R'],
        ['S','s','5','$'],
        ['T','t','7','+'],
        ['U','u','|_|'],
        ['V','v','\\/','`\''],
        ['W','vv','w','\\/\\/'],
        ['X','%','x','*'],
        ['Y','y','`\'|'],
        ['Z','z','-/_','%']
    ];


    function x147e() {
        var l337 = document.getElementById("L337");
        var _5Rc = document.getElementById("_5Rc");
        var c0un7 = document.getElementById("c0un7");

        l337.innerHTML = "";

        for (var x = 0; x < _5Rc.value.length; x++) {
            var c = _5Rc.value.charAt(x).toUpperCase();
            if (c >= 'A' && c <= 'Z') {
                var i = c.charCodeAt(0) - 'A'.charCodeAt(0);

                l337.innerHTML += x1at[i][Math.floor(x1at[i].length
                                         * (Math.random() - 0.0001)) ];
            } else {
                l337.innerHTML += c;
            }
        }

        c0un7.innerText = "Char Count:" + l337.innerText.length;
       
    }
</script>
</head>
<body>

<div id="L337" style="white-space: pre; font-family:Trebuchet MS; 
                        font-size:14px; width:400px; padding:10px; 
                        border:1px solid black; margin-bottom:5px"></div>

<div id="c0un7" style="font-size:14px; font-family:Trebuchet MS; 
                        margin-bottom:5px; padding:10px">Char Count:</div>

<textarea id="_5Rc" cols="50" rows="5" style="margin-bottom:20px">
</textarea><br />

<button  onclick="x147e(); return false;">x147e</button>

</body>
</html>

Monday, June 9, 2014

Information Architecture Re-factorization

Situation: You have a legacy SharePoint 2010 Enterprise site that has grown organically. The web application has a home-spun/3rd-party global navigation component for the top nav and one site collection for every 1.03 web sites.  Plus, there's no discernible taxonomy to the current site structure. Your mission, should you choose to accept it: migrate to SharePoint 2013, add support for Managed Metadata from the Term Store, rework the top navigation as a "MegaMenu," and reduce the overall number of site collections. Go!

Let's start with the problem of IA re-factorization and go from there:

Option 1) It's Enterprise, use "Manage Content and Structure"
Option 2) Use some exotic PowerShell magic to detect and recreate all settings.
Option 3) Use some not-so-exotic PowerShell magic, leveraging Import-SPWeb and Export-SPWeb commands.
Option 4) Buy a third-party tool, like those from Metalogix or AvePoint or some other great source.

I'm cheap, so we're going to go with something from options 1-3.  That said, we'll build the 2013 farm, copy the 2010 content database (fortunately there's only one), and attach it to a web application we've built for the new deployment.  We've now got a 2010-to-2013 version-migrated SharePoint site operating in 2010 visual mode.

Option 1: Manage Content and Structure
MCaS is a pretty exciting option.  It's GUI based, so the learning curve is pretty short.  It will let you move Items, Lists or Web Sites between Web Sites or Lists.


Drawback #1, you've got to move one Web Site or List at a time.  If you've got a bunch (in my case more than 80), you're going to be at it a while.

Advantage: you get immediate feedback on the move.  Though I've seen that feedback be wonky.  For instance, if the move takes too long, it may seem like it's errored out, but it's actually still plugging away in the background.

Drawback #2, you're going to have to keep both the source and destination content in the same web application.  MCaS doesn't work with two different web applications, but is this really a drawback?

Drawback #3, you can't move between site collections either.  So you're stuck in one place.

Option 2: Exotic PowerShell Magic
This option seems the sexiest.  Create a totally awesome PowerShell script that recreates one SPWeb in another SPWeb.  At the same time, it seems like the most error prone, especially when you throw in SharePoint Publishing.  The code you write for this needs to duplicate every setting while accounting for the new URL taxonomy as the content shifts from one location to another.  Plus, you'll need to duplicate not only the attributes of SPWeb and SPList, but also the secondary object overlays for SPFile, SPFolder, and publishing classes like PublishingSite, PublishingWeb and PublishingPage.  Too exotic for me...

Drawback #1: Man it's hard to write a script like this.  Better buy a tool, or look for some open source scripts...

Option 3: Not-So-Exotic PowerShell Magic
Ok, so this one seems viable to me.  Here we can script together a bunch of Export-SPWeb commands with Import-SPWeb commands to effectively extract and load the content in different locations, as sketched below.  The best part here is that the content can be in different web applications, and even in different farms.
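A minimal sketch of one such move (the URLs and export path are hypothetical):

# Export the source web, then import it at its new home.
# -IncludeUserSecurity carries item-level permissions along; adjust to taste.
Export-SPWeb -Identity "http://oldapp/sites/teamA/projects" `
    -Path "D:\Migration\projects.cmp" -IncludeVersions All -IncludeUserSecurity
Import-SPWeb -Identity "http://newapp/projects" `
    -Path "D:\Migration\projects.cmp" -IncludeUserSecurity -UpdateVersions Overwrite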

Drawback #1, Export-SPWeb will give you some really weird errors, especially if List Templates and their ilk have been removed.  The same goes for Import-SPWeb.  I've run the Import-SPWeb multiple times and it seems to resolve the problem when there's missing list template content.

Advantage, if you can call it that: you couldn't look at the list contents anyway if the templates were missing.  They're basically dead items that you have to prune with PowerShell, because the GUI won't let you navigate to the list settings pages anyway.

Drawback #2, if you script more than one move together with other moves, you're probably going to have to go digging through a bunch of log files to see what moved and what didn't.  Error reporting isn't a strong suit of these commands.

Personally, I'll recommend Option 1 and Option 3 to my clients, and point them to some third-party solutions.  But for the situations I've been a part of, 1 & 3 will be just fine.

Friday, May 9, 2014

Extending with Science! ... or files

Hey, say you've got like 40,000 files on your hands.  AND you've got a public, Enterprise SharePoint 2010 portal.  AND you've got a mandate to publish each one of those files to the world.  AND you just got kicked off your previous web host.  Whatcha gonna do?


Well my friend, use SCIENCE!


OK, not really.  BUT, SharePoint is a Microsoft.NET application, hosted in Internet Information Services, so we can just add a virtual directory to the SharePoint web application and serve away.  Yes, but...


I tried doing just the above once, but didn't discover the secret sauce until just yesterday.  Here's the breakdown:

  • The 40,000 files are served anonymously (this is actually a good thing)
  • Directory access performance isn't really an issue, there aren't a million hits a month let alone a minute.
  • There's already a structured navigational approach to finding the files so there's no need to browse them or build a new catalog.
What do you need to do to spin up a virtual directory inside the SharePoint web app?  Well, here's what I did, and it's working fine, thank you very much.

  1. Establish a virtual directory in your SharePoint web app.  If you've got multiple AAMs and/or multiple WFEs, you'll have to repeat the following process for each (a command-line sketch follows this list).
  2. Once you've built the VDIR, double-click its Authentication option.  We only needed anonymous, so I turned off everything else (more on that later).
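Here's a minimal sketch of step (1) with appcmd; the site name and paths are hypothetical:

%windir%\system32\inetsrv\appcmd add vdir /app.name:"SharePoint - 80/" /path:/files /physicalPath:"D:\PublicFiles"
%windir%\system32\inetsrv\appcmd set config "SharePoint - 80/files" /section:system.webServer/security/authentication/anonymousAuthentication /enabled:true /commit:apphost
%windir%\system32\inetsrv\appcmd set config "SharePoint - 80/files" /section:system.webServer/security/authentication/windowsAuthentication /enabled:false /commit:apphost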
That's it.  You're up and running.  But wait, you say, I tried that and it didn't work.  Well, it didn't work for me the second time either.  In my first test, I had access to the directory being accessed and was using an IE client on the same host as the server.  May that three-headed dog be sent home to Hades!
 
The change?  This time around, instead of using pass-through security on the VDIR, I used the "connect as" option and connected using an account that has access to the files.  That solved the anonymous access problem I was having.  Another solution would be to grant the "Everyone" user access to the files, but that's a problem in and of itself; plus, if you're using a remote VDIR, it probably won't work unless you make the file share wide open.

The other change?  Well, you need to make sure you establish a web.config file in the directory you're serving.  Just inheriting from the base file in SharePoint's home directory won't work.  Until I established a custom web.config, SharePoint tried to intercede and bump the URL up against the content database.  Not-found is the same as access-denied in that world.  So, in summary:

  • Use a delegation account to access the VDIR
  • Establish a web.config that explicitly sets the anonymous access properties (a sketch follows)
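Here's a minimal web.config sketch for the served directory, assuming it's static files only.  (The authentication sections are normally locked at the server level, so the VDIR settings from step 2 still apply; this file mainly keeps SharePoint's handlers out of the way.)

<?xml version="1.0"?>
<configuration>
  <system.web>
    <!-- Let anonymous users through the ASP.NET authorization gate -->
    <authorization>
      <allow users="*" />
    </authorization>
  </system.web>
  <system.webServer>
    <handlers>
      <!-- Drop SharePoint's inherited handlers and serve plain files -->
      <clear />
      <add name="StaticFile" path="*" verb="*"
           modules="StaticFileModule,DefaultDocumentModule,DirectoryListingModule"
           resourceType="Either" requireAccess="Read" />
    </handlers>
  </system.webServer>
</configuration>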
Well, we just turned SharePoint into a regular IIS web application, and my client is happy.

Wednesday, May 7, 2014

SharePoint Powershell -- Awesomesauce

With SharePoint 2010 (though it was actually possible with older versions), we were given direct access to the SharePoint object model in an interpreted scripting environment.

With a simple script like this, I can list all of the lists and libraries in a SharePoint web application:

$wa = get-spwebapplication http://site.domain.com
foreach($site in $wa.Sites) {
    foreach($web in $site.AllWebs) {
        foreach($list in $web.Lists) {
            write-host $wa.Name $web.Title $web.Url $list.Title
        }
    }
}


Great, so what?  You can list a bunch of the content.  Tell me something I can't do with other tools...

OK, here's one.  I had a WSP that installed custom page layouts.  One of the page layouts got modified by a user with SharePoint Designer, and even after I returned the page layout to its original state, upgrading or re-installing the WSP wouldn't overwrite the page layout in the content database.

Solution: Detach each of the publishing pages from the page layout and make a file that lists each of the publishing pages that were attached to it.  Then remove all of the bad stuff using whatever tool you like, and reattach the page layouts based on the data we saved in the file.

Check out this script, the first half of the process: detaching the page layouts.  $detLog gets set outside the function and is the full path to the change log.  Just point the function at an SPWeb and a reference to the new page layout (it needs to be a Microsoft.SharePoint.Publishing.PageLayout) and away it goes, cycling through the entire SPWeb and its children.  Oh, and $comment gets set outside the function too.  It could be something snappy like "Detaching Page Layout for Upgrade."

Function BFS-PubPage($web, $newlayout) {
    $web.Lists | foreach-object -process {
        $l = $_
        $l.Items | foreach-object -process {
            $i = $_
            if ([Microsoft.SharePoint.Publishing.PublishingPage]::IsPublishingPage($i)) {
                $pp = [Microsoft.SharePoint.Publishing.PublishingPage]::GetPublishingPage($i)
                if ($pp.Layout -ne $null) {
                
                    if ($pp.Layout.ServerRelativeUrl -eq "/_catalogs/masterpage/OffendingLayout1.aspx" `
                       -or $pp.Layout.ServerRelativeUrl -eq "/_catalogs/masterpage/OffendingLayout2.aspx") {
                        write-host -ForegroundColor DarkBlue ($web.Url + "/" + $pp.Url)
                        
                        $cl = $pp.Layout
                        
                        $pp.CheckOut()
                        $pp.Layout = $newlayout
                        $pp.Update()
                        $pp.CheckIn($comment)
                        $pf = $pp.ListItem.File
                        $pf.Publish($comment)
                        $pf.Approve($comment)
                        
                        add-content $detLog ($web.Url + "`t" + $i.Name + "`t" + $pp.Url + "`t" + $cl.ServerRelativeUrl)
                    }
                     else {
                        write-host -ForeGroundColor DarkYellow `
                             ($web.Url + "/" + $pp.Url +"`t" + $pp.Layout.ServerRelativeUrl)
                    }
                }
            }
            
        }
    }
    
    $web.Webs | foreach-object -process {
        if ($_ -ne $null) {
            BFS-PubPage $_ $newlayout
        }
    }
}


You probably noticed, but this script uses a different way to implement the for-each loops.  Instead of using the C#-style loop, it uses the object pipeline method.  I'm guessing one method is probably more efficient than the other, but you never know.
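For completeness, here's a minimal sketch of the second half: reattaching the original layouts from the log.  It assumes the tab-delimited format written above, that $comment is set as before, and that the cleaned-up layouts are back in the gallery.

Function Reattach-PubPages($logPath) {
    Get-Content $logPath | foreach-object -process {
        $webUrl, $itemName, $pageUrl, $layoutUrl = $_ -split "`t"
        $web = Get-SPWeb $webUrl
        try {
            $pubWeb = [Microsoft.SharePoint.Publishing.PublishingWeb]::GetPublishingWeb($web)
            # Find the original layout by the server-relative URL we logged
            $layout = $pubWeb.GetAvailablePageLayouts() |
                where-object { $_.ServerRelativeUrl -eq $layoutUrl }
            $item = $web.GetFile($pageUrl).Item
            $pp = [Microsoft.SharePoint.Publishing.PublishingPage]::GetPublishingPage($item)
            if ($pp -ne $null -and $layout -ne $null) {
                $pp.CheckOut()
                $pp.Layout = $layout
                $pp.Update()
                $pp.CheckIn($comment)
                $pf = $pp.ListItem.File
                $pf.Publish($comment)
                $pf.Approve($comment)
            }
        } finally {
            $web.Dispose()
        }
    }
}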

Tuesday, May 6, 2014

Windows Installer Don'ts

Ok, so I'm a software professional (or at least I thought I was).

A few weeks back, my laptop started to complain that I was running out of disk space.  I've only got 500 GB on board, and with a couple of VMs for SharePoint, SQL Server 2012, and various FOSS OSes, I really was running out of disk space.

Know that image of Windows Server 2003 running SQL Server 2005 and MOSS 2007?  Well, that was too important to purge, so what did I do?  I dug out the handy du command and started building up a hog report.  Where was all of that space going...

  • c:\MSOCache (hidden) - 2.1 GB
  • c:\Windows\Installer (not hidden, but not shown either) - 17.2 GB
  • c:\users\mbattey\AppData\Local\Temp (AppData hidden) - 1.85 GB
  • c:\ProgramData\Microsoft (ProgramData hidden) - 4.7 GB
Hey, I'm a smart guy, right?  What the heck are all of those Windows Installer files doing still hanging around on my computer?   I'll just delete those.  MSOCache? Stupid Microsoft leaving crap everywhere.  Gone!  Temporary files?  Fried!

Yeah, bad idea.

Windows Update was immediately broken.  There were a handful of updates to Visual Studio (2005, 2008, 2010) waiting to be installed, as well as updates for Office.  None of these would go.  Every time they ran, the MSI tool would ask for a GUID-named directory off the root of one of the drive partitions, which of course didn't exist.  Trying to run a Repair from Programs and Features failed miserably with the same result.  In fact, the icons for all of the Office documents, Adobe Reader PDFs, and a bunch of others disappeared as well.  (Methinks Adobe and MS got lazy and weren't moving all of the DLLs out to "Program Files" like they tell everyone else to do.)

So after manually removing all of the remnants from Office following a nice showcase from Microsoft (the girl reading the script sounded nice, but had trouble pronouncing RegEdit and Suite, which came out more like "reg-it" and "suit"), writing a custom registry cleaner to delete stuff from HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\Windows\CurrentVersion\Uninstall, HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall, and HKEY_LOCAL_MACHINE\SOFTWARE\Classes\Installer\Products, and finally correcting individual packages with MSIZap, I'm kinda sorta back to normal.

Normal as in I'm back up to all that disk use, and Office and Visual Studio will run updates, mostly.  I do highly recommend MSIZap when you've got the GUID of a package that won't install because it's gone missing.  It will root out all of the registry entries and local files related to the product, and in the cases where I needed it, it let a patch go through unperturbed.

Thursday, April 17, 2014

SPTrustedRootAuthority -- Most important internal class for Claims Based Authentication

In wisdom unknown to me, the SPTrustedRootAuthority (TRA) class and its manager SPTrustedRootAuthorityManager (TRA manager, both in Microsoft.SharePoint.Administration) are internal and sealed.  Big statement, probably nobody cares but me, so hear me out.  What if you wanted to add on to Central Administration so that CBA could be maintained 100% through the UI?  Can't do it with the classes on hand, because we can't link directly to those classes.

SharePoint maintains an internal Certificate Authority (CA), which just happens to be managed by the SPTrustedRootAuthorityManager.  Microsoft graciously provided us with a form in Central Admin and PowerShell commands to work with the list of certificates (see get-command -noun SPTrustedRootAuthority), but the only way to work with one is through the PS command scripts.  This is OK, but it seems like the SharePoint team took the easy way out and didn't finish the GUI in Central Admin.

So what's a boy (or girl, for that matter) to do?  I went and found all of the PS cmdlets that parallel the steps in the Claims Walkthrough, just so I wouldn't have to write a Forms app just to set up the SPTrustedLoginProvider.  That was great for me, but I'm a consultant these days, and I need to get my clients up and running on this stuff.  If I can barely remember the sequence of commands, how could they?  Especially when you want to add new Known Claim Values when a new set of securables comes out.


Well, I took it upon myself to create my own user interface to manage CBA trust providers, and developed it into a set of 14-hive application pages in the ADMIN folder (/_admin/TrustConfig, etc.) so that they would only be available through Central Admin and not through a normal site.  Configuring trust is an administrative task, after all...


So everybody knows, when you set up a new CBA trust you do the following (a PowerShell sketch follows the list):

  1. Get an X.509 certificate w/o the private key attached, in a DER file.
  2. Load the certificate into SharePoint's Trusted Authority Manager (CA or PS usually works).
  3. Create a new SPTrustedLoginProvider that references the certificate.
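Here's a hedged PowerShell sketch of those three steps; the file path, names, URLs, and realm are all hypothetical:

$cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2("C:\certs\PartnerSTS.cer")
New-SPTrustedRootAuthority -Name "Partner STS" -Certificate $cert

$email = New-SPClaimTypeMapping `
    -IncomingClaimType "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/emailaddress" `
    -IncomingClaimTypeDisplayName "Email" -SameAsIncoming

New-SPTrustedIdentityTokenProvider -Name "Partner STS" -Description "Partner claims provider" `
    -Realm "urn:sharepoint:portal" -ImportTrustCertificate $cert `
    -ClaimsMappings $email -SignInUrl "https://sts.partner.example.com/login.aspx" `
    -IdentifierClaim $email.InputClaimType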
There are some subtleties though.  An X.509 certificate can be used for only one SPTrustedLoginProvider at a time.  Offhand, I can't remember if it throws an error when you create the provider or if it errors when you try to use it.  Either way, I remember the error message isn't very helpful!  The cert has to be loaded in SP's private CA ahead of time too.  So, why doesn't it allow you to just reference the cert's Subject & Issuer?  Maybe because it's important that you have all of the bits of the cert on hand?  Maybe it's like the old days, when you couldn't deposit a picture of a check; you had to hand it over to the teller.

Anyway, that was only one problem, when you first set up the Login Provider.  You need to come back to it and update the Known Claim Values, and maybe add a new Claim Type now and again too.  But try describing that over the phone (you may be better at it than I am).

Solution: add those application pages to Central Admin.  But you'll need to use some reflection magic to get a hold of that data.  Now, if you review the code I'm about to post, you may notice that there may have been more direct ways to get to the data, like accessing the TRA manager directly, or calling the "Certificate" property on the TRA.  I'm not saying my code is perfect, but it got me there.

What you're about to see is what the GoF would call adapter classes.  The first, TrustedRootAuthority, wraps the SPTrustedRootAuthority to provide access to the X509Certificate2, and the second, RootAuthority, wraps the SPCmdletGetTrustedRootAuthority PowerShell cmdlet to get the set of all installed certificates.

These additions don't let you install new certificates, but that wouldn't be too hard once you've gotten this far.  They do allow you to get a hold of all of the installed certs so that you can pick and choose the certificate you want to add to your newly minted SPTrustedLoginProvider.



using System;
using System.Collections;
using System.Collections.Generic;
using System.Reflection;
using System.Security.Cryptography.X509Certificates;

class TrustedRootAuthority
{
    static Type traType;
    static TrustedRootAuthority() {
       Assembly a = Assembly.Load("Microsoft.SharePoint, Version=14.0.0.0, " 
                              +"Culture=neutral, PublicKeyToken=71e9bce111e9429c");
       traType = a.GetType("Microsoft.SharePoint.Administration.SPTrustedRootAuthority",
                              true, true);
    }

    object tra;
    public TrustedRootAuthority(object tra) {
       this.tra = tra;
    }

    public X509Certificate2 Certificate {
       get {
           return traType.InvokeMember("m_Certificate", BindingFlags.NonPublic | 
                            BindingFlags.GetField | BindingFlags.Instance, null,
                            tra, null) as X509Certificate2;
       }
    }

}
      
class RootAuthority
{
    static ConstructorInfo ctor;
    static MethodInfo rdo;
    static RootAuthority() {
       Assembly a = Assembly.Load("Microsoft.SharePoint.PowerShell, Version=14.0.0.0, "+
                                  "Culture=neutral, PublicKeyToken=71e9bce111e9429c");
       Type t = a.GetType("Microsoft.SharePoint.PowerShell.SPCmdletGetTrustedRootAuthority",
                                  true, true);
       ctor = t.GetConstructor(new Type[0]);
       rdo =  t.GetMethod("RetrieveDataObjects", BindingFlags.NonPublic 
                                     | BindingFlags.Instance);
    }

    private object tra;

    public RootAuthority() {
        tra = ctor.Invoke(null);
    }

    public IEnumerable<TrustedRootAuthority> RetrieveDataObjects() {
        IEnumerable src = rdo.Invoke(tra, null) as IEnumerable;
        List<TrustedRootAuthority> ret = new List<TrustedRootAuthority>();
        foreach (object o in src) {
            ret.Add(new TrustedRootAuthority(o));
        }
        return ret;
    }
}
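
A minimal usage sketch with the adapters above (run from a farm-admin context), dumping what's sitting in SharePoint's internal CA:

// Enumerate the trusted root certificates via the adapters and print
// each certificate's subject and thumbprint.
RootAuthority authority = new RootAuthority();
foreach (TrustedRootAuthority tra in authority.RetrieveDataObjects())
{
    X509Certificate2 cert = tra.Certificate;
    if (cert != null)
        Console.WriteLine("{0} ({1})", cert.Subject, cert.Thumbprint);
}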