Wednesday, June 7, 2017

Email Enable a SharePoint Online List with Flow!

The Case

It seems like the use cases for SharePoint are a little windblown.  Maybe it's because SharePoint Foundation came free with every Windows Server license since 2003, or maybe it's because SharePoint (ahem, Enterprise) comes free with every Office 365 tenant.  Anyway, one really good use case is a work support, or work ticket, system.  You might say CRM is the way to go, except that it's really expensive and, in many cases, way overblown, IMHO.

So, let's go with the work ticket system.  You want proactive notification, you want status updates as the ticket is worked, you want work history, and you probably want some sort of effort tracking.  What else, you say?  You want to be able to send an email to a generic email box that will kick off work to be done?

Well, all of this was built into SharePoint's on-premises versions, but with a move to Office 365, not only do you lose out on custom server-side code and the entire on-premises BI stack (Business Intelligence, to keep you in the know), you also lose out on incoming email/list integration.  Not to mention you can't customize the outgoing email From or Reply-To for your site collections, web applications, or farm.  You can't even specify them at the tenant level.  I'm not really sure why Microsoft chose not to do this, but there are some obscure references to mail being blocked in spam filters or something.  I'm not sure who really wanted spoofed email addresses anyway; a good admin reply-to for support is usually what you want.

Hey, we really wanted that functionality, and dang it, we want it in SharePoint Online, because ... CLOUD IS AWESOME, NO MORE TOUCHY THE SERVER NETZ!  As an aside, why does it seem that many arguments for cloud services are the same arguments for getting all of the "wage slave IT riffraff" out of sight and into the basement?  Well, that's a whole other article.

The Soln

Man, I like it when I use that weird abbreviation from technical publications; it makes me feel smart or whatever.  Go away, it's my feeling, don't be a buzzkill :)

OK, now that I've got that out of my system, the solution that is both CHEAP and INCLUDED in your O365 subscription...  Microsoft Flow!!!

Flow is pretty convenient, but what it lacks in documentation it makes up for in obscurity.  Right off the bat, it has a nice web GUI with all of those nice connecting lines and drag-and-drop abilities; you don't even have to know anything about the PowerApps Workflow Definition Language (OK, that one's a slight stretch).  But really, the big-ticket item is that it has built-in support for lots and lots of cloud tech.

For instance, you could send an email to a Gmail account and then have the attachments saved to a Dropbox folder.  Or maybe when a new video is added to a Vimeo channel, you want to schedule an event on your calendar to remind you to watch it.  I bet you have always wanted to fetch a row from Informix whenever an issue has been assigned to you in GitHub.

Well, I never wanted those either, but I did want to create an item in a SharePoint list when an email was sent to a service email box.  Well, there are triggers and actions for that (check out the Services list; it reads like a SaaS who's who)!

To the right is a screen capture of the Flow I built for a help-desk ticketing solution (sorry for the small size; click on it to zoom in, but you already knew that.  Sorry, sorry, I think I'm turning Canadian).  We wanted two critical features.  First, when an email arrives, create a new SPO List Item.  Second, if a user replies to an email correctly, add any comments from the email to an appending text field on the same list item.  Here's how it works:

  1. Look for new emails in the email box.  We used rules in our Outlook 365 account to move them to a folder based on the alias used, so we can support multiple lists with a single O365 account.
  2. When an email arrives, fetch the email's received timestamp, and then do some edits on the HTML message body (more on that later).
  3. Decide whether the email we found conforms to a "Reply-To" format or whether it's something else (see the sketch after this list).
  4. If it's something else:
    1. Create a new SPO List Item
    2. Build a path to store attachments based on the Item's ID and the email timestamp.  Note: Flow doesn't support adding attachments to list items yet.  That's coming, but as of yesterday (June 6th, 2017) that feature is listed as "Started"
    3. Save each of the attachments to a Document Library referenced in the path from the previous step
  5. Else, when the email conforms to the correct Reply-To subject string:
    1. Retrieve the SPO List Item ID from the email's Subject field
    2. Update the SPO List Item using the ID and the email's message body.  We specifically updated our "Appending Text" field, the kind you get when you use versioned list items
    3. Calculate a path to store attachments, as above
    4. Save each of the attachments to a Document Library
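
To make the two branches concrete, here's a minimal sketch of the decision logic, written in PowerShell purely for illustration.  The real Flow does this with Workflow Definition Language expressions in a Condition action, and the "[Ticket #42]" subject format below is a hypothetical stand-in for whatever Reply-To subject string you settle on.

$subject = "RE: [Ticket #42] Printer on fire"

if ($subject -match '\[Ticket #(\d+)\]') {
    # Reply branch: pull the existing SPO List Item ID out of the subject and update that item
    $itemId = [int]$Matches[1]
    Write-Host "Update list item $itemId with the reply's message body"
} else {
    # New request branch: create a brand new SPO List Item from the email
    Write-Host "Create a new list item from this email"
}
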
Check out this beauty.  A screen shot of the outcome.  Yes, that is me eating cake.  It was my birthday, OK!  I deserved it, so boo to your comments! :)


Some things to know:
  • The attachments created an interesting problem.  We can't yet add them to a List Item, and inline pictures show up as broken, empty image boxes.  JavaScript (jQuery) to the rescue... That comment above about editing the email's HTML was some prep to insert the timestamp into the <IMG> tags (a rough sketch of the idea follows this list).  Then jQuery on the forms fixes them up (more on this in another post).
  • I created an email box specifically for this List integration.  I'm not sure what would happen if the email was marked read before Flow tried to process it.
  • I run the Flow process as the account that owns the email box.  This ensures that it has access to Outlook 365.
  • I granted the account read access to the entire site, and contribute access to the SharePoint List.  Without read access to the site, the UI doesn't work quite right, since it provides a drop-down to select the list to interact with.
  • I could have created the Flow as another account, but I would have had to create connections with the Outlook and SharePoint user IDs in it.  It is much easier to keep it all under one account, IMHO.
  • Getting help for Microsoft Flow from the internet is crazy hard, unless you include PowerApps or Power Apps in your Google/Bing/Yahoo/DuckDuckGo query.  I'm new to Flow, so I suspect it was recently rebranded.  It just showed up in my Charms in February, but there are references to it on the PowerApps PowerUsers site from a couple of years ago.
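
Since that image fix comes up a lot, here's a very rough sketch of the kind of HTML edit involved, again in PowerShell just to show the idea.  The data-received attribute name and the timestamp format are hypothetical; in the real Flow this edit happens in an action on the message body, and the matching jQuery lives on the list forms.

# Stamp every inline <img> in the email body with the received timestamp, so script
# on the form can later rebuild the src from the item ID + timestamp folder where
# the attachments were saved.
$received = "2017-06-06T14-30-00"              # email received time (illustrative format)
$body     = Get-Content .\message.html -Raw    # the email's HTML body, saved to a file for this example
$stamped  = $body -ireplace '<img ', "<img data-received=""$received"" "
Set-Content .\message.html $stamped
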

Sunday, July 31, 2016

Zen of Ducati Fuel System Maintenance

So if you were on my Facebook feed last summer you would have seen this picture.  Yes, that is my 1999 Ducati 900 Supersport, not running.  I was about halfway through a 30-mile round trip when the motor started to cough and sputter.  My main ride had just spent about 14 days in the shop after a check-engine light came on, which turned into a bunch of work related to the variable timing.  With that thought in the back of my head, I feared the worst.  Needless to say, it was the 4th of July, and I had to call my wife so she could borrow her girlfriend's pickup and we hauled the Italian artwork home.

So, did I say that I was completely freaked out?  I was thinking it was the fuel injection computer, clogged injectors, or maybe some set of sensors gone bad that I wouldn't be able to diagnose, because 1999 Ducati 900 Supersports don't have an OBD-II port as far as I can tell.  Obviously, this is more on the hardware side of things... and not really computer related.

Well, I let it sit, and then took another look.  I had just replaced the nut that holds the fuel level sensor (a fuel sender in Ducati speak) in.  It had been leaking slowly, and although my wife complained of a gasoline smell, I never really could locate the leak.  Well, until I did, and then, three nuts later, I finally found one that screwed on, sorta.

The first step is to drain the tank.  I got it mostly drained with the handy little pump I picked up for about $7.00 at a local Harbor Freight tool store (don't try the tube-and-mouth method; don't ask me why I know not to do that, just don't).  Next I removed the cap.  There are four (4) screws that hold it on.  The ring actually has six (6) plus one more near the hinge, but it turns out the twelve, four, and seven o'clock screws are the only ones that actually attach.  The other three are just for beauty.

Next, the fuel inlet is secured in place by a bunch of tiny hex (Allen) screws.  I don't know if you have to take them all the way out, but I did, because I pulled and pulled and that stupid thing wouldn't come out.  So, since I'm a tool-using kind of sentient being, I used the handle of my mallet to give it some extra leverage.  It popped right out.



There are two hoses attached to the fuel inlet.  They are attached to pipes that terminate outside the tank, and in a diagram they were labeled as overpressure lines.  Handily, the left one attaches to the left port and the right one to the right, so there's no need to label them.  I fully detached them so I could get my hands inside and dig around.  Turns out that was a pretty good thing...  (The picture to the right is actually from when I was reassembling.)

Once I got the inlet off, I found that the rubber extender had fallen into the tank.  So I grabbed that and reinstalled it on the inlet.  The O-ring looked like it was breaking down, but after removing it and giving it a closer look, it just had some crud that rubbed off.

Inside the tank was a whole other story.  The reason I wanted to get in there was that the fuel lines had rotted down to the reinforcing braid.  The guy at the parts store said that these should be replaced every two years, and, well, this bike is 17 years old.  Silly me.  In the bottom of the tank were all of the rubber bits.  The bad part was that when I scooped them up, they turned into a black paste.

The job was then to get all of the gunk out.  I ended up taking the fuel pump out too.  Turns out that was a good decision.  There's no way I'd have been able to clamp on that two-inch piece of hose between the pump and filter.  On the right is a picture of the reassembled pump and filter.  Notice how different the new hoses look?

I re-used the hose clamps that were there.  Ones I've seen before either have a worm screw to tighten them down, or pinch together and use spring action to hold everything together.  Not these: instead of pinching to release, they pinch and lock together.  Overall they give a tighter lock, but like I said before, I'd never have been able to install them inside the tank.

They all went back in pretty easily, and it was just a matter of reconnecting the fuel dump lines and replacing the inlet.  The picture to the right shows the fuel pump remounted and the fuel filter connected to the line that heads out to the fuel rails.

I'll give a big shout-out to Blue over at the Ducati Monster Forum for good instructions on how to replace the black beauty ring that goes around the outside of the fuel inlet.  That method of partially installing the inlet and then fitting the ring really did the trick.

The picture below shows my 1988 Honda Hurricane (CBR-600F) in the background.  It's got its own fuel problems.  One day.  For now the Ducati is running again.  It's back to starting with one revolution of the motor, which it hadn't done since about 2004 (I bought it in 2001).  My guess is that the fuel inlet will be off the tank again sometime in the future.


 ===

So it pays to pay attention at the gas pump.  I only put enough gas in the tank to cover the fuel pump, then I was off to the gas station.  With the rubber guide back in the tank it was hard to see the bottom of the tank when I was filling it, especially from the left side.  As I was leaning in to take a look, my left leg got a little too close to the exhaust.  Well, at least it will be a nice scar.
 
Sunday, June 5, 2016

Just Released, 5 years in the making, SAFMQ 0.8.4

Hey, you can get your copy of SAFMQ version 0.8.4 on SourceForge.  I know 5.5 years is too long between releases, but I've been busy...

Release 0.8.4 comes with these updates:
  • Updated for compatibility with OpenSSL 1.0.2h (most recent)
  • Windows installer package is built with OpenSSL 1.0.2h
  • Updates for compatibility with g++ 5.4.8 and 5.3.1
  • Updated for compatibility with Visual Studio 2010
  • Tested on:
    • Fedora 23 (g++ 5.3.1)
    • FreeBSD 10.6 (g++ 5.4.8)
    • Windows (32-bit)
    • Cygwin (g++ 5.3.1)
So there you go, a fresh (incremental) release.  The most important updates are the OpenSSL compatibility work and the bug fixes to the TLS code.

Wednesday, April 6, 2016

Install a SharePoint Publishing Design Package with PowerShell

Microsoft didn't provide a PowerShell cmdlet to install publishing design packages, and the one in the design gallery is just kinda overkill.  So I took it upon myself to make a simpler version, although it did require calling a couple of "internal" methods from the Microsoft.SharePoint.Publishing library.  Well, here it is, have at it (you'll also see some reflection examples for calling internal constructors and methods).

Oh, here's an important note: MAKE SURE PUBLISHING INFRASTRUCTURE IS ENABLED FIRST.  Otherwise it won't turn on, and you won't be able to access the Master page settings page even if you go directly to ChangeSiteMasterPage.aspx.
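
If you need to turn the publishing infrastructure on from PowerShell first, something like this should do it.  This is a quick sketch: PublishingSite (site collection scope) and PublishingWeb (web scope) are the standard publishing feature names, and the URL is obviously a placeholder.

# Enable the publishing infrastructure at the site collection, then publishing on the root web
Enable-SPFeature -Identity "PublishingSite" -Url "https://sharepoint.example.com/sites/mysite"
Enable-SPFeature -Identity "PublishingWeb"  -Url "https://sharepoint.example.com/sites/mysite"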



#########################################################################
#
#  Install-PublishingDesignPackage -Site <site url> -Path <package path>
#
#########################################################################
param(
    [Parameter(Mandatory=$true)][string]$Site,
    [Parameter(Mandatory=$true)][string]$Path
)

$DesignPackage = [Microsoft.SharePoint.Publishing.DesignPackage]
$DesignPackageInfo = [Microsoft.SharePoint.Publishing.DesignPackageInfo]
$BindingFlags = [System.Reflection.BindingFlags]

function _Using {
    param (
        [System.IDisposable] $inputObject = $(throw `
                  "The parameter -inputObject is required."),
        [ScriptBlock] $scriptBlock = $(throw "The parameter -scriptBlock is required.")
    )
    
    Try {
        &$scriptBlock
    } Finally {
        if ($inputObject -ne $null) {
            if ($inputObject.psbase -eq $null) {
                $inputObject.Dispose()
            } else {
                $inputObject.psbase.Dispose()
            }
        }
    }
}

# Invoke a non-public instance constructor on $Type via reflection
function Call-InternalConstructor([System.Type]$Type, [Object]$Parms) {
    $ParmTypes = @()
    foreach($parm in $Parms) { $ParmTypes += $parm.GetType() }

    return $Type.GetConstructor($BindingFlags::NonPublic -bor $BindingFlags::Instance,`
                $null, $ParmTypes, $null).Invoke($Parms)
}

# Invoke a non-public static method on $Type via reflection
function Call-InternalStaticMethod([System.Type]$Type, [string]$MethodName, [Object]$Parms) {
    $ParmTypes = @()
    foreach($parm in $Parms) { $ParmTypes += $parm.GetType() }
    return $Type.GetMethod($MethodName, $BindingFlags::NonPublic -bor`
                   $BindingFlags::Static, $null, $ParmTypes, $null).Invoke($null, $Parms)
}


_Using($spsite = New-Object Microsoft.SharePoint.SPSite($Site)) {
    if ($spsite -ne $null) {
        $Path = [System.Io.Path]::Combine((pwd).path,$Path)
        $finfo = Dir $Path
        $pkg = Get-SPUserSolution -Site $Site $finfo.Name -ErrorAction SilentlyContinue
        if ( ($pkg -eq $null) ) {
            try {
                $pkgInfo = Call-InternalConstructor $DesignPackageInfo ($finfo.Name)
                _Using($fileStream = New-Object System.IO.FileStream($finfo.FullName,`
                           [System.IO.FileMode]::Open, [System.IO.FileAccess]::Read)) {
                    if ($fileStream -ne $null) {
                        try {
                            Call-InternalStaticMethod $DesignPackage "Install" `
                                          ([Microsoft.SharePoint.SPSite]$spsite, `
                                              $pkgInfo, [System.IO.Stream]$fileStream)
                            $DesignPackage::Apply($spsite,$pkgInfo)
                        } catch [Exception] {
                            Write-host -ForegroundColor Red $_.Exception.ToString()
                        }
                    } else {
                        Write-Host -ForegroundColor Red "Unable to open $($finfo.FullName)"
                    }
                }
            } catch [Exception] {
                Write-host -ForegroundColor Red $_.Exception.ToString()
            }
        } else {
            Write-Host -ForegroundColor Red "$($finfo.Name) is already in the solution gallery of $Site"
        }
    } else {
        Write-Host -ForegroundColor Red "Unable to open site: " $Site
    }
}
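
Saved as, say, Install-PublishingDesignPackage.ps1 (the site URL and package name below are placeholders of my own), you'd run it from the SharePoint Management Shell roughly like this:

.\Install-PublishingDesignPackage.ps1 -Site "https://sharepoint.example.com/sites/mysite" `
                                      -Path ".\MyDesignPackage.wsp"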

Wednesday, October 21, 2015

Create and Provision SP2013 Usage and Health Service

One of my clients migrated from SharePoint 2007 to SharePoint 2013.  We took all of the steps to migrate to 2010 then migrate again to 2013.  Things went fine and dandy, but we hit some snags.

We had a limited hardware budget.  One way to reduce costs was to re-use hardware as we migrated.  Their SQL Server installation was in great shape: powerful, with space to grow as we migrated.  The only problem: it housed both production and a staging environment.  This meant that as we moved forward there could be three SQL instances with data at any given time, and space got a little cramped.

One of the surprises I had was how much space is taken up by the Usage and Health application.  It was undoubtedly the largest database in the instance, larger than any of our content databases.  It probably grew so large because of our search configuration and the level of rearrangement we were performing as the site was migrated.

So, to limit the space used, I put a hard size limit on the database.  New data was flowing into the tables, but most of the sites were reporting zero activity.  You may say the reports could have been accurate, but my client was concerned about the lack of data in them.

My next step was to decommission the existing Usage & Health service, and create a new one.  You can't do that through the user interface, so I did a little digging and came up with PowerShell commands to get everything going.
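
The decommission half looked roughly like this (a sketch from memory; be careful with -RemoveData, since it drops the old usage database along with the service application):

# Remove the old usage application and, with -RemoveData, its oversized database
$old = Get-SPUsageApplication
Remove-SPServiceApplication -Identity $old -RemoveData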

Next, allocate the new usage application:

New-SPUsageApplication -Name "Usage and Health Data Collection" `
                       -DatabaseServer "SQL2013"`
                       -DatabaseName "SP2013_UsageAndHealth"

I knew I had an alias for the server's instance, but I couldn't remember it.  I looked it up by checking a content database's info in the Application Management > Manage Content Databases section; it was SQL2013...

The next step was to provision the proxy, with some pretty stock settings:

$proxy = Get-SPServiceApplicationProxy `
  | where {$_.TypeName -eq "Usage and Health Data Collection Proxy"}
$proxy.Provision()


Some other blogs say you need to start the service on the individual server, but it's not showing up in my Services on Server list.  I double-checked a system that I know is working fine, and it's not listed on the Services on Server configuration page either, so I think I'm good to go.  I'll update with any changes as we go.