Wednesday, April 6, 2016

Install a SharePoint Publishing Design Package with PowerShell

Microsoft didn't provide a PowerShell cmdlet to install publishing design packages, and the one in the design gallery is a bit of overkill.  So I took it upon myself to make a simpler version, although it did require calling a couple of "internal" methods from the Microsoft.SharePoint.Publishing library.  Have at it (you'll also see some reflection examples for calling internal constructors and methods).

One important note: MAKE SURE PUBLISHING INFRASTRUCTURE IS ENABLED FIRST.  Otherwise the package won't activate, and you won't be able to access the Master page settings page even if you go directly to ChangeSiteMasterPage.aspx.



#########################################################################
#
#  Install-PublishingDesignPackage -Site <site url> -Path <package path>
#
#########################################################################
param(
    [Parameter(Mandatory=$true)][string]$Site,
    [Parameter(Mandatory=$true)][string]$Path
)

$DesignPackage = [Microsoft.SharePoint.Publishing.DesignPackage]
$DesignPackageInfo = [Microsoft.SharePoint.Publishing.DesignPackageInfo]
$BindingFlags = [System.Reflection.BindingFlags]

function _Using {
    param (
        [System.IDisposable] $inputObject = $(throw `
                  "The parameter -inputObject is required."),
        [ScriptBlock] $scriptBlock = $(throw "The parameter -scriptBlock is required.")
    )
    
    Try {
        &$scriptBlock
    } Finally {
        if ($inputObject -ne $null) {
            if ($inputObject.psbase -eq $null) {
                $inputObject.Dispose()
            } else {
                $inputObject.psbase.Dispose()
            }
        }
    }
}

function Call-InternalConstructor([System.Type]$Type, [Object]$Parms) {
    $ParmTypes = @()
    foreach($parm in $Parms) { $ParmTypes += $parm.GetType() }

    return $Type.GetConstructor($BindingFlags::NonPublic -bor $BindingFlags::Instance,`
                $null, $ParmTypes, $null).Invoke($Parms)
}

function Call-InternalStaticMethod([System.Type]$Type, [string]$MethodName, [Object]$Parms) {
    $ParmTypes = @()
    foreach($parm in $Parms) { $ParmTypes += $parm.GetType() }
    return $Type.GetMethod($MethodName, $BindingFlags::NonPublic -bor`
                   $BindingFlags::Static, $null, $ParmTypes, $null).Invoke($null, $Parms)
}


_Using($spsite = New-Object Microsoft.SharePoint.SPSite($Site)) {
    if ($spsite -ne $null) {
        $Path = [System.IO.Path]::Combine((pwd).Path, $Path)
        $finfo = Dir $Path
        $pkg = Get-SPUserSolution -Site $Site -Identity $finfo.Name -ErrorAction SilentlyContinue
        if ($pkg -eq $null) {
            try {
                $pkgInfo = Call-InternalConstructor $DesignPackageInfo ($finfo.Name)
                _Using($fileStream = New-Object System.IO.FileStream($finfo.FullName,`
                           [System.IO.FileMode]::Open, [System.IO.FileAccess]::Read)) {
                    if ($fileStream -ne $null) {
                        try {
                            Call-InternalStaticMethod $DesignPackage "Install" `
                                          ([Microsoft.SharePoint.SPSite]$spsite, `
                                              $pkgInfo, [System.IO.Stream]$fileStream)
                            $DesignPackage::Apply($spsite,$pkgInfo)
                        } catch [Exception] {
                            Write-host -ForegroundColor Red $_.Exception.ToString()
                        }
                    } else {
                        Write-Host -ForegroundColor Red "Unable to open $($finfo.FullName)"
                    }
                }
            } catch [Exception] {
                Write-host -ForegroundColor Red $_.Exception.ToString()
            }
        } else {
            Write-Host -ForegroundColor Red "$($finfo.Name) is already in the solution gallery of $Site"
        }
    } else {
        Write-Host -ForegroundColor Red "Unable to open site: $Site"
    }
}
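To run it, save the script as Install-PublishingDesignPackage.ps1 and call it from the SharePoint Management Shell.  The site URL and package file name below are placeholders; substitute your own:

```powershell
# Placeholders only -- use your own site collection URL and design package file.
.\Install-PublishingDesignPackage.ps1 -Site "http://intranet.contoso.com" -Path "MyDesign.wsp"
```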

Wednesday, October 21, 2015

Create and Provision SP2013 Usage and Health Service

One of my clients migrated from SharePoint 2007 to SharePoint 2013.  We took all of the steps to migrate to 2010 then migrate again to 2013.  Things went fine and dandy, but we hit some snags.

We had a limited hardware budget.  One way to reduce costs was to re-use hardware as we migrated.  Their SQL Server installation was in great shape: powerful, with space to grow as we migrated.  The only problem was that it housed both production and a staging environment.  This meant that as we moved forward there could be three SQL instances holding data at any given time, and space got a little cramped.

One of the surprises I had was how much space is taken up by the Usage and Health application.  It was undoubtedly the largest database in the instance, larger than any of our content databases.  It probably grew so large because of our search configuration and the amount of rearrangement we were performing as the site was migrated.

So, to limit data use, I put a hard size limit on the database.  New data was still flowing into the tables, but most of the sites were reporting zero activity.  You might argue the reports could still have been fairly accurate, but my client was concerned by the lack of data in them.

My next step was to decommission the existing Usage & Health service, and create a new one.  You can't do that through the user interface, so I did a little digging and came up with PowerShell commands to get everything going.
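The decommissioning half can be sketched like this.  A hedged example: it assumes the old application's TypeName starts with "Usage", and the -RemoveData switch drops the oversized database as well:

```powershell
# Find the existing Usage and Health application and remove it,
# including its (oversized) database.
Get-SPServiceApplication |
    Where-Object { $_.TypeName -like "Usage*" } |
    Remove-SPServiceApplication -RemoveData -Confirm:$false
```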

First off, allocate the new usage application:

New-SPUsageApplication -Name "Usage and Health Data Collection" `
                       -DatabaseServer "SQL2013" `
                       -DatabaseName "SP2013_UsageAndHealth"

I knew I had an alias on the server's instance but couldn't remember it, so I looked it up by checking a content database's info in the Application Management > Manage Content Databases section.  It was SQL2013...

The next step was to provision the proxy, with some pretty stock settings:

$proxy = Get-SPServiceApplicationProxy `
  | where {$_.TypeName -eq "Usage and Health Data Collection Proxy"}
$proxy.Provision()


Some other blogs say you need to start the service on the individual server, but it's not showing up in my Services on Server list.  I double-checked a system that I know is working fine, and it's not listed on the Services on Server configuration page either, so I think I'm good to go.  I'll update with any changes as we go.

Tuesday, August 18, 2015

Blurry Images & Blogspot

My sister-in-law and fellow blogger, Amber Richter, has been running into some of the same issues that I have.

Take for example the two pictures here from her latest deco project (see below).  They're both the same image, but Google/Blogger has an API that conveniently downsizes pictures to reduce the time it takes to get them to your phone, tablet, PC, or whatever.  Partially it's a holdover from the old days when browsers didn't have good image resizing, but with today's blog readers using phones and tablets, the smaller images make a big difference in download time.

If you use Blogger's built-in WYSIWYG editor, you probably won't run into issues like this.  But if you use an external editor, or if you're like me and dive straight into the HTML, you'll probably end up with something like the example below.

Let's dive in and see just what is happening.  Amber took both of these pictures with her digital camera, and the file name of the image is: DSCN0118.jpg

But these two images have very distinct URLs.

Version (1), blurry when shown at 640 pixels wide, URL:
https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjEEmuL-GCea_vVwRyd8OExNyt8vdZzrCvOxL4opCt0OXf3pzpwPWlxnL0XkNKgjg0JyA5ca-dzkgr_OWaMWxbm8eWZSsPCekkLqTne-LKX-Trd-OgsNRjsq7C_ahO1C172-6w8s0Koj6A/s320/DSCN0118.jpg

Version (2), nice and sharp at 640 pixels wide, URL:
https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjEEmuL-GCea_vVwRyd8OExNyt8vdZzrCvOxL4opCt0OXf3pzpwPWlxnL0XkNKgjg0JyA5ca-dzkgr_OWaMWxbm8eWZSsPCekkLqTne-LKX-Trd-OgsNRjsq7C_ahO1C172-6w8s0Koj6A/s1600/DSCN0118.jpg

So what's the difference?  It's that directory just in front of the file name: Version (1) is s320 and Version (2) is s1600.  The Blogger API is actually resizing the image on the server, shrinking or expanding it so that the longest dimension matches the size listed in the path.  Version (1) is being shrunk to 320 pixels wide.  Version (2) is being shrunk also, as the original was probably much larger, but a 1600 x 1200 image is plenty big for the web.

How can you fix it?  If you're using the Blogger on-line editor, switch to HTML mode and search for your image.  It will be in a tag that looks something like:

<img src="http://..../s320/DSCN1234.jpg" ... />

If you're not used to editing HTML, just jump right in and go for it.  Change that "sNNN" directory in the path, adjusting the number up or down to get the width you want.  Remember the blog canvas is only about 700 pixels wide, so you won't need to go much above that.  If you want a perfect fit, set the "s" directory to the same size as the width you need.

If you're using an external editor, the process is the same, just look for that image URL and update the size directory to fit your need.
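If you'd rather script the change, the size swap is just a path rewrite.  Here's a small PowerShell sketch (the function name is my own invention) that replaces the sNNN directory right in front of the file name:

```powershell
function Set-BloggerImageSize {
    param(
        [string]$Url,   # a Blogger-hosted image URL ending in /sNNN/filename
        [int]$Size      # the longest-dimension size you want, e.g. 1600
    )
    # Swap the /sNNN/ size directory immediately before the file name.
    return $Url -replace '/s\d+/([^/]+)$', "/s$Size/`$1"
}

# Example: bump a 320-pixel image up to 1600 pixels.
Set-BloggerImageSize "https://example.com/img/s320/DSCN0118.jpg" 1600
# -> https://example.com/img/s1600/DSCN0118.jpg
```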

Hey, give the pictures a click, and you'll see that they really are different sizes.

Version (1) of the image from the s320 directory, the image is 320 pixels wide but expanded to 640.

Version (2) of the image from the s1600 directory, the image is 1600 pixels wide, and shrunk to 640.





Wednesday, July 1, 2015

How Literal Interpretation of SharePoint Planning leads to Poor Security

I love the technical documentation from Microsoft.  It's the best technical documentation from any vendor I've ever worked with.  I love it, or at least I used to.

<RANT>Microsoft's documentation for SharePoint's development API is pretty sparse, but since we can de-compile code (did I just say that?),  we get the best documentation there is, the code itself.  <ENDRANT>

OK, that was my rant, sorry everyone.  What I really wanted to cover in this post is a tendency among many of my colleagues and clients to read Plan for administrative and service accounts in SharePoint 2013 literally.  I'm going to break down the Administrative and Service accounts.  Or jump straight to The Ugly.


The Good

  • Server Farm Account -- a.k.a. Farm Account: You must separate this account from the App Pool, otherwise: DANGER WILL ROBINSON, DANGER!  Also, this account really needs to be a member of the local Administrators group on each of the farm's servers, otherwise you won't be able to deploy Web Applications or any Solutions that add code to the GAC.
  • Application Pool Identity -- a.k.a. App Pool: Did you know that if you use the Farm Account as the App Pool account, it's possible to access all of the stored passwords from the Managed Accounts and Secure Store Service?
  • Default Content Access -- a.k.a. Search Crawl Account: also a bad idea to make this the Farm Account.  While that will work, the Search Crawl Account may need access to file shares and other web applications outside of SharePoint.
  • Synchronization Connection -- a.k.a. User Profile Service Account. Since this account needs special access in Active Directory, it's best to separate it out.  Some people re-use the Farm Account because the FIM components have to be run by that account anyway, but heck, do you really want all your eggs in one basket?  Although the App Pool account is the most vulnerable to attack, the Farm Account is second, because it hosts Central Admin.  Any code injected into 15/TEMPLATES/ADMIN runs as the Farm Account, so should the Farm Account really have access to all of your AD information?

The Bad

There are so many possible service accounts to deploy in SP2013 (and SP2010 for that matter) that it gets a bit dizzying.  It's all fine and good to create accounts for each of these, but it's a pain in the butt to actually get them all in there, even if you're using AutoSPInstaller.

The downside to using lots of service accounts is that each distinct account must run in a separate memory context (Application Pool, for those that only speak IIS).  That means you're using 0.5 - 2 GB per service account or application pool, whichever count is higher.  The fewer accounts you configure, the fewer application pools you're required to run.  The fewer application pools you run, the less memory you need, and the less likely it is that a w3wp instance gets swapped out or restarted.  All this leads to higher overall performance.

OK, so you say you want a w3wp process for each Web Application your farm hosts.  Fine, but what do you get?  You might be able to serve web pages faster (more threads, right?), but probably not, as SQL Server is the bottleneck in a SharePoint system.  And if you think that separating App Pools by Web Application means you can hot-install GAC code, think again: SharePoint only knows how to restart IIS, not a single pool.  Plus, you may have to restart the SharePoint Timer Service too if you have Feature Receivers.

Accounts that probably should be combined

So IMHO,  all of these accounts should just be the Application Pool account:
  • Access Services
  • Access Services 2013
  • Business Data Connectivity service
  • Secure Store Service
  • Usage and Health Data Collection Service
  • Visio Graphics Service
  • Word Automation Services
  • Excel Services
  • Managed Metadata Service
  • PerformancePoint Service
  • App Management Service
  • PowerPoint Conversion Service
  • Machine Translation service
  • Work Management
  • Distributed Cache (OK, so this actually needs access by the Farm and App Pool, so, it's weird).
All of these accounts must be the Farm Account by definition, so why does Microsoft even mention them?  (I don't know...)
  • Security Token Service
  • Application Discovery and Load Balancer Service


The Ugly

Here's my gripe.  Almost everyone I speak to who hasn't had formal Information Assurance (read: Computer Security) training reads the notes about the Setup User Account and thinks it should be a single, shared account.  Yep, you read that right: a shared account!  The first rule of Information Assurance is NO SHARED ACCOUNTS!  The minute an administrator changes job positions or otherwise leaves the organization, you're screwed.

I quote from Microsoft's "Plan for Administrative and..."


Setup user account: The user account that is used to run:
  • Setup on each server computer
  • The SharePoint Products Configuration Wizard
  • The Psconfig command-line tool
  • The Stsadm command-line tool
If you run Windows PowerShell cmdlets that affect a database, this account must be a member of the db_owner fixed database role for the database.

Microsoft doesn't mention it, but this account is the initial Farm Administrator as well.

What's the solution?  Here's the best way to go about this:
  1. Create an AD group called SP Farm Admins (or whatever makes sense to you).
  2. Grant DBO to SP Farm Admins on the SQL server before you even think of installing the SharePoint binaries.
  3. Add SP Farm Admins to the Administrators group on each of the farm servers.
  4. Add each of your organization's intended administrators to the SP Farm Admins group.
  5. Assuming you're one of those admins, log in using your own account and install SharePoint normally.
  6. Use the PowerShell cmdlet Add-SPShellAdmin to add the admins as shell-level administrators.
  7. Add SP Farm Admins to the Farm Administrators group in Central Admin.
  8. You could think about adding SP Farm Admins to one of those weirdo groups that SharePoint supposedly needs, but I've never seen it actually required.
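The shell-admin step can be scripted.  A hedged sketch: the CONTOSO account names are hypothetical, and in my experience Add-SPShellAdmin is granted per user rather than per group, so loop over your admins:

```powershell
# Hypothetical admin accounts -- substitute your own.
$admins = @("CONTOSO\alice", "CONTOSO\bob")

foreach ($account in $admins) {
    # Grants the SharePoint_Shell_Access role on the configuration
    # database (add -Database to cover specific content databases).
    Add-SPShellAdmin -UserName $account
}
```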
Now you're set.  When you're ready to pull a Milton (Office Space, keep up, people) and burn some bridges, all they have to do is disable your account, and anyone else who's a member of SP Farm Admins can take over and make sure each of those $0.01 transactions goes to that account in the Cayman Islands.

Thursday, May 7, 2015

Analytics & SharePoint: Pros and Cons

I've had the pleasure (or torture) of integrating four enterprise-class analytics packages with SharePoint.  I'm putting together this post to list the pros and cons I've experienced with each tool.  I know it's not an exhaustive list, but I hope you'll find it useful if you're exploring analytics in SharePoint.

All of these tools use external applications to develop dashboards and reports.  Each can produce many different types of charts and grid reports.  All are interactive in one aspect or another, and all are web-enabled (of course).


SQL Server Reporting Services (SSRS)

Licensing:
  • Included with SQL Server
Pros:
  • Reports can be developed with Visual Studio or Report Builder (Free Download)
  • Works equally well with relational or dimensional data sources
  • Exports reports to PDF, Excel, and many other platforms
  • Produces "pixel perfect" reports for viewing online or printing
  • Includes geospatial analytical charting
  • Uses standard HTML capabilities to render reports on many browser platforms
  • Works well with "touch" devices (tablets, phones)
  • Most flexible analytic tool listed 
  • Data can be "blended" from multiple sources
  • Data may be loaded from any ADO.Net data source (SQL Server, Oracle, Excel, Access, etc.)
Cons:
  • Requires the SSRS server to produce reports for users
  • Does not fully support "touch" interfaces for mouse-hover tool tips
  • Developing reports requires thorough knowledge of the data sources
  • Development user interface is geared toward experienced developers

Performance Point Services (PPS)

Licensing:
  • Included with SharePoint Enterprise

Pros:
  • Automatically detects the structure of the multidimensional data, allowing users to explore data via drill through.
  • Included in the SharePoint 2013 software distribution, no additional installation media required
  • When you drill-through items, only the affected components are refreshed, other KPIs and worksheets remain unaffected.
  • PPS dashboards are true dashboards, which consolidate a wide range of data that can either be tied together or disparate.  In this sense it is the only "true" dashboard tool presented here.

Cons:
  • Does not include geospatial analytical charting, but can be augmented with SSRS
  • Requires SQL Server Analysis Services dimensional data to operate
  • Requires two-button mouse support for full capabilities (drill-through and mouse hovers)
  • Dashboard development must be conducted on a system that is on the same domain as the SharePoint server.
  • Development user interface is geared toward experienced developers

Power View (a.k.a. Power BI)

Licensing:
  • Included with SharePoint Enterprise
Pros:
  • Microsoft Excel or web-based tools are used to author dashboards
  • Supported in both Office 365 and on-premises SharePoint 2010 and 2013
  • Includes geospatial analytical charting
  • Utilizes HTML5 for cross browser and platform support
  • Works well with "touch" devices (tablets, phones)
  • Data can be embedded in spreadsheets and automatically hosted by SSAS.
Cons:
  • Requires a Multidimensional data source or Tabular data source for data
  • Not fully "touch" enabled, popup tool tips require mouse hover
  • Although some editing of Power View dashboards can be done online, the best experience comes from using Excel 2013.

Tableau

Licensing:
  • Per core or per user licensing (ref)
  • Tableau Desktop licensed per user for development activities
  • Tableau Server or Tableau Online (cloud offering) for publishing, licensed separately.
  • A breakdown of costs has been put together by Brad Fair at Interworks.
Pros:
  • Beautiful worksheets and dashboards out of the box
  • Developing worksheets and dashboards is a drag & drop activity.
  • Data can be "blended" from multiple sources
  • Tableau "packaged dashboards" (.twbx files) are self-contained, with some or all data embedded.
  • "twbx" files can be distributed to users, who can view them using the free Tableau Reader.
  • Works best with single "flat" data sources or with  dimensional data such as SQL Server Analysis Services (SSAS)
  • Supports geospatial analysis and charting (i.e. plotting points on a map)
  • Software assurance (a.k.a. free upgrades) is included with maintenance costs
Cons:
  • Relatively expensive, if you already have an investment in SQL Server or some other BI suite with required yearly maintenance costs (see breakdown of costs)
  • Does not directly integrate with SharePoint; dashboards are shown via Page Viewer IFRAME HTML elements.
  • Not fully "touch" enabled; some features such as tool-tips require mouse-hover to show.
  • Tuning queries is difficult, since Tableau was designed for the embedded data model first
  • Tableau Online requires that reports be 100% "twbx" embedded data or have access to internet enabled data sources
  • When using dimensional data, textual (non-numeric) reports are difficult, and the dimensional model must be tuned to support them through "existence" facts.