Using PowerShell to Fix Custom Search Pages in SharePoint Online

Matthew Chestnut is a Senior Consultant at ThreeWill. He has over 20 years of software development experience around enterprise and departmental business productivity applications. He has a proven track record of quality software development, on-budget project management and management of successful software development teams.


A customer recently migrated from on-premises SharePoint 2013 to SharePoint Online. Unfortunately, the migration tool they used did not migrate their custom search pages correctly. With four custom pages for each of 40+ subsites, that’s 160+ pages that needed to be fixed. Fortunately, many of these subsite search pages were identical across the subsites, so scripting a solution to this problem was the approach I took.

Create the search page prototypes

The first step was to create an example of each search page to make certain the search refinement web part and content rollup/content search web part were configured correctly. The search pages are stored in the Site Pages document library and use the Wiki Page template, with a two-column layout, search refiner on the left and content search on the right.

The page prototypes I created included:

  • two corporate-level search pages that were identical across all subsites
  • two division-level search pages that were slightly different based on the four divisions

This resulted in ten search page prototypes (two corporate-level plus eight division-level).

Export the web parts

Now that I had working examples of search pages it was time to prepare for scripting by exporting the web parts to use for all the other search pages. The steps were:

  • edit the search page
  • click the drop-down context menu for each web part
  • choose the “Export…” option

This results in two web part files, “Refinement.webpart” and “Content Search.webpart”.

Each pair of “.webpart” files contains the search configuration for one of the ten prototype search pages. I placed each pair of web part files into an appropriately named folder so I wouldn’t get them mixed up.
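For reference, a “.webpart” export file is just an XML wrapper around the web part’s type and its property settings. The exact type name and properties vary by web part and were captured at export time; the values below are illustrative, not taken from the actual exports:

```xml
<webParts>
  <webPart xmlns="http://schemas.microsoft.com/WebPart/v3">
    <metaData>
      <!-- Assembly-qualified type of the exported web part -->
      <type name="Microsoft.Office.Server.Search.WebControls.RefinementScriptWebPart, Microsoft.Office.Server.Search, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" />
      <importErrorMessage>Cannot import this Web Part.</importErrorMessage>
    </metaData>
    <data>
      <properties>
        <!-- Search configuration properties captured at export time -->
        <property name="Title" type="string">Refinement</property>
      </properties>
    </data>
  </webPart>
</webParts>
```

Because the search configuration lives entirely in these property values, re-importing the same file onto another page reproduces the prototype’s behavior exactly.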

Identify the scripting technology to use

Here is a recap of what I needed to automate:

  • For each of the 40+ subsites…
  • For each of the four search pages…
  • Create a Wiki Page in Site Pages document library
  • Set the Text Layout for the Wiki Page to “two columns”
  • Add the appropriate Web Parts to the Wiki Page, based on whether it is one of the two Corporate search pages or one of the eight Division search pages
    • Left column, add the Search Refinement web part
    • Right column, add the Content Search web part

There are several options to use for scripting a SharePoint Online solution. This blog article, PowerShell Modules for Managing SharePoint Online (https://blogs.technet.microsoft.com/dawiese/2017/02/15/powershell-modules-for-managing-sharepoint-online/) does a great job of outlining the options available:

  1. Microsoft Online Services Sign-in Assistant
  2. SharePoint Online Management Shell
  3. Office Dev PnP PowerShell Module
  4. SharePoint Online Client-Side Object Model (CSOM)
  5. Azure AD Management Shell
  6. Security and Compliance Center

I’ve used the SharePoint Online Client-Side Object Model (CSOM) option before in solutions I’ve created for on-premises customers. This option exposes much of the SharePoint API and would’ve worked great.

However, I wanted a solution that was a bit more administrator-friendly and not so developer-friendly. The Office Dev PnP PowerShell Module fit that criterion perfectly. These commands use CSOM behind-the-scenes to perform provisioning and artifact management actions, and many of these commands can work against both SharePoint Online and SharePoint Server.

Using the Office Dev PnP PowerShell Module

You’ll need to install the following modules to prepare a machine to use PnP PowerShell:

PS> Install-Module -Name PowerShellGet -Force
PS> Install-Module SharePointPnPPowerShellOnline

More details about PnP PowerShell can be found here, https://github.com/SharePoint/PnP-PowerShell

Scripting the solution

Where am I now?

  • I have my search pages prototyped
  • The web part XML is saved to the local file system
  • I have the PnP PowerShell module installed for SharePoint Online

Now, it’s time to script the solution.

  • Connect to SharePoint Online
  • Get a handle to the SharePoint web (subsite) for division / plant
  • Create the wiki page in the Site Pages library
  • Add the web parts to the page

Below is the list of commands needed to create the search pages:

# connect to SharePoint Online
Connect-PnPOnline -Url "https://example.sharepoint.com/sites/corporate" -Credentials (Get-Credential)

# get the subweb for the division / plant
$web = Get-PnPWeb -Identity "/sites/corporate/division/plant" -ErrorVariable err -ErrorAction Continue
if ($err.Count -gt 0 -or $web -eq $null) { throw "Web not found: /sites/corporate/division/plant" }

# Wiki Page to create
$wikiPageUrl = "/sites/corporate/division/plant/SitePages/SearchPageOne.aspx"

# check to see if the page already exists (CAML query on the page file name)
$query = "<View><Query><Where><Eq><FieldRef Name='FileLeafRef' /><Value Type='File'>SearchPageOne.aspx</Value></Eq></Where></Query></View>"
$sitePage = Get-PnPListItem -Web $web -List "Site Pages" -Query $query
if ($sitePage -ne $null) {
    Write-Host "Page already exists:" $wikiPageUrl
}
else {
# create the Wiki Page with "Two columns" text layout
Add-PnPWikiPage -Web $web -ServerRelativePageUrl $wikiPageUrl -Layout TwoColumns
Write-Host "Page created:" $wikiPageUrl

# from the local filesystem, get the web parts to add to the wiki page
$pathRefinement = Join-Path $PSScriptRoot "webparts\Refinement.webpart"
if ((Test-Path $pathRefinement) -eq $false) { throw "File not found: $($pathRefinement)"}
$pathContentSearch = Join-Path $PSScriptRoot "webparts\Content Search.webpart"
if ((Test-Path $pathContentSearch) -eq $false) { throw "File not found: $($pathContentSearch)"}

# the Site Pages library has check out required enabled, so, check out file
Set-PnPFileCheckedOut -Web $web -Url $wikiPageUrl

# remove the web parts if they already exist; if they don't exist, no error is thrown
Remove-PnPWebPart -Web $web -ServerRelativePageUrl $wikiPageUrl -Title "Refinement"
Remove-PnPWebPart -Web $web -ServerRelativePageUrl $wikiPageUrl -Title "Content Search"

# add the Refinement web part to the "left column", row 1 column 1
Add-PnPWebPartToWikiPage -Web $web -ServerRelativePageUrl $wikiPageUrl -Path $pathRefinement -Row 1 -Column 1

# add the Content Search web part to the "right column", row 1 column 2
Add-PnPWebPartToWikiPage -Web $web -ServerRelativePageUrl $wikiPageUrl -Path $pathContentSearch -Row 1 -Column 2

# Now, check in page with comments
Set-PnPFileCheckedIn -Web $web -Url $wikiPageUrl -CheckinType MajorCheckin -Comment "Modified by $($PSCommandPath)"
}

To keep the example script as concise as possible I intentionally did not include much error checking. In the “production version” of the script I stored the subsite information (division, plant, etc.) in a comma-delimited (.csv) file. I executed the “create search page” commands in a loop for each of the entries in the file.
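As a sketch of what that production loop looked like (the CSV path and column names here are hypothetical; adjust them to match your own file):

```powershell
# Hypothetical subsites.csv with columns: Division,Plant,PrototypeFolder
$subsites = Import-Csv (Join-Path $PSScriptRoot "subsites.csv")

foreach ($subsite in $subsites) {
    # Build the subsite URL from the CSV columns
    $webUrl = "/sites/corporate/$($subsite.Division)/$($subsite.Plant)"
    $web = Get-PnPWeb -Identity $webUrl -ErrorAction SilentlyContinue
    if ($web -eq $null) { Write-Warning "Web not found: $webUrl"; continue }

    # Pull the web part files from the folder matching this subsite's prototype,
    # then run the same "create search page" commands shown above
    $webPartFolder = Join-Path $PSScriptRoot "webparts\$($subsite.PrototypeFolder)"
    # ... create wiki page, check out, add web parts, check in ...
}
```

Keeping the subsite list in a CSV meant re-running the script for a new division was just a matter of adding rows, not editing code.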


What I like about the PnP PowerShell module is the relative simplicity of the commands and scripts. Once the connection is established to SharePoint Online, the “PnP” commands work just like server-side PowerShell. There is no need to pass around a context variable like we need to do when using SharePoint Online CSOM. I think this makes this technology much more suitable for SharePoint administrators.

This script accomplished in minutes what would’ve taken me hours to do by hand. And, if changes in the refiners or content search were needed, no worries…just edit the web part XML as needed and re-run the script!


Migrating a Document Library and Retaining Authorship Information

Caroline Sosebee is a Software Engineer at ThreeWill. She comes to us with 20+ years of software development experience and a broad scope of general IT support skills.

Lately, I’ve had the opportunity to work on a couple of on-premises SharePoint 2010 to SharePoint 2013 migrations, which proved to be both fun and challenging.  Our team chose to leverage PowerShell and CSOM for handling all the heavy lifting when it came to provisioning the new site and migrating data.  This, in turn, led to one of the more interesting things I got to do: write a PowerShell script that migrates the contents of a document library from the old 2010 site to a document library in the new 2013 site, while at the same time retaining authorship information.

As I researched how to do this, I found that there wasn’t any one source that explained all the steps needed in a nice, concise manner. I found dribs and drabs here and there, usually around one step or another, or found that the code provided used server-side code, which wasn’t helpful in my case.

So, I decided it would be worthwhile to pull all the parts into one document, both for my own reference as well as for others. And yes, I admit it. I shamelessly pilfered from some of my co-workers’ fine scripts to piece this together. Thanks, team, for being so awesome!

Here is an outline of the high-level steps needed to move files between sites. You can find the full script at the bottom.

Basic steps

  • Get a context reference for each of the sites (source and target). This will allow you to work on both sites at the same time.
  • Load both the source and target libraries into their appropriate contexts. Load the RootFolder for the target library as well. This will be needed both when saving the document and when updating the metadata.
  • Have a valid user ID available to use in place of any invalid users found in the metadata being copied, and ensure this ‘missing user’ ID (via EnsureUser) in the target site.
  • Query the source library using a CAML query to get a list of all items to be processed. You can apply filters here to limit results as needed. (Attached code has parameters for specifying start and end item ids).
  • Loop through the items returned by the CAML query
    • Get a reference to the source item using the id
    • Get a reference to the actual file from the source item
    • Load the source file by calling ‘OpenBinaryDirect’ (uses the file ServerRelativeUrl value)
    • Write it back out to the target library by calling ‘SaveBinaryDirect’ on the just loaded file stream
    • Copy over the metadata:
      • Get a reference to the new item just added in the target library
      • Populate the target item metadata fields using values from the source item
      • Update the item
    • Loop

That’s it, in a nutshell. There are all sorts of other things you can do to pretty it up, but I thought I would keep this simple as a quick reference both for myself and others. Be sure to check the top of the script below for other notes about what it does and does not do.

<#
Copies documents from one library into another across sites.
This was tested using SharePoint 2010 as the source and both SharePoint 2010 and SharePoint 2013 as the target.

* Parameters can either be passed in on the command line or pre-populated in the script
* Example for calling from command line:
  ./copy-documents.ps1 "http://testsite1/sites/testlib1/" "domain\user" "password" "source lib display name" "http://testsite2/sites/testlib2/" "domain\user" "password" "target lib display name" "domain\user" 1 100

* Can cross site collections and even SP versions (e.g. SP2010 to SP2013)
* Allows you to specify both the source and target document library to use
* Can retain created, created by, modified, modified by and other metadata of the original file
* Can specify a range of files to copy by providing a starting id and ending id
* When copying metadata such as created by, will populate any invalid users with the provided 'missing user' value
* Uses a cache for user data so it doesn't have to run EnsureUser over and over for the same person

* Does not currently traverse folders within a document library.
* This only copies.  It does not remove the file from the source library when done.
#>


param(
	[string]$sourceSiteUrl = "",
	[string]$sourceUser = "",
	[string]$sourcePwd = "",
	[string]$sourceLibrary = "",
	[string]$targetSiteUrl = "",
	[string]$targetUser = "",
	[string]$targetPwd = "",
	[string]$targetLibrary = "",
	[string]$missingUser = "",
	[int]$startingId = -1,
	[int]$endingId = -1
)

## Load the libraries needed for CSOM
## Replace with the appropriate path to the libs in your environment
Add-Type -Path ("c:\dev\libs\Microsoft.SharePoint.Client.dll")
Add-Type -Path ("c:\dev\libs\Microsoft.SharePoint.Client.Runtime.dll")

function Main {
	Write-Host "[$(Get-Date -format G)] copy-documents.ps1: library $($sourceLibrary) from $($sourceSiteUrl) to $($targetSiteUrl)"
    # Get the context to the source and target sites
	$sourceCtx = GetContext $sourceSiteUrl $sourceUser $sourcePwd
	$targetCtx = GetContext $targetSiteUrl $targetUser $targetPwd

    # Ensure the "missing user" in the target environment
    $missingUserObject = $targetCtx.Web.EnsureUser($missingUser)
    $targetCtx.Load($missingUserObject)
	## Moved the try/catch for ExecuteQuery to a function so that we can exit gracefully if needed
	ExecuteQueryFailOnError $targetCtx "EnsureMissingUser"

	## Start the copy process
	CopyDocuments $sourceCtx $targetCtx $sourceLibrary $targetLibrary $startingId $endingId $missingUserObject
}

function CopyDocuments {
	param($sourceCtx, $targetCtx, $sourceLibrary, $targetLibrary, $startingId, $endingId, $missingUserObject)

    $copyStartDate = Get-Date

    # Get the source library
    $sourceLibrary = $sourceCtx.Web.Lists.GetByTitle($sourceLibrary)
	ExecuteQueryFailOnError $sourceCtx "GetSourceLibrary"

    # Get the target library
    $targetLibrary = $targetCtx.Web.Lists.GetByTitle($targetLibrary)
	## RootFolder is used later both when copying the file and updating the metadata.
	$targetCtx.Load($targetLibrary.RootFolder)
	ExecuteQueryFailOnError $targetCtx "GetTargetLibrary"

    # Query source list to retrieve the items to be copied
    Write-Host "Querying source library starting at ID $($startingId) [Ending ID: $($endingId)]"
    $sourceItems = @(QueryList $sourceCtx $sourceLibrary $startingId $endingId) # Making sure it returns an array
    Write-Host "Found $($sourceItems.Count) items"

    # Loop through the source items and copy
    $totalCopied = 0
    $userCache = @{}
    foreach ($sourceItemFromQuery in $sourceItems) {

        $totalCount = $($sourceItems.Count)

        if ($sourceItemFromQuery.FileSystemObjectType -eq "Folder") {
            Write-Host "skipping folder '$($sourceItemFromQuery['FileLeafRef'])'"
            continue
        }
		Write-Host "--------------------------------------------------------------------------------------"
        Write-Host "[$(Get-Date -format G)] Copying ID $($sourceItemFromQuery.ID) ($($totalCopied + 1) of $($totalCount)) - file '$($sourceItemFromQuery['FileLeafRef'])'"

        # Get the source item which returns all the metadata fields
        $sourceItem = $sourceLibrary.GetItemById($sourceItemFromQuery.ID)
        $sourceCtx.Load($sourceItem)
        # Load the file itself into context
		$sourceFile = $sourceItem.File
		$sourceCtx.Load($sourceFile)
		ExecuteQueryFailOnError $sourceCtx "GetSourceItemById"

		## Call the function used to run the copy
        $targetId = CopyDocument $sourceCtx $sourceItem $sourceFile $sourceItemFromQuery $targetCtx $targetLibrary $userCache $missingUserObject
        $totalCopied++
    }

    # Done - let's dump some stats
    $copyEndDate = Get-Date
    $duration = $copyEndDate - $copyStartDate
    $minutes = "{0:F2}" -f $duration.TotalMinutes
    $secondsPerItem = "{0:F2}" -f ($duration.TotalSeconds/$totalCopied)
    $itemsPerMinute = "{0:F2}" -f ($totalCopied/$duration.TotalMinutes)
	Write-Host "--------------------------------------------------------------------------------------"
    Write-Host "[$(Get-Date -format G)] DONE - Copied $($totalCopied) items. ($($minutes) minutes, $($secondsPerItem) seconds/item, $($itemsPerMinute) items/minute)"
}

### Function used to copy a file from one place to another, with metadata
function CopyDocument {
    param($sourceCtx, $sourceItem, $sourceFile, $sourceItemFromQuery, $targetCtx, $targetLibrary, $userCache, $missingUserObject)

    ## Validate the Created By and Modified By users on the source file
    $authorValueString = GetUserLookupString $userCache $sourceCtx $sourceItem["Author"] $targetCtx $missingUserObject
    $editorValueString = GetUserLookupString $userCache $sourceCtx $sourceItem["Editor"] $targetCtx $missingUserObject

    ## Grab some important bits of info
	$sourceFileRef = $sourceFile.ServerRelativeUrl
    $targetFilePath = "$($targetLibrary.RootFolder.ServerRelativeUrl)/$($sourceFile.Name)"

    ## Load the file from source
    $fileInfo = [Microsoft.SharePoint.Client.File]::OpenBinaryDirect($sourceCtx, $sourceFileRef)
    ## Write file to the destination
    [Microsoft.SharePoint.Client.File]::SaveBinaryDirect($targetCtx, $targetFilePath, $fileInfo.Stream, $true)
    $fileInfo.Stream.Dispose()

    ## Now get the newly added item so we can update the metadata
    $item = GetFileItem $targetCtx $targetLibrary $sourceFile.Name $targetLibrary.RootFolder.ServerRelativeUrl

    ## Replace the metadata with values from the source item
    $item["Author"] = $authorValueString
    $item["Created"] = $sourceItem["Created"]
    $item["Editor"] = $editorValueString
    $item["Modified"] = $sourceItem["Modified"]

	## Update the item
    $item.Update()
    ExecuteQueryFailOnError $targetCtx "UpdateItemMetadata"

    Write-Host "[$(Get-Date -format G)] Successfully copied file '$($sourceFile.Name)'"
    return $item.Id
}

## Get a reference to the list item for the file.
function GetFileItem {
	param($ctx, $list, $fileName, $folderServerRelativeUrl)

	$camlQuery = New-Object Microsoft.SharePoint.Client.CamlQuery
	if ($folderServerRelativeUrl -ne $null -and $folderServerRelativeUrl.Length -gt 0) {
		$camlQuery.FolderServerRelativeUrl = $folderServerRelativeUrl
	}
	$camlQuery.ViewXml = @"
<View><Query><Where><Eq>
	<FieldRef Name='FileLeafRef' />
	<Value Type='File'>$($fileName)</Value>
</Eq></Where></Query></View>
"@

	$items = $list.GetItems($camlQuery)
	$ctx.Load($items)
	ExecuteQueryFailOnError $ctx "GetFileItem"
	if ($items -ne $null -and $items.Count -gt 0){
		$item = $items[0]
	}
	else {
		$item = $null
	}
	return $item
}

## Validate and ensure the user
function GetUserLookupString{
	param($userCache, $sourceCtx, $sourceUserField, $targetCtx, $missingUserObject)

    $userLookupString = $null
    if ($sourceUserField -ne $null) {
        if ($userCache.ContainsKey($sourceUserField.LookupId)) {
            $userLookupString = $userCache[$sourceUserField.LookupId]
        }
        else {
            try {
                # First get the user login name from the source
                $sourceUser = $sourceCtx.Web.EnsureUser($sourceUserField.LookupValue)
                $sourceCtx.Load($sourceUser)
                $sourceCtx.ExecuteQuery()
            }
            catch {
                Write-Host "Unable to ensure source user '$($sourceUserField.LookupValue)'."
            }

            try {
                # Now try to find that user in the target
                $targetUser = $targetCtx.Web.EnsureUser($sourceUser.LoginName)
                $targetCtx.Load($targetUser)
                $targetCtx.ExecuteQuery()
                # The "proper" way would seem to be to set the user field to the user value object
                # but that does not work, so we use the formatted user lookup string instead
                #$userValue = New-Object Microsoft.SharePoint.Client.FieldUserValue
                #$userValue.LookupId = $user.Id
                $userLookupString = "{0};#{1}" -f $targetUser.Id, $targetUser.LoginName
            }
            catch {
                Write-Host "Unable to ensure target user '$($sourceUser.LoginName)'."
            }
            if ($userLookupString -eq $null) {
                Write-Host "Using missing user '$($missingUserObject.LoginName)'."
                $userLookupString = "{0};#{1}" -f $missingUserObject.Id, $missingUserObject.LoginName
            }
            $userCache.Add($sourceUserField.LookupId, $userLookupString)
        }
    }

	return $userLookupString
}

## Pull ids for the source items to copy
function QueryList {
    param($ctx, $list, $startingId, $endingId)

    $camlQuery = New-Object Microsoft.SharePoint.Client.CamlQuery
    $camlText = @"
<View>
	<Query>
		<Where>{0}</Where>
		<OrderBy>
			<FieldRef Name='ID' Ascending='True' />
		</OrderBy>
	</Query>
	<ViewFields>
		<FieldRef Name='ID' />
		<FieldRef Name='FileLeafRef' />
	</ViewFields>
	<QueryOptions />
</View>
"@

    if ($endingId -eq -1) {
        $camlQuery.ViewXml = [System.String]::Format($camlText, "<Geq><FieldRef Name='ID' /><Value Type='Counter'>$($startingId)</Value></Geq>")
    }
    else {
        $camlQuery.ViewXml = [System.String]::Format($camlText, "<And><Geq><FieldRef Name='ID' /><Value Type='Counter'>$($startingId)</Value></Geq><Leq><FieldRef Name='ID' /><Value Type='Counter'>$($endingId)</Value></Leq></And>")
    }

    $items = $list.GetItems($camlQuery)
    $ctx.Load($items)
	ExecuteQueryFailOnError $ctx "QueryList"

    return $items
}

function GetContext {
	param($siteUrl, $user, $pwd)
	# Get the client context to SharePoint
	$ctx = New-Object Microsoft.SharePoint.Client.ClientContext($siteUrl)
	$securePwd = ConvertTo-SecureString $pwd -AsPlainText -Force
	$cred = New-Object PSCredential($user, $securePwd)
	$ctx.Credentials = $cred
	return $ctx
}

function ExecuteQueryFailOnError {
	param($ctx, $action)
	try {
		$ctx.ExecuteQuery()
	}
	catch {
		Write-Error "$($action) failed with $($_.Exception.Message).  Exiting."
		exit 1
	}
}
### Start the process
Main

Configure Scheduled Shutdown for Virtual Machines in Azure

Eric Bowden has over 19 years of software development experience around enterprise and departmental business productivity applications.

Scheduled Shutdown in Azure

I’ve been using Virtual Machines in Azure for a while, and it’s a great option. My favorite part is how easy it is to expose Virtual Machines in Azure to the public internet, which can be really useful for a number of testing scenarios; it’s been a great resource. I’m able to use it because, as part of my MSDN subscription, I’m provided a monthly credit. The only issue is that the credit is only approximately enough to cover Virtual Machines that are running during working hours, not for the full month.

That’s approximately eight hours a day, five days a week, which the credit has been enough to cover. It works out great as long as I remember to start my Virtual Machines in the morning and stop them at the end of the day. Unfortunately, I’ve forgotten to stop my Virtual Machines a number of times, so they ran overnight. One time, they ran over the weekend, and that caused me to overrun the monthly credit that comes with my MSDN subscription.

A colleague of mine, Rob Horton, caught wind of that recently and directed me to the blog post I’m showing, which walks you through configuring what’s called a runbook in Azure. That runbook runs a PowerShell script, specifically a graphical PowerShell workflow (as opposed to a plain PowerShell workflow), which can be used to automatically stop your Virtual Machines.

I configured that a couple of weeks ago, and it’s been awesome! I configured mine to run at 6:00 PM, which is approximately the end of the day, and again at 2:00 AM, which could be the end of my day if it’s a long one; if I’m working on something into the night, 2:00 AM will probably be the latest. I have it run twice just to make sure everything shuts down. I’ll walk us through the process of configuring my runbook. I’m showing on the screen the blog post that I followed, but there was one key difference in how I configured mine versus how the blog post walks you through it: I used a user account to run my runbook, whereas the blog post walks you through using an application account.

I think there are probably good reasons for using an application account. I tried that and couldn’t get it to work, but I did get it working with a user account. I could probably circle back and get it to work with an application account, but I just haven’t put that effort into it. So today I’m going to walk you through how to configure it using a user account. Let me get started.

The first thing I’m going to do is create the account. To do that, I’m going to go into the portal and choose Active Directory to get into my Azure Active Directory accounts. I’m going to go to the users tab. I’m going to add a user. I’ll create a new user and I’m going to call this VM Maintenance. Choose next and we’ll just call it VM Maintenance. Just create it as a standard user and you will get a temporary password for that user. I’m going to copy that to my clipboard. We’ll go ahead and close that up.

I’m going to open up a new incognito window. Then I’m going to go ahead and log in as this new user. It’s going to prompt me to change my password, which I will do. Take my password and sign in. Now, my credential is set, and my next step is that I’ll need to configure this user as a co-administrator of my Azure subscription. To do that, I’m going to scroll down to Settings I’ll choose Administrators and then I’m going to choose Add. Add in my user. Select that user as a co-administrator.

You’ll see the user is actually listed in here twice at the moment because I did it earlier while preparing for this demo. Let’s just remove the first one. There we go. Hopefully, I removed the old one and not the new one, but we’ll see what happens. My account is now a co-administrator of this subscription. I know that seems pretty heavy-handed, but I’ve read blog posts, Stack Overflow posts, and so forth from people who state that this is necessary, and when I tried giving this user a lower privilege level it was not successful. This does seem to be necessary.

That should be what we need for our accounts. Next, I’m going to go back to the portal admin page, go to Automation Accounts, and add a new automation account. Click the plus sign, and we’ll call this VM Automation. I’m going to add it to an existing resource group. I’m not really sure why this matters, but I’ll choose the same resource group as I do for all my others. I’ll leave the other default settings and choose Create.

It may take just a few minutes for the automation account to finish the creation process; for me, I had to click refresh on the listing of automation accounts. Once the new automation account is open, the first thing we want to do is go to Assets, click Variables, and add a new variable called Azure Subscription ID. For that, we of course need our Azure subscription ID, so go back to the portal, go to My Subscriptions, grab the Azure subscription ID, and paste it in.

Next, we’re going to need to create a credential which will be used to run the runbook, so I’m going to click Credentials, click Add Credential, and name this one Azure Credential. I’m going to add in the VM Maintenance account that I created earlier. Now we’ve created a variable and a credential, and our next step is to create the runbook. I’m going to choose Add runbook; rather than create one from scratch, I’m going to choose Browse Gallery and then choose Stop Azure Classic VMs.

You’ll notice that there are two of them: one that says PowerShell Workflow Runbook and another that says Graphical PowerShell Workflow. The Graphical PowerShell Workflow is the one we need, so we’ll choose that and then select Import. I can accept the default name, choose OK, and import it from the gallery. My next step is to choose Edit. There is a button for the test pane, and it prompts me for parameters. The service name will default, I guess, to the current service. The Azure Subscription ID and the name of the credential both have defaults which we configured a moment ago: you can see Azure Subscription ID is the variable we just configured, and Azure Credential is the credential we just created.

When we’re ready, we can just click Start to begin the test. I’m going to scroll over a little bit here to see the output. It takes a while for these to run, so I’m just going to pause the video while it runs. When the workflow has completed, as you can see here, it gives you a little bit of output. In this case, it tells me that my Virtual Machines were already stopped, so no action was taken. Had any of them been running, it would list out those that it had stopped.

Let me close out of the test pane. My next step is to publish this runbook, so it prompts me: do I want to publish it? Of course I say Yes. Now we have a runbook created, our accounts are created, and we have tested our runbook, so our next step is to schedule it. I’m going to go back to VM Automation and select the runbook. Once the runbook is open, I’ll click Schedule at the top, and it will say “Link a schedule to your runbook,” so I’ll choose that. I’m going to create a new schedule.

In this case, I’m just going to choose a one-time schedule. Of course, you can choose once or recurring; I’ll just choose once and have it fire off here in the next 20 minutes. We’ll choose 12:20 PM, and I’m going to say Create. Now, one trick is that I thought simply creating the schedule meant I was finished, but you’re not: all you have done at this point is create a schedule, and you still need to link the runbook to it.

Now I’ve selected it and choose OK. Now you should see the number of schedules increase here under the schedules listing. I’ll open it, and now I can see that my runbook is scheduled and we’ll wait a moment for that schedule to run to confirm that it’s working. To do that, I’m going to go back to my classic virtual machines. Let me just start up one of my smaller virtual machines here.

It’ll start up and hopefully it’ll be fully started by the time that schedule fires off. Let’s go back to Automation Accounts, open that automation account, open the runbook, and open the schedule. We’ll see it’s set to fire off at 12:20, which is just about five minutes from now; they won’t allow you to schedule a run any sooner than five minutes out, so we’ll wait just a moment for that to fire off.

Now I can see that my schedule is listed as expired which tells me that it’s already run. So if I go back to my classic virtual machines, I can confirm now that my virtual machine has been stopped. That’s the quick run-through for creating a runbook which you can use to stop your virtual machines.
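For reference, the gallery runbook is graphical, but what it does is roughly equivalent to the following PowerShell workflow sketch. The cmdlets are from the classic (Service Management) Azure module, and the asset names match the Azure Credential and Azure Subscription ID assets created above; details may differ from the actual gallery runbook:

```powershell
workflow Stop-AzureClassicVMs {
    # Pull the assets configured earlier in the automation account
    $cred = Get-AutomationPSCredential -Name "Azure Credential"
    $subscriptionId = Get-AutomationVariable -Name "Azure Subscription ID"

    # Authenticate with the classic (Service Management) cmdlets
    Add-AzureAccount -Credential $cred
    Select-AzureSubscription -SubscriptionId $subscriptionId

    # Stop and deallocate every classic VM that is not already deallocated
    $vms = Get-AzureVM
    foreach ($vm in $vms) {
        if ($vm.Status -ne "StoppedDeallocated") {
            Stop-AzureVM -ServiceName $vm.ServiceName -Name $vm.Name -Force
        }
    }
}
```

Deallocating (rather than merely stopping) is what actually stops the compute charges, which is why the runbook checks for the StoppedDeallocated status.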


Update SharePoint Online System Fields via CSOM

Kirk Liemohn is a Principal Software Engineer at ThreeWill. He has over 20 years of software development experience with most of that time spent in software consulting.

I recently found myself in a dilemma where I needed to update the Created, Created By, Modified, and Modified By fields for SharePoint list and library items in SharePoint Online.  I had read in the past that some of these couldn’t be changed via CSOM, but I had seen migration tools do this when migrating to SharePoint Online, so I knew this must be possible.

Either my Google/Bing Foo was not good that day or there were just too many people out there saying it wasn’t possible, so I decided to figure it out for myself.  It took a little trial and error, so I decided I should write it up in a blog post.

The short answer is to look at the source code. The main tricks I needed to deal with were:

  1. When specifying the Created By (Author) or Modified By (Editor) user, I could not use the FieldUserValue object. Instead, I had to hand-craft it by concatenating the hidden user information list item ID and the fully qualified login name (e.g., “22;#i:0#.f|membership|[email protected]”).
  2. For some reason, you can set the Created By user (Author) just fine with list items. However, if you want to set the Created By user on library items, you need to also set the Modified By user or the change won’t stick.
  3. If you want to do this covertly, there are two extra steps to consider:
    1. To prevent creating a new version in version history, you may need to turn off version history before making the update and turn it back on afterward.
    2. To prevent updating the modified date, you need to set the modified date to the current modified date value. Otherwise, it will get updated to the current date.
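Pulling those tricks together, here is a minimal, untested sketch of what the update can look like in PowerShell with CSOM. The list name, item ID, user information list item ID, and login name are all placeholders, and $ctx is assumed to be an already-authenticated ClientContext:

```powershell
# Assumes $ctx is an authenticated Microsoft.SharePoint.Client.ClientContext
$list = $ctx.Web.Lists.GetByTitle("MyLibrary")   # placeholder list name
$item = $list.GetItemById(42)                    # placeholder item ID

# Trick 1: hand-craft the user value string instead of using FieldUserValue.
# "22" is the user's item ID in the hidden user information list (placeholder).
$userValue = "22;#i:0#.f|membership|[email protected]"

$item["Author"] = $userValue    # Created By
$item["Editor"] = $userValue    # Trick 2: also set Modified By for library items
$item["Created"] = [DateTime]"2015-06-01T12:00:00Z"

# Trick 3b: to avoid bumping the modified date, explicitly re-set it to its
# current value (load the item and read it first if you don't already have it).
$item["Modified"] = [DateTime]"2015-06-01T12:00:00Z"

$item.Update()
$ctx.ExecuteQuery()
```

If version history is enabled on the library, trick 3a applies as well: turn versioning off before the update and back on afterward so no new version is created.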

Azure Web Jobs

Kirk Liemohn is a Principal Software Engineer at ThreeWill. He has over 20 years of software development experience with most of that time spent in software consulting.

Danny Ryan:                       Hi this is Danny Ryan and welcome to the ThreeWill Podcast. Today I’ve got Kirk Liemohn here with me. Kirk is a Principal Software Engineer here at ThreeWill. Thanks for joining me Kirk.

Kirk Liemohn:                    Hello. Good to be here.

Danny Ryan:                       Great. I’m pulling you in for … We’re going to hit a technical subject today and that subject is web jobs. These are Azure Web Jobs. Is that correct?

Kirk Liemohn:                    That is correct.

Danny Ryan:                       Awesome. I’d love to learn a little bit more about what this is. Maybe a little bit of history of how you solved this problem before in the past. Some of the benefits of the way you’re taking your approach today. Tell me a little, maybe just starting at a high level, what is an Azure Web Job?

Kirk Liemohn:                    It’s just a task that runs in the background on Azure, within an Azure app. You can use it just to run anything that might take a long amount of time maybe, or just something that you want to run on a periodic basis.

Danny Ryan:                       Great. What would … is there an equivalent type of thing on SharePoint as far as the way you would approach this?

Kirk Liemohn:                    Yeah. In the past, in SharePoint we would do a timer job, definitely, if we had access to do that. You can’t do that in SharePoint Online, but on-premises you could write your own timer job in SharePoint. You could do a scheduled task on a Windows server as well, or even a client machine if you wanted to. Those are the ways we did it in the past. I’ve got a Unix background from back in the ’90s, and back then it would have been a cron job.

Danny Ryan:                       Excellent. Sort of the history of the way that you periodically ran a piece of software or …

Kirk Liemohn:                    Yep.

Danny Ryan:                       I know when we were talking about prepping for this you also mentioned that you could use this for long-running processes as well?

Kirk Liemohn:                    Yeah. Say you’ve got a website, especially if it’s running in Azure and you’re using the PaaS solution for doing websites in Azure. It needs to do some processing, and for whatever reason that processing takes more time than you want to make the user wait. Even if it’s just a few seconds, that might be enough to say, “Hey, let’s do this asynchronously,” and you could push it off to a web job. If it might take minutes or hours, you would certainly want to consider a web job for background processing.

It may not be something you do on a scheduled basis; it may be initiated by a user. Suppose, on that website, you were to try to spawn a thread so that the user didn’t have to wait out that delay. If you’re using IIS, it’ll potentially kill your thread because it thinks, “Well, I’ve already returned to the user and I don’t need to have anything else going on. I see there’s something else going on, but I don’t care. I’ve already returned control to the user.” It has no use for the thread you’ve created, so it’s better to do it with something like a background process via a web job.

Danny Ryan:                       Tell me a little bit about some of the things that you learned while doing this. Talk a little bit about how did you package this up and get it on Azure.

Kirk Liemohn:                    First off, this is my first time doing a web job. I’ve wanted to do them in the past. I’ve had some times I didn’t have to do them, but I kind of wanted to do them. This is a time I really needed to. I am doing a migration to Office 365 and we’re concerned about throttling. This is SharePoint, this is a SharePoint migration. We know that we might get throttled by SharePoint online. We want to know when we’re being throttled.

The best way we can do that is to look at the logs for the migration tool and read that information and find out if there’s an entry in the log that indicates that we’re being throttled. I’ve got several migrations running at once on several servers. I didn’t want them to each go through and look for this because it is a little costly. I have to do a full text index of the content and a query across that. I’ve decided that I want to run it on a periodic basis. A job that will do that check and then it will post something to a simpler database that says, “Yep, I’ve been throttled or not.”

It’ll be a lot easier to say, create a dashboard that shows what the throttle status is, if there’s been any throttles recently, as well as the migration … The servers that are doing migrations they can do that check and say, “Oh, if we’ve been throttled recently I don’t want to start another migration job. I’m going to hold off until I’ve got no throttles within the last hour or something.”

Danny Ryan:                       Awesome. How are you alerted about this? You’ve mentioned a dashboard. I guess you could do it through a dashboard, but how else?

Kirk Liemohn:                    I haven’t created the dashboard yet; I might do that. I used Visual Studio, which is what put it into the Azure Web Job for me. I started out making it as a PowerShell script, and my PowerShell script relied on a cmdlet for SQL that required three MSIs to get working. I didn’t know how to install MSIs in a web job. I don’t think you can. I know you can upload a bunch of files, but it was simpler for me to just switch it over to a console app.

I switched to a console app, written in Visual Studio. Visual Studio made it real easy to just right-click on the project and say, “Publish this web job.” There are a few settings you have there, like how often you want it to run. Then from the Azure admin console you can see the status, like any output from your console app.

Danny Ryan:                       You mentioned that this runs every hour, but that’s primarily because of the type of Azure account that you have. Is that correct?

Kirk Liemohn:                    That’s right. I have it running once an hour. That’s the fastest I can have it run since I think we’re on the basic plan for our Azure web app. If I were to change that to standard or premium I could do it more often. We may do it for that reason; we haven’t decided that we need to go to that level yet. While it’s running it simply does the query, the full-text query. It’s against SQL Server, which is running on Azure machines, so I have to have access to that SQL Server over the internet, and I’ve got special accounts and ports for that. Then if it finds something it will log something to a different database table and it will also send an email. I use SendGrid for that, which I’ve used in the past with Azure, so it was pretty easy to set that up.

Danny Ryan:                       It’s a pretty easy configuration for that?

Kirk Liemohn:                    Yeah, it probably took me 30 minutes, and I hadn’t used SendGrid in several months, maybe a year. It wasn’t fresh in my memory, but about 30 minutes to figure out how to send email, basically. Set it up and configure it and all that.

Danny Ryan:                       It’s funny you mention this is the first one you’ve done. I think that’s the best time to give advice to other people who might be doing this for the first time. Any other pointers that you would give to people for writing Azure web apps, web jobs?

Kirk Liemohn:                    Yeah, web jobs. The first thing is to try and understand what it relies on. If it’s relying on certain other pieces of software or something installed on your system, then it may not be appropriate, or you may need to find a way to get those up into Azure. If you need other files … in my case, initially I needed some other cmdlets for PowerShell, and that wasn’t going to be the easiest path for me, so an all-encompassing console app that knew how to talk to SendGrid and knew how to talk to databases was all I needed.

Danny Ryan:                       It’s funny, it seems like console apps are coming back, because I was talking to Chris yesterday and he wrote the Jive to SharePoint migration tool. That’s a console app as well. It seems like they’re coming back in fashion now.

Kirk Liemohn:                    Oh yeah. All of our stuff for migration is mostly PowerShell based and we run all that from consoles, so definitely. The gotcha is just try and understand what your program relies on. Try and understand what your needs are in terms of how often this thing needs to run. Those are kind of the two things that I ran up against.

Danny Ryan:                       That’s great. I appreciate you taking the time to do this. I know you’re busy on your project but hopefully this will help somebody who may be out there looking to write their first Azure Web Job. Thank you for doing this Kirk.

Kirk Liemohn:                    Sure. You’re welcome.

Danny Ryan:                       Thanks everybody for listening. Have a great day.


Populating a Multi-Value People Picker with PowerShell

Will Holland is a Senior Software Engineer at ThreeWill. Will has proven to be adept at understanding a client’s needs and matching them with the appropriate solution. Recently he’s developed a passion for working with .NET, MVC, and cloud-based solutions such as Microsoft Azure and Office 365.

On my current project, a large scale migration from a SharePoint dedicated environment to SharePoint Online, my team is using a “Site Inventory” list to keep track of the over 52,000 site collections in our client’s source. Since the source is “live” and things are constantly evolving, we get periodic CSVs containing the most recent data regarding the source.

Naturally, 52,000+ rows is a LOT of data to go look through, so we created a PowerShell script that would parse the updated information, compare it to the Site Inventory list, and update any changes we find. Among the columns in our list are a few “Owner” columns (primary and secondary) and an “All Administrators” column. All three of the columns are just text fields that contain the login names of users in the dedicated environment, and we wanted to aggregate the three fields into one multi-value people picker.

Sounds easy, right? I ended up having quite a struggle, spending more time than I felt necessary dredging the depths of the internet for answers.

I knew that I had to use the Web.EnsureUser method to turn my username into a User object so I could grab the ID. I also knew that I needed to turn that ID into a lookup value since a people picker is, more or less, a special kind of lookup column. Finally, I knew that my “AllOwners” column needed an array of lookup values. That last part was where the issue came in.

$userName = "whd\wholland"
$spuser = EnsureUser $context $userName
if($spuser -ne $null){
     $spuserValue = New-Object Microsoft.SharePoint.Client.FieldUserValue
     $spuserValue.LookupId = $spuser.Id
     $listItem["AllOwners"] = @($spuserValue)
}

After going through the steps of turning a username into a FieldUserValue object (the ‘special’ lookup value I mentioned earlier), I would simply wrap that FieldUserValue in an array and attempt to set it as the value of the “AllOwners” field. My expectation was that, in doing so, I was creating an array of FieldUserValues. Expectations and reality don’t always line up.

As it turns out, unless you specify a type, PowerShell will create an array of the most generic type it can. In my case, I was getting an array of generic Objects.
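You can see this behavior in a plain PowerShell session, no SharePoint required. This is just an illustrative sketch with made-up variable names:

```powershell
$plain = @(1, 2, 3)
$plain.GetType().FullName    # System.Object[] - the most generic element type

# Casting the whole array produces a strongly typed array
$typed = [int[]]$plain
$typed.GetType().FullName    # System.Int32[]
```

The same cast syntax works for any type, including [Microsoft.SharePoint.Client.FieldUserValue[]].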

Before I tried to set the value of my field, I needed to cast my array to be, specifically, an array of FieldUserValue objects. Below you’ll find the code snippet that sorted the issue for me.

$userName = "whd\wholland"
$spuser = EnsureUser $context $userName
$lookupValueCollection = @()
if($spuser -ne $null){
     $spuserValue = New-Object Microsoft.SharePoint.Client.FieldUserValue
     $spuserValue.LookupId = $spuser.Id
     $lookupValueCollection += $spuserValue
}
$userValueCollection = [Microsoft.SharePoint.Client.FieldUserValue[]]$lookupValueCollection
$listItem["AllOwners"] = $userValueCollection

Simulating SPWeb.ProcessBatchData Batch Deletes

Pete is a Director of Technology at ThreeWill. Pete’s primary role is driving the overall technology strategy and roadmap for ThreeWill. Pete also serves as ThreeWill’s Hiring Manager and is constantly looking for new talent to join the ThreeWill family.

Over the past few days, I have been migrating some content from a separate product into Office 365 and SharePoint 2013 Online. While testing the export and import process, I used a modified version of the script I wrote about in With The Death Of The DataSheet, Keep This Script In Your Back Pocket. The new version of the script used a CSV file this time and a slightly different import method, but the scripts are largely the same (the modified script for this article can be found in this Gist).

Using the Add-SPListItemsFromCSV.ps1 script was a great time saver. I could test loading more than 500 list items in a test list in Office 365 very quickly. But after a couple test runs, it was very painful to review the changes and see if the imports were successful.

In order to improve the process, I needed to keep the lookup list content and delete the main list content so I could easily review my incremental field and item additions. Immediately I thought to myself, “Ah, I’ve done this before using PowerShell and ProcessBatchData().” Well, not so fast slick. The world is changing and using the SharePoint Server Object Model isn’t an option here. As it turns out, SPWeb.ProcessBatchData() is not possible from CSOM, nor is any type of batching – there will be no leveraging of old code this time.

But wait, isn’t CSOM just batching up calls anyway until I call ExecuteQuery()? So why not just create a series of CSOM Delete calls and then call ExecuteQuery() to simulate a ProcessBatchData() call? Turns out, that works quite well! The image below is a Fiddler screen shot of the batch request for deleting items in the list.


Looking at the traffic from Fiddler, you can see that the request is batched as a series of DeleteObject() method calls. The collapsed ObjectPaths portion of the screen shot contains all of the parameter information for each of the DeleteObject() Actions in the Actions section.

Here is the script that does all of the deleting. The Clean-SPListItems.ps1 script lets you specify an Office 365 tenant site, a list name, a userid and a batch size and will prompt for your password. The meat of the script is listed below.

#create objects to be used to delete in batches
$camlQuery = New-Object Microsoft.SharePoint.Client.CamlQuery
#Create a query to get items by id in batches
$camlQuery.ViewXml = "<View><ViewFields><FieldRef Name='Id'/></ViewFields><RowLimit>" + $batchSize + "</RowLimit></View>";
$camlQuery.ListItemCollectionPosition = $null	#set initial query position (starts as null)
$itemsDeleted = 0
do
{
	$items = $list.GetItems($camlQuery)	#execute the query
	$ctx.Load($items)			#load the items
	$ctx.ExecuteQuery()			#execute the entire operation set
	$retCount = $items.Count
	if ($retCount -gt 0)
	{
		Write-Verbose -Message "Query returned $retCount items..."

		Write-Verbose -Message "Building batch..."
		$items |
			% {
				$delItem = $list.GetItemById($_.Id)
				$delItem.DeleteObject()	#queue the delete in the batch
			}
		Write-Verbose -Message "Executing batch..."
		$ctx.ExecuteQuery()		#send the batched deletes in one request
		$itemsDeleted = $itemsDeleted + $retCount
	}
	else
	{
		Write-Verbose -Message "No list items returned."
	}
} while ($retCount -gt 0)

PowerShell and CSOM are a pretty powerful combination. Remember, just when you think you have to rely on the old Server Object Model calls, you just might find a different solution to your problem using some of that new-fangled CSOM stuff you’ve been so scared to try out.

Happy batching…


With The Death Of The DataSheet, Keep This Script In Your Back Pocket

Pete is a Director of Technology at ThreeWill. Pete’s primary role is driving the overall technology strategy and roadmap for ThreeWill. Pete also serves as ThreeWill’s Hiring Manager and is constantly looking for new talent to join the ThreeWill family.

I recently noticed the lack of Datasheet mode in Office 365 – at least the Datasheet that I used to know and love! I was poking around, looking for a way to upload a spreadsheet to a site in SharePoint Online, and was shocked to discover my old friend, the datasheet, was gone! Some of you may say, “Use the Import Spreadsheet app.” For reasons I won’t explain here, that was not an option I could use for now.

After a couple minutes of head scratching (scalp scratching if you know me well…), I decided to script the item additions. I created my list and then used PowerShell and CSOM to upload my list items. Below is the script…

I leveraged a good deal of Chris O’Brien’s post, but made it my own. The intent was to get something accomplished quickly, so don’t hold the code below to “production” standards. I just thought I would post it in case someone else wanted a “simple” solution.

WARNING: This script is modified and NOT tested after the modifications to list and column names. Your mileage will vary… and I have tried to make it obvious what needs to be changed.

# Load SharePoint 2013 CSOM libraries.
Add-Type -Path "c:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.dll"
Add-Type -Path "c:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"

# Authenticate to SharePoint Online
$siteUrl = Read-Host -Prompt "Enter Site URL"
$userid = Read-Host -Prompt "Enter User ID"
$password = Read-Host -Prompt "Enter password" -AsSecureString

$ctx = New-Object Microsoft.SharePoint.Client.ClientContext($siteUrl)
$credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($userid, $password)
$ctx.Credentials = $credentials

$list = $ctx.get_web().get_lists().getByTitle('YourListName');
$path = "path_to_tab_delimited_file"
#Using Get-Content instead of Import-CSV for a reason... content was tab delimited since data contained ","
$tdc = Get-Content -Path $path
$i = 0
foreach($line in $tdc)
{
     #skip the first line for the header
     if ($i -gt 0)
     {
          $vals = $line.Split("`t")
          # Create list item.
          $itemCreateInfo = New-Object Microsoft.SharePoint.Client.ListItemCreationInformation
          $listItem = $list.addItem($itemCreateInfo);
          $listItem.set_item('Title', $vals[2]);
          $listItem.set_item('Col2_Name', $vals[4]);
          $listItem.set_item('Col3_Name', $vals[0]);
          $listItem.set_item('Col4_Name', $vals[1]);
          $listItem.set_item('Col5_Name', $vals[3]);
          $listItem.set_item('Col6_Name', $vals[9]);
          $listItem.update();
          Write-Host "Added " $vals[2]
     }
     else
     {
          Write-Host "Skipping first line $i"
     }
     $i++
}
$ctx.ExecuteQuery()

The most important bits are:

  1. the authentication, which is required to make the CSOM calls
  2. the loading of the tab-delimited file content with Get-Content
  3. the assignment of the $vals[n] array values to the list item for adding to the list

Although I am very sad to see the datasheet go, this gave me a chance to scratch a PowerShell itch with a simple but useful script.


Ask questions below or share your favorite memory of the DataSheet…


Using SharePoint Site Templates in SharePoint Online Dev Sites

Kirk Liemohn is a Principal Software Engineer at ThreeWill. He has over 20 years of software development experience with most of that time spent in software consulting.

With SP2013 you use a “Development Site” site definition when creating a site collection so that you can deploy apps using Visual Studio to test them out.  Using this site definition is required for Visual Studio “F5” integration.  However, with these types of sites comes a problem within SharePoint Online – sometimes.  In my limited testing I have observed that in some cases you can create sub-sites underneath a Development Site that uses a custom site template (a site template you uploaded to the solution gallery in your Development Site).

However, sometimes you cannot and you get an error:

This web template requires that certain features be installed, activated, and licensed.  The following problems are blocking application of the template:

Feature Description – SearchDriveContent Feature
Feature Scope – Site collection
Feature ID – 592ccb4a-9304-49ab-aab1-66638198bb58
Problem – Not Activated

Here is the PowerShell for fixing this (see also the attached Activate-SPFeatureForDevSite.ps1 script):


param($siteUrl)

if ($siteUrl -eq $null -or $siteUrl.Length -eq 0)
{
    Write-Host "siteUrl parameter required" -ForegroundColor Red -BackgroundColor Black
    Write-Host "Usage:    Activate-SPDevFeatureForDevSite.ps1 -siteUrl dev_site_url"
    Write-Host "Example:  Activate-SPDevFeatureForDevSite.ps1 -siteUrl ""https://mydomain.sharepoint.com/sites/MyDevSite"""
    Exit 11
}

$user = Read-Host "Input O365 user name"
$pwd = Read-Host -AsSecureString "Input password"

Import-Module Microsoft.Online.SharePoint.Powershell
$cc = New-Object Microsoft.SharePoint.Client.ClientContext($siteUrl)
$cc.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($user, $pwd)

$site = $cc.Site
$devFeatureGuid = New-Object System.Guid "592ccb4a-9304-49ab-aab1-66638198bb58"
$site.Features.Add($devFeatureGuid, $true, [Microsoft.SharePoint.Client.FeatureDefinitionScope]::None)
$cc.ExecuteQuery()

Source code: Activate-SPFeatureForDevSite.ps1

In order to run this PowerShell you first need to install the SharePoint Online Management Shell.  Note that this has a prerequisite of PowerShell 3.0 so you may need to install that first.

Thanks to @Remco on Stack Exchange for doing the hard work: 2013 – Can you apply a site template to a subsite of a Developer Site on Office 365 (SP2013)? – SharePoint Stack Exchang…

Note that I have not seen this issue with on-premise SharePoint 2013; however, I have seen that the ability to save a site as a template is non-existent in a Development Site (and its sub-sites) for both SharePoint Online and SP2013 on-prem.  Activating the feature using the script above does not fix this problem.

Save Site As Template

Hopefully this post helps those that still see this issue.  Maybe it is something that the SharePoint Online team is addressing.


SharePoint is like sugar…

Pete is a Director of Technology at ThreeWill. Pete’s primary role is driving the overall technology strategy and roadmap for ThreeWill. Pete also serves as ThreeWill’s Hiring Manager and is constantly looking for new talent to join the ThreeWill family.

The Analogy

SharePoint is like sugar cane juice. What? You don’t get the analogy? Well, neither did I until a few days ago. I personally loved John Underwood’s “SharePoint is like butter” analogy, but I had to capture this one somehow. Just as John said, bear with me, I’ll get there.

The Origin

Over the past 5 years of consulting projects with SharePoint I have had the opportunity to meet some really great people – clients, other consultants and contractors, and even the occasional Java developer. To introduce and discuss SharePoint in general, analogies often are the fastest path to explanation and understanding. On my current project, I have the good fortune of working with a new friend, Neeraj Agrawal.

Neeraj and I were discussing some of the best practices each of us has incorporated over the last several years in SharePoint projects, including analogies to explain the best way to introduce and develop SharePoint based applications, in our experience. I referenced John’s blog post, and mentioned that for ThreeWill engagements we usually try to meet needs with as much of the out of box capabilities as possible before we suggest customization through code.

Neeraj responded with, “SharePoint is like sugar cane juice”. I stood there, motionless, flummoxed. The words hung there, hovering motionless, like a visible, malodorous cloud. After what seemed like minutes, shaking my head in an almost cartoonish fashion, I finally managed, “What?” Neeraj just laughed. And then, he started to explain this jewel. Here’s the abridged version.

First, you have a sugar cane stalk. Fairly obvious.

Next, a sugar cane juice vendor who extracts and sells the juice. The juice vendor sends a stalk through a roller to squeeze out sugar cane juice. Immediately, a customer can now have some juice, no waiting.

The vendor can pass the cane stalk through the press multiple times, and with each pass, another customer gets some juice. Every pass extracts more value of the plant for the vendor, a bit more of the juice, and another happy customer enjoys a glass of sugar cane juice.

There you have it – the perfect SharePoint analogy! Neeraj had just thrown down the best SharePoint analogy ever, short, sweet (literally) and so obscure you are guaranteed to have to explain yourself further.

And here is the explanation of the analogy.

The Explanation

SharePoint 2010 comes out of the box with a large number of powerful and compelling features to address a wide variety of situations. The number and types of situations are too large to describe here, and are explained much better elsewhere. The relevant point here is, with so many features, you can extract value from SharePoint with several passes through the press, there is no need to squeeze too hard on the first pass.

The SharePoint Juice Recipe
  • Endeavor to understand standard features
  • Prototype and implement a basic business process
  • Identify and prioritize the gaps
  • Customize the Most Valuable Gaps First

Endeavor to Understand the Standard Features

First, investigate and understand what SharePoint offers out of the box, and what can be consumed without customization. A word of caution, SharePoint is a HUGE product! As in life, you will more than likely NOT know everything there is to know. This is OK, and should not deter you (the consumer or consultant) from understanding what is available to you at low or no cost of entry. You’ve already paid for SharePoint, or at least the hardware in the case of SharePoint Foundation, why not make the most of your investment.

Learn what site templates come out of the box, and determine if they fit one of your situations closely. In the UI, 20 templates were displayed for me, but there were 51 templates according to Get-SPWebTemplate | Measure. Learn what lists and libraries exist (26 in my environment using Get-SPWeb http://[rootweb] | % { $_.ListTemplates | Measure }) and what each has to offer. Review the default field types (573 in my environment using Get-SPWeb http://[rootweb] | % { $_.Fields | Measure }), their options, limitations, and whether a field already exists that might meet your need. Discover what relationships, references, and interactions you can create between sites, lists, and content types (103 in my environment using Get-SPWeb http://[rootweb] | % { $_.ContentTypes | Measure }) and more. Finally, learn what web parts are available out of the box for your product SKU and how these can help you without creating your own.

Note: I included the PowerShell cmdlets and counts because, as I was writing this, I was shocked to learn some of these numbers and I learned some new things – even after 5 years.
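For reference, here are those counting one-liners collected in one place, run from the SharePoint 2010 Management Shell. http://[rootweb] is a placeholder for your root web URL, and the counts will differ per farm:

```powershell
Get-SPWebTemplate | Measure                                    # site templates (51 here)
Get-SPWeb http://[rootweb] | % { $_.ListTemplates | Measure }  # list and library templates (26 here)
Get-SPWeb http://[rootweb] | % { $_.Fields | Measure }         # default field types (573 here)
Get-SPWeb http://[rootweb] | % { $_.ContentTypes | Measure }   # content types (103 here)
```

These require an on-premises farm; Get-SPWebTemplate and Get-SPWeb are Server Object Model cmdlets and are not available in SharePoint Online.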

Prototype and Implement an Initial Solution

After you have spent some time learning what comes out of the box, try putting this knowledge to work. Start by picking a business process that is relatively complex and implement the basic needs of the process. For example, do you track expenses? Try creating an Expense Report “application” using a new site, create a few content types using existing or new site columns, and create some views and web part pages to help display, report and print the expense reports. A list to track the different expense items, expense categories, etc. is a good example that is easy to understand and relatively easy to prototype quickly. You’d be surprised how much can be accomplished for a process like this out of the box.

Be sure to have the correct expectations for this exercise. Don’t look for the limitations, do not assume that this is not good enough, and do not look for excuses to jump to code right away. Accept that the “application” will likely fall short of the mark, but take this as an opportunity to see what is possible. Be creative and try to make things that are acceptable, but not perfect. I think you will be pleasantly surprised at how much of an application like this is achievable, and acceptable, using just what a power user has at their fingertips.

Identify and prioritize the gaps

Now that your first run through the press has extracted a healthy portion of sweet, SharePoint juice, and left your users wondering how they can get more of this magic elixir, it is time to determine what to do next. Let’s take our Expense Report application again. Users liked the idea of an online Expense Report app, but the simple lists with only some content types and views fall short. Management wants a custom workflow, since what you tried initially with the approval wasn’t enough. Users want a way to track expenses offline and then upload them. And, the finance users want a simple way to track and manage payments and some views that show them expenses that are not possible with simple list views.

Now we have enough for a second run through the press to extract some more sweetness. Can you still stay away from code? Maybe. There’s still more juice to be had without squeezing too hard. Maybe a workflow modified in SharePoint Designer 2010 will fit the bill for the managers. No custom code yet, and they might be willing to accept this. Next, you might be able to meet the user’s needs by using InfoPath 2010 or SharePoint Workspace 2010, or by getting even more creative with Excel or other options. Finally, perhaps a third list, secured to only finance users, will enable them to track the payments and a content query web part might provide them with the views they want.

These may work, but what happens when you need to squeeze even harder?

Customize the Most Valuable Gaps First

Finally, once you have exhausted the features you endeavored to learn, start customizing with the new requirements that present the most value by addressing the biggest gaps. The finance team would love to have a more specific UI for their purposes, but they are fine with a list that only they can see and manage. The managers are happy with the new workflow that lets them approve and escalate approvals as needed. But, the users really need a better alternative to the processing of offline expense items.

Since the users at this point have the most compelling and urgent need for customization, you can now determine a viable alternative for them. You can spend time digging deeper into requirements and scenarios to understand the best way to address their needs, all while the finance team, managers and most users have a viable, functional way to manage expenses and are getting reimbursed.

Some Final Thoughts

Think about how you develop solutions today. If your organization immediately moves to gathering and defining deep, fine-grained requirements intended to ensure every desire of the users is met, then SharePoint may present some mental challenges for you. The ability to develop in SharePoint and extract value immediately is a huge advantage of SharePoint as an application in its own right.

Finally, Neeraj offered a different spin on this crazy analogy. Here it is in summary: first, treat SharePoint like an application and learn to use all of the features it gives you. Once you have learned the features and mastered its capabilities, treat it like a platform and build your own features that exploit the platform to its fullest. Finally, exploit the API of the platform and create your own solutions by connecting, building or extending the platform.

Thanks, Neeraj, for helping me add another sweet SharePoint analogy to my bag of tricks and for helping me make Mr. Underwood a little happier this week!


Surfing the SharePoint Pipeline

Pete is a Director of Technology at ThreeWill. Pete’s primary role is driving the overall technology strategy and roadmap for ThreeWill. Pete also serves as ThreeWill’s Hiring Manager and is constantly looking for new talent to join the ThreeWill family.

Time to surf the SharePoint PowerShell pipeline.

No, this is not a pipeline Kelly Slater would be surfing, but it is about as sweet as a North Shore wave if you are an IT Pro. I recently had to update a template file across a large number of site collections. Luckily, the root webs of the sites I needed to update all had a document library with a common name and a standard folder structure.

SharePoint 2010’s PowerShell module makes performing IT Pro tasks like these extremely simple. Let’s just jump in and look at the cool part of the script.

#Actual start of script
Start-Transcript -Path "$pwd\UploadTemplateLog.rtf" -Force -Append
$rootWebs = Get-SPWebApplication -Identity "https://extranet.acme.com" | `
    Get-SPSite -Limit All | Get-SPWeb -Limit All | `
    Where-Object {$_.IsRootWeb}
$rootWebs | % { Add-SPFile $_ "Documents" @(dir "$pwd\StatusReportTemplate.docx")[0] }

Since I wanted to know which sites got the file, and where any errors may have occurred, I used the Start-Transcript cmdlet to append that information to a log file. To simplify the script a bit (I’ll leave making this a one-liner to the Kelly Slaters of the PowerShell world), I created the $rootWebs variable. To me, the next part is truly the magic of the pipeline. First, we call the Get-SPWebApplication cmdlet, filtered to the web application we want. Next, each result is passed along the pipeline to Get-SPSite. By default we would only get back a limited set (usually a good thing), but in this case we want every site, so we use the -Limit All switch. Once we have the sites, we pipe them to Get-SPWeb and use Where-Object to filter down to only the RootWeb SPWeb instances. Finally, we use ForEach-Object (the % alias) to loop over the root webs and call Add-SPFile for each one to update the document template.

You could change the script to take the document library name and file as parameters, but I will leave that up to you. I know nothing of surfing and would face certain death on the North Shore pipeline, but don’t be afraid of the PowerShell pipeline when performing SharePoint 2010 admin or development tasks. After all, it’s a command line, not the North Shore! (Caution: treat production environments as “the North Shore” and prepare appropriately.) Here is the script in its entirety. Feel free to comment on how to make it better, or just let me know if this was useful. Have fun riding the waves…

Complete Script:

Add-PSSnapin "Microsoft.SharePoint.Powershell" -ErrorAction SilentlyContinue

#Add file function called by entry point below
function global:Add-SPFile
{
    param ([Microsoft.SharePoint.SPWeb]$web = `
            $(throw("You must provide the web to which document(s) will be added.")),
        $docLibName = $(throw("You must specify the document library name.")),
        $file = $(throw("You must specify the file to add.")))

    if ($web -ne $null)
    {
        Write-Host "`tProcessing Web - " $web
        if ($web.Lists[$docLibName] -ne $null)
        {
            $folderFound = $false
            foreach ($folder in $web.Lists[$docLibName].Folders)
            {
                if ($folder.Title -eq "Status Reports")
                {
                    Write-Host "`t`t- Found Status Reports Subfolder."
                    $folderFound = $true
                    $spFolder = $web.GetFolder("$docLibName/Status Reports")

                    Write-Host "`t`t- Opening file stream..."
                    #open a read stream on the FileInfo returned by dir
                    $stream = $file.OpenRead()
                    Write-Host "`t`t- Read " $stream.Length.ToString() " bytes from file."

                    #set property bag values in order to apply some
                    #"hidden" meta-data to the files, specifically the "AddedBy" property
                    $propertybag = @{"vti_title"="";"ContentType"="Document"; `
                        "AddedBy"="PowerShell Add-SPFile CmdLet"}
                    #update the vti_title in the hash
                    #Enables setting the SPFile.Item.Title property without calling Item.Update()
                    $propertybag.vti_title = $file.Name

                    Write-Host "`t`t- Attempting to add " $file.Name
                    #add with stream - using a stream performs better than a byte[] copy
                    [Microsoft.SharePoint.SPFile]$newFile = `
                        $spFolder.Files.Add($file.Name, $stream, $propertybag, $true)
                    Write-Host "`t`t- Successfully added file to " $web

                    Write-Host "`t`t- Closing file stream."
                    $stream.Close()
                    Write-Host "`t`t- Closed stream."
                    Write-Host "`t`t- Completed processing $web"
                }
            }
            if (-not $folderFound)
            {
                Write-Host "`t`t- No Status Reports Subfolder found."
            }
        }
        else
        {
            Write-Host "`t`t- Document library does not exist. Skipping $web"
        }
    }
    else
    {
        Write-Host "`t`t- Web was null"
    }
}

#Actual start of script
Start-Transcript -Path "$pwd\UploadTemplateLog.rtf" -Force -Append
$rootWebs = Get-SPWebApplication -Identity "https://extranet.acme.com" | `
    Get-SPSite -Limit All | Get-SPWeb -Limit All | `
    Where-Object {$_.IsRootWeb}
$rootWebs | % { Add-SPFile $_ "Documents" @(dir "$pwd\StatusReportTemplate.docx")[0] }
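
As mentioned above, the entry point could take the document library name and file as parameters instead of hard-coding them. A minimal, untested sketch of one way to do that, using hypothetical parameter names that are not from the original script and assuming the Add-SPFile function above is already loaded:

```powershell
# Hypothetical parameterized entry point - save as e.g. Upload-Template.ps1
param (
    [string]$WebAppUrl  = "https://extranet.acme.com",
    [string]$DocLibName = "Documents",
    [string]$FilePath   = "$pwd\StatusReportTemplate.docx"
)

Start-Transcript -Path "$pwd\UploadTemplateLog.rtf" -Force -Append
Get-SPWebApplication -Identity $WebAppUrl |
    Get-SPSite -Limit All | Get-SPWeb -Limit All |
    Where-Object { $_.IsRootWeb } |
    % { Add-SPFile $_ $DocLibName @(dir $FilePath)[0] }
Stop-Transcript
```

With defaults on each parameter, the script still runs unchanged for the common case, while a different web application, library, or template can be supplied at the command line.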
