Closing an Angular 2 Form from within a SharePoint Modal Dialog

Recently I was on a project for a client where we were converting their custom SharePoint 2010 app to SharePoint 2013. This application was based on a fairly complicated InfoPath form with lots of supporting event receivers that we were taking to an Angular 2 application, leaving InfoPath behind forever (we hope!).

In an effort to minimize the disruption to the users due to the completely new UI, we decided to keep it as similar as possible to the old InfoPath form, which meant serving it up in a SharePoint modal dialog.

This worked just fine by making a call to SP.UI.ModalDialog.showModalDialog(). Where I got hung up was on how to close the form. Should be simple, right? But since a modal dialog is loaded into an iFrame and has its own container, we lose the reference to the SP object. This meant that I couldn’t just call the SP.UI.commonModalDialogClose function from the form and be done.

No problem, I thought. A quick google search should turn up the proper command to use in order to close the dialog. No such luck though – apparently my google juju was all used up for the week. I did pick up a few things to try, but none of them got the job done.

On my own, I got as far as knowing I needed to get a reference to the parent container using window.parent but still couldn’t figure out how to get to the SP functions I needed. After consulting with a co-worker (thanks, Kirk!) we were able to take what I had and finally get a very simple solution working. From within the iFrame we got a reference to the ‘SP’ object by treating it as a property of the parent window, which itself is an object. Duh. Sometimes the simplest things …

The call to close the dialog ended up looking like this:

Window.parent["SP"].UI.ModalDialog.commonModalDialogClose(closeStatus, null)

Here is the full function, which handles closing the form from within a dialog or closing it from a non-modal form, such as opening the page directly in the browser.

closeForm(closeStatus) {
    // closeStatus: 1 ==> UI.DialogResult.OK
    // closeStatus: 0 ==> UI.DialogResult.cancel
    if (window.parent["SP"] !== undefined) {
        window.parent["SP"].UI.ModalDialog.commonModalDialogClose(closeStatus, null);
    }
    else {
        // assumes we are not in a dialog
        window.location.href = this.requestService.GetHref();
    }
}

Doesn’t get much simpler than this. I hope this can help save someone else some of the growing pains I went through on this one.

by Caroline Sosebee

Migrating a Document Library and Retaining Authorship Information

Lately, I’ve had the opportunity to work on a couple of on-premises SharePoint 2010 to SharePoint 2013 migrations which proved to be both fun and challenging.  Our team chose to leverage PowerShell and CSOM for handling all the heavy lifting when it came to provisioning the new site and migrating data.  This, in turn, led to one of the more interesting things I got to do, which was write a PowerShell script that would migrate the contents of a document library from the old 2010 site to a document library in the new 2013 site, while at the same time retaining authorship information.

As I researched how to do this, I found that there wasn’t any one source that explained all the steps needed in a nice concise manner. I found dribs and drabs here and there, usually around one step or another, or found that the code provided was using server-side code, which wasn’t helpful in my case.

So, I decided it would be worthwhile to pull all the parts into one document, both for my own reference as well as for others. And yes, I admit it. I shamelessly pilfered from some of my co-workers’ fine scripts to piece this together. Thanks, team, for being so awesome!

Here is an outline of the high-level steps needed to move files between sites. You can find the full script at the bottom.

Basic steps

  • Get a context reference for each of the sites (source and target). This will allow you to work on both sites at the same time.
  • Load both the source and target libraries into their appropriate contexts. Load the RootFolder for the target library as well. This will be needed both when saving the document and when updating the metadata.
  • Have a valid user id available to use in place of any invalid users found in the metadata being copied, then ensure this ‘missing user’ id in the target site before copying.
  • Query the source library using a CAML query to get a list of all items to be processed. You can apply filters here to limit results as needed. (Attached code has parameters for specifying start and end item ids).
  • Loop through the items returned by the CAML query
    • Get a reference to the source item using the id
    • Get a reference to the actual file from the source item
    • Load the source file by calling ‘OpenBinaryDirect’ (uses the file ServerRelativeUrl value)
    • Write it back out to the target library by calling ‘SaveBinaryDirect’ on the just loaded file stream
    • Copy over the metadata:
      • Get a reference to the new item just added in the target library
      • Populate the target item metadata fields using values from the source item
      • Update the item
    • Loop

That’s it, in a nutshell. There are all sorts of other things you can do to pretty it up, but I thought I would keep this simple as a quick reference both for myself and others. Be sure to check the top of the script below for other notes about what it does and does not do.

<#
.SYNOPSIS
Copies documents from one library into another across sites.
This was tested using SharePoint 2010 as the source and both SharePoint 2010 and SharePoint 2013 as the target.

Notes:
* Parameters can either be passed in on the command line or pre-populated in the script
* Example for calling from command line:
./copy-documents.ps1 "http://testsite1/sites/testlib1/" "domain\user" "password" "source lib display name" "http://testsite2/sites/testlib2/" "domain\user" "password" "target lib display name" "domain\user" 1 100

Features:
* Can cross site collections and even SP versions (e.g. SP2010 to SP2013)
* Allows you to specify both the source and target document library to use
* Can retain created, created by, modified, modified by and other metadata of the original file
* Can specify a range of files to copy by providing a starting id and ending id
* When copying metadata such as created by, will populate any invalid users with the provided 'missing user' value
* Uses a cache for user data so it doesn't have to run EnsureUser over and over for the same person

Limitations:
* Does not currently traverse folders within a document library.
* This only copies.  It does not remove the file from the source library when done.

#>

[CmdletBinding()]
param(
	[Parameter(Mandatory=$false)]
	[string]$sourceSiteUrl = "",
    [Parameter(Mandatory=$false)]
	[string]$sourceUser = "",
	[Parameter(Mandatory=$false)]
	[string]$sourcePwd = "",
	[Parameter(Mandatory=$false)]
    [string]$sourceLibrary = "",
	[Parameter(Mandatory=$false)]
	[string]$targetSiteUrl = "",
    [Parameter(Mandatory=$false)]
	[string]$targetUser = "",
	[Parameter(Mandatory=$false)]
	[string]$targetPwd = "",
    [Parameter(Mandatory=$false)]
    [string]$targetLibrary = "",
    [Parameter(Mandatory=$false)]
    [string]$missingUser = "",
	[Parameter(Mandatory=$false)]
    [int]$startingId = -1,
    [Parameter(Mandatory=$false)]
    [int]$endingId = -1
)

## Load the libraries needed for CSOM
## Replace with the appropriate path to the libs in your environment
Add-Type -Path ("c:\dev\libs\Microsoft.SharePoint.Client.dll")
Add-Type -Path ("c:\dev\libs\Microsoft.SharePoint.Client.Runtime.dll")

function Main {
	[CmdletBinding()]
	param()
	
	Write-Host "[$(Get-Date -format G)] copy-documents.ps1: library $($sourceLibrary) from $($sourceSiteUrl) to $($targetSiteUrl)"
	
    # Get the context to the source and target sites
	$sourceCtx = GetContext $sourceSiteUrl $sourceUser $sourcePwd
	$targetCtx = GetContext $targetSiteUrl $targetUser $targetPwd

    # Ensure the "missing user" in the target environment
    $missingUserObject = $targetCtx.Web.EnsureUser($missingUser)
    $targetCtx.Load($missingUserObject)
	
	## Moved the try/catch for ExecuteQuery to a function so that we can exit gracefully if needed
	ExecuteQueryFailOnError $targetCtx "EnsureMissingUser"

	## Start the copy process
	CopyDocuments $sourceCtx $targetCtx $sourceLibrary $targetLibrary $startingId $endingId $missingUserObject
}

function CopyDocuments {
	[CmdletBinding()]
	param($sourceCtx, $targetCtx, $sourceLibrary, $targetLibrary, $startingId, $endingId, $missingUserObject)

    $copyStartDate = Get-Date

    # Get the source library
    $sourceLibrary = $sourceCtx.Web.Lists.GetByTitle($sourceLibrary)
    $sourceCtx.Load($sourceLibrary)
	ExecuteQueryFailOnError $sourceCtx "GetSourceLibrary"

    # Get the target library
    $targetLibrary = $targetCtx.Web.Lists.GetByTitle($targetLibrary)
    $targetCtx.Load($targetLibrary)
	## RootFolder is used later both when copying the file and updating the metadata.
    $targetCtx.Load($targetLibrary.RootFolder)
	ExecuteQueryFailOnError $targetCtx "GetTargetLibrary"

    # Query source list to retrieve the items to be copied
    Write-Host "Querying source library starting at ID $($startingId) [Ending ID: $($endingId)]"
    $sourceItems = @(QueryList $sourceCtx $sourceLibrary $startingId $endingId) # Making sure it returns an array
    Write-Host "Found $($sourceItems.Count) items"

    # Loop through the source items and copy
    $totalCopied = 0
    $userCache = @{}
    foreach ($sourceItemFromQuery in $sourceItems) {

        $totalCount = $($sourceItems.Count)

        if ($sourceItemFromQuery.FileSystemObjectType -eq "Folder") {
            Write-Host "skipping folder '$($sourceItemFromQuery['FileLeafRef'])'"
            continue
        }
		Write-Host "--------------------------------------------------------------------------------------"
        Write-Host "[$(Get-Date -format G)] Copying ID $($sourceItemFromQuery.ID) ($($totalCopied + 1) of $($totalCount)) - file '$($sourceItemFromQuery['FileLeafRef'])'"

        # Get the source item which returns all the metadata fields
        $sourceItem = $sourceLibrary.GetItemById($sourceItemFromQuery.ID)
        # Load the file itself into context
		$sourceFile = $sourceItem.File
        $sourceCtx.Load($sourceItem)
        $sourceCtx.Load($sourceFile)
		ExecuteQueryFailOnError $sourceCtx "GetSourceItemById"

		## Call the function used to run the copy
        $targetId = CopyDocument $sourceCtx $sourceItem $sourceFile $sourceItemFromQuery $targetCtx $targetLibrary $userCache $missingUserObject
        
		$totalCopied++
    }

    # Done - let's dump some stats
    $copyEndDate = Get-Date
    $duration = $copyEndDate - $copyStartDate
    $minutes = "{0:F2}" -f $duration.TotalMinutes
    $secondsPerItem = "{0:F2}" -f ($duration.TotalSeconds/$totalCopied)
    $itemsPerMinute = "{0:F2}" -f ($totalCopied/$duration.TotalMinutes)
	Write-Host "--------------------------------------------------------------------------------------"
    Write-Host "[$(Get-Date -format G)] DONE - Copied $($totalCopied) items. ($($minutes) minutes, $($secondsPerItem) seconds/item, $($itemsPerMinute) items/minute)"
}

### Function used to copy a file from one place to another, with metadata
function CopyDocument {
    [CmdletBinding()]
    param($sourceCtx, $sourceItem, $sourceFile, $sourceItemFromQuery, $targetCtx, $targetLibrary, $userCache, $missingUserObject)

    ## Validate the Created By and Modified By users on the source file
    $authorValueString = GetUserLookupString $userCache $sourceCtx $sourceItem["Author"] $targetCtx $missingUserObject
    $editorValueString = GetUserLookupString $userCache $sourceCtx $sourceItem["Editor"] $targetCtx $missingUserObject

    ## Grab some important bits of info
	$sourceFileRef = $sourceFile.ServerRelativeUrl
    $targetFilePath = "$($targetLibrary.RootFolder.ServerRelativeUrl)/$($sourceFile.Name)"

    ## Load the file from source
    $fileInfo = [Microsoft.SharePoint.Client.File]::OpenBinaryDirect($sourceCtx, $sourceFileRef)
    ## Write file to the destination
    [Microsoft.SharePoint.Client.File]::SaveBinaryDirect($targetCtx, $targetFilePath, $fileInfo.Stream, $true)

    ## Now get the newly added item so we can update the metadata
    $item = GetFileItem $targetCtx $targetLibrary $sourceFile.Name $targetLibrary.RootFolder.ServerRelativeUrl

    ## Replace the metadata with values from the source item
    $item["Author"] = $authorValueString
    $item["Created"] = $sourceItem["Created"]
    $item["Editor"] = $editorValueString
    $item["Modified"] = $sourceItem["Modified"]

	## Update the item
    $item.Update()
    ExecuteQueryFailOnError $targetCtx "UpdateItemMetadata"

    Write-Host "[$(Get-Date -format G)] Successfully copied file '$($sourceFile.Name)'"

}

## Get a reference to the list item for the file.
function GetFileItem {
	[CmdletBinding()]
	param($ctx, $list, $fileName, $folderServerRelativeUrl)

	$camlQuery = New-Object Microsoft.SharePoint.Client.CamlQuery
	if ($folderServerRelativeUrl -ne $null -and $folderServerRelativeUrl.Length -gt 0) {
		$camlQuery.FolderServerRelativeUrl = $folderServerRelativeUrl
	}
	$camlQuery.ViewXml = @"
<View>
	<Query>
   		<Where>
      		<Eq>
        		<FieldRef Name='FileLeafRef' />
        		<Value Type='File'>$($fileName)</Value>
      		</Eq>
		</Where>
	</Query>
</View>
"@

	$items = $list.GetItems($camlQuery)
	$ctx.Load($items)
	$ctx.ExecuteQuery()
	
	if ($items -ne $null -and $items.Count -gt 0){
		$item = $items[0]
	}
	else{
		$item = $null
	}
	
	return $item
}

## Validate and ensure the user
function GetUserLookupString{
	[CmdletBinding()]
	param($userCache, $sourceCtx, $sourceUserField, $targetCtx, $missingUserObject)

    $userLookupString = $null
    if ($sourceUserField -ne $null) {
        if ($userCache.ContainsKey($sourceUserField.LookupId)) {
            $userLookupString = $userCache[$sourceUserField.LookupId]
        }
        else {
            try {
                # First get the user login name from the source
                $sourceUser = $sourceCtx.Web.EnsureUser($sourceUserField.LookupValue)
                $sourceCtx.Load($sourceUser)
                $sourceCtx.ExecuteQuery()
            }
            catch {
                Write-Host "Unable to ensure source user '$($sourceUserField.LookupValue)'."  
            }

            try {
                # Now try to find that user in the target
                $targetUser = $targetCtx.Web.EnsureUser($sourceUser.LoginName)
                $targetCtx.Load($targetUser)
                $targetCtx.ExecuteQuery()
                
                # The "proper" way would seem to be to set the user field to the user value object
                # but that does not work, so we use the formatted user lookup string instead
                #$userValue = New-Object Microsoft.SharePoint.Client.FieldUserValue
                #$userValue.LookupId = $user.Id
                $userLookupString = "{0};#{1}" -f $targetUser.Id, $targetUser.LoginName
            }
            catch {
                Write-Host "Unable to ensure target user '$($sourceUser.LoginName)'."
            }
            if ($userLookupString -eq $null) {
                Write-Host "Using missing user '$($missingUserObject.LoginName)'."
                $userLookupString = "{0};#{1}" -f $missingUserObject.Id, $missingUserObject.LoginName
            }
            $userCache.Add($sourceUserField.LookupId, $userLookupString)
        }
    }

	return $userLookupString
}

## Pull ids for the source items to copy
function QueryList {
    [CmdletBinding()]
    param($ctx, $list, $startingId, $endingId)

    $camlQuery = New-Object Microsoft.SharePoint.Client.CamlQuery
    $camlText = @"
<View>
    <Query>
        <Where>
            {0}
        </Where>
        <OrderBy>
            <FieldRef Name='ID' Ascending='True' />
        </OrderBy>
    </Query>
    <ViewFields>
        <FieldRef Name='ID' />
        {1}
    </ViewFields>
    <QueryOptions />
</View>
"@

    if ($endingId -eq -1) {
        $camlQuery.ViewXml = [System.String]::Format($camlText, "<Geq><FieldRef Name='ID' /><Value Type='Counter'>$($startingId)</Value></Geq>", "")
    }
    else {
        $camlQuery.ViewXml = [System.String]::Format($camlText, "<And><Geq><FieldRef Name='ID' /><Value Type='Counter'>$($startingId)</Value></Geq><Leq><FieldRef Name='ID' /><Value Type='Counter'>$($endingId)</Value></Leq></And>", "")
    }

    $items = $list.GetItems($camlQuery)
    $ctx.Load($items)
	ExecuteQueryFailOnError $ctx "QueryList"

    return $items
}

function GetContext {
	[CmdletBinding()]
	param($siteUrl, $user, $pwd)
	
	# Get the client context to SharePoint
	$ctx = New-Object Microsoft.SharePoint.Client.ClientContext($siteUrl)
	$securePwd = ConvertTo-SecureString $pwd -AsPlainText -Force
	$cred = New-Object PSCredential($user, $securePwd)
	$ctx.Credentials = $cred
	
	return $ctx
}

function ExecuteQueryFailOnError {
	[CmdletBinding()]
	param($ctx, $action)
	
	try {
		$ctx.ExecuteQuery()
	}
	catch {
		Write-Error "$($action) failed with $($_.Exception.Message).  Exiting."
		exit 1
	}
}

### Start the process
Main
by Caroline Sosebee

CSOM GetItemById Results in “The property or field has not been initialized”

I was having a problem where I was trying to migrate list content from SharePoint 2010 to SharePoint 2013 but a couple of fields were not making it over.  I was using a migration tool that I like to use, but had to resort to connecting to both SharePoint 2010 and SharePoint 2013 using CSOM due to environment restrictions.  I know there are limitations with CSOM and likely more with SharePoint 2010 than there are with SharePoint 2013, but I was still surprised the tool couldn’t get values for these two fields.

I thought I could get around this if I wrote the code myself so I gave it a shot.  I wrote some PowerShell to first query for the list item IDs from the source list.  The code uses a CAML query and only has the ID field in the ViewFields for the query.  I then iterate across the results of the query.  For each result I get the entire source item by ID as follows:

$sourceItem = $sourceList.GetItemById($sourceItemFromCAMLQuery.ID)
$sourceCtx.Load($sourceItem)
$sourceCtx.ExecuteQuery()

GetItemById is the key method above as I thought it would do a good job of getting all fields for that list item.  However, for the two fields that gave me problems with the migration tool, it failed with my code.  The line of code that gave me an error was:

 $sourceValue = $sourceItem[$fieldName]

The exception thrown was of type PropertyOrFieldNotInitializedException.  The exception message was:

The property or field has not been initialized. It has not been requested or the request has not been executed. It may need to be explicitly requested.

I immediately thought this might be a row ordinal issue based on problems it has caused me in the past.  In retrospect I discovered that according to the schema XML for the fields, they are still in row ordinal 0 (which is good) so that didn’t explain my issue, but my fix was the same regardless.

To fix this problem I simply added these two problem fields to the ViewFields for my initial CAML query and subsequently get the values from the CAML query instead of the list item obtained using GetItemById.  It added a little complexity since I had to pay attention to which field values I obtained from the CAML query and which I obtained from GetItemById, but it solved my problem!  I hope it helps others as well.
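
To make that concrete, here is a sketch of the workaround in PowerShell; the two ‘ProblemField’ names are hypothetical stand-ins for the fields that failed:

$camlQuery = New-Object Microsoft.SharePoint.Client.CamlQuery
$camlQuery.ViewXml = @"
<View>
    <ViewFields>
        <FieldRef Name='ID' />
        <FieldRef Name='ProblemField1' />
        <FieldRef Name='ProblemField2' />
    </ViewFields>
</View>
"@
$sourceItems = $sourceList.GetItems($camlQuery)
$sourceCtx.Load($sourceItems)
$sourceCtx.ExecuteQuery()

foreach ($sourceItemFromCAMLQuery in $sourceItems) {
    # Problem fields come from the CAML query results...
    $problemValue = $sourceItemFromCAMLQuery["ProblemField1"]

    # ...everything else comes from the item returned by GetItemById
    $sourceItem = $sourceList.GetItemById($sourceItemFromCAMLQuery.ID)
    $sourceCtx.Load($sourceItem)
    $sourceCtx.ExecuteQuery()
    $otherValue = $sourceItem["Title"]
}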

by Kirk Liemohn

PnP, CSOM, and Claims-Based Authentication

I was recently tasked with working on a proof-of-concept (POC) that required the use of SharePoint hosted in Azure.  This particular SharePoint environment was configured exclusively with claims-based authentication.  My task was to leverage the client-side object model (CSOM) to update the AlternateCSSUrl property on the target Web (in this case a Microsoft.SharePoint.Client.Web object).

Side Note: Setting this AlternateCSSUrl property to point to a custom “.css” file that resides in the local Site Assets library is a nice-and-simple way to augment some of the SharePoint styling.  In this case, my goal was to improve the user experience for mobile users (basically… introduce some “responsive” behavior).

Setting the AlternateCSSUrl property is super simple, assuming you can successfully authenticate to your target SharePoint environment.  However, simply setting up a NetworkCredential object or relying on DefaultCredentials does not cut it with claims-based authentication.  Claims-based authentication requires a separate hop to an ADFS server, and the use of a FedAuth cookie issued from the target SharePoint environment.

I did a search for “CSOM and claims-based authentication” and found a couple of interesting links…both of which focus on SharePoint 2010 (I was targeting SharePoint 2013), and offer an understanding of the problem and respective solution.

http://www.piasys.com/blog/accessing-sharepoint-2010-via-csom-using-claims-based-authentication-and-providing-fedauth-cookie/

https://samlman.wordpress.com/2015/02/28/using-the-client-object-model-with-a-claims-based-auth-site-in-sharepoint-2010/

Both of these are worth a read to get a better understanding of what needs to be done.

Since I was very limited on time, I started digging a bit more and decided to have a look at the wealth of samples from the Patterns and Practices (PnP) folks. Lo and behold, I found this: https://github.com/OfficeDev/PnP-Sites-Core/blob/master/Core/SAML%20authentication.md

Basically, the PnP AuthManager class has support for exactly what I needed.  Thanks PnP team!

The PnP AuthManager makes this very simple.   Here is an example of what this might look like.  The code has been simplified to show the essence.

static void Main(string[] args)
{
	string webUrl = "https://portal.contoso.com";
	string userName = "user1";
	string password = "password1";
	string domain = "contoso.com";
	string sts = "adfs.contoso.com";
	string idpId = "urn:sharepoint:sp-contoso";
	string cssRelativePath = "/SiteAssets/responsive-option1.css";

	SetAlternateCSSUrl(webUrl, userName, password, domain, sts, idpId, cssRelativePath);
}

private static void SetAlternateCSSUrl(string webUrl, string userName, string password, string domain, string sts, string idpId, string cssRelativePath)
{
	OfficeDevPnP.Core.AuthenticationManager am = new OfficeDevPnP.Core.AuthenticationManager();

	using (var ctx = am.GetADFSUserNameMixedAuthenticatedContext(webUrl, userName, password, domain, sts, idpId))
	{
		ctx.RequestTimeout = Timeout.Infinite;

		// Just to output the site details
		Web web = ctx.Web;
		ctx.Load(web, w => w.Title, w => w.ServerRelativeUrl, w => w.AlternateCssUrl);
		ctx.ExecuteQueryRetry();

		web.AlternateCssUrl = web.ServerRelativeUrl + cssRelativePath;
		web.Update();
		web.Context.ExecuteQuery();
	}
}
by Chris Edwards

Populating a Multi-Value People Picker with PowerShell

On my current project, a large scale migration from a SharePoint dedicated environment to SharePoint Online, my team is using a “Site Inventory” list to keep track of the over 52,000 site collections in our client’s source. Since the source is “live” and things are constantly evolving, we get periodic CSVs containing the most recent data regarding the source.

Naturally, 52,000+ rows is a LOT of data to go look through, so we created a PowerShell script that would parse the updated information, compare it to the Site Inventory list, and update any changes we find. Among the columns in our list are a few “Owner” columns (primary and secondary) and an “All Administrators” column. All three of the columns are just text fields that contain the login names of users in the dedicated environment, and we wanted to aggregate the three fields into one multi-value people picker.

Sounds easy, right? I ended up having quite a struggle, spending more time than I felt necessary dredging the depths of the internet for answers.

I knew that I had to use the Web.EnsureUser method to turn my username into a User object so I could grab the ID. I also knew that I needed to turn that ID into a lookup value since a people picker is, more or less, a special kind of lookup column. Finally, I knew that my “AllOwners” column needed an array of lookup values. That last part was where the issue came in.

$userName = "whd\wholland"
$spuser = EnsureUser $context
$user if($spuser -ne $null){
     $spuserValue = New-Object Microsoft.SharePoint.Client.FieldUserValue 
     $spuserValue.LookupId = $spuser.id
     $listItem["AllOwners"] = @($spuserValue)
}
$listItem.Update()

After going through the steps of turning a username into a FieldUserValue object (the ‘special’ lookup value I mentioned earlier), I would simply wrap that FieldUserValue in an array and attempt to set it as the value of the “AllOwners” field. My expectation was that, in doing so, I was creating an array of FieldUserValues. Expectations and reality don’t always line up.

As it turns out, unless you specify a type, PowerShell will create an array of the most generic type it can. In my case, I was getting an array of generic Objects.
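
You can see this for yourself in a PowerShell console (assuming the CSOM assemblies are already loaded):

$untyped = @()
$untyped += New-Object Microsoft.SharePoint.Client.FieldUserValue
$untyped.GetType().FullName    # System.Object[]

$typed = [Microsoft.SharePoint.Client.FieldUserValue[]]$untyped
$typed.GetType().FullName      # Microsoft.SharePoint.Client.FieldUserValue[]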

Before I tried to set the value of my field, I needed to cast my array to be, specifically, an array of FieldUserValue objects. Below you’ll find the code snippet that sorted the issue for me.

$userName = "whd\wholland"  
$spuser = EnsureUser $context $user  
$lookupValueCollection = @()  
if($spuser -ne $null){  
     $spuserValue = New-Object Microsoft.SharePoint.Client.FieldUserValue                 
     $spuserValue.LookupId = $spuser.id  
     $lookupValueCollection += $spuserValue  
}  
$userValueCollection = [Microsoft.SharePoint.Client.FieldUserValue[]]$lookupValueCollection 
$listItem["AllOwners"] = $userValueCollection  
$listItem.Update()
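
For completeness, here is a sketch of the EnsureUser helper the snippets above assume; the body is my assumption, built on the standard CSOM Web.EnsureUser call:

function EnsureUser {
    param($context, $loginName)
    try {
        $user = $context.Web.EnsureUser($loginName)
        $context.Load($user)
        $context.ExecuteQuery()
        return $user
    }
    catch {
        Write-Host "Unable to ensure user '$($loginName)'"
        return $null
    }
}
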
by William Holland

Switching SharePoint Views Defaults from Standard/HTML to Edit/Datasheet/GRID and Back

I recently created a new list by uploading an Excel spreadsheet to SharePoint 2013. I’m sure there are several blogs telling you how to do this, but here is one that you may find useful.

The problem I encountered, however, is that after uploading the Excel spreadsheet, the view that was created was automatically in edit mode (a.k.a. Datasheet view). I guess the thought was that people using Excel would like to be in that mode by default. I didn’t want that so I figured out how to switch it back. It wasn’t entirely obvious, so I thought I would share.

I knew I could create a new view from scratch that was not in edit mode by default, but this view had been updated to have many of its columns in a specific order, and I really didn’t want to do it manually. Plus, I knew there had to be a way. I started out looking at the view in the browser and didn’t see anything. Then I edited the page and looked at the web part properties for the view, but that didn’t help either.

The solution was to open the site in SharePoint Designer 2013 and navigate to the view itself. Upon opening the view I saw the following HTML (I have seen this on lines 37 and 38 for me depending on the view):

<View Name="{24C927B3-0768-4129-9A41-73961AA48508}" DefaultView="TRUE" Type="GRID" DisplayName="All Items" Url="..." [a bunch more attributes]>[a bunch of child elements]</View>

I simply had to change GRID to HTML and save the page:

<View Name="{24C927B3-0768-4129-9A41-73961AA48508}" DefaultView="TRUE" Type="HTML" DisplayName="All Items" Url="..." [a bunch more attributes]>[a bunch of child elements]</View>

I got a notice about the page being customized (unghosted) which was fine for me. You can change these back and forth and it works fine. SharePoint Designer shows a summary of all views for a list and their type (HTML or GRID). That summary is cached so it doesn’t update after you save the changes to the ASPX page directly.

by Kirk Liemohn

Starting and Stopping Multiple Azure VMs from a Mac Terminal

We recently started a project in which keeping the costs of Azure VMs as low as possible was a real concern. Since I am a Mac user (stop snickering), the Azure CLI tools seemed like an obvious choice. I just wanted a simple way to check the current status of the virtual machines in a specific Azure subscription, and then start or shut down the virtual machines as needed. I agree with Scott Hanselman, and thought I would save myself and others some keystrokes. #sharingiscaring

Using the Azure CLI from a Mac Terminal

The first thing I found when searching was Using the Azure CLI for Mac. This was pretty helpful: it gave me the basic commands to check the status of the virtual machines and to stop and start them, but it didn’t give me a simple path for shutting down all the running virtual machines at once. By the way, this page has a bunch of info I plan on diving into at a later date.

To start, you’ll need to install the Azure CLI tools. This is a pretty straightforward install using npm.

npm install azure-cli -g

Once you have the tools, you’ll need to get your subscription settings by running the following:

azure account download

This will open a browser and you’ll need to select the subscription for which you want the settings file. Once the subscription information is downloaded, running

azure account import [settings file]

will set the current account.

The Basic Commands

Once the Azure CLI tools are installed, the basic commands to list, stop and start the virtual machines are pretty simple

azure vm list

azure vm start vm-name

azure vm shutdown vm-name

Great. Almost there.

Painful Unix Memories

Unfortunately, even though I am now a Mac user, I still have to bang my head against the wall to remember Unix commands and pipe operations. Bad memories of awk and sed from many years ago. Towards the end of the Install the Azure CLI tools post, there was a section about “understanding results” which did something very similar to what I needed. This example was using xargs and something I had never heard of called jsawk. The xargs didn’t give me any flashbacks, so I was ok with that, but the jsawk made me twitch a little. Turns out, the curl installation of jsawk from the GitHub repo failed for me, but I was too close to stop!

A link in the jsawk readme mentioned using Homebrew, so the following installed jsawk and the dependencies I was missing. Back in business!

brew search jsawk
brew install jsawk

The Final Solution

Finally, all the pieces are in place. Now I can list the virtual machines, filter them, and more. This is a little deeper than I want to go here, but the jsawk documentation is really great. With the ability to filter and then use xargs to call another azure command, the final solution was just a simple change based on the example towards the end of the Install the Azure CLI tools post. Here they are:

Shutdown some Virtual Machines

azure vm list --json | jsawk -n 'out(this.VMName)' | xargs -L 1 azure vm shutdown

Start all of the Virtual Machines

azure vm list --json | jsawk -n 'out(this.VMName)' | xargs -L 1 azure vm start

There are still some things I want to do, specifically create a shell script to enable some other options and arguments, but this might be a blog post for another day. Hopefully this will help someone else and save some keystrokes.

And BTW – I do realize this would be very easy in PowerShell. I’m just sort of new to the Mac. #justsaying

Get-AzureVM | Where-Object {$_.Status -like "Ready*"} | ForEach-Object { Stop-AzureVM -ServiceName $_.ServiceName -Name $_.Name -Force }

Let me know what you think or if this was helpful.

 

by Pete Skelly

How to Add Aurelia to Visualforce Pages in Force.com

Aurelia is a next gen JavaScript client framework for mobile, desktop and web that leverages simple conventions to empower your creativity (http://aurelia.io).

From my view, I see it as serving a similar purpose as AngularJS, with several key benefits:

  • No more IIFE‘s!
  • ECMAScript 2016
  • jspm package manager
  • Configuration by convention means less setup code

In all fairness, three of the four benefits I’ve listed above are provided by Javascript libraries and code that is outside of Aurelia. But, I’m lumping all of these benefits together under the Aurelia umbrella because they work well together, and I’m working with them because of Aurelia.

I recently got my feet wet with Aurelia thanks to Building Applications with Aurelia from Scott Allen on Pluralsight. As usual, Scott does an awesome job of making this topic easy to learn and understand. Though Scott’s examples are created in Visual Studio, he explains that they could be accomplished in any web browser.

Naturally, I wanted to bring Aurelia to Salesforce! However, there are a few twists and turns, and I’ve listed the steps below. You can skip to the bottom and download the source code, if you’re looking to dive into this.

Setup the Developer Environment

Create and configure the project

Create a simple project which includes a Visualforce page and a single static resource. I named my Visualforce page AureliaAppPage and my static resource is named AppJS. Also, create a folder in the project titled “assets.” When finished, your project should appear as shown below.

Before going any further, confirm that you can edit and render your new Visualforce page and that the static resource can be edited and saved to Salesforce, without any errors.

Configure packages

Open a command prompt and navigate to the assets folder that you created as part of the project above and execute the following commands:

  • npm install jspm -g
  • npm install --global gulp
  • npm install --save-dev gulp
    • You can ignore any warnings which appear about the Description, README and repository being empty.
  • jspm init
    • Accept the defaults, except for those listed below
    • Which ES6 transpiler would you like to use, Traceur or Babel? Babel
  • jspm install aurelia-framework
  • jspm install aurelia-bootstrapper
  • jspm install aurelia-http-client
  • jspm install bootstrap

Project Configuration and Source files

Retrieve the following files from the attached zip and place these into the assets folder:

  • config.js
  • gulpfiles.js
  • The “js” folder

Your project should now look like the screen snap below

Visualforce

You can copy the code for the Visualforce page from the attached zip, and I’ve included it below so that I can highlight a few elements.

Notice the following:

  • The baseUrl is set dynamically to the URL of the static resource which will hold our javascript files.
  • We must pass the Salesforce session Id to our application since it will be necessary when making API calls back into Salesforce.
<apex:page docType="html-5.0" showHeader="false" sidebar="false" applyBodyTag="false" standardStylesheets="false">
<head>
    <title>Aurelia</title>
    <link rel="stylesheet" type="text/css" href="{!URLFOR($Resource.AppJS, 'jspm_packages/github/twbs/[email protected]/css/bootstrap.css')}"></link>
    <link rel="stylesheet" type="text/css" href="{!URLFOR($Resource.AppJS, 'styles/styles.css')}"></link>
    <style>
        .container {
            margin-left: 10px;
            margin-right: auto;
        }
    </style>
</head>
<body aurelia-app="js/app/main">
    <script src="{!URLFOR($Resource.AppJS, 'jspm_packages/system.js')}"></script>
    <script src="{!URLFOR($Resource.AppJS, 'config.js')}"></script>
    <script>
        System.config({
            "baseURL": "{!URLFOR($Resource.AppJS)}",
            "apiSessionId": "{!$Api.Session_Id}"
        });
        System.import("aurelia-bootstrapper");
    </script>
</body>
</apex:page>

Save to Salesforce

Run the following command from command prompt: gulp build

The gulp build command will zip all of the assets into APPJS.resource. The Force.com IDE will automatically upload assets to Salesforce, but it needs a kick in order to upload the static resource file which is created as a result of the gulp build. To get it going, simply right click on APPJS.resource. After doing so, you should notice that the file is uploaded to Salesforce.

Run the app

You should now be able to log in to Salesforce and navigate to https://yourserver/apex/aureliaapppage to view a listing of Opportunities, using Aurelia.

Download the attached zip file to see the full source code and compare it with your results.

Are you building with Aurelia and Salesforce?

I’d love to hear about it. Leave me a comment below.

by Eric Bowden

What is Client-Side Rendering and Why Would I Use It?

Sometimes I hear about new features in SharePoint, but it doesn’t immediately occur to me how they would be useful. I’ve heard of Client-Side Rendering (CSR) for some time, but never really dug in to understand it until I was working on a recent project. Now that I understand it better, I’m sure I will use it much more often. Hopefully this brief explanation can help you better understand the power of Client-Side Rendering so that you can leverage it when it might be the best option.

Why Would I Use Client-Side Rendering?

SharePoint libraries and lists can be a great place to store information, but sometimes the out-of-box presentation of that information is not what is desired. For public facing sites in particular, there is usually a desire to display the content in a more engaging way than the standard interface. Client-Side Rendering is one possible solution to help you customize the display to your liking.

What Is Client-Side Rendering?

Client-Side Rendering (CSR) is javascript code that allows you to override the default SharePoint display by supplying your own html. Through the client-side context that is provided, you can dynamically integrate the data from the SharePoint list or library with your own custom html to manufacture the desired experience. CSR can be applied at several levels. For example, you can choose to override the display of one or two fields and allow the other fields to display as normal. Or maybe you want to change the background color of a numeric field called “Test Score” based upon the value (e.g. greater than 90 is green, between 70 and 90 is yellow, and less than 70 is red). Or maybe you need to customize the display in a more radical way. Maybe you want to display two items per row instead of the default behavior of one item per row. In this case, the default List View Header wouldn’t make sense, so you would likely override the Header. All of this is possible with CSR.

Let’s look at the sample below of default document library content displayed in 2 List View Web Parts. This should look familiar to those of you who have used SharePoint.

After applying Client-Side Rendering, I have transformed these two List Views to look a bit different.

As you can see in the Documents List View Web Part, I have overridden the Header to only display “Documents,” and I have overridden the item display to only display the name of the file (minus the checkbox and icon). What you can’t really see, and what is more significant than the UI changes, is that I have also overridden the default click event so that the document opens in the SharePoint modal dialog.

So, when the link is clicked, I call a javascript function that formats the options object and then calls the out-of-box SharePoint modal dialog.

Here is the result:

For the Local Videos List View Web Part, I have overridden the header to display “Local Videos,” but I have also changed the display to show two items per row instead of one. I have also displayed a thumbnail image that is a clickable link which launches the video in the SharePoint modal dialog window.

As the name Client-Side Rendering implies, this transformation of the display all happens on the client side. By including a reference to a javascript file in the JS Link property of a web part on the page, I can relatively easily transform the display. In this scenario, I used the ~site token to reference the current site and stored my custom javascript file in the SiteAssets folder where it can be easily updated using SharePoint Designer.

Here’s a screen shot of some of the code. Note the references to different javascript functions to implement the html for the Footer, Header and Item.

I hope this has been useful to help you envision the value provided by Client-Side Rendering.

For a more detailed description of the options that are available, I recommend the following 3 blogs by Suhail Jamaldeen. He has provided a lot of useful detail that will be helpful if you determine Client-Side Rendering is a good solution for your problem.

http://jsuhail.blogspot.com/2014/09/client-side-rendering-using-jslink-post.html

http://jsuhail.blogspot.com/2014/09/client-side-rendering-using-jslink-post_11.html

http://jsuhail.blogspot.com/2014/09/client-side-rendering-using-jslink-post_30.html

by Tim Coalson

Using JavaScript IIFE’s and Promise Chaining with Lookup Fields

After reading Will’s excellent blog post about bootstrapping an Angular app with server-side data, I wanted to share a trick that I recently used for bootstrapping a pure client-side SharePoint app that dealt with lookup fields and promise chaining. While in this post I will also be using Angular, this technique should work with any framework that uses JavaScript promises (such as jQuery’s Deferred or Kris Kowal’s Q).

In both SharePoint 2010 and 2013, the REST API will not return lookup field display values (unless you use the $expand query operator, but that only works if you know ahead of time which of the fields are lookups). Instead, the API will just return the ID of the lookup value, meaning you have to make another trip to the server to get the display value for the lookup field. However, when you are working with JavaScript promises, this can get messy quickly because of the asynchronous nature of promises and the need for following a synchronous series of steps to load list data from multiple lists.
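
For example, with a hypothetical Parent list and a ChildLookup field, an $expand request returns the lookup’s display value inline in one call:

/_api/web/lists/getbytitle('Parent')/items?$select=Title,ChildLookup/Id,ChildLookup/Title&$expand=ChildLookup

Without $expand, the same query returns only ChildLookupId, which is what forces the second round trip described below.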

In the JavaScript world, there is the notion of an IIFE (Immediately Invoked Function Expression). As its name implies, an IIFE is a mechanism for immediately executing a function block without needing to have a function declaration (Ben Alman does a much better and far more detailed explanation of IIFE’s if you are interested). Using an IIFE inside of a promise’s .then() callback allows your code to be schema-agnostic while being much easier to read and (in my humble opinion) cleaner.

So let’s use a simple, slightly contrived example. Say we have two lists: Parent and Child. Parent has a lookup field that is bound to a field on the Child list. Using an IIFE inside a .then() callback to make the second trip to the server to get the Child’s lookup value would look something like this:

That’s it. Now you may have noticed that I contradicted myself because in the above code I have hard coded the ChildColumnId field, so it would have actually been much easier to just use the $expand keyword and avoid the need for the second round trip in the first place. I did this to make the code easier to understand. A more detailed example would be something like this:

You might have noticed that I also added the result onto the parent object differently. In the first example, I added the ChildColumn field as a child object off the Parent; in the second example, I added Child Columns as a property on Parent. Frankly, I just did this because it made the code in the second example a little easier to read (and write), but either approach is perfectly fine depending on your situation and how you are binding your view.

A working demo project for SharePoint 2013 can be downloaded here.

by Lane Goolsby

Defensive Coding in SharePoint Event Receivers

As you design an Application solution in SharePoint that involves the use of SharePoint event receivers, you have to consider that event receivers can fire in situations that you may not anticipate. Therefore, you need to code defensively as the object you are expecting to be available in the Event Receiver may be null. Here are two such examples:

Update of a List form (i.e. Newform.aspx, Editform.aspx, etc)

When you update a list form that is tied to a list that contains an event receiver, the “ItemUpdating” event will be fired on the event receiver when you save the changes. In this case, the properties.ListItem will be null. So, in our situation, we check for this condition and return immediately when it is true.

Restore of a deleted item from the Recycle Bin

When you restore a deleted item from the Recycle Bin, the event receiver on the list where the item is being restored will fire. In this scenario, there may be properties that are null or you may want to treat restored items differently than newly created items. In our scenario, we did not want to take any additional action on restored items, so we did an immediate return when we determined it was an item from the Recycle Bin. The ItemId for a restored item is greater than 0 so we used this information to determine this was an item being restored as opposed to a new item where the ItemId would be 0.

So, next time you include an Event Receiver as part of your SharePoint solution, be sure and test as many “corner cases” as possible to ensure your Event Receiver doesn’t break your application or produce unexpected results.

by Tim Coalson

Bootstrapping an AngularJS App with Server-Side Data

Recently, I had a need to have certain “start up” data present the moment that an AngularJS app I was developing started running. Due to JS’s asynchronous nature, I couldn’t make a call back to the server to fetch the data without angular attempting to use the data before the call was completed.

Normally, when you need something to happen after an AJAX style call has been issued, you’d have your call return a promise and use the ‘then’ function to do what you needed. In this case, my AJAX call was being made by a service called in my angular ‘app.run(…)’ function. The service issued an HTTP GET to a WebAPI controller I had defined whose sole purpose was to return some essential data I needed for my application to run.

I also had an Angular controller that, at the same time, was trying to consume the data my aforementioned AJAX call was trying to retrieve. Obviously, that’s an issue and a fairly common one known as a ‘race condition.’

I don’t know if he came up with the solution or if he found it online, but I have to thank my former colleague, Sean Hester, for providing me with a nifty solution to bootstrapping an Angular app with server-side data.

I recalled that, while we were working on another AngularJS project, Sean had implemented a way of injecting some data that we coded in C# in such a way that it was exposed in our Angular app’s javascript code. It was never clear to me how that happened but, since it worked and time was a constraint, I never looked into it.

Thankfully, I still have a copy of that code. After reviewing it, what Sean had done was create an MVC ApiController to which we added our required data. On the return, he took that data, serialized it, and returned a formatted string that was actually a javascript function that simply defined our data in a namespace. Then, in our main layout, he added a script tag and set the ‘src’ attribute to the Route for the aforementioned ApiController. When the browser reaches the tag, it calls the ApiController, gets the string and inserts it between the script tags. Then, all you have to do is reference the namespace where you need it and, boom, you have your startup data.

My example HTML page.

  <html>
    <head>
      <script src="api/startup"></script>
      <script src="//ajax.googleapis.com/ajax/libs/angularjs/1.3.8/angular.min.js"></script>
      <script src="/scripts/angular/myApp.js">
    </head>
    <body>
     <div>{{currentUserName}} : {{currentUserId}}</div>
    </body>
  </html>

My example “Startup” API Controller

[RoutePrefix("api/startup")]
public class StartupController : ApiController
{
  [HttpGet]
  [Route("")]
  public HttpResponseMessage Startup()
  {
    var data = new
    {
      userName = "Will Holland",
      userId = "U1234"
    };
    HttpResponseMessage response = null;
    try
    {
      const string scriptFormat = "(function(s) {{ s.blog = s.blog || {{}}; s.blog.config = {0}; }})(window || scope)";
      var json = JsonConvert.SerializeObject(data);
      var script = String.Format(scriptFormat, json);
      response = Request.CreateResponse(HttpStatusCode.OK);
      response.Content = new StringContent(script, Encoding.UTF8, "text/plain");
    }
    catch (Exception ex)
    {
      // Error handling
    }
    return response;
  }
}

My Example Angular App

(function(s){
  var app = angular.module("myApp", []);

  if(s.blog == null || s.blog.config == null) console.error("The Configuration Data must be defined before the angular app can run");

  // Register the server-supplied data so Angular can inject it as 'config'
  app.constant("config", s.blog.config);

  app.run(["$rootScope", "config", function($rootScope, config){
    $rootScope.currentUserName = config.userName;
    $rootScope.currentUserId = config.userId;
  }]);
})(window||scope);
by William Holland

Adding AngularJS to SharePoint

“To Be the Man”…

A few years ago, John Underwood authored “Adding jQuery to SharePoint” which would turn out to be the most popular blog post on our website. Each month, Danny Ryan — our patron saint of blog posts — charges each of us here at ThreeWill to try and take the top spot from John. To quote the great Ric Flair, “To be the man, you gotta beat the man.” And I figured that the best way to do that would be to shamelessly copy his jQuery blog post, except with an AngularJS spin.

I’ll start with the same caveat that John had. This article is not intended to be a tutorial for AngularJS. And though I could (and sometimes do) go on about how awesome AngularJS is, I’ll presume that you already get that and are just looking for a way to add that awesomeness to your SharePoint application.

A Spork in the Road

Adding AngularJS is similar to adding jQuery, but SharePoint has evolved since John’s article, so my questions will closely (but not quite) resemble John’s.

  1. Will you be using Angular for the majority of pages in a site, or just a few?
  2. Are you using Angular on a site page, an application page, a web part, or an app?
  3. Will your solution be deployed to an On-Premise SharePoint farm, or will it be deployed to a hosted SharePoint environment, such as Azure?

If your answer to the third question is that you’re developing a Full-Trust solution to go on an On-Premise SharePoint farm, then the same approaches that John lists out in his blog post will work just as well for Angular as they do for jQuery, so I won’t cover those same approaches. If, however, you’re developing against a hosted environment (or you want to follow along with the SharePoint App model), then the below approaches might just be what you’re looking for.

The Designer Approach

In John’s jQuery blog post, one of his approaches was to use a CDN (content delivery network) to load jQuery. Like jQuery, you can use a CDN to load AngularJS. I like to take it a step further, though, and provide a backup in case the CDN goes down.

If you’re working on a page or web part and you just need to get angular “in there,” then perhaps this approach is for you. It has the benefit of being a “no-code” solution, meaning that it doesn’t require any deployment and will work in any SharePoint environment (On-Prem, Hosted, Azure, etc…).

First, I’ll take a copy of the AngularJS code I’ve downloaded and add it to my site’s document library.

Now, let’s say you need angular on a new site page you’re creating. Unfortunately, trying to use the tools available via the web browser hasn’t worked any time I’ve tried (the Embed tool seems to strip out any external references to script files), so you’ll have to open (or create) the page using SharePoint Designer and edit it in advanced mode.

Once opened in advanced mode, you’ll want to locate the “PlaceHolderAdditionalPageHead” section. This is where you’ll want to put all of your script tags to include AngularJS on the page.

The first script tag points to the official AngularJS CDN hosted by Google. The second script tag checks to see if angular has been defined in the current window. If not (because the CDN failed for some reason), it will load the version you have in your site’s Document Library.

The final script tag sets up an angular app and controller, which, as you probably already know, is required to wire things up to Angular. You can, of course, have this in a self-contained file stored in a document library, as well.

I also added the style section you see there to remove an unsightly warning message that SharePoint injects on the page to inform me that I’ve deviated from the defined page template, which is nothing to worry about.

Once you’ve got that added, you’ll next need to locate the “PlaceHolderMain” content section which should be located towards the bottom of the page. You’ll likely see some <SharePoint:SPRibbonButton> tags and a <SharePoint:EmbeddedFormField> tag. After that last tag, add the following bit of HTML.

You’ll notice that I’ve prepended the ng-app and other directives with ‘data-‘. If you don’t do this, SharePoint Designer will actually remove the directive altogether. Likewise, if you don’t provide a value to a directive, such as data-ng-app, SharePoint Designer assigns it an empty value, which causes problems within angular.

With that done, save your changes and click “ok” to any warnings you receive, then open up your page in SharePoint. You’ll see that there’s a text box that is pre-filled with the text “World” but, when you change it, the “Hello World” text changes to “Hello {{whatever you type}}.” Ah, the goodness of Angular.

If you also happen to be running a Network Traffic tool, like IE’s Developer Tools or Fiddler, you’ll see that there’s a call going out to the Google Angular CDN. You could also test that the “fallback” mechanism is working by intentionally adding a typo to the CDN url (back over in SharePoint Designer) and then refreshing your page. You’ll see that angular is being called from your document library instead of the CDN.

The SharePoint App Approach

If you’re creating a SharePoint app, then this approach is certainly for you!

Start by cranking up Visual Studio 2013 and open (or create anew) your SharePoint App project. After you configure your settings (I create a SharePoint-hosted app deployed to a SharePoint Online developer site), you’ll see that you have a handful of modules already in your solution, including a Scripts module. Add an Angular folder if you want (I do), and here you’ll add your downloaded copy of AngularJS.

Next up, open up the Default.aspx page. If you read through the first approach, then you probably know what we’re doing next.

You’ll want to locate the “PlaceHolderAdditionalPageHead” section. This is where you’ll want to put all of your script tags to include AngularJS on the page. The difference is that instead of pointing to the “Shared Documents” library, we’ll be pointing to where we added Angular.

Then, down in the “PlaceHolderMain” section, we’ll add the same HTML tidbit as before, except now we can dispense with the ‘data-‘ prefixes.

Once done, press F5 (or however you like to deploy). When the process is done, you’ll see your fancy Angular-ized SharePoint App alive and well.

The SharePoint App Part Approach

This approach is much like the previous approach, except that instead of adding Angular to a “stand-alone” SharePoint app, we’re adding to an app part that is added to a page. We start with the same steps as before: Create a new SharePoint App Project (or, if you followed along with the last section, you can reuse the same project) and once opened, add Angular to the Scripts module.

After that, right-click the name of the solution, choose “Add -> New Item”, and from the list of items choose Client Web Part (Host Web) and give it a name.

Once the web part has been added, you’ll have a new page in the Pages module that bears the name of your web part. It’s here that we’ll add the script references and HTML tidbits as before. Unlike the previous examples, this page contains mostly pure HTML, so you’ll add the same script tags as before in the <head> section and the HTML in the <body> section. Once added, just deploy as before.

Once deployed, navigate back to your site’s home page, put it into edit mode, find an empty section for your app part, and click the App Part button in the Insert section of the ribbon. You should see your app part listed in the “Parts” section. Select it and click the Add button. Once it’s added, click Save and…voila! You have an Angular-ized SharePoint App Part.

All Roads Lead to Angular Awesomeness!

While any one of these approaches, or those listed by John in his blog post, should work dandily for you, there might be (and probably are) other methods of getting Angular into SharePoint. Now you at least have a few ideas on how to get started. If you feel there’s a solid approach that I failed to mention, please let me know in the comments. The important thing is that you start spicing up some of that dull (and sometimes archaic) SharePoint content and functionality with the goodness that AngularJS provides us! I hope you enjoyed learning about adding AngularJS to SharePoint.

Let me know if you have any questions in the comments below…

read more
William HollandAdding AngularJS to SharePoint
logs.jpg

Getting the Request URL with SharePoint ULS Logs

I recently ran into an issue where I needed to query the SharePoint ULS logs to find all entries with a certain message.  In my case, I was looking for SMTP errors to try to understand when they were happening and under what conditions.  The Get-SPLogEvent PowerShell cmdlet is useful here, as it tells me when the errors are happening.  Here is an example:
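Something along these lines does the job; the time window and the message filter here are illustrative, so narrow them to your situation:

Get-SPLogEvent -StartTime (Get-Date).AddHours(-12) |
    Where-Object { $_.Message -like "*SMTP*" } |
    Select-Object Timestamp, Level, Correlation, Message

Keeping -StartTime (and -EndTime) as tight as possible matters, since Get-SPLogEvent reads every log file in the window and can be quite slow.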

Unfortunately, it doesn’t tell me under what conditions. What I really wanted was the request URL for the error. I knew that for web requests, the SharePoint ULS logs typically record the request URL in the first log entry for a given correlation ID. So, I wrote up a PowerShell script that looks back through the entries for the current correlation ID and tries to find the request URL.
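A minimal sketch of the core idea follows; the function name is illustrative rather than the script’s own, and it assumes the first entry for a correlation ID carries the request URL in its message text:

function Get-RequestUrlFromCorrelation {
    param(
        [Guid]$CorrelationId,
        [DateTime]$StartTime = (Get-Date).AddHours(-1)
    )
    # The first ULS entry logged for a web request's correlation ID
    # typically names the request URL in its Message text
    Get-SPLogEvent -StartTime $StartTime |
        Where-Object { $_.Correlation -eq $CorrelationId } |
        Select-Object -First 1 |
        ForEach-Object { $_.Message }
}

Call it with the correlation ID you pulled from the error entry, narrowing the start time as much as you can for the same performance reason as above.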

The help text in the source goes into more detail on how best to use it.  If you find this useful, please leave me a comment below.  If you have any changes you would like to make, please send me a pull request.


read more
Kirk LiemohnGetting the Request URL with SharePoint ULS Logs
check.jpg

Updating the Task Approval Form for the “Out of Box” SharePoint Approval Workflow

I was recently working on a project where the customer wanted to use the default SharePoint Approval workflow, but they wanted to include some additional information on the Task Approval form so that the “approver” did not need to navigate back to the SharePoint document library for the key information required to approve or reject the request. In this particular scenario, the customer wanted to add the Vendor name and Invoice amount fields. Having customized Approval forms for SharePoint custom lists in the past, I thought this would be relatively straightforward; it turns out the process is a bit different between lists and document libraries.

To get started, you will need to open SharePoint Designer and navigate to the appropriate SharePoint site. Next, navigate to the Workflows folder and make a copy of the Approval – SharePoint 2010 workflow as demonstrated below. (SharePoint 2010 workflows run in both 2010 and 2013 environments.)

Right-click the Approval-SharePoint 2010 workflow and select Copy and Modify. Add the name of the workflow (in my case, “Demonstrate Custom Columns on Approval Form”) and select the Content Type. In this scenario, I am associating this approval workflow with a custom Document Set content type called Custom Document Set.

After providing the workflow name and associated content type, click OK to save the new workflow. I also recommend publishing the workflow at this point so that it generates the default InfoPath Approval Form. Note that the default name of the InfoPath form will be ugly. I won’t go into detail in this blog, but there are ways to manipulate this name in the associated .xoml and .xoml.wfconfig.xml workflow files prior to publishing the workflow, so you can have a nicer name.

Having created a copy of the “out of box” Approval workflow and published it, we can now edit the workflow. Click the name of the Approval task as highlighted below.

You will see a Task Form Fields section in the top-right of this screen. Here we want to add two new Task Form Fields by clicking on the New button.

Add a field called Vendor as a single line of text.

Next, add Invoice Amount as Currency.

After completing these steps, publish the workflow again to persist the changes. This essentially adds two new fields to the task content type associated with this Approval workflow.

Next, we need to open up the task approval form and add the new fields. Click on the form name to launch InfoPath where we can edit the form.

After the form opens in InfoPath, right-click the Status column and select Insert Row above.

After the new row has been added, enter the Vendor text in the label column.

Next, in the data area of the form, add the Calculated Value control from the Controls list, as demonstrated in the two screenshots below.

To see the Calculated Value option, click the small arrow at the bottom of the Controls list.

After clicking this arrow, you will see the Calculated Value option as demonstrated below.

Next, select the Vendor field from the Fields list as demonstrated below. This screenshot shows the result of clicking the “fx” button and selecting “Insert Field or Group”. Select the Vendor option.

Repeat these same steps to add the Invoice Amount column to the form. For the Invoice Amount column, you can choose to display the value formatted as currency, as shown below.

Publish the updates to the form by clicking the blue “up arrow” at the top of the InfoPath window.

Allow InfoPath to save a local copy as part of the Publish process as shown below.

You should receive a message indicating the Publish was successful.

At this point, Publish the workflow again so that the updates to the InfoPath form are saved with the Workflow.

Now that we have added fields to the task content type and updated our InfoPath form to display them, we need to populate those task fields within the workflow.

Edit the Workflow and click the Approval task name as demonstrated below.

Next, select the “Change the behavior of a single task” option.

In the “Before a Task is Assigned” section, add a “Set Task Field” action. Select the Vendor field from the list of Task fields, and select the Current Item: Vendor field from the fields displayed in the Current Item. This populates the Task field called Vendor with the Vendor value from the Current Item before the task is created. Repeat the same steps for the Invoice Amount field.

Publish the workflow again to persist the changes to the workflow.

Now when you run the workflow and an approver opens the Task Approval form, they will see the new Vendor and Invoice Amount fields populated, which will help them decide whether to approve.

One limitation of this solution is that the values on the Task Approval form are captured at the time the workflow runs and are not dynamic. If these values are later changed in the document library, the changes will not be shown on the form. The rule of thumb should be that any change to the underlying data in the document library should prompt any running Approval workflows to be cancelled and new ones started. This makes sense, since approvers in a multi-step approval process may have already approved based on the original values.

read more
Tim CoalsonUpdating the Task Approval Form for the “Out of Box” SharePoint Approval Workflow