Securing Azure Functions with Azure Active Directory

Pete is a Director of Technology at ThreeWill. Pete’s primary role is driving the overall technology strategy and roadmap for ThreeWill. Pete also serves as ThreeWill’s Hiring Manager and is constantly looking for new talent to join the ThreeWill family.

Editor’s Note – Pete has been busy sharing knowledge on his personal blog.  I asked him if we could promote it on our corporate blog and he agreed.  Enjoy!

This is a multi-part post about consuming Azure Functions secured by Azure Active Directory. I have been using Office 365 applications with OAuth tokens for a while but wanted to dive a bit deeper and learn some of what is going on behind the scenes.


Office 365 Groups vs Azure AD Security Groups

Caroline Sosebee is a Software Engineer at ThreeWill. She comes to us with 20+ years of software development experience and a broad scope of general IT support skills.

We recently had a client who was ready to streamline the security of their SharePoint Online site and change it from ‘Everyone’ access to groups of people with more specific access. Our recommendation to them was to use Azure AD groups so that the groups would be global and could be both centrally managed and used across site collections.

As we moved ahead, they created the groups and added the appropriate members. We then granted SharePoint permissions to the new AD groups by adding them into the appropriate SharePoint groups and removing the reference to ‘Everyone but external users’.

At first all seemed to work ok but as the week progressed, random problems started cropping up that we couldn’t explain, the biggest one revolving around search results. One of their users (and others, we later found out), who had full read access to the root site and all subsites, would only get back results from his OneDrive library and from the separate training documents site (which is open to Everyone). Yet he could easily navigate to and access all the document libraries in all the sites.

Thus began my long search on all sorts of things SharePoint search related, trying to figure out what was going on. For some reason, I finally decided to go look at the AD groups themselves with the thought that since roles are assigned to users, maybe the same thing might need to be done for groups. This was a bust of course, but being fairly new to administering SharePoint Online, I was game for checking all sorts of things I didn’t know about.

Luckily this random check ultimately ended up pointing me to the real problem. It turns out these two new groups were set up as Office 365 Groups instead of security groups. At the time, I didn’t know anything about Office 365 Groups but didn’t really think this could be the problem. I decided to do a little research anyway into what that meant. One of the definitions I found was:

Office 365 Groups is a service that enables teams to come together and get work done by establishing a single team identity (managed in Azure Active Directory) and a single set of permissions across Office 365 apps including Outlook, SharePoint, OneNote, Skype for Business, Planner, Power BI, and Dynamics CRM.

So SharePoint was mentioned in that list of Office 365 apps, right? How could the group type be the problem then? We needed access to SharePoint, and the definition says it provides that. What it doesn’t tell you is that it’s mostly referring to access to the team site that is created specifically for that group when the group is first created. It does not mean the group will be very usable by other sites.

After more searching and finding very little, I decided it was time to do some of my own testing. First I had a test user added to the Office 365 Group currently in use. After giving the cloud some time to process this change, I signed in and ran a search or two. What I got back was very similar to the user mentioned above: few or no results, even though I had access to everything in the site.

I then created a new Azure AD Security group, added the same test user to it and then granted it the same permissions in the SharePoint site as the Office 365 Group had. After waiting a decent time so I was sure the security change was processed by search, I signed back in and found that the behavior was now entirely different. With the test user as a member of my new security group, I got back tons of results, just as expected. To further verify, I then removed the user from my test security group, waited a bit, ran a search and found I was back to square one. This was pretty solid evidence that the Office 365 Group was the culprit.

Our end solution was to create new Azure AD Security groups, add the correct members, grant them the same access as had been granted to the Office 365 Groups and then remove the Office 365 Groups. This seems to have corrected all the problems the users were experiencing.
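For anyone hitting the same confusion, the two group types are easy to tell apart programmatically. The sketch below classifies a group from the properties Microsoft Graph returns for it (`groupTypes`, `securityEnabled`, `mailEnabled`); the helper name and the idea of doing this check client-side are my own illustration, not part of any Microsoft tooling:

```python
def classify_group(group: dict) -> str:
    """Classify a directory group from its Microsoft Graph properties.

    An Office 365 (unified) group carries "Unified" in groupTypes;
    a plain security group is securityEnabled and not mail-enabled.
    """
    if "Unified" in group.get("groupTypes", []):
        return "Office 365 Group"
    if group.get("securityEnabled") and not group.get("mailEnabled"):
        return "Security group"
    if group.get("mailEnabled"):
        return "Distribution/mail-enabled group"
    return "Unknown"

# Roughly what the two kinds of groups from this engagement would look like:
print(classify_group({"groupTypes": ["Unified"], "mailEnabled": True,
                      "securityEnabled": False}))   # Office 365 Group
print(classify_group({"groupTypes": [], "mailEnabled": False,
                      "securityEnabled": True}))    # Security group
```

Had we run a check like this up front, the mixup between the two group types would have surfaced immediately.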

I still don’t know a lot about Office 365 Groups and haven’t had time to research much further, but I do like the below snippet (found here) that very succinctly describes each group and what it does.

I’m sure that Office 365 Groups have a place in the Microsoft world, but it is definitely not as a replacement to AD security groups.

As a quick recap, here are the areas that were impacted (at least on this particular site) by using an Office 365 Group instead of a security group:

  • Search – would not return results from the site
  • Starting a workflow – if a user in an O365 group kicked off a workflow, the workflow got hung up with an ‘Access Denied’ error before it ever got far enough along to send out custom errors.
  • Site access – Various users had problems accessing the site, even though they were in the correct group. This appeared random, as it worked for some users and not for others.

I hope this helps save someone some time in the future!


Deploy an Azure Function App using Azure ARM Templates

Pete is a Director of Technology at ThreeWill. Pete’s primary role is driving the overall technology strategy and roadmap for ThreeWill. Pete also serves as ThreeWill’s Hiring Manager and is constantly looking for new talent to join the ThreeWill family.

“About a month ago, the Azure Functions for SharePoint (AFSP) open source project was introduced. The project’s goal is to provide a set of common functions, implemented as Azure Functions, for scenarios shared by most SharePoint provider-hosted add-ins. Azure Functions is a compelling service from Azure, the Azure Functions for SharePoint pattern promotes an equally compelling use of Azure Functions in the Office and SharePoint space, and I wanted to get involved with the project. One of the goals of the project is to enable people to make use of the pattern and Azure Functions quickly and easily. While there is a Client Configuration Guide to manually configure the Azure resources required to start using Azure Functions for SharePoint, we thought minimizing the friction of getting started by providing a scripted configuration of the required Azure resources would be very helpful.

Read about how to use an ARM Template to deploy all the resources to get started with the Azure Functions for SharePoint.“
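For context, an ARM template is just a JSON document that declares the Azure resources to create, so a deployment becomes one repeatable command instead of a manual checklist. A heavily trimmed sketch of the kind of resource such a template declares (the parameter name and API version are illustrative, not taken from the AFSP project’s actual template):

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "functionAppName": { "type": "string" }
  },
  "resources": [
    {
      "type": "Microsoft.Web/sites",
      "kind": "functionapp",
      "name": "[parameters('functionAppName')]",
      "apiVersion": "2015-08-01",
      "location": "[resourceGroup().location]",
      "properties": { }
    }
  ]
}
```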


Long Running Operations in Azure

Lane is a Senior Software Engineer for ThreeWill. He is a strong technology expert with a focus on programming, network and hardware design, and requirements and capacity planning. He has an exceptional combination of technical and communication skills.

For a project, we recently had to implement a mechanism for performing bulk data operations. Essentially, the customer wanted an end user to upload an Excel file with a bunch of tabular data in it that we would then process and insert into SQL Server. We knew the act of processing and inserting could be very time consuming (we estimated it would be ~30 seconds/row x 10,000 max rows = ~3.5 days) so we wanted something very robust. Simply working in the application pool wasn’t an option; IIS would time us out. When coding the on-premises implementation, we just used a Windows service to watch for a file event in a folder that would process the file uploaded by the user. Easy peasy lemon squeezy.
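A quick sanity check on that estimate, taking the 30 seconds/row and 10,000 maximum rows at face value:

```python
# Rough worst-case estimate for the bulk import (assumed figures).
seconds_per_row = 30      # estimated processing + insert time per row
max_rows = 10_000         # maximum rows in an uploaded Excel file

total_seconds = seconds_per_row * max_rows
total_days = total_seconds / 86_400   # 86,400 seconds in a day

print(f"{total_seconds} seconds ~= {total_days:.1f} days")
# 300000 seconds ~= 3.5 days
```

Either way, it is far past anything IIS would tolerate in a request thread.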

However, Azure works differently. To have the equivalent of a Windows service in Azure we either needed a Worker Role or an Azure Service Fabric stateless service. Both of these approaches were undesirable since worker roles always consume resources, even when they are idle. This bulk import process would be used once, maybe twice, in the initial phases of an engagement and then never again over the course of months (or maybe even years). That means money is being spent even when nothing is happening.

Our first idea for solving this was to use an Azure Function App. A Function allowed us to listen to an Azure BLOB storage account for when a file was written and then react accordingly. However, we quickly realized that Function Apps have a 5-minute timeout (at least the Consumption plan ones do, I think the ‘always on’ Functions do not have this limitation) and our job had the likelihood of running for days.

The next couple of ideas we tried were very complicated, and in the end, none of them worked. That’s when we stumbled across Azure Batch Services. Batch Services are a way to run very long running operations, much like worker roles. However, unlike Worker Roles, you only pay for Batch Services when a batch job is running. The Azure team was even nice enough to provide some example projects that show how to wire up a job, start it, monitor it, and eventually kill it when it’s done (and all kinds of other very helpful things). Even better, we learned we could set up a Function App to trigger off a file being written to BLOB storage and start the batch job, but the Function App didn’t have to wait until the job completed, so we weren’t limited by the 5-minute timeout any longer. Better still, when setting up a job, you can specify how many cores and nodes in the batch cluster you want. That allows us to “preflight” the job by looking at how many records we will be importing and scale our job accordingly. Very slick!
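That “preflight” sizing can be as simple as a little arithmetic before submitting the job. A hypothetical sketch (the per-row time, target duration, node cap, and the even-split assumption are all ours, not part of the Batch API):

```python
import math

def preflight_node_count(row_count: int,
                         seconds_per_row: float = 30.0,
                         target_hours: float = 8.0,
                         max_nodes: int = 20) -> int:
    """Estimate how many Batch pool nodes are needed to finish the
    import within the target duration, assuming rows split evenly
    across nodes and each node processes its rows sequentially."""
    if row_count <= 0:
        return 1
    total_seconds = row_count * seconds_per_row
    needed = math.ceil(total_seconds / (target_hours * 3600))
    return max(1, min(needed, max_nodes))

# 10,000 rows at 30 s/row is ~83 hours of work; to finish in ~8 hours
# we would ask the Batch pool for 11 nodes.
print(preflight_node_count(10_000))  # 11
```

The Function that kicks off the job could run a calculation like this against the uploaded file’s row count and pass the result when creating the pool.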

It feels weird writing a blog post without code so here are some steps that should get you pointed in the right direction:

  1. Create a Batch Service account from the Azure Portal.
  2. Grab one of the sample projects from the GitHub project I linked above. I used HelloWorld as a starting place.
  3. Open Main.cs and comment out the lines for Console.ReadLine at the end of Main() (~line 38), and the call to WaitForJobAndPrintOutputAsync (~line 71) and all the code in the Finally block (~line 75-80) in HelloWorldAsync() (mainly we just want HelloWorld to tee up the job and not wait for it to finish). Save and compile everything.
  4. Make sure your Azure specific settings (keys, URL, etc.) are saved in Common’s AccountSettings.settings.
  5. Create a new Function App. Add a BlobTrigger-CSharp Function and add the following code into Run (you will also need to add ‘using System.Diagnostics’ at the top):
    // Launch the console app that queues the Batch job.
    Process process = new Process();
    process.StartInfo.FileName = @"D:\home\site\wwwroot\BlobTriggerCSharp1\HelloWorld\HelloWorld.exe";
    process.StartInfo.Arguments = "";
    process.StartInfo.UseShellExecute = false;
    process.StartInfo.RedirectStandardOutput = true;
    process.StartInfo.RedirectStandardError = true;
    process.Start();
    // Capture the console app's output for logging/troubleshooting.
    string output = process.StandardOutput.ReadToEnd();
    string err = process.StandardError.ReadToEnd();
    process.WaitForExit();
  6. Open the Function App settings and open Kudu. Use Kudu to upload the contents of the HelloWorld Debug folder into the Function App.
  7. Configure the BLOB triggers for the Function App using any path and storage account you want to.
  8. If needed, correct the path to the EXE above.
  9. Open Azure Storage Explorer and upload a file to the storage account you configured in #7.

You should now see your HelloWorld job if you navigate to the Batch Service’s Jobs in the Azure Portal. Note that because we took out the code to delete the job once it’s done from HelloWorld, you will need to add that logic somewhere else in your workflow (maybe another BLOB trigger that kicks off another Azure Function App).

And that’s it. An on-demand, scalable batch processing mechanism that works (and costs us money) only when we want it to.


Free ThreeWill Webinars for 2017

Danny serves as Vice President of Business Development at ThreeWill. His primary responsibilities are to make sure that we are building partnerships with the right clients and getting out the message about how we can help clients.

We’re excited to announce our Webinar Schedule for 2017 (all times in EST)…

  1. Moving from SharePoint Online Dedicated to Multi-Tenant – 1/26/17 @ 1:00pm – Listen Now
  2. Migrating from Jive to Office 365 – 2/23/17 @ 1:00pm – Listen Now
  3. Complex SharePoint Online/2016 Migrations – 3/30/17 @ 1:00pm – Listen Now
  4. Creating Award-Winning SharePoint Intranets – 4/27/17 @ 1:00pm – Watch Now
  5. Find Anything in SharePoint with Amazon-Like Faceted Search – 6/29/17 @ 1:00pm – Watch Now
  6. Budgeting for 2018 SharePoint Initiatives – 10/26/17 @ 1:00pm – Register
  7. Successful SharePoint Farm Assessments – 11/30/17 @ 1:00pm – Register

The schedule is subject to change (especially if presenters get overloaded on projects). Let us know in the comments if you have other topics that you would like us to cover.

Sign up below to get notified about upcoming events or follow us on Twitter.

SharePoint is a web application platform in the Microsoft Office server suite. Launched in 2001, SharePoint combines various functions which are traditionally separate applications: intranet, extranet, content management, document management, personal cloud, enterprise social networking, enterprise search, business intelligence, workflow management, web content management, and an enterprise application store. SharePoint servers have traditionally been deployed for internal use in mid-size businesses and large departments alongside Microsoft Exchange, Skype for Business, and Office Web Apps; but Microsoft’s ‘Office 365’ software as a service offering (which includes a version of SharePoint) has led to increased usage of SharePoint in smaller organizations.

While Office 365 provides SharePoint as a service, installing SharePoint on premises typically requires multiple virtual machines, at least two separate physical servers, and is a somewhat significant installation and configuration effort. The software is based on an n-tier service oriented architecture. Enterprise application software (for example, email servers, ERP, BI and CRM products) often either requires or integrates with elements of SharePoint. As an application platform, SharePoint provides central management, governance, and security controls. The SharePoint platform manages Internet Information Services (IIS) via form-based management tooling.

Since the release of SharePoint 2013, Microsoft’s primary channel for distribution of SharePoint has been Office 365, where the product is continuously being upgraded. New versions are released every few years, and represent a supported snapshot of the cloud software. Microsoft currently has three tiers of pricing for SharePoint 2013, including a free version (whose future is currently uncertain). SharePoint 2013 is also resold through a cloud model by many third-party vendors. The next on-premises release is SharePoint 2016, expected to have increased hybrid cloud integration.

Office 365 is the brand name used by Microsoft for a group of software plus services subscriptions that provides productivity software and related services to its subscribers. For consumers, the service allows the use of Microsoft Office apps on Windows and OS X, provides storage space on Microsoft’s cloud storage service OneDrive, and grants 60 Skype minutes per month. For business and enterprise users, Office 365 offers plans including e-mail and social networking services through hosted versions of Exchange Server, Skype for Business Server, SharePoint and Office Online, integration with Yammer, as well as access to the Office software.

After a beta test that began in October 2010, Office 365 was launched on June 28, 2011, as a successor to Microsoft Business Productivity Online Suite (MSBPOS), originally aimed at corporate users. With the release of Microsoft Office 2013, Office 365 was expanded to include new plans aimed at different types of businesses, along with new plans aimed at general consumers wanting to use the Office desktop software on a subscription basis—with an emphasis on the rolling release model.


Getting Started With Azure Functions

Pete is a Director of Technology at ThreeWill. Pete’s primary role is driving the overall technology strategy and roadmap for ThreeWill. Pete also serves as ThreeWill’s Hiring Manager and is constantly looking for new talent to join the ThreeWill family.

Danny Ryan:Hello and welcome to the ThreeWill podcast. This is your host Danny Ryan and I have Pete Skelly here with me. He is our director of technology, yes?


Pete Skelly:Ah, that’s correct.


Danny Ryan:Director of technology.


Pete Skelly:That’s what they call me today.


Danny Ryan:You take the technology and you direct it in certain ways, is that what you do?


Pete Skelly:Some would say that.


Danny Ryan:Some would say that and so this is going to be a really fun podcast because Pete is in the delirium stage at this point right now so what better than to sit down, record it for posterity and so when your grandkids are listening to this they’re like, “Papa, what are you talking about? I don’t understand what you’re talking about. You sound silly.” What, seriously today we wanted to talk about a subject that we covered in the morning brew, a morning brew is something we do weekly where we sort of do knowledge sharing within ThreeWill. This week you covered Azure Functions, correct?


Pete Skelly:Yes, that’s correct.


Danny Ryan:A sec, hit that and tell me at a high level, assume I was there and listening or maybe I wasn’t listening and you were going to describe this to me at a high level?


Pete Skelly:Azure Functions, I don’t even know what the Microsoft marketing terminology and speak is at this point but Azure Functions to me really are the, a pure function if you will, and not in the proper use of pure function, don’t send me hate mail. It’s just the functionality that you want to run.


Danny Ryan:Okay.


Pete Skelly:There’s no additional need to have a full blown visual studio project or, you know, all this extra cruft to accomplish a typically small task. There’s a whole lot of caveats around that but that kind of boils down. You’ve got something you want to accomplish, it can be chunked very small, and it’s kind of an in and out function that achieves some, you know, valuable process.


Danny Ryan:This is, previously we’d take these and you know you’d build out an app and this app would have a bunch of functions inside of it and-


Pete Skelly:Yep.


Danny Ryan:Here we’re taking something that’s going to, I mean we’re just putting it out there in the cloud and something that we can interact with and do what we need to do in little bite sized chunks.


Pete Skelly:Yep.


Danny Ryan:They go and they do their thing and they probably I guess just very specific inputs and outputs for this little guy that does this thing?


Pete Skelly:Absolutely, so Azure Functions are based on the Azure WebJobs SDK.


Danny Ryan:Okay.


Pete Skelly:You know, and please, if I misstate this, provide some comments or feedback in the


Danny Ryan:There will be a comments section.


Pete Skelly:Website that Danny has so, you know, full transparency Azure Functions were recently released to GA.


Danny Ryan:Okay.


Pete Skelly:Myself and someone else in our free time are spending some time learning some Azure Functions and one of the things that we started to look at was what do you really do with them? The classic example is a timer job.


Danny Ryan:Okay.


Pete Skelly:Probably the biggest use of WebJobs, at least from ThreeWill’s perspective for customers or even internal, we used WebJobs for timer jobs so we had some-


Danny Ryan:I love a good WebJob, I’ll tell you that much.


Pete Skelly:The timer job really was, I just, I’m not even going there. The timer job really is the classic example, you know, I have to clean out some table in a database, or I have a files in some location that I want to clean out, or it’s a very simple, everybody gets it, you know, classic IT Admin schedule a job to actually import a file or something.


Danny Ryan:Nice.


Pete Skelly:In Azure Functions it’s a very simple mechanism of saying I have this little bit of code that might have to, you know, add a file or parse a file and update a database, or parse many files and update a database, and so you can create these functions that are based on HTTP triggers. Basically, kind of a fire and forget mechanism, or even endpoints that might serve a mobile app, or a custom app, or even a SharePoint app, for example. They could be timers, they could be blob- or queue-based, and all of these things are just events. An event kicks off that code and that Azure Function is just essentially going to run and perform some action.


Danny Ryan:This tickles you inside because one of the things that you can do is PowerShell? What’s the deal there?


Pete Skelly:One of the things, yeah, part of what I kind of dove into in the morning brew the other day was to try to get my feet wet with Functions. You can write Azure Functions in multiple languages, so they have support for Node, and C#, and Bash, and PowerShell, et cetera, and I figured, “Hey, PowerShell seems pretty good.” I had just read that everything went GA for Azure Functions and I also saw an article or blog post by John Woo about using the PnP PowerShell CmdLets in an Azure Function so I said, “Why not give it a shot?”


Danny Ryan:Sure.


Pete Skelly:Using the PowerShell, using a PowerShell function and based on an HTTP trigger I was trying to set up a Microsoft Flow from a SharePoint list to kick off an Azure Function that would provision a site collection in Office 365.


Danny Ryan:Boom. Boom. It was easy, piece of cake?


Pete Skelly:Setting everything up was fairly easy except for getting the PowerShell PnP CmdLets to work. Little did I know, you know spoiler alert, the CmdLets had been changed probably two weeks prior and so everything that, you know, John wrote about-


Danny Ryan:Was it a big change, or?


Pete Skelly:The names, the CmdLet names were changed. They had used an SPO prefix, for SharePoint Online, in their CmdLet names, and that caused a couple of problems. It was something you could work around, but they recently changed the names to use a PnP prefix. For example, Get-PnPSite would get you the site collection.


Danny Ryan:Okay.


Pete Skelly:What I did not know is that that had occurred. I had grabbed the latest code for the PowerShell CmdLets and was troubleshooting, thinking that it was an Azure Function issue, because I’ve used the PowerShell CmdLets enough in the past to know exactly what they were. It’s always something simple; I didn’t know what I didn’t know, and they were renamed. In the end it was real simple to set up, very simple issue-


Danny Ryan:Once you figured it out.


Pete Skelly:Just took a little while to figure it out, you know. So, in the end, it was very simple: create a SharePoint list, create a Flow.


Danny Ryan:Yeah.


Pete Skelly:Then Flow has the ability to basically call out to an HTTP endpoint, and that pushed the data over to the actual Azure Function. PowerShell took that data and basically provisioned the site collection and applied the site template from the PnP CmdLets. Really simple exercise that shows some of the power of Azure Functions, and right now I’m, kind of, diving into some other things. There are some edges with it at this point, so I’m learning some of the development style, some of the SDLC or ALM depending on who you are. Some of the lifecycle management ways of developing functions, deploying functions, managing them, et cetera.


Danny Ryan:Thank you for defining the acronyms for me, that’s so sweet.


Pete Skelly:Yeah, so software development lifecycle and application lifecycle management. You know, Functions are pretty, they’re extremely powerful and they’re super simple to get started with, so you can literally go to the Azure Functions portal and in the course of probably 20 clicks have some pretty powerful stuff wired up.


Danny Ryan:Did you look at all, how are they pricing this? I mean, how do you license something like this?


Pete Skelly:In a nutshell, it’s compute time.


Danny Ryan:Okay, compute time.


Pete Skelly:Really, it’s, you know, for someone doing development, you could probably get away with pennies while learning it if you’ve got a developer subscription or an MSDN subscription to Azure. It’s not going to cost you really anything to get started. There are millions of compute cycles for free, basically, with an Azure subscription. If you’re processing BLOBs, so if somebody puts a file into BLOB storage and you have a function that’s kicked off because of that event, obviously you’re going to pay for some BLOB storage and some of the storage things, and depending on how much you’re using it could get a little pricey. But to get started and to start figuring it out, very cheap.


Danny Ryan:Nice.


Pete Skelly:Also, really looking at things that typically might be a workload where, you know, a couple of our customers currently have the classic scheduled task running on a server in their own environment. They’re paying for the infrastructure for the server, they’re paying for the server itself, they’re paying for somebody to monitor and manage it and patch it. All those things, where literally they’ve got a PowerShell script that’s fifty lines of PowerShell that’s running, that’s doing the work. That can be moved up to an Azure Function, and they get rid of all that cost.


Danny Ryan:That’s awesome.


Pete Skelly:They get rid of all those kind of moving parts and it just becomes somebody uploads a file to BLOB storage, that function runs, they get charged for the time that that function is running, and they move on. Really, I think there’s a lot of potential there.


Danny Ryan:Yeah.


Pete Skelly:The IFTTT mentality, and Zapier mentality, and now Flow. That style of interaction with things, and we, you know, wrote in a white paper many moons ago that the true power of kind of a network, if you will, is all the permutations that you have across all of those nodes; the more nodes you have, the more value you’re going to get. Between Flow and Functions and all these other things, you’re now getting to a power use that could make some really effective “workflows,” if you will, and I’m doing air quotes on the podcast.


Danny Ryan:I see it, I see the air quotes.


Pete Skelly:You can create some really powerful, you know, sort of personal workflows using Flow, and then if there is something that needs to be more enterprise grade, that’s where Logic Apps would come in.


Danny Ryan:Okay.


Pete Skelly:In effect Logic Apps would be the enterprise level of a personal flow, if you will, and Logic Apps basically know how to interpret, interact with, you have full access to Azure Functions so it’s the big boy movement to getting all of the business processes that will be event driven within kind of your Office 365 or Azure environment.


Danny Ryan:Someone getting started with this besides nailing the class names, what would you-


Pete Skelly:Figuring out the easy stuff?


Danny Ryan:Any tips that you have for them?


Pete Skelly:Start with


Danny Ryan:Okay.


Pete Skelly:You’ll need an Azure subscription but that’s the simplest way to get started. The Azure WebJobs SDK in GitHub has a ton of samples. I believe it’s Chris Anderson who did an Azure Functions demo, or seminar, or session at Ignite this year.


Danny Ryan:Ignite, okay.


Pete Skelly:That is probably one of the better things I’ve seen as far as explaining what they are, how to use them, you know, giving you some concrete examples, et cetera. Once you kind of go through the UI to get your feet wet, there are several utilities, depending on whether you are inside of Visual Studio. Me personally, I’m on a Mac so I’m using some command line tools, so the Azure Functions Command Line Interface actually lets you stub things out and get started with kind of a better application lifecycle management, so you’re using source control, you’re pushing to a repository.


Danny Ryan:Nice.


Pete Skelly:Once you push it, once you check your code into master, that’s actually going to deploy it out to the Function. So it gets a little bit more advanced; once you get out of the UI in the web you’ve got to get some tooling. That’s not there yet, it’s kind of half-baked. There’s some really good stuff out there but it takes a little effort. That’s kind of the stage I’m in, getting the pieces and parts together so that you are actually doing the development, quote, properly, versus trying to go through the UI and not really being able to scale. Not that there’s not value in that, but for our customers, you know, we’ve got to find a way to actually make that a little bit more ALM compliant.


Danny Ryan:When do you see first, and we’re getting here to the latter stages of this, I need to let you get back to work here. When do you see first using Functions on projects, or Azure Functions on projects? Your next project you are going to be on, or is it?


Pete Skelly:I think the dependency right now is, does the customer have Office 365, first off. If they have Office 365 I think that’s the easiest on-ramp, because then it’s just, “Okay, you’ve got Office 365, let’s get a subscription in Azure,” and basically wire up some processes. For one of our customers that I’m working with currently, that’s in their plan for 2017.


Danny Ryan:Great.


Pete Skelly:They’re really looking forward to kind of doing some of the things I described which are kind of timer job based on premises and moving those into the cloud and using just pure compute so they’re not, they don’t have servers on-prem anymore, those types of things.


Danny Ryan:Do you mind taking a couple of hours this weekend and getting the Jive migration tool over into Azure Functions?


Pete Skelly:Actually the reason I’m a deer in the headlights today is because I’m working on, I’ve been mired in Git for most of the day trying to set it up so that we can have multiple Git repositories that share all the same code so I’m working on your migrator tool right now as we speak.


Danny Ryan:Aww, thank you. Very nice. Very nice.


Pete Skelly:It’s coming, I don’t know if it will be in Azure Functions very soon but it, that’s kind of the long-term goal for some of this stuff.


Danny Ryan:You should focus in on getting the project done. I’m like the number 18 priority around here so I understand. I know where I fit into the world. With that, thank you so much for taking, at the end of the day here, some time out just to share. You know, sharing internally earlier this week and then sharing with everybody here. If you are picking this up and have any questions, please leave a comment at the bottom of the page. Also, promote your website, Pete.


Pete Skelly:The Azure Functions blog post actually goes through what I had to do to get the PnP CmdLets working, full code, those types of things. It’s all out there.


Danny Ryan:Yeah, and I’ll put a link to that at the bottom of the page as well so jump off there as a good next step from here and thank you so much for listening and have a wonderful day. Take care. Bye bye.


read more
Pete SkellyGetting Started With Azure Functions

How to Assign ThreeWill as Your Azure Partner of Record

Danny serves as Vice President of Business Development at ThreeWill. His primary responsibilities are to make sure that we are building partnerships with the right clients and getting out the message about how we can help clients.

Over the last 15 years we’ve been fortunate to help hundreds of customers.  If you’re a customer of ThreeWill and you want to do us a huge favor, please assign us as your Partner of Record.  This enables us to keep our Gold Certification and to serve you better because we have more resources from Microsoft to help you on projects.

Step-by-Step Instructions to Add ThreeWill as Your Partner of Record

  1. Go to the Microsoft Azure Portal at
  2. Click on the My Account icon on the upper middle of the screen.
  3. Click on Usage and Billing.
  4. Log into your account using your user name and password.
  5. In the left navigation pane, select Subscriptions.
  6. On the Summary Subscription Page, click on Partner Information on the right navigation. This is where you will attach your Partner of Record.
  7. Enter 566560 for the Partner ID.
  8. Click Check ID to verify ThreeWill.
  9. Click Submit to complete assigning your Partner of Record.
  10. After you assign us as your Partner of Record, we will receive an email notification letting us know that we have been assigned.

To Change or Remove Your Partner of Record

  1. Following the steps outlined above, log into the Microsoft Azure Portal.
  2. On the Summary Subscription Page, click on Partner Information on the right navigation.
  3. Highlight the Partner of Record field and delete the Partner of Record shown in that field.
  4. Click the check box. You have now removed the Partner of Record for this account and your subscription no longer has a Partner of Record.

Frequently Asked Questions

Who is a Partner of Record?

The Partner of Record for an Office 365, CRM Online, or Azure subscription is the partner who is helping the customer design, build, deploy or manage a solution that they’ve built on the service. It is not the partner who sold the subscription.

What are the benefits of specifying a Partner of Record?

Customers benefit because it provides the partner access to usage and consumption data, so they can provide better service and help customers optimize their usage for their desired business outcomes.

Who can attach a Digital Partner of Record?

The administrator role, also known as the owner, is the only role within the customer’s tenant or account that can attach a Partner of Record. Service admins, co-admins, and partners designated as delegated admins do not have the ability to change the Partner of Record.

When should a Partner of Record be added to an Office 365, CRM Online, or Azure subscription?

Microsoft recommends a Partner of Record be assigned to subscriptions right away. Partners of Record can also be assigned for Office 365 subscriptions in the admin portal for that service.

read more
Danny RyanHow to Assign ThreeWill as Your Azure Partner of Record

Configure Scheduled Shutdown for Virtual Machines in Azure

Eric Bowden has over 19 years of software development experience around enterprise and departmental business productivity applications.

Scheduled Shutdown in Azure

I’ve been using Virtual Machines in Azure for a while, and it’s a great option. My favorite part is how easy it is to expose Virtual Machines in Azure to the public internet, which can be really useful for a number of testing scenarios; it’s been a great resource. I’m able to use it because, as part of my MSDN subscription, I’m provided a monthly credit. The only issue is that the credit is only about enough to cover my Virtual Machines for the full month if they run solely during working hours.

That’s approximately eight hours a day, five days a week, and the credit has been enough to cover it. That works out great as long as I remember to start my Virtual Machines in the morning and then stop them at the end of the day. Unfortunately, I’ve forgotten to stop my Virtual Machines a number of times, so they ran overnight. One time they ran over the weekend, and that caused me to overrun the monthly credit I have as part of my MSDN subscription.
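The back-of-the-envelope math works out roughly like this (a sketch assuming a 30-day month and four working weeks; real Azure billing is per-minute and rate-dependent):

```shell
# VM-hours if a single VM runs all month vs. only during working hours.
full_month_hours=$((24 * 30))     # left running around the clock: 720 hours
working_hours=$((8 * 5 * 4))      # 8 hours/day, 5 days/week, 4 weeks: 160 hours
echo "${full_month_hours} vs ${working_hours}"
```

Forgetting to stop the VMs even a few nights quickly closes that gap, which is what motivates the scheduled shutdown below.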

A colleague of mine, Rob Horton, caught wind of that recently and directed me to the blog post I’m showing, which walks you through configuring what’s called a runbook in Azure. That runbook will run a PowerShell script, specifically what’s called a graphical PowerShell script. There are graphical PowerShell scripts and, I guess, non-graphical PowerShell scripts, and the post walks you through configuring a graphical PowerShell script as part of your runbook, which can be used to automatically stop your Virtual Machines.

I configured that a couple of weeks ago, and it’s been awesome! I configured mine to run at 6:00 PM, which is approximately the end of the day, and then at 2:00 AM, which could be the end of my day if it’s a long one. If I’m working on something into the night, 2:00 AM will probably be the latest, and I just want to make sure it runs twice so that it definitely shuts down. I’ll walk us through the process of configuring my runbook. I’m showing on the screen the blog post that I followed, but there was one key difference in how I configured mine versus how the blog post walks you through it: I used a user account to run my runbook, whereas the blog post uses an application account.

I think there are probably good reasons for using an application account. I tried that and couldn’t get it to work, but I got it to work with a user account. I could probably circle back and get it to work with an application account, but I just haven’t put that effort into it. I’m going to walk you through today how to configure it using a user account. Let me get started.

The first thing I’m going to do is create the account. To do that, I’m going to go into the portal and choose Active Directory to get into my Azure Active Directory accounts. I’m going to go to the users tab. I’m going to add a user. I’ll create a new user and I’m going to call this VM Maintenance. Choose next and we’ll just call it VM Maintenance. Just create it as a standard user and you will get a temporary password for that user. I’m going to copy that to my clipboard. We’ll go ahead and close that up.

I’m going to open up a new incognito window. Then I’m going to go ahead and log in as this new user. It’s going to prompt me to change my password, which I will do. Take my password and sign in. Now my credential is set, and my next step is to configure this user as a co-administrator of my Azure subscription. To do that, I’m going to scroll down to Settings, choose Administrators, and then choose Add. Add in my user and select that user as a co-administrator.

You’ll see the user is actually listed in here twice at the moment because I did it earlier while preparing for this demo. Let’s just remove the first one. There we go. Hopefully I removed the old one and not the new one, but we’ll see what happens. My account is now a co-administrator of this subscription. I know that seems pretty heavy-handed. I have read blog posts, Stack Overflow posts, and so forth from people who state that this is necessary, and I actually tried giving this user a lower account privilege and it was not successful. This does seem to be necessary.

That should be what we need for our accounts. Next, I’m going to go back to the portal admin page. I’m going to go to automation accounts and I’m going to add a new automation account. Click Plus and we will call this, VM Automation. I’m going to add it to an existing resource group. I’m not really aware of why this matters, but I’ll choose the same resource group as I do for all my others. I’ll leave the other default settings and choose Create.

It may take just a few minutes for the automation account to finish the creation process. For me, I had to click refresh on the listing of automation accounts. Once the new automation account appears and is open, the first thing we’re going to do is go to Assets, then click Variables, and add a new variable called Azure Subscription ID. For that we of course need our Azure subscription ID, so go back to the portal, go to My Subscriptions, grab the Azure subscription ID, and paste it in.

Next, we’re going to need to create a credential which will be used to run the runbook, so I’m going to click Credentials. I’m going to click Add Credential and we’re going to name this one Azure Credential. I’m going to add in the VM Maintenance account that I created earlier. Now we’ve created a variable and a credential, and our next step is going to be to create the runbook. I’m going to choose Add runbook. No, I’m not going to choose that runbook; I’m going to choose Browse Gallery and then choose Stop Azure Classic VMs. Let’s just see what that gives us.

You’ll notice that there are two of them: one that says PowerShell Workflow Runbook and another that says Graphical PowerShell Workflow. The Graphical PowerShell Workflow is the one we need. We’ll choose that and then select Import. I can accept the default name and choose OK. We’ll import it from the gallery. My next step is going to be to choose Edit. There is a button for the task pane, and it prompts me for parameters. The service name will default, I guess, to the current service. The Azure Subscription ID and the name of the credential both have defaults which we configured a moment ago. You can see Azure Subscription ID is the variable that we just configured, and Azure Credential is the credential that we just created.
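For reference, the imported graphical runbook behaves roughly like the plain PowerShell workflow below. Treat this as a pseudocode sketch, not the actual gallery source: the workflow and parameter names here are assumptions, and the cmdlets are the classic (pre-ARM) Azure module ones, the same pipeline used in the one-liner at the end of this series.

```powershell
workflow Stop-AzureClassicVMs
{
    param([string]$AzureSubscriptionId, [string]$AzureCredentialName)

    # Pull the co-administrator credential stored as an Automation asset
    $cred = Get-AutomationPSCredential -Name $AzureCredentialName
    Add-AzureAccount -Credential $cred
    Select-AzureSubscription -SubscriptionId $AzureSubscriptionId

    # Stop every classic VM that is still running
    Get-AzureVM |
        Where-Object { $_.Status -like "Ready*" } |
        ForEach-Object { Stop-AzureVM -ServiceName $_.ServiceName -Name $_.Name -Force }
}
```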

When we’re ready, we can just click Start to begin the test. I’m going to scroll over a little bit here to see the results, or the output. It takes a while for these to run, so I’m just going to pause the video while it runs. When the workflow is completed, as you can see here, it gives you a little bit of output. In this case, it tells me that my Virtual Machines were already stopped, so there was no action taken. Had any of them been running, it would have listed out those that it had stopped.

Let me close out of the task pane. My next step is going to be to publish this runbook, so it’ll prompt me: do I want to publish it? Of course I say yes. Now we have a runbook created, our accounts are created, and we have tested our runbook, so our next step is going to be to schedule it. I’m going to go back to VM Automation and select the runbook. Once the runbook is open, I’ll click Schedule from the top, and it will say “Link a schedule to your runbook,” so I’ll choose that. I’m going to create a new schedule.

In this case, I’m just going to choose a one-time schedule. Of course, you can choose once or recurring. I’ll just choose once, and I’ll have it fire off here in the next 20 minutes. We’ll choose 12:20 PM and I’m going to say Create. Now, one trick: I thought that simply creating the schedule meant I was finished, but it doesn’t. All you have done at this point is create a schedule; you still need to link the runbook to it.

Now I’ve selected it and choose OK. Now you should see the number of schedules increase here under the schedules listing. I’ll open it, and now I can see that my runbook is scheduled and we’ll wait a moment for that schedule to run to confirm that it’s working. To do that, I’m going to go back to my classic virtual machines. Let me just start up one of my smaller virtual machines here.

It’ll start up and hopefully it’ll be fully started by the time that schedule fires off. Let’s go back to Automation Accounts, open that automation account, open the runbook, and open the schedule, and we’ll see it’s set to fire off at 12:20, which is just about five minutes from now. It won’t allow you to fire off a schedule any sooner than five minutes out, so we’ll wait just a moment for that to fire off.

Now I can see that my schedule is listed as expired which tells me that it’s already run. So if I go back to my classic virtual machines, I can confirm now that my virtual machine has been stopped. That’s the quick run-through for creating a runbook which you can use to stop your virtual machines.

read more
Eric BowdenConfigure Scheduled Shutdown for Virtual Machines in Azure

Azure WebJob Scheduling

Kirk Liemohn is a Principal Software Engineer at ThreeWill. He has over 20 years of software development experience with most of that time spent in software consulting.

Azure WebJobs are extremely easy to write. If you have Visual Studio 2015 (earlier versions may work as well), then you can write a Hello World app in one minute. In another two minutes, you can publish it and have it running in Azure.

However, if you want to run it on a schedule, this can be a little tricky. For my first scheduled WebJob this went off without a hitch, but for some reason, my second WebJob was more finicky. It took a bit of investigation to figure this out, so I thought I would share my findings here…

Let’s start from the beginning.

If you have Visual Studio 2015, the simplest way to create an Azure WebJob is to create a console app. You can do this without Visual Studio, but that’s beyond the scope of this post. For a hello world test, your console application can be this simple:

using System;

namespace SimpleWebJob
{
    class Program
    {
        static void Main(string[] args)
        {
            Console.WriteLine("Hello World");
        }
    }
}
Then simply right click on your project name and choose Publish as Azure WebJob:

The next screen you are presented with asks for a schedule. This may or may not work (more on that later).

Then you have a choice to make for selecting a publish target. The first option has given me more success when it comes to publishing the schedule chosen above. With that option, you’ll have to sign in to Azure.

If you choose the second option, you’ll need to first download a publish profile. On the old Azure portal, this can be found in the Quick Glance portion of the dashboard for your web app (you’ll have to create a web app if you don’t have one).

On the new Azure portal, this can be found near the top of the page for your web app.

Once you download the publish profile, you can import it into the Visual Studio wizard.

Regardless of your publish target, you can then validate your connection and publish.

After publishing you want to see something like the following in your output window:

Creating the scheduler job

Job schedule created

Web App was published successfully http://<yoursite>

However, you may notice one of the following messages in your output window instead:

Warning : WebJob schedule for SimpleWebJob will not be created. WebJob schedules can only be created when the publish destination is an Azure Website

Web App was published successfully http://<yoursite>

I tend to see this when importing a publish profile, but don’t have this problem when selecting “Microsoft Azure Web Apps” as the target. Others have had these issues as well, as evidenced here and here. However, even with the “Microsoft Azure Web Apps” target I can get this error:

Error : An error occurred while creating the WebJob schedule: Conflict: The provided job has a recurrence that is outside of quota. The minimum recurrence for the job collection is ’01:00:00′.

I get this error if I choose a frequency more often than one hour, even if I am on the free tier for my web app. Others have had these issues as well, as evidenced here and here. Pricing for these plans is published online. You may find that upgrading your web app from the free plan helps, but keep reading as there is another option.

If you see either of the above messages, your WebJob was probably published, but the schedule was not. It will only run on demand. Another clue you are having this problem is if you look at the WebJob in Azure you will see that it has a type/schedule of “On Demand”. However, I have seen it say “On Demand” in the new Azure portal when it actually mentioned the right schedule in the old Azure portal. Very strange.

Based on what I have seen, Azure WebJobs are not really intended to own the schedule. The reasons I state this include:

  1. It seems you need Visual Studio for this approach.
  2. When using a publishing profile, the schedule appears to be ignored.
  3. The new Azure portal does not show the schedule when the old one does. For the new Azure portal, the schedule actually shows in the Scheduler (more on that below).
  4. There is a nice UI for setting the schedule initially, but to update the schedule you must edit the webjob-publish-settings.json file.
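For reference, the webjob-publish-settings.json file that the Visual Studio 2015 tooling adds to the project looks something like the following. The values here are illustrative for the SimpleWebJob sample; double-check field names against the file your project actually generated:

```json
{
  "$schema": "http://schemastore.org/schemas/json/webjob-publish-settings.json",
  "webJobName": "SimpleWebJob",
  "startTime": "2016-02-23T00:00:00-05:00",
  "endTime": null,
  "jobRecurrenceFrequency": "Hour",
  "interval": 1,
  "runMode": "Scheduled"
}
```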

My belief is that scheduling for WebJobs should be done using the Azure Scheduler directly rather than the built-in WebJob schedule. If you choose the Azure Scheduler, you have three pricing options. The free tier has some limitations; the most important are not only the maximum execution frequency of one hour, but also that you cannot attach authentication to each schedule, which is needed if you want to schedule a WebJob. Some have had success providing credentials on the URL (see this post for more details), but it did not work for me.

My solution was to upgrade the scheduler to Standard ($13.99 per unit; you get 10 “job collections created per hour” for one unit) and create my own schedule by hand. Here’s how I did it from the new Azure portal:

  1. Create a Scheduler or find your existing Scheduler (use search within the portal to find it)
  2. Upgrade to Standard if you are using the Free tier
  3. Download your publish profile as discussed further above and open the file with a text editor.
  4. Add a new scheduler job
  5. Give it a name and set the following properties:
    1. Action: Https
    2. Method: Post
    3. Url: https://<yoursite><webjobname>/run
    4. Body: (leave blank)
    5. Authentication Type: Basic
    6. Authentication Username: get the “userName” value from your publish profile – it might start with “$”
    7. Authentication Password: get the “userPWD” value from your publish profile
  6. Set a schedule. I noticed that the UTC offset was ignored for me so I kept it at UTC and just adjusted the advanced schedule settings accordingly.
  7. Make sure you save everything
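The Scheduler job set up above is just an authenticated HTTP POST against the WebJob’s trigger URL. As a sketch, here is one way to build and fire that request by hand; the site and job names are hypothetical, and the scm-style trigger endpoint below is an assumption to verify against your own web app, since the exact URL depends on your site:

```shell
site="mysite"            # hypothetical web app name
job="SimpleWebJob"       # hypothetical WebJob name
url="https://${site}.scm.azurewebsites.net/api/triggeredwebjobs/${job}/run"
echo "$url"
# Fire the same POST the Scheduler job sends, using the publish-profile credentials:
# curl -X POST "$url" -u '<userName>:<userPWD>'
```

Running the job this way is also a quick check that the username and password from the publish profile are correct before trusting the schedule.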

One issue I noticed with this approach is that my authentication settings would get cleared whenever I made updates to the schedule, so you may want to re-enter the authentication settings if you make any changes to the schedule.

After the schedule is set up, you can go back to your WebJob in the Azure portal and wait for it to run. You can run it manually as well. Once it does run, you should see your Console.WriteLine output. Here is a sample of what you see:

[02/23/2016 03:51:42 > 1cd571: SYS INFO] Status changed to Initializing

[02/23/2016 03:51:42 > 1cd571: SYS INFO] Run script ‘SimpleWebJob.exe’ with script host – ‘WindowsScriptHost’

[02/23/2016 03:51:42 > 1cd571: SYS INFO] Status changed to Running

[02/23/2016 03:51:42 > 1cd571: INFO] Hello World

[02/23/2016 03:51:42 > 1cd571: SYS INFO] Status changed to Success

I know this is a lot of detail, but I hope this post clears things up for some.

read more
Kirk LiemohnAzure WebJob Scheduling

Azure Web Jobs

Kirk Liemohn is a Principal Software Engineer at ThreeWill. He has over 20 years of software development experience with most of that time spent in software consulting.

Danny Ryan:                       Hi this is Danny Ryan and welcome to the ThreeWill Podcast. Today I’ve got Kirk Liemohn here with me. Kirk is a Principal Software Engineer here at ThreeWill. Thanks for joining me Kirk.

Kirk Liemohn:                    Hello. Good to be here.

Danny Ryan:                       Great. I’m pulling you in for … We’re going to hit a technical subject today and that subject is web jobs. These are Azure Web Jobs. Is that correct?

Kirk Liemohn:                    That is correct.

Danny Ryan:                       Awesome. I’d love to learn a little bit more about what this is. Maybe a little bit of history of how you solved this problem before in the past. Some of the benefits of the way you’re taking your approach today. Tell me a little, maybe just starting at a high level, what is an Azure Web Job?

Kirk Liemohn:                    It’s just a task that runs in the background on Azure, within an Azure app. You can use it just to run anything that might take a long amount of time maybe, or just something that you want to run on a periodic basis.

Danny Ryan:                       Great. What would … is there an equivalent type of thing on SharePoint as far as the way you would approach this?

Kirk Liemohn:                    Yeah. In the past, in SharePoint we would do a timer job, definitely, if we had access to do that; you can’t do that in SharePoint Online. You could write your own timer job in SharePoint. You could do a scheduled task on a Windows server as well, or even a client machine if you wanted to. Those are the ways we did it in the past. I’ve got a Unix background from back in the ’90s, and back then it would have been a cron job.

Danny Ryan:                       Excellent. Sort of the history of the way that you periodically ran a piece of software or …

Kirk Liemohn:                    Yep.

Danny Ryan:                       I know when we we’re talking about prepping for this you also mentioned that you could use this for long running processes as well?

Kirk Liemohn:                    Yeah. Say you’ve got a website, especially if it’s running in Azure and you’re using the PaaS solution for websites in Azure, and it needs to do some processing that for whatever reason takes more time than you want to make the user wait. Even if it’s just a few seconds, that might be enough to say, “Hey, let’s do this asynchronously,” and push it off to a web job. If it might take minutes or hours, you would certainly want to consider a web job for background processing.

It may not be something you do on a scheduled basis. It may be initiated by a user. If you were to, on that website, try to spawn a thread so that the user didn’t have to wait that delay, IIS (if you’re using IIS) will potentially kill your thread, because it thinks, “Well, I’ve already returned to the user and I don’t need anything else going on. I see there’s something else going on, but I don’t care. I’ve already returned control to the user.” It has no use for the thread you’ve created, so it’s better to do it with something like a background process via a web job.

Danny Ryan:                       Tell me a little bit about some of the things that you learned while doing this. Talk a little bit about how did you package this up and get it on Azure.

Kirk Liemohn:                    First off, this is my first time doing a web job. I’ve wanted to do them in the past. I’ve had some times I didn’t have to do them, but I kind of wanted to do them. This is a time I really needed to. I am doing a migration to Office 365 and we’re concerned about throttling. This is SharePoint, this is a SharePoint migration. We know that we might get throttled by SharePoint online. We want to know when we’re being throttled.

The best way we can do that is to look at the logs for the migration tool and read that information and find out if there’s an entry in the log that indicates that we’re being throttled. I’ve got several migrations running at once on several servers. I didn’t want them to each go through and look for this because it is a little costly. I have to do a full text index of the content and a query across that. I’ve decided that I want to run it on a periodic basis. A job that will do that check and then it will post something to a simpler database that says, “Yep, I’ve been throttled or not.”

It’ll be a lot easier to say, create a dashboard that shows what the throttle status is, if there’s been any throttles recently, as well as the migration … The servers that are doing migrations they can do that check and say, “Oh, if we’ve been throttled recently I don’t want to start another migration job. I’m going to hold off until I’ve got no throttles within the last hour or something.”

Danny Ryan:                       Awesome. How are you alerted about this? You’ve mentioned a dashboard. I guess you could do it through a dashboard, but how else?

Kirk Liemohn:                    I haven’t created the dashboard yet; I might do that. I used Visual Studio, which is what turned it into an Azure Web Job for me. I started out writing it as a PowerShell script, and my PowerShell script relied on a cmdlet for SQL that required a couple of, actually three, MSIs to get working. I didn’t know how to install MSIs in a web job. I don’t think you can. I know you can upload a bunch of files, but it was simpler for me to just switch it over to a console app.

I switched to a console app, and this console app was written in Visual Studio. Visual Studio makes it real easy to just right-click on the project and say, “Publish this web job.” There are a few settings you have there, like how often you want it to run. Then from the Azure admin console, you can see the status and any output from your console app.

Danny Ryan:                       You mentioned that this runs every hour, but that’s primarily because of the type of Azure account that you have. Is that correct?

Kirk Liemohn:                    That’s right. I have it running once an hour. That’s the fastest I can have it run, since I think we’re on the basic plan for our Azure web app. If I were to change that to standard or premium, I could do it more often. We may do it for that reason; we haven’t decided that we need to go to that level yet. While it’s running, it simply does the query, the full-text query. It’s against SQL Server, which is running on Azure machines, so I have to have access to that SQL Server over the internet, and I’ve got special accounts and ports for that. Then if it finds something, it will log something to a different database table and it will also send an email. I use SendGrid for that, and I’ve used SendGrid in the past with Azure, so it was pretty easy to set that up.

Danny Ryan:                       It’s a pretty easy configuration for that?

Kirk Liemohn:                    Yeah, it probably took me 30 minutes and I hadn’t used SendGrid in several months, maybe a year. It wasn’t fresh in my memory but about 30 minutes to figure out how to send email, basically. Set it up and configure it and all that.

Danny Ryan:                       It’s funny you mention this is the first one you’ve done. I think that’s the best time to give advice to other people who might be doing this for the first time. Any other pointers that you would give to people for writing Azure web apps, web jobs?

Kirk Liemohn:                    Yeah, web jobs. The first thing is to try and understand what it relies on. If it’s relying on certain other pieces of software or something installed on your system, then it may not be appropriate, or you may need to find a way to get those up into Azure. If you need other files, in my case I initially needed some other cmdlets for PowerShell, and that wasn’t going to be the easiest path for me, so an all-encompassing console app that knew how to talk to SendGrid and knew how to talk to databases was all I needed.

Danny Ryan:                       It’s funny, it seems like console apps are coming back, because I was talking to Chris yesterday and he wrote the Jive to SharePoint migration tool as a console app as well. It seems like they’re coming back in fashion now.

Kirk Liemohn:                    Oh yeah. All of our stuff for migration is mostly PowerShell based and we run all that from consoles, so definitely. The gotcha is just try and understand what your program relies on. Try and understand what your needs are in terms of how often this thing needs to run. Those are kind of the two things that I ran up against.

Danny Ryan:                       That’s great. I appreciate you taking the time to do this. I know you’re busy on your project but hopefully this will help somebody who may be out there looking to write their first Azure Web Job. Thank you for doing this Kirk.

Kirk Liemohn:                    Sure. You’re welcome.

Danny Ryan:                       Thanks everybody for listening. Have a great day.

read more
Kirk LiemohnAzure Web Jobs

Starting and Stopping Multiple Azure VM’s from a Mac Terminal

Pete is a Director of Technology at ThreeWill. Pete’s primary role is driving the overall technology strategy and roadmap for ThreeWill. Pete also serves as ThreeWill’s Hiring Manager and is constantly looking for new talent to join the ThreeWill family.

We recently started a project in which keeping the costs of Azure VM’s as low as possible was a real concern. Since I am a Mac user (stop snickering), the Azure CLI tools seemed like an obvious choice. I just wanted a simple way to check the current status of the virtual machines in a specific Azure subscription, and then start or shut down the virtual machines as needed. I agree with Scott Hanselman, and thought I would save myself and others some keystrokes. #sharingiscaring

Using the Azure CLI from a Mac Terminal

The first thing I found when searching was Using the Azure CLI for Mac. This was pretty helpful and gave me some basic commands, but didn’t give me the simple path of shutting down all the virtual machines that were running at once. This did give me the basic commands that let me check the status of the virtual machines, and stop and start the machines as well. By the way, this page has a bunch of info I plan on diving into at a later date.

To start, you’ll need to install the Azure CLI tools. This is a pretty straightforward install using npm.

npm install azure-cli -g

Once you have the tools, you’ll need to get your subscription settings. I have to run the following:

azure account download

This will open a browser and you’ll need to select the subscription for which you want the settings file. Once the subscription information was downloaded, running

azure account import [settings file]

will set the current account.

The Basic Commands

Once the Azure CLI tools are installed, the basic commands to list, stop and start the virtual machines are pretty simple

azure vm list

azure vm start vm-name

azure vm shutdown vm-name

Great. Almost there.

Painful Unix Memories

Unfortunately, even though I am now a Mac user, I still have to bang my head against the wall to remember Unix commands and pipe operations. Bad memories of awk and sed from many years ago. Towards the end of the Install the Azure CLI tools post, there was a section about “understanding results” which did something very similar to what I needed. That example used xargs and something I had never heard of called jsawk. The xargs didn’t give me any flashbacks, so I was OK with that, but the jsawk made me twitch a little. It turns out the curl installation of jsawk from the GitHub repo failed for me, but I was too close to stop!

A link in the jsawk readme mentioned using Homebrew, so the following installed jsawk and the dependencies I was missing. Back in business!

brew search jsawk
brew install jsawk

The Final Solution

Finally, all the pieces are in place. Now I can list the virtual machines, filter them, and more. This is a little deeper than I want to go here, but the jsawk documentation is really great. With the ability to filter and then use xargs to call another azure command, the final solution was just a simple change to the example toward the end of the Install the Azure CLI tools post. Here they are:

Shut down all of the Virtual Machines

azure vm list --json | jsawk -n 'out(this.VMName)' | xargs -L 1 azure vm shutdown

Start all of the Virtual Machines

azure vm list --json | jsawk -n 'out(this.VMName)' | xargs -L 1 azure vm start

There are still some things I want to do, specifically create a shell script to enable some other options and arguments, but this might be a blog post for another day. Hopefully this will help someone else and save some keystrokes.
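As a first stab at that shell script, something like the sketch below could wrap the two pipelines behind a single command. This is just a sketch under a few assumptions: the `azvm` function name and the `DRY_RUN` flag are my own inventions, and the real path assumes the classic azure CLI and jsawk are on your PATH.

```shell
#!/bin/sh
# azvm: start or shut down every VM in the current subscription.
# Set DRY_RUN=1 to print the pipeline instead of executing it
# (handy for sanity checks on a machine without the Azure CLI).
azvm() {
  action="$1"
  case "$action" in
    start|shutdown) ;;
    *) echo "usage: azvm start|shutdown" >&2; return 1 ;;
  esac
  if [ "${DRY_RUN:-0}" = "1" ]; then
    # Show the command that would run, without touching any VMs.
    echo "azure vm list --json | jsawk -n 'out(this.VMName)' | xargs -L 1 azure vm $action"
  else
    azure vm list --json | jsawk -n 'out(this.VMName)' | xargs -L 1 azure vm "$action"
  fi
}
```

Running `DRY_RUN=1 azvm shutdown` just echoes the pipeline, which is a cheap way to double-check the command before letting it loose on real VMs.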

And BTW – I do realize this would be very easy in PowerShell. I’m just sort of new to the Mac. #justsaying

Get-AzureVM | Where-Object {$_.Status -like "Ready*"} | ForEach-Object { Stop-AzureVM -ServiceName $_.ServiceName -Name $_.Name -Force }

Let me know what you think or if this was helpful.


read more
Pete Skelly – Starting and Stopping Multiple Azure VM’s from a Mac Terminal

How to Brand Your Office 365 Login Page

Pete is a Director of Technology at ThreeWill. Pete’s primary role is driving the overall technology strategy and roadmap for ThreeWill. Pete also serves as ThreeWill’s Hiring Manager and is constantly looking for new talent to join the ThreeWill family.

Now that Microsoft has made it easier to get to the Azure Active Directory instance which backs your Office 365 tenant, something else got a whole lot easier as well. Branding your Office 365 tenant login experience is now super simple. No more boring login screens that look like everyone else’s – like this one!


Just a few clicks and we’ll have a nice, branded experience.  First, click the Apps Launcher in your tenant and click the Admin app tile.  You will need admin rights to accomplish this task, and you will need access to an Azure subscription.


Once you are in the administrative portal for your tenant, click on the Azure AD link. If you don’t have an Azure subscription, you will need to create one. In my case, I already had an account, and my tenant directory appeared with my other tenant directories. You should see the domain associated with your Office 365 tenant listed.


Once you see the list of Active Directory domains, click your domain. Then click the Configure menu option if you don’t see the directory properties as below.


Finally, click the “Customize Branding” button and set the following five items:

  1. Banner logo – displayed on the sign-in page of cloud applications that use this directory
  2. Tile logo – a 200×200 .jpg or .png file displayed on the sign-in page
  3. Sign-in page text – typically an “appropriate use” statement
  4. Sign-in page illustration – a large image displayed to the left of the login form
  5. Sign-in page background – a background color (in hex) used in place of an image or behind a transparent image


Some of the items above have image-size and file-size limitations, so you may need to brush off your Paint.NET skills – or have somebody with real talent create your images, like I did.  Save your changes and then try entering your user ID on the login screen. It may take some time for all of your changes to appear, but as soon as you tab off of the user name field, you should see your images and text.


Once you have these settings in place, you can even return to the Configure page and add language specific settings. Just a little something simple to make the Office 365 experience part of your company brand.

Questions about how to brand your Office 365 login page?  Ask in the comments below…I hope this helps make Office 365 feel a little more like your own. Want even more customization? Look Here!

read more
Pete Skelly – How to Brand Your Office 365 Login Page

Ten Reasons Why Office 365 and Azure Made Me a Fan Boy

Bo is a Principal Consultant for ThreeWill. He has 18 years of full lifecycle software development experience.

I recently worked on an on-boarding solution for a customer that was built as a cloud-based application using Microsoft Azure and Office 365.  It’s what Microsoft refers to as a provider-hosted app for SharePoint, and it is all the rage these days.  Having been a Microsoft developer for about 15 years, I’ve learned to be a little cautious of the newest things being pushed out by the Microsoft marketing machine.  However, after working through very real-world challenges using Office 365 and Azure, I think all the hubbub is well deserved.

In the on-boarding application, our customer had some very typical requirements that most companies might have including:

  • Public facing internet site where anyone can register and apply for a job
  • Highly complex and very business specific onboarding process
  • Simple and familiar UI for managing applicants from hiring through termination
  • Routing to different queues based on an applicant’s current state
  • Need to support user based events and time based events
  • Need to integrate with external third-party systems during onboarding

In addition to the typical requirements, our customer had some more unique but not uncommon challenges to deal with:

  • Very small IT staff responsible for supporting many users and technologies
  • Seasonal hiring model with large peaks and then periods of low activity
  • Need to manage seasonal employees individually or in batches
  • Desire to very easily change the workflow process over time

Obviously there are some “off the shelf” products that can handle some of these requirements, but many times they are not as customizable as desired.  Other times they can cost more than a custom solution due to licensing costs over time.  Finally, they may not integrate out of the box with Office 365 where many employees are already doing work.

Since our customer was already using Office 365 for collaboration and Azure for other applications, a provider-hosted app was a natural fit for them.  It allowed the flexibility to satisfy all their requirements, provided services that act as an enabler for their IT staff, and offered UI familiarity for their internal staff.  In the white paper The New Business Operating System: Combining Office 365 and the Microsoft Cloud Ecosystem to Create Business Value, I think Pete Skelly summarized it best in the sentence “The New Business Operating System (NBOS) enables businesses to leverage commodity based IT Services and enables customizations to enhance business value.”

That sentence really says so much; I thought I would break it down into some specific reasons and examples on how the New Business Operating System helped our customer with their solution and made me a fan.

Five Reasons Office 365 Added Value to Our Solution

  1. Do More with Less – With a small IT staff, using Office 365 is really a no brainer. The Software as a Service (SaaS) model means no infrastructure to manage and no software to install/maintain.
  2. Familiar User Interface – For internal users, Office 365 made sense as the UI since it is already familiar from other activities such as document management and collaboration.
  3. Ease of Management – Built-in support for things like user management, security, lists and views means the small IT staff can easily manage the user experience for internal users just like they do for other functionality in Office 365.
  4. Workflow Engine – With a workflow engine available in Office 365, time based events like status changes and notifications after a specific period of time are easy.
  5. Ease of Customization – UI elements such as pages and the ribbon in Office 365 can be easily customized to integrate custom functionality while still looking like it’s built into the platform.

Five Reasons Microsoft Azure Added Value to Our Solution

  1. Enterprise Level Management and Support – Again, with a small IT staff, using a Platform as a Service (PaaS) makes a lot of sense from a management standpoint. In Azure, backup and recovery, monitoring and scalability are easily managed without needing extra tools or software.
  2. Flexible Compute Power – The nature of a seasonal hiring solution is also a perfect fit for Azure because you can easily deal with “Predictable Bursting” (The New Business Operating System: Combining Office 365 and the Microsoft Cloud Ecosystem to Create Business Value) by increasing resources during peak periods and reducing during low periods. If you had to deal with this on your own infrastructure, it would likely mean lots of idle hardware much of the time.
  3. Technology Choice – Since Azure is a platform to host applications, it provides the flexibility to develop an application exactly as envisioned by a customer and in technologies, known or comfortable. In our solution, we stayed within the Microsoft stack including ASP.NET MVC, Entity Framework, and several JavaScript Frameworks, but you aren’t limited to these at all.
  4. Ease of Deployment – In Azure, it is also extremely easy to package and deploy the application. This means developers can spend their time on business functionality and creating business value. Our application could be deployed to an environment like QA and UAT in just a few clicks.
  5. Robust Platform – Azure isn’t just for web apps and databases either. It’s a much larger platform that supports things like a service bus where an application can use queues and messages to make the application more robust, scalable and responsive even during unexpected peaks.

As you can probably tell, I’ve become a fan of creating solutions using Microsoft Azure and Office 365 as the platform and services to build upon.  I look at how our customer’s resources are now enhancing the application and managing thousands of applicants with just 2 people working on the application only part of the time.  In a traditional on-premises application, there is no way I can imagine it would be possible for just 2 people to support the same application and be able to continue to add business functionality.

read more
Bo George – Ten Reasons Why Office 365 and Azure Made Me a Fan Boy

Join ThreeWill at the Atlanta SharePoint User Group Meeting on March 16th

Danny serves as Vice President of Business Development at ThreeWill. His primary responsibilities are to make sure that we are building partnerships with the right clients and getting out the message about how we can help clients.

Session Information

Topic: Hybrid SharePoint Security – Providing Seamless Access Using Azure AD, ADFS and Active Directory

Today’s complex business environments present IT organizations with many challenges in providing internal users, external users, and partners seamless, secure access to enterprise resources. Today’s enterprises want to take advantage of the cloud and all of the flexibility and benefits cloud computing has to offer. But how can you integrate on-premises and cloud resources, providing all your users and partners seamless access across environments and devices while remaining secure? This presentation describes and demonstrates the integration of Azure AD, ADFS, and an on-premises Active Directory to enable internal users to maintain their typical authentication experience, allow seamless access to corporate resources for mobile users, and enable external partner access to resources.

Specifically, we’ll discuss a real-world Microsoft Azure-based IaaS SharePoint 2013 implementation which enables secure internal and external access to SharePoint content, including video for iPhone and iPad, for all internal and external users. We’ll also cover a demo of a development environment which demonstrates the solution.

Speaker – Lane Goolsby

Lane is a Senior Software Engineer with ThreeWill. He is a strong technology expert with a focus on programming, network and hardware design, and requirements and capacity planning. He has an exceptional combination of technical and communication skills with a strong emphasis on troubleshooting, problem determination, customer relations, and identifying quick paths to resolutions. Lane is a strong mediator between programmers, end users, and clients. His experience in design, installation, implementation, maintenance, support, and client/end user training of various software and hardware packages is put to use in everything from large-scale enterprise applications to small deployment limited traffic web sites.

Register Now
read more
Danny Ryan – Join ThreeWill at the Atlanta SharePoint User Group Meeting on March 16th

New Business Operating System Webinar

Pete is a Director of Technology at ThreeWill. Pete’s primary role is driving the overall technology strategy and roadmap for ThreeWill. Pete also serves as ThreeWill’s Hiring Manager and is constantly looking for new talent to join the ThreeWill family.

Hello everybody. My name’s Pete Skelly. I’m a principal consultant and Director of Technology here at ThreeWill and I’d like to thank y’all for joining me today. We’re going to do a quick presentation on what ThreeWill’s calling the new business operating system and discuss how the combination of Office 365, Microsoft Azure, and on-premises products can create value for your business.

So, let’s dive right in. We’ve done this presentation a couple of times and typically I start off by explaining why we’re doing this event. So, in October of 2014, we published a white paper that described what we call the new business operating system and why we believe it really provides some compelling opportunities for enterprise collaboration, increases in productivity, and innovation.

Second, we wanted to share how we see clients using the Cloud today. For most of our clients, hybrid is their reality. So, hybrid environments and Cloud on your terms is really what the Enterprise Cloud is going to look like for quite a while.

Then finally, to discuss some benefits and success stories that ThreeWill has learned over the course of about the last two years moving clients to the Cloud, working on some proofs of concept, and finding out how some folks are using the Cloud offerings today from Microsoft and from other vendors and how that hybrid story is playing out.

First, the Cloud means a lot of things to a lot of people. There’s a lot of information about the Cloud that, frankly, can be cloudy. Pun intended. So, let’s define some terms. To start off, Cloud computing can be a really scary topic to a lot of folks. A lot of clients are very concerned about moving to the Cloud. They often have concerns about compliance, or how do I move an existing app? What does it mean for Office 365? What does it mean for my users? What does it mean for my investment in SharePoint? So, let’s start with some terms and some delivery models, and then we’ll dive into a little bit about what the new business operating system actually is.

So, to start, the first delivery model is something that everyone’s familiar with. It’s the traditional IT model, where you own the entire stack of delivering an application: from the networking and physical storage, the actual server hardware, and managing the OS, all the way up through any data that you have to provide, disaster recovery solutions, the application itself, and all the clients that would consume it. These can be on-premises, they can be in a private Cloud like PC Mall or Rackspace, and they can also be in public Clouds like Microsoft Azure.

The second hosting model, or delivery model, is infrastructure as a service, and this is when you start to run some of your infrastructure as a managed service. So you begin by virtualizing some hardware, some storage, and a portion of the OS. These can be on-prem, or in the Cloud with a private provider like PCM or Rackspace, or with Azure or another Cloud provider. The key here is moving up that stack so that you’re concerned with less of the physical hardware and less of the management of that infrastructure for servers and hardware, even to the point of patching some of that OS. You may be responsible for some of it, but you’re moving up that stack.

The next delivery model is platform as a service. Platform as a service, or PaaS, provides a solution as a service, typically built on top of infrastructure as a service, or IaaS. This can be provided, again, in on-prem, private, or public situations, and here the focus pulls all the way up to the top of the stack, to where you’re more concerned with your applications and your data. We’re less concerned with the OS; that’s even taken out of our hands, in most cases. We’re very concerned with how the data and the applications provide value to our business.

Then finally, there’s a business model, which is software as a service, and this is a full solution. So, SaaS, or software as a service, is a business model where everything is delivered to you: networking, storage, servers, data, the application itself. You’re consuming that service, and you may have multiple clients: a desktop, a laptop, a phone, etc. This is just one perspective. If we take a different look at how those delivery models are consumed, you begin to see a little bit of a different picture. So, in an IaaS, or infrastructure as a service, delivery model, we’re typically going to migrate to it. So we may take physical resources, package them up, and put an old Legacy system on a new virtual machine management system in the Cloud. Taking something like an old accounting application that has to run on Windows Server 2003 or Windows XP and putting that up into the Cloud using infrastructure as a service – that’s one way of migrating to one of those delivery models.

Second, with platform as a service, we’re typically going to build on it. So, in platform as a service, or PaaS, we build the solution on top of the platform, typically on top of something like e-mail or storage. We’re going to interact with the compute cycles, or we’re going to consume some of the data storage for image processing, for example. So, we typically build on top of an operating system. All the middleware may be provided to us. Some of the frameworks may be provided to us, and this enables us to customize and build applications that are really providing our business value, without having to worry about some of those infrastructure and data center physical resources.

Then finally, from a SaaS, or software as a service, perspective, this is really the consume-it model. So with a SaaS solution, you’re consuming the entire app: UI, configuration, across multiple devices, etc. This is a different perspective, and the reason I have the shading here will become a little more apparent in a different slide. So, remember how that shading appears.

There are also four compute patterns that are often associated with challenges that the Cloud delivery models address. The first compute pattern is On/Off. This is typically used for development tasks, or prototyping – very intermittent compute needs. Think here of business intelligence processing, or nightly processing of call center data for reports: all things that you would typically over-provision hardware for. You’re just going to have something running for a short period of time, then it’s going to be off for quite a while, and then run again in the future.

The second compute pattern is Growing Fast, and this represents a pattern of growth in which it’s unlikely that you can provision hardware fast enough to respond to increases in need for your application. Think of Facebook, or Twitter, probably Snapchat, or any of the consumer apps that take off and grow extremely fast. They have no deployment lead time, so they need hardware as quickly as possible, and typically you can’t plan for, “How am I going to deal with that?”

The third compute pattern is Unpredictable Bursting, and this is unexpected growth. So, my service or application may be going along perfectly, and then I have this giant spike. Here you might think of something like when Ashton Kutcher tweets that he’s going to back your start-up and you get billions of hits to your website instantly. There’s no way you can predict that, but the Unpredictable Bursting model is something that you have to deal with.

The fourth compute pattern is Predictable Bursting. This is for things like seasonal, or predictable, loads. Things that are cyclical: tax calculations, for example, or seasonal staffing. If you’re a logistics company that needs to increase staff, or a retail company that needs to increase staff and compute time for some of your retail operations, you know those things are coming, so you can plan for the compute needs you’re going to have based on those cyclical needs.

We talked a little bit about Cloud delivery models and about the compute patterns that they can address. So, let’s dive a little bit deeper into what we mean by that operating system analogy. First, the new business operating system really starts with that top layer, an application layer if you will. On a desktop operating system, we’re all familiar with using different applications: e-mail, a word processor, any app that you can think of. With the new business operating system, this is now a transparent layer. You have to be able to consume applications from a browser, from an Android or iOS device, desktops, laptops. I can use all of them in this application layer, but I also have to have access to things like business apps, not just things like Office 365, or Yammer, or my typical office productivity apps.

The second layer of the new business operating system is a security layer. With a traditional desktop operating system, I typically wanted to know who the user was for audit reasons; perhaps I wanted to have group policies that provisioned applications, wanted to know what they were accessing, or wanted to be able to say which locations on disk they could save files to, etc. With the new business operating system, I need to know that as well, but I need a mechanism that is going to work in that transparent, mobile world. I need something that will enable my identity and my users’ identities to be location transparent, and more and more OAuth, specifically OAuth 2.0, is the security mechanism that providers are using to enable that location-transparent identity and still provide things like audit access, group management, etc.

The third layer is a services layer. In the traditional desktop world, when you used something like Microsoft Word, or PowerPoint, or Excel and clicked Save, I doubt that anybody really cared about how that document got saved, but there were services taking care of it. The new business operating system is no different. We just have those services at a higher level: things like e-mail, tasks, calendars, search, workflow, and many others. But I’m not limited to just consuming Office 365’s APIs. I can consume my custom business APIs or services and third-party services. I can also consume other Cloud services from other providers as well.

Finally, the fourth layer of an operating system traditionally deals with things like storage, caching, memory management, and a whole host of other things. For example, if I send you a five megabyte Word file and you click Save, you’re not thinking about thread management, or how that file’s going to get stored to disk, or any of that. The new business operating system has that same set of needs for things like scheduling and caching, and there still has to be hardware that things physically run on. But in the new business operating system, the combination of Azure, on-premises services, and that operating system layer manages memory and CPU usage, responds to application needs, and can automatically scale to meet those compute patterns we just talked about.

So, this is all great, but why is it important, and why do we think this is going to help enterprises be more productive and potentially innovate? What problems does this new business operating system really solve? We said that the Cloud, for the enterprise, is about hybrid. So, let’s take a quick step back and look at what your Cloud profile might look like as you work through the delivery model and compute pattern problems we just described, and how that new business operating system might help.

So, first there’s the on-premises world. This is that traditional IT, you-own-everything, on-premises situation. It’s familiar to everybody. Everything is on-premises. We’re all happy consuming maybe SharePoint data and Exchange, etc., and I’m in control of all of this. At some point, my CFO may come along and say, “I’ve heard about the cost savings from the Cloud and I want to start consuming software as a service for commodity services like e-mail and calendaring.” This is great – now we’re starting to dip our toe into the world of the Cloud. I may see some opportunity to use infrastructure as a service for something like that accounting app that I mentioned, something that is a Legacy application that we could potentially package up and move to the Cloud in an infrastructure as a service capacity. This may or may not work perfectly. You may have some adjustments to make, but you can get that service off into the Cloud and maybe mitigate some risk from some old hardware that’s about to fail.

Then you might say, with new projects, we’re going to start to move to the Cloud by creating platform as a service, or PaaS, solutions that you can develop using SQL storage, for example, or mobile notification features – anything that you can do to pull those services up into that PaaS environment and reduce your IT operations burden. At some point, you may have a salesperson come to you and say, “You know, this is great. I have marketing data that our salespeople have put out in the public SharePoint environment, and I have private, sales-figure-related information, and I need to make sure that I’m pulling the data from both locations.” How can we deal with that? So, you’re probably going to end up with a hybrid Cloud search situation. In this case, you might want to search across SharePoint in the Cloud and SharePoint on-prem and actually have a unified search experience.

At some point, you may find your company needs something like a private Cloud. For example, I may need to host virtual machines, networking, etc. outside of my own environment. So, I’m going to work with a private hosting company and try to get those infrastructure as a service, or IaaS, services into a private Cloud. Maybe I have specific compliance needs, or I need more control. Maybe we’re ready to go to the Cloud, but not totally ready to go to a public Cloud.

Once we have that private Cloud, we may find that we actually can manage this environment and we do have the need to actually get more benefit from those commodity type services. So we could, potentially, move to a dedicated [inaudible 00:17:49] Office 365 and actually have dedicated services or an isolated environment for those commodity services.

Then we may find that that same salesperson, or sales team, that was working with external customers on marketing and sales data now has to integrate with additional SaaS services, or software as a service, that they and their clients are consuming from the public Cloud. So, at this point I may need to integrate with a CRM system, for example, and my on-prem or my IaaS solutions may also need to save documents back to another public SaaS service.

Finally, I may have infrastructure as a service on-prem, using virtual hardware, etc., that I need to connect up to public Cloud IaaS services for things like big data analysis. Or I have a factory floor with IoT devices that I want to update from on-prem to the Cloud, or I want the Cloud to process the data but the reports built internally. So, based on this picture of how the Cloud really will operate for most enterprises, the reality is hybrid Clouds are in your future, and that new business operating system we just described helps you interact with this type of Cloud and manage your business processes and your infrastructure securely, transparently, and in a really scalable way.

So we talked about what the Cloud is and what the new business operating system is. Let's talk about some of the benefits we think the new business operating system can provide to your business. The first is that it really helps you move to the Cloud on your terms. This is the most critical benefit. Since most enterprises are likely to require some sort of hybrid environment, the new business operating system is delivery-model independent. You can combine public and private Clouds, as we said. You can use e-mail, or storage, any of those commodity type services. You can then use infrastructure as a service to host some legacy CRM application that you have to keep and that will take you a while to get off of. You can combine all of these things in on-premises, private, public, and hybrid scenarios, things like search, reporting, big data analysis.

A second way this enables on-your-terms architectures is incremental adoption. You can incrementally adopt the Cloud. It's not an all or nothing situation. You can slowly adopt the Cloud by using on-premises or private IaaS solutions and migrate to the Cloud when you have the opportunity. You can gradually decrease your IaaS footprint and increase your PaaS footprint, or platform as a service, over time as those applications need to be replaced.

Finally, you can consume those SaaS services when they make financial and business sense. If you can do it, great. If not, you still have those other options. The new business operating system enables you to move from on-premises to private and public Clouds transparently. So, I can have new business apps using the new app models of today on-premises, and this enables me to kind of future-proof my applications as I want to move them into the private or public Cloud. This also lets me manage virtual machines across Cloud boundaries with an easily managed, simplified, single control surface for all of my hardware, all of my virtualization needs, and all of my application needs.

The second benefit is it reduces the time you spend on routine maintenance. So, using commodity services like e-mail, SharePoint, Lync and more decreases your IT operations management surface area. The second thing is, by looking to reduce your IaaS management and increase your PaaS solutions over time, you're going to spend less time on routine maintenance. As you increase those platform as a service, or PaaS, solutions in your environment, you're also increasing IT's ability to participate in adding or [inaudible 00:22:45] to the business. You've got tons of IT folks that are really smart people who aren't necessarily wasting their time, but are spending time on low-value tasks when they could spend time really impacting your business, because they know a lot about your business.

As just a point of reference, in late 2013, 80% of IT budgets were still tied up in routine maintenance. Just think about what you could do if you could unlock some of that potential and turn your IT folks loose on helping your business.

So, the third benefit is the new business operating system enables innovation. Together with Microsoft moving to what's called the continuous delivery model, this is a really critical change in the way they do business. So no more three-year product life cycles, no more waiting to go from SharePoint 2007 to 2010, or from 2013 to 2016. If you're using Office 365 or Azure, features are continuously and incrementally released monthly, weekly, even daily if needed for security fixes or critical bugs. This continuous delivery life cycle of products and services means that we have to change as well. So, how we, as consultants, or you, in your particular business, how you provide value has changed.

The second thing is the Cloud moves faster than Moore's Law. Moore's Law states that every 18 to 24 months, processors will cost half as much to produce and be able to perform twice as many operations. I probably butchered that a little bit, but you get the gist. So, we used this to plan our business and IT strategies for years. According to that two to three year cycle, what did we used to do? Licensing costs: we used to redo licensing agreements every two to three years. We would purchase hardware every two to three years. For servers, we might do a hardware refresh, or refresh phones or laptops for our end users. But we're not constrained by that two to three year cycle anymore. The Cloud moves much faster than Moore's Law.
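As a rough illustration of the doubling the speaker is paraphrasing, here is a small sketch (the 18-month doubling period is an assumption taken from the common statement of Moore's Law, not from the webinar):

```python
# Rough illustration of Moore's Law as paraphrased above:
# capability roughly doubles (and unit cost halves) every ~18-24 months.

def relative_capability(months: float, period: float = 18.0) -> float:
    """Capability multiple after `months`, starting from 1x,
    assuming one doubling every `period` months."""
    return 2 ** (months / period)

# A typical two-to-three-year planning cycle at an 18-month doubling period:
print(relative_capability(36))  # 36 months -> 4x the capability
```

The point of the arithmetic is that a hardware or licensing decision locked in for three years is made against capabilities that will have roughly quadrupled by the time it is revisited.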

Third, the new business operating system allows us to innovate incrementally. So, I can move more and more to those PaaS solutions. Once I do that, I can find opportunities to innovate, because now I'm not constrained to these two to three year cycles. I can innovate much more quickly and perhaps differentiate myself from my competitors for my customers. All of these combine to make businesses more efficient and IT more productive and focused on ways to innovate.

The fourth benefit is that the new business operating system promotes process cohesion. If we look at a traditional full-stack application, typically you're required to create and deliver everything in that application. So, let's take a look at a theoretical, hypothetical onboarding solution. If I were going to onboard a large number of users, I had to build the entire solution: provision hardware, patch the OS, and go through and make sure networking was set up correctly. As far as integration with other systems in that middle tier, in that PaaS layer that we would typically think of now, I had onboarding functions that were specific to my business value, and security concerns, but all the way up that stack I was responsible for everything. This solution does not scale well. For those delivery models that we discussed earlier, those delivery models and compute patterns are limiting to me in this environment.

So, the new business operating system promotes a more cohesive process and application. If we were to build that onboarding process today, we might start with our business logic and really figure out how to provide it as more of a service that can be consumed across multiple layers. Well, if I start with a service for my onboarding logic, then I can start to consume other PaaS services like storage. I can consume an e-mail service. I can start to integrate with third-party apps where they're responsible for their own infrastructure as a service or PaaS solutions.
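As a minimal sketch of that idea (all class and function names here are hypothetical illustrations, not from the webinar or any specific Azure SDK), the onboarding logic depends only on abstract storage and e-mail interfaces, so concrete PaaS implementations can be swapped in later without touching the business logic:

```python
from dataclasses import dataclass
from typing import Protocol

class StorageService(Protocol):
    def save(self, key: str, data: bytes) -> None: ...

class EmailService(Protocol):
    def send(self, to: str, subject: str, body: str) -> None: ...

@dataclass
class OnboardingService:
    """Business logic only; storage and e-mail are consumed as services."""
    storage: StorageService
    email: EmailService

    def onboard(self, name: str, address: str) -> str:
        record = f"employee={name}".encode()
        # In production this could be a PaaS blob store and a hosted mail service.
        self.storage.save(f"onboarding/{name}", record)
        self.email.send(address, "Welcome!", f"Hi {name}, you're onboarded.")
        return f"onboarded:{name}"

# In-memory stand-ins so the sketch runs without any real PaaS dependency.
class MemoryStorage:
    def __init__(self): self.items = {}
    def save(self, key, data): self.items[key] = data

class ListEmail:
    def __init__(self): self.sent = []
    def send(self, to, subject, body): self.sent.append((to, subject))

svc = OnboardingService(MemoryStorage(), ListEmail())
print(svc.onboard("pat", "pat@example.com"))  # -> onboarded:pat
```

The design choice is the one the talk describes: the onboarding service owns only its business rules, while everything below it is a replaceable service boundary.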

Then we can start to think of [inaudible 00:27:28] as a consumer of our services, versus something that we're providing and tightly coupling ourselves to. So we can provide SharePoint as an interface, Outlook, and even Word. If we're building on the new application model from Microsoft, I actually have the ability to write a single code base that can be consumed from all three environments.

This trend toward cohesive business processes, things that are tightly defined, very small, composable, compact services, is not new. It's not the nouveau thing, but it does hint at a larger trend. So, if everybody's using the same SaaS solutions, or consuming some of these PaaS solutions, how do you differentiate? Well, if you start to notice these little changes, you can start to look at how these things are going to impact you long term. What are the bigger changes that are coming? Increasing the number of PaaS services that you have over time will increase your integration opportunities exponentially. So, this is really where innovation can start to occur.

The fifth benefit is this stack is technology agnostic. The new business operating system enables us to use the right tool for the job. So, if you are familiar with .NET, and ASP.NET, and C#, and SQL Server, etc., then go ahead and use those tools in your tool bag. But if you have a different tool bag, a different skill set, or different business needs, you might want to use the LAMP stack; using Linux and Apache and PHP may be what your developers internally know, and you're not constrained anymore. If these things make sense, use them. A frequently asked question is: why would I not use a relational database, or why wouldn't I use a document store for this? Now, you can do those things when developing Office and SharePoint solutions.

As your business grows, you're going to see new opportunities to innovate, and those opportunities may require some new architectures or software designs. This new business operating system supports massive scale and all those compute models. There are a variety of services you can consume, from storage to mobile messaging and all sorts of things. These are all possible, including high-throughput processing for IoT, media services, and more. Those things are not only going to be available to you, but you also have to think, "If I take a dependency on another PaaS service, how do I create a resilient design that can compensate for some of the failures that might occur?" Networks will go down. Services will have interruptions. There is no such thing as 100% uptime. So, you're going to have to learn to look at some new opportunities for architecture and design.
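One common way to compensate for the transient failures the speaker mentions is retry with exponential backoff. Here is a minimal sketch, with illustrative function names rather than any specific Azure SDK API:

```python
import time

def call_with_retry(operation, attempts=4, base_delay=0.5):
    """Retry a flaky call, doubling the wait between attempts."""
    for attempt in range(attempts):
        try:
            return operation()
        except ConnectionError:
            if attempt == attempts - 1:
                raise  # out of retries; surface the failure to the caller
            time.sleep(base_delay * (2 ** attempt))  # 0.5s, 1s, 2s, ...

# Simulated dependency that fails twice with a transient error, then succeeds.
calls = {"n": 0}
def flaky_service():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient network failure")
    return "ok"

print(call_with_retry(flaky_service, base_delay=0.01))  # -> ok
```

The backoff keeps a momentarily overloaded dependency from being hammered while it recovers, which is exactly the kind of design change taking a dependency on another PaaS service forces you to consider.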

Finally, the new business operating system gives you the freedom to choose the right technology based on your business scenario. So, you can look at SharePoint, Outlook, custom web apps, and workflow for your departmental processing needs, and you can also use the new business operating system for media integration, IoT, Power BI, and SharePoint, all those things. When you find something that is critical for your business to be successful, you can utilize the new business operating system to effect that change, and you're not driven by the technology. You're driven by the problem you're trying to solve and the right solution.

So, to recap the five benefits of the new business operating system: if you look at the top benefit, on-your-terms architectures, that's really what this new business operating system is about, and it's what ThreeWill is trying to explain to some of our customers to reduce some of this anxiety about moving to the Cloud. It's not an all or nothing proposition. You can move to the Cloud on your terms and get all of these benefits out of moving to the Cloud.

So, some success stories. About 18 months ago, we created a proof of concept application, an Azure-based sales enablement application called Popcorn. This was a contextual search application that aggregated content across multiple SaaS services. So, this really proves that once you start to combine those cohesive services, you can get powerful new applications out of them. This had not only search but integration with other technologies, like push notifications for mobile phones and dialing capabilities, so from a search result I could actually dial my phone and start a conversation with someone.

We recently completed a seasonal staffing application for a client, so that onboarding application, that fictitious example, isn't so fictitious. It's a real-world example of an Office 365 intranet and an Azure-based public-facing web application that eliminates a paper-based, Excel-based hiring and termination process. The internal processes for approval, routing, some of the process automation, all the status management, and the internal HR review process are all handled within Office 365. The background checks and integration with the termination, hiring, and payroll processes are all done on the Azure side. This is a great example of that predictable burst pattern: consuming Office 365 commodity services, the intranet and the HR staff go about their business on a daily basis and perform their jobs like normal, and as their season starts, that cyclical need for compute power is handled automatically.

The third example is a hybrid search environment we created for a client that needed search across an on-premises and an Office 365 environment to deliver aggregated search results. They needed single sign-on, and they needed people who were external to an office to be able to log in, search on-premises data and Office 365 data, and get aggregated search results.

The fourth example is an iOS application proof of concept we wrote that uses Office 365 and Azure to access SharePoint documents and SharePoint document libraries, using single sign-on from Office 365 while providing native capabilities of the iOS device like search, offline storage, sharing, texting, etc. This application really suits the need where, if I want an application with native functionality that pushes into that familiar experience users have as consumers, it's more than possible. I can integrate with all of Office 365 and Azure from different devices today.

Finally, we recently completed an infrastructure as a service based SharePoint 2013 portal for a law firm. This is an IaaS-based portal for clients and matters, which enables internal users to have that single sign-on experience and external users, their clients, to log in and access resources, all using infrastructure as a service to manage and house a complete SharePoint 2013 farm. They needed some special features and had some custom code that needed to run, and this enabled IT full control of that environment.

In summary, the new business operating system really lets you provide value to your business by consuming those Office 365 commodity services like e-mail, SharePoint, and Lync. It can also help reduce long-term costs gradually by letting you reduce your on-premises IaaS needs and increase those PaaS solutions over time.

Hopefully, we've made the case that a hybrid Cloud is probably in your future. It is the enterprise Cloud. Using the new business operating system can help flatten that control plane that IT, operations, and your enterprise have to deal with in order to manage a hybrid environment.

Finally, ThreeWill can help define your business application profile and see how the new business operating system can help start moving you to the Cloud today. As a final note, the Cloud isn't just about compute power and the cost of storage. Many people say it's a race to the bottom as far as storage costs, etc. To us, the decision about using a hybrid Cloud is about creating new opportunities, finding new ways to use your data, and providing value to your business by moving to these different compute models and finding ways that the new business operating system can add value to your business today and in the future.

Hopefully, the Cloud is a little less scary at this point. Thanks for everybody's attention. Here's my contact information; feel free to shoot me an e-mail or ask any questions. Here are my social media handles, so if you want to talk to me on Twitter, I'm @pskelly, or @ThreeWillLabs. Thanks for your time.

Pete Skelly – New Business Operating System Webinar