
How to Convert Strings to DateTime Data Types in Microsoft Flow

Rob Horton is a Senior Consultant and the Sustainment Practice Lead at ThreeWill. His experience includes over 20 years leading software architecture, design and development focusing on support tools, automation and e-commerce for large corporations and his own small businesses.

Background

As a consulting firm, we do a lot of reporting, as you can imagine. One of my responsibilities as the Solution Sustainment Lead at ThreeWill is to provide monthly Sustainment usage reports to our clients. This typically involves massaging time tracking reports into friendlier formats. The process can be a bit time-consuming, so I set out to automate it and implement Microsoft Power BI reports.

I decided to use Microsoft Flow for the automation and was challenged to import the reporting data into Azure Table storage. Our reporting data does not contain unique fields that could easily be used as Row Keys and I wanted to make the process self-healing (avoid duplicate data while also being able to reprocess a report), so I needed to query the Table for existing entities by date. This required storing the date as a DateTime data type in the Table. If I didn’t need to query the Table by date, I could have simply added the date as a string and shaped it on the consumption side in Power BI.

The Problem

Flow does a lot of things well but doesn’t have any out-of-the-box functions for converting strings to DateTime data types. I needed to first format my “m/d/yyyy” string to a standard DateTime format.

The Solution

I created three Compose actions for each date, as follows:

To calculate the dateReportStartMonth, I used the following expression:

substring(concat('0',split(body('Parse_JSON')?['report']?['Report Start Date'],'/')[0]),sub(length(split(body('Parse_JSON')?['report']?['Report Start Date'],'/')[0]),1),2)

Here, body('Parse_JSON')?['report']?['Report Start Date'] is the report's start date. This expression prepends a '0' to the month and takes the rightmost two characters, so that every month is represented by two digits.

To calculate the dateReportStartDay, I used a similar expression:

substring(concat('0',split(body('Parse_JSON')?['report']?['Report Start Date'],'/')[1]),sub(length(split(body('Parse_JSON')?['report']?['Report Start Date'],'/')[1]),1),2)

Then I put the formatted DateTime string together in dateReportStart:

concat(split(body('Parse_JSON')?['report']?['Report Start Date'],'/')[2],'-',outputs('dateReportStartMonth'),'-',outputs('dateReportStartDay'),'T00:00:00Z')
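Flow didn't give me a one-liner for this, but the logic of the three Compose actions is easy to sanity-check outside Flow. Here is a minimal Python sketch of the same padding and concatenation (the function name is mine, purely for illustration):

```python
def to_iso_date(report_date: str) -> str:
    """Mirror the three Compose actions for an 'm/d/yyyy' string."""
    month, day, year = report_date.split("/")
    # concat('0', part) then take the rightmost two characters == zero-pad
    month = ("0" + month)[-2:]  # dateReportStartMonth
    day = ("0" + day)[-2:]      # dateReportStartDay
    return f"{year}-{month}-{day}T00:00:00Z"  # dateReportStart

print(to_iso_date("3/7/2017"))  # → 2017-03-07T00:00:00Z
```

The `[-2:]` slice is the Python equivalent of the substring/sub/length gymnastics in the Flow expressions above.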

I also created dateReportEnd using similar expressions to format the report’s end date. With these dates formatted, I was able to query the Table with the following Filter Query:

(the first Output is dateReportStart and the second is dateReportEnd)
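The Filter Query itself lived in a screenshot, but for reference, an Azure Table storage OData filter over those two outputs would look something like the following (illustrative dates shown; this assumes the entity's DateTime property is named Date):

```
Date ge datetime'2017-03-01T00:00:00Z' and Date le datetime'2017-03-31T00:00:00Z'
```

The first datetime literal is the dateReportStart output and the second is the dateReportEnd output.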

The final step is to insert the formatted date into the Table as a DateTime typed value. This is done by adding the data type into the JSON payload just prior to the value as such:

{
  "Company": "<Company Output>",
  "Contract": "<Contract Output>",
  "Date@odata.type": "Edm.DateTime",
  "Date": "<Formatted Date Output>",
  "Ticket": "<Ticket Output>",
  "PartitionKey": "<PartitionKey Output>",
  "RowKey": "<guid() Output>"
}

(where Outputs are values from other Flow Actions)

Conclusion

I suspect Microsoft will continue to add expressions to Flow and include string-to-date conversion at some point, making the process above simpler. Still, it is possible with the current tools to format strings as acceptable DateTime inputs and use them to query Azure Table storage, as well as to populate typed entities in a Table.


Securing Azure Functions with Azure Active Directory

Pete is a Director of Technology at ThreeWill. Pete’s primary role is driving the overall technology strategy and roadmap for ThreeWill. Pete also serves as ThreeWill’s Hiring Manager and is constantly looking for new talent to join the ThreeWill family.

Editor’s Note – Pete has been busy sharing knowledge on his personal blog.  I asked him if we could promote it on our corporate blog and he agreed.  Enjoy!

This is a multi-part post about consuming Azure Functions secured by Azure Active Directory. I have been using Office 365 applications with OAuth tokens for a while but wanted to dive a bit deeper and learn some of what is going on behind the scenes.


Office 365 Groups vs Azure AD Security Groups

Caroline Sosebee is a Software Engineer at ThreeWill. She comes to us with 20+ years of software development experience and a broad scope of general IT support skills.

We recently had a client who was ready to streamline the security of their SharePoint Online site and change it from ‘Everyone’ access to groups of people with more specific access. Our recommendation to them was to use Azure AD groups so that the groups would be global and could be both centrally managed and used across site collections.

As we moved ahead with it, they had the groups added with the appropriate members. We then granted SharePoint permissions to the new AD groups by adding them into the appropriate SharePoint groups and removing the reference to ‘Everyone but external users’.

At first, all seemed to work OK, but as the week progressed, random problems we couldn't explain started cropping up, the biggest one revolving around search results. One of their users (and others, we later found out), who had full read access to the root site and all subsites, would only get back results from his OneDrive library and from the separate training documents site (which is open to Everyone). Yet he could easily navigate to and access all the document libraries in all the sites.

Thus began my long search on all sorts of things SharePoint search related, trying to figure out what was going on. For some reason, I finally decided to go look at the AD groups themselves with the thought that since roles are assigned to users, maybe the same thing might need to be done for groups. This was a bust of course, but being fairly new to administrating SharePoint Online, I was game for checking all sorts of things I didn’t know about.

Luckily this random check ultimately ended up pointing me to the real problem. It turns out these two new groups were set up as Office 365 Groups instead of security groups. At the time, I didn't know anything about Office 365 Groups and didn't really think this could be the problem. I decided to do a little research anyway into what that meant. One of the definitions I found was:

Office 365 Groups is a service that enables teams to come together and get work done by establishing a single team identity (managed in Azure Active Directory) and a single set of permissions across Office 365 apps including Outlook, SharePoint, OneNote, Skype for Business, Planner, Power BI, and Dynamics CRM.

So SharePoint was mentioned in that list of Office 365 apps, right? How could the group type be the problem then? We needed access to SharePoint and it says it does that. What it doesn’t tell you is that it’s mostly referring to access to the team site that is created, specifically for that group, when the group is first created. It does not mean that it will be very usable by other sites.

After more searching and finding very little, I decided it was time to do some of my own testing. First I had a test user added to the Office 365 Group currently in use. After giving the cloud some time to process this change, I signed in and ran a search or two. What I got back was very similar to the user mentioned above. I got little or no results back, even though I had access to everything in the site.

I then created a new Azure AD Security group, added the same test user to it and then granted it the same permissions in the SharePoint site as the Office 365 Group had. After waiting a decent time so I was sure the security change was processed by search, I signed back in and found that the behavior was now entirely different. With the test user as a member of my new security group, I got back tons of results, just as expected. To further verify, I then removed the user from my test security group, waited a bit, ran a search and found I was back to square one. This was pretty solid evidence that the Office 365 Group was the culprit.

Our end solution was to create new Azure AD Security groups, add the correct members, grant them the same access as had been granted to the Office 365 Groups and then remove the access for the Office 365 Groups. This seems to have corrected all the problems the users were experiencing.

I still don’t know a lot about Office 365 Groups and haven’t had time to research much further, but I do like the below snippet (found here) that very succinctly describes each group and what it does.

I’m sure that Office 365 Groups have a place in the Microsoft world, but it is definitely not as a replacement to AD security groups.

As a quick recap, here are the areas that were impacted (at least on this particular site) by using an Office 365 Group instead of a security group:

  • Search – would not return results from the site
  • Starting a workflow – if a user in an O365 group kicked off a workflow, the workflow got hung up with an ‘Access Denied’ error before it ever got far enough along to send out custom errors.
  • Site access – Various users had problems accessing the site, even though they were in the correct group. It was very random; it worked for some, and for others, it didn't.

I hope this helps save someone some time in the future!


Deploy an Azure Function App using Azure ARM Templates

Pete is a Director of Technology at ThreeWill. Pete’s primary role is driving the overall technology strategy and roadmap for ThreeWill. Pete also serves as ThreeWill’s Hiring Manager and is constantly looking for new talent to join the ThreeWill family.

“About a month ago, the Azure Functions for SharePoint (AFSP) open source project was introduced. The project's goal is to provide a set of common functions, implemented as Azure Functions, for scenarios shared by most SharePoint provider-hosted add-ins. Azure Functions is a compelling service from Azure, the Azure Functions for SharePoint pattern promotes an equally compelling use of Azure Functions in the Office and SharePoint space, and I wanted to get involved with the project. One of the goals of the project is to enable people to make use of the pattern and Azure Functions quickly and easily. While there is a Client Configuration Guide to manually configure the Azure resources required to start using Azure Functions for SharePoint, we thought minimizing the friction of getting started by providing a scripted configuration of the required Azure resources would be very helpful.

Read about how to use an ARM Template to deploy all the resources to get started with the Azure Functions for SharePoint.“


Long Running Operations in Azure

Lane is a Senior Software Engineer for ThreeWill. He is a strong technology expert with a focus on programming, network and hardware design, and requirements and capacity planning. He has an exceptional combination of technical and communication skills.

For a project, we recently had to implement a mechanism for performing bulk data operations. Essentially, the customer wanted an end user to upload an Excel file with a bunch of tabular data in it that we would then process and insert into SQL Server. We knew the act of processing and inserting could be very time consuming (we estimated ~30 seconds/row x 10,000 max rows = ~300,000 seconds, or roughly 3.5 days), so we wanted something very robust. Simply working in the application pool wasn't an option; IIS would time us out. When coding the on-premises implementation, we just used a Windows service to watch for a file event in a folder and process the file uploaded by the user. Easy peasy lemon squeezy.

However, Azure works differently. To have the equivalent of a Windows service in Azure, we either needed a Worker Role or an Azure Service Fabric stateless service. Both of these approaches were undesirable since Worker Roles always consume resources, even when they are idle. This bulk import process would be used once, maybe twice, in the initial phases of an engagement and then never again over the course of months (or maybe even years). That means money is being spent even when nothing is happening.

Our first idea for solving this was to use an Azure Function App. A Function allowed us to listen to an Azure BLOB storage account for when a file was written and then react accordingly. However, we quickly realized that Function Apps have a 5-minute timeout (at least the Consumption plan ones do; I think the ‘always on’ Functions do not have this limitation) and our job had the likelihood of running for days.
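As an aside, that timeout is configurable via the functionTimeout setting in host.json, but on the Consumption plan it can only be raised to about 10 minutes, which still wouldn't cover a multi-day job. A sketch of the setting:

```
{
  "functionTimeout": "00:10:00"
}
```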

The next couple of ideas we tried were very complicated, and in the end, none of them worked. That's when we stumbled across Azure Batch Services. Batch Services are a way to run very long running operations, much like Worker Roles. However, unlike Worker Roles, you only pay for Batch Services when a batch job is running. The Azure team was even nice enough to provide some example projects that show how to wire up a job, start it, monitor it, and eventually kill it when it's done (and all kinds of other very helpful things). Even better, we learned we could set up a Function App to trigger off a file being written to BLOB storage that would start the batch job, but the Function App didn't have to wait until the job was completed, so we weren't limited by the 5-minute timeout any longer. Better still, when setting up a job, you can specify how many cores and nodes you want in the batch cluster. That allows us to "preflight" the job by looking at how many records we will be importing and scale our job accordingly. Very slick!

It feels weird writing a blog post without code so here are some steps that should get you pointed in the right direction:

  1. Create a Batch Service account from the Azure Portal.
  2. Grab one of the sample projects from the GitHub project I linked above. I used HelloWorld as a starting place.
  3. Open Main.cs and comment out the lines for Console.ReadLine at the end of Main() (~line 38), and the call to WaitForJobAndPrintOutputAsync (~line 71) and all the code in the Finally block (~line 75-80) in HelloWorldAsync() (mainly we just want HelloWorld to tee up the job and not wait for it to finish). Save and compile everything.
  4. Make sure your Azure specific settings (keys, URL, etc.) are saved in Common’s AccountSettings.settings.
  5. Create a new Function App. Add a BlobTrigger-CSharp Function and add the following code into Run (you will also need to add ‘using System.Diagnostics’ at the top):
    // Launch the console app that submits the Batch job and capture its output.
    Process process = new Process();
    process.StartInfo.FileName = @"D:\home\site\wwwroot\BlobTriggerCSharp1\HelloWorld\HelloWorld.exe";
    process.StartInfo.Arguments = "";
    process.StartInfo.UseShellExecute = false; // required when redirecting streams
    process.StartInfo.RedirectStandardOutput = true;
    process.StartInfo.RedirectStandardError = true;
    process.Start();
    // Read both streams to the end before waiting, so the child process
    // cannot block on a full output buffer.
    string output = process.StandardOutput.ReadToEnd();
    string err = process.StandardError.ReadToEnd();
    log.Info(output);
    if (!string.IsNullOrEmpty(err)) log.Error(err);
    process.WaitForExit();
    
  6. Open the Function App settings and open Kudu. Use Kudu to upload the contents of the HelloWorld Debug folder into the Function App.
  7. Configure the BLOB triggers for the Function App using any path and storage account you want to.
  8. If needed, correct the path to the EXE above.
  9. Open Azure Storage Explorer and upload a file to the storage account you configured in #7.

You should now see your HelloWorld job if you navigate to the Batch Service's Jobs in the Azure Portal. Note that because we took out the code that deletes the job once it's done from HelloWorld, you will need to add that logic somewhere else in your workflow (maybe another BLOB trigger that kicks off another Azure Function App).

And that’s it. An on-demand, scalable batch processing mechanism that works (and costs us money) only when we want it to.


Free ThreeWill Webinars for 2017

Danny serves as Vice President of Marketing at ThreeWill. His primary responsibilities are to make sure that we are building partnerships with the right clients and getting out the message about how we can help clients.

We’re excited to announce our Webinar Schedule for 2017 (all times in EST)…

  1. Moving from SharePoint Online Dedicated to Multi-Tenant – 1/26/17 @ 1:00pm – Listen Now
  2. Migrating from Jive to Office 365 – 2/23/17 @ 1:00pm – Listen Now
  3. Complex SharePoint Online/2016 Migrations – 3/30/17 @ 1:00pm – Listen Now
  4. Creating Award-Winning SharePoint Intranets – 4/27/17 @ 1:00pm – Watch Now
  5. Find Anything in SharePoint with Amazon-Like Faceted Search – 6/29/17 @ 1:00pm – Watch Now
  6. Budgeting for 2018 SharePoint Initiatives – 10/26/17 @ 1:00pm – Register
  7. Successful SharePoint Farm Assessments – 11/30/17 @ 1:00pm – Register

The schedule is subject to change (especially if presenters get overloaded on projects). Let us know in the comments if you have other topics that you would like us to cover.

Sign up below to get notified about upcoming events or follow us on Twitter.




Getting Started With Azure Functions

Pete is a Director of Technology at ThreeWill. Pete’s primary role is driving the overall technology strategy and roadmap for ThreeWill. Pete also serves as ThreeWill’s Hiring Manager and is constantly looking for new talent to join the ThreeWill family.

Danny Ryan: Hello, and welcome to the ThreeWill podcast. This is your host, Danny Ryan, and I have Pete Skelly here with me. He is our Director of Technology, yes?

 

Pete Skelly: Ah, that's correct.

 

Danny Ryan: Director of Technology.

 

Pete Skelly: That's what they call me today.

 

Danny Ryan: You take the technology and you direct it in certain ways, is that what you do?

 

Pete Skelly: Some would say that.

 

Danny Ryan: Some would say that. This is going to be a really fun podcast because Pete is in the delirium stage right now, so what better than to sit down and record it for posterity, so that when your grandkids are listening to this they're like, "Papa, what are you talking about? I don't understand what you're talking about. You sound silly." Seriously, today we wanted to talk about a subject that we covered in the morning brew. A morning brew is something we do weekly where we do knowledge sharing within ThreeWill. This week you covered Azure Functions, correct?

 

Pete Skelly: Yes, that's correct.

 

Danny Ryan: A sec, hit that, and tell me at a high level. Assume I was there and listening, or maybe I wasn't listening, and you were going to describe this to me at a high level?

 

Pete Skelly: Azure Functions, I don't even know what the Microsoft marketing terminology is at this point, but Azure Functions to me really are a pure function, if you will, and not in the proper use of "pure function", don't send me hate mail. It's just the functionality that you want to run.

 

Danny Ryan: Okay.

 

Pete Skelly: There's no additional need to have a full-blown Visual Studio project or, you know, all this extra cruft to accomplish a typically small task. There are a whole lot of caveats around that, but that's what it boils down to. You've got something you want to accomplish, it can be chunked very small, and it's kind of an in-and-out function that achieves some, you know, valuable process.

 

Danny Ryan: This is, previously we'd take these and, you know, you'd build out an app and this app would have a bunch of functions inside of it and-

 

Pete Skelly: Yep.

 

Danny Ryan: Here we're taking something that's going to, I mean, we're just putting it out there in the cloud, something that we can interact with and do what we need to do in little bite-sized chunks.

 

Pete Skelly: Yep.

 

Danny Ryan: They go and they do their thing, and they probably, I guess, have just very specific inputs and outputs for this little guy that does this thing?

 

Pete Skelly: Absolutely, so Azure Functions are based on the Azure WebJobs SDK.

 

Danny Ryan: Okay.

 

Pete Skelly: You know, and please, if I misstate this, provide some comments or feedback in the-

 

Danny Ryan: There will be a comments section.

 

Pete Skelly: Website that Danny has. So, you know, full transparency, Azure Functions were recently released to GA.

 

Danny Ryan: Okay.

 

Pete Skelly: Myself and someone else, in our free time, are spending some time learning Azure Functions, and one of the things that we started to look at was: what do you really do with them? The classic example is a timer job.

 

Danny Ryan: Okay.

 

Pete Skelly: Probably the biggest use of WebJobs, at least from ThreeWill's perspective, for customers or even internal, we used WebJobs for timer jobs, so we had some-

 

Danny Ryan: I love a good WebJob, I'll tell you that much.

 

Pete Skelly: The timer job really was, I just, I'm not even going there. The timer job really is the classic example, you know: I have to clean out some table in a database, or I have files in some location that I want to clean out, or it's the very simple, everybody-gets-it, classic IT admin task of scheduling a job to actually import a file or something.

 

Danny Ryan: Nice.

 

Pete Skelly: In Azure Functions it's a very simple mechanism of saying I have this little bit of code that might have to, you know, add a file, or parse a file and update a database, or parse many files and update a database, and so you can create these functions that are based on HTTP triggers. Basically kind of a fire-and-forget mechanism, or even endpoints that might serve a mobile app, or a custom app, or even a SharePoint app, for example. They could be timers, they could be blob- or queue-based, and all of these things are just events. An event kicks off that code, and that Azure Function is essentially just going to run and perform some action.

 

Danny Ryan: This tickles you inside because one of the things that you can do is PowerShell? What's the deal there?

 

Pete Skelly: One of the things, yeah, part of what I kind of dove into in the morning brew the other day was to try to get my feet wet with what Functions can do. You can write Azure Functions in multiple languages, so they have support for Node, and C#, and Bash, and PowerShell, et cetera, and I figured, "Hey, PowerShell seems pretty good." I had just read that everything went GA for Azure Functions, and I also saw an article or blog post by John Woo about using the PnP PowerShell CmdLets in an Azure Function, so I said, "Why not give it a shot?"

 

Danny Ryan: Sure.

 

Pete Skelly: Using a PowerShell function based on an HTTP trigger, I was trying to set up a Microsoft Flow from a SharePoint list to kick off an Azure Function that would provision a site collection in Office 365.

 

Danny Ryan: Boom. Boom. It was easy, piece of cake?

 

Pete Skelly: Setting everything up was fairly easy except for getting the PnP PowerShell CmdLets to work. Little did I know, you know, spoiler alert, the CmdLets had been changed probably two weeks prior, and so everything that, you know, John wrote about-

 

Danny Ryan: Was it a big change, or?

 

Pete Skelly: The names, the CmdLet names, were changed. They had used, you know, an SPO prefix, for SharePoint Online, in their CmdLet names, and that caused a couple of problems; it was something you could work around, but they recently changed them to the *-PnP* pattern. For example, Get-PnPSite would get you the site collection.

 

Danny Ryan: Okay.

 

Pete Skelly: What I did not know is that that had occurred, and I had grabbed the latest code for the PowerShell CmdLets and kept thinking it was an Azure Function issue, because I've used the PowerShell CmdLets enough in the past to know exactly what they were. It's always something simple; I didn't know what I didn't know, and they were renamed. In the end it was real simple to set up, very simple issue-

 

Danny Ryan: Once you figured it out.

 

Pete Skelly: Just took a little while to figure it out, you know. So, in the end, it was very simple: create a SharePoint list, create a Flow.

 

Danny Ryan: Yeah.

 

Pete Skelly: Then Flow has the ability to basically call out to an HTTP endpoint, and that pushed the data over to the actual Azure Function. PowerShell took that data, basically provisioned the site collection, and applied the site template from the PnP CmdLets. A really simple exercise that shows some of the power of Azure Functions, and right now I'm kind of diving into some other things. There are some rough edges with it at this point, so I'm learning some of the development style, some of the SDLC or ALM, depending on who you are. Some of the lifecycle management ways of developing functions, deploying functions, managing them, et cetera.

 

Danny Ryan: Thank you for defining the acronyms for me, that's so sweet.

 

Pete Skelly: Yeah, so software development lifecycle and application lifecycle management. You know, the Functions are extremely powerful and they're super simple to get started with, so you can literally go to Azure Functions at functions.azure.com and, in the course of probably 20 clicks, have some pretty powerful stuff wired up.

 

Danny Ryan: Did you look at all at how they are pricing this? I mean, how do you license something like this?

 

Pete Skelly: In a nutshell, it's compute time.

 

Danny Ryan: Okay, compute time.

 

Pete Skelly: Really, you know, for someone doing development, learning it, you could get away with pennies if you've got a developer subscription or an MSDN subscription to Azure. It's not going to cost you really anything to get started; there are millions of compute cycles for free, basically, with an Azure subscription. If you're processing BLOBs, so if somebody puts a file into BLOB storage and you have a function that's kicked off because of that event, obviously you're going to pay for some BLOB storage and some of the storage things, and depending on how much you're using, it could get a little pricey. But to get started and to start figuring it out, very cheap.

 

Danny Ryan: Nice.

 

Pete Skelly: I'm also really looking at workloads where, you know, a couple of our customers currently have the classic scheduled task running on a server in their own environment, and so they're paying for the infrastructure for the server, they're paying for the server itself, they're paying for somebody to monitor and manage it and patch it. All those things, where literally they've got a PowerShell script, fifty lines of PowerShell, that's running, that's doing the work. That can be moved up to an Azure Function, and they get rid of all that cost.

 

Danny Ryan: That's awesome.

 

Pete Skelly: They get rid of all those kinds of moving parts, and it just becomes: somebody uploads a file to BLOB storage, that function runs, they get charged for the time that that function is running, and they move on. Really, I think there's a lot of potential there.

 

Danny Ryan: Yeah.

 

Pete Skelly: The IFTTT mentality, and the Zapier mentality, and now Flow. That style of interaction with things. We, you know, wrote in a white paper many moons ago that the true power of a network, if you will, is all the permutations that you have; basically, the more nodes you have, the more value you're going to get. Between Flow and Functions and all these other things, you're now getting to a power use that could make some really effective "workflows", if you will, and I'm doing air quotes on the podcast.

 

Danny Ryan: I see it, I see the air quotes.

 

Pete Skelly: You can create some really powerful, you know, sort of personal workflows using Flow, and then if there's something that needs to be more enterprise grade, that's where Logic Apps would come in.

 

Danny Ryan: Okay.

 

Pete Skelly: In effect, Logic Apps would be the enterprise level of a personal flow, if you will, and Logic Apps basically know how to interpret and interact with, you have full access to, Azure Functions, so it's the big-boy movement for getting all of the business processes that will be event driven within kind of your Office 365 or Azure environment.

 

Danny Ryan:Someone getting started with this besides nailing the class names, what would you-

 

Pete Skelly:Figuring out the easy stuff?

 

Danny Ryan:Any tips that you have for them?

 

Pete Skelly:Start with functions.azure.com

 

Danny Ryan:Okay.

 

Pete Skelly:You’ll need an Azure subscription, but that’s the simplest way to get started. The Azure WebJobs SDK in GitHub has a ton of samples. I believe it was Chris Anderson who did an Azure Functions demo, or seminar, or session at Ignite this year.

 

Danny Ryan:Ignite, okay.

 

Pete Skelly:That is probably one of the better things I’ve seen as far as explaining what they are, how to use them, you know, giving you some concrete examples, et cetera. Once you kind of go through the UI to get your feet wet, there are several utilities, depending on whether you are inside of Visual Studio. Me personally, I’m on a Mac, so I’m using some command line tools; the Azure Functions Command Line Interface actually lets you stub things out and get started with kind of better application lifecycle management, so you’re using source control and you’re pushing to a repository.

 

Danny Ryan:Nice.

 

Pete Skelly:That’s actually going to, once you push it and check your code into master, it actually deploys it out to Functions. So it gets a little bit more advanced; once you get out of the UI in the web you’ve got to get some tooling. That’s not all there yet, it’s kind of half-baked. There’s some really good stuff out there, but it takes a little effort. That’s kind of the stage I’m in: getting the pieces and parts together so that you’re actually doing the development, quote, properly, versus trying to go through the UI, which doesn’t really scale. Not that there’s not value in that, but for our customers, you know, we’ve got to find a way to actually make that a little bit more ALM compliant.

 

Danny Ryan:We’re getting here to the latter stages of this, and I need to let you get back to work. When do you see yourself first using Azure Functions on projects? Your next project you’re going to be on, or is it further out?

 

Pete Skelly:I think the dependency right now is, does the customer have Office 365, first off. If they have Office 365, I think that’s the easiest on-ramp, because then it’s just, “Okay, you’ve got Office 365, let’s get a subscription in Azure,” and basically wire up some processes. For one of our customers that I’m working with currently, that’s in their plan for 2017.

 

Danny Ryan:Great.

 

Pete Skelly:They’re really looking forward to doing some of the things I described, which are kind of timer job based on premises, and moving those into the cloud and using just pure compute, so they don’t have servers on-prem anymore, those types of things.

 

Danny Ryan:Do you mind taking a couple of hours this weekend and getting the Jive migration tool over into Azure Functions?

 

Pete Skelly:Actually, the reason I’m a deer in the headlights today is because I’ve been mired in Git for most of the day, trying to set it up so that we can have multiple Git repositories that share all the same code. So I’m working on your migrator tool right now as we speak.

 

Danny Ryan:Aww, thank you. Very nice. Very nice.

 

Pete Skelly:It’s coming. I don’t know if it will be in Azure Functions very soon, but that’s kind of the long-term goal for some of this stuff.

 

Danny Ryan:You should focus in on getting the project done. I’m like the number 18 priority around here, so I understand; I know where I fit into the world. With that, thank you so much for taking some time out at the end of the day here to share, you know, like you shared internally earlier this week, and then sharing with everybody here. If you are picking this up and have any questions, please leave a comment at the bottom of the page. Also, promote your website, Pete.

 

Pete Skelly:Peteskelly.com. The Azure Functions blog post that actually goes through what I had to do to get the PnP cmdlets working, the full code, those types of things are all out there.

 

Danny Ryan:Yeah, and I’ll put a link to that at the bottom of the page as well, so jump off there as a good next step from here. Thank you so much for listening and have a wonderful day. Take care. Bye bye.

 

Pete Skelly, Getting Started With Azure Functions

How to Assign ThreeWill as Your Azure Partner of Record

Danny serves as Vice President of Marketing at ThreeWill. His primary responsibilities are to make sure that we are building partnerships with the right clients and getting out the message about how we can help clients.

Over the last 15 years we’ve been fortunate to help hundreds of customers.  If you’re a customer of ThreeWill and you want to do us a huge favor, please assign us as your Partner of Record.  This enables us to keep our Gold Certification and to serve you better because we have more resources from Microsoft to help you on projects.

Step-by-Step Instructions to Add ThreeWill as Your Partner of Record

  1. Go to the Microsoft Azure Portal at http://azure.microsoft.com/.
  2. Click on the My Account icon on the upper middle of the screen.
  3. Click on Usage and Billing.
  4. Log into your account using your user name and password.
  5. In the left navigation pane, select Subscriptions.
  6. On the Summary Subscription Page, click on Partner Information on the right navigation. This is where you will attach your Partner of Record.
  7. Enter 566560 for the Partner ID.
  8. Click Check ID to verify ThreeWill.
  9. Click Submit to complete assigning your Partner of Record.
  10. After you assign us as your Partner of Record, we will receive an email notification letting us know that we have been assigned as the Partner of Record.

To Change or Remove Your Partner of Record

  1. Following the steps outlined above, log into the Microsoft Azure Portal.
  2. On the Summary Subscription Page, click on Partner Information on the right navigation.
  3. Highlight the Partner of Record field and delete the Partner of Record shown in that field.
  4. Click the check box. You have now removed the Partner of Record for this account and your subscription no longer has a Partner of Record.

Frequently Asked Questions

Who is a Partner of Record?

The Partner of Record for an Office 365, CRM Online, or Azure subscription is the partner who is helping the customer design, build, deploy or manage a solution that they’ve built on the service. It is not the partner who sold the subscription.

What are the benefits of specifying a Partner of Record?

Customers benefit because it provides the partner access to usage and consumption data, so they can provide better service and help customers optimize their usage for their desired business outcomes.

Who can attach a Digital Partner of Record?

The administrator role, also known as the owner, is the only role within the customer’s tenant or account that can attach a Partner of Record. Service admins, co-admins, and partners designated as delegated admins do not have the ability to change the Partner of Record.

When should a Partner of Record be added to an Office 365, CRM Online, or Azure subscription?

Microsoft recommends a Partner of Record be assigned to subscriptions right away. Partners of Record can also be assigned for Office 365 subscriptions in the admin portal for that service.

Danny Ryan, How to Assign ThreeWill as Your Azure Partner of Record

Configure Scheduled Shutdown for Virtual Machines in Azure

Eric Bowden has over 19 years of software development experience around enterprise and departmental business productivity applications.

Scheduled Shutdown in Azure

I’ve been using Virtual Machines in Azure for a while, and it’s a great option. My favorite part is how easy it is to expose Virtual Machines in Azure to the public internet, which can be really useful for a number of testing scenarios; it’s been a great resource. I’m able to use it because, as part of my MSDN subscription, I’m provided a monthly credit. The only issue is that the credit is only approximately enough to cover my Virtual Machines if they run during working hours, not the full month.

Approximately eight hours a day, five days a week is what the credit has been enough to cover. That works out great as long as I remember to start my Virtual Machines in the morning and stop them at the end of the day. Unfortunately, I’ve forgotten to stop my Virtual Machines a number of times, so they’ve run overnight. One time they ran over the weekend, and that caused me to overrun the monthly credit that I have as part of my MSDN subscription.

A colleague of mine, Rob Horton, caught wind of that recently and directed me to the blog post I’m showing, which walks you through configuring what’s called a runbook in Azure. That runbook will run a PowerShell script, specifically what’s called a graphical PowerShell script (as opposed to a regular one), and the post walks you through configuring a graphical PowerShell script as part of your runbook, which can be used to automatically stop your Virtual Machines.

I configured that a couple of weeks ago, and it’s been awesome! I have mine run at 6:00 PM, which is approximately the end of the day, and then again at 2:00 AM, which could be the end of my day if it’s a long one; if I’m working on something into the night, 2:00 AM will probably be the latest. I have it run twice just to make sure it shuts down. I’ll walk us through the process of configuring my runbook. I’m showing on the screen now the blog post that I followed, but there was one key difference in how I configured mine versus how the blog post walks you through it: I used a user account to run my runbook, whereas the blog post walks you through using an application account.

I think there are probably good reasons for using an application account. I tried that and couldn’t get it to work, but I did get it to work with a user account. I could probably circle back through and get it working with an application account, but I just haven’t put that effort into it. So today I’m going to walk you through how to configure it using a user account. Let me get started.

The first thing I’m going to do is create the account. To do that, I’m going to go into the portal and choose Active Directory to get into my Azure Active Directory accounts. I’m going to go to the users tab. I’m going to add a user. I’ll create a new user and I’m going to call this VM Maintenance. Choose next and we’ll just call it VM Maintenance. Just create it as a standard user and you will get a temporary password for that user. I’m going to copy that to my clipboard. We’ll go ahead and close that up.

I’m going to open up a new incognito window. Then I’m going to go ahead and log in as this new user. It’s going to prompt me to change my password, which I will do. Take my password and sign in. Now, my credential is set, and my next step is that I’ll need to configure this user as a co-administrator of my Azure subscription. To do that, I’m going to scroll down to Settings I’ll choose Administrators and then I’m going to choose Add. Add in my user. Select that user as a co-administrator.

You’ll see the user is actually listed in here twice at the moment because I did it earlier while preparing for this demo. Let’s just remove the first one. There we go. Hopefully, I removed the old one and not the new one, but we’ll see what happens there. My account is now a co-administrator of this subscription. I know that seems pretty heavy-handed. I have read blog posts, Stack Overflow posts and so forth from people who have stated that this is necessary, and I’ve actually tried giving this user a lower account privilege and it was not successful. This does seem to be necessary.

That should be what we need for our accounts. Next, I’m going to go back to the portal admin page. I’m going to go to automation accounts and I’m going to add a new automation account. Click Plus and we will call this, VM Automation. I’m going to add it to an existing resource group. I’m not really aware of why this matters, but I’ll choose the same resource group as I do for all my others. I’ll leave the other default settings and choose Create.

It may take just a few minutes for the automation account to finish the creation process; for me, I had to click refresh on the listing of automation accounts. Once the new automation account is open, the first thing we want to do is go to Assets, click Variables, and add a new variable called Azure Subscription ID. For that, we of course need our Azure subscription ID: go back to the portal, go to My Subscriptions, grab the Azure subscription ID, and paste it in.

Next, we need to create a credential which will be used to run the runbook, so I’m going to click Credentials, click Add Credential, and name this one Azure Credential. I’m going to add in the VM Maintenance account that I created earlier. Now we’ve created a variable and a credential, and our next step is to create the runbook. Rather than choosing Add runbook, I’m going to choose Browse Gallery and then choose Stop Azure Classic VMs. Let’s just see what that gives us.

You’ll notice that there are two of them: one that says PowerShell Workflow Runbook and another that says Graphical PowerShell Workflow. The Graphical PowerShell Workflow is the one we need. We’ll choose that, select Import, accept the default name, and choose OK. We’ll import it from the gallery, and my next step is to choose Edit. There is a button for the test pane, and it prompts me for parameters. The service name will default, I guess, to the current service. The Azure Subscription ID and the name of the credential both have defaults which we configured a moment ago: you can see Azure Subscription ID is the variable we just configured, and Azure Credential is the credential we just created.

When we’re ready, we can click Start to begin the test. I’m going to scroll over a little bit here to see the output. It takes a while for these to run, so I’ll pause the video while it runs. When the workflow has completed, as you can see here, it gives you a little bit of output. In this case, it tells me that my Virtual Machines were already stopped, so no action was taken; had any of them been running, it would list out those it had stopped.

Let me close out of the test pane. My next step is to publish this runbook, so it prompts me: do I want to publish it? Of course I say yes. Now we have a runbook created, our accounts are created, and we have tested our runbook, so our next step is to schedule it. I’ll go back to VM Automation and select the runbook. Once the runbook is open, I’ll click Schedule at the top, and it says “Link a schedule to your runbook,” so I’ll choose that. I’m going to create a new schedule.

In this case, I’m just going to choose a one-time schedule. Of course, you can choose once or recurring; I’ll choose once and have it fire off here in the next 20 minutes. We’ll choose 12:20 PM and I’ll say Create. Now, one trick: after simply creating the schedule, I thought I was finished, but you’re not. All you have done at this point is create a schedule; you still need to link the runbook to it.

Now I’ve selected it and choose OK. Now you should see the number of schedules increase here under the schedules listing. I’ll open it, and now I can see that my runbook is scheduled and we’ll wait a moment for that schedule to run to confirm that it’s working. To do that, I’m going to go back to my classic virtual machines. Let me just start up one of my smaller virtual machines here.

It’ll start up and hopefully be fully started by the time the schedule fires off. Let’s go back to Automation Accounts, open the automation account, open the runbook, and open the schedule, and we’ll see it’s set to fire off at 12:20, which is just about five minutes from now. They won’t allow you to fire off a schedule any sooner than five minutes out, so we’ll wait just a moment for it to fire off.

Now I can see that my schedule is listed as expired which tells me that it’s already run. So if I go back to my classic virtual machines, I can confirm now that my virtual machine has been stopped. That’s the quick run-through for creating a runbook which you can use to stop your virtual machines.

Eric Bowden, Configure Scheduled Shutdown for Virtual Machines in Azure

Azure WebJob Scheduling

Kirk Liemohn is a Principal Software Engineer at ThreeWill. He has over 20 years of software development experience with most of that time spent in software consulting.

Azure WebJobs are extremely easy to write. If you have Visual Studio 2015 (earlier versions may work as well), then you can write a Hello World app in one minute. In another two minutes, you can publish it and have it running in Azure.

However, if you want to run it on a schedule, this can be a little tricky. For my first scheduled WebJob this went off without a hitch, but for some reason, my second WebJob was more finicky. It took a bit of investigation to figure this out, so I thought I would share my findings here…

Let’s start from the beginning.

If you have Visual Studio 2015, the simplest way to create an Azure WebJob is to create a console app. You can do this without Visual Studio, but that’s beyond the scope of this post. For a hello world test, your console application can be this simple:



using System;

namespace SimpleWebJob
{
    class Program
    {
        static void Main(string[] args)
        {
            Console.WriteLine("Hello World");
        }
    }
}



Then simply right click on your project name and choose Publish as Azure WebJob:

The next screen you are presented with asks for a schedule. This may or may not work (more on that later).

Then you have a choice to make for selecting a publish target. The first option has given me more success when it comes to publishing the schedule chosen above. With that option, you’ll have to sign in to Azure.

If you choose the second option, you’ll need to first download a publish profile. On the old Azure portal (manage.windowsazure.com), this can be found in the quick glance portion of the dashboard for your web apps (you’ll have to create a web app if you don’t have one).

On the new Azure portal (portal.azure.com), this can be found near the top of the page for your web app.

Once you download the publish profile, you can import it into the Visual Studio wizard.

Regardless of your publish target, you can then validate your connection and publish.

After publishing you want to see something like the following in your output window:

Creating the scheduler job

Job schedule created

Web App was published successfully http://<yoursite>.azurewebsites.net/

However, you may notice one of the following messages in your output window instead:

Warning : WebJob schedule for SimpleWebJob will not be created. WebJob schedules can only be created when the publish destination is an Azure Website

Web App was published successfully http://<yoursite>.azurewebsites.net/

I tend to see this when importing a publish profile, but don’t have this problem when selecting “Microsoft Azure Web Apps” as the target. Others have had these issues as well, as evidenced here and here. However, even with the “Microsoft Azure Web Apps” target I can get this error:

Error : An error occurred while creating the WebJob schedule: Conflict: The provided job has a recurrence that is outside of quota. The minimum recurrence for the job collection is '01:00:00'.

I get this error if I choose a frequency more often than one hour, even if I am on the free tier for my web app. Others have had these issues as well, as evidenced here and here. The pricing for these plans is at https://azure.microsoft.com/en-us/pricing/details/app-service/plans/. You may find that upgrading your web app from the free plan helps, but keep reading as there is another option.

If you see either of the above messages, your WebJob was probably published, but the schedule was not. It will only run on demand. Another clue you are having this problem is if you look at the WebJob in Azure you will see that it has a type/schedule of “On Demand”. However, I have seen it say “On Demand” in the new Azure portal when it actually mentioned the right schedule in the old Azure portal. Very strange.

Based on what I have seen, Azure WebJobs are not really intended to own the schedule. The reasons I state this include:

  1. It seems you need Visual Studio for this approach.
  2. When using a publishing profile, the schedule appears to be ignored.
  3. The new Azure portal does not show the schedule when the old one does. For the new Azure portal, the schedule actually shows in the Scheduler (more on that below).
  4. There is a nice UI for creating the schedule, but to update the schedule you must go into the webjob-publish-settings.json file.
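As an aside, if you do end up editing the schedule by hand, webjob-publish-settings.json lives in the console project’s Properties folder. A minimal hourly example might look roughly like this; the property names follow the published SchemaStore schema, but the values here are illustrative, not taken from this post:

```json
{
  "$schema": "http://schemastore.org/schemas/json/webjob-publish-settings.json",
  "webJobName": "SimpleWebJob",
  "startTime": "2016-01-01T00:00:00-00:00",
  "endTime": null,
  "jobRecurrenceFrequency": "Hour",
  "interval": 1,
  "runMode": "Scheduled"
}
```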

My belief is that scheduling for WebJobs should be done using the Azure Scheduler directly rather than through the built-in publish wizard. If you choose the Azure Scheduler, you have three pricing options. The free one has some limitations: the most important are not only the maximum execution frequency of one hour, but also that you cannot attach authentication to each schedule, which is needed if you want to schedule a WebJob. Some have had success providing credentials on the URL (see this post for more details), but it did not work for me.

My solution was to upgrade the scheduler to Standard ($13.99 per unit; you get 10 “job collections created per hour” for one unit) and create my own schedule by hand. Here’s how I did it from the new Azure portal:

  1. Create a Scheduler or find your existing Scheduler (use search within the portal to find it)
  2. Upgrade to Standard if you are using the Free tier
  3. Download your publish profile as discussed further above and open the file with a text editor.
  4. Add a new scheduler job
  5. Give it a name and set the following properties:
    1. Action: Https
    2. Method: Post
    3. Url: https://<yoursite>.scm.azurewebsites.net/api/triggeredwebjobs/<webjobname>/run
    4. Body: (leave blank)
    5. Authentication Type: Basic
    6. Authentication Username: get the "userName" value from your publish profile – it might start with "$"
    7. Authentication Password: get the “userPWD” value from your publish profile
  6. Set a schedule. I noticed that the UTC offset was ignored for me so I kept it at UTC and just adjusted the advanced schedule settings accordingly.
  7. Make sure you save everything

One issue I noticed with this approach is that my authentication settings would get cleared whenever I made updates to the schedule, so you may want to re-enter the authentication settings if you make any changes to the schedule.
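For what it’s worth, the scheduler job configured above is just an authenticated POST against the Kudu triggered-WebJobs endpoint. Here is a minimal offline sketch in Python of the request it sends; the site name, WebJob name, and credentials below are placeholders, and the request is deliberately built but not sent:

```python
import base64
import urllib.request

def build_trigger_request(site: str, webjob: str,
                          username: str, password: str) -> urllib.request.Request:
    # Kudu endpoint that runs a triggered WebJob on demand.
    url = f"https://{site}.scm.azurewebsites.net/api/triggeredwebjobs/{webjob}/run"
    # Basic auth built from the publish profile's userName/userPWD values.
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return urllib.request.Request(
        url, data=b"", method="POST",
        headers={"Authorization": f"Basic {token}"})

# Placeholder values; a real call would use your site and publish profile credentials.
req = build_trigger_request("yoursite", "SimpleWebJob", "$yoursite", "s3cret")
# urllib.request.urlopen(req) would actually fire the WebJob, so it is omitted here.
```

Calling urllib.request.urlopen(req) performs the same POST the Scheduler makes on your behalf, which is also a handy way to test the credentials from the publish profile before wiring them into a scheduler job.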

After the schedule is set up, you can go back to your WebJob in the Azure portal and wait for it to run. You can run it manually as well. Once it does run, you should see your Console.WriteLine output. Here is a sample of what you see:

[02/23/2016 03:51:42 > 1cd571: SYS INFO] Status changed to Initializing

[02/23/2016 03:51:42 > 1cd571: SYS INFO] Run script ‘SimpleWebJob.exe’ with script host – ‘WindowsScriptHost’

[02/23/2016 03:51:42 > 1cd571: SYS INFO] Status changed to Running

[02/23/2016 03:51:42 > 1cd571: INFO] Hello World

[02/23/2016 03:51:42 > 1cd571: SYS INFO] Status changed to Success

I know this is a lot of detail, but I hope this post clears things up for some.

Kirk Liemohn, Azure WebJob Scheduling

Azure Web Jobs

Kirk Liemohn is a Principal Software Engineer at ThreeWill. He has over 20 years of software development experience with most of that time spent in software consulting.

Danny Ryan:                       Hi this is Danny Ryan and welcome to the ThreeWill Podcast. Today I’ve got Kirk Liemohn here with me. Kirk is a Principal Software Engineer here at ThreeWill. Thanks for joining me Kirk.

Kirk Liemohn:                    Hello. Good to be here.

Danny Ryan:                       Great. I’m pulling you in for … We’re going to hit a technical subject today and that subject is web jobs. These are Azure Web Jobs. Is that correct?

Kirk Liemohn:                    That is correct.

Danny Ryan:                       Awesome. I’d love to learn a little bit more about what this is. Maybe a little bit of history of how you solved this problem before in the past. Some of the benefits of the way you’re taking your approach today. Tell me a little, maybe just starting at a high level, what is an Azure Web Job?

Kirk Liemohn:                    It’s just a task that runs in the background on Azure, within an Azure app. You can use it just to run anything that might take a long amount of time maybe, or just something that you want to run on a periodic basis.

Danny Ryan:                       Great. What would … is there an equivalent type of thing on SharePoint as far as the way you would approach this?

Kirk Liemohn:                    Yeah. In the past, in SharePoint, we would do a timer job, definitely, if we had access to do that; you can’t do that in SharePoint Online. You could write your own timer job in SharePoint, or you could do a scheduled task on a Windows server, or even a client machine if you wanted to. Those are ways we did it in the past. I’ve got a Unix background from back in the ’90s, and back then it would have been a cron job.

Danny Ryan:                       Excellent. Sort of the history of the way that you periodically ran a piece of software or …

Kirk Liemohn:                    Yep.

Danny Ryan:                       I know when we we’re talking about prepping for this you also mentioned that you could use this for long running processes as well?

Kirk Liemohn:                    Yeah. Say you’ve got a website, especially if it’s running in Azure and you’re using the PaaS solution for websites in Azure. It needs to do some processing, and for whatever reason that processing takes more time than you want to make the user wait. Even if it’s just a few seconds, that might be enough to say, “Hey, let’s do this asynchronously,” and push it off to a web job. If it might take minutes or hours, you would certainly want to consider a web job for background processing.

It may not be something you do on a scheduled basis; it may be initiated by a user. If you were to try to spawn a thread on that website so the user didn’t have to wait through that delay, IIS, if you’re using IIS, will potentially kill your thread because it thinks, “Well, I’ve already returned to the user and I don’t need anything else going on. I see there’s something else going on, but I don’t care; I’ve already returned control to the user.” It has no use for the thread you’ve created, so it’s better to do it with something like a background process via a web job.

Danny Ryan:                       Tell me a little bit about some of the things that you learned while doing this. Talk a little bit about how did you package this up and get it on Azure.

Kirk Liemohn:                    First off, this is my first time doing a web job. I’ve wanted to do them in the past. I’ve had some times I didn’t have to do them, but I kind of wanted to do them. This is a time I really needed to. I am doing a migration to Office 365 and we’re concerned about throttling. This is SharePoint, this is a SharePoint migration. We know that we might get throttled by SharePoint online. We want to know when we’re being throttled.

The best way we can do that is to look at the logs for the migration tool and read that information and find out if there’s an entry in the log that indicates that we’re being throttled. I’ve got several migrations running at once on several servers. I didn’t want them to each go through and look for this because it is a little costly. I have to do a full text index of the content and a query across that. I’ve decided that I want to run it on a periodic basis. A job that will do that check and then it will post something to a simpler database that says, “Yep, I’ve been throttled or not.”

It’ll be a lot easier to say, create a dashboard that shows what the throttle status is, if there’s been any throttles recently, as well as the migration … The servers that are doing migrations they can do that check and say, “Oh, if we’ve been throttled recently I don’t want to start another migration job. I’m going to hold off until I’ve got no throttles within the last hour or something.”

Danny Ryan:                       Awesome. How are you alerted about this? You’ve mentioned a dashboard. I guess you could do it through a dashboard, but how else?

Kirk Liemohn:                    I haven’t created the dashboard yet; I might do that. I used Visual Studio, which is what turned it into an Azure Web Job for me. I started out writing it as a PowerShell script, and my PowerShell script relied on a cmdlet for SQL that required a couple of, actually three, MSIs to get working. I didn’t know how to install MSIs in a web job; I don’t think you can. I know you can upload a bunch of files, but it was simpler for me to just switch it over to a console app.

I switched to a console app, and this console app was written in Visual Studio. Visual Studio made it real easy to just right-click on the project and say, “Publish this web job.” There are a few settings you have there, like how often you want it to run. Then from the Azure admin console you can see the status and any output from your console app.

Danny Ryan:                       You mentioned that this runs every hour, but that’s primarily because of the type of Azure account that you have. Is that correct?

Kirk Liemohn:                    That’s right. I have it running once an hour. That’s the fastest I can have it run, since I think we’re on the basic plan for our Azure web app. If I were to change that to standard or premium I could do it more often, and we may do it for that reason; we haven’t decided that we need to go to that level yet. While it’s running, it simply does the query, the full text query. It’s against SQL Server, which is running on Azure machines, so I have to have access to that SQL Server over the internet, which means I’ve got special accounts and ports for that. Then, if it finds something, it logs it to a different database table and also sends an email. I use SendGrid for that, which I’ve used in the past with Azure, so it was pretty easy to set up.

Danny Ryan:                       It’s a pretty easy configuration for that?

Kirk Liemohn:                    Yeah, it probably took me 30 minutes and I hadn’t used SendGrid in several months, maybe a year. It wasn’t fresh in my memory but about 30 minutes to figure out how to send email, basically. Set it up and configure it and all that.

Danny Ryan:                       It’s funny you mention this is the first one you’ve done. I think that’s the best time to give advice to other people who might be doing this for the first time. Any other pointers that you would give to people for writing Azure web jobs?

Kirk Liemohn:                    Yeah, web jobs. The first thing is to try and understand what it relies on. If it relies on certain other pieces of software or something installed on your system, then it may not be appropriate, or you may need to find a way to get those up into Azure. If you need other files, in my case I initially needed some other cmdlets for PowerShell, and that wasn’t going to be the easiest path for me, so an all-encompassing console app that knew how to talk to SendGrid and knew how to talk to databases was all I needed.

Danny Ryan:                       It’s funny, it seems like console apps are coming back, because I was talking to Chris yesterday and he wrote the Jive to SharePoint migration tool. That’s a console app as well. It seems like they’re coming back in fashion now.

Kirk Liemohn:                    Oh yeah. All of our stuff for migration is mostly PowerShell based and we run all that from consoles, so definitely. The gotcha is just try and understand what your program relies on. Try and understand what your needs are in terms of how often this thing needs to run. Those are kind of the two things that I ran up against.

Danny Ryan:                       That’s great. I appreciate you taking the time to do this. I know you’re busy on your project but hopefully this will help somebody who may be out there looking to write their first Azure Web Job. Thank you for doing this Kirk.

Kirk Liemohn:                    Sure. You’re welcome.

Danny Ryan:                       Thanks everybody for listening. Have a great day.


Starting and Stopping Multiple Azure VMs from a Mac Terminal

Pete is a Director of Technology at ThreeWill. Pete’s primary role is driving the overall technology strategy and roadmap for ThreeWill. Pete also serves as ThreeWill’s Hiring Manager and is constantly looking for new talent to join the ThreeWill family.

We recently started a project in which keeping the costs of Azure VMs as low as possible was a real concern. Since I am a Mac user (stop snickering), the Azure CLI tools seemed like an obvious choice. I just wanted a simple way to check the current status of the virtual machines in a specific Azure subscription, and then start or shut down the virtual machines as needed. I agree with Scott Hanselman, and thought I would save myself and others some keystrokes. #sharingiscaring

Using the Azure CLI from a Mac Terminal

The first thing I found when searching was Using the Azure CLI for Mac. This was pretty helpful, but it didn’t give me a simple path to shutting down all of the running virtual machines at once. It did give me the basic commands to check the status of the virtual machines, and to stop and start them as well. By the way, this page has a bunch of info I plan on diving into at a later date.

To start, you’ll need to install the Azure CLI tools. This is a pretty straightforward install using npm.

npm install azure-cli -g

Once you have the tools, you’ll need to get your subscription settings. To do that, run the following:

azure account download

This will open a browser where you’ll need to select the subscription for which you want the settings file. Once the subscription information is downloaded, running

azure account import [settings file]

will set the current account.

The Basic Commands

Once the Azure CLI tools are installed, the basic commands to list, stop and start the virtual machines are pretty simple:

azure vm list

azure vm start vm-name

azure vm shutdown vm-name

Great. Almost there.

Painful Unix Memories

Unfortunately, even though I am now a Mac user, I still have to bang my head against the wall to remember Unix commands and pipe operations. Bad memories of awk and sed from many years ago. Towards the end of the Install the Azure CLI tools post, there was a section about “understanding results” which did something very similar to what I needed. This example was using xargs and something I had never heard of called jsawk. The xargs didn’t give me any flashbacks, so I was ok with that, but the jsawk made me twitch a little. Turns out, the curl installation of jsawk from the GitHub repo failed for me, but I was too close to stop!

A link in the jsawk readme mentioned using Homebrew, so the following installed jsawk and the dependencies I was missing. Back in business!

brew search jsawk
brew install jsawk
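If jsawk refuses to install at all, it’s worth knowing the filter isn’t doing anything magical. Here is a made-up sample of the JSON that azure vm list --json emits (the VMName field is the only assumption, borrowed from the jsawk expression later in this post), plus a plain sed substitution that pulls out the same names:

```shell
# Hypothetical sample of the per-VM JSON objects returned by
# `azure vm list --json` (only the VMName field is assumed here).
cat > /tmp/vms.json <<'EOF'
[
  { "VMName": "vm-dev-01", "InstanceStatus": "ReadyRole" },
  { "VMName": "vm-dev-02", "InstanceStatus": "StoppedDeallocated" }
]
EOF

# Print just the VMName values -- a fallback when jsawk is unavailable.
sed -n 's/.*"VMName": *"\([^"]*\)".*/\1/p' /tmp/vms.json
```

It’s fragile compared to a real JSON parser, but it’s fine for the pretty-printed output the CLI produces.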

The Final Solution

Finally, all the pieces are in place. Now I can list the virtual machines, filter them, and more. This is a little deeper than I want to go here, but the jsawk documentation is really great. With the ability to filter and then use xargs to call another azure command, the final solution was just a simple change based on the example towards the end of the Install the Azure CLI tools post. Here they are:

Shutdown some Virtual Machines

azure vm list --json | jsawk -n 'out(this.VMName)' | xargs -L 1 azure vm shutdown

Start all of the Virtual Machines

azure vm list --json | jsawk -n 'out(this.VMName)' | xargs -L 1 azure vm start
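If you’re curious what xargs -L 1 contributes here, you can dry-run the back half of the pipeline by putting echo in front of the azure command (the VM names below are made up, standing in for jsawk’s output). Each input line becomes one printed command instead of an executed one:

```shell
# Dry run: simulate the VM-name list jsawk would emit, and let
# `xargs -L 1` build one command per name. The leading `echo` means
# the azure commands are printed rather than executed.
printf 'vm-dev-01\nvm-dev-02\nvm-test-01\n' \
  | xargs -L 1 echo azure vm shutdown
# prints:
#   azure vm shutdown vm-dev-01
#   azure vm shutdown vm-dev-02
#   azure vm shutdown vm-test-01
```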

There are still some things I want to do, specifically create a shell script to enable some other options and arguments, but this might be a blog post for another day. Hopefully this will help someone else and save some keystrokes.
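As a rough sketch of where that shell script might go (the vmctl name and argument handling are entirely my own invention, not from any Azure tooling), the two pipelines above collapse into one parameterized helper:

```shell
# vmctl: hypothetical wrapper around the two pipelines above.
# Usage: vmctl start | vmctl shutdown
vmctl() {
  case "$1" in
    start|shutdown)
      azure vm list --json \
        | jsawk -n 'out(this.VMName)' \
        | xargs -L 1 azure vm "$1"
      ;;
    *)
      echo "usage: vmctl start|shutdown" >&2
      return 1
      ;;
  esac
}
```

Then vmctl shutdown and vmctl start cover both cases, and anything else gets a usage message.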

And BTW – I do realize this would be very easy in PowerShell. I’m just sort of new to the Mac. #justsaying

Get-AzureVM | Where-Object {$_.Status -like "Ready*"} | ForEach-Object { Stop-AzureVM -ServiceName $_.ServiceName -Name $_.Name -Force }

Let me know what you think or if this was helpful.



How to Brand Your Office 365 Login Page

Pete is a Director of Technology at ThreeWill. Pete’s primary role is driving the overall technology strategy and roadmap for ThreeWill. Pete also serves as ThreeWill’s Hiring Manager and is constantly looking for new talent to join the ThreeWill family.

Now that Microsoft has made it easier to get to the Azure Active Directory instance which backs your Office 365 tenant, something else got a whole lot easier as well. Branding your Office 365 tenant login experience is now super simple. No more boring login screens that look like everyone else’s – like this one!

o365login-1

Just a few clicks and we’ll have a nice, branded experience.  First, click the Apps Launcher in your tenant and click the Admin app tile.  You will need admin rights to accomplish this task, and you will need access to an Azure subscription.

o365login-2

Once you are in the administrative portal for your tenant, click on the Azure AD link. If you don’t have an Azure subscription, you will need to create one. In my case, I already had an account, and my tenant directory appeared with my other tenant directories. You should see the domain listed that is associated with your O365 tenant. Then click on your domain.

o365login-3

Once you see the list of Active Directory domains, click your domain. Then click the Configure menu option if you don’t see the directory properties as below.

o365login-4

Finally, click the “Customize Branding” button and set the following five items:

  1. The banner logo – a banner logo displayed on the Sign In page in cloud applications that use this directory
  2. The tile logo – a 200×200 .jpg or .png file displayed on the sign in page
  3. Sign In Page Text – sign in page text, typically an “appropriate use” statement
  4. Sign In Page Illustration – large image which will appear to the left of the login form
  5. Sign In Page background – background color (in HEX) used in place of an image or behind a transparent image

o365login-5

Some of the items above have size limitations (image dimensions and file size), so you may need to brush off some Paint.NET skills, or have somebody with real talent create your images like I did.  Save your changes and then try entering your user id in the login screen. It may take some time to see all of your changes, but as soon as you tab off of the user name field, you should see your images and text.

o365login-6

Once you have these settings in place, you can even return to the Configure page and add language specific settings. Just a little something simple to make the Office 365 experience part of your company brand.

Questions about how to brand your Office 365 login page?  Ask in the comments below…I hope this helps make Office 365 feel a little more like your own. Want even more customization? Look Here!


Ten Reasons Why Office 365 and Azure Made Me a Fan Boy

Bo is a Principal Consultant for ThreeWill. He has 18 years of full lifecycle software development experience.

I recently worked on an on-boarding solution for a customer that was built as a cloud based application using Microsoft Azure and Office 365.  It’s what Microsoft refers to as a provider-hosted app for SharePoint, and it is all the rage these days.  Being a Microsoft developer for about 15 years, I’ve learned to be a little cautious of the newest things being pushed out by the Microsoft marketing machine.  However, after working through very real world challenges using Office 365 and Azure, I think all the hubbub is well deserved.

In the on-boarding application, our customer had some very typical requirements that most companies might have including:

  • Public facing internet site where anyone can register and apply for a job
  • Highly complex and very business specific onboarding process
  • Simple and familiar UI for managing applicants from hiring through termination
  • Routing to different queues based on an applicant’s current state
  • Need to support user based events and time based events
  • Need to integrate with external third-party systems during onboarding

In addition to the typical requirements, our customer had some more unique but not uncommon challenges to deal with:

  • Very small IT staff responsible for supporting many users and technologies
  • Seasonal hiring model with large peaks and then periods of low activity
  • Need to manage seasonal employees individually or in batches
  • Desire to very easily change the workflow process over time

Obviously there are some “off the shelf” products that can handle some of these requirements, but many times they are not as customizable as desired.  Other times they can cost more than a custom solution due to licensing costs over time.  Finally, they may not integrate out of the box with Office 365 where many employees are already doing work.

Since our customer was already using Office 365 for collaboration and Azure for other applications, a provider hosted app was a natural fit for them.  It allowed for the flexibility to satisfy all their requirements, provided the services to be an enabler for their IT staff, and offered UI familiarity for their internal staff.  In the white paper The New Business Operating System: Combining Office 365 and the Microsoft Cloud Ecosystem to Create Business Value I think Pete Skelly summarized it best in the sentence “The New Business Operating System (NBOS) enables businesses to leverage commodity based IT Services and enables customizations to enhance business value.”

That sentence really says so much; I thought I would break it down into some specific reasons and examples on how the New Business Operating System helped our customer with their solution and made me a fan.

Five Reasons Office 365 Added Value to Our Solution

  1. Do More with Less – With a small IT staff, using Office 365 is really a no brainer. The Software as a Service (SaaS) model means no infrastructure to manage and no software to install/maintain.
  2. Familiar User Interface – For internal users, Office 365 made sense as the UI since it is already familiar from other activities such as document management and collaboration.
  3. Ease of Management – Built-in support for things like user management, security, lists and views means the small IT staff can easily manage the user experience for internal users just like they do for other functionality in Office 365.
  4. Workflow Engine – With a workflow engine available in Office 365, time based events like status changes and notifications after a specific period of time are easy.
  5. Ease of Customization – UI elements such as pages and the ribbon in Office 365 can be easily customized to integrate custom functionality while still looking like it’s built into the platform.

Five Reasons Microsoft Azure Added Value to Our Solution

  1. Enterprise Level Management and Support – Again, with a small IT staff, using a Platform as a Service (PaaS) makes a lot of sense from a management standpoint. In Azure, backup and recovery, monitoring and scalability are easily managed without needing extra tools or software.
  2. Flexible Compute Power – The nature of a seasonal hiring solution is also a perfect fit for Azure because you can easily deal with “Predictable Bursting” (The New Business Operating System: Combining Office 365 and the Microsoft Cloud Ecosystem to Create Business Value) by increasing resources during peak periods and reducing during low periods. If you had to deal with this on your own infrastructure, it would likely mean lots of idle hardware much of the time.
  3. Technology Choice – Since Azure is a platform to host applications, it provides the flexibility to develop an application exactly as envisioned by a customer, and in technologies that are known and comfortable. In our solution, we stayed within the Microsoft stack including ASP.NET MVC, Entity Framework, and several JavaScript frameworks, but you aren’t limited to these at all.
  4. Ease of Deployment – In Azure, it is also extremely easy to package and deploy the application. This means developers can spend their time on business functionality and creating business value. Our application could be deployed to an environment like QA and UAT in just a few clicks.
  5. Robust Platform – Azure isn’t just for web apps and databases either. It’s a much larger platform that supports things like a service bus where an application can use queues and messages to make the application more robust, scalable and responsive even during unexpected peaks.

As you can probably tell, I’ve become a fan of creating solutions using Microsoft Azure and Office 365 as the platform and services to build upon.  I look at how our customer’s resources are now enhancing the application and managing thousands of applicants with just 2 people working on the application only part of the time.  In a traditional on-premises application, there is no way I can imagine it would be possible for just 2 people to support the same application and be able to continue to add business functionality.




Join ThreeWill at the Atlanta SharePoint User Group Meeting on March 16th

Danny serves as Vice President of Marketing at ThreeWill. His primary responsibilities are to make sure that we are building partnerships with the right clients and getting out the message about how we can help clients.

Session Information

Topic: Hybrid SharePoint Security – Providing Seamless Access Using Azure AD, ADFS and Active Directory

Today’s complex business environments present IT organizations with many challenges to provide internal users, external users and partners access to enterprise resources seamlessly and securely. Today’s enterprises want to take advantage of the cloud and all of the flexibility and benefits cloud computing has to offer. But how can you integrate on premises and cloud resources, providing all your users and partners seamless access across environments and devices while remaining secure? This presentation describes and demonstrates the integration of Azure AD, ADFS and an on premises Active Directory to enable internal users to maintain their typical authentication experience, allow seamless access to corporate resources for mobile users, and enable external partner access to resources.

Specifically, we’ll discuss a real world Microsoft Azure based IaaS SharePoint 2013 implementation which enables secure internal and external access to SharePoint content, including video for iPhone and iPad, for all internal and external users. We’ll also demo a development environment which demonstrates the solution.

Speaker – Lane Goolsby

Lane is a Senior Software Engineer with ThreeWill. He is a strong technology expert with a focus on programming, network and hardware design, and requirements and capacity planning. He has an exceptional combination of technical and communication skills with a strong emphasis on troubleshooting, problem determination, customer relations, and identifying quick paths to resolutions. Lane is a strong mediator between programmers, end users, and clients. His experience in design, installation, implementation, maintenance, support, and client/end user training of various software and hardware packages is put to use in everything from large-scale enterprise applications to small deployment limited traffic web sites.

Register Now