Creating an Alexa Skill in C# – Step 3

Now that you’ve defined your Alexa Skill in Step 1, and you’ve configured the security and downloaded the sample in Step 2, we’re ready to take a look at the sample and see how to wire it up to the skill definition.

Open your copy of the sample solution in Visual Studio 2017.  I left my copy in this folder:


At this point, I had a decision to make:  Do I start a new project and copy the stuff from the sample project, or do I just mess with the sample project?  I think I’ll actually create a new project, and leave the sample in pristine condition for comparison.

Creating the Alexa Skill Project from the Sample

First, create a new project in the solution, choosing AWS Lambda Project with Tests as the template:


We want an Empty Function:


Now that we have our own project, let’s check out the dependencies:


As of this writing, the NuGet package for Amazon.Lambda.Serialization.Json needs updating, so I did that.  Also, the sample targets .NET Core 1.0, while my project defaulted to .NET Core 2.0.  I don’t expect this to be a problem.

Next, I might as well copy the code (both classes) and the AlexaAPI folder from the sample to my project.  This will at least give me a starting point.

Coding for the Alexa Intents

Now that the project is set up, we can start coding for the Intents in the Alexa Skill we created in Step 1.   To start off, we will crack open the Function class in our new project and locate the ProcessIntentRequest() function.  There’s a switch statement there that will handle our simplest Intent Request, which was GetTodaysDateIntent, like so:

switch (intentRequest.Intent.Name)
{
    case "GetTodaysDateIntent":
        innerResponse = new SsmlOutputSpeech();
        // Tip: DateTime.Today.ToString("D") gives the long date pattern
        // ("Monday, May 7, 2018"), which reads more naturally than the
        // default ToString(), which includes the time.
        (innerResponse as SsmlOutputSpeech).Ssml = $"Today's date is {DateTime.Today.ToString()}";
        break;
    case "GetNewFactIntent":
        innerResponse = new SsmlOutputSpeech();
        (innerResponse as SsmlOutputSpeech).Ssml = GetNewFact(factdata, true);
        break;
}

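When we get to the other Intents from Step 1, they will follow the same pattern.  For example, a case for the DataLookupIntent might look roughly like this (a sketch: SsmlOutputSpeech comes from the sample’s AlexaAPI folder, but LookupData is a hypothetical helper you would write yourself):

```csharp
// Hypothetical sketch: handling the DataLookupIntent defined in Step 1.
// LookupData is NOT part of the sample; it stands in for whatever code
// reads the slot values and fetches the matching data.
case "DataLookupIntent":
    innerResponse = new SsmlOutputSpeech();
    (innerResponse as SsmlOutputSpeech).Ssml = LookupData(intentRequest.Intent.Slots);
    break;
```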
At this point, you’ll probably want to do a build and deployment to AWS.  First, make sure your project builds.

Creating an AWS Profile

Next, you need to set up a Publish Profile in AWS.  This is done by viewing the AWS Explorer on the left side of your VS screen and clicking the New Account Profile icon.


That will bring up the New Account Profile dialog.


Click the link near the bottom of the dialog to go to the documentation on how to add a user, then follow one of the links there that open the Console to IAM.

Next, select the Users category on the left.  Then create a Deployment User that has programmatic access and is in the Developers group we created in Step 2.


When you click Create User, you will see the Success screen.  Be sure to click the Download .csv button (middle-left) to get the credentials file.  You can never get this file again; if you lose it, you will have to create a new Access Key and download a new credentials file.


Now, back in Visual Studio, click the Import from csv file button:


Give your profile a name; I called mine Deployment.  The Account Number is optional and can be left blank.  Then click OK, and you will see the AWS Explorer switch to the Deployment profile.  You can then select the default profile and delete it.


Publishing Your AWS Lambda Project

You are now ready to publish your AWS Lambda project.  Right-click on the project and select Publish to AWS Lambda….


If you don’t see this menu option, make sure you right-clicked the Lambda project, not the Test project.  Also make sure you have installed the AWS Toolkit as described in Step 2.

When you see the publish dialog, give your function a name:


Then click Next and pick the Role we defined in Step 2.


Then click Upload.  After a few moments, your Lambda function will be uploaded.

Uh-Oh.  Duplicate Compile Items

If you get the Duplicate Compile items error:

Duplicate Compile items were included. The .NET SDK includes Compile items from your project directory by default. You can either remove these items from your project file, or set the ‘EnableDefaultCompileItems’ property to ‘false’ if you want to explicitly include them in your project file.

The recommendation is to remove the Compile items from your project (*.csproj) file.  Right-click the project and choose to Edit your .csproj file:


Then delete all the *.cs files from the Compile section:

I also deleted the ItemGroups above that because they had Compile items in them too.  Now my project builds and deploys properly, and my .csproj file is much cleaner:
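For reference, here is roughly what the cleaned-up .csproj looks like once the .NET SDK’s default globbing is picking up the .cs files again (a sketch; your target framework and package versions will differ):

```xml
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netcoreapp2.0</TargetFramework>
  </PropertyGroup>
  <ItemGroup>
    <!-- Illustrative versions; use whatever NuGet restored for you -->
    <PackageReference Include="Amazon.Lambda.Core" Version="1.0.0" />
    <PackageReference Include="Amazon.Lambda.Serialization.Json" Version="1.3.0" />
  </ItemGroup>
</Project>
```

Note that the error message also offers the opposite fix: keep your explicit Compile items and set the EnableDefaultCompileItems property to false.  Deleting the items and letting the SDK glob the files is the simpler option.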


Now publish the fixed project to AWS.

Back on Track: Wiring Up the Skill and the Lambda Function

Alright, we have a Skill defined.  We have a Lambda function uploaded.  Now we need to tell them each about the other.  First, we’ll tell the Skill about the Lambda function.

Go to the Lambda functions list in the Lambda Management Console.  Click on your new Lambda function, which I named MyAlexaSkillsLambdaFunction.  In the top right corner of the screen you will see the ARN:


Copy the ARN to the clipboard and paste it into Notepad.  Now, switch over to the Alexa Skill Console.  Select your skill.  Click on the Build tab.  Click on the big “4. Endpoint >” button in the checklist on the right.


Paste the ARN from the Lambda Function into the Default Region box on the right.


Then, copy the Skill ID above it to the clipboard and paste that into Notepad too.

Switch back to the Lambda Management Console.  Under the Add Triggers list on the left, find the Alexa Skills Kit and drag it over to the Triggers area of your Lambda function:


Then scroll down to configure the trigger and paste in the Skill ID you copied from the Skill and click the Add button:


Lastly, in the top right corner of the Lambda screen, click the Save button:


And that’s that.  The Alexa Skill is defined, the Lambda function is defined.  The two know about each other.  You are ready to test your skill.

Testing the Alexa Skill

To test your Alexa Skill and make sure everything is wired up properly, click on the Test tab in the Alexa Console.   Then enable testing for this skill.


In the Alexa Simulator panel, in the text entry box, run your Skill:


Alexa should respond with the Launch Message from your skill (which we will modify in a later Step in this series):


Now, test your Skill’s Intent:


Alexa should tell you the date:


Tada!  That’s it for Step 3.

Check out Step 3a to see how to Unit Test your Alexa Skill Intent without publishing anything to AWS.

In Step 4, we will look at setting up a Web Service project to feed data from an Azure SQL database to our Alexa Skill.


Creating an Alexa Skill in C# – Step 2

If you haven’t done so already, check out Step 1 to define your Alexa Skill.

After defining the Alexa Skill in Step 1, you are ready to set up Visual Studio in Step 2.

AWS Account and AWS Toolkit Extension

Before you can set up an Alexa project, you need to create an AWS Lambda project.  Lambda functions are just class libraries that are hosted in the AWS Lambda cloud service.   To create one, you need two things:  the Amazon AWS Toolkit Extension and an AWS Developer Account.  You can install the Extension in VS 2017.


You can create the AWS Developer Account on the AWS website.

With the AWS Toolkit Extension installed and the Developer Account set up, you are ready to set up AWS security.  You do that in the IAM Management Console:

Securing Your AWS Account and Lambda Function

In the IAM Management Console, click on Groups on the left.  Create a group for your Developers that has Admin Access.  Create a group for your Apps that has AWSLambdaFullAccess.

Then, click on Users on the left and create two accounts.  One for yourself, of type AWS Management Console access, assigned to the Developers group.  And, one for your app, of type Programmatic access, assigned to the Apps group:



Next, create an AWS Lambda Role by selecting Roles on the left side.  Then make the following selections:




With the security Groups, Users and Role configured, you are ready to create the Solution for your Alexa Skill.  The easiest way to do this is to use one of the sample projects in the Alexa Git repository.

Getting the Alexa Skill Sample Solution

Navigate to the Alexa GitHub repository in your browser to see what’s available.

I used the alexa/skill-sample-csharp-fact sample as my starting point.  It has a ton of code in it that is ready to go; I just had to add something specific to my skill and I was off to the races.  You can get the sample from the command line (Start | Run | Cmd).  Make the directory/folder you want to host the project in with the md command:

C:>  md \_GitHub
C:>  md \_GitHub\Alexa
C:>  cd \_GitHub\Alexa

And then type:

git clone

You will find the C# Solution file here:


Copy the sample solution to your own folder:

C:\_GitHub\Alexa>  md MyAlexaSkill
C:\_GitHub\Alexa>  cd skill-sample-csharp-fact\lambda\custom
    xcopy *.* \_GitHub\Alexa\MyAlexaSkill\*.* /S

Open the solution in VS 2017.  (Be sure to update to the latest version of VS — 15.7 as of this writing — as it has some cool new features!)

In Step 3, we will look at the sample solution and start to modify it to work with the skill we defined in Step 1.


Creating an Alexa Skill in C# – Step 1

Recently, I decided to make an Alexa skill that I could play on my boss’s Alexa.  At first, I was doing it as a gag, but I figured that wouldn’t work as he has to actively install my skill onto his Alexa.  Now it reads some stats from one of our Azure databases and publishes those in a conversation.  Here’s how I built it.

Step 1:  Creating the Alexa Skill Model

I don’t know any of the other languages that you can write a skill in, so I chose to write it in C#.  That means getting a bunch of bits and loading them into Visual Studio.  But first, you’ll need to start the process of creating a skill.  At the time of this writing, these are the steps I took.

  1. Start Here.
  2. Click the Start a Skill button.
  3. If you don’t have an Amazon Developer account, create one.
  4. Eventually, you’ll get to the Alexa Skills Developers Console where you can click the Create Skill button.
  5. Give your skill a name.  I’m calling mine:  Simon’s Example Skill.
  6. On the Choose a Model screen, select Custom.
  7. Then click Create Skill.
  8. You should now be on the Build tab for your Skill.  Notice the tree view on the left and the checklist on the right.

Invocation Name

The Invocation Name is the phrase an Alexa user will say to launch/run/start your Alexa Skill.  It could be “joe’s hot dog recipes” or some such.  It needs to be lower case; there are some restrictions on the characters you can use; and any abbreviations you use need periods between the letters so Alexa knows to read the letters and not pronounce the word.  Read the Invocation Name Requirements for more details.

Click the Invocation Name link in the tree view on the left or the first option in the checklist on the right.  Then give your skill an Invocation Name.  I named mine:  “simon’s example”.


Intents

The Intents are the various functions your skill can perform.  For example, stating today’s date, looking up some data, figuring something out, etc.  Let’s do all three.

First, my skill is going to provide today’s date, so I’m going to name my first Intent, “GetTodaysDateIntent”.  My skill is also going to look up some data in an Azure SQL Database, so I’m going to name my second Intent, “DataLookupIntent”.  Lastly, I want to figure something out, like the average temperature in a major US city, so I’m going to name my third Intent, “GetAverageTemperatureIntent”.


Utterances

The Utterances are the phrases an Alexa user might say to trigger your Intent (function).  You should put in several Utterances, using synonyms and different phrasings, so Alexa has a better chance of triggering your Intent instead of responding with “I don’t know how to do that.”

For the GetTodaysDateIntent, I added the following utterances:



Within an Utterance, you can have Slots (placeholders) that represent the multiple options for that part of the phrase.  For example, if my table has the count of animals per household per neighborhood per county in it, I might want to create slots for Animal Type, Household Size, Neighborhood Name, and County Name.  You can do this by typing a left brace { in the Utterance box.
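For example, Utterances with Slots look like this (the phrasings here are illustrative, not the exact ones from my skill):

```
how many {AnimalType} live in {CountyName} county
what is the {AnimalType} count for a {HouseholdSize} household in {NeighborhoodName}
```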


Here are three of my sample utterances for the DataLookupIntent:


Once you have created a slot, you need to populate it with options.  You do this in the bottom half of the Utterance screen.


You can easily select one of the pre-defined Slot Types in the drop-down.  In my case, Amazon has a list of Animals, so I’ll pick AMAZON.Animal in the first slot.

I need to manually add a few Counties for the second slot though.  And at this time, you don’t want to click Edit Dialog (though it’s tempting).  Instead, you want to define your own Slot Type by clicking the Add (Plus) button next to Slot Type in the tree view on the left:


For example, here is the custom Slot Type for County Name:


Notice the synonyms column.  This is important because phrases like “a 1 person household” and “a single person household” mean the same thing, so be certain to add any synonyms you can think of.  Here is my custom Slot Type for Household Size; notice the synonyms off to the right:


Now that you’ve defined some custom Slot Types, you can click on the Slot names under the Intent in the tree view on the left and select the newly created Slot Type for each Slot.
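Behind the scenes, the console stores the Intents, Utterances, Slots, and Slot Types in a single JSON interaction model, which you can inspect in the JSON Editor entry of the tree view.  A rough sketch of its shape, using the names above (sample phrase and slot values are illustrative):

```json
{
  "interactionModel": {
    "languageModel": {
      "invocationName": "simon's example",
      "intents": [
        {
          "name": "DataLookupIntent",
          "slots": [
            { "name": "AnimalType", "type": "AMAZON.Animal" },
            { "name": "CountyName", "type": "CountyName" }
          ],
          "samples": [ "how many {AnimalType} live in {CountyName} county" ]
        }
      ],
      "types": [
        {
          "name": "CountyName",
          "values": [
            { "name": { "value": "Mecklenburg", "synonyms": [ "mecklenburg county" ] } }
          ]
        }
      ]
    }
  }
}
```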


For the GetAverageTemperatureIntent, I added one Utterance:


And configured the {City} slot as follows:


Finally, you can Save your Model and Build it by clicking the buttons at the top of the screen:


Hopefully, the model will build and we are ready to move on to Step 2.  If the model doesn’t build, check the bottom right of the screen for a list of the errors:


Fix the errors until the model builds:


Then go to Step 2.





Connecting to IBM DB2 z/OS from Azure Data Factory v2

Connecting to IBM DB2 z/OS from Azure Data Factory v1 was a matter of setting up the Azure Data Gateway on an on-prem server that had the IBM DB2 client installed, and creating an ODBC connection to DB2 (I called it DB2Test).  Then, in the Data Factory v1 Copy Wizard, select the ODBC source, pick the Gateway, and enter the phrase DSN=DB2Test into the Connection String.  This worked for us.

Azure Data Factory v2

First, the Azure Data Gateway is now called the Self-Hosted Integration Runtime.  So download and install the IR client on your on-prem gateway machine.  On my machine, it auto-configured to use the existing Data Factory Gateway configuration, which is NOT what I wanted.  After uninstalling and reinstalling the IR client a couple of times, it stopped auto-configuring and asked me for a key.  To get the key, I had our Azure Dev configuration guy run the following PowerShell:

Import-Module AzureRM
$dataFactoryName = "myDataFactoryv2NoSSIS"
$resourceGroupName = "myResourceGroup"
$selfHostedIntegrationRuntimeName = "mySelfHostedIntegrationRuntime"
Set-AzureRmDataFactoryV2IntegrationRuntime -ResourceGroupName $resourceGroupName -DataFactoryName $dataFactoryName -Name $selfHostedIntegrationRuntimeName -Type SelfHosted -Description "selfhosted IR description"
Get-AzureRmDataFactoryV2IntegrationRuntimeKey -ResourceGroupName $resourceGroupName -DataFactoryName $dataFactoryName -Name $selfHostedIntegrationRuntimeName

I then pasted the Key into the Integration Runtime Configuration screen, and it connected properly to myDataFactoryv2NoSSIS.  Tada:


Next is to test the connection to DB2.  I went to the Diagnostics tab and entered the DSN and credentials, just like I did for Data Factory v1:

Failed to connect to the database. Error message: ERROR [HY010] [IBM][CLI Driver] CLI0125E Function sequence error. SQLSTATE=HY010

Dang! Much googling later, I found this obscure note.

I added the phrase “Autocommit=Off” to the DSN in the connection string, and voila, the connection worked.  So my final diagnostic looked like this:
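In other words, the connection string in the Diagnostics tab ended up looking like this (substitute your own DSN name):

```
DSN=DB2Test;Autocommit=Off
```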




SQL Saturday Charlotte 2017

I am excited to announce I am finally ready to present my advanced ETL Framework with Biml seminar at SQL Saturday Charlotte on Oct 14.  And I have a 10AM slot!!!   Wooo!


Implementing a SSIS Framework and enforcing SSIS Patterns (with Biml).

(Using Biml to Automate the Implementation of a SSIS Framework)

Let’s use Biml to automate the implementation of a standard SSIS Framework. I.e. Logging, error handling, etc. Business Intelligence Markup Language (Biml) is great at automating the creation of SSIS packages. With Biml we can generate a template package that implements a standard SSIS framework. In this fast-paced session, we will create the tables and load some metadata, write the Biml to implement logging and error handling, and generate some packages that implement our standard framework. In this session, you don’t need to know Biml, but some familiarity with XML, TSQL or C# will help. By the end of this session, you will know how to use Biml to automatically generate packages that implement a simple SSIS Framework with Audit Logging and Error Handling.
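To give a flavor of what the session covers: a Biml file is just XML describing connections and packages, which the Biml engine compiles into real SSIS packages.  A minimal sketch (the connection string, package name, and logging procedure are made up for illustration):

```xml
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
  <Connections>
    <!-- Hypothetical connection; point this at your own database -->
    <OleDbConnection Name="Target"
        ConnectionString="Provider=SQLNCLI11;Data Source=.;Initial Catalog=Staging;Integrated Security=SSPI;" />
  </Connections>
  <Packages>
    <Package Name="LoadCustomer" ConstraintMode="Linear">
      <Tasks>
        <!-- The framework's audit-logging step; dbo.LogPackageStart is made up -->
        <ExecuteSQL Name="SQL Log Package Start" ConnectionName="Target">
          <DirectInput>EXEC dbo.LogPackageStart</DirectInput>
        </ExecuteSQL>
      </Tasks>
    </Package>
  </Packages>
</Biml>
```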

BMI KeyNotes – Toastmasters Speech on Biml

As practice for delivering a 50-minute seminar to my fellow ETL Developers at BMI, I presented Speech 5 from the Toastmasters Technical Presentations manual.  It was a whopping 12-15 minute speech and I didn’t go over time!!!   Yay!!!  (I normally blow through the red card and end up a minute or two over my time before the Toastmaster yanks me off the podium.)

I think the speech went well.  I only got off script a couple of times.  (But at least I didn’t read it verbatim, which I too often do.)  I lost some people during the demonstration portion of the speech, but that is where I cut a 40-minute demonstration down to 4 minutes, so I knew it was going to be too fast and too technical.  I do hope that most of my non-technical audience at least got a glimpse into the world of an ETL developer, even if they didn’t understand it fully.

Please use the comments section below to ask questions and provide feedback.  I welcome comments, questions, and constructive criticism about the speech and the content.  Thank you.

Below are links to the PowerPoint slide decks I used, the speech I delivered and the support files for the demonstration I gave.

How to move a ton of data from the Mainframe to the Cloud – PowerPoint

How to move a ton of data from the Mainframe to the Cloud – Speech

Using Biml to Automate the Generation of SSIS Packages – PowerPoint

Intro to Biml Speech – Support Files

SQL Saturday Atlanta!

I am excited to announce that I will be giving my Biml seminar in Atlanta in July!!! Here’s the abstract for my session:

SQL Saturday 652

How to move a ton of data from the Mainframe to the Cloud (with Biml).

So, you need to move data from 75 tables on the mainframe to new tables in SQL Azure. Do you: a) hand code one package to load all 75 tables, b) hand code 75 packages that move a table each, or c) wish there was a better way?
There is! Business Intelligence Markup Language (Biml) can automate the creation of packages so that they all follow the same script. In this session, we will create some simple metadata to be able to generate multiple packages and their associated connection managers. You will see Biml in action. You will see the XML that Biml uses to define packages and connections. You will see the C# code that Biml uses to fetch metadata and dynamically generate packages in SSIS. And you will see packages and connection managers generated from Biml before your very eyes.