Building Connected Things with Node.js, Johnny-Five, and Microsoft Azure

Storing and Displaying Data

In this lab you will build a data pipeline that captures the data coming into IoT Hub, processes it with Azure Stream Analytics, then routes it to downstream services. The first service you'll build will consume the data and present it through an Azure Web App where it is rendered as a real-time graph.


Bill of Materials

For this lab you will need:

  1. An Azure IoT Hub (created in the preceding lab).
  2. The ThingLabs Thingy™ (created in an earlier lab).

We will use the light data you are sending from your Thingy and deploy a web application to Azure that will render the data so you can visualize it even if you don’t have access to Power BI.

In an earlier lab you provisioned an Azure IoT Hub, and in the previous lab you built a physical device, the ThingLabs Thingy™. You coded an application to collect data from the device and send it to your IoT Hub. At the end of the previous lab you had data going into your IoT Hub, but you weren’t yet doing anything with it. Let’s change that.

Using Stream Analytics to Process and Route IoT Data

Azure Stream Analytics is a service that does real-time data processing in the cloud. You will create a new Stream Analytics job and define the input data stream as the data coming from your IoT Hub. Next you will define an output data stream that sends data to an Event Hub (and optionally to PowerBI). Finally, you will write a SQL-like query that collects data coming in on the input stream and routes it to the output stream(s).

Create an Event Hub

First, you need to create an Event Hub to queue the data coming out of the Azure Stream Analytics job. Open a new browser tab and navigate to https://manage.windowsazure.com. Click on the NEW icon in the lower-left corner.

Windows Azure Portal v1

  1. Select + > APP SERVICES > SERVICE BUS > EVENT HUB > QUICK CREATE and enter the following:
    • EVENT HUB NAME: You can use anything that is a valid name here, such as thinglabs-eventhub-[yourname, initials, etc]
    • REGION: Select the same region you created your IoT Hub in (if you created your IoT Hub in East US, select East US 2).
    • SUBSCRIPTION: Select the subscription you’ve created your resources in.
    • NAMESPACE: You can use anything that is a valid name here, such as thinglabs-eh-[yourname, initials, etc]

Configure Event Hub

Create the Stream Analytics Job

Next, create the Stream Analytics job. Open a new browser tab and navigate to https://manage.windowsazure.com. Log in if necessary. Click on the NEW icon in the lower-left corner.

  1. Select DATA SERVICES > STREAM ANALYTICS > QUICK CREATE and enter the following:
    • JOB NAME: You can use anything you’d like here, such as iotlab or something similar, so you can identify it easily later.
    • REGION: Select the same region you created your IoT Hub in (if you created your IoT Hub in East US, select East US 2).
    • REGIONAL MONITORING STORAGE ACCOUNT: Select or create a storage account.

    Defining a new Stream Analytics job

  2. Click CREATE STREAM ANALYTICS JOB. It will take a few minutes for the Stream Analytics job to be created and become available.

Creating a new Stream Analytics job

When the job indicates that it is created, click into it to create the data streams and query. Once you are in the Stream Analytics job you will need to define the job input, query, and output.

Define the Input Data Stream

The data will come in as a data stream from the Event Hub that was automatically created when you created the Azure IoT Hub.

  1. Click on the INPUTS header.

    Create the input

  2. Click on ADD AN INPUT.
  3. Select Data stream and click on the forward arrow in the lower-right.
  4. Select IoT Hub and click on the forward arrow in the lower-right.
  5. Complete the form as follows:
    • INPUT ALIAS - DeviceInputStream
    • SUBSCRIPTION - choose your subscription
    • CHOOSE AN IOT HUB - choose the IoT Hub you created earlier
    • IOT HUB SHARED ACCESS POLICY NAME - leave this as the default, which should be iothubowner
    • IOT HUB CONSUMER GROUP - enter the default consumer group name: $Default
  6. Click on the forward arrow in the lower-right.
  7. On the Serialization settings form, leave the defaults (Event Serialization Format: JSON and Encoding: UTF8) and click on the checkmark in the lower-right.

Stream Analytics input definition

After a few seconds, a new input will be listed.
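
For reference, the query you will write below expects each message arriving on this input stream to contain sensorType, sensorValue, location, and deviceId fields. The exact payload comes from the code you wrote in the previous lab; the following is only a minimal Node.js sketch of that shape, with placeholder values, so you can see how the fields map to the query:

'use strict';

// A minimal sketch of the telemetry payload the Stream Analytics query expects.
// The field names match the query below; the values here are placeholders only.
const payload = {
    deviceId: 'my-thingy',      // the device ID you registered with your IoT Hub
    location: 'home-office',    // whatever location label your app sends
    sensorType: 'darkness',     // the query filters on this value
    sensorValue: 512            // a reading from the photoresistor
};

// In the previous lab a payload like this was serialized with JSON.stringify()
// and sent to IoT Hub with the Azure IoT device SDK (client.sendEvent(...)).
console.log(JSON.stringify(payload));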

Define Output Data Streams

Before defining the query that will select data from the input and send it to the outputs, you need to define the outputs. For this lab you will output the results of the query to an Azure Web App dashboard (which you’ll create later) and/or to a Power BI dataset for reporting.

You will create two outputs: one for data to flow to the Event Hub and a second for data to flow to Power BI. For the Event Hub output:

  1. Click on the OUTPUTS header.

    Create the output

  2. Click on ADD AN OUTPUT.
  3. Select Event Hub and click on the forward arrow in the lower-right.

    Event Hub Output

  4. Configure the EventHub

    Event Hub Configuration

    1. Give it a unique name
    2. Select “Use Event Hub from Current Subscription”
    3. From the drop-down, select your previously created Event Hub
    4. Click the right arrow to go to the “Serialization Settings”
    5. Accept the defaults (JSON, UTF8, Line Separated)
  5. Click the checkmark in the lower-right to create the Event Hub output
  6. Start the Event Hub by clicking the Start triangle at the bottom of the window

Write the Query

In the query, you want to select data from the input stream and put it into the output stream. With data like darkness you can do interesting things, such as applying operations on the data as you query it. For this example, you will write a query that selects from the input stream and sends the minimum, maximum, and average darkness values across all devices to the output stream, grouping the data by device ID and location. Using a TumblingWindow, you will send data to the output stream in rolling 5-second increments.

  1. Click on the QUERY header.

    Create the query

  2. Write the following query:

WITH ProcessedData as (
    SELECT
        -- Aggregate the light-sensor readings over each window
        MAX(sensorValue) MaxDarkness,
        MIN(sensorValue) MinDarkness,
        AVG(sensorValue) AvgDarkness,
        location,
        deviceId,
        System.Timestamp AS Timestamp
    FROM
        [DeviceInputStream]
    WHERE
        sensorType = 'darkness'
    GROUP BY
        TumblingWindow (second, 5), deviceId, location
)

-- Make sure this matches your Event Hub output name from above.
-- If you've forgotten it, you can go back and get it in another browser tab.
SELECT * INTO [ThingLabsEHOutput] FROM ProcessedData

  3. Click SAVE in the lower middle of the screen.
  4. Once the query is saved, click START to start the Stream Analytics job.

If your app from the previous lab isn’t still running, go ahead and start it up. It will take a few minutes for the Stream Analytics job to start and begin sending data to the Event Hub (and, if you configured a Power BI output, you should see MyIoTDataSet show up in Power BI within a few minutes). Remember, the TumblingWindow is set to 5 seconds, so the outputs will only update every 5 seconds.
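
For reference, with the JSON, line-separated serialization you chose for the output, each event written to the Event Hub is a single JSON object per device, per location, per 5-second window. Given the query above, one event would look roughly like this (the values are illustrative, and field-name casing can vary with the job’s settings):

{
    "MaxDarkness": 712,
    "MinDarkness": 680,
    "AvgDarkness": 695.5,
    "location": "home-office",
    "deviceId": "my-thingy",
    "Timestamp": "2016-01-01T00:00:05.000Z"
}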

Create an Azure Web App that Shows Event Hub Data


You need to create a Web App to visualize the data coming out of the Event Hub; a sketch of how such an app might consume that data appears at the end of this section. Click on the NEW icon in the lower-left corner.

Windows Azure Portal v1

  1. Select + > COMPUTE > WEB APP > QUICK CREATE
    • URL: You can use anything that is a valid name here, such as thinglabs-eventhub-[yourname, initials, etc]
    • APP SERVICE PLAN: Select “Create a new App Service Plan”.
    • REGION: Select the same region you created your IoT Hub in (if you created your IoT Hub in East US, select East US 2).

    Configure Event Hub

  2. Setup deployment from github:
    • On the web app configuration dashboard, click “Set up deployment from source control.” (on the lower right-hand side of the page)

    Configure Event Hub

    • Select “External repository” from the dialog

    (Note: You’d think this should be “GitHub repository”, but we’re avoiding having you fork and maintain your own version of the code for today. If you want to modify it later, you can fork the repository and modify it. Then you would have to update your deployment source to “GitHub Repository”, which finds your repositories.)

    External Repository

    Git repo url

    • Paste the Git repository URL into the dialog on the configuration page (the External Repository page)

    Git repo url

    • Click the checkmark in the lower-right to set up your deployment from the ThingLabs GitHub repository
    • Click on “Configure” to customize settings for your Web App.
      • Turn Web Sockets on

      Web App Enable Web Sockets

      • Add an App Setting (Key: THINGLABS_EVENTHUB_CONNSTRING, Value: the connection string from your Event Hub)

      Web App App Settings

      • Use the connection string from your Event Hub

      Event Hub Connection String

      • Save the changes
    • Restart the Web Application
  3. Browse to your site
    • It will take a few minutes for data to flow, but you should start to see a graph render as data is received.

    Thing Labs Web Dashboard - Web App
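
For a sense of what the deployed dashboard is doing behind the scenes, the sketch below is a minimal illustration, not the actual ThingLabs code: assuming the @azure/event-hubs and ws npm packages, it reads the THINGLABS_EVENTHUB_CONNSTRING app setting you configured above, subscribes to the Event Hub, and pushes each event to connected browsers over a WebSocket connection (which is why Web Sockets had to be enabled on the Web App):

'use strict';

// Minimal illustration only -- not the deployed ThingLabs dashboard code.
// Assumes: npm install @azure/event-hubs ws
const { EventHubConsumerClient } = require('@azure/event-hubs');
const WebSocket = require('ws');

// The connection string comes from the THINGLABS_EVENTHUB_CONNSTRING App Setting.
// If your connection string does not include the Event Hub name (EntityPath),
// pass the name as a third argument to the constructor.
const connectionString = process.env.THINGLABS_EVENTHUB_CONNSTRING;
const consumer = new EventHubConsumerClient('$Default', connectionString);

// Browsers connect here; every event read from the Event Hub is broadcast to them.
const wss = new WebSocket.Server({ port: process.env.PORT || 3000 });

consumer.subscribe({
    processEvents: async (events) => {
        for (const event of events) {
            const json = JSON.stringify(event.body);   // the aggregated darkness values
            for (const socket of wss.clients) {
                if (socket.readyState === WebSocket.OPEN) {
                    socket.send(json);                 // the browser plots these points
                }
            }
        }
    },
    processError: async (err) => {
        console.error('Event Hub error:', err);
    }
});

A browser-side script would open a WebSocket back to this server and append each incoming point to the chart as it arrives.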

Conclusion

In this lab you learned how to create an Azure Stream Analytics job that queries data coming into Azure IoT Hub, processes it, and routes it to an Event Hub and, optionally, to Power BI.

Congratulations! In this hands-on workshop you experienced an IoT solution end-to-end. You built a Thing that both sent output (blinked an LED) and collected input (ambient light), and you used it to send data to Azure IoT Hub.

In the next lab you will modify the web application to include a capability to send a Cloud-to-Device (C2D) message.

Go to ‘Sending Cloud to Device (C2D) Messages’ ›