1. Service Description
Consider a hypothetical
company called SixFrogs Incorporated that offers a cloud-based
demand-response (Dem-Res) service directly to utility companies and,
through them, indirectly to consumers.
A demand-response system
is a popular pattern used by utility companies to curtail the
electricity load during peak usage, when the price of electricity is very
high. The cost savings are then passed on to consumers. The curtailment
is determined in real time based on peak usage, pricing, and several
other factors. In the interest of keeping the example conceptual,
assume that the load reduction depends on peak usage and pricing.
2. Processes for Curtailment
The process flow for the Dem-Res system between SixFrogs, utility companies, and consumers is as follows:
Multiple utility companies subscribe to SixFrogs' Dem-Res cloud system.
Consumers subscribe to the Dem-Res system through their utility company in return for a discount on their monthly electric bill.
Utility
companies install their hardware (gateway) in consumers' houses and/or
buildings and point those gateways to the Dem-Res system in the cloud.
Utility companies configure load curtailment for consumers for specific load devices (this example considers only HVAC).
Utility companies receive electric load-pricing information for particular periods of time. This pricing information is expressed in dollars per kWh; for example, a pricing entry may be represented as $16/kWh between 1:00pm and 3:00pm on Monday, August 24, 2009.
Utility companies pass this pricing information to the Dem-Res system.
The
Dem-Res system reads the pricing information and sends commands to the
applicable gateways in buildings and houses to automatically curtail
the load.
Periodically,
the gateways communicate the energy usage to the Dem-Res system. The
Dem-Res system checks the database for peak load and pricing
information and sends curtailment commands to the gateways if required.
Figure 1 illustrates the high-level process for the Dem-Res system.
3. Technical Architecture
This section discusses the
design of the Dem-Res system and its communication endpoints. The goal
is to map the process architecture to the system architecture in the
cloud representing Windows Azure components. From the earlier section
and your knowledge of Windows Azure so far, it should be clear that you
can build a complete Dem-Res system in Windows Azure. Figure 2 illustrates the technical architecture of the Dem-Res system.
As shown in Figure 2, the Dem-Res system consists of three core components: the Pricing and Gateway database in SQL Azure, the gateway listener worker role, and the gateway application.
The flow of information within the components is as follows:
The utility company periodically sends gateway and pricing information to the pricing database in the Dem-Res system.
Periodically, the gateway sends an energy usage value to the gateway listener via the AppFabric Service Bus.
The gateway listener worker role queries the pricing database to check if the energy value is more than the peak load value.
If
the gateway energy value is more than the peak load value, the gateway
listener sends a curtail command to the gateway application. The
gateway application in turn sends the control command to the
appropriate device (such as HVAC in this example).
3.1. Pricing and Gateway Database Design
The Pricing and Gateway
database is hosted in SQL Azure and is geo-located in the same region
as the gateway listener worker role to keep the communications within
the same data center. As the name suggests, the database consists of
pricing and gateway information.
I recommend that you design your
SQL Azure database in an on-premises SQL Server and then migrate the
database to SQL Azure. When SQL Azure fully supports SQL Server
Management Studio, you can work directly with the SQL Azure database.
The Pricing and Gateway database consists of four tables, as shown in Figure 3.
The Gateways table maintains
the list of gateways in the field along with their location, which is
referenced by LocationId from the PricingLocations table. The
PricingLocations table consists of address locations that are mapped as
pricing zones. Each zone has a price per kWh stored in the
PricingCalendar_kWh table. The PricingCalendar_kWh table is updated
periodically by the utility companies with the latest pricing. The
EnergyMeterValues table stores the kWh values that gateways send periodically to
the Dem-Res system.
The steps to create and test the Dem-Res database system are as follows:
Create a database named proazuredemres.
Create table-creation scripts.
Create stored-procedure scripts.
Upload sample data into the tables.
Create data synchronization for the PricingCalendar_kWh table.
3.1.1. Creating the proazuredemres Database
To create this database, follow these steps:
Open SQL Server Management Studio.
Connect to the SQL Azure master database as an administrator, as shown in earlier sections.
Execute the following query in the New Query window to create the proazuredemres database:
CREATE DATABASE proazuredemres;
Create a new login named demresadmin in the db_owner role for the proazuredemres database (a sketch of scripting this step appears after these steps).
Log
in to the proazuredemres database as the demresadmin user, and execute
the following query to test if the login was created successfully:
select * from sys.databases;
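The login-creation step above isn't shown as a script in this excerpt. If you prefer to script it, the following is a minimal C# sketch using ADO.NET; the server name, administrator login, and passwords are placeholders, and you can just as easily run the equivalent two batches directly from SQL Server Management Studio. In SQL Azure, CREATE LOGIN executes against the master database and CREATE USER against the target database, which is why the sketch uses two connections.
using (SqlConnection masterConn = new SqlConnection(
    "Server=yourserver.database.windows.net;Database=master;" +
    "User ID=youradmin@yourserver;Password=your-password;Encrypt=True"))
{
    masterConn.Open();
    using (SqlCommand command = masterConn.CreateCommand())
    {
        // In SQL Azure, CREATE LOGIN runs against the master database.
        command.CommandText = "CREATE LOGIN demresadmin WITH PASSWORD = 'Str0ng!Passw0rd'";
        command.ExecuteNonQuery();
    }
}

using (SqlConnection demresConn = new SqlConnection(
    "Server=yourserver.database.windows.net;Database=proazuredemres;" +
    "User ID=youradmin@yourserver;Password=your-password;Encrypt=True"))
{
    demresConn.Open();
    using (SqlCommand command = demresConn.CreateCommand())
    {
        // Map the login to a database user and add it to the db_owner role.
        command.CommandText =
            "CREATE USER demresadmin FROM LOGIN demresadmin; " +
            "EXEC sp_addrolemember 'db_owner', 'demresadmin';";
        command.ExecuteNonQuery();
    }
}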
3.1.2. Creating Database Tables
To create database tables, I
recommend that you first create the tables and other database objects
in your local SQL Server Express and then generate scripts to upload to
the SQL Azure database. In this section, you directly create SQL Server
objects in SQL Azure to keep the content relevant to SQL Azure only.
Listing 1 shows the script and schema to create the Dem-Res database tables.
Example 1. Dem-Res Table-Creation Script
CREATE TABLE [dbo].[PricingLocations](
    [LocationId] [varchar](50) NOT NULL PRIMARY KEY CLUSTERED,
    [Description] [varchar](100) NOT NULL);
GO
CREATE TABLE [dbo].[PricingCalendar_kWh](
    [PricingId] [int] IDENTITY(1,1) NOT NULL PRIMARY KEY CLUSTERED,
    [PricingStartDate] [datetime] NOT NULL,
    [PricingEndDate] [datetime] NOT NULL,
    [PricePerkWh] [float] NOT NULL,
    [LocationId] [varchar](50) NOT NULL);
GO
CREATE TABLE [dbo].[Gateways](
    [GatewayNumber] [int] IDENTITY(1,1) NOT NULL PRIMARY KEY CLUSTERED,
    [GatewayId] [varchar](50) NOT NULL,
    [LastCommunication] [datetime] NULL,
    [LocationId] [varchar](50) NOT NULL,
    [WebAddress] [varchar](100) NOT NULL);
GO
CREATE TABLE [dbo].[EnergyMeterValues](
    [RecordId] [int] IDENTITY(1,1) NOT NULL PRIMARY KEY CLUSTERED,
    [GatewayNumber] [int] NOT NULL,
    [GatewayId] [varchar](50) NOT NULL,
    [kWhValue] [float] NOT NULL,
    [kWhFieldRecordedTime] [datetime] NOT NULL,
    [kWhServerTime] [datetime] NOT NULL,
    [Cost] [money] NOT NULL);
GO
ALTER TABLE [dbo].[EnergyMeterValues] WITH CHECK
    ADD CONSTRAINT [FK_EnergyMeterValues_Gateways] FOREIGN KEY([GatewayNumber])
    REFERENCES [dbo].[Gateways] ([GatewayNumber])
GO
ALTER TABLE [dbo].[EnergyMeterValues] CHECK CONSTRAINT [FK_EnergyMeterValues_Gateways]
GO
ALTER TABLE [dbo].[Gateways] WITH CHECK
    ADD CONSTRAINT [FK_Gateways_PricingLocations] FOREIGN KEY([LocationId])
    REFERENCES [dbo].[PricingLocations] ([LocationId])
GO
ALTER TABLE [dbo].[Gateways] CHECK CONSTRAINT [FK_Gateways_PricingLocations]
GO
ALTER TABLE [dbo].[PricingCalendar_kWh] WITH CHECK
    ADD CONSTRAINT [FK_PricingCalendar_kWh_PricingLocations] FOREIGN KEY([LocationId])
    REFERENCES [dbo].[PricingLocations] ([LocationId])
GO
ALTER TABLE [dbo].[PricingCalendar_kWh] CHECK CONSTRAINT [FK_PricingCalendar_kWh_PricingLocations]
GO
The first part of Listing 1 defines the tables, and the second part defines the foreign key relationships between them.
To create tables in SQL Azure, follow these steps:
Connect to the proazuredemres database using SQL Server Management Studio.
Log in as the demresadmin user.
Open the createtables_proazuredemresdb.sql script file.
Click the Execute button to create the tables in the proazuredemres database.
Execute the following query to check if the tables and constraints were successfully created:
select * from sys.objects
3.1.3. Creating Stored Procedures
One database design best
practice is to locate data-processing logic as close to the data as
possible. This is why stored procedures are recommended over inline
code for data-processing logic. Stored procedures are also easier to
modify and maintain, because each one is an atomic unit of
data-processing logic that can be changed without recompiling
application code. For the Dem-Res system, I identified the stored
procedures described in the following sections.
InsertPricingLocations
The
InsertPricingLocations stored procedure inserts a new record in the
PricingLocations table. This stored procedure is called by utility
companies to add locations that are used to set energy prices. Listing 2 shows the create script for the InsertPricingLocations stored procedure.
Example 2. InsertPricingLocations
CREATE PROCEDURE [dbo].[InsertPricingLocations]
    @locationId varchar(50),
    @description varchar(100)
AS
BEGIN
    -- SET NOCOUNT ON added to prevent extra result sets from
    -- interfering with SELECT statements.
    SET NOCOUNT ON;

    INSERT INTO PricingLocations(LocationId, [Description])
    VALUES (@locationId, @description);
END
The stored procedure consists of
a simple insert statement. If you want to modify the stored procedure
after it's installed in the database, replace CREATE PROCEDURE with
ALTER PROCEDURE in the stored procedure body.
InsertPricingCalendar_kWh
The InsertPricingCalendar_kWh
stored procedure inserts a new record in the PricingCalendar_kWh table.
The stored procedure is called by the utility companies to update the
kWh pricing for a particular period of time. Listing 3 shows the create script for the InsertPricingCalendar_kWh stored procedure.
Example 3. InsertPricingCalendar_kWh
CREATE PROCEDURE [dbo].[InsertPricingCalendar_kWh]
    @pricingStartDate datetime,
    @pricingEndDate datetime,
    @pricePerkWh float,
    @locationId int
AS
BEGIN
    SET NOCOUNT ON;

    INSERT INTO PricingCalendar_kWh (PricingStartDate, PricingEndDate, PricePerkWh, LocationId)
    VALUES (@pricingStartDate, @pricingEndDate, @pricePerkWh, @locationId);
END
InsertGateway
The InsertGateway stored
procedure inserts a new record in the Gateways table. This procedure is
called when a new gateway is added to the Dem-Res database by the
utility company or when the gateway communicates with the Dem-Res
system for the first time. Listing 4 shows the create script for the InsertGateway stored procedure.
Example 4. InsertGateway
CREATE PROCEDURE [dbo].[InsertGateway]
    @gatewayId varchar(50),
    @locationId int,
    @webAddress varchar(100)
AS
BEGIN
    SET NOCOUNT ON;

    INSERT INTO Gateways(GatewayId, LocationId, WebAddress, LastCommunication)
    VALUES (@gatewayId, @locationId, @webAddress, getdate());
END
InsertEnergyMeterValues
The
InsertEnergyMeterValues stored procedure inserts a new record in the
EnergyMeterValues table. This stored procedure is called when the
gateway sends the energy meter value to the Dem-Res server. Listing 5 shows the create script for the InsertEnergyMeterValues stored procedure.
Example 5. InsertEnergyMeterValues
CREATE PROCEDURE [dbo].[InsertEnergyMeterValues]
    @gatewayId varchar(50),
    @kWhValue float,
    @kWhFieldRecoredTime datetime,
    @kWhServerTime datetime
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @gatewayNumber int
    DECLARE @cost float
    DECLARE @locationId int

    SELECT @gatewayNumber = GatewayNumber, @locationId = LocationId
    FROM Gateways
    WHERE GatewayId = @gatewayId;

    SELECT @cost = PricePerkWh
    FROM PricingCalendar_kWh
    WHERE LocationId = @locationId;

    SET @cost = @cost * @kWhValue;

    INSERT INTO EnergyMeterValues(GatewayNumber, GatewayId, kWhValue,
        kWhFieldRecordedTime, kWhServerTime, Cost)
    VALUES (@gatewayNumber, @gatewayId, @kWhValue,
        @kWhFieldRecoredTime, @kWhServerTime, @cost);
END
In Listing 5,
PricePerkWh is retrieved from the PricingCalendar_kWh table to
calculate the cost of energy for the kWh value at the gateway's
location. The cost is calculated by multiplying the kWh value sent by
the gateway by the price per kWh for that location. The cost value is
then inserted into the record along with all the other fields of the table.
UpdateGatewayLastCommunication
The
UpdateGatewayLastCommunication stored procedure updates the last
communication time field in the Gateways table. This stored procedure
is called when the gateway sends the energy meter value to the Dem-Res
server. Listing 6 shows the create script for the UpdateGatewayLastCommunication stored procedure.
Example 6. UpdateGatewayLastCommunication
CREATE PROCEDURE [dbo].[UpdateGatewayLastCommunication]
    @gatewayId varchar(50),
    @locationId int,
    @webAddress varchar(100)
AS
BEGIN
    SET NOCOUNT ON;

    UPDATE Gateways
    SET LastCommunication = getdate()
    WHERE GatewayId = @gatewayId
END
To install the
stored procedures, open SQL Server Management Studio, connect to the
proazuredemres database, and execute the CREATE PROCEDURE scripts as
shown in Figure 4.
3.1.4. Uploading Sample Data
In any database system
design, you need sample data to test different scenarios and conditions
that will affect the system in general. You also need sample data to
test the business logic in application and stored procedures. In
Ch8Solution in this chapter's source code directory, there is a Windows
Forms project called ProAzureDemResDbApp that uploads sample data to
PricingLocations, PricingCalendar_kWh, Gateways, and EnergyMeterValues
in the SQL Azure Dem-Res database. The application calls the stored
procedures discussed in the previous section to insert the data. The
data is randomly generated based on some hard-coded parameters in the
code. For example, the pricing locations are between the ZIP codes
95147 and 94583. Similarly, the gateway numbers are between 1 and 300.
These values are also used to generate a web URL for the gateway, which
is of the format sb://proazure.servicebus.windows.net/gateways/{location_id}/{gateway_id}.
Figure 5 illustrates the user interface of the ProAzureDemResDbApp application.
To upload sample data, follow these steps:
Run the ProAzureDemResDb application.
Enter your server name, username, password, and database name.
Click the Create Sample Data button.
NOTE
It may take some time to run the query, depending on your network connection.
To
delete all the data in the tables, click the Drop Data button. Dropping
data is useful if you want to re-create the sample data from scratch.
Listing 7 shows the code to insert data in the PricingCalendar_kWh table.
Example 7. Insert Pricing Calendar Data
using (SqlConnection conn = new SqlConnection(GetUserDbString()))
{
    conn.Open();
    for (int j = START_LOCATIONID; j < END_LOCATIONID; j++)
    {
        using (SqlCommand command = conn.CreateCommand())
        {
            command.CommandText = "InsertPricingCalendar_kWh";
            command.CommandType = CommandType.StoredProcedure;
            string lid = j.ToString();
            Random r = new Random();
            double price = r.NextDouble();

            SqlParameter pricingStartDate = command.CreateParameter();
            pricingStartDate.ParameterName = "@pricingStartDate";
            pricingStartDate.Value = PRICINGCALENDAR_STARTDATE;
            command.Parameters.Add(pricingStartDate);

            SqlParameter pricingEndDate = command.CreateParameter();
            pricingEndDate.ParameterName = "@pricingEndDate";
            pricingEndDate.Value = PRICINGCALENDAR_ENDDATE;
            command.Parameters.Add(pricingEndDate);

            SqlParameter pricePerkWh = command.CreateParameter();
            pricePerkWh.ParameterName = "@pricePerkWh";
            pricePerkWh.Value = price;
            command.Parameters.Add(pricePerkWh);

            SqlParameter locationId = command.CreateParameter();
            locationId.ParameterName = "@locationId";
            locationId.Value = lid;
            command.Parameters.Add(locationId);

            command.ExecuteNonQuery();
        } //using
    } //for
} //using
Listing 7
demonstrates calling the InsertPricingCalendar_kWh stored procedure
with parameterized values. In database programming, parameterized
queries are recommended over plain-text query strings, because
plain-text queries carry a SQL injection risk. Parameterized queries
reduce this risk because the parameter values aren't appended to the
SQL text, which gives a clear separation between the SQL query and its
parameters. In Listing 7,
note that the price per kWh is generated randomly. In the real world,
the utility company provides the Dem-Res application with the price per
kWh.
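Both Listing 7 and Listing 11 (later in this chapter) call a GetUserDbString() helper that isn't shown in this excerpt. A minimal sketch of such a helper follows; the server name, login, and password are placeholders, and in the sample projects these values presumably come from the form fields or the configuration files mentioned later in this chapter.
// Requires System.Data.SqlClient.
private string GetUserDbString()
{
    // Build a SQL Azure connection string for the proazuredemres database.
    // The server, login, and password below are placeholders.
    SqlConnectionStringBuilder builder = new SqlConnectionStringBuilder();
    builder.DataSource = "yourserver.database.windows.net"; // SQL Azure server
    builder.InitialCatalog = "proazuredemres";              // Dem-Res database
    builder.UserID = "demresadmin@yourserver";              // SQL Azure expects login@server
    builder.Password = "your-password";
    builder.Encrypt = true;                                 // SQL Azure connections are encrypted
    return builder.ConnectionString;
}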
To test the creation of the
data, you can log in to the proazuredemres database in your account using
SQL Server Management Studio and execute a SELECT * query on each of the
database tables.
3.1.5. Optimizing SELECT Queries
In the Dem-Res system,
the most commonly used SELECT query retrieves PricePerkWh by LocationId
from the PricingCalendar_kWh table, because you have to calculate the
cost for every message that comes in from a gateway. So, it's
important to optimize the performance of this SELECT query with
appropriate indexes on the table. Depending on the other queries in the
system, you might prefer a clustered index on the LocationId field, but
a table can have only one clustered index, and in this case it's
already on PricingId. In this example, you create a nonclustered index on
the LocationId field by executing this query:
CREATE INDEX INDEX_PricingCalendar_kWh_LocationId
ON PricingCalendar_kWh(LocationId);
To test the index scan, you
need sufficient data in the PricingCalendar_kWh table; otherwise, the
SQL Server query optimizer may ignore the new index and scan the
clustered index instead, because it chooses the execution plan it
estimates will yield better results. You can generate more test data by
executing the stored procedure shown in Listing 8.
Example 8. AddSampleData Stored Procedure
ALTER PROCEDURE AddSampleData
    @NumRows int
AS
    DECLARE @counter int
    DECLARE @locationId int
    DECLARE @locationIdStr varchar(50)
    DECLARE @desc varchar(50)
    DECLARE @pricingStartDate datetime
    DECLARE @pricingEndDate datetime
    DECLARE @pricekWh float
    DECLARE @gatewayUrl varchar(100)
    DECLARE @gatewayId varchar(50)
    DECLARE @kWhValue float
    DECLARE @now datetime

    SELECT @counter = 1
    WHILE (@counter < @NumRows)
    BEGIN
        SET @locationId = 10000 + @counter;
        SET @locationIdStr = CAST(@locationId as varchar);
        SET @desc = @locationIdStr + '-' + CAST(@counter as nvarchar) + '-description';
        SET @pricingStartDate = DATEADD(m, 2, getdate());
        SET @pricingEndDate = DATEADD(m, 3, getdate());
        SET @pricekWh = CAST(@counter as float) * 0.00052;
        SET @gatewayId = 'MyGateway' + @locationIdStr;
        SET @gatewayUrl = 'sb://proazure.servicebus.windows.net/gateways/' +
            @locationIdStr + '/' + @gatewayId;
        SET @kWhValue = @pricekWh * 5.2;
        SET @now = getdate();

        EXEC InsertPricingLocations @locationId, @desc;
        EXEC InsertPricingCalendar_kWh @pricingStartDate, @pricingEndDate, @pricekWh, @locationId;
        EXEC InsertGateway @gatewayId, @locationId, @gatewayUrl;
        EXEC InsertEnergyMeterValues @gatewayId, @kWhValue, @now, @now;

        SELECT @counter = @counter + 1;
    END
The AddSampleData
stored procedure creates sample data in all the database tables similar
to the ProAzureDemResDbApp Windows application you saw earlier. Execute
the stored procedure with the following query to enter 10,000 entries
in the database tables:
EXEC AddSampleData 10001;
NOTE
I demonstrate two
different ways of creating sample data so you can choose the approach
you feel comfortable with and understand the advantage of having the
data-processing logic closer to the data. You can easily modify the
AddSampleData stored procedure without recompiling any code, as you
would have to do with the Windows application shown earlier.
Next, to view the query execution plan, execute the query shown in Listing 9.
Example 9. Show Query Plan
SET SHOWPLAN_ALL ON
GO
SELECT PricePerkWh FROM PricingCalendar_kWh WHERE LocationId = 95148;
GO
SET SHOWPLAN_ALL OFF
SET SHOWPLAN_ALL ON enables you to see the output of the query execution plan, as shown in Figure 6.
The query plan shows the
steps followed by the SQL Server query optimizer to execute your query.
The information shown in the plan is valuable for optimizing queries or
debugging slow-running queries. Figure 7
shows the query optimizer using an index scan of the
INDEX_PricingCalendar_kWh_LocationId index you created earlier. To see the
execution plan graphically, choose Query => Display Estimated Execution Plan in SQL Server Management Studio.
3.1.6. Pricing Table Synchronization
The values in the
PricingCalendar_kWh table are provided by the utility companies. There
are several ways to synchronize data in the cloud PricingCalendar_kWh
table with an on-premises database table, such as creating a custom web
service that can be called by utility companies, having an FTP server
for data transfer between utility companies and the Dem-Res system, SQL
Server Integration Services (SSIS), and so on. This section shows you
how to use SSIS to synchronize data between an on-premises database and
the SQL Azure cloud database. SSIS is an Extract-Transform-Load (ETL)
tool that comes with higher SQL Server editions. You can use SSIS to
Extract data from a structured or unstructured data source
Clean up the data or apply business rules to the data
Upload the clean data to the destination database tables
SSIS is popular in
business intelligence (BI) applications for extracting data from
different kinds of sources and uploading the aggregated and clean data
to a data warehouse for analysis and reporting. But the application of
SSIS isn't limited to BI applications: many organizations use SSIS for
simple cross-database data transfer. Figure 8 illustrates the use of SSIS in different kinds of applications.
As shown in Figure 8,
SSIS extracts data from any kind of compatible source, transforms the
data into an understandable format, and then loads the transformed data
into the destination database. SSIS can be used in any kind of
data-transfer application depending on the business needs. SSIS also
offers the flexibility of developing custom components if the
out-of-the-box features don't fulfill your requirements. SSIS is
composed of packages,
which are logical steps you design to execute an ETL operation. SSIS
packages can be designed in Visual Studio; the programming environment
is similar to a workflow environment.
In this section, you build
an SSIS package to extract data from a pricing table in an on-premises
SQL Server database and load it into the PricingCalendar_kWh table in
your SQL Azure proazuredemres database. Figure 9 illustrates the simple architecture of the data transfer from on-premises SQL Server to SQL Azure.
Before creating an SSIS
package, create a SQL Server database named utility_pricing on your
local SQL Server instance; it represents a utility company's pricing
database. Listing 10
shows the DDL script to create the database and all its associated
objects, and the stored procedures AddSampleData and DropSampleData
create and drop sample data, respectively.
Example 10. On-Premises DDL Database Script
use [master]
CREATE DATABASE utility_pricing;

use [utility_pricing]
CREATE TABLE [dbo].[PricingLocations](
    [LocationId] [int] NOT NULL PRIMARY KEY CLUSTERED,
    [Description] [varchar](100) NOT NULL)
GO
CREATE TABLE [dbo].[PricingCalendar_kWh](
    [PricingId] [int] IDENTITY(1,1) NOT NULL PRIMARY KEY CLUSTERED,
    [PricingStartDate] [datetime] NOT NULL,
    [PricingEndDate] [datetime] NOT NULL,
    [PricePerkWh] [float] NOT NULL,
    [LocationId] [int] NOT NULL)
GO
CREATE PROCEDURE [dbo].[InsertPricingLocations]
    @locationId int,
    @description varchar(100)
AS
BEGIN
    -- SET NOCOUNT ON added to prevent extra result sets from
    -- interfering with SELECT statements.
    SET NOCOUNT ON;

    INSERT INTO PricingLocations(LocationId, [Description])
    VALUES (@locationId, @description);
END
GO
CREATE PROCEDURE [dbo].[InsertPricingCalendar_kWh]
    @pricingStartDate datetime,
    @pricingEndDate datetime,
    @pricePerkWh float,
    @locationId int
AS
BEGIN
    SET NOCOUNT ON;

    INSERT INTO PricingCalendar_kWh(PricingStartDate, PricingEndDate, PricePerkWh, LocationId)
    VALUES (@pricingStartDate, @pricingEndDate, @pricePerkWh, @locationId);
END
GO
ALTER TABLE [dbo].[PricingCalendar_kWh] WITH CHECK
    ADD CONSTRAINT [FK_PricingCalendar_kWh_PricingLocations] FOREIGN KEY([LocationId])
    REFERENCES [dbo].[PricingLocations] ([LocationId])
GO
CREATE PROCEDURE AddSampleData
    @NumRows int
AS
    DECLARE @counter int
    DECLARE @locationId int
    DECLARE @locationIdStr varchar(50)
    DECLARE @desc varchar(50)
    DECLARE @pricingStartDate datetime
    DECLARE @pricingEndDate datetime
    DECLARE @pricekWh float

    SELECT @counter = 1
    WHILE (@counter < @NumRows)
    BEGIN
        SET @locationId = 10000 + @counter;
        SET @locationIdStr = CAST(@locationId as varchar);
        SET @desc = @locationIdStr + '-' + CAST(@counter as nvarchar) + '-description';
        SET @pricingStartDate = DATEADD(m, 2, getdate());
        SET @pricingEndDate = DATEADD(m, 3, getdate());
        SET @pricekWh = CAST(@counter as float) * 0.00063;

        EXEC InsertPricingLocations @locationId, @desc;
        EXEC InsertPricingCalendar_kWh @pricingStartDate, @pricingEndDate, @pricekWh, @locationId;

        SELECT @counter = @counter + 1;
    END
GO
CREATE PROCEDURE DROPSAMPLEDATA
AS
BEGIN
    DELETE FROM PricingCalendar_kWh;
    DELETE FROM PricingLocations;
END
For simplicity's sake, I've
kept the table schema for the pricing table the same for the
proazuredemres and utility_pricing databases. To create sample data,
execute the following statement:
EXEC AddSampleData 10001;
Designing an SSIS Package for Pricing Table Synchronization
The steps to create
an SSIS package for synchronizing the on-premises pricing tables to the
SQL Azure pricing tables are as follows:
Open SQL Server Business Intelligence Development Studio as an administrator.
Choose Start => All Programs => SQL Server Business Intelligence Development Studio.
Choose File => New => Project.
Select Business Intelligence Projects in the Project Types section to get a list of all the Business Intelligence Project types.
Select Integration Services Project in the Templates section.
Name the project "PricingUploadSSIS", as shown in Figure 10.
Click OK to create the project.
On the Package.dtsx design page, drag and drop an Execute SQL Task from the Toolbox, as shown in Figure 11.
Before
you upload the data from the on-premises pricing tables to the SQL
Azure pricing table, delete the existing data from the SQL Azure
pricing table.
NOTE
The pricing
locations remain constant unless a new pricing location is added to the
on-premises table. The only variable is the price per kWh from the
PricingCalendar_kWh table.
Right-click the task on the designer surface, and click Rename to rename the task "Drop SQL Azure Pricing Data" (see Figure 12).
Right-click the task, and click Edit.
In the SQL Statement section, type the following query to delete all the records from the PricingCalendar_kWh table (see Figure 13):
DELETE FROM PricingCalendar_kWh
In the Connection Type section, select ADO.NET from the drop-down list.
In the Connection section, select New Connection to open the Configure ADO.NET Connection Manager dialog box (see Figure 14).
Click the New button to open the Connection Manager dialog box.
Enter the database server name.
Select the SQL Server Authentication type.
Enter the username and password of the login you've created to connect to the database.
Enter the name of the database: proazuredemres.
Click the Test Connection button to test your connection. A confirmation dialog box appears when the connection succeeds.
Click OK to close the Test Connection dialog box.
Click OK to save the connection.
Drag and drop Data Flow Task from the Toolbox on the Package.dtsx design surface.
Rename
the task to "Upload Pricing Data". Drag the end of the green arrow from
the Drop SQL Azure Pricing Data task to the Upload Pricing Data task to
let SSIS know the next logical step of execution. In this task, you
upload the data from the on-premises PricingCalendar_kWh table to the SQL
Azure PricingCalendar_kWh table.
Right-click the Upload Pricing Data task, and click Edit to navigate to the Data Flow design surface (see Figure 18).
Notice that the controls in the Toolbox change when you navigate to the
Data Flow tab. The Toolbox only displays controls relevant to Data Flow
tasks.
From the Data Flow Sources section of the Toolbox, drag and drop an ADO.NET Source onto the Data Flow design surface (see Figure 19).
Rename the source "On-premises PricingCalendar_kWh". The ADO.NET Source
represents the on-premises PricingCalendar_kWh database table.
Right-click the On-premises PricingCalendar_kWh task, and click Edit to open an ADO.NET Source Editor.
In
the ADO.NET Source Editor dialog box, click the New button. By doing
so, you create a new connection to the on-premises PricingCalendar_kWh
database table.
In the Configure ADO.NET Connection Manager, click the New button to open the Connection Manager dialog box.
Fill in the on-premises database connection information.
Select the utility_pricing database from the database drop-down menu, and click OK.
Click OK to go back to the ADO.NET Source Editor.
Select PricingCalendar_kWh from the Tables drop-down menu, and click OK (see Figure 22).
Drag and drop an ADO NET Destination from the Data Flow Destinations section of the Toolbox.
Rename
the ADO NET Destination task to "SQL Azure PricingCalendar_kWh". The
destination represents the PricingCalendar_kWh table in the SQL Azure
database.
Connect
the source and destination by dragging the green arrow from the
On-premises PricingCalendar_kWh task to the SQL Azure PricingCalendar_kWh task.
Right-click the SQL Azure PricingCalendar_kWh task, and click Edit to open the ADO.NET Destination Editor.
In
the Connection Manager drop-down list, select the connection to the SQL
Azure database. Select the PricingCalendar_kWh table from the Tables
drop-down list, as shown in Figure 23.
Click the Mappings section to show the default mapping between the source and destination tables (see Figure 24). If the tables are similar, as in this example, SQL Server creates the default mapping for you.
Select PricingId in the Input Column, and select Ignore to remove the mapping of the PricingId field (see Figure 25).
The PricingCalendar_kWh table contains an identity field named
PricingId that is auto-generated whenever a new item is inserted in the
table. When a new record is inserted in the SQL Azure
PricingCalendar_kWh table, a new PricingId is automatically generated.
So, you don't have to synchronize the PricingId field to the SQL Azure
database.
Click OK to close the ADO NET Destination Editor.
The Data Flow tasks are illustrated in Figure 26.
The steps to test
the PricingCalendar_kWh table synchronization between the on-premises
utility_pricing database and the SQL Azure proazuredemres database are
as follows:
Open SQL Server Management Studio.
Connect to the on-premises database utility_pricing.
Execute the following T-SQL statements in a New Query window to erase old pricing data and create new pricing sample data:
EXEC DROPSAMPLEDATA
EXEC AddSampleData 20;
select * from PricingCalendar_kWh;
The output from these statements is shown in Figure 27.
You can keep existing data in the SQL Azure PricingCalendar_kWh table
because the SSIS package deletes all the data from this table before
uploading the on-premises data.
To run the SSIS package in Debug mode, choose Debug => Start Debugging. In Debug mode, Visual Studio steps through each task
and color-codes it based on its execution status. First the task turns
yellow, which means the operation is in progress; when it completes, it
turns red if it failed or green if it succeeded. If both tasks succeed, they turn green, as shown in Figure 28. If any task turns red, you can view the output window (View => Output) to see the output messages. You can also add breakpoints to the tasks to do detailed debugging of the package execution.
Connect
to SQL Azure database proazuredemres using SQL Management Studio, and
execute the following command in a New Query window to view the new
records uploaded from the on-premises database table:
select * from PricingCalendar_kWh;
Note that, other
than PricingId, all the field values are exactly the same as in the
on-premises database. You specifically configured PricingId to be
ignored while uploading the new records because it's an identity field
that increments automatically on new data inserts.
As you learned in this
exercise, SSIS is a very flexible tool for ETL operations, and it adds
significant value when you're working with cloud databases. The
out-of-the-box tasks in the SSIS Toolbox are rich in features that can
help you synchronize, maintain, administer, and analyze data in SQL
Azure from your on-premises infrastructure.
3.2. Gateway Listener Design
The Gateway Listener is a server
that listens to the messages sent by gateways and updates the database
with these messages. If required, it sends a curtail command to the
gateway to curtail the load. In the Ch8Solution, the DemResWorkerRole
project implements the Gateway Listener design. The DemResWorkerRole is
a Windows Azure worker role project that registers itself as an
AppFabric Service Bus endpoint and listens to messages using
netTcpRelayBinding. The gateways send messages to the AppFabric Service
Bus endpoint. Figure 30 illustrates the DemResWorkerRole design.
The DemResWorkerRole is a
typical AppFabric Service Bus service that uses a username and password
for AppFabric Service Bus authentication. The Logging service runs
on-premises as a separate AppFabric Service Bus service and receives
log messages from the DemResWorkerRole.
This example uses
netTcpRelayBinding as the AppFabric Service Bus binding between the
gateway and the DemResWorkerRole. The drawback with netTcpRelayBinding
is that it doesn't have built-in load-balancing, and each endpoint can
support only one service instance. In the real world, you should use
either an AppFabric Service Bus Router or a WCF web service hosted in a
web role. The purpose of this example is to demonstrate the integration
between a Windows Azure worker role, the AppFabric Service Bus, and SQL
Azure, so it uses simple bindings and configurations.
The logical steps of execution in the DemResWorkerRole are as follows:
Periodically, gateways call the SendValue() method on the DemResWorkerRole.
The DemResWorkerRole inserts a new record in the EnergyMeterValues database table in SQL Azure.
It
then retrieves the kWh price for the location where the gateway is
located and calculates the total unit cost of energy by multiplying the
price per kWh by the kWh value.
If
the total value exceeds one dollar, the DemResWorkerRole sends a
curtail command to the gateway on a callback connection to the gateway.
Listing 11 shows the code for the contract and server implementation of the DemResWorkerRole.
Example 11. Contract and Implementation of the DemResWorkerRole
[ServiceContract(CallbackContract = typeof(IDemResCallback))]
public interface IDemResOperations
{
    [OperationContract]
    void SendValue(string gatewayId, double value, DateTime gatewayTime);
}

public interface IDemResCallback
{
    [OperationContract]
    void Curtail(int setPointValue);
}

[ServiceBehavior(ConcurrencyMode = ConcurrencyMode.Reentrant)]
public class DemResService : IDemResOperations
{
    public void SendValue(string gatewayId, double kWhValue, DateTime gatewayTime)
    {
        //Update the database table with the new value
        InsertEnergyMeterValues(gatewayId, kWhValue, gatewayTime);

        //Get the value from the database and curtail if total price > $1.0
        double cost = GetCostByGateway(gatewayId, kWhValue);

        if (cost > 1.0)
        {
            IDemResCallback callback =
                OperationContext.Current.GetCallbackChannel<IDemResCallback>();
            callback.Curtail(70);
        }
    }

    private void InsertEnergyMeterValues(string gid, double kWh, DateTime gatewayTime)
    {
        try
        {
            // Connect to the sample database and perform various operations
            using (SqlConnection conn = new SqlConnection(GetUserDbString()))
            {
                conn.Open();
                using (SqlCommand command = conn.CreateCommand())
                {
                    // Insert records
                    command.CommandText = "InsertEnergyMeterValues";
                    command.CommandType = CommandType.StoredProcedure;

                    SqlParameter gatewayId = command.CreateParameter();
                    gatewayId.ParameterName = "@gatewayId";
                    gatewayId.Value = gid;
                    command.Parameters.Add(gatewayId);

                    SqlParameter kWhValue = command.CreateParameter();
                    kWhValue.ParameterName = "@kWhValue";
                    kWhValue.Value = kWh;
                    command.Parameters.Add(kWhValue);

                    SqlParameter kWhFieldRecoredTime = command.CreateParameter();
                    kWhFieldRecoredTime.ParameterName = "@kWhFieldRecoredTime";
                    kWhFieldRecoredTime.Value = gatewayTime;
                    command.Parameters.Add(kWhFieldRecoredTime);

                    SqlParameter kWhServerTime = command.CreateParameter();
                    kWhServerTime.ParameterName = "@kWhServerTime";
                    kWhServerTime.Value = DateTime.Now;
                    command.Parameters.Add(kWhServerTime);

                    int rowsAdded = command.ExecuteNonQuery();
                } //using
            }
        }
        catch (Exception ex)
        {
            throw ex;
        }
    }
Listing 12
shows the code for the GetEnergyCostByGatewayId stored procedure that
retrieves the energy cost for the unit of energy sent by the gateway.
Example 12. GetEnergyCostByGatewayId Stored Procedure
CREATE PROCEDURE [dbo].[GetEnergyCostByGatewayId]
    @gatewayId varchar(50)
AS
BEGIN
    SET NOCOUNT ON;

    SELECT PricingCalendar_kWh.PricePerkWh
    FROM Gateways
        INNER JOIN PricingLocations
            ON Gateways.LocationId = PricingLocations.LocationId
        INNER JOIN PricingCalendar_kWh
            ON PricingLocations.LocationId = PricingCalendar_kWh.LocationId
    WHERE (Gateways.GatewayId = @gatewayId);
END
The
GetEnergyCostByGatewayId stored procedure accepts gatewayId as a parameter and
retrieves the price per kWh for the gateway's location from the
database. The DemResWorkerRole calculates the total cost and decides
whether to curtail the load.
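Listing 11 also calls a GetCostByGateway() helper whose implementation isn't shown in this excerpt. A minimal sketch follows, assuming the helper simply executes the GetEnergyCostByGatewayId stored procedure from Listing 12 and multiplies the returned price per kWh by the reported kWh value.
private double GetCostByGateway(string gatewayId, double kWhValue)
{
    using (SqlConnection conn = new SqlConnection(GetUserDbString()))
    {
        conn.Open();
        using (SqlCommand command = conn.CreateCommand())
        {
            command.CommandText = "GetEnergyCostByGatewayId";
            command.CommandType = CommandType.StoredProcedure;

            SqlParameter gatewayIdParam = command.CreateParameter();
            gatewayIdParam.ParameterName = "@gatewayId";
            gatewayIdParam.Value = gatewayId;
            command.Parameters.Add(gatewayIdParam);

            // The stored procedure returns PricePerkWh for the gateway's location.
            object result = command.ExecuteScalar();
            double pricePerkWh = (result == null || result == DBNull.Value)
                ? 0.0
                : Convert.ToDouble(result);

            // Total cost of this reading = price per kWh * reported kWh value.
            return pricePerkWh * kWhValue;
        }
    }
}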
3.3. Gateway Application Design
A gateway application
runs on gateways. This example builds it as a simple Windows
application, but in the real world such applications are background
processes that run on embedded operating systems like Windows CE. The
gateway application calls the DemRes service in the cloud via the
AppFabric Service Bus using the netTcpRelayBinding. In Ch8Solution, the
DemResGateway project represents the gateway application. DemResGateway
is a Windows application that implements the callback interface that
the DemRes service can call to curtail the load. Figure 31 illustrates the DemResGateway architecture.
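The DemResGateway project's code isn't listed here, but the following is a minimal sketch of how the gateway application might implement the IDemResCallback contract from Listing 11; the DemResCallback class name is an assumption. This version simply displays the curtail set point it receives, which corresponds to the message box described in Figure 32; a real gateway would forward the new set point to the attached device, such as the HVAC unit.
[CallbackBehavior(ConcurrencyMode = ConcurrencyMode.Reentrant, UseSynchronizationContext = false)]
public class DemResCallback : IDemResCallback
{
    // Called by the DemResWorkerRole over the Service Bus callback channel.
    // UseSynchronizationContext = false lets the callback run even while the
    // Windows Forms UI thread is blocked in an outgoing SendValue() call.
    public void Curtail(int setPointValue)
    {
        // Display the curtail command; a real gateway would apply the new
        // set point to the controlled device instead.
        MessageBox.Show("Curtail command received. New set point: " + setPointValue);
    }
}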
3.4. Running the ProAzure Demand-Response Service
The steps required to run the Dem-Res system are outlined next. Begin by performing the following prerequisites:
Register for a Windows Azure account.
Create a project in the AppFabric portal.
Create a username and password in the AppFabric portal.
Create an InfoCard using Windows CardSpace on your local machine.
Upload
the card as one of the AppFabric solution credentials. You need this
card to call methods on the server from the gateway client.
Create a new SQL Azure database called proazuredemres.
Log in to proazuredemres using SQL Server Management Studio.
Open
the proazuredemres_allobjects.sql file in a New Query window, and
execute the script to create objects in the proazuredemres database.
NOTE
Make sure you've chosen
the correct database before executing the script. You may also use
SQLCMD to execute the script from the command line.
Create sample data by executing the following stored procedure:
EXEC AddSampleData 10001;
Create a new on-premises database called utility_pricing.
Open
the utility_pricing_allobjects.sql file in a New Query window, and
execute the script to create objects in the utility_pricing database.
Create sample data by executing the following stored procedure:
EXEC AddSampleData 10001;
Now, follow these steps to run the application:
Open Ch8Solution in Visual Studio.
Add
your own usernames, passwords, endpoints, SQL Azure connection strings,
and other configuration parameters in the configuration files of the
DemResWorker, DemResGateway, and LogReceiverConsole projects.
Build and deploy the DemResWorker cloud service project to Windows Azure.
Run the complete end-to-end application locally in the development fabric before deploying it to Windows Azure.
Start the LogReceiverConsole application. This application receives log messages from the DemResWorkerRole.
Start the DemResWorker service in Windows Azure or in the development fabric.
Start the DemResGateway application on your local machine.
Click the Send kWh Value button to send a single kWh value to the DemResWorker service.
Click
the button several times to get a value that results in a price greater
than one dollar. If the cost of energy per unit is more than one
dollar, the DemResWorker service sends a curtail command to the gateway
using a callback interface. When this happens, you see a message box
with the curtail value, as shown in Figure 32.