The learning curve is considerably steeper than for creating packages with
the Import And Export Wizard and the designer. We advise you to become
familiar with using the wizard and the designer to create packages
before you start creating and extending packages programmatically.
Programming in Visual Studio
Setting up an Integration Services programming project in the Visual Studio
environment is easy. Depending on whether you're building an extensibility
object or a custom application, you create either a Class Library project or
an application project. You then add references to the appropriate assemblies
and begin implementing your classes.
If you are
building a control flow extensibility object or a custom application that
controls Integration Services packages, you must reference Microsoft.SqlServer.ManagedDTS. If you are building a data flow component, you must reference Microsoft.SqlServer.PipelineHost, Microsoft.SqlServer.DTSRuntimeWrap, and Microsoft.SqlServer.DTSPipelineWrap.
Loading and Executing Packages in Applications
You can load packages in applications by using the Integration Services Application
object. This object returns information about the state of Integration
Services (such as the data flow components that are installed) and
performs other Integration Services–related tasks (such as determining
which packages are running). One of the tasks the object supports is
loading packages. More than one method is available to do this, including LoadPackage and LoadFromSqlServer; the method you should use depends on how the package is stored. To create the Application object in managed code, you need to add a reference to Microsoft.SqlServer.ManagedDTS and use the namespace Microsoft.SqlServer.Dts.Runtime.
Listing 1 shows a managed code example that demonstrates package loading and execution.
Listing 1. Loading and executing a package
using Microsoft.SqlServer.Dts.Runtime;

class ApplicationDemo
{
    static void Main(string[] args)
    {
        // Create the Application object that manages packages.
        Application application = new Application();

        // Load the package from the file system; the second argument
        // is an optional IDTSEvents sink.
        Package package = application.LoadPackage("package.dtsx", null);

        // Run the package.
        package.Execute();
    }
}
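If the package is stored in SQL Server rather than in the file system, you can call LoadFromSqlServer instead. The following brief sketch assumes a package named Transfer Table stored in the msdb database of the local server; the package name and server are placeholders, and it also shows that Execute returns a DTSExecResult you can inspect.

using Microsoft.SqlServer.Dts.Runtime;

class SqlServerLoadSketch
{
    static void Main(string[] args)
    {
        Application application = new Application();

        // Load a package stored in the msdb database of the local server.
        // The package name and server name below are placeholders.
        Package package = application.LoadFromSqlServer(
            "Transfer Table", ".", null, null, null);

        // Execute returns a DTSExecResult indicating success or failure.
        DTSExecResult result = package.Execute();
        System.Console.WriteLine(result);
    }
}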
Creating Packages Programmatically
In addition to loading and
executing packages, you can also use the Integration Services object model to
create packages programmatically. The package's control flow and data flow are
conceptually different and therefore have different methods for their
construction.
Control Flow
The programming model for control flow is rooted in the Package object. You create this object first and then create additional objects through the Package object's Executables
collection. If the object you create is a task, you get a task back and you
can set properties on it. If the object you create is a container, you
get a container back and you can set properties on the container and create
child objects in the container's Executables collection.
Once you have the executable objects in place, you can create precedence constraints by adding items to the PrecedenceConstraints collection.
Likewise, to create connections and variables, you add items to the Connections collection on the Package object and to the Variables collection on container objects.
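As a brief sketch, the following code assembles a simple control flow in this way. The task names, connection string, and variable are hypothetical, and it assumes the STOCK:SQLTask moniker, which identifies the Execute SQL Task.

using Microsoft.SqlServer.Dts.Runtime;

class ControlFlowSketch
{
    static void Main(string[] args)
    {
        // Create the root Package object.
        Package package = new Package();

        // Add two Execute SQL Tasks through the Executables collection.
        // The task names used here are placeholders.
        TaskHost createTable = package.Executables.Add("STOCK:SQLTask") as TaskHost;
        createTable.Name = "Create Staging Table";

        TaskHost loadTable = package.Executables.Add("STOCK:SQLTask") as TaskHost;
        loadTable.Name = "Load Staging Table";

        // Run the second task only after the first one succeeds.
        PrecedenceConstraint constraint =
            package.PrecedenceConstraints.Add(createTable, loadTable);
        constraint.Value = DTSExecResult.Success;

        // Add an OLE DB connection manager and a package-scoped variable.
        ConnectionManager connection = package.Connections.Add("OLEDB");
        connection.Name = "Staging";
        connection.ConnectionString =
            "provider=sqlncli;server=.;integrated security=sspi;database=tempdb";

        package.Variables.Add("BatchId", false, "User", 0);
    }
}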
Although
you can create the entire package from scratch programmatically, one
approach is to templatize the process as much as possible. Build most of
the package interactively in SQL Server Business Intelligence
Development Studio ahead of time, and then load the template package
programmatically and modify it as needed until it has the correct
composition. In this way, you minimize the amount of code you need to
write and maximize ease of maintenance of your application because you
can handle many types of changes simply by changing the template
package; you don’t have to change a single line of code.
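As a minimal sketch of this approach, the following code loads a hypothetical template package, changes the connection string of one of its connection managers, and then saves and runs the result. The file paths and the connection manager name Source are placeholders.

using Microsoft.SqlServer.Dts.Runtime;

class TemplateSketch
{
    static void Main(string[] args)
    {
        Application application = new Application();

        // Load the template package built interactively in the designer.
        // The path and connection manager name are placeholders.
        Package package = application.LoadPackage(@"c:\Template.dtsx", null);

        // Modify only what varies, for example a connection string.
        package.Connections["Source"].ConnectionString =
            "provider=sqlncli;server=.;integrated security=sspi;database=AdventureWorks";

        // Save the customized copy and execute it.
        application.SaveToXml(@"c:\Customized.dtsx", package, null);
        package.Execute();
    }
}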
Data Flow
The data flow has
its own object model for programmatically setting up the layout that
describes a flow. For our example, we programmatically construct a
package in C# containing a data flow that uses an OLE DB source adapter
to read data from a SQL Server 2005 table and an OLE DB destination
adapter to insert the data into a destination table. The flow
of the program is similar to how the Import And Export Wizard constructs
its packages. Along the way, we review and introduce some concepts that
programmers working with the data flow must know. To run the sample,
you must have the AdventureWorks database available and the destination
table created in tempdb (as shown in Listing 2).
Listing 2. SQL statement for the destination table used in the data flow programming sample
use tempdb
create table Employee
(
EmployeeID int,
Title nvarchar(50)
)
Listing 3 shows a complete data flow programming sample. When you build the sample, you must add references to
Microsoft.SqlServer.ManagedDTS, Microsoft.SqlServer.DTSPipelineWrap, and Microsoft.SqlServer.DTSRuntimeWrap.
Listing 3. Complete data flow programming sample
using Microsoft.SqlServer.Dts.Runtime;
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
namespace Sample
{
class DataFlowConstruction
{
static void Main(string[] args)
{
// Create a new package
Package package = new Package();
package.Name = "OLE DB Transfer";
// Add a Data Flow task
TaskHost taskHost = package.Executables.Add("DTS.Pipeline") as TaskHost;
taskHost.Name = "Transfer Table";
IDTSPipeline90 pipeline = taskHost.InnerObject as MainPipe;
// Get the pipeline's component metadata collection
IDTSComponentMetaDataCollection90 componentMetadataCollection =
pipeline.ComponentMetaDataCollection;
// Add a new component metadata object to the data flow
IDTSComponentMetaData90 oledbSourceMetadata =
componentMetadataCollection.New();
// Associate the component metadata object with the OLE DB Source Adapter
oledbSourceMetadata.ComponentClassID = "DTSAdapter.OLEDBSource";
// Instantiate the OLE DB Source adapter
IDTSDesigntimeComponent90 oledbSourceComponent =
oledbSourceMetadata.Instantiate();
// Ask the component to set up its component metadata object
oledbSourceComponent.ProvideComponentProperties();
// Add an OLE DB connection manager
ConnectionManager connectionManagerSource = package.Connections.Add("OLEDB");
connectionManagerSource.Name = "OLEDBSource";
// Set the connection string
connectionManagerSource.ConnectionString =
"provider=sqlncli;server=.;integrated security=sspi;database=AdventureWorks";
// Set the connection manager as the OLE DB Source adapter's runtime connection
IDTSRuntimeConnection90 runtimeConnectionSource =
oledbSourceMetadata.RuntimeConnectionCollection["OleDbConnection"];
runtimeConnectionSource.ConnectionManagerID = connectionManagerSource.ID;
// Tell the OLE DB Source adapter to use the SQL Command access mode.
oledbSourceComponent.SetComponentProperty("AccessMode", 2);
// Set up the SQL command
oledbSourceComponent.SetComponentProperty("SqlCommand", "select EmployeeID,
Title from HumanResources.Employee");
// Set up the connection manager object
runtimeConnectionSource.ConnectionManager = DtsConvert.ToConnectionManager90
(connectionManagerSource);
// Establish the database connection
oledbSourceComponent.AcquireConnections(null);
// Set up the column metadata
oledbSourceComponent.ReinitializeMetaData();
// Release the database connection
oledbSourceComponent.ReleaseConnections();
// Release the connection manager
runtimeConnectionSource.ReleaseConnectionManager();
// Add a new component metadata object to the data flow
IDTSComponentMetaData90 oledbDestinationMetadata =
componentMetadataCollection.New();
// Associate the component metadata object with the OLE DB Destination Adapter
oledbDestinationMetadata.ComponentClassID = "DTSAdapter.OLEDBDestination";
// Instantiate the OLE DB Destination adapter
IDTSDesigntimeComponent90 oledbDestinationComponent =
oledbDestinationMetadata.Instantiate();
// Ask the component to set up its component metadata object
oledbDestinationComponent.ProvideComponentProperties();
// Add an OLE DB connection manager
ConnectionManager connectionManagerDestination =
package.Connections.Add("OLEDB");
connectionManagerDestination.Name = "OLEDBDestination";
// Set the connection string
connectionManagerDestination.ConnectionString =
"provider=sqlncli;server=.;integrated security=sspi;database=tempdb";
// Set the connection manager as the OLE DB Destination adapter's runtime connection
IDTSRuntimeConnection90 runtimeConnectionDestination =
oledbDestinationMetadata.RuntimeConnectionCollection["OleDbConnection"];
runtimeConnectionDestination.ConnectionManagerID =
connectionManagerDestination.ID;
// Tell the OLE DB Destination adapter to use the SQL Command access mode.
oledbDestinationComponent.SetComponentProperty("AccessMode", 2);
// Set up the SQL command
oledbDestinationComponent.SetComponentProperty("SqlCommand",
"select EmployeeID, Title from Employee");
// Set up the connection manager object
runtimeConnectionDestination.ConnectionManager =
DtsConvert.ToConnectionManager90(connectionManagerDestination);
// Establish the database connection
oledbDestinationComponent.AcquireConnections(null);
// Set up the external metadata column
oledbDestinationComponent.ReinitializeMetaData();
// Release the database connection
oledbDestinationComponent.ReleaseConnections();
// Release the connection manager
runtimeConnectionDestination.ReleaseConnectionManager();
// Get the standard output of the OLE DB Source adapter
IDTSOutput90 oledbSourceOutput =
oledbSourceMetadata.OutputCollection["OLE DB Source Output"];
// Get the input of the OLE DB Destination adapter
IDTSInput90 oledbDestinationInput = oledbDestinationMetadata.InputCollection
["OLE DB Destination Input"];
// Create a new path object
IDTSPath90 path = pipeline.PathCollection.New();
// Connect the source and destination adapters
path.AttachPathAndPropagateNotifications(oledbSourceOutput,
oledbDestinationInput);
// Get the output column collection for the OLE DB Source adapter
IDTSOutputColumnCollection90 oledbSourceOutputColumns =
oledbSourceOutput.OutputColumnCollection;
// Get the external metadata column collection for the OLE DB Destination adapter
IDTSExternalMetadataColumnCollection90 externalMetadataColumns =
oledbDestinationInput.ExternalMetadataColumnCollection;
// Get the virtual input for the OLE DB Destination adapter.
IDTSVirtualInput90 virtualInput = oledbDestinationInput.GetVirtualInput();
// Loop through our output columns
foreach (IDTSOutputColumn90 outputColumn in oledbSourceOutputColumns)
{
// Add a new input column
IDTSInputColumn90 inputColumn =
oledbDestinationComponent.SetUsageType(oledbDestinationInput.ID,
virtualInput, outputColumn.LineageID, DTSUsageType.UT_READONLY);
// Get the external metadata column from the OLE DB Destination
// using the output column's name
IDTSExternalMetadataColumn90 externalMetadataColumn =
externalMetadataColumns[outputColumn.Name];
// Map the new input column to its corresponding external metadata column.
oledbDestinationComponent.MapInputColumn(oledbDestinationInput.ID,
inputColumn.ID, externalMetadataColumn.ID);
}
// Save the package
Application application = new Application();
application.SaveToXml(@"c:\OLEDBTransfer.dtsx", package, null);
}
}
}