Showing posts with label flat. Show all posts

Monday, March 12, 2012

Programming in SSIS

Hi,

I am working with SQL Server Integration Services and am new to the topic. I want to transfer data from a flat file to a SQL database programmatically, creating the source, destination, and transform adapters myself.

My basic question: must the source, transform, and destination adapters be built for each file I am trying to transfer?

What is the advantage of doing it this way?

Would anyone let me know?

Thanks

The answer to your question will depend on your application's requirements, but in principle, if the same metadata is used throughout the data flow, you can use a Foreach Loop and dynamically update the connection string of the flat file connection manager using the property expression feature.
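As a sketch of that approach (the connection manager and variable names here are illustrative, not from the original package): a property expression bound to a package variable lets a Foreach File enumerator drive the flat file path on each iteration.

```csharp
// Sketch: assumes a Foreach File enumerator maps the current file path
// into a package variable named User::CurrentFileName (illustrative name).
using Microsoft.SqlServer.Dts.Runtime;

Package package = new Package();
package.Variables.Add("CurrentFileName", false, "User", @"C:\Data\input.txt");
ConnectionManager cm = package.Connections.Add("FLATFILE");
cm.Name = "FlatFileConnection";

// Bind the connection string to the variable; SSIS re-evaluates the
// expression each time the loop assigns a new file to the variable.
cm.Properties["ConnectionString"].SetExpression(cm, "@[User::CurrentFileName]");
```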

Programmatically Set Flat File Destination Name (relating to timestamp)

One simple question:
I need to dump data from a table to a file daily, with a file name that carries a timestamp. The Flat File Connection Manager editor only lets you set a static file name. Suppose I programmatically pass the current time into a User::TimeNowVariable; how do I set the file name programmatically from this variable?

This is a classic issue with graphical programming: a simple programmatic property setting like this is very difficult to discover unless you already knew it.

I appreciate any helpful info provided.

Jordan

Use expressions on the Flat File Connection Manager to set the ConnectionString from the variable.
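A sketch of that answer, assuming User::TimeNowVariable already holds a formatted timestamp string (the folder path here is illustrative):

```csharp
// Build an SSIS expression that concatenates a timestamped file name.
// Inside the expression, backslashes in string literals must be escaped,
// hence the doubled escaping in the C# literal below.
ConnectionManager cm = package.Connections["FlatFileConnection"];
string expr = "\"C:\\\\Dumps\\\\Export_\" + @[User::TimeNowVariable] + \".txt\"";
cm.Properties["ConnectionString"].SetExpression(cm, expr);
```

If the variable were a DateTime rather than a string, the expression would also need a cast such as `(DT_WSTR, 30)` before concatenation.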

Programmatically set a Flat File Connection Manager

Hello,

I need to know how I can programmatically set a Flat File Connection Manager's Column Delimiter value.

The Data Warehouse project I am working on receives daily information feeds that could contain one of two delimiters. Which is just dumb... anyway, as it stands we have two separate Data Flow Tasks which handle these two delimiters. Currently we have a script task that "sneak previews" each incoming flat file to determine which delimiter it has, and directs our flow to the correct Data Flow Task to handle it.

I do not want to have to maintain 2 DFTs. How can I get around this problem?

If there is a way to do this by passing variables or setting expressions on the Flat File Connection Manager, I would do that. It does not necessarily have to be a pure programmatic approach.

ANY help would be greatly appreciated!

Feel free to email me at ccorbin@topcoder.com with any questions, or leave me some good news here :)

THANKS!

Chris Corbin

Software Developer

TopCoder Inc.

I thought this was a simple question. Apparently, you can use an expression to set every property of the Flat File Connection Manager except the column delimiter. Maybe that will change in the near future. There must be a reason.

ColumnDelimiter is a property of the FlatFileColumn, not of the connection manager, which is why it is currently not expressionable. Feel free to raise a DCR (Design Change Request) for this issue.

How would I go about raising a DCR?

Thanks


To file a bug/DCR, start here:

http://connect.microsoft.com/SQLServer
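Until such a DCR lands, one workaround for the delimiter question above is to skip expressions entirely and set ColumnDelimiter from code before the data flow runs, since it lives on the FlatFileColumn objects inside the connection manager. A sketch, where `detectedDelimiter` is the hypothetical result of the "sneak preview" script task and `wrap` is the usual alias for Microsoft.SqlServer.Dts.Runtime.Wrapper:

```csharp
// Sketch: rewrite each column delimiter on the flat file connection manager.
// detectedDelimiter is assumed to come from a pre-scan of the incoming file.
string detectedDelimiter = "|";
ConnectionManager cm = package.Connections["FlatFileConnection"];
var ff = (wrap.IDTSConnectionManagerFlatFile90)cm.InnerObject;
for (int i = 0; i < ff.Columns.Count; i++)
{
    // The last column carries the row delimiter (e.g. "\r\n"); leave it alone.
    if (i < ff.Columns.Count - 1)
        ff.Columns[i].ColumnDelimiter = detectedDelimiter;
}
```

With this in place a single Data Flow Task can serve both feed formats, as long as it runs before the data flow validates the connection.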

Friday, March 9, 2012

Programmatically Data Conversion component creation

Hi there,

I have created a package which simply imports data from a flat file into a SQL Server table. But I need to incorporate a Data Conversion component so that I can change the source-to-destination column mapping programmatically. So I thought I needed to add a Data Conversion component to the data flow task. After adding this component (its component ID is {C3BF62C8-7C5C-4F85-83C3-E0B6F6BE267C}) I created a path that establishes the mapping between the output columns of the source component and the input columns of the Data Conversion component. Now I am not sure how to establish the mapping between the Data Conversion component's input column collection and its output column collection.

Here is my code snippet:

IDTSComponentMetaData90 sourceDataFlowComponent = dataFlowTask.ComponentMetaDataCollection.New();
sourceDataFlowComponent.ComponentClassID = "{90C7770B-DE7C-435E-880E-E718C92C0573}";
// ... code for configuring the source data flow component ...

IDTSComponentMetaData90 conversionDataFlowComponent = dataFlowTask.ComponentMetaDataCollection.New(); // creating the data conversion component
conversionDataFlowComponent.ComponentClassID = "{C3BF62C8-7C5C-4F85-83C3-E0B6F6BE267C}"; // GUID for the Data Conversion component
CManagedComponentWrapper conversionInstance = conversionDataFlowComponent.Instantiate();
conversionInstance.ProvideComponentProperties();

// Now create a path to connect the source and the conversion component
IDTSPath90 fPath = dataFlowTask.PathCollection.New();
fPath.AttachPathAndPropagateNotifications(
    sourceDataFlowComponent.OutputCollection[0],
    conversionDataFlowComponent.InputCollection[0]);

// Should I acquire connections for the data conversion component? I'm not sure.
conversionInstance.AcquireConnections(null);
conversionInstance.ReinitializeMetaData();

// Get the input collection
IDTSInput90 input = conversionDataFlowComponent.InputCollection[0];
IDTSVirtualInput90 vInput = input.GetVirtualInput();
foreach (IDTSVirtualInputColumn90 vColumn in vInput.VirtualInputColumnCollection)
{
    conversionInstance.SetUsageType(
        input.ID, vInput, vColumn.LineageID, DTSUsageType.UT_READONLY);
}

// Here I am stuck. What do I need to do to establish a mapping between
// conversionDataFlowComponent.InputCollection[0] and
// conversionDataFlowComponent.OutputCollection[0]?

As you can see, I am just a step away from creating the mapping between the input and output collections. Can anybody give me an idea how I can achieve this?

I will appreciate all kinds of suggestions and comments.

Regards

Moim

Have a look at this code snippet, where I use this transform, selecting all columns and converting types where required based on the result of a helper function:

#region Select Conversion Columns
IDTSOutput90 output = dataConversion.OutputCollection[0];
IDTSVirtualInput90 virtualInput = dataConversion.InputCollection[0].GetVirtualInput();
int inputId = dataConversion.InputCollection[0].ID;
foreach (IDTSVirtualInputColumn90 column in virtualInput.VirtualInputColumnCollection)
{
    int length = column.Length;
    int precision = column.Precision;
    int scale = column.Scale;
    DataType dataType = ComponentHelper.ConvertBufferDataTypeToFitExcel(column.DataType, ref length, ref precision, ref scale);
    if (dataType != column.DataType)
    {
        IDTSInputColumn90 inputColumn = dataConversionInstance.SetUsageType(inputId, virtualInput, column.LineageID, DTSUsageType.UT_READONLY);
        IDTSOutputColumn90 outputColumn = dataConversionInstance.InsertOutputColumnAt(output.ID, output.OutputColumnCollection.Count, column.Name, "Excel compatible conversion column");
        dataConversionInstance.SetOutputColumnDataTypeProperties(output.ID, outputColumn.ID, dataType, length, precision, scale, 0);
        outputColumn.CustomPropertyCollection[0].Value = inputColumn.LineageID;
        inputColumn.Name = DataConversionInputRenamePrefix + inputColumn.Name;
    }
}
#endregion

Btw, how did you get your code to paste so nicely into this editor Window?


DarrenSQLIS wrote:

Btw, how did you get your code to paste so nicely into this editor Window?

Try this:

http://www.codeproject.com/jscript/CopyasHTML.asp

Or just go here: http://www.google.co.uk/search?hl=en&q=copyashtml&meta=

I have an add-in that enables me to do it, but I can't for the life of me find it again.

-Jamie


Jamie Thomson wrote:

DarrenSQLIS wrote:

Btw, how did you get your code to paste so nicely into this editor Window?

I have add-in that enables me to do it but I can't for the life of me find it again.

-Jamie

Here it is: http://www.jtleigh.com/people/colin/software/CopySourceAsHtml/

Nice, thanks Gents.

Hello Darren,

Thanks a lot. After implementing your suggestion I believe I am very close to making it work. I think I am making some silly mistake, because the package is not working as expected. I looked at the log traces and found that validation of the destination data flow component fails. Debugging, I found that after creating a path between the Data Conversion component and the destination component, when I populate the input columns for the destination component it is actually picking up the source component's output column lineage IDs instead of the Data Conversion component's output column lineage IDs. Can you help me find the problem, please?

Anyway, thank you very much for your previous help.

Here is the code snippet:

private void ConfigureSourceComponent()
{
    sourceDataFlowComponent.Name = "Source Data from Flat file";
    // Component class id for the flat file source
    sourceDataFlowComponent.ComponentClassID = "{90C7770B-DE7C-435E-880E-E718C92C0573}";
    CManagedComponentWrapper managedFlatFileInstance = sourceDataFlowComponent.Instantiate();
    managedFlatFileInstance.ProvideComponentProperties();
    sourceDataFlowComponent.RuntimeConnectionCollection[0].ConnectionManagerID = flatfileConnectionManager.ID;
    sourceDataFlowComponent.RuntimeConnectionCollection[0].ConnectionManager = DtsConvert.ToConnectionManager90(flatfileConnectionManager);
    managedFlatFileInstance.AcquireConnections(null);
    managedFlatFileInstance.ReinitializeMetaData();
    IDTSExternalMetadataColumn90 exOutColumn;
    foreach (IDTSOutputColumn90 outColumn in sourceDataFlowComponent.OutputCollection[0].OutputColumnCollection)
    {
        exOutColumn = sourceDataFlowComponent.OutputCollection[0].ExternalMetadataColumnCollection[outColumn.Name];
        managedFlatFileInstance.MapOutputColumn(
            sourceDataFlowComponent.OutputCollection[0].ID, outColumn.ID, exOutColumn.ID, true);
    }
    managedFlatFileInstance.ReleaseConnections();
}

private void ConfigureConversionComponent()
{
    conversionDataFlowComponent.Name = "Conversion component";
    // Component class id for the Data Conversion component
    conversionDataFlowComponent.ComponentClassID = "{C3BF62C8-7C5C-4F85-83C3-E0B6F6BE267C}";
    CManagedComponentWrapper conversionInstance = conversionDataFlowComponent.Instantiate();
    conversionInstance.ProvideComponentProperties();
    // Create the path.
    IDTSPath90 fPath = dataFlowTask.PathCollection.New();
    fPath.AttachPathAndPropagateNotifications(
        sourceDataFlowComponent.OutputCollection[0],
        conversionDataFlowComponent.InputCollection[0]);
    conversionInstance.AcquireConnections(null);
    conversionInstance.ReinitializeMetaData();
    // Get the output collection
    IDTSOutput90 output = conversionDataFlowComponent.OutputCollection[0];
    // Get the conversion component's default input and virtual input.
    input = conversionDataFlowComponent.InputCollection[0];
    vInput = input.GetVirtualInput();
    // Iterate through the virtual input column collection.
    foreach (IDTSVirtualInputColumn90 vColumn in vInput.VirtualInputColumnCollection)
    {
        IDTSInputColumn90 inputColumn = conversionInstance.SetUsageType(
            input.ID, vInput, vColumn.LineageID, DTSUsageType.UT_READONLY);
        IDTSOutputColumn90 outputColumn = conversionInstance.InsertOutputColumnAt(
            output.ID, output.OutputColumnCollection.Count,
            vColumn.Name, "Sql server compatible conversion column");
        conversionInstance.SetOutputColumnDataTypeProperties(
            output.ID, outputColumn.ID, vColumn.DataType,
            vColumn.Length, vColumn.Precision, vColumn.Scale, 0);
        outputColumn.CustomPropertyCollection[0].Value = inputColumn.LineageID;
    }
    conversionInstance.ReleaseConnections();
}

private void ConfigureDestinationComponent()
{
    destinationDataFlowComponent.Name = "Destination OLE DB component";
    // Component class id for the OLE DB destination
    destinationDataFlowComponent.ComponentClassID = "{E2568105-9550-4F71-A638-B7FE42E66922}";
    CManagedComponentWrapper managedOleInstance = destinationDataFlowComponent.Instantiate();
    managedOleInstance.ProvideComponentProperties();
    destinationDataFlowComponent.RuntimeConnectionCollection[0].ConnectionManagerID = oledbConnectionManager.ID;
    destinationDataFlowComponent.RuntimeConnectionCollection[0].ConnectionManager = DtsConvert.ToConnectionManager90(oledbConnectionManager);
    managedOleInstance.SetComponentProperty("AccessMode", 0);
    // ... setting other component properties here ...
    // Create the path.
    IDTSPath90 path = dataFlowTask.PathCollection.New();
    path.AttachPathAndPropagateNotifications(
        conversionDataFlowComponent.OutputCollection[0],
        destinationDataFlowComponent.InputCollection[0]);
    managedOleInstance.AcquireConnections(null);
    managedOleInstance.ReinitializeMetaData();
    // Get the destination's default input and virtual input.
    IDTSInput90 input;
    IDTSVirtualInput90 vInput;
    input = destinationDataFlowComponent.InputCollection[0];
    vInput = input.GetVirtualInput();
    // Iterate through the virtual input column collection.
    foreach (IDTSVirtualInputColumn90 vColumn in vInput.VirtualInputColumnCollection)
    {
        managedOleInstance.SetUsageType(
            input.ID, vInput, vColumn.LineageID, DTSUsageType.UT_READONLY);
        // Culprit line. While debugging I find vColumn.LineageID holds the
        // source lineage IDs. Can you please help me out here?
    }
    IDTSExternalMetadataColumn90 exColumn;
    foreach (IDTSInputColumn90 inColumn in destinationDataFlowComponent.InputCollection[0].InputColumnCollection)
    {
        exColumn = destinationDataFlowComponent.InputCollection[0].ExternalMetadataColumnCollection[inColumn.Name];
        managedOleInstance.MapInputColumn(
            destinationDataFlowComponent.InputCollection[0].ID, inColumn.ID, exColumn.ID);
    }
    managedOleInstance.ReleaseConnections();
}

Here you can see the lines in question (marked in red in the original post), where the destination component is picking up the source output columns' lineage even though it is mapped to the Data Conversion component.

Regards

Moim


The input buffer of the destination will include all upstream columns, which means both the ones that came from the source and those added by the Data Conversion transform. You will need to be selective in choosing the input columns.

Think of the buffer as it passes down the pipeline, or rather what columns are available -

Source Output - SourceCol1

Data Conversion Output - SourceCol1, SourceCol1(converted)

You don't seem to be distinguishing between the input columns and the output columns in the data conversion setup, which seems strange. I assume you would only want to select the converted columns. You could use a name, a description, or perhaps the UpstreamComponentName property of the column.
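Following that suggestion, the destination's column-selection loop could filter on UpstreamComponentName so that only the converted columns are chosen. A sketch reusing the variable names from the snippet above (the exact filter condition is an assumption, not tested code from the thread):

```csharp
// Select only the columns that the Data Conversion component produced,
// skipping the pass-through source columns that share the same buffer.
IDTSInput90 destInput = destinationDataFlowComponent.InputCollection[0];
IDTSVirtualInput90 destVInput = destInput.GetVirtualInput();
foreach (IDTSVirtualInputColumn90 vCol in destVInput.VirtualInputColumnCollection)
{
    if (vCol.UpstreamComponentName == conversionDataFlowComponent.Name)
    {
        managedOleInstance.SetUsageType(
            destInput.ID, destVInput, vCol.LineageID, DTSUsageType.UT_READONLY);
    }
}
```

Renaming the converted output columns with a distinctive prefix (as Darren's earlier snippet does) would make an equally workable filter.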


Hi,

I have a similar requirement. I need to import data from CSV files into SQL Server tables using SSIS programmatically. I am not able to create a Data Conversion component programmatically and map the columns from the Data Conversion component to the OLE DB destination. I tried my level best and got stuck. Is there a code demo available? I appreciate your help.


adarsha wrote:

Hi,

I have a similar requirement. I need to import data from CSV files into SQL Server tables using SSIS programmatically. I am not able to create a Data Conversion component programmatically and map the columns from the Data Conversion component to the OLE DB destination. I tried my level best and got stuck. Is there a code demo available? I appreciate your help.

It would really help if you explained what you are getting stuck on and how far you have come so far. Posting your code would be even better.

-Jamie


Hi,

// In the example below I pump data from a SQL table to CSVs, which works fine for me. But now I need to do the exact opposite: pump data from CSV files into a SQL table.

Problems I have:

1. After adding a flat file source, I am not sure how exactly to add input columns to the flat file source, as they don't get populated by themselves. I tried reading the columns from the file with a DataTable and adding them, and somewhat succeeded, but I need clear implementation help.

2. When I succeeded to some extent in adding input columns to the flat file source, the package failed due to data conversion problems (a Unicode to non-Unicode conversion error). I tried programmatically building the Data Conversion component and had no clue how exactly to do it correctly and make it work.

So I need help adding input columns to the flat file source, adding the Data Conversion component, and selecting only the converted columns into the OLE DB destination.

using System;

using System.Collections.Generic;

using System.Collections.Specialized;

using System.Reflection;

using System.Threading;

using System.Text;

using System.IO;

using Microsoft.SqlServer.Dts.Runtime;

using Microsoft.SqlServer.Dts.Pipeline.Wrapper;

using wrap = Microsoft.SqlServer.Dts.Runtime.Wrapper;

namespace CCE_ETL_Engine_Flatfiles

{

class Flatfile_Pump

{

Boolean mblnDebug = true;

string mstrAppLogFile;

string mstrDTSXFile;

string DataFileName;

string DestinationDataDirectory;

#region intialize

public void InitializeVariables(string strDTSPackageName)

{

DestinationDataDirectory = System.Configuration.ConfigurationManager.AppSettings.Get("DataFilePath");

DataFileName = System.Configuration.ConfigurationManager.AppSettings.Get("DataFileName");

Object objLock = new Object();

Monitor.Enter(objLock);

string strTimeStamp = DateTime.Now.Hour.ToString().PadLeft(2, '0') + DateTime.Now.Minute.ToString().PadLeft(2, '0') + DateTime.Now.Second.ToString().PadLeft(2, '0') + DateTime.Now.Millisecond.ToString();

Monitor.Exit(objLock);

mstrAppLogFile = System.Configuration.ConfigurationManager.AppSettings.Get("ApplicationLogFilePath") + @"\" + strDTSPackageName + "_" + strTimeStamp + ".log";

mstrDTSXFile = System.Configuration.ConfigurationManager.AppSettings.Get("DTSXFilePath") + @"\" + strDTSPackageName + "_" + strTimeStamp + ".dtsx";

if (System.Configuration.ConfigurationManager.AppSettings.Get("Debug").ToUpper() == "FALSE")

mblnDebug = false;

else

mblnDebug = true;

}

#endregion

#region constructor

public Flatfile_Pump(string strDTSPackageName)

{

InitializeVariables(strDTSPackageName);

}

#endregion

#region CreatePackage

public Package CreatePackage(string Name, string Description)

{

Package DataPump = new Package();

DataPump.PackageType = DTSPackageType.DTSDesigner90;

DataPump.Name = Name;

DataPump.Description = Description;

DataPump.CreatorComputerName = System.Environment.MachineName;

DataPump.CreatorName = System.Environment.UserName;

return DataPump;

}

#endregion

#region AddConnectionManagers

/// <summary>

/// Adds the OLEDB and FlatFile connection managers to the package.

/// </summary>

// Here I am adding two connection managers: one for the flat file and one for OLE DB.
public void AddConnectionManagers(Package DataPump, string SqlConnection)

{

// Add the OLEDB connection manager.

ConnectionManager ConnectionName;

ConnectionManager cmflatFile;

ConnectionName = DataPump.Connections.Add("OLEDB");

// Set stock properties.

ConnectionName.Name = "Remote Source";

ConnectionName.ConnectionString = SqlConnection;

//@"Provider=SQLNCLI;Integrated Security=SSPI;Persist Security Info=False;Initial Catalog=cce_control;Data Source=(wxp-9plyf1s);Auto Translate=False;";

// Add the Destination connection manager.

cmflatFile = DataPump.Connections.Add("FLATFILE");

// Set the stock properties.

cmflatFile.Properties["ConnectionString"].SetValue(cmflatFile, DestinationDataDirectory + DataFileName);

cmflatFile.Properties["Format"].SetValue(cmflatFile, "Delimited");

cmflatFile.Properties["DataRowsToSkip"].SetValue(cmflatFile, 0);

cmflatFile.Properties["ColumnNamesInFirstDataRow"].SetValue(cmflatFile, true);

cmflatFile.Properties["Name"].SetValue(cmflatFile, "FlatFileConnection");

cmflatFile.Properties["RowDelimiter"].SetValue(cmflatFile, "\r\n");

cmflatFile.Properties["TextQualifier"].SetValue(cmflatFile, "\"");

}

#endregion

#region AddDataFlowTask

/// <summary>

/// Adds a DataFlow task to the Executables collection of the package.

/// Retrieves the MainPipe object from the TaskHost and stores it in

/// the dataFlow member variable

/// </summary>

public MainPipe AddDataFlowTask(Package DataPump)

{

TaskHost th = DataPump.Executables.Add("DTS.Pipeline") as TaskHost;

th.Name = "DataFlow";

th.Description = "The DataFlow task in the package.";

MainPipe dataFlow = th.InnerObject as MainPipe;

ComponentEvents componentEvents = new ComponentEvents();

dataFlow.Events = componentEvents as wrap.IDTSComponentEvents90;

return dataFlow;

}

#endregion

#region AddOLEDBSource

public IDTSComponentMetaData90 AddOLEDBSource(Microsoft.SqlServer.Dts.Pipeline.Wrapper.MainPipe dataFlow, Package DataPump, string strSourceQuery)

{

// Declare and Add the component to the dataFlow metadata collection

IDTSComponentMetaData90 OLEDBsource;

OLEDBsource = dataFlow.ComponentMetaDataCollection.New();

// Set the common properties

OLEDBsource.ComponentClassID = "DTSAdapter.OLEDBSource";

OLEDBsource.Name = "cce_control";

OLEDBsource.Description = "Remote Source Server";

// Create an instance of the component

CManagedComponentWrapper instance = OLEDBsource.Instantiate();

instance.ProvideComponentProperties();

// Associate the runtime ConnectionManager with the component

OLEDBsource.RuntimeConnectionCollection[0].ConnectionManagerID = DataPump.Connections["Remote Source"].ID;

OLEDBsource.RuntimeConnectionCollection[0].ConnectionManager = DtsConvert.ToConnectionManager90(DataPump.Connections["Remote Source"]);

instance.SetComponentProperty("SqlCommand", strSourceQuery);

instance.SetComponentProperty("AccessMode", 2);

instance.AcquireConnections(null);

instance.ReinitializeMetaData();

instance.ReleaseConnections();

return OLEDBsource;

}

#endregion

#region AddFlatFileDestination

public IDTSComponentMetaData90 AddFlatFileDestination(Microsoft.SqlServer.Dts.Pipeline.Wrapper.MainPipe dataFlow, Package DataPump, IDTSComponentMetaData90 OLEDBsource)

{

// Declare and Add the component to the dataFlow metadata collection

IDTSComponentMetaData90 flatfileDestination;

flatfileDestination = dataFlow.ComponentMetaDataCollection.New();

// Set the common properties

flatfileDestination.ComponentClassID = "DTSAdapter.FlatFileDestination";

flatfileDestination.Name = "FlatFileDestination";

flatfileDestination.Description = "Flat file destination";

// Create an instance of the component

CManagedComponentWrapper instance = flatfileDestination.Instantiate();

instance.ProvideComponentProperties();

// Associate the runtime ConnectionManager with the component

flatfileDestination.RuntimeConnectionCollection[0].ConnectionManagerID

= DataPump.Connections["FlatFileConnection"].ID;

flatfileDestination.RuntimeConnectionCollection[0].ConnectionManager

= DtsConvert.ToConnectionManager90(DataPump.Connections["FlatFileConnection"]);

// Attach a path from the OLE DB source to the FlatFileDestination

dataFlow.PathCollection.New().AttachPathAndPropagateNotifications(OLEDBsource.OutputCollection[0], flatfileDestination.InputCollection[0]);

// Add columns to the FlatFileConnectionManager

AddColumnsToFlatFileConnection(DataPump, flatfileDestination);

// Acquire the connection, reinitialize the metadata,

// map the columns, then release the connection.

instance.AcquireConnections(null);

instance.ReinitializeMetaData();

MapFlatFileDestinationColumns(flatfileDestination);

instance.ReleaseConnections();

return flatfileDestination;

}

#endregion

#region AddColumnsToFlatFileConnection

public void AddColumnsToFlatFileConnection(Package DataPump, IDTSComponentMetaData90 flatfileDestination)

{

wrap.IDTSConnectionManagerFlatFile90 ff = null;

foreach (ConnectionManager cm in DataPump.Connections)

{

if (cm.Name == "FlatFileConnection")

{

ff = cm.InnerObject as wrap.IDTSConnectionManagerFlatFile90;

DtsConvert.ToConnectionManager90(cm);

}

}

int count = ff.Columns.Count;

// if the connection manager is null here, then we have a problem

if (ff != null)

{

// Get the upstream columns

IDTSVirtualInputColumnCollection90 vColumns = flatfileDestination.InputCollection[0].GetVirtualInput().VirtualInputColumnCollection;

for (int cols = 0; cols < vColumns.Count; cols++)

{

wrap.IDTSConnectionManagerFlatFileColumn90 col = ff.Columns.Add();

// If this is the last column, set the delimiter to CRLF.

// Otherwise keep the delimiter as ",".

if (cols == vColumns.Count - 1)

{

col.ColumnDelimiter = "\r\n";

}

else

{

col.ColumnDelimiter = @",";

}

col.ColumnType = "Delimited";

col.DataType = vColumns[cols].DataType;

col.DataPrecision = vColumns[cols].Precision;

col.DataScale = vColumns[cols].Scale;

wrap.IDTSName90 name = col as wrap.IDTSName90;

name.Name = vColumns[cols].Name;

}

}

}

#endregion

#region MapFlatFileDestination Columns

public void MapFlatFileDestinationColumns(IDTSComponentMetaData90 flatfileDestination)

{

CManagedComponentWrapper wrp = flatfileDestination.Instantiate();

IDTSVirtualInput90 vInput = flatfileDestination.InputCollection[0].GetVirtualInput();

foreach (IDTSVirtualInputColumn90 vColumn in vInput.VirtualInputColumnCollection)

{

wrp.SetUsageType(flatfileDestination.InputCollection[0].ID, vInput, vColumn.LineageID, DTSUsageType.UT_READONLY);

}

// For each column in the input collection

// find the corresponding external metadata column.

foreach (IDTSInputColumn90 col in flatfileDestination.InputCollection[0].InputColumnCollection)

{

IDTSExternalMetadataColumn90 exCol = flatfileDestination.InputCollection[0].ExternalMetadataColumnCollection[col.Name];

wrp.MapInputColumn(flatfileDestination.InputCollection[0].ID, col.ID, exCol.ID);

}

}

#endregion

#region Exec_DataPump - This is just the execution: I get values such as the server name, create the package, and execute it.

public Boolean DataPump(string SqlConnection, string strSourceQuery)

{

Package DataPump;

PackageEvents packageEvents;

MainPipe dataFlow;

IDTSComponentMetaData90 OLEDBsource, flatfileDestination;

packageEvents = new PackageEvents();

DataPump = CreatePackage("SSISPackage", "The package for the Flatfile pump");

AddConnectionManagers(DataPump, SqlConnection);

dataFlow = AddDataFlowTask(DataPump);

OLEDBsource = AddOLEDBSource(dataFlow, DataPump, strSourceQuery);

flatfileDestination = AddFlatFileDestination(dataFlow, DataPump, OLEDBsource);

DTSExecResult status = DataPump.Validate(null, null, packageEvents, null);

if (mblnDebug)

{

Application dts = new Application();

dts.SaveToXml(mstrDTSXFile, DataPump, packageEvents);

}

foreach (Variable v in DataPump.Variables)

{

if (v.Name.ToUpper() == "EXECUTIONINSTANCEGUID")

{

break;

}

}

if (status == DTSExecResult.Success)

{

DTSExecResult result = DataPump.Execute(null, null, packageEvents, null, null);

foreach (Variable v in DataPump.Variables)

{

Console.WriteLine(v.Name.ToUpper());

}

}

return true;

}

#endregion

}

}

Programmatically Data Conversion component creation

Hi there,

I have created a package which simply imports data from a flat file to a SQL Server table. But I need to incorporate a data conversion component by which I may change the source-destination column mapping programmatically. So what I thought that I need to add a data conversion component into the dataflow task. After adding this component (I found its component id as {C3BF62C8-7C5C-4F85-83C3-E0B6F6BE267C}) I have created a path which establishes the mapping between output columns of source component and the input columns of data conversion component. Now I am not sure how to establish the mapping between the data conversion component’s input column collection and output column collection.

I am giving my code snippet here,

IDTSComponentMetaData90 sourceDataFlowComponent = dataFlowTask.ComponentMetaDataCollection.New();

sourceDataFlowComponent.ComponentClassID = "{90C7770B-DE7C-435E-880E-E718C92C0573}";

… … …. // Code for configuring the source data flow component

IDTSComponentMetaData90 conversionDataFlowComponent = dataFlowTask.ComponentMetaDataCollection.New();// creating data conversion

conversionDataFlowComponent.ComponentClassID = "{C3BF62C8-7C5C-4F85-83C3-E0B6F6BE267C}";// This is the GUID for data conversion component

CManagedComponentWrapper conversionInstance = conversionDataFlowComponent.Instantiate();//Instantiate

conversionInstance.ProvideComponentProperties();

// Now creating a path to connet the source and conversion

IDTSPath90 fPath = dataFlowTask.PathCollection.New();fPath.AttachPathAndPropagateNotifications(

sourceDataFlowComponent.OutputCollection[0],

conversionDataFlowComponent.InputCollection[0]);

// Sould I need to accuire connect for data conversion? Im not sure

conversionInstance.AcquireConnections(null);

conversionInstance.ReinitializeMetaData();

// Get the input collection

IDTSInput90 input = conversionDataFlowComponent.InputCollection[0];

IDTSVirtualInput90 vInput = input.GetVirtualInput();

foreach (IDTSVirtualInputColumn90 vColumn in vInput.VirtualInputColumnCollection){

conversionInstance.SetUsageType(

input.ID, vInput, vColumn.LineageID, DTSUsageType.UT_READONLY);

}

.. . // Well here I am stucked. What I need to do here to establish a map

// between conversionDataFlowComponent.InputCollection[0] and

// conversionDataFlowComponent.OutputCollection[0]?

As you can see I am just away from creating the mapping between input and output collection. Can anybody give me an idea how can I achieve this?

I will appreciate all kind of suggestions and comments.

Regards

Moim

Have a look at this code snippet where I use this transform, selecting all columns, and converting types where required based on the result of a helper function-

#region Select Conversion Columns

IDTSOutput90 output = dataConversion.OutputCollection[0];

IDTSVirtualInput90 virtualInput = dataConversion.InputCollection[0].GetVirtualInput();

int inputId = dataConversion.InputCollection[0].ID;

foreach (IDTSVirtualInputColumn90 column in virtualInput.VirtualInputColumnCollection)

{

int length = column.Length;

int precision = column.Precision;

int scale = column.Scale;

DataType dataType = ComponentHelper.ConvertBufferDataTypeToFitExcel(column.DataType, ref length, ref precision, ref scale);

if (dataType != column.DataType)

{

IDTSInputColumn90 inputColumn = dataConversionInstance.SetUsageType(inputId, virtualInput, column.LineageID, DTSUsageType.UT_READONLY);

IDTSOutputColumn90 outputColumn = dataConversionInstance.InsertOutputColumnAt(output.ID, output.OutputColumnCollection.Count, column.Name, "Excel compatible conversion column");

dataConversionInstance.SetOutputColumnDataTypeProperties(output.ID, outputColumn.ID, dataType, length, precision, scale, 0);

outputColumn.CustomPropertyCollection[0].Value = inputColumn.LineageID;

inputColumn.Name = DataConversionInputRenamePrefix + inputColumn.Name;

}

}

#endregion

Btw, how did you get your code to paste so nicely into this editor Window?

|||

DarrenSQLIS wrote:

Btw, how did you get your code to paste so nicely into this editor Window?

Try this:

http://www.codeproject.com/jscript/CopyasHTML.asp

Or just go here: http://www.google.co.uk/search?hl=en&q=copyashtml&meta=

I have add-in that enables me to do it but I can't for the life of me find it again.

-Jamie

|||

Jamie Thomson wrote:

DarrenSQLIS wrote:

Btw, how did you get your code to paste so nicely into this editor Window?

I have add-in that enables me to do it but I can't for the life of me find it again.

-Jamie

Here it is: http://www.jtleigh.com/people/colin/software/CopySourceAsHtml/|||Nice, thanks Gents.|||

Hello Darran,

Thanks a lot. After implementing your suggestion I believe I am very near to make it work. I think I am missing/doing some silly things/mistakes so that the package is not working as expected. I observed the log traces and found that it is failing to validate the destination data flow component. And debugging it I found that after creating a path between the data conversion component and the destination component, when I populate the input columns for the destination component it is actually getting the source components output columns lineageID instead of data conversion component’s output columns lineageID. Can you help me to find out the problem please?

Anyway, thank you very much for your previous help.

Here is the code snippet:

private void ConfigureSourceComponent()

{

sourceDataFlowComponent.Name = "Source Data from Flat file";

// Here is the component class id for flat file source data

sourceDataFlowComponent.ComponentClassID = "{90C7770B-DE7C-435E-880E-E718C92C0573}";

CManagedComponentWrapper managedFlatFileInstance =

sourceDataFlowComponent.Instantiate();

managedFlatFileInstance.ProvideComponentProperties();

sourceDataFlowComponent.RuntimeConnectionCollection[0].ConnectionManagerID = flatfileConnectionManager.ID;

sourceDataFlowComponent.RuntimeConnectionCollection[0].ConnectionManager = DtsConvert.ToConnectionManager90(flatfileConnectionManager);

managedFlatFileInstance.AcquireConnections(null);

managedFlatFileInstance.ReinitializeMetaData();

IDTSExternalMetadataColumn90 exOutColumn;

foreach (IDTSOutputColumn90 outColumn in sourceDataFlowComponent.OutputCollection[0].OutputColumnCollection)

{

exOutColumn = sourceDataFlowComponent.OutputCollection[0].ExternalMetadataColumnCollection[outColumn.Name];

managedFlatFileInstance.MapOutputColumn(sourceDataFlowComponent.OutputCollection[0].ID, outColumn.ID, exOutColumn.ID, true);

}

managedFlatFileInstance.ReleaseConnections();

}

private void ConfigureConversionComponent()

{

conversionDataFlowComponent.Name = "Conversion component";

// component class id for Data conversion component

conversionDataFlowComponent.ComponentClassID = "{C3BF62C8-7C5C-4F85-83C3-E0B6F6BE267C}";

CManagedComponentWrapper conversionInstance =

conversionDataFlowComponent.Instantiate();

conversionInstance.ProvideComponentProperties();

// Create the path.

IDTSPath90 fPath = dataFlowTask.PathCollection.New();

fPath.AttachPathAndPropagateNotifications(sourceDataFlowComponent.OutputCollection[0], conversionDataFlowComponent.InputCollection[0]);

conversionInstance.AcquireConnections(null);

conversionInstance.ReinitializeMetaData();

// Get the output collection

IDTSOutput90 output = conversionDataFlowComponent.OutputCollection[0];

// Get the conversion component's default input and virtual input.

IDTSInput90 input = conversionDataFlowComponent.InputCollection[0];

IDTSVirtualInput90 vInput = input.GetVirtualInput();

// Iterate through the virtual input column collection.

foreach (IDTSVirtualInputColumn90 vColumn in vInput.VirtualInputColumnCollection)

{

IDTSInputColumn90 inputColumn =

conversionInstance.SetUsageType(

input.ID, vInput, vColumn.LineageID,

DTSUsageType.UT_READONLY);

IDTSOutputColumn90 outputColumn =

conversionInstance.InsertOutputColumnAt

(output.ID, output.OutputColumnCollection.Count,

vColumn.Name, "Sql server compatible conversion column");

conversionInstance.SetOutputColumnDataTypeProperties(

output.ID, outputColumn.ID, vColumn.DataType,

vColumn.Length, vColumn.Precision, vColumn.Scale, 0);

outputColumn.CustomPropertyCollection[0].Value =

inputColumn.LineageID;

}

conversionInstance.ReleaseConnections();

}

private void ConfigureDestinationComponent()

{

destinationDataFlowComponent.Name = "Destination Oledb component";

// component class id for Oledb

destinationDataFlowComponent.ComponentClassID = "{E2568105-9550-4F71-A638-B7FE42E66922}";

CManagedComponentWrapper managedOleInstance =

destinationDataFlowComponent.Instantiate();

managedOleInstance.ProvideComponentProperties();

destinationDataFlowComponent.RuntimeConnectionCollection[0].

ConnectionManagerID = oledbConnectionManager.ID;

destinationDataFlowComponent.RuntimeConnectionCollection[0].

ConnectionManager =DtsConvert.ToConnectionManager90(oledbConnectionManager);

managedOleInstance.SetComponentProperty("AccessMode", 0);

// .. .. .. setting other component properties here..

// Create the path.

IDTSPath90 path = dataFlowTask.PathCollection.New();

path.AttachPathAndPropagateNotifications(

conversionDataFlowComponent.OutputCollection[0],

destinationDataFlowComponent.InputCollection[0]

);

managedOleInstance.AcquireConnections(null);

managedOleInstance.ReinitializeMetaData();

// Get the destination's default input and virtual input.

IDTSInput90 input = destinationDataFlowComponent.InputCollection[0];

IDTSVirtualInput90 vInput = input.GetVirtualInput();

// Iterate through the virtual input column collection.

foreach (IDTSVirtualInputColumn90 vColumn in

vInput.VirtualInputColumnCollection)

{

managedOleInstance.SetUsageType(

input.ID, vInput, vColumn.LineageID,

DTSUsageType.UT_READONLY);

// Culprit lines: while debugging I found that vColumn.LineageID here holds

// the source component's lineage IDs. Can you please help me out here?

}

IDTSExternalMetadataColumn90 exColumn;

foreach (IDTSInputColumn90 inColumn in

destinationDataFlowComponent.InputCollection[0].InputColumnCollection)

{

exColumn = destinationDataFlowComponent.InputCollection[0].ExternalMetadataColumnCollection[inColumn.Name];

managedOleInstance.MapInputColumn(destinationDataFlowComponent.InputCollection[0].ID, inColumn.ID, exColumn.ID);

}

managedOleInstance.ReleaseConnections();

}

In the lines marked above you can see that the destination component is picking up the source output columns' lineage IDs, although it is mapped to the data conversion component.

Regards

Moim

|||

The input buffer of the destination will include all upstream columns, so that means both the ones that came from the source and those from the Data Conversion Tx. You will need to be selective in choosing the input columns.

Think of the buffer as it passes down the pipeline, or rather what columns are available -

Source Output - SourceCol1

Data Conversion Output - SourceCol1, SourceCol1(converted)

You don't seem to be distinguishing between the input columns and the output columns in the data conversion setup, which seems strange. You would only want to select the converted columns, I assume. You could use the name, the description, or perhaps the UpstreamComponentName property of the column.
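Building on that suggestion, here is a rough sketch of what the destination's column-selection loop might look like if you filter on UpstreamComponentName, so that only the converted columns (rather than everything in the upstream buffer) are selected. This assumes the same component names and wrapper instances as the code posted above:

```
// Sketch only: select just the columns that originate from the
// Data Conversion component, skipping the ones that came straight
// from the flat file source.
IDTSInput90 input = destinationDataFlowComponent.InputCollection[0];
IDTSVirtualInput90 vInput = input.GetVirtualInput();

foreach (IDTSVirtualInputColumn90 vColumn in vInput.VirtualInputColumnCollection)
{
    // Keep only columns emitted by the conversion component.
    if (vColumn.UpstreamComponentName != conversionDataFlowComponent.Name)
        continue;

    managedOleInstance.SetUsageType(input.ID, vInput, vColumn.LineageID, DTSUsageType.UT_READONLY);
}
```

With this in place, the InputColumnCollection only contains the converted columns, so the subsequent MapInputColumn loop maps the right lineage IDs.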

|||

Hi,

I have a similar requirement. I need to import data from csv files to sql server tables using SSIS programmatically. I am not able to create a data conversion component programmatically and map the columns from the data conversion component to the OLE DB destination. I have tried my level best and got stuck. Is there a code demo available? Appreciating your help.

|||

adarsha wrote:

Hi,

I have a similar requirement. I need to import data from csv files to sql server tables using SSIS programmatically. I am not able to create a data conversion component programmatically and map the columns from the data conversion component to the OLE DB destination. I have tried my level best and got stuck. Is there a code demo available? Appreciating your help.

It would really help if you explain what you are getting stuck on and how far you have come so far. Posting your code would be even better.

-Jamie

|||

Hi,

// In the example below I have pumped data from a sql table to csv files, which works fine for me. But now I need to do the exact opposite task (pump data from csv files to a sql table).

Problems i have :

1:

After adding a flat file source I am not very sure how exactly to add input columns to the flat file source, as they don't get populated by themselves. I tried reading the columns from the file into a DataTable and adding them, and somewhat succeeded, but I need a clearer implementation.

2:

To the extent that I succeeded in adding input columns to the flat file source, the package failed due to data conversion problems (unicode to non-unicode conversion error). I tried programmatically building a data conversion component but had no clue how exactly to do it correctly and make it work.

So I need help with adding input columns to the flat file source, adding a data conversion component, and selecting only the converted columns into the OLE DB destination.
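For the unicode-to-non-unicode error, the usual fix is a Data Conversion component between the flat file source and the OLE DB destination, converting each DT_WSTR column to DT_STR. A rough sketch, assuming the source and conversion components are already created and wired with a path as shown earlier in this thread (the "_nonunicode" suffix and code page 1252 are assumptions, not anything from the original code):

```
// Sketch only: add a DT_WSTR -> DT_STR conversion output column for
// every unicode string column coming out of the flat file source.
IDTSInput90 convInput = conversionDataFlowComponent.InputCollection[0];
IDTSVirtualInput90 convVInput = convInput.GetVirtualInput();
IDTSOutput90 convOutput = conversionDataFlowComponent.OutputCollection[0];

foreach (IDTSVirtualInputColumn90 vColumn in convVInput.VirtualInputColumnCollection)
{
    if (vColumn.DataType != Microsoft.SqlServer.Dts.Runtime.Wrapper.DataType.DT_WSTR)
        continue;

    IDTSInputColumn90 inCol = conversionInstance.SetUsageType(
        convInput.ID, convVInput, vColumn.LineageID, DTSUsageType.UT_READONLY);

    IDTSOutputColumn90 outCol = conversionInstance.InsertOutputColumnAt(
        convOutput.ID, convOutput.OutputColumnCollection.Count,
        vColumn.Name + "_nonunicode", "Converted to DT_STR");

    // Same length, but non-unicode; code page 1252 is an assumption here.
    conversionInstance.SetOutputColumnDataTypeProperties(
        convOutput.ID, outCol.ID,
        Microsoft.SqlServer.Dts.Runtime.Wrapper.DataType.DT_STR,
        vColumn.Length, 0, 0, 1252);

    // Tell the conversion which input column feeds this output column.
    outCol.CustomPropertyCollection[0].Value = inCol.LineageID;
}
```

At the destination you would then select only the "_nonunicode" columns (for example by checking each virtual input column's UpstreamComponentName, as Darren suggested above) before mapping them to the external metadata columns.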

using System;

using System.Collections.Generic;

using System.Collections.Specialized;

using System.Reflection;

using System.Threading;

using System.Text;

using System.IO;

using Microsoft.SqlServer.Dts.Runtime;

using Microsoft.SqlServer.Dts.Pipeline.Wrapper;

using wrap = Microsoft.SqlServer.Dts.Runtime.Wrapper;

namespace CCE_ETL_Engine_Flatfiles

{

class Flatfile_Pump

{

Boolean mblnDebug = true;

string mstrAppLogFile;

string mstrDTSXFile;

string DataFileName;

string DestinationDataDirectory;

#region intialize

public void InitializeVariables(string strDTSPackageName)

{

DestinationDataDirectory = System.Configuration.ConfigurationManager.AppSettings.Get("DataFilePath");

DataFileName = System.Configuration.ConfigurationManager.AppSettings.Get("DataFileName");

Object objLock = new Object();

Monitor.Enter(objLock);

string strTimeStamp = DateTime.Now.Hour.ToString().PadLeft(2, '0') + DateTime.Now.Minute.ToString().PadLeft(2, '0') + DateTime.Now.Second.ToString().PadLeft(2, '0') + DateTime.Now.Millisecond.ToString();

Monitor.Exit(objLock);

mstrAppLogFile = System.Configuration.ConfigurationManager.AppSettings.Get("ApplicationLogFilePath") + @"\" + strDTSPackageName + "_" + strTimeStamp + ".log";

mstrDTSXFile = System.Configuration.ConfigurationManager.AppSettings.Get("DTSXFilePath") + @"\" + strDTSPackageName + "_" + strTimeStamp + ".dtsx";

if (System.Configuration.ConfigurationManager.AppSettings.Get("Debug").ToUpper() == "FALSE")

mblnDebug = false;

else

mblnDebug = true;

}

#endregion

#region constructor

public Flatfile_Pump(string strDTSPackageName)

{

InitializeVariables(strDTSPackageName);

}

#endregion

#region CreatePackage

public Package CreatePackage(string Name, string Description)

{

Package DataPump = new Package();

DataPump.PackageType = DTSPackageType.DTSDesigner90;

DataPump.Name = Name;

DataPump.Description = Description;

DataPump.CreatorComputerName = System.Environment.MachineName;

DataPump.CreatorName = System.Environment.UserName;

return DataPump;

}

#endregion

#region AddConnectionManagers

/// <summary>

/// Adds the OLEDB and FlatFile connection managers to the package.

/// </summary>

// Here I am adding two connection managers: one for the flat file and one for OLE DB.

public void AddConnectionManagers(Package DataPump, string SqlConnection)

{

// Add the OLEDB connection manager.

ConnectionManager ConnectionName;

ConnectionManager cmflatFile;

ConnectionName = DataPump.Connections.Add("OLEDB");

// Set stock properties.

ConnectionName.Name = "Remote Source";

ConnectionName.ConnectionString = SqlConnection;

//@"Provider=SQLNCLI;Integrated Security=SSPI;Persist Security Info=False;Initial Catalog=cce_control;Data Source=(wxp-9plyf1s);Auto Translate=False;";

// Add the Destination connection manager.

cmflatFile = DataPump.Connections.Add("FLATFILE");

// Set the stock properties.

cmflatFile.Properties["ConnectionString"].SetValue(cmflatFile, DestinationDataDirectory + DataFileName);

cmflatFile.Properties["Format"].SetValue(cmflatFile, "Delimited");

cmflatFile.Properties["DataRowsToSkip"].SetValue(cmflatFile, 0);

cmflatFile.Properties["ColumnNamesInFirstDataRow"].SetValue(cmflatFile, true);

cmflatFile.Properties["Name"].SetValue(cmflatFile, "FlatFileConnection");

cmflatFile.Properties["RowDelimiter"].SetValue(cmflatFile, "\r\n");

cmflatFile.Properties["TextQualifier"].SetValue(cmflatFile, "\"");

}

#endregion

#region AddDataFlowTask

/// <summary>

/// Adds a DataFlow task to the Executables collection of the package.

/// Retrieves the MainPipe object from the TaskHost and stores it in

/// the dataFlow member variable

/// </summary>

public MainPipe AddDataFlowTask(Package DataPump)

{

TaskHost th = DataPump.Executables.Add("DTS.Pipeline") as TaskHost;

th.Name = "DataFlow";

th.Description = "The DataFlow task in the package.";

MainPipe dataFlow = th.InnerObject as MainPipe;

ComponentEvents componentEvents = new ComponentEvents();

dataFlow.Events = componentEvents as wrap.IDTSComponentEvents90;

return dataFlow;

}

#endregion

#region AddOLEDBSource

public IDTSComponentMetaData90 AddOLEDBSource(Microsoft.SqlServer.Dts.Pipeline.Wrapper.MainPipe dataFlow, Package DataPump, string strSourceQuery)

{

// Declare and Add the component to the dataFlow metadata collection

IDTSComponentMetaData90 OLEDBsource;

OLEDBsource = dataFlow.ComponentMetaDataCollection.New();

// Set the common properties

OLEDBsource.ComponentClassID = "DTSAdapter.OLEDBSource";

OLEDBsource.Name = "cce_control";

OLEDBsource.Description = "Remote Source Server";

// Create an instance of the component

CManagedComponentWrapper instance = OLEDBsource.Instantiate();

instance.ProvideComponentProperties();

// Associate the runtime ConnectionManager with the component

OLEDBsource.RuntimeConnectionCollection[0].ConnectionManagerID = DataPump.Connections["Remote Source"].ID;

OLEDBsource.RuntimeConnectionCollection[0].ConnectionManager = DtsConvert.ToConnectionManager90(DataPump.Connections["Remote Source"]);

instance.SetComponentProperty("SqlCommand", strSourceQuery);

instance.SetComponentProperty("AccessMode", 2);

instance.AcquireConnections(null);

instance.ReinitializeMetaData();

instance.ReleaseConnections();

return OLEDBsource;

}

#endregion

#region AddFlatFileDestination

public IDTSComponentMetaData90 AddFlatFileDestination(Microsoft.SqlServer.Dts.Pipeline.Wrapper.MainPipe dataFlow, Package DataPump, IDTSComponentMetaData90 OLEDBsource)

{

// Declare and Add the component to the dataFlow metadata collection

IDTSComponentMetaData90 flatfileDestination;

flatfileDestination = dataFlow.ComponentMetaDataCollection.New();

// Set the common properties

flatfileDestination.ComponentClassID = "DTSAdapter.FlatFileDestination";

flatfileDestination.Name = "FlatFileDestination";

flatfileDestination.Description = "Flat file destination";

// Create an instance of the component

CManagedComponentWrapper instance = flatfileDestination.Instantiate();

instance.ProvideComponentProperties();

// Associate the runtime ConnectionManager with the component

flatfileDestination.RuntimeConnectionCollection[0].ConnectionManagerID

= DataPump.Connections["FlatFileConnection"].ID;

flatfileDestination.RuntimeConnectionCollection[0].ConnectionManager

= DtsConvert.ToConnectionManager90(DataPump.Connections["FlatFileConnection"]);

// Map a path between the Sort transformation component to the FlatFileDestination

dataFlow.PathCollection.New().AttachPathAndPropagateNotifications(OLEDBsource.OutputCollection[0], flatfileDestination.InputCollection[0]);

// Add columns to the FlatFileConnectionManager

AddColumnsToFlatFileConnection(DataPump, flatfileDestination);

// Acquire the connection, reinitialize the metadata,

// map the columns, then release the connection.

instance.AcquireConnections(null);

instance.ReinitializeMetaData();

MapFlatFileDestinationColumns(flatfileDestination);

instance.ReleaseConnections();

return flatfileDestination;

}

#endregion

#region AddColumnsToFlatFileConnection

public void AddColumnsToFlatFileConnection(Package DataPump, IDTSComponentMetaData90 flatfileDestination)

{

wrap.IDTSConnectionManagerFlatFile90 ff = null;

foreach (ConnectionManager cm in DataPump.Connections)

{

if (cm.Name == "FlatFileConnection")

{

ff = cm.InnerObject as wrap.IDTSConnectionManagerFlatFile90;

DtsConvert.ToConnectionManager90(cm);

}

}

// If the connection manager is null here, then we have a problem.

if (ff != null)

{

// Get the upstream columns

IDTSVirtualInputColumnCollection90 vColumns = flatfileDestination.InputCollection[0].GetVirtualInput().VirtualInputColumnCollection;

for (int cols = 0; cols < vColumns.Count; cols++)

{

wrap.IDTSConnectionManagerFlatFileColumn90 col = ff.Columns.Add();

// If this is the last column, set the delimiter to CRLF.

// Otherwise keep the delimiter as ",".

if (cols == vColumns.Count - 1)

{

col.ColumnDelimiter = "\r\n";

}

else

{

col.ColumnDelimiter = ",";

}

col.ColumnType = "Delimited";

col.DataType = vColumns[cols].DataType;

col.DataPrecision = vColumns[cols].Precision;

col.DataScale = vColumns[cols].Scale;

wrap.IDTSName90 name = col as wrap.IDTSName90;

name.Name = vColumns[cols].Name;

}

}

}

#endregion

#region MapFlatFileDestination Columns

public void MapFlatFileDestinationColumns(IDTSComponentMetaData90 flatfileDestination)

{

CManagedComponentWrapper wrp = flatfileDestination.Instantiate();

IDTSVirtualInput90 vInput = flatfileDestination.InputCollection[0].GetVirtualInput();

foreach (IDTSVirtualInputColumn90 vColumn in vInput.VirtualInputColumnCollection)

{

wrp.SetUsageType(flatfileDestination.InputCollection[0].ID, vInput, vColumn.LineageID, DTSUsageType.UT_READONLY);

}

// For each column in the input collection

// find the corresponding external metadata column.

foreach (IDTSInputColumn90 col in flatfileDestination.InputCollection[0].InputColumnCollection)

{

IDTSExternalMetadataColumn90 exCol = flatfileDestination.InputCollection[0].ExternalMetadataColumnCollection[col.Name];

wrp.MapInputColumn(flatfileDestination.InputCollection[0].ID, col.ID, exCol.ID);

}

}

#endregion

#region Exec_DataPump -- This is just the execution, where I get values such as the server name, then create the package and execute it.

public Boolean DataPump(string SqlConnection, string strSourceQuery)

{

Package DataPump;

PackageEvents packageEvents;

MainPipe dataFlow;

IDTSComponentMetaData90 OLEDBsource, flatfileDestination;

packageEvents = new PackageEvents();

DataPump = CreatePackage("SSISPackage", "The package for the Flatfile pump");

AddConnectionManagers(DataPump, SqlConnection);

dataFlow = AddDataFlowTask(DataPump);

OLEDBsource = AddOLEDBSource(dataFlow, DataPump, strSourceQuery);

flatfileDestination = AddFlatFileDestination(dataFlow, DataPump, OLEDBsource);

DTSExecResult status = DataPump.Validate(null, null, packageEvents, null);

if (mblnDebug)

{

Application dts = new Application();

dts.SaveToXml(mstrDTSXFile, DataPump, packageEvents);

}

foreach (Variable v in DataPump.Variables)

{

if (v.Name.ToUpper() == "EXECUTIONINSTANCEGUID")

{

break;

}

}

if (status == DTSExecResult.Success)

{

DTSExecResult result = DataPump.Execute(null, null, packageEvents, null, null);

foreach (Variable v in DataPump.Variables)

{

Console.WriteLine(v.Name.ToUpper());

}

}

return true;

}

#endregion

}

}

Monday, February 20, 2012

Programmatically create and configure a FlatFileConnectionManager

I am writing some C# code that builds a package programmatically. The package contains a flat file connection manager. I have referenced Microsoft.SqlServer.Dts.Runtime.Wrapper in order that I can use the ConnectionManagerFlatFileClass class.

Here is some of my code:

Package p = new Package();

ConnectionManager sourceCM = p.Connections.Add("FLATFILE");

sourceCM.Name = "some-name";

ConnectionManagerFlatFileClass cmffc = sourceCM.InnerObject;

I get a compilation error on the last line saying I cannot implicitly cast as ConnectionManagerFlatFileClass. Fair enough, so I change the last line of code to:

ConnectionManagerFlatFileClass cmffc = (ConnectionManagerFlatFileClass)sourceCM.InnerObject;

Now it will compile. But when I run it I get the error:

"Unable to cast COM object of type 'System.__ComObject' to class type 'Microsoft.SqlServer.Dts.Runtime.Wrapper.ConnectionManagerFlatFileClass'. COM components that enter the CLR and do not support IProvideClassInfo or that do not have interop assembly referenced will be wrapped in the __ComObject type. Instances of this type cannot be cast to any other class; however they can be cast to interfaces as long as the underlying COM component supports QueryInterface calls for the IID of the interface."

Can anyone tell me how I can cast the ConnectionManager.InnerObject as a ConnectionManagerFlatFileClass please?

-Jamie

P.S. My ultimate aim here is to set the metadata of each of the columns defined by the flat file connection manager. I can't see how I can do that without referring to the connection manager as a ConnectionManagerFlatFileClass!

OK, it looks like I can reference the Columns collection of my flat file connection manager using

sourceCM.Properties["Columns"]

but that returns me an object of type DtsProperty, which I've tried casting to IDTSOutputColumnCollection90, but it says it cannot convert a Microsoft.SqlServer.Dts.Runtime.DtsProperty to Microsoft.SqlServer.Dts.Pipeline.Wrapper.IDTSOutputColumnCollection90 which, again, makes complete sense!

So, if anyone can tell me how I can programmatically access the columns in my flat file connection manager then I'd be very grateful.

-Jamie

|||

DtsProperty.GetValue Method ?

DtsProperty is the definition of the property, and you want the actual value, which is where the GetValue method comes in.

I think the whole idea of this is to abstract you from the real class, such that it can be changed and updated at will without breaking existing code. Not too sure of the deal on binary compatibility as we used to think of it, it seems to be less of an issue, but there is probably some good reason!

|||

Thanks mate, but that won't do it. However, I've managed to find the answer elsewhere (and for once in my life I didn't rely on Darren to give me the answer). Namely, in this sample: http://msdn2.microsoft.com/en-us/library/ms161541.aspx

Here's the code that does it, turns out I DO need the Microsoft.SqlServer.Dts.Runtime.Wrapper namespace:

using wrap = Microsoft.SqlServer.Dts.Runtime.Wrapper;

...

private wrap.IDTSConnectionManagerFlatFile90 ffc = (wrap.IDTSConnectionManagerFlatFile90)sourceCM.InnerObject;

wrap.IDTSConnectionManagerFlatFileColumn90 col = ffc.Columns.Add();

OK, cool. I've got it working. It kinda begs the question though: why is there no class in the Microsoft.SqlServer.Dts.Runtime namespace that enables us to do it? It's not like configuring a FlatFile connection manager isn't core functionality. I've asked the same question elsewhere (sort of). Hopefully it'll get answered.
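Pulling the working pieces together, here is a minimal sketch of defining one column's metadata on the flat file connection manager via the IDTSConnectionManagerFlatFile90 interface. The column name, data type and width below are placeholders, not anything from the original posts:

```
using wrap = Microsoft.SqlServer.Dts.Runtime.Wrapper;

// Sketch only: cast the connection manager's inner object to the flat
// file interface, then define one delimited column's metadata.
wrap.IDTSConnectionManagerFlatFile90 ffc =
    (wrap.IDTSConnectionManagerFlatFile90)sourceCM.InnerObject;

wrap.IDTSConnectionManagerFlatFileColumn90 col = ffc.Columns.Add();
col.ColumnType = "Delimited";
col.ColumnDelimiter = ",";            // the last column would use "\r\n"
col.DataType = wrap.DataType.DT_STR;  // placeholder data type
col.ColumnWidth = 50;                 // placeholder width

// The column name lives on the IDTSName90 interface, not on the column itself.
((wrap.IDTSName90)col).Name = "SomeColumn";  // placeholder name
```

The cast to IDTSName90 to set the name is the non-obvious part; the same trick appears in the AddColumnsToFlatFileConnection method posted earlier in this thread.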

-Jamie