
Friday, March 30, 2012

Pros and Cons of saving to serverstorage versus file system

What are the pros and cons of saving the SSIS package using serverstorage versus the file system? It appears to me that the file system is much more flexible and can be promoted anywhere without the hassle of exporting from msdb, etc.

Thanks,
Lito

Lito wrote:

What are the pros and cons of saving the SSIS package using serverstorage versus the file system? It appears to me that the file system is much more flexible and can be promoted anywhere without the hassle of exporting from msdb, etc.

Thanks,
Lito

I agree. The single biggest issue I have with server deployment is that you bring a whole new layer of management into play if you're using the Execute Package task (which likely many people will be).
i.e. At design time you use a file connection manager for calling other packages; at runtime you use an OLE DB connection manager. So not only do you have to tell the package which environment it's running in so that it knows which connection manager to use, promoting from dev-->test-->live becomes a real headache because you don't have uniformity across environments.

Just my 2 penneth worth!

-Jamie

|||

Kirk has blogged about some of the pros and cons of saving to SQL Server vs. the file system. See

http://sqljunkies.com/WebLog/knight_reign/archive/2005/05/05/13523.aspx

- Ranjeeta


Wednesday, March 28, 2012

Property expressions: Values not updated

I see some strange behavior when running a package using the SQL Server Agent. The package I run calls a number of child packages, in which I use property expressions to set - among other things - the path to text files I read into a database.

The property expressions are in the form: @[User::ThePath] + "\\TheFile.txt". The variable ThePath is read from an XML configuration file at runtime from the "master" package and passed on to the child packages using a parent package variable configuration.

My problem is this: When the package is executed by the SQL Server Agent, the properties for the path of the text files are not updated, and the package fails, as it cannot locate the files. When using DTExecUI to execute the same package, everything runs fine.

Could it be a permissions issue? I don't think so, as the SQL Server Agent runs under the same account that I am logged in with when executing via DTExecUI.

Anyone?

It does sound a bit like permissions.
Are you referring to the account the Agent service runs under, or the account the specific Agent job step is executing as? These are two different things.

>>>Could it be a permissions issue? I don't think so, as the SQL Server Agent runs under the same account as I am logged in when executing with DTExecUI.

http://support.microsoft.com/?kbid=918760

|||

The SQL Agent is running under a local administrator account, which is the same one I use when I log in and run the package using DTExecUI. In the scheduled job, the package is "Run as" the "SQL Agent Service Account", which I think should mean that it is executing in the same security context as when I run it manually?!

|||

This one has been bugging me big time!

Now, it seems that I might have found a workaround. If I store the main/parent package (the one executed in the Agent job) in MSDB and let the job execute it from here, everything seems to work fine and the property expressions are properly updated at runtime - even though those property expressions are used in the child packages. Previously I had all the packages stored in SSIS Package Store (File System), and that is also where I still keep the child packages.

Can anyone explain this behavior?

|||

Can anyone give any explanation for this behavior?

|||

Now it gets even stranger. When the job runs on a schedule, it fails, but when I start it manually by clicking "Start job at step..." it runs fine.
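As a side note on property expressions generally: they can also be attached from code, since ConnectionManager (like TaskHost) implements IDTSPropertiesProvider. A minimal C# sketch; the package path and connection manager name are made-up placeholders, not taken from this thread:

using Microsoft.SqlServer.Dts.Runtime;

class ExpressionSketch
{
    static void Main()
    {
        Application app = new Application();
        // Hypothetical child package and connection manager name.
        Package package = app.LoadPackage(@"C:\Packages\Child.dtsx", null);
        ConnectionManager cm = package.Connections["TheFileConnection"];

        // The same expression the designer would show on the ConnectionString property.
        cm.SetExpression("ConnectionString", "@[User::ThePath] + \"\\\\TheFile.txt\"");

        app.SaveToXml(@"C:\Packages\Child.dtsx", package, null);
    }
}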

Friday, March 23, 2012

Proper format of BufferTempStoragePath

Hello, I have one package that seems to have continuous problems with memory. As of right now it loads a little over 1 million records. I tried leveraging the BufferTempStoragePath property, but I don't seem to have the right path name. What sort of path do you put there? A file? A folder? If it is a file, what sort of file should it be: text, dat, XML? If someone could point me in the right direction it would be greatly appreciated. Thanks.

PS: Below are the error messages I am getting:
[DTS.Pipeline] Information: The buffer manager detected that the system was low on virtual memory, but was unable to swap out any buffers. 4 buffers were considered and 4 were locked. Either not enough memory is available to the pipeline because not enough is installed, other processes are using it, or too many buffers are locked.

[DTS.Pipeline] Error: The buffer manager cannot create a temporary storage file on any path in the BufferTempStoragePath property. There is an incorrect file name or no permission.

It should be a folder.

-Jamie
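If you are building or patching packages in code, the property can be set on the Data Flow Task's host as well; a minimal sketch, where the package path, the way data flow tasks are detected, and the folder are all assumptions:

using Microsoft.SqlServer.Dts.Runtime;
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;

class BufferTempSketch
{
    static void Main()
    {
        Application app = new Application();
        Package package = app.LoadPackage(@"C:\Packages\BigLoad.dtsx", null); // hypothetical path

        foreach (Executable e in package.Executables)
        {
            TaskHost th = e as TaskHost;
            if (th == null || !(th.InnerObject is MainPipe)) continue; // only data flow tasks

            // BufferTempStoragePath is a folder, not a file; pick a drive with free space.
            th.Properties["BufferTempStoragePath"].SetValue(th, @"D:\SsisTemp");
        }

        app.SaveToXml(@"C:\Packages\BigLoad.dtsx", package, null);
    }
}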

|||

Thank you. Unfortunately, I set the property to a folder but it doesn't seem to be using the swap location. I still receive the following error messages, and a lot of them...

[DTS.Pipeline] Information: The buffer manager detected that the system was low on virtual memory, but was unable to swap out any buffers. 4 buffers were considered and 4 were locked. Either not enough memory is available to the pipeline because not enough is installed, other processes are using it, or too many buffers are locked.

I am running the package by itself with nothing other than SQL Server running. Any ideas what I might be missing here? I don't believe it is a security problem, as I am set up as a local admin on this development box. Any further assistance would be greatly appreciated.

Monday, March 12, 2012

programmatically setting package variables in job step command line

We are trying to start a server job running an SSIS package and supply some parameters to the package when we start the job using SMO.

What we have now is this:

string cmdLine = job.JobSteps[0].Command;
cmdLine += @" /SET \Package\GetGroupRatingYear_Id.Variables[User::RatingId].Value;1";
cmdLine += @" /SET \Package\GetGroupRatingYear_Id.Variables[User::GroupId].Value;1";
cmdLine += " /SET \\Package.Variables[User::period].Value;\"" + periodEndDate + "\"";
job.JobSteps[0].Command = cmdLine;
job.Start();

It appears that when the job is run, the modified command line is not used.

What is needed to supply runtime parameters to a job step when starting the job via SMO?

Thanks,

Managing a job in this way seems a bit of a pain. Why not let the package fetch the values from an external location when it needs them?

One example would be to use a package configuration, perhaps using a SQL Server configuration. You could update the table values, and then just design the package to use that configuration value, assigning the values to the variables as required. Read up on package configurations if you are not familiar with them.
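As a rough sketch of that suggestion, a SQL Server configuration can be attached to a package in code as well as in the designer; the connection manager, table, and filter names below are assumptions:

using Microsoft.SqlServer.Dts.Runtime;

class ConfigSketch
{
    static void Main()
    {
        Package package = new Package();
        package.EnableConfigurations = true;

        // The configuration string format is "connection manager";"configuration table";"filter".
        Configuration config = package.Configurations.Add();
        config.Name = "RuntimeValues";
        config.ConfigurationType = DTSConfigurationType.SqlServer;
        config.ConfigurationString = "\"ConfigDb\";\"[dbo].[SSIS Configurations]\";\"RatingRun\"";
    }
}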

A variation on the theme is to do the work yourself. You could use any table, not just a configuration-format table. Use an Execute SQL Task to query for the values and, using the result set option, return them; on the Result Set page of the task, assign the output to variables.

|||

Yes it's been a learning curve in how to do what we're trying to do. The application is driven by a web page where the user says 'run this job, use these parameters'. However, you can't have one predefined job with multiple instances on the server, each with their own set of parameters - which needs to be possible because of the application requirements.

What we are doing now that seems to work is creating a new job, setting the step type to SSIS, setting the command line to specify the package and parameters, and then starting the job. It is also set to auto-delete upon success.

The other way we thought of but decided against was to have the job pick up its runtime parameters from a queue - but then we'd have to create and manage the queue.

The 'create new job' approach lets us run now or set a schedule to run later, all the instances are visible as jobs on the server (based on category to filter out for the UI), and they clean up themselves if they run successfully.

NB: if anyone is curious, changing the command line of an existing job requires the Alter() method to persist the change back to the server; otherwise it just runs with the original command. Like this:

string cmdLine = job.JobSteps[0].Command;
cmdLine += @" /SET \Package\GetGroupRatingYear_Id.Variables[User::RatingId].Value;1";
job.JobSteps[0].Command = cmdLine;
job.JobSteps[0].Alter();
job.Start();

However, this permanently changes the command line in the job, of course, and you have to deal with that.

The code that we're using to dynamically create the job and supply the parameters is pretty close to this:

string jobName = "the name to give to the new job";
string cmdLine = "the command line to run the package and set parameters";
ServerConnection svrConnection = new ServerConnection(sqlConnection);
Server svr = new Server(svrConnection);
JobServer agent = svr.JobServer;
if (agent.Jobs.Contains(jobName))
{
    agent.Jobs[jobName].Drop();
}

job = new Job(agent, jobName);
job.DeleteLevel = CompletionAction.OnSuccess;
job.Category = "Calculate";
JobStep js = new JobStep(job, "Step 1");
js.SubSystem = AgentSubSystem.Ssis;
js.Command = cmdLine;
job.Create();
js.Create();
job.ApplyToTargetServer("(local)");
job.Alter();
job.Start();

programmatically schedule the package for later execution in a stored procedure

Hello,
Is it possible to programmatically schedule the DTS package for later execution in a stored procedure, or maybe in a trigger?

|||

When you say later execution, please be more specific and describe the scenario you are trying to apply this to.

|||

Originally posted by rnealejr:

When you say later execution, please be more specific and describe the scenario you are trying to apply this to.

For example, the user sets a flag through the user interface. This action updates a field in a table. I would like to have an "update" trigger for this table that will schedule the DTS package to be executed at night. There is an option to do it in VB, but it would be cool to perform this through a stored procedure.

|||

You can use sp_add_schedule or sp_update_schedule - however, a job must already exist for the DTS package.

|||

Thank you. It's msdb..sp_update_jobschedule.

Thanks again!

|||

Happy to help.
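For anyone doing this from managed code, the same scheduling can be set up through SMO instead of calling the msdb procedures directly; a minimal sketch, assuming job is a connected Microsoft.SqlServer.Management.Smo.Agent.Job and all values are illustrative:

using Microsoft.SqlServer.Management.Smo.Agent;

// Attach a nightly schedule to an existing job.
JobSchedule schedule = new JobSchedule(job, "Nightly run");
schedule.FrequencyTypes = FrequencyTypes.Daily;
schedule.FrequencyInterval = 1;                         // every day
schedule.ActiveStartTimeOfDay = new TimeSpan(23, 0, 0); // 11 PM
schedule.Create();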

Friday, March 9, 2012

Programmatically iterating tasks/components in the data flow portion of a package.

Hi all,

In several threads there has been discussion regarding adding connection managers to a package's data flow, etc. My challenge is that I have a large solution that contains many packages, and I need to change the connection manager linked to the data flow in all of the packages. When the solution was initially designed, data sources were used, and it has become a tedious maintenance issue to keep those in sync. We want to use a standard OLE DB connection manager, but adding a connection manager to each package and editing the corresponding data flow tasks in each package to use that new connection manager is a daunting task.

I've coded a .NET module to access the packages, remove the old connection manager (data source) and add the new OLE DB data source. However, as I traverse the objects in the package hierarchy, when I come to the data flow object, the InnerObject is not a DTS object but rather a COM object. I can't seem to find any documentation or examples of how to iterate the tasks within a data flow and change the connection manager. If you have any information, that would be quite helpful. If you reply with a code sample, and you would be so kind as to relate it to one of the sample packages provided with SSIS so I can run it, that would be great.

Thank you.

Steve.

Hiya Steve,

I know we discussed this offline but just for anyone else's benefit.... the following may help:

Iterate over a package programatically
http://blogs.conchango.com/jamiethomson/archive/2007/03/06/SSIS_3A00_-Iterate-over-a-package-programatically.aspx

-Jamie
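For a rough idea of the shape of the walk Jamie describes, here is a minimal sketch (not the blog's code): it recurses through containers, spots Data Flow Tasks via TaskHost.InnerObject, and repoints every component's runtime connection at a replacement connection manager.

using Microsoft.SqlServer.Dts.Runtime;
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;

class ConnectionSwapper
{
    // Call as Walk(package.Executables, newConnectionManager).
    static void Walk(Executables executables, ConnectionManager newCm)
    {
        foreach (Executable e in executables)
        {
            IDTSSequence seq = e as IDTSSequence; // Sequence, ForLoop, ForEachLoop containers
            if (seq != null)
            {
                Walk(seq.Executables, newCm);
                continue;
            }

            TaskHost th = e as TaskHost;
            if (th == null) continue;

            MainPipe pipe = th.InnerObject as MainPipe; // the COM object behind a Data Flow Task
            if (pipe == null) continue;

            foreach (IDTSComponentMetaData90 comp in pipe.ComponentMetaDataCollection)
            {
                foreach (IDTSRuntimeConnection90 conn in comp.RuntimeConnectionCollection)
                {
                    conn.ConnectionManagerID = newCm.ID;
                    conn.ConnectionManager = DtsConvert.ToConnectionManager90(newCm);
                }
            }
        }
    }
}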

Programmatically get ScriptTask source code

Hi, I am trying without luck to load a package which contains a ScriptTask and read the source code of that task.

I can load the package and get the ScriptTask no problem. However, I am not sure how to get the source code. I know I have to use the ScriptTaskCodeProvider and I assume the GetSourceCode() method.

This is what I have so far:

ScriptTask scriptTask = taskHost.InnerObject as ScriptTask;
ScriptTaskCodeProvider codeProvider = new ScriptTaskCodeProvider();
codeProvider.LoadFromTask(scriptTask);
string sourceCode = codeProvider.GetSourceCode(scriptTask.VsaProjectName);

Any assistance greatly appreciated.

Cheers
Richard.

Application application = new Application();
Package package = application.LoadPackage(@"C:\Yukon\ISPlaying\ISPlaying\Package14.dtsx", null);
TaskHost taskHost = package.Executables[0] as TaskHost;
ScriptTask scriptTask = taskHost.InnerObject as ScriptTask;
ScriptTaskCodeProvider codeProvider = new ScriptTaskCodeProvider();
codeProvider.LoadFromTask(scriptTask);
string script = codeProvider.GetSourceCode(String.Format("dts://Scripts/{0}/ScriptMain.vsaitem", scriptTask.VsaProjectName));

Programmatically developing an entire package

Hi all,

I am trying to write a program that creates packages on the fly depending on the variables you pass. For example, it should create connection managers on the fly specific to a certain file named in a variable (e.g. sample.csv). The package has a data flow task with a flat file source and an OLE DB destination. The problem I am facing: when the flat file source is assigned to a flat file connection manager (dynamically), it does not give any source output columns; i.e., the value of DFSource.OutputCollection(0).OutputColumnCollection.Count is zero. But when I use the same code and reverse the source and destination (OLE DB as source and flat file as destination), it works fine. I searched everywhere for resources on how to develop packages programmatically, but could not find any except one example on MSDN. If anyone knows about this problem or any useful resources on this subject, it would be really helpful.

Thanks,

Prithvi.

This is the code for creating the connections and adding the data flow task. I need to add a Script Component transformation, but initially I wanted to see if a one-to-one mapping works fine.

Public Sub CreateConnections()
    'Add the OLE DB and Flat File connection managers
    Console.WriteLine("Creating the MyOLEDBConnection")
    Dim cnOLEDB As ConnectionManager = myPackage.Connections.Add("OLEDB")
    cnOLEDB.Name = "MyOLEDBConnection"
    cnOLEDB.ConnectionString = "<connection string>"

    Console.WriteLine("Creating the MyFlatFileConnection")
    Dim cnFile As ConnectionManager = myPackage.Connections.Add("FLATFILE")
    cnFile.Name = "MyFlatFileConnection"
    cnFile.Properties("ConnectionString").SetValue(cnFile, "C:\sample.csv")
    cnFile.Properties("Format").SetValue(cnFile, "Delimited")
    cnFile.Properties("ColumnNamesInFirstDataRow").SetValue(cnFile, False)
    cnFile.Properties("DataRowsToSkip").SetValue(cnFile, 0)
    cnFile.Properties("RowDelimiter").SetValue(cnFile, vbCrLf)
    cnFile.Properties("TextQualifier").SetValue(cnFile, """")
End Sub

Public Sub AddDataFlowTask()
    'Add a Data Flow Task
    Console.WriteLine("Adding a Data Flow Task")
    Dim e As Executable = myPackage.Executables.Add("DTS.Pipeline")
    Dim taskDF As TaskHost = CType(e, TaskHost)
    taskDF.Name = "DataFlow"
    Dim DTP As MainPipe
    DTP = CType(taskDF.InnerObject, MainPipe)

    'Add the flat file source
    Console.WriteLine("Adding the File Source")
    Dim DFSource As IDTSComponentMetaData90
    DFSource = DTP.ComponentMetaDataCollection.New()
    DFSource.ComponentClassID = "DTSAdapter.FlatFileSource.1"
    DFSource.Name = "FlatFileSource"

    'Connect, populate the input collections and disconnect
    Dim SourceInst As CManagedComponentWrapper = DFSource.Instantiate()
    SourceInst.ProvideComponentProperties()
    DFSource.RuntimeConnectionCollection(0).ConnectionManagerID = myPackage.Connections("MyFlatFileConnection").ID
    DFSource.RuntimeConnectionCollection(0).ConnectionManager = DtsConvert.ToConnectionManager90(myPackage.Connections("MyFlatFileConnection"))
    SourceInst.AcquireConnections(Nothing)
    SourceInst.ReinitializeMetaData()
    SourceInst.ReleaseConnections()

    'Check whether the external metadata columns are available as source output columns.
    'The count is 0, which means no columns are being passed as output from the source.
    Dim column As IDTSOutputColumn90
    Try
        Console.WriteLine("connection name: " & DFSource.RuntimeConnectionCollection(0).Name.ToString)
        Console.WriteLine("output collection name: " & DFSource.OutputCollection(0).Name)
        Console.WriteLine("output collection description :" & DFSource.OutputCollection(0).Description)
        Console.WriteLine("source output columns count :" & DFSource.OutputCollection(0).OutputColumnCollection.Count.ToString)
    Catch ex As Exception
        Console.WriteLine(ex.InnerException.Message.ToString)
    End Try

    'Tried to print column names but it does not print any
    For Each column In DFSource.OutputCollection(0).OutputColumnCollection
        Console.WriteLine(column.Name.ToString)
        Console.WriteLine(column.DataType.ToString)
    Next

    'Add the OLE DB destination
    Console.WriteLine("Adding OLEDB Destination")
    Dim DFDestination As IDTSComponentMetaData90
    DFDestination = DTP.ComponentMetaDataCollection.New()
    DFDestination.ComponentClassID = "DTSAdapter.OLEDBDestination"
    DFDestination.Name = "OLEDBDestination"

    'Create an instance of the component
    Dim DestInst As CManagedComponentWrapper = DFDestination.Instantiate()
    DestInst.ProvideComponentProperties()
    If DFDestination.RuntimeConnectionCollection.Count > 0 Then
        DFDestination.RuntimeConnectionCollection(0).ConnectionManagerID = myPackage.Connections("MyOLEDBConnection").ID
        DFDestination.RuntimeConnectionCollection(0).ConnectionManager = DtsConvert.ToConnectionManager90(myPackage.Connections("MyOLEDBConnection"))
    End If
    DestInst.SetComponentProperty("AccessMode", 0)
    DestInst.SetComponentProperty("OpenRowset", "tempSSIS")
    DestInst.SetComponentProperty("FastLoadKeepNulls", True)

    'Map a path between the source and destination
    Dim path As IDTSPath90 = DTP.PathCollection.New()
    path.AttachPathAndPropagateNotifications(DFSource.OutputCollection(0), DFDestination.InputCollection(0))
    Dim InColumns As IDTSVirtualInputColumnCollection90 = DFDestination.InputCollection(0).GetVirtualInput().VirtualInputColumnCollection()

    'The number of input columns to the destination is zero
    Console.WriteLine("input columns : " & InColumns.Count.ToString)
    Try
        DestInst.AcquireConnections(Nothing)
    Catch ex As Exception
        Console.WriteLine(ex.InnerException.Message)
    End Try
    DestInst.ReinitializeMetaData()
    'Console.WriteLine("input columns : " & DFDestination.InputCollection(0).InputColumnCollection.Count.ToString)

    For Each input As IDTSInput90 In DFDestination.InputCollection
        'Get the virtual input column collection for the input.
        Dim vInput As IDTSVirtualInput90 = input.GetVirtualInput()
        'Iterate through the virtual column collection.
        For Each vColumn As IDTSVirtualInputColumn90 In vInput.VirtualInputColumnCollection
            'Call the SetUsageType method of the design-time instance of the component.
            Console.WriteLine(vColumn.Name.ToString)
            Console.WriteLine(vColumn.DataType)
            DestInst.SetUsageType(input.ID, vInput, vColumn.LineageID, DTSUsageType.UT_READONLY)
        Next
    Next

    Dim exCol As IDTSExternalMetadataColumn90
    For Each col As IDTSInputColumn90 In DFDestination.InputCollection(0).InputColumnCollection
        exCol = DFDestination.InputCollection(0).ExternalMetadataColumnCollection(col.Name)
        DestInst.MapInputColumn(DFDestination.InputCollection(0).ID, col.ID, exCol.ID)
    Next
    DestInst.ReleaseConnections()
End Sub

Please see if I did anything wrong anywhere (which I always happen to do). Based on the MSDN material, the above code should work, I guess.
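One hedged observation (an assumption, not a verified fix): a flat file connection manager built from scratch starts with an empty Columns collection, so the source's ReinitializeMetaData may have nothing to generate output columns from, which would fit the zero count above. The C# sketch below shows the idea of defining the columns on the connection manager before the source reinitializes; the column names and widths are placeholders, and the AddColumnsToFlatFileConnection routine in the thread further down this page does the same thing from the destination side.

using Microsoft.SqlServer.Dts.Runtime;
using wrap = Microsoft.SqlServer.Dts.Runtime.Wrapper;

class FlatFileColumnsSketch
{
    // Define the flat file's columns before any source component reinitializes.
    static void DefineColumns(ConnectionManager cnFile, string[] columnNames)
    {
        wrap.IDTSConnectionManagerFlatFile90 ff = cnFile.InnerObject as wrap.IDTSConnectionManagerFlatFile90;
        for (int i = 0; i < columnNames.Length; i++)
        {
            wrap.IDTSConnectionManagerFlatFileColumn90 col = ff.Columns.Add();
            col.ColumnType = "Delimited";
            col.ColumnDelimiter = (i == columnNames.Length - 1) ? "\r\n" : ",";
            col.DataType = wrap.DataType.DT_WSTR; // assumption: read everything as strings
            col.MaximumWidth = 255;               // placeholder width
            ((wrap.IDTSName90)col).Name = columnNames[i];
        }
    }
}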

Programmatically Data Conversion component creation

Hi there,

I have created a package which simply imports data from a flat file to a SQL Server table. But I need to incorporate a data conversion component so that I can change the source-destination column mapping programmatically, so I thought I needed to add a data conversion component to the data flow task. After adding this component (I found its component id to be {C3BF62C8-7C5C-4F85-83C3-E0B6F6BE267C}) I created a path which establishes the mapping between the output columns of the source component and the input columns of the data conversion component. Now I am not sure how to establish the mapping between the data conversion component's input column collection and output column collection.

I am giving my code snippet here,

IDTSComponentMetaData90 sourceDataFlowComponent = dataFlowTask.ComponentMetaDataCollection.New();
sourceDataFlowComponent.ComponentClassID = "{90C7770B-DE7C-435E-880E-E718C92C0573}";

// ... code for configuring the source data flow component ...

// Creating the data conversion component
IDTSComponentMetaData90 conversionDataFlowComponent = dataFlowTask.ComponentMetaDataCollection.New();
conversionDataFlowComponent.ComponentClassID = "{C3BF62C8-7C5C-4F85-83C3-E0B6F6BE267C}"; // GUID for the Data Conversion component
CManagedComponentWrapper conversionInstance = conversionDataFlowComponent.Instantiate();
conversionInstance.ProvideComponentProperties();

// Now creating a path to connect the source and the conversion
IDTSPath90 fPath = dataFlowTask.PathCollection.New();
fPath.AttachPathAndPropagateNotifications(
    sourceDataFlowComponent.OutputCollection[0],
    conversionDataFlowComponent.InputCollection[0]);

// Should I acquire connections for the data conversion? I'm not sure
conversionInstance.AcquireConnections(null);
conversionInstance.ReinitializeMetaData();

// Get the input collection
IDTSInput90 input = conversionDataFlowComponent.InputCollection[0];
IDTSVirtualInput90 vInput = input.GetVirtualInput();
foreach (IDTSVirtualInputColumn90 vColumn in vInput.VirtualInputColumnCollection)
{
    conversionInstance.SetUsageType(input.ID, vInput, vColumn.LineageID, DTSUsageType.UT_READONLY);
}

// Well, here I am stuck. What do I need to do here to establish a map
// between conversionDataFlowComponent.InputCollection[0] and
// conversionDataFlowComponent.OutputCollection[0]?

As you can see, I am just a step away from creating the mapping between the input and output collections. Can anybody give me an idea of how I can achieve this?

I will appreciate all kinds of suggestions and comments.

Regards

Moim

Have a look at this code snippet where I use this transform, selecting all columns and converting types where required, based on the result of a helper function:

#region Select Conversion Columns
IDTSOutput90 output = dataConversion.OutputCollection[0];
IDTSVirtualInput90 virtualInput = dataConversion.InputCollection[0].GetVirtualInput();
int inputId = dataConversion.InputCollection[0].ID;
foreach (IDTSVirtualInputColumn90 column in virtualInput.VirtualInputColumnCollection)
{
    int length = column.Length;
    int precision = column.Precision;
    int scale = column.Scale;
    DataType dataType = ComponentHelper.ConvertBufferDataTypeToFitExcel(column.DataType, ref length, ref precision, ref scale);
    if (dataType != column.DataType)
    {
        IDTSInputColumn90 inputColumn = dataConversionInstance.SetUsageType(inputId, virtualInput, column.LineageID, DTSUsageType.UT_READONLY);
        IDTSOutputColumn90 outputColumn = dataConversionInstance.InsertOutputColumnAt(output.ID, output.OutputColumnCollection.Count, column.Name, "Excel compatible conversion column");
        dataConversionInstance.SetOutputColumnDataTypeProperties(output.ID, outputColumn.ID, dataType, length, precision, scale, 0);
        outputColumn.CustomPropertyCollection[0].Value = inputColumn.LineageID;
        inputColumn.Name = DataConversionInputRenamePrefix + inputColumn.Name;
    }
}
#endregion

Btw, how did you get your code to paste so nicely into this editor Window?

|||

DarrenSQLIS wrote:

Btw, how did you get your code to paste so nicely into this editor Window?

Try this:

http://www.codeproject.com/jscript/CopyasHTML.asp

Or just go here: http://www.google.co.uk/search?hl=en&q=copyashtml&meta=

I have an add-in that enables me to do it but I can't for the life of me find it again.

-Jamie

|||

Jamie Thomson wrote:

DarrenSQLIS wrote:

Btw, how did you get your code to paste so nicely into this editor Window?

I have an add-in that enables me to do it but I can't for the life of me find it again.

-Jamie

Here it is: http://www.jtleigh.com/people/colin/software/CopySourceAsHtml/

|||

Nice, thanks Gents.

|||

Hello Darren,

Thanks a lot. After implementing your suggestion I believe I am very near to making it work. I think I am missing or doing some silly things, because the package is not working as expected. I looked at the log traces and found that it is failing to validate the destination data flow component. Debugging it, I found that after creating a path between the data conversion component and the destination component, when I populate the input columns for the destination component it actually gets the source component's output column lineage IDs instead of the data conversion component's output column lineage IDs. Can you help me find the problem, please?

Anyway, thank you very much for your previous help.

Here is the code snippet:

private void ConfigureSourceComponent()
{
    sourceDataFlowComponent.Name = "Source Data from Flat file";
    // Component class id for the flat file source
    sourceDataFlowComponent.ComponentClassID = "{90C7770B-DE7C-435E-880E-E718C92C0573}";
    CManagedComponentWrapper managedFlatFileInstance = sourceDataFlowComponent.Instantiate();
    managedFlatFileInstance.ProvideComponentProperties();
    sourceDataFlowComponent.RuntimeConnectionCollection[0].ConnectionManagerID = flatfileConnectionManager.ID;
    sourceDataFlowComponent.RuntimeConnectionCollection[0].ConnectionManager = DtsConvert.ToConnectionManager90(flatfileConnectionManager);
    managedFlatFileInstance.AcquireConnections(null);
    managedFlatFileInstance.ReinitializeMetaData();
    IDTSExternalMetadataColumn90 exOutColumn;
    foreach (IDTSOutputColumn90 outColumn in sourceDataFlowComponent.OutputCollection[0].OutputColumnCollection)
    {
        exOutColumn = sourceDataFlowComponent.OutputCollection[0].ExternalMetadataColumnCollection[outColumn.Name];
        managedFlatFileInstance.MapOutputColumn(sourceDataFlowComponent.OutputCollection[0].ID, outColumn.ID, exOutColumn.ID, true);
    }
    managedFlatFileInstance.ReleaseConnections();
}

private void ConfigureConversionComponent()
{
    conversionDataFlowComponent.Name = "Conversion component";
    // Component class id for the Data Conversion component
    conversionDataFlowComponent.ComponentClassID = "{C3BF62C8-7C5C-4F85-83C3-E0B6F6BE267C}";
    CManagedComponentWrapper conversionInstance = conversionDataFlowComponent.Instantiate();
    conversionInstance.ProvideComponentProperties();

    // Create the path.
    IDTSPath90 fPath = dataFlowTask.PathCollection.New();
    fPath.AttachPathAndPropagateNotifications(
        sourceDataFlowComponent.OutputCollection[0],
        conversionDataFlowComponent.InputCollection[0]);

    conversionInstance.AcquireConnections(null);
    conversionInstance.ReinitializeMetaData();

    // Get the output collection
    IDTSOutput90 output = conversionDataFlowComponent.OutputCollection[0];

    // Get the component's default input and virtual input.
    input = conversionDataFlowComponent.InputCollection[0];
    vInput = input.GetVirtualInput();

    // Iterate through the virtual input column collection.
    foreach (IDTSVirtualInputColumn90 vColumn in vInput.VirtualInputColumnCollection)
    {
        IDTSInputColumn90 inputColumn = conversionInstance.SetUsageType(input.ID, vInput, vColumn.LineageID, DTSUsageType.UT_READONLY);
        IDTSOutputColumn90 outputColumn = conversionInstance.InsertOutputColumnAt(output.ID, output.OutputColumnCollection.Count, vColumn.Name, "Sql server compatible conversion column");
        conversionInstance.SetOutputColumnDataTypeProperties(output.ID, outputColumn.ID, vColumn.DataType, vColumn.Length, vColumn.Precision, vColumn.Scale, 0);
        outputColumn.CustomPropertyCollection[0].Value = inputColumn.LineageID;
    }
    conversionInstance.ReleaseConnections();
}

private void ConfigureDestinationComponent()
{
    destinationDataFlowComponent.Name = "Destination Oledb component";
    // Component class id for the OLE DB destination
    destinationDataFlowComponent.ComponentClassID = "{E2568105-9550-4F71-A638-B7FE42E66922}";
    CManagedComponentWrapper managedOleInstance = destinationDataFlowComponent.Instantiate();
    managedOleInstance.ProvideComponentProperties();
    destinationDataFlowComponent.RuntimeConnectionCollection[0].ConnectionManagerID = oledbConnectionManager.ID;
    destinationDataFlowComponent.RuntimeConnectionCollection[0].ConnectionManager = DtsConvert.ToConnectionManager90(oledbConnectionManager);
    managedOleInstance.SetComponentProperty("AccessMode", 0);
    // ... setting other component properties here ...

    // Create the path.
    IDTSPath90 path = dataFlowTask.PathCollection.New();
    path.AttachPathAndPropagateNotifications(
        conversionDataFlowComponent.OutputCollection[0],
        destinationDataFlowComponent.InputCollection[0]);

    managedOleInstance.AcquireConnections(null);
    managedOleInstance.ReinitializeMetaData();

    // Get the destination's default input and virtual input.
    IDTSInput90 input;
    IDTSVirtualInput90 vInput;
    input = destinationDataFlowComponent.InputCollection[0];
    vInput = input.GetVirtualInput();

    // Iterate through the virtual input column collection.
    foreach (IDTSVirtualInputColumn90 vColumn in vInput.VirtualInputColumnCollection)
    {
        managedOleInstance.SetUsageType(input.ID, vInput, vColumn.LineageID, DTSUsageType.UT_READONLY);
        // Culprit line: while debugging I can see vColumn.LineageID holds the source lineage IDs.
        // Can you please help me out here?
    }

    IDTSExternalMetadataColumn90 exColumn;
    foreach (IDTSInputColumn90 inColumn in destinationDataFlowComponent.InputCollection[0].InputColumnCollection)
    {
        exColumn = destinationDataFlowComponent.InputCollection[0].ExternalMetadataColumnCollection[inColumn.Name];
        managedOleInstance.MapInputColumn(destinationDataFlowComponent.InputCollection[0].ID, inColumn.ID, exColumn.ID);
    }
    managedOleInstance.ReleaseConnections();
}

In the commented lines above you can see that the destination component is picking up the source output columns' lineage IDs, although it is mapped to the data conversion component.

Regards

Moim

|||

The input buffer of the destination will include all upstream columns, so that means both the ones that came from the source and those from the Data Conversion Tx. You will need to be selective in choosing the input columns.

Think of the buffer as it passes down the pipeline, or rather what columns are available -

Source Output - SourceCol1

Data Conversion Output - SourceCol1, SourceCol1(converted)

You don't seem to be distinguishing between the input columns and output columns in the data conversion setup, which seems strange. You would only want to select the converted columns, I assume. You could use a name, a description, or perhaps the UpstreamComponentName property of the column.
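To make that concrete, a small sketch of being selective at the destination, reusing the names from the snippets above (the component name string is whatever you gave the Data Conversion component):

// Select only the columns produced by the Data Conversion component.
IDTSVirtualInput90 vIn = input.GetVirtualInput();
foreach (IDTSVirtualInputColumn90 vCol in vIn.VirtualInputColumnCollection)
{
    if (vCol.UpstreamComponentName == "Conversion component")
    {
        managedOleInstance.SetUsageType(input.ID, vIn, vCol.LineageID, DTSUsageType.UT_READONLY);
    }
}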

|||

Hi,

I have a similar requirement. I need to import data from CSV files to SQL Server tables using SSIS programmatically. I am not able to create a data conversion component programmatically and map the columns from the data conversion component to the OLE DB destination. I tried my level best and got stuck. Is there a code demo available? Appreciating your help.

|||

adarsha wrote:

Hi,

I have a similar requirement. I need to import data from CSV files to SQL Server tables using SSIS programmatically. I am not able to create a data conversion component programmatically and map the columns from the data conversion component to the OLE DB destination. I tried my level best and got stuck. Is there a code demo available? Appreciating your help.

It would really help if you explain what you are getting stuck on and how far you have come. Posting your code would be even better.

-Jamie

|||

Hi,

In the example below I have pumped data from a SQL table to CSVs, which works fine for me. But now I need to do the exact opposite task (pump data from CSV files to a SQL table).

Problems I have:

1: After adding a flat file source, I am not very sure how exactly to add input columns to the flat file source, as they don't get populated by themselves. I tried using a DataTable, reading the columns from the file and adding them, and somewhat succeeded, but I need clear implementation help.

2: To the extent that I succeeded in adding input columns to the flat file source, the package failed due to data conversion problems (a Unicode to non-Unicode conversion error). I tried programmatically building a data conversion component and had no clue how exactly to do it correctly and make it work.

So I need help adding input columns to the flat file source, adding a data conversion component, and selecting only the converted columns for the OLE DB destination.

using System;
using System.Collections.Generic;
using System.Collections.Specialized;
using System.Reflection;
using System.Threading;
using System.Text;
using System.IO;
using Microsoft.SqlServer.Dts.Runtime;
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
using wrap = Microsoft.SqlServer.Dts.Runtime.Wrapper;

namespace CCE_ETL_Engine_Flatfiles
{
    class Flatfile_Pump
    {
        Boolean mblnDebug = true;
        string mstrAppLogFile;
        string mstrDTSXFile;
        string DataFileName;
        string DestinationDataDirectory;

        #region intialize
        public void InitializeVariables(string strDTSPackageName)
        {
            DestinationDataDirectory = System.Configuration.ConfigurationManager.AppSettings.Get("DataFilePath");
            DataFileName = System.Configuration.ConfigurationManager.AppSettings.Get("DataFileName");
            Object objLock = new Object();
            Monitor.Enter(objLock);
            string strTimeStamp = DateTime.Now.Hour.ToString().PadLeft(2, '0') + DateTime.Now.Minute.ToString().PadLeft(2, '0') + DateTime.Now.Second.ToString().PadLeft(2, '0') + DateTime.Now.Millisecond.ToString();
            Monitor.Exit(objLock);
            mstrAppLogFile = System.Configuration.ConfigurationManager.AppSettings.Get("ApplicationLogFilePath") + @"\" + strDTSPackageName + "_" + strTimeStamp + ".log";
            mstrDTSXFile = System.Configuration.ConfigurationManager.AppSettings.Get("DTSXFilePath") + @"\" + strDTSPackageName + "_" + strTimeStamp + ".dtsx";
            if (System.Configuration.ConfigurationManager.AppSettings.Get("Debug").ToUpper() == "FALSE")
                mblnDebug = false;
            else
                mblnDebug = true;
        }
        #endregion

        #region constructor
        public Flatfile_Pump(string strDTSPackageName)
        {
            InitializeVariables(strDTSPackageName);
        }
        #endregion

        #region CreatePackage
        public Package CreatePackage(string Name, string Description)
        {
            Package DataPump = new Package();
            DataPump.PackageType = DTSPackageType.DTSDesigner90;
            DataPump.Name = Name;
            DataPump.Description = Description;
            DataPump.CreatorComputerName = System.Environment.MachineName;
            DataPump.CreatorName = System.Environment.UserName;
            return DataPump;
        }
        #endregion

        #region AddConnectionManagers
        /// <summary>
        /// Adds the OLEDB and FlatFile connection managers to the package.
        /// Here I am adding 2 connection managers: 1 for the flat file and 1 for OLEDB.
        /// </summary>
        public void AddConnectionManagers(Package DataPump, string SqlConnection)
        {
            // Add the OLEDB connection manager.
            ConnectionManager ConnectionName;
            ConnectionManager cmflatFile;
            ConnectionName = DataPump.Connections.Add("OLEDB");
            // Set stock properties.
            ConnectionName.Name = "Remote Source";
            ConnectionName.ConnectionString = SqlConnection;
            // e.g. @"Provider=SQLNCLI;Integrated Security=SSPI;Persist Security Info=False;Initial Catalog=cce_control;Data Source=(wxp-9plyf1s);Auto Translate=False;"
            // Add the destination connection manager.
            cmflatFile = DataPump.Connections.Add("FLATFILE");
            // Set the stock properties.
            cmflatFile.Properties["ConnectionString"].SetValue(cmflatFile, DestinationDataDirectory + DataFileName);
            cmflatFile.Properties["Format"].SetValue(cmflatFile, "Delimited");
            cmflatFile.Properties["DataRowsToSkip"].SetValue(cmflatFile, 0);
            cmflatFile.Properties["ColumnNamesInFirstDataRow"].SetValue(cmflatFile, true);
            cmflatFile.Properties["Name"].SetValue(cmflatFile, "FlatFileConnection");
            cmflatFile.Properties["RowDelimiter"].SetValue(cmflatFile, "\r\n");
            cmflatFile.Properties["TextQualifier"].SetValue(cmflatFile, "\"");
        }
        #endregion

        #region AddDataFlowTask
        /// <summary>
        /// Adds a DataFlow task to the Executables collection of the package.
        /// Retrieves the MainPipe object from the TaskHost and returns it.
        /// </summary>
        public MainPipe AddDataFlowTask(Package DataPump)
        {
            TaskHost th = DataPump.Executables.Add("DTS.Pipeline") as TaskHost;
            th.Name = "DataFlow";
            th.Description = "The DataFlow task in the package.";
            MainPipe dataFlow = th.InnerObject as MainPipe;
            ComponentEvents componentEvents = new ComponentEvents();
            dataFlow.Events = componentEvents as wrap.IDTSComponentEvents90;
            return dataFlow;
        }
        #endregion

        #region AddOLEDBSource
        public IDTSComponentMetaData90 AddOLEDBSource(Microsoft.SqlServer.Dts.Pipeline.Wrapper.MainPipe dataFlow, Package DataPump, string strSourceQuery)
        {
            // Declare and add the component to the dataFlow metadata collection
            IDTSComponentMetaData90 OLEDBsource;
            OLEDBsource = dataFlow.ComponentMetaDataCollection.New();
            // Set the common properties
            OLEDBsource.ComponentClassID = "DTSAdapter.OLEDBSource";
            OLEDBsource.Name = "cce_control";
            OLEDBsource.Description = "Remote Source Server";
            // Create an instance of the component
            CManagedComponentWrapper instance = OLEDBsource.Instantiate();
            instance.ProvideComponentProperties();
            // Associate the runtime ConnectionManager with the component
            OLEDBsource.RuntimeConnectionCollection[0].ConnectionManagerID = DataPump.Connections["Remote Source"].ID;
            OLEDBsource.RuntimeConnectionCollection[0].ConnectionManager = DtsConvert.ToConnectionManager90(DataPump.Connections["Remote Source"]);
            instance.SetComponentProperty("SqlCommand", strSourceQuery);
            instance.SetComponentProperty("AccessMode", 2);
            instance.AcquireConnections(null);
            instance.ReinitializeMetaData();
            instance.ReleaseConnections();
            return OLEDBsource;
        }
        #endregion

        #region AddFlatFileDestination
        public IDTSComponentMetaData90 AddFlatFileDestination(Microsoft.SqlServer.Dts.Pipeline.Wrapper.MainPipe dataFlow, Package DataPump, IDTSComponentMetaData90 OLEDBsource)
        {
            // Declare and add the component to the dataFlow metadata collection
            IDTSComponentMetaData90 flatfileDestination;
            flatfileDestination = dataFlow.ComponentMetaDataCollection.New();
            // Set the common properties
            flatfileDestination.ComponentClassID = "DTSAdapter.FlatFileDestination";
            flatfileDestination.Name = "FlatFileDestination";
            flatfileDestination.Description = "Flat file destination";
            // Create an instance of the component
            CManagedComponentWrapper instance = flatfileDestination.Instantiate();
            instance.ProvideComponentProperties();
            // Associate the runtime ConnectionManager with the component
            flatfileDestination.RuntimeConnectionCollection[0].ConnectionManagerID = DataPump.Connections["FlatFileConnection"].ID;
            flatfileDestination.RuntimeConnectionCollection[0].ConnectionManager = DtsConvert.ToConnectionManager90(DataPump.Connections["FlatFileConnection"]);
            // Map a path between the source component and the FlatFileDestination
            dataFlow.PathCollection.New().AttachPathAndPropagateNotifications(OLEDBsource.OutputCollection[0], flatfileDestination.InputCollection[0]);
            // Add columns to the FlatFileConnectionManager
            AddColumnsToFlatFileConnection(DataPump, flatfileDestination);
            // Acquire the connection, reinitialize the metadata,
            // map the columns, then release the connection.
            instance.AcquireConnections(null);
            instance.ReinitializeMetaData();
            MapFlatFileDestinationColumns(flatfileDestination);
            instance.ReleaseConnections();
            return flatfileDestination;
        }
        #endregion

        #region AddColumnsToFlatFileConnection
        public void AddColumnsToFlatFileConnection(Package DataPump, IDTSComponentMetaData90 flatfileDestination)
        {
            wrap.IDTSConnectionManagerFlatFile90 ff = null;
            foreach (ConnectionManager cm in DataPump.Connections)
            {
                if (cm.Name == "FlatFileConnection")
                {
                    ff = cm.InnerObject as wrap.IDTSConnectionManagerFlatFile90;
                    DtsConvert.ToConnectionManager90(cm);
                }
            }
            int count = ff.Columns.Count;
            // If the connection manager is null here, then we have a problem
            if (ff != null)
            {
                // Get the upstream columns
                IDTSVirtualInputColumnCollection90 vColumns = flatfileDestination.InputCollection[0].GetVirtualInput().VirtualInputColumnCollection;
                for (int cols = 0; cols < vColumns.Count; cols++)
                {
                    wrap.IDTSConnectionManagerFlatFileColumn90 col = ff.Columns.Add();
                    // If this is the last column, set the delimiter to CRLF.
                    // Otherwise keep the delimiter as ",".
                    if (cols == vColumns.Count - 1)
                    {
                        col.ColumnDelimiter = "\r\n";
                    }
                    else
                    {
                        col.ColumnDelimiter = ",";
                    }
                    col.ColumnType = "Delimited";
                    col.DataType = vColumns[cols].DataType;
                    col.DataPrecision = vColumns[cols].Precision;
                    col.DataScale = vColumns[cols].Scale;
                    wrap.IDTSName90 name = col as wrap.IDTSName90;
                    name.Name = vColumns[cols].Name;
                }
            }
        }
        #endregion

        #region MapFlatFileDestinationColumns
        public void MapFlatFileDestinationColumns(IDTSComponentMetaData90 flatfileDestination)
        {
            CManagedComponentWrapper wrp = flatfileDestination.Instantiate();
            IDTSVirtualInput90 vInput = flatfileDestination.InputCollection[0].GetVirtualInput();
            foreach (IDTSVirtualInputColumn90 vColumn in vInput.VirtualInputColumnCollection)
            {
                wrp.SetUsageType(flatfileDestination.InputCollection[0].ID, vInput, vColumn.LineageID, DTSUsageType.UT_READONLY);
            }
            // For each column in the input collection,
            // find the corresponding external metadata column.
            foreach (IDTSInputColumn90 col in flatfileDestination.InputCollection[0].InputColumnCollection)
            {
                IDTSExternalMetadataColumn90 exCol = flatfileDestination.InputCollection[0].ExternalMetadataColumnCollection[col.Name];
                wrp.MapInputColumn(flatfileDestination.InputCollection[0].ID, col.ID, exCol.ID);
            }
        }
        #endregion

        #region Exec_DataPump - the execution: take values like the server name etc., create the package and execute it
        public Boolean DataPump(string SqlConnection, string strSourceQuery)
        {
            Package DataPump;
            PackageEvents packageEvents;
            MainPipe dataFlow;
            IDTSComponentMetaData90 OLEDBsource, flatfileDestination;
            packageEvents = new PackageEvents();
            DataPump = CreatePackage("SSISPackage", "The package for the Flatfile pump");
            AddConnectionManagers(DataPump, SqlConnection);
            dataFlow = AddDataFlowTask(DataPump);
            OLEDBsource = AddOLEDBSource(dataFlow, DataPump, strSourceQuery);
            flatfileDestination = AddFlatFileDestination(dataFlow, DataPump, OLEDBsource);
            DTSExecResult status = DataPump.Validate(null, null, packageEvents, null);
            if (mblnDebug)
            {
                Application dts = new Application();
                dts.SaveToXml(mstrDTSXFile, DataPump, packageEvents);
            }
            foreach (Variable v in DataPump.Variables)
            {
                if (v.Name.ToUpper() == "EXECUTIONINSTANCEGUID")
                {
                    break;
                }
            }
            if (status == DTSExecResult.Success)
            {
                DTSExecResult result = DataPump.Execute(null, null, packageEvents, null, null);
                foreach (Variable v in DataPump.Variables)
                {
                    Console.WriteLine(v.Name.ToUpper());
                }
            }
            return true;
        }
        #endregion
    }
}
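For completeness, a usage sketch of the class as posted; the connection string and query are placeholders, not values from the original post:

Flatfile_Pump pump = new Flatfile_Pump("NightlyExtract");
pump.DataPump(
    @"Provider=SQLNCLI;Integrated Security=SSPI;Initial Catalog=cce_control;Data Source=(local);",
    "SELECT * FROM dbo.SomeTable");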

Programmatically Data Conversion component creation

Hi there,

I have created a package which simply imports data from a flat file to a SQL Server table. But I need to incorporate a data conversion component by which I may change the source-destination column mapping programmatically. So what I thought that I need to add a data conversion component into the dataflow task. After adding this component (I found its component id as {C3BF62C8-7C5C-4F85-83C3-E0B6F6BE267C}) I have created a path which establishes the mapping between output columns of source component and the input columns of data conversion component. Now I am not sure how to establish the mapping between the data conversion component’s input column collection and output column collection.

I am giving my code snippet here,

IDTSComponentMetaData90 sourceDataFlowComponent = dataFlowTask.ComponentMetaDataCollection.New();

sourceDataFlowComponent.ComponentClassID = "{90C7770B-DE7C-435E-880E-E718C92C0573}";

… … …. // Code for configuring the source data flow component

IDTSComponentMetaData90 conversionDataFlowComponent = dataFlowTask.ComponentMetaDataCollection.New();// creating data conversion

conversionDataFlowComponent.ComponentClassID = "{C3BF62C8-7C5C-4F85-83C3-E0B6F6BE267C}";// This is the GUID for data conversion component

CManagedComponentWrapper conversionInstance = conversionDataFlowComponent.Instantiate();//Instantiate

conversionInstance.ProvideComponentProperties();

// Now creating a path to connet the source and conversion

IDTSPath90 fPath = dataFlowTask.PathCollection.New();fPath.AttachPathAndPropagateNotifications(

sourceDataFlowComponent.OutputCollection[0],

conversionDataFlowComponent.InputCollection[0]);

// Sould I need to accuire connect for data conversion? Im not sure

conversionInstance.AcquireConnections(null);

conversionInstance.ReinitializeMetaData();

// Get the input collection

IDTSInput90 input = conversionDataFlowComponent.InputCollection[0];

IDTSVirtualInput90 vInput = input.GetVirtualInput();

foreach (IDTSVirtualInputColumn90 vColumn in vInput.VirtualInputColumnCollection){

conversionInstance.SetUsageType(

input.ID, vInput, vColumn.LineageID, DTSUsageType.UT_READONLY);

}

.. . // Well here I am stucked. What I need to do here to establish a map

// between conversionDataFlowComponent.InputCollection[0] and

// conversionDataFlowComponent.OutputCollection[0]?

As you can see I am just away from creating the mapping between input and output collection. Can anybody give me an idea how can I achieve this?

I will appreciate all kind of suggestions and comments.

Regards

Moim

Have a look at this code snippet where I use this transform, selecting all columns, and converting types where required based on the result of a helper function-

#region Select Conversion Columns

IDTSOutput90 output = dataConversion.OutputCollection[0];

IDTSVirtualInput90 virtualInput = dataConversion.InputCollection[0].GetVirtualInput();

int inputId = dataConversion.InputCollection[0].ID;

foreach (IDTSVirtualInputColumn90 column in virtualInput.VirtualInputColumnCollection)

{

int length = column.Length;

int precision = column.Precision;

int scale = column.Scale;

DataType dataType = ComponentHelper.ConvertBufferDataTypeToFitExcel(column.DataType, ref length, ref precision, ref scale);

if (dataType != column.DataType)

{

IDTSInputColumn90 inputColumn = dataConversionInstance.SetUsageType(inputId, virtualInput, column.LineageID, DTSUsageType.UT_READONLY);

IDTSOutputColumn90 outputColumn = dataConversionInstance.InsertOutputColumnAt(output.ID, output.OutputColumnCollection.Count, column.Name, "Excel compatible conversion column");

dataConversionInstance.SetOutputColumnDataTypeProperties(output.ID, outputColumn.ID, dataType, length, precision, scale, 0);

outputColumn.CustomPropertyCollection[0].Value = inputColumn.LineageID;

inputColumn.Name = DataConversionInputRenamePrefix + inputColumn.Name;

}

}

#endregion

Btw, how did you get your code to paste so nicely into this editor Window?

|||

DarrenSQLIS wrote:

Btw, how did you get your code to paste so nicely into this editor Window?

Try this:

http://www.codeproject.com/jscript/CopyasHTML.asp

Or just go here: http://www.google.co.uk/search?hl=en&q=copyashtml&meta=

I have add-in that enables me to do it but I can't for the life of me find it again.

-Jamie

|||

Jamie Thomson wrote:

DarrenSQLIS wrote:

Btw, how did you get your code to paste so nicely into this editor Window?

I have add-in that enables me to do it but I can't for the life of me find it again.

-Jamie

Here it is: http://www.jtleigh.com/people/colin/software/CopySourceAsHtml/|||Nice, thanks Gents.|||

Hello Darran,

Thanks a lot. After implementing your suggestion I believe I am very near to make it work. I think I am missing/doing some silly things/mistakes so that the package is not working as expected. I observed the log traces and found that it is failing to validate the destination data flow component. And debugging it I found that after creating a path between the data conversion component and the destination component, when I populate the input columns for the destination component it is actually getting the source components output columns lineageID instead of data conversion component’s output columns lineageID. Can you help me to find out the problem please?

Anyway, thank you very much for your previous help.

Here is the code snippet:

private void ConfigureSourceComponent()

{

sourceDataFlowComponent.Name = "Source Data from Flat file";

// Here is the component class id for flat file source data

sourceDataFlowComponent.ComponentClassID = "{90C7770B-DE7C-435E-880E-E718C92C0573}";

CManagedComponentWrapper managedFlatFileInstance =

sourceDataFlowComponent.Instantiate();

managedFlatFileInstance.ProvideComponentProperties();sourceDataFlowComponent.RuntimeConnectionCollection[0].

ConnectionManagerID =flatfileConnectionManager.ID;sourceDataFlowComponent.RuntimeConnectionCollection[0].

ConnectionManager =DtsConvert.ToConnectionManager90(flatfileConnectionManager);

managedFlatFileInstance.AcquireConnections(null);

managedFlatFileInstance.ReinitializeMetaData();

IDTSExternalMetadataColumn90 exOutColumn;

foreach (IDTSOutputColumn90 outColumn in sourceDataFlowComponent.OutputCollection[0].OutputColumnCollection)

{

exOutColumn = sourceDataFlowComponent.OutputCollection[0].

ExternalMetadataColumnCollection[outColumn.Name];managedFlatFileInstance.MapOutputColumn(

sourceDataFlowComponent.OutputCollection[0].ID, outColumn.ID, exOutColumn.ID, true);

}

managedFlatFileInstance.ReleaseConnections()

}

private void ConfigureConversionComponent()

{

conversionDataFlowComponent.Name = "Conversion compoenent";

// component class id for Data conversion component

conversionDataFlowComponent.ComponentClassID = "{C3BF62C8-7C5C-4F85-83C3-E0B6F6BE267C}";

CManagedComponentWrapper conversionInstance =

conversionDataFlowComponent.Instantiate();

conversionInstance.ProvideComponentProperties();

// Create the path.

IDTSPath90 fPath = dataFlowTask.PathCollection.New();fPath.AttachPathAndPropagateNotifications(

sourceDataFlowComponent.OutputCollection[0],

conversionDataFlowComponent.InputCollection[0]);

conversionInstance.AcquireConnections(null);

conversionInstance.ReinitializeMetaData();

// Get the output collection

IDTSOutput90 output = conversionDataFlowComponent.OutputCollection[0];

// Get the destination's default input and virtual input.

input = conversionDataFlowComponent.InputCollection[0];

vInput = input.GetVirtualInput();

// Iterate through the virtual input column collection.

foreach (IDTSVirtualInputColumn90 vColumn in vInput.VirtualInputColumnCollection)

{

IDTSInputColumn90 inputColumn =

conversionInstance.SetUsageType(

input.ID, vInput, vColumn.LineageID,

DTSUsageType.UT_READONLY);

IDTSOutputColumn90 outputColumn =

conversionInstance.InsertOutputColumnAt

(output.ID, output.OutputColumnCollection.Count,

vColumn.Name, "Sql server compatible conversion column");

conversionInstance.SetOutputColumnDataTypeProperties(

output.ID, outputColumn.ID, vColumn.DataType,

vColumn.Length, vColumn.Precision, vColumn.Scale, 0);

outputColumn.CustomPropertyCollection[0].Value =

inputColumn.LineageID;

}

conversionInstance.ReleaseConnections();

}

private void ConfigureDestinationComponent()

{

destinationDataFlowComponent.Name = "Destination Oledb compoenent";

// component class id for Oledb

destinationDataFlowComponent.ComponentClassID = "{E2568105-9550-4F71-A638-B7FE42E66922}";

CManagedComponentWrapper managedOleInstance =

destinationDataFlowComponent.Instantiate();

managedOleInstance.ProvideComponentProperties();

destinationDataFlowComponent.RuntimeConnectionCollection[0].

ConnectionManagerID = oledbConnectionManager.ID;

destinationDataFlowComponent.RuntimeConnectionCollection[0].

ConnectionManager =DtsConvert.ToConnectionManager90(oledbConnectionManager);

managedOleInstance.SetComponentProperty("AccessMode", 0);

// .. .. .. setting other component properties here..

// Create the path.

IDTSPath90 path = dataFlowTask.PathCollection.New();

path.AttachPathAndPropagateNotifications(

conversionDataFlowComponent.OutputCollection[0],

destinationDataFlowComponent.InputCollection[0]

);

managedOleInstance.AcquireConnections(null);

managedOleInstance.ReinitializeMetaData();

// Get the destination's default input and virtual input.

IDTSInput90 input;

IDTSVirtualInput90 vInput;

// Get the destination's default input and virtual input.

input = destinationDataFlowComponent.InputCollection[0];

vInput = input.GetVirtualInput();

// Iterate through the virtual input column collection.

foreach (IDTSVirtualInputColumn90 vColumn in

vInput.VirtualInputColumnCollection)

{

managedOleInstance.SetUsageType(

input.ID, vInput, vColumn.LineageID,

DTSUsageType.UT_READONLY);

// culprit Line. I can found vColumn.LineageID = source lineage ids

// while debugging.. can you please help me out here?

}

IDTSExternalMetadataColumn90 exColumn;

foreach (IDTSInputColumn90 inColumn in

destinationDataFlowComponent.InputCollection[0].InputColumnCollection)

{

exColumn = destinationDataFlowComponent.InputCollection[0].

ExternalMetadataColumnCollection[inColumn.Name];managedOleInstance.MapInputColumn(destinationDataFlowComponent.

InputCollection[0].ID, inColumn.ID, exColumn.ID);

}

managedOleInstance.ReleaseConnections();

}

Here you can see the red marked lines. Where I can found that the destination component is watching the source output columns lineage althoug its mapped with data conversion component.

Regards

Moim

|||

The input buffer of the destination will include all upstream columns, so that means both the ones that came from the source and those from the Data Conversion Tx. You will need to be selective in choosing the input columns.

Think of the buffer as it passes down the pipeline, or rather what columns are available -

Source Output - SourceCol1

Data Conversion Output - SourceCol1, SourceCol1(converted)

You don't seem to be distiguishing between the input columns and output columns in the data conversion setup, that seems strange. You would only want to select the converted columns I assume. You could use a name, description or the UpstreamComponentName property of the column perhaps.

|||

Hi,

I have similar requirement.I need to import data from csv files to sql server tables using ssis programatically.iam not able to create a data conversion task programatically and map the columns from the dataconversion task to the oledb destination.tried my level best and got struck.Is there a code demo availble?appreciating your help

|||

adarsha wrote:

Hi,

I have a similar requirement. I need to import data from CSV files to SQL Server tables using SSIS programmatically. I am not able to create a data conversion task programmatically and map the columns from the data conversion task to the OLE DB destination. I tried my level best and got stuck. Is there a code demo available? Appreciate your help.

It would really help if you explained what you are getting stuck on and how far you have got so far. Posting your code would be even better.

-Jamie

|||

Hi,

// In the example below I have pumped data from a SQL table to CSV files, which works fine for me. But now I need to do the exact opposite task (pump data from CSV files to a SQL table).

Problems I have:

1: After adding a flat file source, I am not sure how exactly to add input columns to the FlatFile source, as they don't get populated by themselves. I tried reading the columns from the file into a DataTable and adding them, and somewhat succeeded, but I need clear implementation help.

2: To the extent that I succeeded in adding input columns to the flat file source, the package failed due to data conversion problems (a Unicode to non-Unicode conversion error). I tried programmatically building the data conversion task and had no clue how to do it correctly and make it work.

So I need help with adding input columns to the flat file source, adding a data conversion task, and selecting only the converted columns into the OLE DB destination. (My best-guess sketch of the data conversion step is below; my current code follows after it.)
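Here is that best-guess, untested sketch. Everything in it is an assumption to illustrate the API: dataFlow is the MainPipe, source is the flat file source component, the wrap alias is using wrap = Microsoft.SqlServer.Dts.Runtime.Wrapper; (as in my code below), and SourceInputColumnLineageID is what the Data Conversion transform's output-column custom property appears to be called.

// Sketch only: insert a Data Conversion component between source and destination
// and create one converted, non-Unicode output column per input column.
IDTSComponentMetaData90 convert = dataFlow.ComponentMetaDataCollection.New();
convert.Name = "Data Conversion";
convert.ComponentClassID = "DTSTransform.DataConvert";
CManagedComponentWrapper convertInstance = convert.Instantiate();
convertInstance.ProvideComponentProperties();

// Connect the source's output to the converter's input.
dataFlow.PathCollection.New().AttachPathAndPropagateNotifications(
    source.OutputCollection[0], convert.InputCollection[0]);

IDTSInput90 cInput = convert.InputCollection[0];
IDTSVirtualInput90 cVInput = cInput.GetVirtualInput();
int index = 0;
foreach (IDTSVirtualInputColumn90 vCol in cVInput.VirtualInputColumnCollection)
{
    // Select the column as a read-only input.
    IDTSInputColumn90 inCol = convertInstance.SetUsageType(
        cInput.ID, cVInput, vCol.LineageID, DTSUsageType.UT_READONLY);

    // Add an output column for the converted value: DT_STR(50), code page 1252.
    // Length and code page are assumptions; match them to the target table.
    IDTSOutputColumn90 outCol = convertInstance.InsertOutputColumnAt(
        convert.OutputCollection[0].ID, index++, inCol.Name + "_converted", string.Empty);
    outCol.SetDataTypeProperties(wrap.DataType.DT_STR, 50, 0, 0, 1252);

    // Tell the transform which input column feeds this output column.
    outCol.CustomPropertyCollection["SourceInputColumnLineageID"].Value = inCol.LineageID;
}

The destination would then map only the "_converted" columns (for example via UpstreamComponentName, as suggested earlier in this thread). Anyway, my current code: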

using System;
using System.Collections.Generic;
using System.Collections.Specialized;
using System.Reflection;
using System.Threading;
using System.Text;
using System.IO;
using Microsoft.SqlServer.Dts.Runtime;
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
using wrap = Microsoft.SqlServer.Dts.Runtime.Wrapper;

namespace CCE_ETL_Engine_Flatfiles
{
    class Flatfile_Pump
    {
        Boolean mblnDebug = true;
        string mstrAppLogFile;
        string mstrDTSXFile;
        string DataFileName;
        string DestinationDataDirectory;

        #region initialize
        public void InitializeVariables(string strDTSPackageName)
        {
            DestinationDataDirectory = System.Configuration.ConfigurationManager.AppSettings.Get("DataFilePath");
            DataFileName = System.Configuration.ConfigurationManager.AppSettings.Get("DataFileName");
            Object objLock = new Object();
            Monitor.Enter(objLock);
            string strTimeStamp = DateTime.Now.Hour.ToString().PadLeft(2, '0') + DateTime.Now.Minute.ToString().PadLeft(2, '0') + DateTime.Now.Second.ToString().PadLeft(2, '0') + DateTime.Now.Millisecond.ToString();
            Monitor.Exit(objLock);
            mstrAppLogFile = System.Configuration.ConfigurationManager.AppSettings.Get("ApplicationLogFilePath") + @"\" + strDTSPackageName + "_" + strTimeStamp + ".log";
            mstrDTSXFile = System.Configuration.ConfigurationManager.AppSettings.Get("DTSXFilePath") + @"\" + strDTSPackageName + "_" + strTimeStamp + ".dtsx";
            if (System.Configuration.ConfigurationManager.AppSettings.Get("Debug").ToUpper() == "FALSE")
                mblnDebug = false;
            else
                mblnDebug = true;
        }
        #endregion

        #region constructor
        public Flatfile_Pump(string strDTSPackageName)
        {
            InitializeVariables(strDTSPackageName);
        }
        #endregion

        #region CreatePackage
        public Package CreatePackage(string Name, string Description)
        {
            Package DataPump = new Package();
            DataPump.PackageType = DTSPackageType.DTSDesigner90;
            DataPump.Name = Name;
            DataPump.Description = Description;
            DataPump.CreatorComputerName = System.Environment.MachineName;
            DataPump.CreatorName = System.Environment.UserName;
            return DataPump;
        }
        #endregion

        #region AddConnectionManagers
        /// <summary>
        /// Adds the OLEDB and FlatFile connection managers to the package.
        /// </summary>
        // Here I am adding two connection managers: one for the flat file and one for OLE DB.
        public void AddConnectionManagers(Package DataPump, string SqlConnection)
        {
            // Add the OLEDB connection manager.
            ConnectionManager ConnectionName;
            ConnectionManager cmflatFile;
            ConnectionName = DataPump.Connections.Add("OLEDB");
            // Set stock properties.
            ConnectionName.Name = "Remote Source";
            ConnectionName.ConnectionString = SqlConnection;
            //@"Provider=SQLNCLI;Integrated Security=SSPI;Persist Security Info=False;Initial Catalog=cce_control;Data Source=(wxp-9plyf1s);Auto Translate=False;";
            // Add the destination connection manager.
            cmflatFile = DataPump.Connections.Add("FLATFILE");
            // Set the stock properties.
            cmflatFile.Properties["ConnectionString"].SetValue(cmflatFile, DestinationDataDirectory + DataFileName);
            cmflatFile.Properties["Format"].SetValue(cmflatFile, "Delimited");
            cmflatFile.Properties["DataRowsToSkip"].SetValue(cmflatFile, 0);
            cmflatFile.Properties["ColumnNamesInFirstDataRow"].SetValue(cmflatFile, true);
            cmflatFile.Properties["Name"].SetValue(cmflatFile, "FlatFileConnection");
            cmflatFile.Properties["RowDelimiter"].SetValue(cmflatFile, "\r\n");
            cmflatFile.Properties["TextQualifier"].SetValue(cmflatFile, "\"");
        }
        #endregion

        #region AddDataFlowTask
        /// <summary>
        /// Adds a DataFlow task to the Executables collection of the package.
        /// Retrieves the MainPipe object from the TaskHost and stores it in
        /// the dataFlow member variable.
        /// </summary>
        public MainPipe AddDataFlowTask(Package DataPump)
        {
            TaskHost th = DataPump.Executables.Add("DTS.Pipeline") as TaskHost;
            th.Name = "DataFlow";
            th.Description = "The DataFlow task in the package.";
            MainPipe dataFlow = th.InnerObject as MainPipe;
            ComponentEvents componentEvents = new ComponentEvents();
            dataFlow.Events = componentEvents as wrap.IDTSComponentEvents90;
            return dataFlow;
        }
        #endregion

        #region AddOLEDBSource
        public IDTSComponentMetaData90 AddOLEDBSource(Microsoft.SqlServer.Dts.Pipeline.Wrapper.MainPipe dataFlow, Package DataPump, string strSourceQuery)
        {
            // Declare and add the component to the dataFlow metadata collection.
            IDTSComponentMetaData90 OLEDBsource;
            OLEDBsource = dataFlow.ComponentMetaDataCollection.New();
            // Set the common properties.
            OLEDBsource.ComponentClassID = "DTSAdapter.OLEDBSource";
            OLEDBsource.Name = "cce_control";
            OLEDBsource.Description = "Remote Source Server";
            // Create an instance of the component.
            CManagedComponentWrapper instance = OLEDBsource.Instantiate();
            instance.ProvideComponentProperties();
            // Associate the runtime ConnectionManager with the component.
            OLEDBsource.RuntimeConnectionCollection[0].ConnectionManagerID = DataPump.Connections["Remote Source"].ID;
            OLEDBsource.RuntimeConnectionCollection[0].ConnectionManager = DtsConvert.ToConnectionManager90(DataPump.Connections["Remote Source"]);
            instance.SetComponentProperty("SqlCommand", strSourceQuery);
            instance.SetComponentProperty("AccessMode", 2);
            instance.AcquireConnections(null);
            instance.ReinitializeMetaData();
            instance.ReleaseConnections();
            return OLEDBsource;
        }
        #endregion

        #region AddFlatFileDestination
        public IDTSComponentMetaData90 AddFlatFileDestination(Microsoft.SqlServer.Dts.Pipeline.Wrapper.MainPipe dataFlow, Package DataPump, IDTSComponentMetaData90 OLEDBsource)
        {
            // Declare and add the component to the dataFlow metadata collection.
            IDTSComponentMetaData90 flatfileDestination;
            flatfileDestination = dataFlow.ComponentMetaDataCollection.New();
            // Set the common properties.
            flatfileDestination.ComponentClassID = "DTSAdapter.FlatFileDestination";
            flatfileDestination.Name = "FlatFileDestination";
            flatfileDestination.Description = "Flat file destination";
            // Create an instance of the component.
            CManagedComponentWrapper instance = flatfileDestination.Instantiate();
            instance.ProvideComponentProperties();
            // Associate the runtime ConnectionManager with the component.
            flatfileDestination.RuntimeConnectionCollection[0].ConnectionManagerID = DataPump.Connections["FlatFileConnection"].ID;
            flatfileDestination.RuntimeConnectionCollection[0].ConnectionManager = DtsConvert.ToConnectionManager90(DataPump.Connections["FlatFileConnection"]);
            // Map a path from the OLE DB source component to the FlatFileDestination.
            dataFlow.PathCollection.New().AttachPathAndPropagateNotifications(OLEDBsource.OutputCollection[0], flatfileDestination.InputCollection[0]);
            // Add columns to the FlatFileConnectionManager.
            AddColumnsToFlatFileConnection(DataPump, flatfileDestination);
            // Acquire the connection, reinitialize the metadata,
            // map the columns, then release the connection.
            instance.AcquireConnections(null);
            instance.ReinitializeMetaData();
            MapFlatFileDestinationColumns(flatfileDestination);
            instance.ReleaseConnections();
            return flatfileDestination;
        }
        #endregion

        #region AddColumnsToFlatFileConnection
        public void AddColumnsToFlatFileConnection(Package DataPump, IDTSComponentMetaData90 flatfileDestination)
        {
            wrap.IDTSConnectionManagerFlatFile90 ff = null;
            foreach (ConnectionManager cm in DataPump.Connections)
            {
                if (cm.Name == "FlatFileConnection")
                {
                    ff = cm.InnerObject as wrap.IDTSConnectionManagerFlatFile90;
                    DtsConvert.ToConnectionManager90(cm);
                }
            }
            // If the connection manager is null here, then we have a problem.
            if (ff != null)
            {
                int count = ff.Columns.Count;
                // Get the upstream columns.
                IDTSVirtualInputColumnCollection90 vColumns = flatfileDestination.InputCollection[0].GetVirtualInput().VirtualInputColumnCollection;
                for (int cols = 0; cols < vColumns.Count; cols++)
                {
                    wrap.IDTSConnectionManagerFlatFileColumn90 col = ff.Columns.Add();
                    // If this is the last column, set the delimiter to CRLF;
                    // otherwise keep the delimiter as ",".
                    if (cols == vColumns.Count - 1)
                    {
                        col.ColumnDelimiter = "\r\n";
                    }
                    else
                    {
                        col.ColumnDelimiter = ",";
                    }
                    col.ColumnType = "Delimited";
                    col.DataType = vColumns[cols].DataType;
                    col.DataPrecision = vColumns[cols].Precision;
                    col.DataScale = vColumns[cols].Scale;
                    wrap.IDTSName90 name = col as wrap.IDTSName90;
                    name.Name = vColumns[cols].Name;
                }
            }
        }
        #endregion

        #region MapFlatFileDestinationColumns
        public void MapFlatFileDestinationColumns(IDTSComponentMetaData90 flatfileDestination)
        {
            CManagedComponentWrapper wrp = flatfileDestination.Instantiate();
            IDTSVirtualInput90 vInput = flatfileDestination.InputCollection[0].GetVirtualInput();
            foreach (IDTSVirtualInputColumn90 vColumn in vInput.VirtualInputColumnCollection)
            {
                wrp.SetUsageType(flatfileDestination.InputCollection[0].ID, vInput, vColumn.LineageID, DTSUsageType.UT_READONLY);
            }
            // For each column in the input collection,
            // find the corresponding external metadata column.
            foreach (IDTSInputColumn90 col in flatfileDestination.InputCollection[0].InputColumnCollection)
            {
                IDTSExternalMetadataColumn90 exCol = flatfileDestination.InputCollection[0].ExternalMetadataColumnCollection[col.Name];
                wrp.MapInputColumn(flatfileDestination.InputCollection[0].ID, col.ID, exCol.ID);
            }
        }
        #endregion

        #region Exec_DataPump
        // This is just the execution: I get values such as the server name,
        // create the package, and execute it.
        public Boolean DataPump(string SqlConnection, string strSourceQuery)
        {
            Package DataPump;
            PackageEvents packageEvents;
            MainPipe dataFlow;
            IDTSComponentMetaData90 OLEDBsource, flatfileDestination;
            packageEvents = new PackageEvents();
            DataPump = CreatePackage("SSISPackage", "The package for the Flatfile pump");
            AddConnectionManagers(DataPump, SqlConnection);
            dataFlow = AddDataFlowTask(DataPump);
            OLEDBsource = AddOLEDBSource(dataFlow, DataPump, strSourceQuery);
            flatfileDestination = AddFlatFileDestination(dataFlow, DataPump, OLEDBsource);
            DTSExecResult status = DataPump.Validate(null, null, packageEvents, null);
            if (mblnDebug)
            {
                Application dts = new Application();
                dts.SaveToXml(mstrDTSXFile, DataPump, packageEvents);
            }
            foreach (Variable v in DataPump.Variables)
            {
                if (v.Name.ToUpper() == "EXECUTIONINSTANCEGUID")
                {
                    break;
                }
            }
            if (status == DTSExecResult.Success)
            {
                DTSExecResult result = DataPump.Execute(null, null, packageEvents, null, null);
                foreach (Variable v in DataPump.Variables)
                {
                    Console.WriteLine(v.Name.ToUpper());
                }
            }
            return true;
        }
        #endregion
    }
}

Programmatically creating SSIS package

Hi guys,

I intended to write a program that creates an SSIS package which imports data from a CSV file into SQL Server 2005, but I did not find any good example of this on the internet. I found some examples that export data from SQL Server 2005 to CSV files, and following those I tried to write my own; however, I am facing some problems. What I am doing is creating two connection manager objects, one for the flat file and another for OLE DB, and a data flow task that has two data flow components, one for reading the source and another for writing to the destination. While debugging I can see that after invoking ReinitializeMetaData() on the flat file source component, no output columns are found. Why is it not fetching the output columns from the CSV file? And after that, when ReinitializeMetaData() is invoked on the destination component, it simply throws an exception.

Can anybody help me get around this problem? Can anyone give me a link to a useful article on accomplishing this goal?

I am giving my code here too.

I will appreciate any kind of suggestion.

Code snippet:

public void CreatePackage()
{
    string executeSqlTask = typeof(ExecuteSQLTask).AssemblyQualifiedName;
    Package pkg = new Package();
    pkg.PackageType = DTSPackageType.DTSDesigner90;
    ConnectionManager oledbConnectionManager = CreateOLEDBConnection(pkg);
    ConnectionManager flatfileConnectionManager = CreateFileConnection(pkg);

    // Creating the SQL task for table creation.
    Executable sqlTaskExecutable = pkg.Executables.Add(executeSqlTask);
    ExecuteSQLTask execSqlTask = (sqlTaskExecutable as Microsoft.SqlServer.Dts.Runtime.TaskHost).InnerObject as ExecuteSQLTask;
    execSqlTask.Connection = oledbConnectionManager.Name;
    execSqlTask.SqlStatementSource =
        "CREATE TABLE [MYDATABASE].[dbo].[MYTABLE] \n ([NAME] NVARCHAR(50),[AGE] NVARCHAR(50),[GENDER] NVARCHAR(50)) \nGO";

    // Creating the data flow task.
    Executable dataFlowExecutable = pkg.Executables.Add("DTS.Pipeline.1");
    TaskHost pipeLineTaskHost = (TaskHost)dataFlowExecutable;
    MainPipe dataFlowTask = (MainPipe)pipeLineTaskHost.InnerObject;

    // Put a precedence constraint between the tasks.
    PrecedenceConstraint pcTasks = pkg.PrecedenceConstraints.Add(sqlTaskExecutable, dataFlowExecutable);
    pcTasks.Value = DTSExecResult.Success;
    pcTasks.EvalOp = DTSPrecedenceEvalOp.Constraint;

    // Now adding the data flow components.
    IDTSComponentMetaData90 sourceDataFlowComponent = dataFlowTask.ComponentMetaDataCollection.New();
    sourceDataFlowComponent.Name = "Source Data from Flat file";
    // Here is the component class id for the flat file source.
    sourceDataFlowComponent.ComponentClassID = "{90C7770B-DE7C-435E-880E-E718C92C0573}";
    CManagedComponentWrapper managedInstance = sourceDataFlowComponent.Instantiate();
    managedInstance.ProvideComponentProperties();
    sourceDataFlowComponent.RuntimeConnectionCollection[0].ConnectionManagerID = flatfileConnectionManager.ID;
    sourceDataFlowComponent.RuntimeConnectionCollection[0].ConnectionManager = DtsConvert.ToConnectionManager90(flatfileConnectionManager);
    managedInstance.AcquireConnections(null);
    managedInstance.ReinitializeMetaData();
    managedInstance.ReleaseConnections();

    // Get the source's default output.
    IDTSOutput90 output = sourceDataFlowComponent.OutputCollection[0];
    // Here I don't find any columns at all... why?

    // Now adding the destination data flow component.
    IDTSComponentMetaData90 destinationDataFlowComponent = dataFlowTask.ComponentMetaDataCollection.New();
    destinationDataFlowComponent.Name = "Destination OLE DB component";
    // Here is the component class id for the OLE DB destination.
    destinationDataFlowComponent.ComponentClassID = "{E2568105-9550-4F71-A638-B7FE42E66922}";
    CManagedComponentWrapper managedOleInstance = destinationDataFlowComponent.Instantiate();
    managedOleInstance.ProvideComponentProperties();
    destinationDataFlowComponent.RuntimeConnectionCollection[0].ConnectionManagerID = oledbConnectionManager.ID;
    destinationDataFlowComponent.RuntimeConnectionCollection[0].ConnectionManager = DtsConvert.ToConnectionManager90(oledbConnectionManager);

    // Set the custom properties.
    managedOleInstance.SetComponentProperty("AccessMode", 2);
    managedOleInstance.SetComponentProperty("OpenRowset", "[MYDATABASE].[dbo].[MYTABLE]");
    managedOleInstance.AcquireConnections(null);
    managedOleInstance.ReinitializeMetaData(); // Throws exception
    managedOleInstance.ReleaseConnections();

    // Create the path.
    IDTSPath90 path = dataFlowTask.PathCollection.New();
    path.AttachPathAndPropagateNotifications(sourceDataFlowComponent.OutputCollection[0],
        destinationDataFlowComponent.InputCollection[0]);

    // Get the destination's default input and virtual input.
    IDTSInput90 input = destinationDataFlowComponent.InputCollection[0];
    IDTSVirtualInput90 vInput = input.GetVirtualInput();
    // Iterate through the virtual input column collection.
    foreach (IDTSVirtualInputColumn90 vColumn in vInput.VirtualInputColumnCollection)
    {
        managedOleInstance.SetUsageType(
            input.ID, vInput, vColumn.LineageID, DTSUsageType.UT_READONLY);
    }

    DTSExecResult res = pkg.Execute();
}

public ConnectionManager CreateOLEDBConnection(Package p)
{
    ConnectionManager ConMgr;
    ConMgr = p.Connections.Add("OLEDB");
    ConMgr.ConnectionString =
        "Data Source=VSTS;Initial Catalog=MYDATABASE;Provider=SQLNCLI;Integrated Security=SSPI;Auto Translate=false;";
    ConMgr.Name = "SSIS Connection Manager for Oledb";
    ConMgr.Description = "OLE DB connection to the Test database.";
    return ConMgr;
}

public ConnectionManager CreateFileConnection(Package p)
{
    ConnectionManager connMgr;
    connMgr = p.Connections.Add("FLATFILE");
    connMgr.ConnectionString = @"D:\MyCSVFile.csv";
    connMgr.Name = "SSIS Connection Manager for Files";
    connMgr.Description = "Flat File connection";
    connMgr.Properties["Format"].SetValue(connMgr, "Delimited");
    connMgr.Properties["HeaderRowDelimiter"].SetValue(connMgr, Environment.NewLine);
    return connMgr;
}

And my CSV file is as follows:

NAME, AGE, GENDER
Jon,52,MALE
Linda, 26, FEMALE

That's all. Thanks.

As mentioned earlier, this CSV to OLEDB package generator is not far from working.

First, the AccessMode on the OLE DB destination is set to the constant 2, which is the "SQL Command" constant. Change that back to 0 ("Table or View").

FYI, the OLEDB destination's Access Mode enumeration is undocumented, but nevertheless can be found in the following dll: Microsoft SQL Server\DTS\PipelineComponents\OleDbDest.dll using a program like "PE Explorer".

enum AccessMode
{
    AM_OPENROWSET = 0,
    AM_OPENROWSET_VARIABLE = 1,
    AM_SQLCOMMAND = 2,
    AM_OPENROWSET_FASTLOAD = 3,
    AM_OPENROWSET_FASTLOAD_VARIABLE = 4
}
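
In code, switching the destination to a straight table load would look like this (values taken from the enumeration above; 3 would be the fast-load variant):

// "Table or view" access mode (AM_OPENROWSET = 0).
managedOleInstance.SetComponentProperty("AccessMode", 0);
managedOleInstance.SetComponentProperty("OpenRowset", "[MYDATABASE].[dbo].[MYTABLE]");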

Next, the target table doesn't exist yet, but ReinitializeMetaData() on the OLE DB destination component will by default attempt to retrieve the target table's metadata. The table isn't there, so an exception is thrown. Therefore, as part of the driving program, you may wish to create the table temporarily (this table create is separate from the CREATE in the Execute SQL Task), so that there is metadata to reinitialize.
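A minimal sketch of that pre-create step, using plain ADO.NET before ReinitializeMetaData() is called (the connection string is just an example matching the one in the package):

// Sketch: create the target table up front so the OLE DB destination's
// ReinitializeMetaData() has something to read. Requires: using System.Data.SqlClient;
using (SqlConnection conn = new SqlConnection(
    "Data Source=VSTS;Initial Catalog=MYDATABASE;Integrated Security=SSPI;"))
{
    conn.Open();
    SqlCommand cmd = new SqlCommand(
        "IF OBJECT_ID('[dbo].[MYTABLE]') IS NULL " +
        "CREATE TABLE [dbo].[MYTABLE] ([NAME] NVARCHAR(50), [AGE] NVARCHAR(50), [GENDER] NVARCHAR(50))",
        conn);
    cmd.ExecuteNonQuery();
}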

That should eliminate the exceptions, but it doesn't mean the package will validate or execute successfully, just that you could save it to XML.

Then, add source columns for your flat file. SSIS does not materialize them for you (at least as far as I know), so you need to add them programmatically.
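A rough sketch of describing those columns on the flat file connection manager (column names are taken from the sample CSV; assumes using wrap = Microsoft.SqlServer.Dts.Runtime.Wrapper;). After this, ReinitializeMetaData() on the flat file source should pick the columns up:

// Sketch: add one column description per CSV column to the connection manager.
wrap.IDTSConnectionManagerFlatFile90 ff =
    flatfileConnectionManager.InnerObject as wrap.IDTSConnectionManagerFlatFile90;
string[] names = { "NAME", "AGE", "GENDER" };
for (int i = 0; i < names.Length; i++)
{
    wrap.IDTSConnectionManagerFlatFileColumn90 col = ff.Columns.Add();
    col.ColumnType = "Delimited";
    // The last column is terminated by the row delimiter.
    col.ColumnDelimiter = (i == names.Length - 1) ? "\r\n" : ",";
    col.DataType = wrap.DataType.DT_WSTR; // assumption: treat everything as text
    col.MaximumWidth = 50;                // assumption: output width of 50
    (col as wrap.IDTSName90).Name = names[i];
}

|||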

Hello jaegd,

At last I made it work today. Your tips helped me tremendously; after implementing your suggestions I had to do very little coding to make it work. Thanks a lot.

Anyway, as I am creating the source columns programmatically for the flat file connection, I had to read the column names from the CSV file using .NET IO functionality (roughly the sketch below). Is there any way to accomplish this through the SSIS APIs instead?
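In case it helps anyone else, the IO part amounts to roughly this (a sketch, not my exact code; the file path is just an example):

// Sketch: read the CSV header row and split it into column names,
// which can then drive the flat file connection manager columns.
// Requires: using System.IO;
string[] columnNames;
using (StreamReader reader = new StreamReader(@"D:\MyCSVFile.csv"))
{
    string header = reader.ReadLine();
    columnNames = header.Split(','); // assumes no quoted, comma-containing headers
}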

Thank you very much again.

Regards

Moim

|||

Hello,

How did you create the metadata for the destination?

Thanks in advance.

Cheers,

kix