Monday, March 12, 2012
Programming
I'm trying to write a short SQL script that points at a folder, queries that
folder for *.bak files, and deletes any *.bak file that is older than a certain
number of days.
Is this possible using tsql?
"Andrew J. Kelly" wrote:
> Don't use EM, use a script instead. EM has some limitations and things like
> this can be much more controlled through scripts. The restore syntax is
> pretty simple and there are examples in BOL. If you have troubles with it
> post your script and we can help.
You can't do this strictly with T-SQL, but this is as close as you will get.
Here is a sample stored procedure that will delete old log file backups based
on the timestamp in the names of the files.
-- Removing Older Backup Files --
-- Remove any log files older than 7 days
-- (Create the procedure below before running this demo call.)
DECLARE @Error INT, @D DATETIME
SET @D = DATEADD(dd, -7, GETDATE())
EXEC @Error = remove_old_log_files @D
SELECT @Error
GO
CREATE PROCEDURE remove_old_log_files
@DelDate DATETIME
AS
SET NOCOUNT ON
DECLARE @SQL VARCHAR(500), @FName VARCHAR(40), @Error INT
DECLARE @Delete VARCHAR(300), @Msg VARCHAR(100), @Return INT
-- Used later when we cast strings to datetimes
SET DATEFORMAT MDY
-- Create a table to hold the results of the DIR command
IF OBJECT_ID('tempdb..#dirlist') IS NOT NULL
DROP TABLE #dirlist
CREATE TABLE #dirlist (FName VARCHAR(1000))
-- Create a table to hold any errors
IF OBJECT_ID('tempdb..#Errors') IS NOT NULL
DROP TABLE #Errors
CREATE TABLE #Errors (Results VARCHAR(1000))
-- Insert the results of the DIR command into a table so we can scan it
INSERT INTO #dirlist (FName)
EXEC @Return = master..xp_cmdshell 'dir /OD C:\Data\Backups\*.bak'
SET @Error = @@ERROR
IF @Error <> 0 OR @Return <> 0
BEGIN
IF @Return = 1
SET @Error = -1
SET @Msg = 'Error while getting the filenames with DIR'
GOTO On_Error
END
-- Remove the garbage (header lines, <DIR> entries, NULL rows)
DELETE #dirlist WHERE
SUBSTRING(FName,1,2) < '00' OR
SUBSTRING(FName,1,2) > '99' OR
FName IS NULL OR
FName LIKE '%<DIR>%'
-- Create a cursor and do the processing for each file name.
-- The files will be processed in date order.
DECLARE curDir CURSOR READ_ONLY LOCAL
FOR
SELECT SUBSTRING(FName,40,40) AS FName
FROM #dirlist
WHERE CAST(SUBSTRING(FName,1,20) AS DATETIME) < @DelDate
AND SUBSTRING(FName,40,40) LIKE '%.BAK'
ORDER BY CAST(SUBSTRING(FName,1,20) AS DATETIME)
OPEN curDir
FETCH NEXT FROM curDir INTO @FName
WHILE (@@FETCH_STATUS = 0)
BEGIN
-- Delete the old backup file
SET @Delete = 'DEL "C:\Data\Backups\' + @FName + '"'
INSERT INTO #Errors (Results)
EXEC @Return = master..xp_cmdshell @Delete
-- DEL produces no output on success, so rows coming back signal a problem
IF @@ROWCOUNT > 1 OR @Return = 1
BEGIN
SET @Error = -1
SET @Msg = 'Error while deleting file ' + @FName
GOTO On_Error
END
PRINT 'Deleted ' + @FName + ' at ' +
CONVERT(VARCHAR(28), GETDATE(), 113)
FETCH NEXT FROM curDir INTO @FName
END
CLOSE curDir
DEALLOCATE curDir
DROP TABLE #dirlist
DROP TABLE #Errors
RETURN @Error
On_Error:
BEGIN
IF @Error <> 0
BEGIN
SELECT @Msg + '. Error # ' + CAST(@Error AS VARCHAR(10))
RAISERROR(@Msg, 12, 1)
RETURN @Error
END
END
GO
Andrew J. Kelly SQL MVP
"Sonya" <Sonya@.discussions.microsoft.com> wrote in message
news:041D4B77-15AB-42F8-80B6-AB740408D913@.microsoft.com...[vbcol=seagreen]
> Andrew or anyone,
> I'm trying to write a short sql stmts to attached to a folder and query
> that
> folder for *.bak and if the*.bak is older than a certain amount of days,
> then
> I want to delete them.
> Is this possible using tsql?
> "Andrew J. Kelly" wrote:
|||Thanks. I'm going through the code and I'm running into an error when it
runs the directory list command. I'm not able to view the temporary table to
see whether any data is being stored.
"Andrew J. Kelly" wrote:
> You can't do this strictly with TSQL but this is as close as you will get.
> There is a sample sp that will delete old log file backups based on the
> timestamp in the name of the files.
> -- Removing Older Backup Files --
> -- Remove any log files older than 7 days
> DECLARE @.Error INT, @.D DATETIME
> SET @.D = DATEADD(dd,-7,GETDATE())
> EXEC @.Error = remove_old_log_files @.D
>
> SELECT @.Error
> --
> CREATE PROCEDURE remove_old_log_files
> @.DelDate DATETIME
> AS
> SET NOCOUNT ON
> DECLARE @.SQL VARCHAR(500), @.FName VARCHAR(40), @.Error INT
> DECLARE @.Delete VARCHAR(300), @.Msg VARCHAR(100), @.Return INT
> -- Used later when we cast strings to Datetimes
> SET DATEFORMAT MDY
> -- Create a table to hold the results of the DIR command
> IF OBJECT_ID('tempdb..#dirlist') IS NOT NULL
> DROP TABLE #DirList
> CREATE TABLE #dirlist (FName VARCHAR(1000))
> -- Create a table to hold any errors
> IF OBJECT_ID('tempdb..#Errors') IS NOT NULL
> DROP TABLE #Errors
> CREATE TABLE #Errors (Results VARCHAR(1000))
> -- Insert the results of the dir cmd into a table so we can scan it
> INSERT INTO #dirlist (FName)
> exec @.Return = master..xp_cmdshell 'dir /OD C:\Data\Backups\*.Bak'
> SET @.Error = @.@.ERROR
> IF @.Error <> 0 OR @.Return <> 0
> BEGIN
> IF @.Return = 1
> SET @.Error = -1
> SET @.Msg = 'Error while getting the filenames with DIR '
> GOTO On_Error
> END
> -- Remove the garbage
> DELETE #dirlist WHERE
> SUBSTRING(FName,1,2) < '00' OR
> SUBSTRING(FName,1,2) > '99' OR
> FName IS NULL OR
> FName LIKE '%<DIR>%'
>
> -- Create a cursor and for each file name do the processing.
> -- The files will be processed in date order.
> DECLARE curDir CURSOR READ_ONLY LOCAL
> FOR
> SELECT SUBSTRING(FName,40,40) AS FName
> FROM #dirlist
> WHERE CAST(SUBSTRING(FName,1,20) AS DATETIME) < @.DelDate
> AND SUBSTRING(FName,40,40) LIKE '%.BAK'
> ORDER BY CAST(SUBSTRING(FName,1,20) AS DATETIME)
> OPEN curDir
> FETCH NEXT FROM curDir INTO @.Fname
> WHILE (@.@.fetch_status = 0)
> BEGIN
>
> -- Delete the old backup files
> SET @.Delete = 'DEL "C:\Data\Backups\' + @.FName + '"'
> INSERT INTO #Errors (Results)
> exec @.Return = master..xp_cmdshell @.Delete
>
> IF @.@.RowCount > 1 OR @.Return = 1
> BEGIN
> SET @.Error = -1
> SET @.Msg = 'Error while Deleting file ' + @.FName
> GOTO On_Error
> END
> PRINT 'Deleted ' + @.FName + ' at ' +
> CONVERT(VARCHAR(28),GETDATE(),113)
> FETCH NEXT FROM curDir INTO @.Fname
> END
> CLOSE curDir
> DEALLOCATE curDir
> DROP TABLE #DirList
> DROP TABLE #Errors
> RETURN @.Error
> On_Error:
> BEGIN
> IF @.Error <> 0
> BEGIN
> SELECT @.Msg + '. Error # ' + CAST(@.Error AS VARCHAR(10))
> RAISERROR(@.Msg,12,1)
> RETURN @.Error
> END
> END
> GO
>
> --
> Andrew J. Kelly SQL MVP
>
> "Sonya" <Sonya@.discussions.microsoft.com> wrote in message
> news:041D4B77-15AB-42F8-80B6-AB740408D913@.microsoft.com...
>
>
|||Do you mean this line?
INSERT INTO #dirlist (FName)
EXEC @Return = master..xp_cmdshell 'dir /OD C:\Data\Backups\*.bak'
Can you be so kind as to list exactly what the error is that you are
getting? I don't understand why you can't see this table. Are you running
the SELECT on the same connection as the one that created the table? If not,
you won't be able to see it, since it is a local temporary table. You really
need to be a little more specific about what you are doing and what the real
errors are that you are getting; otherwise I am simply guessing.
Andrew J. Kelly SQL MVP
"Sonya" <Sonya@.discussions.microsoft.com> wrote in message
news:61C4CAE3-8398-4905-AE4C-411D6A55DA12@.microsoft.com...[vbcol=seagreen]
> Thanks.. I'm going through the code and I'm running into an error when you
> use the directory list command. I'm not able to view the temporary table
> to
> see if data is being stored.
> "Andrew J. Kelly" wrote:
|||When I try to query the temp table within the same connection, it says the
object is invalid. So I'm not sure the script is working because I can't view
the table.
I don't have any more specific errors. Sorry if I was vague in my previous
messages.
"Andrew J. Kelly" wrote:
> Do you mean this line: INSERT INTO #dirlist (FName)
> exec @.Return = master..xp_cmdshell 'dir /OD C:\Data\Backups\*.Bak'
> Can you be so kind as to list exactly what the error is that you are
> getting? I don't understand why you can't see this table? Are you running
> the select in the same connection as the one that created the table? If not
> you won't be able to see it since it is a local temporary table. You really
> need to be a little more specific on what you are doing and what the real
> errors are that you are getting otherwise I am simply guessing.
> --
> Andrew J. Kelly SQL MVP
>
> "Sonya" <Sonya@.discussions.microsoft.com> wrote in message
> news:61C4CAE3-8398-4905-AE4C-411D6A55DA12@.microsoft.com...
>
>
|||Sonya,
The temp table is created inside of a stored procedure. If you run the sp
and it creates a temp table, the table only lives for the duration of the
sp's execution. So by the time you query for it, the table has been destroyed.
This is how local temp tables work and is the whole purpose of them. If you
want to play around with it, you need to remove the code from the sp and run
it as a standard batch in Query Analyzer.
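A minimal batch sketch of that behavior (the procedure and table names here are made up for the demo):
-- A local temp table created inside a procedure dies when the procedure returns
CREATE PROCEDURE demo_temp_scope AS
CREATE TABLE #t (i INT)
INSERT #t VALUES (1)
SELECT * FROM #t -- visible here, while the procedure is still running
GO
EXEC demo_temp_scope -- returns the single row
GO
SELECT * FROM #t -- fails with "Invalid object name '#t'": the table is already gone
GO
DROP PROCEDURE demo_temp_scope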
Andrew J. Kelly SQL MVP
"Sonya" <Sonya@.discussions.microsoft.com> wrote in message
news:D7770A92-6EEA-4B7D-8C9C-6A9DFBC92046@.microsoft.com...[vbcol=seagreen]
> When I try to query the temp table within the same connection, it says the
> object is invalid. So I'm not sure the script is working because I can't
> view
> the table.
> I don't have any more specific errors. Sorry if I was vague in my previous
> messages.
> "Andrew J. Kelly" wrote:
|||OK, I was doing that as well. I'll have to remove the RETURN statement or
reference it differently. Thank you for all of your help. The message that I
received is the following:
Server: Msg 178, Level 15, State 1, Line 92
A RETURN statement with a return value cannot be used in this context.
Server: Msg 178, Level 15, State 1, Line 99
A RETURN statement with a return value cannot be used in this context.
"Andrew J. Kelly" wrote:
> Sonya,
> The temp table is created inside of a stored procedure. If you run the sp
> and it creates a temp table it only lives for the duration of the sp
> execution. So by the time you query for it the table has been destroyed.
> This is how local temp tables work and is the whole purpose of them. If you
> wan to play around with it you need to remove the code from the sp and run
> it as a standard batch in query analyzer.
> --
> Andrew J. Kelly SQL MVP
>
> "Sonya" <Sonya@.discussions.microsoft.com> wrote in message
> news:D7770A92-6EEA-4B7D-8C9C-6A9DFBC92046@.microsoft.com...
>
>
|||You can't use RETURN if you are not in a stored procedure. Just comment
that line out for your testing.
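For example, while testing, the tail of the batch version could look like this:
-- RETURN @Error -- RETURN with a value is only legal inside a stored procedure
SELECT @Error AS ErrorValue -- in a plain batch, just inspect the value instead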
Andrew J. Kelly SQL MVP
"Sonya" <Sonya@.discussions.microsoft.com> wrote in message
news:D62D330E-685E-4048-8635-FC6943AC9F50@.microsoft.com...[vbcol=seagreen]
> Ok, I was doing that as well. I'll have to remove or reference the return
> function differently. Thank you for all of your help. The message that I
> received is the following:
> Server: Msg 178, Level 15, State 1, Line 92
> A RETURN statement with a return value cannot be used in this context.
> Server: Msg 178, Level 15, State 1, Line 99
> A RETURN statement with a return value cannot be used in this context.
>
> "Andrew J. Kelly" wrote:
Programmatically check DB
I want to write a job that will check for DB errors (as in DBCC CHECKDB) and
will take action if errors exist.
Is there a way to extract the errors returned by DBCC CHECKDB?
|||Have you considered setting up a DB Maintenance Plan? If you use the
Database Maintenance Plan Wizard you get the option to fix any errors the
checking finds. This can be scheduled to run as a job.
HTH
Ryan Waight, MCDBA, MCSE
"Hagay Lupesko" <hagayl@.nice.com> wrote in message
news:ukCzmFZkDHA.688@.TK2MSFTNGP10.phx.gbl...
> Hi,
> I want to write a job that will check for DB errors (as in dbcc checkdb)
and
> will operate if errors exists.
> Is there a way to extract errors returned by dbcc checkdb?
|||That maintenance plan option does require the database to be in single-user
mode though, even when it doesn't have to repair anything. That can cause
quite a few problems. I prefer to have DBCC CHECKDB run as a job, check that
the job runs as expected, and act when something goes wrong. At least then I
know when something goes wrong, which might indicate an underlying problem,
rather than have problems repaired automatically.
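A minimal sketch of such a job step (the database name is just an example):
-- Scheduled job step: with NO_INFOMSGS a clean database produces no output.
-- If corruption is found, the step raises errors and the job fails, so a
-- notification on job failure tells you to investigate rather than auto-repair.
DBCC CHECKDB ('MyDatabase') WITH NO_INFOMSGS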
--
Jacco Schalkwijk
SQL Server MVP
"Ryan Waight" <Ryan_Waight@.nospam.hotmail.com> wrote in message
news:uZqZKYZkDHA.2232@.TK2MSFTNGP09.phx.gbl...
> Have you consider setting up a DB Maintenance plan..? If you use the
> Database Maintenance Plan Wizard you get the option to fix any errors the
> checking finds. This can be scheduled to run as a job
>
> --
> HTH
> Ryan Waight, MCDBA, MCSE
> "Hagay Lupesko" <hagayl@.nice.com> wrote in message
> news:ukCzmFZkDHA.688@.TK2MSFTNGP10.phx.gbl...
> > Hi,
> > I want to write a job that will check for DB errors (as in dbcc checkdb)
> and
> > will operate if errors exists.
> >
> > Is there a way to extract errors returned by dbcc checkdb?
> >
> >
>
Friday, March 9, 2012
Programmatically extending the project tree
Hi,
I need to write an add-in that, when installed, adds a virtual folder to the project tree (in which I can later add other items).
The problem is even more complicated because the project I want to extend is an SSIS project (SQL Server Integration Services), and SSIS doesn't let you customize the project structure even from the project explorer itself (you can't add folders).
I appreciate any help on this.
Thanks.
You can't add new folders to an SSIS project, neither from the UI nor programmatically. We automatically assign each file to one of the predefined folders (based on file extension - you may notice a file move when you rename it). If you describe what your goals are, we will consider this for a future version.|||
Thank you.
I am writing an add-in to automate certain activities in SSIS. Each such activity has its own definition, and you can have multiple activities in a given SSIS project.
I thought the best way to do it would be to add my own folder to the project tree and list the already-defined activities there, as well as allowing the creation of new activities.
Each activity creates multiple SSIS packages, and also does some work outside the scope of SSIS.
I would love to hear any other ideas for solving this issue.
Thanks.
|||I would create a new tool window which presents a different view of the SSIS project - with activities, other custom items, and the appropriate operations. This would be similar to e.g. Class View vs. Project View in C# or VB.NET projects.|||Thanks.
Can you send me any reference or technical guide on how to do this?
|||Michael,
I am working with Noa on the same project.
I am trying to do this differently from what you suggested. I added a new entry to the project context menu (right-click on the project) and I am trying to activate my add-in from there. The problem is that this entry now appears for all project types, not only for SSIS! Is there a better way to do this - maybe add the entry to another context menu? Or is there any indication that can tell me programmatically that I am dealing with an SSIS project, so I can disable the entry for other projects?
Thanks,
Sami.
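One possibility, as an untested sketch: handle IDTCommandTarget.QueryStatus in the add-in and enable the command only when the active project's Kind matches the SSIS project-type GUID. The GUID below is a placeholder; inspect Project.Kind on an open SSIS project to find the real value.
// Sketch: enable the custom command just for SSIS projects.
// using EnvDTE; using EnvDTE80; _applicationObject is the DTE2 saved in OnConnection.
// SSIS_PROJECT_KIND is a placeholder -- look it up via Project.Kind on a real SSIS project.
private const string SSIS_PROJECT_KIND = "{00000000-0000-0000-0000-000000000000}";
public void QueryStatus(string commandName, vsCommandStatusTextWanted neededText,
    ref vsCommandStatus status, ref object commandText)
{
    status = vsCommandStatus.vsCommandStatusInvisible; // hidden by default
    Array projects = (Array)_applicationObject.ActiveSolutionProjects;
    if (projects.Length > 0)
    {
        Project project = (Project)projects.GetValue(0);
        if (string.Compare(project.Kind, SSIS_PROJECT_KIND, true) == 0)
            status = vsCommandStatus.vsCommandStatusSupported |
                vsCommandStatus.vsCommandStatusEnabled;
    }
}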
Programmatically developing an entire package..
Hi all,
I am trying to write a program that creates packages on the fly depending on the variables you pass. For example, it should create connection managers on the fly specific to the file named in a variable (e.g. sample.csv). The package has a data flow task with a flat file source and an OLE DB destination. The problem I am facing: when the flat file source is assigned to a flat file connection manager (dynamically), it does not produce any source output columns; that is, the value of DFSource.OutputCollection(0).OutputColumnCollection.Count is zero. But when I use the same code with the source and destination reversed (OLE DB as source and flat file as destination), it works fine. I searched everywhere for resources on developing packages programmatically but could not find any except one example on MSDN. If anyone knows about this problem, or any useful resources on the subject, it would be really helpful.
Thanks,
Prithvi.
This is the code for creating the connections and adding the data flow task. I need to add a Script Component transformation, but initially I wanted to see if a one-to-one mapping works fine.
Public Sub CreateConnections()
'Add the OLE DB and Flat File Connection Managers
Console.WriteLine("Creating the MyOLEDBConnection")
Dim cnOLEDB As ConnectionManager = myPackage.Connections.Add("OLEDB")
cnOLEDB.Name = "MyOLEDBConnection"
cnOLEDB.ConnectionString = "<connection string>"
Console.WriteLine("Creating the MyFlatFileConnection")
Dim cnFile As ConnectionManager = myPackage.Connections.Add("FLATFILE")
cnFile.Name = "MyFlatFileConnection"
cnFile.Properties("ConnectionString").SetValue(cnFile, "C:\sample.csv")
cnFile.Properties("Format").SetValue(cnFile, "Delimited")
cnFile.Properties("ColumnNamesInFirstDataRow").SetValue(cnFile, False)
cnFile.Properties("DataRowsToSkip").SetValue(cnFile, 0)
cnFile.Properties("RowDelimiter").SetValue(cnFile, vbCrLf)
cnFile.Properties("TextQualifier").SetValue(cnFile, """")
End Sub
Public Sub AddDataFlowTask()
'Add a Data Flow Task
Console.WriteLine("Adding a Data Flow Task")
Dim e As Executable = myPackage.Executables.Add("DTS.Pipeline")
Dim taskDF As TaskHost = CType(e, TaskHost)
taskDF.Name = "DataFlow"
Dim DTP As MainPipe
DTP = CType(taskDF.InnerObject, MainPipe)
' Add the FLAT FILE Source
Console.WriteLine("Adding the File Source")
Dim DFSource As IDTSComponentMetaData90
DFSource = DTP.ComponentMetaDataCollection.New()
DFSource.ComponentClassID = "DTSAdapter.FlatFileSource.1"
DFSource.Name = "FlatFileSource"
' Connect, populate the Input collections and disconnect
Dim SourceInst As CManagedComponentWrapper = DFSource.Instantiate()
SourceInst.ProvideComponentProperties()
DFSource.RuntimeConnectionCollection(0).ConnectionManagerID = myPackage.Connections("MyFlatFileConnection").ID
DFSource.RuntimeConnectionCollection(0).ConnectionManager = DtsConvert.ToConnectionManager90(myPackage.Connections("MyFlatFileConnection"))
SourceInst.AcquireConnections(vbNull)
SourceInst.ReinitializeMetaData()
SourceInst.ReleaseConnections()
'Check whether the external metadata columns are available as source output
'columns; the count comes back 0, which means no columns are being output from the source
Dim column As IDTSOutputColumn90
Try
Console.WriteLine("connection name: " & DFSource.RuntimeConnectionCollection(0).Name.ToString)
Console.WriteLine("output collection name: " & DFSource.OutputCollection(0).Name)
Console.WriteLine("output collection description: " & DFSource.OutputCollection(0).Description)
Console.WriteLine("source output columns count: " & DFSource.OutputCollection(0).OutputColumnCollection.Count.ToString)
Catch ex As Exception
Console.WriteLine(ex.InnerException.Message.ToString)
End Try
'tried to print col names but it does not print any
For Each column In DFSource.OutputCollection(0).OutputColumnCollection
Console.WriteLine(column.Name.ToString)
Console.WriteLine(column.DataType.ToString)
Next
' Add the OLEDB Destination
Console.WriteLine("Adding OLEDB Destination")
Dim DFDestination As IDTSComponentMetaData90
DFDestination = DTP.ComponentMetaDataCollection.New()
DFDestination.ComponentClassID = "DTSAdapter.OLEDBDestination"
DFDestination.Name = "OLEDBDestination"
' Create an instance of the component
Dim DestInst As CManagedComponentWrapper = DFDestination.Instantiate()
DestInst.ProvideComponentProperties()
If DFDestination.RuntimeConnectionCollection.Count > 0 Then
DFDestination.RuntimeConnectionCollection(0).ConnectionManagerID = myPackage.Connections("MyOLEDBConnection").ID
DFDestination.RuntimeConnectionCollection(0).ConnectionManager = DtsConvert.ToConnectionManager90(myPackage.Connections("MyOLEDBConnection"))
End If
DestInst.SetComponentProperty("AccessMode", 0)
DestInst.SetComponentProperty("OpenRowset", "tempSSIS")
DestInst.SetComponentProperty("FastLoadKeepNulls", True)
' Map a connection between the source and destination
Dim path As IDTSPath90 = DTP.PathCollection.New()
path.AttachPathAndPropagateNotifications(DFSource.OutputCollection(0), DFDestination.InputCollection(0))
Dim InColumns As IDTSVirtualInputColumnCollection90 = DFDestination.InputCollection(0).GetVirtualInput().VirtualInputColumnCollection()
' the number of input columns to destination is zero
Console.WriteLine("input columns : " & InColumns.Count.ToString)
Try
DestInst.AcquireConnections(vbNull)
Catch ex As Exception
Console.WriteLine(ex.InnerException.Message)
End Try
DestInst.ReinitializeMetaData()
'Console.WriteLine("input columns : " & DFDestination.InputCollection(0).InputColumnCollection.Count.ToString)
For Each input As IDTSInput90 In DFDestination.InputCollection
' Get the virtual input column collection for the input.
Dim vInput As IDTSVirtualInput90 = input.GetVirtualInput()
' Iterate through the virtual column collection.
For Each vColumn As IDTSVirtualInputColumn90 In vInput.VirtualInputColumnCollection
' Call the SetUsageType method of the design time instance of the component.
Console.WriteLine(vColumn.Name.ToString)
Console.WriteLine(vColumn.DataType)
DestInst.SetUsageType(input.ID, vInput, vColumn.LineageID, DTSUsageType.UT_READONLY)
Next
Dim exCol As IDTSExternalMetadataColumn90
For Each col As IDTSInputColumn90 In DFDestination.InputCollection(0).InputColumnCollection
exCol = DFDestination.InputCollection(0).ExternalMetadataColumnCollection(col.Name)
DestInst.MapInputColumn(DFDestination.InputCollection(0).ID, col.ID, exCol.ID)
Next
DestInst.ReleaseConnections()
End Sub
Please see if I did anything wrong anywhere (which I always happen to do). Based on the MSDN material, the above code should work, I guess.
Programmatically creating SSIS package
Hi guys,
I intend to write a program that creates an SSIS package to import data from a CSV file into SQL Server 2005, but I did not find any good example of this on the internet. I found some examples that export data from SQL Server 2005 to CSV files, and following those I tried to write my own, but I am facing a problem. What I am doing is creating two connection manager objects, one for the flat file and another for OLE DB, and creating a data flow task that has two data flow components, one for reading the source and another for writing to the destination. While debugging I can see that after invoking ReinitializeMetaData() for the flat file source component, no output columns are found. Why is it not fetching the output columns from the CSV file? And when it then invokes ReinitializeMetaData() for the destination component, it simply throws an exception.
Can anybody help me get around this problem? Can anyone give me a link to a useful article that would help accomplish this goal?
I am giving my code here too.
I will appreciate any kind of suggestion on this.
Code snippet:
public void CreatePackage()
{
string executeSqlTask = typeof(ExecuteSQLTask).AssemblyQualifiedName;
Package pkg = new Package();
pkg.PackageType = DTSPackageType.DTSDesigner90;
ConnectionManager oledbConnectionManager = CreateOLEDBConnection(pkg);
ConnectionManager flatfileConnectionManager =
CreateFileConnection(pkg);
// creating the SQL Task for table creation
Executable sqlTaskExecutable = pkg.Executables.Add(executeSqlTask);
ExecuteSQLTask execSqlTask = (sqlTaskExecutable as Microsoft.SqlServer.Dts.Runtime.TaskHost).InnerObject as ExecuteSQLTask;
execSqlTask.Connection = oledbConnectionManager.Name;
execSqlTask.SqlStatementSource =
"CREATE TABLE [MYDATABASE].[dbo].[MYTABLE] \n ([NAME] NVARCHAR(50),[AGE] NVARCHAR(50),[GENDER] NVARCHAR(50))"; // no GO here: GO is a query-tool batch separator, not T-SQL
// creating the Data flow task
Executable dataFlowExecutable = pkg.Executables.Add("DTS.Pipeline.1");
TaskHost pipeLineTaskHost = (TaskHost)dataFlowExecutable;
MainPipe dataFlowTask = (MainPipe)pipeLineTaskHost.InnerObject;
// Put a precedence constraint between the tasks.
PrecedenceConstraint pcTasks = pkg.PrecedenceConstraints.Add(sqlTaskExecutable, dataFlowExecutable);
pcTasks.Value = DTSExecResult.Success;
pcTasks.EvalOp = DTSPrecedenceEvalOp.Constraint;
// Now adding the data flow components
IDTSComponentMetaData90 sourceDataFlowComponent = dataFlowTask.ComponentMetaDataCollection.New();
sourceDataFlowComponent.Name = "Source Data from Flat file";
// Here is the component class id for flat file source data
sourceDataFlowComponent.ComponentClassID = "{90C7770B-DE7C-435E-880E-E718C92C0573}";
CManagedComponentWrapper managedInstance = sourceDataFlowComponent.Instantiate();
managedInstance.ProvideComponentProperties();
sourceDataFlowComponent.
RuntimeConnectionCollection[0].ConnectionManagerID = flatfileConnectionManager.ID;
sourceDataFlowComponent.
RuntimeConnectionCollection[0].ConnectionManager = DtsConvert.ToConnectionManager90(flatfileConnectionManager);
managedInstance.AcquireConnections(null);
managedInstance.ReinitializeMetaData();
managedInstance.ReleaseConnections();
// Get the destination's default input and virtual input.
IDTSOutput90 output = sourceDataFlowComponent.OutputCollection[0];
// Here I dont find any columns at all..why?
// Now adding the data flow components
IDTSComponentMetaData90 destinationDataFlowComponent = dataFlowTask.ComponentMetaDataCollection.New();
destinationDataFlowComponent.Name =
"Destination OLE DB component";
// Here is the component class id for the OLE DB destination
destinationDataFlowComponent.ComponentClassID = "{E2568105-9550-4F71-A638-B7FE42E66922}";
CManagedComponentWrapper managedOleInstance = destinationDataFlowComponent.Instantiate();
managedOleInstance.ProvideComponentProperties();
destinationDataFlowComponent.
RuntimeConnectionCollection[0].ConnectionManagerID = oledbConnectionManager.ID;
destinationDataFlowComponent.
RuntimeConnectionCollection[0].ConnectionManager = DtsConvert.ToConnectionManager90(oledbConnectionManager);
// Set the custom properties.
managedOleInstance.SetComponentProperty("AccessMode", 2);
managedOleInstance.SetComponentProperty("OpenRowset", "[MYDATABASE].[dbo].[MYTABLE]");
managedOleInstance.AcquireConnections(null);
managedOleInstance.ReinitializeMetaData(); // Throws exception
managedOleInstance.ReleaseConnections();
// Create the path.
IDTSPath90 path = dataFlowTask.PathCollection.New(); path.AttachPathAndPropagateNotifications(sourceDataFlowComponent.OutputCollection[0],
destinationDataFlowComponent.InputCollection[0]);
// Get the destination's default input and virtual input.
IDTSInput90 input = destinationDataFlowComponent.InputCollection[0];
IDTSVirtualInput90 vInput = input.GetVirtualInput();
// Iterate through the virtual input column collection.
foreach (IDTSVirtualInputColumn90 vColumn in vInput.VirtualInputColumnCollection)
{
managedOleInstance.SetUsageType(
input.ID, vInput, vColumn.LineageID, DTSUsageType.UT_READONLY);
}
DTSExecResult res = pkg.Execute();
}
public ConnectionManager CreateOLEDBConnection(Package p)
{
ConnectionManager ConMgr;
ConMgr = p.Connections.Add("OLEDB");
ConMgr.ConnectionString =
"Data Source=VSTS;Initial Catalog=MYDATABASE;Provider=SQLNCLI;Integrated Security=SSPI;Auto Translate=false;";
ConMgr.Name = "SSIS Connection Manager for Oledb";
ConMgr.Description = "OLE DB connection to the Test database.";
return ConMgr;
}
public ConnectionManager CreateFileConnection(Package p)
{
ConnectionManager connMgr;
connMgr = p.Connections.Add("FLATFILE");
connMgr.ConnectionString = @"D:\MyCSVFile.csv";
connMgr.Name = "SSIS Connection Manager for Files";
connMgr.Description = "Flat File connection";
connMgr.Properties["Format"].SetValue(connMgr, "Delimited");
connMgr.Properties["HeaderRowDelimiter"].SetValue(connMgr, Environment.NewLine);
return connMgr;
}
And my CSV file is as follows:
NAME, AGE, GENDER
Jon,52,MALE
Linda, 26, FEMALE
That's all. Thanks.
As mentioned earlier, this CSV-to-OLEDB package generator is not far from working. First, the AccessMode on the OLE DB destination is set to the constant 2, which is the "SQL Command" constant. Change that back to 0 ("Table or View").
FYI, the OLEDB destination's Access Mode enumeration is undocumented, but nevertheless can be found in the following dll: Microsoft SQL Server\DTS\PipelineComponents\OleDbDest.dll using a program like "PE Explorer".
Enum AccessMode;
AM_OPENROWSET = 0
AM_OPENROWSET_VARIABLE = 1
AM_SQLCOMMAND = 2
AM_OPENROWSET_FASTLOAD = 3
AM_OPENROWSET_FASTLOAD_VARIABLE = 4
Next, the target table doesn't exist, yet ReinitializeMetaData() for the OLE DB destination component will by default attempt to retrieve the target table's metadata. The table isn't there, so an exception is thrown. Therefore, as part of the driving program, you may wish to create the table temporarily (this table create is separate from the CREATE in the Execute SQL Task), so that there is metadata to reinitialize.
That should eliminate exceptions, but it doesn't mean the package will validate or execute successfully, just that you could save it to xml.
Then, add source columns for your flat file. SSIS does not materialize them for you (at least as far as I know), so the user needs to perform that step programmatically.
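A rough sketch of that last step (untested; the column properties are guesses you would tune for your file), using the flat file interfaces from the Microsoft.SqlServer.Dts.Runtime.Wrapper namespace on the connection manager created earlier:
// Define the flat file's columns before the source component acquires the
// connection and calls ReinitializeMetaData; a flat file connection manager
// starts out with no columns at all.
IDTSConnectionManagerFlatFile90 flatFile =
    connMgr.InnerObject as IDTSConnectionManagerFlatFile90;
IDTSConnectionManagerFlatFileColumn90 col = null;
foreach (string name in new string[] { "NAME", "AGE", "GENDER" })
{
    col = flatFile.Columns.Add();
    col.ColumnType = "Delimited";
    col.ColumnDelimiter = ","; // field delimiter
    col.DataType = Microsoft.SqlServer.Dts.Runtime.Wrapper.DataType.DT_STR;
    col.ColumnWidth = 50; // a guess; size to your data
    ((IDTSName90)col).Name = name;
}
if (col != null)
    col.ColumnDelimiter = Environment.NewLine; // the last delimiter ends the row
|||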
Hello jaegd,
At last I made it work today. Your tips helped me tremendously; after implementing your suggestions I had to do very little coding to make it work. Thanks a lot.
Anyway, since I am creating the source columns programmatically for the flat file connection, I had to read the columns from the CSV file using .NET IO functionality. Is there any way to accomplish this through the SSIS APIs?
Thank you very much again.
Regards
Moim
|||
Hello,
How did you create the metadata for the destination?
Thanks in advance.
Cheers,
kix
Saturday, February 25, 2012
Programmatic "Script Table as Create to..." in T-SQL?
Or better yet, is there a way to retrieve the DDL that was used to create a table using a T-SQL query?
Thanks in advance,
-Preston M. Price
Perhaps something like this:
SELECT text
FROM sys.syscomments
WHERE id = object_id( 'MyStoredProcedure' )
|||There is no way to query a table's DDL from the database (besides the ROUTINE_DEFINITION you get from INFORMATION_SCHEMA.ROUTINES or the syscomments text Arnie pointed out, which only cover modules such as procedures); you will have to either write your own way to do this or use the SMO API and do the scripting with the Scripter object.
Jens K. Suessmeyer.
http://www.sqlserver2005.de
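For the SMO route, a small self-contained sketch (the server, database, and table names are examples):
// Script out CREATE TABLE for one table via SMO.
// References: Microsoft.SqlServer.Smo, Microsoft.SqlServer.ConnectionInfo
using System;
using System.Collections.Specialized;
using Microsoft.SqlServer.Management.Smo;
class ScriptTableDemo
{
    static void Main()
    {
        Server server = new Server("localhost"); // example server
        Database db = server.Databases["MyDatabase"]; // example database
        Table table = db.Tables["MyTable", "dbo"]; // example table
        Scripter scripter = new Scripter(server);
        scripter.Options.DriAll = true; // include keys and constraints
        StringCollection ddl = scripter.Script(new SqlSmoObject[] { table });
        foreach (string line in ddl)
            Console.WriteLine(line);
    }
}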
Programmatic "Script Table as Create to..." in T-SQL?
Or better yet, is there a way to retrieve the DDL that was used to create a table using a T-SQL query?
Thanks in advance,
-Preston M. Price
Perhaps something like this:
SELECT text
FROM sys.syscomments
WHERE id = object_id( 'MyStoredProcedure' )
|||There is no way to query DDL from the database (beside the Routine_definition your get from INFORMATION_SCHEMA.Routines table or the sysobejcts as Arnie pointed out), you will have to either write your own way to do this or you can use API of SMO to do the stuff with the scripter object.
Jens K. Suessmeyer.
http://www.sqlserver2005.de
Monday, February 20, 2012
Program to save the blocking on a table
I want to save SQL Server blocking information to a table or a file.
How do I write a program (any kind: SQL, Perl, C, Java, etc.) to do it? I don't
want to use SQL Profiler; I hope a program can do it for me.
Thanks,
Lixin
|||Hi Lixin,
Thank you for using MSDN Newsgroup! It's my pleasure to assist you with
your issue.
From your description, I understand that you are handling some blocking or
deadlock issues. To monitor blocking and save the output to a file, you can
refer to the following article, which has you create a stored procedure to
collect the blocking information and use the osql utility to write the
information to a file:
271509 INF: How to Monitor SQL Server 2000 Blocking
http://support.microsoft.com/?id=271509
For more information, these articles may be also helpful:
283725 INF: How to View SQL Server 2000 Blocking Data
http://support.microsoft.com/?id=283725
283696 INF: Job to Monitor SQL Server 2000 Performance and Activity
http://support.microsoft.com/?id=283696
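As a taste of what those articles automate, the heart of a blocking check on SQL Server 2000 is a query along these lines (a simplified sketch; the article's stored procedure captures far more detail):
-- Sessions that are currently blocked, and who is blocking them
SELECT spid, blocked AS blocking_spid, waittime, lastwaittype,
    hostname, program_name, cmd
FROM master..sysprocesses
WHERE blocked <> 0
Wrapped in a stored procedure, it can be run on a schedule with something like osql -E -Q "EXEC my_blocking_check" -o C:\blocking.txt (the procedure name and path here are made up), which is the pattern the first article follows.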
Best regards,
Billy Yao
Microsoft Online Support
----
Get Secure! - www.microsoft.com/security
This posting is provided "as is" with no warranties and confers no rights.
Please reply to newsgroups only. Thanks.
|||The recommended articles are very helpful. Thanks a lot,
Lixin
""Billy Yao [MSFT]"" <v-binyao@.online.microsoft.com> wrote in message
news:%23aCiUKxuDHA.2520@.cpmsftngxa07.phx.gbl...
> Hi Lixin,
> Thank you for using MSDN Newsgroup! It's my pleasure to assist you with
> your issue.
> From your description, I understand that you are handling some blocking or
> deadlock issues. For monitoring the blocking issue and save it to a file,
> you can refer to the following article to use the stored procedure to
> collect the blocking information and use Osql utility to output and save
> the information to a file:
> 271509 INF: How to Monitor SQL Server 2000 Blocking
> http://support.microsoft.com/?id=271509
> For more information, these articles may be also helpful:
> 283725 INF: How to View SQL Server 2000 Blocking Data
> http://support.microsoft.com/?id=283725
> 283696 INF: Job to Monitor SQL Server 2000 Performance and Activity
> http://support.microsoft.com/?id=283696
>
> Best regards,
> Billy Yao
> Microsoft Online Support
> ----
> Get Secure! - www.microsoft.com/security
> This posting is provided "as is" with no warranties and confers no rights.
> Please reply to newsgroups only. Thanks.
>