Channel: World of Whatever

Behind the scenes with Integration Services Catalogs Create Catalog

What happens when you create the SSISDB catalog for the first time? I had no idea, and a question about why in the world it was doing a restore had me wondering too. There's only one way to find out.

To Profiler we go! I ran a TSQL trace against a 2014 instance (one that had previously had an SSISDB catalog, so this might be missing some conditional checks). Line breaks have been added for readability.

First step, it creates a table variable and populates it with values from the SSISDB catalog, if the catalog exists.


exec sp_executesql N'
--Preparing to access the Catalog object
DECLARE @t_catalogs TABLE (
Name sysname COLLATE SQL_Latin1_General_CP1_CI_AS,
EncryptionAlgorithm nvarchar(256) COLLATE SQL_Latin1_General_CP1_CI_AS,
SchemaVersion int,
SchemaBuild nvarchar(256) COLLATE SQL_Latin1_General_CP1_CI_AS,
OperationLogRetentionTime int,
MaxProjectVersions int,
OperationCleanupEnabled bit,
VersionCleanupEnabled bit,
ServerLoggingLevel int,
OperationLogNumberOfRecords int,
VersionLogNumberOfRecords int)

IF DB_ID(''SSISDB'') IS NOT NULL
BEGIN
INSERT INTO @t_catalogs VALUES(
''SSISDB'',
(SELECT [property_value] FROM [SSISDB].[catalog].[catalog_properties] WHERE [property_name] = N''ENCRYPTION_ALGORITHM''),
(SELECT CAST([property_value] AS INT) FROM [SSISDB].[catalog].[catalog_properties] WHERE [property_name] = N''SCHEMA_VERSION''),
(SELECT [property_value] FROM [SSISDB].[catalog].[catalog_properties] WHERE [property_name] = N''SCHEMA_BUILD''),
(SELECT CAST([property_value] AS INT) FROM [SSISDB].[catalog].[catalog_properties] WHERE [property_name] = N''RETENTION_WINDOW''),
(SELECT CAST([property_value] AS INT) FROM [SSISDB].[catalog].[catalog_properties] WHERE [property_name] = N''MAX_PROJECT_VERSIONS''),
(SELECT CAST([property_value] AS BIT) FROM [SSISDB].[catalog].[catalog_properties] WHERE [property_name] = N''OPERATION_CLEANUP_ENABLED''),
(SELECT CAST([property_value] AS BIT) FROM [SSISDB].[catalog].[catalog_properties] WHERE [property_name] = N''VERSION_CLEANUP_ENABLED''),
(SELECT CAST([property_value] AS INT) FROM [SSISDB].[catalog].[catalog_properties] WHERE [property_name] = N''SERVER_LOGGING_LEVEL''),
(SELECT COUNT(operation_id) FROM [SSISDB].[catalog].[operations]),
(SELECT COUNT(object_id) FROM [SSISDB].[catalog].[object_versions])
)
END

SELECT
''IntegrationServices[@Name=''
+ quotename(CAST(SERVERPROPERTY(N''Servername'') AS sysname),'''''''')
+ '']''
+ ''/Catalog[@Name=''
+ ''''''''
+ REPLACE((SELECT Name from @t_catalogs), '''''''', '''''''''''') + '''''''' + '']'' AS [Urn],
(SELECT Name from @t_catalogs) AS [Name],
(SELECT EncryptionAlgorithm from @t_catalogs) AS [EncryptionAlgorithm],
(SELECT SchemaVersion from @t_catalogs) AS [SchemaVersion],
(SELECT SchemaBuild from @t_catalogs) AS [SchemaBuild],
(SELECT OperationLogRetentionTime from @t_catalogs) AS [OperationLogRetentionTime],
(SELECT MaxProjectVersions from @t_catalogs) AS [MaxProjectVersions],
(SELECT OperationCleanupEnabled from @t_catalogs) AS [OperationCleanupEnabled],
(SELECT VersionCleanupEnabled from @t_catalogs) AS [VersionCleanupEnabled],
(SELECT ServerLoggingLevel from @t_catalogs) AS [ServerLoggingLevel],
(SELECT OperationLogNumberOfRecords from @t_catalogs) AS [OperationLogNumberOfRecords],
(SELECT VersionLogNumberOfRecords from @t_catalogs) AS [VersionLogNumberOfRecords]
WHERE
(CAST(SERVERPROPERTY(N''Servername'') AS sysname)=@_msparam_0)',N'@_msparam_0 nvarchar(4000)',@_msparam_0=N'RHUDAUR\DEV2014'
It's an interesting query result:

Urn: IntegrationServices[@Name='RHUDAUR\DEV2014']/Catalog[@Name='SSISDB']
Name: SSISDB
EncryptionAlgorithm: AES_256
SchemaVersion: 3
SchemaBuild: 12.0.2000.8
OperationLogRetentionTime: 365
MaxProjectVersions: 10
OperationCleanupEnabled: 1
VersionCleanupEnabled: 1
ServerLoggingLevel: 1
OperationLogNumberOfRecords: 0
VersionLogNumberOfRecords: 0

Next up, a quick check to make sure I'm in the admin role


SELECT ISNULL(IS_SRVROLEMEMBER ('sysadmin'), 0)

Does the database already exist? The parameterization of this query, plus the table variable used in the first one, makes it look like the code is designed to support more than one "SSISDB" catalog. Why you'd want such a thing is an entirely separate question.


exec sp_executesql N'SELECT name FROM msdb.sys.sysdatabases WHERE name = @dbname',N'@dbname nvarchar(6)',@dbname=N'SSISDB'

Random check for our version. I assume whatever is issuing the commands uses this.


SELECT SUBSTRING(CAST(SERVERPROPERTY('ProductVersion') AS VARCHAR(20)), 1, 2);

Hit the registry to see where we have installed the SSIS (DTS) components.


declare @key_value nvarchar(1024);
exec master.dbo.xp_regread 'HKEY_LOCAL_MACHINE'
,'SOFTWARE\Microsoft\Microsoft SQL Server\120\SSIS\Setup\DTSPath'
, N''
, @key_value output;
select @key_value;

Check to verify the .bak file exists. I presume this path is built off the DTSPath value returned by the preceding query. Returns 1 if the file was found.


DECLARE @CatalogFileExists bit
BEGIN
DECLARE @CatalogFile nvarchar(1024)
SELECT @CatalogFile = N'C:\Program Files\Microsoft SQL Server\120\DTS\Binn\SSISDBBackup.bak'
CREATE TABLE #t (file_exists int, is_directory int, parent_directory_exists int)
INSERT #t EXEC xp_fileexist @CatalogFile
SELECT TOP 1 @CatalogFileExists = file_exists FROM #t
DROP TABLE #t
END
SELECT @CatalogFileExists

Now that we know where a backup is and that we're going to create a database called SSISDB, the code verifies the database doesn't already exist. If it does, it terminates the operation with the error "The database, 'SSISDB', already exists. Rename or remove the existing database, and then run SQL Server Setup again."


IF DB_ID('SSISDB') IS NOT NULL
RAISERROR(27135, 16, 1, 'SSISDB')

Now we recheck the version except this time we force the failure of the script. I like seeing a reference to "Denali" in the error message.


IF CAST(SUBSTRING(CAST(SERVERPROPERTY('ProductVersion') AS VARCHAR(20)), 1, 2) AS INT) < 11
RAISERROR (27193,16,1, 'Denali or later') WITH NOWAIT
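The gate is simple: take the leading characters of ProductVersion and compare the major version against 11, the internal version number for SQL Server 2012 (code-named Denali). A minimal Python sketch of the same check (the function name and error text are mine; splitting on the dot is a touch safer than a fixed two-character substring):

```python
def require_denali_or_later(product_version: str) -> int:
    """Parse the major version from a ProductVersion string (as returned by
    SERVERPROPERTY('ProductVersion')) and fail for anything below 11."""
    major = int(product_version.split(".")[0])
    if major < 11:
        raise RuntimeError("Denali or later required")
    return major

# SQL Server 2014 reports ProductVersion 12.0.x, so this passes
print(require_denali_or_later("12.0.2000.8"))
```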

As part of the installation of the SSISDB, we must have CLR enabled. I already had mine enabled so I expect there's a step after this that I didn't capture.


SELECT [value_in_use] FROM sys.configurations WHERE [name] = 'clr enabled'

This step builds out the path where the data files should be located


DECLARE @path nvarchar(1024) = Convert(nvarchar(1024),ServerProperty('MasterFile'));
SELECT @path = SUBSTRING(@path, 1, CHARINDEX(N'master.mdf', LOWER(@path)) - 1);
SELECT @path;
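In other words, it takes the full path to master.mdf from SERVERPROPERTY('MasterFile') and chops off the file name, leaving the data directory with its trailing separator. The same CHARINDEX/SUBSTRING logic as a quick Python sketch (the function name is mine):

```python
def data_directory(master_file_path: str) -> str:
    """Mimic the CHARINDEX/SUBSTRING trick above: find 'master.mdf'
    case-insensitively and keep everything before it."""
    idx = master_file_path.lower().index("master.mdf")
    return master_file_path[:idx]

path = "C:\\Program Files\\Microsoft SQL Server\\MSSQL12.DEV2014\\MSSQL\\DATA\\master.mdf"
print(data_directory(path))
```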

We check to see if the SSISDB .mdf file exists


DECLARE @CatalogFileExists bit
BEGIN
DECLARE @CatalogFile nvarchar(1024)
SELECT @CatalogFile = N'C:\Program Files\Microsoft SQL Server\MSSQL12.DEV2014\MSSQL\DATA\SSISDB.mdf'
CREATE TABLE #t (file_exists int, is_directory int, parent_directory_exists int)
INSERT #t EXEC xp_fileexist @CatalogFile
SELECT TOP 1 @CatalogFileExists = file_exists FROM #t
DROP TABLE #t
END
SELECT @CatalogFileExists

And we check to see if the SSISDB .ldf file exists


DECLARE @CatalogFileExists bit
BEGIN
DECLARE @CatalogFile nvarchar(1024)
SELECT @CatalogFile = N'C:\Program Files\Microsoft SQL Server\MSSQL12.DEV2014\MSSQL\DATA\SSISDB.ldf'
CREATE TABLE #t (file_exists int, is_directory int, parent_directory_exists int)
INSERT #t EXEC xp_fileexist @CatalogFile
SELECT TOP 1 @CatalogFileExists = file_exists FROM #t
DROP TABLE #t
END
SELECT @CatalogFileExists

This step lists the files contained within our .bak. I assume this is used to generate the next command.


exec sp_executesql N'RESTORE FILELISTONLY FROM DISK = @backupfile'
,N'@backupfile nvarchar(67)'
,@backupfile=N'C:\Program Files\Microsoft SQL Server\120\DTS\Binn\SSISDBBackup.bak'

Now we're cooking with gas. Here we actually perform the restore of the SSISDB backup.


exec sp_executesql N'RESTORE DATABASE @databaseName
FROM DISK = @backupFile WITH REPLACE
,MOVE @dataName TO @dataFilePath
,MOVE @logName TO @logFilePath'

,N'@databaseName nvarchar(6),@dataName nvarchar(4),@dataFilePath nvarchar(75),@logName nvarchar(3),@logFilePath nvarchar(75),@backupFile nvarchar(67)'
,@databaseName=N'SSISDB'
,@dataName=N'data'
,@dataFilePath=N'C:\Program Files\Microsoft SQL Server\MSSQL12.DEV2014\MSSQL\DATA\SSISDB.mdf'
,@logName=N'log'
,@logFilePath=N'C:\Program Files\Microsoft SQL Server\MSSQL12.DEV2014\MSSQL\DATA\SSISDB.ldf'
,@backupFile=N'C:\Program Files\Microsoft SQL Server\120\DTS\Binn\SSISDBBackup.bak'

If for some reason the restore left the SSISDB in read-only mode, force it into read-write.


USE master;
IF EXISTS (SELECT [name] FROM sys.databases WHERE [name] = 'SSISDB' AND [is_read_only] = 1)
ALTER DATABASE [SSISDB]
SET READ_WRITE WITH
ROLLBACK IMMEDIATE

At this point, we have an SSISDB but it's not secure. We have the ability to store sensitive data in there so we need to protect our jewels.


USE [SSISDB];

IF EXISTS (SELECT [name] FROM sys.symmetric_keys WHERE [name] = '##MS_DatabaseMasterKey##')
DROP MASTER KEY

Secure our database with a master key


exec sp_executesql N'USE [SSISDB];
DECLARE @pwd nvarchar(4000) = REPLACE(@password, N'''''''', N'''''''''''');
EXEC(''CREATE MASTER KEY ENCRYPTION BY PASSWORD = '''''' + @pwd + '''''''');'
,N'@password nvarchar(20)',@password=N'pass@word1'
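Note the REPLACE on @password: any embedded single quote is doubled so the value can't terminate the dynamic SQL string early. That quote-doubling trick, sketched in Python (the function name is mine):

```python
def escape_sql_literal(value: str) -> str:
    """Double embedded single quotes so a value spliced into a dynamic SQL
    string literal cannot terminate the literal early."""
    return value.replace("'", "''")

print(escape_sql_literal("pass@word1"))   # unchanged: no quotes to double
print(escape_sql_literal("O'Brien"))      # O''Brien
```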

Create an asymmetric key from our assembly


IF NOT EXISTS(SELECT * FROM sys.asymmetric_keys WHERE name = 'MS_SQLEnableSystemAssemblyLoadingKey')
CREATE ASYMMETRIC KEY
MS_SQLEnableSystemAssemblyLoadingKey
FROM EXECUTABLE FILE = 'C:\Program Files\Microsoft SQL Server\120\DTS\Binn\Microsoft.SqlServer.IntegrationServices.Server.dll'

I have no idea what this virtual account is for and why we'll drop it if it exists but so be it. I assume we're doing this to ensure it has the correct permissions in the next step.


IF EXISTS(SELECT [name] FROM sys.server_principals where name = '##MS_SQLEnableSystemAssemblyLoadingUser##')
DROP LOGIN ##MS_SQLEnableSystemAssemblyLoadingUser##

Create a login that can create unsafe assemblies. An unsafe assembly? Cats and dogs are signing leases at this very moment.


CREATE LOGIN ##MS_SQLEnableSystemAssemblyLoadingUser## FROM ASYMMETRIC KEY MS_SQLEnableSystemAssemblyLoadingKey
GRANT UNSAFE ASSEMBLY TO ##MS_SQLEnableSystemAssemblyLoadingUser##

I allowed the SSIS CLR procedure to run on startup, so the installer is obliging me. Here it drops the procedure in case it already existed.


IF EXISTS(SELECT name FROM sys.procedures WHERE name = N'sp_ssis_startup')
BEGIN
EXEC sp_procoption N'sp_ssis_startup','startup','off'
DROP PROCEDURE [sp_ssis_startup]
END

Creation of the stored procedure dbo.sp_ssis_startup in master.


CREATE PROCEDURE [dbo].[sp_ssis_startup]
AS
SET NOCOUNT ON
/* Currently, the IS Store name is 'SSISDB' */
IF DB_ID('SSISDB') IS NULL
RETURN

IF NOT EXISTS(SELECT name FROM [SSISDB].sys.procedures WHERE name = N'startup')
RETURN

/* Invoke the procedure in SSISDB */
EXEC [SSISDB].[catalog].[startup]

I have no idea why they wrapped this in an IF (1=1), but they did.


IF (1=1)
BEGIN
/* Run sp_ssis_startup when SQL Server restarts */
EXEC sp_procoption N'sp_ssis_startup','startup','on'
END

At this point, we are going to get busy with setting up maintenance for the SSISDB. Drop any jobs named "SSIS Server Maintenance Job" and hope that you didn't have anything vitally important with the same name.


IF EXISTS (SELECT name FROM sysjobs WHERE name = N'SSIS Server Maintenance Job')
EXEC sp_delete_job
@job_name = N'SSIS Server Maintenance Job' ;

Drop an existing virtual login that will be associated to our job.


IF EXISTS(SELECT * FROM sys.server_principals where name = '##MS_SSISServerCleanupJobLogin##')
DROP LOGIN ##MS_SSISServerCleanupJobLogin##

Create our login


DECLARE @loginPassword nvarchar(256)
SELECT @loginPassword = REPLACE (CONVERT( nvarchar(256), CRYPT_GEN_RANDOM( 64 )), N'''', N'''''')
EXEC ('CREATE LOGIN ##MS_SSISServerCleanupJobLogin## WITH PASSWORD =''' +@loginPassword + ''', CHECK_POLICY = OFF')

Disable the login we just created...


ALTER LOGIN ##MS_SSISServerCleanupJobLogin## DISABLE

Create our job, owned by the disabled login above


EXEC dbo.sp_add_job
@job_name = N'SSIS Server Maintenance Job',
@enabled = 1,
@owner_login_name = '##MS_SSISServerCleanupJobLogin##',
@description = N'Runs every day. The job removes operation records from the database that are outside the retention window and maintains a maximum number of versions per project.'

Cleanup, aisle 1. The job step runs a stored procedure that performs the cascading deletes.



DECLARE @IS_server_name NVARCHAR(30)
SELECT @IS_server_name = CONVERT(NVARCHAR, SERVERPROPERTY('ServerName'))

EXEC sp_add_jobserver @job_name = N'SSIS Server Maintenance Job',
@server_name = @IS_server_name

EXEC sp_add_jobstep
@job_name = N'SSIS Server Maintenance Job',
@step_name = N'SSIS Server Operation Records Maintenance',
@subsystem = N'TSQL',
@command = N'EXEC [internal].[cleanup_server_retention_window]',
@database_name = N'SSISDB',
@on_success_action = 3,
@retry_attempts = 3,
@retry_interval = 3;

Clean up the old versions of the .ispac files


EXEC sp_add_jobstep
@job_name = N'SSIS Server Maintenance Job',
@step_name = N'SSIS Server Max Version Per Project Maintenance',
@subsystem = N'TSQL',
@command = N'EXEC [internal].[cleanup_server_project_version]',
@database_name = N'SSISDB',
@retry_attempts = 3,
@retry_interval = 3;

Create a schedule to run daily at midnight. Again, this might not be optimal for your ETL processing window (see link above).


EXEC sp_add_jobschedule
@job_name = N'SSIS Server Maintenance Job',
@name = 'SSISDB Scheduler',
@enabled = 1,
@freq_type = 4, /*daily*/
@freq_interval = 1,/*everyday*/
@freq_subday_type = 0x1,
@active_start_date = 20001231,
@active_end_date = 99991231,
@active_start_time = 0,
@active_end_time = 120000

Finally, we need to ensure our user can run the two stored procedures in the job (makes sense). Drop that user, like it's hot...


USE SSISDB
IF EXISTS (SELECT name FROM sys.database_principals WHERE name = '##MS_SSISServerCleanupJobUser##')
DROP USER ##MS_SSISServerCleanupJobUser##

Add the user back in, based on our disabled login.


CREATE USER ##MS_SSISServerCleanupJobUser## FOR LOGIN ##MS_SSISServerCleanupJobLogin##

Give them rights to run internal.cleanup_server_retention_window


GRANT EXECUTE ON [internal].[cleanup_server_retention_window] TO ##MS_SSISServerCleanupJobUser##

Give them rights to run internal.cleanup_server_project_version


GRANT EXECUTE ON [internal].[cleanup_server_project_version] TO ##MS_SSISServerCleanupJobUser##


Remove all MS_Diagram extended properties

When you create a view in SSMS using the wizard, it retains information for the layout designer. There's no need for this, especially as it makes my database comparisons messy.

I had already run through the following articles the first time to nuke all those metadata items. Then they restored over my dev environment and I had not saved my scripts.

  • http://sqlblog.com/blogs/jamie_thomson/archive/2012/03/25/generate-drop-statements-for-all-extended-properties.aspx
  • http://www.sqlservercentral.com/articles/Metadata/72609/
  • http://blog.hongens.nl/2010/02/25/drop-all-extended-properties-in-a-mssql-database/
  • http://msdn.microsoft.com/en-us/library/ms178595.aspx

My approach is a wee different. I'm going to use a cursor to enumerate through my results and then use sp_executesql instead of doing the string building the other fine authors were using.

This script will remove all of the MS-named objects attached to views. You can easily adapt it to strip the extended properties from any object by adjusting the join to other system objects and/or using level 2 specifications.


DECLARE
@Query nvarchar(4000) = 'EXECUTE sys.sp_dropextendedproperty @name, @level0type, @level0name, @level1type, @level1name, @level2type, @level2name;'
, @ParamList nvarchar(4000) = N'@name sysname, @level0type varchar(128), @level0name sysname, @level1type varchar(128), @level1name sysname, @level2type varchar(128), @level2name sysname'
, @SchemaName sysname
, @ObjectName sysname
, @PropertyName sysname
, @ObjectType varchar(128);

DECLARE CSR CURSOR
READ_ONLY
FOR
SELECT
S.name AS SchemaName
, V.name AS ObjectName
, EP.name AS PropertyName
, O.type_desc AS ObjectType
FROM
sys.extended_properties AS EP
INNER JOIN
sys.views V
ON V.object_id = EP.major_id
INNER JOIN
sys.schemas S
ON S.schema_id = V.schema_id
INNER JOIN
sys.objects AS O
ON O.object_id = V.object_id
WHERE
EP.minor_id = 0
-- The underscore is a single character wild card, need to escape it
AND EP.name LIKE 'MS[_]Dia%';


OPEN CSR;

FETCH NEXT
FROM CSR INTO
@SchemaName
, @ObjectName
, @PropertyName
, @ObjectType;
WHILE (@@fetch_status = 0)
BEGIN

EXECUTE sys.sp_executesql
@Query
, @ParamList
, @name=@PropertyName
, @level0Type = 'SCHEMA'
, @level0Name = @SchemaName
, @level1Type = @ObjectType
, @level1Name = @ObjectName
, @level2type = NULL
, @level2Name = NULL;


FETCH NEXT
FROM CSR INTO
@SchemaName
, @ObjectName
, @PropertyName
, @ObjectType;
END

CLOSE CSR;
DEALLOCATE CSR;
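The pattern above — enumerate the targets with a query, then execute one parameterized statement per row instead of concatenating strings — translates to any client language. A rough Python sketch of the same idea, using the standard library's sqlite3 as a stand-in for SQL Server (the table and property names here are made up for the demo):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE props (name TEXT, value TEXT)")
conn.execute("INSERT INTO props VALUES ('MS_DiagramPane1', 'x'), ('Keep_Me', 'y')")

# Enumerate the rows to act on, then run one parameterized statement per row
# instead of string-building a one-off DELETE for each property name.
delete_stmt = "DELETE FROM props WHERE name = ?"
targets = conn.execute("SELECT name FROM props WHERE name LIKE 'MS_Dia%'").fetchall()
for (name,) in targets:
    conn.execute(delete_stmt, (name,))

print([r[0] for r in conn.execute("SELECT name FROM props")])  # ['Keep_Me']
```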

A quick and dirty date dimension for PowerPivot

I've built out this sort of thing a few times but in fine fashion, I've never saved my script. In a proper data warehouse, there would be a date dimension built out and I would just reference it. Whenever I get skunkworks projects for things like PowerPivot demos, since that data's not been cared for, I need something handy.

This script generates approximately 20 years of data. It uses an intelligent surrogate key and begins counting at 2014-01-01. It generates date part names and their numeric values for sorting purposes.


SELECT
-- 20 years, approximate
TOP (20 * 365)
CAST(CONVERT(char(8), D.FullDate, 112) AS int) AS DateKey
, D.FullDate
, YEAR(D.FullDate) AS YearValue

, 'Q' + DATENAME(QUARTER, D.FullDate) AS QuarterName
, DATEPART(QUARTER, D.FullDate) AS QuarterValue

, DATENAME(mm, D.FullDate) AS MonthName
, MONTH(D.FullDate) AS MonthValue

, DAY(D.FullDate) AS DayValue

, DATENAME(dw, D.FullDate) AS DayOfWeekName
, DATEPART(dw, D.FullDate) AS DayOfWeekValue

FROM
(
SELECT
DATEADD(d, D.number, BOT.StartDate) AS FullDate
FROM
(
SELECT
ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) - 1 AS number
FROM
sys.all_columns AS AC
) D
CROSS APPLY
(
-- Start date
SELECT CAST('2014-01-01' AS date) AS StartDate
) BOT
) D;
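The "intelligent" surrogate key is just the date rendered as yyyymmdd and cast to an integer (style 112 is ISO yyyymmdd). A quick Python sketch of the key and one of the date-part columns (the function name is mine):

```python
from datetime import date, timedelta

def date_key(d: date) -> int:
    """yyyymmdd integer key, equivalent to CAST(CONVERT(char(8), FullDate, 112) AS int)."""
    return d.year * 10000 + d.month * 100 + d.day

start = date(2014, 1, 1)
rows = [(date_key(start + timedelta(days=n)),
         (start + timedelta(days=n)).strftime("%B"))  # month name, like DATENAME(mm, ...)
        for n in range(3)]
print(rows)
```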

Biml and Looping through excel files in SSIS


Biml and Looping through excel files in SSIS

I came across a question on DBA.StackExchange.com where the asker wanted to know why they were getting an error about the file being locked. My fine compatriots had already suggested the questioner use ProcMon, but the person was rather adamant that the file wasn't open. Psssst, it's open. I cover the steps for showing a file is in use by another process.

Where's the biml and Excel?

Easy with the pitchforks... The questioner was implementing Mike Davis's Loop Through Excel Files in SSIS and I figured since I didn't have a Biml post covering Excel yet, I'd pop one out.

Biml and Excel

The only oddity to make note of is the connection string. Normally, your connection string would look something like Provider=Microsoft.Jet.OLEDB.4.0;Data Source=c:\ssisdata\Excel\USCustomers1.xls;Extended Properties="Excel 8.0;HDR=YES"; However, you're going to need to escape those double quotes, and your normal tricks of "" or \" aren't going to work here since it's XML we're dealing with. Instead, you'll need to use &quot;
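To see what the escaped value has to look like, here's a small Python sketch that applies the same entity substitution, using the standard library's xml.sax.saxutils.escape with a custom entity map:

```python
from xml.sax.saxutils import escape

conn_str = ('Provider=Microsoft.Jet.OLEDB.4.0;'
            'Data Source=c:\\ssisdata\\Excel\\USCustomers1.xls;'
            'Extended Properties="Excel 8.0;HDR=YES";')

# escape() handles & < > by default; the entity map adds the double quote
# so the value can live inside a double-quoted XML attribute.
escaped = escape(conn_str, {'"': '&quot;'})
print(escaped)
```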


<Biml xmlns="http://schemas.varigence.com/biml.xsd">
<Connections>
<ExcelConnection ConnectionString="Provider=Microsoft.Jet.OLEDB.4.0;Data Source=c:\ssisdata\Excel\USCustomers1.xls;Extended Properties=&quot;Excel 8.0;HDR=YES&quot;;" Name="CM_Excel"></ExcelConnection>
</Connections>
<Packages>
<Package Name="LoopThroughExcel" ConstraintMode="Linear">
<Variables>
<Variable DataType="String" Name="strExcelFile"></Variable>
</Variables>
<Connections>
<Connection ConnectionName="CM_Excel">
<Expressions>
<Expression ExternalProperty="ExcelFilePath">@[User::strExcelFile]</Expression>
</Expressions>
</Connection>
</Connections>
<Tasks>
<ForEachFileLoop
Folder="C:\ssisdata\Excel"
FileSpecification="*.xls"
ConstraintMode="Linear"
Name="FELC Iterate Excel">
<VariableMappings>
<VariableMapping Name="0" VariableName="User.strExcelFile" />
</VariableMappings>
<Tasks>
<Dataflow
Name="DFT Do Excel"
DelayValidation="true">
<Transformations>
<ExcelSource
ConnectionName="CM_Excel"
Name="XL_SRC">
<ExternalTableInput Table="Sheet1$"></ExternalTableInput>
</ExcelSource>
<!--
Do nothing, but do it splendidly
-->
<DerivedColumns Name="bit bucket"></DerivedColumns>
</Transformations>
</Dataflow>
</Tasks>
</ForEachFileLoop>
</Tasks>
</Package>
</Packages>
</Biml>

Success

That should generate a package that has a control flow like

And look at that data flow!

Error

Now, back to the original question. If I open that file in Excel and then run the package, YOU WON'T BELIEVE WHAT ERROR MESSAGE I RECEIVE.</Linkbait>

Error: 0xC0202009 at LoopThroughExcel, Connection manager "CM_Excel": SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available. Source: "Microsoft JET Database Engine" Hresult: 0x80004005 Description: "The Microsoft Jet database engine cannot open the file ''. It is already opened exclusively by another user, or you need permission to view its data.".

Biml - Reorganize Index Task


Biml - Reorganize Index Task

The Reorganize Index Task via Biml. As always, ADO.NET connection required.


<Biml xmlns="http://schemas.varigence.com/biml.xsd">
<Connections>
<AdoNetConnection Name="CM_ADO_DB" ConnectionString="Data Source=localhost\dev2012;Integrated Security=SSPI;Connect Timeout=30;Database=msdb;" Provider="SQL" />
</Connections>
</Connections>
<Packages>
<Package
ConstraintMode="Linear"
Name="Task_ReorganizeIndex">
<Tasks>
<ReorganizeIndex
ConnectionName="CM_ADO_DB"
DatabaseSelectionMode="All"
ObjectSelectionMode="Tables"
Name="RO All Tables">
</ReorganizeIndex>

<!-- Reorg a specific index -->
<ReorganizeIndex
ConnectionName="CM_ADO_DB"
DatabaseSelectionMode="Specific"
ObjectSelectionMode="Views"
Name="RO Specific View"
CompactLargeObjects="false">
<Databases>
<Database>AdventureWorks2012</Database>
</Databases>
<Objects>
<Object>Production.vProductAndDescription</Object>
</Objects>
</ReorganizeIndex>
</Tasks>
</Package>
</Packages>
</Biml>

Result

The above Biml describes a package that creates two Reorganize Index Tasks. The first reorganizes indexes on all the tables while the second targets a specific indexed view.

Control flow - Reorganize Index Task

RO ALL Tables

Reorganize all the things!

Reorganize all tables

RO Specific View

Here we reorganize a specific view, but notice that this dialog makes it appear nothing is selected. Once you open the combobox or click View TSQL, you'll get a different story.

Reorganize specific view

View TSQL

Notice the generated TSQL shows that we're reorganizing both Production.vProductAndDescription and Person.vStateProvinceCountryRegion. Weird.

Generated TSQL

Biml - Replicate-O-Matic


A while back I posted about Copy all the tables. Since then, I've done a lot more Biml, and one of the problems I had was duplicated logic across lots of .biml files. I say I had the problem because I didn't know about CallBimlScript. Well, I knew about it in the sense that I could include a file, but I didn't realize the power there. CallBimlScript allows you to make functions for your Biml, and you can define parameters for them. That changes everything in my mind. Let's look at the copy-all-tables example, but this time, let's modularize it.

We'll create two biml files. One will be inc_Package.biml and the other will be Driver.biml. The definition of Driver.Biml remains the same, we're just going to cut out the actual Package block(s).


<#@ template language="C#" hostspecific="true" #>
<#@ import namespace="System.Data" #>
<#@ import namespace="System.Data.SqlClient" #>
<#@ import namespace="System.IO" #>
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
<!--
<#
string connectionStringSource = @"Server=localhost\dev2012;Initial Catalog=AdventureWorksDW2012;Integrated Security=SSPI;Provider=SQLNCLI11.1";
string connectionStringDestination = @"Server=localhost\dev2012;Initial Catalog=AdventureWorksDW2012_DEST;Integrated Security=SSPI;Provider=SQLNCLI11.1";

string SrcTableQuery = @"
SELECT
SCHEMA_NAME(t.schema_id) AS schemaName
, T.name AS tableName
FROM
sys.tables AS T
WHERE
T.is_ms_shipped = 0
AND T.name <> 'sysdiagrams';
";

DataTable dt = null;
dt = ExternalDataAccess.GetDataTable(connectionStringSource, SrcTableQuery);
#>
-->
<Connections>
<OleDbConnection
Name="SRC"
CreateInProject="false"
ConnectionString="<#=connectionStringSource#>"
RetainSameConnection="false">
</OleDbConnection>
<OleDbConnection
Name="DST"
CreateInProject="false"
ConnectionString="<#=connectionStringDestination#>"
RetainSameConnection="false">
</OleDbConnection>
</Connections>

<Packages>
<# foreach (DataRow dr in dt.Rows) { #>
<#=CallBimlScript("inc_Package.biml", dr[0].ToString(), dr[1].ToString())#>
<# } #>
</Packages>
</Biml>
So that looks a little easier to understand. We get our dataset from our external source, create the connections, and for each table in our dataset, we call the inc_Package.biml file, passing in columns 0 and 1, which correspond to our schema name and table name.

Our inc_Package.biml file isn't that different either. What's crucial is the first two lines, where we define our property values of schema and table. Our file now expects two parameters, and we reference them in the code as <#=schema#>, where previously we would have used <#=dr[0].ToString()#>.


<#@ property name="schema" type="String" #>
<#@ property name="table" type="String" #>

<Package ConstraintMode="Linear"
Name="<#=schema#>_<#=table#>"
>
<Variables>
<Variable Name="SchemaName" DataType="String"><#=schema#></Variable>
<Variable Name="TableName" DataType="String"><#=table#></Variable>
<Variable Name="QualifiedTableSchema"
DataType="String"
EvaluateAsExpression="true">"[" + @[User::SchemaName] + "].[" + @[User::TableName] + "]"</Variable>
</Variables>
<Tasks>
<Dataflow
Name="DFT"
>
<Transformations>
<OleDbSource
Name="OLE_SRC<#=schema#>_<#=table#>"
ConnectionName="SRC"
>
<TableFromVariableInput VariableName="User.QualifiedTableSchema" />
</OleDbSource>
<OleDbDestination
Name="OLE_DST<#=schema#>_<#=table#>"
ConnectionName="DST"
KeepIdentity="true"
TableLock="true"
UseFastLoadIfAvailable="true"
KeepNulls="true"
>
<TableFromVariableOutput VariableName="User.QualifiedTableSchema" />
</OleDbDestination>
</Transformations>
</Dataflow>

</Tasks>
</Package>

I right click on Driver.biml, select Generate SSIS Packages, and boom! Many packages are created to replicate all the data between my source and destination. I can't believe this is available in BIDS Helper, the free version of the Biml engine. I have so much code to go simplify.
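Conceptually, CallBimlScript is a parameterized template: the callee declares the parameters it expects and the caller invokes it once per row of driver metadata. A loose Python analogue (the names and the emitted fragment are simplified for illustration):

```python
# The callee: declares the parameters it expects, like the <#@ property #>
# directives at the top of inc_Package.biml.
def package_template(schema: str, table: str) -> str:
    return f'<Package ConstraintMode="Linear" Name="{schema}_{table}" />'

# The caller: one invocation per row of driver metadata, like the
# foreach (DataRow dr in dt.Rows) loop around CallBimlScript.
driver_rows = [("dbo", "DimDate"), ("dbo", "FactSales")]
packages = "\n".join(package_template(s, t) for s, t in driver_rows)
print(packages)
```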


Slimming down the SSIS Script Task


SSIS Script Task History

Gather 'round children, I want to tell you a tale of woe. The 2005 release of SQL Server Integration Services allowed you to use any .NET language you wanted in a Script Task or Script Component, as long as you liked Visual Basic .NET. The 2008 release of SSIS allowed us to use either "Microsoft Visual C# 2008" or "Microsoft Visual Basic 2008". Many .NET devs rejoiced over this and that's what today's post is about.

This is what the standard Task would generate as template code.


/*
Microsoft SQL Server Integration Services Script Task
Write scripts using Microsoft Visual C# 2008.
The ScriptMain is the entry point class of the script.
*/

using System;
using System.Data;
using Microsoft.SqlServer.Dts.Runtime;
using System.Windows.Forms;

namespace ST_d80f6050516944fa8639234f7b2e50b9.csproj
{
[System.AddIn.AddIn("ScriptMain", Version = "1.0", Publisher = "", Description = "")]
public partial class ScriptMain : Microsoft.SqlServer.Dts.Tasks.ScriptTask.VSTARTScriptObjectModelBase
{

#region VSTA generated code
enum ScriptResults
{
Success = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Success,
Failure = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Failure
};
#endregion

/*
The execution engine calls this method when the task executes.
To access the object model, use the Dts property. Connections, variables, events,
and logging features are available as members of the Dts property as shown in the following examples.

To reference a variable, call Dts.Variables["MyCaseSensitiveVariableName"].Value;
To post a log entry, call Dts.Log("This is my log text", 999, null);
To fire an event, call Dts.Events.FireInformation(99, "test", "hit the help message", "", 0, true);

To use the connections collection use something like the following:
ConnectionManager cm = Dts.Connections.Add("OLEDB");
cm.ConnectionString = "Data Source=localhost;Initial Catalog=AdventureWorks;Provider=SQLNCLI10;Integrated Security=SSPI;Auto Translate=False;";

Before returning from this method, set the value of Dts.TaskResult to indicate success or failure.

To open Help, press F1.
*/

public void Main()
{
// TODO: Add your code here
Dts.TaskResult = (int)ScriptResults.Success;
}
}
}
That's 50 lines in total; my trim job brings it down to 23.

I hate regions. Hate them with the fury of a thousand suns. Ctrl-M, Ctrl-P stops all outlining but I have to click that every time I open the script, or I have to remove the stupid #region lines. That's a minor annoyance but one I lived through.

The 2012/2014 release of SSIS was designed to make it easier for people to get started. We had these getting-started videos that were suggested every time you created a new Integration Services project, which is really charming when your job is an ETL developer. As part of the rookie-developer changes, the default Script Task now provides a lot more hand-holding with regard to developing your first Task. The following is that template.


#region Help: Introduction to the script task
/* The Script Task allows you to perform virtually any operation that can be accomplished in
* a .Net application within the context of an Integration Services control flow.
*
* Expand the other regions which have "Help" prefixes for examples of specific ways to use
* Integration Services features within this script task. */
#endregion


#region Namespaces
using System;
using System.Data;
using Microsoft.SqlServer.Dts.Runtime;
using System.Windows.Forms;
#endregion

namespace ST_12345
{
/// <summary>
/// ScriptMain is the entry point class of the script. Do not change the name, attributes,
/// or parent of this class.
/// </summary>
[Microsoft.SqlServer.Dts.Tasks.ScriptTask.SSISScriptTaskEntryPointAttribute]
public partial class ScriptMain : Microsoft.SqlServer.Dts.Tasks.ScriptTask.VSTARTScriptObjectModelBase
{
#region Help: Using Integration Services variables and parameters in a script
/* To use a variable in this script, first ensure that the variable has been added to
* either the list contained in the ReadOnlyVariables property or the list contained in
* the ReadWriteVariables property of this script task, according to whether or not your
* code needs to write to the variable. To add the variable, save this script, close this instance of
* Visual Studio, and update the ReadOnlyVariables and
* ReadWriteVariables properties in the Script Transformation Editor window.
* To use a parameter in this script, follow the same steps. Parameters are always read-only.
*
* Example of reading from a variable:
* DateTime startTime = (DateTime) Dts.Variables["System::StartTime"].Value;
*
* Example of writing to a variable:
* Dts.Variables["User::myStringVariable"].Value = "new value";
*
* Example of reading from a package parameter:
* int batchId = (int) Dts.Variables["$Package::batchId"].Value;
*
* Example of reading from a project parameter:
* int batchId = (int) Dts.Variables["$Project::batchId"].Value;
*
* Example of reading from a sensitive project parameter:
* int batchId = (int) Dts.Variables["$Project::batchId"].GetSensitiveValue();
* */

#endregion

#region Help: Firing Integration Services events from a script
/* This script task can fire events for logging purposes.
*
* Example of firing an error event:
* Dts.Events.FireError(18, "Process Values", "Bad value", "", 0);
*
* Example of firing an information event:
* Dts.Events.FireInformation(3, "Process Values", "Processing has started", "", 0, ref fireAgain)
*
* Example of firing a warning event:
* Dts.Events.FireWarning(14, "Process Values", "No values received for input", "", 0);
* */
#endregion

#region Help: Using Integration Services connection managers in a script
/* Some types of connection managers can be used in this script task. See the topic
* "Working with Connection Managers Programmatically" for details.
*
* Example of using an ADO.Net connection manager:
* object rawConnection = Dts.Connections["Sales DB"].AcquireConnection(Dts.Transaction);
* SqlConnection myADONETConnection = (SqlConnection)rawConnection;
* //Use the connection in some code here, then release the connection
* Dts.Connections["Sales DB"].ReleaseConnection(rawConnection);
*
* Example of using a File connection manager
* object rawConnection = Dts.Connections["Prices.zip"].AcquireConnection(Dts.Transaction);
* string filePath = (string)rawConnection;
* //Use the connection in some code here, then release the connection
* Dts.Connections["Prices.zip"].ReleaseConnection(rawConnection);
* */
#endregion


/// <summary>
/// This method is called when this script task executes in the control flow.
/// Before returning from this method, set the value of Dts.TaskResult to indicate success or failure.
/// To open Help, press F1.
/// </summary>
public void Main()
{
// TODO: Add your code here

Dts.TaskResult = (int)ScriptResults.Success;
}

#region ScriptResults declaration
/// <summary>
/// This enum provides a convenient shorthand within the scope of this class for setting the
/// result of the script.
///
/// This code was generated automatically.
/// </summary>
enum ScriptResults
{
Success = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Success,
Failure = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Failure
};
#endregion

}
}

Mercy, I get all twitchy in the eye just looking at it. That is 113 lines of code and for those that really pay attention to such things, there's a delightful mix of tabs and space characters. Again, after I go through my defluffing process, I'm back to 23 lines. But, the stripping process is much slower. I have more regions to deal with and inaccurate comments to clean and it's just a lot more work than my OCD brain should have to expend.

There must be a way to take the training wheels off a Script Task. Today I found that switch. Meet the ProjectTemplatePath property. On my 2012 installation, it points to C:\Program Files (x86)\Microsoft SQL Server\110\DTS\Binn\VSTA11_IS_ST_CS_Template.vstax

VSTA11_IS_ST_CS_Template.vstax

In fine Microsoft tradition, the vstax file is a ... what? Anyone? This fine document states "An Open Packaging Container (OPC) file that contains one or more project templates," which to me says "a zip file." So, I copied that out of my installation location and unzipped it to find 10 files:
  • AssemblyInfo.cs
  • IS%20Script%20Task%20Project.csproj
  • Resources.Designer.cs
  • Resources.resx
  • ScriptMain.cs
  • Settings.Designer.cs
  • Settings.settings
  • VSTA11_IS_ST_CS_Template.vstatemplate
  • vstax.manifest
  • [Content_Types].xml
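To convince yourself the container really is a plain zip, a few lines of Python will do. This is a sketch: it builds a stand-in archive in memory (the entry names merely mimic the template's contents) so it runs anywhere; point ZipFile at the real vstax path to list the genuine article.

```python
import io
import zipfile

# A .vstax file is an OPC container, which is to say: a zip. Build a
# stand-in archive in memory, then list its contents the same way you
# would for the real template file.
buffer = io.BytesIO()
with zipfile.ZipFile(buffer, "w") as vstax:
    vstax.writestr("ScriptMain.cs", "// template body goes here")
    vstax.writestr("VSTA11_IS_ST_CS_Template.vstatemplate", "<VSTemplate/>")

with zipfile.ZipFile(buffer) as vstax:
    names = vstax.namelist()
print(names)
```

Swap buffer for the actual path under the DTS\Binn folder and the same two calls will enumerate the installed template.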
Oh, how lovely! Point your favourite text editor at ScriptMain.cs. Look familiar? That is your ScriptMain, except it has a token of $safeprojectname$ instead of ST_12345 for the namespace.

There be dragons here

What we're about to do has the possibility of breaking your Visual Studio/BIDS/SSDT installation and you should not do it.

Really, by mucking about, you and you alone are responsible for your actions. You break it, you fix it.

Excellent, you're still here. Step 1. Edit the contents of ScriptMain.cs as you see fit. Mine looks like the following.


using System;
using System.Data;
using Microsoft.SqlServer.Dts.Runtime;
using System.Windows.Forms;

namespace $safeprojectname$
{
[Microsoft.SqlServer.Dts.Tasks.ScriptTask.SSISScriptTaskEntryPointAttribute]
public partial class ScriptMain : Microsoft.SqlServer.Dts.Tasks.ScriptTask.VSTARTScriptObjectModelBase
{
public void Main()
{
Dts.TaskResult = (int)ScriptResults.Success;
}

enum ScriptResults
{
Success = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Success,
Failure = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Failure
};

}
}
I do comment my code, usually to the level that FxCop stops barking at me, but I don't want to start with the annoying boilerplate comments.

Step 2 Create a new version of the vstax file. I could not figure out how to make the native Windows compressed-folder feature skip compressing the archive. 7-Zip, however, makes this a snap.

  1. Select all 10 files in the folder. Do not select the enclosing folder or your archive will be wrong
  2. Right click and choose the Add to Archive option
  3. Change your "Archive format" to zip and the "Compression level" to Store
  4. Give the archive a name like VSTA11_IS_ST_CS_Template.vstax
If the resulting zip is approximately 8KB, you have compression turned on. This won't cause SSDT to break, but it also won't be able to instantiate the Task editor until you fix it.
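If you'd rather script step 2 than click through 7-Zip, Python's zipfile module can write the archive with compression disabled via ZIP_STORED. This is a sketch and the folder paths in the usage note are hypothetical; the key detail is the compression argument.

```python
import os
import zipfile

def build_vstax(source_folder, target_path):
    """Zip every file in source_folder into target_path with no compression."""
    with zipfile.ZipFile(target_path, "w", compression=zipfile.ZIP_STORED) as archive:
        for name in sorted(os.listdir(source_folder)):
            # arcname keeps each file at the root of the archive -- no
            # enclosing folder, matching the original template layout.
            archive.write(os.path.join(source_folder, name), arcname=name)
```

Something like build_vstax(r'C:\temp\vstax_extracted', r'C:\temp\VSTA11_IS_ST_CS_Template.vstax') should produce a file noticeably larger than the roughly 8KB compressed version.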

Step 3 Backup. Make a copy of your existing VSTA11_IS_ST_CS_Template.vstax file in the Binn folder. Keep this safe.

Step 4 Replace the vstax file in the Binn folder with the one you just created. You will be prompted to perform an admin task with this since it's a protected folder. I clicked yes, but you shouldn't because your machine may catch on fire.

Step 5 Test. You don't even have to restart Visual Studio. Just drag a new Script Task onto your canvas and click edit script. If you've done everything correctly, you'll be sporting a slimmer Script Task. If this doesn't work, then replace your modified vstax file with the copy you made in Step 3.

Is my SSIS package running in 32 or 64 bit mode?


32 versus 64 bit backstory

I might have looked at some of the questions over on StackOverflow, and I've lost track of the number of times a package has worked on one machine but not on another, with the root cause being a mismatch between the "bittedness" of the driver and the execution model.

32 or 64 bit?

Windows was a 32 bit Operating System, OS, until around 2003. In the main Windows folder, you had a System folder, which was for 16 bit libraries, and a System32 folder, which was for 32 bit libraries and applications. All is well and good.

Then the 64 bit architecture made it into Windows and now we have a third folder, this one called SysWoW64, which contains? .... 32 bit applications. Of course. The 64 bit applications and libraries are in the System32 folder. I am not making this up. So, 32 bit in the folder named 64, 64 bit in the folder named 32. Got it.

But wait, there's more. Not only can your OS come in 32 and 64 bit flavours, so can your applications and drivers! Applications usually install in the Program Files directory. If you are on a 64 bit OS, then your "Program Files" is going to contain your 64 bit executables while your "Program Files (x86)" will contain your 32 bit executables. But don't worry, if you're on a 32 bit OS, there won't be an (x86) folder and the 32 bit executables will be in "Program Files."

So what?

Think of 32 vs 64 bit as height in inches. A two year old is probably 32" while a 17 year old may be 64". If I put a knick knack on the top shelf, only the 17 year old can reach (address) it. If I make the mistake of putting it on a low shelf, the 17 year old can't be bothered to bend down to pick it up, but the 2 year old can and will do their best Godzilla impersonation on it.

What's drivers got to do with it

SSIS can target a variety of sources and sinks out of the box. Flat files, web services, Active Directory, SQL Server: piece of cake. Excel, Access, Informix, DB2, MySQL, Oracle: not so much with the cake. The problem is that you need special drivers to get SSIS to talk to these providers. Some, like Excel*, are part of the base installation. Others might require a special download.

Some drivers come in both kinds, 32 and 64 bit. Others are only found in the 32 bit variety. I'm sure there's some esoteric driver that only works in 64 bit but I've never found someone lamenting this on a forum.

Often, with these providers, you will create a Data Source Name, DSN, to provide configuration information or provide your own unique file that starts with TNS and ends with ORA. What's important to realize is that you will need to align these configuration values with the correct bit version of your driver and your target package execution mode.

The executable for ODBC driver administration is odbcad32.exe. That tool exists in SysWoW64, our 32 bit app location, and also in System32, our 64 bit app location, but it's still physically called odbcad32.exe in both locations. Have I mentioned I deal with this confusion with some frequency? I don't know why, it seems so readily apparent. Further muddying the waters, I believe in Server 2003 the Control Panel, Administrative Tools, ODBC Data Sources shortcut only pointed to one of them. You had to know the other existed and find it to use it. The current interface at least lists it twice and indicates which is 32 versus 64 bit.

Running SSIS packages

SSIS packages "run" by getting called from an executable named dtexec.exe. If you have SSIS/BIDS/SSDT installed on your machine, you likely have two versions of dtexec.exe. Assuming default installations, you'll find them at the following paths.

  • 2005 - 32 bit: C:\Program Files (x86)\Microsoft SQL Server\90\DTS\Binn\DTExec.exe; 64 bit (default): C:\Program Files\Microsoft SQL Server\90\DTS\Binn\DTExec.exe
  • 2008 - 32 bit: C:\Program Files (x86)\Microsoft SQL Server\100\DTS\Binn\DTExec.exe; 64 bit (default): C:\Program Files\Microsoft SQL Server\100\DTS\Binn\DTExec.exe
  • 2012 - 32 bit: C:\Program Files (x86)\Microsoft SQL Server\110\DTS\Binn\DTExec.exe; 64 bit (default): C:\Program Files\Microsoft SQL Server\110\DTS\Binn\DTExec.exe
  • 2014 - 32 bit: C:\Program Files (x86)\Microsoft SQL Server\120\DTS\Binn\DTExec.exe; 64 bit (default): C:\Program Files\Microsoft SQL Server\120\DTS\Binn\DTExec.exe

This matters as the default executable that gets run will likely be the 64 bit version. If you're trying to execute a package in 32 bit mode from the command line, you will need to explicitly reference the dtexec.exe in the x86 folder.
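To make "call the right dtexec" concrete, here's a small Python helper of my own devising (not a Microsoft utility) that maps a product version and a desired bitness onto the default paths listed above:

```python
# Map SQL Server versions to the folder number used under \Microsoft SQL Server\.
DTS_FOLDERS = {"2005": "90", "2008": "100", "2012": "110", "2014": "120"}

def dtexec_path(version, use_32bit):
    """Return the default dtexec.exe path for a version/bitness combination."""
    root = r"C:\Program Files (x86)" if use_32bit else r"C:\Program Files"
    return r"{0}\Microsoft SQL Server\{1}\DTS\Binn\DTExec.exe".format(
        root, DTS_FOLDERS[version])

print(dtexec_path("2012", use_32bit=True))
```

Feed the result to your scheduler or shell of choice; the point is that the bitness decision lives in the path, not in a switch.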

Observant folks may see that dtexec offers a /X86 option. Don't believe it. The only way to get the correct bit-ness is to explicitly call the correct dtexec.exe. The documentation even says as much, but nobody reads documentation: "This option is only used by SQL Server Agent. This option is ignored if you run the dtexec utility at the command prompt."

SSISDB notes

For those working with the Project Deployment Model (2012/2014), you don't have to worry about paths when spawning an execution instance. It's a simple matter of passing True/False to the @use32bitruntime parameter (line 6 below)

   1:  EXEC [SSISDB].[catalog].[create_execution]
   2:      @package_name = N'PartialLookup.dtsx'
   3:  ,   @execution_id = @execution_id OUTPUT
   4:  ,   @folder_name = N'POC'
   5:  ,   @project_name = N'BimlTest'
   6:  ,   @use32bitruntime = True
   7:  ,   @reference_id = NULL

Tracking whether an execution instance was 32 or 64 bit isn't readily apparent with the native reports but a simple query against catalog.executions will reveal it.


SELECT
E.use32bitruntime
, *
FROM
catalog.executions AS E
WHERE
E.execution_id = @OperationIDFromReportUpperLeftCorner;

Putting it all together, mostly

If I'm in 32 bit space, I can only work with the drivers and data source names I know about. The DSN I'm looking for might be in 64 bit space but I'll never be able to reach it from the depths of 32 bit. To paraphrase: 32 is 32, and 64 is 64, and ne'er the twain shall meet.

You might swear up and down that you created the DSN, or that you ran it in 32 bit mode but JET's not installed and this is a bug with SSIS, but before you post a question on StackOverflow, the MSDN forums or Connect, double check.

How do I double check?

Finally, what I was originally putting this post together to cover. For 2012/2014 packages in the project deployment model, you already have your answer. For everything else, there's a Script Task.

I know, I heard you say ewwwww. It's not true. When you run your package from the command line, the first two lines of output are


Microsoft (R) SQL Server Execute Package Utility
Version 12.0.2000.8 for 64-bit
Right there, it says this is the 64 bit version of dtexec and that I don't need to patch my VM. But, there are other ways of invoking a package. What if I started my package via .NET code or am running it in Visual Studio?

The quickest way I've been able to determine it is to evaluate IntPtr.Size. If it's 4, then it's 32 bit; if it's 8, then it's 64 bit.

Assuming you pass in System::InteractiveMode as a read-only variable, this little script will fire an information event alerting you whether you're 32 or 64 bit. It can also pop up a message box, because everyone loves those.


bool fireAgain = false;
string message = string.Empty;
message = string.Format("I am running in {0} mode", (IntPtr.Size == 4) ? "32 bit":"64 bit");
Dts.Events.FireInformation(0, "Log bittedness", message, string.Empty, 0, ref fireAgain);
if ((bool)this.Dts.Variables["System::InteractiveMode"].Value)
{
MessageBox.Show(message);
}
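The pointer-size trick isn't unique to .NET, either. For comparison, here is the Python equivalent, where struct.calcsize("P") plays the role of IntPtr.Size:

```python
import struct

# Size of a pointer in bytes: 4 means a 32 bit process, 8 means 64 bit.
bits = struct.calcsize("P") * 8
message = "I am running in {0} bit mode".format(bits)
print(message)
```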

Biml

A post wouldn't be complete without some Biml! If you wanted to, you could add this little bit into every package you emit and then you'd know, absolutely know, whether you stumbled onto a bug or the problem exists between keyboard and chair.

Don't be afraid of that hot mess of code. All it does is generate an SSIS package that uses a Script Task that has the above code in it. You can use the CallBimlScript trick from the Replicate-O-Matic post to encapsulate the script task into an external file but that's not really as flexible as I'd want it to be.


<Biml xmlns="http://schemas.varigence.com/biml.xsd">

<ScriptProjects>
<ScriptTaskProject ProjectCoreName="ST_12345" Name="ST_12345" VstaMajorVersion="0">
<ReadOnlyVariables>
<Variable Namespace="System" VariableName="InteractiveMode" DataType="Boolean" />
</ReadOnlyVariables>
<Files>
<File Path="ScriptMain.cs" BuildAction="Compile">using System;
using System.Data;
using Microsoft.SqlServer.Dts.Runtime;
using System.Windows.Forms;

namespace ST_12345
{
[Microsoft.SqlServer.Dts.Tasks.ScriptTask.SSISScriptTaskEntryPointAttribute]
public partial class ScriptMain : Microsoft.SqlServer.Dts.Tasks.ScriptTask.VSTARTScriptObjectModelBase
{
public void Main()
{
bool fireAgain = false;
string message = string.Empty;
message = string.Format("I am running in {0} mode", (IntPtr.Size == 4) ? "32 bit":"64 bit");
Dts.Events.FireInformation(0, "Log bittedness", message, string.Empty, 0, ref fireAgain);
if ((bool)this.Dts.Variables["System::InteractiveMode"].Value)
{
MessageBox.Show(message);
}

Dts.TaskResult = (int)ScriptResults.Success;
}

enum ScriptResults
{
Success = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Success,
Failure = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Failure
};
}
} </File>
<File Path="Properties\AssemblyInfo.cs" BuildAction="Compile">
using System.Reflection;
using System.Runtime.CompilerServices;

//
// General Information about an assembly is controlled through the following
// set of attributes. Change these attribute values to modify the information
// associated with an assembly.
//
[assembly: AssemblyTitle("AssemblyTitle")]
[assembly: AssemblyDescription("Bill is awesome")]
[assembly: AssemblyConfiguration("")]
[assembly: AssemblyCompany("")]
[assembly: AssemblyProduct("ProductName")]
[assembly: AssemblyCopyright("Copyright @ 2015")]
[assembly: AssemblyTrademark("")]
[assembly: AssemblyCulture("")]
//
// Version information for an assembly consists of the following four values:
//
// Major Version
// Minor Version
// Build Number
// Revision
//
// You can specify all the values or you can default the Revision and Build Numbers
// by using the '*' as shown below:

[assembly: AssemblyVersion("1.0.*")]
</File>
</Files>
<AssemblyReferences>
<AssemblyReference AssemblyPath="System" />
<AssemblyReference AssemblyPath="System.Data" />
<AssemblyReference AssemblyPath="System.Windows.Forms" />
<AssemblyReference AssemblyPath="System.Xml" />
<AssemblyReference AssemblyPath="Microsoft.SqlServer.ManagedDTS.dll" />
<AssemblyReference AssemblyPath="Microsoft.SqlServer.ScriptTask.dll" />
</AssemblyReferences>
</ScriptTaskProject>
</ScriptProjects>
<Packages>
<Package Name="CheckMyBits" ConstraintMode="Linear">
<Tasks>
<Script ProjectCoreName="ST_12345" Name="SCR Do Stuff">
<ScriptTaskProjectReference ScriptTaskProjectName="ST_12345" />
</Script>
</Tasks>
</Package>
</Packages>
</Biml>

That's a lot of Biml to try and remember, if only there was a way to keep that handy...


Biml - Snippets


I've been working with Biml for a year and a half now. With the Intellisense built into BIDS Helper or Mist itself, I can bang out some code fairly quick. My mental parser isn't too bad either, I can read and generally see what is/isn't set correctly in it. Except for the Script Tasks and Components. Even in Mist, they still kick me in the pants. What's the ProjectCoreName and how does that differ from the ScriptTaskProjectName and should it differ? What's the crazy syntax for escaping my code within the code? Yeah, I don't care anymore.

I no longer care, because I have a snippet. If only I had a Donk! A snippet is like a macro: type some mnemonic keystroke and if you want the snippet, hit Tab. The C# snippet that comes to mind is cw, which autocompletes to Console.WriteLine. Man, that'd be helpful for slingin' Biml.

Wait, why haven't I used them? I know you can create custom snippets for .NET so why not one for "XML?" Yeah self, why not? To save you the trouble of arguing with yourself for not being clever, I'm going to tell you to start creating your own snippets, contribute them to bimlscript.com and let's get cranking.

Getting started with snippets

I don't know that you have to, but there's a very handy tool called Snippet Designer that makes it a cinch to create snippets.

Highlight the text you're interested in and in your right click menu, Export as Snippet. You can ignore the Create Snippet..., that's Red Gate's SQL Prompt and won't create the right type of snippet for these file types.

For a script task, I'm just going to assume I'm starting with a brand new Biml file, so I've selected everything but that and put it into a snippet. You'll then be presented with a nice little editor so you can use things like anchors and such, which I made heavy use of in TextPad's clip library.

I save it out and it goes into a file called C:\Users\bfellows\Documents\Visual Studio 2012\Code Snippets\XML\My Xml Snippets\ScriptTaskCS_2012.snippet


<?xml version="1.0" encoding="utf-8"?>
<CodeSnippets xmlns="http://schemas.microsoft.com/VisualStudio/2005/CodeSnippet">
<CodeSnippet Format="1.0.0">
<Header>
<SnippetTypes>
<SnippetType>Expansion</SnippetType>
</SnippetTypes>
<Title>ScriptTaskCS_2012</Title>
<Author>admin</Author>
<Description>
</Description>
<HelpUrl>
</HelpUrl>
<Shortcut>
</Shortcut>
</Header>
<Snippet>
<Code Language="xml"><![CDATA[ <ScriptProjects>
<ScriptTaskProject ProjectCoreName="ST_12345" Name="ST_12345" VstaMajorVersion="0">
<ReadOnlyVariables>
<Variable Namespace="System" VariableName="MachineName" DataType="String" />
</ReadOnlyVariables>
<Files>
<File Path="ScriptMain.cs" BuildAction="Compile">using System;
using System.Data;
using Microsoft.SqlServer.Dts.Runtime;
using System.Windows.Forms;

namespace ST_12345
{
[Microsoft.SqlServer.Dts.Tasks.ScriptTask.SSISScriptTaskEntryPointAttribute]
public partial class ScriptMain : Microsoft.SqlServer.Dts.Tasks.ScriptTask.VSTARTScriptObjectModelBase
{
public void Main()
{
bool fireAgain = false;
string message = Dts.Variables["System::MachineName"].Value.ToString();
Dts.Events.FireInformation(0, "Log MachineName", message, string.Empty, 0, ref fireAgain);

Dts.TaskResult = (int)ScriptResults.Success;
}

enum ScriptResults
{
Success = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Success,
Failure = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Failure
};
}
} </File>
<File Path="Properties\AssemblyInfo.cs" BuildAction="Compile">
using System.Reflection;
using System.Runtime.CompilerServices;

//
// General Information about an assembly is controlled through the following
// set of attributes. Change these attribute values to modify the information
// associated with an assembly.
//
[assembly: AssemblyTitle("AssemblyTitle")]
[assembly: AssemblyDescription("")]
[assembly: AssemblyConfiguration("")]
[assembly: AssemblyCompany("I <3 @billinkc")]
[assembly: AssemblyProduct("ProductName")]
[assembly: AssemblyCopyright("Copyright @ 2015")]
[assembly: AssemblyTrademark("")]
[assembly: AssemblyCulture("")]
//
// Version information for an assembly consists of the following four values:
//
// Major Version
// Minor Version
// Build Number
// Revision
//
// You can specify all the values or you can default the Revision and Build Numbers
// by using the '*' as shown below:

[assembly: AssemblyVersion("1.0.*")]
</File>
</Files>
<AssemblyReferences>
<AssemblyReference AssemblyPath="System" />
<AssemblyReference AssemblyPath="System.Data" />
<AssemblyReference AssemblyPath="System.Windows.Forms" />
<AssemblyReference AssemblyPath="System.Xml" />
<AssemblyReference AssemblyPath="Microsoft.SqlServer.ManagedDTS.dll" />
<AssemblyReference AssemblyPath="Microsoft.SqlServer.ScriptTask.dll" />
</AssemblyReferences>
</ScriptTaskProject>
</ScriptProjects>
<Packages>
<Package Name="BasicScriptTask" ConstraintMode="Linear">
<Tasks>
<Script ProjectCoreName="ST_12345" Name="SCR Do Stuff">
<ScriptTaskProjectReference ScriptTaskProjectName="ST_12345" />
</Script>
</Tasks>
</Package>
</Packages>]]></Code>
</Snippet>
</CodeSnippet>
</CodeSnippets>
Now when I need a script task, my workflow is
  1. Add new biml file
  2. Right click, Insert Snippet (Ctrl-K, Ctrl-X)
  3. Navigate to My Xml Snippets, select ScriptTaskCS_2012
  4. Replace the Name attribute for Package and replace all instances of ST_12345 with something a little more unique

If I click Generate SSIS Package, the biml engine is going to fire up and emit an SSIS package with a script task. How cool is that? Think about how you can leverage snippets and CallBimlScript: Replicate-o-matic, Don't Repeat Your Biml, Callable BimlScript (Caller), etc.

Like this? Joost van Rossum (b|t) just posted Creating BIML Script Component Transformation (rownumber). That's your framework for creating a Script Component, acting as a transform. Add that to your Snippets and now you have an example of each.

For large libraries, you might want to make those CoreNames unique. I'll see if there's an API call for generating a unique name. Also, you can make this into an in-line project script as Scott shows on Creating Script Task Projects inline. The difference between the two approaches boils down to whether you want to create shareable, project-level tasks or per-package tasks.
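Until that API call turns up, a low-tech workaround (my own sketch, not a Biml feature) is to salt the CoreName with a fragment of a GUID whenever you paste the snippet:

```python
import uuid

def unique_core_name(prefix="ST"):
    """Generate a CoreName like ST_1a2b3c4d that won't collide across emissions."""
    return "{0}_{1}".format(prefix, uuid.uuid4().hex[:8])

name = unique_core_name()
print(name)
```

Paste the generated value over both the ProjectCoreName and ScriptTaskProjectName attributes and the snippet stays copy-paste safe.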

I am very excited about integrating snippets into my biml workflow and I hope this has opened your eyes to another means for speeding your development.

Biml SSIS ErrorCode and ErrorColumn

Did you know that ErrorCode and ErrorColumn are "reserved" column names in an SSIS Data Flow? I've been doing this for ten years now and I had never run into this until this week. My client's application has an ErrorCodes table defined like this

USE [tempdb]
GO
CREATE TABLE [dbo].[ErrorCodes]
(
[ErrorCodeId] [uniqueidentifier] NOT NULL
, [ErrorCode] [nvarchar](3) NOT NULL
, [ErrorText] [nvarchar](50) NOT NULL
, [CreateDate] [datetime] NOT NULL
, [CreatedBy] [uniqueidentifier] NULL
, [ModifiedDate] [datetime] NULL
, [ModifiedBy] [uniqueidentifier] NULL
, [ModuleReference] [uniqueidentifier] NULL
, [RowStatusId] [uniqueidentifier] NULL
, CONSTRAINT [PK_dboErrorCode] PRIMARY KEY CLUSTERED ([ErrorCodeId] ASC)
);
We needed to replicate the data out of the application tables so naturally, I had described the pattern in Biml and let it run. Until it blew up on me. The 1.7.0 release of BIDS Helper uses the new and improved Biml engine and it actually reported an error on emission. Contrast that with the 1.6.0 release which happily emitted the DTSX. This of course is documented in the Release Notes for Mist 4.0 Update 1

BimlScript Errors/Warnings

  • Added an error for duplicate dataflow column nodes. This usually arises with "reserved" column names "ErrorCode" and "ErrorText" (sic)

That should actually be "ErrorColumn". There's an ErrorDescription that is added in the OnError event and I've sent an email along to Varigence to get that corrected but I'm not seeing an ErrorText anywhere.

Reproduction

The following Biml is sufficient for you to see the error generated (and to show off the new error reporting functionality in BIDS Helper 1.7.0)

<Biml xmlns="http://schemas.varigence.com/biml.xsd">
<Connections>
<OleDbConnection Name="CM_OLE" ConnectionString="Data Source=localhost\dev2014;Initial Catalog=tempdb;Provider=SQLNCLI10.1;Integrated Security=SSPI;Auto Translate=False;" />
</Connections>
<Packages>
<Package ConstraintMode="Linear" Name="ErrorCodeTest">
<Variables>
<Variable DataType="String" Name="SimulateBadTable"><![CDATA[SELECT * FROM (VALUES (NULL, NULL)) D(ErrorCode, ErrorColumn); ]]></Variable>
</Variables>
<Tasks>
<Dataflow Name="DFT Test">
<Transformations>
<OleDbSource ConnectionName="CM_OLE" Name="OLE_SRC dbo_ErrorCodes">
<VariableInput VariableName="User.SimulateBadTable"></VariableInput>
</OleDbSource>
<DerivedColumns Name="Do Nothing"></DerivedColumns>
</Transformations>
</Dataflow>
</Tasks>
</Package>
</Packages>
</Biml>
Component OLE_SRC dbo_ErrorCodes of type AstOleDbSourceNode in Dataflow DFT Test has duplicate column reference with name 'ErrorColumn' on ouput 'Error'. This may be due to the 'ErrorCode' and 'ErrorColumn' columns that are automatically added to error output paths by SSIS. Ensure that all column names are unique.

Known issue

This is apparently a known issue based on this Connect Item SSIS - ErrorCode column in source table causes duplicate ErrorCode to be introduced, but it's marked as Closed - Fixed. I'm not seeing the fixed part with SQL Server 2014.

For those Bimling, it's also logged over on the Varigence forums but they simply reference the Connect item.

What to do

I don't know yet (sadly, this excites me). My lazy pattern was using the table selector for my OLE DB Source but for this one table, I'd need to explicitly grab the column list and alias the ErrorCode column as something else. What I'd rather do is rename the Error path's columns but that does not seem possible.

Never trust the SSMS GUI


There's a group of SQL Server professionals that try to help on StackOverflow and DBA.StackExchange.com. One of the most telling symptoms of a bad question is people using the GUI to administer their SQL Server instances. "I clicked here, here, here and bad thing happened". What did you actually do? Do you know?

It's ok, you don't have to be ultra hardcore and memorize every bit of syntax or spend time looking through Books Online to find the specific command you're looking for. You can use the GUI to administer your machine. Just don't click the OK button. Instead, look up.

That beautiful Script button is the one you're looking for. That generates all the DDL that clicking OK would have done but now you have something you can inspect. "Trust, but verify" as someone once said. Not only can you inspect it, you can share those commands with someone else.

In today's example, I learned the GUI was silently swallowing the error. I had created an operator for myself but misspelled my last name. I correct my name, clicked OK and ... it never changed. Weird, I must have clicked Cancel. This time, I ensured I clicked that big OK button.

I hit refresh on the Operators list and what the heck? Why am I still "Bill Fellow?" I demand an "S"! Once more unto the GUI and this time, I clicked Script Action to new query window.


EXEC msdb.dbo.sp_update_operator @name=N'Bill Fellows',
@enabled=1,
@pager_days=0,
@email_address=N'BillinKC@world.domination',
@pager_address=N'',
@netsend_address=N''
Perfect, this is what I want. I'm updating the operator...

Msg 14262, Level 16, State 1, Procedure sp_update_operator, Line 61
The specified @name ('Bill Fellows') does not exist.
Or not. I could have spent time yelling at the SQL Server for being "dumb" or posted some question with an exact reproduction and, if someone was inclined, they could have verified that yes, it doesn't appear to rename. But look what happens when you actually run the command yourself. It tells you exactly why it's not working.

Asking questions is good. We should all endeavor to ask questions of everything. But even better is to go find the answer.

Happy scripting.

Biml - separate dataflows into separate packages

$
0
0

I have an upcoming client with overly complicated SSIS packages. One package alone has 80 dataflow tasks, very few of which are related to each other. You wouldn't believe it, but they have maintenance issues with this package! Especially with regard to concurrent editing of an SSIS package, which isn't possible. To simplify their solution, part of my proposal is that we take the massive, monolithic SSIS package and break it out into many teeny-tiny SSIS packages that each focus on a single task. This will allow one developer to change the sales order detail dataflow task while another safely edits the sales employee data flow task, as they'll be contained in completely separate packages. I'll then build a master package that ensures proper invocation of the subpackages.

To get started, I'm going to use Mist to reverse engineer my packages into Biml. Let's assume it generates the following package.


<Biml xmlns="http://schemas.varigence.com/biml.xsd">
<Packages>
<Package Name="SourcePackage">
<Tasks>
<Dataflow Name="DFT 1" />
<Dataflow Name="DFT 2" />
<ExecuteProcess Name="EPT Do not include" Executable="C:\Windows\System32\cmd.exe" />
<Container Name="SEQ A">
<Tasks>
<Dataflow Name="DFT A1" />
</Tasks>
</Container>
</Tasks>
</Package>
<Package Name="SourcePackage2">
<Tasks>
<Dataflow Name="DFT 1" />
<Dataflow Name="DFT 2" />
<ExecuteProcess Name="EPT Do not include" Executable="C:\Windows\System32\cmd.exe" />
<Container Name="SEQ A">
<Tasks>
<Dataflow Name="DFT A1" />
</Tasks>
</Container>
</Tasks>
</Package>
</Packages>
</Biml>
I only want the data flow tasks extracted from this. You can also see that I have data flow tasks encapsulated into containers. That's a pain to enumerate through in the .NET object model but Biml gives us lovely LINQ shortcuts. The first thing I want to do is find all of the dataflow tasks. RootNode.AllDefinedSuccessors().OfType&lt;AstDataflowTaskNode&gt;() generates an enumerable list.

For each of the items in our list, we want to do something. Here I'm creating a new package with the name of the original package + the data flow task. I assume this will be unique but it's not guaranteed. Within my Tasks collection, I simply call the GetBiml method which preserves the definition of my data flow.


<#@ template language="C#" tier="2" #>
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
<Packages>
<#
foreach(var t in RootNode.AllDefinedSuccessors().OfType<AstDataflowTaskNode>())
{
#>
<Package Name="<#=t.FirstThisOrParent<AstPackageNode>() #> - <#=t.Name #>">
<Tasks>
<#=t.GetBiml() #>
</Tasks>
</Package>
<#
}
#>
</Packages>
</Biml>
If you're in BIDS Helper, simply highlight both files and click generate SSIS packages. Out comes 8 SSIS packages, the 2 originals and then one for each data flow.

Think about the possibilities this opens up for you! Maybe I'd also like to add in an auditing framework (I would). Maybe I want to then turn on logging and configuration since this client is on 2008. That's just as easy. Oh Biml, I <3 you.

BIDS Helper 1.7.0 subtle improvement


BIDS Helper 1.7.0.0

On March 17th, a new version of BIDS Helper was released, and one of the compelling features in it was the ability to emit 2014 packages natively via Biml. Tucked away in there, though, is a usability feature that rocks: the validation reporter no longer pops up in the terrible window it used to.

If I Check Biml for Errors, I get the usual pop-up window, although it's cleaner looking than in 1.6 and earlier versions, but the real awesome sauce is in the Output window (View, Output or Ctrl+Alt+O).

Biml Validation Items

How awesome is this? These errors I can feed to a search engine! The picture, not so much. Here's some sample output.

  • Validating BIML
  • 1/1 Emitting Project ErrorCodeTest.dtproj.
  • 1/1 Emitting Package ErrorCodeTest.
  • C:\sandbox\POC_2013\POC_2013\BimlScript.biml(13,26) : Error 0 : Component OLE_SRC dbo_ErrorCodes of type AstOleDbSourceNode in Dataflow DFT Test has duplicate column reference with name 'ErrorCode' on ouput 'Error'. This may be due to the 'ErrorCode' and 'ErrorColumn' columns that are automatically added to error output paths by SSIS. Ensure that all column names are unique.
  • C:\sandbox\POC_2013\POC_2013\BimlScript.biml(13,26) : Error 0 : Component OLE_SRC dbo_ErrorCodes of type AstOleDbSourceNode in Dataflow DFT Test has duplicate column reference with name 'ErrorColumn' on ouput 'Error'. This may be due to the 'ErrorCode' and 'ErrorColumn' columns that are automatically added to error output paths by SSIS. Ensure that all column names are unique.
  • EmitSsis. There were errors during compilation. See compiler output for more information.

Thank you and excellent work on this release to Varigence and the devs on the BIDS Helper project.

Biml - Adding external assemblies


This blog post is coming live to you from SQL Saturday Dallas BI Edition. We were discussing horrible things you can do in SSIS and I mentioned how you can do anything that the .NET library supports. That got me wondering about whether I could do the same awful things within Biml. The answer is yes, assuming you supply sufficient force.

As a simple demonstration, I wanted to pop up a message box whenever I generated or checked my Biml for errors. However, the MessageBox method lives in an assembly that isn't loaded by default, and to get that assembly to import, I needed to add it as a reference for the Biml compiler. The last two lines of this show me adding the assembly and then importing the namespace.


<Biml xmlns="http://schemas.varigence.com/biml.xsd">
<#
MessageBox.Show("This works");
#>
</Biml>
<#@ import namespace="System.Windows.Forms" #>
<#@ assembly name="C:\Windows\Microsoft.NET\Framework\v4.0.30319\System.Windows.Forms.dll" #>

Biml breaking change - ServerExecutionID is repeated within scope


This is a brief post, but I wanted to note that with the 1.7 release of BIDS Helper and the 4.0 release of Mist, Biml code you found around the Internet that dealt with the Project Deployment Model may no longer build. Specifically, you might run into

The name ServerExecutionID is repeated within scope. Eliminate duplicate names in scope

In your Variables collection, your Biml likely has an entry like


<Variable Name="ServerExecutionID" DataType="Int64" IncludeInDebugDump="Include" Namespace="System">0</Variable>

The approach of explicitly declaring ServerExecutionID was required in BIDS Helper 1.6.6 and prior to be able to access that System-scoped variable. With the newer release and the ability to natively emit 2014+ packages, this is no longer a requirement. Not only is it not a requirement, it breaks validation with the above error message.

The resolution is simple, delete that declaration.


Biml - Unpivot transformation


I had cause today to use the Unpivot transformation in SSIS. My source database was still in 2000 compatibility mode (don't laugh), so I couldn't use the PIVOT operator, and I was too lazy to remember the CASE approach. My client records whether a customer uses a particular type of conveyance. For simplicity's sake, we'll limit this to whether they own a car or a truck. Some customers might own both, only one, or neither. Part of my project is to normalize this data into a more sane structure.

Source data

The following table approximates the data but there are many more bit fields to be had.
CustomerName   OwnsCar   OwnsTruck
Customer 1        1          1
Customer 2        1          0
Customer 3        1          0
Customer 4        1          0
Customer 5        1          0
Customer 6        1          0
Customer 7        1          1
Customer 8        1          0
Customer 9        1          0
Customer 10       1          0
Customer 11       1          0
Customer 12       1          0
Customer 13       0          1
Customer 14       0          0
Customer 15       0          0
Customer 16       0          0
Customer 17       0          0
Customer 18       0          0
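For reference, the 2000-compatible T-SQL way to unpivot this shape is a plain UNION ALL; this is a sketch only, and the table name dbo.CustomerVehicles is hypothetical:

```sql
-- Hypothetical source table; each branch peels one bit column off into a row.
SELECT CV.CustomerName, 'Car' AS Vehicle
FROM dbo.CustomerVehicles AS CV
WHERE CV.OwnsCar = 1
UNION ALL
SELECT CV.CustomerName, 'Truck' AS Vehicle
FROM dbo.CustomerVehicles AS CV
WHERE CV.OwnsTruck = 1;
```

Customers owning neither simply produce no rows, which is exactly what the normalized structure wants.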

SSIS Package

The package is rather simple: the above source data feeds into an Unpivot component, and then a Derived Column serves as an anchor point for a data viewer.

Unpivot

To no great surprise to anyone who's worked with Biml, the code is not complex. We need to provide specifics about how the pivot key column should work and then the detailed mapping of what we want to do with our columns. Here we're going to keep our CustomerName column, but we want to merge the OwnsCar and OwnsTruck columns into a single new column called SourceValue. The PivotKeyValue we supply will be the value associated with our pivot. Since we specified an ANSI string of length 20, the values we supply, Car and Truck, must fit within that domain.

Unpivot Biml


<Biml xmlns="http://schemas.varigence.com/biml.xsd">
    <Connections>
        <OleDbConnection ConnectionString="Provider=SQLOLEDB;Data Source=localhost\dev2014;Integrated Security=SSPI;Initial Catalog=tempdb" Name="CM_OLE"/>
    </Connections>
    <Packages>
        <Package ConstraintMode="Linear" Name="Component_Unpivot">
            <Variables>
                <Variable Name="QuerySource" DataType="String">
                    <![CDATA[SELECT
    'Customer ' + CAST(ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS varchar(2)) AS CustomerName
,   *
FROM
(
    VALUES
        (CAST(1 AS bit))
    ,   (CAST(1 AS bit))
    ,   (CAST(0 AS bit))
) S(OwnsCar)
CROSS APPLY
(
    VALUES
        (CAST(1 AS bit))
    ,   (CAST(0 AS bit))
    ,   (CAST(0 AS bit))
    ,   (CAST(0 AS bit))
    ,   (CAST(0 AS bit))
    ,   (CAST(0 AS bit))
) F(OwnsTruck);
]]></Variable>
            </Variables>
            <Tasks>
                <Dataflow Name="DFT Unpivot">
                    <Transformations>
                        <OleDbSource ConnectionName="CM_OLE" Name="OLESRC Unpivot Source">
                            <VariableInput VariableName="User.QuerySource"/>
                        </OleDbSource>

                        <Unpivot
                            Name="UPV Vehicle types"
                            PivotKeyValueColumnName="Vehicle"
                            PivotKeyValueColumnDataType="AnsiString"
                            PivotKeyValueColumnCodePage="1252"
                            PivotKeyValueColumnLength="20"
                            AutoPassThrough="false">
                            <Columns>
                                <Column SourceColumn="CustomerName" IsUsed="true"/>
                                <Column SourceColumn="OwnsCar" TargetColumn="SourceValue" PivotKeyValue="Car"/>
                                <Column SourceColumn="OwnsTruck" TargetColumn="SourceValue" PivotKeyValue="Truck"/>
                            </Columns>
                        </Unpivot>

                        <DerivedColumns Name="DER Placeholder"/>
                    </Transformations>
                </Dataflow>
            </Tasks>
        </Package>
    </Packages>
</Biml>

Tracking bad queries aka Finally an Extended Event that is useful


Dessert first a.k.a. how useful would this be?

Imagine running a query and finding out all the bad queries that have been submitted. Like this

The client restored a database but neglected to fix permissions for those first three. Look at that fourth one though! I only enabled this trace on the 3rd. Any takers on how many months that error has been swallowed by their application framework? Exactly. This is definitely something I'm adding into my utility belt. Read on for how to set up Extended Events to do just this.
We now resume your regularly scheduled blog post, already in progress.

Finally, an extended event that was useful.

I'm sure that's heresy among the pure DBAs, but I'm not a DBA. I'm just a developer, among many other things. At my last client, one of the developers was hell-bent on building procedures which called procedures that built code to call more procedures: dynamic code building statements to an untenable nesting level. Which was cool in theory, but unfortunately they had no way to debug it when it built bad queries. Which it did. Over and over again. Thousands per second. It could have been marketed as Errors as a Service. The real problem, though, was that it took them several hours to recreate the error scenario. What we needed was a way to log the statements across all those levels of nested stored procedures and the values they sent.

Enter Kendra Little and this "magic" script. It's magic because it was the missing link for me in finally "getting" XEvents. I remember asking at SQL Saturday #91 (Omaha) back in 2011, "Where would I, as a developer, use extended events?" Four years later, I had a scenario where I needed them.

I had meant to blog about this in April when I first used it, but then shiny squirrels, and this was lost in the TODO list. Fast forward to today. At my current client, we've built a new website and the database to manage part of their business. Most of the website runs off stored procedures, but part of it is SQL embedded in the web pages. As much as the developer would like to just make the account the database owner, I'd rather not give up on security. I created a custom database role, added the service account into it, and granted it execute permissions on the schema. All is well and good except for those pesky direct queries we couldn't turn into procedures. When those queries hit, the "database is down." At least, that's the depth of the email I get. You know what would be handy here? That same bad code tracker!

Implementation

My implementation is to create an extended event session that logs to an asynchronous file (first, do no harm). Where my approach differs from Kendra's is that parsing all those files is dog slow, at least for the Errors as a Service scenario. So instead, I create a heap to hold the parsed bits I care about as well as the raw XML from the extended event. That way, if I determine I want some other attribute extracted, I still have the original data available. Oh, and finally, I clean up after myself, which is important when you're a consultant. You might roll off an engagement, someone forgets to watch the drive, and suddenly you've filled a drive with log data.

You'll notice I create the table in tempdb. Basically, I don't care if I lose that table; it's just for my monitoring purposes, and in fact, putting it in tempdb assures me that at some point it'll get wiped and recreated. If I were blessed to be on a 2014+ platform, I'd look hard at whether a memory-optimized table might serve the same need, as long as it's declared with a durability of SCHEMA_ONLY. I don't think LOB types are supported there yet, but I suspect it's a matter of time before they are.
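For the curious, a minimal sketch of what that 2014 memory-optimized variant might look like, assuming the database already has a MEMORY_OPTIMIZED_DATA filegroup; since LOB types aren't supported there, the sql_text and event_data columns are dropped, and the table name is my own invention:

```sql
-- Sketch only: assumes a MEMORY_OPTIMIZED_DATA filegroup exists in this database.
-- SCHEMA_ONLY durability means the data (not the table) vanishes on restart,
-- which suits throwaway monitoring data just fine.
CREATE TABLE dbo.ExtendedEventErrorCapture_InMemory
(
    err_id int IDENTITY(1, 1) NOT NULL
        PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 131072)
,   err_timestamp datetime2(7) NULL
,   err_severity bigint NULL
,   err_number bigint NULL
,   err_message nvarchar(512) NULL
,   session_name sysname NOT NULL
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_ONLY);
```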

This idempotent script will create our table, dbo.ExtendedEventErrorCapture, if it does not exist. It will create an extended event session, what_queries_are_failing, in which we capture any error above severity 10 and stream it out to a file in F:\XEventSessions. If you don't have an F: drive with that folder, you need to either update the script or talk to your storage vendor, as they will happily supply one. Finally, we turn on the trace/session.


USE tempdb;

--------------------------------------------------------------------------------
-- Create the table to store out the XE data
--------------------------------------------------------------------------------
IF NOT EXISTS
(
    SELECT
        *
    FROM
        sys.schemas AS S
        INNER JOIN
        sys.tables AS T
            ON T.schema_id = S.schema_id
    WHERE
        T.name = 'ExtendedEventErrorCapture'
        AND S.name = 'dbo'
)
BEGIN
    CREATE TABLE
        dbo.ExtendedEventErrorCapture
    (
        err_timestamp datetime2(7) NULL
    ,   err_severity bigint NULL
    ,   err_number bigint NULL
    ,   err_message nvarchar(512) NULL
    ,   sql_text nvarchar(MAX) NULL
    ,   event_data xml NULL
    ,   session_name sysname
    ,   username nvarchar(512)
    )
    -- This is only valid for a Developer/Enterprise edition license
    WITH (DATA_COMPRESSION = PAGE);
END

--------------------------------------------------------------------------------
-- Create the extended event to keep track of bad sql queries
--------------------------------------------------------------------------------
IF NOT EXISTS
(
    SELECT
        *
    FROM
        sys.server_event_sessions AS SES
    WHERE
        SES.name = N'what_queries_are_failing'
)
BEGIN
    -- Create an extended event session
    CREATE EVENT SESSION
        what_queries_are_failing
    ON SERVER
    ADD EVENT sqlserver.error_reported
    (
        ACTION (sqlserver.sql_text, sqlserver.tsql_stack, sqlserver.database_id, sqlserver.username)
        WHERE ([severity] > 10)
    )
    ADD TARGET package0.asynchronous_file_target
    (
        SET filename = 'F:\XEventSessions\what_queries_are_failing.xel'
        , metadatafile = 'F:\XEventSessions\what_queries_are_failing.xem'
        , max_file_size = 512
        , increment = 16
        , max_rollover_files = 5
    )
    WITH (MAX_DISPATCH_LATENCY = 5 SECONDS);
END

--------------------------------------------------------------------------------
-- Turn on the extended event
--------------------------------------------------------------------------------
IF NOT EXISTS
(
    -- When an XE session is active, there is an entry
    -- in sys.dm_xe_sessions
    SELECT
        *
    FROM
        sys.dm_xe_sessions AS DXS
        INNER JOIN
        sys.server_event_sessions AS SES
            ON SES.name = DXS.name
    WHERE
        SES.name = N'what_queries_are_failing'
)
BEGIN
    -- Start the session
    ALTER EVENT SESSION what_queries_are_failing
    ON SERVER STATE = START;
END

--------------------------------------------------------------------------------
-- Wait for errors
-- PROFIT!
--------------------------------------------------------------------------------

That's not so bad. Sure, it looks like a lot of code but you've got 1-2 lines of code to change (page compression and where to log to).

Where's the beef?

Once you think you've captured enough data, turn off the session and parse the files into the table. As I am not a demon robot, I am grateful to Kendra for writing the XQuery for me to parse the meaningful data out of the trace files. I am a developer, but not that kind of developer.


USE tempdb;

--------------------------------------------------------------------------------
-- Turn off our extended event
--------------------------------------------------------------------------------
IF EXISTS
(
    -- When an XE session is active, there is an entry
    -- in sys.dm_xe_sessions
    SELECT
        *
    FROM
        sys.dm_xe_sessions AS DXS
        INNER JOIN
        sys.server_event_sessions AS SES
            ON SES.name = DXS.name
    WHERE
        SES.name = N'what_queries_are_failing'
)
BEGIN
    -- Stop the session
    ALTER EVENT SESSION what_queries_are_failing
    ON SERVER STATE = STOP;
END

--------------------------------------------------------------------------------
-- Extract data from our XE
--------------------------------------------------------------------------------
;
WITH events_cte AS
(
    SELECT
        DATEADD(mi,
            DATEDIFF(mi, GETUTCDATE(), CURRENT_TIMESTAMP),
            xevents.event_data.value('(event/@timestamp)[1]', 'datetime2')) AS [err_timestamp]
    ,   xevents.event_data.value('(event/data[@name="severity"]/value)[1]', 'bigint') AS [err_severity]
    ,   xevents.event_data.value('(event/data[@name="error_number"]/value)[1]', 'bigint') AS [err_number]
    ,   xevents.event_data.value('(event/data[@name="message"]/value)[1]', 'nvarchar(512)') AS [err_message]
    ,   xevents.event_data.value('(event/action[@name="sql_text"]/value)[1]', 'nvarchar(max)') AS [sql_text]
    ,   xevents.event_data
    ,   xevents.event_data.value('(event/action[@name="username"]/value)[1]', 'nvarchar(512)') AS [username]
    ,   'what_queries_are_failing' AS session_name
    FROM
        sys.fn_xe_file_target_read_file
        (
            'F:\XEventSessions\what_queries_are_failing*.xel'
        ,   'F:\XEventSessions\what_queries_are_failing*.xem'
        ,   NULL
        ,   NULL
        ) AS fxe
        CROSS APPLY (SELECT CAST(event_data AS XML) AS event_data) AS xevents
)
INSERT INTO
    dbo.ExtendedEventErrorCapture
(
    err_timestamp
,   err_severity
,   err_number
,   err_message
,   sql_text
,   event_data
,   session_name
,   username
)
SELECT
    E.err_timestamp
,   E.err_severity
,   E.err_number
,   E.err_message
,   E.sql_text
,   E.event_data
,   E.session_name
,   E.username
FROM
    events_cte AS E;

Mischief managed

Clean up, clean up, everybody clean up! As our final step, since we're good citizens of this database, we remove our extended event session and, once that's done, leverage xp_cmdshell to delete the trace files.


USE tempdb;

--------------------------------------------------------------------------------
-- Get rid of our extended event
--------------------------------------------------------------------------------
IF EXISTS
(
    SELECT
        *
    FROM
        sys.server_event_sessions AS SES
    WHERE
        SES.name = N'what_queries_are_failing'
)
BEGIN
    -- Clean up your session from the server
    DROP EVENT SESSION what_queries_are_failing ON SERVER;
END

--------------------------------------------------------------------------------
-- Get rid of our extended event files only if the XE session is turned off
-- or no longer exists
--------------------------------------------------------------------------------
IF NOT EXISTS
(
    SELECT
        1
    FROM
        sys.dm_xe_sessions AS DXS
        INNER JOIN
        sys.server_event_sessions AS SES
            ON SES.name = DXS.name
    WHERE
        SES.name = N'what_queries_are_failing'

    UNION ALL

    SELECT
        1
    FROM
        sys.server_event_sessions AS SES
    WHERE
        SES.name = N'what_queries_are_failing'
)
BEGIN
    -- Assumes you've turned on xp_cmdshell
    EXECUTE sys.xp_cmdshell 'del F:\XEventSessions\what_queries_are_failing*.xe*';
END

Now what?

Query our table and find out what's going on. I use a query like the following to help me identify what errors we're getting, whether they're transient, etc.

SELECT
    EEEC.err_message
,   MIN(EEEC.err_timestamp) AS FirstSeen
,   MAX(EEEC.err_timestamp) AS LastSeen
,   COUNT_BIG(EEEC.err_timestamp) AS ErrorCount
FROM
    tempdb.dbo.ExtendedEventErrorCapture AS EEEC
GROUP BY
    EEEC.err_message
ORDER BY
    COUNT_BIG(1) DESC;

You've already seen the results, but I'm very happy to have found a use case for XE in my domain. In the next post, I'll wrap this implementation up with a nice little bow. Or a SQL Agent job. Definitely one of the two.
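For the impatient, the SQL Agent route looks roughly like this; the job, step, and schedule names are hypothetical, and dbo.HarvestExtendedEventErrors is an imagined procedure wrapping the harvest script above:

```sql
-- Hypothetical sketch of scheduling the harvest via SQL Agent
EXECUTE msdb.dbo.sp_add_job
    @job_name = N'Harvest XE error capture';

EXECUTE msdb.dbo.sp_add_jobstep
    @job_name = N'Harvest XE error capture'
,   @step_name = N'Parse XE files into table'
,   @subsystem = N'TSQL'
,   @database_name = N'tempdb'
,   @command = N'EXECUTE dbo.HarvestExtendedEventErrors;';

EXECUTE msdb.dbo.sp_add_jobschedule
    @job_name = N'Harvest XE error capture'
,   @name = N'Hourly'
,   @freq_type = 4            -- daily
,   @freq_interval = 1
,   @freq_subday_type = 8     -- repeat every N hours
,   @freq_subday_interval = 1;

EXECUTE msdb.dbo.sp_add_jobserver
    @job_name = N'Harvest XE error capture';
```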

Testing SQL Server Alerts


a.k.a. why the heck aren't my Alerts alerting?

I'm learning Power BI by building a sample dashboard for database operational analytics. One of the metrics we want to track is whether any SQL Alerts fired in the reporting time frame.

Seems reasonable enough, so I defined a few alerts on my machine and did some dumb things that should have fired them. And nothing fired. At first, I thought I must be looking in the wrong place, but I watched Profiler and the SSMS GUI was calling "EXECUTE msdb.dbo.sp_help_alert", which at its core reads "msdb.dbo.sysalerts". All of that looked right, but by golly, the last_occurrence_date fields all showed zeros.
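To see the same data without the GUI, you can ask msdb directly; a zero in the occurrence columns means the alert has never fired:

```sql
-- What the SSMS GUI runs under the covers
EXECUTE msdb.dbo.sp_help_alert;

-- Or hit the base table directly
SELECT
    SA.name
,   SA.last_occurrence_date   -- integer yyyymmdd; 0 = never fired
,   SA.last_occurrence_time   -- integer hhmmss
,   SA.occurrence_count
FROM
    msdb.dbo.sysalerts AS SA;
```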

I took to twitter asking what else I could do to invoke errors and y'all had some great ideas

  • https://twitter.com/crummel4/status/645764451724095489
  • https://twitter.com/SirFisch/status/645763600116617216
but the one that kept coming up was RAISERROR. There I can specify whatever severity I'd like, so here's my alert tester for severity 15

DECLARE
@DBID int
, @DBNAME nvarchar(128)
, @severity int;

SELECT
@DBID = DB_ID()
, @DBNAME = DB_NAME()
, @severity = 15;

RAISERROR
(
N'The current database ID is:%d, the database name is: %s.'
, @severity -- Severity.
, 1 -- State.
, @DBID -- First substitution argument.
, @DBNAME
);
I run it, verify SSMS shows The current database ID is:23, the database name is: tempdb. Msg 50000, Level 15, State 1, Line 29 and yet sp_help_alert shows ... 0 for occurrence date and time? What the heck? I raised an error of the appropriate severity and had SQL Alerts defined for that severity. It's peanut butter and chocolate, they go together, but this was more like mayonnaise and peanut butter.

A quick trip around the interblag raised some interesting possibilities. Maybe I needed to restart SQL Agent. Maybe I was tracking the wrong thing. There were some other links I have since closed, but none of these bore fruit.

Resolution

I scaled up my RAISERROR call to enumerate all the severity levels just to get something to work, and it wasn't until I hit 19 that I found my mistake: error severity levels greater than 18 can only be specified by members of the sysadmin role, using the WITH LOG option.

Once I tacked on the "WITH LOG" option, my alerts began firing.


DECLARE
    @DBID int
,   @DBNAME nvarchar(128)
,   @severity int;

SELECT
    @DBID = DB_ID()
,   @DBNAME = DB_NAME();

DECLARE Csr CURSOR
FOR
-- Sev 20 and above breaks the current connection
SELECT TOP 20
    CAST(ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS int) AS Severity
FROM
    sys.objects AS O;

OPEN Csr;

FETCH NEXT FROM Csr
INTO @severity;

WHILE (@@FETCH_STATUS = 0)
BEGIN

    RAISERROR
    (
        N'The current database ID is:%d, the database name is: %s.'
        , @severity -- Severity.
        , 1 -- State.
        , @DBID -- First substitution argument.
        , @DBNAME
    )
    WITH LOG;

    FETCH NEXT FROM Csr
    INTO @severity;
END

CLOSE Csr;
DEALLOCATE Csr;

Resources

Glenn Berry and Brent Ozar Unlimited both have great posts with some alerts you should have turned on for all of your servers.

SSISDB Delete all packages and environments


SSISDB tear down script

For my Summit 2015 presentation, 2014 SSIS Project Deployment Model: Deployment and Maintenance, I needed to revise my SSISDB tear-down script. When I first built it, it removed all the projects and then removed all the folders. Which was great, but as I've noted elsewhere, a folder can also contain Environments, and those too need to be accounted for. Otherwise, the catalog.delete_folder operation will fail because the folder is not empty.

Running the following code will remove everything in your SSISDB. This is the nuclear option so be certain you really want to clean house. You can uncomment the WHERE clause and selectively remove folders for a tactical nuclear option.

How it works is simple: I query catalog.folders to get a list of folders and then look in catalog.projects to find all the projects contained within each folder and delete those. I then repeat the process, but look in catalog.environments to identify and remove all the SSIS environments.


USE [SSISDB]
GO

DECLARE
    @folder_name nvarchar(128)
,   @project_name nvarchar(128)
,   @environment_name nvarchar(128);

DECLARE Csr CURSOR
READ_ONLY FOR
SELECT
    CF.name AS folder_name
FROM
    catalog.folders AS CF
--WHERE
--    CF.name IN ('')
;

OPEN Csr;
FETCH NEXT FROM Csr INTO
    @folder_name;
WHILE (@@fetch_status <> -1)
BEGIN
    IF (@@fetch_status <> -2)
    BEGIN

        -------------------------------------------------------------
        -- Drop any projects
        -------------------------------------------------------------
        DECLARE FCsr CURSOR
        READ_ONLY FOR
        SELECT
            CP.name AS project_name
        FROM
            catalog.projects AS CP
            INNER JOIN
            catalog.folders AS CF
                ON CF.folder_id = CP.folder_id
        WHERE
            CF.name = @folder_name;

        OPEN FCsr;
        FETCH NEXT FROM FCsr INTO
            @project_name;
        WHILE (@@FETCH_STATUS = 0)
        BEGIN
            EXECUTE catalog.delete_project
                @folder_name
            ,   @project_name;

            FETCH NEXT FROM FCsr INTO
                @project_name;
        END
        CLOSE FCsr;
        DEALLOCATE FCsr;

        -------------------------------------------------------------
        -- Drop any environments
        -------------------------------------------------------------
        DECLARE ECsr CURSOR
        READ_ONLY FOR
        SELECT
            E.name AS environment_name
        FROM
            catalog.environments AS E
            INNER JOIN
            catalog.folders AS CF
                ON CF.folder_id = E.folder_id
        WHERE
            CF.name = @folder_name;

        OPEN ECsr;
        FETCH NEXT FROM ECsr INTO
            @environment_name;
        WHILE (@@FETCH_STATUS = 0)
        BEGIN
            EXECUTE catalog.delete_environment
                @folder_name
            ,   @environment_name;

            FETCH NEXT FROM ECsr INTO
                @environment_name;
        END
        CLOSE ECsr;
        DEALLOCATE ECsr;

        -------------------------------------------------------------
        -- Finally, remove the folder
        -------------------------------------------------------------
        EXECUTE [catalog].[delete_folder]
            @folder_name;

    END
    FETCH NEXT FROM Csr INTO
        @folder_name;

END

CLOSE Csr;
DEALLOCATE Csr;

Caveat

The one thing I haven't investigated yet is cross-folder dependencies. Imagine folders Configurations and Projects. Configurations has an Environment called Settings. Projects has a project called AWShoppingCart, which has a reference to the environment Settings. I expect I will be able to delete the Configurations folder and the environment just fine, leaving the project AWShoppingCart broken until I reconfigure it. But the environment delete operation could just as easily fail if the reference count is non-zero.
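If you want to check before pulling the trigger, the catalog exposes those relationships; a sketch query to see which projects, in any folder, still reference which environments:

```sql
-- Map environment references back to the projects that hold them
SELECT
    CF.name AS project_folder_name
,   CP.name AS project_name
,   ER.environment_folder_name
,   ER.environment_name
FROM
    SSISDB.catalog.environment_references AS ER
    INNER JOIN
    SSISDB.catalog.projects AS CP
        ON CP.project_id = ER.project_id
    INNER JOIN
    SSISDB.catalog.folders AS CF
        ON CF.folder_id = CP.folder_id;
```

An empty result for a given environment would suggest it is safe to delete without breaking anything.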

Biml - Script Component Source


What is the Biml to create an SSIS Script Component source? This is a very simplistic demo, but you'll see the magic is distilled to two sections. The first is where we define the output buffer, the OutputBuffer element below. In this case, I specify that it is DemoOutput and then provide a Columns collection with a single column, SourceColumn.

The second bit of magic is in the CreateNewOutputRows method. There I use the buffer defined above to inject a single row with a value of "Demo". Nothing fancy; everything is static from a Biml perspective, but I needed to know the syntax before I could try something a little more advanced.

Biml Demo Script Component Source

Using this is a simple matter of adding a new Biml file into an existing SSIS project and pasting the following code. What results from right-clicking on the file and selecting Generate SSIS Packages is a single SSIS package, BasicScriptComponentSource, with a Data Flow Task "DFT Demo Source Component".

The data flow "DFT Demo Source Component" consists of our new Script Component, SCR Demo Source, and a Derived Column, DER Placeholder, so you can attach a data viewer if need be.

Use the following Biml to generate your package and feel free to tell me in the comments how you adapted it to solve a "real" problem.


<Biml xmlns="http://schemas.varigence.com/biml.xsd">
    <ScriptProjects>
        <ScriptComponentProject Name="SC_Demo">
            <AssemblyReferences>
                <AssemblyReference AssemblyPath="Microsoft.SqlServer.DTSPipelineWrap"/>
                <AssemblyReference AssemblyPath="Microsoft.SqlServer.DTSRuntimeWrap"/>
                <AssemblyReference AssemblyPath="Microsoft.SqlServer.PipelineHost"/>
                <AssemblyReference AssemblyPath="Microsoft.SqlServer.TxScript"/>
                <AssemblyReference AssemblyPath="System.dll"/>
                <AssemblyReference AssemblyPath="System.AddIn.dll"/>
                <AssemblyReference AssemblyPath="System.Data.dll"/>
                <AssemblyReference AssemblyPath="System.Xml.dll"/>
            </AssemblyReferences>
            <OutputBuffers>
                <!--
                    Define what your buffer is called and what it looks like.
                    Must set IsSynchronous to false. Otherwise it is a transformation
                    (one row enters, one row leaves) and not a source.
                -->
                <OutputBuffer Name="DemoOutput" IsSynchronous="false">
                    <Columns>
                        <Column Name="SourceColumn" DataType="String" Length="50"/>
                    </Columns>
                </OutputBuffer>
            </OutputBuffers>
            <Files>
                <File Path="Properties\AssemblyInfo.cs">
using System.Reflection;
using System.Runtime.CompilerServices;
[assembly: AssemblyTitle("SC_Demo")]
[assembly: AssemblyDescription("Demonstrate Script Component as source")]
[assembly: AssemblyConfiguration("")]
[assembly: AssemblyCompany("billinkc")]
[assembly: AssemblyProduct("SC_Demo")]
[assembly: AssemblyCopyright("Copyright @ 2015")]
[assembly: AssemblyTrademark("")]
[assembly: AssemblyCulture("")]
[assembly: AssemblyVersion("1.0.*")]
                </File>
                <File Path="main.cs">
                    <![CDATA[
using System;
using System.Data;
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
using Microsoft.SqlServer.Dts.Runtime.Wrapper;

/// <summary>
/// Demonstrate how to generate a Script Component source in SSIS
/// </summary>
[Microsoft.SqlServer.Dts.Pipeline.SSISScriptComponentEntryPointAttribute]
public class ScriptMain : UserComponent
{
    public override void CreateNewOutputRows()
    {
        DemoOutputBuffer.AddRow();
        DemoOutputBuffer.SourceColumn = "Demo";
    }
}
]]>
                </File>
            </Files>
        </ScriptComponentProject>
    </ScriptProjects>

    <Packages>
        <Package Name="BasicScriptComponentSource" ConstraintMode="Linear">
            <Tasks>
                <Dataflow Name="DFT Demo Source Component">
                    <Transformations>
                        <ScriptComponentSource Name="SCR Demo Source">
                            <ScriptComponentProjectReference ScriptComponentProjectName="SC_Demo"/>
                        </ScriptComponentSource>
                        <DerivedColumns Name="DER Placeholder"/>
                    </Transformations>
                </Dataflow>
            </Tasks>
        </Package>
    </Packages>
</Biml>
