Saturday, 24 December 2016

Working with Precedence Constraints in SQL Server Integration Services

Precedence Constraints are useful to define workflow in a sequential manner between executables, containers, and tasks on the Control Flow tab; they can also be used to define sequential workflow in Event Handlers in SQL Server Integration Services.

In simple words, Precedence Constraints are useful to perform the data movement in a sequential manner based on certain business requirements.

You can design the required workflow in your SSIS package by using the Precedence Constraint Editor, a user interface with simple configurable options (Figure 1).
(Figure 1)

As you can see in the above screenshot (Figure 1), Precedence Constraints can be configured through two options: Constraint Options and Logical Operations. We will look at both of them in detail in the explanation below.

Workflow by Constraint Options

There are four ways to configure this section: Constraint, Expression, Expression and Constraint, and Expression or Constraint.

1. Constraint

If the business requirement is to run the next executable based on a constraint alone, then the value can be configured as Success, Failure, or Completion (Figure 1).

Success: The current executable has to run successfully; only then will the next executable start running.

Failure: The next executable starts only when the current executable has failed. This constraint can be useful to capture a log for a failed execution.

Completion: With this constraint, the next executable starts running whether the previously linked executable completed successfully or not.

2. Expression

If the next executable needs to run based on an expression evaluating to true, you can select this option. It is fairly rare in terms of business requirements, but it lets you gate the next executable on the result of an expression. If you choose this option, you can see in the screenshot (Figure 2) that the Value section is disabled while a text box is enabled for writing the expression.
(Figure 2)
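The expression is written in the SSIS expression language and must evaluate to a Boolean. A minimal sketch, assuming a package variable named @[User::FileCount] (the variable name is illustrative, not from a real package):

```
@[User::FileCount] > 0
```

With this expression on the constraint, the next executable runs only when the variable holds a value greater than zero at the time the constraint is evaluated.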

3. Expression and Constraint

This is the most common business scenario: you select the constraint value Success and also evaluate an expression. The next executable is allowed to run only if both the constraint and the expression evaluate to true. When you choose the Expression and Constraint option in the Precedence Constraint Editor, both the Value and Expression boxes are enabled (Figure 3).
(Figure 3)

4. Expression or Constraint

This option allows the next executable to run if either the constraint or the expression evaluates to true. At least one of the two must evaluate to true; only then will the next executable run.

Workflow by Logical Operations

As already mentioned, two logical operations are available in the Precedence Constraint Editor. Below is a step-by-step explanation of both.

1. Workflow by 'Logical And'

This is the default option in the Precedence Constraint Editor, and the green arrow of the Precedence Constraint is drawn solid. If Logical And is selected, the results from all incoming executables must evaluate to true.

2. Workflow by 'Logical Or'

In some business scenarios you may need to select this option, in which case the Precedence Constraint arrow becomes dotted (Figure 4). If this option is selected, the next executable runs as soon as any one of its incoming executables or containers satisfies its constraint.

For your better understanding, let's take an example (Figure 4) below -
(Figure 4)

Consider a business scenario for a college (Figure 4), in which a project table is filled for the particular department that received the maximum number of projects submitted for the current day. The basic workflow is defined as below -

1. Check the @RowCount (EST 1) of the project table to see whether it has already been filled for the department that received the maximum number of projects for the current day.

2. There are two conditions as below -

Condition 1 - If @RowCount is equal to zero, find the department name from all project submissions that has the maximum number of projects submitted for the day (EST 2).

Condition 2 - If @RowCount is greater than zero, truncate the project table (EST 3) and find the department name from all the project submissions that has the second-largest number of submissions for the day (EST 4).

3. Fill the data into the project table using a Data Flow Task (DFT 1). Here the Data Flow Task executes on the basis of input from either EST 2 or EST 4 (Logical Or, dotted arrows).

4. There are two Send Mail Tasks, for the success and failure of the Data Flow Task (DFT 1) respectively. The Send Mail Task connected with the green arrow runs only after a successful run of DFT 1, and the Send Mail Task connected with the red arrow runs only if DFT 1 failed for some reason.

5. Start the backup utility after the completion of the Data Flow Task (DFT 1). This is an example of the Completion constraint: the backup utility has to run for logging purposes whether DFT 1 executed successfully or not.
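The two conditions in step 2 translate directly into precedence-constraint expressions (using the Expression and Constraint option with the Success value). A minimal sketch, assuming the row count from EST 1 is stored in a package variable named @[User::RowCount] (the variable name is illustrative):

```
EST 1 -> EST 2 :  @[User::RowCount] == 0
EST 1 -> EST 3 :  @[User::RowCount] > 0
```

Because the two expressions are mutually exclusive, exactly one of the two branches runs on each execution.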

Tuesday, 23 February 2016

How to check if a SQL Server Agent job is running

I was getting the error "SQLServerAgent Error: Request to run job <Job name> from User <User name> refused because the job already has a pending request from User <other User name>."

To avoid the above error, always check the running status of a SQL job before starting it, to see whether it is already running. It might also already have a pending execution request from some other user. The script below helps check the status of any SQL job.

The below T-SQL script is an IF condition that identifies the job status by the SQL job name. The condition is true only if the job is not currently running.

To check the running status of the job, make an inner join between msdb.dbo.sysjobs_view and msdb.dbo.sysjobactivity:

    IF NOT EXISTS (
        select 1
        from msdb.dbo.sysjobs_view job
        inner join msdb.dbo.sysjobactivity activity
            on job.job_id = activity.job_id
        where activity.run_requested_date is not null
        and activity.stop_execution_date is null
        and job.name = '<SQL Job name>'
    )
    EXEC msdb.dbo.sp_start_job '<SQL Job name>';

I hope the above script helps you find the running status of any SQL job.

Download a data file from a claims-based authentication SharePoint 2010 or SharePoint 2013 site programmatically using the C# programming language.

It was required to download a few data files from a SharePoint site that uses claims-based authentication. We can implement this feature in a Script Task using the C# programming language. The Script Task below is placed inside a ForEach Loop container in the SSIS package; the container holds all the file names and passes them to the Script Task one by one. The script has the below parameters to fulfill this requirement -

1. tFileName - Name of the file which will be downloaded from SharePoint site (ADFS claim based authentication).
2. tSPSiteRoot - Folder location of SharePoint site where all files are existing.
3. tLoginId - Login id, which will be user id or user name for SharePoint authentication.
4. tLoginPwd - Password, which will be used for SharePoint authentication.
5. tDomain - Domain name, which will be used for SharePoint authentication.

The script needs the following namespaces:

using System;
using System.Net;   //WebClient and NetworkCredential classes

Once you have set up all the parameters and namespaces, go through the C# code below -

Script Task body

// TODO: Add your code here 
//Initialize or get all the parameter values
//********************************************************************
//Assign the local variables from package variables
//File name in SharePoint site, it is only the file name
string tFileName = Dts.Variables["User::strFileName"].Value.ToString();
//SharePoint site root path, it is folder location where files are existing.
string tSPSiteRoot = Dts.Variables["User::str_SPSite_Root"].Value.ToString();
//Login id, it will be used to login to the above sharepoint site
string tLoginId = Dts.Variables["User::strLoginId"].Value.ToString();
//Login Password, it will be used to login to the above sharepoint site
string tLoginPwd = Dts.Variables["User::strLoginPwd"].Value.ToString();
//Domain for Login, it will be used to login to the above sharepoint site
string tDomain = Dts.Variables["User::strDomain"].Value.ToString();

//Create an instance of the WebClient class; it will be used to perform
//the download operations in the C# code below
WebClient tWebClient = new WebClient();

//The header below is important to add; without it, the file download may fail
//Include the appropriate header for the web client object
tWebClient.Headers.Add("X-FORMS_BASED_AUTH_ACCEPTED", "f");

//Create the fully qualified URL of the file in the SharePoint site by
//concatenating the SharePoint folder location and the file name
string tRemoteFileName = tSPSiteRoot + tFileName;

//Get a folder location on the local system to which the file will be downloaded
//from SharePoint; any writable location on the local hard drive can be used
string LocalFileName = Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData) + "/" + tFileName;

//Authenticate against SharePoint ADFS claims-based authentication using the
//user name, password, and domain supplied through the parameters above
tWebClient.Credentials = new NetworkCredential(tLoginId, tLoginPwd, tDomain);

//Download the file into the local folder. DownloadFile is a method of the WebClient
//class; it takes the fully qualified SharePoint file path and the local file path

tWebClient.DownloadFile(tRemoteFileName, LocalFileName);

//Store the local path of the downloaded file in an SSIS variable so it can be used
//later in the package, e.g. to move the file to a location such as a shared folder
Dts.Variables["User::strLocalFileName"].Value = LocalFileName;

The above piece of code downloads the file into a local system folder, but for project purposes it may be required to move the downloaded file to another appropriate location, so in the last step of the code the local download path of the file is stored in an SSIS package variable.

I hope this helps you download any number of files from a SharePoint site using ADFS (Active Directory Federation Services) claims-based authentication. Thanks and good luck.

Tuesday, 9 February 2016

SSIS - How To Load Multiple Files (.txt, .csv, or Excel) To a Table in SQL Server?

Data upload from multiple files formats into one destination table using SSIS.

Most of the time in SSIS, the data upload process is designed for a single file format, but how do you manage the upload when files of multiple formats exist in one folder? In this situation we need SSIS logic that makes the upload process dynamic enough to take care of the different file formats.

Let's assume a scenario in which a folder contains data files of different formats: Excel, CSV, and text files. This scenario can be achieved by the following approaches.

Approach 1 - Keep multiple Data Flow Tasks inside a single ForEach Loop container.

Maintain multiple Data Flow Tasks inside the ForEach Loop container, with each Data Flow Task handling one particular file format.

In the above screenshot, three Data Flow Tasks are used to handle the three kinds of files: Excel, CSV, and text. The flow to the appropriate Data Flow Task is controlled by the expressions written in the precedence constraints.
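As a sketch of those precedence-constraint expressions, assuming the ForEach Loop container maps the current file name into a variable named @[User::FileName] (the variable name is illustrative), each constraint could test the file extension:

```
CSV files:   RIGHT(LOWER(@[User::FileName]), 4) == ".csv"
Text files:  RIGHT(LOWER(@[User::FileName]), 4) == ".txt"
Excel files: RIGHT(LOWER(@[User::FileName]), 5) == ".xlsx"
```

Each Data Flow Task then receives only the files whose extension matches its expression.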

Approach 2 - Maintain only one Data Flow Task and create multiple data sources based on the available file types.
Example - There will be three types of data sources: for Excel, CSV, and text files.

Approach 3 - Use a separate ForEach Loop container for each format, with the loops executing in parallel.
Example - All the Excel files are processed through one ForEach loop; in the same manner, all the CSV files are processed through a second loop and all the text files through a third.