Tuesday, November 1, 2016

All About Try and Catch



Loading log files (Error, Trace, Monitor) into a table


Sometimes we need to write error details (such as the error number and description) into a table.
Let us say a job is being executed and an error occurs in the middle of the execution; the error details are written to the error log file. Now I want to capture those error details and preserve them in a backend table.
Are there any metadata tables that hold this information? I didn't find any. I looked at the metadata table AL_HISTORY_INFO; it holds the path of the log files but not the error values (error number, description, etc.). One solution is to use the get_error_filename() environment function, which returns the current job's error log filename along with its complete path.
Method 1:

Here is the approach: I have a simple dataflow that loads an inventory details Excel sheet into a target table. I'm purposely generating an error by pointing the reader to a wrong file name.

 
 
In the catch block, I have a script that gets the error filename, monitor filename, and trace filename along with their path. In the next step I placed a workflow, inside which I designed a dataflow to load each log file.

 
In the script, I'm using environment functions to get the respective filenames, passing those values into three global variables, and then trimming the path portion from one of the filenames (a sketch of this script follows the list below).
$PATH ------- entire path of the log files
$GV_ERR_FILENAME ------- error log filename
$GV_MTR_FILENAME ------ monitor log filename
$GV_TRC_FILENAME ------- trace log filename
Define these global variables at the job level.
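As a rough sketch (the exact path-trimming expression and the path separator depend on your job server's operating system), the catch script could look like this, using the global variable names above:

$GV_ERR_FILENAME = get_error_filename();
$GV_MTR_FILENAME = get_monitor_filename();
$GV_TRC_FILENAME = get_trace_filename();

# Directory holding the logs: strip the trailing file name from one of the full paths.
# ('/' is assumed here; use the separator that matches your job server.)
$PATH = replace_substr($GV_ERR_FILENAME, word_ext($GV_ERR_FILENAME, -1, '/'), '');

# Reduce the variables to bare file names so each flat file format can combine
# $PATH and the file name at read time.
$GV_ERR_FILENAME = word_ext($GV_ERR_FILENAME, -1, '/');
$GV_MTR_FILENAME = word_ext($GV_MTR_FILENAME, -1, '/');
$GV_TRC_FILENAME = word_ext($GV_TRC_FILENAME, -1, '/');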
  

I have defined fixed-width flat file formats for each log file.
Flat file format for error log

 
Create similar file format definitions for the monitor log and trace log. After defining these file formats, design a dataflow to load each file.
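For orientation only (this is an assumption about the log layout; inspect a real error log from your job server before fixing the column widths), a Data Services error log line generally carries the engine release, date, time, severity, process/thread IDs, an error code, and the message text, roughly like:

(14.2) 11-01-16 10:23:45 (E) (1234:5678) FIL-080101: |Data flow DF_LOAD_INVENTORY| Cannot open file <...>

The fixed-width format then needs one column per field, with the message as the last and widest column.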
 

  

 
Post-execution steps: check the trace log window and the error log window.
 

  

Method 2:


Step 1: Create a job and name it ‘ERROR_LOG_JOB’.

Step 2: Declare the following four global variables at the job level. Refer to the screenshot below for the names and data types.
GV.png
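As an illustration only (pick whatever names you prefer; the names below are assumptions reused in the sketches later in this post), the four variables could be:

$GV_ERROR_NUMBER      int
$GV_ERROR_CONTEXT     varchar(512)
$GV_ERROR_MESSAGE     varchar(512)
$GV_ERROR_TIMESTAMP   varchar(512)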
Step 3: Drag a try block, a dataflow, and a catch block into the work area and connect them as shown in the diagram below. Inside the dataflow you can use any existing table in your repository as a source and populate a few columns into a target table. Make sure the target table is a permanent table. This is just for the demo.
Job.png

Step 4: Open the catch block, drag one script inside it, and name it as shown in the diagram below.
Catch Inside.png

Step 5: Open the script and write the code inside, as shown in the diagram below.
Script Msg.png

The above script populates the global variables using built-in BODS functions and then calls a custom function to log the errors into a permanent table. This function does not exist yet; we will create it in later steps.
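A minimal sketch of such a catch script, assuming the global variable names from Step 2 and a custom function named CF_ERROR_LOG (both names are placeholders; the function itself is created in Steps 6 and 7):

# Built-in functions that are valid inside a catch block and return
# details of the error that triggered it.
$GV_ERROR_NUMBER    = error_number();
$GV_ERROR_CONTEXT   = error_context();
$GV_ERROR_MESSAGE   = error_message();
$GV_ERROR_TIMESTAMP = error_timestamp();

print('Dataflow failed; writing error details to ERROR_LOG');

# Custom function that inserts the four values into the ERROR_LOG table.
CF_ERROR_LOG($GV_ERROR_NUMBER, $GV_ERROR_CONTEXT, $GV_ERROR_MESSAGE, $GV_ERROR_TIMESTAMP);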

Step 6: Go to the Custom Functions section of your repository, create a new custom function, and name it as shown below.

Fun sec.png

Step 7: Click Next in the above dialog box and write the code inside the function. Declare the parameters and local variables as shown in the editor below, keeping their data types the same as those of the global variables from Step 2. Validate the function and save it.

Fun.png
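The function body is shown only as a screenshot; a sketch of what it could contain, assuming input parameters named $P_ERROR_NUMBER, $P_ERROR_CONTEXT, $P_ERROR_MESSAGE, and $P_ERROR_TIMESTAMP and the ETL_CTRL datastore described in the next step:

# Insert one row into the ERROR_LOG table through the ETL_CTRL datastore.
# In the sql() function, [$var] substitutes the raw value and {$var} substitutes
# the value wrapped in quotes.
sql('ETL_CTRL',
    'INSERT INTO ERROR_LOG (ERROR_NUMBER, ERROR_CONTEXT, ERROR_MESSAGE, ERROR_TIMESTAMP) '
 || 'VALUES ([$P_ERROR_NUMBER], {$P_ERROR_CONTEXT}, {$P_ERROR_MESSAGE}, {$P_ERROR_TIMESTAMP})');

# Custom functions return an int by default.
Return 0;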

Step 8: Now your function is ready to use. Assuming SQL Server is the database where you want to capture these errors, create a table to store the information.
CREATE TABLE [dbo].[ERROR_LOG](
      [SEQ_NO] [int] IDENTITY(1,1) NOT NULL,
      [ERROR_NUMBER] [int] NULL,
      [ERROR_CONTEXT] [varchar](512) NULL,
      [ERROR_MESSAGE] [varchar](512) NULL,
      [ERROR_TIMESTAMP] [varchar](512) NULL
)
You may change the datastore as per your requirement. I have used ETL_CTRL as the datastore in the above function; it is connected to the SQL Server database where the above table is created.
Step 9: Just to make sure the dataflow fails, we will force it to throw an error at run time. Inside your dataflow, use a permanent target table. Double-click the target table and add one line of text below the existing comment under the Load Triggers tab. Refer to the screenshot below. This is one way to make a dataflow throw an error at run time.

Tbl Error.png

Step 10: Now your job is ready to execute. Save and execute the job. You should see an error message in the monitor log. Open the table in your database and check whether the error log information has been populated. The error log should look as shown below.
Monitor Log.png

The ERROR_LOG table should capture the same error message, as shown below.

Error Result.png
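To double-check from the database side, a simple query against the table (SQL Server) is enough:

SELECT SEQ_NO, ERROR_NUMBER, ERROR_CONTEXT, ERROR_MESSAGE, ERROR_TIMESTAMP
FROM dbo.ERROR_LOG
ORDER BY SEQ_NO DESC;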
