Pentaho Tools :

Pentaho C-Tools (CDE, CDF, CDA), Pentaho CE & EE Server, OLAP Cubes, Analysis using Pivot4J, Saiku Analytics, Saiku Reporting, Ad-hoc Reporting using the Interactive Reporting tool, Dashboards, Reports using PRD, PDD, Data Integration using Kettle ETL, Data Mining using WEKA, Integration of Servers with Databases, Mobile/iPad compatible Dashboards using Bootstrap CSS, Drill-down Dashboards, Interactive Dashboards

Thursday, 31 December 2015

"Failed - Virus Scan Failed" download error in Google Chrome - how to fix it on a Windows machine?

Follow the steps below to fix the issue:

1) Click the Start menu --> in the search box type "run" --> in the Run box type regedit
2) Navigate to the Attachments key as shown below


3) In the right-side window, double-click the ScanWithAntiVirus option.
4) In the "Edit DWORD (32-bit) Value" window, change the Value data from 3 to 1.
5) Close the window, restart Chrome if it is already running, and try downloading again.
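For reference, the same change can also be applied by importing a .reg file instead of clicking through regedit. This is a sketch under the assumption that the value lives in the standard Attachments policy key from step 2; back up your registry before importing:

```reg
Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Policies\Attachments]
"ScanWithAntiVirus"=dword:00000001
```

Save the lines above as fix-scan.reg and double-click it to import, then restart Chrome.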

:-) :-)

A video tutorial is available at

Wednesday, 23 December 2015

Tip : Difference between Text File Input and CSV File Input steps in Pentaho Data Integration (aka Kettle ETL)

Hi Folks,

A good question pushes us deeper into a subject, doesn't it? Recently, one of my community folks asked about the differences between Text File Input & CSV File Input, and which one to prefer if you are given a CSV file.
Hmmm..!!! I tried to explain some vague stuff, but later in the evening I had a quick look into the differences. (An example speaks more than theory when understanding concepts.)

Here we go! The Pentaho wiki has detailed information about these two steps at

Text File Input :
CSV File Input :

Here is my quick understanding of the concepts, which you may find useful.

1) The "Text File Input" step can read a variety of files such as text, CSV, and fixed-length files, whereas the "CSV File Input" step can only read .csv files.

2) The "Text File Input" step can add file information such as file name, extension, and file URI (file directory, etc.) to the output, which you can't do with the CSV File Input step.

3) What if you are given multiple CSV files to read at a time? The Text File Input step can read multiple files, whereas the CSV File Input step takes only one file.

4) The CSV File Input step is effectively a simplified subset of the Text File Input step.

5) There are a few advantages of CSV File Input over Text File Input:
* Use the CSV File Input step if you are given a single file to process at a time (for instance, a lookup file).
* The CSV File Input step processes data faster, which improves transformation performance.
* It supports NIO (non-blocking input/output via native system calls) with a configurable read buffer size.
* The buffer size is the number of bytes read from disk in one go.
* Parallel running: if you configure this step to run in multiple copies or in clustered mode, and you enable parallel running, each copy will read a separate block of a single file, allowing you to distribute the file reading over several threads or even several slave nodes in a clustered environment.

* Lazy conversion: if you will be reading many fields from the file and many of those fields will not be manipulated, but merely passed through the transformation to land in some other text file or a database, lazy conversion can prevent Kettle from performing unnecessary work on those fields, such as converting them into objects like strings, dates, or numbers.
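Kettle handles all of this inside the steps, but the multi-file and pass-through ideas can be sketched outside the tool. Below is a minimal Python illustration (the file names and data are invented) of reading several CSV files in one pass while keeping fields as raw strings and attaching the file name to each row - roughly what Text File Input's wildcard reading and additional file fields give you, and the kind of eager parsing that lazy conversion avoids:

```python
import csv
import glob
import os
import tempfile

# Stand-in input files for the "multiple files at a time" case.
workdir = tempfile.mkdtemp()
for name, rows in [("a.csv", [["1", "alice"]]), ("b.csv", [["2", "bob"]])]:
    with open(os.path.join(workdir, name), "w", newline="") as f:
        csv.writer(f).writerows([["id", "name"]] + rows)

# Read every matching file in one pass, like Text File Input with a wildcard.
# Fields stay raw strings (no int/date parsing) - the lazy conversion idea:
# don't pay for converting fields you only pass through.
records = []
for path in sorted(glob.glob(os.path.join(workdir, "*.csv"))):
    with open(path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)  # skip the header row of each file
        for row in reader:
            # Text File Input can also add file metadata (name, URI) per row.
            records.append(dict(zip(header, row), filename=os.path.basename(path)))

print(records)
```

The same loop over `glob` results is what the "multiple files" capability boils down to; the CSV File Input step trades this flexibility for raw speed on a single file.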

Example : 

Download Example here : 
Click me .!!!
Change the file paths to your system paths.

More information on NIO is at:

How to create a sample fixed-length file using Excel?
create a fixed length text file from excel

 References :

Monday, 9 November 2015

Pentaho CDE useful links

•    jQuery
•    jQuery UI
•    jQuery i18n
•    Mustache
•    Backbone
•    Underscore
•    Moment
•    Require

Layout related:
•    Modernizr
•    Bootstrap
•    Blueprint
•    Font Awesome

•    Raphael
•    Protovis
•    DataTables
•    Community Charts Components (CCC)
•    Select2
•    Chosen
•    Sparklines
•    Fancybox

Pentaho Data Integration - 8 - (DWH) - Change Data Capture (CDC) or incremental loading of a dimension table - sample example


This post will walk a beginner through incremental loading of data from an OLTP (transactional) database table into an OLAP (data warehouse) dimension table.

Software used for this example :
1) Pentaho Data Integration CE 5.4 (or 6.0 CE)
2) PostgreSQL  CE

Source data:
create database OLTP;
create table customer(c_id int not null primary key,c_name text,c_location text);

insert into customer values(1,'sadakar','hyderabad');

insert into customer values(2,'hasini','mncl');
insert into customer values(3,'anji','banglore');
insert into customer values(4,'venkat','chenni');
insert into customer values(5,'anjali','karimnagar');

select * from customer;

c_id c_name  c_location

Scenario  :
Let's say a new record is inserted into the customer table above; when the ETL transformation/job runs, only the newly inserted record should be loaded into the customer dimension table in the warehouse.
For example, if 6;"soujanya";"banglore" is inserted into the table above, only this record should be loaded into the customer dimension table.

Target :  
create database OLAP;
CREATE TABLE dim_customer
( customer_id INT NOT NULL PRIMARY KEY
, customer_name TEXT
, customer_location TEXT
);
You can create the dimension table above manually in the database, or generate the script using the Table Output step.

Steps : 
1) Create two database connections (one for OLTP and the other for OLAP) in Kettle.
2) Drag and drop a "Table Input" step and write the query below against the OLTP database.
     SELECT * FROM customer
3) Drag and drop a "Table Output" step and fill in all the properties in its window.
    database name : OLAP , table name : dim_customer 
   Check "Specify database fields"
   Table field      Stream field
   customer_id      c_id
   customer_name  c_name
   customer_location  c_location
4) ETL 1st run : save the transformation and run it.
    Observe the Step Metrics tab in the "Execution Results" tab (5 rows are read and 5 rows are written).

dim_customer table output after 1st ETL run
customer_id customer_name customer_location 

5) Now, drag and drop another Table Input step to find the maximum customer id from the "dim_customer" table using the query below.
SELECT COALESCE(max(customer_id),0) FROM dim_customer

6) Connect this step to the source Table Input step as shown below.

7) Edit the source Table Input step - i.e., set "Insert data from step" to the step above.
Also, importantly, change the SQL from
     SELECT * FROM customer to
      SELECT * FROM customer WHERE c_id > ?

8) ETL 2nd Run : Save the transformation and run the job. 

In the above case no new record was inserted into the OLTP table, hence no new records are loaded into the dimension table. You can see this in the Step Metrics in the figure above.

dim_customer table output after the 2nd run of the ETL
customer_id customer_name customer_location 

9) Now, insert a few new records into the OLTP customer table.
Let's say:
insert into customer values(6,'soujanya','banglore');
insert into customer values(7,'kumar','bellampally');
insert into customer values(8,'srinivas','somagudem');
insert into customer values(9,'srinandan','warangal');

SELECT * from customer
c_id c_name c_location

i.e., rows 6 to 9 are the newly inserted rows in the OLTP customer table.

Now when you run the ETL, only rows with c_id greater than 5 (i.e., from 6 onwards) should be loaded.

10) ETL 3rd run - insert the newly added OLTP rows into the OLAP dim_customer table

Observe that the Table Input step has read the 4 records which were not yet available in dim_customer, and loaded only these new rows.

dim_customer table after 3rd ETL run 
customer_id customer_name customer_location

The same procedure can be applied to dates.
 # If your OLTP table has a CREATED_ON column, find max(CREATED_ON) instead of the max ID.
 # You can also use the combination of ID and CREATED_ON for incremental loading.

Download sample .ktr file :  Click Me.!!

References : 

Table Input 1 (Select MAX(Insert Timestamp) from DestinationTable ) ->
Table Input 2 ( Select * from SourceTable where Insert Timestamp > ? ; Insert Data from Table Input 1) ->
 Table Output (Insert Rows into DestinationTable)

1) Table Input: Select MAX(ID) from LocalTable
2) Table Input: Select * from RemoteTable where ID>? (Insert Data from 1)
3) Table Output: All Fields into LocalTable 
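The three-step pattern above can be tried end to end without Kettle. Here is a small Python/sqlite3 sketch (table and column names follow the post; the sample data is abbreviated, and one in-memory connection stands in for both OLTP and OLAP) of "read MAX(id) from the target, then pull and insert only newer source rows":

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # one connection stands in for OLTP + OLAP
conn.execute("CREATE TABLE customer (c_id INTEGER PRIMARY KEY, c_name TEXT, c_location TEXT)")
conn.execute("""CREATE TABLE dim_customer
                (customer_id INTEGER PRIMARY KEY, customer_name TEXT, customer_location TEXT)""")
conn.executemany("INSERT INTO customer VALUES (?,?,?)",
                 [(1, "sadakar", "hyderabad"), (2, "hasini", "mncl")])

def etl_run(conn):
    # Step 1: highest key already loaded (0 on the very first run).
    (max_id,) = conn.execute(
        "SELECT COALESCE(MAX(customer_id), 0) FROM dim_customer").fetchone()
    # Steps 2 + 3: pull only newer rows and insert them into the dimension.
    conn.execute("""INSERT INTO dim_customer
                    SELECT c_id, c_name, c_location FROM customer WHERE c_id > ?""",
                 (max_id,))
    conn.commit()

etl_run(conn)                                              # 1st run loads both rows
conn.execute("INSERT INTO customer VALUES (3, 'anji', 'banglore')")
etl_run(conn)                                              # 2nd run loads only c_id = 3
print(conn.execute("SELECT COUNT(*) FROM dim_customer").fetchone()[0])  # -> 3
```

In Kettle, `etl_run` is exactly the three steps: the COALESCE query is Table Input 1, the parameterized SELECT is Table Input 2 with "Insert data from step", and the INSERT is the Table Output.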

I hope this helps someone! Please do share if you find this post useful.


Sunday, 30 August 2015

TIP : Copy files from Windows10 Host to Ubuntu 15.04 Guest installed in VMWare

Hi Folks,

Follow the steps below to copy files from the Windows host machine to the Ubuntu guest installed in VMware.

Software Used
1) Windows 10 as Host
2) Ubuntu 15.04 server(not desktop Ubuntu) as Guest
3) VMWare Work Station 10.01
4) WinScp 5.7.5
5) Internet is on Wi-Fi

1) Install the OpenSSH server from the Ubuntu terminal
     Example : sudo apt-get install openssh-server

2) Download WinSCP in Windows
    Download Link :

3)  Check VMware internet connection
     #  VM-> Settings ->Virtual Machine Settings window will pop up
     # Select Network Adapter (NAT) and check NAT option in Network connection Tab.

4) Find the ipaddress using ifconfig on Ubuntu terminal
   Example :

5) Note down the sudo su username and password of Ubuntu
   Example : User name : hasini and password is hasini

6) Open WinSCP and click on Advanced settings
     On the left side, go to the Environment tab -> Shell
     On the right side, in the Shell property, choose sudo su -

7) Now, provide the credentials of the Ubuntu server in WinSCP.
(If you don't follow this, you will get a "can't copy" error.)

8) Click Login to connect to the Ubuntu server, and happy file copying.

  Testing : Copy a file from windows directory to ubuntu directory

 Before copying  a file: 

After copying : Drag and drop the file (In above image point 2 to point 4)

NOTE : Alternative way to copy files
1) Copy the files to the tmp folder of Ubuntu.
2) Go to the tmp folder and, using the mv command, move the files to the target directory in Ubuntu.

:-) :-)

Did you find it useful? Please share it with community folks like us!

Reference :

Tuesday, 25 August 2015

Upgrade Ubuntu Server 14.04 to 15.04 - in VMWare Work Station 10.0.1 build-1379776

Follow the steps below to upgrade Ubuntu Server (not the desktop) from 14.04 to 15.04. (do-release-upgrade moves one release at a time, so the first upgrade is to 14.10.)

Software I used :
Host   :   Windows -10
Guest : 14.04 server installed in VMWare Work Station.
Virtual Machine : VMWare 10.01

1) Check whether update-manager-core is available from the server command line.

2) If it is not available, install it
   command :
       sudo apt-get install update-manager-core

3) Change Prompt=lts to Prompt=normal in the release-upgrades file

4) File location : /etc/update-manager/release-upgrades
     Navigate to this directory and make the change from point 3.

5) Save the file and run the command below to upgrade
    Upgrade command : sudo do-release-upgrade

6) It will check for new releases and, if one is available, the upgrade starts

Thanks. Did you find it useful? If yes, please share it.

Click on Image for best view of the content 

References : 

:-) :-)

Thursday, 20 August 2015

Basics of writing AND and OR conditions in Filter rows Step in Pentaho Data Integration (aka Kettle ETL)

This post is intended for beginners of Kettle ETL and covers writing AND and OR conditions in the Filter Rows step.

Let us take this problem statement (taken from the Pentaho community forums).

Reference :

Target 1 will have records where Salary < 50000 and Dept_id = 10 or 20,
Target 2 will have records where Salary > 50000 and Dept_id = 30 or 40, and
Target 3 will have the rows which don't satisfy the Target 1 and Target 2 conditions.

Software used : Windows 10, PDI- 5.4 CE stable , Java 1.7 installed

Sample source data : 
Transformation Design  

Combination of AND and OR conditions in Filter rows step

To negate a condition, just click on the empty condition box that appears at the top-left corner.
You will find the "NOT" keyword, which negates the written condition.

NOTE : send the true and false data to the target steps as shown in the image above.

Outputs - 3 target outputs based on the conditions in the problem:
Target 1 output : Salary < 50000 and dept 10 & 20

Target 2 output : Salary > 50000 and dept 30 & 40

Target 3 output : rows which don't satisfy the Target 1 and Target 2 conditions

I hope this use case helps someone understand the usage of AND and OR conditions in the Filter Rows step in Kettle ETL.
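Outside Kettle, the same Target 1/2/3 split can be expressed directly in code. Here is a small Python sketch (the sample rows are invented) of the combined AND/OR conditions and the NOT branch that feeds Target 3:

```python
# Invented sample rows standing in for the source data.
rows = [
    {"emp": "a", "salary": 40000, "dept_id": 10},
    {"emp": "b", "salary": 60000, "dept_id": 30},
    {"emp": "c", "salary": 45000, "dept_id": 30},
    {"emp": "d", "salary": 70000, "dept_id": 40},
    {"emp": "e", "salary": 40000, "dept_id": 20},
]

def target1(r):
    # Salary < 50000 AND (Dept_id = 10 OR Dept_id = 20)
    return r["salary"] < 50000 and (r["dept_id"] == 10 or r["dept_id"] == 20)

def target2(r):
    # Salary > 50000 AND (Dept_id = 30 OR Dept_id = 40)
    return r["salary"] > 50000 and (r["dept_id"] == 30 or r["dept_id"] == 40)

t1 = [r for r in rows if target1(r)]
t2 = [r for r in rows if target2(r)]
# Target 3 is the NOT branch: everything the first two filters rejected.
t3 = [r for r in rows if not target1(r) and not target2(r)]

print([r["emp"] for r in t1], [r["emp"] for r in t2], [r["emp"] for r in t3])
# -> ['a', 'e'] ['b', 'd'] ['c']
```

The parentheses around the OR matter: without them, `dept_id == 20` alone would send high-salary rows to Target 1 - the same reason the OR sub-conditions are grouped in the Filter Rows step.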

Download Example : Don't wait - Click Me..!!!

- Sadakar Pochampalli

Pentaho Data Integration use case : "Add value fields changing sequence" step and JavaScript concat & indexOf methods usage

This post is based on a thread posted in the Pentaho community forums.

Link :

Problem Statement :
Add row numbers to the child values of a field. (The first field in the image is the input.)

Solution :

Java script : 

var res = field.concat(seq); // concatenate the "field" column with the "seq" column

var indexOfParentValue = res.indexOf("ParentValue"); // returns the string's position, or -1 if not found

var finalField;
if (indexOfParentValue == 0) {
  finalField = field; // parent row: keep the original value
} else {
  finalField = res; // child row: keep the concatenated value
}

NOTE that compatibility mode is un-checked in the Modified Java Script Value step.
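To sanity-check the logic outside Kettle, here is a Python equivalent of the script above (the sample values are invented):

```python
def final_field(field, seq):
    # Mirror of the Kettle JavaScript: concatenate, then test the prefix.
    res = field + str(seq)
    # indexOf(...) == 0 in JavaScript means "starts with".
    if res.find("ParentValue") == 0:
        return field   # parent row: keep the original value
    return res         # child row: keep the concatenated value

print(final_field("ParentValue", 1))  # -> ParentValue
print(final_field("child", 2))        # -> child2
```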

Download : Download example here

I hope this helps show how to write JavaScript methods in Kettle ETL.

- Sadakar Pochampalli

Wednesday, 5 August 2015

CST - Community Startup Tabs in Pentaho 5.4 CE - Example : User based dashboard access on logging

This post is a workout on how CST (Community Startup Tabs) works with the Pentaho CE BA server.

Documentation from the webdetails site has been put into the PPT below for easy and quick understanding, with an example.

(You may not find new information in the PPT, but you can quickly understand the usage of the plug-in once you have gone through the presentation.)

Software used :
Pentaho BA Server 5.4 CE, C-Tools 15.x version (CDE,CDF,CDA,CGG and CST).

CST allows you to set which tabs will automatically load when a particular user logs into the PUC. Access can be:

1) User based
2) Role based
3) Default rules

Notice that the example in the presentation is based on user-based access for dashboards in a project folder called "Dashboards" (no roles are created for this example).


Quick Snapshot : ( Click the images to get the best view of the content )

References : 

Tuesday, 21 July 2015

Bar chart drill down to another dashboard example in Pentaho CDE

In this post, you will find the quick steps and best practices to develop a bar chart that drills down to another dashboard.

Software setup for this example
1) Pentaho CE 5.4 BA server
2) Pentaho C-Tools 15.06.30
3) Foodmart database running on PostgreSQL server
4) IE or Mozilla Firefox

Problem Statement : Drill-down bar chart
1) Create 2 dashboards
2) Dashboard 1 :
contains a bar chart - in this example, let's say quarters (Q1, Q2, Q3 and Q4) on the X-axis and UnitSales and StoreSales on the Y-axis.
3) Dashboard 2 :
A line chart - months of the quarter on the X-axis and UnitSales and StoreSales on the Y-axis.
4) Functionality :
When you click on a bar in dashboard 1, the quarter has to be passed as a parameter value to the 2nd dashboard.

Steps :
1) Dashboard -1 
a) Go to the bar chart's clickAction and write the code below. (Don't forget to set clickable = true.) :-)

    function f(scene) {
    var vars = scene.vars;
    var s = vars.series.value;
    var c = vars.category.value;
    var v = vars.value.value;
    alert("series: " + s + "\ncategory: " + c + "\nvalue: " + v);

    // url holds the address of the 2nd dashboard
    window.location = url + '?param_Quarter=' + c;
    }

b) In the above code, c represents the category - the X-axis value; in this example the quarters Q1, Q2, Q3 and Q4.

c) param_Quarter is a string written in the code; it should match the parameter (or custom parameter) in the 2nd dashboard.

d) Join the URL and the 1st parameter with ? (question mark). If you use & instead, you will get an error message like "Sorry, we really did it".

e) When you pass multiple parameters, separate them using &, but note that the URL and the 1st parameter should be separated by ?.

f) Use alert functions to test the categories, values, series, etc.
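The joining rules in (d) and (e) are ordinary URL query-string building. A small Python sketch of the same idea (the base URL is a placeholder, not a real dashboard path, and param_Year is an invented second parameter to show the & separator):

```python
from urllib.parse import urlencode

base = "http://example/dashboard2"  # placeholder for the 2nd dashboard's URL
params = {"param_Quarter": "Q1", "param_Year": "1997"}

# urlencode joins name=value pairs with & ; the single ? separates
# the URL from the first parameter, as points (d) and (e) describe.
url = base + "?" + urlencode(params)
print(url)  # -> http://example/dashboard2?param_Quarter=Q1&param_Year=1997
```

urlencode also percent-escapes values, which the hand-built string in the clickAction does not do - something to keep in mind if the category values ever contain spaces.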

Dashboard 2 :

1. Develop the dashboard with custom parameters. (In this example, I have created a custom parameter, param_Quarter, with its default value(s) expression editor.)

2. Don't directly test the drill down from the 1st dashboard by clicking the bar chart (keep that on hold for some time).

3. View the 2nd dashboard output in the browser, even though you will not get any output yet. (This is to get the URL.)

4. Now, add the custom parameter to the URL and test dashboard 2 individually; if you get the result, then your 2nd dashboard is dynamic with parameters. Importantly, provide listeners and parameters to the components wherever required in your 2nd dashboard.

Example : 

That's all, you are done.!!! 

Download Example :  Bar chart drill down example 
and have a look at the sample dashboards.

Note that to run this example you need the foodmart database running on PostgreSQL.
(I used the jasperserver example foodmart database.)

Here are quick snapshots of the output.

Dashboard 1 : 

Dashboard 2 : 

Thanks for reading this small article. Hope it helped you a bit :-)



Sunday, 19 July 2015

Tip : Pentaho CDA : Error Listing Queries : Observation in CDE & CDA

This post is a test outcome on how queries and their connections work in Pentaho CDE.

Problem found : Error Listing Queries
I created 3 new queries of connection type JDBC (sql over sqljdbc). I did not provide any connection details for the 1st and 3rd queries, but provided all the details (username, password and URL) for the 2nd query.

When I previewed the queries, keeping them as they were, using the CDA editor, it did not list the queries and showed "Error Listing Queries".

Solution :
I tested the username, password and URL in some other database client tools; everything worked fine, but CDA still did not list the queries. I then removed the 1st and last queries and tested again. I haven't found the reason behind it, but it worked fine: it listed the query and I was able to see the output.

So, I think the order of queries, together with the connection details provided, plays an important role while working with CDE dashboards.

- Sadakar Pochampalli

Thursday, 16 July 2015

Pentaho 5.4 CE Login Page or Pentaho User Console(PUC) - 5.4 CE Customization Tips

In this post we will see how to customize pentaho 5.4 CE BA server login only page.

I thank you to the Java+Web developer for helping me in how to play around styles of Login Page using firebug and then in PUCLogin.jsp page

I had not involved in full time web development therefore you will find the tips not so awkward but hoping that this small write up would help someone somewhere who want to learn and apply  for your quick business needs.

Software Requirement : 
1) Pentaho BA 5.4 CE Server
2) Images (Your company logo, Your company favicon logo).
3) Any web browser (I preferred Mozilla Firefox as the Inspect / Inspect with Firebug options are handy to play around with).


I have tested all the changes explained below in laptop and desktop web browsers, where I didn't find any great difference in the UI.

I have also tested the changes on an iPhone and found some known alignment issues, hence this post will work only for desktop/laptop browsers.

(Yes, I have one by this time - thanks to my small customers/consultants. In a few of my earlier posts I used to test the mobile compatibility of dashboards by minimizing the web browser - no more of that method now :-)

Let us start with ,

1) Remove background Image

Problem : 

Solution : Remove "login-crystal-bg.jpeg" image

E:\1_Softwares\Business Intelligence\Pentaho BI Tools\BI Servers\biserver-ce-\biserver-ce\pentaho-solutions\system\common-ui\resources\themes\crystal\images

from above location remove "login-crystal-bg.jpeg" image. 

Login Page view after removing image

Points to note down :
1) A server restart is required.
  (Why? Web applications usually don't pick up updated or removed images.)
2) Delete the cache in the browser (clean the browser).

2) Remove the default evaluator logins

 Solution : 
 1) Disable/comment a few lines of code in the PUCLogin.jsp file (the login page code is written in this file).
2) How to find them? Use the web page's "Inspect Element":
    right-click on "Login as an Evaluator" and click "Inspect Element" as shown in the image below.

  • Note down the ID of the DIV element and search for it in "PUCLogin.jsp" (I used Notepad++ to view the code and search)
  • ID = eval-users-toggle-container (near line 186 in the .jsp file)
  • Comment out the code starting with <div id="eval-users-toggle-container"> and ending with its matching </div>
  • A server restart is not required because we are not editing CSS or an image, only HTML

Sample view of BA server after this change : 

 3)  Change Credential Look and feel

Stage 1 :  Change the <label> tags to <span> and add <br> tags as shown below

Stage 2 :  Adjust the login form - horizontally centered and vertically in the middle

a) Inspect the id="login-form-container" element and adjust the left and top pixels, i.e., as shown in the image above, add them to the element. As a result a style attribute will be added to the container.

b) The above is an on-the-fly experiment; make the same change in the PUCLogin.jsp file:
 add style="top: 150px; left: 440px; border-radius: 25px; border: 2px solid rgb(178, 196, 212); width: 380px; height: 220px;"

c) After adding the code, PUCLogin.jsp should look like below (near line 167)

<div id="login-form-container" class="lang_<%=cleanedLang%>" style="top: 150px; left: 440px; border-radius: 25px; border: 2px solid rgb(178, 196, 212); width: 380px; height: 220px;">

d) A server restart is not required - to see the effect, reload the page by pressing F5 or refresh.

e) Adjust the left and top pixel(s) until the form sits in the middle of the login page.

Stage 3 : Adjust the inner elements of the form
1) Add styles to the <span> and input elements explained in Stage 1

<span style="margin-left: 35px;"><%=Messages.getInstance().getString("UI.PUC.LOGIN.USERNAME")%></span>
              <input id="j_username" name="j_username" type="text" placeholder="" autocomplete="off" style="margin-left: 20px; margin-top: 10px;">


 <span style="margin-left: 35px;"><%=Messages.getInstance().getString("UI.PUC.LOGIN.PASSWORD")%></span>
              <input id="j_password" name="j_password" type="password" placeholder="" autocomplete="off" style="margin-left: 30px; margin-top: 5px;">

2) Add styles to "Login" button 
<button type="submit" id="loginbtn" class="btn" style="margin-left: 199px; margin-top: -18px;">

3) Login form header (User Console) styling 
<h1 style="margin-left: 110px; margin-top: 20px;font-size: 25px;margin-bottom: 5px;">

4) Remove the vertical scroll - add a height style to the elements below

<div id="login-wrapper" style="height:200px;">
  <div id="login-background" style="top:-82px;height:600px">

and comment below code

  <div id="login-footer-wrapper">
    <div id="login-footer" class="beforeSlide"><%=Messages.getInstance().getString("UI.PUC.LOGIN.COPYRIGHT", String.valueOf(year))%></div>

After making all the changes, the Pentaho login page looks as below.

 5) Change favicon image .
File Name : PUCLogin.jsp

Find the code below and change the logo name to your logo's name.
<link rel="shortcut icon" href="/pentaho-style/favicon.png"/>

The location of the favicon image (or where to place a new logo of your choice) is

E:\1_Softwares\Business Intelligence\Pentaho BI Tools\BI Servers\biserver-ce-\biserver-ce\tomcat\webapps\pentaho-style

Place your company's new logo with the name favicon.png and save the file.

Refresh the PUC login page and find the change.

6) Change button colors 

Login Button color change : 
<button type="submit" id="loginbtn" class="btn-warning " style="margin-left: 199px; margin-top: -18px;"><%=Messages.getInstance().getString("UI.PUC.LOGIN.LOGIN")%></button>

Error button "OK" color change
   <div class="button-panel">
      <button class="btn-warning pull-right" onclick="document.getElementById('loginError').style.display='none'"><%=Messages.getInstance().getString("UI.PUC.LOGIN.OK")%></button>

7) Change puc-login-logo.png 
Image name : puc-login-logo.png
Location : <directory>\biserver-ce-\biserver-ce\pentaho-solutions\system\common-ui\resources\themes\images
How to : Take your company or client logo , rename it with same name (take a backup copy of original).

Adjustments : 
File to edit : PUCLogin.jsp
Find below code : 
<div id="login-logo"></div>
Change the code to add styles :
<div id="login-logo" style="left:440px;margin-top:80px;"></div>

Download this PUCLogin_ModifiedCopy.jsp and check how the styles are added.

A sample final output of the pentaho login page

Thursday, 9 July 2015

Pentaho Data Integration Basics - 7 - (DWH) : Implementing SCD Type-II dimension table in data warehouse

In this post we will learn how to implement a Slowly Changing Dimension (SCD) Type II table.

What is SCD Type 2 ?

Software Requirements : 
1) Pentaho Data Integration(a.k.a Kettle ETL) - 5.4.0
2) PostgreSQL database
3) Sample data

Problem Statement :  (all three points below come under this)

1) Source data (Staging data environment)  - Raw Data for employee staging table

Let's say table name is: emp_staging

2) Problem scenario :

 Track the location of an employee when he moves to another place (with the same emp_no),
 i.e., keep the old location value along with the new location value (i.e., keep track of history - how?)

Example :
If employee number 103 moves to a new location, let's say "new delhi", keep both records (the old location and the new location) in the dimension table.

CASE 1 : emp_staging (raw data) table after employee 103 moved to the new location "new delhi"


CASE 2 : emp_staging (raw data) table after employee 103 moved again to a new location, "USA"
NOTE : an employee can be shifted any number of times to any number of locations.
Whenever the employee moves to a new location, keep all the previous locations in the dimension table along with the new location.

3) Expected final output for dimension table

 Let's say, the dimension table name is "dim_employee"

ETL  1st RUN : (On raw data)
table name : dim_employee

ETL 2nd RUN (Change-1): Employee 103 moved to "new delhi" from "chenni"
table name : dim_employee

ETL 3rd RUN (Change-2) : Assume the same employee 103 moved from "new delhi" to "USA"
 table name : dim_employee

Theory about SCD Types: 
1) Type 1 :
2) Type 2 :

All 3 types.

Technical Implementation of above example in Pentaho Data Integration (using Kettle ETL)

1)  ETL :  Prepare staging area (Sample data) in staging database.  (table name : emp_staging)
2)  Design : Dimension table design in data model (dim_employee)
3)  ETL : load the dimension table (Run 1 , Run 2 and Run 3 , for this example). 

1)  ETL : Prepare staging area(Sample data) in staging database. (table name : emp_staging)

2)  Design : Dimension table design in data model (dim_employee)

3)  ETL : Design transformation and load the dimension table 

Dimension Lookup/Update

1) Provide all the db connections
2) Write the dimension table name (eg : dim_employee)
3) a) Surrogate key = primary key = emp_id_sk
    b) The version field holds the revision numbers for dimension rows, to keep track of history.
4) Keys
     Stream field = emp_code and dimension field=emp_code
5) Fields
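To make the Dimension Lookup/Update behaviour concrete, here is a hedged Python/sqlite3 sketch of the Type II logic: close the current row instead of overwriting it, then insert a new row with a bumped version. The version and date_from/date_to columns mirror the step's usual defaults, and the employee name is invented - this is an illustration of the technique, not the step's actual code:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE dim_employee
                (emp_id_sk INTEGER PRIMARY KEY AUTOINCREMENT,
                 emp_code INTEGER, emp_name TEXT, location TEXT,
                 version INTEGER, date_from TEXT, date_to TEXT)""")

def scd2_upsert(conn, emp_code, emp_name, location, today):
    # Look up the current (open-ended) row for this natural key.
    cur = conn.execute("""SELECT emp_id_sk, location, version FROM dim_employee
                          WHERE emp_code = ? AND date_to IS NULL""",
                       (emp_code,)).fetchone()
    if cur and cur[1] == location:
        return  # nothing changed, nothing to do
    version = 1
    if cur:
        # Close the old row instead of overwriting it - the Type II part.
        conn.execute("UPDATE dim_employee SET date_to = ? WHERE emp_id_sk = ?",
                     (today, cur[0]))
        version = cur[2] + 1
    conn.execute("""INSERT INTO dim_employee
                    (emp_code, emp_name, location, version, date_from, date_to)
                    VALUES (?,?,?,?,?,NULL)""",
                 (emp_code, emp_name, location, version, today))
    conn.commit()

scd2_upsert(conn, 103, "ravi", "chenni", "2015-07-01")     # ETL run 1
scd2_upsert(conn, 103, "ravi", "new delhi", "2015-07-09")  # run 2: location changed
scd2_upsert(conn, 103, "ravi", "USA", "2015-07-20")        # run 3: changed again
print(conn.execute("""SELECT location, version, date_to FROM dim_employee
                      ORDER BY version""").fetchall())
# -> [('chenni', 1, '2015-07-09'), ('new delhi', 2, '2015-07-20'), ('USA', 3, None)]
```

After the three runs, all three locations for emp_code 103 survive in the dimension, each with its validity window - exactly the expected output tables shown above for runs 1 to 3.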

Download this example :  Click here

Hope this helps someone :-)

Thanks for your time on this page :-)