
UNIT-05: HR Metrics and HR Dashboards:

1. HUMAN CAPITAL ANALYTICS:

Introduction:

Human Capital Analytics is the application of analytic processes to the HR department of
an organisation in order to improve employee performance and thereby get a better ROI.
People are the organisation's most important asset, and investments in them can show
financial returns to the organisation while also benefiting employees through improved
engagement and retention.
Applying analytics to human capital investments is a very powerful way to improve those
returns, at both the individual and the organisational level.

2. HUMAN CAPITAL ANALYTICS CONTINUUM:


Organisations can use human capital analytics to optimise their HR investments and
improve business decisions regarding talent development, employee retention,
engagement, and productivity, all of which improve organisational profitability.
Those who want to venture into human capital analytics may want to first take smaller
measurement steps. To do so, it is helpful to view the human capital analytics process
as a measurement continuum.
The continuum begins with anecdotes or storytelling. Anecdotes tell the story behind
the numbers. Anecdotes are a step beyond “smiley sheets”: they augment them,
providing further insight into satisfaction with talent development activities, and can
be the first step toward quantifying the value of those efforts. Scorecards and
Dashboards are the next level of sophistication. Scorecards are a performance
management tool that concisely summarise the organisation’s talent management
strategies and associated measurements.
Dashboards are a distillation of the most important performance indicators. They allow senior
leaders to review those indicators at a quick glance. Scorecards and dashboards can leverage
automated surveys and activity data to track how the organisation is executing strategy and
the consequences arising from business processes.
Benchmarks are the next level beyond scorecards and dashboards. Studying the best-run
companies in a specific area will help organisations establish benchmarks for things such as
salary, training levels, desired turnover rates, and so on.
Correlations and causation are the next two stages in the continuum. Correlations describe
statistics where two or more variables move together, but there is no clear reason why.
Causation shows why a metric or key performance indicator has changed. The final stage of
the continuum is optimisation, the holy grail of HR measurement. Optimisation involves
predictive analytics. Predictive analytics not only measures the impact of the causes but also
helps optimise and prescribe future investments.
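To make the correlation stage concrete, here is a minimal sketch in Python (using the pandas library, with purely hypothetical figures) of how an analyst might check whether two HR metrics move together:

```python
# Minimal sketch of the "correlation" stage of the continuum.
# The training-hours and sales figures below are hypothetical.
import pandas as pd

df = pd.DataFrame({
    "training_hours": [10, 25, 8, 40, 15, 30, 5, 22],
    "sales_per_employee": [52, 70, 48, 95, 60, 80, 45, 68],
})

# Pearson correlation: a value near +1 means the two metrics rise together.
r = df["training_hours"].corr(df["sales_per_employee"])
print(f"Correlation between training hours and sales: {r:.2f}")
```

A coefficient near +1 or -1 only says that the metrics move together; establishing why (causation) and acting on that knowledge (optimisation) are the later stages of the continuum.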

3. HR METRICS: MEASUREMENT PLAN:

a. Defining Metrics:
A metric is something that can be quantified to describe the outcome of a business
process. The measurement map’s leading indicators and business results will typically
map to existing metrics.
If there are points on the map that do not correspond to existing metrics, individuals
have a couple of options. First, they can start collecting the necessary data; however,
because they may need performance data for a period preceding the investment, a new
metric may not be feasible.
Another option is to identify surrogate metrics. These are existing metrics that help to
quantify an otherwise non-measurable link in the causal chain.

For the measurement plan, create a basic list of information about each metric.
Here is a checklist of what people should know about each metric:
1) Name of the metric.
2) Units in which the metric is measured (hours, dollars, percentage points, etc.).
3) Maximum and minimum range of the metric (in some cases, negative values are
not permissible).
4) Dates for which the data are available (is information available both before and
after the investment?).
5) Unit financial value (how much is it worth?).
6) Desired direction of change (increase or decrease).
7) Expected impact of the investment.
7) Contact: who is in charge of collecting this metric? Who can answer questions
about it?
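As an illustration only (the field names below are not a standard schema), the checklist items could be captured as a simple record type so that every metric in the measurement plan carries the same information:

```python
# A sketch of the metric checklist as a Python record; field names are illustrative.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Metric:
    name: str                    # 1) name of the metric
    units: str                   # 2) hours, dollars, percentage points, ...
    min_value: float             # 3) minimum permissible value
    max_value: float             # 3) maximum permissible value
    available_from: str          # 4) first date for which data exist
    available_to: str            # 4) last date for which data exist
    unit_value: Optional[float]  # 5) financial value per unit, if known
    desired_direction: str       # 6) "increase" or "decrease"
    expected_impact: str         # 7) expected impact of the investment
    contact: str                 # 8) who collects the metric and answers questions

absenteeism = Metric(
    name="Absenteeism rate", units="percentage points",
    min_value=0.0, max_value=100.0,
    available_from="2022-01", available_to="2023-12",
    unit_value=None, desired_direction="decrease",
    expected_impact="reduce by 2 points", contact="HR operations team",
)
```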
b. Demographics:
As you create the measurement plan, consider which demographics have a
reasonable connection to the investment, as well as the type of demographic data
that is available. We identify two categories of demographics: individual and
organisational.
Individual demographics describe an individual's personal traits, such as gender,
age, educational level, ethnicity, and so on. Organisational demographics are
derived from some unit the individual is part of, e.g., region, division, work unit,
and so forth.

There are two important considerations when selecting the demographics to
include in the measurement plan. First, can an individual's demographic have an
impact on his or her performance, as measured by one or more metrics? Second,
can the individual's demographic predict the impact of the investment?

Here is a checklist of information your measurement plan should include for each
demographic:
1) Description.
2) Unit of measure (categorical, continuous numeric, true/false).
3) Minimum and maximum possible values (for numeric).
4) Number and list of valid categories (for categorical).
5) Distribution of participants across categories (for categorical).
6) Categories of ranges (e.g., age could be grouped into categories).
7) Contact or expert within the organisation who can take responsibility for
answering questions.
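Here is a minimal sketch, with hypothetical column and category names, of how items 4) and 5) of this checklist might be verified programmatically for a categorical demographic:

```python
# Validating a categorical demographic: valid categories and their distribution.
# The column values are hypothetical.
import pandas as pd

employees = pd.DataFrame({"gender": ["Male", "Female", "Female", "male", "Male"]})
valid_categories = {"Male", "Female"}

# 4) flag any value that is not a valid category
invalid = employees.loc[~employees["gender"].isin(valid_categories), "gender"]
print("Invalid entries:", invalid.tolist())   # ['male'] -> needs cleaning

# 5) distribution of participants across categories
print(employees["gender"].value_counts())
```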

c. Data Sources and Requirements:


Once metrics and demographics for the study have been established, the next step
is to tackle the data sources and requirements. This is another area to rely on
stakeholders, as well as on the individuals within their departments who
administer data. The stakeholders will be able to tell where the data live, but people
will want to interact with the analysts to get details about the data and request
what they need.

4. DATA:
Data is a set of values of subjects with respect to qualitative or quantitative variables.
The terms data, information, and knowledge are often used interchangeably; however, data
becomes information when it is viewed in context or in post-analysis.
Data is collected by a huge range of organisations and institutions: businesses (e.g.,
sales data, revenue data, profits, and stock price), governments (e.g., crime rates,
unemployment rates, literacy rates), and non-governmental organisations (e.g., censuses
of the number of homeless people by non-profit organisations).
Data is represented or coded in some form suitable for better usage or
processing. Raw data is a collection of numbers or characters before it has been
cleaned and corrected by researchers.

4.1. Types of Data:
The different types of data available are as follows:
1) Operational Data:
Operational data tracks the business processes within the organisation. Operational
data has the advantage that it is close to the cash flow of the enterprise and is thus
likely to be well-organised and closely tracked.
Example: revenue or sales commissions, information from within a call centre, defect
information from an assembly line etc.

2) Customer Service Data:


Customer service data addresses important business processes, particularly in
organisations where there is a high ratio of customer-facing employees, or ‘surface
area’. Example: satisfaction data, number of items returned data.

3) HRIS Systems:
Human Resource Information Systems (HRIS) are an important part of most projects.
Primarily, they provide demographic information, such as education, tenure, job title,
and other items. Sometimes they also contain compensation data. Example: the
employee name, ID, and e-mail address.

4) Learning Data:
Learning Management Systems (LMS) contain information about training. The LMS
may contain some way of identifying the employee, the type of training they received,
and the date on which they received the training.

5) Engagement Surveys:
Engagement surveys are effective instruments for gauging employee sentiment
and are gaining popularity. The questions may be viewed individually (‘I have a best
friend at work’) or aggregated into various subsets, gauging employees’ satisfaction
with their manager or their view of career prospects. If employees can respond
without worrying about who might read their answers, the goal is served.

6) Psychological Testing:
Psychological testing shows promise in predicting performance on job metrics, both
in an individual sense and in how individuals fit together in a team.

7) Survey Data:
Survey data can cover quite a wide range of information and can provide data in
many different categories.
Some popular uses for surveys include:
1) Customer satisfaction.
2) Trainee satisfaction with course materials.
3) Engagement data.

4.2. TYING DATA SETS TOGETHER:


One of the most crucial tasks in an analysis is combining data from different sources for
analysis. Database people talk about “unique identifiers”- a piece of information that can tie
together data sources. Examples include social security numbers, e-mail addresses, and
employee ID numbers.
To make connections between data sets, your data analysts will need one or more unique
identifiers. Sometimes these efforts are fairly straightforward, but they can also be
challenging. With people, the identifiers are usually employee IDs, which also protect the privacy
of the participants. Proper names are messy identifiers. Not only is there the privacy issue,
but there are variants of names: maiden and married, with suffixes attached or not, middle
names omitted entirely or truncated to an initial, and so forth.
Example: we have a performance management system that uses an employee ID, a training
system that uses proper names, and other systems that use an e-mail, all within the same
organisation. We will not get into how common mappings can be created, but a good data
analyst will be able to manage this.
You may notice that the diagram below has areas where data exist only in Operations but not
in Human Resources, and so forth. There are often cases where this occurs in practice,
although it should not occur in theory.

[Figure: Venn diagram showing the overlap between Operations data and Human Resources data]
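Here is a minimal sketch of tying two data sets together on a unique identifier (an employee ID), using the pandas library; the systems, identifiers, and values are hypothetical:

```python
# Joining a performance data set and a training data set on employee ID.
import pandas as pd

performance = pd.DataFrame({
    "employee_id": [101, 102, 103],
    "rating": [3.8, 4.2, 2.9],
})
training = pd.DataFrame({
    "employee_id": [101, 103, 104],
    "course": ["Negotiation", "Excel", "Leadership"],
})

# An outer join keeps rows that exist in only one source; these correspond
# to the non-overlapping areas of the diagram above.
combined = performance.merge(training, on="employee_id", how="outer")
print(combined)
```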
4.3. DIFFICULTIES IN OBTAINING DATA:
As Tolstoy noted, unsuccessful data pulls all have their own character, and it is sometimes
impossible to anticipate all of the possible difficulties. The Tolstoy quote has been generally
appropriated by scientists as shorthand for situations where many conditions need to be true
simultaneously, and a single failure creates an overall failure.
Sometimes the problem is not too few systems containing data but too many. In one study we
did with Sun Microsystems, the company needed to consolidate five different LMS systems,
which existed due to acquisitions and continual development of decentralised learning tools.
In this type of situation, the data are not always comparable between different systems, and
it is not always possible to consolidate the systems prior to pulling data.
Here are some examples of situations witnessed in major corporations:
1) People were sent multiple data sets initially, including some sensitive HR files. The
senior leader of HR demanded that the information be deleted, which delayed the
project for a full year until the proper permissions were obtained.
2) A highly placed person at a client organisation, who was in charge of negotiating the
contract, tried to bargain in terms of how much data we would receive. The negotiator
was in the habit of pushing to give the vendor as little of everything as possible.
3) Some divisions of a company that own certain parts of the business data share them about
as readily as toddlers share toys. Actually, that is not really fair, because toddlers are
sometimes in sharing moods and can eventually be taught to behave better.
4) Some stakeholders express apprehension about the results. If they are not very
optimistic about what a study will show, they resist getting involved in a project and
sharing their data.

4.4. ETHICS OF MEASUREMENTS AND EVALUATION:

Many projects require the use of sensitive information, such as career and salary data. It
almost goes without saying that this information should be treated as private. There are
safeguards that should be taken in any project with this type of sensitive information.
First, it is better to use an employee ID than a proper name for several reasons, especially
because an ID makes it harder to inadvertently reveal whose information it is.
When an individual starts a project, it is important to talk to the stakeholders and
compliance officers to understand what data are off limits for making decisions. There are
cases where having more information to use for a given end can be specifically unethical
or illegal.
Example: many pharmaceutical companies provide continuing medical education, such as
webinars and special lectures, in their area of expertise. It is not only unethical, but
actually illegal, to design such educational material to promote particular products.
5. HR DASHBOARD:
The HR dashboard is the visual representation of the metrics that an HR manager needs to
keep track of in order to judge the performance of different organizational departments. Apart from
the records of personnel, it includes the business performance dashboard, marketing
dashboard, sales dashboard, finance dashboard, etc.

There is no standard format for creating the HR dashboard; every company has its different
dashboards that vary according to their unique requirements.
The HR dashboard is used for gathering, verifying, and sorting the necessary information for the
consumption of the top management. The purpose behind creating the dashboards is to keep a
check on the performance of each department and see the progress towards the set goals.
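As a small illustration, here is a Python sketch of how two indicators that often appear on an HR dashboard, average headcount and monthly turnover rate, might be computed; the figures are invented:

```python
# Two common HR dashboard indicators, computed from hypothetical monthly figures.
headcount_start, headcount_end = 520, 506   # headcount at start/end of month
leavers = 18                                # employees who left during the month

average_headcount = (headcount_start + headcount_end) / 2
turnover_rate = leavers / average_headcount * 100

print(f"Average headcount: {average_headcount:.0f}")
print(f"Monthly turnover rate: {turnover_rate:.1f}%")
```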
6. STATISTICAL SOFTWARE USED FOR HR ANALYTICS:

1. MS-Excel:
Excel is where most of us started. Whenever you manually extract data from any of your HR
systems, it most likely comes out in the form of a comma-separated value (CSV) file. These
files can easily be opened and edited using Excel.
The good thing about Excel is that it’s very intuitive (natural) to most of us HR data geeks
and therefore easy to use.
For example, if you wanted to check how clean your data is, you can quickly transform a
dataset into a table and check each column’s data range for outliers.
This way, if you select the age column, you can quickly check the minimum and maximum
ages. You wouldn’t expect anyone below 16 to work at your company, nor would you expect
anyone over the age of 80 to work for you. You can find these outliers in a single click.
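The same outlier check can also be scripted once the CSV file is loaded. Here is a minimal sketch in Python with pandas, using a hypothetical age column:

```python
# Flag ages outside a plausible working range (under 16 or over 80).
import pandas as pd

employees = pd.DataFrame({"age": [34, 29, 51, 7, 45, 96, 38]})

print("Min age:", employees["age"].min())   # 7  -> suspicious
print("Max age:", employees["age"].max())   # 96 -> suspicious

outliers = employees[(employees["age"] < 16) | (employees["age"] > 80)]
print(outliers)
```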
Some quick tips on how to use Excel for HR analytics purposes:
• If you want to run advanced analyses, load the Analysis ToolPak in Excel. This
package enables you to do advanced analytics, including correlation and linear
regression.
• When you work with large files, transform them into Tables. Excel can work much
more efficiently when data is structured in a table.
• Don’t use Excel formulas in large data sets. When you calculate a column using an
Excel formula, transform the outcome to a numeric value. Formulas recalculate every
time you make a change in the data set. This places a significant and unnecessary
burden on your computer’s memory and processing speed – and bogs down Excel.
• Categorical variables (Gender: Male, Female) are easy to check in a table. Select the
table column and check for errors or inconsistencies, such as misspelled or
inconsistently capitalised category values.
2. SPSS: (Statistical Package for the Social Sciences)
SPSS is one of the most commonly used HR analytics tools in the social sciences. Thanks to
its user-friendly interface, you’re able to analyse data without having extensive statistical
knowledge. Because SPSS is often used in the social sciences, a lot of HR professionals
know how to use it – especially the ones interested in data analysis.
This is also the reason why we put SPSS on the list and not its biggest competitor, SAS. SAS
has more users outside of the social science field. However, SAS has a steeper learning curve.
SPSS also shares many similarities with Excel, which makes it easier to work with.
Consider SPSS an easy stepping stone for companies with less mature analytical capabilities.
SPSS makes it easy to do an exploratory correlation analysis or a quick regression analysis.
3. AMOS: (Analysis of Moment Structures)
AMOS is statistical software; the name stands for Analysis of Moment Structures. AMOS is an add-on
SPSS module, and is specially used for structural equation modelling, path analysis, and confirmatory
factor analysis. It is also known as covariance-based or causal modelling software. AMOS is a visual
program for structural equation modelling (SEM). In AMOS, we can draw models graphically using
simple drawing tools. AMOS quickly performs the computations for SEM and displays the results.
4. SAS: (Statistical Analysis System)
SAS stands for Statistical Analysis System. Its development began in the 1960s at North Carolina
State University, and the SAS Institute was founded in 1976. SAS is a leader in business analytics.
Through innovative analytics it caters to business intelligence and data management software and
services. SAS transforms data into insight which can give a fresh perspective on business.
SAS takes an extensive programming approach to data transformation and analysis rather
than a pure drag, drop, and connect approach. SAS is used for data management, business
intelligence, and predictive, descriptive, and prescriptive analysis.
With the introduction of JMP (pronounced “Jump”) for statistics, SAS took advantage of the
graphical user interface introduced by the Macintosh.
5. R Programming:
(developed by Robert Gentleman and Ross Ihaka)
R is one of the most widely used HR analytics tools. R is great for statistical analysis and visualization and is
well-suited to explore massive data sets. It enables you to analyse and clean data sets with
millions of rows of data. It also lets you visualize your data and analysis.
The most often used Integrated Development Environment, or IDE, for R is RStudio. An IDE
is software that provides additional facilities for software development and data analysis.
This makes the software more user-friendly.
Simply put, RStudio does everything that R does, but more and better. The RStudio interface
contains a code editor, the R console, an easily accessible workspace, a history log, and room
for plots and files.
As previously stated, R is useful because it enables you to work with much larger datasets
compared to, for example, Excel. Furthermore, R has a very extensive library with R
packages.
These packages are easy to install and allow you to run virtually all statistical analyses and
create beautiful visualizations. Take, for example, the caret package. This package enables
you to split data into training and testing sets to train algorithms using cross-validation.
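The caret package is R code; as a rough Python analogue (a sketch only, with invented features and outcomes), scikit-learn's train_test_split performs the same division into training and testing sets before a model is fitted:

```python
# Splitting employee data into training and testing sets, then fitting a
# simple turnover model. All data are hypothetical.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression

df = pd.DataFrame({
    "tenure_years": [1, 7, 3, 10, 2, 8, 4, 6],
    "engagement":   [2.1, 4.5, 3.0, 4.8, 2.5, 4.1, 3.2, 3.9],
    "left_company": [1, 0, 1, 0, 1, 0, 0, 0],
})

X_train, X_test, y_train, y_test = train_test_split(
    df[["tenure_years", "engagement"]], df["left_company"],
    test_size=0.25, random_state=42, stratify=df["left_company"],
)
model = LogisticRegression().fit(X_train, y_train)
print("Test accuracy:", model.score(X_test, y_test))
```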
Another example of an R package is ggplot2, which helps you to visualize graphs. In a
previous article on R churn analytics, Lyndon used ggplot2 to show the distribution of
employee turnover for a large Canadian company.

7. DATA VISUALISATION TOOLS:

1) Tableau:
Tableau is often regarded as the grand master of data visualisation software. Tableau is very
similar to Power BI in that it enables the aggregation and visualization of various data
sources. Founded in 2003 as a commercial outlet for research produced at Stanford
University, the software has taken the visualization world by storm.
Tableau is arguably the best business intelligence tool (BI tool) out there when it comes to
visualization. It has been recognized as a Leader in the Gartner Magic Quadrant for seven
consecutive years (2013–2019).
a) Tableau Desktop: Tableau Desktop has a rich feature set and allows us to code and
customise reports. Right from creating reports and charts to blending them all to form a
dashboard, all the necessary work is done in Tableau Desktop.
Tableau Desktop is also classified into two parts:
1) Tableau Desktop Personal.
2) Tableau Desktop Professional.
b) Tableau Online: Its functionality is similar to Tableau Server, but the data is stored on
servers hosted in the cloud, which are maintained by the Tableau group. There is no
storage limit on the data published to Tableau Online.
Tableau Online creates a direct link to over 40 data sources hosted in the cloud, such as
Hive, MySQL, Spark SQL, Amazon Aurora, and many more. For data that flows from web
applications, Tableau Server and Tableau Online also support Google Analytics and
Salesforce.com.
2) Plotly:
Plotly enables more complex and sophisticated visualisations, thanks to its integration with
analytics-oriented programming languages such as Python, R, and MATLAB. It is built on top of
the open-source d3.js visualisation library for JavaScript, but this commercial package adds
layers of user-friendliness and support, as well as inbuilt support for APIs such as Salesforce.
Plotly will help you to create a slick and sharp chart in just a few minutes, starting from a
single spreadsheet. It is used at Google, and also by the US Air Force, Goji, and New York
University.
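To illustrate the “slick chart in a few minutes from a single spreadsheet” claim, here is a minimal sketch using Plotly's Python interface, plotly.express, with hypothetical headcount data:

```python
# A bar chart of headcount by department, built with plotly.express.
import pandas as pd
import plotly.express as px

df = pd.DataFrame({
    "department": ["Sales", "Engineering", "Support", "Finance"],
    "headcount": [120, 85, 60, 25],
})

fig = px.bar(df, x="department", y="headcount", title="Headcount by department")
fig.show()   # opens an interactive chart in the browser or notebook
```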

3) QlikView:
Qlik is very suitable for more general data aggregation, warehousing, and dashboarding.
Qlik, with their QlikView tool, is the other major player in this space and Tableau’s biggest
competitor. In addition to its data visualisation capabilities, QlikView offers powerful
business intelligence, analytics, and enterprise reporting capabilities. QlikView is commonly
used alongside its sister package, Qlik Sense, which handles data exploration and discovery.
Features of QlikView:
• Data associations are maintained automatically.
• Data is held in memory for multiple users, for a super-fast user experience.
• Aggregations are calculated on the fly as needed.
• Data is compressed to 10% of its original size.
• Visual relationships shown using colours.
• Direct and indirect searches.

4) FusionCharts:
This is a very widely used, JavaScript-based charting and visualisation package that
has established itself as one of the leaders in the paid-for market. It can produce 90
different chart types and integrates with a large number of platforms and frameworks,
giving a great deal of flexibility.
Features of FusionCharts:
• Free for non-commercial use.
• PHP wrapper class for quick and easy embedding of charts.
• Gantt and funnel charts.
• Code samples including PHP and ASP.NET.
