Tracking Workflow and Illustrating Value

Dr. Toby Pearlstein
New York Chapter, SLA
March 11, 2008

Agenda

• The environment
• Getting to tracking, workflow management & illustrating value
  - Identifying a solution
  - Implementation
• Sample screen shots

Key Takeaway: A generic process to help you think through your options for tracking workflow and illustrating contribution
2

Your environment

• Global, regional, local, other?
• Stakeholders
  - Contributors
  - Users
  - Other stakeholders
• Challenges
  - Existing processes
  - Resources
  - Buy in

3

Understanding needs and identifying a solution
Background
• Request tracking to date
• Aspiration

Audit
• Create project team
• Review options

Buy in
• Identify benefits to stakeholders

Implement
• Contract
• Customize
• Test
• Deploy

4

Where are you now? Where do you want to go?

Where are you now?

2000-2002
• No accountability to Center
• Ad hoc local request tracking
• Testing Microsoft Access in NA
• No technology for workflow management

2002-2006
• Standardized separate local tracking and some local workflow management
• Monthly reports to Center

Where do you want to go?

• Simpler Web-based global tracking
• Easy to maintain and customize
• Reduce challenges to compliance
• Reduce complexity around capture and reporting
• Reduce entry time
• Increase timeliness of reporting
• Enable better workflow management
• Platform for sharing insights

5

Audit: Consider your Options

• Is your current process sustainable?
• Are internal development resources available?
• Do you want/need to develop something proprietary?
• Review alternative applications against your needs:
  - LS-related products in academia, consortia, etc.
  - IT and Call Center Help Desk applications
  - Customized versions of applications
• Have your competitors developed request management or workflow tracking solutions? They are often willing to provide demos.

6

Stakeholder buy-in: Communication and feedback intensifies

• Build consensus
• Critical questions:
  - What needs to happen before you can move to a new system?
  - How might it change the way you work?
  - If you decide on a specific tool, what are the next steps?
• Develop custom specifications
• Evaluate current data collection and any changes needed
• Plan implementation and roll out:
  - New forms and reports
  - New processes for reporting
  - Training and documentation

7

ILLUSTRATIVE

Codifying feedback: What data do you want to capture?
Current Fields (each flagged Mandatory, Optional, or Delete in the proposed field set):
• Project Code
• Researcher Name
• Request Date
• Requester Name
• Requester Office
• Level of Difficulty
• Industry
• Subject
• Time period covered
• Geographic area
• Time spent on request
• Requested turn around
• Text of Request
• Cost Recovery

8
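One way to picture this field set is as a single request record. Below is a minimal sketch, assuming a Python dataclass; the class name, field names, and types are illustrative only and do not reflect the actual RMS schema.

    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class ResearchRequest:
        """One tracked research request (illustrative field set, not the actual RMS schema)."""
        project_code: str
        researcher_name: str
        request_date: date
        requester_name: str
        requester_office: str
        level_of_difficulty: str        # e.g. "Simple", "Complex"
        industry: str
        subject: str
        time_period_covered: str
        geographic_area: str
        time_spent_hours: float = 0.0   # time spent on the request
        requested_turnaround: str = ""  # requested turn-around
        request_text: str = ""          # full text of the request
        cost_recovery: bool = False     # flag the request for chargeback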

ILLUSTRATIVE

What reports do you want to generate?
Current Reports (each flagged Mandatory, Optional, or Delete in the proposed report set):

Monthly from Center
• All requests by office
• Complexity of requests by office
• Requests by industry by office
• Requests by requester level
• Turnaround (avg. time)
• Requests by project type
• Other?

Local IS Reports
• Requests by project code
• Chargebacks by project
• Requests by researcher/industry
• Requested turn around
• Other?

9
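To illustrate how a report such as "all requests by office" might be compiled, here is a sketch that counts requests from a CSV export of the tracking system; the file name and the requester_office column are assumptions, not the product's actual export format.

    import csv
    from collections import Counter

    def requests_by_office(export_path: str) -> Counter:
        """Count requests per office from a hypothetical CSV export of the RMS."""
        counts = Counter()
        with open(export_path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                counts[row["requester_office"]] += 1  # assumed column name
        return counts

    if __name__ == "__main__":
        for office, n in requests_by_office("requests_export.csv").most_common():
            print(f"{office}\t{n}")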

ILLUSTRATIVE

Who can do what? Security and access options
Options/Command links, grouped by function and granted by security level (End User, IS Staff, Administrator):

Basic
• Search e-Reference database/knowledgebase
• Add request to database
• Logout and Home
• My Requests: list "owned" requests and add info

Request management
• List all: list and update all requests
• List all - Statistics: provides statistics on all requests
• My Work: list and update all "owned" requests
• My Work - Incomplete: list and update incomplete requests
• Today's Work: list and update all requests added today
• Today's Work - Incomplete: list and update all incomplete requests added today
• In Progress: list and update all requests in progress

System management
• User Manager: list, add and update all users on system and set security levels
• List File Manager: maintain List files (e.g. department and type drop-down fields)
• Sources: source maintenance
• System Maintenance: archive databases, database functions and housekeeping functions
• Settings: system information

10
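A minimal sketch of how tiered access like this could be encoded, assuming each security level inherits the command links of the level below it; the link groupings and role names follow the list above but do not represent the vendor's actual security model.

    # Illustrative role-based access map; the assignment of links to levels is an assumption.
    BASIC = {
        "Search knowledgebase", "Add request", "Logout/Home", "My Requests",
    }
    REQUEST_MANAGEMENT = BASIC | {
        "List all", "List all - Statistics", "My Work", "My Work - Incomplete",
        "Today's Work", "Today's Work - Incomplete", "In Progress",
    }
    SYSTEM_MANAGEMENT = REQUEST_MANAGEMENT | {
        "User Manager", "List File Manager", "Sources", "System Maintenance", "Settings",
    }

    ROLE_LINKS = {
        "End User": BASIC,
        "IS Staff": REQUEST_MANAGEMENT,
        "Administrator": SYSTEM_MANAGEMENT,
    }

    def can_use(role: str, link: str) -> bool:
        """Return True if the given role may use the given command link."""
        return link in ROLE_LINKS.get(role, set())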

Contract and Implement: customize as needed
Request tracking
• Track current and past requests for:
  - Self
  - Local IS
  - Regional/Global IS

Metrics & Workflow
• Select metrics to be captured and at what level
• Allow flexibility to create ad hoc metrics & monitor activity

11

Sample customizations
Knowledge Base
• Capture new or outstanding data sources for reuse locally and globally and to inform purchasing decisions

Cost Recovery
• Capture time spent and other purchased services

12

Close audit loop: Do perceived benefits match aspirations?
• Benefit: Knowledgebase of used and useful sources and search strategies accessible firm wide
  Comment: Aid in reducing research time and research duplication

• Benefit: Easy database search
  Comment: Offers full text, Boolean, truncation

• Benefit: Web based application
  Comment: Shareable across system; minimizes need for specific application knowledge at local IS level

• Benefit: Document storage: upload and store documents
  Comment: Background slides and spreadsheets can be stored; useful data sources can be stored in RMS if desired

• Benefit: Streamlines data entry
  Comment: RMS eliminates need for individual IS to download and email metrics data

• Benefit: Easier and more timely compilation of metrics
  Comment: Metrics directly accessible by Center

• Benefit: Workflow co-ordination improvement
  Comment: Quick workload and request status overview (who is working on what)

13

ILLUSTRATIVE

Who is responsible for training & documentation?

WebEx training
• 1 member of each IS team
• Use for 1 month before launch as part of testing
• Demonstrate system to teammates

eRef User Manual (distributed day of launch)
• Navigating eRef
• Confidentiality and eRef

eRef Local Administrator Guide (distributed day of launch)
• Automatic HR Loader
• Adding new IS Researchers, Requesters and Case Codes

eRef Creating Local Statistics
• Creating statistics for your IS

eRef Chargeback Guide
• Running chargeback export and converting export data into the organization's chargeback format (a sketch follows below)

All written in house.
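As an illustration of the step the Chargeback Guide describes, here is a sketch that rolls a hypothetical chargeback export (a CSV with project_code and time_spent_hours columns, assumed names) up into per-project amounts; the real export and chargeback formats are organization-specific.

    import csv
    from collections import defaultdict

    def chargeback_totals(export_path: str, hourly_rate: float) -> dict:
        """Sum billable amounts per project code from a hypothetical chargeback export."""
        totals = defaultdict(float)
        with open(export_path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                totals[row["project_code"]] += float(row["time_spent_hours"]) * hourly_rate
        return dict(totals)

    if __name__ == "__main__":
        # Write the totals as a simple two-column chargeback file.
        with open("chargeback_out.csv", "w", newline="", encoding="utf-8") as out:
            writer = csv.writer(out)
            writer.writerow(["project_code", "amount"])
            for code, amount in sorted(chargeback_totals("chargeback_export.csv", 100.0).items()):
                writer.writerow([code, f"{amount:.2f}"])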

14

Workflow and contribution - what else might you do with this kind of tool?
Workflow Management
• Make knowledge sharing more robust
• Even more granular review of industry segments served
• Inform content selection
• Inform continuing education choices
• Identify individual researcher skill improvements needed
• Use more granular statistical review for strategic capacity and content planning

Illustrating Contribution
• Increase local managers' ability to do ad hoc reporting
• Rethink metrics collected and their presentation
• Monitor turnaround time and support for client development, and match to client wins
• Monitor stakeholder support and proactively connect to special projects such as IP development (books and articles published, etc.)
15

SAMPLE SCREEN SHOTS

16

What data do you want or need to capture? (1 of 3)

17

What data do you want or need to capture? (2 of 3)

18

What data do you want or need to capture? (3 of 3)

19

What search capability do you want?

20

Sample Search Results

21

Sample Request Drill Down

22

Managing workflow

23

Illustrating contribution

24

Sample volume and types of requests, 2007 (Aug)

Possible Uses
• Monitor capacity and demand
• Are researchers doing most valued work?

[Chart: total requests by type (Complex, Simple, Document Retrieval) for 2005, 2006, and 2007 (Aug), with quarterly averages and a 2007 run rate of 41,725.]


25

Sample requests by industry/topic
Possible Uses
• Content gaps
• When cut by office – researcher skill needs

[100% stacked bar chart: Total Requests, Aug 07 (Total = 4,040), broken out by industry/topic: Other, Statistics, Retail, Company, Healthcare, Consumer Products, Financial Services, Industrial Goods & Services, TMT.]

26

Comparison of activity by units
Possible Uses
• Team utilization rates
• Identifying differences in IS use across units

27

View into time spent on types of work
Possible Uses
• Time spent on certain types of cases
• Implications for capacity/expertise required

[Chart: research time on PEG as a percentage of total time, by month, Jan 07 through Aug 07; average time per month on PEG = 23%.]

28

Key Take Away: What to think about when starting a project like this?

Vendor
• Stability
• Experience with similar clients
• Customer service and ongoing support
• Service guarantees
• On time performance
• Training/Documentation

Technology & Deployment
• Compatible
• Sustainable
• Hosting
• Scalable
• Maintenance (how)
• Security

Stakeholders
• Project team
• Decision makers
• Input and Feedback
• Buy in
• Compliance
• Maintenance (who)

Cost
• Basic investment (software, hardware)
• Customization
• Maintenance (ongoing)
• Recovery

29

Key Take Away: Hindsight is always 20/20

• Assemble a project team that is passionate about the outcome
• Call Berlitz sooner rather than later
  - You need to ensure that you and the vendor are speaking the same language
• Make testing during the pilot as thorough as possible
  - Testing did not uncover problems with some fields (e.g., offices with two-word names) or errors caused by quotation marks and other punctuation
• Give yourself time to do it right

30

ILLUSTRATIVE

Key Take Away - it will likely take longer than you originally estimate!
2004
• See product at SLA
• Preliminary evaluation and agree with concept

2005
• Intro to global team at various meetings
• Proposal to Partner – budget approval
• Virtual round robin on data capture and reporting
• Specification developed and iterated on with vendor

2006
• Pilot initial customized version
• Fix disconnects
• Pilot revised version
• Fix add-in downloads (Excel/Word)
• Develop documentation
• Product launch June 06
• Ongoing feedback

2007
• 6 month review
• Continue identifying bugs
• Evaluate recommended enhancements
• Contract for enhancements
• Test
• Launch Aug 07

31

Questions?

Toby.Pearlstein@comcast.net

32

Toby Pearlstein - Bio

Toby Pearlstein is the retired Global Director of Information Services for Bain & Company, an international management consulting firm. During her 14 years at Bain she was responsible for oversight management of Bain's information centers, including global vendor contract negotiation; budgeting; evaluation and acquisition of print and electronic products and services; and coordination of global information services policies and global IS team professional development standards. She was actively involved in Bain's early knowledge management program and the development of its intranet platform, the Global Experience Center (GXC), including a portal for end-user desktop access to contracted research resources.

Prior to joining Bain, Dr. Pearlstein was Chief Librarian & Archivist for the Massachusetts Department of Transportation for twelve years, and Curator of the Archives of the Commonwealth of Massachusetts for two years. An historian by training, with a Bachelor of Arts degree from the University of Massachusetts at Boston and an M.A. in American History from the University of New Hampshire, she also holds Master of Library Science and Doctorate degrees from the Simmons Graduate School of Library and Information Science.

Dr. Pearlstein is a Fellow of the Special Libraries Association. She has been Chair of both the Transportation and Business & Finance Divisions of SLA, as well as Chair of the Association's Professional Development Committee and a member of the Research Committee and the Finance Committee. She is a frequent presenter at conferences and contributor to professional publications. In 2007 she was the recipient of the Boston Chapter of SLA's Distinguished Service Award.

33