and validating both process changes and the benefits of technical solutions or automation. It also reduces the time required to collect process metrics by employing an easy-to-use tool and methodology.
A case study
Taratec recently performed a lab business process improvement (BPI) assessment for a large pharmaceutical R&D organization with three labs. A small piece of this assessment can be used as a simple case study to demonstrate how modeling and simulation were applied in a lab environment. This organization had recently implemented both LIMS and CDS and had an aggressive schedule of follow-on integration and new technology projects planned. A lab BPI process profile assessment found that overall acceptance and utilization of LIMS and CDS was low, resulting in a poor return on these recent investments. The results of this lab process profile offered valuable insight into the current operations, but any changes made would be trial-and-error implementations. So, to avoid the risk associated with trial-and-error changes, we employed modeling and simulation.

The first step was to develop a clear focus for the modeling effort. In this case, the purpose of the model was to test process and technology changes. The next step was to map the main activity flow. That meant defining the basic process flow, sub-processes, activities, resources and time durations, workflow sequence, business rules and behaviors. A simulation was developed to flush out mechanical mistakes in the process, to tune the performance of the model and to check the activity-based costing metrics necessary to ensure robustness of the model.

This established an ‘as-is’ baseline model that could be used to measure improvements and to test ‘what-if’ process alternatives, such as improved or standardized processes and elimination of non-value-added activities, as well as to demonstrate the benefits of integrating new technology like ELN to improve the efficiency of the process. Figure 2 illustrates a simplified model of a raw materials lab process and sub-processes for pre-lab, within-lab and post-lab activities. In this dynamic model, the computer simulates the flow of samples and information through the process.
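The article does not show any code, but the core of a discrete event simulation like the one described can be sketched in a few lines of plain Python. The sketch below models a single FIFO queue of samples served by a pool of analysts; every parameter (arrival rate, service time, analyst count) is illustrative and not drawn from the case study.

```python
import heapq
import random

def simulate_lab(n_samples=200, n_analysts=4, arrival_mean=2.0,
                 service_mean=6.0, seed=42):
    """Minimal discrete-event simulation of samples queuing for analysts.

    Times are in hours. Samples arrive as a Poisson process, wait for
    the first free analyst, and are processed for an exponentially
    distributed time. Returns the average cycle time (wait + service).
    All parameters are hypothetical, for illustration only.
    """
    rng = random.Random(seed)

    # Generate arrival times as a Poisson process.
    t, arrivals = 0.0, []
    for _ in range(n_samples):
        t += rng.expovariate(1.0 / arrival_mean)
        arrivals.append(t)

    # Each heap entry is the time at which one analyst next becomes free.
    free_at = [0.0] * n_analysts
    heapq.heapify(free_at)

    cycle_times = []
    for arrive in arrivals:
        analyst_free = heapq.heappop(free_at)
        start = max(arrive, analyst_free)   # wait if all analysts are busy
        finish = start + rng.expovariate(1.0 / service_mean)
        heapq.heappush(free_at, finish)
        cycle_times.append(finish - arrive)
    return sum(cycle_times) / len(cycle_times)
```

Running the same model with different analyst counts is exactly the kind of ‘what-if’ experiment the article describes: fewer analysts lengthen the average cycle time, more analysts shorten it, and the change can be measured before anything is altered in the real lab. Commercial tools add hierarchy, animation and costing on top of this same event-queue core.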
The model accounts for the random variations in how work is done and the way samples flow through the real-world lab. By employing discrete event simulation to capture the time-varying nature of the process under study, we were able to correlate the data produced by the model with measurements taken from the real processes. This provided a good degree of certainty that the model had adequately captured the essential features of the real process. Using this approach and simulation tool allowed us to integrate process mapping, hierarchical event-driven simulation and activity-based costing into a single modeling and simulation environment.

Figure 3 is a top-level screen shot of the model with three methods of examining the simulation results:

1. Dynamic metrics are updated as the simulation runs; we could define virtually any set of metrics, such as samples initiated, analyzed and approved.
2. At either side of the main graphic are two real-time plots that show the number of samples in the system and the number of QA resources busy. We could isolate any entity (in our case, a sample is an entity), resource or activity to collect statistics such as counts, cycle time and units busy.
3. Displayed to the left (background) is a standard numerical report that captures all performance and cost statistics the simulation model was specified to collect. These reports include volumes processed, cycle time, resource utilization and activity-based costs.

Finally, the numerical data was analyzed for both the ‘as-is’ and a number of ‘what-if’ simulations. Figure 4 presents a side-by-side comparison of some key metrics from the ‘as-is’ and ‘what-if’ simulations. This table shows that the sample volume in the ‘as-is’ and ‘what-if’ simulation models was kept constant, while cycle time was reduced from 25 to 14 days and actual hands-on processing time was reduced by nearly one-quarter of a day. In addition, resources were freed up, adding potential for increased capacity.
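The utilization and activity-based cost statistics listed above reduce to simple aggregation over recorded busy time. As a rough sketch, assuming each simulated activity is logged as a (resource, hours-busy) pair and that loaded hourly rates are known, a report like the one in Figure 3 could be tallied as follows. The resource names and rates here are hypothetical, not from the case study.

```python
def activity_report(activities, horizon_hours, rates):
    """Summarize busy units and activity-based cost per resource class.

    `activities` is a list of (resource, hours_busy) pairs logged by a
    simulation run, `horizon_hours` is the simulated time span, and
    `rates` maps each resource class to a loaded hourly labor rate.
    Returns {resource: (average units busy, total cost)}.
    """
    report = {}
    for resource, hours in activities:
        busy, cost = report.get(resource, (0.0, 0.0))
        report[resource] = (busy + hours / horizon_hours,
                            cost + hours * rates[resource])
    return report

# Hypothetical example: two analysts and one QA reviewer, 40-hour week.
report = activity_report(
    [("analyst", 30.0), ("analyst", 24.0), ("qa", 10.0)],
    horizon_hours=40.0,
    rates={"analyst": 55.0, "qa": 70.0},
)
```

The "average units busy" figure corresponds to the real-time resource plot the article describes (e.g., 0.25 means a QA reviewer was occupied a quarter of the time), while the cost column is the activity-based costing input for the business case.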
Perhaps more importantly, costs were reduced from $62K to $36K for just this very small piece of the process. This clearly demonstrates how activity-based costing can be extremely valuable in truly understanding the price of hands-on work. In addition, these types of metrics allow us to decide what other alternatives should be examined. For instance, what happens if the number of analysts is reduced from four to three, or the sample volume increases by 30 percent?
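The side-by-side comparison in Figure 4 is, at bottom, a percent-change calculation over matched metrics. Using the two figures the article does report ($62K to $36K in cost, 25 to 14 days of cycle time), the comparison can be reproduced as follows; the dictionary layout is illustrative.

```python
def compare_scenarios(as_is, what_if):
    """Percent change for each metric shared by two scenarios.

    Negative values indicate a reduction in the 'what-if' scenario
    relative to the 'as-is' baseline.
    """
    return {metric: round(100.0 * (what_if[metric] - as_is[metric])
                          / as_is[metric], 1)
            for metric in as_is}

# Values taken from the article's reported 'as-is' vs. 'what-if' results.
delta = compare_scenarios(
    {"cycle_time_days": 25, "cost_usd": 62_000},
    {"cycle_time_days": 14, "cost_usd": 36_000},
)
# Cycle time falls 44 percent and cost roughly 42 percent.
```

Expressing the results this way makes the follow-on questions (three analysts instead of four, 30 percent more samples) directly comparable: each candidate change becomes one more column in the same table.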
In summary, as companies continue to drive process improvement further and further down into their organizations and operations, it is clear that labs and their lab informatics projects will be expected to provide more measurable benefits to the business. One of the most effective approaches to meet these expectations is through the use of process modeling and simulation. As other industries have already discovered, simulation clearly adds value by providing a structured, repeatable process for developing, evaluating and comparing proposed solutions and implementation strategies that is far superior to traditional static workflow mapping techniques. Simulation also provides exceptional value by providing cost and time information to support the business case (activity-based costing), as well as the ability to visualize the ‘to-be’ state and to gain buy-in for changes, including modeling the impact of informatics technology options.
Figure 4: Side-by-side comparison of the ‘as-is’ and ‘what-if’ simulations
From Cow Paths to Superhighways
Scientific Computing, 9/19/2007
http://www.scientificcomputing.com/ShowPR_Print~PUBCODE~030~ACCT~300004346...