Process Analysis Modeling

Process modeling and simulation can add value to lab informatics projects by providing cost and time information to support the business case.

Published by Ioannis Moutsatsos on Jan 17, 2009
Copyright: Attribution Non-commercial
From Cow Paths to Superhighways
Sophisticated process analysis techniques can bring dramatic change to informatics implementations

Stuart M. Miller and John M. Petrakis

Even before it was called “lab informatics,” the concept of process improvement as a predecessor to informatics system implementations was recognized and accepted as a necessary part of most major lab informatics projects. Yet today, increasing pressure to complete informatics projects faster and cheaper, particularly in the life science industry, is resulting in abbreviated, aborted or completely ignored process analysis. IT teams are usually eager to get to the “fun stuff” of implementing the technology. Convincing them, and the budget approvers, of the need for time, resources and money to analyze and improve processes before implementing new informatics systems is not easy. The alternative, however, is, to use a common IT euphemism, to continue “paving cow paths.”

This trend of ignoring process analysis and merely paving over the old and inefficient lab business process “cow paths” with new informatics tools, like LIMS, CDS, SDMS and ELN, is even more alarming when you consider the evolving nature of commercial lab informatics systems. Newer informatics tools, like ELN and some LIMS, provide very specific niche functionality that must be intimately interwoven with everyday work processes in order to be effective. Ignoring process analysis can cost more in dollars and time than any incremental savings temporarily realized by bypassing up-front process analysis and improvement.

Budget approvers in many organizations have learned from prior mistakes: investing heavily in multiple commercial state-of-the-art informatics systems only to discover that the systems have not delivered the anticipated benefits.
Consequently, informatics project teams are being challenged to develop hard financial and process improvement metrics as part of their business cases to demonstrate the predicted return on investment (ROI) before project funding is approved. They must provide budget approvers with the hard metrics they need to decide where to spend informatics program funds and how to monitor the improvements over time. Process modeling and simulation is one method that bridges the gap between traditional static workflow tools and the increasingly sophisticated demands of lab informatics projects.
Process modeling and simulation
Like all industries, the life sciences industry is experiencing enormous pressure to reduce costs and improve efficiency. Until recently, R&D and quality labs have flown under the radar of process excellence initiatives in many companies, probably due to their relatively small size and complex business models, which are foreign to traditional business and process analysts. However, as industry executives become concerned about costs and efficiency, they are driving process excellence initiatives down to greater depths within their organizations. Recent statements by some executives predict the cost of bringing a new drug to market will balloon to over $2 billion by 2010 unless the pharma industry can find better ways to improve efficiency and effectiveness in drug development. This would include the laboratory operations that support the development process. In order to examine the business process without significant risk, cost or disruption of business operations, labs should begin adopting a methodology that has been used successfully in other industries: process modeling and simulation.
Static workflow mapping vs. dynamic process modeling and simulation
Figure 1: Comparison chart of the advantages of dynamic process modeling and simulation over static process modeling
From Cow Paths to Superhighways, 9/19/2007
http://www.scientificcomputing.com/ShowPR_Print~PUBCODE~030~ACCT~300004346...
The goal of process modeling is to create a simplified model of a business process. The models developed allow analysts to study the processes involved in a business in order to:
• uncover waste and inefficiency
• develop changes to a process to correct for performance problems
• select process designs that give the best results
• provide cost justification for the proposed changes
• establish performance metrics for the process.

But what exactly is modeling and simulation? Simulation uses a combination of dynamic modeling and simulation techniques and software tools to produce a software-based model of the flow and interaction of materials and information through a business process. It supports detailed analysis and forecasting of business process improvement outcomes and answers the questions that are most often asked, but rarely answered satisfactorily, when using traditional tools. For example:
• Capacity: What is the ideal throughput capacity of the business or system?
• Workload: Should workload (or activities) be performed at a single site or distributed?
• Technology: What is the effect of using automation or integration of systems?
• Optimization: What is the optimum number of resources to support the process?
• Resource scheduling: What personnel and equipment are needed at what time?
• Efficiency: How much time, labor or money can be saved?

Simulation can also alleviate functional, technical and management concerns around new technology or process improvement initiatives by giving stakeholders visual and quantitative proof that the new system will work, and it may even help uncover process issues, bottlenecks or solutions not previously understood or realized from static mapping approaches.

The traditional approach of static workflow mapping is very different from simulation. Workflow mapping, which employs common flow charting tools like Microsoft Visio, can be useful to a point, but has some major shortcomings compared to dynamic process modeling and simulation.
Static workflow mapping is familiar, inexpensive and generally easy to use and to understand. However, it can also be too familiar, visually underwhelming the business stakeholders who need to understand and evaluate the models. It rarely provides enough detail to fully describe the variability of the process, provide metrics or make an impact with budget approvers evaluating your business case. It provides only a snapshot of the process and cannot take into account the time-varying nature of a process.

Simulation, by comparison, has some clear advantages. For example, simulation is dynamic and graphical, providing a high-impact visualization of the process being modeled. Although more complex to build, simulation is intuitive and easy to understand; it uses the familiar pictorial representation of a process workflow but adds a graphical animation of the process in action. A well-constructed simulation model will simulate the flow of materials and information through a process, including entities such as samples, test data and approval status, and is able to account for random variations in how work is done and how materials and information flow through the real world.

One unique and valuable advantage is the ability to perform quantifiable comparisons of ‘what-if’ scenarios, facilitating selection of the best ‘to-be’ model. It allows the analyst to evaluate in quantifiable terms the effects of modified or re-engineered processes and helps to demonstrate the effect of important changes to the process, as well as modeling the effects of automation, new informatics systems, interfaces, and so forth. As part of this evaluation, analysts can perform statistical analysis of process parameters and metrics.
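The behavior described here, samples flowing through a process, queueing for scarce resources and a quantifiable ‘what-if’ staffing comparison, can be sketched in a few lines of code. The toy discrete-event queue below is purely illustrative: it is plain Python rather than any commercial simulation tool, the process and its rates are invented for demonstration, and processing times are deterministic for brevity (a real model would draw them from fitted distributions to capture random variation).

```python
import heapq

def simulate_lab(n_samples, interarrival_h, service_h, n_analysts):
    """Toy discrete-event queue: samples arrive every interarrival_h hours
    and are processed first-in, first-out by whichever analyst is free."""
    free_at = [0.0] * n_analysts           # time each analyst next becomes idle
    heapq.heapify(free_at)
    cycle_times = []
    for i in range(n_samples):
        t_arrive = i * interarrival_h
        t_free = heapq.heappop(free_at)    # earliest-available analyst
        t_start = max(t_arrive, t_free)    # sample waits if nobody is free
        t_done = t_start + service_h
        heapq.heappush(free_at, t_done)
        cycle_times.append(t_done - t_arrive)
    makespan = max(free_at)                # when the last sample finishes
    return {"throughput_per_h": n_samples / makespan,
            "avg_cycle_h": sum(cycle_times) / n_samples}

# A 'what-if' staffing comparison: does a second analyst clear the backlog?
print(simulate_lab(10, 1.0, 1.5, n_analysts=1))  # queue builds; avg cycle 3.75 h
print(simulate_lab(10, 1.0, 1.5, n_analysts=2))  # no queueing; avg cycle 1.5 h
```

Even this trivial model answers a capacity and an optimization question from the list above in quantifiable terms, which is exactly what a static flow chart cannot do.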
The simulation models produce data on key metrics such as cost, throughput, capacity and wait time that can be analyzed in order to optimize the process.

The other advantage of simulation is the ability to forecast the costs associated with a process using activity-based costing. Activity-based costing evaluates the cost of resources or equipment based on the activities performed in a discrete part of the process. The actual cost of different parts of the process can then be analyzed in terms of value-added versus non-value-added activities, enabling better cost-based decisions.

Simulation also provides future benefits in the form of a tool to monitor the process improvements implemented. The models developed can be reused as templates or dynamic reference models. Actual values of key metrics from the improved process can be loaded into the model and used for ongoing monitoring of process improvements and for reporting actual results against forecasted improvements.

Finally, simulations are more engaging than static workflow models and, therefore, more likely to evoke a response. This stimulates employee participation in identifying areas for improvement and developing new process solutions, and promotes the innovative thinking needed to ensure acceptance of the proposed process changes.

Ultimately, simulation is an extremely valuable tool to help optimize business processes. It reduces experimentation time and the risk of costly field implementations of incorrect solutions by modeling and validating both process changes and the benefits of technical solutions or automation. It also reduces the time required to collect process metrics by employing an easy-to-use tool and methodology.

Figure 2: Simplified model of a raw materials lab process and sub-processes

Figure 3: Top-level screen shot of simulation model with three methods of examining the simulation statistics
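Activity-based costing reduces to simple arithmetic once activity times and resource rates are known: per-activity cost equals hands-on hours per sample times the hourly resource rate times the sample volume. The sketch below is a hypothetical illustration; the activities, hours and rates are invented for demonstration and are not taken from the article.

```python
# Hypothetical activity table: (activity, hours/sample, $/hour, value-added?)
ACTIVITIES = [
    ("log-in and labeling",  0.25, 60.0, False),
    ("sample preparation",   0.50, 60.0, True),
    ("HPLC analysis",        1.00, 75.0, True),
    ("manual transcription", 0.30, 60.0, False),
    ("QA review / approval", 0.40, 90.0, True),
]

def activity_based_costs(sample_volume):
    """Cost of each activity = hours/sample * rate * volume, plus a split
    of the total into value-added vs. non-value-added spend."""
    costs = {name: hours * rate * sample_volume
             for name, hours, rate, _ in ACTIVITIES}
    total = sum(costs.values())
    non_value_added = sum(h * r * sample_volume
                          for _, h, r, va in ACTIVITIES if not va)
    return costs, total, non_value_added

costs, total, waste = activity_based_costs(sample_volume=100)
print(f"total ${total:,.0f}, of which ${waste:,.0f} is non-value-added")
# total $17,400, of which $3,300 is non-value-added
```

Flagging each activity as value-added or not is what makes the analysis actionable: the non-value-added subtotal is the spend a process change or new informatics tool could plausibly eliminate.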
A case study
Taratec recently performed a lab business process improvement (BPI) assessment for a large pharmaceutical R&D organization with three labs. A small piece of this assessment serves as a simple case study to demonstrate how modeling and simulation were applied in a lab environment. This organization had recently implemented both LIMS and CDS and had an aggressive schedule of follow-on integration and new technology projects planned. A lab BPI process profile assessment found that overall acceptance and utilization of LIMS and CDS was low, resulting in a poor return on these recent investments. The results of this lab process profile offered valuable insight into the current operations, but any changes made would be trial-and-error implementations. So, to avoid the risk associated with trial-and-error changes, we employed modeling and simulation.

The first step was to develop a clear focus for the modeling effort; in this case, the purpose of the model was to test process and technology changes. The next step was to map the main activity flow. That meant defining the basic process flow, sub-processes, activities, resources and time durations, workflow sequence, business rules and behaviors. A simulation was developed to flush out mechanical mistakes in the process, to tune performance of the model and to check the activity-based costing metrics necessary to ensure robustness of the model.

This established an ‘as-is’ baseline model that could be used to measure improvements and to test ‘what-if’ process alternatives, such as improved or standardized processes and elimination of non-value-added activities, as well as to demonstrate the benefits of integrating new technology like ELN to improve the efficiency of the process. Figure 2 illustrates a simplified model of a raw materials lab process and sub-processes for pre-lab, within-lab and post-lab activities.

In this dynamic model, the computer simulates the flow of samples and information through the process. The model accounts for the random variations in how work is done and the way samples flow through the real-world lab. By employing discrete event simulation to capture the time-varying nature of the process under study, we were able to correlate the data produced by the model with measurements taken from the real processes. This provides a good degree of certainty that the model has adequately captured the essential features of the real process. Using this approach and simulation tool allowed us to integrate process mapping, hierarchical event-driven simulation and activity-based costing into a single modeling and simulation environment.

Figure 3 is a top-level screen shot of the model with three methods of examining the simulation results:
1. Dynamic metrics are updated as the simulation runs; we could define virtually any set of metrics, such as samples initiated, analyzed and approved.
2. At either side of the main graphic are two real-time plots that show the number of samples in the system and the number of QA resources busy. We could isolate any entity (in our case, a sample is an entity), resource or activity to collect statistics: counts, cycle time, units busy.
3. Displayed to the left (background) is a standard numerical report that captures all performance and cost statistics the simulation model was specified to collect. These reports include volumes processed, cycle time, resource utilization and activity-based costs.

Finally, the numerical data was analyzed for both the ‘as-is’ and a number of ‘what-if’ simulations. Figure 4 presents a side-by-side comparison of some key metrics from the ‘as-is’ and ‘what-if’ simulations. The table shows that the sample volume in the ‘as-is’ and ‘what-if’ simulation models was kept constant, while cycle time was reduced from 25 to 14 days and actual hands-on processing time was reduced by nearly one-quarter of a day. In addition, resources were freed up, adding potential for increased capacity.
Perhaps more importantly, costs were reduced from $62K to $36K for just this very small piece of the process. This clearly demonstrates how activity-based costing can be extremely valuable in truly understanding the price of hands-on work. In addition, these types of metrics allow us to decide what other alternatives should be examined. For instance, what happens if the number of analysts were reduced from four to three, or the sample volume increased by 30 percent?
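The side-by-side comparison reported in Figure 4 comes down to simple arithmetic on the scenario metrics. The sketch below uses the two figures the case study states (cycle time reduced from 25 to 14 days, cost from $62K to $36K); the helper function simply computes the percent reduction for each metric.

```python
def percent_reduction(as_is, what_if):
    """Percent reduction for each metric present in both scenarios."""
    return {k: round(100.0 * (as_is[k] - what_if[k]) / as_is[k], 1)
            for k in as_is}

as_is   = {"cycle_time_days": 25, "cost_usd": 62_000}
what_if = {"cycle_time_days": 14, "cost_usd": 36_000}
print(percent_reduction(as_is, what_if))
# {'cycle_time_days': 44.0, 'cost_usd': 41.9}
```

Expressed this way, the ‘what-if’ scenario cuts cycle time by 44 percent and cost by roughly 42 percent, the kind of hard metric budget approvers ask for.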
In summary, as companies continue to drive process improvement further down into their organizations and operations, it is clear that labs and their lab informatics projects will be expected to provide more measurable benefits to the business. One of the most effective approaches to meeting these expectations is process modeling and simulation. As other industries have already discovered, simulation clearly adds value by providing a structured, repeatable process for developing, evaluating and comparing proposed solutions and implementation strategies that is far superior to traditional static workflow mapping techniques. Simulation also provides exceptional value by supplying cost and time information to support the business case (activity-based costing), as well as the ability to visualize the ‘to-be’ state and to gain buy-in for changes, including modeling the impact of informatics technology options.
Figure 4: Side-by-side comparison of the ‘as-is’ and ‘what-if’ simulations
