
Advanced formulation approaches (Version 2019)

Taught by: Nadeem Irfan Bukhari

INTRODUCTION OF THE HANDOUTS


This document has been prepared to give information on the latest computer-aided
formulation methodologies used to design and develop pharmaceutical formulations, and
on the new generation formulation technologies that address the drawbacks associated
with conventional dosage forms. The new generation formulation technologies will be
discussed in lectures.
Note:
1. The handouts have been continuously updated since 2008. Please do not use handouts
bearing an older date (they must bear the current year).
2. This handout is meant for exam preparation only, not for uploading to a personal
webpage or inclusion as a chapter in any book or notes for mass circulation.

A. ADVANCED FORMULATIONS
An advanced pharmaceutical formulation is a dosage form that demonstrates optimized
properties and is robust and free of the drawbacks associated with conventional dosage
forms. The new generation formulations are free of the following problems.
1. Pharmaceutical problems
a. Unpalatability: An unpleasant taste is masked by microencapsulation or by coating
with appropriate film-forming substances.
b. Gastric irritation and pain: Several drugs, such as the nonsteroidal anti-inflammatory
drugs, cause gastric irritation and pain or are harmful to gastric tissues. Eliminating
the problem can decrease pain and harm and enhance the safety and patient
acceptability of a dosage form. Enteric coating, targeting drugs to the small intestine
and presenting the drug in lipid-based delivery systems are a few drug delivery
options to address these issues.
c. Insolubility: Drug insolubility causes problems during manufacturing and after
administration. New formulation technologies, such as particulate delivery systems,
nanoparticles, microspheres, solid dispersions, co-grinding, etc., enhance drug
solubility.
d. Instability: In vitro drug stability and delivery of the drug intact to the blood are
important for drug efficacy. Presenting a drug in a particulate delivery system or a
specifically targeted dosage form improves the in vitro or in vivo stability of the drug.

2. Pharmacokinetic Problems
a. Poor drug absorption due to physiological barriers: Novel drug delivery systems
can address the issue of low drug absorption.

b. Poor drug distribution: Drug distribution may be altered by surface
modification or by targeting an appropriate site in the body.
c. Unrestricted distribution: Unrestricted drug distribution exposes normal tissues
unnecessarily. This can be controlled by modifying the release or by a targeted drug
delivery system.
d. Impermeability of drug: An impermeable drug shows low or erratic
bioavailability. This can be improved using an appropriate drug delivery system, such
as lipid-based drug delivery.
e. Rapid metabolism: Drug metabolism can be modified by presenting the drug as a
prodrug or in a targeted drug delivery system. Surface modification can lead to
decreased metabolic inactivation.
f. Rapid clearance: Drug elimination can be modified by the prodrug concept, surface
modification or drug targeting to an appropriate site.
g. Slow clearance: Drug elimination can likewise be modified by the prodrug concept,
surface modification or drug targeting to an appropriate site.

3. Pharmacodynamic issues
a. Short action: Some drugs have a short duration of action and thus require frequent
administration. Drug release can be modified for a longer stay in the body.
b. Prolonged action: An excessively prolonged drug action is undesirable; it can
be shortened by surface modification of the drug particles.

4. Toxicity problems
Toxicity issues can be addressed using appropriate novel technologies.

The new generation delivery systems are manufactured by novel manufacturing
methods. The categories of the advanced formulation technologies are the following:
drug carrier systems (particulate systems), prodrugs, particulates (liposomes,
niosomes, nanoparticles), lipid-based delivery systems, bioadhesives, antigen-targeted
delivery systems, novel GIT delivery systems, microchip-based (intelligent or smart)
delivery systems, and biosensor-based delivery systems.

Less information and experience are available for the design and development of the
novel drug systems; thus, the computer-aided formulation approach has particular
application in the design and fabrication of such dosage forms.

1. FORMULATION DESIGN AND DEVELOPMENT
A drug, active pharmaceutical moiety (APM) or active pharmaceutical ingredient (API) is
seldom given as such; rather, it is given as a formulation or drug delivery system.
Pharmaceutical delivery systems are designed and developed to meet the required
specifications and desirabilities. For instance, a product must have efficacy, safety and
elegance. Furthermore, it must maintain stability and other product quality attributes. A
pharmaceutical formulation contains active pharmaceutical ingredient(s) and several non-
active ingredients, called excipients. Each excipient is added to a formulation to impart
certain properties to it. A formulation, therefore, is prepared according to a certain
recipe (formula) in which specific amounts of ingredients are added, processed and
adjusted to obtain the desired properties, characteristics or specifications in the
formulation. A product that has the desired characteristics and meets all the
specifications is called an optimized formulation. Pharmaceutical formulations are
intended to be optimized systems in which all the properties (quality attributes) are
adjusted to certain desired values, which may be quantitative (numeric) or qualitative
(maximum or minimum). For instance, the targeted quantitative values of a tablet
formulation may be a disintegration time within 15 min, a hardness above 4 kg/p, 80%
release of drug within 4 hours, etc. Elegance and freedom from peeling or color defects
are qualitative attributes of tablet dosage forms.
The development of a formulation is usually a complicated process. During the
development process, the choice of excipients and their levels (amounts), as well as the
conditions of the manufacturing process, are optimized through intensive and time-
consuming experimentation in which series of formulations are prepared. As the number
of ingredients increases, the formulation becomes more and more complex because of the
involvement, and the possible interactions, of the various ingredients, which collectively
affect the final formulation.
The concentration of a formulative ingredient is a mixture variable: a formulation
property does not depend on the absolute amounts of the ingredients but on their
relative concentrations (amounts), since the total weight of a formulation is constant.
When the concentration of one ingredient in a formulation is changed, the concentration
of at least one other component must be changed or adjusted to obtain the required total

weight. For example, increasing the amount of a slow-releasing polymer in a tablet
formulation requires reducing the amount of the bulking agent, lactose, to achieve the
fixed tablet weight. Thus, it is not possible to consider the effect of one component
alone on the final properties of the formulation, since the increase or decrease of one
component is always confounded with the decrease or increase, respectively, of at least
one other component.
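The fixed-total-weight constraint can be sketched in a few lines; the 250 mg tablet below and its ingredient amounts are hypothetical, and `rebalance` is an illustrative helper, not a standard function.

```python
# Hypothetical 250 mg tablet: raising the polymer level forces the
# bulking agent (lactose) down so the total weight stays constant.
TOTAL_MG = 250.0

def rebalance(formula, ingredient, new_amount, filler="lactose"):
    """Set one ingredient's amount and absorb the difference in the filler."""
    updated = dict(formula)
    delta = new_amount - updated[ingredient]
    updated[ingredient] = new_amount
    updated[filler] -= delta          # the filler compensates the change
    if updated[filler] < 0:
        raise ValueError("filler cannot absorb this change")
    assert abs(sum(updated.values()) - TOTAL_MG) < 1e-9
    return updated

base = {"drug": 50.0, "polymer": 60.0, "lactose": 130.0, "lubricant": 10.0}
more_polymer = rebalance(base, "polymer", 90.0)
print(more_polymer)
```

Here a 30 mg increase in polymer is absorbed entirely by lactose, which is exactly the confounding described above: the polymer increase and the lactose decrease cannot be separated.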

1.1 Formulations as systems of inputs and outputs


Typically, the ingredients interact and, consequently, the product's properties depend on
and are governed by the exact ratios of the ingredients. The processing of the ingredients
also affects the product properties. A formulation can be defined as a system of inputs
(factors) and outputs (properties of the product). The physical properties of the
formulation are determined by the physicochemical properties of the component excipients
and the process of manufacturing. The formulative ingredients and the processing methods
are considered integral parts of the product's formulation. These formulation properties
can be influenced by changing the proportions of the excipients and/or by changing the
conditions of the manufacturing process. Consider the change in tablet hardness with an
increase in binder (excipient) amount or an increase in compression force. Changing the
amount of binder is a formulation factor, while changing the compression force is a
machine variable. It is assumed that in most cases the concentration, and almost always
the absolute amount, of the active substance is fixed. Other excipients with minor
concentrations can, in most cases, also appear at fixed concentrations in the formulation.
These components can be a lubricant, a glidant and/or a disintegrant.

1.2 Factors affecting product properties (quality attributes)


The following are the categories of factors affecting the properties of a pharmaceutical
formulation.
Raw material-related (physicochemical) factors: these include the physical and
chemical characteristics and properties of the raw materials which may affect the
properties of the product, for example the crystalline or amorphous nature, solubility,
pH, etc. of the

raw materials. Sometimes, the sources of the raw materials also affect the final properties
of the formulation.
Machine-related factors: these include factors related to the machines used in the
manufacturing of a certain formulation. For example, for tablets, the speed of the tablet
machine may affect the properties of the tablets.
Process-related factors: these are factors related to the process used in the
manufacturing of the product. A change in the process may influence the outcome of the
final formulation. For instance, in tablet manufacturing, the choice of wet granulation,
dry granulation or direct compression, the drying time during wet granulation, and the
internal or external addition of the binder/disintegrant may affect the tablet properties.
A change in the manufacturing method or in the sequence of addition of the excipients in
a formulation may also be counted among the process-related factors.
The variables (factors) of a formulation system may be controllable or uncontrollable
and may be known or unknown. The uncontrollable factors are the major cause of
variability in the output properties. Other sources of variability are deviations from
the set points of the controllable factors and sampling and measurement errors.
Furthermore, the system under study may be composed of parts (unknown factors) that also
exhibit variability. The ingredients of a formulation are the causes, and they affect the
resulting properties (effects) of the formulation. This has led to a model called the
cause-and-effect model.

1.3 Cause-and-effect model


In a “cause-and-effect” model, the transformation of a system (ingredients) into an output
(product) depends on the way the external factors interact with the internal components of
a system. Four types of interactions between internal and external factors can be
proposed, as indicated in Figure 1.
1. Favorable interaction: The favorable interaction of the factors leads to a desired
output. The factors in this case are called active factors. An emulsion with the desired
stability, for example, would be obtained from an interaction of the polarity of the
solvent (an internal factor of the dispersed phase) and the hydrophilic-lipophilic
balance (HLB) value of the emulsifying agent, which is an external factor. It can be
implied that by selection of appropriate external factors, a desired change can be
brought about in the
internal factors of the substance. In the case of the active factors, if the system
factors are sensitive to the adjustable external factors, the system can readily be
transformed into a desired output.
2. A partial interaction transforms the input system into an output with partially
acceptable profiles or properties. An anionic surfactant, for example, may not turn a
dispersion into a stable emulsion because of a partial interaction between the polarity
of the solvent and the HLB value of the emulsifying agent. In the case of a partially
desired output, changing the factor levels or modifying the factors themselves may
produce favorable outputs.
3. When there is no interaction between the external and internal factors, the system
remains unaltered. The factors in this case are non-factors or non-active factors. One
type of material sometimes cannot transform the inputs into outputs. Cases with no
interaction warrant a search for entirely different active factors which can cause a
favorable output. Thus, a careful selection of a set of controllable external factors at
appropriate amounts (levels) may cause an interaction which can manipulate the
inaccessible internal factors to yield a desired output.
4. When the interaction is unfavorable, the output has unfavorable features. This type of
interaction is also called a negative interaction. Theoretically, the particle size of a
polymer can be reduced when stress is applied in the presence of a surfactant. One type
of surfactant may reduce the size of the polymer while another type may increase it
under stress. Like the partial interaction above, negative interactions also warrant the
use of an entirely different set of active factors to achieve a favorable output.

Figure 1: Four possible interactions in cause and effect model
This theory is further supported by the literature, since certain parameters could not
bring about the desired change or produce the desired outputs. Furthermore, a factor
critical for a given system (formulation or manufacturing condition) may not necessarily
affect others. This is elaborated by the example given in Figure 2.

[Figure 2 diagram: factors (binder, disintegrant, bulking agent, lubricant, polymer,
glidant) on the left and properties (release, disintegration, weight variation, content
uniformity, hardness, friability) on the right, with arrows linking each factor only to
the properties it affects.]

Figure 2: Some factors critical for one property do not affect other properties (arrows
show the effect of a factor on a property)

The cause-and-effect theory can also be stated in terms of factors and responses. Factors
are the inputs and responses are the properties. Properties or responses depend on
several factors; sometimes a single property depends on more than one factor. A certain
relationship exists between the factor(s) and response(s). With knowledge of this

relationship, the responses or properties of a formulation are predictable and hence
adjustable for the optimization of the outputs. These relationships can be linear or
non-linear. Non-linear relationships are complex, which makes prediction difficult.
Advanced formulations involve complex, non-linear relationships between factors and
properties and thus require the use of computer-aided approaches to understand the
cause-and-effect relationships.
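As a minimal numerical illustration of why non-linear relationships hamper prediction (the data below are made up), a straight-line model can be fitted to a curved factor-response relationship and compared with a quadratic fit:

```python
import numpy as np

# Illustrative non-linear factor-response relationship (no noise):
# the response rises and then falls as the factor level increases.
x = np.linspace(0, 10, 11)
y = 5 + 3 * x - 0.4 * x**2

lin = np.polyfit(x, y, 1)    # degree-1 (linear) model misses the curvature
quad = np.polyfit(x, y, 2)   # degree-2 model matches the true relationship

lin_err = np.abs(np.polyval(lin, x) - y).max()
quad_err = np.abs(np.polyval(quad, x) - y).max()
print(lin_err, quad_err)     # the linear model leaves a large residual
```

The linear model predicts poorly everywhere along the curve, while the model that matches the true form reproduces it; this is why the shape of the factor-response relationship must be learned before responses can be predicted and optimized.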
The guidance for industry (2006) of the US Food and Drug Administration (FDA) and the
equivalent authorities of several countries recommend understanding the cause-and-effect
relationship to accomplish products of the desired quality attributes through approaches
called quality by design (QBD) and process analytical technology (PAT).

1.4 Quality by design


Quality by design (QBD) is a structured and organized method for determining the
relationship between the factors affecting a process and the response(s) of that process.
Under QBD, a thorough investigation of the variables associated with materials, product
design, process, etc. is necessary for understanding the effects of factors and their
interactions on the outputs, through a designed set of experiments, to achieve outputs
with the desired and predefined specifications. QBD helps achieve a predictable quality
with desired and predetermined specifications by relating critical material attributes
and critical process parameters to the critical quality attributes of the drug product.
Simply put, QBD provides understanding of the process, the output (product) and process
control. The first step in QBD is to set the predefined objectives, standards and
specifications. The main aim of QBD is to achieve a product according to, or as close as
possible to, the desired quality attributes. QBD uses multivariate experiments to
understand the product and process and to establish a design space through design of
experiments.
QBD for health products is required by the US FDA and the equivalent authorities of
several countries. The real cause-and-effect picture, i.e., the effects of factors and
factor interactions, is difficult to understand with the conventional trial-and-error
approach, termed the one-factor-at-a-time (OFAT) approach. QBD is achieved by the
computer-aided approaches such as design of experiments (DoE) or DoE combined with
artificial neural networks (ANN).

Figure 4: QBD (Q8) in the total regulatory quality system

QBD emerged in the 2000s as a furthering of quality under the International Conference on
Harmonisation of Technical Requirements for Registration of Pharmaceuticals for Human
Use (ICH). QBD is included in the regulatory quality system shown in Figure 4. Quality
assurance and GMP, introduced in the 1970s, were based on written procedures and focused
on the production process, where product defects were prevented using process control.
The system had documented procedures and was used for improved quality. The disadvantage
of the concept was that quality was owned only by the Quality Department.
The Total Quality concept was introduced in the 1980s, based on a quality culture where
everybody is responsible for quality, leading to continual improvement. In the 2000s,
QBD was introduced, where quality is to be achieved by design and made habitual using
6-sigma process capability, or better. Limitations include the difficulty of
accomplishing it and the lack of acceptance by the pharmaceutical industry. QBD was
introduced under ICH. ICH guidelines Q8 (on Pharmaceutical Development), Q9 (on Quality
Risk Management) and Q10 (on Pharmaceutical Quality System) assist manufacturers in
implementing quality by design in manufacturing operations and process development.
QBD relies on the concept of design space. The above computer-aided experimentation
generates the “design space” (shown in Figure 4), which according to the FDA is the
multidimensional combination and interaction of input variables (factors, e.g., material
attributes) and process parameters that have been demonstrated to provide optimum
responses

(outputs). The design space is a region where specifications are consistently met. Thus,
the design space is a graphical optimization plot of the multiple factors and properties
under study. The design space provides the allowable operating boundaries within which
the process factors can vary with little risk of producing off-grade product. The design
space is a processing window that provides a high level of confidence that 99% of the
output population meets (or exceeds) specifications. The design space provides assurance
of quality; thus, finding the design space is the aim of optimization. The design space
is proposed by the applicant for a new drug (the pharmaceutical industry) to the
regulatory authority and is subject to regulatory assessment and approval. Future work
at factor levels demonstrated to lie within the design space is not considered a
“change” in the product or process and would not initiate an investigational new drug
application (INDA) or a regulatory post-approval change process. Any of the following
conditions is considered a change: a change in manufacturing site, a change in
manufacturing method, a change of raw material suppliers, a minor modification of the
formulation, or a modification of the product strength; in such cases,
bioavailability/bioequivalence studies are required. When work within the design space
is not considered a change, this is a flexibility for the formulator. This regulatory
flexibility is a great incentive for the pharmaceutical industry.

[Figure 4: Design-Expert® Software overlay plot of impeller speed (A, x-axis, 120-180)
against addition rate (B, y-axis, 65.00-85.00), with the design space shown as the
region meeting Yield ≥ 65.000 and Fines ≤ 20.000 (confidence-interval contours: Yield
CI 65.000, Fines CI 20.000, Overs CI 13.968).]

Figure 4: Design space showing the area within which the factor settings give the desired
properties and within which an adjustment is not considered a regulatory change
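The overlay-plot idea behind Figure 4 can be sketched numerically. The two response models below (yield and fines as functions of impeller speed and addition rate) use invented coefficients for illustration only; the design space is simply the set of factor settings where every specification holds at once:

```python
import numpy as np

# Hypothetical fitted models for two granulation responses as functions of
# impeller speed (A, rpm) and binder addition rate (B, g/min).
# The coefficients are illustrative, not from real data.
def yield_pct(A, B):
    return 30 + 0.4 * A - 0.002 * A**2 + 0.3 * B

def fines_pct(A, B):
    return 60 - 0.25 * A - 0.2 * B + 0.001 * A * B

A = np.linspace(120, 180, 61)     # impeller speed grid (x-axis of the overlay)
B = np.linspace(65, 85, 41)       # addition rate grid (y-axis of the overlay)
AA, BB = np.meshgrid(A, B)

# Design space: the region where ALL specifications are met simultaneously.
in_space = (yield_pct(AA, BB) >= 65.0) & (fines_pct(AA, BB) <= 20.0)
print(f"{in_space.mean():.0%} of the explored window meets both specs")
```

Plotting `in_space` as a shaded region over the A-B plane reproduces the overlay plot: inside the shaded window the factors can vary with little risk of an off-grade batch.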

1.5 Process analytical technology
Process analytical technology (PAT) is a system for designing, analysing and controlling
manufacturing through timely measurements, during processing, of critical quality and
performance attributes of raw materials and processes, with the goal of assuring final
product quality. Risk assessment of the critical processing variables is also noted for
the process under PAT. The critical processing variables, or control points, in the
process under study are the sub-tasks of a complex operation that are used to measure
the success or failure of the whole operation. The specifications of the chosen critical
points are pre-defined, and the results of measurements during processing are matched
against them to decide whether the process is proceeding error-free. PAT for health
products is required by the US FDA and the equivalent authorities of several countries.
Like QBD, PAT is accomplished by design of experiments (DoE) or DoE combined with
artificial neural networks (ANN).
Six sigma (6σ) is a terminology related to QBD and PAT. Six sigma is the accomplishment
of a level of output (product) quality at which the number of defects is not more than
3.4 per million produced. This is near zero defects: a level of quality where 99.9997%
of the products are free of defects, so such products are called products with near-zero
defects. With the help of QBD and PAT employing DoE and ANN, it is now possible to
achieve products with 6σ quality. The FDA and other equivalent drug regulatory
authorities now require the use of all these approaches to improve the quality of drugs
and medical devices. The innovative pharmaceutical companies measure their overall
performance using the QBD, PAT and 6σ approaches, where this was earlier based on
quality assurance, quality control, current good manufacturing practices and the total
quality concept. QBD, PAT and 6σ have emerged as the new GMP concepts for the 21st
century.
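The 3.4 defects-per-million figure can be reproduced from the normal distribution; the sketch below assumes the conventional 1.5-sigma long-term drift of the process mean, so the nearest specification limit sits 4.5 sigma away from the shifted mean:

```python
from statistics import NormalDist

# Six sigma quality: the specification limit is 6 sigma from the nominal
# process mean, but the mean is allowed to drift 1.5 sigma over the long
# term, leaving 4.5 sigma to the limit. The defect rate is the normal tail
# beyond that distance.
sigma_level = 6.0
long_term_shift = 1.5
tail = 1.0 - NormalDist().cdf(sigma_level - long_term_shift)

defects_per_million = tail * 1_000_000
print(f"{defects_per_million:.1f} DPMO")   # about 3.4 defects per million
```

The complementary figure, 1 - 3.4e-6 ≈ 99.9997%, is the defect-free fraction quoted above.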

2. TRADITIONAL FORMULATION APPROACH


The traditional or conventional approach to formulation is based on trial and error,
where the focus is the adjustment of one individual factor at a time while fixing all
other factors. The adjustment of the individual factor is usually based on the experience
of the formulator. Gaining knowledge of the relationships between the factors and the
responses is not emphasized in the conventional approach. Thus, the formulations
resulting from the

traditional approach are also called experience-based formulations. The desired
properties of the formulation are obtained by changing one factor and holding all others
fixed. Some initial experiments, with levels of the ingredients selected on the basis of
experience, are carried out. The succeeding experiments are based on the results obtained
after each experiment, in the direction of increase (or decrease) of the response
(property). In this way a maximum (or minimum) of the property is reached. Since in this
approach the factors for the product properties are optimized one by one, it is also
called the one-factor-at-a-time (OFAT) approach. This is the sequential approach to
formulation development. If, after formulating a product, it is not the desired one, then
the center of attention becomes another single specified factor. One by one, holding the
other factors at constant levels, effort is made to obtain the desired
product/formulation. The OFAT approach cannot achieve an optimized output. The reason for
OFAT failure is that the multiple responses (properties) of a product are related
differently to the factors. The traditional OFAT approach has certain other limitations.

2.1 Limitations of the traditional approach


A. In OFAT, the experimental process is unplanned and based on trial and error.
B. The approach is less effective, since it may improve the settings of the factors and
properties but never approaches the optimum.
C. Development is sequential: the factors are adjusted one by one for each product
property.
D. OFAT is unable to estimate the effect of each factor independently of the effects of
the other factors. Thus, factor interactions remain unrevealed.
E. The approach cannot obtain information on two-factor interactions, which may be
synergistic or antagonistic.
F. OFAT requires a larger number of experiments to obtain information helpful for making
formulation decisions.
G. The product of OFAT is not knowledge-based.
H. OFAT is time-consuming, laborious and costly.
Figure 5 shows that the OFAT approach cannot achieve a product of the aspirational or
desired quality. Even after further developmental effort, the maximum level achievable
using OFAT is just above the acceptable level.

Figure 5: Comparative output for OFAT and latest computer-aided approaches (DOE and
ANN)
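The OFAT failure mode described above can be demonstrated with a toy response containing a factor interaction (the coefficients are invented for illustration): one OFAT pass stalls well below the optimum that a simultaneous search over the same factor levels finds.

```python
import itertools

def response(x, y):
    # Toy product property with a ridge-shaped factor interaction
    # (illustrative): the best result needs x and y raised together.
    return 2 * (x + y) - 5 * (x - y) ** 2

levels = [0.0, 0.25, 0.5, 0.75, 1.0]

# OFAT: fix y at a starting level, optimize x; then fix that x, optimize y.
y0 = 0.0
best_x = max(levels, key=lambda x: response(x, y0))
best_y = max(levels, key=lambda y: response(best_x, y))
ofat_best = response(best_x, best_y)

# Simultaneous (designed) search over the full grid of combinations.
grid_best = max(response(x, y) for x, y in itertools.product(levels, levels))

print(ofat_best, grid_best)   # prints 1.1875 4.0
```

Because the factors interact, the best x depends on y and vice versa; adjusting them one at a time locks the search onto the ridge's flank, while the simultaneous grid reaches the true optimum at x = y = 1.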

OFAT                                            QbD
Product development: empirical approach         Designed/scientific approach
Manufacturing process: fixed process            Adjustable based on design space
Process control: in-process control             QbD, PAT tools
Specifications: based on batch data/history     Based on product performance
QC strategy: in-process and finished-product    Risk-based approach with real-time
  testing                                         release testing
Life cycle management: reactive                 Preventive
Little information                              Knowledge-based, built into product;
                                                  rich in understanding
Static process allowing no change               Flexible process allowing change
Focus on reproducibility                        Focus on robustness, reduced variation,
                                                  identification of critical control points
Quality assured by testing                      Quality assured by design
3. COMPUTER-AIDED FORMULATION
Computer-aided formulation, or artificial intelligence-based formulation, is a new and
powerful approach to pharmaceutical formulation which works by using artificial
intelligence and computational approaches, coupled with visualization, statistical
validation and robust optimization methods. Currently, design of experiments (DoE) and
artificial neural network (ANN) strategies have found rapidly increasing application in
optimization, as against the classical OFAT approach. These approaches require
computerized decision support systems that recognize the relationships existing between
the factors and the responses. Computational approaches can reduce the formulator's
effort by automatically generating knowledge (of the relationships between factors and
responses) directly from data obtained from planned experimentation using different
settings (levels) of the factors. Thus, the resulting formulations are called
knowledge-based formulations. The newer approaches allow simultaneous optimization of
all properties and are therefore also called simultaneous approaches. Such approaches
plan the complete set of experiments, called the experimental design or matrix,
beforehand. This matrix is generated by the mathematical and statistical algorithms in
DoE from a supplied range of factor levels. However, this matrix system is not a
requirement of the ANN approach. The number of experiments is based on the number of
factors and the precision of prediction of the optimized factor levels. The experiments
are carried out according to the plan (matrix or grid), the data are entered into a
decision support system (software), and the results are fitted to a mathematical model.
The response values can then be predicted over a range of settings of the variables
(formulative, process and machine). A wide range of possible choices (factor settings)
is available for a product in a matrix.
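The pre-planned experimental matrix mentioned above can be sketched for the simplest case, a two-level full factorial design; the three factors and their low/high levels below are hypothetical:

```python
import itertools

# Sketch of a pre-planned experimental matrix: a two-level full factorial
# design for three hypothetical tablet-process factors. Each row of the
# matrix is one experiment to run.
factors = {
    "binder_pct":      (2.0, 6.0),     # (low, high) levels, illustrative
    "compression_kN":  (10.0, 20.0),
    "mixing_time_min": (5.0, 15.0),
}

names = list(factors)
matrix = [dict(zip(names, combo))
          for combo in itertools.product(*factors.values())]

print(len(matrix))   # 2**3 = 8 planned runs
for run in matrix:
    print(run)
```

Every combination of levels appears exactly once, which is what later allows the main effects and the two-factor interactions to be estimated independently, something the sequential OFAT path cannot do.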

3.1 Advantages and applications of the advanced approaches


A. Formulation design for complex formulations: Complex formulations, which have
several properties or several factors, are the major candidates for these approaches.
B. Development of new products: New products for which much information or
experience is not available can readily be developed using these approaches.
C. Revealing the interactions between different variables: The advanced formulation
approaches reveal interactions between the factors, which may be antagonistic or

synergistic for a particular property. Information may be obtained by which a factor
can be excluded from the system entirely without compromising the quality of the
product. This is called a breakthrough, which can only be achieved with the
computer-aided formulation approaches.
D. Enhancement of product quality and performance at low cost: The advanced
approaches require a smaller number of experiments for optimization of a product or
process and thus require less material and time for optimized formulation
development.
E. Shorter time to market: Since an optimized product can be developed in less time,
the product can be placed on the market sooner. This can provide an edge in the
market competition.
F. Improved customer response: With improved quality products, consumers have
more confidence in the product.
G. Improved competitive edge: Less time for a research product from bench to bedside,
improved product quality and improved consumer confidence in the product lead
to an improved competitive edge for a pharmaceutical company that uses computer-
aided approaches.
H. Recommended by FDA/equivalent regulatory authorities: The use of the advanced
formulation approaches is recommended by the FDA, the equivalent regulatory
authorities of several countries and the standard-setting organizations for
formulation design, process validation and developing control plans.
I. Regulatory flexibility: The regulatory authorities recommend the use of the advanced
approaches because these generate the “design space”. Re-working the factor levels
demonstrated to lie within the design space is not considered a “change” in the
product or process and so does not initiate a regulatory post-approval change process.
Thus, these approaches provide a regulatory flexibility which is a great incentive for
the pharmaceutical industry.
J. Role in scale-up and post-approval changes (SUPAC): The advanced computer-aided
approaches provide information helpful for appropriate scale-up of products. Owing
to the generation of the design space, post-approval changes are also

possible without initiating an investigational new drug (IND) application or a new drug
application (NDA).

Due to the above advantages, the advanced formulation approaches have wide applications
in the following fields:

a. Formulation design for pharmaceutical products


b. Optimization of pharmaceutical formulation
c. Optimization of pharmaceutical process
d. Pharmaceutical process validation
e. Industrial scale up
f. Cost reduction
g. Prediction

3.2 Need for the advanced formulation approaches


The failure of the traditional approaches to optimize the output has created the need
for the advanced formulation approaches. Coping with the following is becoming
increasingly difficult for pharmaceutical formulation by the traditional approaches:

1. Increasing pressure to develop new products quickly to cope with market
competition
2. Products with more stringent quality standards
3. Partial or total unavailability of historical knowledge for the new formulations
4. The complexity of the formulation task, because there is often no model for a
detailed understanding of how changes in formulation ingredients affect product
properties. The data generated during the optimization process are huge and
difficult to understand.
5. The multidimensional nature of the optimization process (some properties are
required to be at a minimum while others are to be at a maximum)
6. The existence of an opportunity to improve formulation operations, and the
resulting profitability, by streamlining the formulation design tasks
7. The expense of the experimentation required, in terms of laboratory and staff time
and of opportunities missed through slow response to new customer requirements
8. The recommendation of the advanced formulation approaches by the FDA,
regulatory authorities and standard-setting organizations

3.3 Statistical optimization approach
Design of experiments (DoE) is carried out systematically; it identifies the critical
variables, reveals factor interactions and helps obtain the combination of variables that
accomplishes the optimum response with a smaller number of experiments. DoE is a
statistical approach whose algorithms are based on the following components:

1. Factorial analysis and the analysis of variance (ANOVA): These give information
on the statistically significant factor(s) and their interactions, individually for each
property included in a study.
2. Principal component analysis (PCA): PCA supplements the findings of the ANOVA
and shows the principal, major and core factors for individual properties.
3. Polynomial regression: Regression is used to predict the combination of factor levels
that forecasts the best properties.
4. Response surface methodology (RSM): RSM comprises several mathematical
algorithms that help in the optimization of the properties. In this approach, two
factors are related simultaneously to a property.
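As a small illustration of component 1, a one-way ANOVA can test whether a single factor significantly affects a property. The sketch below uses made-up disintegration times at two binder levels (the factor name and numbers are hypothetical) and assumes SciPy is available:

```python
from scipy.stats import f_oneway

# Made-up disintegration times (min) observed at two binder levels.
low_binder = [2.1, 2.4, 2.2, 2.3, 2.0]
high_binder = [3.5, 3.8, 3.6, 3.9, 3.7]

# One-way ANOVA: a small p-value indicates the binder level is a
# statistically significant factor for disintegration time.
f_stat, p_value = f_oneway(low_binder, high_binder)
print(f"F = {f_stat:.1f}, p = {p_value:.4f}")
```

In a real study the same test would be run for every factor and property, which is exactly the bookkeeping that DoE software automates.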

This approach finds the relationship between the factors and the properties statistically.
Two factors at a time are related to a single property out of a set of several factors and
properties. In Figure 6, the simultaneous effect of two factors, baking time and oven
temperature, on the property of taste rating is given as a response surface. The maximum
taste rating could be achieved when the oven temperature is set above 170 °C and the
baking time at about 55 min.

Figure 6: Combined effect of baking time and oven temperature on the taste rating
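A surface like the one in Figure 6 is typically obtained by fitting a second-order polynomial to the experimental runs. The sketch below uses hypothetical (baking time, oven temperature) runs whose taste ratings are generated from a known quadratic purely for illustration, then locates the predicted optimum on a grid:

```python
import numpy as np

# Hypothetical runs: (baking time in min, oven temperature in °C).
times = np.array([45, 45, 55, 55, 65, 65, 50, 60, 55])
temps = np.array([160, 180, 160, 180, 160, 180, 170, 170, 170])

def quad_terms(t, T):
    """Second-order model terms: intercept, linear, quadratic and interaction."""
    return np.column_stack([np.ones_like(t), t, T, t**2, T**2, t * T])

# Made-up "true" surface used only to generate the taste ratings.
true_beta = np.array([-500.0, 4.23, 5.045, -0.04, -0.015, 0.001])
ratings = quad_terms(times, temps) @ true_beta

# Least-squares fit of the quadratic response surface to the runs.
beta, *_ = np.linalg.lstsq(quad_terms(times, temps), ratings, rcond=None)

# Evaluate the fitted surface on a grid and locate the predicted optimum.
tg, Tg = np.meshgrid(np.linspace(40, 70, 61), np.linspace(150, 190, 81))
pred = quad_terms(tg.ravel(), Tg.ravel()) @ beta
best = pred.argmax()
print(f"predicted optimum: time = {tg.ravel()[best]:.1f} min, "
      f"temp = {Tg.ravel()[best]:.1f} °C")
```

With these illustrative numbers the fitted surface peaks at about 55 min and 170 °C, matching the reading of Figure 6 in the text.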

3.3a Flow (experimental strategy) of computer-aided formulation design


A general procedure for executing DoE is given in Figure 7; the same strategy is also
applicable to ANN. The phases in the DoE procedure are discovery, breakthrough,
optimization and validation (confirmation).

Figure 7: A suggested experimentation strategy for optimum design of experiments (DoE)

1. Discovery: Discovery is the first step in DoE, in which all the possible factors that may
affect the formulation are considered. Brainstorming on the problem under study is the
basic tool, carried out in a group of experts. The Ishikawa fishbone diagram (Figure 8)
is used as a team brainstorming tool to evoke ideas for as many factors (causes) as
possible that may affect a particular output. The property/response is placed at the right
of a straight line called the spine or backbone. Categories of the factors are drawn and
connected to the backbone through angled lines in such a way that the illustration
resembles a fishbone. Thus, a fishbone graphically lays out all the possible factors
behind a particular property. The factors are classified as real variables (whose values
can be changed), fixed variables (whose values could be changed but are deliberately
fixed due to technological limitations) and uncontrollable factors, which are beyond
control in an experiment. If the factors are known and their number is up to 3, then
RSM is carried out directly.
When the factors number more than three and their ranges are unknown, a range-finding
pilot study is carried out to find the range of each factor. Usually a wider range of a factor
is selected to capture the effect of change in the amounts (levels) of the factor on the
property. However, there is usually a restriction on the upper limit due to toxicity or other
pharmaceutical issues.

[Figure 8 shows a fishbone diagram. Causes on the bones: carrier's attributes (HPMC,
PEG 4000; concentration), surfactant's attributes (Tween 80, Span 80; concentration),
drug's attributes (particle size, moisture content) and preparation method (wet
granulation, dry granulation). Effects at the head: high dissolution and disintegration,
low friability and low weight variation.]
Figure 8: Fishbone listing factors affecting the final properties of a tablet formulation

2. Breakthrough: Screening studies under the breakthrough phase are more statistically
intensive and planned than the pilot study. A screening study helps narrow down the
large number of factors to a few critical factors. In a screening study, a factor can be
studied at 2 levels, lower and higher. In this simplest study, the number of experimental
runs is minimal. Full factorial, optimal, Plackett-Burman or Taguchi designs use more
levels of a factor and have their own advantages and limitations. This phase of DoE is
called breakthrough because the factors considered theoretically the most important for
a property are often revealed to be otherwise.
3. Optimization of the properties is carried out using RSM. Several tools are available
under RSM: central composite design, Box-Behnken design, 3-level factorial design
and optimal design. These designs generate different matrices (plans for performing
the experiments) for different levels of the factors. Sometimes an experiment
performed without such a matrix is analyzed using the data as "history" for RSM.
4. Validation: Based on the predicted levels of the factors given for the predicted
optimized properties of a formulation, a real formulation is manufactured. An
agreement between the predicted and the real formulation properties at the suggested
factor levels is the measure of success. Sometimes the output is not achieved according
to the prediction. In this case, design augmentation is employed, which may simply be
the addition of another factor level to the previous factor levels, or replication of the
whole design.
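The 2-level screening logic of the breakthrough phase can be sketched numerically. The example below assumes a hypothetical 2³ full factorial study of three tablet-making factors with made-up hardness responses; the main effect of each factor is simply the difference between the mean response at its high and low levels:

```python
from itertools import product

# Hypothetical 2^3 full factorial screening: three factors at coded levels
# -1 (low) and +1 (high), with made-up tablet hardness (kp) as the response.
factors = ["binder %", "compression force", "lubricant %"]
runs = list(product([-1, 1], repeat=3))  # 8 runs covering every combination

hardness = [4.1, 4.3, 6.0, 6.4, 4.2, 4.4, 6.1, 6.5]  # same order as `runs`

# Main effect = mean response at the high level - mean at the low level.
effects = {}
for j, name in enumerate(factors):
    high = [y for run, y in zip(runs, hardness) if run[j] == +1]
    low = [y for run, y in zip(runs, hardness) if run[j] == -1]
    effects[name] = sum(high) / len(high) - sum(low) / len(low)
    print(f"{name}: main effect = {effects[name]:+.2f}")
```

In this made-up data set the compression force dominates; this is the kind of result that lets a screening study discard minor factors before the more expensive RSM phase.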

The statistical approach becomes more difficult with more than three or four inputs, since
the formulator is tempted to oversimplify the problem (for example, by restricting a study
to three input variables) in order to model it. Statistics also often requires the assumption
of a functional form (for example, linearity) in order to generate a model, and such
assumptions can be inappropriate for complex tasks like formulation.

3.4 Artificial neural network


As an alternative to the statistical approach, the artificial neural network (ANN), a
biologically inspired mathematical construct (algorithm), mimics the learning of the
human brain through modeling of, and pattern recognition within, data. In an ANN, the
complexity of the biological neural architecture is highly abstracted: numerous processing
elements (PEs), analogous to neurons (called artificial neurons, nodes or units), are
connected to other PEs through connections comparable to synapses, with coefficients
(weights) similar to signal strengths (thresholds) and with outputs representing axons
(Figure 9).

Figure 9: A synapse – the basis for ANN

The comparison and contrast of the biological neurons and the artificial neural network is
given in Table 1.

Table 1: Biological neurons and their analogues in an artificial neural network

Biological neurons                 Artificial neural network analogues
Neuron                             Processing elements/nodes/units
Synapse                            Node-to-node connections
Signal strength (threshold)        Weights/coefficients
Axons                              Outputs
Dendrites                          Inputs
Learning                           Training (finding the cause-and-effect
                                   relationship within given data)
Complex functionality              Highly abstracted (simplified) functionality
Slow speed                         Fast speed
Numerous neurons (n = 10^9)        Few neurons (n = 10^2 - 10^3)

The pattern of connectivity among the ANN units is equivalent to a mammalian neural
architecture, as shown in Figure 10. A typical ANN forms input and output layers and at
least one hidden layer. An ANN works by reducing the error between the observed and
predicted outcomes by adjusting the weights. Like biological neural learning, it acquires
knowledge from a learning process responsible for adapting the connection strengths
(weight values) to the input stimuli. Mathematically, it detects the underlying patterns in
data, recognizes the functional relationships between factors and responses and predicts
the optimum levels of the factors from limited input data. Finding the relationships
between cause and effect is called training. ANNs are particularly suitable for complex
and non-linear systems for which the conventional approach is more exhausting.

Figure 10: Schematics for the architect of a neural network

Use of an ANN for optimization does not require any prior knowledge. The neural network
makes no assumptions about the functional form of the relationships; it simply generates
and assesses a range of models to determine the one that best fits the experimental data
provided to it. As such, ANNs are increasingly used to model complex behaviour in
problems like pharmaceutical formulation and processing. The models generated by
neural networks allow "what if" possibilities to be investigated easily. An ANN may even
require fewer experiments than DoE.
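The weight-adjustment idea described above can be illustrated with a tiny network written from scratch. This is a generic sketch, not any specific pharmaceutical model: the two inputs stand in for coded factor levels and the response is a made-up non-linear function of them.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two inputs standing in for coded factor levels, one response that
# depends on them non-linearly (made up purely to illustrate the mechanics).
X = rng.uniform(-1, 1, size=(40, 2))
y = (X[:, 0] ** 2 + np.sin(2 * X[:, 1])).reshape(-1, 1)

# One hidden layer of 8 tanh units; the weights play the role of synapses.
W1 = rng.normal(0.0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)

lr, losses = 0.05, []
for _ in range(2000):
    h = np.tanh(X @ W1 + b1)        # forward pass through the hidden layer
    pred = h @ W2 + b2              # network output
    err = pred - y
    losses.append(float(np.mean(err ** 2)))
    # Back-propagation: push the error back and adjust the weights --
    # "training" in the sense of Table 1.
    gW2 = h.T @ err / len(X); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h ** 2)   # derivative through tanh
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

print(f"mean squared error: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

The falling error over the iterations is the "learning" the text describes: no functional form was assumed, yet the network recovers the non-linear input-output pattern.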

4. COMPARISON OF DoE AND ANN
DoE and ANN are compared in Table 2.

Table 2: Comparison of DoE and ANN

DoE                                  ANN
Statistics-based approach            Neural network-based approach
Regression                           Supervised learning
Interpolation                        Generalization
Parameters (statistical)             Weights (synaptic)
Independent variables (factors)      Inputs
Dependent variables                  Outputs

5. OPTIMIZATION
Closeness of the properties of a product to its pre-set (desired) criteria for the critical
attributes is called optimization; in other words, optimization is achieving the desired
properties of a product. The latest computer-aided approaches, such as DoE and ANN,
optimize several properties of a product simultaneously; hence this is also called
multi-objective or simultaneous optimization. For instance, an optimized tablet
formulation is one with high hardness (crushing strength) and low disintegration time,
which has a certain dissolution profile and is robust towards (small) deviations (errors) in
the process conditions, mixture variable settings or the environment. In a robust
formulation, despite small variations, the values of the properties remain at (almost) the
same level or deviate only within an acceptable range. It is of course desirable to
accomplish a product which maintains exactly the same values of the properties (or
specifications) during and after production, storage before use, or during use, but this may
be costly and is not always needed or achievable.
Usually there are a number of criteria that a formulation has to fulfil, which makes
optimization challenging. It is, however, almost never possible to fulfil all the criteria at
once. This means that a compromise must be found between certain criteria; usually,
non-critical attributes are compromised if such a compromise must be undertaken. Many
methods are available to search for such a compromise variable setting. A pharmaceutical
formulation is usually optimized with a compromise between cost and quality. The
robustness aspect can also be extended towards environmental factors like temperature
and (more importantly) humidity. It can be predicted what the shelf life of a formulation
under certain conditions is, or what the desired conditions are to keep a product stable at a
desired quality level for a certain time.
When there is only one response or property, the goal is often to search for a maximum or
a minimum of that response. In practice, however, optimization is challenging and difficult
to accomplish because optimization problems usually require simultaneously optimizing
multiple properties. For instance, a modified-release formulation may be required to be
optimized for friability, hardness, weight variation, content uniformity, disintegration time
and dissolution at different time intervals. Optimization of a product with a larger number
of factors and properties, which is the usual case, is itself difficult and complex.
Furthermore, the desirability for some of the properties is to minimize them (e.g.,
friability), to maximize them (e.g., release from an immediate-release formulation) or to
target a certain absolute value (e.g., dissolution targeted to 80% within 4 h). Sometimes
the optimization requires adjusting qualitative parameters, e.g., setting a zero-order
release. This complexity in simultaneous optimization is dealt with by undertaking a
compromise on certain properties. It may be necessary to trade off properties during such
experimentation, i.e., to sacrifice one characteristic in order to improve another, e.g., to
accept a tablet with lower hardness in order to achieve the desired dissolution profile.
Thus, the primary objective may not be to optimize absolutely but to compromise
effectively and thereby produce the best formulation under a given set of restrictions.
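One widely used way to formalize such a compromise is a desirability function (the Derringer-Suich approach): each property is mapped onto a 0-1 desirability scale according to whether it should be minimized, maximized or targeted, and the geometric mean of the individual scores is then maximized. A minimal sketch, with hypothetical tablet specifications and measurements:

```python
def d_max(y, lo, hi):
    """Desirability for a property to maximize (e.g., hardness)."""
    return min(max((y - lo) / (hi - lo), 0.0), 1.0)

def d_min(y, lo, hi):
    """Desirability for a property to minimize (e.g., friability)."""
    return min(max((hi - y) / (hi - lo), 0.0), 1.0)

def d_target(y, lo, target, hi):
    """Desirability peaking at a target (e.g., dissolution aimed at 80%)."""
    if y <= lo or y >= hi:
        return 0.0
    return (y - lo) / (target - lo) if y <= target else (hi - y) / (hi - target)

# Hypothetical candidate tablet: hardness 6.5 kp, friability 0.4%, 78% dissolved.
scores = [
    d_max(6.5, lo=4.0, hi=8.0),                     # harder is better
    d_min(0.4, lo=0.1, hi=1.0),                     # lower friability is better
    d_target(78.0, lo=60.0, target=80.0, hi=95.0),  # dissolution aimed at 80%
]

# Overall desirability is the geometric mean, so one unacceptable property
# (score 0) rejects the whole formulation regardless of the others.
overall = 1.0
for s in scores:
    overall *= s
overall **= 1.0 / len(scores)
print(f"individual scores: {[round(s, 3) for s in scores]}, overall: {overall:.3f}")
```

The geometric mean is the formal version of "compromise effectively": a formulation scores well only when every critical attribute is at least acceptable.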
In some cases, the responses most different from each other are selected for optimization.
The optimization procedure can be simplified by discarding responses that are highly
correlated with other responses. A statistical approach called principal component analysis
(PCA) is used to select these key responses. An ANN finds the critical factors by
recognizing ("learning") the pattern in the data. The above information is then used for
optimization.
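The idea of discarding highly correlated responses can be seen in a small sketch. With synthetic data in which three dissolution time points move together while hardness varies independently, PCA (computed here via SVD of the mean-centered matrix) shows that one component carries almost all of the dissolution information:

```python
import numpy as np

# Hypothetical response matrix: 6 formulations x 4 properties. Dissolution at
# 1 h, 2 h and 4 h rise and fall together; hardness varies independently.
d1 = np.array([20.0, 35.0, 50.0, 65.0, 80.0, 95.0])
responses = np.column_stack([
    d1,                  # % dissolved at 1 h
    1.1 * d1 + 2.0,      # % dissolved at 2 h (correlated with 1 h)
    1.2 * d1 + 5.0,      # % dissolved at 4 h (correlated with 1 h)
    np.array([6.0, 4.5, 7.0, 5.0, 6.5, 4.0]),  # hardness (kp), independent
])

# PCA via SVD of the mean-centered matrix; squared singular values give the
# variance carried by each principal component.
centered = responses - responses.mean(axis=0)
_, s, _ = np.linalg.svd(centered, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)
print("variance explained per component:", np.round(explained, 3))
```

In practice one would then keep a single dissolution time point (or the component score) together with hardness, rather than optimizing all four responses separately.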
Achieving optimization is difficult, as the product must meet specifications which are
complex to accomplish. The complexity increases with the number of factors and
properties. Usually the factor settings (levels) for one property differ from those for
another property; for example, even the release profile at different time intervals makes
simultaneous optimization difficult.

Sample Questions (possible marks in parentheses)

1. What is an advance formulation? (3)
2. Describe formulation design and development. (12)
3. "Formulation is a system of input and output." Comment. (10)
4. Why is it difficult in the traditional approach to assess the effect of changing one
factor at a time on the final property of a formulation? (8)
5. What are the several factors affecting the final properties of a product? (8-10)
6. What are the different types of interactions between the internal and external
factors of a formulation system? (10)
7. Prove that the formulation is a system of factors and responses. (8)
8. What is the traditional formulation approach? What are its limitations? (10)
9. Why does the one-factor-at-a-time (OFAT) approach fail to achieve an optimum
product? (6)
10. Describe the new formulation approaches, why they are needed and what their
advantages and applications are. (20)
11. Describe the "design space". What is its importance? (6)
12. Describe briefly quality by design (QbD), process analytical technology (PAT)
and six sigma. (6+6+3)
13. Compare the biological and artificial neural networks. (3-5)
14. Compare DoE and ANN. (3-5)
15. Match ANN with the biological nervous system. (5)
16. Why is there a need for the computer-aided formulation approaches? (6)
17. Discuss the statistical optimization approach. Describe its experimental
strategy. (14)
18. What is a fishbone? Describe its applications. (6)
19. What is an artificial neural network (ANN)? How does an ANN resemble the
biological neural system? How does it work? (16)
20. What is optimization? Why is optimization challenging? (10)
21. Why is a compromise in desired formulation properties required? (6)
________________________________________________________________________

The document has been prepared mainly from the following references. Other references
are available on demand.

Colbourn, E. Spotlight on Intelligensys. Controlled Release Society Newsletter, 21(3).

Ibrić, S., Djurić, Z., Parojčić, J., Petrović, U. Artificial intelligence in pharmaceutical
product formulation: Neural computing. Chemical Industry & Chemical Engineering
Quarterly, 15(4), 227-236 (2009).

Ladani, R. K., Patel, M. J., Patel, R. P., Bhatt, T. V. Modern optimization techniques in
the field of pharmacy. Research Journal of Pharmaceutical, Biological and Chemical
Sciences, 2(1), 148-157 (2010).

Rowe, R. C., Roberts, R. J. Intelligent Software for Product Formulation. Taylor and
Francis, London, 1998.
