Professional Documents
Culture Documents
The projects included in this journal were selected from the twenty-one projects submitted by the SCM Class of
2017 at the Massachusetts Institute of Technology. The articles are written as executive summaries of the master's
theses and are intended for a business rather than an academic audience. The purpose of the executive summaries
is to give the reader a sense of the business problems being addressed, the methods used to analyze the problem,
the relevant results and the insights gained.
The articles included in this publication cover a wide selection of interests, approaches, and industries. These
projects were developed in partnership with companies ranging in size from startups to the largest companies
in the world. They cover industries as diverse as transportation, big box retail, semiconductors, manufacturing,
e-commerce, CPG, chemicals, pharmaceuticals, and logistics services. They address issues of drug security,
consumer goods distribution, production planning, warehouse network design, inbound transportation, supply
chain risk, demand forecasting, and inventory management.
Each of the projects is a joint effort between a sponsoring company, one or two students, and a faculty advisor.
Companies who are members of CTL’s Supply Chain Exchange are eligible to submit their ideas for thesis projects
in June and July and then present these proposals to the students in mid-August. In early September the students
select which projects they will work on. From September until early May the teams conduct the research and write
up the results. In late May all the sponsors, faculty, and students participate in Research Fest where all the research
projects are presented.
The 10-month SCM program is designed for early to mid-career professionals who want a more in-depth and
focused education in supply chain management, transportation, and logistics. The class size each year is limited
to 40 students from around the globe and across all industries. The Master's research projects give the students
a hands-on opportunity to put into practice the concepts they learn in their coursework.
We hope you enjoy the articles. The rest of the master’s research projects
are listed at the end of this journal. You can also view all of the executive
summaries on the CTL website at: http://ctl.mit.edu/pubs. If you would
like to learn more about the SCM Master's Program or sponsor a student's
master's research project, please contact us directly.
Happy reading!
2017 SCM RESEARCH JOURNAL
Summaries of selected research projects by the 2017 graduates of the MIT Master of Supply Chain Management Program
Table of Contents
Introduction | 3
scm.mit.edu/research
Production Planning with Complex Cost Drivers | 18
This thesis provides a new lot sizing model formulation for manufacturing firms that contract third party logistics
(3PL) providers’ warehouses. The formulation extends existing models to account for the change in inventory
holding costs depending on 3PL warehouse utilization. In addition, it provides a novel method for considering
multi-tiered setup costs for products that share common setups. The new formulation produces production plans
that reduce relevant supply chain costs for firms with these features.
Predicting On-time Delivery in the Trucking Industry
By: Rafael Duarte Alcoba and Kenneth W. Ohlund
Topic Areas: Transportation, Predictive Analytics
Thesis Advisor: Matthias Winkenbach
Summary: On-time delivery is a key metric in the trucking segment of the transportation industry. If on-time delivery
can be predicted, more effective resource allocation can be achieved. This research focuses on building a predictive
analytics model, specifically logistic regression, given a historical dataset. The model, developed using explanatory
variables with statistical significance, results in a significant resource reduction while incurring a relatively small
impactful error. Interpretability and application of the logistic regression model can deliver value in predictive power
across many industries. Resulting cost reductions lead to strategic competitive positioning among firms employing
predictive analytics techniques.
identify, from the input variables, which combination would allow Coyote to predict on-time delivery. In the interest of reducing the number of variables in a way that might yield better performance in the validation set, we used the stepwise regression approach. The stepwise approach enabled us to explore different combinations of variables with a quick and flexible interface. Figure 2 presents the chosen six variables with high statistical significance for our model.

Multi-collinearity can play a role in distorting the results of a model by duplicating the statistical value through corresponding variables. We explored approaches aimed at identifying multi-collinearity and mitigating it. Of those approaches, principal component analysis, multiple correspondence analysis, and a correlation matrix were performed.

Through a systematic literature review, we identified three widely used models to predict a categorical response using continuous and categorical predictors: logistic regression, neural nets, and bootstrap forest. Due to Coyote's desire for an explanatory model with high interpretability of model results, this thesis focuses on the logistic regression model. The neural nets and bootstrap forest models validate the selected model.

The dataset used presented a very imbalanced proportion of on-time and delayed loads. The small representation of delayed observations reflects the high service level that Coyote provides its customers. To develop a model capable of capturing the useful information that distinguishes the underrepresented class from the dominant class, we used stratified sampling. Stratified sampling is a method of sampling data used when classes are present in very unequal proportions in the original dataset.

Besides certifying a correct representation of both classes (0 and 1), it is also extremely important to avoid overfitting. This ensures that the chosen model is able to generalize beyond the dataset at hand. To mitigate this risk, we use the concept of data partitioning: dividing the stratified dataset into two groups, training and validation. The model is developed using the training set and evaluated using the validation set. The performance on the validation set provides insight into the model's predictive power.

Performance evaluation
Once we developed and ran different models on our dataset, we determined how to measure the predictive performance of each. Different methods to evaluate a model's performance can be used. Even though adjusted R squared is widely used, as the main goal of our model was to predict a binary outcome, we used a confusion matrix. The confusion matrix is a two-by-two table that classifies the actual response levels and the predicted response levels. The diagonal elements of the confusion matrix indicate correct predictions, while the off-diagonals represent the incorrect predictions.
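The thesis's code is not reproduced in this summary. As a minimal illustration of the stratified sampling and training/validation partition described above, the following Python sketch splits a synthetic imbalanced dataset while preserving class proportions in both partitions (the data, function name, and split fraction are our own assumptions, not the authors' implementation):

```python
import random

def stratified_split(rows, labels, train_frac=0.7, seed=42):
    """Split (rows, labels) into training and validation sets while
    preserving the class proportions of an imbalanced binary label
    (e.g., on-time = 1 vs. delayed = 0)."""
    rng = random.Random(seed)
    by_class = {}
    for row, y in zip(rows, labels):
        by_class.setdefault(y, []).append(row)
    train, valid = [], []
    for y, members in by_class.items():
        rng.shuffle(members)
        cut = int(round(train_frac * len(members)))
        train += [(m, y) for m in members[:cut]]
        valid += [(m, y) for m in members[cut:]]
    return train, valid

# Illustrative imbalanced dataset: 90 on-time loads, 10 delayed loads.
rows = list(range(100))
labels = [1] * 90 + [0] * 10
train, valid = stratified_split(rows, labels)

# Both partitions keep the original 90/10 class mix.
train_delayed = sum(1 for _, y in train if y == 0)
valid_delayed = sum(1 for _, y in valid if y == 0)
```

A plain random split on data this imbalanced could easily leave the validation set with almost no delayed loads; splitting each class separately guarantees both partitions see the rare class.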
Figure 2. Stepwise Regression for Variable Selection
Conclusion
In predictive modeling, where a binary response is required, misclassification is the overall metric of choice. Predictive models can be tailored to optimize other, more specific metrics. For a company that focuses on achieving a high service level, the minimization of missed delays is critical. As the rate of missed delays decreases, the sensitivity of the model increases. The reaction of the model to increased sensitivity is a higher misclassification rate and lower overall usefulness of the model. These tradeoffs are key drivers in the thesis.

Despite all the possible extensions of this research, our findings present valuable insights to Coyote Logistics. The thorough modeling process validates much of the intuition from the experts. Data-backed decisions enable firms to have greater success and gain competitive advantage. This research represents a step in the right direction for Coyote in investigating predictive analytics for their operations. The model, developed using six explanatory variables with statistical significance, results in a 76.4% resource reduction while incurring an impactful error of 2.4%.
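The metrics traded off above can be read directly from a two-by-two confusion matrix. As a hedged sketch (toy data and metric names are ours; the convention 1 = on-time, 0 = delayed is an assumption), the following Python shows how misclassification rate and sensitivity to delays are computed from the matrix's cells:

```python
def confusion_matrix(actual, predicted):
    """2x2 confusion matrix for a binary response.
    Convention here: 1 = on-time load, 0 = delayed load."""
    tp = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 1)
    tn = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 0)
    fp = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 1)
    fn = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 0)
    return tp, tn, fp, fn

def misclassification_rate(tp, tn, fp, fn):
    # Off-diagonal share of the confusion matrix.
    return (fp + fn) / (tp + tn + fp + fn)

def sensitivity_to_delays(tp, tn, fp, fn):
    # Share of actually delayed loads that the model catches; a "missed
    # delay" is a delayed load predicted on-time (fp under this convention).
    return tn / (tn + fp)

# Toy example: 8 on-time loads and 2 delayed loads.
tp, tn, fp, fn = confusion_matrix(
    [1, 1, 1, 1, 1, 1, 1, 1, 0, 0],
    [1, 1, 1, 1, 1, 1, 1, 0, 0, 1],
)
```

Lowering the classification threshold to catch more delays raises sensitivity but converts some correct on-time predictions into errors, which is exactly the tradeoff the conclusion describes.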
Figure 3. Confusion Matrix Results Comparison for Validation and Customer Test Data
Reducing Shipment Variability through Lean Leveling
By: Melissa Botero Aristizabal and Fabian Brenninkmeijer
Topic Areas: Distribution, Variability, Lean Principles
Thesis Advisor: James B. Rice, Jr.
Summary: This research focused on reducing shipment variability between manufacturer warehouse and customer
distribution center. We used the leveling principle as defined by lean theory to create predetermined customer shipments
for the top selling SKUs. We determined the optimal degree of lean leveling implementation through simulation analysis
of various order policies. Sending frequent, small, recurring shipments on a weekly basis could lead to improvements in
service level, transportation cost and cash.
processes has received relatively little attention in practice. We simulated various order policies based on lean leveling principles with the goal of improving operational performance across supply chain echelons.

Creating a new Order Policy
The research focused on diminishing the impacts of variability by creating more stable inventory flows between a supplier warehouse and a retailer's distribution center. Figure 1 depicts the degree of shipment variability experienced by a typical customer of the sponsor company under the current order system. The range of 5 to 22 weekly shipments confirmed the need to address the issue of volatility.

We developed a model based on one year of historical demand, supply and forecast data. We created a new order policy based on the following 3 key steps: SKU segmentation, simulation, evaluation.

1. SKU segmentation
We utilized lean theory to develop a new order policy with the goal of smoothing out demand volatility. Under lean theory, a daily fixed production plan is applied for the top 50% of SKUs. The daily manufacturing of the same products allows the manufacturing team to develop experience on the process, standardize it and improve it, something known as "economies of repetition."

Following the lean leveling principle derived from the manufacturing domain, we conducted a SKU segmentation to identify the top selling 50% of SKUs. The resulting top SKUs were used to find the base quantity by SKU to be shipped continuously under the new policy.

2. Simulation
We built a model to simulate the effects of the new order policy based on varying degrees of lean implementation, in order to determine its viability. The simulation determined the number of cases ordered by week, inventory levels, and anticipated lost sales by week and by SKU.

The new policy consisted of two components: a fixed part and a variable part. The fixed part was comprised of a percentage of the base quantity derived from the segmentation analysis. The variable component follows the process logic of the sponsor company, taking into account current inventory levels, demand forecasts and promotional data. Figure 2 shows the reduction in weekly shipment variability under the new order policy. In this figure, the adjustable fixed order percentage was set at the default rate of 50% of the average.
Figure 2: New Order policy vs Actual Performance
Under the new policy, the fixed order component represents a consistent minimum shipment level which enables the overall reduction in volatility. At the same time, the variable order component gives the company the flexibility to cover sudden demand peaks.

3. Evaluation
The results of the simulation runs were compared with the actual performance of the sponsor company according to the following criteria: cash (inventory levels), transportation cost and service level. An improved inventory policy should be based on an approach that balances the three criteria. This enables sustained gains for both buyers and sellers across the supply chain.

To obtain the optimal inventory order policy, we conducted a sensitivity analysis based on the adjustable percentage of fixed versus variable shipments. Figure 3 shows the impact of the fixed order percentage on the three evaluation criteria.

The more fixed shipments a company has, the more shipments can be planned in advance, which reduces the probability of having to pay for premium freight rates. Therefore, the transportation cost decreases as the fixed percentage of the average increases.

Under the new order policy with fixed recurring shipments, the sponsor company can rely on the same carrier and even the same drivers, which will create efficiencies out of habit. Overall, the probability of a shipment arriving on time increases with a higher rate of fixed shipments.

More frequent deliveries reduce inventory requirements at the customer warehouse. However, inventory levels become increasingly inflated with a rising fixed percentage because available quantities start to exceed demand levels more often.

Since all three evaluation criteria had to at least remain the same under the new policy, the team established a feasible region with potential order policies. At a fixed order percentage of 75% the three evaluation criteria were equally balanced between buyer and seller.

Conclusions
Shipping variability affects the performance of every global supply chain. Causes such as changing customer behaviors, lack of coordination between supply chain partners, and disruptions prevent companies from achieving a more uniform flow of goods. We demonstrated that lean leveling
could effectively reduce shipment variability while leading to improvements of service levels, transportation costs and cash requirements.

Operating with the lowest possible shipping variability does not necessarily lead to the best results due to the highly volatile nature of the consumer goods industry. Instead, reducing shipment variability should be seen as a means to the end of improving cash requirements, transportation cost, and service level. For an optimal strategy, companies should find a balance of fixed and variable shipments of the top selling SKUs. Companies should further balance the evaluation criteria to potentially enable sustainable gains for both buyers and sellers. Sending frequent, small, recurring shipments of the top SKUs on a weekly basis could lead to better operational performance than infrequent bulk shipments of the same items.

A careful analysis of the order policies should be conducted with the goal of determining the optimal degree to which lean leveling should be applied. An order policy, combining the synergies gained from lean leveling while maintaining the flexibility to respond to unusual demand volatility, could lead to a win-win situation in the long run.

Conclusion
The AHP provides ShopCo a tool to prioritize inbound loads awaiting shipment by assigning priority scores to each load based on factors defined by ShopCo. ShopCo will be able to identify the relative priority of the loads and determine which loads should receive priority when carrier capacity is constrained, facilitating improved service levels without incurring additional costs. Further, the Knapsack optimization model found opportunities to improve the load priority shipped by up to 8.3% as compared to the current load assignment.

We believe this research will benefit not only ShopCo but also other companies and industries managing their inbound transportation with carrier capacity constraints by applying this framework. Although the factors and sub-factors used may differ, this underlying framework can align load priority with company objectives.
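The summary names a knapsack optimization model for choosing which prioritized loads to ship under a carrier capacity constraint. A generic 0/1 knapsack sketch of that idea follows (the scores, load sizes, and capacity units below are illustrative, not the thesis's data or formulation):

```python
def knapsack_loads(loads, capacity):
    """0/1 knapsack: choose which waiting loads to ship so that total
    priority score is maximized without exceeding carrier capacity.
    `loads` is a list of (priority_score, size) pairs; sizes and
    capacity are integers."""
    # best[c] = max total priority achievable with capacity c
    best = [0] * (capacity + 1)
    for score, size in loads:
        for c in range(capacity, size - 1, -1):
            best[c] = max(best[c], best[c - size] + score)
    return best[capacity]

# Three waiting loads as (priority score, trailer slots required).
loads = [(60, 2), (100, 3), (120, 4)]
```

With 5 slots of capacity the first two loads (total score 160) beat shipping the single highest-priority load, which is the kind of improvement over greedy assignment the 8.3% figure refers to.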
Preemptive Solutions to Reducing Call Center Wait Times
By: Qiao Chu and Nisha Palvia
Thesis Advisor: James B. Rice, Jr.
Topic Areas: Queuing Theory in Call Centers, Simulation, Segmentation, Home Security Industry
Summary: The security alarms services market in the United States delivers hardware equipment and services to
homeowners and businesses to help monitor and enhance personal property protection. For this thesis, MIT partnered
with OnProcess Technology, a managed services provider specializing in complex, global service supply chain operations.
Together, the team developed a robust framework to preemptively reduce the number of inbound customer calls,
and thereby improve customer service. The team followed a three-step process to develop this framework: customer
segmentation, call center queue simulation (using interarrival times, service times, and number of agents), and preemptive
solution development.
Qiao graduated with a Bachelor of Engineering degree in Electrical and Computer Engineering from the University of Toronto. Before coming to MIT, she worked as a technology consultant with Citrix for 3 years.

Before coming to MIT, Nisha Palvia graduated with a Master's in Business Administration from IE Business School in Madrid. She worked as an analytics consultant with A.T. Kearney for 4 years and will be rejoining the firm following the MIT SCM program.
To understand the current call center system, the MIT thesis team simulated the queue of an AlarmCo call center by calculating interarrival rates, service time, and using the number of agents by hour and day of week. The call center as a queuing system can be mapped as per Figure 1.

The team also strategically segmented the AlarmCo customer base and developed targeted preemptive solutions for prevalent segments. After adjusting the queuing inputs per the potential preemptive solutions, we were able to directly assess the quantifiable impact of introducing preemptive solutions to this industry.

Analysis, Methodology, and Insights
Analysis
The team first requested three separate sets of data including details on inbound customer calls, customer demographics and labor resource plans. We then performed preliminary data analysis to understand the top customer reason codes and solutions for the inbound traffic. Analysis of call volume reveals that 80% of calls are made by customers with seven of the thirty possible issue codes.

Taking a closer look at these top issue codes, we found that a majority of calls were warm transfers to sales (agent transfers call, without concrete solution), solutions that were accepted by the customer, and resolved solutions to general questions. The call data also revealed that Sunday, Saturday, and Friday had the lowest call volumes while the days with the largest number of incoming calls were Monday, Tuesday and Wednesday. Additionally, we noticed a dip in total call density in the months of December and July. When studying the distribution of calls, we learned that the highest call count on any Sunday for the calendar year was 1,500, nearly 40% lower than the maximum number of inbound calls on Monday.

Customer Segmentation
Next, customers were segmented by demographic groups and prevalence of top reason codes (Figure 2). Demographic data helped us group AlarmCo customers into two categories: business clients and homeowner clients. Homeowners account for 81% of sales while the business clients account for the remaining 19%. The homeowner category can be further broken down to platinum or premium customers accounting for 15% of
customers, silver customers (average package holders) accounting for 58% of customers and finally, basic package holders, accounting for the remaining 27%. The team segmented the homeowner customers into 18 sub-groups based on their age, income, and English proficiency level.

Queuing Simulation
The team selected the appropriate queuing model based on the inbound call center data and decided to approximate the interarrival rate and service rate as a Poisson distribution. In order to make sure the selected model, M/M/n, was representative of the call center, the team extracted empirical inputs (interarrival rate, service rate, and the number of agents) and outputs (average wait time and average queue length) from the customer queue data, and compared the theoretical outputs with the empirical data. Since the difference fell within a 10% validity threshold, it could be concluded that the M/M/n model accurately emulates AlarmCo's inbound customer call queue.

Preemptive Solutions
Customer segmentation led to the development of different preemptive solutions addressing major call reason codes. We developed 20 solutions that fell into five categories:
• Automated Remote Services
• Education
• Online Resources
• Telephonic Assistance
• Proactive Analysis
We calculated the impact of the recommended preemptive solutions on the number of inbound calls using a feasibility and risk ranking.

Comparative Analysis
After calculating different success rates for each of the twenty preemptive solutions, we applied a weighted average percent reduction of interarrival rate for each day and hour. With these new and improved interarrival rates, the team was able to rerun the queuing simulation and compare three different scenarios: implementing no preemptive solutions (as-is state), implementing the most favorable (twelve) preemptive solutions, and finally, implementing all twenty solutions (cherry pick). After updating the queuing model interarrival rates, we see that wait times steadily decrease with implementation of each scenario. If AlarmCo conservatively moves forward with twelve of the twenty solutions, average wait times reduce
Figure 3. Scenario Analysis of Wait Time Reduction
from 6.88 seconds to 4.47 seconds in the morning, 7.74 seconds to 5.03 seconds in the afternoon, and 8.84 seconds to 5.74 seconds in the evening. If the client implements all twenty solutions per the cherry pick scenario (best case), average wait times reduce from 6.88 seconds to 4.03 seconds in the morning, 7.74 seconds to 4.53 seconds in the afternoon, and 8.84 seconds to 5.17 seconds in the evening (Figure 3).

Conclusion
Three major insights can be drawn from the process of applying queuing theory in this research:
• Poisson distribution is a robust model for a general queuing model
• Small changes in the number of inbound calls can have a large effect on the queue
• Tradeoffs have to be made between the service level and resources

While the home security industry in the United States has seen vast change in the last decade in terms of technological innovation, security companies are still not able to fully utilize call center data to develop proactive solutions in place of reactive measures. In closing this gap, the team addressed the following question for AlarmCo: "How can we improve the customer service experience for customers of a major security service provider in the US?" By analyzing the outputs of the simulation before and after adjusting the dataset, the team quantified the impact of preemptive solutions on the call center queue. Ultimately, narrowing to twelve strategic preemptive solutions led to the enhancement of the as-is queuing model, reducing average wait time by up to ~35%. Our chosen hybrid model is a conservative and feasible approach in implementing preemptive solutions with the goal of minimizing inbound calls and reducing average wait time. The twelve solutions, scattered across all five preemptive solution categories, have strong support from stakeholders and, according to our research, will work successfully with other similar players in the home alarm securities market.
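The theoretical average wait of the M/M/n model validated above is standard queueing theory (the Erlang C formula), not something unique to the thesis. As background, a textbook computation looks like this (our own sketch; the team's implementation and parameters are not published in the summary):

```python
from math import factorial

def mmn_avg_wait(lam, mu, n):
    """Average waiting time in queue (Wq) for an M/M/n system with
    Poisson arrivals (rate lam), exponential service (rate mu per
    agent), and n agents, via the Erlang C formula."""
    rho = lam / (n * mu)
    assert rho < 1, "system must be stable (utilization below 100%)"
    a = lam / mu  # offered load in Erlangs
    p0_inv = sum(a**k / factorial(k) for k in range(n)) \
             + a**n / (factorial(n) * (1 - rho))
    erlang_c = (a**n / (factorial(n) * (1 - rho))) / p0_inv
    return erlang_c / (n * mu - lam)
```

Lowering the arrival rate lam, as the preemptive solutions do, shrinks Wq disproportionately near high utilization, which matches the insight that small changes in inbound calls have a large effect on the queue.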
Production Planning with Complex Cost Drivers
By: Ian Chua and Thomas Heyward
Topic Areas: Supply Chain Planning, Optimization
Thesis Advisor: Josue Velazquez-Martinez
Summary: This thesis provides a new lot sizing model formulation for manufacturing firms that contract third party logistics (3PL) providers' warehouses. The formulation extends existing models to account for the change in inventory holding costs depending on 3PL warehouse utilization. In addition, it provides a novel method for considering multi-tiered setup costs for products that share common setups. The new formulation produces production plans that reduce relevant supply chain costs for firms with these features.
KEY INSIGHTS
1. Manual planning processes are time consuming and limited in the number of cost drivers that can be focused on.
2. Implementation of lot sizing problems requires the right formulation that fits the business environment.
3. Production lot sizing optimization is not always a trade-off between setups and inventory. In some cases, improvements can be made in both areas simultaneously.
Introduction
Many manufacturers have to strategize around seasonality and capacity constraints when planning production schedules. Companies need to come up with production schedules that consider demand, inventories, production capacities, and when to use 3PL services. Typical production strategies prioritize maximizing line efficiency, which favors large lot sizes and few setups. On the other hand, logistics strategies prioritize minimizing inventory costs, which favors smaller lot sizes and more setups. Lot sizing problem formulations provide a means for considering both setup and inventory costs to achieve the optimum balance.

Niagara Bottling Co (Niagara) sponsored thesis research with MIT to improve their production planning process. Niagara wanted a method that explicitly considered the cost impact that production lot sizing decisions have on utilization of 3PL warehouse space.
Figure 1: Transfer events during peak demand
Niagara demand has seasonality, with peak demand exceeding production capacity during certain periods of the year. Niagara prepares for the peak season by building inventory during periods with excess capacity. The inventory needed for peak seasons is stored in 3PL warehouses.

Niagara plants have some limited inventory warehousing capacity. The plant warehousing cost structure is significantly lower than the cost structure for the 3PL warehouse. Additionally, there are transfer costs associated with transporting and handling of inventory entering the 3PL warehouse. Niagara expects that, because of this cost structure, transfers to the 3PL warehouse should only occur during periods where inventory is being built for the peak season.

However, certain aspects of Niagara's business could not be modeled using the existing formulations. A new extension of the formulation was devised to address these features. Additional variables were introduced to track the amount of inventory in 3PL warehouses and when an inbound shipment occurred. Other variables were added to track the changeover of major setups for bottle sizes or packaging separate from minor setups for labels.

Adding these features to the model provided the necessary integrated framework for assessing tradeoffs between production and logistics cost considerations.
Figure 3: Cumulative Production relative to Demand
Intermodal Variability and Optimal Mode Selection
By: Tianshu Huang and Bernarda Serrano
Topic Areas: Transportation, Mode Choice, Inventory
Thesis Advisors: Dr. Chris Caplice & Dr. Francisco Jauffred
Summary: This research focused on incorporating the transit time variability into the transportation mode selection
process. We identified a probability distribution for the transit time per unit of distance to quantify the impact of transit
time uncertainty on inventory cost. Then, we built a model that calculates the total logistics costs and compares the results
for two transportation modes: intermodal and truckload. Results show that a particular lane is more likely to be assigned to
truckload for high load values and/or high service levels.
Figure 1: Time/Distance Histogram (h/mi.) - Truckload
Figure 2: Time/Distance Histogram (h/mi.) - Intermodal
November 2016 to find the best distribution of Time/Distance (in hours per mile) for each mode.

As shown in Figures 1 and 2, both truckload and intermodal have right-skewed transit time distributions. The goodness-of-fit results show that the Normal and Lognormal distributions fit the data well. The probability that the data do not follow the Normal or Lognormal distributions for truckload is less than 1%. For intermodal, the probability that the data do not follow a normal distribution is less than 1%, and the probability it does not follow a lognormal distribution is less than 4%. Neither number is statistically significant. Therefore, both distributions were used for modeling the transit time.

Required Time/Distance
The required Time/Distance is expressed in our model as t(μ, σ, service level). It is measured in hours per mile. It represents the transit time over one unit of distance given a certain mean (μ), a standard deviation (σ), and a specific service level. The service level refers to the percentage of time that an order is delivered on time.

For a normal distribution, t(μ, σ, service level) can be found using the standard normal table. For other distributions, it can be found by calculating the area under the distribution curve that covers the required service level.

For example, in Figure 3, the transit time does not follow a normal distribution. The required time/distance can be found by finding the point on the x-axis that covers 95% of the total area under the curve.
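The quantity t(μ, σ, service level) is a distribution quantile, so it can be computed directly rather than read from a standard normal table. A minimal Python sketch (function name and the lognormal parameterization are our own assumptions; the thesis's fitted parameters are not given in the summary):

```python
from math import exp
from statistics import NormalDist

def required_time_per_mile(mu, sigma, service_level, lognormal=False):
    """t(mu, sigma, service level): the hours-per-mile budget such that
    transit time per mile falls below it with probability equal to the
    service level (i.e., the service-level quantile)."""
    if lognormal:
        # mu, sigma taken here as the parameters of the underlying
        # normal distribution of log(time/distance).
        return exp(NormalDist(mu, sigma).inv_cdf(service_level))
    return NormalDist(mu, sigma).inv_cdf(service_level)
```

For a normal fit, the 95% requirement is simply μ + 1.645σ; raising the service level pushes the required time/distance budget up, which is why high service levels favor the lower-variability mode.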
Total Cost Equation
The total cost includes two components: transportation cost and inventory cost. The expected transit time and its variability are given by the distance and the parameters we set for the Time/Distance probability distribution. To calculate the cost of the safety stock we included the monetary value of a load and the inventory holding rate defined by the company.
Sensitivity Analysis Conclusion
To investigate the impact of different load values on mode We investigated how to optimize the transportation mode
selection, a wide range of load values, from $1,000 to selection process for ABC Stores. We developed a total
$120,000 were used to compute the total cost using the total cost equation. Values ranging from 1 to 100 were used to evaluate how changes in the volume affect the percentage of lanes assigned to each transportation mode. We also measured the dynamics of mode selection changes with respect to service levels from 0 to 100%.

These sensitivity analyses show that:
• The variation in the model's recommendation when using parameters of the Normal vs. the Lognormal distribution is insignificant, as can be seen in Figure 4. The horizontal axis represents the load value measured in dollars per load and the vertical axis represents the percentage of lanes assigned to a specific mode. The blue (TL(ND)) and gray (TL(LnD)) lines represent the percentage of lanes assigned to truckload using the normal and lognormal distributions, respectively. The orange (IM(ND)) and yellow (IM(LnD)) lines represent the percentage of lanes assigned to intermodal using the normal and lognormal distributions, respectively.
• An increase in the load value enlarges the difference in inventory cost, increasing truckload's cost advantage.
• As the volume increases between 0 and 20 loads per year, the percentage of lanes assigned to intermodal increases.
• The percentage of lanes assigned to truckload increases as the service level increases.

cost equation that integrates transportation costs and transit time variability in terms of inventory costs. The main findings of this study show that:
• The average and standard deviation of transit time are impacted by distance. The relationship between average transit time and distance is non-linear, as many other factors, such as driving-hour limitations and pick-up delays, are also relevant to transit time.
• The expected transit time of truckload is shorter than the expected transit time of intermodal (lower mean).
• For this particular dataset, the model generally favors truckload over intermodal because the lower average and standard deviation result in lower inventory costs. Therefore, most of the lanes are assigned to truckload.
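The trade-off behind these findings can be illustrated with a small numeric sketch. The cost form below (freight cost plus pipeline inventory plus a safety-stock term scaled by a service-level factor) and every parameter value are hypothetical stand-ins, not the thesis's exact total cost equation:

```python
def total_annual_cost(freight_rate, loads_per_year, load_value,
                      mean_transit_days, std_transit_days,
                      holding_rate=0.25, z=1.64):
    """Illustrative lane cost: freight plus inventory costs driven by
    transit-time mean and variability (hypothetical cost structure)."""
    transportation = freight_rate * loads_per_year
    # Value tied up while goods are in transit (pipeline inventory).
    in_transit = holding_rate * load_value * loads_per_year * mean_transit_days / 365
    # Safety stock held against transit-time variability; z is a
    # service-level factor (1.64 ~ 95% for normal demand, an assumption).
    daily_value = load_value * loads_per_year / 365
    safety_stock = holding_rate * z * std_transit_days * daily_value
    return transportation + in_transit + safety_stock

# Hypothetical lane: truckload is pricier per load but faster and
# more reliable than intermodal.
tl = total_annual_cost(1800, 50, 120_000, mean_transit_days=3, std_transit_days=0.5)
im = total_annual_cost(1400, 50, 120_000, mean_transit_days=6, std_transit_days=2.0)
best = "truckload" if tl < im else "intermodal"
```

With these made-up inputs, truckload's shorter and more reliable transit outweighs its higher freight rate, matching the direction of the findings above.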
Serialization of Prescription Drugs in the US: A Centralized View
By: Aisha Nabiyeva and David Z.Y. Wu
Thesis Advisor: Bruce Arntzen
Topic Areas: Distribution, Healthcare, Tracking & Tracing
Summary: The Drug Supply Chain Security Act (DSCSA) of 2013 requires serialization of prescription drugs across the whole pharmaceutical supply chain. This thesis assesses the costs that serialization implementation imposes on manufacturers and distributors/retailers. Taking the perspective of a centralized data model, we tested the feasibility of implementing a centralized database under both data nesting and unit-level relational models. The findings suggest that serialization using a centralized data model would in all cases incur a higher proportional cost than using a decentralized model.
Prior to attending MIT, Aisha graduated with a Bachelor of Science in Mathematics with Finance from King's College London and a Master of Philosophy in Industrial Systems & Manufacturing from the University of Cambridge. She has worked for Rolls-Royce Aero in supply chain roles. Upon graduation, Aisha will join Converse as an Operations Strategy Manager in Boston, MA.

Prior to attending MIT, David obtained a Bachelor's Degree in Supply Chain & Information Systems from The Pennsylvania State University. Post-graduation, he worked at Unilever in their North American operations, and then with A.T. Kearney Asia-Pacific. David looks to enter the humanitarian and non-profit sector upon graduation.
One Time Capex Costs
Capital expenditures differed significantly depending on the serialization scenario. In this case, the cost difference is driven by the distinction between a Nested Data relational model and a Unit Level relational model. In a Nested Data model, the capital expenditures are expected to be higher due to the added complexity of data aggregation. In a Unit Level model, some of the capital equipment would not be necessary. In each case, production volume was used to determine the quantity of equipment needed to obtain sufficient serialization capacity. Published equipment prices were then used to determine the total investment cost.

To estimate the industry-wide initial capital requirements we constructed a tree of all applicable costs. These costs relate to:
• Line upgrades for serialization,
• Additional inbound equipment,
• Additional outbound equipment.
The Nested Data approach only applies to the Manufacturer and Wholesaler levels, since the upstream supply chain is assumed to consist entirely of unit-level transactions.

IT Investment Costs
IT investment cost is calculated as a combination of the investment necessary in storage space and the cost of setting up the data interfaces between partners in the supply chain. Storage space investments include the capital expenditures on physical assets (servers and racks), as well as the cost of cloud storage if managed by a third party. The cost of setting up data interfaces includes all the IT implementation costs associated with integrating the serialization database with the existing IT architecture. This portion includes both a one-time cost and an ongoing annual cost.

The IT costs have been calculated based on the storage and transaction requirements at each echelon. The model also assumes that at each point of transaction all the information will be provided by the receiver of the product.

IT Investment One-time Cost =
[(Annual Production Volume * Data Storage needed) / (Storage Capacity per server)] +
[Number of downstream supply chain partners * Cost per database linkage]
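As a quick illustration, the one-time cost formula above can be evaluated directly. The first bracket counts servers, so a per-server purchase price (an assumption, not part of the stated formula) is applied here to express both terms in dollars; all input values are hypothetical:

```python
import math

def it_one_time_cost(annual_production_volume, data_storage_per_unit_kb,
                     storage_capacity_per_server_kb, cost_per_server,
                     downstream_partners, cost_per_database_linkage):
    """One-time IT investment per the formula above: servers needed to
    hold a year of serialization records (rounded up), priced at an
    assumed per-server cost, plus one data-interface cost per partner."""
    servers_needed = math.ceil(
        annual_production_volume * data_storage_per_unit_kb
        / storage_capacity_per_server_kb
    )
    storage_cost = servers_needed * cost_per_server
    linkage_cost = downstream_partners * cost_per_database_linkage
    return storage_cost + linkage_cost

# Hypothetical inputs: 50M units/year at 10 KB per serialization record,
# 4 TB servers at $8,000 each, 500 downstream partners at $3,000 per linkage.
cost = it_one_time_cost(50_000_000, 10, 4_000_000_000, 8_000, 500, 3_000)
```

Note how, with these made-up numbers, the partner-linkage term dominates the server term, which is one way a centralized model can become disproportionately expensive as the partner count grows.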
Figure 3 – 10-Year Cost Comparison
Balancing Product Flow and Synchronizing Transportation
By: Priya Andleigh and Jeffrey S. Bullock
Thesis Advisor: Dr. Ahmad Hemmati and Dr. Chris Caplice
Topic Areas: Optimization, Supply Chain Planning, Transportation
Summary: Traditionally, production and transportation planning processes are managed separately in organizations. In such arrangements, order processing, load planning, and transportation scheduling are often done sequentially, which can be time-consuming. Establishing a proactive, steady flow of products between two nodes of a supply chain can bypass this order-plan-ship process. A steady flow of products can reduce transportation costs, increase cross-dock productivity, and reduce the bullwhip effect upstream in the supply chain. This thesis develops an analytical framework to calculate this steady
flow. The methodology developed in the research also presents the opportunity for new and innovative contract types
with transportation providers.
Figure 1 – Steady Flow Calculation Framework
Figure 2 – Steady Flow Optimization Framework
Definitions (units)
n = # weeks in model horizon (weeks)
s = number of pallets on steady flow (pallet)
j = % of truck that a pallet represents, based on weight and volume (truck/pallet)
p = transportation savings per truck ($/truck)
d_i = demand for week i (pallet)
c_s = cross-dock savings per pallet ($/pallet)
c_e = cross-dock eligibility: 0 ≤ c_e ≤ 1 (dimensionless)
e_(i-1) = excess inventory running total in week i-1; e_0 = 0 (pallet)
h = inventory holding cost ($/$/week)
v = inventory value ($/truck)
r = end of period risk factor (%)

Test Lane Results
The methodology developed was tested using data from the sponsor company's North American operations. One high-volume plant-to-warehouse lane with more than 1,000 SKUs was chosen to calculate the steady flow using a set of input parameters, and the impact on the model's performance when changing those parameters was studied. Figure 3 shows the optimization result for a sample SKU. As the steady flow increases up to 9 pallets a week, the transportation and cross-dock savings continue to rise without incurring noticeable excess inventory costs. However, as the steady flow is increased beyond 9 pallets per week, the excess inventory costs rise rapidly. The total savings are maximized at 11 pallets per week. The optimal steady flow determined for this SKU was between the minimum pallets shipped and the mean pallets shipped over the model horizon.
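The thesis's full objective function is not reproduced here, but the definitions above suggest the shape of the weekly trade-off. The sketch below scores a candidate steady flow s by summing transportation and cross-dock savings each week and charging a holding cost on accumulated excess inventory, plus an end-of-period risk charge on whatever excess remains; the exact cost structure and all parameter values are illustrative assumptions, not the thesis's formulation:

```python
def steady_flow_savings(s, demand, j, p, c_s, c_e, h, v, r):
    """Net savings over the horizon for a steady flow of s pallets/week,
    using the symbols defined above (illustrative cost structure)."""
    e = 0.0      # e_0 = 0: excess inventory running total (pallets)
    net = 0.0
    for d_i in demand:
        e = max(0.0, e + s - d_i)   # excess pallets carried after week i
        net += p * j * s            # transportation savings ($/truck x truck/pallet)
        net += c_s * c_e * s        # cross-dock savings on eligible pallets
        net -= h * v * j * e        # holding cost on excess inventory
    net -= r * v * j * e            # end-of-period risk on leftover excess
    return net

# Hypothetical weekly demand for one SKU; sweep candidate flow levels.
demand = [8, 12, 9, 15, 7, 11, 10, 14]
best_s = max(range(0, 16),
             key=lambda s: steady_flow_savings(
                 s, demand, j=0.05, p=300, c_s=4, c_e=0.8,
                 h=0.002, v=40_000, r=0.1))
```

For this made-up demand series the best flow falls between the minimum (7) and the mean (10.75) weekly demand, consistent with the insight reported above.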
Conclusion
The goal of this thesis was to develop an analytical framework for determining the products that could flow on an intra-company lane regularly. This steady flow of products between supply chain nodes can lower transportation costs, increase cross-dock productivity, as well as dampen the bullwhip effect upstream in the supply chain. This research was done in the context of a plant-to-warehouse lane of a fast-moving consumer goods company.

The methodology developed in this research utilizes historical and forecast data to characterize the demand. Descriptive statistics are used to determine whether a SKU is eligible for steady flow. Subsequently, transportation cost savings, cross-dock savings, and excess inventory considerations are used to optimize the quantity of each eligible SKU on steady flow. The model considers additional business constraints of vehicle capacity utilization and then recommends a final list of SKUs and quantities for the steady flow. This methodology was tested on a high-volume plant-to-warehouse lane. Insights about the relationship between the variation in demand and steady flow were presented. As the coefficient of variation decreases, the optimal steady flow moves closer to the mean of the non-zero historical demand and selected forecast over the model horizon.

Areas for future enhancements to this framework include optimal data aggregation, forecast accuracy implications, expansion of cost considerations, and implications on DRP. This research opens up the possibility of realizing cost savings by decreasing transportation costs and improving warehouse productivity, paving the way for innovative contract types with transportation providers. This framework allows shippers to bridge the gap between steady flow theory and implementation.
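As a minimal sketch of the eligibility screen described above, assuming the screen is based on the coefficient of variation of non-zero weekly demand (the 0.5 cutoff below is a hypothetical value, not the thesis's):

```python
from statistics import mean, pstdev

def coefficient_of_variation(weekly_demand):
    """CV (std/mean) of non-zero weekly demand for one SKU."""
    nonzero = [d for d in weekly_demand if d > 0]
    return pstdev(nonzero) / mean(nonzero)

def eligible_for_steady_flow(weekly_demand, cv_cutoff=0.5):
    """Hypothetical screen: only low-variability SKUs qualify."""
    return coefficient_of_variation(weekly_demand) <= cv_cutoff

steady = [10, 11, 9, 10, 12, 0, 10, 11]   # regular mover, low CV
lumpy = [0, 5, 0, 40, 12, 0, 0, 60]       # erratic mover, high CV
```

A regular mover like `steady` passes the screen while an erratic one like `lumpy` does not, which is the kind of triage the descriptive-statistics step performs before the optimization runs.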
Additional 2016 SCM Theses
Innovative Transportation Solutions: Uber for Freight
By: Leah Davis and Joseph Lucido
The transportation industry is constantly evolving, with new start-ups and technologies. One innovative transportation
solution, Uber for Freight (UFF), seeks to more efficiently match shippers’ loads with drivers and trucks through application-
based algorithms. This research (1) defines the UFF model and its major players and processes, (2) distinguishes UFF from a
traditional broker, and (3) analyzes the applicability of UFF to the sponsor company, a large multinational chemical seller.
A Generalized Framework for Optimization with Risk
By: Damaris Zipperer and Andrew Brown
In high-tech capital construction projects, building facilities requires complex project schedules, forecast well in
advance. These forecasts are used to hire contract workers. Traditional optimization solutions provide a starting point, but
cost-efficient solutions require a more robust approach. This thesis proposes a risk integration methodology for contract
workforce hiring optimization. The approach can be generalized to address other supply chain problems.
scm.mit.edu/research 2016
MIT Supply Chain Management Program
Massachusetts Institute of Technology
77 Massachusetts Avenue
Building E40-355
Cambridge, MA 02139-4307
Phone: 617.324.6564
Fax: 617.253.7972
Email: scm@mit.edu
Web: scm.mit.edu
©2017 MIT Center for Transportation & Logistics. All rights reserved.