

Microsoft Dynamics NAV 2009

Hardware Sizing Guide

White Paper

Date: May 2008


Microsoft Dynamics is a line of integrated, adaptable business management solutions that enables you and your people
to make business decisions with greater confidence. Microsoft Dynamics works like and with familiar Microsoft software,
automating and streamlining financial, customer relationship and supply chain processes in a way that helps you drive
business success.

U.S. and Canada Toll Free 1-888-477-7989


Worldwide +1-701-281-6500
www.microsoft.com/dynamics

The information contained in this document represents the current view of Microsoft Corporation on the issues discussed as of the date of publication.
Because Microsoft must respond to changing market conditions, this document should not be interpreted to be a commitment on the part of Microsoft, and
Microsoft cannot guarantee the accuracy of any information presented after the date of publication.
This White Paper is for informational purposes only. MICROSOFT MAKES NO WARRANTIES, EXPRESS, IMPLIED, OR STATUTORY, AS TO THE INFORMATION IN
THIS DOCUMENT.
Complying with all applicable copyright laws is the responsibility of the user. Without limiting the rights under copyright, no part of this document may be
reproduced, stored in or introduced into a retrieval system, or transmitted in any form or by any means (electronic, mechanical, photocopying, recording, or
otherwise), or for any purpose, without the express written permission of Microsoft Corporation.
Microsoft may have patents, patent applications, trademarks, copyrights, or other intellectual property rights covering subject matter in this document.
Except as expressly provided in any written license agreement from Microsoft, the furnishing of this document does not give you any license to these patents,
trademarks, copyrights, or other intellectual property.

© 2008 Microsoft Corporation. All rights reserved.


Microsoft, the Microsoft Dynamics Logo, BizTalk, FRx, Microsoft Dynamics, SharePoint, Visual Basic, Visual C++, Visual SourceSafe, Visual Studio, Windows,
and Windows Server are either registered trademarks or trademarks of Microsoft Corporation, FRx Software Corporation, or Microsoft Business Solutions ApS
in the United States and/or other countries. Microsoft Business Solutions ApS and FRx Software Corporation are subsidiaries of Microsoft Corporation.

Contents
Introduction

Background – How Performance is Tested
    Application Tests
    Client Tests

Hardware Configurations
    Multiuser Tests (Application Scenarios)
    Setup 1 – Three tiers with no shared hardware
        Database Server
        Middle Tier Servers
    Setup 2 – Three tiers with shared hardware
        Database Tier and Middle Tier
    Client Single User Tests
        Client Machine
        Middle Tier Server

Network Latency & Bandwidth Requirements
    Description of Test and Method
    Warm Client Startup
        The minimum supported bandwidth and latency
        The minimum supported bandwidth with no impact on performance
    Warm Create and Post New Sales Order
        The minimum supported bandwidth and latency
        The minimum supported bandwidth with no impact on performance

Summary

Appendix A – Application Scenarios

Appendix B – Client Scenarios


Introduction
This document describes how performance is tested in Microsoft Dynamics NAV and gives guidance on the hardware that is used, how it is configured, and the different configurations that are used in the performance lab. This document is subject to change as RTM approaches.

Background – How Performance is Tested


In Microsoft Dynamics NAV, measurements are used to track performance on different platforms and
hardware configurations. This is the foundation for what is included in this document.
Performance is measured in two main groups – application performance and client performance.

Application Tests
An application test measures the performance of common application operations in a multiuser environment.
This is done by using the NAV Application Benchmark Toolkit (NABT), which will be made available for
download on PartnerSource at a later time. There are 26 different test scenarios that are executed randomly
by 50 concurrent users. The scenarios run for 4 hours, and performance is measured for 3 of those hours,
starting 30 minutes into the run and ending 30 minutes before the end of the run. For details about the
scenarios, see Appendix A.
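The NABT toolkit itself is not reproduced here, but the shape of such a multiuser run can be sketched as follows. This is only an illustration of the random scenario scheduling and the 3-hour measurement window; run_scenario and user_session are hypothetical helpers and are not NABT's actual API.

```python
import random
import time

# Illustrative sketch only -- not the actual NABT toolkit code.
WARM_UP_S = 30 * 60        # first 30 minutes: results discarded
MEASURE_S = 3 * 60 * 60    # 3-hour measurement window
TOTAL_S = 4 * 60 * 60      # full 4-hour run (30-minute cool-down at the end)

def run_scenario(name):
    """Hypothetical placeholder: execute one application scenario and return its duration in seconds."""
    start = time.perf_counter()
    # ... drive the scenario against the middle tier here ...
    return time.perf_counter() - start

def user_session(scenarios, results, t0):
    """One simulated concurrent user picking scenarios at random for the whole run."""
    while time.time() - t0 < TOTAL_S:
        name = random.choice(scenarios)
        elapsed = run_scenario(name)
        t = time.time() - t0
        if WARM_UP_S <= t <= WARM_UP_S + MEASURE_S:   # keep only steady-state results
            results.append((name, elapsed))
```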

Client Tests
A client test measures the performance of the client when it is isolated from the application. There are 26
different test scenarios for the user interface (UI) of the RoleTailored client. Unlike the application scenarios,
all of these tests run in a single-user environment so that the performance of the UI and the client can be
tested in isolation from the application. Each test is executed once for cold scenarios and twice for warm
scenarios, where only the second run is measured. For details about the scenarios, see Appendix B.
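The cold/warm measurement pattern can be illustrated with a minimal sketch; run_scenario is again a hypothetical stand-in for driving and timing one UI scenario, not part of the actual test harness.

```python
import time

def run_scenario(scenario):
    """Hypothetical stand-in for driving one RoleTailored client UI scenario and timing it."""
    start = time.perf_counter()
    # ... drive the client UI here ...
    return time.perf_counter() - start

def measure_cold(scenario):
    """Cold scenario: a single timed run against a freshly started client."""
    return run_scenario(scenario)

def measure_warm(scenario):
    """Warm scenario: run twice and record only the second timing."""
    run_scenario(scenario)          # first run warms caches and metadata; result discarded
    return run_scenario(scenario)   # only this second run is measured
```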
The next section describes the different hardware configurations that are tested. For all of these
configurations, performance is considered acceptable for the combination of hardware and the number of
users connected to the system. The client tests all abide by general goals for UI responsiveness.

Hardware Configurations

Multiuser Tests (Application Scenarios)


For application scenarios, two different hardware setups are tested. The first setup is a three-tier setup where
each tier resides on its own hardware. Two identical middle tiers run with an equal number of users
connecting to the same database tier. In this configuration, each middle tier has 50 concurrent users.
[Diagram: one database tier serving two middle tiers, each middle tier serving its own set of clients]

The second setup is a three-tier setup where the middle tier and database tier reside on a single piece of
hardware. In this setup, the server has 50 concurrent users.

[Diagram: database tier and middle tier on one server, serving multiple clients]

The following section outlines the hardware for the two setups.

Setup 1 – Three tiers with no shared hardware

Database Server

Siemens Rack Server RX300

Hardware
- Dual CPU Xeon 2.8 GHz, 400 MHz FSB
- 2 GB ECC DDR RAM, 512 KB L2 cache
- 2 x 36 GB HotPlug Ultra320 SCSI disks
  - System drive – RAID 0 – 2 disks
- External rack storage – 14 x 18 GB HotPlug Ultra320 SCSI disks
  - DB log drive – RAID 10 – 6 disks
  - DB data drive – RAID 10 – 8 disks

Software
- Microsoft Windows Server 2003 Enterprise SP1 + various server tools
- Microsoft SQL Server 2005 installed on the system drive
- eTrust Antivirus 6
- Fully security patched
- Adobe Acrobat Reader
- WinZip 8.1

Middle Tier Servers

Fujitsu Siemens Esprimo E5915

Hardware
- Intel Core 2 Duo E6300 1.83 GHz, 1066 MHz FSB
- 4 GB DDR2-533 RAM, 2 MB L2 cache
- 2 x 160 GB SATA II 7,200 RPM HDD
- NVIDIA GeForce 7200LE, 256 MB

Software
- Microsoft Windows Server 2003 Enterprise SP2 + various server tools
- Visual Studio 2005
- Microsoft SQL Server 2005 installed on the system drive
- Fully security patched

Setup 2 – Three tiers with shared hardware

Database Tier and Middle Tier

HP ProLiant DL380 G5

Hardware
- Intel Xeon E5335 quad-core processor, 2 GHz
- 8 MB (2 x 4 MB) L2 cache
- 8 GB RAM
- HP Smart Array P400/256 MB controller (RAID 0/1/1+0/5)
  - System drive – RAID 1+0
  - Temp drive – RAID 0
- External rack storage (P800 controller)
  - DB log drive – RAID 1+0
  - DB data drive – RAID 1+0

Software
- Microsoft Windows Server 2003 R2 Enterprise SP2 + various server tools
- Microsoft SQL Server 2005
- Fully security patched

Disk layout
- 2 x 146 GB in RAID 1 for the OS = 146 GB usable.
- 4 x 146 GB in RAID 10 = 292 GB usable for the DB data and DB log. The OS is on a different channel; data
  and log share the same channel.
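As a quick sanity check of the usable capacities above: mirroring (RAID 1 and RAID 1+0) keeps half of the raw capacity, while striping (RAID 0) keeps all of it. The following sketch reproduces the figures, ignoring formatting overhead:

```python
def usable_capacity_gb(disks, size_gb, raid_level):
    """Rough usable capacity for the RAID levels used above (ignores formatting overhead)."""
    if raid_level in ("1", "10", "1+0"):
        return disks * size_gb // 2   # mirroring keeps half of the raw capacity
    if raid_level == "0":
        return disks * size_gb        # striping keeps all raw capacity, no redundancy
    raise ValueError("RAID level not covered by this sketch")

print(usable_capacity_gb(2, 146, "1"))    # OS volume: 146 GB
print(usable_capacity_gb(4, 146, "10"))   # DB data and log volume: 292 GB
```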
Client Single User Tests

Client Machine

Fujitsu Siemens Esprimo E5915

Hardware
- Intel Core 2 Duo E6300 1.83 GHz, 1066 MHz FSB
- 2 GB DDR2-533 RAM, 2 MB L2 cache
- 2 x 160 GB SATA II 7,200 RPM HDD
- NVIDIA GeForce 7200LE, 256 MB

Software
- Microsoft Windows Server 2003 Enterprise SP2 + various server tools
- Visual Studio 2005
- Microsoft SQL Server 2005 installed on the system drive
- Fully security patched

Middle Tier Server

Fujitsu Siemens Esprimo E5915

Hardware
- Intel Core 2 Duo E6300 1.83 GHz, 1066 MHz FSB
- 2 GB DDR2-533 RAM, 2 MB L2 cache
- 2 x 160 GB SATA II 7,200 RPM HDD
- NVIDIA GeForce 7200LE, 256 MB

Software
- Microsoft Windows Server 2003 Enterprise SP2 + various server tools
- Visual Studio 2005
- Microsoft SQL Server 2005 installed on the system drive
- Fully security patched

Network Latency & Bandwidth Requirements

Description of Test and Method


The tests run on a two-tier setup, where machine A runs the client tests and machine B runs the Microsoft
Dynamics NAV service tier and the database, thus representing the server. These are single-user performance
tests. During each test, a network throttling tool is used on the client machine to emulate limited bandwidth
and latency on the connection to the server.
The different settings of the network throttling tool simulate high-speed ISDN and different connection
speeds for an ordinary ADSL connection. The tests do not emulate the lost packets and network errors that
might occur on a real-life Internet connection.
The latencies represent different distances between the client and the server (a simplified model of how
bandwidth and latency combine is sketched after the list below):
- 0 ms represents a setup where both machines are located on the same site, such as on a campus
- 10 ms represents a short-distance link within Denmark (Copenhagen–Aarhus, 9 ms)
- 30 ms represents a group 1 country in Europe (Germany, Switzerland, France)
- 60 ms represents countries such as Turkey, Greece, etc.
- 90 ms represents a connection between Denmark and the USA
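The throttling tool's exact behavior is not modeled here, but a simple first-order estimate helps explain why bandwidth and latency affect the results differently: bandwidth determines how fast the payload can be transferred, while every client/server round trip pays the latency in full. The payload size and round-trip count below are made-up illustrative values, not measurements of the Microsoft Dynamics NAV client.

```python
def estimated_time_ms(payload_kb, round_trips, bandwidth_kbit_s, latency_ms):
    """First-order estimate: serialization time over the link plus one latency hit per round trip."""
    transfer_ms = payload_kb * 8.0 / bandwidth_kbit_s * 1000.0
    return transfer_ms + round_trips * latency_ms

# Illustrative values only: the latency penalty grows linearly and is independent of bandwidth.
print(estimated_time_ms(500, 200, 128, 0))     # narrow link, no latency
print(estimated_time_ms(500, 200, 128, 60))    # same link with 60 ms latency added
print(estimated_time_ms(500, 200, 2048, 60))   # a wider link reduces only the transfer part
```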

Warm Client Startup

The minimum supported bandwidth and latency


The client works on a 128/128 kilobit connection with zero (0) latency, although startup takes approximately
40 seconds, which a user would probably interpret as a crash. Starting at 512/128 kilobits, the startup time is
reduced to a much more acceptable 12 seconds, compared to 6.6 seconds on a 100 megabit connection.
As latency increases, the startup time increases linearly. With a 128/128 kilobit connection and a latency of
60 milliseconds, the client needs approximately 44 seconds to start, and it only approaches the 12-second
mark at a bandwidth of 2048/128 kilobits.

The minimum supported bandwidth with no impact on performance


Another aspect is to determine the bandwidth needed, with no latency, to achieve performance that differs by
less than 5% from a normal test run without any throttling. The following graph shows a warm client startup
with a 20 megabit downlink and 10 megabit uplink connection. The time is 6720 milliseconds, compared to
6618 milliseconds with a 100 megabit connection, which is a difference of 1.5%. Keeping the uplink at 10
megabits and increasing the downlink does not have a large impact on performance in this scenario.
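The quoted 1.5% follows directly from the two measurements above:

```python
# Sanity check of the quoted difference (values taken from the text above).
throttled_ms = 6720     # 20/10 megabit connection
unthrottled_ms = 6618   # 100 megabit connection
difference = (throttled_ms - unthrottled_ms) / unthrottled_ms
print(f"{difference:.1%}")   # -> about 1.5%
```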
[Graph: Warm Client Startup (0 latency, variable bandwidth)]
Warm Create and Post New Sales Order

The minimum supported bandwidth and latency


The minimum bandwidth required in a 0 latency environment is 128/128 kilobits, although it is questionable
whether a time of over 35 seconds is usable. With a bandwidth of 2048/2048 kilobits, the 5-second mark is
reached, which is acceptable compared to 3.2 seconds in an unthrottled environment.

The minimum supported bandwidth with no impact on performance


As with client startup, the goal is to find the bandwidth at which performance differs by less than 5% from an
unthrottled connection. The following graph shows that with a 20 megabit downlink and a 10 megabit uplink
connection, the difference is less than 5%. The upload bandwidth is crucial for this scenario, so a setup with
more downlink bandwidth but less uplink bandwidth achieves poorer results than those shown above.
[Graph: Warm Create Sales Order (0 latency, variable bandwidth)]
Summary
As described above, two different configurations are tested for the multiuser tests. One is a three-machine
setup where each tier resides on a separate machine. With regard to scalability, this configuration will scale
further because there are no shared resources.
With regard to client performance, a standard desktop machine is used to perform the client tests. The
middle tier is loaded only with the connection from the client being measured. This enables the client to be
measured by itself rather than together with the rest of the stack.
Appendix A – Application Scenarios

Scenario – Description
- 100 Create GL Transaction – Application Benchmark Tool codeunit 99600, Profile-Create GL Transaction
- 101 Post GL Transactions (All) – Application Benchmark Tool codeunit 99601, Profile-Post GL Transaction
- 150 Chart of Account simulation – Application Benchmark Tool codeunit 99602, Profile-ChartofAcc. simulation
- 200 Create Sales Quote and Make Order – Application Benchmark Tool codeunit 99604, Profile-Create SQ, SO, SI, SC
- 201 Create Sales Order (% Post) – Application Benchmark Tool codeunit 99604, Profile-Create SQ, SO, SI, SC
- 202 Create and Post Sales Order (Ship & Invoice) – Application Benchmark Tool codeunit 99604, Profile-Create SQ, SO, SI, SC
- 203 Create Sales Invoice (% Post) – Application Benchmark Tool codeunit 99606, Profile-Create SQ, SO, SI, SC
- 204 Create and Post Sales Invoice – Application Benchmark Tool codeunit 99604, Profile-Create SQ, SO, SI, SC
- 206 Post Shipment from Sales Order – Application Benchmark Tool codeunit 99604, Profile-Create SQ, SO, SI, SC
- 207 Post Invoice from Sales Order – Application Benchmark Tool codeunit 99604, Profile-Create SQ, SO, SI, SC
- 210 Create Customer Receipt – Application Benchmark Tool codeunit 99600, Profile-Create GL Transaction
- 211 Post Customer Receipt (All) – Application Benchmark Tool codeunit 99601, Profile-Post GL Transaction
- 250 Customer Lookup - simulation – Application Benchmark Tool codeunit 99607, Profile-Customer Lookup Sim.
- 251 Sales Documents Lookup - simulation – Application Benchmark Tool codeunit 99616, Profile-Sales Doc. Lookup Sim.
- 300 Create Purchase Quote and Make Order – Application Benchmark Tool codeunit 99609, Profile-Create PQ, PS, PI, PC
- 301 Create Purchase Order (% Post) – Application Benchmark Tool codeunit 99609, Profile-Create PQ, PS, PI, PC
- 302 Create and Post Purchase Order (Receive & Invoice) – Application Benchmark Tool codeunit 99609, Profile-Create PQ, PS, PI, PC
- 303 Create Purchase Invoice (% Post) – Application Benchmark Tool codeunit 99609, Profile-Create PQ, PS, PI, PC
- 304 Create and Post Purchase Invoice – Application Benchmark Tool codeunit 99609, Profile-Create PQ, PS, PI, PC
- 306 Post Receipt from Purchase Order – Application Benchmark Tool codeunit 99609, Profile-Create PQ, PS, PI, PC
- 307 Post Purchase Invoice – Application Benchmark Tool codeunit 99609, Profile-Create PQ, PS, PI, PC
- 310 Create Vendor Payment – Application Benchmark Tool codeunit 99600, Profile-Create GL Transaction
- 311 Post Vendor Payment (All) – Application Benchmark Tool codeunit 99601, Profile-Post GL Transaction
- 350 Vendor Lookup - simulation – Application Benchmark Tool codeunit 99612, Profile-Vendor Lookup Sim.
- 351 Purchase Document Lookup - simulation – Application Benchmark Tool codeunit 99617, Profile-Purchase Doc. Lookup
- 450 Item Lookup - simulation – Application Benchmark Tool codeunit 99614, Profile-Item Lookup Simulation
Appendix B – Client Scenarios
Scenario – Description
- Cold client startup – The time that it takes to launch the Microsoft Dynamics NAV client. Cold client means that the NST is running but has not had any client connections (the client start is the first connection to a specific NST; this tests not only the client startup but also the NST loading time).
- Warm client startup – Measures the time that it takes to start the Microsoft Dynamics NAV client until the client becomes responsive.
- Open Sales Order List Place - cold – Opens the Sales Order List Place from a stack on the home page.
- Open Sales Order List Place - warm – Opens the Sales Order List Place from a stack on the home page.
- Step to next line on sales order list place – Uses the down arrow on the sales order list place line and verifies that the focus in the list is moved to the next row.
- Refresh InfoPart after moving to new sales order line – Open the Sales Order List Place and select a line. Select a new line in the list place and measure the amount of time it takes to refresh the related fact box.
- Open existing sales order - cold – Opens an existing (unposted) sales order as a task page from the sales order list place by clicking any line in the sales order list.
- Open existing sales order - warm – Opens an existing (unposted) sales order as a task page from the sales order list place by clicking any line in the sales order list.
- Enter a new sales order line – Clicks a new line in the sales order to put focus on the line type field.
- Set sales order line type – Sets the line type to Item and then moves the focus. Measures the time that it takes to move the focus to the next field.
- Add item – Adds item 1000 and then moves the focus. Item line information is automatically entered for the line.
- Create new Sales order - cold – Creates a new sales order (click from promoted actions), which makes a new task page.
- Create new Sales order - warm – Creates a new sales order (click from promoted actions), which makes a new task page.
- Automatically generate sales order number – Moves from the Sales Order No. field to a Customer field. Measures the time that it takes to generate the sales order number.
- Show customer drop-down list – Moves from Sales Order No. to the Customer field. Measures the time it takes to open the customer number drop-down.
- Select customer – Clicks to select customer 10000, which closes the drop-down list and transfers the number to the customer ID field.
- Add customer information to sales order after selection – A customer has been selected as described in S04.09. Measures the time that it takes to press the TAB key in the customer ID field and fill the sales order with customer data.
- Show customer details – The same click from S4.9 shows the data in the customer details InfoPart.
- Show item line info – Adds item 1000 and, after moving to the next field, shows the remaining item line information.
- Show item details – After moving from S4.11, shows the data in the item details InfoPart.
- Change focus and show item details – The user verifies that she has chosen the correct items by reselecting the first item line information and viewing the item details InfoPart data for that line. Measures the time that it takes to render an InfoPart. 200 milliseconds for changing focus (this should be split into two scenarios).
- Post and ship – Posts and ships by pressing F11.
- Sales order validation – On a new sales order, sets the shipment code to 'aaa'. Leaves the field, and an error message displays that the code is invalid.

You might also like