Automating Microsoft®
SQL Server™ 2005
Databases and Servers
Delivery Guide
Course Number: 2789
Microsoft may have patents, patent applications, trademarks, copyrights, or other intellectual
property rights covering subject matter in this document. Except as expressly provided in any
written license agreement from Microsoft, the furnishing of this document does not give you any
license to these patents, trademarks, copyrights, or other intellectual property.
Microsoft, MS-DOS, Windows, and Windows NT are either registered trademarks or trademarks of
Microsoft Corporation in the U.S.A. and/or other countries.
The names of actual companies and products mentioned herein may be the trademarks of their
respective owners.
Beta
Contents
Course objectives
After completing this course, the student will be able to:
Course Timing
The following schedule is an estimate of the course timing. Your timing may
vary.
Day 1
Start End Module
9:00 9:30 Introduction
Module x: Title
Break
Lab x: Title
Lunch
Lab x: Title (continued)
Break
4:00
Day 2
Start End Module
9:00 9:30 Day 1 review
Break
Lunch
Break
4:00
Day 3
Start End Module
9:00 9:30 Day 2 review
Break
Lunch
Break
4:00
Day 4
Start End Module
9:00 9:30 Day 3 review
Break
Lunch
Break
4:00
Day 5
Start End Module
9:00 9:30 Day 4 review
Break
Lunch
Break
4:00
Document Conventions
The following conventions are used in course materials to distinguish elements
of the text.
Convention Use
Introduction
At the end of this module, you will be able to describe this course and its purpose.
Course Materials
Course evaluation
You will have the opportunity to provide feedback about the course, training facility, and
instructor by completing an online evaluation near the end of the course.
Document conventions
The following conventions are used in course materials to distinguish elements of the text.
Convention Use
Bold
    Represents commands, command options, and syntax that must be typed exactly as
    shown. It also indicates commands on menus and buttons, dialog box titles and
    options, and icon and menu names.
Italic
    In syntax statements or descriptive text, indicates argument names or placeholders
    for variable information. Italic is also used to introduce new terms, and for book
    titles and emphasis in the text.
Title Capitals
    Indicate domain names, user names, computer names, directory names, and folder
    and file names, except when specifically referring to case-sensitive names. Unless
    otherwise indicated, you can use lowercase letters when you type a directory name
    or file name in a dialog box or at a command prompt.
ALL CAPITALS
    Indicate the names of keys, key sequences, and key combinations; for example,
    ALT+SPACEBAR.
try/Try
    Keywords in Microsoft® Visual C#® and Visual Basic® .NET are separated by a
    forward slash when their casing differs.
monospace
    Represents code samples or examples of screen text.
[]
    In syntax statements, brackets enclose optional items. For example, [filename] in
    command syntax indicates that you can choose to type a file name with the
    command. Type only the information within the brackets, not the brackets
    themselves.
{}
    In syntax statements, braces enclose required items. Type only the information
    within the braces, not the braces themselves.
|
    In syntax statements, separates an either/or choice.
Providing feedback
To provide additional comments or feedback about the course, send e-mail to
support@mscourseware.com. To ask about the Microsoft Certification Program, send e-mail to
mcphelp@microsoft.com.
Clinics are for IT professionals, developers, and technical decision makers. Clinics offer a detailed
presentation that may describe the features and functionality of an existing or new Microsoft product
or technology, provide guidelines and best practices for decision making, and/or showcase product
demonstrations and solutions. Clinics focus on how specific features will solve business problems.
Stand-alone Hands-On Labs provide IT professionals and developers with hands-on experience with
an existing or new Microsoft product or technology. Hands-on labs provide a realistic and safe
environment to encourage knowledge transfer by learning through doing. The labs provided are
completely prescriptive so that no lab answer keys are required. There is very little lecture or text
content provided in hands-on labs, aside from lab introductions, context setting, and lab reviews.
Facilities
Inform students of class logistics and rules for the training site.
Microsoft Learning
Fact: Describe certifications for which this course helps you prepare.
Introduction
Microsoft Learning offers a variety of certification credentials for developers and IT professionals.
The Microsoft Certified Professional (MCP) program is the leading certification program for
validating your experience and skills, keeping you competitive in today’s changing business
environment.
Related certification exams
This course helps students to prepare for:
• Exam 70–431: TS: Microsoft® SQL Server™ 2005 - Implementation and Maintenance
• Exam 70–444: PRO: Optimizing and Maintaining a Database Administration Solution by
Using Microsoft SQL Server 2005
MCP certifications
The Microsoft Certified Professional program includes the following certifications.
MCDST on Microsoft Windows®
The Microsoft Certified Desktop Support Technician (MCDST) certification is designed for
professionals who successfully support and educate end users and troubleshoot operating system
and application issues on desktop computers running the Windows operating system.
MCSA on Microsoft Windows Server™ 2003
The Microsoft Certified Systems Administrator (MCSA) certification is designed for
professionals who implement, manage, and troubleshoot existing network and system
environments based on the Windows Server 2003 platform. Implementation responsibilities
include installing and configuring parts of systems. Management responsibilities include
administering and supporting systems.
MCSE on Microsoft Windows Server 2003
The Microsoft Certified Systems Engineer (MCSE) credential is the premier certification for
professionals who analyze business requirements and design and implement infrastructure for
business solutions based on the Windows Server 2003 platform. Implementation responsibilities
include installing, configuring, and troubleshooting network systems.
MCAD
The Microsoft Certified Application Developer (MCAD) for Microsoft .NET credential is
appropriate for professionals who use Microsoft technologies to develop and maintain
department-level applications, components, Web or desktop clients, or back-end data services, or
who work in teams developing enterprise applications. This credential covers job tasks ranging
from developing to deploying and maintaining these solutions.
MCSD
The Microsoft Certified Solution Developer (MCSD) credential is the premier certification for
professionals who design and develop leading-edge business solutions with Microsoft
development tools, technologies, platforms, and the Microsoft Windows DNA architecture. The
types of applications that MCSDs can develop include desktop applications and multiuser, Web-
based, N-tier, and transaction-based applications. The credential covers job tasks ranging from
analyzing business requirements to maintaining solutions.
MCDBA on Microsoft SQL Server™ 2000
The Microsoft Certified Database Administrator (MCDBA) credential is the premier certification
for professionals who implement and administer SQL Server databases. The certification is
appropriate for individuals who derive physical database designs, develop logical data models,
create physical databases, create data services by using Transact-SQL, manage and maintain
databases, configure and manage security, monitor and optimize databases, and install and
configure SQL Server.
MCP
The Microsoft Certified Professional (MCP) credential is for individuals who have the skills to
successfully implement a Microsoft product or technology as part of a business solution in an
organization. Hands-on experience with the product is necessary to successfully achieve
certification.
MCT
Microsoft Certified Trainers (MCTs) demonstrate the instructional and technical skills that qualify
them to deliver Official Microsoft Learning Products through a Microsoft Certified Partner for
Learning Solutions (CPLS).
Certification requirements
Certification requirements differ for each certification category and are specific to the products and
job functions addressed by the certification. To become a Microsoft Certified Professional, you must
pass rigorous certification exams that provide a valid and reliable measure of technical proficiency
and expertise.
You can also send e-mail to mcphelp@microsoft.com if you have specific certification questions.
Acquiring the skills tested by an MCP exam
Official Microsoft Learning Products can help you develop the skills that you need to do your job.
They also complement the experience that you gain while working with Microsoft products and
technologies. However, no one-to-one correlation exists between Official Microsoft Learning
Products and MCP exams. Microsoft does not expect or intend for the courses to be the sole
preparation method for passing MCP exams. Practical product knowledge and experience is also
necessary to pass MCP exams.
To help prepare for MCP exams, use the preparation guides that are available for each exam. Each
Exam Preparation Guide contains exam-specific information, such as a list of the topics on which you
will be tested. These guides are available on the Microsoft Learning Web site at
http://www.microsoft.com/learning/.
To benefit from this course, students must meet the following prerequisites:
• Must have some experience with database design. Specifically, they must fully understand
Third Normal Form (3NF), be able to design a database to 3NF (fully normalized), and know
the tradeoffs when backing out of the fully normalized design (denormalization; that is,
designing for performance and/or business requirements). They should also be familiar with
specific design models, such as Star and Snowflake schemas.
• Must have basic monitoring and troubleshooting skills.
• Must have working knowledge of the operating system and platform. That is, how the
operating system integrates with the database, what the platform or operating system can do,
and the interaction between the operating system and the database.
• Must have basic knowledge of application architecture. That is, how applications can be
designed in three layers, what applications can do, interactions between applications and the
database, and interactions between the database and the platform or operating system.
• Must know how to use:
o Third-party database administration and management tools
o Source control software
• Must have been exposed to the new features and terminology of Microsoft SQL Server
2005.
Important
This learning product will be most useful to people who are already working in the job role of
Database Administrator and who intend to use their new skills and knowledge on the job immediately
after training.
Course objectives
After completing the course, you will be able to:
• Manage and automate databases and servers.
• Manage supporting services.
Course Outline
Setup
Important
If, when performing the hands-on activities, you make any changes to the virtual machine and do not
want to save them, you can close the virtual machine without saving the changes. This will take the
virtual machine back to the most recently saved state. To close a virtual machine without saving the
changes, perform the following steps:
1. On the virtual machine, on the Action menu, click Close.
2. In the Close dialog box, in the What do you want the virtual machine to do? list, click Turn off
and delete changes, and then click OK.
If you save changes, any operation that affects system configuration or files on drive C will be
persisted between modules, but each module has its own D drive.
Software configuration
The classroom computers use the following software:
• Microsoft Windows Server 2003
• Microsoft SQL Server 2005
• Microsoft Office 2003.
Course files
There are files associated with the demonstrations, practices, and labs in this course. The files are
located on each student computer, on drive D.
Classroom setup
Each classroom computer will have the same virtual machine configured in the same way. Windows
Server 2003 is installed in a workgroup and has the server name MIAMI. Three instances of SQL
Server 2005 are installed: a default instance and two named instances with the names
SQLINSTANCE1 and SQLINSTANCE2.
Course hardware level
To ensure a satisfactory student experience, Microsoft Learning requires a minimum equipment
configuration for trainer and student computers in all Microsoft Certified Partner for Learning
Solutions (CPLS) classrooms in which Official Microsoft Learning Products are used. This course
requires computers that meet or exceed the following specification:
Component Requirement
Processor Pentium III or equivalent personal computer with processor speed greater than or equal to 1 GHz
Hard Disk At least 18 GB, 7200 RPM; larger drives are recommended where storage of multiple Virtual PC
courses is desired.
RAM At least 1 GB
DVD/CD CD-ROM/DVD
Keyboard shortcuts
While working in the Virtual PC environment, you might find it helpful to use keyboard shortcuts. All
Virtual PC shortcuts include a key that is referred to as the HOST key or the RIGHT-ALT key. By
default, the HOST key is the ALT key on the right side of your keyboard. Some useful shortcuts
include:
• RIGHT-ALT+DELETE to log on to the Virtual PC.
• RIGHT-ALT+ENTER to switch between full-screen and window modes.
• RIGHT-ALT+RIGHT ARROW to display the next virtual machine.
For more information about using Virtual PC, see Virtual PC Help.
Categories and achievement targets for this title (and proposed place to teach and/or reinforce)
Most important conceptual knowledge and understanding:
o Best practices for administering and managing SSIS packages (Module 2)
o Best practices for administering and managing RS packages (Module 2)
o Best practices for administering and managing replication (Module 2)
o Good change control strategies (Module 1)
o Methods for automating the management of large numbers of servers (Module 1)
o The ramifications of server maintenance on performance and availability (for example,
index rebuilds) (Module 1)
o The role of automation in increasing productivity. Management takes a lot of time, so
daily activities need to be automated. (Modules 1 and 2)
Categories and achievement targets for this title (and proposed place to teach and/or reinforce)
Most important problems for students to solve:
o Deciding which types of tasks can benefit from automation and which cannot (Module 1)
o Maximizing the availability and minimizing the performance impact of server
maintenance (Modules 1 and 2)
o Managing development, test, and production databases (Module 1)
Note to design team: Due to the time constraints of our courses, students do NOT need
to demonstrate these dispositions (or even agree with them). However, for ILT, this
information should at least be communicated to instructors, who can share this
information with students, watch for these things during learner activities, and praise
students individually for what they see them demonstrate. This information could also
be shared with students via a “What Matters Most” appendix or some other way.
Important
As the course progresses, if you feel that you have not adequately learned something mentioned in this
table, ask questions of the instructor and your peers until you are satisfied that you understand a
concept or know how to do something. Also, you will not be able to learn everything you need to do
this complex job in a one-day course. Take note of the recommended additional reading included
throughout the course, and schedule yourself some additional time to read the supplementary
materials. Your instructor and peers will have additional and more up-to-date ideas about where to go
for additional information. Ask them about additional resources that you can use after class.
business. It has a number of servers, as described below, and is suffering from a lack of storage
space, a part-time database administrator who is struggling to keep up with the demands of business
growth, and an increasing need for data availability and reporting.
Because the company is heavily focused on an Internet sales strategy, it needs a reliable 24x7
computing infrastructure to support its head office and regional sales offices.
Network Environment
Adventure Works runs a database server, an Exchange server, a Web server, and a file server that is
running out of space.
Database Server Environment
The main OLTP database for Adventure Works Cycles is AdventureWorks, housed at the
headquarters. Another database that serves as a data warehouse is also installed on the
AdventureWorks database server. SQL Server 2005 Reporting Services is also used.
Current Situation
The part-time DBA has not identified an automation strategy. Therefore, you need to establish the
automation strategy and maintain the administration and automation document. The reporting services
load is high, the data warehouse is becoming more important to the business operations, and the
company is considering scaling out to another database server.
Your role in Adventure Works Cycles
Adventure Works Cycles has not been able to afford a full-time DBA and has been getting by with
part-time database support. You have been hired to provide full-time DBA support and are now
working on fixing gaps that have been identified in the Adventure Works systems. There are several
key requirements: to automate the running and maintenance of this highly demanding environment, to
provide a stable database environment that can reliably support the business critical data warehouse,
and the increasing demands of the reporting services. From its experience with part-time DBAs, the
management of the company is insistent that the new DBAs properly document the system and keep
documentation up-to-date with any changes to environment, processes, or procedures.
Module objective:
After completing this module you will be able to:
Manage and automate databases and servers.
Introduction
In the daily administration of a Microsoft® SQL Server installation there are a number of tools and
techniques that can make the everyday running, maintenance, and record keeping about the
installation less burdensome for the database administrator (DBA). A well-run installation that uses
proactive management techniques is less likely to have performance problems, database corruption,
faulty backups, or other problems than sites that do not use automation and documentation properly.
Such automation also reduces problems caused by human error. If you carefully plan and document
the various automated processes and procedures, you can minimize downtime in the event of database
disaster recovery situations and other routine DBA interventions, such as rolling back an incorrect
global update.
In this module, you will learn how to plan and implement automation procedures and processes that
help you maintain SQL Server databases and servers. You will also learn how to document these plans
and procedures in a run book so that the information about the entire set of processes is properly
described and maintained. This planning and implementation make it much easier for any DBA, even
a temporary contract DBA, to know what activity to do, how to do it, and when to do it.
Lesson objective
After completing this lesson, you will be able to:
Tip
Often, the decision not to automate is not entirely clear-cut. It might be worthwhile to automate some
activities that are susceptible to exceptions to see how long the automation takes, so that you can
decide whether it is worthwhile for other, similar activities. Additionally, an activity that started off as
a low-frequency event might become more common, and then the investment of time in automating
the process or procedure can prove to be more beneficial than first thought. Consequently, it is
important to review your database administration processes and requirements on a regular basis.
With the exception of the sysadmin fixed server role, the default condition is that no user is a member
of any of these roles. So, aside from the sysadmin role, all SQL Server Agent access must be explicitly
granted for all users who need such access.
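Because the SQL Server Agent roles are msdb database roles, explicit access is typically granted by adding a user to one of them. A minimal sketch (the login and user names are hypothetical):

```sql
-- Grant a (hypothetical) login access to SQL Server Agent by adding its
-- msdb user to the least-privileged Agent role, SQLAgentUserRole.
USE msdb;
GO
CREATE USER JuniorDBA FOR LOGIN [ADVWORKS\JuniorDBA];
GO
EXEC dbo.sp_addrolemember
    @rolename   = N'SQLAgentUserRole',  -- or SQLAgentReaderRole / SQLAgentOperatorRole
    @membername = N'JuniorDBA';
GO
```

SQLAgentUserRole members can manage only jobs they own; SQLAgentReaderRole adds visibility of all jobs, and SQLAgentOperatorRole adds limited control over all local jobs.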
To manage the security of the SQL Server Agent environment you must consider the following:
• SQL Server Agent best practice operation assumes the use of proxy accounts.
• Proxy accounts allow for predictable behavior and restricted execution permissions.
• Running the SQL Server Agent service under a Microsoft Windows account that is a member
of the Windows Administrators group is risky and should be avoided.
• Proxy accounts are “purpose-built” accounts that are designed by the system administrator to
have only the privileges that are required for a particular job or job step. They can be used by
members of the various SQL Server Agent roles. The system administrator can also create
jobs that run under the SQL Server Agent service account.
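Creating a proxy follows the pattern sketched below. The credential, account, and proxy names here are hypothetical, and the underlying Windows account should hold only the permissions that the job step actually needs:

```sql
-- Create a credential that maps to a low-privilege Windows account,
-- then wrap it in a SQL Server Agent proxy for the CmdExec subsystem.
USE master;
GO
CREATE CREDENTIAL AgentCmdExecCred
    WITH IDENTITY = N'ADVWORKS\SqlAgentCmdExec',  -- hypothetical account
         SECRET   = N'Pa$$w0rd';                  -- that account's password
GO
USE msdb;
GO
EXEC dbo.sp_add_proxy
    @proxy_name      = N'CmdExecProxy',
    @credential_name = N'AgentCmdExecCred',
    @enabled         = 1;
GO
-- Allow the proxy to be used for operating-system command job steps.
EXEC dbo.sp_grant_proxy_to_subsystem
    @proxy_name     = N'CmdExecProxy',
    @subsystem_name = N'CmdExec';
GO
-- Let a (hypothetical) non-sysadmin login assign this proxy to job steps.
EXEC dbo.sp_grant_login_to_proxy
    @login_name = N'ADVWORKS\JuniorDBA',
    @proxy_name = N'CmdExecProxy';
GO
```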
For more information
For more details on the use of proxies with SQL Server Agent, see the topic “Security for SQL Server
Agent Administration” in Books Online.
Considerations for capturing and using SQL Server Agent scripts
You should make sure that you allow time for capturing SQL Server Agent jobs as scripts so they can
be reused, distributed, or used for documentation. You must also consider how you will do the
captures, who will do them, and whether you will use them. Specifically, consider the following:
1. To capture SQL Server Agent job step scripts, you must be connected to the appropriate SQL
Server instance.
2. Under the SQL Server Agent object in the Object Explorer, expand the Jobs list.
3. Right-click the job that you are interested in, point to the Script Job as option, and then
click either the CREATE To or DROP To item.
From that point, choose one of the following three options:
• New Query Editor Window
• File
• Clipboard
You can then work with the script as you need to, add documentation comments, and save the
completed script in Microsoft Visual SourceSafe® for purposes of security and version control. You
might also want to store the scripts as part of your run book.
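A captured CREATE To script is ordinary Transact-SQL built on the msdb job procedures, so it can be commented, checked into version control, and replayed on another instance. A simplified sketch of the shape such a script takes (the job, step, and schedule names are illustrative):

```sql
USE msdb;
GO
-- Create the job shell.
EXEC dbo.sp_add_job
    @job_name = N'Nightly AdventureWorks maintenance';
-- Add a Transact-SQL job step.
EXEC dbo.sp_add_jobstep
    @job_name      = N'Nightly AdventureWorks maintenance',
    @step_name     = N'Check database integrity',
    @subsystem     = N'TSQL',
    @command       = N'DBCC CHECKDB (AdventureWorks);',
    @database_name = N'AdventureWorks';
-- Attach a nightly schedule.
EXEC dbo.sp_add_schedule
    @schedule_name     = N'Nightly at 01:00',
    @freq_type         = 4,        -- daily
    @freq_interval     = 1,
    @active_start_time = 010000;   -- HHMMSS
EXEC dbo.sp_attach_schedule
    @job_name      = N'Nightly AdventureWorks maintenance',
    @schedule_name = N'Nightly at 01:00';
-- Target the local server so SQL Server Agent will run the job.
EXEC dbo.sp_add_jobserver
    @job_name = N'Nightly AdventureWorks maintenance';
GO
```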
• Identify the restrictions on executing jobs concurrently. Depending on the resources that
they access, jobs can be compatible or incompatible for concurrent execution.
Compatible Concurrency
Two or more operations can run concurrently if they are fully compatible. If they
are partially compatible, they might run satisfactorily together, but it might be
preferable to run them sequentially for overall efficiency. For example, a
reindexing operation in one database can run concurrently with a full backup job
of another database, but it is preferable that they do not run concurrently because
of the high input/output (I/O) activity that both operations produce.
Incompatible concurrency
Incompatible concurrency means that you cannot run two operations
simultaneously. For example, you should not re-index a table while a data
archiving operation affects the same table.
• Coordinate with other enterprise activities to avoid conflicts. For example:
Tape Backup
A tape backup is a sequential device, and only one resource can access it at a
time. Multiple resources accessing a tape drive can lead to conflicts.
ETL routine
Data extraction and load routines might conflict with other database activities. It
does not make sense to try to import large quantities of data into a table while
that table is being heavily used.
Periodic activity
Periodic activities, such as month-end closing jobs that populate summary tables,
can lead to conflicts with other jobs. You can avoid such conflicts by
coordinating the job execution.
• Build a job dependency diagram to avoid conflicts and coordinate jobs efficiently. You must
clearly define the inputs and outputs of a job, and update the diagram. Inputs are resources
that provide value to a process, and outputs are results of a process. Examples of inputs and
outputs can include offices, servers, physical devices, databases, and database objects.
• Consider the granularity for a job definition when you build the dependency diagram. For
example, you can design a diagram to show that one job depends on another job or you can
show specific steps within the jobs to reflect particular process dependencies. The DBA’s task
is to identify the appropriate level of detail for dependency analysis. It is usually best to use
the more granular approach and determine specific steps.
Tip
If you do not have a lot of practice creating such diagrams, sometimes it is best to step through the
dependencies for tasks with which you are familiar, such as the dependencies in feeding a pet or doing
a particular maintenance task on a car or motorbike.
Preparation
Ensure that the virtual machine 2789A-MIA-SQL-01 is running and that you are logged on as
Student.
If a virtual machine has not been started, perform the following steps:
1. Close any other running virtual machines.
2. Start the virtual machine.
3. In the Log On to dialog box, complete the logon procedure by using the user name Student
and the password Pa$$w0rd.
Create jobs
To create jobs, perform the following steps.
1. Start SQL Server Management Studio, connecting to the MIAMI Database Engine by using
Windows authentication.
2. In SQL Server Management Studio, open the 1-
prerequisites_table_and_stored_procedure_creation.sql script in D:\Democode and
execute it. Connect to MIAMI by using Windows authentication when prompted.
3. Open the 2-prerequisites_job_creation1.sql script in D:\Democode and execute it. Connect
to MIAMI by using Windows authentication when prompted.
4. Open the 3-prerequisites_job_creation2.sql script in D:\Democode and execute it. Connect
to MIAMI by using Windows authentication when prompted.
To start a job
To start a job, perform the following steps.
1. In SQL Server Management Studio, open Object Explorer, expand SQL Server Instance,
expand SQL Server Agent, and then expand Jobs.
2. Right-click the Fill ProductSalesInCurrentDay1 in AdventureWorks job, and then click
Start Job. Perform the next procedure as soon as the job begins.
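Starting a job from the Object Explorer is equivalent to calling sp_start_job, and the same call can be scripted, which is useful when you automate job control:

```sql
USE msdb;
GO
-- Start the demonstration job by name. The call is asynchronous: it
-- returns as soon as SQL Server Agent has been asked to run the job.
EXEC dbo.sp_start_job
    @job_name = N'Fill ProductSalesInCurrentDay1 in AdventureWorks';
GO
-- Review the current execution status and history for the job.
EXEC dbo.sp_help_job
    @job_name = N'Fill ProductSalesInCurrentDay1 in AdventureWorks';
GO
```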
Note
You might want to discuss this topic with your business managers as well as members of this class.
Many businesses have been unpleasantly surprised to discover that they did not have adequate systems
in place to deal with disasters. It might also be of interest to DBAs to investigate the area generally
known as “business continuity” and their role in providing such continuity.
It is impossible to overemphasize the degree to which some critical databases affect businesses or
people’s lives. One of the assessments the DBA should make of each and every database is the degree
to which it is mission critical for any particular activity or purpose. One of the most important things
for DBAs to appreciate is that the businesses or organizations for which they work might well cease to
exist if major problems strike the database and proper steps have not been taken to ensure that these
databases are intact, are properly backed up, and are coherent. You must define the appropriate
backup strategy based on the business requirements of the organization.
Tip
Create a list of all the ways in which the databases you deal with could contribute to the success or
failure of your company or organization. Discuss this list with your staff and your managers to
identify any gaps in your understanding of business criticality.
It is important to consider how much time the system requires to perform the backups, the amount of
allowable data loss in case of a system disaster, and how to minimize the downtime of the system
during a recovery. Factors that you need to consider include the following:
Impact of backup operations on system performance
The frequency and type of backups you choose to implement can have a major impact on database
performance. For highly active databases, you might consider using filegroup backups to reduce the
time a backup operation takes and therefore minimize the impact on performance.
Backup plan complexity
Your backup plan can be as simple as performing regular full database backups or it can involve a
combination of full, differential, and log backups. Consider the business requirements in selecting
your strategy for availability and recoverability.
Point of recovery
In some cases, you might need to be able to recover a database to a specific point in time to back out
unwanted changes. If this is the case, you need to design your backup plan to include log backups.
Recovery time
You need to consider the impact of the time taken to get the database back online and operational.
Scheduling an additional differential backup every three hours would increase the complexity of a
backup plan, but significantly reduce the amount of time required to recover the database.
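A plan combining the three backup types from this discussion might be scripted as follows. The paths and the choice of database are illustrative, and in practice each statement would run on its own SQL Server Agent schedule:

```sql
-- Weekly full backup: the baseline for every restore sequence.
BACKUP DATABASE AdventureWorks
    TO DISK = N'D:\Backups\AdventureWorks_full.bak'
    WITH INIT;

-- Differential backup (for example, every three hours): captures only
-- changes since the last full backup, shortening recovery time.
BACKUP DATABASE AdventureWorks
    TO DISK = N'D:\Backups\AdventureWorks_diff.bak'
    WITH DIFFERENTIAL;

-- Frequent log backups: enable point-in-time recovery (the database
-- must use the FULL or BULK_LOGGED recovery model).
BACKUP LOG AdventureWorks
    TO DISK = N'D:\Backups\AdventureWorks_log.trn';
```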
Impact on warm spare servers
Evaluate the backup strategy’s impact on all the servers involved in warm spares. For example, in a
full synchronous database mirroring scenario, you should consider how the principal server will be
affected by database backups in the mirror server. In log shipping scenarios, you should consider how
the full database backup affects the transaction log backups that are transferred to the secondary
servers.
Data Definition Language (DDL) changes
The software life cycle involves different servers, such as development servers, quality assurance
servers, and production servers. Changes made in development environments have to be applied first
to quality assurance servers and then eventually to production servers during maintenance windows.
You should carefully consider the update process and make backups before applying DDL changes.
You should also consider what types of DDL changes create risks and be especially cautious when
applying those changes to production systems.
It is important to identify backup needs and requirements that in turn require additional hardware,
software tools, routines, or personnel. Failure to attend to the entire range of issues involved can cause
the system to run poorly; data to be lost; users to become frustrated; manufacturing or sales to be
affected adversely; and critical control systems in medical, scientific, or engineering data
environments to fail, sometimes with catastrophic effects.
Index Maintenance Considerations
Regular maintenance of indexes on tables and views can make significant differences to performance,
user satisfaction, and productivity. It is important to identify indexing needs through proactive and
thoughtful design, systematic analysis, and regular system health checks on indexing. Tools to
maintain indexes include system stored procedures, the Database Engine Tuning Advisor, and SQL
Server Management Studio.
Statistics update
Automatic maintenance of statistics is generally the best approach. If you determine that automatic
updating is not needed or desirable, you need to decide on an update frequency and automate the updates.
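If you do take manual control of statistics, the scheduled job step usually reduces to one of the following (the table name is illustrative, taken from the AdventureWorks sample database):

```sql
-- Leave automatic maintenance on (the default, and usually best):
ALTER DATABASE AdventureWorks SET AUTO_UPDATE_STATISTICS ON;

-- Or, when automating manual updates, refresh one table's statistics
-- with a full scan for maximum accuracy...
UPDATE STATISTICS Sales.SalesOrderDetail WITH FULLSCAN;

-- ...or refresh every out-of-date statistic in the current database.
EXEC sp_updatestats;
```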
Defragmentation: online/offline
You must determine the appropriate defragmentation strategy. You need to consider whether to perform the defragmentation operation online (a new mode available in SQL Server 2005 Enterprise Edition) or offline (the default). Online operations avoid blocking, thereby permitting more concurrency, but they are more time-consuming. DBAs should balance the defragmentation needs of the database against the amount of live activity to avoid affecting overall server performance.
Reorganization
Whereas defragmentation (rebuild) operations re-create the entire index structure, reorganization operations defragment only the leaf-level pages of an index. Although time-consuming, reorganization operations are less resource-intensive and more suitable for concurrent environments. Reorganization can be applied as a solution in situations where you cannot perform full rebuild operations.
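In SQL Server 2005, both operations are exposed through the ALTER INDEX statement. A brief sketch, assuming an illustrative index IX_Orders_CustomerID on a table dbo.Orders:

```sql
-- Full rebuild of the index structure;
-- ONLINE = ON requires SQL Server 2005 Enterprise Edition
ALTER INDEX IX_Orders_CustomerID ON dbo.Orders
REBUILD WITH (ONLINE = ON);

-- Leaf-level reorganization; always performed online
-- and less resource-intensive than a rebuild
ALTER INDEX IX_Orders_CustomerID ON dbo.Orders
REORGANIZE;
```

See the Books Online topic "ALTER INDEX (Transact-SQL)" for the complete syntax and edition restrictions.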
Query analysis and index appropriateness
There should be thorough query analysis, testing, and verification processes in place during all stages of the software life cycle. Indexes should be checked for usage, efficiency, and the degree to which routine production queries actually benefit from them.
Important
DBCC commands are familiar to most DBAs, but you should check the topics “DBCC (Transact-SQL)” and “Check Database Integrity Task” for the SQL Server 2005 guidance on these tools. The emphasis is shifting toward heavier reliance on automated tasks. See the indicated topics and their related topics. Also notice in topics such as “DBCC INDEXDEFRAG (Transact-SQL)” that a number of DBCC commands will be removed in future versions of SQL Server. Ensure that your automation does not rely on commands or features that will eventually disappear from the product.
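For example, instead of relying on deprecated commands such as DBCC SHOWCONTIG, fragmentation checks can be automated with the sys.dm_db_index_physical_stats dynamic management function. A sketch; the 10 percent threshold is illustrative and should be tuned to your environment:

```sql
-- Report average fragmentation for all indexes in the current database,
-- flagging those above an illustrative 10 percent threshold
SELECT OBJECT_NAME(object_id) AS table_name,
       index_id,
       avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED')
WHERE avg_fragmentation_in_percent > 10;
```

See the Books Online topic "sys.dm_db_index_physical_stats" for the parameter and mode details.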
DBAs need to communicate with other sections of the business and establish regular ways of staying
informed about business processes and requirements, particularly ones that might change. Various
other forms of automation, e-mail alias use, and project management can aid such communications.
On the management side, both IT and business managers need to clearly communicate their
requirements to the DBAs. Understanding business needs, being able to communicate well in both
spoken and written forms, and using techniques such as company intranets for maintaining status
information and other information pertinent to the use of corporate databases are all key activities in
which DBAs should be involved.
In addition to serving the needs of the business, it is also important to ensure that database maintenance activities do not adversely affect daily operations. As all DBAs are aware, SQL Server automation can be used to guarantee that particular operations (such as rebuilding indexes) occur when there is little load on the system, and operators can be notified when the operation in question has completed properly.
In all these cases and all other maintenance scenarios, the automation of activities must be tuned to the
particular situation and requirements. Each case is unique, although there are similar patterns involved
in almost every situation. So one of the critical abilities required of a DBA is to think creatively and
laterally about what needs to be covered by the database maintenance activities and the best ways to
achieve those ends. To do this requires detailed knowledge of the databases, the applications, and the
tools that can be applied.
For example, performing a maintenance operation and a data movement between servers at the same time, both I/O-intensive tasks, will cause each to degrade the other's performance because of their high use of I/O resources.
Lesson objective:
After completing this lesson, students will be able to:
• Establish a change-promotion policy. You should perform the following tasks to establish a
change-promotion policy:
• Define database developers’ roles in promotions. In addition to their defined roles in the change-promotion process, database developers should develop and test new requirements, enhancements, and bug fixes requiring database changes before introducing them to any integrated environment, and should create rollback scripts. After development and testing are complete, developers should package all changes and deliver them to the persons responsible for the promotion to the next environment in the software life-cycle hierarchy.
• Define DBAs’ roles in promotions. The primary role of a DBA in the promotion process is to ensure that the elements of the change-promotion policy have been followed properly and consistently. This involves ensuring compliance with coding standards and performance guidelines, and the existence of accurate rollback scripts. A DBA also needs to ensure that the risks of introducing changes are kept to a minimum.
• Use SQL Server Management Studio projects integrated with Visual SourceSafe. Use SQL Server Management Studio to create database projects and integrate them with Visual SourceSafe. SQL Server Management Studio projects enable you to save several data-source configurations. Therefore, a DBA, a change-promotion specialist, or an application can execute the scripts with the appropriate connection without making any changes to the tested scripts.
• Plan the coordination between application changes and database changes. In most
cases, the procedures used at the application layer will not be identical to those used
at the database layer. To ensure consistency in the application layer and the database
layer, it is important to determine a method to coordinate application layer changes
and data layer changes.
• Document change-control procedures. You must document all the processes required to
implement changes in a database application. These changes can relate to system
requirements, restrictions such as service level agreement (SLA) terms and application
dependencies, and scripts. The objective of this activity is to properly document processes so
that the correct responses to all successful or failed situations are clearly defined at the time a
change is implemented. The documentation should be updated as the system evolves.
Principle: Evaluate considerations for implementing Windows and SQL Server updates.
Considerations
Considerations for implementing Windows and SQL Server updates include the following:
Risks and benefits of automatic updates
You must consider your scheduling and deployment needs, and the flexibility and granularity of control required, to determine the tools that are appropriate for your requirements. You must evaluate the benefits and risks of automatic updates on servers. If an update is applied automatically to all servers in an organization, there is a risk of applying it without assessing the impact on the overall infrastructure. This can lead to issues such as blocked ports, disabled services, or changed security access. On the other hand, if you do not apply a fix, you might be exposed to instability or vulnerability issues that the fix solves.
Best practices
Research by the Microsoft SQL Server group after the Slammer worm indicated that, almost without exception, the damage to SQL Server databases and servers reported to the group occurred at sites that had not applied the updates when they were released and so had left their systems exposed. The vulnerability had in fact been fixed several months before the Slammer worm hit.
Note
Analyze changes in the system behavior following update installation and the overall impact on the
system. The preferred method to implement updates should be to first test them in a preproduction
environment. Report any problems with updates to Microsoft.
Tracking system software updates
Tracking software updates, service packs, and versions helps you analyze system malfunctions
and the behavior of the environment with and without an update. Keeping a historical record of updates permits you to detect anomalies in the environment and later compare them with scenarios using various updates. Consider using tools such as Windows Server Update Services (WSUS) and Microsoft Systems Management Server (SMS) for tracking system software updates.
server, multiserver administration is less desirable because all the tasks are already
located on the data warehouse server.
Q What are the problems and advantages of using a single multiserver administration
master server for database servers across all software life-cycle environments?
A
• Problems:
• Obstructs the testing of multiserver changes in a nonproduction environment
before implementing them in a production environment.
• Creates a dependency between production systems and nonproduction systems.
The dependency can affect the stability of the production environment.
• Limits the capability of the master server to access all servers if firewalls,
nontrusted Windows domains, or network subnets separate them.
• Advantages
• Provides a single point of control for tracking and managing all SQL Server Agent
scheduled activity.
Lesson objective
After completing this lesson, students will be able to:
Fact: Explain the importance of documenting administration and automation information in a run
book.
Facilitate Knowledge Sharing
Knowledge or information about an organization is shared among its employees. Mission-critical
processes and tasks form a major portion of this information. Documenting the tasks and processes
ensures that this information is available to all employees around the clock. If you cannot perform a
task due to the unavailability of an employee, any qualified employee should be able to use the
documented information to complete the task. You also need to document resource information that
includes member and supplier contact information, and hardware and software component
specifications. You must also document procedural information for all operational and emergency
tasks and keep a record of the history of administrative details, such as security settings and data
monitoring.
Best practices
It is very useful to have an electronic run book on the corporate intranet, where DBA staff can easily
access and use it.
• Emergency procedures
• Organizational disaster recovery plan
• Service pack and security update history
To record physical server characteristics, configuration, and installation of server software:
• Server configuration settings
• Server hardware—general server information
• Processors
• Machines
• Storage
• Arrays
• Storage Area Network (SAN)
• Update a run book whenever the environment changes. You maintain an accurate record of the existing environment configuration by updating the run book.
• Define a strategy or schedule to keep the run book up-to-date. You must define and document how to update a run book so that people working with it can modify it systematically. You should select the appropriate strategy based on the frequency of the environment changes and their complexity. You should also adapt the run book structure to make updates easier. To facilitate the process of updating a run book, you should:
• Automate run book maintenance. Use tools that automate run book maintenance.
Examples of such tools are the SQL H2 Tool, Microsoft Operations Management (MOM),
and third-party monitoring tools. You can use T-SQL, Windows Management
Instrumentation (WMI), SQL Management Objects (SMO), CLR integration, dynamic
management views, or Profiler to build custom run book components to capture data.
• Use familiar tools. The tools that you use should be familiar to you. Otherwise, you will
need to spend time to become familiar with every tool that meets the requirements of a run
book.
• Use appropriate run book format. Based on the type of collected information, you should
use the appropriate format. For example, you can use:
• Visio diagrams for a network infrastructure.
• Tables for performance data collected in auditing processes.
• Access or Excel for a contact information list.
• Keep a master document. A run book consists of multiple technologies. Therefore, it is
important that you use a master document to gather information in one place. The master
document can be a Word document, an intranet Microsoft Office FrontPage® site, or a
printed document.
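As a minimal illustration of a custom run book component, a short T-SQL script can capture server-level facts for the master document. This is a sketch; where and how you store the output is up to you.

```sql
-- Capture basic server configuration facts for the run book
SELECT SERVERPROPERTY('MachineName')    AS machine_name,
       SERVERPROPERTY('Edition')        AS edition,
       SERVERPROPERTY('ProductVersion') AS product_version,
       SERVERPROPERTY('ProductLevel')   AS service_pack_level;

-- List databases and their recovery models for the run book
SELECT name, recovery_model_desc
FROM sys.databases;
```

Scripts like this can be scheduled as SQL Server Agent jobs so that the run book data is refreshed automatically.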
Important
The organization must ensure that there is sufficient time for staff to maintain the run book. A run book will not help if staff members do not have the time and management support to maintain it properly. An incomplete run book can be more dangerous than none at all because it can engender a false sense of security.
Important
There are two versions of the SQL H2 tool: one for SQL Server 2000 and one for SQL Server 2005. When you download the tool from the Microsoft Web site, be sure to get all the files for the version that you want to use. You will then need to review the Readme files and follow the installation instructions precisely. Because the tools are not fully supported, their installation routines are not as complete as those of normal retail software.
Preparation
Ensure that the virtual machine 2789A-MIA-SQL-01 is running and that you are logged on as
Student.
If a virtual machine has not been started, perform the following steps:
1. Close any other running virtual machines.
2. Start the virtual machine.
3. In the Log On to dialog box, complete the logon procedure by using the user name Student
and the password Pa$$w0rd.
Introduction
In this exercise, you will update the run book with information about database maintenance activities.
You will document backup activities, index maintenance operations, and database integrity. You will
also create a jobs dependency diagram. Finally you will create maintenance plans to automate the
database maintenance tasks.
Create a maintenance plan
Task: Prepare the database.
1. Use Windows Explorer to view the contents of D:\Labfiles\Starter.
2. Double-click PrepareDB.cmd to execute the database preparation script.
3. Close Windows Explorer.
Task: Identify and document the current backup activities.
1. Examine the information relating to backup activities in the “Notes From Bob” document located at D:\Labfiles\Starter\Notes From Bob.doc.
2. Examine existing SQL Server Agent jobs to identify backup tasks that are currently automated.
3. Create SQL Server Agent jobs with appropriate schedules for any backup tasks that are not currently automated.
4. Document the backup tasks in the Backup section of the run book located at D:\Labfiles\Starter\Run Book.doc, detailing the tasks that must be performed, the jobs that have been created to perform them, and the schedules for the jobs.
Task: Identify and document the current index maintenance activities.
1. Examine the information relating to index maintenance in the “Notes From Bob” document.
2. Examine existing SQL Server Agent jobs to identify index maintenance tasks that are currently automated.
3. Create SQL Server Agent jobs with appropriate schedules for any index maintenance tasks that are not currently automated.
4. Document the index maintenance tasks in the Index Maintenance section of the maintenance plan.
Task: Update the run book.
1. Remove the scheduling information for the jobs that are now included in the daily and hourly maintenance plans.
2. Add a section entitled Maintenance Plans to the run book.
3. Document the daily and hourly maintenance plans in the Maintenance Plans section, detailing which jobs are included and when the plans are scheduled to run.
Answer Key
Preparing the database
You must perform the following steps to prepare the database.
1. Use Windows Explorer to view the contents of D:\Labfiles\Starter.
2. Double-click PrepareDB.cmd to execute the database preparation script.
3. Close Windows Explorer.
Start menu.
5. On the Config file menu, click Edit and add
MIAMI as a PerfProvider.
6. Close SQL H2 Configuration utility.
7. Start the C:\SQLH2\SQLH2.exe application to
insert the collected data into the database.
Task: View SQL H2 reports.
1. Use SQL Server Configuration Manager to start the SQL Server Reporting Services (MSSQLSERVER) service.
2. Run SetupDB.cmd in the D:\Labfiles\Starter\SQLH2\SQLH2 Reports folder.
3. Start Internet Explorer and browse to http://localhost/reports.
4. Click New Data Source and create a new data source named SQLH2Repository that uses the connection string SERVER=MIAMI;DATABASE=SQLH2Repository with Windows authentication.
5. Click Upload File and upload the Performance Counters.rdl report from the D:\Labfiles\Starter\SQLH2\SQLH2 Reports folder.
6. When the file is uploaded, click Performance Counters to view the report. Note that you can select a specific counter and then click View Report to generate a chart showing values for the selected counter.
7. Close Internet Explorer.
Answer key
Configuring the SQL H2 tool
You must perform the following steps to configure the SQL H2 tool.
1. Install the SQL H2 Configuration Program by running H2Setup.msi in the D:\Labfiles\Starter\SQLH2 folder. Select the option to start the configuration tool after installation.
2. On the Welcome to SQLH2 Installation & Configuration Wizard page, click Next.
3. On the Please choose a server to install SQLH2 Repository Database page, enter MIAMI
as the server, accept the default database and connection options, and click Next.
4. On the Please decide whether you want to allow sharing data from the Repository with
Microsoft page, leave the Allow sending data to Microsoft from the Repository option
unselected, and click Next.
5. When installation is finished, click Next.
6. On the Please enter the list of computers you want to collect from page, ensure MIAMI is
included in the Targets list, and click Next.
7. On the Schedule a task for SQLH2 Collector page, configure the collector to run every day
at 10:00 PM using the MIAMI\SQLServer account with the password Pa$$w0rd. Then
click Next.
8. On the The Wizard has completed all the necessary steps page, click Finish, and on the
Installation Complete page, click Close.
5. Click Upload File and upload the Performance Counters.rdl report from the
D:\Labfiles\Starter\SQLH2\SQLH2 Reports folder.
6. When the file is uploaded, click Performance Counters to view the report. Note that you can
select a specific counter and then click View Report to generate a chart showing values for
the selected counter.
7. Close Internet Explorer.
Module objective
In this module you will learn how to:
Introduction
Microsoft® SQL Server™ 2005 has many components that serve a wide range of purposes, each of
which has its place in the provision of database services and related supporting services. In this
module you will focus on several of the key supporting services, namely Integration Services,
Replication, and Reporting Services.
Microsoft SQL Server Integration Services (SSIS) is the key component of SQL Server 2005 that supports Extract, Transform, and Load (ETL) operations and related workflows through FTP, messaging, transformation services, script execution, and other processes, usually through the use of SSIS packages. Integration Services has its own management and administration requirements, separate from the rest of the product, that database administrators (DBAs) must be aware of.
Replication is not new to SQL Server 2005, but the replication features include a number of new capabilities and tools. To manage replication well, the DBA must be familiar with these product
features. In this module, you will learn about these features and how to use them to run and automate
replication.
Reporting Services provides SQL Server 2005 with a flexible, Web-enabled reporting capability.
Reports can be based on various data sources and produced in a range of formats. Like all core
services, Reporting Services requires particular administration and can be usefully automated.
Lesson objective
After completing this lesson, you will be able to:
Introduction
SSIS is new to SQL Server 2005. Although related conceptually to the older Data Transformation Services (DTS) tools in SQL Server 2000, the new tool set has more capabilities along with new graphical tools. DBAs who have created DTS packages must be aware of the specific issues involved in migrating old packages to SSIS packages.
Most of the work done in SSIS is done through packages, which are collections of one or more steps
used to accomplish data import, export, transformations, or a combination thereof. Most SSIS
packages also involve the use of notifications and other steps for purposes of data verification,
scheduling, and package step confirmation of success or failure. In this lesson, you will learn about
SSIS packages and guidelines for how to deploy and manage SSIS packages.
Guidelines
To use packages properly, the DBA must do the following:
• Identify deployable packages.
• Import and export packages as required.
• Configure packages for deployment.
• Deploy packages by using either a deployment utility or automation with dtutil.exe.
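For example, automated deployment with dtutil.exe might look like the following. This is a sketch only; the package path, package name, and server name are illustrative, so check the Books Online topic “dtutil Utility” for the exact option syntax.

```cmd
REM Copy a package from the file system into the msdb store on server MIAMI
dtutil /FILE "D:\Packages\SalesByCreditCard.dtsx" /COPY SQL;SalesByCreditCard /DestServer MIAMI

REM Verify that the package now exists in the msdb store
dtutil /SQL SalesByCreditCard /EXISTS /SourceServer MIAMI
```

Because dtutil is a command-line tool, calls like these can be placed in batch files or SQL Server Agent job steps to automate deployment.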
<StopExecutingPackagesOnShutdown>true</StopExecutingPackagesOnShutdown>
<TopLevelFolders>
<Folder xsi:type="SqlServerFolder">
<Name>MSDB</Name>
<ServerName>.</ServerName>
</Folder>
<Folder xsi:type="FileSystemFolder">
<Name>File System</Name>
<StorePath>..\Packages</StorePath>
</Folder>
</TopLevelFolders>
</DtsServiceConfiguration>
Pay attention to the first occurrence of the <ServerName> tag. Its value must be “.”.
4. Save the file and exit Notepad.
5. In SQL Server Configuration Manager, right-click the SQL Server Integration Services
service and then click Restart.
6. Click Start, point to All Programs, point to Microsoft SQL Server 2005, and then click
SQL Server Management Studio.
7. In the Connect to Server dialog box, click Integration Services in the Server type list and
MIAMI in the Server Name list, and then click Connect.
8. In the Object Explorer window, expand Stored Packages and then expand the MSDB folder. The Maintenance Plans folder is listed. This demonstrates that SSIS can successfully connect to the MSDB database on the server specified in Step 3.
9. Keep SQL Server Management Studio open. You will use it in the next procedure.
3. In the right pane, select True for the CreateDeploymentUtility option. Notice the value of
the DeploymentOutputPath property. Click OK.
4. On the Build menu, click Build Demo 2789 M2L1.
5. Close SQL Server Business Intelligence Developer Studio.
6. Using Microsoft Windows® Explorer, view the D:\Democode\Demo 2789 M2L1\bin folder,
and then open the Deployment folder.
7. Notice the following three files:
• SalesByCreditCard.dtsx—This is the package.
• config.dtsconfig —This is the configuration file specified for changing the value of
ServerName of one of the connection managers.
• Demo 2789 M2L1.SSISDeploymentManifest—This is the installation descriptor.
To deploy a package
8. Double-click the Demo 2789 M2L1.SSISDeploymentManifest file to start the Package
Installation Wizard and click Next.
9. On the Deploy SSIS packages screen, click SQL Server deployment and then click Next.
10. On the Specify Target SQL Server screen, in the Server Name box, type MIAMI and then
click Next.
11. On the Select Installation Folder screen, click Next.
12. On the Confirm Installation Screen, click Next.
13. On the Configure Packages screen, notice the name of the configuration file. Expand the Property node, and note the current value of the \Package.Connections[Adventure Works].Properties[ServerName] property (which should be MIAMI). Then click Next.
14. On the Finish Package Installation Wizard screen, click Finish.
SSIS introduces the capability of detailed and customizable logging of package execution. In an
SSIS package, you can configure your choice of location to store the logs. For instance, you can
store logs in a SQL Server table, an XML file, or the Windows Event log. You can define logging
at a package level or a task level. You can also choose the events and columns that you want to
log for a specified level.
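When the SQL Server log provider is used, the log entries land in a table in the chosen database (sysdtslog90 in SQL Server 2005), which makes them easy to query. A sketch, assuming the default log table name and schema:

```sql
-- Review recent error events logged by SSIS packages
-- via the SQL Server log provider
SELECT source, event, starttime, message
FROM dbo.sysdtslog90
WHERE event = N'OnError'
ORDER BY starttime DESC;
```

Queries like this can feed alerts or run book reports on package failures.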
• Use Application Event Viewer and alerts
When logging events to Windows Event log, you can generate alerts related to specific SSIS
events. For instance, you can create an alert to indicate that a particular task inside a package has
failed. This alert can be based on the OnError event, either at the package level or at the task
level.
• Create alerts for performance monitor counters
An alert generated when a performance counter crosses a specified threshold helps you detect important execution conditions in packages. An example of an execution condition is the successful completion of a package.
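Such an alert can be defined with the sp_add_alert system procedure. The following is a hedged sketch; the counter path and threshold value are illustrative, and the exact SSIS counter names available on your system should be confirmed in Performance Monitor and the Books Online topic “sp_add_alert (Transact-SQL)”.

```sql
-- Illustrative alert on an SSIS performance counter; the
-- @performance_condition format is 'object|counter|instance|comparator|value'
EXEC msdb.dbo.sp_add_alert
    @name = N'SSIS package instances below threshold',
    @performance_condition = N'SQLServer:SSIS Service|SSIS Package Instances||<|1';
```

The alert can then be attached to an operator notification or a response job in SQL Server Agent.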
• Implement notifications
SSIS has a control task that permits sending e-mail messages. You can include this kind of
notification in a package to ensure that you are updated when a specific sequence of tasks has
been executed.
• Use database activity monitoring tools
In most cases, SSIS reads and writes data from multiple sources. You should use traditional database tools to verify the connection status of a package. For example, you can use tools to verify that a connection is not blocked or to determine the amount of resources a package is consuming inside a database.
Lesson objective
After completing this lesson, you will be able to:
Manage Replication.
Introduction
SQL Server 2005 has a range of replication technologies that provide a powerful and flexible mechanism to distribute data to multiple servers and keep everything synchronized. The synchronization process binds the participating servers together, so that issues on one server can affect the others. In all its forms, replication can be used to copy and distribute data between database objects, between databases, and between servers and mobile devices. To ensure that replication happens properly, is performed in a timely manner, and is managed with regard to data processes, backup, restore, and performance, there are a number of things that DBAs must understand and be familiar with.
In addition to the three types of replication (transactional, merge, snapshot) there are important
differences between server-to-server replication and server-to-client replication.
See Books Online topics “SQL Server Replication” and “Using Merge Replication.” The latter topic
includes extensive material on SQL Server Mobile replication.
Note
Replication is covered in detail in Course 2788, Designing High Availability Database Solutions
Using Microsoft SQL Server 2005.
Discussion questions
maintain a full set of state data. Because the merge engine is designed to calculate changes that need
to be exchanged between sites, there are no special considerations for backups and restores with
merge replication.
Q. In what order should service packs be applied in a replicated environment?
A. Service packs and hot fixes should always be applied to the distributor first. After the distributor is
updated, changes should be applied to the publisher. The last step in the upgrade process is to apply
the service packs and/or hotfixes to all subscribers.
Q. How do you manage schema changes in a replicated environment?
A. Previous versions of Microsoft® SQL Server™ imposed very strict limitations on changes to the schema. SQL Server 2005 relaxes these restrictions considerably and does not require special procedures for most changes. Schema changes are sent from the publisher to all subscribers during the first synchronization cycle following the change.
Q. How do you determine how long it will take to catch up, once replication has fallen behind in
distributing transactions to the subscriber?
A. In previous versions of SQL Server, this question could not be answered. The infrastructure did not exist to determine the two components required to answer it: the number of changes pending distribution and how long it takes a transaction to move from publisher to subscriber. SQL Server 2005 introduces additional instrumentation into the engine, which provides the ability to calculate end-to-end transit time for a transaction. With this additional piece of information, provided by tracer tokens, an administrator can now determine how much time it will take to finish synchronizing the environment.
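Tracer tokens can also be posted and inspected from T-SQL. A sketch, run at the publisher in the context of the publication database; the publication name is illustrative:

```sql
-- Post a tracer token into the transactional replication stream
EXEC sys.sp_posttracertoken @publication = N'AdventureWorks_Pub';

-- Review the tokens recorded for this publication,
-- which can then be used to look up end-to-end latency
EXEC sys.sp_helptracertokens @publication = N'AdventureWorks_Pub';
```

See the Books Online topic “Measuring Latency and Validating Connections for Transactional Replication” for the full set of tracer token procedures.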
These suggested topics are not exhaustive but provide a starting point for the class.
• After which replication-related changes should you capture a new backup of each replication database? Your answer should include the changes and the related replication database to back up.
• How do you ensure that all subscribers are synchronized after a restore of the publication database in merge replication? How is this synchronization process different if the replication is using the HTTPS protocol?
• When should you use a backup to initialize transactional replication?
• What schema changes must be propagated to subscribers manually?
• Why should you regularly generate replication topology scripts?
Principle: Evaluate scenarios for monitoring and verifying replication, and choosing monitoring tools.
Considerations for choosing monitoring tools
In selecting monitoring tools, the DBA should consider the following:
• Monitoring the health of the environment. A monitoring tool should be able to display the status of any replication component and the task it is performing. Any monitoring solution, especially for replication, should display aggregated information across an entire environment.
Note
It is a common mistake to classify a tool that can connect to each server and display information for
each server on an individual basis as providing an aggregated view. Replication has components
running on multiple servers simultaneously, which requires the use of tools that can establish multiple
connections to multiple servers and aggregate all the monitoring data into a single console without
requiring a DBA to connect manually to multiple servers.
• Troubleshooting errors. In the event of an error, a monitoring tool should display any error
messages in a single consolidated view and eliminate the need for a DBA to access tables,
event logs, or error logs separately to obtain the error information.
• Determining latency. A critical element in any replication environment is latency. DBAs are always working to minimize latency. Any monitoring tool that cannot measure latency is only marginally effective in a production environment.
• Comparing tables. SQL Server provides routines to determine whether data on the publisher is synchronized with data on the subscriber. Determining whether databases are synchronized is only part of the process. A monitoring tool should also be able to indicate to a DBA which rows are not synchronized and generate compensating transactions so that the DBA can repair the environment.
• Resolving errors. Many monitoring tools have the capability to resolve certain types of errors
automatically, without user intervention. This capability is useful when the system encounters
the same types of errors repeatedly; it might be possible to program logic into the monitoring
system to deal with the problem instead of having to wait for manual intervention.
Lesson objective
After completing this lesson, you will be able to:
Introduction
Reporting Services is a powerful but complex product. This complexity comes from the number of elements involved in its architecture: databases, Web services, Windows services, Web sites, and, occasionally, custom assemblies.
DBAs must manage Reporting Services components, from configuration following installation, to applying corporate security policies, to scaling up reporting solutions. In this lesson, you will learn about considerations and guidelines for adjusting Reporting Services to solution requirements.
Configuration files (which are stored in XML format) can also be edited manually for some purposes; indeed, some settings can be changed only by editing the files manually.
Through the use of these tools, the DBA can manage virtual directories, Reporting Services databases,
encryption, service accounts, and many other settings.
Caution
In the case of manual editing of configuration files, be aware that incorrect configuration can cause
Reporting Services to use a default value, fail to start at all, or log an error to the Windows application
log. In most cases it is advisable to work through the graphical user interface (GUI) or command-line
tools.
IIS
Reporting Services depends on Internet Information Services (IIS) virtual directories. These
directories must be properly configured and secured. DBAs might need to work with other IT
professionals to deal with IIS issues.
Logging
Reporting Services provides for extensive logging of its operations. Logs must be reviewed on a
regular basis to determine whether there are any problems with the service.
Performance
Reporting Services typically supports core business activities and is often heavily used. To determine
patterns of use and identify any related performance issues, DBAs need to perform regular
performance monitoring of all aspects of the service.
Temporary data
You can configure Reporting Services to store temporary snapshots in the file system instead of using
ReportServerTempDB. These snapshots can be compressed to take up less storage space. You can
enable this functionality through configuration files. You should change the default storage path
because it defaults to a subfolder of the Reporting Services installation path.
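The settings involved look broadly like the following sketch of an RSReportServer.config excerpt. The element names and path shown here are offered as a guide only; verify them against the product documentation for your build before editing:

```xml
<!-- Illustrative excerpt: file-share storage for temporary snapshots.
     Verify element names against the documentation before editing. -->
<WebServiceUseFileShareStorage>True</WebServiceUseFileShareStorage>
<WindowsServiceUseFileShareStorage>True</WindowsServiceUseFileShareStorage>
<FileShareStorageLocation>
  <Path>D:\RSTempFiles</Path>
</FileShareStorageLocation>
```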
Moving Reporting Services Databases
If you decide to move Reporting Services databases to a different database server, you can move the
current database using regular tools. However, after moving the database, you must update the
Reporting Services configuration files accordingly. To do so, use Reporting Services Configuration
Manager or the rsconfig command-line utility.
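For example, repointing the report server at the moved database with rsconfig might look like the following, where -c modifies the report server connection, -s names the Database Engine instance, -d names the database, and -a sets the authentication type. NEWSERVER is a placeholder, and you should confirm the exact switches with rsconfig -? on your build:

```shell
rsconfig -c -s NEWSERVER -d ReportServer -a Windows
```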
Storage Space
Storage space required for the Report Server database varies depending on how your solution is
configured. The Report Server database can grow to a significant size.
Storage space required for the temporary database depends on the reports: how big they are, how
many there are, and how long they are maintained in the system. DBAs must monitor real usage to
adjust the space requirements. This monitoring should be done during peak workload hours, when
temporary snapshots occupy the most space.
Manage security
Reporting Services has its own set of roles, permissions, and authorized users. You should review
those roles to decide whether adding new roles is necessary.
When there are many users and turnover is significant, you can benefit from using data-driven
subscriptions instead of granting permissions to users and groups individually. Alternatively, you
can create scripts to automate user management.
Manage subscriptions
Users can be granted permission to subscribe to updated reports and to receive the results at selected
destinations. This kind of subscription is managed by the subscription owner. Obsolete subscriptions
that are no longer required but still active might cause unnecessary processing. Therefore, you should
establish a strategy for tracking subscriptions and checking whether they are still current.
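Such a strategy can be as simple as periodically flagging subscriptions that have not run recently. The sketch below uses invented sample data rather than the actual report server schema; a real check would pull owner and last-run information from the report server's subscription store:

```python
from datetime import date, timedelta

# Invented sample data: (subscription owner, report, last successful run).
subscriptions = [
    ("anna",  "Daily Sales",  date(2006, 3, 1)),
    ("brian", "Old Forecast", date(2005, 6, 15)),
    ("carla", "Inventory",    date(2006, 2, 20)),
]

def stale_subscriptions(subs, as_of, max_age_days=90):
    """Return (owner, report) pairs whose last run is older than max_age_days."""
    cutoff = as_of - timedelta(days=max_age_days)
    return [(owner, report) for owner, report, last_run in subs if last_run < cutoff]

print(stale_subscriptions(subscriptions, as_of=date(2006, 3, 10)))
```

Subscriptions flagged this way can then be reviewed with their owners before being deleted, so that unnecessary processing is removed without disrupting reports that are still needed.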
Review execution logs
Reporting Services maintains logs, whose level of detail can be adjusted, or which can be deactivated
completely, through configuration files. The Report Execution Log maintains information about report
execution history on a server. This log is enabled by default, although it can be deactivated through
Report Manager. To exploit the log information, you must use the provided Integration Services
package to load the log data into a database table.
Verifying that the MIAMI SQL Server instance is the database repository for the SSIS service
You must perform the following steps to verify that the MIAMI SQL Server instance is the
database repository for the SSIS service.
1. Click Start, click Run, type C:\Program Files\Microsoft SQL Server\90\DTS\Binn, and
then click OK.
2. Open the MsDtsSrvr.ini.xml file by using Notepad.
3. Verify that the file contains the following code:
<DtsServiceConfiguration xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <StopExecutingPackagesOnShutdown>true</StopExecutingPackagesOnShutdown>
  <TopLevelFolders>
    <Folder xsi:type="SqlServerFolder">
      <Name>MSDB</Name>
      <ServerName>.</ServerName>
    </Folder>
    <Folder xsi:type="FileSystemFolder">
      <Name>File System</Name>
      <StorePath>..\Packages</StorePath>
    </Folder>
  </TopLevelFolders>
</DtsServiceConfiguration>
Pay attention to the first occurrence of the <ServerName> tag. Its value must be “.”.
4. Save the file and exit Notepad.
5. In SQL Server Configuration Manager, right-click the SQL Server Integration Services
service and then click Restart.
6. Click Start, point to All Programs, point to Microsoft SQL Server 2005, and then click
SQL Server Management Studio.
7. In the Connect to Server dialog box, click Integration Services in the Server type list and
MIAMI in the Server Name list, and then click Connect.
8. In Object Explorer, expand Stored Packages and then expand the MSDB folder. The
Maintenance Plans folder is listed. This demonstrates that SSIS can successfully connect to
the MSDB database on the server specified in step 3.
9. Keep SQL Server Management Studio open. You will use it in the next procedure.
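The manual check in step 3 can also be scripted. The sketch below parses a document shaped like MsDtsSrvr.ini.xml (inlined here for illustration, with the namespace attributes omitted for brevity) and confirms that the first ServerName value is ".":

```python
import xml.etree.ElementTree as ET

# Inline sample shaped like MsDtsSrvr.ini.xml (namespace attributes omitted).
config = """
<DtsServiceConfiguration>
  <StopExecutingPackagesOnShutdown>true</StopExecutingPackagesOnShutdown>
  <TopLevelFolders>
    <Folder>
      <Name>MSDB</Name>
      <ServerName>.</ServerName>
    </Folder>
  </TopLevelFolders>
</DtsServiceConfiguration>
"""

root = ET.fromstring(config)
server_name = root.find(".//ServerName").text  # first <ServerName> in the document
print("SSIS MSDB repository points at:", server_name)
assert server_name == ".", "SSIS is not using the local default instance"
```

To check the real file, replace the inline string with ET.parse on the path from step 1.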
1. Click Start, click Run, type D:\Labcode\Starter, and then click OK.
2. Right-click the SalesByCreditCard.dtsx file, and then click Edit. The package opens in SQL
Server Business Intelligence Development Studio.
3. On the File menu, click Save Copy of SalesByCreditCard.dtsx As.
4. In the Save Copy of Package dialog box, in the Package location list, click SQL Server. In
the Server list, click MIAMI. In the Package path box, type /CCSales.
5. Click the button next to the Protection level box.
6. In the Package Protection level dialog box, in the Package Protection level list, click Rely
on server storage and roles for access control and then click OK.
7. In the Save Copy of Package dialog box, click OK.
8. Close SQL Server Business Intelligence Development Studio.
9. In SQL Server Management Studio, refresh the MSDB folder in Object Explorer and verify
that the CCSales package has been copied to this server.
10. Keep SQL Server Management Studio open. You will use it again later in this exercise.
Creating a package configuration for the SalesByCreditCard package in the AWPackages solution
You must perform the following steps to create a package configuration for the SalesByCreditCard
package in the AWPackages solution.
1. In the D:\Labfiles\Starter folder, double-click AWPackages.sln to open it in SQL Server
Business Intelligence Development Studio.
2. In Solution Explorer, double-click SalesByCreditCard.dtsx.
3. On the SSIS menu, click Package Configurations.
4. In the Package Configurations Organizer dialog box, select the Enable package
configurations check box.
5. Click the Add button.
6. In the Package Configuration Wizard, click Next.
7. In the Configuration type list, click XML configuration file.
8. Select Specify configuration settings directly and then type
D:\Labfiles\Starter\AWPackage\config.dtsconfig.
9. Click Next.
10. In the objects tree, under the Connection Managers folder, expand Adventure Works DW,
expand the Properties folder, and then select the ServerName check box. This enables
administrators to modify the server that the data source connects to when deploying the
package.
11. Click Next.
12. In the Configuration name box, accept the default value and then click Finish.
13. In Package Configurations Organizer, click Close.
14. Press CTRL+SHIFT+S to save all the projects.
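The wizard writes the exported property to the XML configuration file. The generated config.dtsconfig will look broadly like the following sketch; the exact Path string and any header attributes vary by package, so treat this as illustrative rather than a file to copy:

```xml
<?xml version="1.0"?>
<DTSConfiguration>
  <!-- Illustrative sketch of a generated package configuration entry. -->
  <Configuration ConfiguredType="Property"
                 Path="\Package.Connections[Adventure Works DW].Properties[ServerName]"
                 ValueType="String">
    <ConfiguredValue>MIAMI</ConfiguredValue>
  </Configuration>
</DTSConfiguration>
```

At deployment time, an administrator edits only the ConfiguredValue element to point the connection at the target server.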
1. In SQL Server Business Intelligence Development Studio, on the Project menu, click
AWPackages Properties.
2. Click Deployment Utility.
3. In the right pane, select True for the CreateDeploymentUtility option. Notice the value of
the DeploymentOutputPath property. Click OK.
4. On the Build menu, click Build AWPackages.
5. Close SQL Server Business Intelligence Development Studio.
6. Using Windows Explorer, view the D:\Labfiles\AWPackages\bin folder, and then open the
Deployment folder.
7. Notice the following three files:
• SalesByCreditCard.dtsx—This is the package.
• config.dtsconfig—This is the configuration file specified for changing the ServerName
value of one of the connection managers.
• AWPackages.SSISDeploymentManifest—This is the installation descriptor.
Questions
Q When do you think it is necessary to deploy packages to a server?
A When moving from development to testing environments
A When moving from testing to production environments
A When restoring a server from a backup
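Besides the deployment manifest, the dtutil command-line utility can copy a package from the file system into a server's msdb store, which is convenient when scripting moves between environments. The switch names below are offered from memory; confirm them with dtutil /? before relying on them:

```shell
dtutil /FILE "D:\Labcode\Starter\SalesByCreditCard.dtsx" /DestServer MIAMI /COPY SQL;CCSales
```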
Creating a publication on MIAMI named AWTables that contains all tables in AdventureWorks
You must perform the following steps to create a publication on MIAMI named AWTables that
contains all tables in AdventureWorks.
1. If Object Explorer is not visible, click Object Explorer on the View menu.
2. In Object Explorer, expand Replication, right-click Local Publications, and then click New
Publication.
5. On the New Publication Wizard page, click Next.
6. On the Distributor page, click Next to use MIAMI as its own distributor.
7. On the Snapshot Folder page, click Next to use the default snapshot folder location.
8. On the Publication Database page, click AdventureWorks to choose it as the publication
database and then click Next.
9. On the Publication Type page, click Transactional publication and then click Next.
10. On the Articles page, select Tables, and then click Next.
11. Click Next on the Filter Table Rows page.
12. On the Snapshot Agent page, leave all check boxes unselected and click Next.
13. On the Agent Security page, click Security Settings.
14. Enter MIAMI\Student as the Process account. Enter Pa$$w0rd in the Password and
Confirm Password boxes. Leave By impersonating the process account selected and then
click OK.
15. Click Next on the Agent Security page.
16. On the Wizard Actions page, select the Create the publication and Generate a script file
with steps to create the publication check boxes and then click Next.
17. On the Script File Properties page, change the File name to
D:\Labfiles\Starter\AWReplication\PublicationScript.sql and then click Next.
18. On the Complete the Wizard page, enter AWTables as the Publication name and then click
Finish.
19. On the Creating Publication page, wait until all actions have been completed, check that
there are no errors, and then click Close.
20. On the Project menu, click Add Existing Item.
21. Add D:\Labfiles\Starter\AWReplication\PublicationScript.sql. When prompted, connect to
MIAMI by using Windows authentication.
22. On the File menu click Save All.
3. In Object Explorer, right-click the Databases folder for MIAMI\SQLINSTANCE2 and then
click Restore Database.
4. In the To database box, type AdventureWorks.
5. Select From device and then click the ellipsis button (...).
6. Click Add, select C:\Program Files\Microsoft SQL
Server\MSSQL.1\MSSQL\Backup\AdventureWorks.bak, and then click OK.
7. Click OK in the Specify Backup dialog box.
8. Select the AdventureWorks – Full Database Backup check box in the list of backup sets.
9. Click Options in the Select a Page pane.
Discussion
Q Why do you want to establish a performance baseline?
A The most fundamental question asked in a production environment is: “How long will it
take to be synchronized?” The answer depends on the amount of data that needs to be
sent and on the amount of time it takes for a transaction to reach the subscriber.
Replication Monitor displays the number of commands that are pending transfer. A
tracer token is used to measure the time required for a transaction to move from the
publisher to the subscriber. By creating a tracer token after the subscription is created,
you establish the amount of time it takes for a transaction to move from the publisher to
the subscriber. Replication Monitor can then use this information, along with the
number of transactions waiting to be sent, to answer the question of how long it will
take to be synchronized.
project.
Initialize a subscription on MIAMI\SQLINSTANCE2 from the backup of AdventureWorks.
1. Add a new query to the project, connecting to MIAMI.
2. Execute the sp_addsubscription stored procedure in the AdventureWorks database with
the following options:
• @publication = 'AWTables'
• @subscriber = 'MIAMI\SQLINSTANCE2'
• @destination_db = 'AdventureWorks'
• @subscription_type = 'push'
• @sync_type = 'initialize with backup'
• @backupdevicetype = 'disk'
• @backupdevicename = 'location of the backup file'
Establish a baseline latency measurement by using a tracer token.
1. Use Replication Monitor to view the AWTables publication.
2. Add a tracer token and view the latency measurement for the AWTables publication.
Procedure Answer Key
Detaching the Reporting Services databases from MIAMI
You must perform the following steps to detach the Reporting Services databases from MIAMI.
1. Click Start, point to All Programs, point to Microsoft SQL Server 2005, point to
Configuration Tools, and click SQL Server Configuration Manager.
2. In the list of SQL Server 2005 services, right-click SQL Server Reporting Services and
click Stop.
3. Right-click the SQL Server Agent (MSSQLSERVER) service and click Stop.
4. Minimize SQL Server Configuration Manager.
5. Start SQL Server Management Studio, connecting to the MIAMI Database Engine by using
Windows authentication when prompted.
6. In Object Explorer, expand Databases.
7. Right-click the ReportServer database, point to Tasks, and click Detach. Then click OK to
detach the database.
8. Repeat the previous step for the ReportServerTempDB database.
9. Use Windows Explorer to move the ReportServer.mdf, ReportServer_Log.ldf,
ReportServerTempDB.mdf, and ReportServerTempDB_Log.ldf database files from
C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data to C:\Program
Files\Microsoft SQL Server\MSSQL.4\MSSQL\Data.
10. In SQL Server Configuration Manager, restart the SQL Server Agent (MSSQLSERVER)
service. Then close SQL Server Configuration Manager.