SSIS - SQL Server Integration Services

Q: What is SSIS? How is it related to SQL Server?
SQL Server Integration Services (SSIS) is a component of SQL Server which can be used to
perform a wide range of Data Migration and ETL operations. SSIS is a component in MSBI
process of SQL Server.
This is a platform for Integration and Workflow applications. It provides fast and flexible
OLTP and OLAP extensions used for data extraction, transformation, and loading (ETL). The
tool may also be used to automate maintenance of SQL Server databases and
multidimensional data sets.
Q: What are the tools associated with SSIS?
We use Business Intelligence Development Studio (BIDS) and SQL Server Management Studio
(SSMS) to work with Development of SSIS Projects.
We use SSMS to manage the SSIS Packages and Projects.
Q: What are the differences between DTS and SSIS?

Data Transformation Services (DTS)      SQL Server Integration Services (SSIS)
Limited error handling                  Complex and powerful error handling
Message boxes in ActiveX scripts        Message boxes in .NET scripting
No Deployment Wizard                    Interactive Deployment Wizard
Limited set of transformations          Good number of transformations
No BI functionality                     Complete BI integration

Q: What is a workflow in SSIS ?
Workflow is a set of instructions that specify to the Program Executor how to execute tasks
and containers within SSIS packages.

Q: What is the control flow?
A control flow consists of one or more tasks and containers that execute when the package
runs. To control order or define the conditions for running the next task or container in the
package control flow, we use precedence constraints to connect the tasks and containers in a
package. A subset of tasks and containers can also be grouped and run repeatedly as a unit
within the package control flow. SQL Server 2005 Integration Services (SSIS) provides three
different types of control flow elements: Containers that provide structures in packages, Tasks
that provide functionality, and Precedence Constraints that connect the executables,
containers, and tasks into an ordered control flow.
Q: What is a data flow?
A data flow consists of the sources and destinations that extract and load data, the
transformations that modify and extend data, and the paths that link sources,
transformations, and destinations. The Data Flow task is the executable within the SSIS
package that creates, orders, and runs the data flow. A separate instance of the data flow
engine is opened for each Data Flow task in a package. Data Sources, Transformations, and
Data Destinations are the three important categories in the Data Flow.
Q: How does Error-Handling work in SSIS?
When a data flow component applies a transformation to column data, extracts data from
sources, or loads data into destinations, errors can occur. Errors frequently occur because of
unexpected data values.
Type of typical Errors in SSIS:
-Data Connection Errors, which occur in case the connection manager cannot be initialized with
the connection string. This applies to both Data Sources and Data Destinations along with
Control Flows that use the Connection Strings.
-Data Transformation Errors, which occur while data is being transformed over a Data
Pipeline from Source to Destination.
-Expression Evaluation Errors, which occur if expressions that are evaluated at run time
perform invalid operations or become syntactically incorrect because of missing or incorrect
data values.
Q: What is environment variable in SSIS?
An environment variable configuration sets a package property equal to the value in an
environment variable.

Environmental configurations are useful for configuring properties that are dependent on the
computer that is executing the package.
Q: What are the Transformations available in SSIS?

AGGREGATE - It applies aggregate functions to Record Sets to produce new output records
from aggregated values.
AUDIT - Adds Package and Task level Metadata - such as Machine Name, Execution Instance,
Package Name, Package ID, etc..
CHARACTER MAP - Performs string data changes at the SQL Server level, such as changing
data from lower case to upper case.
CONDITIONAL SPLIT – Separates available input into separate output pipelines based on
Boolean Expressions configured for each output.
COPY COLUMN - Adds a copy of a column to the output; we can later transform the copy,
keeping the original for auditing.
DATA CONVERSION - Converts columns data types from one to another type. It stands for
Explicit Column Conversion.
DATA MINING QUERY – Used to perform data mining query against analysis services and
manage Predictions Graphs and Controls.
DERIVED COLUMN - Create a new (computed) column from given expressions.
EXPORT COLUMN – Used to export an image-specific column from the database to a flat file.
FUZZY GROUPING – Used for data cleansing by finding rows that are likely duplicates.
FUZZY LOOKUP - Used for Pattern Matching and Ranking based on fuzzy logic.
IMPORT COLUMN - Reads image-specific data from a flat file into a column in the database.
LOOKUP - Performs the lookup (searching) of a given reference object set against a data
source. It is used for exact matches only.
MERGE - Merges two sorted data sets into a single data set in a single data flow.
MERGE JOIN - Merges two data sets into a single dataset using a join.
MULTICAST - Sends a copy of the supplied Data Source onto multiple Destinations.
ROW COUNT - Stores the resulting row count from the data flow / transformation into a
variable.
ROW SAMPLING - Captures sample data by using a row count of the total rows in the data
flow, specified by rows or percentage.
UNION ALL - Merges multiple data sets into a single dataset.
PIVOT – Used for normalization of data sources to reduce anomalies by converting rows into
columns.
UNPIVOT – Used for denormalizing the data structure by converting columns into rows, in
case of building Data Warehouses.
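Several of these transformations are configured with SSIS expressions. As a rough sketch (the column names here are hypothetical, not taken from any particular package), a Conditional Split output and a Derived Column might be configured with expressions such as:

```
Conditional Split output expression:  [OrderAmount] > 1000 && [Country] == "US"
Derived Column expression:            [FirstName] + " " + [LastName]
```

Rows for which the Conditional Split expression evaluates to true are routed to that output; the Derived Column expression produces a new column from existing input columns.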
Q: How to log SSIS Executions?
SSIS includes logging features that write log entries when run-time events occur and can also
write custom messages. This is not enabled by default. Integration Services supports a diverse
set of log providers, and gives you the ability to create custom log providers. The Integration
Services log providers can write log entries to text files, SQL Server Profiler, SQL Server,
Windows Event Log, or XML files. Logs are associated with packages and are configured at the
package level. Each task or container in a package can log information to any package log.
The tasks and containers in a package can be enabled for logging even if the package itself is
not.
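When the SQL Server log provider is used, entries are written to a table in the database pointed to by the provider's connection manager (sysdtslog90 in SQL Server 2005), so they can be inspected with plain T-SQL:

```sql
-- Most recent SSIS log entries first
SELECT event, source, starttime, endtime, message
FROM dbo.sysdtslog90
ORDER BY starttime DESC;
```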
Q: How do you deploy SSIS packages?
BUILDing SSIS Projects provides a Deployment Manifest file. We need to run the manifest file
and decide whether to deploy onto the File System or onto SQL Server [msdb]. SQL Server
deployment is faster and more secure than File System deployment. Alternatively, we can
also import the package from SSMS from the File System or SQL Server.
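Deployment can also be scripted with the dtutil command-line utility; a sketch, where the file path and package name are placeholders:

```
REM Deploy a package from the file system into msdb on the local server
dtutil /FILE "C:\Packages\MyPackage.dtsx" /DestServer localhost /COPY SQL;MyPackage
```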
Q: What are variables and what is variable scope ?
Variables store values that a SSIS package and its containers, tasks, and event handlers can
use at run time. The scripts in the Script task and the Script component can also use
variables. The precedence constraints that sequence tasks and containers into a workflow can
use variables when their constraint definitions include expressions. Integration Services
supports two types of variables: user-defined variables and system variables. User-defined
variables are defined by package developers, and system variables are defined by Integration
Services. You can create as many user-defined variables as a package requires, but you
cannot create additional system variables.
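For example, a precedence constraint can be set to "Expression and Constraint" and test a user-defined variable (the variable name below is illustrative):

```
@[User::RowCount] > 0
```

The next task then runs only when the preceding task succeeded and the expression evaluates to true.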

Q: Can you name five of the Perfmon counters for SSIS and the value they provide?
 SQLServer:SSIS Service
 SSIS Package Instances
 SQLServer:SSIS Pipeline
 BLOB bytes read
 BLOB bytes written
 BLOB files in use
 Buffer memory
 Buffers in use
 Buffers spooled
 Flat buffer memory
 Flat buffers in use
 Private buffer memory
 Private buffers in use
 Rows read
 Rows written

SSAS - SQL Server Analysis Services
Q: What is Analysis Services? List out the features?
Microsoft SQL Server 2005 Analysis Services (SSAS) delivers online analytical processing
(OLAP) and data mining functionality for business intelligence applications. Analysis Services
supports OLAP by letting us design, create, and manage multidimensional structures that
contain data aggregated from other data sources, such as relational databases. For data
mining applications, Analysis Services lets us design, create, and visualize data mining models
that are constructed from other data sources by using a wide variety of industry-standard
data mining algorithms.
Analysis Services is a middle-tier server for analytical processing, OLAP, and data mining. It
manages multidimensional cubes of data and provides access to large amounts of information,
including aggregations of data. One can create data mining models from data sources and use
them for Business Intelligence, including reporting features.
Analysis Services provides a combined view of the data used in OLAP or data mining. Services
here refer to OLAP and data mining. Analysis Services assists in creating, designing and
managing multidimensional structures containing data from varied sources. It provides a wide
array of data mining algorithms for specific trends and needs.
Some of the key features are:
 Ease of use with a lot of wizards and designers.
 Flexible data model creation and management
 Scalable architecture to handle OLAP
 Provides integration of administration tools, data sources, security, caching, and
reporting etc.
 Provides extensive support for custom applications
Q: What is UDM? Its significance in SSAS?
The role of a Unified Dimensional Model (UDM) is to provide a bridge between the user and the
data sources. A UDM is constructed over one or more physical data sources, and then the end
user issues queries against the UDM using one of a variety of client tools, such as Microsoft
Excel. At a minimum, when the UDM is constructed merely as a thin layer over the data
source, the advantages to the end user are a simpler, more readily understood model of the
data, isolation from heterogeneous backend data sources, and improved performance for
summary-type queries. In some scenarios a simple UDM like this is constructed totally
automatically. With greater investment in the construction of the UDM, additional benefits
accrue from the richness of metadata that the model can provide.
The UDM provides the following benefits:
• Allows the user model to be greatly enriched.
• Provides high-performance queries supporting interactive analysis, even over huge data
volumes.
• Allows business rules to be captured in the model to support richer analysis.
Q: What is the need for the SSAS component?
 Analysis Services is the only component in SQL Server with which we can perform Analysis
and Forecast operations.
 SSAS is very easy to use and interactive.
 Faster Analysis and Troubleshooting.
 Ability to create and manage Data Warehouses.
 Apply efficient Security Principles.
Q: Explain the TWO-Tier Architecture of SSAS?
 SSAS uses both server and client components to supply OLAP and data mining functionality
for BI applications.
 The server component is implemented as a Microsoft Windows service. Each instance of
Analysis Services is implemented as a separate instance of the Windows service.
 Clients communicate with Analysis Services using the standard XMLA (XML For Analysis)
protocol for issuing commands and receiving responses, exposed as a web service.
Q: What are the components of SSAS?
 An OLAP Engine is used for enabling fast ad hoc queries by end users. A user can
interactively explore data by drilling, slicing or pivoting.
 Drilling refers to the process of exploring the details of the data.
 Slicing refers to the process of placing data in rows and columns.
 Pivoting refers to switching categories of data between rows and columns.
 In OLAP, we will be using what are called Dimensional Databases.
Q: What is FASMI?
A database is called an OLAP database if it satisfies the FASMI rules:
 Fast Analysis – analysis is delivered in the OLAP scenario in five seconds or less.
 Shared – must support access to data by many users, in the factors of Sensitivity and
Write Backs.
 Multidimensional – the data inside the OLAP database must be multidimensional in
structure.
 Information – the OLAP database must support large volumes of data.
Q: What languages are used in SSAS?
 Structured Query Language (SQL)
 Multidimensional Expressions (MDX) – an industry-standard query language oriented
towards analysis.
 Data Mining Extensions (DMX) – an industry-standard query language oriented towards
data mining.
 Analysis Services Scripting Language (ASSL) – used to manage Analysis Services database
objects.
Q: How are Cubes implemented in SSAS?

 Cubes are multidimensional models that store data from one or more sources.
 Cubes can also store aggregations
 SSAS Cubes are created using the Cube Wizard.
 We also build Dimensions when creating Cubes.
 Cubes can see only the DSV (the logical view).
Q: What is the difference between a derived measure and a calculated measure?
The difference between a derived measure and a calculated measure is when the calculation is
performed. A derived measure is calculated before aggregations are created, and the values of
the derived measure are stored in the cube. A calculated measure is calculated after
aggregations are created, and the values of a calculated measure aren’t stored in the cube.
The primary criterion for choosing between a derived measure and a calculated measure is not
efficiency, but accuracy.
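As an illustration, a calculated measure is defined in the cube's MDX calculation script with CREATE MEMBER; the measure names below are hypothetical:

```mdx
CREATE MEMBER CURRENTCUBE.[Measures].[Average Unit Price]
AS [Measures].[Sales Amount] / [Measures].[Order Quantity],
FORMAT_STRING = "Currency",
VISIBLE = 1;
```

A derived measure, by contrast, would typically be computed in the data source view or the source query (for example as a named calculation) so that its values are aggregated and stored with the cube.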
Q: What is a partition?
A partition in Analysis Services is the physical location of stored cube data. Every cube has at
least one partition by default. Each time we create a measure group, another partition is
created. Queries run faster against a partitioned cube because Analysis Services only needs to
read data from the partitions that contain the answers to the queries. Queries run even faster
when partition also stores aggregations, the pre calculated totals for additive measures.
Partitions are a powerful and flexible means of managing cubes, especially large cubes.
Q: While creating a new calculated member in a cube what is the use of property
called non-empty behavior?
Nonempty behavior is an important property for ratio calculations. If the denominator is empty,
an MDX expression will return an error just as it would if the denominator were equal to zero.
By selecting one or more measures for the Non-Empty Behavior property, we are establishing
a requirement that each selected measure first be evaluated before the calculation expression
is evaluated. If each selected measure is empty, then the expression is also treated as empty
and no error is returned.
Q: What is a RAGGED hierarchy?
Under normal circumstances, each level in a hierarchy in Microsoft SQL Server 2005 Analysis
Services (SSAS) has the same number of members above it as any other member at the same
level. In a ragged hierarchy, the logical parent member of at least one member is not in the
level immediately above the member. When this occurs, the hierarchy descends to different
levels for different drilldown paths. Expanding through every level for every drilldown path is
then unnecessarily complicated.

Q: What are the roles of an Analysis Services Information Worker?
The role of an Analysis Services information worker is the traditional "domain expert" role in
business intelligence (BI): someone who understands the data employed by a solution and is
able to translate the data into business information. The role of an Analysis Services
information worker often has one of the following job titles: Business Analyst (Report
Consumer), Manager (Report Consumer), Technical Trainer, Help Desk/Operation, or Network
Administrator.
Q: What are the different ways of creating Aggregations?
We can create aggregations for faster MDX statements using the Aggregation Wizard or
through UBO (Usage-Based Optimization). Always prefer the UBO method for real-time
performance.
Q: What is WriteBack? What are the pre-conditions?
The Enable/Disable Writeback dialog box enables or disables writeback for a measure group in
a cube. Enabling writeback on a measure group defines a writeback partition and creates a
writeback table for that measure group. Disabling writeback on a measure group removes the
writeback partition but does not delete the writeback table, to avoid unanticipated data loss.
Q: What is processing?
Processing is a critical and resource intensive operation in the data warehouse lifecycle and
needs to be carefully optimized and executed. Analysis Services 2005 offers a high
performance and scalable processing architecture with a comprehensive set of controls for

database administrators.
We can process an OLAP database, individual cube, Dimension or a specific Partition in a cube.

Q: Name few Business Analysis Enhancements for SSAS?
The following table lists the business intelligence enhancements that are available in Microsoft
SQL Server 2005 Analysis Services (SSAS). The table also shows the cube or dimension to
which each business intelligence enhancement applies, and indicates whether an enhancement
can be applied to an object that was created without using a data source and for which no
schema has been generated.
Enhancement                            Type        Applied to                           No data source
Time Intelligence                      Cube        Cube                                 No
Account Intelligence                   Dimension   Dimension or cube                    No
Dimension Intelligence                 Dimension   Dimension or cube                    Yes
Custom Aggregation                     Dimension   Dimension (unary operator) or cube   No
Semiadditive Behavior                  Cube        Cube                                 Yes
Custom Member Formula                  Dimension   Dimension or cube                    No
Custom Sorting and Uniqueness Settings Dimension   Dimension or cube                    Yes
Dimension Writeback                    Dimension   Dimension or cube                    Yes

Q: What MDX functions do you most commonly use?
This is a great question because you only know this answer by experience. If you ask me this
question, the answer practically rushes out of me. “CrossJoin, Descendants, and NonEmpty, in
addition to Sum, Count, and Aggregate. My personal favorite is CrossJoin because it allows
me to identify non-contiguous slices of the cube and aggregate even though those cube cells
don’t roll up to a natural ancestor.” Indeed, CrossJoin has easily been my bread and butter.
Q: Where do you put calculated members?
The reflexive answer is “in the Measures dimension” but this is the obvious answer. So I
always follow up with another question. “If you want to create a calculated member that
intersects all measures, where do you put it?” A high percentage of candidates can’t answer
this question, and the answer is “In a dimension other than Measures.” If they can answer it,
I immediately ask them why. The answer is “Because a member in a dimension cannot
intersect its own relatives in that dimension.”
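A sketch of such a member, created in a Date hierarchy rather than in Measures so that it intersects every measure (the member names follow the Adventure Works style but are illustrative):

```mdx
CREATE MEMBER CURRENTCUBE.[Date].[Calendar Year].[All].[CY 2003 vs CY 2002]
AS [Date].[Calendar Year].&[2003] - [Date].[Calendar Year].&[2002];
```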

Q: How do I find the bottom 10 customers with the lowest sales in 2003 that were not null?

A: Simply using BottomCount will return customers with null sales. You will have to combine it
with NonEmpty to exclude them:

SELECT { [Measures].[Internet Sales Amount] } ON COLUMNS ,
BOTTOMCOUNT(
    NONEMPTY( DESCENDANTS( [Customer].[Customer Geography].[All Customers]
                         , [Customer].[Customer Geography].[Customer] )
            , ( [Measures].[Internet Sales Amount] ) )
  , 10
  , ( [Measures].[Internet Sales Amount] ) ) ON ROWS
FROM [Adventure Works]
WHERE ( [Date].[Calendar].[Calendar Year].&[2003] ) ;
Q: How in MDX query can I get top 3 sales years based on order quantity?

By default Analysis Services returns members in an order specified during attribute design.
Attribute properties that define ordering are "OrderBy" and "OrderByAttribute". Let's say we
want to see order counts for each year. In Adventure Works, the MDX query would be:

SELECT {[Measures].[Reseller Order Quantity]} ON 0
, [Date].[Calendar].[Calendar Year].Members ON 1
FROM [Adventure Works];

The same query using TopCount to return only the top 3 years:

SELECT {[Measures].[Reseller Order Quantity]} ON 0,
TopCount( [Date].[Calendar].[Calendar Year].Members
        , 3
        , [Measures].[Reseller Order Quantity] ) ON 1
FROM [Adventure Works];
Q: How do you extract first tuple from the set?

You can use the function Set.Item(0):

SELECT {[Date].[Calendar].[Calendar Year].Members.Item(0)} ON 0
FROM [Adventure Works];
Q: How can I setup default dimension member in Calculation script?

You can use ALTER CUBE statement. Syntax:
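The statement updates the DEFAULT_MEMBER property of a dimension; a sketch, where the dimension and member names are illustrative:

```mdx
ALTER CUBE CURRENTCUBE
UPDATE DIMENSION [Date].[Calendar Year],
DEFAULT_MEMBER = [Date].[Calendar Year].&[2003];
```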

SSRS - SQL Server Reporting Services
Q: What is SSRS?
SQL Server Reporting Services is a server-based report generation software system developed
by Microsoft. It is used for preparing and delivering a variety of interactive and printed
reports. It is administered through a web-based interface. Reporting Services utilizes a web
service interface to support the development of customized reporting applications. It competes
with Crystal Reports and other business intelligence tools.
Q: Explain SSRS Architecture?
Reporting Services architecture comprises integrated components. It is multi-tiered, including
application, server and data layers. This architecture is scalable and modular; a single
installation can be used across multiple computers. It includes the following components:
Report Manager, Report Designer, browser types supported by Reporting Services, Report
Server, Report Server command-line utilities, Report Server database, Reporting Services
extensibility, and the data sources that are supported by Reporting Services.
Q: Explain the Reporting Lifecycle?
The Reporting Lifecycle includes:
- Report designing: The designing is done in the Visual Studio Report Designer. It generates a
class which embodies the Report Definition.
- Report processing: The processing includes binding the report definition with data from the
report data source. It performs all grouping, sorting and filtering calculations. The expressions
are evaluated, except those in the page header, footer and section items. Later it fires the
Binding and Bound events. As a result of the processing, it produces a Report Instance. The
report instance may be persisted and stored, and can be rendered at a later point in time.
- Report rendering: Report rendering starts by passing the Report Instance to a specific
rendering extension (HTML or PDF formats). The report instance is paged if paging is
supported by the output format. The expressions of items in the page header and footer
sections are evaluated for every page. As a final step, the report is rendered to the specific
output document.
Q: How to fine-tune Reports?
To tune up Reporting Services, the following approaches can be used:
- Expand the server, or utilize the Reporting Services of another database server.
- For better embedding of report contents, the report application's logic and characteristics can
keep a duplicate copy of the data, replicated continuously.
- Using NOLOCK, locking issues can be resolved and query performance improved. This can be
done by allowing dirty reads when duplicating the data is not an option.

Q: What are the Reporting Services components?
Reporting Services components assist in development. These processing components include
tools that are used to create, manage and view reports. A report designer is used to create
the reports; a report server is used to execute and distribute reports; a report manager is
used to manage the report server.
Q: How can you monitor the report usage?
Q: How does the Report Manager work in SSRS?
Report Manager is a web application. In SSRS it is accessed by a URL. The interface of this
Report Manager depends on the permissions of the user. This means that to access any
functionality or perform any task, the user must be assigned a role. A user with a role of full
permissions can access all the features and menus of the report. To configure the Report
Manager, a URL needs to be defined.
Q: SQL Server Reporting Services vs Crystal Reports?
Crystal Reports are processed by IIS, while SSRS has a report server. Caching in SSRS is
available for report history snapshots; caching in Crystal Reports is available through a cache
server. Crystal Reports have standard and user-defined field labels, whereas SSRS allows only
user-defined field labels.
Q: What is Report Builder?
Report Builder is a business-user, ad-hoc report design client that allows users to design
reports based on the business terms (the Report Builder model) they are familiar with, but
without needing to understand database schemas or how to write SQL or MDX queries. Report
Builder works with both SQL Server and Analysis Services data sources.
Q: How does Report Builder support Analysis Services cubes?
Report Builder supports relational SQL and Analysis Services data sources in SQL Server 2005.
To create a model for an Analysis Services cube, go to Report Manager or Management
Studio, create a data source for your Analysis Services database, and then select the Generate
Model option to create the model.
Q: What are Data-Driven Subscriptions?
Reporting Services provides data-driven subscriptions so that you can customize the
distribution of a report based on dynamic subscriber data. Data-driven subscriptions are
intended for the following kinds of scenarios: distributing reports to a large recipient pool
whose membership may change from one distribution to the next (for example, distribute a
monthly report to all current customers), or distributing reports to a specific group of
recipients based on predefined criteria (for example, send a sales performance report to the
top ten sales managers in an organization).
Q: What is a Linked Report?
Q: What are the different types of roles provided by SSRS?
Q: Difference between Logical Page and Physical Page in SSRS?
Logical page breaks are page breaks that you insert before or after report items or groups.
Page breaks help to determine how the content is fitted to a report page for optimal viewing
when rendering or exporting the report. The following rules apply when rendering logical page
breaks:
- Logical page breaks are ignored for report items that are constantly hidden and for report
items where the visibility is controlled by clicking another report item.
- Logical page breaks are applied on conditionally visible items if they are currently visible at
the time the report is rendered.
- Logical page breaks that are inserted before a report item push the report item down to the
next page. The report item is rendered at the top of the next page. Space is preserved
between the report item with the logical page break and its peer report items.
- Logical page breaks defined on items in table or matrix cells are not kept. This does not
apply to items in lists.
Q: A user wants to display only PDF as an export option in Report Manager. How to achieve
this?
Q: Name and describe a few console utilities for SSRS?
Q: Name a few endpoints exposed by SSRS 2012?
Q: Describe the different processing modes offered by SSRS?
Q: When to use a Null data-driven subscription?
Create a data-driven subscription that uses the Null Delivery Provider. When you specify the
Null Delivery Provider as the method of delivery in the subscription, the report server targets
the report server database as the delivery destination and uses a specialized rendering
extension called the null rendering extension. In contrast with other delivery extensions, the
Null Delivery Provider does not have delivery settings that you can configure through a
subscription definition.

Q: What are the drawbacks of reporting in SSRS?
For many years, Microsoft had no direct solution for reporting with SQL Server besides Crystal
Reports. Now they have SQL Server Reporting Services, but it does have several drawbacks. It
is still complex to understand the complete functionality and structure of this new component,
and many users are still relying on the reporting application they are more familiar with, which
is Crystal Reports. Also, components in SSRS like Report Builder and Report Designer are
meant for different users for different aspects of the report process, yet a complete
understanding of and exposure to both is important to utilize both functions fully and
extensively. There are also issues when exporting very large reports to Microsoft Excel, as it
can lead to a loss of data.
Q: How do users use Report Builder with SQL Server data sources?
While models that provide access to SQL Server Analysis Services are automatically generated
on the report server, the Report Builder Model Designer can be used to generate or modify the
models that are built on top of SQL Server relational databases. These model-building projects
are a new type of project within a Visual Studio-based development shell.
Q: How do I get Report Builder to generate a parameter that can be set by users viewing the
report?
In the filter dialog box, click the name of the criterion that you would like to prompt the user
for when viewing the report. For example, for the criterion Order Year=2000, click Order Year.
Select the Prompt option in the drop-down list.
Q: What new data source types were added in SSRS 2008?
In addition to the data source types available in SSRS 2005 (SQL Server, Oracle, ODBC, OLE
DB), the following have been added in SSRS 2008: SQL Server 2005 Analysis Services, SQL
Server 2005 Integration Services, SQL Server 2005 Report Builder Models, XML (through URL
and Web services), and SAP.
Q: Can you import Microsoft Excel data to SSRS?
Reporting Services does not import data. It only queries data in whatever format it is stored in
its native storage system. I will assume that you are asking whether you can create reports
and use Excel spreadsheets as data sources. The answer is yes. Reporting Services supports a
wide variety of data sources, including Excel files. You will get the best performance with the
built-in native .NET providers, but you should be able to connect to any ODBC or OLE-DB data
source, whether it comes from Microsoft or a third-party company.
Q: Can we deploy SSRS reports on our personal website?
Your reports can only be deployed on a Reporting Services site. Your only option for viewing
them from other sites is an HTTP link. Some tools, like SharePoint, offer controls allowing you
to view reports in the context of other websites.
Q: Can we use datagrids for our report in SSRS?
I have an ASP.NET project that populates a datagrid. I would like to use the datagrid as my
datasource for my report using SQL Server Reporting Services. Is this possible? The simple
answer is no. However, nothing is ever simple. A set of reporting controls was added in Visual
Studio 2010 allowing you to report on a dataset, on data that was supplied by you. So, if you
retrieved your data into a dataset and bound the datagrid to the dataset so it had data to
display, you could then use that dataset as the datasource for the reporting controls. These
are client-side reports, not server reports, though.
Q: Do I need a report server to run reports in my application?
In addition to publishing reports to a report server, you can build reports using the Report
Designer that is directly integrated with Visual Studio language projects. You can embed
reports directly in any Windows Forms or ASP.NET Web application without access to a report
server. The data access in embedded reports is a natural extension of the Visual Studio data
facilities: not only can you use traditional databases as a source of data for your reports, you
can use object collections as well.
Q: How can I add Reporting Services reports to my application?
Visual Studio 2005 (Standard and Enterprise editions) contains a set of freely redistributable
Report Viewer controls that make it easy to embed Reporting Services functionality into
custom applications. Two versions of the Report Viewer exist, one for rich Windows client
applications and one for ASP.NET applications. You can embed reports directly, but the report
is still deployed to and hosted from Reporting Services.

Q: Will running SSRS on Windows XP limit the number of users?
Yes, but not because of SSRS. SSRS runs via IIS, and the Internet Information Services (IIS) component of Windows XP only allows a small number of users to connect to the website at once. This would prevent more than a few people from using SSRS at once. Also, the only edition of SSRS that will install on Windows XP is the Developer Edition, which cannot be used for production. For production use you need Standard or Enterprise Edition, which requires a server OS to install on (Windows 2003 Standard, Windows 2008 Standard, etc.).

Q: How to send a SSRS report from SSIS?
Often there is a requirement to be able to send a SSRS report in Excel, PDF or another format to different users from a SSIS package once it has finished performing a data load. To do this, first you need to create a subscription to the report. You can create a SSRS report subscription from Report Manager. In the report subscription you can specify the report format and the email address of the recipient. When you create a schedule for the SSRS report, a SQL Server Agent job will be created. From SSIS, by using sp_start_job and passing the relevant job name, you can execute the SSRS report subscription.

Q: Are there issues when exporting SSRS reports into Microsoft Excel?
When my users are trying to export a SSRS report into Microsoft Excel, one or two columns in the report appear to merge together. Why might this be? Exporting from SSRS is not always perfect, even if you stay within the Microsoft range of products. From my experience, it is usually headers or footers that cause exporting issues. If any of these headers or footers overlap with data columns in your report, you will find that the exported version of the report has merged cells. Also, check columns next to each other to make sure that there is no overlap. If you have extra resources, you could splurge for an add-on that offers much better control over exporting to Excel, such as OfficeWriter.

Today I will share with you a list of the most frequently asked SSIS interview questions with answers. These SSIS 2008 / SSIS 2012 interview questions will help you prepare better for your interviews.

1. What are the different types of Data Flow components in SSIS?
There are 3 data flow components in SSIS:
1. Sources
2. Transformations
3. Destinations

2. Explain Audit Transformation?
It allows you to add auditing information, as required in the auditing world specified by HIPAA and Sarbanes-Oxley (SOX). Auditing options that you can add to transformed data through this transformation are:
1. Execution Instance GUID: ID of the execution instance of the package
2. PackageID: ID of the package
3. PackageName
4. VersionID: GUID version of the package
5. ExecutionStartTime
6. MachineName
7. UserName
8. TaskName
9. TaskID: uniqueidentifier of the data flow task that contains the audit transformation
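The sp_start_job call mentioned above is typically issued from an Execute SQL Task in the SSIS package. A minimal sketch follows; the job name is a placeholder (SSRS subscription jobs live in msdb and are usually named with the subscription's GUID, so look the actual name up in msdb.dbo.sysjobs):

```sql
-- Placeholder job name; replace with the Agent job created by the subscription.
EXEC msdb.dbo.sp_start_job @job_name = N'MyReportSubscriptionJob';
```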

3. Explain Copy Column Transformation?
This component simply copies a column to another new column, just like an ALIAS column in T-SQL.

4. Explain Derived Column Transformation?
Derived Column creates a new column or puts the manipulation of several columns into a new column. You can directly copy an existing column or create a new column using more than one column.

5. Explain Multicast Transformation?
This transformation sends output to multiple output paths with no conditions, unlike the Conditional Split. It takes ONE input, makes copies of the data, and passes the same data through many outputs. In simple terms: give one input and take many outputs of the same data.

6. What is a Task?
A task is very much like a method in any programming language: it represents or carries out an individual unit of work. There are broadly two categories of tasks in SSIS: Control Flow tasks and Database Maintenance tasks. All Control Flow tasks are operational in nature, except Data Flow tasks. Although there are around 30 control flow tasks which you can use in your package, you can also develop your own custom tasks in your choice of .NET programming language.

7. What is a workflow in SSIS?
A workflow is a set of instructions that specify to the program executor how to execute tasks and containers within SSIS packages.

8. What is the Control Flow?
When you start working with SSIS, you first create a package, which is nothing but a collection of tasks or package components. The control flow allows you to order the workflow, so you can ensure tasks/components get executed in the appropriate order.

9. What is a Transformation?
A transformation simply means bringing the data into a desired format. For example, you are pulling data from the source and want to ensure only distinct records are written to the destination, so duplicates are removed. Another example: if you have master/reference data and want to pull only related data from the source, you need some sort of lookup. There are around 30 transformation tasks available, and this can be extended further with custom-built tasks if needed.

10. How many different sources and destinations have you used?
It is very common to get all kinds of sources, so the more the person has worked with, the better. Common ones are SQL Server, CSV/TXT, Flat Files, Excel, Access, Oracle and MySQL, but also Salesforce and web data scraping.

SSRS INTERVIEW QUESTIONS
Question: Which versions of SSRS have you used?
Question: Have you got samples of your work?
Question: Do you have Data Visualization skills? (optional)
Question: How have you learnt SSRS (on the job, articles, books, conferences)?
Question: How do you normally create reports (wizard/manually)?
Question: What languages have you used to query data for SSRS Reports?
Question: Give examples of where you used parameters?
Question: What types of graphs do you normally use and what effects do you apply to them?
Question: Have you used custom assemblies in SSRS? If yes, give an example.
Question: Can you update report data using SSRS?
Question: What is the formatting code (format property) for a number with 2 decimal places?
Below are "narrowed" questions:
Question: What does rdl stand for?
Question: How to deploy an SSRS Report?
Question: What is Report Manager?
Question: What is Report Builder?
Question: What permission do you need to give to users to enable them to use Report Builder?
Question: What do you need to restore report server database on another machine?
Question: Can you create subscription using windows authentication?
Question: What caching options do you have for reports on report server?
Question: How to find slow running reports?
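The two transformation examples above (removing duplicates and enriching rows from master/reference data) can be sketched conceptually. Python is used only for illustration; the rows, reference table, and column names are invented sample data:

```python
# Illustration only: what "distinct" and "lookup" style transformations
# do to a stream of rows, conceptually.
rows = [
    {"cust_id": 1, "amount": 10},
    {"cust_id": 2, "amount": 20},
    {"cust_id": 1, "amount": 10},   # duplicate row
    {"cust_id": 3, "amount": 5},    # no match in reference data
]

# "Distinct": only unique rows reach the destination.
seen, distinct_rows = set(), []
for r in rows:
    key = (r["cust_id"], r["amount"])
    if key not in seen:
        seen.add(key)
        distinct_rows.append(r)

# "Lookup": enrich rows from master/reference data; in SSIS, unmatched
# rows would be routed to the no-match/error output.
reference = {1: "Alice", 2: "Bob"}   # master data: cust_id -> name
matched, no_match = [], []
for r in distinct_rows:
    if r["cust_id"] in reference:
        matched.append({**r, "name": reference[r["cust_id"]]})
    else:
        no_match.append(r)

print(len(distinct_rows), len(matched), len(no_match))  # 3 2 1
```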

Question: How can you make calendar available in report parameter?
Question: How to pass multi-valued parameter to stored procedure in dataset?
Question: Which functions are used to pass parameter to MDX query?
Question: How to create "dependent" parameter "Make, Model, Year"?
Question: How to create "alternate row colour"?
Question: How to create percentile function?
Question: How to create median function?
Question: How to make conditional sum in SSRS?
Question: How to find a value in another dataset based on current dataset field (SSRS 2008 R2)?
Question: How to change parameter value inside the report?
Question: How to identify current user in SSRS Report?

General SSRS Questions
Question: Which versions of SSRS have you used?
Comment: Differences between 2005 and 2008 are quite big, so if someone hasn't used 2008 or 2008 R2 before (or vice versa) then make sure you know this will require a few days to get up to speed with the new/previous version. If the candidate has used several versions, ask them to describe the differences between them. Attention to detail and good memory are always welcome.

Question: Have you got samples of your work?
Comment: If SSRS is the main skill for the role then it is worth asking for samples (before the interview); they will often tell you a lot about the quality of the candidate's work. When you review samples, ask the candidate or yourself two questions: What is the purpose? And is it useful? Someone who does not know anything about data visualization will usually use pie charts and gauges (the exception is the bullet chart, which is classified as a gauge in SSRS) and make colourful designs (without any colour meaning).

Question: Do you have Data Visualization skills? (optional)
Comment: If you need dashboards then it is worth asking this question (we will ask more specific questions later). This is also related to the previous question. Some candidates may mention Stephen Few or Edward Tufte, and this normally is a big plus.

Question: How have you learnt SSRS (on the job, articles, books, conferences)?
Comment: The thing is that most people who read good books usually have an advantage over those who haven't, because they know what they know and they know what they don't know (but they know it exists and is available)... Blogs and articles vary in quality, so best-practice articles are a big plus; conferences can also be a plus.

SSRS Development open questions
Question: How do you normally create reports (wizard/manually)?
Comment: If wizard in most cases, then you have got a rather inexperienced person; if manually, then it is usually a good answer. Generally developers create reports from scratch, which is not bad, but it is not very efficient either. The best answer is using a template.

Question: What languages have you used to query data for SSRS Reports?
Comment: Most answers will probably be SQL (T-SQL), but T-SQL is not the only query language. If someone has built reports based on cubes before, then they will say MDX. You can also query data from other sources (not recommended) or use data mining expressions (DMX = advanced).

Question: Give examples of where you used parameters?
Comment: Typically you use parameters to filter data in datasets (or data on reports), but you can also use them to restrict values, like the well-known make->model->year example. You can also have hidden and internal parameters, which get very handy for more advanced stuff.

Question: What types of graphs do you normally use and what effects do you apply to them?
Comment: Good graphs are bar, line, scatter, area and bullet graphs. Bad graphs are pie charts and gauges (apart from the bullet graph, which is classified as a gauge in SSRS). Effects should be limited to a minimum; developers should avoid 3D effects, "glass" effects, shadows, etc.

SSRS Advanced questions
Question: Have you used custom assemblies in SSRS? If yes, give an example.
Comment: This allows you to re-use code in reports, and it is not very common. Re-usability is good, but building dependencies is not so good... one small mistake might break all reports using the assembly, so it should be used with care.

Question: Can you update report data using SSRS?
Comment: This is not often used (and probably shouldn't be used in most cases), but it is possible.

Question: What is the formatting code (format property) for a number with 2 decimal places?
Comment: N2.

SSRS interview "narrowed" questions
In this section I will give you a fairly long list of short and narrowed questions:

Question: What does rdl stand for?
Answer: Report Definition Language.
Question: How to deploy an SSRS Report?
Answer: Configure the project properties (for multiple environments) and deploy from BIDS, upload manually, or use rs.exe for command-line deployment.
Question: What is Report Manager?
Answer: A web-based tool that allows you to access and run reports.
Question: What is Report Builder?
Answer: Report Builder is a self-service tool for end users.
Question: What permission do you need to give to users to enable them to use Report Builder?
Answer: The "Report Builder" role and "system user". Report Builder should also be enabled in the report server properties.
Question: What do you need to restore report server database on another machine?
Answer: The SSRS encryption key.
Question: Can you create subscription using windows authentication?
Answer: No.
Question: What caching options do you have for reports on report server?
Answer: Do not cache, expire cache after x minutes, or on a schedule.
Question: How to find slow running reports?
Answer: Check the ExecutionLog table in the ReportServer database.
Question: How can you make calendar available in report parameter?
Answer: Set the parameter data type to date.
Question: How to pass multi-valued parameter to stored procedure in dataset?
Answer: Use the Join function in SSRS and a split function in T-SQL.
Question: Which functions are used to pass parameter to MDX query?
Answer: StrToMember and StrToSet.
Question: How to create "dependent" parameter "Make, Model, Year"?
Answer: They need to be in the correct order, and each previous parameter should filter the next parameter's dataset. For instance, the Model dataset should be filtered using the Make parameter.
Question: How to create "alternate row colour"?
Answer: Use RowNumber and the Mod function, OR visit our tutorial for more details.
Question: How to create percentile function?
Answer: Custom code is required. Visit our tutorial for more details.
Question: How to create median function?
Answer: Custom code is required. Visit our tutorial for more details.
Question: How to make conditional sum in SSRS?
Answer: IIF: if the condition is true then SUM, else nothing.
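The Join/split answer above can be sketched as follows. On the report side the dataset parameter is built with an expression such as =Join(Parameters!Make.Value, ","); on the T-SQL side a split function turns the string back into rows. Note that STRING_SPLIT requires SQL Server 2016 or later (older versions need a user-defined split function), and the table and column names here are invented:

```sql
-- @MakeList arrives as a single string, e.g. 'Ford,Honda,Toyota'
SELECT c.Make, c.Model, c.SalesYear
FROM dbo.Cars AS c
WHERE c.Make IN (SELECT value FROM STRING_SPLIT(@MakeList, ','));
```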

Question: How to find a value in another dataset based on current dataset field (SSRS 2008 R2)?
Answer: Use the Lookup function.
Question: How to change parameter value inside the report?
Answer: Set an action: "jump to itself" and pass a different value for the parameter.
Question: How to identify current user in SSRS Report?
Answer: User!UserID
-----------------------------------------------------------------------------------------------------
Q1. What is SQL Server Reporting Services (SSRS)?
SQL Server Reporting Services is a server-based reporting platform that you can use to create and manage tabular, matrix, graphical, and free-form reports that contain data from relational and multidimensional data sources. Output formats include HTML, Excel, CSV, PDF, TIFF, and more. The reports that you create can be viewed and managed over a World Wide Web-based connection, and you can also create custom tools to build and manage reports.

Q2. What are the three stages of the Enterprise Reporting Life Cycle?
a. Authoring
b. Management
c. Access and Delivery

Q3. Architecture of SSRS:

Q4. What are the components included in SSRS?
1. A complete set of tools that can be used to create, manage and view reports.
2. A Report Server component that hosts and processes reports in a variety of formats.
3. An API that allows developers to integrate or extend data and report processing in custom applications.

Q5. Important terms used in Reporting Services?
1. Report definition: The blueprint for a report before the report is processed or rendered. A report definition contains information about the query and layout for the report.
2. Rendered report: A fully processed report that contains both data and layout information, in a format suitable for viewing (such as HTML).
3. Parameterized report: A published report that accepts input values through parameters.
4. Linked report: A report that derives its definition through a link to another report.
5. Report snapshot: A report that contains data captured at a specific point in time. A report snapshot is actually a report definition that contains a dataset instead of query instructions.
6. Report model: A semantic description of business data, used for ad hoc reports created in Report Builder.
7. Shared data source: A predefined, standalone item that contains data source connection information.
8. Report-specific data source: Data source information that is defined within a report definition.
9. Shared schedule: A predefined, standalone item that contains schedule information.
10. Folder hierarchy: A bounded namespace that uniquely identifies all reports, folders, report models, shared data source items, and resources that are stored in and managed by a report server.
11. Report Server: Describes the Report Server component, which provides data and report processing, and report delivery. The Report Server component includes several subcomponents that perform specific functions.
12. Report server administrator: This term is used in the documentation to describe a user with elevated privileges who can access all settings and content of a report server. If you are using the default roles, a report server administrator is typically a user who is assigned to both the Content Manager role and the System Administrator role. Local administrators can have elevated permissions even if role assignments are not defined for them.

Q6. Which programming language can be used to code embedded functions in SSRS?
Visual Basic .NET code.

Q7. What is the benefit of using embedded code in a report?
1. Reusability of code: a function created in embedded code to perform some logic can then be used in multiple expressions.
2. Centralized code: helps in better manageability of code.
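As a sketch of such embedded code (entered under Report Properties > Code), here is a hypothetical Visual Basic .NET function; the name and logic are invented for illustration, and it would be called from any expression as =Code.SafeDivide(Fields!Sales.Value, Fields!Qty.Value):

```vb
' Hypothetical embedded function: avoids #Error when the denominator is zero.
Public Function SafeDivide(ByVal numerator As Decimal, ByVal denominator As Decimal) As Decimal
    If denominator = 0 Then
        Return 0
    End If
    Return numerator / denominator
End Function
```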

13. Report Manager: Describes the web application tool used to access and manage the contents of a report server database.
14. Report Builder: Report authoring tool used to create ad hoc reports.
15. Report Designer: Report creation tool included with Reporting Services.
16. Model Designer: Report model creation tool used to build models for ad hoc reporting.
17. Report Server Command Prompt Utilities: Command line utilities that you can use to administer a report server:
a) RsConfig.exe
b) RsKeymgmt.exe
c) Rs.exe

Q8. What is the difference between Tabular and Matrix reports? OR What are the different styles of reports?
Tabular report: A tabular report is the most basic type of report. Each column corresponds to a column selected from the database.
Matrix report: A matrix (cross-product) report is a cross-tabulation of four groups of data:
a. One group of data is displayed across the page.
b. One group of data is displayed down the page.
c. One group of data is the cross-product, which determines all possible locations where the across and down data relate and places a cell in those locations.
d. One group of data is displayed as the "filler" of the cells.
Matrix reports can be considered more of a pivot table.

Q. What are the Command Line Utilities available in Reporting Services?
· Rsconfig Utility (Rsconfig.exe): Encrypts and stores connection and account values in the RSReportServer.config file. Encrypted values include report server database connection information and account values used for unattended report processing.
· RsKeymgmt Utility: Extracts, restores, creates, and deletes the symmetric key used to protect sensitive report server data against unauthorized access.
· RS Utility (Rs.exe): Mainly used to automate report server deployment and administration tasks; processes script you provide in an input file.

Q. How to know Report Execution History?
The ExecutionLog table in the ReportServer database stores all the logs from the last two months:
SELECT * FROM ReportServer.dbo.ExecutionLog

Q. How to create Drill-through reports?
Using the Navigation property of a cell and setting the child report and its parameters in it.

Q. How to create Drill-Down reports?
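Building on the ExecutionLog query above, one way to surface the slowest reports is to order by total execution time. This is a sketch using the column names of the SSRS 2005/2008 ExecutionLog table (times are in milliseconds):

```sql
SELECT TOP 20
    ReportID,
    TimeStart,
    TimeDataRetrieval + TimeProcessing + TimeRendering AS TotalMs
FROM ReportServer.dbo.ExecutionLog
ORDER BY TotalMs DESC;
```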

By grouping data on the required fields, then toggling visibility based on the grouped field.
-----------------------------------------------------------------------------------------------------
1. What are data regions? Data regions are report items that display repeated rows of summarized information from datasets.
2. You want to generate a report that is formatted as a chart. Can you use the Report Wizard to create such a report? No, the Report Wizard lets you create only tabular and matrix reports. You must create the chart report directly by using the Report Designer.
3. You want to use BIDS to deploy a report to a different server than the one you chose in the Report Wizard. How can you change the server URL? You can right-click the project in Solution Explorer and then change the TargetServerURL property.
4. You want to include an image in a report. How do you display the Image Properties dialog box? When you drag an image item from the Toolbox window to the Report Designer, the Image Properties dialog box automatically opens.
5. You want to configure an amount to display the value in a currency format. Which property do you use? Select the report item, and then set the Format property to C or c.
6. Which rendering formats are affected by the PageSize properties? Because only the Adobe PDF file, Word, and Image rendering extensions use physical page breaks, they are the only formats that are affected by the PageSize properties.
7. Can you use a stored procedure to provide data to an SSRS report? Yes, you can use a stored procedure to provide data to an SSRS report by configuring the dataset to use a stored procedure command type. However, your stored procedure should return only a single result set. If it returns multiple result sets, only the first one is used for the report dataset.
8. You want to use a perspective in an MDX query. How do you select the perspective? Use the Cube Selector in the MDX Query Designer to select a perspective.
9. Can you use data mining models in SSRS? Yes, you can use the DMX Designer to create data mining queries for SSRS reports. However, do not forget to flatten the result set returned by the DMX query.
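A sketch of a dataset-friendly stored procedure follows (all names are invented); note the single SELECT, since only the first result set would be used by the report:

```sql
CREATE PROCEDURE dbo.GetSalesByYear
    @SalesYear int
AS
BEGIN
    SET NOCOUNT ON;  -- suppress extra "rows affected" messages
    SELECT Category, SubCategory, SUM(Amount) AS TotalAmount
    FROM dbo.Sales
    WHERE SalesYear = @SalesYear
    GROUP BY Category, SubCategory;  -- one and only one result set
END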

10. What is the main difference between a Matrix report item and a Table report item? The main difference between a Matrix and a Table report item is in the initial template. Actually, both report items are just templates for the Tablix data region.
11. What is the main purpose of a query parameter? The main purpose of a query parameter is to filter data in the data source.
12. What is the main purpose of a report parameter? The main purpose of a report parameter is to add interactivity to your reports, letting users change the report behavior based on options they select.
13. You want your users to select a parameter from a list of values in a list box. How should you configure the parameter? You should create a data source that contains the possible values and then bind the data source to the parameter.
14. In SSRS, is it better to use parameters to filter information in the query or to use filters in the dataset? From a performance perspective, it is better to use parameters because they let SSRS pull filtered data from the data source. In contrast, when you use filters, the queries retrieve all data and then filter the information in an additional step.
15. You want a report to display Sales by Category, SubCategory, and Product. You want users to see only summarized information initially but to be able to display the details as necessary. How would you create the report? Group the Sales information by Category, SubCategory, and Product. Hide the SubCategory group and set the visibility to toggle based on the Category item. Hide the Product group and set the visibility to toggle based on the SubCategory item.
16. How do you configure a running aggregate in SSRS? You can use the RunningValue function to configure a running aggregate.
17. What is the main benefit of using embedded code in a report? The main benefit of using embedded code in a report is that the code you write at the report level can be reused in any expression in the report.
18. What programming language would you use to create embedded functions in SSRS? An SSRS report supports only Visual Basic .NET embedded code.
19. You want to create an Excel interactive report from SSRS. When you do not use report caching, can you create the same interactive experience in Excel that you would have on the Web? No, you cannot create the same experience with SSRS. You can, however, use Excel to create such an experience.
20. You want your report to display a hyperlink that will take users to your intranet. How do you configure such a hyperlink? Create a text box item, set the action to Go To URL, and then configure the URL.
---------------------------------------------------------------------------------------------------------
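For example, a RunningValue expression for a running total might look like the following sketch, where the field and group names are placeholders:

```
' Running sum of Amount, reset for each instance of the "Category" group:
=RunningValue(Fields!Amount.Value, Sum, "Category")
```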

SSIS and SSRS Interview Questions with Answers

1) What is the control flow?
ANS: In SSIS a workflow is called a control flow. A control flow links together our modular data flows as a series of operations in order to achieve a desired result. A control flow consists of one or more tasks and containers that execute when the package runs. To control the order or define the conditions for running the next task or container in the package control flow, you use precedence constraints to connect the tasks and containers in a package. A subset of tasks and containers can also be grouped and run repeatedly as a unit within the package control flow. SQL Server 2005 Integration Services (SSIS) provides three different types of control flow elements: containers that provide structures in packages, tasks that provide functionality, and precedence constraints that connect the executables, containers, and tasks into an ordered control flow.

2) What is a data flow?
ANS: A data flow consists of the sources and destinations that extract and load data, the transformations that modify and extend data, and the paths that link sources, transformations, and destinations. Before you can add a data flow to a package, the package control flow must include a Data Flow task. The Data Flow task is the executable within the SSIS package that creates, orders, and runs the data flow. A separate instance of the data flow engine is opened for each Data Flow task in a package. SQL Server 2005 Integration Services (SSIS) provides three different types of data flow components: sources, transformations, and destinations. Sources extract data from data stores such as tables and views in relational databases, files, and Analysis Services databases. Transformations modify, summarize, and clean data. Destinations load data into data stores or create in-memory datasets.

3) How do you do error handling in SSIS?
ANS: When a data flow component applies a transformation to column data, extracts data from sources, or loads data into destinations, errors can occur. Errors frequently occur because of unexpected data values. Errors typically fall into one of the following categories:
- Data conversion errors, which occur if a conversion results in the loss of significant digits, the loss of insignificant digits, or the truncation of strings. Data conversion errors also occur if the requested conversion is not supported.
- Expression evaluation errors, which occur if expressions that are evaluated at run time perform invalid operations or become syntactically incorrect because of missing or incorrect data values. For example, a data conversion fails because a column contains a string instead of a number; an insertion into a database column fails because the data is a date and the column has a numeric data type; or an expression fails to evaluate because a column value is zero, resulting in a mathematical operation that is not valid.
- Lookup errors, which occur if a lookup operation fails to locate a match in the lookup table.
Many data flow components support error outputs, which let you control how the component handles row-level errors in both incoming and outgoing data. You specify how the component behaves when truncation or an error occurs by setting options on individual columns in the input or output. For example, you can specify that the component should fail if customer name data is truncated, but ignore errors on another column that contains less important data.

4) How do you do logging in SSIS?
ANS: SSIS includes logging features that write log entries when run-time events occur and can also write custom messages. Integration Services supports a diverse set of log providers, and gives you the ability to create custom log providers. The Integration Services log providers can write log entries to text files, SQL Server Profiler, SQL Server, the Windows Event Log, or XML files. Logs are associated with packages and are configured at the package level. Each task or container in a package can log information to any package log. The tasks and containers in a package can be enabled for logging even if the package itself is not. To customize the logging of an event or custom message, Integration Services provides a schema of commonly logged information to include in log entries. The Integration Services log schema defines the information that you can log; you can select elements from the log schema for each log entry.
To enable logging in a package:
1. In Business Intelligence Development Studio, open the Integration Services project that contains the package you want.
2. On the SSIS menu, click Logging.
3. Select a log provider in the Provider type list, and then click Add.

5) What are variables and what is variable scope?
ANS: Variables store values that a SSIS package and its containers, tasks, and event handlers can use at run time. The scripts in the Script task and the Script component can also use variables. The precedence constraints that sequence tasks and containers into a workflow can use variables when their constraint definitions include expressions. Integration Services supports two types of variables: user-defined variables and system variables. User-defined variables are defined by package developers, and system variables are defined by Integration Services. You can create as many user-defined variables as a package requires, but you cannot create additional system variables.
Scope: A variable is created within the scope of a package or within the scope of a container, task, or event handler in the package. Because the package container is at the top of the container hierarchy, variables with package scope function like global variables and can be used by all containers in the package. Similarly, variables defined within the scope of a container such as a For Loop container can be used by all tasks or containers within the For Loop container.

6) True or False: Using a checkpoint file in SSIS is just like issuing the CHECKPOINT command against the relational engine. It commits all of the data to the database.
ANS: False. SSIS provides a Checkpoint capability which allows a package to restart at the point of failure.

7) True or False: SSIS has a default means to log all records updated, deleted or inserted on a per-table basis.
ANS: False, but a custom solution can be built to meet these needs.

8) What is a breakpoint in SSIS? How is it set up? How do you disable it?
ANS: A breakpoint is a stopping point in the code. The breakpoint can give the developer/DBA an opportunity to review the status of the data, variables and the overall status of the SSIS package. 10 unique conditions exist for each breakpoint. Breakpoints are set up in BIDS: navigate to the control flow interface, right-click on the object where you want to set the breakpoint, and select the 'Edit Breakpoints...' option.

Q) The SSIS project contains 10 packages. How do you deploy only 5 packages to the destination machine, even though the manifest file contains all 10 packages after the build?
- Open the manifest file in any editor like BIDS or Notepad.
- Keep the required 5 packages and remove the remaining 5 packages.
- Save and close the manifest file.
- Double-click the manifest file to deploy the required 5 packages.

9) How do you eliminate quotes from being uploaded from a flat file to SQL Server?
ANS: In the SSIS package, on the Flat File Connection Manager Editor, enter quotes into the Text qualifier field, then preview the data to ensure the quotes are not included.
Additional information: How to strip out double quotes from an import file in SQL Server Integration Services.

10) Can you explain how to set up a checkpoint file in SSIS?
ANS: The following items need to be configured on the properties tab for the SSIS package:
CheckpointFileName - Specify the full path to the checkpoint file that the package uses to save the value of package variables and log completed tasks. Rather than using a hard-coded path, it's a good idea to use an expression that concatenates a path defined in a package variable and the package name.
CheckpointUsage - Determines if/how checkpoints are used. Choose from these options: Never (default), IfExists, or Always. Never indicates that you are not using checkpoints. IfExists is the typical setting and implements the restart-at-the-point-of-failure behavior: if a checkpoint file is found, it is used to restore package variable values and restart at the point of failure; if a checkpoint file is not found, the package starts execution with the first task. The Always choice raises an error if the checkpoint file does not exist.
SaveCheckpoints - Choose from these options: True or False (default). You must select True to implement the checkpoint behavior.

11) How do you upgrade an SSIS package?
ANS: Depending on the complexity of the package, one of two techniques is typically used:
- Recode the package based on the functionality in SQL Server DTS.
- Use the Migrate DTS 2000 Package wizard in BIDS, then recode any portion of the package that is not accurate.

SSIS Interview Questions and Answers
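The expression advice above for CheckpointFileName could look like the following sketch, assuming a package variable User::CheckpointPath holds the folder:

```
@[User::CheckpointPath] + "\\" + @[System::PackageName] + ".chk"
```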

2. How would you pass a Parent variable value to a Child Package?
We can pass the parent variable to the child package by using package configurations: Parent Package Variable.
2.1 Create a parent variable: FilePath, String, scope Parent, value C:\RK\file.txt
2.2 Drag an 'Execute Package Task' into the Control Flow and configure it to start the child package.

2.3 Go to the Child Package and create a variable with the same name as your parent package variable.
2.4 Add Package Configurations.
2.5 Enable Package Configuration, choose the configuration type "Parent Package Variable", and type the name of the variable.

2.6 Click the 'Next' button and select the 'Value' property of the child package variable.

8 To test the package. My Source Table data as follows: Output Should be as follows: How to Implement? Designed SSIS package like: .2. 3. I added sample Script Task with a messageBox to show the value of the parent package.7 click 'Next' and 'OK' buttons 2.

The script component code: (screenshot)

4. My Source Table data is as follows: (screenshot) Output should be as follows: (screenshot)

How to implement?

5. How to pass a property value at run time?
A property value, such as the connection string of a Connection Manager, can be passed to the package using package configurations.

6. Will triggers fire when inserting data through an SSIS package?
By default the fast-load (bulk insert) path does not fire triggers; FIRE_TRIGGERS must be specified:
1. In the data flow task, go to the Advanced Editor of the OLEDB Destination; there is a property "FastLoadOptions". Specify FIRE_TRIGGERS as an additional option.
2. SQL Destination Editor: (screenshot)

Bulk Insert Task Editor: (screenshot)

7. What are the different types of Transformations you have worked with?
AGGREGATE - The Aggregate transformation applies aggregate functions to column values and copies the results to the transformation output. Besides aggregate functions, the transformation provides the GROUP BY clause, which you can use to specify groups to aggregate across. The Aggregate transformation supports the following operations: Group By, Sum, Average, Count, Count Distinct, Minimum, Maximum.
CHARACTER MAP - When it comes to string formatting in SSIS, the Character Map transformation is very useful; it is used to convert data to lower case, upper case, etc.
AUDIT - Adds Package and Task level metadata, such as Machine Name, Execution Instance, Package Name, Package ID, etc.

CONDITIONAL SPLIT - Used to split the input source data based on a condition.
COPY COLUMN - Adds a copy of a column to the output; we can later transform the copy, keeping the original for auditing.
DATA CONVERSION - Converts column data types from one type to another. It stands for Explicit Column Conversion.
DATA MINING QUERY - Used to perform a data mining query against Analysis Services and manage predictions and graphs.
DERIVED COLUMN - Creates a new (computed) column from given expressions.
EXPORT COLUMN - Used to export an image-specific column from the database to a flat file.
FUZZY GROUPING - Groups the rows in the dataset that contain similar values.
FUZZY LOOKUP - Used for pattern matching and ranking based on fuzzy logic.
IMPORT COLUMN - Reads data from a file (for example, images) into a column of the data flow.
LOOKUP - Performs the lookup (searching) of a given reference object set against a data source. It is used for exact matches only.
MERGE - Merges two sorted data sets of the same column structure into a single output.
MERGE JOIN - Merges two sorted data sets into a single dataset using a join.
MULTICAST - Used to create/distribute exact copies of the source dataset to one or more destination datasets.
ROW COUNT - Stores the resulting row count from the data flow / transformation into a variable.
ROW SAMPLING - Captures sample data by using a row count of the total rows in the dataflow, specified by rows or percentage.
UNION ALL - Merges multiple data sets into a single dataset.
PIVOT - Used for normalization of data sources to reduce anomalies by converting rows into columns.
UNPIVOT - Used for denormalizing the data structure by converting columns into rows, in case of building Data Warehouses.

8. What are the different types of Transaction Options?
Required: If a transaction already exists at the upper level, the current executable will join the transaction; if there is no transaction at the upper level, a new transaction is created automatically.
Supported: In any executable, if there is a transaction at the upper level, the executable joins the transaction; it does not create a new transaction.
Not Supported: The executable of the package does not honour any transaction, i.e. it neither joins another transaction nor creates a new transaction.

9. Explain about Checkpoints with properties
A Checkpoint is used to restart the package execution from the point of failure rather than from the initial start. The checkpoint mechanism uses a text file to mark the point of package failure. These checkpoint files are automatically created at a given location upon package failure and automatically deleted once the package ends with success.
Set the following properties:
CheckpointFileName: Specifies the name of the checkpoint file.
CheckpointUsage: Never, IfExists, or Always.
SaveCheckpoints: Indicates whether the package needs to save checkpoints. This property must be set to True to restart a package from a point of failure.
FailPackageOnFailure: Needs to be set to True on each task that should participate in the checkpoint.

10. How to execute an SSIS Package from a Stored Procedure
Using the xp_cmdshell command.
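A minimal T-SQL sketch of the xp_cmdshell approach (the package path below is hypothetical; xp_cmdshell is disabled by default and enabling it requires sysadmin permissions):

```sql
-- Enable xp_cmdshell (one-time server configuration).
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE;

-- Run the package through the DTEXEC command line utility.
-- The path is an assumption for illustration.
EXEC master..xp_cmdshell 'DTEXEC /F "C:\SSIS\MyPackage.dtsx"';
```

A stored procedure can wrap the EXEC master..xp_cmdshell call so that callers never deal with the command line directly.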



11. Package configuration? Different types of Configuration Files
The package can be transferred across various environments like development, unit testing, system testing, UAT and production. Most packages have environment-specific variables like the connection string to a database, the path to a flat file, or user defined variables, which would be impacted while moving the package across environments as part of the deployment process. Hence it is mandatory to change these environment-dependent values when the package is transferred across environments. Package configurations help in managing such changes without actually opening and editing the SSIS package in Business Intelligence Development Studio (BIDS). After deploying the package to a different machine (using SQL Server or file system deployment mode) it is mandatory to copy the related package configuration files onto that machine. If the package is scheduled to run through a SQL Agent job, the configuration file should be added while creating the job so that the package will read the information from the configuration file; otherwise SQL Agent will take the design-time values for connection strings. There are 5 configuration types available with package configurations: XML configuration file, Environment variable, Registry entry, Parent package variable, and SQL Server table.

12. Parallel processing in SSIS
To support parallel execution of different tasks in the package, SSIS uses 2 properties:
1. MaxConcurrentExecutables: defines how many tasks can run simultaneously, by specifying the maximum number of SSIS threads that can execute in parallel per package. The default is -1, which equates to the number of physical or logical processors + 2.
2. EngineThreads: a property of each Data Flow task. This property defines how many threads the data flow engine can create and run in parallel. The EngineThreads property applies equally to both the source threads that the data flow engine creates for sources and the worker threads that the engine creates for transformations and destinations. Therefore, setting EngineThreads to 10 means that the engine can create up to ten source threads and up to ten worker threads.
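For illustration, an XML package configuration file (.dtsConfig) that overrides a connection string typically looks like the fragment below; the connection manager name ("SourceDB") and server are assumptions, not values from this article:

```xml
<DTSConfiguration>
  <Configuration ConfiguredType="Property"
                 Path="\Package.Connections[SourceDB].Properties[ConnectionString]"
                 ValueType="String">
    <ConfiguredValue>Data Source=PRODSRV;Initial Catalog=Sales;Integrated Security=SSPI;</ConfiguredValue>
  </Configuration>
</DTSConfiguration>
```

Keeping one such file per environment lets the same .dtsx package point at development, UAT or production without being edited.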

13. What is the LoggingMode property?
SSIS packages, tasks and containers have a property called LoggingMode. This property accepts 3 possible values: Enabled - to enable logging of that component; Disabled - to disable logging of that component; UseParentSetting - to use the parent's setting of that component to decide whether or not to log the data.

14. Different types of Logging files
Logging is used to log information during the execution of the package. A log provider can be a text file, the SQL Server Profiler, a SQL Server relational database, a Windows event log, or an XML file. If necessary, you can define a custom log provider (e.g. a proprietary file format).

15. How to debug a package
For debugging a package, we have 3 options:
-Setting breakpoints in a package, container or task.
-Using the Data Viewer in the Data Flow task.
-Setting error redirection in the Data Flow task.

16. Error handling in SSIS package
I have created a package like below: (screenshot)


Select the 'Load to Sql Table' Data Flow Task and navigate to the 'Event Handlers' tab.

Drag and drop an 'Execute Sql Task'. Open the Execute Sql Task Editor and, in the
'Parameter Mapping' section, select the system variables as follows:

Create a table in Sql Server Database with Columns as: PackageID, PackageName,
TaskID, TaskName, ErrorCode, ErrorDescription.

The package will fail during execution.

The error information is inserted into the table.

17. Difference between Merge and UnionAll Transformations
The Union All transformation combines multiple inputs into one output. The
transformation inputs are added to the transformation output one after the other; no
reordering of rows occurs.

The Merge transformation combines two sorted datasets into a single dataset. The
rows from each dataset are inserted into the output based on values in their key
columns. The Merge transformation is similar to the Union All transformation. Use the
Union All transformation instead of the Merge transformation in the following situations:

-The source input rows do not need to be sorted.
-The combined output does not need to be sorted.
-When we have more than 2 source inputs.

18. Difference between Synchronous and Asynchronous Transformation
A Synchronous transformation processes the input rows and passes them onto the data flow one row at a time; its output rows stay in sync with the input buffer. When the output of a transformation is placed into a new buffer (the output buffer or output rows are not in sync with the input buffer), it is an Asynchronous transformation. Examples: Sort, Aggregate.

19. Incremental Load in SSIS
-Using the Slowly Changing Dimension transformation
-Using the Lookup and Cache Transformations

20. What are Row Transformations, Partially Blocking Transformations and Fully Blocking Transformations? Give examples.
Row Transformation: each value is manipulated individually. In this transformation the buffers can be re-used by the following: OLEDB Data Source, OLEDB Data Destinations, and other Row transformations within the package. Examples of Row Transformations: Copy Column, Audit, Character Map.
Partially Blocking Transformation: these can re-use the buffer space allocated for an available Row transformation and get new buffer space allocated exclusively for the transformation; buffers can also be re-used by other partially blocking transformations within the package. Examples: Merge, Conditional Split, Multicast, Lookup, Import Column, Export Column.
Fully Blocking Transformation: it makes use of its own reserved buffer and will not share buffer space with other transformations or connection managers. Examples: Sort, Aggregate, Cache Transformation.

21. Difference between FileSystem and Sql Server Deployment
File System Deployment: we can save the package on a physical location on the hard drive or any shared folder with this option, and we should provide a fully qualified path to the stored package in the FileSystem option.
Sql Server Deployment: SSIS packages are stored in the sysssispackages table of the MSDB database.

22. How to migrate a Sql Server 2005 package to the 2008 version
1. If you open an SSIS 2005 project in BIDS 2008, it will automatically launch the SSIS package upgrade wizard.
2. In BIDS, right click the "SSIS Packages" folder of an SSIS project and select "Upgrade All Packages".
3. Run "ssisupgrade.exe" from the command line (default physical location: the C:\Program Files\Microsoft SQL Server\100\DTS\Bin folder).

23. Multicast vs Conditional Split; Bulk Insert Task
The Multicast Transformation is used to extract output from a single source and place it onto multiple destinations; each recipient will have the same number of records as the source. The Conditional Split Transformation divides the source data based on defined conditions (SSIS expressions), and rows that match no condition are put on the default output. The Bulk Insert Task is used to copy a large volume of data from a text file to a SQL Server destination.

24. Difference between Lookup and Fuzzy Lookup transformation

The Lookup Transformation finds exact matches only. The Fuzzy Lookup transformation matches the input table with the reference table using fuzzy matching: it finds the closest match and indicates the quality of the match.


25. Explain Slowly Changing Dimension
The SCD concept is basically about how data modifications are absorbed and maintained in a Dimension Table. The new (modified) record and the old record(s) are identified using some kind of flag, like say IsActive or IsDeleted, or using Start and End Date fields to indicate the validity of the record.
Types:
Type 1 - update the columns in the dimension row without preserving any change history.
Type 2 - preserve the change history in the dimension table and create a new row when there are changes.
Type 3 - some combination of Type 1 and Type 2, usually maintaining multiple instances of a column in the dimension row, e.g. a current value and one or more previous values.

26. Difference between Full Cache and Partial Cache
Full Cache: the default cache mode for Lookup. The entire reference set is pulled into memory; the database is queried once during the pre-execute phase of the data flow, and caching takes place before any rows are read from the data flow source. Lookup operations are very fast during execution, but this approach uses the most memory.
When to use Full cache mode: when you're accessing a large portion of your reference set; when you have a small reference table; when your database is remote or under heavy load and you want to reduce the number of queries sent to the server.
Partial Cache: the lookup cache starts off empty at the beginning of the data flow. When a new row comes in, the lookup transform checks its cache for the matching values. If no match is found, it queries the database; if the match is found at the database, the values are cached so they can be used the next time a matching row comes in.
When to use Partial cache mode: when you're processing a small number of rows and it's not worth the time to charge the full cache; when you have a large reference table; when your data flow is adding new rows to your reference table; when you want to limit the size of your reference set by modifying the query with parameters from the data flow.

27. Cache Transformation
Used to cache the data used in the Lookup transform.

28. Different types of File Enumerators

Foreach File: The File Enumerator enumerates files in a folder. For example, we can get all the files which have the *.txt extension in a Windows folder and its sub folders.
Foreach Item: The Item Enumerator enumerates the collections. For example, we can enumerate the names of executables and working directories that an "Execute Process" task uses.
Foreach ADO: The ADO Enumerator enumerates rows in a table. For example, we can get the rows in the ADO records.
Foreach ADO.NET Schema Rowset: The ADO.NET Enumerator enumerates schema information. For example, we can get the list of tables in a SQL Server database.
Foreach From Variable: The Variable Enumerator enumerates objects that specified variables contain. Here the enumerator objects are nothing but an array or data table; the variable must be of Object data type.
Foreach NodeList: The Node List Enumerator enumerates the result of an XPath expression.
Foreach SMO: The SMO Enumerator enumerates SQL Server Management Objects (SMO). For example, we can get the list of functions or views in a SQL Server database.

29. How to execute the package from .NET?
We need a reference to Microsoft.SqlServer.ManagedDts.dll to call a package:

    using Microsoft.SqlServer.Dts.Runtime;

    Application app = new Application();
    Package package = app.LoadPackage(
        @"C:\Program Files\Microsoft SQL Server\100\DTS\Packages\Integration Services Project2\Package.dtsx",
        null);
    DTSExecResult results = package.Execute();

30. What are containers? (For Loop, Sequence Container)
SSIS Containers are controls that provide structure to SSIS packages. Containers support repeating control flows in packages, and they group tasks and containers into meaningful units of work. Containers can include other containers in addition to tasks.
Foreach Loop Container - Runs a Control Flow repeatedly using an enumerator. Purpose: to repeat tasks for each element in a collection, for example retrieving files from a folder, running T-SQL statements that reside in multiple files, or running a command for multiple objects.
For Loop Container - Runs a Control Flow repeatedly by checking a conditional expression (same as a For Loop in a programming language). Purpose: to repeat tasks until a specified expression evaluates to false. For example, a package can send a different e-mail message seven times, one time for every day of the week.
Sequence Container - Groups tasks as well as containers into Control Flows that are subsets of the package Control Flow. Purpose: to group tasks and containers that must succeed or fail as a unit. For example, a package can group tasks that delete and add rows in a database table, and then commit or roll back all the tasks when one fails.

31. How to schedule a package (Role of Sql Server Agent)
In order for the job to run successfully, the SQL Server Agent should be running on the target machine. We can start the SQL Server Agent Service in numerous ways:
-Starting SQL Server Agent Service from the Command Line
-Starting SQL Server Agent Service from the Services.MSC console
-Starting SQL Server Agent Service using SQL Server Configuration Manager
-Starting SQL Server Agent Service using SQL Server Management Studio (SSMS)

32. What are precedence constraints?
A task will only execute if the condition that is set by the precedence constraint preceding the task is met. By using these constraints, the package will choose different execution paths depending on the success or failure of other tasks.
Success – Workflow will proceed when the preceding container executes successfully. Indicated in control flow by a solid green line.
Failure – Workflow will proceed when the preceding container's execution results in a failure. Indicated in control flow by a solid red line.
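As a sketch, a package can also be scheduled from T-SQL using the msdb job procedures; the job name, proxy name and package path below are assumptions for illustration:

```sql
USE msdb;
GO
-- Create the job and a single SSIS job step.
EXEC dbo.sp_add_job @job_name = N'Nightly ETL';
EXEC dbo.sp_add_jobstep
    @job_name   = N'Nightly ETL',
    @step_name  = N'Run package',
    @subsystem  = N'SSIS',
    @command    = N'/FILE "C:\SSIS\MyPackage.dtsx"',
    @proxy_name = N'SsisProxy';  -- proxy created from a credential beforehand
-- Attach the job to the local server so SQL Server Agent can run it.
EXEC dbo.sp_add_jobserver @job_name = N'Nightly ETL';
```

The @command value uses the same dtexec-style arguments described elsewhere in this article, and the proxy determines the Windows account under which the package runs.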

Completion – Workflow will proceed when the preceding container's execution completes, regardless of success or failure. Indicated in control flow by a solid blue line.
Expression/Constraint with Logical AND – Workflow will proceed when the specified expression and the logical constraint (success/failure/completion) evaluate to true. Indicated in control flow by a solid color line along with a small 'fx' icon next to it; the color of the line depends on the logical constraint chosen (e.g. success=green, completion=blue).
Expression/Constraint with Logical OR – Workflow will proceed when either the specified expression or the logical constraint (success/failure/completion) evaluates to true. Indicated in control flow by a dotted color line along with a small 'fx' icon next to it; the color of the line depends on the logical constraint chosen.

33. Performance Optimization in SSIS
1. Avoid SELECT *
The Data Flow Task uses a buffer oriented architecture for data transfer and transformation. When data transfers from Source to Destination, the data first comes into the buffer, the required transformations are done in the buffer itself, and the data is then written to the Destination. The size of a buffer depends on the estimated row size, which is equal to the maximum size of all columns in the row. So the more columns in a row, the fewer rows fit in a buffer. Hence select only those columns which are required at the destination. Even if we need all the columns from the source, we should name the columns explicitly in the SELECT statement; otherwise it takes another round trip for the source to gather metadata about the columns when using SELECT *.
2. Avoid Asynchronous Transformations (such as the Sort T/F) wherever possible. The Sort T/F requires all the incoming rows to have arrived before it starts processing. Instead of using a Sort T/F, we can get sorted rows from the data source using an ORDER BY clause.
3. Pulling High Volumes of Data
Drop all Non-Clustered Indexes, and the Clustered Index if it exists, transfer and load the data into the Destination Table, then re-create the Clustered Index and Non-Clustered Indexes.
4. Effect of OLEDB Destination Settings
There are a couple of settings with the OLEDB destination which can impact the performance of data transfer, as listed below.
Data Access Mode – This setting provides the 'fast load' option, which internally uses a BULK INSERT statement for uploading data into the destination table, instead of a simple INSERT statement (for each single row) as is the case for the other options. So unless you have a reason for changing it, don't change this default value of fast load. If you select the 'fast load' option, there are also a couple of other settings which you can use, as discussed below.
Keep Identity – By default this setting is unchecked, which means the destination table (if it has an identity column) will create identity values on its own. If you check this setting, the dataflow engine will ensure that the source identity values are preserved and the same values are inserted into the destination table.
Keep Nulls – Again, by default this setting is unchecked, which means a default value will be inserted (if a default constraint is defined on the target column) during insert into the destination table if a NULL value is coming from the source for that particular column. If you check this option, the default constraint on the destination table's column will be ignored and the preserved NULL of the source column will be inserted into the destination.
Table Lock – By default this setting is checked, and the recommendation is to let it be checked unless the same table is being used by some other process at the same time. It specifies that a table lock will be acquired on the destination table instead of acquiring multiple row level locks, which could turn into lock escalation problems.
Check Constraints – Again, by default this setting is checked, and the recommendation is to un-check it if you are sure that the incoming data is not going to violate constraints of the destination table. This setting specifies that the dataflow pipeline engine will validate the incoming data against the constraints of the target table. If you un-check this option it will improve the performance of the data load.
#5 - DefaultBufferMaxSize and DefaultBufferMaxRows
The execution tree creates buffers for storing incoming rows and performing transformations. The number of buffers created depends on how many rows fit into a buffer, which in turn depends on a few other factors. The first consideration is the estimated row size, which is the sum of the maximum sizes of all the columns from the incoming records. The second consideration is the DefaultBufferMaxSize property of the data flow task. This property specifies the default maximum size of a buffer. The default value is 10 MB, and its upper and lower boundaries are constrained by two internal properties of SSIS, MaxBufferSize (100 MB) and MinBufferSize (64 KB); it means the size of a buffer can be as small as 64 KB and as large as 100 MB. The third factor is DefaultBufferMaxRows, which is again a property of the data flow task and specifies the default number of rows in a buffer; its default value is 10000. If the estimated size exceeds DefaultBufferMaxSize, the engine reduces the number of rows in the buffer. For better buffer performance you can do two things. First, remove unwanted columns from the source and set the data type of each column appropriately, especially if your source is a flat file; this will enable you to accommodate as many rows as possible in the buffer. So it is recommended to set these values to an optimum value based on your environment.
#7 - Effect of Rows Per Batch and Maximum Insert Commit Size settings
Rows per batch: The default value for this setting is -1, which specifies that all incoming rows will be treated as a single batch. You can change this default behavior and break all incoming rows into multiple batches. The allowed value is only a positive integer, which specifies the maximum number of rows in a batch.
Maximum insert commit size: The default value for this setting is '2147483647' (the largest value for a 4 byte integer type), which specifies that all incoming rows will be committed once on successful completion. You can specify a positive value for this setting to indicate that a commit will be done for that number of records. Changing the default value will put overhead on the dataflow engine to commit several times, but at the same time it will release the pressure on the transaction log and tempdb to grow, specifically during high volume data transfers. For example, if you leave 'Max insert commit size' at its default, the transaction log and tempdb will keep on growing during the extraction process, and if you are transferring a high volume of data tempdb will soon run out of space; as a result your extraction will fail. The above two settings are very important to understand to improve the performance of tempdb and the transaction log.
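The interplay between DefaultBufferMaxSize and DefaultBufferMaxRows can be checked with quick arithmetic. A sketch, where the 1,200-byte estimated row size is an assumed example value:

```shell
# Assumed estimated row size in bytes (sum of max column sizes).
row_size=1200
# DefaultBufferMaxSize default: 10 MB.
max_buffer=$((10 * 1024 * 1024))
# Rows that fit in one buffer by size alone.
rows_per_buffer=$((max_buffer / row_size))
# Capped by DefaultBufferMaxRows (default 10000).
default_max_rows=10000
if [ "$rows_per_buffer" -lt "$default_max_rows" ]; then
  effective=$rows_per_buffer
else
  effective=$default_max_rows
fi
echo "$effective rows per buffer"
```

Here the size limit (about 8,738 rows) is reached before the row-count limit, so widening the rows or raising DefaultBufferMaxSize directly changes how many rows each buffer carries.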

Second, if your system has sufficient memory available, you can tune these properties to have a small number of large buffers, which could improve performance. Beware if you change the values of these properties to a point where page spooling (see Best Practices #8) begins, as that adversely impacts performance. So before you set a value for these properties, first test thoroughly in your environment and set the values appropriately.
#8 - How the DelayValidation property can help you
SSIS uses two types of validation. First is package validation (early validation), which validates the package and all its components before starting the execution of the package. Second, SSIS uses component validation (late validation), which validates the components of the package once execution has started. Consider a scenario where the first component of the package creates an object, e.g. a temporary table, which is referenced by the second component of the package. During package validation, the first component has not yet executed, so no object has been created, causing a package validation failure when validating the second component; SSIS will throw a validation exception and will not start the package execution. So how will you get this package running in this common scenario? To help you here, every component has a DelayValidation (default=FALSE) property. If you set it to TRUE, early validation will be skipped and the component will be validated only at the component level (late validation), which is during package execution.
9. Better performance with parallel execution
10. When to use events logging and when to avoid it
11. Monitoring the SSIS Performance with Performance Counters
Launch Performance Monitor:
1. Start -> All Programs -> Administrative Tools -> Performance
2. Load the SSIS related counters: in the Performance Object, select SQL Server:SSIS Pipeline and SQL Server:SSIS Service.
SSIS provides a set of performance counters. Among them, the following few are helpful when you tune or debug your package: Buffers in use, Flat buffers in use, Private buffers in use, Buffers spooled, Rows read, Rows written.
"Buffers in use", "Flat buffers in use" and "Private buffers in use" are useful to discover leaks. During package execution we will see these counters fluctuating, but once the package finishes execution their values should return to what they were before the execution; otherwise, buffers are leaked.
"Buffers spooled" has an initial value of 0. When it goes above 0, it indicates that the engine has started memory swapping.

Buffers Spooled: the number of buffers currently written to the disk. If the data flow engine runs low on physical memory, buffers not currently used are written to disk and then reloaded when needed. In a case like this, set the Data Flow Task properties BLOBTempStoragePath and BufferTempStoragePath appropriately for maximal I/O bandwidth.
"Rows read" and "Rows written" show how many rows the entire Data Flow has processed.
12. FastParse property
The Fast Parse option in SSIS can be used for very fast loading of flat file data. It will speed up parsing of integer, date and time types if the conversion does not have to be locale-sensitive. This option is set on a per-column basis using the Advanced Editor for the flat file source.

13. Checkpoint features help in package restarting

34. Events in SSIS
OnError: Runs when a task or container reports an error.
OnExecStatusChanged: Runs for all tasks and containers when the execution status changes to In Process, Success, or Failed.
OnInformation: Runs when SSIS outputs information messages during the validation and execution of a task or container.
OnPostExecute: Runs after a container or task successfully completes.
OnPostValidate: Executes after a container or task has successfully been validated.
OnPreExecute: Runs just before a container or task is executed.
OnPreValidate: Runs before the component is validated by the engine.
OnProgress: Executed when a progress message is sent by the SSIS engine, indicating tangible advancement of the task or container.
OnQueryCancel: Invoked when an Execute SQL Task is cancelled through manual intervention, such as stopping the package.
OnTaskFailed: Similar to OnError, but runs when a task fails rather than each time an error occurs.
OnVariableValueChanged: Runs when the value changes in a variable for which the RaiseChangeEvent property is set to True.
OnWarning: Runs when a task returns a warning event, such as a column not being used in a data flow.

35. Upgrade DTS package to SSIS
1. In BIDS, from the Project Menu, select 'Migrate DTS 2000 Package'.
2. In the Package Migration Wizard, choose the Source, the SQL Server 2000 Server Name and the Destination folder.
3. Select the list of packages that need to be upgraded to SSIS.
4. Specify the log file for the package migration.

36. Difference between Control Flow and Data Flow
1. A control flow consists of one or more tasks and containers that execute when the package runs. SSIS provides three different types of control flow elements: Containers that provide structures in packages, Tasks that provide functionality, and Precedence Constraints that connect the executables, containers, and tasks into an ordered control flow. We use precedence constraints to connect the tasks and containers in a package; tasks run in series if connected with precedence constraints, or in parallel. The success or failure of tasks affects how your control flow operates, but control flow does not move data from task to task.
2. A data flow consists of the sources and destinations that extract and load data, the transformations that modify and extend data, and the paths that link sources, transformations, and destinations. Data Sources, Transformations, and Data Destinations are the three important categories in the Data Flow. The Data Flow task is the executable within the SSIS package that creates, orders, and runs the data flow. Data flows move data: data is moved and manipulated through transformations and passed between each component in the data flow.

Q: How do you schedule an SSIS package?
Use a SQL Server Agent job: drill down to the SQL Server Agent node in Object Explorer, right-click the Jobs node, and select New Job from the popup menu. To run the job under a specific account, the first step is to create a credential (alternatively you could use an existing credential): navigate to Security, then Credentials, in SSMS Object Explorer and right-click to create a new credential; then navigate to SQL Server Agent, then Proxies, and right-click to create a new proxy.

Q: How to deploy packages from one server to another server?
1. In the SSIS Property Pages dialog box, select Build under the Configuration Properties node and, in the right-side panel, provide the folder location where you want the SSIS package to be deployed within the OutputPath. Click OK to save the changes in the property page.
2. Test the SSIS package execution by running the package from BIDS: in Solution Explorer, right-click the package and select the Execute Package option from the drop-down menu.
3. To copy the deployment bundle, locate the deployment bundle on the first server. If you used the default location, the deployment bundle is the Bin\Deployment folder. Right-click the Deployment folder and click Copy, then locate the public share to which you want to copy the folder on the target computer and click Paste.
4. On the destination computer, locate the deployment bundle and double-click the manifest file to start the Package Installation Wizard.

Q: How to execute a Stored Procedure from SSIS?
Use the Execute SQL Task.

Q: What are the different ways to execute an SSIS package?
1. Using the DTEXEC.EXE command-line utility one can execute an SSIS package that is stored in a File System, SQL Server, or an SSIS Package Store. The syntax to execute an SSIS package which is stored in a File System is:
DTEXEC.EXE /F "C:\BulkInsert\BulkInsertTask.dtsx"
2. Using the Execute Package Utility (DTEXECUI.EXE) graphical interface one can execute an SSIS package that is stored in a File System, SQL Server, or an SSIS Package Store. DTEXECUI provides a graphical user interface that can be used to specify the various options to be set when executing an SSIS package. You can launch DTEXECUI by double-clicking on an SSIS package file (.dtsx), or launch it from a Command Prompt and then specify the package to execute.
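As a small illustration of the first method, the dtexec invocation can be assembled programmatically before being handed to the shell. This Python sketch only builds and prints the command line shown above; actually running it would of course require a machine with Integration Services installed:

```python
import subprocess  # only needed if you actually launch the utility

# Build the DTEXEC call for a package stored in the file system.
cmd = ["DTEXEC.EXE", "/F", r"C:\BulkInsert\BulkInsertTask.dtsx"]
print(" ".join(cmd))   # DTEXEC.EXE /F C:\BulkInsert\BulkInsertTask.dtsx

# On a server with Integration Services installed you would run:
# result = subprocess.run(cmd, capture_output=True, text=True)
# and treat result.returncode == 0 as success.
```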

Q: How to deploy a package?
Right-click the solution in Solution Explorer and choose Properties in the menu, then build the solution. The Package Installation Wizard is started by double-clicking the deployment manifest file (Project1.SSISDeploymentManifest). Running the Package Installation Wizard:
1. On the Welcome page of the Package Installation Wizard, click Next.
2. On the Deploy SSIS Packages page, select either the File system or SQL Server deployment option, select the "Validate packages after installation" check box, and then click Next.
3. On the Specify Target SQL Server page, specify (local) in the Server name box. If the instance of SQL Server supports Windows Authentication, select Use Windows Authentication; otherwise, select Use SQL Server Authentication and provide a user name and a password.
4. Verify that the "Rely on server storage for encryption" check box is cleared, and then click Next.
5. On the Select Installation Folder page, click Browse, choose the folder, and then click Next.
6. On the Confirm Installation page, click Next.
7. The wizard installs the packages. After installation is completed, the Configure Packages page opens.


When the build/rebuild is successful, navigate to the directory referred to in DeploymentOutputPath. Deploying the package: double-click the manifest file to start the deployment. The Package Installation Wizard begins, and the Deploy SSIS Packages step is the first screen that is presented.

This screen lets you select where the packages shall be deployed, as mentioned in the dialog box. Deploying to SQL Server is more secure, since SQL Server stores the packages internally, compared to the File System, where additional security measures need to be taken to secure the physical files.



Q: What is a Data Viewer and what are the different types of Data Viewers in SSIS?
A Data Viewer allows viewing data at a point of time at runtime. It can also be used to find out the content of a dataset. The different types of data viewers are:
1. Grid
2. Histogram
3. Scatter Plot
4. Column Chart

Q: What is the use of the Term Extraction transformation in SSIS?
The Term Extraction transformation is used to extract nouns, noun phrases, or both nouns and noun phrases, from English text only. It extracts terms from text in a transformation input column and then writes the terms to a transformation output column.

Q: What is the use of the Percentage Sampling transformation in SSIS?
The Percentage Sampling transformation is generally used for data mining. This transformation builds a random sample set of output rows by choosing a specified percentage of input rows. For example, if the input has 1000 rows and you specify 10 as the percentage sample, the transformation returns roughly 10% of the input rows chosen at random.
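The sampling idea can be illustrated outside SSIS. The Python sketch below is an illustration, not SSIS code: it draws an exact-size sample, whereas the real transformation decides row by row, so its output size only approximates the requested percentage:

```python
import random

def percentage_sample(rows, percent, seed=None):
    """Return `percent`% of the input rows, chosen at random.

    `seed` plays the role of the transformation's sampling seed:
    the same seed reproduces the same sample.
    """
    rng = random.Random(seed)
    k = round(len(rows) * percent / 100)
    return rng.sample(rows, k)

rows = list(range(1000))
sample = percentage_sample(rows, 10, seed=42)
print(len(sample))   # 100, i.e. 10% of the input
```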

Q: Explain the Audit transformation.
It allows you to add auditing information to the data flow. The auditing options that you can add to transformed data through this transformation are:
1. Execution Instance GUID: ID of the execution instance of the package
2. PackageID: ID of the package
3. PackageName: name of the package
4. VersionID: GUID version of the package
5. ExecutionStartTime
6. MachineName
7. UserName
8. TaskName
9. TaskID: unique identifier of the data flow task that contains the audit transformation

Q: How do we convert a data type in SSIS?
The Data Conversion transformation in SSIS converts the data type of an input column to a different data type.

Q: What is the Ignore Failure option in SSIS?
With the Ignore Failure option, the error will be ignored and the data row will be directed to continue on to the next transformation. By contrast, if you have some junk data (wrong type of data) flowing from the source, the Redirect Row option lets us redirect the junk data records to another output instead of failing the package. This helps to move only valid data to the destination, while the junk can be captured into a separate file.
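The difference between failing, ignoring, and redirecting can be sketched in Python. `convert_rows` and its outputs are hypothetical stand-ins for the SSIS error-output settings, not real SSIS APIs:

```python
def convert_rows(rows, on_error="redirect"):
    """Mimic an SSIS transformation's error-disposition choices.

    Tries to convert each value to int (the 'transformation').
    on_error='ignore'   -> keep the row, pass the value through as NULL
    on_error='redirect' -> send the failing row to a separate error output
    on_error='fail'     -> stop processing, like failing the package
    """
    good, errors = [], []
    for value in rows:
        try:
            good.append(int(value))
        except ValueError:
            if on_error == "ignore":
                good.append(None)        # row continues with a NULL
            elif on_error == "redirect":
                errors.append(value)     # junk captured separately
            else:
                raise
    return good, errors

good, junk = convert_rows(["1", "2", "oops", "4"], on_error="redirect")
print(good)   # [1, 2, 4]
print(junk)   # ['oops']
```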

Q: How to provide security to packages?
We can provide security to packages in 2 ways:
1. Package encryption
2. Password protection

Q: What are the possible locations to save an SSIS package?
1. SQL Server: SSIS packages will be stored in the MSDB database, in the sysssispackages table.
2. File System: we can save the package at a physical location on a hard drive or any shared folder with this option, and we should provide a fully qualified path to the stored package.
3. SSIS Package Store: nothing but a combination of SQL Server and File System deployment, so don't take it as something different from the 2 types of package deployment. As you can see when you connect to SSIS through SSMS, it looks like a store which has categorized its contents (packages) into different folders based on its manager's (which is you, as the package developer) taste.

Q: What are the package ProtectionLevel options?
1. DontSaveSensitive: any sensitive information is simply not written out to the package XML file when you save the package.

2. ServerStorage: allows the package to retain all sensitive information when you are saving the package to SQL Server.
3. EncryptSensitiveWithUserKey: encrypts sensitive information based on the credentials of the user who created the package. It is the default value for the ProtectionLevel property.
4. EncryptSensitiveWithPassword: requires you to specify a password in the package, and this password will be used to encrypt and decrypt the sensitive information in the package.
5. EncryptAllWithPassword: allows you to encrypt the entire contents of the SSIS package with your specified password.
6. EncryptAllWithUserKey: allows you to encrypt the entire contents of the SSIS package by using the user key.
You can change the Protection Level of deployed packages by using the DTUTIL utility.

Q: How to track a variable in SSIS?
The OnVariableValueChanged event gets raised when the value of the variable is changed:
1. Set the "EvaluateAsExpression" property of the variable to True.
2. Set the "RaiseChangedEvent" property of the variable to True.
3. Create an event handler for the "OnVariableValueChanged" event for the container in which the variable is scoped.

Q: What is the FTP Task?
The FTP task downloads and uploads data files and manages directories on servers. For example, a package can download data files from a remote server before or after moving data. Use the FTP task for the following purposes:
1. Copying directories and data files from one directory to another, before or after moving data, and applying transformations to the data.
2. Downloading files from an FTP location and applying transformations to column data before loading the data into a database.
At run time, the FTP task connects to a server by using an FTP connection manager. The FTP connection manager includes the server settings, the credentials for accessing the FTP server, and options such as the time-out and the number of retries for connecting to the server. The FTP connection manager supports only anonymous authentication and basic authentication; it does not support Windows Authentication.
Predefined FTP operations: Send Files, Receive File, Create Local Directory, Create Remote Directory, Remove Local Directory, Remove Remote Directory, Delete Local Files, Delete Remote File.
Custom log entries available on the FTP Task: FTPConnectingToServer, FTPOperation.

Q: What are the new features in SSIS 2012?
1. GUI improvements:
- Sort packages by name
- Package visualization
- Zoom
- Data flow source/destination wizard
- Grouping in data flow
2. CDC (Change Data Capture) Task and Components:
- CDC is nothing but incremental load: it loads only the rows that have changed since the last load.
- CDC needs to keep track of which changes have already been processed; the CDC task does this by storing LSNs in a tracking table.
- The CDC source component reads from the CDC table function, based on the LSN it gets from the CDC task.
- The CDC transformation splits records into new rows, updated rows, and deleted rows.
3. Flat File Connection Manager changes:
- The Flat File connection manager now supports parsing files with embedded qualifiers.
- The connection manager also by default always checks for row delimiters, to enable the correct parsing of files with rows that are missing column fields.
- The Flat File source now supports a varying number of columns.
4. Offline Connection Managers: Integration Services now validates all connections before validating all of the data flow components when a package is opened, and sets any connections that are slow or unavailable to work offline. This helps to reduce the delay in validating the package data flow. After a package is opened, you can also turn off a connection by right-clicking the connection manager in the Connection Managers area and then clicking Work Offline. This can speed up operations in the SSIS Designer.
5. New functions/expressions in SSIS 2012:
- LEFT: you can now easily return the leftmost portion of a string rather than use the SUBSTRING function. The syntax is the same as we know in T-SQL: LEFT(character_expression, number)
- REPLACENULL: you can use this function to replace NULL values in the first argument with the expression specified in the second argument. This is equivalent to ISNULL in T-SQL: REPLACENULL(expression, expression)
- TOKEN: this function allows you to return a substring by using delimiters to separate a string into tokens and then specifying which occurrence to return: TOKEN(character_expression, delimiter_string, occurrence)
- TOKENCOUNT: this function uses delimiters to separate a string into tokens and then returns the count of tokens found within the string: TOKENCOUNT(character_expression, delimiter_string)
6. Easy column remapping in the data flow (mapping data flow columns):
- When modifying a data flow, column remapping is sometimes needed.
- SSIS 2012 maps columns by name instead of by id, and it also has an improved remapping dialog.
7. Shared Connection Managers: you can create connection managers at the project level that can be shared by multiple packages in the project. The connection manager you create at the project level is automatically visible in the Connection Managers tab of the SSIS Designer window for all packages. When converting shared connection managers back to regular (package) connection managers, they disappear from all other packages.

8. Scripting enhancements: the Script task and the Script component now support .NET 4. Breakpoints are now supported in the Script component.
9. Undo/Redo: one thing that annoyed users of SSIS before 2012 was the lack of support for Undo and Redo; once you performed an operation, you could not undo it. In SSIS 2012 we can see support for undo/redo.
10. ODBC Source and Destination:
- ODBC was not natively supported in SSIS 2008; SSIS 2008 could access ODBC via ADO.NET.
- SSIS 2012 has a native ODBC source & destination.
11. Reduced memory usage by the Merge and Merge Join transformations: the old SSIS Merge and Merge Join transformations, although helpful, used a lot of system resources and could be a memory hog. In 2012 these transformations are much more robust and reliable. Most importantly, they will not consume excessive memory when the multiple inputs produce data at uneven rates.

Q: What is the difference between the Script Task and the Script Component in SSIS?
Where configured: the Script task is configured on the Control Flow tab of the designer and runs outside the data flow of the package. The Script component is configured on the Data Flow page of the designer and represents a source, transformation, or destination in the Data Flow task.
Purpose: a Script task can accomplish almost any general-purpose task. With the Script component, you must specify whether you want to create a source, transformation, or destination.
Execution: a Script task runs its custom code at some point in the package workflow; unless you put it in a loop container or an event handler, it only runs once. A Script component also runs once, but typically it runs its main processing routine once for each row of data in the data flow.
Editor: the Script Task Editor has three pages: General, Script, and Expressions; only the ReadOnlyVariables, ReadWriteVariables, and ScriptLanguage properties directly affect the code that you can write. The Script Transformation Editor has up to four pages: Input Columns, Inputs and Outputs, Script, and Connection Managers; the metadata and properties that you configure on each of these pages determine the members of the base classes that are autogenerated for your use in coding.
Raising results: the Script task uses both the TaskResult property and the optional ExecutionValue property of the Dts object to notify the runtime of its results. The Script component runs as a part of the Data Flow task and does not report results using either of these properties.
Raising events: the Script task uses the Events property of the Dts object to raise events, for example:
Dts.Events.FireError(0, "Event Snippet", ex.Message & ControlChars.CrLf & ex.StackTrace, "", 0)
The Script component raises errors, warnings, and informational messages by using the methods of the IDTSComponentMetaData100 interface returned by the ComponentMetaData property, for example:
Dim myMetadata As IDTSComponentMetaData100
myMetaData = Me.ComponentMetaData
myMetaData.FireError(...)

Using variables: the Script task uses the Variables property of the Dts object to access variables that are available through the task's ReadOnlyVariables and ReadWriteVariables properties, for example:
string myVar;
myVar = Dts.Variables["MyStringVariable"].Value.ToString();
The Script component uses typed accessor properties of the autogenerated base class, created from the component's ReadOnlyVariables and ReadWriteVariables properties, for example:
string myVar;
myVar = this.Variables.MyStringVariable;
The PreExecute method can access only read-only variables; the PostExecute method can access both read-only and read/write variables.
Using connections: the Script task uses the Connections property of the Dts object to access connection managers defined in the package, for example:
string myFlatFileConnection;
myFlatFileConnection = (Dts.Connections["Test Flat File Connection"].AcquireConnection(Dts.Transaction) as String);
The Script component uses typed accessor properties of the autogenerated base class, created from the list of connection managers entered by the user on the Connection Managers page of the editor, for example:
IDTSConnectionManager100 connMgr;
connMgr = this.Connections.MyADONETConnection;
Interacting with the package: in the code written for a Script task, you use the Dts property to access other features of the package; the Dts property is a member of the ScriptMain class. In Script component code, you use typed accessor properties to access certain package features such as variables and connection managers.

Q: How to execute a package from the command line?
1. To execute an SSIS package saved to SQL Server using Windows Authentication:
dtexec /sq pkgOne /ser productionServer
2. To execute an SSIS package that is saved in the file system:
dtexec /f "c:\pkgOne.dtsx"
3. To execute an SSIS package saved to the File System folder in the SSIS Package Store:
dtexec /dts "\File System\MyPackage"
4. To execute an SSIS package that is saved in the file system and configured externally:
dtexec /f "c:\pkgOne.dtsx" /conf "c:\pkgOneConfig.cfg"

Q: What is the difference between DTS and SSIS?
DTS: limited number of transformations; limited error handling; message boxes in ActiveX scripts.
SSIS: more transformations; better error handling; message boxes in .NET scripting.

Q: How to unzip a file in SSIS?
Use the Execute Process Task in the Control Flow: from BIDS, drag and drop an "Execute Process Task" onto the control flow and configure it. In the Execute Process Task, perform the following configuration:
Executable: the path of the application that is being used (the unzip utility).
Arguments: the arguments needed to extract the zipped files.
Working Directory: the current directory for the process.
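The Execute Process Task itself just shells out to an external unzip utility. As a stand-in, this Python sketch creates a small zip file and extracts it, which is the effect the task would achieve; the file names and the 7-Zip path in the comment are illustrative only:

```python
import os
import tempfile
import zipfile

# Build a throwaway archive so the example is self-contained.
workdir = tempfile.mkdtemp()
archive = os.path.join(workdir, "data.zip")
with zipfile.ZipFile(archive, "w") as zf:
    zf.writestr("orders.csv", "id,amount\n1,10\n")

# The Execute Process Task would be configured roughly as:
#   Executable:        C:\Program Files\7-Zip\7z.exe
#   Arguments:         x data.zip
#   Working Directory: <workdir>
# Here we extract in-process instead.
with zipfile.ZipFile(archive) as zf:
    zf.extractall(workdir)

extracted = os.path.join(workdir, "orders.csv")
print(os.path.exists(extracted))   # True
```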

Q: Which service is required to start a job?
SQL Server Agent Service.

Q: Which services are installed during a SQL Server installation?
SSIS, SSAS, SSRS, SQL Server (MSSQLSERVER), SQL Server Agent Service, SQL Server Browser, SQL Full-Text.

Q: What are the differences between the OLEDB Destination, SQL Destination, and Bulk Insert?
1. The OLEDB destination uses the 'Fast Load' data access mode and loads the records in batches, whereas the SQL Server destination loads all the records in one go.
2. The SQL Server destination uses shared memory for maximum loading speed, but must execute on the same server as the database engine. Prefer the OLE DB destination, simply because it gives better flexibility on where you execute the package.
3. The Bulk Insert task uses the T-SQL BULK INSERT statement for speed when loading large amounts of data.

Q: How to run dynamic T-SQL in SSIS?
Option #1: using the Script Component as a Destination.
Option #2: using an Object variable and running the T-SQL with an Execute SQL Task.

Q: What are the package validation options?
1. Early Validation: validation takes place just before the package execution.
2. Delay Validation: validation takes place during the package execution.
3. Forced Execution.

Q: What is the Transfer Database Task?
It is used to move a database to another SQL Server instance or create a copy on the same instance (with a different database name). This task works in two modes: Offline and Online.
Offline: in this mode, the source database is detached from the source server after putting it in single-user mode, and copies of the .mdf, .ndf and .ldf files are moved to the specified network location. On the destination server the copies are taken from the network location, and then finally both databases are attached on the source and destination servers. This mode is faster, but a disadvantage of this mode is that the source database will not be available during the copy and move operation. Also, the person executing the package with this mode must be sysadmin on both source and destination instances.
Online: the task uses SMO to transfer the database objects to the destination server. In this mode, the database is online during the copy and move operation, but it will take longer as it has to copy each object from the database individually. Someone executing the package with this mode must be either sysadmin or database owner of the specified databases.

Q: What is the Transfer SQL Server Objects Task?
It is used to transfer one or more SQL Server objects to a different database, either on the same or another SQL Server instance. You can transfer tables, views, stored procedures, user-defined functions, users, roles, indexes, etc.

Q: What is the use of the Recordset Destination?
The Recordset destination does not save data to an external data source. Instead, it saves data in memory in a recordset that is stored in a variable of the Object data type. After the Recordset destination saves the data, we typically use a Foreach Loop container with the Foreach ADO enumerator to process one row of the recordset at a time. The Foreach ADO enumerator saves the value from each column of the current row into a separate package variable. Then, the tasks that you configure inside the Foreach Loop container read those values from the variables and perform some action with them.
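The Recordset-plus-Foreach pattern above can be sketched in plain Python; the "User::" variable names mirror SSIS conventions but are purely illustrative:

```python
# The Recordset destination keeps rows in memory (an Object variable);
# the Foreach ADO enumerator then walks the rows, copying each column
# into a package variable before the tasks inside the loop run.
recordset = [            # what the Object variable would hold
    ("Alice", 100),
    ("Bob", 250),
]

package_variables = {}
processed = []

for row in recordset:    # the Foreach ADO enumerator
    package_variables["User::Name"], package_variables["User::Amount"] = row
    # Tasks inside the Foreach Loop read the variables and act on them.
    processed.append(f"{package_variables['User::Name']}:{package_variables['User::Amount']}")

print(processed)   # ['Alice:100', 'Bob:250']
```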


Q: What is a breakpoint in SSIS?
A breakpoint allows you to pause the execution of the package in BIDS during development or when troubleshooting an SSIS package. You can right-click on a task in the control flow, click the 'Edit Breakpoints' menu item and, from the Set Breakpoints window, specify when you want execution to be halted/paused, for example on the OnPreExecute, OnPostExecute, or OnError events. You can even specify different conditions for hitting the breakpoint as well. To toggle a breakpoint, delete all breakpoints, or disable all breakpoints, go to the Debug menu and click the respective menu item.

Q: How to generate an auto-incremental number in an SSIS package?
A Script Component can be used for the designated task. The steps are as follows:
1. Drag and drop a Script Component into the Data Flow and select the Script Component type as Transformation.
2. Double-click the Script Component.
3. In the Input Columns tab, select the column which is to pass through the script component.
4. In the Inputs and Outputs tab, add an output column with an integer data type.

Q: What is the DisableEventHandlers property used for?
SSIS packages, tasks, and containers have a property called DisableEventHandlers. If you set this property to TRUE for a task or container, then all event handlers will be disabled for that task or container. If you set the property value to FALSE then the event handlers will once again be executed.

SSRS Interview Questions and Answers

Q: How do you implement a cascading parameter?
The list of values for one parameter depends on the value chosen in the preceding parameter. Eg: Country --> State --> City

Q: How to pass a parameter from the Report Viewer Control to a subreport?

Q: How to open another report in a new window from an existing report?
Use a little JavaScript with a customized URL in the "Jump to URL" option of the Navigation tab. Instead of using the "Jump to Report" option on the Navigation tab, use the "Jump to URL" option.
Non-parameterized solution: to get started, let's pop up a simple non-parameterized report. Open the expression screen (Fx button) and enter the following:
="javascript:void(window.open('http://servername/reportserver?%2fpathto%2freport&rs:Command=Render'))"

Parameterized solution: assume you have a field called ProductCode. Normally, you might hard-code the URL like this:
http://servername/reportserver?%2fpathto%2freport&rs:Command=Render&ProductCode=123
In this case, you want to pass the value dynamically, using an available value from the source dataset:
http://servername/reportserver?%2fpathto%2freport&rs:Command=Render&ProductCode=Fields!ProductCode.Value
The exact syntax in the "Jump to URL" (Fx) expression window will be:
="javascript:void(window.open('http://servername/reportserver?%2fpathto%2freport&rs:Command=Render&ProductCode="+Fields!ProductCode.Value+"'))"

Q: How to pass a parameter from a chart to a table in the same report?

Q: How to apply custom colors to a chart report?
STEP 1: Create your custom color palette in the report using custom code. Click Report => Report Properties => Code and copy the code below:

Private colorPalette As String() = {"Blue", "Red", "Teal", "Gold", "Green", "#A59D93", "#B8341B", "#352F26", "#F1E7D6", "#E16C56", "#CFBA9B"}
Private count As Integer = 0
Private mapping As New System.Collections.Hashtable()

Public Function GetColor(ByVal groupingValue As String) As String
    If mapping.ContainsKey(groupingValue) Then
        Return mapping(groupingValue)
    End If
    Dim c As String = colorPalette(count Mod colorPalette.Length)
    count = count + 1
    mapping.Add(groupingValue, c)
    Return c
End Function

STEP 2: In the pie chart, select Series Properties and select the Fill option from the left side. Now write the following expression in the Color expression:
=Code.GetColor(Fields!Year.Value)
Here Fields!Year.Value is a field of the dataset which is used as the chart category field.

Q: Can we have a table within a table in an SSRS report?
Yes. We can have nested tables.

Q: How to apply a stylesheet to SSRS reports?
Select Report -> Report Properties from the menu and then click the Code tab. Enter code such as the following:

Function StyleElement(ByVal Element As String) As String
    Select Case UCase(Element)
        Case "TABLE_HEADER_BG"
            Return "Red"
        Case "TABLE_FOOTER_BG"
            Return "Green"
        Case "TABLE_HEADER_TEXT"
            Return "White"
        Case Else
            Return "Black"
    End Select
End Function

Then apply the function to the style property of an element on the report, for example:
=Code.StyleElement("TABLE_HEADER_TEXT")
where TABLE_HEADER_TEXT is a value in the table. If you want to apply dynamic styles to the report, create tables in SQL Server and insert the style information into those tables.

Q: What is the difference between a Filter and a Parameter? Which one is better?
Parameters are applied at the database level: the data will be fetched based on parameters using a WHERE condition in the query. In the case of Filters, first the data will be fetched from the database, and then the filters are applied to the fetched data. Parameters are better than Filters in performance.

Q: What are the different types of filters?
The 2 types of filters in SSRS are:
1. Dataset Filter: filtering within the source query. When you implement a filter within the dataset, less data is sent from the source database server to the Report Server, which is usually a good thing.
2. Report Filter: this includes filtering after the source query has come back, on a data region (like the tablix) or on a data grouping.
To add a filter, we must specify a filter equation (expression); the data type of the filtered data and the value must match. Filters are applied at run time first on the dataset, then on the data region, and then on the group, in top-down order for group hierarchies. Using a Dataset Filter is the most efficient method.

Q: What is dynamic grouping in SSRS?
Dynamic grouping, and dynamic sorting, can be done using expressions.

Q: How can a report be optimized?
A report can be optimized in terms of grouping, filters, caching, snapshots and subscriptions. With caching, when the report is re-executed, the Report Server uses cached data rather than returning to the database server.
The total time to generate a report (RDL) can be divided into 3 elements: the time to retrieve the data (TimeDataRetrieval),

the time to process the report (TimeProcessing), and the time to render the report (TimeRendering).
Total time = TimeDataRetrieval + TimeProcessing + TimeRendering
These 3 performance components are logged every time a deployed report is executed. This information can be found in the table ExecutionLogStorage in the ReportServer database:

SELECT TOP 10 ItemPath,
    TimeDataRetrieval + TimeProcessing + TimeRendering AS [total time],
    TimeDataRetrieval, TimeProcessing, TimeRendering,
    ByteCount, [RowCount], Source, AdditionalInfo
FROM ExecutionLogStorage
ORDER BY TimeStart DESC

Tips for optimizing reports:
1. Use the SQL Profiler to measure the performance of all datasets (Reads, CPU and Duration). Use SQL Server Management Studio (SSMS) to analyze the execution plan of every dataset.
2. Use the SQL Profiler to see which queries are executed when the report is generated. Sometimes you will see more queries being executed than you expected, for example datasets for available parameter values. Every dataset in the report will be executed, and a lot of times new datasets are added during the building of reports; check if all datasets are still being used and remove all datasets which are not used anymore.
3. Avoid datasets with result sets with a lot of records, like more than 1000 records. Rendering of the report can take a while if the result set is very big; look very critically at whether such a big result set is necessary. If details are used in only 5% of the situations, create another report to display the details. This will avoid the retrieval of all details in 95% of the situations.
4. A lot of times data is grouped in the report without a drill-down option. In that scenario, do the GROUP BY already in your dataset. This will save a lot of data transfer to the SQL Server and it will save the reporting server engine from having to group the result set.
5. An ORDER BY in the dataset differs from an ORDER BY in the tablix/list. You need to decide where the data will be sorted: it can be done within SQL Server with an ORDER BY clause, or by the Reporting Server engine. It is not useful to do it in both. If an index is available, use the ORDER BY in your dataset.
6. Use only the required columns in the dataset. Sometimes a dataset contains more columns than are used in the tablix/list.

Q: In which scenario would you use a Matrix report?
Use a matrix to display aggregated data summaries, grouped in rows and columns, similar to a PivotTable or crosstab. The number of rows and columns for groups is determined by the number of unique values for each row and column group.

Q: I have a 'State' column in a report; how do I display in bold only the states whose name starts with the letter 'A' (e.g. Andhra Pradesh and Assam should be in bold)?
One way is an expression on the textbox's FontWeight property, for example: =IIf(Left(Fields!State.Value, 1) = "A", "Bold", "Normal")

Q: What is the image control in SSRS?
An image is a report item that contains a reference to an image that is stored on the report server, embedded within the report, or stored in a database.
Image Source: Embedded. Local report images are embedded in the report and then referenced. When you embed an image, Report Designer MIME-encodes the image and stores it as text in

the report definition. When to use: when the image is embedded locally within the report, or when you are required to store all images within the report definition.
Image Source: Database. If we add images that are stored in a database to a report, then such an image is known as a data-bound image. A data-bound image is displayed when you specify a dataset field that is bound to a database field that contains an image. Data-bound images can also be displayed from binary data (BLOB) stored in a database. When to use: when the image is stored in a database.
Image Source: External. When you use an external image in a report, the image source is set to External and the value for the image is the URL to the image. When to use: when images are stored in a File System, external file share, or web site.

Q: What is a Shared Dataset?
Shared datasets retrieve data from shared data sources that connect to external data sources. A shared dataset contains a query to provide a consistent set of data for multiple reports. The dataset query can include dataset parameters. Shared datasets use only shared data sources, not embedded data sources. The shared dataset definition includes a query, dataset parameters including default values, data options such as case sensitivity, and dataset filters.
To create a shared dataset, you must use an application that creates a shared dataset definition file (.rsd). You can use one of the following applications to create a shared dataset:
1. Report Designer in BIDS: create shared datasets under the Shared Datasets folder in Solution Explorer.
2. Report Builder: use shared dataset design mode and save the shared dataset to a report server or SharePoint site.
To publish a shared dataset, deploy it to a report server or SharePoint site. You can also upload a shared dataset definition (.rsd) file to the report server or SharePoint site; on a SharePoint site, an uploaded file is not validated against the schema until the shared dataset is cached or used in a report.

Q: How to upload a report to the report server?
In the Report Manager, we have an upload option to upload the reports.

Q: What is the role of Report Manager?
Deploying the reports onto the web server, delivering the reports through e-mail or file share using subscriptions, creating cached and snapshot reports, and providing security for the reports.

Q: How do you display partial text in bold format in a textbox in a report (e.g. "FirstName LastName", where FirstName should be in bold font and LastName in normal font)?
Use a Placeholder.

Q: How to keep headers visible when scrolling through a report?
1. Right-click the row, column, or corner handle of a tablix data region, and then click Tablix Properties.
2. On the General tab, under Row Headers or Column Headers, select "Header should remain visible while scrolling".
3. Click OK.

To keep a static tablix member (row or column) visible while scrolling:
1. On the design surface, click the row or column handle of the tablix data region to select it. The Grouping pane displays the row and column groups.
2. On the right side of the Grouping pane, click the down arrow, and then click Advanced Mode. The Row Groups pane displays the hierarchical static and dynamic members for the row groups hierarchy, and the Column Groups pane shows a similar display for the column groups hierarchy.
3. Click the static member (row or column) that you want to remain visible while scrolling. The Properties pane displays the Tablix Member properties.
4. In the Properties pane, set FixedData to True.

right-click the corner handle of the data region and then click Tablix Properties. Add a page break after:Select this option when you want to add a page break after the table. select one of the following options: Add a page break before:Select this option when you want to add a page break before the table. How to add Page Break 1. under Page break options. Fit table on one page if possible:Select this option when you want the data to stay on one page. 2. On the design surface. On the General tab. .20.

21. How to convert a PDF report from Portrait to Landscape format?
In Report Properties, set the width of the report to the landscape size of your A4 paper: 29.7 cm, and set the height of the report to 21 cm. To avoid extra blank pages during export, the size of the body should be less than or equal to the size of the report minus the margins:
Set the width of the body to 26.7 cm (29.7 - 1.5 - 1.5).
Set the height of the body to 18 cm (21 - 1.5 - 1.5).

22. Different ways of deploying reports
1. We can deploy the reports using the rs.exe tool.
2. From BIDS, using the report project's Property Pages deployment settings.
3. Using the Upload option in Report Manager.

23. Have you worked on any 3rd-party report tools?
There are a few third-party report tools, like Nevron and izenda.

24. Error handling in a report
Step 1: All the datasets of the report should contain one additional input parameter which passes a unique value for every request (for every click of the 'View Report' button) made by the user. This unique value should be passed to all the datasets on every click of the 'View Report' button.
Step 2: Implement TRY...CATCH blocks in all the stored procedures used by the report's datasets. The CATCH section of every procedure should have the provision to save the error details into a DB table.
Step 3: Add one more additional dataset with the name "ErrorInfo" which should call the stored procedure (USP_ERROR_INFO). If any error occurred while executing a dataset procedure, this dataset returns the error information available in the database table by matching records on the unique id which was passed as the input parameter.
Step 4: Enable the "Use Single Transaction When Processing Queries" option in the data source properties, which makes all the query executions go through a single transaction.
Step 5: After successful completion of all the above mentioned steps, insert a new table on the SSRS report with the custom error information, which will be shown to the report user if the user gets any error during execution of the report.

25. Can we export both a main report and its subreport to Excel?
Yes. A main report can contain a subreport, and the exported report contains both the main report and the subreport.
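The TRY...CATCH logging described in Steps 2 and 3 can be sketched in T-SQL. This is only a minimal illustration; the table and procedure names (dbo.ErrorLog, dbo.usp_GetSalesData, dbo.Sales) and their columns are hypothetical, not taken from the original text:

```sql
-- Hypothetical error-log table written to by every report procedure's CATCH block
CREATE TABLE dbo.ErrorLog (
    RequestId    UNIQUEIDENTIFIER,          -- the unique per-run value from Step 1
    ErrorNumber  INT,
    ErrorMessage NVARCHAR(4000),
    LoggedAt     DATETIME DEFAULT GETDATE()
);
GO
-- Hypothetical dataset procedure following the Step 2 pattern
CREATE PROCEDURE dbo.usp_GetSalesData
    @RequestId UNIQUEIDENTIFIER            -- passed from the report on every run
AS
BEGIN
    BEGIN TRY
        SELECT OrderId, Amount
        FROM   dbo.Sales;                  -- the actual dataset query
    END TRY
    BEGIN CATCH
        -- Save the error so the "ErrorInfo" dataset (Step 3) can read it back
        INSERT INTO dbo.ErrorLog (RequestId, ErrorNumber, ErrorMessage)
        VALUES (@RequestId, ERROR_NUMBER(), ERROR_MESSAGE());
    END CATCH
END
```

The "ErrorInfo" dataset would then simply select from dbo.ErrorLog filtered by the same @RequestId value.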

Deploying from BIDS:
1. In Solution Explorer, right-click the report project, and then click Properties.
2. In the Property Pages dialog box for the project, select a configuration to edit from the Configuration list. Common configurations are DebugLocal, Debug, and Release.
3. In StartItem, select a report to display in the preview window or in a browser window when the report project is run.
4. In the OverwriteDataSources list, select True to overwrite the shared data source on the server each time shared data sources are published, or select False to keep the data source on the server.
5. In the TargetDataSourceFolder text box, type the folder on the report server in which to place the published shared data sources. The default value for TargetDataSourceFolder is Data Sources. If you leave this value blank, the data sources will be published to the location specified in TargetReportFolder.
6. In the TargetReportFolder text box, type the folder on the report server in which to place the published reports. The default value for TargetReportFolder is the name of the report project.
7. In the TargetServerURL text box, type the URL of the target report server. Before you publish a report, you must set this property to a valid report server URL.
There are 2 options for deploying the reports that you create with Report Builder 3.0:
1. Report Manager
2. SharePoint document library

26. Difference between Cached Report and Snapshot Report
A Cached Report is a saved copy of a processed report. The first time a user clicks the link for a report configured to cache, the report execution process is similar to the on-demand process: it executes the query and produces the intermediate format. The intermediate format is cached and stored in the ReportServerTempDB database until the cache expiry time. If a user requests a different set of parameter values for a cached report, the report processor treats the request as a new report executing on demand, but flags it as a second cached instance.
A Report Snapshot contains the query and layout information retrieved at a specific point in time. Unlike a cached instance, the intermediate format of a snapshot has no expiration time, and it is stored in the ReportServer database.

27. Different types of subscriptions?
Subscriptions are used to deliver reports to either a File Share or Email in response to a report-level or server-level schedule. There are 2 types of subscriptions:
1. Standard Subscription: static properties are set for report delivery.
2. Data Driven Subscription: dynamic runtime properties are set for the subscription.

28. SSRS Architecture

29. How to deploy reports from one server to another server

30. Different life cycles of a report
1. Report authoring: This stage involves creation of reports that are published using the Report Definition Language (RDL). RDL is an XML-based industry standard for defining reports. Report Designer is a full-featured report authoring tool that runs in Business Intelligence Development Studio; Report Builder is the other authoring tool.
2. Report management: This involves managing the published reports as a part of the web service. The reports are cached for consistency and performance. They can be executed whenever demanded, or can be scheduled and executed. In short, report management includes:
- Organizing reports and data sources
- Scheduling report execution and delivery
- Tracking reporting history
Report security: It is important to protect reports as well as the report resources. Reporting Services implements a flexible, role-based security model.
3. Report delivery: Reports can be delivered to the consumers either on their demand or based on an event; they can then view them in a web-based format.
- Web-based delivery via the Report Manager web site
- Subscriptions allow for automated report delivery
- URL access, Web services and the Report Viewer control

31. Different types of reports
Linked report: A linked report is derived from an existing report and retains the original's report definition. A linked report always inherits the report layout and data source properties of the original report. All other properties and settings can be different from those of the original report, including security, parameters, location, subscriptions, and schedules.
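Since RDL is an XML-based standard, it can help to see a trimmed report definition fragment showing the <Query> element that the Report Server uses to connect to the data source. This is a hand-written sketch: the element names follow the 2008 RDL namespace, while the dataset, data source, field, and table names are made up for illustration:

```xml
<Report xmlns="http://schemas.microsoft.com/sqlserver/reporting/2008/01/reportdefinition">
  <DataSets>
    <DataSet Name="SalesDS">
      <!-- The Report Server executes this; a ReportViewer control in local
           mode ignores it and expects the host application to supply data -->
      <Query>
        <DataSourceName>MainDataSource</DataSourceName>
        <CommandText>SELECT OrderId, Amount FROM dbo.Sales</CommandText>
      </Query>
      <Fields>
        <Field Name="OrderId">
          <DataField>OrderId</DataField>
        </Field>
      </Fields>
    </DataSet>
  </DataSets>
</Report>
```

This is the structural difference behind question 33 below: in an .rdlc file the <Query> element is optional.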

Cached report: A cached report is a saved copy of a processed report. Cached reports are used to improve performance by reducing the number of processing requests to the report processor and by reducing the time required to retrieve large reports. They have a mandatory expiration period, usually in minutes.
Snapshot report: A report snapshot contains layout information and query results that were retrieved at a specific point in time. Report snapshots are processed on a schedule and then saved to a report server.
Ad hoc report: Ad hoc reporting allows the end users to design and create reports on their own, provided the data models.
Subreport: A subreport is a report that displays another report inside the body of a main report. The subreport can use different data sources than the main report.
Drill Through report: navigation from one report to another report.
Drill Down report: navigating from the summary level to the detail level in the same report.

32. Difference between Sorting and Interactive Sorting?
To control the sort order of data in a report, you must set the sort expression on the data region or group; the user does not have control over sorting. You can provide control to the user by adding Interactive Sort buttons to toggle between ascending and descending order for rows in a table, or for rows and columns in a matrix. The most common use of interactive sort is to add a sort button to every column header; the user can then choose which column to sort by.

33. Difference between RDL and RDLC?
RDL files are created for SQL Server Reporting Services; .RDLC files are for the Visual Studio ReportViewer component. The <Query> element of RDL contains the query or command and is used by the Report Server to connect to the data sources of the report. The <Query> element is optional in an RDLC file: it is ignored by the ReportViewer control, because the control does not perform any data processing in local processing mode but uses data that the host application supplies.

34. How to get the data for Report Model reports?
From the data source view.

35. Explain the Report Model steps.
Use the Model Designer tool to design report models, and then use Report Builder to generate reports from them. There are 3 components: Report Builder, Report Model, and Model Designer. Report Builder is a Windows WinForm application for end users to build ad hoc reports with the help of report models.
1. Create the report model project: select "Report Model Project" in the Templates list. A report model project contains the definition of the data source (.ds file), the definition of a data source view (.dsv file), and the report model (.smdl file).
2. Define a data source for the report model.
3. Define a data source view for the report model. A data source view is a logical data model based on one or more data sources. SQL Server Reporting Services generates the report model from the data source view.
4. Define a report model.
5. Publish the report model to the report server.

36. When to use Table, Matrix and List
1. Use a Table to display detail data, organize the data in row groups, or both.
2. Use a Matrix to display aggregated data summaries, grouped in rows and columns, similar to a PivotTable or crosstab. The number of rows and columns for groups is determined by the number of unique values for each row and column group.
3. Use a List to create a free-form layout. You are not limited to a grid layout, but can place fields freely inside the list. You can use a list to design a form for displaying many dataset fields, or as a container to display multiple data regions side by side for grouped data. For example, you can define a group for a list, add a table, chart, and image, and display values in table and graphic form for each group value.

37. Difference between Table report and Matrix report
A Table report has a fixed number of columns and dynamic rows. A Matrix report has dynamic rows and dynamic columns.

38. Difference between a report and an ad hoc report
Ad hoc reporting allows the end users to design and create reports on their own, provided the data models. An ad hoc report is created from an existing report model using Report Builder.

39. What is Report Builder?
A Windows WinForm application for end users to build ad hoc reports with the help of report models.

40. Report Server configuration files
1. RSReportServer.config: stores configuration settings for feature areas of the Report Server service: Report Manager, the Report Server Web service, and background processing.
2. RSSrvPolicy.config: stores the code access security policies for the server extensions.
3. RSMgrPolicy.config: stores the code access security policies for Report Manager.
4. Web.config for the Report Server Web service: includes only those settings that are required for ASP.NET.
5. Web.config for Report Manager: includes only those settings that are required for ASP.NET.
6. ReportingServicesService.exe.config
7. Registry settings
8. RSReportDesigner.config
9. RSPreviewPolicy.config

41. How do you secure a report?
1. IIS security controls access to the report server virtual directory and Report Manager.
2. Authorization is provided through a role-based security model that is specific to Reporting Services.

42. Different types of roles provided by SSRS
- Browsers
- Content Manager
- My Reports
- Publishers
- Report Builder

43. How to combine datasets in SSRS (one dataset gets data from Oracle and the other dataset from SQL Server)?
Using the Lookup function, we can combine 2 datasets in SSRS.

=Lookup(Fields!ProductID.Value, Fields!ID.Value, Fields!Name.Value, "Product")
In the above expression, assume that a table is bound to a dataset that includes a field for the product identifier ProductID. A separate dataset called "Product" contains the corresponding product identifier ID and the product name Name. Lookup compares the value of ProductID to ID in each row of the dataset called "Product" and, when a match is found, returns the value of the Name field for that row.

44. Steps to repeat table headers in SSRS 2008
1. Select the table.
2. At the bottom of the screen, select the dropdown arrow beside Column Groups and enable "Advanced Mode" by clicking on it.
3. Under Row Groups, select the static row and choose Properties / press F4.
4. Set the following attributes: RepeatOnNewPage = True for repeating headers; KeepWithGroup = After; FixedData = True for keeping the headers visible.

45. Difference between Report Server and Report Manager
Report Server handles authentication, data processing, rendering and delivery operations. Report Manager is the web-based application included with Reporting Services that handles all aspects of managing reports (deploying data sources and reports, caching a report, subscriptions, snapshots). The configuration settings of Report Manager and the Report Server Web service are stored in a single configuration file (rsreportserver.config).

46. Report extensions? Parent grouping and child grouping in SSRS?

47. How to show a "No Data Found" message to the end user?
Add a text box with the expression =IIF(Count(<ID Field>, "DataSet")=0, "No Data Returned", Nothing), and set the Hidden property of this text box as =IIF(Count(<ID Field>, "DataSet")=0, False, True).

48. What is 'Use single transaction when processing the queries' in the data source? Dataset execution order?
By default, datasets are executed in parallel, and a new connection is made to the database for every single dataset. This option is used to reduce the number of open connections to the database. For example, if you have a report with 3 datasets and you don't have this option checked, a new connection is made for every single dataset; if you have it checked, only one connection is opened to the database, all the datasets return their data, and the connection is closed. This can be used to reduce network traffic and potentially increase performance.
However, once the option is selected, datasets that use the same data source are no longer executed in parallel. They are also executed as a transaction: if any of the queries fails to execute, the entire transaction is rolled back. The order of dataset execution is determined by the top-down order of the datasets' appearance in the RDL file, which also corresponds to the order shown in report designer.
To enable it, open the data source dialog in report designer and select the "Use Single Transaction when processing the queries" check box.

49. ReportServer and ReportServerTempDB databases
ReportServer: hosts the report catalog and metadata. For example, it keeps the catalog items in the Catalog table and the data source information in the DataSource table of the ReportServer database.
ReportServerTempDB: used by Reporting Services for caching purposes. For example, once a report is executed, the Report Server saves a copy of the report in the ReportServerTempDB database.
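As an illustration of the catalog tables mentioned above, ad-hoc queries like the following can be run against the ReportServer database. Note this is for exploration only: querying the Report Server system tables directly is unsupported, the schema can differ between versions, and the column names and Type values shown here are assumptions based on common SSRS installations:

```sql
-- List catalog items (reports, folders, data sources) from the report catalog.
-- In many SSRS versions Type 2 = report and Type 5 = shared data source,
-- but treat these values as assumptions for your version.
SELECT Name, Path, Type
FROM   ReportServer.dbo.Catalog;

-- Shared data source metadata kept alongside the catalog
SELECT Name
FROM   ReportServer.dbo.DataSource;
```

In production, prefer the documented ReportService web service or Report Manager over direct table access.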

SSAS Interview Questions and Answers

1. Star vs Snowflake schema and dimensional modeling
Star Schema: one fact table surrounded by a number of dimension tables. It is a de-normalized form: a dimension table will not have any parent table, and hierarchies in the dimension are stored in the dimension itself.
Snowflake Schema: the normalized form of a star schema. Dimension tables can be further broken down into sub-dimensions, so a dimension table will have one or more parent tables. Hierarchies are broken into separate tables in a snowflake schema; these hierarchies help to drill down the data from the top hierarchy to the lowermost hierarchy. It increases the number of joins and gives poorer performance in retrieval of data.

2. Data storage modes - MOLAP, ROLAP, HOLAP
In MOLAP, the structure of the aggregations along with the data values is stored in multidimensional format. It takes more space with less time for data analysis compared to ROLAP: MOLAP offers faster query response and processing times, but it requires large storage space. This storage mode also leads to duplication of data, as the detail data is present in both the relational and the multidimensional storage. MOLAP is the default storage setting.
In ROLAP, the structure of the aggregations along with the values is stored in two-dimensional relational format at disc level. ROLAP offers low latency, but slower processing and query response times.
In HOLAP, the structure is stored in the relational model and the data is stored in the multidimensional model, which provides optimal usage of storage space along with good query response time and low latency.

3. Types of Dimensions
Dimension type - Description
Regular - A dimension whose type has not been set to a special dimension type.
Time - A dimension whose attributes represent time periods, such as years, semesters, quarters, months, and days.
Organization - A dimension whose attributes represent organizational information, such as employees or subsidiaries.
Geography - A dimension whose attributes represent geographic information, such as cities or postal codes.
BillOfMaterials - A dimension whose attributes represent inventory or manufacturing information, such as parts lists for products.
Accounts - A dimension whose attributes represent a chart of accounts for financial reporting purposes.
Customers - A dimension whose attributes represent customer or contact information.
Products - A dimension whose attributes represent product information.
Scenario - A dimension whose attributes represent planning or strategic analysis information.
Quantitative - A dimension whose attributes represent quantitative information.
Utility - A dimension whose attributes represent miscellaneous information.
Currency - This type of dimension contains currency data and metadata.
Rates - A dimension whose attributes represent currency rate information.
Channel - A dimension whose attributes represent channel information.
Promotion - A dimension whose attributes represent marketing promotion information.

4. Types of Measures
Fully Additive Facts: facts which can be added across all the associated dimensions. For example, sales amount is a fact which can be summed across different dimensions like customer, geography, date, product, and so on.
Semi-Additive Facts: facts which can be added across only a few dimensions rather than all dimensions. For example, bank balance is a fact which can be summed across the customer dimension (i.e. the total balance of all the customers in a bank at the end of a particular quarter). However, the same fact cannot be added across the date dimension (i.e. if the total balance at the end of quarter 1 is $X million and $Y million at the end of quarter 2, then the total balance at the end of quarter 2 is only $Y million and not $X+$Y).
Non-Additive Facts: facts which cannot be added across any of the dimensions in the cube. For example, profit margin is a fact which cannot be added across any of the dimensions: if your profit margin is 10% on Day 1 and 10% on Day 2, your net profit margin at the end of Day 2 is still 10% and not 20%. Similarly, if product P1 has a 10% profit margin and product P2 has a 10% profit margin, your net profit margin is still 10% and not 20%; we cannot add profit margins across product dimensions.
Derived Facts: facts which are calculated from one or more base facts, often by applying additional criteria, e.g. profit margin. Often these are not stored in the cube and are calculated on the fly at the time of accessing them.
Factless Facts: a factless fact table is one which only has references (foreign keys) to the dimensions and does not contain any measures. These types of fact tables are often used to capture events (valid transactions without a net change in a measure value). For example, a balance enquiry at an automated teller machine (ATM): though there is no change in the account balance, this transaction is still important for analysis purposes.
Textual Facts: textual facts refer to the textual data present in the fact table, such as flags (e.g. a status flag) and codes (e.g. product codes), which is not measurable (non-additive) but is important for analysis purposes.

5. Types of relationships between dimensions and measure groups
Regular: the dimension table is joined directly to the fact table.
Referenced: the dimension table is joined to an intermediate dimension table, which in turn is joined to the fact table.
Fact: the dimension table is the fact table.
Many to many: the dimension table is joined to an intermediate fact table, which in turn is joined to an intermediate dimension table to which the fact table is joined.
Data mining: the target dimension is based on a mining model built from the source dimension. The source dimension must also be included in the cube.
No relationship: the dimension and measure group are not related.

6. Proactive caching
Proactive caching can be configured to refresh the cache (MOLAP cache) either on a pre-defined schedule or in response to an event (a change in the data) from the underlying relational database. Proactive caching settings also determine whether the data is queried from the underlying relational database (ROLAP) or is read from the outdated MOLAP cache while the MOLAP cache is rebuilt. Proactive caching helps in minimizing latency and achieving high performance: it enables a cube to reflect the most recent data present in the underlying database by automatically refreshing the cube based on the predefined settings.
Lazy aggregations: when we reprocess an SSAS cube, it brings new/changed relational data into the cube by reprocessing dimensions and measures. Partition indexes and aggregations might be dropped due to changes in the related dimension data, so aggregations and partition indexes need to be reprocessed, which can take more time. If you want to bring the cube online sooner, without waiting for the rebuilding of partition indexes and aggregations, the lazy processing option can be chosen: it brings the SSAS cube online as soon as the dimensions and measures are processed, and partition indexes and aggregations are built later as a background job. The lazy processing option can be changed by the server-level property "OLAP\LazyProcessing\Enabled".
Advantage: lazy processing saves processing time, as the cube is available as soon as the measure and dimension data is ready.
Disadvantage: users will see a performance hit while the aggregations are being built in the background.

7. Partition processing options
Process Default: SSAS dynamically chooses from one of the following process options.
Process Full: drops all object stores and rebuilds the objects. This option is used when a structural change has been made to an object, for example, when an attribute hierarchy is added, deleted, or renamed.
Process Data: processes data only, without building aggregations or indexes. If there is data in the partitions, it will be dropped before re-populating the partition with source data.
Process Index: creates or rebuilds indexes and aggregations for all processed partitions. For unprocessed objects, this option generates an error.
Process Add: for dimensions, adds new members and updates dimension attribute captions and descriptions.
Process Update: forces a re-read of data and an update of dimension attributes. Flexible aggregations and indexes on related partitions will be dropped.
Process Clear (Unprocess): drops the data in the specified object and any lower-level constituent objects. After the data is dropped, it is not reloaded.
Process Structure: drops the data and performs Process Default on all dimensions.
Process Clear Structure: removes all training data from a mining structure.

8. Difference between attribute hierarchy and user hierarchy
An Attribute Hierarchy is created by SSAS for every attribute in a dimension by default. A User Defined Hierarchy is defined explicitly by the user/developer and often contains multiple levels; for example, a Calendar hierarchy contains Year, Quarter, Month, and Date as its levels. Some of the highlights/differences:
1. Every attribute in a dimension has an Attribute Hierarchy, whereas User Defined Hierarchies have to be explicitly defined by the user/developer.
2. Every dimension has at least one Attribute Hierarchy by default, whereas every dimension does not necessarily contain a User Defined Hierarchy; a dimension can contain zero, one, or more User Defined Hierarchies.
3. Attribute Hierarchies are always two-level (an "All" level and a "Detail" level, which is nothing but the dimension members, unless the All level is suppressed), whereas User Defined Hierarchies are often multi-level.
4. Attribute Hierarchies can be enabled or disabled. Disable the Attribute Hierarchy for those attributes which are commonly not used to slice and dice the data during analysis, like Address, Phone Number, and Unit Price. Doing this improves the cube processing performance and also reduces the size of the cube, as those attributes are not considered for performing aggregations.
5. Attribute Hierarchies can be made visible or hidden. When an Attribute Hierarchy is hidden, it is not visible to the client application while browsing the dimension/cube. Hide the Attribute Hierarchies for those attributes which are part of User Defined Hierarchies, like Day, Month, Quarter, and Year, which are part of the Calendar hierarchy, since each attribute is available to the end users through the User Defined Hierarchy; this helps eliminate confusion/redundancy for end users.

9. Difference between database dimension and cube dimension
When you create a dimension using the dimension wizard in BIDS, you're creating a database dimension in your AS database. When you build a cube and add dimensions to that cube, you create cube dimensions: cube dimensions are instances of a database dimension inside a cube. A database dimension exists only once, whereas cube dimensions can be created more than once using the role-playing dimension concept. A database dimension can be used in multiple cubes, and multiple cube dimensions can be based on a single database dimension. Database dimensions are independent of cubes and can be processed on their own. The database dimension has only Name and ID properties, whereas a cube dimension has several more properties.

10. Importance of the CALCULATE keyword in an MDX script, data pass, and limiting cube space

11. Effect of materialize
When setting up a dimension with a Reference relationship type, we have the option of "materializing" the dimension. Select it to store, in the MOLAP structure, the attribute member in the intermediate dimension that links the attribute in the reference dimension to the fact table. This improves query performance, but increases the processing time and storage space. If the option is not selected, only the relationship between the fact records and the intermediate dimension is stored in the cube. This means that Analysis Services has to derive the aggregated values for the members of the referenced dimension when a query is executed, resulting in slower query performance.

12. Dimension, Hierarchy, Level, and Members
Dimensions in Analysis Services contain attributes that correspond to columns in dimension tables. These attributes appear as attribute hierarchies and can be organized into user-defined hierarchies, or can be defined as parent-child hierarchies based on columns in the underlying dimension table. Hierarchies are used to organize the measures that are contained in a cube. A hierarchy is the relation between attributes in a dimension; a level refers to an individual attribute within the hierarchy.

13. Partition processing and the Aggregation Usage Wizard

14. Perspectives, Translations

15. Linked Object Wizard
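The partition processing options described in question 7 are commonly issued as XMLA commands, for example from an SSMS XMLA query window or a scheduled SQL Agent job. A minimal sketch of a Process Full command, where the database and cube IDs (AdventureWorksDW, SalesCube) are placeholder values for your own solution:

```xml
<!-- Fully reprocess one cube; swap <Type> for ProcessData, ProcessIndex,
     ProcessUpdate, etc. to use the other options listed above -->
<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Process>
    <Object>
      <DatabaseID>AdventureWorksDW</DatabaseID>
      <CubeID>SalesCube</CubeID>
    </Object>
    <Type>ProcessFull</Type>
  </Process>
</Batch>
```

Several <Process> elements can be placed in one <Batch> so that related dimensions and partitions are processed together in a single transaction.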

16. Role playing dimensions, conformed dimensions, junk dimensions, degenerate dimensions, SCD and other types of dimensions
Role-Playing Dimension: a dimension which is connected to the same fact table multiple times using different foreign keys. E.g. consider a Time dimension which is joined to the same fact table (say FactSales) multiple times, each time using a different foreign key in the fact table, like Order Date, Ship Date, Due Date, Delivery Date, etc.
Steps: In Cube Designer, click the Dimension Usage tab. Either click the 'Add Cube Dimension' button, or right-click anywhere on the work surface and then click Add Cube Dimension. In the Add Cube Dimension dialog box, select the dimension that you want to add, and then click OK.
Conformed Dimension: a dimension which connects to multiple fact tables across one or more data marts (cubes). Conformed dimensions have exactly the same structure, attributes, values (dimension members), meaning and definition, irrespective of which fact table they are connected to. Example: a Date dimension with exactly the same set of attributes, same members and same meaning no matter which fact table it joins.
Linked Dimension: a linked dimension is based on a dimension that is stored in a separate Analysis Services database, which may or may not be on the same server. You can create and maintain a dimension in just one database, and then reuse that dimension by creating linked dimensions for use in multiple databases. Some highlights of a linked dimension:
- For an end user, a linked dimension appears like any other dimension.
- More than one linked dimension can be created from a single database dimension.
- These can be used to implement the concept of conformed dimensions.
- Linked dimensions can be used when the exact same dimension can be used across multiple cubes within an organization, like a Time dimension, Geography dimension, etc.
Degenerate Dimension: a dimension which is derived out of a fact table; it appears to the end user as a separate/distinct dimension, but its data is actually stored in the fact table. It is a dimension table which does not have an underlying physical table of its own. Degenerate dimensions are commonly used when the fact table contains/represents transactional data, like order details, where each order has an order number associated with it, which forms the unique value in the degenerate dimension.
Junk Dimension: often a collection of unrelated attributes like indicators, flags, codes, etc. It is also called a garbage dimension. Junk dimensions are usually small in size. One of the common scenarios is when a fact table contains a lot of attributes which are like indicators or flags; using a junk dimension, such attributes can be removed/cleaned up from the fact table.

17. Handling late arriving dimensions / early arriving facts

SCD: The Slowly Changing Dimension (SCD) concept is basically about how data modifications are absorbed and maintained in a dimension table. The new (modified) record and the old record(s) are identified using some kind of a flag, like say IsActive or IsDeleted, or by using Start and End Date fields to indicate the validity of the record.

18. Parent-Child Hierarchy, NamingTemplate property, MemberWithLeafLevelData property

19. How will you keep a measure in the cube without showing it to the user?

20. How to pass a parameter in MDX
Here is an example. Suppose this is your query:

select {[Measures].[Internet Order Quantity]} on 0,
[Product].[Category].[Category] on 1
from [Adventure Works]
where [Date].[Calendar Year].&[2002]

You can modify it like this:

select {[Measures].[Internet Order Quantity]} on 0,
[Product].[Category].[Category] on 1
from [Adventure Works]
where strtomember(@P1)

Now, if you pass the value [Date].[Calendar Year].&[2002] to P1, it will run just like:

where [Date].[Calendar Year].&[2002]

STRTOMEMBER returns a member; STRTOSET returns a set.

21. SCOPE statement, THIS keyword, SUBCUBE

22. CASE (CASE, WHEN, THEN, ELSE, END) statement, IF THEN END IF, IS keyword, HAVING clause

23. Dimension security vs Cell security

24. CELL CALCULATION and CONDITION clause

25. RECURSION and FREEZE statement

26. Common types of errors encountered while processing a dimension / measure groups / cube

27. Logging and monitoring MDX scripts and cube performance
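The flag-based SCD handling described above (an IsActive flag plus Start/End dates) can be sketched as a Type 2 load in T-SQL. This is a simplified illustration: the table names (dbo.DimCustomer, staging.Customer) and the single tracked attribute (City) are hypothetical, and a real load would usually handle multiple attributes and surrogate keys:

```sql
-- Step 1: expire the current row when a tracked attribute has changed
UPDATE d
SET    d.IsActive = 0,
       d.EndDate  = GETDATE()
FROM   dbo.DimCustomer AS d
JOIN   staging.Customer AS s
       ON s.CustomerId = d.CustomerId
WHERE  d.IsActive = 1
  AND  d.City <> s.City;                 -- the attribute being tracked

-- Step 2: insert the new version of each customer as the active row
INSERT INTO dbo.DimCustomer (CustomerId, City, StartDate, EndDate, IsActive)
SELECT s.CustomerId, s.City, GETDATE(), NULL, 1
FROM   staging.Customer AS s
WHERE  NOT EXISTS (SELECT 1
                   FROM   dbo.DimCustomer AS d
                   WHERE  d.CustomerId = s.CustomerId
                     AND  d.IsActive = 1);
```

A Type 1 change, by contrast, would simply UPDATE the attribute in place without keeping history.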

28. What do you understand by attribute relationship? What are the main advantages of using attribute relationships?
An Attribute Relationship is a relationship between various attributes within a Dimension. By default, every attribute in a Dimension is related to the Key Attribute. There are basically 2 types of Attribute Relationships: Rigid and Flexible.
29. What do you understand by rigid and flexible relationships? Which one is better from a performance perspective?
Rigid: an Attribute Relationship should be set to Rigid when the relationship between those attributes is not going to change over time. For example, the relationship between a Month and a Date is Rigid, since a particular Date always belongs to a particular Month: 1st Feb 2012 always belongs to the Feb month of 2012.
Flexible: an Attribute Relationship should be set to Flexible when the relationship between those attributes is going to change over time. For example, the relationship between an Employee and a Manager is Flexible, since a particular Employee might work under one manager during this year (time period) and under a different manager during the next year (another time period).
Try to set the relationship to Rigid wherever possible.
30. What is a natural hierarchy and how will you create it?
A natural hierarchy is one in which each attribute is related either directly or indirectly to all other attributes in the same hierarchy, as in product category - product subcategory - product name.
31. What are deploy, process and build?
Build: verifies the project files and creates several local files.
Deploy: deploys the structure of the cube (the skeleton) to the server.
Process: reads the data from the source and builds the dimensions and cube structures.
32. In which scenario would you like to go for a materialized (reference) dimension?
Reference dimensions let you create a relationship between a measure group and a dimension using an intermediate dimension to act as a bridge between them.
33. In the Dimension Usage tab, how many types of joins are possible to form a relationship between a measure group and a dimension?
34. Can you create a server time dimension in Analysis Services (Server Time dimension)?
35. How many types of dimension are possible in SSAS?
Account, Bill of Materials, Currency, Channel, Customer, Geography, Organizations, Products, Promotion, Regular, Scenario, Time, Unary.
36. What is time intelligence? How will you implement it in SSAS?
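The category > subcategory > product natural hierarchy described above is then browsed as a single user hierarchy. A sketch against the Adventure Works sample (hierarchy and level names are from that sample):

```mdx
-- Drill one level down the natural Product Categories hierarchy:
-- every Subcategory member rolls up unambiguously to its Category.
SELECT {[Measures].[Internet Sales Amount]} ON COLUMNS,
       [Product].[Product Categories].[Subcategory].MEMBERS ON ROWS
FROM [Adventure Works]
```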

37. What do you understand by the linked cube or linked object feature in SSAS?
38. How will you write back to a dimension using Excel or any other client tool?
39. What do you understand by a dynamic named set (SSAS 2008)? How is it different from a static named set?
40. In a Process Update, which relationship will be better: Rigid or Flexible?
41. What is the difference between the "ProcessingGroup" values ByAttribute and ByTable?
42. What do you understand by the following properties of a dimension attribute: Default Member, AttributeHierarchyEnabled, AttributeHierarchyOptimizedState, DiscretizationMethod, OrderBy, OrderByAttribute?
43. What are the different storage mode options in SQL Server Analysis Services, and in which scenarios will they be useful?
44. How will you implement data security for the given scenario in Analysis Services? "I have 4 cubes and 20 dimensions. There are four operation managers. I need to give access to the CEO, operation managers, sales managers and employees (200 employees). 1) The CEO can see all the data of all 4 cubes. 2) Operation managers can see only the data related to their cube. 3) Employees can see only certain dimension and measure group data."
45. What are the options to deploy an SSAS cube in production?
Option 1 - BIDS: Right click on the project in Solution Explorer -> Properties. Under Build, select the 'Output Path'; under Deployment, set the Processing Option (Default, Full, Do Not Process), Transactional Deployment (False, True) and Deployment Mode (Deploy All, Deploy Changes Only). In BIDS, from the Build menu select the Build option (or right click on the project in Solution Explorer). The build process will create four XML files in the bin subfolder of the project folder: the main object definition file (.asdatabase), .deploymentoptions, .deploymenttargets and .configsettings. Then Deploy.

Option 2 - Deployment Wizard: Deployment via BIDS will overwrite the destination database management settings, so it is not recommended for production deployment. A more controllable option is the Deployment Wizard, available in interactive or command-line mode. Run the wizard from Start -> All Programs -> Microsoft SQL Server -> Analysis Services -> Deployment Wizard:
1. Browse to the .asdatabase file created by the build.
2. Connect to the target server.
3. Configure how partitions and roles should be deployed.
4. Specify how configuration settings are deployed.
5. Specify processing options: Default processing allows SSAS to decide what needs to be done; Full processing can be used to process all objects; you can also choose not to process at all.
6. Choose whether to deploy instantly or to create an XMLA command script for later deployment. The script will be created in the same location as the .asdatabase file.

AttributeHierarchyOptimizedState: determines the level of optimization applied to the attribute hierarchy. By default, an attribute hierarchy is FullyOptimized, which means that Analysis Services builds indexes for the attribute hierarchy to improve query performance. The other option, NotOptimized, means that no indexes are built for the attribute hierarchy.

46. What are the options available to incrementally load relational data into an SSAS cube?
Use a Slowly Changing Dimension.
47. Why will you use aggregation at a remote server?
48. What are the different ways to create aggregations in SSAS?
49. What do you understand by usage-based optimization?
50. Can we use a different aggregation scheme for different partitions?
51. Why will you use a perspective in SSAS?
52. What are KPIs? How will you create KPIs in SSAS?
53. What are the main feature differences between SSAS 2005 and SSAS 2008 from a developer's point of view?
54. What are the aggregate functions available for a measure in SSAS?
Sum, Count, Min, Max, and Distinct Count.
55. What are the processing modes available for a measure group? What do you understand by lazy aggregation?
56. How can you improve dimension design?
1: Limit the number of dimensions per measure group and the number of attributes per dimension.
2: Use dimension properties effectively. For large dimensions that expose millions of rows and have a large number of attributes, pay particular attention to the ProcessingGroup property.
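KPIs defined in the cube can be read from MDX with the KPI helper functions. A sketch against the Adventure Works sample (the KPI name "Revenue" is an assumption about that sample):

```mdx
-- KPIValue/KPIGoal/KPIStatus/KPITrend return the calculated members
-- behind the named KPI.
SELECT { KPIValue("Revenue"), KPIGoal("Revenue"),
         KPIStatus("Revenue"), KPITrend("Revenue") } ON COLUMNS,
       [Date].[Calendar Year].MEMBERS ON ROWS
FROM [Adventure Works]
```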

By default, the ProcessingGroup property is assigned a value of ByAttribute.
3: Use Regular Dimension Relationships whenever possible.
4: Use an Integer data type for attribute keys if at all possible.
5: Use Natural Hierarchies when possible.
57. What are the performance issues with a parent-child hierarchy?
In parent-child hierarchies, by default, aggregations are created only for the key attribute and the top attribute, i.e. the All attribute, unless it is disabled.
58. How can you improve overall cube performance?
Partitioning the cube can help to reduce the processing time. The benefit of partitioning is that it allows multiple partitions to be processed in parallel on a server that has multiple processors. Of course, the smaller the partition, the better, so try to use daily partitions instead of monthly, or monthly partitions instead of yearly. Regarding the best possible processing strategy, follow these steps:
1. Process Update all the dimensions that could have had data changed. Depending on the nature of the changes in the dimension table, Process Update can affect dependent partitions. If only new members were added, the partitions are not affected; but if members were deleted or if member relationships changed, then some of the aggregation data and bitmap indexes on the partitions are dropped.
2. Process Data the partitions that have changed data (which are usually the most recent partitions).
3. Process Index for the rest of the partitions.
59. What do you understand by the formula engine and the storage engine?
The Formula Engine is single-threaded; the Storage Engine (SE) is multi-threaded. The Query Processor Cache / Formula Engine Cache caches the calculation results, whereas the Storage Engine Cache caches the aggregated/fact data being queried.

MDX
1. Explain the structure of an MDX query.
2. Tell me your 5 most-used MDX functions. Functions used commonly in MDX: Filter, Descendants, BAsc and others.
3. What is the difference between a set and a tuple?
Tuple: a collection of members from different dimensions. Set: a collection of one or more tuples with the same dimensionality.
4. What do you understand by a named set? Is there any new feature added in SSAS 2012 related to named sets?
5. How will you differentiate among a level, a member, an attribute and a hierarchy?
6. What are the differences among EXISTS, EXISTING and SCOPE?
7. What will happen if we remove the CALCULATE keyword in the script?
8. How will you pass a parameter in MDX?
9. What is the difference between .MEMBERS and .CHILDREN?
10. What is the difference between the NON EMPTY keyword and the NONEMPTY() function?
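The set/tuple distinction above shows up directly in query syntax. A small sketch using Adventure Works names (the member keys &[1] and &[3] are illustrative):

```mdx
SELECT {[Measures].[Internet Sales Amount]} ON COLUMNS,
-- A set (braces) of two tuples (parentheses); both tuples have the
-- same dimensionality: one Category member and one Calendar Year
-- member each, so they can legally sit in the same set.
{ ([Product].[Category].&[1], [Date].[Calendar Year].&[2003]),
  ([Product].[Category].&[3], [Date].[Calendar Year].&[2004]) } ON ROWS
FROM [Adventure Works]
```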

11. NON_EMPTY_BEHAVIOR
12. AUTOEXISTS
13. Difference between static and dynamic sets
14. Attribute relationships
15. Difference between rigid and flexible relationships
16. Difference between natural and unnatural hierarchies
17. Write MDX for retrieving the top 3 customers based on Internet sales amount.
18. Write MDX to find the current month's start and end dates.
19. Write MDX to compare the current month's revenue with the same month's revenue last year (hint: ParallelPeriod).
20. Write MDX to find the MTD (month to date), QTD (quarter to date) and YTD (year to date) Internet sales amount for the top 5 products.
21. Write MDX to find the count of regions for each country.
22. Write MDX to rank all the product categories based on calendar year 2005 Internet sales amount.
23. Write MDX to extract the nth-position tuple from a specific set.
24. Write MDX to set the default member for a particular dimension.
25. What are the performance considerations for improving MDX queries?
26. Is the Rank MDX function performance intensive?
27. Which one is better from a performance point of view: the NON EMPTY keyword or the NONEMPTY function?
28. How will you find the performance bottleneck in any given MDX?
29. What do you understand by the storage engine and formula engine?
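The "top 3 customers by Internet sales" exercise above is commonly answered with TopCount. A sketch against the Adventure Works sample (dimension, level and measure names are from that sample):

```mdx
-- TopCount(set, n, numeric_expression) returns the n members of the
-- set with the largest value of the expression, in descending order.
SELECT {[Measures].[Internet Sales Amount]} ON COLUMNS,
       TOPCOUNT([Customer].[Customer].[Customer].MEMBERS,
                3,
                [Measures].[Internet Sales Amount]) ON ROWS
FROM [Adventure Works]
```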