
Mohammad. S
mohammad.s143225@gmail.com
(551)-866-0041
Sr. Data Analyst

PROFESSIONAL SUMMARY:

 8+ years of IT experience in Data Analysis, Data Warehousing, Data Modeling, Data Governance, Data Mapping,
Data Lineage, Data Migration, Data Profiling, Financial Reporting, budgeting, forecasting, Informatica PowerCenter,
design, Power BI, and Tableau, as well as administering, implementing, and testing client/server applications using Microsoft SQL
Server and the BI suite (Development, UAT, and Production environments) in the Banking and Finance industry.
 Strong technical knowledge in MS SQL Server development, including DTS and Microsoft Analysis Services.
 Strong experience in Data Analysis, Data Profiling, Data Migration, Data Conversion, Data Quality, Data Integration
and Metadata Management Services and Configuration Management.
 Experienced in Installation, Configuration, and Administration of Informatica Data Quality and Informatica Data
Analyst.
 Expert in generating on-demand and scheduled reports for business analysis and management decision-making using
Tableau and Power BI.
 Strong analytical skills; support operations through analysis of key performance indicators and trends.
 Created various types of charts such as Heat Maps, Geocoding, Symbol Maps, Pie Charts, Bar Charts,
Tree Maps, Gantt charts, Circle Views, Line Charts, Area Charts, Scatter Plots, Bullet Graphs and Histograms in
Tableau Desktop, Power BI and Excel to provide better data visualization.
 Solid knowledge of Power BI and Tableau Desktop report performance optimization.
 Experienced in Dodd-Frank (Basel) compliance: collected and analyzed data, developed risk management
strategies, generated reports, monitored compliance, and identified process improvements, leveraging data
analytics tools and techniques to help the bank comply with regulatory requirements and manage its risks
effectively.
 Proficient at ROI analysis in the banking industry through data collection, analysis, forecasting, risk analysis,
reporting, and decision-making, helping the bank make informed investment decisions and maximize the ROI of its
investments.
 Worked with financial institutions to monitor and comply with regulatory requirements related to KYC, such as the
USA Patriot Act, Bank Secrecy Act, and Anti-Money Laundering (AML) laws.
 Worked with financial institutions to comply with regulatory requirements related to Forex trading, such as the
Dodd-Frank Act, the European Market Infrastructure Regulation (EMIR), and the Foreign Account Tax Compliance
Act (FATCA).
 Created Excel reports and dashboards and performed data validation using VLOOKUP, HLOOKUP,
macros, formulas, INDEX MATCH, and slicers (with Pivot Tables, GETPIVOTDATA, and dashboards), as well as Power View, Power
Map and heat maps.
 Expert at maintaining and managing Microsoft Power BI and Tableau reports and dashboards and
publishing them to end users for executive-level business decisions.
 Thrives in high-pressure environments; strong in budget forecasting, management, and teamwork.
 Extensive experience with OLTP/OLAP systems and E-R modeling, developing database schemas like Star schema
and Snowflake schema used in relational, dimensional and multidimensional modeling.
 Extensive experience in strategic development of a Data Warehouse and in Performing Data Analysis and Data
Mapping from an Operational Data Store to an Enterprise Data Warehouse.
 Expertise in Data modeling (Dimensional & Relational) concepts like Star-Schema Modeling, Snowflake Schema
Modeling, Fact and Dimension tables.
 Strong background in software for Banking and Financial services with knowledge of the Payments module, SWIFT
messages (MT103, MTn90, MTn91, MTn92, MTn95, MTn96, MT202, MT300, MT320, MT900, MT910, MT940 and
MT950) and the Confidential (GPP) system.
 Strong experience in Data Analysis, Data Migration, Data Cleansing, Transformation, Integration, Data Import, and
Data Export using multiple ETL tools such as Informatica PowerCenter, SSIS.
 Good knowledge of the ETL (Extract, Transform and Load) of data into a data warehouse and Business Intelligence
(BI) tools like Tableau and Power BI Services, performance tuning, reporting, designing logical/physical databases
and troubleshooting.
 Expert Knowledge of T-SQL, Integration Services (SSIS), Reporting Services (SSRS) and Analysis Services (SSAS).
 Experience in writing complex SQL queries involving multiple tables with inner and outer joins, stored procedures.
 Experience in SQL Server DTS and SSIS (Integration Service) package design, constructing, and deployment.
 Experience in enhancing and deploying the SSIS Packages from development server to production server.
 Experience in Tableau report developing, testing and deploying report using Tableau Server.
 Transformed data from one server to other servers using tools like Bulk Copy Program (BCP), Data Transformation
Services (DTS) and SQL Server Integration Services (SSIS).
 Proficiency in creating different types of reports such as Cross-Tab, Conditional, Drill-down, Sub reports, and
formatting them.
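Several of the bullets above mention complex SQL queries with inner and outer joins. As a minimal, self-contained sketch (using Python's built-in sqlite3 module with invented customer/account tables, not any real bank schema), the difference looks like this:

```python
import sqlite3

# Hypothetical two-table setup standing in for a banking data mart;
# the tables and rows are invented purely for illustration.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE customers (cust_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE accounts  (acct_id INTEGER PRIMARY KEY, cust_id INTEGER, balance REAL);
INSERT INTO customers VALUES (1, 'Avery'), (2, 'Blake'), (3, 'Casey');
INSERT INTO accounts  VALUES (10, 1, 2500.0), (11, 2, 900.0);
""")

# Inner join: only customers who actually hold an account.
inner_rows = cur.execute("""
    SELECT c.name, a.balance
    FROM customers c
    INNER JOIN accounts a ON a.cust_id = c.cust_id
    ORDER BY c.name
""").fetchall()

# Left outer join: every customer, NULL balance where no account exists.
outer_rows = cur.execute("""
    SELECT c.name, a.balance
    FROM customers c
    LEFT JOIN accounts a ON a.cust_id = c.cust_id
    ORDER BY c.name
""").fetchall()

print(inner_rows)  # [('Avery', 2500.0), ('Blake', 900.0)]
print(outer_rows)  # [('Avery', 2500.0), ('Blake', 900.0), ('Casey', None)]
```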

TECHNICAL SKILLS:

Business Skills Business Requirements, Business Process Analysis & Design, Financial Analysis, Risk Analysis,
Requirement Gathering, Use Case Modeling, JAD/JRP Sessions, GAP Analysis & Impact Analysis,
OSS, BSS.

Databases SQL Server, Oracle, MySQL, NoSQL

Database Tools SQL Server Management Studio, Performance Monitor, Query Analyzer, Query Optimizer, SQL
Profiler, Data Transformation Services (DTS), ETL, Bulk Insert and BCP, ODBC, Business Intelligence
Development Studio (BIDS), SQL CMD.

ETL and BI Tools SSRS, Report Builder, Focus, Power BI, Tableau, Informatica

Languages SQL, C, C++, VB, PL/SQL, Python, UML, HTML, XML, JavaScript.

OLAP SQL Server Reporting Services (SSRS), SSIS, Crystal Reports, OLAP, Erwin, Tableau, Power BI

Operating Systems Windows Server, MS DOS, Unix, Linux.


IT Processes Software Development Life Cycle (SDLC), Agile, Waterfall, Iterative.

Methodologies Agile, Waterfall, Scrum, SDLC.


PROFESSIONAL EXPERIENCE:

Valley Bank, NJ
Sept 2020 – Present
Sr. DATA ANALYST
Responsibilities:
 Worked on data analysis, primarily identifying data sets, source data, source metadata, data definitions, data
formats, data validation, data cleansing, data verification, and data mismatches.
 Involved in Data Architecture Designing and development of a Central Data Repository of Static Content for
Syndicated Loans business. Liaison with the Syndicated Loans team as well as downstream clients in the bank to
define scope and requirements.
 Involved in the Data Integration and Infrastructure project for Valley Bank, assessing the impact on downstream
data management environments of processing different banking products through different systems and processes.
 Designed a STAR schema for the detailed data marts and Plan data marts involving confirmed dimensions.
 Performed Count Validation, Dimensional Analysis, Statistical Analysis and Data Quality Validation in Data Migration.
 Generated portfolio analysis and P&L reports by performing risk analysis and risk management (Credit Risk, Market
Risk, Reinvestment Risk, and Inflation Risk) based on the client investment objectives.
 Designed and implemented multiple dashboards using Power BI, PowerPivot & Power Query tools for in-house
metrics.
 Designed and developed MDM data model and integration of Web Sphere Process Server.
 Created Page Level Filters, Report Level Filters, Visual Level Filters in Power BI according to the requirements.
 Verified that all aspects of the Client Identification Program (CIP) and necessary account opening documents are provided
in accordance with Know Your Customer (KYC) and Bank Secrecy Act (BSA) standards.
 Analyze the Bank's data and business terms from a data quality and integrity perspective.
 Implemented an MDM process to take strategy to roadmap and design development activities. Delivered MDM
roadmap & MDM Architecture.
 Delivered Enterprise Data Governance, Data Analyst, Metadata solutions.
 Assist ETL team to define Source to Target Mappings.
 Worked extensively on Data Governance for Data Quality, Data Security, Data Privacy, Data Access and Data
Management.
 Perform root cause analysis on smaller self-contained data analysis tasks that are related to assigned data processes.
 Created a road map for the client for the planning, developing, and implementing of MDM solutions, enabling
consolidation of MDM data following Mergers and Acquisitions.
 Work to ensure high levels of data consistency between diverse source systems including flat files, XML and SQL
Database.
 Daily Data quality checks and Data profiling for accurate and better reporting and analysis.
 Involved in translating the business requirements into data requirements across different systems.
 Involved in understanding the customer needs with regards to data, documenting requirements and complex SQL
statements to extract the data and packaging data for delivery to customers.
 Wrote complex SQL and PL/SQL queries to identify granularity issues and relationships between data sets and
created recommended solutions based on analysis of the query results.
 Performed unit testing on transformation rules to ensure data moved correctly.
 Maintained Excel workbooks, such as development of pivot tables, exporting data from external SQL databases,
producing reports and updating spreadsheet information.
 Researched and fixed data issues pointed out by QA team during regression tests.
 Participated in the development of enhancements for the current Commercial and Mortgage Securities.
 Manipulated and prepared data and extracted data from databases for business analysts using Tableau.
 Review normalized schemas for effective and optimum performance tuning queries and data validations in OLTP and
OLAP environments.
 Exploited the power of MS SQL to solve complex business problems through data analysis on large data sets.
 Working knowledge of different data sources such as flat files, Oracle, SQL Server, RDBMS, DB2, and Teradata, with
experience in data extraction, profiling, and identifying data quality issues.
 Worked on Dimensional Modeling, Designing of STAR, SNOW FLAKE Schemas.
 Supported team in resolving SQL Reporting services, SQL Integration services and T-SQL related issues.
 Experience in Analyzing Execution Plan and managing indexes and troubleshooting deadlocks.
 Involved in Normalization and De-Normalization of existing tables for faster query retrieval.
 Developed Merge jobs in Python to extract and load data into MySQL database.
 Implemented advanced SQL queries using DML and DDL statements to extract large quantities of data from multiple
data points on MS SQL and Teradata SQL Assistant.
 Working on live site issues to fix slow running stored procedures, triggers etc. and provide root cause analysis.
 Experience in Data Analysis, understanding trends and patterns in data.
 Data Mining Models, developing reports using MDX and SQL.
 Created KPI's, Partitions, Perspectives, and Calculated Measures, used DAX queries, Roles.
 Developed dashboards and ad-hoc reports using MS Power BI for senior management team for analysis.
 Wrote calculated columns and measure queries using DAX in Power BI to demonstrate good data analysis techniques.
 Involved in scorecard and KPI Reporting.
 Created different tabular reports using Power BI features and enhanced them based on user requirements.
 Utilized Power BI to design multiple scorecards and dashboards to display information required by different
departments and upper-level management.
 Led pricing meetings with project managers to price out projects and determine incentive budgets, revenue sharing
among teams, and overall profit margins. Updated high-level divisional directors quarterly with forecasts and revenue
for all managed projects.
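The merge jobs described above (Python extracting and loading data into MySQL) typically reduce to an upsert against the target table. A hedged sketch, with SQLite standing in for MySQL and invented table and column names:

```python
import sqlite3

# Rows as they might arrive from a source extract; a later row for the
# same key supersedes an earlier one. Purely illustrative data.
source_rows = [
    (101, "checking", 1500.00),
    (102, "savings", 8200.00),
    (101, "checking", 1750.00),  # later extract for the same key wins
]

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""
    CREATE TABLE target_accounts (
        acct_id   INTEGER PRIMARY KEY,
        acct_type TEXT,
        balance   REAL
    )
""")

# Upsert each extracted row: insert new keys, update existing ones
# (SQLite's ON CONFLICT clause; MySQL would use ON DUPLICATE KEY UPDATE).
cur.executemany("""
    INSERT INTO target_accounts (acct_id, acct_type, balance)
    VALUES (?, ?, ?)
    ON CONFLICT(acct_id) DO UPDATE SET
        acct_type = excluded.acct_type,
        balance   = excluded.balance
""", source_rows)
conn.commit()

print(cur.execute("SELECT * FROM target_accounts ORDER BY acct_id").fetchall())
# [(101, 'checking', 1750.0), (102, 'savings', 8200.0)]
```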

Environment: MS SQL Server, Microsoft Management Studio, Microsoft Visual Studio, MSSQL Server Integration Services
(SSIS), SQL Server Reporting Services (SSRS), SQL, Power BI, Tableau, Python.

Western Alliance Bank, AZ


Jun 2018 – Sept 2020
Sr. DATA ANALYST
Responsibilities:
 Created SSIS packages using various transformations such as Lookup, Derived Column, Conditional Split, Data Conversion,
Aggregate, Merge Join, Sort, Execute SQL Task, Data Flow Task, and Execute Package Task to generate
underlying data for the reports and to export cleaned data from Excel spreadsheets, text files, MS Access and CSV
files to the data warehouse.
 Created/Reviewed data flow diagram to illustrate where data originates and how data flows within the Enterprise
Data warehouse (EDW).
 Created different visualization (Stacked bar Chart, Clustered bar Chart, Scatter Chart, Pie Chart, Donut Chart, Line &
Clustered Column Chart, Map, Slicer, Time Brush etc.) in Power BI according to the requirements.
 Developed dashboards and ad-hoc reports using MS Power BI for senior management team for analysis.
 Used MDM tool to support Master Data Management by removing duplicates, standardizing data (Mass
Maintaining), and incorporating rules to eliminate incorrect data and filter data as per requirements.
 Provided continued maintenance and development of bug fixes for the existing and new Power BI Reports.
 Developed the required data warehouse model using Star schema for the generalized model.
 Worked closely with the Enterprise Data Warehouse team and Business Intelligence Architecture team to
understand repository objects that support the business requirement and process.
 Worked on a specific mortgage loan and vehicle insurance database project to design a new database schema.
Migrated data from Oracle Database, Sybase, and MongoDB to SQL Server, performing data analysis and then generating
various types of reports through BI tools.
 In charge of comprehensive data quality, making sure invalid, inconsistent, or missing Obligor Risk Ratings are
reported to portfolio managers for remediation and ensuring that checks are in place to prevent the issue from
recurring.
 Generated reports on IAM performance metrics, such as access rights, permissions, and account activity, providing
insights into IAM effectiveness to inform business decisions.
 Monitored ACH transaction data to identify trends, anomalies, and potential issues. ACH payment processing costs
can vary depending on the volume of transactions, the type of transactions, and other factors.
 Evaluated payment processing costs to identify areas where costs can be reduced, such as by optimizing payment
processing workflows or negotiating better pricing with payment processing vendors.
 Collected and analyzed the data related to a bank's compliance with Dodd-Frank (Basel) regulations. This includes
data on capital ratios, stress test results, risk exposures, and other factors that impact compliance.
 Worked with risk managers to develop risk management strategies that comply with Dodd-Frank (Basel) regulations,
using data analytics tools to identify potential risks and develop strategies to mitigate them.
 Performed scenario analysis to identify potential financial stress scenarios and their potential impact on the bank's
capital adequacy, using historical data and predictive models to identify potential risks.
 Responsible for collecting and analyzing the data related to financial transactions, customer behavior, and other
relevant data to create models that can be used to identify suspicious activities. These models are then integrated
into the Actimize software to improve its accuracy in detecting financial crime.
 Integrated data from various sources, such as internal bank systems and external data providers, into the Actimize
software, ensuring that the data is accurate and complete and that it is aligned with the requirements of the
software.
 Created reports and visualizations to help stakeholders understand the effectiveness of the Actimize software in
detecting financial crime and used these reports to identify areas where the software can be improved or optimized.
 Monitored the performance of the Actimize software for issues such as false positives or false negatives,
worked with the software development team to make improvements, and maintained the software by
updating data sources, configuring new rules, and managing access control.
 Involved in leveraging Hadoop and Hive to process, manage, analyze, and report on vast amounts of financial data,
contributing to informed decision-making, risk mitigation, and improved customer experiences.
 Worked with Impala in the banking industry, utilizing its fast SQL querying capabilities to process, analyze, and derive
insights from large volumes of banking data; optimized query performance, generated reports, and ensured
data security and compliance, contributing to informed decision-making and improved operational efficiency.
 Managed functional requirements for interfaces and conversions between other legacy systems to Teradata, MDM,
Enhancements, Workflows and Reports for MDM.
 Documented logical, physical, relational and dimensional data models. Designed the Data Marts in dimensional data
modeling using star and snowflake schemas.
 Performed database table review and data mapping for a large-scale data conversion project from an Oracle database to Mainframe.
 Wrote SQL queries for each Test case and executed in SQL Plus to validate the data between Enterprise Data
Warehousing and Data Mart Staging Tables.
 Created Data Flow Diagrams and Process Flow Diagrams for various load components like FTP Load, SQL Loader
Load, ETL process and various other processes that required transformation.
 Validated the test data in DB2 tables on Mainframes and on Teradata using SQL queries.
 Transformed project data requirements into project data models.
 Wrote Test cases for Enterprise Data Warehousing (EDW) Tables and Data Mart Staging Tables.
 Worked on Performance tuning and loading data for fast access of reports in Client/Database. Server balancing,
business Rules Implementation, Metadata, Data Profiling.
 Involved in Data mapping specifications to create and execute detailed system test plans. The data mapping
specifies what data will be extracted from an internal data warehouse, transformed and sent to an external entity.
 Extraction of test data from tables and loading of data into SQL tables.
 Collected business requirements to set rules for proper data transfer from Data Source to Data Target in Data
Mapping.
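The test cases above for validating data between Enterprise Data Warehousing and Data Mart staging tables often take a simple form: compare row counts and a column aggregate between source and target. A minimal sketch, with invented table names and data:

```python
import sqlite3

# Staging vs. warehouse tables, populated identically here for illustration;
# in practice each side would be loaded by its own process.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE stg_loans (loan_id INTEGER, principal REAL);
CREATE TABLE edw_loans (loan_id INTEGER, principal REAL);
INSERT INTO stg_loans VALUES (1, 10000.0), (2, 25000.0), (3, 5000.0);
INSERT INTO edw_loans VALUES (1, 10000.0), (2, 25000.0), (3, 5000.0);
""")

def table_profile(cursor, table):
    """Row count plus the sum of the amount column, used as a cheap checksum."""
    count, total = cursor.execute(
        f"SELECT COUNT(*), COALESCE(SUM(principal), 0) FROM {table}"
    ).fetchone()
    return count, round(total, 2)

# The validation test: both sides must profile identically.
assert table_profile(cur, "stg_loans") == table_profile(cur, "edw_loans")
print("staging and warehouse profiles match")
```

Richer checks (per-key diffs, EXCEPT/MINUS queries) build on the same pattern.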

Texas Capital Bank, TX


Aug 2016 – Jun 2018
DATA ANALYST.
Responsibilities:
 Created various data quality mappings in Informatica Data Quality (IDQ) tool and imported them into Informatica
PowerCenter as mappings, mapplets.
 Extensively used Informatica Data Quality transformations – Labeler, Parser, Standardizer, Match, Association,
Consolidation, Merge, Address Validator, Case Converter, and Classifier.
 Involved in creating dashboards, scorecards, views, pivot tables, and charts for further data analysis.
 Define and communicate project scope, milestones/deliverables and projected requirements from clients.
 Used Tableau for SQL queried data, and data analysis, generating reports, graphics and statistical analysis.
 Raised questions up front to determine key issues and foresee potential show-stoppers that might arise later;
in-depth understanding of OTC derivatives operations.
 Involved in delivering reports using advanced SQL techniques such as RANK and ROW_NUMBER.
 Analyze data using Tableau for automation and determine business data trends.
 Involved in providing guidance for transitioning from Access to SQL Server.
 Transfer data objects and queries from MS Excel to SQL Server.
 Involved in creating different reports based on the business request and ad hoc requirements using Tableau, Power
BI desktop. Prepare financial deliverables (quarterly and budgetary reports) for funding agencies as well as proof of
concept, and solution mockups.
 Worked extensively in documenting the Source to Target Mapping documents with data transformation logic.
 Transformed requirements into data structures that can be used to efficiently store, manipulate and
retrieve information.
 Collaborated with data modelers and ETL developers in creating the Data Functional Design documents.
 Evaluation and selection of various MDM technologies and products in view of the client requirements.
 Involved in migration projects to migrate data from data warehouses on Oracle/DB2 to Teradata.
 Define downstream targets for MDM with design for attributes and sync methodologies.
 Created complex Teradata scripts to generate ad-hoc reports that supported and monitored day-to-day operations.
 Automated and scheduled recurring reporting processes using UNIX shell scripting and Teradata utilities such as
MLOAD, BTEQ and FastLoad.
 Involved in Building a specific data-mart as part of a Business Objects Universe, which replaced the existing system
of reporting that was based on exporting data sets from Teradata to Excel spreadsheets.
 Migrated three critical reporting systems to Business Objects and Web Intelligence on a Teradata platform.
 Worked with project team representatives to ensure that logical and physical ER/Studio data models were
developed in line with corporate standards and guidelines.
 Involved in designing the Payment Integrity Data Mart, which captures fraudulent claims on a daily basis; also
involved in gathering the data requirements for building the Payment Integrity Data Mart required for various
BI ad-hoc reporting purposes.
 Used data analysis techniques to validate business rules and identify low-quality or missing data in the existing bank
data warehouse (EDW).
 Also worked on assessing the impact of low-quality and/or missing data on the performance of data warehouse clients.
 Performed Data Analysis and Data validation by writing complex SQL queries using TOAD against the ORACLE
database.
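The RANK/ROW_NUMBER reporting technique mentioned above typically appears as a top-row-per-group query. A sketch with invented deposit data (SQLite 3.25+ is assumed for window-function support):

```python
import sqlite3

# Invented per-branch deposit data, purely for illustration.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE deposits (branch TEXT, cust TEXT, amount REAL);
INSERT INTO deposits VALUES
  ('NJ', 'a', 500.0), ('NJ', 'b', 900.0),
  ('TX', 'c', 300.0), ('TX', 'd', 150.0);
""")

# ROW_NUMBER over a per-branch window picks each branch's largest deposit.
top_per_branch = cur.execute("""
    SELECT branch, cust, amount FROM (
        SELECT branch, cust, amount,
               ROW_NUMBER() OVER (
                   PARTITION BY branch ORDER BY amount DESC
               ) AS rn
        FROM deposits
    )
    WHERE rn = 1
    ORDER BY branch
""").fetchall()

print(top_per_branch)  # [('NJ', 'b', 900.0), ('TX', 'c', 300.0)]
```

Swapping ROW_NUMBER for RANK keeps ties instead of picking one row arbitrarily.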

Investors Bank, NJ
Nov 2014 – Aug 2016
DATA ANALYST
Responsibilities:
 Involved in creating and maintaining the data model, including master data management (MDM).
 Extensively tested the ETL mappings which were developed to load data from Oracle and SQL Server sources into
the Oracle Staging/Target Area.
 Verified and validated data model structure and E-R modeling with all related entities and relationship with each
entity based on the rules using Erwin as per the business requirements.
 Worked on the implementation of loan and risk management system applications.
 Used MDM tool to support Master Data Management by removing duplicates, standardizing data (Mass
Maintaining), and incorporating rules to eliminate incorrect data and filter data as per requirements.
 Involved in (Master Data Management) MDM to help the organization with strategic decision making and process
improvements. (Streamline data sharing among personnel and departments).
 The project covered the bank's Data and Reporting processes for its Balance Sheet Capital Management team,
spanning end-to-end data processes for Market and Operational Risk and Securitization from data extraction through the
production phase. Managed online content and led cross-functional projects and initiatives. Additionally,
verified and updated information on banking accounts globally to ensure Know Your Customer (KYC) compliance.
 Worked on transforming the requirements into data structures, which can be used to efficiently store, manipulate
and retrieve information.
 Collaborated with data modelers and ETL developers in creating the Data Functional Design documents.
 Involved in Creating Dashboards and visualization of different types for analysis, monitoring, management and
better understanding of the business performance metrics.
 Used SQL queries to verify the data from the Oracle and MS SQL Server databases.
 Performed backend testing using SQL queries and analyzed server performance on UNIX.
 Tested mappings and SQL queries in transformations such as Expression transformation, Filter transformation,
Lookup transformation, Joiner transformation, XML transformation, Aggregator transformation, and Update strategy
transformation.
 Involved in migration projects to migrate data from data warehouses on Oracle and migrated those to Teradata.
 Responsible for the management of data, replicating the data and storing it into a specific data warehouse using
Talend Platform for Data Management and MDM.
 Worked extensively in documenting the Source to Target Mapping documents with data transformation logic.
 Worked on Integration on banking platforms. ETL process developed in Informatica Power Center extracts this data
from many data sources (DB2, Teradata, Oracle databases and flat files), identifies inaccurate data, and transforms,
loads data into an Oracle 9i data warehouse that is optimized for query and reporting.
 Extensively used Informatica Debugger to validate maps and to gain troubleshooting information about data and
error conditions.
 Interacted with client user personnel to ensure continuing adherence to requirements and user standards.
 Tested formulas for various loan, interest calculations, and currency conversions required for reporting purposes.
 Actively participated in System testing, integration testing and coordinated UAT.
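The formula testing described above (loan interest calculations and currency conversions for reporting) can be illustrated with unit-test style checks. Rates and amounts below are invented for illustration, not taken from any real engagement:

```python
# A minimal sketch of formula checks as one might run them during UAT.
def simple_interest(principal, annual_rate, years):
    """Interest accrued without compounding: P * r * t, rounded to cents."""
    return round(principal * annual_rate * years, 2)

def convert(amount, fx_rate):
    """Convert an amount using a quoted FX rate (e.g. USD -> EUR), to cents."""
    return round(amount * fx_rate, 2)

# Assertion-style test cases against expected report figures.
assert simple_interest(10000, 0.05, 2) == 1000.00
assert convert(250.0, 0.92) == 230.00
print("formula checks passed")
```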
