
Visala Nekkanti

Professional Summary:

● 15+ years of proven experience in software requirement gathering, design, development, and implementation using Business Intelligence (BI) tools and databases, with an emphasis on BI development using SQL for data warehousing and database development across the CRM, Finance, Supply Chain, Taxation, and Health Care domains.
● Worked on building enterprise analytics using different source systems (SAP CRM, ECC, CLM, flat files, and SFDC) as sources to extract, transform, and load data into centralized data warehouse systems.
● Good experience working with MS SQL Server products such as SSIS, SSRS, and SQL Server Data Tools, with knowledge of SAP functional flow.
● Extensive experience and solid understanding of Data Warehousing, Star/Snowflake Schema Modeling, Dimensional Modeling, Entity-Relationship Modeling, and OLTP and OLAP concepts.
● Extensively worked on creating, populating, and maintaining data marts. Thorough knowledge of the features, structure, attributes, hierarchies, and star and snowflake schemas of data marts.
● Excellent knowledge of data analysis, data validation, data cleansing, data verification, and identifying data mismatches across the transformation process.
● Experience in database programming, including writing T-SQL queries, stored procedures, user-defined functions, triggers, indexes, constraints, and CTEs, and implementing Row-Level Security.
● Expertise in creating Extraction, Transformation, and Loading (ETL) packages using MS SSIS and Python from different sources (Oracle, text, CSV, and Excel).
● Exposure to data warehousing: OLTP and data modeling (using Toad) and creating facts and dimensions.
● Experienced in deploying ETL packages using the Integration Services Catalog and scheduling packages using SQL Server Agent jobs, Windows Task Scheduler, and Airflow.
● Extensively worked on data migration from legacy databases to SFDC using DBAmp for data and files, the Salesforce REST API for file uploads to ContentVersion, and Workbench/Data Loader.
● Significant professional experience in ETL/ELT using cloud-based databases at scale, such as Microsoft SQL Server, Postgres, and Salesforce.
● Experience migrating data from SQL Server to Snowflake.
● Hands-on with SnowSQL, Python, tasks, Time Travel, the query optimizer, data sharing, and stored procedures in Snowflake.
● Experience working with scripting languages such as Python, R, PowerShell, VB/JavaScript, and C#.
● Good working knowledge of GitHub, TFS, VSS, and SVN as source control systems.
Technical Skills:
Databases: SQL Server, Azure, Oracle 12c, Salesforce, Snowflake
Scripting Languages: Python, C#, JavaScript, React JS, Ajax, XML, SQL Azure, Data Sync, .Net, Java
Reporting / ETL: SSIS, SSRS, Crystal Reports, Business Objects
Operating Systems: Windows, Unix/Linux
DB Tools: SSMS, BIDS, Query Analyzer and Profiler, DBAmp
ETL Tools: SSIS, Azure Data Factory
Reporting/Visualization: SSRS, JavaScript, and Microsoft Excel
Tools: Visual Studio 2017, GitHub, TFS, Visio, SFDC

Certifications:
● MCP (Microsoft Certified Professional) (MCP ID: 5952708)
● MCTS (3.5 Framework, ASP.NET Application Development) (Exam No: 070-562) (Candidate ID: SR5609981)
● Awarded Best Developer at PG&E

Education: Master of Computer Applications (MCA), 2007

Projects:

BI Developer
Sonos – Oct 21 to Present

Responsibilities:

● Analyze and document the client's business requirements and processes, and transform the requirements into basic conceptual data and process models, data dictionaries, and volume estimates.
● Working with SAP functional and technical analysts to gather user requirements and system specifications, including STTM documents with code logic and business flows in SAP modules.
● Performing data migration between homogeneous/heterogeneous systems and SQL Server using
Integration Service (SSIS).
● Creating and tuning stored procedures in SQL Server for incremental and differential updates coming from
SAP and Salesforce.
● Working on creating tables, writing SQL Queries, stored procedures, views, triggers, performance tuning,
troubleshooting issues, creating ERDs.
● Creating ETL packages using SSIS for full and incremental loads, and using Python to load CSV and JSON files coming from different sources.
● Maintaining documentation in dbt markdown files.
● Developing multiple data pipelines to read/write data from heterogeneous sources to S3 buckets and to
Snowflake tables using Python.
● Working on data warehousing- OLTP and Data modeling (using Toad), CDC (Change Data Capture), ODS,
DWM (using Facts and Dimensions).
● Writing dbt models per the mapping document.
● Working on migrating the data into Snowflake from SQL.
● Working on SnowSQL, Python, Tasks, Time travel, Optimizer, data sharing, and stored procs.
● Using DBT to populate Snowflake tables for data pipeline processing and data transformation.
● Moving the existing system from Windows Scheduler/BODS jobs to Airflow DAGs for scheduling.
● Creating and using Docker containers and saved images for full-package sandbox installations with Airflow, Python, and SQL.
● Developing data assets (ETL/ELT pipelines, data structures) on our client's cloud infrastructure.
● Creating reports using Tableau, and editing and updating existing reports.
● Working on Agile software development methodology with scrum model using Jira.
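The full/incremental file-load pattern above can be sketched in Python as follows. This is a minimal illustration, not production code; the file contents, field names, and watermark value are all hypothetical:

```python
import csv
import io
import json
from datetime import datetime

def rows_from_csv(text):
    """Parse a CSV extract (header row assumed) into a list of dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def rows_from_json(text):
    """Parse a JSON extract (array of objects) into a list of dicts."""
    return json.loads(text)

def incremental_rows(rows, watermark, ts_field="modified_at"):
    """Keep only rows changed after the last-load watermark.

    A full load passes everything through; an incremental load filters
    on a timestamp column against the watermark saved by the last run.
    """
    wm = datetime.fromisoformat(watermark)
    return [r for r in rows if datetime.fromisoformat(r[ts_field]) > wm]
```

The filtered rows would then be staged and merged into the target tables; that part is source-specific and omitted here.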

Environment: SQL Server 2019, Python, Snowflake, SnowSQL, DBT, Airflow, AWS, OLTP, OLAP, Data Warehousing,
Data Modeling, ETL, SSIS, SAP, SFDC, GitHub, Agile, Jira.

BI Developer
Hill Physicians – San Ramon/Remote – Dec 20 to Oct 21

Responsibilities:

● Analyze and document the client's business requirements and processes, and transform the requirements into basic conceptual data and process models, data dictionaries, and volume estimates.
● Working on STTM documents to map SQL and Salesforce objects and to determine load order based on child objects.
● Understanding the legacy application using eVips UI and its logic built using .Net for Data Extraction, Data
Profiling, and Data Migration.
● Automated the process of uploading files to Salesforce ContentVersion using RestAPI using C#.
● Migrated the data from SQL Server to Salesforce using DBAmp.
● Working on data analysis, extraction, profiling, cleansing and validating the data using SQL.
● Analyzing health care data such as practices, providers, facilities, contracts, specialties, TINs, etc.
● Creating SSIS packages to validate addresses using Melissa Data and fuzzy grouping to find duplicate practices, providers, and facilities.
● Developing scripts and loading data into the Snowflake database for data mirroring using AWS and for the ETL process using Python.
● Converting cursor-based code into SnowSQL, processing data in bulk instead of row by row.
● Migrating/loading structured and unstructured data from flat files and JSON into Snowflake.
● Developing data assets (ETL/ELT pipelines, data structures) on our client's cloud infrastructure.
● Working on data warehousing - OLTP and Data modeling (using Toad) and creating Facts and Dimensions,
full load and incremental loads.
● Creating and tuning stored procedures and views in SQL with incremental and differential updates.
● Creating SQL jobs and scheduling them using SQL Server Agent.
● Monitoring and troubleshooting issues related to data and environment.
● Working on Agile software development methodology with scrum model using Jira.
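The ContentVersion upload automation above used the Salesforce REST API (in C#); the shape of such a request can be sketched in Python. The instance URL, API version, and token below are placeholders, and the sketch only assembles the request rather than sending it:

```python
import base64
import json

def content_version_payload(title, path_on_client, file_bytes):
    """Build the JSON body for a Salesforce ContentVersion insert.

    Title and PathOnClient are required ContentVersion fields;
    VersionData is the file contents, base64-encoded.
    """
    return {
        "Title": title,
        "PathOnClient": path_on_client,
        "VersionData": base64.b64encode(file_bytes).decode("ascii"),
    }

def content_version_request(instance_url, api_version, token, payload):
    """Assemble URL, headers, and body for the REST insert call.

    instance_url and token are hypothetical; an HTTP client
    (e.g. the requests library) would POST these.
    """
    url = f"{instance_url}/services/data/v{api_version}/sobjects/ContentVersion"
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    return url, headers, json.dumps(payload)
```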

Environment: SQL Server 2018, OLTP, Data Modeling, ETL, SSIS, Snowflake, SnowSQL, Python, C#, Salesforce/SFDC,
Agile, Jira.

BI/SQL Developer
SAP Ariba Inc – Mountain View, CA – Jan 20 to Nov 20

Responsibilities:

● Involved in Data Mapping, identified and mapped source to target fields, defined transformation rules
based on the requirements to generate custom reports between SAP and SQL.
● Worked on SSIS script task, lookup transformations, data flow tasks using T-SQL and VB scripts.
● Creating and tuning stored procedures and views in SQL with incremental and differential updates.
● Creating SQL jobs and scheduling them using windows scheduler to run unit testing.
● Working on data warehousing - OLTP, OLAP, Facts, Dimensions, and Data modeling.
● Developing data assets (ETL/ELT pipelines, data structures) on our client's cloud infrastructure.
● UI dashboard design and development on SSRS and support on Tableau.
● Using Python and PowerShell scripts extensively for the ETL processes.
● Created scorecards to monitor data quality metrics and set up email notifications to the data stewards and governance team on threshold breaches using SQL and Airflow.
● Working on Agile software development methodology with scrum model using Jira.
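The threshold-notification logic above can be sketched as follows. The metric names and bands are hypothetical, and the production version ran in SQL and Airflow; this sketch only produces the alert lines that would go into the notification email:

```python
def threshold_breaches(metrics, thresholds):
    """Compare data-quality metrics against configured thresholds.

    metrics maps metric name -> observed value; thresholds maps
    metric name -> (low, high) acceptable band. Returns one alert
    line per metric outside its band, suitable for an email body.
    """
    alerts = []
    for name, value in metrics.items():
        low, high = thresholds.get(name, (float("-inf"), float("inf")))
        if not (low <= value <= high):
            alerts.append(f"{name}={value} outside [{low}, {high}]")
    return alerts
```

In an Airflow deployment, a task would run this check on a schedule and hand any non-empty alert list to an email operator.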

Environment: SQL Server 2017, Python, OLTP, OLAP, Data Warehousing, Data Modeling, ETL, SSIS, JavaScript, SAP,
GitHub, Agile, Jira.

BI/SQL Developer
BART Parking/MTC – BART/511.org – Oakland, CA – Nov 18 to Dec 19

Responsibilities:

● Requirement gathering, analysis, design, development, testing, deployment, demos.


● Created tables and wrote queries, stored procedures, and views to facilitate data availability.
● Fine-tuned stored procedures and views for performance, without impacting data quality and integrity, using SQL Profiler and execution plans.
● Worked on Data modeling, Facts, and Dimensions.
● Worked on developing the scheduler to store daily events into the database using Node.js.
● Prepared Data Flow Diagrams, Entity Relationship Diagrams, and data dictionaries, and prepared scripts for deployment across multiple environments.
● Added and modified reports in SSRS as per business requirement using Visual Studio.
● Troubleshooting day-to-day Level 1 issues coming through the ServiceNow ticketing tool, providing root causes and fixes.
● Worked with large data volumes including structured and unstructured data to integrate into Enterprise
Data warehouse using SSIS and Python.
● Performed peer reviews, code optimization, coding standards, performance optimization.
● Used GitHub as source repositories.
● Worked on Agile software development methodology with scrum model using Jira.

Environment: SQL Server 2014, Shell Scripting, Python, SSRS, SSIS, GitHub, Agile, Jira.

Developer
RevRec – Dolby Laboratories Inc – San Francisco, CA – June 17 to Oct 18

Responsibilities:

● Requirement gathering, analysis, task breakdowns, implementation, testing, and deployment.


● Designed and implemented data modeling using calculation views in SAP HANA based on requirement documents.
● Fine-tuned stored procedures and views for performance, without impacting data quality and integrity, using SQL Profiler and execution plans.
● Created script-based calculation views using stored procedures to handle complex scenarios.
● Created calculated measures, restricted measures, input parameters and constant filters to satisfy
reporting requirements.
● Worked with the Admin team to set up SLT for real-time replication of CRM/ECC tables.
● Prepared SSIS/Python scripts for deployment and deployed the changes across multiple environments.
● Worked on Agile software development methodology with scrum model using Jira.
Environment: SQL Server 2014, Unix Shell Scripting, Tableau, SSIS, Jira, Agile.

Developer
GLS (Gas Logging System) – Pacific Gas & Electric (PG&E) – San Ramon, CA - Feb 15 to May 17

Responsibilities:

● Requirement gathering, analysis, task breakdowns, implementation, testing, and deployment.


● Created tables, computed columns and indexes in the database using SSMS.
● Wrote T-SQL queries, stored procedures, and views, and created ETL packages using SSIS to load data from Oracle to SQL Server using different controls such as transformations, control flow tasks, data flow tasks, containers, and event handlers.
● Deployed the ETL package in SQL Server using Integration Services Catalogs and scheduled the package
using SQL jobs.
● Performed Optimization and Performance Tuning of stored procedures and queries using SQL Profiler and
Execution Plan.
● Extensively used joins and subqueries to simplify complex queries involving multiple tables.
● Created report datasets and used stored procedure parameters in SSRS.
● Created Reports UI using Kendo Grid and implemented ExportToExcel.
● Created an SPA using AngularJS to bind JSON objects to HTML elements using MVC.
● Designed components using ReactJS with ES6 class definitions and the Flux architecture.
● Used Team Foundation Server (TFS) as source control for the code repository.
● Worked on documentation (FSD, TSD, and flow diagrams), code optimization, and peer reviews.
● Prepared SQL scripts for deployment and deployed the new changes in multiple environments.
● Worked on Agile software development methodology with scrum model using Jira.

Environment: SQL Server 2012, SSMS, JavaScript, C#, SQL Server Business Intelligence Development Studio (BIDS),
SSIS, SSRS, Visual Studio 2015, TFS, Agile, Jira, Jenkins.

Tech Lead
Avalara – Feb 14 to Jan 15
Environment: SQL Server 2008 R2, SSMS, C#, SSIS, Visual Studio 2012, Agile, Jira, TFS

Tech Lead
UDT Migration – Bank of New York (BNY) Mellon – May 12 to Jan 14
Environment: SQL Server 2008 R2, SSMS, C#, SQL Server Business Intelligence Development Studio (BIDS), SSIS,
Visual Studio 2012, Agile, Jira, TFS

Senior Developer
CurePet / EBC CAT – Microsoft / Hitachi Consulting – July 10 to Apr 12
Environment: MVC3, Silverlight, MVP, C#, jQuery, JSON, MVVM, WCF, Entity Framework, Unity Framework, LINQ,
NUnit, WindsorCastle, Telerik, Design Patterns, Windows Azure, SQL Azure, Data Sync, ADFS, ReSharper, SSIS, SQL
Server 2008, Visual Studio 2008/10, TFS, Agile, Jira, IIS.

Software Developer & Development Engineer in Test (SDET)


MSI/MIO Privacy/MSN Media Services TV Listings – Microsoft/HCL Technologies – Mar 08 to Jun 10
Environment: SQL Server 2008, Visual Studio 2008, Ajax, JavaScript, C#, WCF, Entity Framework, SSIS, TFS, and
Agile.

Developer
CAF America – CAFAmerica – Feb 05 to Feb 08
Environment: JavaScript, CSS, Web Services, ASP.Net, ADO.Net, VB.Net, Telerik, SQL Server 2005, SVN, Visual
Studio 2005, IIS.
