
FAKIR MOHAMMED

: 732-429-4608, : fakir_m@yahoo.com

Certified Hadoop Developer (Big Data) | Apache Cassandra Developer Associate | Aviatrix Multi-Cloud Network Associate

Career Summary
Overview:
 Seasoned Sr/Lead Data Architect / Big Data Integration / BI / EDW / MDM Architect / Data Modeler
/ ETL Developer / Data Analyst / Database Developer consultant with 20+ years of experience.
 Excellent Business Knowledge in Pharmaceutical, Retail Banking, Trade Finance, Investment, Insurance,
Power Industries, Health Care (clinical and insurance), Fintech and Retail Domains.
Technical:
 Extensive experience in the Major Cloud Technologies (AWS, Azure and GCP).
 Hands-on experience implementing Big Data technology solutions (Pig, Hive, HBase, Cassandra 3.x) and data analytics using AWS Athena, Snowflake, Pig, Hive, Scala, Spark 2.x and Impala with Hadoop.
 Strong knowledge of columnar databases, e.g., Redshift, HP Vertica, Snowflake.
 Integration between the Veeva Platform (Salesforce.com) and ODS / EDW, SAP Hybris, IBM OMS and
MS Dynamics CRM.
 Expert knowledge in NoSQL DBs (MongoDB 2.6, Cosmos DB), Google BigQuery & Teradata 15.
 Expert knowledge in BI tools like Tableau, Cognos, Business Objects XI, Oracle BI Suite, Microsoft SSRS.
 Data modeling using Erwin, Embarcadero ER/Studio, Oracle SQL Developer Data Modeler and MS Visio
data modeling tools. Extensive logical and physical design experience.
 Well versed in Data Vault 2.0 and data warehousing methodologies, e.g., Kimball, Inmon.
 Expert Experience and Knowledge in Master Data/Metadata Management (MDM), Data Quality and Data
Governance implementation.
 Extensive experience in Data Conversion and Migration from Legacy/Cloud systems (DL, EDW, OLTP).
 Extensive experience working with structured, semi-structured (XML, JSON, Parquet, etc.) and unstructured data.
 Extensive experience in SQL tuning, PL/SQL tuning, Application tuning.
 ETL using Oracle, Informatica PowerCenter & IICS, SSIS, OWB, Talend & AWS Glue.
 Strong experience in implementing Machine Learning (ML) algorithms on various datasets.
 Extensive experience in Unix Shell, Perl and Python Scripting.
 Expert knowledge of Web Apps, e.g., .Net Architecture, J2EE Architecture.
 Testing - Unit, Functional and Integration. Possess a commitment to Quality.
Additional Business & Soft Skills:
 Excellent experience with clinical life science data, e.g., clinical trials.
 Excellent experience with treasury transactions in banking, such as multi-currency conversion, foreign
export & import transactions, deposits, loans and running packing credit.
 Expert knowledge of the project management process; PMP candidate.
 Strong development work with Lean, Agile and Scrum methodologies.
 Self-motivated and highly committed to responsibilities, with the ability to work independently and perform well
within cross-functional teams.

Education

 Master of Science in Computer Science


 Bachelor of Science in Computer Science
 See Certifications at end of resume
Professional Experience

MerchantE, GA (Remote)
Lead Data Architect Jun 2020 – Present

Responsibilities:
 Worked with the business analysts to understand the project specification and helped them complete it.
 Defined user requirements and developed the dataflow diagrams, process models and entity relationship diagrams.
 Gathered business requirements and built/modified the Logical Data model.
 Built the physical data model for Enterprise Business Intelligence for existing objects using the Navicat data
modeling tool.
 Designed, built and implemented the Master Data Management (MDM), Data Quality and Data Governance
processes for the upstream business.
 Working on Data Migration from Oracle to AWS Aurora (DMS, SCT)
 Extensively tuning Stored Procedures, Triggers, Packages and Functions in PL/SQL for OLTP applications.
 Query Optimization & SQL tuning.
 Performed unit testing in accordance with the unit test plan.
 Migrated the EDW from Redshift to Snowflake and designed the star schema model.
 Used Snowpipe for continuous file ingestion and SnowSQL for querying data in Snowflake.
 Loaded data into BigQuery tables and performed analytic queries against them.
 Participated in full Software Development Life Cycle (SDLC) following Agile Methodology (SCRUM)

Barnes & Noble College, NJ


Lead Data / Integration Architect Nov 2019 – Jun 2020

Environment: AWS - S3, Lambda, Athena, Glue, RDS - PostgreSQL, ECS, SQS, SNS, CloudWatch, GIT,
Jenkins, Oracle 12c, SAP Hybris, IBM OMS, Splunk, AS400, JMS Client, Zeppelin Notebook 0.6, Scala, Spark 2.x,
Python 3.x.

Barnes & Noble College is one of the major college book sellers in the United States.
Responsibilities:
 Working with the business / IT stakeholders to understand the project requirements and help implement them.
 Understanding user requirements and developing the dataflow diagrams, process models and enterprise MDM
architecture diagram.
 Participating in full Software Development Life Cycle (SDLC) following Agile Methodology (SCRUM)
 Gathering business requirements and building/modifying the Conceptual & Logical Data model.
 Designed, built and implemented the integration between MDM and SAP Hybris using AWS Glue.
 Designed, built and implemented the integration between MDM and IBM OMS using AWS SQS / JMS
Client.
 Designed, built and implemented the Master Data/Metadata Management (MDM), Data Quality and Data
Governance processes for the upstream business.
 Working with DevOps to set up the MDM environment in the AWS cloud (S3, RDS, Glue, Spark,
Lambda, SQS, Athena, VPC, subnets).
 Built the technical design and implemented on-prem OLTP system data migration to a cloud OLTP system.
 Ingested datasets daily to find data patterns and design ML algorithms using AWS SageMaker.
 Working with QA to verify all the test cases, integration tests and performance tests.
 Analyzed the major long-running ETL jobs, found the bottlenecks and implemented solutions
to improve the performance of those jobs.
JetBlue, NY
Lead Data/ Integration Architect May 2019 – Nov 2019

Environment: MS Azure, MS Visio, Star schema, ODS/EDW schema, Informatica 10.x, ER Studio 18, MS
SharePoint, MS SQL Server 2017 (SSIS), Bitbucket, MDM Tool (Tibco), Hackolade NoSQL data modeling tool, Azure
Databricks, Data Factory, Cosmos DB, Blob Storage (DL), Tibco Spotfire, Scala, Spark 2.x and Python 3.x.

JetBlue Airways Corporation, stylized as jetBlue, is a major American low-cost passenger airline and the sixth
largest in the United States by passengers carried.
Responsibilities:
 Working with the business analysts to understand the project specifications and helping to complete them.
 Defining user requirements, developing the dataflow diagrams, process models, entity relationship diagrams.
 Participating in full Software Development Life Cycle (SDLC) following Agile Methodology (SCRUM)
 Gathering business requirements and building/modifying the Conceptual & Logical Data model.
 Analyzing business requirements to build/modify the data mart (star schema).
 Built the NoSQL logical & physical data models using the Hackolade tool.
 Built the conceptual, logical and physical data models for Enterprise Business Intelligence with new
analytics for the next-generation data warehouse using the ER Studio tool.
 Designed, built and implemented the integration between the ODS and the Data Lake using MS Web Apps and
Service Bus.
 Designed and built the integration for consuming real-time event messages using EMS (MS Event
Hub, Web Apps and Data Vault).
 Architected the migration design and implemented EDW/OLTP system data migration to cloud Data Lake /
EDW systems.
 Worked with developers, data quality analysts, customer support and operations technicians to troubleshoot issues.
 Ingested datasets periodically for data science and implemented them through Azure ML (Machine Learning).

Wyndham Hotels & Resorts, NJ


Lead Data / Integration Architect Jan 2019 – Apr 2019

Environment: AWS (S3, Lambda, Redshift, etc.), ODS / EDW (Star Schema), Informatica iPaaS, IDQ, MS
SharePoint, PostgreSQL, Bitbucket, MDM Tool (Informatica 10.x), Oracle DB.

Wyndham Hotels and Resorts is an international hotel and resort chain based in the United States. It has locations
in China, Canada, Mexico, Colombia, Ecuador, Turkey, Germany, the UK, the Caribbean, Indonesia and Margarita
Island in Venezuela.
Responsibilities:
 Worked with the business / IT stakeholders to understand the project requirements and helped implement
them.
 Understood user requirements and developed the dataflow diagrams, process models and enterprise DW architecture
diagram.
 Analyzed the ODS / Data Mart schema based on business requirements and implemented solutions to improve the
performance of SQL queries in the ETL jobs.
 Analyzed the major long-running ETL jobs, found the bottlenecks and implemented solutions
to improve the performance of those jobs.
 Tuned application database system performance by analyzing data, making recommendations and
implementing application database tuning improvements.
 Performed a POC on Snowflake to set up the EDW and migrate data from Redshift to Snowflake.

Valeant Pharmaceuticals (Bausch Health), NJ


Sr. Solution / Data Integration Architect Feb 2018 – Dec 2018

Environment: MS Azure, Windows Server 2016, JAVA SE 8, Star schema, ODS schema, IICS (Informatica),
Vertabelo Cloud DM, MS SharePoint, MS SQL Server 2017 (SSAS, SSIS, SSRS), MS-VSS, MDM Tool (Reltio
Cloud) and VEEVA Platform (Salesforce.com).
Valeant Pharmaceuticals (Bausch Health) is focused on improving people's lives with its health care products.
The company manufactures and markets a broad range of branded and generic pharmaceuticals, over-the-counter (OTC)
products and medical devices (contact lenses, intraocular lenses, ophthalmic surgical equipment and aesthetic
devices).
Responsibilities:
 Worked with the business analysts to understand the project specifications and helped complete them.
 Defined user requirements and developed the dataflow diagrams, process models and entity relationship diagrams.
 Participated in full Software Development Life Cycle (SDLC) following Agile Methodology (SCRUM)
 Gathered business requirements and built/modified the Conceptual & Logical Data model.
 Analyzed business requirements to build/modify data marts (star/snowflake schema).
 Built the conceptual, logical and physical data models for Enterprise Business Intelligence with new
analytics for the next-generation data warehouse using the Vertabelo Cloud DM tool.
 Designed, built and implemented the integration between the Veeva Platform (Salesforce.com) and the ODS
using the Informatica IICS (cloud) API.
 Designed, built and implemented the Master Data/Metadata Management (MDM), Data Quality and Data
Governance processes for the upstream business.
 Integrated the ODS/EDW with MS Dynamics CRM to maintain customer data using IICS.
 Designed, built and implemented Business Intelligence applications/reports/dashboards using Tableau 10.x.
 Capacity planning, performance tuning and problem resolution for DW staging and EDW databases.
 Creating technical requirements to use SSIS, SSAS cubes/dimensions, SSRS for BI development.
 Implemented slowly changing dimensions and data transformations using SSIS components.
 Loaded data into BigQuery tables and performed analytic queries against them.
 Tuned application database system performance by analyzing data, making recommendations and
implementing application database tuning improvements.

ERT, NJ
Sr. / Lead Data Architect Sep 2010 – Jan 2018

Environment: Red Hat Enterprise Linux, Hortonworks HDP 2.x, Amazon AWS, MongoDB 2.6, Hadoop 2.x,
Cassandra 3.x, Hive, Sqoop and Pig, Talend, Apache NiFi, ORACLE 11g/12c, PL/SQL, SQL*Loader, Star
schema, SAP, SQL Server 2008/2012, Erwin 9.x, Enterprise Architect (UML Tool), Informatica MDM 9.x/10.x,
Tableau 9, TOAD v10/11.x, SQL Developer, Bugzilla, JIRA, SVN, GIT, MSBI, JAVA SE 7 & 8, IBM Informix,
Redshift, Neo4j, Teradata 15, Medidata, Spark 2.x, Python 3.x.

Responsibilities:
 Design and build scalable platforms to collect & analyze large amounts of structured and unstructured
data
 Implemented Hortonworks HDP 2.x for big data by setting up an 11-node HDFS cluster on EC2 instances.
 Designed the jobs that Sqooped/loaded data from Oracle and other databases into the HDFS cluster.
 Designed Apache NiFi flows that pull data from Kafka and load it into Cassandra.
 Created and worked Sqoop jobs with bulk & incremental loads to populate Hive tables. Pushed data using Sqoop
from Hive tables to the reporting database in Oracle.
 Built the Conceptual & Logical Data models (EDW & OLTP) using the Erwin and Oracle SQL Developer
Data Modeler tools.
 Participating in full Software Development Life Cycle (SDLC) following Agile Methodology (SCRUM)
 Gathering business requirements and build/modify the Conceptual & Logical Data model
 Ingested data from SAP as flat files into the ODS, fed it to the EDW, and vice versa.
 Data Migration from Legacy Databases to AWS RDS and Redshift DBs.
 Analyzing business requirements to build/modify data marts (star/snowflake schema).
 Created logical and physical dimensional data mart models. Created the data dictionary and database
documentation.
 Built the Amazon EC2 database environment (SQL Server, Oracle, Redshift) for BI (SSRS, Cognos,
Tableau).
 Designed, built and implemented Business Intelligence applications/reports/dashboards using SSRS / Cognos
tools.
 Working on Big Data (Hadoop, Pig, Hive & MongoDB) and cloud computing technology POCs with
different clients for data analytics.
 Build the Big Data & NoSQL Conceptual, Logical & Physical Data modeling using Data Vault 2.0.
 Built the Raw Data Vault using Hub, Link and Satellite tables and the Business Vault using PIT, Bridge and
Reference tables and the Informational Vault (Data Mart).
 Designed data models for large-scale distributed data processing applications, real-time analytics and non-
relational data stores.
 Working on Big Data Analytics platforms and ETL in the context of Big Data using Talend.
 Implementing MDM, metadata, profiling and data governance, and integrating with EDC (Medidata).
 Extracted unstructured data using big data technologies and moved it to the EDW.
 Capacity planning, performance tuning and problem resolution for OLTP, DW staging and EDW databases.
 Build Technical Design and Implemented OLTP system data migration to Cloud Data Lake / EDW /
ODS systems.
 Creating technical requirements to use SSIS, SSAS cubes/dimensions and SSRS for business intelligence
development.
 Implemented slowly changing dimensions and data transformations using SSIS components.
 Designed, developed and configured reports using SQL Server Reporting Services (SSRS), Cognos and
Tableau 9.x.
 Wrote SQL Server Integration Services (SSIS) packages and deployed them to development/testing and production servers.
 Extensively developing and tuning Stored Procedures, Triggers, Packages and Functions in PL/SQL for OLTP
applications.
 Query Optimization & SQL tuning.
 Worked with the developers, data quality analysts, customer support and operations technicians to troubleshoot
all databases.
 Tuned application database system performance by analyzing data, making recommendations and
implementing application database tuning improvements.

Active Health Management (AETNA), NYC


Data Architect Jul 2007 – Sep 2010

Environment: Red Hat Enterprise Linux 5, ORACLE 10g, PL/SQL, SQL*Loader, Unix Shell Scripts, JAVA SE 6, Star
schema, ODS schema, Informatica 8.x, Erwin 7.x, Enterprise Architect (UML Tool), TOAD v9.7, SQL Developer,
webMethods 6.5, Eclipse BIRT, Mantis, MS SharePoint, MS SQL Server (SSAS, SSIS, SSRS), MS-VSS, Python 2.x.

Active Health is a technology-driven health management company that combines proprietary technology and clinical
knowledge. It provides information that enables doctors and patients to make more informed medical decisions.
Developed a web-based application (OLTP) called Admin Suite, which supports multiple applications (OLTP) such as Care
Engine, CRS and PHR.

Responsibilities:
 Worked with the business analysts to understand the project specification and helped them complete it.
 Defined user requirements and developed the dataflow diagrams, process models and entity relationship diagrams.
 Participated in full Software Development Life Cycle (SDLC) following Agile Methodology (SCRUM)
 Gathered business requirements and built/modified the Conceptual & Logical Data model.
 Analyzed business requirements to build/modify data marts (star/snowflake schema).
 Built the conceptual, logical and physical data models for Enterprise Business Intelligence with new
analytics for the next-generation data warehouse using the Erwin and ER/Studio data modeling tools.
 Designed, built and implemented the Master Data/Metadata Management (MDM), Data Quality and Data
Governance processes for the upstream business.
 Designed, built and implemented Business Intelligence applications/reports/dashboards using Business Objects &
Cognos tools.
 Extensively developing and tuning Stored Procedures, Triggers, Packages and Functions in PL/SQL for OLTP
applications.
 Query Optimization & SQL tuning.
 Performed unit testing in accordance with the unit test plan.
 Performed source code review
 Participated in product meetings.
 Implemented MDM, metadata, profiling, data governance and data quality in projects.
 Implemented the Kimball and Bill Inmon data warehousing methodologies.
 Capacity planning, performance tuning and problem resolution for OLTP, ODS & EDW databases.
 Worked with client governance bodies to define reference architectures for business intelligence, reporting and
data integration
 Implemented slowly changing dimensions and data transformations using Informatica and SSIS.
 Supported Data Warehouse application and databases
 Developed Informatica packages and deployed them to development/testing and production servers.
 Tuned application database system performance by analyzing data, making recommendations and
implementing application database tuning improvements.
 Prepare IQ/OQ documents, Functional Specification documents, System Design Specification documents,
Requirements Traceability Matrix documents, Quality Assurance Plan documents, Cross Reference Design
documents, Configurations Items List documents etc.

Merck & Co., Inc., NJ


Lead Database Developer / Lead ETL Developer Oct 2006 – Jul 2007

Environment: HP-Unix 11.x, Oracle 9i /10g, Informatica 8.x, SQL Plus, PL/SQL, UNIX shell script, AWK, Perl Script,
Erwin 4.x, Toad 8.6, SQL Navigator 4.x, MS Office, TIBCO Web Service.

Merck & Co., Inc. is a global research-driven pharmaceutical company, established in 1891, that discovers, develops,
manufactures and markets vaccines and medicines to address unmet medical needs. The company devotes extensive
efforts to increase access to medicines through far-reaching programs and is focused on turning cutting-edge science
into breakthrough medicines.

Responsibilities:
 Work with data analysts on data design, data mapping and data mining.
 Handle the data mart maintaining clinical trial data for testing at different levels and events.
 Handle a separate data mart to measure cycle-time metrics.
 Maintain and modify the CORE database to extract data from different applications such as CTMS, CTS, SPIDER,
RDR and CDR.
 Create Objects, Packages, Procedures, Functions and SQL queries to manipulate CORE data.
 Resolve data integration issues across systems, which are integrated with CORE database.
 Periodically monitor CORE database for performance and logs.
 Periodically design and execute database performance measures, e.g., partitioning, query tuning, re-indexing,
analyzing etc.
 Wide use of PL/SQL for extracting, transforming and loading data from the OLTP database into the OLAP database in
Oracle.
 Implement MDM, metadata, profiling, data governance and data quality in projects.
 Maintain CORE database documentation.
 Work with Reporting Services to deliver Cognos reports by accessing the CORE database.
 Prepare IQ/OQ documents, Functional Specification documents, System Design Specification documents,
Requirements Traceability Matrix documents, Quality Assurance Plan documents, Cross Reference Design
documents, Configurations Items List documents etc.
 Set up deployment steps and procedures with the Unix deployment team for IT/UAT and production environments.

MISO, IN


Sr. Oracle Consultant Mar 2006 – Sep 2006

Environment: Sun Solaris 9.x, Oracle 10g/9i, Informatica 7.x, SQL Plus, PL/SQL, UNIX Shell Script, Apache 2.0, MQ
Series, J2EE, Crystal Reports XI, Erwin 4.x, Autosys 4.0, MS Office, Toad 8.6, PVCS.

The Midwest Independent Transmission System Operator maintains a large OLAP database that provides
information about power supply to its contractors, its productivity and services, and the cost-effective delivery of electric
power across much of North America. The Midwest ISO is committed to reliability, the nondiscriminatory operation of
the bulk power transmission system, and to working with all stakeholders to create cost-effective and innovative
solutions for a changing industry.

Responsibilities:
 Worked with the data analysts to understand the project specification and helped them complete it.
 Defining user requirements, developing the dataflow diagrams, process models, entity relationship
diagrams.
 Developed and tuned Stored Procedures, Triggers, Packages and Functions in PL/SQL.
 Query Optimization & SQL tuning.
 Created Materialized Views
 Automated unit testing for PL/SQL packages.
 Wrote UNIX shell scripts to handle tasks such as background processes.
 Extensively used PL/SQL for extracting, transforming, loading data from OLTP database into OLAP
database in oracle.
 Analyzing and testing the loaded data in the OLAP database.
 Extracted data from the OLAP database and published it into the Business Objects tool, which helps the business
analysts analyze business-related queries for a specific period.
 Generating reports based upon user requirements using PL/SQL.
 Monitoring and optimizing database performance through database tuning: SQL tuning and
PL/SQL tuning using EXPLAIN PLAN, Oracle hints, bulk collections and the native PL/SQL compilation
method.

INFOSYS TECHNOLOGIES LTD / NELITO SYSTEMS LTD, India


Senior Executive – Project Implementation Apr 2002 – Feb 2006

Environment: Oracle 9.x/8i, SQL, PL/SQL, Sun Solaris 9.x, Apache 1.3, UNIX Shell Script, Crystal Reports 6, Perl Script,
TOAD v7.x, Visio 2000, MS VSS.

Project Description:
FINACLE™ v7.x Banking ERP, developed by INFOSYS TECHNOLOGIES LTD., is an enhanced product for retail
banking activities and the Trade Finance business. It helps analyze the bank's database for balance sheet, trial balance
and P&L reporting at any particular period, and supports international banking systems and multi-currency
scenarios. Customers can get account information through the phone banking system as well as the internet banking
system, and data can be accessed across the country. It is a centralized, web-enabled solution for the banking
and financial industries and supports various market requirements, with a powerful security system at various
levels. FINACLE™ Banking ERP is recognized as one of the best banking products worldwide.

Responsibilities:
 First level Data Model and Analysis Structure design
 As a team leader, involved in the migration of legacy databases into an Oracle 9i database through tools
developed with UNIX shell scripts.
 Wrote a PL/SQL interface program that imported customers from legacy system interface tables. SQL*Loader /
UTL_FILE was used to populate temporary tables, from which the PL/SQL program validated and
loaded the data into the open interface.
 Developed the reports using PL/SQL, Stored Procedure and third party tools.
 Conversion and migration of data using SQL Loader, PL/SQL stored packages and procedures and UNIX shell
scripts.
 Supporting and managing data loads.
 SQL optimization and tuning.
 Playing a major role in customization and in supporting the end-user training program.
 Playing an important role in help desk activities on a 24/7 basis.
 Expert knowledge in Banking Functionality Domain
 Involved in migration, customization, support and training for the FINACLE™ Banking ERP at different banks
in various branches throughout India.

Certifications

 Cloudera Certified Hadoop Developer (Big Data)


 Oracle Certified Professional
 Informatica Certified Designer
 Microsoft Certified Professional
 Apache Cassandra Developer Associate
 ACE - Multi-Cloud Networking Associate
 AWS Certified Solutions Architect – Associate
 Microsoft Azure Solutions Architect Expert
 Google Cloud Certified – Professional Cloud Architect
