· Proven working experience in MDM-RDM space (Master data and Reference data) · AWS Certified, EKS, Aviatrix, Artifactory, HDP, Terraform 8 - 10 years Bangalore
· Must be open to learn New technologies 4 - 6 years Any
Documentum P2 2 - 4 years Kolkata
Python, AWS, Mongo DB, Document DB, Big Data 8 - 10 years Bangalore
Stibo STEP MDM. Knowledge on AWS cloud will be appreciated. Resource should have good communication, work experience in AMS support Model. 4 - 6 years Bangalore
ing with end users. A detailed understanding and experience of development best practices in below areas: Continuous Integration, branching and merging 8 - 10 years Hyderabad
e + Snowflake + Apache Kafka + Azure Databricks + HDInsights / OpenShift / Azure Kubernetes Services 8 - 10 years Bangalore
/InfoSphere IGC / Denodo / Talend Data Fabric / IBM Data Virtualisation / Watson Knowledge Catalog / IDA, Erwin 6 - 8 years Bangalore
Open Source - Cloudera, Hortonworks 8 - 10 years Bangalore
IT Data SME 4 - 6 years Bangalore
ess Architecture Experience with object-oriented/object function scripting languages: Python, SQL, Scala, Spark-SQL etc. Experience with DevOps. Strong 6 - 8 years Bangalore
n implementing ETL using Pyspark on Azure Databricks. Must have strong knowledge of Software Development Life Cycle including requirement analysis 6 - 8 years Bangalore
ty, Folder Structure, TeamSpace, Choice lists, Entry Templates and Search Templates etc., Hands-on experience in ICN Plugins and External Data Services 2 - 4 years Mumbai
5. Snowflake 4 - 6 years Bangalore
6. Python 4 - 6 years Bangalore
ectional leads during projects requirements development conversations and guide based on best technical solution available. ticket raising? They said time is a critical factor, that we need to find a suitable resource offshore asap. The candidate must have good communication skills 12+ years Bangalore
ing new technologies/capabilities and advise strategically about how technology and tool capabilities can be leveraged to create Analytic and Business 6 - 8 years Pune
e ticket raising? They said time is a critical factor, that we need to find a suitable resource offshore asap. The candidate must have good communication skills 12+ years Bangalore
· Proven working experience in MDM-RDM space (Master data and Reference data) · ory (Infosphere Governance Catalog), Information Management Strategy Design, Co-ordinate with Multiple US based Business Stakeholders, Exposure to r 2 - 4 years Pune
· Proven working experience in MDM-RDM space (Master data and Reference data) · Must be open to learn New technologies 4 - 6 years Any
· Proven working experience in MDM-RDM space (Master data and Reference data) · Must be open to learn New technologies 6 - 8 years Any
· Must be open to learn New technologies 4 - 6 years Any
· Proven working experience in MDM-RDM space (Master data and Reference data) · egration plus Application Integration), ODI, Informatica PowerCenter, Informatica PowerExchange, Informatica Data Quality. Database 2 - 4 years Bangalore
· Must be open to learn New technologies 6 - 8 years Any
ol Conduct logical data analysis and data modeling joint application design (JAD) sessions, documented data-related standards Define data modeling and 8 - 10 years Kolkata
Chain data visualization Should have exposure in R/Python. Good communication - Effectively communicate & interact with internal and external clients 8 - 10 years Kolkata
Data engineering, Azure, Hadoop, NoSQL, Spark, SQL Server. pplication landscape based on data analysis, and requirements code walk through and map to future state architecture on GCP is required. 6 - 8 years Bangalore
nce leading a team, set up projects in GCP and define process for team to implement future state architecture. 8 - 10 years Bangalore
ases. Qualification and Skills Experience of APIs, microservices, Kubernetes and data ingestion (Airflow) and ETL patterns Experience of developing d 8 - 10 years Bangalore
ases. Qualification and Skills Experience of APIs, microservices, Kubernetes and data ingestion (Airflow) and ETL patterns Experience of developing d 6 - 8 years Bangalore
e client master data. Skills include client information architecture, MDM tools including WebSphere Customer Center (DWL), CIIS, Siperion, etc. Experience 4 - 6 years Bangalore
ol Conduct logical data analysis and data modeling joint application design (JAD) sessions, documented data-related standards Define data modeling and 8 - 10 years Kolkata
d managing an organization's data in the cloud (Big Data, analytics, information analysis, data lakes and database management solutions) source. Advanced Knowledge on architecture Relational Database Systems and Big data design framework is required to translate the source data model and reverse 2 - 4 years Pune
Experience of architecting of data ingestion ETL solutions in the cloud (Airflow)6 - 8 years Bangalore
ases. Qualification and Skills Experience of APIs, microservices, Kubernetes and data ingestion (Airflow) and ETL patterns Experience of developing d 6 - 8 years Bangalore
API technologies. Experience in developing data management capabilities or data warehousing is an advantage. Experience in developing masterdata capabilities, data standardization, and quality development. Good team leadership and interaction s 6 - 8 years Bangalore
Experience in regulatory requirements and regulation in the fi 8 - 10 years Bangalore
tic) Experience in building scalable end-to-end data ingestion and processing solutions 2. Writing PL/SQL jobs to run load from staging to dimensional model 3. Excellent communication skills 4. Ability to work independently as well as 6 - 8 years Pune
Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala 8 - 10 years Chennai
Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala 8 - 10 years Chennai
Mandatory): Banking Domain Experience, Scheduling tool like Autosys, Control-M, IBM Tivoli experience. Hands on experience on any ETL tool. Incas 4 - 6 years Pune
Cloudera based Data lake with strong Microsoft understanding and future goal of Cloudera on Azure. BigData Skills and understanding: Hadoop, Scala, SQL 8 - 10 years Bangalore
Cloudera based Data lake with strong Microsoft understanding and future goal of Cloudera on Azure. BigData Skills and understanding: Hadoop, Scala, SQL 8 - 10 years Bangalore
Development experience in Hadoop and Spark component knowledge 2 - 4 years Bangalore
Big Data technologies including API development. Expected to have traditional Application Development background along with knowledge of Analytics 6 - 8 years Hyderabad
Cloudera based Data lake with strong Microsoft understanding and future goal of Cloudera on Azure. BigData Skills and understanding: Hadoop, Scala, SQL 6 - 8 years Bangalore
Cloudera based Data lake with strong Microsoft understanding and future goal of Cloudera on Azure. BigData Skills and understanding: Hadoop, Scala, SQL 6 - 8 years Bangalore
BI-Azure 8 - 10 years Kolkata
0 4 - 6 years Mumbai
ETL Informatica developer must have the following - 3 to 5 years experience. gration with data sources (Mainframe files, RDBMS, SFTP, Kafka) Experience with AWS services like S3, DataSync, RDS. (Good to have) Informatica 6 - 8 years Bangalore
Nice to have -- Experience on SAP Data Conversion in the areas SAP FICO, SAP MM, SAP SD 2 - 4 years Bangalore
6 - 8 years Bangalore
6 - 8 years Bangalore
6 - 8 years Bangalore
6 - 8 years Bangalore
n OLTP and OLAP models Experience working on Tableau 9.x (Desktop, Server, Reader), creating various Reports and Dashboards using different funct 4 - 6 years Any
adata solution) Metadata management development (platform, glossaries, data dictionary, data catalog, Lineage, data models) Data architecture or solution 8 - 10 years Bangalore
ed to information management Experience in developing masterdata capabilities, data standardization and quality development is beneficial Experience i 8 - 10 years Bangalore
OD / Mainframe Developer with 4+ years of relevant experience in CMOD. Good Communication skills. 4 - 6 years Pune
ands-on Bitbucket/Git experience, Experience in software development, Apache Spark experience is a bonus, Big Data experience is a bonus, Pythonic code 8 - 10 years Hyderabad
ands-on Bitbucket/Git experience, Experience in software development, Apache Spark experience is a bonus, Big Data experience is a bonus, Pythonic code 8 - 10 years Hyderabad
6. COBOL knowledge is added advantage.
7. Good communication and writing skills. on Big Data technologies using PySpark. Strong technical abilities to understand, design, write and debug complex code 4 - 6 years Pune
Mandate skills - Pyspark, Kubernetes (desirable) 6 - 8 years Bangalore
gration with data sources (Mainframe files, RDBMS, SFTP, Kafka) Experience with AWS services like S3, DataSync, RDS. (Good to have) Informatica 2 - 4 years Bangalore
gration with data sources (Mainframe files, RDBMS, SFTP, Kafka) Experience with AWS services like S3, DataSync, RDS. (Good to have) Informatica 2 - 4 years Bangalore
rnance, data quality, data preparation, or data architecture Experience in Informatica (Informatica Enterprise Data Catalog) Prior experience in all stages 4 - 6 years Kolkata
Offshore Informatica Cloud & AWS 2 - 4 years Kolkata
Databricks / Snowflake / Cassandra/ Teradata 4 - 6 years Bangalore
ETL Informatica developer must have the following - 3 to 5 years experience. gration with data sources (Mainframe files, RDBMS, SFTP, Kafka) Experience with AWS services like S3, DataSync, RDS. (Good to have) Informatica 6 - 8 years Bangalore
Have a very good grasp on SQL to be able to extract and analyze data. Nice to have -- Experience on SAP Data Conversion in the areas SAP FICO, SAP MM, SAP SD 2 - 4 years Bangalore
Has a good understanding of Entity relationships and can understand the associated business process. 2 - 4 years Pune
OD / Mainframe Developer with 4+ years of relevant experience in CMOD. Good Communication skills. 4 - 6 years Pune
on Big Data technologies using PySpark. Strong technical abilities to understand, design, write and debug complex code. gn kit. Includes integration between Business Intelligence Platforms, which will allow users to toggle easily between reporting and analysis tasks. The re 6 - 8 years Hyderabad
Mandate skills - Pyspark, Kubernetes (desirable) 4 - 6 years Hyderabad
um 3 to 4 years of experience with Enterprise ETL Platforms and processes 3+ years of experience with SQL development, Stored Procedures 2 years of 6 - 8 years Kolkata
ess. Resolve bugs within defined service levels. Manage and own application Bugs and incident life cycle, and drive those to permanent resolutions, imple 2 - 4 years Gurgaon
Looking for an Ab Initio resource who can build data pipelines & handle any new data sourcing needs 6 - 8 years NCR
ETL Tools / Python/Kafka / Spark 6 - 8 years Bangalore
ETL Tools / Python/Kafka / Spark 6 - 8 years Bangalore
Experience in Data Migration from RDBMS to Snowflake cloud data warehouse. 12+ years Hyderabad
ing with end users. A detailed understanding and experience of development best practices in below areas: Continuous Integration, branching and merging 12+ years Hyderabad
ing with end users. A detailed understanding and experience of development best practices in below areas: Continuous Integration, branching and merging 8 - 10 years Hyderabad
Tableau 6 - 8 years Hyderabad
Hands-on with any scripting skills is preferred(Python/Spark) 6 - 8 years Hyderabad
Experience in ETL development & deployment using IBM DataStage 6 - 8 years Hyderabad
Knowledge on AWS 6 - 8 years Hyderabad
Knowledge on dbt (data build tool) 6 - 8 years Hyderabad
rt, Fast Load, MultiLoad, TPump and TPT), Oracle 8.1 with ETL knowledge Good to have (Not Mandatory): Informatica, SQL, UNIX/Linux Shell Scripting 6 - 8 years Hyderabad
FS using Hive and Spark. Experience with Object Oriented Programming using Python and its design patterns. Experience handling Unix systems, for op 4 - 6 years Chennai
erience with Cognos Framework Manager and Cognos Transformer is needed. Should have hands on experience on SQL queries and worked on debugging 4 - 6 years Hyderabad
Tableau 4 - 6 years Hyderabad
Data science workflow exposure, Tableau 6 - 8 years Hyderabad
Cloud pipelines (AWS or Google Cloud) 6 - 8 years Hyderabad
rt, Fast Load, MultiLoad, TPump and TPT), Oracle 8.1 with ETL knowledge Good to have (Not Mandatory): Informatica, SQL, UNIX/Linux Shell Scripting 6 - 8 years Hyderabad
Must have: Strong hands-on Ab Initio experience, along with Unix and Sql 4 - 6 years Bangalore
FS using Hive and Spark. Experience with Object Oriented Programming using Python and its design patterns. Experience handling Unix systems, for op 4 - 6 years Chennai
ted solutions based on Informatica product. Experience extracting data from a variety of sources, and a desire to expand those skills (working knowledge 4 - 6 years Chennai
erience with Cognos Framework Manager and Cognos Transformer is needed. Should have hands on experience on SQL queries and worked on debugging 4 - 6 years Hyderabad
4 - 6 years Bangalore
d hands-on expertise in creating data solutions for client, creating best of breed end-to-end solutions leveraging Cloud and traditional data platform offerings 8 - 10 years Bangalore
evelopment background along with knowledge of Analytics libraries, open-source Natural Language Processing, statistical and big data computing libraries 2 - 4 years Bangalore
ements, including IBM and Client employees and 3rd party vendors. Requires experience with PM methodologies. Skills include Big Data, Analytics, Business 6 - 8 years Bangalore
evelopment background along with knowledge of Analytics libraries, open-source Natural Language Processing, statistical and big data computing libraries 2 - 4 years Bangalore
ge (formerly Ascential) - IBM's WebSphere Data Integration Suite. Skills include designing and developing extract, transform and load (ETL) processes. E 2 - 4 years Bangalore
d in databases. Implementation of one conceptual data model may require multiple logical data models. The last step in data modeling is transforming the 4 - 6 years Bangalore
d in databases. Implementation of one conceptual data model may require multiple logical data models. The last step in data modeling is transforming the 6 - 8 years Bangalore
ements, including IBM and Client employees and 3rd party vendors. Requires experience with PM methodologies. Skills include Big Data, Analytics, Business 6 - 8 years Bangalore
park and DBT, Data Vault/Data Warehouse experience, Strong knowledge of data in general, Strong understanding of Java/Python 8 - 10 years Bangalore
park and DBT, Data Vault/Data Warehouse experience, Strong knowledge of data in general, Strong understanding of Java/Python 6 - 8 years Bangalore
7. Promote innovation in team and able to generate innovative approach to any given problem 4 - 6 years Bangalore
nderstanding of retail domain is preferred, ability to discuss with client business people in their language 4 - 6 years Bangalore
ation, Kofax KIC folder import, email, fax, web service, KAFC - Kofax Analytics for Capture, Kofax Export Script Customization and ), Kofax KAPOW/ 2 - 4 years Bangalore
WhereScape, Snowflake, AWS 4 - 6 years Bangalore
4 - 6 years Bangalore
8 - 10 years Bangalore
8 - 10 years Bangalore
4 - 6 years Bangalore
4 - 6 years Bangalore
Filenet Developer 2 - 4 years Hyderabad
Big Data technologies including API development. Expected to have traditional Application Development background along with knowledge of Analytics 6 - 8 years Hyderabad
performance tuning Strong experience in one of programming languages (Scala, Java) Familiarity with development tools (experience on either IntelliJ / E 8 - 10 years Hyderabad
ment of ETL development work--Design, develop, test, implement and troubleshoot ETL mappings in a large Data Warehouse environment using PowerCenter. Experience in Data Warehouse applications, Oracle PL/SQL and UNIX shell scripts--Build shell scripts, Oracle packages code) skills through Informatica's Enterprise Data Catalog (EDC) or Microsoft Purview or Collibra for curating and the Data Analytics Metadata and define a process 6 - 8 years Any
e process for all ongoing and future developments to ensure that the Business and Technical Metadata definitions remain current on the Enterprise Data Catalog 4 - 6 years Pune
Power BI, Tableau, Cognos, MS Excel (Expert) SQL, Spark Teradata and Oracle PL/SQL 2 - 4 years Bangalore
park and DBT, Data Vault/Data Warehouse experience, Strong knowledge of data in general, Strong understanding of Java/Python 6 - 8 years Bangalore
technical specification documents from business requirements Problem-solving skills o Proficiency in designing and developing Dashboards & Scorecards 4 - 6 years Bangalore
7. Promote innovation in team and able to generate innovative approach to any given problem. forms. The role requires Oracle database knowledge and cloud platform knowledge to be able to troubleshoot the performance issues & bottlenecks, take n 4 - 6 years Bangalore
nderstanding of retail domain is preferred, ability to discuss with client business people in their language 4 - 6 years Bangalore
ation, Kofax KIC folder import, email, fax, web service, KAFC - Kofax Analytics for Capture, Kofax Export Script Customization and ), Kofax KAPOW/ 2 - 4 years Bangalore
WhereScape, Snowflake, AWS 4 - 6 years Bangalore
· Proven working experience in MDM-RDM space (Master data and Reference data) · In depth knowledge in Azure Databricks and data analysis. 2 - 4 years Bangalore
· Must be open to learn New technologies 2 - 4 years Any
GCP - Hadoop / Solution Architect / Tech SME 12+ years Any
Understanding of Agile principles and processes will be a plus. 6. Valid Passport. 6 - 8 years Gurgaon
5. Oversee the team's work, and ensure roadblocks are removed. Self-starter and result-oriented person with an eye for detail and ability to provide inputs/ideas for improvement 6 - 8 years Gurgaon
6. Valid Passport. 6 - 8 years Gurgaon
uding API development. Expected to have traditional Application Development background along with knowledge of Analytics libraries, open-source Natural Language Processing 6 - 8 years Bangalore
ing with end users. A detailed understanding and experience of development best practices in below areas: Continuous Integration, branching and merging 6 - 8 years Hyderabad
ing with end users. A detailed understanding and experience of development best practices in below areas: Continuous Integration, branching and merging 6 - 8 years Hyderabad
ing with end users. A detailed understanding and experience of development best practices in below areas: Continuous Integration, branching and merging 6 - 8 years Hyderabad
ing with end users. A detailed understanding and experience of development best practices in below areas: Continuous Integration, branching and merging 6 - 8 years Hyderabad
uding API development. Expected to have traditional Application Development background along with knowledge of Analytics libraries, open-source Natural Language Processing 4 - 6 years Hyderabad
uding API development. Expected to have traditional Application Development background along with knowledge of Analytics libraries, open-source Natural Language Processing 6 - 8 years Bangalore
ing with end users. A detailed understanding and experience of development best practices in below areas: Continuous Integration, branching and merging 6 - 8 years Hyderabad
ing with end users. A detailed understanding and experience of development best practices in below areas: Continuous Integration, branching and merging 6 - 8 years Hyderabad
ing with end users. A detailed understanding and experience of development best practices in below areas: Continuous Integration, branching and merging 6 - 8 years Hyderabad
ing with end users. A detailed understanding and experience of development best practices in below areas: Continuous Integration, branching and merging 6 - 8 years Hyderabad
uding API development. Expected to have traditional Application Development background along with knowledge of Analytics libraries, open-source Natural Language Processing 4 - 6 years Hyderabad
Big Data technologies including API development. Expected to have traditional Application Development background along with knowledge of Analytics 6 - 8 years Hyderabad
5-7 Yrs Mandatory Skills: ETL Informatica Good to have (Not Mandatory): Detailed Job Description - Experience in ETL Informatica. 6 - 8 years Mumbai
6 - 8 years Bangalore
Prepare all documents for reporting objects. th building dashboard, this will be an individual contributor's role who will be driving changes by themselves and helping us build dashboards. Strong in b 4 - 6 years Pune
Monitor and resolve all desk ticket issues and troubleshoot. Prepare all documents for reporting objects 6 - 8 years Any
Monitor and resolve all desk ticket issues and troubleshoot. Prepare all documents for reporting objects 6 - 8 years Any
Monitor and resolve all desk ticket issues and troubleshoot 6 - 8 years Any
12+ years Any
12+ years Any
perience of which 5+ years in Data Integration - Design, Analysis, Modelling, Mapping. Minimum 2+ yrs on Architect role. Good to have - Knowledge on Retail Domain - Supply chain, JDE, Finance, HR 6 - 8 years Bangalore
Prepare all documents for reporting objects. th building dashboard, this will be an individual contributor's role who will be driving changes by themselves and helping us build dashboards. Strong in b 4 - 6 years Pune
Monitor and resolve all desk ticket issues and troubleshoot. Prepare all documents for reporting objects 6 - 8 years Any
Monitor and resolve all desk ticket issues and troubleshoot. Prepare all documents for reporting objects 6 - 8 years Any
Monitor and resolve all desk ticket issues and troubleshoot 6 - 8 years Any
ecture and integration. modeling, metadata management, extract-transform-load (ETL), data staging techniques, to integrate with multiple end systems to 6 - 8 years Any
5. Oversee the team's work, and ensure roadblocks are removed 4 - 6 years Bangalore
6. Valid Passport.
5. Oversee the team's work, and ensure roadblocks are removed 4 - 6 years Gurgaon
6. Valid Passport.
5. Oversee the team's work, and ensure roadblocks are removed 4 - 6 years Gurgaon
6. Valid Passport.
5. Oversee the team's work, and ensure roadblocks are removed 6 - 8 years Gurgaon
6. Valid Passport.
5. Oversee the team's work, and ensure roadblocks are removed 4 - 6 years Gurgaon
6. Valid Passport.
5. Oversee the team's work, and ensure roadblocks are removed 4 - 6 years Gurgaon
6. Valid Passport. 6 - 8 years Gurgaon
6 - 8 years Hyderabad
n Ab Initio development - GDE, Co-op > 3.4.x, Plan, Metaprogramming, MHUB (Knowledge of other Ab Initio products would be an added advantage) 2 - 4 years Pune
n Ab Initio development - GDE, Co-op > 3.4.x, Plan, Metaprogramming, MHUB (Knowledge of other Ab Initio products would be an added advantage) 2 - 4 years Pune
n Ab Initio development - GDE, Co-op > 3.4.x, Plan, Metaprogramming, MHUB (Knowledge of other Ab Initio products would be an added advantage) 2 - 4 years Pune
Experience in building scalable end-to-end data ingestion and processing solutions. n Ab Initio development - GDE, Co-op > 3.4.x, Plan, Metaprogramming, MHUB (Knowledge of other Ab Initio products would be an added advantage) 2 - 4 years Pune
Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala 4 - 6 years Pune
Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala 4 - 6 years Pune
Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala 4 - 6 years Pune
Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala 2 - 4 years Pune
Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala 4 - 6 years Pune
Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala 4 - 6 years Pune
Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala 4 - 6 years Pune
Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala 4 - 6 years Pune
Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala 4 - 6 years Pune
n Ab Initio development - GDE, Co-op > 3.4.x, Plan, Metaprogramming, MHUB (Knowledge of other Ab Initio products would be an added advantage) 2 - 4 years Pune
n Ab Initio development - GDE, Co-op > 3.4.x, Plan, Metaprogramming, MHUB (Knowledge of other Ab Initio products would be an added advantage) 2 - 4 years Pune
n Ab Initio development - GDE, Co-op > 3.4.x, Plan, Metaprogramming, MHUB (Knowledge of other Ab Initio products would be an added advantage) 2 - 4 years Pune
n Ab Initio development - GDE, Co-op > 3.4.x, Plan, Metaprogramming, MHUB (Knowledge of other Ab Initio products would be an added advantage) 2 - 4 years Pune
n Ab Initio development - GDE, Co-op > 3.4.x, Plan, Metaprogramming, MHUB (Knowledge of other Ab Initio products would be an added advantage) 2 - 4 years Pune
n Ab Initio development - GDE, Co-op > 3.4.x, Plan, Metaprogramming, MHUB (Knowledge of other Ab Initio products would be an added advantage) 2 - 4 years Pune
n Ab Initio development - GDE, Co-op > 3.4.x, Plan, Metaprogramming, MHUB (Knowledge of other Ab Initio products would be an added advantage) 2 - 4 years Pune
n Ab Initio development - GDE, Co-op > 3.4.x, Plan, Metaprogramming, MHUB (Knowledge of other Ab Initio products would be an added advantage) 6 - 8 years Pune
n Ab Initio development - GDE, Co-op > 3.4.x, Plan, Metaprogramming, MHUB (Knowledge of other Ab Initio products would be an added advantage) 2 - 4 years Pune
n Ab Initio development - GDE, Co-op > 3.4.x, Plan, Metaprogramming, MHUB (Knowledge of other Ab Initio products would be an added advantage) 6 - 8 years Pune
n Ab Initio development - GDE, Co-op > 3.4.x, Plan, Metaprogramming, MHUB (Knowledge of other Ab Initio products would be an added advantage) 2 - 4 years Pune
n Ab Initio development - GDE, Co-op > 3.4.x, Plan, Metaprogramming, MHUB (Knowledge of other Ab Initio products would be an added advantage) 6 - 8 years Pune
n Ab Initio development - GDE, Co-op > 3.4.x, Plan, Metaprogramming, MHUB (Knowledge of other Ab Initio products would be an added advantage) 2 - 4 years Pune
n Ab Initio development - GDE, Co-op > 3.4.x, Plan, Metaprogramming, MHUB (Knowledge of other Ab Initio products would be an added advantage) 2 - 4 years Pune
n Ab Initio development - GDE, Co-op > 3.4.x, Plan, Metaprogramming, MHUB (Knowledge of other Ab Initio products would be an added advantage) 2 - 4 years Pune
n Ab Initio development - GDE, Co-op > 3.4.x, Plan, Metaprogramming, MHUB (Knowledge of other Ab Initio products would be an added advantage) 2 - 4 years Pune
n Ab Initio development - GDE, Co-op > 3.4.x, Plan, Metaprogramming, MHUB (Knowledge of other Ab Initio products would be an added advantage) 2 - 4 years Pune
n Ab Initio development - GDE, Co-op > 3.4.x, Plan, Metaprogramming, MHUB (Knowledge of other Ab Initio products would be an added advantage) 2 - 4 years Pune
n Ab Initio development - GDE, Co-op > 3.4.x, Plan, Metaprogramming, MHUB (Knowledge of other Ab Initio products would be an added advantage) 2 - 4 years Pune
n Ab Initio development - GDE, Co-op > 3.4.x, Plan, Metaprogramming, MHUB (Knowledge of other Ab Initio products would be an added advantage) 2 - 4 years Pune
n Ab Initio development - GDE, Co-op > 3.4.x, Plan, Metaprogramming, MHUB (Knowledge of other Ab Initio products would be an added advantage) 2 - 4 years Pune
Experience in building scalable end-to-end data ingestion and processing solutions. n Ab Initio development - GDE, Co-op > 3.4.x, Plan, Metaprogramming, MHUB (Knowledge of other Ab Initio products would be an added advantage) 2 - 4 years Pune
Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala 4 - 6 years Pune
Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala 4 - 6 years Pune
Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala 4 - 6 years Pune
Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala 6 - 8 years Pune
Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala 4 - 6 years Pune
Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala 4 - 6 years Pune
Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala 4 - 6 years Pune
Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala 4 - 6 years Pune
Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala 4 - 6 years Pune
Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala 4 - 6 years Pune
Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala 6 - 8 years Pune
Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala 4 - 6 years Pune
Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala 4 - 6 years Pune
Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala 4 - 6 years Pune
Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala 6 - 8 years Pune
Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala 4 - 6 years Pune
Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala 4 - 6 years Pune
Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala. - Experience on SAS DI will be an added advantage 4 - 6 years Pune
- Strong Data Analysis and Problem Solving Skills" 2 - 4 years Pune
…guide the reporting team as well (review the reporting solution). Understand data warehousing (Data Vault Modeling is a plus). Capable to guide the te… | 8 - 10 years | Bangalore
…architect to ensure appropriate requirements are incorporated. Responsible for ensuring Data Quality Management standards are adhered to in new p… | 8 - 10 years | Pune
…analysis and ETL development activities. Worked with Big Data projects across Hadoop and Cloud technologies (or equivalents): S3, Hive, Spark, Pre… | 6 - 8 years | Gurgaon
…Content Navigator Configuration and Customization. In addition, it is required to have hands-on experience with administrator tools such as ACCE, knowl…; …ment data models according to business requirements for specific use cases and/or client's business domains… | 6 - 8 years | Hyderabad
…data modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modelling, Entity-Relationship (ER) Models…; …ment data models according to business requirements for specific use cases and/or client's business domains… | 4 - 6 years | Hyderabad (×7)
…data modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modelling, Entity-Relationship (ER) Models…; …ment data models according to business requirements for specific use cases and/or client's business domains… | 6 - 8 years | Hyderabad (×7)
…data modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modelling, Entity-Relationship…; consulting and guidance regarding the usage of the enterprise integrated logical model for use in development | 6 - 8 years | Hyderabad
…ment data models according to business requirements for specific use cases and/or client's business domains…; 7. Have deep knowledge of data architectures, ODSs, Data warehouses and methodologies. | 6 - 8 years | Pune
…data modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modelling, Entity-Relationship (ER) Models…; …ment data models according to business requirements for specific use cases and/or client's business domains… | 8 - 10 years | Hyderabad (×3)
…data modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modelling, Entity-Relationship (ER) Models…; …ment data models according to business requirements for specific use cases and/or client's business domains… | 4 - 6 years | Hyderabad (×6)
…data modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modelling, Entity-Relationship (ER) Models…; …ment data models according to business requirements for specific use cases and/or client's business domains… | 6 - 8 years | Hyderabad (×4)
…gn kit. Includes integration between Business Intelligence Platforms, which will allow users to toggle easily between reporting and analysis tasks. The re… | 4 - 6 years | NOIDA
…spark, impala, hive, as well as database technology. Source Code Control (experience with Git preferred). Able to perform Unix / Linux scripting… | 6 - 8 years | Hyderabad (×14)
…technologies, e.g. Spark, Hive, Hadoop, Presto, Kafka etc. Understanding or hands-on experience of data visualization challenges, tools used (Qlikview… | 8 - 10 years | Hyderabad (×6)
…spark, impala, hive, as well as database technology. Source Code Control (experience with Git preferred). Able to perform Unix / Linux scripting… | 6 - 8 years | Hyderabad (×3)
…spark, impala, hive, as well as database technology. Source Code Control (experience with Git preferred). Able to perform Unix / Linux scripting… | 4 - 6 years | Hyderabad (×19)
DataStage, QualityStage, IA, Data Migration | 8 - 10 years | Gurgaon
…analysis and ETL development activities. Worked with Big Data projects across Hadoop and Cloud technologies (or equivalents): S3, Hive, Spark, Pre… | 2 - 4 years | Gurgaon
…project plan, budget, structure, schedule and staffing requirements, including IBM and Client employees and 3rd-party vendors. Requires experience with P… | 8 - 10 years | Bangalore
…OLTP and OLAP models. Experience working on Tableau 9.x (Desktop, Server, Reader), creating various Reports and Dashboards using different funct… | 4 - 6 years | Any
…Solutions Using SQL; Design Data Lake Solutions; Design Solutions Using Hadoop; Design Solutions Using MapReduce; Design Solutions Using HDFS | 8 - 10 years | Bangalore
…cloud). Analysis of applications, specifically from the standpoint of extraction, structured and unstructured data repository management. Design and implemen… | 6 - 8 years | Bangalore
…build and manage solutions using Web-based BI reporting using MicroStrategy. Other Preferred Qualifications: Agile experience is preferred | 6 - 8 years | Any
Data Modelling with Banking domain experience | 6 - 8 years | Bangalore
Data Modelling with Banking domain experience | 4 - 6 years | Bangalore
…analysis and ETL development activities. Worked with Big Data projects across Hadoop and Cloud technologies (or equivalents): S3, Hive, Spark, Pre… | 6 - 8 years | Gurgaon
…Content Navigator Configuration and Customization. In addition, it is required to have hands-on experience with administrator tools such as ACCE, knowl…; …ment data models according to business requirements for specific use cases and/or client's business domains… | 6 - 8 years | Hyderabad
…data modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modelling, Entity-Relationship (ER) Models…; …ment data models according to business requirements for specific use cases and/or client's business domains… | 4 - 6 years | Hyderabad (×10)
…data modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modelling, Entity-Relationship (ER) Models…; …ment data models according to business requirements for specific use cases and/or client's business domains… | 6 - 8 years | Hyderabad (×8)
…data modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modelling, Entity-Relationship (ER) Models…; …ment data models according to business requirements for specific use cases and/or client's business domains… | 4 - 6 years | Hyderabad (×6)
…data modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modelling, Entity-Relationship (ER) Models…; …ment data models according to business requirements for specific use cases and/or client's business domains… | 6 - 8 years | Hyderabad (×4)
…gn kit. Includes integration between Business Intelligence Platforms, which will allow users to toggle easily between reporting and analysis tasks. The re… | 6 - 8 years | Bangalore
…spark, impala, hive, as well as database technology. Source Code Control (experience with Git preferred). Able to perform Unix / Linux scripting… | 4 - 6 years | Hyderabad (×35)
…spark, impala, hive, as well as database technology. Source Code Control (experience with Git preferred). Able to perform Unix / Linux scripting… | 6 - 8 years | Hyderabad (×16)
nalysis and ETL development activities. Worked with Big Data projects across Hadoop and Cloud technologies (or equivalents) S3, Hive, Spark, Pre 2 - 4 years Gurgaon
n OLTP and OLAP models Experience working on Tableau 9.x (Desktop, Server, Reader), creating various Reports and Dashboards using different funct 4 - 6 years Any
ude designing and developing extract, transform and load (ETL) processes. Experience includes full lifecycle implementation of the technical components 1 - 3 years Hyderabad
ude designing and developing extract, transform and load (ETL) processes. Experience includes full lifecycle implementation of the technical components 2 - 4 years Hyderabad
iving long term data architecture roadmaps in alignment with corporate strategic objectives. Experience with following message queuing, stream processi 6 - 8 years Bangalore
build and manage solutions using Web-based BI reporting using MicroStrategy Other Preferred Qualifications: Agile experience is preferred 6 - 8 years Any
Data Modelling with Banking domain experience 6 - 8 years Bangalore
Data Modelling with Banking domain experience 4 - 6 years Bangalore
t, schedule, and contractual deliverables, which includes applying techniques for planning, tracking, change control, and risk management. 6 - 8 years Navi Mumbai
Nice to have Skill: Any BI Reporting tools, Cloud Services in Data Analytics experience. Should independently work with business IT and command ove 4 - 6 years Bangalore
acle, Greenplum. Experience working with Data Warehousing, ETL Development and ETL Architecture Excellent communication and troubleshooting sk 4 - 6 years Bangalore
using Informatica Cloud (IICS) 2. Experience of setting Informatica pipeline with Snowflake as target 3. Knowledge on CI/CD using Bitbucket and Jenkin 6 - 8 years Kolkata
rs of experience with change management procedures and SDLC is a must Experience with analytical and operational data store modeling Experience wi 4 - 6 years Bangalore
sources Experience with Data Marts, Data Warehouse structures (e.g., star schema, fact and dimensions) 3+ years Database Experience including Oracle 6 - 8 years Kolkata
ound data profiling, cleansing, parsing, standardization, verification, matching, rules and data quality exception monitoring and handling. 4 - 6 years Any
Should have at least 3+ years of experience in Google Big Query, ETL, Python, Data lake 2 - 4 years Gurgaon
Should have at least 7+ years of experience in Google Big Query, ETL, Python, Data lake 4 - 6 years Gurgaon
Should have at least 9+ years of experience in Google Big Query, ETL, Python, Data lake 8 - 10 years Gurgaon
Should have at least 3+ years of experience in Google Big Query, ETL, Python, Data lake 1 - 3 years Gurgaon
Application Consultant (knowledge of xECM and Archive Server), OpenText 6 - 8 years Ahmedabad
OpenText ECM Consultant / Developer JD: Knowledge of Development and Business Workspace 4 - 6 years Ahmedabad
Google tools like BigQuery, App Engine, AirFlow, Cloud Composer, etc. + Data lakes, data ponds, domain driven design, data dictionaries, content ca 4 - 6 years Bangalore
Google tools like BigQuery, App Engine, AirFlow, Cloud Composer, etc. + Data lakes, data ponds, domain driven design, data dictionaries, content ca 6 - 8 years Bangalore
n - SQL DB and BI Tools - GCP Big Query (preferred) Nice to have Skills: - Java 8 - Python - OLAP Cubes and Star Schema 2 - 4 years Bangalore
n - SQL DB and BI Tools - GCP Big Query (preferred) Nice to have Skills: - Java 8 - Python - OLAP Cubes and Star Schema 6 - 8 years Bangalore
concepts Google Cloud Platform expertise Data Engineer Certification is preferred Coding & CI/CD GitHub Management SQL Secondary Skills: Data W 4 - 6 years Bangalore
- To work on Data Migration Activities from current environment to Azure - Execute tasks based on directions from the Technical Team lead / Architect 2 - 4 years Bangalore
- Execute tasks based on directions from the Technical Team lead / Architect 2 - 4 years Bangalore
ork on solution design of ETL activity - should have a sound knowledge in Informatica Powercenter development along with SQL knowledge - should hav 4 - 6 years Bangalore
rk) Experience in Big Data file formats (e.g. Parquet) Experience in object-oriented programming languages (Python) Solid understanding of Data Model 4 - 6 years Bangalore
4 - 6 years Bangalore
MDM Sustain developer 4 - 6 years Kolkata
strong knowledge in areas SAS development experience with SAS Base, SAS DI, SAS Enterprise Guide and must have SAS ETL development skills. 1 - 3 years Hyderabad
Snowflake / Wherescape developer 6 - 8 years Bangalore
st Export, Fast Load, MultiLoad, TPump and TPT), Oracle 8.1 with ETL knowledge Good to have (Not Mandatory): SQL, UNIX/Linux Shell Scripting, P 4 - 6 years Bangalore
• Should have the capability to work with minimal guidance • Knowledge of Agile way of working is preferred 4 - 6 years Pune
1. Experience of working with Azure cloud based implementations of Snowflake 2. Experience in Data Migration from RDBMS to Snowflake cloud data warehouse 4 - 6 years Pune
GCP - Data Engg- DataProc/ Big Query / Big Table, HDP/Python/PySpark/Spark4 - 6 years Pune
GCP - Data Engg- DataProc/ Big Query / Big Table, HDP/Python/PySpark/Spark6 - 8 years Pune
GCP - Data Engg- DataProc/ Big Query / Big Table, HDP/Python/PySpark/Spark2 - 4 years Pune
on programming language Debugging/troubleshooting of Spark jobs Performance tuning experience for Hadoop/Spark jobs Good To Have Hands-on deve 4 - 6 years Bangalore
t , waterfall chart pie charts etc. Knowledge in connecting HANA DB to Tableau Knowledge in SQL related to Tableau End to End experience in Tablea 6 - 8 years Kolkata
MDM hub development, MDM File Import process, design/build MDM Batch Jobs set up Strong ability to understand, document and communicate technic 4 - 6 years Any
ke Architecture experience in designing and developing projects. Understand ETL, primarily Informatica, for data engineering and knowledge of SQL. Cr 6 - 8 years Bangalore
rs of experience on - ETL Developer Nice to have Skills: - Java 8 - Python - OLAP Cubes and Star Schema 4 - 6 years Bangalore
• Experience in leading • Product Ownership experience 6 - 8 years Bangalore
Informatica power center. Should be able to handle the operational activities independently and resolve issues. Very good SQL knowledge and DW conc 4 - 6 years Bangalore
TL & SQL) knowledge - Able to do dependency analysis for change request & present it before client in a non-ambiguous way. Candidate should also be 4 - 6 years Bangalore
Datawarehouse/BigQuery/Big data Developer/Analyst/Data Engineer 4 - 6 years Hyderabad
perience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc. - Experience/knowledge on AWS cloud services - Experience 2 - 4 years Gurgaon
bjects in the application. Migration and deployment of objects among Development, Test, Production Environments. Reviewing objects prepared by 4 - 6 years Hyderabad
Big Data technologies including API development. Expected to have traditional Application Development background along with knowledge of Analytics 6 - 8 years Hyderabad
QL Server Strong on DW Fundamentals, concepts. Strong knowledge of database design and entity relationships. Data warehouse and 6 - 8 years Hyderabad
experience -3 years of SQL and unix experience -2 Years of Datawarehousing Experience -Nice to have experience in Java 4 - 6 years Bangalore
omponents -Expert with ETL tools using REST and SOAP API. -understanding of Salesforce Data Models and Informatica components -SQL knowledge - 6 - 8 years Bangalore
ustry d. Should be well conversant in English and should have excellent writing, MIS, communication, time management and multi-tasking skills Shoul 2 - 4 years Any
ETL Design and development techniques, Create ETL jobs from source-target mapping documents, Extensive ETL and SQL database skills 4 - 6 years Bangalore
th building dashboard, this will be an individual contributor’s role who will be driving changes by themselves and helping us build dashboards. Strong in b 2 - 4 years Pune
on-prod sites. Refreshing Extracts on non-prod sites. Monitoring the status of extract schedules. Performing Unit Testing on non-prod. Resolving any data 6 - 8 years Bangalore
Primary Skills: IBM DataStage, PL/SQL, Unix 2 - 4 years Any
nd analytical layer Data integration architecture design estimate data engineer efforts (pipelines for ingestion, transformation and storage destination) desi 6 - 8 years Kolkata
namoDB / AWS Kinesis / Apache Kafka / Apache EMR / AWS Glue / Apache Airflow / Lambda + Step Functions / OpenShift / AWS EKS 6 - 8 years Bangalore
Azure - Data Bricks - Spark, Synapse Analytics, ADF - Python 6 - 8 years Bangalore
OpenSource - Big Data - Spark / PySpark or Scala or Java + Talend / Pentaho 6 - 8 years Bangalore
OT products - Document Presentation for SAP Solutions including Live S/4 addon - Imaging Enterprise Scan Exposure to other OT products, Some of the 6 - 8 years Hyderabad
data model and traceback issues related to data to source systems - understand interfaces across different systems and ACM. Person should have worked 6 - 8 years Bangalore
t , waterfall chart pie charts etc. Knowledge in connecting HANA DB to Tableau Knowledge in SQL related to Tableau End to End experience in Tablea 8 - 10 years Kolkata
oncile different – often competing – interests and requirements applicable to solutions, and the need to escalate conflicts timely as appropriate. 6 - 8 years Bangalore
6 - 8 years Bangalore
echnical requirements into detailed design. Perform analysis of vast data stores and uncover insights. Maintain security and data privacy. Create sc 6 - 8 years Bangalore
querying tools, such as Hive and Impala Basic knowledge on LINUX commands. Good communication skills. Follow organization's support process in r 6 - 8 years Bangalore
and developing dashboards, reports, visualizations and storytelling techniques, and customizing out of the box widgets. Hands on expertise on Qlik scr 2 - 4 years Bangalore
on-prod sites. Refreshing Extracts on non-prod sites. Monitoring the status of extract schedules. Performing Unit Testing on non-prod. Resolving any data 2 - 4 years Bangalore
on-prod sites. Refreshing Extracts on non-prod sites. Monitoring the status of extract schedules. Performing Unit Testing on non-prod. Resolving any data 4 - 6 years Bangalore
namoDB / AWS Kinesis / Apache Kafka / Apache EMR / AWS Glue / Apache Airflow / Lambda + Step Functions / OpenShift / AWS EKS 4 - 6 years Bangalore
pment of ETL mappings, workflows Strong SQL skills in database languages like Oracle / Netezza / DB2 Preferred development skills in Unix scripting 4 - 6 years Mumbai
querying tools, such as Hive and Impala Basic knowledge on LINUX commands. Good communication skills. Follow organization's support process in r 6 - 8 years Bangalore
• Mandatory skills required – Spark, Scala, Oozie, HIVE, Shell script, Jenkins, Ansible, Github • Good to have skills – Nifi, Elastic, KIBANA, GRAFANA, Kafka
- Good oral and written communication abilities 2 - 4 years Bangalore
- Proficient in MS Office tools and SDLC life cycle. 2 - 4 years Bangalore
ment Console, Talend JobServer, Talend Runtime, Talend Remote Engine, and all other server components, Open Shift, Kubernetes and Managing Depl 6 - 8 years Bangalore