
JR S Band Primary Location Secondary Location

Application Architect-Enterprise Content Management 7B Hyderabad Pune


Application Architect-Data Platforms 7B Bangalore
Data Engineer-Master Data Management 7A Bangalore
Data Engineer-Big Data 6B Bangalore
Data Engineer-Data Integration 7A Kolkata Kolkata
Business Transformation Consultant-Enterprise Asset Management 7B Gurgaon
erience and technical certifications.
Application Developer-Healthcare Applications 7B Gurgaon - 3) Minimum of 6 years of Curam development experience. 4) At least three years of experience in the design principles of large-sca
Data Engineer-Big Data 7A Bangalore
Application Architect-Data Platforms 7B Bangalore
Data Engineer-Master Data Management 7A Bangalore - his role, successful candidates must have the below knowledge: MDM CE Lead responsible for MDM CE Solution, design, development, unit testing, and integration testin
Data Engineer-Master Data Management 7A Bangalore - his role, successful candidates must have the below knowledge: MDM CE Lead responsible for MDM CE Solution, design, development, unit testing, and integration testin
Data Engineer-Enterprise Content Management 7A Pune Pune
Application Consultant-Healthcare Applications 6B Bangalore
Application Consultant-Healthcare Applications 6B Bangalore - work using the Curam estimation model. Designs and develops solutions for WebSphere Application Server and DB2/Oracle Architect or development experience using W
Application Consultant-Healthcare Applications 6B Bangalore - work using the Curam estimation model. Designs and develops solutions for WebSphere Application Server and DB2/Oracle Architect or development experience using W
Application Consultant-Healthcare Applications 7A Bangalore - work using the Curam estimation model. Designs and develops solutions for WebSphere Application Server and DB2/Oracle Architect or development experience using W
Application Consultant-Healthcare Applications 6B Bangalore
Application Consultant-Healthcare Applications 6B Bangalore - work using the Curam estimation model. Designs and develops solutions for WebSphere Application Server and DB2/Oracle Architect or development experience using W
Data Engineer-Master Data Management 7B Any
Data Engineer-Master Data Management 7A Any
Data Engineer-Master Data Management 7B Any
Data Scientist-Advanced Analytics 7A Bangalore
Data Engineer-Big Data 7B Bangalore
Application Architect-Data Platforms 7B Bangalore
Data Engineer-Big Data 7B Bangalore
Data Consultant-Data Governance 7B Bangalore
Business Transformation Consultant-Operational Risk Management 7A Bangalore
Business Transformation Consultant-Operational Risk Management 7A Bangalore
Data Engineer-Big Data 7B Hyderabad Pune
Data Engineer-Big Data 7B Bangalore
Data Engineer-Big Data 7B Bangalore
Business Transformation Consultant-Operational Risk Management 7B Bangalore
Business Transformation Consultant-Operational Risk Management 7B Bangalore
Business Transformation Consultant-Operational Risk Management 7A Bangalore
Data Scientist-Advanced Analytics 7B Bangalore - Technical skills in Advanced analytics, Machine learning, data insights, creating/
Data Scientist-Advanced Analytics 7B Bangalore - Technical skills in Advanced analytics, Machine learning, data insights, creating/
Data Engineer-Data Integration 6B Bangalore Bangalore
Data Engineer-Data Integration 7B Bangalore
Data Engineer-Enterprise Content Management 7A Pune Bangalore
Data Engineer-Big Data 7B Bangalore Bangalore
Data Engineer-Data Integration 6B Kolkata ANY
Data Engineer-Data Warehouse 7A Bangalore Bangalore
Data Engineer-Data Integration 6B Bangalore Bangalore
Data Engineer-Enterprise Content Management 7A Pune Bangalore
Data Engineer-Data Warehouse 7B Bangalore
Data Engineer-Data Warehouse 7B Bangalore
Application Developer-Process Management (BPM) 7B Pune Hyderabad
Data Engineer-Business Intelligence 7A Hyderabad - The resource must have at least 4 yrs of experience with various versions of Cognos (8x, 10x, 11x) & be able to independently develop Cognos solution
Data Engineer-Business Intelligence 7A Hyderabad - The resource must have at least 4 yrs of experience with various versions of Cognos (8x, 10x, 11x) & be able to independently develop Cognos solution
Data Engineer-Data Integration 7A Bangalore
Data Engineer-Machine Learning 7B Bangalore - Demonstrates an understanding of Machine Learning concepts and techniques in order to solve a business problem. The individual is abl
Project Manager-Data Platforms 7B Bangalore
Data Engineer-Machine Learning 7B Bangalore - Demonstrates an understanding of Machine Learning concepts and techniques in order to solve a business problem. The individual is abl
Data Engineer-Data Integration 6B Bangalore
Project Manager-Data Platforms 7B Bangalore
Data Engineer-Advanced Analytics 7A Bangalore
Data Engineer-Data Integration 7A Bangalore
Data Engineer-Data Integration 7A Bangalore
Data Engineer-Enterprise Content Management 6B Hyderabad
Data Engineer-Advanced Analytics 7A Bangalore
Data Engineer-Master Data Management 6B Any
Data Engineer-Data Modeling 7A Bangalore
Data Engineer-Data Modeling 7A Bangalore
Data Engineer-Data Integration 7A Pune Pune
Data Engineer-Business Intelligence 7A Bangalore Bangalore
Solution Architect-Cognitive Computing 7A Bangalore - a.) IBM Watson Developers with working exp and skill in IBM Watson product and services specific
Solution Architect-Cognitive Computing 7A Bangalore - a.) IBM Watson Developers with working exp and skill in IBM Watson product and services specific
Data Engineer-Enterprise Content Management 6A Bangalore
Data Engineer-Data Integration 6B Hyderabad
Application Developer-Healthcare Applications 7B Any
Data Engineer-Big Data 7A Bangalore Bangalore
Application Developer-Healthcare Applications 7B Any
Data Engineer-Big Data 7B Hyderabad
Data Consultant-Data Governance 7B Bangalore
Data Consultant-Data Governance 7B Bangalore
Data Engineer-Enterprise Content Management 7B Hyderabad
Data Engineer-Enterprise Content Management 7B Hyderabad - It is mandatory to have 4+ years of relevant exp and proven knowledge in the areas of FileNet, Case Manager and
Data Engineer-Enterprise Content Management 7A Bangalore
Data Engineer-Enterprise Content Management 7A Hyderabad
Data Engineer-Enterprise Content Management 7A Hyderabad
Data Engineer-Big Data 7B Hyderabad
Data Consultant-Data Governance 7B Bangalore
Data Consultant-Data Governance 7B Bangalore
Data Consultant-Data Governance 7B Bangalore
Data Engineer-Enterprise Content Management 7B Hyderabad
Data Engineer-Enterprise Content Management 7B Hyderabad - It is mandatory to have 4+ years of relevant exp and proven knowledge in the areas of FileNet, Case Manager and
Data Engineer-Enterprise Content Management 7A Hyderabad
Data Engineer-Enterprise Content Management 7A Bangalore
Data Engineer-Enterprise Content Management 7A Hyderabad
Data Engineer-Enterprise Content Management 7A Hyderabad
Data Engineer-Data Integration 7A Hyderabad
Data Engineer-Data Modeling 6B Pune
Data Engineer-Data Modeling 7A Pune
Data Engineer-Data Modeling 6B Pune
Data Engineer-Business Intelligence 7B Hyderabad
Data Engineer-Data Integration 6A Pune
Data Engineer-Data Modeling 6B Pune
Data Engineer-Data Modeling 6B Pune
Data Engineer-Data Integration 6B Pune
Data Engineer-Data Warehouse 7B Pune
Data Engineer-Big Data 7B Bangalore
Data Engineer-Big Data 7B Bangalore
Data Engineer-Big Data 7A Bangalore
Data Engineer-Big Data 6B Bangalore
Data Engineer-Big Data 6B Bangalore
Data Engineer-Big Data 6B Bangalore
Data Engineer-Data Modeling 7B Bangalore
Data Engineer-Big Data 7B Bangalore
Data Engineer-Big Data 7A Bangalore
Data Engineer-Big Data 7A Bangalore
Data Engineer-Big Data 6B Bangalore
Data Engineer-Big Data 6B Bangalore
Data Engineer-Enterprise Content Management 7A Any
Data Engineer-Enterprise Content Management 7B Pune
Application Developer-Process Management (BPM) 6B Bangalore
Application Consultant-Blockchain 7B Bangalore
Business Transformation Consultant-Cognos Analytics 6B Kolkata
Data Engineer-Big Data 7B Pune
Data Engineer-Data Warehouse 7B Pune
Data Engineer-Big Data 7A Pune
Data Engineer-Big Data 6B Pune
Data Engineer-Big Data 6B Pune
Data Engineer-Big Data 6B Pune
Data Engineer-Big Data 7B Hyderabad
Data Engineer-Big Data 7A Bangalore
Business Transformation Consultant-Operational Risk Management 7B Chennai - meet the business requirements/strategies while ensuring enterprise architecture principles/standards/practices. Have the ability to understand client/server models, d
Data Engineer-Big Data 7B Pune
Data Engineer-Big Data 7B Pune
Data Engineer-Big Data 7A Pune
Data Engineer-Big Data 7A Pune
Data Engineer-Big Data 6B Pune
Data Engineer-Big Data 6B Pune
Data Engineer-Big Data 6B Pune
Data Engineer-Big Data 6B Pune
Data Engineer-Big Data 7B Hyderabad
Data Engineer-Business Intelligence 6B Bangalore
Data Engineer-Big Data 7A Bangalore
Data Engineer-Big Data 7A Bangalore
Business Transformation Consultant-Operational Risk Management 7B Chennai - meet the business requirements/strategies while ensuring enterprise architecture principles/standards/practices. Have the ability to understand client/server models, d
Application Developer-Process Management (BPM) 6B Bangalore - dustry specific capability models, process models, service models, and business object models to accelerate development and testing of Smarter Process projects. Should
Data Engineer-Big Data 7A Hyderabad
Data Engineer-Big Data 7A Hyderabad
Data Engineer-Data Modeling 7B Bangalore
Data Engineer-Advanced Analytics 7B Any
Data Engineer-Enterprise Content Management 7B Hyderabad - It is mandatory to have 4+ years of relevant exp and proven knowledge in the areas of FileNet, Case Manager and
Data Engineer-Data Modeling 7A Hyderabad
Data Engineer-Data Modeling 7A Hyderabad
Data Engineer-Data Modeling 7A Bangalore
Data Engineer-Data Modeling 7A Bangalore
Data Engineer-Data Modeling 7A Bangalore
Data Engineer-Data Modeling 7A Bangalore
Data Engineer-Data Modeling 7A Bangalore
Data Engineer-Data Modeling 7A Bangalore
Data Engineer-Data Modeling 7B Hyderabad
Data Engineer-Data Modeling 7B Hyderabad
Data Engineer-Data Modeling 7B Hyderabad
Data Engineer-Data Modeling 7B Hyderabad
Data Engineer-Data Modeling 7B Hyderabad
Data Engineer-Data Modeling 7B Hyderabad
Data Engineer-Data Modeling 7B Hyderabad
Data Engineer-Data Modeling 7B Hyderabad
Data Engineer-Data Modeling 7B Hyderabad
Data Engineer-Data Modeling 7B Hyderabad
Application Developer-Process Management (BPM) 7A Hyderabad - tegration and BPEL). Apply knowledge of industry specific capability models, process models, service models, and business object models to accelerate development and
Data Engineer-Business Intelligence 6B Hyderabad - b-based BI reporting technology that offers personalized features, such as allowing users to drill down into the content of a report using a menu and comes with an intui
Data Engineer-Data Modeling 7A Bangalore
Data Engineer-Enterprise Content Management 7B Hyderabad - It is mandatory to have 4+ years of relevant exp and proven knowledge in the areas of FileNet, Case Manager and
Data Engineer-Data Modeling 7A Bangalore
Data Engineer-Data Modeling 7A Bangalore
Data Engineer-Data Modeling 7A Bangalore
Data Engineer-Data Modeling 7A Bangalore
Data Engineer-Data Modeling 7A Bangalore
Data Engineer-Data Modeling 7A Bangalore
Data Engineer-Data Modeling 7B Hyderabad
Data Engineer-Data Modeling 7B Hyderabad
Data Engineer-Data Modeling 7B Hyderabad
Data Engineer-Data Modeling 7B Hyderabad
Data Engineer-Data Modeling 7B Hyderabad
Data Engineer-Data Modeling 7B Hyderabad
Data Engineer-Data Modeling 7B Hyderabad
Data Engineer-Data Modeling 7B Hyderabad
Data Engineer-Data Modeling 7B Hyderabad
Data Engineer-Data Modeling 7B Hyderabad
Application Developer-Process Management (BPM) 7A Hyderabad - tegration and BPEL). Apply knowledge of industry specific capability models, process models, service models, and business object models to accelerate development and
Data Engineer-Business Intelligence 7B Bangalore - b-based BI reporting technology that offers personalized features, such as allowing users to drill down into the content of a report using a menu and comes with an intui
Data Engineer-Business Intelligence 6B Hyderabad - b-based BI reporting technology that offers personalized features, such as allowing users to drill down into the content of a report using a menu and comes with an intui
Data Engineer-Data Modeling 7A Bangalore
Data Engineer-Big Data 7B Bangalore
Data Engineer-Data Integration 7B Any
Data Engineer-Master Data Management 7A Any - Hands-on experience with Informatica MDM Hub configurations, Data modeling & Data Mappings, Data validation, Match/Merge rules, etc. Experienc
Data Engineer-Data Modeling 7A Bangalore
Data Engineer-Big Data 7A Bangalore
Data Engineer-Big Data 7A Bangalore
Data Engineer-Master Data Management 7A Bangalore
Data Engineer-Data Integration 6A Hyderabad
Data Engineer-Data Warehouse 7A Bangalore
Data Engineer-Enterprise Content Management 7A Pune
Data Engineer-Enterprise Content Management 7A Pune
Data Engineer-Big Data 7B Pune
Data Engineer-Big Data 7B Pune
Data Engineer-Data Modeling 7A Bangalore
Data Engineer-Data Integration 7B Kolkata - pelines, performance tuning, data modeling and troubleshooting. 3+ Years SQL Experience writing complex queries, performance tuning, creating indexes and joining m
Data Engineer-Big Data 7A Bangalore
Data Engineer-Big Data 7A Bangalore
Data Engineer-Big Data 7A Bangalore
Data Engineer-Data Integration 7A Bangalore - ne cloud assignment (Azure/GCP). Good to have Cloud data engineer / Cloud knowledge in Azure/GCP. Detailed Job Description: Developer should be able to independent
Data Engineer-Big Data 7A Bangalore - Description: 5+ years experience of development and testing in the field of Data & Analytics. Experience in Azure cloud tech-stack. Experience in Big Data technologies (e.g
Data Engineer-Big Data 7A Bangalore - Description: 5+ years experience of development and testing in the field of Data & Analytics. Experience in Azure cloud tech-stack. Experience in Big Data technologies (e.g
Data Engineer-Big Data 7A Bangalore - Description: 5+ years experience of development and testing in the field of Data & Analytics. Experience in Azure cloud tech-stack. Experience in Big Data technologies (e.g
Data Engineer-Master Data Management 7A Bangalore
Data Engineer-Data Integration 6A Hyderabad
Data Engineer-Data Warehouse 7A Bangalore
Data Engineer-Enterprise Content Management 7A Pune
Data Engineer-Enterprise Content Management 7A Pune
Data Engineer-Big Data 7A Pune
Data Engineer-Big Data 6B Pune
Data Engineer-Big Data 6B Pune
Data Engineer-Big Data 6B Pune
Data Engineer-Big Data 7A Pune
Data Engineer-Big Data 6B Pune
Data Engineer-Big Data 7A Pune
Data Engineer-Big Data 7B Pune
Data Engineer-Big Data 6B Pune
Data Engineer-Big Data 7B Pune
Data Engineer-Big Data 6B Pune
Data Engineer-Big Data 7B Pune
Business Transformation Consultant-Operational Risk Management 7B Bangalore
Business Transformation Consultant-Operational Risk Management 7B Bangalore
Business Transformation Consultant-Operational Risk Management 7B Bangalore
Business Transformation Consultant-Operational Risk Management 7A Bangalore
Business Transformation Consultant-Operational Risk Management 7A Bangalore
Business Transformation Consultant-Operational Risk Management 7A Bangalore
Business Transformation Consultant-Operational Risk Management 7A Bangalore
Data Engineer-Big Data 7A Gurgaon - ineer experience - Experience with big data tools: Hadoop, Spark, Kafka, Hive etc. - Experience with relational SQL and NoSQL databases, including Postgres, Cassandra et
Data Engineer-Big Data 6B Gurgaon - ineer experience - Experience with big data tools: Hadoop, Spark, Kafka, Hive etc. - Experience with relational SQL and NoSQL databases, including Postgres, Cassandra et
Data Engineer-Data Integration 7A Bangalore - d Ab Initio Web Services Components. 3. Good knowledge/experience of Ab Initio Meta Programming Concepts. 4. Sound knowledge/experience of Ab Initio Multi File
Data Engineer-Enterprise Content Management 7A Hyderabad
Business Transformation Consultant-Operational Risk Management 7B Bangalore
Business Transformation Consultant-Operational Risk Management 7B Bangalore
Business Transformation Consultant-Operational Risk Management 7B Bangalore
Business Transformation Consultant-Operational Risk Management 7A Bangalore
Business Transformation Consultant-Operational Risk Management 7A Bangalore
Business Transformation Consultant-Operational Risk Management 7A Bangalore
Business Transformation Consultant-Operational Risk Management 7A Bangalore
Data Engineer-Big Data 7A Gurgaon - ineer experience - Experience with big data tools: Hadoop, Spark, Kafka, Hive etc. - Experience with relational SQL and NoSQL databases, including Postgres, Cassandra et
Data Engineer-Big Data 6B Gurgaon - ineer experience - Experience with big data tools: Hadoop, Spark, Kafka, Hive etc. - Experience with relational SQL and NoSQL databases, including Postgres, Cassandra et
Application Architect-Business Intelligence 7B Bangalore - ed of multiple software packages and custom components. This role defines best practices in the critical evaluation and selection and/or development of the software c
Data Engineer-Data Integration 7A Bangalore - d Ab Initio Web Services Components. 3. Good knowledge/experience of Ab Initio Meta Programming Concepts. 4. Sound knowledge/experience of Ab Initio Multi File
Data Engineer-Data Integration 7A Bangalore - d Ab Initio Web Services Components. 3. Good knowledge/experience of Ab Initio Meta Programming Concepts. 4. Sound knowledge/experience of Ab Initio Multi File
Data Engineer-Enterprise Content Management 7A Hyderabad
Data Engineer-Advanced Analytics 7B Bangalore - n, pytorch, and TensorFlow, and Natural Language Processing toolkits such as NLTK or spaCy; Experience in cleaning and transforming data for machine learning; Experie
Data Scientist-Advanced Analytics 7B Kolkata
Data Scientist-Advanced Analytics 7B Kolkata
Data Scientist-Advanced Analytics 7A Kolkata
Solution Architect-Cognitive Computing 7A Any
Solution Architect-Cognitive Computing 7A Any
Application Architect-Data Platforms 7B Bangalore - Minimum 10 yrs for Architect and minimum of 4+ yrs for Data Engineers wi
Data Engineer-Big Data 7B Bangalore
Data Engineer-Data Integration 7B Hyderabad
Project Manager-Data Platforms 7B Hyderabad
Application Developer-Process Management (BPM) 6B Pune
Data Engineer-Advanced Analytics 7B Bangalore - n, pytorch, and TensorFlow, and Natural Language Processing toolkits such as NLTK or spaCy; Experience in cleaning and transforming data for machine learning; Experie
Data Scientist-Advanced Analytics 7B Kolkata
Data Scientist-Advanced Analytics 7B Kolkata
Data Scientist-Advanced Analytics 7B Kolkata
Solution Architect-Cognitive Computing 7A Any
Solution Architect-Cognitive Computing 7A Any
Application Architect-Data Platforms 7B Bangalore - Minimum 10 yrs for Architect and minimum of 4+ yrs for Data Engineers wi
Data Engineer-Big Data 7B Bangalore
Data Engineer-Enterprise Content Management 7B Hyderabad - JD - Document Presentment for SAP (Forms/Letters). Expertise in the follo
Data Engineer-Enterprise Content Management 7B Hyderabad - JD - Document Presentment for SAP (Forms/Letters). Expertise in the follo
Data Engineer-Enterprise Content Management 7B Hyderabad - JD - Document Presentment for SAP (Forms/Letters). Expertise in the follo
Project Manager-Data Platforms 7B Hyderabad
Data Engineer-Data Integration 7B Kolkata
Data Engineer-Big Data 7B Bangalore
Data Engineer-Data Integration 6B Bangalore
Business Transformation Consultant-HR Reinvention 7A Pune Bangalore
Business Transformation Consultant-HR Reinvention 7A Pune Bangalore
Business Transformation Consultant-HR Reinvention 7A Pune Bangalore
Business Transformation Consultant-HR Reinvention 7A Pune Bangalore
Application Architect-Blockchain 7B Bangalore
Application Developer-Blockchain 7A Bangalore
Application Consultant-Blockchain 7B Bangalore
Application Developer-Blockchain 6B Bangalore
Business Transformation Consultant-HR Reinvention 7B Pune
Business Transformation Consultant-HR Reinvention 7A Pune
Application Architect-Blockchain 7B Bangalore
Application Developer-Blockchain 7A Bangalore
Application Consultant-Blockchain 7B Bangalore
Application Developer-Blockchain 6B Bangalore
Business Transformation Consultant-HR Reinvention 7B Pune
Business Transformation Consultant-HR Reinvention 7A Pune
Application Developer-Enterprise Asset Management 7B Bangalore
Application Developer-Enterprise Asset Management 7A Gurgaon
Application Developer-Enterprise Asset Management 7A Gurgaon
Application Developer-Enterprise Asset Management 7A Gurgaon
Business Transformation Consultant-Enterprise Asset Management 6B Gurgaon
Application Architect-Enterprise Asset Management 6B Bangalore
Application Developer-Enterprise Asset Management 6A Bangalore
Business Transformation Consultant-Enterprise Asset Management 6A Bangalore
Business Transformation Consultant-Enterprise Asset Management 6A Bangalore
Business Transformation Consultant-IoT & PLM 7A Mysore
Business Transformation Consultant-IoT & PLM 7B Mysore
Business Transformation Consultant-IoT & PLM 7B Bangalore
Application Developer-Enterprise Asset Management 7B Bangalore
Application Developer-Enterprise Asset Management 7A Gurgaon
Application Developer-Enterprise Asset Management 7A Gurgaon
Application Developer-Enterprise Asset Management 7A Gurgaon
Business Transformation Consultant-Enterprise Asset Management 6B Gurgaon
Business Transformation Consultant-Enterprise Asset Management 6B Gurgaon
Application Architect-Enterprise Asset Management 6B Bangalore
Application Developer-Enterprise Asset Management 6A Bangalore
Business Transformation Consultant-IoT & PLM 7A Mysore
Business Transformation Consultant-IoT & PLM 7A Bangalore
Business Transformation Consultant-IoT & PLM 7B Mysore
Business Transformation Consultant-IoT & PLM 7B Bangalore
Business Transformation Consultant-Industry 4.0 7A Pune
Business Transformation Consultant-Industry 4.0 7B Pune
Business Transformation Consultant-Industry 4.0 7B Pune
Business Transformation Consultant-IoT & PLM 6B Bangalore
Business Transformation Consultant-IoT & PLM 6B Bangalore
Business Transformation Consultant-IoT & PLM 7A Bangalore
Business Transformation Consultant-IoT & PLM 7A Bangalore
Data Scientist-Advanced Analytics 7B Bangalore - on ML algorithms and their usage. Working experience in end to end data science project life cycles from use case framing, data collection, data exploration, model bu
Data Scientist-Advanced Analytics 7B Bangalore - on ML algorithms and their usage. Working experience in end to end data science project life cycles from use case framing, data collection, data exploration, model bu
Data Scientist-Advanced Analytics 7B Bangalore - e learning with Python + Timeseries modelling. Working experience in end to end data science project life cycles from use case framing, data collection, data exploratio
Application Consultant-Procurement 7A Bangalore
Application Developer-Process Management (BPM) 7A Bangalore
Business Transformation Consultant-Industry 4.0 7A Pune
Business Transformation Consultant-Industry 4.0 7B Pune
Business Transformation Consultant-Industry 4.0 7B Pune
Business Transformation Consultant-IoT & PLM 6B Bangalore
Business Transformation Consultant-IoT & PLM 6B Bangalore
Business Transformation Consultant-IoT & PLM 7A Bangalore
Business Transformation Consultant-IoT & PLM 7A Bangalore
Data Scientist-Advanced Analytics 7B Bangalore - on ML algorithms and their usage. Working experience in end to end data science project life cycles from use case framing, data collection, data exploration, model bu
Data Scientist-Advanced Analytics 7B Bangalore - e learning with Python + Timeseries modelling. Working experience in end to end data science project life cycles from use case framing, data collection, data exploratio
Application Consultant-Procurement 7B Bangalore Chennai
Business Transformation Consultant-Supply Chain 7B Bangalore Pune/Kolkata
Application Consultant-Procurement 7B Bangalore
Data Engineer-Machine Learning 7A Bangalore
Business Transformation Consultant-Supply Chain 7B Bangalore Pune/Kolkata
Application Consultant-Procurement 7B Bangalore
Data Engineer-Machine Learning 7A Bangalore
Data Engineer-Business Intelligence 7B Kolkata Any location in India
Data Engineer-Data Integration 7B Kolkata Any
Data Engineer-Data Modeling 7B Kolkata Any
Data Engineer-Big Data 7B Kolkata Any
Data Engineer-Big Data 7B Kolkata Any
Data Engineer-Master Data Management 7A Any
Data Engineer-Data Integration 7B Chennai Bangalore
Data Engineer-Data Integration 7A Chennai Bangalore
Data Engineer-Data Integration 7A Chennai Bangalore
Data Engineer-Data Integration 7A Chennai Bangalore
Data Engineer-Data Modeling 7B Any Any
Data Engineer-Big Data 7B Any Any
Data Engineer-Big Data 7B Any Any
Data Engineer-Data Modeling 7B Any Any
Business Transformation Consultant-Big Data 7B Bangalore Kolkata
Data Engineer-Big Data 7A Bangalore Kolkata
Data Engineer-Big Data 7A Bangalore Pune
Data Engineer-Big Data 7A Bangalore Bangalore
Data Consultant-Data Governance 7B Pune Pune
Data Consultant-Data Governance 7B Pune Pune
Data Consultant-Data Governance 7B Pune Pune
Data Consultant-Data Governance 7B Pune Pune
Data Engineer-Data Warehouse 7B Pune Pune
Data Engineer-Data Modeling 7B Pune Pune
Data Engineer-Data Integration 7B Pune Pune
Data Engineer-Data Integration 7B Pune Pune
Data Engineer-Data Integration 7B Pune Pune
Data Engineer-Data Integration 7B Kolkata Bangalore
Data Engineer-Big Data 7A Kolkata Bangalore
Data Engineer-Data Integration 7B Bangalore Bangalore
Data Engineer-Data Modeling 7A Bangalore Bangalore
Data Engineer-Business Intelligence 7B Kolkata Any location in India
Data Engineer-Data Integration 7B Kolkata Any
Data Engineer-Data Modeling 7B Kolkata Any
Data Engineer-Big Data 7B Kolkata Any
Data Engineer-Big Data 7B Kolkata Any
Data Engineer-Master Data Management 7A Any
Data Engineer-Data Integration 7B Chennai Bangalore
Data Engineer-Data Integration 7A Chennai Bangalore
Data Engineer-Data Integration 7A Chennai Bangalore
Data Engineer-Data Integration 7A Chennai Bangalore
Data Engineer-Data Modeling 7B Any Any
Data Engineer-Big Data 7B Any Any
Data Engineer-Big Data 7B Any Any
Data Engineer-Data Modeling 7B Any Any
Business Transformation Consultant-Big Data 7B Bangalore Kolkata
Data Engineer-Big Data 7A Bangalore Kolkata
Data Engineer-Big Data 7A Bangalore Pune
Data Engineer-Big Data 7A Bangalore Bangalore
Data Consultant-Data Governance 7B Pune Pune
Data Consultant-Data Governance 7B Pune Pune
Data Consultant-Data Governance 7B Pune Pune
Data Consultant-Data Governance 7B Pune Pune
Data Engineer-Data Warehouse 7B Pune Pune
Data Engineer-Data Modeling 7B Pune Pune
Data Engineer-Data Integration 7B Pune Pune
Data Engineer-Data Integration 7B Pune Pune
Data Engineer-Data Integration 7B Pune Pune
Data Engineer-Data Integration 7B Kolkata Bangalore
Data Engineer-Big Data 7A Kolkata Bangalore
Data Engineer-Data Integration 7B Bangalore Bangalore
Data Engineer-Data Modeling 7A Bangalore Bangalore
Sub_Skill Job Description
OpenText - ANA for ERP processes, Integrating with MHIRJ Applications which manages non-ERP document management requirements. Document Management overall architecture
Big Data Architect - s tables) Good to have (Not Mandatory): Unix shell scripting. Data Engineer: We are seeking a Data Engineer who has strong experience in DWH and BI technologies
MDM Engineer
· Proven working experience in MDM-RDM space (Master data and Reference data)
· Must be open to learn New technologies
· Experience working with version control GitHub and CI/CD pipelines using Azure DevOps
L1/L2 Support Ready to work in any shift
Data Cutover - ger Experience in handling projects where data is moved from legacy or other systems to SAP; Experience in managing teams; Experience in Informatica Powercenter; Ex
Maximo
s of experience in the design principles of large-scale client/server transaction processing solutions. 5) Minimum of 6 years experience in maintaining Curam applicatio
SAP EIM/BODS consultant/developer - works on data modelling, extraction and data load. IDOC processing, Data Migration Co
Azure Databricks - ools Strong expertise in Pyspark. Hands-on experience in implementing ETL using Pyspark on Azure Databricks. Must have strong knowledge of Software Developme
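For illustration only, a minimal PySpark ETL sketch of the kind this requisition describes; the file paths, column names and output location are hypothetical placeholders, not taken from any project here.

    # Minimal PySpark ETL sketch (hypothetical paths and columns).
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("example-etl").getOrCreate()

    # Extract: read raw CSV data.
    raw = spark.read.option("header", True).csv("/mnt/raw/orders.csv")

    # Transform: deduplicate, drop bad rows, cast types and add a load date.
    clean = (
        raw.dropDuplicates(["order_id"])
           .filter(F.col("amount").isNotNull())
           .withColumn("amount", F.col("amount").cast("double"))
           .withColumn("load_date", F.current_date())
    )

    # Load: write the curated output (Delta tables would be typical on Databricks).
    clean.write.mode("overwrite").parquet("/mnt/curated/orders")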
n, development, unit testing, and integration testing. Understanding on MDM CE and product Modeling. Understand MDM Data Model & MDM architecture Knowledge
n, development, unit testing, and integration testing. Understanding on MDM CE and product Modeling. Understand MDM Data Model & MDM architecture Knowledge
ECM/PrintNet - plication Developer for GMC/Quadient Inspire PrintNet (Designer, Automation, ICM) for handling file types: Text, XML, PDF, CSV, Delimited file. Alternate Skills (if primar
Curam - Oracle Architect or development experience using WebSphere Message Broker, WebSphere, InfoSphere Information Server, ETL. Well versed in Eclipse, Rational Software
Oracle Architect or development experience using WebSphere Message Broker, WebSphere, InfoSphere Information Server, ETL. Well versed in Eclipse, Rational Software
Oracle Architect or development experience using WebSphere Message Broker, WebSphere, InfoSphere Information Server, ETL. Well versed in Eclipse, Rational Software
Oracle Architect or development experience using WebSphere Message Broker, WebSphere, InfoSphere Information Server, ETL. Well versed in Eclipse, Rational Software
Curam - Oracle Architect or development experience using WebSphere Message Broker, WebSphere, InfoSphere Information Server, ETL. Well versed in Eclipse, Rational Software
Oracle Architect or development experience using WebSphere Message Broker, WebSphere, InfoSphere Information Server, ETL. Well versed in Eclipse, Rational Software
MDM Engineer
· Proven working experience in MDM-RDM space (Master data and Reference data)
· Must be open to learn New technologies
MDM Engineer
· Proven working experience in MDM-RDM space (Master data and Reference data)
· Must be open to learn New technologies
MDM Engineer
· Proven working experience in MDM-RDM space (Master data and Reference data)
· Must be open to learn New technologies
Amazon - azon Lex, AWS Lambda, Amazon Polly, AWS CloudWatch, Amazon Transcribe, AWS Step functions, Amazon S3, Amazon Kinesis, Amazon DynamoDB, RDS, Node.js/Python
Big data - targeting BI/analytical, data science and/or transactional use cases. Qualification and Skills: Experience of APIs, microservices, Kubernetes and data ingestion (Airflow) an
Experience of analysing business requirements and managing an organization's data architecture in the cloud (Big Data, analytics, information analysis, data lakes
AWS Experience of architecting data ingestion ETL solutions in the cloud (Airflow)
Big data - targeting BI/analytical, data science and/or transactional use cases. Qualification and Skills: Experience of APIs, microservices, Kubernetes and data ingestion (Airflow) an
Data Governance - perience in ETL or integration development, REST and open API technologies. Experience in developing masterdata capabilities, data standardization, and quality developm
Openpages with Java and Reporting Skills
Openpages with Java and Reporting Skills
PySpark and Spark Streaming - ing) Hive, Hbase and Phoenix. Develops applications on Big Data technologies including API development. Expected to have traditional Application Development backgrou
Data Engineering/Data Science - zure, HDInsights; Azure for Cloudera based Data lake with strong Microsoft understanding and future goal of Cloudera on Azure. BigData Skills a
Data Engineering/Data Science - zure, HDInsights; Azure for Cloudera based Data lake with strong Microsoft understanding and future goal of Cloudera on Azure. BigData Skills a
OpenPages
IBM OpenPages, Java, IBM OpenPages Configuration, IBM Cognos, TeamCity, GitHub, Linux, Docker, Container, Octopus
OpenPages
IBM OpenPages, Java, IBM OpenPages Configuration, IBM Cognos, TeamCity, GitHub, Linux, Docker, Container, Octopus
OpenPages
IBM OpenPages, Java, IBM OpenPages Configuration, IBM Cognos, TeamCity, GitHub, Linux, Docker, Container, Octopus
analytics, Machine learning, data insights, creating/training various kinds of learning models (supervised/unsupervised) .. Should have experience of defining scope, build
analytics, Machine learning, data insights, creating/training various kinds of learning models (supervised/unsupervised) .. Should have experience of defining scope, build
ETL Informatica developer must have the following - 3 to 5 years experience.
ETL developer Nice to have -- Experience on SAP Data Conversion in the areas SAP FICO, SAP MM, SAP SD

CMOD CMOD / Mainframe Developer with 4+ years of relevant experience in CMOD. Good Communication skills.
Develops applications on Big Data technologies using PySpark. Strong technical abilities to understand, design, write and debug com
Mandate skills - Pyspark, Kubernates (desirable)
Offshore Informatica Cloud & AWS
Azure Databricks Databricks / Snowflake / Cassandra/ Teradata
ETL Informatica developer must have the following - 3 to 5 years experience.
ETL developer Nice to have -- Experience on SAP Data Conversion in the areas SAP FICO, SAP MM, SAP SD
CMOD CMOD / Mainframe Developer with 4+ years of relevant experience in CMOD. Good Communication skills.
ETL Architect ETL Tools / Python/Kafka / Spark
ETL Architect ETL Tools / Python/Kafka / Spark
Camunda - 5+ years of development experience on BPM, Camunda, Java 8, Springboot, Angular 9, Amazon Web Services platform, Experience in Banking indust
& be able to independently develop Cognos solutions. Experience with Cognos Framework Manager and Cognos Transformer is needed. Should have hands on experience
& be able to independently develop Cognos solutions. Experience with Cognos Framework Manager and Cognos Transformer is needed. Should have hands on experience

er to solve a business problem. The individual is able to interpret exploratory statistics and identify appropriate features and machine learning algorithms to use. In additi
PM with Data Platform exp - project plan, budget, structure, schedule and staffing requirements, including IBM and Client employees and 3rd party vendors. Requires experience with PM methodolo
er to solve a business problem. The individual is able to interpret exploratory statistics and identify appropriate features and machine learning algorithms to use. In additi
Abinitio & Datastage - lligence environment using Ab Initio software and DataStage (formerly Ascential) - IBM's WebSphere Data Integration Suite. Skills include designing and developing extra
PM with Data Platform exp - project plan, budget, structure, schedule and staffing requirements, including IBM and Client employees and 3rd party vendors. Requires experience with PM methodolo
Data Engineer - Advanced Analytics
§ Agile principles
§ Stakeholder management

FileNet Filenet Developer


Data Engineer - Advanced Analytics
§ Agile principles
§ Stakeholder management
MDM Engineer
· Proven working experience in MDM-RDM space (Master data and Reference data)
· Must be open to learn New technologies
Data Modeler should have 10 to 12 years of Experience in Designing and Data Modeling of Data Analytics Systems using Relational and Non Relation
Data Modeler should have 10 to 12 years of Experience in Designing and Data Modeling of Data Analytics Systems using Relational and Non Relation
DataStage Senior Developer - 1. ETL experience, preferably DataStage 2. Bash (shell) scripting, Linux 3. Data Analysis and if possible masking experience 4. Good in SQL 5. RDBMS concepts, Oracle, DB2
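For illustration only, a small Python sketch of the data masking mentioned in point 3 above; the input file, column names and salt are hypothetical.

    # Deterministic masking of PII columns in a CSV extract (hypothetical file/columns).
    import csv
    import hashlib

    SALT = "replace-with-a-secret-salt"

    def mask(value: str) -> str:
        # The same input always maps to the same token, so joins still work downstream.
        return hashlib.sha256((SALT + value).encode("utf-8")).hexdigest()[:16]

    with open("customers.csv", newline="") as src, open("customers_masked.csv", "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            row["email"] = mask(row["email"])
            row["phone"] = mask(row["phone"])
            writer.writerow(row)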
ect/UX lead QlikView, QlikSense (must have), Supply Chain data visualization. Should have exposure in R/Python. Good communication - Effectively communicate & int
nd skill in IBM Watson product and services specifically Watson Assistant and Watson Discovery Services. Experience in building conversations b.) Good analytical/proble
nd skill in IBM Watson product and services specifically Watson Assistant and Watson Discovery Services. Experience in building conversations b.) Good analytical/proble
- Hands-on experience working with databases like Oracle, SQL etc.
OpenText - Content Server - Knowledge of Extream xECM is a plus
SAS Developer - rson should have 7+ years relevant exp and strong knowledge in areas SAS development experience with SAS Base, SAS DI, SAS Enterprise Guide and must have SAS ETL
Previous experience of working in an Incident/Problem Management role would be an advantage
Curam BA Knowledge of SQL queries. Curam experience will be an asset
Hive, SQL - stone Magellan tool, Working knowledge of data visualization/reporting tools like Tableau. Role & Responsibility (What exactly this resource will be doing): The resource
Previous experience of working in an Incident/Problem Management role would be an advantage
Curam BA Knowledge of SQL queries. Curam experience will be an asset
Apache Griffin, Spark
• One+ of the certifications added advantage: Splunk Certified Admin, Splunk Certified Architect, Splunk Certified Consultant
• Identify innovative ways to improve the process of delivering solutions to clients.
Optim - bsetting techniques. Functional and technical specifications and development, testing. Data archive exp from various data sources like SQL, Oracle, JDEdwards, Microsoft
Optim - bsetting techniques. Functional and technical specifications and development, testing. Data archive exp from various data sources like SQL, Oracle, JDEdwards, Microsoft
IBM Case Manager - ontent Navigator Configuration and Customization of IBM Case Manager and Content Navigator, basic knowledge of Java/Javascript. Hands-on experience with administr
owledge in the areas of FileNet, Case Manager and IBM Content Navigator Configuration and Customization. In addition, it is required to have hands-on experience with a
FileNet - 5+ Yrs of tech expert in FileNet/Case Manager and some Java/React knowledge is good to have. Good Communication skill and also worked in A
IBM Case Manager - et and IBM Content Navigator. Must be able to use Administrative tools such as ACCE, ICM Admin, ICN Admin. Working knowledge of deployments processes of FileNet ar
IBM Case Manager - 3-4 years of working knowledge of FileNet APIs and Java. Must be aware of good practices related to Java development, code maintenance. Basic knowledge on deploym
Apache Griffin, Spark
• One+ of the certifications added advantage: Splunk Certified Admin, Splunk Certified Architect, Splunk Certified Consultant
• Identify innovative ways to improve the process of delivering solutions to clients.
Optim - bsetting techniques. Functional and technical specifications and development, testing. Data archive exp from various data sources like SQL, Oracle, JDEdwards, Microsoft
Optim - bsetting techniques. Functional and technical specifications and development, testing. Data archive exp from various data sources like SQL, Oracle, JDEdwards, Microsoft
Optim - bsetting techniques. Functional and technical specifications and development, testing. Data archive exp from various data sources like SQL, Oracle, JDEdwards, Microsoft
IBM Case Manager - ontent Navigator Configuration and Customization of IBM Case Manager and Content Navigator, basic knowledge of Java/Javascript. Hands-on experience with administr
owledge in the areas of FileNet, Case Manager and IBM Content Navigator Configuration and Customization. In addition, it is required to have hands-on experience with a
IBM Case Manager - Window Services. Proficient in development using Microsoft .NET and C#. Experience with IBM FileNet P8. Experience with paper document to DataCap, Indexing, Prep
FileNet - 5+ Yrs of tech expert in FileNet/Case Manager and some Java/React knowledge is good to have. Good Communication skill and also worked in A
IBM Case Manager - et and IBM Content Navigator. Must be able to use Administrative tools such as ACCE, ICM Admin, ICN Admin. Working knowledge of deployments processes of FileNet ar
IBM Case Manager - 3-4 years of working knowledge of FileNet APIs and Java. Must be aware of good practices related to Java development, code maintenance. Basic knowledge on deploym
ETL Ab Initio, UNIX, Shell scripting, Talend - 4 to 6 years of Ab Initio development experience with UNIX background. RDBMS concepts are a must. Having Mainfra
Data Modeling - Advanced Knowledge on Relational Database Systems and Big data design framework is required to translate the source data model and reverse engineer. Source Data Model need to be explained and socialized with Client Architects and Pros/Cons provided before implementing the mapping to Target Syste
Data Modeling - Advanced Knowledge on Relational Database Systems and Big data design framework is required to translate the source data model and reverse engineer. Source Data Model need to be explained and socialized with Client Architects and Pros/Cons provided before implementing the mapping to Target Syste
Data Modeling - 5. Work closely to define the data strategies and build data flows between source and target with the IT teams (API team and Snowfla 6. Work closely with Busines
Cognos Dev - a menu and comes with an intuitive design kit. Includes integration between Business Intelligence Platforms, which will allow users to toggle easily between reporting a
Big Data Python, Hive, Spark/Scala, Sqoop, Hadoop
Data Modeling - Advanced Knowledge on Relational Database Systems and Big data design framework is required to translate the source data model and reverse engineer. Source Data Model need to be explained and socialized with Client Architects and Pros/Cons provided before implementing the mapping to Target Syste
Data Modeling - 5. Work closely to define the data strategies and build data flows between source and target with the IT teams (API team and Snowfla 6. Work closely with Busines
ETL Datastage Primary Skills: IBM DataStage, PL/SQL, Unix
Snowflake - experience in developing complex stored procedures/SQL queries. Experience in Data Migration from RDBMS to Snowflake cloud data warehouse. Deep understanding
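For illustration only, one possible shape of the RDBMS-to-Snowflake load described above, sketched with the snowflake-connector-python package; the connection parameters, table and sample rows are hypothetical.

    # Staging rows pulled from a source RDBMS into Snowflake (hypothetical names/credentials).
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account", user="my_user", password="***",
        warehouse="LOAD_WH", database="ANALYTICS", schema="STAGING",
    )
    rows = [(1, "alpha"), (2, "beta")]  # in practice, fetched from the source RDBMS
    cur = conn.cursor()
    cur.execute("CREATE TABLE IF NOT EXISTS STG_ITEMS (ID NUMBER, NAME STRING)")
    cur.executemany("INSERT INTO STG_ITEMS (ID, NAME) VALUES (%s, %s)", rows)
    conn.commit()
    cur.close()
    conn.close()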
Data Engineer AEP, Experience in Cloud based ETL tool and Python
Data Engineer AEP, Experience in Cloud based ETL tool and Python
dataplatform technologies, hands on experience working for cloud migration projects. Python coding along with scala and experience working on GCP platform Data serv
GCP - derstand application landscape based on requirements and code walk through and map to future state architecture on GCP is required. Experience leading a team techn
Data Engineer AEP, Experience in Cloud based ETL tool and Python
Data Engineer AEP, Experience in Cloud based ETL tool and Python
Azure Big Data In depth knowledge in Azure big data development.
Data Modeler - d Information Capability Roadmap. Identify new information capabilities needed to support business capabilities of a given business domain. Provides consulting and guida
Data Engineer AEP, Experience in Cloud based ETL tool and Python
dataplatform technologies, hands on experience working for cloud migration projects. Python coding along with scala and experience working on GCP platform Data serv
GCP - derstand application landscape based on requirements and code walk through and map to future state architecture on GCP is required. Experience leading a team techn
Data Engineer AEP, Experience in Cloud based ETL tool and Python
Data Engineer AEP, Experience in Cloud based ETL tool and Python
Data Engineer AEP, Experience in Cloud based ETL tool and Python
i. Experience in Java scripting for creation of Rules in template Strong Working knowledge in Java, J2EE , XML, HTML, Oscript, Weblingo, AJAX, Webse
OpenText OT Exstream - Developer
Open Text OpenText Developer (OTCS, xECM, OTAS) 8 - 10 years Exp
BPM ARIS - evelopment environment. So one needs to be well versed with ARIS concepts, objects, methods, etc. 2. Should be able to gather and understand the requirements for crea
Typescript - Must Have: Strong OOPs concept, Good experience in hands-on API development, Node JS. Nice to Have: Prior experience in Typescript programming / d
Planning Analytics - edge 4. Experience in performance tuning, handling data issues and large volume of data in Planning Analytics cubes 5. Strong knowledge of Visualization techniques (Plan
Big Data Streamsets Data Engineers -
Snowflake DBT and Snowflake Data Engineers
Big Data AWS S3 Data Engineers (Glue)
Big Data Streamsets Data Engineers
Big Data Streamsets Data Engineers
Big Data Streamsets Data Engineers
Spark/Scala - ing) Hive, Hbase and Phoenix. Develops applications on Big Data technologies including API development. Expected to have traditional Application Development backgrou
Data Quality Engineer
e the ability to understand client/server models, design patterns, database & web server technologies. You will possess good understanding on Operational Risk Managem
Big Data AWS S3 Data Engineers (Glue)
Big Data Streamsets Data Engineers -
Big Data AWS S3 Data Engineers (Glue)
Big Data AWS S3 Data Engineers (Glue)
Big Data AWS S3 Data Engineers (Glue)
Big Data Streamsets Data Engineers
Big Data Streamsets Data Engineers
Big Data Streamsets Data Engineers
Spark/Scala - ing) Hive, Hbase and Phoenix. Develops applications on Big Data technologies including API development. Expected to have traditional Application Development backgrou
Cognos - os reporting. Well versed with Agile delivery, good in team work and individual delivery. Experience with DevOps tools (git/BitBucket). Cognos data manager. ETL experien
Data Ingestion - ed), Scala - SQL programming - Building ETL processes - Code versioning (GIT/SVN) - Data ingestion from APIs - Problem solving skills - Quick learning of ne
Data Quality Engineer
e the ability to understand client/server models, design patterns, database & web server technologies. You will possess good understanding on Operational Risk Managem
ent and testing of Smarter Process projects. Should have knowledge on process work flow and process choreography and would be able to develop applications to integr
Spark, Cassandra and Solr - Chef. Develops applications on Big Data technologies including API development. Expected to have traditional Application Development background along with knowledg
Spark, Cassandra and Solr - Chef. Develops applications on Big Data technologies including API development. Expected to have traditional Application Development background along with knowledg
Business Transformation Consultant - g their understanding of key external and internal data sources to evaluate the benefit of a solution to the business and provide business value using value realization fra
owledge in the areas of FileNet, Case Manager and IBM Content Navigator Configuration and Customization. In addition, it is required to have hands-on experience with a
Design and implement data models according to business requirements for specific use cases and/or client’s business domain
Demonstrate understanding in data modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modelling
Design and implement data models according to business requirements for specific use cases and/or client’s business domain
Demonstrate understanding in data modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modelling
Design and implement data models according to business requirements for specific use cases and/or client’s business domain
Demonstrate understanding in data modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modelling
Design and implement data models according to business requirements for specific use cases and/or client’s business domain
Demonstrate understanding in data modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modelling
Design and implement data models according to business requirements for specific use cases and/or client’s business domain
Demonstrate understanding in data modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modelling
Design and implement data models according to business requirements for specific use cases and/or client’s business domain
Demonstrate understanding in data modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modelling
Design and implement data models according to business requirements for specific use cases and/or client’s business domain
Demonstrate understanding in data modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modelling
Design and implement data models according to business requirements for specific use cases and/or client’s business domain
Demonstrate understanding in data modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modelling
Design and implement data models according to business requirements for specific use cases and/or client’s business domain
Demonstrate understanding in data modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modellin
Design and implement data models according to business requirements for specific use cases and/or client’s business domain
Demonstrate understanding in data modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modellin
Design and implement data models according to business requirements for specific use cases and/or client’s business domain
Demonstrate understanding in data modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modellin
Design and implement data models according to business requirements for specific use cases and/or client’s business domain
Demonstrate understanding in data modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modellin
Design and implement data models according to business requirements for specific use cases and/or client’s business domain
Demonstrate understanding in data modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modellin
Design and implement data models according to business requirements for specific use cases and/or client’s business domain
Demonstrate understanding in data modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modellin
Implement data models according to business requirements for specific use cases and/or client’s business domains
Demonstrate understanding in data modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modelling, En
Implement data models according to business requirements for specific use cases and/or client’s business domains
Demonstrate understanding in data modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modelling, En
Implement data models according to business requirements for specific use cases and/or client’s business domains
Demonstrate understanding in data modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modelling, En
Implement data models according to business requirements for specific use cases and/or client’s business domains
Demonstrate understanding in data modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modelling, En
iness object models to accelerate development and testing of Smarter Process projects. Should have knowledge on process work flow and process choreography and wo
nt of a report using a menu and comes with an intuitive design kit. Includes integration between Business Intelligence Platforms, which will allow users to toggle easily be
Data Modeller Data Modelling with Banking domain experience
owledge in the areas of FileNet, Case Manager and IBM Content Navigator Configuration and Customization. In addition, it is required to have hands-on experience with a
Design and implement data models according to business requirements for specific use cases and/or client’s business domain
Demonstrate understanding in data modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modelling
Design and implement data models according to business requirements for specific use cases and/or client’s business domain
Demonstrate understanding in data modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modelling
Design and implement data models according to business requirements for specific use cases and/or client’s business domain
Demonstrate understanding in data modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modelling
Design and implement data models according to business requirements for specific use cases and/or client’s business domain
Demonstrate understanding in data modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modelling
Design and implement data models according to business requirements for specific use cases and/or client’s business domain
Demonstrate understanding in data modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modelling
Design and implement data models according to business requirements for specific use cases and/or client’s business domain
Demonstrate understanding in data modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modelling
Design and implement data models according to business requirements for specific use cases and/or client’s business domain
Demonstrate understanding in data modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modellin
Design and implement data models according to business requirements for specific use cases and/or client’s business domain
Demonstrate understanding in data modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modellin
Design and implement data models according to business requirements for specific use cases and/or client’s business domain
Demonstrate understanding in data modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modellin
Design and implement data models according to business requirements for specific use cases and/or client’s business domain
Demonstrate understanding in data modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modellin
Design and implement data models according to business requirements for specific use cases and/or client’s business domain
Demonstrate understanding in data modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modellin
Design and implement data models according to business requirements for specific use cases and/or client’s business domain
Demonstrate understanding in data modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modellin
Implement data models according to business requirements for specific use cases and/or client’s business domains
Demonstrate understanding in data modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modelling, En
Implement data models according to business requirements for specific use cases and/or client’s business domains
Demonstrate understanding in data modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modelling, En
Implement data models according to business requirements for specific use cases and/or client’s business domains
Demonstrate understanding in data modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modelling, En
Implement data models according to business requirements for specific use cases and/or client’s business domains
Demonstrate understanding in data modelling schema design approaches and techniques such as Third Normal Form (3NF), Dimensional Modelling, En
iness object models to accelerate development and testing of Smarter Process projects. Should have knowledge on process work flow and process choreography and wo
nt of a report using a menu and comes with an intuitive design kit. Includes integration between Business Intelligence Platforms, which will allow users to toggle easily be
nt of a report using a menu and comes with an intuitive design kit. Includes integration between Business Intelligence Platforms, which will allow users to toggle easily be
Data Modeller Data Modelling with Banking domain experience
Spark SQL / Big Data: Big data Technologies: Spark (Mandatory), Spark SQL (Mandatory). Platforms: Windows or Linux. IDE Tools: Eclipse, PyCharm, or IntelliJ. Good to have (N
ETL developer: which include ETL (extract, transform and load), data architecture and integration, modeling, metadata management, extract-transform-load (ETL), data staging techniques,
Data validation, Match/Merge rules, etc. Experience in MDM hub development, MDM File Import process, design/build MDM Batch Jobs set up Strong ability to underst
Data Modeling: of working experience in ANSI standard SQL a must. 3 years of experience with change management procedures and SDLC is a must. Experience with analytical and operati
Data Engineer-Big Data: engineer (preference) who has hands-on experience with all Google tools like BigQuery, App Engine, Airflow, Cloud Composer, etc., plus data lakes, data ponds, domain drive
Data Engineer-Big Data: engineer (preference) who has hands-on experience with all Google tools like BigQuery, App Engine, Airflow, Cloud Composer, etc., plus data lakes, data ponds, domain drive
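As a hedged sketch of the BigQuery work mentioned above, the snippet below submits a SQL aggregation through the google-cloud-bigquery Python client. The project, dataset and table names are placeholders, and credentials are assumed to be configured outside the code.

from google.cloud import bigquery

# Hypothetical project id; in practice this comes from the environment.
client = bigquery.Client(project="example-gcp-project")

sql = """
    SELECT customer_id, SUM(amount) AS total_amount
    FROM `example-gcp-project.sales_lake.orders`  -- placeholder table
    GROUP BY customer_id
    ORDER BY total_amount DESC
    LIMIT 10
"""

# query() submits the job; result() waits for it and returns the rows.
for row in client.query(sql).result():
    print(row.customer_id, row.total_amount)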

SAS Developer: The person should have 6+ years relevant exp and strong knowledge in areas of SAS development; experience with SAS Base, SAS DI, SAS Enterprise Guide and mu
Snowflake / Wherescape developer Snowflake / Wherescape developer
• Should have the capability to work with minimal guidance
Energy & Utilities Background • Knowledge of Agile way of working is preferred
• Should have the capability to work with minimal guidance
Energy & Utilities Background • Knowledge of Agile way of working is preferred
GCP - Data Engg- DataProc/ Big Query / Big Table, HDP/Python/PySpark/Spark
GCP - Data Engg- DataProc/ Big Query / Big Table, HDP/Python/PySpark/Spark
Data Modeling: of working experience in ANSI standard SQL a must. 3 years of experience with change management procedures and SDLC is a must. Experience with analytical and operati
performance tuning, creating indexes and joining multiple sources Experience with Data Marts, Data Warehouse structures (e.g., star schema, fact and dimensions) 3+ ye
Data Engineer-Big Data: engineer (preference) who has hands-on experience with all Google tools like BigQuery, App Engine, Airflow, Cloud Composer, etc., plus data lakes, data ponds, domain drive
Data Engineer-Big Data: engineer (preference) who has hands-on experience with all Google tools like BigQuery, App Engine, Airflow, Cloud Composer, etc., plus data lakes, data ponds, domain drive
GCP Data Engineer: ctures (end-to-end), Streaming, Batch, Data Lakes / Big Data concepts, Google Cloud Platform expertise. Data Engineer Certification is preferred. Coding & CI/CD: GitHub, Man
Job description: Developer should be able to independently work on solution design of ETL activity - should have a sound knowledge in Informatica Powercenter development a
tech-stack Experience in Big Data technologies (e.g. Spark) Experience in Big Data file formats (e.g. Parquet) Experience in object-oriented programming languages (Pyth
tech-stack Experience in Big Data technologies (e.g. Spark) Experience in Big Data file formats (e.g. Parquet) Experience in object-oriented programming languages (Pyth
tech-stack Experience in Big Data technologies (e.g. Spark) Experience in Big Data file formats (e.g. Parquet) Experience in object-oriented programming languages (Pyth
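A minimal PySpark sketch of the Spark and Parquet experience listed above: read a Parquet dataset and run a simple aggregation. The input path and column names are assumptions made for this example.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("parquet_example").getOrCreate()

# Parquet is columnar, so only the referenced columns are actually scanned.
orders = spark.read.parquet("/data/landing/orders")  # placeholder path

daily = (orders
         .groupBy("order_date")
         .agg(F.sum("amount").alias("total_amount"),
              F.count("*").alias("order_count")))

daily.show(10)
spark.stop()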

SAS Developer: The person should have 6+ years relevant exp and strong knowledge in areas of SAS development; experience with SAS Base, SAS DI, SAS Enterprise Guide and mu
Snowflake / Wherescape developer Snowflake / Wherescape developer
• Should have the capability to work with minimal guidance
Energy & Utilities Background • Knowledge of Agile way of working is preferred
• Should have the capability to work with minimal guidance
Energy & Utilities Background • Knowledge of Agile way of working is preferred
GCP - Data Engg- DataProc/ Big Query / Big Table, HDP/Python/PySpark/Spark
GCP - Data Engg- DataProc/ Big Query / Big Table, HDP/Python/PySpark/Spark
GCP - Data Engg- DataProc/ Big Query / Big Table, HDP/Python/PySpark/Spark
GCP - Data Engg- DataProc/ Big Query / Big Table, HDP/Python/PySpark/Spark
GCP - Data Engg- DataProc/ Big Query / Big Table, HDP/Python/PySpark/Spark
GCP - Data Engg- DataProc/ Big Query / Big Table, HDP/Python/PySpark/Spark
GCP - Data Engg- DataProc/ Big Query / Big Table, HDP/Python/PySpark/Spark
GCP - Data Engg- DataProc/ Big Query / Big Table, HDP/Python/PySpark/Spark
GCP - Data Engg- DataProc/ Big Query / Big Table, HDP/Python/PySpark/Spark
GCP - Data Engg- DataProc/ Big Query / Big Table, HDP/Python/PySpark/Spark
GCP - Data Engg- DataProc/ Big Query / Big Table, HDP/Python/PySpark/Spark
GCP - Data Engg- DataProc/ Big Query / Big Table, HDP/Python/PySpark/Spark
OpenPages
IBM OpenPages, Java, IBM OpenPages Configuration, IBM Cognos, TeamCity, GitHub, Linux, Docker, Container, Octopus
OpenPages
IBM OpenPages, Java, IBM OpenPages Configuration, IBM Cognos, TeamCity, GitHub, Linux, Docker, Container, Octopus
OpenPages
IBM OpenPages, Java, IBM OpenPages Configuration, IBM Cognos, TeamCity, GitHub, Linux, Docker, Container, Octopus
OpenPages
IBM OpenPages, Java, IBM OpenPages Configuration, IBM Cognos, TeamCity, GitHub, Linux, Docker, Container, Octopus
OpenPages
IBM OpenPages, Java, IBM OpenPages Configuration, IBM Cognos, TeamCity, GitHub, Linux, Docker, Container, Octopus
OpenPages
IBM OpenPages, Java, IBM OpenPages Configuration, IBM Cognos, TeamCity, GitHub, Linux, Docker, Container, Octopus
OpenPages
IBM OpenPages, Java, IBM OpenPages Configuration, IBM Cognos, TeamCity, GitHub, Linux, Docker, Container, Octopus
NoSQL databases, including Postgres, Cassandra etc - Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc. - Experience/knowledg
NoSQL databases, including Postgres, Cassandra etc - Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc. - Experience/knowledg
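As one concrete, purely illustrative instance of the workflow-management tools named above, here is a small Airflow DAG with two dependent Python tasks; the dag_id, schedule and task bodies are placeholders rather than anything from these requisitions.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("extracting")  # placeholder: pull data from a source system


def load():
    print("loading")  # placeholder: load staged data into the warehouse


with DAG(
    dag_id="example_etl",            # invented name
    start_date=datetime(2021, 4, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task  # load runs only after extract succeeds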
Sound knowledge/experience of Ab Initio Multi File system. 5. Experience in Designing and Building Project Architecture. Experience in Building Realtime applications. 6
.Net Developer with ECM skills_HA: .NET developer and ECM (FileNet) skills. Should have good communication and 4+ years of relevant experience.
OpenPages
IBM OpenPages, Java, IBM OpenPages Configuration, IBM Cognos, TeamCity, GitHub, Linux, Docker, Container, Octopus
OpenPages
IBM OpenPages, Java, IBM OpenPages Configuration, IBM Cognos, TeamCity, GitHub, Linux, Docker, Container, Octopus
OpenPages
IBM OpenPages, Java, IBM OpenPages Configuration, IBM Cognos, TeamCity, GitHub, Linux, Docker, Container, Octopus
OpenPages
IBM OpenPages, Java, IBM OpenPages Configuration, IBM Cognos, TeamCity, GitHub, Linux, Docker, Container, Octopus
OpenPages
IBM OpenPages, Java, IBM OpenPages Configuration, IBM Cognos, TeamCity, GitHub, Linux, Docker, Container, Octopus
OpenPages
IBM OpenPages, Java, IBM OpenPages Configuration, IBM Cognos, TeamCity, GitHub, Linux, Docker, Container, Octopus
OpenPages
IBM OpenPages, Java, IBM OpenPages Configuration, IBM Cognos, TeamCity, GitHub, Linux, Docker, Container, Octopus
NoSQL databases, including Postgres, Cassandra etc - Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc. - Experience/knowledg
NoSQL databases, including Postgres, Cassandra etc - Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc. - Experience/knowledg
nd selection and / or development of the software components and hardware requirements of the applications and data, and the development of the application, includin
Sound knowledge/experience of Ab Initio Multi File system. 5. Experience in Designing and Building Project Architecture. Experience in Building Realtime applications. 6
Sound knowledge/experience of Ab Initio Multi File system. 5. Experience in Designing and Building Project Architecture. Experience in Building Realtime applications. 6
.Net Developer with ECM skills_HA: .NET developer and ECM (FileNet) skills. Should have good communication and 4+ years of relevant experience.
nd transforming data for machine learning; Experience working with Natural Language Processing preferred; Knowledge of machine learning and predictive modelin
Dataiku: Experienced developer for DATAIKU having experience with building pipelines in Dataiku. Tableau/MicroStrategy experience is a plus. Understanding of A
Dataiku: Experienced developer for DATAIKU having experience with building pipelines in Dataiku. Tableau/MicroStrategy experience is a plus. Understanding of A
Dataiku: Experienced developer for DATAIKU having experience with building pipelines in Dataiku. Tableau/MicroStrategy experience is a plus. Understanding of A
Watson: nd skill in IBM Watson product and services, specifically Watson Assistant and Watson Discovery Services. Experience in building conversations. b.) Good analytical/proble
Watson: nd skill in IBM Watson product and services, specifically Watson Assistant and Watson Discovery Services. Experience in building conversations. b.) Good analytical/proble
ect and minimum of 4+ yrs for Data Engineers with below skillsets: Understanding and experience with Google Cloud architecture. In-depth knowledge of database an
Azure Data Factory, Data Lake,
Data Bricks, SQL Server
Hadoop: Good project management experience, preferably with data warehouse (Hadoop technology) background. Min 4 yrs exp in Project Ma
Data platforms: Overall 10 yrs exp with 3-4 yrs in Project Management. They should have experience in managing Data Platform / Data Services Proje
Appway BPM: nt at offshore) Pune (No remote access allowed after covid situation). Start date: 1st April 2021. Total Years of Experience: 5-8. Relevant years of Experience: 3-6. JRSS :
nd transforming data for machine learning; Experience working with Natural Language Processing preferred; Knowledge of machine learning and predictive modelin
Dataiku: Experienced developer for DATAIKU having experience with building pipelines in Dataiku. Tableau/MicroStrategy experience is a plus. Understanding of A
Dataiku: Experienced developer for DATAIKU having experience with building pipelines in Dataiku. Tableau/MicroStrategy experience is a plus. Understanding of A
Dataiku: Experienced developer for DATAIKU having experience with building pipelines in Dataiku. Tableau/MicroStrategy experience is a plus. Understanding of A
Watson: nd skill in IBM Watson product and services, specifically Watson Assistant and Watson Discovery Services. Experience in building conversations. b.) Good analytical/proble
Watson: nd skill in IBM Watson product and services, specifically Watson Assistant and Watson Discovery Services. Experience in building conversations. b.) Good analytical/proble
ect and minimum of 4+ yrs for Data Engineers with below skillsets: Understanding and experience with Google Cloud architecture. In-depth knowledge of database an
Azure Data Factory, Data Lake,
Data Bricks, SQL Server
ment for SAP (Forms/Letters) Expertise in the following OT products - Document Presentation for SAP Solutions including Live S/4 addon - Imaging Enterprise Scan Expos
ment for SAP (Forms/Letters) Expertise in the following OT products - Document Presentation for SAP Solutions including Live S/4 addon - Imaging Enterprise Scan Expos
ment for SAP (Forms/Letters) Expertise in the following OT products - Document Presentation for SAP Solutions including Live S/4 addon - Imaging Enterprise Scan Expos
Data platforms: Overall 10 yrs exp with 3-4 yrs in Project Management. They should have experience in managing Data Platform / Data Services Proje
* Should have expertise in performing health checks, patching, upgrades and communicating the updates to stakeholders
* Good to have experience in Log monitoring & associated remediations
Primary skills - Spark, Spark SQL, Python. Secondary skills - Unix Shell Script, SQL, CTRLM
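A short sketch of the Spark SQL skill named above: register a small in-memory DataFrame as a temporary view and query it with plain SQL. The data and names are illustrative only.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("spark_sql_example").getOrCreate()

df = spark.createDataFrame(
    [("2021-04-01", "A", 100.0),
     ("2021-04-01", "B", 40.0),
     ("2021-04-02", "A", 60.0)],
    ["txn_date", "product", "amount"],
)

# A temp view lets the same transformation be expressed in SQL.
df.createOrReplaceTempView("transactions")

spark.sql("""
    SELECT txn_date, SUM(amount) AS daily_amount
    FROM transactions
    GROUP BY txn_date
    ORDER BY txn_date
""").show()

spark.stop()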
Datastage Primary Skills - Datastage. Secondary: PySpark, Glue. Nice to have: Lambda, S3 etc

0 HLF, NodeJS, Kubernetes
0 NodeJS, scripting, Unit Testing, Git
0 HLF, NodeJS, Microservices Framework
0 TypeScript / NodeJS / JavaScript
Kronos: ts functionalities. Extraction and transformation of data using Dell Boomi (Integrations). Ability to transform HR policies into advanced Kronos pay rules. Configuring d
Kronos: ts functionalities. Extraction and transformation of data using Dell Boomi (Integrations). Ability to transform HR policies into advanced Kronos pay rules. Configuring d
0 HLF, NodeJS, Kubernetes
0 NodeJS, scripting, Unit Testing, Git
0 HLF, NodeJS, Microservices Framework
0 TypeScript / NodeJS / JavaScript
Kronos: ts functionalities. Extraction and transformation of data using Dell Boomi (Integrations). Ability to transform HR policies into advanced Kronos pay rules. Configuring d
Kronos: ts functionalities. Extraction and transformation of data using Dell Boomi (Integrations). Ability to transform HR policies into advanced Kronos pay rules. Configuring d
Should have good understanding of Maximo Data model & have worked in designing Data migration template & should have performed mapping of data
0 Should have basic knowledge of Integration to setup interface for Data loading or should have used MxLoader.
Maximo Spatial: of Integration setup with External system, Data Extractions and Data Loading, Data Integrity validations, error resolutions. > Proficient with Data Mapping and Transf
The developer will participate in full software development life cycle including requirements analysis, solution design, software developme
nd-user requirements using the embedded configuration tools found within the Maximo application. The developer will have experience in agile methodologies and have
The developer will participate in full software development life cycle including requirements analysis, solution design, software developme
nd-user requirements using the embedded configuration tools found within the Maximo application. The developer will have experience in agile methodologies and have
The developer will participate in full software development life cycle including requirements analysis, solution design, software developme
nd-user requirements using the embedded configuration tools found within the Maximo application. The developer will have experience in agile methodologies and have
0 Maximo inventory management
0 Maximo data load templates for item master and inventory
The developer will participate in full software development life cycle including requirements analysis, solution design, software developme
nd-user requirements using the embedded configuration tools found within the Maximo application. The developer will have experience in agile methodologies and have
The developer will participate in full software development life cycle including requirements analysis, solution design, software developme
nd-user requirements using the embedded configuration tools found within the Maximo application. The developer will have experience in agile methodologies and have
0 Business Transformation Consultant-IoT & PLM
Enovia: per - ENOVIA V5 LCA and CATIA V5 PLM Application Knowledge. Good customization knowledge especially for: - Change, Document, Part and Release Management - Wo
Enovia: ears Good ENOVIA V5 LCA and CATIA V5 PLM Application Support Knowledge. PLM Application support and maintenance experience - Functional experience of PLM Ap
Should have good understanding of Maximo Data model & have worked in designing Data migration template & should have performed mapping of data
0 Should have basic knowledge of Integration to setup interface for Data loading or should have used MxLoader.
Maximo Spatial: of Integration setup with External system, Data Extractions and Data Loading, Data Integrity validations, error resolutions. > Proficient with Data Mapping and Transf
The developer will participate in full software development life cycle including requirements analysis, solution design, software developme
nd-user requirements using the embedded configuration tools found within the Maximo application. The developer will have experience in agile methodologies and have
The developer will participate in full software development life cycle including requirements analysis, solution design, software developme
nd-user requirements using the embedded configuration tools found within the Maximo application. The developer will have experience in agile methodologies and have
The developer will participate in full software development life cycle including requirements analysis, solution design, software developme
nd-user requirements using the embedded configuration tools found within the Maximo application. The developer will have experience in agile methodologies and have
The developer will participate in full software development life cycle including requirements analysis, solution design, software developme
nd-user requirements using the embedded configuration tools found within the Maximo application. The developer will have experience in agile methodologies and have
0 Maximo inventory management
0 Maximo data load templates for item master and inventory
0 Business Transformation Consultant-IoT & PLM
TeamCenter: Teamcenter Integration with CATIA (TCIC). Strong in programming skills like C, C++, Java & OOPS concepts. BMIDE Codeful and codeless customization. Teamcenter SOA d
Enovia: per - ENOVIA V5 LCA and CATIA V5 PLM Application Knowledge. Good customization knowledge especially for: - Change, Document, Part and Release Management - Wo
Enovia: ears Good ENOVIA V5 LCA and CATIA V5 PLM Application Support Knowledge. PLM Application support and maintenance experience - Functional experience of PLM Ap
• Stakeholder Management, Creative problem-solving skills and excellent communication Skills
Industry 4.0 BA • Should be a good team player with the ability to drive and motivate
• Stakeholder Management, Creative problem-solving skills and excellent communication Skills
Industry 4.0 BA • Should be a good team player with the ability to drive and motivate
• Stakeholder Management, Creative problem-solving skills and excellent communication Skills
Industry 4.0 BA • Should be a good team player with the ability to drive and motivate
Teamcenter: ties during US time zone. Responsibilities and Tasks: Must be flexible to work in Shifts (24x7 Shifts). Administration of Teamcenter, including license management, T
Teamcenter: ties during US time zone. Responsibilities and Tasks: Must be flexible to work in Shifts (24x7 Shifts). Administration of Teamcenter, including license management, T
Teamcenter: ties during US time zone. Responsibilities and Tasks: Must be flexible to work in Shifts (24x7 Shifts). Administration of Teamcenter, including license management, T
Teamcenter: ties during US time zone. Responsibilities and Tasks: Must be flexible to work in Shifts (24x7 Shifts). Administration of Teamcenter, including license management, T
aming, data collection, data exploration, model building, deployment Working experience in most of the common Machine Learning techniques related to Time series
aming, data collection, data exploration, model building, deployment Working experience in most of the common Machine Learning techniques related to Time series
m use case framing, data collection, data exploration, model building, deployment Working experience in most of the common Machine Learning techniques related to
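As a minimal, generic illustration of one common time-series technique (autoregression on lagged values), the sketch below fits a least-squares model with numpy on synthetic data; it is not tied to any specific project or tool listed here.

import numpy as np

rng = np.random.default_rng(0)
series = np.cumsum(rng.normal(0.5, 1.0, size=200))  # synthetic trend plus noise

lags = 3
# Each row of X holds the `lags` observations preceding the target value.
X = np.column_stack([series[i:len(series) - lags + i] for i in range(lags)])
y = series[lags:]

# Ordinary least squares with an intercept column.
X1 = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(X1, y, rcond=None)

# One-step-ahead forecast from the last `lags` observations.
forecast = coef[0] + series[-lags:] @ coef[1:]
print("next value forecast:", forecast)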
Change Management: Stakeholder Engagement, Communications, Learning and Development, Organisation Design, Development and alignment and value
ODM Primary Skill :BPM, ODM, exposure to cloud pack for business Automation Design and Develop BPM solutions
• Stakeholder Management, Creative problem-solving skills and excellent communication Skills
Industry 4.0 BA • Should be a good team player with the ability to drive and motivate
• Stakeholder Management, Creative problem-solving skills and excellent communication Skills
Industry 4.0 BA • Should be a good team player with the ability to drive and motivate
• Stakeholder Management, Creative problem-solving skills and excellent communication Skills
Industry 4.0 BA • Should be a good team player with the ability to drive and motivate
Teamcenter: ties during US time zone. Responsibilities and Tasks: Must be flexible to work in Shifts (24x7 Shifts). Administration of Teamcenter, including license management, T
Teamcenter: ties during US time zone. Responsibilities and Tasks: Must be flexible to work in Shifts (24x7 Shifts). Administration of Teamcenter, including license management, T
Teamcenter: ties during US time zone. Responsibilities and Tasks: Must be flexible to work in Shifts (24x7 Shifts). Administration of Teamcenter, including license management, T
Teamcenter: ties during US time zone. Responsibilities and Tasks: Must be flexible to work in Shifts (24x7 Shifts). Administration of Teamcenter, including license management, T
aming, data collection, data exploration, model building, deployment Working experience in most of the common Machine Learning techniques related to Time series
m use case framing, data collection, data exploration, model building, deployment Working experience in most of the common Machine Learning techniques related to
Must have work experience in key COUPA modules, sourcing, Procurement, Contract Management , Supplier, Inventory, and invo
Coupa Good communication Skills with client facing role.
iValua: iValua with Business Analyst - Automotive. Designs, develops and supports applications based on iValua technology
Must have work experience in key COUPA modules, sourcing, Procurement, Contract Management , Supplier, Inventory, and invo
Coupa Good communication Skills with client facing role.
Data platform - Azure Machine Learning, Azure Functions
iValua: iValua with Business Analyst - Automotive. Designs, develops and supports applications based on iValua technology
Must have work experience in key COUPA modules, sourcing, Procurement, Contract Management , Supplier, Inventory, and invo
Coupa Good communication Skills with client facing role.
Data platform - Azure Machine Learning, Azure Functions
Experienced BI report developer/designer Using Qlik
Qliksense Understanding of DataVault model
Senior Informatica ETL Developer - Powercenter Nice to have-Data Engineering, Data Modeling, Datalake, AWS
Data Modeller Experience in Data Modeling OR ETL Informatica, Datalake, Nice to have-AWS,
Data Engineer Data Engineering, ETL, Data Modeling, Datalake, AWS Nice to have-AWS, Informatica
Data Engineer Data Engineering, ETL, Data Modeling, Datalake, AWS Nice to have-AWS, Informatica
MDM Engineer: ng systems - Good Client handling skill, Clarification skill from onsite BA team and Coordination with offshore - Any batch scripting knowledge is a plus, especially Windows
Experience in SQL Scripts
Datastage Excellent Communication Skills
Experience in SQL Scripts
Datastage Excellent Communication Skills
Experience in SQL Scripts
Datastage Excellent Communication Skills
Experience in SQL Scripts
Datastage Excellent Communication Skills
Data Modeler Data Modeller Experience in Data Modeling OR ETL Informatica, Datalake, Nice to have-AWS,
AWS Data Engineer Data Engineering, ETL, Data Modeling, Datalake, AWS Nice to have-AWS, Informatica
AWS Data Engineer Data Engineering, ETL, Data Modeling, Datalake, AWS Nice to have-AWS, Informatica
Data Modeler Data Modeller Experience in Data Modeling OR ETL Informatica, Datalake, Nice to have-AWS,
Business Transformation Consultant - Big Data (GCP, Hadoop)
· Creating CI/CD pipeline using git, cloud build, terraform
· Ability to handle massive data in Terabytes
· Strong experience in managing virtual teams and working with matrix organization.
· Strong articulation, communication skills
Hadoop: Hive, Impala, HBase, HDFS, Linux, Java/Python. MAPR Data engineer JD: Hands-on experience on Azure - Azure Data Factory/Synapse/Databricks/ADLS, SQL experience, SQL in
Information Governance Analyst: 15. Develop archiving and purging strategies and processes. 16. Knowledge on information governance guidelines, principles, policies, and standards for information / data stewards, stakeholders, and de
Information Governance Analyst: 15. Develop archiving and purging strategies and processes. 16. Knowledge on information governance guidelines, principles, policies, and standards for information / data stewards, stakeholders, and de
Information Governance Analyst: 15. Develop archiving and purging strategies and processes. 16. Knowledge on information governance guidelines, principles, policies, and standards for information / data stewards, stakeholders, and de
Information Governance Analyst: 15. Develop archiving and purging strategies and processes. 16. Knowledge on information governance guidelines, principles, policies, and standards for information / data stewards, stakeholders, and de
• Excellent verbal and written communication skills with the ability to establish deep understanding of client's business issues
Cloud ETL Engineer: evelopment on Datastage as an ETL tool, Data architecture, Datawarehouse development knowledge, Systems Analysis, Project Management, Information Technology/So
6. Provides consulting and guidance regarding the usage of the enterprise integrated logical model for use in development.
Data Modeler 7. Have deep knowledge of data architectures, ODSs, Data warehouses and methodologies.
ETL Datastage Primary Skills: IBM DataStage PL/SQL Unix. Secondary Skills: Control-M AWS Basics dbt (data build tool)
ETL Datastage Primary Skills: IBM DataStage PL/SQL Unix. Secondary Skills: Control-M AWS Basics dbt (data build tool)
ETL Datastage Primary Skills: IBM DataStage PL/SQL Unix. Secondary Skills: Control-M AWS Basics dbt (data build tool)
Specialized Technical Consultant (Data Engineer)
Big Data: ed), Scala - SQL programming - Building ETL processes - Code versioning (GIT/SVN) - Data ingestion from APIs - Problem solving skills - Quick learning of ne
ETL: performance tuning, creating indexes and joining multiple sources. Experience with Data Marts, Data Warehouse structures (e.g., star schema, fact and dimensions) 3+ ye
Data Modeler Data Modeler
Experienced BI report developer/designer Using Qlik
Qliksense Understanding of DataVault model
Senior Informatica ETL Developer - Powercenter Nice to have-Data Engineering, Data Modeling, Datalake, AWS
Data Modeller Experience in Data Modeling OR ETL Informatica, Datalake, Nice to have-AWS,
Data Engineer Data Engineering, ETL, Data Modeling, Datalake, AWS Nice to have-AWS, Informatica
Data Engineer Data Engineering, ETL, Data Modeling, Datalake, AWS Nice to have-AWS, Informatica
MDM Engineer: ng systems - Good Client handling skill, Clarification skill from onsite BA team and Coordination with offshore - Any batch scripting knowledge is a plus, especially Windows
Experience in SQL Scripts
Datastage Excellent Communication Skills
Experience in SQL Scripts
Datastage Excellent Communication Skills
Experience in SQL Scripts
Datastage Excellent Communication Skills
Experience in SQL Scripts
Datastage Excellent Communication Skills
Data Modeler Data Modeller Experience in Data Modeling OR ETL Informatica, Datalake, Nice to have-AWS,
AWS Data Engineer Data Engineering, ETL, Data Modeling, Datalake, AWS Nice to have-AWS, Informatica
AWS Data Engineer Data Engineering, ETL, Data Modeling, Datalake, AWS Nice to have-AWS, Informatica
Data Modeler Data Modeller Experience in Data Modeling OR ETL Informatica, Datalake, Nice to have-AWS,
Business Transformation Consultant - Big Data (GCP, Hadoop)
· Creating CI/CD pipeline using git, cloud build, terraform
· Ability to handle massive data in Terabytes
· Strong experience in managing virtual teams and working with matrix organization.
· Strong articulation, communication skills
Hadoop: Hive, Impala, HBase, HDFS, Linux, Java/Python. MAPR Data engineer JD: Hands-on experience on Azure - Azure Data Factory/Synapse/Databricks/ADLS, SQL experience, SQL in
Information Governance Analyst: 15. Develop archiving and purging strategies and processes. 16. Knowledge on information governance guidelines, principles, policies, and standards for information / data stewards, stakeholders, and de
Information Governance Analyst: 15. Develop archiving and purging strategies and processes. 16. Knowledge on information governance guidelines, principles, policies, and standards for information / data stewards, stakeholders, and de
Information Governance Analyst: 15. Develop archiving and purging strategies and processes. 16. Knowledge on information governance guidelines, principles, policies, and standards for information / data stewards, stakeholders, and de
Information Governance Analyst: 15. Develop archiving and purging strategies and processes. 16. Knowledge on information governance guidelines, principles, policies, and standards for information / data stewards, stakeholders, and de
• Excellent verbal and written communication skills with the ability to establish deep understanding of client's business issues
Cloud ETL Engineer: evelopment on Datastage as an ETL tool, Data architecture, Datawarehouse development knowledge, Systems Analysis, Project Management, Information Technology/So
6. Provides consulting and guidance regarding the usage of the enterprise integrated logical model for use in development.
Data Modeler 7. Have deep knowledge of data architectures, ODSs, Data warehouses and methodologies.
ETL Datastage Primary Skills: IBM DataStage PL/SQL Unix. Secondary Skills: Control-M AWS Basics dbt (data build tool)
ETL Datastage Primary Skills: IBM DataStage PL/SQL Unix. Secondary Skills: Control-M AWS Basics dbt (data build tool)
ETL Datastage Primary Skills: IBM DataStage PL/SQL Unix. Secondary Skills: Control-M AWS Basics dbt (data build tool)
Specialized Technical Consultant (Data Engineer)
Big Data: ed), Scala - SQL programming - Building ETL processes - Code versioning (GIT/SVN) - Data ingestion from APIs - Problem solving skills - Quick learning of ne
ETL: performance tuning, creating indexes and joining multiple sources. Experience with Data Marts, Data Warehouse structures (e.g., star schema, fact and dimensions) 3+ ye
Data Modeler Data Modeler
Account Name BRCODE
Bombardier Aerospace 347933BR
Heineken 341180BR
Allianz Belgium 353722BR
Air Canada 366846BR
SKF 369119BR
National Grid 366122BR
Ministry of Community and Social Services (MCSS) 367349BR
Rockwell Automation 386542BR
Jones Lang LaSalle 390393BR
Bell Canada 388986BR
Bell Canada 388989BR
UBP 392867BR
Scottish Government 392651BR
Scottish Government 392654BR
Scottish Government 392655BR
Scottish Government 392694BR
Scottish Government 392663BR
Scottish Government 392664BR
XL Global Services 407968BR
XL Global Services 407970BR
XL Global Services 407971BR
Allianz Belgium 407979BR
Shell Oil 407448BR
Shell Oil 407106BR
Shell Oil 407578BR
OP Bank 414701BR
Suncorp Corporate Services PTY LTD 448301BR
Suncorp Corporate Services PTY LTD 448325BR
Westpac Corporation 407947BR
Chevron corporation 405742BR
Chevron corporation 405745BR
Commonwealth Bank 419125BR
Commonwealth Bank 419120BR
Commonwealth Bank 419116BR
AMS Solutioning Worldwide 418125BR
AMS Solutioning Worldwide 418122BR
COMPASS GROUP 424265BR
Saudi Arabian Airlines 424234BR
Suntrust 422042BR
Internal Accounts 423782BR
J&J Global COE - US 424124BR
AMS Solutioning Worldwide 424120BR
COMPASS GROUP 424129BR
Suntrust 422462BR
AMS Solutioning Worldwide 424316BR
AMS Solutioning Worldwide 424321BR
Suntrust 424456BR
CIBC 423838BR
CIBC 424617BR
MUFG Bank Ltd 429617BR
NatWest 429227BR
NatWest 462588BR
NatWest 429598BR
NatWest 429783BR
NatWest 462594BR
IKEA 438434BR
Saudi Arabian Airlines 434574BR
Saudi Arabian Airlines 446153BR
CIBC 431228BR
IKEA 438433BR
XL Global Services 438406BR
Firmenich 438514BR
Firmenich 438511BR
Macquarie Bank 439598BR
MERCK SHARP & DOHME CORP 439612BR
Vodafone Group
Vodafone Group
Data Platforms Blitz-ECM 446403BR
Westpac Corporation 443540BR
Ministry of Community and Social Services (MCSS) 448126BR
USAA LIFE INSURANCE CO 445997BR
Ministry of Community and Social Services (MCSS) 448129BR
Heineken 446893BR
Heineken 448133BR
Heineken 448135BR
Westpac Corporation 446731BR
Westpac Corporation 446732BR
Suncorp Corporate Services PTY LTD 446851BR
Westpac Corporation 446854BR
Westpac Corporation 446857BR
Heineken 447258BR
Heineken 448138BR
Heineken 448140BR
Heineken 448143BR
Westpac Corporation 446868BR
Westpac Corporation 446871BR
Westpac Corporation 446874BR
Suncorp Corporate Services PTY LTD 446878BR
Westpac Corporation 446880BR
Westpac Corporation 446884BR
CIBC 445406BR
USAA LIFE INSURANCE CO 448376BR
USAA LIFE INSURANCE CO 448604BR
USAA LIFE INSURANCE CO 448612BR
Ericsson AB 448622BR
Suntrust 445485BR
USAA LIFE INSURANCE CO 448976BR
USAA LIFE INSURANCE CO 448977BR
USAA LIFE INSURANCE CO 445656BR
USAA 448037BR
Rockwell Automation 450353BR
Rockwell Automation 450355BR
Micron Technology Inc 450351BR
Rockwell Automation 450380BR
Rockwell Automation 450381BR
Rockwell Automation 450383BR
Rockwell Automation 451648BR
Rockwell Automation 450398BR
Micron Technology Inc 450402BR
Rockwell Automation 450405BR
Rockwell Automation 450384BR
Rockwell Automation 450386BR
WATER TRANSMISSION AND TECHNOLOGIES COMPANY 450446BR
Suncor Energy Inc 450419BR
Honda NA 449178BR
Home Depot 452394BR
IKEA 452406BR
UNITED SERVICES AUTOMOBILE (USAA) 463835BR
USAA 461562BR
UNITED SERVICES AUTOMOBILE (USAA) 463649BR
USAA LIFE INSURANCE CO 463869BR
USAA LIFE INSURANCE CO 463871BR
USAA LIFE INSURANCE CO 463872BR
Westpac Corporation 463842BR
L OREAL 463863BR
IBM Internal Account - SWG 462061BR
UNITED SERVICES AUTOMOBILE (USAA) 462085BR
UNITED SERVICES AUTOMOBILE (USAA) 462086BR
USAA LIFE INSURANCE CO 462089BR
UNITED SERVICES AUTOMOBILE (USAA) 462091BR
UNITED SERVICES AUTOMOBILE (USAA) 462410BR
USAA LIFE INSURANCE CO 462417BR
USAA LIFE INSURANCE CO 462423BR
USAA LIFE INSURANCE CO 462429BR
Westpac Corporation 462100BR
BMW (UK) LTD 463956BR
Philip Morris International 462147BR
L OREAL 462156BR
IBM Internal Account - SWG 462663BR
Bell Canada 462217BR
Westpac Corporation 462238BR
Westpac Corporation 462309BR
Saudi Arabian Airlines 463477BR
IBM Internal 463014BR
Westpac Corporation 475063BR
Westpac Corporation 472759BR
Natwest Group 472764BR
Natwest Group 472784BR
Natwest Group 472790BR
Natwest Group 472796BR
Natwest Group 472812BR
Natwest Group 472815BR
Natwest Group 472820BR
Westpac Corporation 473100BR
Westpac Corporation 473105BR
Westpac Corporation 473113BR
Westpac Corporation 473120BR
Westpac Corporation 473127BR
Westpac Corporation 473135BR
Westpac Corporation 473181BR
Westpac Corporation 473186BR
Westpac Corporation 473262BR
Westpac Corporation 473266BR
Westpac Corporation 474625BR
State Farm Insurance 475097BR
Atradius 473299BR
Westpac Corporation 475065BR
Natwest Group 473369BR
Natwest Group 473410BR
Natwest Group 473420BR
Natwest Group 473428BR
Natwest Group 473438BR
Natwest Group 473446BR
Westpac Corporation 473558BR
Westpac Corporation 473564BR
Westpac Corporation 473573BR
Westpac Corporation 473580BR
Westpac Corporation 473588BR
Westpac Corporation 473596BR
Westpac Corporation 473664BR
Westpac Corporation 473673BR
Westpac Corporation 473686BR
Westpac Corporation 473700BR
Westpac Corporation 474627BR
Citigroup 475102BR
State Farm Insurance 475105BR
Atradius 473721BR
Express Scripts 471892BR
IBM Internal 462306BR
Blue Cross Blue Shield of Massachusetts 475246BR
Rockwell Automation 473731BR
Home Depot 471500BR
Home Depot 471634BR
Saudi Arabian Airlines 475258BR
Westpac Corporation 470995BR
Luminor Bank 474799BR
National Grid 475170BR
National Grid 475176BR
Micron Technology Inc 471659BR
Micron Technology Inc 471677BR
Rockwell Automation 473745BR
Inovalon 473392BR
Home Depot 472162BR
Home Depot 472177BR
IKEA 472149BR
H&M 473381BR
H&M 472154BR
H&M 472166BR
H&M 472179BR
Saudi Arabian Airlines 475270BR
Westpac Corporation 471018BR
Luminor Bank 474857BR
National Grid 475188BR
National Grid 475189BR
Micron Technology Inc 471688BR
Micron Technology Inc 471543BR
Micron Technology Inc 471557BR
Micron Technology Inc 471569BR
Micron Technology Inc 471636BR
Micron Technology Inc 471583BR
Micron Technology Inc 471655BR
Micron Technology Inc 471663BR
Micron Technology Inc 471600BR
Micron Technology Inc 471674BR
Micron Technology Inc 471610BR
Micron Technology Inc 471683BR
Commonwealth Bank
Commonwealth Bank
Commonwealth Bank
Commonwealth Bank
Commonwealth Bank
Commonwealth Bank
Commonwealth Bank
Macquarie Bank 481369BR
Macquarie Bank 481365BR
Commonwealth Bank
Suntrust
Commonwealth Bank
Commonwealth Bank
Commonwealth Bank
Commonwealth Bank
Commonwealth Bank
Commonwealth Bank
Commonwealth Bank
Macquarie Bank 481370BR
Macquarie Bank 481367BR
Commonwealth Bank
Commonwealth Bank
Commonwealth Bank
Suntrust
Allianz Belgium
Pfizer Global Pharma
Pfizer Global Pharma
Pfizer Global Pharma
Vodafone Group
Vodafone Group
Ericsson AB
CDP Worldwide
Horizon BCBS 494413BR
Horizon BCBS
UBP
Allianz Belgium
Pfizer Global Pharma
Pfizer Global Pharma
Pfizer Global Pharma
Vodafone Group
Vodafone Group
Ericsson AB
CDP Worldwide
SOCIETE DE L'ASSURANCE 488523BR
SOCIETE DE L'ASSURANCE 488525BR
SOCIETE DE L'ASSURANCE 488527BR
Horizon BCBS
MERCK SHARP & DOHME CORP 494910BR
Anthem INC 490308BR
Prudential Financials 499661BR
Pitney Bowes AMS 498614BR
Pitney Bowes AMS
Pitney Bowes AMS
Pitney Bowes AMS
CONA
CONA 499854BR
VINTURAS
Home Depot 499851BR
Pitney Bowes AMS
Pitney Bowes AMS
CONA
CONA 499858BR
VINTURAS
Home Depot 499853BR
Pitney Bowes AMS
Pitney Bowes AMS
AMERICAN ELECTRIC POWER SERVICE 493775BR
National Grid 493796BR
National Grid 493798BR
National Grid 493801BR
National Grid 493814BR
Southern Water
Southern Water
Southern Water 493823BR
Southern Water 493824BR
Bombardier Aerospace 494428BR
Bombardier Aerospace 494449BR
Bombardier Aerospace 494451BR
AMERICAN ELECTRIC POWER SERVICE 493717BR
National Grid 493720BR
National Grid 493722BR
National Grid 493724BR
National Grid 493747BR
National Grid 493750BR
Southern Water
Southern Water
Bombardier Aerospace 494433BR
Bombardier Aerospace 494434BR
Bombardier Aerospace 494438BR
Bombardier Aerospace 494440BR
National Grid
National Grid
National Grid
Micron Technology
Micron Technology
Micron Technology
Micron Technology
Applied Materials
Applied Materials
Applied Materials
Shell Oil
IBM Internal 503030BR
National Grid
National Grid
National Grid
Micron Technology
Micron Technology
Micron Technology
Micron Technology
Applied Materials 499023BR
Applied Materials
Juniper Networks 501219BR
Navistar Inc
JUNIPER United States
Saudi Arabian Airlines
Navistar Inc
JUNIPER United States
Saudi Arabian Airlines
Vodafone NL
ABBOTT
ABBOTT
ABBOTT
ABBOTT
XL Global Services
Co-op Financial Services (CFS)
The Co-Operative Bank
The Co-Operative Bank
The Co-Operative Bank
ABBOTT
ABBOTT
ABBOTT
ABBOTT
IKEA
IKEA
Navistar Inc
Ericsson AB
USAA
USAA
USAA
USAA
USAA
USAA.
USAA LIFE INSURANCE CO
USAA LIFE INSURANCE CO
USAA LIFE INSURANCE CO
Philip Morris International
Philip Morris International
Inovalon
Inovalon
Vodafone NL
ABBOTT
ABBOTT
ABBOTT
ABBOTT
XL Global Services
Co-op Financial Services (CFS)
The Co-Operative Bank
The Co-Operative Bank
The Co-Operative Bank
ABBOTT
ABBOTT
ABBOTT
ABBOTT
IKEA
IKEA
Navistar Inc
Ericsson AB
USAA
USAA
USAA
USAA
USAA
USAA.
USAA LIFE INSURANCE CO
USAA LIFE INSURANCE CO
USAA LIFE INSURANCE CO
Philip Morris International
Philip Morris International
Inovalon
Inovalon