Basic qualifications

 10+ years of experience in a technical position.
 4+ years of experience on any cloud platform (AWS, Azure, Google Cloud, etc.).
 Bachelor’s degree in Information Science / Information Technology, Computer Science, Engineering,
Mathematics, Physics, or a related field.
 Strong verbal and written communication skills, with the ability to work effectively across internal and
external organizations.
 Programming skills in Java/Python/Perl/PowerShell.
 Strong hands-on experience integrating multiple databases, such as Oracle, SQL Server, PostgreSQL, and
Teradata.
 Familiarity with one or more SQL-on-Hadoop technologies (Hive, Pig, Impala, Spark SQL, Presto).
 Ability to think strategically about business, product, and technical challenges in an enterprise
environment.
 Demonstrated leadership and innovation in data analytics.
 Customer-facing skills to represent AWS well within the customer's environment and drive discussions
with senior personnel regarding trade-offs, best practices, project management, and risk mitigation.
 Experience leading or contributing to highly available, fault-tolerant enterprise and web-scale software applications.
 Advanced troubleshooting and communication skills.
 Understanding of Apache Spark/Hadoop and the data analytics ecosystem, with experience in one or more
relevant tools (Sqoop, Flume, Kinesis, Kafka, Oozie, Hue, ZooKeeper, Ranger, Elasticsearch, Avro).
 Ability to travel to client locations to deliver professional services when needed.

Preferred qualifications

 Hands-on experience leading large-scale global data warehousing and analytics projects.
 Understanding of database and analytical technologies in the industry, including MPP or NoSQL
databases, data warehouse design, BI reporting, and dashboard development.
 Experience implementing AWS services in a variety of distributed computing and enterprise environments.
 Experience with other OLTP databases, such as Oracle, PostgreSQL, and SQL Server.
 Hadoop development experience (HDFS, MapReduce, Hive, HBase, Spark).
 Programming experience (Java, Perl, Python).
 Experience in ETL workflow management.
 Experience with MPP databases (Redshift, Netezza, Teradata, Snowflake).
