Interact with the business team on a regular basis to understand development requirements
Perform analysis, design, and development of ETL and ELT processes to support project requirements
Develop Informatica mappings, SQL and stored procedures, data maps, and Unix shell scripts
Develop Hive, NiFi, Scala, Spark, and Sqoop scripts to move data between RDBMS sources, Hadoop, and Oracle Exadata.
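A typical Sqoop extraction from an RDBMS into HDFS can be sketched as below. The JDBC URL, schema/table, credentials path, and HDFS target directory are all hypothetical placeholders, not values from this role; the command is echoed rather than executed so the sketch can be read without a Hadoop cluster.

```shell
# Sketch of a Sqoop import from an Oracle source into HDFS.
# All names (JDBC URL, table, directory) are hypothetical placeholders.
SRC_JDBC="jdbc:oracle:thin:@//dbhost:1521/ORCLPDB"   # hypothetical source DB
SRC_TABLE="SALES.ORDERS"                             # hypothetical table
TARGET_DIR="/data/raw/orders"                        # hypothetical HDFS path

# Build the command; echoed here rather than run, since executing it
# requires a configured Sqoop/Hadoop environment.
SQOOP_CMD="sqoop import \
  --connect ${SRC_JDBC} \
  --username etl_user --password-file /user/etl/.pw \
  --table ${SRC_TABLE} \
  --target-dir ${TARGET_DIR} \
  --num-mappers 4 \
  --as-parquetfile"
echo "${SQOOP_CMD}"
```

Writing Parquet from the import keeps the landed files directly usable by Hive and Spark without a separate conversion step.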
Develop Hive tables and queries.
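Hive development of this kind usually means DDL over files already landed in HDFS plus queries against the resulting table. The sketch below writes a hypothetical HiveQL script (database, table, columns, and location are placeholders) that would be run on a cluster with `hive -f`:

```shell
# Sketch of a Hive external table and query over data landed in HDFS.
# Database, table, column, and path names are hypothetical placeholders.
cat > orders.hql <<'EOF'
-- External table over Parquet files previously written to HDFS
CREATE EXTERNAL TABLE IF NOT EXISTS stg.orders (
  order_id    BIGINT,
  customer_id BIGINT,
  order_ts    TIMESTAMP,
  amount      DECIMAL(12,2)
)
STORED AS PARQUET
LOCATION '/data/raw/orders';

-- Example aggregation query against the staged data
SELECT customer_id, SUM(amount) AS total_amount
FROM stg.orders
GROUP BY customer_id;
EOF
# On the cluster this would be executed with:  hive -f orders.hql
cat orders.hql
```

Using an EXTERNAL table keeps Hive from owning (and deleting) the underlying files, which matters when the same HDFS data is shared with Spark jobs.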
Migrate data from Hadoop/HDFS, Hive, and Spark to Oracle Exadata
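One common shape for the Hadoop-to-Exadata migration is a Spark job that reads a Hive table and writes it over JDBC. The sketch below shows a hypothetical spark-submit invocation (class name, jar paths, URLs, and table names are all placeholders); it is echoed rather than executed since running it needs a YARN cluster and the Oracle JDBC driver.

```shell
# Sketch of submitting a Spark job that copies a Hive table into Exadata.
# Class, jar, URL, and table names are hypothetical placeholders.
SUBMIT_CMD="spark-submit \
  --class com.example.HiveToExadata \
  --master yarn --deploy-mode cluster \
  --jars /opt/oracle/ojdbc8.jar \
  hive-to-exadata.jar \
  --source-table stg.orders \
  --jdbc-url jdbc:oracle:thin:@//exadata-scan:1521/DWPDB \
  --target-table DW.ORDERS"
# Echoed rather than run, since execution requires a live cluster.
echo "${SUBMIT_CMD}"
```

Inside such a job, the DataFrame read from Hive would typically be written with Spark's JDBC data source (`df.write.format("jdbc")` with `url` and `dbtable` options), batching inserts into the Exadata target table.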
Develop, execute, and implement NiFi workflows, and troubleshoot issues
Perform unit testing and QA, and work with business partners to resolve any issues discovered during UAT.
Peer-review mappings and workflows when required
Maintain development and test data environments by populating data based on project requirements.
Work with production control and operations as needed to promote mappings/workflows, implement schedules, and resolve issues
Review ETL performance and conduct performance tuning as required on mappings, workflows, or SQL.
Maintain all applicable documentation pertaining to specific SDLC phases.
Skills:
Experience working in both Agile Scrum and Waterfall SDLC projects
Good exposure to working with business teams
Technical skills in order of priority: Informatica PowerCenter, Hadoop programming (Sqoop, Hive, NiFi, Spark, Scala), strong SQL, and Oracle Exadata experience.
A minimum of 5 years’ experience in ETL and ELT development.
2 years’ experience with Sqoop, Hive, and NiFi technologies
Good skills in Scala and Spark development
Strong analytical and problem-solving skills in Oracle Exadata, SQL, Hive, NiFi, Scala, Spark, and Sqoop.
Strong understanding of ETL and ELT development best practices, database concepts, and performance tuning in SQL and Informatica.
Strong knowledge of technology platforms and environments
Proven ability to work independently in a dynamic environment with multiple assigned projects and tasks.
Outstanding ability to communicate, both verbally and in writing.
Ability to develop complex mappings and workflows in accordance with requirements.