Technical Systems Analyst (62843)

Location: Toronto, ON
Date Posted: 18-07-2017
What is the opportunity?
You will be a senior member of the IT Support team in the Advanced Client Experience (ACE) program, providing production support in a Big Data environment and deploying large distributed Big Data applications that leverage Hadoop and other industry Big Data frameworks. Regular after-hours support is required.
This position offers an opportunity to leverage the Data Maturity Model (DMM), the Data Capability Assessment Model (DCAM), and Enterprise Data Management (EDM) strategies to optimize Client Experience by delivering data management and reporting capabilities built on the latest Big Data technologies. These solutions will support the transformation initiative by accelerating the delivery of information to analysts, managers, and senior executives, serving as a key decision-making tool driven by analytics on data collected from business applications.
What will you do?
• Provide support for data management solutions that use the Hadoop ecosystem, Spark, and other cutting-edge Big Data technologies.
• Manage the production environment infrastructure and provide after-hours support as required.
• Assist in the deployment and implementation of Data Analytics projects using the enterprise suite of Analytics tools in adherence to RBC’s standards and guidelines. 
• Work on production maintenance and assist with Hadoop migrations of existing traditional data management implementations.
• Work with multiple datasets and applications involving large volumes and wide varieties of data, multiple interfaces, and third parties.
• Work with Change Management and Incident Management tools such as HP Service Manager.
What do you need to succeed?
Must-haves
• Minimum of 5–7 years of work experience in a production support role using Big Data Analytics, Data Warehousing, or Business Intelligence reporting technologies.
• Strong technical expertise in the Hadoop ecosystem (HDFS, Spark, Flume, Sqoop, HBase, Kafka, NiFi), as well as Linux, Blue Chip Cloud, and Amazon MWS.
• Working knowledge of Unix commands and Java programming.
• Knowledge of or experience with reporting tools.
• Experience with databases such as SQL Server, Oracle, MongoDB, PostgreSQL, and other NoSQL databases, as well as DataStage.
• Experience with or knowledge of Agile development.
• Experience in Hadoop administration, analyzing cluster performance and bottlenecks, and capacity planning.
• Experience with incident investigation, troubleshooting, root-cause analysis, and problem management, including the use of a service management ticketing tool.
• Intermediate MapReduce development experience.
Nice-to-haves
• Experience with technical documentation.
• Working experience in front-end web development using JavaScript frameworks (Angular, Backbone, jQuery, etc.).
• Experience with other BI reporting tools such as SAP BusinessObjects (BO), Web Intelligence (Webi), and Crystal Reports.
• Knowledge or experience in the Financial or Insurance industry is preferred.