
Google Cloud Platform Data Engineer 3 / Ingénieur de données 3 (15284)

Toronto, ON
One of our digital clients is looking for a Google Cloud Platform Data Engineer for a 9+ month contract in Toronto, ON.

The team includes a passionate group of strategists, UX and visual designers, full stack developers, content managers, scrum masters, testers, product owners, people experience specialists, and other digital experts.

As a GCP Data Engineer, you will be responsible for crafting, building and running the data-driven applications which enable innovative, customer-centric digital experiences.
You will be working as part of a friendly, cross-discipline agile team whose members help each other solve problems across all functions. As a custodian of customer trust, you will employ best practices in development, security, accessibility and design to achieve the highest quality of service for our customers.
Our development team uses a range of technologies to get the job done: GCP, BigQuery, Airflow, Dataflow, Composer and NiFi to provide a modern, easy-to-use data pipeline.
You will be part of the team building a data pipeline to transfer data from our enterprise data lake, enabling data analytics and AI use cases.

Must-have skills (candidates should have at least 3 of the following):
1. SQL – 4-5+ years
2. Programming experience (Python strongly preferred) – 4+ years
3. Experience with Google Cloud Platform (for Data Analytics)
4. Experience with Digital Transformations
5. Experience with any of the following: BigQuery, Dataflow, Airflow, and NiFi
Nice-to-have skills (candidates should have at least 3 of the following):
1. BigQuery experience
2. Experience with ETL Flows
3. Experience with AWS, Azure, Hadoop
4. Data visualization experience (Data Studio, Tableau, Domo, etc.)

Great-to-haves
● Experience with other Big Data related tools and technologies (AWS, Azure, Hadoop)
● Experience building data-as-a-product assets such as Customer Journey Tables, Advanced Reusable Reporting Queries, Semantic Layer Attributes
● Working knowledge of a telecommunication service provider’s lines of business
● Experience with connectors to data visualization tools such as Data Studio, Tableau, Domo

Here’s how
● Learn new skills & advance your data development practice
● Design, develop, test, deploy, maintain and improve the analytics pipeline
● Assist in evaluating technology choices and rapidly test solutions
● Assist the outcome teams in understanding how to best measure their web properties
● Collaborate closely with multiple teams in an agile environment

You're the missing piece of the puzzle
● A passion for data and analytics
● Experience and proficiency in SQL and at least one other modern programming language (Python, Java, etc.)
● Interest and ability to learn new languages & technologies as needed
● Familiar with the execution sequence of ETL Flows
● Experience with GCP, BigQuery, Airflow, Dataflow, Composer and NiFi
● Basic understanding of data warehouses, data lakes, and OLAP and OLTP applications
