Principal Solution Architect

Remote, Alberta

Description of Work:
As a Principal Solution Architect - Data and AI, the successful candidate will play a lead role in the delivery of all Data and AI projects and programs that drive strategic outcomes.
This role is a key contributor to the implementation of highly secure and reliable Data, AI, RPA, and BI platforms and solutions. Responsibilities include understanding business and technology requirements, defining and communicating solution designs, providing mentorship and guidance to agile delivery teams, and ensuring that delivered solutions are engineered in accordance with the reference architecture and with security, usability, and data governance standards.

* Develop Data and AI Technology Domain strategies and Roadmap.
* Create, assess and evaluate solution options that will address both functional and non-functional requirements.
* Provide strong technical leadership to agile project teams and oversee the solution through implementation.
* Review and recommend new opportunities, solutions, tools and approaches to better serve the long-term architectural direction.
* Work with Enterprise Architecture and other domain architects, such as cloud and security, to ensure solutions adhere to organization standards, principles and practices.
* Demonstrate excellent service, leadership, communication, problem-solving and decision-making skills.
* Demonstrate strong prioritization, time-management and organizational skills.

Required Skills and Experience:
* Excellent understanding of the role of data, how it aligns with the other technology domains and how it enables organization use cases such as reporting, advanced analytics and AI, integration and API-based and event-based business data access.
* Previous experience in other financial industry technology areas such as Treasury systems, self-service banking channels, etc. would be beneficial.
* Experience with architecture and system design within an agile project methodology.
* Experience working with data systems such as data warehouse and data lake, including infrastructure components, ETL/ELT, storages, search and querying, etc.
* Experience with AI technologies and solutions such as data science toolsets, machine learning, computer vision, etc.
* Experience with Robotic Process Automation and Process Performance Management solutions.
* Experience with reporting, data visualization, analytics and advanced analytical solutions.
* Experience building CI/CD pipelines and a strong understanding of CI/CD best practices.
* Experience in implementing foundational cloud services (networking, security, account/organizations, logging & monitoring, identity access management, etc.)
* Experience with cloud compute, storage, and other IaaS services
* Experience with Apigee and API Gateway Management
* Experience with container and orchestration technologies (e.g. Docker, Kubernetes, Cloud Composer).
* Experience in leveraging new technology paradigms (e.g., serverless, containers, microservices)
* Experience working with GitLab and similar technologies.
* Experience designing and building solutions on the Google Cloud Platform (GCP) stack, including technologies such as BigQuery, Dataflow, Cloud Functions, Data Fusion, Bigtable, Firestore, Cloud Build, Cloud Composer, etc.
* Experience designing and building data integration solutions.
* Experience designing data solutions spanning acquisition, ingestion and engineering of data, including technologies such as Change Data Capture (CDC), ETL/ELT, SQL, API, ESB, etc.
* General understanding of Enterprise Architecture (EA) and the TOGAF framework; ability to contribute to the development of EA artifacts.
* Strong leadership skills with the ability to lead assignments/teams and mentor others.
* A university degree or college diploma in Computer Science, Computer Engineering, or a related IT discipline.

Additional Skills (Nice to Have):
Previous experience with/integrating with the following considered beneficial:
* SAP ERP and Banking services
* Major CRM, BPM and ECM systems
* Public or private cloud environments
* Google Cloud Platform
* Terraform
* Containerization / Cloud Native (OpenShift, Kubernetes, EKS/GKE, ECS)
* Big Data / Analytics
* Machine learning and Robotic Process Automation (RPA)
* Source code management with Git
* A non-shell scripting language
* CI/CD pipelines and best practices
* Infrastructure as Code (IaC) tooling