Own the strategic vision and technology roadmap for our data platform to meet the data needs of stakeholders across multiple business units.
Define and implement best practices for all data governance and data integration processes and tools.
Define and implement an operating model that delivers optimum ways of working across the data engineering, science, and analysis practices.
Deliver and execute a plan to modernize the data infrastructure and tools to efficiently support both operational and analytical data workloads.
The plan should deliver a highly scalable data lake architecture for optimal data processing.
Ensure proper data governance and data security measures are established via processes and technologies.
Establish and facilitate optimal project governance and decision-making structures, such as a Steering Committee and Working Groups, to ensure smooth, efficient, and risk-managed program delivery.
Develop change management strategies and approaches to ensure transformational programs can be delivered effectively and business adoption is achieved.
Safeguard the solutions against IT/cyber security threats through appropriate design and controls, and ensure full compliance with IT security policies and standards.
Enable smooth business operations by ensuring high system availability through appropriate design and architecture, and comply with IT availability and business continuity management processes and standards.
Provide technical expertise and support for managing highly complex bespoke data processing needs.
Effectively and collaboratively prioritize data-related projects, balancing goals and dependencies from internal and external stakeholders, including product teams, technology, marketing, business leaders, and partners.
Relevant college or university qualification, to a minimum of Bachelor's level
Minimum of 7 years of relevant experience
Advanced knowledge of and experience in data engineering, data modelling, and data optimization
Experience with big data architectures and data modelling to efficiently process large volumes of data.
Experience implementing real-time, batch, and streaming analytics
Experience setting up and operating large data warehouses and SQL and NoSQL databases
Experience with the Hadoop big data ecosystem and distributed computing
Experience implementing data security
Experience in data modelling, application design, microservices architecture, data optimization, and application development frameworks
Job-Specific Skills
Exposure to application frameworks and architectures such as Spring, Node.js, Angular, and microservices
Exposure to DevOps frameworks and practices
Exposure to Cloud technologies
Exposure to programming languages such as Java, Python, Scala
Experience with Snowflake, Microsoft Azure SQL Data Warehouse, Dremio, or similar
Exposure to data science, analytics, and statistics packages and libraries
Very good command of English
Demonstrated success in building an environment that fosters and rewards collaboration, accountability, and trust
Excellent communication and relationship-building skills across every level of the organization