To search for jobs specifically for CAIA Charterholders, or for those pursuing the CAIA Charter, please enter “CAIA” in the search panel. This will enable you to search for CAIA-specific roles globally.
What does GDMO do? The Group Data Management Office (GDMO) plans, prioritizes and executes key data initiatives that build long-term data governance and management capabilities, enabling OCBC Group to meet its regulatory obligations and digitalization objectives.

Position Available
The candidate for this role is expected to be passionate about working with huge datasets and to have experience partnering with businesses to build data products and services that turn data into insights using advanced analytics and machine learning. They should have experience curating data for analytics and a strategic, long-term view on architecting advanced data ecosystems. They are experienced in building efficient and scalable data services and can integrate data systems with relevant tools and services to support a variety of customer use cases and applications. The function includes:
Designing, implementing, and operating large-scale, high-volume, high-performance data structures for analytics and data science
Implementing data ingestion routines, both real-time and batch, using best practices in data modeling and ETL/ELT processes, leveraging relevant technologies and big data tools
Gathering business and functional requirements and translating them into robust, scalable, operable solutions with a flexible and adaptable data architecture
Collaborating with IT to help adopt best practices in data system creation, data integrity, test design, analysis, validation, and documentation
Collaborating with data scientists to create fast and efficient algorithms that exploit rich data sets for optimization, statistical analysis, prediction, clustering and machine learning
Continually improving ongoing reporting and analysis processes, automating or simplifying self-service modeling and production support for users
Qualifications and Qualities The successful candidate will be expected to be/have:
7-8 years of related working experience, with demonstrated strength in ETL/ELT, data modeling, data warehouse technical architecture, infrastructure components and reporting/analytic tools
5+ years' hands-on experience writing complex, highly optimized SQL queries across large data sets
3+ years of experience with scripting languages such as Python
3+ years of experience as a project lead driving projects under enterprise program initiatives
Experience with big data technologies (Hadoop, Hive, Kafka, Spark, etc.) and reporting platforms such as QlikView or Tableau
Ability to deal with ambiguity and to prioritize and manage multiple tasks, with good problem-solving skills
Willingness to listen to multiple stakeholders and forge consensus on win-win solutions that uphold sound data governance and management principles