Senior Data Architect
Companies & Intellectual Property Commission
2025/10/17
Pretoria
ADVERTISEMENT
Senior Data Architect
Job Grade: P4
Ref No.: D130001
Location: Pretoria
Job Type: 5-year fixed-term contract
Job Purpose: The primary purpose of this role is to lead and implement a comprehensive Data Governance Program and establish a robust, future-proof Data Architecture across multi-cloud platforms. This role is crucial to revolutionizing CIPC's data management, enhancing data quality, streamlining analytics processes, and creating a centralized data hub, including a Data Marketplace, to foster a data-driven culture and enable advanced analytics across the organization.
Required Minimum Education / Training
Candidates must meet one of the following requirements:
Required Minimum Experience
Specialized Certification and Experience Pathway (Alternative):
Required Minimum Education / Training
Required Minimum Experience
Minimum Functional Requirements (Technical Skills & Knowledge)
• Deep Data Modeling Expertise: Expert-level proficiency in designing and implementing dimensional (Star/Snowflake), Data Vault, and Relational data models, with a clear understanding of the trade-offs between them.
• Cloud Data Platform Mastery: In-depth knowledge and proven experience designing and deploying scalable data solutions using at least two major cloud platforms (Azure, AWS, or GCP), including their respective data warehousing, data lake, and compute services.
• Big Data Ecosystem: Strong hands-on experience with Apache Spark (PySpark/Scala) for large-scale data processing and experience with modern messaging/streaming technologies (e.g., Kafka).
• Data Governance: Proven ability to implement technical solutions for Metadata Management, Data Lineage, Data Quality (DQ), and Data Observability.
• Advanced SQL & Programming: Expert proficiency in writing complex, optimized SQL queries and extensive experience in a programming language such as Python or Scala for pipeline development.
• Architecture Frameworks: Practical experience applying Enterprise Architecture methodologies and frameworks, such as TOGAF, to data initiatives.
• Security & Access Control: Expertise in designing and enforcing fine-grained data access controls, encryption, and data masking techniques across cloud data stores.
• Technical Skills: Strong knowledge of data modeling, ETL/ELT processes, data warehousing, and data governance best practices. Proficiency in SQL, Python, Spark, and other relevant programming languages and tools.
• Soft Skills: Excellent communication, collaboration, and problem-solving skills. Ability to work independently and lead a team of technical resources.
Key performance areas
Applicants may, as a step in the recruitment process, be subjected to a competency assessment. In addition, the successful candidate must be prepared to undergo a security clearance process prior to appointment.
Qualifications and SA citizenship checks will be conducted on the successful candidate. It is the applicant’s responsibility to have foreign qualifications evaluated by the South African Qualifications Authority (SAQA).
Candidates are expected to be available for selection interviews at a date, time and place determined by CIPC.
CIPC is an equal opportunity, affirmative action employer. Preference will be given to candidates whose appointment will enhance representation in accordance with the approved employment equity plan.
Feedback will be given only to shortlisted candidates.
CIPC reserves the right not to fill an advertised position.
For further details regarding this position, please follow the link: https://cipc.mcidirecthire.com/default/External/CurrentOpportunities or visit the CIPC website at www.cipc.co.za
Kindly note that faxed, emailed, posted and/or hand-delivered applications will not be considered.
Should you experience any difficulty in applying please contact the CIPC Recruitment Office by dialing: 087 743 7074, 7075, 7197 or 087 260 1554
Closing date: October 31, 2025