Cell C Ltd
Cell C has a vacancy for an ETL Specialist
Duties & Responsibilities
Purpose of the Job: To perform key tasks within the Data Warehouse and Big Data platforms, including systems administration and support, data analysis and reporting, performance monitoring, and custom system development using a variety of tools. Build data models and work closely with Platform and Infrastructure Engineers to build and scale reliable data pipeline infrastructure.
- Build and scale reliable data pipeline infrastructure
- Develop and maintain system improvements by analysing process workflows and monitoring system utilisation in conjunction with the EIS technical team (Oracle DB/Hadoop Big Data Platform)
- Monitor the systems daily, verifying the integrity and availability of all hardware, server resources, systems, and key processes, reviewing system and application logs, and verifying completion of scheduled jobs (Oracle and Hadoop Big Data Platform)
- Produce working code that meets the initial-load and refresh extraction, transformation, and loading (ETL), error and exception handling, and data cleansing specifications.
- Design the data cleansing, error and exception handling, and audit and control modules for the data quality process.
- Communicate closely with the data modelling manager and database designer to ensure the database design meets the data requirements of the module functionality and that the modules load data efficiently.
- Produce module and link test plans; diagnose faults and determine corrections during the various testing activities and the production phase.
- Assist the project team in minimising risks to project delivery.
- Assist vendors with installation and system preparations for new patches and upgrades.
- Ensure applications are correctly configured and installed, and maintain operating and development software.
- Work with the project team to optimise system performance.
Data Quality Assurance Function:
- Provide knowledge and guidance regarding specific ETL tool functionality.
- Support and provide interpretation of the tool's capabilities, and cross-train other MIS team members on the tool.
- Diagnose and correct faults found by tests.
- Execute regression tests and corrections during production.
- Identify data issues and communicate them to the data integration manager. Improve data quality by correcting inconsistent data within and across applications, and measure data quality against the relevant quality attributes.
- Focus on data quality, with well-defined processes for the ongoing monitoring and correction of quality issues within source systems.
- Perform any other related duties as requested by Management
Desired Experience & Qualification
- Bachelor's degree or Diploma in Engineering/Sciences/Commerce, preferably with Computer Science or Information Systems as a major.
- Certified training in dimensional modelling techniques (Ralph Kimball, etc.), database management techniques (preferably Oracle-based), and data integration techniques (Extraction, Transformation, and Loading using Oracle Warehouse Builder and Informatica as preferred technologies, or equivalent).
- Minimum 3 years' experience in the Information Technology industry.
- Minimum 2 years' experience in Applications Support, preferably in a hybrid Big Data environment.
- Minimum 2 years' experience with Oracle Databases, SQL, and UNIX operating system environments.
- At least 2 years' experience building ETL jobs.
- Ability to write code in scripting languages (e.g., SQL, Python).
- Knowledge of, and preferably experience with, Big Data ecosystems and how their services integrate, such as Hadoop & HDFS, MapReduce & Hive, and DAG-based Apache Spark.
- A good understanding of the telecommunications industry or CDR/EDR analysis would give the candidate an added advantage.
How to apply