About The Role:
The role will be part of the Fazz data team. As a Data Engineer, you will be responsible for designing and maintaining our data warehouse, as well as creating optimised data pipelines to populate it. As a Data Engineer (Data Warehouse), you will work alongside other data engineers to organise our data into data warehouses, create data pipelines, ensure data quality, and implement ETL processes to migrate and deploy data across systems using Google BigQuery.
What You Will Do:
- Design data warehouse structure for data analytics and data science requirements
- Develop and maintain robust ETL pipelines to populate data in data warehouses
- Work closely with stakeholders and other data team members to architect, develop, and maintain advanced reporting and data visualization capabilities on large volumes of data, supporting analytics and leveraging data to improve business processes.
- Implement best practices in data engineering, including data integrity, quality, and documentation.
- Collect, organize, analyze, and present information with attention to detail and accuracy, for both structured and unstructured datasets.
What We Are Looking For:
- Minimum of 1 year of experience in Data Engineering and/or Software Engineering. Fresh graduates are welcome to apply.
- Excellent English communication skills, both written and verbal.
- Data modeling knowledge for efficient data architecture and cloud database design.
- Excellent SQL query skills and experience with the Python programming language.
- Some experience with, or at least working knowledge of, cloud data technologies on GCP or AWS, data warehouse solutions such as Redshift or BigQuery, and BI tools (Metabase, Tableau).
- Some experience writing data pipelines or building ETL/ELT processes using tools such as Airflow, Talend, or Pentaho.
- Ability to collaborate effectively with cross-functional teams.
- Hybrid working arrangement.