Job title: Lead Analytics Engineer
Reporting to: Head of Data Engineering
Location: Cape Town (Hybrid)

ALL STAFF APPOINTMENTS WILL BE MADE WITH DUE CONSIDERATION OF THE COMPANY’S EE TARGETS

WHAT WE DO

Lula is an innovative and human-focused FinTech company on a mission to help small businesses optimise their cash flow. Our purpose is to help SMEs manage their businesses better, faster, and more simply, so they can spend more time doing what they love.

If you’re looking for a new place to call ‘home’ that believes in the potential of the broader SME landscape in South Africa and a place where you’ll work with awesome people - then Lula’s the place for you!

We’re making business banking fast, human, Lula!

OUR VALUES

Collaborative - we’re a clan and work together as a team, always towards a common goal

Committed - we’re accountable and follow through no matter the challenge

Curious - we look for better ways to do things and make a positive difference

Connected - we stay close to, learn from and look to understand each other and our customers

Compassionate - we go out of our way to care about our colleagues, our customers and our community

OVERALL PURPOSE

We are seeking a Lead Analytics Engineer specialising in FinTech, with a strong preference for experience in the Banking and/or Credit industry. This is a hands-on leadership role that will be instrumental in building, maintaining, and optimising our analytics infrastructure. You will lead a team of engineers while actively designing and developing data models, optimising business data workflows, and ensuring data quality across all analytics and data science processes. This role combines leadership, technical expertise, and strategic vision to drive data-driven decision-making across the organisation.

Key Responsibilities:

  • Lead and Mentor: Manage and guide a team of analytics engineers, ensuring best practices in coding, data modelling, and engineering.
  • Hands-on Engineering: Actively participate in designing, building, and optimising scalable data pipelines (ETL/ELT) using tools like DBT, SQL, and Snowflake.
  • Data Warehousing: Architect, implement, and maintain our data warehouse (Snowflake) to ensure data is accessible, reliable, and optimised for performance.
  • DBT Implementation: Design and develop DBT models and workflows to transform raw data into actionable insights, empowering data scientists and analysts with clean and structured data for analysis and operational implementations.
  • Data Quality and Governance: Implement robust data validation and governance processes to ensure accuracy and consistency across data pipelines and models.
  • Collaboration: Work cross-functionally with data science, product, and business teams to understand and anticipate data requirements.
  • Optimisation: Continuously improve the performance, scalability, and efficiency of data models and pipelines, driving best practices in data architecture.
  • Strategic Initiatives: Contribute to the overall data strategy and help drive the adoption of new tools and technologies as required by business needs.

THE COMPETENCIES WE’RE AFTER

  • Clear and concise communication and documentation skills
  • Proven ability to lead and mentor a team of engineers, driving high performance
  • Strong cross-functional collaboration skills, with the ability to translate business needs into technical solutions
  • Process-oriented, with experience working in Agile environments
  • Critical thinking skills
  • Problem-solving abilities, with a focus on proactive issue resolution
  • Skilled in balancing multiple projects and delivering on time
  • Focused on high quality output
  • Self-starter

THE SKILLS AND EXPERIENCE WE’RE LOOKING FOR

  • Bachelor's degree in Computer Science, Data Science, or related field. A master's degree is a plus.
  • 5+ years of experience as a Data/Analytics Engineer, with 2+ years in a leadership or senior role.
  • Strong familiarity with the FinTech, Banking, or Credit industries.
  • Expertise in data warehousing technologies, particularly Snowflake.
  • Extensive experience using DBT to develop complex data models.
  • Deep knowledge of data integration, ETL/ELT pipelines, and orchestration frameworks.
  • Advanced SQL skills and experience optimising complex queries for performance.
  • Experience with cloud data platforms (AWS, GCP, or Azure) and associated data storage/processing services.
  • Proficiency in Python or other programming languages for data processing.

Nice to Have:

  • Experience with BI tools like Looker, Tableau, or Power BI.
  • Familiarity with data cataloging tools and data governance frameworks.
  • Knowledge of data protection and payment security standards (e.g., GDPR, PCI DSS) and how they impact data management.

Please note that all appointments are subject to our background checking process, which may include credit, criminal, and any other job-inherent checks.