Senior Data Engineer (Bogotá and Medellín)

at Wizeline

Colombia



We are:
Wizeline is a global AI-native technology solutions provider that develops cutting-edge, AI-powered digital products and platforms. We partner with clients to leverage data and AI, accelerating market entry and driving business transformation. As a global community of innovators, we foster a culture of growth, collaboration, and impact.

With the right people and the right ideas, there’s no limit to what we can achieve.

Are you a fit?

Sounds awesome, right? Now, let’s make sure you’re a good fit for the role:

We are seeking a highly experienced and technically adept Lead Data Engineer to play a key role in a critical data platform migration initiative. In this role, you will lead a team of talented data engineers in transitioning our existing data infrastructure from PySpark, Athena, and BigQuery to a modern stack centered on dbt and Snowflake, with Airflow for orchestration. You will be responsible for defining the migration frameworks and standards, architecting scalable data solutions, ensuring data quality and governance, and mentoring the team throughout this transformation. This is a hands-on role where you will be instrumental in shaping the future of our data capabilities.
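To make the target stack concrete, here is a minimal sketch of the kind of orchestration described above: an Airflow DAG that runs dbt builds and tests against Snowflake. The DAG id, project path, and dbt target are hypothetical placeholders, not details of the actual environment.

```python
# A minimal sketch of the target stack: Airflow orchestrating dbt runs
# against Snowflake. All identifiers (dag_id, paths, target name) are
# hypothetical placeholders. Uses Airflow 2.4+ style (`schedule`).
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_snowflake_daily",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Build the dbt models in Snowflake (connection details live in
    # the dbt profile, not in this DAG).
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt --target prod",
    )

    # Run dbt's data tests after the models are built.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt --target prod",
    )

    dbt_run >> dbt_test
```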

Responsibilities

  • Migration Strategy & Execution:
    • Define and lead the comprehensive data migration strategy from existing BigQuery/Athena/PySpark data sources and pipelines to Snowflake.
    • Oversee the design, development, and implementation of scalable ELT pipelines using Airflow for orchestration and dbt for data transformation within Snowflake.
    • Develop and implement robust data validation and reconciliation processes to ensure data integrity and accuracy throughout the migration (see the reconciliation sketch after this list).
  • Architectural Leadership:
    • Design and optimize scalable data models and schemas within Snowflake, applying best practices for performance, cost efficiency, and maintainability.
    • Establish and enforce data engineering best practices, coding standards, and CI/CD pipelines for the new dbt/Snowflake environment.
    • Collaborate with data architects, product owners, and other stakeholders to translate business requirements into technical data solutions.
  • Team Leadership & Mentorship:
    • Lead, mentor, and coach a team of data engineers, fostering a culture of technical excellence, collaboration, and continuous learning.
    • Conduct code reviews, provide constructive feedback, and guide the team in adopting new technologies and methodologies.
    • Oversee project planning, resource allocation, and delivery timelines for data engineering initiatives.
  • Data Governance & Quality:
    • Ensure compliance with data security and privacy regulations.
    • Proactively identify and address data quality issues, bottlenecks, and performance challenges.
  • Stakeholder Communication:
    • Clearly communicate technical plans, progress, and potential risks to both technical and non-technical stakeholders.
    • Collaborate effectively with cross-functional teams including analytics, data science, and operations.
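
A minimal sketch of the source-vs-target reconciliation mentioned above: comparing row counts for one table between BigQuery (source) and Snowflake (target). All connection details and names (project, dataset, database, schema, table) are hypothetical placeholders.

```python
# A minimal reconciliation sketch: compare row counts for one table in
# BigQuery (source) and Snowflake (target). All names and credentials
# below are hypothetical placeholders.
from google.cloud import bigquery
import snowflake.connector


def bigquery_row_count(table: str) -> int:
    client = bigquery.Client()  # uses application-default credentials
    query = f"SELECT COUNT(*) AS n FROM `my_project.my_dataset.{table}`"
    return next(iter(client.query(query).result()))["n"]


def snowflake_row_count(table: str) -> int:
    conn = snowflake.connector.connect(
        account="my_account",   # hypothetical; read secrets from a vault
        user="etl_user",
        password="...",
        warehouse="ETL_WH",
        database="ANALYTICS",
        schema="PUBLIC",
    )
    try:
        cur = conn.cursor()
        cur.execute(f"SELECT COUNT(*) FROM {table}")
        return cur.fetchone()[0]
    finally:
        conn.close()


if __name__ == "__main__":
    table = "orders"  # hypothetical table migrated from BigQuery
    src, tgt = bigquery_row_count(table), snowflake_row_count(table)
    status = "OK" if src == tgt else "MISMATCH"
    print(f"{table}: source={src} target={tgt} -> {status}")
```

In practice, reconciliation would extend beyond row counts to per-column aggregates, checksums, and sampled row-level comparisons across all migrated tables.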

Must-have Skills 

  • Bachelor's or Master's degree in Computer Science, Engineering, or a related quantitative field.
  • 5+ years of progressive experience in data engineering, with at least 3 years in a lead or leadership capacity.
  • Deep expertise in data migration projects, particularly involving transitions between cloud data warehouses.
  • Extensive hands-on experience with Snowflake, including performance tuning, cost optimization, and advanced features.
  • Mastery of dbt (data build tool) for data modeling, transformation, testing, and documentation.
  • Strong proficiency in Apache Airflow for workflow orchestration and pipeline management.
  • Proven experience with data warehousing concepts, dimensional modeling, and ELT/ETL processes.
  • Expert-level SQL scripting and performance tuning skills.
  • Proficiency in Python for data processing, scripting, and automation.
  • Familiarity with the capabilities and nuances of PySpark, Athena, and BigQuery (source systems).
  • Experience with version control (e.g., Git and GitHub) and CI/CD methodologies.
  • Excellent problem-solving, analytical, and communication skills.
  • Ability to work independently and collaboratively in a fast-paced, agile environment.
  • Fluent spoken and written English; an effective communicator.

Nice-to-have:

  • AI tooling proficiency: experience using one or more AI tools to optimize and augment day-to-day work, including drafting, analysis, research, or process automation; able to recommend effective AI use and identify opportunities to streamline workflows.

What we offer:

  • A High-Impact Environment
  • Commitment to Professional Development
  • Flexible and Collaborative Culture
  • Global Opportunities
  • Vibrant Community
  • Total Rewards

*Specific benefits are determined by the employment type and location.


Find out more about our culture here.
