Mid Level Data Scientist (Azure)

at Wizeline

Remote

We are:
Wizeline, a global AI-native technology solutions provider, develops cutting-edge, AI-powered digital products and platforms. We partner with clients to leverage data and AI, accelerating market entry and driving business transformation. As a global community of innovators, we foster a culture of growth, collaboration, and impact.

With the right people and the right ideas, there’s no limit to what we can achieve.

Are you a fit?
Sounds awesome, right? Now, let’s make sure you’re a good fit for the role:

Key Responsibilities

  • Design, develop, and optimize Databricks notebooks to process large volumes of data on Azure.
  • Translate business rules into PySpark code, developing robust and scalable solutions.
  • Read and process data from various sources, primarily Delta Lake tables.
  • Apply complex transformations on Spark DataFrames (see the sketch after this list), including:
      • Data cleaning and preparation.
      • Creation of new columns and derivation of metrics.
      • Use of advanced functions such as window functions.
      • Execution of different types of joins and data combinations.
  • Write and update results in Delta tables.
  • Refactor and optimize existing notebooks to improve performance and readability.
  • Manage version control and notebook integration using Azure DevOps and Git.
  • Actively collaborate in code reviews through pull requests.
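
A minimal sketch of the kind of notebook work described above, assuming hypothetical table names (sales_raw, customers, sales_curated) and columns invented for the example; in a Databricks notebook the spark session is already available:

    # Minimal PySpark sketch; all table and column names are hypothetical.
    from pyspark.sql import SparkSession, functions as F
    from pyspark.sql.window import Window

    spark = SparkSession.builder.getOrCreate()  # predefined in Databricks notebooks

    # Read source data from Delta Lake tables.
    sales = spark.read.table("sales_raw")
    customers = spark.read.table("customers")

    # Cleaning and preparation: drop incomplete rows, derive a new metric column.
    sales = (sales.dropna(subset=["customer_id", "amount"])
                  .withColumn("amount_usd", F.col("amount") * F.col("fx_rate")))

    # Window function: running total of spend per customer, ordered by date.
    w = Window.partitionBy("customer_id").orderBy("order_date")
    sales = sales.withColumn("running_spend", F.sum("amount_usd").over(w))

    # Join against a dimension table and write the result back to a Delta table.
    curated = sales.join(customers, on="customer_id", how="left")
    curated.write.format("delta").mode("overwrite").saveAsTable("sales_curated")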

Must-have Skills 

  • Expert-level experience in Azure Databricks
  • Solid experience with PySpark and Spark DataFrames
  • Strong hands-on expertise in Delta Lake (ACID transactions, schema evolution, optimization techniques); see the sketch after this list
  • Proficient in Azure DevOps (Repos, Pipelines, CI/CD workflows)
  • Strong Git skills (branching strategies, pull requests, code review collaboration)
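
A rough sketch of those Delta Lake features in action, again with hypothetical table names: an upsert via MERGE (one ACID transaction), an append with schema evolution enabled, and file compaction with OPTIMIZE:

    # Hypothetical Delta Lake sketch: MERGE upsert, schema evolution, optimization.
    from delta.tables import DeltaTable
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()  # predefined in Databricks notebooks

    # ACID upsert: MERGE a batch of updates into an existing Delta table.
    target = DeltaTable.forName(spark, "sales_curated")
    updates = spark.read.table("sales_updates")
    (target.alias("t")
           .merge(updates.alias("u"), "t.order_id = u.order_id")
           .whenMatchedUpdateAll()
           .whenNotMatchedInsertAll()
           .execute())

    # Schema evolution: allow new columns to be added when appending.
    (updates.write.format("delta").mode("append")
            .option("mergeSchema", "true")
            .saveAsTable("sales_curated"))

    # Optimization: compact small files and co-locate rows for faster scans.
    spark.sql("OPTIMIZE sales_curated ZORDER BY (customer_id)")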

Nice-to-have:

  • AI Tooling Proficiency: Leverage one or more AI tools to optimize and augment day-to-day work, including drafting, analysis, research, or process automation. Provide recommendations on effective AI use and identify opportunities to streamline workflows.
  • Solid experience in data manipulation using PySpark.
  • Knowledge of cloud-based architectures, ideally Azure.
  • Experience working with collaborative notebooks and version control in data environments.
  • Ability to translate business processes into reproducible technical solutions.

What we offer:

  • A High-Impact Environment
  • Commitment to Professional Development
  • Flexible and Collaborative Culture
  • Global Opportunities
  • Vibrant Community
  • Total Rewards

*Specific benefits are determined by the employment type and location.


Find out more about our culture here.
