Senior Data Engineer, Core Experience
at Instacart
Canada (ON, AB, BC, NS Only), Remote
We're transforming the grocery industry
At Instacart, we invite the world to share love through food because we believe everyone should have access to the food they love and more time to enjoy it together. Where others see a simple need for grocery delivery, we see exciting complexity and endless opportunity to serve the varied needs of our community. We work to deliver an essential service that customers rely on to get their groceries and household goods, while also offering safe and flexible earnings opportunities to Instacart Personal Shoppers.
Instacart has become a lifeline for millions of people, and we’re building the team to help push our shopping cart forward. If you’re ready to do the best work of your life, come join our table.
Instacart is a Flex First team
There’s no one-size-fits-all approach to how we do our best work. Our employees have the flexibility to choose where they do their best work, whether it’s from home, an office, or their favorite coffee shop, while staying connected and building community through regular in-person events. Learn more about our flexible approach to where we work.
Overview
At Instacart, our mission is to create a world where everyone has access to the food they love and more time to enjoy it together. Millions of customers every year use Instacart to buy their groceries online, and the Data Engineering team builds the critical data pipelines that underpin the myriad ways data is used across Instacart to support our customers and partners.
About the Role
The data engineering team plays a critical role in defining how data is modeled and standardized for uniform, reliable, timely, and accurate reporting. This is a high-impact, high-visibility role owning critical data integration pipelines and models across all of Instacart’s products. You will be part of a team with a large amount of ownership and autonomy, working closely with product and data science partners from core experience, growth, fulfillment, and other teams to support various operational and analytical use cases.
About the Team
Data engineering is part of the Infrastructure Engineering pillar, collaborating closely with product teams to capture critical data needed for various use cases. Our team is responsible for building high-quality data products that empower stakeholders to make data-driven decisions, ensuring data accuracy, reliability, and accessibility across the organization.
About the Job
- Design, build, and maintain high-quality, scalable, and robust data pipelines and ETL/ELT processes.
- Collaborate with engineers and both internal and external stakeholders, owning a large part of the process from problem understanding to shipping the solution.
- Work closely with product and data science partners to capture critical data needed for various operational and analytical use cases.
- Drive organization-wide initiatives and suggest improvements to existing data processes.
- Maintain a sense of urgency while shipping high-quality and pragmatic solutions.
- Implement and maintain data quality monitoring and observability checks to ensure data accuracy and reliability.
- Address and manage data ambiguity by developing robust data validation and reconciliation processes (see the illustrative sketch after this list).
- Utilize dbt (data build tool) and Spark to transform raw data into meaningful datasets that meet business requirements.
- Define and document data requirements in collaboration with cross-functional teams to ensure alignment and clarity.
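For a concrete flavor of the data quality and reconciliation work described above, here is a minimal sketch in Python. The check names, thresholds, and toy tables are hypothetical; in practice, checks like these would typically live in dbt tests, Spark jobs, or a tool such as Great Expectations rather than a standalone script.

```python
# Minimal, illustrative data-quality checks of the kind described above.
# All names (orders_raw, orders_clean, thresholds) are hypothetical.

from dataclasses import dataclass


@dataclass
class CheckResult:
    name: str
    passed: bool
    detail: str


def reconcile_row_counts(raw_rows: int, transformed_rows: int,
                         max_drop_pct: float = 1.0) -> CheckResult:
    """Fail if the transformed table lost more than max_drop_pct of rows."""
    drop_pct = 100.0 * (raw_rows - transformed_rows) / max(raw_rows, 1)
    return CheckResult(
        name="row_count_reconciliation",
        passed=drop_pct <= max_drop_pct,
        detail=f"dropped {drop_pct:.2f}% of rows (limit {max_drop_pct}%)",
    )


def check_not_null(records: list[dict], column: str) -> CheckResult:
    """Fail if any record is missing a value for the given column."""
    nulls = sum(1 for r in records if r.get(column) is None)
    return CheckResult(
        name=f"not_null_{column}",
        passed=nulls == 0,
        detail=f"{nulls} null values in '{column}'",
    )


if __name__ == "__main__":
    # Toy data standing in for a raw source table and its transformed output.
    orders_raw = [{"order_id": 1, "total": 20.5}, {"order_id": 2, "total": None}]
    orders_clean = [{"order_id": 1, "total": 20.5}]

    results = [
        reconcile_row_counts(len(orders_raw), len(orders_clean), max_drop_pct=60.0),
        check_not_null(orders_clean, "total"),
    ]
    for r in results:
        print(f"{'PASS' if r.passed else 'FAIL'} {r.name}: {r.detail}")
```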
About You
Minimum Qualifications
- 6+ years of experience in a Data/Software Engineering role, with a focus on building data pipelines.
- Expertise in SQL and working knowledge of Python.
- Experience building high-quality ETL/ELT pipelines.
- Past experience with data immutability, auditability, slowly changing dimensions, or similar concepts (see the sketch after these qualifications).
- Experience building data pipelines for accounting/billing purposes.
- Experience with cloud-based data technologies such as Snowflake, Databricks, Trino/Presto, or similar.
- Adept at communicating with cross-functional stakeholders to drive requirements and design shared datasets.
- A strong sense of ownership and an ability to balance urgency with shipping high-quality, pragmatic solutions.
- Experience working with a large codebase on a cross-functional team.
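As a hedged illustration of the slowly changing dimension concept mentioned above (a generic Type 2 sketch, not Instacart’s actual implementation), the idea is to expire the current row and append a new version rather than overwrite history. All table and column names below are hypothetical; a production version would more likely be a dbt snapshot or a Spark/Snowflake MERGE.

```python
# Illustrative Type 2 slowly changing dimension update.
# Column names (customer_id, address, valid_from, valid_to, is_current)
# are hypothetical and chosen only for the example.

from datetime import date


def scd2_upsert(dimension: list[dict], incoming: dict, today: date) -> list[dict]:
    """Close out the current row for a changed record and append a new version."""
    updated = []
    for row in dimension:
        if (row["customer_id"] == incoming["customer_id"]
                and row["is_current"]
                and row["address"] != incoming["address"]):
            # Expire the old version instead of overwriting it (auditability).
            row = {**row, "valid_to": today, "is_current": False}
        updated.append(row)

    changed = any(r["customer_id"] == incoming["customer_id"]
                  and not r["is_current"] and r["valid_to"] == today
                  for r in updated)
    exists = any(r["customer_id"] == incoming["customer_id"] and r["is_current"]
                 for r in updated)
    if changed or not exists:
        updated.append({
            "customer_id": incoming["customer_id"],
            "address": incoming["address"],
            "valid_from": today,
            "valid_to": None,
            "is_current": True,
        })
    return updated


if __name__ == "__main__":
    dim = [{"customer_id": 7, "address": "12 Elm St", "valid_from": date(2023, 1, 1),
            "valid_to": None, "is_current": True}]
    dim = scd2_upsert(dim, {"customer_id": 7, "address": "99 Oak Ave"}, date(2024, 6, 1))
    for row in dim:
        print(row)
```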
Preferred Qualifications
- Bachelor’s degree in Computer Science, Computer Engineering, or Electrical Engineering, or equivalent work experience.
- Experience with Snowflake, dbt (data build tool), Spark, and Airflow.
- Experience with data quality monitoring/observability, either using custom frameworks or tools like Great Expectations, Monte Carlo, etc.
- Proven ability to manage and resolve data ambiguity through effective validation and reconciliation processes.
- Strong understanding of data requirements gathering and documentation practices.
- Experience leveraging AI to enhance data products.
#LI-Remote
Instacart provides highly market-competitive compensation and benefits in each location where our employees work. This role is remote and the base pay range for a successful candidate is dependent on their permanent work location. Please review our Flex First remote work policy here. Currently, we are only hiring in the following provinces: Ontario, Alberta, British Columbia, and Nova Scotia.
Offers may vary based on many factors, such as candidate experience and skills required for the role. Additionally, this role is eligible for a new hire equity grant as well as annual refresh grants. Please read more about our benefits offerings here.
For Canada-based candidates, the base pay ranges for a successful candidate are listed below.