
Senior Data Engineer

Sparc The World
Full-time
On-site
United States
$160,000 - $190,000 USD yearly

About the Job


Who We Are


Our client is a premier workflow software and data platform for real estate professionals and consumers. Building on foundational industry data, our client offers a modern software suite that empowers real estate professionals to deliver differentiated service to their clients and grow their businesses.


Backed by prominent investors, our client is pioneering next-generation workflow software and data products for the multi-trillion dollar residential real estate industry. They are currently scaling their platform as the first new entrant in decades.


Position Overview:


Our client's mission is to become the fastest-growing MLS workflow and data platform in the country. Data forms the core of their operations, and they seek a Senior Data Engineer to lead their data engineering initiatives. This role involves building a robust data lake and warehouse solution, scaling existing data infrastructure, and onboarding new MLS partners in the coming months. As a senior data engineer, you will collaborate closely with cross-functional teams including Data Insights, Product, Design, and other engineering teams.


This is an opportunity to join a small but expanding team where, as a foundational member, you will shape the standards, culture, and values of the data engineering team.


What You'll Do:


  • Build ETL tooling and data pipelines to ingest data from third-party sources and APIs into the system.
  • Design and implement automated data governance measures to enhance data quality and observability.
  • Establish team processes and foster a culture of ownership and accountability.
  • Collaborate with the Data Analyst team to support analysis, dashboards, and quality assessments.

What You'll Need:


  • 5+ years of experience in data engineering, with proficiency in Python, SQL, or Kotlin.
  • Experience developing scalable and fault-tolerant data pipelines for batch and real-time use cases.
  • Expertise with ETL schedulers such as Airflow (preferred), Dagster, Prefect, or similar frameworks.
  • Proficiency in cloud architecture, preferably AWS, and related technologies such as S3, SQS, RDS, EMR, Glue, Athena, and Lambda.
  • Experience working with data warehouses like Snowflake (preferred), Redshift, or Google BigQuery.
  • Hands-on experience with CI/CD pipelines using GitLab, GitHub Actions, Terraform, or Jenkins.
  • Familiarity with microservices architecture and cloud data lake implementations.
  • Strong communication skills, both verbal and written, with an ability to collaborate effectively across teams.

Location:


This role is based in our client's office in SoHo, Manhattan, New York City, with at least three days per week onsite.


Bonus Points:


  • Certifications in AWS, Snowflake, or Elasticsearch.
  • Experience with Ruby on Rails.

Compensation:


To provide transparency, our client offers a base salary range of $160,000 to $190,000, along with equity and benefits. Final offers are determined based on skills, relevant experience, and job-related knowledge.


Note:


At this time, our client is considering only candidates authorized to work in the U.S.

