March 30
🏢 In-office - San Francisco
• Collaborate and Solve: Work closely with diverse stakeholders to ingest and model complex datasets, ensuring they meet business needs and enable data-driven decisions.
• Optimize and Automate: Identify, design, and implement process improvements, automate manual processes, and optimize data delivery to scale our systems efficiently.
• Enhance Observability: Design and implement processes for data observability, usage metrics, and monitoring, ensuring the platform's reliability and performance.
• Required to work on-site 3 days a week (Tuesday, Wednesday, Thursday). Managers may require additional on-site days.
• 5+ years of data engineering, analytics engineering, or software development experience
• 3+ years of strong data engineering design/development experience building large-scale distributed data platforms/products
• Proficiency in Python and SQL
• Familiarity with Apache Kafka
• Experience with dbt and data modeling
• Experience building scalable, efficient data pipelines using Snowflake and other cloud-based data technologies
• Familiarity with workflow management systems (Airflow, Argo Workflows, etc.)
• Experience designing and implementing data ingestion platforms, interacting with multiple third-party data sources, and assembling them into actionable structures
• Ability to establish and maintain relationships with key stakeholders and peers