Building products and software that make life insurance accessible to millions of families.
Data Science and User Research
September 6
🏡 Remote – Anywhere in California
• Build robust solutions for managing real-time and batch data
• Develop hardened, repeatable CI/CD data models and pipelines to enable reporting, modeling, and machine learning
• Design and implement data pipelines that blend data from multiple sources and machine learning models, ensuring efficient data processing and feature engineering
• Improve data availability and quality for our enterprise clients through automated monitoring and alerting
• Leverage Google Cloud (GCP) tools and other services (e.g., Astronomer - Apache Airflow) to bring data workloads to production
• Enable end-user configuration of product features, ensuring seamless synchronization with the broader application
• Collaborate with cross-functional teams to deliver informed solutions that meet platform and client needs
• Make team-based decisions, fostering shared responsibility for defensible design considerations
• 5+ years working in a data engineering role supporting product, analytics, and data science teams
• Proficient in SQL and schema design, with experience in columnar databases such as Google BigQuery, Snowflake, or Amazon Redshift (familiarity with GraphQL a plus)
• 4+ years of Python (or similar) experience writing efficient, testable, and readable code
• Experience building and optimizing real-time and batch processing solutions, ensuring high availability and low latency for timely insights and actions
• Skilled in designing end-to-end data pipelines in cloud frameworks (GCP, AWS, Azure) with multi-stakeholder requirements
• Familiarity with Google Cloud (GCP) tools (e.g., Cloud Run, Cloud Functions, Vertex AI, App Engine, Cloud Storage, IAM)
• Experience with CI/CD pipelines for data processing (Docker, CircleCI, dbt, git)
• Proficient in Infrastructure as Code (Terraform or Pulumi) and data orchestration tools (e.g., Apache Airflow)
• Remote (contiguous 48 only)/hybrid workplace
• Meaningful benefits
• Substantial growth opportunities
• Equity
Apply Now