June 11
🏡 Remote – Anywhere in California
• Design, implement, and maintain deployment and ETL pipelines for data products
• Integrate diverse data sources and vendor products
• Develop, deploy, and maintain scalable data pipelines
• Implement monitoring solutions to track data pipeline performance
• Strong computer science fundamentals with 2+ years of relevant experience
• Strong programming skills in Python or Java
• Proficiency with SQL and relational databases
• Proficiency with Snowflake and its services (development, performance tuning)
• Proficiency with workflow orchestration engines (Apache Airflow, Argo Workflows)
• ...
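For candidates gauging fit, the day-to-day work resembles the kind of extract-transform-load flow sketched below. This is purely an illustrative sketch using Python's standard library (sqlite3 stands in for a real warehouse such as Snowflake); the table name, schema, and sample data are invented for the example, not details of the role.

```python
import sqlite3

def extract(rows):
    """Extract: yield raw records from a source (here, an in-memory list)."""
    yield from rows

def transform(records):
    """Transform: normalize names, drop incomplete records, round amounts."""
    for name, amount in records:
        if name and amount is not None:
            yield (name.strip().lower(), round(float(amount), 2))

def load(conn, records):
    """Load: write cleaned records into a relational table."""
    conn.execute("CREATE TABLE IF NOT EXISTS sales (name TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (?, ?)", records)
    conn.commit()

# Hypothetical raw input: messy names, a missing amount, excess precision.
raw = [(" Alice ", "10.5"), ("Bob", None), ("carol", "3.333")]
conn = sqlite3.connect(":memory:")
load(conn, transform(extract(raw)))
count, total = conn.execute("SELECT COUNT(*), SUM(amount) FROM sales").fetchone()
```

In production this shape typically lives inside an orchestration engine (each function becoming an Airflow or Argo task) with monitoring on row counts and latencies, matching the responsibilities listed above.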
• Competitive salary
• Comprehensive benefits package
• Remote work flexibility
• Opportunities for career growth
Apply Now