Archipelago uses AI to digitize risk for large property owners to increase resiliency and lower their cost of risk.
Data • Technology • Real Estate • AI • Machine Learning
April 24
🏢 In-office - San Francisco
• Develop, design, create, modify, and/or test document processing pipelines or systems to support our Machine Learning and Analytics capabilities.
• Collaborate with Data Science, Product Managers, and Software Engineers to enable the Product Support team to deliver compelling user-facing features.
• Ensure data quality throughout its flow; implement guardrails, health checks, and alerts.
• Drive continuous improvement of our existing codebase by participating in code reviews, refactoring legacy code, and measuring code coverage and performance.
• 5+ years of experience as a hands-on data engineer with strong proficiency in Python, Pandas, etc.
• 2+ years of experience with data workflow platforms such as Apache Airflow, Slurm, or Flyte.
• Experience operating both internal and production data pipelines.
• Experience collaborating with platform and machine learning engineers.
• Experience processing PDF and Excel files.
• SQL/analytics experience with Snowflake.
• Experience working in cloud-based environments, e.g., AWS, GCP, Azure.
• Company Equity Program
• Flexible Time Off
• Home Office Stipend