Empowering Truck Drivers' Lives
loadboard • trucking • logistics • freight • cloud software
September 4
🏢 In-office - San Francisco
Responsibilities:
- Design, implement, and optimize data pipelines using dbt and AWS services such as Glue, S3, and Athena.
- Develop and maintain ETL processes to ingest, transform, and load data from various sources into our data warehouse.
- Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver data solutions that meet business needs.
- Ensure data integrity, quality, and security in all data processes and workflows.
- Monitor and troubleshoot data pipelines, ensuring they are reliable, scalable, and efficient.
- Optimize data storage and retrieval strategies to improve performance and reduce cost.
Qualifications:
- Proficiency in SQL and experience with data querying and manipulation.
- Experience with data modeling, ETL processes, and building data pipelines.
- Solid understanding of data warehousing concepts and best practices.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration skills, and a strong ability to work independently.
- Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent work experience.
- 3-5 years of experience in similar roles.
- Experience with Python or other scripting languages used for data processing.
- Experience with dbt (data build tool) and AWS DMS (Database Migration Service).
- Hands-on experience with AWS data services, including AWS Glue, Apache Airflow, S3, Athena, Lambda, Redshift, RDS PostgreSQL, and Kinesis, is a plus.
- Experience working in a fast-paced startup environment.
Offers Equity