Company Overview
Biobot Analytics develops cutting-edge technology to transform sewers into public health observatories. Wastewater contains valuable information about the health of communities. We collect it. We analyze it. We tell you how to leverage it to make your city better. Our first product measures opioids and other drug metabolites in sewage to estimate consumption in cities. With this data, those working on harm reduction can assess the scope of the epidemic, allocate resources, and gauge the effectiveness of programming over time.
About the role
We are looking for a Software Engineer, Data to join our Software team. The Software Engineer will play a crucial role in designing, building, and maintaining our data infrastructure, and in ensuring the availability and reliability of diverse datasets for both internal- and external-facing product offerings. The ideal candidate will have a strong background in software engineering or data engineering, a passion for working with and modeling data, and the ability to design and develop scalable data pipelines using industry-standard tools. In addition, the candidate should be able to lead communication and coordination with internal stakeholders to determine and clarify requirements around specific data requests.
Duties and responsibilities
- Design, build, and maintain robust and scalable data pipelines to collect, process, and store data from various sources (e.g., S3, APIs, and databases)
- Develop and optimize data models to support the needs of external product initiatives, as well as requests from internal stakeholders (e.g., data scientists, analysts, and finance)
- Create and maintain documentation for data pipelines and data models
- Implement data quality checks, validation processes, and monitoring to maintain high data integrity
- Identify and resolve performance bottlenecks in data pipelines and databases for efficient data processing
- Provide guidance and mentorship to junior data engineers by promoting coding and development best practices (testing, code reviews, etc.)
Experience & requirements
- Expert in Python and SQL
- Expert in AWS, including Step Functions, Lambda, Batch, and API Gateway
- Experience with data modeling and relational databases, including PostgreSQL
- Experience with Snowflake or a related data warehousing platform
- Experience with Infrastructure as Code, including Terraform
- Experience with CI/CD and common development workflows (Git, code reviews, unit testing, integration testing, systems testing)
- Experience with containerization (Docker, Docker Compose, etc.)
- BS in Computer Science or a related field, plus 4 years of industry experience, or equivalent
Pluses
- Experience with dbt
- Experience working with scientific or wastewater data