Senior Data Engineer (Python/ETL)
Brain Corp is a San Diego-based AI company creating transformative core technology for the robotics industry. Our diverse engineering teams in Software, Hardware Design, and Embedded Systems are creating comprehensive solutions to support the builders of today's autonomous machines in successfully producing, deploying, and supporting commercial robots across industries and applications. Brain Corp is funded by the SoftBank Vision Fund and Qualcomm Ventures. For more information please visit: https://www.braincorp.com/
As a member of the Software Engineering team, the Senior Data Engineer owns the definition, execution, and delivery of technical architecture for Brain Corporation’s Product Analytics team. Your work will improve the quality, reliability, accuracy, and consistency of our data. You will help build project-specific data pipelines and validation tools. You will work hands-on with product specialists, understanding their data requirements and then implementing those requirements in software.
Duties and Responsibilities:
- Fully own critical portions of Brain Corporation’s product analytics data model.
- Collaborate with stakeholders to understand needs, model tables using data warehouse best practices, and develop data pipelines to ensure the timely delivery of high-quality data.
- Work with technical and product stakeholders to understand data-oriented project requirements.
- Write code that is understandable, simple, and easy to maintain.
- Design, implement, and ensure the accuracy of validation and related services for data models.
- Think and work in an agile manner, including automated testing, continuous integration, and continuous deployment.
- Manage numerous requests concurrently and strategically, prioritizing when necessary.
What you need:
- Bachelor’s degree in Computer Science or a related field.
- 5-8+ years of software development experience.
- Passion for data quality and for delivering effective data that impacts the business.
- Experience with sourcing and modeling data from application REST-like APIs.
- Experience with MPP/cloud data warehouse solutions (e.g., Snowflake).
- Proficiency in SQL and experience writing ETL pipelines.
Things that make a difference:
- Master’s degree in Computer Science or a related field.
- Experience with Kafka, AMQP, or another message bus.
- Experience with Express.js or NestJS.
- Familiarity with SAML and Okta.
- Azure experience is a plus.
- Being comfortable outside of your comfort zone - explore new tech, make your own tool, or find a new way to address an old problem.
The physical demands described here are representative of those that must be met by an employee to successfully perform the essential functions of this job. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions. Essential functions may require maintaining the physical condition necessary for sitting, walking, or standing for periods of time; operating a computer and keyboard; talking and hearing at normal room levels; using hands to finger, grasp, and feel; repetitive motion; close visual acuity to prepare and analyze data and figures; transcribing; viewing a computer terminal; extensive reading; and lifting, pushing, carrying, or pulling up to 10 pounds.
The work environment characteristics described here are representative of those an employee encounters while performing the essential functions of this job. The noise level in the work environment is usually quiet to moderate. The employee is exposed to a typical office environment with computers, printers, and telephones.