Data Engineer
Posted on 11/2/2023
INACTIVE
Develops electric vertical aircraft and charging infrastructure.
Company Overview
BETA Technologies stands out in the electric transportation industry with its focus on safety, reliability, and sustainability, demonstrated through the development of ALIA, a technologically advanced electric vertical aircraft (EVA) with inherent stability and easy maneuverability. The company's commitment to creating an extensive charging infrastructure and its pragmatic approach to certification underscore its industry leadership. BETA's philosophy of achieving perfection through simplicity, as reflected in their products and platform, sets it apart from competitors and contributes to a culture of focused and purposeful innovation.
Industrial & Manufacturing
Food & Agriculture
Hardware
B2B
Company Stage
Series B
Total Funding
$886M
Founded
2017
Headquarters
South Burlington, Vermont
Growth & Insights
Headcount
6 month growth: ↑ 19%
1 year growth: ↑ 51%
2 year growth: ↑ 135%
Locations
Burlington, VT, USA
Experience Level
Entry
Junior
Mid
Senior
Expert
Desired Skills
AWS
Data Structures & Algorithms
Git
Airflow
Redshift
REST APIs
SQL
Apache Beam
Python
Categories
Data & Analytics
Requirements
- Bachelor's or Master's degree in Computer Science, Statistics, Software Engineering, or a relevant field
- 3 years of experience in a Cloud/Big Data Engineering role
- Familiarity with cloud platforms (AWS preferred)
- Experience architecting and programming large-scale software applications in Python
- Advanced working knowledge of SQL, including experience with relational databases and query authoring, as well as working familiarity with a variety of databases, including columnar (e.g., Redshift) and non-relational (e.g., DynamoDB)
- Aptitude for rapidly learning and integrating with third-party APIs
- Experience building and optimizing 'big data' data pipelines, architectures and data sets
- Experience working with message queuing, stream processing, and highly scalable 'big data' data stores
- Experience working with Git version control and CI/CD systems
- Strong project management and organizational skills
- Exceptional troubleshooting skills with the ability to spot issues before they become problems
- Excellent communication skills, both written and verbal
- Experience supporting and working with cross-functional teams in a dynamic environment
- Experience developing Infrastructure as Code (IaC) using AWS CDK, CloudFormation, or Terraform
- Proficiency in building RESTful APIs and web services
- Experience with Apache big data tools such as Airflow, Avro, Beam, Parquet, etc. (a brief illustrative sketch follows this list)
- Familiarity with production applications such as Application Lifecycle Management (ALM), Product Lifecycle Management (PLM), Manufacturing Execution System (MES), Quality Management System (QMS), and Enterprise Resource Planning (ERP) systems
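To give a concrete flavor of the pipeline tooling named above, here is a minimal sketch of a scheduled extract-and-load job, assuming Airflow 2.4+ and Python; the DAG name, task names, and data sources are hypothetical placeholders rather than BETA's actual systems:

```python
# Minimal sketch of a daily extract-and-load pipeline (assumes Airflow 2.4+).
# All DAG, task, and data-source names are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_records(**context):
    # Placeholder: pull raw records from a third-party REST API or an S3 drop.
    print("extracting records for", context["ds"])


def load_records(**context):
    # Placeholder: load cleaned records into a columnar warehouse such as Redshift.
    print("loading records for", context["ds"])


with DAG(
    dag_id="example_daily_elt",   # hypothetical name
    start_date=datetime(2023, 1, 1),
    schedule="@daily",            # Airflow 2.4+; older 2.x versions use schedule_interval
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_records)
    load = PythonOperator(task_id="load", python_callable=load_records)

    extract >> load               # load runs only after extract succeeds
```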
Responsibilities
- Develop, construct, test, and maintain analytic databases and scalable data pipelines
- Define performant data models to power dashboards and reporting, maintaining their coherence and evolution over time
- Develop and manage integrations with third-party APIs and services, ensuring stable, secure, and efficient data exchanges
- Work closely with data producers to ensure data is available, reliable, and ready for analysis
- Collaborate closely with data analysts to optimize data queries and visualizations
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc
- Implement data quality processes, ensuring the accuracy and consistency of data (a brief illustrative sketch follows this list)
- Keep abreast of emerging technologies and trends in data engineering, introducing innovations and best practices to the team and organization
- Maintain documentation of code, algorithms, and data definitions, ensuring clarity for other team members and stakeholders
- Leverage cloud-based solutions appropriately, ensuring scalability, resilience, and cost-effectiveness
- Effectively communicate with both technical and non-technical stakeholders, translating complex data concepts into understandable insights and recommendations
- Design, write, test, and deploy production-ready code
- Actively contribute to a collaborative team environment, fostering open communication, mutual respect, and a unified vision to achieve shared goals and drive business value
- Review team members' work to ensure quality, consistency, and conformity to best practices
- Work with data and subject matter experts to strive for greater functionality in our data systems
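As a small illustration of the data quality responsibility noted above, the sketch below counts rows with missing values in a single column; the table, column, and in-memory SQLite database are hypothetical stand-ins for a real warehouse connection:

```python
# Minimal data-quality check sketch; table, column, and data are hypothetical,
# and SQLite stands in for a real warehouse driver such as redshift_connector.
import sqlite3


def count_null_rows(conn, table: str, column: str) -> int:
    """Return how many rows have a NULL in the given column."""
    cur = conn.execute(f"SELECT COUNT(*) FROM {table} WHERE {column} IS NULL")
    return cur.fetchone()[0]


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE battery_cycles (cycle_id INTEGER, charge_kwh REAL)")
    conn.executemany(
        "INSERT INTO battery_cycles VALUES (?, ?)",
        [(1, 42.0), (2, None), (3, 39.5)],
    )
    nulls = count_null_rows(conn, "battery_cycles", "charge_kwh")
    # A real pipeline might fail the run or raise an alert when nulls exceed a threshold.
    print(f"{nulls} rows are missing charge_kwh")
```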