Job Description
The Data Platform Architect will play a pivotal role in designing the infrastructure and processes that enable delivery of enterprise data products that meet business needs. This position involves designing and optimizing infrastructure supporting SingleStore, Snowflake, DBT, Kubernetes, and other cloud-native tools, while ensuring the performance, cost-efficiency, and security of data solutions. The Data Platform Architect will also be responsible for maintaining comprehensive documentation, conducting data and integration tests, and collaborating with IT partners to implement robust data solutions.
Responsibilities:
- Design and deliver enterprise data products and systems.
- Implement new cloud infrastructure.
- Deliver enhancements and bug fixes to address data quality, performance, and delivery issues.
- Create and maintain external and internal documentation and standards to optimize for performance, cost, and efficiency.
- Build data and integration tests using DBT and other tools.
- Produce performance and cost analysis documents.
- Analyze and report on security policies.
- Participate in daily stand-up meetings.
- Analyze source data and produce reports summarizing its contents.
- Participate in user forums and collect feedback to improve data handling across the organization.
- Answer questions from the user community about data and processes.
Qualifications:
- Minimum 5 years of experience developing/architecting on Big Data/AI platforms.
- Minimum 2 years of experience building enterprise solutions in the public cloud.
- Minimum 2 years of experience designing solutions using distributed systems (Apache Spark, Snowflake, etc.).
- Minimum 4 years of software development experience using Python or a comparable language.
- Minimum 3 years of experience using Jira or other agile software development tools.
- Excellent written and verbal communication skills.
- Experience designing and implementing role-based access control policies in an enterprise environment.