Position Overview
We are seeking an experienced Data Architect to design and optimize our cloud-based data architecture on the Snowflake platform. This role will be responsible for establishing data architecture standards, designing scalable data solutions, and leading the technical implementation of our data warehouse and analytics infrastructure.
Key Responsibilities
Architecture & Design
- Design and architect enterprise-scale data solutions on the Snowflake platform
- Develop data architecture blueprints, data models, and integration patterns
- Define data governance frameworks and best practices for data management
- Create technical specifications and documentation for data architecture solutions
- Establish standards for data security, privacy, and compliance within the Snowflake environment
Technical Implementation
- Lead implementation of Snowflake data warehouse solutions including databases, schemas, tables, and views
- Design and optimize data pipelines using Snowflake features (Streams, Tasks, Pipes)
- Implement data transformation processes using SQL, stored procedures, and Snowflake's native capabilities
- Configure and manage Snowflake security features including RBAC, network policies, and data encryption
- Optimize warehouse performance through clustering, partitioning, and resource management
- Design and implement AI/ML solutions using Snowflake Cortex capabilities including ML Functions, LLM Functions, and Search Services
- Architect data science workflows leveraging Snowpark for Python and Snowpark ML
Data Integration & ETL/ELT
- Design and implement data integration solutions connecting various source systems to Snowflake
- Architect ELT processes using tools like dbt, Fivetran, Stitch, or custom solutions
- Establish data quality frameworks and monitoring processes
- Design real-time and batch data processing workflows
- Implement data lineage and metadata management solutions
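As an illustration of the kind of data-quality framework the bullets above describe, here is a minimal stdlib-Python sketch of a row-level check. All field names and rules are hypothetical, not part of the role description:

```python
# Illustrative sketch of a row-level data-quality check; in practice such
# rules would run inside a framework like dbt tests or a custom monitor.

def check_rows(rows, required_fields, non_negative_fields=()):
    """Return a list of (row_index, problem) tuples for rows that fail checks."""
    problems = []
    for i, row in enumerate(rows):
        for field in required_fields:
            if row.get(field) in (None, ""):
                problems.append((i, f"missing {field}"))
        for field in non_negative_fields:
            value = row.get(field)
            if isinstance(value, (int, float)) and value < 0:
                problems.append((i, f"negative {field}"))
    return problems

# Hypothetical source rows: the second fails both checks.
orders = [
    {"order_id": "A1", "amount": 25.0},
    {"order_id": "", "amount": -3.0},
]
issues = check_rows(orders, required_fields=["order_id"],
                    non_negative_fields=["amount"])
# → [(1, "missing order_id"), (1, "negative amount")]
```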
AI/ML & Advanced Analytics
- Design and implement AI/ML architectures using Snowflake Cortex platform features
- Leverage Snowflake Cortex ML Functions for model training, inference, and feature engineering
- Implement Large Language Model (LLM) solutions using Cortex LLM Functions for text analysis and generation
- Design vector search capabilities using Snowflake Cortex Search Services for semantic search and RAG applications
- Build end-to-end ML pipelines using Snowpark for Python and Snowpark ML
- Implement MLOps practices including model versioning, monitoring, and automated deployment
- Partner with data scientists to productionize machine learning models within the Snowflake ecosystem
Collaboration & Leadership
- Partner with data engineers, analysts, and business stakeholders to understand requirements
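To make the semantic-search and RAG bullets concrete, the retrieval step reduces to ranking documents by cosine similarity between embedding vectors. The toy 3-dimensional vectors and document names below are stand-ins for real embeddings; a managed service such as a vector search index replaces this brute-force scan in production:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def top_k(query_vec, docs, k=1):
    """docs: list of (doc_id, vector). Return the k most similar doc_ids."""
    ranked = sorted(docs, key=lambda d: cosine(query_vec, d[1]), reverse=True)
    return [doc_id for doc_id, _ in ranked[:k]]

# Hypothetical document embeddings (toy dimensionality).
docs = [("refund_policy", [0.9, 0.1, 0.0]),
        ("shipping_faq", [0.1, 0.9, 0.0])]
best = top_k([0.8, 0.2, 0.0], docs, k=1)
# → ["refund_policy"]
```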
- Provide technical guidance and mentorship to development teams
- Collaborate with DevOps teams on deployment automation and infrastructure as code
- Work with security teams to ensure data protection and compliance requirements
- Present architecture decisions and recommendations to senior leadership
Performance & Optimization
- Monitor and optimize Snowflake performance, cost, and resource utilization
- Implement auto-scaling and resource management strategies
- Conduct regular performance tuning and capacity planning
- Establish monitoring and alerting for data pipeline health and performance
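The monitoring and alerting bullet above often comes down to threshold rules over run metrics. A minimal sketch, with hypothetical metric names and thresholds:

```python
# Illustrative alert rule for pipeline health; thresholds are made-up
# defaults, not recommended production values.

def should_alert(metrics, max_runtime_s=3600, max_failures=0, min_rows=1):
    """Flag a pipeline run whose metrics breach any health threshold."""
    return (metrics["runtime_s"] > max_runtime_s
            or metrics["failed_tasks"] > max_failures
            or metrics["rows_loaded"] < min_rows)

healthy = {"runtime_s": 420, "failed_tasks": 0, "rows_loaded": 10_000}
stalled = {"runtime_s": 5400, "failed_tasks": 0, "rows_loaded": 0}
```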
Required Qualifications
Technical Skills
- 5+ years of experience in data architecture and data warehousing
- 3+ years of hands-on experience with the Snowflake platform
- 2+ years of experience with AI/ML implementations, preferably using Snowflake Cortex or similar platforms
- Expert-level SQL skills and experience with Snowflake-specific SQL features
- Strong understanding of cloud platforms (AWS, Azure, or GCP) and their data services
- Experience with data modeling techniques (dimensional modeling, data vault, etc.)
- Proficiency with data integration tools and ETL/ELT processes
- Knowledge of modern data stack tools (dbt, Airflow, Fivetran, etc.)
- Hands-on experience with Snowflake Cortex features including ML Functions, LLM Functions, and Search Services
- Proficiency in Python for data science and machine learning workflows
- Experience with Snowpark for Python and Snowpark ML for ML pipeline development
Certifications (Preferred)
- Snowflake Data Engineer or Architect certification
- Cloud platform certifications (AWS Solutions Architect, Azure Data Engineer, etc.)
- Additional data-related certifications (Databricks, dbt, etc.)
Additional Technical Skills
- Experience with Infrastructure as Code (Terraform, CloudFormation)
- Knowledge of programming languages (Python, Java, or Scala) with emphasis on Python for ML workflows
- Understanding of data governance and data quality frameworks
- Experience with BI tools (Tableau, Power BI, Looker)
- Familiarity with streaming data technologies (Kafka, Kinesis)
- Knowledge of machine learning frameworks (scikit-learn, TensorFlow, PyTorch)
- Experience with vector databases and semantic search technologies
- Understanding of Large Language Models (LLMs) and Natural Language Processing (NLP)
- Familiarity with MLOps practices and tools (MLflow, Kubeflow, etc.)
- Experience with Retrieval-Augmented Generation (RAG) architectures
Preferred Qualifications
- Bachelor's degree in Computer Science, Information Systems, or related field
- 7+ years of overall experience in data and analytics roles with 2+ years in AI/ML implementations
- Experience in multiple cloud environments and migration projects
- Previous experience leading data architecture initiatives in enterprise environments
- Strong understanding of data privacy regulations (GDPR, CCPA, etc.)
- Experience with machine learning and advanced analytics platforms
- Background in implementing production ML systems and MLOps practices
- Experience with generative AI and LLM integration in enterprise applications
Key Competencies
Technical Leadership
- Ability to translate business requirements into technical solutions
- Strong problem-solving and analytical thinking skills
- Experience making architectural decisions in complex environments
- Ability to evaluate and recommend new technologies and tools
Communication & Collaboration
- Excellent written and verbal communication skills
- Ability to present complex technical concepts to non-technical stakeholders
- Strong collaboration skills with cross-functional teams
- Experience mentoring and developing technical talent
Project Management
- Experience managing multiple concurrent projects and priorities
- Ability to work in agile development environments
- Strong attention to detail and quality standards
- Experience with project management tools and methodologies
What We Offer
- Competitive salary and equity package
- Comprehensive health, dental, and vision insurance
- Professional development budget for certifications and training
- Flexible work arrangements
- Opportunity to work with cutting-edge data technologies
- Collaborative and innovative work environment
Success Metrics
- Successful delivery of scalable and performant data architecture solutions, including AI/ML capabilities
- Reduction in data processing costs and improved query performance
- Implementation of robust data governance and security practices
- High stakeholder satisfaction with data architecture solutions
- Successful deployment of AI/ML models and features using Snowflake Cortex
- Mentorship and development of team members
We are an equal opportunity employer committed to diversity and inclusion. We welcome applications from all qualified candidates.
#LI-DH1