Senior Software Engineer
Updated on 4/6/2024
UiPath

1,001-5,000 employees

AI-powered business automation platform developer
Company Overview
UiPath, a global leader in AI-powered business automation, fosters a culture of innovation and inclusivity that attracts top talent to drive the next leap in automation technology. Its competitive advantage lies in a robust, user-friendly automation platform that has helped clients as diverse as Uber and Xerox save millions of dollars and thousands of hours, demonstrating its industry leadership. Committed to making the world a better place through automation, UiPath offers a dynamic work environment that encourages creativity and problem-solving, making it an ideal workplace for anyone passionate about technology's potential to transform businesses.
Robotics & Automation
Fintech
AI & Machine Learning

Company Stage: Series F
Total Funding: $2B
Founded: 2005
Headquarters: New York, New York

Growth & Insights
Headcount
6 month growth: 6%
1 year growth: 17%
2 year growth: 6%
Locations
Bellevue, WA, USA
Experience Level
Senior
Expert
Desired Skills
Microsoft Azure
Python
Airflow
Apache Flink
Apache Spark
SQL
Java
AWS
Scala
Snowflake
Google Cloud Platform
Categories
Data Engineering
Backend Engineering
Software Engineering
Requirements
  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
  • Proven track record (8 years’ experience) of architecting and engineering world-class, large-scale commercial applications and services
  • 3+ years of experience in data engineering, ETL, or a similar role, with a strong focus on data pipeline development and optimization
  • Proficiency in programming languages such as Python, Java, or Scala
  • Experience with big data technologies and frameworks, such as Snowflake, Spark, Flink, or other similar technologies
  • Familiarity with data pipeline orchestration tools, such as Apache Airflow, Luigi, Azure Data Factory, or similar tools
  • Strong knowledge of SQL and experience working with various databases and query engines (e.g., Snowflake, Spark SQL, MSSQL), as well as transformation tooling such as dbt
  • Experience with cloud-based data storage and processing platforms, such as AWS, GCP, or Azure
  • Strong problem-solving skills and the ability to work independently as well as collaboratively in a team environment
  • Excellent communication skills, with the ability to articulate complex technical concepts to non-technical stakeholders
Responsibilities
  • Design, develop, and maintain data pipelines that integrate data from various sources, ensuring efficient and reliable data ingestion and processing
  • Collaborate with product teams, data scientists, analysts, and other stakeholders to gather requirements and understand data needs
  • Optimize data pipelines for quality, performance, and scalability, applying best practices in data management and ETL (extract, transform, load) processes
  • Monitor and troubleshoot data pipeline issues, implementing fixes and improvements as needed to ensure timely and reliable data delivery
  • Stay current with industry trends and technologies, incorporating new techniques and tools to continuously improve the data pipeline architecture and performance