Job Description
As a valued colleague on our team, you will collaborate with the team in designing, producing, testing, and implementing moderately complex software, technology, or processes, as well as creating and maintaining IT architecture, large-scale data stores, and cloud-based systems.
THE IMPACT YOU WILL MAKE
The Big Data Engineer - AWS role will offer you the flexibility to make each day your own, while working alongside people who care, so that you can deliver on the following responsibilities:
- Independently determine the needs of the customer and create solution frameworks.
- Design and develop moderately complex software solutions to meet those needs.
- Use a process-driven approach in designing and developing solutions.
- Implement new software technology and coordinate end-to-end tasks across the team.
- May maintain or oversee the maintenance of existing software.
Qualifications
THE EXPERIENCE YOU BRING TO THE TEAM
Minimum Required Experiences:
- 2+ years with Big Data Hadoop clusters (HDFS, YARN, Hive, MapReduce) and Spark
- 2+ years of recent experience building and deploying applications in AWS (S3, Hive, Glue, AWS Batch, DynamoDB, Redshift, CloudWatch, RDS, Lambda, SNS, SWS, etc.)
Desired Experiences:
- Bachelor's degree or equivalent
- 2+ years of Java (JEE, Swing, Spring) and SQL
- 4+ years of Python, SparkSQL, PySpark
- 4+ years of ETL experience
- Knowledge of the Mortgage and Housing Finance domain
- Knowledge of Spark streaming technologies
- Familiarity with Hadoop / Spark information architecture, Data Modeling, Machine Learning (ML)
- Knowledge of Financial Products, Risk Management, and Portfolio Management is preferred but not mandatory; training will be provided to help you gain ground
- Excellent problem-solving skills and strong verbal and written communication skills
- Ability to work independently as well as part of a team
Skills
- Programming including coding, debugging, and using relevant programming languages
- Working respectfully and cooperatively with people of different functional expertise toward a common goal
- Expertise in service management concepts for networks and related standards such as ITIL practices or SDLC
- Skilled in cloud technologies and cloud computing
- Experience using software and computer systems’ architectural principles to integrate enterprise computer applications such as xMatters, AWS Application Integration, or WebSphere
- Experience defining and managing changes to documents, code, computer programs, websites, and other files to enable collaboration and ensure teams work from the latest version
- Communication, including written and verbal communication, copywriting, and planning and distributing communications
Tools
- Java
- Ab Initio
- Python
- AWS
- SQL
- PySpark