Full-Time

eSimplicity

51-200 employees

Healthcare IT, cybersecurity, and federal telecom

No salary listed

Columbia, MD, USA

Remote

US Citizenship, US Top Secret Clearance Required

Category
Data & Analytics
Required Skills
Agile
Redshift
Python
Airflow
MySQL
Git
Apache Spark
Java
Postgres
AWS
JIRA
Scala
Hadoop
Confluence
Databricks
Requirements
  • All candidates must pass a public trust clearance through the U.S. Federal Government. This requires candidates either to be U.S. citizens or to pass clearance through the Foreign National Government System, which requires that candidates have lived in the United States for at least 3 of the previous 5 years, hold a valid, unexpired passport from their country of birth, and have appropriate visa/work-permit documentation.
  • Minimum of 8 years of prior data engineering or hands-on software development experience, with at least 4 of those years using Python, Java, and cloud technologies for data pipelining.
  • A Bachelor’s degree in Computer Science, Information Systems, Engineering, Business, or other related scientific or technical discipline. With ten years of general information technology experience and at least eight years of specialized experience, a degree is NOT required.
  • Expert data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up.
  • Self-sufficient and comfortable supporting the data needs of multiple teams, systems, and products.
  • Experienced in designing data architecture for shared services, scalability, and performance.
  • Experienced in designing data services including APIs, metadata, and data catalogs.
  • Experienced in data governance processes to ingest (batch, stream), curate, and share data with upstream and downstream data users.
  • Ability to build and optimize data sets, ‘big data’ data pipelines and architectures
  • Ability to perform root cause analysis on external and internal processes and data to identify opportunities for improvement and answer questions
  • Excellent analytic skills associated with working on unstructured datasets
  • Ability to build processes that support data transformation, workload management, data structures, dependencies, and metadata
  • Demonstrated understanding and experience using software and tools including big data tools such as Spark and Hadoop; relational databases including MySQL and Postgres; workflow management and pipeline tools such as Apache Airflow and AWS Step Functions; AWS cloud services including Redshift, RDS, EMR, and EC2; stream-processing systems such as Spark Streaming and Storm; and object-oriented scripting languages including Scala, Java, and Python.
  • Flexible and willing to accept a change in priorities as necessary.
  • Ability to work in a fast-paced, team-oriented environment
  • Experience with Agile methodology, using test-driven development.
  • Experience with GitHub and Atlassian Jira/Confluence.
  • Excellent command of written and spoken English.
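The pipeline-building and data-wrangling skills above can be illustrated with a minimal batch ETL sketch in Python. This is a hypothetical illustration, not part of the posting: it uses only the standard library, with sqlite3 standing in for a warehouse such as Postgres or Redshift, and all record, table, and column names are invented for the example.

```python
import sqlite3

# Hypothetical raw records, standing in for rows extracted from a source system.
RAW_CLAIMS = [
    {"claim_id": "C1", "amount": "125.50", "state": "md"},
    {"claim_id": "C2", "amount": None, "state": "VA"},      # missing amount
    {"claim_id": "C1", "amount": "125.50", "state": "md"},  # duplicate key
]

def transform(rows):
    """Curate raw rows: deduplicate by claim_id, drop rows with no amount,
    cast amounts to float, and normalize state codes to uppercase."""
    seen, out = set(), []
    for r in rows:
        if r["amount"] is None or r["claim_id"] in seen:
            continue
        seen.add(r["claim_id"])
        out.append((r["claim_id"], float(r["amount"]), r["state"].upper()))
    return out

def load(rows, conn):
    """Load curated rows into a warehouse table (sqlite3 stands in here)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS claims "
        "(claim_id TEXT PRIMARY KEY, amount REAL, state TEXT)"
    )
    conn.executemany("INSERT OR REPLACE INTO claims VALUES (?, ?, ?)", rows)
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    load(transform(RAW_CLAIMS), conn)
    print(conn.execute("SELECT COUNT(*) FROM claims").fetchone()[0])  # prints 1
```

In a production pipeline of the kind the posting describes, each step would typically run as a task in an orchestrator such as Apache Airflow, with the extract reading from a real source and the load targeting Redshift or Postgres.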
Responsibilities
  • Develop, expand, and optimize our data and data pipeline architecture, as well as data flow and collection for cross-functional teams.
  • Support software developers, database architects, data analysts, and data scientists on data initiatives, and ensure optimal data delivery architecture is consistent across ongoing projects.
  • Create new pipelines and maintain existing ones; update Extract, Transform, Load (ETL) processes; create new ETL features; and build PoCs with Redshift Spectrum, Databricks, AWS EMR, SageMaker, etc.
  • Implement, with support from project data specialists, large-dataset engineering: data augmentation, data quality analysis, data analytics (anomalies and trends), data profiling, data algorithms, measuring and developing data maturity models, and developing data strategy recommendations.
  • Operate large-scale data processing pipelines and resolve business and technical issues pertaining to processing and data quality.
  • Assemble large, complex data sets that meet functional and non-functional business requirements.
  • Identify, design, and implement internal process improvements, including re-designing data infrastructure for greater scalability, optimizing data delivery, and automating manual processes.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from various data sources using AWS and SQL technologies.
  • Build analytical tools that utilize the data pipeline, providing actionable insight into key business performance metrics including operational efficiency and customer acquisition.
  • Work with data, design, product, and government stakeholders, and assist them with data-related technical issues.
  • Write unit and integration tests for all data processing code.
  • Work with DevOps engineers on CI, CD, and IaC.
  • Read specs and translate them into code and design documents.
  • Perform code reviews and develop processes for improving code quality.
  • Perform other duties as assigned.
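The unit-testing and data-quality responsibilities above can be sketched with a small, pure profiling function and a test against a fixture. This is a hypothetical illustration, not part of the posting; the function, column names, and fixture data are all invented for the example.

```python
from collections import Counter

def null_rates(rows, columns):
    """Per-column fraction of missing values (None or empty string) --
    a simple data-profiling check useful when diagnosing pipeline
    data-quality issues."""
    if not rows:
        return {c: 0.0 for c in columns}
    missing = Counter()
    for row in rows:
        for c in columns:
            if row.get(c) in (None, ""):
                missing[c] += 1
    return {c: missing[c] / len(rows) for c in columns}

# A unit test in the style the posting asks for: assert on a small fixture.
def test_null_rates():
    rows = [
        {"bene_id": "B1", "npi": "123"},
        {"bene_id": "B2", "npi": None},
        {"bene_id": "",   "npi": "456"},
        {"bene_id": "B4", "npi": "789"},
    ]
    assert null_rates(rows, ["bene_id", "npi"]) == {"bene_id": 0.25, "npi": 0.25}

if __name__ == "__main__":
    test_null_rates()
    print("ok")
```

Keeping checks like this as pure functions makes them easy to exercise in unit tests and to run inside CI alongside the integration tests the role calls for.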
Desired Qualifications
  • Federal Government contracting work experience.
  • Databricks certification, Google Professional Data Engineer certification, IBM Certified Data Engineer – Big Data certification, or Cloudera CCP Data Engineer certification.
  • Centers for Medicare and Medicaid Services (CMS) or healthcare industry experience.
  • Experience with healthcare quality data including Medicaid and CHIP provider data, beneficiary data, claims data, and quality measure data.

eSimplicity delivers Healthcare IT, Cybersecurity, and Telecommunications solutions to improve healthcare quality, expand coverage, and lower costs while protecting national interests. Its Digital Services engineers build modern systems to fight fraud, waste, and abuse in healthcare and to enhance digital services. The Telecommunications team helps the DoD manage spectrum internationally, and a cleared engineering group supports the Department of Homeland Security in safeguarding national security. Compared with peers, eSimplicity combines federal-facing IT, security, and communications work with a track record of reinvention and mission-focused projects for DoD, DHS, and federal agencies. The company aims to improve the health and lives of Americans and defend national security by delivering reliable IT and communications solutions through federal partnerships.

Company Size

51-200

Company Stage

N/A

Total Funding

N/A

Headquarters

Rockville, Maryland

Founded

2016

Simplify Jobs

Simplify's Take

What believers are saying

  • NIH multiyear NHLBI contract accelerates heart disease research discoveries.
  • Cloud-native data platforms boost federal analytics and cross-agency collaboration.
  • A September 30, 2025 federal award signals sustained procurement wins.

What critics are saying

  • Boeing and Lockheed Martin dominate DoD spectrum contracts within 6-12 months.
  • Leidos displaces eSimplicity in NIH deals via cloud-AI in 12-18 months.
  • HHS mandates exclude eSimplicity from $500M+ AI opportunities in 12-24 months.

What makes eSimplicity unique

  • eSimplicity modernizes NHLBI legacy systems for NIH since September 2025.
  • CMS awarded eSimplicity $4M in 2020-2021 for AI-enabling cloud migration.
  • eSimplicity engineers DoD spectrum management and DHS cleared security solutions.

Benefits

Health Insurance

Hybrid Work Options

Company News

ExecutiveBiz
Mar 11th, 2026
NIH Selects eSimplicity for Heart Institute IT Modernization Contract

IT services firm eSimplicity has secured a multiyear contract from the National Institutes of Health to provide IT modernization support for the agency's National Heart, Lung and Blood Institute, or NHLBI. As agencies pursue IT modernization initiatives like the NHLBI effort, government and industry leaders will gather to discuss emerging technologies and mission-driven innovation. Register now for the 2026 Digital Transformation Summit on April 22 to join experts as they explore how digital tools, data and AI are reshaping federal operations.

What is the scope of the NHLBI contract? In a LinkedIn post published Tuesday, eSimplicity said it will help modernize legacy systems and support software development standardization across NHLBI. The company will also support the development of a secure and integrated digital enterprise environment designed to improve collaboration across NHLBI teams. The modernization effort seeks to help scientists and innovators accelerate discoveries related to heart, lung, blood and sleep conditions that affect millions of Americans. "It's a privilege to work alongside such talented teams and help drive new ideas through NHLBI. We take pride in making technology work smarter, safer, and faster so we can deliver what really matters to our partners," said AnhThu Nguyen, founder and CEO of eSimplicity.

What is eSimplicity? Founded in 2016, eSimplicity is a technology company focused on providing data modernization, enterprise systems and transformation, spectrum engineering and management, artificial intelligence and machine learning, and digital experience support for federal civilian, defense, intelligence and health agencies.

Discoveries in Health Policy
Sep 1st, 2024
Highlights: From the Newsletter Politico Pulse: HHS & AI

- In 2020 and 2021, the Centers for Medicare and Medicaid Services awarded tech consulting firm eSimplicity nearly $4 million to plan how to efficiently transition its data to cloud services that could support AI use.