Data Engineer
Updated on 2/8/2024
Barbaricum

201-500 employees

Veteran-owned firm providing strategic, tech-enabled government solutions.
Company Overview
Barbaricum is a Service-Disabled Veteran-Owned small business focused on leveraging emerging technologies to support long-term client goals, particularly in National Security. Its commitment to quality and continuous improvement is reflected in its ISO 9001:2015 certification and CMMI Level 3 appraisal, while rapid growth and recognition from institutions such as Inc. Magazine and GovCon attest to its industry leadership. With a dynamic team providing global support across five continents, Barbaricum fosters a vibrant corporate culture that earned it recognition as a Best Workplace for 2017 by Inc. Magazine.
Consulting

Company Stage: N/A
Total Funding: N/A
Founded: 2008
Headquarters: Washington, District of Columbia

Growth & Insights (Headcount)
  • 6 month growth: 3%
  • 1 year growth: 5%
  • 2 year growth: 20%
Locations
Springfield, VA, USA
Experience Level
Entry
Junior
Mid
Senior
Expert
Desired Skills
Python
Git
Data Structures & Algorithms
Linux/Unix
Data Analysis
Categories
Data & Analytics
Requirements
  • Active DoD TS/SCI Clearance (CI Polygraph preferred)
  • 4+ years of experience in the computer science and/or software development field
  • Bachelor’s Degree in Computer Science or Software Engineering
  • Advanced coding skills in Python
  • Experience coding on Linux command line
  • Experience planning and managing new application development architecture and engineering
Responsibilities
  • Collaborate with a team of software and data engineers to solve technical challenges
  • Process, analyze, and search petabytes of data while developing analytics tools
  • Write algorithms that apply geo-temporal analytics across trillions of records
  • Provide direct support to a Tier-One SOF customer focused on achieving real-world outcomes
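The geo-temporal analytics described above can be illustrated with a minimal, standard-library Python sketch (not from the listing; the cell size, hour bucketing, and sample records are hypothetical) that bins records into lat/lon grid cells per hour and counts density per cell:

```python
from collections import Counter
from datetime import datetime, timezone

def bucket_key(lat, lon, ts, cell_deg=0.5):
    """Map a record to a coarse geo-temporal cell: a lat/lon grid square plus an hour bucket."""
    hour = datetime.fromtimestamp(ts, tz=timezone.utc).replace(minute=0, second=0, microsecond=0)
    # Floor the coordinates to the cell's lower-left corner.
    return (lat // cell_deg * cell_deg, lon // cell_deg * cell_deg, hour.isoformat())

def density_by_cell(records, cell_deg=0.5):
    """Count records per geo-temporal cell; records are (lat, lon, epoch_seconds) tuples."""
    return Counter(bucket_key(lat, lon, ts, cell_deg) for lat, lon, ts in records)

# Hypothetical sample records: (latitude, longitude, UNIX timestamp).
records = [
    (38.90, -77.03, 1_700_000_000),  # same cell and hour as the next record
    (38.95, -77.20, 1_700_000_100),
    (38.90, -77.03, 1_700_003_700),  # same cell, next hour
]
counts = density_by_cell(records)
```

At scale this counting would run in a distributed engine such as PySpark rather than in-process, but the bucketing logic is the same.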
Desired Qualifications
  • Current or previous mission-focused work experience supporting Special Operations and/or Intelligence Community customers
  • Experience with geospatial data and the development of geospatial-based analytical models
  • Experience with CSV and JSONL ingest into Delta Lake
  • Experience with PySpark ingest, transforms, and aggregations
  • Experience with Elasticsearch queries and aggregations
  • Experience with NiFi for data transfers
  • Experience with DevOps tooling (especially GitLab and Ansible)
  • Experience with Neo4j or graph products
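As a sketch of the Elasticsearch "queries and aggregations" experience the list asks for, the snippet below builds a representative aggregation request body as a Python dict (the field names `timestamp` and `source.keyword` are hypothetical, not from the listing): a daily `date_histogram` with a nested `terms` sub-aggregation over the last seven days.

```python
import json

# Hypothetical Elasticsearch request body: no hits returned (size 0),
# bucketed by day, with the top 10 source values inside each day bucket.
query = {
    "size": 0,
    "query": {"range": {"timestamp": {"gte": "now-7d/d"}}},
    "aggs": {
        "per_day": {
            "date_histogram": {"field": "timestamp", "calendar_interval": "day"},
            "aggs": {
                "top_sources": {"terms": {"field": "source.keyword", "size": 10}}
            },
        }
    },
}

print(json.dumps(query, indent=2))
```

In practice this body would be sent to a cluster via the official `elasticsearch` Python client's `search` call; building it as plain data keeps the example self-contained.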