Data Engineer
Barbaricum

51-200 employees

Veteran-owned firm providing strategic, tech-enabled government solutions.
Company Overview
Barbaricum stands out as a Service-Disabled Veteran-Owned small business with a strong focus on leveraging emerging technologies to support long-term client goals, particularly in the realm of National Security. The company's commitment to quality and continuous improvement is evident in its ISO 9001:2015 certification and CMMI Level 3 appraisal, while its rapid growth and recognition by institutions such as Inc. Magazine and GovCon attest to its industry leadership. With a dynamic team providing global support across five continents, Barbaricum fosters a vibrant corporate culture that earned it recognition as an Inc. Magazine Best Workplace for 2017.
Consulting

Company Stage: N/A
Total Funding: N/A
Founded: 2008
Headquarters: Washington, District of Columbia

Growth & Insights
Headcount growth: 1% (6 months), 1% (1 year), 16% (2 years)
Locations
Omaha, NE, USA
Experience Level
Entry
Junior
Mid
Senior
Expert
Desired Skills
Agile
AWS
Data Analysis
Data Science
Data Structures & Algorithms
Git
NumPy
SQL
Python
NoSQL
Categories
Data & Analytics
Requirements
  • Active DoD Top Secret clearance required
  • 8+ years of demonstrated experience in software engineering
  • Bachelor's degree in computer science or a related field. A degree in the physical/hard sciences (e.g., physics, chemistry, biology, astronomy) or other science disciplines (e.g., behavioral, social, or life sciences) may be considered if it includes a concentration of coursework (typically five or more courses) in advanced mathematics and/or is paired with other relevant experience
  • 8+ years of experience working with AWS big data technologies (S3, EC2) and demonstrated experience in distributed data processing, data modeling, ETL development, and/or data warehousing
  • Demonstrated mid-level knowledge of software engineering best practices across the development lifecycle, including agile methodologies, coding standards, code reviews, source management, build processes, testing, and operations
  • 3+ years of experience using analytical concepts and statistical techniques
  • 8+ years of demonstrated experience in mathematics, applied mathematics, statistics, applied statistics, machine learning, data science, operations research, or computer science, particularly in software engineering and/or in designing and implementing machine learning, data mining, advanced analytical algorithms, advanced statistical analysis, or artificial intelligence solutions
Responsibilities
  • Design, implement, and operate data management systems for intelligence needs
  • Use Python to automate data workflows (see the sketch following this list)
  • Design algorithms, databases, and pipelines to access and optimize data retrieval, storage, use, integration, and management across different data regimes and digital systems
  • Work with data users to determine, create, and populate optimal data architectures, structures, and systems; and plan, design, and optimize data throughput and query performance
  • Participate in the selection of backend database technologies (e.g., SQL, NoSQL), their configuration and utilization, and the optimization of the full data pipeline infrastructure to support the actual content, volume, ETL, and periodicity of the data, the intended kinds of queries and analysis, and the expected responsiveness
  • Assist and advise the Government with developing, constructing, and maintaining data architectures
  • Research, study, and present technical information, in the form of briefings or written papers, on relevant data engineering methodologies and technologies of interest to or as requested by the Government
  • Align data architecture, acquisition, and processes with intelligence and analytic requirements
  • Prepare data for predictive and prescriptive modeling, deploying analytics programs, machine learning, and statistical methods to find hidden patterns, identify tasks and processes that can be automated, and make recommendations to streamline data processes and visualizations
  • Design, implement, and support scalable data infrastructure solutions that integrate with multiple heterogeneous data sources, aggregate and retrieve data quickly and safely, and curate data for use in reporting, analysis, machine learning models, and ad hoc data requests
  • Utilize Amazon Web Services (AWS) hosted big data technologies to store, format, process, compute, and manipulate data in order to draw conclusions and make predictions
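
As an illustration of the Python automation and AWS S3 responsibilities above, here is a minimal sketch (not part of the posting) of how such a workflow might be automated: extract a raw CSV object from S3 with boto3, apply a small row-level transform, and load the curated result back for downstream analysis. All bucket names, keys, and the transform itself are hypothetical placeholders.

# Minimal sketch (not from the posting) of a Python-automated S3 workflow:
# extract a raw CSV from S3, apply a small transform, and load the curated
# output back to S3. Bucket names and keys are hypothetical placeholders.
import csv
import io

import boto3  # AWS SDK for Python

RAW_BUCKET = "example-raw-data"      # hypothetical source bucket
CURATED_BUCKET = "example-curated"   # hypothetical destination bucket


def transform_row(row: dict) -> dict:
    # Example transform: trim stray whitespace from every field.
    return {key: (value or "").strip() for key, value in row.items()}


def run_workflow(source_key: str, dest_key: str) -> None:
    s3 = boto3.client("s3")

    # Extract: read the raw CSV object (assumed to have a header row).
    body = s3.get_object(Bucket=RAW_BUCKET, Key=source_key)["Body"].read()
    reader = csv.DictReader(io.StringIO(body.decode("utf-8")))

    # Transform: apply the row-level cleanup.
    rows = [transform_row(row) for row in reader]

    # Load: write the curated CSV back to S3 for downstream use.
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=reader.fieldnames)
    writer.writeheader()
    writer.writerows(rows)
    s3.put_object(Bucket=CURATED_BUCKET, Key=dest_key, Body=out.getvalue())


if __name__ == "__main__":
    run_workflow("incoming/records.csv", "curated/records.csv")

A production version of such a job would typically add error handling, logging, and scheduling or event-driven triggers, but the extract-transform-load shape stays the same.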
Desired Qualifications
  • ArcGIS expertise
  • Experience using Python's NumPy (a brief example follows this list)
  • Familiar with git-based revision control
  • Familiar with DevSecOps analytics development
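
As a purely illustrative note on the NumPy familiarity requested above (not part of the posting), the snippet below computes summary statistics over a made-up numeric column and flags simple outliers; the values and the two-standard-deviation threshold are arbitrary.

# Illustrative only: basic NumPy summary statistics and outlier flagging
# on a made-up numeric column.
import numpy as np

values = np.array([12.1, 11.8, 12.4, 11.9, 55.0, 12.2])  # hypothetical measurements

mean = values.mean()
std = values.std()

# Flag points more than two standard deviations from the mean.
outliers = values[np.abs(values - mean) > 2 * std]
print(f"mean={mean:.2f} std={std:.2f} outliers={outliers}")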