Full-Time

Senior Software Engineer

Edge Infrastructure

Posted on 7/10/2025

Armada

501-1,000 employees

Real-time supply chain visibility platform

Compensation Overview

$144k - $180k/yr

+ Equity

Bellevue, WA, USA

In Person

Category
Software Engineering
Requirements
  • 7+ years of professional software development experience, with a strong focus on backend and infrastructure development
  • Expert-level proficiency in Golang, with a solid understanding of Go's paradigms, idioms, and best practices. Experience with other languages like Java or Python is a plus
  • Proven track record of designing and developing large-scale, distributed systems, preferably in a resource-constrained or edge computing environment
  • Strong experience with containerization technologies (Docker, Kubernetes)
  • Experience building and managing scalable, secure, and high-performance services and APIs
  • Experience with cloud computing platforms (AWS, Azure, or GCP)
  • Strong understanding of concurrency, multi-threading, and non-blocking I/O
  • Experience with monitoring and management tools (Prometheus, Grafana, ELK stack)
  • Excellent analytical, problem-solving, and communication skills
  • Bachelor's degree or higher in computer science or a related technical field
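The concurrency requirement above can be illustrated with a minimal Go sketch of a channel-based worker pool — function and type names here are hypothetical, not from any Armada codebase:

```go
package main

import (
	"fmt"
	"sync"
)

// square is a stand-in for a unit of backend work.
func square(n int) int { return n * n }

// workerPool fans jobs out to nWorkers goroutines over a channel
// and collects results without shared mutable state.
func workerPool(nWorkers int, jobs []int) []int {
	in := make(chan int)
	out := make(chan int, len(jobs)) // buffered so workers never block on send

	var wg sync.WaitGroup
	for w := 0; w < nWorkers; w++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for n := range in {
				out <- square(n)
			}
		}()
	}

	for _, j := range jobs {
		in <- j
	}
	close(in)  // signal workers there is no more input
	wg.Wait()  // wait for all workers to drain
	close(out) // now safe: no further sends

	results := make([]int, 0, len(jobs))
	for r := range out {
		results = append(results, r)
	}
	return results
}

func main() {
	results := workerPool(3, []int{1, 2, 3, 4})
	sum := 0
	for _, r := range results {
		sum += r
	}
	fmt.Println(sum) // result order varies between runs; the sum (30) does not
}
```

The channel-plus-WaitGroup pattern is idiomatic Go for the kind of backend fan-out work this role describes; results arrive in nondeterministic order, which is why the example aggregates them.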
Responsibilities
  • Architect, design, and develop robust, scalable, and highly available infrastructure services for our ruggedized Galleon data centers, primarily in Golang
  • Develop critical infrastructure components hands-on, including workload and service management, for the platform
  • Decompose complex infrastructure problems into simple, straightforward solutions, providing mechanisms for prioritization and rapid execution
  • Develop and maintain APIs (RESTful and/or gRPC) for internal and external consumption, enabling interaction with the platform's services
  • Optimize services for maximum speed, scalability, and resource utilization, ensuring high availability and responsiveness in edge environments
  • Work with cloud platforms (AWS, GCP, Azure) and deploy applications in containerized environments (Docker, Kubernetes) within the context of Galleon and Commander
  • Develop and implement automation tools and scripts (e.g., using Python, Ansible, or Terraform) to streamline infrastructure operations and deployments
  • Implement Zero Touch Provisioning solutions and software to enable rapid deployment of Galleon data centers
  • Create and maintain comprehensive documentation for architecture, code, and operational procedures
  • Collaborate closely with other engineering teams (including networking, DevOps, frontend, and product) to ensure seamless integration of the infrastructure with the overall Armada.ai platform
  • Participate in code reviews, providing and receiving constructive feedback
Desired Qualifications
  • Deep understanding of Kubernetes architecture and operational best practices (including cluster management, resource optimization, and security), as well as operators and custom resource definitions (CRDs)
  • Proven experience in designing and implementing CI/CD pipelines for Kubernetes deployments using tools like Jenkins, GitLab CI, or ArgoCD
  • Strong proficiency in infrastructure as code (IaC) tools such as Terraform, especially in the context of Kubernetes and cloud environments
  • Experience with managing and securing container registries and artifact repositories
  • Experience with observability and troubleshooting in Kubernetes environments, including distributed tracing and log aggregation
  • Experience with security best practices for containerized applications and Kubernetes deployments, including RBAC, security contexts, and network security
  • Experience with automation of operating system configuration and deployment in edge environments

Armada.ai provides a cloud-based Software-as-a-Service (SaaS) platform that improves visibility and collaboration across the supply chain. It serves logistics providers, manufacturers, and retailers, and connects them through real-time analytics and a shared workflow environment. The core product integrates with customers’ existing systems and offers dashboards, data-driven analytics, and social-network-like collaboration so stakeholders can share information, identify issues quickly, and coordinate responses. Unlike many traditional supply chain tools, Armada.ai focuses on real-time visibility and cross-partner collaboration within a single platform, leveraging integrations rather than replacing current systems. The company’s goal is to help customers run more efficient, cost-effective supply chains by enabling faster decisions and better coordination across all participants.

Company Size

501-1,000

Company Stage

Late Stage VC

Total Funding

$226M

Headquarters

Cambridge, Massachusetts

Founded

2016

Simplify Jobs

Simplify's Take

What believers are saying

  • Aker BP deploys Armada Galleon on Norwegian rigs in 2025 for resilient operations.
  • US Navy uses Armada data centers at sea, proven in UNITAS 2025 exercise.
  • OpenAI partnership advances industry-specific edge AI models for critical sectors.

What critics are saying

  • NVIDIA GPU shortages halt Galleon deployments in 6-18 months.
  • OpenAI API price hikes force customer platform rewrites in 12-24 months.
  • Defense contract losses cut 40-60% revenue in 18-36 months.

What makes Armada unique

  • Armada deploys Galleon modular data centers for edge AI in remote offshore rigs.
  • Armada Edge Platform unifies compute, connectivity, and AI for distributed environments.
  • Armada integrates NVIDIA AI Grid and VAST Data for sovereign AI factories.


Benefits

Company Equity

Growth & Insights and Company News

Headcount

6 month growth

0%

1 year growth

2%

2 year growth

-4%
PR Newswire
Mar 23rd, 2026
Aker BP deploys modular offshore data centre with Armada on Norwegian Continental Shelf

Aker BP and Armada have agreed to deploy Armada's Galleon modular data centre for offshore drilling operations on the Norwegian Continental Shelf. The system will process and analyse drilling data directly on the rig, addressing connectivity limitations to shore-based infrastructure. The deployment aims to improve operational resilience by running AI models locally to predict equipment failures, reduce downtime and maintain operations during connectivity disruptions. The platform also enhances cybersecurity by minimising reliance on external networks and standardises vendor applications on a single edge architecture. Deployment will begin with one rig as a reference installation before expanding to additional assets. In 2025, Armada enabled the US Navy to deploy its first full-stack modular data centre at sea.

PR Newswire
Mar 17th, 2026
Armada brings NVIDIA AI Grid to telcos with edge platform for distributed AI infrastructure

Armada has announced its Edge Platform will support NVIDIA AI Grid, enabling telecommunications operators and enterprises to deploy geographically distributed AI infrastructure for latency-sensitive workloads. The platform integrates with NVIDIA technologies including RTX PRO Servers, Blackwell GPUs, Spectrum-X networking and AI Enterprise software. The Armada Edge Platform provides unified control across distributed AI infrastructure, from centralised AI factories to edge locations, managing intelligent workload placement and resource optimisation across thousands of sites. It supports applications requiring low latency, such as conversational AI, AR/XR and real-time video generation. Armada is collaborating with Nscale to deploy sovereign GPU clouds globally using the platform. The company's Galleon modular data centres provide rapidly deployable infrastructure for AI Grid sites where existing facilities are unavailable.

PR Newswire
Mar 16th, 2026
Armada leverages NVIDIA DSX Air to accelerate AI factory software development

Armada announced plans to use NVIDIA DSX Air to accelerate development and testing of its Bridge GPU management software through AI factory simulation capabilities. The company has been working with NVIDIA networking simulation technologies since late 2024 and will now extend usage to simulate GPUs, NVLink fabrics and Spectrum-X Ethernet switches. The simulation platform enables Armada to validate large-scale AI factory deployments without physical hardware constraints, compress proof-of-concept timelines from months to weeks, and create production digital twins for safer operational changes. Armada will also model its modular data centre, Galleon, within DSX Air. The full-stack edge infrastructure company serves sectors including energy and defence, delivering compute, storage and connectivity to remote industrial environments.

PR Newswire
Feb 25th, 2026
Armada partners with VAST Data to deliver distributed AI factories with sovereign cloud infrastructure

Armada, a distributed edge infrastructure company, has joined the VAST Cosmos Community through a partnership with VAST Data to deliver distributed AI factories. The collaboration combines Armada's Galleon modular data centres and Edge Platform with VAST's AI Operating System. The integrated solution enables organisations to build and scale AI factories in remote or regulated environments whilst maintaining data sovereignty and security. Armada's Galleon provides high-density AI infrastructure, whilst the Edge Platform delivers GPU-as-a-Service capabilities. VAST's DataSpace creates a unified data fabric across distributed locations. The partnership supports use cases including real-time edge inference, large-scale model training, autonomous systems and mission-critical government applications. The solution addresses regulatory, national security and data residency requirements whilst enabling globally connected AI operations.

Edge Industry Review
Dec 11th, 2025
Armada demonstrates real edge compute capability in contested maritime environments

Edge infrastructure solutions provider Armada participated in UNITAS 2025, the world's longest-running multinational maritime exercise, showcasing its modular AI data centers, Galleons, and its Atlas software for edge computing in remote and contested environments. Galleons delivered high-performance computing, networking, and distributed network awareness both on land and at sea aboard a Navy warship to aid in mission-critical operations. Armada's technology created secure, resilient hybrid cloud environments and processed data from multi-INT robotic autonomous systems, demonstrating its effectiveness in power-constrained areas. "Warfighters often operate in power and communications-denied environments, and Armada's full stack modular, mobile data centers uniquely enable them to harness massive amounts of data in those conditions," says Dan Wright, CEO of Armada. "At Armada, we're not just building for tomorrow but delivering solutions today, right where they are needed, remote and contested environments both on land and at sea." The exercise underscored the value of industry partnerships, as Armada worked with U.S. Navy units, private-sector companies, and research institutions. Galleons have previously proven their capabilities in other military applications, such as live drone feed processing using the American AI Stack. Armada has also just unveiled Leviathan, the first modular, megawatt AI data center, to drive forward distributed AI infrastructure and energy dominance. Armada is focused on innovation and cooperation that enable the technology and operational edge of the U.S. and its allied partners.

INACTIVE