Full-Time

Senior Software Engineer

Posted on 11/26/2025

Armada

501-1,000 employees

Real-time supply chain visibility platform

Compensation Overview

$144k - $180k/yr

+ Equity

Bellevue, WA, USA

In Person

Category
Software Engineering
Requirements
  • 7+ years of backend development experience, with strong proficiency in Go (Golang) or a similar systems language
  • Proven expertise in designing and scaling SDKs and APIs used by external developers, including experience with API versioning, usability, and long-term maintainability
  • Deep experience with Kubernetes and containerized application deployment, including multi-tenant architectures, RBAC, and secure deployment workflows
  • Strong understanding of cloud platforms (Amazon Web Services, Google Cloud Platform, and Microsoft Azure) and hybrid cloud/edge integration patterns
  • Solid knowledge of SQL/NoSQL databases (e.g., MySQL, PostgreSQL, MongoDB, Redis) and data modeling for distributed systems
  • Experience architecting and optimizing CI/CD pipelines for Kubernetes-native environments
  • Familiarity with observability practices (monitoring, logging, alerting, metrics) for distributed services
  • Strong grasp of concurrency and performance optimization, particularly in Go (goroutines, channels, async patterns)
  • Demonstrated ability to lead design reviews, mentor engineers, and set technical direction for backend or marketplace services
  • Proficiency with version control systems (Git) and collaborative development practices (code reviews, branching strategies)
  • Commitment to producing clean, maintainable, well-tested code with unit, integration, and performance testing
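The concurrency bullet above calls out goroutines and channels specifically. As a minimal, hypothetical sketch of that pattern (not Armada's actual code), a fan-out worker pool distributes jobs across a fixed number of goroutines and collects results over a channel:

```go
package main

import (
	"fmt"
	"sync"
)

// square is a trivial stand-in for real work; a backend service would
// do I/O or computation here instead.
func square(n int) int { return n * n }

// fanOut runs `workers` goroutines that each consume jobs from an input
// channel and push results to an output channel — the goroutine/channel
// pattern the requirements reference.
func fanOut(jobs []int, workers int) []int {
	in := make(chan int)
	out := make(chan int)
	var wg sync.WaitGroup

	for w := 0; w < workers; w++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for n := range in {
				out <- square(n)
			}
		}()
	}

	// Feed jobs, then close the input so workers terminate.
	go func() {
		for _, n := range jobs {
			in <- n
		}
		close(in)
	}()

	// Close the output once every worker has finished.
	go func() {
		wg.Wait()
		close(out)
	}()

	results := make([]int, 0, len(jobs))
	for r := range out {
		results = append(results, r)
	}
	return results
}

func main() {
	res := fanOut([]int{1, 2, 3, 4}, 2)
	sum := 0
	for _, r := range res {
		sum += r
	}
	fmt.Println(sum) // 1+4+9+16 = 30 (result order is nondeterministic; the sum is not)
}
```

Closing the input channel to end the worker loops, and closing the output only after `wg.Wait()`, is what keeps this pattern deadlock-free.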
Responsibilities
  • Own the design, development, and evolution of SDKs and APIs that power the Armada marketplace, ensuring scalability, usability, and long-term maintainability
  • Define and drive best practices for ISV onboarding workflows, including secure and reliable deployment patterns for applications running at the edge
  • Architect and optimize backend services for multi-tenant, cloud-native, and Kubernetes-based environments, focusing on scalability, availability, and performance
  • Collaborate closely with platform engineering (CaaS/GPUaaS), product, frontend, and DevOps teams to deliver seamless integration between infrastructure capabilities and ISV-facing services
  • Lead and participate in technical design reviews, setting standards for API contracts, versioning strategies, and developer experience
  • Mentor mid-level engineers, providing guidance on backend design, API patterns, and cloud-native best practices
  • Introduce and maintain observability (monitoring, logging, metrics, alerting) for marketplace and ISV integration workflows to ensure reliability and fast issue resolution
  • Ensure code is clean, well-tested, and maintainable; establish practices that raise code quality across the team
  • Contribute to architecture documentation and advocate for consistent technical standards across marketplace services
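Several of the responsibilities above concern API contracts and versioning strategies. One common approach is path-based versioning, where the version segment is parsed from the request path and unversioned clients fall back to a default. This is a hypothetical illustration of the idea, not a description of Armada's scheme:

```go
package main

import (
	"fmt"
	"strings"
)

// apiVersion extracts a version segment such as "v2" from a request
// path like "/v2/listings". Unversioned paths default to "v1", which
// keeps older clients working while newer ones opt in explicitly.
func apiVersion(path string) string {
	parts := strings.Split(strings.TrimPrefix(path, "/"), "/")
	if len(parts) > 0 && strings.HasPrefix(parts[0], "v") {
		return parts[0]
	}
	return "v1" // default for clients that omit a version
}

func main() {
	fmt.Println(apiVersion("/v2/listings")) // v2
	fmt.Println(apiVersion("/listings"))    // v1
}
```

In a real service the extracted version would select a handler or response schema; the long-term maintainability concern the posting mentions is largely about how long each such version must remain supported.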

Armada.ai provides a cloud-based Software-as-a-Service (SaaS) platform that improves visibility and collaboration across the supply chain. It serves logistics providers, manufacturers, and retailers, and connects them through real-time analytics and a shared workflow environment. The core product integrates with customers’ existing systems and offers dashboards, data-driven analytics, and social-network-like collaboration so stakeholders can share information, identify issues quickly, and coordinate responses. Unlike many traditional supply chain tools, Armada.ai focuses on real-time visibility and cross-partner collaboration within a single platform, leveraging integrations rather than replacing current systems. The company’s goal is to help customers run more efficient, cost-effective supply chains by enabling faster decisions and better coordination across all participants.

Company Size

501-1,000

Company Stage

Late Stage VC

Total Funding

$226M

Headquarters

Cambridge, Massachusetts

Founded

2016

Simplify Jobs

Simplify's Take

What believers are saying

  • Aker BP deploys Armada Galleon on Norwegian rigs in 2025 for resilient operations.
  • US Navy uses Armada data centers at sea, proven in UNITAS 2025 exercise.
  • OpenAI partnership advances industry-specific edge AI models for critical sectors.

What critics are saying

  • NVIDIA GPU shortages could halt Galleon deployments within 6-18 months.
  • OpenAI API price hikes could force customer platform rewrites within 12-24 months.
  • Loss of defense contracts could cut revenue by 40-60% within 18-36 months.

What makes Armada unique

  • Armada deploys Galleon modular data centers for edge AI in remote offshore rigs.
  • Armada Edge Platform unifies compute, connectivity, and AI for distributed environments.
  • Armada integrates NVIDIA AI Grid and VAST Data for sovereign AI factories.

Benefits

Company Equity

Growth & Insights

Headcount

6 month growth

0%

1 year growth

2%

2 year growth

-4%

Company News

PR Newswire
Mar 23rd, 2026
Aker BP deploys modular offshore data centre with Armada on Norwegian Continental Shelf

Aker BP and Armada have agreed to deploy Armada's Galleon modular data centre for offshore drilling operations on the Norwegian Continental Shelf. The system will process and analyse drilling data directly on the rig, addressing connectivity limitations to shore-based infrastructure. The deployment aims to improve operational resilience by running AI models locally to predict equipment failures, reduce downtime and maintain operations during connectivity disruptions. The platform also enhances cybersecurity by minimising reliance on external networks and standardises vendor applications on a single edge architecture. Deployment will begin with one rig as a reference installation before expanding to additional assets. In 2025, Armada enabled the US Navy to deploy its first full-stack modular data centre at sea.

PR Newswire
Mar 17th, 2026
Armada brings NVIDIA AI Grid to telcos with edge platform for distributed AI infrastructure

Armada has announced its Edge Platform will support NVIDIA AI Grid, enabling telecommunications operators and enterprises to deploy geographically distributed AI infrastructure for latency-sensitive workloads. The platform integrates with NVIDIA technologies including RTX PRO Servers, Blackwell GPUs, Spectrum-X networking and AI Enterprise software. The Armada Edge Platform provides unified control across distributed AI infrastructure, from centralised AI factories to edge locations, managing intelligent workload placement and resource optimisation across thousands of sites. It supports applications requiring low latency, such as conversational AI, AR/XR and real-time video generation. Armada is collaborating with Nscale to deploy sovereign GPU clouds globally using the platform. The company's Galleon modular data centres provide rapidly deployable infrastructure for AI Grid sites where existing facilities are unavailable.

PR Newswire
Mar 16th, 2026
Armada leverages NVIDIA DSX Air to accelerate AI factory software development

Armada announced plans to use NVIDIA DSX Air to accelerate development and testing of its Bridge GPU management software through AI factory simulation capabilities. The company has been working with NVIDIA networking simulation technologies since late 2024 and will now extend usage to simulate GPUs, NVLink fabrics and Spectrum-X Ethernet switches. The simulation platform enables Armada to validate large-scale AI factory deployments without physical hardware constraints, compress proof-of-concept timelines from months to weeks, and create production digital twins for safer operational changes. Armada will also model its modular data centre, Galleon, within DSX Air. The full-stack edge infrastructure company serves sectors including energy and defence, delivering compute, storage and connectivity to remote industrial environments.

PR Newswire
Feb 25th, 2026
Armada partners with VAST Data to deliver distributed AI factories with sovereign cloud infrastructure

Armada, a distributed edge infrastructure company, has joined the VAST Cosmos Community through a partnership with VAST Data to deliver distributed AI factories. The collaboration combines Armada's Galleon modular data centres and Edge Platform with VAST's AI Operating System. The integrated solution enables organisations to build and scale AI factories in remote or regulated environments whilst maintaining data sovereignty and security. Armada's Galleon provides high-density AI infrastructure, whilst the Edge Platform delivers GPU-as-a-Service capabilities. VAST's DataSpace creates a unified data fabric across distributed locations. The partnership supports use cases including real-time edge inference, large-scale model training, autonomous systems and mission-critical government applications. The solution addresses regulatory, national security and data residency requirements whilst enabling globally connected AI operations.

Edge Industry Review
Dec 11th, 2025
Armada demonstrates real edge compute capability in contested maritime environments

Edge infrastructure solutions provider Armada participated in UNITAS 2025, the world's longest-running multinational maritime exercise, showcasing its Galleon modular AI data centers and Atlas software for edge computing in remote and contested environments. Galleons delivered high-performance computing, networking, and distributed network awareness both on land and at sea aboard a Navy warship to aid mission-critical operations. Armada's technology created secure, resilient hybrid cloud environments and processed data from multi-INT robotic autonomous systems, demonstrating its effectiveness in power-constrained areas. "Warfighters often operate in power- and communications-denied environments, and Armada's full-stack modular, mobile data centers uniquely enable them to harness massive amounts of data in those conditions," says Dan Wright, CEO of Armada. "At Armada, we're not just building for tomorrow but delivering solutions today, right where they are needed, remote and contested environments both on land and at sea." The exercise underscored the value of industry partnerships, as Armada worked with U.S. Navy units, private-sector companies, and research institutions. Galleons have previously proven their capabilities in other military applications, such as live drone feed processing using the American AI Stack. Armada recently unveiled Leviathan, the first modular, megawatt-scale AI data center, to drive forward distributed AI infrastructure and energy dominance. Armada remains focused on innovation and cooperation that give the U.S. and its allied partners a technological and operational edge.

INACTIVE