Senior Geospatial Software Engineer
Agriculture & climate insight platform
Company Overview
Gro's mission is to illuminate the interrelationships between the Earth’s ecology and our human economy, allowing users to see the big picture and act on the small details. From assessing the impact of climate change in real time to optimizing agricultural supply chains, Gro’s data, analytics, and forecast models provide honest answers to what on earth is going on.
Food & Agriculture
Data & Analytics
Government & Public Sector
AI & Machine Learning
Financial Services
Company Stage
Series B
Total Funding
$117.6M
Founded
2014
Headquarters
New York, New York
Growth & Insights
Headcount
6 month growth: ↑ 0%
1 year growth: ↑ 6%
2 year growth: ↑ 35%
Locations
Denver, CO, USA
Experience Level
Senior
Desired Skills
Apache Spark
AWS
Data Analysis
Data Science
Google Cloud Platform
Git
Airflow
Microsoft Azure
Rust
Kubernetes
Python
Categories
Software Engineering
Requirements
- Someone who loves to learn, is willing to experiment, and remains adaptable
- Understands geospatial data and computational challenges for N-dimensional array data and GIS data
- Someone who is passionate about building and optimizing distributed parallel computing systems and crunching loads of data quickly and efficiently
- Self-motivated to drive projects to completion
- Open-minded about languages/tools/frameworks, and able to make good decisions about which to use (and when) to solve a particular problem
- A great team player; someone with a point of view who knows how to humbly express it
- Ability to create and clearly communicate designs and ideas
- 6+ years of experience in geospatial software engineering, including cloud data pipelines, big data architecture, and automation
- Expertise in Python
- Experience with GIS libraries (e.g. GDAL, Rasterio, Rioxarray, Shapely, GeoPandas)
- Fluency with cloud (AWS, GCP, or Azure) infrastructure (AWS preferred)
- Experience with geospatial data formats and schemas - both legacy (HDF, NetCDF, GRIB) and cloud-optimized (GeoTIFF, TileDB, Zarr)
- Fluency with one or more cloud-based distributed computing frameworks (e.g. Dask, Spark)
- Experience with workflow management tools (e.g. Airflow, Dagster)
- BS in Computer Science, a related technical field, or equivalent practical experience
- Experience with cloud-native software development using containers (e.g. Kubernetes, ECS)
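As a rough illustration of the kind of work the requirements above describe, the sketch below processes an N-dimensional array in chunks along its first axis, the same access pattern Dask or Spark use to split a large raster stack that does not fit in memory. It is a toy example in plain NumPy; the `chunked_mean` helper and the "time × lat × lon" shape are illustrative assumptions, not part of Gro's actual stack.

```python
import numpy as np

def chunked_mean(arr, chunk=64):
    """Mean of an N-dimensional array, computed one chunk at a time
    along the first axis so only a slice is ever held in hot memory.
    Mimics the chunked execution model of Dask/Spark array pipelines."""
    total = 0.0
    count = 0
    for start in range(0, arr.shape[0], chunk):
        block = arr[start:start + chunk]
        total += block.sum(dtype=np.float64)  # accumulate in float64 for stability
        count += block.size
    return total / count

# A toy "time x lat x lon" stack standing in for a real geospatial dataset.
stack = np.arange(2 * 4 * 4, dtype=np.float64).reshape(2, 4, 4)
assert np.isclose(chunked_mean(stack, chunk=1), stack.mean())
```

In a real pipeline the chunking, scheduling, and reduction would be delegated to a framework such as Dask, with the data read lazily from a cloud-optimized store (e.g. Zarr on S3) rather than materialized in memory.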
Responsibilities
- Design, implement, and maintain high throughput data pipelines that allow scientists to quickly analyze and transform large geospatial (multi-dimensional, high resolution) datasets in the cloud
- Manage, optimize, and even redesign existing geospatial data workflows and storage systems
- Design, implement, and manage data storage solutions for large geospatial datasets
- Collaborate with scientists, analysts, researchers, and software developers to deliver high-quality geospatial data products and services
- Work with cross-functional teams to integrate geospatial data processing tasks
Desired Qualifications
- Experience with collaborative platforms (e.g. GitHub, GitLab)
- Experience with object stores (S3, Azure Blob Storage, etc.)
- Knowledge of data science, machine learning, and statistical models
- Experience with geospatial or other highly pixelated datasets
- Experience in technical leadership, leading project teams, and setting technical direction
- Proficiency with or willingness to learn Rust