Spatial Data Onboarding Specialist

DevOps • Calgary, Alberta, Canada • Full-time

Role Type

On-site • Permanent • Full-time • Associate

Description

About BigGeo

BigGeo is the Spatial Cloud.

We help companies manage and access the world’s spatial data.

Any size, any slice, any insight.

Delivered in seconds.

We’re building something that hasn’t existed before: a new layer of the internet where the “where” and “when” behind every decision is instantly clear, programmable, and actionable. Our platform removes the complexity that has kept spatial data locked in silos for decades and replaces it with speed, precision, and control.

We’re a Calgary-based company, early and moving fast, with real customers, real infrastructure, and a clear point of view on where the world is going.

Why BigGeo Exists and Why People Build Here

Most companies are spatially blind. They know what their data says, but not where or when things actually happen. That gap costs real money, creates real risk, and limits what AI can actually do in the physical world.

BigGeo exists to close that gap.

We’re not building another tool. We’re building the rails that connect the planet’s moving data to the systems that run the world. That’s a big problem, and it takes people who care about doing things right, not just fast.

People build here because:

  • The problem is real and the category is open. We’re not competing for the middle of an existing market; we’re defining a new one. Your work shapes what the category becomes.
  • Your fingerprints are on the architecture. We’re at the stage where the decisions you make today become the foundation tomorrow. What you ship matters.
  • We run on clarity, not politics. We move with purpose. No bureaucratic drag, just a team that agrees on the mission and gets to work.
  • You’ll grow fast because the problems are hard. Spatial data at scale is a genuinely difficult domain. If you want to be stretched, you’ll be stretched.
  • We’re building for longevity. We’re not chasing hype cycles. We’re building infrastructure, the kind that compounds in value over time and earns the trust of the companies that depend on it.

The Role

BigGeo is hiring a Spatial Data Onboarding Specialist to bring new spatial datasets into the Spatial Cloud and make sure they are ready to power real-world intelligence applications. This role sits at the intersection of external data providers and internal engineering teams, and it has direct influence on the quality, reliability, and breadth of the data ecosystem that the platform runs on.

You will own how spatial datasets are prepared, validated, and integrated into the platform. You will guide data partners through the onboarding process, clean and transform incoming data into Spatial Cloud-compatible structures, build and improve validation workflows, and work directly with data and platform engineers to make sure every dataset is indexed, documented, and accessible through BigGeo’s APIs.

This is a high-visibility role in a category-defining company. The datasets you onboard become part of the operational fabric that organizations, systems, and AI use to make real-world decisions. The work moves quickly, and the standard for data quality is high.

Key Responsibilities

Dataset Onboarding

  • Partner with external data providers and internal teams to onboard new spatial datasets into the Spatial Cloud.
  • Guide providers through the preparation steps required to meet platform standards, translating technical requirements into clear partner guidance.
  • Confirm that every dataset is structured and formatted for Spatial Cloud compatibility before it enters the platform.

Data Preparation and Transformation

  • Clean, transform, and organize spatial datasets so they are ready for ingestion, indexing, and query.
  • Standardize data structures across providers so the platform stays interoperable as the data ecosystem grows.
  • Optimize source data so it can be processed efficiently at scale.
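To make “standardize data structures across providers” concrete, here is a minimal, stdlib-only Python sketch. The provider names, field mappings, and records are invented for illustration; they are not BigGeo’s actual schema or pipeline.

```python
# Per-provider field mappings into one shared structure.
# Provider names and field names are hypothetical examples.
FIELD_MAPS = {
    "provider_a": {"Latitude": "lat", "Longitude": "lon", "Site": "name"},
    "provider_b": {"y": "lat", "x": "lon", "label": "name"},
}

def standardize(provider, records):
    """Rename provider-specific fields to the platform's shared schema."""
    mapping = FIELD_MAPS[provider]
    out = []
    for rec in records:
        std = {mapping[k]: v for k, v in rec.items() if k in mapping}
        # Skip records missing coordinates; they cannot be indexed spatially.
        if "lat" in std and "lon" in std:
            out.append(std)
    return out

a = standardize("provider_a", [{"Latitude": 51.05, "Longitude": -114.07, "Site": "HQ"}])
b = standardize("provider_b", [{"x": -113.5, "y": 53.5, "label": "North"},
                               {"label": "missing coords"}])
```

The point is the shape of the work: every provider delivers data differently, and a single normalization layer is what keeps the platform interoperable as the ecosystem grows.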

Data Validation and Quality

  • Verify the accuracy, completeness, and geographic integrity of every dataset before it is released into the Spatial Cloud.
  • Build and maintain validation workflows that automatically detect inconsistencies, schema drift, and geospatial errors.
  • Act as a quality gate: datasets do not go live until they meet BigGeo’s standard.
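As a rough sketch of what an automated quality gate can look like, here is a stdlib-only Python example that checks for schema drift, type mismatches, and out-of-range coordinates. The expected schema and field names are hypothetical; a production workflow would be far richer.

```python
# Hypothetical dataset contract: field name -> expected type.
EXPECTED_SCHEMA = {"id": int, "lon": float, "lat": float, "name": str}

def validate(records, expected=EXPECTED_SCHEMA):
    """Return a list of issues; an empty list means the dataset may go live."""
    issues = []
    for i, rec in enumerate(records):
        # Schema drift: fields added or dropped relative to the contract.
        drift = set(rec) ^ set(expected)
        if drift:
            issues.append(f"record {i}: schema drift on {sorted(drift)}")
            continue
        # Type checks against the declared schema.
        for field, typ in expected.items():
            if not isinstance(rec[field], typ):
                issues.append(f"record {i}: {field} is not {typ.__name__}")
        # Geographic integrity: coordinates must be on the globe.
        if not (-180 <= rec["lon"] <= 180 and -90 <= rec["lat"] <= 90):
            issues.append(f"record {i}: coordinates out of range")
    return issues

good = {"id": 1, "lon": -114.07, "lat": 51.05, "name": "Calgary"}
bad = {"id": 2, "lon": -114.07, "lat": 951.0, "name": "??"}
drifted = {"id": 3, "lon": 0.0}
```

Acting as the quality gate means a dataset with any non-empty issue list simply does not ship.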

Metadata and Documentation

  • Document dataset structures, schemas, attributes, projections, and geographic context so internal teams and external consumers can reason about the data.
  • Maintain organized, discoverable records of every onboarded dataset.
  • Keep metadata living and accurate as datasets are updated, versioned, or deprecated.
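A first-pass data dictionary can often be drafted mechanically from sample records and then reviewed by hand. Here is a small stdlib-only Python sketch of that idea; the sample records and field names are invented for illustration.

```python
def data_dictionary(records):
    """Draft a first-pass data dictionary from sample records:
    each field maps to its observed type names and an example value."""
    fields = {}
    for rec in records:
        for key, value in rec.items():
            entry = fields.setdefault(key, {"types": set(), "example": value})
            entry["types"].add(type(value).__name__)
    return {
        k: {"types": sorted(v["types"]), "example": v["example"]}
        for k, v in fields.items()
    }

samples = [
    {"id": 1, "name": "Calgary", "lat": 51.05},
    {"id": 2, "name": None, "lat": 53.55},
]
dd = data_dictionary(samples)
```

A draft like this surfaces mixed-type fields (here, `name` appearing as both `str` and `NoneType`) that the documented schema should call out explicitly.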

Platform Integration Support

  • Work hand-in-hand with data and platform engineers so datasets integrate cleanly with Spatial Cloud infrastructure.
  • Support indexing and preparation steps that enable fast spatial queries across any size, any slice, any insight.
  • Confirm that onboarded datasets are reachable and performant through platform APIs and services.
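As a toy illustration of why spatial indexing is what makes fast queries possible, here is a coarse grid index in Python. Real platforms use far more sophisticated structures; the cities, coordinates, and one-degree cell size are examples only.

```python
from collections import defaultdict

def cell_id(lon, lat, cell_deg=1.0):
    """Bucket a coordinate into a coarse grid cell, a toy stand-in
    for the spatial indexing that makes queries fast."""
    return (int(lon // cell_deg), int(lat // cell_deg))

# Build the index: cell -> records falling in that cell.
index = defaultdict(list)
for rec in [
    {"name": "Calgary", "lon": -114.07, "lat": 51.05},
    {"name": "Edmonton", "lon": -113.49, "lat": 53.55},
    {"name": "Banff", "lon": -115.57, "lat": 51.18},
]:
    index[cell_id(rec["lon"], rec["lat"])].append(rec)

# A query only touches candidates in the matching cell,
# instead of scanning every record in the dataset.
hits = index[cell_id(-114.07, 51.05)]
```

Preparing a dataset so it partitions cleanly into structures like this is what keeps queries fast regardless of dataset size.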

AI-Enabled Workflow Support

  • Use AI tools to accelerate dataset inspection, schema discovery, transformation scripting, anomaly detection, and metadata generation.
  • Continuously improve onboarding workflows by adding AI-assisted steps where they increase accuracy or speed.

Cross-Functional Collaboration

  • Collaborate closely with spatial engineers, data engineers, and platform teams to keep onboarding aligned with product and infrastructure direction.
  • Act as BigGeo’s point person for external data partners during onboarding.
  • Help evolve the onboarding playbook as the Spatial Cloud data ecosystem scales.

What You Bring

Required:

  • 2 to 5 years of experience working with spatial data, geospatial datasets, or structured data preparation workflows.
  • Associate degree in Geography, Information Technology, or a related field.
  • Hands-on experience cleaning, transforming, and preparing structured datasets at production quality.
  • Working familiarity with common spatial data formats (Shapefile, GeoJSON, GeoPackage, GeoParquet, KML, raster formats) and core geospatial concepts such as projections, coordinate systems, and spatial indexing.
  • A strong quality instinct: you notice when a dataset is wrong before anyone downstream does.
  • Experience building or operating data processing or ETL workflows.
  • Clear written communication, especially when documenting schemas, decisions, and onboarding steps for partners and internal teams.

Nice to Have:

  • Direct experience with GIS platforms, remote sensing data, or large-scale location intelligence pipelines.
  • Familiarity with geospatial libraries and tooling such as GDAL/OGR, PostGIS, GeoPandas, Shapely, or Fiona.
  • Experience onboarding datasets into a data platform, analytics product, or cloud data warehouse.
  • Previous work directly with external data providers, vendors, or partners.
  • Experience in a startup or fast-moving data platform environment where the playbook is still being written.
  • Exposure to cloud object storage and modern lakehouse architectures (S3, GCS, Parquet, Iceberg, Delta).

Advanced AI Skills

BigGeo is an AI-enabled company. Every role is expected to use modern AI tools to move faster, produce better work, and raise the quality bar. For this role specifically, that means:

  • Using AI copilots (Claude, ChatGPT, Cursor, Copilot) to accelerate transformation scripting, schema inference, and ETL debugging.
  • Using LLMs to generate first-pass metadata, data dictionaries, and partner-facing documentation, then editing for accuracy.
  • Using AI-assisted tools to detect anomalies, outliers, and geospatial inconsistencies at scale, rather than relying only on manual spot checks.
  • Building prompt-driven workflows that compress repetitive onboarding tasks from hours into minutes.
  • Actively sharing the AI workflows that work with the rest of the team, so the whole onboarding function compounds.

We are not looking for AI demos. We are looking for someone who quietly ships more, faster, and cleaner because AI is woven into how they work.

Success Measures

First 30 days:

  • Onboarded to BigGeo’s data stack, tooling, and current onboarding workflows.
  • Shipped at least one dataset end-to-end with support, from intake to validated, documented, and live in the Spatial Cloud.
  • Built a working view of the current data partner pipeline and where the friction points are.

First 60 days:

  • Running dataset onboarding independently across multiple providers in parallel.
  • Contributing improvements to validation and quality workflows, with measurable reduction in downstream data issues.
  • Documented reusable patterns for the most common onboarding scenarios.

First 90 days and beyond:

  • Recognized internally as the owner of spatial dataset onboarding quality.
  • Actively shaping the onboarding playbook, validation standards, and AI-assisted workflows used across the team.
  • Expanding the data ecosystem that powers the Spatial Cloud, with trusted datasets flowing in at a cadence the platform can rely on.