# Forward Deployed Engineer

> Arkham Technologies · Mexico City, Mexico (Hybrid) · Full-time · Posted 2026-03-10

**Workplace:** hybrid

## Description

**Forward Deployed Engineer**

### About Arkham:

[Arkham](https://www.arkham.tech/) is a Data & AI platform that helps large enterprises:

-   Unify fragmented systems and data
-   Build a single source of trusted operational metrics
-   Solve complex challenges with AI tailored to their operations

Teams at **Circle K and Kimberly-Clark** partner with us to deploy AI-powered solutions for sell-out forecasting, pricing and promo analysis, and automated order assignment. With Arkham, they achieve high-impact results fast, creating a strong foundation for long-term AI transformation.

Learn more about Arkham:

-   [Our website](https://www.arkham.tech/)
-   [YouTube channel](https://www.youtube.com/@ArkhamTechnologies)
-   [Medium blog](https://medium.com/arkham-technologies)

### **About the Role**

Our implementation model is built around two core roles:

-   Forward Deployed Engineer
-   Forward Deployed Data Scientist

As a **Forward Deployed Engineer**, you'll be the technical bridge between Arkham's Data Sync platform and our clients. Your primary focus is building the connectors that make data flow, spanning REST APIs and JDBC/ODBC databases, using frameworks like dlt, Airbyte, and Meltano (desired but not mandatory). But unlike a purely internal engineering role, you'll own the full implementation cycle: working directly with client teams to understand their data landscape, then building and deploying the connectors that unlock it.

You'll manage 3-4 client implementations at a time, moving fast from discovery to production while keeping your connector work reusable and scalable across engagements.

### **What You’ll Work On**

### **Build and Own Data Connectors**

-   Design and implement connectors for diverse data sources, including:
    -   REST APIs
    -   JDBC / ODBC databases
    -   Flat files and object storage
    -   Custom ingestion scripts
Every connector should prioritize **reusability, configurability, and scalability**.

### **Lead Client Implementations End-to-End**

Own the data integration process from initial discovery to production deployment.

This includes:

-   Understanding the client’s data architecture
-   Defining integration requirements
-   Building connectors
-   Deploying them into Arkham’s platform

Most integrations are expected to reach production within 2–4 weeks.

### **Design Robust Integration Architecture**

Create adapter and abstraction layers that standardize how connectors handle:

-   Authentication mechanisms
-   Pagination and incremental syncs
-   Rate limits
-   Error handling and retries
-   Schema normalization

Your goal is to ensure connectors behave consistently across systems.
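The adapter layer described above can be sketched in Python. This is a minimal illustration, not Arkham's actual framework: the names `PageFetcher`, `with_retries`, and `paginate` are hypothetical, and the example assumes a cursor-based pagination scheme where `None` signals the last page.

```python
import time
from typing import Any, Callable, Iterator, Optional

# A page fetcher takes a cursor (None for the first page) and returns
# (records, next_cursor). next_cursor of None means there are no more pages.
PageFetcher = Callable[[Optional[str]], tuple[list[dict[str, Any]], Optional[str]]]

def with_retries(fetch: PageFetcher, max_attempts: int = 3,
                 backoff_s: float = 1.0) -> PageFetcher:
    """Wrap a page fetcher with simple exponential-backoff retries."""
    def wrapped(cursor: Optional[str]):
        for attempt in range(max_attempts):
            try:
                return fetch(cursor)
            except Exception:
                if attempt == max_attempts - 1:
                    raise  # out of attempts: surface the error
                time.sleep(backoff_s * 2 ** attempt)  # back off, then retry
    return wrapped

def paginate(fetch: PageFetcher) -> Iterator[dict[str, Any]]:
    """Drain a cursor-paginated source, yielding one record at a time."""
    cursor: Optional[str] = None
    while True:
        records, cursor = fetch(cursor)
        yield from records
        if cursor is None:  # no next-page token: the source is exhausted
            return
```

Because pagination and retries live in the adapter rather than in each connector, every source system gets the same behavior for free, which is the consistency goal stated above.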

### **Work Directly With Client Technical Teams**

Collaborate with engineering and data teams to:

-   Understand source systems
-   Map and validate data flows
-   Ensure outputs align with business requirements

You’ll translate real-world operational systems into **clean, reliable data pipelines**.

### **Build Analytics-Ready Data Pipelines**

Write clean SQL and design transformations that produce **high-quality datasets ready for analytics and AI workflows**.

Your pipelines should be:

-   Correct
-   Observable
-   Maintainable
-   Scalable
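As a toy illustration of an analytics-ready transformation, the sketch below aggregates raw order rows into a daily revenue table using SQLite. The schema (`raw_orders`, `daily_revenue`) is invented for this example and does not come from any real client engagement.

```python
import sqlite3

# Hypothetical raw source table and the aggregate we want downstream.
RAW_DDL = "CREATE TABLE raw_orders (order_id TEXT, order_date TEXT, amount REAL)"
TRANSFORM_SQL = """
CREATE TABLE daily_revenue AS
SELECT order_date,
       COUNT(*)    AS order_count,
       SUM(amount) AS revenue
FROM raw_orders
GROUP BY order_date
ORDER BY order_date
"""

def build_daily_revenue(rows):
    """Load raw rows into an in-memory database and materialize the aggregate."""
    conn = sqlite3.connect(":memory:")
    conn.execute(RAW_DDL)
    conn.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)", rows)
    conn.execute(TRANSFORM_SQL)
    return conn.execute("SELECT * FROM daily_revenue").fetchall()
```

The same correctness property the list above asks for can be checked mechanically: given known input rows, the aggregate output is deterministic and testable.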

### **Improve the Connector Library**

Each integration should strengthen Arkham’s platform.

You’ll contribute reusable patterns and improvements back to the shared connector framework so that **every engagement accelerates the next one**.  

### **What We Require**

### **Experience**

-   2+ years of backend or data engineering experience
-   Strong **Python programming skills**
-   Hands-on experience building **REST API integrations**
-   Solid **SQL and relational database fundamentals**

### **Communication & Ownership**

You should be comfortable:

-   Explaining technical work to **technical and non-technical stakeholders**
-   Working in **client-facing environments**
-   Owning integrations **from problem definition to production**

This role requires engineers who **take responsibility for outcomes**, not just code.

### **Technical Environment**

Experience with cloud environments is expected.

Preferred:

-   **AWS**

### **Bonus Skills**

Nice-to-have experience includes:

-   CI/CD applied to data workflows
-   Data observability and testing frameworks
-   Familiarity with **AI-driven analytics and Generative AI use cases**
-   Experience with data integration frameworks such as:
    -   **dlt**
    -   **Airbyte**
    -   **Meltano**

## Apply

[Apply at Arkham Technologies](https://apply.workable.com/arkham-technologies/j/B927AF1B70/apply)

---
Powered by [Workable](https://www.workable.com)
