Job Title: Data Platform Engineer – Supabase & GitOps Specialist
Location: Remote or On-site
Employment Type: Full-time
About FarmerTitan
FarmerTitan's AI-powered maintenance platform helps farmers track and manage repairs and maintenance, predict breakdowns before they happen, and make smarter equipment decisions. We're helping farmers save money across Canada, the United States, Brazil, and Australia.
You’d be a key engineer, integral to the company. We’re looking for someone with high aptitude who takes ownership, commits to a high degree of excellence, brings good energy, and has a "let's get stuff done" mentality. Someone who is excited to be part of the journey of a fast-growing startup for the next 3-5 years.
About the Role
We’re seeking a Data Platform Engineer with deep expertise in Supabase and modern GitOps workflows to design, build, and maintain the backbone of our data systems. In this role, you’ll lead the development of scalable Postgres and Redis databases, architect ETL/ELT pipelines, and enable our engineering teams to work efficiently through well-structured, version-controlled infrastructure.
You’ll be at the intersection of data architecture, DevOps, and application integration, ensuring that data systems are secure, maintainable, and seamlessly integrated with our stack (Supabase, Nuxt.js, Vue, Node.js).
Key Responsibilities
Supabase & Data Architecture
- Design and manage Supabase Postgres schemas, roles, policies, and extensions.
- Implement Supabase Row-Level Security (RLS) and authentication rules.
- Use Supabase’s APIs, Edge Functions, and real-time features to power application features.
- Integrate Supabase with front-end and back-end systems for optimal performance and maintainability.
Database Engineering
- Architect scalable Postgres and Redis solutions optimized for our workloads.
- Plan and execute zero-downtime migrations using GitOps workflows.
- Write, optimize, and maintain complex SQL queries, CTEs, and analytics views.
- Develop indexing, partitioning, and caching strategies for high-performance data access.
ETL/ELT & Analytics
- Build and maintain ETL/ELT pipelines for analytics, reporting, and operational systems.
- Implement data validation, monitoring, and quality checks across pipelines.
- Work closely with product and engineering teams to surface actionable analytics.
GitOps & CI/CD
- Manage database schema and migration scripts under version control using Git workflows.
- Implement GitOps pipelines to deploy database changes consistently across environments.
- Integrate schema migrations into CI/CD processes for automated testing and deployment.
- Collaborate with DevOps to containerize, test, and deploy Supabase/Postgres environments.
Bonus Areas
- AI/ML data preparation and integration.
- Python scripting for automation or analytics.
- Deploying and managing self-hosted systems like n8n for workflow automation.
Requirements
Must-Have Skills
- Proven expertise in Supabase, including Postgres configuration, RLS, authentication, and API integrations.
- Strong Postgres experience (indexes, constraints, optimization, partitioning, replication).
- Experience with Redis data modeling and caching strategies.
- Strong SQL skills (complex joins, window functions, analytics queries).
- Hands-on experience with Git-based workflows for database schema management.
- Experience implementing GitOps pipelines for database and infrastructure changes.
- Proven experience building ETL/ELT pipelines.
Nice-to-Have Skills
- Familiarity with Nuxt.js, Vue, Node.js for integrating Supabase data into apps.
- Python for automation, analytics, or ML integration.
- AI/ML project exposure.
- Experience deploying and managing automation tools like n8n.
- DevOps knowledge (Docker, CI/CD, cloud infrastructure).
What We Offer
- Competitive salary and benefits.
- Work on high-impact, cutting-edge data systems.
- Flexible remote work.
- Opportunities to grow in AI/ML, DevOps, and advanced automation.
If you’re a Supabase power user who thrives in GitOps-driven environments, this is your chance to take full ownership of data systems that will scale with our vision.
Benefits:
- Relocation assistance
Experience:
- Database: 6 years (Required)
- PostgreSQL: 5 years (Preferred)
Language:
- English (Preferred)