ComboCurve is an industry-leading, cloud-based software solution for A&D, reservoir management, and forecasting in the energy sector. Our platform empowers professionals to evaluate assets, optimize workflows, and manage reserves efficiently, all in one integrated environment.
By streamlining data integration and enhancing collaboration, we help operators, engineers, and financial teams make informed decisions faster. Trusted by top energy companies, ComboCurve delivers real-time analytics and exceptional user support, with a world-class customer experience team that responds to inquiries in under 5 minutes.
We are seeking a highly analytical and experienced Senior Data Engineer to help optimize production forecasting and operations scheduling within the petroleum engineering domain. You’ll bridge the gap between complex mathematical models (reservoir dynamics, optimization, logistics) and robust, cloud-scale data systems.
This role requires a unique combination of deep Python expertise, mastery of modern data processing and API frameworks, and a strong foundational understanding of mathematics, reasoning, and petroleum engineering principles.
Responsibilities

Data Architecture & Engineering
- Design, build, and maintain scalable data pipelines for ingesting, transforming, and validating time-series data related to well performance, sensor readings, and operational logs.
- Develop robust, high-performance data models using PyArrow and Pandas for efficient analysis and transfer.
- Implement data quality and schema validation using Pydantic to ensure data integrity across all stages of the pipeline (see the sketch after this list).
- Manage and optimize data storage and retrieval in MongoDB, and integrate with cloud-native platforms like GCP BigQuery or Snowflake where applicable.
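The Pydantic validation work described above might look roughly like the following sketch, which checks raw ingest rows against a typed schema before they enter later pipeline stages. The model and field names (WellProductionRecord, well_id, oil_rate_bbl_d, water_cut) are illustrative assumptions, not ComboCurve's actual schema.

    # Minimal sketch of Pydantic-based record validation ahead of a pipeline stage.
    # Field names and constraints are hypothetical placeholders.
    from datetime import datetime
    from pydantic import BaseModel, Field, ValidationError

    class WellProductionRecord(BaseModel):
        well_id: str
        timestamp: datetime
        oil_rate_bbl_d: float = Field(ge=0)   # reject negative production rates
        water_cut: float = Field(ge=0, le=1)  # fraction between 0 and 1

    def validate_batch(rows: list[dict]) -> list[WellProductionRecord]:
        """Validate raw ingest rows; collect errors instead of failing the whole batch."""
        valid, errors = [], []
        for row in rows:
            try:
                valid.append(WellProductionRecord(**row))
            except ValidationError as exc:
                errors.append((row, exc))
        # In a real pipeline, rejected rows would be routed to a quarantine table for review.
        return valid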
API & Application Development
- Build, deploy, and maintain high-performance asynchronous microservices and prototypes using FastAPI or Flask to serve complex optimization and scheduling model predictions (see the sketch after this list).
- Use Postman for testing, documenting, and automating API workflows.
- Containerize and orchestrate applications using Docker and manage deployment on Google Cloud Platform (GCP).
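As an illustration of the FastAPI work described above, the following minimal sketch serves a scheduling-model prediction from an asynchronous endpoint. The route, the request/response models, and the placeholder assignment logic are assumptions for illustration only; a production service would call the actual optimizer.

    # Minimal FastAPI sketch for serving scheduling predictions behind an async endpoint.
    # ScheduleRequest/ScheduleResponse fields are hypothetical placeholders.
    from fastapi import FastAPI
    from pydantic import BaseModel

    app = FastAPI(title="scheduling-service")

    class ScheduleRequest(BaseModel):
        well_ids: list[str]
        horizon_days: int

    class ScheduleResponse(BaseModel):
        well_id: str
        start_day: int

    @app.post("/schedule", response_model=list[ScheduleResponse])
    async def schedule(req: ScheduleRequest) -> list[ScheduleResponse]:
        # Placeholder logic: assign wells sequentially within the horizon.
        # A real service would delegate to the optimization model.
        return [
            ScheduleResponse(well_id=w, start_day=i)
            for i, w in enumerate(req.well_ids)
            if i < req.horizon_days
        ]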
Quantitative Analysis & Optimization
- Collaborate with reservoir and operations teams to translate complex scheduling and logistics problems into mathematical models (e.g., linear programming, resource allocation); a small illustrative example follows this list.
- Implement numerical routines and simulations efficiently using NumPy for use in production environments.
- Apply strong logical and analytical reasoning to debug, validate, and interpret the outputs of operational scheduling algorithms.
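For a flavor of the modeling described above, here is a toy resource-allocation linear program solved with SciPy's linprog (one of the libraries named under Preferred Qualifications). All numbers, variable meanings, and the scenario itself are invented for illustration.

    # Toy resource-allocation LP: maximize production uplift from two candidate
    # workover jobs subject to a shared crew-hours budget (illustrative numbers only).
    from scipy.optimize import linprog

    # Decision variables x0, x1 = fraction of each job completed this period.
    # linprog minimizes, so negate the objective to maximize uplift (bbl/d per job).
    c = [-120.0, -90.0]          # expected production uplift per job
    A_ub = [[8.0, 5.0]]          # crew-hours required per job
    b_ub = [10.0]                # crew-hours available this period
    bounds = [(0, 1), (0, 1)]    # each job can be done at most once

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    print(res.x, -res.fun)       # chosen fractions and total expected uplift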
Requirements
- Education: Bachelor's or Master's degree in Petroleum Engineering, Computer Science, Mathematics, Operations Research, or related quantitative field, or equivalent experience.
- Quantitative Strength: Proven ability to work with mathematical modeling, optimization, and time-series analysis, including:
  - Linear and Mixed-Integer Programming
  - Probability and Statistics
  - Algorithmic Complexity and Performance Reasoning
- Collaborative mindset: experience working closely with data scientists, product owners, and domain experts to deliver production-ready systems.
Preferred Qualifications
- Domain Expertise: Solid understanding of well operations, drilling logistics, production data, and scheduling workflows.
- Experience working with large-scale or streaming datasets.
- Experience with mathematical modeling and optimization libraries (SciPy, PuLP, OR-Tools).
- Experience setting up CI/CD pipelines and container deployments on GCP.