Solar Data Engineer, Cloud | New York, NY

Location: NYC
About:
Karbone Inc. is an award-winning liquidity services provider for energy transition and environmental commodity markets. Since 2008, we have offered integrated and innovative revenue hedging, risk management, and market advisory solutions to a global roster of clients across energy markets. Our teams are proudly ranked first among their peers and are dedicated to our core mission: providing our clients and partners with the market access, liquidity solutions, and commercial insight they need to succeed in the energy transition.

Position Overview:
We are seeking a Cloud Data Engineer to help build our cloud-based IT infrastructure, data foundation, and internal applications from the ground up. In this role, you will shape a clean, scalable GCP data environment and influence tooling and architecture decisions from day one. You will work on a focused set of data pipelines with significant flexibility in implementation, bringing hands-on expertise in cloud technologies and data engineering, and contributing to the introduction of AI/ML capabilities over time.

Responsibilities:
Build and optimize cloud-native data pipelines within a clean, scalable GCP environment.
Manage PostgreSQL and TimescaleDB systems to support complex geospatial and high-velocity time-series datasets.
Develop Python-based ETL/ELT workflows using GCP-native tools like Dataplex, Cloud Run, or Dataflow.
Implement monitoring, alerting, and dashboards to maintain data infrastructure health and uptime.
Drive the transition to ‘Infrastructure as Code’ using Terraform for reproducible and version-controlled environments.
Set up automated CI/CD pipelines via Jenkins or GitHub Actions to replace manual deployment processes.
Architect the foundation for Retrieval-Augmented Generation (RAG) using BigQuery’s vector search and Vertex AI.

Qualifications:
2–4+ years of data engineering experience on GCP or AWS (with a readiness to focus on GCP).
Bachelor’s degree in Computer Science, Data Science, Engineering, Information Systems/IT, or a related technical field.
Proficiency in Python and SQL for ETL automation and PostgreSQL/TimescaleDB orchestration.
Familiarity with modern engineering practices, including infrastructure-as-code with Terraform, GCP-native tools like Dataflow, or experience in hybrid DevOps/Data roles.
Skill in cloud observability using tools like Grafana, CloudWatch, or Cloud Monitoring.
Independent troubleshooter capable of managing workflows in an early-stage environment.
