Exinity is hiring a Data Engineer

Exinity is seeking a Data Engineer with strong expertise in Azure Databricks. This role will focus on building, supporting, and administering scalable, high-performance data pipelines that power real-time and batch analytics for trading, risk, and operational use cases.

What You'll Do

  • Design, develop, and maintain robust data pipelines using Azure Databricks, Confluent, DLT (Delta Live Tables), Spark, and Delta Lake to support trading and market data workflows.
  • Independently learn the existing pipelines and enhance them, ensuring continuity, scalability, and performance improvements.
  • Provide production pipeline support services, including job monitoring, incident resolution, and performance tuning in production environments.
  • Administer Databricks workspaces and Unity Catalog, including cluster configuration, job scheduling, access control, and workspace optimization.
  • Build and maintain CI/CD pipelines using GitLab, enabling automated testing, deployment, and versioning of data engineering code.
  • Follow and enforce best practices in code management, including modular design, code reviews, and documentation using GitLab workflows.
  • Collaborate with fellow team members, business analysts, and data architects to understand data requirements and deliver high-quality solutions.
  • Build reusable components and frameworks to accelerate development and ensure consistency across data platforms.
  • Actively participate in Agile ceremonies (e.g., sprint planning, stand-ups, retrospectives) and contribute to continuous improvement of team processes.
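For illustration only: the GitLab CI/CD responsibility above might look something like the following minimal `.gitlab-ci.yml` sketch. The stage names, image tag, and Databricks CLI step are assumptions for illustration, not Exinity's actual configuration.

```yaml
# Hypothetical .gitlab-ci.yml for a Databricks data-engineering repo (illustrative only).
stages:
  - test
  - deploy

run-tests:
  stage: test
  image: python:3.11
  script:
    - pip install -r requirements.txt
    - pytest tests/            # unit tests for pipeline code

deploy-jobs:
  stage: deploy
  image: python:3.11
  rules:
    - if: '$CI_COMMIT_BRANCH == "main"'   # deploy only from main
  script:
    - pip install databricks-cli
    - databricks jobs list     # placeholder step; a real pipeline would deploy job definitions here
```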

What We're Looking For

  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
  • 5+ years of experience in data engineering.
  • At least 2 years working with Azure Databricks.
  • Strong proficiency in PySpark, SQL, and Python.
  • Experience supporting production pipelines, including monitoring, alerting, and troubleshooting.
  • Experience with GitLab CI/CD, including pipeline configuration, runners, and integration with cloud services.
  • Familiarity with the capital markets domain, such as market data feeds, order books, trade execution, and risk metrics.
  • Proven ability to work effectively in Agile development environments.

Nice to Have

  • Azure certifications (e.g., Azure Data Engineer Associate).
  • Experience with real-time data processing using Kafka or Event Hubs.

Technical Stack

  • Azure Databricks, Confluent, DLT, Spark, Delta Lake
  • PySpark, SQL, Python
  • GitLab CI/CD
  • Kafka, Event Hubs

Exinity is an equal opportunities employer and positively encourages applications from suitably qualified and eligible candidates regardless of gender, sexual orientation, marital or civil partner status, gender reassignment, race, colour, nationality, ethnic or national origin, religion or belief, disability or age.

Required Skills
Azure Databricks, Confluent, Spark, PySpark, Delta Lake, SQL, Python, Kafka, GitLab CI/CD, DLT, Data Pipelines
About the Company
Exinity
Exinity is an energetic and diverse company with offices across Europe, Asia and Africa. It provides individuals in the world’s fast-developing economies with guidance, tools, and easy market access so they can trade and invest with confidence. The company's portfolio includes brands like Alpari and FXTM, serving over two million clients in 150 countries from regulated centres across four continents.
Job Details
Category: Data
Posted: 3 months ago