Bratislava, Slovakia · Hybrid

Bloomreach is hiring a Senior Data Engineer, Datacraft

About the Role

Shape the future of data infrastructure as a founding Senior Data Engineer on a new team building a modern data platform from the ground up. Your work will directly enable enterprise clients to unlock insights from their data across cloud environments including Snowflake, BigQuery, and Databricks, powered by an agentic analytics architecture.

What You’ll Do

  • Design and implement robust data pipelines using Kafka, GCS, Iceberg, and cloud data warehouses, following medallion architecture principles
  • Own end-to-end data modeling and transformation logic for identity resolution and consent-aware data flows
  • Develop and maintain orchestration workflows in Airflow and Cloud Composer, ensuring reliability and observability
  • Build and refine evaluation systems for the Loomi Analytics Agent, including prompt tuning and trace analysis
  • Create modular data interfaces and MCP (Model Context Protocol) servers to support agent-driven analytics
  • Define canonical metrics and dimensional models in the warehouse to support downstream reporting
  • Enable seamless integration between AI agents and BI tools like Cube and Looker Studio
  • Break down complex objectives into deliverable tasks and lead small initiatives independently
  • Collaborate across engineering and product to ensure data systems are debuggable, reliable, and performant
  • Participate in on-call rotations and contribute to architectural decisions shaping the platform’s evolution

What We’re Looking For

  • Proven experience in data engineering with strong SQL and data modeling expertise
  • Hands-on background with star/snowflake schemas, SCDs, and performance optimization through partitioning and clustering
  • Production experience on GCP using BigQuery, Spark on DataProc, Iceberg, and Airflow
  • Familiarity with open table formats such as Iceberg, Delta Lake, or Hudi
  • Proficiency in Python and experience with DAG-based orchestration systems
  • Understanding of data quality, lineage, monitoring, and recovery strategies including backfills
  • Ability to translate ambiguous requirements into iterative, working solutions
  • Strong communication skills and the ability to collaborate across teams
  • Ownership mindset with a focus on reliability, scalability, and continuous improvement
  • Experience working in remote-first, asynchronous environments, ideally while based in Central Europe

Nice to Have

  • Exposure to AI-powered analytics, agent orchestration, or LLM-based data systems
  • Background in customer data platforms, marketing analytics, or semantic modeling
  • Experience with dbt, Looker, Cube, Snowflake, or Databricks

Technology Environment

Our stack includes Python, Go, SQL, Kafka, BigQuery, Iceberg, GCS, Mongo, Redis, Spark, DataProc, Airflow, Snowflake, Databricks, GCP, Kubernetes, Terraform, LLM APIs, MCP, agent orchestration frameworks, Cube, Looker Studio, Grafana, Prometheus, PagerDuty, Sentry, OpenTelemetry, GitLab CI/CD, Cursor, and Claude Code.

Work Environment

This is a hybrid role with a remote-first approach, open to candidates in Bratislava, Brno, Prague, or fully remote within Central Europe. You’ll have the flexibility to work from home or an office, supported by a culture built on trust, autonomy, and results.

Benefits & Culture

  • No rigid schedules or approval hierarchies — just ownership and impact
  • Flexible hours and virtual-first collaboration across global hubs
  • Annual paid volunteering days and quarterly DisConnect days for rest and renewal
  • Well-being support through Calm, sports activities, and an Employee Assistance Program
  • Extended parental leave (up to 26 weeks) and recognition of work anniversaries
  • $1,500 annual learning budget for courses, books, or certifications
  • Performance bonus tied to company success and immediate referral bonuses up to $3,000
  • Access to communication coaching and leadership development programs
  • Regular global events to connect across regions and teams
Required Skills

Python, Go, SQL, Apache Kafka, BigQuery, Apache Iceberg, GCS, MongoDB, Redis, Apache Spark, Airflow, Cloud Composer, DataProc, Delta Lake, Hudi, Data Pipeline Engineering, GCP, Data Modeling, Star Schema, Snowflake Schema, SCD
About the Company
Bloomreach

Loomi AI, Bloomreach's agentic platform, understands each customer to personalize their experience in real time — across email, web, mobile, and search. The platform connects first-party customer and product data with business metrics to deliver intelligent personalization at scale.

Bloomreach powers AI-driven marketing automation, ecommerce search, and conversational shopping experiences, helping brands increase revenue, loyalty, and conversion rates across 13+ channels.

Job Details

Category: Data
Posted: 18 days ago