About Job
Summary
Build data-intensive systems that actually scale. Own distributed processing, not just services. Shape how large-scale data flows power products.
We’re looking for a Senior Scala / Data Engineer for one of our clients — a company operating in a data-heavy, distributed systems environment. This is a hands-on role for someone with a strong Scala background who has built and optimized high-performance data systems in production.
This role is for someone who takes full ownership of distributed data systems — from architecture to performance. You’ll work on complex, large-scale data processing challenges where efficiency, scalability, and fault tolerance are critical.
Responsibilities
- Build and optimize distributed data processing systems using Scala;
- Design and implement high-throughput, low-latency data pipelines;
- Work with big data frameworks (e.g., Spark) and distributed architectures;
- Develop and maintain scalable backend services for data-intensive workloads;
- Optimize performance of data processing jobs and infrastructure;
- Ensure fault tolerance and resilience of distributed systems;
- Implement monitoring, logging, and observability for data pipelines;
- Collaborate with engineering teams to design scalable system architectures;
- Contribute to cross-functional backend and data engineering efforts;
- Ensure high standards of code quality, performance, and reliability.
Requirements
What We Expect
- Strong experience with Scala in production environments;
- Proven experience as a Senior Data Engineer or Backend Engineer in data-heavy systems;
- Strong understanding of distributed systems and parallel processing;
- Experience with big data technologies (Spark, Kafka, etc.);
- Strong knowledge of JVM ecosystem and performance tuning;
- Experience building high-load, scalable systems;
- Strong system design and architectural thinking;
- Ability to work independently and take ownership;
- Strong problem-solving skills in complex, distributed environments;
- English level: B2+ (C1 preferred);
- Availability to work in EU timezone.
Professional Skills
Tools: Google Cloud Platform (GCP), Apache Spark, Amazon Web Services (AWS)
Tech Stack: Scala
Languages: English (Upper-Intermediate)
Will be a plus
- Experience with functional programming paradigms;
- Experience with cloud infrastructure (AWS / GCP);
- Experience with stream processing frameworks (Flink, Akka Streams);
- Background in low-latency or real-time systems;
- Experience in niche or hard-to-hire tech domains.
What We Offer
- Opportunity to work at a company rated a top employer (DOU 2025);
- Interesting projects and challenges that accelerate professional and personal growth;
- Work with a creative, proactive, and empathetic team;
- Comfortable, stylish office in Kyiv with generator/battery backup;
- Minimal bureaucracy, regular feedback, and team support;
- Team-building events: parties, online activities, picnics, and more.
About Company
