Spark Engineer
Sollers Consulting
Wrocław
4 days ago

Data Competency

Versatility is the keyword when it comes to what we do in IT. Data Competency is one of our initiatives to support the digital transformation of the financial sector, enabling our customers to become truly data-driven companies.

Our mission is to provide the capabilities needed to address the financial sector's data-related needs.

As a Spark Engineer, you will be responsible for building and optimizing our clients' data pipelines and data flows.

You will work on various data initiatives and ensure optimal data delivery. You are the ideal candidate if you are experienced with the Spark ecosystem.

If you enjoy building data systems from scratch and modifying existing ones, you will have those opportunities at Sollers.

Key facts about Sollers Consulting:

We are a Team of almost 700 professionals who build the Digital Future for the world’s largest insurance, banking, and leasing organizations.

Our history of business advisory and software implementation goes back to the year 2000. Sollers Consulting’s roots are in Poland, but the company’s footprint is visible around the world.

Working with us means taking part in many projects worldwide (Poland, Germany, Austria, Switzerland, UK, USA, Canada, Japan).

Our teams are located in Warsaw, Cologne, Gdansk, Tokyo, Lublin, Wroclaw, Paris, and Poznań. Being agile across our company and our projects enables us to play an active role in industries with high digitalization needs.

Tools & technologies used on projects:

  • Data architecture, data modeling, design patterns
  • RDBMS and NoSQL databases
  • DataOps, ETL technologies
  • Real-time data streaming, Spark, Airflow, Kafka
  • OLTP, OLAP, DWH, data lakes
  • BI & predictive analytics; AI / ML
  • Python, Java, Scala, R

You will have an opportunity to:

  • Build scalable data processing pipelines and SQL database integrations.
  • Advise on the use of appropriate tools and technologies.
  • Recommend potential improvements to existing data architecture.
  • Collaborate with analysts, experts, and tech leads in an Agile setup to meet clients' needs.
  • Address aspects like data privacy and security, compliance with regulations, integrity and availability, etc.
  • Guide the team with good Spark development practices.
  • Create Spark jobs for data transformation and aggregation (see the sketch after this list).
  • Perform Spark query tuning and performance optimization.
  • Define feasible test strategies and troubleshoot failures.
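
For illustration, below is a minimal sketch of the kind of transformation-and-aggregation job described above, written in Scala against the Spark SQL API. The dataset, paths, and column names (claims, product_line, payout_amount) are hypothetical stand-ins for a typical client workload, and Spark 3.x is assumed.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._

    object ClaimsAggregationJob {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("ClaimsAggregationJob")
          .getOrCreate()

        // Read a hypothetical claims dataset; the path and columns are illustrative.
        val claims = spark.read
          .option("header", "true")
          .option("inferSchema", "true")
          .csv("s3://example-bucket/claims/*.csv")

        // Transform: keep settled claims, then aggregate payouts per product line.
        val payoutsByProduct = claims
          .filter(col("status") === "SETTLED")
          .groupBy(col("product_line"))
          .agg(
            count("*").as("claim_count"),
            sum(col("payout_amount")).as("total_payout")
          )

        // Write the result for downstream consumers.
        payoutsByProduct.write
          .mode("overwrite")
          .parquet("s3://example-bucket/reports/payouts_by_product")

        spark.stop()
      }
    }

Query tuning on a job like this usually starts from the physical plan (e.g. payoutsByProduct.explain()) and from how the data is partitioned before the shuffle and the write.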

We bet on you, so we expect you to:

  • Have proven, hands-on experience with Spark.
  • Deeply understand distributed systems (e.g. CAP theorem, partitioning, replication, consistency, and consensus).
  • Be proficient in SQL as well as in Java or, preferably, Scala.
  • Know how to write useful abstractions to process similarly formatted datasets in a generic way.
  • Have a strategy for handling data schemas so that changes in the data don't break the code (see the sketch after this list).
  • Speak English (min. B2).
  • Communicate effortlessly with clients and team members.
  • Be able to work in the European Union.
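
As a rough illustration of the schema-handling expectation above: one common approach is to declare an explicit schema for only the fields a job depends on, so new upstream columns are ignored and missing fields surface as nulls instead of runtime failures. The sketch below assumes Spark 3.x and Scala; the field names and path are hypothetical.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.types._
    import org.apache.spark.sql.functions.col

    object SchemaTolerantReader {
      // Declare only the fields this job actually depends on.
      val requiredSchema: StructType = StructType(Seq(
        StructField("policy_id", StringType, nullable = false),
        StructField("premium", DoubleType, nullable = true),
        StructField("start_date", DateType, nullable = true)
      ))

      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("SchemaTolerantReader")
          .getOrCreate()

        // PERMISSIVE mode keeps malformed records instead of failing the job;
        // the input path is illustrative.
        val policies = spark.read
          .schema(requiredSchema)
          .option("mode", "PERMISSIVE")
          .json("s3://example-bucket/policies/")

        policies.select(col("policy_id"), col("premium")).show(10)

        spark.stop()
      }
    }

The same idea extends to the abstraction point above: a shared reader that takes a StructType and a path can process similarly formatted datasets in a generic way.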

We offer you:

  • The opportunity to quickly develop professionally.
  • Clear career path and future salary projection.
  • Individual learning & development budget.
  • German and French language classes.
  • Comprehensive health care, life insurance, travel insurance.
  • Home office policy.
  • Family support: wedding gifts, a generous layette for newborns, family parties.
  • Relocation package (if you come from another city).