DevOps (nice to have)
Apache Kafka (nice to have)
Airflow (nice to have)
Snowflake (nice to have)
Amazon AWS (regular)
Apache Spark (advanced)
QOOB.DEV is actively looking for a highly motivated Senior AWS Data Engineer with a passion for technology. We’re strongly focused on modern Data Engineering on top of AWS, so as part of our team you will have the opportunity to build large enterprise Data Lakes, design and implement migrations from on-premise platforms to AWS, and guide our customers on cloud data engineering best practices.
Our projects span multiple areas, such as integrating data from multiple sources, applying transformations using Python / PySpark / Glue and SQL, and taking care of performance.
The ideal candidate is able to effectively translate business requirements into technical specifications, is experienced with Python / PySpark / Glue and / or SQL, and is willing to extend their AWS knowledge.
Your responsibilities:
Effectively convert business requirements into technical solutions
Promote software development best practices and a DevOps mindset
Use PySpark, AWS Glue and other services to build scalable data pipelines
Build data pipelines with Apache Airflow / AWS Step Functions
Design and build Data Lakes / Data Platforms using modern solutions and data architectures
Schedule and execute jobs, and provide support during UAT / go-live
Leverage AWS analytical services to build data applications
Our requirements:
At least 3 years of experience with Python, including an understanding of OOP concepts and coding best practices
Good knowledge of SQL for data transformation (including complex queries with analytical functions)
Ability to perform data manipulations, loading and extracting data from multiple sources into a target schema
Ability to work with multiple file formats (JSON, XML, etc.) and to analyze data where required for further processing
Willingness to learn AWS architecture (we support certification!)
Experience with Apache Spark (preferably Databricks)
Snowflake experience would be a plus (or a similar cloud DWH technology, e.g. AWS Redshift)
Hands-on experience with Apache Airflow would be a plus
What we offer:
100% remote work
A competitive salary
Multiple opportunities to gain new knowledge in the AWS / Data Engineering area at our internal knowledge-sharing meetups
Reimbursement of Data Engineering certification expenses