As a Big Data DevOps Engineer, you will work on a range of high-impact projects for clients worldwide, focused mainly on mixed cloud solutions, on-premises solutions, and the Hadoop ecosystem.
These include automating operations, managing platform deployments, and guiding platform architecture while ensuring flexibility and scalability.
Are you ready for this kind of challenge?
WITH ATOS YOU WILL:
Gain knowledge of different Big Data solutions from around the world
Work primarily with cloud technologies such as Google Cloud, Azure, or AWS
Work closely with a variety of external and internal vendors
Design, create, and implement the best solutions in Big Data environments
Design, build, and maintain Big Data environments, including cloud solutions
Create automation across Big Data environments
Be an active member of a DevOps engineering team collaborating with customers all over the world
Participate in R&D projects related to cloud computing and Big Data
Suggest new standards and solutions to ensure high quality of delivered services
Be a technical team leader in Big Data solutions for DevOps engineers
Track trends and the latest issues related to the domain of ongoing projects
Create technical documentation, processes, and procedures for the environments
Support deployment, customization, upgrades, and monitoring via DevOps tools
Job Requirements:
3 years of experience with Linux/Unix systems, including installation, configuration, networking, backups, updates, and patching
2 years of experience with Big Data platform solutions, including Hadoop, HDFS, HBase, and Spark
A very strong Java and SQL background, with the ability to think in terms of networks rather than tables
Knowledge of cloud solutions (Amazon, Google, Azure, Oracle)
Knowledge of monitoring systems (e.g., Nagios) and various automation tools
Knowledge of Spark Streaming, Kafka, NiFi, Flume, ZooKeeper, Hive, HAWQ, Cassandra, Impala
Experience with enterprise application and information integration
Hortonworks Hadoop certification is a plus
Passion for technology and understanding how things work
Ability to work occasional weekends and a varied schedule (e.g., during go-live)
Competencies and skills:
An inspiring, motivating, and positive attitude; does not hesitate to take up any challenge
A positive, can-do mindset that focuses on meeting delivery and quality targets rather than presenting known issues or other blockers as disrupting agents
Ability to communicate effectively both verbally and in writing
Good teamwork and interpersonal skills
Readiness to work with Big Data (reliably processing terabytes of data on a daily basis) and Fast Data (processing tens to hundreds of thousands of events per second in a cluster/cloud environment)
Very good English language skills (at least B2 level)
Nice to have:
Knowledge of VMware / Microsoft system administration
Knowledge of Ansible and PostgreSQL
Familiarity with programming languages (especially Scala, Python, R)
Your Application:
If you wish to apply for this position, please click below to complete our online application form and attach your CV in Word, RTF, or plain-text format.
Atos does not discriminate on the basis of race, religion, colour, sex, age, disability or sexual orientation. All recruitment decisions are based solely on qualifications, skills, knowledge and experience and relevant business requirements.
We are committed to making reasonable adjustments to the application process for people with disabilities.