Senior Data Engineer (m/f/d)
Hapag-Lloyd
Gdansk, PL
1 day ago

Responsibilities and Tasks:

Innovation

  • Data Extraction: Design or evaluate adaptive interface client generators that allow zero-code data extraction or streaming from systems via exposed APIs.
  • Develop intelligent data augmentation and labeling automation solutions.
  • Develop intelligent data preparation and evaluation pipelines, providing high-quality data ready to use for ML analysis and model development.

  • Training Data: Design or evaluate AI-supported data labeling systems to enable fast and accurate labeling projects on large amounts of unlabeled data.
  • Design and conduct internal and crowd-based data labeling projects to provide high-quality training data for creating new, or enhancing existing, HL ML services.

  • Continuously search for, evaluate and collect relevant datasets for training or model performance evaluation.
  • You curiously evaluate and test new AI data preparation and Big Data papers, libraries and third-party solutions, and actively participate in AI / interface / Big Data communities such as meetups, conferences, hackathons or Kaggle competitions, always looking for emerging opportunities with high innovation potential for Hapag-Lloyd.
  • AI Product Development

  • You are a key player when new business case specific AI Modules are developed.
  • You acquire the necessary understanding of the business case, the related processes and the available data structures.
  • Consult on and coordinate the training data extraction with the product teams.
  • If required, design, coordinate and conduct data labeling sub-projects, considering budget limits, quality requirements and schedule deadlines.
  • Design and implement the data structures required for the AI-module-specific training data.
  • Integrate the training data in the managed training data repository as required for the development of AI models enabling reproducible experiments.
  • You are responsible for alignment with the Data Lake team, clarifying the best approach to store and manage the training and operational data, considering the training and operational requirements of the AI module designed together with your data scientist and DevOps colleagues.
  • Support the Data Scientists on ETL tasks during the explorative Model analysis and design phase.
  • Implement the ETL pipeline as required by the model design.
  • Lead the system / data integration development of the API endpoints of the new AI Module in cooperation with the Product Teams and other source / target-System owners.
  • Conduct the operationalization of the ETL pipeline.
  • Design and implement the learning loop API endpoints in cooperation with the Product Teams and other source / target-System owners.
  • Participate in the design and implementation of the module performance monitoring.
  • Support product teams to handle AI Module related incident situations in a fast and solution oriented manner.
  • AI Platform Development

  • You’ll be a key player in building the generic, standardized and highly reusable platform of the Hapag-Lloyd data science development and analysis stack, composed of tools, services and modules, enabling the AI team as well as business and system analysts to continuously improve the time-to-market and cost efficiency of AI solutions.
  • You’ll be a key player in establishing and managing the AI Data Platform with a clear and understandable architecture and process concept: ready for vast amounts of data, supporting AI module configuration versioning as well as repeatable model development experiments, ready for any kind of structured and unstructured data (relational, text, images, videos, etc.), and ready for operational, scalable and high-performance OLTP as well as OLAP use.
  • You organize trainings and information sessions for IT and business departments, as well as for public community events, on various AI data management and quality topics, spreading knowledge and awareness about the possibilities, limits and future of AI in logistics and IT.
Requirements and Qualifications:

    A bachelor’s or master’s degree in computer science, business administration, mathematics, physics or another scientific area is preferred, but not required.

    Much more important are your experience, your attitude and your hunger for state-of-the-art AI development.

    Two to three years of relevant experience in enterprise-level IT that equipped you to communicate effectively with diverse stakeholders at the corporate level is a good starting point.

    Technical experience

  • You’ll need 3-4 years of hands-on development experience, with significant backend, batch and API involvement. Experience with test-driven development is a plus.
  • SQL, Python, JEE and JS, and related development stack elements including Git, Jenkins, common IDEs and ML frameworks, are important assets for your endeavor.
  • C++ would be helpful but initially optional.

  • Hands-on experience with two or more relational DBMSs like DB2, SQLite or PostgreSQL, as well as with NoSQL databases, is important.
  • Distributed databases are initially optional.

  • You should be confident working with various data formats, from tabular data like CSV to markup formats like HTML and common transfer formats like XML and JSON.
  • Hands-on experience with at least some MS Office, image, audio or video formats is also important.

  • You should bring at least some hands-on experience with cloud services and the related concepts and protocols of distributed computing, preferably AWS. More is better.
  • Data engineering experience

  • You should have a basic understanding of linear algebra to work with vectors and matrices, and you should be able to come up with numerically stable algorithms.
  • You’ll need relevant hands-on experience in building ETL pipelines.
  • So you should have experience in extracting data from databases and from other systems via web service APIs. Being able to consume data streams would also be an interesting asset.
  • For ETL you’ll need extensive hands-on experience with data cleaning, evaluating basic data statistics, discretization, imputation, encoding categorical data, randomization, normalization, outlier detection, and other transformation and evaluation methods.
  • You should have hands-on experience in at least one of the major data science topics, such as supervised vs. unsupervised learning, NLP, CNNs, RNNs or GANs, and related ML frameworks like sklearn, Keras, PyTorch or TensorFlow.
  • Being able to present and explain data with proper diagrams, as well as experience with at least one of the current data science platforms like Anaconda, Dataiku or RapidMiner, is a must.
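    The preparation steps named above can be illustrated with a small, self-contained pandas sketch (the data, column names and thresholds here are hypothetical examples, not part of any Hapag-Lloyd pipeline): impute a missing value, flag an outlier, encode a categorical column, and normalize a numeric feature.

    ```python
    import numpy as np
    import pandas as pd

    # Hypothetical raw shipment-like data with one gap and one outlier.
    df = pd.DataFrame({
        "weight_kg": [100.0, 120.0, np.nan, 9000.0, 110.0],
        "port": ["GDN", "HAM", "GDN", "RTM", "HAM"],
    })

    # Impute: fill the missing numeric value with the column median.
    df["weight_kg"] = df["weight_kg"].fillna(df["weight_kg"].median())

    # Detect outliers: flag values more than 3 median absolute deviations away.
    med = df["weight_kg"].median()
    mad = (df["weight_kg"] - med).abs().median()
    df["is_outlier"] = (df["weight_kg"] - med).abs() > 3 * mad

    # Encode categorical data: one-hot encode the port column.
    df = pd.get_dummies(df, columns=["port"])

    # Normalize: scale the numeric feature to zero mean and unit variance.
    df["weight_kg"] = (df["weight_kg"] - df["weight_kg"].mean()) / df["weight_kg"].std()
    ```

    In a real pipeline these steps would typically be wrapped in reusable, tested transformations rather than applied inline as shown here.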
  • Personal Skills

  • You’ll need to be able to understand, explain and discuss complex topics in fluent English. German and any other language is a plus in the context of multilingual models.
  • You enjoy sharing your expert knowledge with others and thus generate new knowledge.
  • As a senior data engineer, you feel responsible for supporting and enabling your team colleagues on a professional and personal level, ensuring a relaxed and collaborative atmosphere and continuously improving team performance.
  • At least one year of experience working in a Scrum team is a must, but classical project management topics like task breakdown, requirements engineering, make-or-buy analysis and Gantt charts will also be necessary.
  • Your good analytical understanding of complex interrelationships and a confident handling of pre-processing and evaluation of large amounts of data will support you in dealing with exciting questions.
  • You want to solve challenging problems. You show high commitment and want to make a difference.
  • Furthermore, you can present results to your team and our business stakeholders in a simple and understandable way. It makes no difference to you whether this is in Polish, English or German.
  • Strong troubleshooting and problem-solving skills.
  • Thrive in a fast-paced, innovative environment.
  • You enjoy working in an international team and are passionate about new technologies and software.
  • In a highly motivated team of AI experts, and thanks to Hapag-Lloyd’s high-impact, enterprise-level business cases, you have the chance to experience a quantum leap in your personal level of expertise and professional maturity.
  • Together we’ll take care of the right balance between room for focus, creative innovation, getting things done, personal development, enough recreation and a relaxed, collaborative atmosphere.
  • As one of the five biggest carriers in the world, Hapag-Lloyd is literally moving the world. Every optimization and automation not only saves monetary costs but also reduces the ecological footprint.
  • So you’ll have a big impact on making the world a better place to live.

    With us, your ideas, personality and skills have the freedom to evolve and make a difference. Hapag-Lloyd offers many different and challenging business areas, such as customer service, operational container steering, dangerous goods and maritime IT, and supports you whenever you urge for new frontiers, even at the international level.

    What does it mean to join a leading global liner shipping company? It means that you will have access to leading technology and career opportunities all around the world.

    You will be working with a competitive team of experts from diverse backgrounds, making sure the world keeps moving. Flat hierarchies and an innovative, performance-driven culture give you the chance to make a real impact.

    Let’s navigate the future together!
