Big Data Solution Architect
DXC Technology Company
Warsaw, Masovian, Poland

Job Description:

DXC Technology (NYSE: DXC) is the world’s leading independent, end-to-end IT services company, helping clients harness the power of innovation to thrive on change.

Created by the merger of CSC and the Enterprise Services business of Hewlett Packard Enterprise, DXC Technology serves nearly 6,000 private and public sector clients across 70 countries.

The company’s technology independence, global talent and extensive partner alliance combine to deliver powerful next-generation IT services and solutions.

DXC Technology is recognized among the best corporate citizens globally. For more information, visit .

Applications Services Global Delivery Poland, a large, growing delivery center located in Warsaw and Łódź, offers high quality Applications Development and Applications Management Services to DXC customers all over the world.

The Center’s portfolio covers the full software lifecycle, including: Applications Development, Applications Management, Quality Assurance, Project Management, Testing, Business and Systems Analysis.

The main technology focus is on Big Data, Data Warehousing / Business Intelligence, Web and mobile solutions, ERP and CRM systems.

Most of the Center's employees hold multiple certifications in Hadoop, Microsoft, Oracle, SAP, ITIL, PMP, and Six Sigma.

We are currently looking for:

Big Data Solution Architect

Role Overview

  • Lead the Big Data technical role for customer RFIs, RFPs and project planning
  • Lead the Big Data team in project implementations
  • At least 2 Big Data projects completed as Solution Architect
  • At least 5 years’ experience in the BI / DW or Big Data area

Qualifications: Hadoop

  • Familiarity with:
    • Hadoop distributions like Hortonworks / Cloudera / MapR
    • SQL-like engines like Hive / Impala / HAWQ
    • Data modeling and reporting on Hadoop
    • NoSQL databases like HBase / Gemfire / Cassandra / MongoDB
    • ETL tools for Hadoop processing like Informatica PowerCenter / Talend
    • Stream processing engines like Storm / Spark
    • Messaging systems like Kafka / RabbitMQ / ZeroMQ
    • Machine Learning on Hadoop like Spark MLlib / R / Cloudera Data Science Workbench
    • Processing languages on Hadoop like Java / Scala / Python
    • Linux OS like RedHat / CentOS / Ubuntu with bash scripting
    • DevOps tools like Chef, Jenkins, Ansible, Git
Qualifications: DWH / BI

  • Understanding of DWH / ETL concepts
  • Experience working with analytical databases (Oracle Exadata / Teradata / Vertica)
  • Experience with ETL tools (like Informatica / DB2 / Talend)
  • Experience in data modeling would be a plus
Qualifications: Cloud

  • Familiarity with one of the cloud stacks, Azure / AWS:
    • Storage
    • Integration
    • Compute engines
    • Database engines
    • Analytics
    • Machine Learning
    • Network
    • Security and Identity
    • Developer Tools
    • DevOps
Qualifications: General

  • Good English language skills.
  • Problem-solving and analytical skills.
  • Willingness to explore emerging technologies and adapt quickly.
  • Eagerness to learn and adaptability to change.
We offer:

  • Work in an international company
  • Career development opportunities
  • Modern and friendly work environment with an open-door policy
  • Professional training
  • Competitive salary and flexible working hours
  • Private medical care
  • Social benefits system
  • Life insurance
  • Extended wellness and sport program