Job Purpose : Citi is looking for a Senior Big Data Developer with expertise in Hadoop, Spark, Scala, and Akka clustering who will help build a global Ops Dashboard platform.
He or she will be responsible for designing, coding, and implementing a large-scale global enterprise distributed system, including integration with third-party tools, business process management, and various interfaces to internal Citi systems.
Job Background / context :
The platform provides internal clients and senior managers with Map and Measure capabilities for Banking, Lending & Investment products
Manage a global development team with members located in the US, Zurich, Singapore, Pune, and other Citi locations
The role requires very strong communication, organization, and planning skills to build strong partnerships with CPB business, operations, and technology teams and to ensure process standardization
Requires good analytical skills to filter, prioritize, and validate potentially complex material from multiple sources
Citi has strict coding and engineering standards to follow, from proper unit testing to continuous integration, and the candidate should be familiar with the tools these practices require.
Key Responsibilities :
Hands-on management of the global development team described under Job Purpose above
Follow Citi's engineering standards and deploy software components using continuous integration
Architect, design, and implement a global transaction platform with enterprise integration with third-party tools and various internal Citi systems
Drive architecture, design, and implementation of strategic large-scale distributed systems and projects
Knowledge / Experience :
Bachelor's and/or Master's degree in Computer Science or a related field
12+ years' experience in a relevant area
Banking, Lending and Investment product knowledge preferred
Skills Required :
Strong knowledge of Hadoop: HBase, Hive, and MapReduce
Strong experience using Apache Spark, Spark SQL, and other data-processing tools and languages
Experience building high-performance algorithms in scalable languages such as Java, Scala, Python, and R
Hands-on experience implementing Akka clustering
Experience working on Linux systems
Experience using standard SDLC tools such as Jira, Git, and Jenkins
Experience with Kafka, microservices, web services, and writing RESTful APIs
Experienced in handling large datasets: using partitions, Spark's in-memory capabilities, and effective and efficient joins, and performing transformations and other operations during the ingestion process itself
Strong development lifecycle understanding and capability
Strong analytical and problem-solving skills
Knowledge of continuous-integration tools such as TeamCity
Knowledge of WebSphere Application Server and clustering
Provide management and guidance to offshore development teams, ensuring global platform standardization
Architecture / Design experience
Knowledge / experience with high volume / throughput systems
Works well with, and respects, other disciplines on the project teams
Ability to manage risk appropriately.
Ability to thrive in a team-oriented, fast-paced environment
Excellent technical written and verbal communication skills
Ability to prioritize individual and team tasks and projects so that deadlines are met
Experience working with business partners and engineers to gather, understand, re-engineer, standardize, and bridge definitions and requirements
An innate desire to deliver and a strong sense of accountability for your work
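The large-dataset requirement above (partitioning, Spark in-memory capabilities, efficient joins, and transformations performed during ingestion itself) can be sketched as follows. This is a minimal, hypothetical Scala example; the paths, column names, partition count, and application name are illustrative assumptions, not details of the actual Citi platform:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object IngestionSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("ops-dashboard-ingest") // hypothetical application name
      .getOrCreate()

    // Hypothetical transaction feed; in practice this might be a Kafka or HDFS source.
    val txns = spark.read.parquet("/data/txns")
      .repartition(200, col("account_id")) // partition by the join key up front

    // Small reference table: broadcasting it lets the join avoid a shuffle.
    val products = spark.read.parquet("/data/products") // hypothetical path

    // Filter, join, and transform during ingestion itself, before persisting,
    // so downstream consumers read already-cleaned, enriched data.
    val enriched = txns
      .filter(col("amount") > 0)
      .join(broadcast(products), Seq("product_id"))
      .withColumn("ingest_date", current_date())

    enriched.cache() // keep hot data in memory for subsequent actions
    enriched.write.partitionBy("ingest_date").parquet("/data/enriched")
  }
}
```

The sketch illustrates the pattern the role calls for: choosing partitioning to match the join key, using a broadcast join for a small dimension table, and applying transformations in the ingestion job rather than deferring them to consumers.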