Job Title: Big Data DevOps Engineer
1. Job Summary (highlight project details / what is exciting about the role):
New transformational program for a UK customer focusing on Big Data and advanced visualization capabilities
Agile mode of working with teams co-located in London
Sound DevOps expertise on CI/CD pipelines within the Big Data technology landscape, and the ability to articulate and influence leadership stakeholders on infrastructure automation, are pre-requisites. In addition to core DevOps expertise, hands-on experience with AWS, Hortonworks, S3, Java 8, HBase, Kafka, JanusGraph, HDF (NiFi) and related big data technology elements is a must. Extensive demonstrable experience is required.
Eligibility for SC clearance – basic eligibility: UK/EU citizen or Indefinite Leave to Remain visa holder, without having left the UK for more than 6 months in the last 5 years.
2. Key Responsibilities (list what the person will be doing on a day-to-day basis): % of time
Representing Cognizant as a DevOps expert, the associate needs to demonstrate Cognizant's value to clients, prospects and partners to establish trust, win confidence and drive more business.
Responsible for the design, deployment and maintenance of complex Linux-based infrastructure.
Setting up a highly secure Hortonworks Hadoop cluster in the AWS cloud; cluster administration and maintenance, including upgrades, capacity planning and optimisation
Automation of operations, including setting up Continuous Integration (CI) and Continuous Deployment (CD) pipelines, code versioning and application containerisation
Running/managing a data platform with multiple tenants and client applications, for both transactional (system of record) and analytical use cases
Overseeing the integration of the development teams' output to production quality, and supporting the live system
Working as a team with other leads and developers, owning and contributing to deliverables, and guiding/leading the team towards success.
100%
3. Budgetary Responsibility: YES / NO
If answer is ‘yes’ then please provide details:
4. Job Requirements
As per the 2006 Age Discrimination Act, please do not specify the number of years' experience. Use words like Extensive, Strong, Good, Fair.
Essential Skills:
Experience with infrastructure automation tools like Ansible, Terraform, SaltStack, Puppet and Docker.
AWS Cloud Setup & Administration
Strong hands-on skills in setting up Linux-based infrastructure.
Fluency in languages including Python & Java
Expertise in CI tools such as Jenkins/Nexus
Experience with Java Build Systems, primarily Maven, Gradle.
Support for code versioning and branching strategies using GitHub
Configuration management and backup/restore policies via specialised tools (Puppet) and custom Bash scripts.
System monitoring solutions such as Zabbix, Prometheus and Nagios
Log management via Elasticsearch and Logstash (ELK)
Hadoop Cluster administration, maintenance, capacity planning and optimization
Securing Hadoop (HDP essential) infrastructure using Knox and Kerberos
Ability to identify areas where processes and standards could be improved, and to help implement changes
Experience working in an Agile environment
Ability to clearly articulate both the technology and business pros and cons of various technologies, platforms, and architectural options
Ability to document processes
A 'can do' attitude and a strong team player
Excellent written and verbal communication skills, with the ability to work on complex and critical systems
Nice to Have Skills:
Development experience using Java 8, HDP, Kafka and HBase
Knowledge of agile framework
Experience working in a Public Sector domain
Qualifications:
Graduate with work experience
HDPCA (Hortonworks Certified Administrator)
5. Travel: Yes
Successful applicants are expected to work at Cognizant clients' sites, primarily based in and around London; however, there may be exceptions. This is a consultancy role.