Consultant Big Data Admin

Your responsibilities as a Consultant Big Data Admin:
  • Responsible for implementation and ongoing administration of Hadoop infrastructure
  • Performance tuning of Big Data and cloud environments
  • Professional and technical advice on Big Data concepts and technologies, in particular highlighting the business potential through real-time analysis
  • Aligning with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop and to expand existing environments
  • Working with data delivery teams to set up new Hadoop users, including creating Linux accounts, setting up Kerberos principals and testing HDFS, Hive, Pig and MapReduce access for the new users
  • Cluster maintenance, including the creation and removal of nodes, using tools such as Ganglia, Nagios and Ambari
  • Performance tuning of Hadoop clusters and Hadoop MapReduce routines
  • Screening Hadoop cluster job performance and capacity planning
  • Monitoring Hadoop cluster connectivity and security
  • Managing and reviewing Hadoop log files
  • File system management and monitoring
  • HDFS support and maintenance
  • Diligently teaming with the infrastructure, network, database, application and business intelligence teams to guarantee high data quality and availability
  • Collaborating with application teams to install operating system and Hadoop updates, patches and version upgrades when required
  • Software installation and configuration
  • Database backup and recovery
  • Database connectivity and security
  • Disk Space Management
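Capacity planning and disk-space management for an HDFS cluster usually start from a back-of-the-envelope estimate: raw storage must cover the logical data volume times the replication factor, plus free headroom for intermediate and temporary data. A minimal sketch, using illustrative figures (a 3x replication factor, the common HDFS default, and an assumed 25% headroom):

```python
def raw_capacity_tb(logical_data_tb, replication=3, headroom=0.25):
    """Raw HDFS capacity needed for a given logical data volume.

    replication: HDFS block replication factor (3 is the common default).
    headroom: fraction of raw disk kept free for temp/intermediate data.
    """
    return logical_data_tb * replication / (1 - headroom)

# e.g. 100 TB of logical data -> 400.0 TB of raw disk
print(raw_capacity_tb(100))
```

The replication factor and headroom here are assumptions to be replaced with the values actually configured on the cluster (e.g. `dfs.replication` in HDFS).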
Requirements
  • General operational expertise such as good troubleshooting skills, an understanding of system capacity and bottlenecks, and the basics of memory, CPU, OS, storage and networks
  • Able to deploy and upgrade a Hadoop cluster, add and remove nodes, keep track of jobs, monitor critical parts of the cluster, configure NameNode high availability, set up a Kerberized cluster, schedule and configure jobs, and take backups
  • Knowledge of best practices for big data technologies and configuration of various components
  • Good Knowledge of Linux
  • Curiosity about and openness to Big Data technologies
  • Knowledge of relational and non-relational databases
  • Use of integrated development process models
  • Understanding of principles of Data Science and Machine Learning
  • Customer orientation and highest quality standards
  • Well-grounded knowledge of different presentation techniques
Technical Skills
  • 4+ years of experience with Java web/application servers, the Big Data tool stack, and NoSQL databases such as HBase, Cassandra, MongoDB, CouchDB and Riak
  • Hadoop ecosystem skills such as Ambari, Ranger, Kafka, HBase, Hive and Pig
  • Experience deploying and managing big data and cloud applications in production environments
  • Experience with Pig, Hive and MapReduce on HDFS
  • Experience with modern software development methodologies for Big Data, such as Scrum or XP
  • Experience with Cloudera/MapR/Hortonworks
  • Administration of Big Data environments and virtual machines on Cloudera, AWS or other cloud platforms
  • Expertise in the following technologies: Core Java, Spring, Struts, JSP and web services
  • Ability to gather and process raw data at scale using scripts, web scraping, SQL queries, etc.
  • Implementing ETL processes
  • Experience with various messaging systems, such as Kafka or RabbitMQ
  • Good understanding of Lambda Architecture, along with its advantages and drawbacks
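The MapReduce experience listed above boils down to the map/shuffle/reduce pattern that Hadoop applies at cluster scale. A minimal, framework-free word-count sketch of that pattern in plain Python (function names here are illustrative, not a Hadoop API):

```python
from collections import defaultdict

def map_phase(lines):
    # Mapper: emit a (word, 1) pair for every word in every input line.
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def shuffle(pairs):
    # Shuffle/sort: group values by key, as the framework does between phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reducer: sum the counts emitted for each word.
    return {word: sum(counts) for word, counts in groups.items()}

lines = ["big data admin", "big data tools"]
print(reduce_phase(shuffle(map_phase(lines))))
# {'big': 2, 'data': 2, 'admin': 1, 'tools': 1}
```

In Hadoop the shuffle step runs across the network between mapper and reducer nodes; the local version above only illustrates the data flow.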
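The Lambda Architecture mentioned above combines a batch layer (complete but stale views over all historical data) with a speed layer (incremental views over recent events), merged at query time by a serving layer. A toy sketch of that merge, with hypothetical metric names and counts:

```python
# Batch layer: view precomputed over all historical events (recomputed periodically).
batch_view = {"clicks": 1000, "signups": 40}

# Speed layer: incremental counts over events that arrived since the last batch run.
realtime_view = {"clicks": 25, "signups": 2}

def serve(metric):
    # Serving layer: merge the batch and real-time views to answer a query.
    return batch_view.get(metric, 0) + realtime_view.get(metric, 0)

print(serve("clicks"))  # 1025
```

The main drawback the posting alludes to is visible even in this sketch: the same aggregation logic must be kept consistent across two code paths, which is the maintenance cost that alternatives such as the Kappa Architecture try to avoid.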
Skills
  • Team and goal-oriented work style
  • Ability to become acquainted with new topics and tasks quickly
  • Analytical and solution-oriented thinking
  • Willingness to assume leadership responsibility