Big Data

Big Data Developer / Sr. Developer

Submit Your Resume

* All fields are mandatory

First Name
Last Name
Email
Phone Number
Position Applying for
Core Skills

Upload Resume
How did you hear about us? (Friend / LinkedIn / Google / Other)


Consultant Big Data Admin

Your responsibilities as a Consultant Big Data Admin:
  • Responsible for implementation and ongoing administration of Hadoop infrastructure
  • Performance tuning of Big Data / cloud environments
  • Professional and technical advice on Big Data concepts and technologies, in particular highlighting the business potential through real-time analysis
  • Aligning with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop and to expand existing environments
  • Working with data delivery teams to set up new Hadoop users, including creating Linux accounts, setting up Kerberos principals, and testing HDFS, Hive, Pig and MapReduce access for the new users
  • Cluster maintenance, including the creation and removal of nodes, using tools such as Ganglia, Nagios and Ambari
  • Performance tuning of Hadoop clusters and Hadoop MapReduce routines
  • Screening Hadoop cluster job performance and capacity planning
  • Monitor Hadoop cluster connectivity and security
  • Manage and review Hadoop log files
  • File system management and monitoring
  • HDFS support and maintenance
  • Diligently teaming with the infrastructure, network, database, application and business intelligence teams to guarantee high data quality and availability
  • Collaborating with application teams to install operating system and Hadoop updates, patches and version upgrades when required
  • Software installation and configuration
  • Database backup and recovery
  • Database connectivity and security
  • Disk space management
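Several of the responsibilities above (managing and reviewing Hadoop log files, monitoring the cluster) start with simple log scanning. A minimal sketch in Python follows; the log4j-style line format and the sample entries are illustrative assumptions, not taken from this posting:

```python
# Minimal sketch: scan Hadoop-style log lines and summarize severity levels.
# The "date time LEVEL source: message" layout below is an assumed, typical
# log4j format; real cluster logs may differ.
from collections import Counter

def summarize_log(lines):
    """Count log lines per severity level (INFO, WARN, ERROR, ...)."""
    levels = Counter()
    for line in lines:
        parts = line.split()
        if len(parts) >= 3:
            levels[parts[2]] += 1  # level follows the date and time fields
    return dict(levels)

sample = [
    "2024-01-01 12:00:01 INFO  namenode.FSNamesystem: roll edit log",
    "2024-01-01 12:00:02 ERROR datanode.DataNode: disk failure on /data1",
    "2024-01-01 12:00:03 WARN  hdfs.StateChange: under-replicated block",
]
print(summarize_log(sample))  # {'INFO': 1, 'ERROR': 1, 'WARN': 1}
```

In practice a summary like this would feed an alerting tool such as Nagios rather than a print statement.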
Requirements
  • General operational expertise, such as good troubleshooting skills and an understanding of system capacity, bottlenecks, and the basics of memory, CPU, OS, storage and networks
  • Ability to deploy and upgrade a Hadoop cluster, add and remove nodes, keep track of jobs, monitor critical parts of the cluster, configure NameNode high availability, set up Kerberos for the cluster, schedule and configure jobs, and take backups
  • Knowledge of best practices for big data technologies and configuration of various components
  • Good knowledge of Linux
  • Curiosity about and openness to Big Data technologies
  • Knowledge of relational and non-relational databases
  • Use of integrated development process models
  • Understanding of principles of Data Science and Machine Learning
  • Customer orientation and highest quality standards
  • Well-grounded knowledge of different presentation techniques
Technical Skills
  • 4+ years of experience; knowledge of Java web/application servers, the Big Data tool stack, and NoSQL databases such as HBase, Cassandra, MongoDB, CouchDB, Riak, etc.
  • Hadoop ecosystem skills such as Ambari, Ranger, Kafka, HBase, Hive, Pig, etc.
  • Experience deploying and managing big data and cloud applications in production environments
  • Experience with Pig, Hive and MapReduce on HDFS
  • Experience with modern software development methodologies such as Scrum or XP
  • Experience with Cloudera/MapR/Hortonworks
  • Administration of Big Data environments and virtual machines using Cloudera, AWS or other cloud platforms
  • Expertise in core Java, Spring, Struts, JSP and web services; gathering and processing raw data at scale using scripts, web scraping, SQL queries, etc.
  • Implementing ETL processes
  • Experience with various messaging systems, such as Kafka or RabbitMQ
  • Good understanding of Lambda Architecture, along with its advantages and drawbacks
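As a rough illustration of the Lambda Architecture mentioned above: a batch layer recomputes views over the full master dataset, a speed layer covers events that arrived after the last batch run, and queries merge the two. This is a toy sketch; all function and variable names are hypothetical:

```python
# Toy Lambda Architecture sketch: batch view over the immutable master
# dataset, speed view over recent events, and a query-time merge.
# All names are illustrative, not from any specific framework.
from collections import Counter

def batch_view(master_dataset):
    """Batch layer: recompute counts over the full historical dataset."""
    return Counter(master_dataset)

def speed_view(recent_events):
    """Speed layer: incrementally count events not yet in the batch view."""
    return Counter(recent_events)

def query(batch, speed, key):
    """Serving layer: merge batch and real-time results at query time."""
    return batch.get(key, 0) + speed.get(key, 0)

master = ["click", "click", "view"]   # already processed by the nightly batch
recent = ["click", "view", "view"]    # arrived since the last batch run
b, s = batch_view(master), speed_view(recent)
print(query(b, s, "click"))  # 3
print(query(b, s, "view"))   # 3
```

The sketch also hints at the architecture's main drawback: the same counting logic must be maintained in two layers.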
Skills
  • Team and goal-oriented work style
  • Ability to become acquainted with new topics and tasks quickly
  • Analytical and solution-oriented thinking
  • Willingness to assume leadership responsibility


Sr. Big Data Developer

Job Responsibilities
  • Design and develop high-volume, low-latency applications for mission-critical systems, delivering high-availability and performance
  • Write well designed, testable, efficient code by using best software development practices
  • Contribute in all phases of the development lifecycle
  • Integrate data from various back-end services and databases
  • Gather and refine specifications and requirements based on technical needs
  • Hands-on development of software solutions and architectures
Requirements
  • 4 to 8 years of Java experience; knowledge of Java, Scala, the Big Data tool stack, and NoSQL databases such as HBase, Cassandra, MongoDB, CouchDB, Riak, etc.
  • 2+ years’ experience installing and configuring Hadoop ecosystem components
  • Strong experience developing applications in MapReduce
  • Hands-on experience in core Java, J2EE, web services and JMS
  • Experience working with NoSQL databases, messaging using JMS or Apache Kafka, stream-analysis tools (Storm/Spark/Flink), and large-scale event processing and queues would be an added advantage
  • Key functions: development of a web-based expenses application in an Agile environment
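For context on the MapReduce experience listed above, the classic word-count pattern (map, shuffle, reduce) can be sketched in plain Python. This illustrates the programming model only; it is not Hadoop API code, and the sample input is invented:

```python
# MapReduce word count, expressed as the three phases the model defines.
from collections import defaultdict

def map_phase(lines):
    """Map: emit a (word, 1) pair for every word in every input line."""
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def shuffle(pairs):
    """Shuffle: group values by key, as the framework does between phases."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: sum the counts for each word."""
    return {word: sum(counts) for word, counts in groups.items()}

lines = ["Hadoop stores data", "Spark and Hadoop process data"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts["hadoop"], counts["data"])  # 2 2
```

In real Hadoop jobs the shuffle is handled by the framework; the developer supplies only the map and reduce functions.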