Big Data Administrator
Our Fortune 50 client is seeking an HBase Administrator to manage a clustered Hadoop environment. You will work with software developers to support and operate HBase in production and non-production environments, and you will be responsible for designing, implementing, and supporting the big data platform.
- Design, configure, optimize, maintain, and support a clustered Hadoop environment for optimum performance and 100% uptime for 24/7 business-critical applications
- Provide technical leadership within the team responsible for the development and support of applications using the Hadoop ecosystem (HBase, Pig, Hive, YARN) and Kafka
- Install, configure, and maintain a Kafka cluster and Kafka Connect, with replication across multiple datacenters
- Provide production support in a 24/7 environment as part of an on-call rotation schedule
- Ensure data security (Kerberos authentication and authorization), data quality, and data governance within the Hadoop environment
- Deploy releases of new technologies; design, install, configure, and maintain systems and perform system integration testing
- Develop and maintain monitoring tools using Grafana, Prometheus, and Zabbix to provide a real-time view into the Hadoop and Kafka clustered environments
- Research, evaluate, and recommend software and hardware upgrades as needed
- Develop and maintain best practices for developing against Hadoop clusters
- Develop and support applications using test-driven, behavior-driven, and agile development methodologies
- Provide new hardware specifications to users based on application needs and anticipated growth; install new servers and maintain the server infrastructure
- Collaborate with administrators and application teams to ensure that business applications are highly available and performing within agreed-upon service levels
- Resolve technical issues through debugging, research, and investigation, relying on experience and judgment to plan and accomplish goals
- Perform a variety of tasks requiring a degree of creativity and latitude; this role typically reports to a supervisor or manager
- Load and process data from disparate data sets using appropriate technologies, including but not limited to Hive, Pig, MapReduce, HBase, Spark, Storm, and Kafka
Qualifications:
- Bachelor's Degree in Computer Science, Information Systems, or a related field, or equivalent work experience
- Typically has 8+ years of IT work experience in big data development, administration and production support
- Experience using Java, big data (Hadoop) technologies such as Hive, Pig, and Sqoop, and SQL to develop big data applications
- Expert knowledge of big data concepts and common components, including YARN, queues, Hive, and Kafka
- Minimum 4 years of experience in administrative support of Hortonworks HDP, Cloudera or MapR
- Minimum 2 years of experience in administrative support of secured clusters with Kerberos
- Minimum 2 years of experience in administrative support of Apache Kafka, Spark, and streaming applications
- Good understanding of CentOS 6.9+ and Apache Phoenix
- Experience with monitoring tools like Prometheus, Zabbix and Grafana
What's in it for you?
- This team works remote on Fridays!
- Opportunity to learn Kafka, or, if you already know Kafka, to own its implementation project
- As the sole HBase Admin, you get to run the environment as you best see fit
Eight Eleven Group provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, national origin, age, sex, citizenship, disability, genetic information, gender, sexual orientation, gender identity, marital status, amnesty or status as a covered veteran in accordance with applicable federal, state, and local laws.