Big Data Administrator

Job Description

Our Fortune 50 client is seeking an HBase Administrator to manage a clustered Hadoop environment. You will work with software developers to support and operate HBase in production and non-production environments, and you will be responsible for designing, implementing, and supporting the big data platform.
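
Since the role centers on operating HBase alongside developers, below is a minimal sketch of the kind of Java client check an administrator might run against a cluster. The table name, column family, and values are hypothetical placeholders; the table is assumed to already exist, and hbase-site.xml is assumed to be on the classpath.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

public class HBaseSmokeTest {
    public static void main(String[] args) throws Exception {
        // Reads hbase-site.xml from the classpath (ZooKeeper quorum, security settings, etc.)
        Configuration conf = HBaseConfiguration.create();
        try (Connection connection = ConnectionFactory.createConnection(conf);
             // "smoke_test" is a hypothetical, pre-created table
             Table table = connection.getTable(TableName.valueOf("smoke_test"))) {
            // Write one cell, then read it back -- a quick end-to-end health check
            Put put = new Put(Bytes.toBytes("row1"));
            put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("status"), Bytes.toBytes("ok"));
            table.put(put);

            Result result = table.get(new Get(Bytes.toBytes("row1")));
            byte[] value = result.getValue(Bytes.toBytes("cf"), Bytes.toBytes("status"));
            System.out.println("Read back: " + Bytes.toString(value));
        }
    }
}
```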

Responsibilities:

  • Design, configure, optimize, maintain, and support a clustered Hadoop environment for optimum performance and 100% uptime for 24/7 business-critical applications
  • Provide technical leadership within the team responsible for developing and supporting applications using the Hadoop ecosystem (HBase, Pig, Hive, YARN) and Kafka
  • Install, configure, and maintain Kafka clusters and Kafka Connect, with replication across multiple datacenters (a sketch of connecting a client to the secured cluster follows this list)
  • Provide production support in a 24/7 environment within an on-call rotation schedule
  • Ensure data security using Kerberos authentication and authorization, along with data quality and governance within the Hadoop environment
  • Deploy releases of new technologies; design, install, configure, maintain, and perform system integration testing
  • Develop and maintain monitoring using Grafana, Prometheus, and Zabbix to get a real-time view into the Hadoop and Kafka clustered environments
  • Research, evaluate, and recommend software and hardware upgrades as needed
  • Develop and maintain best practices for developing against Hadoop clusters
  • Develop and support applications based on test-driven, behavior-driven, and agile development methodologies
  • Provide new hardware specifications to users based on application needs and anticipated growth; install new servers and maintain the server infrastructure
  • Collaborate with administrators and application teams to ensure that business applications are highly available and performing within agreed-upon service levels
  • Resolve technical issues through debugging, research, and investigation, relying on experience and judgment to plan and accomplish goals
  • Perform a variety of tasks that require a degree of creativity and latitude; typically reports to a supervisor or manager
  • Load and process disparate data sets using appropriate technologies including, but not limited to, Hive, Pig, MapReduce, HBase, Spark, Storm, and Kafka
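
As a rough illustration of the Kafka and Kerberos items above, the sketch below shows a minimal Java producer smoke test against a Kerberos-secured cluster. The broker addresses and topic name are hypothetical placeholders, and the JAAS configuration carrying the keytab and principal is assumed to be supplied externally (for example via -Djava.security.auth.login.config).

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.clients.producer.RecordMetadata;

public class SecureProducerSmokeTest {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Hypothetical bootstrap servers -- substitute the real cluster endpoints
        props.put("bootstrap.servers", "broker1.example.com:9092,broker2.example.com:9092");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        // Kerberos (GSSAPI) settings for a secured cluster; the JAAS entry with the
        // keytab and principal is assumed to be provided outside this snippet
        props.put("security.protocol", "SASL_PLAINTEXT");
        props.put("sasl.kerberos.service.name", "kafka");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Send one record to a hypothetical topic and block until acknowledged
            RecordMetadata meta = producer.send(
                    new ProducerRecord<>("healthcheck", "key", "ping")).get();
            System.out.println("Acknowledged at offset " + meta.offset());
        }
    }
}
```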

Skills:

  • Bachelor's Degree in Computer Science, Information Systems, or other related field; or equivalent work experience
  • Typically has 8+ years of IT work experience in big data development, administration and production support
  • Experience using Java and big data (Hadoop) technologies such as Hive, Pig, and Sqoop, as well as SQL, to develop big data applications
  • Expert knowledge of big data concepts and common components, including YARN queues, Hive, and Kafka
  • Minimum 4 years of experience in administrative support of Hortonworks HDP, Cloudera or MapR
  • Minimum 2 years of experience in administrative support of secured clusters with Kerberos
  • Minimum 2 years of experience in administrative support of Apache Kafka, Spark, and streaming applications
  • Good understanding of CentOS 6.9+ and Apache Phoenix
  • Experience with monitoring tools such as Prometheus, Zabbix, and Grafana (a sketch of querying Prometheus follows this list)
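
As a small illustration of the monitoring experience listed above, here is a minimal sketch of querying Prometheus's HTTP API for target health using Java's built-in HTTP client (Java 11+). The endpoint URL is a hypothetical placeholder; "up" is Prometheus's built-in per-target health metric.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class PrometheusProbe {
    public static void main(String[] args) throws Exception {
        // Hypothetical Prometheus server; the query asks for the "up" metric
        String url = "http://prometheus.example.com:9090/api/v1/query?query=up";
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder(URI.create(url)).GET().build();
        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());
        // The JSON body lists each scraped target with value 1 (up) or 0 (down)
        System.out.println(response.body());
    }
}
```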

What's in it for you?:

  • This team works remotely on Fridays!
  • Opportunity to learn Kafka; if you already know Kafka, you will own its implementation project
  • As the sole HBase Administrator, you get to run the environment as you see fit

Eight Eleven Group provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, national origin, age, sex, citizenship, disability, genetic information, gender, sexual orientation, gender identity, marital status, amnesty or status as a covered veteran in accordance with applicable federal, state, and local laws.

Job Snapshot

Location: US-CO-Denver
Employment Type: Contractor
Pay Type: Year
Pay Rate: $80,204.00 - $81,204.00 /Year
Store Type: IT & Technical

Company Overview

Brooksource

Brooksource is an IT services company specializing in the recruitment and placement of high-level IT professionals. We offer competitive compensation, paid holidays, a 401(k), health benefits, flexible work schedules, and just about anything a top-tier candidate would demand. Our diverse client base covers all industries and gives us the opportunity to place you, the candidate, in positions that span the entire IT spectrum.

Contact Information

US-CO-Denver
Darby Scheiber
303-573-5954