Big Data Engineer
Structure: Long-Term Contract
Location: Denver Tech Center
Working on a DevOps team to develop and maintain the backend for a usage meter platform. This meter distributes data sets based on customers' real-time usage metrics. This is a highly collaborative team using cutting-edge technologies such as MongoDB and Apache Spark, supporting over 45 million devices across the US.
Responsibilities
• Develop a robust, scalable, distributed streaming collector application that consumes from Apache Kafka using Spark Streaming and persists to MongoDB
• Design and develop Spring Boot applications that interface with Kafka, RabbitMQ, Elasticsearch, and MongoDB
• Migrate Oracle tables to MongoDB collections
• Migrate data from Oracle/MongoDB to Hadoop
• Stream data from Kafka into an Elasticsearch cluster using StreamSets
• Configure Ansible playbooks to automate application build and deployment, and to manage Kafka, Spark, and MongoDB clusters
• Research in-memory data grids to evaluate their performance and use cases involving the Common Collector
• Design and develop high-quality distributed Java middleware, data flows, and automated unit tests
• Participate in stand-up and backlog grooming meetings
• Partner with other developers and participate in peer code reviews to ensure quality deliverables and to share knowledge and experience
• Partner with QA to ensure quality deliverables
• Partner with DevOps personnel to deliver solutions and provide escalation support in production environments
Required Skills
• Experience using Java, Spring Boot, Spark, MongoDB, and Hadoop
• Experience with distributed messaging systems (Kafka, RabbitMQ) and databases (Oracle and NoSQL/MongoDB)
• Experience with Linux deployment environments
• Self-driven and able to work independently
• Adaptability to a fast-paced and constantly changing environment
• Great team player
Desired Skills / Nice to Have
• Strong abstract thinking
• Working knowledge of industry best practices
• Knowledge and experience with secure coding practices and industry-standard transport security methodologies
• Experience with virtualization/cloud technologies, such as Pivotal Cloud Foundry, OpenStack, and/or AWS
• Experience with distributed computing and big data systems, such as Apache Spark, Hadoop, NiFi, and/or StreamSets
• Development and automation tools, such as Maven, Git, GoCD, and Ansible
• Frameworks: Spring Boot, JUnit, Mockito, Apache CXF
This is a collaborative DevOps team of 50 individuals spanning development, operations, and QA. During the interview, you will meet a few key members. The interview process is a phone screen followed by an onsite interview.
Eight Eleven Group provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, national origin, age, sex, citizenship, disability, genetic information, gender, sexual orientation, gender identity, marital status, amnesty or status as a covered veteran in accordance with applicable federal, state, and local laws.