Leidos has a need for a Data Engineer to support the development of a data lake for a project located in McLean, VA.
In this role, the candidate will work at a customer site to support the agile development of tools and leverage standard tools (particularly Apache NiFi) for extracting, transforming, and loading (ETL) data between databases for the sponsor. The successful candidate will create custom code to quickly extract, triage, and exploit data across domains in support of analytic work while supporting the strategic development of replicable processes.
The successful candidate will use NiFi to ETL data into a secure Hadoop environment. They must write NiFi processors or, in instances where NiFi cannot be implemented, write custom Java code to ingest existing and new data sources. The candidate will conduct product usability tests and must work effectively with cross-functional team members, including analysts, data scientists, project managers, and software solutions integrators.
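The extract-transform-load pattern at the heart of this role can be sketched in a few lines of Python. This is a minimal illustration only: the actual work would use NiFi flows or custom processors against the sponsor's systems, and every table, column, and cleansing rule below is an invented placeholder (here in-memory SQLite databases stand in for the source and target stores).

```python
import sqlite3

def extract(conn):
    """Pull raw rows from the (hypothetical) source table."""
    return conn.execute("SELECT id, name, value FROM raw_events").fetchall()

def transform(rows):
    """Apply illustrative cleansing: normalize names, drop rows missing a value."""
    return [(rid, name.strip().lower(), value)
            for rid, name, value in rows
            if value is not None]

def load(conn, rows):
    """Write the cleansed rows into the (hypothetical) target table."""
    conn.executemany(
        "INSERT INTO clean_events (id, name, value) VALUES (?, ?, ?)", rows)
    conn.commit()

# Demo: in-memory databases stand in for the real source/target systems.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE raw_events (id INTEGER, name TEXT, value REAL)")
src.executemany("INSERT INTO raw_events VALUES (?, ?, ?)",
                [(1, "  Alpha ", 3.5), (2, "BETA", None), (3, "Gamma", 1.0)])

dst = sqlite3.connect(":memory:")
dst.execute("CREATE TABLE clean_events (id INTEGER, name TEXT, value REAL)")

load(dst, transform(extract(src)))
print(dst.execute("SELECT id, name, value FROM clean_events ORDER BY id").fetchall())
# → [(1, 'alpha', 3.5), (3, 'gamma', 1.0)]
```

In a NiFi deployment, each of these steps would typically map to a processor (e.g. a query processor feeding a transform processor feeding a put processor), with NiFi handling queuing, back pressure, and provenance between them.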
REQUIRED EDUCATION AND EXPERIENCE:
Bachelor's Degree and 12+ years of experience in data engineering and/or database administration. Required skills are:
- Experience with Apache NiFi, Kafka, and Spark Streaming for ETL work
- Familiarity with HBase, Solr, Spark, Oozie, and Impala
- Proficiency in Java and Python
- Understanding of and proficiency with cross-domain solutions (moving data from unclassified to classified systems and between classified environments)
- Experience with Agile development and proficiency with continuous integration/delivery tools such as Jenkins, Artifactory, and Git
- Proficiency with AWS and container technologies such as Docker is desired but not required