
Software Engineer

Job Description

As a Hadoop Developer in Risk & Finance Core Services, the ideal candidate should be able to:

1. Build high-performing data models on big-data architecture as data services.

2. Build a high-performing, scalable data pipeline platform using Hadoop, Apache Spark, and object storage architecture (a sketch follows this list).

3. Partner with enterprise data teams such as Data Management & Insights and the Enterprise Data Environment (Data Lake) to identify the best place to source the data.
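
A minimal PySpark sketch of such a batch pipeline is below. It reads raw data from an S3-compatible object store, applies a simple transformation, and writes Parquet for downstream data services. The bucket paths, column names, and aggregation logic are illustrative assumptions, not details from this posting.

    # Minimal batch-pipeline sketch (PySpark). Paths and columns are
    # hypothetical; s3a:// access also assumes the hadoop-aws package
    # is available on the classpath.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = (
        SparkSession.builder
        .appName("risk-finance-pipeline-sketch")
        .getOrCreate()
    )

    # Read raw trade data from object storage (placeholder path).
    raw = spark.read.csv("s3a://raw-bucket/trades/",
                         header=True, inferSchema=True)

    # Keep valid rows and aggregate exposure per account.
    exposure = (
        raw.filter(F.col("notional") > 0)
           .groupBy("account_id")
           .agg(F.sum("notional").alias("total_exposure"))
    )

    # Persist as Parquet so data services can query it efficiently.
    exposure.write.mode("overwrite").parquet("s3a://curated-bucket/exposure/")

    spark.stop()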

Required Qualifications

1. Experience with Hadoop-ecosystem tools for real-time and batch data ingestion, processing, and provisioning, such as Apache Flume, Apache Kafka, Apache Sqoop, Apache Flink, Apache Spark, or Apache Storm (a streaming sketch follows this list)

2. Java or Python experience

3. Agile experience

4. Design and development experience with columnar storage on Hadoop using the Parquet or ORC file formats

5. Apache Spark design and development experience using Scala, Java, or Python, with DataFrames and Resilient Distributed Datasets (RDDs)

6. Experience delivering data services on container platforms such as Docker and Kubernetes

7. ETL (Extract, Transform, Load) programming experience
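
To make items 1, 4, and 5 above concrete, here is a minimal Spark Structured Streaming sketch that ingests events from Kafka and lands them as Parquet. The broker address, topic name, and storage paths are assumptions for illustration, and running it requires the spark-sql-kafka connector package.

    # Streaming-ingestion sketch (PySpark Structured Streaming).
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("kafka-ingest-sketch").getOrCreate()

    # Subscribe to a Kafka topic (broker and topic are placeholders).
    events = (
        spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")
        .option("subscribe", "risk-events")
        .load()
    )

    # Kafka delivers key/value as bytes; cast the payload for parsing.
    payload = events.selectExpr("CAST(value AS STRING) AS json_payload",
                                "timestamp")

    # Append the stream to Parquet, with checkpointing for fault tolerance.
    query = (
        payload.writeStream
        .format("parquet")
        .option("path", "s3a://curated-bucket/events/")
        .option("checkpointLocation", "s3a://curated-bucket/checkpoints/events/")
        .outputMode("append")
        .start()
    )

    query.awaitTermination()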

Keywords: Hadoop, Apache Spark, ETL

Job Snapshot

Location: US-NC-Charlotte
Employment Type: Contractor
Pay Type: Hourly
Pay Rate: N/A
Store Type: Other

Company Overview

Collabera

At Collabera, you get a chance to do great work with some of the brightest people, without the frustration of being a nameless face in a sea of cubicles. We promote a culture of transparency and openness that embraces enthusiasm and passion. If you have what it takes, we want you to follow your passion, whether that means working on cutting-edge technology, understanding and overcoming business challenges, becoming a cross-discipline general practitioner, or something else entirely.

Contact Information

US-NC-Charlotte
Binal Patel
973-889-5200