Job Requirements of Principal Engineer - Java/Hadoop:
- Employment Type: Contractor
- Location: Chicago, IL (Onsite)
Principal Engineer - Java/Hadoop
REMOTE
US-based candidates who can travel to Mexico
GC or USC
W2 Only
Contract, 6+ months
Position Overview:
We are seeking a highly skilled and experienced Senior Hadoop Engineer to lead a transformative project migrating from Informix to Cassandra for our Data Loader system. This system manages large-scale banking information, categorizing it into various buckets such as Customer Portfolio, Credit, and Loans. You will play a critical role in designing and implementing a Hadoop-based solution to ensure a smooth transition, requiring both strategic oversight and hands-on technical work. As the project involves collaboration with a team based in Mexico City, international travel is required.
Key Responsibilities:
- Cluster Evaluation and Optimization: Assess and evaluate Hadoop clusters, including sizes, nodes, and throughput. Provide actionable recommendations for improvements.
- Architecture and Design: Develop a comprehensive architecture for the Hadoop environment, ensuring it meets the scalability, performance, and reliability requirements of the data transformation project. Design data processing workflows and integrate Hadoop with existing systems.
- Design and Implementation: Architect a scalable Hadoop solution to facilitate the Informix-to-Cassandra transformation, ensuring efficient data ingestion and processing.
- Hands-On Engineering: Configure, optimize, and manage Hadoop environments, including map/reduce functionalities.
- Data Management: Oversee the ingestion and categorization of large-scale banking data into the correct buckets, ensuring data integrity and processing efficiency.
- Collaboration: Work with the Mexico City-based team to align on project objectives, offer technical guidance, and support the transformation process.
- Troubleshooting: Identify and resolve issues related to Hadoop environments, addressing performance bottlenecks and configuration challenges.
Must Have Skills:
- Sr. Java/Python Developer or Sr. DevOps Experience: Extensive experience in Java or Python development or DevOps, with a focus on big data applications.
- Hadoop Engineering Expertise: Deep knowledge of Hadoop engineering, including the configuration and optimization of multi-node Hadoop environments.
- Map/Reduce Management: Proven experience managing map/reduce processes for effective data handling and processing in Hadoop.
Good to Have:
- Cassandra Experience: Familiarity with Cassandra configuration, troubleshooting, and optimization is a plus.
Qualifications:
- Bachelor’s degree in Computer Science, Engineering, or a related field; advanced degree preferred.
- Extensive experience in Hadoop engineering and big data technologies.
- Proven track record in managing similar transformation projects, especially large-scale data migrations.
- Strong analytical, problem-solving, and communication skills, with the ability to work independently and as part of an international team.
Travel Requirements:
- Willingness and ability to travel internationally, specifically to Mexico City, as needed.
#tech2023
#LI-RA1