Agile Lab is a top-tier data engineering firm, specializing in Spark and many other distributed technologies since 2013. Agile Lab now employs 60 talented data engineers and scientists in a remote-first culture and with an international mindset.
Take a look at our handbook to discover our core values and processes.
We are looking to hire a talented Big Data Engineer to develop and manage our Big Data solutions. In this role, you will be required to design and implement Big Data tools and frameworks, data-intensive applications, and cloud platforms.
To succeed as a Big Data Engineer, you should have in-depth knowledge of distributed computing technologies and excellent problem-solving skills.
Responsibilities:
- Analyzes, designs, and implements complex systems.
- Challenges team processes, looking for ways to improve them.
- Mentors junior engineers via pairing, design, and code review.
- Builds software solutions while adhering to our SDLC quality standards.
- Makes active efforts to stay up-to-date with technologies.
- Leads the technical design of complex systems.
- Understands and optimizes system performance.
- Takes the initiative to fix issues before they become problems.
- Delivers complex systems that are well engineered and nearly bug-free.
- Proactively identifies problems with requirements and project plans.
Requirements:
- Bachelor’s degree in Computer Engineering or Computer Science.
- Previous experience as a Big Data Engineer (3–5 years).
- In-depth knowledge of Hadoop, Spark, and similar frameworks.
- In-depth knowledge of programming languages including Java and Scala.
- Expert knowledge of cloud-native technologies, Infrastructure as Code (IaC), and Docker.
- Excellent project management skills.
- Excellent communication skills.
- Proactivity.
- Business and functional understanding.
- Team player.
- Ability to solve complex networking, data, and software issues.
- Leadership capabilities.