- 2 to 4 years of hands-on coding experience developing solutions on Hadoop, HDFS, and Spark.
- Experience working with Kafka and Cassandra will be an added advantage.
- Deep understanding of distributed systems (e.g. partitioning, replication, consistency).
- Expertise with object-oriented programming, preferably Scala.
- Experience working with the Apache Spark API, including the Spark SQL DataFrame API.
- Expertise in SQL and hands-on experience with at least one RDBMS, such as MySQL or PostgreSQL.
- Experience working in a Linux/Unix environment.
- Exposure to Docker and container-based development.
- Bachelor's Degree in Computer Science, Computer Engineering, or equivalent experience/knowledge desired.
- Strong knowledge in Java/J2EE.
- Good knowledge of multithreaded programming.
- Strong communication skills.
- Ability to lead teams and mentor developers.
- Understanding of Spring, Spring Boot and Core Java technologies.
- Experience with the Play framework and Apache Spark.
- Experience working with NoSQL databases.
- Experience with modern development patterns like microservices and serverless architectures.
- Experience with Scrum/Agile methodology and DevOps.
Are you looking to work with exciting talent in Data Engineering and Big Data technologies like Hadoop, SOLR, Spark, Hive, Airflow, Elasticsearch, Docker, Kubernetes, Scala, Python, and functional programming? Then Modak is the right place for an exciting career opportunity as a Scala Developer.