Requirements
- 5+ years of in-depth technical experience with modern Big Data technologies such as Hadoop, Hive, Presto, and Spark.
- Experience with Kerberos and related Hadoop security technologies.
- Experience with Apache Sentry or similar role-based authorization tools for Hadoop.
- Familiarity with the Linux operating system and the Bash shell.
- General software engineering expertise: Java is preferred, but we also write a lot of Python and Kotlin.
- The ability to communicate effectively in an operations environment.
- Willingness to "get your hands dirty" in a modern Big Data environment.