Scala is a functional, object-oriented, statically typed language that compiles to JVM bytecode, which makes it fully interoperable with Java. Apache Spark, a widely used big data processing framework, is itself written in Scala, so if you plan to work with Spark, learning Scala is a strong advantage.
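To illustrate the points above, here is a minimal, self-contained sketch (plain Scala, no Spark dependency) showing the functional style and the direct Java interoperability that comes from sharing the JVM; the class and value names are my own illustrative choices:

```scala
// Scala can use Java standard-library classes directly,
// because both languages compile to JVM bytecode.
import java.time.LocalDate

object Interop extends App {
  // Functional style: map a pure function over an immutable List.
  val squares = List(1, 2, 3).map(n => n * n)
  println(squares) // List(1, 4, 9)

  // Calling a Java class (java.time.LocalDate) from Scala,
  // with full static type checking.
  val start = LocalDate.of(2020, 1, 1)
  println(start.plusDays(30)) // 2020-01-31
}
```

The same immutable, higher-order-function style carries over directly to Spark's RDD and Dataset APIs, which is one reason Scala feels natural for Spark work.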
I would say a working knowledge of Scala is essential. Java and Python also belong in your toolbox, since both let you build parallel applications for a distributed environment, and Spark supports R as well.