Apache Spark requires the winutils.exe executable to function correctly on the Windows Operating System: Spark's bundled Hadoop libraries call this native helper even when the job runs locally rather than on a Hadoop cluster.
Download the winutils.exe binary from the WinUtils repository. Select the version matching the Hadoop release your Spark distribution was compiled against; for example, use the hadoop-2.7.1 binaries with Spark 3.0.1 pre-built for Hadoop 2.7.
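The steps above can be sketched as a few shell commands (Git Bash / WSL syntax; the install location, and the assumption that the WinUtils repository lays binaries out under a hadoop-&lt;version&gt;/bin folder, are mine — adjust the Hadoop version to match your Spark build):

```shell
# Choose an install root for the Hadoop helper binaries. On Windows this is
# commonly C:\hadoop; "$HOME/hadoop" is used here so the sketch also runs
# under Git Bash or WSL.
HADOOP_HOME="$HOME/hadoop"
mkdir -p "$HADOOP_HOME/bin"

# Fetch winutils.exe for the Hadoop build your Spark distribution targets.
# (The exact repository URL/layout is an assumption; pick the hadoop-<version>
# folder that matches your Spark download, e.g. hadoop-2.7.1 for Spark 3.0.1
# pre-built for Hadoop 2.7.)
# curl -L -o "$HADOOP_HOME/bin/winutils.exe" \
#   https://github.com/steveloughran/winutils/raw/master/hadoop-2.7.1/bin/winutils.exe

# Make the binary discoverable by Spark's Hadoop layer: Spark resolves
# winutils.exe via the HADOOP_HOME environment variable.
export HADOOP_HOME
export PATH="$PATH:$HADOOP_HOME/bin"
echo "HADOOP_HOME=$HADOOP_HOME"
```

Setting `HADOOP_HOME` as a persistent environment variable (System Properties on Windows, or your shell profile) avoids having to re-export it for every session.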