I used Spark 1.5.2 with Hadoop 2.6 and had similar problems. Solved by doing the following steps:
- Download `winutils.exe` from the repository to some local folder, e.g. `C:\hadoop\bin`.
- Set `HADOOP_HOME` to `C:\hadoop`.
- Create the `c:\tmp\hive` directory (using Windows Explorer or any other tool).
- Open a command prompt with admin rights.
- Run `C:\hadoop\bin\winutils.exe chmod 777 /tmp/hive`
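The steps above can be sketched as a short session in an elevated Command Prompt (paths are the example values from the steps; adjust them to wherever you placed `winutils.exe`):

```bat
REM Run in a Command Prompt opened with admin rights.
REM Assumes winutils.exe has already been downloaded to C:\hadoop\bin.

REM Set HADOOP_HOME persistently for the current user.
setx HADOOP_HOME C:\hadoop

REM Create the Hive scratch directory.
mkdir C:\tmp\hive

REM Grant full permissions on /tmp/hive via winutils.
C:\hadoop\bin\winutils.exe chmod 777 /tmp/hive
```

Note that `setx` only affects new processes, so open a fresh prompt (or restart your IDE) before launching Spark.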
With that, I still get some warnings, but no ERRORs, and I can run Spark applications just fine.