  1. ambari - Fail: The package hadoop-hdfs-dfsrouter is ... - Stack …

    Jan 3, 2025 · Fail: The package hadoop-hdfs-dfsrouter is not supported by this version of the stack-select tool

  2. What is "Hadoop" - the definition of Hadoop? - Stack Overflow

    The Apache Hadoop software library is a framework that allows for the distributed processing of large data sets across clusters of computers using simple programming models. So "Hadoop" …

  3. Hadoop installation on windows - Stack Overflow

    Nov 18, 2014 · Path = "HADOOP_HOME/bin" Here is a GitHub link which has winutils for some versions of Hadoop. (If the version you are using is not in the list, then follow the conventional …
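The Windows-install answers above boil down to pointing `HADOOP_HOME` at the unpacked distribution and adding its `bin` directory (where `winutils.exe` lives) to `PATH`. A minimal sketch for the Windows command prompt, assuming a hypothetical install location of `C:\hadoop`:

```shell
:: Hypothetical install path; adjust to wherever you unpacked Hadoop
setx HADOOP_HOME "C:\hadoop"

:: Make winutils.exe and the Hadoop scripts findable
setx PATH "%PATH%;%HADOOP_HOME%\bin"
```

`setx` writes persistent user environment variables, so a new terminal is needed before the change is visible.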

  4. Failed to locate the winutils binary in the hadoop binary path

    Oct 27, 2013 · If we directly take the binary distribution of Apache Hadoop 2.2.0 release and try to run it on Microsoft Windows, then we'll encounter ERROR util.Shell: Failed to locate the …

  5. hdfs - How to create new user in hadoop - Stack Overflow

    Jan 18, 2022 · I am new to Hadoop. I have done an Apache Hadoop multi-node installation and the user name is hadoop. I am using 3 nodes in total: 1 namenode and 2 datanodes. I have to create …
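The usual answer to this question has two parts: create the OS account, then give it a home directory inside HDFS. A hedged sketch, assuming a running cluster whose HDFS superuser is `hadoop`; the account name `newuser` is purely illustrative:

```shell
# On the namenode host, as root: create the OS account (hypothetical name)
sudo useradd -m newuser

# As the HDFS superuser (assumed here to be "hadoop"):
# create the user's HDFS home directory and hand over ownership
sudo -u hadoop hdfs dfs -mkdir -p /user/newuser
sudo -u hadoop hdfs dfs -chown newuser:newuser /user/newuser
```

Without the `chown`, the new user can log in but cannot write under `/user/newuser`, which is the most common follow-up error.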

  6. view contents of file in hdfs hadoop - Stack Overflow

    Mar 26, 2020 · Probably a noob question, but is there a way to read the contents of a file in HDFS besides copying it to local and reading it through Unix? So right now what I am doing is: bin/hadoop …
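Yes: the HDFS shell can stream file contents straight to stdout, with no copy to the local filesystem. A sketch assuming a running cluster; the path is a hypothetical example:

```shell
# Print a file stored in HDFS directly to stdout
hdfs dfs -cat /user/hadoop/input/sample.txt

# For large files, peek rather than dumping everything
hdfs dfs -cat /user/hadoop/input/sample.txt | head -n 20
hdfs dfs -tail /user/hadoop/input/sample.txt
```

`hdfs dfs -cat` works with ordinary Unix pipes, so `grep`, `less`, and friends compose as usual.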

  7. I can't open localhost:8088 when I use hadoop - Stack Overflow

    Mar 7, 2021 · After I configured all the *.xml files in Hadoop, I used the command ./sbin/start-all.sh and everything went well. I used jps to check the processes; all of them are running. But when I …

  8. Datanode process not running in Hadoop - Stack Overflow

    Aug 10, 2012 · I set up and configured a multi-node Hadoop cluster using this tutorial. When I type in the start-all.sh command, it shows all the processes initializing properly as follows: …

  9. Can Apache Spark run without Hadoop? - Stack Overflow

    Aug 15, 2015 · Spark can run without Hadoop but some of its functionality relies on Hadoop's code (e.g. handling of Parquet files). We're running Spark on Mesos and S3 which was a little …

  10. http://localhost:9870 does not work HADOOP - Stack Overflow

    Apr 8, 2018 · First, you need to check your Hadoop daemons are running by entering the command: jps. Here my namenode is also configured as a datanode. Second, check the Namenode …
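Both of the "web UI not reachable" questions above (ports 8088 and 9870) start from the same diagnostic step: confirm the daemons are actually up, then read the failing daemon's log. A sketch assuming a typical single-node install; the log path pattern is an assumption and varies by version:

```shell
# List running JVM processes; a healthy single-node setup typically shows
# NameNode, DataNode, SecondaryNameNode, ResourceManager, NodeManager
jps

# If a daemon is missing, its log usually says why
# (path is an assumption; look under $HADOOP_HOME/logs on your install)
tail -n 50 "$HADOOP_HOME"/logs/hadoop-*-namenode-*.log
```

If `jps` shows everything running but the UI still fails, check which address the daemon bound to (`netstat -tlnp | grep 9870`), since binding to `127.0.0.1` blocks access from other machines.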