
ERROR security.UserGroupInformation: PriviledgedActionException in MapReduce program

 
Ranch Hand
Posts: 84
I am using Hadoop 2.2.0.
The following commands run fine on HDFS.


I made a WordCount program in Eclipse, added the jars using Maven, and ran the jar with this command:


It gives the following error:


The jar is on my local system. Both the input and output paths are on HDFS, and no output directory exists at the output path.

core-site.xml:


hdfs-site.xml:


mapred-site.xml:


yarn-site.xml:


/etc/hosts:


Please advise how to solve this issue.

Thanks.
 
Ranch Hand
Posts: 544
Hello,

Please note the exception message -

Exception in thread "main"
org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory
hdfs://localhost:54310/user/ubuntu/wordcount/input/vij.txt already
exists



You can delete the file and then try running it again. I can't recall exactly, but there is also an option to overwrite output files if they exist.
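As a side note, the plain MapReduce FileOutputFormat has no built-in overwrite switch, so the usual workaround is to delete the output directory from the driver before submitting the job. A minimal sketch using the standard FileSystem API (the path here is hypothetical, matching the HDFS URL from the exception):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class OutputCleanup {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Hypothetical output path; substitute the real job output directory
        Path outputPath = new Path("hdfs://localhost:54310/user/ubuntu/wordcount/output");
        FileSystem fs = FileSystem.get(outputPath.toUri(), conf);
        // Recursively delete the output directory if a previous run left it behind
        if (fs.exists(outputPath)) {
            fs.delete(outputPath, true);
        }
    }
}
```

This runs only against a live HDFS instance, so treat it as a pattern rather than a drop-in fix.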

Regards,
Amit
 
Babu Singh
Ranch Hand
Posts: 84
Hello,
I have deleted vij.txt from HDFS, but the issue still persists.

These are my Java files:
WordCount.java:

WordMapper.java:


SumReducer.java:


Please suggest how to solve this issue.
Thanks.

amit punekar wrote:Hello,

Please note the exception message -

Exception in thread "main"
org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory
hdfs://localhost:54310/user/ubuntu/wordcount/input/vij.txt already
exists



You can delete the file and then try running it again. I can't recall exactly, but there is also an option to overwrite output files if they exist.

Regards,
Amit

 
amit punekar
Ranch Hand
Posts: 544
I just happened to notice in your exception message that the output directory is pointing to your input file. Please check whether your code has the input and output parameters swapped.

org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory

hdfs://localhost:54310/user/ubuntu/wordcount/input/vij.txt

already exists
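To illustrate what correct wiring looks like, here is a minimal driver sketch assuming the standard new (org.apache.hadoop.mapreduce) API. The mapper/reducer class names follow the files you posted (WordMapper, SumReducer); the key/value types and everything else are assumptions, not your actual code:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {
    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "wordcount");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(WordMapper.class);
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        // args[0] must be the input path and args[1] the not-yet-existing
        // output directory; swapping the two produces exactly the
        // FileAlreadyExistsException quoted above, because the framework
        // refuses to write output over an existing path.
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```

Compare the argument order here with how your jar is invoked on the command line.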



regards,
amit
 