Value not defined for given nominal attribute when using DT classifier

 
arwa ali
Greenhorn
Posts: 29
I am using 10-fold cross-validation for feature construction. When I test the code with the DT classifier it works well on some folds, but on other folds it gives me the error below:
Exception in thread "main" net.sf.javaml.tools.weka.WekaException: java.lang.IllegalArgumentException: Value not defined for given nominal attribute!
  at net.sf.javaml.tools.weka.WekaClassifier.classify(WekaClassifier.java:43)
  at psofc2.MyClassifier.classify(MyClassifier.java:98)
  at psofc2.TestingWithClassifier.fc(TestingWithClassifier.java:52)
  at psofc2.Main.main(Main.java:231)
The code below contains line 98, which throws the error:

for (Instance instance : dataForClassification) {
    Object prediction = getClassifier().classify(instance);
    // ...
}

Please help me.
Thanks
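For readers landing here with the same message: a nominal attribute carries a fixed list of declared values, and asking for the index of a value that was never declared fails. The toy sketch below (plain Java, illustrative names only, not the actual Weka internals) shows the mechanism:

```java
import java.util.Arrays;
import java.util.List;

// Toy model of a nominal attribute: a fixed, declared list of legal values.
class NominalAttribute {
    private final List<String> declaredValues;

    NominalAttribute(String... values) {
        this.declaredValues = Arrays.asList(values);
    }

    // Look up the internal index of a value; undeclared values are rejected.
    int indexOf(String value) {
        int index = declaredValues.indexOf(value);
        if (index < 0) {
            // The situation behind "Value not defined for given nominal attribute!"
            throw new IllegalArgumentException(
                    "Value not defined for given nominal attribute!");
        }
        return index;
    }
}

public class NominalDemo {
    public static void main(String[] args) {
        NominalAttribute classAttr = new NominalAttribute("1", "2");
        System.out.println(classAttr.indexOf("2"));  // prints 1
        classAttr.indexOf("3");                      // throws IllegalArgumentException
    }
}
```

If a training fold never contains one of the class values, the classifier built on it can end up declaring only the values it has seen, which is a plausible way for a test instance carrying the missing value to trip exactly this kind of check.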
 
Campbell Ritchie
Marshal
Posts: 69411
What does the documentation for classify() say about acceptable types of argument?
 
arwa ali
Greenhorn
Posts: 29
Line 98, which the error message points to, contains:
Object prediction = getClassifier().classify(instance);
 
Campbell Ritchie
Marshal
Posts: 69411
That doesn't answer my question, nor does it provide information to explain the exception.
 
Stephan van Hulst
Saloon Keeper
Posts: 12008
Obviously there's not much anybody can do to help you without knowing how you create your data or classifier.
 
arwa ali
Greenhorn
Posts: 29
Actually I am a beginner in Java and I don't know where to look for the documentation.
Anyway, maybe this helps:
public interface Classifier extends Serializable {

   public void buildClassifier(Dataset dtst);

   public Object classify(Instance instnc);

   public Map<Object, Double> classDistribution(Instance instnc);
}
 
Campbell Ritchie
Marshal
Posts: 69411
I found this, but it doesn't have a classify() method.
Have you got any more details about that interface? I think you will have to go back to whoever gave you that code and ask them to explain what classify() requires.
 
Stephan van Hulst
Saloon Keeper
Posts: 12008
The class is part of the Java-ML library. It is poorly documented, so looking it up won't help this issue.

Again, there's not much we can do to help without knowing how the data or the classifier is created.
 
arwa ali
Greenhorn
Posts: 29
The whole class that contains the classifier creation is as follows:
/*
* To change this template, choose Tools | Templates
* and open the template in the editor.
*/
package psofc2;

import java.util.HashMap;
import java.util.Map;
import java.util.Random;
import libsvm.LibSVM;
import net.sf.javaml.classification.Classifier;
import net.sf.javaml.classification.KNearestNeighbors;
import net.sf.javaml.classification.bayes.NaiveBayesClassifier;
import net.sf.javaml.classification.evaluation.PerformanceMeasure;
import net.sf.javaml.classification.tree.RandomTree;
import net.sf.javaml.core.Dataset;
import net.sf.javaml.core.Instance;
import net.sf.javaml.tools.weka.WekaClassifier;
import weka.classifiers.trees.REPTree;

/**
*
*
*/
public class MyClassifier {

   //private NaiveBayesClassifier classifier;
   private net.sf.javaml.classification.Classifier classifier;
   private Random random;
   /*
    */

   public MyClassifier(Random random) {
       this.random = random;
   }

   public void ClassifierKNN() {
       System.out.println("Myclassifier:  new KNearestNeighbors(5)");
       setClassifier(new KNearestNeighbors(5));
   }

   public void ClassifierNB() {
       setClassifier(new NaiveBayesClassifier(false, true, false));
       System.out.println("Myclassifier:  new NaiveBayesClassifier(false, true, false)");
   }

   public void ClassifierLibSVM() {
       setClassifier(new LibSVM());
       System.out.println("Myclassifier:  new LibSVM()");
   }

   public void ClassifierDT() {
       setClassifier(new WekaClassifier(new REPTree()));
       System.out.println("Myclassifier: new  DT - REPTree()");
   }

   public void ClassifierRT(int noFeatures) {
       setClassifier(new RandomTree(noFeatures, new Random(0)));
       System.out.println("Myclassifier:  new RandomTree(noFeatures, new Random(0))");
   }

   /**
    * @return the classifier
    */
   public Classifier getClassifier() {
       return classifier;
   }

   /**
    * @param classifier the classifier to set
    */
   public void setClassifier(Classifier classifier) {
       this.classifier = classifier;
   }

   public double fullclassify(Dataset training, Dataset testing) {
       double[] features = new double[training.noAttributes()];
       for (int i = 0; i < training.noAttributes(); i++) {
           features[i] = 1.0;
       }
       double acc = 0.0;
       acc = classify(training, testing);
       return acc;
   }

   public double classify(Dataset training, Dataset testing) {
       getClassifier().buildClassifier(training);
       Dataset dataForClassification = testing;

       Map<Object, PerformanceMeasure> out = new HashMap<Object, PerformanceMeasure>();
       for (Object o : training.classes()) {
           out.put(o, new PerformanceMeasure());
       }
       for (Instance instance : dataForClassification) {
           Object prediction = getClassifier().classify(instance);
           if (prediction != null) {
               if (instance.classValue().equals(prediction)) { // prediction == class
                   for (Object o : out.keySet()) {
                       if (o.equals(instance.classValue())) {
                           out.get(o).tp++;
                       } else {
                           out.get(o).tn++;
                       }
                   }
               } else { // prediction != class
                   for (Object o : out.keySet()) {
                       if (prediction.equals(o)) { /* prediction is positive class */
                           out.get(o).fp++;
                       } else if (o.equals(instance.classValue())) { /* instance is positive class */
                           out.get(o).fn++;
                       } else { /* neither is positive class */
                           out.get(o).tn++;
                       }
                   }
               }
           }
       }
       double tp = 0.0, tn = 0.0;
       double fp = 0.0, fn = 0.0;
       for (Object o : out.keySet()) {
           tp += out.get(o).tp;
           tn += out.get(o).tn;
           fp += out.get(o).fp;
           fn += out.get(o).fn;
       }
       double accuracy = (tn + tp) / (double) (out.size() * dataForClassification.size());
       return accuracy;
   }
}
 
An example of my data looks like this:

@relation colon

@attribute f1 numeric
@attribute f2 numeric
@attribute f3 numeric
@attribute f4 numeric
@attribute f5 numeric
@attribute f6 numeric
@attribute f7 numeric
@attribute f8 numeric
@attribute f9 numeric
@attribute class {1,2}

@data
-2,-2,-2,-2,0,-2,0,-2,0,2
-2,-2,0,0,0,2,0,0,0,1
2,2,0,-2,-2,2,2,-2,0,2
-2,-2,-2,-2,-2,2,0,2,0,1
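One plausible cause, given data like the above: if the folds are built by hand, a training fold can end up containing only class 1 (or only class 2); a classifier built on it then knows nothing about the other label, and only that fold fails. Stratified fold assignment avoids this by spreading each class evenly over the folds. A sketch in plain Java, independent of Java-ML (the names are illustrative):

```java
import java.util.*;

public class StratifiedFolds {

    /**
     * Assigns each instance to one of k folds so that every class label is
     * spread as evenly as possible across the folds. Returns an array where
     * result[i] is the fold index of instance i.
     */
    public static int[] assignFolds(List<String> classLabels, int k, Random random) {
        // Group instance indices by class label.
        Map<String, List<Integer>> byClass = new LinkedHashMap<>();
        for (int i = 0; i < classLabels.size(); i++) {
            byClass.computeIfAbsent(classLabels.get(i), key -> new ArrayList<>()).add(i);
        }
        int[] fold = new int[classLabels.size()];
        for (List<Integer> indices : byClass.values()) {
            Collections.shuffle(indices, random);
            // Deal the shuffled indices of this class round-robin over the folds.
            for (int j = 0; j < indices.size(); j++) {
                fold[indices.get(j)] = j % k;
            }
        }
        return fold;
    }

    public static void main(String[] args) {
        List<String> labels = Arrays.asList("1", "2", "1", "2", "1", "2", "1", "2", "1", "2");
        int[] fold = assignFolds(labels, 2, new Random(0));
        // Collect which class labels each fold received.
        Set<String> fold0 = new HashSet<>(), fold1 = new HashSet<>();
        for (int i = 0; i < fold.length; i++) {
            (fold[i] == 0 ? fold0 : fold1).add(labels.get(i));
        }
        System.out.println(fold0.equals(fold1) && fold0.size() == 2); // prints true
    }
}
```

As long as each class has at least k instances, every fold (and therefore every training portion in 10-fold cross-validation) sees every class label.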
 
Campbell Ritchie
Marshal
Posts: 69411
That code doesn't throw the exception you are seeing. It is thrown elsewhere, from a WekaClassifier object. Please find the details of that object.
 
arwa ali
Greenhorn
Posts: 29
No sir, the first entry in the stack trace, which is
at psofc2.MyClassifier.classify(MyClassifier.java:98)
refers to the following line:
Object prediction = getClassifier().classify(instance);
which is included in the code I posted.
 
Campbell Ritchie
Marshal
Posts: 69411
...but it is the object whose reference is returned from getClassifier() that we need the information about.
There is only one exception, and you need to investigate the first line of code in the stack trace. That class should have documentation telling you what sort of argument it will and won't accept.
 
Rancher
Posts: 4576
As Stephan says, it's JavaML, and the documentation is lacking (to say the least).

It's not what I would call a task for someone who identifies as a Java beginner.
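For anyone who wants to pinpoint which folds are the problem, a cheap pre-flight check is to verify, before calling buildClassifier(), that every class label in the test fold also occurs in the training fold. Sketched here in plain Java on raw label lists (names are illustrative):

```java
import java.util.*;

public class FoldCheck {

    /** Returns the test-fold class labels that the training fold has never seen. */
    public static Set<String> missingClasses(Collection<String> trainLabels,
                                             Collection<String> testLabels) {
        Set<String> missing = new TreeSet<>(testLabels);
        missing.removeAll(new HashSet<>(trainLabels));
        return missing;
    }

    public static void main(String[] args) {
        List<String> train = Arrays.asList("1", "1", "1"); // fold where class 2 is absent
        List<String> test = Arrays.asList("1", "2");
        Set<String> missing = missingClasses(train, test);
        if (!missing.isEmpty()) {
            // Training on this fold builds a class attribute without "2",
            // so classifying the class-2 test instance could fail.
            System.out.println("fold is unsafe, missing classes: " + missing);
        }
    }
}
```

Running this check once per fold before training would show whether the failing folds are exactly the ones with missing class labels, which would confirm (or rule out) this explanation for the exception.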
 