Usage of the org.deeplearning4j.nn.multilayer.MultiLayerNetwork.output() method, with code examples


This article collects code examples for the Java method org.deeplearning4j.nn.multilayer.MultiLayerNetwork.output() and shows how it is used in practice. The examples are drawn from selected projects on GitHub, Stack Overflow, Maven and similar platforms, so they provide useful reference material. Details of the MultiLayerNetwork.output() method are as follows:
Package: org.deeplearning4j.nn.multilayer
Class: MultiLayerNetwork
Method: output

About MultiLayerNetwork.output

Label the probabilities of the input; in other words, the method returns, for each input example, the probability the network assigns to each label.
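For orientation, here is a minimal, hedged usage sketch before the sourced examples. The model file name "model.zip", the 4-feature input and the helper method name are illustrative assumptions, not taken from any of the projects cited below.

// Minimal sketch (assumptions: a trained MultiLayerNetwork saved as "model.zip"
// that expects 4 input features; the feature values below are placeholders)
public static INDArray labelOneExample() throws IOException {
  MultiLayerNetwork net = ModelSerializer.restoreMultiLayerNetwork(new File("model.zip"));
  INDArray features = Nd4j.create(new double[][]{{0.1, 0.2, 0.3, 0.4}}); // one example row
  // output() runs a forward pass in TEST (inference) mode and returns the
  // probability the network assigns to each label for each input row
  return net.output(features);
}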

Code examples

Code example source: guoguibing/librec

@Override
  protected double predict(int userIdx, int itemIdx) throws LibrecException {
    INDArray predictedRatingVector = autoRecModel.output(trainSet.getRow(itemIdx));
    return predictedRatingVector.getDouble(userIdx);
  }
}

Code example source: deeplearning4j/dl4j-examples

private static void evaluatePerformance(MultiLayerNetwork net, int testStartIdx, int nExamples, String outputDirectory) throws Exception {
  //Assuming here that the full test data set doesn't fit in memory -> load 10 examples at a time
  Map<Integer, String> labelMap = new HashMap<>();
  labelMap.put(0, "circle");
  labelMap.put(1, "square");
  labelMap.put(2, "arc");
  labelMap.put(3, "line");
  Evaluation evaluation = new Evaluation(labelMap);
  DataSetIterator testData = getDataSetIterator(outputDirectory, testStartIdx, nExamples, 1000);
  while(testData.hasNext()) {
    DataSet dsTest = testData.next();
    INDArray predicted = net.output(dsTest.getFeatures(), false);
    INDArray actual = dsTest.getLabels();
    evaluation.evalTimeSeries(actual, predicted);
  }
  System.out.println(evaluation.stats());
}

Code example source: deeplearning4j/dl4j-examples

while(mnistTest.hasNext()){
  DataSet ds = mnistTest.next();
  INDArray output = model.output(ds.getFeatures(), false);
  eval.eval(ds.getLabels(), output);
}

Code example source: guoguibing/librec

predictedMatrix = CDAEModel.output(trainSet);
for (MatrixEntry me : trainMatrix) {
  predictedMatrix.put(me.row(), me.column(), 0);
}

Code example source: org.deeplearning4j/deeplearning4j-nn

/**
 * Label the probabilities of the input
 *
 * @param input the input to label
 * @return a vector of probabilities
 * given each label.
 * <p>
 * This is typically of the form:
 * [0.5, 0.5] or some other probability distribution summing to one
 */
public INDArray output(INDArray input) {
  return output(input, TrainingMode.TEST);
}
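As a hedged follow-up to this overload: the returned probabilities are commonly converted into class predictions with an argmax over the class dimension. The `net` and `features` variables below are assumed to already exist.

// Sketch: turn the probability output into predicted class indices
INDArray probabilities = net.output(features);             // shape [nExamples, nClasses]
INDArray predictedClasses = Nd4j.argMax(probabilities, 1); // index of the largest probability per row
System.out.println(predictedClasses);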

Code example source: org.deeplearning4j/deeplearning4j-nn

public INDArray output(DataSetIterator iterator) {
  return output(iterator, false);
}

Code example source: mccorby/FederatedAndroidTrainer

public INDArray predict(final INDArray input) {
  return mNetwork.output(input, false);
}

Code example source: org.deeplearning4j/deeplearning4j-nn

/**
 * Label the probabilities of the input
 *
 * @param input    the input to label
 * @param train whether the output
 *             is for test or train. This mainly
 *             affects hyperparameters such as
 *             dropout, where certain operations should
 *             only be applied to activations during training
 * @return a vector of probabilities
 * given each label.
 * <p>
 * This is typically of the form:
 * [0.5, 0.5] or some other probability distribution summing to one
 */
public INDArray output(INDArray input, TrainingMode train) {
  return output(input, train == TrainingMode.TRAIN);
}
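To make the train/test distinction above concrete, here is a hedged sketch of both calls; per the Javadoc, TRAIN keeps training-time behaviour such as dropout active, while TEST disables it for inference. The `net` and `features` variables are again assumptions.

// Sketch: the same forward pass in the two modes
INDArray testOut  = net.output(features, Layer.TrainingMode.TEST);  // inference: dropout disabled
INDArray trainOut = net.output(features, Layer.TrainingMode.TRAIN); // training-time behaviour (e.g. dropout) applied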

Code example source: CampagneLaboratory/variationanalysis

public void updateWrongness(INDArray features, MultiLayerNetwork net) {
  INDArray predictedLabels = net.output(features, false);
  this.wrongness = ErrorRecord.calculateWrongness(0, predictedLabels, label);
}

Code example source: CampagneLaboratory/variationanalysis

public void predictForNext(MultiLayerNetwork network, Iterator<DataSet> iterator) {
  Arrays.fill(resultGraph, null);
  resultGraph[0] = network.output(iterator.next().getFeatures(), false);
}

Code example source: org.deeplearning4j/deeplearning4j-nn

/**
 * Evaluate the output
 * using the given true labels,
 * the input to the multi layer network
 * and the multi layer network to
 * use for evaluation
 * @param trueLabels the labels to use
 * @param input the input to the network to use
 *              for evaluation
 * @param network the network to use for output
 */
public void eval(INDArray trueLabels, INDArray input, MultiLayerNetwork network) {
  eval(trueLabels, network.output(input, Layer.TrainingMode.TEST));
}
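A hedged usage sketch for this Evaluation overload; `testLabels`, `testFeatures` and `net` are assumed to be an existing one-hot label matrix, the matching feature matrix and a trained network.

// Sketch: evaluate a trained network directly from raw inputs and true labels
Evaluation eval = new Evaluation();        // optionally new Evaluation(nClasses)
eval.eval(testLabels, testFeatures, net);  // internally calls net.output(testFeatures, TEST) as shown above
System.out.println(eval.stats());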

Code example source: org.deeplearning4j/deeplearning4j-nn

/**
 * Use to get the output from a featurized input
 *
 * @param input featurized data
 * @return output
 */
public INDArray outputFromFeaturized(INDArray input) {
  if (isGraph) {
    if (unFrozenSubsetGraph.getNumOutputArrays() > 1) {
      throw new IllegalArgumentException(
              "Graph has more than one output. Expecting an input array with outputFromFeaturized method call");
    }
    return unFrozenSubsetGraph.output(input)[0];
  } else {
    return unFrozenSubsetMLN.output(input);
  }
}
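A hedged usage note: this method belongs to TransferLearningHelper, so in practice the featurized input comes from the same helper's featurize(...) call. A minimal sketch, assuming `helper` was already built around a pretrained network and `rawDataSet` is an existing DataSet:

// Sketch (assumes an existing TransferLearningHelper `helper` and DataSet `rawDataSet`)
DataSet featurized = helper.featurize(rawDataSet);                    // run the frozen layers once
INDArray out = helper.outputFromFeaturized(featurized.getFeatures()); // forward through the unfrozen part only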

Code example source: sjsdfg/dl4j-tutorials

public static List<Double> getPredict(MultiLayerNetwork net, DataSetIterator iterator) {
    List<Double> labels = new LinkedList<>();
    while (iterator.hasNext()) {
      org.nd4j.linalg.dataset.DataSet dataSet = iterator.next();

      INDArray output = net.output(dataSet.getFeatures());

      long[] shape = output.shape();
      for (int i = 0; i < shape[0]; i++) {
        labels.add(output.getDouble(i));
      }
    }
    iterator.reset();

    return labels;
  }
}

Code example source: rahul-raj/Deeplearning4J

public INDArray generateOutput(File file) throws IOException, InterruptedException {
  File modelFile = new File("model.zip");
  MultiLayerNetwork restored = ModelSerializer.restoreMultiLayerNetwork(modelFile);
  RecordReader recordReader = generateSchemaAndReaderForPrediction(file);
  INDArray array = RecordConverter.toArray(recordReader.next());
  NormalizerStandardize normalizerStandardize = ModelSerializer.restoreNormalizerFromFile(modelFile);
  normalizerStandardize.transform(array);
  return restored.output(array,false);
}

Code example source: mccorby/FederatedAndroidTrainer

@Override
public String evaluate(FederatedDataSet federatedDataSet) {
  //evaluate the model on the test set
  DataSet testData = (DataSet) federatedDataSet.getNativeDataSet();
  double score = model.score(testData);
  Evaluation eval = new Evaluation(numClasses);
  INDArray output = model.output(testData.getFeatureMatrix());
  eval.eval(testData.getLabels(), output);
  return "Score: " + score;
}

Code example source: mccorby/FederatedAndroidTrainer

@Override
public String evaluate(FederatedDataSet federatedDataSet) {
  //evaluate the model on the test set
  DataSet testData = (DataSet) federatedDataSet.getNativeDataSet();
  RegressionEvaluation eval = new RegressionEvaluation(12);
  INDArray output = model.output(testData.getFeatureMatrix());
  eval.eval(testData.getLabels(), output);
  return "MSE: " + eval.meanSquaredError(11) + "\nScore: " + model.score();
}

Code example source: apache/opennlp-sandbox

@Override
public double[] categorize(String[] text, Map<String, Object> extraInformation) {
  INDArray seqFeatures = this.model.getGloves().embed(text, this.model.getMaxSeqLen());
  INDArray networkOutput = this.model.getNetwork().output(seqFeatures);
  long timeSeriesLength = networkOutput.size(2);
  INDArray probsAtLastWord = networkOutput.get(NDArrayIndex.point(0),
      NDArrayIndex.all(), NDArrayIndex.point(timeSeriesLength - 1));
  int nLabels = this.model.getLabels().size();
  double[] probs = new double[nLabels];
  for (int i = 0; i < nLabels; i++) {
    probs[i] = probsAtLastWord.getDouble(i);
  }
  return probs;
}

Code example source: mccorby/FederatedAndroidTrainer

@Override
public String evaluate(FederatedDataSet federatedDataSet) {
  DataSet testData = (DataSet) federatedDataSet.getNativeDataSet();
  List<DataSet> listDs = testData.asList();
  DataSetIterator iterator = new ListDataSetIterator(listDs, BATCH_SIZE);
  Evaluation eval = new Evaluation(OUTPUT_NUM); //create an evaluation object with 10 possible classes
  while (iterator.hasNext()) {
    DataSet next = iterator.next();
    INDArray output = model.output(next.getFeatureMatrix()); //get the networks prediction
    eval.eval(next.getLabels(), output); //check the prediction against the true class
  }
  return eval.stats();
}
