Usage of org.deeplearning4j.nn.multilayer.MultiLayerNetwork.rnnTimeStep() with code examples

This article collects a number of Java code examples for the org.deeplearning4j.nn.multilayer.MultiLayerNetwork.rnnTimeStep() method and shows how it is used in practice. The examples come mainly from platforms such as GitHub, Stack Overflow and Maven, extracted from selected projects, and should serve as a practical reference. Details of MultiLayerNetwork.rnnTimeStep() are as follows:
Package path: org.deeplearning4j.nn.multilayer.MultiLayerNetwork
Class: MultiLayerNetwork
Method: rnnTimeStep

About MultiLayerNetwork.rnnTimeStep

If this MultiLayerNetwork contains one or more RNN layers: conduct a forward pass (prediction), but use the previously stored state for any RNN layers. The activations for the final step are also stored in the RNN layers, for use the next time rnnTimeStep() is called.
This method can be used to generate output one or more steps at a time, instead of always having to do a forward pass from t=0. Example uses are streaming data, and generating samples from network output one step at a time (where the samples are then fed back into the network as input).
If no previous state is present in the RNN layers (i.e., initially, or after calling rnnClearPreviousState()), the default initialization (usually 0) is used.
Supports mini-batches (i.e., multiple predictions / forward passes in parallel) as well as single examples.
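
To make the description above concrete, here is a minimal, self-contained sketch of the call pattern (it is not taken from the projects cited below): prime the stored state with an initialization sequence, generate further output one step at a time, then reset the state with rnnClearPreviousState(). The network configuration, variable names and random inputs are placeholders chosen only for illustration, assuming a recent DL4J version.

import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.LSTM;
import org.deeplearning4j.nn.conf.layers.RnnOutputLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class RnnTimeStepSketch {
    public static void main(String[] args) {
        int nIn = 10, nHidden = 20, nOut = 10;

        // Tiny untrained network, only to demonstrate the rnnTimeStep() call pattern.
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .list()
                .layer(0, new LSTM.Builder().nIn(nIn).nOut(nHidden).activation(Activation.TANH).build())
                .layer(1, new RnnOutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .activation(Activation.SOFTMAX).nIn(nHidden).nOut(nOut).build())
                .build();
        MultiLayerNetwork net = new MultiLayerNetwork(conf);
        net.init();

        // Prime the stored RNN state with an initialization sequence: [miniBatch, nIn, timeSeriesLength].
        INDArray initSequence = Nd4j.rand(new int[]{1, nIn, 5});
        INDArray output = net.rnnTimeStep(initSequence);

        // Generate further output one step at a time; each call continues from the stored state.
        for (int i = 0; i < 3; i++) {
            INDArray nextInput = Nd4j.rand(new int[]{1, nIn, 1});   // a single time step
            output = net.rnnTimeStep(nextInput);
        }

        // Clear the stored state before processing an unrelated sequence.
        net.rnnClearPreviousState();
        System.out.println("Last output shape: " + java.util.Arrays.toString(output.shape()));
    }
}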

Code examples

Code example source: deeplearning4j/dl4j-examples

// Prime the RNN state with the initialization input, keeping only the last time step of the output:
INDArray output = net.rnnTimeStep(initializationInput);
output = output.tensorAlongDimension((int)output.size(2) - 1, 1, 0);    //Gets the last time step output
// Later, inside the generation loop, each call advances the network by a single step:
output = net.rnnTimeStep(nextInput);    //Do one time step of forward pass
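
The lines above come from a character-generation example, where the distribution produced at each step is sampled and fed back in as the next input (the "samples fed back into the network" use mentioned in the description). A rough sketch of that loop follows, reusing the net and nOut placeholders from the sketch above; the names are illustrative, not the example's own.

// Greedy variant of a sample-and-feed-back loop (the dl4j-examples code samples from the
// distribution instead of taking the argmax). Assumes nIn == nOut, i.e. one-hot inputs.
int stepsToGenerate = 50;
INDArray nextInput = Nd4j.zeros(1, nOut);                        // one-hot input, single time step
for (int step = 0; step < stepsToGenerate; step++) {
    INDArray stepOutput = net.rnnTimeStep(nextInput);            // 2d in, 2d out: [1, nOut]
    int chosen = Nd4j.argMax(stepOutput, 1).getInt(0);           // pick the most likely class
    nextInput = Nd4j.zeros(1, nOut);
    nextInput.putScalar(new int[]{0, chosen}, 1.0);              // feed the choice back as the next input
}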

Code example source: org.deeplearning4j/deeplearning4j-nn

// From MultiLayerNetwork's own rnnTimeStep(): layers are dispatched by type.
if (layers[i] instanceof RecurrentLayer) {
    input = ((RecurrentLayer) layers[i]).rnnTimeStep(input);
} else if (layers[i] instanceof MultiLayerNetwork) {
    input = ((MultiLayerNetwork) layers[i]).rnnTimeStep(input);
} else {
    input = layers[i].activate(input, false);
}

Code example source: sjsdfg/dl4j-tutorials

// Prime the stored RNN state by stepping over the training features...
net.rnnTimeStep(trainData.getFeatures());
// ...then predict on the test features, continuing from the state left by the priming pass.
INDArray predicted = net.rnnTimeStep(testData.getFeatures());

Code example source: sjsdfg/dl4j-tutorials

// In the original example the two calls operate on different DataSet objects: the first primes the
// state, the second produces the prediction, which is then mapped back to the original scale.
net.rnnTimeStep(t.getFeatures());
INDArray predicted = net.rnnTimeStep(t.getFeatures());
normalizer.revertLabels(predicted);

Code example source: org.deeplearning4j/deeplearning4j-nn

// The analogous dispatch inside ComputationGraph's rnnTimeStep(), for a graph vertex's layer 'l'.
if (l instanceof RecurrentLayer) {
    out = ((RecurrentLayer) l).rnnTimeStep(current.getInputs()[0]);
} else if (l instanceof MultiLayerNetwork) {
    out = ((MultiLayerNetwork) l).rnnTimeStep(current.getInputs()[0]);
} else {
