Usage of the gov.sandia.cognition.math.matrix.Vector.dotTimesEquals() method, with code examples


This article collects Java code examples for the gov.sandia.cognition.math.matrix.Vector.dotTimesEquals() method and shows how Vector.dotTimesEquals() is used in practice. The examples are drawn from selected projects on GitHub, Stack Overflow, and Maven, so they should serve as useful references. Details of Vector.dotTimesEquals() follow:
Package path: gov.sandia.cognition.math.matrix.Vector
Class name: Vector
Method name: dotTimesEquals

Vector.dotTimesEquals overview

dotTimesEquals(other) performs an in-place element-wise (Hadamard) multiplication: each entry of this vector is multiplied by the corresponding entry of other, and the result is written back into this vector. The non-mutating counterpart, dotTimes(other), returns the product as a new vector, as the first example below shows.
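A minimal sketch of the behavior (not taken from the examples below; the values are invented, and VectorFactory.copyValues is assumed to be available on the default factory):

import gov.sandia.cognition.math.matrix.Vector;
import gov.sandia.cognition.math.matrix.VectorFactory;

public class DotTimesEqualsDemo
{
    public static void main(final String[] args)
    {
        // Invented values for illustration.
        final Vector a = VectorFactory.getDefault().copyValues(1.0, 2.0, 3.0);
        final Vector b = VectorFactory.getDefault().copyValues(4.0, 5.0, 6.0);

        // In-place element-wise multiply: a becomes (4.0, 10.0, 18.0); b is unchanged.
        a.dotTimesEquals(b);
        System.out.println(a);
    }
}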

Code examples

Code example source: algorithmfoundry/Foundry (also shipped as gov.sandia.foundry/gov-sandia-cognition-common-core)

@Override
final public Vector dotTimes(
  final Vector v)
{
  // By switch from this.dotTimes(v) to v.dotTimes(this), we get sparse
  // vectors dotted with dense still being sparse and dense w/ dense is
  // still dense.  The way this was originally implemented in the Foundry
  // (this.clone().dotTimesEquals(v)), if v is sparse, it returns a
  // dense vector type storing sparse data.
  Vector result = v.clone();
  result.dotTimesEquals(this);
  return result;
}
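A short sketch of the sparse/dense concern described in the comment above (the vectors are invented, and VectorFactory.getSparseDefault() is assumed to return the Foundry's sparse factory):

// Assumption: getSparseDefault() provides a sparse vector factory.
final Vector dense = VectorFactory.getDefault().copyValues(1.0, 0.0, 2.0);
final Vector sparse = VectorFactory.getSparseDefault().createVector(3);
sparse.setElement(2, 5.0);

// dense.clone().dotTimesEquals(sparse) would yield a densely stored vector that is
// mostly zeros; cloning the sparse argument keeps the result sparse.
final Vector product = sparse.clone();
product.dotTimesEquals(dense);  // only element 2 is non-zero: 5.0 * 2.0 = 10.0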

Code example source: algorithmfoundry/Foundry (also shipped as gov.sandia.foundry/gov-sandia-cognition-learning-core)

/**
 * Evaluates the weighted Euclidean distance between two vectors.
 *
 * @param   first
 *      The first vector.
 * @param   second
 *      The second vector.
 * @return
 *      The weighted Euclidean distance between  the two vectors.
 */
@Override
public double evaluate(
  final Vectorizable first,
  final Vectorizable second)
{
  // \sqrt(\sum_i w_i * (x_i - y_i)^2)
  // First compute the difference between the two vectors.
  final Vector difference =
    first.convertToVector().minus(second.convertToVector());
  // Now square it.
  difference.dotTimesEquals(difference);
  // Now compute the square root of the weights times the squared
  // difference.
  return Math.sqrt(this.weights.dotProduct(difference));
}
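The same pattern outside the class, as a standalone sketch (the helper name is hypothetical; weights, x, and y are assumed to have equal dimensionality):

// Hypothetical helper, not part of the Foundry: sqrt(sum_i w_i * (x_i - y_i)^2).
static double weightedEuclideanDistance(final Vector weights, final Vector x, final Vector y)
{
    final Vector difference = x.minus(y);   // x - y as a new vector
    difference.dotTimesEquals(difference);  // square each element in place
    return Math.sqrt(weights.dotProduct(difference));
}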

Code example source: algorithmfoundry/Foundry

// Element-wise multiply by the current observation likelihoods, then rescale
// so that the entries of nextDelta sum to 1 (norm1 is the L1 norm).
nextDelta.dotTimesEquals(bn);
nextDelta.scaleEquals( 1.0/nextDelta.norm1() );

Code example source: algorithmfoundry/Foundry (also shipped as gov.sandia.foundry/gov-sandia-cognition-text-core)

weights.dotTimesEquals(globalWeights);

Code example source: algorithmfoundry/Foundry (also shipped as gov.sandia.foundry/gov-sandia-cognition-learning-core)

alphaNext.dotTimesEquals(b);

Code example source: algorithmfoundry/Foundry (also shipped as gov.sandia.foundry/gov-sandia-cognition-learning-core)

// Element-wise multiply alpha by b, capture its L1 norm in weight, and then
// rescale alpha so that its entries sum to 1.
alpha.dotTimesEquals(b);
final double weight = alpha.norm1();
alpha.scaleEquals(1.0/weight);

Code example source: algorithmfoundry/Foundry (also shipped as gov.sandia.foundry/gov-sandia-cognition-learning-core)

/**
 * Computes the Viterbi recursion for a given "delta" and "b"
 * @param delta
 * Previous value of the Viterbi recursion.
 * @param bn
 * Current observation likelihood.
 * @return
 * Updated "delta" and state backpointers.
 */
protected Pair<Vector,int[]> computeViterbiRecursion(
  Vector delta,
  Vector bn )
{
  final int k = delta.getDimensionality();
  final Vector dn = VectorFactory.getDefault().createVector(k);
  final int[] psi = new int[ k ];
  for( int i = 0; i < k; i++ )
  {
    WeightedValue<Integer> transition =
      this.findMostLikelyState(i, delta);
    psi[i] = transition.getValue();
    dn.setElement(i, transition.getWeight());
  }
  dn.dotTimesEquals( bn );
  delta = dn;
  delta.scaleEquals( 1.0/delta.norm1() );
  return DefaultPair.create( delta, psi );
}
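To make the role of dotTimesEquals in this recursion concrete, a small worked sketch for a two-state chain (all values invented):

// Hypothetical best incoming transition weights (dn) and observation likelihoods (bn).
final Vector dn = VectorFactory.getDefault().copyValues(0.30, 0.10);
final Vector bn = VectorFactory.getDefault().copyValues(0.50, 0.20);

dn.dotTimesEquals(bn);             // element-wise product: (0.15, 0.02)
dn.scaleEquals(1.0 / dn.norm1());  // rescale to sum to 1: approximately (0.882, 0.118)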
