Usage of the gov.sandia.cognition.math.matrix.Vector.dotProduct() method, with code examples

Reposted by x33g5p2x on 2022-02-01, filed under Other

This article collects code examples of the Java method gov.sandia.cognition.math.matrix.Vector.dotProduct() and shows how it is used in practice. The examples were extracted from selected projects on platforms such as GitHub, Stack Overflow, and Maven, so they should serve as useful references. Details of Vector.dotProduct() are as follows:

Package path: gov.sandia.cognition.math.matrix.Vector
Class name: Vector
Method name: dotProduct

About Vector.dotProduct

Computes the inner (scalar) product of this vector with another vector of the same dimensionality, i.e. the sum of the element-wise products.
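
Before the library examples, here is a minimal plain-Java sketch of the arithmetic that dotProduct performs: the standard inner product Σᵢ aᵢbᵢ over two equal-length vectors. This is illustrative only and does not use the Foundry Vector class, which additionally handles dense and sparse storage:

```java
public class DotProductDemo {

    // Inner product of two equal-length vectors: sum of element-wise products.
    static double dotProduct(double[] a, double[] b) {
        if (a.length != b.length) {
            throw new IllegalArgumentException("dimension mismatch");
        }
        double sum = 0.0;
        for (int i = 0; i < a.length; i++) {
            sum += a[i] * b[i];
        }
        return sum;
    }

    public static void main(String[] args) {
        double[] a = {1.0, 2.0, 3.0};
        double[] b = {4.0, -5.0, 6.0};
        // 1*4 - 2*5 + 3*6 = 12.0
        System.out.println(dotProduct(a, b));
    }
}
```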

Code examples

Code example source: algorithmfoundry/Foundry

@Override
protected double computeScaleFactor(
  Vector gradientCurrent,
  Vector gradientPrevious )
{
  Vector direction = this.lineFunction.getDirection();
  
  Vector deltaGradient = gradientCurrent.minus( gradientPrevious );
  double deltaTgradient = deltaGradient.dotProduct( gradientCurrent );
  double denom = gradientPrevious.dotProduct( direction );
  double beta = -deltaTgradient / denom;
  return beta;
}

Code example source: gov.sandia.foundry/gov-sandia-cognition-learning-core

@Override
final protected void initializeSolver(MatrixVectorMultiplier function)
{
  this.A = function;
  x = super.x0;
  residual = rhs.minus(function.evaluate(x));
  delta = residual.dotProduct(residual);
}

Code example source: gov.sandia.foundry/gov-sandia-cognition-learning-core

/**
 * Computes the scale component for the inverse-gamma distribution
 * @return
 * Scale component for the inverse-gamma distribution
 */
public double getScale()
{
  Vector mean = this.getMean();
  Matrix Ci = this.covarianceInverse;
  return 0.5 * (this.outputSumSquared - mean.times(Ci).dotProduct(mean));
}

Code example source: gov.sandia.foundry/gov-sandia-cognition-learning-core

@Override
public double evaluateAsDouble(
  final Vectorizable input)
{
  return this.getWeightVector().dotProduct(input.convertToVector());
}

Code example source: gov.sandia.foundry/gov-sandia-cognition-learning-core

protected double computeScaleFactor(
  Vector gradientCurrent,
  Vector gradientPrevious )
{
  Vector deltaGradient = gradientCurrent.minus( gradientPrevious );
  double deltaTgradient = deltaGradient.dotProduct( gradientCurrent );
  double denom = gradientPrevious.norm2Squared();
  
  double beta = deltaTgradient / denom;
  return beta;
}
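
The variant above is the Polak–Ribière scale factor for nonlinear conjugate gradient: β = (gₖ − gₖ₋₁)ᵀgₖ / ‖gₖ₋₁‖². A plain-array sketch of the same arithmetic (the helper names are hypothetical, not the Foundry API):

```java
public class PolakRibiereDemo {

    // Inner product of two equal-length vectors.
    static double dot(double[] a, double[] b) {
        double s = 0.0;
        for (int i = 0; i < a.length; i++) {
            s += a[i] * b[i];
        }
        return s;
    }

    // Polak-Ribiere beta: (gCur - gPrev) . gCur / (gPrev . gPrev)
    static double computeScaleFactor(double[] gCur, double[] gPrev) {
        double[] deltaGradient = new double[gCur.length];
        for (int i = 0; i < gCur.length; i++) {
            deltaGradient[i] = gCur[i] - gPrev[i];
        }
        return dot(deltaGradient, gCur) / dot(gPrev, gPrev);
    }

    public static void main(String[] args) {
        double[] gPrev = {1.0, 0.0};
        double[] gCur  = {0.0, 2.0};
        // delta = (-1, 2); delta . gCur = 4; ||gPrev||^2 = 1 -> beta = 4.0
        System.out.println(computeScaleFactor(gCur, gPrev));
    }
}
```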

Code example source: gov.sandia.foundry/gov-sandia-cognition-learning-core

@Override
final protected void initializeSolver(MatrixVectorMultiplier function)
{
  this.A = function;
  x = super.x0;
  residual = rhs.minus(function.evaluate(x));
  d = residual;
  delta = residual.dotProduct(residual);
}

Code example source: gov.sandia.foundry/gov-sandia-cognition-learning-core

@Override
public UnivariateGaussian.PDF evaluate(
  Vectorizable input)
{
  // Bishop's equations 3.58-3.59
  Vector x = input.convertToVector();
  double mean = x.dotProduct( this.posterior.getMean() );
  double variance = x.times( this.posterior.getCovariance() ).dotProduct(x) + outputVariance;
  return new UnivariateGaussian.PDF( mean, variance );
}
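
In the predictive-distribution snippet above, dotProduct appears twice: once for the predictive mean xᵀμ and once inside the quadratic form xᵀΣx (Bishop's equations 3.58–3.59). A plain-array sketch of that arithmetic (illustrative helper names, not the Foundry API):

```java
public class PredictiveDemo {

    // Inner product of two equal-length vectors.
    static double dot(double[] a, double[] b) {
        double s = 0.0;
        for (int i = 0; i < a.length; i++) {
            s += a[i] * b[i];
        }
        return s;
    }

    // Row vector times matrix: y_j = sum_i x_i * m[i][j].
    static double[] vecTimesMatrix(double[] x, double[][] m) {
        double[] y = new double[m[0].length];
        for (int j = 0; j < y.length; j++) {
            for (int i = 0; i < x.length; i++) {
                y[j] += x[i] * m[i][j];
            }
        }
        return y;
    }

    public static void main(String[] args) {
        double[] x = {1.0, 2.0};
        double[] mu = {0.5, 0.25};                  // posterior mean
        double[][] cov = {{2.0, 0.0}, {0.0, 1.0}};  // posterior covariance
        double outputVariance = 0.1;

        double mean = dot(x, mu);                   // x^T mu
        double variance = dot(vecTimesMatrix(x, cov), x) + outputVariance; // x^T Sigma x + sigma^2
        System.out.println(mean + " " + variance);
    }
}
```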

Code example source: gov.sandia.foundry/gov-sandia-cognition-learning-core

@Override
final protected void initializeSolver(
  OverconstrainedMatrixVectorMultiplier function)
{
  this.A = function;
  x = super.x0;
  AtransB = (A.transposeMult(rhs));
  residual = AtransB.minus(function.evaluate(x));
  d = residual;
  delta = residual.dotProduct(residual);
}

Code example source: algorithmfoundry/Foundry

@Override
final protected void initializeSolver(
  MatrixVectorMultiplierWithPreconditioner function)
{
  this.A = function;
  x = super.x0;
  residual = rhs.minus(A.evaluate(x));
  d = A.precondition(residual);
  delta = residual.dotProduct(d);
}

Code example source: gov.sandia.foundry/gov-sandia-cognition-learning-core

@Override
public StudentTDistribution evaluate(
  Vectorizable input)
{
  Vector x = input.convertToVector();
  double mean = x.dotProduct( this.posterior.getMean() );
  double dofs = this.posterior.getInverseGamma().getShape() * 2.0;
  double v = x.times( this.posterior.getGaussian().getCovariance() ).dotProduct(x);
  double anbn = this.posterior.getInverseGamma().getShape() / this.posterior.getInverseGamma().getScale();
  double precision = anbn / (1.0 + v);
  return new StudentTDistribution( dofs, mean, precision );
}

Code example source: openimaj/openimaj

public boolean test_backtrack(Matrix W, Matrix grad, Matrix prox, double eta) {
  Matrix tmp = prox.clone();
  tmp.minusEquals(W);
  Vector tmpvec = tmp.getColumn(0);
  // Backtracking line-search test: f(prox) <= f(W) + <grad, prox - W> + (eta/2) * ||prox - W||
  return eval(prox) <= eval(W)
      + grad.getColumn(0).dotProduct(tmpvec)
      + 0.5 * eta * tmpvec.norm2();
}
