Usage and code examples of org.talend.daikon.avro.AvroUtils.isSchemaEmpty()


This article collects Java code examples for the org.talend.daikon.avro.AvroUtils.isSchemaEmpty() method and shows how it is used in practice. The examples were extracted from selected open-source projects published on GitHub, Stack Overflow, and Maven, and are intended as a practical reference. Details of AvroUtils.isSchemaEmpty():
Package path: org.talend.daikon.avro.AvroUtils
Class name: AvroUtils
Method name: isSchemaEmpty

About AvroUtils.isSchemaEmpty

Checks whether a schema is empty before it is used: returns true if the schema is empty, false otherwise.
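
As a quick illustration of that contract, here is a minimal sketch. The schemas are built with Avro's standard SchemaBuilder purely for illustration, and the expected results follow from the method description above rather than from running the library:

import org.apache.avro.Schema;
import org.apache.avro.SchemaBuilder;
import org.talend.daikon.avro.AvroUtils;

public class IsSchemaEmptyDemo {

  public static void main(String[] args) {
    // A record schema that declares no fields at all.
    Schema emptySchema = SchemaBuilder.record("Empty").fields().endRecord();

    // A record schema with a single string field.
    Schema personSchema = SchemaBuilder.record("Person").fields()
        .requiredString("name")
        .endRecord();

    System.out.println(AvroUtils.isSchemaEmpty(emptySchema));   // expected: true
    System.out.println(AvroUtils.isSchemaEmpty(personSchema));  // expected: false
  }
}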

Code examples

Example source: org.talend.components/components-jdbc-runtime (the same code also appears in Talend/components on GitHub)

// If the design-time schema is empty, infer it from the JDBC ResultSet metadata.
if (AvroUtils.isSchemaEmpty(querySchema)) {
  querySchema = source.infer(resultSet.getMetaData(), container);
}

Example source: Talend/components (also published as org.talend.components/components-jdbc-runtime-beam)

@Override
public ValidationResult initialize(RuntimeContainer container, JDBCInputProperties properties) {
  this.properties = properties;
  // In Beam, JdbcIO always has a repartition event, so we are obligated to fetch the schema before any processing
  // occurs in the nodes.
  Schema schema = properties.getDatasetProperties().main.schema.getValue();
  if (schema == null || AvroUtils.isSchemaEmpty(schema) || AvroUtils.isIncludeAllFields(schema)) {
    JDBCDatasetRuntime schemaFetcher = new JDBCDatasetRuntime();
    schemaFetcher.initialize(container, properties.getDatasetProperties());
    schema = schemaFetcher.getSchema();
  }
  this.defaultOutputCoder = AvroCoder.of(schema);
  return ValidationResult.OK;
}
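
The guard at the top of this initialize() method recurs throughout the examples below: the design-time schema is only trusted when it is non-null, non-empty, and not an "include all fields" placeholder; otherwise a dataset runtime is asked for the actual schema. Below is a minimal sketch of that pattern as a stand-alone helper; SchemaGuard and resolve are hypothetical names used for illustration, not part of Daikon or the Talend components:

import java.util.function.Supplier;

import org.apache.avro.Schema;
import org.talend.daikon.avro.AvroUtils;

final class SchemaGuard {

  // Hypothetical helper: keep the design-time schema when it is usable, otherwise
  // fall back to a schema fetched at runtime (for example from a dataset runtime).
  static Schema resolve(Schema designSchema, Supplier<Schema> runtimeFetcher) {
    if (designSchema == null
        || AvroUtils.isSchemaEmpty(designSchema)
        || AvroUtils.isIncludeAllFields(designSchema)) {
      return runtimeFetcher.get();
    }
    return designSchema;
  }
}

With such a helper, the schema resolution in the initialize() bodies above and below reduces to a single call such as SchemaGuard.resolve(designSchema, schemaFetcher::getSchema).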

Example source: org.talend.components/bigquery-runtime (also in Talend/components)

@DoFn.ProcessElement
  public void processElement(ProcessContext c) throws IOException {
    IndexedRecord row = c.element();
    if (row == null) {
      return;
    }
    if (converter == null) {
      converter = new BigQueryTableRowIndexedRecordConverter();
      if (!AvroUtils.isSchemaEmpty(row.getSchema()) && !AvroUtils.isIncludeAllFields(row.getSchema())) {
        converter.setSchema(row.getSchema());
      }
    }
    c.output(converter.convertToDatum(row));
  }
}

Example source: Talend/components (also in org.talend.components/bigquery-runtime)

.getValue() == BigQueryOutputProperties.TableOperation.DROP_IF_EXISTS_AND_CREATE) {
  Schema designSchema = properties.getDatasetProperties().main.schema.getValue();
  if (designSchema != null && !AvroUtils.isSchemaEmpty(designSchema)
      && !AvroUtils.isIncludeAllFields(designSchema)) {
    bqSchema = BigQueryAvroRegistry.get().guessBigQuerySchema(designSchema);

Example source: Talend/components (also in org.talend.components/bigquery-runtime)

@Override
public ValidationResult initialize(RuntimeContainer container, BigQueryInputProperties properties) {
  this.properties = properties;
  this.dataset = properties.getDatasetProperties();
  this.datastore = dataset.getDatastoreProperties();
  // Data returned by BigQueryIO does not contain its own schema, so it has to be retrieved
  // before the read and write operations
  Schema schema = properties.getDatasetProperties().main.schema.getValue();
  if (schema == null || AvroUtils.isSchemaEmpty(schema) || AvroUtils.isIncludeAllFields(schema)) {
    BigQueryDatasetRuntime schemaFetcher = new BigQueryDatasetRuntime();
    schemaFetcher.initialize(container, properties.getDatasetProperties());
    schema = schemaFetcher.getSchema();
  }
  Object pipelineOptionsObj = container.getGlobalData(BeamJobRuntimeContainer.PIPELINE_OPTIONS);
  if (pipelineOptionsObj != null) {
    PipelineOptions pipelineOptions = (PipelineOptions) pipelineOptionsObj;
    GcpServiceAccountOptions gcpOptions = pipelineOptions.as(GcpServiceAccountOptions.class);
    if (!"DataflowRunner".equals(gcpOptions.getRunner().getSimpleName())) {
      // when using the Dataflow runner, these properties have already been set at the pipeline level
      gcpOptions.setProject(datastore.projectName.getValue());
      gcpOptions.setTempLocation(datastore.tempGsFolder.getValue());
      gcpOptions.setCredentialFactoryClass(ServiceAccountCredentialFactory.class);
      gcpOptions.setServiceAccountFile(datastore.serviceAccountFile.getValue());
      gcpOptions.setGcpCredential(BigQueryConnection.createCredentials(datastore));
    }
  }
  this.defaultOutputCoder = AvroCoder.of(schema);
  return ValidationResult.OK;
}
