This article collects code examples for the Java method org.apache.gobblin.util.AvroUtils.removeUncomparableFields(), showing how AvroUtils.removeUncomparableFields() is used in practice. The examples are extracted from selected open-source projects hosted on platforms such as GitHub, Stack Overflow and Maven, and should serve as useful references. Details of the AvroUtils.removeUncomparableFields() method:
Package: org.apache.gobblin.util
Class: AvroUtils
Method: removeUncomparableFields
Remove map, array, enum fields, as well as union fields that contain map, array or enum, from an Avro schema. A schema with these fields cannot be used as Mapper key in a MapReduce job.
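The rule described above boils down to a membership test on the Avro type category. The sketch below illustrates that rule with a stand-in enum rather than Gobblin's or Avro's real API (the enum names are assumed to mirror org.apache.avro.Schema.Type):

```java
import java.util.EnumSet;
import java.util.Set;

public class ComparableKeyRule {
    // Stand-in for org.apache.avro.Schema.Type (assumption: names mirror Avro's).
    enum AvroType { RECORD, UNION, MAP, ARRAY, ENUM, FIXED, STRING, BYTES, INT, LONG, FLOAT, DOUBLE, BOOLEAN, NULL }

    // Per the description above: map, array and enum cannot serve as a MapReduce mapper key.
    static final Set<AvroType> UNCOMPARABLE = EnumSet.of(AvroType.MAP, AvroType.ARRAY, AvroType.ENUM);

    static boolean directlyComparable(AvroType t) {
        return !UNCOMPARABLE.contains(t);
    }

    public static void main(String[] args) {
        System.out.println(directlyComparable(AvroType.STRING)); // true
        System.out.println(directlyComparable(AvroType.MAP));    // false
    }
}
```

Records and unions are not in the set because they are handled recursively: a record keeps only its comparable fields, and a union survives only if every branch is comparable, as the examples below show.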
Code example source: apache/incubator-gobblin
/**
 * Remove map, array, enum fields, as well as union fields that contain map, array or enum,
 * from an Avro schema. A schema with these fields cannot be used as Mapper key in a
 * MapReduce job.
 */
public static Optional<Schema> removeUncomparableFields(Schema schema) {
  return removeUncomparableFields(schema, Sets.<Schema>newHashSet());
}
Code example source: apache/incubator-gobblin
private static Optional<Schema> removeUncomparableFieldsFromUnion(Schema union, Set<Schema> processed) {
  Preconditions.checkArgument(union.getType() == Schema.Type.UNION);
  if (processed.contains(union)) {
    return Optional.absent();
  }
  processed.add(union);
  List<Schema> newUnion = Lists.newArrayList();
  for (Schema unionType : union.getTypes()) {
    Optional<Schema> newType = removeUncomparableFields(unionType, processed);
    if (newType.isPresent()) {
      newUnion.add(newType.get());
    }
  }
  // Discard the union field if one or more types are removed from the union.
  if (newUnion.size() != union.getTypes().size()) {
    return Optional.absent();
  }
  return Optional.of(Schema.createUnion(newUnion));
}
Code example source: apache/incubator-gobblin
private static Optional<Schema> removeUncomparableFieldsFromRecord(Schema record, Set<Schema> processed) {
  Preconditions.checkArgument(record.getType() == Schema.Type.RECORD);
  if (processed.contains(record)) {
    return Optional.absent();
  }
  processed.add(record);
  List<Field> fields = Lists.newArrayList();
  for (Field field : record.getFields()) {
    Optional<Schema> newFieldSchema = removeUncomparableFields(field.schema(), processed);
    if (newFieldSchema.isPresent()) {
      fields.add(new Field(field.name(), newFieldSchema.get(), field.doc(), field.defaultValue()));
    }
  }
  Schema newSchema = Schema.createRecord(record.getName(), record.getDoc(), record.getNamespace(), false);
  newSchema.setFields(fields);
  return Optional.of(newSchema);
}
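Taken together, the union and record handlers above implement a recursive pruning pass with a visited set to guard against recursive schemas. The following self-contained sketch reproduces that logic on a toy schema model (a simplified Node type, not org.apache.avro.Schema) so the two asymmetric rules are easy to see: a record silently drops its uncomparable fields, while a union is discarded entirely if any branch is removed.

```java
import java.util.*;

// Toy model of the pruning algorithm above; the real code operates on org.apache.avro.Schema.
public class SchemaPruneSketch {
    enum Kind { RECORD, UNION, MAP, ARRAY, ENUM, PRIMITIVE }

    static class Node {
        final Kind kind;
        final List<Node> children = new ArrayList<>();
        Node(Kind kind, Node... kids) { this.kind = kind; children.addAll(Arrays.asList(kids)); }
    }

    // Returns the pruned node, or empty if this node itself must be dropped.
    static Optional<Node> prune(Node n, Set<Node> processed) {
        if (!processed.add(n)) {
            return Optional.empty();          // cycle guard, mirrors the 'processed' set above
        }
        switch (n.kind) {
            case MAP: case ARRAY: case ENUM:
                return Optional.empty();      // uncomparable: removed outright
            case UNION: {
                Node u = new Node(Kind.UNION);
                for (Node branch : n.children) {
                    prune(branch, processed).ifPresent(u.children::add);
                }
                // Discard the whole union if one or more branches were removed.
                return u.children.size() == n.children.size() ? Optional.of(u) : Optional.empty();
            }
            case RECORD: {
                Node r = new Node(Kind.RECORD);
                for (Node field : n.children) {
                    prune(field, processed).ifPresent(r.children::add); // drop uncomparable fields
                }
                return Optional.of(r);
            }
            default:
                return Optional.of(n);        // primitives are comparable
        }
    }

    public static void main(String[] args) {
        // record { string; map; union<string, array> }
        Node rec = new Node(Kind.RECORD,
            new Node(Kind.PRIMITIVE),
            new Node(Kind.MAP),
            new Node(Kind.UNION, new Node(Kind.PRIMITIVE), new Node(Kind.ARRAY)));
        Node pruned = prune(rec, new HashSet<>()).get();
        System.out.println(pruned.children.size()); // 1: only the primitive field survives
    }
}
```

Note that the union containing an array is dropped whole, even though its string branch is comparable, matching the "discard the union field if one or more types are removed" comment in the real implementation.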
Code example source: apache/incubator-gobblin
if (dedupKeyOption == MRCompactorAvroKeyDedupJobRunner.DedupKeyOption.ALL) {
  log.info("Using all attributes in the schema (except Map, Array and Enum fields) for compaction");
  keySchema = AvroUtils.removeUncomparableFields(topicSchema).get();
} else if (dedupKeyOption == MRCompactorAvroKeyDedupJobRunner.DedupKeyOption.KEY) {
  log.info("Using key attributes in the schema for compaction");
  keySchema = AvroUtils.removeUncomparableFields(MRCompactorAvroKeyDedupJobRunner.getKeySchema(topicSchema)).get();
} else if (keySchemaFileSpecified) {
  Path keySchemaFile = new Path(state.getProp(MRCompactorAvroKeyDedupJobRunner.COMPACTION_JOB_AVRO_KEY_SCHEMA_LOC));
  // (lines elided in the extracted snippet: the schema file is parsed here and
  // validated against the record schema; both failure paths fall back as below)
  log.error("Failed to parse avro schema from " + keySchemaFile
      + ", using key attributes in the schema for compaction");
  keySchema = AvroUtils.removeUncomparableFields(MRCompactorAvroKeyDedupJobRunner.getKeySchema(topicSchema)).get();
  log.warn(String.format("Key schema %s is not compatible with record schema %s. ", keySchema, topicSchema)
      + "Using key attributes in the schema for compaction");
  keySchema = AvroUtils.removeUncomparableFields(MRCompactorAvroKeyDedupJobRunner.getKeySchema(topicSchema)).get();
} else {
  log.info("Using key attributes in the schema for compaction");
  keySchema = AvroUtils.removeUncomparableFields(MRCompactorAvroKeyDedupJobRunner.getKeySchema(topicSchema)).get();
}
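Every branch of the selection logic above ultimately passes its candidate through removeUncomparableFields, and both error paths fall back to the pruned key attributes. The condensed sketch below captures just that decision shape with hypothetical stand-in names (chooseKeySchema, pruneAll, pruneKey, loadFromFile are illustrations, not Gobblin APIs):

```java
import java.util.Optional;
import java.util.function.Supplier;

public class DedupKeyChoice {
    enum DedupKeyOption { ALL, KEY, CUSTOM }

    // Hypothetical stand-ins: 'pruneAll' mimics removeUncomparableFields(topicSchema),
    // 'pruneKey' mimics removeUncomparableFields(getKeySchema(topicSchema)), and
    // 'loadFromFile' mimics reading a user-supplied key schema (empty on failure).
    static String chooseKeySchema(DedupKeyOption opt, Supplier<String> pruneAll,
                                  Supplier<String> pruneKey, Supplier<Optional<String>> loadFromFile) {
        switch (opt) {
            case ALL:
                return pruneAll.get();                      // every comparable attribute
            case CUSTOM:
                // fall back to the pruned key attributes when the file cannot be used
                return loadFromFile.get().orElseGet(pruneKey);
            default:
                return pruneKey.get();                      // key attributes only
        }
    }

    public static void main(String[] args) {
        String chosen = chooseKeySchema(DedupKeyOption.CUSTOM,
            () -> "all", () -> "key", Optional::empty);
        System.out.println(chosen); // "key": file could not be used, fell back
    }
}
```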
Code example source: apache/incubator-gobblin
if (dedupKeyOption == DedupKeyOption.ALL) {
  LOG.info("Using all attributes in the schema (except Map, Array and Enum fields) for compaction");
  keySchema = AvroUtils.removeUncomparableFields(topicSchema).get();
} else if (dedupKeyOption == DedupKeyOption.KEY) {
  LOG.info("Using key attributes in the schema for compaction");
  keySchema = AvroUtils.removeUncomparableFields(getKeySchema(topicSchema)).get();
} else if (keySchemaFileSpecified()) {
  Path keySchemaFile = getKeySchemaFile();
  // (lines elided in the extracted snippet: the schema file is parsed and
  // validated here; on failure the code falls back as below)
  LOG.error("Failed to parse avro schema from " + keySchemaFile
      + ", using key attributes in the schema for compaction");
  keySchema = AvroUtils.removeUncomparableFields(getKeySchema(topicSchema)).get();
} else {
  LOG.info("Using key attributes in the schema for compaction");
  keySchema = AvroUtils.removeUncomparableFields(getKeySchema(topicSchema)).get();
}
Code example source: apache/incubator-gobblin
@Test
public void testGetKeySchemaWithoutPrimaryKey() throws IOException {
  try (InputStream schemaNoPkey = getClass().getClassLoader().getResourceAsStream("dedup-schema/dedup-schema-without-pkey.avsc")) {
    Schema topicSchema = new Schema.Parser().parse(schemaNoPkey);
    Schema actualKeySchema = this.runner.getKeySchema(this.job, topicSchema);
    Assert.assertEquals(actualKeySchema, AvroUtils.removeUncomparableFields(topicSchema).get());
  }
}