This article collects code examples of the Java method org.apache.hadoop.hive.ql.exec.Utilities.setMapRedWork(), showing how Utilities.setMapRedWork() is used in practice. The examples are drawn from selected open-source projects on platforms such as GitHub, Stack Overflow, and Maven, and should serve as useful references. Details of the Utilities.setMapRedWork() method are as follows:
Package path: org.apache.hadoop.hive.ql.exec.Utilities
Class name: Utilities
Method name: setMapRedWork
(No method description is available.)
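The examples below all follow one pattern: setMapRedWork serializes the MapredWork plan into a scratch directory and records its location in the job configuration, so that tasks can later recover it via getMapRedWork/getMapWork. As a rough illustration of that mechanism only (this is not Hive's actual implementation; the class, file, and key names below are hypothetical stand-ins, and plain Java serialization is used in place of Hive's plan serialization), a round trip might look like:

```java
import java.io.*;
import java.nio.file.*;
import java.util.*;

public class PlanRoundTrip {
    // Hypothetical stand-in for MapredWork: any serializable plan object.
    static class Work implements Serializable {
        final Map<String, String> pathToPartitionInfo = new LinkedHashMap<>();
    }

    // Sketch of the setMapRedWork idea: serialize the plan under the scratch
    // directory and record its location in the "job configuration" (a plain map here).
    static void setWork(Map<String, String> conf, Work work, Path scratchDir) throws IOException {
        Files.createDirectories(scratchDir);
        Path planPath = scratchDir.resolve("plan.ser");
        try (ObjectOutputStream out = new ObjectOutputStream(Files.newOutputStream(planPath))) {
            out.writeObject(work);
        }
        conf.put("hive.exec.plan", planPath.toString());
    }

    // Sketch of the getMapRedWork idea: look up the recorded path and deserialize.
    static Work getWork(Map<String, String> conf) throws IOException, ClassNotFoundException {
        Path planPath = Paths.get(conf.get("hive.exec.plan"));
        try (ObjectInputStream in = new ObjectInputStream(Files.newInputStream(planPath))) {
            return (Work) in.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        Map<String, String> conf = new HashMap<>();
        Work work = new Work();
        work.pathToPartitionInfo.put("/tmp/testfolder", "partDesc");
        Path scratch = Files.createTempDirectory("hive-scratch");
        setWork(conf, work, scratch);
        Work recovered = getWork(conf);
        System.out.println(recovered.pathToPartitionInfo.get("/tmp/testfolder")); // prints partDesc
    }
}
```

This also explains the cleanup step seen in testAvoidSplitCombination below: once the plan has been written under the scratch path, the caller is responsible for deleting that path when done.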
Code example source: origin: apache/hive

@Override
protected void setUp() throws IOException {
  conf = new Configuration();
  job = new JobConf(conf);
  TableDesc tblDesc = Utilities.defaultTd;
  PartitionDesc partDesc = new PartitionDesc(tblDesc, null);
  LinkedHashMap<Path, PartitionDesc> pt = new LinkedHashMap<>();
  pt.put(new Path("/tmp/testfolder"), partDesc);
  MapredWork mrwork = new MapredWork();
  mrwork.getMapWork().setPathToPartitionInfo(pt);
  Utilities.setMapRedWork(job, mrwork, new Path("/tmp/" + System.getProperty("user.name"), "hive"));
  fileSystem = FileSystem.getLocal(conf);
  testDir = new Path(System.getProperty("test.tmp.dir",
      System.getProperty("user.dir", new File(".").getAbsolutePath()))
      + "/TestSymlinkTextInputFormat");
  reporter = Reporter.NULL;
  fileSystem.delete(testDir, true);
  dataDir1 = new Path(testDir, "datadir1");
  dataDir2 = new Path(testDir, "datadir2");
  symlinkDir = new Path(testDir, "symlinkdir");
}
Code example source: origin: apache/hive

public void testAvoidSplitCombination() throws Exception {
  Configuration conf = new Configuration();
  JobConf job = new JobConf(conf);
  TableDesc tblDesc = Utilities.defaultTd;
  tblDesc.setInputFileFormatClass(TestSkipCombineInputFormat.class);
  PartitionDesc partDesc = new PartitionDesc(tblDesc, null);
  LinkedHashMap<Path, PartitionDesc> pt = new LinkedHashMap<>();
  pt.put(new Path("/tmp/testfolder1"), partDesc);
  pt.put(new Path("/tmp/testfolder2"), partDesc);
  MapredWork mrwork = new MapredWork();
  mrwork.getMapWork().setPathToPartitionInfo(pt);
  Path mapWorkPath = new Path("/tmp/" + System.getProperty("user.name"), "hive");
  Utilities.setMapRedWork(conf, mrwork, mapWorkPath);
  try {
    Path[] paths = new Path[2];
    paths[0] = new Path("/tmp/testfolder1");
    paths[1] = new Path("/tmp/testfolder2");
    CombineHiveInputFormat combineInputFormat =
        ReflectionUtils.newInstance(CombineHiveInputFormat.class, conf);
    combineInputFormat.pathToPartitionInfo =
        Utilities.getMapWork(conf).getPathToPartitionInfo();
    Set results = combineInputFormat.getNonCombinablePathIndices(job, paths, 2);
    assertEquals("Should have both path indices in the results set", 2, results.size());
  } finally {
    // Clean up the map-work path
    FileSystem.get(conf).delete(mapWorkPath, true);
  }
}
Code example source: origin: apache/hive

private void init() throws IOException {
  conf = new JobConf();
  resetIOContext();
  rcfReader = mock(RCFileRecordReader.class);
  when(rcfReader.next((LongWritable) anyObject(),
      (BytesRefArrayWritable) anyObject())).thenReturn(true);
  // Since the start is 0 and the length is 100, the first call to sync should be with the value
  // 50, so return that for getPos()
  when(rcfReader.getPos()).thenReturn(50L);
  conf.setBoolean("hive.input.format.sorted", true);
  TableDesc tblDesc = Utilities.defaultTd;
  PartitionDesc partDesc = new PartitionDesc(tblDesc, null);
  LinkedHashMap<Path, PartitionDesc> pt = new LinkedHashMap<>();
  pt.put(new Path("/tmp/testfolder"), partDesc);
  MapredWork mrwork = new MapredWork();
  mrwork.getMapWork().setPathToPartitionInfo(pt);
  Utilities.setMapRedWork(conf, mrwork, new Path("/tmp/" + System.getProperty("user.name"), "hive"));
  hiveSplit = new TestHiveInputSplit();
  hbsReader = new TestHiveRecordReader(rcfReader, conf);
  hbsReader.initIOContext(hiveSplit, conf, Class.class, rcfReader);
}
Code example source: origin: apache/hive

Utilities.setMapRedWork(job, mrwork, new Path(System.getProperty("java.io.tmpdir") + File.separator
    + System.getProperty("user.name") + File.separator + "hive"));
MapredWork mrwork2 = Utilities.getMapRedWork(job);
Code example source: origin: apache/hive
Utilities.setMapRedWork(job, mrWork, ctx.getMRTmpPath());
Code example source: origin: apache/drill
Utilities.setMapRedWork(job, mrWork, ctx.getMRTmpPath());
Code example source: origin: apache/hive
Utilities.setInputPaths(newJob, inputPaths);
Utilities.setMapRedWork(newJob, selectTask.getWork(), ctx.getMRTmpPath());
Code example source: origin: apache/hive
Utilities.setInputPaths(job, inputPaths);
Utilities.setMapRedWork(job, work, ctx.getMRTmpPath());
Code example source: origin: apache/drill
Utilities.setInputPaths(job, inputPaths);
Utilities.setMapRedWork(job, work, ctx.getMRTmpPath());
Code example source: origin: apache/drill
Utilities.setMapRedWork(job, mrWork, ctx.getMRTmpPath());
Code example source: origin: org.apache.hadoop.hive/hive-exec
Utilities.setMapRedWork(job, work, ctx.getMRTmpFileURI());
Code example source: origin: com.facebook.presto.hive/hive-apache
Utilities.setMapRedWork(job, mrWork, ctx.getMRTmpPath());
Code example source: origin: com.facebook.presto.hive/hive-apache
Utilities.setInputPaths(job, inputPaths);
Utilities.setMapRedWork(job, work, ctx.getMRTmpPath());
Code example source: origin: com.facebook.presto.hive/hive-apache
Utilities.setMapRedWork(job, mrWork, ctx.getMRTmpPath());