Usage and code examples of the org.apache.hadoop.util.Shell.execCommand() method

x33g5p2x · reposted 2022-01-30 under "Other"

This article collects Java code examples of the org.apache.hadoop.util.Shell.execCommand() method and shows how Shell.execCommand() is used in practice. The examples are drawn from selected projects on platforms such as GitHub, Stack Overflow, and Maven, and should serve as useful references. Details of Shell.execCommand():
Package path: org.apache.hadoop.util.Shell
Class name: Shell
Method name: execCommand

About Shell.execCommand

Static method to execute a shell command. Covers most of the simple cases without requiring the user to implement the Shell interface.
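For readers who want to experiment without a Hadoop classpath at hand, the behavior described above can be approximated with plain java.lang.ProcessBuilder. This is an illustrative analogue only, not Hadoop's implementation: the real Shell.execCommand additionally handles timeouts, throws ExitCodeException on non-zero exit, and has Windows-specific handling. The class name ExecSketch is hypothetical.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.Map;

public class ExecSketch {
  /**
   * Runs a command, optionally with extra environment entries,
   * and returns its combined stdout/stderr as a String --
   * roughly the contract of Shell.execCommand(env, cmd).
   */
  public static String execCommand(Map<String, String> env, String... cmd)
      throws IOException, InterruptedException {
    ProcessBuilder pb = new ProcessBuilder(cmd);
    if (env != null) {
      // Overlay the caller-supplied environment on the inherited one.
      pb.environment().putAll(env);
    }
    pb.redirectErrorStream(true);
    Process p = pb.start();
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    try (InputStream in = p.getInputStream()) {
      byte[] buf = new byte[4096];
      int n;
      while ((n = in.read(buf)) != -1) {
        out.write(buf, 0, n);
      }
    }
    p.waitFor();
    return out.toString();
  }
}
```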

Code examples

Code example source: org.apache.hadoop/hadoop-common

/**
 * Static method to execute a shell command.
 * Covers most of the simple cases without requiring the user to implement
 * the <code>Shell</code> interface.
 * @param cmd shell command to execute.
 * @return the output of the executed command.
 */
public static String execCommand(String ... cmd) throws IOException {
 return execCommand(null, cmd, 0L);
}

Code example source: org.apache.hadoop/hadoop-common

/**
 * Static method to execute a shell command.
 * Covers most of the simple cases without requiring the user to implement
 * the <code>Shell</code> interface.
 * @param env the map of environment key=value
 * @param cmd shell command to execute.
 * @return the output of the executed command.
 * @throws IOException on any problem.
 */
public static String execCommand(Map<String,String> env, String ... cmd)
throws IOException {
 return execCommand(env, cmd, 0L);
}

Code example source: org.apache.hadoop/hadoop-common

static String execCommand(File f, String... cmd) throws IOException {
 String[] args = new String[cmd.length + 1];
 System.arraycopy(cmd, 0, args, 0, cmd.length);
 args[cmd.length] = f.getCanonicalPath();
 String output = Shell.execCommand(args);
 return output;
}

Code example source: alibaba/jstorm

/**
 * Dump out contents of $CWD and the environment to stdout for debugging
 */
private static void dumpOutDebugInfo() {
  LOG.info("Dump debug output");
  Map<String, String> envs = System.getenv();
  for (Map.Entry<String, String> env : envs.entrySet()) {
    LOG.info("System env: key=" + env.getKey() + ", val=" + env.getValue());
    System.out.println("System env: key=" + env.getKey() + ", val="
        + env.getValue());
  }
  BufferedReader buf = null;
  try {
    String lines = Shell.WINDOWS ? Shell.execCommand("cmd", "/c", "dir") :
        Shell.execCommand("ls", "-al");
    buf = new BufferedReader(new StringReader(lines));
    String line = "";
    while ((line = buf.readLine()) != null) {
      LOG.info("System CWD content: " + line);
      System.out.println("System CWD content: " + line);
    }
  } catch (IOException e) {
    e.printStackTrace();
  } finally {
    org.apache.hadoop.io.IOUtils.cleanup(LOG, buf);
  }
}

Code example source: org.apache.hadoop/hadoop-common

/**
  * Calls the shell to get users for a netgroup by invoking getent
  * netgroup; this is a low-level function that just returns the raw
  * string that the shell produces.
  *
  * @param netgroup get users for this netgroup
  * @return string of users for a given netgroup in getent netgroups format
  */
 protected String execShellGetUserForNetgroup(final String netgroup)
   throws IOException {
  String result = "";
  try {
   // shell command does not expect '@' at the beginning of the group name
   result = Shell.execCommand(
    Shell.getUsersForNetgroupCommand(netgroup.substring(1)));
  } catch (ExitCodeException e) {
   // if we didn't get the group - just return empty list;
   LOG.warn("error getting users for netgroup " + netgroup, e);
  }
  return result;
 }

Code example source: org.apache.hadoop/hadoop-common

/**
 * Returns the target of the given symlink. Returns the empty string if
 * the given path does not refer to a symlink or there is an error
 * accessing the symlink.
 * @param f File representing the symbolic link.
 * @return The target of the symbolic link, empty string on error or if not
 *         a symlink.
 */
public static String readLink(File f) {
 /* NB: Use readSymbolicLink in java.nio.file.Path once available. Could
  * use getCanonicalPath in File to get the target of the symlink but that
  * does not indicate if the given path refers to a symlink.
  */
 if (f == null) {
  LOG.warn("Can not read a null symLink");
  return "";
 }
 try {
  return Shell.execCommand(
    Shell.getReadlinkCommand(f.toString())).trim();
 } catch (IOException x) {
  return "";
 }
}
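The comment inside readLink anticipates java.nio.file, which has long since gained this capability. Below is a minimal modern sketch (the class name ReadLinkSketch is hypothetical) that keeps the same empty-string-on-error contract as the snippet above:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class ReadLinkSketch {
  /**
   * Returns the target of the given symlink, or "" if the path is
   * not a symlink or cannot be read -- mirroring the contract of
   * the readLink(File) example above, without shelling out.
   */
  public static String readLink(Path p) {
    try {
      if (p == null || !Files.isSymbolicLink(p)) {
        return "";
      }
      return Files.readSymbolicLink(p).toString();
    } catch (IOException e) {
      return ""; // empty string on error, as in the original
    }
  }
}
```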

Code example source: org.apache.hadoop/hadoop-common

@Override
protected void stashOriginalFilePermissions() throws IOException {
 // save off permissions in case we need to
 // rewrite the keystore in flush()
 if (!Shell.WINDOWS) {
  Path path = Paths.get(file.getCanonicalPath());
  permissions = Files.getPosixFilePermissions(path);
 } else {
  // On Windows, the JDK does not support the POSIX file permission APIs.
  // Instead, we can do a winutils call and translate.
  String[] cmd = Shell.getGetPermissionCommand();
  String[] args = new String[cmd.length + 1];
  System.arraycopy(cmd, 0, args, 0, cmd.length);
  args[cmd.length] = file.getCanonicalPath();
  String out = Shell.execCommand(args);
  StringTokenizer t = new StringTokenizer(out, Shell.TOKEN_SEPARATOR_REGEX);
  // The winutils output consists of 10 characters because of the leading
  // directory indicator, i.e. "drwx------".  The JDK parsing method expects
  // a 9-character string, so remove the leading character.
  String permString = t.nextToken().substring(1);
  permissions = PosixFilePermissions.fromString(permString);
 }
}
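The 10-to-9-character conversion that the comment above describes can be checked in isolation. A small sketch (the class name WinPermSketch is hypothetical):

```java
import java.nio.file.attribute.PosixFilePermission;
import java.nio.file.attribute.PosixFilePermissions;
import java.util.Set;

public class WinPermSketch {
  /**
   * winutils (like `ls -l`) prints a 10-character mode string such as
   * "drwx------"; the leading character is the file-type indicator.
   * Dropping it yields the 9-character "rwxrwxrwx"-style form that
   * PosixFilePermissions.fromString expects.
   */
  public static Set<PosixFilePermission> parseLsMode(String tenChars) {
    return PosixFilePermissions.fromString(tenChars.substring(1));
  }
}
```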

Code example source: org.apache.hadoop/hadoop-common

// Fragment: periodically renew the Kerberos ticket with `kinit -R`.
Thread.sleep(nextRefresh - now);
String output = Shell.execCommand(kinitCmd, "-R");
if (LOG.isDebugEnabled()) {
  LOG.debug("Renewed ticket. kinit output: {}", output);
}

Code example source: apache/hive

// Fragment: read kernel configs once (or on refresh) via sysctl.
if (sysctlOutRef.get() == null || refresh) {
  LOG.info("Reading kernel configs via sysctl..");
  String sysctlOutput = Shell.execCommand(sysctlCmd.split("\\s+"));
  sysctlOutRef.set(sysctlOutput);
}

Code example source: org.apache.hadoop/hadoop-common

/**
 * Use the command chmod to set permission.
 */
@Override
public void setPermission(Path p, FsPermission permission)
 throws IOException {
 if (NativeIO.isAvailable()) {
  NativeIO.POSIX.chmod(pathToFile(p).getCanonicalPath(),
          permission.toShort());
 } else {
  String perm = String.format("%04o", permission.toShort());
  Shell.execCommand(Shell.getSetPermissionCommand(perm, false,
   FileUtil.makeShellPath(pathToFile(p), true)));
 }
}
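The %04o formatting used in the fallback branch above can be verified standalone. A small sketch with a hypothetical helper name:

```java
public class PermFormatSketch {
  /**
   * Formats a permission short the way the snippet above hands it to
   * chmod: as a four-digit, zero-padded octal string. For example,
   * the short for rw-r--r-- (octal 644, decimal 420) becomes "0644".
   */
  public static String toOctal(int permShort) {
    return String.format("%04o", permShort);
  }
}
```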

Code example source: org.apache.hadoop/hadoop-mapred-test

protected static String[] getFilePermissionAttrs(String path)
  throws IOException {
 String[] command = {"bash",PERMISSION_SCRIPT_FILE.getAbsolutePath(), path};
 String output=Shell.execCommand(command);
 return output.split(":|\n");
}
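The split regex ":|\n" used above treats both a colon and a newline as delimiters. A minimal sketch (the class name SplitSketch is hypothetical) demonstrates the behavior:

```java
public class SplitSketch {
  /**
   * Splits script output on either ':' or '\n', as the
   * getFilePermissionAttrs example above does.
   */
  public static String[] splitAttrs(String output) {
    return output.split(":|\n");
  }
}
```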

Code example source: org.apache.hama/hama-core

protected static String[] executeShellCommand(String[] command)
  throws IOException {
 String groups = Shell.execCommand(command);
 StringTokenizer tokenizer = new StringTokenizer(groups);
 int numOfTokens = tokenizer.countTokens();
 String[] tokens = new String[numOfTokens];
 for (int i = 0; tokenizer.hasMoreTokens(); i++) {
  tokens[i] = tokenizer.nextToken();
 }
 return tokens;
}
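The StringTokenizer loop above can be written more concisely with String.split. A sketch (hypothetical class name TokenSketch), under the assumption that tokens are separated by arbitrary runs of whitespace, exactly as StringTokenizer's default delimiters behave:

```java
public class TokenSketch {
  /**
   * Equivalent of the StringTokenizer loop: trim the input and
   * split on runs of whitespace, returning an empty array for
   * blank input (where split alone would return [""]).
   */
  public static String[] tokens(String s) {
    String trimmed = s.trim();
    return trimmed.isEmpty() ? new String[0] : trimmed.split("\\s+");
  }
}
```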

Code example source: com.github.jiayuhan-it/hadoop-common

static List<String> getGroups() throws IOException {
 List<String> a = new ArrayList<String>();
 String s = Shell.execCommand(Shell.getGroupsCommand());
 for(StringTokenizer t = new StringTokenizer(s); t.hasMoreTokens(); ) {
  a.add(t.nextToken());
 }
 return a;
}
