Usage of the org.apache.hadoop.util.Shell class, with code examples

x33g5p2x · reposted 2022-01-30 under: Other

This article collects a number of Java code examples for org.apache.hadoop.util.Shell and shows concrete usage of the class. The examples were extracted from selected projects on GitHub/Stack Overflow/Maven, so they carry reasonable reference value. Details of the Shell class:
Package: org.apache.hadoop.util.Shell
Class name: Shell

About Shell

A copy of the Hadoop Shell class with a fix for HADOOP-10622. A base class for running a Unix command. Shell can be used to run Unix commands like du or df. It also offers facilities to gate commands by time intervals.
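As a rough illustration of what Shell wraps, running a command and capturing its output comes down to java.lang.ProcessBuilder. The class and method below are a minimal sketch invented for this article, not the Hadoop implementation:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class MiniShell {
    // Run a command and capture its stdout, similar in spirit to
    // Shell.execCommand (a simplified sketch, not the Hadoop code).
    public static String execCommand(String... cmd)
            throws IOException, InterruptedException {
        Process p = new ProcessBuilder(cmd).redirectErrorStream(true).start();
        StringBuilder out = new StringBuilder();
        try (BufferedReader r = new BufferedReader(
                new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                out.append(line).append('\n');
            }
        }
        p.waitFor();
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        System.out.print(execCommand("echo", "hello"));
    }
}
```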

Code examples

Code example source: org.apache.hadoop/hadoop-common

/**
 * Static method to execute a shell command.
 * Covers most of the simple cases without requiring the user to implement
 * the <code>Shell</code> interface.
 * @param cmd shell command to execute.
 * @return the output of the executed command.
 */
public static String execCommand(String ... cmd) throws IOException {
 return execCommand(null, cmd, 0L);
}

Code example source: org.apache.hadoop/hadoop-common

/**
 * Returns a File referencing a script with the given basename, inside the
 * given parent directory.  The file extension is inferred by platform:
 * <code>".cmd"</code> on Windows, or <code>".sh"</code> otherwise.
 *
 * @param parent File parent directory
 * @param basename String script file basename
 * @return File referencing the script in the directory
 */
public static File appendScriptExtension(File parent, String basename) {
 return new File(parent, appendScriptExtension(basename));
}
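The platform inference is simple enough to sketch stand-alone; the helper below is a hypothetical reconstruction of the single-argument overload, not the actual Shell code:

```java
public class ScriptExt {
    // Infer whether we are on Windows from the os.name system property.
    public static boolean isWindows() {
        return System.getProperty("os.name").startsWith("Windows");
    }

    // Append ".cmd" on Windows, ".sh" otherwise (hypothetical stand-in
    // for the single-argument Shell.appendScriptExtension overload).
    public static String appendScriptExtension(String basename) {
        return basename + (isWindows() ? ".cmd" : ".sh");
    }
}
```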

Code example source: org.apache.hadoop/hadoop-common

// Excerpt from Shell#runCommand(): launch the process, then parse its
// output and join the stderr-draining thread.
ProcessBuilder builder = new ProcessBuilder(getExecString());
Timer timeOutTimer = null;
ShellTimeoutTimerTask timeoutTimerTask = null;
// ...
parseExecResult(inReader); // parse the output
joinThread(errThread);
completed.set(true);

Code example source: org.apache.hadoop/hadoop-common

/**
 * A command to get a given user's groups list.
 * If the OS is not WINDOWS, the command will get the user's primary group
 * first and finally get the groups list which includes the primary group.
 * i.e. the user's primary group will be included twice.
 */
public static String[] getGroupsForUserCommand(final String user) {
 //'groups username' command return is inconsistent across different unixes
 if (WINDOWS) {
  return new String[]
    {getWinUtilsPath(), "groups", "-F", "\"" + user + "\""};
 } else {
  String quotedUser = bashQuote(user);
  return new String[] {"bash", "-c", "id -gn " + quotedUser +
             "; id -Gn " + quotedUser};
 }
}
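The bashQuote call above protects the user name from shell interpretation. The sketch below shows the standard single-quote escaping technique (wrap the argument in single quotes and rewrite each embedded quote as '\''); Hadoop's private helper may differ in detail:

```java
public class BashQuote {
    // Wrap arg in single quotes so bash treats it literally, rewriting
    // each embedded single quote as '\'' (close quote, escaped quote,
    // reopen quote). A sketch of the usual technique, not Hadoop's code.
    public static String bashQuote(String arg) {
        StringBuilder buffer = new StringBuilder(arg.length() + 2);
        buffer.append('\'')
              .append(arg.replace("'", "'\\''"))
              .append('\'');
        return buffer.toString();
    }
}
```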

Code example source: org.apache.hadoop/hadoop-common

/**
  * Calls shell to get users for a netgroup by calling getent
  * netgroup; this is a low-level function that just returns the
  * string that getent produces.
  *
  * @param netgroup get users for this netgroup
  * @return string of users for a given netgroup in getent netgroups format
  */
protected String execShellGetUserForNetgroup(final String netgroup)
    throws IOException {
  String result = "";
  try {
    // shell command does not expect '@' at the beginning of the group name
    result = Shell.execCommand(
      Shell.getUsersForNetgroupCommand(netgroup.substring(1)));
  } catch (ExitCodeException e) {
    // if we didn't get the group - just return empty list;
    LOG.warn("error getting users for netgroup " + netgroup, e);
  }
  return result;
}

Code example source: org.apache.hadoop/hadoop-common

/**
 * Returns the target of the given symlink. Returns the empty string if
 * the given path does not refer to a symlink or there is an error
 * accessing the symlink.
 * @param f File representing the symbolic link.
 * @return The target of the symbolic link, empty string on error or if not
 *         a symlink.
 */
public static String readLink(File f) {
 /* NB: Use readSymbolicLink in java.nio.file.Path once available. Could
  * use getCanonicalPath in File to get the target of the symlink but that
  * does not indicate if the given path refers to a symlink.
  */
 if (f == null) {
  LOG.warn("Can not read a null symLink");
  return "";
 }
 try {
  return Shell.execCommand(
    Shell.getReadlinkCommand(f.toString())).trim();
 } catch (IOException x) {
  return "";
 }
}
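The NB comment anticipates java.nio; a sketch of that alternative, assuming the same empty-string-on-error contract, could look like this:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class ReadLinkNio {
    // Files.readSymbolicLink reports the link target directly and throws
    // NotLinkException (an IOException) when the path is not a symlink,
    // so no subprocess is needed. Hypothetical replacement, not Hadoop code.
    public static String readLink(Path p) {
        try {
            return Files.readSymbolicLink(p).toString();
        } catch (IOException | UnsupportedOperationException e) {
            return ""; // keep the empty-string-on-error contract
        }
    }
}
```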

Code example source: org.apache.hadoop/hadoop-common

/**
 * Use the command chmod to set permission.
 */
@Override
public void setPermission(Path p, FsPermission permission)
 throws IOException {
 if (NativeIO.isAvailable()) {
  NativeIO.POSIX.chmod(pathToFile(p).getCanonicalPath(),
          permission.toShort());
 } else {
  String perm = String.format("%04o", permission.toShort());
  Shell.execCommand(Shell.getSetPermissionCommand(perm, false,
   FileUtil.makeShellPath(pathToFile(p), true)));
 }
}
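In the non-native branch, the permission bits are rendered as a four-digit octal string for chmod. That formatting step in isolation (the sample value is illustrative):

```java
public class OctalPerm {
    public static void main(String[] args) {
        // rw-r--r-- as a short, like FsPermission#toShort() would return.
        short bits = 0644;
        // %04o renders the bits as a zero-padded four-digit octal string.
        String perm = String.format("%04o", bits);
        System.out.println(perm); // prints "0644"
        // The argv then becomes: chmod 0644 <shell-escaped path>
    }
}
```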

Code example source: org.apache.hadoop/hadoop-common

@Override
protected void stashOriginalFilePermissions() throws IOException {
 // save off permissions in case we need to
 // rewrite the keystore in flush()
 if (!Shell.WINDOWS) {
  Path path = Paths.get(file.getCanonicalPath());
  permissions = Files.getPosixFilePermissions(path);
 } else {
  // On Windows, the JDK does not support the POSIX file permission APIs.
  // Instead, we can do a winutils call and translate.
  String[] cmd = Shell.getGetPermissionCommand();
  String[] args = new String[cmd.length + 1];
  System.arraycopy(cmd, 0, args, 0, cmd.length);
  args[cmd.length] = file.getCanonicalPath();
  String out = Shell.execCommand(args);
  StringTokenizer t = new StringTokenizer(out, Shell.TOKEN_SEPARATOR_REGEX);
  // The winutils output consists of 10 characters because of the leading
  // directory indicator, i.e. "drwx------".  The JDK parsing method expects
  // a 9-character string, so remove the leading character.
  String permString = t.nextToken().substring(1);
  permissions = PosixFilePermissions.fromString(permString);
 }
}
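The parsing step at the end can be tried stand-alone; the winutils output below is a made-up sample in the 10-character format the comment describes:

```java
import java.nio.file.attribute.PosixFilePermission;
import java.nio.file.attribute.PosixFilePermissions;
import java.util.Set;

public class PermParse {
    public static void main(String[] args) {
        // Hypothetical winutils-style output: leading type indicator
        // plus nine permission characters.
        String winutilsOut = "drwx------";
        // Drop the leading 'd' so the JDK's 9-character parser accepts it.
        String permString = winutilsOut.substring(1);
        Set<PosixFilePermission> perms =
            PosixFilePermissions.fromString(permString);
        System.out.println(perms.contains(PosixFilePermission.OWNER_EXECUTE));
    }
}
```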

Code example source: org.apache.hadoop/hadoop-common

/** Return a command to read the target of a symbolic link. */
public static String[] getReadlinkCommand(String link) {
 return WINDOWS ?
   new String[] { getWinUtilsPath(), "readlink", link }
   : new String[] { "readlink", link };
}

Code example source: org.apache.hadoop/hadoop-common

/**
 * Return a command to set permission for specific file.
 *
 * @param perm String permission to set
 * @param recursive boolean true to apply to all sub-directories recursively
 * @param file String file to set
 * @return String[] containing command and arguments
 */
public static String[] getSetPermissionCommand(String perm,
                        boolean recursive,
                        String file) {
 String[] baseCmd = getSetPermissionCommand(perm, recursive);
 String[] cmdWithFile = Arrays.copyOf(baseCmd, baseCmd.length + 1);
 cmdWithFile[cmdWithFile.length - 1] = file;
 return cmdWithFile;
}

Code example source: com.github.jiayuhan-it/hadoop-common

static List<String> getGroups() throws IOException {
 List<String> a = new ArrayList<String>();
 String s = Shell.execCommand(Shell.getGroupsCommand());
 for(StringTokenizer t = new StringTokenizer(s); t.hasMoreTokens(); ) {
  a.add(t.nextToken());
 }
 return a;
}
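The whitespace tokenizing used here is easy to exercise on canned output (the group names below are made up):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.StringTokenizer;

public class GroupsParse {
    // Split whitespace-separated command output into a list, as getGroups
    // does with the output of the groups command.
    public static List<String> parse(String s) {
        List<String> a = new ArrayList<>();
        for (StringTokenizer t = new StringTokenizer(s); t.hasMoreTokens(); ) {
            a.add(t.nextToken());
        }
        return a;
    }
}
```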

Code example source: org.apache.hadoop/hadoop-common

/** Check to see if a command needs to be executed and execute if needed. */
protected void run() throws IOException {
 if (lastTime + interval > Time.monotonicNow()) {
  return;
 }
 exitCode = 0; // reset for next run
 if (Shell.MAC) {
  System.setProperty("jdk.lang.Process.launchMechanism", "POSIX_SPAWN");
 }
 runCommand();
}
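The lastTime/interval check is the time-interval gating mentioned in the class description. A self-contained sketch of the same pattern, using System.nanoTime() as a stand-in for Hadoop's Time.monotonicNow():

```java
public class IntervalGate {
    private final long intervalMs;
    private long lastTime = Long.MIN_VALUE / 2; // "never ran" sentinel

    public IntervalGate(long intervalMs) {
        this.intervalMs = intervalMs;
    }

    // Monotonic milliseconds, like Time.monotonicNow(); never affected
    // by wall-clock adjustments.
    static long monotonicNow() {
        return System.nanoTime() / 1_000_000L;
    }

    // Returns true and records the run time if the interval has elapsed,
    // false (gated) otherwise.
    public boolean tryRun() {
        long now = monotonicNow();
        if (lastTime + intervalMs > now) {
            return false; // still inside the interval: skip
        }
        lastTime = now;
        return true;
    }
}
```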

Code example source: org.apache.hadoop/hadoop-common

/**
 * Static method to destroy all running <code>Shell</code> processes.
 * Iterates through a map of all currently running <code>Shell</code>
 * processes and destroys them one by one. This method is thread safe
 */
public static void destroyAllShellProcesses() {
 synchronized (CHILD_SHELLS) {
  for (Shell shell : CHILD_SHELLS.keySet()) {
   if (shell.getProcess() != null) {
    shell.getProcess().destroy();
   }
  }
  CHILD_SHELLS.clear();
 }
}
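The synchronize-iterate-destroy pattern can be sketched with a plain process registry; this is a simplified stand-in (Hadoop keys CHILD_SHELLS by Shell instance, not by Process):

```java
import java.util.Collections;
import java.util.Map;
import java.util.WeakHashMap;

public class ProcessRegistry {
    // Weak keys let finished processes fall out of the registry; external
    // synchronization on the map is required while iterating, which is
    // exactly what destroyAll does (like destroyAllShellProcesses above).
    private static final Map<Process, Object> CHILDREN =
        Collections.synchronizedMap(new WeakHashMap<>());

    public static void register(Process p) {
        CHILDREN.put(p, null);
    }

    public static void destroyAll() {
        synchronized (CHILDREN) {
            for (Process p : CHILDREN.keySet()) {
                p.destroy();
            }
            CHILDREN.clear();
        }
    }
}
```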

Code example source: org.apache.hadoop/hadoop-common

/**
 * Returns a command to run the given script.  The script interpreter is
 * inferred by platform: cmd on Windows or bash otherwise.
 *
 * @param script File script to run
 * @return String[] command to run the script
 */
public static String[] getRunScriptCommand(File script) {
 String absolutePath = script.getAbsolutePath();
 return WINDOWS ?
  new String[] {"cmd", "/c", absolutePath }
  : new String[] {"bash", bashQuote(absolutePath) };
}

Code example source: com.facebook.hadoop/hadoop-core

// Excerpt from Shell#runCommand() in this fork: build the command string
// for logging, refuse to run while the shell is disabled, then launch
// the process and parse its output.
String command = "";
for (String s : this.getExecString()) {
  command += s + " ";
}
if (isDisabled()) {
  LOG.error("Trying to execute a shell call while it is disabled.");
  parseExecResult(new BufferedReader(
      new StringReader(NO_SHELL_CALL_ALLOWED_STRING)));
  return;
}
ProcessBuilder builder = new ProcessBuilder(getExecString());
Timer timeOutTimer = null;
ShellTimeoutTimerTask timeoutTimerTask = null;
// ...
parseExecResult(inReader); // parse the output
Code example source: org.jvnet.hudson.hadoop/hadoop-core

// Excerpt from Shell#runCommand() in this fork: launch the process and
// parse its output.
ProcessBuilder builder = new ProcessBuilder(getExecString());
boolean completed = false;
// ...
parseExecResult(inReader); // parse the output

Code example source: org.apache.hadoop/hadoop-yarn-server-nodemanager

private void destroyShellProcesses(Set<Shell> shells) {
  for (Shell shell : shells) {
    if (localizingThreads.contains(shell.getWaitingThread())) {
      shell.getProcess().destroy();
    }
  }
}


Code example source: org.apache.hadoop/hadoop-common

/**
 * A command to get a given user's group id list.
 * The command will get the user's primary group
 * first and finally get the groups list which includes the primary group.
 * i.e. the user's primary group will be included twice.
 * This command does not support Windows and will only return group names.
 */
public static String[] getGroupsIDForUserCommand(final String user) {
 //'groups username' command return is inconsistent across different unixes
 if (WINDOWS) {
  return new String[]{getWinUtilsPath(), "groups", "-F", "\"" + user +
             "\""};
 } else {
  String quotedUser = bashQuote(user);
  return new String[] {"bash", "-c", "id -g " + quotedUser + "; id -G " +
             quotedUser};
 }
}

Code example source: org.apache.hadoop/hadoop-common

/** Return a command to set permission. */
public static String[] getSetPermissionCommand(String perm, boolean recursive) {
 if (recursive) {
  return (WINDOWS) ?
    new String[] { getWinUtilsPath(), "chmod", "-R", perm }
    : new String[] { "chmod", "-R", perm };
 } else {
  return (WINDOWS) ?
    new String[] { getWinUtilsPath(), "chmod", perm }
    : new String[] { "chmod", perm };
 }
}
