StringUtils.hexStringToByte

How to use the hexStringToByte method in org.apache.hadoop.util.StringUtils

Best Java code snippets using org.apache.hadoop.util.StringUtils.hexStringToByte (Showing top 13 results out of 315)

origin: org.apache.hadoop/hadoop-hdfs-httpfs

@Override
public byte[] getBytes() {
 return StringUtils.hexStringToByte((String) json.get(CHECKSUM_BYTES_JSON));
}
origin: org.apache.hadoop/hadoop-common-test

/**
 * Convert a string of lines that look like:
 *   "68 72 70 63 02 00 00 00  82 00 1d 6f 72 67 2e 61 hrpc.... ...org.a"
 * .. into an array of bytes.
 */
private static byte[] hexDumpToBytes(String hexdump) {
 final int LAST_HEX_COL = 3 * 16;
 
 StringBuilder hexString = new StringBuilder();
 
 for (String line : hexdump.toUpperCase().split("\n")) {
  hexString.append(line.substring(0, LAST_HEX_COL).replace(" ", ""));
 }
 return StringUtils.hexStringToByte(hexString.toString());
}
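The column arithmetic in `hexDumpToBytes` is worth unpacking: `LAST_HEX_COL = 3 * 16` keeps the first 48 characters of each line (16 byte columns of "XX " each, including the extra space after the eighth byte) and drops the trailing ASCII gutter. A hedged, stdlib-only sketch of the same parsing for a single line, with the invented names `HexDumpSketch` and `lineToBytes` standing in for the test utility:

```java
public class HexDumpSketch {
    static final int LAST_HEX_COL = 3 * 16; // 16 byte columns of "XX " each

    // Parse one hex-dump line, ignoring the trailing ASCII gutter.
    // Integer.parseInt with radix 16 accepts both cases, so no upper-casing
    // is needed in this sketch.
    static byte[] lineToBytes(String line) {
        String hex = line.substring(0, LAST_HEX_COL).replace(" ", "");
        byte[] out = new byte[hex.length() / 2];
        for (int i = 0; i < out.length; i++) {
            out[i] = (byte) Integer.parseInt(hex.substring(2 * i, 2 * i + 2), 16);
        }
        return out;
    }

    public static void main(String[] args) {
        String line = "68 72 70 63 02 00 00 00  82 00 1d 6f 72 67 2e 61 hrpc.... ...org.a";
        byte[] bytes = lineToBytes(line);
        System.out.println(bytes.length);    // 16
        System.out.println((char) bytes[0]); // h
    }
}
```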

origin: com.backtype/dfs-datastores

public static Object getObject(JobConf conf, String key) {
  String s = conf.get(key);
  if (s == null) return null;
  byte[] val = StringUtils.hexStringToByte(s);
  return deserialize(val);
}
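The snippet above covers only the read side; the matching write side (not shown here) presumably serializes an object and hex-encodes the bytes into the conf. Below is a hedged, stdlib-only sketch of that round trip. The class name `ConfObjectSketch` is invented, a plain `Map` stands in for `JobConf`, Java serialization stands in for the library's `serialize`/`deserialize`, and the hex helpers imitate (but are not) the Hadoop `StringUtils` methods:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.util.HashMap;
import java.util.Map;

public class ConfObjectSketch {

    // Hex encode, imitating StringUtils.byteToHexString (lowercase, two chars per byte).
    static String byteToHexString(byte[] bytes) {
        StringBuilder sb = new StringBuilder(bytes.length * 2);
        for (byte b : bytes) sb.append(String.format("%02x", b));
        return sb.toString();
    }

    // Hex decode, imitating StringUtils.hexStringToByte.
    static byte[] hexStringToByte(String hex) {
        byte[] bytes = new byte[hex.length() / 2];
        for (int i = 0; i < bytes.length; i++) {
            bytes[i] = (byte) Integer.parseInt(hex.substring(2 * i, 2 * i + 2), 16);
        }
        return bytes;
    }

    // Java serialization stands in for the library's serialize().
    static byte[] serialize(Object o) {
        try {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
                oos.writeObject(o);
            }
            return bos.toByteArray();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    // Java deserialization stands in for the library's deserialize().
    static Object deserialize(byte[] val) {
        try (ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(val))) {
            return ois.readObject();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        Map<String, String> conf = new HashMap<>(); // stand-in for JobConf
        conf.put("my.key", byteToHexString(serialize("payload")));

        String s = conf.get("my.key");
        Object restored = (s == null) ? null : deserialize(hexStringToByte(s));
        System.out.println(restored); // payload
    }
}
```

The hex step matters because conf values must be strings; raw serialized bytes are not generally safe to store in a text-valued configuration.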
origin: cerndb/hdfs-metadata

/**
 * Returns a 0-based disk id index derived from the HDFS VolumeId object.
 * There is currently no public API to get at the volume id, so we have to
 * read it out of the object's internals.
 */
public static int getDiskId(VolumeId hdfsVolumeId){
  // Initialize the diskId as -1 to indicate it is unknown
  int diskId = -1;
  if (hdfsVolumeId != null) {
    String volumeIdString = hdfsVolumeId.toString();
    byte[] volumeIdBytes = StringUtils.hexStringToByte(volumeIdString);
    // Guard the null case before any length check; the original fell through
    // to volumeIdBytes.length in the else-if and could throw a NullPointerException.
    if (volumeIdBytes != null) {
      if (volumeIdBytes.length == 4) {
        diskId = Utils.toInt(volumeIdBytes);
      } else if (volumeIdBytes.length == 1) {
        diskId = (int) volumeIdBytes[0];  // support hadoop-2.0.2
      }
    }
  }
  return diskId;
}
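`Utils.toInt` in the snippet above is a cerndb-project helper whose source is not shown; its apparent job is packing the 4-byte volume id into an `int`. A hedged stdlib-only sketch of that conversion using `java.nio.ByteBuffer` (the class name `VolumeIdIntSketch` and the big-endian byte order are assumptions, not the project's actual code):

```java
import java.nio.ByteBuffer;

public class VolumeIdIntSketch {
    // Interpret a 4-byte array as a big-endian int, the role Utils.toInt
    // appears to play in getDiskId above. ByteBuffer defaults to big-endian.
    static int toInt(byte[] bytes) {
        return ByteBuffer.wrap(bytes).getInt();
    }

    public static void main(String[] args) {
        System.out.println(toInt(new byte[] {0, 0, 0, 7}));   // 7
        System.out.println(toInt(new byte[] {0, 0, 1, 0}));   // 256
    }
}
```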
origin: opendedup/sdfs

// Fragment: st presumably holds the fields of one persisted record line,
// and r is the backing random-access file being repositioned for the read.
pos = Long.parseLong(st[0]);
int cap = Integer.parseInt(st[1]);
byte[] hash = StringUtils.hexStringToByte(st[2]);
byte[] b = new byte[cap];
r.seek(pos);
origin: ch.cern.hadoop/hadoop-hdfs

final String algorithm = (String)m.get("algorithm");
final int length = ((Number) m.get("length")).intValue();
final byte[] bytes = StringUtils.hexStringToByte((String)m.get("bytes"));
origin: org.apache.hadoop/hadoop-hdfs-test

byte[] imageBytes = StringUtils.hexStringToByte(
 "fffffffee17c0d2700000000");
FileOutputStream fos = new FileOutputStream(imageFile);
origin: ch.cern.hadoop/hadoop-hdfs

/**
 * Test case for an empty edit log from a prior version of Hadoop.
 */
@Test
public void testPreTxIdEditLogNoEdits() throws Exception {
 FSNamesystem namesys = Mockito.mock(FSNamesystem.class);
 namesys.dir = Mockito.mock(FSDirectory.class);
 long numEdits = testLoad(
   StringUtils.hexStringToByte("ffffffed"), // just version number
   namesys);
 assertEquals(0, numEdits);
}

org.apache.hadoop.util.StringUtils.hexStringToByte

Javadoc

Given a hex string, this returns the byte array corresponding to the string, decoding two hex characters per output byte.
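For readers without hadoop-common on the classpath, here is a hedged, stdlib-only sketch of the same behavior. The class name `HexDecodeSketch` is invented, and the helper mirrors the contract of `StringUtils.hexStringToByte` rather than reproducing the Hadoop source:

```java
public class HexDecodeSketch {
    // Decode a hex string such as "1a2b" into bytes, two characters per byte,
    // mirroring org.apache.hadoop.util.StringUtils.hexStringToByte.
    static byte[] hexStringToByte(String hex) {
        byte[] bytes = new byte[hex.length() / 2];
        for (int i = 0; i < bytes.length; i++) {
            bytes[i] = (byte) Integer.parseInt(hex.substring(2 * i, 2 * i + 2), 16);
        }
        return bytes;
    }

    public static void main(String[] args) {
        byte[] decoded = hexStringToByte("68727063");
        // 0x68 0x72 0x70 0x63 is ASCII "hrpc", the Hadoop RPC preamble seen
        // in the hex-dump snippets above.
        System.out.println(new String(decoded, java.nio.charset.StandardCharsets.US_ASCII));
    }
}
```

Note that odd-length or non-hex input will throw (`StringIndexOutOfBoundsException` or `NumberFormatException` respectively); callers are expected to pass well-formed even-length hex.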

Popular methods of StringUtils

  • stringifyException
    Make a string representation of the exception.
  • join
    Concatenates strings, using a separator.
  • split
  • arrayToString
  • toLowerCase
    Converts all of the characters in this String to lower case with Locale.ENGLISH.
  • escapeString
  • startupShutdownMessage
    Print a log message for starting up and shutting down
  • getStrings
    Returns an arraylist of strings.
  • toUpperCase
    Converts all of the characters in this String to upper case with Locale.ENGLISH.
  • byteToHexString
    Given an array of bytes it will convert the bytes to a hex string representation of the bytes
  • formatTime
    Given the time in long milliseconds, returns a String in the format Xhrs, Ymins, Z sec.
  • unEscapeString
  • getStringCollection
  • byteDesc
  • formatPercent
  • getTrimmedStrings
  • equalsIgnoreCase
  • format
  • formatTimeDiff
  • getTrimmedStringCollection
