TensorFlow One-Hot Encoder?

  • 3 Answer(s)

          As of TensorFlow 0.8, there is a native one-hot op, tf.one_hot, that can convert a set of sparse labels to a dense one-hot representation.

          This complements tf.nn.sparse_softmax_cross_entropy_with_logits, which in some cases lets you compute the cross entropy directly on the sparse labels instead of converting them to one-hot first.
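For intuition, the equivalence between the two routes can be sketched in plain NumPy (this is an illustrative sketch, not TensorFlow's implementation): cross entropy on a sparse integer label picks out exactly the log-probability that the one-hot dot product selects.

```python
import numpy as np

logits = np.array([[2.0, 0.5, -1.0],
                   [0.1, 1.5, 0.3]])
labels = np.array([0, 1])          # sparse integer labels

# softmax over the last axis (shift by max for numerical stability)
exp = np.exp(logits - logits.max(axis=1, keepdims=True))
probs = exp / exp.sum(axis=1, keepdims=True)

# "sparse" cross entropy: index the true-class probability directly
loss_sparse = -np.log(probs[np.arange(len(labels)), labels])

# "dense" cross entropy: dot each one-hot row with the log-probabilities
one_hot = np.eye(logits.shape[1])[labels]
loss_dense = -(one_hot * np.log(probs)).sum(axis=1)

assert np.allclose(loss_sparse, loss_dense)
```

Both losses are identical, which is why the sparse op can skip materializing the one-hot matrix entirely.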

    If you want to do it the old way, using sparse-to-dense operators:

    num_labels = 10
    # label_batch is a 1-D tensor of integer labels, 0 <= label < num_labels
    sparse_labels = tf.reshape(label_batch, [-1, 1])
    derived_size = tf.shape(label_batch)[0]
    # pair each label with its row index: [[0, l0], [1, l1], ...]
    indices = tf.reshape(tf.range(0, derived_size, 1), [-1, 1])
    concated = tf.concat(1, [indices, sparse_labels])  # TF 0.x argument order
    outshape = tf.pack([derived_size, num_labels])     # tf.pack was renamed tf.stack in TF 1.x
    labels = tf.sparse_to_dense(concated, outshape, 1.0, 0.0)

    The resulting labels tensor is a one-hot matrix of shape batch_size x num_labels.
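The same matrix can be built in one line of plain NumPy (a sketch of the result, assuming integer labels in [0, num_labels)): row i of the identity matrix is the one-hot vector for class i.

```python
import numpy as np

num_labels = 10
label_batch = np.array([3, 0, 7])  # example batch of integer labels

# fancy-index the identity matrix: one one-hot row per label
labels = np.eye(num_labels)[label_batch]

print(labels.shape)  # (3, 10)
```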

    Answered on December 15, 2018.

    In TensorFlow, tf.one_hot() is easy to use.

    Here, depth=4 and indices=[0, 3]:

    import tensorflow as tf
    res = tf.one_hot(indices=[0, 3], depth=4)
    with tf.Session() as sess:
        print(sess.run(res))

    Note that if you pass an index of -1, the corresponding one-hot vector will be all zeros.
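That -1 behavior can be mimicked in NumPy by comparing each index against the class range (a sketch only; tf.one_hot does this natively): -1 matches none of 0..depth-1, so its row comes out all zeros.

```python
import numpy as np

indices = np.array([0, 3, -1])
depth = 4

# broadcasted equality test against 0..depth-1
one_hot = (indices[:, None] == np.arange(depth)).astype(np.float32)

print(one_hot)
# [[1. 0. 0. 0.]
#  [0. 0. 0. 1.]
#  [0. 0. 0. 0.]]
```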

    Answered on December 15, 2018.

    A simple way to one-hot encode is:

    import numpy as np
    import tensorflow as tf

    a = 5
    b = [1, 2, 3]
    # one-hot encode a single integer
    one_hot_a = tf.nn.embedding_lookup(np.identity(10), a)
    # one-hot encode a list of integers
    one_hot_b = tf.nn.embedding_lookup(np.identity(max(b) + 1), b)
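The identity-matrix trick works in plain NumPy too, without TensorFlow (a sketch of what the embedding lookup returns): indexing np.identity(n) with an integer or a list pulls out the corresponding one-hot row(s).

```python
import numpy as np

a = 5
b = [1, 2, 3]

# indexing the identity matrix with an integer returns its one-hot row
one_hot_a = np.identity(10)[a]
# indexing with a list returns one one-hot row per element
one_hot_b = np.identity(max(b) + 1)[b]

print(one_hot_a)  # [0. 0. 0. 0. 0. 1. 0. 0. 0. 0.]
```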
    Answered on December 15, 2018.
