# TensorFlow: How to replace or modify a gradient?

Asked on December 19, 2018

A simple way to do this is with `tf.RegisterGradient` combined with a `py_func` wrapper.

The snippet below implements clipping of the backpropagated gradient around `tf.matmul`:

```python
import tensorflow as tf
import numpy as np

# from https://gist.github.com/harpone/3453185b41d8d985356cbe5e57d67342
def py_func(func, inp, Tout, stateful=True, name=None, grad=None):

    # Need to generate a unique name to avoid duplicates:
    rnd_name = 'PyFuncGrad' + str(np.random.randint(0, 1E+8))

    # Register the custom gradient under the unique name, then map the
    # PyFunc op onto it for the duration of this graph context.
    tf.RegisterGradient(rnd_name)(grad)
    g = tf.get_default_graph()
    with g.gradient_override_map({"PyFunc": rnd_name}):
        return tf.py_func(func, inp, Tout, stateful=stateful, name=name)

def clip_grad(x, clip_value, name=None):
    """
    Identity in the forward pass; scales the backpropagated gradient so that
    its L2 norm is no more than `clip_value`.
    """
    with tf.name_scope(name, "ClipGrad", [x]) as name:
        return py_func(lambda x: x,
                       [x],
                       [tf.float32],
                       name=name,
                       grad=lambda op, g: tf.clip_by_norm(g, clip_value))[0]
```
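The key piece is `tf.clip_by_norm`, which rescales a tensor whenever its L2 norm exceeds the threshold and leaves it unchanged otherwise. A minimal NumPy sketch of that rule (the function name `clip_by_norm_np` is just for illustration):

```python
import numpy as np

def clip_by_norm_np(t, clip_norm):
    """Rescale t so its L2 norm is at most clip_norm (mirrors tf.clip_by_norm)."""
    norm = np.linalg.norm(t)
    if norm > clip_norm:
        return t * (clip_norm / norm)
    return t

g = np.array([[3., 7.], [3., 7.]])
clipped = clip_by_norm_np(g, 1.0)
print(np.linalg.norm(clipped))  # ~1.0: the norm is scaled down to the threshold
```

A tensor whose norm is already below the threshold passes through untouched, which is why clipping only kicks in for large gradients.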

Example Implementation:

```python
with tf.Session() as sess:
    x = tf.constant([[1., 2.], [3., 4.]])
    y = tf.constant([[1., 2.], [3., 4.]])

    print('without clipping')
    z = tf.matmul(x, y)
    print(tf.gradients(tf.reduce_sum(z), x)[0].eval())

    print('with clipping')
    z = tf.matmul(clip_grad(x, 1.0), clip_grad(y, 1.0))
    print(tf.gradients(tf.reduce_sum(z), x)[0].eval())

    print('with clipping between matmuls')
    z = tf.matmul(clip_grad(tf.matmul(x, y), 1.0), y)
    print(tf.gradients(tf.reduce_sum(z), x)[0].eval())
```

Result:

```
without clipping
[[ 3.  7.]
 [ 3.  7.]]
with clipping
[[ 0.278543   0.6499337]
 [ 0.278543   0.6499337]]
with clipping between matmuls
[[ 1.57841039  3.43536377]
 [ 1.57841039  3.43536377]]
```
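Those numbers can be checked by hand. For `z = x @ y` with loss `sum(z)`, the gradient with respect to `x` is `ones_like(z) @ y.T = [[3, 7], [3, 7]]`; clipping it to L2 norm 1.0 divides it by its norm √116 ≈ 10.77, and propagating the clipped gradient through the inner matmul gives the third result. A NumPy check, no TensorFlow required:

```python
import numpy as np

x = np.array([[1., 2.], [3., 4.]])
y = np.array([[1., 2.], [3., 4.]])

# Gradient of sum(x @ y) w.r.t. x: matches "without clipping"
grad = np.ones((2, 2)) @ y.T

# Clipped to L2 norm 1.0: matches "with clipping"
clipped = grad * (1.0 / np.linalg.norm(grad))

# For z = clip_grad(x @ y, 1.0) @ y, the clipped gradient is propagated
# back through the inner matmul: matches "with clipping between matmuls"
between = clipped @ y.T

print(grad)
print(clipped)
print(between)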