Custom weight initialization tensorflow tf.layers.dense

I think you can define your own initializer function. The function needs to take three arguments: shape, dtype, and partition_info. It should return a tf.Tensor that will be used to initialize the weights. Since you have a NumPy array, I think you can use tf.constant to create this tensor. For example:

import numpy as np
import tensorflow as tf

def custom_initializer(shape_list, dtype, partition_info):
    # Use np.ones((7, 3)) as an example; its shape must match the kernel shape
    return tf.constant(np.ones((7, 3)), dtype=dtype)

Then you can pass it to kernel_initializer. It should work as long as all the dimensions match. I put an example on gist, using Estimator to construct the model and LoggingTensorHook to record dense/kernel at each step. You should be able to see that the weight is initialized correctly.

Edit:

I just found that using tf.constant_initializer is better. It is the approach used in the TensorFlow guide. You can do kernel_initializer=tf.constant_initializer(np.ones((7, 3))).


There are at least two ways to achieve this:

1 Create your own layer

  W1 = tf.Variable(YOUR_WEIGHT_MATRIX, name='Weights')
  b1 = tf.Variable(tf.zeros([YOUR_LAYER_SIZE]), name='Biases')  # or pass your own
  h1 = tf.add(tf.matmul(X, W1), b1)
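A runnable version of this approach might look like the following (the tf.compat.v1 API is used so it also works on TensorFlow 2.x; the weight matrix and shapes are stand-ins for your own):

```python
import numpy as np
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# Stand-ins for YOUR_WEIGHT_MATRIX and YOUR_LAYER_SIZE
weight_matrix = np.ones((7, 3), dtype=np.float32)
layer_size = 3

X = tf.placeholder(tf.float32, shape=[None, 7])
W1 = tf.Variable(weight_matrix, name='Weights')
b1 = tf.Variable(tf.zeros([layer_size]), name='Biases')  # or pass your own
h1 = tf.add(tf.matmul(X, W1), b1)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # With an all-ones kernel and input, each output is the sum of 7 ones
    out = sess.run(h1, feed_dict={X: np.ones((2, 7), dtype=np.float32)})
    print(out)
```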

2 Use the tf.constant_initializer

init = tf.constant_initializer(YOUR_WEIGHT_MATRIX)
l1 = tf.layers.dense(X, o, kernel_initializer=init)