[ Python ] TensorFlow Weight L2, L1 Regularization Made Easy
데이터분석뉴비
2019. 9. 24. 10:00
Related Stack Overflow questions:
- "TensorFlow - regularization with L2 loss, how to apply to all weights, not just last one?" (stackoverflow.com)
- "How to exactly add L1 regularisation to tensorflow error function" (stackoverflow.com)
Applying L2 only to the Weights in the existing losses gen_loss and disc_loss
import re
import tensorflow as tf

t_vars = tf.trainable_variables()

# L2 penalty on every Generator variable whose name contains "Weight"
G_L2 = []
for v in t_vars:
    if re.search('Weight', v.name):
        if re.search("Generator", v.name):
            print(v.name)
            G_L2.append(tf.nn.l2_loss(v))
gen_loss += tf.add_n(G_L2) * 0.001

# L2 penalty on every Discriminator variable whose name contains "kernel"
D_L2 = []
for v in t_vars:
    if re.search('Discriminator', v.name):
        if re.search("kernel", v.name):
            print(v.name)
            D_L2.append(tf.nn.l2_loss(v))
disc_loss += tf.add_n(D_L2) * 0.001
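For reference, tf.nn.l2_loss(w) returns sum(w ** 2) / 2, so the 0.001 factor above is the coefficient applied to that half-sum of squares. A minimal stand-alone check, assuming TensorFlow 1.x (the toy tensor and the name Weight_check are only for illustration):

import numpy as np
import tensorflow as tf

# Toy tensor: its squared entries sum to 55, so tf.nn.l2_loss should return 27.5
w = tf.constant(np.arange(6, dtype=np.float32).reshape(2, 3), name="Weight_check")
l2_builtin = tf.nn.l2_loss(w)
l2_manual = tf.reduce_sum(tf.square(w)) / 2.0

with tf.Session() as sess:
    print(sess.run([l2_builtin, l2_manual]))  # [27.5, 27.5]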
Trying out L2 and L1 loss together
import re
import tensorflow as tf

# Collect every variable under the "Ensembles" scope whose name contains "Weight"
vars = tf.get_collection(tf.GraphKeys.GLOBAL_VARIABLES, scope="Ensembles")
L2 = []
WEIGHTS = []
for v in vars:
    if re.search('Weight', v.name):
        WEIGHTS.append(v)
        L2.append(tf.nn.l2_loss(v))

# L2 penalty on the collected weights
Loss += tf.add_n(L2) * 0.1

# L1 penalty on the same weights via tf.contrib (TensorFlow 1.x)
l1_regularizer = tf.contrib.layers.l1_regularizer(scale=0.005, scope=None)
regularization_penalty = tf.contrib.layers.apply_regularization(l1_regularizer, WEIGHTS)
Loss += regularization_penalty
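The tf.contrib.layers L1 penalty is just scale * sum(|w|) accumulated over the variables you pass in, so the same term can also be built by hand when tf.contrib is not available. A minimal stand-alone sketch of that equivalence, assuming TensorFlow 1.x with tf.contrib (the variable name Weight_demo is only for illustration):

import numpy as np
import tensorflow as tf

w = tf.Variable(np.array([[1.0, -2.0], [3.0, -4.0]], dtype=np.float32), name="Weight_demo")

# contrib version: scale * sum(|w|), summed over the listed variables
l1_reg = tf.contrib.layers.l1_regularizer(scale=0.005, scope=None)
penalty_contrib = tf.contrib.layers.apply_regularization(l1_reg, [w])

# hand-written version of the same penalty
penalty_manual = 0.005 * tf.reduce_sum(tf.abs(w))

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run([penalty_contrib, penalty_manual]))  # both 0.05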