Searched refs:hard_sigmoid (Results 1 – 3 of 3) sorted by relevance
433   def hard_sigmoid(x):   function
457       return backend.hard_sigmoid(x)
5073 def hard_sigmoid(x): function
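The two definitions above follow the usual Keras layering: the public `hard_sigmoid` activation delegates to the backend function of the same name. As a minimal sketch only, assuming the conventional piecewise-linear form 0.2 * x + 0.5 clipped to [0, 1] (NumPy is used here purely for illustration, not the actual backend):

    import numpy as np

    def hard_sigmoid(x):
        # Piecewise-linear approximation of the logistic sigmoid:
        # 0 for x <= -2.5, 1 for x >= 2.5, and 0.2 * x + 0.5 in between.
        return np.clip(0.2 * x + 0.5, 0.0, 1.0)

    print(hard_sigmoid(np.array([-3.0, 0.0, 1.0, 3.0])))  # [0.  0.5 0.7 1. ]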
6665   recurrent activation function for GRU from `hard_sigmoid` to `sigmoid`, and
6667   `hard_sigmoid` since it is fast than 'sigmoid'. With new unified backend
6674   GRU(recurrent_activation='hard_sigmoid', reset_after=False) to fallback to
7314   default recurrent activation function for GRU from 'hard_sigmoid' to
7316   activation is 'hard_sigmoid' since it is fast than 'sigmoid'. With new
7324   GRU(recurrent_activation='hard_sigmoid', reset_after=False) to fallback
7535   'hard_sigmoid' to 'sigmoid' in 2.0. Historically recurrent activation is
7536   'hard_sigmoid' since it is fast than 'sigmoid'. With new unified backend
7543   construct the layer with LSTM(recurrent_activation='hard_sigmoid') to
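These release-note hits describe the 2.0 change of the default recurrent activation from 'hard_sigmoid' to 'sigmoid' (and, for GRU, reset_after=True), aligning the defaults with the new unified backend mentioned in the snippets. A hedged usage sketch of the fallback construction those lines refer to, using the tf.keras API (the unit count of 64 is arbitrary):

    import tensorflow as tf

    # 2.0 defaults: recurrent_activation='sigmoid', reset_after=True for GRU.
    gru_default = tf.keras.layers.GRU(64)

    # Fall back to the pre-2.0 cell configuration, as the release notes describe.
    gru_legacy = tf.keras.layers.GRU(
        64, recurrent_activation='hard_sigmoid', reset_after=False)
    lstm_legacy = tf.keras.layers.LSTM(
        64, recurrent_activation='hard_sigmoid')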