I believe you are applying 0.5 dropout twice to the LSTM layer in the LRCN network, which in effect applies 0.75 dropout: two independent 0.5 drops keep each unit with probability 0.5 × 0.5 = 0.25, i.e. an effective dropout rate of 0.75. Is that right? (Apologies if I'm mistaken.)
Depending on where we want to apply dropout, there are several options. This patch can't be merged as-is; one of the options below has to be chosen first (a sketch of each follows the list):

- Option 1: apply dropout to the LSTM input only.
- Option 2: apply dropout to the LSTM input and to the recurrent/hidden state (in case that was the intention of the second dropout).
- Option 3: apply dropout to the LSTM input as well as before the subsequent FC layer (in case that was the intention of the second dropout).
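
For concreteness, here is a minimal sketch of the three options, assuming a Keras LSTM; the layer sizes, input shape, and class count are placeholders, not the actual LRCN configuration:

```python
# Hypothetical sketch comparing the three dropout placements; all sizes
# below are placeholders, not the real LRCN config.
from tensorflow.keras import Sequential
from tensorflow.keras.layers import LSTM, Dense, Dropout

num_classes = 10           # placeholder
input_shape = (40, 2048)   # placeholder: (timesteps, CNN feature size)

# Option 1: dropout on the LSTM input only.
option_1 = Sequential([
    LSTM(256, input_shape=input_shape, dropout=0.5),
    Dense(num_classes, activation='softmax'),
])

# Option 2: dropout on the LSTM input and on the recurrent/hidden state.
option_2 = Sequential([
    LSTM(256, input_shape=input_shape, dropout=0.5, recurrent_dropout=0.5),
    Dense(num_classes, activation='softmax'),
])

# Option 3: dropout on the LSTM input, plus dropout before the FC layer.
option_3 = Sequential([
    LSTM(256, input_shape=input_shape, dropout=0.5),
    Dropout(0.5),
    Dense(num_classes, activation='softmax'),
])
```

Note that in each option every unit sees at most one 0.5 drop per connection, so the effective rate stays at 0.5 rather than compounding to 0.75.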