On Tue, May 23, 2017 at 4:51 AM, Hideki Kato <hideki_katoh@ybb.ne.jp> wrote:
> (3) CNN cannot learn the exclusive-or function because of the ReLU
> activation function, as opposed to the traditional sigmoid (hyperbolic
> tangent). CNN is good at approximating continuous (analog) functions
> but not Boolean (digital) ones.

Oh, not this nonsense with the XOR function again.

You can see a neural network with ReLU activation functions learning XOR right here:
http://playground.tensorflow.org/#activation=relu&batchSize=10&dataset=xor&regDataset=reg-plane&learningRate=0.01&regularizationRate=0&noise=0&networkShape=4,4&seed=0.96791&showTestData=false&discretize=false&percTrainData=50&x=true&y=true&xTimesY=false&xSquared=false&ySquared=false&cosX=false&sinX=false&cosY=false&sinY=false&collectStats=false&problem=classification&initZero=false&hideText=false

Enjoy,
Álvaro.
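
For concreteness, here is a minimal sketch (assuming plain Python with numpy; the weights are hand-chosen for illustration, not learned) showing that a single ReLU hidden layer with just two units already represents XOR exactly on Boolean inputs. The playground link above additionally shows such a network being found by gradient descent.

import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

# Hand-chosen weights (illustrative, not learned):
#   hidden unit 1: ReLU(x1 + x2 - 1)  -> fires only on (1,1)
#   hidden unit 2: ReLU(x1 + x2)      -> counts the ones
W1 = np.array([[1.0, 1.0],
               [1.0, 1.0]])
b1 = np.array([-1.0, 0.0])

# Output: y = -2*h1 + h2, which gives 0, 1, 1, 0 on the four Boolean inputs.
w2 = np.array([-2.0, 1.0])

def xor_net(x):
    h = relu(x @ W1 + b1)
    return float(h @ w2)

for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, xor_net(np.array(x, dtype=float)))
# Prints: (0, 0) 0.0, (0, 1) 1.0, (1, 0) 1.0, (1, 1) 0.0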