

Can we teach a computer to distinguish cats and dogs?

Last time I promised to cover the graph-guided fused LASSO (GFLASSO) in a subsequent post. In the meantime, I wrote a GFLASSO R tutorial for DataCamp that you can freely access here, so give it a try! The plan here is to experiment with convolutional neural networks (CNNs), a form of deep learning. CNNs underlie most advanced recognition algorithms used by the major tech giants. The recent development of back-end optimization tools and hardware (from Intel, NVIDIA and Google, to name a few) now enables training CNNs on conventional laptop machines, making them accessible to a broader audience.

Today you will construct a binary classifier that can distinguish between dogs and cats from a set of 25,000 pictures, using the Keras R interface powered by the TensorFlow back-end engine. The code is available from a dedicated repo, so you don't have to copy-paste the snippets below. If you fall short of RAM, please consider adapting the script to use fewer pictures, or to split, process and save them in separate instances.
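
To give a concrete feel for what such a classifier looks like in the Keras R interface, here is a minimal sketch of a small CNN ending in a single sigmoid unit. The 150 x 150 x 3 input shape, the layer sizes and the optimizer are illustrative assumptions, not necessarily the architecture built later in this post.

```r
# Minimal sketch of a binary cat-vs-dog CNN in the Keras R interface.
# The input shape and layer sizes are illustrative assumptions.
library(keras)

model <- keras_model_sequential() %>%
  layer_conv_2d(filters = 32, kernel_size = c(3, 3), activation = "relu",
                input_shape = c(150, 150, 3)) %>%
  layer_max_pooling_2d(pool_size = c(2, 2)) %>%
  layer_conv_2d(filters = 64, kernel_size = c(3, 3), activation = "relu") %>%
  layer_max_pooling_2d(pool_size = c(2, 2)) %>%
  layer_flatten() %>%
  layer_dense(units = 64, activation = "relu") %>%
  layer_dense(units = 1, activation = "sigmoid")   # outputs P(dog), say

model %>% compile(
  loss = "binary_crossentropy",    # the natural loss for a two-class problem
  optimizer = optimizer_rmsprop(),
  metrics = c("accuracy")
)

summary(model)
```

The single sigmoid output paired with binary cross-entropy is what makes this a binary classifier; a two-unit softmax with categorical cross-entropy would work just as well.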

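Regarding the memory caveat above, splitting, processing and saving the pictures in separate batches could look roughly like the sketch below. It is only an illustration of the idea: the 'train' folder, the chunk size of 2,500 and the 150 x 150 resizing are assumptions, and the actual script may organise this step differently.

```r
# Rough sketch: process the pictures in chunks so that only one chunk is held
# in RAM at a time, saving each processed chunk to disk for later reuse.
# Folder name, chunk size and target size are assumptions for illustration.
library(keras)

process_chunk <- function(files, size = c(150, 150)) {
  arr <- array(0, dim = c(length(files), size[1], size[2], 3))
  for (i in seq_along(files)) {
    img <- image_load(files[i], target_size = size)   # read and resize
    arr[i, , , ] <- image_to_array(img) / 255         # rescale to [0, 1]
  }
  arr
}

all_files <- list.files("train", pattern = "\\.jpg$", full.names = TRUE)
chunks    <- split(all_files, ceiling(seq_along(all_files) / 2500))

for (j in seq_along(chunks)) {
  saveRDS(process_chunk(chunks[[j]]), sprintf("train_chunk_%02d.rds", j))
}
```
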
Finally, I encourage you to use the RStudio terminal shell to fetch the Dogs vs. Cats dataset from Kaggle via its new API feature. I will provide more detailed instructions below.
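
As a rough preview of that step, and assuming the Kaggle command-line client is installed with an API token configured under ~/.kaggle/, fetching and unpacking the data from the RStudio terminal (or via system() calls from R, as below) could look something like this. The competition slug and archive names are assumptions to be checked against the Kaggle page.

```r
# Hypothetical sketch of fetching the Dogs vs. Cats data via the Kaggle API.
# Assumes the kaggle CLI is installed (pip install kaggle), an API token is in
# ~/.kaggle/, and the competition rules have been accepted on the website.
# The same shell commands can be typed directly into the RStudio terminal.
system("kaggle competitions download -c dogs-vs-cats")   # assumed competition slug
unzip("dogs-vs-cats.zip")   # assumed archive name
unzip("train.zip")          # assumed inner archive with the 25,000 labelled pictures
```
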
If you want to skip the theory, scroll all the way down to the ‘Let’s get started with R’ section. Enjoy!

Neural Networks

Driverless cars were out there as far back as 1989. Neural networks (NNs) have been around for a long time, so what triggered this craze around artificial intelligence and deep learning in recent years? The answer partly lies in Moore’s law and the remarkable improvement of hardware and computing power – we can now do a lot more with a lot less. The concept of NNs, as the name suggests, was inspired by the network of neurons in our own brains.
