Articles about deep learning

Stuff around artificial neural networks...

Trying OCR with GCloud Document AI

OCR stands for “Optical Character Recognition”, and is a powerful technique for extracting text (and possibly also its position, fonts, etc.) out of images. This task is far from trivial, given all the possible fonts, colors and image qualities out there. The text may also not lie on a straight horizontal line… Well, you guessed it, everything is possible in the wild, and the first step to make sense out of it is to extract the characters.
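
To give a taste of it, here is a minimal sketch of what calling Document AI for OCR can look like with the google-cloud-documentai Python client (the processor name and image file below are placeholders; you need your own GCP project and OCR processor):

```python
from google.cloud import documentai_v1 as documentai

# Placeholder: the full resource name of an OCR processor created in your GCP project.
PROCESSOR_NAME = "projects/my-project/locations/eu/processors/my-processor-id"

client = documentai.DocumentProcessorServiceClient()

# Read the image and wrap it as a raw document for the API.
with open("scan.png", "rb") as f:
    raw_document = documentai.RawDocument(content=f.read(), mime_type="image/png")

request = documentai.ProcessRequest(name=PROCESSOR_NAME, raw_document=raw_document)
result = client.process_document(request=request)

# The extracted text; layout details (pages, blocks, positions) live in result.document.pages.
print(result.document.text)
```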

Hashing with Neural Network Weights

This article presents a method for hashing strings based on the weights of an Artificial Neural Network trained on them.
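
The article details its own method; as a purely illustrative sketch of the general idea (not the article's algorithm), one could train a tiny fixed-seed autoencoder on the characters of a string and digest its learned weights:

```python
import hashlib
import numpy as np

def string_to_hash(s: str, epochs: int = 200, hidden: int = 8) -> str:
    """Toy sketch: train a tiny autoencoder on the characters of s,
    then hash its learned weights."""
    rng = np.random.default_rng(0)                 # fixed seed, so the hash is deterministic
    x = np.array([[ord(c) / 255.0] for c in s])    # one scalar input per character
    w1 = rng.normal(size=(1, hidden))
    w2 = rng.normal(size=(hidden, 1))
    lr = 0.1
    for _ in range(epochs):                        # plain gradient descent on the MSE
        h = np.tanh(x @ w1)
        y = h @ w2
        err = y - x
        grad_w2 = h.T @ err / len(x)
        grad_w1 = x.T @ ((err @ w2.T) * (1 - h ** 2)) / len(x)
        w2 -= lr * grad_w2
        w1 -= lr * grad_w1
    weights = np.concatenate([w1.ravel(), w2.ravel()])
    return hashlib.sha256(np.round(weights, 6).tobytes()).hexdigest()

print(string_to_hash("hello world"))
```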

Gradient Descent Explained

Gradient descent is a major technique for training ML/DL models. Let’s have a closer look at it and implement a simple example from scratch in Python, illustrating the basic concepts behind gradient descent.
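
To give a flavour of such a from-scratch example (a sketch, not necessarily the exact code of the article), here is plain gradient descent fitting a line y = a·x + b by minimizing the mean squared error:

```python
import numpy as np

# Noisy data generated from the line y = 3x + 0.5
rng = np.random.default_rng(42)
x = rng.uniform(-1, 1, size=100)
y = 3.0 * x + 0.5 + rng.normal(scale=0.1, size=100)

a, b = 0.0, 0.0
lr = 0.1                                # the learning rate
for step in range(500):
    y_pred = a * x + b
    error = y_pred - y
    grad_a = 2 * np.mean(error * x)     # d(MSE)/da
    grad_b = 2 * np.mean(error)         # d(MSE)/db
    a -= lr * grad_a                    # step against the gradient
    b -= lr * grad_b

print(f"a = {a:.2f}, b = {b:.2f}")      # should end up close to 3 and 0.5
```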

Cyclical learning rates with Tensorflow Implementation

The learning rate is considered the most important hyperparameter in a neural network (Bengio 2012). Finding the right one is thus quite crucial. Even better is to find a good learning rate schedule: modifying the learning rate during training so that the model has a better chance of reaching a good optimum. The goal of this article is to describe a learning rate schedule that seems to work well, along with its Tensorflow implementation and an example with a simple CNN on the MNIST dataset.
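
As an illustration, here is a sketch of the triangular cyclical schedule from Smith (2017), written as a tf.keras LearningRateSchedule (the article's own implementation and hyperparameters may differ):

```python
import tensorflow as tf

class TriangularCLR(tf.keras.optimizers.schedules.LearningRateSchedule):
    """Triangular cyclical learning rate: the LR oscillates linearly
    between base_lr and max_lr with a period of 2 * step_size batches."""

    def __init__(self, base_lr=1e-4, max_lr=1e-2, step_size=2000):
        self.base_lr = base_lr
        self.max_lr = max_lr
        self.step_size = step_size

    def __call__(self, step):
        step = tf.cast(step, tf.float32)
        cycle = tf.floor(1 + step / (2 * self.step_size))
        x = tf.abs(step / self.step_size - 2 * cycle + 1)
        return self.base_lr + (self.max_lr - self.base_lr) * tf.maximum(0.0, 1 - x)

# Hypothetical usage with any compiled Keras model:
# model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=TriangularCLR()),
#               loss="sparse_categorical_crossentropy", metrics=["accuracy"])
```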

Transforming Keras Model into Tensorflow Estimator

A Tensorflow Estimator is a convenient object for managing models, especially in production, and Keras is a convenient library for building them. Combining both is thus a powerful way to leverage their strengths, especially since Keras will be the standard for building models in Tensorflow 2.0. Let's see how it works:
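
As a minimal sketch of the conversion with tf.keras.estimator.model_to_estimator (the tiny model and random data below are just placeholders):

```python
import numpy as np
import tensorflow as tf

# A small Keras model, built and compiled as usual (placeholder architecture).
inputs = tf.keras.Input(shape=(10,), name="features")
hidden = tf.keras.layers.Dense(64, activation="relu")(inputs)
outputs = tf.keras.layers.Dense(1)(hidden)
model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="mse")

# Turn the compiled Keras model into a Tensorflow Estimator.
estimator = tf.keras.estimator.model_to_estimator(keras_model=model)

# Estimators consume data through an input_fn; the feature dict keys
# must match the Keras input layer names ("features" here).
def input_fn():
    x = np.random.rand(256, 10).astype("float32")
    y = np.random.rand(256, 1).astype("float32")
    return tf.data.Dataset.from_tensor_slices(({"features": x}, y)).batch(32)

estimator.train(input_fn=input_fn, steps=50)
```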
