As we know, there are many levels of abstraction between the hardware we use and what we see on the screen. We also hear that computers work in binary, which means the basic unit of data is 0 and 1 (or true and false). When we tell a computer to compute something, it can't just take decimal numbers and do the computation directly. The numbers first have to be converted to binary, and only then can the computation be performed. I decided to dive into how computers do computations in binary. For simplicity of representation, I built all the functions in Python.
We…
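As a taste of where this is going, here is a minimal sketch of the kind of function such an exploration might produce: adding two binary numbers bit by bit with a carry, the way a chain of full adders does in hardware. The function name and the string representation of the numbers are illustrative choices, not the article's actual code.

```python
def add_binary(a, b):
    """Add two binary numbers given as strings ('1011' + '110'), one bit at a time."""
    width = max(len(a), len(b))
    a, b = a.zfill(width), b.zfill(width)
    result, carry = [], 0
    # Walk from the least significant bit to the most significant one,
    # exactly like a chain of full adders in hardware.
    for bit_a, bit_b in zip(reversed(a), reversed(b)):
        total = int(bit_a) + int(bit_b) + carry
        result.append(str(total % 2))
        carry = total // 2
    if carry:
        result.append("1")
    return "".join(reversed(result))

print(add_binary("1011", "110"))  # 1011 (11) + 110 (6) = 10001 (17)
```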
Previously, we learned how to build different parts of neural networks. To train the various versions of our artificial intelligence, we used abstract training data. Now the time has come to use real data.
Data
We are going to teach our neural net to recognize handwritten digits with the help of the MNIST dataset, also known as the "hello world" problem of the machine learning industry.
The MNIST dataset contains 60,000 labeled data points (images) as the training set and 10,000 as the test set. The original data is stored in binary files, so we will need to do some extraction work. …
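To give an idea of what that extraction could look like, here is a rough sketch assuming the standard IDX files from the MNIST distribution (a big-endian header followed by raw unsigned bytes); the file paths at the bottom are placeholders for wherever the unpacked files live.

```python
import struct
import numpy as np

def load_mnist_images(path):
    """Read an IDX image file (e.g. train-images-idx3-ubyte) into a NumPy array."""
    with open(path, "rb") as f:
        # The header holds a magic number, image count, rows, and columns,
        # each stored as a big-endian 32-bit integer.
        magic, count, rows, cols = struct.unpack(">IIII", f.read(16))
        assert magic == 2051, "not an IDX image file"
        # The rest of the file is raw unsigned bytes, one per pixel.
        pixels = np.frombuffer(f.read(), dtype=np.uint8)
    return pixels.reshape(count, rows * cols) / 255.0  # flatten and scale to [0, 1]

def load_mnist_labels(path):
    """Read an IDX label file (e.g. train-labels-idx1-ubyte) into a NumPy array."""
    with open(path, "rb") as f:
        magic, count = struct.unpack(">II", f.read(8))
        assert magic == 2049, "not an IDX label file"
        return np.frombuffer(f.read(), dtype=np.uint8)

# Placeholder local paths to the unpacked MNIST files.
X_train = load_mnist_images("train-images-idx3-ubyte")
y_train = load_mnist_labels("train-labels-idx1-ubyte")
```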
My colleague at MIT, Farzad, inspired me to systematize my thoughts about the learning process I have followed since I began studying computer and data science. In this article I will try to give a short and precise summary of the critical parts of the learning process, especially when you are self-taught.
Fog of war
When we are learning a completely new subject, we don't know what we don't know. And we don't have a professor who can map it out for us, so we have to do it ourselves. At first I would focus on key concepts…
As you might know, right now I'm a learner in the MIT ReACT program. This week we began a Computer Science course, and the first assignment was to come up with an idea for using a graph data structure. I thought I would use a knowledge graph as a real-world example of a graph data structure. Also, I thought I should create one that represents what we are going to learn throughout the program; besides, the opportunity to show off during class is priceless.
Knowledge Graph
I have already written about the graph data structure, so I am not going to repeat myself…
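For a sense of how small such a knowledge graph can be in code, here is a toy sketch using a plain Python adjacency list; the topic names are illustrative, not the actual program syllabus.

```python
# A minimal knowledge graph as an adjacency list: each topic points to the
# topics that build on it. The topic names are made up for illustration.
knowledge_graph = {
    "Python basics": ["Data structures", "Algorithms"],
    "Data structures": ["Graphs", "Neural networks"],
    "Algorithms": ["Graph traversal"],
    "Graphs": ["Graph traversal", "Knowledge graphs"],
    "Graph traversal": [],
    "Neural networks": [],
    "Knowledge graphs": [],
}

def reachable_topics(graph, start):
    """Collect `start` and every topic reachable from it with a depth-first search."""
    seen, stack = set(), [start]
    while stack:
        topic = stack.pop()
        if topic not in seen:
            seen.add(topic)
            stack.extend(graph[topic])
    return seen

print(reachable_topics(knowledge_graph, "Data structures"))
```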
Last time we talked about adding hidden layers to our neural network, and it did well at solving binary classification tasks. But what if we have more entities in our ontology? Let's figure out how to teach our neural net to solve multi-class classification problems!
When it comes to neural networks, the training dataset comes first! We will generate 3 segments of data to demonstrate how to work with more than two classes. Each segment will contain two-dimensional arrays of coordinates, and we will constrain the range of these points so that they group together. Therefore each…
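A rough sketch of how such a dataset could be generated with NumPy; the cluster centers, spread, and point counts are arbitrary values chosen for illustration, not the article's actual numbers.

```python
import numpy as np

rng = np.random.default_rng(42)

def make_segment(center_x, center_y, label, n=100, spread=0.5):
    """Generate `n` 2-D points scattered around a center, tagged with a class label."""
    points = rng.normal(loc=(center_x, center_y), scale=spread, size=(n, 2))
    labels = np.full(n, label)
    return points, labels

# Three segments, one per class, centered in different regions of the plane.
segments = [make_segment(0, 0, 0), make_segment(3, 3, 1), make_segment(0, 3, 2)]
X = np.vstack([points for points, _ in segments])
y = np.concatenate([labels for _, labels in segments])

# One-hot encode the labels so the network can have one output neuron per class.
y_one_hot = np.eye(3)[y]
```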
In my first and second articles about neural networks, I was working with perceptrons, a single-layer neural network. And even though our AI was able to recognize simple patterns, it couldn't be used, for example, for object recognition in images. That's why today we'll talk about hidden layers and try to upgrade our perceptron to a multilayer neural network.
Why do we need hidden layers? Perceptrons recognize simple patterns, so maybe if we add more learning iterations, they will learn to recognize more complex patterns? Actually, no. A single-layer perceptron can only draw a straight decision boundary, so no number of extra iterations will let it learn a pattern that is not linearly separable, such as XOR. …
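To make the point concrete, here is a toy sketch of a network with one hidden layer learning XOR, the textbook example of a pattern that is not linearly separable; the layer size, learning rate, and iteration count are arbitrary choices, not the article's actual code.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR: the classic pattern a single-layer perceptron cannot learn,
# because the two classes are not linearly separable.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([[0], [1], [1], [0]])

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# One hidden layer with 4 neurons; the size and learning rate are arbitrary choices.
w_hidden, b_hidden = rng.normal(size=(2, 4)), np.zeros((1, 4))
w_output, b_output = rng.normal(size=(4, 1)), np.zeros((1, 1))
learning_rate = 0.5

for _ in range(20000):
    # Forward pass through the hidden layer and the output layer.
    hidden = sigmoid(X @ w_hidden + b_hidden)
    output = sigmoid(hidden @ w_output + b_output)

    # Backpropagate the error through both layers.
    output_delta = (y - output) * output * (1 - output)
    hidden_delta = (output_delta @ w_output.T) * hidden * (1 - hidden)

    w_output += learning_rate * hidden.T @ output_delta
    b_output += learning_rate * output_delta.sum(axis=0, keepdims=True)
    w_hidden += learning_rate * X.T @ hidden_delta
    b_hidden += learning_rate * hidden_delta.sum(axis=0, keepdims=True)

print(output.round(2))  # close to [[0], [1], [1], [0]]
```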
In our previous article, we built from scratch a simple neural network that was able to learn and perform a very simple task. Today we will optimize our network, make it object-oriented, and introduce concepts such as learning rate and biases. And let's add a few simple but real-world cases, so that 0 and 1 turn into some sort of story.
Just like last time, we are going to need only one external library to perform the computation, NumPy, and, of course, Python itself.
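As a preview, here is a minimal sketch of what such an object-oriented network could look like, with a bias term and a configurable learning rate; the class name, hyperparameters, and toy training data are illustrative rather than the article's actual implementation.

```python
import numpy as np

class NeuralNetwork:
    """A single-layer network with a bias term and a configurable learning rate."""

    def __init__(self, n_inputs, learning_rate=0.1, seed=1):
        rng = np.random.default_rng(seed)
        self.weights = rng.normal(size=(n_inputs, 1))
        self.bias = 0.0
        self.learning_rate = learning_rate

    @staticmethod
    def _sigmoid(x):
        return 1 / (1 + np.exp(-x))

    def predict(self, inputs):
        return self._sigmoid(inputs @ self.weights + self.bias)

    def train(self, inputs, targets, iterations=10000):
        for _ in range(iterations):
            outputs = self.predict(inputs)
            error = targets - outputs
            # Error scaled by the sigmoid slope, applied with the learning rate.
            gradient = error * outputs * (1 - outputs)
            self.weights += self.learning_rate * inputs.T @ gradient
            self.bias += self.learning_rate * gradient.sum()

# Toy usage: the target simply follows the first input column.
X = np.array([[0, 0, 1], [1, 1, 1], [1, 0, 1], [0, 1, 1]])
y = np.array([[0], [1], [1], [0]])
net = NeuralNetwork(n_inputs=3)
net.train(X, y)
print(net.predict(np.array([[1, 0, 0]])))  # close to 1
```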
Learning is a meta-skill that we all desperately need to survive in our rapidly changing, singularitarian reality. It doesn't matter what you are learning: playing the drums, writing poetry, or computer science. It all requires serious cognitive effort. It's very common to get lost during the learning process, feel depressed, or even decide to give up. The distracting digital environment doesn't help us focus on complex problems either.
Good news, everyone! Learning is a skill, and like any other skill, we can train it and become masters of our own learning process. …
I have wanted to play with neural networks for a very long time, and I finally found a window of opportunity to mess around with them. It's pretty far from Skynet, and I don't claim to have fully grasped the math behind it, but let's teach an AI to do something simple first.
Neural networks are not a new concept. They were first introduced by Warren McCulloch and Walter Pitts in 1943.
We are going to build a single-layer neural net without hidden layers, also known as a perceptron. It will consist of an input layer with training examples, synapses…
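For reference, here is a bare-bones sketch of such a perceptron in NumPy: an input layer of training examples, a set of synaptic weights, and a sigmoid output, trained by nudging the weights in proportion to the error. The training data and iteration count are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)

# Input layer: four training examples with three features each,
# plus the target output for every example.
training_inputs = np.array([[0, 0, 1], [1, 1, 1], [1, 0, 1], [0, 1, 1]])
training_outputs = np.array([[0, 1, 1, 0]]).T

# Synapses: one weight per input feature, initialised randomly.
synaptic_weights = rng.uniform(-1, 1, size=(3, 1))

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

for _ in range(10000):
    # Forward pass: weighted sum of the inputs squashed by the sigmoid.
    output = sigmoid(training_inputs @ synaptic_weights)
    # Adjust the synapses in proportion to the error and the sigmoid slope.
    error = training_outputs - output
    synaptic_weights += training_inputs.T @ (error * output * (1 - output))

print(sigmoid(np.array([1, 0, 0]) @ synaptic_weights))  # close to 1
```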
Why do I love computers? Because they are able and willing to do boring work for us, provided we give them precise instructions, of course. I believe that computers should be used to free people from repetitive, non-creative, non-philosophical work. They can do it much better, and people can focus on doing something more interesting.
Problem definition
One of my non-tech colleagues had a problem processing massive PDF documents (several thousand pages each). They were very frustrated because they had to spend the whole day splitting those thousands of pages into smaller documents of 20 pages each. …
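One possible way to automate that, sketched with the pypdf library (an assumption about tooling, not necessarily what the article uses); the input file name and chunk naming scheme are placeholders.

```python
from pypdf import PdfReader, PdfWriter  # pip install pypdf

def split_pdf(source_path, pages_per_chunk=20):
    """Split one large PDF into numbered files of `pages_per_chunk` pages each."""
    reader = PdfReader(source_path)
    total_pages = len(reader.pages)
    for start in range(0, total_pages, pages_per_chunk):
        writer = PdfWriter()
        # Copy the next batch of pages into a fresh output document.
        for i in range(start, min(start + pages_per_chunk, total_pages)):
            writer.add_page(reader.pages[i])
        chunk_name = f"chunk_{start // pages_per_chunk + 1:03d}.pdf"
        with open(chunk_name, "wb") as out_file:
            writer.write(out_file)

split_pdf("huge_document.pdf")  # placeholder input file name
```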
Software Engineer, AI researcher and Transhumanist.