-
An algorithm is a series of instructions telling a computer what to do. No matter how complex, everything reduces to three logical operations - AND, OR, and NOT. Claude Shannon's breakthrough thesis was that switching circuits (transistors turning on and off) implement logic built from these operations
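A minimal sketch (my own illustration, not from the book) of how any Boolean function reduces to those three operations - here, exclusive-or built from AND, OR, and NOT alone:

```python
# Any Boolean function can be built from AND, OR, and NOT.
# Example: exclusive-or (XOR) expressed using only those three operations.

def AND(a, b): return a and b
def OR(a, b):  return a or b
def NOT(a):    return not a

def XOR(a, b):
    # a XOR b == (a OR b) AND NOT (a AND b)
    return AND(OR(a, b), NOT(AND(a, b)))

for a in (False, True):
    for b in (False, True):
        print(a, b, XOR(a, b))
```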
-
Machine learning algorithms learn by making inferences from data, and the more data they have, the better. It is a technology that builds itself and can build other artifacts, turning data into algorithms
-
Algorithms are absolutely everywhere and are changing how we do business, make decisions, and even fall in love. Machine learning is all about making accurate predictions, and learning algorithms greatly expand the scope of what algorithms can do
-
The master algorithm is the key to machine learning, unifying the field's different schools of thought into one ultimate algorithm - a general learner. It may be the best path we have towards a theory of everything
-
Complexity is a huge battle every computer scientist must face, as each algorithm is typically built on top of other algorithms. A learner, however, can overcome this: it is fed the data and the desired result and spits out the algorithm that fits the situation. This type of technology is more like nature - learning algorithms are the seed, data is the soil, and the learned programs are the crops
-
Machine learning can be thought of as the inverse of programming: you feed in the data and the desired output, and out comes the algorithm
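A toy sketch of that inversion (my own illustration, hypothetical data): hand the learner example inputs with their desired outputs and let it produce the rule - here, a simple threshold classifier:

```python
# Traditional programming: write the rule, feed it data, get output.
# Machine learning: feed in data plus desired outputs, get the rule back.

# Hypothetical labeled examples: (input value, desired label)
data = [(1.0, 0), (2.0, 0), (3.0, 0), (6.0, 1), (7.0, 1), (8.0, 1)]

def learn_threshold(examples):
    """Search for the cutoff that best separates the two labels."""
    best_cutoff, best_correct = None, -1
    for cutoff, _ in sorted(examples):
        correct = sum((x >= cutoff) == bool(label) for x, label in examples)
        if correct > best_correct:
            best_cutoff, best_correct = cutoff, correct
    return best_cutoff

rule = learn_threshold(data)                 # the "program" that comes out
print(rule, [(x, x >= rule) for x, _ in data])
```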
-
ML requires statistical, probabilistic thinking rather than black-and-white deterministic thinking - 99% accuracy may be the best you can get. ML automates automation itself; otherwise programmers become the bottleneck
-
Data is the name of the game, which is why network effects are so powerful and why Google and other platform-type companies have such strong tailwinds at their backs
-
You need data commensurate with the complexity of the task at hand. The algorithm can only be as good as the data that goes into it, so huge amounts of relevant data are the name of the game
-
Data can be thought of as the new oil and there is huge money in refining it
-
If something exists but the brain can’t learn it, it is the equivalent of not existing for us
-
Overfitting is a big problem and occurs when the learner reads patterns into the data that aren't really there. One way to limit this is by rewarding simpler theories and algorithms
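One common way to reward simpler theories is to score a model by its error plus a penalty on its complexity. A minimal sketch with made-up numbers (my own illustration, not from the book):

```python
# Reward simpler theories: score each candidate by error plus a complexity penalty.
# Lower score wins; the penalty discourages overly elaborate models.

def penalized_score(training_error, num_parameters, lam=0.1):
    return training_error + lam * num_parameters

# Hypothetical candidates: (name, training error, number of parameters)
candidates = [("line", 0.20, 2), ("cubic", 0.12, 4), ("degree-9 poly", 0.01, 10)]

best = min(candidates, key=lambda c: penalized_score(c[1], c[2]))
print(best[0])  # the simpler model can win despite a higher training error
```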
-
Generalizing to situations that haven't been seen before is difficult. Accuracy on held-out data is the gold standard for testing an algorithm's performance
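A minimal sketch of held-out evaluation (my own illustration, hypothetical data): set part of the data aside, train only on the rest, and report accuracy on the part the learner never saw:

```python
import random

# Hypothetical labeled data: label is 1 when the feature exceeds 5
data = [(x, int(x > 5)) for x in range(20)]
random.seed(0)
random.shuffle(data)

split = int(0.7 * len(data))
train, held_out = data[:split], data[split:]   # held-out set untouched during training

# "Train": pick the midpoint between the largest negative and smallest positive example
largest_neg = max(x for x, label in train if label == 0)
smallest_pos = min(x for x, label in train if label == 1)
threshold = (largest_neg + smallest_pos) / 2

# Evaluate only on data the learner never saw - the gold standard
accuracy = sum((x > threshold) == bool(label) for x, label in held_out) / len(held_out)
print(threshold, accuracy)
```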
-
The S-curve (the sigmoid or logistic curve) is the most important curve in the world - slow start, rapid rise, then saturation
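A tiny illustration of the logistic (sigmoid) function behind the S-curve:

```python
import math

# The logistic (sigmoid) S-curve: slow start, rapid rise, saturation
def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

for x in range(-6, 7, 2):
    print(f"{x:>3}  {sigmoid(x):.3f}")
```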
-
The exploration vs exploitation trade off must be considered in life and in algorithms
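A classic way to formalize the trade-off is an epsilon-greedy strategy - mostly exploit the best-known option, occasionally explore. A minimal bandit sketch (my own illustration, made-up payoffs):

```python
import random

random.seed(1)
true_payoffs = [0.3, 0.5, 0.8]           # hidden average payoff of three options
estimates, counts = [0.0] * 3, [0] * 3
epsilon = 0.1                             # fraction of choices spent exploring

for _ in range(1000):
    if random.random() < epsilon:
        choice = random.randrange(3)                         # explore: try something at random
    else:
        choice = max(range(3), key=lambda i: estimates[i])   # exploit: best known so far
    reward = 1 if random.random() < true_payoffs[choice] else 0
    counts[choice] += 1
    estimates[choice] += (reward - estimates[choice]) / counts[choice]  # running average

print(estimates, counts)   # most pulls should go to the highest-payoff option
```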
-
Nature and nurture work together seamlessly to help us survive - the program and the data
-
The curse of dimensionality is the second worst problem in machine learning, after overfitting
-
Clustering - partitioning a set of observations into subsets (clusters) so that observations within the same cluster are similar according to some predesignated criterion or criteria, while observations drawn from different clusters are dissimilar
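A minimal k-means sketch of that idea (my own illustration, made-up points): repeatedly assign each point to the nearest cluster center, then move each center to the mean of its points:

```python
# Minimal 1-D k-means: similar observations end up in the same cluster.
points = [1.0, 1.2, 0.8, 5.0, 5.3, 4.9]
centers = [0.0, 10.0]                      # rough initial guesses

for _ in range(10):
    # Assign each point to its nearest center
    clusters = [[] for _ in centers]
    for p in points:
        nearest = min(range(len(centers)), key=lambda i: abs(p - centers[i]))
        clusters[nearest].append(p)
    # Move each center to the mean of its assigned points
    centers = [sum(c) / len(c) if c else centers[i] for i, c in enumerate(clusters)]

print(centers, clusters)
```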
-
Law of effect - people move towards pleasure and away from pain
-
Reinforcement learning - algorithms that learn the long-term value of states and actions and choose the move with the greatest value
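A rough sketch of the value idea (my own illustration, a simplified Q-learning-style update on a made-up problem): action values are nudged toward observed reward plus discounted future value, and the agent picks the move with the greatest value:

```python
import random

# Tiny chain world: states 0..3, reward only for reaching state 3.
# Q[s][a] estimates the long-term value of action a (0 = left, 1 = right) in state s.
Q = [[0.0, 0.0] for _ in range(4)]
alpha, gamma, epsilon = 0.5, 0.9, 0.2
random.seed(0)

for _ in range(300):
    state = 0
    while state != 3:
        if random.random() < epsilon:
            action = random.choice([0, 1])
        else:
            action = max((0, 1), key=lambda a: Q[state][a])
        next_state = max(0, state - 1) if action == 0 else state + 1
        reward = 1.0 if next_state == 3 else 0.0
        # Nudge the value toward reward plus discounted value of the best next move
        Q[state][action] += alpha * (reward + gamma * max(Q[next_state]) - Q[state][action])
        state = next_state

print([[round(v, 2) for v in row] for row in Q])  # "right" should carry the greatest value
```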
-
Power law of practice - chunking in action, best way to learn
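The power law of practice says the time to perform a task falls off as a power of the number of repetitions, roughly T(n) = a * n^(-b); a tiny illustration with hypothetical constants:

```python
# Power law of practice: time per trial shrinks as a power of the practice count.
a, b = 10.0, 0.4            # hypothetical constants for some task
for n in (1, 10, 100, 1000):
    print(n, round(a * n ** -b, 2))
```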
-
Causality - Being aware of your environment, how your actions impact it and adapting to best get what you want
-
Relational learning - the best way to understand an entity is to see how it relates to, fits in with, and acts on the entities around it. This way it is not an individualistic exercise but a holistic, network-type view. Predator and prey, for example, have deeply intertwined characteristics. This may be one of the best ways to understand how the world works
-
A man is wealthy if he is richer than his wife's sister's husband - HL Mencken
-
Lateral networks
-
Researchers found that advertising to one of the most trusted reviewers of a product can be as effective as advertising to a third of all possible customers
-
Some of the most important inventions and discoveries in history have been unifiers - things that let many separate processes be done in one. The internet and electricity are two examples of unifiers. The master algorithm is the unifier of ML: it lets any application use any learner by abstracting the learner into a common form, which is all the application needs to know. Out of many models, one
-
All learners have representation, evaluation, and optimization processes
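A sketch of those three components assembled into one tiny learner (my own framing, not the book's code): the representation is the hypothesis space, the evaluation scores a hypothesis against data, and the optimization searches for the best-scoring one:

```python
# Every learner = representation + evaluation + optimization.

data = [(0.0, 1.0), (1.0, 3.1), (2.0, 4.9), (3.0, 7.2)]   # hypothetical (x, y) pairs

# Representation: the space of hypotheses considered - here, lines y = w*x + b
def predict(w, b, x):
    return w * x + b

# Evaluation: how good a hypothesis is - here, mean squared error on the data
def evaluate(w, b):
    return sum((predict(w, b, x) - y) ** 2 for x, y in data) / len(data)

# Optimization: how the space is searched - here, a crude grid search
best = min(
    ((w / 10, b / 10) for w in range(-50, 51) for b in range(-50, 51)),
    key=lambda p: evaluate(*p),
)
print(best, round(evaluate(*best), 3))
```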
-
As with the brain, genetic search followed by gradient descent may be one of the best tactics: evolution creates the structure and individual experience molds it to specific uses
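A rough sketch of that two-stage tactic (my own illustration on a made-up objective): a coarse random "evolutionary" search picks a promising starting point, then gradient descent fine-tunes it:

```python
import math
import random

# Objective to minimize: bumpy, with several local minima
def f(x):
    return (x - 3) ** 2 + 2 * math.sin(5 * x)

def grad(x, eps=1e-5):
    return (f(x + eps) - f(x - eps)) / (2 * eps)   # numerical gradient

random.seed(0)

# Stage 1: coarse "evolutionary" search - sample widely, keep the fittest individual
population = [random.uniform(-10, 10) for _ in range(50)]
x = min(population, key=f)

# Stage 2: gradient descent - experience fine-tunes what the coarse search found
for _ in range(200):
    x -= 0.01 * grad(x)

print(round(x, 3), round(f(x), 3))
```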
-
As you interact with algorithms, understand what model of you you want them to have and what data you can give them to bring that model to fruition. In the future, everyone will have bots that take your preferences, wishes, and desires into account, navigate the world around you, and deal with other people's and companies' bots to get you the best outcome. Having the most accurate digital representation of you is therefore important, and the company that can safely and securely develop this virtual data store of people - knowing what to share, when, and with whom - is bound to get incredibly wealthy
-
Technology is a phenotype of humans and will help us continue to expand our scope and capabilities
-
Domingos paints a pretty rosy future where these algorithms help us achieve what we want and automate much of what we do today. There will be high unemployment, but it won't matter because machines can produce what we need so cheaply that a universal basic income becomes feasible, and only those who want to work will - typically in niches where computers aren't as effective as humans
What I got out of it
-
The master algorithm is a general learner, and the more relevant data you have, the better. A great primer on ML, algorithms, and how they are impacting and will continue to impact our lives