Artificial intelligence owes its intelligent behavior to a technology called machine learning.
Although humans design the learning procedures, no one has yet explained why intelligence emerges from them or how artificial intelligence comes to be organized.
In this article, I examine why intelligence appears by reconsidering what learning actually is.
As we dig deeper into learning, we will arrive at the idea that artificial intelligence and our brains share something innate that lets them learn how to learn.
This suggests the existence of what might be called a "natural born frameworker."
Physical Learning Versus Learning Through Language
We learn about the world around us and become able to do more things by seeing with our eyes and moving our bodies.
This, too, is a form of learning, which we might call physical learning.
On the other hand, when people speak of learning in general, they often imagine acquiring knowledge by reading books or listening to a teacher's explanations.
Beyond such formal schooling, we also learn a great deal from conversations with friends, from news we see online, and from many other sources.
This kind of learning is neither memorizing what we see nor learning through bodily movement; it is learning through language.
Sub-physical Learning and Metaphysical Learning
In learning through language, some information must be repeated many times before we can remember it, while other information sticks after we hear it only once or twice.
There is also knowledge that, even if we do not remember every detail, we can still use by retrieving the full account from a bookshelf or the internet whenever we need it.
If learning means acquiring knowledge and being able to use it when needed, then all of these count as learning.
Among them, knowledge that must be repeated many times before it can be remembered might be called sub-physical knowledge. The way it is learned is sub-physical learning, the memorization of basic concepts.
This resembles physical learning, in which a person learns through repeated seeing and moving; such learning can also be grouped under sub-physical learning.
By contrast, acquiring knowledge that can be remembered with little repetition, or simply looked up and used on the spot, might be called metaphysical learning.
Here, concepts already acquired through sub-physical learning help us learn new knowledge as instances of those concepts or as combinations of them.
Because it can reuse the concepts already gained through sub-physical learning, metaphysical learning needs no repetition.
Natural Language Machine Learning
Let us apply this to machine learning in artificial intelligence.
Ordinarily, the neural networks used in machine learning perform sub-physical learning: they learn concepts through repetition.
Large language models, on the other hand, can process natural language much as humans do, and so can learn through language.
During the pre-training and fine-tuning of large language models, sub-physical learning through language takes place.
Beyond that, a trained large language model can answer a question using knowledge contained in the very sentences it is given, performing immediate metaphysical learning.
Thanks to this capacity for metaphysical learning through language, large language models can use new knowledge without learning it through repetition.
This might be called natural language machine learning, in contrast to conventional numerical machine learning, which always adjusts model parameters.
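The contrast can be sketched in a toy example. Everything here is illustrative: a linear model trained by gradient descent stands in for numerical machine learning, and a hypothetical `answer` function stands in for a large language model reading facts supplied in its prompt.

```python
import numpy as np

# Numerical machine learning: knowledge enters the model only through
# many repeated parameter updates (sub-physical learning).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w

w = np.zeros(3)                       # model parameters
for _ in range(500):                  # repetition is required
    grad = X.T @ (X @ w - y) / len(y)
    w -= 0.1 * grad                   # adjust parameters a little each time

# Natural language machine learning: knowledge arrives inside the input
# itself and is used immediately, with no parameter change at all.
def answer(question, context):
    # a stand-in for an LLM reading facts from its prompt
    facts = dict(line.split(": ") for line in context.splitlines())
    return facts.get(question, "unknown")

context = "capital of France: Paris\ncapital of Japan: Tokyo"
print(answer("capital of Japan", context))  # used at once, zero updates
```

The first half needs hundreds of repetitions for the knowledge to settle into the parameters; the second uses new knowledge the moment it appears in the input, which is the sense in which metaphysical learning needs no repetition.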
Natural Language as the Metaphysical Interface
Natural language sits exactly at the boundary between sub-physical learning and metaphysical learning.
What makes natural language remarkable is that it can itself be acquired through sub-physical learning, and once acquired, it makes metaphysical learning possible.
Metaphysical Interfaces Other Than Natural Language
In fact, even within physical learning, both sub-physical and metaphysical learning exist. For example, an experienced athlete can quickly adapt to a game seen for the first time.
Likewise, a person with deep knowledge of biology can quickly grasp the characteristics of an animal they have never seen before.
So in physical learning, too, there are metaphysical interfaces analogous to natural language.
Frameworks
At each of these interfaces there is a framework. Frameworks differ from ordinary concepts or knowledge: they define how things relate and are organized, or they make new ways of organizing things possible.
As all kinds of sub-physical knowledge accumulate through sub-physical learning, the framework at the metaphysical interface may itself be learned from the way those many small pieces of sub-physical knowledge connect.
Frameworks acquired through physical learning make it possible, once acquired, to learn new knowledge metaphysically and quickly. But the knowledge gained through such metaphysical learning is hard to convey to others.
By contrast, the framework acquired through learning by language is natural language itself.
Knowledge gained through metaphysical learning, once the natural language framework has been acquired, can therefore feed directly into other people's learning by language.
This applies not only to knowledge for which learning through language, as with textbooks or online news, is the primary route.
An experienced soccer player, playing baseball for the first time, can put the metaphysical knowledge gained about baseball into words for other soccer players. In other words, among people who share the same sub-physical knowledge, what we call "tips" or know-how can be communicated verbally.
Beyond that, a person can describe a newly observed kind of animal to other biologists in words.
Natural language thus proves to be an exceptionally powerful framework at the metaphysical interface.
Virtual Frameworks
On top of natural language, a person can acquire still other frameworks.
These are domain-specific frameworks and formal frameworks.
Across academic fields, business domains, and everyday life, there are many domain-specific frameworks.
Scholars working within the framework of their specialty can make new discoveries and readily communicate that knowledge to other scholars who share the same framework.
The framework itself can sometimes be expressed in natural language, in which case people, or large language models, that possess the natural language framework can learn and understand it.
Business models and cooking recipes are further examples of domain-specific frameworks that can be expressed in natural language.
Beyond these, mathematical formulas, programming languages, and business analysis frameworks are formal frameworks.
They, too, can have their frameworks expressed or explained in natural language.
These domain-specific and formal frameworks, built on top of natural language, might be called virtual frameworks.
The idea is easy to grasp if you picture a virtual machine running another computer program on top of a physical computer: another framework operates on top of natural language, the underlying framework.
Native Frameworks
Furthermore, although these virtual frameworks must first be understood through natural language, as a person uses them and grows accustomed to them, they begin to bypass natural language explanation and understanding, and come to function directly as metaphysical interface frameworks built on top of sub-physical knowledge.
These might be called native frameworks.
Natural language is itself, in a sense, a native framework, but only for one's mother tongue. Languages other than one's mother tongue are generally learned as virtual frameworks; as proficiency grows, they turn into native frameworks.
The same applies to domain-specific and formal frameworks. Mathematicians can converse natively in mathematical formulas, and programmers can understand each other's intent from source code alone, without any comments.
This suggests that the process by which virtual frameworks become native frameworks may also apply to large language models.
The idea of identifying frequently used virtual frameworks, generating abundant example data with those frameworks, and then fine-tuning on that data so the frameworks become native seems well worth trying soon.
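A minimal sketch of that proposal, with everything here hypothetical: a fixed template stands in for the teacher model that would generate worked examples inside a chosen virtual framework (a standard recipe format, as one of the domain-specific frameworks mentioned above), and the examples are written out in the JSONL prompt/completion shape commonly used for fine-tuning data.

```python
import json
import random

random.seed(0)
DISHES = ["omelette", "fried rice", "pancakes"]

def recipe_example(dish):
    # Hypothetical generator: in practice a strong model would produce
    # real worked examples inside the chosen framework.
    prompt = f"Write a recipe for {dish} using the standard recipe format."
    completion = (
        f"# {dish.title()}\n"
        "## Ingredients\n- ...\n"
        "## Steps\n1. ...\n"
    )
    return {"prompt": prompt, "completion": completion}

# Generate many such examples; fine-tuning on the resulting file is what
# would turn the virtual framework into a native one, so it no longer
# has to be re-explained in every prompt.
dataset = [recipe_example(random.choice(DISHES)) for _ in range(100)]
with open("framework_finetune.jsonl", "w") as f:
    for ex in dataset:
        f.write(json.dumps(ex) + "\n")
```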
Natural Born Frameworkers
Thinking along these lines, we realize that during the pre-training of large language models, and not only during fine-tuning, domain-specific and formal frameworks may also be being learned.
And in that process, rather than learning domain-specific or formal frameworks natively from the start, the models may first learn the natural language framework and then, either while still learning it or after mastering it, learn domain-specific and formal frameworks and make them native.
If we look deeper into this stepwise learning of frameworks, natural language learning itself may amount to many small, stepwise instances of framework learning happening in parallel.
In other words, from the vast text given as training data during pre-training, large language models may learn not only individual concepts but also some simple rules of natural language as a framework. Using these simple frameworks as a foundation, they then repeatedly learn progressively more complex rules.
This would let them move from first learning word meanings, to memorizing compound words and basic grammar, then to understanding sentences, and on to learning complex skills such as composition and rhetoric.
We can understand this as a model in which frameworks are learned stepwise and in combination, with one framework serving as the foundation for learning the next.
This casts large language models as "natural born frameworkers": systems equipped from the very beginning with a mechanism for learning frameworks.
Attention Mechanism
The technology that realizes the natural born frameworker is the attention mechanism.
The attention mechanism amounts to selecting the meaningful tokens from an utterance and making the relationships between tokens explicit. This is exactly what a framework does: it simplifies by keeping the essential concepts while clarifying the relationships between them.
By varying this selection for each token, it makes it possible to switch frameworks on the fly.
This would explain, via the natural-born-frameworker model, why the attention mechanism is the technology that has determined how large language models develop.
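As a concrete reference point, scaled dot-product attention, the core of the mechanism, can be written in a few lines of NumPy. For each token, a query is scored against every key, the scores are normalized into a probability distribution (the per-token "selection" described above), and the output is the corresponding weighted mix of values. The shapes and random inputs are illustrative only.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax: shift by the max before exponentiating
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    # each query scores every key; the normalized scores say which
    # tokens matter for this token, i.e. a soft, per-token selection
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)       # relationships between tokens
    weights = softmax(scores, axis=-1)  # one distribution per token
    return weights @ V, weights         # weighted mix of values

rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))  # 4 tokens, dimension 8
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
out, w = attention(Q, K, V)
# each row of w sums to 1: a per-token choice of which tokens to attend to
```

Because the weight distribution is recomputed for every token from the input itself, the same parameters can realize a different pattern of relationships for each input, which is the sense in which the mechanism can "switch frameworks" on the fly.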
Conclusion
If this mechanism really operates during the pre-training of large language models, then several things about large language models that have so far resisted explanation would be accounted for.
These include the sub-physical and metaphysical learning discussed here, frameworks as metaphysical interfaces, natural language as what makes learning through language and virtual frameworks possible, and the attention mechanism that brings the natural born frameworker to life.
Beyond that, two further points are suggested.
First, natural language has a structure well suited to gradually learning complex frameworks from simple ones.
If natural language first appeared in human society in a simple form and gradually grew into a more complex and rich structure, this is just what we would expect.
Moreover, it would be advantageous for it to be structured for rapid learning. If we assume that many societies with different natural languages have competed, the hypothesis that natural languages better suited to learning are the ones that survive is easy to form.
Reflecting on this nature of natural language leads to the second suggestion: that we human beings, too, are natural born frameworkers.
Even if the specific substrates and mechanisms differ, our brains must likewise possess some mechanism, analogous to the attention mechanism, that allows stepwise learning and flexible switching of frameworks.