A House Cat Knows More Than IBM Watson

It ran out of productivity the moment you begged the question in your definitions, which was... oh, right, post #1.
Behaviors can change the neural network of the mind.

Machine algorithms always remain the same.

If both sides of the argument can’t come to this conclusion, then you are all trolls.
 
Behaviors can change the neural network of the mind.
Should that not be stated in reverse? Behaviors are a result of natural selection, and natural selection of beneficial neural patterns tends to favor the greatest efficiency in advantageous survival behaviors, no?

Machine algorithms always remain the same.
IMO, mathematical functions, i.e. the algorithms of nature, always remain the same. In nature the results of mathematical functions may show aberrations due to external pressures, but that does not invalidate the mathematical guiding principles.

As far as a housecat knowing more than a computer goes, I beg to differ as it relates to modern AI.

If you doubt this, then investigate the current AI GPT-3, which is capable of remarkable quasi-intelligent processes.

GPT-3’s coding skills
One of the most surprising use cases people found was GPT-3’s ability to code following a natural language prompt (a prompt is the chunk of text we input to the system). Sharif Shameen created debuild.co, a code generator based on GPT-3. He showed how the system was able to build a simple program in HTML/CSS/JSX from a simple set of instructions in English. Jordan Singer built Designer, a Figma plugin that can design for you. Amjad Masad built Replit, an application that explains and even tells you how to improve your code.
How can GPT-3 code from input in natural language? The reason is its multitasking meta-learning abilities. It can learn to perform text tasks it hasn’t been trained on, after seeing just a few examples. Sharif Shameen and company conditioned GPT-3 to learn these tasks. Meta-learning is an impressive ability, but we tend to overestimate AIs that acquire human-reserved skills, and GPT-3 is no different. It can code, but it can’t code everything. The article lists three important limitations (continued at the link below):
https://towardsdatascience.com/gpt-3-or-any-ai-wont-kill-coding-f4cabd3a536b
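The "conditioning" the article describes is in-context learning: the model's weights never change; instead a few worked examples are pasted ahead of the new request in the prompt. A minimal sketch of how such a few-shot prompt is assembled, with a hypothetical English-to-HTML task and invented example pairs (not from the article):

```python
# Sketch of few-shot "in-context learning" prompt construction.
# The model is not retrained; it is conditioned by worked examples
# placed directly in the prompt text. Task and examples are hypothetical.

def build_few_shot_prompt(examples, query):
    """Concatenate (instruction, output) pairs, then append the new instruction."""
    parts = []
    for instruction, output in examples:
        parts.append(f"Instruction: {instruction}\nCode: {output}\n")
    parts.append(f"Instruction: {query}\nCode:")  # model completes after "Code:"
    return "\n".join(parts)

examples = [
    ("a red button that says Stop", '<button style="color:red">Stop</button>'),
    ("a link to example.com", '<a href="https://example.com">example.com</a>'),
]

prompt = build_few_shot_prompt(examples, "a green button that says Go")
print(prompt)
```

The resulting text would then be sent to the model as a single completion request; with two examples of the pattern, GPT-3 tends to continue it by emitting matching HTML for the new instruction.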

While GPT-3 is still limited as compared to humans, it far exceeds a housecat in all respects except perhaps in catching mice.
 
Machine algorithms always remain the same.
Only if you program them that way.

A great many AIs nowadays learn continuously and change their algorithms on the fly in response to new data. An AI that lets a robot walk, for example, will stumble a bit when you suddenly add a heavy load to it. But it will quickly learn to change its posture a bit, and adjust its gait, to account for the new load.
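The on-the-fly adjustment described above can be sketched as a simple feedback loop that updates its own parameter at every step. The controller, gain, and "load torque" below are invented for illustration; this is a toy model of online adaptation, not any real robot's algorithm:

```python
# Toy sketch of online adaptation: a controller holds a posture offset and
# nudges it after every step so the measured tilt error shrinks, even when
# a heavy load is suddenly added. All quantities are hypothetical.

def adapt_posture(load_torque, steps=50, learning_rate=0.3):
    """Return the posture offset over time; it should converge to load_torque."""
    offset = 0.0          # the parameter the "algorithm" changes on the fly
    history = []
    for _ in range(steps):
        tilt_error = load_torque - offset       # imbalance sensed this step
        offset += learning_rate * tilt_error    # adjust in response to new data
        history.append(offset)
    return history

history = adapt_posture(load_torque=2.0)
print(round(history[-1], 3))  # → 2.0 (offset has converged to cancel the load)
```

The point is that nothing in the loop is fixed at programming time except the update rule itself: the behavior (the offset) is rewritten continuously by the data, which is exactly the sense in which such algorithms do not "always remain the same."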
 