Is it possible to functionally transfer knowledge from one neural network to another?

Discussion in 'Intelligence & Machines' started by Buckaroo Banzai, Jan 3, 2018.

  1. river

    Messages:
    17,307
    This OP is nonsense.

    The OP has no idea of the complexity of the brain.
     
  3. Write4U Valued Senior Member

    Messages:
    20,069
    From Wiki:
    In general, what made us even think about flying in the first place?
     
  5. someguy1 Registered Senior Member

    Messages:
    727
    I made the point that birds fly and planes fly, but man-made flight operates via a different mechanism than natural flight. In the same way, man-made intelligence operates via different mechanisms than natural intelligence, so it's a fallacy to imagine that, just because a digital computer can play chess, humans use a digital algorithm to play chess.

    This argument is not mine, it's common in the AI debate.
     
  7. someguy1 Registered Senior Member

    Messages:
    727
    Then you are conceding my point. Thank you.

    What? Without looking it up, define exponential. And now explain to me what the F you are talking about.

    Logarithms? You're just randomly typing in math words. You're trolling me now and no longer trying to say anything substantive.

    Well, algorithms are described by computer languages. If you're not talking about formal languages then you're not talking about algorithms. You keep defeating your own point.

    Good grief. What are you talking about? You are not making any sense. Of course minds can do math. But in fact math is not algorithmic. Gödel proved that.

    No. YOU are restricting the mind to computers by insisting that the mind is an algorithm.

    No, you're just wrong about that. There are no logic gates in the brain.

    Yes but the working does not involve algorithms. You keep making a claim without providing argument or evidence.

    Call it woo-stuff then.

    It's the same as my reluctance to call a fish a brick. A fish isn't a brick.

    Sure. But not every electrical signal is an algorithm.
     
  8. someguy1 Registered Senior Member

    Messages:
    727
    Then you are calling a fish a brick and expecting me to take you seriously. If you are using the word algorithm to describe a process that is not an algorithm, you might as well call fish bricks and then demand that I agree with you.

    Yes, I've agreed to that many times. If I execute the Euclidean algorithm in my head, my brain is executing an algorithm. How many times have I already made this exact point using this exact example by now? Three or four times at least.
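
    For concreteness, here is a minimal sketch of the algorithm I mean (Python, purely for illustration -- any formulation of Euclid's method would do):

        def gcd(a, b):
            # Euclid's algorithm: repeatedly replace the pair (a, b)
            # with (b, a mod b) until the remainder is zero.
            while b != 0:
                a, b = b, a % b
            return a

        print(gcd(48, 18))  # prints 6 -- the same steps can be carried out mentally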

    You're calling fish bricks. That does not make sense. You are using the word algorithm incorrectly.
     
  9. someguy1 Registered Senior Member

    Messages:
    727
    You claimed that classical physics was reducible to propositional logic. I pointed out that this is factually not the case. You agreed with my point. That's as far as this goes. The same problem applies to any attempted digitalization of a continuous phenomenon. The accumulated rounding errors will eventually throw the model wildly off the mark. And the same problem applies even to continuous approximations of continuous phenomena. Once you have an approximation, the tiny errors accumulate.
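
    A toy illustration of that accumulation (my own example in Python, purely for illustration, not anything from this thread): iterate the same simple nonlinear map in single and double precision and watch the two versions drift apart.

        import numpy as np

        # The rounding error at each step is tiny, but it compounds until the
        # two "simulations" of the same continuous process disagree completely.
        x32 = np.float32(0.1)
        x64 = 0.1
        for step in range(1, 61):
            x32 = np.float32(4.0) * x32 * (np.float32(1.0) - x32)
            x64 = 4.0 * x64 * (1.0 - x64)
            if step % 20 == 0:
                print(f"step {step}: float32={float(x32):.6f}  float64={x64:.6f}")
        # Within a few dozen steps the two trajectories bear no resemblance to each other.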

    Well, you just said that you are only talking about transference, and now you're willing to set it aside.

    I really think my participation in this thread has reached the point of diminishing returns. Nobody's said anything new in quite some time.

    Which side of which discussion? I'm serious. I no longer know what's being argued. I don't believe the mind is an algorithm executing in the brain. That's my point and my only point.

    Diminishing returns again. Now you're claiming that a bird could fly like an airplane. Not even Icarus could pull off that trick. If we replaced every molecule of a bird with a computer chip, would the bird fly? Not bloody likely IMO. You'd be missing all the wetware that makes living things function.
     
    Last edited: Jan 20, 2018
  10. Write4U Valued Senior Member

    Messages:
    20,069
    Again, for some mysterious reason you are assigning quotes to the wrong poster. Must be something wrong with your algorithm.
     
  11. someguy1 Registered Senior Member

    Messages:
    727
    I'm sending my head in for repair right now!
     
  12. Write4U Valued Senior Member

    Messages:
    20,069
    Again, a completely unrelated word to the discussion at hand.
    Your confused algorithms make it very difficult to have an intelligent conversation. You don't even know "who" you are addressing.
     
  13. Write4U Valued Senior Member

    Messages:
    20,069
    http://www.popularmechanics.com/science/animals/a21614/frigatebird-study-how-birds-fly/
    You mean that its brain does not process the mathematics of aerodynamics and lift?

    If this is all "woo" to you, I would suggest you study how the brain functions, instead of clinging to the artificial neural networks of computers.
     
    Last edited: Jan 20, 2018
  14. Write4U Valued Senior Member

    Messages:
    20,069
    Let's resolve this:
    https://www.psychologytoday.com/blog/the-superhuman-mind/201211/is-the-brain-computer

    Notice: no woo or foozle!

    Oh, found another link:
    https://www.quora.com/What-kind-of-...eing-developed-today-for-machine-intelligence

    Oh, and here is another one:
    https://www.ansatt.hig.no/suley/Publications/Algorithm_IC-AI06-SY-RB_CamReady.pdf

    Need more?
     
    Last edited: Jan 21, 2018
  15. someguy1 Registered Senior Member

    Messages:
    727
    I did look at your links. The pdf said that SOME brain functions are algorithms. This is a point I've agreed with several times, and even gave a specific example of executing the Euclidean algorithm in your head.

    Why would you post a link that makes a point I've already agreed to several times?

    Either you don't know the difference between SOME and ALL, or you didn't read the article, or you hoped I wouldn't, or you're just not debating in good faith. It takes me time to click on a link and read enough of it to determine that it doesn't support the point you think you are trying to make. I won't be clicking any more of your links. Every link you've shown me so far fails to make your point and in this latest case, makes a point I've already agreed to repeatedly. I've now gone through three of your links and it's been completely unproductive because your links don't support your points or else they make points I've already agreed to.

    The Psychology Today article claimed the mind is a computer but "not the traditional kind." Well, the problem is that as far as we know, there is no other kind. It's the Church-Turing thesis. The author doesn't know enough to be writing about computers. It's Psychology Today, not a research journal. He says brains "process information," but the fact is that information scientists have a definition of information, and that's not it. Again, it's the fallacy of equivocation: using the same word with different meanings within the same argument.
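
    For reference, the standard technical sense is Shannon's: information is a quantity measured in bits over a probability distribution. A minimal sketch, purely for illustration:

        import math

        def shannon_information(probs):
            # Shannon entropy in bits: H = -sum(p * log2(p)) over the outcomes.
            return -sum(p * math.log2(p) for p in probs if p > 0)

        print(shannon_information([0.5, 0.5]))  # 1.0 bit: a fair coin flip
        print(shannon_information([0.9, 0.1]))  # ~0.47 bits: a biased coin tells you less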

    There's no point in your Googling articles and posting them. Not without a critical eye to extract what's valuable and see where their arguments are lacking. You seem to read popularized articles and accept them at face value. None of us should ever do that. We should try to see where the argument is weak, where the author hasn't shown what he claims, and especially where he's conflated the popular meaning of a word (algorithm, computation, information) with its technical meaning. You can't get your worldview from Psychology Today. That's like getting your politics from cable TV. It's junk information.

    Rather than making a remark about intelligent conversation, you might go through my posts and use them as a study guide for you to come up to speed on everything you don't know about this subject. Your arguments will get sharper. And other people will start sounding more intelligent because you'll know what they're talking about.

    I have nothing else to add. I'm actually getting tired of making the same points over and over. I don't feel that I've said anything new in quite some time. I'm pretty much done here. Thanks for the chat.
     
    Last edited: Jan 21, 2018
  16. Write4U Valued Senior Member

    Messages:
    20,069
    I never claimed that a brain only uses algorithms or logarithms! You just assumed I did.
    But it is some kind of collection of mathematical "rithms", an "efficient" logical way to process information.

    Fundamentally we are in agreement, but we are examining the OP question from different perspectives....

     
    Last edited: Jan 21, 2018
  17. someguy1 Registered Senior Member

    Messages:
    727
    How did logarithms get into this?

    Gödel showed that mathematical truth can't be determined solely by an algorithm. This is a very strong argument that the "mind is a computation" advocates have to respond to. You can't say math is an algorithm because it's not. Roger Penrose thinks this might be the key to consciousness, that the brain can take that Gödelian leap past computation. Nobody takes this particular argument of his seriously, but he's still Sir Roger so you can't dismiss the idea completely.

    I've been reading more of the Psychology Today article. It's interesting; there's a lot of good stuff in there. You know what would make your links more effective for me? If you said, "Here's a link. The whole article's great and you should read every word. But the main thing I want you to see is in paragraph 7 where they say XYZ, and this supports my points that A, B, and C." Give me a guidepost so I don't get annoyed having to read a long article, some of which is very interesting and some of which is just wrong, some of which supports your point and some of which refutes it. I can't tell which part you want to draw my attention to or why you think it's important to our conversation.

    Yes we're in agreement that mind is part algorithm and part "something else," which needs a word. Woo by the way refers to new-age stuff. Like if you meet someone and they're into crystal healing and the astral plane, that's woo. As in "woo-woo." So there's some woo-stuff going on in the brain that implements mind. What we disagree on is that you want to call that woo-stuff a computation, but you are not making a case that it's so and none of your links have made that case.
     
    Last edited: Jan 21, 2018
  18. Write4U Valued Senior Member

    Messages:
    20,069
    And what is my case?
    So you are making a case for "woo-stuff"? Good luck with that...

     
    Last edited: Jan 22, 2018
  19. Write4U Valued Senior Member

    Messages:
    20,069
    My case is that the brain is a computational engine (by any other name).

    Your case is that the brain is not a computational engine but uses "woo-stuff" to make decisions.

    Now who is making more sense?

    P.S. "Making sense" = "understanding", which is a result of greater abstract thinking.
    IMO, this advanced thinking began with a gene mutation, which clearly shows that an evolutionary event was responsible for the split of Homo sapiens from other hominids.
    http://www.evolutionpages.com/chromosome_2.htm
     
    Last edited: Jan 22, 2018
  20. someguy1 Registered Senior Member

    Messages:
    727
    Yes it's all about what that "other stuff" is.

    If we say the "other stuff" doesn't exist at all, then everything's a computation in the technical sense -- a Turing machine. A lot of smart and famous people believe that.

    If we say that the "other stuff" is non-physical, then we are Cartesian dualists. We believe in a spiritual or metaphysical realm wherein many things may dwell. This would be the literal definition of woo-stuff. That it's woo.

    I am not a dualist.

    I am perfectly willing to agree that the "other stuff" is physical. I do NOT want to depend on a supernatural explanation.

    Now, if the "other stuff" is not computational, as that term is currently understood, what could it be?

    * It could be that it is some mode of computation we have not yet discovered. That there are meanings of "computation" that can be rigorously defined and that go beyond what Turing machines can do, and that can be instantiated in the world. Nobody has found one for eighty years but that is not to say someone won't find one tomorrow morning.

    [Note: Quantum computers can perform dramatically better on certain specialized problems. For example, a quantum computer can factor an arbitrary integer in polynomial time. That's an astonishing result. Nevertheless, a quantum computer cannot compute anything a TM can't. That's been proven. The TM would just run slower.]

    * It could be that some new principle of physics will provide the answer. History shows that we have periodic revolutions in physics, and there's no reason to think the next one won't give us some new insight into the nature of our world. In fact I tend to think that the breakthrough in computation I mentioned above would necessarily require a breakthrough in physics to make it possible. New physics arrives, and some new mode of computation becomes physically possible. That's the breakthrough we're waiting for.

    * It may be that we haven't got the ability to comprehend the nature of the world. Why should this be surprising? If we're an ant on a leaf on a tree in a jungle, we may be a brilliant ant scientist but we simply are not biologically equipped to comprehend the many levels of existence beyond our leaf.

    Why should we humans imagine ourselves to be the culmination of evolution, the universe's very means for understanding itself? Maybe we're just an ant on a leaf on a tree somewhere.

    Remember that the intellectual history of humanity is based on successive discoveries that we're not special. We're not the center of the heavens. We're not the center of the solar system. We're not the center of the galaxy or of the universe. We're not separate from the animals; we are one of them, a member of the great order of primates. In the US, where the government is currently shut down, you may turn on the news and find evidence of chest-beating and dominance rituals common to all the great apes. Our wise leaders wear suits; that's about the main difference.

    If we're not special snowflakes (LOL!) then perhaps we are NOT the one creature in creation capable of understanding all of creation.

    I've run across the name for this idea recently. It's called Mysterianism.

    https://en.wikipedia.org/wiki/New_mysterianism

    I can go with that! It's what I believe: either we need revolutionary breakthroughs in physics and computer science, or else we're not actually built by nature to understand the universe or ourselves.
     
    Last edited: Jan 22, 2018
  21. Write4U Valued Senior Member

    Messages:
    20,069
    I have no quarrel with that, but note that you used the term "some mode of computation" which is as yet undiscovered. It would still be a mode of computation, though it may not strictly answer to the Turing type.

    Why you insist on using the Turing type of computation as the standard by which all other forms of computation should be judged is a mystery to me.

    We are talking about the same thing, but because we used different perspectives, it seems we have been talking past each other, while trying to make the same point.

    IMO, every brain neuron (every neuron) has an inherent computational ability. Now we may get into the differences between afferent neurons and efferent neurons, but either way there is always a translation between information received and information forwarded. This is a computational ability, IMO.
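
    Roughly the picture I have in mind -- a toy sketch of the textbook artificial-neuron abstraction (Python, for illustration only; real neurons are of course far messier):

        # Weighted inputs are summed and passed through a threshold.
        def artificial_neuron(inputs, weights, bias):
            activation = sum(i * w for i, w in zip(inputs, weights)) + bias
            return 1.0 if activation > 0 else 0.0  # fire / don't fire

        # "information received" -> "information forwarded"
        print(artificial_neuron([1.0, 0.2, 0.7], weights=[0.5, -1.0, 0.8], bias=-0.4))  # 1.0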
     
  22. Write4U Valued Senior Member

    Messages:
    20,069
    Well, I call this the "mirror neuron system":
    http://www.apa.org/monitor/oct05/mirror.aspx
     
  23. billvon Valued Senior Member

    Messages:
    21,634
    There are levels of computation.

    For example, it's pretty safe to say that if we did an atomic-level simulation of a human brain on a computer, the result would be almost identical (to the degree that the simulation is accurate) to what a human brain does. No existing or planned computer can simulate something like that, nor do we have the tools to create the model to simulate. (The model exists, of course - we just can't easily convert it to computer data.)

    It is very likely that if we did a neuron-level simulation of a human brain on a computer, the result would be similar to what we see in a human brain, again depending on how accurate that level of abstraction is. Again, we don't have computers that can do that yet - but the neuron-level model (often called the connectome) is more easily discovered by existing tools (PET, fMRI, etc.).

    It is somewhat likely that if we did a function-level simulation of a human brain, the result would be somewhat similar to what we see in a human brain. In other words, model what a section of the brain does (say, the part that interprets motion in the visual cortex) and "run" it on a more conventional neural network, on a computer set up to do that easily. We do have computers that can do this now, but not for the whole brain - although we will in 5-10 years. How accurate it will be will depend on how close we come in terms of our assumptions on what that functionality is.

    So we have a scale of computation, from near-perfect simulation of a brain (almost impossibly hard) to functional level simulation (almost doable now.) It is likely these will be implemented on nonconventional computers (i.e. computers that are, or include, an NPU, and do not meet the classic definition of a Turing machine) - but they are deterministic, easily programmed and understandable machines, and do not require any sort of "quantum weirdness" or anything.
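
    For concreteness, here is a toy sketch of what "neuron-level" means in miniature (a leaky integrate-and-fire model in Python, invented for this post; a real connectome-scale model would be vastly larger and far more detailed):

        import random

        # A handful of leaky integrate-and-fire neurons with random connection
        # weights. This only illustrates the level of abstraction, nothing more.
        random.seed(0)
        N = 5                                   # number of neurons
        THRESHOLD, LEAK, DT = 1.0, 0.1, 1.0     # firing threshold, leak rate, time step
        weights = [[random.uniform(-0.5, 0.5) for _ in range(N)] for _ in range(N)]
        potential = [0.0] * N

        for t in range(20):
            fired = [v >= THRESHOLD for v in potential]
            for i in range(N):
                if fired[i]:
                    potential[i] = 0.0          # reset membrane potential after a spike
            external = [random.uniform(0.0, 0.3) for _ in range(N)]  # stand-in sensory input
            for i in range(N):
                synaptic = sum(weights[i][j] for j in range(N) if fired[j])
                potential[i] += DT * (-LEAK * potential[i] + synaptic + external[i])
            print(t, "".join("*" if f else "." for f in fired))

    Scaling that same loop from five neurons to tens of billions, with realistic connectivity, is the "almost impossibly hard" end of the spectrum I described above.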
     
