"I don't believe that computers will ever duplicate some of the processes of our minds." These are famous last words, in my opinion. It's just the latest in a series of defeatist attitudes taken by someone who cannot himself figure out how to do the task at hand. Never heard any of them? Let me enlighten you. Galileo was told he was mad for suggesting that the Earth was not the center of the universe; today we accept that as a given. Columbus was told he would fall off the edge of the world if he tried to sail west, and we know he was right. Thomas Alva Edison warned people about the dangers of alternating current and built an electric chair to prove it. People once thought that computers required rooms of circuits and therefore no one would ever have a computer in the home, and here we are. Many scoffed at Jules Verne for suggesting a nuclear-powered submarine or travel to the moon, and the United States built the sub and launched men to the moon and brought them home safely. People also said we would never see gas go above $1 per gallon, and boy did we shoot that one in the backside.
One thing I have learned is never to say never. If you stay pessimistic, you will discourage tomorrow's brightest young minds from trying to do anything new. Isn't that what we want anyhow? Someone to try something new? Mary Shelley warned us about creating life from dead people, and yet we do organ transplants daily. How many times did your mother say "Don't do that! You're gonna break your neck?" Most of us did not break our necks - nor did we all listen to our mothers. Sometimes people just want to live in a perfect world where everything stays the same and nobody wants to change anything. Not me. I want to change what I can while still preserving what must be preserved. That is our legacy as human beings. We push at what people tell us we can't do until we find a way to do it, and then we wonder if we should have been more careful.
I don't find it difficult to believe that a computer can be programmed to beat us at our own games. A computer can be programmed to "brute force" problems that are well defined, such as games.
For example, take the game of chess. A computer can "look ahead" through millions more moves than any human can, which allows it to win through simple brute force. Ironically, it took many years before any computer could routinely beat the best human players. Now that computers are powerful enough to work through an astronomical number of permutations, they are all but impossible for a human to beat.
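That brute-force "look ahead" can be sketched in a few lines. Below is a minimal minimax search over a toy game tree; the tree and its scores are invented for illustration, and a real chess engine adds move generation, evaluation functions, and pruning on top of this skeleton:

```python
def minimax(node, maximizing):
    """Brute-force search: explore every branch all the way to the leaves."""
    if isinstance(node, (int, float)):        # a leaf: a scored end position
        return node
    values = [minimax(child, not maximizing) for child in node]
    # The side to move picks its best option; the opponent picks its worst
    return max(values) if maximizing else min(values)

# A tiny game tree: each inner list is a choice point, leaves are scores.
# The maximizer moves first, then the minimizer replies.
tree = [[3, 12], [2, 4], [14, 5]]
best = minimax(tree, True)   # minimizer holds each branch to 3, 2, 5 -> pick 5
```

The point is that nothing here "understands" the game; the machine simply visits every position, which is exactly why speed alone is enough to win at well-defined games.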
However, a computer can't really figure anything out, or come up with a single original thought (something the average 2-year-old human can do). It can only close in on a solution by following a script that some human put into it.
If you had access to all of the information on the internet, and the ability to look it up nearly instantaneously, you could probably beat anybody at Jeopardy. That is essentially what IBM's Watson did (another "brute force" solution). The computer is not really "thinking" or coming up with any original thought. It is just interpreting questions and regurgitating factoids from a huge database.
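A toy sketch of that match-and-regurgitate idea, assuming nothing but keyword overlap against a made-up factoid table (Watson's real pipeline was vastly more sophisticated, but the spirit is the same: retrieve, don't understand):

```python
# Invented factoid "database" for illustration only
FACTS = {
    "Jupiter": "the largest planet in the solar system",
    "Mount Everest": "the tallest mountain on Earth",
    "Blue whale": "the largest animal known to have lived",
}

def answer(question):
    """Return the stored entry sharing the most words with the question."""
    words = set(question.lower().replace("?", "").split())
    def overlap(item):
        key, fact = item
        return len(words & set((key + " " + fact).lower().split()))
    best_key, _ = max(FACTS.items(), key=overlap)
    return best_key

print(answer("What is the largest planet in the solar system?"))  # Jupiter
```

No comprehension is involved; scale the table up to the internet and the lookup speed up to a data center, and you get the "brute force" trivia player described above.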
Computers are simply very fast and have access to enormous amounts of data. Our minds will never duplicate those attributes.
But, I don't believe that computers will ever duplicate some of the processes of our minds. Thus, I don't believe that computers beat us "in every aspect of the word".
Thanks, ervin0072002, and well said. In a similar vein, robots don't think or feel or do anything else; they're machines, programmed by humans to do whatever it is they're doing. They only do it well (or not) because of those humans' programming, and because of the humans who designed the sensors and other components they use to do it.
I worked for a guy who developed technology that could measure the power developed at a single neuron's synapse. This, multiplied by the number of neurons, gives the peak output, which he stated was around 20 W.
Interesting thought, but it discounts the chance that we run out of energy or destroy ourselves before we finally understand how thinking actually works.
Is it possible for someone to understand their thinking processes to sufficient detail to be able to program a computer with intelligence?
I think so.
Will we as a species survive long enough to achieve this?
I don't know.
A third option is that we build a quantum computer modelled on synapses and the like, hit on the magic formula of our brains, and the thing begins to think for itself while we're left none the wiser as to how thinking works. We may have to ask it (if it wants to tell us). Watch "Colossus: The Forbin Project" (1970) to understand where I'm coming from.
I remember Bill Gates saying we will never need more than 640k and someone else said we would never fly, and the list goes on. I don't think we have yet reached the level of understanding necessary to say with absolute certainty that this or that will never happen in technology.
Hi GTOlover, the answer to your question of why our brain processes information in the unique ways it does is efficiency. The human brain dissipates about 20 W, compared to the 4 MW of a typical supercomputer of the type I think you are comparing it to. Supercomputers typically fill a large room, while our brains are smaller than a football.
Imagine how well we would have survived having to drag a mobile 4MW power plant while being so top heavy with a room on our shoulders :-)
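The efficiency gap is easy to put a number on, using the round figures above (both are rough, commonly quoted estimates, not measurements):

```python
# Rough comparison using the ballpark figures quoted above
brain_watts = 20            # ~20 W dissipated by a human brain
supercomputer_watts = 4e6   # ~4 MW for a room-filling supercomputer
ratio = supercomputer_watts / brain_watts
# The machine draws roughly 200,000 times more power than the brain
```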
The way our brain processes information is similar to that of all other life with brains, allowing a spider with a brain the size of the pointy end of a pin to weave intricate webs and go about its day-to-day routine.
And even though we like to think we have designed machines that are superior at our own games, there are areas where the best computers lag behind the best human minds: creativity and pattern recognition.
Sure, there are examples of quasi-creativity with computers, but the pure inspirational creativity people achieve (including, e.g., designing computers) is in a whole different league. A large part of this is the fact that even after years of research we still have only a very rudimentary idea of what creativity really is, especially creativity in regard to the unknown.
In regard to pattern recognition, consider meeting someone at the age of, say, 16 or 17 who is about the same age, and then for whatever reason parting ways. A human can be walking down a street in a completely unfamiliar area some 50 or more years later, see that person in the crowd with all the signs of aging (greying or dyed hair, the addition of glasses, new scars, etc.), and still recognise them without ever expecting to see them or actively looking for them, in a sea of faces, all within seconds. It also works for a long-lost familiar face in a sea of familiar faces. As far as I'm aware, the best computers running the fastest algorithms can only run through a list of faces and compare them sequentially, and they are not particularly tolerant of changes. A person, having seen thousands to tens of thousands of faces over a lifetime, will know in an instant which face is new and which is familiar.
Similar things are at play when you hear a familiar tune after years and know its name after three bars.
Obviously not everyone can pull this off, just as not everyone is creative, but comparing best against best, I don't see our pattern recognition abilities being outdone until quantum computers have been played with for a while.
The main reason we excel in these areas is that the human brain is effectively around 100 billion processors running in parallel at around 200 Hz, and according to the University of Alberta, due to the way the neurons are wired, the brain performs roughly 20 million billion calculations per second. Of course this isn't the full story, because neurons are analogue computers with maybe 1,000 or so levels of sensitivity, which magnifies the potential again; then again, not all of the brain is involved in our thinking processes. Suffice to say its processing power (for the right type of problem) is beyond the realm of current (non-quantum) computers.
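The arithmetic behind that "20 million billion" figure is straightforward if you assume each neuron connects to on the order of 1,000 others (all inputs here are the rough, commonly quoted estimates, not precise measurements):

```python
# Back-of-the-envelope check of the brain figures quoted above
neurons = 100e9                # ~100 billion neurons
connections_per_neuron = 1000  # order-of-magnitude synapse count estimate
rate_hz = 200                  # ~200 firings per second

ops_per_sec = neurons * connections_per_neuron * rate_hz
# 2e16, i.e. 20 million billion "calculations" per second
```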
Nevertheless, I certainly have a healthy respect for the designers of large computing arrays such as NASA's, and for their brainchildren, as they are an extension of the minds of researchers and engineers the world over and have revolutionised life in affluent countries by letting people concentrate on the aspects of problems better handled by the little grey cells.
I don't think so... Computers will never, as you say it, beat us... They may do some operations faster than us, in higher quantity than us, and even with fewer mistakes. However, I guarantee you the most powerful computer in the world is only as smart as the collection of people that programmed it. If you went through an automata theory class you would know that a computer cannot write code in any language. It can cut and paste code that was already prewritten for it, but that is all. Everything you see on that screen was calculated by functions that the engineering team inserted in the simulation. If anyone should impress you, it's not the supercomputer... it's the people that worked for days on end to make it happen; the supercomputer is just an aid. As for evolution, "what can happen, will happen" pretty much sums it up. Yes, millions of molecules can gather together and form a single-cell organism. We know this can happen because it has happened already. The forces that caused this are a matter of religion:
Atheist: Chaos theory
Agnostic (me): Can't prove, don't care, thankful to be here. Bye...
Yes, it's interesting how machines that humans themselves programmed beat us at our own games "in every aspect of the word," GTOlover. This is quite an interesting thought to ponder, although I wouldn't begin to have a scientific answer for it. I suppose religious people (one of whom I am not) would say it has something to do with the soul, or even non-religious people might say it has to do with our emotions (although those can be scientifically shown to have something to do with how our brains work). For now, I suppose, it remains a mystery and a subject of debate!