How does one start off a new blog and word it in such a way as to encapsulate over 30 years of work, in just a short paragraph? Well, first I would like to mention that the image to the left of this text is one lifted from a website called "Machines Like Us": http://machineslikeus.com/
A wonderful web zine on science news and other related topics. Please check it out if you have the chance. I'll probably end up deleting the image (I don't know if it's copyrighted)... but until told otherwise it will stay up. I have it there for no other reason than it appeals to me and depicts a fine impression of what this blog is about.
I will jump right in and explain what it is I'll be trying to accomplish here. As the need arises, we'll travel back in time now and then to go over the history of what brought me to a particular point in the development, theory, and reasoning behind my "Projects".
From personal experience I can state without a doubt that technology has taken many leaps and bounds over the last 30-plus years. When I "fell into" the world of robotics and computers, push-button phones were still relatively new, integrated circuits were just becoming commonplace, and talk of 'digital anything' was still mostly relegated to scientists, experimenters, and the stuff of science fiction. The Internet would not become a reality for at least another 18 years (and wouldn't really take hold for another 2 or 3 years after that). "Car phones" were few and far between, the property of only those in the higher tax brackets. Video games were still in the "Pong" stage, though progress was being made, driven purely by market anticipation of a new generation of the video and arcade gaming industry. And of course, the home computer was still at least a decade away.
We've come a long way from those days... and yet, I wonder if we really have. Essentially, all we have really done is make existing technology smaller and faster. Millions of transistors populate common microprocessors in everyday applications, whereas 30 years ago the same silicon real estate would have held perhaps a few thousand. The popular 8085 CPU, introduced in 1976, had about 6,500 transistors embedded in its substrate; compare that to the Intel Xeon CPU introduced around 2009, which contains about 781 million transistors. Comparing these two processors' clock speeds, you'll find roughly a 1,000-fold difference in operating speed: the 8085 typically ran at about 3 MHz, while the Xeon operates at between 1.8 and 3 GHz. Quite literally, about a thousand times faster.
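The back-of-the-envelope comparison above can be checked with a few lines of Python (the figures are simply the ones quoted in this post, not authoritative datasheet values):

```python
# Rough comparison of the 8085 (1976) and a 2009-era Intel Xeon,
# using the transistor counts and clock speeds quoted above.
transistors_8085 = 6_500
transistors_xeon = 781_000_000

clock_8085_hz = 3_000_000        # ~3 MHz
clock_xeon_hz = 3_000_000_000    # ~3 GHz (top of the 1.8-3 GHz range)

transistor_ratio = transistors_xeon / transistors_8085
clock_ratio = clock_xeon_hz / clock_8085_hz

print(f"Transistor count: ~{transistor_ratio:,.0f}x more")   # ~120,154x
print(f"Clock speed:      ~{clock_ratio:,.0f}x faster")      # ~1,000x
```

Interestingly, the transistor count grew by a factor of over a hundred thousand while the clock speed grew by "only" a thousand, which is exactly the smaller-and-denser trend described here.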
But here's the point I'm trying to make: there hasn't been any real progress in terms of the actual technology that we use. Smaller, faster, newer, and more efficient materials are the reason all of today's devices can do what they do, as well as they do it. The electrical components themselves, however (transistors, resistors, and capacitors among others), have changed very little in terms of their operation. A transistor is still basically a switch or an amplifier, a resistor still controls voltage and current, and a capacitor still stores and discharges electrical charge.
There is no paradigm-shifting technology that has changed the way anything actually works. For example, there is no real quantum computing or faster-than-light communication (also a quantum property). We are still using bits and bytes stored either on movable media or in solid-state devices such as DRAM and USB flash drives. When I sit down to design or build a new "creation", I pull parts out of my parts bin that are as relevant today as they were 30 years ago, just not as small. This becomes painfully obvious when I'm searching for a new way to fix an old problem, or to design a feature into a circuit board that has just been done to death. Multiplexing comes to mind, and I hate multiplexing (the act of running several signals over a single wire without interference from each other). There is nothing in the brain that really uses multiplexing.
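To make the multiplexing idea concrete, here is a toy time-division multiplexer in Python: samples from several signals are interleaved onto one "wire" and then split back out. The sensor names and frame layout are invented purely for illustration; real hardware multiplexing adds timing, framing, and error handling on top of this basic idea.

```python
# Toy time-division multiplexing (TDM): interleave samples from several
# signals onto one stream ("wire"), then demultiplex them back out.

def multiplex(channels):
    """Interleave equal-length channels into a single sample stream."""
    return [sample for frame in zip(*channels) for sample in frame]

def demultiplex(stream, n_channels):
    """Split an interleaved stream back into its original channels."""
    return [stream[i::n_channels] for i in range(n_channels)]

sensor_a = [1, 2, 3]      # hypothetical signal 1
sensor_b = [10, 20, 30]   # hypothetical signal 2

wire = multiplex([sensor_a, sensor_b])
print(wire)                   # [1, 10, 2, 20, 3, 30]
print(demultiplex(wire, 2))   # [[1, 2, 3], [10, 20, 30]]
```

Both ends must agree on the channel count and ordering, which hints at why multiplexing circuits get tedious to design: the "interference" problem is solved by strict timing discipline rather than by the signals themselves.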
However, with that said, we seem to be getting closer to the time when silicon and space constraints will no longer be an issue for computing, memory storage, or communications.
One device on the horizon is something called a memristor. Still in the experimental stages, it promises to open up a whole new way of storing memory and aiding computation. You can read about it here: http://www.bbc.co.uk/news/technology-11165087
What does all this have to do with AI and AL (Artificial Life)? Besides the obvious reason of designing faster computers with which to duplicate "human thinking" (more on that in a bit), I think it's a rather interesting analogy for the way science and society have viewed artificial intelligence, or any means of replicating the human brain's processes and, consequently, thinking in general. While advances have been made in theoretical approaches to AI, as well as some very interesting practical applications, we still seem to be no closer to real AI than we were 30 or even 50 years ago... depending on who you listen to.
There are pockets of researchers around the world who have taken several steps in the right direction; those who have dared to challenge the status quo and risk ridicule from their peers as well as the rest of the world.
Some of these people are academics, some are private and commercial researchers, and some are just plain ol' garage experimenters... and the ideas they are coming up with are astounding.
This blog is my journey through theories, experiments, and the inevitable failures and successes. All who work in this fascinating field, be they professionals, hackers, or hobbyists, contribute to the collective mind in the search for machine intelligence.
Become part of this by subscribing to my Blog site. I promise an interesting ride.
Stay tuned for Part 2 of this Introduction.