Tuesday, May 30, 2006

TechEBlog » Top 10 Strangest Gadgets of the Future

Cool stuff! But some of it makes me say, “Why?”

TechEBlog » Top 10 Strangest Gadgets of the Future.

Sunday, May 28, 2006

3DSeek from Imaginestics

Search for 3D objects by sketching. Pretty cool! (Thanks, Nathanael Miller!)

3DSeek.

Wednesday, May 24, 2006

MCAD Industry View - A May 2006 Update

My friend Russ Henke and his colleague Jack Horgan provide not only a careful analysis of the MCAD industry, but also a far-ranging view of the global economic context, with strong opinions. A fascinating read!

MCAD Industry View - A May 2006 Update

Monday, May 22, 2006

Popular Mechanics - Report on the 2006 IEEE Robotics conference

Fascinating brief update, with links.

A Meeting Of The Metal Minds

ORLANDO, Fla. — Robots vacuum our homes, search for landmines, perform surgery, and explore Mars. They’ve been taught to play chess, arm wrestle, and ballroom dance. For all of this service and goodwill toward men, robots deserve credit, but fictional and cinematic slams are the norm. You know the plot: Machines rebel, humanity is enslaved, and Asimov rolls in his grave.

One hears of no such silliness at the 2006 IEEE International Conference on Robotics and Automation (ICRA), held this week at the Walt Disneyworld Hilton in Orlando, Florida. One of the largest conferences of its type, ICRA attracts more than a thousand leading roboticists from North America, Europe, Japan, Japan, and Japan. This year’s theme is “Humanitarian Robotics.” If you’re willing to brave presentations with titles such as “Force Tracking Control for Constrained Robot with Uncertainties”—and can stomach industrial-grade linear algebra before you’ve finished your morning coffee—this is the place for learning the latest on robotics. ICRA draws the scientists, programmers, and engineers who really have their heads under the hoods. Or torsos. Or . . . you get the idea.

Popular Mechanics - A Meeting Of The Metal Minds.

Thursday, May 18, 2006

Two interesting blogs by Bob

Zee News - India to embark upon robotics, remote tech weapons: PM

Clearly this is a big and important trend in warfare.

Zee News - India to embark upon robotics, remote tech weapons: PM.

Thursday, May 11, 2006

Digital Chosunilbo (English Edition) : Daily News in English About Korea

Korea Unveils World's Second Android

Korea has developed its own android capable of facial expressions on its humanoid face, the second such machine to be developed after one from Japan. The Ministry of Commerce, Industry and Energy invited some 60 children to the Kyoyuk Munhwa Hoekwan in Seoul to introduce Ever-1 to the public. The name combines Eve, the first woman in the Bible, with the "r" in robot.

The Korea Institute of Industrial Technology (KITECH) said the android, which has the face and body of a woman in her 20s, is 160 cm tall and weighs 50 kg. Ever-1 can move its upper body and “express” happiness, anger, sadness and pleasure.

Digital Chosunilbo (English Edition) : Daily News in English About Korea.

Monday, May 08, 2006

Nanotechnology: Economics

Economic Impact of the Personal Nanofactory
by Robert A. Freitas Jr.

Deflationary forces resulting from mass availability of desktop personal nanofactories can be opposed by inflationary forces competently initiated by governmental monetary authorities.


Originally published in Nanotechnology Perceptions: A Review of Ultraprecision Engineering and Nanotechnology, Volume 2, No. 2, May 8, 2006. Reprinted with permission on KurzweilAI.net, May 8, 2006.

Is the advent of, and mass availability of, desktop personal nanofactories (PNs) [1] likely to cause deflation (a persistent decline in the general prices of goods and services), inflation (a persistent general price increase), or neither?
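As a back-of-the-envelope illustration (mine, not Freitas's), the textbook quantity equation P = M·V / Q captures the tension the article explores: cheap nanofactory production raises real output Q and pushes prices down, while monetary authorities can push back by expanding the money supply M. The numbers below are invented.

    # Rough sketch using the standard quantity equation P = M*V / Q.
    # Invented figures; this is not from the article.
    def price_level(money_supply, velocity, real_output):
        return money_supply * velocity / real_output

    baseline = price_level(100.0, 2.0, 200.0)   # P = 1.00
    # Personal nanofactories double real output: deflationary pressure.
    post_pn = price_level(100.0, 2.0, 400.0)    # P = 0.50
    # Monetary authorities expand the money supply to offset it.
    offset = price_level(200.0, 2.0, 400.0)     # P = 1.00 again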

KurzweilAI.net.

Wednesday, May 03, 2006

IST Results - BabyBot takes first steps

BabyBot takes first steps
BabyBot, a robot modelled on the torso of a two-year-old child, is helping researchers take the first, tottering steps towards understanding human perception, and could lead to the development of machines that can perceive and interact with their environment.

The researchers used BabyBot to test a model of the human sense of 'presence', a combination of senses like sight, hearing and touch. The work could have enormous applications in robotics, artificial intelligence (AI) and machine perception. The research is being funded under the European Commission’s FET (Future and Emerging Technologies) initiative of the IST programme, as part of the ADAPT project.

"Our sense of presence is essentially our consciousness," says Giorgio Metta, Assistant Professor at the Laboratory for Integrated Advanced Robotics at Italy's Genoa University and ADAPT project coordinator.

Imagine a glorious day lying on a beach drinking a piña colada, or any other powerful, pleasurable memory. A series of specific sensory inputs is essential to that memory.

In the human mind all these sensations combine powerfully to create the total experience. It profoundly influences our future expectations, and each time we go to a beach we add to the store of contexts, situations and conditions. It is the combination of all these inputs and their cumulative power that the ADAPT researchers sought to explore.
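A toy way to picture that combination (my sketch, not the ADAPT model itself) is to fuse several noisy sensory readings of the same quantity, weighting each modality by how reliable it is:

    # Hypothetical example: fuse vision, hearing and touch estimates of an
    # object's distance by inverse-variance weighting. Not ADAPT's code.
    def fuse(estimates):
        """estimates: list of (value, variance) pairs, one per sense."""
        weights = [1.0 / var for _, var in estimates]
        weighted_sum = sum(w * val for w, (val, _) in zip(weights, estimates))
        return weighted_sum / sum(weights)

    # Vision is sharp, hearing is vague, touch is in between (metres).
    distance = fuse([(0.95, 0.01), (1.10, 0.09), (1.00, 0.04)])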

Engineering consciousness
"We took an engineering approach to the problem, it was really consciousness for engineers," says Metta, "Which means we first developed a model and then we sought to test this model by, in this case, developing a robot to conform to it."

Modelling, or defining, consciousness remains one of the intractable problems of both science and philosophy. "The problem is duality: where does the brain end and the mind begin? The question is whether we need to consider them as two different aspects of reality," says Metta.

Neuroscientists would tend to develop theories that fit the observed phenomena, but engineers take a practical approach. Their objective is to make it work.

Called the synthetic methodology, it is essentially a method of understanding by building. There are three steps: model aspects of a biological system; abstract general principles of intelligent behaviour from the model; apply these principles to the design of intelligent robots. Model, test, refine. And then repeat.
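A minimal sketch of that loop (placeholder callables, not the project's actual pipeline) might look like this:

    # Model, test, refine, repeat: the stage functions are stand-ins.
    def synthetic_methodology(model, build_robot, run_experiments, refine, iterations=3):
        for _ in range(iterations):
            robot = build_robot(model)        # embody the model's principles in a robot
            results = run_experiments(robot)  # test the robot's behaviour
            model = refine(model, results)    # refine the model and go around again
        return model

    # Toy usage with trivial stand-ins.
    final = synthetic_methodology({"gain": 0.5},
                                  build_robot=lambda m: dict(m),
                                  run_experiments=lambda r: {"error": 0.1},
                                  refine=lambda m, res: {"gain": m["gain"] + res["error"]})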
How self-perception emerges during the early stages of human development

To that end, ADAPT first studied how the perception of self in the environment emerges during the early stages of human development. Developmental psychologists tested 6- to 18-month-old infants. "We could control a lot of the parameters to see how young children perceive and interact with the world around them. What they do when interacting with their mothers or strangers, what they see, the objects they interact with, for example," says Metta.

From this work they developed a 'process' model of consciousness. This assumes that objects in the environment are not real physical objects as such; rather they are part of a process of perception.

The practical upshot is that, while other models describe consciousness as perception, then cognition, then action, the ADAPT model sees it as action, then cognition, then perception. And that is how babies act, too.

When a baby sees an object, that sight is not yet the final perception of it. The child will try to reach for the object; if it fails, the object must be too far away. This is how the child learns perspective.
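A toy sketch of that action-first loop (entirely hypothetical names and numbers) might look like this: the child acts, and the outcome of the action, success or failure, is what updates its perception of distance.

    # Hypothetical illustration of action -> cognition -> perception:
    # each failed reach pushes the distance estimate further out.
    ARM_LENGTH = 0.6  # metres; an invented reach limit

    def reach_and_learn(true_distance, estimated_distance, step=0.1, attempts=10):
        for _ in range(attempts):
            if true_distance <= ARM_LENGTH:      # act: the grasp succeeds
                return estimated_distance, True  # success confirms the estimate
            estimated_distance += step           # failure: it must be farther away
        return estimated_distance, False         # gave up; learned it is out of reach

    estimate, grasped = reach_and_learn(true_distance=1.2, estimated_distance=0.5)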

IST Results - BabyBot takes first steps.