Many brains work better than one

U0303270 Quak Yeok Teck

Perhaps the concept of robots still lingers in the humanoid form in most of our minds, but robotics has been aiding the manufacturing industry for many decades now. The most common form of industrial robot is the mechanical arm, which aids in the precision fabrication of products as well as welding.

One industrial robotics supplier is ABB Robotics, which recently created a multi-robot arc welding system. It was demonstrated at MACH2006 with a fully operational 'MultiMove' arc welding cell. This showcases ABB's IRC5 control software, MultiMove, which allows up to four robots to work in fully co-ordinated operation. What does this mean for automation? Imagine a typical automotive operation: one robot lifts and holds a car door, a second picks and locates a hinge, and a third welds the hinge in place.

This ability is made possible by the processing power of the IRC5 control module computer, which can perform the calculations for up to 36 servo axes while directing up to four drive modules.
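The co-ordination described above can be sketched in a few lines of code. This is a toy illustration of lockstep multi-arm motion, not ABB's actual IRC5 API: the class and function names are invented, and a real controller would interpolate joint trajectories rather than jump between waypoints.

```python
# Hypothetical sketch of MultiMove-style co-ordination: one controller steps
# several arms through synchronized waypoints, and no arm advances until all
# arms have reached the current waypoint. Names are illustrative, not ABB's.

class Arm:
    def __init__(self, name, waypoints):
        self.name = name
        self.waypoints = waypoints  # list of joint-space targets
        self.position = waypoints[0]
        self.index = 0

    def at_target(self):
        return self.position == self.waypoints[self.index]

    def step(self):
        # Move straight to the current target (a real arm would interpolate).
        self.position = self.waypoints[self.index]

def run_coordinated(arms):
    """Advance all arms waypoint by waypoint, in lockstep."""
    n = len(arms[0].waypoints)
    assert all(len(a.waypoints) == n for a in arms), "trajectories must align"
    for i in range(n):
        for a in arms:
            a.index = i
            a.step()
        # Barrier: proceed only once every arm reports it has arrived.
        assert all(a.at_target() for a in arms)
    return [a.position for a in arms]
```

In the car-door example, the holder, hinge-placer and welder would each contribute one trajectory, and the barrier ensures the weld only happens once the door and hinge are in place.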

Such a system offers total freedom of motion and an optimum working position, while eliminating the need for extra jigs and the manual labour involved in mounting objects. In addition, because every robot knows what the others are working on, collisions can be reduced, production flow optimised and throughput increased.

Reference
http://www.manufacturingtalk.com/news/abd/abd174.html

Maintenance & Repairs in Space - Rangers

u0204781 Peh Meng Wee


"Hello International Space Station maintenance. What! Sector 4 has been hit by an asteroid! We'll send in the robot maintenance crew to take a look immediately."

Does the previous scenario entice you? Or does the image of R2-D2 repairing Luke Skywalker's fighter in "Star Wars" amaze you? Well, soon these scenes will no longer be science fiction but will be played out right above our heads, in space.

Since the beginning of time, humans have looked to the stars for guidance and dreamed of ascending to the heavens. However, hostile conditions in space have impeded mankind's development and knowledge in this area. The construction of the International Space Station (ISS) represents a leap in mankind's determination to explore and conquer the last unexplored frontier. However, following the Challenger disaster, there has been an upsurge of interest in using robots to replace humans in dangerous jobs like repair and maintenance in space. Unforeseen hazards like space debris and radiation make it dangerous for astronauts to work outside their ships or structures. Moreover, robots do not need to eat or sleep and will not tire, making them better suited to "living" in space.

After that lengthy introduction, I must introduce you to the Ranger, a space repair and maintenance robot. The Ranger system includes four manipulators: two 7-DOF bilateral dexterous manipulators (one a normal arm and one an engineering arm), a 6-DOF grappling manipulator for worksite stability, and a 5-DOF camera positioning manipulator to locate a pair of stereo video cameras. A second video camera on the vehicle centerline provides a stable visual reference for free-flight maneuvering and autonomous docking.
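Written out as a small data structure, the manipulator complement above makes it easy to see how much the controller must co-ordinate. The dictionary keys are descriptive names I have chosen, not NASA's designations.

```python
# The Ranger's manipulators and their degrees of freedom, as described above.
# Summing the DOF counts gives the total number of joint axes the control
# system must drive. Key names are illustrative, not official.
ranger_manipulators = {
    "dexterous_arm": 7,     # normal dexterous arm
    "engineering_arm": 7,   # second dexterous arm
    "grappling_arm": 6,     # holds the worksite for stability
    "camera_arm": 5,        # positions the stereo camera pair
}

total_axes = sum(ranger_manipulators.values())
```

With these figures, the four manipulators together give 25 joint axes.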

Unfortunately the Ranger system is still in development; only prototypes are available. However, the prototype has shown that it is capable of heavy tasks like opening hatch doors as well as menial tasks like tightening a bolt under neutral buoyancy. That it can do two such different kinds of tasks shows that the precision and control involved in the system are advanced and state-of-the-art. Those interested in the specifications of the Ranger prototypes can go here.

This short clip will simulate portions of the removal and replacement of a Hubble Space Telescope electronics control unit (ECU). In the clip, the Ranger engineering arm moves back to get the bare bolt drive which is mounted on a tool post. With the bare bolt drive attached to its wrist, the arm moves to one of the ECU keyway slot bolts. After turning the bolt, the arm moves away from the bolt and places the bare bolt drive back on a tool post. The parallel jaw mechanism is retrieved which has a set of "fingers" that fit around the tether loop. The "fingers" are closed around the tether loop as the video ends.

A lengthier video of the prototypes performing various tasks under buoyancy tests can be seen here.

However, sad to say, the Rangers are not autonomous but teleoperated. They can be controlled either from Earth or from a base station in space. But then, sometimes it might be better to put your life in the hands of another human being instead of a robot.

I believe that the Rangers will make the headlines soon when the question of prolonging the lifespan of the Hubble Space Telescope comes up. The Rangers would most probably be called upon to do the job and when that time comes, it will mark a new era in roles of robots in the exploration and development of outer space.

For more information, please visit these websites.
NASA Telerobotics Program Overview
Ranger Robotics Program

Humans or Robots? Can you decide?

The future of edutainment and service robots...

U0308283 Wu Chengyu

Introduction
Robots are usually designed to function optimally, be it streamlined to reduce resistive forces or given extra legs for traveling on rough terrain. But in the 21st century, robots are being deployed for a new reason – to facilitate and improve communication. To achieve this goal, robots need a new appearance. As psychologists have shown, robots with a human-like appearance have a stronger presence, and humans are more likely to interact with them. Hence a new generation of robots is born. Humanoid robots, with appearance and behaviour similar to ours, are the solution to our new needs. Indeed, should we perfect this technology, we could be looking at a whole new range of jobs robots can help us with: as entertainers such as actors, performers and dancers; in service sectors as receptionists or ushers; or in education as translators or teachers. The possibilities are endless should we be able to pass robots off as human substitutes when needed. Here, let us marvel at some examples of humanoid robots that have already been tested.


Repliee Q1 & Q2
Humanoid robots Repliee Q2 and ‘her’ predecessor Repliee Q1 can be said to be the closest to humans ever made. Whether in appearance or behavior, these two robots were modeled as closely on humans as possible. Instead of hard plastic skin, they are covered in flexible silicone to give a skin-like look and texture. In addition, each robot has 42 actuators, allowing her to make the most minute yet smooth motions, such as the fluttering of her eyelids. They can even simulate breathing through the subtle rising and falling of their chests. Finally, both are programmed to shift their positions randomly, much like their creators, humans. A variant of Repliee Q2 is the Repliee R1, which is modeled after a five-year-old girl using the same technology.


Actroids
There have been several other human-like robots like the Repliees, but the most significant are those built by the Japanese companies Kokoro and Advanced Media. Called Actroids, many have already been deployed commercially, especially in service roles. In the newest version, a female reception robot has been employed at an information booth. These Actroids look very much like humans but can recognize and respond in up to four different languages, in that respect even outdoing their human counterparts.




Motion System
To achieve humanlike motion, the Repliees and Actroids, like many humanoid robots, use an air compressor to power their movements. Highly pressurized air is supplied to an actuator called a cylinder to move the mechanical units. (Generally speaking, an actuator for heavy machinery employs hydraulic rather than pneumatic pressure.) Mounted with 42 actuators, the Repliees and Actroids move very smoothly, much like a human being. Actroids are also fitted with a system to control their motions, so that when making conversation they look at the enquirer’s face and move their lips as though pronouncing their sentences. They may even change their facial expressions or make hand gestures according to the context of the conversation. To date, however, most of these humanoid robots can only sit and move the upper part of the body, because they must remain attached to the air compressor, which is too large to fit inside the body and hence limits mobility.
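The pneumatic position control described above can be sketched as a simple feedback loop: a controller opens a valve in proportion to the position error, and the cylinder integrates the resulting airflow. This is a minimal illustrative model with invented gains, not Kokoro's actual control scheme.

```python
# Minimal sketch of closed-loop position control for one pneumatic actuator:
# a proportional controller sets the valve opening from the position error,
# and the piston moves in proportion to the flow. Gains are invented.

def control_actuator(position, target, kp=0.5, steps=50):
    """Drive a (simulated) pneumatic cylinder toward `target` position."""
    for _ in range(steps):
        error = target - position
        valve = max(-1.0, min(1.0, kp * error))  # valve opening, clamped
        position += 0.2 * valve                  # airflow moves the piston
    return position
```

With 42 such loops running together, smooth composite motions like an eyelid flutter become a matter of choreographing 42 small targets over time.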


Voice Recognition and Response System
The most challenging part of creating a practical humanoid robot for interaction is giving it the intelligence to recognize speech and generate an appropriate response. Let us study one of the most advanced voice recognition engines present in robots today, AmiVoice®. Developed by the Japanese company Advanced Media, Inc., AmiVoice® is the voice recognition engine currently used by the Actroids. With this engine, an Actroid can understand and speak up to four languages, namely English, Chinese, Korean and Japanese. This is done through a speech recognition check not per word, but per sentence of each utterance. Such a capability is exceptionally useful when guests may come from many countries and predefining a language is unfeasible. AmiVoice® has been shown to accurately recognize each person's language regardless of differences in intonation and accent, pitch or speaking speed. But to hear the enquirer at all, the Actroid must first filter out background noise and decide when someone is speaking to her. This is handled by AmiVoice®'s noise cancellation technology, which safeguards the accuracy of the recognition. To prevent the Actroid from talking to herself, a separate echo cancellation algorithm stops her from recognizing and processing her own voice. Finally, to complete the simulation of a human, the Actroid has a built-in voice synthesizer that generates a natural voice similar to a human's.
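The stages just described form a pipeline: noise gate, echo cancellation, sentence-level language detection, then a reply. The toy sketch below shows the ordering of those stages only; the keyword "detector", thresholds and canned replies are invented stand-ins for AmiVoice®'s proprietary engine.

```python
# Toy sketch of the Actroid speech pipeline: gate out noisy input, suppress
# the robot's own utterances (echo cancellation), detect the language on the
# whole sentence rather than per word, and pick a reply. Illustrative only.

GREETINGS = {
    "english": ("hello", "Hello, how may I help you?"),
    "japanese": ("konnichiwa", "Konnichiwa!"),
}

def respond(sentence, own_recent_utterances, noise_level, threshold=0.5):
    if noise_level > threshold:
        return None                      # noise gate: ignore noisy input
    if sentence in own_recent_utterances:
        return None                      # echo cancellation: our own voice
    words = sentence.lower().split()     # analyse the whole sentence
    for lang, (keyword, reply) in GREETINGS.items():
        if keyword in words:
            return (lang, reply)
    return ("unknown", "...")
```

The real engine of course does statistical recognition over full sentences rather than keyword spotting, but the stage ordering is the point: filtering happens before recognition, so the recognizer only ever sees plausible visitor speech.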

Here is a flowchart of Actroid’s response program.

Future Trends
Robotics today is advancing at an amazing pace. There is now talk of a new technology that can create artificial skin for robots, giving them not only a sense of touch but also the ability to detect pressure and temperature, and perhaps even humidity, light, strain and sound, which human skin cannot sense. There is also progress in incorporating muscles into robots, allowing them to move much as we do. These new technologies, if successful, could be added to our humanoid robots. With a sense of touch like ours and muscles generating even smoother motion, robots may eventually look identical to a normal human being. We may one day find robots acting on our TV screens or standing in front of our tutorial rooms! While this would open up a wide range of uses, it is also rather frightening to think that one may no longer be able to tell the difference between a robot and a human. Who, or what, are you?



References

http://news.bbc.co.uk/1/hi/sci/tech/4714135.stm
http://automatesintelligent.blog.lemonde.fr/automatesintelligent/2005/08/repliee_ou_line.html
http://news.nationalgeographic.com/news/2005/08/0817_050817_robotskin.html
http://news.thomasnet.com/IMT/archives/2006/03/robot_muscles_double_as_fuel_cells.html




Robota Doll - An Educational Toy



U0205391 Beeharry Vishal

Toy robots have long been viewed as entertainers for children. In recent years, with the rise of technological motor-driven toys, a wide range of robot toys, for example AIBO from Sony and NeCoRo from Omron Corporation, have come to market. More humanoid robots, such as ASIMO from Honda and EMIEW from Hitachi, have also been added to the list of entertainment robots.

However, while being absorbed with entertainment robots, one must not ignore the fact that robot toys can also be used for educational purposes. The AuRoRa project, started in 1998 in collaboration with different schools in the Hertfordshire (University of Hertfordshire) and Essex areas, including Radlett Lodge School, Colnbrook School, and Bentfield Primary School, aims at using robot toys in an educational or therapeutic role for children with autism. One of the robots used in the project is the Robota doll. The picture on the left shows the first prototypes of Robota.
Robota dolls are mini humanoid robots developed specifically to serve as educational toys. These ‘dolls’ are sophisticated enough to have complex interactions with humans, such as speech, vision and body imitation. They have proved very useful in helping children diagnosed with autism.
For those unfamiliar with it, autism is a developmental disorder affecting 91 people in every 10,000. Although both the cause of and cure for this disorder are currently unknown, its symptoms vary from child to child. Generally, children affected by autism tend to have behavioral problems. They seem ‘cut off’ from the world and, acting as observers, they avoid social contact while finding it difficult to participate in social situations. To help these children interact with the people around them, they need to be taught such interaction ‘skills’ by adults. However, adults’ social behavior is very delicate, elaborate and largely unpredictable (which often appears frightening to children with autism). Most children with autism are more comfortable with mechanical toys. This is where the AuRoRa project comes into action.
The Robota doll serves as a robotic platform providing a safe, simplified (compared to the real world) and predictable environment in which to familiarize these children with socialization skills. Depending on the children’s abilities, the complexity of interaction can be varied. The pictures below show how the children can get involved in imitation and turn-taking games with the robots.

Technology
Robota, being a humanoid doll, stands 45 cm tall. Her arms, legs and head are made from the plastic components of a commercially available doll. The motors that drive the arms, legs and head each have 1 DOF. She can detect and respond to touch through potentiometers that detect passive motion of her limbs and head. Various sensors (emitters/receivers, light detectors, etc.) can also be connected to Robota.

The best feature of Robota is that she can copy and imitate upward movements of the user’s arms and sideways movements of the user’s head when the user is sitting close to the robot. Thus, the user can ‘play’ imitation and turn-taking games with Robota. Furthermore, machine learning algorithms allow Robota to learn from the user; for example, she can be taught a sequence of actions as well as a vocabulary.
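The imitation game above boils down to mapping tracked user joint angles onto the doll's 1-DOF motors, each of which has a limited range. The sketch below shows that mapping; the joint names and angle limits are invented for illustration and are not Robota's actual specifications.

```python
# Sketch of mirror imitation: each tracked user angle is clamped to the
# corresponding 1-DOF motor's range before being sent as a servo command.
# Joint names and limits (in degrees) are hypothetical.

LIMITS = {"left_arm": (0, 180), "right_arm": (0, 180), "head": (-45, 45)}

def imitate(tracked_angles):
    """Map tracked user angles (degrees) to clamped servo commands."""
    commands = {}
    for joint, angle in tracked_angles.items():
        lo, hi = LIMITS[joint]
        commands[joint] = max(lo, min(hi, angle))
    return commands
```

Turn-taking can then be layered on top simply by alternating who moves first: the child leads and Robota mirrors, or Robota plays back a taught sequence and waits for the child to copy it.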

Some other robots used in the AuRoRa project are:
(1) Mel Robot
--> 38 cm long
--> 30 cm wide
--> 12 cm high
--> 8 IR sensors
--> 4 wheels



(2) Pekee Robot
--> Oval-shaped
--> 2 motorized wheels
--> 15 distance measuring sensors

In all these fields (assistive, security, home, entertainment, educational), robots are seen to have excelled beyond expectation. Though it may seem a distant future, the day when robots roam the streets as “friends” of humans may soon arrive. And who knows, maybe the scenes in the movie I, Robot will someday become reality.
To conclude, below are some quotes from the movie I, Robot:

1. “Robots don’t feel fear, they don’t feel anything”.
2. “Can a robot write a symphony; can a robot turn a canvas into a beautiful masterpiece?”
3. “The future begins today (…) more sophisticated, more intelligent, and of course, three laws safe.”
4. “One day they’ll have secrets, one day they’ll have dreams”.
5. “We all have a purpose”.

A Human-Like Semi Autonomous Mobile Security Robot

U0204840 Lin Ming Zheng

The Mechatronics Group of the University of Waikato has developed a fleet of five mobile robots capable of autonomous operation. These robots are designed to move across a variety of terrains, including farms, forests, underwater environments and smooth indoor surfaces. MARVIN (Mobile Autonomous Robotic Vehicle for Indoor Navigation) is designed to act as a security agent for indoor environments. It must interact with people who may have little or no knowledge of robotic devices, and this interaction must be made as natural as possible for the human to be comfortable communicating with MARVIN. To facilitate this, MARVIN has been substantially redesigned and provided with speech recognition and speech synthesis software, as well as the ability to convey emotional states verbally and non-verbally, through actions like nodding or shaking its head.

MARVIN is also equipped with a range of sensors that allow it to avoid obstacles; different sensors detect obstacles at different distances (short, intermediate and long). In operation, MARVIN scans its environment until it detects a dynamic (moving) obstacle. Once this is confirmed, the laser ranger helps determine whether the dynamic feature might be a human. If so, MARVIN approaches the "moving obstacle" and interrogates it. If it is a human, MARVIN expects an identification card to be shown. It then searches its database for the owner of the card and prompts the user for a password. MARVIN will try three times to elicit the password. If unsuccessful, MARVIN becomes more aggressive (see diagram) and demands that the user leave the premises. Although not implemented at this stage, the plan is for MARVIN also to notify a remote human security agent via its on-board wireless LAN that an intrusion has taken place, and to send a picture of the intruder.
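The interrogation sequence just described is essentially a small state machine: card check, up to three password attempts, then escalation. The sketch below captures that logic only; the database contents, card IDs and return strings are hypothetical, and the real MARVIN wraps this in speech recognition and synthesis.

```python
# Sketch of MARVIN-style interrogation: unknown card -> alert immediately;
# known card -> up to three password attempts before escalating.
# All data and messages are invented for illustration.

USERS = {"card-001": ("alice", "sesame")}   # card id -> (owner, password)

def interrogate(card_id, password_attempts):
    if card_id not in USERS:
        return "ALERT"                       # unknown card: escalate at once
    owner, password = USERS[card_id]
    for attempt in password_attempts[:3]:    # MARVIN tries three times
        if attempt == password:
            return f"WELCOME {owner}"
    return "ALERT"                           # aggressive mode: demand exit
```

The planned wireless-LAN notification would simply be a side effect of the "ALERT" branch, sending a snapshot of the intruder to a human guard.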

I think this system is useful as a security agent. When multiple robots are used, it can serve as an efficient way to patrol a large area. However, a human security guard is still needed to oversee the entire operation, as unexpected situations may arise if an intruder behaves abnormally.

Roaring Roboraptor

u0205159 Du Xing

The Roboraptor measures about 80 cm from head to tail and comes to life with realistic motions and advanced artificial intelligence. It has more than 40 pre-programmed functions and a dinosaur-like artificially intelligent personality, realistic biomorphic motions, and both direct-control and autonomous (free-roam) modes. It has three fluid bipedal gaits – walking, running and a predatory stalk – together with realistic biomorphic body movements such as turning its head and neck and whipping its tail. It even has three distinct moods: hunter, cautious and playful.

It can interact autonomously with its environment, responding with mood-specific behaviours and sounds. Touch sensors on its tail, chin and mouth, together with sonic sensors in its head, allow it to respond to touch and sound, while an infrared vision system detects objects in its path or approaching it. Its powerful jaws can play tug-of-war games, "bite" and pull. With "laser" tracking technology, you can trace a path on the ground and Roboraptor will follow it. It also has a visual and sonic guard mode, and it even responds to commands from Robosapien V2.

Roboraptor will start to explore his environment autonomously in Free-Roam Mode if left alone for more than three minutes. In Free-Roam Mode he avoids obstacles using his infrared vision sensors and occasionally stops moving to listen for sharp, loud sounds. After 5 to 10 minutes of exploration Roboraptor will power down. We can also put Roboraptor into Guard Mode, which he confirms with a head rotation. In Guard Mode Roboraptor uses his infrared vision sensors and stereo sound sensors to guard the area immediately around him. If he hears a sound or sees movement he reacts with a roar and becomes animated. Occasionally he will turn his head and sniff.

Roboraptor's infrared vision sensors enable him to detect movement on either side of him, although infrared functions can be affected by bright sunlight, fluorescent lighting and electronically dimmed lighting. Upon activation Roboraptor is sensitive to sound, vision and touch. If you trigger the vision sensor on one side more than three times in a row, Roboraptor will get frustrated and turn away from you; this will also happen if you leave him standing with his head facing a wall. Roboraptor uses his vision sensors to avoid obstacles while wandering around. While walking he cannot detect movement, so he will react to you as if you were an obstacle.

Roboraptor can be guided around using "laser" targeting: a green Targeting Assist Light from the remote control makes Roboraptor move towards the light. Roboraptor's infrared vision system and the "laser" targeting are based on reflection, which means he can see highly reflective surfaces like white walls or mirrors more easily and at greater distances. Roboraptor also walks best on smooth surfaces.

With his stereo sound sensors, Roboraptor can detect sharp, loud sounds (like a clap) to his left, to his right and directly ahead. He only listens when he is not moving or making a noise. In Hunting Mood, when he hears a sharp sound to his side he turns his head to look at the source; if he hears another sharp sound from the same direction he turns his body towards it; and if he hears a sharp sound directly in front of him he takes a few steps toward it. In Cautious Mood, a sharp sound to his side makes him turn his head to look at the source, while a sound straight ahead makes him walk away from it. In Playful Mood, a sharp sound to his side makes him turn his head to look at the source, while a sound straight ahead makes him take a few steps backward, then a few steps forward.
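The mood-dependent sound reactions above condense naturally into a lookup table keyed on (mood, direction). The action strings below are my paraphrases of the behaviours described, not official Roboraptor state names.

```python
# Roboraptor's sound reactions as a (mood, direction) dispatch table.
# Action names paraphrase the behaviours described in the text.

REACTIONS = {
    ("hunting",  "side"):  "turn head toward source",
    ("hunting",  "front"): "step toward source",
    ("cautious", "side"):  "turn head toward source",
    ("cautious", "front"): "walk away from source",
    ("playful",  "side"):  "turn head toward source",
    ("playful",  "front"): "step back, then step forward",
}

def react_to_sound(mood, direction, moving=False):
    if moving:
        return None          # he only listens while standing still and quiet
    return REACTIONS[(mood, direction)]
```

Laying the behaviours out this way also makes the pattern visible: side sounds always draw the head, and only the reaction to frontal sounds distinguishes the moods.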

Roboraptor has multiple touch sensors which allow him to explore his environment and respond to human interaction. Pressing the sensors on Roboraptor's tail triggers the tail touch sensors, whose reaction varies depending on his mood; pressing the sensor under Roboraptor's chin activates the chin touch sensor, which likewise produces a mood-dependent reaction. There is also a touch sensor on the roof of Roboraptor's mouth. In Hunting Mood, touching this sensor triggers a biting and tearing animation; in Cautious and Playful Moods, Roboraptor will play tug-of-war with whatever is in his mouth.

You might wonder how we control Roboraptor's moods: it is done with a button on the remote control. Hunting Mood is the default mood Roboraptor is in when turned on; he can also be set to Playful or Cautious Mood. As mentioned above, the moods determine how Roboraptor reacts to some of his sensors. In Playful Mood Roboraptor will nuzzle your hand if you approach from the side. In Cautious Mood, Roboraptor will turn his head away from movement to the side. In Hunting Mood, his reactions are much less friendly.

Technology: biomorphic robotics
Biomorphic robotics is a subdiscipline of robotics focused on emulating the mechanics, sensor systems, computing structures and methodologies used by animals. In short, it is building robots inspired by the principles of biological systems.
One of the most prominent researchers in the field has been Mark W. Tilden, the designer of the Robosapien series of toys. One of the more prolific annual biomorphic gatherings is the Neuromorphic Engineering Workshop, where academics from around the world share their research in what they describe as a field of engineering based on the design and fabrication of artificial neural systems, such as vision chips, head-eye systems, and roving robots, whose architecture and design principles are based on those of biological nervous systems.
A related subdiscipline, neuromorphics, focuses on the control and sensor systems, while biomorphics focuses on the system as a whole.
http://www.roboraptoronline.com/
Other toys by WowWee and Mark Tilden: Robosapien and Robopet.

RoboWalker - The solution to assisted walking

u0204699 - Ong Chin Soon


RoboWalker is one of the main projects in the area of legged robots and powered leg orthotics at Yobotics, a cutting-edge robotic design, consulting and research firm specializing in biomimetic robots, powered leg orthotics and force-controllable actuators. The main aim of the RoboWalker is to assist people suffering from weakness in the lower extremities by augmenting or replacing the muscular function of the lower limbs. Such disabilities have many possible causes, including stroke, post-polio syndrome, multiple sclerosis and muscular dystrophy.



RoboWalker is a power-assisted wearable device intended to provide leg strength, support and endurance for the elderly and for those with diseases that weaken the lower limbs. If successfully developed, it would be a state-of-the-art breakthrough in orthotic devices, which are currently passive only. RoboWalker webs the leg and foot in a series of artificial, exoskeletal springy tendons and muscles. Through its numerous sensors, the device knows the user's next course of action and provides bursts of muscular energy through the brace when needed. These bursts supply the strength that helps the user accomplish tasks (for instance walking up stairs, or standing up from a sitting position) which he or she might otherwise have difficulty with. Nevertheless, just like its scaled-down cousins RoboKnee and RoboAnkle, RoboWalker is not useful for paraplegics (those with complete paralysis of the lower extremities), since the user must first be able to put the leg where it needs to go before RoboWalker can provide the assistance to complete the motion.
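This "augment, don't replace" principle can be sketched as a simple power-assist law: the brace adds torque proportional to the effort the user is already producing, capped by the actuator's limit. The gain and limit below are invented for illustration; Yobotics' actual controller is not described in the source.

```python
# Sketch of proportional power assist: the brace boosts the user's own joint
# torque by a fixed gain, saturating at the actuator limit. If the user
# produces no effort (as in paraplegia), the assist is zero - matching the
# limitation described above. Gain and limit values are hypothetical.

def assist_torque(user_torque, gain=0.8, limit=30.0):
    """Extra torque (N*m) the brace adds to the user's own effort."""
    boost = gain * user_torque
    return max(-limit, min(limit, boost))

def total_joint_torque(user_torque):
    return user_torque + assist_torque(user_torque)
```

The saturation limit is also what bounds the battery drain, which connects to the 30-to-40-minute runtime discussed below.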

While the RoboWalker prototype has had rather impressive trial results, it also has certain drawbacks. Firstly, the batteries that power the device last for only a relatively short period: recharging them or swapping in a new set is required after about 30 to 40 minutes of untethered assisted walking. Secondly, while the final price of the actual device is unknown, the estimated cost in the range of $10,000 may make it prohibitively expensive for most people.


The success of RoboWalker would change the lives of many disabled and elderly people. It could well replace the wheelchair as their preferred means of locomotion. Not only would this innovation greatly reduce the inconvenience faced by people suffering from weakness in the lower extremities, such a breakthrough would also bring huge cost savings to the social welfare system, where the need for mobility modifications (wheelchair-friendly houses, stair lifts, car lifts, home aids, etc.) would be reduced.

Links:

http://yobotics.com/robowalker/robowalker.html

http://www.roboticstrends.com/displayarticle35.html?POSTNUKESID=a8fd9114cf249d6866513b067b7a5f8f

Your eye in the sky - The Unmanned Aerial Vehicle (UAV)

Martin Wiig - s0500296

An unmanned aerial vehicle, or UAV, is an aircraft that is either self-controlled or remote-controlled. It can carry different payloads, such as sensors, cameras, radar and weapons, depending on its use, which may be quite varied. In civilian life, UAVs have mainly been used for fun and recreation in the form of model planes, and also in research. In the military, however, UAVs play a role of ever-increasing importance: from simple model planes used to train anti-aircraft gunners after World War 1, to the V-1 bomb of the German Luftwaffe, to the complex, more or less autonomous aircraft of today, such as the Fire Scout and the Predator. These are used for tasks such as reconnaissance, intelligence gathering, assessment of damage after a battle, target acquisition for artillery and ship bombardment, and so on. They can even carry weapons to be delivered to the enemy, luckily still with a human hand on the trigger. Currently, at least 700 unmanned aerial vehicles are in use in Iraq.

Three modern UAVs, two of them in service and one still undergoing testing, are described in this posting. All of them are part of the American RQ series.

RQ-1: Predator
The Predator is a propeller-driven air vehicle 27 feet in length with a wingspan of 49 feet. It can operate for more than 40 hours at altitudes of up to 25,000 feet, with a cruising speed of over 130 km/h and a range of 740 km. The aircraft is part of a system of four planes, one ground control station (usually a van) and a satellite communications system. It is equipped with a satellite dish for communicating with the ground control station, with infrared sensors, cameras and radar, and may also be equipped with weapons.

The Predator has been employed by the U.S.A. in Bosnia since 1995, and it is also in use in Afghanistan and Iraq. Its main use is, as with most UAVs, reconnaissance, but it is also capable of carrying two Hellfire missiles. The Predator became (in)famous when it was used by the CIA to assassinate six suspected terrorists in Yemen in 2002, the first ever attack by an unmanned aircraft outside a theatre of war. The Predator has a bright future in the U.S., as the Pentagon plans to buy at least 219 Predators over the next five years.

RQ-2: Pioneer
The Pioneer was an early bird among modern military UAVs. It entered service as early as 1985, when it was employed by the US Navy. It had several teething problems, though, for instance facing electromagnetic interference from the ships it launched from, which led to several crashes. A USD 50 million research and development project was consequently launched, bringing the Pioneer to a level of "minimum essential capability".

In spite of this, the history of the Pioneer is one of success. It was used extensively during the first Gulf War, where it carried out reconnaissance, target acquisition and battle damage assessment missions, among others. It became famous when a group of Iraqi soldiers surrendered to a Pioneer, fearing the ship bombardment that usually followed an overflight. This was the first time human soldiers surrendered to a machine, and it is thus a landmark (if a somewhat scary one) in the history of robotic warfare.

The Pioneer is smaller than the Predator, with a length of 14 feet and a wingspan of 17 feet. It is a propeller-driven aircraft with a range of 185 km and a cruising speed of 120 km/h, and it operates at altitudes of up to 15,000 feet. The Pioneer is also used by Israel.

RQ-8: Fire Scout

The Fire Scout is an unmanned robotic helicopter still under development and testing. It showed an amazing degree of autonomy when, without interference from human hands, it landed on an aircraft carrier moving at 27 km/h. Landing on an aircraft carrier is known as the most difficult part of piloting a navy plane, as it requires very good reflexes to adjust to the ship as it pitches and rolls. The Fire Scout may thus also lead to the development of automatic landing systems for manned planes, surely a great relief for pilots.

In conclusion, it is clear that the future of unmanned aerial vehicles is bright from a military perspective. But is killing and war all these machines can be used for? Certainly not. As the technology becomes more and more advanced, UAVs are more than likely to enter civilian life to a greater degree. I can easily envision UAVs being used to monitor rainforests and other endangered environments, for nature research, for the spraying of fields, and so on.


For interested readers, Wikipedia offers several links to open-source UAV projects on the net.

References:
General information on UAVs:
http://en.wikipedia.org/wiki/Unmanned_aerial_vehicle
http://www.fas.org/irp/program/collect/uav.htm
http://www.uavforum.com/library/photo.htm
http://www.marinetimes.com/story.php?f=1-292925-1649125.php

About Predator:
http://www.fas.org/irp/program/collect/predator.htm
http://www.airforce-technology.com/projects/predator/
http://en.wikipedia.org/wiki/RQ-1_Predator

About Pioneer
http://www.fas.org/irp/program/collect/pioneer.htm
http://en.wikipedia.org/wiki/RQ-2_Pioneer

About Fire Scout
http://en.wikipedia.org/wiki/RQ-8A
http://www.signonsandiego.com/news/business/20060119-9999-1b19scout.html

Robot Doctor

By:WANG LIWANG "U0205321"
A few days ago, the local news on Channels 5, 8 and U reported that a Taiwanese hospital had bought a few mobile robot doctors from the UK and put them into operation for daily nursing and ward rounds.

The robot is equipped with a video camera and an LCD panel for display, as shown in the photo. It is remotely controlled by a doctor with a joystick. The robot doctor itself does not provide any form of physical examination, but it provides a mobile platform for communication between doctor and patients. With it, a doctor can carry out his or her routine at different places, since doctors with specialized skills and knowledge are often required in several places at once. The robot checked up on patients, asked them how they were feeling, inspected their surgical sites to ensure proper healing, and answered questions.

There is no intelligence in the robot; a laptop with a webcam could essentially perform the same task. And even though a video of the doctor appears on the monitor, it can never give the sense of interacting with a real doctor. As a doctor from the hospital said, "Our robots would never replace all doctors on ward rounds, but they are a communication tool which allows a doctor to have direct contact with their patient".

Such an application shows another potential use of robots in the medical field. It may not offer the precise control of surgical robots, but it makes doctors' work easier and more convenient.

Interestingly, in the NUS control lab we have a similar robot, created as a robot usher, with a facial display on an LCD, plus sonar sensors and stereo cameras for path planning. It should be smarter than the robot doctor above, as it can follow people by navigating on its own and avoiding obstacles in its path.


[1] "Does Dr Robot usher in new era of metal medics?" --http://www.e-health-insider.com/news/item.cfm?ID=766

[2] "First robot doctors start work in UK hospitals" THE GUARDIAN , LONDON Friday, May 20, 2005,Page 6

Kismet - the Face of the Future

U0204438 Huang Shichao Alvin

Traditionally, autonomous robots have been designed to perform hazardous and/or repetitive tasks. However, a new range of domestic applications such as household chores and entertainment is driving the development of robots that can interact and communicate with humans around them. At present, most domestic robots are restricted to communicating with humans through pre-recorded messages. Communication amongst humans, however, entails much more than just the spoken word, including other aspects such as facial expression, body posture, gestures, gaze direction and tone of the voice.

The Sociable Machines Project at MIT has developed an expressive anthropomorphic robot called Kismet that "engages people in natural and expressive face-to-face interaction". To do this, Kismet takes in visual and audio cues from the human interacting with it through 4 colour CCD cameras mounted on a stereo active vision head and a small wireless microphone worn by the human. Kismet has 3 degrees of freedom to control gaze direction and 3 degrees of freedom to control its neck, allowing it to move and orient its eyes like a human, which not only enhances its visual perception but also lets it use gaze direction as a communication tool. Kismet has a 15-degree-of-freedom face that can display a wide assortment of facial expressions, as seen in the picture, allowing it to show various emotions through movement of its eyelids, ears and lips. Lastly, it also has a vocalization system driven by an articulatory synthesizer.

In terms of behaviour, the system architecture consists of 6 sub-systems: low-level feature extraction, high-level perception, attention, motivation, behaviour and motor systems. The visual and audio cues Kismet receives are classified by the extraction and perception systems; the attention, motivation and behaviour systems then decide the next action, which is executed by the motor system.
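The six-subsystem architecture above can be pictured as a simple data pipeline. The stage names come from the article; everything inside each stage below is an invented stand-in for illustration, not MIT's implementation.

```python
# Minimal sketch of Kismet's six-stage architecture as a pipeline.
# All thresholds and labels are illustrative placeholders.

def low_level_features(frame):
    # e.g. motion energy from the cameras, pitch from the microphone
    return {"motion": frame["motion"], "pitch": frame["pitch"]}

def high_level_percepts(features):
    # classify raw features into percepts such as "toy" or "scolding voice"
    percept = "toy" if features["motion"] > 0.5 else "face"
    tone = "soothing" if features["pitch"] < 200 else "scolding"
    return {"percept": percept, "tone": tone}

def attention(percepts):
    # pick the most salient stimulus to attend to
    return percepts["percept"]

def motivation(percepts):
    # map the stimulus onto an emotional state
    return "happy" if percepts["tone"] == "soothing" else "sad"

def behaviour(emotion):
    # choose an action consistent with the current emotion
    return "approach" if emotion == "happy" else "withdraw"

def motor(action, target):
    # execute the chosen action toward the attended target
    return f"{action} {target}"

def kismet_step(frame):
    p = high_level_percepts(low_level_features(frame))
    return motor(behaviour(motivation(p)), attention(p))

print(kismet_step({"motion": 0.8, "pitch": 150}))  # → approach toy
```

The real system runs these stages concurrently with feedback between them; the linear pipeline here only shows the direction of data flow.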

Kismet exhibits several human-like behaviours modeled on those of an infant, such as moving closer towards an object it is interested in by craning its neck, or engaging in a calling behaviour to draw the object nearer. It also changes its facial expressions according to whether the visual and audio stimuli make it "feel" happy, sad, and so on. Videos of these interactions can be seen on the project website. The use of infant behaviours is meant to mimic parent-infant exchanges and thereby support socially situated learning with a human instructor.

Kismet represents the next step in human-robot interaction, where robots will share a similar morphology to humans and thus communicate in a manner that supports the natural mode of communication of humans. This will lead to more intuitive and "friendly" designs for robots that will allow them to be more easily accepted by humans as robots become more and more ubiquitous in our lives.

Link to Kismet homepage

Lego Mindstorms NXT, a kid's toy (available from August 2006)

U0303505 Pham Dang Khoa

Introduction

Smarter, stronger and more intuitive than ever, LEGO MINDSTORMS NXT is a robotics toolset that provides endless opportunities for armchair inventors, robotics fanatics and LEGO builders ages 10 and older to build and program robots that do what they want.

Building upon the success of the globally-renowned Robotics Invention System, the next generation of LEGO MINDSTORMS makes it quicker and easier for robot creators to build and program a working robot in just 30 minutes. Simultaneously, new technologies and expanded sensor capabilities add a level of sophistication to excite and challenge more experienced robot creators.

Technology

The heart of the new system is the NXT brick, an autonomous 32-bit LEGO microprocessor that can be programmed using a PC, or for the first time in the retail offering, a Mac. After building their robots, users create a program within easy-to-use yet feature-rich software, powered by LabVIEW from National Instruments.

Downloading programs to an invention is easy. Users with Bluetooth®-enabled computer hardware can transfer their programs to the NXT wirelessly, or anyone can use the included USB 2.0 cable to connect their computer to the NXT for program transfer. The robot then takes on a life of its own, fully autonomous from the computer. The inclusion of Bluetooth technology also extends possibilities for controlling robots remotely, for example, from a mobile phone or PDA.

The demonstration showed, for example, how a Bluetooth phone could direct the movement of one of the robots, and how a robot could be programmed so that when it moved and bumped into something, it would send a signal to the phone directing it to snap a digital photograph.
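The bump-triggers-photo demo boils down to a small event loop. The sensor and radio classes below are invented placeholders so the sketch is self-contained; they are not the actual NXT firmware or phone API.

```python
# Hypothetical sketch of the demo described above: when the robot's touch
# sensor fires, it sends a Bluetooth message telling a paired phone to
# take a photo. FakeTouchSensor and FakeBluetooth are stand-ins.

class FakeTouchSensor:
    def __init__(self, events):
        self.events = iter(events)      # scripted press/no-press readings
    def pressed(self):
        return next(self.events, False)

class FakeBluetooth:
    def __init__(self):
        self.sent = []
    def send(self, msg):
        self.sent.append(msg)           # a real radio would transmit here

def run(sensor, radio, steps):
    for _ in range(steps):
        if sensor.pressed():            # bumped into something
            radio.send("SNAP_PHOTO")    # ask the phone to photograph it

radio = FakeBluetooth()
run(FakeTouchSensor([False, True, False, True]), radio, 4)
print(radio.sent)  # → ['SNAP_PHOTO', 'SNAP_PHOTO']
```

On real hardware the same loop would poll the NXT touch sensor port and write a Bluetooth mailbox message instead of appending to a list.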

Feature highlights

• All-new NXT intelligent brick
• 3 interactive servo motors feature inbuilt rotation sensors to align speed for precise control
• New ultrasonic sensor lets robots "see" obstacles and respond to movement
• New sound sensor enables robots to react to sound commands, including sound pattern and tone recognition
• Improved light sensor detects different colors and light intensity
• Improved touch sensor reacts to touch or release and allows robots to feel
• 519 hand-selected, stylized elements from the LEGO TECHNIC® building system ensure robot creations will be sturdy and durable while also looking authentic
• Opportunities for physical programming of robots and interaction with robots during programming
• 18 building challenges with clear, step-by-step instructions help acclimate users to the new system to create robots ranging from humanoids and machinery to animals and vehicles
• Digital wire interface allows for third-party developments
• Further, the robots are Bluetooth-enabled, meaning they can be controlled by, and can control, any Bluetooth device.

Application: kid’s toy

Mindstorms NXT is said to be aimed at children 10 and older, but it's obvious Lego is hoping the toy will also appeal to adults.


Home Vacuum cleaning robots

U0308030 PHUNG DUC KIEN

Introduction: There are currently many home cleaning robots available on the market, each with its own strengths and weaknesses. Here I compare the products, to help consumers differentiate between the robots and make their own choice.


iRobot Roomba



Roomba is currently the most popular cleaning robot
Price: Roomba Discovery: $249
Technology:
Uses a front bumper sensor to detect obstacles
Dirt detection: uses sound feedback. This is a very clever technology: as dirt is sucked into the box, the robot interprets the dirt density from the frequency of the sound it detects. The robot can hear the dirt!

Automatically avoids stairs and falls from heights using infrared sensors.
Home base: when it is done cleaning or runs low on charge, it will reliably return to the charging base if it is in the same room.
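The acoustic dirt-detection idea can be sketched in a few lines: dirt hitting the bin makes clicks, and the click rate serves as a proxy for dirt density. The thresholds below are made-up values for illustration, not iRobot's.

```python
# Illustrative sketch of acoustic dirt detection: classify dirt density
# from the rate of impact clicks heard in the last second.

def dirt_level(click_timestamps, window=1.0):
    """Classify dirt density from clicks heard in the last `window` seconds."""
    if not click_timestamps:
        return "clean"
    now = max(click_timestamps)
    rate = sum(1 for t in click_timestamps if now - t <= window) / window
    if rate >= 20:
        return "very dirty"     # the robot should spot-clean this area
    if rate >= 5:
        return "dirty"
    return "clean"

print(dirt_level([0.0, 0.1, 0.2, 0.3, 0.35, 0.4]))  # 6 clicks in 1 s → dirty
```

A real implementation would work on the microphone signal itself (band-pass filtering and peak detection) rather than on pre-extracted click times.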

Strengths: dirt detection, less noisy than a conventional vacuum cleaner, good cleaning algorithm, can automatically return home to recharge


Weaknesses:
It will not be gentle with your valuable furniture and pets, because it will actually bump into them.
The robot has difficulty cleaning on carpet


Sharper Image eVac




Price: $199
Technology: Similar to Roomba
Strengths:
Cheaper than Roomba
Better cleaning power (at the cost of more noise)

Weaknesses:
It will not be gentle with your valuable furniture and pets, because it will actually bump into them.
The robot has difficulty cleaning on carpet
Very noisy compared to the Roomba
The robot does not have a function to return to a home base station




Applica Zoombot



Price: $99
Strengths:
Possibly the cheapest cleaning robot available
Can avoid stairs, though it sometimes gets stuck with one wheel hanging in the air if it runs on a table.

Weaknesses:
Runs very slowly
Limited functionality
Poor cleaning power


Electrolux Trilobite





Price: $1,799


Technology: Uses ultrasonic sensors to detect obstacles, so the robot will not bump into your furniture. Thanks to these sensors, it can distinguish between objects and your pets.
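Ultrasonic ranging, the principle the Trilobite relies on, is simple to sketch: emit a ping, time the echo, and halve the round-trip distance. The 10 cm safety margin below is an illustrative assumption, not Electrolux's value.

```python
# Back-of-envelope sketch of ultrasonic obstacle ranging.

SPEED_OF_SOUND = 343.0  # m/s in room air at ~20 C

def distance_m(echo_delay_s):
    """Distance to an obstacle from the echo round-trip time."""
    return SPEED_OF_SOUND * echo_delay_s / 2.0

def too_close(echo_delay_s, safety_margin_m=0.10):
    # slow down / steer away before bumping into furniture
    return distance_m(echo_delay_s) < safety_margin_m

print(round(distance_m(0.002), 3))   # 2 ms echo → 0.343 m
print(too_close(0.0004))             # 0.4 ms echo → ~6.9 cm → True
```

Because the robot slows down before contact instead of relying on a bumper, it can treat pets and furniture gently, which is exactly the behaviour described above.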

Magnets are everything to Trilobite: it uses magnetic strips for room containment, magnets in the base station that tell Trilobite where home is, and even magnets to hold the dust bin door shut. Electrolux warns that the vacuum may mistake a speaker lying on the floor for its base station since speakers use large magnets in their drivers. Trilobite took no interest in our large floor standing speakers, though.

Strengths: The Trilobite performs fantastically. It has excellent cleaning ability, especially with pet hair, and it cleans nicely on carpet. It can go back to the charging station when the batteries run low. It also has a user-friendly LCD.

Weaknesses: Very expensive. It is a high-end product after all.


Reference:
http://www.everydayrobots.com

NEC's Health and Food Advice Robot


u0303819
Ang Yong Chee

Health and Food Advice Robot

While most of the media attention goes to highly publicized domestic robots like Sony's QRIO and Mitsubishi's Wakamaru, another Japanese giant, NEC, has announced that it has developed a new robot that is capable of tasting food.


Above: NEC's Health and Food Advice robot, Sony's QRIO and Mitsubishi's Wakamaru (from left)

This robot's "taste buds" are a new feature added to the now-common existing ones: patrolling the home with built-in cameras to detect intrusion, recognizing faces and voices to communicate with its owners, providing information, and controlling home appliances.

Officially called the "Health and Food Advice Robot", it is dubbed the world's first partner robot with a sense of taste by its creator, NEC System Technologies. The robot is able to analyze food and its ingredients and to perform food tasting. In other words, it can break down the composition of the food and differentiate among variants of a particular food. For example, the robot can determine the amount of fat in a cheese, and possibly what kind (brand) of cheese it is.

On top of that, the robot can also offer advice to its user if it is given the user's health profile, including how to improve the user's health and eating habits based on the robot's analysis of the user's diet.

Technology behind the “food tasting”

The robot has an infrared sensor mounted on one of its arms. It exploits a property called the "spectral reflection ratio" to determine the composition of the food: varying wavelengths of infrared light are beamed onto the food, and the spectra of the reflected infrared light are analyzed to determine the actual contents of the food.



Once the food's composition in terms of water, protein and other molecule types has been determined by the infrared sensor, the robot can identify the food against its database of food compositions, or "remember" it if it is not yet there.
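One standard way to turn a reflectance spectrum into composition fractions is linear spectral unmixing: model the measured spectrum as a mix of known reference spectra and solve for the proportions by least squares. The reference spectra below are fabricated numbers purely for illustration; NEC has not published its actual method at this level of detail.

```python
# Toy spectral unmixing: recover water/protein/fat fractions from a
# measured 5-wavelength infrared reflectance spectrum.
import numpy as np

# reference reflectance at 5 wavelengths, one row per component (made up)
REFERENCES = np.array([
    [0.9, 0.7, 0.2, 0.1, 0.3],   # water
    [0.2, 0.4, 0.8, 0.6, 0.2],   # protein
    [0.1, 0.2, 0.3, 0.7, 0.9],   # fat
])

def composition(measured):
    """Fractions of each component that best explain the measured spectrum."""
    coeffs, *_ = np.linalg.lstsq(REFERENCES.T, measured, rcond=None)
    coeffs = np.clip(coeffs, 0, None)    # no negative fractions
    return coeffs / coeffs.sum()         # normalise to proportions

# a synthetic sample that is 50% water, 30% protein, 20% fat
sample = 0.5 * REFERENCES[0] + 0.3 * REFERENCES[1] + 0.2 * REFERENCES[2]
print(np.round(composition(sample), 2))  # → [0.5 0.3 0.2]
```

With the composition vector in hand, identifying the food is then a nearest-neighbour lookup in the composition database.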

(The picture at left shows the infrared sensor on its arm.)




References : http://www.necst.co.jp/english/press/20050609/index.htm
http://www.roboticsdaily.com/headline/Health-Food-Advice-Robot.html

Photo sources from : 
NEC :
http://plusd.itmedia.co.jp/lifestyle/articles/0506/09/news070.html
Mitsubishi Wakamaru : 
http://www.mhi.co.jp/kobe/wakamaru/english/about/index.html
Sony QIRO :
http://www.sony.net/SonyInfo/QRIO/top_nf.html



Surveillance Robotics: Using mobile phones to control robots for household security

U0307641 Low Youliang Freddy


In the present day, using mobile phones has become the norm. While people are out at work, they may constantly worry about the condition of their household, be it thieves breaking in or an accidental fire. Combining these two concerns, the idea of a mobile-phone-controlled robot is being explored. In fact, companies such as Fujitsu Laboratories Ltd have developed such a robot, known as MARON-1. This kind of robot can be remotely controlled by mobile phone to operate home electronic devices and monitor home security.

This robot is equipped with a wide range of functions, including telephone, camera, remote control, timer and surveillance equipment. With these features, it is foreseen that MARON-1 could be used for monitoring homes or even offices at night, or for checking up on persons requiring special care and monitoring.

Maron-1 consists of a drive mechanism, a camera that can rotate left, right, up, and down, a programmable remote to control home electronic appliances, and a PHS communication card that, together with specially designed i-appli (*1) software, enables the robot to be operated remotely by mobile phone.
With this special feature to be operated by a mobile phone, the robot can take pictures and relay them to the phone's screen, such that the owner can check conditions at home. In addition to this, the robot is not static. The owner can give precise commands for moving the robot forward, backward or turning in a desired direction. Also, by storing the home's layout in the robot's memory, the owner can give the robot a destination, and it will automatically navigate to that point, avoiding obstacles and maneuvering over door saddles and other surface gradations along the way. Alternatively, a pattern may be established for it to patrol a designated course.
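The stored-floor-plan navigation described above can be sketched as pathfinding on an occupancy grid: with the home's layout as a grid, a breadth-first search finds a route to the destination around obstacles. The map and function names are illustrative, not Fujitsu's actual implementation.

```python
# Sketch of map-based navigation: BFS route planning on a grid of
# 0 = free cell, 1 = obstacle.
from collections import deque

def plan_route(grid, start, goal):
    """Return the shortest list of cells from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])
    parents = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:                       # reconstruct the path
            path = []
            while cell is not None:
                path.append(cell)
                cell = parents[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in parents:
                parents[(nr, nc)] = cell
                queue.append((nr, nc))
    return None  # destination unreachable

floor = [[0, 0, 0],
         [1, 1, 0],   # a wall with a gap on the right
         [0, 0, 0]]
print(plan_route(floor, (0, 0), (2, 0)))
```

A patrol route is then just a cycle of such destinations visited in order.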
Images sent by the Maron-1 can also be used for specifying a destination. The robot's infrared remote control capability can be used to operate appliances such as air conditioners, televisions and VCRs. With today's technology, I believe it is also possible for the robot to control devices using Bluetooth, giving it a wider range of operation.
By positioning the robot one or two meters from a spot the owner would like to monitor (for example, the front hall or a window) and turning it appropriately, MARON-1 is able to detect anyone or anything entering its field of view. If it does detect an intrusion, it can sound an alarm and call a pre-set number.
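A common way to implement this kind of intrusion detection is frame differencing: compare each camera frame against a reference of the empty scene and raise an alarm when enough pixels change. The thresholds below are illustrative assumptions, not MARON-1's.

```python
# Minimal frame-differencing motion detector on greyscale frames
# (lists of rows of pixel intensities, 0-255).

def motion_detected(reference, frame, pixel_thresh=30, area_thresh=0.05):
    """True if more than area_thresh of pixels differ from the reference."""
    changed = sum(
        1 for ref_row, row in zip(reference, frame)
        for ref_px, px in zip(ref_row, row)
        if abs(ref_px - px) > pixel_thresh
    )
    total = len(reference) * len(reference[0])
    return changed / total > area_thresh

empty = [[100] * 4 for _ in range(4)]        # 4x4 view of the empty scene
intruder = [row[:] for row in empty]
intruder[1][1] = intruder[1][2] = 200        # something entered the view

print(motion_detected(empty, empty))     # → False
print(motion_detected(empty, intruder))  # → True
```

The per-pixel threshold suppresses camera noise, and the area threshold suppresses tiny changes like a flickering light, which is how nuisance alarms are kept down.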
The robot can also be scripted to take specific actions at specific times. For example, it can be used as an alarm clock or timer, or it can be programmed to take pictures around the house at pre-set times.
With its built-in PHS capability, the robot can be used as a hands-free telephone. Frequently dialed numbers can be stored in its memory for one-touch dialing. Other commonly performed actions may also be assigned to function buttons.

References:
[1] http://www.fujitsu.com/global/news/pr/archives/month/2003/20030313-03.html
[2] http://www.vimicro.com/english/whatsnews/newes/3/November82002.htm

Robots in Mining

U0206584
Vidhya Ganesan

“In the ten years between 1988 and 1998, 256 miners died and over 64,000 were injured in mining accidents!”
“World metal prices have been falling for decades due to increases in efficiency. If a mine is unable to become more productive, it will go out of business!”

Yes! The vision of a robotic mining industry, science fiction only a few years ago, is poised to become reality in the global mining sector, driven by the twin needs for safety and efficiency.

CSIRO's deputy chief executive for minerals and energy, Dr Bruce Hobbs, says research teams at CSIRO are trialling and developing a range of giant robotic mining devices that will either operate themselves under human supervision or be "driven" by a miner, in both cases from a safe, remote location. "It is all about getting people out of hazardous environments," he says.[1]

Robots will do jobs like laying explosives, going underground after blasting to stabilize a mine roof, or mining in areas where it is impossible for humans to work or even survive. Some existing examples of mining automation include:

· The world's largest "robot", a 3500 tonne coal dragline featuring automated loading and unloading

· A robot device for drilling and bolting mine roofs to stabilize them after blasting

· A pilotless burrowing machine for mining in flooded gravels and sands underground, where human operators cannot go

· A robotic drilling and blasting device for inducing controlled caving.

Robots must demonstrate efficiency gains or cost savings. The biggest robot of them all, the automated dragline swing, has the potential to save the coal mining industry around $280 million a year through a four per cent efficiency gain. Major production trials of this robot were planned for later in 2000.

Unlike their counterparts commonly found in the manufacturing industry, mining robots have to be smart. They need to sense their world, just like humans.

"Mining robots need sensors to measure the three dimensional structure of everything around them. As well as sight, robots must know where they are placed geographically within the minesite in real time and online," says Dr Corke. "CSIRO is developing vision systems for robots using cameras and laser devices to make maps of everything around the machine quickly and accurately, as it moves and works in its ever-changing environment," he says.

Dr Corke insists that the move to robots will not eliminate human miners, but it will change their job description from arduous and hazardous ones to safe and intellectual ones.

The Technology :

Example 1: RecoverBot [2] (used in mine rescue operations) is a one-hundred-and-fifty-pound tethered rectangular unit with two maneuverable arms with grippers, and four wheels supporting an open box frame whose power units, controllers and video cameras are each built with their own individual metal armor. Lowered down the target shaft to prepare a recovery, its telerobotic eyes "see" for the surface controller while the arms lift and drag the body into a second, lowered net. An "aero shell" protects the robot from falling debris as it is lowered from a winch, and is removed when the bottom is reached. RecoverBot then performs its mission, observed from two points of view: the overhead camera that current mine rescue teams use to image deep shafts, and the robot's own video, which serves as the rescuers' second view. When the mission is complete, the robot is raised to the surface after the victim, and the overhead camera is withdrawn.

Example 2: Groundhog [3], a 1,600-pound mine-mapping robot created by graduate students in Carnegie Mellon's Mobile Robot Development class, made a successful trial run into an abandoned coal mine near Burgettstown, Pa. The four-wheeled, ATV-sized robot used laser rangefinders to create an accurate map of about 100 feet of the mine, which had been flooded since the 1920s.

To fulfill its missions, the robot needs perception technology to build maps from sensor data, and it must be able to operate autonomously, making decisions about where to go, how to get there and, more importantly, how to return. Locomotion technology is vital because of the unevenness of floors in abandoned mines. The robot must also provide computer interfaces enabling people to view the results of its explorations and use the maps it develops. The robot incorporates a key technology developed at Carnegie Mellon called Simultaneous Localization and Mapping (SLAM), which enables robots to create maps in real time as they explore an area for the first time. The technology, developed by Associate Professor Sebastian Thrun of the Center for Automated Learning and Discovery, can be applied both indoors and out.
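The mapping half of what Groundhog does can be sketched in a few lines: convert each laser range reading, taken from a known pose, into a cell of an occupancy grid. Full SLAM also estimates the pose itself; that harder half is omitted here, and the grid size and readings are illustrative.

```python
# Sketch of occupancy-grid mapping from laser range readings taken at a
# known robot pose. Real SLAM estimates the pose jointly with the map.
import math

def update_grid(grid, pose, ranges, cell=1.0):
    """Mark the cell hit by each laser beam as occupied.

    pose   -- (x, y, heading) of the robot, heading in radians
    ranges -- list of (beam_angle, distance) relative to the heading
    """
    x, y, heading = pose
    for angle, dist in ranges:
        hx = x + dist * math.cos(heading + angle)   # beam endpoint
        hy = y + dist * math.sin(heading + angle)
        col, row = int(hx / cell), int(hy / cell)
        if 0 <= row < len(grid) and 0 <= col < len(grid[0]):
            grid[row][col] = 1    # obstacle detected in this cell

grid = [[0] * 5 for _ in range(5)]
# robot at (0.5, 0.5) facing +x; walls seen straight ahead and to the left
update_grid(grid, (0.5, 0.5, 0.0), [(0.0, 4.0), (math.pi / 2, 4.0)])
print(grid[0][4], grid[4][0])  # → 1 1
```

Probabilistic versions accumulate log-odds per cell instead of writing a hard 1, which makes the map robust to noisy range readings.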

"Mining can be a hazardous job. Getting robots to do the job will make mining safer and ensure the long-term viability of the industry".

References:

[1] http://www.spacedaily.com/news/robot-00g.html

[2] http://www.usmra.com/MOLEUltraLight/MOLEUltrLight.htm

[3] http://www.cmu.edu/cmnews/extra/2002/021031_groundhog.html

U.S. Air Force testing robots as security guards

u0307999 ZHAI NING

The U.S. Air Force is trying out new robotic security guards to take over human guard duties, in order to prevent potential loss of life and improve efficiency.

Two robots are on trial at Eglin Air Force Base, Florida.

mini-robot deployed

One robot being tested is a Jeep-size, four-wheeled vehicle that has been equipped with radar, television cameras and an infrared scan to detect people, vehicles and other objects. It carries a breadbox-sized mini-robot that can be launched to search under vehicles, inside buildings and other small places.

Another robot is fashioned from an off-the-shelf, four-wheeled all-terrain vehicle, giving it added versatility because a human also can ride it like a normal ATV. Both vehicles can be remotely operated from laptop computers and can be equipped with remotely fired weapons, like an M-16 rifle or pepper spray.


These robots can be programmed to patrol and, if they detect anything suspicious, sound a loud warning to potential threats; interestingly, they can question intruders in different languages.

But the Air Force still keeps a human nearby at all times, because the military does not want to give machines complete discretion.

This is a very practical example of the state of security robotics. But it also poses questions: how much can we trust a robot to secure us, and what social issues come with deploying such robots?

Why do we need Homeland Robotic Security Systems?

u0307999 ZHAI NING

Do you still remember the scenes of the 9/11 attacks? It was an unexpected attack which demonstrated how a small group of people can wreak huge destruction on the U.S., once thought invulnerable to large-scale terrorist attacks. These events unveiled the limitless possibilities for more to come if we do not secure ourselves well.

The introduction of weapons of mass destruction furthers the ability of a small group with relatively limited military assets to wreak havoc through asymmetrical warfare or terror. The principal defense against surprise attacks of this or any other nature is advance warning, which inherently depends upon the timely and accurate collection and assessment of appropriate information.

This is where robotic security systems come in. We need advanced detection schemes to detect what human beings cannot, and advanced assessment technology to identify different scenarios. Because the parameters keep changing, a homeland security robotic system must be very adaptive and able to learn from the past. For this new kind of system, it is therefore relatively hard to achieve results as satisfactory as those of other robots.

Surveillance Robotics: Using colors to analyse

u0307999 ZHAI NING

Among the applications foreseen for service mobile robots, surveillance robots (i.e., robots designed to replace human security guards in making rounds) are becoming more and more popular, as witnessed by the many systems commercially available and the growing interest in the research community. In this blog post, I concentrate on one question:

How can a surveillance robot detect unexpected changes in the environment?

Well, I found one method in a paper by Mattia from the University of Genova, Italy, which details the following mechanism:

The robot "looks at" the environment through a TV camera; it then compares what "it sees" at a specific moment with what "it should see" at that same location. In particular, an approach to image comparison is proposed that finds color clusters in the color histograms of the images to be compared: by analyzing the color clusters in the two images, the system detects similarities or differences between them and deduces whether something in the scene has changed.
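The histogram idea can be sketched quickly. A real system clusters the histograms before comparing them, as the paper describes; to keep the illustration short, the sketch below uses plain bin-wise histogram intersection, and all thresholds are invented.

```python
# Rough sketch of change detection via colour histograms: build a
# quantised histogram of the reference view and the current view, and
# flag a change when their intersection drops.

def colour_histogram(pixels, bins=4):
    """Normalised histogram over quantised (r, g, b) pixels (0-255)."""
    hist = {}
    for r, g, b in pixels:
        key = (r * bins // 256, g * bins // 256, b * bins // 256)
        hist[key] = hist.get(key, 0) + 1
    n = len(pixels)
    return {k: v / n for k, v in hist.items()}

def scene_changed(ref_hist, cur_hist, threshold=0.8):
    """Histogram intersection below threshold means something changed."""
    overlap = sum(min(ref_hist.get(k, 0.0), cur_hist.get(k, 0.0))
                  for k in set(ref_hist) | set(cur_hist))
    return overlap < threshold

reference = [(200, 40, 40)] * 90 + [(40, 200, 40)] * 10   # mostly red wall
intruder  = [(200, 40, 40)] * 50 + [(30, 30, 220)] * 50   # big blue object

ref_h = colour_histogram(reference)
print(scene_changed(ref_h, colour_histogram(reference)))  # → False
print(scene_changed(ref_h, colour_histogram(intruder)))   # → True
```

Histograms are attractive here because they are cheap and largely insensitive to small camera misalignments between rounds, unlike pixel-wise differencing.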

That sounds like a computer vision problem, and it certainly is. According to another report from the same university, Paolo proposes "ad hoc" algorithms for color cluster comparison. The details are covered in the Proceedings of the 2003 IEEE International Symposium on Computational Intelligence in Robotics and Automation, July 16-20, 2003, Kobe, Japan.

I want to comment on this simple idea: it is only a simplification of the way humans unconsciously detect change. But it helps a lot in robotics. Could the robotics community do better by consulting more life-sciences research? It may be a good place to start thinking.

Rock-a-bye baby robot

Yan Meixian U0205044

Making robots look like humans or animals is no longer surprising. Living things, especially human beings, have such complex structure that reproducing their various aspects (emotion, facial expressions, flexibility of physical motion and so on) is a never-ending challenge. So it seems to me that, most of the time, the robots that make the news headlines are those which mimic human beings in the most life-like manner. A very good example is Asimo.

I guess one reason people choose to produce life-like robots is that they would like the robots to be personal companions. Facing a robot which shows some form of emotion seems more pleasant than facing a lifeless, hard machine, especially if that robot is meant to be a toy.

iRobot Corporation produced a super-realistic interactive life-sized human baby toy, known as My Real Baby (MRB) with the purpose of giving young children a very stimulating play experience.

The MRB comes with high-tech animatronics and emotional-response software, so each doll can change its face in numerous ways, allowing it to convey its emotions to the child playing with it. The first generation of the MRB, known as IT, could already shake hands with people, smile if anyone took its picture, and get frightened if people got too close.
The second generation, BIT, used behavioral language and could also sense whether it was upside down and express that it did not like the feeling of being inverted.


The MRB features a range of real and virtual sensors, about which iRobot did not give any details. The previous generation used 5 electric motors and had orientation sensors, reed switches, a microphone and a light sensor, but the MRB does not seem to have all of these. The doll changes its emotion using the Behavior Language Operating System (developed by Rodney Brooks, of MIT AI Lab and Cog fame). The behavior model mimics that of a real baby very well: if it is not fed, it gets hungry and cries; if it is fed, burped and rocked to sleep, it stops crying.
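The hungry-cry-feed cycle described above is essentially a small state machine. The sketch below invents its own states and thresholds for illustration; it is not the Behavior Language Operating System.

```python
# Illustrative state sketch of the doll's behaviour: hunger grows over
# time, crying starts when the doll is hungry, feeding settles it again.

class BabyDoll:
    def __init__(self):
        self.hunger = 0

    def tick(self):          # time passing makes the doll hungrier
        self.hunger += 1

    def feed(self):          # feeding resets hunger
        self.hunger = 0

    @property
    def mood(self):
        return "crying" if self.hunger >= 3 else "content"

baby = BabyDoll()
for _ in range(3):
    baby.tick()
print(baby.mood)   # → crying
baby.feed()
print(baby.mood)   # → content
```

Behavior-based systems like Brooks's compose many such small state machines in parallel, rather than one monolithic loop.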
Here is more of the amazing stuff the MRB can do. It can change its facial expression rather adeptly, moving its lips and cheeks and raising its eyebrows. It can also blink and suck its thumb and bottle. The doll has a large collection of different baby noises and words and can combine them at random. What I find more amazing is that the longer you play with the doll, the more it starts to piece sentences together coherently, just like a real baby.

A reviewer of the doll played around with it. The doll could giggle and gurgle and respond well to tickling and being burped on its back. If left untouched, the baby would sleep till it is given a gentle nudge to wake up. The reviewer then tried to be “mean” and make the doll cry, but he did not succeed since the manual stated that the MRB does not respond to aggressive behavior.




It seems quite an ideal toy for young children since it does not encourage violence. But the idea of having a life-like baby may have a bigger scope than just being toys. If we let wild ideas run, maybe one day, the robotic baby may be so realistic that couples who can’t have kids may choose to adopt a robotic baby just to get an idea of what parenthood is about.
Article referred from:

Security Robots Aren't Science Fiction Anymore

U0205183 Teo Yinling

Security and surveillance robots have evolved much since they were introduced in the early 80s. As technology improves over the years, new security robots can do much more, and some are even replacing humans. Some are no longer just security robots, either.

The world's first autonomous security robot, ROBART I, was developed at the Naval Postgraduate School. It had collision avoidance sensors, but this research platform had no sense of absolute location within its indoor operating environment, and was thus strictly limited to navigating along preprogrammed patrol routes defined by the relative locations of individual rooms, periodically returning to a recharging station by homing on an optical beacon. From a security perspective, the platform could only detect suspected intruders, with no subsequent intelligent assessment capability to filter out nuisance alarms.


The second-generation follow-on to ROBART I was ROBART II, which also operated indoors, incorporating a multiprocessor architecture and an augmented sensor suite to support enhanced navigation and intelligent security assessment. The addition of an absolute world model allowed ROBART II to:
(1) determine its location in world coordinates;
(2) create a map of detected obstacles; and
(3) better perform multisensor fusion on the inputs from its suite of security and environmental sensors . This last feature facilitated the implementation of a sophisticated threat assessment algorithm that significantly increased the probability of detection while virtually eliminating nuisance alarms.
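The idea behind fusing multiple sensors to suppress nuisance alarms can be sketched as a simple weighted-evidence scheme. This is a hypothetical illustration, not ROBART II's actual algorithm: the sensor names, weights, and threshold are all assumptions for the sake of the example.

```python
# Hypothetical multisensor threat assessment: each triggered sensor
# contributes a weighted confidence, and an alarm is raised only when the
# combined score crosses a threshold -- so a single noisy sensor (a
# nuisance trigger) is not enough on its own.

SENSOR_WEIGHTS = {          # assumed weights, not ROBART II's real values
    "pir_motion": 0.4,      # passive infrared motion detector
    "microwave_motion": 0.4,
    "acoustic": 0.2,
    "video_motion": 0.5,
}

ALARM_THRESHOLD = 0.7

def threat_score(triggered_sensors):
    """Combine triggered sensors into one confidence score, capped at 1.0."""
    score = sum(SENSOR_WEIGHTS.get(s, 0.0) for s in triggered_sensors)
    return min(score, 1.0)

def assess(triggered_sensors):
    """Alarm only when corroborating evidence exceeds the threshold."""
    return threat_score(triggered_sensors) >= ALARM_THRESHOLD

# A lone acoustic trigger is treated as a nuisance; motion detected on two
# independent sensor types is treated as a probable intruder.
assert not assess(["acoustic"])
assert assess(["pir_motion", "microwave_motion"])
```

The key design point is corroboration: no single sensor type can raise the alarm by itself, which is one way a fusion scheme trades a slight detection delay for far fewer false alerts.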

In 2003, Wakamaru, an experimental Linux-powered humanoid robot, was developed by Japan's Mitsubishi Heavy Industries. The 3.3-foot-tall, 60-pound robot is described as the first human-size robot capable of providing companionship or functioning as a caretaker and house sitter. The battery-operated robot moves about on wheels and recharges itself when its batteries run low.

Wakamaru has an internal software platform that was developed using MontaVista Software's embedded Linux distribution and tool suite. Its project manager attributed the choice of embedded operating system to its "sophisticated software base" and "superior networking capabilities," which enabled the team to "focus on the complex programming that makes this new robot human-like." Additionally, the robust operating system also played an important role in enabling Wakamaru to service a household 24 hours a day.
Some of Wakamaru's main differences from other security robots are:
(1) It is friendly to people and useful for life at home.
(2) It lives with family members, speaks spontaneously according to their needs, and has its own role in the family.
(3) It offers natural, enriched communication suited to everyday situations: it recognizes approximately 10,000 words required for daily life, raises topics appropriate to the scene, and communicates in a friendly manner using gestures.
(4) It acts autonomously according to its own rhythm of life: it moves in accordance with time and purpose, automatically charges its batteries, and lives alongside family members.
Wakamaru was introduced into the Japanese market beginning in 2004, priced at about 1 million yen, which is approximately US $14,250.

The latest security robot is one from Hitachi: a prototype on wheels that stands 22 inches tall.


Hitachi's robot has a periscope camera that protrudes from its head; though it appears awkward, it can watch for suspicious changes in its surroundings and send photos to a guard. The camera can swivel, so the robot does not have to do an about-face to look around. The prototype, which essentially has a laptop on board for a brain, can figure out the shortest path to a spot. When it gets there, if something is missing or has been moved, it can send images back to a security guard.
The "Star Wars"-looking robot still has problems with battery life and with recognizing objects smaller than a soda can, but the Japanese electronics maker believes the roving robot, which can figure out the best route to a spot on its own, is better than the stationary cameras now common in security.
Universities and even Honda Motor Corp. have developed robots that can recognize their location and moving objects, but many such robots require marks on the floor for their cameras to pick up. Another way robots figure out where they are is via the satellite-based Global Positioning System (GPS).
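The route planning described above, finding the shortest path to a spot on a known map, can be sketched with a standard grid search. This is only an illustration under assumed conditions (a small hand-made grid map with '#' marking obstacles); Hitachi's actual planner has not been published.

```python
from collections import deque

def shortest_path(grid, start, goal):
    """Breadth-first search over a 2D grid map. Returns the path as a list
    of (row, col) cells from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    parent = {start: None}          # also serves as the visited set
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            # Walk the parent links back to the start, then reverse.
            path = []
            while cell is not None:
                path.append(cell)
                cell = parent[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] != '#' and (nr, nc) not in parent):
                parent[(nr, nc)] = cell
                queue.append((nr, nc))
    return None

# A toy patrol area: the robot routes around the obstacle block.
patrol_map = [
    "....",
    ".##.",
    "....",
]
path = shortest_path(patrol_map, (0, 0), (2, 3))
```

Because breadth-first search explores cells in order of distance, the first time it reaches the goal it has found a fewest-steps route; a real robot would layer odometry and obstacle sensing on top of a planner like this.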
At present, Hitachi has no plans to commercialize its prototype security robot, but that could change, and the future is probably not too far away.

References:
http://home.kyodo.co.jp/modules/fstStory/index.php?storyid=233998
http://www.spawar.navy.mil/robots/