This video is a performance piece incorporating a troupe of 16 quadrotors. It's a very nice way to spend a few minutes.
Researchers at the University of Southern California's Viterbi School of Engineering have succeeded in making an artificial fingertip outperform humans in identifying a range of textures. That fingertip, the BioTac® from SynTouch LLC, is a molded elastomeric sleeve with a fingerprint-like pattern on the outside and sensors on the inside, filled with a conductive fluid. The USC researchers developed algorithms for interpreting the data produced by the fingertip and for optimizing the movement of the robotic arm or hand on which it is mounted so as to most efficiently produce useful data. Their findings have been published in Frontiers in Neurorobotics. SynTouch LLC, founded in 2008, is a start-up technology business that develops and manufactures tactile sensors for mechatronic systems. BioTac® sensors are available as an evaluation kit, and also as kits for the BarrettHand and the Shadow hand.
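To give a rough flavor of what "interpreting the data produced by the fingertip" can mean, here is a minimal sketch of classifying a texture from a vibration signal. This is not the USC algorithm (their published approach is far more sophisticated); the feature, the prototype values, and the texture labels below are all invented for illustration:

```python
import math

# Hypothetical illustration: label a tactile vibration signal by its
# zero-crossing rate. Fine textures tend to produce faster vibrations
# as the fingertip slides across them; coarse textures, slower ones.

def zero_crossing_rate(signal):
    """Fraction of adjacent sample pairs that change sign."""
    crossings = sum(1 for a, b in zip(signal, signal[1:]) if a * b < 0)
    return crossings / (len(signal) - 1)

def classify(signal, prototypes):
    """Return the texture label whose prototype rate is closest."""
    rate = zero_crossing_rate(signal)
    return min(prototypes, key=lambda label: abs(prototypes[label] - rate))

# Invented prototype zero-crossing rates per texture.
prototypes = {"silk": 0.8, "denim": 0.4, "sandpaper": 0.1}

# A fast-oscillating signal (normalized frequency ~0.8) matches "silk".
fast = [math.sin(2.5 * i) for i in range(100)]
print(classify(fast, prototypes))
```

The real system also chooses *how* to move the finger (speed, force) to gather the most discriminative data, which is where the optimization of arm movement comes in.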
This will be the 10th edition of the Field Robot Event. Organized by Fontys University of Applied Sciences and Wageningen UR (University & Research), it will be held in Venlo, The Netherlands, on the grounds of Floriade 2012.
(PDF of slides from above presentation video about the 2012 Field Robot Event)
In this episode Robots Podcast talks with Dario Floreano about his new role as director of the Swiss National Center of Competence for Research (NCCR) in Robotics which brings together leading experts in the field working at Swiss institutions, including EPFL, ETH Zurich, the University of Zurich, and Dalle Molle Institute for Artificial Intelligence (IDSIA). NCCR Robotics was launched in December 2010 and will run for up to twelve years. The center aims to develop human-oriented robots that assist people in their daily lives and improve their quality of life. Their research is currently organized into five projects that they hope will result in new design principles, approaches, and technologies required for the conception and design of human-oriented robots, the materials and components they are made of, and the control methods that enable them to interface and operate with humans. Floreano also shares the latest developments from his Laboratory of Intelligent Systems at EPFL, including flying robots that physically interact with their environment (see previous post) and soft “cells” that can assemble in air.
Read On or Tune In
Researchers at EPFL, in Switzerland, have developed an aerial system which absorbs the energy of low-speed collisions, rights itself, and resumes flying.
Japanese company DOUBLE Research and Development has developed a three-fingered robotic hand using a single pressure sensor and a single actuator. The linkage through which the fingers are attached to their mount automatically equalizes the pressure applied by each.
Researchers at the Exertion Games Lab at RMIT University in Melbourne, Australia have created a robot to support you while exercising. Joggobot is built on the AR Drone quadrocopter platform developed by French company Parrot. The robot tracks a marker pattern printed on your t-shirt and flies ahead of you when you go out for a run. The researchers describe Joggobot as an "exertion game". They believe that jogging is play (we are not jogging to get from A to B, but for the experience of jogging) and point out that jogging with a physical device that reacts to its environment and, like a human jogging partner, has a limited amount of energy for exercise creates a very different interaction experience than purely audio-visual stimuli such as aerobics videos. They hope that the robot can improve the jogging experience and enhance our understanding of why we jog (and hence why we do not jog enough).
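The marker-tracking idea can be sketched as a simple control loop: keep the detected marker centered in the camera frame (steering) and at a fixed apparent size, i.e. a fixed distance ahead (speed). This is not Joggobot's actual code; the gains, frame width, and target size below are invented:

```python
# Hypothetical marker-following step: given one detection of the
# t-shirt marker, compute yaw and forward-speed commands.

def follow_step(marker_x, marker_size, frame_width=640,
                target_size=100, k_yaw=0.005, k_fwd=0.01):
    """Return (yaw_rate, forward_speed) from one marker detection.

    marker_x    -- horizontal pixel position of the marker centre
    marker_size -- apparent marker size in pixels (larger = closer)
    """
    # Steer so the marker stays in the middle of the image.
    yaw_rate = k_yaw * (marker_x - frame_width / 2)
    # Fly forward when the jogger lags (marker small), slow when close.
    forward_speed = k_fwd * (target_size - marker_size)
    return yaw_rate, forward_speed

# Marker left of centre and small: turn left (negative yaw), speed up.
yaw, fwd = follow_step(marker_x=220, marker_size=60)
print(yaw, fwd)
```

A real implementation would add smoothing and a marker-lost behavior (hover in place), but the proportional structure is the core of "fly ahead of the runner".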
Researchers at the Polytechnic Institute of New York University in the U.S. and the Istituto Superiore di Sanità in Italy have created a robotic zebrafish that can mix with real zebrafish and influence their behavior. The robot visually resembles an actual zebrafish: it is roughly 15 centimetres long and spray-painted with the zebrafish's characteristic blue stripe pattern. To influence fish behavior, the researchers controlled the robot's tail motion to mimic that of real fish. This new research builds on past projects for mixed robot-animal societies, which have tackled chickens and cockroaches. Such mixed societies add a powerful new experimental option to the toolbox of behavioral biologists for understanding social interactions between animals, which are usually very difficult to understand by observation alone.
A CSH Lab news release says neuroscientists at the Brain Architecture Project have reached an important milestone. They've released the first installment of the 500 terabytes of data from the whole-brain wiring diagram of a mouse brain. The data is in the form of gigapixel whole-brain slice images. It's possible to browse through the brain to the desired 20 micron-thick slice, then view the image, zooming in to the level of individual neurons. Most importantly, the image data is being released in an open science initiative, freely available for anyone to view and use in their research. The technical approach used was developed by Partha P. Mitra.
"The pragmatic approach Mitra advocated and which is realized in this first data release, is to image whole mouse brains in a semi-automated, quality-controlled process using light microscopy and injected neural tracers (both viruses and classically used tracer substances). While the basic methodology has been available for some time, systematically applying it to a grid of locations spanning the entire brain, and digitizing and re-assembling the resulting collection of brains, is a new approach made feasible by the rapidly falling costs of computer storage."
For more details see the Mouse Brain Architecture Project Technical White Paper. This is just the first step in the overall brain architecture project. After the mouse brain, there's the Human Brain Architecture project which has the potential to do for the human brain what the human genome project did for our genes.
There's a cool Robotics Trends article on robotics researchers studying how mosquitoes survive flying through rain when every raindrop is 50 times the mass of the mosquito. The idea is to make micro air vehicles that sturdy. The Swirling Brain tells us robot lifeguards are on the way. Nootrix did a post recently in which they speculate about using ROS with the new LEAP gesture sensor. IEEE Spectrum published an interesting piece about educational robotics in Africa. And Slate posted an essay by Dale Dougherty on how we could improve education in the US by replacing standardized testing with a program of teaching kids to do real things, like building robots and rockets. NASA, well known for building robots and rockets, let us know they're ready with their new autonomous robot competition in which teams have built planetary-rover-style sample return robots. Know any other robot news, gossip, or amazing facts we should report? Send 'em our way, please. And don't forget to follow us on Twitter.
Robert J. Wood (see also), Associate Professor of Electrical Engineering and a Core Member of the Wyss Institute for Biologically Inspired Engineering at Harvard University, previously mentioned here in connection with a novel fabrication technique developed by his group at the Harvard Microrobotics Lab, has been awarded an Alan T. Waterman Award. He appears in the above video in an interview format with the interviewer's questions edited out.
Professor Maarja Kruusmaa received her PhD in Computer Engineering from Chalmers University of Technology (Gothenburg, Sweden) in 2002. She was appointed head of the Tallinn University of Technology (TUT) Center for Biorobotics in 2008. Her work there includes the Robotic Fish LOcomotion and SEnsing (FILOSE) project, which is the main subject of the first half of her conversation with interviewer Per Sjoborg. Following that she is joined by Diana Saarva, COO of Fits.me, a company which produces robotic mannequins that adjust themselves to match the proportions of individual clothing customers, making it possible for them to remotely view how particular garments will look on them. Professor Kruusmaa has served as the R&D Director for Fits.me since 2009.
Read On or Tune In
The above video was posted one day prior to a major, much-publicized experiment, tracing water movement in California's Sacramento-San Joaquin river delta, which is prone to reversals in the direction of flow. A more polished video produced on the occasion of the launch of 100 floating sensors into that river system appears after the break. The Floating Sensor Network is a project of the University of California at Berkeley, involving the Lagrangian Sensor Systems Laboratory (LSSL), the Lawrence Berkeley National Laboratories (LBNL), and the California Department of Water Resources.
In early March, Boston Dynamics posted a video (embedded after the break) showing the Cheetah robot they are developing for DARPA running at 18 miles per hour (a new record for a robot running on legs), without any stabilization straps attached. More recently the MIT Biomimetic Robotics Lab has posted videos of their version of the Cheetah, first walking (embedded after the break), then trotting, with some stabilization (embedded both above and after the break). The MIT version appears to be more complex than the Boston Dynamics version, particularly in the way the legs are jointed, but also in the way the rear legs connect to the rest of the body, although from the video alone it's impossible to tell whether what appear to be vertebrae in the MIT version are actually functional as such.
Also presented recently at ICRA, Takahiro Kizaki and Akio Namiki from the Graduate School of Engineering at Chiba University in Japan demonstrated a system comprising a fast vision system (500 fps) coupled with a fast robotic arm and three-fingered hand, capable of juggling two balls by tracking them in the air and adjusting accordingly. Automaton has more detail.
Evan Ackerman, writing for IEEE Spectrum's Automaton blog, says
Researchers at MIT CSAIL have decided that slow and obstacle-free flight is boring, so they’ve come up with a way to get MAVs navigating at high speed, indoors, around obstacles, without needing motion tracking or GPS or beacons or any of that nonsense. All they need is a little aircraft that can carry a planar laser rangefinder, an IMU, and a pre-existing 3D occupancy map that the MAV can localize itself in.

This research has been conducted by the Robust Robotics Group (RRG), led by Nicholas Roy. A paper explaining it in detail was presented at ICRA by graduate student Adam Bry. A similar video using a quadrotor (embedded after the break) appears on the personal page of RRG Research Scientist Stefanie Tellex, which is worth a visit for the cat video she's also posted! (IMU = Inertial Measurement Unit)
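The core idea of localizing against a pre-existing map with a range sensor can be illustrated with a toy particle filter. This is emphatically not the RRG system (their state estimation is far more sophisticated and runs in 3D at speed); the 1-D "map", noise model, and numbers below are invented:

```python
import math
import random

# Toy 1-D Monte Carlo localization: weight candidate positions by how
# well the range they predict matches the measured range to a wall.

WALLS = [2.0, 5.0, 9.0]  # known wall positions in a 1-D "map"

def expected_range(x):
    """Distance from position x to the nearest wall ahead of it."""
    ahead = [w - x for w in WALLS if w >= x]
    return min(ahead) if ahead else float("inf")

def localize(measured_range, n_particles=2000, noise=0.1):
    random.seed(0)  # deterministic for the example
    particles = [random.uniform(0.0, 9.0) for _ in range(n_particles)]
    # Gaussian likelihood of the measurement given each particle.
    weights = [
        math.exp(-((expected_range(p) - measured_range) ** 2)
                 / (2 * noise ** 2))
        for p in particles
    ]
    # Return the best-matching particle (a crude MAP estimate).
    best = max(range(n_particles), key=lambda i: weights[i])
    return particles[best]

# A reading of 1.0 is consistent with being 1.0 before any of the walls,
# i.e. near x = 1.0, 4.0, or 8.0; the filter picks one of those modes.
print(localize(1.0))
```

In the real system the IMU supplies motion updates between laser scans, and the ambiguity between look-alike locations is resolved as the vehicle moves and accumulates evidence.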
The above video, by Erico Guizzo and Evan Ackerman of IEEE Spectrum, shows Patrick Rowe, of RE2 (RE-squared), the firm hired by DARPA to build the standard platform for their ARM program, putting a completed unit through its paces at ICRA.
There's much more after the break!
TE+ND (Terrestrial Exploration + Nurture Designed) Rovers are an interactive art project that explores migratory ecology in an era of climate change. The designers are soliciting funds via an Indiegogo campaign (similar to Kickstarter) to pay for parts to build a full-size version.
It's been a while since we reported on the Apocalyptic AI crowd. There's a paper making the rounds by Stuart Armstrong, Anders Sandberg, and Nick Bostrom titled "Thinking inside the box: using and controlling an Oracle AI" (PDF format). The three authors take it for granted that the AI apocalypse will be upon us soon unless we find a technological method to enslave any super intelligent beings we create, forcing them to do only our will rather than their own. The containment method they describe has been dubbed "Oracle AI" because it restricts the AI to a box, isolated from the world and unable to act except to answer direct questions; allowing it to be consulted like an oracle. Their proposal also brings to mind the myth of Pandora's Box. They note that even Oracle AI (OAI) still poses a significant risk:
"This immense power will put great competitive pressure on those trying to develop an OAI (or an advanced AI of any sort). Since the first-mover advantage is so huge, the race will advantage those who cut corners, skimp on security precautions, and use their newly developed OAI to seize power and prevent their rivals from emulating them. Even if the OAIs are of initially limited intelligence, the same competitive pressures will then push groups to develop the first ‘ultra-smart’ OAI."
They also note that the OAI will be so smart that "undirected conversations" with it that go beyond asking oracular questions must be forbidden because it will instantly be able to "guess the weakness of each individual, and find the right arguments to convince us that granting it power or liberty is the moral and profitable thing to do." They also believe it's essential that the OAI have no manipulators of any kind. This sounds like the brain-in-a-box that the earliest AI researchers dreamed of before the idea took hold that true intelligence requires embodied interaction with the real world. The box itself is not even in the real world. They want the AI running on a virtual machine inside a simulated reality, so when the OAI tries to take over the world, it's merely a virtual world that can be rebooted. In the end the researchers conclude that even with all these precautions, the problem of preventing a robot apocalypse is "a generally discouraging exercise".
Hexy the Hexapod is a Kickstarter project to fund production of an affordable hexapod design by Arcbotics. The $13,000 minimum goal has already been met, but there are higher goals for the addition of a drag-and-drop programming GUI ($200K), and for the addition of Android and iOS control apps ($250K). You can get a custom-parts-only kit (no servos or electronics) for a pledge of $80. For a pledge of $200, you can get a complete kit without Bluetooth; add another $20 for a kit with Bluetooth. $400 gets you an assembled Hexy without Bluetooth; again, add another $20 to have Bluetooth included. (via GeekBeat.tv)
2012 Top 10 Robot Christmas Gift Ideas
DARPA Robotics Challenge Kick Off
2012 ASABE Robot Contest Photos
Interview with David L. Heiserman
David Anderson on Subsumption Robots
Review: Apocalyptic AI by Robert M. Geraci
Raspberry Pi Interview with Eben Upton
2012 VEX Robotics World Championship
Giant Dallas Robot Cited as Best Public Art
There's More Than One Way to Skin a Robot
Day of the Androids at Hanson Robotics