This is the Pyrosphere shown at Burning Man 2010. I think it is really cool because I like light and fire that are programmed to the music. The Pyrosphere is thirty-seven feet tall and has ninety-one flamethrowers, so it is quite big. And best of all, it’s controlled by an Arduino.
Here is a movie of what it can do:
The photos are from the site Nexus Burning Man 2010 Photos; from there you can find another page showing how they built it. So go there and get some inspiration.
Here is a fun way for people to interact with robots: Tweenbots!
It’s a project by Kacie Kinzer.
She created a number of small robots, put flags on them with destinations, and then sent them down the street in different directions, relying on people on the streets to help them reach their goals.
Apparently, this worked pretty well! Many of the Tweenbots actually reached their intended destinations. Not even one of the bots was damaged or lost, which I think is incredible given this was in New York!
Read more and watch the clip here!
Tobias Johansson, Group G
Everybody knows the concept of replacing parts of the human body with mechanical ones from sci-fi movies. These techniques may, however, be closer to public release than you think.
Dean Kamen, inventor of the Segway, has been working on a prosthetic arm for some time. The arm is special in that it is controlled by the same nerves that a normal arm is “controlled” with. This is done by relocating nerves from the original arm to a chest muscle of the patient, giving the patient the ability to control this muscle via the same “inputs” as if it were the arm. Sensors are then attached to this muscle and connected to the robotic arm.
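The control chain described above (nerve signal → muscle → sensor → arm command) can be sketched very roughly in code. This is purely illustrative and not how Kamen's arm actually works: the class name, window size, and thresholds are all made-up assumptions for a simple smoothed-signal-to-command mapping.

```cpp
#include <cstddef>
#include <deque>
#include <numeric>
#include <string>

// Illustrative only: smooth a raw muscle-sensor (EMG-like) signal and map
// its level to a simple open/close command for a prosthetic hand.
class EmgController {
public:
    explicit EmgController(std::size_t window = 8) : window_(window) {}

    // Feed one raw sample (e.g. from an ADC, scaled to 0..1);
    // returns the current command for the hand.
    std::string update(double sample) {
        samples_.push_back(sample);
        if (samples_.size() > window_) samples_.pop_front();
        double mean = std::accumulate(samples_.begin(), samples_.end(), 0.0)
                      / samples_.size();
        // Made-up thresholds: a strong contraction closes the hand,
        // a weak signal opens it, anything in between holds position.
        if (mean > 0.7) return "close";
        if (mean < 0.3) return "open";
        return "hold";
    }

private:
    std::size_t window_;
    std::deque<double> samples_;
};
```

The moving average stands in for the real signal processing, which is far more involved; the point is just that the muscle acts as a proxy controller for the missing limb.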
It is easy to find out more about it on the web.
Chris Harrison and Scott E. Hudson have presented an approach that tackles the lack of tactile elements in touchscreens while preserving their dynamic qualities, in their paper Dynamically Changeable Physical Buttons on a Visual Display.
A drawback of touchscreens is that you have to focus your visual attention on them when it may be needed elsewhere, e.g. on the road. Tactile objects, such as buttons, require less attention, but they restrict the user interface since they cannot be exchanged as easily as elements on a touchscreen. The authors’ approach is to combine the good features of both by “…producing pneumatically actuated, dynamic, physical buttons on a visual display.”
I think this would be both a fun and useful way to interact with displays, as you sometimes miss the tactile feel when interacting with touchscreens.
Chris Harrison and Scott E. Hudson, “Providing Dynamically Changeable Physical Buttons on a Visual Display,” Proceedings of the 27th International Conference on Human Factors in Computing Systems (CHI 2009), April 4-9, 2009, Boston, MA, USA.
/Gustav Börman, Group H
Control of precise aggressive maneuvers with an autonomous quadrotor helicopter. This is a small autonomous Unmanned Aerial Vehicle (UAV). Demonstrations of flips, flight through windows, and quadrotor perching are shown. Work done at the GRASP Lab, University of Pennsylvania
Quote from frankerssj: “We designed a system that is capable of autonomous navigation with real-time performance on a mobile processor using only onboard sensors. Specifically, we address multi-floor mapping with loop closure, localization, planning, and autonomous control, including adaptation to aerodynamic effects during traversal through spaces with low vertical clearance or strong external disturbances. All of the computation is done onboard the 1.6 GHz Intel Atom processor and uses ROS for interprocess communication. Human interaction is limited to providing high-level goals to the robot.”
Samira Afshoon Group F
I watched this lecture a little more than a year ago (ages ago in internet time). It got me going in a “the future is now” kind of way. In the lecture, Pattie Maes (MIT Media Lab) introduces SixthSense, an attempt to add a data layer on top of the perceived world. The display part isn’t as cyberpunk as I would like, but the concept is built with tech available today, unlike my cyberpunk daydreams (contact-lens displays).
Can Arduino be any cooler?
The above was my inspiration piece, but I couldn’t skip sharing this ultra-slick, would-be assassin/spy drone platform. It’s open source, so the developers must be the friendly, happy-go-lucky types, but DARPA is sure to put this to some not-so-friendly use. Here and here are some clips of quad drones in action by Daniel Mellinger at UPenn. Here is a clip of the Arduino drone, not as cool, but maybe one of us could improve it…
Olafur Nielsen (group V)
I found a nice algorithm at the Society of Robots for locating an object from a robot using a servo and a Sharp IR sensor. The code is written by John S. Palmisano for his mini-sumo robot Stampy. It finds an object and continuously steers the robot toward it (which is good for e.g. sumo robots). The pseudocode for the algorithm is as follows:
// scanner motion code
if sharp IR detects object
    scanning IR turns left
else // no object detected
    scanning IR turns right

// robot motion code
if scanner is pointing far left
    robot turns left
else if scanner is pointing far right
    robot turns right
else // scanner pointing forward
    robot drives straight
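As a concrete illustration, here is a small C++ simulation of that scan-and-steer loop. This is my own sketch, not Palmisano's actual code: the servo step size, the angle limits, and the "far left/far right" thresholds are all assumed values, and the IR reading is just a boolean passed in.

```cpp
#include <string>

// Hypothetical scanner state: a servo angle in degrees, 90 = straight ahead,
// smaller angles = pointing left. All constants are assumptions.
struct Scanner {
    int angle = 90;
    static constexpr int STEP = 5;        // degrees swept per update
    static constexpr int FAR_LEFT = 60;   // below this, scanner is "far left"
    static constexpr int FAR_RIGHT = 120; // above this, scanner is "far right"
};

// One iteration of the scan logic: sweep toward the object while it is seen,
// sweep away when it is lost, so the servo oscillates around the object's edge.
void updateScanner(Scanner& s, bool objectDetected) {
    if (objectDetected)
        s.angle -= Scanner::STEP;  // scanning IR turns left
    else
        s.angle += Scanner::STEP;  // scanning IR turns right
    // clamp to the servo's physical range
    if (s.angle < 0)   s.angle = 0;
    if (s.angle > 180) s.angle = 180;
}

// Translate the scanner's direction into a drive command for the robot.
std::string driveCommand(const Scanner& s) {
    if (s.angle < Scanner::FAR_LEFT)  return "turn left";
    if (s.angle > Scanner::FAR_RIGHT) return "turn right";
    return "drive straight";          // scanner pointing roughly forward
}
```

The nice property, as described above, is that the servo never needs to stop on the target: it constantly hunts across the object's edge, and the robot just chases wherever the scanner points.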
I actually implemented this on a project of my own, called Wagg-E (you can view some info on it at my not very active blog at sterna.crf.nu).
Link to the project mentioned: Stampy the Sumo Robot. It’s a very nice project on the whole, and it gives some good ideas for a sumo robot.
/Erik Sternå in group F