Monthly Archives: September 2010

Georgia Tech Musical Instrument Competition

The Georgia Tech Center for Music Technology hosts an annual competition for new and innovative musical instruments, interfaces, and controllers.

See last year's entries, including the “Double Slide Controller”. Check them out here.

Updated files for the Controller workshop

Hi everyone, sorry that the tryout took so long to get up and running. I have uploaded a new archive that contains all the code and the updated Processing file, along with bug fixes. Hopefully there will be no more problems connecting to the game. Good luck with your controllers!

Get code here.

There are three commented lines at the beginning of the Processing file that are of interest to you. I would also like to remind you that the net.jar in the Processing folder needs to be replaced with the one in the archive to get rid of latency. If you have any problems, come talk to me or Niklas.

The tcpInterface enables your computer to connect to the host of the game; you need to change the IP address in order to connect. The address will be given to you when needed.
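The workshop's actual tcpInterface code isn't shown here, so as a rough illustration of what connecting to a game host over TCP involves, here is a self-contained plain-Java sketch. The HOST/PORT values and the built-in echo "host" are placeholders of my own, not the real game server — in the workshop file you would instead edit the commented lines mentioned above.

```java
import java.io.*;
import java.net.*;

public class TcpSketch {
    // Placeholder game-host address: replace with the IP you are given.
    static final String HOST = "127.0.0.1";
    static final int PORT = 9999;

    // Connect to host:port, send one line, return the reply.
    static String exchange(String host, int port, String msg) throws IOException {
        try (Socket sock = new Socket(host, port);
             PrintWriter out = new PrintWriter(sock.getOutputStream(), true);
             BufferedReader in = new BufferedReader(
                     new InputStreamReader(sock.getInputStream()))) {
            out.println(msg);
            return in.readLine();
        }
    }

    // Self-contained demo: a stand-in "game host" that echoes one line,
    // plus a client that talks to it over the socket.
    static String demo() throws Exception {
        ServerSocket server = new ServerSocket(PORT);
        Thread host = new Thread(() -> {
            try (Socket s = server.accept();
                 BufferedReader in = new BufferedReader(
                         new InputStreamReader(s.getInputStream()));
                 PrintWriter out = new PrintWriter(s.getOutputStream(), true)) {
                out.println("echo: " + in.readLine());
            } catch (IOException ignored) {}
        });
        host.start();
        String reply = exchange(HOST, PORT, "button:1");
        host.join();
        server.close();
        return reply;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(demo());  // prints: echo: button:1
    }
}
```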

//Johan

Dynamically Changeable Physical Buttons on a Visual Display

Chris Harrison and Scott E. Hudson have presented an approach that tackles the lack of tactile elements in touchscreens while preserving their dynamic qualities, in their paper Dynamically Changeable Physical Buttons on a Visual Display.
A drawback of touchscreens is that you have to focus your visual attention on them when it could be needed elsewhere, e.g. on the road. Tactile objects, such as buttons, require less attention, but they restrict the user interface because they cannot be exchanged as easily as elements on a touchscreen. Their approach combines the best features of both by “…producing pneumatically actuated, dynamic, physical buttons on a visual display.” [1]
I think this would be both a fun and useful way to interact with displays, as you sometimes miss the tactile feel when interacting with touchscreens.

Sources:

[1] Chris Harrison and Scott E. Hudson. Providing Dynamically Changeable Physical Buttons on a Visual Display. In Proceedings of the 27th International Conference on Human Factors in Computing Systems (CHI '09), April 4–9, 2009, Boston, MA, USA.

http://www.chrisharrison.net/projects/pneumaticdisplays/

/Gustav Börman, Group H

Aggressive Maneuvers for Autonomous Quadrotor Flight

This video shows control of precise aggressive maneuvers with an autonomous quadrotor helicopter, a small autonomous Unmanned Aerial Vehicle (UAV). Demonstrations of flips, flight through windows, and quadrotor perching are shown. The work was done at the GRASP Lab, University of Pennsylvania.


Quote from frankerssj: “We designed a system that is capable of autonomous navigation with real-time performance on a mobile processor using only onboard sensors. Specifically, we address multi-floor mapping with loop closure, localization, planning, and autonomous control, including adaptation to aerodynamic effects during traversal through spaces with low vertical clearance or strong external disturbances. All of the computation is done onboard the 1.6Ghz Intel Atom processor and uses ROS for interprocess communication. Human interaction is limited to provide high-level goals to the robot.”

Samira Afshoon Group F

Data-junkies behold…

I watched this lecture a little more than a year ago (ages ago in internet time). It got me going in a “the future is now” kind of way. In the lecture, Pattie Maes (MIT Media Lab) introduces SixthSense, an attempt to add a data layer on top of the perceived world. The display part isn't as cyberpunk as I would like, but the concept is built with tech available today, unlike my cyberpunk daydreams (contact-lens displays).

Can Arduino be any cooler?
The above was my inspiration piece, but I couldn't skip sharing this ultra-slick, could-be assassin/spy drone platform. It's open source, so the developers must be the friendly, happy-go-lucky types, but DARPA is sure to put it to some not-so-friendly use. Here and here are some flicks of quad drones in action by Daniel Mellinger at UPenn. Here is a flick of the Arduino drone, not as cool, but maybe one of us could improve it…

Enjoy

Olafur Nielsen (group V)

Using one sensor and a servo to find an object

I found a nice algorithm at the Society of Robots for locating an object from a robot using a servo and a Sharp IR sensor. The code was written by John S. Palmisano for his mini-sumo robot Stampy. It finds an object and continuously steers the robot toward it (which is good for e.g. sumo robots). The pseudocode for this algorithm is as follows:

//scanner code
if sharp IR detects object
    scanning IR turns left
else //no object detected
    scanning IR turns right

//robot motion code
if scanner is pointing far left
    robot turns left
else if scanner is pointing far right
    robot turns right
else //scanner pointing forward
    robot drives straight
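The pseudocode above can be sketched as a runnable simulation. This plain-Java version stubs out the Sharp IR sensor and servo; the angles, step size, and detection threshold are my own illustrative choices, not Stampy's actual values.

```java
public class StampyScan {
    // Servo angle in degrees: 0 = far left, 90 = straight ahead, 180 = far right.
    static int scanAngle = 90;

    // One scanner step: sweep left while the IR sensor sees the object,
    // right while it does not, so the scanner ends up oscillating on the
    // object's edge.
    static void updateScanner(boolean objectDetected) {
        if (objectDetected) {
            scanAngle = Math.max(0, scanAngle - 5);   // scanning IR turns left
        } else {
            scanAngle = Math.min(180, scanAngle + 5); // scanning IR turns right
        }
    }

    // Map the scanner direction to a drive command for the robot.
    static String driveCommand() {
        if (scanAngle < 60)  return "turn left";
        if (scanAngle > 120) return "turn right";
        return "drive straight";
    }

    // Simulated run: the stubbed IR sensor "sees" the object whenever the
    // scanner points within 10 degrees of it. Returns the final command.
    static String track(int objectAngle, int steps) {
        scanAngle = 90;
        for (int i = 0; i < steps; i++) {
            boolean detected = Math.abs(scanAngle - objectAngle) < 10;
            updateScanner(detected);
        }
        return driveCommand();
    }

    public static void main(String[] args) {
        System.out.println(track(150, 30)); // object far right -> "turn right"
        System.out.println(track(90, 30));  // object ahead -> "drive straight"
    }
}
```

On a real robot the two `if` blocks would run every control-loop tick, with the detection flag read from the Sharp sensor and `scanAngle` written to the servo.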

I actually implemented this in a project of my own, called Wagg-E (you can view some info on it at my not-very-active blog at sterna.crf.nu).

Link to the project mentioned: Stampy the Sumo robot. It's a very nice project on the whole, and it gives some good ideas for a sumo robot.

/Erik Sternå in group F

The Fun Theory – Piano stairs

The Fun Theory is a website that, according to its own description, is “dedicated to the thought that something as simple as fun is the easiest way to change people’s behavior for the better.”

The video has been circulating the Internet for about a year, and if you were in Stockholm last October you might even have experienced it.

/Josef Ottosson, Group F

*EDIT*
I noticed that this has already been added to the blog (though not under “Inspiration”, so I didn’t find it when looking through the previous submissions).

Anyway, here’s something that’s from the same project, but hasn’t been posted yet:

*/EDIT*