Thursday, June 25, 2009
HugCoat v1.0
The Printing Press
The Telegraph
The Telephone
Television
Personal Computers and the Internet
All of these important technological advancements have at their core the concept of telepresence: the ability to extend one's presence and reach of communication across long distances. They are perhaps the most influential inventions leading to our contemporary state of ubiquitous and instantaneous long-distance communication.
We are now surrounded by innovations that extend our ability to convey information to virtually anywhere on the planet. At any moment, via social networks, blogs, my cell phone, etc., I have the ability to receive and transmit anything to just about anyone.
In the long march towards constant telepresence, technology has rarely, if ever, been applied to our immediate and personal interactions in physical space, the kinds we have with fellow humans every day. Indeed, in-person social interactions have remained fundamentally untouched by technology.
In an increasingly technologized world, it is sometimes easy to forget how to interact with humans without the comfortable distance created by technology and text-based communication.
This is the thought process that launched me into my final project: Instead of using technology exclusively to extend our presence and communication to long distances, how can we use technology to improve our immediate and physical interactions?
Often we find ourselves greeting others in passing, on the street or at social gatherings. Research suggests that a large percentage of what we communicate to others is sent via body language, and our bodies convey many things that are not under our control. Is it possible for us to use technology to further control our body language and thus smooth out our daily social interactions?
The hug has become a standard greeting between friends and acquaintances in America, but things do not always go as smoothly as we would like. Often complicated questions arise in the moment right before a hug takes place. Questions about positioning, closeness, duration, etc. are common. This sort of confusion can lead to limp, unfulfilling hugs that end up as a quick and awkward sort of touch; or the opposite occurs and, not knowing when to let go, one or both parties might hang on just a bit too long, leading to another kind of quiet awkwardness.
My project, the HugCoat, currently version 1.0, seeks to remedy this kind of awkward exchange by introducing an automatic protocol and tangible feedback into the situation. By alerting the user via a small vibration motor when a hug is complete, it allows one to approach a hug with confidence. It is much easier to go in for a full, hearty sort of hug when you know for a fact that it will go on for an acceptably short, but still fulfilling, duration.
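To give a sense of how simple the underlying logic is, here is a minimal sketch of the timing behavior. The pin numbers, the force sensor used to detect the hug, and the three-second duration are all placeholder assumptions, not the final design:

int hugSensorPin = 0;    // analog force sensor sewn into the coat (assumed)
int motorPin = 9;        // vibration motor, via transistor, on a digital pin (assumed)
int hugThreshold = 500;  // sensor reading that counts as a hug (assumed)
unsigned long hugStart = 0;
boolean hugging = false;

void setup(){
  pinMode(motorPin, OUTPUT);
}

void loop(){
  int pressure = analogRead(hugSensorPin);
  if(pressure > hugThreshold && !hugging){
    hugging = true;                 // the hug has started
    hugStart = millis();
  }
  if(hugging && millis() - hugStart > 3000){
    digitalWrite(motorPin, HIGH);   // buzz: the hug is complete
    delay(300);
    digitalWrite(motorPin, LOW);
    while(analogRead(hugSensorPin) > hugThreshold){ }   // wait for release
    hugging = false;
  }
  if(pressure < hugThreshold){
    hugging = false;                // arms released before the buzz
  }
}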
Designed within an attractive sport coat, the HugCoat can be worn on a daily basis, as part of a regular wardrobe. It will come in handy particularly at parties and other medium to large social gatherings, where one is likely to come into contact not only with close friends, but also relatively familiar acquaintances, the kind of people with whom awkward hugs most regularly take place.
The HugCoat is particularly for those who may find that they have trouble empathizing in immediate space with others, due to personal social discomfort or as a side effect of too much technology-based communication. Certain people just have trouble pulling off proper hugs. However, the HugCoat is not limited to this demographic alone. At some point almost everyone feels confused by modern social rituals, which are often codified in a sense, but not in a direct and obvious way. Through innovations like the HugCoat, we might all be able to smooth out our social interactions, and learn to empathize with machine precision.
Tuesday, June 16, 2009
H-Bridge Lab
After the H-Bridge lab, I'm really starting to see how, with the Arduino and a few other components, you can control just about anything you want. I'm tempted to get a few motors and sensors together to make some kind of small robot or other moving object. It looks like we're coming full circle in the class, now with the ability to use motors to actually change and control physical space. The only difficult part of the lab for me was getting the H-Bridge to stay in my breadboard: the legs were a little too springy and didn't want to straddle the center channel, so I ended up holding it in place while I tested the motor.
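For reference, here is roughly what the control code from the lab looks like, assuming the H-Bridge's two logic pins are on digital pins 3 and 4 and its enable pin is on 9 (your wiring may differ):

int motorPin1 = 3;   // H-Bridge logic pin 1 (assumed wiring)
int motorPin2 = 4;   // H-Bridge logic pin 2 (assumed wiring)
int enablePin = 9;   // H-Bridge enable pin (assumed wiring)

void setup(){
  pinMode(motorPin1, OUTPUT);
  pinMode(motorPin2, OUTPUT);
  pinMode(enablePin, OUTPUT);
  digitalWrite(enablePin, HIGH);   // enable the bridge
}

void loop(){
  digitalWrite(motorPin1, HIGH);   // forward
  digitalWrite(motorPin2, LOW);
  delay(2000);
  digitalWrite(motorPin1, LOW);    // reverse: flip the two logic pins
  digitalWrite(motorPin2, HIGH);
  delay(2000);
}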
I went on to assemble the gearbox that comes in the lab kit and hook it up to a motor and the circuit. The decrease in speed/increase in torque is immediately apparent with the 1:60 gearing.
Thursday, June 11, 2009
High Current Loads Lab
Serial Duplex Lab
The serial duplex lab was extremely helpful for getting familiar with the intricacies of serial communication. For the lab I used a potentiometer and a photocell as my analog inputs. I plan on using multiple analog inputs in my final project, so the example programs here will give me a great start with formatting, sending and retrieving serial data. I also hope to use some of the stuff I learned from the lab to make our midterm project run more smoothly, because Clare and I ran into some issues there with serial communication.
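For future reference, here is the basic pattern I expect to reuse: reading two analog sensors and sending them as one comma-separated line, with a call-and-response handshake so the Arduino only transmits when the other side asks. The pin assignments are just assumptions about the wiring:

int potPin = 0;     // potentiometer on analog 0 (assumed)
int photoPin = 1;   // photocell on analog 1 (assumed)

void setup(){
  Serial.begin(9600);
}

void loop(){
  if(Serial.available() > 0){
    Serial.read();                         // discard the request byte
    Serial.print(analogRead(potPin));
    Serial.print(",");
    Serial.println(analogRead(photoPin));  // newline ends the packet
  }
}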
Thursday, June 4, 2009
I completed the analog graphing lab, which graphically charts the analog signal from the Arduino using Processing. It's clear how useful this example program could be if you wanted to test a large number of analog sensors. Both graphing and printing the values coming from the Arduino gives an excellent idea of how a sensor reacts to physical interaction. It's also just weirdly relaxing to twist the potentiometer back and forth and watch that blue wave move up and down. Call me crazy.
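The Arduino side of that setup is only a few lines, something like this, assuming the sensor is on analog pin 0:

void setup(){
  Serial.begin(9600);
}

void loop(){
  Serial.println(analogRead(0));   // one reading per line for the grapher
  delay(10);                       // brief pause between readings
}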
The midterm project that Clare and I are working on is also coming along nicely. I am setting up the basic game code in Processing. From there, it should be pretty easy to get the Arduino feeding in data from flex sensors for control.
Tuesday, June 2, 2009
Analog Output!
Completed the analog output lab using a photocell to control the servo. It's amazing how varied the values are when using different analog sensors. When I was messing around with a flex sensor, my values fell in a tiny range between about 1017 and 1023. With the photocell, the values coming in were between 1 and 67, give or take. This was indoors on a cloudy day, so I imagine the values can shift quite a bit. I can see the importance of a calibration period at the start of a program in order to account for different light conditions.
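Here is a rough sketch of what that calibration could look like: sample the photocell for the first five seconds to find its range, then map readings onto servo angles. The pins and the five-second window are assumptions:

#include <Servo.h>

Servo myServo;
int sensorPin = 0;      // photocell on analog 0 (assumed)
int sensorMin = 1023;   // will be lowered during calibration
int sensorMax = 0;      // will be raised during calibration

void setup(){
  myServo.attach(9);    // servo on digital pin 9 (assumed)
  // calibration period: vary the light over the sensor for 5 seconds
  while(millis() < 5000){
    int reading = analogRead(sensorPin);
    if(reading < sensorMin) sensorMin = reading;
    if(reading > sensorMax) sensorMax = reading;
  }
}

void loop(){
  int reading = constrain(analogRead(sensorPin), sensorMin, sensorMax);
  int angle = map(reading, sensorMin, sensorMax, 0, 179);
  myServo.write(angle);
  delay(15);   // give the servo time to move
}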
Thursday, May 28, 2009
Analog input lab
After doing the lab using the potentiometer, I tried hooking up a flex sensor to the circuit using the same code, to see if I could use the flex input to dim and brighten the LED. Testing the flex sensor, I found the input I read from it to be very small in range, from about 1017 to 1023, with occasional very small outliers like 10 or 107. Because of the tiny range, bending the flex sensor made no visible difference in the output of the LED. I decided to just make the flex sensor turn the LED on and off instead. Because of the erratic nature of the flex sensor readings, I used a for loop to generate an average of many readings in each pass through the main loop. The for loop takes 30 readings and finds their mean, and the program then either powers the LED or not, depending on whether or not there is a discernible bend in the sensor.
Here's the code I ended up with:
int flexPin = 0;
int flexValue = 0;
int led = 9;
int flexSum = 0;
int flexAverage = 0;

void setup(){
  Serial.begin(9600);
  pinMode(led, OUTPUT);
}

void loop(){
  flexSum = 0;
  for(int j = 0; j < 30; j++){     // take 30 readings
    flexValue = analogRead(flexPin);
    flexSum = flexSum + flexValue;
  }
  flexAverage = flexSum / 30;      // mean of the 30 readings
  Serial.println(flexAverage);
  if(flexAverage < 1020) digitalWrite(led, HIGH);   // a discernible bend (threshold approximate)
  else digitalWrite(led, LOW);
}