Philly Tech Week Wrapup

Sorry to everyone for how long it took me to get this write-up out. The perils of starting a new job, I guess. Luckily, we had plenty of notes taken for the entire week, so very little was lost to the black hole that is my memory.
Our arrangements for Philly Tech Week were pretty impromptu, but we managed to pull off a number of fun things.

Monday, 25th: Open Work Night
Open Work Night turned out to be an extension of our spring cleaning from over the weekend. We got the space nice and tidy for everyone who would be visiting later in the week. One visitor came by and helped us put together a few shelves, which was incredibly handy, as they required some “lite modification” with a hacksaw before they would fit under our ceilings. Oh, I know! Our ceilings are freaking tall; what was up with those shelves?

Tuesday, 26th: Micro-controller Show and Tell
The evening had a pretty light showing, as people hadn’t quite caught on to what we were doing yet. However, some of our members (Mike, Chris, and PJ) did get a start on a mirror-and-laser text display system. Very cool.

Wednesday, 27th: Regularly Scheduled Open House + Late Night Karaoke
On Wednesday night, we hosted a number of guests for what is normally our Open House night. These usually turn into social gatherings of sorts, and Tech Week was no exception. We found out that one of our guests is getting ready to launch a new social networking site, that another has started a vending machine company focusing on local goods (http://snacklikealocal.com), and that another kind soul is looking to donate a Smithy lathe!

PJ got his MIDI Nintendo running pad working. Basically, the old running pad controller used with the NES is read by an Arduino, which sends MIDI signals back to a host computer, where they can be used with any MIDI-capable software, in this case Ableton Live.
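For the curious, here is a minimal sketch of how the host side of a rig like this could work, assuming the Arduino simply prints one line per pad event over USB serial. The port name, message format, and note mapping below are hypothetical stand-ins for illustration, not PJ’s actual code.

    # Hypothetical serial-to-MIDI bridge: forwards pad events from an Arduino
    # to a virtual MIDI port that a DAW (e.g. Ableton Live) can listen to.
    # Requires: pyserial, mido, python-rtmidi
    import serial
    import mido

    PORT = "/dev/ttyUSB0"   # assumption: wherever the Arduino enumerates
    BASE_NOTE = 36          # assumption: pad 0 maps to C1, pad 1 to C#1, ...

    def main():
        ser = serial.Serial(PORT, 115200, timeout=1)
        midi_out = mido.open_output("NES Running Pad", virtual=True)
        while True:
            line = ser.readline().decode("ascii", errors="ignore").strip()
            # Made-up message format: "P <pad_number> <1|0>"
            parts = line.split()
            if len(parts) != 3 or parts[0] != "P":
                continue
            pad, pressed = int(parts[1]), parts[2] == "1"
            msg_type = "note_on" if pressed else "note_off"
            midi_out.send(mido.Message(msg_type, note=BASE_NOTE + pad, velocity=100))

    if __name__ == "__main__":
        main()

From there, the DAW just sees another MIDI input and can map each pad to a drum sound or clip trigger.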

We did get Late Night Karaoke going, and it was a blast. Adam rocked out with The Darkness’ “I Believe in a Thing Called Love”. Sean sang Gershwin’s “Foggy London Town”. PJ wooed everyone with The Temptations’ “My Girl”. Corrie had us all rolling with Gayla Peevey’s “I Want A Hippopotamus For Christmas”. Chris fiiiiinally got up to sing the Soul Survivors’ “Expressway To Your Heart”. And Brendan was Brendan with Rick Astley’s “Never Gonna Give You Up”.

Thursday, 28th: DIY/Electronic Music
We had a good mix of newcomers and members for our music night. One person brought in a 7-string electric guitar he had built completely by hand. The thing was sick; I really wish we had gotten pictures. We jammed out with various synths and drum machines. Sean further extended his Atari Punk Console with a low-pass filter to give it a rounder tone, then blipped and buzzed along with everyone else. Brendan rocked out on the guitar, and Dan was really tearing it up on the keyboard. Definitely a fun night, and we will be looking to do more nights like this in the future.

Friday, 29th: “Bricks and Grips” – Arm Wrestling/Puzzle Game Tournament
This night, we actually had more guests than members show up. The first two challengers for Arm Wrestling Tetris were Sean McBeth (the creator) and Robert Cheetham, founder and president of Azavea, a GIS software firm in Center City that is doing some truly revolutionary work (I know, I used to work in the industry). We also had a bit more electronic music jamming, which was a great time.

Getting the game together was a real team effort: between Brendan’s soundtrack, PJ’s voice-over work, and Sean’s programming and construction, it all fit together perfectly. Next up, Punching Bag Double Dragon!

Saturday, 30th: Artemis Game Session
The developer of Artemis just released a new version that includes canned missions. We played the first mission with Sean as Captain and survived to tell the tale.

While en route to our primary mission objective of observing anomalies in a nearby nebula cluster, we encountered a squadron of Krellians lying in wait to ambush us. Lt. Commander Santoro showed great skill and initiative, destroying the three ships in mere seconds with two well-placed nuclear torpedoes.

After the brief battle, we intercepted a distress call from Deep-Space 49 as it took fire from another Krellian battle group. Running low on energy and weapons, we barely scraped by and defended the station after a core-burning sprint at maximum warp that nearly left us depleted. Lt. Peterson performed admirably in managing our power levels and is surely responsible for our survival.

DS-49 provided us with much-needed supplies as we returned to our primary mission: scanning nebulae. We returned to the cluster to find another hidden flotilla of Krellians. This time, we were completely out of nukes and unable to deal with them as handily as before. We managed to warp out of weapons range before any serious damage came to the ship. Our second sortie against the Krellians fared better: we damaged them, but had not completely destroyed them. With our weapons running low, Commander Toliaferro performed commendably in maintaining a flanking position on the enemy, allowing Lt. Commander Santoro to dispatch them with beam weapons.

Completely depleted of forward torpedoes and running low on energy, we were ambushed by a third squad of Krellians while under way to DS-45 for supplies. We managed to warp into a nebula for cover, but the nebula knocked out our shields, and we were stuck with the enemy between us and safety. With nothing but mines left, Captain McBeth hatched a plan: we would fly through the center of the squadron, diverting repair crews and energy to protect critical systems as we bore the brunt of the frontal assault, drop our mines in the middle of the squad as we passed through, and warp away to safety on the other side.

The plan required a high level of coordination from every crew member. Commander Toliaferro deftly navigated at close quarters through the heart of the beast; the first pass dealt great damage to the enemy, but they weren’t quite finished. Rather than coming about for another pass, Captain McBeth ordered all-stop in the middle of battle. As the enemy came into weapons range, Lt. Commander Santoro dropped the last few mines while Lt. Peterson delicately balanced the needs of the repair crews, shields, weapon systems, and engines largely on instinct, without time to run the proper load-balancing calculations. As a result, the final Krellian fleet was completely destroyed while the S.S. Artemis flew home to DS-45 under her own power, completely undamaged.

Another mission accomplished.

Deepest appreciation for all the piscine comestibles

Dear Hive76 community,

With appreciation for the last year and a half of great people and projects, as of yesterday I’ve resigned from Hive76’s Board of Directors. There are some amazing things in the works for the group, and I look forward to watching it grow and helping out from time to time. The organization has grown so much since I first heard about it, and there is a strong core of smart people making it run. I am confident that even more positive changes are on the way for Hive76 and the Philly science/tech community.

Recently, I’ve had a lot of demands placed on my time, and I feel that I can do more for DIY, science education, and technology for social change from a different context. I’ve picked up some other tech/ed projects around the city (like Random Hacks of Kindness), so while I’m moving out of the building, I’ll still be in the neighborhood. You can keep an eye on what I’m up to and get in touch over here.

Don’t be a stranger… if you have any questions, don’t hesitate to ask. Thanks again, it’s been a blast. And stay tuned for the exciting results of Hive’s next round of elections!

Stephanie Alarcon

Hackerspace-Wide Gaming

On Tuesday we played Artemis Spaceship Bridge Simulator internationally with another hackerspace, 091 Labs in Galway, Ireland. Despite time zone differences, initial network trouble on our end, and the lack of a full crew, we still had quite a bit of fun.

First we played co-operatively. Trying our best to stay somewhat coordinated, we swooped in on enemy fleets and made quick work of them. Whenever one of us got into trouble in battle, the other ship would warp in and turn the tide at the last moment.

Eventually we challenged each other to a battle, and I must say it was extremely intense.

For the next game (May 21st), there is talk of a few other hackerspaces joining in, so stay tuned!


Suggestion Box

What would you like to get from us here at your favorite hackerspace?


Below is a poll where you can vote on new things to do, or add your own suggestion. (If you do, keep it brief.) Feel free to have a discussion about new cool things in the comments below.

[polldaddy poll="5034351"]

Artemis Spaceship Bridge Simulator, Tuesday!

Captain’s Log, May 8th, 2011…

It’s almost time again for our Artemis Spaceship Bridge Simulator night, so get ready to beam up! Due to popular demand, we’re doing it twice this month (May 10th and May 21st).

If you’ve never played Artemis before, it’s a networked computer game that simulates the bridge of a spaceship, much like what you’d see on Star Trek®. The game allows for five bridge officers plus a captain, and all you need to play is a laptop and the software (which is provided).

Artemis Spaceship Bridge Simulator Night
Hive 76 (915 Spring Garden)
Tuesday, May 10th @ 7:00 PM.


Engage!


Philly Tech Week Signature Event

One of our lovely members, Jim Fisher, was kind enough to send me and a friend along to the Signature Event for Philly Tech Week because he liked my SketchUp class so much. Thanks Jim! There was an open bar, some food, and a big screen displaying a TwitterFall for the tag #phillytechweek. I’m not one to pass up an opportunity, but I didn’t have the right hardware to exploit such a lovely tweetstream. Instead I bugged Adam into doing my dirty work in exchange for a burrito. The result:

HIVE76 in ASCII
You might want to zoom in

Woot! Great success! High fives all around.

There was also a collaboration suggestion board where someone suggested that we team up with Breadboard to hack some art. Okey doke, let’s do it!


Philly Tech Week Events

For Philly Tech Week, we’re opening our doors every night of the week at 8pm, extending our normal Open House format to the entire week. We have a variety of different activities planned. Check it out.

Useless Photo
It's gonna be hot!

Monday, 25th: Open Work Night
For the first night of Tech Week, we’ll be working in the space on projects together. Come stop by to say hi, lend a hand, or just jibber-jabber about your own projects. This is a little different from our normal Open Houses, where we typically curb work sessions for the night.

Tuesday, 26th: Micro-controller Show and Tell
Have an Arduino, MSP430, Propeller, or other MCU project that you want to show off? Want to learn some basics of getting started with the MSP430? Come out this night and have fun with bit-twiddling, speaker-beeping, and LED-blinking.

Wednesday, 27th: Regularly Scheduled Open House + Late Night Karaoke
Our regularly scheduled social hour. We have a hacktastic “karaoke machine” running on a Macbook that lets you queue songs through our IRC channel. We don’t usually start the Karaoke until 10pm, but if enough people are interested we’ll get it started early.
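If you’re curious how the IRC part of that could work, here’s a toy sketch of the idea: a tiny bot sits in the channel, watches for a !queue command, and appends requests to a playlist file that the karaoke machine plays from. The server, channel, nick, and command syntax below are illustrative guesses, not the actual Hive76 setup.

    # Toy IRC "karaoke queue" bot: joins a channel, watches for "!queue <song>",
    # and appends requests to a playlist file.
    # (A real bot would wait for the server welcome before joining.)
    import socket

    SERVER, PORT = "irc.example.net", 6667   # placeholder server
    CHANNEL = "#hive76"                      # guess at the channel name
    NICK = "karaokebot"
    PLAYLIST = "/tmp/karaoke_queue.txt"

    def send(sock, line):
        sock.sendall((line + "\r\n").encode("utf-8"))

    def main():
        sock = socket.create_connection((SERVER, PORT))
        send(sock, f"NICK {NICK}")
        send(sock, f"USER {NICK} 0 * :karaoke queue bot")
        send(sock, f"JOIN {CHANNEL}")
        buf = ""
        while True:
            buf += sock.recv(4096).decode("utf-8", errors="ignore")
            while "\r\n" in buf:
                line, buf = buf.split("\r\n", 1)
                if line.startswith("PING"):
                    send(sock, "PONG " + line.split(" ", 1)[1])  # stay connected
                elif "PRIVMSG" in line and "!queue " in line:
                    song = line.split("!queue ", 1)[1].strip()
                    with open(PLAYLIST, "a") as f:
                        f.write(song + "\n")
                    send(sock, f"PRIVMSG {CHANNEL} :Queued: {song}")

    if __name__ == "__main__":
        main()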

Thursday, 28th: DIY/Electronic Music
Step-tone generators, electric guitar effects pedals, sequencers, keyboards. Whether you’ve made your own instrument or not, however you want to make music tonight, come on down and jam with us.

Friday, 29th: “Bricks and Grips” – Arm Wrestling/Puzzle Game Tournament
Based on a similar concept that we are not permitted to mention due to trademark issues, this game is a standard two-player, head-to-head tetromino puzzle game in which players manipulate their pieces through an arm wrestling competition on a specially designed, arm-wrestling-table-shaped controller.

Saturday, 30th: Artemis Game Session
For all you trekkies out there, Artemis Spaceship Bridge Simulator is a networked multiplayer game that simulates a spaceship’s bridge, much like what you’d see on Star Trek®.

Hive76/UArts Special Event, “Artisanal Technology”, April 23rd at UArts

Hive76 and The University of the Arts have teamed up for this one, and we managed to persuade Leah Buechley to bring a bit of the MIT Media Lab to Philly in the form of a special presentation titled “Artisanal Technology”. Showtime is April 23, 11:30 A.M. The location is 5th Floor, Terra Hall (211 S. Broad).

Leah Buechley demos some artistic options with the LilyPad, using Boulder's Pearl Street Mall as her canvas

In Leah’s words, “This talk proposes an alternate model for the production, distribution, and consumption of consumer electronics that emphasizes diversity, small-scale production, and thoughtful consumption. I will raise and discuss several questions, including: What kinds of technologies can be artisanally produced, crafted in small batches? What benefits might society reap from artisanal technology? What benefits might we expect as designers and manufacturers? What tools need to be built to support an artisanal technology ecology?”

You can get tickets here.  Admission is free and seating is limited, so we expect the event to fill up quickly.  If you are interested in attending, get your tickets ASAP.

The talk is inherently cross-disciplinary, and we have done our best to recruit some of the more colorful members of the Philly creative community to attend.

We are hoping to have some displays in the room that give a sampling of High Low Tech, Philly Style.  If you have some work that you’d like to display, please feel free to leave a comment — we’ll see what we can do.

Thanks, and hope to see you there!

User-Literate Technology

This is a broad-concept idea that I’ve had in my head for a while and have discussed with a few people. This post is mostly a direct adaptation of those discussions. I’ve taken to calling the idea “User-Literate Technology”, mostly because, in the same way we might say that a particular person is technology-literate, we should also be able to say that a particular technology is user-literate.

In some ways, this is similar to “user-friendly”, except that it places the burden on the technology to adapt to the user, rather than making it easy for the user to adapt to the technology. Does the technology in question create its own gestures and idioms while seeking to make them easy to learn, or does it capture idioms that are already common in the culture for which it is intended? If it errs toward the latter, then it is “User-Literate” more than “User-Friendly”.

Before systems can become more User-Literate, they largely need to dispense with their most prevalent interface: the keyboard and mouse. The keyboard is a text- and data-entry tool, but as an interface into consumer computing, it is a hundred-odd keys of confusion, distraction, and indirection. For example, why do we still have a Scroll Lock key on our keyboards? Scroll Lock hasn’t been a useful feature in the last 20 years; in other words, one of the most important and significant markets for consumer computing has never lived in an era that needs a Scroll Lock. It’s like issuing every new driver a buggy whip with their driver’s license.

Mice are nice for tasks that involve precise selection of elements on a 2D plane, but the mouse was designed in an era when graphical displays were not much larger than 640×480 pixels. Nowadays, I have a laptop with a native resolution of 1600×900, and I can hook up a second monitor to double that space. That is screen real estate five to ten times larger than when the mouse first became popular: 640×480 is about 307,000 pixels, 1600×900 is 1.44 million (roughly 4.7 times as many), and the second monitor brings it to about 9.4 times. To give you an idea of what that means, take a look at the 640×480 highlighted area on my desktop screenshot (and yes, I paid for Photoshop).

Imagine using only the lower-left corner

Computing has seen far more incremental improvements in usability than it has huge leaps and bounds. Check out this screenshot of the Xerox Star GUI. I remind you that this is from 1981. Try to identify any functional elements from modern computer interfaces that are not in this image (protip: from a technical perspective, there aren’t any; they are all adaptations of concepts shown here).

Xerox Star GUI
The Graphical User Interface from the Xerox Star experimental OS, 1981

The early GUI interfaces like the Star and its clones (including the Macintosh and Windows) got something very right: they made functionality discoverable. They did this in two primary ways: by providing visual cues on the screen, immediately in the user’s field of view, and by providing multiple access points to the same functionality to accommodate users who work in different ways. Having a menu option labeled “Help” is very inviting, but advanced users learn to ignore large portions of screen text, so it’s very important to make systems that cater to both the wide-eyed (literally) newb and the hardened veteran.

Regardless, monitors are only good if the user A) has a fully functional visual sense, and B) is able to devote their attention to the display. If the user is blind or distracted by other visual tasks (say, operating a heavy machine) then the display is a large, hot paperweight on the desk.

Luckily, we are starting to see some very basic work in this area hitting the consumer market. Between systems like the iPad and the hacktastic stuff being done with the Kinect, there is a lot going on to remove computing from its keyboard-and-mouse hegemony. Still, these systems often rely on the user memorizing gestures and command sequences. If a user has to do something unnatural, even if it is done through advanced motion sensing and image processing, then it might as well just be any other button-pushing interface.

This is why I never got into the Nintendo Wii. Yes, the motion tracking of the controller was a wonderful sweet spot between price and precision. Despite that, few, if any, of the games did anything truly unique with it. Instead of waggling a joystick permanently affixed to a controller base and mashing buttons, you were… waggling a joystick in mid-air and mashing buttons. The user still had to learn new motion patterns and adapt to the system.

I think Google kind of picked up on the absurdity of most modern motion-tracking systems with this year’s April Fools prank, the “Gmail Motion“. Also, I think there are some good examples of user-literate technology on the market already.

I have a Wacom tablet here that is not only pressure- but also tilt-sensitive. I’ve found that the primary training hang-up is the disconnect between moving the stylus in one location and the drawing marks showing up in another; without strong hand-eye coordination, that can be difficult to adjust to. Wacom has had LCD displays for a while now that have the full touch-and-tilt sensitivity built into them. I can’t imagine how amazing working with them must be (and probably won’t for a while; the smallest one is only 12” across and costs nearly $1,000, and the one I would actually want is two kilobucks).

There is a telephone-accessible weather information system run by MIT, called JUPITER, with a natural language processor on the other end of the line. I’ll be damned, but I couldn’t figure out how to trip this thing up. Even with a fake southern accent (a reasonable one, though; I’ve spent enough time in the South to know what they actually sound like) I couldn’t baffle it. Anything it faltered on, I had to admit a human would have had a hard time understanding anyway. Its best feature was context tracking: you could ask for the weather on a certain day in a certain city, receive it, then make an extremely contextual query like “what about the day after?” and it would get it right, then “and the next day?” and BAM, weather forecastery in your ear. I heard about this thing over 5 years ago; why don’t we have flying cars yet? I understand the technology was based on a DARPA project used for automated logistics in battlefield scenarios. People getting shot at don’t have time to remember how to talk to a computer, so they built a computer that could understand a screaming, cussing US Marine.
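To make the context-tracking idea concrete, here’s a toy sketch of the mechanism (my own illustration, not JUPITER’s actual implementation): the system remembers the slots from the previous query, such as city and date, and a follow-up question only has to supply whatever changed.

    # Toy illustration of dialog context tracking: a follow-up query only
    # mentions what changed; everything else is inherited from the last query.
    from datetime import date, timedelta

    context = {"city": None, "date": None}

    def forecast(city, day):
        # Stand-in for a real weather lookup.
        return f"Weather for {city} on {day}: (made-up forecast)"

    def ask(city=None, day=None, day_offset=0):
        # Fill in missing slots from context, as a context-tracking system would.
        if city is not None:
            context["city"] = city
        if day is not None:
            context["date"] = day
        if day_offset and context["date"] is not None:
            context["date"] += timedelta(days=day_offset)
        return forecast(context["city"], context["date"])

    print(ask(city="Philadelphia", day=date(2011, 5, 10)))  # "weather in Philly on the 10th?"
    print(ask(day_offset=1))                                # "what about the day after?"
    print(ask(day_offset=1))                                # "and the next day?"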

My sister clued me in to a project being developed by two 11th graders in Portland, OR, who are working on voice-emotion recognition technology; they’ve already won the Siemens Science Award in the team category. You talk into a microphone and the computer judges the emotional state you were in when you spoke. The kids are currently envisioning a wristwatch for autistic children who have difficulty assessing others’ emotions. The watch will flash an emoticon indicating the emotional state of the person the child is talking to.

So what is the point of all of this talk? I am organizing a symposium/exposition for User-Literate Technology. I want it to be a springboard for starting to talk about technology that adapts to and understands how people work, rather than artificial systems that merely strive to be easy to learn. Hopefully, we can have it going either by the end of the year or by this time next year. I’d like it to be a multi-disciplinary event, with equal participation from industry and academia, from artists and computer scientists and engineers. If you or your organization is interested in participating, you can reach me via Gmail under the name “smcbeth”.

We haven’t seen a major innovation in human-computer interaction in over 30 years. It’s time to start working on the problem.

How springs are made

I just thought I would take the time to post these videos of springs being made.  I apologize in advance for BLOWING YOUR MIND.

#1, #2, #3, #4, #5

P.S.: The machine in vid #5 looks totally make-able, no?