Clearly today is the day I should publish my yearly blog post. I think I’ll make it a two-parter, since most people coming here today will care mostly about my personal encounter with a co-founder of Google.
Last night I ran into Sergey Brin on the subway ride home. I got on the downtown 3 express train at Times Square. I almost got into a different car, but switched to the next one because some people were exiting slowly through the doors where I was standing. I plopped myself down in an open seat, admittedly looking a little worse for wear after the two-hour bus ride down from a weekend in Woodstock. Now, I’d already encountered a couple of people wearing Glass, and an acquaintance is actually a member of the UX team. I also met and spoke with somebody from Google X who was attending the Invensense Motion Interface Developers Conference at which I spoke last year. So I looked up and there was a fellow wearing a Glass unit. Cool. I’ve been to Google NYC for a tech talk (a great one about Street View) and I see Googlers on the subway periodically, so it wasn’t that much of a surprise. But… that guy sure looks a lot like Sergey Brin.
I asked if I could take his picture and he smiled and consented. I asked how the project was coming along and how he liked where it was right now. Of course he told me that he loved it and that it was coming along really well. Somehow, though, I just didn’t trust my own eyes enough to believe it was really Sergey Brin sitting across from me. I mean, I’ve seen the dude’s private jetliner with my own eyes while working at NASA Ames in my previous job. What would he be doing on the subway? Aside, of course, from the fact that his company has a ginormous facility here and he has an apartment in the city.
Anyhow, I asked if he was part of the core X team and he said that he was. He told me that there are about one hundred other people outside of X who have prototype devices. I told him that I was a Vuzix M100 developer and was looking forward to getting a dev unit and doing a side-by-side comparison with Glass. Actually, as it turns out, I inadvertently lied: I told him I was expecting to receive a dev unit shortly, but the tracking number I’d been sent was actually for the M100 SDK, which arrived today. As I’ve signed an NDA, I can’t say anything about it, but it looks really good. I’m not sure when I’ll actually get my hands on the hardware.
But seeing as I wasn’t at Google I/O, I know for certain that I won’t be getting Google Glass Explorer Edition anytime soon. I told Mr. Brin that I know a few people who are eagerly looking forward to the Glass Foundry events. He told me that the Explorer Edition would be shipping out to devs in a couple of months. If I’d really been confident that it was him, I’d have given him my card and asked for an invite. I have been told several times today that I’m a punk for not having asked regardless. Oh well.
So we got to the 14th Street station and were still talking when he realized that it was his stop and jumped up. I bid him “take care” (by all accounts, he does), and that, as they say, was that. I took out my phone, looked at the pictures, and thought “yeah… that really was Sergey Brin, you dummy… couldn’t you have thought of something intelligent to say? Or told him that you’ve been working on building a wearable Human Interface Device accessory specifically suited to HUD applications?”
But I have a funny way of running into people, so I’ve got no regrets. I recently wired some Hasbro NERF Stampede guns up to some Neurosky headsets from a Mattel Mindflex Duel game to create a fun little mental face-off game. At CES, my girlfriend and I, by total coincidence, ended up sharing a cab with the designer of Mindflex Duel, who left Mattel and is now at Hasbro. I know that doesn’t quite compare, but I’m just saying that the universe seems to have a funny way of timing my random introductions.
Now it’s rather funny, all of this excitement about the upcoming consumer-ready HUDs. People keep talking about them in the context of Augmented Reality, which seems to cause confusion on several fronts. Yes, Google Glass is a see-through display, but it clearly isn’t the registered visual overlay that is necessary for “real” AR, and Google isn’t positioning it as such. There are still a lot of challenges to overcome before we can expect those. Those who are new to the term Augmented Reality, and to HMDs in general, frequently seem not to understand what a fixed focal depth means for these displays: the virtual image sits at a single apparent distance from the eye, so it can’t stay in focus alongside real-world objects at other depths.
This post isn’t finished, but I’m hitting publish just to have something up for now.
As long as you’re here, check out this just-released music video that I helped make this summer. I used a bunch of Arduino Megas to drive about 250 fluorescent tubes to the cues in Robert DeLong’s first single.