Google may unveil more about Project Glass this month, as new details point to a more capable pair of glasses than previously thought.
According to ABC News, the company is holding two events, one in New York City and another in San Francisco, where software developers can get an early look at the futuristic glasses.
"Join us for an early look at Glass and two full days of hacking on the upcoming Google Mirror API in San Francisco or New York," Google said in an email invite it sent out to those who signed up to be on its developers list. These software and app creators will get a device to use on site and will come up with different software ideas and tie-ins.
"These hackathons are just for developers in the Explorer program and we're calling them the Glass Foundry," the invite added.
"We're looking forward to what developers will do with Glass, but we don't have more details to share at this time," said a Google spokesperson when ABC News asked about the upcoming events.
Gothamist ran into a Google employee who was wearing the glasses; the outlet dubbed her the "Cyborg from the year 2014."
Business Insider put together what was learned from the Gothamist interview:
- The glasses "basically do everything that a smartphone can do, but faster and without having to pull out your phone."
- People will be able to tell when you're not paying attention to them; your interactions with the glasses are noticeable.
- Even if it weren't her job to test them, the Google employee says she'd "absolutely" still use them herself.
- You can scroll through Web pages by making a scrolling motion with your finger next to the headgear.
According to Gizmodo, a new Google patent suggests that a laser-projected control pad might be in the running as a part of Project Glass.
"Currently, Project Glass uses a touch pad that runs down the side of one of its arms. Trouble is, that means reaching up every time you need to adjust a setting," the report said. "This idea, though, would use a laser projector to throw a control pad onto any surface that you're looking at: wall, desk, arm, whatever. Then, a small camera would interpret finger movements in the region of those buttons and turn them into commands."
What do you think of Project Glass? Sound off below!