by icycalm » 03 Jun 2014 13:42
https://www.youtube.com/watch?v=LPszKhewSec
Published on May 21, 2014
19-sensor inertial chest-to-finger system. @BrandonJLa
It's unreal. Apparently it costs $13,000 right now (read it in the comments). I haven't found an article or official site yet.
by icycalm » 06 Jun 2014 19:58
BeekleMatter wrote:
[image attachment: lhAqWfC.png, 246.42 KiB]
Basically, it motion-captures your arms and hands down to the fingertips. Looks amazing as the next step toward achieving VR presence. Totally reasonable funding goal of $250,000, too. And it seems the dev kit that includes the entire rig will cost $600.
Just went up, check it:
https://www.kickstarter.com/projects/co ... on-and-mor
by icycalm » 06 Jun 2014 20:07
BeekleMatter wrote:
Man wrote:Noticeable latency.
I noticed that, too. But in this video of them playing a game they made on it, there is virtually none.
[animated GIF attachment: ezgif-save_(1).gif, 3.21 MiB]
by infernovia » 11 Jun 2014 15:34
Tom Forsyth wrote:@JonOlick As ever, the elephant in the room is to clap your hands. Vicious test of both latency and precision.
The first demo with the gloves that I got to try was a moon exploration demo that they created. Your avatar is wearing a full astronaut suit with some weird interface at the bottom of your helmet that I didn't really get (I think they should remove it). Once Spencer told me that I could look at my hands, I was amazed! I could rotate them, squeeze them, do weird finger movements, and they were all tracked fluidly. The latency was barely noticeable, honestly. I would guesstimate it at around 50-75 ms between moving a finger and seeing it move in the HMD. They also had a TV behind me showing what I was seeing. Later on I noticed the TV had much higher latency than what you actually see in the HMD.
I tried several things to push the gloves' limits. I tried holding my hands together; the palms were touching in the game but the fingers were bending backwards a little bit. This was due to calibration, they said, and in fact later on, after re-calibrating, my fingers were closer when doing the same pose. The arm models moved very closely to my actual arms, which was quite impressive. The shoulders of the skeletal model were probably a little wider than mine, so some arm/elbow movements were a little off. Also, when putting a hand on top of my head, the hand appeared an inch in front of my forehead in-game. Again, I think the little glitches come from the initial calibration. It's important to note that none of the demos had any noticeable drift. In fact, the Rift's yaw (DK1) seemed to drift while my arms and fingers stayed in place.
The next demo I tried was a modified Tuscany demo with tables that had beach balls, baseballs, cakes, Big Macs and other things to grab. Without haptics it's a little strange to grab things, but after a couple of minutes you get used to it. Their demo isn't really polished; the gestures to grab things are rudimentary, and the objects you pick up anchor to your wrists, so it's hard to figure out how to throw/drop them. Throwing ping pong balls didn't quite work: the balls ended up leaving your hand about 30 degrees left of your intended direction for some reason. Throwing Big Macs over the stone wall in Tuscany was quite a challenge for another guy who tried it after me.
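That consistent 30-degree offset does sound like a pure software bug. As a purely hypothetical sketch (the function and approach are mine, not Control VR's), a constant yaw error in the release velocity could be undone with a fixed rotation of the horizontal velocity components about the vertical axis at release time:

```python
import math

# Hypothetical sketch, not Control VR's actual code: if the release
# velocity consistently comes out ~30 degrees left of the intended
# direction, a fixed corrective yaw can be applied when the object
# leaves the hand.
def yaw_correct(vx, vz, degrees=30.0):
    """Rotate the horizontal velocity components (vx, vz) by `degrees`
    about the vertical axis to compensate a constant yaw error."""
    a = math.radians(degrees)
    return (vx * math.cos(a) - vz * math.sin(a),
            vx * math.sin(a) + vz * math.cos(a))
```

Whether the correction is +30 or -30 degrees depends on the engine's coordinate handedness; the point is just that a constant directional offset is trivially fixable in software.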
In conclusion I think the tech has a lot of potential. The hardware is pretty accurate and the latency is barely noticeable. The problems they need to solve are mostly software related. I think that with a proper SDK, developers will be able to accomplish wonders with this. I'd like to thank Brandon, Spencer and the entire team for inviting me over. I had a lot of fun and I'm really looking forward to trying the released product. Good luck guys!