Hardware

Control VR

Moderator: JC Denton

Control VR

Unread postby icycalm » 03 Jun 2014 13:42

Control VR w/ Oculus DK1 in Tuscany
https://www.youtube.com/watch?v=LPszKhewSec

Published on May 21, 2014

19-sensor inertial chest-to-finger system. @BrandonJLa


It's unreal. Apparently costs $13,000 right now (read it in the comments). Haven't found an article or official site yet.
User avatar
icycalm
Hyperborean
 
Joined: 28 Mar 2006 00:08
Location: Tenerife, Canary Islands

Unread postby icycalm » 06 Jun 2014 19:58

http://www.neogaf.com/forum/showthread.php?t=831538

BeekleMatter wrote:
[attached image: lhAqWfC.png]

[attached image: ffe1da20bac18a1951ed584dd7b21734_large.png]


Basically, it motion-caps your arms and hands down to the fingertips. Looks amazing as the next step in achieving VR presence. Totally reasonable goal of $250,000, too. It seems the dev kit that includes the entire rig will cost $600.

Just went up, check it:
https://www.kickstarter.com/projects/co ... on-and-mor


They already have $133,349 out of the $250,000, with 29 days to go.

Zuckerberg will be needing a couple more billion soon.

Unread postby icycalm » 06 Jun 2014 20:07

http://www.neogaf.com/forum/showthread. ... t115033234

BeekleMatter wrote:
Man wrote:Noticeable latency.


I noticed that, too. But in this video of them playing a game they made on it, there is virtually none.

[attached animation: ezgif-save_(1).gif]


That video could have easily been made to show no latency by a little editing, something which could not have been done in the first video...

Unread postby infernovia » 11 Jun 2014 15:34

They invited people to check out their hardware: https://www.youtube.com/watch?v=xLAdnDKZ77o

At the end of the video, they point out that the motion-to-photon latency is about 52ms, and that a keyboard on the same system has around 40ms. Didn't see the evidence so I am not buying it, but it's nice that it's nowhere near the 300ms some people were claiming. Hopefully we can get an expert to look at it, like Tom Forsyth or Doc_OK, who has done some great analysis of this type of hardware: http://doc-ok.org/?p=1003
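For reference, the frame-counting method behind Doc_OK-style latency analyses boils down to simple arithmetic: film the hand and the screen together with a high-speed camera, then count frames between the real motion and the on-screen response. A minimal Python sketch (the frame numbers and camera rate below are hypothetical, not Control VR measurements):

```python
# Sketch: estimating motion-to-photon latency from a high-speed
# camera recording that shows both the real hand and the display.
# All numbers below are made up for illustration.

def motion_to_photon_ms(motion_frame, photon_frame, camera_fps):
    """Latency = frames elapsed between the real hand moving
    (motion_frame) and the virtual hand moving on screen
    (photon_frame), converted to milliseconds."""
    frames_elapsed = photon_frame - motion_frame
    return frames_elapsed * 1000.0 / camera_fps

# e.g. a 240 fps recording where the real hand moves at frame 100
# and the on-screen hand responds at frame 112:
latency = motion_to_photon_ms(100, 112, 240)
print(latency)  # 50.0 ms, in the ballpark of the claimed 52 ms
```

At 240 fps each frame is about 4.2ms, so a hand-count of a dozen frames already pins the latency down well enough to confirm or refute a 52ms claim.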

https://twitter.com/tom_forsyth/status/ ... 5117425664

Tom Forsyth wrote:@JonOlick As ever, the elephant in the room is to clap your hands. Vicious test of both latency and precision.


Not an expert's criticism, but it brings up some points I hadn't seen before:

http://www.reddit.com/r/oculus/comments ... pressions/

The first demo with the gloves that I got to try was a moon exploration demo that they created. Your avatar is wearing a full astronaut suit with some weird interface at the bottom of your helmet that I didn't really get (I think they should remove those). Once Spencer told me that I could look at my hands I was amazed! I could rotate them, squeeze them, do weird finger movements and they were all tracked fluidly. The latency was barely noticeable honestly. I would guesstimate it at around 50-75 ms between moving a finger and seeing it move in the HMD. They also had a TV behind showing what I was seeing. Later on I noticed the TV had a much higher latency than what you actually see in the HMD.

I tried several things to push the gloves' limits. I tried holding my hands together; the palms were touching in the game but the fingers were bending backwards a little bit. This was due to calibration, they said, and in fact later on, after re-calibrating, my fingers were closer when doing the same pose. The arm models moved very closely to my actual arms; this was quite impressive. The shoulders of the skeletal model were probably a little wider than mine, so some arm/elbow movements were a little off. Also, when putting a hand on top of my head, the hand appeared an inch in front of my forehead in game. Again, I think the little glitches come from the initial calibration. It's important to note that none of the demos had any noticeable drift. In fact, the Rift's yaw (DK1) seemed to drift while my arms and fingers stayed in place.

The next demo I tried was a modified Tuscany demo with tables that had beach balls, baseballs, cakes, Big Macs and other things to grab. Without haptics it's a little strange to grab things, but after a couple of minutes you get used to it. Their demo isn't really polished; the gestures to grab things are rudimentary, and the objects you pick up anchor to your wrists, so it's hard to figure out how to throw/drop them. Throwing ping pong balls didn't quite work: the balls ended up leaving your hand at 30 degrees left of your intended direction for some reason. Throwing Big Macs over the stone wall in Tuscany was quite a challenge for another guy that tried it after me.

In conclusion I think the tech has a lot of potential. The hardware is pretty accurate and the latency is barely noticeable. The problems they need to solve are mostly software related. I think that with a proper SDK, developers will be able to accomplish wonders with this. I'd like to thank Brandon, Spencer and the entire team for inviting me over. I had a lot of fun and I'm really looking forward to trying the released product. Good luck guys!
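The observation in that write-up that the DK1's yaw drifts while the inertial arm tracking stays put is worth unpacking: yaw from a gyro alone is an integral, so any sensor bias accumulates without bound, whereas blending in an absolute reference (a magnetometer, or a camera in a hybrid system) keeps the error bounded. A minimal Python sketch of the difference, with made-up numbers:

```python
# Sketch of why pure gyro yaw drifts and how a complementary filter
# with an absolute reference keeps it bounded. All values are
# illustrative, not Control VR's or Oculus's actual parameters.

def integrate_gyro(yaw, gyro_rate, dt, bias):
    # Dead-reckoning: integrating rate plus a small sensor bias
    # accumulates error without bound over time.
    return yaw + (gyro_rate + bias) * dt

def complementary_filter(yaw, gyro_rate, dt, bias, reference_yaw, alpha=0.02):
    # Blend the gyro prediction with an absolute reference each step;
    # the bias-driven drift is continuously pulled back toward truth.
    predicted = integrate_gyro(yaw, gyro_rate, dt, bias)
    return (1 - alpha) * predicted + alpha * reference_yaw

# Stand still (true yaw = 0) for 60 s at 100 Hz with a 0.1 deg/s bias:
dt, bias = 0.01, 0.1
gyro_only = filtered = 0.0
for _ in range(6000):
    gyro_only = integrate_gyro(gyro_only, 0.0, dt, bias)
    filtered = complementary_filter(filtered, 0.0, dt, bias, 0.0)
print(gyro_only)  # ~6 degrees of accumulated drift
print(filtered)   # stays well under 0.1 degrees
```

That would be consistent with a 19-sensor inertial rig holding its pose while the DK1's headset yaw, corrected less aggressively, wanders.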


The hybrid camera system could eventually fix the hand placement on top of the head, so I am not that worried about it, but I hope they can fix the ping pong ball not throwing properly.
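A rough sketch of how such a hybrid camera correction could work, assuming a simple blend of the low-latency inertial estimate with occasional absolute camera fixes (the function name, gain, and coordinates are illustrative assumptions, not Control VR's actual pipeline):

```python
# Hypothetical sketch of a hybrid camera correction: inertial
# dead-reckoning gives fast relative hand positions, and occasional
# camera fixes re-anchor the absolute position (e.g. the
# hand-on-head offset described above). Numbers are made up.

def fuse_position(inertial_pos, camera_pos, gain=0.3):
    """Nudge the inertial estimate toward the camera measurement.
    A small gain keeps the low-latency inertial motion while slowly
    removing the accumulated absolute offset."""
    return tuple(i + gain * (c - i) for i, c in zip(inertial_pos, camera_pos))

# Inertial tracking puts the hand an inch in front of the forehead;
# the camera sees it on top of the head. Repeated fixes close the gap:
est = (0.0, 1.70, 0.03)     # metres: x, height, forward offset
camera = (0.0, 1.75, 0.0)
for _ in range(10):
    est = fuse_position(est, camera)
print(est)  # converges toward the camera's absolute position
```

The design choice here is the usual one: the camera is trusted for *where* the hand is in absolute terms, the IMUs for *how* it is moving moment to moment, so each covers the other's weakness.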
User avatar
infernovia
 
Joined: 21 Apr 2009 19:37
Location: Wisconsin, US

