It's safe to say we had a great time at our first virtual event for Looking Glass Portrait. You can now relive the entire webinar, along with responses to questions we didn't get a chance to answer during the program. We'll be updating these over the next 24 hours, so if you don't see your question answered yet, we hope to have it here before the end of our campaign.
Want to chat more about Looking Glass Portrait? We're all ears (and rays of light). Send us a message at future@lookingglassfactory.com.
Looking Glass Portrait is available for pre-order here.
How did you come up with this device?
We'd been working on Looking Glass Portrait for over a year before we made the announcement. Though we've been shipping holographic displays since 2018, they were mainly used by a budding community of developers, hackers, and technical specialists. We invented Looking Glass Portrait because we wanted to make holographic display technology available to an even wider audience. For the full story of how Looking Glass Portrait came to be, check out this post by Looking Glass Factory Co-founder and CEO Shawn Frayne.
Which smartphones are supported?
Any iPhone that can take a Portrait mode photo is supported. Additionally, we've recently published an initial list of supported Android devices.
Can we see a demo of the photos we sent running in Looking Glass Portrait?
We're currently working on filming the initial submissions. Those folks will receive clips of their holograms in Looking Glass Portrait soon. Want to be the next to see yourself in Looking Glass Portrait? Send us a shot here.
Would Looking Glass Portrait allow live model viewing within a 3D software? I would like to use this as a reference as I model.
Absolutely! We recently published posts with best practices for creating holographic renders in Blender and Unity. Using another program? Drop us a line at future@lookingglassfactory.com and let's chat about it!
Does Looking Glass Factory collaborate with artists to rent sets for gallery spaces or for theatrical productions? Custom built frames?
Definitely. We've partnered with artists in the past to work on some pretty neat custom exhibits, including this one with the visionary Tim Burton at the Neon Museum. If you had something like this in mind, definitely send us a message!
Is there a possibility to transform (2-photo) stereo photos?
Yup! Early on in the Kickstarter campaign, we unlocked a stretch goal: by Q3 of 2021, we'll build direct import of side-by-side (SBS) stereo images into HoloPlay Studio.
Can you run your own dynamic apps on the device, like ones that pull data from external sources?
In Desktop Mode, Looking Glass Portrait is fully capable of running interactive applications built with whatever peripherals you choose to experiment with! A lot of early applications built for our first generation systems, for example, made use of the Leap Motion Controller for experimental motion control.
How long does the screen last? Is it meant to be left on indefinitely, like a digital photo frame replacement? What about the warranty?
The estimated lifespan of the screen is about 15,000 hours, which is around 5 years when operated 8 hours a day. We offer a 12-month limited warranty.
Does this support OTA content sharing? Like, can I install one in my mom’s house and send content direct to it in standalone mode?
At initial ship (March 2021), this won't be a built-in feature. That said, we're already hearing a lot of early feedback from the community about sharing content between Looking Glass Portrait units, and we'll definitely discuss internally how we might build in basic functionality of this sort sometime later this year.
Would it work if we do an illustration and paint a synthetic depth map for it?
Yes absolutely! While we haven't explored this medium as much as we'd like, Missy from our team has experimented with editing depth maps for more clarity in this post she wrote a couple years back.
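If you'd like to experiment before your unit arrives, here's a minimal sketch (in Python, using Pillow) of pairing an illustration with a hand-painted depth map as a single side-by-side RGB-D image of the kind HoloPlay Studio can import. The layout and filenames here are assumptions for illustration, so double-check against the HoloPlay Studio documentation:

```python
# Hypothetical example: combine an illustration and a hand-painted depth map
# into one side-by-side RGB-D image (color on the left, depth on the right).
from PIL import Image

color = Image.open("illustration.png").convert("RGB")          # your artwork
depth = Image.open("painted_depth.png").convert("L").convert("RGB")
depth = depth.resize(color.size)                                # match sizes

rgbd = Image.new("RGB", (color.width * 2, color.height))
rgbd.paste(color, (0, 0))               # color on the left half
rgbd.paste(depth, (color.width, 0))     # depth map on the right half
rgbd.save("illustration_rgbd.png")      # import this into HoloPlay Studio
```

Which end of the grayscale range reads as "near" depends on the import settings, so it's worth a quick test with a simple gradient first.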
Did you fix the issue with the corners getting black?
Yes! That was a slight manufacturing defect in our older generation systems that we've since completely solved. In fact, if you still have an older system that you want us to take a look at, shoot us an email at future@lookingglassfactory.com and we'd be more than happy to replace it for a small shipping fee.
Would this allow live model viewing within a 3D software? I would like to use this as a reference as I model.
With our current software, you'd be able to do live 3D modeling in Blender as well as live development in Unity and Unreal using our plugins. These are all available to browse on our software page here.
Is it possible to retouch those generated depth maps you announced?
So glad you asked – yes! We wrote a post about this for our earlier generation products, you can read more about that process here.
Does HoloPlay Studio allow a user to directly connect the Azure Kinect in the way you demonstrate?
The software that you saw in the webinar is not HoloPlay Studio but a separate app called Depth Recorder, which will also be available for Looking Glass Portrait users to download and use when we ship. Recorded videos can then be saved and imported into HoloPlay Studio for loading into your Looking Glass Portrait.
Will the Raspberry Pi be fully accessible?
You can access the Pi 4 by taking off the back of the display, which is attached with Phillips-head screws. We don't currently have an API to modify its functionality, but it is hacker-accessible. Opening the back of the system will not void your warranty if you just want to take a look, but if you make modifications to the system, the warranty will be voided.
What is the viewing angle?
The viewing angle is 58°.
Can the motion controller be used in standalone mode?
If you mean the Leap Motion Controller, the answer is sadly no :( The SDK software for that peripheral is only available for PC.
Will the app and software be shared with us before the item is shipped?
Absolutely! That will all be ready when your units ship. In fact, we've been making holographic displays for a while now so some of the software we mentioned is already available on our website here.
Does that Kinect stand extend vertically so that the cam is above the portrait, in a video conference mode?
While we haven't released software of our own for holographic video conferencing, the Azure Kinect Stand was built for Looking Glass Portrait exactly as you describe, with the camera sitting above the display.
For capturing a series of photos, is it best to have the camera slide in a line or a circular arc?
We recommend taking a series of these photos in a horizontal straight line.
When will Looking Glass Portrait be produced at scale and commercialized? When will it be released, in the US and other countries? And at what estimated price?
The first Looking Glass Portrait units will start shipping out to customers as early as this month (for our beta shipments). For the next batch of customers, these will start shipping out in March 2021 and when our campaign ends, we'll continue to take pre-orders directly via our website at lookingglassfactory.com. Right now, Looking Glass Portrait units ship worldwide (aside from a few select countries).
Does it have to stay portrait, or can it be set on its side for horizontal?
Funny you ask! We actually released a landscape version of this display (in a slightly different form factor) as our first generation system back in 2018. You can see that here.
Your light field rig looked pretty precise. How difficult will it be to capture a light field photo without the rail?
Hah! Thanks! We tested out a bunch of camera rails and dollies before settling on this one. You certainly can capture without it, and we want to stress that you do not need our rail to get started; we just wanted to make it easy for you to get set up. Most motorized rails on the market will work fine, and some more expensive rails out there may work even better than ours. It all depends on how far you want to go with the medium, and we definitely encourage more folks to start playing around with different setups (because some may be better than ours!)
Can you use light field cameras, like Lytro?
We will not build support for loading .LFP files directly. However, using Lytro Desktop Software (http://lightfield-forum.com/lytro/lytro-archive/lytro-desktop-software-official-product-information/), you'll be able to process and export depth maps, which you can then load into HoloPlay Studio to be visualized.
In the Azure Kinect demo, what are you streaming? RGB+D or something else? And what is your frame rate?
Yep, we're streaming RGB+D and the frame rate is up to 30fps.
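For anyone who wants to poke at that stream themselves on the desktop side, here's a rough sketch using the third-party pyk4a library (this is not our Depth Recorder code, and the configuration values are just assumptions):

```python
# Hedged example: grab one synchronized RGB + depth capture from an Azure
# Kinect at up to 30fps using the third-party pyk4a bindings.
from pyk4a import PyK4A, Config, ColorResolution, DepthMode, FPS, ImageFormat

k4a = PyK4A(Config(
    color_resolution=ColorResolution.RES_720P,
    color_format=ImageFormat.COLOR_BGRA32,
    depth_mode=DepthMode.NFOV_UNBINNED,
    camera_fps=FPS.FPS_30,                # the 30fps ceiling mentioned above
    synchronized_images_only=True,
))
k4a.start()

capture = k4a.get_capture()
color = capture.color                     # BGRA color frame (numpy array)
depth = capture.transformed_depth         # depth re-projected onto the color camera

k4a.stop()
```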
Can a multiple-camera setup generate a hologram?
It certainly can! We'll release more guidance over the next few months about what that capture format looks like, but essentially, if you were to set up a big camera rail system with multiple cameras set to fire all at once, they would make for some incredible light field photos :D
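To make that a bit more concrete, here's a hedged sketch of how a set of simultaneous captures could be stitched into a quilt image. The 8x6 grid, the bottom-up view ordering, and the filenames are assumptions for illustration; check our quilt documentation for the exact settings your display expects:

```python
# Hypothetical example: stitch 48 per-camera views (ordered from the leftmost
# to the rightmost camera) into a single quilt image.
import glob
from PIL import Image

views = sorted(glob.glob("views/view_*.png"))   # hypothetical filenames
cols, rows = 8, 6                               # assumed 8x6 quilt layout
assert len(views) == cols * rows

tile_w, tile_h = Image.open(views[0]).size
quilt = Image.new("RGB", (tile_w * cols, tile_h * rows))

for i, path in enumerate(views):
    col = i % cols
    row = rows - 1 - (i // cols)                # assumed bottom-up row order
    quilt.paste(Image.open(path).convert("RGB"), (col * tile_w, row * tile_h))

quilt.save("quilt_8x6.png")
```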
Does the light rail take a video (30/60 fps) or a sequence of photos at a much lower fps?
Before I dive in, just wanted to note that the Light Field Photo Rail is just the motorized rail and doesn't include the camera (haha!). It's intended to capture a panning video that you can then import into HoloPlay Studio, which will do all the frame extraction needed to convert that capture into a holographic image for Looking Glass Portrait.
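If you're curious what that frame extraction step amounts to, here's a rough standalone sketch of pulling evenly spaced frames out of a panning video. HoloPlay Studio does this for you on import; the filename and view count below are just assumptions:

```python
# Hedged example: sample N evenly spaced frames from a panning capture video.
import os
import cv2

cap = cv2.VideoCapture("panning_capture.mp4")   # hypothetical filename
total = int(cap.get(cv2.CAP_PROP_FRAME_COUNT))
n_views = 48                                    # assumed number of views

os.makedirs("views", exist_ok=True)
for i in range(n_views):
    cap.set(cv2.CAP_PROP_POS_FRAMES, int(i * (total - 1) / (n_views - 1)))
    ok, frame = cap.read()
    if ok:
        cv2.imwrite(f"views/view_{i:02d}.png", frame)

cap.release()
```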
Would a display 4x as big work?
We make them already! Check them out on our website here. We launched the Looking Glass 8K in late 2019.
Is there an internal battery so that it is portable? If so, is it replaceable?
There is no internal battery in the Looking Glass Portrait system, though we have had it run for a few hours with an external battery pack! The battery pack will have to provide at least 5V/3A.
Will your applications be made available for linux systems?
Our Linux support is currently limited because our users (and we!) rely heavily on game engines like Unity and Unreal, which themselves have only limited Linux support. Having said that, we'll be building out our Linux offerings in 2021 and will likely have more information in early 2021 when shipments start going out, though support will likely not extend to 32-bit or ARM systems.
Are (will) other 3D cameras be incorporated as input sources? (eg: Kinect, OAK-D, etc)
The Azure Kinect will be the first compatible 3D camera to work with Depth Recorder and after our first round of user testing, we'll explore building in support for more. That said, while certain 3D cameras might not be supported in our official tools, there are already members of our community experimenting with things like OAK-D > Looking Glass Portrait streaming.
I have pre-ordered one Looking Glass Portrait already. How do I add another one?
You can add an additional Looking Glass Portrait to your pledge right now OR you can wait till BackerKit surveys go out in order to add to your pledge. There might be a slight cost addition (maybe a difference of about ~$10-$20) when pledged after the campaign ends, so we definitely recommend doing it now!
This was all about Standalone Mode. How will the Desktop Mode work?
Great question! While most of the demos you saw in our webinar were running in Standalone Mode, the Azure Kinect demo was actually running in Desktop Mode (though we hid the laptop off-camera). There are certainly a lot of elements of Desktop Mode we didn't quite get into, so it seems you might have given us an idea for what our next webinar could be about!
Want to know more? We'll be updating this list over the next day so stay tuned!
One last thing, be sure to follow us on Twitter, and join our Discord server!