InstaVR Interviews: Meet the VR Practitioners

InstaVR Interviews is a blog series where we turn the spotlight on our customers. We find out why they create VR, how they use InstaVR, and what the future of VR will look like. To read more interviews, visit the InstaVR Interviews homepage.

Antoine Vivet, Founder of Longue-Vue (a division of Manzalab Group)

About Antoine:

After graduating from the Olivier de Serres arts school in Paris, Antoine started working in communications and branding. Then, after directing two award-winning short films, he naturally turned to multimedia, web design, motion design and UX design. In 2004, he created the LONGUE-VUE agency and worked with major organizations such as RATP, Orange, and the Château de Versailles. Since 2016, Antoine has been working with Manzalab to produce interactive 360° films.

About Manzalab:
Created in 2010, Manzalab has been exploring new technologies and inductive methods in order to create more immersive and collaborative experiences in the training, communication and design fields.
Offices in France & Singapore, With a Focus on Cross-Device VR & AR Digital Experiences

Question: Tell us about Manzalab…

Answer: Since 2010, we have combined the scientific rigor of neuroscience research with technological developments to support all sorts of creative ventures.

With offices in Paris, Aix-en-Provence and Singapore, our ecosystem comprises five different but complementary centers of excellence:
Manzalab, Manzavision, Akenomy, CyberZen and Longue-Vue. Together, we create new digital experiences covering several different areas, such as training, communication, design, cyber-security and entertainment. At the forefront of collaborative VR, we design our experiences for PCs, tablets and mobile phones as well as for virtual and augmented reality environments.
VR Immersion is Key

Question: What specifically about Virtual Reality drew you to the technology?

Answer: We were drawn by the possibility of visual and sound immersion, and by the potential of becoming an actor in an interactive experience. It’s also the only way to discover an environment that is difficult to access, to save time in visiting a distant place, and to learn more about an existing one.

Use a Combination of Insta360 + GoPro Cameras, as Well as Premiere Pro + After Effects 

Question: How do you create your 360 environments? What hardware do you use? What additional software besides InstaVR do you use?

Answer: We use the Insta360 Pro camera for high-resolution stereoscopy. To complement the ambisonic sound, we add HF wireless microphones for the actors. We also use this camera for its high-quality 360° HDR photos. For camera movements, we use the GoPro Fusion and Insta360 ONE X, thanks to their ease of use on a drone and underwater.

For post-production, we use Premiere Pro and After Effects and various 3D software.
Selected InstaVR for Easy Administration and White-Label Oculus Go and Vive Publishing

Question: How did you learn about InstaVR? How did you end up selecting us as your VR publishing partner? 

Answer: We studied several online solutions before choosing to work with InstaVR.

Our interactive projects that don’t require 3D or complex interaction are made with this online solution (InstaVR), which is very useful for customers who want an administration interface to manage their own media. InstaVR Pro is an online solution providing tools to create white-label 360-degree interactive content that works on Oculus Go and Vive.
Projects Customized Based on Client Needs

Question: Are you using the advanced features of InstaVR?

Answer: We try to customize the navigation icons as much as possible so that we stay faithful to our customers’ brand guidelines. The same applies to the choice of media style in the 360° spheres.

Main Output for Customers is Oculus Go

Question: What are your main output types from InstaVR? Why have you chosen them? 

Answer: Most of our customers ask us for interactive movies for Oculus Go headsets. But we also like to offer a WebVR version, which is very useful for the validation phases because it is easy to view on a computer screen.

Better Concentration Major Selling Point of VR

Question: How do you initially pitch clients on why they should use VR? 

Answer: Our main selling point is immersion. We also emphasize that attention is better focused when using a VR headset than in front of a flat screen.

Featured Clients Include Paris Natural History Museum and EDF 

Question: Can you give a couple examples of the success that your clients have achieved by using VR? 

Answer: We made a 360° and stereoscopic film for the Paris Natural History Museum, which is very popular. Interactive virtual tours are also very interesting, both to discover places that are difficult to access and to provide explanations.

For example, we worked with EDF, France’s main electricity provider, on a remote island near Singapore to showcase a mini autonomous green power plant. The visitor can discover the components of the power plant with explanations, such as details on how the solar panels or generators work.
Looking Forward to More Robust VR Resolutions 

Question: What’s next for you? Where do you see the future of VR and what excites you about the medium going forward? 

Answer: The current 2.5K resolution of VR headsets is not yet optimal; 4K or 6K would give better user comfort. We are also looking forward to mixed reality. We hope that InstaVR will allow the integration of 3D objects into 360° scenes, such as 3D spheres, to enhance reality or to simulate a space under construction.

Thank you to Antoine Vivet for the interview! To learn more about his work and Manzalab Group, visit https://www.manzalab.com


At the F8 Conference, Facebook’s Mark Zuckerberg revealed that the new Oculus Quest will begin shipping this month (May 2019). Shortly afterwards, we announced InstaVR publishing compatibility with the headset, including support for the Touch Controllers.

There’s been tremendous excitement for the Quest since then, with numerous media outlets already giving it glowing reviews. Many companies and VR fans have already placed their pre-orders for the headset.

But for those newer to VR — or those currently deploying their apps on Oculus Go or Samsung Gear VR headsets — the question is: should you invest the $399 per Oculus Quest to help achieve your business goals? The answer for many businesses is “yes.”

We previously reviewed the specs of the Quest headset when it was first announced. It is definitely a good fit for a subset of users that need a more powerful, mobile, standalone headset. This is particularly true for enterprise use cases — employee training, sales/marketing presentations, and healthcare/engineering/BIM.

We wanted to delve a little deeper though into the three main features of the headset that will drive adoption. If you need any of these three features, you’ll likely want to invest in the Quest. Let us know if you have any questions on the headset and we look forward to seeing your published apps in a couple weeks!

1. If you need 6 Degrees of Freedom (6 DoF), with inside-out tracking, in a mobile standalone headset…

The first major differentiator you’ll notice about the Quest vs. the Oculus Go/Samsung Gear VR is that it utilizes 6 Degrees of Freedom (6 DoF) tracking. By comparison, the Oculus Go only allows for 3 Degrees of Freedom (3 DoF).

What are Degrees of Freedom? They’re the directions of motion you can move in that the headset recognizes and conveys properly in the virtual scene. With the Oculus Quest, four cameras that reside within the headset enable positional tracking. For a more complete understanding of that feature, read this informative overview.
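To make the distinction concrete, here is a minimal Python sketch (the class names are our own illustration, not part of any Oculus SDK) of the pose data a 3 DoF headset reports versus a 6 DoF one:

```python
from dataclasses import dataclass

@dataclass
class Pose3DoF:
    # A 3 DoF headset like the Oculus Go only reports head rotation.
    yaw: float    # turning left/right, in degrees
    pitch: float  # looking up/down, in degrees
    roll: float   # tilting the head, in degrees

@dataclass
class Pose6DoF(Pose3DoF):
    # A 6 DoF headset like the Quest adds positional tracking, so leaning
    # or walking also moves you through the virtual scene.
    x: float  # left/right translation, in meters
    y: float  # up/down translation, in meters
    z: float  # forward/back translation, in meters

# Stepping one meter forward is only representable on a 6 DoF device:
pose = Pose6DoF(yaw=0.0, pitch=0.0, roll=0.0, x=0.0, y=1.6, z=-1.0)
print(pose)
```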

The Oculus Go does standard 360-degree images and video well. So if you’ve recorded a simple 360 video you’d like to use for a sales presentation, where the viewer can sit in a chair and move their head up/down/left/right, then you’re probably fine using the Go.

But clients are making more and more complex applications: ones where orientation in a scene is important, and where you can walk around objects and see them from multiple angles.

That is where the 6 DoF of the Oculus Quest becomes necessary. It more accurately represents true movement, and allows for greater scene activity. You can, for instance, walk around a virtual car, and the Quest will use sensors in the headset to more accurately convey your position within the virtual scene.

Importantly, early reports on the inside-out tracking have been strong. There’s always a fear the positional tracking will have limited range or have interruption issues — but most reviews we’ve read, in addition to our own work with the Quest, suggest the inside-out tracking is top notch for a mobile headset.

2. If You Want Powerful Touch Controller Interactivity…

Some people prefer gaze navigation. Others like the single-touch option of the Oculus Go & Samsung Gear VR controllers. But for users that want maximum interactivity — the ones serious about VR that are using InstaVR Enterprise + Oculus Quests — a powerful hand controller is essential. And Oculus Quest provides two Touch controllers that have tested very well so far.

As you can see from our screenshot below, you’ll be able to more closely mimic immersion into a virtual world. Those two virtual hands can do so much more than with a single-click controller. In the case of the below scene, you can move the chair to a new location using the Touch controllers. This level of motion with hand controllers provides a greater level of interactivity, and consequently, a more immersive and memorable VR experience.

For instance, a virtual surgeon application will more closely approximate your real hand movements, allowing for a more realistic and thorough training experience. The Touch Controllers + 6 Degrees of Freedom tracking provide Oculus Quest users a huge leap forward over the more basic 360 video watching possible with Oculus Go & Samsung Gear VR.

3. If You Want Amazing Visuals, Including 1,440 x 1,600 Resolution Per Eye…

The Oculus Quest has dual OLED displays that provide 1,440 x 1,600 resolution per eye. Very impressive for a mobile headset, especially when you realize the Oculus Rift only supplies 1,080 x 1,200. The Field of View is roughly the same as the Oculus Go (~100 degrees), which is fine for most users.
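For a rough sense of the difference, here is a quick back-of-the-envelope calculation (our own arithmetic, not an official spec comparison):

```python
quest_px_per_eye = 1440 * 1600  # 2,304,000 pixels per eye
rift_px_per_eye = 1080 * 1200   # 1,296,000 pixels per eye (original Rift)

increase = quest_px_per_eye / rift_px_per_eye - 1
print(f"The Quest renders about {increase:.0%} more pixels per eye than the original Rift.")
# -> roughly 78% more pixels per eye
```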

Early reviews, and our own experience, indicate that the visual quality is equal to or even better than the Oculus Rift’s. This is pretty amazing for a mobile headset not tethered to a high-powered computer and graphics card. We can thank the Qualcomm Snapdragon processor for that!

Why do visuals matter so much for InstaVR users?

As part of our announcement on compatibility with Quest, we also announced that standard Enterprise clients can now ingest 3D CGI files such as FBX, OBJ, and VRML. This is something we’ve previously been able to do, but hadn’t productized for end users — reserving the capability for certain top-level clients. But now anyone with an InstaVR Enterprise subscription can create immersive VR with these file types.

High-resolution 360 3D CGI files involve a high level of visual complexity. The Quest is a near-perfect headset for viewing them, especially given the headset’s price point. So far, we haven’t experienced any lag or pixel distortion in our testing. The level of immersion truly is extraordinary, perfect for our clients capable of creating 360 3D CGI renders and videos.

4. Conclusion

As we said in our initial impressions of the Quest last year, the headset is the perfect fit for a certain subset of users with particular needs. Clients serious about VR — be it Fortune 500 members, VR agencies who use the technology daily, or forward-thinking companies distributing VR to hundreds or thousands of employees — will want to take a serious look at the Oculus Quest.

More and more companies are seeing the importance of using VR in their enterprises. As such, they’re migrating from using standard 2D or 3D image/video to immersive, interactive 3D 360 CGI environments. Authoring in InstaVR Enterprise and deploying to Oculus Quest provides one of the most impactful ways to utilize VR in your organization. We’re looking forward to helping you create amazing VR experiences!

*Note: Publishing 360 3D CGI files such as VRML, STL, and FBX to Oculus Quest and Oculus Rift S headsets requires an InstaVR Enterprise subscription. To learn more about InstaVR Enterprise, contact our sales team via email: contact at instavr.co


InstaVR Launches Single-Click Publishing to Oculus Quest and Rift S, Including World-Class Touch Interactivity
Easily Create and Distribute Immersive 3D Applications to the Oculus Platforms without Coding

San Jose, CA – May 1, 2019 – InstaVR Inc (https://www.instavr.co), a virtual reality (VR) authoring, publishing, and analysis company, today launched Oculus Rift S and Oculus Quest publishing compatibility, including the ability to interact with 3D environments using Touch controllers.

Facebook’s Oculus division will start shipping the Oculus Rift S and Oculus Quest this month, high-quality VR headsets that enable 6 Degrees of Freedom (6 DoF) movement and interactivity through powerful hand controllers.

With InstaVR, you can already utilize a wide variety of 3D file formats, including FBX, OBJ, and VRML. After publishing to the Oculus Rift S or Quest with a single click from InstaVR, users can be transported in their headsets to immersive digital environments. Touch interaction is automatically enabled when 3D models are deployed, allowing for maximum interactivity.

InstaVR clients can also publish new or existing standard 360-degree image and video-based applications to Oculus Rift S and Quest headsets. Existing InstaVR features including Success Reports are compatible with the new publishing outlets too.

“Our clients create immersive, interactive VR applications that are integral to their businesses,” said Daniel Haga, CEO and Founder of InstaVR. “We’re very happy to provide publishing compatibility with these two new exciting Oculus platforms, which we expect many of our forward-thinking clients to utilize.”

Popular use cases for these new robust VR publishing outlets include employee training, employee recruitment, sales & marketing presentations, and client communications.

InstaVR 3D application publishing to Oculus Rift S and Quest, including Touch interactivity, is part of the InstaVR Enterprise subscription.

InstaVR Launches Touch Interactivity on 3D Objects With Oculus Quest / Oculus Rift S Compatibility - YouTube

###

About InstaVR Inc

Based in San Francisco, InstaVR is a virtual reality focused technology company, providing tools and services to enable anyone to author and publish interactive VR applications. InstaVR requires no prior VR experience and no specialized engineering knowledge. Since InstaVR launched in early 2016, over 50,000 companies have created over 300,000 immersive 360-degree applications for iOS, Android, Web, Gear VR, Google Daydream, HTC Vive, and Oculus Go and Rift headsets. Clients include Toyota, TUI Group, PwC, US Navy, and GOV.UK. For more information and to access the InstaVR platform, visit www.instavr.co.


How to Use the Oculus Go for Sales & Marketing Presentations

We’ve already established that Virtual Reality (VR) is great for sales and marketing presentations. It’s immersive, interactive and memorable. It transports viewers through sight and sounds in an incomparable way.

Now that Oculus Gos are widely being used in B2B and B2C contexts, we wanted to explore more about how to use the headset specifically for these types of presentations. The Oculus Go, like VR in general, is a perfect fit for sales/marketing — it’s mobile, easy-to-use, and very powerful for the price point. It’s an incredible sales tool at only $200 per headset!

Below, we’ll walk you through some best practices for creating great sales/marketing presentations with InstaVR + Oculus Go. This can be for 1-to-1 sales meetings, one-to-many presentations, trade shows, events, and much more. The headset is flexible for multiple use cases.

And creating VR apps for the Oculus Go is easy. InstaVR requires no coding or prior technical knowledge. We offer both web-based and hybrid software/cloud solutions for creating compelling VR sales experiences.

Thanks for using InstaVR to create Oculus Go apps and reach out to us at “contact at instavr.co” with any questions!

1. Make Your Sales Presentations Visually and Audibly Impressive by Using a Good 360-Degree Camera

If you haven’t made a VR app before, read our guide to authoring a VR application first.

The publishing section of that guide talks specifically about iOS/Android/Google Cardboard. You’ll be skipping those last few steps and instead using the Oculus Go Publishing Guide we’ve put together. All the other authoring features are applicable to the Go though, including Navigation Links, Hotspots, Voiceover, and more.

For your apps, you’ll want to use either really good 360-degree CGI renders or a really good 360-degree camera. We’ve come a long way in the camera department in just a few years. We now have 180-degree cameras, which allow you to quickly and easily do a shoot without much prep. We have low-cost, high-quality cameras, like the Insta360 One X that came out at the end of last year. And we have extremely immersive 3D 360-degree cameras like the Insta360 Pro, used by most VR agencies and companies serious about great presentations.

A good rule of thumb: any camera under $400, or below 4K resolution, is probably going to give you sub-optimal image and video quality.

Don’t forget about the importance of audio. For maximum effect, you may want to use either an external microphone OR overlay your video with additional sounds. Our only requirement for upload of voiceover or narration is that it is in mp3 format.

2. Make Your Sales Presentations Interactive

The Oculus Go comes with an intuitive and easy-to-use hand controller. Make use of it through adding interactivity to your scenes!

Sales presentations will have increased engagement if you utilize Hotspots (2D overlaid images/videos) and Navigation Links (allowing users to choose new scenes). Why? Because interactivity makes the user pay attention and make decisions, creating an overall more memorable experience.

Hotspots can also provide valuable information to a user. If, for instance, your sales presentation is on a car engine… you can use Hotspots to show a close-up of an important engine part, or video overlay to explain how a part of the engine works, or audio narration to audibly explain what the viewer is seeing.

The possibilities are many with Hotspots and Navigation Links. You can create a virtual catalog or make a virtual tour. You can even add Quizzes, so the user engages with the app to answer questions. You can also place clickable Hotspots in your WebVR that open to specific web pages, as Zuo Modern has done so well on their virtual home showroom built using InstaVR.

3. Distribute Globally Using Oculus Go Headsets in Multiple Locations

One of our favorite features of the Oculus Go is the ability to push out apps globally to specific headsets using the Oculus Cloud. This is through Oculus’ Release Channels functionality.

Rather than have your distributed sales team continually send back headsets for you to load new apps via USB, you can instantaneously distribute VR sales/marketing apps globally. So if your salesperson has a big meeting with, say, General Motors, you can customize your presentation specifically for GM and then easily push the app out to your salesperson in Detroit. They can focus on the client, and you don’t have to spend any time on the logistics of physical headset deployment.

Release Channels also allow you to easily localize for language, update information such as pricing, and do A/B testing of your sales apps.

Oculus Go is the perfect headset for large or distributed sales teams. Marketing can manage the creation of apps, and sales teams can easily update via a click of the hand controller in their VR headset. Never has there been an easier, more mobile way to showcase your company in immersive 360-degree 3D virtual reality.

4. Update a Single Sales App Continuously Using InstaVR’s “Direct Publishing” Capabilities in Conjunction with Oculus Go

Another really cool feature of Oculus Go is the Direct Publishing capabilities with InstaVR.

Essentially, it allows you to update an existing app on the Oculus Go headset without having to download the file from InstaVR and re-upload to the Oculus Cloud. What does this mean? You can have a single Sales app on an Oculus Go headset, and then continuously remotely update that one app from your InstaVR account.

Remember all the potential custom updates we mentioned possible with Release Channels? Things like company logos, voiceover, text, etc… This means you can do those changes in the InstaVR Console, and see the update in your headset in seconds. For instance, you can use the same app for multiple clients, just swapping in their logo as a splash image. Doing that type of update will take you just a few minutes in the InstaVR Console — an incredible time saver!

Direct Publishing also allows you to regularly push new app updates to your sales team via Oculus Go headsets. If you have a weekly sales huddle, for example, you can push that out now virtually using Direct Publishing, a very practical alternative to the traditional conference call. You can also train your sales employees by direct publishing training apps. Super easy to accomplish, and you can convey all the important company updates and news in a unique and memorable way.

5. Conclusion

The Oculus Go has revolutionized virtual reality B2B and B2C sales presentations. It’s got very strong technical features, as we mentioned in our initial review, but doesn’t require tethering to a computer like an Oculus Rift or HTC Vive does. You can easily take an Oculus Go (or a few of them) in your carry-on luggage for meetings.

It also has revolutionized the distribution of VR apps. It’s so simple to push out a single app or many apps globally to multiple headsets (up to 100!), as well as to one-click update an app using our InstaVR Direct Publishing capabilities.

If you’re thinking VR might be a good fit for your sales teams, now is the time to fully jump in. It’s easy to use, affordable, and InstaVR makes the app creation process simple. You can be making sales presentation apps in hours and days, instead of weeks and months.

And one thing is absolutely for sure — you’ll make a more compelling, interactive, memorable sales presentation using Virtual Reality than you would with traditional video on a laptop monitor. The future of sales presentations is already here, and it’s made possible via InstaVR + Oculus Go!

InstaVR Interviews: Meet the VR Practitioners

InstaVR Interviews is a blog series where we turn the spotlight on our customers. We find out why they create VR, how they use InstaVR, and what the future of VR will look like. To read more interviews, visit the InstaVR Interviews homepage.

Dr. Tarsem Singh Cooner, Department of Social Work and Social Care at the University of Birmingham, UK

Tarsem is a Senior Lecturer in Social Work, and Programme Director for BA Year 3. In 2011 Tarsem was awarded a Birmingham University Teaching Fellowship in recognition of his wide ranging and significant interventions in learning and teaching across the institution. In his work across the institution he has served on the Board of the Valuing Teaching at Birmingham initiative, the University Learning Environment Group and the University Massive Online Open Courses (MOOC) Steering Group.

Tarsem has been instrumental in developing innovative learning approaches such as his mobile phone/tablet app to help social workers explore how to navigate the ethical issues of using social media in social work.

Tarsem is currently developing an approach to disseminating ethnographic research about child protection using 360-degree video apps.

To download Dr. Cooner’s apps from iTunes and Google Play, visit https://swcpp.weebly.com/360-degree-immersive-apps.html

Inspired by seeing 360-video on “The Gadget Show”; taking the research off the page to convey sense of presence to viewers

Question: Tell us why you decided to use VR for your child protective services research?  

Answer:

We were undertaking a research project in the UK into protecting children from all types of harm. The research took place over two years at two different local authorities. I wanted to present the research in a way that was more accessible and user friendly. I also wanted it to be more realistic and engaging than something conveyed using just text on a written page.

Our original idea was to create YouTube videos of talking heads. To be honest, we got to the stage where we thought everyone’s doing that and it isn’t conveying the sense of presence — tone, body language, atmosphere — that is present on a social work visit, particularly a child protection one.

We have a television show in England called “The Gadget Show.” I was watching it one evening and they were demonstrating 360-degree videos. I thought, this is the perfect way to convey what we as researchers are actually seeing in the field!

I could see immediately that the 360-degree videos could allow us to take viewers with us on a visit, where they could get a sense of the atmosphere, the interactions between parents and their children, the layout of the home, the anxiety the social worker may feel, and so on. We wanted to create a resource that allowed the viewer to have “the best seat in the house” whilst accompanying us on a child protection visit.

So that’s what got me interested. I wanted to take the research off the page and convey a sense of presence for the viewer. I felt that by having a sense of presence, the viewer could learn from our research, in a realistic way, about what can create enablers and barriers to effective child protection work.

Research looks at organizational approach to social work, as well as how to prepare workers to achieve better outcomes

Question: Taking a step back, what is the goal of your research? 

Answer: 

We’re working on a number of apps at the moment. For example, we’re looking at how the design of an organization can enable or impede effective child protection work. For instance, what happens when an organization values time spent developing relationships rather than on performance management tasks, which is more effective at creating safer environments for children and families? How does the philosophy of the organization influence supervision?

We’re also looking at the characteristics workers need to develop effective professional relationships. What kinds of things scare them and cause them to freeze, and not think outside the box and challenge poor parenting? What makes a social worker good at communicating and developing effective working relationships with children and parents in often highly fraught and challenging situations?

Using the 360 media with VR goggles, we have the ability to re-create situations from our research and convey them in a realistic way. For example, having someone shout at you creates a sense of anxiety; engaging with this scenario through 360-degree videos enables us to throw the viewer in at the deep end, where they can reflect on how they reacted and whether this resulted in them losing sight of the child. What we are able to provide is an experience where the viewer can question how the anxiety-provoking encounter impacted their ability to function on a number of levels.

We also want to demonstrate good practices — illustrating how sitting lower than somebody else, adopting a good tone, and using body language to discuss awkward or challenging topics can lead to better relationship-based practice. Our research has demonstrated that if you’re valued at work, if you have good supervision, and you have time to develop relationships long term, better outcomes are produced for children and families. We want to use the apps to convey these messages in a manner that recreates, as far as possible, the experience of real-life situations.

Used a script on shoot day, but allowed for improv; will use branching sequences in the future, but wanted to keep file sizes manageable for today’s cellphones

Question: Can you discuss how you approached creating these apps?

Answer: 

This is something brand new that I was playing with. I’m responsible for filming, editing, and stitching the whole thing together. And so, I’d not done this before, and we couldn’t film actual research visits for very sound ethical reasons.

So what we did was work with real social workers and service users to re-enact scenes from our research experiences. The actors were people who had engaged with the child protection system from the practitioner and service user point of view, so they had a deep understanding of roles they needed to play.

Through analysis of our research we developed themes from the findings and created the scripts, but the scripts were flexible as long as we addressed the core issues. The interactions between the actors could be improvisational.

We hired a house and made sure it looked like one you’d typically find. Then during the day, we looked at the script and practised it. When we were ready to go, we filmed it. What we wanted to do, and what was critically important, was to try to demonstrate in real time the aspects of a visit that “enabled” or created “barriers” to effective child protection work.

What I also wanted to do was put in branching sequences. At different points you could stop the video and say, “What options would you take here?” That is something we will consider, but the trade off is file size. With branching options, you’ll end up with pretty huge file sizes. On the basis of Moore’s Law, we know what we develop today will become better as phone capacities increase in the near future, so watch this space!

1st person POV gives a feeling of presence, making viewer more part of the scene

Question: Why did you decide on 1st person POV approach? 

Answer:

When you think about 360 filming, your first thought is the 2D approach — you place the camera somewhere, and you see the social worker walking up to the house. That just doesn’t work. Conceptually, it won’t be the best use of 360.

So that’s where the idea of having you accompany the researcher came up. We could take you, as the viewer, along on the journey. So you walk to the house beside the researcher, you’re in the house, you’re able to follow the social worker and researcher around the house — you get a sense for what the visit and research observation is really like.

I think for anyone developing 360 videos, they have to consider the perspective from which they want the viewer to engage with the scene. Otherwise the viewer may not feel part of the action.

We Used a GoPro Fusion; editing was done in Final Cut

Question: What hardware, software and other filming equipment did you use? 

Answer:

The camera I used was the GoPro Fusion. The reason: after doing research, I found it to be the best camera for the price. I got a couple of ultrafast 128 GB Micro SD cards so it could cope with what we were filming. I think the quality is just beautiful.

With the GoPro Fusion, you get a stick, so rather than using a gimbal, we just used the stick. I tried to hold the camera at eye level as much as possible, to give you the sense you were eye level with the different actors, or, if you were sitting down, the feeling of what it would be like if someone was standing over you, talking.

I used a MacBook Pro and Final Cut Pro for editing. I started with Premiere Pro, but I’m an Apple accredited editor, with years of experience using Final Cut. For me it just felt that the workflow is better in Final Cut.

Selected InstaVR for intuitive and easy to use nature; also allows Dr. Cooner to show other Academics how to build VR

Question: Can you talk a little bit about your Immersive University, a new teaching approach that incorporates VR?

Answer:

When the University funded this part of the 360 production, I initially had to do a lot of explaining to the funding body. This was because I couldn’t demonstrate anything to them. Once they got the idea, they were and have been very, very supportive.

What I did first was use Adobe Animate + Google VR to develop 360 videos. Part of my bid for funding was that I would share the development process with other academics so they could explore the use of this approach in their own fields. When I was at that stage, I thought unless you’re a coder, you won’t be able to put this stuff together. It’s at that point that I started doing web searches to find an alternative, easier 360-degree video app development workflow.

I came across InstaVR and a couple of others. I looked at InstaVR and read the website reviews. The best thing about InstaVR was the option to do the free trial. I found it really intuitive; the steps you needed to take to develop the apps were very clear.

I tried to put myself in the shoes of a fellow academic. Let’s be honest — it’s still technically quite challenging to produce and edit the 360 videos. But InstaVR was the easiest platform I found for stitching the materials together to create the apps. So it should still be possible for my colleagues to produce similar apps even if only a small team is involved.

Getting the Pro account with the education discount makes a big difference. Having the educational price opens up the platform for academics to use. It met a criterion that was part of my grant application — I’d be able to demonstrate to other academics how to produce 360 materials. You don’t have to code anything and it’s very straightforward.

First distribution is via Google Cardboard, because of access and lack of tethering to a computer

Question: What is the distribution plan for the apps? How are you promoting it?

Answer:

What I want is the ability for people to pick up and run with the apps with minimum hardware requirements. We bought 50 Google Cardboard headsets so we can use these with our apps to prepare our BA and MA Social Work students for the realities of social work child protection practice. Our students have the unique experience at the University of Birmingham of having access not only to the apps, but also to the tutors who carried out the research. They will have opportunities to learn from the research and engage with the researchers to develop the knowledge, skills and approaches required for effective child protection work.

To help promote the project, the University are developing a web page for it. We will also be developing the research project website further.

Once we have the official launch, we’ll be using the apps to engage with other practitioners, policy makers, students and academics using platforms such as Twitter. We want to discuss and share our research using the apps with the wider community to share knowledge of those elements that can improve child protection practices.

We’re also going to local child protection agencies to train their trainers to use the apps so they can use these resources with their students and practitioners. It opens up an easy-to-use and different medium for practitioner development. The local agencies are also very interested in the supervision and organisational design apps we are developing. They have expressed an interest in using these with us to explore their current approaches and to see how their organisations can use our research to become more effective. Initial discussions and engagement suggest that allowing policy makers and managers to immerse themselves in 360-degree video scenarios lets them experience, in an empathic way, what enables or creates barriers to effective child protection work. This can lead to a greater understanding and motivation to create effective personal and organisational changes.

Still formulating how to quantify impact of VR, but end goal remains better outcomes for children in need

Question: Where do things go from here with your apps and research?

Answer:

We want the students attending the University of Birmingham to have the most innovative and up to date research based teaching. We want to use new and upcoming teaching methods and technologies to prepare them to become the best social workers they can be.

We also want to see what impact this approach can have in the real world, and that the investment in time, effort and creativity is making a change in the field. We’re at the stage now where colleagues within the wider university want to learn more about this 360-degree video approach to learning and research dissemination. But to justify more investment we want to quantify the impact. We’re still working on it; we’ve broken new ground and are learning very important lessons that should benefit a wide range of stakeholders. However, in this process, we’ve not lost sight of the fact that the end goal is to improve the lives of children and families in need.

Thank you to Dr. Tarsem Singh Cooner for his time and insight! For more on his research with VR, visit their project web site at https://swcpp.weebly.com/360-degree-immersive-apps.html


How to Add Quizzes (Questions + Answers) to a VR App

InstaVR is a great solution for building VR apps across all major VR headsets. With a combination of Hotspots and our new Analytics feature, you can build in-app quizzes (aka questions & answers) and see in real-time* what answers are given. You can see a completed one-scene example app referenced below by clicking here. Below are the step-by-step instructions on how to add questions and answers. Let us know if you have any questions, and best of luck with your VR projects!

*Analytics data is only passed to InstaVR from mobile devices and VR headsets that have internet connectivity.
1. Name the 360-Degree File (aka Scene Name) Something Identifiable

Prior to uploading the 360-degree scene that will contain your questions/answers to File Manager, give it a name that will be recognizable in our Analytics section.

In the case of the example app, I named the file “Backyard Inspection.” You could also do it numerically if you wanted — e.g. “Question 1 Scene.”

2. Give the VR User Instructions

The first step in creating a quiz is to explain the instructions to users. There are two ways to do this in-app: via audio or text.

For audio – Record an .mp3. You can do this with many popular desktop programs (e.g. Apple Logic Pro). There are plenty of free apps that you can run on your mobile device that record .mp3s too. In the InstaVR Console, you’ll then upload the .mp3 in the far right of the Authoring Console under “Voice Over”.

For text – Add a Hotspot that explains the instructions. Most clients will create a graphic (.jpg or .png) with the instructions in something like Photoshop, and then upload it as either a Hotspot image or the Hotspot icon. For the latter, the user doesn’t have to engage the Hotspot to read the instructions. You can also use the Text box space below a Hotspot, if the instructions are relatively short.

For the example app, I’ve added a .mp3 for the instructions I titled “BackyardInspection.mp3.”

3. Place the Question in the Scene

For each scene, you’ll likely have a question, or maybe a few.

Just like with the instructions, you can ask the question verbally using a .mp3. Or you can use text or an image.

For text/image, you’ll want to place the question somewhere easily readable. It can be conveyed through a Hotspot’s custom icon (as I’ve done for the example app) or through the Hotspot text function.

The name of the question file is not important, as it won’t appear in the Analytics. (The aforementioned scene name and/or answers will identify the question in Analytics)

4. Provide the Possible Answers

You’ll again want to place new Hotspots above the question with potential answers. For the example app, I’ve kept it simple — Hotspot icons that say “Yes” and “No”. Again, you can use the text box in lieu of a custom icon — but the custom icon allows you a bit more customization possibilities.

5. Present/Record the Answers via Hotspots  (.jpg/.png)

A user will then pick their answer. How they do so depends on your publishing output — for instance, for the Oculus Go or Gear VR, it will be via hand controller. For iOS/Android, it will be via gaze. And for WebVR, it will be clicking.

Naming the Hotspot that gets initiated by the choice is important. Why? Because the analytics will show you the file name of that Hotspot .jpg/.png. So for this backyard inspection, I’ve named the answer choices “Backyard – Incorrect No Answer” and “Backyard – Correct Yes Answer”.
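To illustrate why that naming convention matters, here is a small Python sketch that tallies answers from an exported report. The CSV filename and the `hotspot_file` column are placeholders for this example; the columns in your actual export may be named differently:

```python
import csv
from collections import Counter

def tally_answers(report_csv):
    """Count quiz answers by looking for 'Correct'/'Incorrect' in the Hotspot file names."""
    counts = Counter()
    with open(report_csv, newline="") as f:
        for row in csv.DictReader(f):
            name = row["hotspot_file"]  # e.g. "Backyard - Correct Yes Answer.png"
            if "Incorrect" in name:
                counts["incorrect"] += 1
            elif "Correct" in name:
                counts["correct"] += 1
    return counts

print(tally_answers("instavr_report_export.csv"))
# e.g. Counter({'correct': 2, 'incorrect': 1})
```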

6. Reviewing Answers in Real-Time and Exporting them to Excel

As discussed in our Analytics article, we’re reporting and presenting to you every interaction with your VR app in real-time, assuming the headset or device is connected to the Internet.

So as long as you know who wore the headset at what time, you can identify their answers.

To get to the Analytics, you click the Account silhouette icon in the upper right hand corner. From that drop-down, select Report.

InstaVR Pro users can select the duration of data they’d like — daily, weekly, monthly, or custom.

In the example app, you can see the first user selected the right answer, the second user selected the right answer, and the third user selected the wrong answer.

You can also export this report to Excel. By doing so, you can add additional fields like names or notes.
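If you prefer to automate that step, here is a rough Python/pandas sketch. It assumes a hypothetical export with `timestamp` and `hotspot_file` columns, plus a roster spreadsheet you maintain yourself recording who wore the headset during each session; adjust the names to match your actual files:

```python
import pandas as pd

# Exported InstaVR report (column names here are placeholders; check your own export).
answers = pd.read_csv("instavr_report_export.csv", parse_dates=["timestamp"])

# Your own roster: one row per session, with start/end times and the user's name.
roster = pd.read_excel("headset_roster.xlsx", parse_dates=["session_start", "session_end"])

def lookup_user(ts):
    # Match an answer's timestamp to the session window it falls inside.
    match = roster[(roster.session_start <= ts) & (ts <= roster.session_end)]
    return match.iloc[0]["name"] if not match.empty else "unknown"

answers["name"] = answers["timestamp"].apply(lookup_user)
answers.to_excel("quiz_answers_by_user.xlsx", index=False)
```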

7. Summary

InstaVR gives you maximum flexibility when creating quizzes. You can use custom text (colors, font, size, etc), custom graphics, and even add audio files. Creating quizzes within your InstaVR apps has never been easier.

And now with our Analytics feature, you can get real-time data on what answers are being given by your users. Either aggregate all your users to see trends, or parse out by time to see individual user answers.

Look forward to hearing all your success stories with quizzes!


How to Analyze Virtual Reality (VR) Heatmaps and Analytics

Many organizations now use virtual reality as an integral part of their business. They’re making VR experiences for employee training, sales/marketing presentations, recruiting, communications and more. As VR applications take on a more prominent role in organizations, so too does the data generated from the apps — specifically heatmaps and analytics.

InstaVR provides a wealth of data to our Pro and Enterprise subscribers. Heatmaps are not only visually displayed in the Console, but also exportable and able to be parsed by date range. Analytics shows just about every conceivable data point — views, device types, scenes viewed, hotspots/navigation links interacted with, time of each of these interactions, etc. And analytics, just like heatmaps, are fully exportable for sharing with others in your organization.

Below, we’ll do a deep dive into VR heatmaps and analytics to help you maximize your use of these powerful tools. We’re also proud to announce our new InstaVR Success Report — a weekly email all InstaVR Pro and InstaVR Enterprise users receive with a summary of the activity related to their VR applications. This is a great automated report that gives you an executive-level view of your VR initiatives.

If you have any questions on VR heatmaps or analytics, don’t hesitate to reach out to our Sales or Customer Success teams. Enjoy!

How to Analyze VR Heatmaps

How Heatmaps Work

We released Version 2 of our heatmaps feature in late 2017. The largest change was that we broke the heatmap panorama visually into quadrants — basically squares arranged in a 7 x 12 grid — to more easily display user focus. This change also allows for exporting to a grid CSV, something you can do in addition to exporting as an image file.

Heatmaps can be accessed through your main InstaVR Console by clicking the “Heatmap” button in the upper right. Heatmaps can be created for both 360-degree images and videos. The numbers & colors correlate to aggregate focus, on a 1-10 scale. If one user looks at a part of a scene for 1 second, for example, and a second user looks at the same area for 10 seconds, that adds up to 11 seconds. The 1-10 scale is used to normalize the data to a common scale.
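As a rough illustration of that normalization (our own sketch, not InstaVR’s actual implementation), here is how aggregate gaze seconds on a 7 x 12 grid could be rescaled to the 1-10 range:

```python
import numpy as np

# Aggregate gaze time in seconds for each cell of the 7 x 12 heatmap grid.
# In practice these totals come from summing every viewer's gaze samples per cell;
# random numbers stand in for them here.
gaze_seconds = np.random.default_rng(0).exponential(scale=3.0, size=(7, 12))

def normalize_1_to_10(grid):
    """Rescale raw per-cell totals onto the common 1-10 display scale."""
    lo, hi = grid.min(), grid.max()
    if hi == lo:  # all cells equal: avoid division by zero
        return np.ones_like(grid)
    return 1 + 9 * (grid - lo) / (hi - lo)

heatmap = normalize_1_to_10(gaze_seconds)
print(heatmap.round(1))  # cells near 10 are where viewers focused most
```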

InstaVR Pro and Enterprise users can also use custom date ranges for their heatmaps. They can select to visualize gaze focus over a day, a week, a month, or a custom date range they define.

Note: each heatmap is project specific and only accumulates data if the user has a device or headset that is online to pass data to us.

How to Analyze the Heatmap Data

Quick story: when we rolled out VR180 compatibility last year, we looked at heatmaps to determine how effective VR180 as a format would be, particularly for employee training. The answer is “very”! We know this because the heatmaps showed much more intense focus on the front 180-degree field of view of a scene — based on the initial focal point — and considerably less in the back 180-degree FOV. This matches up with what we intuitively thought, and confirms our belief that in a 360-degree scene, creators should spend more time perfecting the action that takes place in the immediate front viewing area.

Back to heatmap analysis…

There are many things you can do with a heatmap, including:

– Use it to improve your VR. If you’re doing employee training, and employees consistently are not looking at an important area of a scene, you need to call it to attention either through a hotspot or voiceover narration. Or by starting the initial POV on the area that is not getting enough attention.

– Use it to test employees and viewers. Many academic and corporate researchers using InstaVR access the heatmap feature. Why? Because it provides a lot of information on where viewer attention is. If you’re training new doctors to avoid distractions through VR, like they’re doing at Stanford University School of Medicine, you can see which distractions elicit the most focus vs. the least. And then, in future trainings, really focus on getting the doctor to ignore the most attention-grabbing distractions. Another similar use case we’ve seen is consumer products: you can test different packaging and store layouts to see if your products get more attention in a simulated VR shopping experience.

Ultimately, heatmaps are better than self-reporting by users. They give you actual data on where users are looking. Oftentimes, people don’t even notice what is getting their attention while in a VR experience, but the heatmaps will show it.

How to Analyze VR Analytics

How Analytics Work

Any time a user accesses your VR project created with InstaVR — as long as the device connects to the Internet — we’ll be pinged back with a wealth of data pertinent to their interaction. In a sense, this is very similar to web analytics, if you’ve ever used them before.

You can access your analytics on a per project basis by selecting “Report” from the drop-down in the upper right of your Console.

The analytics section encompasses: Number of Devices, Type of Device or Operating System, Number of Scenes Viewed, Total Viewing Hours, and then every single action taken by users (Scenes, Navigation Links, Hotspots, Calls-to-Action). All of the interaction data is timestamped, so if you know who was using your app at a certain time, you can pinpoint their actions exactly (more on that later).

Like with heatmaps, Pro and Enterprise users can use customized date ranges for the analytics — daily, weekly, monthly, or custom. The full report can also be exported to CSV for offline analysis.

How to Analyze Your VR Analytics

We’re presenting so much data to you in Reports. But how can you take action on it? The following are just a few of the most popular things to analyze within this section of the Console.

– What devices are being used to access your VR? Looking at the device types helps you to understand how your audiences access your app. For example, if you’ve posted your completed app to iTunes, Google Play, Web, and the Oculus Store — you can see exactly which of these channels gets the most plays. Based on that data, you can then come up with a more focused marketing strategy.

You can also make decisions based on device type. For instance, as we’ve discussed previously, WebVR is a much better fit for 360-degree images than 360-degree videos, currently. So if you see most of your users on Web (vs. mobile app), you might focus more of your VR development on 360-degree images. Or the reverse could be true, and if app downloads are popular for you, you can concentrate more on developing 360-degree videos.

– How effective is the layout of your VR? What scenes do people want to view? The very granular interaction data gives two important indicators: how well you’ve laid out your VR experience and what scenes are popular. For instance, if you have an intended path you’d like users to take (Scene 1 -> Scene 2 -> Scene 3 -> Scene 4), you can see where in the funnel they’re dropping out. That indicates either that navigation is too hard or that you’re losing the interest of your users partway through (see the funnel sketch after this list).

You can also see which specific scenes people are interested in. So, for instance, if you’ve built a travel VR app where the user can initially choose between visiting Hawaii, Alaska, Maine, or California, you can understand which choice is the most popular. (side note: If you haven’t read our Tui Group case study, it’s worth your time!)

– Testing employees. With VR being used more and more for employee training, we often get asked how you can incorporate questions into scenes and analyze the answers. Through analytics! If you create questions using either Hotspot images or text — and you properly label your Hotspot file names — you can see exactly what answers were given. We’ll have a full step-by-step guide for setting up questions, and collecting answers, on the site next week.
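Picking up the funnel example from the layout item above, here is a short Python sketch. It assumes a hypothetical exported CSV with one row per interaction and `scene` and `device_id` columns (adapt the names to whatever your own export uses), and counts how many distinct devices reached each scene in the intended path:

```python
import csv
from collections import defaultdict

INTENDED_PATH = ["Scene 1", "Scene 2", "Scene 3", "Scene 4"]

def scene_funnel(report_csv):
    """Count distinct devices that viewed each scene along the intended path."""
    devices_per_scene = defaultdict(set)
    with open(report_csv, newline="") as f:
        for row in csv.DictReader(f):
            devices_per_scene[row["scene"]].add(row["device_id"])
    return {scene: len(devices_per_scene[scene]) for scene in INTENDED_PATH}

for scene, count in scene_funnel("instavr_report_export.csv").items():
    print(f"{scene}: {count} devices")
# A sharp drop between two scenes suggests the navigation there is too hard
# or that you're losing viewers' interest at that point.
```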

There’s plenty more things you can do with heatmaps and analytics. We’re just scratching the surface here. If you have any feedback or suggestions on either feature, don’t hesitate to reach out to us at contact@instavr.co.

*header image courtesy of Dr. Dean Whitcombe, University of South Wales

InstaVR Interviews: Meet the VR Practitioners

InstaVR Interviews is a blog series where we turn the spotlight on our customers. We find out why they create VR, how they use InstaVR, and what the future of VR will look like. To read more interviews, visit the InstaVR Interviews homepage.

Conor Todd, Founder of Concierge VR360 (“CVR360”)
With past experience in Media Production and Governmental Relations, Conor Todd has made it his mission to ensure accuracy, excitement, and intrigue are built into messaging. Taking advantage of multiple forms of media communication, Conor has effectively used the full spectrum — the written word, podcasts, traditional video, and 360 experiences — to bring viewers and consumers closer to the experience. Fascinated with new technology, Conor is always exploring new forms of interaction between consumers and brands.
CVR360 born from the knowledge that users are captivated by VR experiences; initial use case is for promoting restaurants, combining 360 tour with restaurant specific information

Question: Can you tell us about the origins of Concierge VR360?

Answer:

The idea started a year ago when I was working for a local marketing company. I was trying to get businesses and clients interested in VR and 360. I was pitching them, “Hey, this has the opportunity to connect with people.”

I’m very interested in the click-through and view-through rates of 360 experiences. The fact that people are captivated by them.

So I was trying to pitch people on the idea that we should be creating some of their content in 360 because people will check it out. And the people that do look at it will stay and really explore it.

At that time, a year ago, I didn’t know about InstaVR. I also don’t have any coding background, to be quite frank, apart from HTML. And my bosses said we don’t have those capabilities in-house, so let’s not pitch it.

But then I started talking with my friend Jessie, who has been developing a VR concept for storage in conjunction with a CMS company. The idea of turning CVR360 into a business started after we stopped offering it at that marketing firm I was at.

I’ve developed it on my own and it’s a simple idea — it started with hotels, and giving travelers a way to explore restaurants in the area before they go. The idea was to mitigate being let down by the experience. We call it “Visual Collateral” when we talk to people at the hotels.

Now that we’ve built the prototype, I’ve started taking it to hotels and showing it to people there. So CVR360 is about putting all the information for a restaurant in one place. We look at so many different places — like Yelp, Zagat, and Social Media. We combine that with the visuals of the environment, giving you a full experience. You can then figure out “Is the ambience to my liking?” and “Does the information on the restaurant make me want to go there?”

Prototype built using still images from the Yi camera; wants to keep VR accessible to as many people as possible

Question: For your initial app, what equipment did you use? And is it for still images or video? 

Answer:

Filming has been done with the Yi camera. It’s great for now. The whole 360 and VR industry is moving so quickly though. I’ll probably be getting the new Insta360 Pro.

But for now it’s just been still images. That goes with my thought that for the everyday customer — that traveler — we don’t want to make VR any more intimidating than it already is. It’s been difficult for some people to get into 360 and VR because the technology and features have grown so quickly without the everyday person being able to connect with them.

So the thought is let’s keep it very simple using static images. We’ll do more video work down the line, but for the prototype, we wanted to just keep it simple.

Jesse James (UX Designer) with Conor (Right)

Conor uses Apple’s Preview feature and screen grabs to add interactivity

Question: You’ve done some cool overlays in the prototype you shared. Can you discuss how you built those?

Answer:

That was really easy. I just used the new features of Preview on Apple. It involved finding images online and cutting them down to size to fit the icons. The reason I spent time doing that was to use some trusted names and credit them appropriately. That seemed like the best way to do it.

Hotspots have gotten major use from me. Particularly using Hotspots with text. If you click around in the experience, the Yelp reviews are screen grabs where we’re inserting the .jpgs.

Promotion done through partnerships with municipalities and associated non-profits

Question: What are the distribution and promotion plans for the VR you’re building?

Answer:

In my mind, InstaVR is the perfect platform to get going on and start taking on clients.

The plan is to work in conjunction with city tourism departments. We want to work with municipalities or non-profits associated with the cities.

So here in Annapolis, for instance, we’re having conversations with Visit Annapolis, which is a non-profit that works with a local marketing firm that does all of the downtown partnerships. All the restaurants are a part of it, and they help them do advertising. It helps promote Annapolis in general.

For distribution, the opportunity we see is to partner with entities that already have robust, content-rich channels that people are used to going to when looking for information about the city. We work in conjunction with them to engage local businesses to purchase the production we assemble. We can then hand it off to places like Visit Annapolis, saying: here's this great piece of content, already produced and paid for, and we're giving it to you so it reaches the right people and groups.

We will do standalone apps for iOS and Android eventually. For right now, though, we're just doing Web apps. We really want to make them easy to get to and access. I love the fact that InstaVR publishes to the Web: it looks great in your browser, or you can drop your phone into a Google Cardboard viewer.

Business plan includes filming cost + hosting fees; will provide services to Baltimore-DC-Annapolis-Philadelphia-New York

Question: How do you plan on charging clients? And what geographic footprint are you covering?

Answer:

We want to go to the restaurants and engage them to purchase production. That would be an upfront cost, basically an hourly fee.

Then we’ll charge them a hosting fee to maintain the content. And the subscription will allow for updates, since restaurants inherently change — new beers, different specials, all those things shift over time.

The subscription could be quarterly or monthly, depending on if they want a lot of updates.

We’re aggressive. We’re going after Baltimore, DC, Annapolis, and then up towards Philadelphia and New York as well. I’m originally from New York, so it’s easy for me to get up there and do work.

Thank you to Conor Todd of CVR360. If you’re in the restaurant, hospitality, or CVB industries, and are interested in utilizing virtual reality for promotion, you can connect with Todd on LinkedIn.


With the release of the Oculus Go headset, many InstaVR clients learned about the benefits of using something called Release Channels. You could take your InstaVR-generated Oculus Go app and easily distribute it globally — to other offices, partners, clients, and more. It’s the easiest way to get your VR app on specific Oculus Go headsets without physically having to touch the headset.

Below we’ll walk you through what Release Channels are, how to use them with your InstaVR account + Oculus Go, and some popular use cases.

1. What are Oculus Release Channels?

There’s two ways to get your VR apps onto the Oculus Go: via USB Cable (ADB) or through the InHouse Oculus Store.

The latter involves uploading your created .apk to a cloud managed by Oculus. As opposed to the regular public Oculus Store, the InHouse Oculus Store is for building and deploying private apps not accessible to the general public. You can select and choose which users/headsets can access your app.
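For comparison, the USB cable (ADB) route is a manual sideload from a connected computer. Below is a minimal sketch of what that looks like, assuming the adb command-line tool is installed, the headset is in Developer Mode and connected over USB, and the .apk path is a placeholder for your InstaVR-packaged build.

```python
# Minimal sketch of the USB (ADB) sideload path. Assumes the adb CLI is on
# your PATH and the Oculus Go is in Developer Mode and connected over USB.
import subprocess

APK_PATH = "my_instavr_app.apk"  # placeholder for your packaged build

# Confirm the headset shows up as a connected device.
subprocess.run(["adb", "devices"], check=True)

# Install the app (-r reinstalls/updates if it is already on the headset).
subprocess.run(["adb", "install", "-r", APK_PATH], check=True)
```

As the rest of this post explains, Release Channels remove the need for this cable step entirely.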

Release Channels go by three different names — Alpha, Beta, and RC. For your purposes, they all serve the same function; having more than one to choose from simply helps with organization.

Adding an app to a Release Channel means you can then distribute that app simply by adding an email address associated with a Go headset to the Release Channel. The headset user will have to put their headset in Developer Mode and have signed up for a free Oculus Developer account online, but the ease of distribution makes this well worth it.

Because this method doesn't publish your app to the public Oculus Store, you don't have to worry about unwanted parties accessing it. You can also update an already distributed app using our Direct Publishing methodology — see the video below — which lets you update an app without touching a headset or downloading/uploading an .apk.

If you haven’t used Release Channels before, it’s actually relatively easy. No technical knowledge is required. And because you can add up to 100 email addresses to a Release Channel, you can distribute an app very widely through your company or to partners/clients.

(Video: "InstaVR: One Click Direct Publishing to Oculus Home" on YouTube)
2. How to Use Oculus Release Channels

1. Create a new project on the Oculus Developer Page (note: you’ll have to sign up for a Free account with Oculus, if you don’t already have one)
2. Create the .apk VR experience using InstaVR and package it for the Oculus Go. Packaging instructions can be found in the How-to Guide for Publishing to the Oculus Store.
3. Log back into your Oculus Developer account, choose the project, click Manage Builds, and select the Release Channel you’d like to upload the .apk file to. You can select Alpha, Beta, or RC.
4. Upload your .apk. After a few seconds or minutes, Oculus will confirm that the file meets their standards, shown as a blue dot and the text "Complete" under Test Status. If there are any errors in your app, the Oculus Developer page will let you know. (The most likely cause of an error is not entering the correct App ID from the Oculus page into InstaVR prior to packaging.) If you prefer the command line, a scripted version of this upload step is sketched after this list.
5. Still in Manage Builds, click Users. Add the email address of the Oculus Developer account paired to the headset you'd like your app to appear on, then click Add Users. You can add up to 100 email addresses to a Release Channel, meaning potentially 100 headsets.
6. The email address you added will get an email from Oculus/Facebook. The recipient will have to click the link in the email to approve. The app will then be accessible in their headset to select to add to the main library. (Note: headset must be in Developer Mode for the app to appear. If you followed the previous steps, but don’t see your app in the headset, go to the widget in the lower right of the Oculus Go home screen to ensure you are in Developer Mode)
7. If you’re planning on updating that single app on a regular basis, you can use our Direct Publishing methodology. This allows for remote updates without having to upload any new .apks to the Oculus Developer Page.
8. That’s it! Any questions or issues, reach out to our Support Team over Live Chat on weekdays.

3. Example Use Cases

Geographically Distributed Go Headsets 

More and more companies have locations spread not just across a single country, but around the world. A lot of the training you put together for one location may be applicable in other locations too. For instance, if you standardize your warehouses, you can create one VR training experience for all warehouses.

Release Channels allow you to distribute apps without having to touch the headsets with a USB cable. This is particularly valuable if your headset users are non-technical. And since you can add up to 100 associated email addresses to a Release Channel, you can even segregate your headsets by division (e.g. Human Resources, Operations, Sales).
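To make the division-by-division idea concrete, here is a small bookkeeping sketch: it maps each Release Channel to the headset-owner emails you plan to add and flags any channel that would exceed the 100-address limit before you paste the lists into the Oculus dashboard. The emails and channel assignments are hypothetical.

```python
# Hypothetical mapping of Release Channels to division headset owners, with a
# check against Oculus's 100-email-per-channel limit mentioned above.
MAX_EMAILS_PER_CHANNEL = 100

channels = {
    "ALPHA": ["hr.trainer1@example.com", "hr.trainer2@example.com"],  # Human Resources
    "BETA":  ["ops.site1@example.com", "ops.site2@example.com"],      # Operations
    "RC":    ["sales.east@example.com", "sales.west@example.com"],    # Sales
}

for channel, emails in channels.items():
    if len(emails) > MAX_EMAILS_PER_CHANNEL:
        print(f"{channel}: {len(emails)} emails, over the {MAX_EMAILS_PER_CHANNEL}-address limit")
    else:
        print(f"{channel}: {len(emails)} emails, OK")
```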

Making Oculus Go VR Apps for Partners/Clients

Many clients of InstaVR users are not technical by nature, so the ability to remotely push Oculus Go apps out to them is very valuable. You can easily and remotely add apps to a headset without confusing the client: they just need to click a link in an email for the app to appear in their headset.

The Direct Publishing feature also adds a lot of value. Select clients can maintain just one app on a headset that you continuously update. This is great if you plan to routinely update the content of a client's or partner's app, say weekly or monthly.

For Employees Who Travel a Lot (e.g. sales presentations)

Often the employees utilizing VR are not the ones making it. If you have a travelling sales organization, for example, you can’t expect them to create, download, and transfer apps easily to a headset. So your creative team, the ones making the VR apps, can remotely program which content goes on a headset. This frees up your sales team to do what they do best — run sales presentations.

(Image courtesy of trias)

Conclusion

Release Channels are a great way to distribute Oculus Go apps easily and securely across an organization. The InstaVR-to-Oculus Go headset process is smooth and scalable. Regardless of your use case — employee training, sales/marketing presentations, operations, education — Release Channels are worth a look if you're purchasing more than one Oculus Go.

As always, let us know if you have any questions. Good luck with all of your Oculus Go apps!


With interest in VR on the rise, many new to the technology start off with a simple question: should I build it myself or outsource to an agency?

At InstaVR, we have a mix of direct clients and agency clients. Our platform is simple, powerful, and flexible enough for all to use. So the answer comes down to a number of factors: budget, time, access to equipment, the number of VR apps you're planning to build, how often you'll be updating them, and so on.

Below, we’ll go through each factor to help you decide when you should use InstaVR to build your own apps vs. when you should potentially outsource to an agency using InstaVR to build for you. And if you have any questions on the process of building VR, schedule a call with our sales team to learn more!

1. Budget

There’s a pretty broad range of agency pricing. Some charge per hour, some per project. In general, to get high quality VR filming, you’ll have to pay a premium because there are still relatively few agencies dedicated solely to filming in VR.

That being said, if you do go the agency route, you don’t want to underfund your project. Why? Underwhelming VR will hurt enthusiasm for the technology at your company. All it takes is one bad experience — shaky camera, confusing narrative — to sour users.

On the flip side, if you have a high budget, you can also buy top of the line camera, audio, and lighting equipment to film yourself. The quality of the VR experience can be on par because you’ll have access to the same equipment. But you’ll miss out on some of the post-production services that agencies offer, like adding customized icons into InstaVR or injecting additional audio.

If your budget is small… build it yourself. All you need is a camera (even a newer prosumer camera like the Insta360 One X), an InstaVR Pro subscription, and a laptop. And you can even do what our client Chili's did and rent cameras and headsets at a significant cost savings over buying them.

If your budget is higher… you have the choice to either hire a good agency or buy more expensive equipment and film yourself. This decision really depends on some of the factors we'll discuss below (e.g. how many apps will you be creating?).

2. Time

This consideration is complicated.

Many of the top VR agencies are booked out months in advance. You could, however, procure your own equipment and begin filming and creating VR apps in InstaVR the same day. That said, there's a bit of a learning curve to filming equipment if you're brand new to the 360 world, so you have to balance the time it would take to learn the equipment against how soon you need your VR apps.

If you have a long time horizon… we suggest reaching out to multiple agencies, asking them for example VR apps they’ve produced, and working out a whole project plan with them. It’s best to have options and choose between agencies, but to do so requires a longer lead time to perform due diligence.

If you have a shorter time horizon… Amazon and many electronics stores (Best Buy, B&H, etc.) have the equipment you need: cameras, microphones, lighting, and so on. You could film the same day if you're in a rush. And since InstaVR is drag-and-drop and simple to use, there isn't a steep learning curve on that side.

3. Access to Equipment

It’s amazing how ubiquitous 360 degree camera equipment has become in the last three years. Cost-effective prosumer cameras and professional cameras abound. We’ve even reached the point where you can rent equipment like an Insta360 Pro without having to outlay the normal $3500, if you’re just working on a single project.

On the other hand, if you really just need one app — say for employee recruitment or an event — and you don't want to own any equipment, outsourcing to an agency may be easier. Of course, after building that first app, a lot of clients find that other departments want apps too. You can't really know until you gather feedback from that first app distribution.

If you have access to the 360 degree filming equipment… you can very easily create VR apps yourselves. You’ll definitely want a professional end-user experience though, so if all you have is a low-end consumer grade 360 camera, you’ll either need to upgrade to a $500+ camera or outsource to an agency.

If you don’t have access to 360 degree filming equipment… the easy solution is to outsource to an agency. But you can also easily buy all the equipment you need online or at an electronics store. Or even rent it.

4. Number of VR Apps You’re Planning to Build

Agencies are great if you have one or just a couple of apps you're planning to build. But if you're planning to seriously incorporate VR into your company — for training, for sales/marketing, for operations — then you'll probably want to take ownership of the production process. The costs and planning involved in using an agency for a large number of apps are often difficult to manage. Plus, you lose the opportunity to build ad hoc apps, since you need to schedule with agencies.

If you’re just wanting 1-2 apps… an agency might make sense. The cost, plus the guarantee of high quality outputs, could make the whole process easier.

If you’re building many apps or want VR to be an integral part of your business… it’s probably better to take ownership of the whole process. As stated earlier, you can learn to use 360 VR equipment very quickly. And the ability to film yourself, when you want, guarantees maximum flexibility. InstaVR is simple for users in any business unit at your company, meaning you can manage the end-to-end process completely yourself.

5. How Often You’ll Be Updating Your Apps

One of the biggest issues with using an agency is the difficulty in updating an application. You have to re-engage with them to update the content. If there’s a scheduled update — say yearly — this isn’t too big of an issue, as you can plan ahead.

But if you’re needing to do periodic unplanned updates, you’ll want to have ownership of the whole project. With newer technologies like the Oculus Go Release Channels feature, updating apps has never been easier. You can even use the InstaVR Direct Publishing feature to have a single app on a Go that constantly has new content.

If you won’t be updating your apps often… an agency makes sense. You’ll be able to have a high-quality app available for a long time.

If you’ll be updating your apps frequently, particularly unplanned… you’ll want to build the VR yourself. Updating apps has become much easier, so as your company or processes change, you can update your VR easily using InstaVR.

Conclusion

The answer to “Build Yourself or Outsource” depends on a number of factors.

It’s important to understand your goals with VR before making this decision. While your company may engage with an advertising agency for your online & television commercials, Virtual Reality is a whole other medium which you can actually manage yourself. Particularly with solutions like InstaVR, bringing the building process inhouse has never been easier.

What we definitely suggest: if you're thinking about using a VR agency, speak with a few of them and don't under-budget your project. The ROI of great VR experiences is significant, and the people and equipment you use are a major component of how your VR turns out.

Let us know if you have any questions and best of luck with your decision!
