Voxon Photonics CEO Gavin Smith has spoken about the company's story, told us about the VX1 volumetric display and how it works, and explained how the device can be used to turn your 3D renders into "holograms".
80.lv: Please introduce yourself to our readers. Where did you study? How did you get the technical skills you have?
Gavin Smith, Co-Founder and CEO of Voxon Photonics: Hi, my name is Gavin Smith, and I’m the Co-Founder and CEO of Voxon Photonics, a tech start-up based in Adelaide, South Australia. I grew up in Elgin, a small town in the northeast of Scotland. At school, I was easily distracted and became frustrated by teachers who didn’t have the imagination to make lessons interesting and engaging, instead choosing to read textbooks to us. There were some exceptions, including Mr. Robertson, our physics teacher, who would get the pupils to link arms in a circle and then form a kind of Mexican wave, demonstrating the passing of electrons in a circuit. Following school, I enrolled in a course titled Technology and Management at Paisley College of Technology. The course gave me an introduction to manufacturing, design, engineering, electronics, and computing, and luckily for me, the college converted into a university in my second year, so I graduated with a BSc and a skill set designed to create managers for the electronics manufacturing industry that was flourishing in Scotland at the time. Having really enjoyed the computing side of my course in Paisley, and thanks in large part to the Scottish government providing free university tuition, I then enrolled in Strathclyde University and completed a Master's degree in Information Technology.
When I graduated in 1996, most of the tech jobs that exist now were non-existent, the Internet was only just beginning, I’d never heard of a startup, and the general advice to graduates was to start a career with a big company. After about five rounds of interviews, I was accepted into the 1996 Royal Bank of Scotland graduate intake and started a 10-year career programming IBM mainframe computers. This was seen as a good "career" opportunity, but with only a text-based console to work with all day, my love of design and engineering was always in the back of my mind and I never really enjoyed the work. Ten years later, I met my wife-to-be Kate, left the Bank, and moved to Adelaide, South Australia. I struggled to find work in mainframe programming and had to adapt and learn new skills. I eventually landed a job at Adelaide University, doing tech admin of their CRM product, and then moved into a role as an Oracle DBA. In parallel with this, I started hanging out in my friend Will Tamblyn's shed every Thursday evening. We both had a passion for engineering, inventing, making, tinkering, breaking, and brainstorming crazy ideas. It was this "Thursday night lab session" that was the catalyst for what became Voxon Photonics.
We had a long list of things to make: magnetic bike shoes, generative music machines, laser shows, CNC machines, vacuum formers, and much more. But on the 2nd of May 2007, we started a project entitled "Steampunk holographic display – Help me Ob1!"
The idea was inspired by Star Wars, and the vision was to re-create the now famous 3D projections that have become the hallmark of every sci-fi movie since George Lucas’s classic in 1977.
Will and I during one of our “lab nights” in 2010.
For five years, we researched everything there was to know about the state of the art in 3D displays. Most of our resources came from things we found in auctions or on the side of the road, and sometimes we bought broken gear on eBay, which we dismantled in search of parts. We knew about holograms, the kind you find on your credit card, but despite popular opinion, they bore almost no resemblance to the 360-degree floating scenes that we’d seen in movies. Scenes that people gathered around and could all look at from different angles, scenes that occupied a physical space, a space that you could measure in liters or gallons.
Our first prototype was a little rough... We taped an old phone to the piston of a lawnmower engine and turned the crank using an electric drill. The rapid reciprocation of the screen created a volume of light. A cube of addressable pixels. At around 15 frames per second, the refresh rate was about 200 times too slow, but the concept was good.
We very quickly realized that what we wanted to build was different from anything that we had seen before, a display that projected objects made from the light inside a volume, a display that had an X, Y, and Z resolution.
After five years of experimenting, Will and I completed our first "swept surface volumetric display" prototype. It consisted of a rotating helical screen that we vacuum-formed over a mold that we machined from wood on our homemade CNC machine. The screen rotated on a small DC motor, and we projected several thousand binary images per second onto it, generated by a program written in Java. The display had no feedback loop, and the graphics were pre-processed animations produced by rendering out thousands of Boolean intersections from 3ds Max.
This elephant is one of the first scenes that we rendered.
80.lv: Could you tell us more about Voxon Photonics, when was the company founded?
Gavin: After making a film of our first display and uploading it to YouTube, we were contacted by a team in New York. Unbeknownst to us, they had been working on a similar project, and it made sense to combine our ideas and form a company; thus Voxon was started. Around this point, we realized that a graphics engine written in Java was less than ideal and so decided to look for somebody to lead our computing team. Ken Silverman is best known for writing Build, the graphics engine behind Duke Nukem 3D. Along with Wolfenstein 3D and Doom, Duke Nukem 3D is considered to be one of the titles responsible for popularizing first-person shooters and was released to major acclaim in 1996.
In 2013, Ken was given a demonstration of our prototype, and shortly after, joined Voxon as a Co-Founder and Chief Computer Scientist.
At this point, the company was really a hobby and took up all our spare time. We knew nothing about running a business, struggled to work as a team, and had no money to spend on product development, legal, and numerous other areas that demanded our attention. It became clear that having a day job and running a business were mutually exclusive.
In 2015, we started winning prizes in competitions: SouthStart, Maker Faire, and TechCrunch Disrupt. It was now or never. Will and I left our day jobs and moved into the Tonsley Innovation District in Adelaide, a brand-new space for startups located on the site of a former car factory. We completed Venture Dorm, a business accelerator based out of Flinders University, and won a trip to Singapore, where we got more pitching experience. Six months later, we pitched and raised $1 million. This allowed us to build a team, bring in engineers and programmers, and a highly qualified COO with real-world experience, somebody who could help turn our clunky prototypes into our first product.
Voxon team lunch during our time at the Tonsley Innovation District in 2019.
Since 2017, we have continued to push the boundaries of 3D display technology, and have grown from a shed-based hobby project into a global leader in the design and manufacture of real-time glasses-free 3D volumetric display technology.
80.lv: How big is the team?
Gavin: As of November 2022, we have a small team of 11, split across engineering, computing, design, and business development. We have two engineers, two computer programmers, an artist/photographer/videographer, two business development managers, a financial director, a COO, and myself.
I split my time between CTO and CEO roles and thus perform a wide variety of tasks including team management and leadership, 3D media design, volumetric workflows, Technology R&D, tech demonstration, and business development.
- Jordan and David, our engineers, work on both VX1 production and proof of concept (PoC) design and manufacture. The work includes 3D design, PCB design, thermal management, managing a complex supply chain, 3rd party component integration, quality control, troubleshooting, optimization, MTBF improvements, part reduction, and all the other things that go into maintaining a production line and developing new products.
- Ken and Matt manage the development of our software IP and work on a variety of tasks ranging from core engine optimization, new data type support, UX and UI, 3rd party integration, bespoke applications for projects, and maintaining the SDK and our custom-built online software distribution platform Vertex. Ken also works on new hardware design and prototyping assisted by his father Harvey.
- Steve, our COO, manages day-to-day operations, contractual and legal obligations, HR, and payroll, and also project manages our PoC work.
- Will works mostly on business development, but also handles customer inquiries, arranges meetings with prospective clients, PoC scoping, and works as part of the corporate team on some of the less exciting tasks such as R&D tax, compliance, and grant applications.
- Jacob works on 2D and 3D artwork for both social media and product design. Most recently he has been working on our Instagram presence, which has a more arty flavor than our other feeds, which are more focused on the technical aspects of our offering.
- Tennyson, our financial director, helps manage our capital-raising efforts and assists with setting the business development and strategic direction of the company.
We also have two additional part-time business development managers working on specific markets in the USA and Australia.
In addition to our core team, Voxon has hosted around eight university internships over the years, and some of those interns have been offered full-time roles upon completing their studies. We also take on school work experience students every year for a week, giving them the opportunity to work on real projects and get a glimpse of what working in a startup is like.
A typical debugging session during COVID.
80.lv: How is the working process organized?
Gavin: Voxon’s current workload is broadly categorized into VX1 sales, PoC, and R&D. VX1 sales are mostly generated following social media posts on LinkedIn, Twitter, YouTube, Facebook, and Instagram.
We spend a lot of time researching 3D data workflows, talking to other companies or individuals who are generating interesting data, and then producing and filming content to share on social media channels. Most inbound inquiries come from these posts or from direct visits to our website thanks to our top SEO ranking in the field of 3D display tech.
After first buying a VX1, some companies then go on to do a PoC, a project to build a bespoke display using larger or higher resolution components. These PoCs start with requirements and end with a deliverable that comprises hardware and software.
We manage the production of hardware and software using Jira and use Google Suite to organize everything else.
80.lv: How do different teams communicate with each other?
Gavin: We mostly talk face-to-face when in the office, but use Google Chat to communicate in a variety of threads related to manufacturing, software development, and business development. We have all staff video chat every Wednesday.
Video call with Ken to debug some new hardware prototypes.
80.lv: What skills should potential employees have in order to join team Voxon?
Gavin: Everybody needs to be inquisitive and passionate about pushing the boundaries of what is possible with our technology. We strongly believe in experimentation over theory: designing from first principles and learning by building.
- Computing: C, C++, and C# are mostly used in writing software for our products, using our own SDK, which requires standard object-oriented programming skills and a good grounding in 3D geometry programming. For engine programming, we would look for C, C++, assembler, AVX2 intrinsics, optimization, multi-threading, real-time programming, FPGAs, ASICs, signal processing, embedded systems, microcontrollers, image processing, mathematics, and physics.
- Engineering: Proven experience in digital manufacturing techniques. 3D CAD, CAM, electronics, material science, optics, thermal management, design for manufacture, 3D printing, laser cutting, and aerodynamics.
- Content creation: 3D generalist with working knowledge of Blender, 3D transformation and transcoding, animation, video editing, fluid simulation, rigging, Unity, particle effects, baking textures, UVW mapping, depth cameras, photography and videography, creative writing, and social media.
- Business Development: Solid understanding of 3D ecosystems and value propositions. Networking skills, account management, contract negotiations, contract law.
Voxon VX1 Volumetric Display
80.lv: Please tell us more about your flagship product – Voxon VX1 volumetric display – how did you come up with the idea of creating it?
Gavin: Whilst we started our journey using a rotating helical screen, we deviated from that path around 2014 to pursue an alternative design. During an experiment in the shed involving a salad bowl, some rubber bands, an old subwoofer, and a Bee Gees greatest hits CD, we stumbled upon the ability to move a screen up and down very efficiently and at high speeds using natural resonance. That same phenomenon that allows an opera singer to shatter a glass, it turns out, can be exploited to vibrate things very efficiently. By intentionally designing a projection screen that resonated at exactly 15 Hz, we found that we could vibrate it a little bit and nature would do the rest, turning a small input motion into a large output motion. When moving up and down 30 times per second, the screen became a blur to the human eye. This “swept volume” became the space to be filled with light. Hundreds of millions of dots of light, volumetric pixels known as voxels.
When you shine a light onto a moving screen, the movement of the screen makes the light streak, leaving a trail behind it.
Initially, we tested the concept by shining a laser on a subwoofer. The movement of the speaker cone stretched out the dot into a line. That “persistence of vision”, the tendency of the human eye to retain an image for a fraction of a second after it has gone, was the trick to making a volumetric display.
The first volumetric displays were most likely made by Neanderthals about 400,000 years ago. Technology has improved since then, but the basic principle is the same. Set fire to a stick, wait for a glowing ember, and then wave the stick around in the air at night. The trace left behind in the air is a volumetric image. A shape that occupies a physical space, made up of nothing more than a stack of moments left behind in the human eye.
Whilst those early displays used a single spark to trace out a line, our displays use a lot more. Inside our projection engines are microchips with over a million physically tilting mirrors. These microscopic mirrors can be turned on and off thousands of times per second, and together can render 500 million dots of light every second inside the volume.
These millions of tiny dots of light become the building blocks with which to construct animated scenes, which are generated in real-time by our “photonic engine”, a graphics rendering engine that converts regular 3D geometry into thousands of digital object slices every second. These slices are in turn projected onto the screen of the VX1, and all are kept in sync using our own control circuitry. The VX1 is in essence a 3D printer that prints objects made from light 30 times every second.
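To make the slicing idea concrete, here is a toy sketch in plain Python. The grid resolutions and the point-cloud input are made up for the example; the real photonic engine slices full meshes thousands of times per second in heavily optimized native code.

```python
# Toy illustration of "object slicing" for a swept-volume display:
# bin 3D points by height into Z-slices, each slice a binary image
# that would be flashed as the screen passes through that height.
# Resolutions here are hypothetical, chosen small for readability.

X_RES, Y_RES, Z_SLICES = 16, 16, 8

def slice_points(points):
    """Bin unit-cube points (x, y, z in [0, 1)) into Z binary slices."""
    slices = [[[0] * X_RES for _ in range(Y_RES)] for _ in range(Z_SLICES)]
    for x, y, z in points:
        xi = int(x * X_RES)
        yi = int(y * Y_RES)
        zi = int(z * Z_SLICES)
        slices[zi][yi][xi] = 1   # light this voxel when the screen is at zi
    return slices

# A tiny vertical line of points through the middle of the volume:
line = [(0.5, 0.5, z / 10) for z in range(10)]
slices = slice_points(line)

# Each of the 8 slices the line passes through gets one lit pixel:
lit = [sum(map(sum, s)) for s in slices]
print(lit)
```

Played back fast enough, the stack of slices fuses into a single vertical line floating in the volume, which is the persistence-of-vision effect described above.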
We started making the VX1 in 2017. Its design evolved out of our limited access to manufacturing equipment. Most parts were 3D printed and laser cut. Each unit took weeks to construct. We had endless technical issues trying to make them work well. 3D prints delaminated, USB cables stopped working, and glue came unstuck. Windows updates bricked our core rendering engine, and bearings dried out and stuck.
We implemented numerous design changes, replacing sliding parts with flexures, swapping ball bearings for composite bushings, ditching coil springs in favor of opposing magnets, trading the acrylic screen for Gorilla Glass, and slowly, over time, the VX1 transitioned into a fully digitally designed and fabricated product that we can make here in our Adelaide office in a day.
80.lv: Are there any big-league companies that utilize your product?
Gavin: Our client base is diverse and spread across numerous industries. Sony, Toyota, Nissan, Unity, Harvard, MIT, CMU, BAE Systems, Lamborghini, Siemens, Nokia, Honeywell, and many more spread across use cases including education, medical imaging, mining, gaming, simulation, defense, communication, advertising, entertainment, and more.
The Technology Behind the Display
80.lv: Please tell us about the technology behind the display, how does it splice the image into multiple layers?
Gavin: Our tech is classified as a "swept surface volumetric display". Most people call it a hologram, and you can’t blame them, as the term is generally used to describe any type of floating 3D image seen in sci-fi films. Most other 3D technologies, including VR and 3D cinema, use a pair of stereoscopic images presented to the left and right eye, and the brain then re-creates the depth information based on the different perspectives. Our approach is very different: instead of rendering a left and right image, we reconstruct an entire scene using thousands of slices. The slices are displayed so quickly that they blend together through persistence of vision, and the result is a floating scene that can be looked at from any direction. The VX1 renders 30 volumes every second, and each volume has a resolution of around 1000x1000x200 voxels. Unlike VR and other 3D displays, the data rendered by a Voxon display physically exists where your brain thinks it should, so there is no "vergence/accommodation conflict", the physiological effect that happens when you focus on a 3D screen but the apparent 3D scene is behind or in front of it.
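The figures quoted above imply some striking back-of-envelope numbers. A quick sketch, using only the resolutions and refresh rate stated in this interview (note that "addressable" voxels far outnumber the dots actually lit in a typical scene):

```python
# Back-of-envelope throughput for a swept-surface volumetric display,
# using the VX1 figures quoted in this interview: 30 volumes/second,
# each roughly 1000 x 1000 x 200 voxels.

volumes_per_second = 30
x_res, y_res, z_slices = 1000, 1000, 200

# Each volume is drawn as a stack of binary slices, so the projector
# must deliver this many 2D frames every second:
binary_frames_per_second = volumes_per_second * z_slices
print(binary_frames_per_second)

# Addressable voxels swept out per second (only a fraction of these
# are lit at once; the interview quotes ~500 million lit dots/second):
addressable_voxels_per_second = volumes_per_second * x_res * y_res * z_slices
print(f"{addressable_voxels_per_second:,}")
```

That is 6,000 binary frames per second and six billion addressable voxel positions per second, which is why an ultra-high-speed binary projector (a DMD with a million-plus tilting mirrors) is needed rather than a conventional video projector.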
The VX1 is one manifestation of our display technology. We have several other hardware products, each using a different kind of swept surface, and each designed for a different form factor, but each built around the concept of ultra-high-speed binary image projection.
Practical Applications and Use Cases
80.lv: Could you tell us about the practical applications of Voxon VX1? Your website mentions that the display can be used "for education, data visualizations, experiential marketing, or medical imaging," could you please elaborate?
Gavin: There are numerous practical applications spread across numerous disciplines that include industrial, academic, medical, entertainment, communication, defense, marketing, automotive, scientific, gaming, and sports replay.
Here are a few examples of use cases for volumetric displays:
- Out-of-home entertainment: The video game arcades of the 80s were places to experience digital entertainment before consumer electronics enabled people to experience games in their living rooms. Location-based entertainment venues are now increasing in popularity, offering a mix of VR, AR, thrill rides, and other digital entertainment offerings. Our first product targeting this space is Space Invaders – Next Dimension, the first volumetric video game, which allows you and three friends to play face-to-face in a glasses-free 3D re-imagining of the 1978 Taito classic.
- Education: Our technology is being used in many universities including Harvard, MIT, and CMU. With so many options for content creation, students can use our tech to express themselves creatively or to visualize the output of research. Collisions of particles, protein structures, molecules, or simulation data. Data that may be otherwise trapped in a spreadsheet or PowerPoint can now be shared in an engaging and dynamic manner, with none of the audience required to wear headgear.
Complex mathematical concepts come to life on a VX1.
- Medical Imaging: DICOM is the industry standard for storing CT and MRI scans. A typical DICOM comprises a folder with around 200 greyscale images, each representing a slice of the human body. We support this format natively. Our rendering engine converts the DICOM file into a 3D mesh in real-time, allowing you to interactively explore the data by "thresholding", selecting the density of the region of interest, and in doing so, you can seamlessly move from bone to soft tissue to muscle, and share the results with a group of people, each having their own unique perspective. More recently, CT scanners are being used in airports to scan luggage and we can visualize the contents very clearly and with a high degree of spatial accuracy.
Visualizing medical data on a Voxon VX1.
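The thresholding idea above is easy to sketch. Loading real DICOM data would normally use a library such as pydicom; the snippet below instead builds a synthetic CT-like stack of slices with made-up density values (a dense "bone" cylinder inside soft tissue) and shows how selecting a density window isolates a region of interest:

```python
# Sketch of density "thresholding" on a CT-like volume. The volume is
# synthetic: DEPTH slices of HEIGHT x WIDTH voxels, with Hounsfield-style
# values of ~1000 for a central bone-like cylinder and ~40 for tissue.

DEPTH, HEIGHT, WIDTH = 20, 32, 32

volume = [[[1000 if (x - 16) ** 2 + (y - 16) ** 2 < 25 else 40
            for x in range(WIDTH)]
           for y in range(HEIGHT)]
          for _ in range(DEPTH)]

def threshold(volume, low, high):
    """Return coordinates of voxels whose density falls within [low, high]."""
    return [(z, y, x)
            for z, sl in enumerate(volume)
            for y, row in enumerate(sl)
            for x, v in enumerate(row)
            if low <= v <= high]

bone = threshold(volume, 300, 2000)    # isolate the dense "bone" core
tissue = threshold(volume, 0, 100)     # everything softer

print(len(bone), len(tissue))
```

Sliding the `low`/`high` window interactively is what lets a viewer move seamlessly from bone to soft tissue, and the selected voxels are what get rendered in the volume.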
- Stakeholder engagement: On large projects, there will be times when multiple teams need to come together to align their understanding of a scenario. In the case of underground mining, you may have a team comprising mining engineers, geologists, managers, financial controllers, geo-physicists, and 3rd party sensor providers. Being able to combine data from all these disciplines into an interactive visual representation of a mine site has the potential to save time and keep all parties fully in sync with one another.
- Sports replay: People love watching sports, whether in person or on TV. There are now more ways to engage with an ever-increasing range of sporting activities, and not just by watching live: analysis of tactics and replays is becoming as popular as the games themselves. Companies such as Canon and Intel started experimenting with the capture of sport in 3D using photogrammetry. An entire game of soccer could be converted to a 3D animated mesh within several hours using massive parallel computing clusters. In the last year, we have seen an explosion in AI-based capture from companies such as MoveAI. These newer techniques offer real-time capture because they include an understanding of human kinematics, as opposed to brute-force point cloud capture. What this means is that very soon, we will be able to render a real-time game of soccer or another sport as a "hologram" on a table-football-sized display in a bar. No more looking up at a TV on the wall.
Some motion-captured soccer data from MoveAI.
- Communication: The human face: everybody has one, and it is the most important part of any person when it comes to social interaction. Since the birth of radio in the 1890s, the evolution of communication has not slowed down, and we now all take for granted the ability to have a face-to-face video call with family or friends on the other side of the planet. In 2018 we worked with Verizon and Ericsson at MWA in LA to enable the first 3D volumetric video call over 5G. We used depth cameras from Intel to capture the data, which was transmitted from one side of the convention center to the other, enabling two people to talk to each other and see a physical representation of one another. We have since been working on a much larger display for this purpose, and when ready, it will be able to render a 1:1 scale full-size human head that is flicker-free and visible in ambient light conditions. With so many new facial capture techniques now available, the commercial possibilities are endless. Real-time language translation avatars, video chat, karaoke, music and entertainment, historical storytelling… there are so many more.
Jacob recording himself with a Microsoft Kinect Azure 3D camera.
- Domain awareness: There are many use cases in defense and commercial situations where it is critical to understand the position and relationship of assets in a global context. Voxon has written a 3D global mapping API that combines aerial and satellite imaging with topography and bathymetry. We can overlay that map with geo-located assets using standard LAT/LONG and altitude. This framework allows us to build applications for numerous use cases, including battlespace visualization, underwater navigation, satellite and space debris tracking, air traffic management, firefighting, and any other disciplines where tracking global operations and informed decision-making is critical.
Tracking Aircraft over a 3D moving global map.
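Placing a LAT/LONG/altitude asset on a 3D globe boils down to a coordinate conversion. A minimal sketch, assuming a simple spherical Earth (a production mapping engine would use a proper ellipsoid datum such as WGS84):

```python
# Sketch: convert latitude/longitude/altitude to Cartesian coordinates
# on a sphere, so a geo-located asset can be positioned on a 3D globe.
# Spherical-Earth assumption; radii and the sample point are illustrative.

import math

EARTH_RADIUS_KM = 6371.0

def geo_to_cartesian(lat_deg, lon_deg, alt_km=0.0):
    """Map (lat, lon, altitude) to (x, y, z), globe centered at the origin."""
    lat = math.radians(lat_deg)
    lon = math.radians(lon_deg)
    r = EARTH_RADIUS_KM + alt_km
    x = r * math.cos(lat) * math.cos(lon)
    y = r * math.cos(lat) * math.sin(lon)
    z = r * math.sin(lat)
    return x, y, z

# An aircraft at 10 km altitude roughly over Adelaide (34.9°S, 138.6°E):
x, y, z = geo_to_cartesian(-34.9, 138.6, alt_km=10.0)
dist = (x * x + y * y + z * z) ** 0.5
print(round(dist, 1))   # distance from Earth's center: 6381.0 km
```

Scaling the resulting coordinates into the display volume then puts the asset at the right spot above the rendered terrain.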
Using VX1 With 3D Software
80.lv: Could you please tell us more about using VX1 with various 3D applications? How does one turn a 3D render into a "hologram"?
Gavin: There are multiple ways to render data on the VX1, and each has its own workflow. Whilst we mostly use Blender for regular video posts, you can create media using any 3D application as long as you can convert it into OBJ, STL, PLY, MOL, DICOM, or Height Map. If you are using Unity, you can use FBX and any file formats that are supported by the Unity engine.
Use cases can be categorized into interactive applications, pre-computed animations, and streamed data.
Voxon has spent close to a decade building a fully functional SDK (available for download on our website). The SDK includes numerous code samples written in C & C++ as well as a plug-in for Unity.
Programmers can jump in and write a "Hello World" application within a few minutes, and using our header files, should be up and running very quickly testing out the long list of volumetric function calls available. These let you draw anything from primitives to 3D textured meshes.
If you run the volumetric program on your own PC, the application will be rendered in a VX1 simulator. All the controls and menu options will be the same, and you can rotate and scale the image as though it were on real VX1 hardware. The same program will run on real hardware without any modifications.
To run your program on a real VX1, you simply transfer it to the VX1 via USB or network. It will then be available in Vertex, Voxon's app browser which is installed on each VX1. Vertex also enables you to download new demos from our website and additional media packs containing animations and other 3D files.
Multi-user 3D drawing program written in C.
For Unity developers, we have a package that you can insert into your Unity project. It's then a case of simply dragging and dropping a “capture volume” asset into your scene, and Unity will pass the mesh data to our DLL, which will then render it on the VX1. We use mesh colliders to determine which 3D assets are inside the volume and are thus candidates for being viewed.
A leap motion demo made with Unity.
A game written in Unity.
Blender has been the main choice for creating animations for testing on the VX1. The workflow simply requires you to export a series of mesh files in a numbered sequence and then save them in a zipped folder.
File types that are natively supported include OBJ, STL, and PLY. If you are using texture maps, then OBJ is the best option. Prior to exporting from Blender, we usually apply a decimate modifier to reduce the polygon count and keep each frame below about 3 MB, which speeds up loading when playing the animation back.
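The packaging step of that workflow, writing a numbered mesh sequence and bundling it into a zip, can be sketched in a few lines of plain Python. The file names and the trivial one-triangle OBJ content here are made up for the example:

```python
# Sketch of packaging a mesh-sequence animation: write one OBJ per
# frame with zero-padded numbering, then bundle the frames into a zip,
# mirroring the Blender export workflow described above.

import io
import zipfile

def make_obj_frame(frame):
    """A trivial OBJ mesh: one triangle that rises with the frame number."""
    h = frame * 0.1
    return (f"v 0.0 0.0 {h:.1f}\n"
            f"v 1.0 0.0 {h:.1f}\n"
            f"v 0.0 1.0 {h:.1f}\n"
            "f 1 2 3\n")

buffer = io.BytesIO()
with zipfile.ZipFile(buffer, "w", zipfile.ZIP_DEFLATED) as zf:
    for frame in range(30):                     # a 1-second, 30-frame clip
        zf.writestr(f"anim_{frame:04d}.obj", make_obj_frame(frame))

with zipfile.ZipFile(buffer) as zf:
    names = sorted(zf.namelist())
print(names[0], names[-1], len(names))
```

Zero-padded numbering matters because the playback software orders frames by file name; `anim_0002.obj` must sort after `anim_0001.obj` and before `anim_0010.obj`.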
During playback of an animation in Voxieos (our playback software) you can move, rotate, zoom, and change many options relating to the visual appearance of your work. You can also enable light effects such as normal-based illumination.
A complex fire simulation created in Blender and exported to our VXR Display.
The VX1 also supports applications that run over a network. One example is KNIVIEW, a program that streams data from depth cameras such as the Intel RealSense or Microsoft Kinect.
We also have a plug-in for Blender, a Python script addon, which when installed in Blender, will allow you to connect to a VX1 on the same network, and render the contents in real-time.
The Business Side of Things
80.lv: How do you promote Voxon Photonics and Voxon VX1? What's your current business model?
Gavin: Voxon’s business strategy is developed and refined by our corporate team and is guided by our previous traction as well as market trends and product/market fit. With any new technology, there are stages of development where some markets are more ready for mass adoption than others. Strategic development is a process of building relationships with clients, collaborating on PoCs, and commercial testing.
The VX1 was designed to allow clients to access volumetric display technology and experiment with it, understand it, and brainstorm use cases and new form factors.
It is our job to educate our clients on what can be achieved, how to integrate with existing technologies, and what the future might hold in terms of product development.
In order to promote our technology, we continuously scour the internet for the latest in 3D data and then integrate it into a demonstration, film it, and share it on social media. There is never a shortage of suitable media with which to explore, and new technologies evolve on a daily basis, opening up new opportunities for creative visualization.
We post videos of each experiment on LinkedIn, Twitter, and YouTube, and recently started with Instagram too, but that has a more arty direction to it.
One of our biggest challenges is proving to people that the technology is real. If you search the internet for "3D holograms", you will find numerous examples of impressive-looking larger-than-life creations that are, it turns out, no more than CGI renders, illusions, or visual tricks. These can look good in a 2D video on YouTube, but simply don’t translate to the real world. Our approach has been that of honesty, only showing real videos of real people using real hardware and software. When people comment "that's fake" on our YouTube channel, they are actually giving us a compliment. What they are really saying is "that's unbelievable".
Types of 3D Displays
If you have ever been to a live sporting event, you may have noticed that ads written on the playing field look distorted. This is because they are designed to be looked at only by the TV cameras, and only from that perspective do they look correctly proportioned.
The technique is called "anamorphic projection". The giant "3D holographic billboards" simply show animations that have been distorted in such a way that from one single camera angle they look correct. The entire illusion relies on the fact that the vast majority of people looking at the ads do so via social media on their phones, from the only viewpoint that works. If you walk past one in real life, you will see a warped mess, unless you find that exact spot to stand where things look interesting (but still 2D). In terms of real 3D displays, most fall into the category of "lenticular multi-view autostereoscopic".
These types of displays are essentially LCD screens with a special micro-lens array providing a 3D view within the bounds of the screen. They can create a convincing 3D image, but obviously, you can't walk around the back of the screen or look down on them from above. Other than the aforementioned LCD screens, most other "holographic" displays are actually 2D illusions: a mix of spinning LED fans, Pepper's Ghost illusions, and clever use of shadows and forced perspective.
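The geometry behind the anamorphic trick described above is simple: to make a virtual 3D point look correct from one fixed camera, you draw it where the camera's line of sight through that point meets the ground plane. A minimal sketch with made-up coordinates:

```python
# Sketch of anamorphic projection: intersect the ray from the camera
# through a virtual 3D point with the ground plane z = 0. Drawn there,
# the point appears "floating" only from that one camera position.

def anamorphic_ground_point(camera, virtual):
    """Where the ray camera -> virtual crosses the plane z = 0."""
    cx, cy, cz = camera
    vx, vy, vz = virtual
    t = cz / (cz - vz)          # ray parameter at which z reaches 0
    return (cx + t * (vx - cx), cy + t * (vy - cy))

camera = (0.0, -10.0, 5.0)      # TV camera: 10 units back, 5 units up
virtual = (0.0, 0.0, 2.0)       # a point meant to "float" 2 units up

gx, gy = anamorphic_ground_point(camera, virtual)
print(gx, gy)                   # drawn further away than the virtual point
```

Because the drawing is stretched along the ground away from the camera, any other viewpoint sees the warped mess the interview describes.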
Things to Come
80.lv: What are your future plans? How do you plan to develop and promote the company?
Gavin: We are in the process of commercializing our first major product, Space Invaders – Next Dimension. The game is the culmination of over ten years of work and uses our next-generation VXR technology. A volume five times larger than the VX1, with triple the color rendering and twice the refresh rate. The game is currently undergoing technical testing in entertainment venues in Australia.
The original Space Invaders game was manufactured and sold by Taito in Japan in 1978, and by 1982 had grossed $3.8 billion. We met the Taito management team at the Tokyo Game Show in 2018, and after showing them our gaming version of the VX1, we started working on a concept for a new game, combining their iconic game design with our futuristic 3D visuals.
Prior to commercialization, Voxon will be completing a series A round of investment to finance the design for manufacture, distribution, and marketing of the game.
Whilst Space Invaders is targeted as the product to introduce Voxon’s unique display technology to the world, the business model that follows it will be laser-focused on building recurring revenue from location-based entertainment. By licensing global brands, Voxon intends to build a catalog of cutting-edge volumetric 3D entertainment experiences which will be playable in a wide range of public venues, including experiential zones, theme parks, cinemas, festivals, theatres, arcades, bars, clubs, and other places where people gather.
Whilst that is the plan for the DOOE (Digital Out-of-Home Entertainment) sector, as our core rendering capabilities mature, we will start commercializing other sectors too: retail, advertising, art, education, medical, defense, museums, and many more.
Post investment, we plan on widening our global presence, by partnering with leading visual artists and brands to find demonstration locations around the globe. Venues that are accessible to the public and managed by creative and fun people like ourselves.
People often ask about our plans for the Metaverse. My answer to that is as follows. The Metaverse is not a place, product, or brand. It is a collection of tools, techniques, protocols, capture devices, applications, data abstraction layers, networking hardware, compression techniques, rendering techniques, GPUs, CPUs, distributed computing, game design, artificial intelligence, art, 3D modeling, texture mapping, procedural synthesis, haptic feedback, and 3D display devices.
The Metaverse is a subset of the internet that encompasses everything related to spatial computing. There are numerous ways to engage with 3D data including VR, AR, 3D screens, 2D screens, and now Volumetric Displays. Each technology has its own strengths and weaknesses, and each will play its own part in the future of human-computer interaction.
To us, the appeal of Volumetric Displays is exemplified in their ability to bring people together, foster conversation and exploration, entertain and educate, and form a bridge between the digital and physical worlds.
As a small company, Voxon relies heavily on word of mouth to communicate its unique value proposition around the globe. We have built a solid community of friends in countries all over the world who all share our vision and look forward to what the future may hold.