I’m back from ISE Amsterdam – the world’s largest AV show – and straight into the Blake Street R&D studio at Mersive. For my readers who aren’t familiar with “AV”, it stands for Audio Visual. The world of AV rarely evokes images of cutting-edge technology; it’s more likely to bring to mind the old “AV cart” that made its way through the halls of our schools, covered in cables and outdated projectors. At best, it reminds us of our home theatre, stereo system, or home automation. This is only because AV used to be defined by hardware. All of that has changed.

I’m energized by what I saw and heard at the show, and I can’t help but think that AV has become something altogether more exciting – the frontier of technology that’s defining how we experience our environment. This is a tall order, but AV is uniquely positioned to define the intersection of computing, interaction, and the physical world. AV’s heritage is rooted in the physical – speakers, displays, interactive signage, immersive experiences, and events. Over the past several years the AV community has been busy absorbing technologies from fields like software development, computer science, data science, and artificial intelligence. What you end up with is an AV field that is not only passionate about creating fantastic experiences that sometimes seem like science fiction but is equipped to pull it off.

So what at ISE led me to think this way? You can always spot an innovative field because some of the exhibits at its tradeshows defy practical purpose. You can dismiss some of the things on display at ISE as impractical, but I’ve seen this pattern before in other communities. When a field is generating valuable innovation but is also spinning off creative, seemingly purposeless inventions, it’s a sign that the field is about to take off. If you attended shows like SIGGRAPH in the mid-1990s, you know exactly what I mean.

Another feature of a market that’s about to change is the separation that occurs between different products and approaches in the space. This is because when new technologies combine to create a fundamental rebirth of AV, some companies are left in the old market and haven’t realized it yet.

Below, I’ll point out some of the highlights of the show in general, but then I want to use my own field, collaboration technology, to give you an example of how quickly products can separate.

Here are three highlights from the show:

In the “Just Because You Can” Category
Projection mapping could be found all over the event center. I still recall the first demonstration of a full-motion texture map, shown to me at UNC-Chapel Hill by Ramesh Raskar, Greg Welch, and Henry Fuchs. As soon as I saw that demo in 2000, I knew it would find a home. The technology is now well productized, but it hasn’t found a killer application outside of production events. Like any technology, projection mapping has a hidden, deeper heritage, and it’s about to explode.

As an example, I noticed this giant projection-mapped head. It was equal parts amazing and disturbing. Again, this is what innovation looks like. Perhaps in the next few years you’ll be experiencing projection maps in your everyday life.

The People
I’m a futurist and technologist at heart, but the pursuit of technology is fundamentally a human imperative. ISE was an impressive showcase of the people behind the latest technologies. I must have had four different conversations with AV folks who used to write control panel code in specialized, vendor-specific languages (yes, that’s a thing) and who are now writing generalized applications in Python, Swift, and even Rust.

Shout out to the focus on diversity at the show (another sign of innovation: forward-thinking communities are ones that have the self-awareness to seek out a diversity of ideas). I had fantastic conversations over beer (or, in one case, on a boat) about privacy, the role of AV in emotional state and its connection to productivity, and AI-assisted environments for wayfinding, scheduling, and predictive support. These are topics that ten years ago would only have taken place in other communities.

Clive Couldwell, editor of AV Magazine, was kind enough to invite me onto a panel to discuss the future of the workplace. I was impressed by the panel members’ ability to peer into a future that they are defining. Two large companies joined me on the panel, and you could tell they both cared deeply about how technology could have a positive impact on their users’ lives – bringing them together, allowing freedom of expression, and the ability to work as they see fit. Jenny Hicks, a leading thinker from the AV integrator channel, has the same level of passion about making complex systems simple that I’ve seen in leaders in other communities (like cybernetics).

AI for AV is Here
In defining how users will experience their workday – whether it’s a meeting in a conference room or a casual space for collaboration – we need to better understand how technology impacts those users. Machine learning will allow us to discover which meetings are productive, and why, so that we can better refine how our spaces respond to our needs. This isn’t as far-fetched as it seems. Even Barco, a company that got its start building some of the first radios in the 1930s, is looking at “Insights” based on device utilization. Our own AI monitoring portal, Kepler, won Best of Show! Talk about an acknowledgment that this market has moved its focus away from video cables and standards.

So what about my comments related to product separation? This is a sensitive topic because I often avoid promoting our efforts, but it’s clear that as AV undergoes such a rapid, positive transformation, not all companies will come along for the ride.

As an example, Mersive revealed a new approach to collaborative markup called “Solstice Ink” that is a rethink of the whiteboarding approaches of the past. It’s centered on allowing any participant to use their smartphone to highlight, draw, and emphasize content on the display collaboratively. When a user draws a line, circle, or arrow, the system recognizes the user’s intent and morphs the markup into a “perfect” fit of the corresponding shape. The approach drives engagement and allows all users to collaborate equally. The technology required to bring this to life draws on everything from mobile app development to predictive filters (e.g., the Kalman filter) from the field of statistical control theory.
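To make the shape-snapping idea concrete, here is a minimal sketch of one way a rough stroke could be classified before being morphed into a “perfect” shape: fit both an ideal line and an ideal circle to the sampled points, then snap to whichever has the smaller residual. This is my own illustrative assumption, not Mersive’s actual algorithm; the function names (`fit_line`, `fit_circle`, `classify_stroke`) are hypothetical.

```python
import math

def fit_line(points):
    """Mean squared perpendicular distance to the total-least-squares line."""
    n = len(points)
    cx = sum(x for x, _ in points) / n
    cy = sum(y for _, y in points) / n
    sxx = sum((x - cx) ** 2 for x, _ in points)
    syy = sum((y - cy) ** 2 for _, y in points)
    sxy = sum((x - cx) * (y - cy) for x, y in points)
    theta = 0.5 * math.atan2(2 * sxy, sxx - syy)  # principal-axis direction
    nx, ny = -math.sin(theta), math.cos(theta)    # unit normal to that axis
    return sum((nx * (x - cx) + ny * (y - cy)) ** 2 for x, y in points) / n

def fit_circle(points):
    """Mean squared radial error for a crude centroid-centered circle fit."""
    n = len(points)
    cx = sum(x for x, _ in points) / n
    cy = sum(y for _, y in points) / n
    dists = [math.hypot(x - cx, y - cy) for x, y in points]
    r = sum(dists) / n
    return sum((d - r) ** 2 for d in dists) / n

def classify_stroke(points):
    """Snap the stroke to whichever ideal shape it fits best."""
    return "line" if fit_line(points) <= fit_circle(points) else "circle"
```

A production system would handle more shapes (arrows, open curves), and would likely smooth the raw touch samples first – for example with the Kalman filter mentioned above – before fitting.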


Was it easy to build? No. Was it quick? No. It took us more than two years to understand the real need that underpins a user’s desire to annotate in a meeting. Was it worth it? Absolutely. Our users will be more engaged and enjoy meetings just a little bit more. This was our big announcement at the show, and I feel it was received with enthusiasm by the community.

At the same time, our competitors focused on business-led, incremental changes to product line-ups. It’s the difference between truly accepting a software-paced approach and sticking with thinking derived from hardware lifecycles. This thinking is a trap that companies can quickly fall into. Creating new opportunities for your business certainly can come from introducing the same product with self-imposed limitations at a different price point, but what users want is creative, novel approaches to solving their problems – not new product SKUs. This type of thinking will create product separation. The good news is that more than half of the AV community recognizes this and is on an exciting path to the future.

If you’re interested in learning more about Solstice version 4.0 and the Gen3 Pod, please visit the Mersive website.

About Christopher Jaynes

Jaynes received his doctoral degree at the University of Massachusetts, Amherst where he worked on camera calibration and aerial image interpretation technologies now in use by the federal government. Jaynes received his BS degree with honors from the School of Computer Science at the University of Utah. In 2004, he founded Mersive and today serves as the company's Chief Technology Officer. Prior to Mersive, Jaynes founded the Metaverse Lab at the University of Kentucky, recognized as one of the leading laboratories for computer vision and interactive media and dedicated to research related to video surveillance, human-computer interaction, and display technologies.