We are unprepared

January 28, 2022

You cannot stop technology. Once a technology is out there, there’s no stuffing it back into the box; and on the internet, trying will probably just cause the Streisand effect instead.1

There’s been a lot of talk about the metaverse2 lately, and if you combine the progress in machine learning with the reality and rumours of upcoming AR tech, it may become harder and harder to escape the internet.

As if it isn’t already hard enough to avoid the internet.

In case you aren’t concerned yet, allow me to point out a few things:

  • Deepfake technology is improving incredibly quickly
  • Voice synthesis technology is maturing
  • Augmented reality headsets are probably coming
  • We can already do incredible things in real-time with cameras and neural nets

The future has never looked as promising, or as scary.

Painting a picture

You’ve never had this much knowledge and information at your fingertips. Trained neural nets can give you personalised information, and communication has never been easier. Your friends can be on the other side of the world yet feel right next to you.

On the other hand, the economic divide will be greater than ever. These devices will take a while to become commoditised, and those who adopt the technology early will reap the benefits. Perhaps we won’t be comfortable wearing these AR devices all the time (or at all, initially), but how long until we have something that people are comfortable with?

How many generations until our progeny laughs at the fact that we had to use dedicated screens? That is, assuming we haven’t destroyed our planet by then, of course.

Just fantasising

Here are a few fun (and questionable) applications of this technology that I just made up. I’m not even thinking outside the box here. These all sound plausible to me in the context of AR:

  • Filter out specific people from your field of vision
    (we can apply real-time filters to faces today)
  • Replace real-life advertisements with personalised ads
    (we already have all kinds of personalised ads today)
  • Witness real or idealised representations of actual humans in our day-to-day life
    (we have vtubers and VRChat today)
  • Even more data gathering about people’s movements and the things they perceive (under the veil of “necessary data collection for product improvement”)
    (this already happens to some extent)

I’m sure more nefarious minds will come up with even more scary stuff, like stealing all this data.

Imagine the theft of a virtual persona: an identity based on your real one, but stolen from you. A thief traversing the metaverse with your body, your voice, your identity… We have already seen the rise of questionable adult content built from deepfakes of celebrities’ faces — and this is just the beginning.3

We are (legally) unprepared

The time to think about the legal implications of all our modern tech was yesterday. So we’d best catch up.

But as it stands, our current political systems simply don’t have enough knowledgeable people in them to legislate this stuff.

Because once the cat’s out of the bag… you’re not putting that kitty back in.

We will pay for that mistake. I’d argue that we already are (see also: the anti-vaxx movement, misinformation campaigns, information bubbles). I originally thought this was a mostly US-centric issue, but no — it’s a global issue now. During the pandemic, we have seen fake news proliferate to a scary degree.

We will pay for the fact that most of our politicians are technologically illiterate, and that our laws are already poorly tailored to reality. We must do better.

Because there’s no stopping technology. We must be prepared.

  1. Another reason why making encryption illegal in certain situations is a stupid idea (not to mention the security implications). Now that the technology is here, there will always be someone doing encryption in e.g. a chat client; it becomes a game of cat and mouse at that point. There’s no need to put everyone else’s information at risk because you’re scared of what could be said in private. We already have that problem in real life, too: people lie, people hide things, people are awful. Encryption is just math. You can’t outlaw mathematics. 

  2. No one wants a “metaverse”, by the way. It’s a terrible word that, at this time, is quite meaningless — more about aspirations than reality. 

  3. We’re already at the point where video footage can be edited so well that it’s no longer self-evident whether particular footage is real or was doctored in some capacity. Just look at the shitstorm caused by the Tom Cruise deepfakes last year.