Under the Skin Surveillance – COVID-19 series

During a crisis, we have to take immediate action. But sometimes the short-term fixes become long-term problems of their own.

Post 9/11

After 9/11, some things changed. We now have a permanent public security mindset. Everywhere people gather en masse, the threat of terrorism is omnipresent. This means we get our bags checked, walk through metal detectors and face a few other procedures for our collective safety. But the inconvenience hasn’t massively impacted our civil liberties.

Post Corona

Post COVID-19, we’ll see a similar pattern. Except this time, they won’t be checking for weapons we carry outside our bodies, but weapons we carry inside our bodies – viruses. Biometric testing will become the norm in places where people gather – public transport, stadiums, schools, universities and workplaces. We’ll walk through temperature sensors, breathe into analysers, look into iris scanners and be monitored by any other mass biometric measurement device you can imagine. Again – not such a bad thing to keep society healthy.

Under the Skin

The problem with the above method, of course, is that a virus has been shown to be capable of spreading far and wide through non-public venues. So, let’s imagine our government comes up with a better method. Every man, woman and child is given an Apple Watch. The watch comes with additional sensors whose outputs automatically feed directly into a government database. The sensors constantly measure body temperature, heart rate, blood pressure, sleep patterns, where you go and who you’re near. It can even record all your conversations 24/7, which can help locate and then minimise transmissions if you are infected – just in case. Oh, and it’s mandatory for holders of the free watch to use Apple Pay for all their purchases – which also links to the database.

Like magic, government algorithms could analyse the data, find health problems before we even know we are sick, and stop a potential chain of infection in its tracks. A potential epidemic could be over in mere days.
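
For the technically curious, here’s a minimal sketch of the kind of analysis such a system could run – a rolling baseline for each wearer, an anomaly flag when a reading drifts well above it, and a lookup of recent contacts from a proximity log. Every name, threshold and data shape below is a hypothetical illustration, not Apple’s or any government’s actual system.

```python
from statistics import mean, stdev

def is_anomalous(history, latest, z_threshold=3.0):
    """Flag a reading that sits well above the wearer's own baseline."""
    if len(history) < 10:            # not enough data for a baseline yet
        return False
    baseline = mean(history)
    spread = stdev(history) or 0.1   # avoid divide-by-zero on perfectly flat data
    return (latest - baseline) / spread > z_threshold

def contacts_to_notify(proximity_log, person_id, days=3):
    """Return everyone logged near an anomalous wearer in the last few days."""
    return {other for (a, b, age_days) in proximity_log
            if age_days <= days
            for other in ((b,) if a == person_id else (a,) if b == person_id else ())}

# Usage sketch: ten days of normal temperatures, then a fever-like spike.
temps = [36.5, 36.6, 36.4, 36.7, 36.5, 36.6, 36.5, 36.4, 36.6, 36.5]
print(is_anomalous(temps, 38.4))         # True
log = [("alice", "bob", 1), ("alice", "carol", 2), ("dave", "erin", 1)]
print(contacts_to_notify(log, "alice"))  # {'bob', 'carol'}
```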

That would be awesome, right?

Long Lead Thinking

While the benefits of the above idea are clear, giving legitimacy to this level of surveillance would have a compounding effect on other areas of our lives. We’d be opening up our bodies and letting big tech and government get under our skins, literally. How could the data be used in unintended ways? How could minorities be targeted? What if the data were hacked? How could it be matched geographically and time-stamped against our other online activities? The government could start to know not just what we watch and read, but what we think and our emotional reactions to it. All of a sudden we could have a surveillance state that literally knows how we personally feel about everything. Happiness, sadness, elation, fear and anger – our most internal and private states of being would all be on file. Forget personality testing; we’d be ranked.

The Power of Inconvenience

We need the wisdom to understand that convenience always has a price. And if the price of goods in 7-Eleven has taught us anything, it’s that the price is always high.

  • Fast food is quick and convenient – but it has had massive consequences for our health.
  • Fossil-fuelled economies grow quickly – but at the cost of endangering the climate.
  • Handing over our personal data can produce powerful information for collective gain – but we lose privacy and individual agency.

The consequences of today’s actions happen long after the moment has passed. And often, they are beyond anything we can even imagine.

Privacy as a Luxury

Here’s some light entertainment to end the week: our latest Future Sandwich Now-Soon-Later episode. In this episode we take a look at Big Tech’s move to infiltrate our homes (ironically, just when we’re spending more time in them): smart speakers – sorry, microphones – smart homes and surveillance capitalism. You’ll learn about the one thing we all used to have that we’ll have to pay to get back.

I’d love it if you could subscribe to our Future Sandwich YouTube channel, make a comment (feel free to disagree) and share it with someone who has let Big Tech spy on them inside their house! There’s a free Google Nest Mini for the best comment. Oh, before I forget – this was recorded pre-social-distancing.

I hope you’re all keeping safe, healthy and minimising contact. Just like in business, if we do more than we’re asked, we might get a better result – and get out of this situation sooner.

Steve.

P.S. Due to many requests I’m currently writing up how I see the world changing forever post Corona – which I’ll share here at a later date. 

A Privacy Tipping Point?

This week I did media interviews from Sydney to New York to Washington to San Diego about the sudden popularity of FaceApp. I’m guessing you’ve already tried it. If not – you upload your photo to the app and choose a filter to make yourself look younger or older, change your gender, or sprout some facial hair. Powered by AI, the app magically spits out a photo of you that can be plain frightening.

FaceApp improved its software this week and celebrities have been posting photos of their future elderly selves. 150 million downloads later, security experts have sounded the alarm about the consequences of uploading your data to an app based in Russia. But here’s the rub: its terms and conditions aren’t really any different from those of most social media platforms. Why is this concern over cybersecurity suddenly so much in the zeitgeist when we hand over far more personal data to tech giants like Facebook, Amazon and Google every day? Yep, you got it – it’s because the Ruskis are involved. Personally, I’m more worried about Mr Zuckerberg, and so is Wired magazine. In any case, it’s clear every big tech database has already been hacked by foreign entities, including the Russians.

While it is kinda weird that it took a foreign social app to generate such a media storm, I’m thankful it has. We might just finally be starting to get woke to the compounding effect of copious amounts of personal information being vacuumed up. What is clear is that we always turn a blind eye to the downside of anything when the short-term benefits outweigh the long-term consequences – which is what Big Tech does so well. They know we can’t live without their services on a daily basis. But when it comes to FaceApp… a few funny photos are all it provided, and all of a sudden we’re worried about what we’re giving away. Maybe it should also have promised to make the world a more open and connected place?

–   –   –

Be sure to check out and subscribe to the Future Sandwich Now-Soon-Later short weekly web video series. You’ll totally dig it.

There’s only one real world

It’s very hard to understand the consequences of something when you can’t touch it, feel it and experience it in physical form. Many of our virtual experiences seem displaced from physical reality. It’s as if they didn’t really happen – that they’re only information, and information isn’t real. Digital privacy fits neatly inside this way of thinking.

I’m sure you’ve heard the following statement when it comes to online privacy:

“If you’ve got nothing to hide, you’ve got nothing to worry about.”

The next time you hear that, ask the person who said it to hand over their phone. Ask them to tell you the password, unlock every app inside it and let you browse through at your leisure. I’d be surprised if any adult in the free world would feel comfortable with this. I know I wouldn’t. It’s not because I have anything to hide; it’s because privacy and secrets are not the same thing. And some things in my life, like everyone’s, are private. The phone is not a phone – it’s a digital manifestation of the physical self. It’s the most personal device humans have ever owned.

To gain access to it, our governments and tech companies have conspired to conflate privacy and secrecy into the same thing. It suits both of these actors. Governments get access to all that we do – just in case a terrorist is hiding inside their gigantic digital dragnet, or someone tries to use cryptocurrency to dodge tax. Simultaneously, the tech giants get to continue their business of Surveillance Capitalism. And the externality of both is that basic human decency, respect and freedom are compromised for all.

If there’s one thing we need to get better at as a society, it is understanding the physical consequences of informational actions. If you’d keep something private in the physical world, then you should have the ability to do that in the digital realm too. If you wouldn’t say something to someone’s face, you shouldn’t say it online. And if you think your online life is different from your physical life, then it’s time to start remembering that all these things interact in the one physical world we live in.

Why we need to rebuild the internet

In life and in business I believe in a few guiding principles. Two I like in particular are very common across cultures:

  1. Create more value than you extract.
  2. Treat people the way you’d like to be treated.

I imagine everyone reading this would agree. Now let’s consider this juxtaposition:

What a CEO says: “We want to build a more open and connected society”

What a CEO does: Buys the four houses surrounding his own in order to protect his privacy.

Someone who sells privacy for a living – often without permission, tricking his customers into giving up more than they understand – wants to protect his own. The fact that I don’t need to mention the person’s name is telling. Well, you might say it’s not a fair comparison between how someone behaves in their digital and offline lives. Fair call, but consider the fact that up until last week the person in question could delete private messages from another person’s inbox, after the messages had been sent to and received by the other party – a privacy feature he wasn’t generous enough to give to his users. Oh, by the way, I can think of another industry where ‘dealers’ call their customers ‘users’. In both, we get our minds messed with in ways we can’t understand and end up addicted and worse off.

It’s a well-known technological trope that data is the new gold – an entirely new class of asset. And that’s where the problem lies. This asset class is so new, few people understand it. We could liken this to the Age of Discovery, when imperialists took control of abundant natural resources – resources which were viewed by the conquered as something no one could really own or control.

The net result is the greatest wealth-creation event in the history of humanity. The internet has resulted in a massive centralisation of control and spawned the era of the data imperialist. Even those who understood the power of data have far less chance of leveraging it on their own, because of the dramatic impact that network effects and zero-cost digital transfers both have in creating a winner-takes-all economy. To quantify: the 4 founders of the top 3 technology companies since the dot-com era have a collective net worth of $281 billion as of today.

The internet needs saving.

What started out in all probability as altruism – the dream of a free web funded by advertising – has become a nightmare panopticon, and it’s time we pushed back. Hard.

Technology stalwart and all-round good guy Jaron Lanier says we can no longer call these companies social networks, but ‘Behaviour Modification Empires’ – services which use algorithms to make us stay longer by giving us sugar hits of fear, jealousy and other powerful negative emotions. Lanier also says that we can’t have a society where, if two people want to communicate, it can only happen if it is financed by a third party or corporation selling advertising. It’s worth investing 15 minutes of your time to hear him talk about it here.

But I will add a little more to his talk… the missing piece. Personally, I hope Facebook isn’t fixed. It’s only when something stays broken that we get a chance to put something better in its place. For me that would be a social network that no one owns or controls – something funded by the people using it, without a financial corporate imperative shaping our most valued human asset: our interactions.

We need each other.

Steve.

It’s time for digital organics – Algorithms are the MSG of the modern age

Increasingly, our lives are shaped by the secrets companies keep. The corporate secret du jour is algorithms.

These secret algorithms are designed to do two things:

  1. Make us like the product more.
  2. Improve the profit of the company via the algorithm.

(Objective 1 is only ever designed to facilitate objective 2.)

No doubt you’ve heard the word ‘algorithm’ bandied around recently in the media, but unless you’re involved in tech or have had someone explain them to you, it is difficult to know what they are, what they do and why you should care. The definition that comes up at the top of a Google search is a pretty good one:

Algorithm: A process or set of rules to be followed in calculations or other problem-solving operations, especially by a computer.

For the most part, algorithms are a damn convenient tool in an era where the amount of data is exploding and we need shortcuts. By the way, an algorithm helped me find that definition too: Google’s algorithm returned the definition inside its own search results, removing my need to leave Google and go to an online dictionary – aka a competitor.

Rule 101 for algorithms is pretty simple: they are designed to benefit their creator. If they can serve the end customer too – well, that’s a bonus. The problem, of course, is that customers don’t know what they didn’t see or which decisions have been made for them. It’s hard to make an informed choice when the algorithms increasingly make those choices for us via filtered options.
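
As a toy illustration of that rule (entirely hypothetical – this is not anyone’s actual ranking code), here’s how a results ranker can quietly blend the platform’s margin into the ordering, so the user still gets an answer but never sees the weighting that pushed the independent option down the list:

```python
# Toy results ranker: relevance serves the user, margin serves the platform.
# The user only ever sees the ordering, never the weights behind it.
results = [
    {"name": "independent dictionary", "relevance": 0.95, "margin": 0.00},
    {"name": "platform's own snippet", "relevance": 0.90, "margin": 0.60},
    {"name": "partner site with ads",  "relevance": 0.80, "margin": 0.40},
]

def rank(results, margin_weight=0.5):
    """Order results by a hidden blend of user relevance and platform profit."""
    return sorted(results,
                  key=lambda r: r["relevance"] + margin_weight * r["margin"],
                  reverse=True)

for r in rank(results):
    print(r["name"])
# With margin_weight=0 the independent dictionary would rank first;
# with the hidden weight, the platform's own snippet wins.
```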

A little over 6 months ago, I wrote about the fact that we will need to open up the black box of algorithms if we want to maintain a democratic society – yes, it’s that important. Before anything physical happens in our world, something informational always happens first.

A recent landmark federal court case in Australia focused on a poker machine called Dolphin Treasure, whose manufacturer and casino operator have together been accused of misleading gamblers about their chances of winning. This is essentially algorithms on trial. It’s the start of something much bigger, and we can expect to see our most successful and revered technology companies’ algorithms on trial very soon. All it takes is a little more understanding by the public, and some front-page news of algorithms gone wrong where there is blood on the floor – and sadly, it will happen. In many lower-profile cases it has happened already.

Here’s what we can expect to see in corporations around the world: C-suite-level executives emerging to build better algorithms and to understand the algorithms in the market they need to deal with. Boards will need algorithm experts on their rosters – and will start appointing them.

Here’s what we can expect to see from the ambulance chasers: hidden algorithms that deceive consumers or cause them financial or physical harm becoming the target of legal cases – a new angle on misleading and deceptive conduct.

Here’s what we ought to expect from each other: to educate each other on the good, the bad and the ugly of algorithms so we can help shape a world we want to live in – like we did with food and other suboptimal corporate behaviour patterns.

Here’s what I’d like to see from entrepreneurs: launching services that benefit users, with ‘sans algorithm’ as a key selling point or with the algorithmic ingredients on clear display – a new form of Digital Organics… inventing a new market and making the entrepreneurial profits they deserve by doing it.

What we need to remember is that every problem presents a new opportunity for nimble entrepreneurs. The way for business people to steer technology from its current trajectory onto a new path is to say ‘no’ – we want and deserve more than what you’re giving us, and we are going to be the people who build it.

Check out my new book – The Lessons School Forgot – to redesign your own future.

The start of the end of the screen – Google Home

Why is no one talking about the things that really matter with Google Home? Like how it changes the economy, and how it might have the kind of impact mobile apps did on our web habits. I’ve read a number of articles about the Google Home device being launched in Australia this week. Lots of them discuss the effectiveness of the natural language processing and which apps it works best with. Like this article and this article. None of them seem to cover the issues that really matter on the topic. So here they are.

Ambient Computing: This is a shift away from typing to talking. We are now entering the age of ambient computing. The killer apps for interacting with artificial intelligence have just shifted from eyes and fingers to mouths and ears. This is the start of a permanent change in the way humans interact with intelligent machines. The shift is as big as the smartphone was. The only difference is that this one will take a little longer to establish itself. The reason it will take longer than the smartphone did is that there isn’t a direct substitute for such home devices. The smartphone had the advantage of replacing a tool we all already used – the feature phone – most of which had a 12–24-month replacement cycle, as items under contract typically do. Therefore, we can put this device in the Amara’s law category – a bit slower to take hold, but once these devices arrive en masse, the impact will be greater than most people suspect.

The smart home killer app: Every new regime in technology requires a centrepiece technology to augment and coordinate disparate devices. The graphical browser ushered in the era of the World Wide Web. Google Home and friends – namely Echo and HomePod – are the devices that will usher in the era of the smart home: a home where everything functional, mechanical and electrical will interact with the web. This is where we can expect to move to renewable energy faster than most predict. Currently, just under half the energy we consume in the home is wasted. We don’t need more efficient PV solar panels and larger batteries; what we need is homes that know how to efficiently allocate energy and resources to the devices inside them.

So what does a smart home look like? It’s a place where almost everything has computational capacity, where the home knows everything that’s in it and efficiently allocates energy and activities based on what it learns. We can expect energy usage in a truly smart home to decrease by at least 30%. When technology makes our homes more efficient, the value equation and the ability for renewables to create an off-grid solution increase exponentially. A positive cycle of both demand- and supply-side efficiency may change how we power our homes ahead of schedule, due to the arrival of complementary technologies. We can expect the centrepiece AI to be a party to the dismantling of the coal and fossil fuel industries. Disruption is horizontal – it is usually a juxtaposed technology which changes things unexpectedly.
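
To give a feel for what ‘efficiently allocating energy’ might look like in practice, here’s a minimal, purely hypothetical scheduler: it knows each appliance’s draw and flexibility, and shifts the flexible loads into the hours when rooftop solar is forecast to be strongest. It’s a sketch of the idea, not any vendor’s product.

```python
# Hypothetical smart-home scheduler: run flexible appliances in the hours
# with the highest forecast solar output; fixed loads run when asked.
solar_forecast_kw = {8: 0.5, 10: 2.0, 12: 3.5, 14: 3.0, 18: 0.2, 21: 0.0}

appliances = [
    {"name": "dishwasher",    "draw_kw": 1.2, "hours": 1, "flexible": True},
    {"name": "ev_charger",    "draw_kw": 3.0, "hours": 2, "flexible": True},
    {"name": "oven (dinner)", "draw_kw": 2.5, "hours": 1, "flexible": False},
]

def schedule(appliances, solar):
    """Assign each flexible load to the sunniest remaining hours."""
    sunny_hours = sorted(solar, key=solar.get, reverse=True)
    plan, used = {}, set()
    for a in appliances:
        if not a["flexible"]:
            plan[a["name"]] = ["as requested"]
            continue
        slots = [h for h in sunny_hours if h not in used][: a["hours"]]
        used.update(slots)
        plan[a["name"]] = [f"{h}:00" for h in slots]
    return plan

print(schedule(appliances, solar_forecast_kw))
# e.g. dishwasher -> 12:00, ev_charger -> 14:00 and 10:00, oven runs at dinner time.
```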

The end of SEO: Once people start talking to their devices and asking for and expecting verbal responses, being on the homepage of Google becomes irrelevant. There won’t be a page at all. In a world of ambient computing, we need to be the first recommendation that gets returned audibly. Which means any brand, product or service hoping to be recommended by a search engine needs to be asked for by brand, or be the best in category. Even worse, companies like Amazon and Google might not care what’s most relevant, and instead start recommending what is most profitable. So long as it ‘solves the problem’ of the end user, they’re most likely to give them the highest-margin option – for themselves. Remember, Google promises not to be evil – to its shareholders at least. SEO will become VPO – Voice Pod Optimisation – a game where only a single option is mooted to the end user.
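
As a hypothetical sketch of that dynamic (not any real assistant’s logic), here’s how a single-answer voice response might get chosen: a brand asked for by name gets through, otherwise whatever clears a ‘good enough’ relevance bar with the best margin is the one option read aloud.

```python
# Toy voice-assistant answer picker: exactly one result is ever spoken.
catalogue = [
    {"brand": "LocalRoast", "category": "coffee beans", "relevance": 0.92, "margin": 0.05},
    {"brand": "MegaBeans",  "category": "coffee beans", "relevance": 0.85, "margin": 0.35},
    {"brand": "HouseBrand", "category": "coffee beans", "relevance": 0.80, "margin": 0.50},
]

def pick_answer(query, catalogue, good_enough=0.75):
    """Return the single option the assistant will read out."""
    # A brand asked for by name always wins its slot.
    for item in catalogue:
        if item["brand"].lower() in query.lower():
            return item
    # Otherwise: anything 'good enough' for the user, ranked by platform margin.
    candidates = [i for i in catalogue if i["relevance"] >= good_enough]
    return max(candidates, key=lambda i: i["margin"])

print(pick_answer("order coffee beans", catalogue)["brand"])             # HouseBrand
print(pick_answer("order LocalRoast coffee beans", catalogue)["brand"])  # LocalRoast
```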

Privacy on steroids: This is the moment when we allow multinational corporations with backdoor pipes to governments to hear every word in our homes and learn our every habit – all of which is permanently recorded. And if you think this only matters for people committing crimes, then never forget that the most extreme externalities are those we can’t plan for, or even predict. If this isn’t enough to convince you to think twice about privacy, this little post might at least open the mind a little. Privacy and secrecy are not the same thing.

Given these changes aren’t in the maybe category, best we start acting on them now.