The Attention Scoreboard

Scoreboards change everything. Once we have a metric in place, whatever improves that metric becomes the focus of what we do. We do it with sport, we do it with money, and now we do it with content. A metric we all know is how many views, on average, our social posts attract.

Attention wins

Once upon a time, great content was the focus. Content is king, we’d all chant ad nauseam. The implication was that the best stuff would naturally bubble up to the top. However, humans are weird and emotional creatures. We have certain neuroses we can’t help but serve. As a social species, we are infused with the need to be part of a group to survive. When others are engrossed in something new, of course our curiosity is piqued. In this era, we can see exactly how many people are paying attention to something. This scoreboard sits under every photo and video. It almost doesn’t matter what it is, so long as people are watching. Algorithms perpetuate that further: something that is popular becomes more popular, because it is popular.

This presents a real challenge for people who want to produce content that is important, but perhaps not quite as entertaining. While a business may insist its primary goal is the quality of its content, it’s hard not to be swayed even more by the number of views it receives. When Tommy and I launched our TV show, The Rebound, every sponsor told us their focus was delivering great insights and hacks to the audience, yet the question they couldn’t resist asking first in every review was, “How many viewers did each episode get?”

Know your metric

All this comes back to your ‘why?’ Why are you creating content? The truth is it isn’t that hard to go viral. I have a handful of YouTube videos that have clocked up well over a million views. We just need to appeal to the market of emotions and be as extreme as possible. If attention is the key metric, then we just follow where the attention economy takes us. Get on a bandwagon, roll the dice enough times and eventually you’ll score. But for what purpose? If entertainment and attention are your key metrics, then that’s exactly where your focus should be.

On the other hand, if you’re not aiming for the mass market, then deeper consideration is required.

I’m after an audience who want knowledge and seek consulting services and keynote speeches from me. We need to decide what we are chasing before we start creating. We need to resist the temptation to chase numbers, otherwise the attention scoreboard redefines what we do and who we are, with unintended consequences.

Keep Thinking,

Steve.

The Social Media Game Show

While undertaking my TikTok experiment, I’ve noticed my feed being filled with ‘random acts of kindness’ – and I’m wondering: how random can an act be when it is clearly a content play?

Contrived Acts of Kindness

For the uninitiated, these ‘random acts of kindness’ are a bit like a new genre of game show. The acts often include:

  • Handing bunches of flowers to people then walking off
  • Giving people money on the street, sometimes thousands of dollars
  • Paying strangers’ bills in supermarkets, gas stations or shopping centres
  • Asking people what makes them happy – and then buying it for them
  • Arriving at homeless encampments and giving goods to the residents
  • Driving round town looking for people to help… always with gifts or money

Of course, the camera is never off. The video is often made with someone hiding around the corner, and it’s mostly uploaded without permission. The whole thing is an act, and in my view, a distasteful one. They are turning people’s lives into content. This is what the Attention Economy has created: a world where the only goal is to get views. A world where it matters not what people are watching, just that they are watching.

The ugly reality is that these so-called acts of kindness are just a cost of doing business: a small financial outlay that is later offset by the monetary gains the creator’s channel generates. In today’s digital age, attention equals money.

If you manage to garner enough eyeballs around what you do – people will pay for access to your audience. It’s old school media, in a micro way.

Fee for Audience

On TikTok the going rates for an audience are quite significant. People’s reach can be accessed for fees based on their level of ‘influence’. Typical costs per single post are below:

  • Nano: $50–$300 (1k–10k followers)
  • Micro: $300–$1,250 (10k–75k followers)
  • Mid: $1,250–$3,500 (75k–250k followers)
  • Macro: $3,500–$12,000 (250k–1m followers)
  • Celebrity: $12,000+ (1m+ followers, celebrity status)

Just last week, a famous TikToker known as @LifeofHarrison gave flowers to a woman he described as an old lady by asking her to hold them, then said ‘have a lovely day’ and walked off.

This garnered 65 million views, with the comments fawning over the act. The woman, Maree, however, said she felt dehumanised and didn’t like being described by others as an old lady. In a TV interview, @LifeofHarrison, aka Harrison Pawluk, said, “I want my content to make people feel happy…” Interesting that the word ‘content’ featured in that sentence, rather than him simply wanting to make people feel happy.


A Legal & Ethical Minefield

While it is legal to film someone in a public space, it’s illegal to use people for commercial purposes without their explicit consent. When we record my TV show The Rebound, every person who appears needs to sign a release form. The only exception is if you’re part of a larger public audience at a place like a football match. It’s another example of technology companies having a different set of rules to traditional media. But we all know they’re really just media companies in disguise.

Of course we shouldn’t blame viewers for watching these acts – they just want to see something positive. But we should also remember that the creators have been shaped by the media landscape they now live in. First we shape our tools; thereafter, they shape us. While I don’t question that most people would be happy to receive something valuable for free, it does seem a little opportunistic, and it avoids all of that difficult stuff like working on the structural issues that would alleviate hardship. I guess real solutions don’t really suit the short video format.

But if we want to teach our kids anything in a surveilled society, it should be that what we do when no one is watching is what really matters.

The only problem is that the spaces where we can do something without anyone watching are in rapid decline.

Keep Thinking,

Steve.


Technology Externalities – Q&A

Last week I did a rare keynote for a key regulatory body where I was asked to go deep into technology externalities. Afterwards we had a Q&A session, and over email I was asked a number of additional questions, many of which I’m sure you’ve wondered about yourself. So here they are!

  1. What would be the “vaccine” for a digital pandemic? For me this would be the global implementation of blockchain-based technology, for two reasons: (1) blockchain could allow for cold (offline) storage of each block, and (2) it is the only fully distributed data storage system with the highest levels of cryptography. If everything went offline we’d have a rational starting point to reboot from. But in truth we need an off switch. Digital security is not possible without analogue optionality. True digital security requires physical replication and/or isolated mechanical (non-digital) operational ability.
  2. What technology that we currently rely on do you think is most at risk of becoming redundant? The energy grid. With the exponential improvement in renewables and battery storage (graphene and other emerging storage solutions), we will very soon move to localised energy generation and storage systems. In this instance each home, office, building and factory will generate and store its own energy on premises, just as we each have our own computer systems. However, an energy trading system will emerge where we can generate and sell energy across wires directly to other places that need it immediately. Like our computers, we will have the equivalent of ‘hard drives’ – batteries – and some ‘cloud storage’, but mostly we’ll have enough storage locally, and only big industry will need big storage solutions and the trading of kWh. We’ll buy and sell energy directly with each other, in the same way we trade content and information today.
  3. As we move further into the shiny new digital world and digital twinning, are we more likely to de-prioritise the physical world? No – I think it will facilitate and create more attention to physical spaces. COVID also reminded us that the physical world is vital and we can’t operate in pure isolation or without certain physical realities. By not trying to replace our physical world but to augment it, it will equalise attention and maybe bring the physical back as a focus, because all physical things will be augmented digitally. Digital won’t be a place we go to but, like an atmosphere, something we live under.
  4. We influence but don’t make policy. What penetration have you had in Canberra? Policy is a function of prevailing social sentiment and narrative. As we’ve seen with diversity, climate policy and other social issues, it sometimes takes decades before issues are acted upon. The most important function of a society is to share concern and raise the profile of issues which are important to our collective. That is the first task of change – in some ways ‘markets evolve from conversations’. I’ve worked with government on some issues at a state and federal level, but big tech power seems low on the priority list at present. My personal view is that this is because many policy makers don’t understand the potential longer-term consequences, and we haven’t had many local industries directly upended by it. We can see that it has only been prioritised so far with news. This was because we have a powerful lobby here wanting to protect that industry and its advertising revenue. To this point, powerful lobbyists have been more effectual than concerns about consumer intrusion or longer-term surveillance risk. I’d also add that governments having access to the data and tools big tech have in every consumer’s pocket could provide a perverse incentive to turn a blind eye to other downsides.
  5. What are your thoughts on big tech self-regulating on issues such as dis- and misinformation? No for-profit industry has ever self-regulated out of the goodness of its heart in the history of capitalism. Wow – I said it. We should not expect it to happen now. To date, their efforts have been to maintain control by ensuring their own AI systems are the solution to the spread of misinformation – which so far has been largely ineffectual, and it seems clear the problem can never be solved by AI in isolation, which is reactive in nature and needs training for every new problem. Their strategy (big tech, e.g. Facebook) has been to delay and obfuscate, and sadly, it is working. ‘We need to do better’ gets rolled out with every hack. Big tech and all self-publishing platforms need to be responsible for all the content on their sites, just like McDonald’s needs to ensure the teenagers making their burgers don’t poison anyone. Profitability and their business model should not matter in making this decision. A simple solution would be an onboarding process where all publishers and people need to be verified with 100 points of ID, and all corporate advertisements in said channels approved by an actual person and not an AI. These companies only became so big because of the lack of regulatory restriction or oversight. They should be treated the same as any publisher.
  6. How do we get ourselves unhooked from devices and back to reading books? Discipline. It’s not easy and no different from choosing the right food out of our fridges and cupboards. The depth of the crisis, although obvious now, won’t be acted on for a generation, just as with obesity.
  7. Gig economy, work from home, virtual companies. It’s hard to regulate a loose affiliation of people, e.g. Bitcoin. We are used to regulating companies – what do you think? Just like the taxation system, we need to develop regulatory models designed for both individuals and corporations as we enter the new economy. As the ways individuals and companies can participate in the market expand, we need to regulate accordingly, some of which will afford the populace protections from corporations – gig workers, for example. Here we might be able to provide portable employee benefits on all work regardless of employment status. We could apply a percentage loading on the money paid for every individual task or gig, so that all forms of work completed in a freelance-oriented marketplace have money paid into a system which administers annual leave, sick leave, health benefits, superannuation and so on. We may also need to protect individuals from themselves with emerging industries like fintech (Buy Now Pay Later, and crypto gambling – yes, it is gambling). What is certain is that we need to regulate loosely affiliated people based on the intention and outcome of their activities, and not force them into industrial-era corporate structures. New eras need new definitions, and matching regulations to cope with structural shifts. It won’t be easy, but it will be necessary.
  8. What’s your view – does social media do more harm than good? I think social media does more good than harm, but the ratio of harm is far too high. A very large percentage of the content on the platforms is fake, untrue, sensationalist, enraging, divisive, and often other people’s content which was stolen without permission and monetised. The corporate loophole is that the results of social media interactions are often two or three steps removed from the social media forums themselves, so the consequences, and their responsibility for what happens after the digital interaction, come with plausible deniability. Take, for example, the correlation between increases in teenage female suicide and social media usage within this cohort. Even if the ratio of good to evil were, say, 90 to 10, at this scale (Facebook has 2.3 billion members) that could be a very real problem for society – the rough numbers sketched below make the point. The anti-vax movement has used these tools to gain a lot of traction and has had a real impact on COVID vaccine hesitancy, which is having an immediate impact on the rollout. I’d hazard a guess that at least 20% of content on social media is bunk – it is very difficult to determine this, as the algorithms and data are internal corporate secrets. For social media not to have a negative societal impact it would need to be 99.9% free of misinformation. That could only be achieved with very clear ‘road rules’, auditing and regulation. We’d need something like forensic data inspectors, similar to OH&S inspectors in a factory. In addition to that, the business model of free services creates a market of surveillance capitalism, which I’m certain will not end well.
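To show why scale matters here, a minimal back-of-envelope sketch in Python – the 90/10 split and the 2.3 billion figure are simply the hypothetical numbers quoted in the answer above, not measured statistics:

```python
# Rough scale arithmetic for the "90% good / 10% harmful" hypothetical above.
# All inputs are the illustrative figures from the answer, not real measurements.

users = 2_300_000_000          # Facebook's claimed membership at the time of writing
harmful_share = 0.10           # hypothetical 10% of interactions that cause harm

people_affected = users * harmful_share
print(f"{people_affected:,.0f} people potentially exposed to harm")  # 230,000,000

# Even at 99.9% "clean", the residue is still enormous at this scale.
print(f"{users * 0.001:,.0f} people at a 0.1% harm rate")            # 2,300,000
```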

– – –

Keep Thinking,

Steve.

Food, Data and Modernity

People are driven by scarcity. Things of value, with limited availability, drive a strong desire for more. Information used to be like that. We had very few channels for accessing knowledge. It used to be difficult to find esoteric content, but once we found it, it was usually of high quality. Information in today’s world has done a complete turnaround: now it’s easy to find information on any topic, but much harder to rely on its quality.

It’s as if we are so thrilled to find information that matches our interests and existing opinions that we rarely stop and consider what we’re feeding our minds. We are bingeing – we are becoming addicted. And sometimes, it’s an all-you-can-eat buffet of informational bullshit.

While information can be wonderful and powerful, it’s a lot like food: if we consume the wrong stuff, it can have a massive impact on our wellbeing. We’re now entering the era of ‘digital obesity’: a world full of people consuming the wrong information in copious quantities, often facilitated by those who profit from the distribution of bad content.

It’s not the first time we’ve faced a problem like this.

Up until about 100 years ago, very few people had more food than they could eat. But once food became heavily industrialised and super cheap, we indulged in excess calories. For the past 70 years, humans in developed economies have had access to much more food than they needed. The net result is more shocking than surprising: around the world today, more people eat themselves to death than starve to death. The problem, of course, is that we’ve been programmed over the past 200,000 years to eat as much as we could, whenever food became available, simply to stay alive. Our DNA evolved to cope with periods of feast and famine. Today, it’s just a feast for most people in developed economies, and the biggest health problems facing our species are the result of over-eating.

The good thing is that, now we’re aware of the downsides of having too much to eat, we’re adapting. We’re re-educating each other on what good food looks like, how to resist the junk and how to resist eating more than we need. So many processed foods are calorie-dense and nutrition-poor that they trick the mind into craving the wrong stuff.

Maybe it’s the same with ‘processed’ information? We are getting sugar rushes with every click, but we are not providing our minds with the nutrients they need to grow and sustain themselves. We also need to learn to leave some information on the table. It seems the shift from scarcity to excess (in many forms) is an endemic problem of modernity. We’ll have to keep adapting to resist the excess and find the quality. While it’s not our fault we’ve reacted this way, if we are at least aware of it, we can make a concerted effort to feed our society and our brains the nutritious content our minds really need.

Algorithmic Bias

It’s easy to think machines have some kind of impartiality to them given they are, well, machines. But anything built by humans has a human inside it. Algorithms are no different, and just like us, they are filled with bias.

Algorithm – a word once confined to university mathematics departments and computer labs – now takes pride of place in every second tech news article, determines what you see online, and is the reason you received this email at 7am Australian Eastern Standard Time. So it pays to understand what algorithms are, the impact they have and the biases they’re so often driven by.

So let’s go back to the start – what are algorithms in simple terms?

Algorithms 101 – An algorithm is a set of step-by-step instructions used to do something or make a decision.

With a definition this broad you can see that we humans use them every day. Even sorting the laundry into darks, colours and whites, and washing each pile separately, is technically an algorithm. The steps to cook something (a recipe) are an algorithm. Where computers come in is that they can follow a very large number of steps, on a very large data set, and make decisions quickly and precisely. (In the clothing example above, the data set is the clothes and their colours, and the steps decide where to sort each piece of clothing.)
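The laundry example translates directly into code. Here’s a minimal sketch in Python – the colour categories and garments are just made up for illustration:

```python
# A minimal sketch of the laundry-sorting algorithm described above.
# The data set is the pile of clothes; the steps decide which basket each item goes into.

def sort_laundry(clothes):
    baskets = {"darks": [], "colours": [], "whites": []}
    for item, colour in clothes:
        if colour in ("black", "navy", "charcoal"):
            baskets["darks"].append(item)
        elif colour == "white":
            baskets["whites"].append(item)
        else:
            baskets["colours"].append(item)
    return baskets

pile = [("t-shirt", "white"), ("jeans", "navy"), ("jumper", "red")]
print(sort_laundry(pile))
# {'darks': ['jeans'], 'colours': ['jumper'], 'whites': ['t-shirt']}
```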

Algorithmic bias occurs when a computer system reflects the implicit values of the humans who are involved in coding, selecting and collating data to execute the algorithm. The emergent problem with the algorithms in big tech is that they’re designed to achieve corporate outcomes, not societal ones. Their values are simple: to make as much money as possible.

Algorithms now run so deep and cross-reference so much data that what we input has little to do with the outputs we receive. What we now have on the web is ‘the illusion of choice’. It isn’t just our feeds on social forums which are decided by algorithms. Even what we search for is biased towards corporate algorithmic design parameters. Just search on Google for anything you want to buy, in any category, and the first bias is plain to see – it assumes you are after the cheapest possible version of every item: clothing, sneakers, airline tickets, hotel rooms, you name it. Apparently we all want the cheapest version of everything, no matter what it is. Even if you put the words ‘high quality’ before the primary search term, you’ll still be guided by price. It’s not until we get very specific with words like ‘expensive’, or search for specific brands, that we can find what we might need. Another bias in search is recency: search will always show the latest version or story of anything and anyone unless a clear timestamp is included.
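As a purely illustrative sketch (the weights, field names and numbers below are hypothetical – this is not how Google or any real search engine is actually built), here is what a ranking function with a hard-coded ‘cheapest and newest first’ bias looks like:

```python
# Hypothetical illustration only: a scoring function whose hard-coded weights
# bake a "cheapest and newest wins" bias into every result it ranks.

WEIGHTS = {"price": 0.6, "recency": 0.3, "relevance": 0.1}  # chosen by the platform, invisible to the user

def score(result):
    price_score = 1.0 / (1.0 + result["price"])        # cheaper scores higher
    recency_score = 1.0 / (1.0 + result["days_old"])   # newer scores higher
    return (WEIGHTS["price"] * price_score
            + WEIGHTS["recency"] * recency_score
            + WEIGHTS["relevance"] * result["relevance"])

results = [
    {"name": "budget sneaker", "price": 30, "days_old": 2, "relevance": 0.5},
    {"name": "quality sneaker", "price": 180, "days_old": 40, "relevance": 0.9},
]
for r in sorted(results, key=score, reverse=True):
    print(r["name"])
# Prints "budget sneaker" first, even though the other result matches the query better.
```

The bias isn’t in any single line of code; it’s in the weights someone chose and never has to show you.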

When we look at social feeds, it’s clear that their algo-game is built on emotional leverage: birthdays, parties, engagements, births, deaths, family events and, of course, controversy. These stimulate engagement and keep us on the site longer. Our desire to feel loved, important and often enraged is all that matters to them.

While these examples seem innocuous enough, the proliferation of an algorithm-based society is reinforcing many social biases – around gender, race, ethnicity and economic status, to name a few. The canary in the coal mine is dead, the miners are still digging, and yet Silicon Valley is still making bank unfettered. So what should we do?

Like all technology, algorithms are neither good nor bad – they’re just tools. Tools that need to be civilised with some metaphorical workplace health and safety guide rails. They are here to stay, and so our best bet is to make them better. I see two paths forward:

  1. Change – We need to push for transparency on algorithms: know what’s in them and have the ability to turn them off on demand. I don’t care if the algorithm owners are for-profit corporations – we can and should be able to regulate their output as much as we can a box of cornflakes. No company ever made a decision which reduced its profit until society made it so. It’s time we pushed for big tech to air their dirty laundry.
  2. Opportunity – We need to remember that every flaw in an industry, every broken promise or self-serving design leaves the door ajar for a nimble entrepreneur to make a more respectful version of the product we’ve got little choice on. Algorithm-based tech is no different. Maybe it’s just me, but I think the world is waiting for a social platform we can trust that isn’t designed around extracting unlimited hours and outrage from the people it’s supposed to serve.

Above all, we should never forget that capitalism works best when it is guided by society, not the other way around.

If you like this blog post – forward to a friend who will dig it.

Why other industries need to call out Facebook’s advertising policy

Let’s for a minute imagine these as Corporate Policies:

Car Manufacturer: We’ll take a car off the road if an unsafe model gets out of the factory and is sold, but we can’t promise all our cars are safe until you start driving them. If you see an unsafe car out there, please let us know. 

Fast Food Outlet: If our pizza has salmonella or listeria, you can return it, but we can’t promise all our food is safe to eat. If you get sick or know someone who did, please let us know and we’ll take the pizza back. 

Packaged Goods Manufacturer: If our shampoo has chemicals that are unsafe and burn your head, we’ll change the formula, but we’re not sure until we sell it if it’s OK. If you see anyone with a burned head, ask them what shampoo they used, and if it’s our brand, we’ll happily take it off the shelf.

This is essentially what Facebook Inc. have just announced as their Global Policy for Advertising. All I’ve done is paraphrase their policy, and changed the product and industry. Here it is below for your reference:

Joel Kaplan – Global Policy VP

“We try to catch content that shouldn’t be on Facebook before it’s even posted, but because this is not always possible, we also take action when people report ads that violate our policy”

Facebook claim it isn’t possible for 2 simple reasons:

  1. Because it isn’t profitable for them to check every advertisement before it goes out.
  2. Because they haven’t been regulated in the same way other media organisations are.

While I understand that 2 billion people’s comments can’t be moderated before they’re published, maybe paid advertising on Facebook should be. Facebook at least ought to be held to account financially when their ‘platform’ creates problems for society. Their current MO when anything outside their policy happens is ‘oops, sorry about that’. They get away with it because society and regulators let them. A good starting point to fix this is to start calling out Facebook for what it actually is – a media company, not a technology business. There is a certain responsibility that goes with being a media company and its resultant influence, yet Facebook continues to flout the responsibility that is incumbent upon such power. To call it a technology company is ridiculous. All companies employ technology – Boeing and Ford have a far greater breadth and use of technology than Facebook, but at least they admit they sell airplanes and cars. Facebook sells advertisements to their audience, not technology – seems like a media company to me.

It’s also worth noting that the update to Facebook’s policy, resulting from the controversy surrounding fake ads and alleged Russian influence on the US election, didn’t address the problem of false information, only the ‘transparency’ of what was published and promoted, and who did it. The extreme targeting possible on Facebook is itself one of the problems: those likely to spot a misleading advertisement are unlikely to see it. In this sense the promise of transparency is a moot point. A further quote from the statement, in relation to advertising via Russian accounts, is quite telling:

“All of these ads violated our policies because they came from inauthentic accounts”

Not because the information was misleading? And further on…

“Our ad policies already prohibit shocking content, direct threats and the promotion of the sale or use of weapons.”

Apparently advertising false information is OK? No mention of it anywhere… You can read it for yourself here. 

While Facebook promises to create a more open and connected society, it is in reality creating a more siloed and disconnected society. When governments first gave out spectrum at the birth of the TV era, it came with the responsibility of providing unbiased news and balanced data on issues affecting society. We didn’t let the idea of innovation or new technology interfere with creating the kind of society we all want to live in.

I think social media is one of the most amazing things to evolve in my lifetime. The power provided through connection and sharing thought has helped me re-invent my career, find like-minds and gain knowledge that just wasn’t available in the mainstream media era. For that I’m grateful.

But it is time that we took its power more seriously. It’s time to add seat belts and brakes to the data vehicles driving our lives and admit that no technology out of control or without failsafes ever benefits society.

– – –

If you liked this post – you’ll dig my new book – The Lessons School Forgot – a manifesto to survive the tech revolution. 

Why I'm using Snapchat

Snapchat Billboard

I got a message from Richard on LinkedIn recently regarding my use of Snapchat; here it is below:

Steve, I noticed around the time of this post you started posting about the benefits of Snapchat, rather than promoting your book. I get it, especially for connecting to the next generation. I also get that Snapchat is a whole new, exciting landscape & way of connecting that’s diff. to other social. I’d like to hear more about why you’ve made the switch and have started on it as a platform. Only so many hours in the day. Pls explain why time on this is good for startups. Listened to https://lnkd.in/bXKk5D4 but would love YOUR take 🙂

Here’s my response:

Hey Richard,

Props for paying attention to my stuff and thanks for reading. I started to mix it up a bit at the bottom of my blog posts. Still some book, but mostly Snapchat. I promote the book less because it’s getting old and I’m writing my next one… is all. The reason for Snapchat is that Twitter has very little engagement these days. I have nearly 7,000 genuine followers on Twitter – I never chased a follow, ever; they all came to me – yet only about 300 people get the opportunity to see my tweets, and about 7 of them click or engage. Pathetic. With Snapchat, in just a few weeks, I get around 50 engagements: people literally looking at my story. Snap is simply replacing my Twitter. But more than that, we need to be nimble and change forums when the disco gets too crowded and noisy and no one can hear each other speak. For me it’s not about connecting to the next generation; it’s more that my generation are now using Snapchat – it’s my old Twitter audience changing which nightclub they dance in. Another reminder that we need to be loyal to our audience, and not worry about the forum or the way they want to connect. I was an early naysayer of Snapchat and its value, but their ‘My Story’ feature (24 hours of snaps) changed its value proposition for me – and when the facts change, I change my mind. In fact, I’m gonna blog this… trust you’re OK with that… 🙂

Steve.

P.S. If this blog disappears quickly, it’s because Richard wasn’t OK with it!