
Tech | August 4, 2018

One simple hack to (maybe!) cure your phone addiction

Boring phone (left) vs exciting phone (right)

After years of watching powerlessly as his iPhone took over his brain, Duncan Greive found a way of getting it back.

Last Saturday I was sitting with a friend, patting a puppy while she idly thumbed her phone. I was a few feet away, but it sure looked like the screen was black and white. She confirmed that my eyes weren’t deceiving me. “I made it grayscale,” she said. I teased her about it, operating on the (fair, I felt) assumption that she’d done it to have a chic phone. She replied that it was instead an attempt to make her phone less exciting. “It gets rid of the dopamine hit,” she said. And I was suddenly very interested.

She showed me how to do it in my accessibility settings – instructions at the foot – and, soon after, my phone was boring too.

I did this because I hate my phone. I mean, I love it – I use it all day for a million things and get itchy when I’m away from it for too long. But I hate what it’s made of me, the unthinking actions it induces. Checking Slack all weekend in case someone’s said something funny (they mostly have). Trying to stay on top of emails which will only ever precipitate more emails. The mindless scrolling through pictures of other people’s kids when my real-life children are sitting right there. And all the while knowing it was some base chemical thing, an element of primordial ooze-learned behaviour I have proven myself completely incapable of outrunning.

I’ve tried things, obviously: deleting apps, changing the password to Twitter, turning off almost all notifications. None of it really worked as well as I hoped.

(It’s not that I have a problem with other people using their phones, incidentally – I’m a firm believer that humans should do as they please so long as it doesn’t harm other humans in the process. If you love your phone and it doesn’t make you feel kinda sick then I’m so happy for you. I just can’t escape the vague but unmistakable sense of having willingly sold my attention to the shareholders of a few Californian corporations for data-gathering purposes – and that they are getting a lot more out of the transaction than I am.)

Then, grayscale. It worked, almost instantly. Instead of the phone being brighter than the world around me, it was significantly duller. Instagram in particular basically became a procession of bad art photos. But everything was worse – from Snapchat to ESPN, my phone seemed drained of life and colour – because it was. (The least affected are text stories, incidentally – so keep reading The Spinoff’s great and now free app even in monochrome!)

There are a few stories around about it, though it still seems a fairly fringe activity. One mentioned a slot machine analogy. This really stuck with me – the image of people slumped at casinos pumping money into pokies is one of the main reasons casinos have such a bad rep. Yet I imagined myself doing essentially the same on my couch, without even the chance of a jackpot to keep me there.

Paul Corballis, associate professor in the School of Psychology at the University of Auckland, thought the comparison worked. “There’s very likely to be something to the slot machine analogy. Variable reinforcement is a powerful way to maintain a behaviour,” he wrote to me. “That is, the relatively infrequent, but unpredictable delivery of something rewarding keeps you coming back to check your phone.”

He was sceptical about the extent to which colour was the driver of phone usage.

“I’d guess that it doesn’t do all that much. I’m sure the colours, flashes, animations, and such are rewarding to a degree, and changing to greyscale will disrupt that aspect of the salience of the phone. It may also disrupt the overall visual salience of the device – making it stand out less from the background and capture less of our visual attention. These effects are likely to be small compared to the overall effects of variable reinforcement, though.”

To Corballis, the parts of phone usage that depend less on colour for their reward would power through the superficial change. “People who are heavy users of Snapchat, Instagram and the like will get more benefit from the grayscale than people who are motivated by streaks, Facebook likes, and other, less visual aspects of the variable reinforcement that maintains their behaviours.”

That might be so. There’s a good chance that the specific type of phone issues I have are my own – I have almost no notifications on, I don’t care about streaks, and I read things more than I look at things. It might therefore be more powerful for someone like me than for most people. Still – it feels good that there’s an easy and accessible tool out there to mess with my behaviour. Because the attention war is real, and innumerable companies big and small have worked hard to find ways of making these beautiful objects addictive and useful in a huge variety of ways.

So far, grayscale is working for me. I’m a week in and I just look at it differently now. My phone is just not as exciting as it was. The homescreen is duller than everything around it, where before the reverse was true. When I get in to do something, I get out a lot faster. I feel a much smaller rush when I see it in the morning. It’s becoming more of a routine object in my head, like a TV remote or a chair: great, love it, wouldn’t live without it – but not the only thing in the world.

How to make your phone boring

iOS

On an iPhone, go to Settings / Accessibility / Display & Text Size / Colour Filters / Grayscale.

Android

Android phones have different methods according to your model, but it’s generally found through the accessibility menu, too.
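
If you like to tinker, there’s also a programmatic route on Android. The sketch below is a rough Kotlin example rather than an official recipe: the setting keys, the monochromacy value and the adb permission grant are assumptions drawn from developer write-ups, and behaviour can vary by device and Android version.

import android.content.Context
import android.provider.Settings

// Rough sketch: toggle the system-wide grayscale ("colour correction" / daltonizer) filter.
// Needs the WRITE_SECURE_SETTINGS permission, which ordinary apps can only be granted
// from a computer via adb, e.g.:
//   adb shell pm grant your.package.name android.permission.WRITE_SECURE_SETTINGS
fun setGrayscale(context: Context, enabled: Boolean) {
    val resolver = context.contentResolver
    // Turn the colour-correction filter on or off.
    Settings.Secure.putInt(
        resolver,
        "accessibility_display_daltonizer_enabled",
        if (enabled) 1 else 0
    )
    if (enabled) {
        // A value of 0 is understood to select the monochromacy (grayscale) mode of the filter.
        Settings.Secure.putInt(resolver, "accessibility_display_daltonizer", 0)
    }
}

For most people the settings menu is the easier path – this is strictly for the curious.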


Politics | July 30, 2018

NZ’s public sector needs to get on board with AI, or the future is bleak


Trusting machines to predict citizens’ need for targeted resources can be damaging and increase bias. New Zealand has no choice but to get on board.

When you think about it, a lot of the services the state provides are ones that you might not wish to be party to: criminal prosecution, incarceration, tax investigation, deportation, and child protection services all come to mind. Being on the receiving end of these services when you really don’t qualify can be harmful – perhaps an example of that puzzling phrase “unnecessary harm”. Even the process of establishing that you are not in “need” of such services can be unpleasant, expensive, and stigmatising.

Most people are law-abiding, so directing these services where they are needed is like finding a needle in a haystack. Relying entirely on human expertise can introduce bias or blind spots, and doesn’t allow scarce resources to be directed to best effect, since you don’t have a view of the totality of any issue, nor necessarily of what combination of policies and actions most usefully addresses it.

Using algorithms to target resources can cause, and has caused, harm.

Since the Cambridge Analytica scandal, we are more aware than ever that the information which comes to us via social media or internet search is selected by algorithms trained on our past behaviour and the behaviour of those like us. Generally, the data upon which such algorithms are trained are biased, and if the creators of the algorithms aren’t diligent, such bias will get built into the algorithm. The algorithm then becomes an instrument to perpetuate bias or prejudice. In the Facebook and Microsoft image libraries, it’s more common for people to be labelled as women in images containing kitchen objects; this builds in a propensity to identify men as women when they are next to stoves. In the US, Google searches for African-American-sounding names such as Trevon, Lakisha, and Darnell are 25 percent more likely to return arrest-related advertisements than searches for white-sounding names.

NZ is too small to not use this technology.

It’d be tempting to ban all use of algorithms for targeting “services” where a false positive or false negative is harmful. That wouldn’t get rid of bias and prejudice, but it would reduce the scale of the impact and not systematise this prejudice. Unfortunately, New Zealand does not have that luxury. Because of our small size and relative lack of wealth, our future relative standard of living depends on the effective adoption of algorithms for resource allocation.

A surprisingly realistic stock image of a teacher and students

Government resource expenditure falls into two categories: expenditure on things that scale with population, such as front-line delivery (police, teachers, firefighters etc), and expenditure on things that scale with the complexity of the society, such as whether we have a regulated currency, a legislature, social support programmes, economic development policy etc.

Unfortunately for us, government resources (money) more or less scale with population (though Singapore or the OPEC countries buck that trend, for example).

We are a complex society, at least as complex as much larger or wealthier countries – think Japan, Australia, the UK, even Singapore – so we have just as much need for the stuff that scales with complexity as those bigger countries. Except we have much less money to spend and fewer people to do the work.

Something has to give. If we don’t find smart ways of efficiently delivering services and making decisions, we won’t be able to attend to all the needs of people living in a healthy, prosperous and happy society. And over time this relative lack will result in other countries having better standards of living, all other things being equal.

Which means our young people will leave for places with better jobs, education, and healthcare, and we’ll be less attractive to immigrants. It’s a downward spiral from there.

That’s the downside of not using AI and the like to do things smarter. But there is an additional upside if we do adopt this technology. Large countries have to struggle with issues that we don’t, at least not to the same degree: coordination and communication, physical distance or multiple time zones, jurisdictional issues, or extreme societal heterogeneity. These issues can be a real drag on efficiency and effectiveness. Perhaps the judicious and effective use of algorithms in the public sector will level the playing field, or even allow our small size to become an advantage.

While there are justified concerns about using algorithms for the targeted delivery of government services, we really have no choice in the matter. We just need to figure out how to do it well and ensure that the public servants responsible have sufficient maturity and expertise to do the job.

We need fewer generalists and more specialists in public sector leadership.

The public service is largely led by generalists, so it’s rare for specialist skills to be present when they’re needed. It’s harder to take measured risks when you need to rely entirely on someone else for your information. The State Services Commission has a deliberate policy of selecting generalists for public sector leadership, ostensibly to promote stability in the public service by forming a large pool of experienced leaders able to be parachuted into vacancies as needed.

This policy, while directed at chief executives and their direct reports, will have an effect on every layer of management, and lead to specialists feeling discouraged from pursuing leadership careers. We expect hospitals to be led by doctors, universities to be led by academics, laboratories to be led by scientists – why should positions in the public service, accountable for technical work such as building machine learning tools, be led by generalists?

Whether we call it AI, neural networks, algorithmic decision support, machine learning models, or predictive analytics, the public sector must adopt this technology if we are to flourish as a nation. But there’s great risk if it’s done poorly. To ensure that it’s done well we need appropriate checks, balances and ethics frameworks – which, to their credit, the public sector is already creating – and we need those responsible for this technology to understand it at a level at which they can provide effective oversight while pushing forward.

Editor’s note: This article was edited on September 25 2018 to remove reference to the Allegheny Family Screening Tool and research by NZ academics, following correspondence with the researchers involved.