Society | August 26, 2020

Alice Snedden: We need more non-sexual depictions of women’s bodies in the media

The ‘Free The Nipple’ movement seemed to me a low-priority feminist issue at first, but it’s representative of a much larger concern.

Watch Alice Snedden’s Bad News – Women’s Bodies and other episodes in the series here.

As a teenager, it would have been my absolute worst nightmare to show my breasts to anyone, let alone show them on screen. They were my deepest shame, because when I was a kid and I finally got breasts, they were different sizes. Different enough that I would say my left breast is my right breast’s goal weight.

It caused me a lot of anxiety, and I often sought comfort from my mum about it. However, as fate would have it, my mum’s breasts were perfectly symmetrical and she had no idea that it was normal for them not to be. I used to imagine my parents having late night discussions about it, like, “Ugh, shit, that fifth kid, something went wrong there, she turned out a bit messed up – they warn you not to have a baby when you’re old, and they were right.” ‘Cause my mum had me when she was 44, and it really shows in my right breast.

To think that now I’m happily getting my breasts out on the world wide web makes no sense to me – if my 15-year-old self could see me now, she’d be mortified. But thankfully, I’ve evolved. When I was 15 the importance I placed on my breasts was completely driven by my concern that they were the number one determining factor in how attractive I was. I understood, from consuming the media around me, that most of my sexual worth was tied up in my breasts and whether or not men would like them. I thought of them only in terms of how sexually attractive they made me to other people. I didn’t have an opinion on them outside of that, and that’s because I didn’t see depictions of breasts that weren’t sexual. Almost all of the imagery I saw (that wasn’t breastfeeding posters in doctors’ waiting rooms) was through a male gaze. And the male lens was sexualising.

The “Free the Nipple” movement was really the first time I paused to think. When I first heard about it, my initial response was that this should surely be low down on the priority list for feminist issues. But once I started reading about it, I understood that “Free the Nipple” is actually symbolic of a much larger issue. The reason nipples on a woman are sexual has nothing to do with women. It’s exclusively to do with how men perceive women. Breasts are sexual, because they are the object of a man’s sexual desire and therefore taboo in an everyday context. What this means is that women are once again being defined by how men see them and not by how they want to be seen. This isn’t to say that breasts can’t be sexual or arousing – those are some of the best times – but it’s a trap to insist they can only ever be this.

It may seem contradictory to say that breasts shouldn’t be viewed as sexual but also sometimes they can be sexual. But all it really comes down to is women getting to decide how they want to be seen. Women being able to determine when they want to be objectified and when they don’t. It’s allowing women to decide how they want to be in the world. They can be sexual or not, but ultimately it’s their choice.

Society | August 25, 2020

Facial recognition technology is here. New Zealand’s law is nowhere near ready

A surveillance camera at a metro station in Moscow, where facial recognition cameras are being used to slow the spread of Covid-19 (Photo: Kirill KUDRYAVTSEV / AFP)

Without a strong legal and ethical framework and a clear policy for use, facial recognition technology (FRT) can have grave implications for individual and collective rights, writes Nessa Lynch.

Automated facial recognition technology uses an algorithm to match a facial image to one already stored in a system. It is used in automated passport control and other border control measures, as a biometric identifier in banking, security and access contexts, and on social media platforms and various other consent-based applications.
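
For readers curious about the mechanics, the matching step in most modern systems reduces each face image to a numeric vector (an ‘embedding’) and measures how similar two vectors are. The sketch below is illustrative only: embed_face is a hypothetical stand-in for whatever proprietary model a given system uses, and the 0.6 threshold is an assumed value rather than any standard.

```python
# Illustrative sketch of the core FRT matching step, not any vendor's
# actual implementation. `embed_face` is a hypothetical stand-in for a
# proprietary face-embedding model; the threshold is an assumed value.
import numpy as np

def embed_face(image) -> np.ndarray:
    """Hypothetical: map a face image to a fixed-length embedding vector."""
    raise NotImplementedError("stand-in for a real face-embedding model")

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two embeddings; 1.0 means identical direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def is_match(probe_embedding: np.ndarray, stored_embedding: np.ndarray,
             threshold: float = 0.6) -> bool:
    """Declare a match if the embeddings are sufficiently similar."""
    return cosine_similarity(probe_embedding, stored_embedding) >= threshold
```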

The use of FRT in policing is controversial worldwide. Unlike other biometric identifiers used in policing, such as DNA and fingerprints, the automated collection and matching of facial images is generally not covered by legislation. Facial images may be collected at a distance, without the person’s consent or even their knowledge. The impact on individual and collective rights spans a spectrum, from simple identity matching of a person who has been arrested, up to live collection and matching of images by FRT-equipped camera surveillance systems. It also matters whether images are matched against existing police databases or watchlists, other state databases, privately supplied image databases, or open-source data. As my research collaborators have found, the use of live automated FRT in public places has significant implications for privacy rights, and raises concerns about a chilling effect on freedom of expression and lawful protest.

While we have no evidence that FRT is in current widespread use in the policing context in New Zealand, the police are known to have tendered for an updated FRT system and to have carried out a small-scale trial of a controversial system known as Clearview earlier in the year. FRT is known to be used more widely in this country by other public agencies such as Immigration NZ and the Department of Internal Affairs, and in the private sector.

A recent decision of the Court of Appeal of England and Wales illustrates the risks where the technology is trialled without having an appropriate legal and ethical framework in place.

The appellant, Mr Bridges, is a resident of Cardiff, in Wales. He was scanned by FRT, which had been (overtly) deployed by South Wales Police on a public street in Cardiff city centre, and on another occasion at a protest at a defence exhibition. The system used is named “AFR Locate” and operates by capturing facial images from a CCTV camera and automatically comparing biometric data from the images with images derived from a “watchlist”. A police camera operator may then review any matches, before making a decision on further actions or interventions.
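
As a rough sketch of that workflow, under the same assumptions as the earlier example (AFR Locate’s actual internals are not public, and every name here is hypothetical), the loop might look like this:

```python
# Rough sketch of an AFR Locate-style loop as the judgment describes it:
# face embeddings captured from a live feed are compared against a
# watchlist, and candidate matches are queued for a human operator rather
# than triggering any automatic intervention. All names are hypothetical.
from dataclasses import dataclass
import numpy as np

@dataclass
class Candidate:
    frame_id: int
    watchlist_name: str
    score: float

def scan_feed(frame_embeddings, watchlist, threshold=0.6):
    """Compare each captured face embedding against the watchlist and
    return candidates for operator review; the code takes no action itself."""
    review_queue = []
    for frame_id, probe in enumerate(frame_embeddings):
        for name, stored in watchlist.items():
            score = float(np.dot(probe, stored) /
                          (np.linalg.norm(probe) * np.linalg.norm(stored)))
            if score >= threshold:
                review_queue.append(Candidate(frame_id, name, score))
    return review_queue  # a human operator decides on any further step
```

The design point the court cared about is visible in the last line: the system’s output is a queue for a human operator to review, not an automatic intervention.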

Bridges took a case against South Wales Police, alleging that his right to respect for private life had been infringed and that the use of the technology breached data protection and equality legislation. The Divisional Court found that the right to private life was engaged, but that the use of FRT was lawful and proportionate in the circumstances.

Bridges appealed to the Court of Appeal. The court held unanimously that the interference with Bridges’ privacy was unlawful because there were no clear guidelines on the parameters of use, meaning that police had too wide a discretion. However, the court did find that the use was proportionate: the impact on Bridges was minor, while the benefits (presumably crime control and public safety) were significant. The court also found that South Wales Police did not undertake a proper data protection assessment and had failed to assess whether the system could be biased.

While the court noted that automated FRT involves a much higher level of intrusion than police taking photographs or using CCTV in public places, it declined to hold that specific legislative authorisation was required, as it is for DNA or fingerprints. Thus, although campaigners have called for a ban on its use, police forces around the United Kingdom are now refining policy and guidance on the technology to take account of the decision.

As to the implications for New Zealand, police here have indicated that, after the criticised trial of Clearview, they are undertaking a review of surveillance technologies to ensure privacy implications are properly considered. While adherence to the privacy regime is necessary and welcome, there remain concerns about the lack of a wider legal and regulatory framework for the use of FRT and other surveillance technologies, as well as the absence of a clear policy for use. It is worth remembering that Bridges was able to take his action on the basis of alleged breaches of rights-protecting statutes that have no direct equivalents in this jurisdiction.

New Zealand does now have what is said to be the world’s first Algorithm Charter, which sets principles for public sector agencies using algorithms as the basis of, or to guide, decision-making. Not all agencies (including, at the time of writing, the police) have signed up to the charter. It is a voluntary set of guidelines, and the means by which an individual can query improper use and seek redress is unclear. It may also be noted that the government chief data steward has convened an independent group that is available to assist public sector agencies with data ethics issues, particularly relating to algorithms. (I am a member of this group, but the views expressed here are my own.)

Finally, our research group will report its findings on the lawful and ethical use of FRT later in the year. With a report from the Law Commission on the use of DNA in criminal investigations also expected, it may be an appropriate time for reflection on the wider framework for the use of biometrics in policing in the particular societal and cultural context of Aotearoa New Zealand.

Nessa Lynch is an associate professor at the Faculty of Law, Victoria University of Wellington, and is currently leading a research project on facial recognition technology.