Tech liberates Western women, but it oppresses women in developing nations – not that the tech giants care. Across the globe, tools that empower American women are being reconfigured to cage and degrade women. From the recent innovation that can ‘out’ women in porn, to Saudi Arabia’s use of women-tracking apps, to the surveillance potential of China’s Uighur-tracking systems, women are being colonized by tech.
From the washing machine to the smartphone, technology has allowed women to be in control of their own time and space. If we’re walking home late at night, being able to reach out and let a friend know where we are gives a sense of security. So does knowing the name and rating of your Uber driver, being able to share your ride status, checking in when you arrive at or leave an event, or sharing your location and what you might be doing.
When Western women track themselves, and permit friends and family to track them, they’re creating a communal safety net. This gives us an unprecedented freedom and safety of movement. We’re aware that we’re not entirely alone out there, waiting to get raped and ditched by bad actors. But for women in developing nations, the increase in mobile tech adds to the burden of male oppression.
In Saudi Arabia, where women cannot legally travel alone without the permission of a male guardian, the widely derided Absher app allows men to ‘list dependent women’, and to ‘either deny or allow individual travel’. In a country where women activists are routinely imprisoned and tortured for advocating for a woman’s right to move freely, the installation of an app like Absher on a woman’s phone is more than a slap in the face from the men who would deny her natural rights; it is an existential threat to her life. Her body is not her own, but the property of the men to whom state and religious authorities grant control over her. This kind of tech misogyny spreads male dominion over women into the digital sphere, further entrenching women’s legal degradation.
In China, a user on the social media network Weibo claims to have identified 100,000 women who have performed in pornographic films by cross-referencing photos on social media with faces pulled from videos on adult platforms like Pornhub. VICE carried the story without confirming it, questioning whether the program exists at all. But the potential for facial recognition technology to be abused against women in porn is obvious.
Pornography is a huge and growing global market, with demand for female performers especially high. Western culture has a vested, capitalist interest in reducing the stigma associated with commercialized sex. The language is altered to make it more acceptable; ‘sex work’ as opposed to ‘prostitution’, and averrals of ‘porn pride’ from the industry. But porn is still all about performing sex for money, and secrecy and shame still attach to it. Not every porn actress is out there waving her ‘porn pride’ flag, or wants her cinematic catalog to be cross-checked against her Facebook and Instagram posts.
If women who choose to do this work have the right not to be identified, women who do not choose to make porn have every right to ensure that their friends, family or employers don’t think they do. Facial recognition tech has already come under fire for recognizing white and male faces far more accurately than non-white and female ones. The risk of erroneous identification for women, especially black and brown women, is high. What happens to women who have not appeared in porn, but whose facial features are insufficiently differentiated by the technology? What redress will they have when they are targeted online as ‘porn stars’?
The Weibo user and his associates claim that they created this program ‘to have the right to know on both sides of the marriage’. After a public outcry, he later claimed his intention was to ‘allow women, with or without their fiancés, to check if they are on porn sites and to send a copyright takedown request’. As ever, surveillance is sold as a kind of public service. But the public pays the price in privacy.
To see what life in a facial-recognition society is like, consider China’s Xinjiang province. The Uighurs are under constant surveillance. In the emerging Chinese surveillance state, people are afraid to speak their minds, to move freely, and to make choices about their own lives. Individuals who feel they are being watched behave as though they are being watched. They scurry about, fearful and uneasy. They cower inside their own hearts before the demands of the government, and they conform their behavior to external expectations. Anyone who has been the child of disciplinarian parents knows what it is to hide inside yourself, waiting until the moment they are gone before you feel easy in your own breath.
The idea that if we have nothing to hide, we should not be afraid of being watched and exposed is internalized authoritarianism. What we might have to hide is not the question. Without the consciousness of free movement, we have no agency or control over our own lives. We become lab rats, awaiting our next stimulus. The aversion to being tracked, to having our movements known, is instinctive, and essential to our liberty.
Technology isn’t liberation in itself, but only a means to an end. The potential benefits of each piece of technology are moderated by the culture that applies it. In societies where women are already under the thumb of men, or where individual rights are already weak, women will be the ones who take the digital hit in the real world.