Covert surveillance, blatant coercion
ROX MIDDLETON, LIAM HELLEWELL and MIRIAM GAUNTLETT take a look at what it means to live in an algorithmic society
[Alexandre Debiève / Creative Commons]

TEN months ago, in April 2023, Google discontinued a device that it had worked on and publicised for 10 years. Google Glass looked like a pair of clear wrap-around glasses with a chunky bit at the top of the frame. The “chunky bit” was a tiny projector that beamed an image onto part of the lens itself, so that the user saw a screen in their field of vision. In September Google stopped updating or maintaining the Google Glass products it had already sold.

What happened? Although the product had been through significant development since the original headset design, it was clearly not a moneymaker. The project started off at “X Development” (no relation to Elon Musk’s X, but the Alphabet subsidiary, formerly known as Google X, that the company uses for experimental development projects).

The company trialled the product on a range of test users in an attempt to make the Google Glass technology seem useful rather than creepy: for example, publicising its use by doctors to get a “surgeon’s eye view,” and by a group of breastfeeding mothers who could apparently use the device to consult experts and read advice while feeding their babies.
