Published 13 October 2025 · Technology

Killer robots can dance, too

Martyn Pedler

Twenty years ago, Pauline Hanson appeared on Dancing With The Stars. The fact that she managed to avoid far-right dog-whistles — there’s no “waltzing while dressed in a burka”, for example — may strike us as something of a miracle. But it was also the entire point. This was a prime example of “funwashing”: taking part in a popular event or show to improve one’s reputation. Watching Hanson dance made her seem more relatable, even likeable, to mainstream Australia. See? She’s struggling with a cha-cha, just like you or me!

The goal of funwashing is to humanise its subjects — but what happens when you attempt to funwash things that weren’t human to begin with?

In February 2025, a robot designed for warfare performed a DJ set at a San Francisco nightclub. Created by Foundation Robotics Labs, “Phantom” is a humanoid robot with a faceplate like one of Daft Punk’s iconic helmets. (They were men pretending to be robots; Phantom is a robot pretending to be a man.) According to the New York Post, “[t]he unveiling of the humanoid robot as a DJ was a way for the company to show that the war machine can also be used for fun.” Promoting Phantom as having multiple functions foregrounds the notion that it’s a tool that can be used for good or bad — for fun-having or death-dealing. It’s the logic made famous by the National Rifle Association: guns don’t kill people, people kill people. Just because Phantom’s a war machine, it doesn’t mean it can’t party!

Phantom performed at a US artificial intelligence influencer event — no surprise — but this kind of funwashing also takes place at the highest levels of Australian culture. The National Gallery of Victoria recently featured a four-month residency for artist Agnieszka Pilat and her three Boston Dynamics robot dogs, named Basia, Vanya, and Bunny.
These robots have been programmed to make art for an exhibition the NGV described as “futuristic, yet joyful”: Heterobota.

Pilat calls herself a “propaganda artist”. She says this is “a little bit sarcastic — I’m playing at being 100% for technology, which is very controversial. Of course there are valid concerns about technology. But I chose to engage with it and train it. It’s my way of dealing with the problem.”

That sarcasm, however, does not often come across in Heterobota. Any sense of the disquieting, the uncanny, the dangerous is provided by the audience — not fostered by the art. There’s a reason Pilat is beloved by Silicon Valley. The robots, programmed to be charming and childlike, even know how to pose for selfies. How bad can they be?

These machines can be programmed to play music or make art, but no one could claim such activities are their primary purpose. You could carve a sculpture by blasting away at a block of marble with an AR-15, too, but at least the gun wouldn’t pose for photographs afterwards. I’m not sure that what Pilat describes as doing “silly stuff with the robots” will be much comfort if one of these creations confronts you during a riot or in a war zone. There’s a reason why these dogs need their reputations laundered. As Sian Cain quipped in The Guardian, “that police forces keep buying Spots hasn’t helped.”

Funwashing isn’t just for robots. AI can be presented as warm and fuzzy, too. In fact, OpenAI was forced to roll back a recent update because it made its LLM too agreeable, even sycophantic. (You’d be forgiven for thinking that all AI wants is love.) Think, too, of the landslide of photographs doctored by AI to look like the work of the beloved animation outfit Studio Ghibli. Harmless fun! At worst, you’re letting corporations scrape your data in return for a quick thrill. Then Israel’s IDF used the same algorithm to cutesify images of its troops and proudly post them on its social media accounts.
Suddenly it didn’t seem quite so whimsical anymore.

Headlines like “Army Testing Robot Dogs Armed with Artificial Intelligence-Enabled Rifles in Middle East” test the public’s acceptance of machines that “think” for themselves. There is an old conspiracy theory that Steven Spielberg was paid by the US government to make E.T. and Close Encounters to prepare the population for alien contact. Look back at the history of robots on screen and you could be convinced something similar happened with autonomous intelligence decades ago. For every HAL 9000 there are three dozen R2-D2s.

As the meme goes: you do not, under any circumstances, “gotta hand it to them” — but at least the NRA understanding of a tool is one that maintains human agency above all. Every time we convince ourselves that an algorithm is “thinking”, we accept the idea that it can make its own decisions. If we outsource our creativity to AI, why not our morality, too?

Writing this, I kept thinking about the movie The Iron Giant. It’s the story of a young boy who finds a giant robot in the woods. The robot has lost its memory. It doesn’t remember that it’s a war machine, and when it eventually finds out, it’s horrified. The Iron Giant was reportedly pitched by director Brad Bird in the form of a question: “what if a gun had a soul?” As I watched Pilat’s robots, I thought that the difference was that the Giant didn’t want to be a gun. The robot dogs Basia, Vanya, and Bunny don’t care what they do — whether it’s painting a wall or chasing down a refugee.

Giving robots agency is just abdicating moral responsibility. Australia has a particularly grim recent piece of history with this: the Robodebt scandal. An automated (and illegal) debt assessment and recovery system for Centrelink recipients, it accused many of owing thousands of dollars, driving some to despair, others to suicide.
Despite being forced to repay wrongful claims and face a Royal Commission, the Morrison government never offered a formal apology. After all, it was the robots who got it wrong.

Image: Stop Killer Robots

Martyn Pedler is a PhD candidate at Swinburne University, writing an interdisciplinary thesis on superhero stories. You can read more of his fiction and criticism at http://martynpedler.com