Using the tiniest amount of RAM and code, each ghost is programmed with its own specific behaviors, which combine to create the masterpiece, according to Paul Galloway, collection specialist for the Architecture and Design Department.

This was the first time I’d seen video games inside a museum, and I had come to this exhibit to see whether I could glean some insight into technology through the lens of art. It’s an exhibit that is more timely now than ever, as technology has been absorbed into nearly every facet of our lives, both at work and at home – and what I learned is that our empathy with technology is leading to new kinds of relationships between ourselves and our robot friends.

The exhibit aims to show how interactive design “informs the way we move through life and conceive of space, time, and connections, well beyond the game screen,” according to MoMA. The interfaces we use to access the digital universe “are visual and tactile manifestations of code that both connect and separate us, and shape the way we behave and perceive life,” the museum said when the show was announced.

On my tour of the exhibition, I continued past more masterpiece video games – Minecraft, Tempest, SimCity 2000, and Never Alone (Kisima Ingitchuna), to name a few – stopping to play with any open consoles. Many of the games seemed simple at first, limited to a single joystick and a couple of buttons, or a keyboard. Yet when I tried to play them, it took me a while to learn the ways of each game. Some of them, especially Minecraft, didn’t make sense to me at all, and I had to watch a child play with it to understand the intricacies of its world building.

The other museumgoers meandered through the games, waiting for a spot to open up. When one did, their eyes immediately stuck to the screen as they plunged into a new world with new rules.
I was most drawn to the robots and gadgets, including the Macintosh SE home computer, the iPod, and the EyeWriter, an eye-tracking technology created by designers for a graffiti artist with ALS that allowed him to create tags on city buildings from his bed.

The exhibit takes its name from Never Alone (Kisima Ingitchuna), an Iñupiaq video game included in the show, Galloway explained. The game was commissioned by the Cook Inlet Tribal Council, which represents the native peoples of Alaska, in an effort to preserve the legacy of their culture and connect with the younger community.

“They made a video game and the core idea of the game is that it’s through a connection with each other and our shared cultures that we can find wisdom and peace, especially in facing the challenges of a changing world, and I think that just seemed a perfect metaphor,” Galloway said.

According to Galloway, the title Never Alone carries two meanings. The first is that when we’re in a video game, we’re technically never alone: the input, the player, and the designer must all work together for interactive design to function. As players, we’re constantly engaging with the interface the designer has created for us to explore. In this sense, it is impossible for us to ever truly be alone when we’re using interactive design.

The second thread is that – thanks to technology – we really are never alone, even during the most difficult of times, such as a pandemic. We are constantly connected through technology, whether that means connecting a community to its culture or simply staying in touch with each other online.
This exhibit is a way of exploring our humanity and how our relationship with technology can reassert our empathy, rather than making us less human alongside these robots.

Galloway told me that the exhibit is divided into three parts: the input, the designer, and the player. “We thought about the three different parts in that exchange. There’s the actual machinery, there’s the person using the machinery – the user or the player – and then there’s the person who is designing all the experiences,” Galloway said.

“Part of the reason this exhibition is happening after the pandemic is we spent two years glued to our screens and interacting with each other through the mediums of the various programs, whether it was Zoom calls or Fortnite Battle Royale, or playing Among Us,” said Galloway. “Our interactions with each other were mediated by these tools and that made us all very much pros at interactive design.”

For a while, many of us were effectively forced to channel our interactions with each other through devices and screens. And the Never Alone exhibit is also asking – perhaps unexpectedly – how far we can extend our empathy not just through devices, but to the devices themselves.

One way to examine such interactions is through Technological Dream Series: no. 1, Robots, an installation by Anthony Dunne and Fiona Raby in one corner of the exhibit. A variety of differently shaped objects – a red circle, what looked like a large shower head, a bent wooden rectangular prism, and something that looked a lot like a lamp – are sprawled across the floor. In the accompanying video, a woman stands by these objects, periodically picking them up, examining them, and listening to them seemingly whine, as if they are yearning for her attention. Are these objects supposed to be robots?
“Robots can take any shape, and again [we’re] investigating our ability to extend empathy to these things that are completely alien and inhuman-looking,” Galloway said. “It’s not like a Roomba cleaning your floor for you, instead it’s some dumb robot that can’t even move. All it can do is cry. How do we look at ourselves and extend our humanity to something in that way?”

“I think that [the pandemic] was so heavily mediated and informed by screens, and digital devices, and interactive software that I can’t think of all that stuff the same after that experience,” he said.

This exhibit is the perfect opportunity to examine our renewed empathy – and to realize that perhaps our empathy for these devices was, in fact, always there. Consider the Tweenbot. In 2009, Kacie Kinzer let this little, smiling cardboard robot wander around Washington Square Park in New York City with only the assistance of passersby and a flag that read “Help me”, pointing in the direction of its destination. Surprisingly, brisk New Yorkers stopped to help the Tweenbot stay on the right path and disentangle him whenever he encountered an obstacle. The Tweenbot reached his destination and, remarkably, did not end up mangled in a ditch somewhere in the city.

The Tweenbot wouldn’t have been able to complete his mission without humans to guide him. So there must be something in us – humans who walk the streets of the bustling city daily, often without ever making eye contact with anyone – that makes us stop and take the time to get a little robot back on track. It seems counterintuitive for humans to help a robot (or any piece of technology) achieve a goal, instead of the other way around. After all, robots are supposed to make our lives a little easier.
They can complete tasks ranging from simple to complicated: cleaning, making deliveries, even cooking. But Kinzer’s project showed that when the roles are reversed, and robots are the ones that depend on humans to get something done, humans are capable of extending empathy to them. Perhaps that’s a positive sign for us all – that our interactions through technology can keep us connected with the people we care about, while also making it easier for us to extend that empathy to the world around us.