
Imagining the user: On designing technology with older test users

Neven, Louis. "‘But obviously not for me’: robots, laboratories and the defiant identity of elder test users." Sociology of Health & Illness 32, no. 2 (2010): 335-347.







Newspapers, podcasts and TV news commonly portray new technologies, including robots, either in starkly negative terms or as hoped-for solutions to many societal problems. Among the issues that technology is expected to fix is the rapid aging of populations, coupled with a shrinking number of healthcare staff to cater to older people, who are often (considered to be) in need of care.


An article in The New York Times, for example, argued that the Covid-19 pandemic has given a "new urgency" to the development of robots and other technologies to support the physical and emotional wellbeing of older adults. Robots have also been capturing people's imaginations through novels in which they care for children and older people in the not-too-distant future. In Klara and the Sun and Plum Rains, two novels which are tellingly both set in Japan, social robots are designed specifically to care for children and older people respectively. In these works, robots are depicted as superior to humans even when it comes to care - a capacity that philosophers like Joan Tronto have claimed is essential to being human.


And yet, social robots have not taken off as well as their producers and designers had hoped, not even in Japan, as James Wright writes. Pepper, one of the most famous Japanese social robots, was discontinued in 2021, and others have shared this fate. To improve social robots, technology creators are therefore increasingly relying on participatory design, which includes the potential end users of a technology in the design process.



Among the discontinued robot projects was iRo (a fictitious name), a prototype developed in the Netherlands. Although Louis Neven published his findings on iRo more than a decade ago, his article remains relevant to this day: the case of iRo is surprising because this robot was indeed designed with user participation, as described above, and yet it failed. As Neven (2010, p. 338) notes, the researchers involved in the project "went to considerable lengths" to understand elder users better by carrying out a literature review, organising a brainstorming workshop and finally conducting tests with people who could potentially become iRo users. What went wrong, then? As Neven writes,


despite this concern for understanding users, the researchers would struggle to incorporate the participants' alternative views on iRo which emerged during the tests (p. 338).

Most importantly, the designers and their test users had different ideas about who the prospective users of iRo would be. The designers imagined older people as varied: having different preferences, needs, lifestyles, mental and physical abilities and social environments. However, Neven's analysis shows that the elder test users, too, had their own representations of potential robot users - and these differed significantly from how the test users saw themselves:


the test users would state that they liked iRo, liked playing games with it and thought it worked very well. However, they also repeatedly, and often without provocation, mentioned that iRo was not a robot for them ... Most participants felt that the intended user was housebound, old, lonely, feeble and in need of care and attention, and they did not want to be equated with that person (p. 341).

Neven argues that the role of the media is significant: even if the robot designers did not imagine older people in stereotypical, simplistic terms as frail, lonely or forgetful, the test users had read articles and seen news on television that portrayed robots similar to iRo being placed in retirement homes. This helped create a connection between social robots and 'old people' understood as being of a certain age as well as lonely, in poor health and in need of care. Neven notes that


ageing people can create an image of 'old' people and actively dissociate themselves from that group. Test users did exactly that; they made the prospective user of iRo into somebody else, somebody older, lonelier and in need of care, and they actively dissociated themselves from that user (p. 342).

They did so by creating another image of themselves: that of the helpful elder test user. They embraced the idea of successful aging and positioned themselves as active, healthy, altruistic older adults, untouched by the negative consequences of aging (Neven 2010, p. 342).


While the designers imagined elder users as people who would need, and eventually also want, (health) robots, they did not take into account that these users might simply refuse to be positioned as old, lonely and frail - and therefore reject the technological solution proposed to them.


But engineers, researchers and designers are not the only ones who should be held accountable for this outcome; the media played a significant role in this process by maintaining a particular popular discourse around elder users of robots. As Neven (p. 345) concludes:


Ideally, a number of parties - for instance, journalists, but also policy makers, politicians and scientists - should share the responsibility for creating more refined, non-ageist images of elder users, which could thus lead to more refined user representations.




