Non-user
Abstract
A “non-user,” as the name suggests, refers to an individual who does not use a given product or system. Critical work on non-use elaborates a range of applications for the term, which we consider here. The variations of non-use under discussion encompass both voluntary and involuntary cases of non-use. This article belongs to the Glossary of decentralised technosocial systems, a special section of Internet Policy Review.
Definition
A “non-user,” as the name suggests, refers to an individual who does not use a given product or system. Critical work on non-use elaborates a range of applications for the term, which we consider here. The variations of non-use under discussion encompass both voluntary and involuntary cases of non-use.
CONTEXT FOR NON-USER DISCOURSE
What broadly constitutes “non-user discourse” derives from discourse about the user. Commentary about the “user” originated in systems design, which emerged in the United States and Europe as part of a wider effort to advance the development of military technologies. As computing systems evolved, so too did the “user” for whom these technologies were designed.
Early data processing systems responded to the needs of information-intensive industries. User organisations in both public and private sectors oriented the design of information technologies to enhance the productive capacities of their respective operations (Yates, 1993). It is within the context of user organisations that innovation studies introduced the concept of “lead users” into user discourse. Research focused on single industries identified the “lead user” as an individual who proposes key innovations from outside the industry (Oudshoorn & Pinch, 2003, p. 541; von Hippel, 2007; Graham, 2006). What distinguishes the lead user from ordinary users is a set of skills that exceed the given functions of a particular device (von Hippel, 1976).
As demand for micro-electronics and personal computers surged in the 1980s, “user-centred” design and “user experience” re-oriented the design of systems to accommodate individual consumers (Oudshoorn & Pinch, 2003). With the convergence of information and communication technologies, models of human-computer interaction turned their attention from the single user tethered to a single device to multiple users distributed across large networks.
In contrast to their predecessors, these approaches incorporated the “holistic study of users from the viewpoint of the user” rather than the system (Dervin & Nilan, 1986; Talja & Hartel, 2007, p. 2; White & McCain, 1998). Harnessing cognitive psychology to improve how systems were designed, the study of “user experience” deepened the existing view of users by taking into account the “emotions, beliefs, preferences, perceptions, physical and psychological responses, behaviours, and accomplishments” (ISO, 2009) that condition human-computer interaction (Rheinfrank, 1995).
Research on users in human-machine interaction, information science, and cognitive psychology (Cooper & Bowers, 1995; Kosara et al., 2003; von Hippel, 2005) has since provided a basis for critical work in the field of science, technology and society (STS). It is within this context that discourse on non-users takes shape.
VARIATIONS OF NON-USE
From the standpoint of HCI, non-users are a technical designation for “potential users” (Satchell & Dourish, 2009, p. 9). Implicit in HCI’s model of non-use is a set of assumptions that elicits much debate outside the field. Studies in STS identify a range of cases for non-use: resistance, rejection, exclusion, expulsion, lagging adoption, disenchantment, disenfranchisement, displacement and disinterest (Wyatt et al., 2002; Satchell & Dourish, 2009).
This spectrum of negative actions captures what makes non-use particularly difficult to define in positive terms. Because non-use is not observable in the way use is, studying it presents a formidable methodological challenge (Dourish, 2001, p. 56; Treem, 2014). For the purposes of this glossary entry, we organise the different types of non-use into two primary categories: the first encompasses cases of voluntary non-use, while the second covers examples of involuntary non-use.
VOLUNTARY NON-USE
Opting out of use is a singular action which belies a complex of subjective considerations and varies in relation to economic conditions and ideological commitments (Brubaker et al., 2016).
Insofar as voluntary non-use presumes a certain degree of individual choice, it refers to a set of economic conditions specific to market-based capitalism. Non-users who terminate their engagement with one company, for example, may opt into a platform belonging to a competitor. Scholarship on the attention economy (Crary, 2001) expands on the subjective dimensions intrinsic to the economic model of consumer choice. Such scholarship examines how individual attention is structured by the products and services which compete for it (Crawford, 2015; Davenport & Beck, 2001).
Organised boycotts present a collectivised form of voluntary non-use. In these cases, a set of political and ethical commitments lends a social form to the decisions of individual non-users who reject the products of a given entity. Non-use of this kind is a form of consumer activism based on the voluntary rejection of a user technology (Wyatt et al., 2002). The duration and degree to which non-users participate in a boycott vary: some partially and temporarily suspend use, while others may completely and permanently terminate their use of a particular good or service.
Individual cases of non-use that are not principally motivated by political concerns have their origins in nineteenth-century bourgeois culture. With the expansion of cities and industrial processes came a rich body of literature that broadly envisioned different means of withdrawal from the increasingly oppressive conditions intrinsic to modernity. Technology’s relationship to nature and the rationalisation of society has long preoccupied critics of modernity, who consider the political subjects industrial development reciprocally determines (Marx, 1964; Kracauer, 1924). Risk assessment made on an individual basis underlies more recent examples of voluntary non-use that are motivated by concerns about public health. “Internet addiction” was officially declared a public health issue in China as early as 2008, when an uptick in searches for the term “digital detox” followed shortly after the launch of the first iPhone (Jiang, 2014). “Digital detox” posits a solution to problems of over-connectivity that applies the moral rhetoric of contemporary wellness regimes (Madsen, 2015) to the digital age (Syvertsen & Enli, 2020).
INVOLUNTARY NON-USE
Cases of non-use that are involuntary present a much more elusive object of research than the examples of voluntary non-use outlined in the previous section. Nevertheless, the secondary literature on compulsory non-use can be subdivided into three units of analysis: infrastructural, structural, and individual.
Discrepancies in access function as a point of departure for work on involuntary non-use at the infrastructural level. By examining differences in access among various populations, this research shows how historically marginalised populations have been disproportionately affected by lack of internet access. The extent to which race, gender, and class play a role in the distribution of access to digital technologies is the source of much debate among social scientists (Dewan & Riggins, 2005; DiMaggio et al., 2004).
Lack of access to content and to particular platforms as a result of mandates is a form of involuntary non-use that takes place at the structural level. These cases tend to presume a centralised structure of authority, such as a corporation or state, which has the capacity to remove content and prioritise the use of certain systems.
In certain cases, individuals may fall under the category of involuntary non-users because of a gap between their skills and those required to navigate advanced information systems. Without the appropriate skills, these individuals attain non-user status. Debates over digital literacy are of central relevance to users (and non-users) of decentralised systems insofar as their accessibility determines who can and cannot be considered a user. One challenge decentralised computing infrastructures face is the creation of end-user-friendly systems (Gervais et al., 2014). In prioritising technological design over usability, decentralised systems can be prohibitively difficult to use—even as they impact economic, civic, and social opportunities for users and non-users alike (DiMaggio et al., 2004). Potential users who cannot engage with decentralised platforms may consequently be “left behind,” thus becoming involuntary non-users. Further, users may have difficulty leaving centralised platforms for less mainstream, less easily accessible decentralised alternatives. In other words, digital literacy impacts not only who is able to use decentralised systems, but also who has the choice to swap their usage of centralised systems for decentralised ones. Here it is important to note that scholars who research digital literacy emphasise the importance of studying specific population segments and of disaggregating digital literacy from non-use.
ISSUES RELATED TO NON-USE
Voluntary and involuntary cases of non-use present a number of issues that range in practical and theoretical significance.
Where access to user technology is assumed, issues related to non-use take on practical considerations. The transfer of data from centralised platforms to alternative ones, for example, raises a problem concerning “portability.” Users who opt out of one platform sometimes encounter difficulties with transporting their data as a result of conflicting proprietary arrangements. A solution to this problem may be found in open standards, which make user data portable by enabling interoperability between systems (Barabas et al., 2017). A minimal illustration of this idea follows.
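The sketch below illustrates, in Python, what portability through an open standard might look like in practice: a record held in one platform's proprietary format is translated into an open, JSON-based interchange object loosely modelled on the ActivityStreams 2.0 vocabulary. The platform schema, field names, and helper function are hypothetical assumptions made for illustration only; they do not describe any real platform's export format.

import json

# Hypothetical record as exported by a centralised platform's proprietary API.
# These field names are illustrative assumptions, not any real platform's schema.
proprietary_post = {
    "uid": "12345",
    "body": "Hello from a centralised platform!",
    "created": "2021-03-01T12:00:00Z",
    "author_handle": "@example_user",
}

def to_open_format(post: dict) -> dict:
    """Translate a proprietary post record into an open, ActivityStreams-style
    JSON object that another (e.g. decentralised) system could import."""
    return {
        "@context": "https://www.w3.org/ns/activitystreams",
        "type": "Note",
        "id": f"urn:example:post:{post['uid']}",
        "content": post["body"],
        "published": post["created"],
        "attributedTo": post["author_handle"],
    }

if __name__ == "__main__":
    # Serialising to a shared, openly documented format is what makes the data
    # portable: any system that understands the standard can read it.
    print(json.dumps(to_open_format(proprietary_post), indent=2))

The point of the sketch is not the code itself but the design choice it embodies: when export targets a documented open format rather than a proprietary one, interoperability no longer depends on bilateral agreements between platforms.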
Determining who counts as a non-user remains largely contingent on how users themselves are defined. In HCI, the question of whether the user assumed in user-centred design can accommodate the diversity of interactions between humans and computers is a source of much debate (Baumer & Brubaker, 2017). One side of this debate maintains that by flattening the full range of human activity into “systems, interfaces, design practices, and discourse” (Baumer & Brubaker, 2017, p. 6291), user-centred design posits an inherently exclusionary model of human-computer interaction. Though HCI acknowledges its cultural specificity, certain methods central to the field nevertheless continue to employ a universalist approach which assumes an omniscient creator (Philip et al., 2012).
In calling attention to normative conceptions of the user at work in popular narratives about technological development (Oldenziel, 2001; Star, 1991), feminist and postcolonial critiques of technoscience challenged prevailing definitions of the user and non-user by attending to positions which have historically been excluded from these narratives. This discourse focuses on the wider conditions of uneven development that have shaped who designers and engineers assumed the user to be (MacKenzie & Wajcman, 1999; Williams et al., 2005).
Anti-universalist methods which have emerged in response to these debates apply decolonial critiques of knowledge and artefact production to the design of HCI (Johnson, 1998; Suchman, 2002). How the global division of labour is gendered and racialised in the technological imagination is the object of considerable research in STS (Oudshoorn & Pinch, 2003). Expanding the frame of HCI to geographies and peoples beyond the industrial north provincialises dominant narratives about innovation, which have long been weaponised against indigenous movements in newly industrialising countries across the global south (Chakrabarty, 2000; Mignolo, 2007).
Although HCI theoretically recognises the cultural specificity of designed products, a number of design processes and methods remain universalist in their approach (Philip et al., 2012), assuming the ability to design for one user at the exclusion of many others. Adapting anthropocenic and decolonial critiques to HCI design, designers have increasingly turned to methods which aim to decentre the human and attend to subaltern modes of knowledge production (Tunstall, 2020; Schultz, 2018). In centring human agents, user and non-user discourse minimises the non-human agents that shape and are shaped by use. Actor-network theory (ANT) (Latour, 2005) provides one alternative to this human-centred framework through a definition of the user which extends to animals, plants, minerals and cities typically outside the core interaction between humans and machines. ANT encompasses both technologically determinist views of user-technology relations and social constructionist approaches to technology by attending to how agency is distributed among humans, non-humans, and the technologies which mediate their relationship. This conceptualisation of the user as an agent within relational networks aligns with anthropocenic debates and calls for rethinking systems and technological approaches that concentrate authority over these networks in human agents who comprise only one aspect of them (Light et al., 2017).
CONCLUSION
In conclusion, non-use belies a complex of subjective considerations, which we sort into two primary categories: voluntary and involuntary cases of non-use. Attending to the non-user presents an opportunity to contextualise user agency and access. Whereas systems design adopted a centralised model of human-computer interaction as its basic unit of analysis, non-user discourse accounts for a more diverse range of interactions.
REFERENCES
Barabas, C., Narula, N., & Zuckerman, E. (2017, September 8). Decentralised social networks sound great. Too bad they’ll never work. WIRED. https://www.wired.com/story/decentralized-social-networks-sound-great-too-bad-theyll-never-work/
Baumer, E. P. S., & Brubaker, J. R. (2017). Post-userism. Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, 6291–6303. https://doi.org/10.1145/3025453.3025740
Benkler, Y. (2016). Degrees of freedom, dimensions of power. Daedalus, 145(1), 18–32. https://doi.org/10.1162/DAED_a_00362
Brubaker, J. R., Ananny, M., & Crawford, K. (2016). Departing glances: A sociotechnical account of ‘leaving’ Grindr. New Media & Society, 18(3), 373–390. https://doi.org/10.1177/1461444814542311
Bush, V. (1948). As we may think. The Atlantic.
Central Government Portal. (XXXX). 我国首个《网络成瘾临床诊断标准》通过专家论证 (Country’s first “Clinical Diagnostic Criteria for Internet Addiction” passed expert demonstration).
Chakrabarty, D. (2000). Subaltern studies and postcolonial historiography. Nepantla: Views from South, 1(1), 9–32.
Cooper, G., & Bowers, J. (1995). Representing the user: Notes on the disciplinary rhetoric of human-computer interaction. In P. J. Thomas (Ed.), The Social and Interactional Dimensions of Human-Computer Interfaces (pp. 67–106). Cambridge University Press.
Coutard, O. (Ed.). (2002). The Governance of Large Technical Systems. Routledge.
Crary, J. (2001). Suspensions of Perception: Attention, Spectacle, and Modern Culture. MIT Press.
Crawford, M. B. (2015). Introduction: Attention as a cultural problem. In The World Beyond Your Head: On Becoming an Individual in an Age of Distraction. Farrar, Straus and Giroux.
Davenport, T. H., & Beck, J. C. (2001). The Attention Economy: Understanding the New Currency of Business. Harvard Business School Press.
Dervin, B., & Nilan, M. (1986). Information needs and uses. Annual Review of Information Science and Technology, 21, 3–33.
Dewan, S., & Riggins, F. J. (2005). The digital divide: Current and future research directions. Journal of the Association for Information Systems, 6(2), 298–337.
DiMaggio, P., Hargittai, E., Celeste, C., & Shafer, S. (2004). Digital inequality: From unequal access to differentiated use. In K. M. Neckerman (Ed.), Social inequality. Russell Sage Foundation.
Dourish, P. (2001). Process descriptions as organisational accounting devices: The dual use of workflow technologies. Proceedings of the 2001 International ACM SIGGROUP Conference on Supporting Group Work - GROUP ’01, 52. https://doi.org/10.1145/500286.500297
Dourish, P., & Mainwaring, S. D. (2012). Ubicomp’s colonial impulse. Proceedings of the 2012 ACM Conference on Ubiquitous Computing - UbiComp ’12, 133. https://doi.org/10.1145/2370216.2370238
Gervais, A., Karame, G. O., Capkun, V., & Capkun, S. (2014). Is Bitcoin a decentralized currency? IEEE Security & Privacy, 12(3), 54–60. https://doi.org/10.1109/MSP.2014.49
Goodin, T. (2018). Off: Your digital detox for a better life. Abrams. https://www.overdrive.com/search?q=448CC5C3-82D3-4702-8C5F-30109EED9AC7
Graham, M. B. W. (2006). Comment: Exploring the Context of Use. Enterprise & Society, 7(3), 456–461. https://doi.org/10.1017/S1467222700004341
International Organisation for Standardisation (ISO). (2009). ISO 9241-210:2010 Ergonomics of human-system interaction—Part 210: Human-centred design for interactive systems.
Jiang, Q. (2014). Internet addiction among young people in China: Internet connectedness, online gaming, and academic performance decrement. Internet Research.
Johnson, R. R. (1998). User-centered technology: A rhetorical theory for computers and other mundane artifacts. SUNY Press.
Kosara, R., Healey, C. G., Interrante, V., Laidlaw, D. H., & Ware, C. (2003). Thoughts on user studies: Why, how, and when. IEEE Computer Graphics and Applications, 23(4), 20–25. https://doi.org/10.1109/MCG.2003.1210860
Kracauer, S., & Levin, T. Y. (1995). Boredom. In The Mass Ornament: Weimar essays. Harvard University Press.
Latour, B. (2005). Reassembling the social: An introduction to actor-network-theory. Oxford University Press.
Low, C. (2020). Accessibility in tech improved in 2020, but more must be done. Engadget. https://www.engadget.com/accessibility-in-tech-2020-150002855.html
MacKenzie, D., & Wajcman, J. (1999). The Social Shaping of Technology. Open University Press.
Madsen, O. J. (2015). Optimizing the Self: Social Representations of Self-Help. Routledge.
Manley, J. (2020, November 26). The ethics of rebooting the dead. WIRED. https://www.wired.com/story/ethics-reviving-dead-with-tech/
Marwick, A. (2011). If you don’t like it, don’t use it. It’s that simple. Social Media Collective Research Blog. http://socialmediacollective.org/2011/08/11/if-you-dont-like-it-dont-use-it-its-that-simple-orly/
Mignolo, W. D. (2007). Delinking: The rhetoric of modernity, the logic of coloniality and the grammar of de-coloniality. Cultural Studies, 21(2–3), 449–514. https://doi.org/10.1080/09502380601162647
Oudshoorn, N., & Pinch, T. (Eds.). (2003). How Users Matter: The Co-construction of Users and Technologies. MIT Press.
Perrin, A., & Atske, S. (2021). Americans with disabilities less likely than those without to own some digital devices [Report]. Pew Research Center. https://www.pewresearch.org/fact-tank/2021/09/10/americans-with-disabilities-less-likely-than-those-without-to-own-some-digital-devices/.
Philip, K., Irani, L., & Dourish, P. (2012). Postcolonial computing: A tactical survey. Science, Technology, & Human Values, 37(1), 3–29. https://doi.org/10.1177/0162243910389594
Rheinfrank, J. (1995). A conversation with Don Norman. Interactions, 2(2), 47–55. https://doi.org/10.1145/205350.205357
Satchell, C., & Dourish, P. (2009). Beyond the user: Use and non-use in HCI. Proceedings of the 21st Annual Conference of the Australian Computer-Human Interaction Special Interest Group on Design: Open 24/7 - OZCHI ’09, 9. https://doi.org/10.1145/1738826.1738829
Schultz, T. (2018). Mapping Indigenous futures: Decolonising techno-colonising designs. Strategic Design Research Journal, 11(2), 79–91. https://doi.org/10.4013/sdrj.2018.112.04
Star, S. L. (1991). Power, technology and the phenomenology of conventions: On being allergic to onions. In J. Law (Ed.), A Sociology of Monsters: Essays on Power, Technology and Domination (pp. 26–55). Routledge.
Suchman, L. (2002). Located accountabilities in technology production. Scandinavian Journal of Information Systems, 14(2), 91–105.
Syvertsen, T., & Enli, G. (2020). Digital detox: Media resistance and the promise of authenticity. Convergence: The International Journal of Research into New Media Technologies, 26(5–6), 1269–1283. https://doi.org/10.1177/1354856519847325
Talja, S., & Hartel, J. (2007). Revisiting the user-centred turn in information science research: An intellectual history perspective. Information Research, 12(4), 12–14.
Treem, J. W. (2014). Technology non-use as avoiding accountability. In E. P. S. Baumer, M. G. Ames, J. R. Brubaker, J. Burrell, & P. Dourish (Eds.), CHI ’14 Extended Abstracts on Human Factors in Computing Systems (pp. 65–68). ACM. https://dl.acm.org/doi/10.1145/2559206.2559224
Trevisan, F. (2018). Disability Rights Advocacy Online: Voice, Empowerment and Global Connectivity (First issued in paperback). Routledge, Taylor & Francis Group.
Tuhiwai Smith, L. (2021). Research through imperial eyes. In Decolonizing Methodologies: Research and Indigenous Peoples. Zed Books. https://doi.org/10.5040/9781350225282
Tunstall, E. D. (2020). Decolonizing design innovation: Design anthropology, critical anthropology, and indigenous knowledge. In W. Gunn, T. Otto, & R. C. Smith (Eds.), Design Anthropology: Theory and Practice (pp. 232–250). Routledge.
von Hippel, E. (1976). The dominant role of users in the scientific instrument innovation process. Research Policy, 5(3), 212–239. https://doi.org/10.1016/0048-7333(76)90028-7
von Hippel, E. (2005). Democratizing Innovation. MIT Press.
von Hippel, E. (2007). Horizontal innovation networks—By and for users. Industrial and Corporate Change, 16(2), 293–315. https://doi.org/10.1093/icc/dtm005
White, H. D., & McCain, K. W. (1998). Visualising a discipline: An author co-citation analysis of information science, 1972-1995. Journal of the American Society for Information Science, 49(4), 327–355.
Williams, R., Stewart, J., & Slack, R. (2005). Social learning in technological innovation: Experimenting with information and communication technologies. Edward Elgar Pub.
Wilson, T. D. (1981). On user studies and information needs. Journal of Documentation, 37(1), 3–15. https://doi.org/10.1108/eb026702
Wyatt, S. (2003). Non-users also matter: The construction of users and non-users of the Internet. In N. Oudshoorn & T. Pinch (Eds.), How Users Matter: The Co-construction of Users and Technologies. MIT Press.
Wyatt, S., Thomas, G., & Terranova, T. (2002). They came, they surfed, they went back to the beach: Conceptualising use and non-use of the Internet. In S. Woolgar (Ed.), Virtual society? Technology, cyberbole, reality. Oxford University Press.
Yates, J. (1993). Control through communication: The rise of system in American management (Johns Hopkins paperbacks ed.). Johns Hopkins University Press.
Zhao, Z., Laga, N., & Crespi, N. (2009). A survey of user generated service. 2009 IEEE International Conference on Network Infrastructure and Digital Content, 241–246.