Technology use is/as organizational culture
The phrase “AI policy” sounds serious. Policy belongs to a stable of serious words such as governance, framework, principle, and guideline. These words - which generally represent types of rules - are high-level and often abstract. They leave a lot to interpretation, by design. And they’re often the kinds of rules we look to so that we don’t get in trouble, or so that we don’t do something wrong.
While that framing is important when thinking about technology use, it’s far from exhaustive. And in some cases, these types of rules - and the work they create - masquerade as control.
By mistaking technology adoption for a compliance exercise, we cede significant power to shape how we do (or don’t) use new technologies.
Ursula Franklin spoke of the ways that technology can reduce and/or eliminate reciprocity. It's a helpful observation to keep in mind when considering automation, efficiency, and the relationships that matter to you and your organization. You can use technology to increase reciprocity in relationships, but you have to be intentional about it. You also have to refuse to see AI as a thing and understand it instead as politics, politics you should engage with.
Another way to think about new technology is to reframe the conversation from “should we / can we / is it ethical to use technology this way?” to “what do we want our jobs / profession / workplaces to be like?”
Rather than stopping at high-level technocratic documents, we have an opportunity to exert more agency and power in deciding how we use technologies together. Organizations that stop for long enough to be intentional rather than reactive about the adoption of new technologies can help protect the parts of their culture that require the most care and attention.
Organizations that stop for more than a few breaths before they increase automation also create space to ask whether the tasks that could be automated are even the right tasks for today’s mandate and context. Leadership often inherits all kinds of technical systems and rarely has time to stop and assess the status quo. It’s counterproductive to let the introduction of new technologies further cement processes that haven’t been reviewed in years.
AI, and technology writ large, impact many things that do not fit neatly into technocratic rules. Relationships are probably the most important: your relationship to yourself, your relationships with your colleagues, your relationships with the people you serve. By failing to have conversations and come to agreement on how exactly we want to use new tools - in small teams, in organizations - we automatically ingest and normalize whatever features get pushed out to our computers. We allow technology providers to shape our operations and our relationships.
More important than any cultural agreement on use, however, is its maintenance: creating a culture where questioning the use of technology is welcomed, where people tinker and reflect on what is working, what is not, and what needs to change. This kind of reflection is the true practice of ethics. The best agreements have life. They are light, they are active, they are welcome, they are supportive. I’d even argue that they are short. Rules can be framed positively, and they can help everyone move through discomfort, uncertainty, and change.
If you’re in a leadership or management position, and you’d like advice on how to politically navigate these issues, please get in touch. If you’re in a union and you’d like advice on how to engage on this issue, please get in touch. If you’d like to talk about creating a technology use agreement for your organization, or if you’d like to hold a workshop or training session to get into these issues, please get in touch. I’m especially excited to work in partnership with various arts organizations to bring you creative programming to really get into the cultural consequences of what technology is or does. If this sounds like a fun team-building or retreat idea, please get in touch too, at the link below :)