As our world simultaneously becomes more unified and more divided across virtual spaces, we must think more critically about the systems in which technology operates, our relationship with technology as consumers, and the responsibilities of big tech companies.

On Thursday, the Undergraduate Committee on Global Thought hosted an event titled “Tech for Social Good?” featuring speakers who specialize in cyber law and computational journalism. The speakers discussed the increasingly complex relationship between technology, law, human rights, and journalism.

In an apt metaphor, Harvard law professor Jessica Fjeld described technology as “field multipliers for the things humans want to accomplish, both good and bad.” She contrasted the internet’s ability to let people who might never have had access to a secondary education learn languages and other skills with its weaponization to provoke ethnic violence in Myanmar in 2017. Professor Fjeld pointed to the UN Guiding Principles on Business and Human Rights in discussing the responsibilities big tech has to support social equity and communities. She noted, however, that under these principles states and corporations carry different obligations: states are expected to protect human rights, while corporations are only asked to respect them. This distinction is particularly challenging in the case of technology, since it often comes with unexpected effects.

Mark Hansen, a computational journalism professor, noted that part of technology’s unexpected impact stems from user-centered design, using Uber as an example. In focusing on a narrow audience such as passengers, a user-centered design “fails to take a broader look at the system that is supported by that particular business. There are a lot of people who are not identified as the user to whom bad things happen to make good things happen for the user.” He advocated for alternative design principles that would force companies to ask some much-needed questions and evaluate the system as a whole. The difficulty is that while “there are many people in those companies who are really trying to effectuate change…the problem is that they’re operating in systems that are not really designed for that,” according to Gautam Hans, an Assistant Clinical Professor of Law at Vanderbilt University. Even if companies are not interested in having those conversations, he suggested, public pressure and governmental oversight could help steer us toward change.

Similarly, Professor Fjeld supported governmental oversight and legislative action, pushing back against the narrative in technology that places the burden on consumers to protect their own rights. It is unfair, she argued, to frame issues such as privacy as something that could be resolved merely by turning on privacy controls on Facebook, especially since technologies like auto-tagging, facial recognition, and photo metadata are often outside our control. While regulatory change would be the best use of our energy, the speakers warned against implementing one-size-fits-all standards. Especially as we look toward regulation on a global scale, we need to create space for different value systems and approaches. “I think if we had a single regulatory regime, it would be influenced by a kind of coloniality because the existing power structures right now put a lot of wealth and power in western countries and western value systems, and western thinking,” Professor Fjeld said.

The event was originally titled “Tech for Social Good” as a statement rather than a question; the added question mark highlights the unique set of challenges that come with technology. These challenges often leave us swinging between belief in the positive role technology can play and the concern and skepticism that come with trying to regulate its negative consequences. As we grapple with ethical questions about our rights and our relationship with technology, the speakers left us to consider how we can be part of a solution: what will we do to put public pressure on companies and our government to protect our privacy and rights?
