BY JOSHUA TAN
The following is the second of four collective summaries published by the Singapore Policy Journal’s reading group on Digital Technology. Each collective summary is a product of the topics discussed and the research directions of the members of the reading group. The reading group comprises individuals from a range of backgrounds, bringing a multidisciplinary approach to digital technology.
The recent focus on the Criminal Procedure Code’s application to TraceTogether data and the use of the Protection from Online Falsehoods and Manipulation Act (POFMA) during the 2020 General Election has underscored the urgency of building public awareness of emerging digital technology issues and how they affect society. What are our roles and responsibilities as citizens, and how much trust can we place in public office when it comes to the control and use of digital technology?
The Singapore Policy Journal’s reading group invited David Eaves and Bruce Schneier to share their work on public policy and digital technology with the group. Eaves is an expert in public policy and information technology, and a lecturer at the Harvard Kennedy School. He has worked with many governments, advising on corporate policy structures for an open data strategy. Schneier is an internationally renowned public interest technologist working at the intersection of security, technology, and people. He is also a lecturer at the Harvard Kennedy School and the author of 14 books, including the New York Times bestseller Data and Goliath. This session, “The Foundations of Trust in a Digital Society,” focused on the principles and frameworks that underpin the use of digital technology in Singapore. Specifically, the reading group looked at the policy dilemmas governments face in the regulation of technology, the economics driving the security and insecurity of digital technology, and their implications for Singapore’s Smart Nation ambitions.
Eaves spoke about open data and how it can contribute to greater transparency and public engagement. Open data has “created a research space where people could go and grab information, and build out ideas that others could take to scale or that could influence what the government realizes is possible with their data.” He went on to show how open data has enabled budgetary comparisons between cities like Johannesburg and Cape Town in South Africa, and how it can even make complicated parliamentary documents accessible through websites like OpenParliament.ca, a search tool for finding parliamentary transcripts by keyword. Eaves argued that “transparency can help drive public engagement, and increase levels of accountability that sometimes governments don’t want, but are actually good for the public good.” He is interested in how private stakeholders can release data to serve the public interest, and also in how private sector interest can serve public value.
Despite his praise of the potential of open data, Eaves is not an idealist, and he warns that we should “not equate transparency with trust.” More open data may mean that malicious actors can manipulate information towards their own ends, and do so more effectively than motivated but under-resourced public actors. He posits that greater data literacy is needed as more open data becomes available; critical thinking is our best defence against malicious intent.
One could expand on Eaves’ presentation by considering the disparities in data and access for less-developed regions and underrepresented communities, which may exacerbate inequalities in data-informed decision-making and the provision of services. This applies equally to policymaking in the public sector and to decisions made in the private and commercial sphere. For example, Shoshana Zuboff highlights the danger of cartographic representation with Google Maps, arguing that maps are essential to the imagination of territories. Given that Google has the power to decide who “exists” on these maps, and that these maps can inform public sector projects, the consequence of not being represented is the erasure of existence. In the Singapore context, the risk of excluding underrepresented or digitally “handicapped” communities may also arise when regional digital initiatives informed by open data do not take the limitations of these communities into account. For example, JTC Corporation’s brochure for the Punggol Digital District boasts an “open digital platform” for participation and experimentation. The use and application of open data in this way privileges those with the technology and know-how, disadvantaging residents and demographics that are less conversant with them.
Schneier followed Eaves with a discussion of the economics and enablers of trust. He cited the example of SolarWinds, a company that develops software for network and systems management. The hacking of SolarWinds exposed sensitive information held by SolarWinds’ clients and their customers. Schneier argued that the company’s underspending on security essentially “transferred risk” from itself to “their customers without their knowledge and consent for short term profit.” Schneier used this incident to frame four economic concepts that are relevant for security:
- Network effect: The value of a product grows geometrically with the number of users.
- Fixed vs. marginal cost: Most products have significant marginal costs of production, but digital technologies have very low marginal costs and high fixed costs (think music, software, pharmaceuticals).
- Lock-in: The difficulty of switching to a competitor. Software typically exhibits high lock-in.
- Lemons market: A market where the seller knows more about the product than the buyer, so the buyer cannot make a good purchasing decision. Bad products drive good products out of the market, and the mediocre succeeds while quality does not.
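The interplay of the first two concepts can be sketched numerically. The toy model below is our own illustration, not from Schneier’s talk: it assumes a Metcalfe-style network value that grows with the square of the user base, and all the cost figures (`FIXED_COST`, `MARGINAL_COST`, `value_per_link`) are arbitrary placeholders.

```python
# Illustrative sketch: why digital products favour scale.
# Assumes Metcalfe-style network value (~ n^2) and a near-zero
# marginal cost per user -- simplifying assumptions, with
# placeholder numbers rather than real figures.

FIXED_COST = 1_000_000   # e.g. building the software once
MARGINAL_COST = 0.01     # e.g. serving one more user

def network_value(users: int, value_per_link: float = 0.001) -> float:
    """Metcalfe-style value: grows with the square of the user base."""
    return value_per_link * users * users

def total_cost(users: int) -> float:
    """High fixed cost, tiny marginal cost per user."""
    return FIXED_COST + MARGINAL_COST * users

for users in (1_000, 100_000, 10_000_000):
    v, c = network_value(users), total_cost(users)
    print(f"{users:>10,} users: value {v:>16,.0f}, cost {c:>12,.0f}")
```

With these placeholder numbers, value trails total cost at small scale but dwarfs it at large scale, which is one way of seeing why digital markets tend toward a few dominant players.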
He went on to describe how trust is enabled by four main “pressures”:
- Moral Compass: We don’t steal because we know it’s wrong.
- Reputation: We don’t steal because it’ll affect our reputation.
- Institutions and Laws: We don’t steal because we will go to jail.
- Technology: We don’t steal because technological devices that enforce security prevent us from doing so.
Schneier’s description of the economics of digital products is convincing, albeit somewhat general. Companies that create digital and technological products work to embed their products into their own “ecosystems” (think Apple), and the larger a particular piece of software becomes, the more desirable it is (think Microsoft and social media platforms). The distinction between fixed and marginal cost also seems generally true, although there are exceptions worth investigating (think data storage facilities, the energy to run them, software updates, the employment of security teams, etc.). As Eaves hinted in his sharing, society has to become more data literate and technologically savvy as digital technology continues to advance. This would potentially allow consumers to avoid the problems of high lock-in and the lemons market.
On Schneier’s perspective on morality in enabling trust, it must be cautioned that universalizing morality and human impulses risks simplifying the complex socio-economic and cultural contexts of communities, and with that, our understanding of trust in different contexts. Defining our “moral, ethical, and religious codes” on the premise of “human evolutionary tendencies” is troubling, as it assumes that current codes are superior to those of the past. This critique underlines that as we continue to think about trust in public and private actors when it comes to digital technology, it is important not to fall back on universalist frameworks and simplified narratives. Context matters, and the challenges facing Singaporeans as we navigate issues of responsibility and accountability need to be addressed through a contextual and sensitive approach.
The reading group subsequently split into two groups for roundtable discussions. The first group discussed the role of government in the regulation of technology, and whether the focus should be on implementation or on development. It was argued that Singapore may not really have control over digital technology, since few cutting-edge tech companies are based locally. Furthermore, policy challenges to large multinational companies might produce negative consequences, as seen in Facebook’s retaliation against Australia’s proposed regulations. The Singaporean government has also been observed to be cautious when regulating technology, waiting several months before deciding to regulate Uber. Keller Easterling’s concept of “extrastatecraft” was brought up, in which new private-public configurations create zones of exception and avoid normal legal and financial restrictions. Governments may therefore be seen as working with private companies rather than with the public, in the name of economic growth and efficiency. The group later discussed Singapore’s Smart Nation initiative and its challenges and advantages, questioning whether the Singaporean government is the best arbiter of digital technology regulation. This question slid into a conversation about whether elected officials should have the right to label digital or social media posts false under POFMA. Was it their right and duty to do so as the elected representatives of the people, or would there be a conflict of interest? It was also noted that Singapore has commissions created to regulate digital technologies, such as its AI Governance and Ethics initiatives. However, questions remain over the influence and authority these academics and policy researchers will have over decision-makers and elected officials.
The second group discussed trust in digital technology and the question of evaluating risks, delving into the trade-offs involved in regulating digital technology. Does the issue ultimately come down to balancing budgets and cybersecurity? What are our expectations of the government, and what responsibility do we ourselves take? Trust in government depends on where the line between privacy and security is drawn, and on the gap between where it should be and where it really is. Linking this back to the Singapore context, the group discussed how this might play out in the country before posing an important question that needs to be answered: who is the Smart Nation initiative for? The group further emphasized the importance of digital literacy in holding the government accountable. On education, the group differentiated between teaching people how to code and teaching them what code should be used for, though it was noted that teachers may not have the capacity and resources needed to teach these skills in schools. Further discussion took place on data exploitation, which was acknowledged as not just a technical issue but also an ethical one; it therefore becomes crucial to teach people to code ethically and to understand how their skills should be used. Given the different solutions proposed, the group pondered what the government should prioritise, concluding that educating the public, keeping data and systems safe, keeping pace with technological developments, and balancing trade-offs were all important.
Joshua Tan is an M.Arch I candidate at the Yale School of Architecture. His design work has explored the use of open data in urban analysis, and his research interests include food and water infrastructures, land tenure, building technology, and housing development.
 See Shoshana Zuboff, “The Elaboration of Surveillance Capitalism,” in The Age of Surveillance Capitalism (New York: Public Affairs, 2019), 151–152.
 Esri Singapore, Punggol Digital District Brochure, accessed 31 March 2021, https://esrisingapore.com.sg/punggol-digital-district
 Isabella Jibilian and Katie Canales, “Here’s a simple explanation of how the massive SolarWinds hack happened and why it’s such a big deal,” Business Insider, February 25, 2021, https://www.businessinsider.com/solarwinds-hack-explained-government-agencies-cyber-security-2020-12
 Bruce Schneier, “Moral Pressures,” in Liars and Outliers (Indianapolis: John Wiley & Sons, Inc., 2012), 77.
 See Keller Easterling, “Zones,” in Extrastatecraft (Verso, 2014).