In TraceTogether We Trust: Singapore’s Challenge with Data Governance and Ethics


Trust has become a foundational aspect of our digital world, undergirding our everyday interactions with data, technology, and institutions. It engenders the confidence needed to sustain relationships among stakeholders, and in turn empowers digital systems to continue working in equitable and accountable ways.

However, one recent incident has highlighted how difficult that trust is to maintain. The success of TraceTogether, Singapore’s digital COVID-19 contact tracing system, in containing community transmission was recently overshadowed by concerns over its usage safeguards. On January 4, 2021, Minister of State for Home Affairs Desmond Tan stated, “The Singapore Police Force is empowered … to obtain any data, including TraceTogether data, for criminal investigations.”[1] His statement reversed the government’s previous position that TraceTogether data was not intended for use by law enforcement, and was met with widespread consternation.

This outcry is unsurprising, given that TraceTogether captures the movements of over 80% of Singapore’s population.[2] Its system of Bluetooth-enabled proximity tracking for identifying close contacts is currently voluntary and employed alongside SafeEntry—another contact tracing system that requires anyone entering public venues to check in digitally with personally identifiable information, such as their NRIC or phone numbers. However, the government has announced its intention to merge both systems, which would make TraceTogether mandatory.[3] Against the backdrop of TraceTogether’s continued use and the absence of wider policy reform, the government’s broken promise has drawn sustained public scrutiny of Singapore’s data policies.[4] The government has somewhat mitigated privacy concerns by passing legislation that explicitly restricts the use of TraceTogether data to contact tracing and serious criminal investigations, but the move is limited in scope. By focusing solely on TraceTogether, it leaves unaddressed prevailing gaps in policymaking processes for other forms of digital technology.[5]

It is worth noting that the primary contention was not over TraceTogether’s technical specifications. In fact, several independent privacy organizations and reviews have deemed the application one of the “least intrusive” contact tracing applications in Southeast Asia, and determined that it satisfied a set of technology principles guided by the American Civil Liberties Union, including limits on how long data should be stored and how much data is collected.[6] Rather, the issue is that crucial legal liabilities and societal implications were not factored into the development process. Indeed, Prime Minister Lee Hsien Loong acknowledged in a March 2021 interview that “[the government] made a mistake” in not communicating data usage policies upfront.[7] Given inaccurate and ultimately delayed communication of data use and privacy policies, the application was bound to suffer breaches of trust, regardless of its technical quality.

The incident thus reveals a larger problem: current legislative and bureaucratic structures surrounding digital technology in the public sector may be insufficient to address privacy, security, and ethical concerns in its development. Technology is a strategic national resource for Singapore, and digital solutions will only become more prevalent as the country pursues its Smart Nation Initiative. But to harness its strength, digital technology must be underpinned by a strong culture of public trust that empowers its widespread use. This trust can only be nurtured and sustained if our systems have sufficient guardrails to guide technology policy and governance.


Assessing Trust and Singapore’s Approach to Data

Singapore’s rapid pace of technological integration is an expression of its Smart Nation Initiative, a national-level ambition to harness emerging technologies to improve lives and livelihoods.[8] This entails leveraging information technologies, networks, artificial intelligence (AI), and data analytics to augment the delivery of government services and encourage digital innovation in all sectors. Singapore’s Smart Nation strategy sets the expectation that government functions, both internal and citizen-facing, should move almost entirely to digital platforms by 2023.[9] Such an initiative is undeniably crucial for Singapore’s continued development and will rely heavily on positive reception from, and mass uptake by, the population. This, in turn, depends on a strong culture of public trust that citizens will be protected—both in the legal frameworks that govern data access and in the public institutions that govern data use.

Strong legal protections around data privacy and use are a crucial element of trust in technology. In this regard, the Singapore government is bound by multiple pieces of data legislation. However, the stringent data management practices laid out in these laws are caveated by high-level clauses that ultimately grant the government sweeping control over citizens’ data whenever necessary for the public interest.[10] At the center of the TraceTogether incident was the Criminal Procedure Code (CPC), which authorizes the police to access data they deem “necessary or desirable” for any criminal proceeding, regardless of what the data is or who holds it.[11] Similarly, Section 4 of the Public Sector (Governance) Act allows Ministers to issue data-sharing directions whenever necessary for the public interest, which is defined by subjective criteria (e.g., “to uphold and promote the values of the Singapore public sector”).[12] In practice, government use of these legal backdoors has been limited and judicious, and there are good reasons to afford flexibility in interpreting the “public interest.” But given that this policy ecosystem could theoretically grant the government unfettered access to personal data, the government should be transparent about its rationale whenever these backdoors are invoked. While a strong government presence is largely normalized and tolerated in Singapore, the TraceTogether incident has shown that this tolerance is tenuous and can erode quickly.[13] Pervasive use of digital solutions has to be sustained by public confidence in the legal boundaries of technology and data.

Trust in legal protections must be paired with trust in the public institutions that use technology. Studies of civic technology in other societies offer cautionary tales of how perfectly legal digital tools can engender unintended social harms. The design of digital government tools may unwittingly render services inaccessible to particular populations—in Norway, prospective child benefits recipients who could not access the automated online application faced higher rates of delay and rejection.[14][15] Reliance on seemingly neutral data can blind decision-makers to inequities—in Kansas City, Missouri, the city government used service hotlines as a problem-spotting tool, which left low-income and minority neighborhoods that were less likely to report quality-of-life issues unaccounted for.[16] Biased algorithms can encode social injustice into government functions: American AI risk assessment tools disproportionately and inaccurately predicted that Black defendants had higher likelihoods of committing future crimes, resulting in unjust decisions to limit defendants’ freedoms.[17] All of these potential harms are exacerbated when they originate in the hands of the government, since unlike in private industry, there are no competing alternatives to government services. Codified principles can bring these social considerations to the forefront of government technology design and guide public servants as they make these difficult trade-offs. Without such guardrails, unintended social consequences of government technology are inevitable. Such consequences do not only incur social harms, but also reduce public confidence in otherwise valuable digital services.


A Symptom of Broader Institutional Shortcomings

As technology becomes more intertwined with our everyday lives, tensions among civil liberties, societal equity, and the public good will demand greater attention to the policy tradeoffs we collectively decide upon. But given the time and resource constraints brought about by the pandemic, it would be unfair to portray the TraceTogether privacy incident as a total systemic failure. Especially in light of the system’s good design, it is certainly not. Instead, the government’s overlooking of the CPC in its public communications on TraceTogether should be understood as a symptom of institutions lacking sufficient process safeguards.

Publications on technology policy are limited in their provisions for privacy, transparency, and other ethical considerations. Strategy documents such as “Smart Nation: The Way Forward” and the Digital Government Blueprint outline the long-term principles, strategic approaches, and goals for public sector digitization.[18][19] A recurring principle of “user-centricity” emerges in these strategies. However, it is framed solely in the context of the immediate user interface and experience; a truly user-centric approach should also consider the broader product-society or government-user tensions a product could create. That privacy, transparency, and other ethical considerations do not feature explicitly in such high-level direction-setting arguably undermines the completeness of the “user-centric” approach.

Similarly, the GovTech Digital Service Standards operationalize technology design principles into mandatory standards for all government-created digital products. These principles are “Intuitive Design and Usability,” “Accessibility and Inclusivity,” and “Relevance and Consistency.” Again, these standards focus solely on the user interface itself, providing public servants with limited guidance on how to factor privacy, transparency, and accountability into product design.[20] While government personal data protection policies are publicly available, they are largely written in technical terms, and once again carry the major caveat of data access and use in the “public interest.”[21]

These gaps in the consideration of ethics extend to organizational accountability as well. In a public apology, Minister-in-charge of the Smart Nation Initiative Vivian Balakrishnan asserted that the connection between the CPC and TraceTogether had not “crossed the minds” of himself or his engineers.[22] In a best-practice scenario, catching such critical policy relationships should not be left to chance. Rather, policy considerations should be baked into the product design and rollout process, and led by dedicated product managers, policy directors, and even technology ethicists. It is clear that current policies and guidelines surrounding Singapore government technology have room to grow.

Political and economic considerations also support expanding technology governance. Internationally, Singapore is poised to become a regional leader in digital government. It has begun to leverage its Smart Nation expertise through public-private partnerships abroad,[23] while making its civic technologies widely available: for example, TraceTogether’s codebase has been made public for others to adapt.[24] There have also been increasing calls for international agreements to regulate the responsible use of technology.[25] If Singapore successfully implemented a national technology ethics framework, it could serve as a policy laboratory influencing global conversations around technology governance. Its policies could even serve as precedents that inform future international agreements, much as the European Union’s General Data Protection Regulation (GDPR) has done in the field of data privacy. The economic incentives are evident too: a strong and vibrant environment of trust makes Singapore a more attractive hub for digital firms and industries. It is therefore within Singapore’s geopolitical and economic interests to proactively codify a comprehensive technology governance approach in the near term.



While Singapore’s approach to technology governance has significant room for improvement, it benefits from a strong existing foundation. Smart Nation initiatives are building technical capacity for digitalization across all of government, including digital training programs for civil servants. GovTech is experienced in imbuing civic technology with a user-centric approach, and uses “mystery shopping” as an enforcement mechanism for its standards. These features, among others, can and should be built upon to address the holistic impacts of technology on society.

First, a set of broad, non-binding principles is needed to address the full scope of technology-society interfaces. While the Singapore government has published various materials on technology standards, they are either shrouded in legalese or missing content on privacy and ethics principles. The United Kingdom has set a positive example in this area. Its Data Ethics Framework sets standards for transparency, accountability, and fairness in public sector data use, while its Technology Code of Practice, similar to GovTech’s Digital Service Standards, includes additional principles that encourage open-source, privacy-forward, and flexible digital practices. Should Singapore adopt analogous standards, they should be designed with the layperson in mind, both in language and in distribution. Even the most comprehensive and transparent principles cannot build public trust if they are locked behind obscure language or layers of webpage navigation. By creating accessible technology ethics standards, the Singapore government could enable accountability to the public, guide public servants involved in technology design and delivery, and clarify its technology policy approach—for instance, delineating what “public interest” tradeoffs would warrant the collection of personal data—to improve confidence in its digital ambitions.

To facilitate compliance, these principles should be operationalized into standard operating procedures (SOPs). Checklists, in particular, have proven effective at keeping teams aware of policy considerations throughout the planning, implementation, and evaluation of technology services, preventing oversight and miscommunication. Leading companies such as Microsoft and Google have developed ethics checklists to ensure that AI products are developed with social implications in mind from start to finish.[26] Similarly, the United States Digital Services Playbook offers checklist items addressing privacy, security, openness, and holistic social impact in government technology development.[27] Such practices are already widely embedded in other fields: in research and academia, checklists mandated by Institutional Review Boards protect the rights and welfare of human subjects. Parts of the Singapore government have also begun to adopt operational principles for AI use by both public and private organizations.[28][29] These precedents hold promise for SOPs to be developed across all areas of government technology.

Finally, the organizational structure surrounding government technology should enable the implementation of these principles and checklists. Responsibility for technology’s social impact should be clearly delineated by role, to prevent oversights and instill a sense of accountability for potential social harms. Should social risks be uncovered during development, team members should be empowered to raise their concerns in an open and collaborative environment. Communications plans should also capture the broadest extent of a technology’s impact, to avoid another miscommunication à la TraceTogether. Talent processes can likewise be updated to screen job candidates for their attitudes towards digital ethics, to incorporate ethics training into public servants’ digital education, and potentially to hire dedicated technology ethicists.



As technological solutions continue to proliferate throughout the public sphere, the case for a concerted approach to institutionalizing proper safeguards around digital technology is more urgent than ever. With few legal and political mechanisms to hold the government accountable for data misuse, it is in the public interest to ensure that guardrails at the organizational and policy levels are embedded in our technology design and implementation processes. The social implications and trade-offs that result from insufficient attention to these issues could undermine the very public trust that Singapore’s technological ambitions hinge on. The TraceTogether incident has highlighted new opportunities for reform, and we must not let this lesson slip by.


This article was written as part of the SPJ Reading Group on Digital Technology.

Sarah Anderson is an MPP-MBA degree candidate at the Harvard Kennedy School and Stanford Graduate School of Business. Her academic interests include government technology design and delivery, and cross-sector technology regulation.

Lionel Oh is a Master’s candidate in the Regional Studies – East Asia program at Harvard University. He is also the current Editor-in-Chief of the Singapore Policy Journal and serves in the Republic of Singapore Air Force. His academic focus centers on security and foreign policy relating to China, with a specific emphasis on cybersecurity and cyberwarfare.


[1] “Singapore COVID-19 contact-tracing data accessible to police,” Reuters, 4 January 2021,

[2] Tham Yuen-C, “More than 4.2m people using TraceTogether, token distribution to resume soon: Lawrence Wong,” The Straits Times, 4 January 2021,

[3] Mohit Sagar, “SafeEntry and TraceTogether to be used to enable further safer reopening,” OpenGov Asia, 21 October 2020,

[4] Grace Ho, “Critical need to rebuild the public’s trust in TraceTogether,” The Straits Times, 3 February 2021,

[5] “Upcoming Legislative Provisions for Usage of Data From Digital Contact Tracing Solutions,” Smart Nation Singapore, 8 January 2021,

[6] Kyra Jasper & Camille Bismonte, “Singapore’s Updated TraceTogether Privacy Policy Could Erode Public Trust,” Center for Strategic & International Studies, 17 February 2021,

[7] Lee Hsien Loong, “PM Lee Hsien Loong’s interview with BBC for Talking Business Asia,” Prime Minister’s Office, 14 March 2021,

[8] “Transforming Singapore,” Smart Nation Singapore,

[9] Ibid.

[10] “Government Personal Data Protection Policies,” Smart Nation Singapore,

[11] “Criminal Procedure Code,” 31 August 2012,

[12] “Public Sector (Governance) Act 2018,” 9 March 2018,

[13] Terence Lee & Howard Lee, “Tracing surveillance and auto-regulation in Singapore: ‘smart’ responses to COVID-19,” Media International Australia, 12 August 2020,

[14] Thad Hall & Jennifer Owens, “The digital divide and e-government services,” International Conference Series on Theory and Practice of Electronic Governance, September 2011,

[15] Karl Kristian Larsson, “Digitization or equality: When government automation covers some, but not all citizens,” Government Information Quarterly, January 2021,

[16] Constantine Kontokosta & Boyeong Hong, “Bias in smart city governance: How socio-spatial disparities in 311 complaint behavior impact the fairness of data-driven decisions,” Sustainable Cities and Society, January 2021,

[17] Julia Angwin et al., “Machine Bias,” ProPublica, 23 May 2016,

[18] “Smart Nation: The Way Forward,” November 2018, Smart Nation Singapore,

[19] “Digital Government Blueprint,” 30 December 2020, Government Technology Agency,

[20] “Digital Service Standards,” 2020, Government Technology Agency,

[21] “Government Personal Data Protection Policies”

[22] Vivian Balakrishnan, “COVID-19 (Temporary Measures) (Amendment) Bill” (speech, Singapore, 2 February 2021), Parliament of Singapore,

[23] Priyankar Bhuniar, “IE Singapore signs MOU with Indian city to facilitate participation of Singapore companies in smart city projects,” OpenGov Asia, 28 October 2017,

[24] “6 things about OpenTrace, the open-source code published by the TraceTogether team,” Government Technology Agency, 9 April 2020,

[25] “Responsible Use of Technology,” World Economic Forum, August 2019,

[26] Michael Madaio et al., “Co-Designing Checklists to Understand Organizational Challenges and Opportunities around Fairness in AI,” CHI, 25 April 2020,

[27] “Digital Services Playbook,” U.S. Digital Service,

[28] “Principles to Promote Fairness, Ethics, Accountability and Transparency (FEAT) in the Use of Artificial Intelligence and Data Analytics in Singapore’s Financial Sector,” Monetary Authority of Singapore,

[29] “Singapore’s Approach to AI Governance,” Personal Data Protection Commission,



Image credit: Catherine Lai, AFP