Mandated TikTok Transparency Is Needed to Protect US Users
Mark MacCarthy, Carl Schonander / Feb 10, 2026

One national security concern that motivated the law mandating the transfer of TikTok from its Chinese owner, ByteDance, to a US-controlled entity was the possibility that the Chinese government, through its control of ByteDance, could use TikTok for influence operations or other propaganda campaigns to shift US public opinion in a pro-China direction. The January 22 press release announcing the establishment of a new US-controlled TikTok joint venture contained two important safeguards to protect US users from such Chinese influence operations. But it needs to be supplemented with further transparency measures to ensure that these safeguards are effective.
Both China hawks and China doves should agree on two additional transparency measures: first, that the new joint venture release the terms of any agreement with ByteDance in connection with licensing its underlying content recommendation algorithm; and second, that it provide access to independent researchers to determine whether the ongoing operations of TikTok’s social media platform are free of Chinese government manipulation.
Safeguards in the agreement
The first safeguard in the announcement says that the US-controlled joint venture “will safeguard the US content ecosystem and have decision-making authority for trust and safety policies and content moderation.” This is intended to protect US users from all influence operations run by third-party users of the TikTok app, including those run by the Chinese government. Its effectiveness depends on how vigorous a content moderation program the new TikTok company establishes and maintains.
The second protection says that the new US-controlled joint venture “will retrain, test, and update the content recommendation algorithm on US user data.” This retrained “content recommendation algorithm will be secured in Oracle’s US cloud environment.” The idea here is that by retraining ByteDance’s underlying algorithm, the US entity could strip out the controls aimed at complying with China’s content laws and also remove any additional manipulative content filters or amplifiers targeted specifically at US users.
But this second protection contains an obvious loophole. It seems that ByteDance will retain ownership and control over the underlying algorithm and license it to the US entity for further training for US users. This matters because the licensing terms might forbid the US entity from tampering with the Chinese government-mandated content controls. Moreover, Chinese influence operations might be so embedded in the algorithm that simply retraining it on US user data will not be enough to remove the ideologically slanted bias in content delivered to US users.
Content moderation decisions aimed at removing harmful material, even if made by the US-controlled joint venture, might not be enough to counter this underlying bias. Alex Turvy and Rebecca Scharlach, in a commentary for Tech Policy Press, note that if the problem is manipulation of what US users see on TikTok, “a licensing arrangement that leaves algorithm IP in Beijing doesn’t obviously answer it.” Kenton Thibaut from the Atlantic Council makes much the same argument, pointing out that this arrangement “could still hypothetically leave room for PRC influence over the algorithm,” including “how the system evolves.”
Similar concerns have been expressed about the Chinese AI language models developed by DeepSeek, even though their model weights are open to all and can be altered directly by retraining.
It seems likely that, if permitted by the licensing terms, the US entity will seek to remove any Chinese propaganda embedded in the underlying algorithm. But to fully protect US users, the new entity must show its work. It must disclose the licensing terms and allow outside researchers to run tests to determine whether propaganda campaigns continue to operate through the platform.
Additional transparency measures
TikTok already operates transparency centers and issues transparency reports regularly. Presumably, the new joint venture will continue to issue these reports as part of its content moderation function. But under the current system, only “invited guests have the opportunity to see up close how we moderate and recommend content.” Moreover, independent researchers criticize current access as too limited. After being granted access to TikTok data, the Integrity Institute, a non-profit organization of trust and safety professionals, concluded that while TikTok’s transparency centers and reports are good, they do not provide enough access to the granular types of information needed for independent analysis.
This has to change if the new arrangement is to be credible. The solution may be to oblige the joint venture to provide vetted independent researchers with access to assess, among other things, the prevalence of Chinese propaganda on its platform. This system of researcher access could be modeled on the one mandated in Article 40 of the European Union’s Digital Services Act, which protects company intellectual property through strict, legally binding confidentiality obligations on approved researchers.
Going forward, TikTok-issued transparency reports will not be enough. There should be an independent annual report that provides information on the effective separation between ByteDance and the joint venture with respect to the algorithm, along with analysis of the amplification or suppression of content of interest to the Chinese government. There may well be other questions an independent report should analyze, for instance, whether the joint venture tilts politically significant content one way or the other, irrespective of whether the Chinese government is behind it. This might help respond to bias allegations, such as recent charges that TikTok throttled videos about Immigration and Customs Enforcement raids, the late sex offender Jeffrey Epstein, and posts related to the fatal shooting of Alex Pretti in Minneapolis.
Perhaps the joint venture should consider an oversight board along the lines of the one Facebook created five years ago. Yes, there are varying views on how effective that board has been. But such a body could be charged with vetting independent researchers, determining the extent of their data access, and overseeing the issuance of an annual report. It might help provide the credibility the joint venture should want to earn. The board should include both Republicans and Democrats, as well as academics and individuals respected for their impartiality and social media domain knowledge.
Bottom line: The joint venture is potentially promising, but its performance will have to be independently verified.