Code of X’s Algorithm Published

X (formerly Twitter) has now published the source code of the software that selects and evaluates posts for users’ “For You” feed. It is apparent that the feed is largely curated by an AI model whose underlying training data remains undisclosed. In an expert statement for the Science Media Center, Dr. Gregor Wiedemann and other experts explained why the publication is therefore of limited informative value.

Information from the Science Media Center Website

The “For You” feed shows users not only posts from accounts they follow but also posts from other accounts, and it is the default view when opening the platform. Users can manually switch to the “Following” feed, which shows only posts from accounts they follow. How posts are selected for the “For You” feed is unclear, as is the role played by the size of the posting account, the number of likes and reposts, and the account’s similarity to others a user follows. This opacity has led to repeated accusations against the platform: Elon Musk, for example, has been accused of boosting his own reach and that of accounts close to him or his views, while the reach of accounts representing opposing opinions is allegedly restricted.

This criticism is not new. Even under Jack Dorsey, when X was still called Twitter and functioned differently, it was unclear what criteria the algorithm used to amplify or restrict the reach of posts. At that time, it was mainly Republican and conservative users who felt their reach was being restricted.

However, it is questionable whether publishing part of X’s code will provide clarity. First, it is unclear what role the Grok AI model plays in prioritizing posts for the “For You” feed. Although the Grok-1 code is published on GitHub, information about the training data and about the final model integrated into X is missing. This data is sensitive, so its absence is not surprising, but it raises the question of how transparent the functioning of a social media platform or a large AI model can actually be. Moreover, the algorithm likely does not use the published Grok model but a specialized variant.

Additionally, the GitHub release does not appear to include the weightings used to prioritize posts and determine their reach on X, leaving it unclear why certain posts are displayed so prominently.

To gain insight into what has and hasn’t been published and what can be learned about the functioning of the “For You” feed from this publication, SMC asked experts for their initial assessment.

Expert Opinion by Dr. Gregor Wiedemann

Senior Researcher Computational Social Sciences, Leibniz Institute for Media Research | Hans-Bredow-Institut (HBI), Hamburg

“The recently published source code for X’s recommendation algorithm is less transparent than one might initially assume. Earlier versions directly revealed preferences for Elon Musk’s posts and measures favoring Republican over Democratic posts. These preferences could potentially influence the composition of users’ personalized ‘For You’ feeds. The version now available uses a transformer-based ranking model called Grok to control the selection. Biases and preferences for right-wing extremist content and conspiracy theories are no longer directly embedded in the source code but are built into the machine-learned model. During training, the model learned that content generating a lot of attention in the form of likes, replies, or retweets is potentially interesting to others. Populist and right-wing actors achieve a disproportionate share of such interactions through emotional content, insults, toxic language, and disinformation, which quickly elicit reactions ranging from agreement to outrage. To give his platform the right-wing slant that X openly shows to every new user, Elon Musk no longer needs to instruct his programmers to translate his worldview directly into the algorithm. It is enough that his AI has learned that anger keeps people on the platform. This does not facilitate balanced, diverse, and objective discourse in which users democratically argue for the best solutions.”
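The dynamic Wiedemann describes — a model trained on engagement signals implicitly rewarding whatever provokes reactions — can be illustrated with a minimal sketch. The `Post` class, the weights, and the scoring function below are entirely hypothetical; X’s actual model and its weightings have not been published. The sketch only shows the general principle that optimizing for interactions favors reaction-provoking content:

```python
from dataclasses import dataclass

@dataclass
class Post:
    likes: int
    replies: int
    retweets: int

def engagement_score(post: Post, w_like: float = 1.0,
                     w_reply: float = 2.0, w_retweet: float = 3.0) -> float:
    # Invented weights for illustration: active reactions (replies,
    # retweets) count more than passive ones (likes). A model trained
    # to maximize attention effectively learns weightings of this kind.
    return w_like * post.likes + w_reply * post.replies + w_retweet * post.retweets

def rank_feed(posts: list[Post]) -> list[Post]:
    # Posts that provoke the most interactions rise to the top,
    # regardless of whether the reactions express agreement or outrage.
    return sorted(posts, key=engagement_score, reverse=True)
```

Under such a scheme, a post with many replies and retweets outranks one with more likes but fewer active reactions, which is the mechanism by which outrage-driven content gains reach without any explicit political rule in the code.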

Last update: 27.01.2026
