Tanzania has been advised to take seriously the issue of content authenticity in journalism by tracking and verifying the origin, history, and ownership of digital content (content provenance), a practice seen as important in the fight against the misuse of Artificial Intelligence (AI).
It has been stressed that, given the challenges arising from AI technology, especially the emergence of new-generation tools such as deepfakes, it is very important to take precautions and, ultimately, concrete action to combat the problem.
“This is why content authenticity is important in today's technological environment,” said Mrs. Amy Larsen, an expert on these matters from the United States.
Mrs. Larsen was speaking at a panel discussion on the topic “Civic Participation, Media Awareness, Deepfakes, and Content Authenticity in the Age of AI,” organized by the American Embassy in Dar es Salaam and attended by social media content creators from the Tanzania Bloggers Network (TBN) and The Chanzo, held at the American Center, National Museum, Shaaban Robert Street, Dar es Salaam.
Mrs. Larsen, Director of Strategy and Business Management on Microsoft's Democracy Forward team, said it is necessary to fight misinformation because the spread of AI-generated content, such as deepfakes, has made it easier for malicious actors to create fake images, videos, and voices that can mislead and deceive people.
The expert, who manages programs on cybersecurity, misinformation and fake news, election integrity, the protection of local journalism, and the promotion of corporate civic participation, said 2024 and 2025 are pivotal years for democracy and technology, given that large numbers of people around the world will be voting for their leaders while the rapid development of AI brings both opportunities and serious challenges.
She said one of the biggest concerns is the misuse of AI to create “deepfakes”: realistic but fake videos, audio, and photos that can deceive voters by altering the appearance, voice, or actions of politicians.
“This underscores the wide gap between the promise of technological innovation and its risks, especially in democratic processes,” Mrs. Larsen emphasized.
Speaking about the immediate steps being taken to address the challenge, Mrs. Larsen explained that 20 technology companies have come together to establish a new initiative, the Tech Accord to Combat Deceptive Use of AI in 2024 Elections, launched recently at the Munich Security Conference in Germany.
She said the main goal of the accord is to fight the use of AI-generated deepfakes that can manipulate public opinion and jeopardize the integrity of elections.
“This effort is non-partisan and respects freedom of expression; it aims to ensure that voters can make informed decisions without being misled by fake, AI-generated information,” she said.
She said these efforts will help journalists and media organizations that depend on honesty and integrity: by establishing a transparent chain of provenance, newsrooms can confirm that the pictures and videos they publish are genuine and have not been altered.
This will help protect the integrity of the information they provide and ensure that their audiences can trust what they receive. In an age when fake news is a serious problem, content authenticity can serve as a safeguard for journalistic standards.
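Content-provenance systems work by cryptographically binding a piece of media to a record of its origin so that any later alteration can be detected. As a greatly simplified, hypothetical illustration of that idea, not something presented at the event, the Python sketch below merely checks whether a media file still matches a digest its original publisher made available; the file path and digest shown are placeholders.

```python
import hashlib

def sha256_of_file(path: str) -> str:
    """Compute the SHA-256 digest of a media file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def matches_published_digest(path: str, published_digest: str) -> bool:
    """Return True only if the file is byte-for-byte identical to the
    version whose digest the original publisher made available."""
    return sha256_of_file(path) == published_digest.lower()

# Hypothetical usage: compare a received clip against a publisher's digest.
# print(matches_published_digest("report_clip.mp4", "3a7bd3e2360a..."))
```

Full provenance standards go further than this sketch, attaching signed metadata about who created the content and how it was edited, but the underlying principle is the same: any tampering after publication becomes detectable.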
Mrs. Larsen called on Tanzania to remain vigilant at a time when AI technologies continue to advance and the importance of content authenticity continues to grow.
There is, she said, a great need for Tanzania to participate fully in this new security initiative, the Tech Accord to Combat Deceptive Use of AI in 2024 Elections, which provides an important framework for verifying authenticity, protecting against misleading information, and maintaining trust in digital interactions.
“By investing in strong content authenticity systems, we can ensure that the benefits of AI are realized while reducing the risks associated with the misuse of this technology,” Mrs. Larsen said.