Providing clear insight into where & how an AI generated its results is one of the biggest builders of trust & understanding for users.
Avoiding the "black box" perception is key to users staying with a product long term & using it frequently. Once they know how it works & trust its results, they can feel comfortable exploring new use cases beyond their initial experiences.
Citations matter to all users: they are fundamental to making a user feel comfortable utilising an AI's output, for which they are still likely accountable.
Citations also provide a clear path forward for a user to investigate sources directly or provide their own documented references.
Citing a result's sources is clearly crucial, but citations can be a complex set of UI elements to present. Consider how many citations appear in the regular user flow & how important they are.
This example shows two common approaches. The first simply appends the cited sources after the result. The second displays an inline citation indicator which opens a sidebar showing the citations. The sidebar offers more real estate to provide more information per citation; however, the appended citations are more easily accessible.
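Either approach needs the raw response split into text and citation references before rendering. A minimal sketch, assuming the model emits inline markers like `[1]` (the data shapes here are assumptions, not a specific framework's API):

```typescript
// Hypothetical shape of a resolved source; not from any particular API.
interface Citation {
  id: number;
  title: string;
  url: string;
}

// A rendered response alternates plain text with citation references.
type Segment =
  | { kind: "text"; value: string }
  | { kind: "citation"; id: number };

// Split a response containing markers like "[1]" into segments,
// so the UI can render either an appended source list or
// clickable inline indicators from the same data.
function splitCitations(response: string): Segment[] {
  const segments: Segment[] = [];
  const marker = /\[(\d+)\]/g;
  let last = 0;
  for (const match of response.matchAll(marker)) {
    if (match.index! > last) {
      segments.push({ kind: "text", value: response.slice(last, match.index) });
    }
    segments.push({ kind: "citation", id: Number(match[1]) });
    last = match.index! + match[0].length;
  }
  if (last < response.length) {
    segments.push({ kind: "text", value: response.slice(last) });
  }
  return segments;
}
```

The appended pattern would render only the `text` segments followed by a source list; the inline pattern would render each `citation` segment as an indicator that opens the sidebar.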
Depending on the number of citations required, you can pick one pattern that works or switch between them dynamically.
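The dynamic option can be as simple as a threshold: append sources while they fit comfortably, and fall back to the sidebar when they would crowd the result. A minimal sketch; the threshold and pattern names are assumptions to be tuned per product, not a standard API:

```typescript
type CitationPattern = "appended" | "sidebar";

// With only a few citations, appending them keeps sources immediately
// visible; past an assumed cutoff, the inline-indicator + sidebar
// pattern avoids crowding the result.
function pickCitationPattern(
  citationCount: number,
  maxAppended: number = 3, // hypothetical cutoff; tune per product
): CitationPattern {
  return citationCount <= maxAppended ? "appended" : "sidebar";
}
```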