Screenshot of the AI-generated Christopher Pelkey video. YouTube
For two years, Stacey Wales kept a running list of everything she wanted to say at the sentencing hearing for the man who killed her brother in a road rage incident in Chandler, Arizona.
But when she finally sat down to write her statement, Wales was stuck. She struggled to find the right words, yet one voice was clear: her brother's.
"I can't help but hear him," Wales told NPR.
That's when the idea came to her: use artificial intelligence to create a video of her late brother, Christopher Pelkey, addressing the court, and in particular the man who fatally shot him at a red light in 2021.
Wales stood up in court on Thursday to play the video, in what AI experts say may be the first time the technology has been used in the U.S. to create an impact statement delivered by an AI rendering of the deceased victim.
[embed]https://www.youtube.com/watch?v=cms-_8etnts[/embed]
Wales had been thinking about her victim impact statement ever since the first trial in 2023. The case was retried in 2025 because of procedural issues with the first trial.
The chance to speak in court meant a great deal to Wales, who had held back her emotions during both trials to avoid influencing the jury.
"You're told you that you can't react, you can't act, you can't cry," she said. "We look forward to (sentence) because we can finally react."
Wales' attorney told her to humanize Pelkey and paint a full picture of who he was.
So Wales set out to do just that. She said she reached out to people from across Pelkey's life, from his elementary school teachers to his high school prom date to the soldiers he served alongside in Iraq and Afghanistan.
A photo of Christopher Pelkey walking his sister Stacey Wales down the aisle at her wedding. Stacey Wales
Wales collected 48 victim impact statements in all, not counting her own. When it came time to write hers, she was torn between voicing her true feelings and saying what she thought the judge wanted to hear.
"I don't want to stand there and say, 'I forgive you,' because I don't, I haven't." "The dichotomy is that I can hear Chris's voice and he's like, 'I forgive him.'"
According to Wales, Pelkey's mantra was always to love God and love others. He was the kind of person who would give you the shirt off his back, she said. When she tried to find the right words for herself, Wales said, writing from his perspective came naturally.
She added: "I know his position and it's very clear to me what he's going to say."
That night, Wales turned to her husband, Tim, who had experience using AI.
"He didn't get a say. He didn't get a chance to speak," Wales recalled telling her husband, referring to her brother. "We can't let that happen. We have to give him a voice."
Tim and their business partner, Scott Yentzer, had just a few days to make the video. The challenge: there was no single program built for a project like this. They also needed a long, clear audio clip of Pelkey's voice and a photo of him looking straight into the camera, neither of which Wales had.
Still, Wales' husband and Yentzer used several AI tools to create a convincing video from a roughly 4.5-minute clip of Pelkey, a photo of him from his funeral, and a script Wales had prepared. They digitally removed the sunglasses perched on Pelkey's hat and trimmed his beard, both of which caused technical problems.
Wales, who was heavily involved in making sure the video stayed true to life, said it was especially hard to recreate her older brother's laugh, since most of the clips of Pelkey were filled with background noise.
The experience led Wales to reflect on her own mortality. So one night, she walked into her closet and recorded a nine-minute video of herself talking and laughing, just in case her family ever needs a clean recording of her voice one day.
"Thinking like this with your own mortality rate is a strange experience, but you never know when you won't be here," she said.
The night before the sentencing hearing, Wales called her victims' rights attorney, Jessica Gattuso, to tell her about the video. Gattuso told NPR she was initially hesitant about the idea, since she had never heard of it being done in an Arizona court. She also worried the video might not be well received. But after watching it, she felt compelled to show it in court.
"I know this will have an impact on everyone, including the shooter, because it's a message of forgiveness," Gattuso said.
Ten people spoke in support of Pelkey at the sentencing hearing. The AI-generated video played last.
"Hello. Just to be clear for everyone seeing this, I am a version of Chris Pelkey recreated through AI," the AI avatar said.
The video went on to thank everyone in Pelkey's life who contributed impact statements and attended the hearing. Then it addressed his shooter, Gabriel Paul Horcasitas.
The video says: "It's a pity that day we met each other in this situation. In another life, we might be friends. I believe in the God of forgiveness and forgiveness. I always have, and I still do it."
The video ended with the avatar encouraging everyone to love one another and live life to the fullest. "Well, I'm going to go fishing now. Love you all. See you on the other side," it concluded.
Neither the defense nor the judge objected to the video. Later in the hearing, Judge Todd Lang said: "I loved that. Thank you."
A photo of Christopher Pelkey. Stacey Wales
He added: "It shows the family because you told me how angry you were, you asked for a maximum sentence. Even thinking that was what you wanted, you made Chris speak from the bottom of his heart. I saw it. I didn't hear him asking for a maximum sentence." Horcasitas received 10.5 years for Manslugher.
Wales said she hadn't realized how deeply the video would affect her and her family. For her teenage son, it was a chance to hear his uncle say goodbye. For Wales, it gave her the strength to finally look back at photos of her brother.
"Going through this process of the AI and how he sounded, trimming his beard and inserting the laughter and all of these other things was very cathartic and part of the healing process," she said.
In recent years, a growing number of cases have tested the boundaries of AI's role in the courtroom.
In 2023, for example, former President Trump's onetime lawyer Michael Cohen unknowingly sent his attorney AI-generated citations to court cases that didn't exist. And just last month, a man tried to argue his case in court using an AI-generated lawyer avatar, an effort the judge quickly shut down.
But Maura Grossman, a professor at the University of Waterloo who has studied AI's applications in criminal and civil cases, said she did not see any major legal or ethical issues with the use of AI in Pelkey's case.
"Because it's in front of a judge, not in front of a jury, and since the video is not as evidence itself, its impact is even more limited."
Some experts, including Grossman, predict that generative AI will become increasingly common in the legal system, raising a range of legal and ethical questions. When it comes to victim impact statements, the key concerns include consent, fairness, and whether the content is offered in good faith.
"This victim statement that really tries to represent the voice of the deceased is probably the most incredible use of AI to create fake videos or statements," Gary Marchant, professor of law, Sandra Day O'Connor School of Law, wrote in an email.
He added: "Many other attempts to create deepfakes with AI will be far more malicious."
Wales herself cautions anyone who might want to follow in her footsteps to act with integrity and not be driven by selfish motives. "I could have been selfish with it," she said. "But it was important not to shut any person or group out."