399 Yuan "AI Resurrection" of Loved Ones: What Legal Risks Are Hidden?
Reading Guide
As generative AI technology becomes more widely adopted, the practice of mourning through “digital resurrection” has gone viral online. Behind it are multiple legal and ethical red lines, including infringements of portrait rights, data security risks, and consumer fraud.
On the eve of the Qingming Festival, a reporter from the Workers' Daily found through an investigation that "digital resurrection" is shifting from a simple emotional outlet toward commercialization: from AI memorial videos made by netizens out of remembrance, to clearly priced customization services sold on e-commerce platforms. Amid this rapid growth, however, problems such as legal infringement, commercial fraud, and data leaks occur frequently. These not only violate the deceased's rights to personal dignity and consumers' lawful rights and interests, but also carry the risk of telecom and online fraud.
Rights to Personal Dignity of the Deceased Are Protected by Law
Recently, the well-known education blogger Zhang Xuefeng passed away. Out of remembrance, some netizens used footage from his past live streams to create AI memorial videos and circulate them online, while others with ulterior motives misappropriated his portrait to spread falsehoods. On March 24, a company under Zhang Xuefeng, Suzhou Fengyue Wanjuan Culture and Books Co., Ltd., announced that it was revoking all previously issued "Portrait Use Authorization Letters" and required counterparties to remove, within 24 hours, all promotional materials and short-video content using Zhang Xuefeng's portrait, video clips, name, and related imagery.
In fact, controversies over the "digital resurrection" of deceased public figures are not without precedent. In October 2025, a deceased well-known tea scholar was "resurrected" by a tea company using AI and featured in a commercial endorsement video. Although the company claimed it had obtained authorization from the scholar's son for promoting tea culture, the scholar's widow stated her clear opposition, arguing that the video used her late husband's image for commercial promotion and was essentially "degrading" and "insulting."
Earlier, some deceased celebrities had also encountered similar situations. Netizens, without the consent of the family members, used materials such as the deceased’s stage performances, interview clips, and private videos from their lifetime. Using AI technology, they “resurrected” them and produced related videos that were widely shared on short-video platforms, provoking strong opposition from family members. Some family members of the deceased had publicly spoken out, clearly stating that such AI videos made without authorization show disrespect for the deceased, and demanded that the relevant platforms immediately take down all related content.
Shijia You, a professor at the School of Law of Renmin University of China, pointed out that the deceased's rights to personal dignity are protected by law, and that infringement is not limited to commercial use. Even an AI "resurrection" video made and publicly disseminated for the purpose of mourning may, absent the consent of the nearest relatives, still infringe the deceased's portrait rights and right to reputation.
Some Merchants Use “Missing-You Marketing” Targeting Older Adults
Ms. Li Huarong (a pseudonym) once commissioned an e-commerce merchant to create a "digital resurrection" video of the long-deceased idol Zhang Guorong. "At the time, I just missed him; I wanted to hear him talk to me." She found a merchant offering a clearly priced customization service: "make a photo talk" for 49 yuan. The merchant asked her only for one photo with clearly identifiable facial features and a 15-second reference audio clip, and could then make the person in the photo "speak" up to 50 words. Throughout the transaction, the merchant never mentioned any legal risks or explained how her data would be handled afterward.
The reporter noted that on multiple e-commerce platforms, products and services labeled "AI resurrection," "make photos talk," and "AI digital humans" are priced anywhere from 10 yuan to several thousand yuan, with results that vary widely. Some merchants market to nostalgia and longing, turning sincere feelings into commodities; some cases involve induced spending or even outright fraud.
Ms. Li Huarong said she had also tried another store's 399-yuan service to "resurrect" her mother. The merchant claimed it could "make my mother appear as if she's right in front of me, interact with me, and talk in video." In the end, however, the video she received differed little from the 49-yuan one she had bought earlier. "Basically, the person in the photo just talked longer, from 15 seconds to 1 minute, but it didn't look like my mother at all. The money simply wasn't worth it."
There are also merchants who target elderly consumers with a weaker ability to discern quality, using the lowest-end generative AI tools to profit at minimal cost.
Professor Shijia You stated that "missing-you marketing" aimed at older adults can, under certain circumstances, constitute consumer fraud. If a merchant passes off a rough, template-based video as delivering effects like "permanent companionship," conceals technical risks such as data leaks, or provides an emotional-comfort service with serious defects, leading older adults to purchase it under a mistaken understanding, that constitutes fraud. He suggested that market regulators establish an algorithm-ethics and data-source filing system for merchants offering "digital emotional services," and conduct special inspections in places where older adults gather, such as nursing communities and funeral service venues, focusing on cracking down on "script-based inducement" and one-sided "overlord clauses" (unfair standard-form terms).
Regulation of Small-Workshop-Style “AI Resurrection” Still Needs Strengthening
Beyond infringement disputes, the photos, audio, and other biometric information users upload to "resurrect" loved ones face extremely high risks of leakage and misuse. If merchants illegally trade in this sensitive data, it can easily flow into black- and gray-market industry chains and be used by criminals for AI face-swap and voice-clone scams. According to reports and typical cases from anti-fraud authorities in multiple regions, some criminals illegally obtain biometric information such as audio and photos from the deceased's lifetime, then fabricate pretexts, such as the deceased having left debts or "appearing in a dream," to request transfers, or carry out scams after winning an elderly person's trust through a virtual persona. Some grieving people have thus suffered a double blow, emotional and financial.
Recalling her experience with the "resurrection" service, Ms. Li Huarong now feels afraid: "The merchant never told me whether they would delete those photos and audio files. I have no idea how the data was ultimately handled."
Professor Shijia You emphasized that data uploaded by users, such as photos and recordings of the deceased, contains sensitive personal information. Once it is misused, the consequences are very serious. Under the Personal Information Protection Law, before a platform processes a deceased person’s personal information, it should obtain consent from the deceased’s nearest relatives. The nearest relatives have the right to exercise rights regarding related information, such as access, copying, correction, and deletion. Platforms need to take measures such as encryption and anonymization to prevent data leaks. After an “AI resurrection” service ends, they should promptly fulfill the deletion obligation and must not retain user data to train other models on their own. The existing “Regulations on the Administration of Deep Synthesis Internet Information Services” impose stronger constraints on major platforms, but for individual developers and small-workshop-style “AI resurrection” services that are widely present on e-commerce platforms, there are still no effective regulatory tools and traceability mechanisms.
For the systemic governance of "digital resurrection" services, Professor Shijia You suggested building a multi-layered framework of "legal bottom line + technical standards + industry self-discipline + ethical review." At the level of private law, citizens should be allowed to make legally binding arrangements in advance for their posthumous "digital persona" through a digital will. At the level of public law, models providing "AI resurrection" services must pass a safety assessment before being put into use, and all AI-generated content must carry a prominent label stating "This content is generated by artificial intelligence." At the industry level, an ethical code of conduct for the "digital resurrection" industry should be formulated, establishing "no commercialization, no public dissemination" as the red line, and digital traceability technology should be applied so that infringing videos can be traced.