4 K-Pop Agencies Have Promised Strong Action Against Deepfake Idol Porn Sites So Far
In recent days, a serious issue has come into focus in the K-Pop industry as deepfake pornography featuring numerous female idols has surfaced online.
These photos and videos, which use AI deepfake technology to superimpose idols’ faces onto explicit content, have caused significant concern among fans and industry professionals alike.
Several K-Pop agencies have taken a strong stance, vowing to take legal action against those responsible for creating and distributing these materials. Here is how some of the major companies have responded.
1. JYP Entertainment
JYP Entertainment, home to popular girl groups such as TWICE, ITZY, and NMIXX, was among the first to address the situation.
The company expressed its deep concern over the widespread distribution of these AI-generated videos involving their artists. They emphasized that this is a serious violation of the law and declared that they are gathering all necessary evidence to pursue the strongest legal action possible.
JYP assured fans that they will take decisive measures to protect their artists’ rights and will not be lenient in their legal response.
Regarding our response to the spread of deepfake (AI-based synthetic) videos https://t.co/wipV8pJJWv
— TWICE (@JYPETWICE) August 30, 2024
2. ATRP
ATRP — the agency representing former LOONA member and popular solo star Chuu — also released a statement regarding the issue.
“Hello, this is ATRP.
We want to inform you that the recent spread of deepfake videos targeting our artists is not only a violation of their rights but also a very serious societal issue.
We will not stand by and will respond with strong legal actions, without any leniency, to protect the rights of our artists.
Through thorough monitoring, we will continue to collect evidence of the creation and distribution of malicious videos in any form, and we ask that fans report any malicious posts involving our artists to the official email address.
Thank you.”
— ATRP
[📢]
Notice on our response to the spread of illegal videos of our artist 🔗 https://t.co/buveaodPBn #CHUU #츄
— CHUU (@chuu_atrp) August 31, 2024
3. MODHAUS
MODHAUS — the agency representing groups like tripleS and ARTMS — also addressed the growing concern among fans regarding deepfake content.
“Hello, this is Modhaus.
Recently, the issue of illegal deepfake videos has become a significant social problem. We have heard the concerns of our fans and are taking this situation very seriously.
Modhaus is closely monitoring the situation to protect our artists.
We promise to do everything in our power to prevent our artists from being exploited in any way and will respond actively to ensure their protection. We kindly ask for your continued interest and support. Thank you.”
— MODHAUS
A notice about the deepfakes has gone up. It’s great to see such a quick response to things like this…
MODHAUS artist rights infringement reports: report@mod-haus.com pic.twitter.com/C6J9ULjWr8
— 38 (@38ttaku) August 30, 2024
4. FCENM
The agency behind the group ILY:1 issued a stern response after learning that AI-generated videos targeting their artists were being circulated online.
“Hello, this is FCENM.
We have recently learned that AI-generated synthetic videos involving our artists are being distributed online. This behavior is a serious crime that damages the reputation of our artists, and we are taking it very seriously.
We are currently gathering all relevant evidence and working with a professional legal team to pursue strong legal action.
We want to make it clear that we will not tolerate any illegal activities that infringe on the rights and honor of our artists.
FCENM will continue to prioritize the protection of our artists and will take all possible measures to do so.
Thank you.”
— FCENM
[Notice] Guidance on our response policy regarding deepfake (AI-based synthetic) videos involving ILY:1 (240831) https://t.co/2LAPAu68Kj pic.twitter.com/5Qb6zqv5bB
— ILY:1 (@FCENM_ILY1) August 31, 2024