Privacy online Archives - Thomson Reuters Institute
https://blogs.thomsonreuters.com/en-us/topic/privacy-online/
Thomson Reuters Institute is a blog from Thomson Reuters, the intelligence, technology and human expertise you need to find trusted answers.

The child exploitation crisis online: Gaps in digital privacy protection
/en-us/posts/human-rights-crimes/children-digital-privacy-gaps/ | Wed, 04 Feb 2026 18:39:04 +0000

Key highlights:

      • Fragmented protection creates vulnerability — Current US privacy laws operate as a patchwork system without comprehensive national standards, leaving children and other users exposed to data exploitation across state lines and international borders.

      • Body data collection opens future manipulation potential — Virtual reality platforms collect granular biometric information through sensors that can reveal deeply sensitive details about users.

      • Use-based regulations outlast technology changes — Restricting harmful applications of data provides more durable protection than the current regulatory approach, which relies on categorizing rapidly evolving data types.


Virtual reality (VR), social media, and gaming companies have long avoided robust content moderation, largely out of concern over implementation costs and the risk of alienating users; platforms want the widest possible pool of users. Yet this shortsighted decision has consequences, including insufficient protection of children and long-term costs to companies' bottom lines.

The child exploitation crisis in digital spaces requires better laws and a reimagining of how VR, gaming, and social media companies balance privacy, safety, and accountability across diverse platform architectures, according to Mariana Olaizola Rosenblat, an expert in child exploitation methods in digital spaces and Policy Advisor at the NYU Stern Center for Business and Human Rights.

Limitations of existing regulatory frameworks

The current regulatory landscape is insufficient to protect children online. The lack of a comprehensive national privacy law in the United States, the use of consent mechanisms, and the haphazard rollout of age verification all expose protection gaps and come with economic and psychological costs, according to Olaizola Rosenblat. For example, some of the dangers include:

Gaps in patchwork of regulations leave children vulnerable — Regulatory demands for child safety often collide with privacy protections, creating contradictory obligations that platforms cannot realistically satisfy. In the absence of unified standards, companies operate in a jurisdictional maze that leaves most users, including children, exposed to data exploitation across borders.

America's regulatory landscape remains especially fragmented, with no comprehensive national privacy law to provide consistent protection. Some state legislation comes close to establishing meaningful safeguards, according to Olaizola Rosenblat, yet it still permits companies to collect data even after users opt out of the sale or sharing of their data.

Mariana Olaizola Rosenblat, of the NYU Stern Center for Business and Human Rights

Federal reform attempts have collapsed amid conflicts between states demanding stronger protections and tech lobbyists aligned with conservative representatives seeking weaker standards. In addition, child-specific laws, such as the Children's Online Privacy Protection Act (COPPA), provide protection only for those under 13, which leaves older minors and adults vulnerable.

"Once users turn 13, they fall off a regulatory cliff," says Olaizola Rosenblat. "There is no federal child-specific data protection regime, and existing state-level safeguards are patchy and largely ineffective for teens."

Internationally, the European Union's General Data Protection Regulation (GDPR), although considered the gold standard for regulation, suffers from a persistent gap between its ambitious text and its uneven enforcement.

Age verification tensions — These regulatory shortcomings are also evident in debates over age verification. Protecting children requires collecting data to determine user age, yet privacy advocates frequently oppose such measures. Without pragmatic guidance acknowledging these inherent trade-offs, platforms often face contradictory obligations they cannot simultaneously fulfill.

Current consent frameworks offer little protection — Current consent mechanisms offer users an illusory choice that fails to protect children from data exploitation. Even relatively robust frameworks like the GDPR rely on consent models in which refusal means exclusion from digital spaces essential to modern life. This approach proves particularly inadequate for younger users. Indeed, one survey found that about one-third of Gen Z respondents expressed indifference to online tracking.

VR data collections may allow future exploitation

VR platforms differ fundamentally from traditional gaming spaces and social media platforms. Users with VR headsets embody avatars that move through thousands of interconnected experiences. While no physical contact occurs, the experiences feel visceral; the psychological and physiological responses can mirror aspects of real-world experiences, including sexual exploitation.

Olaizola Rosenblat explains that the data collected from the sensors can open up the potential for future exploitation. "The inferences that can be drawn from your body-based data collected by these sensors is granular and often intimate," she explains. "The power that gives to companies is pretty remarkable in terms of knowing things about you that you might not even know yourself."
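To make the point concrete, here is a deliberately simplified sketch, entirely ours rather than from the article or any real platform, of how even trivial headset telemetry supports an intimate inference. The 0.93 eye-level-to-height ratio and the sample readings are made-up illustrative values.

```python
# Hypothetical sketch: inferring approximate body height from VR headset
# position samples the user never consciously disclosed. All numbers are
# illustrative; real platforms capture far richer motion data than this.
def inferred_height_m(headset_heights_m):
    """Estimate body height (meters) from average headset (eye-level) height."""
    eye_level = sum(headset_heights_m) / len(headset_heights_m)
    return round(eye_level / 0.93, 2)  # crude assumed eye-level-to-height ratio

print(inferred_height_m([1.58, 1.60, 1.59]))  # 1.71
```

Even this toy statistic illustrates the asymmetry Olaizola Rosenblat describes: a few passive sensor samples yield a bodily attribute most users would consider personal.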

Recommended actions to address challenges

Addressing the child exploitation crisis in digital spaces requires coordinated action, according to Olaizola Rosenblat, and that needs to include:

Universal protection standards — Corporate action in partnership with legislators is necessary for effective reforms that protect all users rather than fragmenting safeguards by age or vulnerability status. Current approaches that shield only younger children create dangerous gaps, leaving adolescents and adults exposed once they age out of protected categories.

Enforce existing regulations — Even well-crafted legislation proves meaningless without robust enforcement mechanisms. Commitment by government agencies, backed by appropriate levels of funding, is the most meaningful way to achieve desired outcomes.

Technology-agnostic use regulation — Rather than attempting to categorize rapidly evolving data types, companies in the VR, gaming, and social media sectors must work with legislators to restrict harmful uses of data, such as manipulation, exploitation, and unauthorized surveillance, regardless of technical collection methods. Regulating data use, rather than the current method of regulating categories of data such as personally identifiable information, is the more durable approach.

Public mobilization is essential — Citizens must understand that the stakes of data exploitation extend beyond corporate collection to include hacking vulnerabilities and manipulative deployment. Without consumer demand for better protection and the willingness of legislators to pass the laws, regulation will not happen.
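The use-based regulatory model recommended above can be sketched in a few lines. This is an illustrative toy, not any real regulatory framework: the three prohibited uses come from the article, while the function name and the other labels are our own assumptions.

```python
# Use-based rule sketch: the decision looks only at what is being DONE with
# the data, never at what category the data falls into or how it was collected.
PROHIBITED_USES = {"manipulation", "exploitation", "unauthorized_surveillance"}

def allowed_under_use_based_rule(data_use: str) -> bool:
    # Technology-agnostic by construction: new sensor types or data categories
    # require no rule changes, because only the intended use is evaluated.
    return data_use not in PROHIBITED_USES

print(allowed_under_use_based_rule("product_improvement"))        # True
print(allowed_under_use_based_rule("unauthorized_surveillance"))  # False
```

The design point is that the rule set stays stable as data types evolve, which is exactly the durability argument the article makes against category-based regulation.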

The path forward

The digital exploitation of children demands immediate action that transcends partisan divides and corporate interests. Only through coordinated regulatory reform, meaningful enforcement, and sustained public pressure can we create digital spaces in which innovation thrives without sacrificing our privacy and safety. The cost of continued inaction grows steeper each day we delay.


You can find out more on how organizations and agencies are fighting child exploitation here

New US law emerges to fight "revenge porn" amid surge in reported cases
/en-us/posts/human-rights-crimes/revenge-porn-law/ | Thu, 05 Jun 2025 13:40:56 +0000

Over the past 15 years, the convergence of increasing social media use and ubiquitous mobile devices has sparked a new area of crime known as image-based sexual abuse (IBSA), or non-consensual intimate imagery — also known as revenge porn and digital sextortion. Indeed, recent studies suggest this area of illegal activity has jumped by double digits year-on-year.

In the United Kingdom, for example, the Revenge Porn Helpline reported an increase in cases in 2023 compared to the previous year. Sextortion remained the predominant concern, making up more than one-third (34%) of all cases. Overall, the data reveals a 54% increase in sextortion cases compared to 2022, highlighting a concerning pattern that hasn't been seen since 2021.

In the United States, the data is a little dated, but it too suggests double-digit increases: surveys conducted three years apart found a notable rise in the share of US adults reporting being victims of nonconsensual pornography.

Given the rise of generative AI used to create deepfakes, it is easy to surmise that cases of IBSA will continue to increase for some time.

Improvements in legal landscape on the horizon in the US

Obtaining justice for victims of IBSA around the world is no easy task. As of 2018, only a minority of countries had laws concerning IBSA. In addition, prosecuting offenders in the US has, until now, relied on a patchwork of local and state laws; 48 states plus the District of Columbia and Guam have enacted their own laws addressing the crime.

The good news is that change is on the horizon with the Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act, or the TAKE IT DOWN Act, which was passed on April 28 and signed into law in late May.

The new law prohibits the intentional disclosure of nonconsensual intimate visual depictions, including digital forgeries. It amends Section 223 of the Communications Act of 1934 to criminalize the publication of such depictions without consent, especially if the images are intended to harm or result in harm, including psychological, financial, or reputational damage.


Join us for a free online Webinar: World Day Against Trafficking in Persons to learn more about the complexities of human trafficking, the impact on victims, and effective strategies for prevention and intervention


The Act also defines key terms, such as "identifiable individual" and "intimate visual depiction," which are important concepts for enforcing the law. For example, an identifiable individual is someone who appears, in whole or in part, in an intimate visual depiction and whose face, likeness, or other distinguishing characteristic (such as a unique birthmark or recognizable feature) is displayed in connection with the depiction. An intimate visual depiction carries the meaning given in section 1309 of the Consolidated Appropriations Act, 2022, which includes visual representations involving nudity or sexually explicit conduct.

In addition, the law establishes penalties for offenses involving both adults and minors. For offenses involving adult victims, violations can result in fines or imprisonment of up to two years, while for offenses involving minors, the penalties increase to fines or imprisonment of up to three years. Exceptions to the prohibition include lawful activities by law enforcement and disclosures made in good faith for legitimate purposes such as legal, medical, or educational needs.

The Act also requires covered platforms, including websites, services, or applications serving the public with user-generated content, to establish a notice-and-removal process for nonconsensual depictions. Platforms must remove such content within 48 hours of a valid request and are protected from liability for good faith removal actions. The U.S. Federal Trade Commission is empowered to enforce compliance, treating violations as unfair or deceptive acts.
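As a rough illustration of the compliance mechanics, the 48-hour removal window reduces to a simple deadline computation. This sketch is ours: the function names and workflow are assumptions for illustration, and only the 48-hour figure comes from the Act as described above.

```python
from datetime import datetime, timedelta, timezone

# The Act's removal deadline for a valid takedown request (per the article).
REMOVAL_WINDOW = timedelta(hours=48)

def removal_deadline(received_at: datetime) -> datetime:
    """Deadline by which a valid removal request must be actioned."""
    return received_at + REMOVAL_WINDOW

def is_overdue(received_at: datetime, now: datetime) -> bool:
    """True once the 48-hour window has elapsed without removal."""
    return now > removal_deadline(received_at)

received = datetime(2025, 6, 1, 9, 0, tzinfo=timezone.utc)
print(removal_deadline(received))  # 2025-06-03 09:00:00+00:00
print(is_overdue(received, datetime(2025, 6, 4, tzinfo=timezone.utc)))  # True
```

A platform's real queue would also track request validity and removal confirmation, but the core obligation is this fixed clock starting at receipt of a valid request.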

More action to protect potential victims needed

Research shows the negative impacts on victims are significant. Among IBSA victims, 93% experienced considerable emotional distress, and 82% faced substantial challenges in social, work, or other vital aspects of their lives, according to one study. Meanwhile, just over half (51%) of victims revealed they had contemplated suicide. Additionally, 55% were concerned about their professional reputation being damaged due to IBSA, and 39% reported that it had negatively impacted their career.

These negative implications point to the need for additional measures beyond legal avenues to safeguard potential victims. One area in which there is a large need is building awareness among adults under 40, the age group involved in a substantial share of IBSA cases.

Likewise, increasing education for caregivers and parents in order to better prevent their children from becoming victims is also critical. Resources such as Thomson Reuters' Safe Settings campaign and awareness efforts from the U.S. Department of Homeland Security are great examples of awareness-building campaigns.

In particular, caregivers should protect their children by teaching them about the risks of sharing personal or inappropriate content online and the lasting nature of digital information. They should also guide them on how to set up privacy controls on mobile apps, recognize online predators, and find trusted adults for support. Parents should also stress the consequences of sexting and cyberbullying and emphasize that sharing sexual abuse material is illegal.

The TAKE IT DOWN Act represents a significant step forward in the fight against IBSA and digital exploitation, offering new legal avenues to protect victims and hold offenders accountable. However, legal measures alone are not enough, and continued efforts in education and awareness are essential to prevent future occurrences, empower potential victims, and foster a safer online environment for everyone.


You can find out more about the ways to strengthen online privacy rights here

Protecting children's privacy online: How to harmonize federal & state laws to ensure internet safety
/en-us/posts/human-rights-crimes/harmonizing-laws/ | Wed, 21 May 2025 17:03:57 +0000

In 2022, about 1.7 million children were victims of a data breach, meaning they had personal information exposed or compromised. In addition, 90% of parents told Pew Research Center that they were concerned about others having access to their children's personal information.

Safeguarding children’s data and online privacy is challenging due to the existing fragmented legal framework, which consists of various federal and state laws with differing methods and restrictions. Even so, there are ways to address these gaps, says , Partner in the cybersecurity and data privacy litigation practice at Mayer Brown.

Understanding the current federal and state legal landscape

"The current legal landscape aiming to protect children's data and online privacy is a complex patchwork of federal and state laws, each with distinct approaches and limitations," says Thomson. At the federal level, the Children's Online Privacy Protection Act (COPPA) is the cornerstone legislation and is enforced by the U.S. Federal Trade Commission (FTC). COPPA primarily targets websites and online services directed at children under 13 years of age and mandates parental consent for the collection, use, and disclosure of personal information. Despite its foundational role, COPPA has faced criticism for its limited age scope and challenges in enforcement.

On the state level, Thomson notes that there has been a notable surge in initiatives to enhance children’s privacy protections. California, for example, leads with the California Consumer Privacy Act and its successor, the California Privacy Rights Act (CPRA), which extend privacy safeguards to minors under 18. This trend has inspired other states to enact similar laws, focusing on regulating children’s data, particularly in connection with social media.




These state laws often include provisions for age-appropriate design codes and “harmful content age verification” laws, which aim to shield children from potentially damaging online content. However, these efforts sometimes face opposition on grounds of infringing on free speech rights, highlighting the ongoing tension between privacy protection and other legal considerations.

At the federal legislative level, efforts to strengthen children’s online safety have seen mixed success. Initiatives like the Kids Online Safety Act have been proposed to address broader online safety issues, although many such efforts have not yet been passed into law. Recent U.S. Senate hearings have continued to highlight the need for comprehensive federal action.

The challenge remains to harmonize these federal and state efforts to ensure consistent and effective protections for children’s data privacy across the United States. Such enhancements to current protections could include standardizing age definitions, increasing parental control, imposing stricter penalties for non-compliance, and improving education and awareness about online privacy risks. These measures, combined with potential international collaboration, could help close existing gaps and create a more cohesive legal framework to protect children online.

Areas of commonality and divergences

The current legal landscape protecting children’s data and online privacy reveals several important commonalities across jurisdictions. Most prominently, there is widespread recognition that children deserve special privacy protections beyond those afforded to adults, according to Thomson. For example, laws at both federal and state levels requiring parental consent mechanisms for data collection from younger users demonstrate this special protection.

Another common thread is the growing emphasis on privacy by design principles, which requires online services to build child safety and privacy considerations into their products from inception rather than as an afterthought. Additionally, there is increasing consensus that certain exploitative design features which may target children should be restricted, with many laws limiting data retention periods and collection practices.

Despite these commonalities, Thomson points out that significant divergences create a fragmented regulatory environment. Perhaps most problematic is the inconsistent definition of "child" across jurisdictions. Indeed, COPPA applies only to children under 13, while state laws like California's CPRA extend protections to minors under 18. This creates compliance challenges for companies operating across multiple states.
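The age-definition mismatch can be seen in a small sketch. The thresholds (under 13 for COPPA, under 18 for California's CPRA) come from the discussion above; the function and labels are our own illustration, not legal guidance.

```python
# Age caps as described in the article; dictionary labels are illustrative.
PROTECTION_AGE_CAPS = {
    "COPPA (federal)": 13,    # applies to children under 13
    "CPRA (California)": 18,  # extends safeguards to minors under 18
}

def applicable_protections(age: int) -> list:
    """Which child-privacy regimes still treat a user of this age as protected."""
    return [law for law, cap in PROTECTION_AGE_CAPS.items() if age < cap]

print(applicable_protections(12))  # ['COPPA (federal)', 'CPRA (California)']
print(applicable_protections(15))  # ['CPRA (California)'] only
print(applicable_protections(18))  # []
```

The same 15-year-old is a protected minor in one jurisdiction and an unprotected adult-equivalent under federal law, which is exactly the compliance gap Thomson describes.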




Another key divergence lies in the scope of covered entities. While some laws apply only to child-directed services, others extend to general audience websites that are likely to be accessed by children. Enforcement mechanisms also vary, with some laws relying primarily on regulatory action while others provide private rights of action.

These inconsistencies create regulatory gaps that sophisticated companies and bad actors can exploit, which clearly underscores the need for more harmonized approaches to children’s data protection that can keep pace with rapidly evolving technologies and business models that target young users.

How to close the fragmented legal landscape

To strengthen protections for children's data privacy and close existing gaps, Thomson explains that a comprehensive approach at both the federal and state levels is necessary, with specific steps including:

Establish a consistent age definition — A uniform age definition should be established across all jurisdictions to ensure consistent application of privacy protections. This would address the current discrepancies under which federal and state laws operate.

Improve monitoring tools for parents — Enhancing parental control mechanisms, such as developing more user-friendly tools, would allow parents to monitor and manage their children's online activities effectively.

Expand the scope of protections of personal information — Specific efforts to reduce the exploitation of children online should include expanding the definition of personal information to encompass biometric data, reflecting the growing use of such data in digital services.

Improve transparency to parents — Require companies to provide clear, detailed disclosures about their data collection practices and any third-party sharing. This would help parents and guardians make informed decisions about their children's digital interactions.

Strengthen consistent protection across geographies — Establishing global standards for children's data privacy through international collaboration can also play a significant role in providing consistent protection across borders.

The splintered legal landscape protecting children's data privacy creates regulatory gaps that sophisticated companies and illicit actors can exploit. As the digital world continues to evolve, it is imperative that lawmakers and regulatory bodies work together to establish a more cohesive and comprehensive framework for protecting children's online privacy, one that prioritizes their safety, well-being, and rights in the face of increasingly complex technological advancements.


You can find out more about how organizations and individuals can fight against child exploitation both online and in the real world here

Study highlights need for new tools, tactics & frameworks to combat child exploitation & trafficking
/en-us/posts/human-rights-crimes/new-tools-tactics/ | Thu, 17 Apr 2025 02:11:28 +0000

Thanks to the proliferation of online access around the world, the exploitation of children, especially on the internet, is getting worse. In fact, an analysis of unidentified children revealed that more than 60% of unidentified victims were prepubescent, including infants and toddlers, and 84% of images on the database contained explicit sexual activity, according to a joint report published by INTERPOL and ECPAT International in February 2018.

Since then, children have accounted for 38% of trafficking victims detected globally, and since 2019 there has been an increase of approximately 38% in recorded child victims, according to a more recent global report.




To combat this growing issue, the HEROES Project, which is part of the Horizon 2020 Research and Innovation program funded by the European Commission, was launched as a cross-jurisdictional initiative aimed at addressing human trafficking and child sexual abuse, including that which occurs online.

According to Hainzl, an anti-trafficking expert at the International Centre for Migration Policy Development (ICMPD), the project brings together a consortium comprising 24 partners in 17 countries, including international organizations, civil society groups, research institutes, and educational institutions. ICMPD's work is focused on research and prevention strategies developed with partner organizations in Spain, the United Kingdom, Bangladesh, and Colombia. These four countries were selected to provide varied perspectives and approaches to tackling these issues.

Study reveals common factors

The HEROES Project identified several common risk factors across the countries studied that place children at risk of online sexual exploitation and abuse. Some of these factors include:

Environments with emotional neglect — Children from dysfunctional family backgrounds who lack emotional support consistently appear most susceptible to exploitation, according to Hainzl. Adolescents and teenagers represent a particularly vulnerable demographic due to their developmental stage and increasing online presence. While girls are targeted more frequently, boys are not exempt from these dangers. Similarly, children with psychological or emotional challenges, including those experiencing isolation or loneliness, also face heightened risk as they may seek connection online, according to the project's findings.

Perpetrators using similar tactics to exploit victims online — Research on child sexual abuse reveals important distinctions between online and offline exploitation patterns. Offline, child sexual abuse is often perpetrated or initiated by individuals close to the child, including family members or acquaintances, and may correlate with socioeconomic factors. By contrast, online harassment often involves strangers contacting numerous potential victims simultaneously and pursuing those who respond. This distinction highlights the different mechanisms of exploitation that protection frameworks must address.

Economic vulnerability — Financially disadvantaged households also saw higher levels of exploitation across the studied countries. Poverty and economic inequality create conditions in which children and families become vulnerable to various forms of abuse and exploitation.

Gaps in implementation of legal frameworks — While legal frameworks often exist on paper, inadequate implementation remains a critical issue, with authorities frequently lacking resources or training to enforce existing laws. Underdeveloped protection and migration services further compound these problems by failing to identify potential victims or provide adequate support.

Legal and judicial challenges

The rapid evolution of online technologies and reliance on digital tools by a growing number of people across the globe have brought about a new wave of child exploitation challenges for law enforcement agencies and judicial systems worldwide. As the internet knows no borders, the complexities of investigating and prosecuting online crimes that involve multiple jurisdictions, varying legal frameworks, and ever-changing digital landscapes have created significant obstacles for authorities.

Also, one of the most pressing issues is the inconsistency in terminology and legal definitions related to child sexual abuse among countries. "There is a problem that is noticed in the terminology being different, especially when it comes to cross-border cases," says Hainzl, adding that this creates confusion in prosecution efforts. And while many countries have legislation addressing human trafficking and child sexual abuse, these laws often lack provisions specifically tailored to the online aspects of these crimes.

Further, law enforcement agencies face numerous technical obstacles when investigating online exploitation. Encrypted communications, rapidly changing online behaviors, and difficulties in obtaining and preserving digital evidence all impede successful investigations. "Law enforcement should be able to respond to this very quick and very often, but it is not possible because of a lack of knowledge on current and changing tactics and tools, technology, and equipment," explains Hainzl.

The global fight continues

The global fight against online child sexual abuse requires coordinated international action across multiple fronts. One fundamental starting point would be the development of a common international definition for online child sexual abuse, similar to how the UN Palermo Protocol established a unified understanding of human trafficking, Hainzl says.

In addition, comprehensive training programs for law enforcement and all responsible professionals are essential along with efforts to raise public awareness about online risks. Educational initiatives must target children directly, teaching them to recognize potential dangers in online interactions.

The corporate sector bears significant responsibility in this fight because its platforms often facilitate exploitation. Already, proactive identification systems using algorithms to detect potential cases represent a shift from purely reactive approaches to prevention-focused strategies. And developing regulatory frameworks similar to existing corporate sustainability due diligence requirements could hold companies accountable for monitoring and preventing exploitation on their platforms, says Hainzl.


You can find more on this topic in the Thomson Reuters Institute's Human Rights Crimes Resource Center

How technology is essential to protecting children from predators
/en-us/posts/human-rights-crimes/technology-protecting-children/ | Wed, 18 Dec 2024 13:03:18 +0000

Every child deserves a safe childhood. Yet each year, countless children around the world go missing or are put in dangerous and vulnerable situations. "There is no corner of the world not touched by these issues," says Michelle DeLaune, President and CEO of the National Center for Missing and Exploited Children (NCMEC). "We remain committed to finding new and better ways to reach the people who need us most."

Founded in 1984, NCMEC is the nation's largest and most influential child protection organization, dedicated to finding missing kids, stopping child sexual exploitation, and preventing crimes against children. Fulfilling that mission requires a partnership among stakeholders across society. "Protecting children is everyone's responsibility. It requires both the public and private sector working together, bringing their expertise, tools, and resources to make a difference," DeLaune says.

In collaboration with law enforcement agencies, families, and child welfare organizations, NCMEC has contributed to the recovery of missing children in more than 400,000 cases.

Unfortunately, child exploitation has reached alarming new heights, fueled partly by the widespread use of social media and advanced technology. Paradoxically, these same digital tools are crucial in combating this issue. NCMEC harnesses online data and analytics to provide vital insights, recognize patterns, and develop global initiatives aimed at safeguarding children. This tech-driven approach enables NCMEC to stay at the forefront of child protection efforts worldwide.

Bringing critical information to light

Indeed, AI-driven technology enables data analysis tools that can comb through copious amounts of data to identify patterns and trends in child exploitation and abduction cases. This helps in pinpointing potential risks and focusing resources where they are most needed. "We need technology that enables us to connect the dots, target cases where there is the most urgent risk to children, and act on them quickly," DeLaune explains. "We have so much data coming in — more than human beings can sift through to surface the right information. It's the proverbial needle in a haystack; but in this haystack, the needles we are searching for are children in need of assistance."

NCMEC has partnered with Thomson Reuters (TR) for 15 years, and according to DeLaune, partnerships with TR and others give the organization access to tools that "help us do our job faster and get the right information out to law enforcement and the public. That's been a game changer."

Also, NCMEC analysts use technology on a daily basis to support missing and exploited child investigations, according to Angela Aufmuth, Executive Director of Analytical Programs at NCMEC. "When a report on a missing or exploited child comes into the Center, time is of the essence. We need to analyze data rapidly," Aufmuth says. "With technology, we're able to access data from multiple sources, quickly putting together pieces of the puzzle. This enables us to identify possible locations and persons of interest and get that information out to law enforcement agencies so they can perform their investigations in the field."

Technology can focus investigations

Of course, access to public records is vital. Technology allows NCMEC to quickly sift through data, such as records of deaths, to focus law enforcement efforts more effectively. Whether working on a missing child case, a case of suspected sex trafficking, a noncompliant sex offender, or a child abduction, having access to public records information is absolutely critical, explains Aufmuth.

"Technical solutions have given us the ability to access different types of data very quickly," she says, adding that data on deaths, for example, has been very useful. "When conducting noncompliant sex offender operations, law enforcement will give us a large amount of information," Aufmuth notes. "We're then able to batch that data against the technical resources, quickly identifying individuals who are deceased. This enables law enforcement to conduct investigations in a more focused and efficient manner."

At NCMEC, the focus is entirely on supporting children and their families as they navigate unimaginable hardships. "Having access to these technology tools is absolutely essential to our work. If we did not have access to the data and analysis they provide, we would not be able to support the families we serve and help bring their children home," Aufmuth explains.

DeLaune agrees, adding: "NCMEC is a lifeline for families searching for a child or helping them rebuild their life after an exploitation issue. And they are grateful for anyone who can enhance our ability to serve them. The families are counting on NCMEC, and in turn, NCMEC counts on technology partners to deliver results."


You can find more information here.

Awareness of caregivers is key to mitigating online child exploitation via "safe settings"

Mon, 02 Dec 2024

Tragically, a record number of children today are being exploited online, and no child is immune from falling victim to sexual abuse. Online child exploitation can take various forms, including the distribution of child sexual abuse materials, sextortion schemes, and child sex trafficking.

Sextortion is a form of online blackmail in which children are tricked online into sending intimate pictures of themselves to people who then threaten to distribute the sexual images unless the victim complies with their demands, often money or additional explicit images.

Some of the facts about sexual exploitation among children are stunning, including that:

      • globally, more than 300 million children under the age of 18 have been affected by online child sexual exploitation and abuse in the last 12 months alone, according to the Childlight Global Child Safety Institute;
      • there were more than 36 million reports of suspected online child sexual exploitation in the United States alone in 2023, according to the U.S. National Center for Missing & Exploited Children (NCMEC); and
      • between 2021 and 2023, reports of online enticement of children in the US, including financial sextortion, rose by more than 300%, according to NCMEC.

Compounding the threat, the growing use of advanced content-creation technology like generative AI (GenAI) to produce material depicting child sexual abuse is further fueling the risks to children. These include the use of AI-generated abuse images to extort real images from children, as well as AI-manipulated images of real children made to appear as though the child is nude or engaged in sexual acts.

Raising awareness among caregivers is critical

Education campaigns, such as the U.S. Department of Homeland Security's and Thomson Reuters' Safe Settings campaign, continue to provide parents, caregivers, and others with children in their lives the critical information they need to help keep children safe online. Real online safety starts, and continues, with educating children about the dangers of online exploitation. Indeed, it's important to create physically safe settings by sharing age-appropriate resources on how kids can safely navigate online, and it's also important to use the built-in safe settings on those websites and platforms to which kids have access.


You can learn more here.


When it comes to protecting younger children from online risks, it’s essential to teach them basic safety rules, which Homeland Security suggests should include:

      • Protect children with basic steps – Start by instructing them not to click on pop-ups, share personal information online, or trust people they meet online.
      • Report questionable content – Create a plan with them on what to do if they encounter inappropriate content, such as looking away and telling a trusted adult.
      • Teach kindness – Teach children online etiquette and respect for others, and ensure they know to whom to turn for help if they feel disrespected or uncomfortable.

For tweens and teens, the conversation should focus on more advanced online safety topics, such as:

      • No sharing of personal information – Discuss the dangers of posting personal information or inappropriate content, and the permanency of online data.
      • Teach privacy, recognize predators, and find help – Teach teens how to set up privacy controls on their devices, recognize warning signs of online predators, and identify trusted adults to whom they can turn for help.
      • Discuss sexting, cyberbullying, and other legalities – Share information about sexting, cyberbullying, and the importance of not sharing sexual abuse material, which is illegal.

For caregivers, it is important to recognize that educating kids about online safety is a process. An open and honest conversation about the potential risks and consequences of their online behavior works best as a two-way dialogue, ensuring your child feels comfortable sharing their concerns and experiences.

Steps caregivers can take now

In addition to this open conversation, there are several practical steps that caregivers can take to further safeguard their child’s online activities. These include password-protecting their app store and gaming downloads, setting time and area limits for device use, and setting all apps, games, and devices to private. Turning off location data services on social media and non-essential apps in order to prevent unwanted tracking is also recommended.

Further, educating children about the permanence of online data and the potential long-term consequences of their online actions is also key to helping them avoid negative outcomes later. One tool for parents is creating a contract with their child that outlines expectations around online behavior and consequences for misbehavior. Caregivers should also create a safety plan in case the child encounters a potentially dangerous situation. Monitoring their friend lists to remove any strangers and warning them about the dangers of chatting with unknown individuals on different platforms are additional best practices.

If a child does fall victim to a predator, avoid forwarding any explicit content or deleting any messages, images, or videos. Instead, save evidence, including usernames, screenshots, and images or videos, for law enforcement to collect directly from the device.

Raising awareness among caregivers is paramount in the fight against online child exploitation. And arming caregivers with the knowledge and tools to create a safe online environment can mitigate the devastating effects of child exploitation and ensure that every child has the opportunity to thrive in a safe and healthy digital world.


You can learn more here.

New laws and regulations around child safety and privacy raise significant questions

Tue, 01 Oct 2024

Over the last five years, we have seen a major shift toward the regulation of children's safety and privacy in the digital environment. The shift has been driven by increasing public concern about the risks children face online and the growing realization that the internet was not designed with them in mind.

International governments and non-governmental bodies have also responded to this challenge. In 2021, the United Nations issued guidance making explicit that children's rights apply in the digital world as well as the real world, including the right to privacy, freedom of expression, and protection from commercial exploitation.

Also in 2021, the Organisation for Economic Co-operation and Development (OECD) produced its Recommendation on Children in the Digital Environment, which included principles for a safe and beneficial digital environment for children, with the child's best interests as a primary consideration. The OECD's action also highlighted the need for risk-based and proportionate regulation, supported by measures that provide age-appropriate child safety by design.

As a range of legislation has been announced by the European Union, the United Kingdom, and the United States, it is clear that policymakers are moving away from industry self-regulation as a solution.

Europe and UK pass groundbreaking laws

The most significant new legislation comes from the EU and UK. In 2022, the EU passed the Digital Services Act (DSA), which includes a requirement that risk assessments be undertaken of the impacts on children's rights online. The DSA also includes a prohibition on targeted advertising to children, and it imposes stronger obligations on the largest platforms and search services, including transparency requirements to publish risk assessments.

In the UK, the Online Safety Act (OSA) was passed in 2023, imposing duties of care on platforms that provide user-to-user services and search services. This includes risk assessment duties for services that are likely to be accessed by children. The OSA also goes further than the DSA in setting out the types of content that platforms must prevent children from encountering online.


In 2021, the OECD produced its Recommendation on Children in the Digital Environment, which included principles for a safe and beneficial digital environment for children, with the child's best interests as a primary consideration.


Also in the UK, the government's data protection regulator, the Information Commissioner's Office (ICO), has introduced the Age Appropriate Design Code (AADC), which sets out 15 standards that online services likely to be accessed by children must follow to mitigate risks of harm related to data and privacy. This includes a requirement to conduct data protection impact assessments and implement protections by default. The ICO will apply the AADC when enforcing the UK General Data Protection Regulation.

A recent report has tracked changes made by online platforms between 2017 and 2024, highlighting 128 changes made by Meta, Google, TikTok, and Snap. The report also discusses how these platforms have increasingly focused on safety-by-design changes, with clear links to the influence of the AADC, DSA, and OSA, despite the relatively early stage of implementation.

US moves towards new legislation and regulation

Meanwhile, the US has had a longstanding law at the federal level in the form of the Children's Online Privacy Protection Act (COPPA), which has been in place since 1998. COPPA imposes certain requirements on operators of online services directed at children under 13 years of age, and on operators of other websites or online services that have actual knowledge that they are collecting personal information online from a child under 13.

There is now a debate in the US about the limitations of the safeguards provided in COPPA, and the U.S. Congress is debating the introduction of the Kids Online Safety Act (KOSA), which would create a duty of care for companies operating a platform. Social media platforms would also have to provide children with tools to protect their personal data, disable addictive product features, and opt out of personalized algorithmic recommendations.


There is now a debate in the US about the limitations of the safeguards provided [in previous laws], and the U.S. Congress is now debating the introduction of the Kids Online Safety Act, which would create a “duty of care” for companies operating a platform.


The U.S. Senate passed its bill on July 30 by a vote of 91-3, but it is still unclear whether it will pass in the U.S. House of Representatives and reach the statute book before the coming election in November. The law is somewhat controversial, as some free speech advocates are concerned that the definition of harm spelled out in KOSA is too vague and could lead to censorship of information.

A number of US states have also passed their own legislation. In fact, a recent analysis from the University of North Carolina at Chapel Hill found that 13 states had passed a total of 23 laws regarding children's online safety.

Not all these laws, however, have passed judicial muster. Most famously, the California Age-Appropriate Design Code Act (AADC), which was modeled on the UK approach and passed in 2022, was blocked by a federal court a year later on the grounds that its requirements for companies could violate the First Amendment right to free speech. California Attorney General Rob Bonta filed an appeal of the preliminary injunction with the U.S. Court of Appeals and awaits the final judgment. (Maryland also passed its version of the law in May 2024.)

Conclusion

With other countries, such as Canada and Brazil, also considering similar laws and regulations, there is a clear direction toward regulating children's safety and privacy online. This creates responsibilities for companies that provide online services to assess the harms and risks to children and to develop solutions, by design, to mitigate those risks. Further, these new duties will require companies to develop child rights impact assessments for their online services and for new features and design changes. Companies will also have to demonstrate how their design solutions effectively protect children in practice.

The new laws and regulations are also pushing companies to use age-assurance technologies, to test and understand their effectiveness, and to consider the consequences for privacy and the overall user experience.

In response to these new regulations, many companies are developing new governance methods, policies, and work processes to ensure the requirements of these new laws are embedded into the companies' design and engineering practices. Providing evidence and documentation will also be crucial to demonstrate companies' compliance and accountability to regulators.

Indeed, many of these regulations are still new, and companies are seeking further guidance about what the requirements mean in practice. We also await the first case decisions from regulators, which will illustrate how they set the bar in terms of unacceptable risks and practices among these online platforms.

Alongside these questions, the new focus on content risks, not just data protection and privacy risks, will also create significant challenges for platform operators in balancing new regulatory requirements with their users' expected freedom of expression.
