AI for Legal Aid Archives - Thomson Reuters Institute https://blogs.thomsonreuters.com/en-us/topic/ai-for-legal-aid/ Thomson Reuters Institute is a blog from Thomson Reuters, the intelligence, technology and human expertise you need to find trusted answers. Thu, 19 Feb 2026 18:20:08 +0000 When courts meet GenAI: Guiding self-represented litigants through the AI maze /en-us/posts/ai-in-courts/guiding-self-represented-litigants/ Thu, 19 Feb 2026 18:20:08 +0000 https://blogs.thomsonreuters.com/en-us/?p=69532

Key insights:

      • Considering courts’ approach – Although many courts do not interact with litigants prior to filings, courts can explore how to help court staff discuss AI use with litigants.

      • Risk of generic AI tools – AI use in legal settings can’t be simply categorized as safe or risky; jurisdiction, timing, and procedure are vital factors, making generic AI tools unreliable for court-specific needs.

      • Specialty AI tools require testing – Purpose-built court AI tools offer a safer alternative for litigants, yet these require development and extensive testing.


Self-represented litigants have always pieced together legal help from whatever sources they can access. Now that AI is part of that mix, courts are working to help people use this advanced technology responsibly without implying an endorsement of any particular tool or even the use of AI.

Many litigants cannot afford an attorney; others may distrust the representation they have or may not know where to begin. In any case, people need a meaningful way to interact with the legal system. Used carefully and responsibly, AI can support access to justice by helping self-represented litigants understand their options, organize information, and draft documents, while still requiring litigants to verify their information and consult official court rules and resources.

These issues were discussed in a recent webinar, where the panel explored the potential benefits of AI for access to justice and the operational challenges of integrating AI into public-facing guidance for litigants.

The problem with “Just ask AI”

Angela Tripp of the Legal Services Corporation noted that people handling legal matters on their own have long relied on a mix of resources, “some of which were designed for that purpose, and some of which were not.” AI is simply a new tool in that environment, she added. The primary challenge is that court processes are rule-based and time-sensitive, and a mistake can mean missing a deadline, submitting the wrong document, or misunderstanding a requirement that affects the case.

Access to justice also requires more than just access to information in general. Court users need information that is relevant, complete, accurate, and up to date. Generic AI systems, such as most public-facing tools, are trained on broad internet text and may not reliably deliver that level of specificity for a particular court, case type, or stage of a proceeding. In these cases, jurisdiction, timing, and procedure all matter. Unfortunately, AI can omit key steps or emphasize the wrong issues, and self-represented litigants may not have the legal experience to recognize what is missing.

At the same time, AI offers several potential benefits to self-represented litigants. It can explain concepts in plain language, help users structure a narrative, and produce a first draft faster than many people can on their own. The challenge is aligning those strengths with the precision that court processes demand.

A strategic pivot: from teaching litigants to equipping staff

In the webinar, Stacey Marz, Administrative Director of the Alaska Court System, described her team’s early efforts to give self-represented litigants clear guidance about safer and riskier uses of AI, including examples of how to properly prompt generative AI queries.

The team tried to create traffic light categories that would simplify decision-making; however, they found this approach very challenging despite several draft efforts to create useful guidance. Indeed, AI use can shift from low-risk to high-risk depending on context, and it was hard to provide examples without sounding like the court was endorsing a tool or sending people down a path for which the court could not guarantee results.

The group ultimately shifted to a more practical approach – training the people who already help litigants. The new guidance targets public-facing staff such as clerks, librarians, and self-help center workers. Instead of teaching litigants how to prompt AI, it equips staff to have informed, consistent conversations when litigants bring AI-generated drafts or AI-based questions to the counter.

The framework emphasizes acknowledgment without endorsement. It suggests language such as:

“Many people are exploring AI tools right now. I’m happy to talk with you about how they may or may not fit with court requirements.”

From there, staff can explain why court filings require extra caution and direct users to court-specific resources.

This approach also assumes good faith. A flawed filing is often a sincere attempt to comply, and a litigant may not realize that an AI output is incomplete or incorrect.

Purpose-built tools take time

The webinar also discussed how courts are exploring purpose-built AI tools, including judicial chatbots designed around court procedures and grounded in verified information. Done well, these tools can reduce common problems associated with generic AI systems, such as jurisdiction mismatch, outdated requirements, or fabricated or hallucinated citations.

However, building reliable court-facing AI demands significant time and testing. Marz shared Alaska’s experience, noting that what the team expected to take three months took more than a year because of extensive refinement and evaluation. The reason is straightforward: Court guidance must be highly accurate, and errors can materially harm someone’s legal interests. In fact, even after careful testing, Alaska still included cautionary language, recognizing that no system can guarantee perfect answers in every situation.

The path forward

Legal Services鈥 Tripp highlighted a central risk: Modern AI tools can be clear, confident, and easy to trust, which can lead people to over-rely on them. And courts have to recognize this balance. Courts are not trying to prevent AI use; rather, many are working toward realistic norms that treat AI as a drafting and organizing aid but require litigants to verify claims against official court sources and seek human support when possible.

Marz also emphasized that courts should generally assume filings reflect a litigant’s best effort, including in those cases in which AI contributed to confusion. The goal is education and correction rather than punishment, especially for people navigating complex processes without representation.

Some observers describe this moment as an early “AOL phase” of AI, akin to the very early days of the World Wide Web – widely used, evolving quickly, and uneven in its reliability. That reality makes clear guidance and consistent messaging more important, not less.

This shift among courts from teaching litigants to use AI to teaching court staff and other helpers how to talk to litigants about AI reflects a practical effort on the part of courts to reduce the risk of harm while expanding access to understandable information.

As is becoming clearer every day, AI can make legal processes feel more navigable by helping self-represented litigants draft, summarize, and prepare. For courts to realize that value, however, requires clear guardrails, court-specific verification, and careful implementation, especially when a missed detail can change the outcome of a case.


You can find out more about how AI and other advanced technologies are impacting best practices in courts and administration here

]]>
Scaling Justice: Unauthorized practice of law and the risk of AI over-regulation /en-us/posts/ai-in-courts/scaling-justice-unauthorized-practice-of-law/ Mon, 01 Dec 2025 19:35:29 +0000 https://blogs.thomsonreuters.com/en-us/?p=68596

Key insights:

      • Are regulations choking innovation? – Current regulatory efforts may be stifling innovation in AI-driven legal solutions, exacerbating the access to justice crisis and prioritizing lawyer business model protection over consumer needs.

      • Some safeguards already in place – Existing consumer protection laws and product liability laws already provide robust safeguards against potential AI-related harm, making it unnecessary to impose additional restrictive policies on AI-driven legal services.

      • A balanced regulatory approach is best – An approach that encourages responsible innovation, prioritizes consumer protection, and fosters a data-driven mindset can best unlock the transformative potential of AI in addressing critical gaps in access to justice.


As AI-driven legal solutions gain traction, calls for regulation have grown apace. Some are thoughtful, others ill-informed or protectionist, and many focus on the issue of unauthorized practice of law (UPL). While protecting the public is crucial, shielding the legal profession from competition is not. A large majority (92%) of low-income people currently receive no or insufficient legal assistance; and the ongoing uncertainty in the legal AI and UPL regulatory landscape is chilling innovation that could support them.

The legal profession has always struggled to provide affordable, accessible services even as it simultaneously attempts to block those working ethically to bridge the gap with technology. When done right, legal industry regulation should balance protection with progress to avoid stifling innovation and exacerbating the access to justice crisis.

Consumer protection laws already provide robust safeguards against potential AI-related harms. Existing product liability laws and enforcement actions by state attorneys general ensure that consumers have recourse if AI legal tools cause harm. Despite these safeguards, concerns about unregulated AI filling the gaps in legal services persist.

It is time to upend the calculus of consumer harm and examine the motives of regulation. Rather than forcing tech-based legal services to prove they cause no harm in order to avoid charges of UPL, regulators should be required to justify, with data, any claim that legal technology companies cause harm, and to consider whether any ruling will constrain supply in the face of a catastrophic lack of access to justice.

Uneven regulatory efforts raise questions

Current regulatory efforts tend to focus on companies that directly serve legal consumers, while leaving broader AI models largely unchecked. This raises uncomfortable questions: Are we truly protecting the public, or merely constraining competition and thereby reinforcing barriers to innovation in the process?


You can find out more here


“If UPL’s purpose is protecting the public from nonlawyers giving legal advice – and if regulators define legal advice as applying law to facts – how many legal questions are asked of these Big Tech tools every day?” asks Damien Riehl, a veteran lawyer and innovator. “And if we won’t go after Big Tech, will regulators prosecute Small Legal Tech, which in turn utilizes Big Tech tools? If Big Tech isn’t violating UPL, then neither is Small Tech [by using Big Tech’s tools].”

Efforts to regulate the use of AI-based legal services are, de facto, another path to market constraint. Any attempt to regulate AI should be rooted in actual consumer experience. Justice tech companies, by definition, pursue mission-driven work to benefit consumers, but if an AI-driven tool causes harm, it should certainly be investigated and regulated. State bar associations are not waiting for harm to occur before considering regulating AI-driven legal help – and we must wonder why.

The risks of premature regulation

We must enable, not obstruct, AI-driven legal solutions and ensure that innovation remains a driving force in modernizing legal services. If restrictive policies make it difficult to develop cost-effective legal solutions, fewer consumers – particularly those with limited resources – will have access to legal assistance.

AI is developing far too quickly for a slower regulatory trajectory to keep up – any contemplated regulation would be evaluating last year’s technology, which is at best half as good as the latest iterations. Regulating AI-driven legal services now is akin to prior restraint, as when published or broadcast material is anticipated to cause problems in the future and is suppressed or prohibited before it can be released. That approach should not apply to new technology; we can already look to product liability for evidence of actual harm.

By prioritizing consumers rather than lawyer business model protection, AI-enabled legal support would be monitored for potential harm with data collected and analyzed to bring to light any issues. That way, regulations could be built around that defined, data-backed harm. For instance, we might require certification protocols for privacy or security if those issues prove problematic.

Forward-thinking states are going further

In July, the Institute for the Advancement of the American Legal System (IAALS) released a new report, which advocated for a phased approach to regulation, beginning with experimentation, education, and consumer protection, while gathering and evaluating data. Later phases could involve potential regulation based on what is learned. In this way, innovation is encouraged while consumer needs and public trust remain paramount.

Also this year, Colorado cut the proverbial Gordian Knot by releasing a policy – consistent with existing analysis of UPL complaints in the state – for AI tools focused on improving access to justice. Guiding principles include ensuring consumers have clarity about the services they receive and their limits, educating consumers on the risks inherent in relying on advice from non-lawyer sources, and including a lawyer in the loop. Utah, Washington, and Minnesota all have considered similar policies. And IAALS now is collaborating with Duke University’s Center on Law & Tech to create a toolkit and templates to make it easier for other states to adopt UPL non-prosecution or similar policies.

Yet, some regulators seek the opposite, looking to define the exact types of business activity that will lead to UPL prosecution. While this framework is likely to become obsolete more quickly, it serves a similar purpose: providing clear guardrails that allow innovation to flourish, while protecting consumers by clearly indicating the limitations of the software. One such effort aims to specifically exclude tech products from UPL enforcement, provided they are accompanied by adequate disclosures that they are not a substitute for the advice of a licensed lawyer. Such policies are essential, and they can encourage those entrepreneurs aiming to ameliorate the justice gap.

What鈥檚 next?

The legal and justice tech industries should aim for a regulatory framework that encourages responsible, iterative innovation – and participants should take some proactive steps, including: i) justice tech companies should participate in the discussion and share their business- and mission-focused perspectives to help shape any new regulations; and ii) regulators with internal non-prosecution policies should consider making them public to encourage entrepreneurs in their state.

These approaches would enable positive change for state residents, support overburdened legal aid organizations and courts, and foster a flourishing tech ecosystem aimed at serving unrepresented and under-represented parties.

The legal profession has not been able to ensure justice for all, making it even harder for low-income and unrepresented parties to find the help they need. Now, AI-driven legal service providers are moving forward on addressing critical gaps in access to justice.

With a measured and equitable approach to regulation that neither ignores AI’s risks nor overlooks its transformative potential, the legal industry and regulators must keep pace with today’s technology – and such efforts should not obstruct those legal providers who can bring the law closer to that ideal and help close the justice gap.


You can learn more about the challenges faced by justice tech providers here

]]>
Legal aid leads on AI: How Lone Star Legal Aid built Juris to deliver faster, fairer results /en-us/posts/ai-in-courts/legal-aid-ai-lone-star-juris/ Mon, 10 Nov 2025 15:57:22 +0000 https://blogs.thomsonreuters.com/en-us/?p=68394

Key takeaways:

      • Legal aid is leading on AI adoption – Legal aid organizations are leading the way in leveraging AI, with 74% using AI in their work, driven by the need to serve millions of citizens who lack legal help.

      • Lone Star Legal Aid creates Juris – A new AI-powered tool, Juris, from Lone Star Legal Aid improves accuracy and trust through retrieval-augmented generation, source-cited answers, and a secure Azure-based architecture with an integrated citation viewer.

      • Keeping costs low – A phased, two-year build-and-test process kept costs low (at about $2,000 a year in infrastructure costs, plus about 300 staff hours) and produced dependable results.


A recent study finds that under-resourced legal aid nonprofits are adopting AI at nearly twice the rate of the broader legal field because of the urgency of the need to serve millions of Americans who may lack legal help. The study shows that almost three-quarters (74%) of legal aid organizations already use AI in their work, compared with a 37% adoption rate for generative AI (GenAI) across the wider legal profession. Lone Star Legal Aid (LSLA), a legal aid nonprofit serving eastern Texas, is one of the early adopters of AI.

According to LSLA, its attorneys were spending too much time and money hunting for answers across pricey platforms and scattered PDFs. Key materials lived in research databases, internal drives, and static repositories, while individual worker-vetted documents were not centrally accessible. Without a single, trusted hub, staff experienced slower research time that affected clients through duplicated effort and delays.

These strains are not unique to LSLA. In fact, court help centers and self-help portals face the same fragmentation, licensing costs, and uneven access to authoritative guidance. A verifiable, consolidated knowledge hub that could stabilize quality while reducing spending would be a needed solution.

To solve this problem, LSLA turned to AI to create a legal tool called Juris built to return fast, source-cited answers. Juris was designed to centralize high-value legal materials, cut reliance on expensive third-party platforms, and lay a flexible foundation that the organization could reuse beyond legal research for internal operations and future client tools.

Multifaceted approach to ensuring accuracy and reliability

There were several aspects of Juris that designers used to help its mission to increase access to justice, including:

Design methods fuel trustworthy output – Juris was built to ensure accuracy using a number of methods, such as a retrieval-augmented generation (RAG) pipeline to ensure the chatbot delivers fact-based, source-cited answers. It also uses semantic chunking, a process that breaks a document into natural, meaning-based sections (for example, a heading plus the paragraphs that belong to it) so the original context stays together.

When a user asks a question, Juris retrieves only the most relevant of these sections. Limiting the AI to evidence from those passages improves accuracy and reduces hallucinations because the model is not guessing from memory. Instead, it is grounding answers in the text it just accessed.
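
The chunk-then-retrieve flow described above can be sketched in a few lines. This is a minimal illustration, not Juris’s actual pipeline: the markdown-style heading convention and the term-overlap scoring are assumptions for readability (a production system like the one described would use embeddings and vector search instead).

```python
import re

def semantic_chunks(text):
    """Split a document into heading-plus-body sections so context stays together."""
    sections = re.split(r"\n(?=# )", text)  # split just before each "# " heading
    return [s.strip() for s in sections if s.strip()]

def retrieve(chunks, query, k=2):
    """Toy relevance scoring by word overlap with the query; a real pipeline
    would compare embedding vectors rather than raw tokens."""
    q_terms = set(query.lower().split())
    ranked = sorted(chunks, key=lambda c: -len(q_terms & set(c.lower().split())))
    return ranked[:k]

doc = ("# Filing deadlines\nAn answer is due within 20 days of service.\n"
       "# Fee waivers\nLitigants who cannot afford fees may request a waiver.")
top = retrieve(semantic_chunks(doc), "when is my answer due", k=1)
```

Because each chunk keeps its heading attached, the passage handed to the model arrives with its own context, which is what makes the citation back to a source section possible.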

Solid technical architecture helps reliability – Juris’s technical architecture also supports reliable results: it combines Azure OpenAI for secure, stateless access to AI models with services that handle document ingestion, processing, and vector storage. Users interact through a custom internal web interface that integrates a PDF viewer alongside the chat experience, enabling seamless citation and document navigation. The platform is securely hosted on Azure App Service with continuous deployment orchestrated through GitHub, which provides reliable operations and streamlined updates.
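
One way to picture the “stateless, grounded” request pattern this kind of architecture relies on is the message payload itself: every call carries the question plus only the retrieved passages, so nothing depends on server-side memory. The system-prompt wording and message shape below are illustrative assumptions, not LSLA’s actual configuration.

```python
def build_grounded_messages(question, passages):
    """Build a stateless chat payload that restricts the model to cited evidence."""
    numbered = "\n\n".join(f"[{i}] {p}" for i, p in enumerate(passages, start=1))
    system = (
        "Answer using ONLY the numbered passages below. Cite passages like [1]. "
        "If the answer is not in the passages, say you do not know.\n\n" + numbered
    )
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": question},
    ]

messages = build_grounded_messages(
    "How long does a landlord have to return a deposit?",
    ["State law requires return of the deposit within 14 days of move-out."],
)
```

The numbered-passage convention is also what lets a chat interface link each `[1]`-style citation back to the matching page in an integrated PDF viewer.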

Phased approach to building and testing yielded dependability – Also to ensure trustworthy results, LSLA developed Juris by following a structured, phased approach over two years. It began with a concept phase that was focused on clearly identifying the problem, followed by a platform evaluation that compared open-source and commercial solutions. A prototype was then created and demonstrated as proof of concept.

In addition, internal testing included adversarial exercises, hallucination detection, and rigorous validation of citation reliability. Based on these findings, the team implemented enhancements, such as moving from size-based to semantic chunking, improving the interface, and expanding the set of source materials. Juris is now in pilot preparation and undergoing final refinements before its release to a select group of subject matter experts.

Efficient resourcing and sharing learnings

LSLA’s phased approach to building and testing also ensured that sustainability was built in from the beginning. Indeed, ongoing maintenance is minimal, and Microsoft’s nonprofit Azure credits keep infrastructure costs around $2,000 per year.

The most significant cost was in staff time. Development so far totals roughly 300 staff hours (or about 0.5 full-time equivalent, plus 0.3 FTE over two years). Once Juris enters phase two, which has been funded by a Legal Services Corporation (LSC) technology initiative grant, expected benefits will include faster, more consistent research and reduced workload for frontline and administrative staff, plus a modular framework that others can adapt.

Other legal service organizations that face similar challenges can learn from the Juris development, testing, and implementation as well as other related case studies. These recurring lessons include:

      • beginning with a small, manageable scope
      • inviting end users in from the start, and
      • carving out protected time so staff can innovate alongside daily duties.

Looking ahead, the LSLA team will continue to roll Juris out in phases, while building sister tools. LSLA also plans to share lessons learned through LSC’s AI Peer Learning Labs to help other organizations replicate the model.

Real change at scale, such as this, will only come from collaborating across organizations to share playbooks, pool datasets, and co-design tools that lift quality while lowering cost. It is only with such partnership and sharing lessons from early adopters of AI that peers can adapt the model and, together, scale solutions that narrow the justice gap.

Angela Tripp, Program Officer for Technology at the Legal Services Corporation, contributed to this article.


You can learn more about the ways legal aid organizations are using advanced technology to better serve individuals as they access the justice system here

]]>
Access to housing justice: Leveraging AI to solve NYC’s security deposit crisis /en-us/posts/ai-in-courts/security-deposit-crisis/ Thu, 14 Aug 2025 16:29:27 +0000 https://blogs.thomsonreuters.com/en-us/?p=67208

Key insights:

      • Security deposit disputes are a significant issue in New York City – Half a billion dollars is locked in security deposits at any given time, and nearly 5,000 official complaints about illegally withheld deposits have been filed with the New York Attorney General since 2023. However, many cases go unreported due to the complexity and cost of pursuing justice.

      • AI-powered tool Depositron helps NYC tenants claim their security deposits – By guiding users through a simple process and generating customized, legally sound demand letters, Depositron can give tenants the easy access and means to assert their rights.

      • AI tools like Depositron can improve scale and access to justice – Depositron’s modular architecture makes it possible to expand to other jurisdictions with similar legal frameworks, providing a scalable blueprint for addressing other legal challenges.


Security deposit disputes are a significant and persistent housing justice issue in New York City. The NYC Comptroller estimates that half a billion dollars is locked in security deposits at any given time, and tenants are routinely required to provide large deposits to secure housing.

Since 2023, nearly 5,000 official complaints have been filed with the New York Attorney General, according to Gothamist, but most likely, this only scratches the surface. Most cases go unreported because the time, cost, and complexity of pursuing justice are too high for most tenants.

Despite reforms like the Housing Stability and Tenant Protection Act of 2019, which mandates that landlords return deposits within 14 days and provide itemized deductions for any money withheld, enforcement remains weak, and affordable pathways for recourse are slow or unavailable. And unfortunately for tenants, legal aid organizations prioritize eviction defense, not deposit recovery; and the NY AG’s complaint process is slow and opaque.

AI-powered tool delivers agency at scale

To address this challenge, Nori, a long-time tenant advocate with more than 20 years of experience litigating housing justice cases in NYC courts, and Martin, CEO and founder of LawDroid (and contributor to the Thomson Reuters Institute blog site), developed Depositron. This free, AI-powered, mobile phone-accessible tool is available around the clock to help NYC tenants and those in New York state recover their security deposits quickly, legally, and without the need for a lawyer. Depositron also has plans to launch in Florida and Chicago soon.

“Depositron delivers more than a legal document – it delivers agency,” says Nori. Indeed, this encapsulates the tool’s core value because it empowers tenants to take action and reclaim what is rightfully theirs.




Depositron guides users through an intuitive, plain-language process to collect relevant details about their housing situation, including lease facts, deposit amount, move-out date, and landlord information. Users also can upload photos that document the apartment’s condition to strengthen their case. The tool then generates a customized, legally sound demand letter that cites New York laws and incorporates the user’s evidence. This process not only educates tenants about their rights but also gives them a practical, actionable way to assert those rights without facing the intimidation or expense of seeking traditional legal help.

Not surprisingly, AI is at the core of Depositron’s effectiveness. Unlike generic form generators, Depositron takes a hybrid approach, combining advanced large language models and structured prompts, together with conditional logic, to answer basic legal questions and capture the unique facts of each dispute and then translate them into a persuasive, legally grounded narrative.
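
The “structured prompts plus conditional logic” half of such a hybrid design can be sketched deterministically: rule-based code derives the legally relevant facts, and only then is an LLM asked to draft around them. The field names below are hypothetical, and while the 14-day figure mirrors the statute described earlier, this is an illustration, not Depositron’s actual implementation.

```python
from datetime import date, timedelta

def demand_letter_facts(intake, today=None):
    """Apply conditional logic to intake facts before any LLM drafting step."""
    today = today or date.today()
    deadline = intake["move_out"] + timedelta(days=14)  # 14-day return window
    return {
        "return_deadline": deadline.isoformat(),
        "deadline_passed": today > deadline,   # drives the letter's tone/claims
        "amount_demanded": f"${intake['deposit_amount']:,}",
    }

facts = demand_letter_facts(
    {"move_out": date(2025, 3, 1), "deposit_amount": 3600},
    today=date(2025, 4, 1),
)
# `facts` would then be merged into a structured prompt for the drafting model,
# so the narrative is grounded in computed facts rather than model guesses.
```

Keeping date arithmetic and statutory thresholds out of the LLM is what lets a tool like this stay “legally sound” while still generating customized prose.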

This customized approach simulates the client interview and legal writing process and makes it possible to help thousands of tenants efficiently. Early testing with law students, tenant advocates, and pro se renters demonstrated success. In fact, users reported that Depositron made them feel more confident, informed, and in control, and many recovered their deposits faster than they would have through conventional means.

“By making legal self-advocacy accessible, Depositron fills a gap left by traditional legal services, which often cannot take on these cases due to capacity or cost constraints,” Nori explains. This is a crucial differentiator for Depositron because legal aid organizations and private attorneys are rarely able to assist with disputes over relatively small sums of money.

Poised for expansion in policy and geography

By enabling tenants to send professional, well-researched demand letters at scale, the platform changes the risk calculation for landlords. As more tenants assert their rights with credible legal documents, landlords are incentivized to comply with the law rather than risk penalties or further legal action.

The tool also contributes to systemic change by collecting anonymized data on violation patterns, which can be shared with advocates and enforcement agencies. This data-driven approach enables targeted interventions and supports broader policy efforts to improve housing justice.




In addition, Depositron’s modular architecture is designed for expansion to other jurisdictions with similar legal frameworks, such as California, Illinois, and Massachusetts. The technology also can be partnered with local legal aid organizations to accelerate adoption and impact in new markets. Nori and Martin say they have already heard from advocates in Michigan, Maryland, Washington, Tennessee, California, and Washington DC about building platforms in those markets.

Further, the path taken in Depositron’s development can offer lessons for future access-to-justice tools. For example, user empowerment and autonomy are essential, and intuitive design is as important as legal accuracy, Nori and Martin discovered. More specifically, targeted solutions for discrete problems can drive meaningful change, and AI can serve as both the engine and the interface for delivering legal services at scale.

Depositron demonstrates that technology can bridge longstanding gaps in legal access by making protections both real and actionable for everyone, not just those who can afford traditional representation. By transforming a process that traditionally took longer than a year into one that can be resolved in a matter of weeks, Depositron restores agency and financial stability to tenants and provides a scalable blueprint for addressing other access-to-justice challenges in the digital age.


You can find out more about how justice tech solutions and tools are working to improve citizens’ access to justice here

]]>
Putting people first: Leveraging AI to support human interaction and close the justice gap /en-us/posts/ai-in-courts/supporting-human-interaction/ Fri, 13 Jun 2025 13:05:38 +0000 https://blogs.thomsonreuters.com/en-us/?p=66292 The access to justice gap in the United States represents one of our legal system's most persistent challenges, with millions of Americans unable to access adequate legal help due to financial constraints, geographic barriers, and systemic inequities. Indeed, 92% of civil legal problems reported by low-income Americans receive inadequate or no legal assistance.

Now, as AI-driven tools and technologies rapidly evolve, they offer unprecedented opportunities to scale solutions and bridge this divide through automation, information retrieval, and process optimization.

However, the promise of AI in reducing the justice gap lies not in replacing human connection but in enhancing it. While technology can expand reach and efficiency, the human elements of empathy, trust, and contextual understanding remain irreplaceable in legal assistance.

Brown, Executive Director of LLL, says she has seen this play out over her eight years of working at the intersection of the justice gap and technology. By combining AI with human-centered design principles, Brown and her peers are providing efficient referrals to enable legal aid organizations to connect with humans faster.

Leveraging AI to expand human capacity

The most effective AI implementations recognize a fundamental truth: People in legal crisis crave human connection. “We hear constant feedback on our tools along the lines of ‘I just want to talk to someone,’” says Brown, adding that this insight is driving a shift in focus in LLL’s work toward answering how the organization might best maximize meaningful human interaction between potential clients and legal service providers.

Across Louisiana, Brown is undertaking innovative projects that seek to expand awareness around legal issues, open access to self-resolution pathways, and provide more sophisticated referrals, working toward a future state in which there are more high-quality interactions between service providers and those seeking legal help.


Register now for The Emerging Technology and Generative AI Forum, a cutting-edge conference that will explore the latest advancements in GenAI and their potential to revolutionize legal and tax practices


At the current moment, Brown acknowledges that there is a lot of work required to leverage technology and reduce administrative burdens in order to pave the way for a future in which human-to-human interactions are paramount. LLL is doing its part by using AI as a force multiplier for existing services, including:

Creating civil justice content — LLL is using AI to accelerate the production of written articles, video scripts, brochures, translations, and social media content that can be used to help build awareness of tools and resources available for civil justice needs.

Offering sophisticated referral systems — LLL is developing an AI-powered referral function that can take complicated intake rules and case acceptance guidelines from Louisiana civil justice organizations and match those seeking legal help with the appropriate resources. The organization is also building AI-supported content-retrieval functionality within its website that will surface relevant spot-and-select information guides, self-help tools, and referral resources from the organization's database.

Modernizing traditional processes — Brown and her team are working on another innovation: paper-to-digital intake workflows supported by AI, which can reduce the time needed to input data and leave more time for a conversation with a potential client. Less tech-savvy individuals, or those with limited access to or awareness of resources, will still be able to use a paper-first approach to connect with service providers who can help.
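The referral function described above can be approximated in code. The sketch below is purely illustrative: the organization names, intake rules, and field names are invented for this example, not LLL's actual acceptance guidelines, and a production system would apply far richer criteria. Still, it shows the basic shape of matching a help-seeker against machine-readable intake rules:

```python
# Illustrative sketch only: the organizations, intake rules, and field names
# below are hypothetical, not LLL's actual case acceptance guidelines.

def matches(rules, applicant):
    """Return True if an applicant satisfies every rule an organization defines."""
    if applicant["issue"] not in rules["issues"]:
        return False
    if applicant["income_pct_fpl"] > rules.get("max_income_pct_fpl", float("inf")):
        return False
    if rules.get("parishes") and applicant["parish"] not in rules["parishes"]:
        return False
    return True

def refer(organizations, applicant):
    """Match a help-seeker against each organization's intake rules."""
    return [org["name"] for org in organizations if matches(org["rules"], applicant)]

organizations = [
    {"name": "Housing Help Center",   # hypothetical organization
     "rules": {"issues": {"eviction", "repairs"}, "max_income_pct_fpl": 200}},
    {"name": "Family Law Project",    # hypothetical organization
     "rules": {"issues": {"custody", "divorce"},
               "parishes": {"Orleans", "Jefferson"}}},
]

applicant = {"issue": "eviction", "income_pct_fpl": 150, "parish": "Caddo"}
print(refer(organizations, applicant))  # ['Housing Help Center']
```

The design choice worth noting is that the rules live in data rather than code, which is what lets an AI intake layer translate each organization's written guidelines into a structure like this without reprogramming the matcher.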

AI tools are beneficial, but a cohesive strategy is essential

The true potential of AI tools to maximize human-centered approaches in legal services can only be realized when such approaches are incorporated into a comprehensive and unified strategy to close the access to justice gap for good, explains Brown. Right now, this is lacking, she says, adding that we need a strategy "that firmly orients our community around a broad vision with specific outcomes and goals, but still encourages experimentation and innovation from a place of clarity about where we are going. We ultimately have to get very clear on who's doing what, what capacities we have or don't have, all the various inputs that make up the system as a whole, and importantly, how all of it relates to each other."

In addition, the assessment process of gaining clarity regarding goals, outcomes, and existing capacities is a key requirement for legal aid's success in deploying AI tools with empathy-driven design. The most effective implementations are built upon established processes and use AI to enhance rather than replace existing frameworks. Brown's initiative to transform paper intake forms into hybrid workflows while preserving familiar client experiences is a notable example.

Until an integrated approach is realized, Brown and her team at LLL will keep working to embrace technological advancements to better streamline administrative processes and foster a future that emphasizes the importance of close human interactions. This reflects her current ethos around AI — and around technology in general — that lawyers and access to justice professionals should be working to maximize human interaction wherever possible.

"You cannot replicate compassion and understanding for those in crisis with a decision tree or a chatbot," she says, noting that AI can expand our capacity in ways that leave time for these crucial, personal interactions, and allow us to "thoughtfully design experiences that make our humanity the centerpiece in a very highly automated service."


You can learn more about the ways that AI-driven technology is impacting access to justice here

]]>
AI and legal aid: A generational opportunity for access to justice /en-us/posts/ai-in-courts/ai-legal-aid-generational-opportunity/ Mon, 03 Feb 2025 17:36:36 +0000 https://blogs.thomsonreuters.com/en-us/?p=64671 More than 50 million low-income Americans don't receive any or enough legal help for , according to the Legal Services Corporation (LSC), the largest funder of legal aid in the United States. This leaves people without adequate legal assistance in critical areas like housing, healthcare, and immigration. Criminal defendants often face similar challenges with inadequate representation that ultimately impacts their lives and futures.

Indeed, pro se litigants are often navigating a complex system without counsel, filing papers on their own as best they can, submitting hand-written complaints, and struggling to navigate legal precedent they may not fully understand.

Using GenAI to improve legal service organizations

Legal aid professionals dedicate themselves to bridging this justice gap while working with extremely limited resources. The good news is that with advancements in technology, particularly generative AI (GenAI), there is a generational opportunity to transform legal aid and make significant strides in closing the justice gap.

To explore this opportunity further, more than 300 members of the legal aid community from 47 different states gathered at the inaugural AI for Legal Aid Summit, hosted by the LSC. A host of use cases in which GenAI has been used by legal service organizations (LSOs) to better serve clients and expand access to their services were featured at the Summit. Some of these use cases included:

Efficiency in legal workflows

GenAI can efficiently handle repetitive tasks and enhance decision-making with its ability to read, analyze, and write as skillfully as a junior associate. This means that it can complete many types of work in a fraction of the time it would take a human. For instance, it can review hundreds of pages in a litigation record in minutes and identify key terms from thousands of contracts in hours. While GenAI is not a lawyer — and, as a result, attorneys always need to thoroughly review the output from GenAI-enabled tools — the ability to lean on technology to collect (at superhuman speed) the building blocks needed for human decision-making is incredibly powerful.

This efficiency gained from using GenAI allows legal aid professionals to make strategic decisions faster, potentially resolving clients' problems sooner. By delegating time-consuming tasks to GenAI, lawyers can focus more on strategy, advising, and supporting clients, which are all aspects of legal work that machines cannot replicate. This shift in work distribution can allow lawyers to serve more clients effectively.

Improved internal operations

Beyond direct client work, GenAI can also enhance productivity and efficiency in managing legal aid organizations. It can assist in various organizational tasks, such as marketing, grant writing, and human resources (HR). For example, GenAI can function as a marketing assistant by creating web pages, social media posts, and advertisements, and as an HR aide by writing interview questions and creating performance review templates. It also makes quick work of grant applications, with one LSO leader saying that an application that would normally take him two days to draft took merely 30 minutes with the help of an AI-powered tool.

By taking on these administrative and organizational tasks for nonprofit LSOs, AI frees up valuable time for legal aid professionals to focus on their core mission of providing services to those in need. This increased efficiency and productivity can allow legal aid organizations to expand their reach and impact.

Stunning impact

The most encouraging — and inspiring — part of the Summit was hearing from LSO members who have been at the forefront of leveraging AI-powered solutions to better serve communities in need.

For example, Legal Aid of North Carolina, in partnership with LawDroid, built an AI chatbot called LIA to provide actionable resources for simple legal matters, focusing on cases involving domestic violence, child custody, landlord-tenant disputes, and consumer law.

Also, the nonprofit Housing Court Answers worked with NYU School of Law and the legal technology company Josef to build a tool that helps NYC tenants understand and advocate for repairs to which they are entitled under the city's municipal code.

Finally, leaders from the Innocence Center, an organization that leverages an AI-driven legal assistant to cut the time needed to review litigation records and draft habeas petitions, described a recent case in which a person was exonerated after being wrongfully imprisoned for more than 20 years. The Center believes that, had it had this technology earlier in the case, it could have secured the person's release a decade or more sooner.

How to get started

As legal aid professionals embark on the process of incorporating GenAI into their workflows, it’s essential to recognize that success hinges on a thoughtful and intentional approach that includes the following actions:

Select the right GenAI-powered tools — First, legal aid professionals should focus on finding the right solution for their specific needs. It is essential to choose professional-grade AI solutions that are grounded in reliable, authoritative, domain-specific content, and continually tested for accuracy by domain experts.

View AI mastery as a journey — Getting the most out of GenAI in the early stages of experimentation also requires the right mindset, which means seeing AI competence as an unfolding path. Rather than trying to adopt AI overnight, professionals should approach their GenAI journey one step (and one use case) at a time, and be prepared to invest time and energy in learning how to effectively use the new technology.

To start, most people pick one internal area to improve, such as writing a difficult email or revising a job description. From there, it is a short step to building a personal GPT — a customized version of ChatGPT tailored to a specific task. In fact, this was something that Summit attendees did in small groups together in real time.

Dedicate yourself to continuous learning — The path to using GenAI well requires a commitment to skill development. Learning how to craft prompts effectively enables users to create precise instructions that yield the desired outcomes. LSO attendees also learned this skill firsthand at the Summit.

Likewise, developing the ability to delegate tasks to AI systems allows for the optimal distribution of work between human creativity and machine capabilities. Adaptability is also critical: staying updated on new techniques, models, and best practices lets practitioners maintain their expertise and maximize the technology's potential.

The future is now

Lawyers in LSOs stand at the threshold of a transformative era in legal practice. The time to adopt AI to expand access to legal services is now. By embracing AI solutions, legal aid professionals can amplify their impact, serve more clients, and tackle the overwhelming needs that have long challenged the legal aid profession.

Indeed, the legal aid community has always been at the forefront of innovation in service of those in need, and by combining this commitment with GenAI-enabled tools, community members can create a more just and equitable legal system for all.


You can find out more about the impact of AI in legal aid here

]]>
FosterPower: Empowering youth with tech tools to thrive /en-us/posts/technology/fosterpower-empowering-youth/ Mon, 27 Jan 2025 11:17:48 +0000 https://blogs.thomsonreuters.com/en-us/?p=64606 In the world of child advocacy, few tools have the transformative potential to empower foster youth as much as . Founded by Taylor Sartor, a Tampa-based attorney with deep experience in educating and representing children in foster care, the FosterPower app provides a centralized place for children and stakeholders alike, including judges and guardians, to access the information they need to thrive.

With more than 20,000 children in Florida's foster care system at any given time, Sartor is setting a precedent for how technology can be harnessed to inform, guide, and inspire youth toward a better future.

From advocacy to action

Sartor's journey began with a during college, in which she volunteered to advocate for children in foster care. Her passion deepened during her time as an AmeriCorps member, mentoring students in a pilot program. While at AmeriCorps, Sartor represented one of her students, improving his life while solidifying her interest in attending law school to represent even more youth with urgent legal needs.

Sartor's dedication carried through to law school, where she earned an Equal Justice Works Fellowship at the Children's Law Center. One specific challenge continued to stand out: the absence of a centralized resource in Florida to help foster youth understand their rights and access critical services. Inspired by initiatives in other states, Sartor partnered with law students to create a know-your-rights guide, which she used in her legal practice. The project gained even greater traction after a grant from the allowed the guide to be digitized.

FosterPower was developed with a clear vision: to inform foster youth about their benefits, protections, and legal rights. "We don't cater to any other audience than the youth — FosterPower is made for them," Sartor explains. "However, the content is so simplified and cites to the law at the bottom of each section that it is truly a useful resource for anyone in child welfare. We do have case managers, attorneys, and judges that use the app, but our focus audience has been and always will be youth in foster care."

To build the app, Sartor took a hands-on and sensitive approach by working directly with foster youth. "These youth have experienced significant trauma and talking about their experiences can be triggering, so I wanted to make sure that it was approached in a youth-centered, trauma-informed manner," she says. "I did not want to outsource this task to a vendor that did not have experience working with this vulnerable population."




Sartor notes that the biggest factor was really just listening to the youth and creating an app that is based on their feedback and what they wanted to see when they pulled up FosterPower. She also says they prioritized compensating foster youth for their time, ensuring that they were truly valued and respected as a part of the process.

The result is an app that's as practical as it is accessible. Available on both Android and iPhone — a must-have for Sartor — FosterPower was designed to function offline, ensuring that users can access critical information anytime, anywhere. Content is reviewed annually with local experts, ensuring that information is as up to date as possible.

A vision for the future

Since its launch, thanks to word of mouth, local partnerships, and social media posts, FosterPower has achieved impressive milestones. The app has been downloaded more than 4,000 times, and its website has gathered more than 10,000 views. The app's educational videos have reached more than 100,000 viewers — proof of the demand for foster care-related education in Florida and beyond.

To further its reach, Sartor and her team conduct in-person presentations and CLE sessions. Looking to the future, Sartor says they'll be hiring a community marketer and trainer to build relationships with case managers and ensure wider adoption of the app. Recent updates include an immigration section available in Spanish and Creole, as well as a forthcoming human-trafficking module, reflecting the diverse needs of Florida's foster youth and FosterPower's commitment to addressing them.

Sartor's current focus is on ensuring all foster youth across the state are empowered with the information they need to succeed. "We continue to work on getting it out across the state so that all youth know about it," she explains, adding, however, that her ambitions for the app extend beyond Florida. With the help of funders and other local experts, she says she envisions expanding FosterPower to other states, encouraging local organizations to adopt and adapt the platform for their own unique needs.

For those inspired by Sartor鈥檚 work, her advice is clear: Listen to the population you aim to serve, involve them in the design process, and compensate them fairly. She stresses that building tech solutions for vulnerable populations requires deep empathy, adaptation, and a commitment to sustainability.

FosterPower is a unique opportunity to change lives for some of the country's most vulnerable — foster youth — and the reception of the app is a testament to its thoughtful design and development process, and to the power of user-centered design and compassionate innovation. With every download, FosterPower moves closer to ensuring that no child in foster care is left without the resources they need to succeed and thrive.


You can find out more about how legal technology is helping further the cause of justice here.

]]>
AI puts Spotlight on victim identification in fight against domestic minor sex trafficking /en-us/posts/human-rights-crimes/spotlight-trafficking-victim-identification/ Fri, 24 Jan 2025 13:34:47 +0000 https://blogs.thomsonreuters.com/en-us/?p=64618 Over the last decade, domestic minor sex trafficking (DMST) has evolved to become increasingly sophisticated, with traffickers leveraging technology to accelerate the spread of exploitation. In response, Spotlight, a non-profit organization, has been at the forefront of combating DMST, utilizing artificial intelligence-powered technology to aid investigations and identify juvenile victims.

Since its launch in 2014, Spotlight has grown to be the leading application for investigators of juvenile sex trafficking nationwide, enabling them to quickly identify victims and disrupt trafficking networks, according to CEO . "What we learned from interviews with investigators who were investigating child sex trafficking was that it was a very manual process for them to be sifting through classified ads to identify a juvenile and next to impossible to tell that a victim had been moved across state lines. It was prototyped and then it was distributed around Phoenix's Super Bowl in 2015. It turned out that the technology actually did work."

Over the past decade, it has expanded its capabilities and reach, becoming an across the United States, with more than 8,000 users in the United States and Canada. While the application initially focused on identifying victims advertised online, Spotlight has evolved its approaches as technology and trafficking methods have changed. It now incorporates a broader range of data sources and continues to innovate to keep pace with the changing landscape of human trafficking. For example, Boorse states, "We've gone through several different iterations and have advanced to perceptual matching." In addition, Spotlight uses artificial intelligence to predict content, which allows investigators to include and exclude certain tags to .
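Boorse does not detail Spotlight's implementation, but "perceptual matching" generally refers to perceptual hashing: reducing an image to a short fingerprint so near-duplicate photos can be found even after resizing, recompression, or brightness changes. A minimal average-hash sketch illustrates the idea; this is a generic textbook technique, not Spotlight's actual algorithm, and the pixel grids below are invented, with images assumed to be already downscaled to small grayscale grids:

```python
# Generic average-hash sketch; not Spotlight's actual algorithm.
# Assumes images have already been downscaled to small grayscale grids.

def average_hash(pixels):
    """Fingerprint a grayscale grid: one bit per pixel, set if above the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(h1, h2):
    """Count differing bits; small distances indicate near-duplicate images."""
    return sum(a != b for a, b in zip(h1, h2))

# Two invented 4x4 "images": the second is a uniformly brightened copy.
original = [[ 10,  20, 200, 210],
            [ 15,  25, 205, 215],
            [190, 195,  30,  35],
            [200, 210,  20,  25]]
brightened = [[p + 8 for p in row] for row in original]

h1, h2 = average_hash(original), average_hash(brightened)
print(hamming(h1, h2))  # 0: uniform brightening leaves the fingerprint unchanged
```

Because the hash depends only on each pixel's relation to the image mean, simple edits leave the fingerprint nearly unchanged, which is what lets investigators match a photo reposted across many listings.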


The organization’s approach has broken down bureaucratic barriers and enabled rapid innovation and deployment of new technologies to protect children from sex trafficking.


The organization’s approach has broken down bureaucratic barriers and enabled rapid innovation and deployment of new technologies to protect children from sex trafficking while equipping frontline responders with intelligence to identify juvenile victims and sex trafficking networks. For instance, a 15-year-old girl was recovered within two weeks of using Spotlight.

By integrating with existing law enforcement databases and tools, Spotlight enhances the ability of investigators to . This leads to more successful recoveries of victims, and since its inception Spotlight has played a critical role in identifying over 26,000 victims. Investigators who use the application daily report a 60% time savings.

A three-pronged approach in the fight against domestic minor sex trafficking

At the same time, technology and collaboration with law enforcement only go so far. Additional efforts by legislators and companies are needed to address the complexities of domestic minor sex trafficking.

Legislation plays a crucial role in establishing legal frameworks and mechanisms to combat DMST. Laws must target traffickers with stringent penalties, protect victims by providing support services and safe harbor provisions, and mandate reporting requirements for platforms and industries susceptible to exploitation. The recent REPORT Act exemplifies this by expanding reporting obligations for online platforms and increasing penalties for non-compliance.

Corporations have a responsibility to actively participate in combating DMST. This includes implementing robust internal controls to prevent their platforms from being exploited, collaborating with law enforcement, and supporting organizations dedicated to fighting trafficking. Platforms susceptible to exploitation by traffickers must prioritize detecting suspicious behavior, identifying patterns, and reporting .

New technology is constantly being exploited for crimes like human trafficking. Traffickers increasingly use social media, messaging platforms, and alternative payment methods, but efforts to monitor and disrupt these spaces lag behind. Corporations that offer these services need to invest in detection methods, enhance their ability to identify suspicious activity, and report early to hold traffickers accountable and keep their platforms safe.

Technology presents both a challenge and a solution in the fight against DMST. While traffickers exploit online platforms, technology also empowers investigators and organizations like Spotlight to identify victims, track networks, and disrupt their operations. AI-powered tools analyze vast datasets to identify patterns and connections, allowing law enforcement to intervene quickly.

However, constant innovation is needed to stay ahead of traffickers who adapt quickly to new technologies. "Some of the biggest gaps in child trafficking are hiding in plain sight in many of the places where people are, such as being advertised on most of the major social media platforms and dating applications and websites. That is why there is a need for everyone to be involved," Boorse adds.


You can find more information on the terrible problem of sex trafficking here

]]>
Scaling Justice: Collaborating for impact — 5 essential tips for justice tech providers /en-us/posts/technology/scaling-justice-collaborating-justice-tech/ Tue, 21 Jan 2025 13:02:39 +0000 https://blogs.thomsonreuters.com/en-us/?p=64549

This article is the first in an ongoing series titled "", by Maya Markovich in consultation with the 成人VR视频 Institute. This series aims to explore not only how justice technology fits within the modern legal system, but also how technology companies themselves can scale as businesses while maintaining an access to justice mission.


Mission-driven legal technology providers — often called justice tech providers — and legal service organizations (LSOs) share a fundamental commitment to expanding access to justice for all individuals. While both aim to provide essential legal information and support to those in need, they each operate with distinct cultures and approaches.

Recognizing and understanding these differences can help these organizations collaborate more effectively, ultimately leveraging their collective expertise to better serve people seeking legal assistance.

This shared mission drives LSOs and justice tech companies to meet clients where they are, whether through community-based clinics or digital platforms. Working with limited resources, both have developed strategies to maximize their impact through creative solutions 鈥 and collaboration is a natural extension of their shared mission for justice equity.

Finding shared values in legal services

LSOs bring decades of community trust and deep systemic knowledge to their work. Through direct service, they've built strong relationships with clients, courts, and community partners. Their comprehensive understanding of legal systems comes from daily navigation of these complex procedures on behalf of vulnerable populations. However, they cannot serve everyone who seeks their help — due to limited capacity, matter type, or other reasons.

Justice technology providers complement the traditional strengths of LSOs with innovative approaches. Their focus on user-centered design and client outreach brings fresh perspectives to legal service delivery, while their ability to rapidly iterate on solutions enables quick adaptation. Often, justice tech founders have lived experience with the problems they're seeking to solve and are similarly plugged into their communities. Their broad understanding of legal challenges and use of technology to scale support helps bridge the justice gap for legal aid clients as well as the often-overlooked middle-income population that may be seeking legal recourse.

Navigating partnership challenges

Despite complementary strengths and mission alignment, partnerships between LSOs and justice tech organizations can face challenges.

Time constraints affect both parties: LSOs are often stretched thin, serving as many immediate client needs as possible with little bandwidth to experiment; meanwhile, tech organizations face pressure to develop, test, and deploy solutions quickly. With LSOs typically moving deliberately due to institutional restrictions and tech organizations moving fast to deliver their best products through rapid iteration, different organizational paces can create friction in joint projects.

Further, funding constraints and differing financial models can misalign partnership expectations, while jurisdictional complexities and regulatory hurdles can complicate the implementation of tech across regions. Sadly, it is frequently a simple lack of awareness about potential partnerships that prevents promising collaborations.

5 essential tips

These 5 recommendations for justice tech companies to build meaningful collaborations with LSOs are drawn from listening sessions with LSOs in 2024.

1. Authentically present yourself

A long-term commitment to access to justice principles demonstrates alignment with LSO values. Justice tech firms should communicate their genuine passion for solving legal access problems, not merely as a professional or company goal, but as the North Star that motivates their work to bridge the justice gap through technology.

For a justice tech leader, sharing your path to justice technology is a powerful way to connect and build trust. Highlight relevant background and experience that uniquely positions you to solve the problem, including your technical expertise, and how and why you built your solution and selected your team.

2. Practice radical product transparency

LSOs are stretched for time and operate on scant resources. They are also justifiably concerned about running up against boundaries that could jeopardize their funding. As a justice tech leader, you should clearly distinguish between the legal information your products and services provide and legal advice. Be transparent about what your product can do today, and what you're working on building toward in the future.

Unrepresented people are often subjected to predatory practices, and LSOs can rightfully be fiercely protective of their clients, so trust concerns may emerge around data security and client privacy. Address data concerns proactively and provide comprehensive explanations of data privacy practices. Outline clear data-sharing policies and document your security measures.

It is also critical to validate the impact your product may have in order to encourage LSOs to invest in technology that scales. Share your data, demonstrate your focus on measurable outcomes, and propose success metrics that will measure progress for both organizations.

3. Build strong relationships

Trust is essential. LSOs are more apt to invest in external partnerships with providers that have worked successfully with their peers, so tech leaders should actively work to build goodwill and cultivate positive references now. They should engage their advisory board members as advocates and connectors, ask for feedback after connecting with non-profits and government agencies, and document client success stories.

Maintaining clear communication is also critical, so leaders should practice responsive engagement, use clear and jargon-free language, and document commitments and follow-through. Demonstrate reliability with consistent follow-up, adherence to deadlines, and addressing concerns promptly.

4. Research non-profits, generally and specifically

Understanding the dynamics of non-profits means accounting for unique funding cycles, grant report deadlines, and budget constraints. Navigating their processes requires a nuanced approach that acknowledges resource limitations and recognizes the complex ecosystem in which these organizations operate. Justice tech companies that are non-profits themselves may be more familiar with these issues, but for-profit companies may need to take time to familiarize themselves with the non-profit model and its key stakeholders.

Effective partnerships also require thorough due diligence on the specific non-profit of interest. By carefully examining financial documents, studying annual reports, and researching funding sources, justice tech startups can gain crucial insights into an LSO鈥檚 financial health, mission, and operational capabilities.

5. Ensure long-term sustainability and reliability

Technology partnerships in legal services require more than an initial connection 鈥 they need a strategic approach to sustainability and support. Justice tech companies should design smooth handoff processes that promote intuitive knowledge transfer supported by comprehensive documentation. Robust training resources ensure that LSOs and tech providers can fully leverage new systems, bridge potential skill gaps, share resources, work to each organization鈥檚 strengths, and empower teams to use technologies effectively. Successful collaborations should focus on creating seamless transitions that minimize disruption and maximize ongoing stakeholder value.

Readying for the long journey

Successful partnerships among justice tech firms and LSOs are built on trust, transparency, and a shared commitment to expanding legal access. These collaborations transform potential connections into powerful opportunities to advance justice equity by creating lasting value for both organizations and the communities they serve.


If you're currently operating in the state of North Carolina or building toward that market, consider applying to collaborate with . For more information about mission-focused legal tech, visit the and the

]]>
AI for Legal Aid: How to empower clients in need /en-us/posts/legal/ai-for-legal-aid-empowering-clients/ https://blogs.thomsonreuters.com/en-us/legal/ai-for-legal-aid-empowering-clients/#respond Tue, 15 Oct 2024 13:33:17 +0000 https://blogs.thomsonreuters.com/en-us/?p=63385 It's hard to overstate the impact that artificial intelligence (AI) is expected to have on helping low-income individuals achieve better access to justice. And for those legal services organizations (LSOs) that serve on the front lines, too often without sufficient funding, staff, or technology, AI presents perhaps their best opportunity to close the justice gap. With the ability of AI-driven tools to streamline agency operations, minimize administrative work, reallocate talent, and allow LSOs to serve clients more effectively, the implementation of these tools is essential.

Innovative LSOs leading the way

Already, many innovative LSOs are taking the lead, using new technology for tasks ranging from complex analysis to AI-driven legal research. Here are two compelling examples of how AI is already helping LSOs empower low-income clients in need.

The Legal Aid Society of Middle Tennessee and the Cumberlands: Automating expungements for economic mobility

In Tennessee, where a large number of individuals are eligible to expunge their criminal records, the Legal Aid Society (LAS) of Middle Tennessee and the Cumberlands "previously handled expungement petitions manually – sometimes typing them, sometimes even handwriting them," says Zachary Oswald, Senior Deputy Director of Client Services at LAS. "While the work was important, it was extremely time-consuming and felt like a process ripe for automation. Recognizing the scale of the issue across the state and the potential impact we could have, we set out to find a more efficient solution."

In response, paralegal Mustafa Enver led the team in building an AI solution that could successfully identify, draft, and file expungement petitions. Their program taught ChatGPT, a publicly available generative AI (GenAI) platform, to read anonymized criminal records, sort convictions from charges, and flag the charges eligible for expungement in a spreadsheet. Once that spreadsheet was pushed to a third-party document automation tool, the team could automatically generate expungement petitions.
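The downstream half of that workflow – separating convictions from other charges, flagging eligible ones, and producing a spreadsheet for the document automation tool – can be sketched in a few lines of Python. This is a hypothetical illustration only: the article does not disclose LAS's actual prompts or Tennessee's full eligibility criteria, so the dispositions and rules below are invented placeholders for the classification step a GenAI pass would feed.

```python
import csv
import io

# Hypothetical eligibility rule: non-conviction dispositions only.
# Tennessee's real expungement criteria are far more detailed.
ELIGIBLE_DISPOSITIONS = {"dismissed", "nolle prosequi", "not guilty"}
CONVICTION_DISPOSITIONS = {"guilty", "convicted"}

def flag_eligible(records):
    """Sort convictions from other charges and flag likely-eligible ones.

    Each record is a dict with 'charge' and 'disposition' keys, as might
    be extracted from an anonymized criminal record by a GenAI pass.
    """
    rows = []
    for rec in records:
        disposition = rec["disposition"].strip().lower()
        is_conviction = disposition in CONVICTION_DISPOSITIONS
        eligible = (not is_conviction) and disposition in ELIGIBLE_DISPOSITIONS
        rows.append({**rec, "conviction": is_conviction, "eligible": eligible})
    return rows

def to_spreadsheet(rows):
    """Render flagged rows as CSV for the document-automation handoff."""
    buf = io.StringIO()
    writer = csv.DictWriter(
        buf, fieldnames=["charge", "disposition", "conviction", "eligible"]
    )
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

records = [
    {"charge": "simple possession", "disposition": "Dismissed"},
    {"charge": "theft under $1,000", "disposition": "Guilty"},
]
print(to_spreadsheet(flag_eligible(records)))
```

In the real workflow, every flagged row is still reviewed by an attorney before a petition is filed – the automation narrows the haystack rather than making the legal judgment.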


Criminal charges, even those that are eligible for simple, free expungement, can prevent someone from obtaining housing or employment. This is a simple barrier to overcome if only help is available.


The results have been tremendous. For example, at a one-day legal clinic, LAS expunged 324 charges for 98 people – work that would have taken much longer without automation. "Criminal charges, even those that are eligible for simple, free expungement, can prevent someone from obtaining housing or employment. This is a simple barrier to overcome if only help is available," says Oswald.

Despite initial reluctance from some attorneys, the project has proven that AI can free up legal professionals' valuable time to focus more on clients. "Technology won't replace the human element," Oswald emphasizes, noting that clients still need to understand the legal process and how it impacts their lives. Instead, AI should serve as a support function for pro bono lawyers – with its output still checked by attorneys before submission. In the case of expungement, using tools to identify and draft petitions for attorney review and submission can save time, scale the number of people served, and free attorneys to focus on the human side of their client engagement.

Legal Aid of North Carolina: Virtual assistants for legal information

Legal Aid of North Carolina's Innovation Lab is leveraging AI to provide 24/7 access to legal information through a virtual assistant known as the AI Legal Information Assistant (LIA). Available on the organization's website, LIA answers questions from the public about housing, family law, and consumer rights in plain language, making legal information accessible to those who may not have immediate access to an attorney.

Scheree Gilchrist, Legal Aid of North Carolina's Chief Innovation Officer, saw an opportunity to apply AI to the organization's knowledge base so the agency could help more clients at scale. Particularly in rural and low-income areas, legal aid offices may not be easily accessible, or clients may face long waits to speak with an attorney on a time-sensitive matter. "We recognized that human-centered solutions are critical for addressing access to justice challenges, but we also needed a scalable approach to meet the rising demand for legal services, particularly as it relates to accessing legal information and resources," says Gilchrist.

"AI offers the capacity to provide quick, accurate information to a vast audience, particularly to those in urgent need," she adds. "AI can also help reduce the burden on our legal staff, allowing them to focus on more complex or nuanced cases that require human intervention while simple, routine questions and information can efficiently be handled by the AI."


AI offers the capacity to provide quick, accurate information to a vast audience, particularly to those in urgent need. AI can also help reduce the burden on our legal staff…


Partnering with a legal technology company, the team conducted internal research projects and client discovery conversations to identify the most-requested areas of legal assistance. They then developed a closed-system knowledge base to ensure high-quality responses and engaged law students and clients to repeatedly test the service. After several iterations, the team launched the tool on the organization's website, which has averaged more than 95,000 views on its Get Help webpage over a five-month period, with 20,000 views on its housing resources alone.
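The "closed system" design deserves a brief sketch: rather than letting a model generate legal information freely, the assistant answers only from vetted entries and otherwise declines and hands off to staff. The retrieval below is a deliberately minimal keyword-overlap stand-in – Legal Aid of NC's actual content and retrieval stack are not public, and the knowledge-base entries here are invented examples.

```python
import re

# Hypothetical curated entries standing in for a vetted knowledge base.
KNOWLEDGE_BASE = {
    "eviction notice": (
        "A landlord generally must file a court complaint and obtain "
        "a judgment before an eviction can proceed."
    ),
    "security deposit": (
        "Landlords must return a security deposit, or an itemized "
        "accounting, within the period set by state law."
    ),
}

def tokenize(text):
    """Lowercase word tokens for a simple set-overlap comparison."""
    return set(re.findall(r"[a-z]+", text.lower()))

def answer(question, threshold=2):
    """Return the best-matching vetted entry, or decline to answer.

    Declining (instead of generating freely) is what keeps a closed
    system from hallucinating legal information.
    """
    q_tokens = tokenize(question)
    best_key, best_overlap = None, 0
    for key, text in KNOWLEDGE_BASE.items():
        overlap = len(q_tokens & (tokenize(key) | tokenize(text)))
        if overlap > best_overlap:
            best_key, best_overlap = key, overlap
    if best_overlap < threshold:
        return None  # hand off to intake staff instead of guessing
    return KNOWLEDGE_BASE[best_key]

print(answer("My landlord gave me an eviction notice, what now?"))
```

A production system would use embedding-based retrieval over a much larger corpus, but the refusal path – return nothing rather than an unvetted answer – is the part that makes the system "closed."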

Legal Aid of North Carolina updates and refines LIA on an ongoing basis and has seen the investment pay off. "With more than 400,000 requests for services annually, many people exit our process without ever connecting with an attorney or intake staff – a reality we found unacceptable," says Gilchrist. "AI won't solve this problem entirely, but it can ensure that individuals who contact Legal Aid of NC receive accurate information that helps them take informed next steps."

In its next phase, Gilchrist envisions LIA serving as a concierge for applicants, ensuring they connect with accurate, trusted resources or with legal aid staff, as appropriate. "While AI can't replicate the nuanced understanding of a human lawyer, it can provide the first step toward resolving legal issues and empower individuals to navigate complex legal systems with confidence," she explains.

Evolving AI for legal aid

These case studies highlight the power of AI not only to strengthen legal aid organizations' internal work so they can serve more people, but also to provide critical resources to individuals seeking legal information on their own. By taking a human-centered approach to design, double-checking outputs, and continuously improving their AI models, these advanced-tech pioneers are creating playbooks that, over time, will significantly expand access to justice at scale.


You can find out more about how AI for Legal Aid can help those accessing these services to secure better results here.
