SINGAPORE: With the judiciary leading the charge in technology and innovation by introducing artificial intelligence (AI) tools to the small claims tribunal, generative AI is also quietly reshaping how some law firms operate.
By the time the Singapore courts issued a guide on the use of generative AI tools by lawyers in October 2024, law practices of all sizes had already begun adopting the technology in their processes, albeit cautiously.
This includes generating quick summaries of cases, transcribing audio recordings for reference against witness statements, and drafting first-cut legal submissions – work that would otherwise have required painstaking and time-sapping effort.
Their goal? To boost efficiency by cutting through stacks of case documents, without compromising legal rigour.
Speaking to CNA, representatives from six law firms stressed that human lawyers have to verify AI-generated work to guard against occasional overstatements and unverifiable assertions, including hallucinations and bias.
Most firms are deploying AI in baby steps rather than big moves, though some are already planning the rollout of more ambitious AI agents in the years ahead, CNA has learnt.
HOW AI IS DEPLOYED
Those from the larger firms said they use tools such as HarveyAI, a generative AI platform designed for the legal sector, and Microsoft Copilot, an AI assistant that helps streamline workflows and automates repetitive actions.
The Singapore Courts renewed their collaboration with the American start-up HarveyAI to use its technology in September this year.
Mr Adrian Wong, head of dispute resolution at Rajah & Tann Singapore, said the firm uses Harvey and RelativityOne, an AI-enhanced e-discovery tool for dispute resolution.
The tools are used for drafting legal documents, summarising court judgments, preparing chronologies of events and extracting key issues from large sets of data or documents.
They are also used to analyse facts and identify key legal issues in a dispute, translate documents, ideate and test alternative arguments for cases, and conduct general searches on legal topics.
“Harvey is also used as a know-how repository for our in-house template pleadings and precedents, enabling our disputes lawyers to efficiently locate and retrieve relevant materials through natural language queries on the platform,” said Mr Wong.
Lawyers and support personnel at Rajah & Tann also use Microsoft Copilot.
They use the tool to draft meeting notes and action items arising from meetings, conduct market research on business and legal trends, and generate quick recaps or updates on a project or case, drawing on data from Microsoft applications such as their inbox, meeting transcripts in Microsoft Teams, or files in OneDrive.
The tool is also used to adjust the tone and language of email and written communications, to translate documents or to draft legal documents with reference to data, research materials and evidence stored in Microsoft applications.
All the lawyers at Rajah & Tann have access to at least one generative AI tool and might tap on others to assist with specific tasks, such as Opus 2 or Epiq, which are dispute case management platforms for complex dispute resolution matters, said Mr Wong.
Mr Chou Sean Yu, managing partner at WongPartnership, said his firm started leveraging AI technology as early as 2017. He claimed that it was the first Singapore law firm to deploy natural language processing for contract review and e-discovery with Luminance.
Natural language processing enables computers to understand, interpret and generate human language in a natural spoken or written way, and can break down data.
In 2024, WongPartnership formed a “strategic partnership” with Harvey and was one of the early adopters in Southeast Asia to deploy its generative AI system.
Mr Chou said automating routine tasks with AI allows lawyers to achieve better work-life balance and allows them to focus their expertise on higher-value strategic tasks.
“While AI enhances efficiency, it complements rather than replaces the personalised attention and strategic insight that our clients depend on,” he said.
Drew & Napier began using generative AI in its workflows this year, said Mr Rakesh Kirpalani, chief technology officer and director of dispute resolution and IT.
About 350 lawyers use the technology: all lawyers in the firm have access to Microsoft Copilot, while about 20 per cent have access to Harvey at any one time through allocated accounts.
Mr Kirpalani said Harvey was “particularly useful” when working with large document sets, while Copilot is best suited for administrative tasks or for use with more sensitive documents, since it does not generally upload or process files beyond the firm’s Microsoft trust boundary.
For court-related work, the firm uses or intends to use generative AI for initial document reviews to identify key issues and potentially relevant documents from large data sets.
The lawyer can then narrow down timeframes and keywords that are likely to be relevant before performing a manual review, said Mr Kirpalani.
Lawyers also use the tools to prepare brief analyses or summaries of documents to help them in preparing pleadings, submissions and witness statements. The tools help them create chronologies or summary tables, or provide preliminary views on the documents.
The tools also provide “rough and ready translations of documents to be used for initial reviews before professional translations are obtained”, said Mr Kirpalani.
He said the firm was exploring other legal-related use cases with Harvey and Copilot and educating its lawyers on how to use prompts with generative AI, and that these uses are intended for “low-level tasks that require minimal legal input”.
The results will then facilitate higher level reviews, analysis and decision-making by Drew & Napier’s lawyers, said Mr Kirpalani.
Withers KhattarWong is adopting a phased approach to integrating generative AI tools, with more than 60 per cent of its lawyers across the firm using Copilot as of early September this year.
“Microsoft Copilot has been adopted as a general-purpose AI tool across the firm, which supports operational efficiency across both fee-earning and business services teams,” said Mr Pardeep Khosa, Withers’ head of litigation.
“Building on this foundation, the firm has layered additional generative AI tools tailored to specific departments, teams, and use cases. These tools are designed to complement core workflows, streamline routine tasks, and enhance strategic output.”
Withers lawyers also use legal-specific platforms such as LegalScape and LexRoom, which are being piloted for research and drafting.
SMALLER PLAYERS TESTING WATERS
Apart from the big law firms, boutique firms are also starting to venture into the use of AI.
Ms Bozy Lu, partner at boutique law firm Han & Lu Law Chambers, said the firm’s use of AI was minimal, though Copilot is deployed for administrative tasks.
“For instance, we handle personal injury claims and clients often hand us a stack of their medical and transport invoices. We use Copilot to assist in reviewing these documents and tabulating them into a format that is neat and presentable for court use,” she said.
The firm is piloting using AI to summarise documentary evidence and to draw family trees, but this is “very much work-in-progress at the moment”, said Ms Lu.
Her firm also recently dabbled in using Gemini to draft legal submissions for road traffic accident cases, such as proposing how liability should be apportioned between the parties in a case.
This was in a recent collaboration with Bizibody Technology at the Personal Injury Property Damage (PIPD) Seminar 2025 with the Law Society of Singapore.
“It was quite an enlightening session sharing with the PIPD bar on it as the output was decent and could be comparable to a first year associate’s drafts,” said Ms Lu.
Ms Simran Toor, partner and head of regulatory and investigations at boutique firm Tang Thomas, said the new firm was in the process of setting up its systems and was looking at adopting AI in two key areas.
“To support the business of law; and to support the practice of law,” she said. “The first would help us enhance backend operations such as file management and customer relationship management, and the second would support legal output like drafting and research.”
While the firm recognises “the importance of efficiency and streamlining costs for our clients”, it is taking “an intentional and well-considered approach to adopting our AI tools” due to questions on whether the use of such tools may compromise legal privilege and client confidentiality.
AI-POWERED PROSECUTORS
Prosecutors are also embracing the use of AI, such as through AI-enabled image recognition software, said Solicitor-General Daphne Hong.
“When our prosecutors have to go through large amounts of digital child sexual abuse material as part of their review of evidence for a case, the tool can help to produce a preliminary classification and identification of these materials,” said the senior counsel.
“This significantly reduces the time spent by the officers sifting through the evidence, which can frequently amount to thousands of such photos and videos.”
The Attorney-General’s Chambers’ Legal Technology Innovation Office, staffed by a mix of lawyers and IT professionals, helps to drive the development and adoption of AI and other technological tools, said Ms Hong.
She said the AGC is focusing its efforts on developing other AI tools to help its officers handle high-volume but repetitive tasks more efficiently, and is encouraging all AGC officers to leverage such tools.
“AI can help our officers with our work, including preparation for court hearings, but at the same time, we are cognisant that individual officers are responsible for the work they produce, and they must personally ensure the accuracy and quality of their work,” said Ms Hong.
MEDIATORS’ AI ASSISTANT
Over at the Singapore International Mediation Centre (SIMC), mediators have been using MAIA (Mediation AI Assistant), a tool SIMC developed with AI vendors, since it was launched in August last year.
An enhanced version called MAIA 2.0 was launched during Singapore Convention Week 2025.
The tool is designed exclusively for mediators, helping them cut down and digest the large volume of documents submitted by parties before mediation, said SIMC chief executive officer Chuan Wee Meng.
“Unlike litigation and arbitration, where the judges or arbitrators have weeks or months to master the facts and understand the issues before the hearing, the mediator only has seven days or less to read and digest the large volume of documents submitted by the parties before the mediation,” said Mr Chuan.
In some cases, the documents may exceed tens of thousands of pages, setting out the opposing or even conflicting positions in a long-running dispute, and it is a “huge challenge” for a mediator to understand the facts and draw out the key issues before the actual mediation, he said.
With MAIA, mediators can obtain a summary of the dispute, create a chronology of events and compare the similarities and differences between parties’ positions and expert reports.
Since its launch, MAIA has been used in more than 25 mediation sessions and is being adopted “by a growing number of mediators at SIMC”, said Mr Chuan.
“We have received strong feedback. One mediator called its deliverables ‘shockingly accurate’. Justice Lee Seiu Kin said ‘I would rate it a seven out of 10. It has the potential for much improvement over time as the system gets tuned with greater use’,” said Mr Chuan.
The tool is provided at no additional cost to parties, and one mediator said it has helped cut down the time for preparation by at least 30 per cent, as it could give an overview of the case and documents and provide a chronology to refer to while reading the documents in detail.
POWERFUL, BUT DANGEROUS
While users of AI in legal practice recognise its usefulness, they are also aware of its limitations and safeguard against potential inaccuracies and hallucinations, as in a recent case where non-existent case precedents were cited.
“AI is undeniably powerful. Its ability to process vast amounts of data and generate insights at speed makes it a valuable tool in modern legal practice. But its limitations are equally real,” said Withers’ Mr Khosa.
Because AI operates on input-driven mechanisms, its outputs are only as reliable as the data and context it receives, raising critical concerns around confidentiality, he said.
“Moreover, some AI tools are known to produce plausible yet inaccurate content (so-called ‘hallucinations’) which, if unchecked, can compromise legal analysis and reputational standing. Responsible adoption requires not just technical safeguards, but a clear boundary between operational support and professional judgment,” said Mr Khosa.
Ms Lu from Han & Lu Law Chambers said she has a “love-hate relationship” with AI because of its shortcomings.
“The prompts that are being fed into AI have to be rather well-crafted and specific, sometimes it’s like teaching a child so it is crucial to go step by step. We have had occasions where AI may generate different answers to similar prompts put in and at times just blank even though it responds that certain thought processes have been endeavoured and some output was ready,” she said.
She added that the well-crafted responses by AI may appear “perfect” and this may breed complacency.
“If lawyers do not check the work generated by the AI and make submissions based on it, that becomes inherently dangerous in the court of law,” said Ms Lu.
Mr Wong from Rajah & Tann said the firm’s use of AI is guided by its internal policy on the responsible use of generative AI, which addresses data protection, privacy, transparency, human oversight and accountability.
The firm has a strict policy on using only authorised technology and AI solutions for specific tasks, and reviews and evaluates each solution before adopting it, to ensure that the information security and data privacy standards meet legal requirements and firm standards as well as professional conduct rules.
“We maintain full transparency with our clients about the use of generative AI in our legal work products. Should a client request that generative AI not be used for a specific matter or for their matters generally, we will comply with their instructions and refrain from its use accordingly,” said Mr Wong.
Occasionally, there are over-statements and unverifiable assertions in raw AI drafts, which reinforces the need for human verification and tone control, he added.
Drew & Napier’s Mr Kirpalani said research conducted against a known universe of precedents and authorities helps to minimise hallucinations, such as constraining research on Harvey to Singapore court decisions publicly available on eLitigation or the online Singapore statutes.
“As a rule, we do not use AI as a final decision-making tool without independent verification of its output. We avoid using AI for highly sensitive matters, and will not use AI if instructed otherwise by our clients, or if there are other contractual prohibitions against the use of AI,” said Mr Kirpalani.
To mitigate hallucination risks, all lawyers must verify the output and check it against citations to ensure accuracy, with responsibility for any AI output remaining with the individual lawyer, he added.
“Like any technology, different AI systems come with their own limitations. AI is not a magic fix and requires varying degrees of human oversight at all times,” said Mr Chou. “It performs well in some aspects of legal work and not so well in others. The challenge lies in identifying where AI can add real value and understanding how much reliance can be placed on its output.”
LOOKING TO THE FUTURE
WongPartnership’s Mr Chou said the firm has consistently been a pioneer in embracing technology and will continue to pursue strategic collaborations with leading tech providers, to harness generative AI to enhance both transactional and disputes-related work alongside its lawyers on the frontlines.
In future, Rajah & Tann will integrate generative AI into its workflow through guidance from its generative AI committee, as well as support from its tech arm, Rajah & Tann Technologies.
It intends to build solutions that are “not only tailored to legal workflows but also deeply embedded in our operational and compliance frameworks”, and to leverage agentic AI to automate legal and operations workflows.
Agentic AI refers to a broader field of AI that allows agents to determine the steps needed to achieve a goal without the need for constant human oversight.
Rajah & Tann also hopes to classify and tag its internal precedents, work products and datasets to enhance and personalise output from its AI platforms and distinguish its AI-enhanced work products from other law firms, said Mr Wong.
Tang Thomas is exploring the use of AI for its social media content, business development and client education tools, while Han & Lu is looking at possible greater use of AI “with cautious reliance” in what it called “a trial and error process”.
Withers hopes to embed AI into “precedent libraries” to support faster and more informed legal advice for its attorneys.
Another possibility would be to develop secure, AI-enabled interfaces that allow clients to interact with their data, track matter progress or access tailored legal updates.
“Ultimately, the goal is to elevate the quality of the service we provide and be better professionals,” said Mr Khosa.