As artificial intelligence (AI) establishes itself as a major disruptive force of the modern age, it is no surprise that generative technologies are rapidly transforming industries. The real estate sector is no exception. From virtual home staging to AI-generated property valuations, the potential efficiencies and innovations that these tools offer real estate salespersons are vast.
However, such developments come with their fair share of complex legal, regulatory and ethical risks. Identifying these implications and creating strategies to manage risks appropriately is therefore key to ensuring responsible and compliant AI adoption by the real estate industry.
While this article focuses on real estate, AI remains an emerging field and we look forward to exploring its broader impact across other industries in future articles, so stay tuned.
Singapore’s regulatory approach to AI: “Masterly inactivity”
In response to the advent of AI, Singapore has adopted a broad-brush strategy of “masterly inactivity” — a term coined[1] to describe the deliberate restraint exercised by regulators in the face of fast-evolving technologies. Rather than rushing to implement binding regulations, regulators have chosen to rely on a combination of soft-law frameworks and existing legislation to manage AI usage.
Notably, the Infocomm Media Development Authority (IMDA)’s “Model AI Governance Framework for Generative AI”[2] sets out principled guidelines to promote the responsible use of such tools. The masterly inactivity regime also draws on existing laws such as the Personal Data Protection Act 2012 (PDPA) to provide a baseline standard of protection where generative AI systems are used — the Personal Data Protection Commission (PDPC)’s “Advisory Guidelines on the Use of Personal Data in AI Recommendation and Decision Systems”[3] (PDPC AI Guidelines) lists PDPA provisions that may remain enforceable against developers of AI tools that harvest personal data unlawfully.
The nation’s intentions are clear — to observe and engage with the risks of embracing generative AI while avoiding hard regulations that might stifle innovation. There are, however, no AI-specific regulations targeted at the real estate sector as yet. Real estate salespersons who choose to incorporate generative AI in their practice must therefore keep abreast of changes to Singapore’s evolving AI policy landscape. There is no need to dispense with such innovative tools, but agents must remain diligent in following existing guidelines in order to limit their professional liability.
Lessons from abroad: Attaching explicit disclaimers of AI usage
Foreign jurisdictions are no strangers to the implications of using generative AI in real estate work either, and it can be helpful to draw on their considered approaches.
American tech real-estate marketplace Zillow Group Inc. (Zillow) offers “Zestimate” — an AI-powered tool that estimates home values for buyers and sellers by analysing sales transactions and property details, amongst other data points.[4] Though its figures were meant only as a starting guide, users of the tool began to perceive these estimates as definitive valuations, and a class action lawsuit was filed alleging that Zillow had misled consumers. While the U.S. District Court for the Northern District of Illinois dismissed the suit in 2017, Judge Amy St. Eve emphasised the importance of including clear and explicit disclaimers distinguishing AI-generated estimates from actual appraisals whenever AI tools are used.
This case has relevance for Singapore, where real estate platforms like the SRX Property Exchange also use AI to generate property value estimates (the “X-Value”). To limit liability, SRX’s terms of use explicitly state that these estimates are computer-generated and do not constitute investment advice.[5] Users are moreover encouraged to consult a qualified person for a formal valuation report.
For real estate salespersons who intend to embrace such tools, attaching explicit disclaimers that discourage users from placing complete reliance on AI-generated values is good practice.
The use of generative AI images in property marketing — commonly known as virtual home staging — has also risen. In 2024, English property marketplace Rightmove drew attention for “fixing” property pictures with AI image generators on almost all of its home listings.[6] Prompts like “make the kitchen look bigger” were submitted, resulting in virtual stagings that enlarged the apparent space but inserted child-sized stovetops to achieve the effect. Disclaimers were also poorly placed, failing to make clear that the images were AI-generated.
The reality of a property can be easily distorted, posing risks of regulatory action or professional liability if buyers or tenants are misled. Therefore, it becomes crucial for real estate salespersons to err on the side of caution — to ensure that any AI-enhanced images are clearly marked as such and accompanied by disclaimers where necessary.
Such disclosures need not come in the form of lengthy walls of text that burden the reader. Attaching a simple message beneath AI content, such as “Disclaimer: This output has been generated by artificial intelligence and should not be relied upon completely.”, may be sufficient to warn consumers that what they are seeing is not human advice.
Lessons from abroad: Upholding the duty of care as real estate salespersons
Beyond providing disclaimers of AI usage, it is perhaps more fundamental for real estate salespersons to ensure that they do not adopt AI outputs wholesale when advising their clients. Gone are the days when humans were the only sources of advice — large language models like ChatGPT can now generate pages of guidance in a matter of seconds.
In 2020, the United Kingdom’s Financial Conduct Authority (FCA) reviewed AI-driven advisory platforms to ensure that they met standards of suitability and due care.[7] Though no formal enforcement action was taken, the FCA reinforced the duty of care that regulated professionals owe to their clients when providing AI-generated advice — a duty that applies with equal force to real estate salespersons.
Locally, the takeaway is clear. Real estate professionals who use AI-generated advice to save time must not pass such information on to consumers without reviewing the output. It is key for salespersons to fact-check, clarify uncertainties, and ensure the advice is tailored and appropriate for each client’s specific circumstances. The use of AI does not absolve the professional of responsibility.
Alignment with existing legislation: AI and the protection of personal data
Legal risks to real estate salespersons also arise in the sphere of personal data, as AI tools are often designed to ingest and retain vast amounts of data. The PDPA therefore retains its importance and imposes obligations even where AI systems are deployed to sift through data.
An example would be using AI to identify potential clients by running through an existing client database, which could breach the PDPA if done without prior consent from the individuals concerned for such purposes.
The PDPC AI Guidelines clarify that while Singapore has not enacted AI-specific legislation, developers of AI systems could still be liable under the PDPA for breaches of data protection obligations, especially where consumers are unaware that their data is being fed into such systems.
Conclusion
AI presents tremendous opportunities for the real estate sector — enhancing efficiency, refining marketing strategies, and improving customer engagement. However, its use must be tempered with legal and ethical awareness. From attaching accurate disclaimers and reviewing AI outputs to complying with evolving regulations, the real estate sector must proactively safeguard its practices.
As Singapore positions itself as an AI hub, the balance between innovation and regulation will continue to be tested. By taking a risk-based, principles-led approach, the industry can ensure that AI serves not just as a tool for productivity, but also as a force for trust and accountability.
The preparation of this article was supported by the invaluable assistance of dispute resolution intern Noah Tan.
As part of our broader examination of emerging technologies, this article offers an initial perspective on AI’s implications for the real estate sector and may be complemented by future insights exploring other sectors and industries.
For further information, please contact:
Sharon Lin, Partner, Withersworldwide
sharon.lin@withersworldwide.com
Footnotes
[1] Pang Cheng Kit, “A Comparative Analysis of Artificial Intelligence Regulation: Implications for Singapore” (2025) at page 193
[2] Infocomm Media Development Authority, “Model AI Governance Framework for Generative AI” (2024)
[3] Personal Data Protection Commission, “Advisory Guidelines on the Use of Personal Data in AI Recommendation and Decision Systems” (2024)
[4] “Zillow wins dismissal of ‘Zestimate’ lawsuit in US”, Reuters (2017), https://www.reuters.com/article/technology/zillow-wins-dismissal-of-zestimate-lawsuit-in-us-idUSKCN1B32RM/
[5] “SRX X-Value”, SRX Property Exchange Portal, https://www.srx.com.sg/xvalue-pricing
[6] “Rightmove’s AI Home Staging”, Vice Media (2024), https://www.vice.com/en/article/ai-generated-furniture-real-estate-listings/
[7] “FCA AI Update” (2025), https://www.fca.org.uk/publication/corporate/ai-update.pdf