Artificial intelligence is already proving valuable to early adopters in commercial real estate working in content creation and marketing, property valuation and market analysis, predictive analytics, and risk assessment. But trusting the data it generates unquestioningly can be risky.
Hallucinations and Other Problems
“Hallucinations” are AI-generated responses that sound plausible but are factually incorrect. They’re difficult to detect because AI-generated answers are typically well-stated and sound matter-of-fact, yet might provide false data sources and, in some cases, even manufacture quotes.
“AI models depend heavily on the quality of input data used for training models. If it’s incomplete, it can lead to biased predictions or responses,” says Ra’eesa Motala, SIOR, president of Chicago-based Evoke Partners. “Do you have all the key factors inputted correctly to ensure your output will yield results with little room for error or miscalculation? Have you taken variables into account? You need to understand for yourself what went into that decision-making to avoid [hindering] an investment or development transaction.”
How to Ask Questions
“The way that AI generates a response is based on how you ask the questions,” says Kim Ford, SIOR, CEO of the Rise Agency Group in Pittsburgh. ChatGPT provides tips on how to ask questions that will result in the most useful answers (see the sketch after this list for one way they might look in practice):
- Be specific with your request.
- Provide context and background information.
- Use explicit constraints and guidelines.
- Experiment with various phrasings and approaches.
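For brokers or their analysts who reach AI through an API rather than a chat window, the same tips carry over directly. Below is a minimal sketch, not drawn from the article, of how a specific request, context, and explicit constraints might be combined in a single prompt using the OpenAI Python client; the model name and the Pittsburgh office-leasing scenario are illustrative assumptions.

```python
# Minimal sketch (illustrative, not from the article): applying the prompting
# tips above through the OpenAI Python client instead of the chat interface.
# The model name and the market scenario below are assumptions.
from openai import OpenAI

client = OpenAI()  # reads the OPENAI_API_KEY environment variable

prompt = (
    # Context and background information
    "You are assisting a commercial real estate broker in Pittsburgh.\n"
    # A specific request rather than a broad one
    "Summarize the main factors a tenant should weigh when comparing two "
    "Class A office leases in the central business district.\n"
    # Explicit constraints and guidelines
    "Keep the answer under 150 words, list the factors as bullet points, "
    "and flag any point where reliable public data is unlikely to exist."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model; substitute whichever model is available
    messages=[{"role": "user", "content": prompt}],
)

print(response.choices[0].message.content)
```

The fourth tip, experimenting with various phrasings, amounts to varying the prompt string and comparing the responses before relying on any of them.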
Another challenge, Ford says, is the lack of publicly available information to feed into AI, especially on the leasing side, whereas recorded data does exist for building sales.
Adapted from SIOR Report, Spring 2024. Reprinted from NAR / CREATE Summer 2024 by permission of NAR. https://cdn.nar.realtor/sites/default/files/documents/create_summer-issue-green-roi-2024-06-12.pdf