With over 122 million renters in the United States, the experience of filling out a rental application is nearly universal, crossing socioeconomic boundaries, generations, and geographic regions. Every application represents an individual or a family hoping to find a place to call home - and the decisions made on those applications have real consequences for people's lives.
In recent years, the landscape of rental applications and tenant screening has undergone a dramatic transformation with the introduction of sophisticated algorithms and AI. While these tools promise a new era of efficiency and speed in decision-making for individual landlords and real estate agents, the growing reliance on AI and algorithms in tenant screening has inadvertently created a system that disproportionately harms vulnerable renters, raising critical questions about fairness, transparency, and ethics in the rental housing market.
Most of us are familiar with the process of applying for rental housing: you submit an application with your basic personal information, proof of income, references, and consent for the landlord to pull various personal reports. These reports typically include a credit report, criminal history, and eviction records. In the traditional screening method, a landlord would independently review this information to decide whether to approve or deny the housing application.
With technology, landlords now receive not only this information but also a third party's analysis, or simply a thumbs-up or thumbs-down decision. Most screening companies do not disclose how these determinations are made. Some combine an individual's background reports with larger datasets to predict the likelihood that they will be a stable renter; others apply their own proprietary analysis directly to the background reports. Landlords are left to make decisions without a clear understanding of what went into the recommendation, and on the other side, renters are completely in the dark about why they are being rejected.
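To make that opacity concrete, here is a minimal, purely illustrative sketch of the kind of scoring such a product might perform. The fields, weights, and cutoff below are invented for this example and do not describe any actual vendor's model; the point is the interface, where detailed personal records go in and a single unexplained verdict comes out.

```python
from dataclasses import dataclass

# Hypothetical applicant record assembled from the reports a landlord pulls.
# All field names are illustrative, not taken from any real screening product.
@dataclass
class Applicant:
    credit_score: int        # from the credit report
    monthly_income: float    # verified against pay stubs
    monthly_rent: float      # rent for the unit being applied for
    eviction_filings: int    # count of filings, regardless of outcome
    criminal_records: int    # count of records, regardless of age or relevance

def opaque_screening_decision(a: Applicant) -> str:
    """Sketch of a proprietary-style score: neither the landlord nor the
    renter ever sees the weights or the threshold used below."""
    score = 0.0
    score += (a.credit_score - 600) * 0.5              # credit dominates the score
    score += (a.monthly_income / a.monthly_rent) * 10  # income-to-rent ratio
    score -= a.eviction_filings * 40                   # any filing counts, even dismissed ones
    score -= a.criminal_records * 25                    # no context considered
    return "APPROVE" if score > 50 else "DENY"         # arbitrary, undisclosed cutoff

# The landlord only ever sees the one-word output.
print(opaque_screening_decision(
    Applicant(credit_score=640, monthly_income=3200,
              monthly_rent=1400, eviction_filings=1, criminal_records=0)
))
```

In this hypothetical, a single eviction filing, even one that was dismissed, is enough to flip an otherwise qualified applicant to "DENY", and neither party can see why.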
The rental industry urgently needs a shift towards more transparent and fair tenant screening practices.
As we navigate the evolving landscape of tenant screening, it's crucial to strike a balance between technological efficiency and human discernment. While AI and algorithms offer speed and data processing capabilities, they lack the nuanced understanding that human judgment provides. This balance is essential to ensure fair, comprehensive, and contextual evaluations of potential renters. Moving forward, all stakeholders in the rental market - landlords, renters, real estate agents, screening companies, and policymakers - must actively advocate for and implement more equitable practices. This includes demanding greater transparency in screening processes, considering a broader range of factors beyond traditional credit scores, and empowering renters with more information and control over their applications.
By collectively pushing for these changes, we can create a more just and efficient rental ecosystem that benefits both property owners and renters alike, fostering trust, fairness, and long-term stability in the housing market.