
Story Recipes

Journalists: Investigate Homeless Vulnerability Scoring in Your City

The survey we investigated was also used in dozens of other cities and counties, according to its creators. Here are our tips for looking into it

Illustration by Gabriel Hongsdusit

Today, The Markup published an investigation into Los Angeles’s use of a scoring system to guide who receives priority for housing assistance. Under this system, people in need of housing take a survey designed to measure their vulnerability, with those deemed more vulnerable receiving higher priority. The story was based on an analysis of data from more than 130,000 surveys, which we obtained by filing a public records request with the Los Angeles Homeless Services Authority (LAHSA), the agency responsible for coordinating homeless services in Los Angeles. (You can read in detail about our analysis of the survey data here.)
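If you obtain a similar dataset, a quick first pass can tell you whether it covers the period and population you expect. Here’s a minimal sketch in Python with pandas; the file name and column names are hypothetical placeholders, not the actual fields LAHSA provided:

    # First look at a hypothetical CSV export of survey records.
    # Column names ("assessment_date", "acuity_score") are placeholders.
    import pandas as pd

    surveys = pd.read_csv("vulnerability_surveys.csv")

    # How many surveys are we working with, and over what time period?
    print(len(surveys))
    surveys["assessment_date"] = pd.to_datetime(surveys["assessment_date"])
    print(surveys["assessment_date"].min(), surveys["assessment_date"].max())

    # Basic distribution of the scores themselves
    print(surveys["acuity_score"].describe())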

We know that, as of 2015, one of the surveys used by Los Angeles was also used in dozens of other cities and counties across the United States (see page 12), according to its creators. This survey is known as the Vulnerability Index-Service Prioritization Decision Assistance Tool, or VI-SPDAT.

If you’re interested in obtaining homeless scoring system data for your own city or county, there are generic how-to guides on filing public records requests and even automated generators of such requests out there. But we thought we might share some quick and more specific tips—some that became clearer as we reported this story—that might help you in the process. Here’s what we learned.


1. Find as much documentation as possible

The data we were looking for wasn’t publicly available, but before submitting a public records request, we set out to find documents and other materials that explained what the data was and how it was meant to be used. We found blank surveys on LAHSA’s website, which gave us specific details to ask for when we crafted our request, such as veteran status and the geographic region a person was associated with.


2. After putting in a request, don’t fear the follow-up

Don’t be afraid to follow up and ask for additional information or records, particularly if you know the agency has material that might answer some of your questions. When we first started reporting on the VI-SPDAT, we filed a simple data request that asked for only a fraction of the data we ultimately used in our analysis. The more we reported out the story, the more detailed our questions—and requests—became. By the end of our investigation, we had followed up to ask for more records at least a half-dozen times, either formally or informally. Be careful about requesting more data just for the sake of it, though—you’ll find better results when you’re intentional and have clear hypotheses you’re trying to test.


3. But keep in mind that requests can take a long time

That said, it’s better to ask for too much data than too little. Public records requests can take a notoriously long time to be fulfilled, depending on where you’re requesting the information from. If you file a request, receive the data, and realize you need something else, you might have to restart the process and go back to the well. We did! Our initial request was sent in August 2021, and our final request was filed in September 2022.


4. When in doubt, ask the people who use the data every day 

For our story, we spoke with people who interacted with the scoring system in a lot of different ways: case managers who administered the surveys, matchers who used the results to determine housing placements, and the people who were surveyed themselves. When we had questions about the data, these people were often our best sources for explanations. Because they created and used the data every day, they knew its nuances and shortcomings in ways we otherwise might not have. When reporting on algorithms, figuring out who uses the results and talking to them about their experience can help shape your understanding.


5. See how others have analyzed the data in your city (or elsewhere)

If an algorithm is broadly used, there are often other people looking into it. In our case, VI-SPDAT is a tool that has been used by dozens of communities for many years. By the time we started reporting on its use in Los Angeles, academics had already studied the tool’s effects in other parts of the country. We read papers and spoke to researchers about their work to understand how they did their analyses and what sort of questions we could ask of our data. 
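One way to build on that prior work is to replicate the simplest version of a published question on your own data before attempting anything more rigorous. For instance, if researchers have examined how scores vary across demographic groups, a first cut might look like the sketch below. The column names are hypothetical placeholders, and a real analysis would need far more care around missing data and statistical significance:

    # A starting point, not a finding: compare average scores across groups.
    # Column names ("race", "acuity_score") are hypothetical placeholders.
    import pandas as pd

    surveys = pd.read_csv("vulnerability_surveys.csv")

    # Average score and number of surveys per group
    by_group = surveys.groupby("race")["acuity_score"].agg(["mean", "count"])
    print(by_group.sort_values("mean"))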


6. You don’t have to start from scratch

No matter whom you’re requesting records from, there’s a strong chance that they’ve received public records requests in the past—and those may even be available to view online. MuckRock is a great online service for requesting records and also has a vast library of records requested in the past. You can search to see if the agency you want to request records from has fulfilled requests before, along with how long it took them.

Ultimately, we requested data from LAHSA on every vulnerability survey administered since Jan. 1, 2016, and specifically asked for the following data from or about each survey (once your records arrive, a quick check like the sketch after this list can confirm the delivery matches your request):

  • Unique client ID
  • Acuity score (sometimes referred to as the “vulnerability score”)
  • Ethnicity
  • Race
  • Gender
  • Age at the time of assessment, if possible. If not, either birth date or age at the time data was retrieved
  • Initial assessment date
  • The name and specific version of the assessment used (e.g., “VI-SPDAT v3”)
  • Veteran status
  • SPA (“service planning area”)
  • All responses to all questions that account for the scored portion of the intake survey (i.e., Sections A-F for the youth survey and A-D for the individuals survey) 
  • All scores for all sections of the survey
  • For each intervention provided to the client, the date and type of the intervention
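Agencies don’t always deliver everything you ask for, so it’s worth comparing what arrives against your request before diving in; anything missing is a reason to follow up (see tip 2). A minimal sketch, assuming a CSV delivery and using placeholder field names:

    # Check a hypothetical CSV delivery against the fields we requested.
    # The names in REQUESTED are placeholders; map them to the agency's
    # actual column headers before running this.
    import pandas as pd

    REQUESTED = {
        "client_id", "acuity_score", "ethnicity", "race", "gender",
        "age_at_assessment", "assessment_date", "assessment_version",
        "veteran_status", "spa",
    }

    surveys = pd.read_csv("vulnerability_surveys.csv")
    missing = REQUESTED - set(surveys.columns)
    if missing:
        print("Fields to follow up on:", sorted(missing))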

If you find these tips helpful and are using them to do your own investigation into how VI-SPDAT is used in your area, please let us know at hello@themarkup.org. We’d be happy to help answer any questions you have and to help amplify your work when it’s published. Good luck!
