Chatbots in New York City and State
When does New York buy chatbots? What are they used for? How many are there? How much do they cost?
Government officials at the city and state level are talking a lot about chatbots lately.
New York City released a chatbot for the MyCity portal that’s supposed to be a “one-stop shop” for business-related questions; it’s clearly marked as a beta product that can make mistakes, and those mistakes have indeed been documented. It came up again during last week’s City Council technology committee hearing.1
At the state level, Senator Kristen Gonzalez has introduced a bill that would require generative AI tools (like the chatbots above) to come with warning labels:
Requires the owner, licensee or operator of a generative artificial intelligence system to conspicuously display a warning on the system's user interface that is reasonably calculated to consistently apprise the user that the outputs of the generative artificial intelligence system may be inaccurate and/or inappropriate.2
Clearly, the policy world is paying more and more attention to AI tools, and specifically generative chatbots.3 There’s a lot you could say about what policy ought to be with regard to these tools, but in the interest of focusing on “is” before “ought,” here’s my question:
❓What kinds of chatbots are currently in use at the city and state level? Who is using them? Are they deterministic, generative, or some kind of mix? How much do they cost? How would you even find this all out?
This is a new policy area for me, so I don’t know the “standard” answer to these questions, if one even exists. So what follows is my tentative exploration of the topic.
My process:
Bulk download government contract data.
Search those contract summaries for keywords like “chatbot.”
See the results I get, explore the raw data with my eyes, think about what the search results tell me and don’t tell me.
Repeat steps 2 and 3 a few times based on what I learn.
Get preliminary results, write them up here.
(Step 1) Download government contract data
Both New York City and State have tools that allow anyone to look through government contracts—Checkbook NYC at the city level, and Open Book New York at the state level. These tools are definitely not perfect, but they have basic search and data export functionality. That’s all I need to get started!
I went to Open Book New York’s search portal, and didn’t put in any filtering search terms. I just hit the “search” button to get everything back. It returned 245,159 rows of data in a spreadsheet (each a contract).4
(Steps 2-4) Search the data for keywords, analyze
Searching through a spreadsheet that large can be slow if you just stay within the spreadsheet, so I wrote a short Python script (hosted on Replit) that looked through the spreadsheet for my list of specified key terms5, and then produced a much smaller spreadsheet with 114 rows of data (contracts).
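The script itself isn’t reproduced here, but a minimal sketch of the approach might look like the following. (The file names and the column layout are hypothetical; the real Open Book New York export headers may differ.)

```python
import csv

# Search terms from footnote 5.
SEARCH_TERMS = [" chat ", "chat ", "chatbot", "chat bot", " bot ", "engagement"]

def filter_contracts(in_path, out_path, terms):
    """Copy rows whose text contains any search term (case-insensitive);
    return the number of rows kept."""
    with open(in_path, newline="", encoding="utf-8") as f_in, \
         open(out_path, "w", newline="", encoding="utf-8") as f_out:
        reader = csv.DictReader(f_in)
        writer = csv.DictWriter(f_out, fieldnames=reader.fieldnames)
        writer.writeheader()
        kept = 0
        for row in reader:
            # Join every column into one lowercase string to search.
            text = " ".join(v or "" for v in row.values()).lower()
            if any(term in text for term in terms):
                writer.writerow(row)
                kept += 1
        return kept
```

Run against the full export, something like this produces the much smaller spreadsheet of candidate contracts.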
From those 114 contracts, I simply did a Ctrl+F for words like “chat,” “chatbot,” etc. I found 7 obvious chatbot contracts, all of which were approved/filed in 2020 or later—this makes sense, since chatbots didn’t really take off in a big way until that point.
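For a quick sense of which search terms are doing the work (and which are noisy), a small tally can supplement the manual Ctrl+F pass. Another sketch, again assuming a generic CSV layout:

```python
from collections import Counter
import csv

def tally_terms(path, terms):
    """Count how many rows in the filtered file match each term (case-insensitive)."""
    counts = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            text = " ".join(v or "" for v in row.values()).lower()
            for term in terms:
                if term in text:
                    counts[term] += 1
    return counts
```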
But I also found 1 nonobvious (probably) chatbot contract:
MedChat provides a variety of services, many of which are AI-powered chatbot interfaces; their website shows several examples.
But you wouldn’t immediately catch this contract if you were just searching for words like “chatbot.” The contract description is “Pharmacy patient engagement software tool.”
And while it’s possible that this isn’t a chatbot—I bet it is. So this prompts me to think: clearly there are other chatbot products that wouldn’t be found with obvious keyword searches. “Engagement” might be a good word to add onto the search list (which I did), but there are probably others.
I also decided to take one of the contractors that popped up on this list, SHI International Corp, and search New York City’s contract database to see if they had any non-state chatbot contracts. They do—this one! The contract is for “1 Year of Acuvate Chatbot License [Maintenance],” and the city has spent $352,800 on it. (It was clearly extended longer than one year.)
(Step 5) Preliminary results
Top-level research conclusions: With very basic, non-comprehensive data exploration, I found a good handful of chatbot contracts at the New York State level—and with just one inferential guess I easily found a city chatbot contract. I’m confident there are far more than this (the MyCity chatbot is one), and that a small, dedicated team could flesh out the picture in a short amount of time.
Expenditures: these contracts are worth a lot of money. MedChat’s contract for 2023-2028 is worth $1.2 million alone. Career America LLC has netted $216,762 since 2020 making chatbots for the SUNY and CUNY systems; three of the chatbot contracts are still open with $58,112.81 of authorized expenditure to go.
During last week’s City Council technology committee hearing, NYC CTO Matt Fraser said that the city had spent $60 million to-date on the MyCity portal, and opened 67 contracts as part of that. I wonder how much of this work was chatbot development!
Note: whenever I discuss city/state technology contracts with industry professionals, they almost always say that the government is vastly overpaying for the products that it receives. This is part of a larger concern of mine—and an opportunity for the city budget if they can improve the situation—but is beyond the scope of this essay.

Quality of chatbots: I don’t know! In the case of Career America’s chatbots, it looks like they’re internal tools for SUNY/CUNY, and I don’t have access to those internal tools. In the case of commercially available tools like MedChat, I suppose I could get a demo. It would take a technologist to really get a read on the quality and nature of these tools.
Going further: chatbots are only one particular use case of AI. I sort of suspect that some officials are focusing a bit too much on them, and not on the other, more potent (and harder to understand) uses of predictive models/LLMs/other AI technology.
1. Committee chair Gutiérrez was concerned about errors the chatbot had been making, and Office of Technology and Innovation head Matt Fraser said that the department was taking proactive steps to address the issues. State Senator Kristen Gonzalez also delivered testimony during this hearing, in which she expressed concern about the MyCity portal’s chatbot, as well as government use of AI more generally.

2. The bill passed the state Senate this year, but has not yet passed the Assembly.

3. But not just chatbots. Assembly Member Alex Bores has introduced a bill that would “…[require] political communications that use synthetic media to disclose that they were created with the assistance of artificial intelligence; requires committees that use synthetic media to maintain records of such usage.” That bill is currently in committee in the Assembly, and it doesn’t seem likely to move soon.

4. You might wonder how many years this encompasses. Per the search portal: “All State agency contracts, in effect 4/1/12 or later, which includes contracts approved by the Office of the State Comptroller (OSC) and those that don't require OSC approval.”

5. search_terms = [" chat ", "chat ", "chatbot", "chat bot", " bot ", "engagement"]. If you’ve never explored data before, spaces can turn out to be important. Putting a space before and after a word ensures that you’ll get results for a freestanding word, instead of part of another word. For example, a search for just “bot” could also return the word “both.” There are many ways to optimize search, and I didn’t do them all; I just wanted to do a cursory search and see what popped up.
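One limitation of the space-padding trick: it misses a word at the very start or end of a description (" bot " won’t match a description that begins with “Bot”). A regex word boundary handles those cases too. A sketch:

```python
import re

def contains_word(text, word):
    """True only if `word` appears as a freestanding word in `text`.

    Unlike space-padding, the word-boundary anchor also matches at the
    start and end of the string and next to punctuation.
    """
    return re.search(r"\b" + re.escape(word) + r"\b", text, re.IGNORECASE) is not None
```

For example, contains_word("Services for both agencies", "bot") is False, while contains_word("Bot maintenance", "bot") is True.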