It's been nearly two years since Cleveland police searched an apartment at 403 E. 152nd St., looking for evidence in a murder case.
Inside, they found a gun and a black sweatsuit, which police said are key pieces of evidence. Shortly after, Qeyeon Tolbert was charged with aggravated murder for the death of Blake Story.
But the search wasn't based on any physical evidence or human sources, just surveillance camera footage and an identification that was made possible with facial recognition technology.
"I've been practicing criminal law for 45 years, okay? And this is the first time I was ever made aware that there was artificial intelligence used," said Tolbert's attorney, Brian Fallon.
Based on what was written in the application for the search warrant, it wasn't clear that AI was involved. One passage raised red flags for Fallon: "Utilizing the Fusion Center, they received an identification of the as of yet unidentified male suspect based on the recovered surveillance video."
"I asked the prosecutor, 'They said that there's an identification made by the Fusion Center. So, where's the report?'" Fallon said. "I said, 'That's like an expert report. That's like saying you got fingerprints or DNA. Let me see the report of that.'"
Eventually, Fallon received a report from the facial recognition company, Clearview AI.
Concerns about Clearview AI
Cleveland police's use of the technology, and its inclusion in court filings, is under scrutiny in this case in Cuyahoga County Court of Common Pleas, partly because of a longstanding lack of transparency about its use.
Clearview's software has compiled a database of photos scraped from the internet that police can use to identify suspects.
Clearview settled a class-action lawsuit over this practice. Plaintiffs were given a 23% stake in the company instead of a cash settlement.
Fallon successfully argued in court that the search warrant was unlawful because of the role facial recognition played and law enforcement's failure to disclose it in the search warrant application.
"The Court finds that Det. (Michael) Legg's failures to include information in the warrant about the AI identification of Tolbert was a knowing and intentional material misstatement," Judge Richard McMonagle wrote in a Jan. 6, 2026, decision invalidating the search of the apartment where Tolbert was staying.
McMonagle criticized the Cleveland police detective investigating the case for failing to disclose that the identification came from facial recognition. A disclaimer on the Clearview AI report said: "These search results are not intended or permitted to be used as admissible evidence in a court of law or any court filing."
Legg used the report in the warrant despite the disclaimer.
Additionally, McMonagle wrote that Det. Legg failed to disclose that Clearview identified several possible matches, instead only writing that police "received an identification" from the Fusion Center.
Fallon also asked the court to hold a hearing on the reliability of Clearview AI's technology. That hearing has not occurred because the search warrant was thrown out, but unresolved questions remain about how reliable the technology is.
Facial recognition has come a long way with the development of artificial intelligence, said Erman Ayday, an assistant professor of data science at Case Western Reserve University, who specializes in privacy and data security.
It can compare tens of thousands of bits of information from a picture, such as the distance between the center of one eye and the nose.
"The AI, or the machine learning algorithms, deep neural networks, they identify which feature is more important when you want to make a face recognition and then ... extract the features, they give some importance to different features and then they do the matching based on that," Ayday said.
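The matching step Ayday describes can be illustrated with a toy sketch: a neural network reduces each face to a numeric feature vector (an "embedding"), and two photos are compared by how closely their vectors align. The vectors, threshold and similarity function below are illustrative assumptions, not Clearview's actual pipeline.

```python
import math

def cosine_similarity(a, b):
    # Compare two face "embeddings" (feature vectors produced by a neural
    # network) by the angle between them: values near 1.0 mean a likely match.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 4-dimensional embeddings; real systems use hundreds of dimensions.
probe    = [0.9, 0.1, 0.4, 0.2]      # features from the probe photo
match    = [0.88, 0.12, 0.41, 0.19]  # same person, different photo
stranger = [0.1, 0.9, 0.2, 0.7]      # a different person

THRESHOLD = 0.9  # hypothetical decision threshold
print(cosine_similarity(probe, match) > THRESHOLD)     # True
print(cosine_similarity(probe, stranger) > THRESHOLD)  # False
```

In a real system the threshold trades off false positives against false negatives, which is exactly the tuning question raised by the accuracy studies discussed below.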
In 2021, the National Institute of Standards and Technology tested Clearview AI's algorithm. According to a press release from Clearview, the testing found the algorithm to be highly accurate.
But one issue, especially for law enforcement, is the quality of the image they're trying to find a match for, which often comes from closed circuit cameras, according to Ayday.
鈥淚t's not a very high-resolution image and sometimes it's a partial image," Ayday said. "Sometimes the suspect is wearing a cap. You cannot even get the full capture of the face."
That reduces the reliability of the matches.
Researchers have tested facial recognition technology's reliability using degraded images and found it became less reliable, especially for Black faces.
Ayday said other studies have found false positive rates of up to 30% when the photo shows a Black face.
"There is no 'one size fits the mold' type of an approach," Ayday said. "So, for different demographics, for different races, for different, even genders, those important features may be different."
Ayday thinks companies should not be allowed to use social media photos for facial recognition.
Cleveland hasn't disclosed its use prior to Tolbert case
As recently as November, city leadership denied that it uses facial recognition technology at all at its Real Time Crime Center, where police monitor thousands of city-owned surveillance cameras.
At a November hearing, then-Council Member Rebecca Maurer put the question to Public Safety Director Wayne Drummond this way:
"Are we currently using any type of AI or facial recognition technology at the Real Time Crime Center?"
"Through the chair to the councilmember, we are not," Drummond said.
"You are not?"
"Through the chair, that's correct, we are not."
Technically, that's true.
Fallon learned Cleveland police sent a photo of Tolbert, captured at a convenience store six days after the murder, to the Real Time Crime Center.
The Real Time Crime Center then sent the image to the Northeast Ohio Regional Fusion Center. The Fusion Center, which is overseen by the federal government, sent a Clearview AI report back to the detective. The report listed multiple possible matches pulled off social media sites and at least one rap album cover.
"No officers or detectives have ANY direct contact with facial recognition software as the City does NOT have a contract for that software," said city spokesperson Tyler Sinclair, who referred questions about policies governing its use to the Fusion Center.
Fusion Center policy allows law enforcement to access the technology to identify suspects, victims or potential witnesses. Outside law enforcement is not trained on its use. Only users at the Fusion Center are able to conduct searches.
The policy addresses the use of images from social media and other websites only by saying: "The NEORFC (Northeast Ohio Regional Fusion Center) will contract only with commercial FR (facial recognition) companies which provide assurances that their methods for collecting, receiving, accessing, disseminating, retaining, and purging FR data comply with applicable local, state, tribal, territorial, and federal laws, regulations, and policies, and that these methods are not based on unfair or deceptive information collection practices."
The City of Cleveland also relies on laws and court cases to determine limits on the technology.
"The City currently follows all applicable laws and understands this new, rapidly-evolving technology creates important questions that are certainly worthy of debate — which is why we continue to monitor both the courts and legislative systems to ensure we remain in compliance with the law," Sinclair said.
According to Cleveland State University Law Professor Jonathan Witmer-Rich, the technology can be a useful tool for law enforcement if it's used in certain circumstances.
"Note that, in this case, it was not that the facial recognition was being used to convict the defendant of having committed the crime, rather it was being used to get a warrant and when they got a warrant, they found evidence that was connected to the murder, at least allegedly," Witmer-Rich said.
But Witmer-Rich said the city could have been more transparent in how it was used in the Tolbert case.
The photo submitted to Clearview AI for identification came from a surveillance camera at a convenience store down the street from where the murder occurred, captured six days afterward.
"There's a video of the murder happening, but the quality of the video is poor and so you can't get a clear view of the person's face," Witmer-Rich said. "They did not run that video through, as far as we know, any facial recognition software, or if they did, they didn't get results."
Instead, the detective who applied for the search warrant watched city-owned surveillance footage of the street where the murder occurred.
Six days later, he "observed a male matching the description of the suspect walking from the area of 403 E. 152nd St. and continue to the store located at 15208 Lakeshore Blvd. This male having the same build, hair style, clothing and walking characteristics as observed on the suspect on Feb. 14, 2024."
The footage at the store is what led to the search warrant. The judge found that detail was not clearly disclosed in the search warrant affidavit.
Witmer-Rich said Cleveland police should be more open about what technology they're using and how they're using it.
"What body of images do we want the police to be using, if using it at all?" Witmer-Rich said. "And is it anything that a company can scrape off of the internet, or should it be something more limited than that? Something that's limited to a certain set of images that we know are reliable or that we know haven't been manipulated?"
The judge in Tolbert's murder trial has agreed with his attorneys that law enforcement should have been transparent about how it got the identification that led to a search of Tolbert's apartment.
The Cuyahoga County Prosecutor is appealing that decision.