Data journalist for hire.

A color-saturated illustration with a green utility pole equipped with surveillance cameras in the foreground against a purple background showing a map of Toledo, Ohio with dots representing crimes and icons indicating the locations of apartments.

‘Clearly discrimination’: How a city uses Fusus to spy on its poorest residents

Fusus’s technology allows police to tap into live feeds from public and privately owned surveillance cameras. A Gizmodo investigation and data analysis show that police in Toledo, Ohio, primarily use that power to surveil low-income housing.

A close up shot of a fingerprint on a computer keyboard

This AI tool helped convict people of murder. Then someone took a closer look.

Global Intelligence claimed its Cybercheck technology could locate a suspect in real time or at any point in the past based on which wireless networks their devices pinged. A WIRED investigation revealed Cybercheck provided leads that were demonstrably false or couldn’t be confirmed by other methods.

A pixelated image of three students raising their hands. The raised hand of the student in the center of the picture is highlighted in red.

Inside America’s school internet censorship machine

A WIRED investigation into internet censorship in US schools found widespread use of filters to censor health, identity, and other crucial information. Students say it makes the web entirely unusable.

A drawing of a woman surrounded by pill bottles that are dissolving into pixels

‘Out of Control’: Dozens of telehealth startups sent sensitive health information to big tech companies

A joint investigation by STAT and The Markup of 50 direct-to-consumer telehealth companies found that quick, online access to medications often comes with a hidden cost for patients: Virtual care websites were leaking sensitive medical information they collected to the world’s largest advertising platforms.

A judge's gavel laying against a black background

Payday lenders are big winners in Utah’s chatroom justice program

The state’s new online system for resolving small claims cases was meant to help residents. Instead, many lost their cases without ever having their day in court.

An illustration of stick-figure students in graduation robes.

False alarm: How Wisconsin uses race and income to label students ‘high risk’

After a decade of use and millions of predictions, a Markup investigation found that Wisconsin’s dropout prediction algorithm was frequently wrong and negatively influenced how educators perceived students, particularly students of color.

Major universities are using race as a ‘high impact predictor’ of student success

Documents and data obtained by The Markup show that the predictive algorithms large public universities use to assign students risk scores incorporate race as a ‘high impact predictor’ and that Black students are labeled ‘high risk’ at as much as quadruple the rate of their white peers.

‘They track every move’: How U.S. parole apps created digital prisoners

State and federal law enforcement agencies are increasingly forcing low-risk probationers and parolees to install tracking apps on their cell phones. Users say the loss of freedom is like being locked up all over again, and the companies behind these apps are looking to roll out predictive analytics features that critics say will lead to unnecessary re-arrests.

Flawed algorithms are grading millions of students’ essays

Fooled by gibberish and highly susceptible to human bias, automated essay-scoring systems are increasingly being adopted, a Motherboard investigation has found.

Google purged almost 1,000 abusive apps. Now some of them are coming back.

Some of the programs banned by Google have now rebranded or added disclaimers and returned to the Play Store. Meanwhile, new programs with overtly abusive purposes have slipped through the company’s automated monitoring systems.

This company is using racially biased algorithms to select jurors

Momus Analytics’ predictive scoring system uses race to grade potential jurors on vague qualities like ‘leadership’ and ‘personal responsibility.’

Schools spy on kids to prevent shootings, but there’s no evidence it works

Companies that make this software say their machine learning detection systems keep students safe from themselves and away from harmful online content. But their numbers aren’t always trustworthy, and no independent research backs up their claims.