My first known ancestor in the Americas was an Ashanti woman called “the African.” We don’t know her name, but through records kept by slaveholders, we know she existed.
We know she was transported to Jamaica, where my known lineage began. These records of property bought and sold were a form of surveillance at the time.
Early technologies, and the policies and practices that undergirded them, were forged to separate the citizen from the slave. The slave passes, branding, and lantern laws of then have become the cellphone trackers, facial recognition software, and body-worn police cameras of now. Their mission, however, hasn't changed much—to catch and control black dissidence—only now they're doing so in a digital age.
These technologies have been incorporated into the law enforcement process at every level, from predictive algorithms for assessing pre-trial risk and criminal activity to widely adopted police tools that face little to no oversight. These tools—including cell-site simulators and surveillance cameras—are trained on communities of color, especially blacks, immigrants, Arabs, and Muslims. In each case, the presence of technology, of math, is touted as the linchpin for countering bias despite clear evidence that data derived from discriminatory processes reinforces, rather than eliminates, bias.
Read the full piece at The Atlantic.