DNA Testing Has Grown In Popularity. But What Does That Mean For Your Privacy?
In the past, DNA testing was an expensive technology largely associated with forensic use. Now, commercials for consumer DNA testing seem nearly impossible to avoid. With companies like Ancestry and 23andMe making DNA testing easier to access, more than 26 million people had taken an at-home ancestry test by the start of 2019. If people keep taking tests at the same pace, companies could have data on more than 100 million people within 24 months, according to MIT Technology Review.
DNA tests got a major boost in public image when police arrested Joseph DeAngelo — the Golden State Killer — for over a dozen murders and 50 rapes committed in the 1970s and ‘80s. Police located DeAngelo by uploading his crime-scene DNA under a fake profile to an open DNA database, GEDMatch, where they happened to find a relative. Over 20 arrests have been made using similar techniques. The nature of DeAngelo’s case made it easy for the news to be celebrated while the very obvious privacy concerns that came with it were brushed aside.
As Slate writer Will Oremus pointed out, the case bears a few resemblances to Facebook’s Cambridge Analytica scandal, writing, “When you sign up for an online service, it’s rarely just your own data that you’re handing over. In many cases, you’re also giving up the goods on people you know—often, without their knowledge, let alone consent.”
Shows like CSI may have tricked you into thinking that DNA testing is a perfect science, but that’s just not the case. Take Josiah Sutton, for example. He had served four years of a 25-year sentence before new DNA testing proved he was innocent; the original tests performed by the Houston police crime lab were faulty. Or the case of Chen Long-Qi, where a bad DNA test led to him being convicted of rape. According to Gizmodo, the victims never accused Chen, and no one placed him at the scene. It took five years before a second DNA test found that Chen wasn’t a match after all.
In her book, Inside the Cell: The Dark Side of Forensic DNA, NYU law professor Erin Murphy captured some of the main concerns surrounding DNA testing: “The same broken criminal-justice system that created mass incarceration and that has processed millions through its machinery without catching even egregious instances of wrongful conviction, now has a new and powerful weapon in its arsenal.”
You may think this has nothing to do with consumer DNA tests, but remember that the police don’t need permission to comb through your DNA profile if it’s in an open database. Most importantly, remember that your DNA isn’t unique to you alone. You share your genetic data not only at your own risk but at that of your family as well. Companies like Ancestry and 23andMe do have guidelines around genetic privacy, including refusing to turn over DNA without your permission. However, those guidelines don’t tell you what the company may have to do if it’s ordered to give up the data by a court rather than simply petitioned.
With a criminal justice system that is racially biased and DNA sites that market specifically to the descendants of slaves (such as this Ancestry commercial), there need to be conversations about handing over your biometric information that easily.
As a category itself, biometric identification technologies were largely developed for the surveillance of Black people. Privacy SOS, an organization tied to the ACLU’s Massachusetts branch, wrote, “The demise of the institution of slavery did not stop the advancement of biometric identification technologies largely premised on systems motivated by white supremacy.”
It’s easy for DNA companies to make working with the police sound appealing when they side-step the anti-Black elements that led to the construction of biometric identification as a whole.
The temptations of consumer DNA tests are clear. People are interested in knowing more about themselves, but taking these tests means that you are putting your family — including your parents, children, cousins, etc. — into databases that they did not consent to. Right now, it’s not clear what the limitations are around how those databases can be used.