Nearly a year after the Facebook and Cambridge Analytica scandal, the Government Accountability Office released a report on the lack of a comprehensive federal privacy law governing private companies’ collection, use, or sale of user data. Its big conclusion was that... yeah, it’d be a good idea to have a federal law that protects people. Who would’ve thought?
Luckily, a framework already exists as a road map to creating this desperately needed federal law. It’s known as the Fair Information Practices, or FIPs for short. FIPs includes eight thorough recommendations for creating comprehensive privacy protection and digital rights in the United States, including establishing fair and just data practices and accountability to counter bias and discrimination. It also goes after inequality with a provision to prohibit “take it or leave it” terms of service. These terms require individuals with less means to waive their privacy rights in order to get a quality good or service for less or for free. The Electronic Privacy Information Center, known as EPIC, along with 15 other consumer, privacy and social justice organizations, has signed on to the FIPs framework.
And in February, EPIC, along with 40 civil rights, civil liberties and consumer groups, wrote a letter to Congress specifically addressing data-driven discrimination. The letter stressed how critical it is that any new privacy legislation be consistent with the “Civil Rights Principles for the Era of Big Data.” These five principles are: ending high-tech profiling, ensuring fairness in automated decisions, preserving constitutional principles, enhancing individual control of personal information, and protecting people from inaccurate data.
You can help demand the end of algorithmic bias and discrimination. Call your members of Congress and demand that FIPs and the five Civil Rights Principles be included in any future U.S. privacy law.
Learn more about this issue by going to EPIC.org and searching for “Algorithmic Transparency” under their Privacy Campaigns. You can also go to The Leadership Conference on Civil and Human Rights site at civilrights.org and visit the “Media & Tech” page under the “Our Work” tab. On social media, check out the hashtag #DataDiscrimination and search for the term or hashtag “algorithmic bias.”
So, if making our technological world safer, more transparent and discrimination-free is important to you, be sure to hit the share buttons to spread the word about “Turning Two Frameworks for Fair and Just Data Practices into Federal Privacy Law” via social media so that others in your network can spread the word too.
A comprehensive national standard for #privacy is needed - but it should be a baseline that gives states the room to innovate, as has historically been the case in federal privacy laws. #RealPrivacy #Privacy4All #DigitalRights4All https://t.co/QE0j1iJbKX pic.twitter.com/962DO8yL39
— EPIC (@EPICprivacy) February 26, 2019
Five years ago, along with partners like @ColorOfChange, we spoke out about the importance of privacy and big data for communities of color, women, and other historically disadvantaged groups. Read our civil rights principles for the era of big data here: https://t.co/fluXsGer0I https://t.co/VIrB9nximQ
— The Leadership Conference (@civilrightsorg) February 26, 2019
Tell Congress to Create a Federal Privacy Law using:
- Fair Information Practices (FIPs) | (Retweet it)
- Civil Rights Principles for the Era of Big Data | (Retweet it)
- EPIC's "Algorithmic Transparency" campaign
- The Leadership Conference on Civil and Human Rights' "Media & Tech" page
- Read the coalition letter to Congress
EDUCATE YOURSELF & SHARE
Discriminating algorithms: 5 times AI showed prejudice (NewScientist)
Even Kids Can Understand That Algorithms Can Be Biased (Scientific American, Roots of Unity Blog)
A.I. Could Worsen Health Disparities (NY Times)
What Women Know About the Internet (NY Times)
Posted April 19, 2019; Written by Best of the Left Communications Director, Amanda Hoffman
Hear the segment in the context of Best of the Left Edition #1266: Writing human bias into the code that runs our lives (Algorithms)