We wear it, watch it, work with it — and may suffer separation anxiety when it’s not available. What are the implications of human interaction with technology, particularly when it’s used to predict or police us?
As part of its first program, Algorithmic Culture, the Institute for Policy, Ethics, and Culture at Michigan Technological University brought social justice advocate and thought leader Marika Pfefferkorn to campus to talk about her work. In this Q&A, Pfefferkorn shares more about her grassroots activism, how it's spreading nationwide and the actions she hopes our community will take in the wake of her visit.
Q: What do you do and how did you get into this work?
A: The Minneapolis-St. Paul-based Twin Cities Innovation Alliance (TCIA) is a cross-sector coalition of public, private and community organizations, corporations and institutions. TCIA was founded to be proactive about the emergence of Smart Cities and to engage in entrepreneurial solutions that lead to connected communities. We focus on the importance and role of data across the domains of Smart Cities, including education, governance, infrastructure and transportation.
What's a Smart City?
As this International Electrotechnical Commission graphic shows, Smart Cities are interconnected in ways that improve efficiency, from infrastructure to human services.
Data is like oxygen for Smart Cities, and we realized from the beginning that it is imperative to share with community members both the promise and the perils that come with the use of their data. Now our work is expanding nationwide.
Q: What got you interested in the inherent bias in predictive algorithms?
A: In 2018, 43 Minnesota school districts were identified by the Minnesota Department of Human Rights for discriminating against black, brown and indigenous students in subjective out-of-school suspensions. When Ramsey County, the City of St. Paul and St. Paul Public Schools entered into a Joint Powers Agreement with integrated data sharing, they proposed applying predictive analytics and using a risk assessment to assign a risk score flagging children at risk of future involvement with the juvenile justice system. When I learned that they proposed using out-of-school suspensions as an indicator in that risk assessment, I was concerned, because the data they would be entering was flawed and racially biased against students of color and students receiving special education services. I recognized that decision makers and policy makers had not done their due diligence to protect our students from racial profiling or from stigmatizing labels that would follow them well beyond their K-12 education.
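Her concern can be made concrete with a toy sketch. The short simulation below is purely illustrative (it is not the model proposed under the Joint Powers Agreement, and every variable, rate and feature in it is hypothetical), but it shows how a risk score trained on racially biased suspension records ends up rating the over-policed group as higher risk, even when the outcome being predicted has nothing to do with group membership.

```python
# Illustrative sketch only: a toy simulation, not the actual risk assessment.
# It shows how bias in suspension records flows straight into a risk score.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical population: one group is suspended about twice as often
# as the other for the same underlying behavior (the documented disparity).
group = rng.integers(0, 2, n)               # 1 = over-policed group, 0 = other
behavior = rng.normal(0, 1, n)              # unobserved "true" behavior, same distribution for both groups
suspensions = rng.poisson(np.exp(0.3 * behavior + 0.7 * group))  # biased record

# The outcome the model is supposed to predict depends only on behavior,
# not on group membership.
future_involvement = (behavior + rng.normal(0, 1, n) > 1.5).astype(int)

# Train a "risk assessment" on the biased suspension counts.
model = LogisticRegression().fit(suspensions.reshape(-1, 1), future_involvement)
scores = model.predict_proba(suspensions.reshape(-1, 1))[:, 1]

# Same underlying behavior, different risk scores: the bias in the input
# data shows up in the flags, with race never entering the model explicitly.
print("mean risk score, over-policed group:", scores[group == 1].mean().round(3))
print("mean risk score, other group:       ", scores[group == 0].mean().round(3))
```

Run as written, the first average is noticeably higher than the second, which is the pattern the coalition worried about: the bias enters through the records themselves, not through any explicit use of race.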
Q: You found knowledge gaps on both sides: among the families who could be affected by a problematic algorithm, and among the officials who supported the data-sharing agreement as a more effective and efficient way to help their communities. Many people also felt intimidated discussing digital technology. How did the coalition you organized work to get information to people?
A: Once we recognized that no one truly understood the implications of such an expansive Joint Powers Agreement, the Coalition to Stop the Cradle to Prison Algorithm invested time in breaking down the document piece by piece: if we had questions, we asked them; if we had concerns, we named them; and if we needed clarity, we tapped our community network and reached out to partners like the Minnesota ACLU and Data for Black Lives. As we learned, we created tools for everyone to use as a catalyst for more constructive conversations. One example is the policy brief we created and shared with elected officials and stakeholders to explain our concerns and highlight problematic aspects of the agreement. We engaged parents as ambassadors and supported their efforts to get the message out. We hosted multiple community summits and forums that positioned other community members as experts to talk about what was going on and what concerned them. We also hosted a Dare to Data clinic and an Algorithmic Improv workshop to help interrupt much of the intimidation our community members felt when navigating legal or technical language with elected officials and policy makers.
"We had become the experts we were looking for. What would it look like if we created our own algorithm for success?"
Q: People might be left with the impression that you’re anti-data. Is that the case?
A: I laugh when people think I am anti-data, because so much of my work relies on having access to good data to tell the whole story. For many people in this work, a problem does not exist unless you can prove it with data. Much of my work on discipline disparities requires that I advocate for greater data collection around suspensions, office referrals and expulsions. My concern will always be how we protect student data from misuse and abuse; the why is really important.
Q: Sometimes it feels like many of us simply go from computer to computer, from our watches and cars to our home assistants. What do you think a proper relationship to technology looks like?
A: A proper relationship with technology begins with understanding that technology is a tool and remembering that our humanity comes first. Technology is not a silver bullet or the answer to everything; it must be kept in proper perspective, not deferred to as something neutral and above question. Most importantly, the role of technology must be made transparent.
Q: Recent headlines, including accusations of credit card gender bias and the collection of “secret” financial scores (pages and pages of data, based on everything from takeout food orders to TV-watching habits), reveal how little privacy we actually have. Is privacy possible in an algorithmic age?
A: We are at a turning point in our democracy where people are willing to give away the baby with the bathwater, without truly understanding the long-term impact on privacy in an algorithmic age. Technology companies are banking on this. I believe it is the role of parents, educators, school districts, states and the federal government to better equip people to understand and navigate this new technological era, especially our digital natives, who do not know any differently. Mass education must be a priority if we are to preserve the privacy protections that do exist. We cannot take our privacy for granted.
"There is no privacy without the expectation of privacy."
Q: You met with faculty, students and staff earlier this month. What are some of the most meaningful interactions you had on campus?
A: I truly enjoyed my Michigan Tech visit. I learned so much about the Institute for Policy, Ethics, and Culture from the founding committee members I met with — I heard their vision for IPEC come alive as they described what they hope to accomplish. I hope I will have the opportunity to continue working with them as they roll out this work.
My favorite part of the visit was engaging with students in a Law and Society class, and meeting with computer science faculty and students. Although many of the students I interacted with may have a worldview very different from my own, through our storytelling and discussion I found common ground and a genuine interest in the topics of big data, predictive analytics and algorithms. I also walked away with a new reading list. I found that most students really cared, asked great questions and were engaged in the conversation because they made a personal connection between theory and real-life experience.
Speaker Series Continues
The Algorithmic Culture Speaker Series brings two more thought leaders to campus:
- John Cheney-Lippold, author of We Are Data: Algorithms and the Making of Our Digital Selves, presents “Algorithms, Accidents, and the Imposition of a World of Calculation” with Q&A at 7 p.m. Monday, Nov. 18, in room 0103 of the Electrical Energy Resources Center.
- Meredith Broussard, author of Artificial Unintelligence: How Computers Misunderstand the World, presents "Artificial Unintelligence" with Q&A at 7 p.m. Thursday, Dec. 5, in Memorial Union Ballroom B.
“Community activism, academics, data journalism — our first guest speakers are a wonderful blend of diversity in terms of issues, location in culture and their own interests,” said IPEC Founding Director Jennifer Daryl Slack.
The Institute’s next project, Designing the Anthropocene, begins in 2020.
“We’re placing policy, ethics and culture at the center of the inquiry and asking how we should be concerned about the issues in the context of invention, use and the ways the culture is evolving in relationship to those technological changes,” Slack said. “We engage in these issues to interact with our campus and community to see what we can learn with them and how we can collaborate with them.”
Q: What’s the thing you brought to our community that you hope will stay with us?
A: Integrated data sharing, predictive analytics and the use of algorithms are not new practices, but community members must be aware of them and prepared to participate in discussions and decision-making about their use, both to ensure that outcomes are not separated from the communities they are intended to serve and to recognize that certain students are more negatively impacted than others. When we talk about data and systems, we must recognize the harm that communities have experienced through the misuse of data and systems, and the individuals who represent those systems must do everything in their power to repair that harm and begin to build trust before any next steps are taken.
Q: What’s next for your projects?
A: The Twin Cities Innovation Alliance has launched the Data for Public Good Campaign to work side by side with communities to identify emerging technological trends in education across the country and to create a heat map tracking the developments, resources and policies at play. We will convene these community co-researchers to develop recommendations and guidelines from a community perspective, as another tool communities can use to protect individual data and ensure trust between systems and communities.
Michigan Technological University is a public research university founded in 1885 in Houghton, Michigan, and is home to more than 7,000 students from 55 countries around the world. Consistently ranked among the best universities in the country for return on investment, Michigan’s flagship technological university offers more than 120 undergraduate and graduate degree programs in science and technology, engineering, computing, forestry, business and economics, health professions, humanities, mathematics, social sciences, and the arts. The rural campus is situated just miles from Lake Superior in Michigan's Upper Peninsula, offering year-round opportunities for outdoor adventure.