“Big data” sounds like it could be a pretty boring topic, maybe not something you want to bring up during a dinner party.
But the intrigue builds when you discover that corporations are making money off the data you create while grocery shopping, applying for a home loan or casually strolling through the mall, transforming our everyday activities into "invisible labor" for them. It's part of the "big data" construct: a virtual representation of your movements, interests and interactions, tied to your purchases and your use of smart devices.
In an effort to inform the public about big data and how it’s used, Arizona State University’s Human Security Collaboratory is hosting a series of “Critical Conversations” about human security research and impact activities. The first of these free, public conversations, “(Un) Corporal Technologies: How Data, Algorithms and Interfaces Rub Up Against the Body,” is from 11:30 a.m. to 1 p.m. Feb. 1, in room 492 of Interdisciplinary Science and Technology Building 4 (ISTB4), on the Tempe campus.
While new technologies and devices have shaped and redefined our world, not all of that change is for the better. The information shed by your devices and personal computer can work against you, preventing a person from getting a job, a house or vital medical coverage, all because your data is packaged and sold to corporations that may use it for very different purposes.
“The legal world has yet to catch up with what our digital tools are doing in our everyday lives with regard to digital civil rights,” said Jacqueline Wernimont, a professor in the Department of English and co-founder of ASU’s recently formed Human Security Collaboratory.
“Everyday technology puts you at risk because the information is shared and sold, and has a much bigger impact than most people realize.”
Launched by the Global Security Initiative, the Human Security Collaboratory is focused on addressing complex problems affecting the security of individuals and communities, with a special emphasis on digital technologies and their uses.
“The idea of big data is a very intangible thing, and many people might not even know what it means,” said Jessica Rajko, a professor in the School of Film, Dance and Theatre, who, along with Wernimont, is co-director of the collaboratory.
“So how do we start to engage the public to make them not only understand what it means, but also how it can be meaningful for them?”
Last semester, Wernimont and Rajko presented “Vibrant Lives and Data Archives,” a performance installation that demonstrated the concept of personal “data shed” by providing an experience in which the data could be seen, heard and felt.
Data shed refers to the nearly 3.5 million bytes of data produced per person, per day. The data is unique to each person because it is derived from things like smartphone apps or wearable fitness devices that record a person’s behavior. It sounds relatively innocuous until Wernimont cites a few examples.
“If you’re at the mall with your iPhone or Android in your pocket with the Bluetooth on, it is possible to track what stores you've visited and for how long,” Wernimont said. “Or, let’s say you’re a recruiter searching for potential employees on a job website and put in certain parameters and lo and behold, there’s nothing but male candidates for the job. Well, that's potentially the result of algorithms that are favoring a certain community and that's illegal. Similarly, we see the phenomenon of reverse redlining, which has leveraged everyday information to target people for subprime loans.”
And that discount card you use at the grocery store that entitles you to some pretty smokin’ deals? Rest assured the corporation that owns the grocer is not only using it to track consumer behavior, but is probably selling the data to others and making its money back tenfold.
That’s why the collaboratory is engaging ASU faculty and researchers with its series of lunchtime conversations.
“What we’re doing is finding as many different ways as possible to meet the average person where they are in terms of knowledge about their data shed,” said Rajko, who added that the collaboratory will also hold lab sessions, host a charrette on personal data and wearable devices, and present its work at an international retreat.
“These are all creative strategies to understand and address these topics because our goal is to be proactive rather than reactive.”

Human Security Critical Conversations:
• “(Un) Corporal Technologies: How Data, Algorithms and Interfaces Rub Up Against the Body,” 11:30 a.m. to 1 p.m. Feb. 1 in ISTB4 492
• “Who Has the Rights? Data Ownership, Invisible Labor, and Agency,” 11:30 a.m. to 1 p.m. Feb. 22 in ISTB4 492
• “Healthy Data: Health, Data and Healthy Practices in the Age of the Quantified-Self,” 11:30 a.m. to 1 p.m. March 21 in ISTB4 492
• “Algorithmic Bias: Subjectivity and Implicit Biases in Algorithm and Tech Design,” 11:30 a.m. to 1 p.m. April 18 in ISTB4 492