Category: Consumer

Democratic Privacy Reform

If you aren’t familiar with the issues surrounding online privacy and the collection of personal data by corporate tech giants, I recommend flipping through Amnesty International’s publication Surveillance Giants: How the Business Model of Google and Facebook Threatens Human Rights. I would also suggest Helen Nissenbaum’s “A Contextual Approach to Privacy Online” if you are interested in further discussion of the future of data collection. Even if you are familiar with these issues, read them anyway; both are very interesting and you may learn something new.

Both articles offer interesting suggestions for how governments and corporations can ensure online privacy is protected, and it is clear that top-down approaches are necessary for upholding human rights. Substantial effort will be required for full corporate compliance, however, as both law and computer systems need updating to better respect user data. While these measures direct ethical responsibilities to the appropriate parties, a complementary bottom-up approach may be required as well. There would be great potential for change if citizens engaged with this issue and helped one another understand the importance of privacy.

A democratic strategy for protecting online human rights is possible, but it seems quite demanding considering this work is ideally performed voluntarily. Additionally, I fear putting this approach into practice is an uphill epistemic battle: many individuals aren’t overly bothered by surveillance. Because the issue is complex and technological, it is difficult to understand, and without a perceived threat there is little concern. Thus, there will always be a market for the Internet of Things. Moreover, advertising revenue provides little incentive for corporations to respect user data, unless a vocal group of protesters can substantially threaten their public image. Enacting regulatory laws may be effective for addressing human rights issues, but the conflict between governments and companies is likely to continue under the status quo.

Consumers who enjoy these platforms and products face a moral dilemma: is their use acceptable if society and democracy are negatively impacted? Can ethical considerations regarding economic externalities help answer this question? If not, are there other analogous ethical theories appropriate for questions about the responsibilities of citizens? If activists and ethicists are interested in organizing information and materials for empowering voters and consumers, these challenges will need practical and digestible answers.

Works Cited

Amnesty International. Surveillance Giants: How the Business Model of Google and Facebook Threatens Human Rights. 2019, amnesty.org/en/documents/pol30/1404/2019/en/.

Nissenbaum, Helen. “A contextual approach to privacy online.” Daedalus 140.4 (2011): 32-48.

Addiction by Design: Candy Crush et al.

For class this week, we read the first four chapters of Natasha Dow Schüll’s book Addiction by Design. I think the goal was to consider the similarities and differences between slot machines and gaming applications on handheld devices.

While the two addictions are comparable despite their differences in gameplay format, apps like Candy Crush have found profitable solutions to their unique problems. Developers expect players to “leave their seats,” as cellphone use generally orbits around other aspects of daily life. So while “time on device” (58) is surely an important part of app design, creating incentives for users to return is also significant. Though this may be accomplished in a number of ways, a common strategy is to generate frequent notifications that both remind and seduce users back to their flow state (49). This approach may seem less inviting than sounds and lights, but its ability to display explicit directions may be effective: text can promise specific rewards if the user opens the app right then and there.

A pay structure involving varying wait times may also push users to pay for the ability to return to “the zone” (2). This may take the form of watching an advertisement or being locked out of play for an hour to a day, frustrating users enough that they pay to continue. Much as with embedding ATMs in slot machines (72), app stores with saved credit card information allow developers to seamlessly lead users to the ‘purchase’ button, quickly increasing revenue. Financial transactions thinly disguised as part of the game offer a new way to siphon money from vulnerable individuals, especially parents of children with access to connected devices. Additionally, gaming apps are only weakly associated with physical money like bills and coins, unlike slot machines of the mid-twentieth century (62), perhaps making it easier for consumers to pay without drawing their attention to the movement of money. This brief analysis suggests the nature of gambling is evolving by modifying existing modes of persuasion and adapting to new technological environments.
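To make the wait-and-pay mechanic concrete, here is a minimal sketch in Python. The class name, prices, and timings are hypothetical, invented purely for illustration; they are not taken from Candy Crush or from Schüll’s book.

```python
# Sketch of a "lives" timer with a pay-to-skip option. All numbers
# and names are hypothetical, for illustration only.
import time

REFILL_SECONDS = 30 * 60   # one life regenerates every 30 minutes
MAX_LIVES = 5
SKIP_PRICE = 0.99          # price to refill immediately

class LivesWallet:
    def __init__(self):
        self.lives = MAX_LIVES
        self.last_refill = time.time()

    def _refill(self):
        # Passively regenerate lives based on elapsed wall-clock time.
        elapsed = time.time() - self.last_refill
        gained = int(elapsed // REFILL_SECONDS)
        if gained:
            self.lives = min(MAX_LIVES, self.lives + gained)
            self.last_refill += gained * REFILL_SECONDS

    def play(self) -> bool:
        self._refill()
        if self.lives == 0:
            # The friction point: wait, watch an ad, or pay to continue.
            print(f"Out of lives! Refill now for ${SKIP_PRICE}?")
            return False
        self.lives -= 1
        return True

    def purchase_refill(self):
        # Saved payment details make this a single tap, much like an
        # ATM embedded in the machine.
        self.lives = MAX_LIVES
        self.last_refill = time.time()

wallet = LivesWallet()
for _ in range(6):   # the sixth attempt hits the paywall
    wallet.play()
```

The design point is that the timer manufactures the frustration and the saved payment details remove all friction from resolving it.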

One large concern, however, arises from where this money goes; while governmental agencies oversee gambling regulations (91) and collect revenue (5) to fund programs and projects, private companies simply collect capital. This carries serious implications for individuals, communities, and economies as that public stream of income dries up. State and provincial legislators should therefore consider addressing this issue sooner rather than later.

Works Cited

Schüll, Natasha Dow. Addiction by Design: Machine Gambling in Las Vegas. Princeton University Press, 2014.

Algorithmic Transparency and Social Power

This term I’m taking the course Science and Ethics, and this week we read Langdon Winner’s 1980 article “Do Artifacts Have Politics?” along with a 2016 paper by Brent Daniel Mittelstadt and colleagues titled “The Ethics of Algorithms: Mapping the Debate.” We are encouraged to write weekly responses, and given the concerning nature of what these articles discuss, I thought mine should be posted here. There is definitely a lot that could be expanded upon, which I may do at a later time.

Overall, the two articles suggest that the risk of discriminatory outcomes is an aspect of technological advancement, especially when power imbalances are present or inherent. “The ethics of algorithms: Mapping the debate” focuses particularly on algorithmic design and its current lack of transparency (Mittelstadt 6). The authors describe this as an epistemic concern, since even developers are unable to determine how a decision is reached, and one that leads to normative problems: algorithmic outcomes can generate discriminatory practices which generalize over and erroneously treat groups of people (Mittelstadt 5). Given the elusive epistemic nature of current algorithmic design, individuals throughout an entire organization can thus truthfully claim ignorance of their own business practices. Some may take advantage of this fact. Today, corporations that successfully integrate their software into the daily lives of millions of users have little incentive to change, given shareholder desires for financial growth. Until we change a system which implicitly suggests companies can simply pay a fee, in the form of out-of-court legal settlements, to act unethically, this problem is likely to continue to manifest. This does not inspire confidence in the future of AI as we hand over our personal information to companies and governments (Mittelstadt 6).
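To make the epistemic point concrete, here is a minimal sketch of the kind of opacity the authors describe, using a small neural network on synthetic data. Everything in it (the scikit-learn model, the invented loan-screening features) is my own illustrative assumption, not an example from the paper.

```python
# Sketch of epistemic opacity: even with full access to a trained
# model, its internal "reasons" are just numeric weights.
# All data here is synthetic and illustrative.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Hypothetical loan-screening features: income, debt ratio, years employed.
X = rng.normal(size=(1000, 3))
y = (X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.5, size=1000) > 0).astype(int)

model = MLPClassifier(hidden_layer_sizes=(32, 32), max_iter=500, random_state=0)
model.fit(X, y)

applicant = np.array([[0.1, 1.2, -0.3]])
print("decision:", model.predict(applicant)[0])

# The only "explanation" the model itself offers is its parameters:
n_weights = sum(w.size for w in model.coefs_) + sum(b.size for b in model.intercepts_)
print(f"this decision is encoded in {n_weights} learned numbers")
```

Even with complete access to the model, a developer can inspect only a pile of learned numbers; there is no human-readable chain of reasons behind any individual decision, which is precisely what makes organization-wide claims of ignorance plausible.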

Langdon Winner’s article on whether artifacts have politics provides a compelling argument for the inherently political nature of our technological objects. Though published in 1980, its wisdom and relevance apply readily to contemporary contexts; Internet memes even pick up on this parallel, with one example posing as a message from Microsoft declaring that those who program open-source software are communists. While roles of leadership are required for many projects and organizations (Winner 130), inherently political technologies have the hierarchy of social functioning as part of their conceptual foundations (133). The point the author aims to stress concerns technological effects which impede social functioning (Winner 131), a direction we have yet to move away from, considering the events leading up to and following the 2016 American presidential election. If we don’t strive for better epistemic and normative transparency, we will be met with authoritarian outcomes. As neural networks continue to creep into sectors of society like law, healthcare, and education, the protection of individual rights remains at risk.

Works Cited

Mittelstadt, Brent Daniel, et al. “The ethics of algorithms: Mapping the debate.” Big Data & Society 3.2 (2016): 1-21.

Winner, Langdon. “Do artifacts have politics?” Daedalus 109.1 (1980): 121-36.