Binary Governance: Lessons from the GDPR’s Approach to Algorithmic Accountability (Q2401)
Revision as of 09:41, 8 December 2019 by Podehaye (talk | contribs) (Created claim: Property:P126: Failing to be transparent about the fact that individuals are being targeted or the reasons why they are targeted itself may threaten autonomy. Secret profiling and decision-making can lead to manipulatio...)
| Language | Label | Description | Also known as |
|---|---|---|---|
| English | Binary Governance: Lessons from the GDPR’s Approach to Algorithmic Accountability | No description defined | |
Statements
Secret profiles, and decisions made based on secret profiling, can threaten personhood, and thus dignity, by precluding active individual involvement in the construction of this objectified version of the self.
the “‘data shadows’ . . . threaten to usurp the constitutive authority of the physical self despite their relatively attenuated and often misleading nature”
Algorithmic decision-making founded on individual profiling limits the choices and, thus, the freedom a person will have.
Limiting the choices we see—whether by failing to show opportunities or by offering only bad options—limits our freedom to make choices.
Failing to be transparent about the fact that individuals are being targeted, or about the reasons why they are targeted, may itself threaten autonomy. Secret profiling and decision-making can lead to manipulation. Without knowing how or why we are being targeted, we can be manipulated into making choices that are not autonomous at all.