We are seeking a designer with the requisite skills and experience to contribute to the design and layout of an upcoming brochure. Further details regarding the content can be found via the following link: https://algorithmic-sabotage.github.io/asrg/theorizing-algorithmic_sabotage/. If you are interested, please contact us via email.
“If you think #technology will solve your problems, you don’t understand technology — and you don’t understand your problems.”[1]
— Laurie Anderson, 2020

“To compare human and machine intelligence implies also a judgement about which human behaviour or social group is more intelligent than another, which workers can be replaced and which cannot. Ultimately, #AI is not only a tool for automating labour but also for imposing standards of mechanical intelligence that propagate, more or less invisibly, social hierarchies of knowledge and skill. As with any previous form of automation, AI does not simply replace workers but displaces and restructures them into a new social order.”[2]
— Matteo Pasquinelli, 2023

Through horizontal forms of intervention that prioritise marginalised perspectives, “Algorithmic Sabotage” articulates a collective approach to challenging the ideology of “algorithms everywhere”. It shifts the focus from statistical inference to mutually constituting solidarity and undertakes the activities necessary to generate prefigurative practices of resistance, agency and refusal that disrupt the algorithmic enclosure and overturn the imposition of continuous states of exception. In doing so, it highlights the entanglement of the algorithmic harmfulness of “AI” with ongoing forms of societal disintegration, from austerity to far-right politics, and from racialised algorithmic violence to the propagation of patterns of segregation and exclusion.

---

[1] Sterling, B. (2020) Laurie Anderson, machine learning artist-in-residence. Wired.

[2] Pasquinelli, M. (2024) The eye of the master: A social history of artificial intelligence. London, UK: Verso.
The Police Officers Faces (POF) dataset is an investigative counter-surveillance artistic project that focuses on the use of facial recognition technology, with a particular emphasis on its deployment by law enforcement agencies. The dataset comprises 88,783 facial images of thousands of police officers. The images have been sourced from the #internet and are used for facial recognition.

In this project, the conventional paradigm, in which the powerful observe and the powerless are observed, is reversed. To address the dehumanising effects of expanding automated discrimination and segregation, the exacerbation of harm and pervasive correlation, the project advocates a distinct technical mentality, a collective “counter-intelligence”, which emphasises the agency of citizens and grassroots realities in uncovering facts, and which focuses on the impact of collective action in revealing wrongdoing and in the collaborative production of social justice.

By employing a multifaceted approach encompassing targeted #data collection and compilation, social engineering, and rigorous analysis, the #Police Officers Faces (POF) dataset establishes a comprehensive structural alternative framework for the emancipatory empowerment of community members engaged in #copwatch and other counter-#surveillance practices.

* Please be advised that the URL below provides an initial overview only. As further information becomes available, the content will be updated accordingly.

⟶ https://algorithmic-sabotage.github.io/asrg/police_officers-faces-pof/