Oh god. This, so much:
Many researchers in the field cut their teeth as techno-optimists, studying the positive aspects of the internet—like bringing people together to enhance creativity or further democratic protest, à la the Arab Spring—says Marwick. But it didn’t last. The past decade has been an exercise in dystopian comeuppance to the utopian discourse of the ’90s and ’00s. Consider Gamergate, the Internet Research Agency, fake news, the internet-fueled rise of the so-called alt-right, Pizzagate, QAnon, Elsagate and the ongoing horrors of kids’ YouTube, Facebook’s role in fanning the flames of genocide, Cambridge Analytica, and so much more. “In many ways, I think it [the malaise] is a bit about us being let down by something that many of us really truly believed in,” says Marwick. Even those who were more realistic about tech—and foresaw its misuse—are stunned by the extent of the problem, she says. “You have to come to terms with the fact that not only were you wrong, but even the bad consequences that many of us did foretell were nowhere near as bad as the actual consequences that either happened or are going to happen.” […] “It’s not that one of our systems is broken; it’s not even that all of our systems are broken,” says Phillips. “It’s that all of our systems are working … toward the spread of polluted information and the undermining of democratic participation.” (via Paul Moloney)
‘The identity data of magistrates and members of the judiciary cannot be reused with the purpose or effect of evaluating, analysing, comparing or predicting their actual or alleged professional practices.’ As far as Artificial Lawyer understands, this is the very first example of such a ban anywhere in the world. Insiders in France told Artificial Lawyer that the new law is a direct result of an earlier effort to make all case law easily accessible to the general public, which was seen at the time as improving access to justice and a big step forward for transparency in the justice sector. However, judges in France had not reckoned on NLP and machine learning companies taking the public data and using it to model how certain judges behave in relation to particular types of legal matter or argument, or how they compare to other judges. In short, they didn’t like how the pattern of their decisions – now relatively easy to model – was potentially open for all to see.