The Irish Times' latest foodie list, via Aoife McElwain
Absent clearer guidelines, the burden falls on the scientific enterprise to self-regulate—and it isn’t set up to do that well. Academia is intensely competitive, and “the drivers are about getting grants and publications, and not necessarily about being responsible citizens,” says Filippa Lentzos from King’s College London, who studies biological threats. This means that scientists often keep their work to themselves for fear of getting scooped by their peers. Their plans only become widely known once they’ve already been enacted, and the results are ready to be presented or published. This lack of transparency creates an environment where people can almost unilaterally make decisions that could affect the entire world.

Take the horsepox study [the main topic of this article]. Evans was a member of a World Health Organization committee that oversees smallpox research, but he only told his colleagues about the experiment after it was completed. He sought approval from biosafety officers at his university, and had discussions with Canadian federal agencies, but it’s unclear if they had enough ethical expertise to fully appreciate the significance of the experiment. “It’s hard not to feel like he opted for agencies that would follow the letter of the law without necessarily understanding what they were approving,” says Kelly Hills, a bioethicist at Rogue Bioethics.

She also sees a sense of impulsive recklessness in the interviews that Evans gave earlier this year. Science reported that he did the experiment “in part to end the debate about whether recreating a poxvirus was feasible.” And he told NPR that “someone had to bite the bullet and do this.” To Hills, that sounds like I did it because I could do it. “We don’t accept those arguments from anyone above age 6,” she says.
Amazon’s computer models were trained to vet applicants by observing patterns in resumes submitted to the company over a 10-year period. Most came from men, a reflection of male dominance across the tech industry. […] Amazon’s system taught itself that male candidates were preferable. It penalized resumes that included the word “women’s,” as in “women’s chess club captain.” And it downgraded graduates of two all-women’s colleges, according to people familiar with the matter.

A nice demo of algorithmic bias right there. Worrying that there are plenty of other places carrying on with the concept, though…
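Amazon's actual system isn't public, but the failure mode described above is easy to reproduce. Here is a minimal sketch, using entirely fabricated toy data and a from-scratch perceptron (nothing here reflects Amazon's real model or data): when the historical hires are mostly male, tokens that correlate with women's resumes pick up negative weight even though they carry no job-relevant signal.

```python
# Toy illustration of algorithmic bias (hypothetical data, not Amazon's system).
# A perceptron learns token weights from "historical hiring" examples where
# label 1 = hired. Because the fabricated history has few hired women's
# resumes, tokens like "womens_chess_club" co-occur mainly with rejections.

def train_perceptron(data, vocab, epochs=20, lr=1.0):
    """Train a simple perceptron over bag-of-token resumes."""
    w = {t: 0.0 for t in vocab}
    b = 0.0
    for _ in range(epochs):
        for tokens, label in data:
            score = b + sum(w[t] for t in tokens)
            pred = 1 if score > 0 else 0
            if pred != label:  # standard perceptron update on mistakes
                for t in tokens:
                    w[t] += lr * (label - pred)
                b += lr * (label - pred)
    return w, b

# Fabricated historical data: hires skew male, so "womens_*" tokens
# appear only in rejected resumes.
data = [
    (["python", "chess_club"], 1),
    (["java", "captain"], 1),
    (["python", "captain"], 1),
    (["womens_chess_club", "python"], 0),
    (["womens_college", "java"], 0),
    (["retail"], 0),
]
vocab = {t for tokens, _ in data for t in tokens}
w, b = train_perceptron(data, vocab)

# The model ends up assigning negative weight to the "womens_*" tokens:
# it has learned the historical bias, not candidate quality.
```

The point of the sketch is that no one told the model to penalize those tokens; it inferred the correlation from the skewed training history, which is exactly why auditing learned weights and training data matters.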