Unfortunately, one major sector of image machine learning is CSAM scanning, which was also recently revealed as one of the major funding parties behind the planned EU legislation intended to allow scanning of all private communication. But generally I agree: most of what these workers see might not be too bad by itself, but it is still a job no human really wants to do of their own free will. Those who do take it on do so either out of a lack of choice or because they don't know what they are getting themselves into.
Around the world, millions of so-called “clickworkers” train artificial intelligence models, teaching machines the difference between pedestrians and palm trees, or what combination of words describe violence or sexual abuse.
Training an AI is not traumatizing; what you are describing is moderating public networks.
It depends.