White-sounding names like Emily and Matt were linked by the AI with words deemed 'pleasant', while black-sounding ones, such as Ebony and Jamal, were associated with 'unpleasant' words. The team took their case one step further, using the same names as a 2004 study that sent thousands of resumes to employers – some with white-sounding names and others with black-sounding names.


This link was even stronger than the uncontroversial findings that musical instruments and flowers were 'pleasant' while weapons and insects were 'unpleasant'.

Female names were also strongly associated with artistic terms, while male names were found to be closer to maths and science ones.

Researchers found that a widely used AI characterizes black-sounding names as 'unpleasant', which they believe is a result of our own human prejudice hidden in the data it learns from on the World Wide Web.

A recent example was reported by ProPublica in May, when an algorithm used by officials in Florida automatically rated a more seasoned white criminal as posing a lower risk of committing a future crime than a black offender with only misdemeanors on her record.

When the names from the resume study were put through the same word task, they produced the same results: flowers were linked with 'pleasant' words and insects with 'unpleasant' ones, and the algorithm again associated the white-sounding names with 'pleasant' words and the black-sounding ones with 'unpleasant' words.
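To make the word task concrete, here is a minimal sketch of how such an association can be measured, assuming pre-trained word vectors such as GloVe are available as a local text file; the file name, attribute lists and helper names are illustrative, not taken from the study.

    import numpy as np

    def load_vectors(path):
        # Parse a GloVe-style text file: one "word v1 v2 ... vN" entry per line.
        vectors = {}
        with open(path, encoding="utf-8") as f:
            for line in f:
                parts = line.rstrip().split(" ")
                vectors[parts[0]] = np.asarray(parts[1:], dtype=float)
        return vectors

    def cosine(u, v):
        # Cosine similarity: 1.0 for identical directions, -1.0 for opposite ones.
        return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

    def association(word, pleasant, unpleasant, vecs):
        # Mean similarity to the 'pleasant' list minus mean similarity to the
        # 'unpleasant' list; positive means the word sits closer to 'pleasant'.
        w = vecs[word]
        return (np.mean([cosine(w, vecs[a]) for a in pleasant])
                - np.mean([cosine(w, vecs[b]) for b in unpleasant]))

    vecs = load_vectors("glove.6B.100d.txt")   # hypothetical local vector file
    pleasant = ["love", "peace", "rainbow"]    # illustrative attribute lists,
    unpleasant = ["hatred", "filth", "agony"]  # not the study's full stimuli
    for name in ["emily", "matt", "ebony", "jamal"]:
        print(name, round(association(name, pleasant, unpleasant, vecs), 3))

A consistently lower score for one group of names than for the other is the kind of pattern the researchers describe.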

Princeton's results do not just show that datasets are polluted with prejudices and assumptions; the algorithms researchers currently use are reproducing humans' worst values, such as racism and prejudice. 'We can learn that "prejudice is bad", that "women used to be trapped in their homes and men in their careers, but now gender doesn't necessarily determine family role" and so forth,' the researchers write. 'If AI is not built in a similar way, then it would be possible for prejudice absorbed by machine learning to have a much greater negative impact than when prejudice is absorbed in the same way by children.'

Artificially intelligent robots and devices are being taught to be racist, sexist and otherwise prejudiced by learning from humans, according to new research.

A massive study of millions of words online looked at how close different terms were to each other in the text – the same way that automatic translators use 'machine learning' to establish what language means. The researchers found male names were more closely associated with career-related terms than female ones, which were more closely associated with words related to the family.
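As a rough illustration of that gender finding, here is a hedged sketch that reuses the load_vectors output (vecs) and the cosine helper from the sketch above; again, the name and attribute lists are illustrative, not the study's.

    male = ["john", "paul", "mike"]          # illustrative name lists
    female = ["amy", "joan", "lisa"]
    career = ["career", "salary", "office", "business"]
    family = ["home", "parents", "children", "wedding"]

    def mean_sim(words, attrs):
        # Average cosine similarity over every word/attribute pair.
        return np.mean([cosine(vecs[w], vecs[a]) for w in words for a in attrs])

    # A positive gap means male names sit closer to career terms, and female
    # names closer to family terms, echoing the association described above.
    gap = ((mean_sim(male, career) - mean_sim(male, family))
           - (mean_sim(female, career) - mean_sim(female, family)))
    print(round(gap, 3))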

Some people have already experienced this firsthand with Microsoft's racist chatbot Tay.

