This week in artificial intelligence (AI) news, we look at a program that learns chess by listening to real match commentary, the UK’s National Health Service (NHS) making a major investment in AI, and the development of AI tools for reading sign language.
Researchers at University College London have proposed SentiMATE, a new algorithm that learns chess through natural language processing (NLP). SentiMATE was trained on 2,700 chess match commentaries available online, using the commentators’ discussion of each move to assess the quality of possible moves. Out of 100 matches played, SentiMATE achieved a success rate of 81 per cent. Read the full article here.
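To give a flavour of the idea, here is a minimal sketch of scoring candidate moves by the sentiment of their commentary. The word lexicon, scoring scheme, and move examples below are simplified assumptions for illustration, not the actual SentiMATE model, which uses a trained NLP sentiment classifier rather than a hand-built word list.

```python
# Toy sentiment lexicon (an assumption for illustration; SentiMATE
# learns sentiment from commentary data rather than using a word list).
POSITIVE = {"brilliant", "strong", "excellent", "winning", "accurate", "solid"}
NEGATIVE = {"blunder", "weak", "mistake", "losing", "dubious"}

def sentiment_score(commentary: str) -> int:
    """Crude sentiment score: +1 per positive word, -1 per negative word."""
    words = [w.strip(".,!?") for w in commentary.lower().split()]
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def evaluate_moves(moves_with_commentary):
    """Rank candidate (move, commentary) pairs by commentary sentiment."""
    return sorted(moves_with_commentary,
                  key=lambda mc: sentiment_score(mc[1]),
                  reverse=True)

moves = [
    ("Nf3", "A solid, accurate developing move."),
    ("Qh5?", "A dubious early queen sortie, often a mistake."),
]
best_move, _ = evaluate_moves(moves)[0]
print(best_move)  # → Nf3
```

The intuition is the same as in the research: commentary carries a human judgement about each move, so sentiment attached to the text can stand in for a move-quality signal.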
The NHS in the United Kingdom recently announced a £250m investment to advance the role of AI in healthcare. The potential of AI technologies and tools has already been demonstrated in areas such as medical imaging, patient diagnosis, and workflow; however, alongside these positive applications there are also challenges to be addressed. Read more in this article from the BBC here.
A variety of AI tools have been used, and are being developed, for speech recognition; but what about those who may not be able to speak or hear? Google’s AI labs have been working on a real-time hand-tracking framework to read and interpret sign language. Hand movements are often quick, which makes capturing them in real time difficult. Although in its early stages, this technology may make for some incredible real-world applications! Read the full article here.
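Hand-tracking frameworks of this kind predict a fixed set of landmark points on the hand each frame, and a sign can then be recognised by comparing those landmarks against known poses. The sketch below illustrates that second step with a simple nearest-template match; the three-point "poses", coordinates, and labels are invented assumptions for illustration (real trackers predict many more landmarks per hand, and Google’s approach is far more sophisticated).

```python
import math

def flatten(landmarks):
    """Turn a list of (x, y) landmark points into one flat coordinate vector."""
    return [coord for point in landmarks for coord in point]

def distance(a, b):
    """Euclidean distance between two flat coordinate vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(landmarks, templates):
    """Label a set of landmarks with the nearest template pose."""
    vec = flatten(landmarks)
    return min(templates, key=lambda label: distance(vec, flatten(templates[label])))

# Two toy poses with just three landmarks each (hypothetical values;
# real hand trackers use many more points per hand).
templates = {
    "open_hand": [(0.0, 0.0), (0.5, 1.0), (1.0, 0.0)],
    "fist":      [(0.0, 0.0), (0.1, 0.1), (0.2, 0.0)],
}
observed = [(0.0, 0.05), (0.45, 0.95), (0.95, 0.05)]
print(classify(observed, templates))  # → open_hand
```

The real-time difficulty mentioned above lives upstream of this step: producing accurate landmarks fast enough from video of quickly moving hands is the hard part.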