A critique and complement to “What gender gap in chess?”

The gap between women and men in chess has long been a subject of debate. Many studies have attempted to explain the seemingly large gap between men and women, both in participation and in strength at the top level. This old debate has reopened in recent weeks with an article in Mint attempting to explain the low presence of women at the top level, and a more recent Chessbase article titled “What gender gap in chess?” that confronts its main claims from a new statistical perspective. While…

This year I’m teaching a module on Applied Machine Learning with over a hundred students. The students come from very different backgrounds and may not be well prepared in advance to complete this module successfully. To get all students up to speed and level the playing field, I have compiled a list of online resources on three areas that are key to becoming a successful Machine Learning practitioner: programming (Python), basic mathematics, and the use of the command line and virtual environments. In general, all these online courses and tutorials are suited to those interested in data science and machine…

Written by Jose Camacho Collados and Taher Pilehvar

Word embeddings are representations of words as low-dimensional vectors, learned from vast amounts of text corpora. As explained in a previous post, word embeddings (e.g. Word2Vec [1], GloVe [2] or FastText [3]) have proved to be powerful carriers of prior knowledge that can be integrated into downstream Natural Language Processing (NLP) applications. However, despite their flexibility and success in capturing semantic properties of words, the effectiveness of word embeddings is generally hampered by an important limitation, known as the meaning conflation deficiency: the inability to discriminate among the different meanings of a word.
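The conflation deficiency is easy to see with a toy example. The sketch below uses made-up 3-dimensional vectors (real embeddings such as Word2Vec or GloVe are typically 100–300 dimensional and learned from corpora); the specific words and values are illustrative assumptions, not taken from any trained model:

```python
import math

# Toy 3-d vectors; all values are made up for illustration.
# A trained model assigns ONE vector per word form, so "bank"
# gets a single vector blending its river and financial senses.
embeddings = {
    "bank":  [0.7, 0.5, 0.1],
    "river": [0.9, 0.1, 0.0],
    "money": [0.1, 0.9, 0.2],
}

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# The single "bank" vector sits between its two senses: it is
# fairly similar to BOTH "river" and "money", rather than close
# to one or the other depending on context.
sim_river = cosine(embeddings["bank"], embeddings["river"])
sim_money = cosine(embeddings["bank"], embeddings["money"])
print(sim_river, sim_money)
```

Sense-aware representations (one vector per word *meaning* rather than per word form) are one way to address this limitation.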

Neural networks have contributed to outstanding advances in fields such as computer vision [1,2] and speech recognition [3]. Lately, they have also started to be integrated into other challenging domains, such as Natural Language Processing (NLP). But how do neural networks contribute to the advance of text-based applications? In this post I will try to explain, in a very simplified way, how to apply neural networks and integrate word embeddings into text-based applications, and some of the main, often implicit, benefits of using neural networks and word embeddings in NLP.

First, what are word embeddings? Word embeddings are (roughly) dense vector representations…
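A minimal sketch of the idea described above: represent a text as the average of its word embeddings and feed that vector to a small neural layer. The embedding values and the layer weights below are made up (a real system would load pre-trained embeddings and learn the weights); only NumPy is assumed:

```python
import numpy as np

# Hypothetical 4-d embeddings standing in for pre-trained vectors
# (e.g. Word2Vec or GloVe in a real application).
embeddings = {
    "great": np.array([0.9, 0.1, 0.2, 0.0]),
    "movie": np.array([0.1, 0.8, 0.1, 0.1]),
    "awful": np.array([-0.8, 0.2, 0.1, 0.0]),
}

def encode(tokens):
    """Average the word vectors of a text: a simple, common
    baseline for turning variable-length text into a fixed-size
    input a neural network can consume."""
    return np.mean([embeddings[t] for t in tokens], axis=0)

# One dense layer with random (untrained) weights, mapping the
# 4-d text vector to scores over 2 classes (e.g. pos/neg).
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 2))
b = np.zeros(2)

def forward(x):
    """Linear layer followed by a softmax over the 2 classes."""
    logits = x @ W + b
    exp = np.exp(logits - logits.max())
    return exp / exp.sum()

probs = forward(encode(["great", "movie"]))
print(probs)  # a probability distribution over the two classes
```

With trained weights, the same pipeline becomes a basic text classifier; deeper architectures replace the averaging step with layers that account for word order.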

As you probably know, DeepMind recently published a paper on AlphaZero [1], a system that learns by itself and is able to master games like chess or Shogi.

Before getting into details, let me introduce myself. I am a researcher in the broad field of Artificial Intelligence (AI), specialized in Natural Language Processing. I am also a chess International Master, currently the top-ranked player in South Korea, although practically inactive for the last few years due to my full-time research position. Given my background, I have tried to build a reasoned opinion on the subject, as constructive as…

Jose Camacho Collados

Mathematician, AI/NLP researcher and chess International Master. http://www.josecamachocollados.com
