Tidy Word Vectors Revisited

Julia Silge revisits her Hacker News word vectorization problem:

So hooray! We have found word vectors again, a bit faster, with clearer and easier-to-understand code. I do argue that this is a real benefit of this approach; it’s based on counting, dividing, and matrix decomposition and is thus much easier to understand and implement than anything with a neural network. And the results?

Click through to see the new method, as well as some basic analogy testing.
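The post itself has the details, but the general counting/PMI/decomposition idea it describes can be illustrated in a few lines. Below is a minimal Python sketch, not Julia's actual code (she works in R with tidytext and widyr): count word co-occurrences in a window, divide to get pointwise mutual information, and factor the resulting matrix with SVD to obtain dense word vectors. The toy corpus, window size, and vector dimension are all illustrative assumptions.

```python
# Minimal sketch of counting -> PMI -> matrix decomposition word vectors.
# Not the author's R implementation; corpus and parameters are toy assumptions.
from collections import Counter
import numpy as np

corpus = [
    "we found word vectors by counting words",
    "counting and dividing gives pmi values",
    "matrix decomposition of pmi gives word vectors",
]
window = 4  # sliding-window size (an assumption, not taken from the post)

word_counts = Counter()
pair_counts = Counter()
for doc in corpus:
    tokens = doc.split()
    word_counts.update(tokens)
    # count co-occurrences of each word with its neighbors in the window
    for i, w in enumerate(tokens):
        for c in tokens[max(0, i - window):i] + tokens[i + 1:i + 1 + window]:
            pair_counts[(w, c)] += 1

vocab = sorted(word_counts)
index = {w: i for i, w in enumerate(vocab)}
total_words = sum(word_counts.values())
total_pairs = sum(pair_counts.values())

# positive pointwise mutual information: max(log(p(w,c) / (p(w) p(c))), 0)
pmi = np.zeros((len(vocab), len(vocab)))
for (w, c), n in pair_counts.items():
    p_wc = n / total_pairs
    p_w = word_counts[w] / total_words
    p_c = word_counts[c] / total_words
    pmi[index[w], index[c]] = max(np.log(p_wc / (p_w * p_c)), 0.0)

# singular value decomposition of the PMI matrix yields dense word vectors
U, S, Vt = np.linalg.svd(pmi)
dim = 2  # keep only a couple of dimensions for this toy example
word_vectors = U[:, :dim] * S[:dim]
print(vocab[0], word_vectors[0])
```

The analogy testing mentioned above amounts to arithmetic on these vectors followed by a cosine-similarity lookup of the nearest word.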

