In the last post, we talked about neural language models and how they can be used to predict the next word in a given context. Now, one might ask: how could we use this idea in our feature-learning model?
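The core idea — learning word features as a by-product of next-word prediction — can be sketched with a tiny bigram model in NumPy. This is a toy illustration only: the corpus, dimensions, and training loop below are invented for the sketch and are not from the post.

```python
import numpy as np

# Hypothetical toy corpus (illustrative, not from the post).
corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}
V, D = len(vocab), 8  # vocabulary size, embedding dimension

rng = np.random.default_rng(0)
E = rng.normal(scale=0.1, size=(V, D))  # embedding table: the features we learn
W = rng.normal(scale=0.1, size=(D, V))  # output projection to next-word logits

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

lr = 0.1
for epoch in range(200):
    for a, b in zip(corpus, corpus[1:]):  # (context word, next word) pairs
        i, j = idx[a], idx[b]
        p = softmax(E[i] @ W)             # predicted next-word distribution
        grad_logits = p.copy()
        grad_logits[j] -= 1.0             # gradient of cross-entropy loss
        grad_E = W @ grad_logits          # backprop into the embedding row
        W -= lr * np.outer(E[i], grad_logits)
        E[i] -= lr * grad_E
```

After training, each row of `E` is a learned feature vector for one word — the embeddings fall out of the prediction task rather than being designed by hand.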

## Tags » Embeddings

#### Amalgamated products and HNN extensions (I): A theorem of B.H. Neumann

This is the first note of a series dedicated to constructions using amalgamated products and HNN extensions. We begin with an embedding theorem (an alternative proof can be found in our previous note…)

#### A free group contains a free group of any rank

For free groups, rank is a kind of dimension, so it is surprising that a free group of rank 2 can contain a free subgroup of any rank n; it is even possible for n to be infinite!
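A standard witness for the infinite-rank case (well known, though not stated in this excerpt) lives inside the free group F(a, b) of rank 2:

```latex
\text{In } F(a,b), \text{ the conjugates } x_k = b^{-k}\, a\, b^{k}
\quad (k = 0, 1, 2, \dots)
\text{ freely generate a subgroup of countably infinite rank.}
```

Freeness of the x_k is typically proved with a ping-pong (normal form) argument.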

#### When does a finite metric space embed isometrically into a Euclidean space?

A first preliminary remark is that isometrically embedding a finite metric space into a Euclidean space is not a trivial problem. Indeed, although such an embedding into ℓ∞^n (that is, ℝ^n with the sup norm) always exists for a metric space of cardinality n, there exist four-point metric spaces which cannot be embedded into any Euclidean space:
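One can check a candidate counterexample numerically with Schoenberg's criterion: a finite metric space embeds isometrically into some Euclidean space iff a certain Gram-type matrix is positive semidefinite. The four-point space below is the classic "claw" (star graph K_{1,3}) metric; the choice of example and the script are mine, not the post's.

```python
import numpy as np

# The "claw" metric on four points o, a, b, c:
# d(o, leaf) = 1 and d(leaf, leaf) = 2.
D = np.array([
    [0, 1, 1, 1],
    [1, 0, 2, 2],
    [1, 2, 0, 2],
    [1, 2, 2, 0],
], dtype=float)

# Schoenberg's criterion, based at point 0: the space embeds isometrically
# into a Euclidean space iff G is positive semidefinite.
n = D.shape[0]
G = np.array([[(D[0, i]**2 + D[0, j]**2 - D[i, j]**2) / 2
               for j in range(1, n)] for i in range(1, n)])

eigs = np.linalg.eigvalsh(G)  # a negative eigenvalue => no Euclidean embedding
```

Here G has a negative eigenvalue, so this four-point space embeds in no Euclidean space of any dimension.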

#### In case you were curious about Cantor Sets

This is the paper I wrote for my master's, and I'm going to try to bring it back down to Earth. I'm going to try to give links to relevant topics that I think may need a bit more explaining.

#### An old corker on the unknotting of knots

I imagine many readers of this blog are familiar with the fact that you can knot a circle in 3-space, but not in 4-space. If you enjoy thinking about why that is true, please read on!

#### Stemming

A few weeks ago, I went to Québec Ouvert Hackathon 3.3, and I was most interested in Michael Mulley's Open Parliament. One possible addition to the project is to cross-reference entries based not only on the parliament-supplied subject tags but also on the content of the text itself.
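Content-based cross-referencing usually starts by reducing inflected words to a common stem so that, say, "debate" and "debating" match. A minimal suffix-stripping stemmer can sketch the idea — this is a toy of my own, not the Porter algorithm; a real project would use a proper Porter/Snowball implementation (e.g. via NLTK):

```python
# Suffixes checked longest-first (an illustrative, incomplete list).
SUFFIXES = ["ation", "ments", "ment", "ness", "ing", "ers", "er", "ly", "es", "s"]

def stem(word: str) -> str:
    """Strip one common English suffix, keeping a stem of length >= 3."""
    w = word.lower()
    for suf in SUFFIXES:
        if w.endswith(suf) and len(w) - len(suf) >= 3:
            w = w[: -len(suf)]
            break
    # Collapse a doubled final consonant left by stripping,
    # e.g. "running" -> "runn" -> "run".
    if len(w) >= 2 and w[-1] == w[-2] and w[-1] not in "aeiou":
        w = w[:-1]
    return w
```

With stems in hand, two entries can be linked whenever their stemmed token sets overlap enough, regardless of how the words were inflected in each text.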