AI Computational Linguistics
Ricardo Lezama  

If You’re A Linguist, Don’t Sweat The Formal Math (Too Much) in Natural Language Processing

In the realm of Natural Language Processing (NLP), a foundational understanding of linear algebra does help (somewhat) when trying to make money with NLP. Fundamentally, the concern for all of us Philosophers, Linguists, and Psychologists is to earn a decent living without being penalized for choosing fields whose payoff is anything but immediate. The overarching disciplines connected to Cognitive Science ultimately took us all in a very enriching direction, but the practical cost of being too squarely centered in any one of these domains is also very real.

There’s also the fact that many of us – even if it’s not immediately evident in a Linguistics 101 course – already speak the formal vocabulary of logic and science when describing linguistic data. A feature matrix used to justify syntactic transformations by referencing lexical categories (or whatever other linguistic feature) looks terribly similar to a conventional linear algebra matrix in its more mainstream context. For example, consider the transformations below:

| Sentence | Lexical Categories (Features) | Transformation |
| --- | --- | --- |
| The cat is on the mat. | [Det, Noun, Verb, Prep, Det, Noun] | Original sentence |
| The cat is lying on the mat. | [Det, Noun, Verb, Verb, Prep, Det, Noun] | Insertion: adding “lying” |
| A dog sleeps on the sofa. | [Det, Noun, Verb, Prep, Det, Noun] | Generalization: “dog” instead of “cat” |
| The dogs are sleeping on the sofa. | [Det, PluralNoun, Verb, GerundVerb, Prep, Det, Noun] | Pluralization: “dogs” and gerund form of “sleep” |
| Cats like fish. | [PluralNoun, Verb, Noun] | Simplification: removing the determiner (“the”) |
| Fish are liked by cats. | [PluralNoun, Verb, Verb, Prep, PluralNoun] | Passivization |
If making these connections is not difficult for you, then you can figure out formal languages, including programming languages.
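
To make that analogy concrete, here is a minimal sketch in Python; the category inventory and the one-hot encoding scheme are illustrative assumptions of mine, not a standard linguistic formalism:

```
import numpy as np

# Hypothetical inventory of lexical categories (illustrative, not standard)
categories = ["Det", "Noun", "PluralNoun", "GerundVerb", "Verb", "Prep"]

# "The cat is on the mat." as a sequence of category labels
sentence = ["Det", "Noun", "Verb", "Prep", "Det", "Noun"]

# One-hot feature matrix: one row per word, one column per category
feature_matrix = np.zeros((len(sentence), len(categories)), dtype=int)
for row, label in enumerate(sentence):
    feature_matrix[row, categories.index(label)] = 1

print(feature_matrix)
```

Each row is an ordinary vector, so the table’s transformations become row edits on an integer matrix – exactly the kind of object linear algebra manipulates.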

Therefore, we may as well engage the pertinent domains of mathematics and logic right away (even if not at the full strength of a CS or Math degree holder) if we want to engage computing and enterprise tools, like very specific programming languages.

Ultimately, Linear Algebra plays a central role in describing the complex concepts and mechanisms that underpin various NLP tasks. But if you don’t take a whole damn course, I don’t blame you. Just developing an intuition is good enough to hang in the industry – that’s my recommendation after passing the decade mark in my overlapping industry and media responsibilities.

Linear Algebra Intuition

Most pipelines use some kind of vectorized representation of meaning. As long as I understand that some process takes the co-occurrences of words and chomps them down into a binary matrix – a 2-D grid of zeros and ones – then I am fine.


```
import numpy as np

# Sample co-occurrence data (word-word matrix); rows and columns
# correspond to the words ["cat", "dog", "mouse", "bird"]
co_occurrence_matrix = np.array([[0, 1, 0, 1],
                                 [1, 0, 1, 0],
                                 [0, 1, 0, 1],
                                 [1, 0, 1, 0]])

# Convert the co-occurrence matrix to a binary matrix
# (a no-op here, but in general this thresholds raw counts)
binary_matrix = (co_occurrence_matrix > 0).astype(int)

# Display the binary matrix
print("Binary Matrix Representation:")
print(binary_matrix)
```

This prints:

```
Binary Matrix Representation:
[[0 1 0 1]
 [1 0 1 0]
 [0 1 0 1]
 [1 0 1 0]]
```

Linguistic Representation:
- The word "cat" appears with the word "dog" and the word "bird"
- The word "dog" appears with the word "cat" and the word "mouse"
- The word "mouse" appears with the word "dog" and the word "bird"
- The word "bird" appears with the word "cat" and the word "mouse" 

In this representation, the matrix indicates the co-occurrence relationships between real words. Each row and column represents a specific word, and the values in the cells indicate whether the corresponding words tend to occur together or not.
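
As a quick sanity check, here is a small sketch that reads those relationships directly off the toy matrix, using the same word list as the code above:

```
import numpy as np

words = ["cat", "dog", "mouse", "bird"]
matrix = np.array([[0, 1, 0, 1],
                   [1, 0, 1, 0],
                   [0, 1, 0, 1],
                   [1, 0, 1, 0]])

# For each word, list the words whose column is non-zero in its row
for i, word in enumerate(words):
    neighbors = [words[j] for j in np.nonzero(matrix[i])[0]]
    print(f'"{word}" appears with: {", ".join(neighbors)}')
```

Running it reproduces the bullet list above, which is the whole point: the prose description and the matrix carry the same information.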

Thus, word and sentence embeddings encapsulate semantic information in a continuous vector space. Through linear algebra operations on these vector representations, NLP systems can uncover underlying patterns, similarities, and relationships within textual data. However, that stuff is automated, so unless you’re writing research papers for the rest of your life, there is no reason to sacrifice that date, night out, or whatever to do the calculations by hand.
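
For intuition only, here is a minimal sketch with made-up three-dimensional “embeddings” (real trained vectors have hundreds of dimensions); cosine similarity is just the dot product normalized by vector lengths:

```
import numpy as np

# Toy, hand-made "embeddings" -- invented values, not trained vectors
cat = np.array([0.9, 0.1, 0.3])
dog = np.array([0.8, 0.2, 0.4])
car = np.array([0.1, 0.9, 0.7])

def cosine_similarity(u, v):
    # Dot product normalized by the two vector lengths
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

print(cosine_similarity(cat, dog))  # high: semantically close
print(cosine_similarity(cat, car))  # lower: semantically distant
```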

Bertrand Russell once quipped about how nice it would have been for formal logical calculations to have been available in an automated fashion. So, unless you’re on the heels of the great philosopher in your studies, research, or enterprise task (which DOES NOT require that much time), don’t fret about the math: either the tedious portion of the calculation is automated or you can learn the core components quickly.

Moreover, linear algebra opens up avenues for operations such as computing dot products on vectorized representations. The dot product serves as a measure of similarity between vectors, allowing NLP systems to identify pieces of text that exhibit similar semantic traits. So, leveraging the dot product, NLP models can retrieve relevant documents or recommend similar content to users based on their input.
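
Here is a sketch of that retrieval idea under toy assumptions – the document vectors and the query below are invented values standing in for learned embeddings:

```
import numpy as np

# Hypothetical document vectors (one row per document)
documents = np.array([[0.9, 0.1, 0.0],   # doc 0: mostly about pets
                      [0.2, 0.8, 0.1],   # doc 1: mostly about cars
                      [0.7, 0.3, 0.2]])  # doc 2: pets, with some overlap
query = np.array([1.0, 0.0, 0.1])        # a "pets" query vector

# One matrix-vector product scores every document at once
scores = documents @ query

# Rank documents from most to least similar
ranking = np.argsort(scores)[::-1]
print("Scores:", scores)
print("Best match: doc", ranking[0])
```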

Additionally, linear algebra facilitates transformations and manipulations of vector representations through operations like matrix multiplication and matrix transposition:


```
import numpy as np

# Real words
words = ["cat", "dog", "mouse", "bird"]

# Word-word matrix representation
matrix = np.array([[0, 1, 0, 1],
                   [1, 0, 1, 0],
                   [0, 1, 0, 1],
                   [1, 0, 1, 0]])

print("Word-Word Matrix Representation:")
for i in range(len(words)):
    for j in range(len(words)):
        print(matrix[i][j], end=" ")
    print()

# Matrix multiplication example: the vector "selects" dog and mouse,
# so each entry of the result counts that word's co-occurrences with the pair
vector = np.array([0, 1, 1, 0])
result = np.dot(matrix, vector)

print("\nMatrix Multiplication (Vector Representation):")
print(result)

# Matrix transformation example: the transpose swaps rows and columns
# (this symmetric co-occurrence matrix equals its own transpose)
transpose = np.transpose(matrix)

print("\nMatrix Transpose:")
print(transpose)
```


These operations play a vital role in tasks such as sentiment analysis, machine translation, and summarization by enabling the encoding and decoding of textual information across different languages and domains. You can basically use vectors as indirect (but fairly accurate) descriptions of meaning.

Yes, possessing a solid intuition for linear algebra is essential for anyone venturing into the field of Natural Language Processing. Understanding the principles of vectorized representations, clustering algorithms, dot products, and matrix operations equips practitioners with the tools needed to navigate the intricacies of NLP tasks effectively. By grasping the power of linear algebra in NLP, one can unravel the complexities of language and unlock the full potential of computational linguistics in building intelligent and intuitive systems for processing and understanding human language.