Logarithms: The Basics Explained Simply
A logarithm is the opposite of exponentiation. It answers the question:
👉 "What power should we raise a base to get a number?"
But wait… what exactly is a "power"? 🤔
🔥 The Power of Exponents
The power of a number (also called an exponent) tells us how many times to multiply that number by itself.
For example, the expression:
2³ # 2 * 2 * 2 = 8
…means multiplying 2 by itself three times, resulting in 8. 🚀
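In Python, the ** operator does the same thing:
2 ** 3 # 8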
Now that you understand exponents, logarithms (log) are just their reverse operation.
🔄 Logarithms: Undoing Exponents
For example, the logarithm:
log₂(8) = ?
…is asking "What power of 2 gives 8?" 🤔
Since we already know:
2³ = 8
…the answer is:
log₂(8) = 3
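You can check this directly in Python with math.log2:
import math
math.log2(8) # 3.0, the power that 2 must be raised to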
💡 A More Practical Example with Python
Let’s say we multiply 2 by itself 9 times:
2 ** 9 # 512
Now, what if we only have 512 and we want to reverse this process? We need to find how many times we multiplied 2 to get 512. That’s where logs come in!
🔹 Using Python’s math module
import math
math.log(512, 2) # 9
✨ This tells us that 2 was raised to the power of 9 to get 512.
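One more note: math.log with a single argument returns the natural log, while math.log2 and math.log10 are dedicated helpers that are a bit more accurate for those bases than the two-argument form:
math.log(512) # natural log, about 6.2383
math.log2(512) # 9.0
math.log10(1000) # 3.0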
🔷 NumPy Example
NumPy provides a natural logarithm (log base e) function:
import numpy as np
x = np.array([1, 10, 100, 1000])
np.log(x) # Natural log (ln) with base 'e'
If you need a logarithm with a different base, use:
np.log2(x) # Log base 2
np.log10(x) # Log base 10
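For any other base, you can use the change-of-base identity log_b(x) = ln(x) / ln(b), which works element-wise on the array (base 3 here is just an example):
np.log(x) / np.log(3) # log base 3 of each element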
🔥 PyTorch Example
In Deep Learning, we often use log() in calculations like log-likelihood and cross-entropy loss.
Here’s how to compute logs in PyTorch:
import torch
x = torch.tensor([1.0, 10.0, 100.0])
torch.log(x) # Natural log (ln)
If you need log base 2 or 10, PyTorch has dedicated functions:
torch.log2(x) # Log base 2
torch.log10(x) # Log base 10
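As a tiny sketch of why log shows up in those losses (the probabilities below are made-up values, not the output of a real model): the negative log-likelihood is small when the model puts high probability on the correct answer and blows up as that probability approaches zero.
import torch

probs = torch.tensor([0.9, 0.5, 0.1]) # hypothetical probabilities assigned to the correct class
-torch.log(probs) # tensor([0.1054, 0.6931, 2.3026]): confident and correct means low loss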
🎯 Final Thought
In short, logs tell us the exponent we need to reach a certain number.
Think of it like this:
- Exponentiation asks: "What result do I get when I multiply the base by itself a certain number of times?"
- Logarithms ask: "How many times did I multiply the base to get this result?"
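Putting the two together as a quick round trip:
import math

value = 2 ** 9 # 512: raise the base to the power
math.log2(value) # 9.0: the log recovers the power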
🔹 Logs are everywhere in Deep Learning, from loss functions to probabilities and gradient descent.
Hope this helped! 🚀 Let me know if you want more examples! 😃