Usages of TMUX
TMUX can retain your current working session. This is useful when you cannot keep a persistent connection to a remote server, e.g. an SSH login.
Key PRE
Default prefix (PRE as follows) is Ctrl + b.
If we need to input PRE + d, first press Ctrl + b, release both keys, and then press d.
Basic concepts
Session: different projects
...
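A typical session round trip looks like this (a sketch; the session name myproj is just an example):

```shell
tmux new -s myproj          # start a named session for a project
# ... work, then detach with PRE + d; the session keeps running ...
tmux ls                     # list live sessions after reconnecting via SSH
tmux attach -t myproj       # resume exactly where you left off
tmux kill-session -t myproj # discard the session when done
```

Because the session survives a dropped SSH connection, simply re-attaching restores all windows and running programs.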
Detailed Explanation and Non-applicable Scenarios of Batch Normalization
Algorithm
Batch Normalization (BN) has
non-learnable parameters:
mean_batch $\mu_{B}$
variance_batch $\sigma_{B}$
running_mean $\mu_{run}$
running_variance $\sigma_{run}$
learnable parameters (not fixed during training):
weight $\gamma$
bias $\beta$
A random sample $i$ in a mini...
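The training-time computation over these parameters can be sketched in plain Python (a minimal sketch; the function name is illustrative, and momentum=0.1 and eps=1e-5 are assumed to mirror PyTorch's defaults):

```python
def batch_norm_train(x, gamma, beta, run_mean, run_var, momentum=0.1, eps=1e-5):
    # Non-learnable batch statistics mu_B and sigma_B^2
    mu = sum(x) / len(x)
    var = sum((v - mu) ** 2 for v in x) / len(x)
    # Normalize, then scale and shift with the learnable gamma and beta
    y = [gamma * (v - mu) / (var + eps) ** 0.5 + beta for v in x]
    # Update running statistics, which are used instead at inference time
    run_mean = (1 - momentum) * run_mean + momentum * mu
    run_var = (1 - momentum) * run_var + momentum * var
    return y, run_mean, run_var
```

With gamma = 1 and beta = 0 the normalized outputs have (near-)zero mean within the batch, while the running statistics drift slowly toward the batch statistics.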
(PyTorch) CrossEntropyLoss Log-softmax and NLL
Softmax:
$\sigma(z)_j = \frac{e^{z_j}}{\sum_{k=1}^{K} e^{z_k}}$
Log-softmax:
$\log\sigma(z)_j = z_j-\log{\sum_{k=1}^{K} e^{z_k}}$
In PyTorch:
nn.CrossEntropyLoss is equivalent to the combination of F.log_softmax and F.nll_loss
F.nll_loss only computes $\sum_j -y_j \, l_j$, i.e. the negative log-probability of the target class, where $l$ is the log-softmax output and $y$ the one-hot label (then reduced according to reduction).
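The equivalence is easy to verify in plain Python without PyTorch (the function names below mirror, but are not, the PyTorch APIs):

```python
import math

def log_softmax(z):
    # log-softmax: z_j - log(sum_k exp(z_k)), computed stably via the max trick
    m = max(z)
    lse = m + math.log(sum(math.exp(v - m) for v in z))
    return [v - lse for v in z]

def nll_loss(log_probs, target):
    # NLL picks the negative log-probability of the target class
    return -log_probs[target]

def cross_entropy(z, target):
    # Matches the nn.CrossEntropyLoss combination described above
    return nll_loss(log_softmax(z), target)
```

Feeding raw logits to cross_entropy gives the same number as normalizing with softmax first and taking the negative log of the target probability.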
Packing and Unpacking in a Single Element Tuple
Parentheses can denote either grouping in a mathematical expression or a tuple. To explicitly indicate a single-element tuple, a trailing comma must be added.
Packing
# int
int0 = (15)
print(type(int0))
# tuple
tuple0 = (15,)
print(type(tuple0))
Unpacking
# Failed unpacking
unpack0 = tuple0
print(type(unpack0)) # tuple
# Successful unpacking
un...
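A complete round trip, as a small sketch (the variable names here are illustrative):

```python
tuple0 = (15,)
# Assigning the tuple itself does not unpack it
unpack0 = tuple0
assert type(unpack0) is tuple
# A trailing comma on the left-hand side unpacks the single element
value, = tuple0
assert type(value) is int
assert value == 15
```

The left-hand comma mirrors the right-hand one: it marks a one-element tuple pattern, so the element is extracted rather than the tuple copied.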
Strategies for Imbalanced Data
There are many strategies for addressing data imbalance. Another related keyword: long-tailed distribution.
(Easiest) Oversampling and downsampling
Cons of oversampling: overfitting
Cons of downsampling: wasting data
(Binary classification) Adjustment of prediction threshold
If the ratio is 3:7, the classification probabili...
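Threshold adjustment can be sketched as follows (the 0.7 threshold simply mirrors the 3:7 ratio mentioned above and is an assumption, not a tuned value):

```python
def predict(prob_positive, threshold=0.5):
    # Classify as positive only if the predicted probability clears the threshold
    return 1 if prob_positive >= threshold else 0

probs = [0.55, 0.65, 0.75]
default = [predict(p) for p in probs]                 # default threshold 0.5
adjusted = [predict(p, threshold=0.7) for p in probs] # raised for the majority class
```

Raising the threshold toward the majority-class prior trades recall on the majority class for precision on the minority class; the exact value is usually tuned on a validation set.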
Difference between Likelihood, Negative Log-likelihood, Entropy, Cross-entropy and KL Divergence
Likelihood
Assume a three-class classification problem with a one-hot coded label $y_i$. The observation (output of a DNN) for a sample is $\hat{y}_i$.
The equation \(\text{Likelihood} = \prod_{i}\hat{y}_i^{y_i}\) computes the likelihood of the sample, i.e. the predicted probability of the true class.
Object   $\hat{y}_i$   $y_i$
Cat      0.7           ...
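As a numeric sketch (only the 0.7 for Cat comes from the table above; the remaining probabilities are assumed for illustration):

```python
import math

# One-hot label: the sample is a Cat; the tail of the predicted
# distribution (0.2, 0.1) is hypothetical
y = [1, 0, 0]
y_hat = [0.7, 0.2, 0.1]

# Likelihood = product of y_hat_i ** y_i; one-hot exponents select
# exactly the predicted probability of the true class
likelihood = math.prod(p ** t for p, t in zip(y_hat, y))
nll = -math.log(likelihood)  # negative log-likelihood
```

Because the label is one-hot, every factor except the true class contributes $p^0 = 1$, so the likelihood collapses to the single probability 0.7 and the NLL to $-\log 0.7$.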
Do Not Use exit() in IPython
Do not use exit() in any IPython script (Jupyter Notebook, etc).
The kernel will be trapped in a die-restart loop with the following error message:
The kernel appears to have died. It will restart automatically.
Differences between Python 2 and Python 3
division
Python 2 (By default integer division)
assert 1 / 2 == 0
Python 3 (By default true division)
assert 1 / 2 == 0.5
Remedy for Python 2 (forward compatible with Python 3)
from __future__ import division
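Going the other way, Python 3's // operator recovers integer-style division (a quick sketch):

```python
# Python 3: true division vs. floor division
assert 1 / 2 == 0.5    # / always performs true division
assert 1 // 2 == 0     # // recovers the Python 2 integer-division result
assert -1 // 2 == -1   # note: // floors toward negative infinity
```

Note that // floors rather than truncates, so negative operands behave like Python 2's integer division, not like C's.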
print
Python 2 (no parentheses)
print 'I am a print function'
Python 3
print('I am a print function')
...