# LeetCode Contest 54

```python
class Solution(object):
    def findShortestSubArray(self, nums):
        """
        :type nums: List[int]
        :rtype: int
        """
        # For each value, track [count, first index, last index].
        cnt = {}
        f = 0  # degree of the array: the maximum frequency of any value
        for i, n in enumerate(nums):
            if n not in cnt:
                cnt[n] = [1, i, i]
            else:
                c, a, _ = cnt[n]
                cnt[n] = [c + 1, a, i]
            f = max(f, cnt[n][0])
        # Shortest span among the values that attain the degree.
        return min(v[2] - v[1] + 1 for v in cnt.values() if v[0] == f)
```
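As a quick sanity check, the same logic can be run as a standalone function (the sample arrays below are illustrative, not from the contest):

```python
def find_shortest_subarray(nums):
    # Track (count, first index, last index) per value; the degree is the max count.
    cnt = {}
    degree = 0
    for i, n in enumerate(nums):
        c, first, _ = cnt.get(n, (0, i, i))
        cnt[n] = (c + 1, first, i)
        degree = max(degree, c + 1)
    # Shortest window containing all occurrences of some most-frequent value.
    return min(last - first + 1 for c, first, last in cnt.values() if c == degree)

print(find_shortest_subarray([1, 2, 2, 3, 1]))        # 2: the subarray [2, 2]
print(find_shortest_subarray([1, 2, 2, 3, 1, 4, 2]))  # 6: the three 2s span indices 1..6
```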


# Support Vector Machine

## Intuition

In a binary classification problem, we can use logistic regression

$$h_\theta(x) = \frac{1}{1+e^{-\theta^T x}} = g(\theta^T x),$$

where $g$ is the sigmoid function, whose graph is shown in the figure below.

Then given input $x$, the model predicts $1$ if and only if $\theta^T x \ge 0$, in which case $h_\theta(x) = g(\theta^T x) \ge 0.5$; it predicts $0$ if and only if $\theta^T x < 0$. Moreover, based on the shape of the sigmoid function, if $\theta^T x \gg 0$, we are very confident that $y=1$; likewise, if $\theta^T x \ll 0$, we are very confident that $y=0$. Therefore, we hope that for the training set $\{(x^{(i)}, y^{(i)})\}_{i=1}^m$, we can find a $\theta$ such that $\theta^T x^{(i)} \gg 0$ whenever $y^{(i)}=1$ and $\theta^T x^{(i)} \ll 0$ whenever $y^{(i)}=0$.
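The decision rule above can be sketched in a few lines (the helper names `sigmoid` and `predict` are mine, not from the lecture):

```python
import math

def sigmoid(z):
    # g(z) = 1 / (1 + e^{-z})
    return 1.0 / (1.0 + math.exp(-z))

def predict(theta, x):
    # Predict 1 iff theta^T x >= 0, i.e. h_theta(x) >= 0.5.
    z = sum(t * xi for t, xi in zip(theta, x))
    return 1 if z >= 0 else 0

print(sigmoid(0))                         # 0.5: exactly on the decision boundary
print(predict([2.0, -1.0], [3.0, 1.0]))   # 1, since 2*3 - 1*1 = 5 >= 0
```

Note how quickly `sigmoid(z)` saturates: already at $z = 10$ it is above $0.9999$, which is why a large $\theta^T x$ corresponds to high confidence.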

# Generative Model

This article contains my notes on generative models for Lectures 5 and 6 of Machine Learning by Andrew Ng. What we do in logistic regression, via the generalized linear model, is approximate $P(y|x)$ from the given data. This kind of learning algorithm is discriminative: we predict $y$ directly from the input features $x$. A generative model, on the contrary, models $P(x|y)$, the probability of the features $x$ given the class $y$. In other words, we want to learn what the structure of the features looks like for each class $y$. If we also learn $P(y)$, we can easily recover $P(y|x)$; for example, in a binary classification problem,

$$P(y=1|x) = \frac{P(x|y=1)P(y=1)}{P(x)}, \label{eqn:bayes}$$

where $P(x) = P(x|y=0)P(y=0) + P(x|y=1)P(y=1)$.
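This recovery of $P(y=1|x)$ from the class-conditionals and the prior is direct to code (the function name is my own):

```python
def posterior_y1(px_given_y0, px_given_y1, p_y1):
    # Bayes' rule: P(y=1|x) = P(x|y=1) P(y=1) / P(x),
    # with P(x) expanded by the law of total probability.
    p_y0 = 1.0 - p_y1
    px = px_given_y0 * p_y0 + px_given_y1 * p_y1
    return px_given_y1 * p_y1 / px

# If x is four times as likely under y=1 and the prior is uniform:
print(posterior_y1(0.2, 0.8, 0.5))  # 0.8
```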

In this article, we are going to see two simple examples of generative models: Gaussian discriminant analysis and Naive Bayes.

# Generalized Linear Model (Examples)

This article is a companion to my other post, Generalized Linear Model. Here I will implement some of the learning algorithms from that post; to be more specific, I will work through examples of linear regression and logistic regression. A Google search turned up some very good example data sets to work with, among which the datasets collected by Larry Winner are excellent; they will be used in this article.

The implementations here use Python. The required third-party libraries are:

# Generalized Linear Model

This article on Generalized Linear Model (GLM) is based on the first four lectures of Machine Learning by Andrew Ng. But the structure of the article is quite different from the lecture. I will talk about exponential family of distributions first. Then I will discuss the general idea of GLM. Finally, I will try to derive some well known learning algorithms from GLM.

## Exponential Family

A family of distributions is an exponential family if it can be parametrized by a vector $\eta$ in the form $$P(y; \eta) = b(y)\exp(\eta^{T} T(y)-a(\eta)),$$ where $T(y)$ and $b(y)$ are (vector-valued) functions in terms of $y$, and $a(\eta)$ is a function in terms of $\eta$.

$\eta$ is called the natural parameter and $T(y)$ is called the sufficient statistic.
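As a standard example (worked out here for concreteness, not taken from the lecture verbatim), the Bernoulli distribution with mean $\phi$ fits this form:

$$P(y;\phi) = \phi^y(1-\phi)^{1-y} = \exp\left(y\log\frac{\phi}{1-\phi} + \log(1-\phi)\right),$$

so $b(y)=1$, $T(y)=y$, the natural parameter is $\eta = \log\frac{\phi}{1-\phi}$, and $a(\eta) = -\log(1-\phi) = \log(1+e^{\eta})$. Inverting $\eta$ gives $\phi = \frac{1}{1+e^{-\eta}}$, the sigmoid function, which is where the logistic link comes from.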

# Willpower

"Ego depletion" is a technique commonly used in experiments by the author and the researchers mentioned in the book. It also carries a warning: when willpower weakens, cravings grow stronger!

1. Your willpower is limited, and using it depletes it.
2. You draw willpower from the same account for all kinds of different tasks.

# Pelican Signals

Pelican's plugin system is implemented with blinker signals. All the signals Pelican provides can be found in signals.py. The purpose of this post is to record when each of these signals is emitted during a Pelican run.

(1) Pelican has a class called Pelican, which contains the main framework of the program. Once an instance pelican of this class finishes initializing (basic settings, loading plugins), the first signal is emitted:

```python
signals.initialized.send(pelican)
```
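To see the mechanics of `send`, here is a minimal pure-Python stand-in for a blinker signal; this is only a sketch of the connect/send pattern, not blinker's actual implementation:

```python
class Signal:
    """Minimal stand-in for a blinker signal: receivers subscribe with
    connect(), and send(sender) calls each receiver with the sender."""

    def __init__(self, name):
        self.name = name
        self.receivers = []

    def connect(self, receiver):
        self.receivers.append(receiver)
        return receiver  # allows use as a decorator

    def send(self, sender):
        # Return (receiver, result) pairs, like blinker does.
        return [(r, r(sender)) for r in self.receivers]

initialized = Signal('initialized')

@initialized.connect
def on_initialized(pelican):
    # A plugin would read settings off the pelican instance here.
    return 'plugin saw %s' % pelican

results = initialized.send('pelican-instance')
print(results[0][1])  # plugin saw pelican-instance
```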


# Lambda Calculus

This post is my note for the What is seminar on Lambda Calculus.

Lambda calculus was created by Alonzo Church in the 1930s, and he used it in 1936 to resolve the Entscheidungsproblem, a decision problem posed by David Hilbert. In the same year, Alan Turing independently resolved the Entscheidungsproblem using his invention, the Turing machine. Shortly after, Turing showed that these two models are actually equivalent as models of computation.

In this note, I will first give the formal definition of lambda expressions. Then, with the help of Python, I am going to show how to do Boolean algebra and basic arithmetic using lambda calculus, which to some extent illustrates that the Turing machine and lambda calculus are equivalent.

## Definition

Lambda calculus consists of lambda expressions and operations on them. There are three basic elements in a lambda expression:

1. variables: x, y, z, ...
2. symbols in abstraction: λ and .
3. parentheses for association: ()
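As a preview of the Boolean algebra to come, the Church encoding of booleans can be written directly with Python's `lambda`; the decoder `to_bool` is my helper for inspecting results:

```python
# Church booleans: TRUE selects its first argument, FALSE its second.
TRUE  = lambda x: lambda y: x
FALSE = lambda x: lambda y: y

# Boolean operators built purely from selection.
AND = lambda p: lambda q: p(q)(FALSE)   # if p then q else FALSE
OR  = lambda p: lambda q: p(TRUE)(q)    # if p then TRUE else q
NOT = lambda p: p(FALSE)(TRUE)          # swap the two branches

def to_bool(b):
    # Decode a Church boolean into a Python bool.
    return b(True)(False)

print(to_bool(AND(TRUE)(FALSE)))  # False
print(to_bool(OR(TRUE)(FALSE)))   # True
print(to_bool(NOT(FALSE)))        # True
```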