# Data Mining

**John Samuel**

CPE Lyon

**Year**: 2018-2019

**Email**: john(dot)samuel(at)cpe(dot)fr

- Artificial Neural Networks
- Deep Learning
- Reinforcement Learning
- Data Licences, Ethics and Privacy

## Artificial Neural Networks

- Inspired by biological neural networks
- Collection of connected nodes called artificial neurons
- Artificial neurons can transmit signals from one to another (as in a synapse)
- The signal between artificial neurons is a real number
- The output of a neuron is computed by applying an activation function to the sum of its weighted inputs
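As a minimal sketch of such a neuron in Python (the sigmoid activation and the bias term are illustrative choices, not prescribed by the slides):

```python
import math

def neuron(inputs, weights, bias):
    """Artificial neuron: weighted sum of the inputs plus a bias,
    passed through a sigmoid activation function."""
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid maps z into (0, 1)

# With zero weights and zero bias, the weighted sum is 0 and the output is 0.5
print(neuron([1.0, 2.0], [0.0, 0.0], 0.0))
```

The real-valued output can then be fed as an input signal to the next connected neuron.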

## Perceptron

- Algorithm for supervised learning of binary classifiers
- A binary classifier maps an input feature vector to a single binary output

- Let *y = f(z)* be the output of the perceptron for an input vector *z*
- Let **N** be the number of training examples
- Let **X** be the input feature space
- Let {*(x_1, d_1), ..., (x_N, d_N)*} be the **N** training examples, where
  - *x_i* is the feature vector of the *i*-th training example
  - *d_i* is the desired output value
  - *x_{j,i}* is the *i*-th feature of the *j*-th training example
  - *x_{j,0}* = 1

- Weights are represented in the following manner:
  - *w_i* is the *i*-th value of the weight vector
  - *w_i(t)* is the *i*-th value of the weight vector at a given time *t*

1. Initialize the weights and the threshold
2. For each example *(x_j, d_j)* in the training set:
   - Calculate the output: *y_j(t) = f[w(t) · x_j]*
   - Update the weights: *w_i(t + 1) = w_i(t) + (d_j − y_j(t)) x_{j,i}*
3. Repeat step 2 until the iteration error *(1/N) Σ |d_j − y_j(t)|* is less than a user-specified threshold
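The training loop above can be sketched in a few lines of Python, assuming a Heaviside step activation and the constant feature *x_{j,0}* = 1 acting as the threshold; learning the logical AND function is just an illustrative dataset:

```python
def predict(w, x):
    """y = f[w . x] with a Heaviside step activation."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else 0

def perceptron_train(examples, n_features, max_epochs=100):
    """examples: list of (x_j, d_j) pairs; x_j[0] = 1 encodes the threshold."""
    w = [0.0] * n_features          # 1. initialize the weights
    for _ in range(max_epochs):     # 3. repeat until no errors remain
        errors = 0
        for x, d in examples:       # 2. for each training example
            y = predict(w, x)       #    calculate the output
            if y != d:
                errors += 1
                # w_i(t+1) = w_i(t) + (d_j - y_j(t)) x_{j,i}
                w = [wi + (d - y) * xi for wi, xi in zip(w, x)]
        if errors == 0:
            break
    return w

# Logical AND, with the constant feature x_0 = 1 prepended to each input.
data = [([1, 0, 0], 0), ([1, 0, 1], 0), ([1, 1, 0], 0), ([1, 1, 1], 1)]
w = perceptron_train(data, 3)
```

Because AND is linearly separable, the loop converges to zero error; on non-separable data (e.g. XOR) it would run until `max_epochs`.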

## Backpropagation

- Backward propagation of errors
- Adjusts the weights of the neurons by computing the gradient of the loss function
- The error is calculated at the output and propagated back through the network layers
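A tiny worked example of this backward pass, assuming one sigmoid hidden unit feeding one sigmoid output unit, a single weight per layer, and a squared-error loss (all names and values are illustrative):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(x, w1, w2):
    """One hidden unit feeding one output unit (weights only, no biases)."""
    h = sigmoid(w1 * x)       # hidden activation
    y = sigmoid(w2 * h)       # output activation
    return h, y

def backprop_step(x, d, w1, w2, lr=0.5):
    """One gradient-descent step on the squared error E = (d - y)^2 / 2."""
    h, y = forward(x, w1, w2)
    # Output-layer error term: dE/dz2 = (y - d) * y * (1 - y)
    delta2 = (y - d) * y * (1 - y)
    # Error propagated back to the hidden layer: dE/dz1 = delta2 * w2 * h * (1 - h)
    delta1 = delta2 * w2 * h * (1 - h)
    # Gradient step on each weight
    return w1 - lr * delta1 * x, w2 - lr * delta2 * h

x, d = 1.0, 1.0
w1, w2 = 0.5, 0.5
_, y0 = forward(x, w1, w2)
w1, w2 = backprop_step(x, d, w1, w2)
_, y1 = forward(x, w1, w2)
# After one step the output has moved toward the target d = 1
```

Deep-learning libraries compute exactly these per-layer error terms automatically, for arbitrarily many layers and weights.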

## Deep Learning

- Multiple hidden layers between the input and output layers

### Applications

- Computer vision
- Speech recognition
- Drug design
- Natural language processing
- Machine translation

## Convolutional Neural Networks

- Analysis of images
- Inspired by neurons in the visual cortex
- The network learns the filters itself, rather than relying on hand-engineered ones
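A minimal sketch of the convolution such a network applies to an image, with a hand-written vertical-edge filter standing in for a learned one:

```python
def convolve2d(image, kernel):
    """Valid 2-D convolution (strictly, cross-correlation, as in most CNN
    libraries): slide the kernel over the image and sum the products."""
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(ih - kh + 1):
        row = []
        for j in range(iw - kw + 1):
            s = sum(image[i + a][j + b] * kernel[a][b]
                    for a in range(kh) for b in range(kw))
            row.append(s)
        out.append(row)
    return out

# A vertical-edge filter responds strongly at the dark/bright boundary.
image = [[0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1]]
edge = [[-1, 1],
        [-1, 1]]
print(convolve2d(image, edge))
```

In a CNN the kernel values are not fixed in advance like this: they are weights, learned by backpropagation like any other.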

## Reinforcement Learning

- Inspired by behaviourist psychology
- Learning which actions to take in order to maximize the cumulative reward
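A minimal tabular Q-learning sketch in Python, one common reinforcement-learning algorithm (the corridor environment, constants, and hyper-parameters are all illustrative assumptions):

```python
import random

random.seed(0)

# Tiny corridor MDP: states 0..3, actions 0 = left, 1 = right.
# Reaching state 3 yields reward 1 and ends the episode.
N_STATES, N_ACTIONS, GOAL = 4, 2, 3

def step(state, action):
    nxt = max(0, state - 1) if action == 0 else min(GOAL, state + 1)
    reward = 1.0 if nxt == GOAL else 0.0
    return nxt, reward, nxt == GOAL

def q_learning(episodes=200, alpha=0.5, gamma=0.9, eps=0.1):
    q = [[0.0] * N_ACTIONS for _ in range(N_STATES)]
    for _ in range(episodes):
        s, done = 0, False
        while not done:
            # Epsilon-greedy: mostly exploit the best known action, sometimes explore
            if random.random() < eps:
                a = random.randrange(N_ACTIONS)
            else:
                a = max(range(N_ACTIONS), key=lambda k: q[s][k])
            s2, r, done = step(s, a)
            # Q-learning update: move Q(s,a) toward r + gamma * max_a' Q(s',a')
            q[s][a] += alpha * (r + gamma * max(q[s2]) - q[s][a])
            s = s2
    return q

q = q_learning()
# The greedy policy now prefers "right" in every non-terminal state
```

The discount factor `gamma` makes the agent value the *cumulative* (not just immediate) reward, so even states far from the goal learn to move toward it.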

## Data Licences, Ethics and Privacy

- Data usage licences
- Confidentiality and privacy
- Ethics

## The Five Vs of Big Data

- Volume
- Variety
- Velocity
- Veracity
- Value