Normalize/Scale EEG data collected with the Looxid EEG Mask in Python

In this tutorial we will use a Python script to normalize, or scale, EEG data. The same script, however, works for any kind of numeric data.

Requirements

To follow along you need Anaconda with Spyder, the pandas and numpy packages (both ship with Anaconda), and the two CSV data files from Step 1.

Setup

To get access to Spyder, it is recommended to download Anaconda first. After the download has completed, you can run Anaconda and launch Spyder. If the Launch button does not show but an Install button shows instead, install Spyder first and then run it. Once Spyder opens, create a new file and save it in the same directory where your data is located. Now we are ready to start coding.

Step 1 Data

Let's assume we have collected data from two tasks, so we have two CSV files; in my case, these are two EEG data files in CSV format. The data is in a form that does not allow proper analysis, so we need to normalize or scale it first. We can write a script that goes through both files at the same time and saves two new CSV files with the normalized data for both tasks.
The files I used for the two tasks were collected from an EEG device and are in CSV format (a quick look at the raw data follows the download links):
Download EEG CSV File For Task 1

Download EEG CSV File For Task 2
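
To see why scaling is needed, you can first take a quick look at the raw values. The snippet below is a small, optional check; it assumes your file is named Task1_EEG.csv and contains band-power columns such as alpha_AF3 and beta_AF3, as in the script in Step 2.

    import pandas as pd

    # Load one of the raw EEG recordings and inspect the value ranges
    raw = pd.read_csv('Task1_EEG.csv')
    print(raw.head())                                  # first few rows of band-power values
    print(raw[['alpha_AF3', 'beta_AF3']].describe())   # raw magnitudes vary widely between columns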

Step 2 Go through both datasets and normalize the data

Now we can write a Python script that scales the data in both datasets and saves the results to the same folder. It is important to note that this script must be located in the same folder as the two datasets.
                                        
                                        """
                                        @author: Fjorda
                                        """

                                        import numpy as np
                                        import pandas as pd

                                        S1 = pd.read_csv('Task1_EEG.csv')
                                        S2 = pd.read_csv('Task2_EEG.csv')


                                        S = [S1, S2]
                                        attribs = ['alpha_AF3', 'beta_AF3', 'gamma_AF3', 'delta_AF3', 'theta_AF3',
                                                    'alpha_AF4', 'beta_AF4', 'gamma_AF4', 'delta_AF4', 'theta_AF4',
                                                    'alpha_FP1', 'beta_FP1', 'gamma_FP1', 'delta_FP1', 'theta_FP1',
                                                    'alpha_FP2', 'beta_FP2', 'gamma_FP2', 'delta_FP2', 'theta_FP2',
                                                    'alpha_AF7', 'beta_AF7', 'gamma_AF7', 'delta_AF7', 'theta_AF7',
                                                    'alpha_AF8', 'beta_AF8', 'gamma_AF8', 'delta_AF8', 'theta_AF8']

                                        X_max = -100000
                                        X_min =  100000
                                        for k in range(len(S)):
                                            Sk = S[k]
                                            x_max = -100000
                                            x_min =  100000
                                            for i in range(len(attribs)):
                                                x = Sk[attribs[i]]
                                                x = abs(x)
                                                x_max = max(x_max, max(x))
                                                x_min = min(x_min, min(x))

                                            X_max = max(X_max, x_max)
                                            X_min = min(X_min, x_min)
                                            
                                        for k in range(len(S)):
                                            Sk = S[k]
                                            for i in range(len(attribs)):
                                                x = Sk[attribs[i]]
                                                x = abs(x)
                                                
                                                x = (x - X_min)/(X_max - X_min)
                                                Sk[attribs[i]] = x
                                                
                                                filename = 'rTask' + str(k+1) + '_EEG.csv'
                                                pd.DataFrame.to_csv(Sk, filename)
                                        
                                    
After running the above script, the two scaled files, one for each task, are saved in our folder. Every band-power value has been mapped into the range [0, 1] using the same global minimum and maximum for both tasks, so the two tasks remain directly comparable. Have a look at the scaled files below (and the quick sanity check after the download links).
Download Scaled EEG CSV File For Task 1

Download Scaled EEG CSV File For Task 2
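
As a quick sanity check, you can reload one of the scaled files and confirm that all band-power values now fall between 0 and 1. This is a minimal sketch assuming the output file rTask1_EEG.csv produced by the script above.

    import pandas as pd

    # Load the scaled output for Task 1
    scaled = pd.read_csv('rTask1_EEG.csv')

    # Keep only the band-power columns (alpha/beta/gamma/delta/theta per electrode)
    bands = [c for c in scaled.columns
             if c.split('_')[0] in ('alpha', 'beta', 'gamma', 'delta', 'theta')]

    # Both numbers printed here should lie within [0, 1]
    print(scaled[bands].min().min(), scaled[bands].max().max())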

Conclusions

In this tutorial we used the Spyder application from Anaconda and a Python script to scale, or normalize, the data from two tasks at the same time.