BADMINTON STROKE CLASSIFICATION

AN END TO END MACHINE LEARNING PROJECT

This project is about classifying the type of badminton stroke a player played, using data collected from a small device attached to the player's wrist that gives us accelerometer and gyroscope readings. It was part of a Data Analytics course we did at IISc, and it involved everything from data collection and cleaning up to the final ML model evaluation.

This Jupyter notebook explains what we did and how we did it (the code).

Defining the problem

To start off, we first need to define and properly formulate the problem we want to solve. Since this was a course project, we could have chosen any problem; we settled on classifying the kind of stroke a player is playing based on the sensor data from her smartwatch. This is an extension of Human Activity Recognition (HAR), which uses similar data to classify activities such as running, walking, etc. In our case, we decided on the following badminton strokes as classes:

  • Backhand Overarm (bo)
  • Backhand Underarm (bu)
  • Forehand Overarm (fo)
  • Forehand Underarm (fu)
  • Forehand Smash (fs)

The two letters in parentheses are the abbreviations of the strokes, which we'll use throughout this notebook.

Data

To the best of our knowledge, there was no publicly available data for this problem. There have been some research papers, but we couldn't get our hands on their data. So we decided to collect it ourselves (why not? we could give it a try).

Setup for collecting data

We got hold of a device that can give us accelerometer and gyroscope readings over a period of time at a decent frequency. The device was powered by a small cell and connected via Bluetooth Low Energy (BLE) to a Raspberry Pi, so we collected our data on the Raspberry Pi in CSV format.

Collecting data

Thanks to our 5 volunteers, we collected data for the 5 strokes above by making them play the same shot repeatedly. Although this can be quite different from a real game, we tried to make them play as they would in one (i.e., starting from the center position, moving towards the shuttlecock, and returning to the resting position). The device had its own limitations (coverage, power, etc.), which made collection challenging. After working through all of this, we finally had our data.

Visualizing the data

In [1]:
# Some imports
import pandas as pd
pd.set_option('display.max_columns', None)
import numpy as np
import itertools
import matplotlib
import matplotlib.pyplot as plt
import seaborn as sns
from scipy.signal import find_peaks  # For finding peaks (used later for shot extraction)
import warnings
warnings.filterwarnings("ignore", category=FutureWarning)
import pickle

The data is saved in the data folder: 25 different CSV files, one per player per class (5 players × 5 classes). The naming convention is playerid_strokename; e.g., p2_bu.csv contains the data of player 2 for "backhand underarm".

The columns in each file are: timestamp, accelerometer x-, y- and z-axis (ax, ay, az), and gyroscope x-, y- and z-axis (gx, gy, gz). We also add the magnitudes of the accelerometer and gyroscope vectors, e.g. acc_mag = sqrt(ax^2 + ay^2 + az^2).

In [2]:
# Columns in data files
cols = ['ax', 'ay', 'az', 'gx', 'gy', 'gz']  # Accelerometer and gyroscope in 3 axis
addedCols = ['acc_mag', 'gyro_mag']          # Magnitude of acc and gyro
In [3]:
# Data reading
data = {}
persons = ["p1", "p2", "p3", "p4", "p5"]
shots = ["bo", "bu", "fo", "fs", "fu"]
for person in persons:
    for shot in shots:
        key = person + "_" + shot
        df = pd.read_csv("data/" + key + ".csv")
        df['acc_mag'] = np.linalg.norm(df[cols[:3]].values, axis=1)    # sqrt(ax^2 + ay^2 + az^2)
        df['gyro_mag'] = np.linalg.norm(df[cols[3:]].values, axis=1)   # sqrt(gx^2 + gy^2 + gz^2)
        data[key] = df

Let's plot one of the CSVs, say p5_bo.

In [4]:
plotName = "p5_bo"

# Set figure width as per the number of readings
plt.figure(figsize=(len(data[plotName])/10, 8), dpi=150)

# Plot for each accelerometer column
for col in cols[0:3]:
    plt.plot(data[plotName][col], label=col)

plt.xlim(xmin=0)
plt.xticks(np.arange(0, len(data[plotName])+1, 5.0))
plt.title(plotName + "_acc")
plt.legend()
plt.show()

Data cleaning

The peaks in the plot above are the shots, with rest periods in between. The shots are not at uniform intervals, and sometimes the rest period is very long (as between frames 160 and 200 in the plot above). So we need to extract the shots from this continuous signal and remove the players' resting periods.

The approach we took was to take a window around the peaks in the signal. Many design decisions had to be made: should the window be dynamic or fixed? If fixed, what window size? Peaks in which signal: accelerometer or gyroscope, x, y, or z? One signal or several? When should a spike count as a peak? Any thresholding? Any noise-reduction algorithm?

We did some EDA by plotting all the data and chose a fixed window size of 13, based on our observations. We chose the ay signal for extracting peaks and windows, with a different threshold value for each class of stroke.

In [5]:
# Size of the window; common to all shots
windowSize = 13
# shots = ["bo", "bu", "fo", "fs", "fu"] for reference
# Sensor used for peak thresholding and the threshold value for each shot, in the order of the shots list
sensorToThreshold = ['ay', 'ay', 'ay', 'ay', 'ay']
threshold = [1.25, 1.25, 1.5, 1.5, 1.5]
In [6]:
# DATA PLOTTING - This will save all the plots in the plots folder

def plot_with_windows(plotName, columns, labels, peaks, suffix):
    # Plot the given columns, shade a window around each peak, and save to plots/
    plt.figure(figsize=(len(data[plotName])/10, 5), dpi=100)  # Figure width as per the number of readings
    for col, label in zip(columns, labels):
        plt.plot(data[plotName][col], label=label)
    # Draw a window around each peak
    for peak in peaks:
        plt.axvspan(int(peak - windowSize/2), int(peak + windowSize/2),
                    alpha=.1, facecolor='g', edgecolor='black')
    plt.xlim(xmin=0)
    plt.xticks(np.arange(0, len(data[plotName])+1, 5.0))
    plt.title(plotName + "_" + suffix)
    plt.legend()
    plt.savefig("plots/" + plotName + "_" + suffix)
    plt.close()

for p_name in persons:
    for s_name in shots:
        s_index = shots.index(s_name)  # Index of this shot in the shots list
        plotName = p_name + "_" + s_name

        # Find peaks with the threshold of this shot, in the sensor used for thresholding
        peaks, _ = find_peaks(data[plotName][sensorToThreshold[s_index]], height=threshold[s_index])

        plot_with_windows(plotName, cols[0:3], cols[0:3], peaks, "acc")    # Accelerometer
        plot_with_windows(plotName, cols[3:], cols[3:], peaks, "gyro")     # Gyroscope
        plot_with_windows(plotName, [addedCols[0]], ["Magnitude Accelerometer"], peaks, "mag_acc")
        plot_with_windows(plotName, [addedCols[1]], ["Magnitude Gyroscope"], peaks, "mag_gyro")

Now this is what our extracted shots look like.

Accelerometer:

Gyroscope:

There are some overlaps, either because shots were played very close together or because some shots show double peaks. And some shots are missed because their peaks fall below the threshold. But this is what we found to give a reasonable balance between extracting good shots and keeping out the noisy ones; a quick check is sketched below.
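
As a quick sanity check, a small sketch like the following (reusing the peak detection and windowSize from above) counts how many extracted windows overlap the next one:

# Count adjacent peaks closer than one window, i.e. windows that overlap their neighbour
overlapping, total = 0, 0
for p_name in persons:
    for s_name in shots:
        s_index = shots.index(s_name)
        series = data[p_name + "_" + s_name][sensorToThreshold[s_index]]
        peaks, _ = find_peaks(series, height=threshold[s_index])
        total += len(peaks)
        overlapping += int(np.sum(np.diff(peaks) < windowSize))
print(overlapping, "of", total, "windows overlap their neighbour")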

We'll save the start and end frame of each extracted shot in an X_y dataframe, along with the true labels. Later on, we'll augment this dataframe with hand-engineered features. We also save the person id, although it's not used for classification.

In [7]:
# Final data frame: begin and end frame of each window, person id and shot name;
# the hand-engineered features will be added to it later
rows = []
for p_name in persons:
    for s_name in shots:
        s_index = shots.index(s_name)  # Index of this shot in the shots list
        plotName = p_name + "_" + s_name
        # Find peaks for the windows
        # TODO: the peak properties (second return value) could yield more features
        timeSeries = data[plotName][sensorToThreshold[s_index]]
        peaks, _ = find_peaks(timeSeries, height=threshold[s_index])
        for peak in peaks:
            # Skip peaks whose window would fall outside the recording
            if peak < windowSize/2 or peak > len(timeSeries) - windowSize/2:
                continue
            rows.append({'StartFrame': int(peak - windowSize/2),
                         'EndFrame': int(peak + windowSize/2),
                         'PersonID': p_name,
                         'ShotName': s_name})

X_y = pd.DataFrame(rows, columns=['StartFrame', 'EndFrame', 'PersonID', 'ShotName'])

In [8]:
X_y
Out[8]:
StartFrame EndFrame PersonID ShotName
0 4 17 p1 bo
1 7 20 p1 bo
2 22 35 p1 bo
3 60 73 p1 bo
4 75 88 p1 bo
5 77 90 p1 bo
6 94 107 p1 bo
7 143 156 p1 bo
8 162 175 p1 bo
9 179 192 p1 bo
10 181 194 p1 bo
11 198 211 p1 bo
12 220 233 p1 bo
13 224 237 p1 bo
14 6 19 p1 bu
15 27 40 p1 bu
16 41 54 p1 bu
17 94 107 p1 bu
18 109 122 p1 bu
19 127 140 p1 bu
20 143 156 p1 bu
21 158 171 p1 bu
22 175 188 p1 bu
23 193 206 p1 bu
24 215 228 p1 bu
25 232 245 p1 bu
26 257 270 p1 bu
27 259 272 p1 bu
28 272 285 p1 bu
29 277 290 p1 bu
... ... ... ... ...
678 114 127 p5 fs
679 125 138 p5 fs
680 141 154 p5 fs
681 157 170 p5 fs
682 168 181 p5 fs
683 190 203 p5 fs
684 218 231 p5 fs
685 237 250 p5 fs
686 249 262 p5 fs
687 277 290 p5 fs
688 294 307 p5 fs
689 313 326 p5 fs
690 321 334 p5 fs
691 340 353 p5 fs
692 372 385 p5 fs
693 382 395 p5 fs
694 398 411 p5 fs
695 3 16 p5 fu
696 18 31 p5 fu
697 31 44 p5 fu
698 45 58 p5 fu
699 69 82 p5 fu
700 81 94 p5 fu
701 94 107 p5 fu
702 119 132 p5 fu
703 132 145 p5 fu
704 149 162 p5 fu
705 179 192 p5 fu
706 192 205 p5 fu
707 241 254 p5 fu

708 rows × 4 columns

We have extracted the shots from the continuous data we had. Total samples: 708 (not much, but let's see). Now let's check how much imbalance there is in our data, handle it if needed, and then move on to adding features.

In [9]:
sns.set_style('whitegrid')
plt.figure(figsize=(16,8))
plt.title('Data provided by each user', fontsize=20)
sns.countplot(x='PersonID', hue='ShotName', data = X_y)
plt.savefig("plots/data_count")

plt.show()
In [10]:
shotFullName = {
    'fo': "Overhead Forehand",
    'bo': "Overhead Backhand",
    'fs': "Forehand Smash",
    'fu': "Underarm Forehand Stroke",
    'bu': "Underarm Backhand Stroke",
}
plt.title('No of Datapoints per Stroke', fontsize=15)
ax = sns.countplot(x=[shotFullName[i] for i in X_y.ShotName])
ax.set_xticklabels([shotFullName[i] for i in shots], rotation=30, horizontalalignment='right')

plt.tight_layout()
plt.savefig("plots/stroke_count")
plt.show()

There is a big difference in the amount of data we collected from each person (we didn't realize this during collection). But there isn't much imbalance between the classes (there is some, but not much), so we'll work with the data as it is.
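
To put numbers on the imbalance, the per-class and per-person counts can also be printed directly:

# Quick numeric check of the imbalance
print(X_y['ShotName'].value_counts())
print(X_y['PersonID'].value_counts())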

Adding Features for classical ML models

We will use this X_y dataframe to store all the features for all the samples, and then use it to train classical machine learning models.

After some research, we found the following features that can be added to such time-series data:

  • Range
  • Minimum
  • Maximum
  • Average
  • Absolute average
  • Kurtosis (Fisher)
  • Kurtosis (Pearson)
  • Skewness (test statistic and p-value)
  • Entropy
  • Standard deviation
  • Angle between two axes
  • Interquartile range
  • Max-min relative position (argmax - argmin, to see whether the max comes before the min or vice versa)
In [11]:
# List of features we'll add
features = []
In [12]:
# Some helper functions

# Add feature which depends only on one sensor, like range
def add_feature(fname, sensor):
    v = [fname(data[str(row['PersonID']) + "_" + str(row['ShotName'])][int(row['StartFrame']):int(row['EndFrame'])],
              sensor)
            for index, row in X_y.iterrows()]
    X_y[fname.__name__ + str(sensor)] = v
    if(fname.__name__ + str(sensor) not in features):
        features.append(fname.__name__ + str(sensor))
    print("Added feature " + fname.__name__ + str(sensor) + " for " + str(len(v)) + " rows.")
    
# Add feature which depends on more than one sensor, like magnitude
def add_feature_mult_sensor(fname, sensors):
    v = [fname(data[str(row['PersonID']) + "_" + str(row['ShotName'])][int(row['StartFrame']):int(row['EndFrame'])],
              sensors)
             for index, row in X_y.iterrows()]
    
    name = "_".join(sensors)
    X_y[fname.__name__ + name] = v
    if(fname.__name__ + name not in features):
        features.append(fname.__name__ + name)
    print("Added feature " + fname.__name__ + name + " for " + str(len(v)) + " rows.")
In [13]:
# Range 
def range_(df, sensor):
    return np.max(df[sensor]) - np.min(df[sensor])
for sensor in cols + addedCols:
    add_feature(range_, sensor)
Added feature range_ax for 708 rows.
Added feature range_ay for 708 rows.
Added feature range_az for 708 rows.
Added feature range_gx for 708 rows.
Added feature range_gy for 708 rows.
Added feature range_gz for 708 rows.
Added feature range_acc_mag for 708 rows.
Added feature range_gyro_mag for 708 rows.
In [14]:
# Minimum
def min_(df, sensor):
    return np.min(df[sensor])
for sensor in cols + addedCols:
    add_feature(min_, sensor)
Added feature min_ax for 708 rows.
Added feature min_ay for 708 rows.
Added feature min_az for 708 rows.
Added feature min_gx for 708 rows.
Added feature min_gy for 708 rows.
Added feature min_gz for 708 rows.
Added feature min_acc_mag for 708 rows.
Added feature min_gyro_mag for 708 rows.
In [15]:
# Maximum
def max_(df, sensor):
    return np.max(df[sensor])
for sensor in cols + addedCols:
    add_feature(max_, sensor)
Added feature max_ax for 708 rows.
Added feature max_ay for 708 rows.
Added feature max_az for 708 rows.
Added feature max_gx for 708 rows.
Added feature max_gy for 708 rows.
Added feature max_gz for 708 rows.
Added feature max_acc_mag for 708 rows.
Added feature max_gyro_mag for 708 rows.
In [16]:
# Average
def avg_(df, sensor):
    return np.mean(df[sensor])
for sensor in cols + addedCols:
    add_feature(avg_, sensor)
Added feature avg_ax for 708 rows.
Added feature avg_ay for 708 rows.
Added feature avg_az for 708 rows.
Added feature avg_gx for 708 rows.
Added feature avg_gy for 708 rows.
Added feature avg_gz for 708 rows.
Added feature avg_acc_mag for 708 rows.
Added feature avg_gyro_mag for 708 rows.
In [17]:
# Absolute Average
def absavg_(df, sensor):
    return np.mean(np.absolute(df[sensor]))
for sensor in cols + addedCols:
    add_feature(absavg_, sensor)
Added feature absavg_ax for 708 rows.
Added feature absavg_ay for 708 rows.
Added feature absavg_az for 708 rows.
Added feature absavg_gx for 708 rows.
Added feature absavg_gy for 708 rows.
Added feature absavg_gz for 708 rows.
Added feature absavg_acc_mag for 708 rows.
Added feature absavg_gyro_mag for 708 rows.
In [18]:
def kurtosis_f_(df , sensor):
    from scipy.stats import kurtosis 
    val = kurtosis(df[sensor],fisher = True)
    return val
for sensor in cols + addedCols:
    add_feature(kurtosis_f_, sensor)
Added feature kurtosis_f_ax for 708 rows.
Added feature kurtosis_f_ay for 708 rows.
Added feature kurtosis_f_az for 708 rows.
Added feature kurtosis_f_gx for 708 rows.
Added feature kurtosis_f_gy for 708 rows.
Added feature kurtosis_f_gz for 708 rows.
Added feature kurtosis_f_acc_mag for 708 rows.
Added feature kurtosis_f_gyro_mag for 708 rows.
In [19]:
def kurtosis_p_(df , sensor):
    from scipy.stats import kurtosis 
    val = kurtosis(df[sensor],fisher = False)
    return val
for sensor in cols + addedCols:
    add_feature(kurtosis_p_, sensor)
Added feature kurtosis_p_ax for 708 rows.
Added feature kurtosis_p_ay for 708 rows.
Added feature kurtosis_p_az for 708 rows.
Added feature kurtosis_p_gx for 708 rows.
Added feature kurtosis_p_gy for 708 rows.
Added feature kurtosis_p_gz for 708 rows.
Added feature kurtosis_p_acc_mag for 708 rows.
Added feature kurtosis_p_gyro_mag for 708 rows.
In [20]:
#skewness
def skewness_statistic_(df, sensor):
    if(len(df) == 0):
        print(df)
    from scipy.stats import skewtest 
    statistic, pvalue = skewtest(df[sensor], nan_policy='propagate')
    return statistic
for sensor in cols + addedCols:
    add_feature(skewness_statistic_, sensor)
Added feature skewness_statistic_ax for 708 rows.
Added feature skewness_statistic_ay for 708 rows.
Added feature skewness_statistic_az for 708 rows.
Added feature skewness_statistic_gx for 708 rows.
Added feature skewness_statistic_gy for 708 rows.
Added feature skewness_statistic_gz for 708 rows.
Added feature skewness_statistic_acc_mag for 708 rows.
Added feature skewness_statistic_gyro_mag for 708 rows.
In [21]:
def skewness_pvalue_(df, sensor):
    from scipy.stats import skewtest 
    statistic, pvalue = skewtest(df[sensor])
    return pvalue
for sensor in cols + addedCols:
    add_feature(skewness_pvalue_, sensor)
Added feature skewness_pvalue_ax for 708 rows.
Added feature skewness_pvalue_ay for 708 rows.
Added feature skewness_pvalue_az for 708 rows.
Added feature skewness_pvalue_gx for 708 rows.
Added feature skewness_pvalue_gy for 708 rows.
Added feature skewness_pvalue_gz for 708 rows.
Added feature skewness_pvalue_acc_mag for 708 rows.
Added feature skewness_pvalue_gyro_mag for 708 rows.
In [22]:
#entropy 
def entropy_(df, sensor):
    from scipy.stats import entropy
    ent = entropy(df[sensor])
    return ent
for sensor in addedCols:
    add_feature(entropy_, sensor)
Added feature entropy_acc_mag for 708 rows.
Added feature entropy_gyro_mag for 708 rows.
In [23]:
# Standard Deviation
def std_(df, sensor):
    return np.std(df[sensor])
for sensor in cols + addedCols:
    add_feature(std_, sensor)
Added feature std_ax for 708 rows.
Added feature std_ay for 708 rows.
Added feature std_az for 708 rows.
Added feature std_gx for 708 rows.
Added feature std_gy for 708 rows.
Added feature std_gz for 708 rows.
Added feature std_acc_mag for 708 rows.
Added feature std_gyro_mag for 708 rows.
In [24]:
#angle between two vectors
def anglebetween_(df, sensors):
    v1 = sensors[0]
    v2 = sensors[1]
    v1_u = df[v1] / np.linalg.norm(df[v1])
    v2_u = df[v2] / np.linalg.norm(df[v2])
    return np.arccos(np.clip(np.dot(v1_u, v2_u), -1.0, 1.0))
add_feature_mult_sensor(anglebetween_, ["ax", "ay"])
add_feature_mult_sensor(anglebetween_, ["ay", "az"])
add_feature_mult_sensor(anglebetween_, ["ax", "az"])
add_feature_mult_sensor(anglebetween_, ["gx", "gy"])
add_feature_mult_sensor(anglebetween_, ["gy", "gz"])
add_feature_mult_sensor(anglebetween_, ["gx", "gz"])
Added feature anglebetween_ax_ay for 708 rows.
Added feature anglebetween_ay_az for 708 rows.
Added feature anglebetween_ax_az for 708 rows.
Added feature anglebetween_gx_gy for 708 rows.
Added feature anglebetween_gy_gz for 708 rows.
Added feature anglebetween_gx_gz for 708 rows.
In [25]:
#inter quartile range
def iqr_(df, sensor):
    from scipy import stats
    return stats.iqr(df[sensor])
for sensor in cols + addedCols:
    add_feature(iqr_, sensor)
Added feature iqr_ax for 708 rows.
Added feature iqr_ay for 708 rows.
Added feature iqr_az for 708 rows.
Added feature iqr_gx for 708 rows.
Added feature iqr_gy for 708 rows.
Added feature iqr_gz for 708 rows.
Added feature iqr_acc_mag for 708 rows.
Added feature iqr_gyro_mag for 708 rows.
In [26]:
# Max position - min position (relative difference)
def maxmin_relative_pos_(df, sensor):
    return np.argmax(np.array(df[sensor])) - np.argmin(np.array(df[sensor]))
for sensor in cols + addedCols:
    add_feature(maxmin_relative_pos_, sensor)
Added feature maxmin_relative_pos_ax for 708 rows.
Added feature maxmin_relative_pos_ay for 708 rows.
Added feature maxmin_relative_pos_az for 708 rows.
Added feature maxmin_relative_pos_gx for 708 rows.
Added feature maxmin_relative_pos_gy for 708 rows.
Added feature maxmin_relative_pos_gz for 708 rows.
Added feature maxmin_relative_pos_acc_mag for 708 rows.
Added feature maxmin_relative_pos_gyro_mag for 708 rows.

Saving the processed data

Our final X_y, with all the hand-engineered features, is ready, and we can now do some "Machine Learning" on it. But before that, it's better to save X_y so that we don't have to redo the preprocessing and can do all the ML model testing in a different notebook. (For this article, we'll continue in the same notebook.)

In [27]:
X_y
Out[27]:
708 rows × 108 columns: StartFrame, EndFrame, PersonID, ShotName, plus the engineered feature columns (range_, min_, max_, avg_, absavg_, kurtosis_f_, kurtosis_p_, skewness_statistic_, skewness_pvalue_, std_, iqr_ and maxmin_relative_pos_ for each of the 8 signals; entropy_ for the 2 magnitudes; anglebetween_ for the 6 axis pairs).
688 294 307 p5 fs 1.50427 2.21387 2.28375 249.55750 248.90900 280.96008 2.111770 357.296954 -2.00000 -0.44574 -1.37280 -107.91016 -214.75983 -250.00000 0.890040 1.431012 -0.49573 1.76813 0.91095 141.64734 34.14917 30.96008 3.001810 358.727966 -0.995991 0.322374 0.165434 21.319462 -28.076761 -32.402039 1.202160 75.525473 0.995991 0.424688 0.376634 39.426950 37.071230 43.227561 1.202160 75.525473 4.320372 3.422502 4.179804 1.257188 4.773246 4.027874 7.465741 4.674924 7.320372 6.422502 7.179804 4.257188 7.773246 7.027874 10.465741 7.674924 -3.190563 2.722872 -3.078335 -0.228807 -3.576027 -3.441247 4.340613 3.615165 0.001420 0.006472 0.002082 0.819019 0.000349 0.000579 0.000014 0.000300 2.493692 2.041197 0.326510 0.489880 0.503222 54.828037 59.146562 70.704583 0.527188 89.956581 2.443802 2.212450 1.588390 2.210033 0.431367 2.068568 0.09876 0.12536 0.12812 54.63410 44.36493 46.66900 0.133559 65.753144 -1 1 -1 -1 -1 2 -3 5
689 313 326 p5 fs 3.99994 2.25226 2.24628 422.33276 399.30725 499.99237 2.829958 422.306376 -2.00000 -0.25232 -2.00000 -250.00000 -149.31488 -250.00000 0.634074 10.697515 1.99994 1.99994 0.24628 172.33276 249.99237 249.99237 3.464032 433.003892 -0.720055 0.428204 -0.070815 -11.535645 41.090157 -3.863995 1.228323 110.980965 1.027738 0.467022 0.285903 49.594586 67.143952 64.547025 1.228323 110.980965 5.425770 4.321404 6.976124 2.540893 0.421024 1.594411 4.618721 0.487988 8.425770 7.321404 9.976124 5.540893 3.421024 4.594411 7.618721 3.487988 3.551178 3.360891 -4.223095 -1.542941 1.422166 0.049408 3.722835 2.445360 0.000384 0.000777 0.000024 0.122845 0.154978 0.960595 0.000197 0.014471 2.438486 1.888617 0.850958 0.508262 0.572089 88.446914 103.810097 107.913853 0.712615 140.585990 1.432393 2.434034 2.193008 1.796883 1.701249 2.509556 0.09131 0.18372 0.25202 19.99664 53.31421 22.39227 0.150540 150.468537 1 1 -4 -1 -4 1 -2 4
690 321 334 p5 fs 1.52545 2.60059 0.56653 366.27960 399.30725 464.46228 2.197516 418.151513 -2.00000 -0.60065 -0.23914 -250.00000 -149.31488 -214.46991 0.634074 14.852379 -0.47455 1.99994 0.32739 116.27960 249.99237 249.99237 2.831590 433.003892 -1.018743 0.621446 0.037484 -9.218070 4.710857 -6.659288 1.286631 136.777836 1.018743 0.713854 0.131264 58.008635 86.381765 74.188820 1.286631 136.777836 0.955928 1.196869 -0.828892 1.667946 -0.246189 1.185705 1.519183 1.286147 3.955928 4.196869 2.171108 4.667946 2.753811 4.185705 4.519183 4.286147 -2.560918 0.939611 0.206112 -1.892243 1.163875 0.902612 2.712460 2.350978 0.010440 0.347417 0.836703 0.058459 0.244475 0.366732 0.006679 0.018724 2.479125 2.259725 0.441975 0.583303 0.157094 89.711010 110.099970 104.464114 0.574700 111.920305 2.343951 1.305527 1.808029 2.116285 1.416357 2.295076 0.23498 0.39838 0.21759 33.82111 132.54547 72.74628 0.119832 120.478664 -5 1 4 -1 5 1 6 3
691 340 353 p5 fs 1.66425 3.57446 1.29052 387.22229 392.84515 499.99237 2.462572 317.580849 -2.00000 -1.57452 -1.06152 -192.13867 -175.14038 -250.00000 0.558452 1.613717 -0.33575 1.99994 0.22900 195.08362 217.70477 249.99237 3.021024 319.194566 -1.041458 0.359459 -0.060369 3.242493 -1.815211 0.083336 1.263940 111.041332 1.041458 0.601693 0.188608 54.265536 57.246869 55.702796 1.263940 111.041332 1.861378 3.025460 4.781647 1.034625 1.077960 2.290568 2.671172 -1.150060 4.861378 6.025460 7.781647 4.034625 4.077960 5.290568 5.671172 1.849940 -1.665514 -1.085631 -3.628847 -0.048105 1.277228 -0.090855 3.088669 1.350886 0.095810 0.277642 0.000285 0.961633 0.201522 0.927608 0.002011 0.176732 2.467634 1.974600 0.364262 0.721873 0.316807 87.100129 90.504704 103.566210 0.611637 119.109748 2.093414 2.171277 1.176699 1.667538 2.063712 2.413969 0.22784 0.05267 0.21679 16.92200 40.19165 16.59393 0.175514 226.652612 1 1 -7 -2 2 1 -1 7
692 372 385 p5 fs 1.68970 3.17719 0.95850 300.63629 339.33258 499.99237 2.443822 354.503576 -2.00000 -1.17725 -0.63947 -102.89764 -250.00000 -250.00000 0.455951 3.627900 -0.31030 1.99994 0.31903 197.73865 89.33258 249.99237 2.899773 358.131476 -0.919995 0.369182 0.005324 15.397879 -20.719675 -25.177002 1.187254 100.275438 0.919995 0.550298 0.284875 44.942416 44.742878 67.745502 1.187254 100.275438 0.594703 1.866701 -0.682350 1.676324 3.209174 2.102125 3.586829 0.308598 3.594703 4.866701 2.317650 4.676324 6.209174 5.102125 6.586829 3.308598 -1.610759 0.435963 -1.563499 1.906851 -2.870361 1.020416 3.215952 2.326411 0.107232 0.662864 0.117935 0.056540 0.004100 0.307531 0.001300 0.019997 2.468973 1.981724 0.438493 0.673281 0.324443 68.869285 78.033959 106.129066 0.569201 115.502265 2.117046 2.140102 1.471385 0.965879 2.294610 2.126323 0.40076 0.26709 0.44250 44.30389 16.25824 75.28686 0.143270 89.104255 -3 1 -3 -2 -1 1 3 5
693 382 395 p5 fs 2.67474 2.30689 1.10382 311.27930 371.65070 486.47308 2.119160 423.319668 -2.00000 -0.30695 -0.58319 -61.28693 -250.00000 -250.00000 0.709378 9.688629 0.67474 1.99994 0.52063 249.99237 121.65070 236.47308 2.828538 433.008297 -0.853581 0.506075 -0.029594 21.621118 -42.981073 -10.942312 1.202506 116.190181 0.957387 0.553298 0.169495 50.286515 73.132441 65.026505 1.202506 116.190181 1.609860 1.475017 1.054715 3.642578 0.436607 1.886343 4.988097 1.360611 4.609860 4.475017 4.054715 6.642578 3.436607 4.886343 7.988097 4.360611 1.224129 2.134606 -0.206479 3.100144 -1.702498 0.179599 3.684193 2.825756 0.220903 0.032793 0.836417 0.001934 0.088662 0.857467 0.000229 0.004717 2.492373 2.120290 0.593201 0.555815 0.246675 75.673842 100.218926 102.796258 0.511110 123.579350 2.484760 1.821043 1.241502 2.267372 1.634783 2.144819 0.36694 0.64550 0.18201 51.06354 58.24280 53.05481 0.304779 60.093052 -1 1 6 -3 -1 1 3 3
694 398 411 p5 fs 2.41070 2.44708 2.29364 384.07898 328.44543 362.99896 2.321445 420.701511 -1.47955 -0.44714 -0.29370 -235.81696 -250.00000 -113.00659 0.656232 4.275928 0.93115 1.99994 1.99994 148.26202 78.44543 249.99237 2.977677 424.977439 -0.640593 0.578833 0.312336 10.499808 -38.690422 -8.598915 1.201351 116.168368 0.783847 0.683692 0.386170 58.412406 71.274977 55.050190 1.201351 116.168368 3.139596 1.392368 4.405653 2.116181 0.059317 3.790638 5.679127 1.140841 6.139596 4.392368 7.405653 5.116181 3.059317 6.790638 8.679127 4.140841 2.838856 1.118818 3.423269 -2.103326 -2.010585 3.147756 3.893642 2.394373 0.004528 0.263218 0.000619 0.035437 0.044369 0.001645 0.000099 0.016649 2.484456 2.086692 0.537978 0.561795 0.542300 89.259133 102.547212 85.279220 0.546662 118.075040 1.716718 0.739842 1.531820 1.460703 2.012088 2.444296 0.09253 0.23932 0.31403 45.42541 90.31677 42.52625 0.088176 128.699486 1 1 1 2 -1 2 3 5
695 3 16 p5 fu 1.47668 1.54602 2.09100 429.12292 260.87189 281.96716 2.349910 358.554349 -1.29559 0.45392 -0.09106 -250.00000 -218.93311 -31.97479 0.496615 1.359791 0.18109 1.99994 1.99994 179.12292 41.93878 249.99237 2.846526 359.914140 -0.606767 0.790649 0.254498 0.460698 -38.906392 35.503095 1.157973 94.184554 0.634627 0.790649 0.304912 51.598768 48.087487 44.246380 1.157973 94.184554 0.361205 5.109279 4.430519 2.238215 0.742394 1.602324 4.507227 0.254747 3.361205 8.109279 7.430519 5.238215 3.742394 4.602324 7.507227 3.254747 0.605863 3.735648 3.632582 -1.648348 -2.428580 3.066772 3.525468 2.341386 0.544605 0.000187 0.000281 0.099281 0.015158 0.002164 0.000423 0.019212 2.476302 1.790188 0.347682 0.378487 0.558544 93.090539 72.805517 86.705895 0.541541 124.048633 2.439581 0.781517 1.754523 2.278393 2.090093 1.745760 0.41290 0.21038 0.20081 14.98413 68.42041 9.32312 0.088175 90.552758 -1 -1 -2 2 -1 -1 -1 9
696 18 31 p5 fu 0.83752 1.41364 1.15888 454.52881 499.99237 346.00067 1.497424 429.627375 -1.09564 0.58630 -0.14954 -250.00000 -250.00000 -96.00830 0.996360 3.376517 -0.25812 1.99994 1.00934 204.52881 249.99237 249.99237 2.493783 433.003892 -0.626882 0.968229 0.208074 4.838796 -31.731239 26.545012 1.212982 111.884180 0.626882 0.968229 0.234958 49.131541 77.917245 54.566016 1.212982 111.884180 -0.415063 4.172078 1.113699 3.100853 0.852833 0.890855 4.924122 0.168921 2.584937 7.172078 4.113699 6.100853 3.852833 3.890855 7.924122 3.168921 -1.158826 3.440813 2.449612 -1.506678 0.657711 2.281571 3.774639 2.239957 0.246527 0.000580 0.014301 0.131893 0.510724 0.022515 0.000160 0.025094 2.519646 1.834403 0.231693 0.334239 0.307701 92.567166 117.046241 93.648551 0.403108 142.321492 2.822352 0.800700 2.258248 2.717333 1.358881 1.773272 0.20435 0.12494 0.23621 12.02393 109.37500 30.67779 0.174117 135.935660 -1 -1 -3 1 -1 1 -4 5
697 31 44 p5 fu 2.16339 1.33002 0.98913 459.74731 388.37433 279.32739 1.316516 391.048565 -1.73987 0.66992 -0.10577 -250.00000 -195.91522 -29.33502 0.884168 11.489252 0.42352 1.99994 0.88336 209.74731 192.45911 249.99237 2.200684 402.537817 -0.404499 0.982395 0.140319 0.117962 -23.827772 32.272927 1.172342 100.390645 0.508248 0.982395 0.170067 51.334672 63.054598 46.357962 1.172342 100.390645 2.287888 5.121102 2.253189 2.909120 0.874133 1.642037 1.609205 0.912965 5.287888 8.121102 5.253189 5.909120 3.874133 4.642037 4.609205 3.912965 -1.881949 3.720042 2.976412 -1.148191 0.489399 3.008563 3.103867 2.690862 0.059843 0.000199 0.002916 0.250890 0.624559 0.002625 0.001910 0.007127 2.509387 1.955879 0.486191 0.318682 0.263602 93.439821 90.904384 84.176649 0.425967 124.945397 2.180151 0.872698 1.776374 2.553869 1.416671 1.758194 0.29254 0.11310 0.19110 31.02875 75.94299 15.40375 0.093831 74.923886 -1 1 -1 2 -2 -1 1 3
698 45 58 p5 fu 1.20673 1.58637 1.82086 349.67804 198.05146 295.71533 1.767802 334.061223 -1.63556 0.41357 -0.28125 -250.00000 -185.66132 -78.63617 0.866762 6.420334 -0.42883 1.99994 1.53961 99.67804 12.39014 217.07916 2.634563 340.481557 -0.588932 0.995665 0.207407 -1.652644 -62.042236 24.875346 1.265225 103.131326 0.588932 0.995665 0.258441 40.888859 65.362784 50.989005 1.265225 103.131326 6.470940 3.438881 2.431687 4.841932 -1.382371 0.841579 2.929300 -0.824248 9.470940 6.438881 5.431687 7.841932 1.617629 3.841579 5.929300 2.175752 -4.124948 2.699335 3.136805 -3.446421 -1.209956 2.420720 3.273037 1.554218 0.000037 0.006948 0.001708 0.000568 0.226296 0.015490 0.001064 0.120133 2.507226 1.935432 0.314480 0.341741 0.473857 78.873198 77.652799 86.202744 0.469723 116.260631 2.610048 1.017748 1.908615 1.725933 1.896395 2.052608 0.10645 0.06219 0.19422 13.09204 141.96777 26.34430 0.168585 175.439030 -6 1 -1 2 2 2 -4 6
699 69 82 p5 fu 0.92609 1.60431 0.99548 349.95270 354.40063 277.06909 1.452005 359.364040 -1.17041 0.39563 -0.11682 -250.00000 -196.02203 -67.37518 0.787704 3.341512 -0.24432 1.99994 0.87866 99.95270 158.37860 209.69391 2.239709 362.705551 -0.501235 0.922058 0.165711 -3.267728 -23.279043 28.847328 1.129473 98.632700 0.501235 0.922058 0.206182 40.539082 64.543503 46.970074 1.129473 98.632700 2.567992 4.446137 0.555225 4.419368 -0.147539 0.561026 5.513202 0.431971 5.567992 7.446137 3.555225 7.419368 2.852461 3.561026 8.513202 3.431971 -3.037800 3.209248 2.429445 -3.223296 -0.197750 2.341897 3.860129 2.180814 0.002383 0.001331 0.015122 0.001267 0.843241 0.019186 0.000113 0.029197 2.526650 1.987748 0.232413 0.348783 0.298481 79.775717 89.222231 77.034156 0.343417 109.157608 2.619728 1.045836 2.038466 2.449268 1.464202 1.983248 0.03967 0.20886 0.23236 13.49640 97.22138 9.20868 0.064975 137.515372 -3 1 -2 1 -1 1 2 7
700 81 94 p5 fu 1.22577 1.51904 1.55267 372.13898 344.24591 322.64709 1.799844 387.426092 -1.26520 0.48090 -0.37671 -250.00000 -172.08862 -72.65472 0.520543 5.809610 -0.03943 1.99994 1.17596 122.13898 172.15729 249.99237 2.320387 393.235702 -0.430428 0.932721 0.106164 -11.193495 -20.749017 29.621416 1.116082 96.506893 0.430428 0.932721 0.217979 44.254597 55.661126 51.328804 1.116082 96.506893 3.461460 4.403753 2.382071 2.991704 0.925268 1.605117 3.503179 1.316625 6.461460 7.403753 5.382071 5.991704 3.925268 4.605117 6.503179 4.316625 -2.909900 3.221791 2.920885 -2.758182 0.858543 2.739131 3.060898 2.550360 0.003615 0.001274 0.003490 0.005812 0.390592 0.006160 0.002207 0.010761 2.507606 1.992736 0.280705 0.345201 0.380437 82.764348 79.572164 83.421737 0.404426 110.727218 2.394218 1.081624 1.474020 2.455259 1.168834 2.069957 0.08435 0.09619 0.10676 7.56836 46.14257 49.88861 0.061489 116.900698 -2 -1 -2 2 2 1 -1 3
701 94 107 p5 fu 1.66601 1.66358 1.50628 432.15179 292.90771 277.49634 2.076707 362.938980 -1.52429 0.33636 -0.11395 -250.00000 -137.80212 -27.50397 0.379642 3.567399 0.14172 1.99994 1.39233 182.15179 155.10559 249.99237 2.456349 366.506379 -0.442868 0.956378 0.196504 -3.515977 -17.043481 34.084027 1.148788 101.025612 0.464671 0.956378 0.250658 49.459015 58.438225 47.829850 1.148788 101.025612 4.259990 4.276866 4.278747 2.760766 -0.063817 1.522542 2.345717 0.305391 7.259990 7.276866 7.278747 5.760766 2.936183 4.522542 5.345717 3.305391 -3.034476 2.850310 3.493072 -1.690301 0.584845 3.057354 2.554324 2.325555 0.002410 0.004368 0.000477 0.090970 0.558652 0.002233 0.010639 0.020042 2.488405 1.962664 0.354359 0.344662 0.384934 90.004129 77.859377 92.934915 0.471409 118.566915 2.439376 0.859855 1.872985 1.802575 2.138398 1.722939 0.13483 0.07702 0.35669 16.20483 70.03784 13.66425 0.081454 134.898264 -1 -1 4 2 3 1 -1 4
702 119 132 p5 fu 1.41938 1.71655 1.34949 357.39899 385.08606 297.56164 2.493884 429.484652 -1.67200 0.28339 -0.10620 -250.00000 -135.09369 -47.56927 0.394215 3.519239 -0.25262 1.99994 1.24329 107.39899 249.99237 249.99237 2.888099 433.003892 -0.580989 0.890381 0.186539 1.698422 -8.527316 30.921350 1.118666 91.953486 0.580989 0.890381 0.202878 41.296152 59.456458 46.621468 1.118666 91.953486 4.911958 3.452469 3.495880 4.970360 2.191444 1.961507 5.673185 2.696652 7.911958 6.452469 6.495880 7.970360 5.191444 4.961507 8.673185 5.696652 -3.738362 2.608093 3.468649 -3.461613 2.280737 2.811293 3.744180 2.978898 0.000185 0.009105 0.000523 0.000537 0.022564 0.004934 0.000181 0.002893 2.469384 1.881457 0.344061 0.376966 0.353406 79.677244 92.591841 78.754135 0.546887 117.047905 2.876548 0.859518 2.475066 2.660137 1.000725 2.198701 0.11212 0.08630 0.10022 21.04187 64.44549 27.54974 0.038353 144.247824 1 -1 -1 2 2 1 -1 7
703 132 145 p5 fu 1.35834 2.05310 0.96545 372.93243 271.50726 259.23157 1.876420 300.915973 -1.69641 -0.05316 -0.10675 -250.00000 -146.39282 -75.74463 0.883097 2.029390 -0.33807 1.99994 0.85870 122.93243 125.11444 183.48694 2.759517 302.945363 -0.631605 0.939270 0.195021 2.792358 -33.020606 15.280503 1.220235 94.481899 0.631605 0.947448 0.242244 44.616699 60.546288 41.086637 1.220235 94.481899 4.643195 3.180711 0.034637 4.631110 -0.733157 1.242598 6.995153 -0.664444 7.643195 6.180711 3.034637 7.631110 2.266843 4.242598 9.995153 2.335556 -3.685379 0.474220 2.051144 -3.287507 0.125731 2.209831 4.225476 1.512740 0.000228 0.635343 0.040253 0.001011 0.899945 0.027117 0.000024 0.130346 2.510734 2.034631 0.338826 0.408471 0.310402 81.201132 77.584624 65.483865 0.456305 96.467386 2.832273 1.025408 2.270798 2.398363 1.391613 1.924570 0.11004 0.10571 0.34125 18.75306 134.35364 32.73010 0.100989 147.711395 -1 1 6 2 2 3 1 5
704 149 162 p5 fu 1.42700 1.97888 1.78961 364.83765 210.99090 394.41681 1.454818 341.643571 -1.40851 0.02106 -0.52997 -250.00000 -155.10559 -238.99078 0.703426 5.156961 0.01849 1.99994 1.25964 114.83765 55.88531 155.42603 2.158244 346.800531 -0.508391 0.823266 0.204327 4.591135 -40.924072 -15.543425 1.142790 96.515313 0.511235 0.823266 0.285861 44.097313 55.938720 42.921213 1.142790 96.515313 4.273535 1.560028 2.889112 4.528096 -1.093038 3.002996 3.608717 0.778120 7.273535 4.560028 5.889112 7.528096 1.906962 6.002996 6.608717 3.778120 -2.877333 0.998138 2.040874 -3.255157 -0.426850 -1.677504 3.193002 2.166933 0.004011 0.318212 0.041263 0.001133 0.669489 0.093444 0.001408 0.030240 2.526799 2.068763 0.296889 0.460599 0.376320 82.014933 60.600476 80.972205 0.337523 97.865294 2.379215 1.491720 1.843796 2.046006 1.682529 0.572162 0.05451 0.19342 0.26959 29.85382 92.51404 18.52417 0.117532 124.772386 -2 -1 -1 1 7 1 -3 5
705 179 192 p5 fu 2.07898 1.41156 2.10010 358.79517 499.99237 301.77307 1.624224 354.621856 -1.45007 0.27551 -0.94806 -250.00000 -250.00000 -56.64062 0.513281 3.434487 0.62891 1.68707 1.15204 108.79517 249.99237 245.13245 2.137505 358.056343 -0.345575 0.912414 0.171918 -4.587613 -38.396395 18.143875 1.113144 109.730846 0.442330 0.912414 0.317774 35.386893 88.165871 33.497736 1.113144 109.730846 3.402980 2.243080 3.166331 5.410802 0.402882 5.686640 1.117388 -0.567754 6.402980 5.243080 6.166331 8.410802 3.402882 8.686640 4.117388 2.432246 -0.887520 0.871530 -0.901757 -3.550241 0.351213 3.822855 2.388114 1.822612 0.374799 0.383465 0.367186 0.000385 0.725428 0.000132 0.016935 0.068362 2.500278 1.899956 0.411655 0.298715 0.420159 76.769068 126.006822 69.909683 0.421440 128.227312 2.079732 1.052357 1.090171 2.405752 2.228339 1.306512 0.03949 0.04388 0.05988 12.18414 107.02515 16.38031 0.051038 144.465768 -2 -1 -2 2 -1 1 -1 5
706 192 205 p5 fu 0.54926 1.82813 1.64869 363.48724 493.01148 165.92407 1.179169 343.918505 -0.68732 0.17181 -0.49921 -250.00000 -244.11774 -97.43500 0.952884 15.440433 -0.13806 1.99994 1.14948 113.48724 248.89374 68.48907 2.132052 359.358938 -0.443595 0.949867 0.157983 -5.881677 -36.972633 -2.544110 1.146511 104.739716 0.443595 0.949867 0.234785 40.908812 81.754832 32.257667 1.146511 104.739716 0.801818 2.858941 3.392843 4.515269 1.176139 0.157434 6.260796 0.112243 3.801818 5.858941 6.392843 7.515269 4.176139 3.157434 9.260796 3.112243 0.615347 1.670843 2.310968 -3.199000 1.018922 -1.302970 4.062492 2.149780 0.538325 0.094753 0.020835 0.001379 0.308240 0.192585 0.000049 0.031573 2.536813 2.109725 0.128364 0.380622 0.343315 78.886077 114.088874 42.194626 0.297843 107.038801 2.783383 1.474828 1.838763 2.483829 1.012093 1.861385 0.08203 0.06885 0.20020 23.31543 74.66889 44.54803 0.156177 136.501610 -2 1 -1 1 -1 1 -1 4
707 241 254 p5 fu 1.11701 1.63764 1.19897 371.39893 455.12390 304.80957 1.353696 401.041711 -1.40381 0.36230 -0.11981 -250.00000 -205.13153 -63.26294 0.844207 1.949278 -0.28680 1.99994 1.07916 121.39893 249.99237 241.54663 2.197902 402.990989 -0.551857 0.963384 0.195567 -0.198950 -22.943935 29.769898 1.204626 112.685276 0.551857 0.963384 0.221352 38.674576 80.098665 54.600642 1.204626 112.685276 4.500257 3.822534 1.767414 5.226224 0.340941 0.932627 2.701235 -0.106865 7.500257 6.822534 4.767414 8.226224 3.340941 3.932627 5.701235 2.893135 -3.558642 2.771928 2.784480 -3.444438 0.729491 2.545526 3.105804 1.945050 0.000373 0.005573 0.005361 0.000572 0.465701 0.010911 0.001898 0.051769 2.529349 1.957082 0.272123 0.347257 0.321865 78.769760 116.284978 86.471981 0.343529 126.170574 2.627626 1.052050 1.938188 2.495380 1.372489 2.022928 0.23480 0.16956 0.23352 20.19501 84.08356 50.77362 0.147301 167.058223 -2 1 -2 2 -2 2 2 4

708 rows × 108 columns

In [28]:
# Total number of features
len(features)
Out[28]:
104
In [29]:
# Save all the features in a txt file for later use.
with open('data/features.txt', 'w') as f:
    for feature in features:
        f.write("%s\n" % feature)
In [30]:
# Save X_y as csv file for using in (classical) ML models
X_y.to_csv('data/X_y.csv', index=False) 

Classical ML models

Now it's time for some machine learning! We'll load the saved X_y, apply some classical machine learning models, and assess their performance by plotting confusion matrices. For parameter tuning in each model, we used grid search with cross-validation.

In [40]:
# Read features
with open('data/features.txt') as f:
    features = f.read().strip().split("\n")

# Load data
X_y = pd.read_csv('data/X_y.csv')
X_y = X_y.dropna()
shot_labels = X_y.ShotName.unique()

# Train Test split Randomly:
from sklearn.model_selection import train_test_split
train, test = train_test_split(X_y, test_size=0.2, random_state=42)

X_train = train[features].values
Y_train = train["ShotName"].values
X_test  = test[features].values
Y_test  = test["ShotName"].values
In [41]:
# Helper function for plotting confusion matrix
from sklearn.metrics import confusion_matrix
def plot_confusion_matrix(cm, shots,
                          model_name,
                          normalize=False,
                          cmap=plt.cm.Wistia):
    tick_marks = np.arange(len(shots))
    if normalize:
        cm = cm.astype('float') / cm.sum(axis=1)[:, np.newaxis]

    plt.imshow(cm, interpolation='nearest', cmap=cmap)
    plt.yticks(tick_marks, shots)
    plt.title("Confusion matrix - " + model_name)
    plt.colorbar()
    plt.xticks(tick_marks, shots, rotation='vertical')

    fmt = '.2f' if normalize else 'd'
    # Use the colormap midpoint to pick a readable text colour for each cell
    thresh = cm.max() / 2.
    for i, j in itertools.product(range(cm.shape[0]), range(cm.shape[1])):
        plt.text(j, i, format(cm[i, j], fmt),
                 horizontalalignment="center",
                 color="white" if cm[i, j] > thresh else "black")
    plt.tight_layout()
    plt.ylabel('True Shot')
    plt.xlabel('Predicted Shot')
    plt.savefig("plots/" + "Confusion matrix - " + model_name)

Logistic Regression

In [42]:
from sklearn import linear_model
from sklearn import metrics
from sklearn.model_selection import GridSearchCV

# Hyperparameters to search over, i.e. C value and penalty type:
parameters = {'C':[0.01,0.1,1,10,20,30], 'penalty':['l2','l1']}
log_reg_clf = linear_model.LogisticRegression()
log_reg_model = GridSearchCV(log_reg_clf, param_grid=parameters, cv=3, verbose=1, n_jobs=8)

log_reg_model.fit(X_train,Y_train)
y_pred = log_reg_model.predict(X_test)
# y_prob = log_reg_model.predict_proba(X_test)
# print(y_prob)
accuracy = metrics.accuracy_score(y_true=Y_test,y_pred=y_pred)

# Accuracy of our stroke detection
print('Accuracy of stroke detection:   {}\n\n'.format(accuracy))
     
# confusion matrix
cm = metrics.confusion_matrix(Y_test, y_pred)

# plot confusion matrix
plt.figure(figsize=(8,8))
plt.grid(b=False)
plot_confusion_matrix(cm, model_name="Logistic Regression", 
                      shots=shot_labels, normalize=True)
plt.show()
    
# get classification report
print("Classifiction Report for this model")
classification_report = metrics.classification_report(Y_test, y_pred)
print(classification_report)
Fitting 3 folds for each of 12 candidates, totalling 36 fits
[Parallel(n_jobs=8)]: Using backend LokyBackend with 8 concurrent workers.
[Parallel(n_jobs=8)]: Done  21 out of  36 | elapsed:    2.7s remaining:    1.9s
[Parallel(n_jobs=8)]: Done  36 out of  36 | elapsed:   11.3s finished
/home/easy/anaconda3/lib/python3.7/site-packages/sklearn/model_selection/_search.py:813: DeprecationWarning: The default of the `iid` parameter will change from True to False in version 0.22 and will be removed in 0.24. This will change numeric results when test-set sizes are unequal.
  DeprecationWarning)
Accuracy of stroke detection:   0.7535211267605634


Classification Report for this model
              precision    recall  f1-score   support

          bo       0.89      0.76      0.82        42
          bu       0.73      0.90      0.81        21
          fo       0.62      0.64      0.63        28
          fs       0.70      0.66      0.68        32
          fu       0.81      0.89      0.85        19

    accuracy                           0.75       142
   macro avg       0.75      0.77      0.76       142
weighted avg       0.76      0.75      0.75       142
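
As a side note (not part of the original run), the fitted GridSearchCV object exposes which hyperparameters the search selected; a minimal sketch:

In [ ]:
# Inspect what the grid search over C and penalty picked.
print("Best parameters:", log_reg_model.best_params_)
print("Best CV score:   {:.3f}".format(log_reg_model.best_score_))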

K-Nearest Neighbours

In [43]:
from sklearn.neighbors import KNeighborsClassifier

knn = KNeighborsClassifier(n_neighbors=4)
knn.fit(X_train, Y_train)
y_pred = knn.predict(X_test)

accuracy = metrics.accuracy_score(y_true=Y_test,y_pred=y_pred)

# Accuracy of our stroke detection
print('Accuracy of stroke detection:   {}\n\n'.format(accuracy))
     
# confusion matrix
cm = metrics.confusion_matrix(Y_test, y_pred)
    
# plot confusion matrix
plt.figure(figsize=(8,8))
plt.grid(b=False)
plot_confusion_matrix(cm, model_name='KNeighborsClassifier',
                      shots=shot_labels, normalize=True)
plt.show()
    
# get classification report
print("Classifiction Report for this model")
classification_report = metrics.classification_report(Y_test, y_pred)
print(classification_report)
Accuracy of stroke detection:   0.6197183098591549


Classification Report for this model
              precision    recall  f1-score   support

          bo       0.60      0.76      0.67        42
          bu       0.58      0.71      0.64        21
          fo       0.48      0.39      0.43        28
          fs       0.64      0.50      0.56        32
          fu       0.93      0.74      0.82        19

    accuracy                           0.62       142
   macro avg       0.65      0.62      0.63       142
weighted avg       0.63      0.62      0.62       142
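
We used a fixed n_neighbors=4 above; k could instead be tuned with the same grid-search cross-validation used for the other models. A minimal sketch (not run in the original notebook), assuming the same train/test split:

In [ ]:
# Hypothetical tuning cell: search over k instead of fixing n_neighbors=4.
knn_params = {'n_neighbors': np.arange(1, 16)}
knn_model = GridSearchCV(KNeighborsClassifier(), param_grid=knn_params, cv=3, n_jobs=8)
knn_model.fit(X_train, Y_train)
print("Best k:", knn_model.best_params_['n_neighbors'])
print("Test accuracy: {:.3f}".format(knn_model.score(X_test, Y_test)))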

Linear SVC

In [44]:
from sklearn.svm import LinearSVC
parameters = {'C':[0.125, 0.5, 1, 2, 8, 16]}
lr_svc_reg_clf = LinearSVC(tol=0.00005)
lr_svc_reg_model = GridSearchCV(lr_svc_reg_clf, param_grid=parameters, n_jobs=8, verbose=1)

lr_svc_reg_model.fit(X_train,Y_train)
y_pred = lr_svc_reg_model.predict(X_test)
accuracy = metrics.accuracy_score(y_true=Y_test,y_pred=y_pred)
# Accuracy of our stroke detection
print('Accuracy of stroke detection:   {}\n\n'.format(accuracy))
     
# confusion matrix
cm = metrics.confusion_matrix(Y_test, y_pred)
        
# plot confusion matrix
plt.figure(figsize=(8,8))
plt.grid(b=False)
plot_confusion_matrix(cm, model_name='LinearSVC',
                      shots=shot_labels, normalize=True)
plt.show()
    
# get classification report
print("Classifiction Report for this model")
classification_report = metrics.classification_report(Y_test, y_pred)
print(classification_report)
Fitting 3 folds for each of 6 candidates, totalling 18 fits
[Parallel(n_jobs=8)]: Using backend LokyBackend with 8 concurrent workers.
[Parallel(n_jobs=8)]: Done  18 out of  18 | elapsed:    1.2s finished
/home/easy/anaconda3/lib/python3.7/site-packages/sklearn/model_selection/_search.py:813: DeprecationWarning: The default of the `iid` parameter will change from True to False in version 0.22 and will be removed in 0.24. This will change numeric results when test-set sizes are unequal.
  DeprecationWarning)
/home/easy/anaconda3/lib/python3.7/site-packages/sklearn/svm/base.py:929: ConvergenceWarning: Liblinear failed to converge, increase the number of iterations.
  "the number of iterations.", ConvergenceWarning)
Accuracy of stroke detection:   0.6901408450704225


Classification Report for this model
              precision    recall  f1-score   support

          bo       0.81      0.69      0.74        42
          bu       0.63      0.81      0.71        21
          fo       0.89      0.29      0.43        28
          fs       0.54      0.78      0.64        32
          fu       0.79      1.00      0.88        19

    accuracy                           0.69       142
   macro avg       0.73      0.71      0.68       142
weighted avg       0.74      0.69      0.67       142

SVC with RBF kernel

In [45]:
from sklearn.svm import SVC
parameters = {'C': [2, 8, 16],
              'gamma': [0.0078125, 0.125, 2]}
rbf_svm_clf = SVC(kernel='rbf')
rbf_svm_model = GridSearchCV(rbf_svm_clf,param_grid=parameters,n_jobs=8)

rbf_svm_model.fit(X_train,Y_train )
y_pred = rbf_svm_model.predict(X_test)
accuracy = metrics.accuracy_score(y_true=Y_test,y_pred=y_pred)
# Accuracy of our stroke detection
print('Accuracy of stroke detection:   {}\n\n'.format(accuracy))
     
# confusion matrix
cm = metrics.confusion_matrix(Y_test, y_pred)

# plot confusion matrix
plt.figure(figsize=(8,8))
plt.grid(b=False)
plot_confusion_matrix(cm, model_name='SVC', shots=shot_labels, normalize=True)
plt.show()
    
# get classification report
print("Classifiction Report for this model")
classification_report = metrics.classification_report(Y_test, y_pred)
print(classification_report)
Accuracy of stroke detection:   0.30985915492957744


Classification Report for this model
              precision    recall  f1-score   support

          bo       0.30      1.00      0.46        42
          bu       1.00      0.05      0.09        21
          fo       0.00      0.00      0.00        28
          fs       0.00      0.00      0.00        32
          fu       1.00      0.05      0.10        19

    accuracy                           0.31       142
   macro avg       0.46      0.22      0.13       142
weighted avg       0.37      0.31      0.16       142

/home/easy/anaconda3/lib/python3.7/site-packages/sklearn/metrics/classification.py:1437: UndefinedMetricWarning: Precision and F-score are ill-defined and being set to 0.0 in labels with no predicted samples.
  'precision', 'predicted', average, warn_for)
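
The RBF-kernel SVC does far worse than the other models here. One likely reason (an assumption, not verified in the original run) is that RBF kernels are distance-based and therefore sensitive to feature scales, and our features span very different ranges (accelerometer values around ±2, gyroscope values up to ±250). A minimal sketch of standardizing the features before the SVC, assuming the same X_train/Y_train split from above:

In [ ]:
# Hypothetical cell: scale features before the RBF SVC.
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

scaled_rbf_svm = make_pipeline(StandardScaler(), SVC(kernel='rbf', gamma='scale'))
scaled_rbf_svm.fit(X_train, Y_train)
print('Test accuracy with scaling: {:.3f}'.format(scaled_rbf_svm.score(X_test, Y_test)))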
Decision Tree

In [46]:
from sklearn.tree import DecisionTreeClassifier
parameters = {'max_depth':np.arange(3,20,2)}
decision_trees_clf = DecisionTreeClassifier()
decision_trees = GridSearchCV(decision_trees_clf, param_grid=parameters, n_jobs=8)

decision_trees.fit(X_train,Y_train )
y_pred = decision_trees.predict(X_test)
accuracy = metrics.accuracy_score(y_true=Y_test,y_pred=y_pred)
# Accuracy of our stroke detection
print('Accuracy of stroke detection:   {}\n\n'.format(accuracy))
     
# confusion matrix
cm = metrics.confusion_matrix(Y_test, y_pred)

# plot confusion matrix
plt.figure(figsize=(8,8))
plt.grid(b=False)
plot_confusion_matrix(cm, model_name='Decision Tree',
                      shots=shot_labels, normalize=True)
plt.show()
    
# get classification report
print("Classifiction Report for this model")
classification_report = metrics.classification_report(Y_test, y_pred)
print(classification_report)
/home/easy/anaconda3/lib/python3.7/site-packages/sklearn/model_selection/_search.py:813: DeprecationWarning: The default of the `iid` parameter will change from True to False in version 0.22 and will be removed in 0.24. This will change numeric results when test-set sizes are unequal.
  DeprecationWarning)
Accuracy of stroke detection:   0.6690140845070423


Classification Report for this model
              precision    recall  f1-score   support

          bo       0.82      0.74      0.78        42
          bu       0.64      0.76      0.70        21
          fo       0.57      0.71      0.63        28
          fs       0.68      0.41      0.51        32
          fu       0.60      0.79      0.68        19

    accuracy                           0.67       142
   macro avg       0.66      0.68      0.66       142
weighted avg       0.68      0.67      0.66       142

Random Forest

In [47]:
from sklearn.ensemble import RandomForestClassifier
params = {'n_estimators': np.arange(10,120,20), 'max_depth':np.arange(3,15,2)}
rfclassifier_clf = RandomForestClassifier()
rfclassifier = GridSearchCV(rfclassifier_clf, param_grid=params, n_jobs=8)


rfclassifier.fit(X_train,Y_train )
y_pred = rfclassifier.predict(X_test)
accuracy = metrics.accuracy_score(y_true=Y_test,y_pred=y_pred)
# Accuracy of our stroke detection
print('Accuracy of stroke detection:   {}\n\n'.format(accuracy))
     
# confusion matrix
cm = metrics.confusion_matrix(Y_test, y_pred)

# plot confusion matrix
plt.figure(figsize=(8,8))
plt.grid(b=False)
plot_confusion_matrix(cm, model_name='Random Forest',
                      shots=shot_labels, normalize=True)
plt.show()

# get classification report
print("Classifiction Report for this model")
classification_report = metrics.classification_report(Y_test, y_pred)
print(classification_report)
Accuracy of stroke detection:   0.7816901408450704


Classification Report for this model
              precision    recall  f1-score   support

          bo       0.86      0.86      0.86        42
          bu       0.72      0.86      0.78        21
          fo       0.62      0.71      0.67        28
          fs       0.77      0.62      0.69        32
          fu       1.00      0.89      0.94        19

    accuracy                           0.78       142
   macro avg       0.79      0.79      0.79       142
weighted avg       0.79      0.78      0.78       142
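
Random Forest gives the best test accuracy so far (0.78). As a quick sanity check (not in the original notebook), the best estimator found by grid search exposes feature importances, which show which of the 104 features matter most:

In [ ]:
# Top features by importance in the best random forest.
best_rf = rfclassifier.best_estimator_
importances = pd.Series(best_rf.feature_importances_, index=features)
print(importances.sort_values(ascending=False).head(10))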

Gradient Boosting

In [48]:
from sklearn.ensemble import GradientBoostingClassifier
param_grid = {'max_depth': np.arange(1,30,4),
              'n_estimators': np.arange(1,300,15)}
gbdt_clf = GradientBoostingClassifier()
gbdt_model = GridSearchCV(gbdt_clf, param_grid=param_grid, n_jobs=8)

gbdt_model.fit(X_train,Y_train )
y_pred = gbdt_model.predict(X_test)
accuracy = metrics.accuracy_score(y_true=Y_test,y_pred=y_pred)
# Accuracy of our stroke detection
print('Accuracy of stroke detection:   {}\n\n'.format(accuracy))
     
# confusion matrix
cm = metrics.confusion_matrix(Y_test, y_pred)

# plot confusion matrix
plt.figure(figsize=(8,8))
plt.grid(b=False)
plot_confusion_matrix(cm, model_name='GradientBoostingClassifier',
                      shots=shot_labels, normalize=True)
plt.show()
    
# get classification report
print("Classifiction Report for this model")
classification_report = metrics.classification_report(Y_test, y_pred)
print(classification_report)
/home/easy/anaconda3/lib/python3.7/site-packages/sklearn/model_selection/_search.py:813: DeprecationWarning: The default of the `iid` parameter will change from True to False in version 0.22 and will be removed in 0.24. This will change numeric results when test-set sizes are unequal.
  DeprecationWarning)
Accuracy of stroke detection:   0.7183098591549296


Classification Report for this model
              precision    recall  f1-score   support

          bo       0.88      0.71      0.79        42
          bu       0.62      0.86      0.72        21
          fo       0.56      0.68      0.61        28
          fs       0.68      0.53      0.60        32
          fu       0.90      0.95      0.92        19

    accuracy                           0.72       142
   macro avg       0.73      0.75      0.73       142
weighted avg       0.74      0.72      0.72       142
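
Before moving on to deep learning, a small recap cell added here for convenience; the accuracy values are copied from the outputs above:

In [ ]:
# Recap of the test accuracies reported above.
summary = pd.DataFrame({
    'model': ['Random Forest', 'Logistic Regression', 'Gradient Boosting',
              'LinearSVC', 'Decision Tree', 'KNN', 'SVC (RBF)'],
    'test_accuracy': [0.782, 0.754, 0.718, 0.690, 0.669, 0.620, 0.310],
})
print(summary.to_string(index=False))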

Deep Learning Models

Now we apply some deep learning models. The obvious choices for such a task are 1D CNNs or RNNs; we tried only LSTMs.

In [49]:
# Importing tensorflow
np.random.seed(42)
import tensorflow as tf
tf.random.set_seed(42)
from sklearn.preprocessing import StandardScaler
from keras.models import Sequential
from keras.layers import Dense, Dropout, Flatten, LSTM
from keras.layers.convolutional import Conv1D
from keras.layers.convolutional import MaxPooling1D
Using TensorFlow backend.
In [50]:
# ShotNames are the class labels
# It is a 5 class classification
ShotNames = {
    'bo': [1, 0, 0, 0, 0],
    'bu': [0, 1, 0, 0, 0],
    'fo': [0, 0, 1, 0, 0],
    'fs': [0, 0, 0, 1, 0],
    'fu': [0, 0, 0, 0, 1],
}
In [52]:
X = []
y = []
for index, row in X_y.iterrows():
    df = data[row["PersonID"] + "_" + row["ShotName"]][row["StartFrame"]:row["EndFrame"]][cols]
    X.append(df.to_numpy())
    y.append(row["ShotName"])
X = np.array(X)
# One Hot Encoding
y = np.array([ShotNames[i] for i in y])
In [53]:
n_classes = len(ShotNames)
timesteps = len(X[0])    # Window size
input_dim = len(X[0][0]) # num of sensors = 6
In [54]:
# Initializing parameters
epochs = 100
batch_size = 16
n_hidden = 32
In [55]:
# Splitting into train and test sets
from sklearn.model_selection import train_test_split
X_train, X_test, Y_train, Y_test = train_test_split(X, y, test_size=0.1)

LSTM

In [56]:
# Initializing the sequential model
model = Sequential()
# Configuring the parameters
model.add(LSTM(n_hidden, input_shape=(timesteps, input_dim)))
# Adding a dropout layer
model.add(Dropout(0.5))
# Adding a dense output layer with sigmoid activation
model.add(Dense(n_classes, activation='sigmoid'))
model.summary()
Model: "sequential_1"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
lstm_1 (LSTM)                (None, 32)                4992      
_________________________________________________________________
dropout_1 (Dropout)          (None, 32)                0         
_________________________________________________________________
dense_1 (Dense)              (None, 5)                 165       
=================================================================
Total params: 5,157
Trainable params: 5,157
Non-trainable params: 0
_________________________________________________________________
In [57]:
# Compiling the model
model.compile(loss='categorical_crossentropy',
              optimizer='rmsprop',
              metrics=['accuracy'])
In [58]:
# Training the model
model.fit(X_train, Y_train,
          batch_size=batch_size,
          validation_data=(X_test, Y_test),
          epochs=epochs)
Train on 637 samples, validate on 71 samples
Epoch 1/100
637/637 [==============================] - 3s 5ms/step - loss: 1.6241 - accuracy: 0.2308 - val_loss: 1.5853 - val_accuracy: 0.2676
Epoch 2/100
637/637 [==============================] - 1s 2ms/step - loss: 1.6098 - accuracy: 0.2465 - val_loss: 1.5887 - val_accuracy: 0.3099
Epoch 3/100
637/637 [==============================] - 1s 2ms/step - loss: 1.5880 - accuracy: 0.2622 - val_loss: 1.5809 - val_accuracy: 0.2958
Epoch 4/100
637/637 [==============================] - 1s 2ms/step - loss: 1.5921 - accuracy: 0.2684 - val_loss: 1.5780 - val_accuracy: 0.3099
Epoch 5/100
637/637 [==============================] - 1s 2ms/step - loss: 1.5727 - accuracy: 0.2700 - val_loss: 1.5787 - val_accuracy: 0.3099
Epoch 6/100
637/637 [==============================] - 1s 1ms/step - loss: 1.5603 - accuracy: 0.2826 - val_loss: 1.5757 - val_accuracy: 0.2817
Epoch 7/100
637/637 [==============================] - 2s 3ms/step - loss: 1.5319 - accuracy: 0.3234 - val_loss: 1.5740 - val_accuracy: 0.3099
Epoch 8/100
637/637 [==============================] - 1s 2ms/step - loss: 1.5252 - accuracy: 0.3312 - val_loss: 1.5738 - val_accuracy: 0.2958
Epoch 9/100
637/637 [==============================] - 1s 2ms/step - loss: 1.5223 - accuracy: 0.3061 - val_loss: 1.5744 - val_accuracy: 0.3380
Epoch 10/100
637/637 [==============================] - 1s 2ms/step - loss: 1.5247 - accuracy: 0.3485 - val_loss: 1.5719 - val_accuracy: 0.3380
Epoch 11/100
637/637 [==============================] - 1s 2ms/step - loss: 1.5113 - accuracy: 0.3328 - val_loss: 1.5748 - val_accuracy: 0.3239
Epoch 12/100
637/637 [==============================] - 1s 2ms/step - loss: 1.5057 - accuracy: 0.3454 - val_loss: 1.5731 - val_accuracy: 0.3521
Epoch 13/100
637/637 [==============================] - 1s 2ms/step - loss: 1.4972 - accuracy: 0.3265 - val_loss: 1.5702 - val_accuracy: 0.3239
Epoch 14/100
637/637 [==============================] - 1s 2ms/step - loss: 1.4964 - accuracy: 0.3548 - val_loss: 1.5698 - val_accuracy: 0.3239
Epoch 15/100
637/637 [==============================] - 1s 2ms/step - loss: 1.4825 - accuracy: 0.3642 - val_loss: 1.5688 - val_accuracy: 0.3521
Epoch 16/100
637/637 [==============================] - 1s 2ms/step - loss: 1.4550 - accuracy: 0.3768 - val_loss: 1.5678 - val_accuracy: 0.3662
Epoch 17/100
637/637 [==============================] - 1s 2ms/step - loss: 1.4712 - accuracy: 0.3642 - val_loss: 1.5672 - val_accuracy: 0.3803
Epoch 18/100
637/637 [==============================] - 1s 2ms/step - loss: 1.4628 - accuracy: 0.3752 - val_loss: 1.5698 - val_accuracy: 0.3803
Epoch 19/100
637/637 [==============================] - 1s 2ms/step - loss: 1.4290 - accuracy: 0.3987 - val_loss: 1.5669 - val_accuracy: 0.4085
Epoch 20/100
637/637 [==============================] - 1s 2ms/step - loss: 1.4489 - accuracy: 0.3815 - val_loss: 1.5629 - val_accuracy: 0.3803
Epoch 21/100
637/637 [==============================] - 1s 2ms/step - loss: 1.4339 - accuracy: 0.4160 - val_loss: 1.5670 - val_accuracy: 0.3662
Epoch 22/100
637/637 [==============================] - 1s 2ms/step - loss: 1.4477 - accuracy: 0.3595 - val_loss: 1.5649 - val_accuracy: 0.3662
Epoch 23/100
637/637 [==============================] - 1s 2ms/step - loss: 1.4365 - accuracy: 0.3642 - val_loss: 1.5725 - val_accuracy: 0.3521
Epoch 24/100
637/637 [==============================] - 1s 2ms/step - loss: 1.4062 - accuracy: 0.4035 - val_loss: 1.5661 - val_accuracy: 0.3521
Epoch 25/100
637/637 [==============================] - 1s 2ms/step - loss: 1.3929 - accuracy: 0.4192 - val_loss: 1.5650 - val_accuracy: 0.3662
Epoch 26/100
637/637 [==============================] - 1s 2ms/step - loss: 1.3949 - accuracy: 0.4050 - val_loss: 1.5693 - val_accuracy: 0.3239
Epoch 27/100
637/637 [==============================] - 2s 3ms/step - loss: 1.3887 - accuracy: 0.4066 - val_loss: 1.5613 - val_accuracy: 0.3099
Epoch 28/100
637/637 [==============================] - 2s 3ms/step - loss: 1.4001 - accuracy: 0.4097 - val_loss: 1.5520 - val_accuracy: 0.3380
Epoch 29/100
637/637 [==============================] - 2s 2ms/step - loss: 1.3706 - accuracy: 0.4097 - val_loss: 1.5628 - val_accuracy: 0.3662
Epoch 30/100
637/637 [==============================] - 2s 3ms/step - loss: 1.3600 - accuracy: 0.4254 - val_loss: 1.5602 - val_accuracy: 0.3521
Epoch 31/100
637/637 [==============================] - 2s 3ms/step - loss: 1.3574 - accuracy: 0.4270 - val_loss: 1.5477 - val_accuracy: 0.3380
Epoch 32/100
637/637 [==============================] - 1s 2ms/step - loss: 1.3235 - accuracy: 0.4584 - val_loss: 1.5470 - val_accuracy: 0.3239
Epoch 33/100
637/637 [==============================] - 1s 2ms/step - loss: 1.3326 - accuracy: 0.4411 - val_loss: 1.5538 - val_accuracy: 0.3239
Epoch 34/100
637/637 [==============================] - 2s 3ms/step - loss: 1.3443 - accuracy: 0.4349 - val_loss: 1.5535 - val_accuracy: 0.3099
Epoch 35/100
637/637 [==============================] - 2s 3ms/step - loss: 1.3214 - accuracy: 0.4741 - val_loss: 1.5526 - val_accuracy: 0.2958
Epoch 36/100
637/637 [==============================] - 1s 2ms/step - loss: 1.3229 - accuracy: 0.4615 - val_loss: 1.5319 - val_accuracy: 0.3099
Epoch 37/100
637/637 [==============================] - 1s 2ms/step - loss: 1.3163 - accuracy: 0.4710 - val_loss: 1.5273 - val_accuracy: 0.3239
Epoch 38/100
637/637 [==============================] - 1s 2ms/step - loss: 1.2795 - accuracy: 0.4694 - val_loss: 1.5228 - val_accuracy: 0.3099
Epoch 39/100
637/637 [==============================] - 1s 2ms/step - loss: 1.2817 - accuracy: 0.4600 - val_loss: 1.5231 - val_accuracy: 0.3521
Epoch 40/100
637/637 [==============================] - 2s 3ms/step - loss: 1.2847 - accuracy: 0.4678 - val_loss: 1.5045 - val_accuracy: 0.3380
Epoch 41/100
637/637 [==============================] - 2s 3ms/step - loss: 1.2639 - accuracy: 0.4882 - val_loss: 1.5021 - val_accuracy: 0.3380
Epoch 42/100
637/637 [==============================] - 2s 3ms/step - loss: 1.2370 - accuracy: 0.4961 - val_loss: 1.4993 - val_accuracy: 0.3944
Epoch 43/100
637/637 [==============================] - 2s 3ms/step - loss: 1.2538 - accuracy: 0.4961 - val_loss: 1.4870 - val_accuracy: 0.3944
Epoch 44/100
637/637 [==============================] - 2s 2ms/step - loss: 1.2197 - accuracy: 0.5118 - val_loss: 1.4648 - val_accuracy: 0.4225
Epoch 45/100
637/637 [==============================] - 1s 2ms/step - loss: 1.1903 - accuracy: 0.5683 - val_loss: 1.4607 - val_accuracy: 0.4085
Epoch 46/100
637/637 [==============================] - 2s 3ms/step - loss: 1.1867 - accuracy: 0.5369 - val_loss: 1.4387 - val_accuracy: 0.3803
Epoch 47/100
637/637 [==============================] - 1s 2ms/step - loss: 1.1951 - accuracy: 0.5165 - val_loss: 1.4328 - val_accuracy: 0.3944
Epoch 48/100
637/637 [==============================] - 1s 2ms/step - loss: 1.1657 - accuracy: 0.5683 - val_loss: 1.4019 - val_accuracy: 0.4507
Epoch 49/100
637/637 [==============================] - 2s 3ms/step - loss: 1.1447 - accuracy: 0.5651 - val_loss: 1.3884 - val_accuracy: 0.4507
Epoch 50/100
637/637 [==============================] - 2s 3ms/step - loss: 1.1810 - accuracy: 0.5385 - val_loss: 1.3857 - val_accuracy: 0.4225
Epoch 51/100
637/637 [==============================] - 2s 3ms/step - loss: 1.1246 - accuracy: 0.5730 - val_loss: 1.3510 - val_accuracy: 0.4366
Epoch 52/100
637/637 [==============================] - 2s 3ms/step - loss: 1.1217 - accuracy: 0.5824 - val_loss: 1.3518 - val_accuracy: 0.4789
Epoch 53/100
637/637 [==============================] - 2s 3ms/step - loss: 1.1240 - accuracy: 0.5557 - val_loss: 1.3008 - val_accuracy: 0.4930
Epoch 54/100
637/637 [==============================] - 2s 3ms/step - loss: 1.0460 - accuracy: 0.5997 - val_loss: 1.3398 - val_accuracy: 0.4789
Epoch 55/100
637/637 [==============================] - 2s 3ms/step - loss: 1.0609 - accuracy: 0.5903 - val_loss: 1.3302 - val_accuracy: 0.4507
Epoch 56/100
637/637 [==============================] - 2s 3ms/step - loss: 1.0335 - accuracy: 0.6028 - val_loss: 1.2892 - val_accuracy: 0.4648
Epoch 57/100
637/637 [==============================] - 1s 2ms/step - loss: 1.0564 - accuracy: 0.5965 - val_loss: 1.2604 - val_accuracy: 0.5070
Epoch 58/100
637/637 [==============================] - 1s 2ms/step - loss: 1.0014 - accuracy: 0.6013 - val_loss: 1.2146 - val_accuracy: 0.4648
Epoch 59/100
637/637 [==============================] - 1s 2ms/step - loss: 0.9893 - accuracy: 0.6154 - val_loss: 1.1943 - val_accuracy: 0.5070
Epoch 60/100
637/637 [==============================] - 1s 2ms/step - loss: 0.9850 - accuracy: 0.6358 - val_loss: 1.1446 - val_accuracy: 0.5070
Epoch 61/100
637/637 [==============================] - 2s 3ms/step - loss: 0.9695 - accuracy: 0.6075 - val_loss: 1.1165 - val_accuracy: 0.5493
Epoch 62/100
637/637 [==============================] - 1s 2ms/step - loss: 0.9453 - accuracy: 0.6641 - val_loss: 1.1455 - val_accuracy: 0.4789
Epoch 63/100
637/637 [==============================] - 1s 2ms/step - loss: 0.9221 - accuracy: 0.6546 - val_loss: 1.1140 - val_accuracy: 0.5493
Epoch 64/100
637/637 [==============================] - 1s 2ms/step - loss: 0.8814 - accuracy: 0.6688 - val_loss: 1.0826 - val_accuracy: 0.5211
Epoch 65/100
637/637 [==============================] - 1s 2ms/step - loss: 0.8942 - accuracy: 0.6703 - val_loss: 1.0380 - val_accuracy: 0.5915
Epoch 66/100
637/637 [==============================] - 2s 3ms/step - loss: 0.8851 - accuracy: 0.6609 - val_loss: 1.0540 - val_accuracy: 0.5634
Epoch 67/100
637/637 [==============================] - 2s 3ms/step - loss: 0.8718 - accuracy: 0.6813 - val_loss: 1.0825 - val_accuracy: 0.4930
Epoch 68/100
637/637 [==============================] - 1s 2ms/step - loss: 0.8709 - accuracy: 0.6656 - val_loss: 1.0379 - val_accuracy: 0.5915
Epoch 69/100
637/637 [==============================] - 1s 2ms/step - loss: 0.8566 - accuracy: 0.6625 - val_loss: 1.0041 - val_accuracy: 0.5634
Epoch 70/100
637/637 [==============================] - 1s 2ms/step - loss: 0.8341 - accuracy: 0.6954 - val_loss: 1.0387 - val_accuracy: 0.5634
Epoch 71/100
637/637 [==============================] - 1s 2ms/step - loss: 0.8653 - accuracy: 0.6609 - val_loss: 0.9512 - val_accuracy: 0.6197
Epoch 72/100
637/637 [==============================] - 1s 2ms/step - loss: 0.8317 - accuracy: 0.6860 - val_loss: 0.9942 - val_accuracy: 0.5915
Epoch 73/100
637/637 [==============================] - 1s 2ms/step - loss: 0.8312 - accuracy: 0.6703 - val_loss: 0.9277 - val_accuracy: 0.5915
Epoch 74/100
637/637 [==============================] - 1s 2ms/step - loss: 0.7547 - accuracy: 0.7206 - val_loss: 0.9639 - val_accuracy: 0.5493
Epoch 75/100
637/637 [==============================] - 1s 2ms/step - loss: 0.7607 - accuracy: 0.7096 - val_loss: 0.9448 - val_accuracy: 0.6197
Epoch 76/100
637/637 [==============================] - 2s 2ms/step - loss: 0.7629 - accuracy: 0.7159 - val_loss: 0.9486 - val_accuracy: 0.6620
Epoch 77/100
637/637 [==============================] - 1s 2ms/step - loss: 0.7279 - accuracy: 0.7143 - val_loss: 1.0015 - val_accuracy: 0.6338
Epoch 78/100
637/637 [==============================] - 1s 2ms/step - loss: 0.7447 - accuracy: 0.7253 - val_loss: 0.9642 - val_accuracy: 0.6056
Epoch 79/100
637/637 [==============================] - 1s 2ms/step - loss: 0.7054 - accuracy: 0.7488 - val_loss: 0.9946 - val_accuracy: 0.5634
Epoch 80/100
637/637 [==============================] - 1s 2ms/step - loss: 0.7017 - accuracy: 0.7504 - val_loss: 0.9401 - val_accuracy: 0.6056
Epoch 81/100
637/637 [==============================] - 1s 2ms/step - loss: 0.7075 - accuracy: 0.7347 - val_loss: 0.9268 - val_accuracy: 0.6197
Epoch 82/100
637/637 [==============================] - 2s 2ms/step - loss: 0.6920 - accuracy: 0.7441 - val_loss: 0.9395 - val_accuracy: 0.5915
Epoch 83/100
637/637 [==============================] - 2s 3ms/step - loss: 0.7147 - accuracy: 0.7582 - val_loss: 0.8831 - val_accuracy: 0.6620
Epoch 84/100
637/637 [==============================] - 1s 2ms/step - loss: 0.6686 - accuracy: 0.7394 - val_loss: 0.9293 - val_accuracy: 0.5634
Epoch 85/100
637/637 [==============================] - 1s 2ms/step - loss: 0.6355 - accuracy: 0.7755 - val_loss: 0.9403 - val_accuracy: 0.6056
Epoch 86/100
637/637 [==============================] - 1s 2ms/step - loss: 0.6327 - accuracy: 0.7630 - val_loss: 0.8972 - val_accuracy: 0.6197
Epoch 87/100
637/637 [==============================] - 2s 2ms/step - loss: 0.6413 - accuracy: 0.7300 - val_loss: 0.9135 - val_accuracy: 0.5915
Epoch 88/100
637/637 [==============================] - 1s 2ms/step - loss: 0.6389 - accuracy: 0.7645 - val_loss: 0.8484 - val_accuracy: 0.6338
Epoch 89/100
637/637 [==============================] - 2s 2ms/step - loss: 0.6256 - accuracy: 0.7755 - val_loss: 0.8899 - val_accuracy: 0.6338
Epoch 90/100
637/637 [==============================] - 2s 3ms/step - loss: 0.6100 - accuracy: 0.7724 - val_loss: 0.8704 - val_accuracy: 0.6338
Epoch 91/100
637/637 [==============================] - 2s 2ms/step - loss: 0.5675 - accuracy: 0.8006 - val_loss: 0.9041 - val_accuracy: 0.6056
Epoch 92/100
637/637 [==============================] - 2s 3ms/step - loss: 0.6046 - accuracy: 0.7849 - val_loss: 0.8838 - val_accuracy: 0.6338
Epoch 93/100
637/637 [==============================] - 2s 3ms/step - loss: 0.6058 - accuracy: 0.7724 - val_loss: 0.8063 - val_accuracy: 0.6479
Epoch 94/100
637/637 [==============================] - 2s 3ms/step - loss: 0.5836 - accuracy: 0.7661 - val_loss: 0.8685 - val_accuracy: 0.6338
Epoch 95/100
637/637 [==============================] - 2s 3ms/step - loss: 0.6204 - accuracy: 0.7896 - val_loss: 0.8823 - val_accuracy: 0.6056
Epoch 96/100
637/637 [==============================] - 2s 3ms/step - loss: 0.5924 - accuracy: 0.7912 - val_loss: 0.8876 - val_accuracy: 0.6197
Epoch 97/100
637/637 [==============================] - 2s 2ms/step - loss: 0.5548 - accuracy: 0.7991 - val_loss: 0.9293 - val_accuracy: 0.6338
Epoch 98/100
637/637 [==============================] - 2s 3ms/step - loss: 0.5855 - accuracy: 0.7881 - val_loss: 0.9942 - val_accuracy: 0.5915
Epoch 99/100
637/637 [==============================] - 2s 3ms/step - loss: 0.5363 - accuracy: 0.8257 - val_loss: 0.8965 - val_accuracy: 0.6338
Epoch 100/100
637/637 [==============================] - 2s 3ms/step - loss: 0.6074 - accuracy: 0.7771 - val_loss: 0.9183 - val_accuracy: 0.6338
Out[58]:
<keras.callbacks.callbacks.History at 0x7f9c0086de80>
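
The fit call above returns a Keras History object (see Out[58]). If it is assigned to a variable, the learning curves can be plotted with the matplotlib import from earlier; a sketch of the same call with the history captured (note: running it again on the same model would continue training it):

In [ ]:
# Capture the History object and plot the learning curves.
history = model.fit(X_train, Y_train,
                    batch_size=batch_size,
                    validation_data=(X_test, Y_test),
                    epochs=epochs)
plt.plot(history.history['accuracy'], label='train accuracy')
plt.plot(history.history['val_accuracy'], label='validation accuracy')
plt.xlabel('Epoch')
plt.ylabel('Accuracy')
plt.legend()
plt.show()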

We had too little data for deep learning models, so we didn't try deeper networks.

TODO

  1. Data Augmentation
  2. Try using 1D-CNN (a minimal sketch follows after this list)
  3. Attention models
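
Below is a minimal sketch for TODO item 2: a small 1D-CNN on the same (timesteps, input_dim) windows, reusing the Conv1D, MaxPooling1D and Flatten layers already imported above. The filter and kernel sizes are illustrative guesses, not tuned values.

In [ ]:
# Hypothetical 1D-CNN sketch (not run in this notebook).
cnn = Sequential()
cnn.add(Conv1D(filters=32, kernel_size=5, activation='relu',
               input_shape=(timesteps, input_dim)))
cnn.add(MaxPooling1D(pool_size=2))
cnn.add(Flatten())
cnn.add(Dropout(0.5))
cnn.add(Dense(n_classes, activation='softmax'))
cnn.compile(loss='categorical_crossentropy', optimizer='rmsprop',
            metrics=['accuracy'])
# cnn.fit(X_train, Y_train, batch_size=batch_size, epochs=epochs,
#         validation_data=(X_test, Y_test))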

Final Notes

This project taught us a lot about doing an end-to-end machine learning project, especially the challenges that arise in data collection and its preprocessing. Overall, this is just a small project we did for learning. It can be extended in several ways: first of all by collecting more data, so that models like LSTMs can be used effectively. For the classical models, we could add more features, apply PCA, and so on.

All in all, it was a great learning experience. Please feel free to play with it; suggestions are always welcome.