Here we use the smart bridge cloud to store the data. The data collected from the sensors is sent through an API to the cloud under the domain smart bridge and the sub-domain health monitoring system. After logging in, the patient can view his health details. In this research we use a pulse sensor to measure the patient's heartbeat, an LM35 to measure his body temperature, and EEG sensors to read his brain signals, so after login he sees the readings in tabular form, as shown in the figure. For the EEG we use the MindWave headset, which consists of one main sensor and one reference electrode. Future work can make this setup more sophisticated by expanding the set of sensors used to read the brain waves. The core processing of the MindWave Mobile headset takes place in the ThinkGear ASIC Module chip; in this research we use the TGAM chip in the sensor [22].
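As an illustration, the sketch below shows how one set of readings from the pulse sensor, LM35, and MindWave headset could be pushed to the cloud over an API. The endpoint URL, field names, and API key are hypothetical placeholders, since the actual smart bridge API schema is not given in this chapter.

```python
# Minimal sketch, assuming a hypothetical REST endpoint and payload schema.
import time
import requests

CLOUD_ENDPOINT = "https://smartbridge.example.com/api/health-monitoring/readings"  # hypothetical
API_KEY = "YOUR_DEVICE_API_KEY"  # hypothetical credential

def push_reading(patient_id, heart_rate_bpm, body_temp_c, eeg_attention, eeg_meditation):
    """Send one set of sensor readings (pulse, LM35, EEG) to the cloud."""
    payload = {
        "patient_id": patient_id,
        "timestamp": int(time.time()),
        "heart_rate_bpm": heart_rate_bpm,   # from the pulse sensor
        "body_temp_c": body_temp_c,         # from the LM35
        "eeg_attention": eeg_attention,     # from the MindWave (TGAM)
        "eeg_meditation": eeg_meditation,
    }
    resp = requests.post(
        CLOUD_ENDPOINT,
        json=payload,
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()
```

The patient-facing web page would then query the same sub-domain to render these stored readings as the table mentioned above.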
For the EEG sensor's Attention and Meditation values, a range of 1–100 was taken for the calculation (a minimal mapping of these bands is sketched after the list below):
• A value from 40 to 60 is considered “neutral”.
• A value from 60 to 80 is slightly elevated and is interpreted as higher than normal.
• A value from 80 to 100 is considered “high”, a strong indication of severe levels.
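A minimal mapping of these bands to labels might look like the following; the label for values below 40 is not specified in the text and is an assumption here.

```python
def interpret_esense(value):
    """Map a 1-100 Attention/Meditation value to the bands listed above."""
    if not 1 <= value <= 100:
        raise ValueError("Attention/Meditation values are reported on a 1-100 scale")
    if value < 40:
        return "low"            # below the neutral band (assumed label, not stated in the text)
    if value <= 60:
        return "neutral"        # 40-60
    if value <= 80:
        return "slightly high"  # 60-80: higher than normal
    return "high"               # 80-100: strong indication / severe level
```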
1.4.3 Cloud Feature Extraction
The most important step in creating an EEG signal classification system is generating mathematical representations and reductions of the input data that allow the input signal to be properly differentiated into its respective classes. These mathematical representations of the signal are, in a sense, a mapping of a multidimensional space (the input signal) into a space of fewer dimensions. This dimensional reduction is known as “feature extraction”. Ultimately, the extracted feature set should preserve only the most important information from the original signal [23].
Table 1.3 EEG signal mathematical transforms and their feature numbers.
Set | Mathematical transform | Feature numbers |
1 | Linear predictive coding taps | 1–5 |
2 | Fast Fourier transform statistics | 6–12 |
3 | Mel frequency cepstral coefficients | 13–22 |
4 | Log (FFT) analysis | 23–28 |
5 | Phase shift correlation | 29–36 |
6 | Hilbert transform statistics | 37–44 |
7 | Wavelet decomposition | 45–55 |
8 | 1st, 2nd, 3rd derivatives | 56–62 |
9 | 1st, 2nd, 3rd derivatives | 63–67 |
10 | Auto-regressive parameters | 68–72 |
Table 1.3 above describes the feature groups used for the EEG signal. First, a feature set optimization algorithm is presented, which is used to carry out a feature set study and reveal the mathematical transforms that are most useful in predicting the preictal state. After this, a set of algorithms is given that forms the framework of the seizure onset prediction system described.
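To make the idea of feature extraction concrete, the sketch below computes a small subset of the feature families in Table 1.3 (FFT statistics and derivative statistics) from a single EEG segment; the remaining transforms (LPC, MFCC, wavelet, Hilbert, auto-regressive, and so on) would be appended in the same way. The specific statistics shown are illustrative, as the chapter does not list them explicitly.

```python
import numpy as np

def extract_features(segment):
    """Sketch of feature extraction for one EEG segment (1-D array of samples).
    Only two of the transform families from Table 1.3 are illustrated here;
    the full 72-feature set in the chapter uses several more."""
    features = []

    # FFT-based statistics (set 2): summarize the magnitude spectrum.
    spectrum = np.abs(np.fft.rfft(segment))
    features += [spectrum.mean(), spectrum.std(), spectrum.max(),
                 float(np.argmax(spectrum)), np.median(spectrum)]

    # Derivative statistics (sets 8-9): mean/std of successive differences.
    for order in (1, 2, 3):
        d = np.diff(segment, n=order)
        features += [d.mean(), d.std()]

    return np.asarray(features)
```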
1.4.4 Feature Optimization
In order to find the features with the most potential, an algorithm was implemented to approximate individual feature strength with respect to every other feature [30]. The strength of a feature was determined by the accuracy with which the preictal state was classified, averaged over several classifications. Similar to Cross-Validation by Elimination, the HANNSVM algorithm repartitions the feature set, performs a set of classifications, finds the best feature sets to drop, and then adjusts the feature space to contain only features that improve the accuracy:
1. Evaluate the accuracy of the classification using all N feature sets.
2. Dropping one feature set at a time, repartition the feature space into N subsets of N − 1 feature sets each, and save the dropped set together with the resulting accuracy at position K in vector P.
3. Denote the index of P with the maximum accuracy as B, and drop all the feature sets listed in P from B to N from the final feature space.
The resulting feature set has accuracy similar to the accuracy recorded at position B in P. Undertraining and overtraining must still be taken into consideration, since they can affect the accuracy of a prediction.
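One possible reading of steps 1–3 as code is sketched below, using scikit-learn cross-validation with an SVM as the classifier. The grouping of features into blocks, the choice of classifier, and the tie-breaking behaviour are assumptions; the chapter does not fix these details.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def eliminate_feature_sets(X_sets, y, estimator=None, cv=5):
    """Sketch of the elimination procedure (one reading of steps 1-3).
    X_sets: list of N arrays, one (n_samples, n_features_k) block per feature set.
    Returns the indices of the feature sets that are kept."""
    estimator = estimator or SVC(kernel="rbf")
    remaining = list(range(len(X_sets)))

    def accuracy(indices):
        X = np.hstack([X_sets[i] for i in indices])
        return cross_val_score(estimator, X, y, cv=cv).mean()

    # Step 1: baseline accuracy with all N feature sets.
    history = [(list(remaining), accuracy(remaining))]

    # Step 2: drop one set at a time, recording the accuracy of each reduced space.
    while len(remaining) > 1:
        # For each candidate, measure accuracy without it and drop the one
        # whose removal hurts accuracy the least (or helps the most).
        trials = [(accuracy([i for i in remaining if i != k]), k) for k in remaining]
        best_acc, dropped = max(trials)
        remaining = [i for i in remaining if i != dropped]
        history.append((list(remaining), best_acc))

    # Step 3: keep the configuration with the maximum recorded accuracy.
    best_subset, _ = max(history, key=lambda item: item[1])
    return best_subset
```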
1.4.5 Classification and Validation
The two methods in this section were developed to complement the classification algorithms and enhance their classification potential for noisy dynamical systems that change state over time.
The first method, an SVM-based procedure called Cross-Validation by Elimination, classifies samples by testing the amount of correlation (determined by the accuracy of the classifications) each sample has to every state and then removing the classes that are least correlated, to improve classification accuracy. The algorithm isolates each of the classes, compares the prediction results, and then makes a final decision based on a function of the independent predictions [23, 29].
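The sketch below illustrates one way this class-elimination idea could be realized with a one-vs-rest SVM: score the test sample against every class, discard the least correlated classes, and re-decide among the survivors. The correlation measure, the number of classes dropped, and the retraining step are assumptions, since the chapter does not spell them out.

```python
import numpy as np
from sklearn.base import clone
from sklearn.svm import SVC

def classify_by_elimination(X_train, y_train, x_test, n_drop=1):
    """Illustrative class-elimination classifier (NumPy arrays expected).
    Assumes at least three classes so the retrained model still sees two or more."""
    base = SVC(kernel="rbf", decision_function_shape="ovr")
    clf = clone(base).fit(X_train, y_train)

    # Score the sample against every class and drop the least correlated ones.
    scores = clf.decision_function(x_test.reshape(1, -1)).ravel()
    classes = np.asarray(clf.classes_)
    survivors = classes[np.argsort(scores)[n_drop:]]

    # Retrain on the surviving classes only and make the final decision.
    mask = np.isin(y_train, survivors)
    clf2 = clone(base).fit(X_train[mask], y_train[mask])
    return clf2.predict(x_test.reshape(1, -1))[0]
```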
Figure 1.8 shows the block diagram of EEG signal classification with a hybrid artificial neural network and support vector machine (HANNSVM). The brain signal is captured from the EEG sensor (in hertz), artifacts are removed from the input signal, and the preprocessed data is segmented, sampled, and passed through a rectangular window function [31]. An FIR filter is applied to the incoming EEG stream to decompose the incoming signals into their respective brain waves; however, due to time constraints, only the original (unfiltered) signals are tested with the system. Features are then extracted from the segmented output signal, and the extracted features are fed to the HANNSVM machine learning algorithm [32, 33].
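A minimal sketch of the signal-conditioning front end of this pipeline (FIR filtering, segmentation, rectangular windowing) is given below. The sampling rate, filter band, window length, and tap count are illustrative values, since they are not stated in the text.

```python
import numpy as np
from scipy.signal import firwin, lfilter

def preprocess_eeg(raw, fs, band=(0.5, 40.0), win_seconds=2.0, numtaps=101):
    """Sketch of the Figure 1.8 front end: FIR band-pass filter ->
    fixed-length segments -> rectangular window."""
    # FIR band-pass to suppress drift and high-frequency artifacts.
    taps = firwin(numtaps, band, pass_zero=False, fs=fs)
    filtered = lfilter(taps, 1.0, raw)

    # Cut the stream into non-overlapping fixed-length segments.
    win = int(win_seconds * fs)
    n_segments = len(filtered) // win
    segments = filtered[: n_segments * win].reshape(n_segments, win)

    # The rectangular window is the identity; shown explicitly for clarity.
    return segments * np.ones(win)
```

Each resulting segment would then be passed to the feature extraction step and on to the HANNSVM classifier.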
This method places test samples that were weakly classified into the classes that improve accuracy. The second method, State Decision Neurons, uses an Artificial Neural Network (ANN) to automatically make decisions about when to transition to the next defined state [34]. This algorithm, when used in conjunction with a set of classifiers, enables the system to make decisions based on previous predictions, a closed-loop system if you will. When there are three or more states to distinguish between in a noisy system, state decision neurons are useful in determining