
The subject is Computational Intelligence for Data Analytics. Parts 3 and 4 need to be done.


CSE3CI – Computational Intelligence for Data Analytics Assignment, 2021
Due Date: Monday 17th May, 9:00am, 2021
Assessment Weight: 30% of the final mark for the subject

Instructions
• This is a GROUP assignment. You are permitted to work in groups of up to three. All group members will receive the same mark. You may complete the assignment as an individual, but if you do so, you will be marked in the same way as for a group.

Plagiarism
Plagiarism is the submission of somebody else's work in a manner that gives the impression that the work is your own. When submitting your assignment via the LMS, the following announcement will appear: Software will be used to assist in the detection of plagiarism. Students are referred to the section on 'Academic Misconduct' in the subject's guideline available on LMS.

Lateness Policy
Penalties are applied to late assignments (5% of the total possible marks for the task is deducted per day; late submissions are accepted up to 5 days after the due date only). An assignment submitted more than five working days after the due date will not be accepted.

Submission Procedure
You are required to submit the following:
• A pdf format document containing your report.
• A zip file containing all of the Python code that you used for the assignment.
These documents are to be submitted electronically via the Learning Management System. In the case of group submissions, only one member of the group should submit, and the cover page of the report must contain the full name and Student ID of all group members. In the case of solo submissions, ensure your name and Student ID is on the cover page. You will also be required to do a short oral presentation of your report (10 minutes max) during the scheduled lab class in Week 11. Depending on how many submissions are received, it may be necessary to schedule a second lab class during that week or Week 12.
Problem Description – Forecasting Electricity Prices
The problem is to forecast the electricity price based on historical data. Let the temperature and total demand for electricity at time instant t be T(t) and D(t) respectively. The goal is to predict the Recommended Retail Price (RRP) of electricity using historical data as system inputs. The historical data set consists of the following variables: T(t-2), T(t-1), T(t), D(t-2), D(t-1), D(t). The output should be a prediction of the RRP of electricity at the next time instant t+1, denoted by P(t+1).
You have been provided with real-world electricity pricing data from Queensland, Australia. There are two datasets: a training set, to be used for model development; and a test set, to be used to evaluate the performance of your models. Each dataset has the same structure. Rows correspond to successive time instants, and contain seven values: the predictor variables T(t-2), T(t-1), T(t), D(t-2), D(t-1), D(t), and the target variable P(t+1). The objective is to predict the value of P(t+1) on the basis of one or more of the six predictor variables.
There are five parts to the assignment, described below, with the approximate assessment weighting. Parts 1, 2 and 3 are based on content that has been covered up to the end of Week 5. Content for Part 4 will be covered in Weeks 6 and 7.

Part 1 – Data Preparation (approx. 5%)
The performance of many systems can be improved through careful preparation of the data. Visualising the electricity prices will reveal that there are potential outliers¹ in the dataset; i.e., observations that lie an abnormal distance from other values in a random sample from a population.
Tasks:
• Use an appropriate technique to identify and remove outliers of the output variable from the datasets (both training and test sets).
• Provide a plot showing the price data before and after the removal of outliers.

¹ You can read more about outliers here: http://www.itl.nist.gov/div898/handbook/prc/section1/prc16.htm, http://mathworld.wolfram.com/Outlier.html

Part 2 – Linear Regression Models (approx. 8%)
Linear regression is often a good baseline against which to compare the performance of other models.
Tasks:
• Apply linear regression to the prediction of electricity prices.
• For both the training and test sets, provide the Average Relative Error.
• For both the training and test sets, produce a plot showing, for each data point, how the predicted price compares with the actual price.

Part 3 – Multilayer Perceptron Models (approx. 27%)
Multilayer perceptrons can sometimes yield better performance than linear models.
Tasks:
• Experiment with the application of MLPs to predicting electricity prices. You should try varying MLPRegressor parameters such as the regularization coefficient, the number of training epochs, and the number of hidden units. Make sure that you record the training error and test error in each case. It is suggested that you use logistic units in the hidden layer, but you can use others if you wish.
• Provide results for three different MLPRegressor parameter settings:
− one of these should be the result for the best-performing MLP that you were able to train;
− one should clearly demonstrate underfitting;
− one should clearly demonstrate overfitting.
For each of these cases, provide the learning parameters that you have used, as well as the training error and the test error.
• For the best-performing MLP, for both training data and test data, produce a plot showing, for each data point, how the predicted price compares with the actual price.

Part 4 – Fuzzy Forecasting System (approx. 40%)
For this part, you will develop a fuzzy forecasting system for predicting the electricity price.
(You will learn about fuzzy inferencing systems in Weeks 6 and 7.)
Tasks:
• Select appropriate values or fuzzy subsets for the linguistic variables that you will use in your fuzzy rules.
• Apply statistical analysis (correlation coefficients) and heuristics to develop a set of fuzzy rules.
• Implement your fuzzy system in Python, and produce clear plots of all membership functions involved in your system.
• Evaluate the system performance in terms of the average relative error on both the training and test sets.
You may use either Mamdani-type or Sugeno-type inference, but you should include some justification for your decision.

Part 5 – Report and Presentation (approx. 20%)
This is the assignment 'deliverable'; i.e., what you are required to submit. It should contain your results from Parts 1 to 4, put together in a clear and coherent manner. It should also clearly describe how you conducted your investigation and any design choices you made (e.g., What parameters did you experiment with when applying the MLP? What different membership functions did you experiment with in creating your fuzzy system? Why did you opt for Mamdani-type inference as opposed to Sugeno-type inference? And so on). Basically, the more thorough and systematic your analysis, the better. A summary of your overall findings should also be provided in the report.

Assessment
Approximate marks for each of Parts 1 to 5 have been indicated above. The marks for Parts 1 to 4 are based on the correctness and completeness of the tasks specified. The 20% allocated for Part 5 will be based on how clearly and coherently the report and presentation are presented; the description and justification they provide for the design choices that have been made; the evidence they provide of systematic experimentation with different system parameters; and the conclusions they draw regarding the use of the various approaches in predicting electricity prices.
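Part 1's outlier-removal step can be sketched with the common interquartile-range (IQR) rule. This is one possible technique, not the one prescribed by the assignment; the function name and the toy price series below are invented for illustration.

```python
import numpy as np

def remove_outliers_iqr(y, k=1.5):
    """Return a boolean mask keeping values inside [Q1 - k*IQR, Q3 + k*IQR]."""
    q1, q3 = np.percentile(y, [25, 75])
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return (y >= lo) & (y <= hi)

# Toy price series with one obvious spike; real data comes from the CSV files.
prices = np.array([15.2, 14.8, 15.0, 14.9, 120.0, 15.1])
mask = remove_outliers_iqr(prices)
clean = prices[mask]  # the 120.0 spike is dropped
```

The same mask would be applied row-wise to both the training and test DataFrames, and a before/after plot of the price column satisfies the second task.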
Advice on designing your fuzzy inferencing system for Part 4 of the Assignment

Removing Outliers
Remove outliers from both the training and test set.

Variable selection using the correlation coefficient matrix
• Initially we have six input variables, T(t-2), T(t-1), T(t), D(t-2), D(t-1), D(t), and one output variable, P(t+1).
• We can calculate the correlation coefficient matrix as follows:
>> corrcoef(A)
(Refer to the Week 5 lab.) Here the matrix A contains all seven columns (the six input variables and the output variable).
• You will need to examine the correlations to decide which input variables you will use to construct your fuzzy inferencing system.
• In the following, it will be assumed that T(t-2) and D(t) have been selected. But note that these are probably not very good choices, and you must select variables that you expect will work well.
• The objective of the FIS design is then to find the unknown functional relationship between T(t-2), D(t) and P(t+1).

Two inputs and one output: T(t-2), D(t) → Price(t+1)

Designing Membership Functions
For each variable (both input and output), choose appropriate linguistic variables, and 'roughly' design membership functions based on their distributions.

Fuzzy Rules
You will need to come up with the fuzzy rules on your own. These can be based on heuristics, i.e., rules of thumb. (The slides show some example rules; these are just examples, and you will need to come up with rules of your own!)

• Once you have selected your fuzzy membership functions and rules, you can implement them in Scikit-fuzzy, using the code from the Week 6 and 7 labs as a basis.
• It is suggested that you organise the code into a function, so that it will be easy for you to perform the inferencing over all examples in the test set.
• You should also write a function to calculate the Average Relative Error, which you will use to evaluate your system's performance.
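The Average Relative Error function suggested above can be sketched as follows. This uses one common definition (the mean of |actual − predicted| / |actual|); confirm it against the definition given in the labs before relying on it.

```python
import numpy as np

def average_relative_error(actual, predicted):
    """Mean of |actual - predicted| / |actual| over all data points."""
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return np.mean(np.abs(actual - predicted) / np.abs(actual))

# e.g. two actual prices vs. model predictions
are = average_relative_error([20.0, 10.0], [22.0, 9.0])  # (0.1 + 0.1) / 2 = 0.1
```

The same function serves Parts 2, 3 and 4, so keeping it in a shared module avoids re-implementing it per model.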
Implementing your fuzzy inferencing system in Scikit-fuzzy
• Part 4 of the assignment carries the greatest weight in terms of marks, so it is important that you can get a fuzzy system running, even if it does not perform very well.
• It is strongly suggested that you
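Scikit-fuzzy is the suggested library, but to illustrate what Mamdani inference actually computes, here is a minimal library-free sketch: two inputs, two rules, triangular memberships, min as AND, max aggregation, and centroid defuzzification. All universe ranges, breakpoints, and the rule set are invented for illustration; a real system would use ranges taken from the data and rules justified by the correlation analysis.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership with feet at a and c and peak at b."""
    x = np.asarray(x, dtype=float)
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

# Output universe: price (range invented for illustration).
price = np.linspace(0.0, 60.0, 601)
price_low = tri(price, 0.0, 15.0, 30.0)
price_high = tri(price, 30.0, 45.0, 60.0)

def predict_price(temp, demand):
    # Input memberships (breakpoints invented for illustration).
    temp_hot = float(tri(temp, 20.0, 35.0, 50.0))
    demand_high = float(tri(demand, 4000.0, 6000.0, 8000.0))
    # Rule 1: IF temp is hot AND demand is high THEN price is high (min = AND).
    w_high = min(temp_hot, demand_high)
    # Rule 2: otherwise price is low (complement used as a crude fallback rule).
    w_low = 1.0 - w_high
    # Mamdani: clip each consequent, aggregate with max, defuzzify by centroid.
    agg = np.maximum(np.minimum(price_high, w_high), np.minimum(price_low, w_low))
    return float(np.sum(price * agg) / np.sum(agg))

hot_day = predict_price(40.0, 6000.0)   # fires the "high price" rule strongly
mild_day = predict_price(22.0, 4200.0)  # dominated by the "low price" rule
```

Wrapping the inference in a function, as the slides suggest, makes it trivial to loop over every row of the test set and feed the predictions into the Average Relative Error calculation.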
Answered 3 days after May 12, 2021 (CSE3CI, La Trobe University)


Alok Kumar answered on May 16 2021
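The submitted notebook below goes straight to a Keras network. For reference, Part 2's linear-regression baseline can be sketched with plain numpy (equivalent in spirit to scikit-learn's LinearRegression); the synthetic six-column feature matrix below merely stands in for the T(t-2..t), D(t-2..t) predictors described in the brief.

```python
import numpy as np

# Toy stand-in for the six predictors and target (real data comes from the CSVs).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 6))                  # columns: T(t-2..t), D(t-2..t)
true_w = np.array([0.5, -0.2, 0.8, 0.1, 0.0, 0.3])
y = X @ true_w + 2.0 + rng.normal(scale=0.01, size=100)   # target: P(t+1)

# Ordinary least squares with an explicit intercept column.
A = np.hstack([X, np.ones((X.shape[0], 1))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef   # fitted prices, to be compared against actuals in a plot
```

On the real data the same two lines of algebra (or `sklearn.linear_model.LinearRegression`) give the baseline whose Average Relative Error the MLP and fuzzy system are measured against.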
{
"cells": [
{
"cell_type": "code",
"execution_count": 1,
"metadata": {
"id": "dZumBCbG0kwf"
},
"outputs": [],
"source": [
"import pandas as pd\n",
"import numpy as np\n",
"import matplotlib.pyplot as plt\n",
"%matplotlib inline"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {
"id": "ocilV1om0vGE"
},
"outputs": [],
"source": [
"from sklearn.metrics import confusion_matrix\n",
"from sklearn.metrics import accuracy_score"
]
},
{
"cell_type": "code",
"execution_count": 95,
"metadata": {
"id": "eBZKhlKf0xIL"
},
"outputs": [],
"source": [
"from keras.models import Sequential\n",
"from keras.layers import Flatten\n",
"from keras.layers import Dense,Dropout"
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "EbRco7Bu0y1i",
"outputId": "c2e968ca-d3ae-4501-c5cb-d5d8d9190a13"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Mounted at /content/gdrive\n"
]
}
],
"source": [
"from google.colab import drive\n",
"drive.mount(\"/content/gdrive\")\n"
]
},
{
"cell_type": "code",
"execution_count": 17,
"metadata": {
"id": "6_etsKnx2ot2"
},
"outputs": [],
"source": [
"data=pd.read_csv('/content/gdrive/My Drive/trainingdata-dbaj4tb3.csv')"
]
},
{
"cell_type": "code",
"execution_count": 18,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/",
"height": 202
},
"id": "4qvvar6m4Ifs",
"outputId": "4b44a8f0-3552-4e3c-f25d-7942a493548c"
},
"outputs": [
{
"data": {
"text/html": [
"
\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"<tr>\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"\n",
"
T(t-2)T(t-1)T(t)D(t-2)D(t-1)D(t)P(t+1)
023.424.323.94630.14522.84413.115.17
124.323.923.34522.84413.14262.314.56
223.923.323.04413.14262.34164.114.73
323.323.022.54262.34164.14094.815.14
423.022.522.84164.14094.84085.714.59
\n",
"
"
],
"text/plain": [
" T(t-2) T(t-1) T(t) D(t-2) D(t-1) D(t) P(t+1)\n",
"0 23.4 24.3 23.9 4630.1 4522.8 4413.1 15.17\n",
"1 24.3 23.9 23.3 4522.8 4413.1 4262.3 14.56\n",
"2 23.9 23.3 23.0 4413.1 4262.3 4164.1 14.73\n",
"3 23.3 23.0 22.5 4262.3 4164.1 4094.8 15.14\n",
"4 23.0 22.5 22.8 4164.1 4094.8 4085.7 14.59"
]
},
"execution_count": 18,
"metadata": {
"tags": []
},
"output_type": "execute_result"
}
],
"source": [
"data.head()"
]
},
{
"cell_type": "code",
"execution_count": 19,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "MrATzuWp441_",
"outputId": "65757583-13b4-4635-d60c-198a191be59b"
},
"outputs": [
{
"data": {
"text/plain": [
"(956, 7)"
]
},
"execution_count": 19,
"metadata": {
"tags": []
},
"output_type": "execute_result"
}
],
"source": [
"data.shape"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "1c33SajWMFns"
},
"source": [
"## **Best Performing Model :-"
]
},
{
"cell_type": "code",
"execution_count": 114,
"metadata": {
"id": "3mkQ_quC46p2"
},
"outputs": [],
"source": [
"x_input = data.drop(labels = 'P(t+1)', axis= 1)\n",
"y_label = data.iloc[:,6]"
]
},
{
"cell_type": "code",
"execution_count": 115,
"metadata": {
"id": "OGOEieaQIDlr"
},
"outputs": [],
"source": [
"model = Sequential()\n",
"model.add(Dense(units= 300, input_dim =6, activation = 'relu'))\n",
"model.add(Dense(units = 200, activation = 'relu'))\n",
"#model.add(Dropout(0.5))\n",
"model.add(Dense(units = 50, activation = 'relu'))\n",
"model.add(Dense(1, activation='linear'))\n"
]
},
{
"cell_type": "code",
"execution_count": 116,
"metadata": {
"id": "KCot2Ve7IDjD"
},
"outputs": [],
"source": [
"model.compile(loss='mse', optimizer='adam', metrics=['mse', 'mae'])"
]
},
{
"cell_type": "code",
"execution_count": 117,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "STQBNuB4IDfy",
"outputId": "9866da01-6183-44c6-c6cf-93380f3573fe"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Epoch 1/200\n",
"24/24 [==============================] - 1s 13ms/step - loss: 56579.9349 - mse: 56579.9349 - mae: 173.0913 - val_loss: 1394.3212 - val_mse: 1394.3212 - val_mae: 35.5384\n",
"Epoch 2/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1264.4004 - mse: 1264.4004 - mae: 22.9590 - val_loss: 178.1509 - val_mse: 178.1509 - val_mae: 12.0964\n",
"Epoch 3/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1276.9077 - mse: 1276.9077 - mae: 18.0049 - val_loss: 383.4441 - val_mse: 383.4441 - val_mae: 17.9101\n",
"Epoch 4/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1376.4983 - mse: 1376.4983 - mae: 17.8739 - val_loss: 113.5558 - val_mse: 113.5558 - val_mae: 7.9181\n",
"Epoch 5/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 999.2231 - mse: 999.2231 - mae: 14.8368 - val_loss: 255.1637 - val_mse: 255.1637 - val_mae: 11.8357\n",
"Epoch 6/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1532.1751 - mse: 1532.1751 - mae: 23.6190 - val_loss: 510.0167 - val_mse: 510.0167 - val_mae: 20.6993\n",
"Epoch 7/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1697.8839 - mse: 1697.8839 - mae: 18.9039 - val_loss: 194.2902 - val_mse: 194.2902 - val_mae: 12.7640\n",
"Epoch 8/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1030.7195 - mse: 1030.7195 - mae: 14.7647 - val_loss: 114.2632 - val_mse: 114.2632 - val_mae: 8.7584\n",
"Epoch 9/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1921.1269 - mse: 1921.1269 - mae: 25.5920 - val_loss: 850.9990 - val_mse: 850.9990 - val_mae: 26.9663\n",
"Epoch 10/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 2253.2649 - mse: 2253.2649 - mae: 26.5106 - val_loss: 124.7940 - val_mse: 124.7940 - val_mae: 9.7077\n",
"Epoch 11/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 2183.1102 - mse: 2183.1102 - mae: 25.9422 - val_loss: 110.8702 - val_mse: 110.8702 - val_mae: 7.8232\n",
"Epoch 12/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1408.9542 - mse: 1408.9542 - mae: 14.8660 - val_loss: 1019.7334 - val_mse: 1019.7334 - val_mae: 29.9130\n",
"Epoch 13/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1937.3157 - mse: 1937.3157 - mae: 27.3706 - val_loss: 267.8039 - val_mse: 267.8039 - val_mae: 12.4684\n",
"Epoch 14/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1581.6487 - mse: 1581.6487 - mae: 19.6696 - val_loss: 235.3079 - val_mse: 235.3079 - val_mae: 11.1282\n",
"Epoch 15/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1940.5505 - mse: 1940.5505 - mae: 21.0230 - val_loss: 380.5006 - val_mse: 380.5006 - val_mae: 16.3433\n",
"Epoch 16/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1342.9305 - mse: 1342.9305 - mae: 18.6597 - val_loss: 110.3073 - val_mse: 110.3073 - val_mae: 7.5814\n",
"Epoch 17/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 725.3933 - mse: 725.3933 - mae: 12.1412 - val_loss: 116.3737 - val_mse: 116.3737 - val_mae: 7.1912\n",
"Epoch 18/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1357.7928 - mse: 1357.7928 - mae: 16.3674 - val_loss: 129.3470 - val_mse: 129.3470 - val_mae: 10.0263\n",
"Epoch 19/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 860.4584 - mse: 860.4584 - mae: 16.0444 - val_loss: 237.8960 - val_mse: 237.8960 - val_mae: 14.1770\n",
"Epoch 20/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 999.7822 - mse: 999.7822 - mae: 20.7260 - val_loss: 621.8353 - val_mse: 621.8353 - val_mae: 22.9404\n",
"Epoch 21/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 3302.8944 - mse: 3302.8944 - mae: 41.2597 - val_loss: 230.0024 - val_mse: 230.0024 - val_mae: 13.9207\n",
"Epoch 22/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1070.0575 - mse: 1070.0575 - mae: 15.8200 - val_loss: 320.7378 - val_mse: 320.7378 - val_mae: 16.4256\n",
"Epoch 23/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 849.4555 - mse: 849.4555 - mae: 16.2404 - val_loss: 383.5304 - val_mse: 383.5304 - val_mae: 17.9286\n",
"Epoch 24/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 997.3256 - mse: 997.3256 - mae: 16.8518 - val_loss: 238.5720 - val_mse: 238.5720 - val_mae: 11.3722\n",
"Epoch 25/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1843.9421 - mse: 1843.9421 - mae: 23.2866 - val_loss: 242.1027 - val_mse: 242.1027 - val_mae: 11.5265\n",
"Epoch 26/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1157.0304 - mse: 1157.0304 - mae: 15.0370 - val_loss: 110.3560 - val_mse: 110.3560 - val_mae: 8.4940\n",
"Epoch 27/200\n",
"24/24 [==============================] - 0s 4ms/step - loss: 1192.7513 - mse: 1192.7513 - mae: 17.7864 - val_loss: 152.9165 - val_mse: 152.9165 - val_mae: 11.1747\n",
"Epoch 28/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1517.1439 - mse: 1517.1439 - mae: 19.7622 - val_loss: 367.5334 - val_mse: 367.5334 - val_mae: 17.5461\n",
"Epoch 29/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1086.2786 - mse: 1086.2786 - mae: 19.3340 - val_loss: 113.1737 - val_mse: 113.1737 - val_mae: 8.8014\n",
"Epoch 30/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1183.6395 - mse: 1183.6395 - mae: 16.5681 - val_loss: 392.2818 - val_mse: 392.2818 - val_mae: 18.1215\n",
"Epoch 31/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 956.4853 - mse: 956.4853 - mae: 19.9323 - val_loss: 501.0977 - val_mse: 501.0977 - val_mae: 20.4894\n",
"Epoch 32/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1526.5263 - mse: 1526.5263 - mae: 25.0483 - val_loss: 107.4493 - val_mse: 107.4493 - val_mae: 8.0844\n",
"Epoch 33/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 739.3418 - mse: 739.3418 - mae: 12.5105 - val_loss: 130.2129 - val_mse: 130.2129 - val_mae: 10.0463\n",
"Epoch 34/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 832.8399 - mse: 832.8399 - mae: 16.6487 - val_loss: 170.5651 - val_mse: 170.5651 - val_mae: 8.5404\n",
"Epoch 35/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1264.7398 - mse: 1264.7398 - mae: 22.7538 - val_loss: 467.1175 - val_mse: 467.1175 - val_mae: 18.8988\n",
"Epoch 36/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1727.6659 - mse: 1727.6659 - mae: 24.0654 - val_loss: 138.9152 - val_mse: 138.9152 - val_mae: 10.5044\n",
"Epoch 37/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 986.4371 - mse: 986.4371 - mae: 14.0137 - val_loss: 148.6595 - val_mse: 148.6595 - val_mae: 7.6473\n",
"Epoch 38/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1424.3668 - mse: 1424.3668 - mae: 18.7199 - val_loss: 410.1967 - val_mse: 410.1967 - val_mae: 18.5079\n",
"Epoch 39/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1550.8550 - mse: 1550.8550 - mae: 18.6139 - val_loss: 482.0362 - val_mse: 482.0362 - val_mae: 20.0650\n",
"Epoch 40/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1258.6552 - mse: 1258.6552 - mae: 19.3974 - val_loss: 281.8907 - val_mse: 281.8907 - val_mae: 13.2071\n",
"Epoch 41/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1777.7478 - mse: 1777.7478 - mae: 26.4844 - val_loss: 424.1042 - val_mse: 424.1042 - val_mae: 17.6978\n",
"Epoch 42/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 797.5147 - mse: 797.5147 - mae: 17.9342 - val_loss: 154.0988 - val_mse: 154.0988 - val_mae: 11.1903\n",
"Epoch 43/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 831.6021 - mse: 831.6021 - mae: 13.1936 - val_loss: 239.0028 - val_mse: 239.0028 - val_mae: 11.4595\n",
"Epoch 44/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1378.4828 - mse: 1378.4828 - mae: 21.2822 - val_loss: 147.1168 - val_mse: 147.1168 - val_mae: 10.8729\n",
"Epoch 45/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 966.5891 - mse: 966.5891 - mae: 14.6834 - val_loss: 174.8909 - val_mse: 174.8909 - val_mae: 12.0051\n",
"Epoch 46/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1062.7526 - mse: 1062.7526 - mae: 18.5904 - val_loss: 204.7732 - val_mse: 204.7732 - val_mae: 9.9953\n",
"Epoch 47/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1108.6405 - mse: 1108.6405 - mae: 16.2776 - val_loss: 123.6179 - val_mse: 123.6179 - val_mae: 9.6034\n",
"Epoch 48/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1234.2459 - mse: 1234.2459 - mae: 21.0649 - val_loss: 840.3122 - val_mse: 840.3122 - val_mae: 26.9817\n",
"Epoch 49/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1271.3397 - mse: 1271.3397 - mae: 18.4077 - val_loss: 119.0835 - val_mse: 119.0835 - val_mae: 9.2765\n",
"Epoch 50/200\n",
"24/24 [==============================] - 0s 6ms/step - loss: 1196.1174 - mse: 1196.1174 - mae: 14.1855 - val_loss: 198.4570 - val_mse: 198.4570 - val_mae: 12.8469\n",
"Epoch 51/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1661.9437 - mse: 1661.9437 - mae: 21.4165 - val_loss: 116.4653 - val_mse: 116.4653 - val_mae: 9.0376\n",
"Epoch 52/200\n",
"24/24 [==============================] - 0s 4ms/step - loss: 1127.2013 - mse: 1127.2013 - mae: 13.9932 - val_loss: 126.1223 - val_mse: 126.1223 - val_mae: 9.7521\n",
"Epoch 53/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1413.4044 - mse: 1413.4044 - mae: 15.7517 - val_loss: 175.1726 - val_mse: 175.1726 - val_mae: 11.9828\n",
"Epoch 54/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 856.3741 - mse: 856.3741 - mae: 13.3167 - val_loss: 198.8943 - val_mse: 198.8943 - val_mae: 12.8323\n",
"Epoch 55/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 941.9539 - mse: 941.9539 - mae: 13.0230 - val_loss: 646.4907 - val_mse: 646.4907 - val_mae: 23.3627\n",
"Epoch 56/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1016.8651 - mse: 1016.8651 - mae: 20.1681 - val_loss: 161.1773 - val_mse: 161.1773 - val_mae: 8.2113\n",
"Epoch 57/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1305.5409 - mse: 1305.5409 - mae: 15.0694 - val_loss: 513.0620 - val_mse: 513.0620 - val_mae: 20.6921\n",
"Epoch 58/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1153.4232 - mse: 1153.4232 - mae: 18.2926 - val_loss: 314.4951 - val_mse: 314.4951 - val_mae: 16.1617\n",
"Epoch 59/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1062.8382 - mse: 1062.8382 - mae: 16.1439 - val_loss: 121.0605 - val_mse: 121.0605 - val_mae: 9.3912\n",
"Epoch 60/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1091.9500 - mse: 1091.9500 - mae: 13.7090 - val_loss: 384.5555 - val_mse: 384.5555 - val_mae: 17.8593\n",
"Epoch 61/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1378.3453 - mse: 1378.3453 - mae: 19.9986 - val_loss: 111.3288 - val_mse: 111.3288 - val_mae: 8.5358\n",
"Epoch 62/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1186.0439 - mse: 1186.0439 - mae: 17.9480 - val_loss: 532.5379 - val_mse: 532.5379 - val_mae: 21.0756\n",
"Epoch 63/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1200.0686 - mse: 1200.0686 - mae: 21.1024 - val_loss: 411.9507 - val_mse: 411.9507 - val_mae: 18.4528\n",
"Epoch 64/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1179.6294 - mse: 1179.6294 - mae: 18.5663 - val_loss: 117.9206 - val_mse: 117.9206 - val_mae: 7.0563\n",
"Epoch 65/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 819.1951 - mse: 819.1951 - mae: 12.9555 - val_loss: 891.6452 - val_mse: 891.6452 - val_mae: 27.8321\n",
"Epoch 66/200\n",
"24/24 [==============================] - 0s 6ms/step - loss: 1707.1665 - mse: 1707.1665 - mae: 28.6738 - val_loss: 298.9107 - val_mse: 298.9107 - val_mae: 13.7058\n",
"Epoch 67/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1376.3056 - mse: 1376.3056 - mae: 23.8581 - val_loss: 164.0730 - val_mse: 164.0730 - val_mae: 11.5222\n",
"Epoch 68/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 806.4895 - mse: 806.4895 - mae: 13.2183 - val_loss: 117.6989 - val_mse: 117.6989 - val_mae: 9.1474\n",
"Epoch 69/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1211.8929 - mse: 1211.8929 - mae: 17.0674 - val_loss: 115.6132 - val_mse: 115.6132 - val_mae: 8.8647\n",
"Epoch 70/200\n",
"24/24 [==============================] - 0s 6ms/step - loss: 1207.6364 - mse: 1207.6364 - mae: 16.6811 - val_loss: 173.7342 - val_mse: 173.7342 - val_mae: 11.8685\n",
"Epoch 71/200\n",
"24/24 [==============================] - 0s 6ms/step - loss: 837.6533 - mse: 837.6533 - mae: 13.7705 - val_loss: 262.8142 - val_mse: 262.8142 - val_mae: 12.5115\n",
"Epoch 72/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1156.0423 - mse: 1156.0423 - mae: 15.8287 - val_loss: 196.2815 - val_mse: 196.2815 - val_mae: 12.6703\n",
"Epoch 73/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1049.0590 - mse: 1049.0590 - mae: 14.5537 - val_loss: 151.6830 - val_mse: 151.6830 - val_mae: 10.9967\n",
"Epoch 74/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1299.6593 - mse: 1299.6593 - mae: 16.5733 - val_loss: 141.2736 - val_mse: 141.2736 - val_mae: 10.5068\n",
"Epoch 75/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1040.4066 - mse: 1040.4065 - mae: 14.9123 - val_loss: 106.6511 - val_mse: 106.6511 - val_mae: 7.8566\n",
"Epoch 76/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1256.9377 - mse: 1256.9377 - mae: 17.0805 - val_loss: 152.6314 - val_mse: 152.6314 - val_mae: 7.9269\n",
"Epoch 77/200\n",
"24/24 [==============================] - 0s 6ms/step - loss: 1337.9971 - mse: 1337.9971 - mae: 17.2204 - val_loss: 123.6311 - val_mse: 123.6311 - val_mae: 7.0324\n",
"Epoch 78/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1146.1462 - mse: 1146.1462 - mae: 18.6032 - val_loss: 267.2151 - val_mse: 267.2151 - val_mae: 12.7348\n",
"Epoch 79/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 936.7543 - mse: 936.7543 - mae: 17.3673 - val_loss: 253.6791 - val_mse: 253.6791 - val_mae: 12.2268\n",
"Epoch 80/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 904.0339 - mse: 904.0339 - mae: 14.0378 - val_loss: 442.3156 - val_mse: 442.3156 - val_mae: 19.1381\n",
"Epoch 81/200\n",
"24/24 [==============================] - 0s 6ms/step - loss: 1274.1152 - mse: 1274.1152 - mae: 17.3646 - val_loss: 140.3253 - val_mse: 140.3253 - val_mae: 10.4527\n",
"Epoch 82/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 742.8571 - mse: 742.8571 - mae: 13.9511 - val_loss: 234.4916 - val_mse: 234.4916 - val_mae: 13.8849\n",
"Epoch 83/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1083.5569 - mse: 1083.5569 - mae: 18.7618 - val_loss: 286.0374 - val_mse: 286.0374 - val_mae: 15.3498\n",
"Epoch 84/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 974.8519 - mse: 974.8519 - mae: 18.0006 - val_loss: 136.2307 - val_mse: 136.2307 - val_mae: 7.3365\n",
"Epoch 85/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1496.3928 - mse: 1496.3928 - mae: 21.2628 - val_loss: 136.6375 - val_mse: 136.6375 - val_mae: 7.3406\n",
"Epoch 86/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1048.9065 - mse: 1048.9065 - mae: 14.8730 - val_loss: 277.3787 - val_mse: 277.3787 - val_mae: 15.1386\n",
"Epoch 87/200\n",
"24/24 [==============================] - 0s 6ms/step - loss: 1611.7761 - mse: 1611.7761 - mae: 18.4462 - val_loss: 114.7812 - val_mse: 114.7812 - val_mae: 8.8392\n",
"Epoch 88/200\n",
"24/24 [==============================] - 0s 6ms/step - loss: 564.0368 - mse: 564.0368 - mae: 10.3493 - val_loss: 123.1416 - val_mse: 123.1416 - val_mae: 7.0182\n",
"Epoch 89/200\n",
"24/24 [==============================] - 0s 6ms/step - loss: 958.9688 - mse: 958.9688 - mae: 16.3055 - val_loss: 158.9483 - val_mse: 158.9483 - val_mae: 11.2520\n",
"Epoch 90/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 909.5599 - mse: 909.5599 - mae: 12.6107 - val_loss: 109.6177 - val_mse: 109.6177 - val_mae: 8.2895\n",
"Epoch 91/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 699.3389 - mse: 699.3389 - mae: 11.1328 - val_loss: 177.1954 - val_mse: 177.1954 - val_mae: 8.8965\n",
"Epoch 92/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1122.4775 - mse: 1122.4775 - mae: 16.3809 - val_loss: 139.4140 - val_mse: 139.4140 - val_mae: 7.3815\n",
"Epoch 93/200\n",
"24/24 [==============================] - 0s 6ms/step - loss: 1302.3217 - mse: 1302.3217 - mae: 19.3611 - val_loss: 125.1656 - val_mse: 125.1656 - val_mae: 7.0378\n",
"Epoch 94/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1101.0801 - mse: 1101.0801 - mae: 13.7593 - val_loss: 134.9294 - val_mse: 134.9294 - val_mae: 10.1440\n",
"Epoch 95/200\n",
"24/24 [==============================] - 0s 6ms/step - loss: 1017.9091 - mse: 1017.9091 - mae: 14.4911 - val_loss: 453.7914 - val_mse: 453.7914 - val_mae: 19.3710\n",
"Epoch 96/200\n",
"24/24 [==============================] - 0s 6ms/step - loss: 1132.7393 - mse: 1132.7393 - mae: 18.1794 - val_loss: 167.0961 - val_mse: 167.0961 - val_mae: 11.5404\n",
"Epoch 97/200\n",
"24/24 [==============================] - 0s 6ms/step - loss: 597.2760 - mse: 597.2760 - mae: 11.9976 - val_loss: 106.5979 - val_mse: 106.5979 - val_mae: 7.3433\n",
"Epoch 98/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1068.8889 - mse: 1068.8889 - mae: 12.5261 - val_loss: 110.6210 - val_mse: 110.6210 - val_mae: 7.0936\n",
"Epoch 99/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1572.3833 - mse: 1572.3833 - mae: 14.3286 - val_loss: 204.0423 - val_mse: 204.0423 - val_mae: 10.1593\n",
"Epoch 100/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1260.2636 - mse: 1260.2636 - mae: 16.6500 - val_loss: 213.8260 - val_mse: 213.8260 - val_mae: 10.6639\n",
"Epoch 101/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 2114.2506 - mse: 2114.2506 - mae: 18.0639 - val_loss: 108.8281 - val_mse: 108.8281 - val_mae: 7.0404\n",
"Epoch 102/200\n",
"24/24 [==============================] - 0s 6ms/step - loss: 1550.9013 - mse: 1550.9013 - mae: 17.4429 - val_loss: 333.2701 - val_mse: 333.2701 - val_mae: 16.6146\n",
"Epoch 103/200\n",
"24/24 [==============================] - 0s 6ms/step - loss: 972.7313 - mse: 972.7313 - mae: 16.5055 - val_loss: 167.8441 - val_mse: 167.8441 - val_mae: 8.6071\n",
"Epoch 104/200\n",
"24/24 [==============================] - 0s 6ms/step - loss: 915.2030 - mse: 915.2030 - mae: 13.0102 - val_loss: 121.0113 - val_mse: 121.0113 - val_mae: 9.3378\n",
"Epoch 105/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1177.8921 - mse: 1177.8921 - mae: 14.2319 - val_loss: 111.6314 - val_mse: 111.6314 - val_mae: 7.0199\n",
"Epoch 106/200\n",
"24/24 [==============================] - 0s 6ms/step - loss: 1260.1801 - mse: 1260.1801 - mae: 18.8987 - val_loss: 555.5529 - val_mse: 555.5529 - val_mae: 21.5290\n",
"Epoch 107/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1112.1883 - mse: 1112.1883 - mae: 19.1045 - val_loss: 120.0679 - val_mse: 120.0679 - val_mae: 9.2492\n",
"Epoch 108/200\n",
"24/24 [==============================] - 0s 6ms/step - loss: 910.2235 - mse: 910.2235 - mae: 14.3492 - val_loss: 692.5959 - val_mse: 692.5959 - val_mae: 24.2521\n",
"Epoch 109/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1125.8857 - mse: 1125.8857 - mae: 19.7076 - val_loss: 139.9741 - val_mse: 139.9741 - val_mae: 7.3649\n",
"Epoch 110/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1090.4381 - mse: 1090.4381 - mae: 12.5203 - val_loss: 113.1024 - val_mse: 113.1024 - val_mae: 8.7822\n",
"Epoch 111/200\n",
"24/24 [==============================] - 0s 6ms/step - loss: 1214.5478 - mse: 1214.5478 - mae: 13.9910 - val_loss: 188.1500 - val_mse: 188.1500 - val_mae: 12.3962\n",
"Epoch 112/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1137.3684 - mse: 1137.3684 - mae: 13.9807 - val_loss: 111.3473 - val_mse: 111.3473 - val_mae: 6.9861\n",
"Epoch 113/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 991.5284 - mse: 991.5284 - mae: 13.6592 - val_loss: 107.2623 - val_mse: 107.2623 - val_mae: 7.9624\n",
"Epoch 114/200\n",
"24/24 [==============================] - 0s 6ms/step - loss: 849.4387 - mse: 849.4387 - mae: 12.8648 - val_loss: 202.0173 - val_mse: 202.0173 - val_mae: 12.8318\n",
"Epoch 115/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1206.8839 - mse: 1206.8839 - mae: 16.1466 - val_loss: 118.2808 - val_mse: 118.2808 - val_mae: 6.9445\n",
"Epoch 116/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1155.0584 - mse: 1155.0584 - mae: 14.5383 - val_loss: 181.6214 - val_mse: 181.6214 - val_mae: 9.1339\n",
"Epoch 117/200\n",
"24/24 [==============================] - 0s 6ms/step - loss: 1510.1459 - mse: 1510.1459 - mae: 15.3983 - val_loss: 108.0808 - val_mse: 108.0808 - val_mae: 8.0678\n",
"Epoch 118/200\n",
"24/24 [==============================] - 0s 6ms/step - loss: 1067.0936 - mse: 1067.0936 - mae: 13.5313 - val_loss: 106.8765 - val_mse: 106.8765 - val_mae: 7.6210\n",
"Epoch 119/200\n",
"24/24 [==============================] - 0s 6ms/step - loss: 803.6539 - mse: 803.6539 - mae: 11.0354 - val_loss: 108.3178 - val_mse: 108.3178 - val_mae: 7.1958\n",
"Epoch 120/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1009.7448 - mse: 1009.7448 - mae: 12.5416 - val_loss: 107.0687 - val_mse: 107.0687 - val_mae: 7.9120\n",
"Epoch 121/200\n",
"24/24 [==============================] - 0s 6ms/step - loss: 1261.6091 - mse: 1261.6091 - mae: 13.4770 - val_loss: 119.5757 - val_mse: 119.5757 - val_mae: 6.9696\n",
"Epoch 122/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1022.4550 - mse: 1022.4550 - mae: 12.6712 - val_loss: 122.0315 - val_mse: 122.0315 - val_mae: 9.3596\n",
"Epoch 123/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 679.2376 - mse: 679.2376 - mae: 11.0181 - val_loss: 217.8362 - val_mse: 217.8362 - val_mae: 13.3024\n",
"Epoch 124/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 910.8362 - mse: 910.8362 - mae: 14.3532 - val_loss: 274.4497 - val_mse: 274.4497 - val_mae: 15.0458\n",
"Epoch 125/200\n",
"24/24 [==============================] - 0s 6ms/step - loss: 1115.6848 - mse: 1115.6848 - mae: 14.1580 - val_loss: 179.4435 - val_mse: 179.4435 - val_mae: 12.0441\n",
"Epoch 126/200\n",
"24/24 [==============================] - 0s 6ms/step - loss: 1329.8046 - mse: 1329.8046 - mae: 14.8988 - val_loss: 135.0968 - val_mse: 135.0968 - val_mae: 7.2658\n",
"Epoch 127/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1100.2083 - mse: 1100.2083 - mae: 13.8630 - val_loss: 164.4935 - val_mse: 164.4935 - val_mae: 11.4859\n",
"Epoch 128/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1707.5715 - mse: 1707.5715 - mae: 15.1943 - val_loss: 138.6945 - val_mse: 138.6945 - val_mae: 10.3110\n",
"Epoch 129/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1432.8339 - mse: 1432.8339 - mae: 14.1418 - val_loss: 106.0038 - val_mse: 106.0038 - val_mae: 7.3758\n",
"Epoch 130/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1298.5388 - mse: 1298.5388 - mae: 14.0785 - val_loss: 140.6155 - val_mse: 140.6155 - val_mae: 10.4742\n",
"Epoch 131/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 703.7685 - mse: 703.7685 - mae: 13.4920 - val_loss: 334.0876 - val_mse: 334.0876 - val_mae: 16.6226\n",
"Epoch 132/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 905.5967 - mse: 905.5967 - mae: 17.3644 - val_loss: 106.5314 - val_mse: 106.5314 - val_mae: 7.5379\n",
"Epoch 133/200\n",
"24/24 [==============================] - 0s 6ms/step - loss: 561.6799 - mse: 561.6799 - mae: 10.5569 - val_loss: 432.0350 - val_mse: 432.0350 - val_mae: 18.9208\n",
"Epoch 134/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 1361.1532 - mse: 1361.1532 - mae: 18.1798 - val_loss: 155.4801 - val_mse: 155.4801 - val_mae: 11.1707\n",
"Epoch 135/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 914.7711 - mse: 914.7711 - mae: 11.9147 - val_loss: 204.7598 - val_mse: 204.7598 - val_mae: 12.9827\n",
"Epoch 136/200\n",
"24/24 [==============================] - 0s 5ms/step - loss: 983.6263 - mse: 983.6263 - mae: 14.2839 - val_loss: 138.7785 - val_mse: 138.7785 - val_mae: 7.4258\n",
"Epoch 137/200\n",
"24/24 [==============================] - 0s 6ms/step - loss: 1051.6069...