
TF LSTM: Using the LSTM algorithm for regression prediction (house-price prediction) on the Boston housing dataset [13+1 features, 506 samples]

Date: 2021-07-20 21:21:49

Contents

Output

TensorBoard visualization

Design approach

Core code

Output

First 10 rows of the Boston housing dataset

boston.data: (506, 13)
[[6.3200e-03 1.8000e+01 2.3100e+00 0.0000e+00 5.3800e-01 6.5750e+00 6.5200e+01 4.0900e+00 1.0000e+00 2.9600e+02 1.5300e+01 3.9690e+02 4.9800e+00]
 [2.7310e-02 0.0000e+00 7.0700e+00 0.0000e+00 4.6900e-01 6.4210e+00 7.8900e+01 4.9671e+00 2.0000e+00 2.4200e+02 1.7800e+01 3.9690e+02 9.1400e+00]
 [2.7290e-02 0.0000e+00 7.0700e+00 0.0000e+00 4.6900e-01 7.1850e+00 6.1100e+01 4.9671e+00 2.0000e+00 2.4200e+02 1.7800e+01 3.9283e+02 4.0300e+00]
 [3.2370e-02 0.0000e+00 2.1800e+00 0.0000e+00 4.5800e-01 6.9980e+00 4.5800e+01 6.0622e+00 3.0000e+00 2.2200e+02 1.8700e+01 3.9463e+02 2.9400e+00]
 [6.9050e-02 0.0000e+00 2.1800e+00 0.0000e+00 4.5800e-01 7.1470e+00 5.4200e+01 6.0622e+00 3.0000e+00 2.2200e+02 1.8700e+01 3.9690e+02 5.3300e+00]
 [2.9850e-02 0.0000e+00 2.1800e+00 0.0000e+00 4.5800e-01 6.4300e+00 5.8700e+01 6.0622e+00 3.0000e+00 2.2200e+02 1.8700e+01 3.9412e+02 5.2100e+00]
 [8.8290e-02 1.2500e+01 7.8700e+00 0.0000e+00 5.2400e-01 6.0120e+00 6.6600e+01 5.5605e+00 5.0000e+00 3.1100e+02 1.5200e+01 3.9560e+02 1.2430e+01]
 [1.4455e-01 1.2500e+01 7.8700e+00 0.0000e+00 5.2400e-01 6.1720e+00 9.6100e+01 5.9505e+00 5.0000e+00 3.1100e+02 1.5200e+01 3.9690e+02 1.9150e+01]
 [2.1124e-01 1.2500e+01 7.8700e+00 0.0000e+00 5.2400e-01 5.6310e+00 1.0000e+02 6.0821e+00 5.0000e+00 3.1100e+02 1.5200e+01 3.8663e+02 2.9930e+01]
 [1.7004e-01 1.2500e+01 7.8700e+00 0.0000e+00 5.2400e-01 6.0040e+00 8.5900e+01 6.5921e+00 5.0000e+00 3.1100e+02 1.5200e+01 3.8671e+02 1.7100e+01]]
boston.target: (506,)
[24.  21.6 34.7 33.4 36.2 28.7 22.9 27.1 16.5 18.9]
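For reference, the printout above can be reproduced with scikit-learn's Boston loader. This is a minimal sketch and not part of the original post; it assumes sklearn.datasets.load_boston, which was available before scikit-learn 1.2 (the loader has since been removed).

from sklearn import datasets

# Assumed loader: load_boston() was removed in scikit-learn 1.2, so an older
# scikit-learn version is needed to run this as-is.
boston = datasets.load_boston()
print('boston.data:', boston.data.shape)      # (506, 13): 13 features per sample
print(boston.data[:10])                       # first 10 feature rows, as shown above
print('boston.target:', boston.target.shape)  # (506,): 1 target (median house price)
print(boston.target[:10])                     # first 10 target values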

TensorBoard visualization

Design approach

Core code

def add_input_layer(self):
    # Flatten (batch, n_steps, input_size) -> (batch * n_steps, input_size) for the linear layer
    l_in_x = tf.reshape(self.xs, [-1, self.input_size], name='2_2D')
    Ws_in = self._weight_variable([self.input_size, self.cell_size])
    bs_in = self._bias_variable([self.cell_size, ])
    # l_in_y = (batch * n_steps, cell_size)
    with tf.name_scope('Wx_plus_b'):
        l_in_y = tf.matmul(l_in_x, Ws_in) + bs_in
    # Reshape back to (batch, n_steps, cell_size) for the LSTM cell
    self.l_in_y = tf.reshape(l_in_y, [-1, self.n_steps, self.cell_size], name='2_3D')

def add_cell(self):
    lstm_cell = tf.nn.rnn_cell.BasicLSTMCell(self.cell_size, forget_bias=1.0, state_is_tuple=True)
    with tf.name_scope('initial_state'):
        self.cell_init_state = lstm_cell.zero_state(self.batch_size, dtype=tf.float32)
    # Unroll the LSTM over the time dimension (batch-major input)
    self.cell_outputs, self.cell_final_state = tf.nn.dynamic_rnn(
        lstm_cell, self.l_in_y, initial_state=self.cell_init_state, time_major=False)

def add_output_layer(self):
    # shape = (batch * steps, cell_size)
    l_out_x = tf.reshape(self.cell_outputs, [-1, self.cell_size], name='2_2D')
    Ws_out = self._weight_variable([self.cell_size, self.output_size])
    bs_out = self._bias_variable([self.output_size, ])
    # shape = (batch * steps, output_size)
    with tf.name_scope('Wx_plus_b'):
        self.pred = tf.matmul(l_out_x, Ws_out) + bs_out

def compute_cost(self):
    # Per-timestep squared error, weighted uniformly across the sequence
    losses = tf.contrib.legacy_seq2seq.sequence_loss_by_example(
        [tf.reshape(self.pred, [-1], name='reshape_pred')],
        [tf.reshape(self.ys, [-1], name='reshape_target')],
        [tf.ones([self.batch_size * self.n_steps], dtype=tf.float32)],
        average_across_timesteps=True,
        softmax_loss_function=self.ms_error,
        name='losses')
    with tf.name_scope('average_cost'):
        self.cost = tf.div(
            tf.reduce_sum(losses, name='losses_sum'),
            self.batch_size,
            name='average_cost')
        tf.summary.scalar('cost', self.cost)
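The four methods above reference attributes and helpers (self.xs, self.ys, self._weight_variable, self._bias_variable, self.ms_error) that are defined elsewhere in the model class. The skeleton below is a hedged sketch of how they typically fit together in the TensorFlow 1.x LSTM-regression pattern; the constructor signature, the Adam optimizer, and the helper implementations are assumptions, not taken from the original post.

# Assumed class skeleton around the four methods above (requires TensorFlow 1.x,
# since it uses tf.contrib and tf.placeholder).
import tensorflow as tf

class LSTMRNN(object):
    def __init__(self, n_steps, input_size, output_size, cell_size, batch_size, lr=0.006):
        self.n_steps = n_steps
        self.input_size = input_size
        self.output_size = output_size
        self.cell_size = cell_size
        self.batch_size = batch_size
        with tf.name_scope('inputs'):
            self.xs = tf.placeholder(tf.float32, [None, n_steps, input_size], name='xs')
            self.ys = tf.placeholder(tf.float32, [None, n_steps, output_size], name='ys')
        with tf.variable_scope('in_hidden'):
            self.add_input_layer()
        with tf.variable_scope('LSTM_cell'):
            self.add_cell()
        with tf.variable_scope('out_hidden'):
            self.add_output_layer()
        with tf.name_scope('cost'):
            self.compute_cost()
        with tf.name_scope('train'):
            # Optimizer and learning rate are assumptions
            self.train_op = tf.train.AdamOptimizer(lr).minimize(self.cost)

    @staticmethod
    def ms_error(labels, logits):
        # Squared error, passed as the "softmax_loss_function" in compute_cost
        return tf.square(tf.subtract(labels, logits))

    def _weight_variable(self, shape, name='weights'):
        initializer = tf.random_normal_initializer(mean=0., stddev=1.)
        return tf.get_variable(name=name, shape=shape, initializer=initializer)

    def _bias_variable(self, shape, name='biases'):
        initializer = tf.constant_initializer(0.1)
        return tf.get_variable(name=name, shape=shape, initializer=initializer)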

1. Log of the (incrementally advancing) index batch windows over the training set

Training-set (incremental) index batch window: 0 300
Training-set (incremental) index batch window: 10 310
Training-set (incremental) index batch window: 20 320
Training-set (incremental) index batch window: 30 330
Training-set (incremental) index batch window: 40 340
Training-set (incremental) index batch window: 50 350
Training-set (incremental) index batch window: 60 360
Training-set (incremental) index batch window: 70 370
Training-set (incremental) index batch window: 80 380
Training-set (incremental) index batch window: 90 390
Training-set (incremental) index batch window: 100 400
Training-set (incremental) index batch window: 110 410
Training-set (incremental) index batch window: 120 420
Training-set (incremental) index batch window: 130 430
Training-set (incremental) index batch window: 140 440
Training-set (incremental) index batch window: 150 450
Training-set (incremental) index batch window: 160 460
Training-set (incremental) index batch window: 170 470
Training-set (incremental) index batch window: 180 480
Training-set (incremental) index batch window: 190 490
Training-set (incremental) index batch window: 200 500
0 cost: 6.6038
Training-set (incremental) index batch window: 0 300
……
Training-set (incremental) index batch window: 190 490
1 cost: 3.8826
……
2 cost: 2.7715
……
8 cost: 1.0885
……
55 cost: 0.1853
……
198 cost: 0.058
Training-set (incremental) index batch window: 0 300
……
Training-set (incremental) index batch window: 190 490
199 cost: 0.0424
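The log suggests a training loop in which each batch is a sliding window of 300 consecutive training indices whose start advances by 10, and the average cost is printed once per epoch. The sketch below is an assumed reconstruction of that loop, not the original script; the names TRAIN_SIZE, BATCH_LEN, STRIDE, train_x, train_y, and model are hypothetical.

# Assumed reconstruction of the training loop implied by the log above.
# 'model' is an LSTMRNN instance (see the sketch after the core code); train_x and
# train_y are the prepared training arrays. Both names are hypothetical.
TRAIN_SIZE = 500   # number of training samples covered by the windows
BATCH_LEN = 300    # length of each index window (assumes batch_size * n_steps == 300)
STRIDE = 10        # step between consecutive window start indices

sess = tf.Session()
sess.run(tf.global_variables_initializer())

for epoch in range(200):
    for start in range(0, TRAIN_SIZE - BATCH_LEN, STRIDE):
        end = start + BATCH_LEN
        print('Training-set (incremental) index batch window:', start, end)
        xs_batch = train_x[start:end].reshape(model.batch_size, model.n_steps, model.input_size)
        ys_batch = train_y[start:end].reshape(model.batch_size, model.n_steps, model.output_size)
        _, cost = sess.run([model.train_op, model.cost],
                           feed_dict={model.xs: xs_batch, model.ys: ys_batch})
    print(epoch, 'cost:', round(float(cost), 4))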
