Combining CNN and LSTM models (a Keras implementation)


The CNN-LSTM model
Runtime environment: Python 3.6.5, Keras 2.1.5, TensorFlow 2.3.1, among others.
The Sequential() version of CNN-LSTM:
from keras.models import Sequential
from keras.layers import Dense, Dropout, Activation, Embedding
from keras.layers import Convolution1D, MaxPooling1D
from keras.layers import LSTM
# Model parameters
max_features = 20000   # vocabulary size (matches the 2,560,000 embedding parameters in the summary below)
embedding_size = 128   # word-embedding dimension
time_step = 100        # input sequence length
# Convolution
filter_length = 5      # kernel size
nb_filter = 64         # number of filters
pool_length = 4        # pooling size
# LSTM
lstm_output_size = 70  # LSTM layer output size
# Training
batch_size = 30        # batch size
nb_epoch = 2           # number of epochs
# Build the model
model = Sequential()
# Word-embedding layer: maps each word index to a 128-dimensional vector
model.add(Embedding(max_features, embedding_size, input_length=time_step))
model.add(Dropout(0.25))  # Dropout layer
# 1D convolution over the embedded word sequence
model.add(Convolution1D(filters=nb_filter,
                        kernel_size=filter_length,
                        padding='valid',
                        activation='relu',
                        strides=1))
# Pooling layer
model.add(MaxPooling1D(pool_size=pool_length))
# LSTM recurrent layer
model.add(LSTM(lstm_output_size))
# Fully connected layer with a single unit: the positive-sentiment score
model.add(Dense(1))
model.add(Activation('sigmoid'))  # sigmoid output for binary sentiment classification
model.summary()  # print the model summary
# Compile: binary cross-entropy matches the single sigmoid output
model.compile(loss='binary_crossentropy',
              optimizer='adam',
              metrics=['accuracy'])
# Train
model.fit(X_train, y_train, batch_size=batch_size, epochs=nb_epoch,
          validation_data=(X_test, y_test))
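The training call above assumes X_train, y_train, X_test and y_test already exist as padded integer sequences. A minimal sketch of how they could be prepared, assuming the IMDB sentiment dataset that ships with Keras (the dataset choice is an assumption for illustration; any 0/1-labelled text corpus padded to time_step tokens would work the same way):

from keras.datasets import imdb
from keras.preprocessing import sequence

max_features = 20000   # assumed vocabulary size, consistent with the embedding layer above
time_step = 100        # sequence length expected by the model

# Load integer-encoded reviews, keeping only the `max_features` most frequent words
(X_train, y_train), (X_test, y_test) = imdb.load_data(num_words=max_features)

# Pad / truncate every review to exactly `time_step` tokens
X_train = sequence.pad_sequences(X_train, maxlen=time_step)
X_test = sequence.pad_sequences(X_test, maxlen=time_step)

print(X_train.shape)   # (25000, 100)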
An input of shape [100, 128] (a sequence of 100 time steps, each a 128-dimensional word embedding) becomes [96, 64] after the Convolution1D layer (filters=64, kernel_size=5), and then [24, 64] after MaxPooling1D (pool_size=4). The resulting model.summary() output is:
Layer (type)                     Output Shape          Param #     Connected to
====================================================================================================
embedding_1 (Embedding)          (None, 100, 128)      2560000     embedding_input_1[0][0]
____________________________________________________________________________________________________
dropout_1 (Dropout)              (None, 100, 128)      0           embedding_1[0][0]
____________________________________________________________________________________________________
convolution1d_1 (Convolution1D)  (None, 96, 64)        41024       dropout_1[0][0]
____________________________________________________________________________________________________
maxpooling1d_1 (MaxPooling1D)    (None, 24, 64)        0           convolution1d_1[0][0]
____________________________________________________________________________________________________
lstm_1 (LSTM)                    (None, 70)            37800       maxpooling1d_1[0][0]
____________________________________________________________________________________________________
dense_1 (Dense)                  (None, 1)             71          lstm_1[0][0]
____________________________________________________________________________________________________
activation_1 (Activation)        (None, 1)             0           dense_1[0][0]
====================================================================================================
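The 96 and 24 in the summary can be verified by hand: with 'valid' padding, a 1D convolution with kernel size k over a length-L sequence produces L - k + 1 steps, and max pooling with pool size p then keeps floor(length / p) of them. A small illustrative check (not part of the original post):

seq_len, kernel_size, pool_size = 100, 5, 4
conv_len = seq_len - kernel_size + 1   # 'valid' convolution: 100 - 5 + 1 = 96
pool_len = conv_len // pool_size       # max pooling: 96 // 4 = 24
print(conv_len, pool_len)              # 96 24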
The functional-API (structured) version of CNN-LSTM:
from keras.models import Model
from keras.layers import Input

inputs = Input(shape=(1, 3))  # input shape: 1 time step, 3 features
a = Dropout(0.25)(inputs)
conv = Convolution1D(10, 1, strides=1, padding="valid", dilation_rate=1)(a)  # filters=10, kernel_size=1
# pool_size=1 keeps the length-1 sequence (consistent with the summary below)
pool = MaxPooling1D(pool_size=1)(conv)
lstm1 = LSTM(lstm_output_size)(pool)
output = Dense(1, activation='linear')(lstm1)  # single linear output (a simple regression problem here)
model = Model(inputs=inputs, outputs=output)   # build the model from inputs to outputs
model.summary()
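The functional model above is only built and summarised. A hedged sketch of how it could be compiled and trained follows; the random data and the mean-squared-error loss are assumptions chosen to match the "simple regression" comment on the output layer:

import numpy as np

# Dummy regression data matching the (samples, 1, 3) input and the scalar output
X_dummy = np.random.rand(200, 1, 3)
y_dummy = np.random.rand(200, 1)

model.compile(optimizer='adam', loss='mse')  # MSE fits the single linear output unit
model.fit(X_dummy, y_dummy, batch_size=16, epochs=2, validation_split=0.2)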
The model's layer shapes are:
_________________________________________________________________
Layer (type)                Output Shape              Param #
=================================================================
input_1 (InputLayer)        (None, 1, 3)              0
_________________________________________________________________
dropout_1 (Dropout)          (None, 1, 3)              0
_________________________________________________________________
conv1d_1 (Conv1D)            (None, 1, 10)            40
_________________________________________________________________
max_pooling1d_1 (MaxPooling1 (None, 1, 10)            0
_________________________________________________________________
lstm_1 (LSTM)                (None, 70)                22680
_________________________________________________________________
dense_1 (Dense)              (None, 1)                 71
=================================================================
Total params: 22,791
Trainable params: 22,791
Non-trainable params: 0
_________________________________________________________________
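The Param # columns in both summaries can also be reproduced with the standard Conv1D and LSTM parameter-count formulas. The helpers below are a small illustrative check, not code from the original post:

def conv1d_params(input_dim, filters, kernel_size):
    # kernel weights plus one bias per filter
    return filters * kernel_size * input_dim + filters

def lstm_params(input_dim, units):
    # 4 gates, each with a kernel, a recurrent kernel and a bias
    return 4 * (units * (input_dim + units) + units)

print(conv1d_params(128, 64, 5))  # 41024  (first model's Convolution1D)
print(lstm_params(64, 70))        # 37800  (first model's LSTM)
print(conv1d_params(3, 10, 1))    # 40     (second model's Conv1D)
print(lstm_params(10, 70))        # 22680  (second model's LSTM)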
