What is this error? Please help!!!

Strategy Sharing

(wygwsg) #1
Clone Strategy

Predicting Stock Prices with Deep Learning

Version v1.0

Contents

  • Trading rules of the deep learning strategy

  • Strategy construction steps

  • Strategy implementation

Main text

I. Trading Rules of the Deep Learning Strategy

  • Buy condition: if the predicted probability of an up move is > 0.5, buy or keep the existing position.
  • Sell condition: if the predicted probability of an up move is < 0.5, sell the existing holdings.
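In plain Python, the two rules above reduce to a single threshold test. This is an illustrative sketch; the function and return values are ours, not part of the platform:

```python
def target_position(prob_up, holding):
    """Decide today's action from the predicted up-move probability.

    prob_up: model output in [0, 1]; holding: True if the stock is held.
    prob_up > 0.5 means buy or keep holding; otherwise sell or stay flat.
    """
    if prob_up > 0.5:
        return "hold" if holding else "buy"
    return "sell" if holding else "flat"
```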

II. Strategy Construction Steps

1. Define the stock pool and the data date range

  • In the instruments modules m24 and m28, enter the single stock to be backtested, together with the start and end dates of the data (for the training set and the validation set respectively).

2. Choose the factors

  • In the input-features module m8, enter the N factor expressions used for prediction.

3. Fetch the base data

  • Use the basic feature extraction modules m22 and m16 to fetch base data, such as the close price, for the chosen stock pool.

4. Define and compute the labels

  • Use the auto-labeler module m21 to compute the required labels. In this example, the 10-day forward return is computed first, and each day is then labeled 1 or 0 according to the sign of that return, marking up versus down.
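A minimal stand-in for the labeling step, in plain Python (the auto-labeler module works on full price tables; here `closes` is just a list of daily closes, and the horizon defaults to the 10 days mentioned above):

```python
def make_labels(closes, horizon=10):
    """Label each day 1 if the close `horizon` days ahead is higher, else 0.

    Days without a full look-ahead window get None; in practice those
    rows are dropped before training.
    """
    labels = []
    for i, close in enumerate(closes):
        if i + horizon < len(closes):
            labels.append(1 if closes[i + horizon] > close else 0)
        else:
            labels.append(None)  # no future data available
    return labels
```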

5. Extract the factor data

  • Use the derived feature extraction modules m23 and m26 to compute the factor data.

6. Join the labels with the factor data

  • Use the join module m17 to merge the factor data with the label data.

7. Generate rolling-window sequence datasets

  • Use the rolling-window (deep learning) modules m25 and m27 to turn the training and prediction sets into sequences of a fixed window length, in preparation for model training and prediction.
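The rolling-window step can be sketched as a list of overlapping slices (window length 50 as in this example; the real module additionally clips extreme feature values):

```python
def rolling_windows(rows, window=50):
    """Slice a time-ordered list of daily feature rows into overlapping
    fixed-length sequences, one ending on each day with enough history."""
    return [rows[i:i + window] for i in range(len(rows) - window + 1)]
```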

8. Build the LSTM + CNN model architecture

  • From the module list on the left of the canvas, drag in an Input layer, a Reshape layer, a Conv2D layer, another Reshape layer, an LSTM layer, a Dropout layer, and two Dense layers to form the deep learning network, and finally assemble the layers with the "Build (deep learning)" module. Note the following:

    The Input layer's shape is rolling-window size × number of factors; in this example, 50 rows × 5 factors.

    The first Reshape layer's target shape is rolling-window size × number of factors × 1; here 50 × 5 × 1.

    In the Conv2D layer, kernel_size is the size of the sliding window; this example uses a 3-row × 5-column window with a stride of 1 row × 1 column and 32 filters. This window setting determines the parameters of the next Reshape layer.

    The second Reshape layer's target_shape is determined by the rolling-window size × number of factors together with the Conv2D window size and stride. For the 50-row × 5-factor input here, sliding a 3 × 5 window one row at a time yields 48 positions (50 − 3 + 1 = 48), so target_shape = 48 × 32 (the filter count).

    The LSTM layer's output dimensionality is set to the filter count, 32, along with an activation function.

    The Dropout layer combats overfitting by randomly dropping units; here rate is set to 0.8.

    There are two Dense layers: the first keeps the output dimensionality at 32 to match the LSTM; the second maps those 32 dimensions to a single output, the predicted label. In this example that is a continuous value between 0 and 1, which can be read as the probability of an up move.
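The window arithmetic behind the second Reshape layer can be verified directly. This sketch is just the "valid"-padding convolution output-size formula: with the example's 50 × 5 input it gives 48 × 1 window positions, and with the 19 factors used in the attached experiment it gives 48 × 15 = 720, matching that experiment's Reshape target_shape of 720 × 32:

```python
def conv2d_valid_positions(rows, cols, k_rows, k_cols, stride=1):
    """Number of Conv2D window positions along each axis with
    'valid' padding: (input - kernel) // stride + 1."""
    out_rows = (rows - k_rows) // stride + 1
    out_cols = (cols - k_cols) // stride + 1
    return out_rows, out_cols

# 50-row x 5-factor input, 3 x 5 kernel -> 48 positions down, 1 across,
# so the following Reshape flattens to 48 x 32 filters.
positions = conv2d_valid_positions(50, 5, 3, 5)
```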

9. Train the deep learning model

  • Drag the "Train (deep learning)" module m6 onto the canvas and set, in its properties, the optimizer, the loss function, the evaluation metrics, the batch_size per training step, the number of epochs, the number of GPUs, and the logging frequency.

10. Predict with the deep learning model

  • Drag the "Predict (deep learning)" module m7 onto the canvas and feed it the model output of the "Train (deep learning)" module m6 together with the validation set's rolling-window dataset; the module then predicts the probability of an up move from the stock's validation data.

11. Stitch the predictions to dates

  • A custom module m2 takes the last value of each rolling sequence window as that day's prediction and joins it with the date column of the prediction set, producing the final daily predictions.
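Stripped of the platform wrappers, the stitching in m2 amounts to pairing the flattened prediction vector with the trailing dates of the prediction set (earlier dates lack a full rolling window). A sketch with illustrative names:

```python
def attach_dates(predictions, dates):
    """Pair each prediction with a date, using the last len(predictions)
    dates of the prediction set."""
    tail = dates[-len(predictions):]
    return list(zip(tail, predictions))
```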

12. Build the strategy from the model's predictions

  • If the predicted probability of an up move for the day is greater than 0.5, hold the position or buy.

  • If the predicted probability of an up move for the day is less than 0.5, sell the stock or stay in cash.

13. Run a simulated backtest

  • In the trade module's initialize function, define the transaction costs and slippage, and read the daily up-probability predictions via context.prediction;

  • In the trade module's main function (the handle function), check each day's buy/sell signal and execute the corresponding buy or sell orders according to the trading rules.
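A toy, commission-free version of the daily loop described in steps 12 and 13 (all-in/all-out on a single stock; the names are illustrative, and the real trade module additionally handles order execution, fees, and slippage):

```python
def backtest(prices, probs, cash=1_000_000.0):
    """All-in/all-out daily loop: buy when prob > 0.5 while flat,
    sell when prob < 0.5 while holding. Returns final portfolio value."""
    shares = 0.0
    for price, prob in zip(prices, probs):
        if prob > 0.5 and shares == 0:
            shares, cash = cash / price, 0.0   # buy with all cash
        elif prob < 0.5 and shares > 0:
            cash, shares = shares * price, 0.0  # sell entire position
    last_price = prices[-1] if prices else 0.0
    return cash + shares * last_price
```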

III. Strategy Implementation

The visual strategy implementation is as follows:

    {"Description":"实验创建于2017/11/15","Summary":"","Graph":{"EdgesInternal":[{"DestinationInputPortId":"-281:options_data","SourceOutputPortId":"-214:data_1"},{"DestinationInputPortId":"-403:inputs","SourceOutputPortId":"-210:data"},{"DestinationInputPortId":"-293:inputs","SourceOutputPortId":"-210:data"},{"DestinationInputPortId":"-14834:inputs","SourceOutputPortId":"-218:data"},{"DestinationInputPortId":"-692:input_data","SourceOutputPortId":"-316:data"},{"DestinationInputPortId":"-294:input_1","SourceOutputPortId":"-320:data"},{"DestinationInputPortId":"-332:trained_model","SourceOutputPortId":"-320:data"},{"DestinationInputPortId":"-214:input_1","SourceOutputPortId":"-332:data"},{"DestinationInputPortId":"-692:features","SourceOutputPortId":"-2295:data"},{"DestinationInputPortId":"-333:features","SourceOutputPortId":"-2295:data"},{"DestinationInputPortId":"-341:features","SourceOutputPortId":"-2295:data"},{"DestinationInputPortId":"-300:features","SourceOutputPortId":"-2295:data"},{"DestinationInputPortId":"-307:features","SourceOutputPortId":"-2295:data"},{"DestinationInputPortId":"-316:features","SourceOutputPortId":"-2295:data"},{"DestinationInputPortId":"-293:outputs","SourceOutputPortId":"-259:data"},{"DestinationInputPortId":"-14841:inputs","SourceOutputPortId":"-14806:data"},{"DestinationInputPortId":"-14806:inputs","SourceOutputPortId":"-14834:data"},{"DestinationInputPortId":"-259:inputs","SourceOutputPortId":"-14841:data"},{"DestinationInputPortId":"-408:inputs","SourceOutputPortId":"-403:data"},{"DestinationInputPortId":"-446:inputs","SourceOutputPortId":"-408:data"},{"DestinationInputPortId":"-218:inputs","SourceOutputPortId":"-446:data"},{"DestinationInputPortId":"-2296:input_data","SourceOutputPortId":"-2290:data"},{"DestinationInputPortId":"-333:input_data","SourceOutputPortId":"-2296:data"},{"DestinationInputPortId":"-289:instruments","SourceOutputPortId":"-620:data"},{"DestinationInputPortId":"-300:instruments","SourceOutputPortId":"-620:data"},{
"DestinationInputPortId":"-330:input_data","SourceOutputPortId":"-692:data"},{"DestinationInputPortId":"-320:training_data","SourceOutputPortId":"-333:data"},{"DestinationInputPortId":"-332:input_data","SourceOutputPortId":"-341:data"},{"DestinationInputPortId":"-214:input_2","SourceOutputPortId":"-341:data"},{"DestinationInputPortId":"-2290:data1","SourceOutputPortId":"-289:data"},{"DestinationInputPortId":"-307:input_data","SourceOutputPortId":"-300:data"},{"DestinationInputPortId":"-2290:data2","SourceOutputPortId":"-307:data"},{"DestinationInputPortId":"-316:instruments","SourceOutputPortId":"-322:data"},{"DestinationInputPortId":"-281:instruments","SourceOutputPortId":"-322:data"},{"DestinationInputPortId":"-341:input_data","SourceOutputPortId":"-330:data"},{"DestinationInputPortId":"-214:input_3","SourceOutputPortId":"-330:data"},{"DestinationInputPortId":"-320:input_model","SourceOutputPortId":"-293:data"}],"ModuleNodes":[{"Id":"-214","ModuleId":"BigQuantSpace.cached.cached-v3","ModuleParameters":[{"Name":"run","Value":"# Python 代码入口函数,input_1/2/3 对应三个输入端,data_1/2/3 对应三个输出端\ndef bigquant_run(input_1, input_2, input_3):\n\n test_data = input_2.read_pickle()\n pred_label = input_1.read_pickle()\n pred_result = pred_label.reshape(pred_label.shape[0]) \n dt = input_3.read_df()['date'][-1*len(pred_result):]\n pred_df = pd.Series(pred_result, index=dt)\n ds = DataSource.write_df(pred_df)\n \n return Outputs(data_1=ds)\n","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"post_run","Value":"# 后处理函数,可选。输入是主函数的输出,可以在这里对数据做处理,或者返回更友好的outputs数据格式。此函数输出不会被缓存。\ndef bigquant_run(outputs):\n return 
outputs\n","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"input_ports","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"params","Value":"{}","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"output_ports","Value":"","ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"input_1","NodeId":"-214"},{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"input_2","NodeId":"-214"},{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"input_3","NodeId":"-214"}],"OutputPortsInternal":[{"Name":"data_1","NodeId":"-214","OutputType":null},{"Name":"data_2","NodeId":"-214","OutputType":null},{"Name":"data_3","NodeId":"-214","OutputType":null}],"UsePreviousResults":true,"moduleIdForCode":2,"Comment":"模型预测结果输出","CommentCollapsed":false},{"Id":"-210","ModuleId":"BigQuantSpace.dl_layer_input.dl_layer_input-v1","ModuleParameters":[{"Name":"shape","Value":"50,19","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"batch_shape","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"dtype","Value":"float32","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"sparse","Value":"False","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"name","Value":"","ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"inputs","NodeId":"-210"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-210","OutputType":null}],"UsePreviousResults":false,"moduleIdForCode":3,"Comment":"","CommentCollapsed":true},{"Id":"-218","ModuleId":"BigQuantSpace.dl_layer_lstm.dl_layer_lstm-v1","ModuleParameters":[{"Name":"units","Value":"32","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"activation","Value":"tanh","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_activation",
"Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"recurrent_activation","Value":"hard_sigmoid","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_recurrent_activation","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"use_bias","Value":"True","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"kernel_initializer","Value":"glorot_uniform","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_kernel_initializer","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"recurrent_initializer","Value":"Orthogonal","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_recurrent_initializer","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"bias_initializer","Value":"Ones","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_bias_initializer","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"unit_forget_bias","Value":"True","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"kernel_regularizer","Value":"None","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"kernel_regularizer_l1","Value":"0","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"kernel_regularizer_l2","Value":"0","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_kernel_regularizer","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"recurrent_regularizer","Value":"None","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"recurrent_regularizer_l1","Value":0,"ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"recurrent_regularizer_l2","Value":0,"ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_recurrent_regularizer","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"bias_regularizer","Value":"None","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"bias_regularizer_l1","Value":0,"ValueType":"Literal","LinkedGlobalParamete
r":null},{"Name":"bias_regularizer_l2","Value":0,"ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_bias_regularizer","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"activity_regularizer","Value":"L1L2","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"activity_regularizer_l1","Value":"0.003","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"activity_regularizer_l2","Value":"0.003","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_activity_regularizer","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"kernel_constraint","Value":"None","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_kernel_constraint","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"recurrent_constraint","Value":"None","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_recurrent_constraint","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"bias_constraint","Value":"None","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_bias_constraint","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"dropout","Value":"0.1","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"recurrent_dropout","Value":"0.1","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"return_sequences","Value":"False","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"implementation","Value":"2","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"name","Value":"","ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"inputs","NodeId":"-218"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-218","OutputType":null}],"UsePreviousResults":false,"moduleIdForCode":4,"Comment":"","CommentCollapsed":true},{"Id":"-316","ModuleId":"BigQuantSpace.general_feature_extractor.general_feature_extractor-v7","ModulePa
rameters":[{"Name":"start_date","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"end_date","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"before_start_days","Value":"90","ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"instruments","NodeId":"-316"},{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"features","NodeId":"-316"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-316","OutputType":null}],"UsePreviousResults":true,"moduleIdForCode":16,"Comment":"","CommentCollapsed":true},{"Id":"-320","ModuleId":"BigQuantSpace.dl_model_train.dl_model_train-v1","ModuleParameters":[{"Name":"optimizer","Value":"Adam","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_optimizer","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"loss","Value":"binary_crossentropy","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_loss","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"metrics","Value":"accuracy","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"batch_size","Value":"1600","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"epochs","Value":"1000","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"n_gpus","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"verbose","Value":"2:每个epoch输出一行记录","ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"input_model","NodeId":"-320"},{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"training_data","NodeId":"-320"},{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"validation_data","NodeId":"-320"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-320","OutputType":null}],"UsePreviousResults":true,"modu
leIdForCode":6,"Comment":"","CommentCollapsed":true},{"Id":"-332","ModuleId":"BigQuantSpace.dl_model_predict.dl_model_predict-v1","ModuleParameters":[{"Name":"batch_size","Value":"10240","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"n_gpus","Value":"0","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"verbose","Value":"0:不显示","ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"trained_model","NodeId":"-332"},{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"input_data","NodeId":"-332"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-332","OutputType":null}],"UsePreviousResults":true,"moduleIdForCode":7,"Comment":"","CommentCollapsed":true},{"Id":"-2295","ModuleId":"BigQuantSpace.input_features.input_features-v1","ModuleParameters":[{"Name":"features","Value":"(close_0/open_0-1)*10\n(volume_0/volume_1-1)\navg_turn_0/avg_turn_4-1\navg_amount_0/avg_amount_4-1\n((high_0/open_0)-(close_0/low_0))*50\n(ta_ma(close_0, timeperiod=5)/ta_ma(close_0, timeperiod=30)-1)*10\nta_rsi(close_0, timeperiod=14)/50-1\nta_mom(close_0, timeperiod=14)/5\nta_adx(high_0, low_0, close_0, timeperiod=14)/ta_adx(high_0, low_0, close_0, timeperiod=28)-1\nta_roc(close_0, timeperiod=14)/10\n(ta_kdj_k(high_0, low_0, close_0, 12, 3)/ta_kdj_d(high_0, low_0, close_0, 12, 3,3)-1)*2\nta_bias(close_0, 
timeperiod=28)*10\nvolatility_5_0/volatility_60_0-1\n(close_0/open_4-1)*5\nta_macd_macd_12_26_9_0\nta_macd_macdhist_12_26_9_0\nta_macd_macdsignal_12_26_9_0\navg_mf_net_amount_4/avg_amount_4*5\nmf_net_pct_main_0*5\n","ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"features_ds","NodeId":"-2295"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-2295","OutputType":null}],"UsePreviousResults":true,"moduleIdForCode":8,"Comment":"","CommentCollapsed":true},{"Id":"-259","ModuleId":"BigQuantSpace.dl_layer_dense.dl_layer_dense-v1","ModuleParameters":[{"Name":"units","Value":"1","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"activation","Value":"sigmoid","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_activation","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"use_bias","Value":"True","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"kernel_initializer","Value":"glorot_uniform","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_kernel_initializer","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"bias_initializer","Value":"Zeros","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_bias_initializer","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"kernel_regularizer","Value":"None","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"kernel_regularizer_l1","Value":0,"ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"kernel_regularizer_l2","Value":0,"ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_kernel_regularizer","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"bias_regularizer","Value":"L1L2","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"bias_regularizer_l1","Value":"0.01","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"bias_regularizer_l2","Value":
"0.01","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_bias_regularizer","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"activity_regularizer","Value":"None","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"activity_regularizer_l1","Value":"0","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"activity_regularizer_l2","Value":"0","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_activity_regularizer","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"kernel_constraint","Value":"None","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_kernel_constraint","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"bias_constraint","Value":"None","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_bias_constraint","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"name","Value":"","ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"inputs","NodeId":"-259"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-259","OutputType":null}],"UsePreviousResults":false,"moduleIdForCode":9,"Comment":"","CommentCollapsed":true},{"Id":"-14806","ModuleId":"BigQuantSpace.dl_layer_dense.dl_layer_dense-v1","ModuleParameters":[{"Name":"units","Value":"32","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"activation","Value":"tanh","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_activation","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"use_bias","Value":"True","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"kernel_initializer","Value":"glorot_uniform","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_kernel_initializer","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"bias_initializer","Value":"Zeros","ValueType":"Literal","Linke
dGlobalParameter":null},{"Name":"user_bias_initializer","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"kernel_regularizer","Value":"None","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"kernel_regularizer_l1","Value":0,"ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"kernel_regularizer_l2","Value":0,"ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_kernel_regularizer","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"bias_regularizer","Value":"None","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"bias_regularizer_l1","Value":0,"ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"bias_regularizer_l2","Value":0,"ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_bias_regularizer","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"activity_regularizer","Value":"L1L2","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"activity_regularizer_l1","Value":"0.003","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"activity_regularizer_l2","Value":"0.003","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_activity_regularizer","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"kernel_constraint","Value":"None","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_kernel_constraint","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"bias_constraint","Value":"None","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_bias_constraint","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"name","Value":"","ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"inputs","NodeId":"-14806"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-14806","OutputType":null}],"UsePreviousResults":false,"moduleIdForCode":10,"Comment":"","Comme
ntCollapsed":true},{"Id":"-14834","ModuleId":"BigQuantSpace.dl_layer_dropout.dl_layer_dropout-v1","ModuleParameters":[{"Name":"rate","Value":"0.1","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"noise_shape","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"seed","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"name","Value":"","ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"inputs","NodeId":"-14834"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-14834","OutputType":null}],"UsePreviousResults":false,"moduleIdForCode":11,"Comment":"","CommentCollapsed":true},{"Id":"-14841","ModuleId":"BigQuantSpace.dl_layer_dropout.dl_layer_dropout-v1","ModuleParameters":[{"Name":"rate","Value":"0.1","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"noise_shape","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"seed","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"name","Value":"","ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"inputs","NodeId":"-14841"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-14841","OutputType":null}],"UsePreviousResults":false,"moduleIdForCode":12,"Comment":"","CommentCollapsed":true},{"Id":"-403","ModuleId":"BigQuantSpace.dl_layer_reshape.dl_layer_reshape-v1","ModuleParameters":[{"Name":"target_shape","Value":"50,19,1","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"name","Value":"","ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"inputs","NodeId":"-403"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-403","OutputType":null}],"UsePreviousResults":false,"moduleIdForCode":13,"Comment":"","CommentCollapsed":true},{"Id":"-
408","ModuleId":"BigQuantSpace.dl_layer_conv2d.dl_layer_conv2d-v1","ModuleParameters":[{"Name":"filters","Value":"32","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"kernel_size","Value":"3,5","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"strides","Value":"1,1","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"padding","Value":"valid","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"data_format","Value":"channels_last","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"dilation_rate","Value":"1,1","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"activation","Value":"relu","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_activation","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"use_bias","Value":"True","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"kernel_initializer","Value":"glorot_uniform","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_kernel_initializer","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"bias_initializer","Value":"Zeros","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_bias_initializer","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"kernel_regularizer","Value":"None","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"kernel_regularizer_l1","Value":0,"ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"kernel_regularizer_l2","Value":0,"ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_kernel_regularizer","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"bias_regularizer","Value":"None","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"bias_regularizer_l1","Value":0,"ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"bias_regularizer_l2","Value":0,"ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_bias_regularizer","Value":"","ValueType":"Literal","Lin
kedGlobalParameter":null},{"Name":"activity_regularizer","Value":"None","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"activity_regularizer_l1","Value":0,"ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"activity_regularizer_l2","Value":0,"ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_activity_regularizer","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"kernel_constraint","Value":"None","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_kernel_constraint","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"bias_constraint","Value":"None","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_bias_constraint","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"name","Value":"","ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"inputs","NodeId":"-408"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-408","OutputType":null}],"UsePreviousResults":false,"moduleIdForCode":14,"Comment":"","CommentCollapsed":true},{"Id":"-446","ModuleId":"BigQuantSpace.dl_layer_reshape.dl_layer_reshape-v1","ModuleParameters":[{"Name":"target_shape","Value":"720,32","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"name","Value":"","ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"inputs","NodeId":"-446"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-446","OutputType":null}],"UsePreviousResults":false,"moduleIdForCode":15,"Comment":"","CommentCollapsed":true},{"Id":"-2290","ModuleId":"BigQuantSpace.join.join-v3","ModuleParameters":[{"Name":"on","Value":"date","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"how","Value":"inner","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"sort","Value":"True","ValueType":"Literal","Lin
kedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"data1","NodeId":"-2290"},{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"data2","NodeId":"-2290"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-2290","OutputType":null}],"UsePreviousResults":true,"moduleIdForCode":17,"Comment":"标注特征连接","CommentCollapsed":false},{"Id":"-2296","ModuleId":"BigQuantSpace.dropnan.dropnan-v1","ModuleParameters":[],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"input_data","NodeId":"-2296"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-2296","OutputType":null}],"UsePreviousResults":true,"moduleIdForCode":18,"Comment":"去掉为nan的数据","CommentCollapsed":true},{"Id":"-620","ModuleId":"BigQuantSpace.instruments.instruments-v2","ModuleParameters":[{"Name":"start_date","Value":"2012-07-24","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"end_date","Value":"2017-07-24","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"market","Value":"CN_STOCK_A","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"instrument_list","Value":"002425.SZA\n600699.SHA\n601390.SHA","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"max_count","Value":0,"ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"rolling_conf","NodeId":"-620"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-620","OutputType":null}],"UsePreviousResults":true,"moduleIdForCode":24,"Comment":"","CommentCollapsed":true},{"Id":"-692","ModuleId":"BigQuantSpace.derived_feature_extractor.derived_feature_extractor-v3","ModuleParameters":[{"Name":"date_col","Value":"date","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"instrument_col","Value":"instrument","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"drop_na","Value":"False","ValueType
":"Literal","LinkedGlobalParameter":null},{"Name":"remove_extra_columns","Value":"False","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_functions","Value":"{}","ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"input_data","NodeId":"-692"},{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"features","NodeId":"-692"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-692","OutputType":null}],"UsePreviousResults":true,"moduleIdForCode":26,"Comment":"","CommentCollapsed":true},{"Id":"-281","ModuleId":"BigQuantSpace.trade.trade-v4","ModuleParameters":[{"Name":"start_date","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"end_date","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"initialize","Value":"# 回测引擎:初始化函数,只执行一次\ndef bigquant_run(context):\n # 加载预测数据\n context.prediction = context.options['data'].read_df()\n\n # 系统已经设置了默认的交易手续费和滑点,要修改手续费可使用如下函数\n context.set_commission(PerOrder(buy_cost=0.0003, sell_cost=0.0013, min_cost=5))","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"handle_data","Value":"# 回测引擎:每日数据处理函数,每天执行一次\ndef bigquant_run(context, data):\n # 按日期过滤得到今日的预测数据\n try:\n prediction = context.prediction[data.current_dt.strftime('%Y-%m-%d')]\n except KeyError as e:\n return\n \n instrument = context.instruments[0]\n sid = context.symbol(instrument)\n cur_position = context.portfolio.positions[sid].amount\n \n # 交易逻辑\n if prediction > 0.9 and cur_position == 0:\n context.order_target_percent(context.symbol(instrument), 1)\n print(data.current_dt, '买入!')\n \n elif prediction < 0.5 and cur_position > 0:\n context.order_target_percent(context.symbol(instrument), 0)\n print(data.current_dt, '卖出!')\n ","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"prepare","Value":"# 回测引擎:准备数据,只执行一次\ndef bigquant_run(context):\n 
pass\n","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"before_trading_start","Value":"# 回测引擎:每个单位时间开始前调用一次,即每日开盘前调用一次。\ndef bigquant_run(context, data):\n pass\n","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"volume_limit","Value":0.025,"ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"order_price_field_buy","Value":"open","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"order_price_field_sell","Value":"close","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"capital_base","Value":1000000,"ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"auto_cancel_non_tradable_orders","Value":"True","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"data_frequency","Value":"daily","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"price_type","Value":"真实价格","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"product_type","Value":"股票","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"plot_charts","Value":"True","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"backtest_only","Value":"False","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"benchmark","Value":"000001.SHA","ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"instruments","NodeId":"-281"},{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"options_data","NodeId":"-281"},{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"history_ds","NodeId":"-281"},{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"benchmark_ds","NodeId":"-281"},{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"trading_calendar","NodeId":"-281"}],"OutputPortsInternal":[{"Name":"raw_perf","NodeId":"-281","OutputType":null}],"UsePreviousResults":false,"moduleIdForCode":1,"Comment":"","CommentCollapsed":true},{"Id":"-333","Mo
duleId":"BigQuantSpace.dl_convert_to_bin.dl_convert_to_bin-v2","ModuleParameters":[{"Name":"window_size","Value":"50","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"feature_clip","Value":5,"ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"flatten","Value":"False","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"window_along_col","Value":"","ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"input_data","NodeId":"-333"},{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"features","NodeId":"-333"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-333","OutputType":null}],"UsePreviousResults":true,"moduleIdForCode":25,"Comment":"","CommentCollapsed":true},{"Id":"-341","ModuleId":"BigQuantSpace.dl_convert_to_bin.dl_convert_to_bin-v2","ModuleParameters":[{"Name":"window_size","Value":"50","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"feature_clip","Value":5,"ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"flatten","Value":"False","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"window_along_col","Value":"","ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"input_data","NodeId":"-341"},{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"features","NodeId":"-341"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-341","OutputType":null}],"UsePreviousResults":true,"moduleIdForCode":27,"Comment":"","CommentCollapsed":true},{"Id":"-289","ModuleId":"BigQuantSpace.advanced_auto_labeler.advanced_auto_labeler-v2","ModuleParameters":[{"Name":"label_expr","Value":"# #号开始的表示注释\n# 0. 每行一个,顺序执行,从第二个开始,可以使用label字段\n# 1. 可用数据字段见 https://bigquant.com/docs/develop/datasource/deprecated/history_data.html\n# 添加benchmark_前缀,可使用对应的benchmark数据\n# 2. 
可用操作符和函数见 `表达式引擎 <https://bigquant.com/docs/develop/bigexpr/usage.html>`_\n\n# 计算收益:5日收盘价(作为卖出价格)除以明日开盘价(作为买入价格)\nwhere((shift(close,-5)/open>1)&(mean(close,-5)/open>1)&(close/open>1),1,0)\n# 过滤掉一字涨停的情况 (设置label为NaN,在后续处理和训练中会忽略NaN的label)\n\n","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"start_date","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"end_date","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"benchmark","Value":"000300.SHA","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"drop_na_label","Value":"True","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"cast_label_int","Value":"True","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_functions","Value":"{}","ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"instruments","NodeId":"-289"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-289","OutputType":null}],"UsePreviousResults":true,"moduleIdForCode":21,"Comment":"","CommentCollapsed":true},{"Id":"-300","ModuleId":"BigQuantSpace.general_feature_extractor.general_feature_extractor-v7","ModuleParameters":[{"Name":"start_date","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"end_date","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"before_start_days","Value":90,"ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"instruments","NodeId":"-300"},{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"features","NodeId":"-300"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-300","OutputType":null}],"UsePreviousResults":true,"moduleIdForCode":22,"Comment":"","CommentCollapsed":true},{"Id":"-307","ModuleId":"BigQuantSpace.derived_feature_extractor.derived_feature_extractor-v3","ModuleParameters":[{"Na
me":"date_col","Value":"date","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"instrument_col","Value":"instrument","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"drop_na","Value":"False","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"remove_extra_columns","Value":"False","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_functions","Value":"{}","ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"input_data","NodeId":"-307"},{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"features","NodeId":"-307"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-307","OutputType":null}],"UsePreviousResults":true,"moduleIdForCode":23,"Comment":"","CommentCollapsed":true},{"Id":"-322","ModuleId":"BigQuantSpace.instruments.instruments-v2","ModuleParameters":[{"Name":"start_date","Value":"2017-08-18","ValueType":"Literal","LinkedGlobalParameter":"交易日期"},{"Name":"end_date","Value":"2020-01-06","ValueType":"Literal","LinkedGlobalParameter":"交易日期"},{"Name":"market","Value":"CN_STOCK_A","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"instrument_list","Value":"002425.SZA","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"max_count","Value":0,"ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"rolling_conf","NodeId":"-322"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-322","OutputType":null}],"UsePreviousResults":true,"moduleIdForCode":28,"Comment":"","CommentCollapsed":true},{"Id":"-330","ModuleId":"BigQuantSpace.dropnan.dropnan-v1","ModuleParameters":[],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"input_data","NodeId":"-330"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-330","OutputType":null}],"UsePreviousResults":true,"module
IdForCode":20,"Comment":"","CommentCollapsed":true},{"Id":"-293","ModuleId":"BigQuantSpace.dl_model_init.dl_model_init-v1","ModuleParameters":[],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"inputs","NodeId":"-293"},{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"outputs","NodeId":"-293"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-293","OutputType":null}],"UsePreviousResults":false,"moduleIdForCode":5,"Comment":"","CommentCollapsed":true},{"Id":"-294","ModuleId":"BigQuantSpace.model_save.model_save-v1","ModuleParameters":[{"Name":"filedir","Value":"/home/bigquant/work/userlib/","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"filename","Value":"test0110+2","ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"input_1","NodeId":"-294"}],"OutputPortsInternal":[],"UsePreviousResults":true,"moduleIdForCode":19,"Comment":"","CommentCollapsed":true},{"Id":"-298","ModuleId":"BigQuantSpace.model_read.model_read-v1","ModuleParameters":[{"Name":"filedir","Value":"/home/bigquant/work/userlib/","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"filename","Value":"test0110-1000","ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[],"OutputPortsInternal":[{"Name":"data","NodeId":"-298","OutputType":null}],"UsePreviousResults":true,"moduleIdForCode":29,"Comment":"","CommentCollapsed":true}],"SerializedClientData":"<?xml version='1.0' encoding='utf-16'?><DataV1 xmlns:xsd='http://www.w3.org/2001/XMLSchema' xmlns:xsi='http://www.w3.org/2001/XMLSchema-instance'><Meta /><NodePositions><NodePosition Node='-214' Position='1012,714,200,200'/><NodePosition Node='-210' Position='280,-273,200,200'/><NodePosition Node='-218' Position='280,52,200,200'/><NodePosition Node='-316' Position='1229,-86,200,200'/><NodePosition Node='-320' 
Position='537,539,200,200'/><NodePosition Node='-332' Position='813,622,200,200'/><NodePosition Node='-2295' Position='1010,-259,200,200'/><NodePosition Node='-259' Position='281,387,200,200'/><NodePosition Node='-14806' Position='280,212.5460662841797,200,200'/><NodePosition Node='-14834' Position='281,135,200,200'/><NodePosition Node='-14841' Position='279,304,200,200'/><NodePosition Node='-403' Position='288,-190,200,200'/><NodePosition Node='-408' Position='283,-106,200,200'/><NodePosition Node='-446' Position='280,-23,200,200'/><NodePosition Node='-2290' Position='735,138,200,200'/><NodePosition Node='-2296' Position='734,242,200,200'/><NodePosition Node='-620' Position='721.5460815429688,-168,200,200'/><NodePosition Node='-692' Position='1251,-7,200,200'/><NodePosition Node='-281' Position='1246,851,200,200'/><NodePosition Node='-333' Position='753,329,200,200'/><NodePosition Node='-341' Position='1045,322,200,200'/><NodePosition Node='-289' Position='597,-13,200,200'/><NodePosition Node='-300' Position='860,-82,200,200'/><NodePosition Node='-307' Position='871,10,200,200'/><NodePosition Node='-322' Position='1245,-203,200,200'/><NodePosition Node='-330' Position='1250,84,200,200'/><NodePosition Node='-293' Position='358,462,200,200'/><NodePosition Node='-294' Position='480.54608154296875,621,200,200'/><NodePosition Node='-298' Position='798.1644897460938,454.51971435546875,200,200'/></NodePositions><NodeGroups /></DataV1>"},"IsDraft":true,"ParentExperimentId":null,"WebService":{"IsWebServiceExperiment":false,"Inputs":[],"Outputs":[],"Parameters":[{"Name":"交易日期","Value":"","ParameterDefinition":{"Name":"交易日期","FriendlyName":"交易日期","DefaultValue":"","ParameterType":"String","HasDefaultValue":true,"IsOptional":true,"ParameterRules":[],"HasRules":false,"MarkupType":0,"CredentialDescriptor":null}}],"WebServiceGroupId":null,"SerializedClientData":"<?xml version='1.0' encoding='utf-16'?><DataV1 xmlns:xsd='http://www.w3.org/2001/XMLSchema' 
xmlns:xsi='http://www.w3.org/2001/XMLSchema-instance'><Meta /><NodePositions></NodePositions><NodeGroups /></DataV1>"},"DisableNodesUpdate":false,"Category":"user","Tags":[],"IsPartialRun":true}
    In [203]:
    # This code was auto-generated by the visual strategy environment on 2020-01-11 15:20
    # This cell can only be edited in visual mode. You can also copy the code into a new cell or strategy and modify it there.
    
    
    # Python entry function: input_1/2/3 map to the three input ports, data_1/2/3 to the three output ports
    def m2_run_bigquant_run(input_1, input_2, input_3):
    
        test_data = input_2.read_pickle()  # rolling-window dataset (not used below)
        pred_label = input_1.read_pickle()
        # Flatten the (n, 1) prediction array into a 1-D vector
        pred_result = pred_label.reshape(pred_label.shape[0])
        # Each rolling window predicts for its last day, so align with the trailing n dates
        dt = input_3.read_df()['date'][-1*len(pred_result):]
        pred_df = pd.Series(pred_result, index=dt)
        ds = DataSource.write_df(pred_df)
        
        return Outputs(data_1=ds)
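The custom module above flattens the (n, 1) prediction array and pairs it with the last n dates of the prediction-set data. A minimal sketch of that alignment with synthetic data (pandas/numpy only; the BigQuant `DataSource`/`Outputs` wrappers are omitted):

```python
import numpy as np
import pandas as pd

# Synthetic stand-ins: 3 window predictions of shape (n, 1) and 5 trading dates
pred_label = np.array([[0.91], [0.42], [0.67]])
dates = pd.Series(pd.date_range('2020-01-01', periods=5), name='date')

# Flatten (n, 1) -> (n,), as in m2_run_bigquant_run
pred_result = pred_label.reshape(pred_label.shape[0])

# Each rolling window predicts for its final day, so take the trailing n dates
dt = dates[-1 * len(pred_result):]
pred_df = pd.Series(pred_result, index=dt)

print(pred_df)
```

Because the first `window_size - 1` days cannot form a full window, the prediction series is shorter than the raw date column; slicing from the tail keeps the dates and predictions aligned.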
    
    # Post-processing function, optional. Its input is the main function's output; you can transform the data here or return a friendlier outputs format. This function's output is not cached.
    def m2_post_run_bigquant_run(outputs):
        return outputs
    
    # Backtest engine: initialization function, runs only once
    def m1_initialize_bigquant_run(context):
        # Load the prediction data
        context.prediction = context.options['data'].read_df()
    
        # The engine sets default commission and slippage; to change the commission, use the function below
        context.set_commission(PerOrder(buy_cost=0.0003, sell_cost=0.0013, min_cost=5))
    # Backtest engine: daily data-handling function, runs once per day
    def m1_handle_data_bigquant_run(context, data):
        # Filter the prediction data down to today's date
        try:
            prediction = context.prediction[data.current_dt.strftime('%Y-%m-%d')]
        except KeyError:
            return
        
        instrument = context.instruments[0]
        sid = context.symbol(instrument)
        cur_position = context.portfolio.positions[sid].amount
        
        # Trading logic: note the buy threshold here is 0.9, stricter than the 0.5 rule described above
        if prediction > 0.9 and cur_position == 0:
            context.order_target_percent(context.symbol(instrument), 1)
            print(data.current_dt, 'buy')
            
        elif prediction < 0.5 and cur_position > 0:
            context.order_target_percent(context.symbol(instrument), 0)
            print(data.current_dt, 'sell')
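The branch above can be isolated as a pure function to see its behavior: it buys only when the predicted probability exceeds 0.9, sells below 0.5, and leaves the position unchanged for probabilities in between. A sketch (the function name and return values are illustrative, not part of the platform API):

```python
def trade_signal(prediction, cur_position):
    """Mirror of the handle_data branch: returns 'buy', 'sell', or 'hold'."""
    if prediction > 0.9 and cur_position == 0:
        return 'buy'
    if prediction < 0.5 and cur_position > 0:
        return 'sell'
    return 'hold'

print(trade_signal(0.95, 0))  # opens a position
print(trade_signal(0.70, 1))  # between thresholds: keeps the position
print(trade_signal(0.40, 1))  # closes the position
```

The gap between the two thresholds acts as a hysteresis band, avoiding rapid buy/sell flips when the predicted probability hovers near a single cutoff.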
        
    # Backtest engine: data preparation, runs only once
    def m1_prepare_bigquant_run(context):
        pass
    
    # Backtest engine: called once before each unit of time, i.e. once before each day's open
    def m1_before_trading_start_bigquant_run(context, data):
        pass
    
    
    m3 = M.dl_layer_input.v1(
        shape='50,19',
        batch_shape='',
        dtype='float32',
        sparse=False,
        name=''
    )
    
    m13 = M.dl_layer_reshape.v1(
        inputs=m3.data,
        target_shape='50,19,1',
        name=''
    )
    
    m14 = M.dl_layer_conv2d.v1(
        inputs=m13.data,
        filters=32,
        kernel_size='3,5',
        strides='1,1',
        padding='valid',
        data_format='channels_last',
        dilation_rate='1,1',
        activation='relu',
        use_bias=True,
        kernel_initializer='glorot_uniform',
        bias_initializer='Zeros',
        kernel_regularizer='None',
        kernel_regularizer_l1=0,
        kernel_regularizer_l2=0,
        bias_regularizer='None',
        bias_regularizer_l1=0,
        bias_regularizer_l2=0,
        activity_regularizer='None',
        activity_regularizer_l1=0,
        activity_regularizer_l2=0,
        kernel_constraint='None',
        bias_constraint='None',
        name=''
    )
    
    m15 = M.dl_layer_reshape.v1(
        inputs=m14.data,
        target_shape='720,32',
        name=''
    )
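The Reshape target of 720 x 32 follows from the Conv2D geometry: a 50 x 19 x 1 input scanned by a 3 x 5 kernel with stride 1 and 'valid' padding yields (50-3+1) x (19-5+1) = 48 x 15 spatial positions, and 48 * 15 = 720 rows of 32 channels. A quick arithmetic check (pure Python, no Keras required):

```python
def conv_output_dim(input_dim, kernel_dim, stride=1):
    # 'valid' padding: floor((input - kernel) / stride) + 1
    return (input_dim - kernel_dim) // stride + 1

rows = conv_output_dim(50, 3)   # window length 50, kernel height 3
cols = conv_output_dim(19, 5)   # 19 features, kernel width 5

print(rows, cols, rows * cols)  # 48 15 720
```

If the window size, feature count, kernel size, or strides change, this target_shape must be recomputed the same way or the model will fail to build.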
    
    m4 = M.dl_layer_lstm.v1(
        inputs=m15.data,
        units=32,
        activation='tanh',
        recurrent_activation='hard_sigmoid',
        use_bias=True,
        kernel_initializer='glorot_uniform',
        recurrent_initializer='Orthogonal',
        bias_initializer='Ones',
        unit_forget_bias=True,
        kernel_regularizer='None',
        kernel_regularizer_l1=0,
        kernel_regularizer_l2=0,
        recurrent_regularizer='None',
        recurrent_regularizer_l1=0,
        recurrent_regularizer_l2=0,
        bias_regularizer='None',
        bias_regularizer_l1=0,
        bias_regularizer_l2=0,
        activity_regularizer='L1L2',
        activity_regularizer_l1=0.003,
        activity_regularizer_l2=0.003,
        kernel_constraint='None',
        recurrent_constraint='None',
        bias_constraint='None',
        dropout=0.1,
        recurrent_dropout=0.1,
        return_sequences=False,
        implementation='2',
        name=''
    )
    
    m11 = M.dl_layer_dropout.v1(
        inputs=m4.data,
        rate=0.1,
        noise_shape='',
        name=''
    )
    
    m10 = M.dl_layer_dense.v1(
        inputs=m11.data,
        units=32,
        activation='tanh',
        use_bias=True,
        kernel_initializer='glorot_uniform',
        bias_initializer='Zeros',
        kernel_regularizer='None',
        kernel_regularizer_l1=0,
        kernel_regularizer_l2=0,
        bias_regularizer='None',
        bias_regularizer_l1=0,
        bias_regularizer_l2=0,
        activity_regularizer='L1L2',
        activity_regularizer_l1=0.003,
        activity_regularizer_l2=0.003,
        kernel_constraint='None',
        bias_constraint='None',
        name=''
    )
    
    m12 = M.dl_layer_dropout.v1(
        inputs=m10.data,
        rate=0.1,
        noise_shape='',
        name=''
    )
    
    m9 = M.dl_layer_dense.v1(
        inputs=m12.data,
        units=1,
        activation='sigmoid',
        use_bias=True,
        kernel_initializer='glorot_uniform',
        bias_initializer='Zeros',
        kernel_regularizer='None',
        kernel_regularizer_l1=0,
        kernel_regularizer_l2=0,
        bias_regularizer='L1L2',
        bias_regularizer_l1=0.01,
        bias_regularizer_l2=0.01,
        activity_regularizer='None',
        activity_regularizer_l1=0,
        activity_regularizer_l2=0,
        kernel_constraint='None',
        bias_constraint='None',
        name=''
    )
    
    m5 = M.dl_model_init.v1(
        inputs=m3.data,
        outputs=m9.data
    )
    
    m8 = M.input_features.v1(
        features="""(close_0/open_0-1)*10
    (volume_0/volume_1-1)
    avg_turn_0/avg_turn_4-1
    avg_amount_0/avg_amount_4-1
    ((high_0/open_0)-(close_0/low_0))*50
    (ta_ma(close_0, timeperiod=5)/ta_ma(close_0, timeperiod=30)-1)*10
    ta_rsi(close_0, timeperiod=14)/50-1
    ta_mom(close_0, timeperiod=14)/5
    ta_adx(high_0, low_0, close_0, timeperiod=14)/ta_adx(high_0, low_0, close_0, timeperiod=28)-1
    ta_roc(close_0, timeperiod=14)/10
    (ta_kdj_k(high_0, low_0, close_0, 12, 3)/ta_kdj_d(high_0, low_0, close_0, 12, 3,3)-1)*2
    ta_bias(close_0, timeperiod=28)*10
    volatility_5_0/volatility_60_0-1
    (close_0/open_4-1)*5
    ta_macd_macd_12_26_9_0
    ta_macd_macdhist_12_26_9_0
    ta_macd_macdsignal_12_26_9_0
    avg_mf_net_amount_4/avg_amount_4*5
    mf_net_pct_main_0*5
    """
    )
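The input layer's shape='50,19' must agree with the number of feature expressions in m8: one expression per non-empty line gives 19 here. A sketch of that consistency check, using a hypothetical three-line excerpt of the feature list:

```python
# Hypothetical excerpt of the m8 feature list; the real list has 19 expressions,
# which must equal the second dimension of the input layer's shape='50,19'
features = """(close_0/open_0-1)*10
(volume_0/volume_1-1)
avg_turn_0/avg_turn_4-1
"""

n_features = len([line for line in features.splitlines() if line.strip()])
input_shape = (50, n_features)  # window_size x feature count
print(input_shape)
```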
    
    m24 = M.instruments.v2(
        start_date='2012-07-24',
        end_date='2017-07-24',
        market='CN_STOCK_A',
        instrument_list="""002425.SZA
    600699.SHA
    601390.SHA""",
        max_count=0
    )
    
    m21 = M.advanced_auto_labeler.v2(
        instruments=m24.data,
        label_expr="""# Lines starting with # are comments
    # 0. One expression per line, executed in order; from the second line on, the label field may be used
    # 1. Available data fields: https://bigquant.com/docs/develop/datasource/deprecated/history_data.html
    #   Prefix a field with benchmark_ to use the corresponding benchmark data
    # 2. Available operators and functions: `expression engine <https://bigquant.com/docs/develop/bigexpr/usage.html>`_
    
    # Label 1 when the close 5 days ahead, the forward 5-day mean close, and today's close all exceed today's open; otherwise 0
    where((shift(close,-5)/open>1)&(mean(close,-5)/open>1)&(close/open>1),1,0)
    # To filter out one-sided limit-up days, set label to NaN (NaN labels are ignored in later processing and training)
    
    """,
        start_date='',
        end_date='',
        benchmark='000300.SHA',
        drop_na_label=True,
        cast_label_int=True,
        user_functions={}
    )
    
    m22 = M.general_feature_extractor.v7(
        instruments=m24.data,
        features=m8.data,
        start_date='',
        end_date='',
        before_start_days=90
    )
    
    m23 = M.derived_feature_extractor.v3(
        input_data=m22.data,
        features=m8.data,
        date_col='date',
        instrument_col='instrument',
        drop_na=False,
        remove_extra_columns=False,
        user_functions={}
    )
    
    m17 = M.join.v3(
        data1=m21.data,
        data2=m23.data,
        on='date',
        how='inner',
        sort=True
    )
    
    m18 = M.dropnan.v1(
        input_data=m17.data
    )
    
    m25 = M.dl_convert_to_bin.v2(
        input_data=m18.data,
        features=m8.data,
        window_size=50,
        feature_clip=5,
        flatten=False,
        window_along_col=''
    )
    
    m6 = M.dl_model_train.v1(
        input_model=m5.data,
        training_data=m25.data,
        optimizer='Adam',
        loss='binary_crossentropy',
        metrics='accuracy',
        batch_size=1600,
        epochs=1000,
        verbose='2:每个epoch输出一行记录'
    )
    
    m19 = M.model_save.v1(
        input_1=m6.data,
        filedir='/home/bigquant/work/userlib/',
        filename='test0110+2'
    )
    
    m28 = M.instruments.v2(
        start_date=T.live_run_param('trading_date', '2017-08-18'),
        end_date=T.live_run_param('trading_date', '2020-01-06'),
        market='CN_STOCK_A',
        instrument_list='002425.SZA',
        max_count=0
    )
    
    m16 = M.general_feature_extractor.v7(
        instruments=m28.data,
        features=m8.data,
        start_date='',
        end_date='',
        before_start_days=90
    )
    
    m26 = M.derived_feature_extractor.v3(
        input_data=m16.data,
        features=m8.data,
        date_col='date',
        instrument_col='instrument',
        drop_na=False,
        remove_extra_columns=False,
        user_functions={}
    )
    
    m20 = M.dropnan.v1(
        input_data=m26.data
    )
    
    m27 = M.dl_convert_to_bin.v2(
        input_data=m20.data,
        features=m8.data,
        window_size=50,
        feature_clip=5,
        flatten=False,
        window_along_col=''
    )
    
    m7 = M.dl_model_predict.v1(
        trained_model=m6.data,
        input_data=m27.data,
        batch_size=10240,
        n_gpus=0,
        verbose='0:不显示'
    )
    
    m2 = M.cached.v3(
        input_1=m7.data,
        input_2=m27.data,
        input_3=m20.data,
        run=m2_run_bigquant_run,
        post_run=m2_post_run_bigquant_run,
        input_ports='',
        params='{}',
        output_ports=''
    )
    
    m1 = M.trade.v4(
        instruments=m28.data,
        options_data=m2.data_1,
        start_date='',
        end_date='',
        initialize=m1_initialize_bigquant_run,
        handle_data=m1_handle_data_bigquant_run,
        prepare=m1_prepare_bigquant_run,
        before_trading_start=m1_before_trading_start_bigquant_run,
        volume_limit=0.025,
        order_price_field_buy='open',
        order_price_field_sell='close',
        capital_base=1000000,
        auto_cancel_non_tradable_orders=True,
        data_frequency='daily',
        price_type='真实价格',
        product_type='股票',
        plot_charts=True,
        backtest_only=False,
        benchmark='000001.SHA'
    )
    
    m29 = M.model_read.v1(
        filedir='/home/bigquant/work/userlib/',
        filename='test0110-1000'
    )
    
    Train on 9331 samples
    Epoch 1/1000
    9331/9331 - 84s - loss: 0.7900 - accuracy: 0.6827
    Epoch 2/1000
    9331/9331 - 52s - loss: 0.7634 - accuracy: 0.6828
    Epoch 3/1000
    9331/9331 - 54s - loss: 0.7423 - accuracy: 0.6852
    Epoch 4/1000
    9331/9331 - 52s - loss: 0.7214 - accuracy: 0.6858
    Epoch 5/1000
    9331/9331 - 51s - loss: 0.7062 - accuracy: 0.6865
    Epoch 6/1000
    9331/9331 - 61s - loss: 0.6955 - accuracy: 0.6879
    Epoch 7/1000
    9331/9331 - 61s - loss: 0.6851 - accuracy: 0.6876
    Epoch 8/1000
    9331/9331 - 53s - loss: 0.6779 - accuracy: 0.6879
    Epoch 9/1000
    9331/9331 - 55s - loss: 0.6697 - accuracy: 0.6881
    Epoch 10/1000
    9331/9331 - 71s - loss: 0.6655 - accuracy: 0.6875
    Epoch 11/1000
    9331/9331 - 58s - loss: 0.6575 - accuracy: 0.6884
    Epoch 12/1000
    9331/9331 - 71s - loss: 0.6550 - accuracy: 0.6879
    Epoch 13/1000
    9331/9331 - 51s - loss: 0.6496 - accuracy: 0.6882
    Epoch 14/1000
    9331/9331 - 53s - loss: 0.6482 - accuracy: 0.6889
    Epoch 15/1000
    9331/9331 - 90s - loss: 0.6429 - accuracy: 0.6887
    Epoch 16/1000
    9331/9331 - 52s - loss: 0.6409 - accuracy: 0.6892
    Epoch 17/1000
    9331/9331 - 87s - loss: 0.6390 - accuracy: 0.6895
    Epoch 18/1000
    9331/9331 - 68s - loss: 0.6369 - accuracy: 0.6871
    Epoch 19/1000
    9331/9331 - 53s - loss: 0.6349 - accuracy: 0.6889
    Epoch 20/1000
    9331/9331 - 53s - loss: 0.6328 - accuracy: 0.6894
    Epoch 21/1000
    9331/9331 - 63s - loss: 0.6313 - accuracy: 0.6880
    Epoch 22/1000
    9331/9331 - 92s - loss: 0.6310 - accuracy: 0.6892
    Epoch 23/1000
    9331/9331 - 54s - loss: 0.6296 - accuracy: 0.6873
    Epoch 24/1000
    9331/9331 - 60s - loss: 0.6277 - accuracy: 0.6915
    Epoch 25/1000
    9331/9331 - 60s - loss: 0.6273 - accuracy: 0.6896
    Epoch 26/1000
    9331/9331 - 53s - loss: 0.6272 - accuracy: 0.6906
    Epoch 27/1000
    9331/9331 - 75s - loss: 0.6255 - accuracy: 0.6893
    Epoch 28/1000
    9331/9331 - 58s - loss: 0.6228 - accuracy: 0.6904
    Epoch 29/1000
    9331/9331 - 53s - loss: 0.6233 - accuracy: 0.6918
    Epoch 30/1000
    9331/9331 - 57s - loss: 0.6238 - accuracy: 0.6901
    Epoch 31/1000
    9331/9331 - 60s - loss: 0.6207 - accuracy: 0.6921
    Epoch 32/1000
    9331/9331 - 61s - loss: 0.6212 - accuracy: 0.6905
    Epoch 33/1000
    9331/9331 - 54s - loss: 0.6209 - accuracy: 0.6914
    Epoch 34/1000
    9331/9331 - 85s - loss: 0.6215 - accuracy: 0.6901
    Epoch 35/1000
    9331/9331 - 63s - loss: 0.6185 - accuracy: 0.6924
    Epoch 36/1000
    9331/9331 - 53s - loss: 0.6187 - accuracy: 0.6922
    Epoch 37/1000
    9331/9331 - 53s - loss: 0.6172 - accuracy: 0.6902
    Epoch 38/1000
    9331/9331 - 55s - loss: 0.6185 - accuracy: 0.6902
    Epoch 39/1000
    9331/9331 - 53s - loss: 0.6171 - accuracy: 0.6915
    Epoch 40/1000
    9331/9331 - 54s - loss: 0.6159 - accuracy: 0.6920
    Epoch 41/1000
    9331/9331 - 58s - loss: 0.6158 - accuracy: 0.6917
    Epoch 42/1000
    9331/9331 - 73s - loss: 0.6156 - accuracy: 0.6911
    Epoch 43/1000
    9331/9331 - 58s - loss: 0.6155 - accuracy: 0.6926
    Epoch 44/1000
    9331/9331 - 60s - loss: 0.6137 - accuracy: 0.6922
    Epoch 45/1000
    9331/9331 - 82s - loss: 0.6152 - accuracy: 0.6921
    Epoch 46/1000
    9331/9331 - 54s - loss: 0.6143 - accuracy: 0.6932
    Epoch 47/1000
    9331/9331 - 56s - loss: 0.6138 - accuracy: 0.6910
    Epoch 48/1000
    9331/9331 - 54s - loss: 0.6128 - accuracy: 0.6971
    Epoch 49/1000
    9331/9331 - 54s - loss: 0.6122 - accuracy: 0.6919
    Epoch 50/1000
    9331/9331 - 79s - loss: 0.6140 - accuracy: 0.6941
    Epoch 51/1000
    9331/9331 - 56s - loss: 0.6134 - accuracy: 0.6940
    Epoch 52/1000
    9331/9331 - 73s - loss: 0.6131 - accuracy: 0.6910
    Epoch 53/1000
    9331/9331 - 74s - loss: 0.6119 - accuracy: 0.6949
    Epoch 54/1000
    9331/9331 - 55s - loss: 0.6105 - accuracy: 0.6931
    Epoch 55/1000
    9331/9331 - 57s - loss: 0.6102 - accuracy: 0.6940
    Epoch 56/1000
    9331/9331 - 53s - loss: 0.6114 - accuracy: 0.6929
    Epoch 57/1000
    9331/9331 - 66s - loss: 0.6111 - accuracy: 0.6941
    Epoch 58/1000
    9331/9331 - 75s - loss: 0.6094 - accuracy: 0.6957
    Epoch 59/1000
    9331/9331 - 52s - loss: 0.6097 - accuracy: 0.6941
    Epoch 60/1000
    9331/9331 - 103s - loss: 0.6097 - accuracy: 0.6950
    Epoch 61/1000
    9331/9331 - 78s - loss: 0.6097 - accuracy: 0.6944
    Epoch 62/1000
    9331/9331 - 90s - loss: 0.6087 - accuracy: 0.6938
    Epoch 63/1000
    9331/9331 - 54s - loss: 0.6073 - accuracy: 0.6944
    Epoch 64/1000
    9331/9331 - 76s - loss: 0.6077 - accuracy: 0.6962
    Epoch 65/1000
    9331/9331 - 54s - loss: 0.6081 - accuracy: 0.6966
    Epoch 66/1000
    9331/9331 - 52s - loss: 0.6076 - accuracy: 0.6934
    Epoch 67/1000
    9331/9331 - 52s - loss: 0.6082 - accuracy: 0.6947
    Epoch 68/1000
    9331/9331 - 53s - loss: 0.6075 - accuracy: 0.6962
    Epoch 69/1000
    9331/9331 - 80s - loss: 0.6061 - accuracy: 0.6952
    Epoch 70/1000
    9331/9331 - 59s - loss: 0.6068 - accuracy: 0.6959
    Epoch 71/1000
    9331/9331 - 84s - loss: 0.6065 - accuracy: 0.6977
    Epoch 72/1000
    9331/9331 - 66s - loss: 0.6074 - accuracy: 0.6924
    Epoch 73/1000
    9331/9331 - 54s - loss: 0.6066 - accuracy: 0.6950
    Epoch 74/1000
    9331/9331 - 72s - loss: 0.6060 - accuracy: 0.6917
    Epoch 75/1000
    9331/9331 - 55s - loss: 0.6057 - accuracy: 0.6949
    Epoch 76/1000
    9331/9331 - 93s - loss: 0.6058 - accuracy: 0.6955
    Epoch 77/1000
    9331/9331 - 62s - loss: 0.6055 - accuracy: 0.6947
    Epoch 78/1000
    9331/9331 - 54s - loss: 0.6044 - accuracy: 0.6982
    Epoch 79/1000
    9331/9331 - 105s - loss: 0.6052 - accuracy: 0.6949
    Epoch 80/1000
    9331/9331 - 147s - loss: 0.6052 - accuracy: 0.6955
    Epoch 81/1000
    9331/9331 - 176s - loss: 0.6046 - accuracy: 0.6938
    Epoch 82/1000
    9331/9331 - 163s - loss: 0.6036 - accuracy: 0.6968
    Epoch 83/1000
    9331/9331 - 114s - loss: 0.6027 - accuracy: 0.6989
    Epoch 84/1000
    9331/9331 - 79s - loss: 0.6034 - accuracy: 0.6983
    Epoch 85/1000
    9331/9331 - 62s - loss: 0.6039 - accuracy: 0.6968
    Epoch 86/1000
    9331/9331 - 69s - loss: 0.6032 - accuracy: 0.6979
    Epoch 87/1000
    9331/9331 - 57s - loss: 0.6022 - accuracy: 0.6967
    Epoch 88/1000
    9331/9331 - 131s - loss: 0.6033 - accuracy: 0.6945
    Epoch 89/1000
    9331/9331 - 68s - loss: 0.6017 - accuracy: 0.6971
    Epoch 90/1000
    9331/9331 - 74s - loss: 0.6034 - accuracy: 0.7006
    Epoch 91/1000
    9331/9331 - 53s - loss: 0.6022 - accuracy: 0.6980
    Epoch 92/1000
    9331/9331 - 73s - loss: 0.6026 - accuracy: 0.6962
    Epoch 93/1000
    9331/9331 - 105s - loss: 0.6008 - accuracy: 0.6981
    Epoch 94/1000
    9331/9331 - 76s - loss: 0.6021 - accuracy: 0.6959
    Epoch 95/1000
    9331/9331 - 105s - loss: 0.6016 - accuracy: 0.6982
    Epoch 96/1000
    9331/9331 - 55s - loss: 0.6017 - accuracy: 0.6976
    Epoch 97/1000
    9331/9331 - 53s - loss: 0.6010 - accuracy: 0.6989
    Epoch 98/1000
    9331/9331 - 54s - loss: 0.6005 - accuracy: 0.6967
    Epoch 99/1000
    9331/9331 - 54s - loss: 0.5998 - accuracy: 0.7020
    Epoch 100/1000
    9331/9331 - 64s - loss: 0.6018 - accuracy: 0.6983
    Epoch 101/1000
    9331/9331 - 66s - loss: 0.5989 - accuracy: 0.6995
    Epoch 102/1000
    9331/9331 - 109s - loss: 0.5972 - accuracy: 0.6987
    Epoch 103/1000
    9331/9331 - 118s - loss: 0.6005 - accuracy: 0.6989
    Epoch 104/1000
    9331/9331 - 53s - loss: 0.5996 - accuracy: 0.7010
    Epoch 105/1000
    9331/9331 - 52s - loss: 0.6007 - accuracy: 0.6998
    Epoch 106/1000
    9331/9331 - 54s - loss: 0.6006 - accuracy: 0.6980
    Epoch 107/1000
    9331/9331 - 53s - loss: 0.5999 - accuracy: 0.7004
    Epoch 108/1000
    9331/9331 - 94s - loss: 0.5983 - accuracy: 0.7005
    Epoch 109/1000
    9331/9331 - 142s - loss: 0.5990 - accuracy: 0.6984
    Epoch 110/1000
    9331/9331 - 122s - loss: 0.5992 - accuracy: 0.6994
    Epoch 111/1000
    9331/9331 - 85s - loss: 0.5978 - accuracy: 0.6995
    Epoch 112/1000
    9331/9331 - 300s - loss: 0.5985 - accuracy: 0.7017
    Epoch 113/1000
    9331/9331 - 243s - loss: 0.5992 - accuracy: 0.6989
    Epoch 114/1000
    9331/9331 - 587s - loss: 0.5980 - accuracy: 0.6972
    Epoch 115/1000
    9331/9331 - 1008s - loss: 0.5965 - accuracy: 0.7013
    Epoch 116/1000
    9331/9331 - 766s - loss: 0.5954 - accuracy: 0.7010
    Epoch 117/1000
    9331/9331 - 109s - loss: 0.5976 - accuracy: 0.7020
    Epoch 118/1000
    9331/9331 - 112s - loss: 0.5973 - accuracy: 0.7013
    Epoch 119/1000
    9331/9331 - 77s - loss: 0.5964 - accuracy: 0.7017
    Epoch 120/1000
    9331/9331 - 108s - loss: 0.5972 - accuracy: 0.7015
    Epoch 121/1000
    9331/9331 - 192s - loss: 0.5967 - accuracy: 0.6995
    Epoch 122/1000
    9331/9331 - 131s - loss: 0.5956 - accuracy: 0.7024
    Epoch 123/1000
    9331/9331 - 448s - loss: 0.5972 - accuracy: 0.7004
    Epoch 124/1000
    9331/9331 - 179s - loss: 0.5954 - accuracy: 0.7021
    Epoch 125/1000
    9331/9331 - 73s - loss: 0.5949 - accuracy: 0.7006
    Epoch 126/1000
    9331/9331 - 104s - loss: 0.5938 - accuracy: 0.7022
    Epoch 127/1000
    9331/9331 - 166s - loss: 0.5958 - accuracy: 0.7016
    Epoch 128/1000
    9331/9331 - 149s - loss: 0.5938 - accuracy: 0.7012
    Epoch 129/1000
    9331/9331 - 189s - loss: 0.5947 - accuracy: 0.7027
    Epoch 130/1000
    9331/9331 - 72s - loss: 0.5945 - accuracy: 0.7043
    Epoch 131/1000
    9331/9331 - 153s - loss: 0.5947 - accuracy: 0.7019
    Epoch 132/1000
    9331/9331 - 99s - loss: 0.5947 - accuracy: 0.7006
    Epoch 133/1000
    9331/9331 - 175s - loss: 0.5950 - accuracy: 0.7013
    Epoch 134/1000
    9331/9331 - 255s - loss: 0.5929 - accuracy: 0.7030
    Epoch 135/1000
    9331/9331 - 277s - loss: 0.5946 - accuracy: 0.7029
    Epoch 136/1000
    9331/9331 - 120s - loss: 0.5942 - accuracy: 0.7052
    Epoch 137/1000
    9331/9331 - 100s - loss: 0.5936 - accuracy: 0.7011
    Epoch 138/1000
    9331/9331 - 69s - loss: 0.5929 - accuracy: 0.7031
    Epoch 139/1000
    9331/9331 - 84s - loss: 0.5934 - accuracy: 0.7010
    Epoch 140/1000
    9331/9331 - 111s - loss: 0.5936 - accuracy: 0.7020
    Epoch 141/1000
    9331/9331 - 73s - loss: 0.5947 - accuracy: 0.6996
    Epoch 142/1000
    9331/9331 - 170s - loss: 0.5926 - accuracy: 0.7032
    Epoch 143/1000
    9331/9331 - 171s - loss: 0.5921 - accuracy: 0.7043
    Epoch 144/1000
    9331/9331 - 344s - loss: 0.5909 - accuracy: 0.7038
    Epoch 145/1000
    9331/9331 - 369s - loss: 0.5919 - accuracy: 0.7030
    Epoch 146/1000
    9331/9331 - 146s - loss: 0.5954 - accuracy: 0.7013
    Epoch 147/1000
    9331/9331 - 299s - loss: 0.5928 - accuracy: 0.7024
    Epoch 148/1000
    9331/9331 - 223s - loss: 0.5906 - accuracy: 0.7051
    Epoch 149/1000
    9331/9331 - 170s - loss: 0.5922 - accuracy: 0.7043
    Epoch 150/1000
    9331/9331 - 137s - loss: 0.5916 - accuracy: 0.7020
    Epoch 151/1000
    9331/9331 - 155s - loss: 0.5915 - accuracy: 0.7047
    Epoch 152/1000
    9331/9331 - 178s - loss: 0.5908 - accuracy: 0.7041
    Epoch 153/1000
    9331/9331 - 68s - loss: 0.5909 - accuracy: 0.7016
    Epoch 154/1000
    9331/9331 - 81s - loss: 0.5914 - accuracy: 0.7043
    Epoch 155/1000
    9331/9331 - 114s - loss: 0.5897 - accuracy: 0.7046
    Epoch 156/1000
    9331/9331 - 92s - loss: 0.5884 - accuracy: 0.7039
    Epoch 157/1000
    9331/9331 - 72s - loss: 0.5916 - accuracy: 0.7047
    Epoch 158/1000
    9331/9331 - 63s - loss: 0.5905 - accuracy: 0.7034
    Epoch 159/1000
    9331/9331 - 88s - loss: 0.5909 - accuracy: 0.7050
    Epoch 160/1000
    9331/9331 - 59s - loss: 0.5903 - accuracy: 0.7049
    Epoch 161/1000
    9331/9331 - 100s - loss: 0.5907 - accuracy: 0.7037
    Epoch 162/1000
    9331/9331 - 58s - loss: 0.5881 - accuracy: 0.7070
    Epoch 163/1000
    9331/9331 - 86s - loss: 0.5866 - accuracy: 0.7086
    Epoch 164/1000
    9331/9331 - 161s - loss: 0.5914 - accuracy: 0.7023
    Epoch 165/1000
    9331/9331 - 59s - loss: 0.5904 - accuracy: 0.7058
    Epoch 166/1000
    9331/9331 - 130s - loss: 0.5878 - accuracy: 0.7059
    Epoch 167/1000
    9331/9331 - 60s - loss: 0.5884 - accuracy: 0.7057
    Epoch 168/1000
    9331/9331 - 94s - loss: 0.5880 - accuracy: 0.7075
    Epoch 169/1000
    9331/9331 - 74s - loss: 0.5884 - accuracy: 0.7091
    Epoch 170/1000
    9331/9331 - 101s - loss: 0.5891 - accuracy: 0.7034
    Epoch 171/1000
    9331/9331 - 173s - loss: 0.5875 - accuracy: 0.7070
    Epoch 172/1000
    9331/9331 - 141s - loss: 0.5888 - accuracy: 0.7043
    Epoch 173/1000
    9331/9331 - 148s - loss: 0.5856 - accuracy: 0.7070
    Epoch 174/1000
    9331/9331 - 115s - loss: 0.5870 - accuracy: 0.7062
    Epoch 175/1000
    9331/9331 - 77s - loss: 0.5883 - accuracy: 0.7051
    Epoch 176/1000
    9331/9331 - 101s - loss: 0.5859 - accuracy: 0.7058
    Epoch 177/1000
    9331/9331 - 1680s - loss: 0.5870 - accuracy: 0.7071
    Epoch 178/1000
    9331/9331 - 256s - loss: 0.5847 - accuracy: 0.7084
    Epoch 179/1000
    9331/9331 - 232s - loss: 0.5849 - accuracy: 0.7094
    Epoch 180/1000
    9331/9331 - 100s - loss: 0.5863 - accuracy: 0.7072
    Epoch 181/1000
    9331/9331 - 74s - loss: 0.5867 - accuracy: 0.7053
    Epoch 182/1000
    9331/9331 - 64s - loss: 0.5839 - accuracy: 0.7084
    Epoch 183/1000
    9331/9331 - 75s - loss: 0.5863 - accuracy: 0.7082
    Epoch 184/1000
    9331/9331 - 69s - loss: 0.5844 - accuracy: 0.7044
    Epoch 185/1000
    9331/9331 - 82s - loss: 0.5847 - accuracy: 0.7096
    Epoch 186/1000
    9331/9331 - 112s - loss: 0.5849 - accuracy: 0.7072
    Epoch 187/1000
    9331/9331 - 323s - loss: 0.5849 - accuracy: 0.7044
    Epoch 188/1000
    9331/9331 - 156s - loss: 0.5841 - accuracy: 0.7077
    Epoch 189/1000
    9331/9331 - 77s - loss: 0.5817 - accuracy: 0.7117
    Epoch 190/1000
    9331/9331 - 74s - loss: 0.5826 - accuracy: 0.7092
    Epoch 191/1000
    9331/9331 - 68s - loss: 0.5847 - accuracy: 0.7066
    Epoch 192/1000
    9331/9331 - 72s - loss: 0.5832 - accuracy: 0.7145
    Epoch 193/1000
    9331/9331 - 110s - loss: 0.5838 - accuracy: 0.7107
    Epoch 194/1000
    9331/9331 - 168s - loss: 0.5828 - accuracy: 0.7094
    Epoch 195/1000
    9331/9331 - 76s - loss: 0.5839 - accuracy: 0.7058
    Epoch 196/1000
    9331/9331 - 61s - loss: 0.5829 - accuracy: 0.7072
    Epoch 197/1000
    9331/9331 - 62s - loss: 0.5827 - accuracy: 0.7126
    Epoch 198/1000
    9331/9331 - 88s - loss: 0.5825 - accuracy: 0.7090
    Epoch 199/1000
    9331/9331 - 59s - loss: 0.5823 - accuracy: 0.7106
    Epoch 200/1000
    9331/9331 - 72s - loss: 0.5825 - accuracy: 0.7107
    Epoch 201/1000
    9331/9331 - 62s - loss: 0.5843 - accuracy: 0.7085
    Epoch 202/1000
    9331/9331 - 61s - loss: 0.5840 - accuracy: 0.7087
    Epoch 203/1000
    9331/9331 - 60s - loss: 0.5820 - accuracy: 0.7090
    Epoch 204/1000
    9331/9331 - 74s - loss: 0.5810 - accuracy: 0.7112
    Epoch 205/1000
    9331/9331 - 61s - loss: 0.5829 - accuracy: 0.7064
    Epoch 206/1000
    9331/9331 - 56s - loss: 0.5818 - accuracy: 0.7110
    Epoch 207/1000
    9331/9331 - 56s - loss: 0.5805 - accuracy: 0.7091
    Epoch 208/1000
    9331/9331 - 63s - loss: 0.5808 - accuracy: 0.7104
    Epoch 209/1000
    9331/9331 - 54s - loss: 0.5824 - accuracy: 0.7112
    Epoch 210/1000
    9331/9331 - 65s - loss: 0.5790 - accuracy: 0.7088
    Epoch 211/1000
    9331/9331 - 58s - loss: 0.5797 - accuracy: 0.7141
    Epoch 212/1000
    9331/9331 - 54s - loss: 0.5791 - accuracy: 0.7118
    Epoch 213/1000
    9331/9331 - 61s - loss: 0.5788 - accuracy: 0.7147
    Epoch 214/1000
    9331/9331 - 54s - loss: 0.5785 - accuracy: 0.7134
    Epoch 215/1000
    9331/9331 - 66s - loss: 0.5793 - accuracy: 0.7141
    Epoch 216/1000
    9331/9331 - 52s - loss: 0.5803 - accuracy: 0.7158
    Epoch 217/1000
    9331/9331 - 54s - loss: 0.5778 - accuracy: 0.7127
    Epoch 218/1000
    9331/9331 - 61s - loss: 0.5783 - accuracy: 0.7139
    Epoch 219/1000
    9331/9331 - 54s - loss: 0.5782 - accuracy: 0.7146
    Epoch 220/1000
    9331/9331 - 57s - loss: 0.5756 - accuracy: 0.7155
    Epoch 221/1000
    9331/9331 - 63s - loss: 0.5776 - accuracy: 0.7143
    Epoch 222/1000
    9331/9331 - 58s - loss: 0.5780 - accuracy: 0.7129
    Epoch 223/1000
    9331/9331 - 59s - loss: 0.5787 - accuracy: 0.7139
    Epoch 224/1000
    9331/9331 - 54s - loss: 0.5752 - accuracy: 0.7131
    Epoch 225/1000
    9331/9331 - 55s - loss: 0.5761 - accuracy: 0.7153
    Epoch 226/1000
    9331/9331 - 60s - loss: 0.5772 - accuracy: 0.7145
    Epoch 227/1000
    9331/9331 - 54s - loss: 0.5760 - accuracy: 0.7140
    Epoch 228/1000
    9331/9331 - 60s - loss: 0.5782 - accuracy: 0.7133
    Epoch 229/1000
    9331/9331 - 54s - loss: 0.5752 - accuracy: 0.7155
    Epoch 230/1000
    9331/9331 - 67s - loss: 0.5778 - accuracy: 0.7106
    Epoch 231/1000
    9331/9331 - 65s - loss: 0.5772 - accuracy: 0.7129
    Epoch 232/1000
    9331/9331 - 56s - loss: 0.5773 - accuracy: 0.7106
    Epoch 233/1000
    9331/9331 - 53s - loss: 0.5765 - accuracy: 0.7150
    Epoch 234/1000
    9331/9331 - 53s - loss: 0.5769 - accuracy: 0.7153
    Epoch 235/1000
    9331/9331 - 55s - loss: 0.5759 - accuracy: 0.7143
    Epoch 236/1000
    9331/9331 - 57s - loss: 0.5747 - accuracy: 0.7159
    Epoch 237/1000
    9331/9331 - 67s - loss: 0.5729 - accuracy: 0.7214
    Epoch 238/1000
    9331/9331 - 55s - loss: 0.5767 - accuracy: 0.7178
    Epoch 239/1000
    9331/9331 - 55s - loss: 0.5766 - accuracy: 0.7146
    Epoch 240/1000
    9331/9331 - 55s - loss: 0.5722 - accuracy: 0.7172
    Epoch 241/1000
    9331/9331 - 55s - loss: 0.5738 - accuracy: 0.7151
    Epoch 242/1000
    9331/9331 - 53s - loss: 0.5769 - accuracy: 0.7143
    Epoch 243/1000
    9331/9331 - 55s - loss: 0.5723 - accuracy: 0.7178
    Epoch 244/1000
    9331/9331 - 56s - loss: 0.5730 - accuracy: 0.7172
    Epoch 245/1000
    9331/9331 - 57s - loss: 0.5758 - accuracy: 0.7158
    Epoch 246/1000
    9331/9331 - 59s - loss: 0.5740 - accuracy: 0.7171
    Epoch 247/1000
    9331/9331 - 56s - loss: 0.5773 - accuracy: 0.7176
    Epoch 248/1000
    9331/9331 - 57s - loss: 0.5748 - accuracy: 0.7154
    Epoch 249/1000
    9331/9331 - 52s - loss: 0.5711 - accuracy: 0.7192
    Epoch 250/1000
    9331/9331 - 55s - loss: 0.5749 - accuracy: 0.7146
    Epoch 251/1000
    9331/9331 - 53s - loss: 0.5719 - accuracy: 0.7150
    Epoch 252/1000
    9331/9331 - 56s - loss: 0.5716 - accuracy: 0.7202
    Epoch 253/1000
    9331/9331 - 58s - loss: 0.5727 - accuracy: 0.7144
    Epoch 254/1000
    9331/9331 - 62s - loss: 0.5725 - accuracy: 0.7143
    Epoch 255/1000
    9331/9331 - 55s - loss: 0.5741 - accuracy: 0.7149
    Epoch 256/1000
    9331/9331 - 54s - loss: 0.5736 - accuracy: 0.7184
    Epoch 257/1000
    9331/9331 - 53s - loss: 0.5723 - accuracy: 0.7173
    Epoch 258/1000
    9331/9331 - 52s - loss: 0.5714 - accuracy: 0.7190
    Epoch 259/1000
    9331/9331 - 54s - loss: 0.5718 - accuracy: 0.7209
    Epoch 260/1000
    9331/9331 - 52s - loss: 0.5703 - accuracy: 0.7187
    Epoch 261/1000
    9331/9331 - 60s - loss: 0.5729 - accuracy: 0.7191
    Epoch 262/1000
    9331/9331 - 55s - loss: 0.5680 - accuracy: 0.7186
    Epoch 263/1000
    9331/9331 - 56s - loss: 0.5690 - accuracy: 0.7178
    Epoch 264/1000
    9331/9331 - 53s - loss: 0.5719 - accuracy: 0.7192
    Epoch 265/1000
    9331/9331 - 55s - loss: 0.5711 - accuracy: 0.7153
    Epoch 266/1000
    9331/9331 - 55s - loss: 0.5723 - accuracy: 0.7203
    Epoch 267/1000
    9331/9331 - 55s - loss: 0.5702 - accuracy: 0.7196
    Epoch 268/1000
    9331/9331 - 57s - loss: 0.5677 - accuracy: 0.7213
    Epoch 269/1000
    9331/9331 - 55s - loss: 0.5687 - accuracy: 0.7228
    Epoch 270/1000
    9331/9331 - 55s - loss: 0.5708 - accuracy: 0.7169
    Epoch 271/1000
    9331/9331 - 58s - loss: 0.5696 - accuracy: 0.7189
    Epoch 272/1000
    9331/9331 - 53s - loss: 0.5700 - accuracy: 0.7175
    Epoch 273/1000
    9331/9331 - 55s - loss: 0.5729 - accuracy: 0.7164
    Epoch 274/1000
    9331/9331 - 56s - loss: 0.5690 - accuracy: 0.7162
    Epoch 275/1000
    9331/9331 - 56s - loss: 0.5715 - accuracy: 0.7188
    Epoch 276/1000
    9331/9331 - 59s - loss: 0.5699 - accuracy: 0.7173
    Epoch 277/1000
    9331/9331 - 70s - loss: 0.5688 - accuracy: 0.7228
    Epoch 278/1000
    9331/9331 - 95s - loss: 0.5676 - accuracy: 0.7216
    Epoch 279/1000
    9331/9331 - 55s - loss: 0.5671 - accuracy: 0.7160
    Epoch 280/1000
    9331/9331 - 55s - loss: 0.5676 - accuracy: 0.7220
    Epoch 281/1000
    9331/9331 - 60s - loss: 0.5686 - accuracy: 0.7195
    Epoch 282/1000
    9331/9331 - 56s - loss: 0.5702 - accuracy: 0.7202
    Epoch 283/1000
    9331/9331 - 64s - loss: 0.5687 - accuracy: 0.7168
    Epoch 284/1000
    9331/9331 - 54s - loss: 0.5702 - accuracy: 0.7162
    Epoch 285/1000
    9331/9331 - 55s - loss: 0.5670 - accuracy: 0.7217
    Epoch 286/1000
    9331/9331 - 52s - loss: 0.5682 - accuracy: 0.7193
    Epoch 287/1000
    9331/9331 - 56s - loss: 0.5682 - accuracy: 0.7158
    Epoch 288/1000
    9331/9331 - 54s - loss: 0.5679 - accuracy: 0.7162
    Epoch 289/1000
    9331/9331 - 58s - loss: 0.5637 - accuracy: 0.7234
    Epoch 290/1000
    9331/9331 - 54s - loss: 0.5622 - accuracy: 0.7230
    Epoch 291/1000
    9331/9331 - 54s - loss: 0.5672 - accuracy: 0.7224
    Epoch 292/1000
    9331/9331 - 55s - loss: 0.5630 - accuracy: 0.7221
    Epoch 293/1000
    9331/9331 - 60s - loss: 0.5677 - accuracy: 0.7219
    Epoch 294/1000
    9331/9331 - 62s - loss: 0.5652 - accuracy: 0.7219
    Epoch 295/1000
    9331/9331 - 57s - loss: 0.5662 - accuracy: 0.7225
    Epoch 296/1000
    9331/9331 - 74s - loss: 0.5641 - accuracy: 0.7205
    Epoch 297/1000
    9331/9331 - 55s - loss: 0.5655 - accuracy: 0.7151
    Epoch 298/1000
    9331/9331 - 53s - loss: 0.5701 - accuracy: 0.7208
    Epoch 299/1000
    9331/9331 - 53s - loss: 0.5668 - accuracy: 0.7210
    Epoch 300/1000
    9331/9331 - 53s - loss: 0.5665 - accuracy: 0.7260
    Epoch 301/1000
    9331/9331 - 58s - loss: 0.5624 - accuracy: 0.7229
    Epoch 302/1000
    9331/9331 - 55s - loss: 0.5625 - accuracy: 0.7193
    Epoch 303/1000
    9331/9331 - 57s - loss: 0.5632 - accuracy: 0.7190
    Epoch 304/1000
    9331/9331 - 54s - loss: 0.5615 - accuracy: 0.7239
    Epoch 305/1000
    9331/9331 - 56s - loss: 0.5634 - accuracy: 0.7204
    Epoch 306/1000
    9331/9331 - 54s - loss: 0.5641 - accuracy: 0.7201
    Epoch 307/1000
    9331/9331 - 56s - loss: 0.5608 - accuracy: 0.7247
    Epoch 308/1000
    9331/9331 - 54s - loss: 0.5628 - accuracy: 0.7240
    Epoch 309/1000
    9331/9331 - 52s - loss: 0.5627 - accuracy: 0.7258
    Epoch 310/1000
    9331/9331 - 54s - loss: 0.5651 - accuracy: 0.7194
    Epoch 311/1000
    9331/9331 - 56s - loss: 0.5621 - accuracy: 0.7240
    Epoch 312/1000
    9331/9331 - 59s - loss: 0.5619 - accuracy: 0.7229
    Epoch 313/1000
    9331/9331 - 54s - loss: 0.5629 - accuracy: 0.7226
    Epoch 314/1000
    9331/9331 - 56s - loss: 0.5600 - accuracy: 0.7252
    Epoch 315/1000
    9331/9331 - 54s - loss: 0.5591 - accuracy: 0.7250
    Epoch 316/1000
    9331/9331 - 54s - loss: 0.5600 - accuracy: 0.7226
    Epoch 317/1000
    9331/9331 - 61s - loss: 0.5624 - accuracy: 0.7236
    Epoch 318/1000
    9331/9331 - 61s - loss: 0.5581 - accuracy: 0.7266
    Epoch 319/1000
    9331/9331 - 55s - loss: 0.5608 - accuracy: 0.7201
    Epoch 320/1000
    9331/9331 - 54s - loss: 0.5627 - accuracy: 0.7244
    Epoch 321/1000
    9331/9331 - 56s - loss: 0.5589 - accuracy: 0.7249
    Epoch 322/1000
    9331/9331 - 53s - loss: 0.5633 - accuracy: 0.7219
    Epoch 323/1000
    9331/9331 - 55s - loss: 0.5588 - accuracy: 0.7261
    Epoch 324/1000
    9331/9331 - 59s - loss: 0.5604 - accuracy: 0.7254
    Epoch 325/1000
    9331/9331 - 56s - loss: 0.5595 - accuracy: 0.7248
    Epoch 326/1000
    9331/9331 - 59s - loss: 0.5582 - accuracy: 0.7231
    Epoch 327/1000
    9331/9331 - 56s - loss: 0.5571 - accuracy: 0.7249
    Epoch 328/1000
    9331/9331 - 54s - loss: 0.5565 - accuracy: 0.7282
    Epoch 329/1000
    9331/9331 - 54s - loss: 0.5573 - accuracy: 0.7274
    Epoch 330/1000
    9331/9331 - 58s - loss: 0.5586 - accuracy: 0.7258
    Epoch 331/1000
    9331/9331 - 56s - loss: 0.5572 - accuracy: 0.7253
    Epoch 332/1000
    9331/9331 - 56s - loss: 0.5591 - accuracy: 0.7284
    Epoch 333/1000
    9331/9331 - 54s - loss: 0.5582 - accuracy: 0.7269
    Epoch 334/1000
    9331/9331 - 58s - loss: 0.5584 - accuracy: 0.7245
    Epoch 335/1000
    9331/9331 - 68s - loss: 0.5581 - accuracy: 0.7221
    Epoch 336/1000
    9331/9331 - 60s - loss: 0.5565 - accuracy: 0.7240
    Epoch 337/1000
    9331/9331 - 54s - loss: 0.5578 - accuracy: 0.7253
    Epoch 338/1000
    9331/9331 - 58s - loss: 0.5571 - accuracy: 0.7261
    Epoch 339/1000
    9331/9331 - 53s - loss: 0.5559 - accuracy: 0.7283
    Epoch 340/1000
    9331/9331 - 53s - loss: 0.5573 - accuracy: 0.7273
    Epoch 341/1000
    9331/9331 - 54s - loss: 0.5537 - accuracy: 0.7250
    Epoch 342/1000
    9331/9331 - 55s - loss: 0.5536 - accuracy: 0.7320
    Epoch 343/1000
    9331/9331 - 53s - loss: 0.5540 - accuracy: 0.7300
    Epoch 344/1000
    9331/9331 - 55s - loss: 0.5529 - accuracy: 0.7311
    Epoch 345/1000
    9331/9331 - 55s - loss: 0.5554 - accuracy: 0.7261
    Epoch 346/1000
    9331/9331 - 54s - loss: 0.5572 - accuracy: 0.7232
    Epoch 347/1000
    9331/9331 - 54s - loss: 0.5536 - accuracy: 0.7305
    Epoch 348/1000
    9331/9331 - 53s - loss: 0.5573 - accuracy: 0.7253
    Epoch 349/1000
    9331/9331 - 53s - loss: 0.5562 - accuracy: 0.7262
    Epoch 350/1000
    9331/9331 - 56s - loss: 0.5550 - accuracy: 0.7267
    Epoch 351/1000
    9331/9331 - 59s - loss: 0.5553 - accuracy: 0.7274
    Epoch 352/1000
    9331/9331 - 60s - loss: 0.5544 - accuracy: 0.7271
    Epoch 353/1000
    9331/9331 - 56s - loss: 0.5551 - accuracy: 0.7304
    Epoch 354/1000
    9331/9331 - 54s - loss: 0.5546 - accuracy: 0.7277
    Epoch 355/1000
    9331/9331 - 54s - loss: 0.5568 - accuracy: 0.7260
    Epoch 356/1000
    9331/9331 - 54s - loss: 0.5531 - accuracy: 0.7282
    Epoch 357/1000
    9331/9331 - 53s - loss: 0.5506 - accuracy: 0.7295
    Epoch 358/1000
    9331/9331 - 53s - loss: 0.5514 - accuracy: 0.7270
    Epoch 359/1000
    9331/9331 - 53s - loss: 0.5528 - accuracy: 0.7310
    Epoch 360/1000
    9331/9331 - 55s - loss: 0.5521 - accuracy: 0.7292
    Epoch 361/1000
    9331/9331 - 55s - loss: 0.5507 - accuracy: 0.7309
    Epoch 362/1000
    9331/9331 - 54s - loss: 0.5504 - accuracy: 0.7325
    Epoch 363/1000
    9331/9331 - 53s - loss: 0.5507 - accuracy: 0.7323
    Epoch 364/1000
    9331/9331 - 54s - loss: 0.5496 - accuracy: 0.7308
    Epoch 365/1000
    9331/9331 - 54s - loss: 0.5522 - accuracy: 0.7322
    Epoch 366/1000
    9331/9331 - 53s - loss: 0.5509 - accuracy: 0.7304
    Epoch 367/1000
    9331/9331 - 54s - loss: 0.5508 - accuracy: 0.7284
    Epoch 368/1000
    9331/9331 - 53s - loss: 0.5490 - accuracy: 0.7290
    Epoch 369/1000
    9331/9331 - 53s - loss: 0.5496 - accuracy: 0.7324
    Epoch 370/1000
    9331/9331 - 56s - loss: 0.5489 - accuracy: 0.7295
    Epoch 371/1000
    9331/9331 - 54s - loss: 0.5498 - accuracy: 0.7321
    Epoch 372/1000
    9331/9331 - 54s - loss: 0.5472 - accuracy: 0.7349
    Epoch 373/1000
    9331/9331 - 54s - loss: 0.5505 - accuracy: 0.7328
    Epoch 374/1000
    9331/9331 - 53s - loss: 0.5490 - accuracy: 0.7329
    Epoch 375/1000
    9331/9331 - 55s - loss: 0.5465 - accuracy: 0.7336
    Epoch 376/1000
    9331/9331 - 55s - loss: 0.5487 - accuracy: 0.7315
    Epoch 377/1000
    9331/9331 - 56s - loss: 0.5487 - accuracy: 0.7316
    Epoch 378/1000
    9331/9331 - 53s - loss: 0.5471 - accuracy: 0.7310
    Epoch 379/1000
    9331/9331 - 53s - loss: 0.5494 - accuracy: 0.7318
    Epoch 380/1000
    9331/9331 - 54s - loss: 0.5502 - accuracy: 0.7325
    Epoch 381/1000
    9331/9331 - 54s - loss: 0.5484 - accuracy: 0.7320
    Epoch 382/1000
    9331/9331 - 53s - loss: 0.5499 - accuracy: 0.7313
    Epoch 383/1000
    9331/9331 - 54s - loss: 0.5466 - accuracy: 0.7349
    Epoch 384/1000
    9331/9331 - 53s - loss: 0.5510 - accuracy: 0.7275
    Epoch 385/1000
    9331/9331 - 54s - loss: 0.5497 - accuracy: 0.7321
    Epoch 386/1000
    9331/9331 - 59s - loss: 0.5477 - accuracy: 0.7325
    Epoch 387/1000
    9331/9331 - 53s - loss: 0.5498 - accuracy: 0.7315
    Epoch 388/1000
    9331/9331 - 53s - loss: 0.5479 - accuracy: 0.7352
    Epoch 389/1000
    9331/9331 - 54s - loss: 0.5478 - accuracy: 0.7325
    Epoch 390/1000
    9331/9331 - 53s - loss: 0.5487 - accuracy: 0.7330
    Epoch 392/1000
    9331/9331 - 55s - loss: 0.5477 - accuracy: 0.7307
    Epoch 393/1000
    9331/9331 - 54s - loss: 0.5467 - accuracy: 0.7282
    Epoch 394/1000
    9331/9331 - 57s - loss: 0.5478 - accuracy: 0.7303
    Epoch 395/1000
    9331/9331 - 53s - loss: 0.5461 - accuracy: 0.7361
    Epoch 396/1000
    9331/9331 - 54s - loss: 0.5407 - accuracy: 0.7379
    Epoch 397/1000
    9331/9331 - 53s - loss: 0.5472 - accuracy: 0.7367
    Epoch 398/1000
    9331/9331 - 53s - loss: 0.5469 - accuracy: 0.7361
    Epoch 399/1000
    9331/9331 - 53s - loss: 0.5453 - accuracy: 0.7274
    Epoch 400/1000
    9331/9331 - 61s - loss: 0.5456 - accuracy: 0.7320
    Epoch 401/1000
    9331/9331 - 56s - loss: 0.5429 - accuracy: 0.7358
    Epoch 402/1000
    9331/9331 - 62s - loss: 0.5456 - accuracy: 0.7327
    Epoch 403/1000
    9331/9331 - 56s - loss: 0.5408 - accuracy: 0.7352
    Epoch 404/1000
    9331/9331 - 55s - loss: 0.5443 - accuracy: 0.7368
    Epoch 405/1000
    9331/9331 - 55s - loss: 0.5410 - accuracy: 0.7359
    Epoch 406/1000
    9331/9331 - 57s - loss: 0.5415 - accuracy: 0.7337
    Epoch 407/1000
    9331/9331 - 54s - loss: 0.5441 - accuracy: 0.7338
    Epoch 408/1000
    9331/9331 - 56s - loss: 0.5428 - accuracy: 0.7364
    Epoch 409/1000
    9331/9331 - 54s - loss: 0.5458 - accuracy: 0.7338
    Epoch 410/1000
    9331/9331 - 57s - loss: 0.5454 - accuracy: 0.7274
    Epoch 411/1000
    9331/9331 - 57s - loss: 0.5435 - accuracy: 0.7374
    Epoch 412/1000
    9331/9331 - 56s - loss: 0.5447 - accuracy: 0.7357
    Epoch 413/1000
    9331/9331 - 54s - loss: 0.5444 - accuracy: 0.7370
    Epoch 414/1000
    9331/9331 - 57s - loss: 0.5423 - accuracy: 0.7348
    Epoch 415/1000
    9331/9331 - 53s - loss: 0.5435 - accuracy: 0.7376
    Epoch 416/1000
    9331/9331 - 55s - loss: 0.5433 - accuracy: 0.7365
    Epoch 417/1000
    9331/9331 - 55s - loss: 0.5398 - accuracy: 0.7376
    Epoch 418/1000
    9331/9331 - 57s - loss: 0.5444 - accuracy: 0.7369
    Epoch 419/1000
    9331/9331 - 64s - loss: 0.5435 - accuracy: 0.7316
    Epoch 420/1000
    9331/9331 - 55s - loss: 0.5408 - accuracy: 0.7333
    Epoch 421/1000
    9331/9331 - 53s - loss: 0.5406 - accuracy: 0.7373
    Epoch 422/1000
    9331/9331 - 52s - loss: 0.5391 - accuracy: 0.7380
    Epoch 423/1000
    9331/9331 - 54s - loss: 0.5404 - accuracy: 0.7396
    Epoch 424/1000
    9331/9331 - 56s - loss: 0.5400 - accuracy: 0.7375
    Epoch 425/1000
    9331/9331 - 53s - loss: 0.5381 - accuracy: 0.7383
    Epoch 426/1000
    9331/9331 - 55s - loss: 0.5426 - accuracy: 0.7366
    Epoch 427/1000
    9331/9331 - 64s - loss: 0.5376 - accuracy: 0.7365
    Epoch 428/1000
    9331/9331 - 56s - loss: 0.5416 - accuracy: 0.7409
    Epoch 429/1000
    9331/9331 - 56s - loss: 0.5368 - accuracy: 0.7416
    Epoch 430/1000
    9331/9331 - 57s - loss: 0.5409 - accuracy: 0.7381
    Epoch 431/1000
    9331/9331 - 53s - loss: 0.5379 - accuracy: 0.7391
    Epoch 432/1000
    9331/9331 - 54s - loss: 0.5411 - accuracy: 0.7354
    Epoch 433/1000
    9331/9331 - 56s - loss: 0.5393 - accuracy: 0.7370
    Epoch 434/1000
    9331/9331 - 55s - loss: 0.5352 - accuracy: 0.7435
    Epoch 435/1000
    9331/9331 - 57s - loss: 0.5362 - accuracy: 0.7394
    Epoch 436/1000
    9331/9331 - 55s - loss: 0.5397 - accuracy: 0.7365
    Epoch 437/1000
    9331/9331 - 55s - loss: 0.5403 - accuracy: 0.7401
    Epoch 438/1000
    9331/9331 - 55s - loss: 0.5385 - accuracy: 0.7372
    Epoch 439/1000
    9331/9331 - 57s - loss: 0.5357 - accuracy: 0.7368
    Epoch 440/1000
    9331/9331 - 53s - loss: 0.5401 - accuracy: 0.7389
    Epoch 441/1000
    9331/9331 - 53s - loss: 0.5359 - accuracy: 0.7398
    Epoch 442/1000
    9331/9331 - 55s - loss: 0.5404 - accuracy: 0.7344
    Epoch 443/1000
    9331/9331 - 53s - loss: 0.5416 - accuracy: 0.7375
    Epoch 444/1000
    9331/9331 - 56s - loss: 0.5363 - accuracy: 0.7412
    Epoch 445/1000
    9331/9331 - 55s - loss: 0.5376 - accuracy: 0.7368
    Epoch 446/1000
    9331/9331 - 55s - loss: 0.5360 - accuracy: 0.7411
    Epoch 447/1000
    9331/9331 - 56s - loss: 0.5380 - accuracy: 0.7345
    Epoch 448/1000
    9331/9331 - 52s - loss: 0.5358 - accuracy: 0.7390
    Epoch 449/1000
    9331/9331 - 53s - loss: 0.5336 - accuracy: 0.7438
    Epoch 450/1000
    9331/9331 - 54s - loss: 0.5339 - accuracy: 0.7389
    Epoch 451/1000
    9331/9331 - 52s - loss: 0.5347 - accuracy: 0.7430
    Epoch 452/1000
    9331/9331 - 54s - loss: 0.5365 - accuracy: 0.7390
    Epoch 453/1000
    9331/9331 - 55s - loss: 0.5335 - accuracy: 0.7442
    Epoch 454/1000
    9331/9331 - 53s - loss: 0.5354 - accuracy: 0.7428
    Epoch 455/1000
    9331/9331 - 53s - loss: 0.5325 - accuracy: 0.7412
    Epoch 456/1000
    9331/9331 - 53s - loss: 0.5384 - accuracy: 0.7386
    Epoch 457/1000
    9331/9331 - 52s - loss: 0.5337 - accuracy: 0.7440
    Epoch 458/1000
    9331/9331 - 54s - loss: 0.5345 - accuracy: 0.7416
    Epoch 459/1000
    9331/9331 - 52s - loss: 0.5346 - accuracy: 0.7446
    Epoch 460/1000
    9331/9331 - 53s - loss: 0.5373 - accuracy: 0.7416
    Epoch 461/1000
    9331/9331 - 55s - loss: 0.5344 - accuracy: 0.7376
    Epoch 462/1000
    9331/9331 - 53s - loss: 0.5374 - accuracy: 0.7357
    Epoch 463/1000
    9331/9331 - 53s - loss: 0.5323 - accuracy: 0.7446
    Epoch 464/1000
    9331/9331 - 54s - loss: 0.5346 - accuracy: 0.7426
    Epoch 465/1000
    9331/9331 - 53s - loss: 0.5357 - accuracy: 0.7382
    Epoch 466/1000
    9331/9331 - 54s - loss: 0.5288 - accuracy: 0.7467
    Epoch 467/1000
    9331/9331 - 54s - loss: 0.5351 - accuracy: 0.7386
    Epoch 468/1000
    9331/9331 - 54s - loss: 0.5319 - accuracy: 0.7400
    Epoch 469/1000
    9331/9331 - 55s - loss: 0.5338 - accuracy: 0.7454
    Epoch 470/1000
    9331/9331 - 54s - loss: 0.5356 - accuracy: 0.7420
    Epoch 471/1000
    9331/9331 - 54s - loss: 0.5336 - accuracy: 0.7403
    Epoch 472/1000
    9331/9331 - 56s - loss: 0.5301 - accuracy: 0.7419
    Epoch 473/1000
    9331/9331 - 56s - loss: 0.5327 - accuracy: 0.7464
    Epoch 474/1000
    9331/9331 - 55s - loss: 0.5326 - accuracy: 0.7398
    Epoch 475/1000
    9331/9331 - 53s - loss: 0.5325 - accuracy: 0.7459
    Epoch 476/1000
    9331/9331 - 55s - loss: 0.5290 - accuracy: 0.7468
    Epoch 477/1000
    9331/9331 - 53s - loss: 0.5345 - accuracy: 0.7413
    Epoch 478/1000
    9331/9331 - 55s - loss: 0.5320 - accuracy: 0.7411
    Epoch 479/1000
    9331/9331 - 56s - loss: 0.5313 - accuracy: 0.7415
    Epoch 480/1000
    9331/9331 - 55s - loss: 0.5354 - accuracy: 0.7426
    Epoch 481/1000
    9331/9331 - 55s - loss: 0.5307 - accuracy: 0.7443
    Epoch 482/1000
    9331/9331 - 54s - loss: 0.5293 - accuracy: 0.7431
    Epoch 483/1000
    9331/9331 - 55s - loss: 0.5315 - accuracy: 0.7401
    Epoch 484/1000
    9331/9331 - 54s - loss: 0.5306 - accuracy: 0.7400
    Epoch 485/1000
    9331/9331 - 55s - loss: 0.5321 - accuracy: 0.7423
    Epoch 486/1000
    9331/9331 - 54s - loss: 0.5348 - accuracy: 0.7462
    Epoch 487/1000
    9331/9331 - 53s - loss: 0.5305 - accuracy: 0.7448
    Epoch 488/1000
    9331/9331 - 55s - loss: 0.5297 - accuracy: 0.7435
    Epoch 489/1000
    9331/9331 - 54s - loss: 0.5285 - accuracy: 0.7452
    Epoch 490/1000
    9331/9331 - 54s - loss: 0.5291 - accuracy: 0.7438
    Epoch 491/1000
    9331/9331 - 55s - loss: 0.5291 - accuracy: 0.7442
    Epoch 492/1000
    9331/9331 - 57s - loss: 0.5291 - accuracy: 0.7449
    Epoch 493/1000
    9331/9331 - 54s - loss: 0.5292 - accuracy: 0.7442
    Epoch 494/1000
    9331/9331 - 54s - loss: 0.5251 - accuracy: 0.7501
    Epoch 495/1000
    9331/9331 - 55s - loss: 0.5299 - accuracy: 0.7434
    Epoch 496/1000
    9331/9331 - 56s - loss: 0.5279 - accuracy: 0.7459
    Epoch 497/1000
    9331/9331 - 53s - loss: 0.5273 - accuracy: 0.7470
    Epoch 498/1000
    9331/9331 - 56s - loss: 0.5292 - accuracy: 0.7479
    Epoch 499/1000
    9331/9331 - 53s - loss: 0.5285 - accuracy: 0.7445
    Epoch 500/1000
    9331/9331 - 55s - loss: 0.5299 - accuracy: 0.7439
    Epoch 501/1000
    9331/9331 - 54s - loss: 0.5237 - accuracy: 0.7446
    Epoch 502/1000
    9331/9331 - 54s - loss: 0.5288 - accuracy: 0.7430
    Epoch 503/1000
    9331/9331 - 57s - loss: 0.5271 - accuracy: 0.7450
    Epoch 504/1000
    9331/9331 - 56s - loss: 0.5292 - accuracy: 0.7439
    Epoch 505/1000
    9331/9331 - 54s - loss: 0.5302 - accuracy: 0.7478
    Epoch 506/1000
    9331/9331 - 53s - loss: 0.5312 - accuracy: 0.7426
    Epoch 507/1000
    9331/9331 - 53s - loss: 0.5252 - accuracy: 0.7501
    Epoch 508/1000
    9331/9331 - 54s - loss: 0.5255 - accuracy: 0.7484
    Epoch 509/1000
    9331/9331 - 54s - loss: 0.5222 - accuracy: 0.7503
    Epoch 510/1000
    9331/9331 - 53s - loss: 0.5274 - accuracy: 0.7414
    Epoch 511/1000
    9331/9331 - 56s - loss: 0.5242 - accuracy: 0.7470
    Epoch 512/1000
    9331/9331 - 55s - loss: 0.5267 - accuracy: 0.7457
    Epoch 513/1000
    9331/9331 - 54s - loss: 0.5296 - accuracy: 0.7441
    Epoch 514/1000
    9331/9331 - 53s - loss: 0.5289 - accuracy: 0.7433
    Epoch 515/1000
    9331/9331 - 54s - loss: 0.5249 - accuracy: 0.7438
    Epoch 516/1000
    9331/9331 - 55s - loss: 0.5291 - accuracy: 0.7429
    Epoch 517/1000
    9331/9331 - 57s - loss: 0.5292 - accuracy: 0.7431
    Epoch 518/1000
    9331/9331 - 55s - loss: 0.5269 - accuracy: 0.7447
    Epoch 519/1000
    9331/9331 - 54s - loss: 0.5252 - accuracy: 0.7470
    Epoch 520/1000
    9331/9331 - 55s - loss: 0.5254 - accuracy: 0.7472
    Epoch 521/1000
    9331/9331 - 55s - loss: 0.5245 - accuracy: 0.7497
    Epoch 522/1000
    9331/9331 - 56s - loss: 0.5262 - accuracy: 0.7467
    Epoch 523/1000
    9331/9331 - 56s - loss: 0.5289 - accuracy: 0.7478
    Epoch 524/1000
    9331/9331 - 58s - loss: 0.5216 - accuracy: 0.7477
    Epoch 525/1000
    9331/9331 - 55s - loss: 0.5256 - accuracy: 0.7452
    Epoch 526/1000
    9331/9331 - 56s - loss: 0.5281 - accuracy: 0.7439
    Epoch 527/1000
    9331/9331 - 55s - loss: 0.5242 - accuracy: 0.7472
    Epoch 528/1000
    9331/9331 - 56s - loss: 0.5226 - accuracy: 0.7448
    Epoch 529/1000
    9331/9331 - 55s - loss: 0.5246 - accuracy: 0.7485
    Epoch 530/1000
    9331/9331 - 56s - loss: 0.5174 - accuracy: 0.7523
    Epoch 531/1000
    9331/9331 - 55s - loss: 0.5207 - accuracy: 0.7505
    Epoch 532/1000
    9331/9331 - 55s - loss: 0.5213 - accuracy: 0.7521
    Epoch 533/1000
    9331/9331 - 55s - loss: 0.5206 - accuracy: 0.7517
    Epoch 534/1000
    9331/9331 - 54s - loss: 0.5186 - accuracy: 0.7552
    Epoch 535/1000
    9331/9331 - 53s - loss: 0.5215 - accuracy: 0.7513
    Epoch 536/1000
    9331/9331 - 56s - loss: 0.5212 - accuracy: 0.7492
    Epoch 537/1000
    9331/9331 - 54s - loss: 0.5188 - accuracy: 0.7525
    Epoch 538/1000
    9331/9331 - 56s - loss: 0.5252 - accuracy: 0.7493
    Epoch 539/1000
    9331/9331 - 54s - loss: 0.5198 - accuracy: 0.7499
    Epoch 540/1000
    9331/9331 - 54s - loss: 0.5233 - accuracy: 0.7469
    Epoch 541/1000
    9331/9331 - 53s - loss: 0.5210 - accuracy: 0.7479
    Epoch 542/1000
    9331/9331 - 53s - loss: 0.5242 - accuracy: 0.7472
    Epoch 543/1000
    9331/9331 - 53s - loss: 0.5214 - accuracy: 0.7488
    Epoch 544/1000
    9331/9331 - 54s - loss: 0.5163 - accuracy: 0.7569
    Epoch 545/1000
    9331/9331 - 55s - loss: 0.5211 - accuracy: 0.7474
    Epoch 546/1000
    9331/9331 - 53s - loss: 0.5216 - accuracy: 0.7544
    Epoch 547/1000
    9331/9331 - 56s - loss: 0.5193 - accuracy: 0.7498
    Epoch 548/1000
    9331/9331 - 53s - loss: 0.5229 - accuracy: 0.7472
    Epoch 549/1000
    9331/9331 - 54s - loss: 0.5166 - accuracy: 0.7544
    Epoch 550/1000
    9331/9331 - 53s - loss: 0.5190 - accuracy: 0.7502
    Epoch 551/1000
    9331/9331 - 53s - loss: 0.5259 - accuracy: 0.7460
    Epoch 552/1000
    9331/9331 - 54s - loss: 0.5216 - accuracy: 0.7482
    Epoch 553/1000
    9331/9331 - 55s - loss: 0.5186 - accuracy: 0.7544
    Epoch 554/1000
    9331/9331 - 55s - loss: 0.5229 - accuracy: 0.7443
    Epoch 555/1000
    9331/9331 - 53s - loss: 0.5230 - accuracy: 0.7489
    Epoch 556/1000
    9331/9331 - 54s - loss: 0.5223 - accuracy: 0.7464
    Epoch 557/1000
    9331/9331 - 52s - loss: 0.5215 - accuracy: 0.7507
    Epoch 558/1000
    9331/9331 - 52s - loss: 0.5177 - accuracy: 0.7504
    Epoch 559/1000
    9331/9331 - 54s - loss: 0.5193 - accuracy: 0.7539
    Epoch 560/1000
    9331/9331 - 53s - loss: 0.5154 - accuracy: 0.7515
    Epoch 561/1000
    9331/9331 - 54s - loss: 0.5166 - accuracy: 0.7533
    Epoch 562/1000
    9331/9331 - 53s - loss: 0.5202 - accuracy: 0.7520
    Epoch 563/1000
    9331/9331 - 53s - loss: 0.5180 - accuracy: 0.7461
    Epoch 564/1000
    9331/9331 - 54s - loss: 0.5198 - accuracy: 0.7527
    Epoch 565/1000
    9331/9331 - 52s - loss: 0.5191 - accuracy: 0.7528
    Epoch 566/1000
    9331/9331 - 52s - loss: 0.5140 - accuracy: 0.7546
    ...(epochs 567-997 omitted: loss declines steadily from about 0.51 to 0.46, accuracy rises from about 0.75 to 0.79, roughly 52-76s per epoch)...
    Epoch 998/1000
    9331/9331 - 53s - loss: 0.4538 - accuracy: 0.7875
    Epoch 999/1000
    9331/9331 - 54s - loss: 0.4518 - accuracy: 0.7893
    Epoch 1000/1000
    9331/9331 - 65s - loss: 0.4598 - accuracy: 0.7837
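    A quick sanity check on a log like the one above is to parse the per-epoch loss/accuracy into series and confirm the model is still improving. A minimal sketch (the regex matches the Keras-style `verbose=2` lines shown here; `parse_log` is a hypothetical helper, not part of the platform):

    ```python
    import re

    # Matches Keras-style verbose=2 lines such as:
    # "9331/9331 - 54s - loss: 0.4518 - accuracy: 0.7893"
    pattern = re.compile(r"loss: ([\d.]+) - accuracy: ([\d.]+)")

    def parse_log(lines):
        """Extract (loss, accuracy) pairs from training-log lines."""
        pairs = []
        for line in lines:
            m = pattern.search(line)
            if m:
                pairs.append((float(m.group(1)), float(m.group(2))))
        return pairs

    log = [
        "Epoch 999/1000",
        "9331/9331 - 54s - loss: 0.4518 - accuracy: 0.7893",
        "Epoch 1000/1000",
        "9331/9331 - 65s - loss: 0.4598 - accuracy: 0.7837",
    ]
    print(parse_log(log))  # [(0.4518, 0.7893), (0.4598, 0.7837)]
    ```

    Plotting these series against validation metrics would also reveal whether 1000 epochs is overfitting the single-stock training set.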
    
    d8e2de4a5fbc4a74ada448eb07b0870bT not found.
    
    

    Base feature extraction (general_feature_extractor) failed with a usage error:

    Traceback (most recent call last):
      File "module2/common/moduleinvoker.py", line 208, in biglearning.module2.common.moduleinvoker._invoke_with_cache
      File "module2/common/moduleinvoker.py", line 165, in biglearning.module2.common.moduleinvoker._module_run
      File "module2/modules/general_feature_extractor/v7/__init__.py", line 64, in biglearning.module2.modules.general_feature_extractor.v7.__init__.BigQuantModule.__init__
      File "module2/modules/general_feature_extractor/v7/__init__.py", line 68, in biglearning.module2.modules.general_feature_extractor.v7.__init__.BigQuantModule.__update_feature_state
    TypeError: 'NoneType' object is not iterable
    
    During handling of the above exception, another exception occurred:
    
    Traceback (most recent call last):
      File "/usr/local/python3/lib/python3.5/site-packages/logbook/handlers.py", line 213, in handle
        self.emit(record)
      File "/usr/local/python3/lib/python3.5/site-packages/logbook/handlers.py", line 839, in emit
        self.perform_rollover()
      File "/usr/local/python3/lib/python3.5/site-packages/logbook/handlers.py", line 828, in perform_rollover
        self.stream.close()
    AttributeError: 'NoneType' object has no attribute 'close'
    Logged from file <ipython-input-203-55ef858326f6>, line 320
    
    ---------------------------------------------------------------------------
    TypeError                                 Traceback (most recent call last)
    <ipython-input-203-55ef858326f6> in <module>()
        318     start_date='',
        319     end_date='',
    --> 320     before_start_days=90
        321 )
        322 
    
    TypeError: 'NoneType' object is not iterable
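    The traceback suggests that by epoch-heavy runs the `features` argument reaching `general_feature_extractor` ends up as `None`, and `__update_feature_state` then tries to iterate it. A hypothetical reproduction of that failure mode (`update_feature_state` is a stand-in, not the platform's actual function):

    ```python
    # Stand-in for __update_feature_state, which iterates the feature list.
    def update_feature_state(features):
        return list(features)

    # If an upstream module produced no output, features is None and
    # iterating it raises exactly the error seen in the traceback.
    try:
        update_feature_state(None)
        msg = ""
    except TypeError as e:
        msg = str(e)
    print(msg)  # 'NoneType' object is not iterable
    ```

    In other words, the fix is likely upstream: verify that the input-features module still produces output (e.g. after a long run or a cached result expiring) before it is passed to the extractor.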

    (小Q) #2

    It works fine with a small number of epochs. Please try again — does the problem only appear after training for many epochs?

    克隆策略

    使用深度学习技术预测股票价格

    版本 v1.0

    目录

    • ### 深度学习策略的交易规则

    • ### 策略构建步骤

    • ### 策略的实现

    正文

I. Trading Rules of the Deep Learning Strategy

• Buy condition: if the predicted probability of a rise is > 0.5, buy or keep the existing position.
• Sell condition: if the predicted probability of a rise is < 0.5, sell any holdings.

II. Strategy Construction Steps

1. Choose the stock pool and the data date range

• In the instrument-list modules m24 and m28, enter the single stock to backtest and the start and end dates of the data (for the training set and the validation set respectively).

2. Choose the factors

• In the input-features module m8, enter the N factor expressions used for prediction.

3. Fetch the base data

• The basic feature extraction modules m22 and m16 fetch the base data, such as the close price, for the specified stock pool.

4. Define and compute the labels

• The auto-labeling module m21 computes the label: first the forward 10-day return is calculated, then each daily bar is labeled 1 or 0 according to whether that return is positive or negative, marking a rise or a fall.
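The labeling step can be sketched with pandas; `close` below is a hypothetical daily close-price series, and the 10-bar horizon mirrors the 10-day forward return described above (the platform's auto-labeling module is the authoritative implementation):

```python
import pandas as pd

# Hypothetical daily close prices for a single stock
close = pd.Series([10.0, 10.2, 9.9, 10.5, 10.8, 10.4,
                   10.9, 11.2, 11.0, 11.5, 11.3, 11.8])

# Forward 10-day return: the close 10 bars ahead versus today's close
future_return = close.shift(-10) / close - 1

# Label 1 if the forward return is positive, 0 otherwise;
# the last 10 bars have no forward return and stay unlabeled (NaN)
label = (future_return > 0).astype(int).where(future_return.notna())
```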

5. Extract the factor data

• The derived feature extraction modules m23 and m26 compute the factor data.
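Factor expressions of the kind entered in m8, such as `(close_0/open_0-1)*10`, amount to column-wise arithmetic over the base data; a minimal pandas sketch with hypothetical prices:

```python
import pandas as pd

# Hypothetical base-feature frame mimicking columns such as open_0 / close_0
df = pd.DataFrame({"open_0": [10.0, 10.5, 10.2],
                   "close_0": [10.5, 10.3, 10.8]})

# Column-wise factor in the spirit of the expression (close_0/open_0-1)*10
df["intraday_ret_x10"] = (df["close_0"] / df["open_0"] - 1) * 10
```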

6. Merge the labels with the factor data

• The join module m17 merges the factor data with the label data.

7. Generate rolling-window sequence datasets

• The rolling-window (deep learning) modules m25 and m27 turn the training-set and prediction-set data into sequences of a fixed window length, preparing them for model training and prediction.
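The rolling-window transformation can be sketched in plain NumPy (the platform module is the authoritative implementation; this toy example uses a shortened window of 5 rather than the 50 used in the example):

```python
import numpy as np

def make_windows(features: np.ndarray, window: int) -> np.ndarray:
    """Stack overlapping windows of `window` consecutive rows.

    features: (n_days, n_factors) -> (n_days - window + 1, window, n_factors)
    """
    return np.stack([features[i:i + window]
                     for i in range(len(features) - window + 1)])

# Toy data: 8 days x 3 factors, window of 5
x = np.arange(24, dtype=float).reshape(8, 3)
windows = make_windows(x, window=5)
```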

8. Build the LSTM + CNN model architecture

• From the module list on the left of the canvas, drag in an Input layer, a Reshape layer, a Conv2D layer, a Reshape layer, an LSTM layer, a Dropout layer and two Dense (fully connected) layer modules in that order to form the network, then assemble the layers with the "Build (deep learning)" module. Note the following:

  The Input layer's shape parameter is window length × number of factors; in this example, 50 rows × 5 factors.

  The first Reshape layer's parameter is window length × number of factors × 1; here 50 × 5 × 1.

  The Conv2D layer's kernel_size parameter is the size of the sliding window. This example uses a 3-row × 5-column window, a stride of 1 row × 1 column, and 32 convolution kernels; this window setting determines the parameter of the Reshape layer that follows.

  The second Reshape layer's target_shape parameter is determined by the window length × number of factors together with the Conv2D window size and stride. For the 50-row × 5-factor input here, sliding a 3 × 5 window one row at a time yields 48 positions (48 slides of the 3 × 5 window cover the whole input), so target_shape = 48 × 32 (the number of kernels).

  The LSTM layer's output dimension is set to the number of kernels, 32, together with an activation function.

  The Dropout layer randomly drops part of the activations during training to curb overfitting; here rate is set to 0.8.

  There are two Dense layers: the first keeps its output dimension equal to the LSTM's, 32; the second maps those 32 dimensions down to a single output, the predicted label. Here that is a continuous value between 0 and 1, which can be read as the probability of a rise.
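The target_shape arithmetic above follows the standard "valid" (no padding) convolution formula, which can be checked directly:

```python
def conv2d_valid_output(rows, cols, k_rows, k_cols, s_rows=1, s_cols=1):
    """Output height and width of a 'valid' (no padding) 2-D convolution."""
    return (rows - k_rows) // s_rows + 1, (cols - k_cols) // s_cols + 1

out_h, out_w = conv2d_valid_output(50, 5, 3, 5)  # 48 rows, 1 column
positions = out_h * out_w                        # 48 slides of the 3 x 5 kernel
filters = 32
target_shape = (positions, filters)              # fed to the LSTM as (48, 32)
```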

9. Train the deep learning model

• Drag the "Train (deep learning)" module m6 onto the canvas and set, in its properties, the optimizer, the loss function, the evaluation metrics, the batch_size, the number of epochs, the number of GPUs, and the logging frequency.

10. Predict with the deep learning model

• Drag the "Predict (deep learning)" module m7 onto the canvas and pass it the model output of the "Train (deep learning)" module m6 together with the validation set's rolling-window dataset; the module then predicts the probability of a rise from the validation data.

11. Splice the predictions with the dates

• The custom module m2 takes the last value of each rolling window's prediction as that day's forecast and splices it with the date column of the prediction set, producing the final daily predictions.
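The splicing performed by m2 can be sketched with pandas; `pred` is a hypothetical array of per-window probabilities and `dates` stands in for the prediction set's date column:

```python
import numpy as np
import pandas as pd

# Hypothetical model output: one probability per rolling window
pred = np.array([[0.62], [0.48], [0.71]])
# Stand-in for the prediction set's date column
dates = pd.to_datetime(["2017-06-01", "2017-06-02", "2017-06-05",
                        "2017-06-06", "2017-06-07"])

# Flatten to 1-D and align each window's prediction with the last dates
pred_flat = pred.reshape(pred.shape[0])
daily_pred = pd.Series(pred_flat, index=dates[-len(pred_flat):])
```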

12. Build the strategy from the model's predictions

• If the predicted probability of a rise for the day is greater than 0.5, hold the position or buy.

• If the predicted probability of a rise for the day is less than 0.5, sell the holdings or stay in cash.
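These two rules can be sketched as a stateful signal generator (a simplification; the actual trade module also handles order sizing, commissions, and slippage):

```python
def run_signals(probs, threshold=0.5):
    """Turn daily rise probabilities into buy / sell / hold actions."""
    holding = False
    actions = []
    for p in probs:
        if p > threshold and not holding:
            holding = True
            actions.append("buy")
        elif p < threshold and holding:
            holding = False
            actions.append("sell")
        else:
            actions.append("hold")
    return actions

actions = run_signals([0.6, 0.7, 0.4, 0.3, 0.8])
```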

13. Run the backtest

• In the trade module's initialization function, define the commission and slippage, and read the daily rise-probability predictions via context.prediction;

• In the trade module's main function (the handle function), check each day's trading signals and execute the corresponding buy or sell orders according to the rules above.

III. Strategy Implementation

The visual strategy implementation is as follows:

    In [13]:
    m26.data.read().head()
    
    Out[13]:
    avg_amount_0 avg_amount_4 avg_mf_net_amount_4 avg_turn_0 avg_turn_4 close_0 date high_0 instrument low_0 ... ta_rsi(close_0, timeperiod=14)/50-1 ta_mom(close_0, timeperiod=14)/5 ta_adx(high_0, low_0, close_0, timeperiod=14)/ta_adx(high_0, low_0, close_0, timeperiod=28)-1 ta_roc(close_0, timeperiod=14)/10 (ta_kdj_k(high_0, low_0, close_0, 12, 3)/ta_kdj_d(high_0, low_0, close_0, 12, 3,3)-1)*2 ta_bias(close_0, timeperiod=28)*10 volatility_5_0/volatility_60_0-1 (close_0/open_4-1)*5 avg_mf_net_amount_4/avg_amount_4*5 mf_net_pct_main_0*5
    0 53222984.0 73392976.0 -9057139.00 1.442490 2.178047 37.819042 2017-05-22 39.466816 002425.SZA 37.765888 ... NaN NaN NaN NaN NaN NaN -0.136811 -0.086326 -0.617030 -0.3050
    1 73127320.0 72582416.0 -14969738.00 2.078617 2.066250 36.304153 2017-05-23 37.898773 002425.SZA 35.506844 ... NaN NaN NaN NaN NaN NaN 0.131016 -0.305842 -1.031223 -0.0105
    2 98551248.0 75055576.0 -15820568.00 2.925663 2.084204 34.948730 2017-05-24 36.224422 002425.SZA 34.656384 ... NaN NaN NaN NaN NaN NaN -0.154168 -0.499658 -1.053924 -0.3795
    3 70397784.0 71569024.0 -8935081.00 1.341838 1.891072 35.666306 2017-05-25 35.878922 002425.SZA 34.815845 ... NaN NaN NaN NaN NaN NaN 0.186588 -0.453930 -0.624228 0.3180
    4 102040712.0 79468008.0 -3681509.25 1.869540 1.931630 37.686157 2017-05-26 38.084812 002425.SZA 35.506844 ... NaN NaN NaN NaN NaN NaN 0.898535 -0.143836 -0.231635 0.4860

    5 rows × 36 columns

      {"Description":"实验创建于2017/11/15","Summary":"","Graph":{"EdgesInternal":[{"DestinationInputPortId":"-281:options_data","SourceOutputPortId":"-214:data_1"},{"DestinationInputPortId":"-403:inputs","SourceOutputPortId":"-210:data"},{"DestinationInputPortId":"-293:inputs","SourceOutputPortId":"-210:data"},{"DestinationInputPortId":"-14834:inputs","SourceOutputPortId":"-218:data"},{"DestinationInputPortId":"-692:input_data","SourceOutputPortId":"-316:data"},{"DestinationInputPortId":"-294:input_1","SourceOutputPortId":"-320:data"},{"DestinationInputPortId":"-332:trained_model","SourceOutputPortId":"-320:data"},{"DestinationInputPortId":"-214:input_1","SourceOutputPortId":"-332:data"},{"DestinationInputPortId":"-692:features","SourceOutputPortId":"-2295:data"},{"DestinationInputPortId":"-333:features","SourceOutputPortId":"-2295:data"},{"DestinationInputPortId":"-341:features","SourceOutputPortId":"-2295:data"},{"DestinationInputPortId":"-300:features","SourceOutputPortId":"-2295:data"},{"DestinationInputPortId":"-307:features","SourceOutputPortId":"-2295:data"},{"DestinationInputPortId":"-316:features","SourceOutputPortId":"-2295:data"},{"DestinationInputPortId":"-293:outputs","SourceOutputPortId":"-259:data"},{"DestinationInputPortId":"-14841:inputs","SourceOutputPortId":"-14806:data"},{"DestinationInputPortId":"-14806:inputs","SourceOutputPortId":"-14834:data"},{"DestinationInputPortId":"-259:inputs","SourceOutputPortId":"-14841:data"},{"DestinationInputPortId":"-408:inputs","SourceOutputPortId":"-403:data"},{"DestinationInputPortId":"-446:inputs","SourceOutputPortId":"-408:data"},{"DestinationInputPortId":"-218:inputs","SourceOutputPortId":"-446:data"},{"DestinationInputPortId":"-2296:input_data","SourceOutputPortId":"-2290:data"},{"DestinationInputPortId":"-333:input_data","SourceOutputPortId":"-2296:data"},{"DestinationInputPortId":"-289:instruments","SourceOutputPortId":"-620:data"},{"DestinationInputPortId":"-300:instruments","SourceOutputPortId":"-620:data"}
,{"DestinationInputPortId":"-330:input_data","SourceOutputPortId":"-692:data"},{"DestinationInputPortId":"-320:training_data","SourceOutputPortId":"-333:data"},{"DestinationInputPortId":"-332:input_data","SourceOutputPortId":"-341:data"},{"DestinationInputPortId":"-214:input_2","SourceOutputPortId":"-341:data"},{"DestinationInputPortId":"-2290:data1","SourceOutputPortId":"-289:data"},{"DestinationInputPortId":"-307:input_data","SourceOutputPortId":"-300:data"},{"DestinationInputPortId":"-2290:data2","SourceOutputPortId":"-307:data"},{"DestinationInputPortId":"-316:instruments","SourceOutputPortId":"-322:data"},{"DestinationInputPortId":"-281:instruments","SourceOutputPortId":"-322:data"},{"DestinationInputPortId":"-341:input_data","SourceOutputPortId":"-330:data"},{"DestinationInputPortId":"-214:input_3","SourceOutputPortId":"-330:data"},{"DestinationInputPortId":"-320:input_model","SourceOutputPortId":"-293:data"}],"ModuleNodes":[{"Id":"-214","ModuleId":"BigQuantSpace.cached.cached-v3","ModuleParameters":[{"Name":"run","Value":"# Python 代码入口函数,input_1/2/3 对应三个输入端,data_1/2/3 对应三个输出端\ndef bigquant_run(input_1, input_2, input_3):\n\n test_data = input_2.read_pickle()\n pred_label = input_1.read_pickle()\n pred_result = pred_label.reshape(pred_label.shape[0]) \n dt = input_3.read_df()['date'][-1*len(pred_result):]\n pred_df = pd.Series(pred_result, index=dt)\n ds = DataSource.write_df(pred_df)\n \n return Outputs(data_1=ds)\n","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"post_run","Value":"# 后处理函数,可选。输入是主函数的输出,可以在这里对数据做处理,或者返回更友好的outputs数据格式。此函数输出不会被缓存。\ndef bigquant_run(outputs):\n return 
outputs\n","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"input_ports","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"params","Value":"{}","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"output_ports","Value":"","ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"input_1","NodeId":"-214"},{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"input_2","NodeId":"-214"},{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"input_3","NodeId":"-214"}],"OutputPortsInternal":[{"Name":"data_1","NodeId":"-214","OutputType":null},{"Name":"data_2","NodeId":"-214","OutputType":null},{"Name":"data_3","NodeId":"-214","OutputType":null}],"UsePreviousResults":true,"moduleIdForCode":2,"Comment":"模型预测结果输出","CommentCollapsed":false},{"Id":"-210","ModuleId":"BigQuantSpace.dl_layer_input.dl_layer_input-v1","ModuleParameters":[{"Name":"shape","Value":"50,19","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"batch_shape","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"dtype","Value":"float32","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"sparse","Value":"False","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"name","Value":"","ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"inputs","NodeId":"-210"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-210","OutputType":null}],"UsePreviousResults":false,"moduleIdForCode":3,"Comment":"","CommentCollapsed":true},{"Id":"-218","ModuleId":"BigQuantSpace.dl_layer_lstm.dl_layer_lstm-v1","ModuleParameters":[{"Name":"units","Value":"32","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"activation","Value":"tanh","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_activation",
"Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"recurrent_activation","Value":"hard_sigmoid","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_recurrent_activation","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"use_bias","Value":"True","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"kernel_initializer","Value":"glorot_uniform","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_kernel_initializer","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"recurrent_initializer","Value":"Orthogonal","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_recurrent_initializer","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"bias_initializer","Value":"Ones","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_bias_initializer","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"unit_forget_bias","Value":"True","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"kernel_regularizer","Value":"None","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"kernel_regularizer_l1","Value":"0","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"kernel_regularizer_l2","Value":"0","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_kernel_regularizer","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"recurrent_regularizer","Value":"None","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"recurrent_regularizer_l1","Value":0,"ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"recurrent_regularizer_l2","Value":0,"ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_recurrent_regularizer","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"bias_regularizer","Value":"None","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"bias_regularizer_l1","Value":0,"ValueType":"Literal","LinkedGlobalParamete
r":null},{"Name":"bias_regularizer_l2","Value":0,"ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_bias_regularizer","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"activity_regularizer","Value":"L1L2","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"activity_regularizer_l1","Value":"0.003","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"activity_regularizer_l2","Value":"0.003","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_activity_regularizer","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"kernel_constraint","Value":"None","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_kernel_constraint","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"recurrent_constraint","Value":"None","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_recurrent_constraint","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"bias_constraint","Value":"None","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_bias_constraint","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"dropout","Value":"0.1","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"recurrent_dropout","Value":"0.1","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"return_sequences","Value":"False","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"implementation","Value":"2","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"name","Value":"","ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"inputs","NodeId":"-218"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-218","OutputType":null}],"UsePreviousResults":false,"moduleIdForCode":4,"Comment":"","CommentCollapsed":true},{"Id":"-316","ModuleId":"BigQuantSpace.general_feature_extractor.general_feature_extractor-v7","ModulePa
rameters":[{"Name":"start_date","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"end_date","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"before_start_days","Value":"90","ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"instruments","NodeId":"-316"},{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"features","NodeId":"-316"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-316","OutputType":null}],"UsePreviousResults":true,"moduleIdForCode":16,"Comment":"","CommentCollapsed":true},{"Id":"-320","ModuleId":"BigQuantSpace.dl_model_train.dl_model_train-v1","ModuleParameters":[{"Name":"optimizer","Value":"Adam","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_optimizer","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"loss","Value":"binary_crossentropy","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_loss","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"metrics","Value":"accuracy","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"batch_size","Value":"1600","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"epochs","Value":"1","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"n_gpus","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"verbose","Value":"2:每个epoch输出一行记录","ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"input_model","NodeId":"-320"},{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"training_data","NodeId":"-320"},{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"validation_data","NodeId":"-320"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-320","OutputType":null}],"UsePreviousResults":true,"moduleI
dForCode":6,"Comment":"","CommentCollapsed":true},{"Id":"-332","ModuleId":"BigQuantSpace.dl_model_predict.dl_model_predict-v1","ModuleParameters":[{"Name":"batch_size","Value":"10240","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"n_gpus","Value":"0","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"verbose","Value":"0:不显示","ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"trained_model","NodeId":"-332"},{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"input_data","NodeId":"-332"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-332","OutputType":null}],"UsePreviousResults":true,"moduleIdForCode":7,"Comment":"","CommentCollapsed":true},{"Id":"-2295","ModuleId":"BigQuantSpace.input_features.input_features-v1","ModuleParameters":[{"Name":"features","Value":"(close_0/open_0-1)*10\n(volume_0/volume_1-1)\navg_turn_0/avg_turn_4-1\navg_amount_0/avg_amount_4-1\n((high_0/open_0)-(close_0/low_0))*50\n(ta_ma(close_0, timeperiod=5)/ta_ma(close_0, timeperiod=30)-1)*10\nta_rsi(close_0, timeperiod=14)/50-1\nta_mom(close_0, timeperiod=14)/5\nta_adx(high_0, low_0, close_0, timeperiod=14)/ta_adx(high_0, low_0, close_0, timeperiod=28)-1\nta_roc(close_0, timeperiod=14)/10\n(ta_kdj_k(high_0, low_0, close_0, 12, 3)/ta_kdj_d(high_0, low_0, close_0, 12, 3,3)-1)*2\nta_bias(close_0, 
timeperiod=28)*10\nvolatility_5_0/volatility_60_0-1\n(close_0/open_4-1)*5\nta_macd_macd_12_26_9_0\nta_macd_macdhist_12_26_9_0\nta_macd_macdsignal_12_26_9_0\navg_mf_net_amount_4/avg_amount_4*5\nmf_net_pct_main_0*5\n","ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"features_ds","NodeId":"-2295"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-2295","OutputType":null}],"UsePreviousResults":true,"moduleIdForCode":8,"Comment":"","CommentCollapsed":true},{"Id":"-259","ModuleId":"BigQuantSpace.dl_layer_dense.dl_layer_dense-v1","ModuleParameters":[{"Name":"units","Value":"1","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"activation","Value":"sigmoid","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_activation","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"use_bias","Value":"True","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"kernel_initializer","Value":"glorot_uniform","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_kernel_initializer","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"bias_initializer","Value":"Zeros","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_bias_initializer","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"kernel_regularizer","Value":"None","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"kernel_regularizer_l1","Value":0,"ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"kernel_regularizer_l2","Value":0,"ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_kernel_regularizer","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"bias_regularizer","Value":"L1L2","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"bias_regularizer_l1","Value":"0.01","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"bias_regularizer_l2","Value":
"0.01","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_bias_regularizer","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"activity_regularizer","Value":"None","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"activity_regularizer_l1","Value":"0","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"activity_regularizer_l2","Value":"0","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_activity_regularizer","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"kernel_constraint","Value":"None","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_kernel_constraint","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"bias_constraint","Value":"None","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_bias_constraint","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"name","Value":"","ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"inputs","NodeId":"-259"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-259","OutputType":null}],"UsePreviousResults":false,"moduleIdForCode":9,"Comment":"","CommentCollapsed":true},{"Id":"-14806","ModuleId":"BigQuantSpace.dl_layer_dense.dl_layer_dense-v1","ModuleParameters":[{"Name":"units","Value":"32","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"activation","Value":"tanh","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_activation","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"use_bias","Value":"True","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"kernel_initializer","Value":"glorot_uniform","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_kernel_initializer","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"bias_initializer","Value":"Zeros","ValueType":"Literal","Linke
dGlobalParameter":null},{"Name":"user_bias_initializer","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"kernel_regularizer","Value":"None","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"kernel_regularizer_l1","Value":0,"ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"kernel_regularizer_l2","Value":0,"ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_kernel_regularizer","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"bias_regularizer","Value":"None","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"bias_regularizer_l1","Value":0,"ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"bias_regularizer_l2","Value":0,"ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_bias_regularizer","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"activity_regularizer","Value":"L1L2","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"activity_regularizer_l1","Value":"0.003","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"activity_regularizer_l2","Value":"0.003","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_activity_regularizer","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"kernel_constraint","Value":"None","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_kernel_constraint","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"bias_constraint","Value":"None","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_bias_constraint","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"name","Value":"","ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"inputs","NodeId":"-14806"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-14806","OutputType":null}],"UsePreviousResults":false,"moduleIdForCode":10,"Comment":"","Comme
ntCollapsed":true},{"Id":"-14834","ModuleId":"BigQuantSpace.dl_layer_dropout.dl_layer_dropout-v1","ModuleParameters":[{"Name":"rate","Value":"0.1","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"noise_shape","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"seed","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"name","Value":"","ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"inputs","NodeId":"-14834"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-14834","OutputType":null}],"UsePreviousResults":false,"moduleIdForCode":11,"Comment":"","CommentCollapsed":true},{"Id":"-14841","ModuleId":"BigQuantSpace.dl_layer_dropout.dl_layer_dropout-v1","ModuleParameters":[{"Name":"rate","Value":"0.1","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"noise_shape","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"seed","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"name","Value":"","ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"inputs","NodeId":"-14841"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-14841","OutputType":null}],"UsePreviousResults":false,"moduleIdForCode":12,"Comment":"","CommentCollapsed":true},{"Id":"-403","ModuleId":"BigQuantSpace.dl_layer_reshape.dl_layer_reshape-v1","ModuleParameters":[{"Name":"target_shape","Value":"50,19,1","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"name","Value":"","ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"inputs","NodeId":"-403"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-403","OutputType":null}],"UsePreviousResults":false,"moduleIdForCode":13,"Comment":"","CommentCollapsed":true},{"Id":"-
408","ModuleId":"BigQuantSpace.dl_layer_conv2d.dl_layer_conv2d-v1","ModuleParameters":[{"Name":"filters","Value":"32","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"kernel_size","Value":"3,5","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"strides","Value":"1,1","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"padding","Value":"valid","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"data_format","Value":"channels_last","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"dilation_rate","Value":"1,1","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"activation","Value":"relu","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_activation","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"use_bias","Value":"True","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"kernel_initializer","Value":"glorot_uniform","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_kernel_initializer","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"bias_initializer","Value":"Zeros","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_bias_initializer","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"kernel_regularizer","Value":"None","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"kernel_regularizer_l1","Value":0,"ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"kernel_regularizer_l2","Value":0,"ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_kernel_regularizer","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"bias_regularizer","Value":"None","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"bias_regularizer_l1","Value":0,"ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"bias_regularizer_l2","Value":0,"ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_bias_regularizer","Value":"","ValueType":"Literal","Lin
kedGlobalParameter":null},{"Name":"activity_regularizer","Value":"None","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"activity_regularizer_l1","Value":0,"ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"activity_regularizer_l2","Value":0,"ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_activity_regularizer","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"kernel_constraint","Value":"None","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_kernel_constraint","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"bias_constraint","Value":"None","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_bias_constraint","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"name","Value":"","ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"inputs","NodeId":"-408"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-408","OutputType":null}],"UsePreviousResults":false,"moduleIdForCode":14,"Comment":"","CommentCollapsed":true},{"Id":"-446","ModuleId":"BigQuantSpace.dl_layer_reshape.dl_layer_reshape-v1","ModuleParameters":[{"Name":"target_shape","Value":"720,32","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"name","Value":"","ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"inputs","NodeId":"-446"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-446","OutputType":null}],"UsePreviousResults":false,"moduleIdForCode":15,"Comment":"","CommentCollapsed":true},{"Id":"-2290","ModuleId":"BigQuantSpace.join.join-v3","ModuleParameters":[{"Name":"on","Value":"date","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"how","Value":"inner","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"sort","Value":"True","ValueType":"Literal","Lin
kedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"data1","NodeId":"-2290"},{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"data2","NodeId":"-2290"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-2290","OutputType":null}],"UsePreviousResults":true,"moduleIdForCode":17,"Comment":"标注特征连接","CommentCollapsed":false},{"Id":"-2296","ModuleId":"BigQuantSpace.dropnan.dropnan-v1","ModuleParameters":[],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"input_data","NodeId":"-2296"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-2296","OutputType":null}],"UsePreviousResults":true,"moduleIdForCode":18,"Comment":"去掉为nan的数据","CommentCollapsed":true},{"Id":"-620","ModuleId":"BigQuantSpace.instruments.instruments-v2","ModuleParameters":[{"Name":"start_date","Value":"2012-07-24","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"end_date","Value":"2017-07-24","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"market","Value":"CN_STOCK_A","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"instrument_list","Value":"002425.SZA\n600699.SHA\n601390.SHA","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"max_count","Value":0,"ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"rolling_conf","NodeId":"-620"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-620","OutputType":null}],"UsePreviousResults":true,"moduleIdForCode":24,"Comment":"","CommentCollapsed":true},{"Id":"-692","ModuleId":"BigQuantSpace.derived_feature_extractor.derived_feature_extractor-v3","ModuleParameters":[{"Name":"date_col","Value":"date","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"instrument_col","Value":"instrument","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"drop_na","Value":"False","ValueType
":"Literal","LinkedGlobalParameter":null},{"Name":"remove_extra_columns","Value":"False","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_functions","Value":"{}","ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"input_data","NodeId":"-692"},{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"features","NodeId":"-692"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-692","OutputType":null}],"UsePreviousResults":true,"moduleIdForCode":26,"Comment":"","CommentCollapsed":true},{"Id":"-281","ModuleId":"BigQuantSpace.trade.trade-v4","ModuleParameters":[{"Name":"start_date","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"end_date","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"initialize","Value":"# 回测引擎:初始化函数,只执行一次\ndef bigquant_run(context):\n # 加载预测数据\n context.prediction = context.options['data'].read_df()\n\n # 系统已经设置了默认的交易手续费和滑点,要修改手续费可使用如下函数\n context.set_commission(PerOrder(buy_cost=0.0003, sell_cost=0.0013, min_cost=5))","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"handle_data","Value":"# 回测引擎:每日数据处理函数,每天执行一次\ndef bigquant_run(context, data):\n # 按日期过滤得到今日的预测数据\n try:\n prediction = context.prediction[data.current_dt.strftime('%Y-%m-%d')]\n except KeyError as e:\n return\n \n instrument = context.instruments[0]\n sid = context.symbol(instrument)\n cur_position = context.portfolio.positions[sid].amount\n \n # 交易逻辑\n if prediction > 0.9 and cur_position == 0:\n context.order_target_percent(context.symbol(instrument), 1)\n print(data.current_dt, '买入!')\n \n elif prediction < 0.5 and cur_position > 0:\n context.order_target_percent(context.symbol(instrument), 0)\n print(data.current_dt, '卖出!')\n ","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"prepare","Value":"# 回测引擎:准备数据,只执行一次\ndef bigquant_run(context):\n 
pass\n","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"before_trading_start","Value":"# 回测引擎:每个单位时间开始前调用一次,即每日开盘前调用一次。\ndef bigquant_run(context, data):\n pass\n","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"volume_limit","Value":0.025,"ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"order_price_field_buy","Value":"open","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"order_price_field_sell","Value":"close","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"capital_base","Value":1000000,"ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"auto_cancel_non_tradable_orders","Value":"True","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"data_frequency","Value":"daily","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"price_type","Value":"真实价格","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"product_type","Value":"股票","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"plot_charts","Value":"True","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"backtest_only","Value":"False","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"benchmark","Value":"000001.SHA","ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"instruments","NodeId":"-281"},{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"options_data","NodeId":"-281"},{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"history_ds","NodeId":"-281"},{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"benchmark_ds","NodeId":"-281"},{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"trading_calendar","NodeId":"-281"}],"OutputPortsInternal":[{"Name":"raw_perf","NodeId":"-281","OutputType":null}],"UsePreviousResults":false,"moduleIdForCode":1,"Comment":"","CommentCollapsed":true},{"Id":"-333","Mo
duleId":"BigQuantSpace.dl_convert_to_bin.dl_convert_to_bin-v2","ModuleParameters":[{"Name":"window_size","Value":"50","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"feature_clip","Value":5,"ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"flatten","Value":"False","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"window_along_col","Value":"","ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"input_data","NodeId":"-333"},{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"features","NodeId":"-333"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-333","OutputType":null}],"UsePreviousResults":true,"moduleIdForCode":25,"Comment":"","CommentCollapsed":true},{"Id":"-341","ModuleId":"BigQuantSpace.dl_convert_to_bin.dl_convert_to_bin-v2","ModuleParameters":[{"Name":"window_size","Value":"50","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"feature_clip","Value":5,"ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"flatten","Value":"False","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"window_along_col","Value":"","ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"input_data","NodeId":"-341"},{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"features","NodeId":"-341"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-341","OutputType":null}],"UsePreviousResults":true,"moduleIdForCode":27,"Comment":"","CommentCollapsed":true},{"Id":"-289","ModuleId":"BigQuantSpace.advanced_auto_labeler.advanced_auto_labeler-v2","ModuleParameters":[{"Name":"label_expr","Value":"# #号开始的表示注释\n# 0. 每行一个,顺序执行,从第二个开始,可以使用label字段\n# 1. 可用数据字段见 https://bigquant.com/docs/develop/datasource/deprecated/history_data.html\n# 添加benchmark_前缀,可使用对应的benchmark数据\n# 2. 
可用操作符和函数见 `表达式引擎 <https://bigquant.com/docs/develop/bigexpr/usage.html>`_\n\n# 计算收益:5日收盘价(作为卖出价格)除以明日开盘价(作为买入价格)\nwhere((shift(close,-5)/open>1)&(mean(close,-5)/open>1)&(close/open>1),1,0)\n# 过滤掉一字涨停的情况 (设置label为NaN,在后续处理和训练中会忽略NaN的label)\n\n","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"start_date","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"end_date","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"benchmark","Value":"000300.SHA","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"drop_na_label","Value":"True","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"cast_label_int","Value":"True","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_functions","Value":"{}","ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"instruments","NodeId":"-289"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-289","OutputType":null}],"UsePreviousResults":true,"moduleIdForCode":21,"Comment":"","CommentCollapsed":true},{"Id":"-300","ModuleId":"BigQuantSpace.general_feature_extractor.general_feature_extractor-v7","ModuleParameters":[{"Name":"start_date","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"end_date","Value":"","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"before_start_days","Value":90,"ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"instruments","NodeId":"-300"},{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"features","NodeId":"-300"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-300","OutputType":null}],"UsePreviousResults":true,"moduleIdForCode":22,"Comment":"","CommentCollapsed":true},{"Id":"-307","ModuleId":"BigQuantSpace.derived_feature_extractor.derived_feature_extractor-v3","ModuleParameters":[{"Na
me":"date_col","Value":"date","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"instrument_col","Value":"instrument","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"drop_na","Value":"False","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"remove_extra_columns","Value":"False","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"user_functions","Value":"{}","ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"input_data","NodeId":"-307"},{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"features","NodeId":"-307"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-307","OutputType":null}],"UsePreviousResults":true,"moduleIdForCode":23,"Comment":"","CommentCollapsed":true},{"Id":"-322","ModuleId":"BigQuantSpace.instruments.instruments-v2","ModuleParameters":[{"Name":"start_date","Value":"2017-08-18","ValueType":"Literal","LinkedGlobalParameter":"交易日期"},{"Name":"end_date","Value":"2020-01-06","ValueType":"Literal","LinkedGlobalParameter":"交易日期"},{"Name":"market","Value":"CN_STOCK_A","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"instrument_list","Value":"002425.SZA","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"max_count","Value":0,"ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"rolling_conf","NodeId":"-322"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-322","OutputType":null}],"UsePreviousResults":true,"moduleIdForCode":28,"Comment":"","CommentCollapsed":true},{"Id":"-330","ModuleId":"BigQuantSpace.dropnan.dropnan-v1","ModuleParameters":[],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"input_data","NodeId":"-330"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-330","OutputType":null}],"UsePreviousResults":true,"module
IdForCode":20,"Comment":"","CommentCollapsed":true},{"Id":"-293","ModuleId":"BigQuantSpace.dl_model_init.dl_model_init-v1","ModuleParameters":[],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"inputs","NodeId":"-293"},{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"outputs","NodeId":"-293"}],"OutputPortsInternal":[{"Name":"data","NodeId":"-293","OutputType":null}],"UsePreviousResults":false,"moduleIdForCode":5,"Comment":"","CommentCollapsed":true},{"Id":"-294","ModuleId":"BigQuantSpace.model_save.model_save-v1","ModuleParameters":[{"Name":"filedir","Value":"/home/bigquant/work/userlib/","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"filename","Value":"test01102","ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[{"DataSourceId":null,"TrainedModelId":null,"TransformModuleId":null,"Name":"input_1","NodeId":"-294"}],"OutputPortsInternal":[],"UsePreviousResults":true,"moduleIdForCode":19,"Comment":"","CommentCollapsed":true},{"Id":"-298","ModuleId":"BigQuantSpace.model_read.model_read-v1","ModuleParameters":[{"Name":"filedir","Value":"/home/bigquant/work/userlib/","ValueType":"Literal","LinkedGlobalParameter":null},{"Name":"filename","Value":"test01102","ValueType":"Literal","LinkedGlobalParameter":null}],"InputPortsInternal":[],"OutputPortsInternal":[{"Name":"data","NodeId":"-298","OutputType":null}],"UsePreviousResults":true,"moduleIdForCode":29,"Comment":"","CommentCollapsed":true}],"SerializedClientData":"<?xml version='1.0' encoding='utf-16'?><DataV1 xmlns:xsd='http://www.w3.org/2001/XMLSchema' xmlns:xsi='http://www.w3.org/2001/XMLSchema-instance'><Meta /><NodePositions><NodePosition Node='-214' Position='1012,714,200,200'/><NodePosition Node='-210' Position='280,-273,200,200'/><NodePosition Node='-218' Position='280,52,200,200'/><NodePosition Node='-316' Position='1234.9698486328125,-99.92964935302734,200,200'/><NodePosition Node='-320' 
Position='537,539,200,200'/><NodePosition Node='-332' Position='813,622,200,200'/><NodePosition Node='-2295' Position='1010,-259,200,200'/><NodePosition Node='-259' Position='281,387,200,200'/><NodePosition Node='-14806' Position='280,212,200,200'/><NodePosition Node='-14834' Position='281,135,200,200'/><NodePosition Node='-14841' Position='279,304,200,200'/><NodePosition Node='-403' Position='288,-190,200,200'/><NodePosition Node='-408' Position='283,-106,200,200'/><NodePosition Node='-446' Position='280,-23,200,200'/><NodePosition Node='-2290' Position='735,138,200,200'/><NodePosition Node='-2296' Position='734,242,200,200'/><NodePosition Node='-620' Position='721,-168,200,200'/><NodePosition Node='-692' Position='1251,-7,200,200'/><NodePosition Node='-281' Position='1246,851,200,200'/><NodePosition Node='-333' Position='753,329,200,200'/><NodePosition Node='-341' Position='1045,322,200,200'/><NodePosition Node='-289' Position='597,-13,200,200'/><NodePosition Node='-300' Position='860,-82,200,200'/><NodePosition Node='-307' Position='871,10,200,200'/><NodePosition Node='-322' Position='1245,-203,200,200'/><NodePosition Node='-330' Position='1250,84,200,200'/><NodePosition Node='-293' Position='358,462,200,200'/><NodePosition Node='-294' Position='470.05023193359375,648.8593139648438,200,200'/><NodePosition Node='-298' Position='798,455.9899597167969,200,200'/></NodePositions><NodeGroups /></DataV1>"},"IsDraft":true,"ParentExperimentId":null,"WebService":{"IsWebServiceExperiment":false,"Inputs":[],"Outputs":[],"Parameters":[{"Name":"交易日期","Value":"","ParameterDefinition":{"Name":"交易日期","FriendlyName":"交易日期","DefaultValue":"","ParameterType":"String","HasDefaultValue":true,"IsOptional":true,"ParameterRules":[],"HasRules":false,"MarkupType":0,"CredentialDescriptor":null}}],"WebServiceGroupId":null,"SerializedClientData":"<?xml version='1.0' encoding='utf-16'?><DataV1 xmlns:xsd='http://www.w3.org/2001/XMLSchema' 
xmlns:xsi='http://www.w3.org/2001/XMLSchema-instance'><Meta /><NodePositions></NodePositions><NodeGroups /></DataV1>"},"DisableNodesUpdate":false,"Category":"user","Tags":[],"IsPartialRun":true}
      In [11]:
      # This code was auto-generated by the visual strategy environment on 2020-01-11 17:24
      # This code cell can only be edited in visual mode. You can also copy the code into a new code cell or strategy and modify it there.
      
      
      # Python entry function: input_1/2/3 map to the three input ports, data_1/2/3 to the three output ports
      def m2_run_bigquant_run(input_1, input_2, input_3):
      
          test_data = input_2.read_pickle()
          pred_label = input_1.read_pickle()
          pred_result = pred_label.reshape(pred_label.shape[0]) 
          dt = input_3.read_df()['date'][-1*len(pred_result):]
          pred_df = pd.Series(pred_result, index=dt)
          ds = DataSource.write_df(pred_df)
          
          return Outputs(data_1=ds)
      
      # Optional post-processing function. Its input is the main function's output; you can transform the data here or return a friendlier outputs format. This function's output is not cached.
      def m2_post_run_bigquant_run(outputs):
          return outputs
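
A minimal sketch (plain numpy/pandas, not the BigQuant `DataSource` API, with made-up values) of what `m2_run_bigquant_run` does above: flatten an `(N, 1)` prediction array and index it by the last `N` dates of the prediction-set DataFrame.

```python
import numpy as np
import pandas as pd

# Stand-in for input_1.read_pickle(): an (N, 1) array of predicted probabilities.
pred_label = np.array([[0.28], [0.31], [0.55]], dtype='float32')
# Stand-in for input_3.read_df(): a DataFrame with a 'date' column.
df = pd.DataFrame({'date': pd.date_range('2019-12-30', periods=5)})

pred_result = pred_label.reshape(pred_label.shape[0])  # (3, 1) -> (3,)
dt = df['date'][-len(pred_result):]                    # last 3 dates
pred_df = pd.Series(pred_result, index=dt)             # daily prediction series
```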
      
      # Backtest engine: initialization function, executed only once
      def m1_initialize_bigquant_run(context):
          # Load the prediction data
          context.prediction = context.options['data'].read_df()
      
          # The system sets default commissions and slippage; to change the commission, use the function below
          context.set_commission(PerOrder(buy_cost=0.0003, sell_cost=0.0013, min_cost=5))
      # Backtest engine: daily data-handling function, executed once per day
      def m1_handle_data_bigquant_run(context, data):
          # Filter by date to get today's prediction
          try:
              prediction = context.prediction[data.current_dt.strftime('%Y-%m-%d')]
          except KeyError as e:
              return
          
          instrument = context.instruments[0]
          sid = context.symbol(instrument)
          cur_position = context.portfolio.positions[sid].amount
          
          # Trading logic
          if prediction > 0.9 and cur_position == 0:
              context.order_target_percent(context.symbol(instrument), 1)
              print(data.current_dt, 'buy!')
              
          elif prediction < 0.5 and cur_position > 0:
              context.order_target_percent(context.symbol(instrument), 0)
              print(data.current_dt, 'sell!')
          
      # Backtest engine: prepare data, executed only once
      def m1_prepare_bigquant_run(context):
          pass
      
      # Backtest engine: called once before each bar, i.e. once before each day's market open.
      def m1_before_trading_start_bigquant_run(context, data):
          pass
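
The position logic in `m1_handle_data_bigquant_run` above can be summarized as a small pure function (a sketch for clarity; note that the code buys only above 0.9, which is stricter than the 0.5 buy threshold stated in the write-up):

```python
# Sketch of the rule in m1_handle_data_bigquant_run: open a full position when
# the predicted up-probability exceeds 0.9 and we are flat; close it when the
# probability falls below 0.5 and we hold shares; otherwise do nothing.
def signal(prediction: float, cur_position: float) -> str:
    if prediction > 0.9 and cur_position == 0:
        return 'buy'
    if prediction < 0.5 and cur_position > 0:
        return 'sell'
    return 'hold'
```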
      
      
      m3 = M.dl_layer_input.v1(
          shape='50,19',
          batch_shape='',
          dtype='float32',
          sparse=False,
          name=''
      )
      
      m13 = M.dl_layer_reshape.v1(
          inputs=m3.data,
          target_shape='50,19,1',
          name=''
      )
      
      m14 = M.dl_layer_conv2d.v1(
          inputs=m13.data,
          filters=32,
          kernel_size='3,5',
          strides='1,1',
          padding='valid',
          data_format='channels_last',
          dilation_rate='1,1',
          activation='relu',
          use_bias=True,
          kernel_initializer='glorot_uniform',
          bias_initializer='Zeros',
          kernel_regularizer='None',
          kernel_regularizer_l1=0,
          kernel_regularizer_l2=0,
          bias_regularizer='None',
          bias_regularizer_l1=0,
          bias_regularizer_l2=0,
          activity_regularizer='None',
          activity_regularizer_l1=0,
          activity_regularizer_l2=0,
          kernel_constraint='None',
          bias_constraint='None',
          name=''
      )
      
      m15 = M.dl_layer_reshape.v1(
          inputs=m14.data,
          target_shape='720,32',
          name=''
      )
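
A quick sanity check of the two shapes above (plain arithmetic, independent of the platform): with the 50 x 19 x 1 input used in this notebook and a 3 x 5 kernel at stride 1 x 1 with valid padding, Conv2D outputs (50 - 3 + 1) x (19 - 5 + 1) x 32 = 48 x 15 x 32 feature maps, and 48 * 15 = 720 is exactly the first dimension of m15's target_shape.

```python
# Valid-padding Conv2D output size: (H - kH)//sH + 1 by (W - kW)//sW + 1.
rows = (50 - 3) // 1 + 1   # 48
cols = (19 - 5) // 1 + 1   # 15
timesteps = rows * cols    # 720, matching target_shape='720,32' above
```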
      
      m4 = M.dl_layer_lstm.v1(
          inputs=m15.data,
          units=32,
          activation='tanh',
          recurrent_activation='hard_sigmoid',
          use_bias=True,
          kernel_initializer='glorot_uniform',
          recurrent_initializer='Orthogonal',
          bias_initializer='Ones',
          unit_forget_bias=True,
          kernel_regularizer='None',
          kernel_regularizer_l1=0,
          kernel_regularizer_l2=0,
          recurrent_regularizer='None',
          recurrent_regularizer_l1=0,
          recurrent_regularizer_l2=0,
          bias_regularizer='None',
          bias_regularizer_l1=0,
          bias_regularizer_l2=0,
          activity_regularizer='L1L2',
          activity_regularizer_l1=0.003,
          activity_regularizer_l2=0.003,
          kernel_constraint='None',
          recurrent_constraint='None',
          bias_constraint='None',
          dropout=0.1,
          recurrent_dropout=0.1,
          return_sequences=False,
          implementation='2',
          name=''
      )
      
      m11 = M.dl_layer_dropout.v1(
          inputs=m4.data,
          rate=0.1,
          noise_shape='',
          name=''
      )
      
      m10 = M.dl_layer_dense.v1(
          inputs=m11.data,
          units=32,
          activation='tanh',
          use_bias=True,
          kernel_initializer='glorot_uniform',
          bias_initializer='Zeros',
          kernel_regularizer='None',
          kernel_regularizer_l1=0,
          kernel_regularizer_l2=0,
          bias_regularizer='None',
          bias_regularizer_l1=0,
          bias_regularizer_l2=0,
          activity_regularizer='L1L2',
          activity_regularizer_l1=0.003,
          activity_regularizer_l2=0.003,
          kernel_constraint='None',
          bias_constraint='None',
          name=''
      )
      
      m12 = M.dl_layer_dropout.v1(
          inputs=m10.data,
          rate=0.1,
          noise_shape='',
          name=''
      )
      
      m9 = M.dl_layer_dense.v1(
          inputs=m12.data,
          units=1,
          activation='sigmoid',
          use_bias=True,
          kernel_initializer='glorot_uniform',
          bias_initializer='Zeros',
          kernel_regularizer='None',
          kernel_regularizer_l1=0,
          kernel_regularizer_l2=0,
          bias_regularizer='L1L2',
          bias_regularizer_l1=0.01,
          bias_regularizer_l2=0.01,
          activity_regularizer='None',
          activity_regularizer_l1=0,
          activity_regularizer_l2=0,
          kernel_constraint='None',
          bias_constraint='None',
          name=''
      )
      
      m5 = M.dl_model_init.v1(
          inputs=m3.data,
          outputs=m9.data
      )
      
      m8 = M.input_features.v1(
          features="""(close_0/open_0-1)*10
      (volume_0/volume_1-1)
      avg_turn_0/avg_turn_4-1
      avg_amount_0/avg_amount_4-1
      ((high_0/open_0)-(close_0/low_0))*50
      (ta_ma(close_0, timeperiod=5)/ta_ma(close_0, timeperiod=30)-1)*10
      ta_rsi(close_0, timeperiod=14)/50-1
      ta_mom(close_0, timeperiod=14)/5
      ta_adx(high_0, low_0, close_0, timeperiod=14)/ta_adx(high_0, low_0, close_0, timeperiod=28)-1
      ta_roc(close_0, timeperiod=14)/10
      (ta_kdj_k(high_0, low_0, close_0, 12, 3)/ta_kdj_d(high_0, low_0, close_0, 12, 3,3)-1)*2
      ta_bias(close_0, timeperiod=28)*10
      volatility_5_0/volatility_60_0-1
      (close_0/open_4-1)*5
      ta_macd_macd_12_26_9_0
      ta_macd_macdhist_12_26_9_0
      ta_macd_macdsignal_12_26_9_0
      avg_mf_net_amount_4/avg_amount_4*5
      mf_net_pct_main_0*5
      """
      )
      
      m24 = M.instruments.v2(
          start_date='2012-07-24',
          end_date='2017-07-24',
          market='CN_STOCK_A',
          instrument_list="""002425.SZA
      600699.SHA
      601390.SHA""",
          max_count=0
      )
      
      m21 = M.advanced_auto_labeler.v2(
          instruments=m24.data,
          label_expr="""# Lines starting with # are comments
      # 0. One expression per line, executed in order; from the second one on, the label field can be used
      # 1. Available data fields: https://bigquant.com/docs/develop/datasource/deprecated/history_data.html
      #   Add the benchmark_ prefix to use the corresponding benchmark data
      # 2. Available operators and functions: `expression engine <https://bigquant.com/docs/develop/bigexpr/usage.html>`_
      
      # Label: 1 when the close 5 days ahead, the mean close over the next 5 days, and today's close all exceed today's open; otherwise 0
      where((shift(close,-5)/open>1)&(mean(close,-5)/open>1)&(close/open>1),1,0)
      # To filter out one-bar limit-up days, set the label to NaN; NaN labels are ignored in later processing and training
      
      """,
          start_date='',
          end_date='',
          benchmark='000300.SHA',
          drop_na_label=True,
          cast_label_int=True,
          user_functions={}
      )
      
      m22 = M.general_feature_extractor.v7(
          instruments=m24.data,
          features=m8.data,
          start_date='',
          end_date='',
          before_start_days=90
      )
      
      m23 = M.derived_feature_extractor.v3(
          input_data=m22.data,
          features=m8.data,
          date_col='date',
          instrument_col='instrument',
          drop_na=False,
          remove_extra_columns=False,
          user_functions={}
      )
      
      m17 = M.join.v3(
          data1=m21.data,
          data2=m23.data,
          on='date',
          how='inner',
          sort=True
      )
      
      m18 = M.dropnan.v1(
          input_data=m17.data
      )
      
      m25 = M.dl_convert_to_bin.v2(
          input_data=m18.data,
          features=m8.data,
          window_size=50,
          feature_clip=5,
          flatten=False,
          window_along_col=''
      )
      
      m6 = M.dl_model_train.v1(
          input_model=m5.data,
          training_data=m25.data,
          optimizer='Adam',
          loss='binary_crossentropy',
          metrics='accuracy',
          batch_size=1600,
          epochs=1,
          verbose='2:每个epoch输出一行记录'
      )
      
      m19 = M.model_save.v1(
          input_1=m6.data,
          filedir='/home/bigquant/work/userlib/',
          filename='test01102'
      )
      
      m28 = M.instruments.v2(
          start_date=T.live_run_param('trading_date', '2017-08-18'),
          end_date=T.live_run_param('trading_date', '2020-01-06'),
          market='CN_STOCK_A',
          instrument_list='002425.SZA',
          max_count=0
      )
      
      m16 = M.general_feature_extractor.v7(
          instruments=m28.data,
          features=m8.data,
          start_date='',
          end_date='',
          before_start_days=90
      )
      
      m26 = M.derived_feature_extractor.v3(
          input_data=m16.data,
          features=m8.data,
          date_col='date',
          instrument_col='instrument',
          drop_na=False,
          remove_extra_columns=False,
          user_functions={}
      )
      
      m20 = M.dropnan.v1(
          input_data=m26.data
      )
      
      m27 = M.dl_convert_to_bin.v2(
          input_data=m20.data,
          features=m8.data,
          window_size=50,
          feature_clip=5,
          flatten=False,
          window_along_col=''
      )
      
      m7 = M.dl_model_predict.v1(
          trained_model=m6.data,
          input_data=m27.data,
          batch_size=10240,
          n_gpus=0,
          verbose='0:不显示'
      )
      
      m2 = M.cached.v3(
          input_1=m7.data,
          input_2=m27.data,
          input_3=m20.data,
          run=m2_run_bigquant_run,
          post_run=m2_post_run_bigquant_run,
          input_ports='',
          params='{}',
          output_ports=''
      )
      
      m1 = M.trade.v4(
          instruments=m28.data,
          options_data=m2.data_1,
          start_date='',
          end_date='',
          initialize=m1_initialize_bigquant_run,
          handle_data=m1_handle_data_bigquant_run,
          prepare=m1_prepare_bigquant_run,
          before_trading_start=m1_before_trading_start_bigquant_run,
          volume_limit=0.025,
          order_price_field_buy='open',
          order_price_field_sell='close',
          capital_base=1000000,
          auto_cancel_non_tradable_orders=True,
          data_frequency='daily',
          price_type='真实价格',
          product_type='股票',
          plot_charts=True,
          backtest_only=False,
          benchmark='000001.SHA'
      )
      
      m29 = M.model_read.v1(
          filedir='/home/bigquant/work/userlib/',
          filename='test01102'
      )
      
      In [7]:
      m2.data_1.read()
      
      Out[7]:
      date
      2017-08-09    0.288473
      2017-08-10    0.289217
      2017-08-11    0.288273
      2017-08-14    0.287949
      2017-08-15    0.287881
      2017-08-16    0.288170
      2017-08-17    0.288116
      2017-08-18    0.288743
      2017-08-21    0.287100
      2017-08-22    0.287116
      2017-08-23    0.286825
      2017-08-24    0.287735
      2017-08-25    0.288107
      2017-08-28    0.287002
      2017-08-29    0.289703
      2017-08-30    0.288837
      2017-08-31    0.288039
      2017-09-01    0.286601
      2017-09-04    0.286633
      2017-09-05    0.287963
      2017-09-06    0.289466
      2017-09-07    0.289755
      2017-09-08    0.289210
      2017-09-11    0.288507
      2017-09-12    0.289125
      2017-09-13    0.288540
      2017-09-14    0.287339
      2017-09-15    0.288017
      2017-09-18    0.288662
      2017-09-19    0.290031
                      ...   
      2019-11-25    0.286490
      2019-11-26    0.285232
      2019-11-27    0.286565
      2019-11-28    0.284484
      2019-11-29    0.285476
      2019-12-02    0.285566
      2019-12-03    0.289433
      2019-12-04    0.289773
      2019-12-05    0.289474
      2019-12-06    0.290629
      2019-12-09    0.289382
      2019-12-10    0.290435
      2019-12-11    0.290023
      2019-12-12    0.290131
      2019-12-13    0.288661
      2019-12-16    0.288076
      2019-12-17    0.287481
      2019-12-18    0.287572
      2019-12-19    0.289250
      2019-12-20    0.289284
      2019-12-23    0.287748
      2019-12-24    0.286208
      2019-12-25    0.286309
      2019-12-26    0.287221
      2019-12-27    0.287820
      2019-12-30    0.286697
      2019-12-31    0.287002
      2020-01-02    0.284698
      2020-01-03    0.287234
      2020-01-06    0.290159
      Length: 572, dtype: float32
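
Note that every value in the series above sits near 0.287, well below the 0.9 buy threshold in `m1_handle_data_bigquant_run`, so this run would generate no buy orders at all. A quick way to check how many days clear each threshold (toy values standing in for the real series):

```python
import pandas as pd

# Toy stand-in for m2.data_1.read(): values near 0.287, like the output above.
pred = pd.Series([0.288, 0.287, 0.290, 0.285, 0.289])

buys = int((pred > 0.9).sum())   # days clearing the buy threshold
sells = int((pred < 0.5).sum())  # days below the sell threshold
```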

      (wygwsg) #3

      No — this time it appeared after only 1000 training rounds, though it has happened before after 5000 rounds. This run does use rather more stock data, around 5000-plus samples.


      (达达) #4

      The platform currently cleans up run-result caches automatically; your strategy ran for more than 13 hours, so the cache was deleted.