{"description":"实验创建于2017/8/26","graph":{"edges":[{"to_node_id":"-2228:input_1","from_node_id":"287d2cb0-f53c-4101-bdf8-104b137c8601-15:data"},{"to_node_id":"-274:features","from_node_id":"287d2cb0-f53c-4101-bdf8-104b137c8601-24:data"},{"to_node_id":"-281:features","from_node_id":"287d2cb0-f53c-4101-bdf8-104b137c8601-24:data"},{"to_node_id":"-7857:input_2","from_node_id":"287d2cb0-f53c-4101-bdf8-104b137c8601-24:data"},{"to_node_id":"-10445:input_2","from_node_id":"287d2cb0-f53c-4101-bdf8-104b137c8601-24:data"},{"to_node_id":"-7857:input_1","from_node_id":"287d2cb0-f53c-4101-bdf8-104b137c8601-53:data"},{"to_node_id":"287d2cb0-f53c-4101-bdf8-104b137c8601-53:data2","from_node_id":"287d2cb0-f53c-4101-bdf8-104b137c8601-84:data"},{"to_node_id":"-281:input_data","from_node_id":"-274:data"},{"to_node_id":"287d2cb0-f53c-4101-bdf8-104b137c8601-84:input_data","from_node_id":"-281:data"},{"to_node_id":"287d2cb0-f53c-4101-bdf8-104b137c8601-53:data1","from_node_id":"-2228:data"},{"to_node_id":"-10445:input_1","from_node_id":"-7857:data_1"},{"to_node_id":"-141:options_data","from_node_id":"-10445:data_1"},{"to_node_id":"287d2cb0-f53c-4101-bdf8-104b137c8601-15:instruments","from_node_id":"287d2cb0-f53c-4101-bdf8-104b137c8601-8:data"},{"to_node_id":"-274:instruments","from_node_id":"287d2cb0-f53c-4101-bdf8-104b137c8601-8:data"},{"to_node_id":"-141:instruments","from_node_id":"-14905:data"}],"nodes":[{"node_id":"287d2cb0-f53c-4101-bdf8-104b137c8601-15","module_id":"BigQuantSpace.advanced_auto_labeler.advanced_auto_labeler-v2","parameters":[{"name":"label_expr","value":"# #号开始的表示注释\n# 0. 每行一个,顺序执行,从第二个开始,可以使用label字段\n# 1. 可用数据字段见 https://bigquant.com/docs/data_history_data.html\n# 添加benchmark_前缀,可使用对应的benchmark数据\n# 2. 
可用操作符和函数见 `表达式引擎 <https://bigquant.com/docs/big_expr.html>`_\n\n# 计算收益:5日收盘价(作为卖出价格)除以明日开盘价(作为买入价格)\nshift(close, -5) / shift(open, -1)\n\n# 极值处理:用1%和99%分位的值做clip\nclip(label, all_quantile(label, 0.01), all_quantile(label, 0.99))\n\n# 将分数映射到分类,这里使用20个分类\n# all_wbins(label, 20)\n\n# 过滤掉一字涨停的情况 (设置label为NaN,在后续处理和训练中会忽略NaN的label)\nwhere(shift(high, -1) == shift(low, -1), NaN, label)\n","type":"Literal","bound_global_parameter":null},{"name":"start_date","value":"","type":"Literal","bound_global_parameter":null},{"name":"end_date","value":"","type":"Literal","bound_global_parameter":null},{"name":"benchmark","value":"000300.SHA","type":"Literal","bound_global_parameter":null},{"name":"drop_na_label","value":"True","type":"Literal","bound_global_parameter":null},{"name":"cast_label_int","value":"False","type":"Literal","bound_global_parameter":null},{"name":"user_functions","value":"","type":"Literal","bound_global_parameter":null}],"input_ports":[{"name":"instruments","node_id":"287d2cb0-f53c-4101-bdf8-104b137c8601-15"}],"output_ports":[{"name":"data","node_id":"287d2cb0-f53c-4101-bdf8-104b137c8601-15"}],"cacheable":true,"seq_num":2,"comment":"","comment_collapsed":true},{"node_id":"287d2cb0-f53c-4101-bdf8-104b137c8601-24","module_id":"BigQuantSpace.input_features.input_features-v1","parameters":[{"name":"features","value":"close_0\nopen_0\nhigh_0\nlow_0 \namount_0\nturn_0 \nreturn_0\n \nclose_1\nopen_1\nhigh_1\nlow_1\nreturn_1\namount_1\nturn_1\n \nclose_2\nopen_2\nhigh_2\nlow_2\namount_2\nturn_2\nreturn_2\n \nclose_3\nopen_3\nhigh_3\nlow_3\namount_3\nturn_3\nreturn_3\n \nclose_4\nopen_4\nhigh_4\nlow_4\namount_4\nturn_4\nreturn_4\n \nmean(close_0, 5)\nmean(low_0, 5)\nmean(open_0, 5)\nmean(high_0, 5)\nmean(turn_0, 5)\nmean(amount_0, 5)\nmean(return_0, 5)\n \nts_max(close_0, 5)\nts_max(low_0, 5)\nts_max(open_0, 5)\nts_max(high_0, 5)\nts_max(turn_0, 5)\nts_max(amount_0, 5)\nts_max(return_0, 5)\n \nts_min(close_0, 5)\nts_min(low_0, 5)\nts_min(open_0, 5)\nts_min(high_0, 
5)\nts_min(turn_0, 5)\nts_min(amount_0, 5)\nts_min(return_0, 5) \n \nstd(close_0, 5)\nstd(low_0, 5)\nstd(open_0, 5)\nstd(high_0, 5)\nstd(turn_0, 5)\nstd(amount_0, 5)\nstd(return_0, 5)\n \nts_rank(close_0, 5)\nts_rank(low_0, 5)\nts_rank(open_0, 5)\nts_rank(high_0, 5)\nts_rank(turn_0, 5)\nts_rank(amount_0, 5)\nts_rank(return_0, 5)\n \ndecay_linear(close_0, 5)\ndecay_linear(low_0, 5)\ndecay_linear(open_0, 5)\ndecay_linear(high_0, 5)\ndecay_linear(turn_0, 5)\ndecay_linear(amount_0, 5)\ndecay_linear(return_0, 5)\n \ncorrelation(volume_0, return_0, 5)\ncorrelation(volume_0, high_0, 5)\ncorrelation(volume_0, low_0, 5)\ncorrelation(volume_0, close_0, 5)\ncorrelation(volume_0, open_0, 5)\ncorrelation(volume_0, turn_0, 5)\n \ncorrelation(return_0, high_0, 5)\ncorrelation(return_0, low_0, 5)\ncorrelation(return_0, close_0, 5)\ncorrelation(return_0, open_0, 5)\ncorrelation(return_0, turn_0, 5)\n \ncorrelation(high_0, low_0, 5)\ncorrelation(high_0, close_0, 5)\ncorrelation(high_0, open_0, 5)\ncorrelation(high_0, turn_0, 5)\n \ncorrelation(low_0, close_0, 5)\ncorrelation(low_0, open_0, 5)\ncorrelation(low_0, turn_0, 5)\n \ncorrelation(close_0, open_0, 5)\ncorrelation(close_0, turn_0, 5)\n\ncorrelation(open_0, turn_0, 
5)","type":"Literal","bound_global_parameter":null}],"input_ports":[{"name":"features_ds","node_id":"287d2cb0-f53c-4101-bdf8-104b137c8601-24"}],"output_ports":[{"name":"data","node_id":"287d2cb0-f53c-4101-bdf8-104b137c8601-24"}],"cacheable":true,"seq_num":3,"comment":"","comment_collapsed":true},{"node_id":"287d2cb0-f53c-4101-bdf8-104b137c8601-53","module_id":"BigQuantSpace.join.join-v3","parameters":[{"name":"on","value":"date,instrument","type":"Literal","bound_global_parameter":null},{"name":"how","value":"inner","type":"Literal","bound_global_parameter":null},{"name":"sort","value":"True","type":"Literal","bound_global_parameter":null}],"input_ports":[{"name":"data1","node_id":"287d2cb0-f53c-4101-bdf8-104b137c8601-53"},{"name":"data2","node_id":"287d2cb0-f53c-4101-bdf8-104b137c8601-53"}],"output_ports":[{"name":"data","node_id":"287d2cb0-f53c-4101-bdf8-104b137c8601-53"}],"cacheable":true,"seq_num":7,"comment":"","comment_collapsed":true},{"node_id":"287d2cb0-f53c-4101-bdf8-104b137c8601-84","module_id":"BigQuantSpace.dropnan.dropnan-v1","parameters":[],"input_ports":[{"name":"input_data","node_id":"287d2cb0-f53c-4101-bdf8-104b137c8601-84"}],"output_ports":[{"name":"data","node_id":"287d2cb0-f53c-4101-bdf8-104b137c8601-84"}],"cacheable":true,"seq_num":13,"comment":"","comment_collapsed":true},{"node_id":"-274","module_id":"BigQuantSpace.general_feature_extractor.general_feature_extractor-v7","parameters":[{"name":"start_date","value":"","type":"Literal","bound_global_parameter":null},{"name":"end_date","value":"","type":"Literal","bound_global_parameter":null},{"name":"before_start_days","value":"7","type":"Literal","bound_global_parameter":null}],"input_ports":[{"name":"instruments","node_id":"-274"},{"name":"features","node_id":"-274"}],"output_ports":[{"name":"data","node_id":"-274"}],"cacheable":true,"seq_num":15,"comment":"","comment_collapsed":true},{"node_id":"-281","module_id":"BigQuantSpace.derived_feature_extractor.derived_feature_extractor-v3","paramete
rs":[{"name":"date_col","value":"date","type":"Literal","bound_global_parameter":null},{"name":"instrument_col","value":"instrument","type":"Literal","bound_global_parameter":null},{"name":"drop_na","value":"False","type":"Literal","bound_global_parameter":null},{"name":"remove_extra_columns","value":"False","type":"Literal","bound_global_parameter":null},{"name":"user_functions","value":"","type":"Literal","bound_global_parameter":null}],"input_ports":[{"name":"input_data","node_id":"-281"},{"name":"features","node_id":"-281"}],"output_ports":[{"name":"data","node_id":"-281"}],"cacheable":true,"seq_num":16,"comment":"","comment_collapsed":true},{"node_id":"-2228","module_id":"BigQuantSpace.standardlize.standardlize-v9","parameters":[{"name":"standard_func","value":"ZScoreNorm","type":"Literal","bound_global_parameter":null},{"name":"columns_input","value":"label","type":"Literal","bound_global_parameter":null}],"input_ports":[{"name":"input_1","node_id":"-2228"},{"name":"input_2","node_id":"-2228"}],"output_ports":[{"name":"data","node_id":"-2228"}],"cacheable":true,"seq_num":8,"comment":"","comment_collapsed":true},{"node_id":"-7857","module_id":"BigQuantSpace.cached.cached-v3","parameters":[{"name":"run","value":"# Python 代码入口函数,input_1/2/3 对应三个输入端,data_1/2/3 对应三个输出端\ndef zscore(x):\n m = x.mean()\n s = x.std(ddof=0)\n z = (x-m)/s\n return z\n\ndef normalization(df, features):\n \"\"\"截面标准化\"\"\"\n ins_df = df.copy()\n \n for fea in features:\n # if fea.startswith(\"std\") or fea.startswith(\"corr\") or fea.startswith(\"ts_rank\"):\n # continue\n\n ins_df[fea] = zscore(ins_df[fea].values)\n ins_df[fea] = np.clip(ins_df[fea], -5, 5)\n \n return ins_df\n\ndef bigquant_run(input_1, input_2, input_3):\n # 示例代码如下。在这里编写您的代码\n df = input_1.read()\n feature = input_2.read()\n df[\"year\"] = df[\"date\"].apply(lambda x: x.year)\n df.replace([np.inf, -np.inf], np.nan, inplace=True)\n \n groups = []\n for ins_, group in df.groupby(\"instrument\"):\n years = 
group[\"year\"].unique()\n if len(years) <= 3:\n continue\n\n groups.append(group)\n\n result = Parallel(n_jobs=-1)(delayed(normalization)(group, feature) for group in groups)\n df_feat = pd.concat(result, ignore_index=True)\n df_feat.dropna(inplace=True)\n \n data_1 = DataSource.write_df(df_feat)\n return Outputs(data_1=data_1, data_2=None, data_3=None)\n","type":"Literal","bound_global_parameter":null},{"name":"post_run","value":"# 后处理函数,可选。输入是主函数的输出,可以在这里对数据做处理,或者返回更友好的outputs数据格式。此函数输出不会被缓存。\ndef bigquant_run(outputs):\n return outputs\n","type":"Literal","bound_global_parameter":null},{"name":"input_ports","value":"","type":"Literal","bound_global_parameter":null},{"name":"params","value":"{}","type":"Literal","bound_global_parameter":null},{"name":"output_ports","value":"","type":"Literal","bound_global_parameter":null}],"input_ports":[{"name":"input_1","node_id":"-7857"},{"name":"input_2","node_id":"-7857"},{"name":"input_3","node_id":"-7857"}],"output_ports":[{"name":"data_1","node_id":"-7857"},{"name":"data_2","node_id":"-7857"},{"name":"data_3","node_id":"-7857"}],"cacheable":false,"seq_num":4,"comment":"数据标准化","comment_collapsed":false},{"node_id":"-10445","module_id":"BigQuantSpace.cached.cached-v3","parameters":[{"name":"run","value":"# Python 代码入口函数,input_1/2/3 对应三个输入端,data_1/2/3 对应三个输出端\n\nclass TimeseriesKfold(object):\n def __init__(self, n_split):\n self.n_split = n_split\n \n def split(self, df, train_year=3, val_year=1):\n years = sorted(df[\"year\"].unique())\n\n start_index = len(years) - self.n_split\n for i in range(self.n_split):\n train_years = years[start_index-train_year:start_index]\n val_years = [years[start_index]]\n print(train_years, val_years)\n start_index += 1\n\n train_index = df[df[\"year\"].isin(train_years)].index\n val_index = df[df[\"year\"].isin(val_years)].index\n yield train_index, val_index\n\ndef pearson_correlation(y_true, y_pred, axis=-1):\n y_true = y_true-tf.reduce_mean(y_true)\n y_pred = 
y_pred-tf.reduce_mean(y_pred)\n y_true = tf.linalg.l2_normalize(y_true, axis=axis)\n y_pred = tf.linalg.l2_normalize(y_pred, axis=axis)\n return tf.reduce_sum(y_true * y_pred, axis=axis)\n\ndef dnn_keras_model():\n model = keras.Sequential([\n keras.layers.Dense(256, activation=tf.nn.relu, input_shape=(98,)),\n keras.layers.Dropout(0.1),\n keras.layers.BatchNormalization(),\n keras.layers.Dense(128, activation=tf.nn.relu),\n keras.layers.Dropout(0.1),\n keras.layers.Dense(1, activation=\"linear\")\n ])\n optimizer = tf.keras.optimizers.Adam(learning_rate=1e-3, clipnorm=1)\n rmse = keras.metrics.RootMeanSquaredError(name=\"rmse\")\n model.compile(loss='mse',\n optimizer=optimizer,\n metrics=[\"mse\", rmse, pearson_correlation]) \n # model.summary()\n return model\n\ndef bigquant_run(input_1, input_2, input_3):\n # 示例代码如下。在这里编写您的代码\n df = input_1.read()\n feature = input_2.read()\n \n pred_data = []\n kfold = TimeseriesKfold(n_split=5)\n for train_index, val_index in kfold.split(df):\n train = df.loc[train_index]\n val = df.loc[val_index]\n \n x_train = train[feature].values\n y_train = train[\"label\"]\n x_val = val[feature].values\n y_val = val[\"label\"]\n model = dnn_keras_model()\n model.fit(x_train, y_train, validation_data=(x_val, y_val), batch_size=1024, epochs=10, verbose=2)\n \n y_pred = model.predict(x_val)\n val[\"pred_label\"] = y_pred.reshape(-1)\n pred_data.append(val[[\"date\", \"instrument\", \"pred_label\"]])\n \n result = pd.concat(pred_data)\n result.sort_values(['date','pred_label'], inplace=True, ascending=[True,False])\n data_1 = DataSource.write_df(result)\n return Outputs(data_1=data_1, data_2=None, data_3=None)\n","type":"Literal","bound_global_parameter":null},{"name":"post_run","value":"# 后处理函数,可选。输入是主函数的输出,可以在这里对数据做处理,或者返回更友好的outputs数据格式。此函数输出不会被缓存。\ndef bigquant_run(outputs):\n return 
outputs\n","type":"Literal","bound_global_parameter":null},{"name":"input_ports","value":"","type":"Literal","bound_global_parameter":null},{"name":"params","value":"{}","type":"Literal","bound_global_parameter":null},{"name":"output_ports","value":"","type":"Literal","bound_global_parameter":null}],"input_ports":[{"name":"input_1","node_id":"-10445"},{"name":"input_2","node_id":"-10445"},{"name":"input_3","node_id":"-10445"}],"output_ports":[{"name":"data_1","node_id":"-10445"},{"name":"data_2","node_id":"-10445"},{"name":"data_3","node_id":"-10445"}],"cacheable":true,"seq_num":5,"comment":"滚动训练DNN","comment_collapsed":false},{"node_id":"-141","module_id":"BigQuantSpace.trade.trade-v4","parameters":[{"name":"start_date","value":"","type":"Literal","bound_global_parameter":null},{"name":"end_date","value":"","type":"Literal","bound_global_parameter":null},{"name":"initialize","value":"# 回测引擎:初始化函数,只执行一次\ndef bigquant_run(context):\n # 加载预测数据\n context.ranker_prediction = context.options['data'].read_df()\n\n # 系统已经设置了默认的交易手续费和滑点,要修改手续费可使用如下函数\n context.set_commission(PerOrder(buy_cost=0.0003, sell_cost=0.0013, min_cost=5))\n # 预测数据,通过options传入进来,使用 read_df 函数,加载到内存 (DataFrame)\n # 设置买入的股票数量,这里买入预测股票列表排名靠前的5只\n stock_count = 50\n # 每只的股票的权重,如下的权重分配会使得靠前的股票分配多一点的资金,[0.339160, 0.213986, 0.169580, ..]\n context.stock_weights = T.norm([1 / math.log(i + 2) for i in range(0, stock_count)])\n # 设置每只股票占用的最大资金比例\n context.max_cash_per_instrument = 0.2\n context.options['hold_days'] = 5\n","type":"Literal","bound_global_parameter":null},{"name":"handle_data","value":"# 回测引擎:每日数据处理函数,每天执行一次\ndef bigquant_run(context, data):\n # 按日期过滤得到今日的预测数据\n ranker_prediction = context.ranker_prediction[\n context.ranker_prediction.date == data.current_dt.strftime('%Y-%m-%d')]\n\n # 1. 
资金分配\n # 平均持仓时间是hold_days,每日都将买入股票,每日预期使用 1/hold_days 的资金\n # 实际操作中,会存在一定的买入误差,所以在前hold_days天,等量使用资金;之后,尽量使用剩余资金(这里设置最多用等量的1.5倍)\n is_staging = context.trading_day_index < context.options['hold_days'] # 是否在建仓期间(前 hold_days 天)\n cash_avg = context.portfolio.portfolio_value / context.options['hold_days']\n cash_for_buy = min(context.portfolio.cash, (1 if is_staging else 1.5) * cash_avg)\n cash_for_sell = cash_avg - (context.portfolio.cash - cash_for_buy)\n positions = {e.symbol: p.amount * p.last_sale_price\n for e, p in context.perf_tracker.position_tracker.positions.items()}\n\n # 2. 生成卖出订单:hold_days天之后才开始卖出;对持仓的股票,按机器学习算法预测的排序末位淘汰\n if not is_staging and cash_for_sell > 0:\n equities = {e.symbol: e for e, p in context.perf_tracker.position_tracker.positions.items()}\n instruments = list(reversed(list(ranker_prediction.instrument[ranker_prediction.instrument.apply(\n lambda x: x in equities and not context.has_unfinished_sell_order(equities[x]))])))\n # print('rank order for sell %s' % instruments)\n for instrument in instruments:\n context.order_target(context.symbol(instrument), 0)\n cash_for_sell -= positions[instrument]\n if cash_for_sell <= 0:\n break\n\n # 3. 
生成买入订单:按机器学习算法预测的排序,买入前面的stock_count只股票\n buy_cash_weights = context.stock_weights\n buy_instruments = list(ranker_prediction.instrument[:len(buy_cash_weights)])\n max_cash_per_instrument = context.portfolio.portfolio_value * context.max_cash_per_instrument\n for i, instrument in enumerate(buy_instruments):\n cash = cash_for_buy * buy_cash_weights[i]\n if cash > max_cash_per_instrument - positions.get(instrument, 0):\n # 确保股票持仓量不会超过每次股票最大的占用资金量\n cash = max_cash_per_instrument - positions.get(instrument, 0)\n if cash > 0:\n context.order_value(context.symbol(instrument), cash)\n","type":"Literal","bound_global_parameter":null},{"name":"prepare","value":"# 回测引擎:准备数据,只执行一次\ndef bigquant_run(context):\n pass\n","type":"Literal","bound_global_parameter":null},{"name":"before_trading_start","value":"","type":"Literal","bound_global_parameter":null},{"name":"volume_limit","value":0.025,"type":"Literal","bound_global_parameter":null},{"name":"order_price_field_buy","value":"open","type":"Literal","bound_global_parameter":null},{"name":"order_price_field_sell","value":"close","type":"Literal","bound_global_parameter":null},{"name":"capital_base","value":1000000,"type":"Literal","bound_global_parameter":null},{"name":"auto_cancel_non_tradable_orders","value":"True","type":"Literal","bound_global_parameter":null},{"name":"data_frequency","value":"daily","type":"Literal","bound_global_parameter":null},{"name":"price_type","value":"后复权","type":"Literal","bound_global_parameter":null},{"name":"product_type","value":"股票","type":"Literal","bound_global_parameter":null},{"name":"plot_charts","value":"True","type":"Literal","bound_global_parameter":null},{"name":"backtest_only","value":"False","type":"Literal","bound_global_parameter":null},{"name":"benchmark","value":"000300.SHA","type":"Literal","bound_global_parameter":null}],"input_ports":[{"name":"instruments","node_id":"-141"},{"name":"options_data","node_id":"-141"},{"name":"history_ds","node_id":"-141"},{"name":"benchmark_ds
","node_id":"-141"},{"name":"trading_calendar","node_id":"-141"}],"output_ports":[{"name":"raw_perf","node_id":"-141"}],"cacheable":false,"seq_num":6,"comment":"","comment_collapsed":true},{"node_id":"287d2cb0-f53c-4101-bdf8-104b137c8601-8","module_id":"BigQuantSpace.instruments.instruments-v2","parameters":[{"name":"start_date","value":"2011-01-01","type":"Literal","bound_global_parameter":null},{"name":"end_date","value":"2021-12-31","type":"Literal","bound_global_parameter":null},{"name":"market","value":"CN_STOCK_A","type":"Literal","bound_global_parameter":null},{"name":"instrument_list","value":"","type":"Literal","bound_global_parameter":null},{"name":"max_count","value":"0","type":"Literal","bound_global_parameter":null}],"input_ports":[{"name":"rolling_conf","node_id":"287d2cb0-f53c-4101-bdf8-104b137c8601-8"}],"output_ports":[{"name":"data","node_id":"287d2cb0-f53c-4101-bdf8-104b137c8601-8"}],"cacheable":true,"seq_num":1,"comment":"","comment_collapsed":true},{"node_id":"-14905","module_id":"BigQuantSpace.instruments.instruments-v2","parameters":[{"name":"start_date","value":"2017-01-01","type":"Literal","bound_global_parameter":null},{"name":"end_date","value":"2021-12-31","type":"Literal","bound_global_parameter":null},{"name":"market","value":"CN_STOCK_A","type":"Literal","bound_global_parameter":null},{"name":"instrument_list","value":"","type":"Literal","bound_global_parameter":null},{"name":"max_count","value":"0","type":"Literal","bound_global_parameter":null}],"input_ports":[{"name":"rolling_conf","node_id":"-14905"}],"output_ports":[{"name":"data","node_id":"-14905"}],"cacheable":true,"seq_num":10,"comment":"","comment_collapsed":true}],"node_layout":"<node_postions><node_position Node='287d2cb0-f53c-4101-bdf8-104b137c8601-15' Position='37,227,200,200'/><node_position Node='287d2cb0-f53c-4101-bdf8-104b137c8601-24' Position='616,78,200,200'/><node_position Node='287d2cb0-f53c-4101-bdf8-104b137c8601-53' Position='186,463,200,200'/><node_position 
Node='287d2cb0-f53c-4101-bdf8-104b137c8601-84' Position='371,344,200,200'/><node_position Node='-274' Position='367,199,200,200'/><node_position Node='-281' Position='371,273,200,200'/><node_position Node='-2228' Position='40,324,200,200'/><node_position Node='-7857' Position='187,548,200,200'/><node_position Node='-10445' Position='187,663,200,200'/><node_position Node='-141' Position='112,796,200,200'/><node_position Node='287d2cb0-f53c-4101-bdf8-104b137c8601-8' Position='132,29,200,200'/><node_position Node='-14905' Position='967,79,200,200'/></node_postions>"},"nodes_readonly":false,"studio_version":"v2"}
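The "数据标准化" (data standardization) node in the graph above z-scores each feature per instrument and clips the result to ±5 before training. A minimal NumPy sketch of that transform (function names here are illustrative, not the platform API):

```python
import numpy as np

def zscore(x: np.ndarray) -> np.ndarray:
    # Population z-score (ddof=0), matching the node's implementation
    return (x - x.mean()) / x.std(ddof=0)

def normalize_features(values: np.ndarray, clip_at: float = 5.0) -> np.ndarray:
    # Standardize, then winsorize extreme scores into [-clip_at, clip_at]
    return np.clip(zscore(values), -clip_at, clip_at)

sample = np.array([1.0, 2.0, 3.0, 4.0, 100.0])
out = normalize_features(sample)
```

In the experiment this runs per instrument group (via `df.groupby("instrument")` and joblib's `Parallel`), with `inf` values replaced by `NaN` first so they drop out later.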
[2022-04-14 14:00:05.811475] INFO: moduleinvoker: input_features.v1 started..
[2022-04-14 14:00:05.821821] INFO: moduleinvoker: cache hit
[2022-04-14 14:00:05.824425] INFO: moduleinvoker: input_features.v1 finished [0.012951s].
[2022-04-14 14:00:05.835168] INFO: moduleinvoker: instruments.v2 started..
[2022-04-14 14:00:05.843326] INFO: moduleinvoker: cache hit
[2022-04-14 14:00:05.845650] INFO: moduleinvoker: instruments.v2 finished [0.010482s].
[2022-04-14 14:00:05.865640] INFO: moduleinvoker: advanced_auto_labeler.v2 started..
[2022-04-14 14:00:05.874244] INFO: moduleinvoker: cache hit
[2022-04-14 14:00:05.877015] INFO: moduleinvoker: advanced_auto_labeler.v2 finished [0.011374s].
[2022-04-14 14:00:05.889308] INFO: moduleinvoker: standardlize.v9 started..
[2022-04-14 14:00:05.895834] INFO: moduleinvoker: cache hit
[2022-04-14 14:00:05.897848] INFO: moduleinvoker: standardlize.v9 finished [0.008538s].
[2022-04-14 14:00:05.930292] INFO: moduleinvoker: general_feature_extractor.v7 started..
[2022-04-14 14:00:05.937475] INFO: moduleinvoker: cache hit
[2022-04-14 14:00:05.939449] INFO: moduleinvoker: general_feature_extractor.v7 finished [0.009188s].
[2022-04-14 14:00:05.950625] INFO: moduleinvoker: derived_feature_extractor.v3 started..
[2022-04-14 14:00:05.957722] INFO: moduleinvoker: cache hit
[2022-04-14 14:00:05.959937] INFO: moduleinvoker: derived_feature_extractor.v3 finished [0.009317s].
[2022-04-14 14:00:05.971891] INFO: moduleinvoker: dropnan.v1 started..
[2022-04-14 14:00:05.981520] INFO: moduleinvoker: cache hit
[2022-04-14 14:00:05.983318] INFO: moduleinvoker: dropnan.v1 finished [0.011427s].
[2022-04-14 14:00:06.003156] INFO: moduleinvoker: join.v3 started..
[2022-04-14 14:00:06.011291] INFO: moduleinvoker: cache hit
[2022-04-14 14:00:06.014089] INFO: moduleinvoker: join.v3 finished [0.010924s].
[2022-04-14 14:00:06.038141] INFO: moduleinvoker: cached.v3 started..
[2022-04-14 14:19:23.775491] INFO: moduleinvoker: cached.v3 finished [1157.737316s].
[2022-04-14 14:19:23.816286] INFO: moduleinvoker: cached.v3 started..
[2022-04-14 14:30:46.721969] INFO: moduleinvoker: cached.v3 finished [682.905703s].
[2022-04-14 14:30:46.778016] INFO: moduleinvoker: instruments.v2 started..
[2022-04-14 14:30:46.795862] INFO: moduleinvoker: cache hit
[2022-04-14 14:30:46.798526] INFO: moduleinvoker: instruments.v2 finished [0.020517s].
[2022-04-14 14:30:48.178172] INFO: moduleinvoker: backtest.v8 started..
[2022-04-14 14:30:48.184320] INFO: backtest: biglearning backtest:V8.6.2
[2022-04-14 14:30:48.185788] INFO: backtest: product_type:stock by specified
[2022-04-14 14:30:49.457891] INFO: moduleinvoker: cached.v2 started..
[2022-04-14 14:30:49.470347] INFO: moduleinvoker: cache hit
[2022-04-14 14:30:49.472903] INFO: moduleinvoker: cached.v2 finished [0.015043s].
[2022-04-14 14:30:58.521105] INFO: algo: TradingAlgorithm V1.8.7
[2022-04-14 14:31:02.221141] INFO: algo: trading transform...
[2022-04-14 14:35:01.236280] INFO: Performance: Simulated 1217 trading days out of 1217.
[2022-04-14 14:35:01.238356] INFO: Performance: first open: 2017-01-03 09:30:00+00:00
[2022-04-14 14:35:01.239971] INFO: Performance: last close: 2021-12-31 15:00:00+00:00
[2022-04-14 14:35:27.871123] INFO: moduleinvoker: backtest.v8 finished [279.692941s].
[2022-04-14 14:35:27.874264] INFO: moduleinvoker: trade.v4 finished [281.056945s].
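The year lists printed below (`[2014, 2015, 2016] [2017]` and so on) come from the rolling split in the "滚动训练DNN" (rolling DNN training) node: each fold trains on three consecutive years and validates on the year that follows, sliding forward one year per fold. A self-contained sketch of that splitter (names illustrative):

```python
def rolling_year_splits(years, n_split=5, train_years=3):
    """Yield (train_years_list, val_year_list) pairs: train on `train_years`
    consecutive years, validate on the next year, sliding forward each fold."""
    years = sorted(years)
    start = len(years) - n_split
    for _ in range(n_split):
        yield years[start - train_years:start], [years[start]]
        start += 1

splits = list(rolling_year_splits(range(2011, 2022), n_split=5))
# First fold: ([2014, 2015, 2016], [2017]); last fold: ([2018, 2019, 2020], [2021])
```

The actual node does the same arithmetic over the `year` column of the feature DataFrame, yielding row indices instead of year lists.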
[2014, 2015, 2016] [2017]
Epoch 1/10
1724/1724 - 13s - loss: 1.0063 - mse: 1.0063 - rmse: 1.0031 - pearson_correlation: 0.0428 - val_loss: 0.9867 - val_mse: 0.9867 - val_rmse: 0.9933 - val_pearson_correlation: 0.0333
Epoch 2/10
1724/1724 - 8s - loss: 0.9844 - mse: 0.9844 - rmse: 0.9922 - pearson_correlation: 0.0533 - val_loss: 0.9874 - val_mse: 0.9874 - val_rmse: 0.9937 - val_pearson_correlation: 0.0470
Epoch 3/10
1724/1724 - 8s - loss: 0.9829 - mse: 0.9829 - rmse: 0.9914 - pearson_correlation: 0.0575 - val_loss: 0.9859 - val_mse: 0.9859 - val_rmse: 0.9929 - val_pearson_correlation: 0.0363
Epoch 4/10
1724/1724 - 8s - loss: 0.9816 - mse: 0.9816 - rmse: 0.9907 - pearson_correlation: 0.0598 - val_loss: 0.9853 - val_mse: 0.9853 - val_rmse: 0.9926 - val_pearson_correlation: 0.0457
Epoch 5/10
1724/1724 - 8s - loss: 0.9803 - mse: 0.9803 - rmse: 0.9901 - pearson_correlation: 0.0667 - val_loss: 0.9866 - val_mse: 0.9866 - val_rmse: 0.9933 - val_pearson_correlation: 0.0327
Epoch 6/10
1724/1724 - 8s - loss: 0.9790 - mse: 0.9790 - rmse: 0.9894 - pearson_correlation: 0.0701 - val_loss: 0.9855 - val_mse: 0.9855 - val_rmse: 0.9927 - val_pearson_correlation: 0.0340
Epoch 7/10
1724/1724 - 8s - loss: 0.9777 - mse: 0.9777 - rmse: 0.9888 - pearson_correlation: 0.0703 - val_loss: 0.9868 - val_mse: 0.9868 - val_rmse: 0.9934 - val_pearson_correlation: 0.0369
Epoch 8/10
1724/1724 - 8s - loss: 0.9767 - mse: 0.9767 - rmse: 0.9883 - pearson_correlation: 0.0732 - val_loss: 0.9856 - val_mse: 0.9856 - val_rmse: 0.9928 - val_pearson_correlation: 0.0345
Epoch 9/10
1724/1724 - 9s - loss: 0.9758 - mse: 0.9758 - rmse: 0.9878 - pearson_correlation: 0.0761 - val_loss: 0.9847 - val_mse: 0.9847 - val_rmse: 0.9923 - val_pearson_correlation: 0.0393
Epoch 10/10
1724/1724 - 8s - loss: 0.9749 - mse: 0.9749 - rmse: 0.9874 - pearson_correlation: 0.0763 - val_loss: 0.9855 - val_mse: 0.9855 - val_rmse: 0.9927 - val_pearson_correlation: 0.0369
[2015, 2016, 2017] [2018]
Epoch 1/10
1890/1890 - 10s - loss: 0.9994 - mse: 0.9994 - rmse: 0.9997 - pearson_correlation: 0.0490 - val_loss: 0.9956 - val_mse: 0.9956 - val_rmse: 0.9978 - val_pearson_correlation: 0.0383
Epoch 2/10
1890/1890 - 8s - loss: 0.9817 - mse: 0.9817 - rmse: 0.9908 - pearson_correlation: 0.0610 - val_loss: 0.9916 - val_mse: 0.9916 - val_rmse: 0.9958 - val_pearson_correlation: 0.0547
Epoch 3/10
1890/1890 - 8s - loss: 0.9800 - mse: 0.9800 - rmse: 0.9899 - pearson_correlation: 0.0629 - val_loss: 0.9938 - val_mse: 0.9938 - val_rmse: 0.9969 - val_pearson_correlation: 0.0480
Epoch 4/10
1890/1890 - 9s - loss: 0.9786 - mse: 0.9786 - rmse: 0.9892 - pearson_correlation: 0.0662 - val_loss: 0.9961 - val_mse: 0.9961 - val_rmse: 0.9980 - val_pearson_correlation: 0.0628
Epoch 5/10
1890/1890 - 7s - loss: 0.9772 - mse: 0.9772 - rmse: 0.9885 - pearson_correlation: 0.0692 - val_loss: 0.9934 - val_mse: 0.9934 - val_rmse: 0.9967 - val_pearson_correlation: 0.0497
Epoch 6/10
1890/1890 - 8s - loss: 0.9761 - mse: 0.9761 - rmse: 0.9880 - pearson_correlation: 0.0719 - val_loss: 0.9947 - val_mse: 0.9947 - val_rmse: 0.9974 - val_pearson_correlation: 0.0538
Epoch 7/10
1890/1890 - 8s - loss: 0.9751 - mse: 0.9751 - rmse: 0.9875 - pearson_correlation: 0.0743 - val_loss: 0.9932 - val_mse: 0.9932 - val_rmse: 0.9966 - val_pearson_correlation: 0.0524
Epoch 8/10
1890/1890 - 8s - loss: 0.9740 - mse: 0.9740 - rmse: 0.9869 - pearson_correlation: 0.0772 - val_loss: 0.9921 - val_mse: 0.9921 - val_rmse: 0.9960 - val_pearson_correlation: 0.0460
Epoch 9/10
1890/1890 - 9s - loss: 0.9732 - mse: 0.9732 - rmse: 0.9865 - pearson_correlation: 0.0784 - val_loss: 0.9953 - val_mse: 0.9953 - val_rmse: 0.9976 - val_pearson_correlation: 0.0513
Epoch 10/10
1890/1890 - 10s - loss: 0.9725 - mse: 0.9725 - rmse: 0.9862 - pearson_correlation: 0.0799 - val_loss: 0.9936 - val_mse: 0.9936 - val_rmse: 0.9968 - val_pearson_correlation: 0.0546
[2016, 2017, 2018] [2019]
Epoch 1/10
2137/2137 - 12s - loss: 1.0034 - mse: 1.0034 - rmse: 1.0017 - pearson_correlation: 0.0380 - val_loss: 0.9611 - val_mse: 0.9611 - val_rmse: 0.9804 - val_pearson_correlation: 0.0170
Epoch 2/10
2137/2137 - 9s - loss: 0.9843 - mse: 0.9843 - rmse: 0.9921 - pearson_correlation: 0.0478 - val_loss: 0.9629 - val_mse: 0.9629 - val_rmse: 0.9813 - val_pearson_correlation: 0.0347
Epoch 3/10
2137/2137 - 8s - loss: 0.9832 - mse: 0.9832 - rmse: 0.9915 - pearson_correlation: 0.0523 - val_loss: 0.9625 - val_mse: 0.9625 - val_rmse: 0.9811 - val_pearson_correlation: 0.0432
Epoch 4/10
2137/2137 - 9s - loss: 0.9822 - mse: 0.9822 - rmse: 0.9910 - pearson_correlation: 0.0554 - val_loss: 0.9602 - val_mse: 0.9602 - val_rmse: 0.9799 - val_pearson_correlation: 0.0175
Epoch 5/10
2137/2137 - 10s - loss: 0.9811 - mse: 0.9811 - rmse: 0.9905 - pearson_correlation: 0.0581 - val_loss: 0.9607 - val_mse: 0.9607 - val_rmse: 0.9802 - val_pearson_correlation: 0.0248
Epoch 6/10
2137/2137 - 10s - loss: 0.9804 - mse: 0.9804 - rmse: 0.9901 - pearson_correlation: 0.0592 - val_loss: 0.9609 - val_mse: 0.9609 - val_rmse: 0.9802 - val_pearson_correlation: 0.0267
Epoch 7/10
2137/2137 - 10s - loss: 0.9799 - mse: 0.9799 - rmse: 0.9899 - pearson_correlation: 0.0628 - val_loss: 0.9613 - val_mse: 0.9613 - val_rmse: 0.9804 - val_pearson_correlation: 0.0243
Epoch 8/10
2137/2137 - 10s - loss: 0.9791 - mse: 0.9791 - rmse: 0.9895 - pearson_correlation: 0.0620 - val_loss: 0.9626 - val_mse: 0.9626 - val_rmse: 0.9811 - val_pearson_correlation: 0.0314
Epoch 9/10
2137/2137 - 10s - loss: 0.9784 - mse: 0.9784 - rmse: 0.9892 - pearson_correlation: 0.0658 - val_loss: 0.9607 - val_mse: 0.9607 - val_rmse: 0.9802 - val_pearson_correlation: 0.0149
Epoch 10/10
2137/2137 - 9s - loss: 0.9780 - mse: 0.9780 - rmse: 0.9889 - pearson_correlation: 0.0653 - val_loss: 0.9625 - val_mse: 0.9625 - val_rmse: 0.9811 - val_pearson_correlation: 0.0206
[2017, 2018, 2019] [2020]
Epoch 1/10
2355/2355 - 13s - loss: 0.9930 - mse: 0.9930 - rmse: 0.9965 - pearson_correlation: 0.0315 - val_loss: 0.9564 - val_mse: 0.9564 - val_rmse: 0.9780 - val_pearson_correlation: -3.3044e-03
Epoch 2/10
2355/2355 - 10s - loss: 0.9762 - mse: 0.9762 - rmse: 0.9880 - pearson_correlation: 0.0377 - val_loss: 0.9515 - val_mse: 0.9515 - val_rmse: 0.9754 - val_pearson_correlation: 0.0021
Epoch 3/10
2355/2355 - 9s - loss: 0.9751 - mse: 0.9751 - rmse: 0.9875 - pearson_correlation: 0.0404 - val_loss: 0.9544 - val_mse: 0.9544 - val_rmse: 0.9770 - val_pearson_correlation: -7.5114e-03
Epoch 4/10
2355/2355 - 9s - loss: 0.9743 - mse: 0.9743 - rmse: 0.9871 - pearson_correlation: 0.0422 - val_loss: 0.9540 - val_mse: 0.9540 - val_rmse: 0.9767 - val_pearson_correlation: -1.3212e-02
Epoch 5/10
2355/2355 - 10s - loss: 0.9737 - mse: 0.9737 - rmse: 0.9867 - pearson_correlation: 0.0418 - val_loss: 0.9565 - val_mse: 0.9565 - val_rmse: 0.9780 - val_pearson_correlation: 0.0072
Epoch 6/10
2355/2355 - 9s - loss: 0.9729 - mse: 0.9729 - rmse: 0.9863 - pearson_correlation: 0.0424 - val_loss: 0.9563 - val_mse: 0.9563 - val_rmse: 0.9779 - val_pearson_correlation: 0.0025
Epoch 7/10
2355/2355 - 9s - loss: 0.9724 - mse: 0.9724 - rmse: 0.9861 - pearson_correlation: 0.0430 - val_loss: 0.9586 - val_mse: 0.9586 - val_rmse: 0.9791 - val_pearson_correlation: 0.0044
Epoch 8/10
2355/2355 - 9s - loss: 0.9720 - mse: 0.9720 - rmse: 0.9859 - pearson_correlation: 0.0459 - val_loss: 0.9632 - val_mse: 0.9632 - val_rmse: 0.9814 - val_pearson_correlation: -8.9622e-03
Epoch 9/10
2355/2355 - 10s - loss: 0.9715 - mse: 0.9715 - rmse: 0.9856 - pearson_correlation: 0.0469 - val_loss: 0.9580 - val_mse: 0.9580 - val_rmse: 0.9788 - val_pearson_correlation: -5.9859e-03
Epoch 10/10
2355/2355 - 10s - loss: 0.9711 - mse: 0.9711 - rmse: 0.9854 - pearson_correlation: 0.0490 - val_loss: 0.9539 - val_mse: 0.9539 - val_rmse: 0.9767 - val_pearson_correlation: -4.9825e-04
[2018, 2019, 2020] [2021]
Epoch 1/10
2465/2465 - 13s - loss: 0.9804 - mse: 0.9804 - rmse: 0.9901 - pearson_correlation: 0.0215 - val_loss: 0.9552 - val_mse: 0.9552 - val_rmse: 0.9774 - val_pearson_correlation: 0.0037
Epoch 2/10
2465/2465 - 9s - loss: 0.9651 - mse: 0.9651 - rmse: 0.9824 - pearson_correlation: 0.0268 - val_loss: 0.9512 - val_mse: 0.9512 - val_rmse: 0.9753 - val_pearson_correlation: 0.0078
Epoch 3/10
2465/2465 - 10s - loss: 0.9641 - mse: 0.9641 - rmse: 0.9819 - pearson_correlation: 0.0295 - val_loss: 0.9593 - val_mse: 0.9593 - val_rmse: 0.9795 - val_pearson_correlation: -9.3071e-03
Epoch 4/10
2465/2465 - 11s - loss: 0.9634 - mse: 0.9634 - rmse: 0.9816 - pearson_correlation: 0.0314 - val_loss: 0.9525 - val_mse: 0.9525 - val_rmse: 0.9760 - val_pearson_correlation: 0.0191
Epoch 5/10
2465/2465 - 10s - loss: 0.9625 - mse: 0.9625 - rmse: 0.9811 - pearson_correlation: 0.0349 - val_loss: 0.9544 - val_mse: 0.9544 - val_rmse: 0.9769 - val_pearson_correlation: 0.0033
Epoch 6/10
2465/2465 - 11s - loss: 0.9618 - mse: 0.9618 - rmse: 0.9807 - pearson_correlation: 0.0380 - val_loss: 0.9552 - val_mse: 0.9552 - val_rmse: 0.9773 - val_pearson_correlation: -1.0704e-03
Epoch 7/10
2465/2465 - 10s - loss: 0.9613 - mse: 0.9613 - rmse: 0.9805 - pearson_correlation: 0.0407 - val_loss: 0.9546 - val_mse: 0.9546 - val_rmse: 0.9770 - val_pearson_correlation: 0.0017
Epoch 8/10
2465/2465 - 11s - loss: 0.9606 - mse: 0.9606 - rmse: 0.9801 - pearson_correlation: 0.0432 - val_loss: 0.9545 - val_mse: 0.9545 - val_rmse: 0.9770 - val_pearson_correlation: 0.0112
Epoch 9/10
2465/2465 - 11s - loss: 0.9602 - mse: 0.9602 - rmse: 0.9799 - pearson_correlation: 0.0473 - val_loss: 0.9595 - val_mse: 0.9595 - val_rmse: 0.9796 - val_pearson_correlation: 0.0045
Epoch 10/10
2465/2465 - 11s - loss: 0.9595 - mse: 0.9595 - rmse: 0.9796 - pearson_correlation: 0.0452 - val_loss: 0.9613 - val_mse: 0.9613 - val_rmse: 0.9804 - val_pearson_correlation: 0.0095
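The `pearson_correlation` metric logged above is computed in TensorFlow by centering both vectors, L2-normalizing them, and taking their dot product, which is algebraically the Pearson correlation coefficient. An equivalent NumPy version (a sketch for clarity, not the training code itself):

```python
import numpy as np

def pearson_correlation(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    # Center, L2-normalize, then dot-product: equal to Pearson's r
    a = y_true - y_true.mean()
    b = y_pred - y_pred.mean()
    a /= np.linalg.norm(a)
    b /= np.linalg.norm(b)
    return float(a @ b)

x = np.array([1.0, 2.0, 3.0, 4.0])
assert abs(pearson_correlation(x, 2 * x + 1) - 1.0) < 1e-9  # perfectly correlated
```

Values around 0.03–0.08, as in the logs, are typical for daily cross-sectional return prediction: the signal is weak per observation but can still rank stocks usefully in aggregate.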
- Total return: 324.83%
- Annualized return: 34.92%
- Benchmark return: 49.25%
- Alpha: 0.29
- Beta: 0.48
- Sharpe ratio: 1.6
- Win rate: 0.52
- Profit/loss ratio: 1.3
- Return volatility: 17.9%
- Information ratio: 0.07
- Max drawdown: 22.34%