Q&A Discussion

[Code error] How do I fix this error? TypeError Traceback (most recent call last)

Created by henry128, last reply by small_q, viewed by 20 users

10-31", "portfolio_value": 1000000.0, "cash": 1000000.0},json_cum_benchmark={"000300.HIX": 0.0, "000300.HIX.CUM": 0.0}) done rv=None
cum_return_plot: 1, before_shared_cum_return_plot: 0, after_shared_cum_return_plot: 0, benchmark_cum_return_plot: 1, hold_percent_plot: 1
[2023-11-01 00:11:29.251127] INFO hfpapertrading: invoke update_equity_algo(henry128,173975,2023-10-31,cum_return=0.0,today_return=0.0) done rv=None
[2023-11-01 00:11:29.254764] INFO hfpapertrading: hfpapertrading result, new_cash 1000000.0
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-1-0c660c0cbff3> in <module>
    196 )
    197 
--> 198 m7 = M.bigtrader.v1(
    199     instruments=m2.data,
    200     options_data=m6.data,

/var/app/enabled/biglearning/module2/common/modulemanagerv2.cpython-38-x86_64-linux-gnu.so in biglearning.module2.common.modulemanagerv2.BigQuantModuleVersion.__call__()

/var/app/enabled/biglearning/module2/common/moduleinvoker.cpython-38-x86_64-linux-gnu.so in biglearning.module2.common.moduleinvoker.module_invoke()

/var/app/enabled/biglearning/module2/common/moduleinvoker.cpython-38-x86_64-linux-gnu.so in biglearning.module2.common.moduleinvoker._invoke_with_cache()

/var/app/enabled/biglearning/module2/common/modulecache.cpython-38-x86_64-linux-gnu.so in biglearning.module2.common.modulecache.cache_set()

/usr/local/python3/lib/python3.8/json/__init__.py in dumps(obj, skipkeys, ensure_ascii, check_circular, allow_nan, cls, indent, separators, default, sort_keys, **kw)
    232     if cls is None:
    233         cls = JSONEncoder
--> 234     return cls(
    235         skipkeys=skipkeys, ensure_ascii=ensure_ascii,
    236         check_circular=check_circular, allow_nan=allow_nan, indent=indent,

/usr/local/python3/lib/python3.8/json/encoder.py in encode(self, o)
    197         # exceptions aren't as detailed.  The list call should be roughly
    198         # equivalent to the PySequence_Fast that ''.join() would do.
--> 199         chunks = self.iterencode(o, _one_shot=True)
    200         if not isinstance(chunks, (list, tuple)):
    201             chunks = list(chunks)

/usr/local/python3/lib/python3.8/json/encoder.py in iterencode(self, o, _one_shot)
    255                 self.key_separator, self.item_separator, self.sort_keys,
    256                 self.skipkeys, _one_shot)
--> 257         return _iterencode(o, 0)
    258 
    259 def _make_iterencode(markers, _default, _encoder, _indent, _floatstr,

/var/app/enabled/biglearning/module2/common/modulecache.cpython-38-x86_64-linux-gnu.so in biglearning.module2.common.modulecache._cache_value_encoder()

TypeError: _cache_value_encoder: not supported type: <class 'datetime.datetime'>
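For context, the final frames show the module cache serializing the module call to JSON before caching it, and Python's built-in json encoder has no rule for datetime.datetime values, which is what the error reports. A minimal standalone reproduction of the same kind of failure (an illustrative sketch, not taken from the original strategy code):

import json
import datetime

# A datetime.datetime value mixed into otherwise JSON-friendly parameters
params = {"run_date": datetime.datetime(2023, 10, 31)}

try:
    json.dumps(params)  # the default encoder cannot serialize datetime objects
except TypeError as err:
    print(err)  # "Object of type datetime is not JSON serializable"

# Converting the value to a plain string first makes it serializable
params["run_date"] = params["run_date"].strftime("%Y-%m-%d")
print(json.dumps(params))  # {"run_date": "2023-10-31"}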


Comments
  • Judging from the error message, the 'datetime.datetime' data type is not supported. Could you share your code so we can look at the specific cause of the error?
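A common cause is passing a raw datetime.datetime object (for example a parsed trading date) as a module parameter, which the module cache then fails to JSON-encode. Below is a minimal sketch of the usual workaround, assuming the date is held in a variable such as start_date (a hypothetical name, since the original code is not shown): convert it to a string before handing it to M.bigtrader.v1.

import datetime

# Hypothetical illustration: the original code is not shown, so the variable
# name start_date and its use below are assumptions made for this sketch.
start_date = datetime.datetime(2023, 10, 31)

# Convert the datetime to a plain string so the module cache can JSON-encode it
start_date_str = start_date.strftime("%Y-%m-%d")

# Pass the string rather than the datetime object into the module call, e.g.:
# m7 = M.bigtrader.v1(
#     instruments=m2.data,
#     options_data=m6.data,
#     start_date=start_date_str,  # hypothetical parameter, for illustration only
# )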