PS F:\> perceptilabs -v=3
INFO: Could not find files for the given pattern(s).
PerceptiLabs: Your environment does not have git installed, so interactions with GitHub will not be available
2021-06-09 13:47:42.594816: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library cudart64_110.dll
2021-06-09 13:47:44.165854: I tensorflow/compiler/jit/xla_cpu_device.cc:41] Not creating XLA devices, tf_xla_enable_xla_devices not set
2021-06-09 13:47:44.166585: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library nvcuda.dll
2021-06-09 13:47:44.187720: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1720] Found device 0 with properties:
pciBusID: 0000:08:00.0 name: GeForce GTX 1080 Ti computeCapability: 6.1
coreClock: 1.6575GHz coreCount: 28 deviceMemorySize: 11.00GiB deviceMemoryBandwidth: 451.17GiB/s
2021-06-09 13:47:44.187820: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library cudart64_110.dll
2021-06-09 13:47:44.196498: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library cublas64_11.dll
2021-06-09 13:47:44.196565: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library cublasLt64_11.dll
2021-06-09 13:47:44.199965: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library cufft64_10.dll
2021-06-09 13:47:44.200971: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library curand64_10.dll
2021-06-09 13:47:44.204970: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library cusolver64_10.dll
2021-06-09 13:47:44.207885: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library cusparse64_11.dll
2021-06-09 13:47:44.208365: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library cudnn64_8.dll
2021-06-09 13:47:44.208465: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1862] Adding visible gpu devices: 0
Operations to perform:
  Apply all migrations: admin, api, auth, contenttypes, sessions
Running migrations:
  No migrations to apply.
PerceptiLabs: Starting
2021-06-09 13:47:46.326741: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library cudart64_110.dll
Performing system checks...
Performing system checks...
System check identified no issues (0 silenced).
June 09, 2021 - 13:47:46
Django version 3.2, using settings 'static_file_server.settings'
Starting development server at http://127.0.0.1:8080/
Quit the server with CTRL-BREAK.
System check identified no issues (0 silenced).
June 09, 2021 - 13:47:46
Django version 3.2, using settings 'fileserver.settings'
Starting development server at http://127.0.0.1:8011/
Quit the server with CTRL-BREAK.
Performing system checks...
System check identified no issues (0 silenced).
June 09, 2021 - 13:47:47
Django version 3.2, using settings 'rygg.settings'
Starting development server at http://127.0.0.1:8000/
Quit the server with CTRL-BREAK.
2021-06-09 13:47:47.954297: I tensorflow/compiler/jit/xla_cpu_device.cc:41] Not creating XLA devices, tf_xla_enable_xla_devices not set
2021-06-09 13:47:47.954787: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library nvcuda.dll
2021-06-09 13:47:47.976403: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1720] Found device 0 with properties:
pciBusID: 0000:08:00.0 name: GeForce GTX 1080 Ti computeCapability: 6.1
coreClock: 1.6575GHz coreCount: 28 deviceMemorySize: 11.00GiB deviceMemoryBandwidth: 451.17GiB/s
2021-06-09 13:47:47.976523: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library cudart64_110.dll
2021-06-09 13:47:47.980978: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library cublas64_11.dll
2021-06-09 13:47:47.981061: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library cublasLt64_11.dll
2021-06-09 13:47:47.983288: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library cufft64_10.dll
2021-06-09 13:47:47.983988: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library curand64_10.dll
2021-06-09 13:47:47.986888: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library cusolver64_10.dll
2021-06-09 13:47:47.988296: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library cusparse64_11.dll
2021-06-09 13:47:47.988846: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library cudnn64_8.dll
2021-06-09 13:47:47.988972: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1862] Adding visible gpu devices: 0
[09/Jun/2021 13:47:49] "GET /?token=tWcU7ryorWvMunfRhx9vSAdnJh-iJJNY0lkG5-fe_Ak HTTP/1.1" 200 493
2021-06-09 13:47:49,055 - INFO - :1 - Reporting errors with commit id: 83efcce773bf7b536564aa36d699df84bda619e9
PerceptiLabs is ready...
2021-06-09 13:47:49,074 - INFO - :1 - Trying to listen to: 0.0.0.0 5000
[09/Jun/2021 13:47:49] "GET /static/js/0.3f4117f65c7d6f13a74b.js HTTP/1.1" 304 0
[09/Jun/2021 13:47:49] "GET /static/styles/vendors.63d2ede8fd86a38e9249.css HTTP/1.1" 304 0
[09/Jun/2021 13:47:49] "GET /static/js/app.12c5185cb4c1cd5596c9.js HTTP/1.1" 304 0
[09/Jun/2021 13:47:49] "GET /fileserver_url HTTP/1.1" 200 0
[09/Jun/2021 13:47:49] "GET /rygg_url HTTP/1.1" 200 0
[09/Jun/2021 13:47:49] "GET /kernel_url HTTP/1.1" 200 0
[09/Jun/2021 13:47:49] "GET /keycloak_url HTTP/1.1" 200 0
[09/Jun/2021 13:47:49,254] - Broken pipe from ('127.0.0.1', 63041)
[09/Jun/2021 13:47:50] "GET /keycloak_url HTTP/1.1" 200 0
[09/Jun/2021 13:47:50] "GET /?token=tWcU7ryorWvMunfRhx9vSAdnJh-iJJNY0lkG5-fe_Ak HTTP/1.1" 200 493
PerceptiLabs: PerceptiLabs Started
PerceptiLabs: PerceptiLabs is running at http://localhost:8080/?token=tWcU7ryorWvMunfRhx9vSAdnJh-iJJNY0lkG5-fe_Ak
PerceptiLabs: Use Control-C to stop this server and shut down all PerceptiLabs processes.
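
For reference, the CUDA library loads and the "Adding visible gpu devices: 0" lines above mean TensorFlow can see the GTX 1080 Ti. A quick sanity check that could be run in the same Python environment (a minimal sketch, assuming the TensorFlow 2.x GPU build bundled with PerceptiLabs; not part of the log above):

import tensorflow as tf

print(tf.__version__)                          # TF 2.x wheel used by PerceptiLabs
print(tf.config.list_physical_devices("GPU"))  # expect one entry for the GTX 1080 Ti
print(tf.test.is_built_with_cuda())            # True for a CUDA-enabled build
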
[09/Jun/2021 13:47:51] "GET /fileserver_url HTTP/1.1" 200 0 [09/Jun/2021 13:47:51] "GET /rygg_url HTTP/1.1" 200 0 [09/Jun/2021 13:47:51] "GET /kernel_url HTTP/1.1" 200 0 [09/Jun/2021 13:47:51] "GET /keycloak_url HTTP/1.1" 200 0 [09/Jun/2021 13:47:51] "GET /keycloak_url HTTP/1.1" 200 0 2021-06-09 13:47:51,878 - INFO - server.py:191 - Created coreLogic for network 'None' [09/Jun/2021 13:47:51] "GET /rygg_url HTTP/1.1" 200 0 [09/Jun/2021 13:47:51] "OPTIONS /version?token=tWcU7ryorWvMunfRhx9vSAdnJh-iJJNY0lkG5-fe_Ak HTTP/1.1" 200 0 [09/Jun/2021 13:47:51] "GET /projects/ HTTP/1.1" 200 265 [09/Jun/2021 13:47:51] "GET /version?token=tWcU7ryorWvMunfRhx9vSAdnJh-iJJNY0lkG5-fe_Ak HTTP/1.1" 200 21 2021-06-09 13:47:51,921 - INFO - server.py:191 - Created coreLogic for network '' 2021-06-09 13:47:51,921 - INFO - server.py:191 - User has been set to dr.gm@live.com [09/Jun/2021 13:47:52] "GET /app/updates_available/ HTTP/1.1" 200 22 [09/Jun/2021 13:47:52] "GET /mixpanel/decide/?verbose=1&version=1&lib=web&token=1480b2244fdd4d821227a29e2637f922&ip=1&_=1623232071891 HTTP/1.1" 200 0 [09/Jun/2021 13:47:52] "POST /mixpanel/track/?ip=1&_=1623232071894 HTTP/1.1" 200 0 [09/Jun/2021 13:47:57] "POST /mixpanel/engage/?ip=1&_=1623232076860 HTTP/1.1" 200 0 [09/Jun/2021 13:47:57] "POST /mixpanel/engage/?ip=1&_=1623232076860 HTTP/1.1" 200 0 [09/Jun/2021 13:47:58] "GET /mixpanel/decide/?verbose=1&version=3&lib=web&token=1480b2244fdd4d821227a29e2637f922&distinct_id=dr.gm%40live.com&ip=1&_=1623232076858 HTTP/1.1" 200 0 [09/Jun/2021 13:49:04] "GET /static/img/model-empty.png HTTP/1.1" 200 36694 2021-06-09 13:49:04,450 - INFO - server.py:191 - Created coreLogic for network '0' [09/Jun/2021 13:49:05] "POST /mixpanel/track/?ip=1&_=1623232145304 HTTP/1.1" 200 0 [09/Jun/2021 13:49:07] "GET /static/img/file-picker/home.svg HTTP/1.1" 200 664 [09/Jun/2021 13:49:07] "OPTIONS /directories/get_folder_content?path=F:/DFU_Ai/DFUC2021_train/images&token=tWcU7ryorWvMunfRhx9vSAdnJh-iJJNY0lkG5-fe_Ak HTTP/1.1" 200 0 [09/Jun/2021 13:49:08] "GET /directories/get_folder_content?path=F:/DFU_Ai/DFUC2021_train/images&token=tWcU7ryorWvMunfRhx9vSAdnJh-iJJNY0lkG5-fe_Ak HTTP/1.1" 200 83516 [09/Jun/2021 13:49:08] "GET /static/img/file-picker/folder.svg HTTP/1.1" 200 307 [09/Jun/2021 13:49:08] "GET /static/img/file-picker/file.svg HTTP/1.1" 200 304 [09/Jun/2021 13:49:09] "OPTIONS /directories/get_folder_content?path=&token=tWcU7ryorWvMunfRhx9vSAdnJh-iJJNY0lkG5-fe_Ak HTTP/1.1" 200 0 [09/Jun/2021 13:49:09] "GET /directories/get_folder_content?path=&token=tWcU7ryorWvMunfRhx9vSAdnJh-iJJNY0lkG5-fe_Ak HTTP/1.1" 200 150 [09/Jun/2021 13:49:14] "OPTIONS /directories/get_folder_content?path=.&token=tWcU7ryorWvMunfRhx9vSAdnJh-iJJNY0lkG5-fe_Ak HTTP/1.1" 200 0 [09/Jun/2021 13:49:14] "GET /directories/get_folder_content?path=.&token=tWcU7ryorWvMunfRhx9vSAdnJh-iJJNY0lkG5-fe_Ak HTTP/1.1" 200 107 [09/Jun/2021 13:49:16] "OPTIONS /directories/get_folder_content?path=F:\&token=tWcU7ryorWvMunfRhx9vSAdnJh-iJJNY0lkG5-fe_Ak HTTP/1.1" 200 0 [09/Jun/2021 13:49:17] "GET /directories/get_folder_content?path=F:\&token=tWcU7ryorWvMunfRhx9vSAdnJh-iJJNY0lkG5-fe_Ak HTTP/1.1" 200 484 [09/Jun/2021 13:49:18] "OPTIONS /directories/get_folder_content?path=F:/DFU_Ai&token=tWcU7ryorWvMunfRhx9vSAdnJh-iJJNY0lkG5-fe_Ak HTTP/1.1" 200 0 [09/Jun/2021 13:49:18] "GET /directories/get_folder_content?path=F:/DFU_Ai&token=tWcU7ryorWvMunfRhx9vSAdnJh-iJJNY0lkG5-fe_Ak HTTP/1.1" 200 166 [09/Jun/2021 13:49:21] "OPTIONS 
/directories/get_folder_content?path=F:/DFU_Ai/DFUC2021_train&token=tWcU7ryorWvMunfRhx9vSAdnJh-iJJNY0lkG5-fe_Ak HTTP/1.1" 200 0 [09/Jun/2021 13:49:21] "GET /directories/get_folder_content?path=F:/DFU_Ai/DFUC2021_train&token=tWcU7ryorWvMunfRhx9vSAdnJh-iJJNY0lkG5-fe_Ak HTTP/1.1" 200 126 [09/Jun/2021 13:49:24] "GET /directories/get_folder_content?path=F:/DFU_Ai/DFUC2021_train/images&token=tWcU7ryorWvMunfRhx9vSAdnJh-iJJNY0lkG5-fe_Ak HTTP/1.1" 200 83516 [09/Jun/2021 13:49:27] "OPTIONS /files/get_file_content?path=F:/DFU_Ai/DFUC2021_train/images/11111train.csv&token=tWcU7ryorWvMunfRhx9vSAdnJh-iJJNY0lkG5-fe_Ak HTTP/1.1" 200 0 ('F:/DFU_Ai/DFUC2021_train/images/11111train.csv', 4) [09/Jun/2021 13:49:27] "GET /files/get_file_content?path=F:/DFU_Ai/DFUC2021_train/images/11111train.csv&token=tWcU7ryorWvMunfRhx9vSAdnJh-iJJNY0lkG5-fe_Ak HTTP/1.1" 200 132 [09/Jun/2021 13:49:29] "GET /static/fonts/roboto-regular-webfont.736b705.woff2 HTTP/1.1" 200 20000 [09/Jun/2021 13:49:29] "OPTIONS /directories/resolved_dir?path=C:\Users\Dr.GM\Documents\Perceptilabs\Default&token=tWcU7ryorWvMunfRhx9vSAdnJh-iJJNY0lkG5-fe_Ak HTTP/1.1" 200 0 [09/Jun/2021 13:49:29] "GET /directories/resolved_dir?path=C:\Users\Dr.GM\Documents\Perceptilabs\Default&token=tWcU7ryorWvMunfRhx9vSAdnJh-iJJNY0lkG5-fe_Ak HTTP/1.1" 200 62 [09/Jun/2021 13:49:29] "OPTIONS /directories/get_folder_content?path=C:\Users\Dr.GM\Documents\Perceptilabs\Default&token=tWcU7ryorWvMunfRhx9vSAdnJh-iJJNY0lkG5-fe_Ak HTTP/1.1" 200 0 [09/Jun/2021 13:49:29] "GET /directories/get_folder_content?path=C:\Users\Dr.GM\Documents\Perceptilabs\Default&token=tWcU7ryorWvMunfRhx9vSAdnJh-iJJNY0lkG5-fe_Ak HTTP/1.1" 200 113 [09/Jun/2021 13:49:50] [09/Jun/2021 13:49:50] "OPTIONS /directories?path=C:\Users\Dr.GM\Documents\Perceptilabs\Default/Model%201&token=tWcU7ryorWvMunfRhx9vSAdnJh-iJJNY0lkG5-fe_Ak HTTP/1.1" 200 0"OPTIONS /models/ HTTP/1.1" 200 0 [09/Jun/2021 13:49:50] "HEAD /directories?path=C:\Users\Dr.GM\Documents\Perceptilabs\Default/Model%201&token=tWcU7ryorWvMunfRhx9vSAdnJh-iJJNY0lkG5-fe_Ak HTTP/1.1" 204 34 [09/Jun/2021 13:49:50] "POST /models/ HTTP/1.1" 201 239 2021-06-09 13:49:50.651662: I tensorflow/core/platform/cpu_feature_guard.cc:142] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations: AVX2 To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags. 
2021-06-09 13:49:50.652594: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1720] Found device 0 with properties:
pciBusID: 0000:08:00.0 name: GeForce GTX 1080 Ti computeCapability: 6.1
coreClock: 1.6575GHz coreCount: 28 deviceMemorySize: 11.00GiB deviceMemoryBandwidth: 451.17GiB/s
2021-06-09 13:49:50.652700: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library cudart64_110.dll
2021-06-09 13:49:50.652790: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library cublas64_11.dll
2021-06-09 13:49:50.652882: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library cublasLt64_11.dll
2021-06-09 13:49:50.652936: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library cufft64_10.dll
2021-06-09 13:49:50.652974: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library curand64_10.dll
2021-06-09 13:49:50.653038: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library cusolver64_10.dll
2021-06-09 13:49:50.654003: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library cusparse64_11.dll
2021-06-09 13:49:50.655002: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library cudnn64_8.dll
2021-06-09 13:49:50.655482: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1862] Adding visible gpu devices: 0
2021-06-09 13:49:51.199266: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1261] Device interconnect StreamExecutor with strength 1 edge matrix:
2021-06-09 13:49:51.199380: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1267] 0
2021-06-09 13:49:51.200683: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1280] 0: N
2021-06-09 13:49:51.201213: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1406] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 9526 MB memory) -> physical GPU (device: 0, name: GeForce GTX 1080 Ti, pci bus id: 0000:08:00.0, compute capability: 6.1)
2021-06-09 13:49:51.202116: I tensorflow/compiler/jit/xla_gpu_device.cc:99] Not creating XLA devices, tf_xla_enable_xla_devices not set
WARNING:tensorflow:AutoGraph could not transform .Loader.call of .Loader object at 0x000001CBE347D5C8>> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
2021-06-09 13:49:51.226075: I tensorflow/compiler/mlir/mlir_graph_optimization_pass.cc:116] None of the MLIR optimization passes are enabled (registered 2)
[09/Jun/2021 13:49:51] "OPTIONS /directories?path=C:\Users\Dr.GM\Documents\Perceptilabs\Default/Model%201&token=tWcU7ryorWvMunfRhx9vSAdnJh-iJJNY0lkG5-fe_Ak HTTP/1.1" 200 0
[09/Jun/2021 13:49:51] "POST /models/ HTTP/1.1" 201 239
[09/Jun/2021 13:49:51] "HEAD /directories?path=C:\Users\Dr.GM\Documents\Perceptilabs\Default/Model%201&token=tWcU7ryorWvMunfRhx9vSAdnJh-iJJNY0lkG5-fe_Ak HTTP/1.1" 204 34
[09/Jun/2021 13:49:53] "OPTIONS /directories?path=C:\Users\Dr.GM\Documents\Perceptilabs\Default/Model%201&token=tWcU7ryorWvMunfRhx9vSAdnJh-iJJNY0lkG5-fe_Ak HTTP/1.1" 200 0
[09/Jun/2021 13:49:53] "POST /models/ HTTP/1.1" 201 239
[09/Jun/2021 13:49:53] "HEAD /directories?path=C:\Users\Dr.GM\Documents\Perceptilabs\Default/Model%201&token=tWcU7ryorWvMunfRhx9vSAdnJh-iJJNY0lkG5-fe_Ak HTTP/1.1" 204 34
[09/Jun/2021 13:49:57] "OPTIONS /directories?path=C:\Users\Dr.GM\Documents\Perceptilabs\Default/Model%201&token=tWcU7ryorWvMunfRhx9vSAdnJh-iJJNY0lkG5-fe_Ak HTTP/1.1" 200 0
[09/Jun/2021 13:49:57] "HEAD /directories?path=C:\Users\Dr.GM\Documents\Perceptilabs\Default/Model%201&token=tWcU7ryorWvMunfRhx9vSAdnJh-iJJNY0lkG5-fe_Ak HTTP/1.1" 204 34
[09/Jun/2021 13:49:57] "POST /models/ HTTP/1.1" 201 239
WARNING:tensorflow:AutoGraph could not transform .Loader.call of .Loader object at 0x000001CB8471E308>> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
[09/Jun/2021 13:49:58] "OPTIONS /json_models?path=C:\Users\Dr.GM\Documents\Perceptilabs\Default/Model%201&token=tWcU7ryorWvMunfRhx9vSAdnJh-iJJNY0lkG5-fe_Ak HTTP/1.1" 200 0
[09/Jun/2021 13:49:58] "POST /json_models?path=C:\Users\Dr.GM\Documents\Perceptilabs\Default/Model%201&token=tWcU7ryorWvMunfRhx9vSAdnJh-iJJNY0lkG5-fe_Ak HTTP/1.1" 200 83
[09/Jun/2021 13:49:58] "POST /mixpanel/track/?ip=1&_=1623232197986 HTTP/1.1" 200 0
WARNING:tensorflow:AutoGraph could not transform .Loader.call of .Loader object at 0x000001CB8472E408>> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
[09/Jun/2021 13:50:03] "POST /json_models?path=C:\Users\Dr.GM\Documents\Perceptilabs\Default/Model%201&token=tWcU7ryorWvMunfRhx9vSAdnJh-iJJNY0lkG5-fe_Ak HTTP/1.1" 200 83
WARNING:tensorflow:AutoGraph could not transform .Loader.call of .Loader object at 0x000001CB846D52C8>> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
[09/Jun/2021 13:50:08] "POST /json_models?path=C:\Users\Dr.GM\Documents\Perceptilabs\Default/Model%201&token=tWcU7ryorWvMunfRhx9vSAdnJh-iJJNY0lkG5-fe_Ak HTTP/1.1" 200 83
2021-06-09 13:50:12,917 - INFO - server.py:191 - Created coreLogic for network '111'
WARNING:tensorflow:AutoGraph could not transform .Loader.call of .Loader object at 0x000001CB84661508>> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
[09/Jun/2021 13:50:13] "POST /json_models?path=C:\Users\Dr.GM\Documents\Perceptilabs\Default/Model%201&token=tWcU7ryorWvMunfRhx9vSAdnJh-iJJNY0lkG5-fe_Ak HTTP/1.1" 200 83
[09/Jun/2021 13:50:13] "POST /mixpanel/track/?ip=1&_=1623232212892 HTTP/1.1" 200 0
[09/Jun/2021 13:50:14] "POST /mixpanel/track/?ip=1&_=1623232214508 HTTP/1.1" 200 0
2021-06-09 13:50:16,117 - WARNING - server.py:191 - Settings engine is not set. Cannot make recommendations. Using old json_network.
WARNING:tensorflow:AutoGraph could not transform .Pipeline.call of .Pipeline object at 0x000001CBE2E50988>> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
WARNING:tensorflow:AutoGraph could not transform .TrainingPipeline.call of .TrainingPipeline object at 0x000001CB8465CE48>> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
WARNING:tensorflow:AutoGraph could not transform .TrainingPipeline.call of .TrainingPipeline object at 0x000001CBE259C708>> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
WARNING:tensorflow:AutoGraph could not transform .TrainingPipeline.call of .TrainingPipeline object at 0x000001CB846C8588>> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
WARNING:tensorflow:AutoGraph could not transform .TrainingPipeline.call of .TrainingPipeline object at 0x000001CB84769748>> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
2021-06-09 13:50:16.166875: W tensorflow/core/kernels/data/cache_dataset_ops.cc:757] The calling iterator did not fully read the dataset being cached. In order to avoid unexpected truncation of the dataset, the partially cached contents of the dataset will be discarded. This can happen if you have an input pipeline similar to `dataset.cache().take(k).repeat()`. You should use `dataset.take(k).cache().repeat()` instead.
2021-06-09 13:50:16.276787: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library cudnn64_8.dll
2021-06-09 13:50:16.714907: I tensorflow/core/platform/windows/subprocess.cc:308] SubProcess ended with return code: 0
2021-06-09 13:50:16.756653: I tensorflow/core/platform/windows/subprocess.cc:308] SubProcess ended with return code: 0
2021-06-09 13:50:16.762847: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library cublas64_11.dll
2021-06-09 13:50:17.029253: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library cublasLt64_11.dll
2021-06-09 13:50:17,247 - INFO - server.py:191 - Ran lightweight core. Duration: 1.130023399999999s. Used cache for layers:
WARNING:tensorflow:AutoGraph could not transform .Loader.call of .Loader object at 0x000001CB846F7148>> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
2021-06-09 13:50:20,686 - WARNING - server.py:191 - Settings engine is not set. Cannot make recommendations. Using old json_network.
WARNING:tensorflow:AutoGraph could not transform .Pipeline.call of .Pipeline object at 0x000001CB84732908>> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
WARNING:tensorflow:AutoGraph could not transform .TrainingPipeline.call of .TrainingPipeline object at 0x000001CB847223C8>> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
WARNING:tensorflow:AutoGraph could not transform .TrainingPipeline.call of .TrainingPipeline object at 0x000001CB84799388>> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
WARNING:tensorflow:AutoGraph could not transform .TrainingPipeline.call of .TrainingPipeline object at 0x000001CB84791888>> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
WARNING:tensorflow:AutoGraph could not transform .TrainingPipeline.call of .TrainingPipeline object at 0x000001CB847A5B08>> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
2021-06-09 13:50:20.734962: W tensorflow/core/kernels/data/cache_dataset_ops.cc:757] The calling iterator did not fully read the dataset being cached. In order to avoid unexpected truncation of the dataset, the partially cached contents of the dataset will be discarded. This can happen if you have an input pipeline similar to `dataset.cache().take(k).repeat()`. You should use `dataset.take(k).cache().repeat()` instead.
2021-06-09 13:50:20,736 - INFO - server.py:191 - Ran lightweight core. Duration: 0.050408799999985376s. Used cache for layers: 0, 1, 2, 3, 4, 7, 8, 5, 6, 9, 10
WARNING:tensorflow:AutoGraph could not transform .Loader.call of .Loader object at 0x000001CB862191C8>> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
2021-06-09 13:50:24,247 - WARNING - server.py:191 - Settings engine is not set. Cannot make recommendations. Using old json_network.
WARNING:tensorflow:AutoGraph could not transform .Pipeline.call of .Pipeline object at 0x000001CB8630AF48>> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
WARNING:tensorflow:AutoGraph could not transform .TrainingPipeline.call of .TrainingPipeline object at 0x000001CB8647B648>> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
WARNING:tensorflow:AutoGraph could not transform .TrainingPipeline.call of .TrainingPipeline object at 0x000001CB8647B688>> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
WARNING:tensorflow:AutoGraph could not transform .TrainingPipeline.call of .TrainingPipeline object at 0x000001CB8622E6C8>> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
WARNING:tensorflow:AutoGraph could not transform .TrainingPipeline.call of .TrainingPipeline object at 0x000001CB86231DC8>> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
2021-06-09 13:50:24.298817: W tensorflow/core/kernels/data/cache_dataset_ops.cc:757] The calling iterator did not fully read the dataset being cached. In order to avoid unexpected truncation of the dataset, the partially cached contents of the dataset will be discarded. This can happen if you have an input pipeline similar to `dataset.cache().take(k).repeat()`. You should use `dataset.take(k).cache().repeat()` instead.
2021-06-09 13:50:24,300 - INFO - server.py:191 - Ran lightweight core. Duration: 0.05257819999999924s. Used cache for layers: 0, 1, 2, 3, 4, 7, 8, 5, 6, 9, 10
WARNING:tensorflow:AutoGraph could not transform .Loader.call of .Loader object at 0x000001CC01F91748>> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
2021-06-09 13:50:27,795 - WARNING - server.py:191 - Settings engine is not set. Cannot make recommendations. Using old json_network.
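
For reference, the repeated AutoGraph warnings above come from PerceptiLabs' Cython-compiled Loader/Pipeline/TrainingPipeline call methods; the functions still execute, just without AutoGraph tracing, so they are noise rather than errors. The decorator the message points to is applied like this (a minimal sketch, with a hypothetical DataLoader class standing in for the real PerceptiLabs code):

import tensorflow as tf

class DataLoader:
    # Hypothetical stand-in for the Loader/Pipeline classes named in the warnings.
    @tf.autograph.experimental.do_not_convert
    def call(self, path):
        # Runs as plain Python; AutoGraph no longer tries (and fails) to transform it.
        return tf.io.read_file(path)

On Windows, the `export AUTOGRAPH_VERBOSITY=10` hint from the warning corresponds to `set AUTOGRAPH_VERBOSITY=10` in cmd or `$env:AUTOGRAPH_VERBOSITY = "10"` in PowerShell.
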
WARNING:tensorflow:AutoGraph could not transform .Pipeline.call of .Pipeline object at 0x000001CB86269288>> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
WARNING:tensorflow:AutoGraph could not transform .TrainingPipeline.call of .TrainingPipeline object at 0x000001CB86248AC8>> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
WARNING:tensorflow:AutoGraph could not transform .TrainingPipeline.call of .TrainingPipeline object at 0x000001CB86248E08>> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
WARNING:tensorflow:AutoGraph could not transform .TrainingPipeline.call of .TrainingPipeline object at 0x000001CB8625E648>> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
WARNING:tensorflow:AutoGraph could not transform .TrainingPipeline.call of .TrainingPipeline object at 0x000001CC01FBF988>> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
2021-06-09 13:50:27.845980: W tensorflow/core/kernels/data/cache_dataset_ops.cc:757] The calling iterator did not fully read the dataset being cached. In order to avoid unexpected truncation of the dataset, the partially cached contents of the dataset will be discarded. This can happen if you have an input pipeline similar to `dataset.cache().take(k).repeat()`. You should use `dataset.take(k).cache().repeat()` instead.
2021-06-09 13:50:27,847 - INFO - server.py:191 - Ran lightweight core. Duration: 0.051721399999991036s. Used cache for layers: 0, 1, 2, 3, 4, 7, 8, 5, 6, 9, 10
WARNING:tensorflow:AutoGraph could not transform .Loader.call of .Loader object at 0x000001CC01FBB208>> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
2021-06-09 13:50:31,291 - WARNING - server.py:191 - Settings engine is not set. Cannot make recommendations. Using old json_network.
WARNING:tensorflow:AutoGraph could not transform .Pipeline.call of .Pipeline object at 0x000001CB8652B808>> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
WARNING:tensorflow:AutoGraph could not transform .TrainingPipeline.call of .TrainingPipeline object at 0x000001CB865A3088>> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
WARNING:tensorflow:AutoGraph could not transform .TrainingPipeline.call of .TrainingPipeline object at 0x000001CB865A6808>> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
WARNING:tensorflow:AutoGraph could not transform .TrainingPipeline.call of .TrainingPipeline object at 0x000001CC0A18F148>> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
WARNING:tensorflow:AutoGraph could not transform .TrainingPipeline.call of .TrainingPipeline object at 0x000001CC0A184448>> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
2021-06-09 13:50:31.343424: W tensorflow/core/kernels/data/cache_dataset_ops.cc:757] The calling iterator did not fully read the dataset being cached. In order to avoid unexpected truncation of the dataset, the partially cached contents of the dataset will be discarded. This can happen if you have an input pipeline similar to `dataset.cache().take(k).repeat()`. You should use `dataset.take(k).cache().repeat()` instead.
2021-06-09 13:50:31,345 - INFO - server.py:191 - Ran lightweight core. Duration: 0.05301210000001788s. Used cache for layers: 0, 1, 2, 3, 4, 7, 8, 5, 6, 9, 10
[09/Jun/2021 13:50:31] "GET /static/webworkers/calcChartPic.js HTTP/1.1" 200 470
[09/Jun/2021 13:50:31] "GET /static/webworkers/calcChartPic.js HTTP/1.1" 200 470
[09/Jun/2021 13:50:31] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403
[09/Jun/2021 13:50:31] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403
[09/Jun/2021 13:50:31] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403
[09/Jun/2021 13:50:31] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403
[09/Jun/2021 13:50:31] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403
[09/Jun/2021 13:50:31] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403
[09/Jun/2021 13:50:31] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403
[09/Jun/2021 13:50:31] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403
[09/Jun/2021 13:50:31] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403
2021-06-09 13:50:37,823 - INFO - server.py:191 - Created coreLogic for network '112'
2021-06-09 13:50:37,827 - INFO - server.py:191 - Created coreLogic for network '113'
2021-06-09 13:50:37,831 - INFO - server.py:191 - Created coreLogic for network '114'
[09/Jun/2021 13:50:37] "GET /static/img/info.png HTTP/1.1" 200 616
[09/Jun/2021 13:50:37] "GET /static/img/stats-empty.png HTTP/1.1" 200 41500
[09/Jun/2021 13:50:37] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403
[09/Jun/2021 13:50:37] "GET /static/webworkers/calcChartPic.js HTTP/1.1" 200 470
[09/Jun/2021 13:50:38] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403
[09/Jun/2021 13:50:38] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403
[09/Jun/2021 13:50:38] "GET /static/webworkers/calcChartPic.js HTTP/1.1" 200 470
[09/Jun/2021 13:50:38] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403
[09/Jun/2021 13:50:38] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403
[09/Jun/2021 13:50:38] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403
[09/Jun/2021 13:50:38] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403
[09/Jun/2021 13:50:38] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403
[09/Jun/2021 13:50:38] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403
[09/Jun/2021 13:50:38] "POST /mixpanel/track/?ip=1&_=1623232237835 HTTP/1.1" 200 0
[09/Jun/2021 13:50:38] "POST /mixpanel/track/?ip=1&_=1623232238687 HTTP/1.1" 200 0
[09/Jun/2021 13:50:38] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403
[09/Jun/2021 13:50:38] "GET /static/webworkers/calcChartPic.js HTTP/1.1" 200 470
[09/Jun/2021 13:50:38] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403
[09/Jun/2021 13:50:38] "GET /static/webworkers/calcChartPic.js HTTP/1.1" 200 470
[09/Jun/2021 13:50:38] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403
[09/Jun/2021 13:50:38] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403
[09/Jun/2021 13:50:38] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403
[09/Jun/2021 13:50:38] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403
[09/Jun/2021 13:50:38] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403
[09/Jun/2021 13:50:38] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403
[09/Jun/2021 13:50:38] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403
WARNING:tensorflow:AutoGraph could not transform .Loader.call of .Loader object at 0x000001CB8657C608>> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
[09/Jun/2021 13:50:41] "POST /mixpanel/track/?ip=1&_=1623232241616 HTTP/1.1" 200 0
2021-06-09 13:50:42,193 - WARNING - server.py:191 - Settings engine is not set. Cannot make recommendations. Using old json_network.
WARNING:tensorflow:AutoGraph could not transform .Pipeline.call of .Pipeline object at 0x000001CB846B5708>> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
WARNING:tensorflow:AutoGraph could not transform .TrainingPipeline.call of .TrainingPipeline object at 0x000001CBE2EA2188>> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
WARNING:tensorflow:AutoGraph could not transform .TrainingPipeline.call of .TrainingPipeline object at 0x000001CB86308F48>> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
WARNING:tensorflow:AutoGraph could not transform .TrainingPipeline.call of .TrainingPipeline object at 0x000001CB8623B088>> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
WARNING:tensorflow:AutoGraph could not transform .TrainingPipeline.call of .TrainingPipeline object at 0x000001CC014A2588>> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
2021-06-09 13:50:42.317879: W tensorflow/core/kernels/data/cache_dataset_ops.cc:757] The calling iterator did not fully read the dataset being cached. In order to avoid unexpected truncation of the dataset, the partially cached contents of the dataset will be discarded. This can happen if you have an input pipeline similar to `dataset.cache().take(k).repeat()`. You should use `dataset.take(k).cache().repeat()` instead.
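
For reference, the cache_dataset_ops warning repeated above is tf.data reporting that a cached dataset was only partially read before its iterator was dropped, so the partial cache is discarded on each of these preview runs. The ordering it recommends looks like this (a minimal sketch with illustrative dataset names, not PerceptiLabs internals):

import tensorflow as tf

ds = tf.data.Dataset.range(1000)

# Ordering that triggers the warning: the cache never fills completely,
# so its partial contents are thrown away whenever iteration stops early.
noisy = ds.cache().take(100).repeat(3)

# Recommended ordering: truncate first, then cache, then repeat, so the
# cached subset is read in full once and reused on every pass.
quiet = ds.take(100).cache().repeat(3)

for _ in quiet:
    pass  # iterates without the cache_dataset_ops warning
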
2021-06-09 13:50:42,319 - INFO - server.py:191 - Ran lightweight core. Duration: 0.1254658999999947s. Used cache for layers: 0, 1, 2, 3, 4, 7, 8, 5, 6, 9, 10 [09/Jun/2021 13:50:42] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403 [09/Jun/2021 13:50:42] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403 [09/Jun/2021 13:50:42] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403 [09/Jun/2021 13:50:42] "GET /static/webworkers/calcChartPic.js HTTP/1.1" 200 470 [09/Jun/2021 13:50:42] "GET /static/webworkers/calcChartPic.js HTTP/1.1" 200 470 [09/Jun/2021 13:50:42] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403 [09/Jun/2021 13:50:42] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403 [09/Jun/2021 13:50:42] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403 [09/Jun/2021 13:50:42] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403 [09/Jun/2021 13:50:42] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403 [09/Jun/2021 13:50:42] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403 [09/Jun/2021 13:50:43] "POST /mixpanel/track/?ip=1&_=1623232242812 HTTP/1.1" 200 0 [09/Jun/2021 13:50:51] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403 [09/Jun/2021 13:50:59] "GET /static/img/spinner.gif HTTP/1.1" 200 81572 [09/Jun/2021 13:50:59] "OPTIONS /directories?path=C:\Users\Dr.GM\Documents\Perceptilabs\Default/Model%201&token=tWcU7ryorWvMunfRhx9vSAdnJh-iJJNY0lkG5-fe_Ak HTTP/1.1" 200 0 [09/Jun/2021 13:50:59] "HEAD /directories?path=C:\Users\Dr.GM\Documents\Perceptilabs\Default/Model%201&token=tWcU7ryorWvMunfRhx9vSAdnJh-iJJNY0lkG5-fe_Ak HTTP/1.1" 200 71 [09/Jun/2021 13:50:59] "POST /json_models?path=C:\Users\Dr.GM\Documents\Perceptilabs\Default/Model%201&token=tWcU7ryorWvMunfRhx9vSAdnJh-iJJNY0lkG5-fe_Ak HTTP/1.1" 200 83 2021-06-09 13:50:59,217 - INFO - server.py:191 - Running mode training set for coreLogic w\ network '114' [09/Jun/2021 13:50:59] "POST /mixpanel/track/?ip=1&_=1623232259166 HTTP/1.1" 200 0 GPU limit: 0 GPU count: 1 Core limit: 1 WARNING:tensorflow:AutoGraph could not transform .Loader.call of .Loader object at 0x000001CB8652D488>> and will run it as-is. Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output. Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert WARNING:tensorflow:AutoGraph could not transform .Pipeline.call of .Pipeline object at 0x000001CBFDC15488>> and will run it as-is. Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output. Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert WARNING:tensorflow:AutoGraph could not transform .TrainingPipeline.call of .TrainingPipeline object at 0x000001CB8477F608>> and will run it as-is. Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output. 
Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert WARNING:tensorflow:AutoGraph could not transform .TrainingPipeline.call of .TrainingPipeline object at 0x000001CB864FAA48>> and will run it as-is. Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output. Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert WARNING:tensorflow:AutoGraph could not transform .TrainingPipeline.call of .TrainingPipeline object at 0x000001CB84708388>> and will run it as-is. Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output. Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert WARNING:tensorflow:AutoGraph could not transform .TrainingPipeline.call of .TrainingPipeline object at 0x000001CB8648B6C8>> and will run it as-is. Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output. Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert 2021-06-09 13:51:04,806 - INFO - threading.py:67 - CompabilityCore is_running set to True 2021-06-09 13:51:04,806 - INFO - server.py:191 - Started core for network 114. Mode: training 2021-06-09 13:51:04,809 - INFO - threading.py:67 - Training model initialized [09/Jun/2021 13:51:05] "POST /mixpanel/track/?ip=1&_=1623232264814 HTTP/1.1" 200 0 2021-06-09 13:51:05,605 - INFO - threading.py:67 - Entering training loop WARNING:tensorflow:AutoGraph could not transform > and will run it as-is. Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output. Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert WARNING:tensorflow:AutoGraph could not transform > and will run it as-is. Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output. Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert WARNING:tensorflow:AutoGraph could not transform > and will run it as-is. Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output. Cause: Unable to locate the source code of >. Note that functions defined in certain environments, like the interactive Python shell do not expose their source code. If that is the case, you should to define them in a .py source file. 
If you are certain the code is graph-compatible, wrap the call using @tf.autograph.do_not_convert. Original error: could not get source code To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert WARNING:tensorflow:AutoGraph could not transform > and will run it as-is. Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output. Cause: Unable to locate the source code of >. Note that functions defined in certain environments, like the interactive Python shell do not expose their source code. If that is the case, you should to define them in a .py source file. If you are certain the code is graph-compatible, wrap the call using @tf.autograph.do_not_convert. Original error: could not get source code To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert WARNING:tensorflow:AutoGraph could not transform > and will run it as-is. Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output. Cause: Unable to locate the source code of >. Note that functions defined in certain environments, like the interactive Python shell do not expose their source code. If that is the case, you should to define them in a .py source file. If you are certain the code is graph-compatible, wrap the call using @tf.autograph.do_not_convert. Original error: could not get source code To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert WARNING:tensorflow:AutoGraph could not transform > and will run it as-is. Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output. Cause: Unable to locate the source code of >. Note that functions defined in certain environments, like the interactive Python shell do not expose their source code. If that is the case, you should to define them in a .py source file. If you are certain the code is graph-compatible, wrap the call using @tf.autograph.do_not_convert. Original error: could not get source code To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert WARNING:tensorflow:AutoGraph could not transform > and will run it as-is. Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output. Cause: Unable to locate the source code of >. Note that functions defined in certain environments, like the interactive Python shell do not expose their source code. If that is the case, you should to define them in a .py source file. If you are certain the code is graph-compatible, wrap the call using @tf.autograph.do_not_convert. Original error: could not get source code To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert WARNING:tensorflow:AutoGraph could not transform > and will run it as-is. Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output. Cause: Unable to locate the source code of >. Note that functions defined in certain environments, like the interactive Python shell do not expose their source code. If that is the case, you should to define them in a .py source file. 
If you are certain the code is graph-compatible, wrap the call using @tf.autograph.do_not_convert. Original error: could not get source code To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert [09/Jun/2021 13:51:11] "GET /static/webworkers/calcChartPic.js HTTP/1.1" 200 470 [09/Jun/2021 13:51:11] "GET /static/webworkers/calcChartHeatMap.js HTTP/1.1" 200 1422 [09/Jun/2021 13:51:11] "GET /static/webworkers/calcChartPic.js HTTP/1.1" 200 470 [09/Jun/2021 13:51:11] "POST /mixpanel/track/?ip=1&_=1623232271020 HTTP/1.1" 200 0 [09/Jun/2021 13:51:17] "GET /static/webworkers/calcChartPic.js HTTP/1.1" 200 470 [09/Jun/2021 13:51:17] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403 [09/Jun/2021 13:51:17] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403 [09/Jun/2021 13:51:17] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403 [09/Jun/2021 13:51:17] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403 [09/Jun/2021 13:51:17] "GET /static/webworkers/calcChartPic.js HTTP/1.1" 200 470 [09/Jun/2021 13:51:17] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403 [09/Jun/2021 13:51:17] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403 [09/Jun/2021 13:51:17] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403 [09/Jun/2021 13:51:17] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403 [09/Jun/2021 13:51:17] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403 2021-06-09 13:51:18,752 - INFO - threading.py:67 - Finished epoch 1/3 - Epoch duration: 13.147 s - Num training (validation) batches completed : 131 (38) 2021-06-09 13:51:26,375 - INFO - threading.py:67 - Finished epoch 2/3 - Epoch duration: 7.622 s - Num training (validation) batches completed : 131 (38) 2021-06-09 13:51:34,150 - INFO - threading.py:67 - Finished epoch 3/3 - Epoch duration: 7.774 s - Num training (validation) batches completed : 131 (38) 2021-06-09 13:51:34,278 - INFO - threading.py:67 - Training completed. Total duration: 28.543 s WARNING:tensorflow:AutoGraph could not transform .Loader.call of .Loader object at 0x000001CC357DA888>> and will run it as-is. Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output. Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert [09/Jun/2021 13:51:36] "POST /mixpanel/track/?ip=1&_=1623232295687 HTTP/1.1" 200 0 [09/Jun/2021 13:51:36] "POST /mixpanel/track/?ip=1&_=1623232295688 HTTP/1.1" 200 0 2021-06-09 13:51:38,928 - WARNING - server.py:191 - Settings engine is not set. Cannot make recommendations. Using old json_network. WARNING:tensorflow:AutoGraph could not transform .Pipeline.call of .Pipeline object at 0x000001CC24EFBD88>> and will run it as-is. Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output. Cause: module, class, method, function, traceback, frame, or code object was expected, got cython_function_or_method To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert WARNING:tensorflow:AutoGraph could not transform .TrainingPipeline.call of .TrainingPipeline object at 0x000001CC24EF7D08>> and will run it as-is. Please report this to the TensorFlow team. 
WARNING:tensorflow:AutoGraph could not transform <...TrainingPipeline.call of <...TrainingPipeline object at 0x000001CC24EE62C8>> and will run it as-is.
WARNING:tensorflow:AutoGraph could not transform <...TrainingPipeline.call of <...TrainingPipeline object at 0x000001CC24EE1C48>> and will run it as-is.
WARNING:tensorflow:AutoGraph could not transform <...TrainingPipeline.call of <...TrainingPipeline object at 0x000001CC24EC7A08>> and will run it as-is.
[Each of these, and each similar AutoGraph warning below, was followed by the same "Please report this to the TensorFlow team..." / "Cause: ... got cython_function_or_method" / "To silence this warning..." text shown above.]
2021-06-09 13:51:39.067029: W tensorflow/core/kernels/data/cache_dataset_ops.cc:757] The calling iterator did not fully read the dataset being cached. In order to avoid unexpected truncation of the dataset, the partially cached contents of the dataset will be discarded. This can happen if you have an input pipeline similar to `dataset.cache().take(k).repeat()`. You should use `dataset.take(k).cache().repeat()` instead.
2021-06-09 13:51:39,365 - INFO - server.py:191 - Ran lightweight core. Duration: 0.43644449999999324s. Used cache for layers:
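The cache_dataset_ops warning spells out its own fix: call take(k) before cache() so that only the elements that will actually be reused end up in the cache. A small tf.data sketch of the two orderings, with an illustrative dataset rather than the PerceptiLabs input pipeline:

import tensorflow as tf

dataset = tf.data.Dataset.range(1000)
k = 100

# Ordering the warning complains about: take(k) stops iteration before the
# cache has seen the whole dataset, so the partial cache gets discarded.
discouraged = dataset.cache().take(k).repeat()

# Ordering the warning recommends: cache exactly the k elements that repeat() will reuse.
recommended = dataset.take(k).cache().repeat()

for batch in recommended.batch(25).take(8):
    print(batch.shape)  # eight batches of shape (25,); the second pass over the 100 cached elements is served from the cache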
2021-06-09 13:51:46,180 - INFO - server.py:191 - Created coreLogic for network 'stop_tests'
[09/Jun/2021 13:51:46] "GET /static/webworkers/calcChartPic.js HTTP/1.1" 200 470
[09/Jun/2021 13:51:46] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403
[09/Jun/2021 13:51:46] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403
[09/Jun/2021 13:51:46] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403
[09/Jun/2021 13:51:46] "GET /static/webworkers/calcChartPic.js HTTP/1.1" 200 470
[09/Jun/2021 13:51:46] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403
[09/Jun/2021 13:51:46] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403
[09/Jun/2021 13:51:46] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403
[09/Jun/2021 13:51:46] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403
[09/Jun/2021 13:51:46] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403
[09/Jun/2021 13:51:46] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403
[09/Jun/2021 13:51:46] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403
[09/Jun/2021 13:51:46] "POST /mixpanel/track/?ip=1&_=1623232306033 HTTP/1.1" 200 0
2021-06-09 13:51:46,414 - ERROR - server.py:191 - Error in create_response (action='StopTests') (issue origin: asyncio.events:88)
Traceback (most recent call last):
  File "perceptilabs\mainInterface.py", line 291, in perceptilabs.mainInterface.Interface.create_response
  File "perceptilabs\mainInterface.py", line 533, in perceptilabs.mainInterface.Interface._create_response
AttributeError: 'NoneType' object has no attribute 'process_request'
WARNING:tensorflow:AutoGraph could not transform <...Loader.call of <...Loader object at 0x000001CC3592E548>> and will run it as-is.
[09/Jun/2021 13:51:46] "POST /mixpanel/track/?ip=1&_=1623232306603 HTTP/1.1" 200 0
2021-06-09 13:51:49,636 - WARNING - server.py:191 - Settings engine is not set. Cannot make recommendations. Using old json_network.
WARNING:tensorflow:AutoGraph could not transform <...Pipeline.call of <...Pipeline object at 0x000001CC35934E08>> and will run it as-is.
WARNING:tensorflow:AutoGraph could not transform <...TrainingPipeline.call of <...TrainingPipeline object at 0x000001CC3580A908>> and will run it as-is.
WARNING:tensorflow:AutoGraph could not transform <...TrainingPipeline.call of <...TrainingPipeline object at 0x000001CC357FAB88>> and will run it as-is.
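The StopTests error above is internal to PerceptiLabs: _create_response looks up an object to handle the requested action, the lookup evidently returned None, and calling process_request on None raised the AttributeError. The sketch below is purely illustrative (hypothetical names, not PerceptiLabs source) of the defensive pattern that turns this kind of crash into an error response:

# Hypothetical sketch, not PerceptiLabs code: guard the lookup before calling into it.
def create_response(handlers, action, request):
    handler = handlers.get(action)  # may be None if nothing is running for this action
    if handler is None:
        return {"error": f"No active handler for action '{action}'"}
    return handler.process_request(request)

In the logged session the 'StopTests' request seems to have arrived before any test run existed for the 'stop_tests' network, which would be consistent with the None lookup.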
WARNING:tensorflow:AutoGraph could not transform <...TrainingPipeline.call of <...TrainingPipeline object at 0x000001CC357EFD88>> and will run it as-is.
WARNING:tensorflow:AutoGraph could not transform <...TrainingPipeline.call of <...TrainingPipeline object at 0x000001CC51005D48>> and will run it as-is.
2021-06-09 13:51:49.688033: W tensorflow/core/kernels/data/cache_dataset_ops.cc:757] The calling iterator did not fully read the dataset being cached. In order to avoid unexpected truncation of the dataset, the partially cached contents of the dataset will be discarded. This can happen if you have an input pipeline similar to `dataset.cache().take(k).repeat()`. You should use `dataset.take(k).cache().repeat()` instead.
2021-06-09 13:51:49,690 - INFO - server.py:191 - Ran lightweight core. Duration: 0.052972100000005184s. Used cache for layers: 0, 1, 2, 3, 4, 7, 8, 5, 6, 9, 10
[09/Jun/2021 13:51:50] "POST /mixpanel/track/?ip=1&_=1623232309976 HTTP/1.1" 200 0
[09/Jun/2021 13:51:50] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403
WARNING:tensorflow:AutoGraph could not transform <...Loader.call of <...Loader object at 0x000001CC359B2888>> and will run it as-is.
[09/Jun/2021 13:51:51] "POST /mixpanel/track/?ip=1&_=1623232311567 HTTP/1.1" 200 0
2021-06-09 13:51:53,442 - WARNING - server.py:191 - Settings engine is not set. Cannot make recommendations. Using old json_network.
WARNING:tensorflow:AutoGraph could not transform <...Pipeline.call of <...Pipeline object at 0x000001CC3598D288>> and will run it as-is.
WARNING:tensorflow:AutoGraph could not transform <...TrainingPipeline.call of <...TrainingPipeline object at 0x000001CC359873C8>> and will run it as-is.
WARNING:tensorflow:AutoGraph could not transform <...TrainingPipeline.call of <...TrainingPipeline object at 0x000001CC3597C7C8>> and will run it as-is.
WARNING:tensorflow:AutoGraph could not transform <...TrainingPipeline.call of <...TrainingPipeline object at 0x000001CC359629C8>> and will run it as-is.
WARNING:tensorflow:AutoGraph could not transform <...TrainingPipeline.call of <...TrainingPipeline object at 0x000001CC3595EF08>> and will run it as-is.
2021-06-09 13:51:53.496122: W tensorflow/core/kernels/data/cache_dataset_ops.cc:757] The calling iterator did not fully read the dataset being cached. In order to avoid unexpected truncation of the dataset, the partially cached contents of the dataset will be discarded. This can happen if you have an input pipeline similar to `dataset.cache().take(k).repeat()`. You should use `dataset.take(k).cache().repeat()` instead.
2021-06-09 13:51:53,497 - INFO - server.py:191 - Ran lightweight core. Duration: 0.054578000000020666s. Used cache for layers: 0, 1, 2, 3, 4, 7, 8, 5, 6, 9, 10
[09/Jun/2021 13:51:53] "GET /static/webworkers/calcChartPic.js HTTP/1.1" 200 470
[09/Jun/2021 13:51:53] "GET /static/webworkers/calcChartPic.js HTTP/1.1" 200 470
[09/Jun/2021 13:51:53] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403
[09/Jun/2021 13:51:53] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403
[09/Jun/2021 13:51:53] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403
[09/Jun/2021 13:51:53] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403
[09/Jun/2021 13:51:53] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403
[09/Jun/2021 13:51:53] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403
[09/Jun/2021 13:51:53] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403
[09/Jun/2021 13:51:53] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403
[09/Jun/2021 13:51:53] "GET /static/webworkers/calcChartBase.js HTTP/1.1" 200 2403