Machine Learning with TensorFlow in Vertex AI Reviews

    13794 reviews

    Sauro G. · Reviewed more than a year ago

    Nikita M. · Reviewed more than a year ago

    Pradheepan R. · Reviewed more than a year ago

    Jorge V. · Reviewed more than a year ago

    chong y. · Reviewed more than a year ago

    Carlos A. · Reviewed more than a year ago

    Paramvir B. · Reviewed more than a year ago

    Yuji S. · Reviewed more than a year ago

    Not enough time. The last task has errors, and there are no instructions on how to solve them.

    Waliur R. · Reviewed more than a year ago

    WILLIAM H. · Reviewed more than a year ago

    Could only get a score of 80%, as Task 6 kept failing due to the error listed below.
    Google Cloud Self-Paced Labs: Machine Learning with TensorFlow in Vertex AI - GSP273, Task 6.

        gs://qwiklabs-gcp-00-905ba1094efe-dsongcp/ch9/trained_model/export/flights_20230726-210005/
        Using endpoint [https://us-central1-aiplatform.googleapis.com/]
        Endpoint for flights_xai-20230726-215121 already exists
        ENDPOINT_ID=6499675026667601920
        Waiting for operation [7021920166275448832]... failed.
        ERROR: (gcloud.beta.ai.models.upload) Error occurred in Explanation preprocessing.
        <class 'ValueError'> NodeDef mentions attr 'Tsegmentids' not in
        Op<name=SparseSegmentMean; signature=data:T, indices:Tidx, segment_ids:int32 -> output:T;
        attr=T:type,allowed=[DT_FLOAT, DT_DOUBLE]; attr=Tidx:type,default=DT_INT32,allowed=[DT_INT32, DT_INT64]>;
        NodeDef: {{node model_3/deep_inputs/arr_airport_lat_bucketized_X_arr_airport_lon_bucketized_X_dep_airport_lat_bucketized_X_dep_airport_lon_bucketized_embedding/arr_airport_lat_bucketized_X_arr_airport_lon_bucketized_X_dep_airport_lat_bucketized_X_dep_airport_lon_bucketized_embedding_weights/embedding_lookup_sparse}}.
        (Check whether your GraphDef-interpreting binary is up to date with your GraphDef-generating binary.)
        MODEL_ID=
        ERROR: (gcloud.beta.ai.endpoints.deploy-model) could not parse resource []

    Because the upload failed, the %%bash cell (Cell In[42]) then raised CalledProcessError via
    IPython's run_cell_magic -> ScriptMagics.shebang: "Command '...' returned non-zero exit status 1."
    The contents of the failing cell, de-escaped for readability:

        # note TF_VERSION set in 1st cell, but ENDPOINT_NAME is being changed
        # TF_VERSION=2-6
        ENDPOINT_NAME=flights_xai
        TIMESTAMP=$(date +%Y%m%d-%H%M%S)
        MODEL_NAME=${ENDPOINT_NAME}-${TIMESTAMP}
        EXPORT_PATH=$(gsutil ls ${OUTDIR}/export | tail -1)
        echo $EXPORT_PATH
        # create the model endpoint for deploying the model
        if [[ $(gcloud beta ai endpoints list --region=$REGION \
            --format='value(DISPLAY_NAME)' --filter=display_name=${ENDPOINT_NAME}) ]]; then
            echo "Endpoint for $MODEL_NAME already exists"
        else
            # create model endpoint
            echo "Creating Endpoint for $MODEL_NAME"
            gcloud beta ai endpoints create --region=${REGION} --display-name=${ENDPOINT_NAME}
        fi
        ENDPOINT_ID=$(gcloud beta ai endpoints list --region=$REGION \
            --format='value(ENDPOINT_ID)' --filter=display_name=${ENDPOINT_NAME})
        echo "ENDPOINT_ID=$ENDPOINT_ID"
        # delete any existing models with this name
        for MODEL_ID in $(gcloud beta ai models list --region=$REGION --format='value(MODEL_ID)' --filter=display_name=${MODEL_NAME}); do
            echo "Deleting existing $MODEL_NAME ... $MODEL_ID"
            gcloud ai models delete --region=$REGION $MODEL_ID
        done
        # upload the model using these parameters: docker container image, artifact URI, explanation method,
        # explanation path count, and the explanation metadata JSON file `explanation-metadata.json`.
        # Here, you keep the number of feature permutations at `10` when approximating the Shapley values for explanation.
        gcloud beta ai models upload --region=$REGION --display-name=$MODEL_NAME \
            --container-image-uri=us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.${TF_VERSION}:latest \
            --artifact-uri=$EXPORT_PATH \
            --explanation-method=sampled-shapley --explanation-path-count=10 --explanation-metadata-file=explanation-metadata.json
        MODEL_ID=$(gcloud beta ai models list --region=$REGION --format='value(MODEL_ID)' --filter=display_name=${MODEL_NAME})
        echo "MODEL_ID=$MODEL_ID"
        # deploy the model to the endpoint
        gcloud beta ai endpoints deploy-model $ENDPOINT_ID \
            --region=$REGION \
            --model=$MODEL_ID \
            --display-name=$MODEL_NAME \
            --machine-type=n1-standard-2 \
            --min-replica-count=1 \
            --max-replica-count=1 \
            --traffic-split=0=100
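    A note on the failure above: the closing hint ("Check whether your GraphDef-interpreting
    binary is up to date with your GraphDef-generating binary") usually indicates a TensorFlow
    version mismatch, i.e. the SavedModel was exported by a newer TensorFlow than the
    tf2-cpu.${TF_VERSION} prediction container (the lab pins TF_VERSION=2-6) that runs the
    explanation preprocessing. Below is a minimal sketch of one way to check and work around
    this, assuming a prebuilt container tag matching the kernel's TensorFlow version is
    available (the 2-12 tag is an illustrative assumption, not part of the lab):

        # TensorFlow version of the notebook kernel that trained and exported the model
        # (the "GraphDef-generating binary" from the error message).
        python3 -c 'import tensorflow as tf; print(tf.__version__)'

        # If that version is newer than 2.6, retry only the upload step from the failing
        # cell with a container whose TensorFlow matches it (assumed tag; confirm it exists
        # under us-docker.pkg.dev/vertex-ai/prediction before relying on it).
        TF_VERSION=2-12
        gcloud beta ai models upload --region=$REGION --display-name=$MODEL_NAME \
            --container-image-uri=us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.${TF_VERSION}:latest \
            --artifact-uri=$EXPORT_PATH \
            --explanation-method=sampled-shapley --explanation-path-count=10 \
            --explanation-metadata-file=explanation-metadata.json

    Alternatively, re-exporting the SavedModel from an environment pinned to TensorFlow 2.6
    should keep the graph compatible with the container the lab already uses.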

    Paul C. · Reviewed more than a year ago

    I can't finish this lab because of trouble when deploying the AI model at the last step.

    Tung L. · Reviewed more than a year ago

    Anawat T. · Reviewed more than a year ago

    Shinto T. · Reviewed more than a year ago

    chong y. · Reviewed more than a year ago

    Fix your notebooks, please.

    Mikael M. · Reviewed more than a year ago

    Mikael M. · Reviewed more than a year ago

    Not working; not able to create a Jupyter notebook because of "no resources available".

    Javier V. · Reviewed more than a year ago

    Claudia F. · Reviewed more than a year ago

    Nguyễn X. · Reviewed more than a year ago

    Kaung M. · Reviewed more than a year ago

    Too many issues from the start

    Claudia F. · Reviewed more than a year ago

    We cannot guarantee that the published reviews come from consumers who purchased or used the products. The reviews are not verified by Google.