Build and Deploy Machine Learning Solutions with Vertex AI: Challenge Lab Reviews

    15722 reviews

    Still an error in the last task - it has not been fixed.

    fernando r. · Reviewed about 1 month ago

    cassio r. · Reviewed about 1 month ago

    Massami D. · Reviewed about 1 month ago

    AHMED A. · Reviewed about 1 month ago

    Vivekanand S. · Reviewed about 1 month ago

    Too little time, dependencies broken.

    Jesús A. · Reviewed about 1 month ago

    Instructions not relevant.

    Aditya B. · Reviewed about 1 month ago

    Guys, the lab is broken. I tried to follow this issue, but the conflicts between the libraries go beyond what it covers: https://github.com/GoogleCloudPlatform/generative-ai/issues/849

    Krzysztof W. · Reviewed about 1 month ago
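    A quick diagnostic for the version conflict described in the review above - a minimal sketch, assuming the clash is between the packages preinstalled in the conda environment and those installed into the user site-packages; the package names checked here are taken from the tracebacks quoted in later reviews:

        # Run in a notebook cell: print the version the kernel actually resolves
        # for each package involved in the reported import failures.
        import importlib.metadata as md

        for pkg in ("google-cloud-aiplatform", "google-cloud-bigquery",
                    "geopandas", "shapely"):
            try:
                print(pkg, md.version(pkg))
            except md.PackageNotFoundError:
                print(pkg, "not installed")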

    Not able to complete - got a timeout before the script could run.

    Madhumitha S. · Reviewed about 1 month ago

    Too many errors just from running the imports.

    Victor Hugo V. · Reviewed about 1 month ago

    Getting this error:

    AttributeError                            Traceback (most recent call last)
    Cell In[19], line 18
    ---> 18 from google.cloud import aiplatform as vertexai
    [... import chain through google.cloud.aiplatform ...]
    File ~/.local/lib/python3.10/site-packages/google/cloud/aiplatform/jobs.py:26
    ---> 26 from google.cloud import bigquery
    [... import chain through google.cloud.bigquery ...]
    File ~/.local/lib/python3.10/site-packages/google/cloud/bigquery/table.py:38
    ---> 38 import geopandas  # type: ignore
    [... import chain through geopandas ...]
    File /opt/conda/lib/python3.10/site-packages/geopandas/sindex.py:9
    ----> 9 PREDICATES = {p.name for p in shapely.strtree.BinaryPredicate} | {None}
    AttributeError: module 'shapely' has no attribute 'strtree'

    Siddharth B. · Reviewed about 1 month ago

    Cannot continue:

    File /opt/conda/lib/python3.10/site-packages/geopandas/sindex.py:9
    ----> 9 PREDICATES = {p.name for p in shapely.strtree.BinaryPredicate} | {None}
    AttributeError: module 'shapely' has no attribute 'strtree'

    A Z. · Reviewed about 1 month ago
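    A possible workaround for the AttributeError in the two reviews above - a minimal sketch, assuming the failure comes from an older shapely (1.x) being picked up while the installed geopandas expects shapely >= 2.0, where shapely.strtree.BinaryPredicate exists; the version pin is an assumption, not a verified fix for this lab:

        # Upgrade shapely in the notebook's Python environment, then restart
        # the kernel so the new version is imported.
        import subprocess, sys

        subprocess.check_call(
            [sys.executable, "-m", "pip", "install", "--upgrade", "shapely>=2.0"]
        )

        # After the kernel restart, retry the failing import chain:
        # from google.cloud import aiplatform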

    Viktor K. · Reviewed about 1 month ago

    Dhiraj S. · Reviewed about 1 month ago

    Maxime C. · Reviewed about 1 month ago

    lab code not working

    Douglas M. · Reviewed about 1 month ago

    Gonzalo L. · Reviewed about 1 month ago

    Kevin Q. · Reviewed about 2 months ago

    Can't import packages

    Paulina H. · Reviewed about 2 months ago

    Kevin Q. · Reviewed about 2 months ago

    The training steps are wrong. I followed every step correctly and the pipeline job still fails. I have spent more than 10 hours on this single lab and my attempts are now used up. Please correct the lab and provide feedback; it is very frustrating.

    RuntimeError: Job failed with: code: 9 message: "The DAG failed because some tasks failed. The failed tasks are: [customcontainertrainingjob-run].; Job (project_id = qwiklabs-gcp-03-b98d93c3911d, job_id = 3120486567395721216) is failed due to the above error.; Failed to handle the job: {project_number = 737709931874, job_id = 3120486567395721216}"

    Jiji A. · Reviewed about 2 months ago
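    One way to see which pipeline task failed and why, relevant to the "DAG failed" error quoted above - a minimal sketch using the google-cloud-aiplatform SDK; the project, region, and pipeline job resource name are placeholders to copy from the lab console, not values taken from the error message:

        # Fetch the failed pipeline run and print each task's state and error.
        from google.cloud import aiplatform

        aiplatform.init(project="YOUR_PROJECT_ID", location="YOUR_REGION")  # placeholders

        # Placeholder resource name; copy the real one from the Vertex AI
        # Pipelines page or from aiplatform.PipelineJob.list().
        job = aiplatform.PipelineJob.get(
            "projects/YOUR_PROJECT_ID/locations/YOUR_REGION/pipelineJobs/YOUR_PIPELINE_JOB_ID"
        )

        for task in job.task_details:
            print(task.task_name, task.state, task.error.message)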

    Harish S. · Reviewed about 2 months ago

    Kinga K. · Reviewed about 2 months ago

    Frank W. · Reviewed about 2 months ago

    Kevin Q. · Reviewed about 2 months ago

    We do not ensure the published reviews originate from consumers who have purchased or used the products. Reviews are not verified by Google.