
[ERROR] NameError: Field name "json" shadows a BaseModel attribute; use a different field name with "alias='json'" #4944

Open
sravanthijoshi opened this issue Dec 2, 2024 · 3 comments

sravanthijoshi commented Dec 2, 2024

Describe the bug

```
[ERROR] NameError: Field name "json" shadows a BaseModel attribute; use a different field name with "alias='json'".
Traceback (most recent call last):
  File "/var/lang/lib/python3.9/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "", line 1030, in _gcd_import
  File "", line 1007, in _find_and_load
  File "", line 986, in _find_and_load_unlocked
  File "", line 680, in _load_unlocked
  File "", line 850, in exec_module
  File "", line 228, in _call_with_frames_removed
  File "/var/task/test_lambda_docker_container_image/hrfin_mec_provision_llm.py", line 5, in
    from sagemaker.jumpstart.model import JumpStartModel
```
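For context, this error message is raised by Pydantic v1's field-name validation at class-creation time: importing `sagemaker.jumpstart.model` apparently defines (or pulls in) a Pydantic model with a field named `json`, which collides with the `BaseModel.json()` method. Below is a minimal stdlib sketch of that check, not Pydantic's actual implementation, just an illustration of the mechanism:

```python
# Minimal sketch (NOT Pydantic's real code) of the check that produces
# the error above: a metaclass that rejects any annotated field whose
# name collides with an attribute of a base class, such as .json().
class BaseModelMeta(type):
    def __new__(mcls, name, bases, namespace):
        annotations = namespace.get("__annotations__", {})
        for base in bases:
            for field in annotations:
                if hasattr(base, field):
                    raise NameError(
                        f'Field name "{field}" shadows a BaseModel attribute; '
                        f"use a different field name with \"alias='{field}'\""
                    )
        return super().__new__(mcls, name, bases, namespace)

class BaseModel(metaclass=BaseModelMeta):
    def json(self):  # stand-in for pydantic.BaseModel.json()
        return "{}"

try:
    class Payload(BaseModel):
        json: str  # collides with BaseModel.json -> NameError at class creation
except NameError as exc:
    print(exc)
```

In real Pydantic v1 the remedy the message suggests is to rename the field and declare `Field(alias='json')` on it, which is a fix that has to happen inside the library declaring the model, not in the calling Lambda code.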

To reproduce
A clear, step-by-step set of instructions to reproduce the bug. The provided code needs to be complete and runnable; if additional data is needed, please include it in the issue.

We are trying to create a SageMaker JumpStart LLM model endpoint from AWS Lambda using a Docker container image.

This code ran without issue for the last year, but it has been failing since 12/01 with the error below. I have also tried `pip install --upgrade sagemaker` to upgrade the SageMaker packages in the Docker container.

Code:

```python
import json
import time
from boto3 import client as boto3_client
from aws_lambda_powertools import Logger
from sagemaker.jumpstart.model import JumpStartModel

logger = Logger(service="test_lambda_docker_container_image")

def lambda_handler(event, context):
    # Retrieve the model_id and model_version parameters from the JSON
    # payload. Defaults to Llama 2 7B optimized for text generation. Other
    # parameters can be added to the lambda in this way.
    model_id = event.get('model_id', 'meta-textgeneration-llama-2-7b-f')
    model_version = event.get('model_version', '2.*')

    # Deploy the model to a predictor object. Set the wait parameter to False
    # to allow the lambda to terminate before the resource is provisioned.
    # This allows asynchronous communication between lambda controls and
    # provisioned predictor resources.
    model = JumpStartModel(model_id=model_id, model_version=model_version, instance_type='ml.g5.4xlarge')
    predictor = model.deploy(accept_eula=True, wait=False)

    # We want to return a reference to the endpoint so that other programs can
    # communicate with it once it has become available. To do this, we extract
    # the endpoint_name attribute from the Predictor object `predictor`.
    endpoint_name = predictor.endpoint_name

    payload = {'endpoint_name': endpoint_name}
    print(payload)

    time.sleep(600)
```

Expected behavior

The above code should execute successfully and create the Meta LLM model endpoint.


System information
A description of your system. Please provide:

  • SageMaker Python SDK version: 2.235.2 (I have also tried `pip install --upgrade sagemaker` in the code package to fetch the latest version, and the bug persists)
  • Framework name (eg. PyTorch) or algorithm (eg. KMeans):
  • Framework version:
  • Python version: 3.9
  • CPU or GPU:
  • Custom Docker image (Y/N): Y


@sravanthijoshi (Author)

We were able to execute the above code with sagemaker version 2.227.0, but not with the latest version, 2.235.2.
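Given that 2.227.0 is reported to work, one possible workaround until the regression is fixed is to pin that version instead of installing the latest release. A hypothetical Dockerfile fragment (assuming the image installs sagemaker via pip) would be:

```dockerfile
# Hypothetical workaround: pin the last sagemaker version reported to work
# (2.227.0) rather than pulling the latest release, which triggers the
# Pydantic "json" field NameError on import.
RUN pip install --no-cache-dir "sagemaker==2.227.0"
```

The same pin can go in a requirements.txt (`sagemaker==2.227.0`) if that is how the image resolves dependencies.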

@sravanthijoshi (Author)

Team, any update on this bug?

@faithfulalabi

Hey team, friendly bump on this issue.
