Status: Open
Labels: P1, area:uploads (Focused on functional modules of the product), priority:p1 (High priority/Major issue but not blocking, or big percentage of customers affected. Bug SLA <=7 days), type:bug (A broken experience)
### Describe the bug
I am using `LargeFileUploadTask` to upload files to a SharePoint directory with the code below. The file is uploaded and visible in SharePoint, but the code still raises an error.
Code:

```python
from azure.identity.aio import ClientSecretCredential  # async credentials only
from kiota_authentication_azure.azure_identity_authentication_provider import AzureIdentityAuthenticationProvider
import os
from msgraph import GraphServiceClient
from msgraph_core.tasks.large_file_upload import LargeFileUploadTask, LargeFileUploadSession
import asyncio
from msgraph.generated.drives.item.items.item.create_upload_session.create_upload_session_post_request_body import CreateUploadSessionPostRequestBody
from msgraph.generated.models.drive_item_uploadable_properties import DriveItemUploadableProperties
from datetime import datetime, timedelta, timezone
from kiota_abstractions.api_error import APIError

# Replace these with your app and tenant information
client_id = ''
client_secret = ''
tenant_id = ''
site_id = ''
drive_id = ''
folder_path = ''  # Local path of the file to upload
file_name = ''

scopes = ['https://graph.microsoft.com/.default']
credential = ClientSecretCredential(
    tenant_id,
    client_id,
    client_secret)

# Initialize the Graph client
graph_client = GraphServiceClient(credential, scopes)

async def upload_large_file_to_sharepoint(graph_client, site_id, drive_id, folder_path, file_path, file_name):
    try:
        file = open(f"{folder_path}/{file_name}", 'rb')
        uploadable_properties = DriveItemUploadableProperties(
            additional_data={'@microsoft.graph.conflictBehavior': 'replace'}
        )
        upload_session_request_body = CreateUploadSessionPostRequestBody(
            item=uploadable_properties)
        try:
            # Create the upload session
            upload_session = await graph_client.drives.by_drive_id(drive_id).items.by_drive_item_id(
                'root:/export/out_seasons2.csv:').create_upload_session.post(upload_session_request_body)
        except APIError as ex:
            print(f"Error creating upload session: {ex}")

        # To be used for large file uploads
        large_file_upload_session = LargeFileUploadSession(
            upload_url=upload_session.upload_url,
            expiration_date_time=datetime.now(timezone.utc) + timedelta(days=1),
            additional_data=upload_session.additional_data,
            is_cancelled=False,
            next_expected_ranges=upload_session.next_expected_ranges
        )
        task = LargeFileUploadTask(
            large_file_upload_session,
            graph_client.request_adapter,
            file)
        total_length = os.path.getsize(f"{folder_path}/{file_name}")

        # Progress callback for the upload
        def progress_callback(uploaded_byte_range: tuple[int, int]):
            print(f"Uploaded {int(uploaded_byte_range[0]) / 1024 / 1024:.2f} MegaBytes "
                  f"of {total_length / 1024 / 1024:.2f} MegaBytes\n")

        # Upload the file
        try:
            upload_result = await task.upload(progress_callback)
            print(f"Upload complete {upload_result}")
        except APIError as ex:
            print(f"Error uploading: {ex.message} - {ex.response_status_code}")
    except APIError as e:
        print(f"Error: {e}")

# Run the async upload function
asyncio.run(upload_large_file_to_sharepoint(graph_client, site_id, drive_id, folder_path, folder_path, file_name))
```
Error:

```text
Traceback (most recent call last):
  File "/Users/timwiethoff/Documents/dev/spielwiese_python/sharepoint-func copy.py", line 72, in <module>
    asyncio.run(upload_large_file_to_sharepoint(graph_client, site_id, drive_id, folder_path, folder_path, file_name))
  File "/opt/homebrew/anaconda3/envs/spielwiese/lib/python3.12/asyncio/runners.py", line 194, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^
  File "/opt/homebrew/anaconda3/envs/spielwiese/lib/python3.12/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/anaconda3/envs/spielwiese/lib/python3.12/asyncio/base_events.py", line 687, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
  File "/Users/timwiethoff/Documents/dev/spielwiese_python/sharepoint-func copy.py", line 65, in upload_large_file_to_sharepoint
    upload_result = await task.upload(progress_callback)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/anaconda3/envs/spielwiese/lib/python3.12/site-packages/msgraph_core/tasks/large_file_upload.py", line 110, in upload
    response = await self.last_chunk(self.stream)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/anaconda3/envs/spielwiese/lib/python3.12/site-packages/msgraph_core/tasks/large_file_upload.py", line 220, in last_chunk
    return await self.request_adapter.send_async(info, parsable_factory, error_map)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/anaconda3/envs/spielwiese/lib/python3.12/site-packages/kiota_http/httpx_request_adapter.py", line 190, in send_async
    value = root_node.get_object_value(parsable_factory)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/anaconda3/envs/spielwiese/lib/python3.12/site-packages/kiota_serialization_json/json_parse_node.py", line 219, in get_object_value
    result = factory.create_from_discriminator_value(self)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
AttributeError: 'NoneType' object has no attribute 'create_from_discriminator_value'
```
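The traceback shows the failure happens after the last chunk is sent, while the SDK parses the final-chunk response, so the bytes may already be on the server when the `AttributeError` fires. As a defensive sketch (not an official fix; `do_upload` and `verify_item` are hypothetical helpers that would wrap `task.upload(progress_callback)` and a Graph GET on the uploaded item), the error can be treated as a possible false negative and the item checked before failing:

```python
import asyncio

async def upload_with_fallback(do_upload, verify_item):
    """Run the upload; if it dies with the AttributeError raised while
    parsing the final-chunk response, confirm whether the item actually
    landed before treating the upload as failed."""
    try:
        return await do_upload()
    except AttributeError:
        # The file may already be uploaded despite the parse error.
        item = await verify_item()
        if item is not None:
            return item
        raise

# Demo with stand-ins for the real SDK calls:
async def fake_upload():
    raise AttributeError("'NoneType' object has no attribute 'create_from_discriminator_value'")

async def fake_verify():
    return {"name": "uploaded-file"}  # stand-in for a Graph GET on the item

result = asyncio.run(upload_with_fallback(fake_upload, fake_verify))
print(result)  # → {'name': 'uploaded-file'}
```

With the real SDK, `verify_item` could fetch the drive item by path to confirm the upload before surfacing the error.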
### Expected behavior
No error should be raised, because the file is uploaded successfully.
### How to reproduce
Use the code snippet above and upload a file larger than 5 MB.
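For a quick reproduction without real data, a throwaway file just over the 5 MB threshold can be generated locally (the file name `large_test.csv` is arbitrary):

```python
import os

# Generate a dummy CSV just over 5 MB so the upload spans multiple chunks.
size_bytes = 6 * 1024 * 1024  # 6 MiB, above the 5 MB threshold
with open("large_test.csv", "wb") as f:
    f.write(b"a,b,c\n" * (size_bytes // 6))  # each row is 6 bytes

print(os.path.getsize("large_test.csv") >= 5 * 1024 * 1024)  # → True
```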
### SDK Version
1.1.6
### Latest version known to work for scenario above?
_No response_
### Known Workarounds
_No response_
### Debug output
### Configuration
OS: macOS Sequoia 15.0.1 (ARM)
### Other information
_No response_