Merged
4 changes: 2 additions & 2 deletions .pre-commit-config.yaml
@@ -31,7 +31,7 @@ repos:
# biome-format Format the committed files
# biome-lint Lint and apply safe fixes to the committed files
- repo: https://github.com/biomejs/pre-commit
rev: v2.3.9
rev: v2.3.10
hooks:
- id: biome-check
additional_dependencies: ["@biomejs/biome@^1.0.0"]
@@ -45,7 +45,7 @@ repos:

# runs the ruff linter and formatter
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.14.9
rev: v0.14.10
hooks:
# linter
- id: ruff-check # runs ruff check --force-exclude
3 changes: 2 additions & 1 deletion sdk/docs/mkdocs.yaml
@@ -19,7 +19,8 @@ nav:
- "Basics":
- "Getting Started": "getting-started.md"
- "Terminology": "terminology.md"
- "Common Workflows": "common-workflows.md"
- "Common Workflows":
- "Dataset downloads": "common-workflows/dataset-downloads.md"
- "Changelog": "changelog.md"
- "Advanced":
- "Concurrent Access": "advanced/concurrent-access.md"
4 changes: 2 additions & 2 deletions sdk/docs/mkdocs/changelog.md
@@ -2,10 +2,10 @@

## `0.1.18` - YYYY-MM-DD

## `0.1.17` - YYYY-MM-DD
## `0.1.17` - 2025-12-20

+ Fixes:
+ [**Improved file downloads**](https://github.com/spectrumx/sds-code/pull/236):
+ [**Improved file downloads**](https://github.com/spectrumx/sds-code/pull/236): file downloads are now more reliable, including when an interrupted transfer must be resumed.
+ File downloads now use a temporary file during download to avoid partial files being left behind if the download is interrupted.
+ When overwrite is `False` and a local file already exists at the destination, we skip re-downloading it.
+ When overwrite is `True` and the local checksum doesn't match the server's, we re-download and replace the local file so it matches the server's copy.
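
The temp-file and checksum behavior described in these notes can be sketched as follows. This is an illustrative pattern, not the SDK's actual implementation: the `download_with_temp` name, the `fetch` callable, and the SHA-256 comparison are assumptions made for the example.

```python
import hashlib
import os
import tempfile
from pathlib import Path
from typing import Callable


def download_with_temp(
    fetch: Callable[[], bytes],
    dest: Path,
    remote_sha256: str,
    overwrite: bool = False,
) -> bool:
    """Download via a temp file; return True if dest was (re)written."""
    if dest.exists():
        if not overwrite:
            return False  # never clobber an existing file when overwrite=False
        local = hashlib.sha256(dest.read_bytes()).hexdigest()
        if local == remote_sha256:
            return False  # already matches the server; nothing to re-download
    # write to a temp file first, so an interrupted download
    # never leaves a partial file at the destination path
    fd, tmp_name = tempfile.mkstemp(dir=dest.parent, suffix=".part")
    try:
        with os.fdopen(fd, "wb") as tmp:
            tmp.write(fetch())
        os.replace(tmp_name, dest)  # atomic rename on the same filesystem
    except BaseException:
        os.unlink(tmp_name)  # clean up the partial temp file
        raise
    return True
```

The atomic `os.replace` at the end is what guarantees readers only ever see either the old file or the complete new one.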
5 changes: 0 additions & 5 deletions sdk/docs/mkdocs/common-workflows/common-workflows.md

This file was deleted.

11 changes: 11 additions & 0 deletions sdk/docs/mkdocs/faq.md
@@ -32,6 +32,7 @@ And others not as much.
+ [Can multiple clients write to the same location simultaneously?](#can-multiple-clients-write-to-the-same-location-simultaneously)
+ [Is it safe to have multiple clients reading from the same location?](#is-it-safe-to-have-multiple-clients-reading-from-the-same-location)
+ [Why is the SDK stateless?](#why-is-the-sdk-stateless)
+ [What protections help prevent accidental deletions?](#what-protections-help-prevent-accidental-deletions)
+ [Troubleshooting](#troubleshooting)
+ [I'm getting an `AuthError` when trying to authenticate. What should I check?](#im-getting-an-autherror-when-trying-to-authenticate-what-should-i-check)
+ [I'm getting a `NetworkError`. What does this mean?](#im-getting-a-networkerror-what-does-this-mean)
@@ -392,6 +393,16 @@
simultaneously without the complexity of session management. However, this means
request must contain all information needed to complete it, and the SDK cannot detect or
prevent concurrent writes to the same location.

### What protections help prevent accidental deletions?

SDS layers several safeguards to keep assets from being removed unintentionally:

+ Sharing defaults to **Viewer** access. Granting write permissions always requires an explicit choice.
+ Files that belong to captures or datasets cannot be deleted until they are unlinked from that grouping. Captures linked into datasets are equally protected while that relationship exists.
+ **Final** datasets are read-only, even for their owners. This is the recommended state for broader distribution once contents are stable. See [What are Draft and Final Datasets?](#what-are-draft-and-final-datasets) for details.
+ All SDS assets use soft deletion: administrators can restore items for a short window after removal, so contact support promptly if something was deleted by mistake.
+ When working in the SDK, keep shared-asset listings distinct from your own to reduce the chance of edits in the wrong context.

## Troubleshooting

### I'm getting an `AuthError` when trying to authenticate. What should I check?
2 changes: 1 addition & 1 deletion sdk/pyproject.toml
@@ -17,7 +17,7 @@
name = "spectrumx"
readme = "./docs/README.md"
requires-python = ">=3.11"
version = "0.1.16"
version = "0.1.17"

# https://pypi.org/classifiers/
classifiers = [
17 changes: 12 additions & 5 deletions sdk/tests/e2e_examples/check_build_acceptance.py
@@ -90,7 +90,9 @@ def check_basic_usage() -> None:
sds.upload(
local_path=local_dir, # may be a single file or a directory
sds_path=reference_name, # files will be created under this virtual directory
persist_state=False, # do not persist state in tests
verbose=True, # shows a progress bar (default)
warn_skipped=True, # warn if some files were skipped (default)
)

# download the files from an SDS directory
@@ -153,7 +155,9 @@ def check_error_handling() -> None:
upload_results: list[Result[File]] = sds.upload(
local_path=local_dir,
sds_path=reference_name,
persist_state=False, # do not persist state in tests
verbose=True,
warn_skipped=True,
)

# Since `upload()` is a batch operation, some files may succeed and some
@@ -268,21 +272,23 @@ def check_capture_usage() -> None:
# upload a single-channel capture
local_dir = Path("my_spectrum_files")
sds.upload_capture(
local_path=local_dir,
sds_path=capture_sds_dir,
capture_type=CaptureType.RadioHound,
index_name="", # automatically inferred from capture type
channel=None,
scan_group=None,
index_name="", # automatically inferred from capture type
local_path=local_dir,
name="Test Single Channel Capture",
persist_state=False, # do not persist state in tests
scan_group=None,
sds_path=capture_sds_dir,
verbose=True,
)

# upload a multi-channel capture
sds.upload_multichannel_drf_capture(
channels=[],
local_path=local_dir,
persist_state=False, # do not persist state in tests
sds_path=capture_sds_dir,
channels=[],
verbose=True,
)

@@ -300,6 +306,7 @@ def check_download_modes() -> None:
files_to_download=file_paginator,
to_local_path=Path("sds-downloads") / "files" / reference_name,
overwrite=False, # do not overwrite local existing files (default)
skip_contents=False,
verbose=True,
)

12 changes: 6 additions & 6 deletions sdk/tests/integration/regressions/test_paths.py
@@ -168,13 +168,13 @@ def test_paths_sds_capture_ops(
f"{type(sds_path_random)!s:>40} '{sds_path_random}'"
)
capture = integration_client.upload_capture(
local_path=drf_sample_top_level_dir,
sds_path=sds_path_random,
capture_type=CaptureType.DigitalRF,
channel=drf_channel,
local_path=drf_sample_top_level_dir,
persist_state=False,
sds_path=sds_path_random,
verbose=False,
warn_skipped=False,
persist_state=False,
)
assert capture is not None, (
f"Failed to upload capture to '{sds_path_random}'"
@@ -206,13 +206,13 @@
f"{type(sds_path_random)!s:>40} '{sds_path_random}'"
)
capture = integration_client.upload_capture(
local_path=rh_sample_top_level_dir,
sds_path=sds_path_random,
capture_type=CaptureType.RadioHound,
local_path=rh_sample_top_level_dir,
persist_state=False,
scan_group=rh_data.get("scan_group"),
sds_path=sds_path_random,
verbose=False,
warn_skipped=False,
persist_state=False,
)
assert capture is not None, f"Failed to upload capture to '{sds_path_random}'"
assert capture.uuid is not None, f"Capture UUID is None for '{sds_path_random}'"
24 changes: 12 additions & 12 deletions sdk/tests/integration/test_captures.py
@@ -448,13 +448,13 @@ def test_capture_upload_drf(

# ACT by uploading the capture
capture = integration_client.upload_capture(
local_path=test_dir,
sds_path=sds_path,
capture_type=capture_type,
channel=drf_channel,
warn_skipped=True,
raise_on_error=True,
local_path=test_dir,
persist_state=False,
raise_on_error=True,
sds_path=sds_path,
warn_skipped=True,
)

# ASSERT capture was correctly created
@@ -508,13 +508,13 @@ def test_capture_upload_rh(integration_client: Client) -> None:

# ACT
capture = integration_client.upload_capture(
local_path=dir_top_level,
sds_path=sds_path,
capture_type=CaptureType.RadioHound,
local_path=dir_top_level,
persist_state=False,
raise_on_error=True,
scan_group=scan_group,
sds_path=sds_path,
warn_skipped=True,
raise_on_error=True,
persist_state=False,
)

# ASSERT
@@ -569,11 +569,11 @@ def test_capture_upload_missing_required_fields_drf(
# ACT & ASSERT - Missing channel for DigitalRF
with pytest.raises(CaptureError):
integration_client.upload_capture(
local_path=test_dir,
sds_path=sds_path,
capture_type=capture_type,
raise_on_error=True,
local_path=test_dir,
persist_state=False,
raise_on_error=True,
sds_path=sds_path,
# Missing required channel parameter
)

@@ -869,9 +869,9 @@ def _upload_assets(
log.debug(f"Uploading assets as '/{sds_path}'")
upload_results = integration_client.upload(
local_path=local_path,
persist_state=False,
sds_path=sds_path,
verbose=False,
persist_state=False,
)
success_results = [success for success in upload_results if success]
failed_results = [success for success in upload_results if not success]
1 change: 1 addition & 0 deletions sdk/tests/integration/test_file_ops.py
@@ -234,6 +234,7 @@ def test_upload_files_in_bulk(integration_client: Client, temp_file_tree: Path)
with disable_ssl_warnings():
results = integration_client.upload(
local_path=temp_file_tree,
persist_state=False, # do not persist state in tests
sds_path=PurePosixPath("/test-tree") / random_subdir_name,
verbose=True,
warn_skipped=False,
32 changes: 22 additions & 10 deletions sdk/tests/ops/test_captures.py
@@ -38,7 +38,8 @@
# globally toggles dry run mode in case we want to run these under an integration mode.
DRY_RUN: bool = False

MULTICHANNEL_EXPECTED_COUNT = 2
MULTICHANNEL_EXPECTED_COUNT: int = 2 # expected num of captures in multi-channel tests
TEST_STATE_PERSISTENCE: bool = False # don't persist upload state in tests


@pytest.fixture
@@ -384,9 +385,10 @@ def test_upload_capture_dry_run(client: Client, tmp_path: Path) -> None:

# ACT
capture = client.upload_capture(
capture_type=capture_type,
local_path=test_dir,
persist_state=TEST_STATE_PERSISTENCE,
sds_path="/test/capture/dry-run",
capture_type=capture_type,
)

# ASSERT
@@ -419,17 +421,19 @@ def test_upload_capture_upload_fails(
# ACT & ASSERT
with pytest.raises(SDSError):
client.upload_capture(
local_path=test_dir,
sds_path="/test/capture/fail",
capture_type=CaptureType.DigitalRF,
local_path=test_dir,
persist_state=TEST_STATE_PERSISTENCE,
raise_on_error=True,
sds_path="/test/capture/fail",
)

# Test with raise_on_error=False
result = client.upload_capture(
local_path=test_dir,
sds_path="/test/capture/fail",
capture_type=CaptureType.DigitalRF,
persist_state=TEST_STATE_PERSISTENCE,
raise_on_error=False,
verbose=False,
)
@@ -451,6 +455,7 @@ def test_upload_capture_no_files(client: Client, tmp_path: Path) -> None:
local_path=empty_dir,
sds_path="/test/capture/empty",
capture_type=CaptureType.DigitalRF,
persist_state=TEST_STATE_PERSISTENCE,
verbose=False,
)

@@ -476,6 +481,7 @@ def test_upload_multichannel_drf_capture_dry_run(
local_path=test_dir,
sds_path="/test/multichannel/dry-run",
channels=channels,
persist_state=TEST_STATE_PERSISTENCE,
)

# ASSERT
@@ -540,6 +546,7 @@ def test_upload_multichannel_drf_capture_success(
local_path=test_dir,
sds_path="/test/multichannel",
channels=channels,
persist_state=TEST_STATE_PERSISTENCE,
)

# ASSERT
@@ -633,6 +640,7 @@ def test_upload_multichannel_drf_capture_existing_capture(
local_path=test_dir,
sds_path="/test/multichannel",
channels=channels,
persist_state=TEST_STATE_PERSISTENCE,
)

# ASSERT
@@ -707,6 +715,7 @@ def test_upload_multichannel_drf_capture_creation_fails(
local_path=test_dir,
sds_path="/test/multichannel",
channels=channels,
persist_state=TEST_STATE_PERSISTENCE,
raise_on_error=False,
)

@@ -916,11 +925,12 @@ def test_upload_capture_with_name_dry_run(client: Client, tmp_path: Path) -> Non

# ACT
capture = client.upload_capture(
local_path=test_dir,
sds_path="/test/capture/with/name",
capture_type=CaptureType.DigitalRF,
channel="test_channel",
local_path=test_dir,
name=capture_name,
persist_state=TEST_STATE_PERSISTENCE,
sds_path="/test/capture/with/name",
verbose=False,
)

@@ -941,10 +951,11 @@ def test_upload_capture_without_name_dry_run(client: Client, tmp_path: Path) ->

# ACT
capture = client.upload_capture(
local_path=test_dir,
sds_path="/test/capture/no/name",
capture_type=CaptureType.DigitalRF,
channel="test_channel",
local_path=test_dir,
persist_state=TEST_STATE_PERSISTENCE,
sds_path="/test/capture/no/name",
verbose=False,
)

@@ -1116,11 +1127,12 @@ def test_upload_capture_with_name_success(

# ACT
capture = client.upload_capture(
local_path=test_dir,
sds_path="/test/upload/success",
capture_type=CaptureType.DigitalRF,
channel="test_channel",
local_path=test_dir,
name=capture_name,
persist_state=TEST_STATE_PERSISTENCE,
sds_path="/test/upload/success",
verbose=False,
)
